5.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
TCS Hiring for Azure Admin + Azure Platform Eng
Experience: 5 to 8 Years Only
Job Location: New Delhi, Kolkata, Mumbai, Pune, Bangalore

Required Technical Skill Set: deployment through Terraform, Azure administration, Data Factory, Databricks, Active Directory, Identity and Access Management, Unity Catalog, machine learning, and AI.

- 3+ years of prior product/technical support customer-facing experience
- Must have good working knowledge of Azure cloud technical support
- Good to have technical skills and hands-on experience in the following areas: deployment through Terraform, PowerShell/CLI, identity management, Azure Resource Group management, and Azure PaaS services (e.g. ADF, Databricks, Storage Account)
- Understanding of machine learning and AI concepts related to infrastructure
- Unity Catalog: end-to-end process to migrate from Hive to UC
- Excellent team player with good interpersonal and communication skills
- Experience in the Life Sciences and Healthcare domain preferred

Roles & Responsibilities:
- Resource Group creation along with deployment of various components using Terraform templates
- Management of user access in Azure PaaS products such as Azure SQL, WebApp, App Service, Storage Account, Databricks, and Data Factory
- Creation of Service Principals/AD groups and managing application access through them
- Troubleshooting issues regarding access, data visualizations, and permissions

Kind Regards,
Priyankha M
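For illustration, the resource-group responsibility above can be sketched in Python with the Azure SDK; the role itself expresses this as a Terraform template, and the subscription ID, names, and region below are hypothetical placeholders.

```python
# Minimal sketch: create an Azure resource group, assuming the
# azure-identity and azure-mgmt-resource packages and a logged-in
# Azure context. Subscription ID, name, and region are hypothetical.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

subscription_id = "00000000-0000-0000-0000-000000000000"  # placeholder
client = ResourceManagementClient(DefaultAzureCredential(), subscription_id)

# Idempotent create-or-update, analogous to `azurerm_resource_group` in Terraform.
rg = client.resource_groups.create_or_update(
    "rg-platform-demo",
    {"location": "centralindia", "tags": {"owner": "platform-team"}},
)
print(rg.name, rg.location)
```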
Posted 2 months ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
TCS Hiring for Azure Admin + Azure Platform Eng
Experience: 5 to 8 Years Only
Job Location: New Delhi, Kolkata, Mumbai, Pune, Bangalore

Required Technical Skill Set: deployment through Terraform, Azure administration, Data Factory, Databricks, Active Directory, Identity and Access Management, Unity Catalog, machine learning, and AI.

- 3+ years of prior product/technical support customer-facing experience
- Must have good working knowledge of Azure cloud technical support
- Good to have technical skills and hands-on experience in the following areas: deployment through Terraform, PowerShell/CLI, identity management, Azure Resource Group management, and Azure PaaS services (e.g. ADF, Databricks, Storage Account)
- Understanding of machine learning and AI concepts related to infrastructure
- Unity Catalog: end-to-end process to migrate from Hive to UC
- Excellent team player with good interpersonal and communication skills
- Experience in the Life Sciences and Healthcare domain preferred

Roles & Responsibilities:
- Resource Group creation along with deployment of various components using Terraform templates
- Management of user access in Azure PaaS products such as Azure SQL, WebApp, App Service, Storage Account, Databricks, and Data Factory
- Creation of Service Principals/AD groups and managing application access through them
- Troubleshooting issues regarding access, data visualizations, and permissions

Kind Regards,
Priyankha M
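The Hive-to-Unity-Catalog migration called out above can follow several patterns; a minimal sketch of one common approach is below, assuming a Databricks notebook where `spark` is predefined and the source is a Delta table. All catalog, schema, and table names are hypothetical.

```python
# Minimal sketch: copy a Hive metastore table into Unity Catalog with a
# deep clone, one common migration pattern (assumes a Delta source table
# on a Databricks cluster where `spark` is predefined; names are hypothetical).
source = "hive_metastore.sales_db.orders"
target = "main.sales.orders"

spark.sql("CREATE SCHEMA IF NOT EXISTS main.sales")
spark.sql(f"CREATE TABLE IF NOT EXISTS {target} DEEP CLONE {source}")

# Sanity check that row counts match after the copy.
src_count = spark.table(source).count()
tgt_count = spark.table(target).count()
assert src_count == tgt_count, f"row count mismatch: {src_count} vs {tgt_count}"
```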
Posted 2 months ago
6.0 - 11.0 years
15 - 30 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Warm Greetings from SP Staffing Services Private Limited!!

We have an urgent opening with our CMMI Level 5 client for the below position. Please send your updated profile if you are interested.

Relevant Experience: 6 - 15 Yrs
Location: Pan India

Job Description:
- Primarily looking for a data engineer with expertise in processing data pipelines using Databricks Spark SQL on Hadoop distributions such as AWS EMR, Databricks, Cloudera, etc.
- Should be very proficient in doing large-scale data operations using Databricks and overall very comfortable using Python
- Familiarity with AWS compute, storage, and IAM concepts
- Experience in working with S3 Data Lake as the storage tier
- Any ETL background (Talend, AWS Glue, etc.) is a plus but not required
- Cloud warehouse experience (Snowflake, etc.) is a huge plus
- Carefully evaluates alternative risks and solutions before taking action; optimizes the use of all available resources
- Develops solutions to meet business needs that reflect a clear understanding of the objectives, practices, and procedures of the corporation, department, and business unit

Skills:
- Hands-on experience with Databricks, Spark SQL, and the AWS Cloud platform, especially S3, EMR, Databricks, Cloudera, etc.
- Experience in shell scripting
- Exceptionally strong analytical and problem-solving skills
- Relevant experience with ETL methods and with retrieving data from dimensional data models and data warehouses
- Strong experience with relational databases and data access methods, especially SQL
- Excellent collaboration and cross-functional leadership skills
- Excellent communication skills, both written and verbal
- Ability to manage multiple initiatives and priorities in a fast-paced, collaborative environment
- Ability to leverage data assets to respond to complex questions that require timely answers
- Working knowledge of migrating relational and dimensional databases to the AWS Cloud platform

Interested candidates can share their resume to sankarspstaffings@gmail.com with the below details inline.

Over All Exp :
Relevant Exp :
Current CTC :
Expected CTC :
Notice Period :
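Since the role centres on Databricks Spark SQL over an S3 data lake, here is a minimal hedged sketch of the kind of operation involved; the bucket, paths, and column names are hypothetical.

```python
# Minimal sketch: read raw data from an S3 data lake, aggregate with
# Spark SQL, and write back as Parquet. Assumes a Databricks/EMR cluster
# with S3 access configured; bucket and column names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("s3-demo").getOrCreate()

orders = spark.read.parquet("s3://example-lake/raw/orders/")
orders.createOrReplaceTempView("orders")

daily = spark.sql("""
    SELECT order_date, COUNT(*) AS order_count, SUM(amount) AS revenue
    FROM orders
    GROUP BY order_date
""")
daily.write.mode("overwrite").parquet("s3://example-lake/curated/daily_orders/")
```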
Posted 2 months ago
6.0 - 11.0 years
15 - 30 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Warm Greetings from SP Staffing Services Private Limited!!

We have an urgent opening with our CMMI Level 5 client for the below position. Please send your updated profile if you are interested.

Relevant Experience: 6 - 15 Yrs
Location: Pan India

Job Description:
- Candidate must be proficient in Databricks
- Understands where to obtain the information needed to make appropriate decisions
- Demonstrates the ability to break a problem down into manageable pieces and implement effective, timely solutions
- Identifies the problem versus the symptoms
- Manages problems that require the involvement of others to solve
- Reaches sound decisions quickly

Roles & Responsibilities:
- Provide innovative and cost-effective solutions using Databricks
- Optimize the use of all available resources
- Develop solutions to meet business needs that reflect a clear understanding of the objectives, practices, and procedures of the corporation, department, and business unit
- Learn and adapt quickly to new technologies as per business need
- Develop a team of Operations Excellence, building tools and capabilities that the development teams leverage to maintain high levels of performance, scalability, security, and availability

Skills:
- The candidate must have 7-10 yrs of experience in Databricks and Delta Lake
- Hands-on experience on Azure
- Experience in Python scripting
- Relevant experience with ETL methods and with retrieving data from dimensional data models and data warehouses
- Strong experience with relational databases and data access methods, especially SQL
- Knowledge of Azure architecture and design

Interested candidates can share their resume to sankarspstaffings@gmail.com with the below details inline.

Over All Exp :
Relevant Exp :
Current CTC :
Expected CTC :
Notice Period :
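To ground the Databricks Delta Lake requirement, here is a minimal sketch of writing and time-travelling a Delta table; it assumes a Databricks cluster where `spark` is predefined, and the storage account and container names are hypothetical.

```python
# Minimal sketch: write and time-travel a Delta Lake table on ADLS.
# Assumes a Databricks cluster where `spark` is predefined; the storage
# account/container names are hypothetical.
path = "abfss://lake@examplestorage.dfs.core.windows.net/delta/customers"

df = spark.createDataFrame([(1, "Asha"), (2, "Ravi")], ["customer_id", "name"])
df.write.format("delta").mode("overwrite").save(path)

# Read the current version, then the first version via time travel.
current = spark.read.format("delta").load(path)
v0 = spark.read.format("delta").option("versionAsOf", 0).load(path)
print(current.count(), v0.count())
```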
Posted 2 months ago
8.0 - 11.0 years
35 - 37 Lacs
Kolkata, Ahmedabad, Bengaluru
Work from Office
Dear Candidate,

Seeking a Cloud Monitoring Specialist to set up observability and real-time monitoring in cloud environments.

Key Responsibilities:
- Configure logging and metrics collection.
- Set up alerts and dashboards using Grafana, Prometheus, etc.
- Optimize system visibility for performance and security.

Required Skills & Qualifications:
- Familiarity with the ELK stack, Datadog, New Relic, or cloud-native monitoring tools.
- Strong troubleshooting and root cause analysis skills.
- Knowledge of distributed systems.

Soft Skills:
- Strong troubleshooting and problem-solving skills.
- Ability to work independently and in a team.
- Excellent communication and documentation skills.

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager
Integra Technologies
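As a flavour of the metrics-collection work described, here is a minimal sketch exposing a custom application metric for Prometheus to scrape (and Grafana to chart); it assumes the prometheus_client package, and the metric names and port are hypothetical.

```python
# Minimal sketch: expose a custom application metric that Prometheus can
# scrape at :8000/metrics. Assumes the prometheus_client package; metric
# names and port are hypothetical.
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

REQUESTS = Counter("app_requests_total", "Total requests handled")
LATENCY = Histogram("app_request_latency_seconds", "Request latency")

if __name__ == "__main__":
    start_http_server(8000)
    while True:
        with LATENCY.time():
            time.sleep(random.uniform(0.01, 0.2))  # simulated work
        REQUESTS.inc()
```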
Posted 2 months ago
8.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Location: Chennai, Kolkata, Gurgaon, Bangalore, and Pune
Experience: 8 - 12 Years
Work Mode: Hybrid
Mandatory Skills: Python, PySpark, SQL, ETL, Data Pipelines, Azure Databricks, Azure Data Factory, Azure Synapse, Airflow, and Architecture Design

Overview
We are seeking a skilled and motivated Data Engineer with experience in Python, SQL, Azure, and cloud-based technologies to join our dynamic team. The ideal candidate will have a solid background in building and optimizing data pipelines, working with cloud platforms, and leveraging modern data engineering tools like Airflow, PySpark, and Azure Data Engineering. If you are passionate about data and looking for an opportunity to work on cutting-edge technologies, this role is for you!

Primary Roles And Responsibilities
- Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack
- Provide forward-thinking solutions in the data engineering and analytics space
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements
- Triage issues to find gaps in existing pipelines and fix them
- Work with the business to understand reporting-layer needs and develop data models to fulfill them
- Help junior team members resolve issues and technical challenges
- Drive technical discussions with client architects and team members
- Orchestrate the data pipelines via the Airflow scheduler

Skills And Qualifications
- Bachelor's and/or master's degree in computer science or equivalent experience
- Must have 6+ years of total IT experience and 3+ years' experience in data warehouse/ETL projects
- Deep understanding of Star and Snowflake dimensional modelling
- Strong knowledge of Data Management principles
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture
- Hands-on experience in SQL, Python, and Spark (PySpark)
- Must have experience with the AWS/Azure stack
- Desirable to have ETL experience with batch and streaming (Kinesis)
- Experience in building ETL/data warehouse transformation processes
- Experience with Apache Kafka for streaming/event-based data
- Experience with other open-source big data products, including Hadoop (incl. Hive, Pig, Impala)
- Experience with open-source non-relational/NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j)
- Experience working with structured and unstructured data, including imaging and geospatial data
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Databricks Certified Data Engineer Associate/Professional certification (desirable)
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects
- Experience working in Agile methodology
- Strong verbal and written communication skills
- Strong analytical and problem-solving skills with high attention to detail
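For the Airflow orchestration responsibility, a minimal hedged sketch of a daily DAG follows; it assumes Airflow 2.x, and the DAG, task names, and task logic are hypothetical placeholders for real pipeline code.

```python
# Minimal sketch: an Airflow 2.x DAG orchestrating a daily
# extract -> transform run. Task logic and names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pulling from source systems")


def transform():
    print("building warehouse tables")


with DAG(
    dag_id="daily_warehouse_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task
```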
Posted 2 months ago
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Location: Chennai, Kolkata, Gurgaon, Bangalore, and Pune
Experience: 8 - 12 Years
Work Mode: Hybrid
Mandatory Skills: Python, PySpark, SQL, ETL, Data Pipelines, Azure Databricks, Azure Data Factory, Azure Synapse, Airflow, and Architecture Design

Overview
We are seeking a skilled and motivated Data Engineer with experience in Python, SQL, Azure, and cloud-based technologies to join our dynamic team. The ideal candidate will have a solid background in building and optimizing data pipelines, working with cloud platforms, and leveraging modern data engineering tools like Airflow, PySpark, and Azure Data Engineering. If you are passionate about data and looking for an opportunity to work on cutting-edge technologies, this role is for you!

Primary Roles And Responsibilities
- Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack
- Provide forward-thinking solutions in the data engineering and analytics space
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements
- Triage issues to find gaps in existing pipelines and fix them
- Work with the business to understand reporting-layer needs and develop data models to fulfill them
- Help junior team members resolve issues and technical challenges
- Drive technical discussions with client architects and team members
- Orchestrate the data pipelines via the Airflow scheduler

Skills And Qualifications
- Bachelor's and/or master's degree in computer science or equivalent experience
- Must have 6+ years of total IT experience and 3+ years' experience in data warehouse/ETL projects
- Deep understanding of Star and Snowflake dimensional modelling
- Strong knowledge of Data Management principles
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture
- Hands-on experience in SQL, Python, and Spark (PySpark)
- Must have experience with the AWS/Azure stack
- Desirable to have ETL experience with batch and streaming (Kinesis)
- Experience in building ETL/data warehouse transformation processes
- Experience with Apache Kafka for streaming/event-based data
- Experience with other open-source big data products, including Hadoop (incl. Hive, Pig, Impala)
- Experience with open-source non-relational/NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j)
- Experience working with structured and unstructured data, including imaging and geospatial data
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Databricks Certified Data Engineer Associate/Professional certification (desirable)
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects
- Experience working in Agile methodology
- Strong verbal and written communication skills
- Strong analytical and problem-solving skills with high attention to detail
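For the Kafka streaming requirement in this listing, a minimal hedged sketch with Spark Structured Streaming follows; it assumes a Databricks cluster where `spark` is predefined, and the broker, topic, and paths are hypothetical.

```python
# Minimal sketch: consume Kafka events with Spark Structured Streaming
# and land them in Delta. Assumes `spark` is predefined on a Databricks
# cluster; broker, topic, and paths are hypothetical.
from pyspark.sql.functions import col

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker.example.com:9092")
    .option("subscribe", "orders")
    .load()
    .select(col("key").cast("string"), col("value").cast("string"))
)

query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/orders")
    .start("/tmp/delta/orders")
)
```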
Posted 2 months ago
8.0 years
0 Lacs
Greater Kolkata Area
On-site
Location: Chennai, Kolkata, Gurgaon, Bangalore, and Pune
Experience: 8 - 12 Years
Work Mode: Hybrid
Mandatory Skills: Python, PySpark, SQL, ETL, Data Pipelines, Azure Databricks, Azure Data Factory, Azure Synapse, Airflow, and Architecture Design

Overview
We are seeking a skilled and motivated Data Engineer with experience in Python, SQL, Azure, and cloud-based technologies to join our dynamic team. The ideal candidate will have a solid background in building and optimizing data pipelines, working with cloud platforms, and leveraging modern data engineering tools like Airflow, PySpark, and Azure Data Engineering. If you are passionate about data and looking for an opportunity to work on cutting-edge technologies, this role is for you!

Primary Roles And Responsibilities
- Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack
- Provide forward-thinking solutions in the data engineering and analytics space
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements
- Triage issues to find gaps in existing pipelines and fix them
- Work with the business to understand reporting-layer needs and develop data models to fulfill them
- Help junior team members resolve issues and technical challenges
- Drive technical discussions with client architects and team members
- Orchestrate the data pipelines via the Airflow scheduler

Skills And Qualifications
- Bachelor's and/or master's degree in computer science or equivalent experience
- Must have 6+ years of total IT experience and 3+ years' experience in data warehouse/ETL projects
- Deep understanding of Star and Snowflake dimensional modelling
- Strong knowledge of Data Management principles
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture
- Hands-on experience in SQL, Python, and Spark (PySpark)
- Must have experience with the AWS/Azure stack
- Desirable to have ETL experience with batch and streaming (Kinesis)
- Experience in building ETL/data warehouse transformation processes
- Experience with Apache Kafka for streaming/event-based data
- Experience with other open-source big data products, including Hadoop (incl. Hive, Pig, Impala)
- Experience with open-source non-relational/NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j)
- Experience working with structured and unstructured data, including imaging and geospatial data
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Databricks Certified Data Engineer Associate/Professional certification (desirable)
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects
- Experience working in Agile methodology
- Strong verbal and written communication skills
- Strong analytical and problem-solving skills with high attention to detail
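Touching the Azure Data Factory skill in this listing, here is a minimal hedged sketch that triggers an ADF pipeline run from Python; it assumes the azure-identity and azure-mgmt-datafactory packages, and the subscription, resource group, factory, pipeline, and parameter names are hypothetical.

```python
# Minimal sketch: trigger an Azure Data Factory pipeline run from Python.
# Assumes azure-identity and azure-mgmt-datafactory; all names below are
# hypothetical placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

client = DataFactoryManagementClient(
    DefaultAzureCredential(), "00000000-0000-0000-0000-000000000000"
)

run = client.pipelines.create_run(
    resource_group_name="rg-data",
    factory_name="adf-demo",
    pipeline_name="copy_daily_sales",
    parameters={"run_date": "2024-01-01"},
)
print("run id:", run.run_id)
```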
Posted 2 months ago
8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Location: Chennai, Kolkata, Gurgaon, Bangalore, and Pune
Experience: 8 - 12 Years
Work Mode: Hybrid
Mandatory Skills: Python, PySpark, SQL, ETL, Data Pipelines, Azure Databricks, Azure Data Factory, Azure Synapse, Airflow, and Architecture Design

Overview
We are seeking a skilled and motivated Data Engineer with experience in Python, SQL, Azure, and cloud-based technologies to join our dynamic team. The ideal candidate will have a solid background in building and optimizing data pipelines, working with cloud platforms, and leveraging modern data engineering tools like Airflow, PySpark, and Azure Data Engineering. If you are passionate about data and looking for an opportunity to work on cutting-edge technologies, this role is for you!

Primary Roles And Responsibilities
- Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack
- Provide forward-thinking solutions in the data engineering and analytics space
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements
- Triage issues to find gaps in existing pipelines and fix them
- Work with the business to understand reporting-layer needs and develop data models to fulfill them
- Help junior team members resolve issues and technical challenges
- Drive technical discussions with client architects and team members
- Orchestrate the data pipelines via the Airflow scheduler

Skills And Qualifications
- Bachelor's and/or master's degree in computer science or equivalent experience
- Must have 6+ years of total IT experience and 3+ years' experience in data warehouse/ETL projects
- Deep understanding of Star and Snowflake dimensional modelling
- Strong knowledge of Data Management principles
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture
- Hands-on experience in SQL, Python, and Spark (PySpark)
- Must have experience with the AWS/Azure stack
- Desirable to have ETL experience with batch and streaming (Kinesis)
- Experience in building ETL/data warehouse transformation processes
- Experience with Apache Kafka for streaming/event-based data
- Experience with other open-source big data products, including Hadoop (incl. Hive, Pig, Impala)
- Experience with open-source non-relational/NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j)
- Experience working with structured and unstructured data, including imaging and geospatial data
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Databricks Certified Data Engineer Associate/Professional certification (desirable)
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects
- Experience working in Agile methodology
- Strong verbal and written communication skills
- Strong analytical and problem-solving skills with high attention to detail
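To illustrate the Star/Snowflake dimensional-modelling requirement, a minimal sketch of a star-schema query follows; it assumes a Spark session where `spark` is predefined, and the fact/dimension table and column names are hypothetical.

```python
# Minimal sketch: a star-schema join of a fact table to two dimensions,
# the kind of dimensional-model query the posting expects. Table and
# column names are hypothetical; assumes `spark` is predefined.
summary = spark.sql("""
    SELECT d.calendar_year, p.category, SUM(f.sales_amount) AS revenue
    FROM fact_sales f
    JOIN dim_date d    ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY d.calendar_year, p.category
    ORDER BY d.calendar_year, revenue DESC
""")
summary.show()
```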
Posted 2 months ago
8.0 - 11.0 years
35 - 37 Lacs
Kolkata, Ahmedabad, Bengaluru
Work from Office
Dear Candidate,

We are hiring a Cloud Architect to design and oversee scalable, secure, and cost-efficient cloud solutions. Great for architects who bridge technical vision with business needs.

Key Responsibilities:
- Design cloud-native solutions using AWS, Azure, or GCP
- Lead cloud migration and transformation projects
- Define cloud governance, cost control, and security strategies
- Collaborate with DevOps and engineering teams for implementation

Required Skills & Qualifications:
- Deep expertise in cloud architecture and multi-cloud environments
- Experience with containers, serverless, and microservices
- Proficiency in Terraform, CloudFormation, or equivalent
- Bonus: Cloud certification (AWS/Azure/GCP Architect)

Soft Skills:
- Strong troubleshooting and problem-solving skills.
- Ability to work independently and in a team.
- Excellent communication and documentation skills.

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager
Integra Technologies
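As one hedged illustration of the CloudFormation skill listed above, the sketch below deploys a tiny stack from Python with boto3; it assumes configured AWS credentials, and the stack name, bucket name, and template are hypothetical.

```python
# Minimal sketch: deploy a CloudFormation stack from Python with boto3.
# Assumes AWS credentials are configured; stack/bucket names and the
# template are hypothetical (bucket names must be globally unique).
import json

import boto3

template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Resources": {
        "DataBucket": {
            "Type": "AWS::S3::Bucket",
            "Properties": {"BucketName": "example-architecture-demo-bucket"},
        }
    },
}

cfn = boto3.client("cloudformation")
cfn.create_stack(StackName="demo-stack", TemplateBody=json.dumps(template))
cfn.get_waiter("stack_create_complete").wait(StackName="demo-stack")
print("stack created")
```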
Posted 2 months ago
4.0 - 7.0 years
3 - 8 Lacs
Bengaluru
Work from Office
JLL empowers you to shape a brighter way. Our people at JLL and JLL Technologies are shaping the future of real estate for a better world by combining world-class services, advisory, and technology for our clients. We are committed to hiring the best, most talented people and empowering them to thrive, grow meaningful careers, and find a place where they belong. Whether you've got deep experience in commercial real estate, skilled trades, or technology, or you're looking to apply your relevant experience to a new industry, join our team as we help shape a brighter way forward.

We are currently seeking a Software Engineer II to join our JLL Technologies Leasing Engineering team.

About JLL Technologies
JLL Technologies is a specialized group within JLL. We deliver unparalleled digital advisory, implementation, and services solutions to organizations globally. We provide best-in-class technologies to bring digital ambitions to life, aligning technology, people, and processes. Our goal is to leverage technology to increase the value and liquidity of the world's buildings, while enhancing the productivity and the happiness of those that occupy them. We are seeking candidates who are self-starters, able to work in a diverse and fast-paced environment, to join our team to manage and deliver software.

What this job involves
As a Full Stack Engineer at JLL Technologies, your responsibilities are to:
- Develop a technical understanding of the existing architecture to design and expand its capability with new business requirements
- Independently develop, execute, and monitor complex web and business components, web services, and reports for assigned projects
- Maintain and improve the existing codebase and perform peer code reviews
- Explore and evaluate new technologies where relevant
- Perform unit testing, performance testing, and system integration testing, and assist with user acceptance testing
- Provide ongoing support (troubleshoot, identify, and rectify production issues) for applications used within the organization
- Provide written technical documentation
- Participate in weekend deployments when required

Technical Skills & Competencies
Mandatory:
- Experience with React.js (with Redux) frontend technologies
- Knowledge of Node.js development
- Strong knowledge of C#, .NET Core, Web API
- Strong proficiency in JavaScript (ES6), including DOM manipulation and the JavaScript object model
- Strong proficiency in MS SQL, stored procedures, and performance tuning
- Ability to write unit tests and integration tests using code coverage tools
- Proficiency in Material UI, HTML5, and CSS
- Familiarity with the Azure cloud offering, DevOps, and the GitHub platform

Preferable:
- Experience in development on PaaS offerings such as Azure Functions, Azure Logic Apps, APIM, and Data Factory
- Experience in Elasticsearch or Azure Cognitive Search
- Experience in adopting code quality tools such as SonarQube

Sound like the job you're looking for? Before you apply, it's also worth knowing what we're looking for:

Education and experience
- A Bachelor's degree in computer science, information systems, software engineering, or a related field
- 3-5 years of experience in application development, integration, implementation, and maintenance
- Reliable, self-motivated, and self-disciplined individual
- Effective written and verbal communication skills
- Excellent technical, analytical, and organizational skills
What you can expect from us
We succeed together, across the desk and around the globe, and believe the best inspire the best, so we invest in supporting each other, learning together, and celebrating our success. Our Total Rewards program reflects our commitment to helping you achieve your career ambitions, recognizing your contributions, investing in your well-being, and providing competitive benefits and pay.

Apply today!

Location: On-site, Bengaluru, KA
Scheduled Weekly Hours: 40

If this job description resonates with you, we encourage you to apply even if you don't meet all of the requirements. We're interested in getting to know you and what you bring to the table!

JLL Privacy Notice
Jones Lang LaSalle (JLL), together with its subsidiaries and affiliates, is a leading global provider of real estate and investment management services. We take our responsibility to protect the personal information provided to us seriously. Generally, the personal information we collect from you is for the purposes of processing in connection with JLL's recruitment process. We endeavour to keep your personal information secure with an appropriate level of security and keep it for as long as we need it for legitimate business or legal reasons. We will then delete it safely and securely. See our Candidate Privacy Statement.

For candidates in the United States, please see a full copy of our Equal Employment Opportunity and Affirmative Action policy. Jones Lang LaSalle ("JLL") is an Equal Opportunity Employer and is committed to working with and providing reasonable accommodations to individuals with disabilities. If you need a reasonable accommodation because of a disability for any part of the employment process, including the online application and/or overall selection process, you may contact us at Accommodation Requests. This email is only to request an accommodation. Please direct any other general recruiting inquiries to our Contact Us page.
Posted 2 months ago
7.0 - 12.0 years
27 - 42 Lacs
Chennai
Work from Office
Azure Databricks/Data Factory
- Working with event-based/streaming technologies to ingest and process data
- Working with other members of the project team to support delivery of additional project components (API interfaces, search)
- Evaluating the performance and applicability of multiple tools against customer requirements
- Working within an Agile delivery/DevOps methodology to deliver proof-of-concept and production implementations in iterative sprints
- Strong knowledge of Data Management principles
- Experience in building ETL/data warehouse transformation processes
- Direct experience of building data pipelines using Databricks
- Experience using geospatial frameworks on Apache Spark and associated design and development patterns
- Experience working in a DevOps environment with tools such as Terraform
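For the event-based ingestion requirement, here is a minimal hedged sketch using Databricks Auto Loader (the cloudFiles source), one common way to ingest arriving files as a stream; it assumes a Databricks cluster where `spark` is predefined, and the paths are hypothetical.

```python
# Minimal sketch: event-driven file ingestion with Databricks Auto Loader
# (cloudFiles), landing JSON events into Delta. Assumes `spark` is
# predefined on a Databricks cluster; paths are hypothetical.
stream = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/tmp/schemas/landing")
    .load("abfss://landing@examplestorage.dfs.core.windows.net/events/")
)

(
    stream.writeStream.format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/events")
    .start("/tmp/delta/events")
)
```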
Posted 2 months ago
8.0 - 11.0 years
35 - 37 Lacs
Kolkata, Ahmedabad, Bengaluru
Work from Office
Dear Candidate,

Looking for a Cloud Data Engineer to build cloud-based data pipelines and analytics platforms.

Key Responsibilities:
- Develop ETL workflows using cloud data services.
- Manage data storage, lakes, and warehouses.
- Ensure data quality and pipeline reliability.

Required Skills & Qualifications:
- Experience with BigQuery, Redshift, or Azure Synapse.
- Proficiency in SQL, Python, or Spark.
- Familiarity with data lake architecture and batch/streaming.

Soft Skills:
- Strong troubleshooting and problem-solving skills.
- Ability to work independently and in a team.
- Excellent communication and documentation skills.

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager
Integra Technologies
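As a taste of the warehouse side of this role, here is a minimal hedged sketch querying BigQuery from Python; it assumes the google-cloud-bigquery package and application-default credentials, and the project, dataset, and table names are hypothetical.

```python
# Minimal sketch: run an aggregation query against BigQuery. Assumes
# google-cloud-bigquery and default credentials; project/dataset/table
# names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()
query = """
    SELECT region, SUM(amount) AS revenue
    FROM `example_project.sales.orders`
    GROUP BY region
    ORDER BY revenue DESC
"""
for row in client.query(query).result():
    print(row.region, row.revenue)
```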
Posted 2 months ago
8.0 - 10.0 years
10 - 20 Lacs
Chennai
Work from Office
Job Title: Azure Solutions Architect
Location: Chennai (On-site)
Experience: 8 - 10 years
Employment Type: Full-Time

About the Role
We are seeking a highly skilled Senior Azure Data Solutions Architect to design and implement scalable, secure, and efficient data solutions supporting enterprise-wide analytics and business intelligence initiatives. You will lead the architecture of modern data platforms, drive cloud migration, and collaborate with cross-functional teams to deliver robust Azure-based solutions.

Key Responsibilities
- Architect and implement end-to-end data solutions using Azure services (Data Factory, Databricks, Data Lake, Synapse, Cosmos DB)
- Design robust and scalable data models, including relational, dimensional, and NoSQL schemas
- Develop and optimize ETL/ELT pipelines and data lakes using Azure Data Factory, Databricks, and open formats such as Delta and Iceberg
- Integrate data governance, quality, and security best practices into all architecture designs
- Support analytics and machine learning initiatives through structured data pipelines and platforms
- Collaborate with data engineers, analysts, data scientists, and business stakeholders to align solutions with business needs
- Drive CI/CD integration with Databricks using Azure DevOps and tools like DBT
- Monitor system performance, troubleshoot issues, and optimize data infrastructure for efficiency and reliability
- Stay current with Azure platform advancements and recommend improvements

Required Skills & Experience
- Extensive hands-on experience with Azure services: Data Factory, Databricks, Data Lake, Azure SQL, Cosmos DB, Synapse
- Expertise in data modeling and design (relational, dimensional, NoSQL)
- Proven experience with ETL/ELT processes, data lakes, and modern lakehouse architectures
- Proficiency in Python, SQL, Scala, and/or Java
- Strong knowledge of data governance, security, and compliance frameworks
- Experience with CI/CD, Azure DevOps, and infrastructure as code (Terraform or ARM templates)
- Familiarity with BI and analytics tools such as Power BI or Tableau
- Excellent communication, collaboration, and stakeholder management skills
- Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field

Preferred Qualifications
- Experience in regulated industries (finance, healthcare, etc.)
- Familiarity with data cataloging, metadata management, and machine learning integration
- Leadership experience guiding teams and presenting architectural strategies to leadership

Why Join Us?
- Work on cutting-edge cloud data platforms in a collaborative, innovative environment
- Lead strategic data initiatives that impact enterprise-wide decision-making
- Competitive compensation and opportunities for professional growth
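A staple of the ELT/lakehouse patterns this role describes is an idempotent Delta MERGE upsert; a minimal hedged sketch follows, assuming a Databricks cluster where `spark` is predefined, with hypothetical paths and column names.

```python
# Minimal sketch: an idempotent upsert into a Delta table with MERGE.
# Assumes `spark` is predefined on a Databricks cluster and the target
# Delta table already exists; paths and columns are hypothetical.
from delta.tables import DeltaTable

updates = spark.createDataFrame(
    [(1, "active"), (3, "new")], ["customer_id", "status"]
)

target = DeltaTable.forPath(spark, "/tmp/delta/customers")
(
    target.alias("t")
    .merge(updates.alias("u"), "t.customer_id = u.customer_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```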
Posted 2 months ago
8 - 12 years
0 Lacs
Pune, Maharashtra, India
On-site
Location: Chennai, Kolkata, Gurgaon, Bangalore, and Pune
Experience: 8 - 12 Years
Work Mode: Hybrid
Mandatory Skills: Python, PySpark, SQL, ETL, Data Pipelines, Azure Databricks, Azure Data Factory, Azure Synapse, Airflow, and Architecture Design

Overview
We are seeking a skilled and motivated Data Engineer with experience in Python, SQL, Azure, and cloud-based technologies to join our dynamic team. The ideal candidate will have a solid background in building and optimizing data pipelines, working with cloud platforms, and leveraging modern data engineering tools like Airflow, PySpark, and Azure Data Engineering. If you are passionate about data and looking for an opportunity to work on cutting-edge technologies, this role is for you!

Primary Roles And Responsibilities
- Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack
- Provide forward-thinking solutions in the data engineering and analytics space
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements
- Triage issues to find gaps in existing pipelines and fix them
- Work with the business to understand reporting-layer needs and develop data models to fulfill them
- Help junior team members resolve issues and technical challenges
- Drive technical discussions with client architects and team members
- Orchestrate the data pipelines via the Airflow scheduler

Skills And Qualifications
- Bachelor's and/or master's degree in computer science or equivalent experience
- Must have 6+ years of total IT experience and 3+ years' experience in data warehouse/ETL projects
- Deep understanding of Star and Snowflake dimensional modelling
- Strong knowledge of Data Management principles
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture
- Hands-on experience in SQL, Python, and Spark (PySpark)
- Must have experience with the AWS/Azure stack
- Desirable to have ETL experience with batch and streaming (Kinesis)
- Experience in building ETL/data warehouse transformation processes
- Experience with Apache Kafka for streaming/event-based data
- Experience with other open-source big data products, including Hadoop (incl. Hive, Pig, Impala)
- Experience with open-source non-relational/NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j)
- Experience working with structured and unstructured data, including imaging and geospatial data
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Databricks Certified Data Engineer Associate/Professional certification (desirable)
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects
- Experience working in Agile methodology
- Strong verbal and written communication skills
- Strong analytical and problem-solving skills with high attention to detail
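The "triage issues to find gaps in existing pipelines" responsibility often reduces to lightweight data-quality gates; a minimal hedged sketch follows, assuming a Spark session with hypothetical table and key names.

```python
# Minimal sketch: a data-quality gate checking null keys and duplicates
# before publishing a curated table. Assumes `spark` is predefined; table
# and key names are hypothetical.
df = spark.table("curated.orders")

null_keys = df.filter("order_id IS NULL").count()
dupes = df.count() - df.dropDuplicates(["order_id"]).count()

if null_keys or dupes:
    raise ValueError(f"quality gate failed: {null_keys} null keys, {dupes} duplicates")
print("quality gate passed")
```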
Posted 2 months ago
8 - 12 years
0 Lacs
Gurugram, Haryana, India
On-site
Location: Chennai, Kolkata, Gurgaon, Bangalore, and Pune
Experience: 8 - 12 Years
Work Mode: Hybrid
Mandatory Skills: Python, PySpark, SQL, ETL, Data Pipelines, Azure Databricks, Azure Data Factory, Azure Synapse, Airflow, and Architecture Design

Overview
We are seeking a skilled and motivated Data Engineer with experience in Python, SQL, Azure, and cloud-based technologies to join our dynamic team. The ideal candidate will have a solid background in building and optimizing data pipelines, working with cloud platforms, and leveraging modern data engineering tools like Airflow, PySpark, and Azure Data Engineering. If you are passionate about data and looking for an opportunity to work on cutting-edge technologies, this role is for you!

Primary Roles And Responsibilities
- Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack
- Provide forward-thinking solutions in the data engineering and analytics space
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements
- Triage issues to find gaps in existing pipelines and fix them
- Work with the business to understand reporting-layer needs and develop data models to fulfill them
- Help junior team members resolve issues and technical challenges
- Drive technical discussions with client architects and team members
- Orchestrate the data pipelines via the Airflow scheduler

Skills And Qualifications
- Bachelor's and/or master's degree in computer science or equivalent experience
- Must have 6+ years of total IT experience and 3+ years' experience in data warehouse/ETL projects
- Deep understanding of Star and Snowflake dimensional modelling
- Strong knowledge of Data Management principles
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture
- Hands-on experience in SQL, Python, and Spark (PySpark)
- Must have experience with the AWS/Azure stack
- Desirable to have ETL experience with batch and streaming (Kinesis)
- Experience in building ETL/data warehouse transformation processes
- Experience with Apache Kafka for streaming/event-based data
- Experience with other open-source big data products, including Hadoop (incl. Hive, Pig, Impala)
- Experience with open-source non-relational/NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j)
- Experience working with structured and unstructured data, including imaging and geospatial data
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Databricks Certified Data Engineer Associate/Professional certification (desirable)
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects
- Experience working in Agile methodology
- Strong verbal and written communication skills
- Strong analytical and problem-solving skills with high attention to detail
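For the NoSQL requirement (MongoDB is named explicitly), a minimal hedged sketch with pymongo follows; the connection string, database, and collection names are hypothetical and a reachable MongoDB server is assumed.

```python
# Minimal sketch: basic reads/writes against MongoDB with pymongo.
# Assumes a reachable server; connection string and names are hypothetical.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
events = client["demo_db"]["events"]

events.insert_one({"type": "page_view", "user_id": 42})
for doc in events.find({"type": "page_view"}).limit(5):
    print(doc)
```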
Posted 2 months ago
3 - 8 years
11 - 16 Lacs
Bengaluru
Work from Office
About The Role

Key Responsibilities:
- Own, manage, and prioritize requirements in the product life cycle from definition to phase-out.
- Define platform requirements for native, on-premise, and cloud deployments.
- Provide clear direction, context, and priorities to development teams.
- Collaborate closely with key internal stakeholders and engage with external stakeholders.

Focus Areas:
- Must: Healthcare market product know-how and customer understanding.
- Must: Sound knowledge of clinical workflows and healthcare IT, especially in the area of radiology.
- Must: Healthcare industry standards such as DICOM and IHE.
- Must: Good understanding of software systems categorized as medical devices.
- Must: Basic understanding of legal regulations and standards applicable to medical devices, affecting safety aspects (e.g., FDA 21 CFR 820 QSR, ISO 13485).
- Must: Platform Scalability & Modernization: enable flexible architecture supporting hybrid cloud, containerization, and orchestration (e.g., Kubernetes).
- Must: Azure Expertise: deep knowledge of Azure services (Data Lake Storage, SQL, Data Factory, Synapse) and cloud cost management.
- Must: Data Lake Architecture: proficient in data ingestion, storage formats (Parquet, Delta Lake), and multi-zone design (raw, curated, analytics).
- Nice to have: SQL & Databases: strong SQL skills with experience in database design, optimization, and complex queries.
- Nice to have: Qlik BI Tools: skilled in Qlik Sense/QlikView for data modeling, transformation, and dashboard/report development.
- Nice to have: Exposure to Agile methodology.

What are my tasks?
- Gather, prioritize, create, and communicate stakeholder and market requirements and software specifications
- Guide and support development teams, resolving conflicts and answering questions
- Manage all the Agile methodology practices related to requirements engineering and product definition
- Provide input to project management and support rollout activities such as training, presentations, and workshops

What do I need to know to qualify for this job?
Qualification: A Bachelor's/Master's degree in engineering and/or MCA or equivalent.
Work Experience: 12 to 15 years
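For a concrete feel of the DICOM standard the role requires, here is a minimal hedged sketch reading DICOM metadata with the pydicom package; the file path is hypothetical.

```python
# Minimal sketch: read metadata from a DICOM file with pydicom.
# Assumes the pydicom package; the file path is hypothetical.
import pydicom

ds = pydicom.dcmread("study/series/image_0001.dcm")
print(ds.PatientID, ds.Modality, ds.StudyDate)
print("rows x cols:", ds.Rows, ds.Columns)
```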
Posted 2 months ago
2 - 5 years
11 - 16 Lacs
Bengaluru
Work from Office
Technical Expert BSI

We are looking for a Technical Expert to be part of our Business Solutions Integrations team in the Analytics, Data and Integration stream.

Position Snapshot
Location: Bengaluru
Type of Contract: Permanent
Stream: Analytics, Data and Integration
Type of work: Hybrid
Work Language: Fluent Business English

The role
The Integration Technical Expert will work in the Business Solution Integration team, focused on product engineering and operations related to the Data Integration, Digital Integration, and Process Integration products in Business Solution Integration, and on the initiatives where these products are used. The expert will work together with the Product Manager and Product Owners, as well as various other counterparts, in the evolution of the DI, PI, and Digital products, and with architects to orchestrate the design of integration solutions. The expert will also act as the first point of contact for project teams to manage demand, will help to drive the transition from engineering to sustain as per the BSI standards, and will work with Operations Managers and Sustain teams on the orchestration of operations activities, proposing improvements for better performance of the platforms.

What you'll do
- Work with architects to understand and orchestrate the design choices between the different Data, Process, and Digital Integration patterns for fulfilling the data needs.
- Translate the various requirements into deliverables for the development and implementation of Process, Data, and Digital Integration solutions, following up the requests to get the work done.
- Design, develop, and implement integration solutions using ADF, LTRS, Data Integration, SAP PO, CPI, Logic Apps, MuleSoft, and Confluent.
- Work with the Operations Managers and Sustain teams to orchestrate performance and operational issues.

We offer you
We offer more than just a job. We put people first and inspire you to become the best version of yourself. Great benefits, including a competitive salary and a comprehensive social benefits package. We have one of the most competitive pension plans on the market, as well as flexible remuneration with tax advantages: health insurance, restaurant card, mobility plan, etc. Personal and professional growth through ongoing training and constant career opportunities reflecting our conviction that people are our most important asset.

Minimum qualifications
- Minimum of 7 years' industry experience in software delivery projects
- Experience in project and product management, agile methodologies, and solution delivery at scale
- Skilled and experienced Technical Integration Expert with experience across various integration platforms and tools, including ADF, LTRS, Data Integration, SAP PO, CPI, Logic Apps, MuleSoft, and Confluent
- Ability to contribute to a high-performing, motivated workgroup by applying interpersonal and collaboration skills to achieve goals
- Fluency in English with excellent oral and written communication skills
- Experience in working with cultural diversity: respect for various cultures and understanding of how to work with a variety of cultures in the most effective way

Bonus points if you have
- Experience with the Azure platform (especially with Data Factory)
- Experience with Azure DevOps and with ServiceNow
- Experience with Power Apps and Power BI

About the IT Hub
We are a team of IT professionals from many countries and diverse backgrounds, each with unique missions and challenges in the biggest health, nutrition and wellness company of the world.
We innovate every day through forward-looking technologies to create opportunities for Nestlé's digital challenges with our consumers, customers and at the workplace. We collaborate with our business partners around the world to deliver standardized, integrated technology products and services to create tangible business value.

About Nestlé
We are Nestlé, the largest food and beverage company. We are approximately 275,000 employees strong, driven by the purpose of enhancing the quality of life and contributing to a healthier future. Our values are rooted in respect: respect for ourselves, respect for others, respect for diversity and respect for our future. With more than CHF 94.4 billion in sales in 2022, we have an expansive presence, with 344 factories in 77 countries. Want to learn more? Visit us at www.nestle.com.

We encourage the diversity of applicants across gender, age, ethnicity, nationality, sexual orientation, social background, religion or belief and disability. Step outside your comfort zone; share your ideas, way of thinking and working to make a difference to the world, every single day. You own a piece of the action – make it count. Join IT Hub Nestlé #beaforceforgood

How we will proceed
You send us your CV → We contact relevant applicants → Interviews → Feedback → Job offer communication to the finalist → First working day
Posted 2 months ago
1 - 5 years
9 - 13 Lacs
Bengaluru
Work from Office
Technical Expert BSI

We are looking for a Technical Expert to be part of our Business Solutions Integrations team in the Analytics, Data and Integration stream.

Position Snapshot
Location: Bengaluru
Type of Contract: Permanent
Stream: Analytics, Data and Integration
Type of work: Hybrid
Work Language: Fluent Business English

The role
The Integration Technical Expert will work in the Business Solution Integration team, focused on product engineering and operations related to the Data Integration, Digital Integration, and Process Integration products in Business Solution Integration, and on the initiatives where these products are used. The expert will work together with the Product Manager and Product Owners, as well as various other counterparts, in the evolution of the DI, PI, and Digital products, and with architects to orchestrate the design of integration solutions. The expert will also act as the first point of contact for project teams to manage demand, will help to drive the transition from engineering to sustain as per the BSI standards, and will work with Operations Managers and Sustain teams on the orchestration of operations activities, proposing improvements for better performance of the platforms.

What you'll do
- Work with architects to understand and orchestrate the design choices between the different Data, Process, and Digital Integration patterns for fulfilling the data needs.
- Translate the various requirements into deliverables for the development and implementation of Process, Data, and Digital Integration solutions, following up the requests to get the work done.
- Design, develop, and implement integration solutions using SAP PO, CPI, Logic Apps, ADF, LTRS, Data Integration, MuleSoft, and Confluent.
- Work with the Operations Managers and Sustain teams to orchestrate performance and operational issues.

We offer you
We offer more than just a job. We put people first and inspire you to become the best version of yourself. Great benefits, including a competitive salary and a comprehensive social benefits package. We have one of the most competitive pension plans on the market, as well as flexible remuneration with tax advantages: health insurance, restaurant card, mobility plan, etc. Personal and professional growth through ongoing training and constant career opportunities reflecting our conviction that people are our most important asset.

Minimum qualifications
- Minimum of 7 years' industry experience in software delivery projects
- Experience in project and product management, agile methodologies, and solution delivery at scale
- Skilled and experienced Technical Integration Expert with experience across various integration platforms and tools, including SAP PO, CPI, Logic Apps, ADF, LTRS, Data Integration, MuleSoft, and Confluent
- Ability to contribute to a high-performing, motivated workgroup by applying interpersonal and collaboration skills to achieve goals
- Fluency in English with excellent oral and written communication skills
- Experience in working with cultural diversity: respect for various cultures and understanding of how to work with a variety of cultures in the most effective way

Bonus points if you have
- Experience with the Azure platform (especially with Data Factory)
- Experience with Azure DevOps and with ServiceNow
- Experience with Power Apps and Power BI

About the IT Hub
We are a team of IT professionals from many countries and diverse backgrounds, each with unique missions and challenges in the biggest health, nutrition and wellness company of the world.
We innovate every day through forward-looking technologies to create opportunities for Nestlé's digital challenges with our consumers, customers and at the workplace. We collaborate with our business partners around the world to deliver standardized, integrated technology products and services to create tangible business value.

About Nestlé
We are Nestlé, the largest food and beverage company. We are approximately 275,000 employees strong, driven by the purpose of enhancing the quality of life and contributing to a healthier future. Our values are rooted in respect: respect for ourselves, respect for others, respect for diversity and respect for our future. With more than CHF 94.4 billion in sales in 2022, we have an expansive presence, with 344 factories in 77 countries. Want to learn more? Visit us at www.nestle.com.

We encourage the diversity of applicants across gender, age, ethnicity, nationality, sexual orientation, social background, religion or belief and disability. Step outside your comfort zone; share your ideas, way of thinking and working to make a difference to the world, every single day. You own a piece of the action – make it count. Join IT Hub Nestlé #beaforceforgood

How we will proceed
You send us your CV → We contact relevant applicants → Interviews → Feedback → Job offer communication to the finalist → First working day
Posted 2 months ago
4 - 8 years
10 - 18 Lacs
Kochi, Chennai, Bengaluru
Hybrid
Data Warehouse Developer
Experience: 3-8 years
Location: Chennai/Kochi/Bangalore

Responsibilities:
- Design, build, and maintain scalable and robust data engineering pipelines using Microsoft Azure technologies such as SQL Azure, Azure Data Factory, and Azure Databricks.
- Develop and optimize data solutions using Azure SQL, PySpark, and PySQL to handle complex data transformation and processing tasks.
- Implement and manage data storage solutions in OneLake and Azure SQL, ensuring data integrity and accessibility.
- Work closely with stakeholders to design and build effective reporting and analytics solutions using Power BI and other analytical tools.
- Collaborate with IT and security teams to integrate solutions within Azure AD and ensure compliance with data security and privacy standards.
- Contribute to the architectural design of database and lakehouse structures, optimizing for performance and scalability.
- Utilize .NET frameworks, where applicable, to enhance data processing and integration capabilities.
- Design and implement OLAP and data warehousing solutions, adhering to best practices in data warehouse design concepts.
- Perform database and query performance tuning and optimization to ensure high performance and reliability.
- Stay updated with the latest technologies and trends in big data, proposing and implementing new tools and technologies to improve data systems and processes.
- Implement unit testing and automation strategies to ensure the reliability and performance of the full-stack application.
- Conduct thorough code reviews, providing constructive feedback to team members and ensuring adherence to coding standards and best practices.
- Collaborate with QA engineers to implement and maintain automated testing procedures, including API testing.
- Work in an Agile environment, participating in sprint planning, daily stand-ups, and retrospective meetings to ensure timely and iterative project delivery.
- Stay abreast of industry trends and emerging technologies to continuously improve skills and contribute innovative ideas.

Requirements:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 3-8 years of professional experience in data engineering or a related field.
- Profound expertise in SQL, T-SQL, database design, and data warehousing principles.
- Strong experience with Microsoft Azure tools, including MS Fabric, SQL Azure, Azure Data Factory, Azure Databricks, and Azure Data Lake.
- Proficiency in Python, PySpark, and PySQL for data processing and analytics tasks.
- Experience with Power BI and other reporting and analytics tools.
- Demonstrated knowledge of OLAP, data warehouse design concepts, and performance optimization in database and query processing.
- Knowledge of .NET frameworks is highly preferred.
- Excellent problem-solving, analytical, and communication skills.

Interested candidates can share their resumes at megha.chattopadhyay@aspiresys.com
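Reflecting the SQL/T-SQL side of the role, here is a minimal hedged sketch querying Azure SQL from Python with pyodbc; it assumes the pyodbc package and ODBC Driver 18 are installed, and the server, database, credentials, and table names are hypothetical.

```python
# Minimal sketch: query Azure SQL with pyodbc. Assumes ODBC Driver 18
# and a reachable database; server/credentials/table names are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=example-server.database.windows.net;"
    "DATABASE=warehouse;UID=etl_user;PWD=<secret>"
)
cursor = conn.cursor()
cursor.execute(
    "SELECT TOP 10 customer_id, SUM(amount) AS total "
    "FROM dbo.fact_sales GROUP BY customer_id ORDER BY total DESC"
)
for row in cursor.fetchall():
    print(row.customer_id, row.total)
conn.close()
```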
Posted 2 months ago
4 - 8 years
12 - 22 Lacs
Kochi, Gurugram, Bengaluru
Hybrid
Project Role: Azure Data Engineer
Work Experience: 4 to 8 Years
Work Location: Bangalore / Gurugram / Kochi
Work Mode: Hybrid
Must Have Skills: Azure data engineering, SQL, Spark/PySpark

Job Overview:
Responsible for the on-time completion of projects or components of large, complex projects for clients in the life sciences field. Identifies and elevates potential new business opportunities and assists in the sales process.

Skills required:
- Experience in developing Azure components such as Azure Data Factory, Azure Databricks, Logic Apps, and Functions
- Develop efficient and smart data pipelines for migrating various sources onto the Azure data lake
- Proficient in working with Delta Lake and Parquet file formats
- Design, implement, and maintain CI/CD pipelines; deploy and merge code
- Expert in programming in SQL, PySpark, and Python
- Creation of databases on the Azure data lake following data warehousing best practices
- Build smart metadata databases and solutions, parameterization, and configurations
- Develop Azure frameworks and automated systems for deployment and monitoring
- Hands-on experience in continuous delivery and continuous integration of CI/CD pipelines, and in CI/CD infrastructure and process troubleshooting
- Extensive experience with version control systems like Git and their use in release management, branching, merging, and integration strategies

Essential Functions:
- Participates in or leads teams in the design, development, and delivery of consulting projects or components of larger, complex projects.
- Reviews and analyzes client requirements or problems and assists in the development of proposals for cost-effective solutions that ensure profitability and high client satisfaction.
- Provides direction and guidance to Analysts, Consultants, and, where relevant, Statistical Services staff assigned to the engagement.
- Develops detailed documentation and specifications.
- Performs qualitative and/or quantitative analyses to assist in the identification of client issues and the development of client-specific solutions.
- Designs, structures, and delivers client reports and presentations appropriate to the characteristics and needs of the audience. May deliver some findings to clients.

Qualifications:
- Bachelor's degree required; Master's degree in Business Administration preferred
- 4-8 years of related experience in consulting and/or the life sciences industry required
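The CI/CD responsibility above typically means wiring tests into the pipeline; a minimal hedged sketch of the kind of unit test involved follows, kept as pure Python so it runs under pytest without a cluster. The function and its rule are hypothetical placeholders for real transformation logic.

```python
# Minimal sketch: a pytest unit test for a small transformation, the kind
# of check a CI/CD pipeline would run on every merge. The function and
# rule are hypothetical placeholders for real pipeline logic.
def normalise_country(code: str) -> str:
    """Map raw country codes to a canonical upper-case form."""
    return code.strip().upper()


def test_normalise_country():
    assert normalise_country(" in ") == "IN"
    assert normalise_country("In") == "IN"
```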
Posted 2 months ago
7 - 10 years
17 - 22 Lacs
Mumbai
Work from Office
Position Overview: The Microsoft Cloud Data Engineering Lead role is ideal for an experienced Microsoft Cloud Data Engineer who will architect, build, and optimize data platforms using Microsoft Azure technologies. The role requires deep technical expertise in Azure data services, strong leadership capabilities, and a passion for building scalable, secure, and high-performance data ecosystems.

Key Responsibilities:
Lead the design, development, and deployment of enterprise-scale data pipelines and architectures on Microsoft Azure.
Manage and mentor a team of data engineers, promoting best practices in cloud engineering, data modeling, and DevOps.
Architect and maintain data platforms using Azure Data Lake Storage, Azure Synapse Analytics, Azure Data Factory, Azure Databricks, and Azure SQL/SQL MI.
Develop robust ETL/ELT workflows for structured and unstructured data using Azure Data Factory and related tools.
Collaborate with data scientists, analysts, and business units to deliver data solutions supporting advanced analytics, BI, and operational use cases.
Implement data governance, quality, and security frameworks, leveraging tools such as Azure Purview and Azure Key Vault.
Drive automation and infrastructure-as-code practices using Bicep, ARM templates, or Terraform with Azure DevOps or GitHub Actions.
Ensure performance optimization and cost efficiency across data pipelines and cloud environments.
Stay current with Microsoft cloud advancements and help shape the cloud strategy and data architecture roadmaps.

Qualifications:
Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
Experience: 7+ years of experience in data engineering, including 3+ years working with Microsoft Azure. Proven leadership experience in managing and mentoring data engineering teams.
Skills:
Expert knowledge of Azure Data Lake, Synapse Analytics, Data Factory, Databricks, and Azure SQL-based technologies.
Proficiency in SQL, Python, and/or Spark for data transformation and analysis.
Strong understanding of data governance, security, compliance (e.g., GDPR, PCI DSS), and privacy in cloud environments.
Experience leading data engineering teams or cloud data projects from design to delivery.
Familiarity with CI/CD pipelines, infrastructure as code, and DevOps practices within the Azure ecosystem.
Familiarity with Power BI and integration of data pipelines with BI/reporting tools.
Certifications: Microsoft Certified: Azure Data Engineer Associate or Azure Solutions Architect Expert.
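Teams in this role would typically express infrastructure as code in Bicep or Terraform; purely as an illustration of the same idempotent create-or-update semantics, a minimal sketch with the Azure Python SDK (the subscription ID, resource group name, and tags are placeholders):

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

# Placeholder; in practice this comes from pipeline configuration.
SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"

# DefaultAzureCredential works locally (az login) and in CI pipelines
# (service principal / managed identity) without code changes.
credential = DefaultAzureCredential()
client = ResourceManagementClient(credential, SUBSCRIPTION_ID)

# Idempotent create-or-update: safe to rerun, like an ARM/Bicep deployment.
rg = client.resource_groups.create_or_update(
    "rg-dataplatform-dev",
    {"location": "eastus", "tags": {"env": "dev", "owner": "data-eng"}},
)
print(rg.name, rg.location)
```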
Posted 2 months ago
6 - 11 years
25 - 40 Lacs
Pune
Hybrid
Role Definition: Data Scientists focus on researching and developing AI algorithms and models. They analyse data, build predictive models, and apply machine learning techniques to solve complex problems.

Skills:
• Proficient:
o Languages/Framework: FastAPI, Azure UI Search API (React)
o Databases and ETL: Cosmos DB (API for MongoDB), Data Factory, Databricks
o Proficiency in Python and R
o Cloud: Azure Cloud basics (Azure DevOps)
o GitLab: GitLab Pipeline
o Ansible and REX: REX deployment
o Data Science: Prompt Engineering + Modern Testing
o Data mining and cleaning
o ML (supervised/unsupervised learning)
o NLP techniques; knowledge of deep learning techniques including RNNs and transformers
o End-to-end AI solution delivery
o AI integration and deployment
o AI frameworks (PyTorch)
o MLOps frameworks
o Model deployment processes
o Data pipeline monitoring
• Expert (in addition to proficient skills):
o Languages/Framework: Azure OpenAI
o Data Science: OpenAI GPT family of models (4o/4/3), Embeddings + Vector Search
o Databases and ETL: Azure Storage Account
o Expertise in machine learning algorithms (supervised, unsupervised, reinforcement learning)
o Proficiency in deep learning frameworks (TensorFlow, PyTorch)
o Strong mathematical foundation (linear algebra, calculus, probability, statistics)
o Research methodology and experimental design
o Proficiency in data analysis tools (Pandas, NumPy, SQL)
o Strong statistical and probabilistic modelling skills
o Data visualization skills (Matplotlib, Seaborn, Tableau)
o Knowledge of big data technologies (Spark, Hive)
o Experience with AI-driven analytics and decision-making systems
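As a minimal illustration of the supervised-learning work listed above (the data here is synthetic and the model choice is arbitrary, not prescribed by the role):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic data standing in for a cleaned, feature-engineered table.
rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 8))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # toy binary target

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Held-out evaluation, the basic hygiene behind "experimental design".
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```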
Posted 2 months ago
9 - 14 years
15 - 30 Lacs
Bengaluru
Work from Office
TRUGlobal is Hiring!!
Skills (hands-on experience in the skills below, minimum 7 to 8 years):
MS Azure
Databricks
Data Factory
Power BI
Overall Exp: 10+ years
NP: Immediate, or able to join within 15-20 days
Location: Bangalore (WFO)
Interested candidates, please share your updated resume with pooja.v@truglobal.com
Posted 2 months ago
8 - 13 years
20 - 35 Lacs
Chennai
Work from Office
Warm welcome from SP Staffing Services! Reaching out to you regarding a permanent opportunity!!

Job Description:
Exp: 8-14 yrs
Location: Hyderabad/Chennai/Kolkata/Delhi
Skill: Databricks Architect

Must have experience on Databricks covering Delta Lake, Unity Catalog, Databricks workflow orchestration, security management, platform governance, and data security.
Must have knowledge of new features available in Databricks and their implications, along with possible use cases for each.
Must have applied sound architectural principles to design the solution best suited to each problem.
Must be well versed with the Databricks Lakehouse concept and its implementation in enterprise environments.
Must have a strong understanding of data warehousing and the various governance and security standards around Databricks.
Must have knowledge of cluster optimization and its integration with various cloud services.
Must have a good understanding of how to create complex data pipelines.
Must be strong in SQL and Spark SQL.
Must have worked on designing both batch and streaming data pipelines.

Interested candidates can share their resume with sangeetha.spstaffing@gmail.com with the below details inline:
Full Name as per PAN:
Mobile No:
Alt No/WhatsApp No:
Total Exp:
Relevant Exp in Databricks:
Rel Exp in ADF:
Rel Exp in PySpark/Spark:
Rel Exp in Python/Scala:
Current CTC:
Expected CTC:
Notice Period (Official):
Notice Period (Negotiable)/Reason:
Date of Birth:
PAN number:
Reason for Job Change:
Offer in Pipeline (Current Status):
Availability for virtual interview on weekdays between 10 AM - 4 PM (please mention time):
Current Res Location:
Preferred Job Location:
Whether educational % in 10th std, 12th std, UG is all above 50%?
Do you have any gaps in between your education or career? If so, please mention the duration in months/years:

Regards
Sangeetha
7871316699
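A minimal sketch of the streaming side of that last requirement, using Spark Structured Streaming between Delta tables (the mount paths are placeholders, and a Databricks runtime with Delta Lake is assumed):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("events-stream").getOrCreate()

# Read new rows incrementally from a bronze Delta table.
events = (spark.readStream
          .format("delta")
          .load("/mnt/bronze/events"))

# Light transformation; heavier logic would live in named functions.
cleaned = events.dropDuplicates(["event_id"]).filter("event_type IS NOT NULL")

# Append to the silver table; the checkpoint makes the stream restartable,
# and Delta sinks give exactly-once semantics on top of it.
query = (cleaned.writeStream
         .format("delta")
         .option("checkpointLocation", "/mnt/checkpoints/events")
         .outputMode("append")
         .start("/mnt/silver/events"))

query.awaitTermination()
```

The same code runs as a batch job by swapping readStream/writeStream for read/write, which is why the lakehouse pattern treats batch and streaming as one design problem.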
Posted 2 months ago