
1262 Azure Databricks Jobs - Page 4


5.0 - 10.0 years

6 - 16 Lacs

Kolkata, Pune, Bengaluru

Work from Office

Educational Requirements: MCA, MSc, MTech, Bachelor of Engineering, BCA, BSc
Service Line: Data & Analytics Unit

Responsibilities:
A day in the life of an Infoscion:
• As part of the Infosys delivery team, your primary role would be to ensure effective design, development, validation and support activities, so that our clients are satisfied with high levels of service in the technology domain.
• You will gather requirements and specifications, understand client requirements in detail, and translate them into system requirements.
• You will play a key role in the overall estimation of work requirements, providing the right information on project estimations to technology leads and project managers.
• You will be a key contributor to building efficient programs and systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
• Knowledge of design principles and fundamentals of architecture
• Understanding of performance engineering
• Knowledge of quality processes and estimation techniques
• Basic understanding of the project domain
• Ability to translate functional/non-functional requirements into system requirements
• Ability to design and code complex programs
• Ability to write test cases and scenarios based on the specifications
• Good understanding of the SDLC and agile methodologies
• Awareness of the latest technologies and trends
• Logical thinking and problem-solving skills, along with an ability to collaborate

Technical and Professional Requirements:
• Primary skills: Technology->Cloud Platform->Azure Development & Solution Architecting
Preferred Skills: Technology->Cloud Platform->Azure Development & Solution Architecting

Posted 1 week ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Chennai, Bengaluru

Work from Office

Availability: Immediate preferred

Key Responsibilities:
- Design and implement advanced data science workflows using Azure Databricks.
- Collaborate with cross-functional teams to scale data pipelines.
- Optimize and fine-tune PySpark jobs for performance and efficiency.
- Support real-time analytics and big data use cases in a remote-first agile environment.

Required Skills:
- Proven experience in Databricks, PySpark, and big data architecture.
- Ability to work with data scientists to operationalize models.
- Strong understanding of data governance, security, and performance.

Location: Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad
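One concrete flavor of the "optimize and fine-tune PySpark jobs" responsibility above is sizing shuffle partitions to the data volume rather than leaving Spark's 200-partition default. A minimal sketch, assuming a 128 MB target partition size (the helper name and target are illustrative, not from the listing):

```python
import math

def choose_shuffle_partitions(input_size_mb, target_partition_mb=128):
    """Return a shuffle-partition count that keeps partitions near the target size."""
    if input_size_mb <= 0:
        return 1
    return max(1, math.ceil(input_size_mb / target_partition_mb))

# In a Databricks notebook this would typically be applied as (sketch only):
#   spark.conf.set("spark.sql.shuffle.partitions", choose_shuffle_partitions(50_000))
```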

Posted 1 week ago

Apply

5.0 - 10.0 years

0 - 3 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Job description
Experience: 5+ years
Location: Anywhere in India
Mandatory skills: Oracle, Azure Data Factory, Azure Databricks, SQL Server, AWS S3, FTP, Blob

Role & responsibilities:
- Design, develop, and deploy scalable ETL solutions using Azure Data Factory and Azure Databricks.
- Construct and optimize complex data pipelines, leveraging various ADF activities such as Get Metadata, Lookup, Stored Procedure, For Each, If, and Execute Pipeline.
- Implement dynamic pipelines to efficiently extract and load data from multiple sources (SQL Server, AWS S3, FTP, Oracle, Blob) into Azure Data Lake Storage.
- Collaborate with data architects and stakeholders to understand data requirements and translate them into technical specifications.
- Ensure data quality, integrity, and security throughout the data lifecycle.
- Monitor and troubleshoot data pipelines to ensure optimal performance and reliability.
- Contribute to the continuous improvement of data engineering processes and best practices.

Preferred candidate profile: If the JD suits your profile, kindly share your updated CV at Kavyacn@radiants.com.
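The "dynamic pipelines" idea in this listing (a Lookup result driving a ForEach over many sources) can be sketched in plain Python: one parameterized copy definition expanded per source system. The record shape and names are illustrative assumptions, not ADF's actual JSON schema:

```python
def expand_copy_activities(sources, sink_container="datalake"):
    """Build one copy-activity config per source system (illustrative shape)."""
    activities = []
    for src in sources:
        activities.append({
            "name": f"copy_{src['name']}",
            "type": src["type"],                      # e.g. SqlServer, AmazonS3, Ftp
            "source_path": src["path"],
            "sink_path": f"{sink_container}/raw/{src['name']}/",
        })
    return activities
```

In ADF itself the same fan-out is a ForEach activity iterating a Lookup's output; the point of the sketch is that one template plus a source list replaces N hand-written pipelines.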

Posted 1 week ago

Apply

4.0 - 8.0 years

10 - 20 Lacs

Bengaluru

Work from Office

Greetings from ARi!

Company Profile: ARi is a US-based organization, headquartered in the USA, with local offices in India in Hyderabad and Chennai. Overall employee strength is more than 1800. Website: https://ariglobalsolutions.com

Job Title: Senior Azure Cloud Data Engineer
Location: Bangalore
Experience Required: 5+ years

Job Summary: We are seeking a highly skilled Senior Azure Cloud Data Engineer to join our team. The ideal candidate will have extensive experience in processing and analyzing IoT data from connected products to drive value for our customers. This role involves working with cloud data and analytics platforms such as AWS, Azure, or GCP, and requires advanced knowledge of various Azure data analytics tools.

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and ETL processes to support data integration and analytics.
- Process and analyze IoT data from connected products to derive actionable insights.
- Implement and manage cloud data and analytics platforms (AWS, Azure, GCP).
- Develop and optimize SQL queries for data extraction and reporting.
- Automate data workflows using CI/CD pipelines.
- Collaborate with cross-functional teams to define data requirements and deliver solutions.
- Ensure data quality and integrity across various data sources and platforms.
- Utilize advanced Azure data analytics tools, including Azure Functions (compute), Azure Blob Storage (storage), Azure Cosmos DB (databases), Azure Synapse Analytics (databases), Azure Data Factory (analytics), Azure Synapse serverless SQL pools (analytics), and Azure Event Hubs (real-time data).
- Implement data streaming technologies such as Kafka or Azure Event Hubs.
- Follow Agile methodology for project management and delivery.

Qualifications:
- Educational: B.E. or B.Tech
- 7-9 years of relevant experience in cloud data engineering.
- Proficiency in SQL, ETL, ELT, and data reporting.
- Hands-on experience with CI/CD and automation tools.
- Strong understanding of data streaming technologies and real-time analytics.
- Excellent problem-solving skills and attention to detail.
- Ability to work effectively in an Agile environment.
- Strong communication and collaboration skills.

Posted 1 week ago

Apply

5.0 - 10.0 years

15 - 22 Lacs

Pune, Chennai

Hybrid

We are looking for Data Engineers with 5-10 years of experience who can join us within 15 days.

Experience: 5-10 years
Location: Chennai/Pune
Mode: Hybrid (3 days a week in office)

What we need: Candidates with good exposure to the following skills: Azure Databricks, Azure Data Factory, Azure Data Lake, DevOps, Python, PySpark, CDC.
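CDC (change data capture), the last skill this listing names, reduces to applying insert/update/delete change records against a target keyed by primary key. A minimal pure-Python sketch; the record shape ("op", "id", "data") is an assumption for illustration, not a specific CDC tool's format:

```python
def apply_cdc(target, changes):
    """Apply CDC change records (insert/update/delete) to a keyed target."""
    for change in changes:
        key = change["id"]
        if change["op"] == "delete":
            target.pop(key, None)          # tolerate deletes for unseen keys
        else:                              # "insert" and "update" both upsert
            target[key] = change["data"]
    return target
```

In Databricks the same logic is usually expressed as a Delta Lake MERGE rather than row-by-row Python; the sketch just shows the semantics.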

Posted 1 week ago

Apply

5.0 - 10.0 years

15 - 25 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

- 5+ years of data and analytics experience, with a minimum of 3 years in Azure Cloud.
- Excellent communication and interpersonal skills.
- Extensive experience in the Azure stack: ADLS, Azure SQL DB, Azure Data Factory, Azure Databricks, Azure Synapse, Cosmos DB, Analysis Services, Event Hubs, etc.
- Experience in job scheduling using Oozie, Airflow, or any other ETL scheduler.
- Design and build production data pipelines from ingestion to consumption within a big data architecture, using Java, Python, or Scala.
- Good experience in designing and delivering data analytics solutions using Azure Cloud native services.
- Good experience in requirements analysis and solution architecture design, data modelling, ETL, data integration, and data migration design.
- Documentation of solutions (e.g. data models, configurations, and setup).
- Well versed with Waterfall, Agile, Scrum, and similar project delivery methodologies.
- Experienced in internal as well as external stakeholder management.
- Experience in MDM/DQM/data governance technologies like Collibra, Ataccama, Alation, or Reltio will be an added advantage.
- Azure Data Engineer or Azure Solution Architect certification will be an added advantage.
- Nice-to-have skills: working experience with Snowflake, Databricks, and the open-source stack (Hadoop big data, PySpark, Scala, Python, Hive, etc.).

Posted 1 week ago

Apply

4.0 - 7.0 years

10 - 20 Lacs

Hyderabad

Work from Office

Seeking a skilled and experienced hands-on Technical Production Support Lead with a focus on Azure data engineering and production support. The ideal candidate will work with a technical team in building and supporting data solutions on the Azure platform. This is an exciting opportunity to work on cutting-edge projects and be part of a dynamic team that thrives on innovation and collaboration.

Required Skills & Experience:
• 5+ years of experience in data engineering with a strong focus on Azure data services.
• Hands-on experience with Azure Data Factory, Azure Databricks, and Azure Synapse.
• Strong proficiency in Python, SQL, and PySpark.
• Proven experience in production support environments, with the ability to resolve critical issues efficiently.
• Solid understanding of ETL processes, data pipelines, and data management best practices.
• Ability to troubleshoot and optimize existing data pipelines and queries.
• Excellent communication skills, both written and verbal, with the ability to convey complex technical concepts to non-technical stakeholders.

Posted 1 week ago

Apply

10.0 - 20.0 years

20 - 35 Lacs

Noida

Remote

Position Overview: The primary focus of this position is to design, develop, and maintain robust data pipelines using Azure Data Factory, and to implement and manage ETL processes that ensure efficient data flow and transformation.

What you'll do as a BI Developer Lead:
- Design, develop, and maintain robust data pipelines using Azure Data Factory.
- Implement and manage ETL processes to ensure efficient data flow and transformation.
- Develop and maintain data models and data warehouses using Azure SQL Database and Azure Synapse Analytics.
- Create and manage Power BI reports and dashboards to provide actionable insights to stakeholders.
- Ensure data quality, integrity, and security across all data systems.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions.
- Optimize data storage and retrieval processes for performance and cost efficiency.
- Monitor and troubleshoot data pipelines and workflows to ensure smooth operations.
- Create and maintain tabular models for efficient data analysis and reporting.
- Stay updated with the latest Azure services and best practices to continuously improve data infrastructure.

What will you bring to the team:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Certification in Azure Data Engineer or related Azure certifications will be an added advantage.
- Experience with machine learning and AI services on Azure will be an added advantage.
- Proven experience in designing and maintaining data pipelines using Azure Data Factory.
- Strong proficiency in SQL and experience with Azure SQL Database.
- Hands-on experience with Azure Synapse Analytics and Azure Data Lake Storage.
- Proficiency in creating and managing Power BI reports and dashboards.
- Knowledge of Azure DevOps for CI/CD pipeline implementation.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.
- Knowledge of data governance and compliance standards.

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

faridabad, haryana

On-site

As a Senior Software Engineer at our organization, you will play a crucial role in leading and contributing to our healthcare data engineering initiatives. Your primary focus will be on building scalable, real-time data pipelines and processing large-scale healthcare datasets in a secure and compliant cloud environment. You will design and develop scalable real-time data streaming solutions using Apache Kafka and Python. Additionally, you will architect and implement ETL/ELT pipelines using Azure Databricks for both structured and unstructured healthcare data. It will be your responsibility to optimize and maintain Kafka applications, Python scripts, and Databricks workflows to ensure performance and reliability. Ensuring data integrity, security, and compliance with healthcare standards such as HIPAA and HITRUST will be a key aspect of your role. You will collaborate with data scientists, analysts, and business stakeholders to gather requirements and translate them into robust data solutions. Furthermore, you will mentor junior engineers, perform code reviews, and promote engineering best practices. To excel in this role, you should have 4+ years of hands-on experience in data engineering roles. Proficiency in Kafka (including Kafka Streams, Kafka Connect, Schema Registry) and Python for data processing and automation is essential. Experience with Azure Databricks (or readiness to ramp up quickly) and a solid understanding of cloud platforms, with a preference for Azure (AWS/GCP is a plus) are required. You should also possess strong knowledge of SQL and NoSQL databases, data modeling for large-scale systems, familiarity with containerization tools like Docker, and orchestration using Kubernetes. Exposure to CI/CD pipelines for data applications and prior experience with healthcare datasets (EHR, HL7, FHIR, claims data) is highly desirable. 
In addition to technical skills, excellent problem-solving abilities, a proactive mindset, and strong communication and interpersonal skills for working in cross-functional teams will be valuable assets in this role. Stay current with evolving technologies in cloud, big data, and healthcare data standards to contribute effectively to our projects. If you are passionate about data engineering, possess the required skills and qualifications, and are eager to make a significant impact in the healthcare industry, we look forward to receiving your application.
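The Kafka-plus-Python streaming work this listing describes is commonly structured so the per-message transform stays pure and testable, separate from the consumer loop. A minimal sketch under stated assumptions: the topic name, field names, and the kafka-python client are illustrative, not from the listing:

```python
import json

def normalize_event(raw):
    """Parse one message; drop events missing a patient identifier."""
    event = json.loads(raw)
    if not event.get("patient_id"):
        return None
    event.setdefault("source", "unknown")   # default a missing source field
    return event

def run_consumer():  # sketch only; requires a running Kafka broker to execute
    from kafka import KafkaConsumer  # pip install kafka-python
    consumer = KafkaConsumer("ehr-events", bootstrap_servers="localhost:9092")
    for msg in consumer:
        event = normalize_event(msg.value)
        if event is not None:
            print(event)
```

Keeping `normalize_event` free of Kafka dependencies is what makes the pipeline unit-testable without a broker.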

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

ahmedabad, gujarat

On-site

You should have more than 5 years of experience in designing and implementing end-to-end data solutions using Microsoft Azure Data Factory (ADF), Azure Data Lake Storage (ADLS), Azure Databricks, and SSIS. Your role will involve developing complex transformation logic using SQL Server, SSIS, and ADF, along with creating ETL jobs/pipelines to execute these mappings concurrently. You will also be responsible for maintaining and enhancing existing ETL pipelines, warehouses, and reporting leveraging the traditional MS SQL stack. It is crucial that you understand REST API principles and can create ADF pipelines that handle HTTP requests to APIs. You should be well versed in best practices for the development and deployment of SSIS packages, SQL jobs, and ADF pipelines. Additionally, you will be required to implement and manage source control practices using Git within Azure DevOps to ensure code integrity and facilitate collaboration. Your responsibilities will also include participating in the development and maintenance of CI/CD pipelines for automated testing and deployment of BI solutions. Preferred skills include an understanding of the Azure environment and developing Azure Logic Apps and Azure Function Apps, as well as knowledge of code deployment, Git, CI/CD, and deployment of developed ETL code (SSIS, ADF). This is a contractual/temporary or freelance job with a contract length of 6 months. The work location is in person.
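The paginated REST extraction that ADF pipelines for "HTTP requests to APIs" typically drive can be planned up front as a list of page URLs, one per copy iteration. A pure-Python sketch; the offset/limit parameter names are assumptions, not a specific API's contract:

```python
def page_urls(base_url, total_records, page_size=100):
    """Build one request URL per page of an offset/limit-paginated API."""
    pages = (total_records + page_size - 1) // page_size   # ceiling division
    return [f"{base_url}?offset={i * page_size}&limit={page_size}"
            for i in range(pages)]
```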

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

chennai, tamil nadu

On-site

You should have 5+ years of experience in data analysis, engineering, and science. Your proficiency should include Azure Data Factory, Azure Databricks, Python, PySpark, SQL, and PL/SQL or SAS. Your responsibilities will involve designing, developing, and maintaining ETL pipelines using Azure Databricks, Azure Data Factory, and other relevant technologies. You will be expected to manage and optimize data storage solutions using Azure Data Lake Storage (ADLS) and develop and deploy data processing workflows using PySpark and Python. Collaboration with data scientists, analysts, and stakeholders to understand data requirements and ensure data quality is essential. Implementing data integration solutions, ensuring seamless data flow across systems, and utilizing GitHub for version control and collaboration on the codebase are also part of the role. Monitoring and troubleshooting data pipelines to guarantee data accuracy and availability is crucial. It is imperative to stay updated with the latest industry trends and best practices in data engineering.
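A core building block of the ETL pipelines described above is the incremental (watermark-based) load: pick up only rows newer than the last successful run, then advance the watermark. A minimal sketch; the row shape and the "updated_at" field name are illustrative assumptions:

```python
def incremental_batch(rows, last_watermark):
    """Return rows newer than the watermark, plus the advanced watermark."""
    new_rows = [r for r in rows if r["updated_at"] > last_watermark]
    new_watermark = max((r["updated_at"] for r in new_rows),
                        default=last_watermark)   # no new rows: keep watermark
    return new_rows, new_watermark
```

In ADF/Databricks the same idea appears as a watermark column compared in the source query, with the new high-water mark persisted between runs.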

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

You have an exciting opportunity to join YASH Technologies as a Microsoft Fabric Professional. As part of our team, you will work with cutting-edge technologies to drive business transformation and create real positive changes in an increasingly virtual world. Your main responsibilities will include working with Azure Fabric, Azure Data Factory, Azure Databricks, Azure Synapse, Azure SQL, and ETL processes. You will create pipelines, datasets, dataflows, and integration runtimes, and monitor pipelines and trigger runs. Additionally, you will be involved in extracting, transforming, and loading data from source systems using Azure Databricks, as well as creating SQL scripts for complex queries. Moreover, you will work on creating Synapse pipelines to migrate data from Gen2 to Azure SQL, data migration pipelines to the Azure cloud (Azure SQL), and database migration from on-prem SQL Server to the Azure dev environment using Azure DMS and Data Migration Assistant. Experience in using Azure Data Catalog and big data batch processing, interactive processing, and real-time processing solutions is a plus. As a Microsoft Fabric Professional, you are encouraged to pursue relevant certifications to enhance your skills. At YASH Technologies, we provide a supportive and inclusive team environment where you can create a career that aligns with your goals. Our Hyperlearning workplace is built on flexibility, emotional positivity, trust, transparency, and open collaboration to help you achieve your business goals while maintaining stable employment in a great atmosphere with an ethical corporate culture.

Posted 1 week ago

Apply

5.0 - 10.0 years

15 - 22 Lacs

Pune

Remote

Key Responsibilities:
- At least 5 years of experience in data engineering, with a strong background in Azure Databricks and Scala/Python.
- Databricks, with knowledge of PySpark.
- Database: Oracle or any other database.
- Programming: Python, with awareness of Flask or Streamlit.

Posted 1 week ago

Apply

7.0 - 12.0 years

25 - 35 Lacs

Kochi, Bengaluru, Thiruvananthapuram

Hybrid

Position: Data Engineer (Azure Databricks)
Experience: 7+ years
Locations: Trivandrum, Kochi, Bangalore
No. of Positions: 20
Notice Period: 0-15 days (strictly)
CTC: Up to 40 LPA (case-to-case basis)

Mandatory Skills: Azure Databricks, PySpark, SQL, Python

Key Responsibilities:
- Develop and optimize robust data pipelines using Databricks, PySpark, and Azure
- Work on complex ETL/ELT processes, transforming and modeling data for analytics and reporting
- Build scalable data solutions using relational and big data engines
- Apply a strong understanding of data warehousing concepts (e.g., Kimball/star schema)
- Collaborate with cross-functional teams in Agile environments
- Ensure clean code, versioning, documentation, and pipeline maintainability
- Must be able to work on a MacBook Pro (mandatory for script compatibility)

Requirements:
- 7+ years of hands-on experience in data engineering
- Expertise in the Azure cloud platform and Databricks notebooks
- Proficiency in SQL, Python, and PySpark
- Good communication and collaboration skills
- Solid documentation and version control practices

Preferred Candidates:
- Immediate joiners or those with 0-15 days notice
- Comfortable working from Trivandrum, Kochi, or Bangalore
- Previous experience in data-heavy environments with real-time or batch processing
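The Kimball/star-schema concept this listing names comes down to one mechanical step: look up (or assign) a surrogate key in a dimension, then emit fact rows carrying the surrogate key instead of the natural key. A minimal pure-Python sketch with illustrative names:

```python
def load_fact(events, dim):
    """dim maps natural key -> surrogate key; unseen customers get new keys."""
    facts = []
    for e in events:
        nk = e["customer"]
        if nk not in dim:
            dim[nk] = len(dim) + 1      # assign the next surrogate key
        facts.append({"customer_sk": dim[nk], "amount": e["amount"]})
    return facts
```

In a warehouse this lookup is a join against the dimension table; the sketch shows why facts stay narrow and dimensions carry the descriptive attributes.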

Posted 1 week ago

Apply

5.0 - 10.0 years

8 - 15 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Job description
Role: Azure Databricks
Share your resume and details with Allen.Prashanth@ltimindtree.com.
Experience: 5-12 years
Location: Bangalore/Chennai/Kolkata/Coimbatore/Mumbai/Pune/Hyderabad
A forms link is provided for a quicker turnaround on responses; applications will be considered only if the link is filled in.

Overview: Data Engineer with good experience in Azure Databricks and PySpark.
Must have: Databricks, Python, Azure
Good to have: ADF

Requirements:
- The candidate must be proficient in Databricks.
- 5-12 years of experience with Databricks Delta Lake.
- Hands-on experience with Azure.
- Experience in Python scripting.
- Relevant experience with ETL methods and with retrieving data from dimensional data models and data warehouses.
- Strong experience with relational databases and data access methods, especially SQL.
- Knowledge of Azure architecture and design.

Posted 1 week ago

Apply

8.0 - 13.0 years

7 - 11 Lacs

Pune

Work from Office

Capco, a Wipro company, is a global technology and management consulting firm, awarded Consultancy of the Year in the British Bank Awards and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence across 32 cities across the globe, we support 100+ clients across the banking, financial and energy sectors. We are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO?
You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers and other key players in the industry - projects that will transform the financial services industry.

MAKE AN IMPACT
Innovative thinking, delivery excellence and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK
Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT
With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION
We believe that diversity of people and perspective gives us a competitive advantage.

JOB SUMMARY:
Position: Senior Consultant - Data Engineer
Location: Capco locations (Bengaluru/Chennai/Hyderabad/Pune/Mumbai/Gurugram)
Band: M3/M4 (8 to 14 years)

Responsibilities:
- Design, build and optimise data pipelines and ETL processes in Azure Databricks, ensuring high performance, reliability, and scalability.
- Implement best practices for data ingestion, transformation, and cleansing to ensure data quality and integrity.
- Work within the client's best practice guidelines as set out by the Data Engineering Lead.
- Work with data modellers and testers to ensure pipelines are implemented correctly.
- Collaborate as part of a cross-functional team to understand business requirements and translate them into technical solutions.

Role Requirements:
- Strong data engineer with experience in financial services.
- Knowledge of and experience building data pipelines in Azure Databricks.
- Demonstrate a continual desire to implement strategic or optimal solutions and, where possible, avoid workarounds or short-term tactical solutions.
- Work within an Agile team.

Experience/Skillset:
- 8+ years of experience in data engineering
- Good skills in SQL, Python and PySpark
- Good knowledge of Azure Databricks (understanding of delta tables, Apache Spark, Unity Catalog)
- Experience writing, optimizing, and analyzing SQL and PySpark code, with a robust capability to interpret complex data requirements and architect solutions
- Good knowledge of the SDLC
- Familiar with Agile/Scrum ways of working
- Strong verbal and written communication skills
- Ability to manage multiple priorities and deliver to tight deadlines

We offer:
- A work culture focused on innovation and creating lasting value for our clients and employees
- Ongoing learning opportunities to help you acquire new skills or deepen existing expertise
- A flat, non-hierarchical structure that will enable you to work with senior partners and directly with clients
- A diverse, inclusive, meritocratic culture

#LI-Hybrid
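The delta-table work this role centres on usually means upserting via a MERGE statement. A hedged sketch that composes the standard Delta Lake MERGE shape from key and update columns; the table and column names are placeholders:

```python
def build_merge_sql(target, source, key_cols, update_cols):
    """Compose a Delta-style MERGE INTO upsert statement (illustrative names)."""
    on = " AND ".join(f"t.{c} = s.{c}" for c in key_cols)
    sets = ", ".join(f"t.{c} = s.{c}" for c in update_cols)
    cols = ", ".join(key_cols + update_cols)
    vals = ", ".join(f"s.{c}" for c in key_cols + update_cols)
    return (
        f"MERGE INTO {target} t USING {source} s ON {on} "
        f"WHEN MATCHED THEN UPDATE SET {sets} "
        f"WHEN NOT MATCHED THEN INSERT ({cols}) VALUES ({vals})"
    )
```

In Databricks the resulting string would be run with `spark.sql(...)` against a Delta table; generating it from column lists keeps many near-identical upserts maintainable.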

Posted 1 week ago

Apply

5.0 - 7.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Educational Requirements: MCA, MSc, Bachelor of Engineering, BBA, BCom
Service Line: Data & Analytics Unit

Responsibilities:
A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions and facilitate deployment, resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs into solution design based on areas of expertise. You will plan configuration activities, configure the product as per the design, conduct conference room pilots and assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/proof-of-technology workshops, and prepare effort estimates that suit the customer's budgetary requirements and are in line with the organization's financial guidelines. You will actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Ability to develop value-creating strategies and models that enable clients to innovate, drive growth and increase their business profitability
- Good knowledge of software configuration management systems
- Awareness of the latest technologies and industry trends
- Logical thinking and problem-solving skills, along with an ability to collaborate
- Understanding of the financial processes for various types of projects and the various pricing models available
- Ability to assess current processes, identify improvement areas and suggest technology solutions
- Knowledge of one or two industry domains
- Client interfacing skills
- Project and team management

Technical and Professional Requirements: Python, PySpark, ETL, Data Pipeline, Big Data, AWS, GCP, Azure, Data Warehousing, Spark, Hadoop
Preferred Skills: Technology-Big Data-Big Data - ALL

Posted 1 week ago

Apply

3.0 - 8.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Educational Requirements: MCA, MSc, Bachelor of Engineering, BBA, BSc
Service Line: Data & Analytics Unit

Responsibilities:
A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management and adherence to organizational guidelines and processes. You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Knowledge of more than one technology
- Basics of architecture and design fundamentals
- Knowledge of testing tools
- Knowledge of agile methodologies
- Understanding of project life cycle activities on development and maintenance projects
- Understanding of one or more estimation methodologies; knowledge of quality processes
- Basics of the business domain, to understand the business requirements
- Analytical abilities, strong technical skills, good communication skills
- Good understanding of the technology and domain
- Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles and modelling methods
- Awareness of the latest technologies and trends
- Excellent problem-solving, analytical and debugging skills

Technical and Professional Requirements:
Primary skills: Technology-Cloud Platform-Azure Development & Solution Architecting
Preferred Skills: Technology-Cloud Platform-Azure Development & Solution Architecting
Generic Skills: Technology-Cloud Integration-Azure Data Factory (ADF)

Posted 1 week ago

Apply

3.0 - 8.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Educational Requirements: MCA, MSc, Bachelor of Engineering, BSc, Bachelor of Business Administration and Bachelor of Legislative Law (BBA LLB)
Service Line: Data & Analytics Unit

Responsibilities:
A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management and adherence to organizational guidelines and processes. You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements:
Primary skills: Technology-Cloud Platform-Azure Analytics Services-Azure Data Lake
Preferred Skills: Technology-Cloud Platform-Azure Development & Solution Architecting

Posted 1 week ago

Apply

9.0 - 11.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Educational Requirements Bachelor of Engineering Service Line Data & Analytics Unit Responsibilities A day in the life of an Infoscion As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized high quality code deliverables, continual knowledge management and adherence to the organizational guidelines and processes. You would be a key contributor to building efficient programs/ systems and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! 
Additional Responsibilities:
• Knowledge of more than one technology
• Basics of architecture and design fundamentals
• Knowledge of testing tools
• Knowledge of agile methodologies
• Understanding of project life cycle activities on development and maintenance projects
• Understanding of one or more estimation methodologies; knowledge of quality processes
• Basics of the business domain, to understand the business requirements
• Analytical abilities, strong technical skills and good communication skills
• Good understanding of the technology and domain
• Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles and modelling methods
• Awareness of the latest technologies and trends
• Excellent problem solving, analytical and debugging skills

Technical and Professional Requirements:
• Primary skills: Technology->Cloud Platform->Azure Development & Solution Architecting
Preferred Skills: Technology->Cloud Platform->Azure Analytics Services->Azure Databricks

Posted 1 week ago

Apply

3.0 - 6.0 years

14 - 18 Lacs

Kochi

Work from Office

As an Associate Software Developer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
• Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
• Designing and implementing various enterprise search applications, such as Elasticsearch and Splunk, for client requirements
• Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
• Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
• Total experience 6-7 years (relevant 4-5 years)
• Mandatory skills: Azure Databricks, Python/PySpark, SQL, GitHub, Azure DevOps, Azure Blob
• Ability to use programming languages like Java, Python, Scala, etc. to build pipelines that extract and transform data from a repository to a data consumer
• Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed
• Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop and Java

Preferred technical and professional experience:
• You thrive on teamwork and have excellent verbal and written communication skills
• Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions
• Ability to communicate results to technical and non-technical audiences
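The extract-transform-load pattern these postings ask for can be sketched in miniature. This is an illustrative toy in plain Python, not the employer's actual pipeline; the field names ("user_id", "amount") are hypothetical, and in a Databricks job the same steps would typically operate on Spark DataFrames rather than in-memory lists:

```python
# Toy sketch of an extract-transform-load (ETL) pipeline.
# All record shapes and field names here are hypothetical examples.

def extract(rows):
    """Pull raw records from a source (here, an in-memory list)."""
    return list(rows)

def transform(records):
    """Cleanse and reshape: drop rows missing the key, normalise amounts."""
    return [
        {"user_id": r["user_id"], "amount": round(float(r["amount"]), 2)}
        for r in records
        if r.get("user_id") is not None
    ]

def load(records, target):
    """Deliver curated records to a consumer (here, another list)."""
    target.extend(records)
    return target

raw = [
    {"user_id": 1, "amount": "19.994"},
    {"user_id": None, "amount": "5.00"},   # dropped during cleansing
    {"user_id": 2, "amount": "7.5"},
]
curated = load(transform(extract(raw)), [])
print(curated)  # the two rows with a valid user_id survive
```

The same shape scales up: in PySpark, `extract` becomes a `spark.read` call, `transform` a chain of DataFrame operations, and `load` a `DataFrame.write` to the target store.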

Posted 1 week ago

Apply

7.0 - 12.0 years

18 - 22 Lacs

Bengaluru

Work from Office

Consult with clients and propose architectural solutions to help move and improve infrastructure from on-premises to cloud, or to help optimize cloud spend when moving from one public cloud to another. Be the first to experiment with new-age cloud offerings; help define best practice as a thought leader for cloud, automation and DevOps; be a solution visionary and technology expert across multiple channels. A good understanding of cloud design principles, sizing, multi-zone/cluster setup, resiliency and DR design is expected. A Solution Architect or similar Azure certification is a must, along with good business judgment, a comfortable, open communication style, and a willingness and ability to work with customers and teams. Strong communication skills and the ability to lead discussions with client technical experts, application teams and vendors are needed to drive collaboration and a design-thinking model towards the desired objective.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
• Experience participating in technical reviews of requirements, designs, code and other artifacts, using your multicloud experience to build hybrid-cloud solutions for customers
• Provide leadership to project teams and facilitate the definition of project deliverables around core cloud-based technology and methods
• Define tracking mechanisms and ensure IT standards and methodology are met; deliver quality results
• Sound knowledge of SRE principles and the ability to address performance issues through design or coding is a must
• Implement observability; develop and support a pipeline model to deploy key features and changes
• Security, risk and compliance: advise customers on best practices around access management, network setup, regulatory compliance and related areas

Preferred technical and professional experience:
• 10-15 years of experience, with at least 5+ years of hands-on experience in Azure cloud computing and IT operational experience in a global enterprise environment
• Experience in Azure Databricks is preferred
• Must have Azure DevOps experience, expertise across Azure services, database and operating systems experience, and good automation skills (e.g. Terraform, Ansible)
• Should work on IBM Cloud projects as and when needed

Posted 1 week ago

Apply

2.0 - 5.0 years

4 - 8 Lacs

Mumbai

Work from Office

The ability to be a team player; the ability and skill to train other people in procedural and technical topics; strong communication and collaboration skills.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise: Able to write complex SQL queries; experience in Azure Databricks
Preferred technical and professional experience: Excellent communication and stakeholder management skills
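"Complex SQL" in these listings usually means queries combining CTEs, aggregation and window functions. A minimal, hedged illustration of that style, using Python's built-in sqlite3 as a stand-in for Databricks SQL (the `sales` table and its columns are invented for the example; window functions need SQLite 3.25+):

```python
# Example of a "complex" query: a CTE feeding a window function.
# sqlite3 stands in for a Databricks SQL warehouse; schema is hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('east', 100), ('east', 300), ('west', 200), ('west', 50);
""")

query = """
WITH regional AS (
    SELECT region, SUM(amount) AS total
    FROM sales
    GROUP BY region
)
SELECT region,
       total,
       RANK() OVER (ORDER BY total DESC) AS rnk
FROM regional
ORDER BY rnk;
"""
rows = conn.execute(query).fetchall()
print(rows)  # regions ranked by total sales, highest first
```

The same SQL text would run largely unchanged via `spark.sql(query)` against a Databricks table, since CTEs and window functions are part of standard Spark SQL.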

Posted 1 week ago

Apply

3.0 - 6.0 years

14 - 18 Lacs

Kochi

Work from Office

Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field. 7+ years total experience in data engineering projects and 4+ years of relevant experience with Azure technology services and Python.
• Azure: Azure Data Factory, ADLS (Azure Data Lake Store), Azure Databricks
• Mandatory programming languages: PySpark, PL/SQL, Spark SQL
• Database: SQL DB
• Experience with Azure ADLS, Databricks, Stream Analytics, SQL DW, Cosmos DB, Analysis Services, Azure Functions, serverless architecture and ARM templates
• Experience with relational SQL and NoSQL databases, including Postgres and Cassandra
• Experience with object-oriented/object function scripting languages: Python, SQL, Scala, Spark SQL, etc.
• Data warehousing experience with strong domain knowledge

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise: An intuitive individual with an ability to manage change and proven time management; proven interpersonal skills while contributing to team effort by accomplishing related results as needed; up-to-date technical knowledge maintained by attending educational workshops and reviewing publications

Preferred technical and professional experience: Experience with Azure ADLS, Databricks, Stream Analytics, SQL DW, Cosmos DB, Analysis Services, Azure Functions, serverless architecture and ARM templates; experience with relational SQL and NoSQL databases, including Postgres and Cassandra; experience with object-oriented/object function scripting languages (Python, SQL, Scala, Spark SQL, etc.)

Posted 1 week ago

Apply

3.0 - 6.0 years

14 - 18 Lacs

Bengaluru

Work from Office

As an Associate Software Developer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
• Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
• Designing and implementing various enterprise search applications, such as Elasticsearch and Splunk, for client requirements
• Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
• Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
• Total experience 3-6 years (relevant 4-5 years)
• Mandatory skills: Azure Databricks, Python/PySpark, SQL, GitHub, Azure DevOps, Azure Blob
• Ability to use programming languages like Java, Python, Scala, etc. to build pipelines that extract and transform data from a repository to a data consumer
• Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed
• Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop and Java

Preferred technical and professional experience:
• You thrive on teamwork and have excellent verbal and written communication skills
• Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions
• Ability to communicate results to technical and non-technical audiences

Posted 1 week ago

Apply