7 - 10 years
25 - 30 Lacs
Hyderabad
Work from Office
Requirements:
- Proficient in Azure Databricks, Azure Data Factory, and Azure SQL DB.
- Proficient in designing and developing ingestion pipelines with Delta Live Tables (DLT) and Structured Streaming.
- Proficient in Databricks/Spark internals and Spark job optimization techniques.
- Proficient in designing pipelines with the latest Databricks/Spark optimization techniques.
- Proficient in PySpark, SQL, and Python for data engineering.
- Experience developing and maintaining a metadata-driven data ingestion framework.
- Extensive experience with data warehousing concepts.
- Databricks professional certification is preferred.
- Exhibits leadership skills.
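The "metadata-driven data ingestion framework" asked for above is a common Databricks pattern: source definitions live in configuration rather than in per-source pipeline code, and one generic loop drives ingestion. A minimal plain-Python sketch of the dispatch idea follows; all names (`SOURCES`, `ingest`, the paths and table names) are illustrative assumptions, and a real implementation would issue PySpark/Auto Loader reads instead of building strings.

```python
# Illustrative metadata-driven ingestion dispatcher (not any employer's
# actual framework). Each source is a config record; one generic loop
# drives ingestion instead of hand-written per-source pipelines.

SOURCES = [
    {"name": "orders", "format": "csv", "path": "/landing/orders", "target": "bronze.orders"},
    {"name": "events", "format": "json", "path": "/landing/events", "target": "bronze.events"},
]

def ingest(source: dict) -> str:
    # In a real pipeline this would be spark.readStream with Auto Loader
    # (format "cloudFiles") writing a Delta table; here we return a
    # description of the work so the sketch stays self-contained.
    return f"load {source['format']} from {source['path']} into {source['target']}"

def run_all(sources: list) -> list:
    # Adding a new source is a config change, not a code change.
    return [ingest(s) for s in sources]

if __name__ == "__main__":
    for line in run_all(SOURCES):
        print(line)
```

The payoff of the pattern is that onboarding a new feed only appends a record to the config table.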
Posted 2 months ago
5 - 9 years
5 - 15 Lacs
Hyderabad, Bengaluru, Mumbai (All Areas)
Hybrid
Job Title: Data Architect
Location: Hyderabad, Bangalore, Mumbai (Hybrid)
Job Type: Contract, 1+ year
Years of Experience: 5-9 years

Job Summary: We are seeking a Data Architect with strong hands-on experience designing and building real-time data streaming solutions using PySpark on Azure Databricks. This role requires a blend of data architecture expertise and practical engineering skills to build scalable, efficient streaming pipelines in a cloud environment.

Key Responsibilities:
- Design and implement real-time data streaming pipelines using PySpark on Azure Databricks.
- Define and maintain the overall data architecture, ensuring scalability, performance, and security.
- Collaborate with data engineers, analysts, and business teams to understand data requirements.
- Optimize data workflows for performance and cost efficiency in Azure.
- Ensure data quality, governance, and compliance with organizational standards.
- Document architectural decisions, data flows, and system design.

Required Skills & Experience:
- 5+ years of experience in data engineering or architecture roles.
- Strong expertise in PySpark, especially for streaming data (Structured Streaming).
- Hands-on experience with Azure Databricks and related Azure data services (e.g., Event Hubs, Data Lake, Synapse).
- Proficiency in designing and deploying scalable data pipelines in cloud environments.
- Solid understanding of data modeling, ETL/ELT, and real-time data processing.
- Strong problem-solving skills and the ability to work independently or in a team.
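The Structured Streaming expertise this role calls for rests on one core contract: each trigger processes only records newer than the last committed offset, and progress is persisted in a checkpoint so a restart resumes rather than reprocesses. That idea can be sketched without a Spark cluster; the toy loop below is a stand-in under that assumption (in PySpark the equivalent is `spark.readStream ... .writeStream.option("checkpointLocation", ...)`), and `run_batches` and its record format are invented for illustration.

```python
# Toy micro-batch loop illustrating the checkpoint/offset contract behind
# Structured Streaming: each trigger consumes only records past the last
# committed offset, so a restart resumes instead of reprocessing.

def run_batches(records, checkpoint, batch_size=2):
    out = []
    while checkpoint["offset"] < len(records):
        start = checkpoint["offset"]
        batch = records[start:start + batch_size]
        out.extend(r.upper() for r in batch)       # the "transformation"
        checkpoint["offset"] = start + len(batch)  # commit progress
    return out

checkpoint = {"offset": 0}
print(run_batches(["a", "b", "c"], checkpoint))  # ['A', 'B', 'C']
print(checkpoint["offset"])                      # 3
# A "restart" with the same checkpoint processes nothing new:
print(run_batches(["a", "b", "c"], checkpoint))  # []
```

Committing the offset only after the batch succeeds is what gives streaming pipelines their at-least-once (and, with idempotent sinks, effectively exactly-once) behavior.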
Posted 2 months ago
7 - 10 years
20 - 30 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Skills: Python, SQL, PySpark, Azure Databricks, Data Pipelines
- SQL: strong T-SQL skills, stored procedure development and troubleshooting, schema management, data-issue analysis, query performance analysis.
- Python: intermediate development skills covering data frames, the Pandas library, Parquet file management, and deployment on cloud.
- Databricks: PySpark and data frames, Azure Databricks notebook management and troubleshooting, Azure Databricks architecture.
- Azure Data Factory (ADF)/Synapse/Data Explorer: data pipeline design and troubleshooting, Azure linked services management.
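The "query performance analysis" skill above usually starts with reading the query plan and checking whether a predicate is served by an index or by a full scan. T-SQL has its own tooling for this (graphical execution plans, `SET SHOWPLAN_ALL`, `sys.dm_exec_query_stats`); the sketch below uses Python's bundled sqlite3 purely so the illustration is self-contained, and the table and index names are made up.

```python
# Query-plan inspection, the core of query performance analysis.
# sqlite3 is used only to keep this runnable anywhere; on SQL Server the
# equivalent is the execution plan, not EXPLAIN QUERY PLAN.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales (region, amount) VALUES (?, ?)",
                 [("north", 10.0), ("south", 20.0), ("north", 5.0)])

query = "SELECT * FROM sales WHERE region = 'north'"

# Without an index, the filter is a full table scan...
plan = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(plan[0][3])  # plan detail mentions a scan of "sales"

# ...after adding an index, the plan switches to an index search.
conn.execute("CREATE INDEX idx_region ON sales(region)")
plan = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(plan[0][3])  # plan detail mentions "idx_region"
```

The diagnostic habit transfers directly: confirm in the plan, not by guesswork, that the engine uses the access path you expect.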
Posted 2 months ago
3 - 8 years
5 - 15 Lacs
Hyderabad, Bengaluru, Mumbai (All Areas)
Hybrid
Databuzz is hiring Data Engineers (PAN India), 3+ years, hybrid, immediate joiners. Please mail your profile to alekya.chebrolu@databuzzltd.com with the details below if you are interested.

About Databuzz Ltd: Databuzz is a one-stop shop for data analytics, specializing in Data Science, Big Data, Data Engineering, AI & ML, Cloud Infrastructure, and DevOps. We are an MNC based in both the UK and India, and an ISO 27001- and GDPR-compliant company.

Details to include: CTC, ECTC, Notice Period/LWD (candidates serving notice period will be preferred).

Position: Data Engineer (PAN India), 3+ years, hybrid, immediate joiners.
Experience: 3+ years.

Mandatory Skills:
- Proficiency in PySpark or Scala and SQL for data processing tasks.
- Hands-on experience with Azure Databricks: Delta Lake, Delta Live Tables, Auto Loader, and Databricks SQL.
- Expertise with Azure Data Lake Storage (ADLS Gen2) for optimized data storage and retrieval.
- Strong knowledge of data modelling, ETL processes, and data warehousing concepts.

Regards,
Alekya Ch
Talent Acquisition Specialist
alekya.chebrolu@databuzzltd.com
Posted 2 months ago
5 - 7 years
7 - 9 Lacs
Mumbai, Chennai
Hybrid
About the role
We are seeking an experienced Senior Data Developer to join our data engineering team, responsible for building and maintaining complex data solutions using Azure Data Factory (ADF), Azure Databricks, and Cosmos DB. The role involves designing and developing scalable data pipelines, implementing data transformations, and ensuring high data quality and performance. The Senior Data Developer will work closely with data architects, testers, and analysts to deliver robust data solutions that support strategic business initiatives. The ideal candidate should possess deep expertise in big data technologies, data integration, and cloud-native data engineering solutions on Microsoft Azure. This role also involves coaching junior developers, conducting code reviews, and driving strategic improvements in data architecture and design patterns.

Key Responsibilities
- Data Solution Design and Development: Design and develop scalable, high-performance data pipelines using Azure Data Factory (ADF). Implement data transformations and processing using Azure Databricks. Develop and maintain NoSQL data models and queries in Cosmos DB. Optimize data pipelines for performance, scalability, and cost efficiency.
- Data Integration and Architecture: Integrate structured and unstructured data from diverse data sources. Collaborate with data architects to design end-to-end data flows and system integrations. Implement data security, governance, and compliance standards.
- Performance Tuning and Optimization: Monitor and tune data pipelines and processing jobs for performance and cost efficiency. Optimize data storage and retrieval strategies for Azure SQL and Cosmos DB.
- Collaboration and Mentoring: Collaborate with cross-functional teams including data testers, architects, and business analysts. Conduct code reviews and provide constructive feedback to improve code quality. Mentor junior developers, fostering best practices in data engineering and cloud development.

Primary Skills
- Data Engineering: Azure Data Factory (ADF), Azure Databricks.
- Cloud Platform: Microsoft Azure (Data Lake Storage, Cosmos DB).
- Data Modeling: NoSQL data modeling, data warehousing concepts.
- Performance Optimization: Data pipeline performance tuning and cost optimization.
- Programming Languages: Python, SQL, PySpark.

Secondary Skills
- DevOps and CI/CD: Azure DevOps, CI/CD pipeline design and automation.
- Security and Compliance: Implementing data security and governance standards.
- Agile Methodologies: Experience in Agile/Scrum environments.
- Leadership and Mentoring: Strong communication and coaching skills for team collaboration.

Soft Skills
- Strong problem-solving abilities and attention to detail.
- Excellent communication skills, both verbal and written.
- Effective time management and organizational capabilities.
- Ability to work independently and within a collaborative team environment.
- Strong interpersonal skills to engage with cross-functional teams.

Educational Qualifications
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
- Relevant certifications in Azure and data engineering, such as: Microsoft Certified: Azure Data Engineer Associate; Microsoft Certified: Azure Solutions Architect Expert; Databricks Certified Data Engineer Associate or Professional.

About the Team
As a Senior Data Developer, you will work with a dynamic, cross-functional team that includes developers, product managers, and quality engineers. You will be a key player in the quality assurance process, helping shape testing strategies and ensuring the delivery of high-quality web applications.
Posted 2 months ago
1 - 4 years
6 - 10 Lacs
Gurugram
Work from Office
KDataScience (USA & India) is looking for a Jr. Data Engineer (Python & MySQL) to join our dynamic team and embark on a rewarding career journey.
- Liaise with coworkers and clients to elucidate the requirements for each task.
- Conceptualize and generate infrastructure that allows big data to be accessed and analyzed.
- Reformulate existing frameworks to optimize their functioning.
- Test such structures to ensure that they are fit for use.
- Prepare raw data for manipulation by data scientists.
- Detect and correct errors in your work.
- Ensure that your work remains backed up and readily accessible to relevant coworkers.
- Remain up to date with industry standards and technological advancements that will improve the quality of your outputs.
Posted 2 months ago
4 - 7 years
3 - 8 Lacs
Chennai
Hybrid
Intermediate Data Developer (Azure Databricks + Cosmos DB + SQL + ETL + SSIS)
Job Title: Intermediate Data Developer, Azure ADF and Databricks
Experience Range: 4-7 Years
Location: Chennai, Hybrid
Employment Type: Full-Time

About the role
We are seeking an experienced Senior Data Developer to join our data engineering team, responsible for building and maintaining complex data solutions using Azure Data Factory (ADF), Azure Databricks, and Cosmos DB. The role involves designing and developing scalable data pipelines, implementing data transformations, and ensuring high data quality and performance. The Senior Data Developer will work closely with data architects, testers, and analysts to deliver robust data solutions that support strategic business initiatives. The ideal candidate should possess deep expertise in big data technologies, data integration, and cloud-native data engineering solutions on Microsoft Azure. This role also involves coaching junior developers, conducting code reviews, and driving strategic improvements in data architecture and design patterns.

Key Responsibilities
- Data Solution Design and Development: Design and develop scalable, high-performance data pipelines using Azure Data Factory (ADF). Implement data transformations and processing using Azure Databricks. Develop and maintain NoSQL data models and queries in Cosmos DB. Optimize data pipelines for performance, scalability, and cost efficiency.
- Data Integration and Architecture: Integrate structured and unstructured data from diverse data sources. Collaborate with data architects to design end-to-end data flows and system integrations. Implement data security, governance, and compliance standards.
- Performance Tuning and Optimization: Monitor and tune data pipelines and processing jobs for performance and cost efficiency. Optimize data storage and retrieval strategies for Azure SQL and Cosmos DB.
- Collaboration and Mentoring: Collaborate with cross-functional teams including data testers, architects, and business analysts. Conduct code reviews and provide constructive feedback to improve code quality. Mentor junior developers, fostering best practices in data engineering and cloud development.

Primary Skills
- Data Engineering: Azure Data Factory (ADF), Azure Databricks.
- Cloud Platform: Microsoft Azure (Data Lake Storage, Cosmos DB).
- Data Modeling: NoSQL data modeling, data warehousing concepts.
- Performance Optimization: Data pipeline performance tuning and cost optimization.
- Programming Languages: Python, SQL, PySpark.

Secondary Skills
- DevOps and CI/CD: Azure DevOps, CI/CD pipeline design and automation.
- Security and Compliance: Implementing data security and governance standards.
- Agile Methodologies: Experience in Agile/Scrum environments.
- Leadership and Mentoring: Strong communication and coaching skills for team collaboration.

Soft Skills
- Strong problem-solving abilities and attention to detail.
- Excellent communication skills, both verbal and written.
- Effective time management and organizational capabilities.
- Ability to work independently and within a collaborative team environment.
- Strong interpersonal skills to engage with cross-functional teams.

Educational Qualifications
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
- Relevant certifications in Azure and data engineering, such as: Microsoft Certified: Azure Data Engineer Associate; Microsoft Certified: Azure Solutions Architect Expert; Databricks Certified Data Engineer Associate or Professional.

About the Team
As a Senior Data Developer, you will work with a dynamic, cross-functional team that includes developers, product managers, and quality engineers. You will be a key player in the quality assurance process, helping shape testing strategies and ensuring the delivery of high-quality web applications.
Posted 2 months ago
6 - 11 years
10 - 15 Lacs
Gurugram, Chennai, Mumbai (All Areas)
Hybrid
Position: Senior Data Engineer with Azure & Java
Location: Chennai, Mumbai & Gurugram
Position Type: Permanent
Work Mode: Hybrid
Notice Period: Immediate to 30 days

Job Description:
- Bachelor's or Master's degree in computer science, engineering, mathematics, statistics, or an equivalent technical discipline.
- 7+ years of experience working with data mapping, data analysis, and numerous large data sets/data warehouses.
- Strong application development experience using Java and C++.
- Strong experience with Azure Databricks, Azure Data Explorer, ADLS Gen2, and Event Hubs technologies.
- Experience with application containerization and deployment processes (Docker, GitHub, CI/CD pipelines).
- Experience working with Cosmos DB is preferred.
- Ability to assemble, analyze, and evaluate big data and to make appropriate, well-reasoned recommendations to stakeholders.
- Good analytical and problem-solving skills; good understanding of different data structures and algorithms and their use in solving business problems.
- Strong communication (verbal and written) and customer service skills.
- Strong interpersonal, communication, and presentation skills applicable to a wide audience, including senior and executive management, customers, etc.
- Strong skills in setting, communicating, implementing, and achieving business objectives and goals.
- Strong organization/project planning, time management, and change management skills across multiple functional groups and departments, and strong delegation skills, including prioritizing and reprioritizing projects and managing projects of varying size and complexity.

Accountabilities:
- Work in iterative processes to map data into common formats, perform advanced data analysis, validate findings or test hypotheses, and communicate results and methodology.
- Provide recommendations on how to utilize our data to optimize our search, increase data accuracy, and help us better understand our existing data.
- Communicate technical information successfully with technical and non-technical audiences such as third-party vendors, external customers' technical departments, various levels of management, and other relevant parties.
- Collaborate effectively with all team members and attend regular team meetings.
Posted 2 months ago
7 - 12 years
9 - 14 Lacs
Ahmedabad
Work from Office
Project Role: Business Process Architect
Project Role Description: Design business processes, including characteristics and key performance indicators (KPIs), to meet process and functional requirements. Work closely with the Application Architect to create the process blueprint and establish business process requirements to drive out application requirements and metrics. Assist in quality management reviews, ensuring all business and design requirements are met. Educate stakeholders to ensure a complete understanding of the designs.
Must-have skills: Data Analytics, Data Warehouse ETL Testing, Big Data Analysis Tools and Techniques, Hadoop Administration
Good-to-have skills: NA
Minimum 7.5 years of experience is required.
Educational Qualification: Specific undergraduate qualifications, i.e., engineering or computer science.

Summary: Experienced Data Engineer with a strong background in Azure data services and broadcast supply chain ecosystems. Skilled in OTT streaming protocols, cloud technologies, and project management.

Roles & Responsibilities:
- Proven experience as a Data Engineer or in a similar role.
- Lead and provide expert guidance to the Principal, Solutions & Integration.
- Track and report on project progress using internal applications.
- Transition customer requirements to on-air operations with proper documentation.
- Scope projects and ensure adherence to budgets and timelines.
- Generate design and integration documentation.

Professional & Technical Skills:
- Strong proficiency in Azure data services (Azure Data Factory, Azure Databricks, Azure SQL Database).
- Experience with SQL, Python, and big data tools (Hadoop, Spark, Kafka).
- Familiarity with data warehousing, ETL techniques, and microservices in a cloud environment.
- Knowledge of broadcast supply chain ecosystems (BMS, RMS, MAM, Playout, MCR/PCR, NLE, Traffic).
- Experience with OTT streaming protocols, DRM, and content delivery networks.
- Working knowledge of cloud technologies (Azure, Docker, Kubernetes, AWS basics, GCP basics).
- Basic understanding of AWS Media Services (MediaConnect, Elemental, MediaLive, MediaStore, Media2Cloud, S3, Glacier).

Additional Information:
- Minimum of 5 years' experience in data analytics disciplines.
- Good presentation and documentation skills.
- Excellent interpersonal skills.
- Undergraduate qualifications in engineering or computer science.

Networking: Apply basic networking knowledge, including TCP/IP, UDP/IP, IGMP, DHCP, DNS, and LAN/WAN technologies, to support video delivery systems.
Highly Desirable: Experience in defining technical solutions with over 99.999% reliability.
Posted 2 months ago
12 - 17 years
14 - 19 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: Python (Programming Language), Microsoft Azure Databricks, Microsoft Azure Data Services
Minimum 12 years of experience is required.
Educational Qualification: 15 years of full-time education.

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for creating efficient, scalable applications that align with the organization's goals and objectives. Your typical day will involve collaborating with cross-functional teams, analyzing business requirements, and developing innovative solutions to meet customer needs. You will also be involved in testing, debugging, and troubleshooting applications to ensure their smooth functioning and optimal performance.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Develop and maintain high-quality software applications.
- Collaborate with business analysts and stakeholders to gather and analyze requirements.
- Design and implement application features and enhancements.
- Perform code reviews and ensure adherence to coding standards.
- Troubleshoot and debug application issues.
- Optimize application performance and scalability.
- Conduct unit testing and integration testing.
- Document application design, functionality, and processes.
- Stay updated with emerging technologies and industry trends.
- Provide technical guidance and mentorship to junior team members.

Professional & Technical Skills:
- Must have: proficiency in Databricks Unified Data Analytics Platform, Python, Microsoft Azure Databricks, and Microsoft Azure Data Services.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 12 years of experience in the Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Posted 2 months ago
2 - 7 years
4 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Microsoft Azure Databricks
Good-to-have skills: DBA
Minimum 2 years of experience is required.
Educational Qualification: 15 years of full-time education.

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for creating efficient, scalable applications that align with the organization's goals and objectives. Your typical day will involve collaborating with cross-functional teams, analyzing user requirements, developing software solutions, and ensuring the applications are optimized for performance and usability.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather and analyze user requirements.
- Design, develop, and test software applications using Microsoft Azure Databricks.
- Ensure the applications are optimized for performance, scalability, and security.
- Troubleshoot and debug issues in the applications and provide timely resolutions.
- Document the application design, development, and testing processes.
- Stay updated with the latest industry trends and technologies to enhance application development practices.

Professional & Technical Skills:
- Must have: proficiency in Microsoft Azure Databricks.
- Strong understanding of software development principles and best practices.
- Experience in designing and developing applications using cloud platforms.
- Knowledge of programming languages such as Python, Java, or Scala.
- Familiarity with database systems and SQL.
- Good to have: experience with data engineering and ETL processes; experience with big data technologies such as Apache Spark; knowledge of containerization technologies like Docker and Kubernetes.

Additional Information:
- The candidate should have a minimum of 2 years of experience in Microsoft Azure Databricks.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Posted 2 months ago
5 - 7 years
11 - 21 Lacs
Noida, Mumbai (All Areas)
Work from Office
You will be responsible for assessing complex new data sources and quickly turning them into business insights. You will also support the implementation and integration of these new data sources into our Azure data platform.
Posted 2 months ago
3 - 8 years
13 - 18 Lacs
Pune
Work from Office
Enterprise Systems Administrator - Azure
Pune, India | Enterprise IT - 22752
Learn about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about.

What you'll do:
- Assist in deploying, managing, and troubleshooting Azure resources, including virtual machines, networking, and storage.
- Monitor and respond to alerts to ensure optimal performance and availability of Azure services.
- Manage Azure networking services such as private endpoints, load balancers, Application Gateway, and ExpressRoute to ensure secure and efficient connectivity.
- Administer Azure governance features, including management groups, Azure Policy, and cost management.
- Oversee Power BI, Databricks, and Synapse administration, ensuring proper configuration and security.
- Ensure security compliance by managing Azure Defender, monitoring for vulnerabilities, and implementing security best practices.
- Handle ServiceNow tickets and resolve issues related to Azure services promptly.
- Maintain and update documentation related to Azure deployments, policies, and best practices.

What you'll bring:
- Strong knowledge of Azure infrastructure, including networking, storage, private endpoints, and load balancing (Application Gateway, WAF).
- Basic understanding of monitoring tools like Azure Monitor, along with setting up alerts and reports.
- Experience with Azure governance, including management groups, policies, and cost optimization strategies.
- Knowledge of security best practices and tools such as Azure Defender and role-based access control (RBAC).
- Experience in Power BI, Azure Databricks, Synapse administration, and ServiceNow ticket management.
- Familiarity with scripting and automation tools such as PowerShell and ARM templates.
- Basic understanding of containerization and orchestration tools (Docker, Kubernetes) is a plus.

Additional Skills:
- 1-3 years of experience managing Azure cloud environments.
- Strong communication and problem-solving skills.
- Ability to manage multiple tasks and work both independently and within a team.
- Azure certifications such as Azure Administrator Associate or Azure Fundamentals are a plus.

Perks & Benefits
ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth, and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths, and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.

Travel
Travel is a requirement at ZS for client-facing ZSers; the business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.

Considering applying?
At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above.

ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.

To Complete Your Application
Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE.
Posted 2 months ago
5 - 8 years
10 - 20 Lacs
Noida, Mumbai (All Areas)
Work from Office
5+ years of experience in the Azure domain, with a minimum of 4 years of relevant experience.
Posted 2 months ago
10 - 12 years
10 - 20 Lacs
Noida, Mumbai (All Areas)
Work from Office
Advanced working knowledge of and experience with relational and non-relational databases.
Posted 2 months ago
5 - 9 years
6 - 7 Lacs
Noida, Ahmedabad, Chennai
Hybrid
This role focuses on building efficient, scalable SQL-based data models and pipelines using Databricks SQL, Spark SQL, and Delta Lake. The ideal candidate will play a key role in transforming raw data into valuable analytical insights.
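A "SQL-based data model" of the kind described here means deriving analytical tables from raw data entirely in SQL. On Databricks this would be Spark SQL over Delta tables (e.g., `CREATE TABLE ... USING DELTA` fed by a bronze layer); the sketch below substitutes Python's bundled sqlite3 so it runs anywhere, and the `raw_events`/`user_spend` schema is invented for illustration.

```python
# Raw events transformed into an aggregated analytical table purely in
# SQL: the shape of a bronze-to-gold model, here on sqlite3 rather than
# Spark SQL/Delta Lake so the example is self-contained.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (user_id TEXT, event TEXT, amount REAL)")
conn.executemany("INSERT INTO raw_events VALUES (?, ?, ?)", [
    ("u1", "purchase", 30.0),
    ("u1", "purchase", 20.0),
    ("u2", "purchase", 15.0),
    ("u2", "refund", -5.0),
])

# The model layer: one aggregate row per user, derived entirely in SQL.
conn.execute("""
    CREATE TABLE user_spend AS
    SELECT user_id,
           SUM(amount) AS net_amount,
           COUNT(*)    AS event_count
    FROM raw_events
    GROUP BY user_id
""")
rows = conn.execute("SELECT * FROM user_spend ORDER BY user_id").fetchall()
print(rows)  # [('u1', 50.0, 2), ('u2', 10.0, 2)]
```

Keeping the transformation in SQL (rather than row-by-row application code) is what lets the same model scale from a laptop example to a distributed Spark SQL job.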
Posted 2 months ago
4 - 9 years
14 - 18 Lacs
Bengaluru
Work from Office
Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field.
- 7+ years total experience in data engineering projects, with 4+ years of relevant experience in Azure technology services and Python.
- Azure: Azure Data Factory, ADLS (Azure Data Lake Store), Azure Databricks.
- Mandatory programming languages: PySpark, PL/SQL, Spark SQL.
- Database: SQL DB.
- Experience with Azure ADLS, Databricks, Stream Analytics, SQL DW, Cosmos DB, Analysis Services, Azure Functions, serverless architecture, and ARM templates.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Experience with object-oriented/object function scripting languages: Python, SQL, Scala, Spark SQL, etc.
- Data warehousing experience with strong domain knowledge.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise:
- Intuitive individual with an ability to manage change and proven time management.
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed.
- Up-to-date technical knowledge maintained by attending educational workshops and reviewing publications.

Preferred technical and professional experience:
- Experience with Azure ADLS, Databricks, Stream Analytics, SQL DW, Cosmos DB, Analysis Services, Azure Functions, serverless architecture, and ARM templates.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Experience with object-oriented/object function scripting languages: Python, SQL, Scala, Spark SQL, etc.
Posted 2 months ago
5 - 8 years
5 - 10 Lacs
Bengaluru
Work from Office
Skill required: Tech for Operations - Automation Anywhere Designation: App Automation Eng Senior Analyst Qualifications: Any Graduation Years of Experience: 5 to 8 years What would you do? As RPA Senior developer you will be responsible for design & development of end-to-end RPA automation leveraging A360 tools & technologies.This will include working with the clients or/and stake holders to understand the requirements, prepare technical specification documents, unit test cases and develop the automation adhering to client requirements and policies. What are we looking for? Minimum 5 – 8 years of strong software design & development experience Minimum 4 – 5 year(s) of programming experience in Automation Anywhere A360 , Document Automation-pilot, Python Effective GEN AI Prompts creation for Data extraction using GEN AI OCR Experience with APIs, data integration, and automation best practices Experience in VBA VB or Python Script programming Good Knowledge on GEN AI , Machine Learning. Good and Hands-on in core .NET concepts and OOPs Programming. Understands OO concepts and consistently applies them in client engagements. Hands on experience in SQL & T-SQL Queries, Creating complex stored procedures. Exposure towards performing Unit Testing Experience on Virtualization and VDI Technologies is a mandate. Exceptional presentation, written and verbal communication skills (English) Able to prioritize work, complete multiple tasks, and work under deadlines. Extensive customer facing with excellent business communication skills. Must be self-motivated with an excellent attitude. Automation Anywhere A360 Master/Advanced certification. Exposure to SAP automation is preferred. Azure Machine Learning, Azure Databricks, and other Azure AI services. Exposure to A360 Control Room features. Exposure to Pharma domain is preferred. Exposure to GDPR compliance is preferred. Agile development methodologies are an added advantage. 
Roles and Responsibilities:
- Design and build end-to-end automation leveraging the A360 tool
- Design and develop reusable components
- Support building automation capability
- Anticipate, identify, track, and resolve technical issues and risks affecting delivery
- Develop automation bots and processes using the A360 platform
- Utilize A360's advanced features (AARI, WLM, API consumption, Document Automation, Co-pilot) to automate complex tasks, streamline processes, and optimize efficiency
- Integrate A360 with various APIs, databases, and third-party tools to ensure seamless data flow and interaction between systems
- Perform rigorous testing of automation solutions to identify and address issues
- Debug and troubleshoot bots to ensure flawless execution of automation processes
- Collaborate with cross-functional teams, including business analysts and process architects, to deliver holistic automation solutions that cater to various stakeholder needs

Qualification: Any Graduation
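The posting above mentions data extraction from documents via GenAI OCR prompts. As a loose, simplified stand-in for that extraction step, here is a pure-Python sketch that pulls fields out of already-OCR'd text with regular expressions; the field names, patterns, and sample text are all invented for illustration and are not part of A360's actual API.

```python
import re

def extract_invoice_fields(ocr_text: str) -> dict:
    """Pull a few common fields out of OCR'd invoice text.

    A simplified regex stand-in for the GenAI-OCR extraction step the
    posting describes; field names and patterns are illustrative only.
    """
    patterns = {
        "invoice_no": r"Invoice\s*(?:No\.?|#)\s*[:\-]?\s*(\S+)",
        "date": r"Date\s*[:\-]?\s*(\d{2}[/-]\d{2}[/-]\d{4})",
        "total": r"Total\s*[:\-]?\s*\$?([\d,]+\.\d{2})",
    }
    fields = {}
    for name, pattern in patterns.items():
        match = re.search(pattern, ocr_text, flags=re.IGNORECASE)
        fields[name] = match.group(1) if match else None
    return fields

sample = "Invoice No: INV-1042\nDate: 12/05/2024\nTotal: $1,250.00"
print(extract_invoice_fields(sample))
```

In a production bot the regex layer would be replaced by (or validated against) the model-driven extraction, with missing fields routed to human review.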
Posted 2 months ago
3 - 8 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must have skills: Microsoft Azure Databricks
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to ensure the applications function as intended, while continuously seeking ways to enhance application efficiency and user experience.

Roles & Responsibilities:
- Expected to perform independently and become an SME
- Active participation/contribution in team discussions is required
- Contribute to providing solutions to work-related problems
- Assist in the documentation of application specifications and user guides
- Collaborate with cross-functional teams to gather requirements and provide technical insights

Professional & Technical Skills:
- Must-have skills: Proficiency in Microsoft Azure Databricks
- Strong understanding of cloud computing concepts and services
- Experience with data integration and ETL processes
- Familiarity with programming languages such as Python or Scala
- Knowledge of application development methodologies and best practices

Additional Information:
- The candidate should have a minimum of 3 years of experience in Microsoft Azure Databricks
- This position is based at our Bengaluru office
- A 15 years full time education is required

Qualification: 15 years full time education
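The role above asks for experience with data integration and ETL processes. Since running Databricks requires a cluster, here is a minimal stdlib-only sketch of the extract-transform-load pattern the posting refers to; the column names, cleaning rules, and list-based sink are assumptions for illustration, with the list standing in for a Delta table or SQL target.

```python
import csv
import io

def extract(raw_csv: str) -> list:
    # Extract: parse raw CSV rows into dictionaries.
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list) -> list:
    # Transform: normalize names and cast amounts, dropping malformed rows.
    out = []
    for row in rows:
        try:
            out.append({"name": row["name"].strip().title(),
                        "amount": float(row["amount"])})
        except (KeyError, ValueError):
            continue
    return out

def load(rows: list, target: list) -> None:
    # Load: append clean rows to the target store (a list, standing in
    # for a Delta table or SQL sink).
    target.extend(rows)

raw = "name,amount\n alice ,10.5\nbob,oops\nCAROL,3"
store = []
load(transform(extract(raw)), store)
print(store)
```

The same three-stage shape carries over to PySpark, where `extract` becomes a `spark.read`, `transform` a chain of DataFrame operations, and `load` a `write` to the lake.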
Posted 2 months ago
7 - 11 years
5 - 10 Lacs
Bengaluru
Work from Office
Skill required: Tech for Operations - Automation Anywhere
Designation: App Automation Eng Specialist
Qualifications: Any Graduation
Years of Experience: 7 to 11 years

What would you do?
As an RPA Lead Developer, you will be responsible for the design and development of end-to-end RPA automation leveraging A360 tools and technologies. You should anticipate, identify, track, and resolve technical issues and risks affecting delivery; understand the Automation Anywhere RPA platform, its features, capabilities, and best practices; and be proficient in designing and implementing automation workflows that optimize business processes.

What are we looking for?
- Minimum 6 - 8 years of strong software design and development experience
- Minimum 5 - 6 years of programming experience in Automation Anywhere A360, Document Automation, Co-pilot, Python
- Effective GenAI prompt creation for data extraction using GenAI OCR
- Experience with APIs, data integration, and automation best practices
- Experience in VBA, VB, and Python script programming
- Good knowledge of GenAI and Machine Learning
- Good hands-on knowledge of core .NET concepts and OOP programming
- Understands OO concepts and consistently applies them in client engagements
- Hands-on experience in SQL and T-SQL queries, creating complex stored procedures
- Exceptional presentation, written, and verbal communication skills (English)
- Good understanding of workflow-based logic and hands-on experience with process templates and VBO design and build
- Understands process analysis and pipeline build for automation processes
- Automation Anywhere A360 Master/Advanced certification
- Strong programming knowledge of HTML, JavaScript / VB scripts
- Experience with Agile development methodology
- Exposure to SAP automation is preferred
- Exposure to A360 Control Room features
- Azure Machine Learning, Azure Databricks, and other Azure AI services
- Exposure to GDPR compliance is preferred
- Agile development methodologies are an added advantage
Roles and Responsibilities:
- Lead the team to develop automation bots and processes using the A360 platform
- Utilize A360's advanced features (AARI, WLM, API consumption, Document Automation, Co-pilot) to automate complex tasks, streamline processes, and optimize efficiency
- Integrate A360 with various APIs, databases, and third-party tools to ensure seamless data flow and interaction between systems
- Identify and build common components to be used across projects
- Collaborate with cross-functional teams, including business analysts and process architects, to deliver holistic automation solutions that cater to various stakeholder needs
- Strong SQL database management and troubleshooting skills
- Serve as a technical expert on development projects
- Review code for compliance and reuse; ensure code complies with RPA architectural and industry standards
- Lead the problem identification/error resolution process, including tracking, repairing, and reporting defects
- Create and maintain documentation to support role responsibilities for training, cross-training, and disaster recovery
- Monitor and maintain license utilization and subscriptions
- Maintain and monitor RPA environments (Dev/Test/Prod)
- Review and ensure automation runbooks are complete and maintained
- Design, develop, document, test, and debug new robotic process automation (RPA) applications for internal use

Qualification: Any Graduation
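Both RPA postings ask for hands-on SQL and T-SQL, including complex queries and stored procedures. As a small self-contained sketch of the kind of aggregate reporting query involved, the snippet below uses Python's built-in `sqlite3`; SQLite has no T-SQL stored procedures, so a Python function plays that role, and the bot-run schema and data are invented for illustration.

```python
import sqlite3

# In-memory database with a made-up bot execution log.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE bot_runs (bot TEXT, status TEXT, seconds REAL);
    INSERT INTO bot_runs VALUES
        ('invoice-bot', 'success', 12.0),
        ('invoice-bot', 'failed', 30.0),
        ('hr-bot', 'success', 8.0),
        ('hr-bot', 'success', 10.0);
""")

def failure_rates(conn, min_runs):
    """Aggregate per-bot failure rate, like a small reporting procedure.

    In SQLite, (status = 'failed') evaluates to 0 or 1, so AVG of it
    gives the failure fraction per group.
    """
    return conn.execute("""
        SELECT bot,
               ROUND(AVG(status = 'failed'), 2) AS failure_rate,
               COUNT(*) AS runs
        FROM bot_runs
        GROUP BY bot
        HAVING COUNT(*) >= ?
        ORDER BY failure_rate DESC
    """, (min_runs,)).fetchall()

print(failure_rates(conn, 2))
```

In SQL Server the same logic would typically live in a stored procedure taking `@min_runs` as a parameter.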
Posted 2 months ago
4 - 8 years
12 - 22 Lacs
Kochi, Gurugram, Bengaluru
Hybrid
Project Role: Azure Data Engineer
Work Experience: 4 to 8 Years
Work Location: Bangalore / Gurugram / Kochi
Work Mode: Hybrid
Must Have Skills: Azure Data Engineering, SQL, Spark/PySpark

Job Overview: Responsible for the on-time completion of projects or components of large, complex projects for clients in the life sciences field. Identifies and elevates potential new business opportunities and assists in the sales process.

Skills required:
- Experience in developing Azure components such as Azure Data Factory, Azure Databricks, Logic Apps, and Functions
- Develop efficient and smart data pipelines for migrating various sources onto Azure Data Lake
- Proficient in working with Delta Lake and Parquet file formats
- Design, implement, and maintain CI/CD pipelines; deploy and merge code
- Expert in programming in SQL, PySpark, and Python
- Creation of databases on Azure Data Lake following data warehousing best practices
- Build smart metadata databases and solutions, parameterization, and configurations
- Develop Azure frameworks and automated systems for deployment and monitoring
- Hands-on experience in continuous delivery and continuous integration of CI/CD pipelines, CI/CD infrastructure, and process troubleshooting
- Extensive experience with version control systems like Git and their use in release management, branching, merging, and integration strategies

Essential Functions:
- Participates in or leads teams in the design, development, and delivery of consulting projects or components of larger, complex projects
- Reviews and analyzes client requirements or problems and assists in the development of proposals for cost-effective solutions that ensure profitability and high client satisfaction
- Provides direction and guidance to Analysts, Consultants, and, where relevant, Statistical Services staff assigned to the engagement
- Develops detailed documentation and specifications
- Performs qualitative and/or quantitative analyses to assist in the identification of client issues and the development of client-specific solutions
- Designs, structures, and delivers client reports and presentations appropriate to the characteristics and needs of the audience; may deliver some findings to clients

Qualifications:
- Bachelor's degree required; Master's degree in Business Administration preferred
- 4-8 years of related experience in consulting and/or the life sciences industry required
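The data engineer posting above calls for metadata-driven pipelines with parameterization. A common pattern is a control table whose rows each parameterize one copy activity, so new sources are onboarded by adding a row rather than writing code. The sketch below shows that pattern in plain Python; the source names, target paths, and list-based "copy activity" are all illustrative assumptions, standing in for a control database driving Data Factory or Databricks jobs.

```python
# A metadata table drives which sources get ingested and where; in a real
# Azure pipeline these rows would live in a control database and each
# entry would parameterize a Data Factory or Databricks copy job.
PIPELINE_METADATA = [
    {"source": "sales.csv", "target": "lake/raw/sales", "enabled": True},
    {"source": "hr.csv", "target": "lake/raw/hr", "enabled": False},
    {"source": "ops.csv", "target": "lake/raw/ops", "enabled": True},
]

def run_ingestion(metadata, copy_fn):
    """Run one copy activity per enabled metadata row; return targets loaded."""
    loaded = []
    for entry in metadata:
        if not entry["enabled"]:
            continue  # disabled sources are skipped without code changes
        copy_fn(entry["source"], entry["target"])
        loaded.append(entry["target"])
    return loaded

# Stand-in copy activity: record the copies instead of moving real files.
copies = []
result = run_ingestion(PIPELINE_METADATA,
                       lambda src, dst: copies.append((src, dst)))
print(result)
```

The design choice here is that the framework never hard-codes a source: enabling, disabling, or retargeting an ingestion is a metadata update, which is what makes such frameworks cheap to operate at scale.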
Posted 2 months ago
5 - 10 years
20 - 27 Lacs
Hyderabad, Bengaluru
Hybrid
Location: Bangalore, Hyderabad
Notice Period: Immediate to 20 days
Experience: 6+ years
Relevant Experience: 6+ years
Skills: Python, SQL, PySpark, Azure Databricks, Data Pipelines
Posted 2 months ago
9 - 14 years
30 - 37 Lacs
Bengaluru
Work from Office
Hi, Greetings from Decision Minds!
Mandatory Skill: Lead Azure Data Engineering (Data Lake, Synapse, Databricks, Spark); Databricks Certified Data Engineer
Experience: 9 - 14 years
Locations: Bengaluru, Hyderabad, Chennai
If interested, please share your profile to barsas@decisionminds.com
Posted 2 months ago
5 - 7 years
10 - 20 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
Work Location: Any V2soft office / Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote.
Must have Selenium and Java, with Data Factory and Databricks. Excellent communication skills. Max NP: 15 days.

Minimum Qualifications and Job Requirements:
- 5+ years of experience in automating APIs and web services
- 3+ years of experience with the Selenium automation tool
- 1+ years of experience with Data Factory and Databricks
- Experience with BDD implementations using Cucumber
- Excellent SQL skills and the ability to write complex queries
- Highly skilled in at least one programming language; Java is preferred
- Highly skilled in 2 or more automation test tools; experience in ReadyAPI is preferred
- 2+ years of experience with Jenkins
- 2+ years of experience delivering automation solutions using Agile methodology
- Experience with Eclipse or similar IDEs
- Experience with source control tools such as Git
- Ability to work on multiple projects concurrently and meet deadlines
- Ability to work in a fast-paced team environment; a high level of initiative and a strong commitment to job knowledge, productivity, and attention to detail are expected
- Strong verbal and written communication skills
- Solid software engineering skills; has participated in full-lifecycle development on large projects
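The automation posting above centers on API test automation and BDD-style examples. As a tool-agnostic sketch of the data-driven test pattern behind those frameworks (the posting itself uses Selenium/Java/Cucumber), the snippet below exercises a small function against a table of cases, with a stubbed service call replacing the real API so the check runs without a network; the function and field names are hypothetical.

```python
def display_name(fetch_user, user_id):
    """Code under test: derive a display name from a user-service payload."""
    user = fetch_user(user_id)
    return f"{user['first']} {user['last']}".strip()

# Data-driven cases, in the spirit of a BDD example table.
CASES = [
    ({"first": "Ada", "last": "Lovelace"}, "Ada Lovelace"),
    ({"first": "Plato", "last": ""}, "Plato"),
]

failures = []
for payload, expected in CASES:
    # The lambda stubs the API client, so each case controls its payload.
    got = display_name(lambda _id: payload, user_id=1)
    if got != expected:
        failures.append((payload, got, expected))

print("FAIL" if failures else "PASS", failures)
```

In Cucumber the `CASES` table would be a Scenario Outline's examples block, and the stub would typically be a mocked HTTP client.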
Posted 2 months ago
8 - 13 years
25 - 27 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
NP: 15 days.
Minimum Qualifications and Job Requirements:
- Bachelor's degree in CS
- 8 years of hands-on experience in designing and developing distributed data pipelines
- 5 years of hands-on experience in Azure data service technologies
- 5 years of hands-on experience in Python, SQL, object-oriented programming, ETL, and unit testing
- Experience with data integration via APIs, web services, and queues
- Experience with Azure DevOps and CI/CD, as well as agile tools and processes including JIRA and Confluence
- Excellent communication skills
Location: Chennai, Hyderabad, Kolkata, Pune, Ahmedabad, Delhi, Mumbai, Bengaluru, Remote
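The requirements above include data integration through queues. The core shape of queue-based integration, producers publishing records and a consumer draining them, can be sketched with Python's stdlib `queue` and `threading`, where the in-process queue is an illustrative stand-in for a managed broker such as Azure Service Bus or Event Hubs, and the sentinel-based shutdown is one simple convention among several.

```python
import queue
import threading

def producer(q, records):
    # Publish each record, then a sentinel signalling end-of-stream.
    for rec in records:
        q.put(rec)
    q.put(None)

def consumer(q, sink):
    # Drain the queue until the sentinel arrives, tagging each record.
    while (rec := q.get()) is not None:
        sink.append({**rec, "processed": True})

q = queue.Queue()
sink = []
t = threading.Thread(target=consumer, args=(q, sink))
t.start()
producer(q, [{"id": 1}, {"id": 2}])
t.join()  # wait for the consumer to finish draining
print(sink)
```

With a real broker the sentinel is usually replaced by acknowledgements plus dead-letter handling, but the decoupling between producer and consumer is the same.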
Posted 2 months ago