8.0 - 13.0 years
20 - 35 Lacs
Chennai
Work from Office
Warm greetings from SP Staffing Services Pvt Ltd! Experience: 8-15 yrs. Work Location: Chennai. Job Description: Required Technical Skill Set: Azure native technologies, Synapse and Databricks, Python. Desired Experience Range: 8+ years. Location of Requirement: Chennai. Required Skills: Previous experience as a data engineer or in a similar role. Must have experience with MS Azure services such as Data Lake Storage, Data Factory, Databricks, Azure SQL Database, Azure Synapse Analytics, and Azure Functions. Technical expertise with data models, data mining, analytics, and segmentation techniques. Knowledge of programming languages and environments such as Python, Java, Scala, R, and .NET/C#. Hands-on experience with SQL database design. Strong numerical and analytical skills. Degree in Computer Science, IT, or a similar field; a master's is a plus. Experience integrating Azure PaaS services. Interested candidates, kindly share your updated resume to ramya.r@spstaffing.in or contact 8667784354 (WhatsApp: 9597467601) to proceed further.
Posted 3 weeks ago
8.0 - 14.0 years
12 - 30 Lacs
Mumbai
Work from Office
We are hiring Prompt Engineering Support for our MNC client. Location: Pune, Mumbai. Experience: 8 to 14 yrs. Notice Period: only immediate to 30 days. Below is the JD: AI components - OpenAI, Document Intelligence - AI/ML libraries (PyTorch, LangChain) - API Management - Effective prompts - Refining prompts - NoSQL (Cosmos DB / MongoDB) - Azure Portal - Azure DevOps. Interested candidates, please share your resume to
Posted 3 weeks ago
15.0 - 20.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Project Role: Software Development Lead. Project Role Description: Develop and configure software systems either end-to-end or for a specific stage of the product lifecycle. Apply knowledge of technologies, applications, methodologies, processes, and tools to support a client, project, or entity. Must-have skills: Python (Programming Language). Good-to-have skills: Python on Azure. A minimum of 7.5 years of experience is required. Educational Qualification: 15 years of full-time education. Summary: As a Software Development Lead, you will engage in the development and configuration of software systems, either managing the entire process or focusing on specific stages of the product lifecycle. Your day will involve applying your extensive knowledge of various technologies, methodologies, and tools to support projects and clients effectively, ensuring that the software solutions meet the required standards and specifications. You will also be responsible for guiding your team through challenges and fostering an environment of collaboration and innovation. Project Requirements: + Python data pipeline (ETL) development (MUST) + Hands-on experience writing tech designs (MUST) + Experience developing with streaming technologies - Kafka, Event Hub, Event Grid (MUST) + Kubernetes deployments and DevOps knowledge (MUST) + Database technology - SQL Server, Cosmos DB (MUST) + Develop microservices/APIs in NestJS/Node.js. Roles & Responsibilities: - Expected to be an SME. - Collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for the immediate team and across multiple teams. - Mentor junior team members to enhance their skills and knowledge. - Continuously evaluate and improve development processes to increase efficiency.
Professional & Technical Skills: - Must-Have Skills: Proficiency in Python (Programming Language). - Good-to-Have Skills: Experience with Python on Azure. - Strong understanding of software development methodologies. - Experience with version control systems such as Git. - Familiarity with Agile and Scrum methodologies. Additional Information: - The candidate should have a minimum of 7.5 years of experience in Python (Programming Language). - This position is based at our Bengaluru office. - 15 years of full-time education is required.
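Listings like the one above ask for hands-on Python ETL pipeline development over streaming sources (Kafka, Event Hub). As a rough, hedged sketch of that pattern, the snippet below consumes events in micro-batches and normalizes them into a curated schema; the event fields, the transform rules, and the plain list standing in for a broker consumer are all illustrative assumptions, not part of the job description.

```python
from itertools import islice

def transform(event):
    """Normalize one raw event into a curated schema (hypothetical fields)."""
    return {
        "user_id": event["userId"],
        "amount_cents": int(round(event["amount"] * 100)),
    }

def micro_batches(source, batch_size):
    """Yield fixed-size batches of transformed events from any iterable.

    A real Kafka/Event Hub consumer object would plug in as `source`."""
    it = iter(source)
    while True:
        batch = list(islice(it, batch_size))
        if not batch:
            return
        yield [transform(e) for e in batch]

# A plain list stands in for the streaming source in this sketch.
events = [{"userId": "u1", "amount": 1.25},
          {"userId": "u2", "amount": 2.5},
          {"userId": "u3", "amount": 0.75}]
batches = list(micro_batches(events, batch_size=2))
# batches[0] == [{"user_id": "u1", "amount_cents": 125},
#                {"user_id": "u2", "amount_cents": 250}]
```

In production the same transform/batch shape would sit inside a consumer loop with checkpointing and error handling; the micro-batch boundary is what keeps throughput and memory bounded.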
Posted 3 weeks ago
8.0 - 14.0 years
8 - 11 Lacs
Mumbai
Work from Office
Hi Jobseeker, we are hiring Prompt Engineering Support for our MNC client. Location: Pune, Mumbai. Interview Mode: Virtual. Experience: 8 to 14 yrs. Notice Period: only immediate to 30 days. Below is the JD: Prompt Engineering Support. AI components - OpenAI, Document Intelligence - AI/ML libraries (PyTorch, LangChain) - Python - API Management - Effective prompts - Refining prompts - NoSQL - Cosmos DB / MongoDB - Azure Portal - Azure DevOps - Grafana. Interested candidates, please share your resume to Priyanka.B@natobotics.com
Posted 3 weeks ago
5.0 - 10.0 years
0 - 0 Lacs
Pune
Hybrid
Full Stack Microsoft .NET Smart Web App. Work Location: Kharadi, Pune; 4 days' work from office is mandatory. Total Experience Required: 5+ yrs. Relevant Experience Required: 5+ yrs. MUST HAVE: Good knowledge of .NET 8, AKS, React JS, and Cosmos DB. Proficiency in the .NET framework and C# programming. Strong experience with React JS and front-end development. Hands-on experience with Cosmos DB or similar NoSQL databases. GOOD TO HAVE: Familiarity with RESTful APIs and web services. Knowledge of version control systems, such as Git.
Posted 3 weeks ago
5.0 - 10.0 years
40 - 45 Lacs
Bengaluru
Work from Office
The Data Engineering role requires working on the big data warehouse for Personalization Products. The associate may be involved in one or more business applications that are part of the personalization portfolio, and would be working in a global team, collaborating closely in a distributed setup across time zones. Build data-driven platforms and capabilities to power Personalization experiences across site, app, stores, and voice commerce. Build systems and workflows to process and manage petabyte-scale feature data. Collaborate with members of technical staff to deliver end-to-end scalable systems for cross-functional projects. Work closely with business and product stakeholders to deliver on the strategy, vision, and roadmap for top initiatives in Personalization and Recommendations. Actively keep pace with newly developing technologies in the data space and present technical solutions, including architecture, design, implementation details, and customer- and business-impacting KPIs. Actively contribute to the research community through participation at conferences, seminars, and workshops. What you will bring: You have experience building large-scale distributed systems that process large volumes of data, with a focus on scalability, latency, and fault tolerance. You have knowledge of complex software design, distributed system design, design patterns, data structures, and algorithms. You have experience building systems that orchestrate and execute complex big data workflows leveraging Apache Spark, Apache Kafka, and the Hadoop stack, preferably on Google Cloud Platform. You have experience evaluating and fine-tuning systems for speed, robustness, and cost efficiency. You have experience designing features and models from structured and unstructured data. You have experience building datasets, tools, and services supporting big data and analytics operations. You have experience with relational SQL and NoSQL databases like Cassandra, Azure SQL, and Cosmos DB.
You are proficient in Java or Scala, Python, shell scripts, HQL, and SQL. You have experience with distributed version control like Git or similar. You are familiar with continuous integration/deployment processes and tools such as Jenkins and Maven. You have strong written and oral communication skills. Good-to-have skills: Hands-on experience with Java. Work experience with the Spring framework. Work experience using/building feature stores. Minimum Qualifications... Option 1: Bachelor's degree in computer science, information technology, engineering, information systems, cybersecurity, or a related area, and 3 years' experience in software engineering or a related area at a technology, retail, or data-driven company. Option 2: 5 years' experience in software engineering or a related area at a technology, retail, or data-driven company. Preferred Qualifications... Certification in Security+, GISF, CISSP, CCSP, or GSEC; Master's degree in computer science, information technology, engineering, information systems, cybersecurity, or a related area, and 1 year's experience leading information security or cybersecurity projects.
Posted 4 weeks ago
3.0 - 8.0 years
40 - 45 Lacs
Bengaluru
Work from Office
The Data Engineering role requires working on the big data warehouse for Personalization Products. The associate may be involved in one or more business applications that are part of the personalization portfolio, and would be working in a global team, collaborating closely in a distributed setup across time zones. What will you do: Build data-driven platforms and capabilities to power Personalization experiences across site, app, stores, and voice commerce. Build systems and workflows to process and manage petabyte-scale feature data. Collaborate with members of technical staff to deliver end-to-end scalable systems for cross-functional projects. Work closely with business and product stakeholders to deliver on the strategy, vision, and roadmap for top initiatives in Personalization and Recommendations. Actively keep pace with newly developing technologies in the data space and present technical solutions, including architecture, design, implementation details, and customer- and business-impacting KPIs. Actively contribute to the research community through participation at conferences, seminars, and workshops. What you will bring: You have experience building large-scale distributed systems that process large volumes of data, with a focus on scalability, latency, and fault tolerance. You have knowledge of complex software design, distributed system design, design patterns, data structures, and algorithms. You have experience building systems that orchestrate and execute complex big data workflows leveraging Apache Spark, Apache Kafka, and the Hadoop stack, preferably on Google Cloud Platform. You have experience evaluating and fine-tuning systems for speed, robustness, and cost efficiency. You have experience designing features and models from structured and unstructured data. You have experience building datasets, tools, and services supporting big data and analytics operations. You have experience with relational SQL and NoSQL databases like Cassandra, Azure SQL, and Cosmos DB.
You are proficient in Java or Scala, Python, shell scripts, HQL, and SQL. You have experience with distributed version control like Git or similar. You are familiar with continuous integration/deployment processes and tools such as Jenkins and Maven. You have strong written and oral communication skills. Good-to-have skills: Hands-on experience with Java. Work experience with the Spring framework. Work experience using/building feature stores. Minimum Qualifications... Option 1: Bachelor's degree in computer science, information technology, engineering, information systems, cybersecurity, or a related area, and 3 years' experience in software engineering or a related area at a technology, retail, or data-driven company. Option 2: 5 years' experience in software engineering or a related area at a technology, retail, or data-driven company. Preferred Qualifications... Certification in Security+, GISF, CISSP, CCSP, or GSEC; Master's degree in computer science, information technology, engineering, information systems, cybersecurity, or a related area, and 1 year's experience leading information security or cybersecurity projects.
Posted 4 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Project description: We are seeking an experienced Azure DevOps Engineer to develop and manage Infrastructure as Code (IaC) using Bicep for Azure PaaS (Platform as a Service) solutions with a focus on private networking. In this role, you will collaborate with development, security, and infrastructure teams to automate and streamline the deployment of secure, scalable, and resilient cloud environments. Responsibilities: Infrastructure as Code (IaC) Development: Design, develop, and maintain Bicep templates to provision and manage Azure PaaS resources such as App Services, Azure Functions, Logic Apps, Event Hub, Service Bus, Azure SQL, Azure Storage, Key Vault, API Management, and Cosmos DB. Ensure IaC templates are modular, reusable, and follow best practices. Implement parameterization, modules, and consistent naming conventions to enhance template flexibility. Private Networking & Security: Architect and deploy private networking solutions using Private Endpoints, Private Link, Virtual Networks (VNets), subnets, and Network Security Groups (NSGs). Configure service integrations with private networking, ensuring traffic stays within the Azure backbone. Use an application firewall and policies to enhance security. Use Azure Key Vault to securely store and manage secrets, certificates, and keys. CI/CD Pipelines & Automation: Build and manage CI/CD pipelines in Azure DevOps to automate the deployment of Bicep-based PaaS environments. Integrate IaC pipelines with Bicep for consistent deployment. Use release gates, approvals, and checks to ensure compliance and security in deployment processes. Monitoring & Optimization: Implement Azure Monitor, Application Insights, and Log Analytics to monitor PaaS environments. Create alerts and dashboards to ensure performance, availability, and security visibility. Optimize PaaS resources for cost, performance, and reliability.
Collaboration & Documentation: Collaborate with cloud architects, security teams, and application developers to design and implement PaaS solutions. Document infrastructure, deployment processes, and Bicep modules. Provide guidance and training to development teams on Bicep and Azure networking best practices. Skills - Must have: Education & Experience: Bachelor's degree in Computer Science, Information Technology, or a related field. 5+ years of experience in Azure DevOps engineering with a focus on PaaS solutions. Hands-on experience with Azure Bicep and private networking. Technical Skills: Strong proficiency with Bicep for IaC, including modular templates and reusable components. Experience with Azure PaaS services (App Services, Functions, SQL, Storage, API Management). Expertise in Azure networking, including Service and Private Endpoints, Private Link, VNets, Network Security Groups (NSGs), and Application Gateway. Proficiency in CI/CD pipelines using Azure DevOps. Scripting skills in PowerShell, Bash, or Python for automation tasks. Familiarity with Azure RBAC and role-based security models. Soft Skills: Strong problem-solving and troubleshooting skills. Effective collaboration and communication abilities. Detail-oriented with a focus on cloud security and performance. Nice to have: Azure certifications (e.g., Azure DevOps Engineer Expert, Azure Solutions Architect Expert, or Azure Administrator Associate). Knowledge of microservices architecture and containerization (e.g., AKS). Familiarity with Azure Policy and Azure Blueprints. Locations: Pune, Bangalore, Hyderabad, Chennai, Noida.
Posted 4 weeks ago
5.0 - 10.0 years
22 - 27 Lacs
Bengaluru
Work from Office
Create solution outlines and macro designs describing end-to-end product implementation in data platforms, including system integration, data ingestion, data processing, the serving layer, design patterns, and platform architecture principles for data platforms. Contribute to pre-sales and sales support through RFP responses, solution architecture, planning, and estimation. Contribute to reusable component/asset/accelerator development to support capability development. Participate in customer presentations as a Platform Architect / Subject Matter Expert on Big Data, Azure Cloud, and related technologies. Participate in customer PoCs to deliver the outcomes. Participate in delivery reviews/product reviews and quality assurance, and act as a design authority. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Experience in designing data products providing descriptive, prescriptive, and predictive analytics to end users or other systems. Experience in data engineering and architecting data platforms. Experience in architecting and implementing data platforms on the Azure Cloud Platform. Experience on Azure cloud is mandatory (ADLS Gen 1/Gen 2, Data Factory, Databricks, Synapse Analytics, Azure SQL, Cosmos DB, Event Hub, Snowflake), plus Azure Purview, Microsoft Fabric, Kubernetes, Terraform, and Airflow. Experience in the Big Data stack (Hadoop ecosystem: Hive, HBase, Kafka, Spark, Scala/PySpark, Python, etc.) with Cloudera or Hortonworks. Preferred technical and professional experience: Experience in architecting complex data platforms on the Azure Cloud Platform and on-prem. Experience and exposure to implementation of Data Fabric and Data Mesh concepts and solutions like Microsoft Fabric, Starburst, Denodo, IBM Data Virtualisation, Talend, or Tibco Data Fabric. Exposure to data cataloging and governance solutions like Collibra, Alation, Watson Knowledge Catalog, Databricks Unity Catalog, Apache Atlas, Snowflake data glossary, etc.
Posted 4 weeks ago
6.0 - 11.0 years
10 - 15 Lacs
Ahmedabad
Remote
6+ years of experience in software application programming and maintenance. Develop, maintain, and enhance new and existing applications and products. Research, learn, suggest, and implement new tools and technologies. Write, unit test, and deploy code using established standards and procedures. Code review and optimization. Role-specific competencies: .NET Core, MVC, ReactJS, ASP.NET, C#, JavaScript, jQuery, Ajax, OOPS, CSS, Git, SVN, MS SQL. Mandatory job behavior attributes: Deliver excellence at work and create error-free code with little supervision. Drive a solution mindset, work with the team, and add value to them. Manage requirement analysis and provide estimates. Display readiness to change and to work on new technologies. Apply knowledge of industry trends in web application best practices, accessibility standards, and developments. Excellent communication and interpersonal skills. Good knowledge of SQL Server programming. Analytical and problem-solving skills. The candidate must be self-motivated.
Posted 4 weeks ago
2.0 - 4.0 years
5 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Position Type: Full-Time. Location: Hyderabad. Company Description: We are a product-based company that provides comprehensive software solutions for research facilities in universities and institutions worldwide. Please visit www.IdeaElan.com for more information. Key responsibilities: Design and develop high-performance, scalable, and secure backend APIs and services using .NET Core. Work with relational (MS-SQL) and NoSQL (Cosmos DB, MongoDB) databases to create optimized data models and ensure data consistency and performance. Participate in code reviews and provide constructive feedback. Collaborate with front-end developers and other teams to deliver high-quality software. Write clean, maintainable, and efficient code while ensuring quality standards. Troubleshoot and debug complex issues, optimizing code for maximum performance and scalability. Stay updated with the latest trends in backend development and cloud technologies to drive innovation. Optimize database performance and ensure data integrity. Required Experience: 2-4 years of experience in backend development. Strong experience with .NET Core and building RESTful APIs. Proficiency with MS-SQL and experience working with NoSQL databases like Cosmos DB and MongoDB. Hands-on experience with Azure cloud services (e.g., Azure Functions, Azure Storage, API Management, Azure SQL Database, etc.). Understanding of software development principles such as object-oriented programming (OOP), design patterns, and SOLID principles. Experience with version control systems such as Git. Strong knowledge of asynchronous programming, microservices architecture, and cloud-native application design. Familiarity with CI/CD pipelines, containerization (Docker), and deployment automation is a plus. Excellent problem-solving and debugging skills. Ability to work in an Agile development environment and collaborate with cross-functional teams. Good communication and collaboration skills.
Posted 4 weeks ago
4.0 - 10.0 years
22 - 27 Lacs
Bengaluru
Work from Office
At Elanco (NYSE: ELAN) it all starts with animals! As a global leader in animal health, we are dedicated to innovation and delivering products and services to prevent and treat disease in farm animals and pets. We're driven by our vision of Food and Companionship Enriching Life and our approach to sustainability, the Elanco Healthy Purpose, to advance the health of animals, people, the planet, and our enterprise. Making animals' lives better makes life better; join our team today! Your Role: Sr Data Engineer. The data engineer's role is delivery focused. The person in this role will drive data pipeline and data product delivery through data architecture, modeling, design, and development of professional-grade solutions on premises and/or on the Microsoft Azure cloud. Partner with data scientists and statisticians across Elanco's global business functions to help prepare and transform their data into data products that further drive scientific and/or business knowledge discovery, insights, and forecasting. Data engineers will be part of a highly collaborative and cross-functional team of technology and data experts working on solving complex scientific and business challenges in animal health using cutting-edge data and analytics technologies. Your Responsibilities: Provide data engineering subject matter expertise and hands-on data capture, ingestion, curation, and pipeline development expertise on Azure to deliver cloud-optimized data solutions. Provide expert data PaaS on Azure storage; big data platform services; serverless architectures; Azure SQL DB; NoSQL databases; and secure, automated data pipelines. Participate in data/data-pipeline architectural discussions to help build cloud-native solutions or migrate existing data applications from on premises to the Azure platform. Perform current-state (as-is) and future-state (to-be) analysis. Participate in and help develop the data engineering community of practice as a global go-to expert panel/resource.
Develop and evolve new or existing data engineering methods and procedures to create possible alternative, agile solutions to moderately complex problems. What You Need to Succeed (minimum qualifications): At least 2 years of data pipeline and data product design, development, and delivery experience, including deploying ETL/ELT solutions on Azure Data Factory. Education: Bachelor's or higher degree in Computer Science or a related discipline. What will give you a competitive edge (preferred qualifications): Experience with Azure-native data/big data tools, technologies, and services, including Storage Blobs, ADLS, Azure SQL DB, Cosmos DB, NoSQL, and SQL Data Warehouse. Sound problem-solving skills in developing data pipelines using Databricks, Stream Analytics, and Power BI. A minimum of 2 years of hands-on experience in programming languages, Azure, and big data technologies such as PowerShell, C#, Java, Python, Scala, SQL, ADLS/Blob, Hadoop, Spark/SparkSQL, Hive, and streaming technologies like Kafka, Event Hub, etc. Additional Information: Travel: 0%. Location: India, Bangalore. Don't meet every single requirement? Studies have shown underrecognized groups are less likely to apply to jobs unless they meet every single qualification. At Elanco we are dedicated to building a diverse and inclusive work environment. If you think you might be a good fit for a role but don't necessarily meet every requirement, we encourage you to apply. You may be the right candidate for this role or other roles! Elanco is an EEO/Affirmative Action Employer and does not discriminate on the basis of age, race, color, religion, gender, sexual orientation, gender identity, gender expression, national origin, protected veteran status, disability, or any other legally protected status.
Posted 4 weeks ago
7.0 - 10.0 years
9 - 12 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
JD: Must-Have: 1. React JS and TypeScript 2. Azure - API Gateway and cloud services like Logic Apps and Functions 3. Git / Bitbucket 4. REST APIs. Below is a short tech list of my team's day-to-day activities. The expectation is that the candidate has working experience with the below-highlighted areas: React apps deployed in Azure; third-party SaaS applications like Twilio and MS Dynamics (CRM); Adobe Experience Manager; Microsoft Chatbot; REST APIs; Guidewire applications - PC/BC/CC/CE; Java/J2EE services; security - JWT, OAuth, AAD, SSO; front-end libraries. Azure: Web Applications, Logic Apps, Functions, App Insights, SQL Server/Database, Cosmos DB, Azure API Gateway, ARM templates/Bicep, DevOps CI/CD pipelines, Azure Data Factory. Good to have: 1. Chatbots/Azure Bot Framework 2. Adobe Experience Manager 3. Guidewire - Core, Portals, Jutro framework 4. Java. Develop and implement highly responsive user interface components using React concepts. Translate designs and wireframes into high-quality code. Test and update the application to ensure a high level of performance and security. Optimize components for maximum performance across a vast array of web-capable devices and browsers. Use additional resources, such as reusable components or front-end libraries. Design different Azure cloud solutions to meet different customer requirements. Use best practices to build unique Azure cloud solutions. Maintain and monitor Azure cloud solutions for availability and performance. Troubleshoot any security issues in Azure cloud solutions. Document application changes and develop updates.
Posted 1 month ago
7.0 - 9.0 years
9 - 11 Lacs
Bengaluru
Work from Office
Job Title: Senior Azure Data Engineer (IoT & Streaming Analytics). Location: Bangalore. Experience: 7-9 years. Job Summary: We are looking for a highly skilled Senior Azure Data Engineer with experience in cloud data and analytics platforms, including a strong focus on Azure cloud services and IoT streaming data. The ideal candidate will design, build, and maintain real-time and batch data pipelines, ensuring scalability, performance, and high availability. You'll collaborate with cross-functional teams including data analysts, scientists, and platform engineers to drive innovative, data-driven solutions. Key Responsibilities: Design, develop, and maintain batch and streaming data pipelines in the Azure cloud environment. Develop and implement strategies for processing and analyzing large volumes of IoT and unstructured streaming data. Build scalable and secure data solutions using Azure services such as Data Factory, Synapse Analytics, Functions, Cosmos DB, and Event Hubs. Collaborate with data analysts and data scientists to deliver insights-driven data models and pipelines. Optimize and tune data workflows for high performance and scalability. Implement monitoring, alerting, and data quality checks across data pipelines. Support CI/CD, automation, and DevOps/DataOps practices. Conduct code reviews and enforce best practices across the team. Required Skills: Cloud Platforms and Tools: 3+ years of hands-on experience with Azure cloud analytical tools, including: Azure Data Factory, Azure Synapse Analytics (including Serverless SQL Pools), Azure Functions, Azure Blob Storage, Azure Cosmos DB, Azure Event Hubs. Experience with AWS or GCP is a plus. Data Engineering and Analytics: 5+ years of experience with data and analytics concepts such as: ETL/ELT, SQL development, data warehousing, reporting and visualization.
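The IoT streaming responsibilities in the listing above center on windowed processing of high-volume telemetry. A minimal sketch of one such operation, a tumbling-window average, is shown below in plain Python; in practice this would run in Stream Analytics or Spark Structured Streaming against Event Hubs, and the (timestamp, value) reading format here is an assumption for illustration only.

```python
from collections import defaultdict

def tumbling_window_avg(readings, window_seconds):
    """Group (timestamp, value) readings into fixed-width windows and
    average each window.

    Each reading is assigned to the window starting at
    (ts // window_seconds) * window_seconds, so windows never overlap."""
    buckets = defaultdict(list)
    for ts, value in readings:
        window_start = ts // window_seconds * window_seconds
        buckets[window_start].append(value)
    return {start: sum(vals) / len(vals)
            for start, vals in sorted(buckets.items())}

# Hypothetical sensor readings: (seconds since start, measured value).
readings = [(0, 10.0), (5, 20.0), (12, 30.0), (17, 50.0)]
averages = tumbling_window_avg(readings, window_seconds=10)
# averages == {0: 15.0, 10: 40.0}
```

A real streaming engine adds what this sketch omits: late-arrival handling via watermarks and incremental state, rather than buffering whole windows in memory.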
Posted 1 month ago
8.0 - 13.0 years
25 - 30 Lacs
Bengaluru
Work from Office
Dot Net Developer. Company Description: We are looking for an immediate joiner who wants to grow with us! Job Description: This is a full-stack senior software developer role with hands-on development experience in the below skill sets. Minimum 8 years of hands-on experience in application development. Must have .NET Core experience with microservices and APIs. Hands-on experience in ReactJS and JavaScript, designing responsive and reactive web pages. Strong hands-on experience with Azure cloud services such as Azure Functions, Azure Storage, Azure Service Bus, Azure App Service, Azure App Insights, Azure Monitoring, Azure Cosmos DB, Azure VMs, etc. Strong experience with SQL Server as well as document databases. Experience creating and working with CI/CD pipelines using Azure DevOps and GitHub Actions. Good knowledge of application and cloud security best practices. Nice to have: UX design competency. Required cloud certification: AZ-900. Start: Immediate. Location: Bangalore. Form of employment: Full-time until further notice; we apply a 6-month probationary employment. We interview candidates on an ongoing basis, so do not wait to submit your application.
Posted 1 month ago
10.0 - 15.0 years
20 - 25 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
This is a remote position. Overview: Define cloud-native product architecture on Azure, ensuring scalability, performance, and modular integration. Design core components including data ingestion pipelines, document processing modules, and AI integration patterns. Translate product requirements and demo flows into a technical blueprint and infrastructure design. Establish Azure service configurations (e.g., App Services, Azure Functions, Blob Storage, Azure OpenAI, Cosmos DB, API Management). Define role-based access and security architecture across personas and workflows. Collaborate with UX, AI, and front-end development teams to enable seamless feature integration. Ensure alignment with requirements including offline data handling and simulated workflows. Design Azure infrastructure for the product with a long-term view toward PaaS readiness. Provide technical leadership and guidance across the solution delivery lifecycle. Support iterative prototyping and manage demo product performance and architecture. Requirements - Required Skills & Qualifications: 10+ years in software architecture, with 3+ years designing solutions on Microsoft Azure. Expertise in Azure services (App Services, Azure Functions, Logic Apps, Storage, AI/ML tools, Security). Experience integrating AI-driven features like document parsing, chatbot interfaces, and data analytics. Experience designing enterprise SaaS applications and working in fast-paced demo/prototype environments. Strong understanding of data modeling, API strategies, and microservices architecture. Excellent communication and stakeholder engagement skills. Desired: Familiarity with the retirement or financial services domain. Benefits: At Exavalu, we are committed to building a diverse and inclusive workforce.
We welcome applications for employment from all qualified candidates, regardless of race, colour, gender, national or ethnic origin, age, disability, religion, sexual orientation, gender identity, or any other status protected by applicable law. We foster a culture that values all individuals and promotes diverse perspectives, where you can make a meaningful impact and advance your career. Exavalu also promotes flexibility depending on the needs of employees, customers, and the business. It might be part-time work, working outside normal 9-5 business hours, or working remotely.
Posted 1 month ago
13.0 - 15.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Data Architect | Bangalore | Technology

Key Responsibilities:
Design and architect end-to-end data solutions using Microsoft Fabric, Azure Data Factory, Azure Synapse Analytics, and other Azure data services. Develop comprehensive data architecture blueprints, including logical and physical data models. Create data integration patterns and establish best practices for data ingestion, transformation, and consumption. Design data lake and lakehouse architectures optimized for performance, cost, and governance. Lead implementation of Microsoft Fabric solutions including Data Factory, Data Activator, Power BI, and Real-Time Analytics. Design and implement medallion architecture (Bronze, Silver, Gold layers) within Fabric. Optimize OneLake storage and data organization strategies. Configure and manage Fabric workspaces, capacity, and security models. Architect complex ETL/ELT pipelines using Azure Data Factory and Fabric Data Factory. Design real-time and batch data processing solutions. Implement data quality frameworks and monitoring solutions.

Required Qualifications:
Overall 13-15 years of experience; 5+ years of experience in data architecture and analytics solutions. Hands-on experience with Microsoft Fabric. Expert-level proficiency in Azure data services (Azure Data Factory, Synapse Analytics, Azure SQL Database, Cosmos DB). Strong experience with Power BI development and administration. Proficiency in SQL, Python, and/or Scala for data processing. Experience with Delta Lake and Apache Spark. Proficiency in data cataloging tools and techniques. Experience in data governance using Purview or Unity Catalog-like tools. Expertise in Azure Databricks in conjunction with Azure Data Factory and Synapse. Implementation and optimization using medallion architecture. Experience with Event Hubs and IoT data (streaming). Strong understanding of Azure cloud architecture and services. Knowledge of Git, Azure DevOps, and CI/CD pipelines for data solutions. Understanding of containerization and orchestration technologies. Hands-on experience with Fabric Data Factory pipelines. Experience with Fabric Data Activator for real-time monitoring. Knowledge of Fabric Real-Time Analytics (KQL databases). Understanding of Fabric capacity management and optimization. Experience with OneLake and Fabric
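The medallion (Bronze, Silver, Gold) layering called for above is usually implemented on Spark and Delta Lake; the pure-Python sketch below only illustrates the raw-to-cleaned-to-aggregated flow, with all field names invented for the example:

```python
def bronze_ingest(raw_rows):
    """Bronze layer: land raw records unchanged, tagging lineage."""
    return [{**row, "_source": "landing"} for row in raw_rows]

def silver_clean(bronze_rows):
    """Silver layer: validate and standardize (rows without an amount are dropped)."""
    cleaned = []
    for row in bronze_rows:
        if row.get("amount") is None:
            continue  # a real pipeline would quarantine bad rows instead
        cleaned.append({"region": row["region"].strip().title(),
                        "amount": float(row["amount"])})
    return cleaned

def gold_aggregate(silver_rows):
    """Gold layer: business-level aggregate (total amount per region)."""
    totals = {}
    for row in silver_rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

raw = [{"region": " east ", "amount": "10"},
       {"region": "West", "amount": None},
       {"region": "East", "amount": "5"}]
print(gold_aggregate(silver_clean(bronze_ingest(raw))))  # {'East': 15.0}
```

In Fabric or Databricks each layer would be a Delta table refreshed by a pipeline, but the contract is the same: Bronze preserves raw data, Silver enforces quality rules, Gold serves consumers.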
Posted 1 month ago
4.0 - 8.0 years
25 - 30 Lacs
Hyderabad
Work from Office
TJX Companies At TJX Companies, every day brings new opportunities for growth, exploration, and achievement. You'll be part of our vibrant team that embraces diversity, fosters collaboration, and prioritizes your development. Whether you're working in our four global Home Offices, Distribution Centers or Retail Stores (TJ Maxx, Marshalls, HomeGoods, Homesense, Sierra, Winners, and TK Maxx), you'll find abundant opportunities to learn, thrive, and make an impact. Come join our TJX family, a Fortune 100 company and the world's leading off-price retailer. Job Description: Senior Engineer What you'll discover Inclusive culture and career growth opportunities A truly global IT organization that collaborates across North America, Europe, Asia and Australia Challenging, collaborative, and team-based environment What you will do Success is always in style at TJX! Continue to explore career opportunities at TJX, a Fortune 100 company and the leading off-price retailer of clothing and home fashions in the U.S. and worldwide. At TJX, we are proud that, for 40 years, we have provided amazing value to our customers; but the merchandise we sell is just part of our story. We believe that our Associates bring our business to life, and we aim to support you by making TJX a terrific place to work. We are committed to leveraging our differences, and believe that the diverse skills, experiences and backgrounds that you bring into the organization will help us continue to succeed. Our retail chains include TJ Maxx, Marshalls, HomeGoods, Sierra Trading Post and Homesense, as well as tjmaxx.com, homegoods.com, marshalls.com and sierratradingpost.com in the U.S.; Winners, HomeSense and Marshalls in Canada; TK Maxx in the U.K., Ireland, Germany, Poland, Austria and the Netherlands, as well as Homesense and tkmaxx.com in the U.K.; and TK Maxx in Australia.
With over $45 billion in sales, more than 3,800 stores and 235,000 Associates worldwide, TJX is an exciting place to grow your career! The Merchandising Solutions team is responsible for software and systems that support TJX's world-class Buying, Product Development, Planning and Allocation functions. This also includes Data Analytics and Automation opportunities. Buying is the start of our business cycle - we build and support systems that we use to source, select, negotiate and procure products our customers treasure. Product Development covers the full life cycle of bringing an idea from concept to a physical product our customers will find in our stores and love. Planning and Allocation functions are the know-how of where and how we place our products and how we plan our Buying strategy. We build and support systems that hold our proprietary expertise to make the best decisions for our customers and us. Data Analytics focuses on the data needs of our Buyers and Merchandisers, creating capabilities for them to make the best decisions for TJX and our customers. Automation covers RPA and Power Platform solutions that enable our business to work in a more effective and efficient manner. You will be a part of a fully integrated agile team that is empowered to make decisions and look for improvements in how you get your work done. You will design, build, test and deliver software solutions that align to our business value. You will work with various stakeholders both inside our team and the broader TJX organization to refine ideas and provide options to solve problems. You will be laser focused on the quality and goodness of what we deliver and how we deliver it. Our ideal candidate will have a combination of deep technical knowledge and experience with custom, cloud or hosted platforms. We are looking for a person who likes to influence and bring others up and truly be a part of a team.
We want creative people who are familiar with the challenges of designing, developing, and deploying software both on-prem and in the cloud. What you will need We are looking for an experienced Senior Software Engineer who has broad technical domain experience. Someone who is not afraid of continuous learning and improving themselves and the team. With us you will have a chance to collaborate with other software engineers and be able to work with and influence full stack solutions. You will need a passion for collecting and developing knowledge and building expertise in our solutions and business model. You are also expected to contribute to the full software development life cycle, including hands-on coding, code reviews, source control management, build processes, testing, DevOps, engineering excellence and operations excellence. Major Duties & Responsibilities A Senior Engineer has the knowledge & experience to design and implement business functionality in their assigned Platform(s)/Product(s). Engineers at this level can lead & deliver on assigned Epics in their supported area. They use DevSecOps best practices to ship high-quality code and continue to push their knowledge. End to End Feature Development Develop new features and services of low-medium complexity, working with the product / platform / infrastructure / security team. Design, code, deploy & support working software/technology components, working collaboratively with architects and other engineers. Build systems and services that have immediate impact for our business. Create architecture, estimates, design, code, and COTS configuration for the required business feature. Responsible for code analysis, debugging, review and execution of unit/integration tests. Create test automation scripts. Responsible for delivering clean, secure, performant code / technology components that meet all the non-functional requirements.
Responsible for achieving operational excellence as part of delivering features. Acts in a Lead Engineer capacity for medium to large initiatives, prioritizing and assigning tasks, providing guidance and resolving issues. Sets up application jobs, creates test data, and supports deployment, all at a medium to high level of complexity. Adheres to Sarbanes-Oxley compliance and all TJX Company standards as applicable. Fully owns Features; co-owns Epics with strong guidance. Begins deeper questioning of processes in order to improve them. Strategy & Best Practices Actively participates in development process definitions, best practices and standards. Can help lead evaluation of 3rd party engagement needs. Talent Development & Evangelizing Helps onboard and mentor new hires. Mentors other engineers on the team. Supports the Talent Acquisition process by performing technical interviews for hiring. Learns and champions DevOps practices. Subject Matter Expert Develops expert knowledge in specific business applications / services supporting a given technical domain. Has a solid understanding of the end-to-end business application processes in a given technical domain. Provides input into technical application roadmaps, including the recommendation of IT-driven projects. Minimum Qualifications 4 - 8 years of experience as a software engineer with full stack development skills (C#, Front End: Blazor/React/Angular, REST APIs, Entity Framework, Bicep, Azure, DevOps etc.) Good understanding of Clean Architecture principles.
Strong experience with Azure Functions, Cosmos DB, Event Hubs & Event Grid Experience/exposure with DevSecOps in the Azure space using GitHub Experience building and supporting scalable, high-performance applications Experience with the agile delivery model Preferred Qualifications Azure certifications are a plus Front-end development using Blazor is a plus In addition to our open-door policy and supportive work environment, we also strive to provide a competitive salary and benefits package. TJX considers all applicants for employment without regard to race, color, religion, gender, sexual orientation, national origin, age, disability, gender identity and expression, marital or military status, or based on any individual's status in any group or class protected by applicable federal, state, or local law. TJX also provides reasonable accommodations to qualified individuals with disabilities in accordance with the Americans with Disabilities Act and applicable state and local law. Address: Salarpuria Sattva Knowledge City, Inorbit Road Location: APAC Home Office Hyderabad IN
Posted 1 month ago
3.0 - 5.0 years
7 - 12 Lacs
Pune
Work from Office
Role & responsibilities
Proficient:
Languages/Framework: FastAPI, Azure UI Search API (React)
Databases and ETL: Cosmos DB (API for MongoDB), Data Factory, Databricks
Proficiency in Python and R
Cloud: Azure Cloud basics (Azure DevOps)
GitLab: GitLab pipelines
Ansible and REX: REX deployment
Data Science: Prompt engineering + modern testing; data mining and cleaning; ML (supervised/unsupervised learning); NLP techniques; knowledge of deep learning techniques including RNNs and transformers; end-to-end AI solution delivery; AI integration and deployment; AI frameworks (PyTorch); MLOps frameworks; model deployment processes; data pipeline monitoring
Expert (in addition to proficient skills):
Languages/Framework: Azure OpenAI
Data Science: OpenAI GPT family of models (4o/4/3), Embeddings + Vector Search
Databases and ETL: Azure Storage Account
Expertise in machine learning algorithms (supervised, unsupervised, reinforcement learning). Proficiency in deep learning frameworks (TensorFlow, PyTorch). Strong mathematical foundation (linear algebra, calculus, probability, statistics). Research methodology and experimental design. Proficiency in data analysis tools (Pandas, NumPy, SQL). Strong statistical and probabilistic modelling skills. Data visualization skills (Matplotlib, Seaborn, Tableau). Knowledge of big data technologies (Spark, Hive). Experience with AI-driven analytics and decision-making systems.
Note: ***Notice period should not be more than 10-15 days.***
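The "Embeddings + Vector Search" skill listed above reduces to nearest-neighbour lookup by cosine similarity; below is a stdlib-only sketch (the three-dimensional "embeddings" are invented for illustration — a real system would obtain vectors from an embedding model such as Azure OpenAI):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def top_k(query_vec, index, k=1):
    """Return the k document ids most similar to the query vector."""
    ranked = sorted(index.items(),
                    key=lambda item: cosine(query_vec, item[1]),
                    reverse=True)
    return [doc_id for doc_id, _ in ranked[:k]]

# Toy index: document id -> embedding (illustrative values only).
index = {"doc_pricing":  [0.9, 0.1, 0.0],
         "doc_returns":  [0.1, 0.9, 0.1],
         "doc_shipping": [0.0, 0.2, 0.9]}
print(top_k([0.8, 0.2, 0.1], index, k=2))  # ['doc_pricing', 'doc_returns']
```

Managed services such as Azure AI Search perform the same ranking at scale using approximate-nearest-neighbour indexes rather than this exhaustive scan.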
Posted 1 month ago
2.0 - 5.0 years
4 - 7 Lacs
Bengaluru
Work from Office
Software Engineer About JLL Technologies JLL is a leading professional services firm that specializes in real estate and investment management. Our vision is to reimagine the world of real estate, creating rewarding opportunities and amazing spaces where people can achieve their ambitions. In doing so, we will build a better tomorrow for our clients, our people and our communities. JLL Technologies is a specialized group within JLL. At JLL Technologies, our mission is to bring technology innovation to commercial real estate. We deliver unparalleled digital advisory, implementation, and services solutions to organizations globally. Our goal is to leverage technology to increase the value and liquidity of the world's buildings, while enhancing the productivity and the happiness of those that occupy them. What this job involves As the Full Stack Software Engineer, you will help JLL Technologies build the world's best AI-driven applications for exploring and analyzing the global commercial real estate market while helping real estate owners and occupiers see options and context. Responsibilities Hands-on development on our leading AI (Artificial Intelligence) Apps and Platform Team. Innovate new AI-driven ideas and perform quick proof-of-concepts. Design and implement features and epics. Automated unit testing and automated QA testing. Review code and enforce team standards for quality, test coverage, performance etc. Network with engineers on adjacent teams and facilitate alignment and information sharing. Continuously improve the development/delivery process. Consult with the business to develop documentation and communication materials to ensure accurate usage and interpretation of JLL business requirements. The application is a Premium Tier 1 app. Though production incidents happen rarely, flexibility to support the team during off hours in case of incidents is expected from this role.
Sound like the job you're looking for? Before you apply, it's also worth knowing what we're looking for: Minimum Qualifications 2+ years of experience developing rich interactive web applications Excellent programming, software architecture, and communication skills Experience with agile software development methodologies (Scrum, etc.) Technical Skills & Competencies (Mandatory) C#, .NET Core React.js 16.8+ using ES6+ or TypeScript, Redux, Hooks REST API development Cosmos DB or other NoSQL databases Experience with developing using services on Azure or AWS Cloud DevOps tools and pipelines such as GitHub Workflows. Agile Scrum/Kanban working experience is preferred. Good technical writing, documentation, and communication skills. Self-motivated team player who is flexible and adapts in a quickly changing environment. Technical Skills & Competencies (Preferred) Monitoring tools like Datadog, Splunk, etc. Experience with LLMs. Preferred Qualifications Ability to define and implement a solution across a complex application stack Bachelor's in Computer Science or Engineering
Posted 1 month ago
6.0 - 8.0 years
25 - 30 Lacs
Bengaluru, India
Hybrid
Experience - 5 to 8 yrs Location - Bengaluru Preference - Local candidates Mode - Hybrid (2 days from office) Position - Permanent FTE Must-have skills - Java, Spring Boot, multithreading, logging, Angular, Kafka & Azure cloud
Posted 1 month ago
7.0 - 11.0 years
6 - 10 Lacs
Pune
Work from Office
Primary Skills
.NET Core and .NET Framework development: In-depth experience in building scalable and maintainable applications using C#. This includes web applications, APIs, background services, and integration with third-party systems.
Azure App Services, Azure Functions, and Azure DevOps: Hands-on expertise in deploying applications to Azure App Services, creating serverless workflows with Azure Functions, and managing end-to-end CI/CD pipelines using Azure DevOps.
Docker containerization and image management: Skilled in writing Dockerfiles, building and managing container images, and using Docker Compose for multi-container applications. Ensures consistent environments across development, testing, and production.
Kubernetes orchestration and deployment: Proficient in deploying and managing containerized applications using Kubernetes. Experience includes writing YAML manifests for deployments, services, config maps, and secrets, as well as managing scaling, rolling updates, and health checks.
CI/CD pipeline creation and management: Capable of designing and implementing automated pipelines for building, testing, and deploying applications. Familiar with tools like Azure DevOps, GitHub Actions, and Jenkins to ensure smooth and reliable delivery processes.
RESTful API development and integration: Strong understanding of REST principles and experience in designing, building, and consuming APIs. Uses tools like Swagger/OpenAPI for documentation and Postman for testing and validation.
Microservices architecture design: Experience in designing and implementing microservices-based systems using .NET and Docker. Focuses on modularity, scalability, and resilience, with inter-service communication via HTTP or messaging systems.
Infrastructure as Code (IaC): Skilled in automating infrastructure provisioning using tools like Bicep, ARM templates, or Terraform. Ensures consistent and repeatable deployments of Azure resources across environments.
Secondary Skills
Azure Monitor, Application Insights, and Log Analytics: Familiar with monitoring and diagnostics tools in Azure to track application performance, detect anomalies, and troubleshoot issues using telemetry and logs.
Helm charts for Kubernetes deployments: Basic to intermediate knowledge of using Helm to package, configure, and deploy Kubernetes applications, enabling reusable and version-controlled deployments.
Git and version control best practices: Proficient in using Git for source control, including branching strategies, pull requests, and code reviews to maintain code quality and collaboration.
SQL and NoSQL database integration: Experience in integrating applications with databases like Azure SQL, PostgreSQL, and Cosmos DB. Capable of writing optimized queries and managing database connections securely.
Security best practices in cloud and container environments: Understanding of authentication, authorization, and secure communication practices. Familiar with managing secrets, certificates, and identity access in Azure and Kubernetes.
Agile/Scrum methodologies: Comfortable working in Agile teams, participating in sprint planning, daily stand-ups, retrospectives, and using tools like Azure Boards or Jira for task tracking.
Unit testing and integration testing frameworks: Knowledge of writing and maintaining tests using frameworks like xUnit, NUnit, or MSTest. Ensures code reliability and supports test-driven development practices.
Basic networking and DNS concepts in cloud environments: Understanding of virtual networks, subnets, firewalls, load balancers, and DNS configurations in Azure and Kubernetes to support application connectivity and security.
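Resilient REST integration of the kind described above typically pairs API calls with retry and exponential backoff; a minimal stdlib sketch of the pattern (the flaky operation is simulated, and the helper name is our own illustration, not from any listed framework):

```python
import time

def with_retries(op, attempts=4, base_delay=0.01, sleep=time.sleep):
    """Call op(); on failure retry with exponential backoff (base_delay doubling each try)."""
    for attempt in range(attempts):
        try:
            return op()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries: surface the last error
            sleep(base_delay * (2 ** attempt))

# Simulated transient failure: succeeds on the third call.
calls = {"n": 0}
def flaky_request():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient")
    return "ok"

print(with_retries(flaky_request))  # ok
```

Libraries such as Polly (.NET) or tenacity (Python) package the same idea with jitter and circuit-breaker options.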
Posted 1 month ago
7.0 - 12.0 years
30 - 35 Lacs
Hyderabad
Work from Office
Seize the opportunity to be at the forefront of digital innovation in the reinsurance sector by joining us as a Lead DevSecOps Engineer for our innovative platform providing quick, efficient, and user-friendly solutions for facultative reinsurance business. About the Role We have an open Lead DevSecOps Engineering position where we need your expertise while you can grow further. We are looking for an experienced Engineer who can contribute to our delivery by forming and maintaining a high-quality DevSecOps function. As a Lead DevSecOps Engineer, you take ownership for the DevSecOps function in an agile setup. This includes: Plan and execute infrastructure improvements to guarantee a stable rollout of features together with the feature Squads and architects Coordinate and manage the release process, working in collaboration with feature and quality Squads Hands-on contribution in analyzing and resolving system instabilities Take care of vulnerabilities and coordinate their timely resolution Our applications are written in Java Spring Boot and .NET/C# leveraging various Azure Cloud Services (e.g., AKS, Cosmos DB, PostgreSQL, ASB). The Backend serves various user specific Frontends written in Angular. You will work with an Azure Cloud based application that Has a worldwide user base Is highly integrated into the full value chain of Swiss Re (including core legacy services) Must fulfill high security and compliance standards Uses state of the art technology stacks We work in an agile setup where empowerment of the team is of highest value. About You Nobody is perfect and meets 100% of requirements. If you, however, meet some of the criteria below and are genuinely curious about the world of software engineering, we will be happy to meet you. 
If you possess traits and experience that match most of the below, we would love to hear from you: Experience in leading DevSecOps activities in agile environments 7+ years of proven experience as a DevSecOps Engineer or similar role in software development Strong knowledge of software testing methodologies, tools, and processes Experience with Azure cloud services, and proficient knowledge in development (Java, C#, Angular), network architecture and security Excellent communication skills in English, both written and verbal, to effectively interact with the team and stakeholders Proactive and collaborative, with a strong sense of ownership and a willingness to go the extra mile to deliver quality solutions
Posted 1 month ago
6.0 - 10.0 years
3 - 8 Lacs
Noida
Work from Office
Position: Pega LSA Architect Experience: 12+ years Location: Noida/ Chennai Educational Qualification: B.E./ B.Tech./ MCA Job Description: Minimum 12+ years of PEGA experience, at least 5+ years working as an LSA. Should be certified CLSA (Part 1 & 2 both) in Pega PRPC v6.x/v7.x/v8. Responsibilities: Lead the design and development of PEGA applications Collaborate with business stakeholders to understand requirements and develop solutions Conduct code reviews and ensure adherence to PEGA best practices Troubleshoot and resolve technical issues during development and post-production Provide technical guidance and mentorship to a team of Pega developers and Tech Leads Work closely with project managers to ensure projects are delivered on time and within scope Continuously explore new Pega features, tools, and technologies to enhance solution delivery Lead the PEGA upgrade programs Lead the configurations and deployments Contribute to new proposal development Contribute to internal capability-building Technical Skills: In-depth knowledge of the following Pega Platform 8.x components: Application Structure, Data Management, Process Automation, UI/UX Components, Decisioning Components, Security Components, Deployment, Reporting, PEGA Cloud (PDC), Advanced Features (Cosmos UI & Constellation UI, PEGA Mobile) Proficiency in Direct Capture of Objectives (DCO) methodology Proficiency in integration rules (SOAP, REST, MQ, JMS, Kafka, etc.) Expertise in PEGA rules such as Decision rules, Declarative rules, Data pages, Validation rules, Data transforms and Activities, Flows and Flow Actions, Queue processors and job schedulers, Layouts, Report definitions, Multivariate circumstancing, Access groups and roles, Rule skimming Experience in the Telecom/ Energy/ Banking/ Healthcare domain
Posted 1 month ago
5.0 - 8.0 years
8 - 18 Lacs
Bengaluru
Work from Office
Role - Azure Database Admin Location - Bangalore Experience - 5 - 8 yrs Notice Period - Immediate to 10 days Working Shift - 12:30 PM to 9:30 PM
Posted 1 month ago