
28300 Azure Jobs - Page 31

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

7.0 - 10.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Source: LinkedIn

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Microsoft Sustainability Manager Senior Developer – Consulting

As a developer working in the GDS Consulting team within the Digital & Emerging team, your primary responsibility will be to design and implement cutting-edge sustainability solutions for clients on a global scale on the Microsoft Cloud for Sustainability industry cloud. Your role involves leveraging your expertise to ensure these solutions align with industry best practices and deliver tangible value to clients.

Your Key Responsibilities
- Design and build Model Driven Apps for a variety of business needs, ensuring efficient data models, logical relationships, and optimized user interfaces.
- Design and develop Model Driven Apps (MDAs) focused on sustainability initiatives, such as carbon footprint tracking, resource management, and supply chain optimization.
- Configure and customize Microsoft Sustainability Manager (MSM) solutions to meet specific client needs and industry challenges.
- Design and build engaging dashboards and reports in Power BI to visualize sustainability data and track progress towards goals.
- Develop and maintain KPI models to measure and track key performance indicators for sustainability initiatives.
- Collaborate with data analysts, scientists, and other stakeholders to understand complex data models and ensure accurate and reliable data visualization.
- Stay updated on the latest trends and technologies in sustainable software development and apply them to our solutions.
- Understand the Microsoft Cloud for Sustainability common data model.

Skills and Attributes for Success
- Proven experience as a Microsoft Cloud for Sustainability industry cloud developer or in an equivalent development role, with a strong focus on Model Driven Apps within the Microsoft Power Platform and Azure.
- In-depth understanding of data modelling principles and experience designing efficient data models in Microsoft Dataverse.
- Experience in Power Platform core (Dataverse/CDS, Canvas Apps, Model Driven Apps, Custom Pages, Power Portals/Power Pages) and Dynamics CRM/365.
- Strong coding experience in Model Driven App development, including plugin development, PCF components, ribbon customization, FetchXML and XRM APIs.
- Strong, proven experience with Power Automate and an efficiency/performance-driven solution approach.
- Strong, proven experience creating custom forms with validations using JavaScript.
- Experience developing PCF components is an added advantage.
- Expertise in building user interfaces using the Model Driven App canvas and customizing forms, views, and dashboards.
- Proficiency in Power Automate for workflow automation and logic implementation.
- Experience designing cloud-based solutions using Microsoft Azure technologies, including Azure Synapse, ADF, Azure Functions and Data Lake.
- Experience with integration techniques, including connectors and custom APIs (Application Programming Interfaces).
- Experience in Power BI, including advanced functions and DAX scripting, advanced Power Query, and data modelling on the CDM.
- Experience in Power Fx is an added advantage.
- Strong knowledge of Azure DevOps and CI/CD pipelines, including their setup for automated build and release management.
- Experience leading teams to execute high-quality deliverables within stipulated timelines.
- Excellent written and verbal communication skills.
- Ability to deliver technical demonstrations.
- Quick learner with a "can do" attitude.
- Demonstrates and applies strong project management skills, inspiring teamwork and responsibility among engagement team members.

To qualify for the role, you must have:
- A bachelor's or master's degree.
- A minimum of 7-10 years of experience, preferably with a background in a professional services firm.
- Excellent communication skills; consulting experience preferred.

Ideally, you will also have:
- The analytical ability to manage multiple projects and prioritize tasks into manageable work products.
- The ability to operate independently or with minimum supervision.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 day ago

Apply

5.0 - 7.0 years

15 - 17 Lacs

Chennai

Work from Office

Source: Naukri

Location(s): Pune (preferred), Chennai, Bengaluru. Flexibility for remote work within these locations may be considered for the right candidate.

Who You Are:

Technical Expertise:
- Proficient in .NET Core with 5+ years of hands-on expertise, demonstrating a strong foundation in developing robust, scalable applications using .NET technologies.
- Specializes in:
  - .NET Core (expert level): deep knowledge of building and maintaining high-performance, server-side applications with .NET Core.
  - Microservices (advanced level): experienced in designing, developing, and implementing microservices architectures, understanding the principles of autonomy, granularity, and independent scaling.
  - RESTful/GraphQL APIs (advanced level): proficient in creating and managing APIs, ensuring they are secure, scalable, and performant.
  - Cloud environments such as AWS/Azure (intermediate level): solid experience in leveraging cloud services for deploying, managing, and scaling applications.
- Skilled at writing clean, scalable code that drives innovation, emphasizing maintainability and best practices in software development.
- Experience includes working with:
  - ORM: understanding of object-relational mapping to facilitate data manipulation and querying in a database-agnostic manner.
  - JSON: proficient in using JSON for data interchange between servers and web applications.
  - Event-driven architecture: knowledgeable in building systems that respond dynamically to events, improving application responsiveness and scalability.
  - Inversion of Control (IoC) and Aspect-Oriented Programming (AOP): implementing these patterns to increase modularity and separation of concerns.
  - Containerization: experience with Docker or similar technologies for encapsulating application environments, enhancing consistency across development, testing, and production.
  - Service discovery and service mesh: familiarity with managing microservices communication patterns, ensuring services are dynamically discoverable and communicable.
  - Multi-threading: expertise in developing applications that efficiently execute multiple operations concurrently to improve performance.
- Proficient with:
  - RDBMS and NoSQL (intermediate level): competent in working with relational and non-relational databases, understanding their respective use cases and optimization techniques.
  - Jira (advanced level) and Git (advanced level): advanced proficiency in project management with Jira and version control with Git, ensuring efficient workflow and code management.
  - Maven (intermediate level): knowledgeable in using Maven for project build and dependency management in .NET environments.
  - Jenkins (intermediate level): experienced in implementing CI/CD pipelines with Jenkins, automating the software development process for increased productivity and reliability.
- Utilizes these tools and platforms effectively in the software development process, contributing to the delivery of high-quality software solutions.

Analytical Thinker: A strategic thinker passionate about engaging in requirements analysis and solving complex issues through software design and architecture.
Team Player: A supportive teammate ready to mentor, uplift your team, and collaborate with internal teams to foster an environment of growth and innovation.
Innovation-Driven: Always on the lookout for new technologies to disrupt the norm, you're committed to improving existing software and eager to lead the charge in integrating AI and cutting-edge technologies.
What We Offer:
- Leadership & Impact: Step into a pivotal role within our innovation team, driving projects from inception to impactful implementation. Your leadership in integrating AI and cutting-edge tech into the financial sector will leave a lasting mark on clients and the industry.
- Growth & Learning: Immerse yourself in continuous learning, mastering the intricacies of SDLC documentation and project leadership. We provide the environment for nurturing skills and advancing careers.
- Recognition & Excellence: Your dedication and innovative contributions are celebrated here, with rewards and acknowledgment for your efforts toward shared goals.
- Global Influence: Lead initiatives with global impact, reshaping financial accessibility and championing sustainable tech practices worldwide.

Posted 1 day ago

Apply

1+ years

0 Lacs

Gurgaon, Haryana, India

On-site

Source: LinkedIn

At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.

How will you make an impact in this role?
As part of our diverse tech team, you can architect, code and ship software that makes us an essential part of our customers' digital lives. Here, you can work alongside talented engineers in an open, supportive, inclusive environment where your voice is valued, and you make your own decisions on what tech to use to solve challenging problems. American Express offers a range of opportunities to work with the latest technologies and encourages you to back the broader engineering community through open source. And because we understand the importance of keeping your skills fresh and relevant, we give dedicated time to invest in professional development and learning experiences for interns. Find your place in technology at #TeamAmex.

Key Responsibilities
- Perform technical aspects of software development for assigned applications, including design, prototype development, and coding assignments.
- Debug software components and identify code defects for remediation.
- Explore and innovate new solutions to modernize platforms.

Preferred Qualifications
- 1+ years of software development experience in a professional environment and/or comparable experience.
- Hands-on Java/J2EE, RESTful API development, Spring Boot, microservices, and a BPM tool (Pega, jBPM, Camunda, etc.).
- Hands-on expertise with application design, software development and automated testing.
- Experience in Agile development, application design, software development, and testing.
- Experience with continuous integration/deployment tooling (Jenkins, Maven, XLR, Mockito, SoapUI, JMeter, OpenShift, public cloud (AWS/GCP/Azure), Docker).
- Ability to effectively communicate architecture and solution design to internal and external business partners.
- Bachelor's degree in computer science, computer science engineering, or related experience required; advanced degree an advantage.
- Added advantage: HTML, CSS, AJAX, JavaScript frameworks (React) and Node.js.
- Java certification is a plus.

Minimum Qualifications
- Collaborates with leadership across multiple teams to define solution requirements and technical implementation (Engineering & Architecture).
- Demonstrates technical expertise to help team members overcome technical problems.
- Solves technical problems outside of day-to-day responsibilities.

We back our colleagues and their loved ones with benefits and programs that support their holistic well-being. That means we prioritize their physical, financial, and mental health through each stage of life.
Benefits include:
- Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need.
- Free access to global on-site wellness centers staffed with nurses and doctors (depending on location).
- Free and confidential counseling support through our Healthy Minds program.
- Career development and training opportunities.

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of contractual employment as an internship with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.

Posted 1 day ago

Apply

6.0 years

0 Lacs

Pune, Maharashtra

On-site

Source: SimplyHired

Technical Specialist with 6+ years' experience in Java full stack development.
- Preferred domain knowledge: Banking.
- Experience in development and enhancement projects.
- Strong full stack development experience in Core Java and advanced Java.
- Angular JS, Node JS, HTML, CSS, JavaScript.
- Spring, Spring MVC, Spring Boot framework, Hibernate and JPA, REST API and SOAP services.
- Good to have: Azure/AWS cloud experience, along with tools such as Service Manager and XLRelease for DevOps, Kubernetes, Docker.
- Experience with Volante is an added advantage.
- Exposure to an agile teamwork process is an added advantage.
- Good team player and self-motivated.
- Analysis of APIs and existing Java code.
- Test-driven development using Spring Boot, Hibernate, JPA.
- Writing Hibernate mapping files and maintaining the database.
- Designing controller, service, utility and DAO classes.

Posted 1 day ago

Apply

4.0 - 9.0 years

12 - 15 Lacs

Mumbai

Remote

Source: Naukri

Job description: Senior Data Analyst (Power BI)
Experience: 5+ years
Opportunity Type: Permanent & Remote Placement

Technical and Professional Requirements:
- Develop, design and maintain interactive and insightful Power BI dashboards.
- Strong analytical and problem-solving skills.
- Knowledge of extracting data from multiple sources.
- Good understanding of data modelling and querying data from sources.
- Basic understanding of DAX and Power Query.
- Provide technical support and troubleshooting.
- Ability to ensure optimal performance and data integrity.
- Expertise in supporting deployments.
- Security and managing roles in Power BI.

Job Responsibilities:
- Proficiency in all aspects of Power BI, including report development, data modelling, and visualization.
- Strong understanding of DAX and Power Query for data transformation and modelling.
- Good knowledge of database concepts and experience working with relational databases.
- Proficient in writing and optimizing SQL queries.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.
- Ability to translate business requirements into effective and visually appealing Power BI reports.
- Knowledge of Python and machine learning is beneficial.
- Knowledge of cloud technologies, particularly Azure and SharePoint.

Posted 1 day ago

Apply

7.0 - 9.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Location: Hyderabad | Experience: 7-9 years | Employment Type: Full-time

Company Description
At Antz AI, we lead the AI revolution with a focus on AI agentic solutions. Our mission is to integrate intelligent AI agents and human-centric AI solutions into core business processes, driving innovation, efficiency, and growth. Our consulting is grounded in robust data centralization, resulting in significant boosts in decision-making speed and reductions in operational costs. Through strategic AI initiatives, we empower people to achieve more meaningful and productive work.

Role Summary
We are seeking a highly experienced and dynamic Senior AI Consultant with 7-9 years of overall experience to join our rapidly growing team. The ideal candidate will be a hands-on technologist with a proven track record of designing, developing, and deploying robust AI solutions, particularly leveraging agentic frameworks. This role demands a blend of deep technical expertise, strong system design capabilities, and excellent customer-facing skills to deliver impactful real-world products. We are looking for an immediate joiner who can hit the ground running.

Key Responsibilities
- Solution Design & Architecture: Lead the design and architecture of scalable, high-performance AI solutions, emphasizing agentic frameworks (e.g., Agno, LangGraph) and microservices architectures.
- Hands-on Development: Develop, implement, and optimize AI models, agents, and supporting infrastructure. Write clean, efficient, and well-documented code, adhering to software engineering best practices.
- Deployment & Operations: Oversee the deployment of AI solutions into production environments, primarily utilizing AWS services. Implement and maintain CI/CD pipelines to ensure seamless and reliable deployments.
- System Integration: Integrate AI solutions with existing enterprise systems and data sources, ensuring robust data flow and interoperability.
- Customer Engagement: Act as a key technical liaison with clients, understanding their business challenges, proposing AI-driven solutions, and presenting technical concepts clearly and concisely.
- Best Practices & Quality: Champion and enforce best practices in coding, testing, security, and MLOps to ensure the delivery of high-quality, maintainable, and scalable solutions.
- Problem Solving: Diagnose and resolve complex technical issues related to AI model performance, infrastructure, and integration.
- Mentorship: Provide technical guidance and mentorship to junior team members, fostering a culture of continuous learning and excellence.

Required Qualifications
- Experience: 7-9 years of overall experience in software development, AI engineering, or machine learning, with a strong focus on deploying production-grade solutions.
- Agentic Frameworks: Demonstrated hands-on experience with agentic frameworks such as LangChain, LangGraph, Agno, AutoGen, or similar, for building complex AI workflows and autonomous agents (a minimal, framework-agnostic sketch follows this listing).
- Microservices Architecture: Extensive experience in designing, developing, and deploying solutions based on microservices architectures.
- Cloud Platforms: Proven expertise in AWS services relevant to AI/ML and microservices (e.g., EC2, S3, Lambda, ECS/EKS, SageMaker, DynamoDB, API Gateway, SQS/SNS).
- Programming & MLOps: Strong proficiency in Python. Experience with MLOps practices, including model versioning, monitoring, and pipeline automation.
- System Design: Excellent understanding and practical experience in system design principles, scalability, reliability, and security.
- Real-World Deployment: A strong portfolio demonstrating successful deployment of AI products or solutions in real-world, production environments.
- Customer-Facing: Prior experience in customer-facing roles, with the ability to articulate complex technical concepts to non-technical stakeholders and gather requirements effectively.
- Immediate Availability: Ability to join immediately.

Preferred Qualifications / Bonus Points
- Experience with other cloud platforms (Azure, GCP).
- Knowledge of containerization technologies (Docker, Kubernetes).
- Familiarity with various machine learning domains (NLP, computer vision, generative AI).
- Contributions to open-source AI projects.
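For illustration only (not part of the posting): a minimal, framework-agnostic sketch of the tool-calling agent loop that libraries such as LangGraph or AutoGen formalize. All names (MiniAgent, plan_next_step, the tools dictionary) are hypothetical placeholders, and the planning step is stubbed out where a real system would call an LLM.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class AgentStep:
    tool: str          # which tool the agent chose
    tool_input: str    # argument passed to the tool
    observation: str   # what the tool returned

@dataclass
class MiniAgent:
    """Toy agent loop: plan -> call tool -> observe -> repeat."""
    tools: Dict[str, Callable[[str], str]]
    history: List[AgentStep] = field(default_factory=list)

    def plan_next_step(self, goal: str) -> str:
        # Placeholder: a real agent would ask an LLM to pick the next tool here.
        return "search" if not self.history else "finish"

    def run(self, goal: str, max_steps: int = 5) -> str:
        for _ in range(max_steps):
            tool_name = self.plan_next_step(goal)
            if tool_name == "finish":
                break
            observation = self.tools[tool_name](goal)
            self.history.append(AgentStep(tool_name, goal, observation))
        return self.history[-1].observation if self.history else "no result"

if __name__ == "__main__":
    agent = MiniAgent(tools={"search": lambda q: f"stub results for: {q}"})
    print(agent.run("summarize this quarter's deployment incidents"))
```

The real frameworks named in the posting add state graphs, memory, and multi-agent handoffs on top of this basic loop.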

Posted 1 day ago

Apply

10.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Source: SimplyHired

Req ID: 328040

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Digital Twin Solution Architect to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Experience: 10+ years

Key Responsibilities:
- Define the end-to-end architecture for Digital Twin solutions across cloud, edge, and on-premise environments.
- Lead design and integration of IoT, 3D modeling, simulation, and AI/ML components.
- Collaborate with business stakeholders to align digital twin strategy with operational goals.
- Evaluate technologies like Azure Digital Twins, NVIDIA Omniverse, Unity, etc.

Skills Required:
- Strong experience in Digital Twin platforms (Azure, Siemens, Dassault, PTC, etc.).
- Deep understanding of industrial systems, IoT architecture, OPC-UA, MQTT (see the sketch after this listing).
- Knowledge of CAD/BIM integrations, APIs, 3D data pipelines.
- Hands-on with cloud-native tools (Azure, AWS, GCP).

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
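For illustration only (not part of the posting): a minimal sketch of publishing one digital-twin telemetry reading over MQTT, assuming the paho-mqtt Python client. The broker host, topic, and sensor fields are hypothetical placeholders.

```python
import json
from datetime import datetime, timezone

import paho.mqtt.publish as publish  # assumed dependency: pip install paho-mqtt

# Hypothetical broker and topic for a factory digital twin.
BROKER_HOST = "broker.example.com"
TOPIC = "plant1/line3/pump42/telemetry"

def build_telemetry(temperature_c: float, vibration_mm_s: float) -> str:
    """Serialize one sensor reading as the JSON payload the twin ingests."""
    return json.dumps({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "temperature_c": temperature_c,
        "vibration_mm_s": vibration_mm_s,
    })

if __name__ == "__main__":
    # publish.single opens a connection, sends one message, and disconnects.
    publish.single(TOPIC, payload=build_telemetry(71.5, 2.3), hostname=BROKER_HOST, qos=1)
```

In a real architecture this payload would typically be routed from the broker or IoT hub into the digital-twin platform (for example Azure Digital Twins) rather than printed or consumed directly.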

Posted 1 day ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu

On-site

Source: SimplyHired

Req ID: 322998

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Database Administrator to join our team in Chennai, Tamil Nadu (IN-TN), India (IN).

Job Title: Database Administrator (Microsoft SQL & Azure SQL)
Seniority: 4-5, offshore

Profile Summary:
We are looking for a highly skilled Database Administrator (DBA) with strong expertise in Microsoft SQL, Azure SQL, Windows environments, and networking; experience with PostgreSQL is a plus. This role requires excellent technical knowledge combined with strong communication skills, the ability to work effectively in a team, and autonomy in decision-making when necessary. The DBA will be responsible for managing, maintaining, and optimizing database systems to ensure their reliability, security, and performance. Additionally, as part of a database administration team, the candidate is encouraged to expand their skills by incorporating knowledge of other database technologies, such as DB2. Certifications will be considered a plus.

Key Accountabilities:
- Design, implement, and maintain Microsoft SQL Server and Azure SQL databases.
- Ensure database security, integrity, and compliance with industry standards.
- Monitor, troubleshoot, and optimize database performance (a small monitoring sketch follows this listing).
- Perform database migrations, backups, and disaster recovery planning and periodic execution.
- Collaborate with development and infrastructure teams to optimize database performance and integration.
- Document database structures, configurations, and procedures.
- Automate routine database tasks and enhance monitoring processes.
- Provide support for incident resolution and root cause analysis.

Business Accountabilities:
- Act as a subject matter expert in database management for Microsoft SQL and Azure SQL.
- Work closely with stakeholders to understand business needs and provide database solutions.
- Ensure databases align with organizational goals and performance expectations.
- Support data governance, ensuring compliance with internal and external regulations.
- Effectively communicate database-related topics to both technical and non-technical audiences.
- Proactively identify opportunities to improve database processes and reduce risks.

Soft Skills:
- Strong communication skills, both written and verbal.
- Ability to work collaboratively in a team-oriented environment.
- Autonomous decision-making for database-related tasks.
- Analytical thinking and problem-solving capabilities.
- Proactive approach to identifying and mitigating risks.
- Adaptability to changing business and technical requirements.

Technical Knowledge (level of expertise):
- Microsoft SQL Server: Advanced
- Azure SQL: Advanced
- Windows environment: Advanced
- Networking: Intermediate
- PostgreSQL: Basic (nice to have)
- DB2: Basic (nice to have)
- Database security: Advanced
- Performance tuning: Advanced
- Azure cloud technologies: Advanced
- Scripting (T-SQL, PowerShell): Intermediate

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies.
Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
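For illustration only (not part of the posting): a minimal sketch of the kind of routine monitoring automation this DBA role describes, assuming Python with pyodbc and Microsoft's ODBC Driver 18 for SQL Server. Server, database, and credential values are hypothetical placeholders.

```python
import pyodbc  # assumed dependency: pip install pyodbc

# Hypothetical connection details; a real setup would read these from a secrets vault.
CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=sqlprod.example.com;DATABASE=Orders;"
    "UID=dba_monitor;PWD=change-me;Encrypt=yes;TrustServerCertificate=no;"
)

# T-SQL: list indexes whose fragmentation exceeds 30% (candidates for rebuild).
FRAGMENTATION_SQL = """
SELECT OBJECT_NAME(ips.object_id) AS table_name,
       i.name                     AS index_name,
       ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON i.object_id = ips.object_id AND i.index_id = ips.index_id
WHERE ips.avg_fragmentation_in_percent > 30
ORDER BY ips.avg_fragmentation_in_percent DESC;
"""

def report_fragmented_indexes() -> None:
    # Connect, run the DMV query, and print one line per fragmented index.
    with pyodbc.connect(CONN_STR) as conn:
        cursor = conn.cursor()
        for table, index, frag in cursor.execute(FRAGMENTATION_SQL):
            print(f"{table}.{index}: {frag:.1f}% fragmented")

if __name__ == "__main__":
    report_fragmented_indexes()
```

The same pattern (connect, query a dynamic management view, report or alert) extends naturally to backup checks, blocking sessions, and capacity monitoring, and could equally be written in PowerShell as the posting suggests.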

Posted 1 day ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

KEY RESPONSIBILITIES
- Building and customizing websites and applications using Next.js, Tailwind CSS, .NET languages (such as C#), Sitecore XM Cloud-specific APIs, and other web development tools.
- Developing and implementing templates, components, layouts, and workflows to enable content creators to easily manage and publish web content.
- Developing and building new services in the microservice architecture.
- Integrating with other systems and tools, such as CRMs, ERPs, marketing automation software, and social media platforms.
- Collaborating with other developers, designers, and content creators to ensure that websites and applications meet functional, design, and performance requirements.
- Testing and debugging solutions to ensure high performance, security, and usability.
- Providing technical support and maintenance for websites and applications, including troubleshooting and fixing issues as they arise.
- Staying up to date with the latest Sitecore and web development trends, best practices, and technologies.

PROFESSIONAL EXPERIENCE / QUALIFICATIONS
- Bachelor's degree in computer science or a related field, or equivalent experience.
- 5+ years of experience in Sitecore development.
- 3+ years of experience in Sitecore XM Cloud.
- 5+ years of experience with microservice architecture.
- Experience with a variety of Sitecore technologies, including Sitecore XP, Sitecore MVC, and Sitecore XM Cloud.
- Experience with cloud computing platforms, such as Azure.
- Experience with performance tuning and optimization.
- Excellent problem-solving and analytical skills.
- Strong documentation and organizational skills.
- Strong communication and teamwork skills.

Must Have
- Hands-on experience with Sitecore XM Cloud.
- Hands-on experience with microservice architecture.

Preferred
- Sitecore CMS versions: 9.x, 10.x
- Languages: T-SQL, LINQ
- Scripting languages: Ajax, JavaScript, Knockout.js, jQuery, JSON, HTML, XML
- CSS: Tailwind CSS, any CSS library
- Component technologies: Web Services, SOAP, ADO.NET, WCF, Web API, REST
- Databases (RDBMS): SQL Server 2016/2019
- Version control: GitHub Enterprise
- ORM: Glass Mapper, Entity Framework

Posted 1 day ago

Apply

5.0 - 7.0 years

15 - 17 Lacs

Hyderabad, Pune, Chennai

Work from Office

Source: Naukri

Location(s): Pune (preferred), Chennai, Bengaluru. Flexibility for remote work within these locations may be considered for the right candidate.

Who You Are:

Technical Expertise:
- Proficient in .NET Core with 5+ years of hands-on expertise, demonstrating a strong foundation in developing robust, scalable applications using .NET technologies.
- Specializes in:
  - .NET Core (expert level): deep knowledge of building and maintaining high-performance, server-side applications with .NET Core.
  - Microservices (advanced level): experienced in designing, developing, and implementing microservices architectures, understanding the principles of autonomy, granularity, and independent scaling.
  - RESTful/GraphQL APIs (advanced level): proficient in creating and managing APIs, ensuring they are secure, scalable, and performant.
  - Cloud environments such as AWS/Azure (intermediate level): solid experience in leveraging cloud services for deploying, managing, and scaling applications.
- Skilled at writing clean, scalable code that drives innovation, emphasizing maintainability and best practices in software development.
- Experience includes working with:
  - ORM: understanding of object-relational mapping to facilitate data manipulation and querying in a database-agnostic manner.
  - JSON: proficient in using JSON for data interchange between servers and web applications.
  - Event-driven architecture: knowledgeable in building systems that respond dynamically to events, improving application responsiveness and scalability.
  - Inversion of Control (IoC) and Aspect-Oriented Programming (AOP): implementing these patterns to increase modularity and separation of concerns.
  - Containerization: experience with Docker or similar technologies for encapsulating application environments, enhancing consistency across development, testing, and production.
  - Service discovery and service mesh: familiarity with managing microservices communication patterns, ensuring services are dynamically discoverable and communicable.
  - Multi-threading: expertise in developing applications that efficiently execute multiple operations concurrently to improve performance.
- Proficient with:
  - RDBMS and NoSQL (intermediate level): competent in working with relational and non-relational databases, understanding their respective use cases and optimization techniques.
  - Jira (advanced level) and Git (advanced level): advanced proficiency in project management with Jira and version control with Git, ensuring efficient workflow and code management.
  - Maven (intermediate level): knowledgeable in using Maven for project build and dependency management in .NET environments.
  - Jenkins (intermediate level): experienced in implementing CI/CD pipelines with Jenkins, automating the software development process for increased productivity and reliability.
- Utilizes these tools and platforms effectively in the software development process, contributing to the delivery of high-quality software solutions.

Analytical Thinker: A strategic thinker passionate about engaging in requirements analysis and solving complex issues through software design and architecture.
Team Player: A supportive teammate ready to mentor, uplift your team, and collaborate with internal teams to foster an environment of growth and innovation.
Innovation-Driven: Always on the lookout for new technologies to disrupt the norm, you're committed to improving existing software and eager to lead the charge in integrating AI and cutting-edge technologies.
What We Offer:
- Leadership & Impact: Step into a pivotal role within our innovation team, driving projects from inception to impactful implementation. Your leadership in integrating AI and cutting-edge tech into the financial sector will leave a lasting mark on clients and the industry.
- Growth & Learning: Immerse yourself in continuous learning, mastering the intricacies of SDLC documentation and project leadership. We provide the environment for nurturing skills and advancing careers.
- Recognition & Excellence: Your dedication and innovative contributions are celebrated here, with rewards and acknowledgment for your efforts toward shared goals.
- Global Influence: Lead initiatives with global impact, reshaping financial accessibility and championing sustainable tech practices worldwide.

Your Path with Company:
- Pioneering the Digital Era: Actively shape the future of technology with us, from enhancing financial accessibility in India to pioneering global sustainability solutions.
- Engagement in Cutting-Edge Projects: Dive into the heart of the SDLC, from writing code to deployment, driving professional growth with each project.
- Global Impact and Personal Growth: Contribute to global projects while advancing your skills and career path in alignment with our collective vision.

Benefits We Offer:
- Flexibility for Work-Life Harmony: Find balance with flexible schedules that prioritize your well-being.
- Relocation Support & Global Opportunities: Seamlessly transition with relocation support and explore international opportunities.
- Rewarding Performance: Performance-based bonuses and annual rewards recognize your dedication.
- Comprehensive Well-being: Comprehensive benefits including Provident Fund and health insurance ensure your peace of mind.

Posted 1 day ago

Apply

8.0 - 10.0 years

25 - 30 Lacs

Bengaluru

Work from Office

Source: Naukri

Mode of Work: Full-time, on-site
Experience Required: 8+ years

Who You Are:
You are a visionary leader with a robust technical background in Microsoft .NET and related technologies, eager to shape the future of fintech solutions. With a harmonious blend of project management expertise and profound technical knowledge, you stand ready to guide teams, mentor emerging talent, and spearhead innovative projects from their inception through to their triumphant realization.

Your Role:
- Lead and Innovate: Direct the planning, execution, and delivery of complex Microsoft .NET projects, guaranteeing high-quality results within budget, scope, and timeline constraints.
- Foster Growth: Create an engaging and cohesive work environment, mentoring team members to unlock their full potential.
- Mitigate Risks: Proactively identify project risks and formulate effective mitigation strategies, maintaining transparent communication with all stakeholders.
- Ensure Excellence: Uphold process adherence by leveraging industry best practices and standards in software development and delivery.
- Develop Talent: Supervise, coach, and cultivate your team, ensuring alignment with performance appraisal processes and fostering professional growth.
- Embrace Technology: Drive strategic leadership in the adoption of new technologies, especially AI, to innovate and disrupt within the financial sector.

Desired/Recommended Technical Competencies & Skills:
- .NET Core Mastery: Strong hands-on expertise in .NET Core, showcasing deep knowledge and experience in building robust, scalable applications.
- Software Development Best Practices: Proficient in writing clean, maintainable code, with extensive experience in ORM, JSON, and multi-threading, ensuring high performance and scalability.
- API Design and Development: Skilled in developing both RESTful and GraphQL APIs, understanding the nuances of creating highly accessible and efficient web services.
- Microservices and Event-Driven Architecture: Experienced with designing and implementing microservices architectures, utilizing event-driven patterns for dynamic and responsive applications.
- Containerization and Orchestration: Proficient in containerization technologies like Docker, and orchestration with Kubernetes, including service discovery and service mesh, to manage complex, scalable microservices landscapes.
- Cloud Platforms: Expertise in cloud environments such as AWS/Azure, leveraging cloud services for enhanced application performance, scalability, and reliability.
- Database Management: Expertise with RDBMS and NoSQL databases, understanding their application within .NET environments for optimal data storage, retrieval, and manipulation strategies.
- DevOps Practices: Comprehensive understanding of DevOps practices, including continuous integration and continuous delivery (CI/CD) using tools like Jenkins, and version control systems like Git, integrated with Jira for project management and Maven for dependency management.
- Security Practices: Awareness of security best practices and common vulnerabilities specific to .NET development, implementing secure coding techniques to protect data and applications.
- Monitoring and Logging: Adept at using tools for application monitoring, logging, and distributed tracing, ensuring high availability and identifying issues proactively.
- Leadership and Communication: Exceptional leadership, communication, and project management abilities to lead diverse and geographically dispersed teams.
What We Offer:
- Impactful Work: A pivotal role within our innovation team, making a significant impact on financial accessibility and sustainability globally. Projects you lead will directly contribute to enhancing the financial infrastructure, with a focus on inclusivity and sustainability.
- Growth Opportunities: Continuous learning, career advancement, and hands-on experience with cutting-edge projects.
- Recognition and Reward: A culture that celebrates your dedication and innovative contributions with comprehensive benefits, flexible schedules, relocation support, and opportunities for international exposure.

Benefits We Offer:
- Flexibility for Work-Life Harmony: Find balance with flexible schedules that prioritize your well-being.
- Relocation Support & Global Opportunities: Seamlessly transition with relocation support and explore international opportunities.
- Rewarding Performance: Performance-based bonuses and annual rewards recognize your dedication.
- Comprehensive Well-being: Comprehensive benefits including Provident Fund and health insurance ensure your peace of mind.

Posted 1 day ago

Apply

6.0 - 10.0 years

15 - 25 Lacs

Hyderabad

Hybrid

Source: Naukri

Job Title: Data Engineer - Data & Analytics
Location: Hyderabad
Employment Type: Full-time

Why this role matters
Our analytics and AI products are only as good as the data they run on. You will design and operate the pipelines and micro-services that transform multi-structured data into reliable, governed, and instantly consumable assets, regardless of which cloud the customer chooses.

Core Skills & Knowledge
- Programming: Python 3.10+, Pandas or Polars, SQL (ANSI, window functions, CTEs), basic bash.
- Databases & Warehouses: PostgreSQL, Snowflake (stages, tasks, streams), Parquet/Delta Lake tables on S3/ADLS/GCS.
- APIs & Services: FastAPI, Pydantic models, OpenAPI specs, JWT/OAuth authentication (a minimal sketch follows this listing).
- Orchestration & Scheduling: Apache Airflow, Dagster, or Prefect; familiarity with event-driven triggers via cloud queues (SQS, Pub/Sub).
- Cloud Foundations: Hands-on with at least one major cloud (AWS, Azure, GCP) and willingness to write cloud-agnostic code, with a cost-aware development approach.
- Testing & CI/CD: pytest, GitHub Actions / Azure Pipelines; Docker-first local dev; semantic versioning.
- Data Governance: Basic understanding of GDPR/PII handling, role-based access, and encryption at rest and in flight.

Nice-to-Have / Stretch Skills
- Streaming ingestion with Kafka / Kinesis / Event Hubs and PySpark Structured Streaming.
- Great Expectations, Soda, or Monte Carlo for data quality monitoring.
- Graph or time-series stores (Neo4j, TimescaleDB).

Experience & Education
- 6-8 years of overall IT experience, with over 4 years of relevant experience building data pipelines or back-end services in production, ideally supporting analytics or ML use cases.
- Bachelor's in Computer Science, Data Engineering, or demonstrably equivalent experience.
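For illustration only (not part of the posting): a minimal sketch of the FastAPI-plus-Pydantic service pattern this skills list refers to. The endpoint names and fields are hypothetical, and authentication is omitted for brevity.

```python
from datetime import datetime, timezone

from fastapi import FastAPI
from pydantic import BaseModel, Field

app = FastAPI(title="ingestion-api")  # hypothetical service name

class Record(BaseModel):
    """Schema validated by Pydantic before the payload reaches the pipeline."""
    source: str = Field(..., description="Upstream system that produced the record")
    payload: dict
    received_at: datetime = Field(default_factory=lambda: datetime.now(timezone.utc))

@app.post("/records", status_code=202)
def enqueue_record(record: Record) -> dict:
    # A real service would push the record to a queue (SQS/Pub/Sub); here we echo back.
    return {"accepted": True, "source": record.source}

@app.get("/healthz")
def healthcheck() -> dict:
    return {"status": "ok"}
```

Saved as app.py, this could be run locally with `uvicorn app:app --reload`, and FastAPI generates the OpenAPI spec mentioned in the listing automatically.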

Posted 1 day ago

Apply

55.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

Source: SimplyHired

Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

Your Role
- Extensive experience in developing and optimizing complex Power BI solutions.
- Proven ability to design and implement scalable data models.
- In-depth knowledge of data integration techniques and tools.
- Experience with data pipeline orchestration and automation.
- Advanced knowledge of DAX calculations and data modelling.
- Proficiency in SQL and data warehouse concepts.
- Expertise in performance tuning and optimization of Power BI reports and SQL queries.
- Experience with Azure preferred.

Your Profile
- Create DAX measures, reports, and other objects (bookmarks, calculation groups).
- Optimize .pbix files via Performance Analyzer, Tabular Editor, and DAX Studio/Bravo.
- Share Power BI best practices with the rest of the team.
- Review models and reports with the Best Practice Analyzer (BPA).
- Support the team in bug fixing and in optimizing reports and measures.
- Help the team understand the SDLC framework for Power BI.
- Implement and explain dynamic RLS, following data governance guidelines.
- Support Power BI workspace maintenance.
- Create C# scripts to automate various tasks in Power BI.
- Share knowledge about Power BI capabilities.

What you'll love about working here
Choosing Capgemini means having the opportunity to make a difference, whether for the world's leading businesses or for society. It means getting the support you need to shape your career in the way that works for you. It means when the future doesn't look as bright as you'd like, you have the opportunity to make change: to rewrite it. When you join Capgemini, you don't just start a new job. You become part of something bigger. A diverse collective of free-thinkers, entrepreneurs and experts, all working together to unleash human energy through technology, for an inclusive and sustainable future. At Capgemini, people are at the heart of everything we do! You can exponentially grow your career by being part of innovative projects and taking advantage of our extensive Learning & Development programs. With us, you will experience an inclusive, safe, healthy, and flexible work environment to bring out the best in you! You also get a chance to make positive social change and build a better world by taking an active role in our Corporate Social Responsibility and Sustainability initiatives. And whilst you make a difference, you will also have a lot of fun.

About Capgemini
Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fuelled by its market leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.

Posted 1 day ago

Apply

5.0 - 7.0 years

20 - 22 Lacs

Bengaluru

Work from Office

Source: Naukri

Must have (in priority order, top to bottom):
- A dynamic, proactive approach to work.
- Kubernetes experience.
- Azure / AWS experience (mainly networking, storage, compute).
- Git and Ansible.
- Jenkins or GitHub.
- Deployment experience (including CI/CD).
- Programming knowledge in any of the following: C#, .NET, Java, Angular, Node.
- Understanding of Horizon View (Connection Server, VIDM and Security Server).
- ADO knowledge.
- Good communication skills.

Good to have:
- Understanding of REST-based API creation and testing.
- Scripting knowledge (e.g., PowerShell or Linux shell scripting).
- API lifecycle (testing, debugging, and testing under load).
- Basic understanding of OKTA.
- Good understanding of application layer security (Layer 7) and transport layer security (Layer 4).
- Knowledge of system security (e.g., Windows GPOs).
- Terraform is an additional good-to-have skill.
- Understanding of GPOs.
- Understanding of our products and technologies is a further value add.
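For illustration only (not part of the posting): a minimal sketch of the kind of REST API smoke test the good-to-have list mentions, using Python with the requests library. The base URL, endpoints, and expected fields are hypothetical placeholders.

```python
import requests  # assumed dependency: pip install requests

BASE_URL = "https://api.example.com"  # hypothetical service under test

def check_health(timeout_s: float = 5.0) -> bool:
    """Return True if the health endpoint answers 200 with status 'ok'."""
    resp = requests.get(f"{BASE_URL}/healthz", timeout=timeout_s)
    return resp.status_code == 200 and resp.json().get("status") == "ok"

def check_create_and_fetch() -> bool:
    """Round-trip test: create a resource, then read it back."""
    created = requests.post(f"{BASE_URL}/items", json={"name": "smoke-test"}, timeout=5)
    if created.status_code != 201:
        return False
    item_id = created.json()["id"]
    fetched = requests.get(f"{BASE_URL}/items/{item_id}", timeout=5)
    return fetched.status_code == 200 and fetched.json()["name"] == "smoke-test"

if __name__ == "__main__":
    print("health check:", check_health())
    print("round-trip check:", check_create_and_fetch())
```

Tests like these are typically wired into a CI pipeline (Jenkins or GitHub Actions, as listed above) and complemented by load testing for the full API lifecycle.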

Posted 1 day ago

Apply

3.0 - 7.0 years

6 - 15 Lacs

Pune, Chennai

Hybrid

Source: Naukri

Company Description:
Volante is on the leading edge of financial services technology. If you are interested in being on an innovative, fast-moving team that leverages the very best in cloud technology, our team may be right for you. By joining the product team at Volante, you will have an opportunity to shape the future of payments technology, with a focus on payment intelligence. We are a financial technology business that provides a market-leading, cloud-native payments processing platform to banks and financial institutions globally.

Education Criteria:
- B.E., M.Sc., or M.E./M.S. in Computer Science or a similar major.
- Relevant certification courses from reputed organizations.
- 3+ years of experience as a Data Engineer.

Responsibilities:
- Design and develop scalable solutions and payment analytics, unlocking operational and business insight.
- Own data modeling, building ETL pipelines, and enabling data-driven metrics.
- Build and optimize data models for our application needs.
- Design and develop data pipelines and workflows that integrate data sources (structured and unstructured data) across the payment landscape.
- Assess the customer's data infrastructure landscape (payment ancillary systems including Sanctions, Fraud, AML) across cloud environments like AWS and Azure as well as on-prem, for deployment design.
- Lead the enterprise application data architecture design, framework and services, and identify and enable the services for the SaaS environment in Azure and AWS.
- Implement customizations and data processing required to transform customer datasets for processing in our analytics framework/BI models.
- Monitor data processing and machine learning workflows to ensure customer data is successfully processed by our BI models, debugging and resolving any issues faced along the way.
- Optimize queries, warehouse, and data lake costs.
- Review and provide feedback on the Data Architecture Design Document/HLD for our SaaS application.
- Collaborate across teams to successfully integrate all aspects of the Volante PaaS solution.
- Mentor the development team.

Skills:
- 3+ years of data engineering experience: data collection, preprocessing, ETL processes and analytics.
- Proficiency in data engineering architecture, metadata management, analytics, reporting and database administration.
- Strong in SQL/NoSQL, Python, JSON and data warehousing/data lakes, orchestration, and analytical tools.
- ETL or pipeline design and implementation for large data volumes (a minimal orchestration sketch follows this listing).
- Experience with data technologies and frameworks like Databricks, Synapse, Kafka, Spark, Elasticsearch.
- Knowledge of SCD, CDC, and core data warehousing to develop cost-effective, secure data collection, storage, and distribution for a SaaS application.
- Experience in application deployment in AWS or Azure with containers and Kubernetes.
- Strong problem-solving skills and a passion for building data at scale.

Desirable Engineering Skills:
- Knowledge of data visualization tools like Tableau.
- ETL orchestration tools like Airflow and visualization tools like Grafana.
- Prior experience in the Banking or Payments domain.

Location: India (Pune or Chennai)
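For illustration only (not part of the posting): a minimal sketch of an Airflow DAG for a daily extract-transform-load run of the kind described above, assuming Airflow 2.4+. The DAG id, task names, and pipeline functions are hypothetical placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical pipeline steps; real tasks would read from payment systems,
# transform the records, and load them into a warehouse or lake.
def extract_payments(**context):
    print("extracting payment events for", context["ds"])

def transform_payments(**context):
    print("normalizing and enriching records")

def load_payments(**context):
    print("loading curated records into the warehouse")

with DAG(
    dag_id="daily_payments_etl",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_payments)
    transform = PythonOperator(task_id="transform", python_callable=transform_payments)
    load = PythonOperator(task_id="load", python_callable=load_payments)

    extract >> transform >> load  # run order: extract, then transform, then load
```

In practice each task would call out to Spark/Databricks jobs or warehouse SQL rather than print, but the DAG structure and dependency chaining stay the same.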

Posted 1 day ago

Apply

12.0 - 22.0 years

30 - 37 Lacs

Pune, Chennai, Bengaluru

Work from Office

Source: Naukri

5 days working, Monday-Friday. Work from office: 9 am - 6 pm.

Who You Are:
You are a visionary leader with a robust technical background in Microsoft .NET and related technologies, eager to shape the future of fintech solutions. With a harmonious blend of project management expertise and profound technical knowledge, you stand ready to guide teams, mentor emerging talent, and spearhead innovative projects from their inception through to their triumphant realization.

Your Role:
- Lead and Innovate: Direct the planning, execution, and delivery of complex Microsoft .NET projects, guaranteeing high-quality results within budget, scope, and timeline constraints.
- Foster Growth: Create an engaging and cohesive work environment, mentoring team members to unlock their full potential.
- Mitigate Risks: Proactively identify project risks and formulate effective mitigation strategies, maintaining transparent communication with all stakeholders.
- Ensure Excellence: Uphold process adherence by leveraging industry best practices and standards in software development and delivery.
- Develop Talent: Supervise, coach, and cultivate your team, ensuring alignment with performance appraisal processes and fostering professional growth.
- Embrace Technology: Drive strategic leadership in the adoption of new technologies, especially AI, to innovate and disrupt within the financial sector.

Desired/Recommended Technical Competencies & Skills:
- .NET Core Mastery: Strong hands-on expertise in .NET Core, showcasing deep knowledge and experience in building robust, scalable applications.
- Software Development Best Practices: Proficient in writing clean, maintainable code, with extensive experience in ORM, JSON, and multi-threading, ensuring high performance and scalability.
- API Design and Development: Skilled in developing both RESTful and GraphQL APIs, understanding the nuances of creating highly accessible and efficient web services.
- Microservices and Event-Driven Architecture: Experienced with designing and implementing microservices architectures, utilizing event-driven patterns for dynamic and responsive applications.
- Containerization and Orchestration: Proficient in containerization technologies like Docker, and orchestration with Kubernetes, including service discovery and service mesh, to manage complex, scalable microservices landscapes.
- Cloud Platforms: Expertise in cloud environments such as AWS/Azure, leveraging cloud services for enhanced application performance, scalability, and reliability.
- Database Management: Expertise with RDBMS and NoSQL databases, understanding their application within .NET environments for optimal data storage, retrieval, and manipulation strategies.
- DevOps Practices: Comprehensive understanding of DevOps practices, including continuous integration and continuous delivery (CI/CD) using tools like Jenkins, and version control systems like Git, integrated with Jira for project management and Maven for dependency management.
- Security Practices: Awareness of security best practices and common vulnerabilities specific to .NET development, implementing secure coding techniques to protect data and applications.
- Monitoring and Logging: Adept at using tools for application monitoring, logging, and distributed tracing, ensuring high availability and identifying issues proactively.
- Leadership and Communication: Exceptional leadership, communication, and project management abilities to lead diverse and geographically dispersed teams.

Posted 1 day ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Experience: 5-12 years
Work Mode: Hybrid
Location: Bangalore, Chennai, Kolkata, Pune and Gurgaon
Primary Skills: Snowflake, SQL, DWH, Power BI, ETL, Informatica, Architect, Azure Data Factory

We are seeking a skilled Snowflake Developer with a strong background in data warehousing (DWH), SQL, Informatica, Power BI, and related tools to join our Data Engineering team. The ideal candidate will have 5+ years of experience in designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This is an exciting opportunity to work with cutting-edge technologies in a collaborative environment and help build scalable, high-performance data solutions.

Key Responsibilities
- A minimum of 5+ years of hands-on experience in data engineering, with a focus on data warehousing, business intelligence, and related technologies.
- Data Integration & Pipeline Development: Develop and maintain data pipelines using Snowflake, Fivetran, and dbt for efficient ELT processes (extract, load, transform) across various data sources (a minimal connector sketch follows this listing).
- SQL Query Development & Optimization: Write complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis.
- Data Modeling & ELT Implementation: Implement advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type 2), using dbt. Design and optimize high-performance data architectures.
- Business Requirement Analysis: Collaborate with business stakeholders to understand data needs and translate business requirements into technical solutions.
- Troubleshooting & Data Quality: Perform root cause analysis on data-related issues, ensuring effective resolution and maintaining high data quality standards.
- Collaboration & Documentation: Work closely with cross-functional teams to integrate data solutions. Create and maintain clear documentation for data processes, data models, and pipelines.

Skills & Qualifications
- Expertise in Snowflake for data warehousing and ELT processes.
- Strong proficiency in SQL for relational databases and writing complex queries.
- Experience with Informatica PowerCenter for data integration and ETL development.
- Experience using Power BI for data visualization and business intelligence reporting.
- Experience with Fivetran for automated ELT pipelines.
- Familiarity with Sigma Computing, Tableau, Oracle, and dbt.
- Strong data analysis, requirement gathering, and mapping skills.
- Familiarity with cloud services such as Azure (RDBMS, Databricks, ADF), AWS, or GCP.
- Experience with workflow management tools such as Airflow, Azkaban, or Luigi.
- Proficiency in Python for data processing (other languages such as Java and Scala are a plus).

Education: Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field.
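For illustration only (not part of the posting): a minimal sketch of loading a staged file and checking the result with the Snowflake Python connector (snowflake-connector-python). The account, credentials, stage, and table names are hypothetical placeholders; real pipelines would pull credentials from a secrets store and usually delegate this step to Fivetran or dbt.

```python
import snowflake.connector  # assumed dependency: pip install snowflake-connector-python

# Hypothetical connection details.
conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",
    user="etl_user",
    password="change-me",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Copy a staged CSV file into a raw table, then check row counts per load date.
    cur.execute(
        "COPY INTO RAW_ORDERS FROM @ORDERS_STAGE "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    cur.execute(
        "SELECT LOAD_DATE, COUNT(*) FROM RAW_ORDERS "
        "GROUP BY LOAD_DATE ORDER BY LOAD_DATE DESC LIMIT 7"
    )
    for load_date, row_count in cur.fetchall():
        print(load_date, row_count)
finally:
    conn.close()
```

Downstream, dbt models would typically handle the transform layer (including SCD Type 2 snapshots) on top of the raw tables loaded here.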

Posted 1 day ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Location: Chennai, Kolkata, Gurgaon, Bangalore and Pune | Experience: 8-12 Years | Work Mode: Hybrid
Mandatory Skills: Python, PySpark, SQL, ETL, Data Pipeline, Azure Databricks, Azure Data Factory, Azure Synapse, Airflow, and Architecture Design.

Overview
We are seeking a skilled and motivated Data Engineer with experience in Python, SQL, Azure, and cloud-based technologies to join our dynamic team. The ideal candidate will have a solid background in building and optimizing data pipelines, working with cloud platforms, and leveraging modern data engineering tools like Airflow, PySpark, and Azure Data Engineering. If you are passionate about data and looking for an opportunity to work on cutting-edge technologies, this role is for you!

Primary Roles And Responsibilities
- Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack.
- Provide forward-thinking solutions in the data engineering and analytics space.
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements.
- Triage issues to find gaps in existing pipelines and fix them.
- Work with the business to understand reporting-layer needs and develop data models to fulfill them.
- Help junior team members resolve issues and technical challenges.
- Drive technical discussions with the client architect and team members.
- Orchestrate data pipelines via the Airflow scheduler (a minimal DAG sketch follows this posting).

Skills And Qualifications
- Bachelor's and/or master's degree in computer science or equivalent experience.
- Must have 6+ years of total IT experience and 3+ years of experience in Data Warehouse/ETL projects.
- Deep understanding of Star and Snowflake dimensional modelling.
- Strong knowledge of Data Management principles.
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture.
- Hands-on experience in SQL, Python and Spark (PySpark).
- Must have experience with the AWS/Azure stack.
- Desirable: ETL with batch and streaming (Kinesis).
- Experience in building ETL / data warehouse transformation processes.
- Experience with Apache Kafka for streaming / event-based data.
- Experience with other open-source big data products, including Hadoop (Hive, Pig, Impala).
- Experience with open-source non-relational / NoSQL data repositories (MongoDB, Cassandra, Neo4j).
- Experience working with structured and unstructured data, including imaging and geospatial data.
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning and troubleshooting.
- Databricks Certified Data Engineer Associate/Professional certification (desirable).
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects.
- Experience working in Agile methodology.
- Strong verbal and written communication skills.
- Strong analytical and problem-solving skills with high attention to detail.

Skills: data warehouse, data engineering, ETL, Python, SQL, data pipeline, Azure Synapse, Azure Data Factory, pipelines, Azure Databricks, architecture design, PySpark, Azure, Airflow
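As a rough illustration of the Airflow orchestration point above, here is a minimal sketch of a daily DAG that runs two hypothetical spark-submit steps in sequence. The DAG id, task names, and script paths are invented, and the snippet assumes Airflow 2.x with the Bash operator available.

```python
# Minimal Airflow 2.x sketch: a daily DAG that submits two PySpark jobs,
# rebuilding the reporting marts only after the transform step succeeds.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_sales_elt",            # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    transform = BashOperator(
        task_id="pyspark_transform",
        bash_command="spark-submit /opt/pipelines/transform_sales.py {{ ds }}",
    )
    build_marts = BashOperator(
        task_id="build_reporting_marts",
        bash_command="spark-submit /opt/pipelines/build_reporting_marts.py {{ ds }}",
    )
    transform >> build_marts
```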

Posted 1 day ago

Apply

12.0 - 22.0 years

35 - 65 Lacs

Coimbatore

Work from Office

Timings: 10 am to 7.30 pm | 2 days WFO and 3 days WFH

Responsibilities:
- As part of a scrum team, you'll contribute to the continuous development of the product features in your team's care.
- Analyse requirements, prepare high-level/low-level designs, and realise them with the project team.
- Lead a team of software engineers and take delivery responsibility for the team.
- Ensure the quality of the team's deliverables through stringent reviews and coaching.
- Provide estimates for complex and large projects; support the Project Manager in project planning.
- Form the bridge between the software engineers and the solution analysts and IT architects.
- Discuss technical topics (with the SEs, as a specialist) as well as holistic architecture topics (with the IT architects, as a generalist).
- Translate complex content for different stakeholders, both technical (such as software engineers) and functional (business), in a convincing and well-founded way, adapted to the target audience.
- Work in a support environment with an eye for detail and a focus on optimization.

Profile Description:
- Able to take care of all responsibilities mentioned in the section above.
- Minimum 6 years of experience and in-depth knowledge of one or more Azure services: Service Bus, Azure API Management, Azure Cosmos DB, Azure SQL, Redis.
- Minimum 6 years of experience with C#, .NET, API and service deployment, and SQL Server; familiar with Application Insights, Azure App Configuration and Azure Key Vault.
- 3 years of experience with and knowledge of Kubernetes on Azure.

Has experience with:
- Building secure APIs.
- Leveraging messaging systems, such as Service Bus, for distributed processing.
- Leveraging integration technologies such as API Management, Service Bus, Event Grid and Event Hub.
- Developing asynchronous, reliable backend tasks.
- Understanding the various data storage options of the Azure platform.
- Deploying mobile software: phased rollouts, feature flagging, CI/CD approach, build toolchain.
- Writing well-documented and clean code.
- Affinity with maintaining and evolving a codebase to nourish high-quality code.
- Familiarity with one or more CI/CD environments: GitLab CI, GitHub Actions, Circle CI, etc.
- Strong problem-solving and critical-thinking abilities.
- Good communication skills that facilitate interaction.
- Confident, detail-oriented and highly motivated to be part of a high-performing team.
- A positive mindset and a continuous-learning attitude.
- Ability to work under pressure and adhere to tight deadlines.
- Familiarity with Scrum and Agile collaboration; ensures compliance of project deliverables in line with project management methodologies.
- Exchanging expertise with other team members (knowledge sharing).
- Strong customer affinity, delivering highly performant applications and quick turnaround on bug fixes.
- Work in project teams and go for success as a team; lead by example to drive the success of the team on a technical level.
- Willing to work on both project and maintenance activities.
- Open to travel to Belgium.

Nice-to-have Competencies:
- Working experience in a SAFe environment is a plus.
- Experience working with European clients.

Posted 1 day ago

Apply

10.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Description: The CGBU ECP/IoT software development team is seeking an experienced platform software engineer in the IoT space. We are looking for a candidate with 10+ years of development experience on Linux-based platforms.

Location: Bengaluru, India (or Gurugram, India)

Job Requirements:
- BS or MS degree in computer science, or equivalent
- 10+ years of software engineering with proven results in designing, implementing and maintaining complex systems and services
- Rapid and continuous learning to support new features and innovation on the Linux platform
- Participate in discussions and maintenance to improve software performance, maintainability, serviceability, and reliability
- Ability to work in CI/CD and DevOps work styles
- Solid technical background with an in-depth understanding of Linux (or Unix) OS internals
- Strong understanding of Linux networking, routing protocols, firewall and NAT
- Embedded platform system development experience
- Linux operating systems and firmware development expertise
- Good knowledge of public/private cloud concepts
- Proficiency in programming languages: C/C++/Python/Java
- Linux package management (YUM, RPM)
- Linux distributions: e.g. Oracle Linux, Debian, Red Hat, CentOS
- Bootstrapping new platforms (x86, ARM)
- Ability to integrate 3rd-party drivers on a Linux-based platform
- Familiarity with OpenWrt, Yocto and Linux open-source development
- Familiarity with WIFI and Mesh standards and open-source integrations
- Familiarity with 4G/LTE/5G cellular WAN technologies
- Linux open-source development experience is a big plus
- Ability and desire to resolve bugs, from debugging to deployment of fixes in production

The ideal candidate will have the following skills:
- Proficiency in C/C++/Python/Java
- Ability to work in a fast-paced and challenging environment
- Experience working on agile development teams
- Knowledge of OCI / AWS / Azure / GCP is a strong plus
- Knowledge of microservices architecture
- Bootstrapping new Linux platforms (x86, ARM)
- Hands-on embedded systems development experience
- Linux OS installation, customization, and deployment with an emphasis on embedded products
- Linux driver and application development
- Linux networking concepts such as poll mode/user mode (DPDK)
- Containerization (Docker, Kubernetes)
- Container networking, docker/podman
- Experience working with IPSEC, TLS and routing protocols such as BGP/OSPF
- Experience with Oracle Database / MySQL
- Strong understanding of WIFI 6/7, Mesh standards and open-source integrations with MediaTek or Qualcomm WIFI modules
- Working knowledge of 4G/LTE/5G cellular WAN technologies
- Familiarity with protocols: TCP/IP, UDP, HTTPS, DNS, DHCP, firewall, NAT, VLAN, etc.
- Linux performance tuning and characterization

Soft Skills:
- Ability to multi-task and handle changing priorities
- Strong command of spoken and written English
- Excellent team skills, a can-do attitude, focus on quality and drive to make a difference in a dynamic, fast-paced organization
- Advocate standard methodologies with other engineers in development, troubleshooting, testing and deployment
- Write blogs and presentations on a variety of subjects

Why CGIU: Reimagining communications to connect the world. Gone are the days when telecommunications were synonymous with pure networking protocols. In the world of tech, disruption is the name of the game. Driven by 5G, IoT, analytics and cloud technologies, this disruption is happening at the core of the telecom network, impacting the way we communicate, work, study and consume entertainment services.

And it is not just at the network level. 5G, edge, IoT, AI/ML, analytics and cloud technologies are the underlying fabric of an entire ecosystem of fully connected intelligent sensors and devices, capable of overhauling economic and business policies, and further blurring geographical and cultural borders. When these technologies come together, they are capable of delivering at every rung of the ecosystem's ladder, and spark innovations that will have a profound impact on social and lifestyle shifts, which affect the way we do business and drive the next generation of technological development.

We are the Communications Global Industries Unit in Oracle, where IT meets network! We believe that the essence of us as a society is fulfilled by the communications that we have and the interactions that we enjoy with one another. And now more than ever, these interactions are becoming enriched with machine-to-machine and machine-to-everything. Proud of our 40+ years of heritage and expertise in helping our customers in the areas most critical for them in security, signaling and policy, we are leveraging the Oracle Cloud DNA to help our customers capitalize on emerging technologies to ultimately drive differentiation and deliver a competitive edge for their organization.

Driven by a DevOps culture, we aim to deliver continuous innovation into our customers' CI/CD pipelines, with the power of ML, AI, automation and analytics to help them create, test and deploy services in a more agile, secure and flexible way. Do you have the "right brain telco", "left brain IT" mentality? Do you want to be at the forefront of delivering groundbreaking IT technologies into the core of telecoms networks? Do you thrive in a DevOps culture fuelled by collaboration and open communication? If you enjoy working within highly skilled, multicultural teams and are passionate about cloud technologies, AI/ML, analytics, automation, 5G, edge, IoT and communications technologies, come join us as we re-imagine communications to connect the world.

Our mission is to be the most trusted provider of multi-generational network solutions. We strive to deliver high-quality products and services by investing in our people and delighting our customers. If you are passionate about customer centricity and strive to deliver the highest quality and value creativity, then we want to hear from you.

Career Level: IC4

Responsibilities: As a member of the software engineering division, you will take an active role in the definition and evolution of standard practices and procedures. You will be responsible for defining and developing software for tasks associated with developing, designing and debugging Edge Platform software.

About Us: As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.

We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 1 day ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Hyderabad

Work from Office

Monday to Friday (WFO) | Timings: 9 am to 6 pm

Desired Skills & Expertise:
- Strong experience and mathematical understanding in one or more of Natural Language Understanding, Computer Vision, Machine Learning, and Optimization
- Proven track record of effectively building and deploying ML systems using frameworks such as PyTorch, TensorFlow, Keras, scikit-learn, etc.
- Expertise in modular, typed, and object-oriented Python programming
- Proficiency with core data science languages (such as Python, R, Scala), and familiarity and flexibility with data systems (e.g., SQL, NoSQL, knowledge graphs)
- Experience with financial data analysis, time-series forecasting, and risk modeling (a minimal forecasting sketch follows this posting)
- Knowledge of financial regulations and compliance requirements in the fintech industry
- Familiarity with cloud platforms (AWS, GCP, or Azure) and containerization technologies (Docker, Kubernetes)
- Understanding of blockchain technology and its applications in fintech
- Experience with real-time data processing and streaming analytics (e.g., Apache Kafka, Apache Flink)
- Excellent communication skills and a desire to work in multidisciplinary teams
- Ability to explain complex technical concepts to non-technical stakeholders
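To make the time-series forecasting expectation above concrete, here is a minimal sketch using synthetic data and invented column names: simple lag features feed a scikit-learn gradient-boosted regressor, evaluated on a time-ordered split. It illustrates the general workflow only, not any employer's actual models.

```python
# Minimal time-series forecasting sketch: synthetic price series, lag features,
# gradient boosting, and a chronological train/test split (no shuffling).
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
prices = 100 + np.cumsum(rng.normal(0, 1, 500))   # synthetic daily "price" series

df = pd.DataFrame({"price": prices})
for lag in (1, 2, 3, 7):                          # simple lag features
    df[f"lag_{lag}"] = df["price"].shift(lag)
df = df.dropna()

X, y = df.drop(columns="price"), df["price"]
split = int(len(df) * 0.8)                        # time-ordered split
X_train, X_test = X.iloc[:split], X.iloc[split:]
y_train, y_test = y.iloc[:split], y.iloc[split:]

model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)
print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))
```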

Posted 1 day ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Exp: 5-12 Yrs | Work Mode: Hybrid | Location: Bangalore, Chennai, Kolkata, Pune and Gurgaon
Primary Skills: Snowflake, SQL, DWH, Power BI, ETL, Informatica, Architect, Azure Data Factory

We are seeking a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, and related tools to join our Data Engineering team. The ideal candidate will have 5+ years of experience designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This is an exciting opportunity to work with cutting-edge technologies in a collaborative environment and help build scalable, high-performance data solutions.

Key Responsibilities
- Minimum of 5+ years of hands-on experience in Data Engineering, with a focus on Data Warehousing, Business Intelligence, and related technologies.
- Data Integration & Pipeline Development: Develop and maintain data pipelines using Snowflake, Fivetran, and dbt for efficient ELT (Extract, Load, Transform) processes across various data sources.
- SQL Query Development & Optimization: Write complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis.
- Data Modeling & ELT Implementation: Implement advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2), using dbt. Design and optimize high-performance data architectures.
- Business Requirement Analysis: Collaborate with business stakeholders to understand data needs and translate business requirements into technical solutions.
- Troubleshooting & Data Quality: Perform root cause analysis on data-related issues, ensuring effective resolution and maintaining high data quality standards.
- Collaboration & Documentation: Work closely with cross-functional teams to integrate data solutions. Create and maintain clear documentation for data processes, data models, and pipelines.

Skills & Qualifications
- Expertise in Snowflake for data warehousing and ELT processes.
- Strong proficiency in SQL for relational databases and writing complex queries.
- Experience with Informatica PowerCenter for data integration and ETL development.
- Experience using Power BI for data visualization and business intelligence reporting.
- Experience with Fivetran for automated ELT pipelines.
- Familiarity with Sigma Computing, Tableau, Oracle, and dbt.
- Strong data analysis, requirement gathering, and mapping skills.
- Familiarity with cloud services such as Azure (RDBMS, Databricks, ADF), AWS, or GCP.
- Experience with workflow management tools such as Airflow, Azkaban, or Luigi.
- Proficiency in Python for data processing (other languages such as Java and Scala are a plus).

Education: Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field.

Skills: Azure Data Factory, requirement gathering, data analysis, SQL, ETL, Snowflake, data modeling, Azure, Power BI, Python, business intelligence, Informatica, Fivetran, dbt, pipelines, data warehousing, DWH

Posted 1 day ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

About The Opportunity
A key player in the Big Data solutions space, we specialize in creating and implementing large-scale data processing frameworks. Our mission is to help clients harness the power of data analytics to drive business insights and operational efficiency. With a strong focus on leveraging cutting-edge technologies, we provide a collaborative environment conducive to professional growth and innovation.

Role & Responsibilities
- Design and implement scalable data processing frameworks using Hadoop and Spark.
- Develop ETL processes for data ingestion, transformation, and loading from diverse sources (a minimal PySpark sketch follows this posting).
- Collaborate with data architects and analysts to optimize data models and enhance performance.
- Ensure data quality and integrity through rigorous testing and validation.
- Create and maintain documentation for data workflows, processes, and architecture.
- Troubleshoot and resolve data-related issues in a timely manner.

Skills & Qualifications
Must-Have:
- Proficiency in the Hadoop ecosystem (HDFS, MapReduce, Hive).
- Hands-on experience with Apache Spark and its components.
- Strong SQL skills for querying relational databases.
- Experience with ETL tools and data integration technologies.
- Knowledge of data modeling techniques and best practices.
- Familiarity with Python for scripting and automation.

Preferred:
- Experience with NoSQL databases (Cassandra, MongoDB).
- Ability to tune performance for large-scale data workflows.
- Exposure to cloud-based data solutions (AWS, Azure).

Benefits & Culture Highlights
- Dynamic work environment focused on innovation and continuous learning.
- Opportunities for professional development and career advancement.
- Collaborative team atmosphere that values diverse perspectives.

Skills: SQL proficiency, Big Data developer, data modeling techniques, data integration technologies, Python scripting, ETL tools, GCP, performance tuning, Python, SQL, Hadoop ecosystem (HDFS, MapReduce, Hive), Apache Spark, data modeling, PySpark, data warehousing
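As a rough sketch of the ETL responsibility above (invented paths, table, and column names), the snippet below reads raw CSV files with PySpark, applies a couple of cleaning transformations, and writes a partitioned curated table. It assumes a cluster with a configured Hive metastore and is not tied to any particular employer's pipelines.

```python
# Minimal PySpark ETL sketch: ingest raw CSVs, clean and type the data,
# and write a partitioned curated table for downstream analytics.
from pyspark.sql import SparkSession, functions as F

spark = (SparkSession.builder
         .appName("orders_etl")
         .enableHiveSupport()          # assumes a Hive metastore is configured
         .getOrCreate())

raw = spark.read.option("header", True).csv("hdfs:///landing/orders/*.csv")

curated = (raw
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .withColumn("amount", F.col("amount").cast("double"))
           .filter(F.col("amount") > 0)                  # basic data-quality rule
           .withColumn("order_date", F.to_date("order_ts")))

(curated.write
        .mode("overwrite")
        .partitionBy("order_date")
        .saveAsTable("analytics.orders_curated"))
```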

Posted 1 day ago

Apply

6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

We are seeking a highly skilled Senior/Lead Data Scientist with deep expertise in AI/ML/Gen AI, including Deep Learning, Computer Vision, and NLP. The ideal candidate will bring strong hands-on experience, particularly in building, fine-tuning, and deploying models, and will work directly with customers with minimal supervision. This role requires someone who can not only lead and execute technical projects but also actively contribute to business development through customer interaction, proposal building, and RFP responses. You will be expected to take ownership of AI project execution and team leadership, while helping Tekdi expand its AI footprint.

Key Responsibilities
- Contribute to AI business growth by working on RFPs, proposals, and solutioning activities.
- Lead the team in delivering customer requirements, ensuring quality and timely execution.
- Develop and fine-tune advanced AI/ML models using deep learning and generative AI techniques.
- Fine-tune and optimize Large Language Models (LLMs) such as GPT, BERT, T5, and LLaMA (a minimal fine-tuning sketch follows this posting).
- Interact directly with customers to understand their business needs and provide AI-driven solutions.
- Work with deep learning architectures including CNNs, RNNs, and Transformer-based models.
- Leverage NLP techniques such as summarization, NER, sentiment analysis, and embeddings.
- Implement MLOps pipelines and deploy scalable AI solutions in cloud environments (AWS, GCP, Azure).
- Collaborate with cross-functional teams to integrate AI into business applications.
- Stay updated with AI/ML research and integrate new techniques into projects.

Required Skills & Qualifications
- Minimum 6 years of experience in AI/ML/Gen AI, with at least 3+ years in Deep Learning/Computer Vision.
- Strong proficiency in Python and popular AI/ML frameworks (TensorFlow, PyTorch, Hugging Face, scikit-learn).
- Hands-on experience with LLMs and generative models (e.g., GPT, Stable Diffusion).
- Experience with data preprocessing, feature engineering, and performance evaluation.
- Exposure to containerization and cloud deployment using Docker and Kubernetes.
- Experience with vector databases and RAG-based architectures.
- Ability to lead teams and projects, and work independently with minimal guidance.
- Experience with customer-facing roles, proposals, and solutioning.

Educational Requirements
Bachelor's, Master's, or PhD in Computer Science, Artificial Intelligence, Information Technology, or a related field.

Preferred Skills (Good To Have)
- Knowledge of Reinforcement Learning (e.g., RLHF), multi-modal AI, or time-series forecasting.
- Familiarity with Graph Neural Networks (GNNs).
- Exposure to Responsible AI (RAI), AI Ethics, or AutoML platforms.
- Contributions to open-source AI projects or publications in peer-reviewed journals.

Skills: Artificial Intelligence (AI) and Machine Learning (ML)
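For the LLM fine-tuning responsibility above, here is a minimal sketch of the Hugging Face Trainer workflow, using a small public model (distilbert-base-uncased) and the public IMDB dataset purely for illustration; a production LLM fine-tune would differ in model choice, data, and training configuration.

```python
# Minimal Hugging Face fine-tuning sketch: small public model and dataset,
# with shortened splits so the example runs quickly. Illustrative only.
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

model_name = "distilbert-base-uncased"   # small stand-in for larger LLMs
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

dataset = load_dataset("imdb")           # public sentiment dataset, for illustration

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True,
                     padding="max_length", max_length=256)

tokenized = dataset.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="./finetune-out",
    per_device_train_batch_size=8,
    num_train_epochs=1,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"].shuffle(seed=42).select(range(2000)),
    eval_dataset=tokenized["test"].select(range(500)),
)
trainer.train()
```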

Posted 1 day ago

Apply

8.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Location: Chennai, Kolkata, Gurgaon, Bangalore and Pune | Experience: 8-12 Years | Work Mode: Hybrid
Mandatory Skills: Python, PySpark, SQL, ETL, Data Pipeline, Azure Databricks, Azure Data Factory, Azure Synapse, Airflow, and Architecture Design.

Overview
We are seeking a skilled and motivated Data Engineer with experience in Python, SQL, Azure, and cloud-based technologies to join our dynamic team. The ideal candidate will have a solid background in building and optimizing data pipelines, working with cloud platforms, and leveraging modern data engineering tools like Airflow, PySpark, and Azure Data Engineering. If you are passionate about data and looking for an opportunity to work on cutting-edge technologies, this role is for you!

Primary Roles And Responsibilities
- Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack.
- Provide forward-thinking solutions in the data engineering and analytics space.
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements.
- Triage issues to find gaps in existing pipelines and fix them.
- Work with the business to understand reporting-layer needs and develop data models to fulfill them.
- Help junior team members resolve issues and technical challenges.
- Drive technical discussions with the client architect and team members.
- Orchestrate data pipelines via the Airflow scheduler.

Skills And Qualifications
- Bachelor's and/or master's degree in computer science or equivalent experience.
- Must have 6+ years of total IT experience and 3+ years of experience in Data Warehouse/ETL projects.
- Deep understanding of Star and Snowflake dimensional modelling.
- Strong knowledge of Data Management principles.
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture.
- Hands-on experience in SQL, Python and Spark (PySpark).
- Must have experience with the AWS/Azure stack.
- Desirable: ETL with batch and streaming (Kinesis).
- Experience in building ETL / data warehouse transformation processes.
- Experience with Apache Kafka for streaming / event-based data.
- Experience with other open-source big data products, including Hadoop (Hive, Pig, Impala).
- Experience with open-source non-relational / NoSQL data repositories (MongoDB, Cassandra, Neo4j).
- Experience working with structured and unstructured data, including imaging and geospatial data.
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning and troubleshooting.
- Databricks Certified Data Engineer Associate/Professional certification (desirable).
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects.
- Experience working in Agile methodology.
- Strong verbal and written communication skills.
- Strong analytical and problem-solving skills with high attention to detail.

Skills: data warehouse, data engineering, ETL, Python, SQL, data pipeline, Azure Synapse, Azure Data Factory, pipelines, Azure Databricks, architecture design, PySpark, Azure, Airflow

Posted 1 day ago

Apply

Exploring Azure Jobs in India

Azure, Microsoft's cloud computing platform, has seen rapid growth in demand for skilled professionals in India. The job market for Azure roles in India is booming, with numerous opportunities available to job seekers with the right skills and experience.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

These cities are known for their thriving tech industries and have a high demand for Azure professionals.

Average Salary Range

The average salary range for Azure professionals in India varies based on experience and skill level. Entry-level positions can expect to earn around INR 6-8 lakhs per annum, while experienced professionals can earn upwards of INR 15 lakhs per annum.

Career Path

A typical career path in Azure roles may start as a Junior Developer, progress to a Senior Developer, then move on to a Tech Lead position. With experience and additional certifications, professionals can advance to roles such as Solutions Architect, Cloud Consultant, or Azure DevOps Engineer.

Related Skills

In addition to Azure expertise, professionals in this field may benefit from having skills in:

  • Cloud computing concepts
  • Programming languages such as C# or Python
  • Networking fundamentals
  • Security and compliance knowledge

Interview Questions

  • What is Azure and how does it differ from other cloud platforms? (basic)
  • Explain the different service models in Azure. (basic)
  • How do you secure an Azure Virtual Network? (medium)
  • What is Azure DevOps and how can it be used in CI/CD pipelines? (medium)
  • Describe the difference between Azure AD and on-premises AD. (medium)
  • How do you monitor and optimize Azure costs? (advanced)
  • Explain the concept of Azure Virtual Machines and their use cases. (basic)
  • What is Azure Functions and when would you use it? (medium)
  • How do you implement high availability in Azure? (medium)
  • Describe the process of migrating on-premises infrastructure to Azure. (advanced)
  • What is Azure Kubernetes Service and how does it work? (medium)
  • Explain the concept of Azure Storage and its types. (basic)
  • How does Azure Active Directory integrate with other Azure services? (medium)
  • What is Azure SQL Database and how is it different from traditional SQL databases? (medium)
  • Describe the role of Azure Resource Manager. (basic)
  • How do you ensure data security and compliance in Azure? (medium)
  • Explain the concept of Azure App Services. (basic)
  • What is Azure Key Vault and how is it used for secure key management? (medium)
  • How do you troubleshoot performance issues in Azure? (medium)
  • Describe the process of disaster recovery in Azure. (advanced)
  • What is Azure Logic Apps and how can it be used for workflow automation? (medium)
  • How do you implement Azure Active Directory B2B and B2C? (advanced)
  • Explain the concept of Azure Blob Storage and its use cases. (basic; a minimal SDK sketch follows this list)
  • How do you automate Azure deployments using ARM templates? (medium)
  • Describe the role of Azure DevOps in agile software development. (medium)
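
As a concrete companion to the Blob Storage question above, here is a minimal sketch (placeholder account URL, assumed container and file names) that uploads and lists blobs with the azure-storage-blob SDK, authenticating through DefaultAzureCredential from azure-identity. In an interview, walking through a snippet like this is often as persuasive as reciting the definition.

```python
# Minimal Azure Blob Storage sketch: upload a local file and list blobs.
# The account URL, container name, and blob path are placeholders.
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

account_url = "https://<storage-account>.blob.core.windows.net"  # placeholder
service = BlobServiceClient(account_url=account_url,
                            credential=DefaultAzureCredential())

container = service.get_container_client("reports")  # assumed container name
with open("monthly_report.csv", "rb") as data:
    container.upload_blob(name="2024/06/monthly_report.csv", data=data,
                          overwrite=True)

for blob in container.list_blobs(name_starts_with="2024/"):
    print(blob.name, blob.size)
```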

Closing Remark

As you explore opportunities in Azure jobs in India, remember to continuously upskill and stay updated with the latest trends in cloud computing. Prepare thoroughly for interviews and showcase your expertise confidently to land your dream job in this thriving field. Good luck!

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies