
1347 Azure Cloud Jobs - Page 20

JobPe aggregates job results for easy access, but you apply directly on the original job portal.

8.0 - 12.0 years

10 - 18 Lacs

Pune, Chennai, Thiruvananthapuram

Work from Office


Requirements:
Experience: Technical Lead/Technical Manager/Technical Architect with 8-12 years in web application development.
Tech Stack: Proficiency in the following technologies:
- Frontend: TypeScript, React.js, npm (library creation), Electron
- Backend & Data Management: Node.js, TypeScript, serverless microservices using AWS Lambda, DynamoDB, Redis Cache
- Testing & Code Quality: automated unit testing frameworks (Jest), E2E testing frameworks (Selenium/Playwright), static code analysis tools (ESLint), performance engineering
- Infrastructure & Deployment: GitLab CI/CD pipelines, AWS Cloud, yarn
- Monitoring & Observability: New Relic, Grafana
- Security: familiarity with application security best practices, SAST/DAST tools, and PCI-DSS standards
Hands-on experience in either React.js or Node.js is a must-have.

Responsibilities:
- Technical leadership & code quality: lead technical design discussions, review high/low-level designs, and do estimation. Ensure deliverables meet high quality standards, adhere to clean architecture principles, implement and maintain engineering best practices across the SDLC, and mentor the team towards this.
- Strategic technical discussions: make informed technical and strategic decisions to support functional evolution, scalability, and performance.
- Testing & security: understand E2E testing requirements, review the test plan, and suggest what is in the best interest of the program. Lead defect triage meetings and make informed decisions on business criticality and priority.
- Technical debt management: identify, prioritize, and manage technical debt to ensure long-term stability and maintainability of the application.
- Collaboration: work closely with cross-functional teams (product management, architects, infra, DevOps, program management) across client and internal organizations for large program planning, and ensure work is aligned to larger goals.
- Project management (nice to have): expertise in creating an implementation plan for small-scale projects based on estimates; ensure timely delivery of business value within timelines and cost; effective scope, change, risk, and dependency management.
- Problem-solving & communication: strong analytical thinking, ability to debug efficiently, and excellent communication skills.
- Team player: ability to work effectively in a remote, cross-cultural team with international colleagues.

Posted 2 weeks ago

Apply

10.0 - 12.0 years

45 - 60 Lacs

Noida, New Delhi, Gurugram

Work from Office


Technical Architect - MEAN Stack
Location: Noida (Work from Office) | Salary: Up to 60 LPA | Joiners: Early joiners preferred (immediate to 30 days notice period)

About the Company: An established and rapidly growing global technology platform that has revolutionized digital lending and credit marketplaces. The company provides AI-powered, cloud-native solutions to financial institutions, enabling them to automate lending processes, optimize risk management, and deliver superior customer experiences across multiple channels.

Role Overview: An excellent career opportunity for a passionate and experienced Technical Architect to take ownership of the product architecture and drive innovation across multiple technology stacks. You will play a key role in building and enhancing high-performance, scalable solutions while mentoring and guiding technical teams to success.

Key Responsibilities:
- Own and drive the technical architecture roadmap in alignment with business objectives.
- Architect, implement, and maintain scalable and efficient MEAN stack solutions.
- Lead and enforce best coding standards, design patterns, and code quality across development teams.
- Proactively identify technical debt, drive technology upgrades, and ensure system robustness.
- Mentor and develop senior engineers and budding architects to build a strong technical bench.
- Collaborate closely with cross-functional teams, stakeholders, and leadership to ensure seamless delivery.
- Build, enhance, and manage the product API layer through platforms like Postman, Swagger, Apiary, etc.
- Exhibit strong hands-on coding abilities to guide the team by example.

Mandatory Candidate Profile:
- Bachelor's degree (B.Tech/B.E.) from a reputed engineering college or university.
- Consistent academic record with 60% and above throughout academics.
- 10-12 years of relevant experience in full-stack development with strong MEAN stack expertise.
- Proven experience in designing scalable cloud-native applications.
- Strong leadership skills with a focus on team building and people management.
- Excellent communication, stakeholder management, and presentation skills.
- Solid understanding of API design, integration patterns, and security protocols.

Technology Stack:
- Frontend: Angular, HTML5, CSS3, JavaScript/TypeScript
- Backend: Node.js, Express.js
- Database: MongoDB, SQL/NoSQL databases
- Cloud: AWS/Azure/GCP (preferred)
- Tools: Postman, Swagger, Apiary, Git, Jenkins, Docker, Kubernetes

Mail updated CVs to careerprospects@thehrbps.in with "Technical Architect - MEAN Stack" in the subject line.

Posted 2 weeks ago

Apply

4.0 - 9.0 years

20 - 30 Lacs

Pune, Bengaluru

Hybrid


Job role & responsibilities:
- Understand operational needs by collaborating with specialized teams and support key business operations. This involves supporting architecture design and improvements, understanding data integrity, building data models, and designing and implementing agile, scalable, and cost-efficient solutions.
- Lead a team of developers; drive sprint planning and execution to ensure timely deliveries.

Technical skills, qualification and experience required:
- Proficient in data modelling, with 4-10 years of experience.
- Experience with data modelling tools (Erwin) and building ER diagrams; hands-on experience with Erwin/Visio.
- Hands-on expertise in entity-relationship, dimensional and NoSQL modelling.
- Familiarity with manipulating datasets using Python.
- Exposure to Azure cloud services (Azure Data Factory, Azure DevOps and Databricks).
- Exposure to UML tools like Erwin/Visio.
- Familiarity with tools such as Azure DevOps, Jira and GitHub.
- Analytical approaches using IE or other common notations.
- Strong hands-on experience in SQL scripting.
- Bachelor's/Master's degree in Computer Science or a related field.
- Experience leading agile scrum, sprint planning and review sessions.
- Good communication and interpersonal skills; able to coordinate between business stakeholders and engineers.
- Strong results-orientation and time management.
- True team player who is comfortable working in a global team; able to establish relationships with stakeholders quickly in order to collaborate on use cases.
- Autonomy, curiosity and innovation capability; comfortable working in a multidisciplinary team within a fast-paced environment.
Only immediate joiners will be preferred.

Posted 2 weeks ago

Apply

2.0 - 6.0 years

6 - 9 Lacs

Pune

Remote


Experience in VMware vCenter (6.0, 7.0, 8.0) and ESXi (6.0, 7.0, 8.0), including VM creation, snapshots and cloning. Strong on advanced VM features such as VMware HA, FT, DRS and vMotion. Hands-on experience in VM migration, VMware vSphere and ESXi hosts, and vCenter servers.

Posted 2 weeks ago

Apply

10.0 - 16.0 years

15 - 30 Lacs

Bengaluru, Mumbai (All Areas)

Work from Office


Gracenote, a Nielsen company, is dedicated to connecting audiences to the entertainment they love, powering a better media future for all people. Gracenote is the content data business unit of Nielsen that powers innovative entertainment experiences for the world's leading media companies. Our entertainment metadata and connected IDs deliver advanced content navigation and discovery, connecting consumers to the content they love and helping them discover new content. Gracenote's industry-leading datasets cover TV programs, movies, sports, music and podcasts in 80 countries and 35 languages. Common identifiers, universally adopted by the world's leading media companies, deliver powerful cross-media entertainment experiences. Machine-driven, human-validated, best-in-class data and images fuel new search and discovery experiences across every screen. Gracenote's Data Organization is a dynamic and innovative group that is essential in delivering business outcomes through data, insights, and predictive & prescriptive analytics. It is an extremely motivated team that values creativity and experimentation through continuous learning in an agile and collaborative manner. From designing, developing and maintaining data architecture that satisfies our business goals to managing data governance and region-specific regulations, the data team oversees the whole data lifecycle.

Role Overview: We are seeking an experienced Senior Data Engineer with 10-12 years of experience to join our Video Engineering team at Gracenote, a NielsenIQ company. In this role, you will design, build, and maintain our data processing systems and pipelines. You will work closely with product managers, architects, analysts, and other stakeholders to ensure data is accessible, reliable, and optimized for business, analytical and operational needs.

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and ETL processes
- Architect and implement data warehousing solutions and data lakes
- Optimize data flow and collection for cross-functional teams
- Build infrastructure required for optimal extraction, transformation, and loading of data
- Ensure data quality, reliability, and integrity across all data systems
- Collaborate with data scientists and analysts to help implement models and algorithms
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, etc.
- Create and maintain comprehensive technical documentation
- Mentor junior engineers and provide technical leadership
- Evaluate and integrate new data management technologies and tools
- Implement optimization strategies to enable and maintain sub-second latency
- Oversee data infrastructure to ensure robust deployment and monitoring of pipelines and processes
- Stay ahead of emerging trends in data and cloud, integrating new research into practical applications
- Mentor and grow a team of junior data engineers

Required Qualifications and Skills:
- Expert-level proficiency in Python, SQL, and big data tools (Spark, Kafka, Airflow)
- Bachelor's degree in Computer Science, Engineering, or a related field; Master's degree preferred
- Expert knowledge of SQL and experience with relational databases (e.g., PostgreSQL, Redshift, TiDB, MySQL, Oracle, Teradata)
- Extensive experience with big data technologies (e.g., Hadoop, Spark, Hive, Flink)
- Proficiency in at least one programming language such as Python, Java, or Scala
- Experience with data modeling, data warehousing, and building ETL pipelines
- Strong knowledge of data pipeline and workflow management tools (e.g., Airflow, Luigi, NiFi)
- Experience with cloud platforms (AWS, Azure, or GCP) and their data services; AWS preferred
- Hands-on experience building streaming pipelines with Flink, Kafka, or Kinesis; Flink preferred
- Understanding of data governance and data security principles
- Experience with version control systems (e.g., Git) and CI/CD practices
- Proven leadership skills in mentoring and growing data engineering teams

Preferred Skills:
- Experience with containerization and orchestration tools (Docker, Kubernetes)
- Basic knowledge of machine learning workflows and MLOps
- Experience with NoSQL databases (MongoDB, Cassandra, etc.)
- Familiarity with data visualization tools (Tableau, Power BI, etc.)
- Experience with real-time data processing
- Knowledge of data governance frameworks and compliance requirements (GDPR, CCPA, etc.)
- Experience with infrastructure-as-code tools (Terraform, CloudFormation)
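For illustration, below is a minimal PySpark Structured Streaming sketch of the kind of streaming pipeline this listing describes (a Kafka source feeding a simple aggregation). The broker address, topic name and event schema are assumptions made for the example, not details from the posting, and the Spark-Kafka connector package must be available on the cluster.

```python
# Minimal sketch: consume JSON events from Kafka and keep a running count per program.
# Requires the spark-sql-kafka-0-10 connector on the Spark classpath.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType

spark = SparkSession.builder.appName("metadata-stream-demo").getOrCreate()

# Illustrative event schema (assumption, not from the listing).
schema = StructType([
    StructField("program_id", StringType()),
    StructField("event_type", StringType()),
])

# Read the raw Kafka stream; `value` arrives as bytes.
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
       .option("subscribe", "content-events")              # placeholder topic
       .load())

# Parse the JSON payload and aggregate counts per program and event type.
events = raw.select(from_json(col("value").cast("string"), schema).alias("e")).select("e.*")
counts = events.groupBy("program_id", "event_type").count()

# Write running aggregates to the console; a real pipeline would target a lake/table sink.
query = (counts.writeStream
         .outputMode("complete")
         .format("console")
         .start())
query.awaitTermination()
```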

Posted 2 weeks ago

Apply

7.0 - 12.0 years

15 - 22 Lacs

Noida, Hyderabad, Bengaluru

Work from Office


Solution Architect
Experience: 7-20 years | Salary: Competitive | Preferred Notice Period: Within 30 days | Shift: 10:00 AM to 7:00 PM IST | Opportunity Type: Hybrid (Bengaluru) | Placement Type: Permanent
(Note: This is a requirement for one of Uplers' clients.)

Must-have skills: full-stack development, process flows, PI planning

Meesho (one of Uplers' clients) is looking for a Solution Architect who is passionate about their work, eager to learn and grow, and committed to delivering exceptional results. If you are a team player with a positive attitude and a desire to make a difference, then we want to hear from you.

Role Overview:
Job title: Solution Architect
Experience: 7-20 years
Location: Ahmedabad, Pune, Bangalore, Noida, Gurgaon, Hyderabad, Nagpur, Mumbai, Chennai, Goa
Stack: .NET full stack (React/Angular) with Azure

Responsibilities:
- Translate business requirements into process flows.
- Identify opportunities to deliver automated solutions to replace manual efforts within proposed process flows.
- Work with the business SMEs to understand data needs throughout the development effort and work with data architects to design appropriate data stores.
- Translate business requirements into actionable technical features.
- Develop preliminary (t-shirt) sizing for use in overall project development estimation.
- Identify interdependencies of features to be developed and build a proposed development plan (PI plan).
- Participate in the grooming of technical features and support the detailed design efforts.
- Maintain awareness of known business growth and expansion plans and of related projects (both ongoing and future) that may be able to leverage features being developed, and strive to ensure that these features will be easily adaptable through enhancement to meet future needs.

How to apply for this opportunity (easy 3-step process):
1. Click on Apply and register or log in on our portal.
2. Upload an updated resume and complete the screening form.
3. Increase your chances of getting shortlisted and meet the client for the interview!

About Our Client: We are a full-service software application development company that focuses on portals, document management, collaboration, business intelligence, CRM tools, cloud technology, and data. Much of the work done for our clients is based on the Microsoft application stack of business tools.

Posted 2 weeks ago

Apply

0.0 - 3.0 years

1 - 2 Lacs

Madurai

Work from Office


Position Titles:
1. Java Full Stack Trainer
2. Python Full Stack Trainer
3. MERN/MEAN Stack Trainer
4. DevOps Trainer
5. AWS/Azure Trainer

Posted 2 weeks ago

Apply

6.0 - 8.0 years

25 - 35 Lacs

Hyderabad

Work from Office


Front End Developer

Be a part of our success story. Launch offers talented and motivated people the opportunity to do the best work of their lives in a dynamic and growing company. Through competitive salaries, outstanding benefits, internal advancement opportunities, and recognized community involvement, you will have the chance to create a career you can be proud of. Your new trajectory starts here at Launch.

What are we looking for: We are looking for a Frontend Developer who is passionate about creating intuitive, user-friendly, and visually appealing web applications. As a key member of our development team, you will be responsible for writing high-quality code and developing functional and responsive interfaces using modern JavaScript frameworks and libraries. You will work closely with product managers, designers, and backend developers to deliver seamless and scalable user experiences across platforms.

Location: Hyderabad | Experience: 5+ years | Employment Type: Full-time | Shift Type: US shift, up to 2:30 AM IST

Responsibilities:
- Develop and maintain web applications using HTML5, CSS3, JavaScript, and TypeScript.
- Resolve localization and accessibility bugs to enhance the user experience for a global audience.
- Utilize the Microsoft stack (ASP.NET, C#, Azure, Visual Studio) to ensure high performance and scalability of web applications.
- Conduct usability testing and provide feedback to improve the accessibility of web applications.
- Implement localization strategies to support multiple languages and regions.
- Participate in code reviews and contribute to the continuous improvement of our development processes.

Requirements:
- Bachelor's degree in Computer Science or a related field.
- 5+ years of experience in frontend development.
- Proficiency in HTML5, CSS3, JavaScript, and TypeScript.
- Experience with React.js, Angular, or similar frameworks; strong preference for candidates with extensive experience in React.js.
- Strong knowledge of the Microsoft stack (ASP.NET, C#, Azure, Visual Studio).
- Familiarity with accessibility standards (WCAG 2.1, ARIA) and tools (screen readers).
- Experience with localization (i18n, l10n) and globalization.
- Excellent problem-solving skills and attention to detail.
- Strong communication and teamwork skills.

Preferred Qualifications: Contributions to open-source projects related to accessibility and localization.

We are navigators in the age of transformation: We use sophisticated technology to transform clients into the digital age, but our top priority is our positive impact on human experience. We ease anxiety and fear around digital transformation and replace it with opportunity. Launch IT is an equal opportunity employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Launch IT is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation.

About Company: Launch IT India is a wholly owned subsidiary of The Planet Group (http://www.launchcg.com; http://theplanetgroup.com), a US company, and offers attractive compensation and a supportive work environment for prospective employees. Launch is an entrepreneurial business and technology consultancy. We help businesses and people navigate from current state to future state. Technology, tenacity, and creativity fuel our solutions, with offices in Bellevue, Sacramento, Dallas, San Francisco, Hyderabad & Washington D.C. https://www.linkedin.com/company/launch-consulting-group-india/

Posted 2 weeks ago

Apply

10.0 - 16.0 years

25 - 40 Lacs

Hyderabad

Hybrid


About the Role - Senior Backend Engineer

A Senior Engineer with strong backend experience and full-stack exposure who will combine design thinking techniques, software development and domain know-how, and apply agile architecture principles to build high-quality applications. You are an experienced backend engineer with good full-stack knowledge, a passion for application development, and a love for analyzing and addressing the various technical challenges along the way. You will work with the business product owner, architecture, and engineering teams to translate a business need into a robust and integrated technology solution that meets the product vision and roadmap. You will provide your technical expertise for the application backend, demonstrating modern cloud services aligned with technology governance standards and current methodologies, for the Sales and Origination business of Swiss Re.

Key Responsibilities:
- Focus on backend design, development, and creation of proofs of concept for our product application solutions.
- Participate in design and code reviews and engage effectively in review sessions.
- Collaborate with peers to drive backend application development standard methodologies, demonstrating modern cloud services for our product portfolio.
- Keep up with technology trends and identify promising new solutions that meet our requirements.
- Provide hands-on problem solving and consulting for solution challenges when needed.
- Follow standard methodologies for building, testing, and deploying applications.
- Contribute to Swiss Re's healthy and transparent risk culture where everyone engages in continuous risk accountability activities.
- Carry out hands-on software development for our product solutions, as part of a cross-functional team, under the guidance of the Engineering Lead.
- As Senior Engineer, take a leading role in code reviews and coaching of junior resources.
- Provide hands-on troubleshooting and client support whenever needed.

About You: You enjoy delivering state-of-the-art products by developing code in multiple languages and using modern technologies. You work well independently and as part of a team, and you take up responsibility and ownership.

Required Experience:
- 10+ years of experience in designing, developing, and delivering software applications.
- 6+ years of experience with backend development using Java, REST APIs, Spring Boot.
- 3+ years of experience with the following full stack: front end - Angular, TypeScript, Node, JavaScript; databases (e.g., MongoDB, PostgreSQL, Oracle, etc.).
- Good experience working with Azure cloud services.
- APIs: API design, development and API management on Azure; Azure Search, indexes, Azure Functions, Service Bus, Logic Apps, Event Grids, storage accounts, VMs, AKS, Web Apps, etc.
- Leads the definition and maintenance of the company's cloud technology framework, including cloud platform infrastructure architecture, application design principles, integration patterns and solutions, and security aspects.
- Experience with solutions hosted on Azure Cloud and understanding of the Azure stack.
- Experience with one or more web/application servers; experience in microservices architectures and with integrated development environments.
- Experience in containerization and deployment of applications using Docker and/or Azure Kubernetes Service (AKS).
- Exposure to Azure DevOps (Repos, Pipelines, CI/CD).
- Experience in agile software engineering practices with strong fundamentals; has worked with Git code repositories.
- Leads the build, test and maintenance of the infrastructure and tools (CI/CD pipeline) that allow for the continuous development, integration, validation and deployment of software.
- Experience with Scrum/Agile development methodologies using Jira or similar tools.
- Experienced in supporting applications in an operational environment and aligning with business cycles.
- Ability to describe and document application architecture and processes in an easy-to-understand manner.
- Experienced working in multicultural, globally distributed teams.
- Self-starter with a positive approach who can manage their own workload.
- Strong written and verbal communication skills.

Must Have: Backend - Java, REST APIs, Spring Boot; front end - Angular, TypeScript, Node, JavaScript; databases (e.g., MongoDB, PostgreSQL, Oracle, etc.); good experience working with Azure Cloud.

Nice to Have: GraphQL, Azure cloud services (e.g., Azure Search, indexes, Functions), GitHub Copilot.

Bachelor's Degree in Engineering or equivalent.

Posted 2 weeks ago

Apply

10.0 - 18.0 years

15 - 25 Lacs

Noida, Kolkata

Hybrid


Job Title: Manager, Cloud Support
Location: New Town, Kolkata, India
Experience: 10-15 years

Role Overview: This is a critical leadership role where you will ensure the 24x7 stability of our cloud services while also driving business growth. You will lead the cloud support team in managing everything from incident response and SLA performance to providing expert presales support, solution design, and customer cost optimization.

Key Responsibilities:
- Lead 24x7 Operations: Manage the cloud support team, overseeing monitoring and L1/L2 support, and ensuring operational stability and availability.
- Incident & Problem Management: Own the Major Incident Management process and support Root Cause Analysis (RCA) activities to prevent recurrence.
- SLA & Reporting: Track, report on, and drive improvements for all operational SLAs and Key Performance Indicators (KPIs).
- Solutioning: Lead technical presales activities, design customer cloud solutions, and coordinate technical responses for RFPs.
- Commercial Support: Develop customer cloud cost optimization (FinOps) strategies and provide technical insights into our managed services portfolio.
- Team & Process Leadership: Mentor your team, manage schedules, and continuously improve support runbooks and operational readiness processes.

What You Bring:
- Experience: 10+ years in cloud operations or support, with at least 3 years in a leadership role managing a 24x7 team.
- Hybrid Skills: Proven ability in both technical operations management (ITIL, Incident Management) and customer-facing commercial support (presales, solution design).
- Cloud Proficiency: Deep knowledge of at least one major cloud platform (AWS, Azure, or GCP).
- Technical Toolkit: Experience with modern monitoring tools (Site24x7, Grafana, etc.) and ITSM platforms (Jira Service Management, ServiceNow).
- Leadership: Strong people management skills with a proven ability to lead effectively under pressure.
- Education & Certs: Bachelor's degree in a technical field. ITIL and relevant cloud certifications (e.g., AWS Solutions Architect) are highly desirable.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

10 - 20 Lacs

Bengaluru

Work from Office


Job Title: AI/ML Engineer (with LLM, Azure, Python & PySpark expertise)

Job Description: We are looking for a skilled and experienced AI/ML Engineer to join our data science and AI team. The ideal candidate will have a strong foundation in machine learning, artificial intelligence, and large language models (LLMs), along with deep proficiency in Python, PySpark, and Microsoft Azure services. You will be responsible for developing and deploying scalable AI solutions, working with big data frameworks, and leveraging cloud platforms to operationalize machine learning models.

Key Responsibilities:
Artificial Intelligence (AI) & Machine Learning (ML): Design, develop, and optimize machine learning and AI models to solve business problems. Perform exploratory data analysis and feature engineering for model development. Use supervised, unsupervised, and reinforcement learning techniques where appropriate. Build AI pipelines and integrate models into production systems.
Large Language Models (LLM): Fine-tune and deploy LLMs (e.g., OpenAI, Hugging Face, or custom-trained models). Develop prompt engineering strategies for LLM applications. Implement RAG (Retrieval-Augmented Generation) systems or LLMOps workflows. Evaluate LLM outputs for accuracy, bias, and performance.
Python Programming: Write efficient, reusable, and testable Python code for data processing, modeling, and API services. Build automation scripts for data pipelines and model training workflows. Use popular libraries such as Scikit-learn, TensorFlow, PyTorch, Pandas, and NumPy.
PySpark and Big Data: Work with large datasets using PySpark for data wrangling, transformation, and feature extraction. Optimize Spark jobs for performance and scalability. Collaborate with data engineering teams to implement end-to-end data pipelines.
Microsoft Azure: Deploy models and applications using Azure ML, Azure Databricks, Azure Functions, and Azure Synapse. Manage compute resources, storage, and data security on Azure. Use Azure DevOps for CI/CD of ML pipelines and automation.
Cross-Functional Collaboration & Documentation: Collaborate with data engineers, product managers, and business stakeholders to align technical solutions with business needs. Maintain clear documentation of models, code, and workflows. Present technical findings and model outcomes to both technical and non-technical audiences.

Required Skills & Qualifications: Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field. 3+ years of experience in AI/ML and data engineering roles. Proficient in Python and PySpark. Experience with cloud platforms, especially Microsoft Azure. Hands-on experience with LLMs (e.g., GPT, BERT, Claude, etc.). Familiarity with ML frameworks like Scikit-learn, TensorFlow, or PyTorch. Solid understanding of the ML lifecycle, MLOps, and deployment strategies.

Nice to Have: Experience with LLMOps and vector databases (e.g., FAISS, Pinecone). Knowledge of data governance and responsible AI practices. Azure certifications (e.g., Azure AI Engineer Associate, Azure Data Scientist Associate). Experience with REST APIs and containerization (Docker, Kubernetes).
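For illustration, here is a minimal, framework-agnostic sketch of the RAG pattern mentioned in the listing: retrieve the most relevant documents with TF-IDF and build an augmented prompt. The `call_llm` function is a hypothetical stub standing in for an Azure OpenAI or Hugging Face call, and the documents and query are invented for the example.

```python
# Minimal RAG-style sketch: TF-IDF retrieval followed by prompt augmentation.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "Azure ML supports deploying models as managed online endpoints.",
    "PySpark distributes feature engineering across a cluster.",
    "Retrieval-augmented generation grounds LLM answers in source documents.",
]

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k documents most similar to the query (cosine similarity over TF-IDF)."""
    vectorizer = TfidfVectorizer()
    doc_vecs = vectorizer.fit_transform(documents)
    query_vec = vectorizer.transform([query])
    scores = cosine_similarity(query_vec, doc_vecs)[0]
    top = scores.argsort()[::-1][:k]
    return [documents[i] for i in top]

def call_llm(prompt: str) -> str:
    # Hypothetical stub: a real system would call an LLM endpoint here.
    return f"[LLM response to a {len(prompt)}-char prompt]"

def answer(query: str) -> str:
    """Build a context-grounded prompt and send it to the (stubbed) LLM."""
    context = "\n".join(retrieve(query))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
    return call_llm(prompt)

print(answer("How does RAG improve LLM accuracy?"))
```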

Posted 2 weeks ago

Apply

10.0 - 20.0 years

20 - 35 Lacs

Bengaluru

Hybrid


Must-have skills: TPM, Agile/Scrum methodologies, Cloud, SDLC, DevOps, and SecOps

We are currently seeking a Technical Program Manager in India to support a growing IT organization and infrastructure initiatives. This position is responsible for planning, organizing, and executing technical projects in support of an enterprise-scale infrastructure and operations organization. The successful candidate will assist IT and Technology Services to execute against a book of work encompassing many technical disciplines. Under the direction of the India Lead and the Head of Technical Program Management (USA), the TPM will be responsible for all aspects of executing and managing projects, tasks, schedules, and reporting, including project communications.

Requirements:
- 5 to 10 years of experience in IT project management; technology infrastructure project management experience at an enterprise scale.
- PMP or equivalent certification; Bachelor's degree.
- Exceptional organizational and time management skills.
- Hands-on experience with Azure DevOps, including pipeline management, backlog grooming, and release management (preferred).
- Strong problem-solving aptitude and critical thinking.
- Expert proficiency in the MS Office suite.
- ServiceNow experience (preferred); Smartsheet experience (desired).
- Excellent communication skills, both oral and written.
- Knowledge of relevant project management tools and platforms.
- Agile and Waterfall methodology experience.
- Data center infrastructure implementation experience (preferred).
- Proven ability to manage multiple competing priorities and deadlines with diverse stakeholders.
- Familiarity with enterprise IT networks, storage, virtual servers, and related technologies.
- Extensive experience in IT project delivery at an enterprise scale.
- Ability to manage multiple projects simultaneously and meet deadlines.
- Proactive and independent decision-making to plan work, make recommendations, and drive initiatives.

Role & Responsibilities:
- Work closely with leadership, technical teams, and other resources to plan, scope, and deliver multiple projects on budget, within scope, and on schedule.
- Develop and maintain detailed project plans, including milestones, deadlines, risks, and resource allocation.
- Identify requirements needed from external and cross-functional teams to ensure project success.
- Define project completion criteria and help teams achieve them.
- Identify and address risks in migration and infrastructure projects by working with relevant teams to implement mitigation strategies and contingency plans.
- Ensure familiarity with IT security and compliance processes throughout the project.
- Manage all aspects and phases of projects: technical resources, schedules, budgets, and reporting.
- Facilitate backlog grooming sessions with development and operations teams to ensure the backlog is well-defined, prioritized, and ready for sprint planning.
- Work closely with product owners and technical teams to ensure user stories, features, and tasks are clearly defined and properly estimated.
- Ensure that work is broken down into manageable pieces and appropriately prioritized based on business and technical needs.
- Lead standups, team meetings, lessons-learned, and other relevant meetings to ensure continuous collaboration and improvement.
- Regularly report on project status, risks, and key milestones to senior leadership and stakeholders.
- Prepare reports, status updates, and expected project artifacts to keep stakeholders informed and aligned.
- Create and maintain documentation, including project charters, plans, and retrospectives.
- Develop standardized processes to enhance project management efficiency.
- Assess program effectiveness and optimize ROI to ensure continued value and improvement.
- Leverage expert documentation skills to ensure accuracy and clarity in all project materials.
- Help engineers translate business requirements into IT deliverables to align technical outcomes with business objectives.

Posted 2 weeks ago

Apply

8.0 - 13.0 years

27 - 35 Lacs

Pune, Gurugram, Bengaluru

Hybrid


Preferred candidate profile:
- 9+ years of overall experience in software development with a focus on data projects using Python, PySpark, and associated frameworks.
- Proven experience as a Data Engineer with experience in the Azure cloud.
- Experience implementing solutions using Azure cloud services: Azure Data Factory, Azure Data Lake Gen2, Azure databases, Azure Data Fabric, API Gateway management, Azure Functions.
- Strong SQL skills with RDBMS or NoSQL databases.
- Experience developing APIs using FastAPI or similar frameworks in Python.
- Familiarity with the DevOps lifecycle (Git, Jenkins, etc.) and CI/CD processes.
- Good understanding of ETL/ELT processes.
- Experience in the financial services industry, financial instruments, asset classes, and market data is a plus.
- Assist stakeholders with data-related technical issues and support their data infrastructure needs.
- Develop and maintain documentation for data pipeline architecture, development processes, and data governance.
- In-depth knowledge of data warehousing concepts, architecture, and implementation.
- Extremely strong organizational and analytical skills with strong attention to detail.
- Strong track record of excellent results delivered to internal and external clients.
- Excellent problem-solving skills, with the ability to work independently or as part of a team.
- Strong communication and interpersonal skills, with the ability to effectively engage with both technical and non-technical stakeholders.
- Able to work independently without the need for close supervision, and collaboratively as part of cross-team efforts.
- Write new optimized SQL queries or Python scripts to improve performance and reduce run time.
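As a sketch of the FastAPI-style API development this profile asks for, the example below exposes a single read endpoint over an in-memory stand-in for market data. The model fields, symbols and route are assumptions for illustration, not part of the role description; a real service would query an Azure database or lakehouse table instead.

```python
# Minimal FastAPI data-serving sketch with an in-memory stand-in for a database.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="market-data-api")

class Instrument(BaseModel):
    symbol: str
    asset_class: str
    price: float

_FAKE_DB = {
    "AAPL": Instrument(symbol="AAPL", asset_class="equity", price=195.3),
    "UST10Y": Instrument(symbol="UST10Y", asset_class="rates", price=98.7),
}

@app.get("/instruments/{symbol}", response_model=Instrument)
def get_instrument(symbol: str) -> Instrument:
    """Return a single instrument, or a 404 if it is unknown."""
    instrument = _FAKE_DB.get(symbol.upper())
    if instrument is None:
        raise HTTPException(status_code=404, detail="instrument not found")
    return instrument

# Run locally with: uvicorn app:app --reload
```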

Posted 2 weeks ago

Apply

10.0 - 20.0 years

30 - 35 Lacs

Hyderabad, Bengaluru

Work from Office


Design and implement scalable, secure, and efficient cloud infrastructure using Azure IaaS (Virtual Machines, Storage, Networking) and PaaS (App Service, Functions, Logic Apps) services.
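For context, here is a minimal sketch of provisioning an Azure resource container programmatically with the Python management SDK (azure-identity and azure-mgmt-resource). The subscription ID, resource-group name, region and tags are placeholders, and a real IaaS/PaaS design would usually be driven from ARM/Bicep or Terraform templates rather than ad-hoc scripts.

```python
# Minimal sketch: create (or update) a resource group that IaaS/PaaS resources live in.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

subscription_id = "<subscription-id>"          # placeholder
credential = DefaultAzureCredential()          # uses env vars / managed identity / CLI login
client = ResourceManagementClient(credential, subscription_id)

rg = client.resource_groups.create_or_update(
    "rg-demo-workload",                        # placeholder name
    {"location": "eastus", "tags": {"env": "dev"}},
)
print(f"Resource group {rg.name} provisioned in {rg.location}")
```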

Posted 2 weeks ago

Apply

6.0 - 10.0 years

19 - 22 Lacs

Bengaluru

Work from Office


AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming Operations through Tech and Analytics. Do You Dream Big? We Need You. Job Description Job Title: Senior Data Scientist Marketing & Consumer Insights Location: Bangalore Reporting to: Senior Manager – Marketing & Consumer Insights COE 1. Purpose of the role We are looking for a highly motivated and strategic Senior Data Scientist to drive the global consumer sentiment analytics platform. This role will focus on building scalable, explainable, and accurate data science models using NLP and AI, enabling insights from social media, e-commerce platforms, and online review data across global markets. The candidate will own the end-to-end development cycle, from experimentation and model development to deployment and integration into Power BI dashboards. Mentoring junior data scientists, ensuring code quality, and exploring new AI techniques will also be part of the role. 2. Key tasks & accountabilities Own and manage data science workflows: data ingestion, cleaning, modeling, tuning, and interpretation. Apply and fine-tune NLP techniques for sentiment analysis, aspect extraction, multi-lingual translation and topic modeling. Ensure proper data modeling aligned with AB InBev’s data architecture; manage different data layers, handle data archiving, and continuously optimize the model as it matures. Integrate multiple datasets and data types (structured and unstructured) to improve model robustness and drive more insightful outputs. Oversee the deployment of models into production, ensuring integration with existing ABI systems and infrastructure. Ensure that the models are accurate, scalable and aligned with project objectives. Collaborate with functional and technical teams to translate business questions into modeling approaches. Participate actively during the dashboard visualization phase to ensure seamless user experience and usability; revisit model code to improve accuracy and simplify where possible. Work closely with visualization experts to ensure insights are consumable, intuitive, and decision-ready. Operate in a high-pressure, fast-paced environment across a global project with multiple stakeholders, diverse markets, and high-volume datasets. The ability to manage expectations, adapt to evolving requirements, and deliver results across geographies is critical. Drive experimentation and optimization cycles to improve model accuracy and business relevance. Partner with stakeholders across functions and geographies to ensure models address real-world needs. Maintain documentation and reproducibility standards across all modeling efforts. Mentor and support analysts and junior data scientists on best practices and solution design. 3. Qualifications, Experience, Skills Level of educational attainment required: Bachelor’s/Master’s degree in Data Science, Computer Science, Statistics, Engineering, or equivalent. Specialization or certifications in NLP, Deep Learning, or Applied AI preferred. Previous work experience required: 8+ years in data science with a focus on NLP and consumer analytics. Experience working with unstructured data, especially from social media, forums, or customer reviews. Prior exposure to end-to-end model deployment and integration with BI tools like Power BI. 
Experience in Agile teams, with familiarity in Azure DevOps or similar CI/CD environments. Proven ability to mentor and guide teams across multiple regions. IT Skills required: Python (essential), SQL, R, ML/DL frameworks (TensorFlow, Scikit-learn, SpaCy, HuggingFace) Experience with APIs (Twitter, Reddit, Facebook, YouTube, etc.) Familiarity with cloud environments (Azure preferred) Knowledge of Power BI integration and data pipelines Technical skills required: NLP, Sentiment Analysis, Topic Modeling, Text Classification Python, SQL, Machine Learning, Model Explainability API integration, Power BI readiness, data engineering fundamentals Text preprocessing, tokenization, and vectorization Sentiment analysis and topic modeling fundamentals Efficient coding practices and code optimization Working with multilingual corpora and translation APIs Knowledge of social listening tools (e.g., Brandwatch, Talkwalker) Advanced Data Visualization techniques Experience with multilingual data sets And above all of this, an undying love for beer! We dream big to create future with more cheers.
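As an illustration of the sentiment-scoring step described in this role, the sketch below runs the Hugging Face `transformers` sentiment pipeline over two invented review texts. Model choice, batching, multilingual handling and aspect extraction are all left out; the default model is downloaded on first use.

```python
# Minimal sentiment-scoring sketch using the Hugging Face transformers pipeline.
from transformers import pipeline

sentiment = pipeline("sentiment-analysis")  # downloads a default model on first use

reviews = [
    "The new lager tastes great and the packaging is beautiful.",
    "Delivery was late and the cans arrived dented.",
]

for review, result in zip(reviews, sentiment(reviews)):
    # Each result is a dict like {"label": "POSITIVE", "score": 0.99}.
    print(f"{result['label']:8s} {result['score']:.2f}  {review}")
```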

Posted 2 weeks ago

Apply

5.0 - 10.0 years

11 - 15 Lacs

Pune

Work from Office


Project description: You'll be working in the GM Business Analytics team located in Pune. The successful candidate will be a member of the global Distribution team, which has team members in London and Pune. We work as part of a global team providing analytical solutions for IB distribution/sales people. Solutions deployed should be extensible globally with minimal localization.

Responsibilities: Are you passionate about data and analytics? Are you keen to be part of the journey to modernize a data warehouse/analytics suite of applications? Do you take pride in the quality of software delivered in each development iteration? We're looking for someone like that to join us and be part of a high-performing team on a high-profile project. You will:
- solve challenging problems in an elegant way
- master state-of-the-art technologies
- build a highly responsive and fast-updating application in an Agile & Lean environment
- apply best development practices and effectively utilize technologies
- work across the full delivery cycle to ensure high-quality delivery
- write high-quality code and adhere to coding standards
- work collaboratively with diverse teams of technologists

You are:
- Curious and collaborative, comfortable working independently as well as in a team
- Focused on delivery to the business
- Strong in analytical skills; for example, you must understand the key dependencies among existing systems in terms of the flow of data among them, and it is essential that you learn to understand the 'big picture' of how the IB industry/business functions
- Able to quickly absorb new terminology and business requirements
- Already strong in analytical tools, technologies and platforms, with a strong desire for learning and self-improvement
- Open to learning home-grown technologies, supporting current-state infrastructure and helping drive future-state migrations
- Imaginative and creative with newer technologies
- Able to accurately and pragmatically estimate the development effort required for specific objectives

You will have the opportunity to work under minimal supervision to understand local and global system requirements, and to design and implement the required functionality, bug fixes and enhancements. You will be responsible for components that are developed across the whole team and deployed globally. You will also have the opportunity to provide third-line support to the application's global user community, which will include assisting dedicated support staff and liaising directly with members of other development teams, some local and some remote.

Skills (must have):
- A bachelor's or master's degree, preferably in Information Technology or a related field (computer science, mathematics, etc.), focusing on data engineering.
- 5+ years of relevant experience as a data engineer in Big Data.
- Strong knowledge of programming languages (Python/Scala) and Big Data technologies (Spark, Databricks or equivalent).
- Strong experience in executing complex data analysis and running complex SQL/Spark queries.
- Strong experience in building complex data transformations in SQL/Spark.
- Strong knowledge of database technologies.
- Strong knowledge of Azure Cloud is advantageous.
- Good understanding of and experience with Agile methodologies and delivery.
- Strong communication skills with the ability to build partnerships with stakeholders.
- Strong analytical, data management and problem-solving skills.

Nice to have:
- Experience working with the QlikView tool
- Understanding of QlikView scripting and data models

Other languages: English (C1 Advanced)
Seniority: Senior
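For illustration, here is a minimal PySpark sketch of the kind of Spark/SQL transformation work the listing mentions: ranking rows within a partition using a window function. The column names and sample rows are invented for the example.

```python
# Minimal sketch: keep each client's top product by notional using a window function.
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("distribution-analytics-demo").getOrCreate()

trades = spark.createDataFrame(
    [("clientA", "FX", 120.0), ("clientA", "Rates", 340.0), ("clientB", "FX", 90.0)],
    ["client", "product", "notional_musd"],
)

# Rank products by notional within each client, then keep only the top-ranked row.
w = Window.partitionBy("client").orderBy(F.desc("notional_musd"))
top_products = (trades
                .withColumn("rank", F.row_number().over(w))
                .filter(F.col("rank") == 1)
                .drop("rank"))

top_products.show()
```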

Posted 2 weeks ago

Apply

1.0 - 2.0 years

3 - 4 Lacs

Pune

Work from Office


- Hands-on experience with Jupyter Notebooks, Google Colab, Git & GitHub.
- Solid understanding of data visualization tools and dashboard creation.
- Prior teaching/training experience (online/offline) is a plus.
- Excellent communication and presentation skills.

Posted 2 weeks ago

Apply

5.0 - 8.0 years

0 Lacs

Hyderabad, Pune

Work from Office


Job Overview: We are looking for a skilled and proactive SRE (Site Reliability Engineer) to manage, maintain, and troubleshoot cloud data pipelines across our infrastructure. The ideal candidate is a data engineering expert with deep knowledge of cloud services and data pipeline architecture, and a software engineering mindset to optimize performance, reliability, and cost-efficiency. This role demands strong problem-solving abilities, hands-on experience with any cloud platform (preferably GCP), and the capability to work independently in a fast-paced environment.

Key Responsibilities:
- Manage and support cloud data pipelines and associated infrastructure
- Monitor the performance and reliability of pipelines, including Informatica ETL workflows, MDM, and Control-M jobs
- Troubleshoot and resolve complex issues related to data pipelines and data processing systems
- Optimize data pipeline efficiency to reduce operational costs and failure rates
- Automate repetitive tasks and streamline data pipeline management processes
- Conduct post-incident reviews and implement improvements for future reliability
- Perform SLA-oriented monitoring and recommend enhancements to ensure compliance
- Collaborate with cross-functional teams to improve and document systems and workflows
- Support real-time monitoring and alerting for mission-critical data processes
- Continuously improve systems based on proactive testing and performance insights

Required Skills and Qualifications:
- 5+ years of experience in data engineering support and enhancement
- Proficiency in Python for data processing and automation
- Strong SQL skills and experience working with relational databases
- Solid understanding of data pipeline architectures and ETL processes
- Hands-on experience with any cloud platform (GCP, Azure, or AWS; GCP preferred)
- Familiarity with version control systems like Git
- Experience in monitoring and alerting solutions for data systems
- Skilled in conducting post-incident analysis and reliability improvements
- Exposure to data visualization tools such as Google Looker Studio, Tableau, Domo, or Power BI is a plus
- Strong analytical and problem-solving skills
- Excellent verbal and written communication abilities
- Ability to work in a 24x7 shift environment

Preferred Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related technical field
- Professional cloud certification (e.g., GCP Professional Data Engineer) is a plus
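As a rough sketch of the SLA-oriented monitoring this role describes, the example below flags pipelines whose last successful run falls outside an assumed SLA window. The job names, SLA hours and run timestamps are illustrative stand-ins for what a real check would pull from the scheduler or a metadata table.

```python
# Minimal SLA-breach check over pipeline-run metadata (illustrative data only).
from datetime import datetime, timedelta, timezone

SLA_HOURS = {"sales_ingest": 4, "mdm_sync": 24}           # assumed SLAs per pipeline

last_success = {                                           # would come from monitoring
    "sales_ingest": datetime.now(timezone.utc) - timedelta(hours=6),
    "mdm_sync": datetime.now(timezone.utc) - timedelta(hours=2),
}

def check_slas() -> list[str]:
    """Return the names of pipelines currently breaching their SLA window."""
    now = datetime.now(timezone.utc)
    return [job for job, hours in SLA_HOURS.items()
            if now - last_success[job] > timedelta(hours=hours)]

for job in check_slas():
    # In production this would page on-call or raise an incident ticket.
    print(f"ALERT: {job} has not succeeded within its SLA window")
```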

Posted 2 weeks ago

Apply

4.0 - 9.0 years

12 - 16 Lacs

Kochi

Work from Office


As a Data Engineer, you will develop, maintain, evaluate and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the Azure Cloud Data Platform.

Responsibilities:
- Experienced in building data pipelines to ingest, process, and transform data from files, streams and databases.
- Process data with Spark, Python, PySpark and Hive, HBase or other NoSQL databases on the Azure Cloud Data Platform or HDFS.
- Experienced in developing efficient software code for multiple use cases, leveraging the Spark framework with Python or Scala and Big Data technologies for the various use cases built on the platform.
- Experience in developing streaming pipelines.
- Experience working with Hadoop/Azure ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data and cloud technologies such as Apache Spark, Kafka, and cloud computing services.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Minimum 4+ years of experience in Big Data technologies with extensive data engineering experience in Spark with Python or Scala
- Minimum 3 years of experience on cloud data platforms on Azure
- Experience with Databricks, Azure HDInsight, Azure Data Factory, Synapse, SQL Server DB
- Good to excellent SQL skills
- Exposure to streaming solutions and message brokers such as Kafka

Preferred technical and professional experience: Certification in Azure and Databricks, or Cloudera Spark certified developers.
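For illustration, here is a minimal PySpark batch-ingest sketch in the spirit of the pipelines described above: read raw CSV files, apply light cleansing, and write partitioned Parquet. The input and output paths are placeholders for ADLS/HDFS locations on an Azure data platform.

```python
# Minimal batch ingest: CSV landing zone -> cleaned, partitioned Parquet.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("batch-ingest-demo").getOrCreate()

raw = (spark.read
       .option("header", True)
       .csv("/landing/orders/*.csv"))            # placeholder input path

curated = (raw
           .withColumn("order_date", F.to_date(F.col("order_date")))
           .withColumn("amount", F.col("amount").cast("double"))
           .filter(F.col("amount") > 0))         # drop obviously invalid rows

# Partition by date so downstream queries can prune efficiently.
(curated.write
 .mode("overwrite")
 .partitionBy("order_date")
 .parquet("/curated/orders"))                    # placeholder output path
```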

Posted 2 weeks ago

Apply

2.0 - 5.0 years

6 - 10 Lacs

Pune

Work from Office


As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Expertise in data warehousing/information management/data integration/business intelligence using the ETL tool Informatica PowerCenter
- Knowledge of cloud, Power BI, and data migration to the cloud
- Experience in Unix shell scripting and Python
- Experience with relational SQL, Big Data, etc.

Preferred technical and professional experience:
- Knowledge of MS Azure Cloud
- Experience in Informatica PowerCenter
- Experience in Unix shell scripting and Python

Posted 2 weeks ago

Apply

3.0 - 5.0 years

15 - 20 Lacs

Gurugram

Work from Office


Qualifications for Data Engineer:
- 3+ years of experience in building and optimizing big data solutions required to fulfill business and technology requirements.
- 4+ years of technical expertise in design and implementation using big data technologies: Hadoop, Hive, Spark, Python/Java.
- Strong analytic skills to understand and create solutions for business use cases.
- Ensure best practices to implement data governance principles and data quality checks on each data layer.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Working knowledge of message queuing, stream processing, and highly scalable big data stores.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.

We are looking for a candidate with 4+ years of experience in a Data Engineer role who has attained a graduate degree in B.Tech/B.E. They should also have experience using the following software/tools:
- Big data: Hadoop, MapReduce, Hive, Spark, Kafka, Airflow, etc.
- Relational SQL and NoSQL databases: MySQL, Postgres, MongoDB, HBase, Cassandra, etc.
- Cloud data platforms: AWS, Azure HDInsight, GCP, CDP
- Real-time data processing: Storm, Spark Streaming, etc.
- Object-oriented/object function scripting languages: Java, Python, Scala, etc.

If interested, kindly fill the Google form given below: amulyavaish@paisabazaar.com
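As a sketch of the workflow orchestration named in this profile, the example below defines a two-task Airflow DAG in which a data-quality check runs only after extraction succeeds. The task bodies are stubs, and the DAG name and schedule are assumptions for illustration.

```python
# Minimal Airflow DAG sketch: extract, then run a data-quality check.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**_):
    print("pull raw data from source systems")

def quality_check(**_):
    print("run data-quality checks on the extracted batch")

with DAG(
    dag_id="daily_ingest_demo",        # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    check_task = PythonOperator(task_id="quality_check", python_callable=quality_check)

    extract_task >> check_task   # quality check runs only after extraction succeeds
```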

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office


Job Title: Node.js Developer

Position Overview: As a Backend Developer, you will play a critical role in designing and developing scalable, high-performance backend systems. You will collaborate closely with frontend developers, project managers, and other stakeholders to deliver robust server-side solutions that power our applications. This role demands a technically proficient developer who can deliver clean, efficient code while driving best practices in backend development.

Years of Experience: 4+ years of experience in tech, at least 4.6 years in development
Education: B.Tech, MCA
Location: Bangalore, Work from Office

Candidates should meet the following criteria:
- Building a product from scratch is mandatory
- SaaS experience of 2 years with total development experience of 5 years
- Must have worked in a smaller company of fewer than 100 people
- Should have built solutions/software end to end, not just as part of a bigger team

Key Roles and Responsibilities:
- Develop, test, and maintain server-side applications using Node.js and related technologies.
- Design and implement RESTful APIs and a microservices architecture to support frontend functionality.
- Collaborate with cross-functional teams to understand business needs and translate them into technical solutions.
- Optimize application performance, ensuring scalability and reliability.
- Conduct code reviews, mentor junior developers, and ensure adherence to best practices.
- Stay updated with emerging backend technologies and methodologies to improve development processes.

Skills:
- Advanced proficiency in server-side programming with Node.js
- Strong expertise in designing and building RESTful APIs, including authentication and versioning
- Extensive experience with database management systems (MongoDB), including query optimization
- Proficiency in integrating and customizing third-party services and APIs
- Mastery of version control systems like Git, including workflows and branching strategies
- In-depth knowledge of modern backend frameworks like Express (Node.js)
- Comprehensive understanding of security protocols and best practices for backend development
- Familiarity with cloud services and platforms like AWS, GCP, or Azure, and experience in deployment automation
- Strong analytical skills for debugging, troubleshooting, and resolving complex issues
- Leadership and mentorship skills to guide junior developers and ensure quality standards

Requirements (essential):
- Proficiency with version control systems like Git.
- Experience with CI/CD pipelines and deployment automation.
- Knowledge of system design and scalable architecture.
- Understanding of security best practices and implementing secure APIs.

Good to Have:
- Experience with GraphQL and related backend technologies.
- Knowledge of containerization and orchestration tools like Docker and Kubernetes.
- Familiarity with message brokers like RabbitMQ or Kafka.
- Experience with platforms like Shopify, BigCommerce, or other e-commerce technologies.
- Exposure to frontend technologies such as React for better cross-functional collaboration.

Mandatory:
- SaaS experience of 2 years with total development experience of 5 years
- Must have worked in a smaller company of fewer than 100 people
- Should have built solutions/software end to end, not just as part of a bigger team
- Should see themselves going deeper into the same role even 5 years down the line
- Should have knowledge of Node.js, MongoDB, and AWS
- Strong understanding of asynchronous programming, event-driven architecture, and Node.js design patterns
- Proficiency in working with databases such as PostgreSQL, MongoDB, or MySQL
- Experience with API development, including RESTful services and microservices
- Familiarity with cloud platforms such as AWS, Azure, or Google Cloud
- Strong problem-solving skills and the ability to debug complex server-side issues

Posted 2 weeks ago

Apply

5.0 - 8.0 years

5 - 10 Lacs

Kolkata

Work from Office

Naukri logo

Skill required: Tech for Operations - Microsoft Azure Cloud Services
Designation: App Automation Eng Senior Analyst
Qualifications: Any Graduation/12th/PUC/HSC
Years of Experience: 5 to 8 years

About Accenture: Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com

What would you do: In our Service Supply Chain offering, we leverage a combination of proprietary technology and client systems to develop, execute, and deliver BPaaS (business process as a service) or Managed Service solutions across the service lifecycle: Plan, Deliver, and Recover. In this role, you will partner with business development and act as a Business Subject Matter Expert (SME) to help build resilient solutions that will enhance our clients' supply chains and customer experience. The Senior Azure Data Factory (ADF) Support Engineer II will be a critical member of our Enterprise Applications Team, responsible for designing, supporting, and maintaining robust data solutions. The ideal candidate is proficient in ADF and SQL and has extensive experience in troubleshooting Azure Data Factory environments, conducting code reviews, and fixing bugs. This role requires a strategic thinker who can collaborate with cross-functional teams to drive our data strategy and ensure the optimal performance of our data systems.

What are we looking for:
Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
Proven experience (5+ years) as an Azure Data Factory Support Engineer II.
Expertise in ADF with a deep understanding of its data-related libraries.
Strong experience in Azure cloud services, including troubleshooting and optimizing cloud-based environments.
Proficiency in SQL and experience with SQL database design.
Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
Experience with ADF pipelines.
Excellent problem-solving and troubleshooting skills.
Experience in code review and debugging in a collaborative project setting.
Excellent verbal and written communication skills.
Ability to work in a fast-paced, team-oriented environment.
Strong understanding of the business and a passion for the mission of Service Supply Chain.
Hands-on experience with Jira, DevOps ticketing, and ServiceNow is good to have.

Roles and Responsibilities:
Innovate, collaborate, build, create, and solve for ADF and associated systems.
Ensure systems meet business requirements and industry practices.
Integrate new data management technologies and software engineering tools into existing structures.
Recommend ways to improve data reliability, efficiency, and quality.
Use large data sets to address business issues.
Use data to discover tasks that can be automated.
Fix bugs to ensure a robust and sustainable codebase.
Collaborate closely with the relevant teams to diagnose and resolve issues in data processing systems, ensuring minimal downtime and optimal performance.
Analyze and comprehend existing ADF data pipelines, systems, and processes to identify and troubleshoot issues effectively.
Develop, test, and implement code changes to fix bugs and improve the efficiency and reliability of data pipelines.
Review and validate change requests from stakeholders, ensuring they align with system capabilities and business objectives.
Implement robust monitoring solutions to proactively detect and address issues in ADF data pipelines and related infrastructure.
Coordinate with data architects and other team members to ensure that changes are in line with the overall architecture and data strategy.
Document all changes, bug fixes, and updates meticulously, maintaining clear and comprehensive records for future reference and compliance.
Provide technical guidance and support to other team members, promoting a culture of continuous learning and improvement.
Stay updated with the latest technologies and practices in ADF to continuously improve the support and maintenance of data systems.
Work flexible hours, including US time zones; this position may require a rotational on-call schedule as well as evening, weekend, and holiday shifts when the need arises.
Participate in the Demand Management and Change Management processes.
Work in partnership with internal business teams, external third-party technical teams, and functional teams as a technology partner in communicating and coordinating delivery of technology services from Technology for Operations (TfO).

Qualification: Any Graduation, 12th/PUC/HSC
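For a feel of the monitoring and re-run work this role describes, here is a small sketch that triggers an ADF pipeline run and polls it until it finishes. It assumes the @azure/identity and @azure/arm-datafactory packages; the subscription, resource group, factory, and pipeline names are placeholders, and exact SDK method names can differ between SDK versions, so treat this as an outline rather than a definitive implementation.

```ts
// Sketch: start an ADF pipeline run and poll its status (placeholder names throughout).
import { DefaultAzureCredential } from "@azure/identity";
import { DataFactoryManagementClient } from "@azure/arm-datafactory";

const subscriptionId = "<subscription-id>";   // placeholder
const resourceGroup = "<resource-group>";     // placeholder
const factoryName = "<data-factory-name>";    // placeholder
const pipelineName = "<pipeline-name>";       // placeholder

async function rerunAndWatchPipeline(): Promise<void> {
  const credential = new DefaultAzureCredential();
  const adf = new DataFactoryManagementClient(credential, subscriptionId);

  // Kick off a new pipeline run (for example, after a bug fix has been deployed).
  const { runId } = await adf.pipelines.createRun(resourceGroup, factoryName, pipelineName);
  if (!runId) throw new Error("No run id returned");
  console.log(`Started pipeline run ${runId}`);

  // Poll the run status until it reaches a terminal state.
  let status = "Queued";
  while (status === "Queued" || status === "InProgress") {
    await new Promise((resolve) => setTimeout(resolve, 30_000)); // wait 30s between polls
    const run = await adf.pipelineRuns.get(resourceGroup, factoryName, runId);
    status = run.status ?? "Unknown";
    console.log(`Run ${runId}: ${status}`);
  }
}

rerunAndWatchPipeline().catch((err) => {
  console.error("Pipeline monitoring failed:", err);
  process.exit(1);
});
```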

Posted 2 weeks ago

Apply

7.0 - 11.0 years

11 - 15 Lacs

Kolkata

Work from Office

Naukri logo

Skill required: Tech for Operations - Tech Solution Architecture
Designation: Solution Architecture Specialist
Qualifications: Any Graduation
Years of Experience: 7 to 11 years

About Accenture: Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com

What would you do: In our Service Supply Chain offering, we leverage a combination of proprietary technology and client systems to develop, execute, and deliver BPaaS (business process as a service) or Managed Service solutions across the service lifecycle: Plan, Deliver, and Recover. Join our dynamic Service Supply Chain (SSC) team and be at the forefront of helping world-class organizations unlock their full potential. Imagine a career where your innovative work makes a real impact, and every day brings new challenges and opportunities for growth. We're on the lookout for passionate, talented individuals ready to make a difference. If you're eager to shape the future and drive success, this is your chance: join us now and let's build something extraordinary together! The Technical Solution Architect I is responsible for evaluating an organization's business needs and determining how IT can support those needs, leveraging software such as Azure and Salesforce. Aligning IT strategies with business goals has become paramount, and a solutions architect can help determine, develop, and improve technical solutions in support of business goals. The Technical Solution Architect I also bridges communication between IT and business operations to ensure everyone is aligned in developing and implementing technical solutions for business problems. The process requires regular feedback, adjustments, and problem solving in order to properly design and implement potential solutions. To be successful as a Technical Solution Architect I, you should have excellent technical, analytical, and project management skills.

What are we looking for:
Minimum of 5 years of IT experience
Minimum of 1 year of experience in solution architecture
Minimum of 1 year of enterprise-scale project delivery experience
Microsoft Azure Cloud Services
Microsoft Azure Data Factory
Microsoft Azure Databricks
Microsoft Azure DevOps
Written and verbal communication
Ability to establish strong client relationships
Problem-solving skills
Strong analytical skills
Expert knowledge of Azure Cloud Services
Experience with Azure data platforms (Logic Apps, Service Bus, Databricks, Data Factory, Azure Integration Services)
CI/CD and version-control experience using Azure DevOps
Python programming
Knowledge of both traditional and modern data architecture and processing concepts, including relational databases, data warehousing, and business analytics (e.g., NoSQL, SQL Server, Oracle, Hadoop, Spark, KNIME)
Good understanding of the security processes, best practices, standards, and issues involved in multi-tier cloud or hybrid applications
Proficiency in both high-level and low-level design to build an architecture using customization or configuration on Salesforce Service Cloud, Field Service Lightning, Apex, Visualforce, Lightning, and Communities
Expertise in designing and building real-time/batch integrations between Salesforce and other systems
Ability to design Apex and Lightning frameworks, including Lightning patterns, error-logging frameworks, etc.

Roles and Responsibilities:
Meet with clients to understand their needs (lead architect assessment meetings) and determine gaps between those needs and technical functionality.
Communicate with key stakeholders across different stages of the Software Development Life Cycle.
Create the high-level design and lead architectural decisions.
Interact with clients to create end-to-end specifications for Azure and Salesforce cloud solutions.
Provide clarification and answer any questions regarding the solution architecture.
Lead the development of custom enterprise solutions.
Take responsibility for application architecture, ensuring high performance, scalability, and availability for those applications.
Take responsibility for the overall data architecture, modeling, and related standards enforced throughout the enterprise ecosystem, covering data, master data, and metadata as well as processes, governance, and change control.
Unify the data architecture used within all applications and identify appropriate systems of record, reference, and management.
Share engagement experience with internal audiences and enrich collective IP.
Conduct architecture workshops and other enablement sessions.

Qualification: Any Graduation
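As an illustration of the Azure integration services this posting lists (Service Bus in particular), here is a minimal sketch of publishing an integration message from a backend service, assuming the @azure/service-bus v7 SDK. The connection string, queue name, and payload are placeholders, not details from the posting.

```ts
// Sketch: publish a JSON event to an Azure Service Bus queue (placeholder names throughout).
import { ServiceBusClient, ServiceBusMessage } from "@azure/service-bus";

const connectionString = "<service-bus-connection-string>"; // placeholder
const queueName = "work-order-events";                      // hypothetical queue name

async function publishWorkOrderEvent(): Promise<void> {
  const client = new ServiceBusClient(connectionString);
  const sender = client.createSender(queueName);
  try {
    const message: ServiceBusMessage = {
      contentType: "application/json",
      body: { workOrderId: "WO-1001", status: "Dispatched" }, // illustrative payload
    };
    await sender.sendMessages(message);
    console.log("Event published to Service Bus");
  } finally {
    // Always release the AMQP links and connection.
    await sender.close();
    await client.close();
  }
}

publishWorkOrderEvent().catch((err) => {
  console.error("Failed to publish event:", err);
  process.exit(1);
});
```

In an Azure-plus-Salesforce architecture of the kind described above, a queue or topic like this typically decouples the systems so that either side can be taken offline or scaled without losing messages.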

Posted 2 weeks ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Naukri logo

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must have skills: Microsoft Azure Databricks
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will play a crucial role in developing solutions that align with organizational goals and enhance operational efficiency.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to understand project requirements and deliver high-quality solutions.
- Develop and maintain applications using Microsoft Azure Databricks.
- Troubleshoot and debug applications to ensure optimal performance.
- Implement best practices for application development and deployment.
- Stay updated with the latest technologies and trends in application development.

Professional & Technical Skills:
- Must-have skills: Proficiency in Microsoft Azure Databricks.
- Strong understanding of cloud computing principles and services.
- Experience with data processing and analytics using Azure services.
- Knowledge of programming languages such as Python, Scala, or SQL.
- Hands-on experience in building and deploying applications on the Azure cloud platform.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Microsoft Azure Databricks.
- This position is based at our Bengaluru office.
- A 15-year full-time education is required.

Qualification: 15 years of full-time education
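As a small, tangential illustration of working with Azure Databricks programmatically (the posting itself centres on in-notebook development in Python, Scala, or SQL), here is a sketch that triggers an existing job and checks its state through the Databricks Jobs REST API (version 2.1). The workspace URL, token, and job id are placeholders; a real deployment would normally wrap this in CI/CD tooling or the official CLI/SDK.

```ts
// Sketch: trigger a Databricks job and read back its lifecycle state via the Jobs REST API 2.1.
const workspaceUrl = "https://<workspace>.azuredatabricks.net"; // placeholder
const token = process.env.DATABRICKS_TOKEN ?? "";               // PAT or AAD token (assumption)
const jobId = 123;                                              // hypothetical job id

async function runJobAndCheckStatus(): Promise<void> {
  // Trigger an existing job by id.
  const runNow = await fetch(`${workspaceUrl}/api/2.1/jobs/run-now`, {
    method: "POST",
    headers: { Authorization: `Bearer ${token}`, "Content-Type": "application/json" },
    body: JSON.stringify({ job_id: jobId }),
  });
  const { run_id } = (await runNow.json()) as { run_id: number };
  console.log(`Triggered run ${run_id}`);

  // Fetch the run's current lifecycle state (PENDING, RUNNING, TERMINATED, ...).
  const runGet = await fetch(`${workspaceUrl}/api/2.1/jobs/runs/get?run_id=${run_id}`, {
    headers: { Authorization: `Bearer ${token}` },
  });
  const run = (await runGet.json()) as { state?: { life_cycle_state?: string } };
  console.log(`Run ${run_id} state: ${run.state?.life_cycle_state ?? "unknown"}`);
}

runJobAndCheckStatus().catch((err) => {
  console.error("Databricks job call failed:", err);
  process.exit(1);
});
```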

Posted 2 weeks ago

Apply
