9.0 - 11.0 years
0 Lacs
remote, india
Remote
JOB DESCRIPTION JB-4: Senior Lead Engineer

Our Purpose: At Majesco, we believe in connecting people and business to Insurance in ways that are Innovative, Hyper-Relevant, Compelling and Personal. We bring together the brightest minds to build the future of Insurance: a world where Insurance makes life and business easier, more connected, and better protected. We are seeking a Senior Lead Engineer to deliver platform-scale automation across insurance domains by integrating advanced AI tools, orchestrated workflows, and modular service components. This role sits at the intersection of cloud-native backend engineering, AI-driven experiences, and cross-functional automation.

All About the Role: Lead the development and scaling of document ingestion pipelines, classifier engines, and API gateways supporting intelligent P&C workflows. Build modular backend services using FastAPI, Django, or Flask, leveraging asynchronous design, microservices, and cloud-native scalability. Design and deploy event-driven automation using Azure Logic Apps, Functions, and Service Bus across claims, billing, and policy processes. Containerize services using Docker, deploy and scale on Kubernetes, and ensure high availability through best practices. Integrate platform services with Microsoft Copilot Studio and Power Automate, enabling reusable actions, conversational agents, and business rule orchestration. Establish telemetry, traceability, and structured logging standards across APIs and workflows using Azure Monitor, App Insights, and OpenTelemetry. Drive performance profiling and system optimization initiatives across ingestion, classification, and agent orchestration layers. Explore and integrate AI capabilities such as voice embeddings, vector search, and immersive UI elements into the platform. Participate actively in PI planning, backlog grooming, and agile ceremonies across engineering and product teams. Mentor junior developers and lead sprint-level technical delivery with a focus on modularity, scalability, and AI-readiness.

What You'll Bring: Passion for staying ahead of AI developments and a builder mindset to turn AI capabilities into practical applications. Demonstrated ability to design systems with observability, orchestration, and automation at the core. A strong performance-first philosophy - the ability to analyze, profile, and optimize services at scale. Vision for integrating AI into core insurance workflows, from agent recommendations to customer-facing explainability. Willingness to work across time zones and with remote teams.

All About You: 9+ years of experience in backend platform development and cloud-native architecture. Strong knowledge of FastAPI, Django, or Flask and event-driven microservice design. Minimum 5 years of experience with frontend frameworks - HTML5, React JS, Node JS, JavaScript, Angular. Exposure to cloud applications including DevOps/DevSecOps, scaling, deployment, and automation. Cloud exposure - MS Azure/AWS, OpenShift, Docker, Kubernetes, Jenkins, GitHub, Jira. Hands-on experience with Azure Cloud Services including Logic Apps, Functions, Cosmos DB, Blob Storage, and Service Bus. Proficient with Docker and Kubernetes for containerized deployment and service scaling. Experience building intelligent orchestration workflows using Power Automate and Copilot Studio. Working knowledge of vector databases, embedding APIs, and LLM integration workflows (OpenAI/Azure OpenAI). Exposure to AI-enhanced UIs, such as embedded assistants, predictive agents, or conversational UI. Proficient in system performance optimization, error tracing, logging frameworks, and monitoring pipelines. Experience working in agile teams with PI planning, story-pointing, sprint demos, and cross-functional delivery. P&C insurance domain familiarity is preferred but not mandatory.

Other Qualifications: Bachelor's degree in computer science or engineering; master's degree a plus. Experience with SAFe Agile Development practices and processes; SAFe Practitioner certification preferred. Experience with the Majesco platforms/products is a plus. Preferred: experience in developing packaged software (products), preferably in the banking or financial services areas.
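For illustration only (not Majesco's codebase): a minimal sketch of the kind of async FastAPI microservice described above, with a health endpoint, a stubbed classification route, and basic structured logging. The route names and classifier logic are invented.

```python
# Minimal sketch of an async FastAPI service with a hypothetical
# document-classification stub and simple structured logging.
import logging
import uuid

from fastapi import FastAPI
from pydantic import BaseModel

logging.basicConfig(
    level=logging.INFO,
    format='{"level": "%(levelname)s", "msg": "%(message)s"}',
)
logger = logging.getLogger("ingestion-api")

app = FastAPI(title="Document Ingestion API")

class Document(BaseModel):
    source: str
    content: str

@app.get("/health")
async def health() -> dict:
    return {"status": "ok"}

@app.post("/classify")
async def classify(doc: Document) -> dict:
    # Placeholder classifier: a real service would call a model or queue the work.
    request_id = str(uuid.uuid4())
    label = "claim" if "claim" in doc.content.lower() else "other"
    logger.info("classified request %s from %s as %s", request_id, doc.source, label)
    return {"request_id": request_id, "label": label}
```

Run locally with an ASGI server such as `uvicorn module_name:app --reload`.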
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
coimbatore, tamil nadu
On-site
You are an experienced Data Engineering expert with a focus on Azure Data Factory (ADF). In this role, you will be responsible for designing and implementing end-to-end data solutions using ADF. Your primary tasks will include collaborating with stakeholders to gather requirements and develop robust pipelines that support analytics and business insights. Your key responsibilities will involve designing and implementing complex data pipelines within Azure Data Factory. You will work on integrating data from various sources such as on-prem databases, APIs, and cloud storage. Additionally, you will develop ETL/ELT strategies to manage both structured and unstructured data effectively. Supporting data transformation, cleansing, and enrichment processes will also be a part of your role. Furthermore, you will be required to implement logging, alerting, and monitoring mechanisms for ADF pipelines. Collaboration with architects and business analysts to comprehend data requirements will be crucial. Writing and optimizing complex SQL queries for performance will also be an essential aspect of your responsibilities. To excel in this role, you should possess strong hands-on experience with ADF, specifically in pipeline orchestration and data flows. Experience with Azure Data Lake, Azure Synapse Analytics, and Blob Storage will be beneficial. Proficiency in SQL and performance tuning is a must. Additionally, knowledge of Azure DevOps and CI/CD practices, along with a good understanding of DataOps and Agile environments, will be advantageous. If you meet these qualifications and are excited about this opportunity, please share your resume with us at karthicc@nallas.com. We look forward to potentially working with you in Coimbatore in this hybrid role that offers a stimulating environment for your Data Engineering expertise.
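As a hedged illustration of the incremental-load logic that ADF-driven ETL often relies on, here is a small Python sketch of the watermark pattern using pyodbc. The connection string, tables, and columns are assumptions, not part of the posting.

```python
# Illustrative watermark-based incremental extract; all object names are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=stage;UID=etl_user;PWD=***"
)
cursor = conn.cursor()

# Read the last successfully loaded watermark for this table.
cursor.execute("SELECT last_loaded_at FROM etl.watermarks WHERE table_name = ?", "sales_orders")
last_loaded_at = cursor.fetchone()[0]

# Pull only rows changed since the watermark (incremental extract).
cursor.execute(
    "SELECT order_id, amount, modified_at FROM dbo.sales_orders WHERE modified_at > ?",
    last_loaded_at,
)
rows = cursor.fetchall()

# ... transform/load the rows, then advance the watermark in the same run.
cursor.execute(
    "UPDATE etl.watermarks SET last_loaded_at = ? WHERE table_name = ?",
    max((r.modified_at for r in rows), default=last_loaded_at),
    "sales_orders",
)
conn.commit()
```

In practice the extract and the watermark update would typically run inside one transaction or be orchestrated by the pipeline itself so a failed load does not advance the watermark.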
Posted 2 weeks ago
1.0 - 5.0 years
0 Lacs
karnataka
On-site
Qualcomm India Private Limited is seeking a Software Engineer with strong experience in Python, Java, or Go. You should have a solid understanding of Restful APIs, gRPC, and GraphQL, along with experience in cloud technologies such as AWS, Azure, or GCP, as well as on-premises infrastructure. Familiarity with BLOB storage like S3 or Azure BLOB is required, and knowledge of cloud native technologies like API Gateways and Service Mesh is a plus. An understanding of databases including SQL and NoSQL is essential, and experience with IaC tools like CloudFormation and Terraform is preferred. You should also have knowledge of security best practices and possess good problem-solving and debugging skills. Experience with CI/CD systems like Jenkins, GitOps, Chef, and/or Ansible is highly valued. Minimum qualifications include a Bachelor's, Master's, or PhD degree in Engineering, Information Systems, Computer Science, or a related field, along with 2+ years of Software Engineering experience for Bachelor's degree holders, 1+ year for Master's degree holders, or relevant academic/work experience for Ph.D. holders. You should have 2+ years of experience with programming languages such as C, C++, Java, Python, etc. You will work closely with team leads to understand use cases and requirements. Building proof-of-concepts, deploying, managing, and supporting scalable microservices on the cloud and on-premises are key responsibilities. Supporting users of the applications is also part of the role. The education requirements for this position are a B.E/M.E degree. Qualcomm is an equal opportunity employer and is committed to providing accessible processes for individuals with disabilities. If you need an accommodation during the application/hiring process, you can contact Qualcomm for support. Qualcomm expects its employees to adhere to all applicable policies and procedures, including security requirements for protecting company confidential information. Staffing and recruiting agencies are advised that Qualcomm's Careers Site is only for individuals seeking a job at Qualcomm, and unsolicited resumes or applications will not be accepted. For further information about this role, please reach out to Qualcomm Careers.
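A brief, hedged example of the blob-storage interaction mentioned above, using boto3 against S3. Bucket and key names are placeholders, and credentials are assumed to come from the environment.

```python
# Illustrative S3 upload/list/download; bucket and object keys are made up.
import boto3

s3 = boto3.client("s3")

# Upload a local artifact, then list and download it back.
s3.upload_file("build/service.tar.gz", "my-artifacts-bucket", "releases/service-1.0.tar.gz")

response = s3.list_objects_v2(Bucket="my-artifacts-bucket", Prefix="releases/")
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"])

s3.download_file("my-artifacts-bucket", "releases/service-1.0.tar.gz", "/tmp/service.tar.gz")
```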
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
hyderabad, telangana
On-site
YASH Technologies is seeking to hire .Net Core Professionals with 4-6 years of experience in Hyderabad, Pune, and Indore. As a .Net Developer or Application Developer, you will be expected to have expertise in ASP.NET framework, C#, SQL Server, and design/architectural patterns. Additionally, experience in UI Technologies like HTML5, CSS3, JavaScript, jQuery, React.js, and Angular 10+ is required. Knowledge of Azure development/deployment environment and Containerized applications is a plus. The ideal candidate should possess good debugging skills, strong knowledge of OOPS, and proficiency in using code versioning tools like TFS/GIT/Azure DevOps. Familiarity with architecture styles/APIs, troubleshooting, and communication skills is essential. You must be able to write clean, scalable code using .NET programming languages, adhere to application architecture and design, and have deployment knowledge. Other mandatory aspects include the ability to join early or immediately, complete tasks on time, and potentially work as an individual contributor. Strong personal skills such as good communication, attention to detail, integrity, ownership, flexibility, teamwork mindset, analytical thinking, and problem-solving skills are highly valued. In terms of technical/functional competencies, the role involves requirement gathering and analysis, application design, architecture tools and frameworks, estimation and resource planning, product/technology knowledge, test management, customer management, project management, and domain/industry knowledge. Additionally, knowledge of marketing and pre-sales activities is expected. Required behavioral competencies include accountability, collaboration, agility, customer focus, communication, driving results, and conflict resolution. Mandatory certifications are required. YASH Technologies offers a career-oriented skilling model, inclusive team environment, and continuous learning opportunities. Our Hyperlearning workplace is based on flexible work arrangements, agile self-determination, support for business goals, and stable employment with an ethical corporate culture.
Posted 2 weeks ago
10.0 - 15.0 years
0 Lacs
karnataka
On-site
As a Senior Azure IoT Architect with expertise in Infrastructure as Code (IaC), you will have the opportunity to play a crucial role in designing and implementing end-to-end IoT architecture solutions using Microsoft Azure services. With 10-15 years of experience in the field, you will lead the way in ensuring high availability, scalability, and security of IoT ecosystems. Your responsibilities will include defining and implementing IoT architectures, establishing best practices for data ingestion and real-time analytics, and designing secure device provisioning and communication protocols. You will also be involved in automating cloud infrastructure provisioning using tools like Terraform, ARM Templates, and Bicep, as well as implementing CI/CD pipelines for deploying and maintaining Azure IoT services. Collaboration with cross-functional teams, stakeholder management, and translating business requirements into technical solutions will be key aspects of your role. Additionally, integrating IoT solutions with enterprise applications, AI/ML models, and ensuring security, compliance, and governance standards will be part of your daily tasks. Your expertise in Azure IoT services, infrastructure automation, and cloud environment management will be critical in driving operational efficiency, scalability, and security in IoT solutions. As a member of CGI, you will have the opportunity to work in a dynamic environment that values ownership, teamwork, respect, and continuous growth. If you are passionate about leveraging cutting-edge technologies to create impactful solutions, thrive in a collaborative team environment, and have a strong background in Azure IoT and Infrastructure as Code, we invite you to join our team at CGI and be a part of our journey to drive innovation and success in the world of IT and business consulting.
Posted 2 weeks ago
5.0 - 8.0 years
0 Lacs
mumbai, maharashtra, india
Remote
Position Title: Data Engineer I - Data Engineering
Function/Group: Digital and Technology
Location: Mumbai
Shift Timing: Regular
Role Reports to: D&T Manager
Remote/Hybrid/in-Office: Hybrid

ABOUT GENERAL MILLS
We make food the world loves: 100 brands. In 100 countries. Across six continents. With iconic brands like Cheerios, Pillsbury, Betty Crocker, Nature Valley, and Häagen-Dazs, we've been serving up food the world loves for 155 years (and counting). Each of our brands has a unique story to tell. How we make our food is as important as the food we make. Our values are baked into our legacy and continue to accelerate us into the future as an innovative force for good. General Mills was founded in 1866 when Cadwallader Washburn boldly bought the largest flour mill west of the Mississippi. That pioneering spirit lives on today through our leadership team who upholds a vision of relentless innovation while being a force for good. General Mills India Center (GIC) is our global capability center in Mumbai that works as an extension of our global organization, delivering business value, service excellence and growth, while standing for good for our planet and people. With our team of 1800+ professionals, we deliver superior value across the areas of Supply Chain (SC), Digital & Technology (D&T), Innovation, Technology & Quality (ITQ), Consumer and Market Intelligence (CMI), Sales Strategy & Intelligence (SSI), Global Shared Services (GSS), Finance Shared Services (FSS) and Human Resources Shared Services (HRSS). We advocate for advancing equity and inclusion to create more equitable workplaces and a better tomorrow.

JOB OVERVIEW
Function Overview: The Digital and Technology team at General Mills stands as the largest and foremost unit, dedicated to exploring the latest trends and innovations in technology while leading the adoption of cutting-edge technologies across the organization. Collaborating closely with global business teams, the focus is on understanding business models and identifying opportunities to leverage technology for increased efficiency and disruption. The team's expertise spans a wide range of areas, including AI/ML, Data Science, IoT, NLP, Cloud, Infrastructure, RPA and Automation, Digital Transformation, Cyber Security, Blockchain, SAP S4 HANA and Enterprise Architecture. The MillsWorks initiative embodies an agile@scale delivery model, where business and technology teams operate cohesively in pods with a unified mission to deliver value for the company. Employees working on significant technology projects are recognized as Digital Transformation change agents. The team places a strong emphasis on service partnerships and employee engagement with a commitment to advancing equity and supporting communities. In fostering an inclusive culture, the team values individuals passionate about learning and growing with technology, exemplified by the Work with Heart philosophy, emphasizing results over facetime. Those intrigued by the prospect of contributing to the digital transformation journey of a Fortune 500 company are encouraged to explore more details about the function.

Purpose of the role: The Enterprise Data Development team is responsible for designing and architecting solutions to integrate and transform business data into the Data Warehouse, delivering the data layer for the Enterprise using cutting-edge cloud technologies like GCP.
We design solutions to meet the expanding need for more and more internal/external information to be integrated with existing sources; we research, implement, and leverage new technologies to deliver more actionable insights to the enterprise. We integrate solutions that combine process, technology landscapes and business information from the core enterprise data sources that form our corporate information factory to provide end-to-end solutions for the business.

KEY ACCOUNTABILITIES
- Design, create, code, and support a variety of GCP, ETL & SQL solutions.
- Experience with agile techniques or methods.
- Work effectively in a distributed global team environment.
- Effective technical & business communication with good influencing skills.
- Analyze existing processes and user development requirements to ensure maximum efficiency.
- Ability to manage multiple stakeholders, tasks and navigate through ambiguity & complexity.
- Turn information into insight by consulting with architects, solution managers, and analysts to understand the business needs & deliver solutions.
- Maintain strong technical skills and share knowledge with team members.
- Work with system users, other members of the IT Department, software vendors, and service providers to resolve problems.
- Support existing Data warehouses & related jobs.
- Task/job scheduling experience (Talend, Tidal, Airflow, Linux).
- Able to lead small projects/initiatives and contribute/lead effectively to the implementation of enterprise projects.
- Proactive research into up-to-date technology or techniques for development.
- Should have an automation mindset to embrace a Continuous Improvement mentality to streamline & eliminate waste in all processes.
- Train and educate internal team, IT functions, and business users.
- Familiarity with real-time and streaming data processes.

MINIMUM QUALIFICATIONS
- 5-8+ years of relevant experience working as a Data Engineer or at similar levels.
- Hands-on experience with cutting-edge cloud data engineering services.
- Understanding of the SAP landscape.
- Data Governance tools (any).
- Basic understanding of Cyber Security requirements.
- Excellent communication skills - verbal and written.
- Excellent analytical skills.
- Excellent stakeholder management skills.
- Expert level: SQL, Python, Data Warehousing concepts.
- Intermediate level: Cloud (storage, modelling, real time) - GCP preferred; Data Storage (S3/Blob Storage); BigQuery; SQL; Composer; Cloud Functions (Lambda/Azure Functions); dbt.
- Basic level/preferred: Data Modelling concepts.

PREFERRED QUALIFICATIONS
- GCP Data Engineer certification, GCP certification.
- Understanding of the CPG industry.
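To illustrate the expert-level SQL/BigQuery skills listed above, here is a minimal sketch of running a warehouse query from Python with the BigQuery client library. The project, dataset, and table names are invented.

```python
# Illustrative BigQuery query from Python; all identifiers are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")

sql = """
    SELECT region, SUM(net_sales) AS total_sales
    FROM `my-analytics-project.sales_dw.fact_orders`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY region
    ORDER BY total_sales DESC
"""

# query() submits a job; result() blocks until rows are available.
for row in client.query(sql).result():
    print(row["region"], row["total_sales"])
```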
Posted 2 weeks ago
5.0 - 7.0 years
0 Lacs
powai, maharashtra, india
On-site
ABOUT GENERAL MILLS
We make food the world loves: 100 brands. In 100 countries. Across six continents. With iconic brands like Cheerios, Pillsbury, Betty Crocker, Nature Valley, and Häagen-Dazs, we've been serving up food the world loves for 155 years (and counting). Each of our brands has a unique story to tell. How we make our food is as important as the food we make. Our values are baked into our legacy and continue to accelerate us into the future as an innovative force for good. General Mills was founded in 1866 when Cadwallader Washburn boldly bought the largest flour mill west of the Mississippi. That pioneering spirit lives on today through our leadership team who upholds a vision of relentless innovation while being a force for good. General Mills India Center (GIC) is our global capability center in Mumbai that works as an extension of our global organization, delivering business value, service excellence and growth, while standing for good for our planet and people. With our team of 1800+ professionals, we deliver superior value across the areas of Supply Chain (SC), Digital & Technology (D&T), Innovation, Technology & Quality (ITQ), Consumer and Market Intelligence (CMI), Sales Strategy & Intelligence (SSI), Global Shared Services (GSS), Finance Shared Services (FSS) and Human Resources Shared Services (HRSS). We advocate for advancing equity and inclusion to create more equitable workplaces and a better tomorrow.

JOB OVERVIEW
Function Overview: The Digital and Technology team at General Mills stands as the largest and foremost unit, dedicated to exploring the latest trends and innovations in technology while leading the adoption of cutting-edge technologies across the organization. Collaborating closely with global business teams, the focus is on understanding business models and identifying opportunities to leverage technology for increased efficiency and disruption. The team's expertise spans a wide range of areas, including AI/ML, Data Science, IoT, NLP, Cloud, Infrastructure, RPA and Automation, Digital Transformation, Cyber Security, Blockchain, SAP S4 HANA and Enterprise Architecture. The MillsWorks initiative embodies an agile@scale delivery model, where business and technology teams operate cohesively in pods with a unified mission to deliver value for the company. Employees working on significant technology projects are recognized as Digital Transformation change agents. The team places a strong emphasis on service partnerships and employee engagement with a commitment to advancing equity and supporting communities. In fostering an inclusive culture, the team values individuals passionate about learning and growing with technology, exemplified by the Work with Heart philosophy, emphasizing results over facetime. Those intrigued by the prospect of contributing to the digital transformation journey of a Fortune 500 company are encouraged to explore more details about the function.

Purpose of the role: The Enterprise Data Development team is responsible for designing and architecting solutions to integrate and transform business data into a data warehouse, delivering a data layer for the Enterprise using cutting-edge cloud technologies like GCP. We design solutions to meet the expanding need for more and more internal/external information to be integrated with existing sources; we research, implement, and leverage new technologies to deliver more actionable insights to the enterprise.
We integrate solutions that combine process, technology landscapes, and business information from the core enterprise data sources that form our corporate information factory to provide end-to-end solutions for the business.

KEY ACCOUNTABILITIES
- Design, create, code, and support a variety of data pipelines and models on any cloud technology (GCP preferred).
- Partner with business analysts, architects, and other key project stakeholders to deliver business initiatives.
- Seek to learn new skills, mentor newer team members, build domain expertise, and document processes.
- Actively build knowledge of D&T resources, people, and technology.
- Participate in the evaluation, implementation, and deployment of emerging tools & processes in the big data space.
- Collaboratively troubleshoot technical and performance issues in the data space.
- Lean into ambiguity and partner with others to find solutions.
- Ability to identify opportunities to contribute work to the broader GMI data community.
- Ability to manage multiple stakeholders, tasks, and navigate through ambiguity & complexity.
- Able to lead small projects/initiatives and contribute/lead effectively to the implementation of enterprise projects.
- Support existing data warehouses & related jobs.
- Familiarity with real-time and streaming data processes.
- Proactive research into up-to-date technology or techniques for development.
- Should have an automation mindset to embrace a Continuous Improvement mentality to streamline & eliminate waste in all processes.

MINIMUM QUALIFICATIONS
- Identified as the technical/project lead for global projects, with 5+ years of total experience in the ETL/data space and 2+ years of relevant experience in the cloud space.
- Actively coaches and mentors a team of developers while proactively identifying potential issues/deadline slippage/opportunities in projects/tasks and taking timely decisions.
- Demonstrates a strong affinity for attention to detail and delivery accuracy, with strong analytical skills and communication (verbal & written).
- Collaborates with business stakeholders and develops strong working relationships.
- Self-motivated team player with the ability to overcome challenges and achieve desired results.
- Expert level of experience in Cloud (storage, modeling, real time), Data Storage (S3/Blob Storage), BigQuery, SQL, Composer, Cloud Functions (Lambda/Azure Functions), and Data Warehousing.
- Intermediate level of experience with Python, Kafka, Pub/Sub, and dbt.
Posted 2 weeks ago
6.0 - 8.0 years
0 Lacs
gurgaon, haryana, india
On-site
Requisition Id: 1633643

As a global leader in assurance, tax, transaction and advisory services, we hire and develop the most passionate people in their field to help build a better working world. This starts with a culture that believes in giving you the training, opportunities and creative freedom. At EY, we don't just focus on who you are now, but who you can become. We believe that it's your career and it's yours to build, which means potential here is limitless, and we'll provide you with motivating and fulfilling experiences throughout your career to help you on the path to becoming your best professional self.

The opportunity: Manager-National-Forensics-ASU - Forensics - Investigations & Compliance - Gurgaon

Your key responsibilities - Technical Excellence: 6+ years of professional software development experience, with at least 2 years in a technical leadership or engineering management role. Expertise in Node.js, Express.js, Angular, and JavaScript/TypeScript. Deep understanding of data structures & algorithms (DSA), along with proven analytical and problem-solving skills. Demonstrated experience designing and implementing HLD and LLD. Strong proficiency in design patterns and microservice architectures. Advanced knowledge of CI/CD pipelines (e.g., Jenkins, GitHub Actions, Azure DevOps) and Git. Experience integrating and managing messaging queues (Redis, ActiveMQ, Azure Service Bus) for event-driven applications. Hands-on experience with Microsoft Azure, particularly Function Apps, Logic Apps, SQL Databases, and Blob Storage. Extensive experience in API integration (REST, SOAP, GraphQL) and best practices. Prior development experience with workflow applications (e.g., BPM, process automation, case management). Excellent communication, stakeholder management, and people leadership skills. Exposure to Agile/Scrum methodologies.

Skills and attributes - To qualify for the role you must have: Qualification: 6+ years of professional software development experience, with at least 2 years in a technical leadership or engineering management role. Bachelor's or master's degree in computer science. Experience: 6+ years of professional software development experience, with at least 2 years in a technical leadership or engineering management role.

What we look for: People with the ability to work in a collaborative manner to provide services across multiple client departments while following the commercial and legal requirements. You will need a practical approach to solving issues and complex problems with the ability to deliver insightful and practical solutions. We look for people who are agile, curious, mindful and able to sustain positive energy, while being adaptable and creative in their approach.

What we offer: With more than 200,000 clients, 300,000 people globally and 33,000 people in India, EY has become the strongest brand and the most attractive employer in our field, with market-leading growth over the competition. Our people work side-by-side with market-leading entrepreneurs, game-changers, disruptors and visionaries. As an organisation, we are investing more time, technology and money than ever before in skills and learning for our people. At EY, you will have a personalized Career Journey and also the chance to tap into the resources of our career frameworks to better know about your roles, skills and opportunities.
EY is equally committed to being an inclusive employer and we strive to achieve the right balance for our people - enabling us to deliver excellent client service whilst allowing our people to build their career as well as focus on their wellbeing. If you can confidently demonstrate that you meet the criteria above, please contact us as soon as possible. Join us in building a better working world. Apply now.
Posted 2 weeks ago
3.0 - 6.0 years
5 - 8 Lacs
hyderabad, bengaluru, delhi / ncr
Work from Office
As a Senior Azure Data Engineer, your responsibilities will include:
- Building scalable data pipelines using Databricks and PySpark
- Transforming raw data into usable business insights
- Integrating Azure services like Blob Storage, Data Lake, and Synapse Analytics
- Deploying and maintaining machine learning models using MLlib or TensorFlow
- Executing large-scale Spark jobs with performance tuning on Spark Pools
- Leveraging Databricks Notebooks and managing workflows with MLflow

Qualifications:
- Bachelor's/Master's in Computer Science, Data Science, or equivalent
- 7+ years in Data Engineering, with 3+ years in Azure Databricks
- Strong hands-on skills in PySpark, Spark SQL, RDDs, Pandas, NumPy, and Delta Lake
- Azure ecosystem: Data Lake, Blob Storage, Synapse Analytics

Location: Remote - Bengaluru, Hyderabad, Delhi/NCR, Chennai, Pune, Kolkata, Ahmedabad, Mumbai
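A small, hedged sketch of the Databricks/PySpark work described above: reading raw files, applying a light transformation, and writing a Delta table. Paths and column names are examples, and Delta support is assumed to be available in the runtime.

```python
# Illustrative bronze-to-silver step; storage paths and schema are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("claims-bronze-to-silver").getOrCreate()

raw = (
    spark.read.option("header", "true")
    .csv("abfss://raw@mydatalake.dfs.core.windows.net/claims/")
)

silver = (
    raw.withColumn("claim_amount", F.col("claim_amount").cast("double"))
       .filter(F.col("claim_amount").isNotNull())
       .withColumn("ingested_at", F.current_timestamp())
)

(
    silver.write.format("delta")
    .mode("overwrite")
    .save("abfss://silver@mydatalake.dfs.core.windows.net/claims/")
)
```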
Posted 2 weeks ago
1.0 - 5.0 years
7 - 10 Lacs
kolkata
Work from Office
Job Title: SSIS Developer. Number of Positions: 5. Experience: 4-5 Years. Location: Remote (Preferred: Ahmedabad, Gurgaon, Mumbai, Pune, Bangalore). Shift Timing: Evening/Night (start time 6:30 PM IST onwards).

Job Summary: We are seeking skilled SSIS Developers with 4-5 years of experience in developing and maintaining data integration solutions. The ideal candidate will have strong expertise in SSIS and SQL, a solid understanding of data warehousing concepts, and exposure to Azure data services. This role requires clear communication and the ability to work independently during evening or night hours.

Key Responsibilities: Design, develop, and maintain SSIS packages for ETL processes. Write and optimize complex SQL queries and stored procedures. Ensure data accuracy, integrity, and performance across DWH systems. Collaborate with team members to gather and understand requirements. Work with Azure-based data platforms and services as needed. Troubleshoot and resolve data integration issues promptly. Document technical specifications and maintain version control.

Required Skills: Proficient in Microsoft SSIS (SQL Server Integration Services). Strong SQL skills, including performance tuning and debugging. Good understanding of data warehousing concepts and ETL best practices. Exposure to Azure (e.g., Data Factory, SQL Database, Blob Storage). Strong communication and collaboration skills. Ability to work independently during US-aligned hours.

Preferred Qualifications: Experience working in a remote, distributed team environment. Familiarity with agile methodologies and tools like JIRA and Git.
Posted 2 weeks ago
3.0 - 7.0 years
6 - 10 Lacs
mumbai, pune, chennai
Work from Office
Python backend developer with experience in creating web applications. Proficient in the FastAPI framework, async processing with Redis cache, unit testing, production logging, Docker, microservices, MongoDB, and blob storage.
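A minimal sketch of the async Redis caching pattern implied above, using redis-py's asyncio client; the key naming and the stand-in fetch function are assumptions.

```python
# Illustrative cache-aside pattern; the database lookup is mocked.
import asyncio
import json

import redis.asyncio as redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

async def fetch_profile_from_db(user_id: str) -> dict:
    # Stand-in for a MongoDB or API lookup.
    await asyncio.sleep(0.1)
    return {"id": user_id, "plan": "premium"}

async def get_profile(user_id: str) -> dict:
    key = f"profile:{user_id}"
    cached = await r.get(key)
    if cached is not None:
        return json.loads(cached)
    profile = await fetch_profile_from_db(user_id)
    await r.set(key, json.dumps(profile), ex=300)  # cache for 5 minutes
    return profile

if __name__ == "__main__":
    print(asyncio.run(get_profile("42")))
```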
Posted 2 weeks ago
4.0 - 12.0 years
0 Lacs
coimbatore, tamil nadu
On-site
We are looking for a Data Architect with a minimum of 12 years of experience, including at least 4 years in Azure data services. As a Data Architect, you will be responsible for leading the design and implementation of large-scale data solutions on Microsoft Azure. Your role will involve leveraging your expertise in cloud data architecture, data engineering, and governance to create robust, secure, and scalable platforms.

Key Skills:
- Proficiency in Azure Data Factory, Synapse, Databricks, and Blob Storage
- Strong background in Data Modeling and Lakehouse Architecture
- Experience with SQL, Python, and Spark
- Knowledge of Data Governance, Security, and Metadata Management
- Familiarity with CI/CD practices, Infra as Code (ARM/Bicep/Terraform), and Git
- Excellent communication skills and the ability to collaborate effectively with stakeholders

Bonus Points For:
- Azure Certifications such as Data Engineer or Architect
- Hands-on experience with Event Hubs, Stream Analytics, and Kafka
- Understanding of Microsoft Purview
- Industry experience in healthcare, finance, or retail sectors

Join our team to drive innovation through data, shape architecture strategies, and work with cutting-edge Azure technologies. If you are ready to make a significant impact in the field of data architecture, apply now. For more information, please contact karthicc@nallas.com.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
The position available is for a Senior Developer / Technical Architect specializing in Microsoft Power Platform & Azure. The role is based on a hybrid model, requiring at least 5 years of experience in the field. This is a full-time employment opportunity, and interested candidates can apply by reaching out to reachus@neelitech. As a successful candidate, you will be responsible for designing, developing, and implementing enterprise solutions utilizing the Microsoft low-code and AI ecosystem. Proficiency in Copilot Studio, Power Platform, and Azure services is essential, along with a profound understanding of solution architecture and system design. Your key responsibilities will include designing scalable, secure, and high-performance solutions using Microsoft Power Platform and Azure. You will play a crucial role in building automation and conversational experiences through Copilot Studio, integrating custom connectors, Power Automate flows, and Dataverse-based applications. Additionally, you will leverage various Azure services such as Azure Bot Services, Cognitive Services, and Azure Functions, along with collaborating with cross-functional teams to ensure end-to-end solution delivery aligns with business objectives. The ideal candidate should possess a minimum of 5 years of IT experience, with at least 3 years dedicated to Power Platform. Hands-on experience in Copilot Studio is mandatory, along with expertise in Solution Architecture and System Design for enterprise applications. Proficiency in Dataverse, Power Automate, custom connectors, and various Azure services like Bot Services, Cognitive Services, Azure Functions, and Blob Storage is required. Strong problem-solving and communication skills are essential for this role. Having knowledge of Prompt Engineering and conversational AI best practices, as well as possessing Microsoft certifications related to Power Platform and Azure, would be considered advantageous for this position.
Posted 2 weeks ago
10.0 - 15.0 years
0 Lacs
thiruvananthapuram, kerala
On-site
At EY, you are part of a team dedicated to shaping your future with confidence. You will thrive in a globally connected environment, collaborating with diverse teams to steer your career in any direction you desire. Join EY in our mission to contribute to building a better working world. At EY Technology, we recognize the pivotal role of technology in unlocking our clients' potential and driving lasting value through innovation. We are committed to enhancing the working world by equipping EY and our clients with the necessary products, services, support, and insights to excel in the market. We focus on creating value by assisting clients in resolving intricate business challenges through innovative technological solutions. Our approach involves pioneering methods of product delivery and enterprise support to ensure our mutual success. We prioritize safeguarding our work and client data against increasingly sophisticated threats.

**The Opportunity:** The GDS Technology team is in search of an experienced Technical Manager to spearhead the development of solution architecture for software solutions utilizing the Microsoft stack. This includes technologies such as C#, .Net, SQL, Data Lake, Power Platform, as well as AI and ML technologies. Your role will entail designing and implementing highly modular, scalable, and robust applications, both on-premises and in the cloud, primarily leveraging Azure. Collaboration with teams to promote best practices and formalize processes for compliance and excellence in delivery is a key aspect of this role. As an advocate for change and growth, you will play a crucial role in integrating emerging technologies into various facets of EY's operations, offering you opportunities for personal and professional growth, learning, and impactful contributions.

**Key Responsibilities:**
- Lead the Solution Architecture team in developing software solutions using the Microsoft Technology stack.
- Oversee solution delivery for complex projects, applying strong technical capabilities and hands-on experience.
- Design and implement advanced software architectures and infrastructure architectures aligned with business and user requirements.
- Integrate AI and ML solutions into software applications to enhance functionality and user experience.
- Solve complex technical challenges in coding, testing, and deployment across multiple programming languages.
- Ensure compliance with relevant standards throughout the design, development, delivery, and maintenance of solutions.
- Engage in business development activities, including solutioning for pursuits and collaborating with sales teams on compelling proposals.
- Provide advanced technical expertise to maximize efficiency, reliability, and value from existing solutions and emerging technologies.
- Forge strong working relationships with peers across Development & Engineering and Architecture teams to develop leading solutions.

**Skills and Attributes for Success:** To qualify for this role, you must have a B.E/B.Tech/MCA/MS or equivalent degree in a Computer Science discipline, with a minimum of 10-15 years of experience in software development and architecture. Proficiency in C#, .Net, and the Microsoft technology stack is essential, along with experience in architecting enterprise solutions. Hands-on experience with Azure services, AI/ML technologies, Agile Development methodologies, and exposure to Data Science concepts are highly desirable.

**Additional Requirements:**
- Familiarity with Power Platform and Agile Development methodologies.
- Experience in large-scale technical implementations and digital transformation initiatives.
- Knowledge of PMI & Agile Standards and Azure certifications would be an advantage.

**What We Look For:** We seek individuals who are self-starters, independent thinkers, and creative minds with a drive for innovation. If you are passionate about pushing the boundaries of AI and possess the technical expertise to do so, we invite you to apply for this position and be part of redefining the possibilities of technology. EY offers a dynamic and global delivery network through EY Global Delivery Services (GDS), operating across six locations worldwide. In GDS, you will collaborate with diverse teams on exciting projects and engage with renowned brands, fostering continuous learning and personal growth opportunities that will shape your career trajectory. EY is dedicated to building a better working world by generating new value for clients, people, society, and the planet while fostering trust in capital markets. With a focus on data, AI, and advanced technology, EY teams are committed to addressing today's pressing challenges and shaping a confident future for all. Join EY and embark on a journey to transform the possibilities of technology while contributing to a better working world.
Posted 2 weeks ago
4.0 - 6.0 years
0 Lacs
kolkata, west bengal, india
Remote
Capgemini Invent
Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market leading expertise in strategy, technology, data science and creative design, to help CxOs envision and build what's next for their businesses.

Your role
Develop and maintain data pipelines tailored to Azure environments, ensuring security and compliance with client data standards. Collaborate with cross-functional teams to gather data requirements, translate them into technical specifications, and develop data models. Leverage Python libraries for data handling, enhancing processing efficiency and robustness. Ensure SQL workflows meet client performance standards and handle large data volumes effectively. Build and maintain reliable ETL pipelines, supporting full and incremental loads and ensuring data integrity and scalability in ETL processes. Implement CI/CD pipelines for automated deployment and testing of data solutions. Optimize and tune data workflows and processes to ensure high performance and reliability. Monitor, troubleshoot, and optimize data processes for performance and reliability. Document data infrastructure and workflows, and maintain industry knowledge in data engineering and cloud tech.

Your Profile
Bachelor's degree in computer science, Information Systems, or a related field. 4+ years of data engineering experience with a strong focus on Azure data services for client-centric solutions. Extensive expertise in Azure Synapse, Data Lake Storage, Data Factory, Databricks, and Blob Storage, ensuring secure, compliant data handling for clients. Good interpersonal communication skills. Skilled in designing and maintaining scalable data pipelines tailored to client needs in Azure environments. Proficient in SQL and PL/SQL for complex data processing and client-specific analytics.

What you will love about working here
We recognize the significance of flexible work arrangements to provide support. Be it remote work or flexible work hours, you will get an environment to maintain a healthy work-life balance. At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as Generative AI.

About Capgemini
Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.
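As one hedged interpretation of "full and incremental loads" on Databricks, here is a short Delta Lake MERGE (upsert) sketch; the table paths and join key are assumptions.

```python
# Illustrative incremental upsert into a curated Delta table.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("incremental-upsert").getOrCreate()

# New or changed records landed by the upstream extract.
updates = spark.read.format("parquet").load("/mnt/landing/customers_delta/")

target = DeltaTable.forPath(spark, "/mnt/curated/customers/")

(
    target.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```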
Posted 2 weeks ago
4.0 - 6.0 years
0 Lacs
mumbai, maharashtra, india
On-site
Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

Your Role
Design, develop, and deploy cloud-native applications using .NET Core/.NET Framework on Microsoft Azure. Build scalable APIs and microservices using Azure App Services, Azure Functions, or AKS. Integrate Azure services like SQL Database, Blob Storage, Key Vault, and Service Bus into application architecture. Implement CI/CD pipelines with Azure DevOps for automated builds, testing, and deployments. Ensure application security using Azure AD, OAuth, and RBAC.

Your Profile
4-6 years of experience in .NET development with hands-on exposure to Azure cloud services. Proficient in writing clean, testable code following SOLID principles and best practices. Experience with performance optimization using Azure Monitor, Application Insights, and caching strategies. Familiarity with infrastructure automation using ARM templates, Bicep, or Terraform. Exposure to Agile methodologies and tools like Jira, Confluence, and ServiceNow.

What you'll love about working here
You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group. You will also get personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage or new parent support via flexible work. At Capgemini, you can work on cutting-edge projects in tech and engineering with industry leaders or create solutions to overcome societal and environmental challenges.

Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market leading capabilities in AI, generative AI, cloud and data, combined with its deep industry expertise and partner ecosystem.
Posted 2 weeks ago
0.0 years
0 Lacs
bengaluru, karnataka, india
On-site
At Capgemini Engineering, the world leader in engineering services, we bring together a global team of engineers, scientists, and architects to help the world's most innovative companies unleash their potential. From autonomous cars to life-saving robots, our digital and software technology experts think outside the box as they provide unique R&D and engineering services across all industries. Join us for a career full of opportunities. Where you can make a difference. Where no two days are the same.

Job Description
We are seeking a skilled and motivated Data Engineer to design, build, and maintain scalable data pipelines and infrastructure. The ideal candidate will have hands-on experience with big data technologies, cloud platforms, and programming languages, and will play a key role in enabling data-driven decision-making across the organization.

Key Responsibilities: Design, develop, and optimize data pipelines for ETL processes using Apache Hadoop, Spark, and other big data tools. Implement and manage data workflows in cloud environments, primarily Microsoft Azure. Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver robust solutions. Ensure data quality, integrity, and security across all stages of data processing. Develop and maintain scalable data architectures for structured and unstructured data. Write efficient SQL queries for data extraction, transformation, and analysis. Monitor and troubleshoot data pipeline performance and reliability. Document data engineering processes and best practices.

Primary Skills:
- Big Data Technologies: Apache Hadoop, Spark
- Cloud Platforms: Microsoft Azure (Data Factory, Synapse, Blob Storage, etc.)
- Programming Languages: Python, Java
- ETL Tools & Techniques: Data ingestion, transformation, and loading
- SQL & Data Querying: Advanced SQL for data manipulation and analysis
- Data Processing & Management: Batch and real-time data processing
- Data Analysis & Business Intelligence: Integration with BI tools and dashboards

Secondary Skills:
- Cloud Computing Concepts: Public cloud, hybrid cloud, cloud security
- Multi-Paradigm Programming: Functional and object-oriented programming
- Software Development Practices: Version control, CI/CD, testing
- Data Science Fundamentals: Understanding of statistical methods and machine learning workflows
- Information Technology: General IT knowledge including networking, storage, and system architecture
- Cloud Providers: Familiarity with AWS or Google Cloud Platform is a plus
- Communication & Collaboration: Ability to work cross-functionally and explain technical concepts to non-technical stakeholders

Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market leading capabilities in AI, generative AI, cloud and data, combined with its deep industry expertise and partner ecosystem.
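To illustrate the batch vs. real-time distinction listed under Primary Skills, here is a small Spark Structured Streaming sketch using the built-in rate source; in practice the source would be Kafka, Event Hubs, or similar.

```python
# Illustrative streaming aggregation; the rate source just generates test rows.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("streaming-demo").getOrCreate()

events = spark.readStream.format("rate").option("rowsPerSecond", 5).load()

# Count events per 10-second window, tolerating 30 seconds of late data.
per_window = (
    events.withWatermark("timestamp", "30 seconds")
          .groupBy(F.window("timestamp", "10 seconds"))
          .count()
)

query = (
    per_window.writeStream.outputMode("update")
    .format("console")
    .trigger(processingTime="10 seconds")
    .start()
)
query.awaitTermination()
```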
Posted 2 weeks ago
8.0 - 13.0 years
25 - 30 Lacs
mumbai
Work from Office
Job Summary: A Software Developer must build and implement functional programs. You will work with other Developers and Product Managers throughout the software development life cycle. In this role, you should be a team player with a keen eye for detail and problem-solving skills. Your goal will be to build efficient programs and systems that serve user needs. As a software engineer, you'll work in a constantly evolving environment, due to technological advances and the strategic direction of the organisation. You'll create, maintain, audit and improve systems to meet needs, often as advised by a systems analyst or architect, testing both hardware and software systems to diagnose and resolve system faults. The role also covers writing diagnostic programs and designing and writing code for operating systems and software to ensure efficiency. When required, you'll make recommendations for future developments.

Job Overview, Roles & Key Responsibilities: Debug and resolve issues using the latest tools and technologies for identifying the root cause of production issues. Perform database operations. Create and maintain CI, release and automated deployment builds. Develop scripts, tools and other utilities as needed to detect issues. Perform other duties as assigned.

Minimum Qualifications: Must have Graduation/Post Graduation in Computer Science from A or B grade colleges with good academic scores. Must have at least 2 years of relevant work experience (if Graduation is from a Premier Institute, work experience may be exempted). Must have coding skills in C#, WPF, SQL Server and IIS. Must have sharp analytical and problem-solving skills. Should have a meticulous and organized approach to work. Experience with version control would be a plus.

Required Skills/Behaviors to be successful in this role: Excellent communication skills, verbal and written. Ability to thrive in a deadline-driven, team environment, while also delivering results. Driven, enthusiastic, and highly motivated, with high attention to detail and the ability to multitask.

We can offer: A chance to join an engaging team of brilliant people with in-depth expertise and industry experience. An opportunity to make an impact on the decarbonization of the shipping industry. Competitive benefits. Innovative tasks and development. Development possibilities.
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
kolkata, west bengal
On-site
You will be responsible for developing, scheduling, and managing ETL processes using SSIS and Azure Data Factory (ADF). Your role will also involve building and optimizing data models and OLAP cubes using SSAS and Azure Analysis Services, as well as designing and deploying reports using SSRS and Power BI. Migrating on-premises SQL workloads to Azure SQL, Data Lake, and Synapse will be part of your tasks, along with building scalable data pipelines and processing workflows using Databricks (PySpark). You will be required to write complex T-SQL queries, stored procedures, functions, and triggers, and implement DevOps pipelines for data deployment and automation using Azure DevOps. Collaborating with stakeholders to convert business requirements into technical solutions will be essential, along with tracking and resolving issues through tools like ServiceNow, JIRA, or ALM. Maintaining technical documentation, including Low-Level Design and testing artifacts, is also expected. Your role will involve incorporating temporal dynamics into the modeling strategy to enhance prediction accuracy over time, as well as designing and implementing scalable ML pipelines using Azure AI Hub, Foundry, Blob Storage, and Cosmos DB. Collaborating with architecture and platform teams to ensure smooth lift-and-shift deployment of the solution into client environments will also be part of your responsibilities. Integrating outputs with Azure OpenAI (GPT-4o) to build conversational interfaces via Microsoft Copilot Studio or Power Apps is a key aspect of the role. You will need to evaluate model performance using robust metrics such as accuracy, AUC-ROC, and ranking correlation. Working closely with stakeholders to translate business requirements into ML model specifications and insights delivery is crucial. Staying up to date with advances in NLP, ranking models, and the Azure ML ecosystem will also be important.

Required Skills:
- Deep experience with MSBI tools: SSIS, SSRS, SSAS, Power BI
- Advanced SQL skills and experience with SQL Server (2008 R2 to 2019)
- Strong knowledge of Azure PaaS services: Data Factory, Databricks, Cosmos DB, Analysis Services
- Programming in Python and C#
- Hands-on experience with CI/CD and Azure DevOps
- Sound knowledge of data warehousing, data modeling, and ETL principles
- Strong communication and analytical skills with a proactive mindset

Additional Skills:
- Deep proficiency in Python, with expertise in transformers, Python/TensorFlow, Hugging Face, and data science libraries (pandas, scikit-learn)
- Hands-on experience with BERT, pairwise learning, and ranking models
- Prior exposure to temporal modeling techniques (e.g., sliding windows, time-based feature engineering)
- Experience designing or integrating generative AI solutions (e.g., GPT-4, Claude, Gemini)
- Familiarity with CRISP-DM methodology or other structured data science lifecycles

About Grant Thornton INDUS: Grant Thornton INDUS comprises GT U.S. Shared Services Center India Pvt Ltd and Grant Thornton U.S. Knowledge and Capability Center India Pvt Ltd. Established in 2012, Grant Thornton INDUS supports the operations of Grant Thornton LLP, the U.S. member firm of Grant Thornton International Ltd. The company employs professionals across various disciplines, including Tax, Audit, Advisory, and other operational functions. Grant Thornton INDUS is known for its collaborative approach, quality-driven mindset, and commitment to building strong relationships.
Professionals at Grant Thornton INDUS are empowered, supported by bold leadership, and focused on delivering distinctive client service. The company values transparency, competitiveness, and excellence, offering significant opportunities for professionals to be part of a dynamic and impactful organization. Grant Thornton INDUS is actively involved in community service initiatives in India, reflecting its commitment to giving back to the communities it operates in. The company has offices in Bengaluru and Kolkata.
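The evaluation step named in this posting (accuracy, AUC-ROC, and ranking correlation) might look roughly like the following minimal sketch, using scikit-learn and SciPy on synthetic data; the dataset, model choice, and threshold are illustrative assumptions, not details from the posting.

```python
# Minimal evaluation sketch: accuracy, AUC-ROC, and a rank correlation.
# All data here is synthetic; the posting names the metrics, not the model or data.
import numpy as np
from scipy.stats import spearmanr
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic binary-classification data standing in for a real scoring problem.
X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

proba = model.predict_proba(X_test)[:, 1]   # positive-class scores
preds = (proba >= 0.5).astype(int)          # assumed decision threshold

print("accuracy:", accuracy_score(y_test, preds))
print("AUC-ROC :", roc_auc_score(y_test, proba))
# Spearman rank correlation between scores and labels, as a rough stand-in
# for the "ranking correlation" the posting mentions.
print("spearman:", spearmanr(proba, y_test).correlation)
```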
Posted 3 weeks ago
6.0 - 10.0 years
0 Lacs
karnataka
On-site
The role requires you to design, develop, and maintain complex, high-performance, and scalable MLOps systems that interact with AI models and systems. You will collaborate with cross-functional teams, including data scientists, AI researchers, and AI/ML engineers, to comprehend requirements, define project scope, and ensure alignment with business goals. Your expertise will be crucial in selecting, evaluating, and implementing software technologies, tools, and frameworks within a cloud-native (Azure + AML) environment. Troubleshooting and resolving intricate software issues to ensure optimal performance and reliability when interfacing with AI/ML systems is an essential part of your responsibilities. Additionally, you will contribute to software development project planning and estimation, ensuring efficient resource allocation and timely solution delivery.
Your role involves contributing to the development of continuous integration and continuous deployment (CI/CD) pipelines, high-performance data pipelines, storage systems, and data processing solutions. You will drive the integration of GenAI models, such as LLMs and foundation models, into production workflows, including overseeing orchestration and evaluation pipelines. Moreover, you will support edge deployment use cases through model optimization, conversion (e.g., to ONNX, TFLite), and containerization for edge runtimes. Your contribution to creating and maintaining technical documentation, including design specifications, API documentation, data models, data flow diagrams, and user manuals, will be vital for effective communication within the team.
**Required Qualifications:**
- Bachelor's degree in software engineering/computer science or a related discipline
- Minimum of 6 years of experience in machine learning operations or software/platform development
- Strong familiarity with Azure ML, Azure DevOps, Blob Storage, and containerized model deployments on Azure
- Proficiency in programming languages commonly used in AI/ML, such as Python, R, or C++
- Experience with the Azure cloud platform, machine learning services, and industry best practices
**Preferred Qualifications:**
- Experience with machine learning frameworks like TensorFlow, PyTorch, or Keras
- Familiarity with version control systems like Git and CI/CD tools such as Jenkins, GitLab CI/CD, or Azure DevOps
- Knowledge of containerization technologies such as Docker and Kubernetes, along with infrastructure-as-code tools like Terraform or Azure Resource Manager (ARM) templates
- Exposure to Generative AI workflows, including prompt engineering, LLM fine-tuning, or retrieval-augmented generation (RAG)
- Understanding of GenAI frameworks like LangChain, LlamaIndex, Hugging Face Transformers, and OpenAI API integration
- Experience in deploying optimized models on edge devices using ONNX Runtime, TensorRT, OpenVINO, or TFLite
- Hands-on experience with monitoring LLM outputs, feedback loops, or LLMOps best practices
- Familiarity with edge inference hardware such as NVIDIA Jetson, Intel Movidius, or ARM Cortex-A/NPU devices
This is a permanent position requiring in-person work.
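The model conversion path mentioned above (e.g., to ONNX for edge runtimes) could look roughly like this minimal sketch, assuming PyTorch for the source model and ONNX Runtime on CPU for inference; the toy network and the file name are placeholders, not anything the posting specifies.

```python
# Sketch: export a small PyTorch model to ONNX and run it with ONNX Runtime.
# The architecture and output path are illustrative placeholders only.
import numpy as np
import onnxruntime as ort
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(16, 8), nn.ReLU(), nn.Linear(8, 2)).eval()
dummy_input = torch.randn(1, 16)

torch.onnx.export(
    model,
    dummy_input,
    "tiny_model.onnx",                         # placeholder output path
    input_names=["features"],
    output_names=["logits"],
    dynamic_axes={"features": {0: "batch"}},   # allow variable batch size
)

# Load the exported graph and run a batch through the CPU execution provider.
session = ort.InferenceSession("tiny_model.onnx", providers=["CPUExecutionProvider"])
batch = np.random.randn(4, 16).astype(np.float32)
logits = session.run(["logits"], {"features": batch})[0]
print(logits.shape)  # (4, 2)
```

In practice the exported model would then be containerized or packaged for the target edge runtime, which is outside the scope of this sketch.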
Posted 3 weeks ago
0.0 years
0 Lacs
pune, maharashtra, india
On-site
Job Description
As a Data Engineer, you'll be responsible for designing, implementing, and managing data solutions on the Azure platform. Here are some common use cases and examples of tasks performed by Azure Data Engineers:
Data Ingestion & Integration: Use tools like Azure Data Factory to extract, transform, and load (ETL) data from sources such as databases, files, APIs, and streams into Azure.
Data Transformation & Processing: Prepare data for analysis using platforms like Azure Databricks or Synapse Analytics with Apache Spark.
Data Storage & Management: Choose appropriate storage solutions: Azure SQL Database or Cosmos DB for structured data, Data Lake Storage for big data, and Blob Storage for files.
Data Warehousing: Implement scalable analytics with Azure Synapse Analytics for large datasets.
Data Modeling & Analysis: Collaborate on data models using Azure Analysis Services or Databricks to support analytics.
Real-time Processing: Use Azure Stream Analytics to process and analyze live data from sources like IoT or social media.
Governance & Security: Ensure data security and compliance through access controls, encryption, monitoring, and retention policies.
DevOps & Automation: Automate pipelines and deployments using Azure DevOps, Monitor, and Automation tools.
Additional Information
At Tietoevry, we believe in the power of diversity, equity, and inclusion. We encourage applicants of all backgrounds, genders (m/f/d), and walks of life to join our team, as we believe that this fosters an inspiring workplace and fuels innovation. Our commitment to openness, trust, and diversity is at the heart of our mission to create digital futures that benefit businesses, societies, and humanity. Diversity, equity and inclusion (tietoevry.com)
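The transformation-and-processing step described above (preparing data with Databricks or Synapse Spark) might look roughly like the following minimal PySpark sketch; the storage paths, column names, and the Delta output format are illustrative assumptions, not details taken from the posting.

```python
# Sketch of a simple PySpark cleaning/transformation step (Databricks/Synapse style).
# Paths and column names below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Read raw CSV files landed in a (hypothetical) ADLS container.
raw = (
    spark.read
    .option("header", "true")
    .csv("abfss://landing@examplestorage.dfs.core.windows.net/claims/")
)

# Basic typing, filtering, and audit column.
cleaned = (
    raw
    .withColumn("claim_amount", F.col("claim_amount").cast("double"))
    .filter(F.col("claim_amount").isNotNull())
    .withColumn("ingest_date", F.current_date())
)

# Write a curated table, e.g. as Delta (available on Databricks/Synapse Spark pools).
(
    cleaned.write
    .format("delta")
    .mode("overwrite")
    .save("abfss://curated@examplestorage.dfs.core.windows.net/claims/")
)
```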
Posted 3 weeks ago
2.0 - 5.0 years
2 - 5 Lacs
pune, maharashtra, india
On-site
Senior Data Engineer
Skills: Azure Cloud technologies
Mandatory:
- Synapse (DataFlow, Pipeline (including web calls), Notebook (PySpark))
- SQL expertise
- Key Vault
- Blob Storage
Preferable:
- Logic Apps
- Function App (C#)
- API Management
- Git
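As a rough illustration of how the mandatory items above (Key Vault, Blob Storage, Python notebooks) can come together in code, here is a minimal sketch using the standard Azure SDK packages (azure-identity, azure-keyvault-secrets, azure-storage-blob); the vault, storage account, container, and secret names are hypothetical.

```python
# Sketch: fetch a secret from Key Vault, then list blobs in a storage container.
# All resource names below are hypothetical placeholders.
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient
from azure.storage.blob import BlobServiceClient

credential = DefaultAzureCredential()

secrets = SecretClient(
    vault_url="https://example-vault.vault.azure.net",  # hypothetical vault
    credential=credential,
)
# The secret would typically be passed to a downstream service call; only its
# length is printed here to avoid leaking it.
api_key = secrets.get_secret("downstream-api-key").value
print("secret length:", len(api_key))

blob_service = BlobServiceClient(
    account_url="https://examplestorage.blob.core.windows.net",  # hypothetical account
    credential=credential,
)
container = blob_service.get_container_client("landing")
for blob in container.list_blobs():
    print(blob.name)
```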
Posted 3 weeks ago
8.0 - 13.0 years
8 - 12 Lacs
hyderabad
Work from Office
Your role and responsibilities:
In this role, you will have the opportunity to contribute to ABB's success with technology, software, product, and system development. Each day, you will identify and implement innovative solutions to relevant problems. You will also showcase your expertise by providing ideas and by being able to work both independently and as part of a Research & Development (R&D) team.
The work model for the role is onsite (#LI-Onsite). This role is contributing to the Electrification Business Area for the Smart Buildings Division based in Hyderabad. The ideal candidate plays a crucial role in ensuring cloud-based systems function reliably, securely, and efficiently.
You will be mainly accountable for:
Working in tandem with our Product Management team, global R&D team, and external firms on connected solutions in IoT, microservices, mobility, cloud platforms, and edge devices in the ELSB SBED domain, with scalability, security, multi-tenancy, and service-oriented architecture on AWS/Azure/Google.
Must have a good understanding of:
Device connection and device registration - connecting smart devices over MQTT and HTTPS protocols, with high availability and auto-scaling clusters.
Device control and management - managing the life cycle of connected devices and standard instruction models for device control.
Hands-on cloud development experience - understanding the fundamental services for smart solutions, with experience in the strategy, design, development, and implementation of projects in the cloud, preferably Amazon Web Services (AWS).
Knowledge of serverless solutions - implementing AWS Lambda, API Gateway, DynamoDB, and other serverless services, or developing applications using Azure Functions, Logic Apps, and Event Grid.
Cloud migration - assisting in migrating legacy applications to AWS or Azure serverless infrastructure.
Security & compliance - ensuring cloud solutions adhere to security best practices and compliance standards.
Familiarity with frontend frameworks, cyber security, and secure coding practices will be an added advantage.
Good knowledge of multithreading, SDLC, Agile practices, and iterative development processes.
Qualifications for the role:
B.Tech/M.Tech in CSE/ECE/EE from reputed institutes, with 8+ years of hands-on experience in the development of IoT/digital solutions and industrial protocols.
Proficiency in programming languages like C#, Python, Node.js, or Java, and experience in SQL and PL/SQL for writing database queries, stored procedures, and trigger packages.
Hands-on experience with AWS services like Lambda, S3, DynamoDB, CloudFormation, and IAM, or Azure services like Functions, Cosmos DB, API Management, and Blob Storage.
Good understanding of DevOps practices and CI/CD pipelines.
Experience in the electrical protection/controls firmware domain is preferred.
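The serverless pattern named above (API Gateway, Lambda, DynamoDB) could look roughly like this minimal handler sketch, using boto3; the table name, key schema, and payload fields are assumptions for illustration only, not anything defined by the posting.

```python
# Sketch of a serverless ingestion path: API Gateway -> Lambda -> DynamoDB.
# Table name, key schema, and payload fields are hypothetical placeholders.
import json

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("device-telemetry")  # hypothetical table


def lambda_handler(event, context):
    # With API Gateway proxy integration, the request body arrives as a JSON string.
    payload = json.loads(event.get("body") or "{}")

    table.put_item(
        Item={
            "device_id": payload["device_id"],   # assumed partition key
            "timestamp": payload["timestamp"],   # assumed sort key
            "status": payload.get("status", "unknown"),
        }
    )

    return {"statusCode": 200, "body": json.dumps({"result": "stored"})}
```

Device-side connectivity over MQTT would typically sit in front of such a path via an IoT broker, but that part is left out of this sketch.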
Posted 3 weeks ago
3.0 - 5.0 years
11 - 17 Lacs
bengaluru, karnataka, india
Remote
Sapiens is on the lookout for a Developer (Power BI) to become a key player in our Bangalore team. If you're a seasoned Power BI pro and ready to take your career to new heights with an established, globally successful company, this role could be the perfect fit.
Location: Bangalore
Working Model: Our flexible work arrangement combines both remote and in-office work, optimizing flexibility and productivity.
This position will be part of the Sapiens Digital (Data Suite) division; for more information about it, click here: https://sapiens.com/solutions/digitalsuite-customer-experience-and-engagement-software-for-insurers/
As a Power BI Developer at Sapiens, you will design and develop Power BI reports and dashboards, implement data solutions for clients, and collaborate with internal teams to ensure reporting requirements and project timelines are met.
Roles & Responsibilities
Responsible for the design, development, testing, implementation, and maintenance of reports and dashboards on the Microsoft tool stack/Power BI.
Build Power BI models based on different modelling designs such as Snowflake, Star, and relational data models.
Suggest and implement secure and suitable connectivity to required data sources from the Microsoft tool stack/Power BI.
Should be capable of guiding a team of developers across multiple projects and tools.
Discuss with business users to define and document new or enhanced reporting and dashboarding requirements.
Translate business requirements into functional and technical specification documents that can be used for development.
Assist with data quality, cleansing, and governance activities.
Provide training and assistance to business users on the tools used in delivering the solution (including reports and dashboards).
Ensure proper configuration management and change controls are implemented in your sphere of influence.
Establish quality metrics and ensure their implementation.
Required Qualifications and Skills
3+ years of related experience in developing and implementing reports and dashboards, primarily on the Microsoft tool stack/Power BI.
Ability to resolve complex modelling issues in Power BI with minimal support.
Ability to code complex SQL, MDX, and DAX queries, with extensive experience in SQL performance tuning and debugging.
Extensive experience in creating complex models and creative visualizations for impactful dashboards that drive better and more informed decisions.
Experience in Power Query, M, R, or Python is a strong advantage.
Experience in data analysis, data modelling, and data mart design.
Good understanding of and familiarity with ETL tools and technologies (SSIS, Informatica, DataStage, Talend, etc.) is an added advantage.
Very good understanding of BI and analytics and the best practices followed.
Familiarity with any cloud technologies (AWS, Azure, GCP, etc.) will be an additional advantage.
Experience with the Azure stack (Data Factory, Logic Apps, Azure SQL DBs, Blob Storage, Azure Tables, etc.) is a strong advantage.
Familiarity with the Power Platform tools and MS365/SharePoint/Teams is nice to have.
Ability to work with minimal guidance or supervision in a time-critical environment.
About Sapiens
Sapiens is a global leader in the insurance industry, delivering its award-winning, cloud-based SaaS insurance platform to over 600 customers in more than 30 countries. The Sapiens platform offers pre-integrated, low-code capabilities to accelerate customers' digital transformation.
With more than 40 years of industry expertise, Sapiens has a highly professional team of more than 5,000 employees globally. For more information, visit us at www.sapiens.com.
Disclaimer: Sapiens India does not authorise any third parties to release employment offers or conduct recruitment drives via a third party. Hence, beware of inauthentic and fraudulent job offers or recruitment drives from any individuals or websites purporting to represent Sapiens. Further, Sapiens does not charge any fee or other emoluments for any reason (including, without limitation, visa fees) or seek compensation from educational institutions to participate in recruitment events. Accordingly, please check the authenticity of any such offers before acting on them; where acted upon, you do so at your own risk. Sapiens shall neither be responsible for honouring or making good the promises made by fraudulent third parties, nor for any monetary or other loss incurred by the aggrieved individual or educational institution. In the event that you come across any fraudulent activities in the name of Sapiens, please feel free to report the incident to Sapiens.
Posted 3 weeks ago