Home
Jobs
Companies
Resume

9389 Tuning Jobs - Page 47

Filter
Filter Interviews
Min: 0 years
Max: 25 years
Min: ₹0
Max: ₹10000000
Setup a job Alert
JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

3.0 - 5.0 years

0 Lacs

India

Remote

Linkedin logo

Job Title: DevOps Engineer Experience: 3 to 5 Years Location: [Insert Location or Remote] Employment Type: Full-Time Work Mode: [On-site/Hybrid/Remote] Job Summary: We are seeking a skilled DevOps Engineer with 3 to 5 years of experience to join our technology team. The ideal candidate will be responsible for automating and streamlining our development, testing, and deployment processes, and ensuring the stability, scalability, and security of our cloud-based infrastructure. Key Responsibilities: Design, build, and maintain CI/CD pipelines for automated deployments. Manage and maintain cloud infrastructure (AWS/Azure/GCP). Implement infrastructure as code using tools like Terraform, CloudFormation, or Ansible. Monitor system performance and ensure high availability and disaster recovery. Collaborate with development, QA, and IT teams to deliver reliable and scalable solutions. Maintain version control systems and implement branching strategies. Ensure system security through proper access controls, firewalls, and monitoring. Automate repetitive tasks and support production deployments. Troubleshoot production issues and coordinate with the development team for resolutions. Required Skills and Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field. 3 to 5 years of hands-on experience in a DevOps role. Proficiency with cloud platforms such as AWS , Azure , or Google Cloud . Strong experience with CI/CD tools (e.g., Jenkins, GitLab CI/CD, CircleCI). Experience with Docker and Kubernetes for containerization and orchestration. Familiarity with configuration management tools like Ansible, Puppet, or Chef. Strong scripting skills in Shell , Python , or Bash . Experience with monitoring tools like Prometheus , Grafana , ELK Stack , or Datadog . Familiarity with Git , Agile , and DevSecOps practices. Preferred Qualifications: Certifications in AWS/Azure/Google Cloud. Experience with microservices and service mesh architecture. Exposure to security best practices in DevOps (DevSecOps). Knowledge of networking concepts and performance tuning. Show more Show less

Posted 4 days ago

Apply

0 years

0 Lacs

India

Remote

Linkedin logo

Join Tether and Shape the Future of Digital Finance At Tether, we’re not just building products, we’re pioneering a global financial revolution. Our cutting-edge solutions empower businesses—from exchanges and wallets to payment processors and ATMs—to seamlessly integrate reserve-backed tokens across blockchains. By harnessing the power of blockchain technology, Tether enables you to store, send, and receive digital tokens instantly, securely, and globally, all at a fraction of the cost. Transparency is the bedrock of everything we do, ensuring trust in every transaction. Innovate with Tether Tether Finance: Our innovative product suite features the world’s most trusted stablecoin, USDT , relied upon by hundreds of millions worldwide, alongside pioneering digital asset tokenization services. But that’s just the beginning: Tether Power: Driving sustainable growth, our energy solutions optimize excess power for Bitcoin mining using eco-friendly practices in state-of-the-art, geo-diverse facilities. Tether Data: Fueling breakthroughs in AI and peer-to-peer technology, we reduce infrastructure costs and enhance global communications with cutting-edge solutions like KEET , our flagship app that redefines secure and private data sharing. Tether Education : Democratizing access to top-tier digital learning, we empower individuals to thrive in the digital and gig economies, driving global growth and opportunity. Tether Evolution : At the intersection of technology and human potential, we are pushing the boundaries of what is possible, crafting a future where innovation and human capabilities merge in powerful, unprecedented ways. Why Join Us? Our team is a global talent powerhouse, working remotely from every corner of the world. If you’re passionate about making a mark in the fintech space, this is your opportunity to collaborate with some of the brightest minds, pushing boundaries and setting new standards. We’ve grown fast, stayed lean, and secured our place as a leader in the industry. If you have excellent English communication skills and are ready to contribute to the most innovative platform on the planet, Tether is the place for you. Are you ready to be part of the future? About The Job As a member of the AI model team, you will drive innovation in supervised fine-tuning methodologies for advanced models. Your work will refine pre-trained models so that they deliver enhanced intelligence, optimized performance, and domain-specific capabilities designed for real-world challenges. You will work on a wide spectrum of systems, ranging from streamlined, resource-efficient models that run on limited hardware to complex multi-modal architectures that integrate data such as text, images, and audio. We expect you to have deep expertise in large language model architectures and substantial experience in fine-tuning optimization. You will adopt a hands-on, research-driven approach to developing, testing, and implementing new fine-tuning techniques and algorithms. Your responsibilities include curating specialized data, strengthening baseline performance, and identifying as well as resolving bottlenecks in the fine-tuning process. The goal is to unlock superior domain-adapted AI performance and push the limits of what these models can achieve. Responsibilities: Develop and implement new state-of-the-art and novel fine-tuning methodologies for pre-trained models with clear performance targets. Build, run, and monitor controlled fine-tuning experiments while tracking key performance indicators. Document iterative results and compare against benchmark datasets. Identify and process high-quality datasets tailored to specific domains. Set measurable criteria to ensure that data curation positively impacts model performance in fine-tuning tasks. Systematically debug and optimize the fine-tuning process by analyzing computational and model performance metrics. Collaborate with cross-functional teams to deploy fine-tuned models into production pipelines. Define clear success metrics and ensure continuous monitoring for improvements and domain adaptation. A degree in Computer Science or related field. Ideally PhD in NLP, Machine Learning, or a related field, complemented by a solid track record in AI R&D (with good publications in A* conferences). Hands-on experience with large-scale fine-tuning experiments, where your contributions have led to measurable improvements in domain-specific model performance. Deep understanding of advanced fine-tuning methodologies, including state-of-the-art modifications for transformer architectures as well as alternative approaches. Your expertise should emphasize techniques that enhance model intelligence, efficiency, and scalability within fine-tuning workflows. Strong expertise in PyTorch and Hugging Face libraries with practical experience in developing fine-tuning pipelines, continuously adapting models to new data, and deploying these refined models in production on target platforms. Demonstrated ability to apply empirical research to overcome fine-tuning bottlenecks. You should be comfortable designing evaluation frameworks and iterating on algorithmic improvements to continuously push the boundaries of fine-tuned AI performance. Show more Show less

Posted 4 days ago

Apply

3.0 years

0 Lacs

Kochi, Kerala, India

On-site

Linkedin logo

Company Overview Viraaj HR Solutions is a leading provider of recruitment and human resource management services in India. We specialize in connecting top talent with dynamic companies across various industries. Our mission is to empower businesses by providing exceptional workforce solutions tailored to their needs. We value integrity, collaboration, and innovation, and strive to foster a culture of excellence in our services. Job Title: Snowflake Data Engineer Location: On-site, India Role Responsibilities Design and develop scalable data models in Snowflake. Create and manage ETL processes to integrate data from multiple sources into Snowflake. Perform data migration and validate data integrity throughout the process. Develop SQL queries for data extraction, transformation, and load operations. Collaborate with analytics and data science teams to understand data needs. Optimize performance of data pipelines and Snowflake environments. Ensure best practices for data governance and compliance are followed. Monitor and troubleshoot data issues in Snowflake and provide timely resolutions. Participate in code reviews and maintain documentation for all processes. Stay updated with new features in Snowflake and implement them as necessary. Assist in the training and onboarding of new team members. Provide technical support and guidance to stakeholders. Collaborate with cross-functional teams to define and manage project timelines. Review and analyze business requirements to ensure data solutions align with business objectives. Engage in continuous improvement initiatives within the data architecture. Qualifications Bachelor's degree in Computer Science, Information Technology, or a related field. 3+ years of experience in data engineering or a related role. Strong understanding of Snowflake and its architecture. Proficient in SQL and experience with data warehousing concepts. Hands-on experience with ETL tools and processes. Knowledge of Python or similar programming languages. Ability to design data flows and workflows to optimize data processing. Experience with data modeling and database design. Familiarity with cloud technologies (AWS, Azure, GCP). Excellent problem-solving and analytical skills. Strong communication and teamwork abilities. Ability to manage multiple tasks and projects simultaneously. Detail-oriented with a focus on quality. Knowledge of data security and compliance standards. Experience with data visualization tools is a plus. Skills: data warehousing,data migration,problem solving,data security,data modeling,cloud technologies,sql,data engineer,sql proficiency,data visualization tools,performance tuning,data governance,snowflake,python,etl processes Show more Show less

Posted 4 days ago

Apply

11.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Introduction A career in IBM Software means you’ll be part of a team that transforms our customer’s challenges into solutions. Seeking new possibilities and always staying curious, we are a team dedicated to creating the world’s leading AI-powered, cloud-native software solutions for our customers. Our renowned legacy creates endless global opportunities for our IBMers, so the door is always open for those who want to grow their career. IBM’s product and technology landscape includes Research, Software, and Infrastructure. Entering this domain positions you at the heart of IBM, where growth and innovation thrives. Your Role And Responsibilities Lead the design, development, and deployment of scalable, secure backend systems using Java, J2EE, and GoLang. Architect and implement robust RESTful APIs and microservices aligned with enterprise cloud-native standards. Collaborate closely with DevOps, QA, and frontend teams to deliver end-to-end product functionality. Set coding standards, influence architectural direction, and drive adoption of best practices across backend systems. Own performance tuning, monitoring, and high availability for backend services using tools like Prometheus, ELK, and Grafana. Implement security, compliance, and privacy by design principles in backend systems. Lead incident response and resolution of complex production issues across multi-cloud environments (e.g., AWS, Azure, OCP). Mentor and guide junior developers and contribute to team-wide knowledge sharing and skill development. Actively participate in Agile ceremonies and contribute to continuous delivery and process improvement. Preferred Education Bachelor's Degree Required Technical And Professional Expertise 11+ years of backend software development experience focused on scalable, secure, cloud-native enterprise systems. Deep expertise in Java, J2EE, and GoLang for building distributed backend systems. Advanced experience in architecting and implementing RESTful APIs, service meshes, and inter-service communication. Expert in Postgres or equivalent RDBMS — data modeling, indexing, and performance optimization at scale. Proven track record with microservices architecture, including Docker, Kubernetes, and service deployment patterns. Expert-level familiarity with backend-focused CI/CD tooling (Jenkins, GitLab CI/CD, ArgoCD) and IaC tools (Terraform, CloudFormation). Strong knowledge of monitoring/logging tools such as Prometheus, Grafana, ELK, and Splunk, focusing on backend telemetry and observability. Experience deploying applications on cloud platforms: AWS (EKS, ECS, Lambda, CloudFormation), Azure, or GCP. Familiarity with DevSecOps, secure coding practices, and compliance-aware architecture for regulated environments. Proficient in integration, load, and unit testing using JMeter, RestAssured, JUnit, etc. Leadership in backend architecture, performance tuning, platform modernization, and mentoring of technical teams. Effective cross-functional collaboration skills in multi-team, multi-region environments. Preferred Technical And Professional Experience Deep understanding of backend architecture patterns including microservices, event-driven architecture, and domain-driven design. Experience implementing security and privacy by design principles in cloud-native backend systems. Hands-on expertise with cryptographic protocols and standards such as TLS, FIPS, and experience integrating with Java security frameworks (e.g., JCE, Spring Security). Strong grasp of secure coding practices, with experience identifying and mitigating OWASP Top 10 vulnerabilities. Exposure to designing and developing shared platform services or backend frameworks reused across products or tenants (e.g., in multi-tenant SaaS environments). Familiarity with API security patterns, including OAuth2, JWT, API gateways (e.g., Kong, Apigee). Prior experience working on compliance-oriented systems (e.g., SOC2, HIPAA, FedRAMP) or architecting for high-assurance environments. Proficiency with Shell scripting, Python, or Node.js for infrastructure automation or backend utilities. Show more Show less

Posted 4 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Introduction In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. Your Role And Responsibilities As Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You’ll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you’ll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you’ll tackle obstacles related to database integration and untangle complex, unstructured data sets. In This Role, Your Responsibilities May Include Implementing and validating predictive models as well as creating and maintain statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements Work in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Build teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results Preferred Education Master's Degree Required Technical And Professional Expertise Expertise in designing and implementing scalable data warehouse solutions on Snowflake, including schema design, performance tuning, and query optimization. Strong experience in building data ingestion and transformation pipelines using Talend to process structured and unstructured data from various sources. Proficiency in integrating data from cloud platforms into Snowflake using Talend and native Snowflake capabilities. Hands-on experience with dimensional and relational data modelling techniques to support analytics and reporting requirements Preferred Technical And Professional Experience Understanding of optimizing Snowflake workloads, including clustering keys, caching strategies, and query profiling. Ability to implement robust data validation, cleansing, and governance frameworks within ETL processes. Proficiency in SQL and/or Shell scripting for custom transformations and automation tasks Show more Show less

Posted 4 days ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Job Purpose: The primary purpose of this role is to work on the QA and development work for the Product Processor Technology area. This will involve working closely with the business and SMEs to prioritize business requests, manage the ETL development workslate, QA Automation efforts, provide estimate efforts, and ensure timely delivery on committed items and to project manage all aspects of software development according to the Software Development Lifecycle (SDLC). Job Background/context: The role forms part of Product Processor Development Team in Pune and supports the GTPL application which helps Product Control & Finance Department. GTPL is the global finance product control’s strategic product processor for all cash products and internally traded futures. GTPL will be the one stop shop to enable consistent and granular accounting globally, accepting latest global reference and market data to reduce manual adjustments and cleaner reconciliations. GTPL will continue enabling several global functions like Compliance, Risk including BASEL, Tax and Regulatory Reporting and firm-wide strategic initiatives by being the gateway to 100+ systems Key Responsibilities: Understanding Business Requirements and Functional Requirements provided by Business Analysts and to convert into Technical Design Documents and leading the development team to deliver on those requirements. Leading a Technical Team in Pune supporting GTPL in Product Processor Departments. Ensure projects Plans are created and PTS documentation is up to date. Work closely with Cross Functional Teams e.g. Business Analysis, Product Assurance, Platforms and Infrastructure, Business Office, Controls and Production Support. Prepare handover documents, manage SIT , UAT, automation of Unit Tsting. Identify and proactively resolve issues that could impact system performance, reliability, and usability. Demonstrates an in-depth understanding of how the development function integrates within overall business/technology to achieve objectives; requires a good understanding of the industry. Work proactively & independently to address testing requirements and articulate issues/challenges with enough lead time to address risks Ability to understand complex data problems, analyze and provide generic solutions compatible with existing Infrastructure. Design, Implement, Integrate and test new features. Owns success – Takes responsibility for successful delivery of the solutions. Mentoring other developers on their implementation as needed, and organize review activities like design review, code review and technical document review etc. to ensure successful delivery. Explore existing application systems, determines areas of complexity, potential risks to successful implementation. Contribute to continual improvement by suggesting improvements to software architecture, software development process and new technologies etc. Ability to build relationship with business and technology stakeholders. Knowledge/Experience: 10+ Year Software development and QA t experience. 6+ Year Oracle PL/SQL experience 6+ Year ETL QA Experience (AbInitio or Informatica). Hands on experience in testing complex ETL applications. Development experience in a fast-paced, time-to-market driven environment Experience with test automation, test scenario and test scripts creation and modification Comfortable with writing complex queries Experience with reporting tools. Hands on experience with testing automation tools. Proficiency with Oracle PL/SQL, SQL tuning, writing packages, triggers, functions and procedures. Experience with data conversion / migration Excellent trouble shooting and debugging skills. Worked in Onsite - offshore model. Skills: Strong analytic skills. Excellent communication and internal customer management skills. Excellent written and verbal communication skills. Excellent facilitation skills. Ability to build relationships at all levels. Qualifications: B.E/B.Tech or Master degree in Computer Science or Engineering or related discipline. Competencies: Strong work organization and prioritization capabilities. Takes ownership and accountability for assigned work. Ability to manage multiple activities. Focused and determined in getting the job done right. Ability to identify and manage key risks and issues. Personal maturity and sense of responsibility. Shows drive, integrity, sound judgment, adaptability, creativity, self-awareness and an ability to multitask and prioritize. Sensitive to cultural and background differences and environments Confident and assertive. Values diversity: Demonstrates an appreciation of a diverse workforce. Appreciates differences in style or perspective. ------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Technology Quality ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster. Show more Show less

Posted 4 days ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Company Overview Viraaj HR Solutions is a leading provider of recruitment and human resource management services in India. We specialize in connecting top talent with dynamic companies across various industries. Our mission is to empower businesses by providing exceptional workforce solutions tailored to their needs. We value integrity, collaboration, and innovation, and strive to foster a culture of excellence in our services. Job Title: Snowflake Data Engineer Location: On-site, India Role Responsibilities Design and develop scalable data models in Snowflake. Create and manage ETL processes to integrate data from multiple sources into Snowflake. Perform data migration and validate data integrity throughout the process. Develop SQL queries for data extraction, transformation, and load operations. Collaborate with analytics and data science teams to understand data needs. Optimize performance of data pipelines and Snowflake environments. Ensure best practices for data governance and compliance are followed. Monitor and troubleshoot data issues in Snowflake and provide timely resolutions. Participate in code reviews and maintain documentation for all processes. Stay updated with new features in Snowflake and implement them as necessary. Assist in the training and onboarding of new team members. Provide technical support and guidance to stakeholders. Collaborate with cross-functional teams to define and manage project timelines. Review and analyze business requirements to ensure data solutions align with business objectives. Engage in continuous improvement initiatives within the data architecture. Qualifications Bachelor's degree in Computer Science, Information Technology, or a related field. 3+ years of experience in data engineering or a related role. Strong understanding of Snowflake and its architecture. Proficient in SQL and experience with data warehousing concepts. Hands-on experience with ETL tools and processes. Knowledge of Python or similar programming languages. Ability to design data flows and workflows to optimize data processing. Experience with data modeling and database design. Familiarity with cloud technologies (AWS, Azure, GCP). Excellent problem-solving and analytical skills. Strong communication and teamwork abilities. Ability to manage multiple tasks and projects simultaneously. Detail-oriented with a focus on quality. Knowledge of data security and compliance standards. Experience with data visualization tools is a plus. Skills: data warehousing,data migration,problem solving,data security,data modeling,cloud technologies,sql,data engineer,sql proficiency,data visualization tools,performance tuning,data governance,snowflake,python,etl processes Show more Show less

Posted 4 days ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Company Overview Viraaj HR Solutions is dedicated to connecting top talent with forward-thinking companies. Our mission is to provide exceptional talent acquisition services while fostering a culture of trust, integrity, and collaboration. We prioritize our clients' needs and work tirelessly to ensure the ideal candidate-job match. Join us in our commitment to excellence and become part of a dynamic team focused on driving success for individuals and organizations alike. Role Responsibilities Design, develop, and implement data pipelines using Azure Data Factory. Create and maintain data models for structured and unstructured data. Extract, transform, and load (ETL) data from various sources into data warehouses. Develop analytical solutions and dashboards using Azure Databricks. Perform data integration and migration tasks with Azure tools. Ensure optimal performance and scalability of data solutions. Collaborate with cross-functional teams to understand data requirements. Utilize SQL Server for database management and data queries. Implement data quality checks and ensure data integrity. Work on data governance and compliance initiatives. Monitor and troubleshoot data pipeline issues to ensure reliability. Document data processes and architecture for future reference. Stay current with industry trends and Azure advancements. Train and mentor junior data engineers and team members. Participate in design reviews and provide feedback for process improvements. Qualifications Bachelor's degree in Computer Science, Information Technology, or a related field. 3+ years of experience in a data engineering role. Strong expertise in Azure Data Factory and Azure Databricks. Proficient in SQL for data manipulation and querying. Experience with data warehousing concepts and practices. Familiarity with ETL tools and processes. Knowledge of Python or other programming languages for data processing. Ability to design scalable cloud architecture. Experience with data modeling and database design. Effective communication and collaboration skills. Strong analytical and problem-solving abilities. Familiarity with performance tuning and optimization techniques. Knowledge of data visualization tools is a plus. Experience with Agile methodologies. Ability to work independently and manage multiple tasks. Willingness to learn and adapt to new technologies. Skills: etl,azure databricks,sql server,azure,data governance,azure data factory,python,data warehousing,data engineer,data integration,performance tuning,python scripting,sql,data modeling,data migration,data visualization,analytical solutions,pyspark,agile methodologies,data quality checks Show more Show less

Posted 4 days ago

Apply

0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

Linkedin logo

Position Overview: We are looking for an experienced DevOps Engineer – ServiceNow who will be responsible for enabling automation, continuous integration, and seamless deployment within the ServiceNow ecosystem. This role demands strong technical knowledge of CI/CD pipelines, Infrastructure as Code (IaC), cloud platforms, and API integrations, as well as a working understanding of platform performance optimization and security basics. Key Responsibilities:  Set up and maintain CI/CD pipelines for ServiceNow applications  Automate deployments using Infrastructure as Code (IaC) tools such as Terraform or Ansible  Manage ServiceNow instances: performance tuning, system upgrades, and configuration  Integrate ServiceNow APIs with external tools for end-to-end automation  Handle cloud infrastructure optimization, scaling, and security configurations Technical Skills Required: ServiceNow Scripting & Configuration:  JavaScript, Glide API, Flow Designer, UI Policies CI/CD Tools:  Jenkins, GitLab, Azure DevOps (any) Cloud Platforms (Hands-on with any one):  AWS, Azure, or GCP Infrastructure as Code (IaC):  Terraform  Ansible  CloudFormation Containers & Orchestration (Preferred but not mandatory):  Docker  Kubernetes  Helm API Integrations:  REST  SOAP Monitoring Tools (Any exposure is beneficial):  Splunk  ELK Stack  Prometheus Security & Networking Basics:  VPN  Firewalls  Access Control Soft Skills:  Strong troubleshooting and debugging abilities  Clear and effective communication skills  Ability to work collaboratively in Agile environments  High attention to detail and commitment to quality Show more Show less

Posted 4 days ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

This role has been designed as ‘Hybrid’ with an expectation that you will work on average 2 days per week from an HPE office. Who We Are Hewlett Packard Enterprise is the global edge-to-cloud company advancing the way people live and work. We help companies connect, protect, analyze, and act on their data and applications wherever they live, from edge to cloud, so they can turn insights into outcomes at the speed required to thrive in today’s complex world. Our culture thrives on finding new and better ways to accelerate what’s next. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good. If you are looking to stretch and grow your career our culture will embrace you. Open up opportunities with HPE. Job Description Aruba is an HPE Company, and a leading provider of next-generation network access solutions for the mobile enterprise. Helping some of the largest companies in the world modernize their networks to meet the demands of a digital future, Aruba is redefining the “Intelligent Edge” – and creating new customer experiences across intelligent spaces and digital workspaces. Join us redefine what’s next for you. Job Family Definition We are looking for a highly skilled HPE Aruba Senior Network Engineer – Premium Support to provide expert-level technical assistance for premium enterprise customers using HPE Aruba WLAN (AOS8 & AOS10) and HPE Aruba Central. This role requires deep expertise in troubleshooting, optimizing, and supporting HPE Aruba Mobility Controllers, HPE Aruba Access Points, ClearPass, and HPE Aruba Central in mission-critical environments. The ideal candidate will be a trusted technical advisor, ensuring seamless network performance, stability, and security for high-profile clients. What You'll Do Deliver premium support services to enterprise customers using HPE Aruba WLAN AOS8/AOS10 and HPE Aruba Central. Troubleshoot and resolve complex wireless performance, authentication, and roaming issues. Provide advanced technical guidance, best practices, and proactive recommendations for HPE Aruba WLAN environments. Work closely with customers to diagnose issues in HPE Aruba Mobility Controllers, APs, ClearPass (NAC), and cloud-managed networks in HPE Aruba Central. Perform Wi-Fi site analysis, RF tuning, and optimization for high-density environments. Analyze logs, packet captures, and HPE Aruba telemetry data to identify root causes of network issues. Collaborate with Engineering teams and product management team for escalations and fixes. Monitor and maintain customer networks using HPE Aruba Central, AirWave, and AI-driven analytics. Assist with firmware upgrades, security patching, and HPE ArubaOS version migrations. Document troubleshooting steps, resolutions, and best practices for internal teams and customers. Conduct customer training sessions to improve operational efficiency and best practices. What You Need To Bring Required Qualifications & Experience: 5+ years of hands-on experience supporting and troubleshooting HPE Aruba WLAN solutions in large-scale enterprise environments. Expert knowledge of 802.11 standards, RF design, and enterprise Wi-Fi troubleshooting. Hands-on experience with HPE Aruba Mobility Controllers (AOS 8 & AOS 10), HPE Aruba APs, and HPE Aruba Central. Strong understanding of VLANs, QoS, AAA (RADIUS/TACACS), firewall rules, and network security best practices. Experience working in a high-pressure premium support environment with enterprise customers. Excellent problem-solving, analytical, and communication skills. Preferred Certifications HPE Aruba Certified Professional (ACP) HPE Aruba Certified ClearPass Professional (ACCP) HPE Aruba Certified Mobility Professional (ACMP) HPE Aruba Certified Network Security Expert (ACNX) HPE Aruba Certified Switching Professional (ACSP) Nice To Have Experience with HPE Aruba SD-Branch, AI-driven networking, and automation. Familiarity with Python, Ansible, or scripting for network automation. HPE Aruba Certified Mobility Expert (ACMX) certification. HPE Aruba Senior Network Engineer – Premium Support - Deliver premium support services to enterprise customers Additional Skills Cloud Architectures, Cross Domain Knowledge, Design Thinking, Development Fundamentals, DevOps, Distributed Computing, Microservices Fluency, Full Stack Development, Security-First Mindset, User Experience (UX) What We Can Offer You Health & Wellbeing We strive to provide our team members and their loved ones with a comprehensive suite of benefits that supports their physical, financial and emotional wellbeing. Personal & Professional Development We also invest in your career because the better you are, the better we all are. We have specific programs catered to helping you reach any career goals you have — whether you want to become a knowledge expert in your field or apply your skills to another division. Unconditional Inclusion We are unconditionally inclusive in the way we work and celebrate individual uniqueness. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good. Let's Stay Connected Follow @HPECareers on Instagram to see the latest on people, culture and tech at HPE. #india #aruba Job Engineering Job Level TCP_03 HPE is an Equal Employment Opportunity/ Veterans/Disabled/LGBT employer. We do not discriminate on the basis of race, gender, or any other protected category, and all decisions we make are made on the basis of qualifications, merit, and business need. Our goal is to be one global team that is representative of our customers, in an inclusive environment where we can continue to innovate and grow together. Please click here: Equal Employment Opportunity. Hewlett Packard Enterprise is EEO Protected Veteran/ Individual with Disabilities. HPE will comply with all applicable laws related to employer use of arrest and conviction records, including laws requiring employers to consider for employment qualified applicants with criminal histories. Show more Show less

Posted 4 days ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Linkedin logo

Gurgaon/Bangalore, India AXA XL recognizes data and information as critical business assets, both in terms of managing risk and enabling new business opportunities. This data should not only be high quality, but also actionable - enabling AXA XL’s executive leadership team to maximize benefits and facilitate sustained industrious advantage. Our Chief Data Office also known as our Innovation, Data Intelligence & Analytics team (IDA) is focused on driving innovation through optimizing how we leverage data to drive strategy and create a new business model - disrupting the insurance market. As we develop an enterprise-wide data and digital strategy that moves us toward greater focus on the use of data and data-driven insights, we are seeking an Data Engineer. The role will support the team’s efforts towards creating, enhancing, and stabilizing the Enterprise data lake through the development of the data pipelines. This role requires a person who is a team player and can work well with team members from other disciplines to deliver data in an efficient and strategic manner . What You’ll Be DOING What will your essential responsibilities include? Act as a data engineering expert and partner to Global Technology and data consumers in controlling complexity and cost of the data platform, whilst enabling performance, governance, and maintainability of the estate. Understand current and future data consumption patterns, architecture (granular level), partner with Architects to ensure optimal design of data layers. Apply best practices in Data architecture. For example, balance between materialization and virtualization, optimal level of de-normalization, caching and partitioning strategies, choice of storage and querying technology, performance tuning. Leading and hands-on execution of research into new technologies. Formulating frameworks for assessment of new technology vs business benefit, implications for data consumers. Act as a best practice expert, blueprint creator of ways of working such as testing, logging, CI/CD, observability, release, enabling rapid growth in data inventory and utilization of Data Science Platform. Design prototypes and work in a fast-paced iterative solution delivery model. Design, Develop and maintain ETL pipelines using Pyspark in Azure Databricks using delta tables. Use Harness for deployment pipeline. Monitor Performance of ETL Jobs, resolve any issue that arose and improve the performance metrics as needed. Diagnose system performance issue related to data processing and implement solution to address them. Collaborate with other teams to ensure successful integration of data pipelines into larger system architecture requirement. Maintain integrity and quality across all pipelines and environments. Understand and follow secure coding practice to make sure code is not vulnerable. You will report to Technical Lead. What You Will BRING We’re looking for someone who has these abilities and skills: Required Skills And Abilities Effective Communication skills. Bachelor’s degree in computer science, Mathematics, Statistics, Finance, related technical field, or equivalent work experience. Relevant years of extensive work experience in various data engineering & modeling techniques (relational, data warehouse, semi-structured, etc.), application development, advanced data querying skills. Relevant years of programming experience using Databricks. Relevant years of experience using Microsoft Azure suite of products (ADF, synapse and ADLS). Solid knowledge on network and firewall concepts. Solid experience writing, optimizing and analyzing SQL. Relevant years of experience with Python. Ability to break complex data requirements and architect solutions into achievable targets. Robust familiarity with Software Development Life Cycle (SDLC) processes and workflow, especially Agile. Experience using Harness. Technical lead responsible for both individual and team deliveries. Desired Skills And Abilities Worked in big data migration projects. Worked on performance tuning both at database and big data platforms. Ability to interpret complex data requirements and architect solutions. Distinctive problem-solving and analytical skills combined with robust business acumen. Excellent basics on parquet files and delta files. Effective Knowledge of Azure cloud computing platform. Familiarity with Reporting software - Power BI is a plus. Familiarity with DBT is a plus. Passion for data and experience working within a data-driven organization. You care about what you do, and what we do. Who WE are AXA XL, the P&C and specialty risk division of AXA, is known for solving complex risks. For mid-sized companies, multinationals and even some inspirational individuals we don’t just provide re/insurance, we reinvent it. How? By combining a comprehensive and efficient capital platform, data-driven insights, leading technology, and the best talent in an agile and inclusive workspace, empowered to deliver top client service across all our lines of business − property, casualty, professional, financial lines and specialty. With an innovative and flexible approach to risk solutions, we partner with those who move the world forward. Learn more at axaxl.com What we OFFER Inclusion AXA XL is committed to equal employment opportunity and will consider applicants regardless of gender, sexual orientation, age, ethnicity and origins, marital status, religion, disability, or any other protected characteristic. At AXA XL, we know that an inclusive culture and enables business growth and is critical to our success. That’s why we have made a strategic commitment to attract, develop, advance and retain the most inclusive workforce possible, and create a culture where everyone can bring their full selves to work and reach their highest potential. It’s about helping one another — and our business — to move forward and succeed. Five Business Resource Groups focused on gender, LGBTQ+, ethnicity and origins, disability and inclusion with 20 Chapters around the globe. Robust support for Flexible Working Arrangements Enhanced family-friendly leave benefits Named to the Diversity Best Practices Index Signatory to the UK Women in Finance Charter Learn more at axaxl.com/about-us/inclusion-and-diversity. AXA XL is an Equal Opportunity Employer. Total Rewards AXA XL’s Reward program is designed to take care of what matters most to you, covering the full picture of your health, wellbeing, lifestyle and financial security. It provides competitive compensation and personalized, inclusive benefits that evolve as you do. We’re committed to rewarding your contribution for the long term, so you can be your best self today and look forward to the future with confidence. Sustainability At AXA XL, Sustainability is integral to our business strategy. In an ever-changing world, AXA XL protects what matters most for our clients and communities. We know that sustainability is at the root of a more resilient future. Our 2023-26 Sustainability strategy, called “Roots of resilience”, focuses on protecting natural ecosystems, addressing climate change, and embedding sustainable practices across our operations. Our Pillars Valuing nature: How we impact nature affects how nature impacts us. Resilient ecosystems - the foundation of a sustainable planet and society - are essential to our future. We’re committed to protecting and restoring nature - from mangrove forests to the bees in our backyard - by increasing biodiversity awareness and inspiring clients and colleagues to put nature at the heart of their plans. Addressing climate change: The effects of a changing climate are far-reaching and significant. Unpredictable weather, increasing temperatures, and rising sea levels cause both social inequalities and environmental disruption. We're building a net zero strategy, developing insurance products and services, and mobilizing to advance thought leadership and investment in societal-led solutions. Integrating ESG: All companies have a role to play in building a more resilient future. Incorporating ESG considerations into our internal processes and practices builds resilience from the roots of our business. We’re training our colleagues, engaging our external partners, and evolving our sustainability governance and reporting. AXA Hearts in Action: We have established volunteering and charitable giving programs to help colleagues support causes that matter most to them, known as AXA XL’s “Hearts in Action” programs. These include our Matching Gifts program, Volunteering Leave, and our annual volunteering day - the Global Day of Giving. For more information, please see axaxl.com/sustainability. Show more Show less

Posted 4 days ago

Apply

3.0 - 6.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Linkedin logo

dunnhumby is the global leader in Customer Data Science, empowering businesses everywhere to compete and thrive in the modern data-driven economy. We always put the Customer First. Our mission: to enable businesses to grow and reimagine themselves by becoming advocates and champions for their Customers. With deep heritage and expertise in retail – one of the world’s most competitive markets, with a deluge of multi-dimensional data – dunnhumby today enables businesses all over the world, across industries, to be Customer First. dunnhumby employs nearly 2,500 experts in offices throughout Europe, Asia, Africa, and the Americas working for transformative, iconic brands such as Tesco, Coca-Cola, Meijer, Procter & Gamble and Metro. Most companies try to meet expectations, dunnhumby exists to defy them. Using big data, deep expertise and AI-driven platforms to decode the 21st century human experience – then redefine it in meaningful and surprising ways that put customers first. Across digital, mobile and retail. For brands like Tesco, Coca-Cola, Procter & Gamble and PepsiCo. We’re looking for a Big Data Engineer to join our dh strategic team of Loyality and Personalisation which builds products the retailer can use to find the optimal customer segments and send personalised offers and digital recommendations to the consumer. These products are strategic assets to retailer to improve the loyality of their consumer. by that these products are very important for retailers and therefore for dunnhumby What We Expect From You: 3 to 6 years of experience in software development using Python. - Hands on experience in Python OOPS, Design patterns, Dependency Injection, data libraries(Panda), data structures - Exposure to Spark: PySpark, Architecture of Spark, Best practices to optimize jobs - Experience in Hadoop ecosystem: HDFS, Hive, or YARN - Experience of Orchestration tools: Airflow, Argo workflows, Kubernetes - Experience on Cloud native services(GCP/Azure/AWS) preferable GCP - Database knowledge of: SQL, NoSQL - Hands on expsoure to CI/CD pipelines for data engineering workflows - Testing: pytest for unit testing. pytest-spark to create a test Spark Session. Spark UI for Performance tuning & monitoring Good To Have: - Scala What You Can Expect From Us We won’t just meet your expectations. We’ll defy them. So you’ll enjoy the comprehensive rewards package you’d expect from a leading technology company. But also, a degree of personal flexibility you might not expect. Plus, thoughtful perks, like flexible working hours and your birthday off. You’ll also benefit from an investment in cutting-edge technology that reflects our global ambition. But with a nimble, small-business feel that gives you the freedom to play, experiment and learn. And we don’t just talk about diversity and inclusion. We live it every day – with thriving networks including dh Gender Equality Network, dh Proud, dh Family, dh One and dh Thrive as the living proof. Everyone’s invited. Our approach to Flexible Working At dunnhumby, we value and respect difference and are committed to building an inclusive culture by creating an environment where you can balance a successful career with your commitments and interests outside of work. We believe that you will do your best at work if you have a work / life balance. Some roles lend themselves to flexible options more than others, so if this is important to you please raise this with your recruiter, as we are open to discussing agile working opportunities during the hiring process. For further information about how we collect and use your personal information please see our Privacy Notice which can be found (here) What You Can Expect From Us We won’t just meet your expectations. We’ll defy them. So you’ll enjoy the comprehensive rewards package you’d expect from a leading technology company. But also, a degree of personal flexibility you might not expect. Plus, thoughtful perks, like flexible working hours and your birthday off. You’ll also benefit from an investment in cutting-edge technology that reflects our global ambition. But with a nimble, small-business feel that gives you the freedom to play, experiment and learn. And we don’t just talk about diversity and inclusion. We live it every day – with thriving networks including dh Gender Equality Network, dh Proud, dh Family, dh One, dh Enabled and dh Thrive as the living proof. We want everyone to have the opportunity to shine and perform at your best throughout our recruitment process. Please let us know how we can make this process work best for you. Our approach to Flexible Working At dunnhumby, we value and respect difference and are committed to building an inclusive culture by creating an environment where you can balance a successful career with your commitments and interests outside of work. We believe that you will do your best at work if you have a work / life balance. Some roles lend themselves to flexible options more than others, so if this is important to you please raise this with your recruiter, as we are open to discussing agile working opportunities during the hiring process. For further information about how we collect and use your personal information please see our Privacy Notice which can be found (here) Show more Show less

Posted 4 days ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

This job is with Amazon, an inclusive employer and a member of myGwork – the largest global platform for the LGBTQ+ business community. Please do not contact the recruiter directly. Description The recruiter will be responsible for all levels of talent acquisition, recruiting, and recruitment programs, procedures, and plans. Serve as consultant and partner staying current on business and market trends, assisting on both the strategic and tactical level. Possesses strong understanding of client needs and hiring conditions external and internal. Provides advice, expertise and assistance to all levels of personnel both internal and external on various recruiting/talent acquisition related issues. Serves as trusted member of Global HR organization driving great partnerships with internal & external customers. Provides sourcing and candidate generation to hiring managers across multiple teams, organizations and locations. To be great in this role the candidate must be able to successfully manage, prioritize and close searches against a timeline and have experience setting benchmarks, metrics, and understand how to prioritize to hit all customer SLAs. They thrive in an innovative, fast-paced environment, can roll up their sleeves, work hard, have fun, and get the job done. Key job responsibilities Partner with hiring teams to build effective sourcing, assessment, and closing approaches with an ability to manage customer/partner expectations through a deep understanding of return on investment. Be able to recruit passive candidates and possess the mentality to "profile people and gauge chemistry of candidates for fit and understand their motivation" rather than sell a role. Possess strong ability to screen interview candidates within the framework of the position specifications and prepare an ideal candidate slate within an appropriate and consistent timeline. Build and maintain network of potential candidates through pro-active market research and on-going relationship management; conducts in-depth interviews of potential candidates, demonstrating ability to anticipate hiring manager preferences through high offer-to-interview ratios. Communicate effectively with the hiring manager and interview team to ensure preparedness during the interview process. Share and exchange information with all levels of management. Recommend ideas and strategies related to recruitment that will contribute to the long-range growth of the company, implementing any new processes and fine-tuning standard processes for recruiting that fits within Amazon's mission to deliver the highest quality results to the customer. Provide a great candidate experience and act as a candidate advocate. Articulate in writing a plan with deliverable, timelines and a formal tracking process. Participate in special projects/recruiting initiatives including assessment of best practices in interviewing techniques, leveraging of internal sources of talent and identification of top performers for senior-level openings. Basic Qualifications Graduate from a reputable university. 0-4 years of prior work experience Working knowledge of Social media recruiting and should be updated with current market trends. Ability to source Business/tech talent, with prior experience to hire for roles like Product, Program, Sales, Tech Engineering and senior leadership roles. Experience managing and prioritizing multiple searches, projects and client relationships. Preferred Qualifications Analytic skills with ability to create, measure, and scale the right workflow between candidates, hiring managers, and the recruiting team. Strong consulting skills and demonstrated ability to work in a team environment, as a team leader and member. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner. Show more Show less

Posted 4 days ago

Apply

300.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

GG11 Software Eng Senior Software Engineer (SQL Database) LSEG (London Stock Exchange Group) is more than a diversified global financial markets infrastructure and data business. We are dedicated, open-access partners with a dedication to excellence in delivering the services our customers expect from us. With extensive experience, deep knowledge and worldwide presence across financial markets, we enable businesses and economies around the world to fund innovation, manage risk and create jobs. It’s how we’ve contributed to supporting the financial stability and growth of communities and economies globally for more than 300 years. Through a comprehensive suite of trusted financial market infrastructure services – and our open-access model – we provide the flexibility, stability and trust that enable our customers to pursue their ambitions with confidence and clarity. LSEG is headquartered in the United Kingdom, with significant operations in 70 countries across EMEA, North America, Latin America and Asia Pacific. We employ 25,000 people globally, more than half located in Asia Pacific. Role Description: We are looking for enthusiastic, talent software engineers to join the team as we strive to deliver excellence to our customers. You're a self-starter who will join our empowered agile team working on the LSEG T1 product suite. LSEG T1 is a solution for North American Wealth advisors with over 100,000 installations at some of the largest Wealth Advisors in the United States and Canada Whilst your background will be in database software development, your curiosity, desire to learn and passion for technology means you can the job done. Quality is non-negotiable, so a good focus on code quality, unit testing and enabling automated testing is important. You will be responsible for the analysis, definition, design, construction, testing, installation, modification, and maintenance of efficient, reusable, reliable and secure code based on User Stories and software designs, working within a multi-functional agile team. Roles and Responsibilities You will build and maintain efficient, reusable, reliable and SQL code based on User Stories and software designs. Working within a multi-functional agile team, you'll develop enterprise software, adhering to company standards and established software methodology. Demonstrating a consistent focus on quality, you'll ensure that your team delivers reliable, robust software through the creation and execution of automated tests in conjunction with the team’s quality engineers. Through agile retrospectives and reviews, you will inspect and adapt, finding innovative ways to make your team work more effectively. Through participation in refinement and planning sessions, you will work with other team members to analyse development requirements, provide design options and complexity estimates, and agree how to deliver the requirements. Actively participating in agile meetings, you will give timely status updates on areas for which you are responsible. Provide technical support to operations or other development teams Create, review, and maintain all required documentation to ensure supportability and reuse Assist with improvements to prevent problems, including problem resolution workflow. You will actively participate in team and status meetings, providing timely updates for areas of individual responsibilities within the project Requirements Proven experience (10+ years) as a Senior Software Engineer with project experience leading the implementation of sophisticated software deliverables. Experience using Microsoft SQL Server (2014/2016/2019) stored procedures, SQL Server Reporting Services (SSRS), and Microsoft Visual Studio. Experience designing and developing SQL Server Integration Services Packages Experience with SQL Error Handling, SQL Views, SQL Triggers, SQL Stored Procedures and Functions Experience writing SQL on one or more relational databases such as Oracle or SQL Server Experience with logical and physical data modelling, and database design Experience working in Agile SDLC to deliver iterative value to business. Experience with Test Driven Development and / or Behaviour Driven Development. Knowledge of software design patterns with solid technical background and understanding of programming styles, frameworks, and different software testing scenarios. Skill for writing clean, readable code and reusable components while being able to communication the implementation of those components to both technology and business/production teams. Database performance tuning skills Working experience in different cloud environments Source Repositories (GIT) and associated pipeline development (Jenkins/Gitlab). Agile development experience is strong plus Good working knowledge on SNOW, Jira and Confluence tools. Strong understanding of infrastructure, very good problem-solving skills Outstanding analytical, problem-solving, and communication skills. Hard-working, flexible, and innovative. Fair understanding of Incident, Problem, Change and Release management processes. Nice-to-haves Financial markets/banking experience Experience with designing and architecting solutions in Cloud environments. Strong understanding of security and compliance managements. LSEG is a leading global financial markets infrastructure and data provider. Our purpose is driving financial stability, empowering economies and enabling customers to create sustainable growth. Our purpose is the foundation on which our culture is built. Our values of Integrity, Partnership, Excellence and Change underpin our purpose and set the standard for everything we do, every day. They go to the heart of who we are and guide our decision making and everyday actions. Working with us means that you will be part of a dynamic organisation of 25,000 people across 65 countries. However, we will value your individuality and enable you to bring your true self to work so you can help enrich our diverse workforce. You will be part of a collaborative and creative culture where we encourage new ideas and are committed to sustainability across our global business. You will experience the critical role we have in helping to re-engineer the financial ecosystem to support and drive sustainable economic growth. Together, we are aiming to achieve this growth by accelerating the just transition to net zero, enabling growth of the green economy and creating inclusive economic opportunity. LSEG offers a range of tailored benefits and support, including healthcare, retirement planning, paid volunteering days and wellbeing initiatives. We are proud to be an equal opportunities employer. This means that we do not discriminate on the basis of anyone’s race, religion, colour, national origin, gender, sexual orientation, gender identity, gender expression, age, marital status, veteran status, pregnancy or disability, or any other basis protected under applicable law. Conforming with applicable law, we can reasonably accommodate applicants' and employees' religious practices and beliefs, as well as mental health or physical disability needs. Please take a moment to read this privacy notice carefully, as it describes what personal information London Stock Exchange Group (LSEG) (we) may hold about you, what it’s used for, and how it’s obtained, your rights and how to contact us as a data subject. If you are submitting as a Recruitment Agency Partner, it is essential and your responsibility to ensure that candidates applying to LSEG are aware of this privacy notice. LSEG is a leading global financial markets infrastructure and data provider. Our purpose is driving financial stability, empowering economies and enabling customers to create sustainable growth. Our purpose is the foundation on which our culture is built. Our values of Integrity, Partnership, Excellence and Change underpin our purpose and set the standard for everything we do, every day. They go to the heart of who we are and guide our decision making and everyday actions. Working with us means that you will be part of a dynamic organisation of 25,000 people across 65 countries. However, we will value your individuality and enable you to bring your true self to work so you can help enrich our diverse workforce. You will be part of a collaborative and creative culture where we encourage new ideas and are committed to sustainability across our global business. You will experience the critical role we have in helping to re-engineer the financial ecosystem to support and drive sustainable economic growth. Together, we are aiming to achieve this growth by accelerating the just transition to net zero, enabling growth of the green economy and creating inclusive economic opportunity. LSEG offers a range of tailored benefits and support, including healthcare, retirement planning, paid volunteering days and wellbeing initiatives. We are proud to be an equal opportunities employer. This means that we do not discriminate on the basis of anyone’s race, religion, colour, national origin, gender, sexual orientation, gender identity, gender expression, age, marital status, veteran status, pregnancy or disability, or any other basis protected under applicable law. Conforming with applicable law, we can reasonably accommodate applicants' and employees' religious practices and beliefs, as well as mental health or physical disability needs. Please take a moment to read this privacy notice carefully, as it describes what personal information London Stock Exchange Group (LSEG) (we) may hold about you, what it’s used for, and how it’s obtained, your rights and how to contact us as a data subject. LSEG is a leading global financial markets infrastructure and data provider. Our purpose is driving financial stability, empowering economies and enabling customers to create sustainable growth. Our purpose is the foundation on which our culture is built. Our values of Integrity, Partnership , Excellence and Change underpin our purpose and set the standard for everything we do, every day. They go to the heart of who we are and guide our decision making and everyday actions. Working with us means that you will be part of a dynamic organisation of 25,000 people across 65 countries. However, we will value your individuality and enable you to bring your true self to work so you can help enrich our diverse workforce. You will be part of a collaborative and creative culture where we encourage new ideas and are committed to sustainability across our global business. You will experience the critical role we have in helping to re-engineer the financial ecosystem to support and drive sustainable economic growth. Together, we are aiming to achieve this growth by accelerating the just transition to net zero, enabling growth of the green economy and creating inclusive economic opportunity. LSEG offers a range of tailored benefits and support, including healthcare, retirement planning, paid volunteering days and wellbeing initiatives. We are proud to be an equal opportunities employer. This means that we do not discriminate on the basis of anyone’s race, religion, colour, national origin, gender, sexual orientation, gender identity, gender expression, age, marital status, veteran status, pregnancy or disability, or any other basis protected under applicable law. Conforming with applicable law, we can reasonably accommodate applicants' and employees' religious practices and beliefs, as well as mental health or physical disability needs. Please take a moment to read this privacy notice carefully, as it describes what personal information London Stock Exchange Group (LSEG) (we) may hold about you, what it’s used for, and how it’s obtained, your rights and how to contact us as a data subject. If you are submitting as a Recruitment Agency Partner, it is essential and your responsibility to ensure that candidates applying to LSEG are aware of this privacy notice. Show more Show less

Posted 4 days ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Description We’re seeking a hands-on AI/ML Engineer with deep expertise in large language models, retrieval-augmented generation (RAG), and cloud-native ML development on AWS. You'll be a key driver in building scalable, intelligent learning systems powered by cutting-edge AI and robust AWS infrastructure. If you’re passionate about combining NLP, deep learning, and real-world application at scale—this is the role for you. 4+ years of specialized experience in AI/ML is required. Core Skills & Technologies LLM Ecosystem & APIs • OpenAI, Anthropic, Cohere • Hugging Face Transformers • LangChain, LlamaIndex (RAG orchestration) Vector Databases & Indexing • FAISS, Pinecone, Weaviate AWS-Native & ML Tooling • Amazon SageMaker (training, deployment, pipelines) • AWS Lambda (event-driven workflows) • Amazon Bedrock (foundation model access) • Amazon S3 (data lakes, model storage) • AWS Step Functions (workflow orchestration) • AWS API Gateway & IAM (secure ML endpoints) • CloudWatch, Athena, DynamoDB (monitoring, analytics, structured storage) Languages & ML Frameworks • Python (primary), PyTorch, TensorFlow • NLP, RAG systems, embeddings, prompt engineering What You’ll Do • Model Development & Tuning o Designs architecture for complex AI systems and makes strategic technical decisions o Evaluates and selects appropriate frameworks, techniques, and approaches o Fine-tune and deploy LLMs and custom models using AWS SageMaker o Build RAG pipelines with LlamaIndex/LangChain and vector search engines • Scalable AI Infrastructure o Architect distributed model training and inference pipelines on AWS o Design secure, efficient ML APIs with Lambda, API Gateway, and IAM • Product Integration o Leads development of novel solutions to challenging problems o Embed intelligent systems (tutoring agents, recommendation engines) into learning platforms using Bedrock, SageMaker, and AWS-hosted endpoints • Rapid Experimentation o Prototype multimodal and few-shot learning workflows using AWS services o Automate experimentation and A/B testing with Step Functions and SageMaker Pipelines • Data & Impact Analysis o Leverage S3, Athena, and CloudWatch to define metrics and continuously optimize AI performance • Cross-Team Collaboration o Work closely with educators, designers, and engineers to deliver AI features that enhance student learning o Mentors junior engineers and provides technical leadership Who You Are • Deeply Technical: Strong foundation in machine learning, deep learning, and NLP/LLMs • AWS-Fluent: Extensive experience with AWS ML services (especially SageMaker, Lambda, and Bedrock) • Product-Minded: You care about user experience and turning ML into real-world value • Startup-Savvy: Comfortable with ambiguity, fast iterations, and wearing many hats • Mission-Aligned: Passionate about education, human learning, and AI for good Bonus Points • Hands-on experience fine-tuning LLMs or building agentic systems using AWS • Open-source contributions in AI/ML or NLP communities • Familiarity with AWS security best practices (IAM, VPC, private endpoints) Show more Show less

Posted 4 days ago

Apply

5.0 - 8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

This role has been designed as ‘’Onsite’ with an expectation that you will primarily work from an HPE office. Who We Are Hewlett Packard Enterprise is the global edge-to-cloud company advancing the way people live and work. We help companies connect, protect, analyze, and act on their data and applications wherever they live, from edge to cloud, so they can turn insights into outcomes at the speed required to thrive in today’s complex world. Our culture thrives on finding new and better ways to accelerate what’s next. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good. If you are looking to stretch and grow your career our culture will embrace you. Open up opportunities with HPE. Job Description In the HPE Hybrid Cloud , we lead the innovation agenda and technology roadmap for all of HPE. This includes managing the design, development, and product portfolio of our next-generation cloud platform, Green Lake. Working with customers, we help them reimagine their information technology needs to deliver a simple, consumable solution that helps them drive their business results. Join us redefine what’s next for you. Job Family Definition The Cloud Developer builds from the ground up to meet the needs of mission-critical applications, and is always looking for innovative approaches to deliver end-to-end technical solutions to solve customer problems. Brings technical thinking to break down complex data and to engineer new ideas and methods for solving, prototyping, designing, and implementing cloud-based solutions. Collaborates with project managers and development partners to ensure effective and efficient delivery, deployment, operation, monitoring, and support of Cloud engagements. The Cloud Developer provides business value expertise to drive the development of innovative service offerings that enrich HPE's Cloud Services portfolio across multiple systems, platforms, and applications. Contributions include applying developed subject matter expertise to solve common and sometimes complex technical problems and recommending alternatives where necessary. Might act as project lead and provide assistance to lower level professionals. Exercises independent judgment and consults with others to determine best method for accomplishing work and achieving objectives. What You Will Do Design, develop, and maintain scalable backend solutions using Java, Spring Boot, and Microservices. Build and deploy REST-based stateless APIs with performance and reliability in mind. Work with modern cloud-native application architectures, containers, and orchestration platforms like Docker and Kubernetes. Collaborate with cross-functional teams to gather requirements, design systems, and deliver high-impact features. Apply DevOps practices: implement CI/CD pipelines, Infrastructure as Code, and container-based deployments. Perform database design, schema optimization, and data access layer implementation using Java-based tools. Ensure secure coding practices and apply security concepts when building distributed applications. Use profiling and performance tuning tools to optimize application behavior and throughput. Actively contribute to design reviews, code reviews, and documentation processes. Must have knowledge on Copilot prompting to get work done. (Able to provide the context to the copilot and get the required things to be done) What You Will Need 5-8 years of backend development experience. Strong programming expertise in Java; working knowledge of Python, Golang, or JavaScript is a plus. Deep understanding of distributed systems, event-driven architecture, and system performance optimization. Experience with REST APIs, multi-threading, caching strategies, and data modeling. Hands-on experience with Docker, Kubernetes, and CI/CD tools like Jenkins or GitHub Actions. Experience with cloud-native development and familiarity with services on AWS, Azure, or GCP. Familiarity with code versioning (Git) and profiling/debugging tools. Excellent communication skills and the ability to adapt to fast-changing environments. Solid understanding of Agile development processes. Desired Experience (Nice To Have) Domain knowledge in process automation platforms. Familiarity with low-code/no-code platforms for workflow automation. Experience with performance monitoring, alerting, and diagnostics in production environments. Ability to contribute to architectural discussions and align with enterprise tech strategy. Proficiency in documenting best practices and design standards for reusable solutions. Experience in Full Stack Development and knowledge of frontend frameworks. Expertise in building and consuming REST APIs using Swagger, Postman, and OpenAPI specs. Prior exposure to workflow automation tools such as Camunda, and experience with process analysis and optimization. Understanding of object-oriented programming, exception handling, and design patterns. Experience with enterprise integration patterns and deploying solutions in global-scale environments. Engage in process analysis and optimization using workflow automation tools like Camunda (or similar). Soft Skills Self-starter with a proactive attitude and a willingness to learn. Strong communication and collaboration across global and cross-functional teams. Critical thinking and ability to troubleshoot complex distributed systems. Enthusiastic about automation, scalability, and clean architecture. Additional Skills Cloud Architectures, Cross Domain Knowledge, Design Thinking, Development Fundamentals, DevOps, Distributed Computing, Microservices Fluency, Full Stack Development, Release Management, Security-First Mindset, User Experience (UX) What We Can Offer You Health & Wellbeing We strive to provide our team members and their loved ones with a comprehensive suite of benefits that supports their physical, financial and emotional wellbeing. Personal & Professional Development We also invest in your career because the better you are, the better we all are. We have specific programs catered to helping you reach any career goals you have — whether you want to become a knowledge expert in your field or apply your skills to another division. Unconditional Inclusion We are unconditionally inclusive in the way we work and celebrate individual uniqueness. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good. Let's Stay Connected Follow @HPECareers on Instagram to see the latest on people, culture and tech at HPE. #india #hybridcloud Job Engineering Job Level TCP_03 HPE is an Equal Employment Opportunity/ Veterans/Disabled/LGBT employer. We do not discriminate on the basis of race, gender, or any other protected category, and all decisions we make are made on the basis of qualifications, merit, and business need. Our goal is to be one global team that is representative of our customers, in an inclusive environment where we can continue to innovate and grow together. Please click here: Equal Employment Opportunity. Hewlett Packard Enterprise is EEO Protected Veteran/ Individual with Disabilities. HPE will comply with all applicable laws related to employer use of arrest and conviction records, including laws requiring employers to consider for employment qualified applicants with criminal histories. Show more Show less

Posted 4 days ago

Apply

0 years

0 Lacs

India

Remote

Linkedin logo

🌐 Web Development Intern (Remote ) Duration: 3 to 6 Months Mode: Remote Stipend: Unpaid with potential to convert into full time opportunity 🚀 About Coreline Solutions At Coreline Solutions, we are passionate about technology, innovation, and empowering the next generation of developers. As a digital transformation and consulting firm, we specialize in web development, data intelligence, and scalable software solutions. We are committed to nurturing talent by offering meaningful learning experiences in a collaborative and supportive environment. 🔗 Website: https://corelinesolutions.in/ 📧 Contact: hello@corelinesolutions.in 💻 About the Internship We're looking for motivated and creative individuals to join us as Web Development Interns. This role is ideal for students or recent graduates seeking hands-on experience in frontend and backend development using real projects, mentorship, and modern technologies. 🔧 Key Responsibilities Assist in developing responsive, user-friendly websites and web applications Use HTML, CSS, JavaScript, and modern frameworks (e.g., React.js, Vue.js, Bootstrap) Collaborate with backend developers to integrate APIs and dynamic content Support bug fixing, performance tuning, and cross-browser testing Participate in regular code reviews and team meetings Contribute to project documentation and updates ✅ What We’re Looking For Students or recent graduates in Computer Science, Information Technology, or a related field Solid understanding of HTML, CSS, and JavaScript Basic familiarity with Git and version control workflows Strong problem-solving and organizational skills Self-motivated with a desire to learn and grow in a team setting --- 🌟 Preferred (Bonus) Skills Knowledge of React.js, Next.js, Tailwind CSS, or other frontend libraries Exposure to REST APIs, Firebase, or basic backend logic Experience with tools like Figma, Adobe XD, or similar design platforms Familiarity with deployment tools such as Vercel, Netlify, or GitHub Pages 🎁 What You’ll Gain Real-world project experience in full-stack web development Guidance from experienced developers and mentors Internship Certificate of Completion and Letter of Recommendation Access to curated learning resources and team collaboration tools Consideration for future paid opportunities based on performance 🤝 Equal Opportunity We are an equal opportunity organization. Coreline Solutions does not discriminate on the basis of race, color, religion, gender identity, sexual orientation, disability, or any other legally protected status. 📬 How to Apply Please email your resume and a brief cover letter or introduction to: 📩 hr@corelinesolutions.site Subject Line: Application for Web Development Intern – [Your Full Name] 💡 Tip: Include links to your GitHub, personal website, or any web projects to boost your application! Show more Show less

Posted 4 days ago

Apply

0 years

0 Lacs

India

On-site

Linkedin logo

Company Description zipp.ai is building AI to improve Food & Drug Quality. We are revolutionizing Quality & GxP compliance for Pharma and Food industries. Our mission is to ensure 100% compliance with Good Manufacturing Practices (GMP) by proactively detecting deviations and analyzing process gaps. We provide innovative solutions to tackle quality and compliance challenges faced by manufacturers, leveraging advanced technology and AI to enhance product quality and operational efficiency. Why Join zipp.ai? High Impact: You will be a foundational member of the AI team, with your work directly shaping our core products and value proposition. Cutting-Edge Work: Tackle challenging problems at the intersection of AI and a critical, regulated industry. Growth & Mentorship: Work directly with an experienced founding team (ex-Microsoft, ex-Accenture) and a world-class advisory board. Startup Culture: Experience a fast-paced, agile, and collaborative environment with significant learning opportunities. Competitive Compensation: We offer a competitive salary and the potential for equity (ESOPs) for a core team member. Role Description As an AI/ML Engineer at zipp.ai, you will be at the heart of our product innovation. You will research, design, develop, and deploy the core AI models that power our platform, from analyzing complex regulatory documents to identifying anomalies in manufacturing data. This role requires a strong foundation in machine learning and a passion for applying cutting-edge technology, especially Large Language Models (LLMs) and Natural Language Processing (NLP), to solve real-world problems. Key Responsibilities Design, train, fine-tune, and deploy ML/NLP models for tasks such as document understanding, gap analysis, data classification, and trend analysis. Work extensively with LLMs, including prompt engineering, fine-tuning, and implementing retrieval-augmented generation (RAG) pipelines for our domain-specific conversational AI features. Develop and maintain data processing pipelines for cleaning, preparing, and managing large datasets (primarily textual and structured data). Integrate trained models into our cloud-based application (hosted on Azure) by building and maintaining robust APIs. Rigorously evaluate model performance using appropriate metrics (accuracy, precision, recall, etc.) and continuously iterate to improve results. Collaborate closely with our CTO, product team, and GxP domain experts to translate business requirements into technical AI solutions. Stay current with the latest advancements in AI, ML, and NLP and champion new techniques that can enhance our product suite. Skills & Qualifications We Require: Proven hands-on experience in building and deploying machine learning or generative AI models. Strong proficiency in Python and core AI/ML libraries (e.g., Pandas, NumPy, Scikit-learn, TensorFlow, PyTorch). Solid experience with Natural Language Processing (NLP) techniques and libraries (e.g., Hugging Face Transformers, NLTK, spaCy). Demonstrable experience working with Large Language Models (LLMs) and associated technologies. Familiarity with software engineering best practices, including version control (Git) and API development. What Will Make You Stand Out (Preferred Skills): Experience with cloud platforms, particularly Microsoft Azure and its AI/ML services. Understanding of MLOps principles for managing the model lifecycle. Experience with vector databases and semantic search. Startup experience and a proactive, problem-solving mindset. If you are driven to build intelligent solutions that have a real-world impact on product quality and safety, we would love to hear from you. How to Apply: Please apply directly via LinkedIn or send your resume and a brief introduction to hr@zipp-ai.com Show more Show less

Posted 4 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Our Purpose Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential. Title And Summary Lead, Data Engineer Who is Mastercard? Mastercard is a global technology company in the payments industry. Our mission is to connect and power an inclusive, digital economy that benefits everyone, everywhere by making transactions safe, simple, smart, and accessible. Using secure data and networks, partnerships and passion, our innovations and solutions help individuals, financial institutions, governments, and businesses realize their greatest potential. Our decency quotient, or DQ, drives our culture and everything we do inside and outside of our company. With connections across more than 210 countries and territories, we are building a sustainable world that unlocks priceless possibilities for all. Overview The Mastercard Services Technology team is looking for a Lead in Data Engineering, to drive our mission to unlock potential of data assets by consistently innovating, eliminating friction in how we manage big data assets, store those assets, accessibility of data and, enforce standards and principles in the Big Data space both on public cloud and on-premises set up. We are looking for a hands-on, passionate Data Engineer who is not only technically strong in PySpark, cloud platforms, and building modern data architectures, but also deeply committed to learning, growing, and lifting others. The person will play a key role in designing and building scalable data solutions, shaping our engineering culture, and mentoring team members. This is a role for builders and collaborators—engineers who love clean data pipelines, cloud-native design, and helping teammates succeed. Role Design and build scalable, cloud-native data platforms using PySpark, Python, and modern data engineering practices. Mentor and guide other engineers, sharing knowledge, reviewing code, and fostering a culture of curiosity, growth, and continuous improvement. Create robust, maintainable ETL/ELT pipelines that integrate with diverse systems and serve business-critical use cases. Lead by example—write high-quality, testable code and participate in architecture and design discussions with a long-term view in mind. Decompose complex problems into modular, efficient, and scalable components that align with platform and product goals. Champion best practices in data engineering, including testing, version control, documentation, and performance tuning. Drive collaboration across teams, working closely with product managers, data scientists, and other engineers to deliver high-impact solutions. Support data governance and quality efforts, ensuring data lineage, cataloging, and access management are built into the platform. Continuously learn and apply new technologies, frameworks, and tools to improve team productivity and platform reliability. Own and optimize cloud infrastructure components related to data engineering workflows, storage, processing, and orchestration. Participate in architectural discussions, iteration planning, and feature sizing meetings Adhere to Agile processes and participate actively in agile ceremonies Stakeholder management skills All About You 5+ years of hands-on experience in data engineering with strong PySpark and Python skills. Solid experience designing and implementing data models, pipelines, and batch/stream processing systems. Proven ability to work with cloud platforms (AWS, Azure, or GCP), especially in data-related services like S3, Glue, Data Factory, Databricks, etc. Strong foundation in data modeling, database design, and performance optimization. Understanding of modern data architectures (e.g., lakehouse, medallion) and data lifecycle management. Comfortable with CI/CD practices, version control (e.g., Git), and automated testing. Demonstrated ability to mentor and uplift junior engineers—strong communication and collaboration skills. Bachelor’s degree in computer science, Engineering, or related field—or equivalent hands-on experience. Comfortable working in Agile/Scrum development environments. Curious, adaptable, and driven by problem-solving and continuous improvement. Good To Have Experience integrating heterogeneous systems and building resilient data pipelines across cloud environments. Familiarity with orchestration tools (e.g., Airflow, dbt, Step Functions, etc.). Exposure to data governance tools and practices (e.g., Lake Formation, Purview, or Atlan). Experience with containerization and infrastructure automation (e.g., Docker, Terraform) will be a good addition. Master’s degree, relevant certifications (e.g., AWS Certified Data Analytics, Azure Data Engineer), or demonstrable contributions to open source/data engineering communities will be a bonus. Exposure to machine learning data pipelines or MLOps is a plus. Corporate Security Responsibility All activities involving access to Mastercard assets, information, and networks comes with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must: Abide by Mastercard’s security policies and practices; Ensure the confidentiality and integrity of the information being accessed; Report any suspected information security violation or breach, and Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines. R-251380 Show more Show less

Posted 4 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

• Must have experience in "leading" calibration engineering project. • Must have experience in calibrating MPFI or GDI engines. • Must be highly proficient with Powertrain. • Must have experience in calibrating engines for gasoline, CNG, or Flex fuel. • Must have experience in performing emissions testing and tuning. • Must have experience in setting up and testing - start ability, idle, and transient calibrations. • Must be familiar with using INCA, MDA for real-time tuning. • Must have experience in OBD-1 and OBD-2B calibration. • Good to have experience with vehicle test trips and making field-based adjustments. Show more Show less

Posted 4 days ago

Apply

15.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Job Description Danfoss Power Solutions India is seeking an enthusiastic candidate to join our AE (Application Engineering) diverse team. Together with our OEM & Distributors, we develop optimized solutions related to propel, work and control functions by specifying and verifying Danfoss Power Solutions products and systems for mobile machinery in markets such as agriculture and construction. You will primarily work from the office together with other qualified engineers in India-Pune or work on-site with the customers. The job involves travel activity of approx. 10~15% per year Job Responsibilities As our new Application Engineering Team member, you will have direct OEM /Distributor contact, and your main task will be to apply and validate application solutions based on Danfoss Power Solutions’ mobile hydraulic and electronic control products across the entire product line. More In Details, You Will Support in pre-selling, leading technical presentations to customers and training Offer system support, including technical specifications based on customer requirements, system design and sizing, proposing proof-of-concept, on-site machine testing and tuning, for different mobile Off Highway machine applications using Danfoss Components. Work closely with sales team to meet sales target and growth - action based on agreed sales and marketing plans Work in Matrix organization, Example EU/US/APAC- AE & PAE members to ensure appropriate solutions for customer applications. Manage the application sign off at customers end along with sales team. Product Support & after Sales Application related issues. Background & Skills Position : Application Engineer /Sr. Application Engineer /Manager Application Engineering Qualification: Graduate/Postgraduate in Mechanical or Mechatronics Engineering. N o of years of Experience: 5~15 years Required Skills/Knowledge/Abilities Excellent knowledge of Mobile hydraulic systems & off Highway machines Ability to independently design Hydraulic systems (Mobile / Industrial) Basic Electric & Electronic knowledge, Candidate with Mechatronics background will be an added advantage. Hands on Experience on the Hydraulic components. Experience in Field Validation & Trouble shooting. Good Communication skills in English/ Hindi IT-Skills. Employee Benefits We are excited to offer you the following benefits with your employment: Opportunity to join Employee Resource Groups Referral Program This list does not promise or guarantee any particular benefit or specific action. They may depend on country or contract specifics and are subject to change at any time without prior notice. Danfoss – Engineering Tomorrow At Danfoss, we are engineering solutions that allow the world to use resources in smarter ways - driving the sustainable transformation of tomorrow. No transformation has ever been started without a group of passionate, dedicated and empowered people. We believe that innovation and great results are driven by the right mix of people with diverse backgrounds, personalities, skills, and perspectives, reflecting the world in which we do business. To make sure the mix of people works, we strive to create an inclusive work environment where people of all backgrounds are treated equally, respected, and valued for who they are. It is a strong priority within Danfoss to improve the health, working environment and safety of our employees. Following our founder’s mindset “action speaks louder than words”, we set ourselves ambitious targets to protect the environment by embarking on a plan to become CO2 neutral latest by 2030. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, veteran status, or other protected category. Show more Show less

Posted 4 days ago

Apply

20.0 years

0 Lacs

Thane, Maharashtra, India

On-site

Linkedin logo

Who is Forcepoint? Forcepoint simplifies security for global businesses and governments. Forcepoint’s all-in-one, truly cloud-native platform makes it easy to adopt Zero Trust and prevent the theft or loss of sensitive data and intellectual property no matter where people are working. 20+ years in business. 2.7k employees. 150 countries. 11k+ customers. 300+ patents. If our mission excites you, you’re in the right place; we want you to bring your own energy to help us create a safer world. All we’re missing is you! Senior Software Engineer – Dashboarding, Reporting & Data Analytics Location: Mumbai (Preferred) Experience: 8-10 years Job Summary We are looking for a Senior Software Engineer with expertise in dashboarding, reporting applications, and data analytics . The ideal candidate should have strong programming skills in Golang and Java , experience working with AWS services like Kinesis, Redshift, and Elasticsearch , and the ability to build scalable, high-performance data pipelines and visualization tools . This role is critical in delivering data-driven insights that help businesses make informed decisions. Key Responsibilities Design, develop, and maintain dashboards and reporting applications for real-time and batch data visualization. Build data pipelines and analytics solutions leveraging AWS services like Kafka, Redshift, Elasticsearch, Glue, and S3. Work with data engineering teams to integrate structured and unstructured data for meaningful insights. Optimize data processing workflows for efficiency and scalability. Develop APIs and backend services using Golang and Java to support reporting and analytics applications. Collaborate with business stakeholders to gather requirements and deliver customized reports and visualizations. Implement data security, governance, and compliance best practices. Conduct code reviews, troubleshooting, and performance tuning. Stay updated with emerging data analytics and cloud technologies to drive innovation. Required Qualifications 8-10 years of experience in software development, data analytics, and dashboarding/reporting applications. Proficiency in Golang and Java for backend development. Strong expertise in AWS data services (Kinesis, Redshift, Elasticsearch, S3, Glue). Experience with data visualization tools (Grafana, Tableau, Looker, or equivalent). Proficiency in SQL and NoSQL databases, with a solid understanding of data modeling and performance optimization. Experience in building and managing scalable data pipelines. Experience in big data processing frameworks (Spark, Flink). Strong problem-solving skills with a focus on efficiency and performance. Excellent communication and collaboration skills. Preferred Qualifications Experience with real-time data streaming and event-driven architectures. Familiarity with big data processing frameworks (Spark, Flink). Experience in CI/CD pipelines and DevOps practices. Don’t meet every single qualification? Studies show people are hesitant to apply if they don’t meet all requirements listed in a job posting. Forcepoint is focused on building an inclusive and diverse workplace – so if there is something slightly different about your previous experience, but it otherwise aligns and you’re excited about this role, we encourage you to apply. You could be a great candidate for this or other roles on our team. The policy of Forcepoint is to provide equal employment opportunities to all applicants and employees without regard to race, color, creed, religion, sex, sexual orientation, gender identity, marital status, citizenship status, age, national origin, ancestry, disability, veteran status, or any other legally protected status and to affirmatively seek to advance the principles of equal employment opportunity. Forcepoint is committed to being an Equal Opportunity Employer and offers opportunities to all job seekers, including job seekers with disabilities. If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to use or access the Company’s career webpage as a result of your disability. You may request reasonable accommodations by sending an email to recruiting@forcepoint.com. Applicants must have the right to work in the location to which you have applied. Show more Show less

Posted 4 days ago

Apply

8.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Linkedin logo

Role Overview We are hiring a Technical Lead – AI Security to join our CISO team in Mumbai. This is a critical, hands-on role — ensuring the trustworthiness, resilience, and compliance of AI/ML systems, including large language models (LLMs). You will work at the intersection of cybersecurity and AI, shaping secure testing, understanding secure MLOps/LLMOps workflows, and leading technical implementation of defenses against emerging AI threats. This role requires both strategic vision and strong engineering depth. Key Responsibilities · Lead and operationalize the AI/ML and LLM security roadmap across training, validation, deployment, and runtime to enable AI Security Platform Approach. · Design and implement defenses against threats like adversarial attacks, data poisoning, model inversion, prompt injection, and fine-tuning exploits using industry leading open source and commercial tools. · Build hardened workflows for model security, integrity verification, and auditability in production AI environments. · Leverage AI security tools for scanning, fuzzing, and penetration testing models. · Apply best practices from OWASP Top 10 for ML/LLMs, MITRE ATLAS, NIST AI RMF, and ISO/IEC 42001 to test AI/ML assets. · Ensure AI model security testing framework aligns with internal policy, national regulatory requirements, and global best practices. · Plan and execute security tests for AI/LLM systems, including jailbreaking, RAG hardening, and bias/toxicity validation. Required Skills & Experience · 8+ years in cybersecurity, with at least 3+ years hands-on in AI/ML security or secure MLOps/LLMOps · Proficient in Python, TensorFlow/PyTorch, HuggingFace, LangChain, and common data science libraries · Deep understanding of adversarial ML/LLM, model evaluation under threat conditions, and inference/training-time attack vectors · Experience securing cloud-based AI workloads (AWS, Azure, or GCP) · Familiarity with secure DevOps and CI/CD practices · Strong understanding of AI-specific threat models (MITRE ATLAS) and security benchmarks (OWASP Top 10 for ML/LLMs) · Ability to communicate technical risk clearly to non-technical stakeholders · Ability to guide developers and data scientists to solve the AI Security risks. · Certifications: CISSP, OSCP, GCP ML Security, or relevant AI/ML certificates · Experience with AI security tools or platforms (e.g., model registries, lineage tracking, policy enforcement) · Experience with RAG, LLM-based agents, or agentic workflows · Experience in regulated sectors (finance, public sector) Show more Show less

Posted 4 days ago

Apply

3.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Linkedin logo

Who We Are Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation-inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact. To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures—and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive. What You'll Do Talent Sourcers play a critical role in achieving our growth & hiring objectives for BCG’s Consulting business. Reporting to the APAC Director of Executive Search & Talent Sourcing, this role partners with Recruiting & Consulting stakeholders (including Managing Directors / Partners) across our Tech / Digital businesses (BCG Platinion & BCG X) in APAC to identify and secure best-in-class talent across lateral and executive search. Sourcers learn through practical experience and top-tier coaching & mentorship. Our Talent Sourcers drive the search process – from developing and executing search strategies to interviewing, assessing and building a pipeline of qualified candidates for active roles. Talent Sourcers deeply immerse themselves in industry, role and functional research, acting as the ‘eyes’ and ‘ears’ in the market to provide strategic insights and serve as trusted thought partners. Role Responsibilities Talent Research Create a focused search strategy vis-à-vis the hiring requirement and drive its execution, continuously fine-tuning the approach via leveraging of market knowledge. Act as trusted advisor, solutioning on talent pools / strategies to successfully close searches Conduct industry and function-specific research, mapping the talent landscape to bring market intelligence to the table Candidate & Stakeholder Engagement Reach out to prospective candidates to pitch / elicit interest in opportunities, gather referrals and collect market intelligence and feedback Assist in conducting in-depth interviews, evaluating candidates against role requirements Build strong candidate relationships. Considering the scarcity of certain niche talent pools, consistently plan and implement effective outreach strategies to nurture relationships and engagement with prospective candidates Collaborate internationally, across BCG as part of a global sourcing community, bringing creative ideas together and providing strategic input Living BCG Values Challenging the status quo: drive innovation Diversity of thought: contributing via your expertise Teaming: build strong, trust-based relationships with diverse stakeholders across the firm, fostering a collaborative and productive work environment. Be a culture carrier and contribute to firm, practice and / or office initiatives Ethics & Excellence Ensure accuracy, integrity and quality of candidate information across internal and external systems Maintain detailed status reports and market intelligence summaries for effective decision-making and presentations Manage multiple projects simultaneously, delivering high-quality outcomes within deadlines Demonstrate resilience and perseverance, staying committed to doing what’s right What You'll Bring A minimum of 3 years’ transferable experience, ideally in Tech-led Start-ups (Consumer Tech, Fintech etc.) / Software companies or alternately, Recruitment firms with a strong background in Tech hiring An undergraduate degree is required, with a Masters’ degree being an added advantage Exhibits skill and growing proficiency in articulating ideas and information clearly, with confidence and impact Ability to collaborate with internal clients and candidates, with a developing understanding of influencing techniques to build positive relationships. Boston Consulting Group is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity / expression, national origin, disability, protected veteran status, or any other characteristic protected under national, provincial, or local law, where applicable, and those with criminal histories will be considered in a manner consistent with applicable state and local laws. BCG is an E - Verify Employer. Click here for more information on E-Verify. Show more Show less

Posted 4 days ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Linkedin logo

EXL (NASDAQ: EXLS) is a $7 billion public-listed NASDAQ company and a rapidly expanding global digital data-led AI transformation solutions company with double digit growth. EXL Digital division spearheads the development and implementation of Generative AI (GenAI) business solutions for our clients in Banking & Finance, Insurance, and Healthcare. As a global leader in analytics, digital transformation, and AI innovation, EXL is committed to helping clients unlock the potential of generative AI to drive growth, efficiency, and innovation. Job Summary We are seeking a highly skilled AI/ML Engineer - Generative AI to design, develop, and deploy production-grade AI systems and agentic applications. The ideal candidate will have a strong background in Python 3.11+, deep learning, large language models, and distributed systems, with experience building performant, clean, and scalable services. Key Responsibilities Build and maintain high-performance REST/WebSocket APIs using FastAPI (Pydantic v2). Implement and optimize agentic AI systems using LangGraph, AutoGen, and LangChain. Architect real-time event-driven microservices using Apache Kafka 4.0 and KRaft. Design clean, testable services using SOLID principles, Python async, and type hints. Integrate vector databases like Pinecone and Weaviate for embedding storage and retrieval. Implement graph databases like Neo4j for knowledge graph-based use cases. Manage experiment tracking and model lifecycle using MLflow 3.0 or Weights & Biases. Build and deploy containers using Docker, GitHub Actions, and Kubernetes (nice-to-have). Maintain CI/CD pipelines and infrastructure as code with Git and optionally Terraform. Stay current with trends in GenAI, deep learning, and orchestration frameworks. Minimum Qualifications Bachelor's degree in computer science, Data Science, or related field. 5+ years of experience in AI/ML engineering with focus on LLMs and NLP. 2–3 years of hands-on experience with GenAI and LLMs (e.g., GPT, Claude, LLaMA3). Proficiency in Python 3.11+ (async, typing, OOP, SOLID principles). Experience with FastAPI, Pydantic v2, PyTorch 2.x, Hugging Face Transformers. Working knowledge of agentic frameworks like LangChain, LangGraph, or AutoGen. Experience building REST/WebSocket APIs and microservices with Kafka streams. Proficient in SQL, Pandas, and NumPy for data manipulation. Preferred Qualifications Master’s or PhD degree in Computer Science, Data Science, or related field. Familiarity with Graph DBs such as Neo4j for knowledge graphs. Experience with Vector DBs like Pinecone, Weaviate. Proficiency in MLflow 3.0 or Weights & Biases for experiment tracking. Experience with CI/CD pipelines, containerization (Docker), and orchestration (K8s) and automated deployment workflows. Exposure to Infrastructure as Code (IaC) using Terraform. Knowledge of advanced optimization, quantization, and fine-tuning techniques. Skills and Competencies Proven ability to architect GenAI solutions and multi-agent systems. Strong testing skills (unit, integration, performance). Excellent communication and cross-functional collaboration. Strong analytical and problem-solving skills. Leadership and mentoring capability for engineering teams. Show more Show less

Posted 4 days ago

Apply
cta

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies