
9074 Tuning Jobs - Page 4

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


With a startup spirit and 115,000+ curious and courageous minds, we have the expertise to go deep with the world’s biggest brands—and we have fun doing it! We dream in digital, dare in reality, and reinvent the ways companies work to make an impact far bigger than just our bottom line. We’re harnessing the power of technology and humanity to create meaningful transformation that moves us forward in our pursuit of a world that works better for people. Now, we’re calling upon the thinkers and doers, those with a natural curiosity and a hunger to keep learning, keep growing. People who thrive on fearlessly experimenting, seizing opportunities, and pushing boundaries to turn our vision into reality. And as you help us create a better world, we will help you build your own intellectual firepower. Welcome to the relentless pursuit of better.

Inviting applications for the role of AI Senior Engineer.

In this role you’ll leverage Azure’s or AWS’s advanced AI capabilities, including Azure Machine Learning, Azure OpenAI, Prompt Flow, Azure Cognitive Search, Azure AI Document Intelligence, AWS SageMaker, and AWS Bedrock, to deliver scalable and efficient solutions. You will also ensure seamless integration into enterprise workflows and operationalize models with robust monitoring and optimization.

Responsibilities
- AI Orchestration: Design and manage AI orchestration flows using tools such as Prompt Flow or LangChain; continuously evaluate and refine models to ensure optimal accuracy, latency, and robustness in production.
- Document AI and Data Extraction: Build AI-driven workflows for extracting structured and unstructured data from receipts, reports, and other documents using Azure AI Document Intelligence and Azure Cognitive Services.
- RAG Systems: Design and implement retrieval-augmented generation (RAG) systems using vector embeddings and LLMs for intelligent and efficient document retrieval; optimize RAG workflows for large datasets and low-latency operations.
- Monitoring and Optimization: Implement advanced monitoring using Azure Monitor, Application Insights, and Log Analytics to track model performance and system health; continuously refine models and workflows to meet enterprise-grade SLAs for performance and reliability.
- Collaboration and Documentation: Collaborate with data engineers, software developers, and DevOps teams to deliver robust, scalable AI-driven solutions; document best practices, workflows, and troubleshooting guides for knowledge sharing.

Qualifications we seek in you
- Proven experience with Azure Machine Learning, Azure OpenAI, Prompt Flow, Azure Cognitive Search, Azure AI Document Intelligence, AWS Bedrock, and SageMaker; proficiency in building and optimizing RAG systems for document retrieval and comparison.
- Strong understanding of AI/ML concepts, including natural language processing (NLP), embeddings, model fine-tuning, and evaluation; experience applying machine learning algorithms and techniques to solve complex real-world problems; familiarity with state-of-the-art LLM architectures and their practical implementation in production environments; expertise in designing and managing Prompt Flow pipelines for task-specific customization of LLM outputs.
- Hands-on experience training LLMs and evaluating their performance using appropriate metrics for accuracy, latency, and robustness; proven ability to iteratively refine models to meet specific business needs and optimize them for production environments.
- Knowledge of ethical AI practices and responsible AI frameworks.
- Experience with CI/CD pipelines using Azure DevOps or equivalent tools; familiarity with containerized environments managed through Docker and Kubernetes.
- Knowledge of Azure Key Vault, Managed Identities, and Azure Active Directory (AAD) for secure authentication.
- Experience with PyTorch or TensorFlow.
- Proven track record of developing and deploying Azure-based AI solutions for large-scale, enterprise-grade environments.
- Strong analytical and problem-solving skills, with a results-driven approach to building scalable and secure systems.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. For more information, visit www.genpact.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
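The RAG responsibility described in this role can be sketched in miniature. The snippet below is a hedged illustration, not Genpact's implementation: a toy bag-of-words counter stands in for a real embedding model (which in this stack would be an Azure OpenAI or SageMaker endpoint), and the corpus strings are made up.

```python
import math
from collections import Counter

def embed(text):
    # Toy bag-of-words "embedding"; a production system would call a
    # real embedding model instead of counting tokens.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, corpus, k=2):
    # Rank documents by similarity to the query and keep the top k.
    q = embed(query)
    return sorted(corpus, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

def build_prompt(query, corpus):
    # Augment the LLM prompt with retrieved context (the "RAG" step).
    context = "\n".join(retrieve(query, corpus))
    return f"Context:\n{context}\n\nQuestion: {query}"

corpus = [
    "Invoices are processed by Document Intelligence.",
    "RAG combines retrieval with generation.",
    "Kubernetes manages containers.",
]
print(build_prompt("How does RAG retrieval work?", corpus))
```

The same shape scales up by swapping `embed` for a real model and the list scan for a vector index (e.g. Azure Cognitive Search).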

Posted 15 hours ago

Apply

4.0 years

0 Lacs

India

Remote


Join phData, a dynamic and innovative leader in the modern data stack. We partner with major cloud data platforms like Snowflake, AWS, Azure, GCP, Fivetran, Pinecone, Glean and dbt to deliver cutting-edge services and solutions. We're committed to helping global enterprises overcome their toughest data challenges.

phData is a remote-first global company with employees based in the United States, Latin America and India. We celebrate the culture of each of our team members and foster a community of technological curiosity, ownership and trust. Even though we're growing extremely fast, we maintain a casual, exciting work environment. We hire top performers and allow you the autonomy to deliver results.

- 5x Snowflake Partner of the Year (2020, 2021, 2022, 2023, 2024)
- Fivetran, dbt, Alation, Matillion Partner of the Year
- #1 Partner in Snowflake Advanced Certifications
- 600+ Expert Cloud Certifications (Sigma, AWS, Azure, Dataiku, etc.)
- Recognized as an award-winning workplace in the US, India and LATAM

phData provides end-to-end services for data engineering, machine learning, and data analytics. Our services and software are used by the world's largest companies to solve their most challenging data problems. We enjoy the work we do, and it's reflected in high-quality solutions that provide value to our clients and our communities. Joining the team at phData means giving yourself the opportunity to do your most exciting work. Our work is challenging and our standards are high, but we invest heavily in our employees, starting with training and bootcamp to ensure your success. Plus, you’ll get to work with the brightest minds in the industry and the best-in-class platforms on the market. And, because the data and ML industry is changing rapidly, you will always have the opportunity to learn - whether that’s a new technology, diving deeper into your preferred stack, or picking up an entirely new skill set.
Our consulting services emphasize analytics enablement, data visualization, data preparation, and data science. Our award-winning workplace fosters learning, creativity, teamwork and diversity.

Awards & Recognition:
- Best Places to Work (2017, 2018, 2019, 2020, 2021, 2022)
- Inc. 5000 Fastest Growing US Companies (2019, 2020, 2021, 2022)
- Tableau Premier Services Partner
- Microsoft Power BI Gold Partner
- Alteryx Premier Partner

We’re looking for a talented Analytics Consultant, with an emphasis on Power BI Server administration, able to help our customers gain tangible value from their data platforms:
- Gather requirements, pain points, and service feedback from clients on their current Tableau Server environment
- Provide solutions and recommendations for optimizing their use of Tableau Server and/or Tableau Cloud (Tableau Online)
- Deliver on project-based consulting engagements; help clients deploy, manage, govern, and tune their Tableau Server environments
- Manage client expectations by leading weekly status reports; proactively solicit client feedback by running working sessions
- Partner with the sales team to identify additional sales opportunities within a client; assist with account growth and expansion

Overview
We are seeking a qualified Sr. DevOps Engineer, with Power BI as the primary skill and Tableau Server administration as a secondary skill, to help deliver our Elastic Operations service from our Managed Services team in Bangalore, India, as we continue our rapid growth with an expansion of our Indian subsidiary, phData Solutions Private Limited. This expansion comes at the right time, with increasing customer demand for data and platform solutions. In addition to the phenomenal growth and learning opportunities, we offer a competitive compensation plan, including base salary, annual bonus, training, and certifications.
As a Senior DevOps Engineer on our Consulting Team, you will be responsible for technical delivery on technology projects related to Power BI, Tableau, and services hosted in the cloud.

Responsibilities:
- Administer and manage the Power BI Service, including gateways, workspaces, security, and data refreshes
- Troubleshoot issues related to refresh failures, broken datasets, access requests, or slow reports
- Develop in Power BI Desktop, publish reports, and troubleshoot issues
- Provide recommendations for optimizing the performance of datasets and visuals
- Document best practices and develop user onboarding and training materials
- Integrate Power BI with Microsoft Teams, SharePoint, Power Automate, and DevOps pipelines
- Manage BI report subscriptions, alerts, and dashboard embedding
- Gather requirements, pain points, and service feedback from clients on their current Tableau Server environment
- Provide solutions and recommendations for optimizing their use of Tableau Server and/or Tableau Cloud (Tableau Online)
- Deliver on project-based consulting engagements; help clients deploy, manage, govern, and tune their Tableau Server environments
- Manage client expectations by leading weekly status reports; proactively solicit client feedback by running working sessions

Required Experience:
- 4+ years of relevant experience administering, configuring, and developing in Power BI and Tableau Server
- Experience coordinating infrastructure in cloud-based environments such as AWS and/or Azure
- Ability to provide configuration, infrastructure, and performance-tuning recommendations
- Experience with PowerShell scripts, bash scripting, the command line, and the Power BI REST API for administrative automation
- Experience with TSM commands for maintenance and upkeep procedures in Tableau Server
- Exceptional customer-facing skills, including communication and project management skills
- Strong problem-solving skills with a passion for learning and mastering new technologies, techniques, and procedures

Preferred Experience:
- Power BI certification (PL-300 / PL-900 / DA-100) is strongly preferred
- Tableau Server admin Associate or Architect certification is preferred
- Experience with enterprise data governance and compliance frameworks and best practices for managing Power BI and Tableau Server; a plus if on cloud IaaS (AWS/Azure)
- Cloud infrastructure experience
- Experience with Tableau Cloud (Tableau Online)
- Experience with both Windows and Linux deployments of Tableau Server
- Experience deploying automation pipelines
- Interest in learning server environments for Alteryx and KNIME is a plus

Perks and Benefits:
- Medical Insurance for Self & Family
- Medical Insurance for Parents
- Term Life & Personal Accident
- Wellness Allowance
- Broadband Reimbursement
- Professional Development Allowance
- Reimbursement of Skill Upgrade Certifications
- Certification Reimbursement

phData celebrates diversity and is committed to creating an inclusive environment for all employees. Our approach helps us to build a winning team that represents a variety of backgrounds, perspectives, and abilities. So, regardless of how your diversity expresses itself, you can find a home here at phData. We are proud to be an equal opportunity employer. We prohibit discrimination and harassment of any kind based on race, color, religion, national origin, sex (including pregnancy), sexual orientation, gender identity, gender expression, age, veteran status, genetic information, disability, or other applicable legally protected characteristics. If you would like to request an accommodation due to a disability, please contact us at People Operations.
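The "Power BI REST API for administrative automation" requirement above can be sketched briefly. This is a hedged illustration, not phData tooling: the endpoint path follows the public "Datasets - Refresh Dataset In Group" operation, token acquisition (normally via Azure AD / MSAL) is stubbed out, and the GUIDs are placeholders.

```python
# Base URL of the public Power BI REST API.
BASE = "https://api.powerbi.com/v1.0/myorg"

def refresh_url(group_id: str, dataset_id: str) -> str:
    # POSTing to this URL (with a bearer token) triggers an
    # asynchronous refresh of the dataset in the given workspace.
    return f"{BASE}/groups/{group_id}/datasets/{dataset_id}/refreshes"

def auth_headers(token: str) -> dict:
    # Headers for an authenticated API call; the token would come
    # from Azure AD in a real script.
    return {"Authorization": f"Bearer {token}",
            "Content-Type": "application/json"}

# A real automation script would then do, e.g. with `requests`:
#   requests.post(refresh_url(group, dataset), headers=auth_headers(token))
# and GET the same URL to inspect refresh history and status.
print(refresh_url("<workspace-guid>", "<dataset-guid>"))
```

The same pattern (build URL, attach bearer token, POST/GET) covers most of the administrative surface the role mentions: gateways, workspaces, and data refreshes.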

Posted 15 hours ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Overview
We are looking for a Software Engineer 2 to help us build a next-generation API platform for tax preparation, designed to replace our existing infrastructure. You'll have the unique opportunity to work on a brand new, modern software stack, contributing directly to a transformative technology solution.

What you'll bring
- 3+ years of professional experience in backend software engineering, specifically in Java
- Proven experience building and maintaining production-grade software systems at scale, with a keen focus on high availability and performance
- Hands-on experience with microservices architectures, including design patterns, service communication, and operational considerations
- Demonstrated ability in performance tuning, system optimization, and proactive operational troubleshooting
- A collaborative, high-energy mindset, eager to mentor others and continuously improve your craft

How you will lead
- Design, develop, and deploy robust, scalable backend software in Java, handling billions of transactions reliably
- Write clean, high-quality, well-tested code that meets our high standards for performance and maintainability
- Actively participate in software design, building scalable, reliable microservices
- Own operational excellence: proactively identify performance bottlenecks, troubleshoot, tune, and continuously optimize system performance
- Collaborate closely with cross-functional teams to rapidly ship impactful features to production

Posted 15 hours ago

Apply

12.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Summary
We are seeking an experienced Senior Business Analyst to join our team and play a key role in developing and enhancing a cutting-edge Healthcare Professional (HCP) portal built on Drupal Web CMS. The ideal candidate will have deep expertise in business analysis, requirements gathering, workflow modeling, and communication, along with prior experience working on Drupal-based systems and healthcare-centric platforms. You will act as an advisor, providing guidance to challenge and improve global business processes, products, services, and software through data analysis, and engage with global business leaders, leveraging the appropriate DDIT teams and functions to determine requirements and deliver data-driven recommendations that improve efficiency and add value.

About The Role
Position Title: Assoc. Dir. DDIT US&I BA Web CMS (Business Analyst)
Location: Hyderabad, India #LI-Hybrid

Your Responsibilities Include But Are Not Limited To
- Ensure consistency and traceability between user requirements, functional specifications, and testing and validation
- Lead sessions with stakeholders to gather, analyze, and document business requirements, ensuring alignment with project objectives
- Collaborate with technical teams to translate business needs into functional designs, workflows, and user stories for the Drupal-based HCP portal
- Ensure Drupal Web CMS functionality is optimized to meet business goals, including customizing modules, integrations, and UX improvements
- Support validation and testing as appropriate, and ensure adherence to Security and Compliance policies and procedures within the Service Delivery scope
- Keep abreast of internal IT systems and documentation requirements, standards (including quality management and IT security), regulatory environments/requirements (if applicable), the DDIT Service Portfolio, and industry best practices in leveraging technologies for the business, taking advantage of reusable products, solutions and services with no or minimal customization wherever applicable
- Identify opportunities to refine HCP portal features post-launch, addressing evolving business requirements and user feedback effectively
- Report technical complaints / adverse events / special case scenarios related to Novartis products within 24 hours of receipt
- Distribute marketing samples (where applicable)

What You’ll Bring To The Role
- Feedback on project execution (quality, time, cost); degree of customization vs. configuration of COTS solutions; process efficiency; steady/uninterrupted process flow; completeness and accuracy of the Business Process Model (BPM); business process documentation kept up to date
- Leveraging digital technology / big data; a proven record of crafting strategies and overseeing the successful implementation of complex web/digital solutions
- Experience managing large-scale web platforms for HCP portals or similar platforms in regulated industries (e.g., healthcare)
- Influencing without authority; relationship management; collaborating across boundaries
- Working experience within the pharmaceutical industry and multi-national, global experience
- Interactions with senior management
- Deep knowledge of the Drupal platform, including custom module development, theming, API integrations, and CMS optimization; experience managing Drupal upgrades, migrations, and performance tuning
- Proficiency in web technologies such as HTML, CSS, JavaScript frameworks (React, Angular, etc.), and APIs; familiarity with responsive design principles and cross-browser compatibility
- Strong expertise in integrating web CMS platforms with databases, CRMs (e.g., Salesforce), analytics tools, and backend systems
- Experience building reporting dashboards and analyzing user behavior and performance metrics within portals

Desirable Requirements
- 12+ years of overall experience in IT, web solutions, or digital platform delivery, preferably in the healthcare, life sciences, or pharmaceutical domains
- Minimum 5+ years of hands-on experience with Drupal Web CMS, including portal design, development, integration, and customization
- Strong experience in business analysis roles, with 5+ years leading cross-functional teams to gather requirements, define scope, and deliver projects
- Extensive experience managing projects using Agile/Scrum methodologies and delivering within time and budget constraints

Commitment To Diversity & Inclusion
Novartis embraces diversity, equal opportunity, and inclusion. We are committed to building diverse teams, representative of the patients and communities we serve, and we strive to create an inclusive workplace that cultivates bold innovation through collaboration and empowers our people to unleash their full potential.

Why Novartis: Helping people with disease and their families takes more than innovative science. It takes a community of smart, passionate people like you. Collaborating, supporting and inspiring each other. Combining to achieve breakthroughs that change patients’ lives. Ready to create a brighter future together?
https://www.novartis.com/about/strategy/people-and-culture

Join our Novartis Network: Not the right Novartis role for you? Sign up to our talent community to stay connected and learn about suitable career opportunities as soon as they come up: https://talentnetwork.novartis.com/network

Benefits and Rewards: Read our handbook to learn about all the ways we’ll help you thrive personally and professionally: https://www.novartis.com/careers/benefits-rewards

Posted 15 hours ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


At Medtronic you can begin a life-long career of exploration and innovation, while helping champion healthcare access and equity for all. You’ll lead with purpose, breaking down barriers to innovation in a more connected, compassionate world.

Data Visualization (Power BI) Developer – Global Finance Analytics COE

Careers that Change Lives
Join our Global Finance Analytics Center of Excellence (COE) as a Power BI Developer, where you will create dynamic, data-driven dashboards and reports that provide meaningful insights for financial and business decision-making. You will work closely with Finance, Data Science, and Engineering teams to develop interactive visualizations that drive data accessibility. This role requires an average of 2-3 days per week of overlapping work hours with the USA team to ensure seamless collaboration.

A Day in the Life
As a Power BI Developer, you will:
- Design and develop Power BI dashboards and reports with intuitive user experiences
- Optimize data models, ensuring performance efficiency and best practices in DAX, M Query, and data transformations
- Integrate data from Snowflake, SQL databases, and enterprise systems for analytics and reporting
- Collaborate with stakeholders to understand business needs and translate them into actionable visual solutions
- Ensure data governance, security, and role-based access controls in reporting solutions
- Automate reporting processes and drive self-service BI adoption within Finance and Business teams
- Stay up to date with emerging trends in BI, data visualization, and cloud analytics
Must Have: Minimum Requirements
- Bachelor’s degree in Computer Science, Information Systems, Business Analytics, or a related field
- 5+ years of experience developing Power BI dashboards and reports
- Strong proficiency in DAX, Power Query (M), and SQL
- Experience integrating Power BI with cloud platforms (Azure, Snowflake, or AWS)
- Strong data modeling skills and performance-tuning expertise
- Ability to interpret business requirements and translate them into compelling data visualizations

Nice to Have
- Experience with Python and AI-powered analytics in Power BI
- Knowledge of financial reporting and forecasting dashboards
- Understanding of SAP, OneStream, or other ERP systems for financial reporting

Physical Job Requirements
The above statements are intended to describe the general nature and level of work being performed by employees assigned to this position, but they are not an exhaustive list of all the required responsibilities and skills of this position.

Benefits & Compensation
Medtronic offers a competitive salary and flexible benefits package. A commitment to our employees lives at the core of our values. We recognize their contributions. They share in the success they help to create. We offer a wide range of benefits, resources, and competitive compensation plans designed to support you at every career and life stage.

About Medtronic
We lead global healthcare technology and boldly attack the most challenging health problems facing humanity by searching out and finding solutions. Our Mission — to alleviate pain, restore health, and extend life — unites a global team of 95,000+ passionate people. We are engineers at heart — putting ambitious ideas to work to generate real solutions for real people. From the R&D lab, to the factory floor, to the conference room, every one of us experiments, creates, builds, improves and solves. We have the talent, diverse perspectives, and guts to engineer the extraordinary.
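The DAX proficiency asked for above is hard to demonstrate outside Power BI itself, so as a hedged illustration here is the logic of a typical year-over-year finance measure sketched in plain Python; the revenue figures and measure name are made up.

```python
# Monthly revenue by (year, month) — hypothetical figures.
revenue = {(2023, 1): 100.0, (2023, 2): 120.0,
           (2024, 1): 110.0, (2024, 2): 150.0}

def yoy_growth(year: int, month: int):
    # Equivalent of a DAX measure along the lines of:
    #   YoY % = DIVIDE([Revenue] - [Revenue LY], [Revenue LY])
    # DIVIDE returns blank on a zero denominator; None plays that role here.
    current = revenue[(year, month)]
    prior = revenue[(year - 1, month)]
    return (current - prior) / prior if prior else None

print(yoy_growth(2024, 2))  # 0.25
```

In the actual report the same calculation would be a DAX measure over a date dimension, with `SAMEPERIODLASTYEAR` or a similar time-intelligence function supplying the prior-year slice.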
Learn more about our business, mission, and our commitment to diversity here.

Posted 16 hours ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Company Description: About Holiday Tribe Holiday Tribe is a Great Place To Work® Certified™, seed-stage VC-funded travel-tech brand based in Gurugram. We specialize in crafting unforgettable leisure travel experiences by integrating advanced technology, leveraging human expertise, and prioritizing customer success. With holidays curated across 30+ destinations worldwide, partnerships with renowned tourism boards, and recognition as the Emerging Holiday Tech Company at the India Travel Awards 2023, Holiday Tribe is transforming the travel industry. Our mission is to redefine how Indians experience holidays making travel planning faster, smarter, and more personalized, ensuring every trip is truly seamless and unforgettable. Key Responsibilities AI System Development Design and implement Retrieval Augmented Generation (RAG) systems for travel recommendation and itinerary planning Build and optimize large language model integrations using frameworks like Lang Chain for travel-specific use cases Develop semantic search capabilities using vector databases and embedding models for travel content discovery Create tool-calling architectures that enable AI agents to interact with booking systems, inventory APIs, and external travel services Implement intelligent conversation flows for customer interactions and sales assistance Travel Intelligence Platform Build personalized recommendation engines that understand traveler preferences, seasonal factors, and destination characteristics Develop natural language processing capabilities for interpreting customer travel requests and preferences Implement real-time itinerary generation systems that consider multiple constraints (budget, time, preferences, availability) Create AI-powered tools to assist travel experts in creating customized packages faster Build semantic search engines for finding relevant travel content based on user intent and contextual understanding AI Agent & Tool Integration Design and implement function 
calling systems that allow LLMs to execute actions like booking confirmations, inventory checks, and pricing queries
- Build multi-agent systems where specialized AI agents handle different aspects of travel planning (accommodation, transportation, activities)
- Create tool orchestration frameworks that enable AI systems to chain multiple API calls for complex travel operations
- Implement safety and validation layers for AI-initiated actions in critical systems

Data & Model Operations
- Work with travel knowledge graphs to enhance AI understanding of destinations, accommodations, and activities
- Implement hybrid search systems combining semantic similarity with traditional keyword-based search
- Build vector indexing strategies for efficient similarity search across large travel content databases
- Implement model evaluation frameworks to ensure high-quality AI outputs
- Optimize AI model performance for cost-efficiency and response times
- Collaborate with data engineers to build robust data pipelines for AI training and inference

Cross-functional Collaboration
- Partner with product teams to translate travel domain requirements into AI capabilities
- Work closely with backend engineers to integrate AI services into the broader platform architecture
- Collaborate with UX teams to design intuitive AI-human interaction patterns
- Support sales and customer success teams by improving AI assistant capabilities

Required Qualifications

Technical Skills
- 3+ years of experience in AI/ML engineering with a focus on natural language processing and large language models
- Strong expertise in RAG (Retrieval-Augmented Generation) systems, including vector databases, embedding models, and retrieval strategies
- Hands-on experience with LangChain or similar LLM orchestration frameworks, including tool calling and agent patterns
- Proficiency with semantic search technologies, including vector databases, embedding models, and similarity search algorithms
- Experience with tool calling and function calling in LLM applications, including API integration and action validation
- Proficiency with major LLM APIs (OpenAI, Anthropic, Google, etc.) and understanding of prompt engineering best practices
- Experience with vector databases such as Milvus, Weaviate, Chroma, or similar solutions
- Strong Python programming skills with experience in AI/ML libraries (transformers, sentence-transformers, scikit-learn)

AI/ML Foundation
- Solid understanding of transformer architectures, attention mechanisms, and modern NLP techniques
- Deep knowledge of embedding models and semantic similarity techniques (sentence transformers, dense retrieval methods)
- Experience with hybrid search architectures combining dense and sparse retrieval methods
- Knowledge of fine-tuning approaches and model adaptation strategies
- Understanding of agent-based AI systems and multi-step reasoning capabilities
- Understanding of AI evaluation metrics and testing methodologies
- Familiarity with MLOps practices and model deployment strategies

Software Engineering
- Experience building production-grade AI applications with proper error handling and monitoring
- Experience with API integration and orchestration for complex multi-step workflows
- Understanding of API design and microservices architecture
- Familiarity with cloud platforms (AWS, GCP, Azure) and their AI/ML services
- Experience with version control, CI/CD, and collaborative development practices

Preferred Qualifications

Advanced AI Experience
- Experience with multi-modal AI systems (text, images, structured data)
- Advanced knowledge of agent frameworks (LangGraph, CrewAI, AutoGen) and agentic workflows
- Experience with advanced semantic search techniques, including re-ranking, query expansion, and result fusion
- Experience with model fine-tuning, especially for domain-specific applications
- Knowledge of tool-use optimization and function calling best practices
- Understanding of AI safety, bias mitigation, and responsible AI practices

Technical Depth
- Experience with advanced RAG techniques (hybrid search, re-ranking, query expansion, contextual retrieval)
- Knowledge of vector search optimization, including indexing strategies, similarity metrics, and performance tuning
- Experience building tool-calling systems that integrate with external APIs and services
- Knowledge of graph databases and knowledge graph construction
- Familiarity with conversational AI and dialogue management systems
- Experience with A/B testing frameworks for AI systems

Technical Challenges You'll Work On
- Building semantic search that understands travel intent ("romantic getaway" vs "adventure trip" vs "family vacation")
- Creating AI agents that can book multi-leg journeys by coordinating with multiple service providers
- Implementing tool calling systems that safely execute real booking actions with proper validation
- Designing RAG systems that provide accurate, up-to-date travel information from diverse sources
- Building conversational AI that can handle complex travel planning requirements
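The hybrid search responsibility above (combining semantic similarity with keyword-based search) can be sketched in a few lines of Python. This is a minimal, self-contained illustration with toy two-dimensional "embeddings" and a simple weighted-sum fusion; the function names, the sample documents, and the `alpha` weight are all hypothetical, and a production system would use a real embedding model plus a sparse scorer such as BM25:

```python
import math

def cosine(a, b):
    # Cosine similarity between two dense embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def keyword_score(query, doc):
    # Sparse signal: fraction of query terms that appear in the document.
    q_terms = set(query.lower().split())
    d_terms = set(doc.lower().split())
    return len(q_terms & d_terms) / len(q_terms) if q_terms else 0.0

def hybrid_rank(query, query_vec, docs, alpha=0.6):
    # docs: list of (text, embedding); alpha weights dense vs sparse scores.
    scored = [
        (alpha * cosine(query_vec, vec) + (1 - alpha) * keyword_score(query, text), text)
        for text, vec in docs
    ]
    return [text for _, text in sorted(scored, reverse=True)]

docs = [
    ("romantic beach getaway for couples", [0.9, 0.1]),
    ("family vacation with kids activities", [0.1, 0.9]),
]
print(hybrid_rank("romantic getaway", [0.85, 0.2], docs))
```

Reciprocal rank fusion is a common alternative to the weighted sum when the two scores are not on comparable scales.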

Posted 16 hours ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

Remote


Role: Data Engineering Lead
Experience: 7-10 Years
Location: Hyderabad
We need immediate joiners only (max. 15 days notice). This is a work-from-office role, 5 days a week (no hybrid/remote option). We are looking for candidates with strong experience in data architecture.

About the company: We provide companies with innovative technology solutions for everyday business problems. Our passion is to help clients become intelligent, information-driven organizations, where fact-based decision-making is embedded into daily operations, leading to better processes and outcomes. Our team combines strategic consulting services with growth-enabling technologies to evaluate risk, manage data, and leverage AI and automated processes more effectively, drawing on deep Big Four consulting experience in business transformation and efficient processes.

Job Description: We are looking for an accomplished and dynamic Data Engineering Lead to drive the design, development, and delivery of cutting-edge data solutions. This role requires a balance of strong technical expertise, strategic leadership, and a consulting mindset. As the Lead Data Engineer, you will oversee the design and development of robust data pipelines and systems, manage and mentor a team of 5 to 7 engineers, and play a critical role in architecting innovative solutions tailored to client needs. You will lead by example, fostering a culture of accountability, ownership, and continuous improvement while delivering impactful, scalable data solutions in a fast-paced consulting environment.

Key Responsibilities:

Client Collaboration
- Act as the primary point of contact for US-based clients, communicating effectively to ensure alignment on project goals, timelines, and deliverables.
- Engage with stakeholders to understand requirements and ensure alignment throughout the project lifecycle.
- Present technical concepts and designs to both technical and non-technical audiences.
- Set realistic expectations with clients and proactively address concerns or risks.

Data Solution Design and Development
- Architect, design, and implement end-to-end data pipelines and systems that handle large-scale, complex datasets.
- Ensure optimal system architecture for performance, scalability, and reliability.
- Evaluate and integrate new technologies to enhance existing solutions.
- Implement best practices in ETL/ELT processes, data integration, and data warehousing.

Project Leadership and Delivery
- Lead technical project execution, ensuring timelines and deliverables are met with high quality.
- Collaborate with cross-functional teams to align business goals with technical solutions.
- Translate business requirements into actionable technical strategies.

Team Leadership and Development
- Manage, mentor, and grow a team of 5 to 7 data engineers.
- Ensure timely follow-ups on action items and maintain seamless communication across time zones.
- Conduct code reviews and validations, providing feedback to ensure adherence to technical standards.
- Provide technical guidance and foster an environment of continuous learning, innovation, and collaboration.
- Support collaboration and alignment between the client and delivery teams.

Optimization and Performance Tuning
- Be hands-on in developing, testing, and documenting data pipelines and solutions as needed.
- Analyze and optimize existing data workflows for performance and cost-efficiency.
- Troubleshoot and resolve complex technical issues within data systems.

Adaptability and Innovation
- Embrace a consulting mindset with the ability to quickly learn and adopt new tools, technologies, and frameworks.
- Identify opportunities for innovation and implement cutting-edge technologies in data engineering.
- Exhibit a "figure it out" attitude, taking ownership and accountability for challenges and solutions.

Learning and Adaptability
- Stay updated on emerging data technologies, frameworks, and tools.
- Actively explore and integrate new technologies to improve existing workflows and solutions.

Internal Initiatives and Eminence Building
- Drive internal initiatives to improve processes, frameworks, and methodologies.
- Contribute to the organization's eminence by developing thought leadership, sharing best practices, and participating in knowledge-sharing activities.

Qualifications

Education
- Bachelor's or master's degree in computer science, data engineering, or a related field.
- Certifications in cloud platforms, such as Snowflake SnowPro Data Engineer, are a plus.

Experience
- 8+ years of experience in data engineering, with hands-on expertise in data pipeline development, architecture, and system optimization.
- Demonstrated success in managing global teams, especially across US and India time zones.
- Proven track record in leading data engineering teams and managing end-to-end project delivery.
- Strong background in data warehousing and familiarity with tools such as Matillion, dbt, Striim, etc.

Technical Skills
- Ability to lead the design, development, and deployment of scalable data architectures, pipelines, and processes tailored to client needs.
- Expertise in programming languages such as Python, Scala, or Java.
- Proficiency in designing and delivering data pipelines in cloud data warehouses (e.g., Snowflake, Redshift) using ETL/ELT tools such as Matillion, dbt, Striim, etc.
- Solid understanding of database systems (relational and NoSQL) and data modeling techniques.
- 2+ years of hands-on experience designing and developing data integration solutions using Matillion and/or dbt.
- Strong knowledge of data engineering and integration frameworks, with expertise in architecting data solutions.
- Successful implementation of at least two end-to-end projects with multiple transformation layers.
- Good grasp of coding standards, with the ability to define standards and testing strategies for projects.
- Proficiency in working with cloud platforms (AWS, Azure, GCP) and associated data services.
- Enthusiasm for working in an Agile methodology.
- Comprehensive understanding of the DevOps process, including GitHub integration and CI/CD pipelines.

Soft Skills
- Exceptional problem-solving and analytical skills.
- Strong communication and interpersonal skills to manage client relationships and team dynamics.
- Ability to thrive in a consulting environment, quickly adapting to new challenges and domains.
- Ability to handle ambiguity and proactively take ownership of challenges.
- Demonstrated accountability, ownership, and a proactive approach to solving problems.

Why Join Us?
- Be at the forefront of data innovation and lead impactful projects.
- Work with a collaborative and forward-thinking team.
- Opportunity to mentor and develop talent in the data engineering space.
- Competitive compensation and benefits package.

Skills: ETL/ELT processes, cloud platforms (AWS, Azure, GCP), data pipeline development, Python, SQL, NoSQL & data modeling, data modeling techniques, data engineering, data warehousing, programming languages (Python, Scala, Java), DevOps process, CI/CD pipelines, data integration, system optimization, Azure, Agile methodology, GitHub integration, data architecture, AWS
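The "multiple transformation layers" this role calls for can be illustrated with a tiny raw → staging → mart pipeline. The following is a minimal Python sketch with fabricated sample rows and hypothetical layer functions; in practice these layers would typically be expressed as dbt models or Matillion jobs rather than Python functions:

```python
# Raw → staging → mart: each layer is a pure function over the previous one,
# mirroring the layered transformations mentioned above.
raw = [
    {"order_id": "1", "amount": " 250.00", "country": "in"},
    {"order_id": "2", "amount": "120.50", "country": "IN"},
    {"order_id": "2", "amount": "120.50", "country": "IN"},  # duplicate source row
]

def staging(rows):
    # Clean types and casing, and drop exact duplicates.
    seen, out = set(), []
    for r in rows:
        key = (r["order_id"], r["amount"].strip(), r["country"].upper())
        if key not in seen:
            seen.add(key)
            out.append({"order_id": int(r["order_id"]),
                        "amount": float(r["amount"]),
                        "country": r["country"].upper()})
    return out

def mart(rows):
    # Aggregate to a reporting grain: revenue per country.
    totals = {}
    for r in rows:
        totals[r["country"]] = totals.get(r["country"], 0.0) + r["amount"]
    return totals

print(mart(staging(raw)))  # {'IN': 370.5}
```

Keeping each layer side-effect-free makes the pipeline easy to test and to re-run idempotently, which is what warehouse-centric ELT tools encourage.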

Posted 16 hours ago

Apply

4.0 years

0 Lacs

India

Remote


Wobot AI is hiring a Senior Backend Developer (Node.js + ClickHouse) to help build the data backbone of our automation and vision intelligence platform. Explore the details below and see if you're the right fit!

What you'll do:
- Design and implement ingestion pipelines into ClickHouse for Computer Vision and other high-volume structured insights.
- Model efficient, scalable schemas using MergeTree, ReplacingMergeTree, and appropriate partitioning strategies.
- Implement deduplication, version control, and update-safe ingestion strategies tailored for real-time and mutable data.
- Build and maintain backend services and APIs that expose ClickHouse data to other systems such as product dashboards and internal workflows.
- Collaborate with CV and backend teams to ensure seamless data flow, system integration, and ingestion resilience.
- Work with product and data consumers to support high-performance analytical queries and structured data access.
- Monitor and maintain ingestion health, performance, observability, and error handling across the pipeline.
- Contribute to future-facing system design that enables AI agent integration, context-aware workflows, and evolving protocols such as MCP.

What we are looking for:

Must have:
- 4 to 6 years of backend development experience with strong proficiency in Node.js.
- At least 1 year of production-grade experience with ClickHouse, including schema design and performance tuning.
- Experience building data pipelines using RabbitMQ, Pub/Sub, or other messaging systems.
- Solid understanding of time-series data, analytical query patterns, and distributed ingestion design.
- Familiarity with Google Cloud Platform and serverless development practices.

Good to have:
- Experience with TypeScript in production backend systems.
- Exposure to building serverless applications using Cloud Run or AWS Lambda.
- Experience working with materialized views, TTL-based retention, and ingestion optimization in ClickHouse.
- Prior experience with Computer Vision pipelines or real-time data flows.
- Awareness of modern backend patterns that support AI/ML-generated insights, structured data orchestration, and agent-based interactions.
- Familiarity with designing systems that could interface with evolving protocols such as MCP or context-rich feedback systems.

How we work:
- We use Microsoft Teams for daily communication and conduct daily standups and team meetings over Teams.
- We value open discussion, ownership, and a founder mindset.
- We prioritize design, amazing UI/UX, documentation, to-do lists, and data-based decision-making.
- We encourage team bonding through bi-weekly town halls, destressing sessions with a certified healer, and fun company retreats twice a year.
- We offer a 100% remote workplace model, health insurance, attractive equity options for top performers, mental health consultations, company-sponsored upskilling courses, growth hours, 40 hours for community causes, and access to a financial advisor.

Wobot is an Equal Opportunity Employer. If you have a passion for developing innovative solutions and want to work on cutting-edge technology, we encourage you to apply for this exciting opportunity.
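The deduplication and update-safe ingestion duties above typically lean on ClickHouse's ReplacingMergeTree engine, which, per sorting key, eventually keeps only a row with the newest version (readers deduplicate not-yet-merged parts at query time, e.g. with `SELECT ... FINAL`). Below is a pure-Python sketch of that keep-latest-version semantic; the table columns and sample events are hypothetical:

```python
# Model of ReplacingMergeTree(version) semantics: within each sorting key,
# a row carrying the maximum version survives; earlier versions are replaced.
def replacing_merge(rows, key_cols, version_col):
    latest = {}
    for row in rows:
        key = tuple(row[c] for c in key_cols)
        # Later rows win ties, matching ClickHouse keeping "any one" of the
        # rows that share the maximum version.
        if key not in latest or row[version_col] >= latest[key][version_col]:
            latest[key] = row
    return list(latest.values())

events = [
    {"camera_id": "cam-1", "event_id": 7, "label": "person", "version": 1},
    {"camera_id": "cam-1", "event_id": 7, "label": "person+helmet", "version": 2},  # correction
    {"camera_id": "cam-2", "event_id": 9, "label": "vehicle", "version": 1},
]
merged = replacing_merge(events, key_cols=("camera_id", "event_id"), version_col="version")
print(sorted(r["label"] for r in merged))  # ['person+helmet', 'vehicle']
```

Because the real merge is asynchronous, ingestion stays append-only and fast; correctness at read time comes from `FINAL` or an equivalent `argMax`-style aggregation.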

Posted 16 hours ago

Apply

7.0 years

0 Lacs

India

Remote


Position: Co-Founder (CTO)
Company: Friendly AI
Location: Remote

About Us: We're building an AI companion app that aims to address the global loneliness epidemic through culturally aware, multilingual conversational AI. Our mission is to provide meaningful connections through technology while being sensitive to cultural nuances.

Position Overview: We are looking for a technical visionary to join as Co-Founder and CTO of an AI companion app focused on social connections. This person will be responsible both for building the initial product and for leading long-term technical strategy.

Primary Skills Required:

1. AI/ML & NLP
- Deep expertise in LLMs (GPT/Claude/PaLM)
- Experience with conversational AI
- Multi-language NLP (especially Hinglish)
- Model fine-tuning and optimization
- AI response accuracy and safety

2. Mobile Development
- React Native/Flutter expertise
- Real-time chat systems
- Social features implementation
- Performance optimization
- App Store experience

3. Backend & Infrastructure
- Cloud architecture (AWS/GCP)
- Scalable systems design
- Database optimization
- Security implementation
- Real-time communication

4. Leadership & Vision
- Previous startup experience
- Product strategy skills
- Team-building capability
- Technical roadmap planning
- Business acumen

Key Responsibilities:

1. Initial Phase
- Build the MVP independently
- Design the system architecture
- Implement core AI features
- Develop the mobile applications
- Set up infrastructure

2. Growth Phase
- Hire and lead the tech team
- Scale systems and infrastructure
- Optimize AI performance
- Manage technical operations
- Guide product evolution

3. Strategic
- Define the technical vision
- Align technology with business goals
- Manage resources and budget
- Drive innovation
- Handle investor relations

Required Experience:
- 7+ years in software development
- Previous startup experience
- Consumer app development
- AI/ML project leadership
- Team management experience

Ideal Qualities:
- Strong hands-on coding abilities
- Entrepreneurial mindset
- Problem-solving skills
- Communication excellence
- Business understanding
- User-centric thinking

Compensation:
- Significant equity (50%)
- Decision-making authority
- Technology stack freedom
- Growth opportunities

Let's build something great together.

Posted 16 hours ago

Apply

4.0 years

0 Lacs

India

Remote


This role is for one of Weekday's clients.
Salary range: Rs 1200000 - Rs 1600000 (i.e., INR 12-16 LPA)
Min Experience: 4 years
Location: Remote (India)
Job Type: Full-time

About the Role: We are looking for a skilled and experienced Java Developer with a strong background in migration projects and hands-on experience with Microsoft Azure to join our dynamic development team. The ideal candidate will play a critical role in modernizing legacy systems and ensuring seamless migration to cloud-native environments. If you're passionate about designing robust, scalable applications and navigating cloud-based transformations, we'd love to hear from you. As part of this role, you will analyze legacy Java applications, develop strategies for their migration, implement enhancements, and deploy them on Azure cloud infrastructure. You will collaborate closely with DevOps, QA, and solution architects to ensure high-performance, secure, and scalable systems.

Key Responsibilities:
- Lead or contribute to the migration of legacy systems to modern Java-based architectures on Microsoft Azure.
- Analyze existing monolithic or on-prem systems to plan and execute cloud migration strategies.
- Design and develop Java applications, APIs, and services using Spring Boot and modern frameworks.
- Ensure smooth integration with Azure cloud components such as Azure App Services, Azure SQL, Azure Storage, etc.
- Optimize code for performance and scalability across distributed systems.
- Collaborate with solution architects and stakeholders to define migration goals, timelines, and deliverables.
- Implement automation tools and pipelines to streamline migration and deployment processes.
- Work closely with QA and DevOps teams to establish continuous integration and deployment pipelines.
- Troubleshoot issues in migration and production environments, and provide root cause analysis.
- Create documentation, including technical specifications, migration runbooks, and architectural diagrams.

Required Skills and Qualifications:
- 4+ years of experience in Java development, with strong hands-on expertise in Java 8+, Spring/Spring Boot, and object-oriented programming principles.
- Proven experience in legacy system modernization and application migration projects.
- Strong knowledge of Azure services and cloud-native development, especially deploying Java apps on Azure.
- Experience with RESTful API design, microservices, and containerized environments (Docker/Kubernetes preferred).
- Familiarity with databases such as Azure SQL, PostgreSQL, or MySQL, including data migration and schema evolution.
- Understanding of CI/CD pipelines, source control (Git), and build tools (Maven/Gradle).
- Strong analytical, problem-solving, and communication skills.
- Experience working in Agile or Scrum development environments.

Preferred Skills (Good to Have):
- Knowledge of other cloud platforms (AWS, GCP).
- Familiarity with DevOps tools such as Azure DevOps, Terraform, or Ansible.
- Experience in performance tuning, system monitoring, and cost optimization on Azure.
- Exposure to container orchestration tools like Kubernetes.

Posted 16 hours ago

Apply

14.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Description and Requirements

Position Summary:
A highly skilled Big Data (Hadoop) Administrator responsible for the installation, configuration, engineering, and architecture of Cloudera Data Platform (CDP) and Cloudera Flow Management (CFM) streaming clusters on RedHat Linux. Strong expertise in DevOps practices, scripting, and infrastructure-as-code for automating and optimizing operations is highly desirable. Experience collaborating with cross-functional teams, including application development, infrastructure, and operations, is highly preferred.

Job Responsibilities:
- Manages the design, distribution, performance, replication, security, availability, and access requirements for large and complex Big Data clusters.
- Designs and develops the architecture and configurations to support various application needs; implements backup, recovery, archiving, conversion strategies, and performance tuning; manages job scheduling, application release, cluster changes, and compliance.
- Identifies and resolves issues utilizing structured tools and techniques.
- Provides technical assistance and mentoring to staff in all aspects of Hadoop cluster management; consults and advises application development teams on security, query optimization, and performance.
- Writes scripts to automate routine cluster management tasks and documents maintenance processing flows per standards.
- Implements industry best practices while performing Hadoop cluster administration tasks.
- Works in an Agile model with a strong understanding of Agile concepts.
- Collaborates with development teams to provide and implement new features.
- Debugs production issues by analyzing logs directly and using tools like Splunk and Elastic.
- Addresses organizational obstacles to enhance processes and workflows.
- Adopts and learns new technologies based on demand, and supports team members by coaching and assisting.

Education: Bachelor's degree in computer science, information systems, or another related field, with 14+ years of IT and infrastructure engineering work experience.

Experience: 14+ years of total IT experience, including 10+ years of relevant experience in Big Data database technologies.

Technical Skills:
- Big Data Platform Management: Expertise in managing and optimizing the Cloudera Data Platform, including components such as Apache Hadoop (YARN and HDFS), Apache HBase, Apache Solr, Apache Hive, Apache Kafka, Apache NiFi, Apache Ranger, Apache Spark, as well as JanusGraph and IBM BigSQL.
- Data Infrastructure & Security: Proficient in designing and implementing robust data infrastructure solutions with a strong focus on data security, utilizing tools like Apache Ranger and Kerberos.
- Performance Tuning & Optimization: Skilled in performance tuning and optimization of big data environments, leveraging advanced techniques to enhance system efficiency and reduce latency.
- Backup & Recovery: Experienced in developing and executing comprehensive backup and recovery strategies to safeguard critical data and ensure business continuity.
- Linux & Troubleshooting: Strong knowledge of Linux operating systems, with proven ability to troubleshoot and resolve complex technical issues, collaborating effectively with cross-functional teams.
- DevOps & Scripting: Proficient in scripting and automation using tools like Ansible, enabling seamless integration and automation of cluster operations. Experienced in infrastructure-as-code practices and observability tools such as Elastic.
- Agile & Collaboration: Strong understanding of Agile SAFe for Teams, with the ability to work effectively in Agile environments and collaborate with cross-functional teams.
- ITSM Process & Tools: Knowledgeable in ITSM processes and tools such as ServiceNow.

Other Critical Requirements:
- Automation and Scripting: Proficiency in automation tools and programming languages such as Ansible and Python to streamline operations and improve efficiency.
- Analytical and Problem-Solving Skills: Strong analytical and problem-solving abilities to address complex technical challenges in a dynamic enterprise environment.
- 24x7 Support: Ability to work in a 24x7 rotational shift to support Hadoop platforms and ensure high availability.
- Team Management and Leadership: Proven experience managing geographically distributed and culturally diverse teams, with strong leadership, coaching, and mentoring skills.
- Communication Skills: Exceptional written and oral communication skills, with the ability to clearly articulate technical and functional issues, conclusions, and recommendations to stakeholders at all levels.
- Stakeholder Management: Prior experience effectively managing both onshore and offshore stakeholders, ensuring alignment and collaboration across teams.
- Business Presentations: Skilled in creating and delivering impactful business presentations to communicate key insights and recommendations.
- Collaboration and Independence: Demonstrated ability to work independently as well as collaboratively within a team environment, ensuring successful project delivery in a complex enterprise setting.

About MetLife
Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World's 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world's leading financial services companies, providing insurance, annuities, employee benefits, and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple: to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we're inspired to transform the next century in financial services. At MetLife, it's #AllTogetherPossible. Join us!
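The "scripts to automate routine cluster management tasks" responsibility above can be as simple as parsing capacity reports and flagging outliers. The sketch below assumes a simplified `hdfs dfsadmin -report`-style text (the sample report is fabricated for illustration) and flags DataNodes whose DFS usage exceeds a threshold:

```python
import re

def flag_hot_datanodes(report_text, threshold=85.0):
    # Walk a (simplified) dfsadmin report: remember the most recent
    # "Name:" line, then check the "DFS Used%:" line that follows it.
    hot, current = [], None
    for line in report_text.splitlines():
        m = re.match(r"Name:\s*(\S+)", line.strip())
        if m:
            current = m.group(1)
            continue
        m = re.match(r"DFS Used%:\s*([\d.]+)%", line.strip())
        if m and current and float(m.group(1)) > threshold:
            hot.append(current)
    return hot

sample = """\
Name: 10.20.0.11:9866
DFS Used%: 91.40%
Name: 10.20.0.12:9866
DFS Used%: 54.05%
"""
print(flag_hot_datanodes(sample))  # ['10.20.0.11:9866']
```

In practice a script like this would run the report via `subprocess` on a schedule and feed the result into an alerting channel rather than printing it.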

Posted 16 hours ago

Apply

6.0 years

0 Lacs

Mysore, Karnataka, India

On-site


Introduction
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio.

Your Role and Responsibilities
As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating and integrating code into enterprise systems. You will have access to the latest education, tools, and technology, and a limitless career path with the world's technology leader. Come to IBM and make a global impact!
- The ability to be a team player
- The ability and skill to train other people in procedural and technical topics
- Strong communication and collaboration skills

Preferred Education: Master's degree

Required Technical and Professional Expertise
- Strong knowledge and experience in database design, modelling, and development using PL/SQL (minimum of 6 years)
- Proficiency with Oracle databases and tools such as SQL Developer and Toad
- In-depth understanding of SQL tuning and optimization techniques
- Knowledge of database performance monitoring and troubleshooting
- Familiarity with ETL processes and data integration techniques
- Strong analytical and problem-solving skills

Preferred Technical and Professional Experience
- Ability to work in a fast-paced environment and meet deadlines
- Knowledge of agile software development practices is a plus
- Bachelor's degree in computer science or a related field is preferred, but not required

Posted 16 hours ago

Apply


0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site


Proficient in core Java concepts, including object-oriented programming (OOP) principles, collections, exception handling, and multithreading. Understanding of Java 8 features and beyond, such as lambda expressions, streams, and functional interfaces. Thorough knowledge of the Spring framework, including Spring Core, Spring MVC, Spring Boot, Spring Data, and Spring Security. Experience in configuring and using dependency injection and inversion of control (IoC) in Spring. Proficient in creating RESTful APIs using Spring MVC or Spring Boot. Proficient in interacting with databases using Spring Data JPA or other persistence frameworks within the Spring ecosystem. Experience in writing SQL queries, managing transactions, and working with various databases such as MySQL, PostgreSQL, Oracle, etc. Understanding and implementation of security features using Spring Security, including authentication, authorization, and securing RESTful endpoints. Experience in designing and developing RESTful APIs using Spring, adhering to best practices and standards. Familiarity with tools like Postman for API testing and documentation. Proficient in using build tools such as Maven or Gradle for project build automation and dependency management. Understanding of performance tuning and optimization techniques for Spring applications. Awareness of code quality standards and the ability to conduct and participate in code reviews. Strong analytical and problem-solving skills to identify and resolve technical issues effectively. -> Security-Centric Development: Develop secure Java Spring Boot applications, following best practices for authentication, authorization, data protection, and secure communication. -> Microservices (Good to Have): Design, implement, and maintain microservices using Spring Boot, adhering to microservices architecture principles for scalability, maintainability, and fault tolerance. 
Containerization and Orchestration (Good to Have): Utilize Docker and Kubernetes for containerization and orchestration to optimize application deployment, scaling, and management. CI/CD with Jenkins (Good to Have): Implement and optimize continuous integration and continuous deployment (CI/CD) pipelines using Jenkins for Spring Boot applications.
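The listing above asks for experience managing database transactions. As a language-agnostic illustration (in the role itself this would typically be handled in Java via Spring's @Transactional), here is a minimal sketch using Python's stdlib sqlite3 module, in which both updates of a transfer commit together or roll back together; the table and amounts are invented for the example:

```python
import sqlite3

# Stand-in schema for illustrating transaction semantics.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance INTEGER)")
conn.execute("INSERT INTO accounts VALUES (1, 100), (2, 50)")
conn.commit()

def transfer(conn, src, dst, amount):
    """Move funds atomically: both UPDATEs commit together or neither does."""
    try:
        with conn:  # connection context manager commits on success, rolls back on error
            conn.execute("UPDATE accounts SET balance = balance - ? WHERE id = ?",
                         (amount, src))
            cur = conn.execute("SELECT balance FROM accounts WHERE id = ?", (src,))
            if cur.fetchone()[0] < 0:
                raise ValueError("insufficient funds")
            conn.execute("UPDATE accounts SET balance = balance + ? WHERE id = ?",
                         (amount, dst))
        return True
    except ValueError:
        return False

assert transfer(conn, 1, 2, 30) is True    # succeeds: balances become 70 / 80
assert transfer(conn, 1, 2, 500) is False  # rolled back: balances unchanged
balances = [r[0] for r in conn.execute("SELECT balance FROM accounts ORDER BY id")]
print(balances)  # [70, 80]
```

The key point the listing is probing for is that the failed second transfer leaves no partial debit behind.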

Posted 16 hours ago

Apply

0 years

0 Lacs

Mumbai Metropolitan Region

Remote


Are you ready to make your mark with a true industry disruptor? ZineOne, a subsidiary of Session AI, the pioneer of in-session marketing, is looking to add talented team members to help us grow into the premier revenue tool for e-commerce. We work with some of the leading brands nationwide and we innovate how brands connect with and convert customers. Job Description This position offers a hands-on, technical opportunity as a vital member of the Site Reliability Engineering Group. Our SRE team is dedicated to ensuring that our Cloud platform operates seamlessly, efficiently, and reliably at scale. The ideal candidate will bring over five years of experience managing cloud-based Big Data solutions, with a strong commitment to resolving operational challenges through automation and sophisticated software tools. Candidates must uphold a high standard of excellence and possess robust communication skills, both written and verbal. A strong customer focus and deep technical expertise in areas such as Linux, automation, application performance, databases, load balancers, networks, and storage systems are essential. 
Key Responsibilities: As a Session AI SRE, you will: Design and implement solutions that enhance the availability, performance, and stability of our systems, services, and products. Develop, automate, and maintain infrastructure as code for provisioning environments in AWS, Azure, and GCP. Deploy modern automated solutions that enable automatic scaling of the core platform and features in the cloud. Apply cybersecurity best practices to safeguard our production infrastructure. Collaborate on DevOps automation, continuous integration, test automation, and continuous delivery for the Session AI platform and its new features. Manage data engineering tasks to ensure accurate and efficient data integration into our platform and outbound systems. Utilize expertise in DevOps best practices, shell scripting, Python, Java, and other programming languages, while continually exploring new technologies for automation solutions. Design and implement monitoring tools for service health, including fault detection, alerting, and recovery systems. Oversee business continuity and disaster recovery operations. Create and maintain operational documentation, focusing on reducing operational costs and enhancing procedures. Demonstrate a continuous learning attitude with a commitment to exploring emerging technologies. Preferred Skills: Experience with cloud platforms like AWS, Azure, and GCP, including their management consoles and CLI. Proficiency in building and maintaining infrastructure on: AWS using services such as EC2, S3, ELB, VPC, CloudFront, Glue, Athena, etc.; Azure using services such as Azure VMs, Blob Storage, Azure Functions, Virtual Networks, Azure Active Directory, Azure SQL Database, etc.; GCP using services such as Compute Engine, Cloud Storage, Cloud Functions, VPC, Cloud IAM, BigQuery, etc. Expertise in Linux system administration and performance tuning. Strong programming skills in Python, Bash, and NodeJS. In-depth knowledge of container technologies like Docker and Kubernetes.
Experience with real-time, big data platforms, including architectures like HDFS/HBase, Zookeeper, and Kafka. Familiarity with central logging systems such as the ELK stack (Elasticsearch, Logstash, Kibana). Competence in implementing monitoring solutions using tools like Grafana, Telegraf, and InfluxDB. Benefits: Competitive salary package and stock options. Opportunity for continuous learning. Fully sponsored EAP services. Excellent work culture. Opportunity to be an integral part of our growth story and grow with our company. Health insurance for employees and dependents. Flexible work hours. Remote-friendly company.
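The responsibilities above include designing monitoring tools with fault detection and alerting. A minimal, dependency-free sketch of the core idea, alerting when the failure rate over a sliding window of health-check probes crosses a threshold (the window size and threshold below are illustrative choices, not values from the posting):

```python
from collections import deque

class HealthMonitor:
    """Sketch of a fault detector over a sliding window of probe results."""

    def __init__(self, window=10, max_error_rate=0.3):
        self.results = deque(maxlen=window)
        self.max_error_rate = max_error_rate

    def record(self, ok: bool) -> bool:
        """Record one probe result; return True if an alert should fire."""
        self.results.append(ok)
        failures = sum(1 for r in self.results if not r)
        # Only alert once the window is full, to avoid noise at startup.
        return (len(self.results) == self.results.maxlen
                and failures / len(self.results) > self.max_error_rate)

monitor = HealthMonitor(window=5, max_error_rate=0.4)
probes = [True, True, False, False, False]   # service starts degrading
alerts = [monitor.record(p) for p in probes]
print(alerts)  # alert fires only once the window is full and 3/5 probes fail
```

In a real system the alert would page an on-call rotation or trigger automated recovery rather than returning a boolean.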

Posted 16 hours ago

Apply

8.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


Job Title: EBS Apps DBA Work Location: Any Oracle Global Services Center is a unit within Oracle that establishes long-term relationships with many of Oracle's customers through annuity-based service contracts and project-based one-time services. The Oracle GSC team sells from a broad IT-services portfolio on both a fixed-price and a T&M basis. Oracle GSC services are typically requested by large Oracle customers that require the utmost attention to real mission-critical applications and processes. Oracle GSC covers many large-scale Oracle customers. Oracle Global Services Center provides unmatched, tailored support that ensures an organization's Oracle technology investments deliver the cutting-edge innovation and performance their business requires to compete, all while coexisting within their IT environment. Detailed Job Description: An experienced EBS Apps DBA consulting professional who understands solutions, best practices, processes, or technology designs within and surrounding the Oracle E-Business Suite platform. Operates independently to deliver quality work products to an engagement. Performs varied and complex duties and tasks that need independent judgment, to implement Oracle EBS product technology meeting customer needs, by applying Oracle methodology, company procedures, and leading practices. The consultant may act as the team lead on projects. Effectively consults with management of customer organizations. Participates in business development activities. Develops and configures detailed solutions for moderately complex projects. 8 - 12 years of experience relevant to this position, including consulting experience, preferred. Undergraduate degree or equivalent experience. Product or technical expertise relevant to practice focus. Ability to communicate effectively. Ability to build rapport with team members and clients. Ability to travel as needed.
Required Skills: Experience in Installation, Migration, and Upgrade of Oracle EBS Applications 12.1.3, 12.2+. Experience in Administration of EBS (Cloning, Patching, Health Checks, Troubleshooting Issues). Experience in Managing Oracle EBS Production Environments. Experience in Upgrading EBS databases (e.g., 12c to 19c). Setting up Multi-Node EBS applications, including DMZ external tiers. Setting up Grid/ASM and RAC for the Oracle EBS database. Performance Tuning of EBS R12 applications and RAC databases. Experience in using tools like RMAN, EXP/IMP, or TTS for databases. Experience in backup and restore operations for EBS applications and the database. Knowledge of setting up a disaster recovery environment for EBS. Knowledge of cross-platform migrations of EBS applications. Knowledge of setting up Single Sign-On for EBS. Ready to work in 24x7 shifts. Ready to travel (within India or abroad). EBS cloud-migration exposure. OCI Foundation Certification. Desired Skills: OCI Certification (Foundation / Architect / Professional) is an added advantage. Willingness to travel, both domestic and international.
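The skills above include backup and restore operations. The snippet below is only a generic illustration of verifying a restored copy by checksum, not an RMAN workflow (RMAN performs its own block-level validation); the file names are invented for the example:

```python
import hashlib
import pathlib
import tempfile

def sha256_of(path: pathlib.Path) -> str:
    """Stream a file through SHA-256 so large backup pieces fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

# Simulate a backup piece and its restored copy, then verify integrity.
with tempfile.TemporaryDirectory() as d:
    src = pathlib.Path(d) / "datafile.dbf"     # hypothetical source file
    dst = pathlib.Path(d) / "restored.dbf"     # hypothetical restored copy
    src.write_bytes(b"datafile contents " * 1000)
    dst.write_bytes(src.read_bytes())
    verified = sha256_of(src) == sha256_of(dst)
print(verified)  # True when the restored copy is byte-identical
```

The same streaming-checksum pattern is a common sanity check after any restore, regardless of the backup tool.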

Posted 16 hours ago

Apply

4.0 years

0 Lacs

Jaipur, Rajasthan, India

On-site


Hello, We are hiring for "PostgreSQL Administrator" for the Jaipur location. Exp: 4+ years. Work Mode: 5 days work from office. Loc: Jaipur. Notice period: Immediate joiners (notice period served candidates). Mandatory Skills: PostgreSQL Administration, SQL Queries, PL/pgSQL, Data Modeling, Data Migration. NOTE: We are looking for immediate joiners (notice period served candidates). Apply only if you have 4+ years of relevant experience in the above skills. Irrelevant profiles will not be entertained. Qualifications: Education: Bachelor's degree in Computer Science, Information Technology, or a related field. Experience: 4+ years of experience in database administration, with a strong focus on PostgreSQL. Technical Skills: Proficiency in SQL and PL/pgSQL. Experience with database design and modeling. Experience with database performance tuning and optimization. Experience with data migration and integration. Familiarity with Linux/Unix operating systems.
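For the performance-tuning skill mentioned above, the everyday workflow is often: inspect the query plan, add an index, and confirm the plan changes. The sketch below uses Python's stdlib sqlite3 purely so it is self-contained; in PostgreSQL you would run EXPLAIN (or EXPLAIN ANALYZE) with the same CREATE INDEX idea, and the table here is invented for the example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(i, i % 100, float(i)) for i in range(1000)],
)

def plan(conn, sql):
    """Return the query plan detail text for a statement."""
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT total FROM orders WHERE customer_id = 42"
before = plan(conn, query)
conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
after = plan(conn, query)
print(before)  # full table scan
print(after)   # lookup via idx_orders_customer
```

The point a DBA interview probes is exactly this before/after: the planner switches from scanning every row to searching the index once the index exists.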

Posted 16 hours ago

Apply

8.0 - 12.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Title: Siebel Administrator Work Location: Any Oracle Global Services Center is a unit within Oracle that establishes long-term relationships with many of Oracle's customers through annuity-based service contracts and project-based one-time services. The Oracle GSC team sells from a broad IT-services portfolio on both a fixed-price and a T&M basis. Oracle GSC services are typically requested by large Oracle customers that require the utmost attention to real mission-critical applications and processes. Oracle GSC covers many large-scale Oracle customers. Oracle Global Services Center provides unmatched, tailored support that ensures an organization's Oracle technology investments deliver the cutting-edge innovation and performance their business requires to compete, all while coexisting within their IT environment. Detailed Job Description: An experienced consulting professional who has an understanding of solutions, industry best practices, multiple business processes, or technology designs within a product/technology family. Operates independently to provide quality work products to an engagement. Performs varied and complex duties and tasks that need independent judgment, in order to implement Oracle products and technology to meet customer needs. Applies Oracle methodology, company procedures, and leading practices. Demonstrates expertise to deliver functional and technical solutions on moderately complex customer engagements. May act as the team lead on projects. Effectively consults with management of customer organizations. Participates in business development activities. Develops and configures detailed solutions for moderately complex projects.
8 - 12 years of experience relevant to this position, including consulting experience, preferred. Undergraduate degree or equivalent experience. Product or technical expertise relevant to practice focus. Ability to communicate effectively. Ability to build rapport with team members and clients. Ability to travel as needed. Required Skills: Experience with Siebel installation on Windows and Linux. In-depth knowledge of and experience with Siebel migrations and upgrades to the latest versions, e.g., IP17 and later. Experience with Siebel Gateway clustering, multi-node AI load balancing, etc. Experience with Siebel performance tuning of the server, AOM, AI, Gateway, Tomcat, etc. Experience troubleshooting EAI component crashes and analysing crash, FDR, and component log files. Knowledge of system administration activities such as configuring application components and parameters, and troubleshooting component crashes. SSO and LDAP setup to AD, and troubleshooting. Good overall troubleshooting skills. Automation of regular administrative tasks. Preferably, experience with WLS/BIP/OAS/OAP installation, upgrade, and integration with Siebel. Experience with DR setup and testing. Experience managing Siebel on OCI (or any cloud) is preferable. Performance tuning of Siebel CRM. Ready to work in 24x7 shifts. Ready to travel. Cloud-migration exposure. Desired Skills: OCI Certification (Foundation / Architect / Professional) is an added advantage. Willingness to travel, both domestic and international.
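One of the skills above is automating regular administrative tasks, such as analysing component log files. A small sketch of that kind of automation, counting error codes across log lines (the log format and error codes below are invented for illustration, not Siebel's actual log layout):

```python
import re
from collections import Counter

# Hypothetical log lines standing in for component logs.
log_lines = [
    "2024-05-01 10:00:01 SCCObjMgr_enu INFO  Session started",
    "2024-05-01 10:00:05 SCCObjMgr_enu ERROR SBL-DAT-00222 connection lost",
    "2024-05-01 10:00:09 EAIObjMgr_enu ERROR SBL-EAI-04376 adapter timeout",
    "2024-05-01 10:00:12 EAIObjMgr_enu ERROR SBL-EAI-04376 adapter timeout",
]

# Tally ERROR lines by their error code so the most frequent failure surfaces.
error_codes = Counter(
    m.group(1)
    for line in log_lines
    if (m := re.search(r"ERROR\s+(SBL-\S+)", line))
)
print(error_codes.most_common())  # most frequent error codes first
```

Scheduled via cron against the real log directory, a script like this turns a manual log review into a daily summary.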

Posted 16 hours ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Introduction In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. Your Role And Responsibilities As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for Source to Target and implementing solutions that tackle the client's needs. Your Primary Responsibilities Include: Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements. Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need it. Preferred Education: Master's Degree. Required Technical And Professional Expertise: Design, develop, and maintain Ab Initio graphs for extracting, transforming, and loading (ETL) data from diverse sources to various target systems. Implement data quality and validation processes within Ab Initio. Data Modeling and Analysis: Collaborate with data architects and business analysts to understand data requirements and translate them into effective ETL processes. Analyze and model data to ensure optimal ETL design and performance. Ab Initio Components: Utilize Ab Initio components such as Transform Functions, Rollup, Join, Normalize, and others to build scalable and efficient data integration solutions.
Implement best practices for reusable Ab Initio components. Preferred Technical And Professional Experience: Optimize Ab Initio graphs for performance, ensuring efficient data processing and minimal resource utilization. Conduct performance tuning and troubleshooting as needed. Collaboration: Work closely with cross-functional teams, including data analysts, database administrators, and quality assurance, to ensure seamless integration of ETL processes. Participate in design reviews and provide technical expertise to enhance overall solution quality.
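To make the Rollup and Join components named above concrete, here is a plain-Python sketch of the two underlying operations (Ab Initio itself is a graphical ETL tool; this only illustrates the semantics on toy data invented for the example):

```python
from collections import defaultdict

sales = [
    {"region": "N", "amount": 100},
    {"region": "S", "amount": 40},
    {"region": "N", "amount": 60},
]
regions = {"N": "North", "S": "South"}

# Rollup: group records by a key and aggregate a field.
rollup = defaultdict(int)
for rec in sales:
    rollup[rec["region"]] += rec["amount"]

# Join: enrich each rolled-up record with a lookup (dimension) table.
joined = [
    {"region_name": regions[k], "total": v} for k, v in sorted(rollup.items())
]
print(joined)  # [{'region_name': 'North', 'total': 160}, {'region_name': 'South', 'total': 40}]
```

In an Ab Initio graph the same flow would be wired visually: a Rollup component keyed on region feeding a Join component against the region lookup.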

Posted 16 hours ago

Apply

5.0 - 9.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Company: Indian / Global Engineering & Manufacturing Organization Key Skills: Machine Learning (ML), AI (Artificial Intelligence), TensorFlow, Python, PyTorch. Roles and Responsibilities: Design, build, and rigorously optimize the complete stack necessary for large-scale model training, fine-tuning, and inference (including data loading, distributed training, and model deployment) to maximize Model FLOP Utilization (MFU) on compute clusters. Collaborate closely with research scientists to translate state-of-the-art models and algorithms into production-grade, high-performance code and scalable infrastructure. Implement, integrate, and test advancements from recent research publications and open-source contributions into enterprise-grade systems. Profile training workflows to identify and resolve bottlenecks across all layers of the training stack, from input pipelines to inference, enhancing speed and resource efficiency. Contribute to evaluations and selections of hardware, software, and cloud platforms defining the future of the AI infrastructure stack. Use MLOps tools (e.g., MLflow, Weights & Biases) to establish best practices across the entire AI model lifecycle, including development, validation, deployment, and monitoring. Maintain extensive documentation of infrastructure architecture, pipelines, and training processes to ensure reproducibility and smooth knowledge transfer. Continuously research and implement improvements in large-scale training strategies and data engineering workflows to keep the organization at the cutting edge. Demonstrate initiative and ownership in developing rapid prototypes and production-scale systems for AI applications in the energy sector. Experience Requirement: 5-9 years of experience building and optimizing large-scale machine learning infrastructure, including distributed training and data pipelines.
Proven hands-on expertise with deep learning frameworks such as PyTorch, JAX, or PyTorch Lightning in multi-node GPU environments. Experience in scaling models trained on large datasets across distributed computing systems. Familiarity with writing and optimizing CUDA, Triton, or CUTLASS kernels for performance enhancement is preferred. Hands-on experience with AI/ML lifecycle management using MLOps frameworks and performance profiling tools. Demonstrated collaboration with AI researchers and data scientists to integrate models into production environments. Track record of open-source contributions in AI infrastructure or data engineering is a significant plus. Education: M.E., B.Tech/M.Tech (Dual), BCA, B.E., B.Tech, M.Tech, MCA.
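Since the role above centers on maximizing Model FLOP Utilization (MFU), a small worked example of the metric itself may help. MFU is the achieved model FLOPs per second divided by the cluster's theoretical peak; the per-token FLOP count below uses the common 6 × parameter-count approximation for a dense transformer, and all numbers are hypothetical:

```python
def model_flop_utilization(tokens_per_sec, flops_per_token, n_gpus, peak_flops_per_gpu):
    """MFU = achieved model FLOPs per second / theoretical cluster peak FLOPs.

    For a dense transformer, flops_per_token is commonly approximated as
    6 * parameter_count (forward plus backward pass).
    """
    achieved = tokens_per_sec * flops_per_token
    peak = n_gpus * peak_flops_per_gpu
    return achieved / peak

# Hypothetical numbers: a 7B-parameter model on 8 GPUs rated at 312 TFLOP/s each.
mfu = model_flop_utilization(
    tokens_per_sec=25_000,
    flops_per_token=6 * 7e9,
    n_gpus=8,
    peak_flops_per_gpu=312e12,
)
print(f"{mfu:.1%}")  # 42.1%
```

Raising this ratio is exactly what the data-loading, kernel, and distributed-training optimizations in the posting are aimed at.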

Posted 16 hours ago

Apply

3.0 - 5.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Company: Indian / Global Engineering & Manufacturing Organization Key Skills: Machine Learning, AI (Artificial Intelligence), Python, PyTorch. Roles and Responsibilities: Develop, fine-tune, and deploy AI/ML models using frameworks such as TensorFlow, PyTorch, and Scikit-learn. Design and implement data pipelines, perform feature engineering, and handle data preprocessing for training and inference. Integrate AI models into business applications and APIs, ensuring scalability and performance. Collaborate with MLOps teams to deploy solutions on cloud platforms like AWS, Azure, or Google Cloud using CI/CD pipelines, Docker, and Kubernetes. Conduct model evaluation, hyperparameter tuning, and implement strategies for continuous performance improvement. Apply AI techniques to solve real-world problems in domains such as NLP, computer vision, or reinforcement learning. Work closely with cross-functional teams including data scientists, software engineers, and business stakeholders to deliver end-to-end AI solutions. Communicate complex technical concepts and model behavior clearly to non-technical stakeholders. Experience Requirement: 3-5 years of experience in building and deploying AI/ML models into production environments. Experience in setting up and managing CI/CD pipelines for model deployment. Proven ability in optimizing underperforming models through data analysis, feature selection, and algorithm tuning. Demonstrated collaboration on cross-functional projects to embed AI in real-world business applications. Strong background in model lifecycle management and production-grade AI system development. Education: M.E., MCA, BCA, B.E., B.Tech.
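The responsibilities above include hyperparameter tuning against a validation metric. A dependency-free sketch of a grid search (the "model" is a toy loss function so the example stays self-contained; in practice the inner call would train and score a real model):

```python
from itertools import product

def validation_loss(lr, reg):
    """Toy stand-in for train-then-evaluate; minimized at lr=0.01, reg=0.1."""
    return (lr - 0.01) ** 2 + (reg - 0.1) ** 2

# Candidate values per hyperparameter; the grid is the cross product.
grid = {"lr": [0.001, 0.01, 0.1], "reg": [0.0, 0.1, 1.0]}
best = min(
    (dict(zip(grid, values)) for values in product(*grid.values())),
    key=lambda params: validation_loss(**params),
)
print(best)  # {'lr': 0.01, 'reg': 0.1}
```

Grid search is the simplest baseline; random search or Bayesian optimization usually scales better as the number of hyperparameters grows.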

Posted 16 hours ago

Apply

8.0 - 14.0 years

0 Lacs

Gurugram, Haryana, India

On-site


About Company: GSPANN is a consulting services provider based in the US California Bay Area, focused on implementations in Enterprise Content Management, Business Intelligence, and Mobile Solution initiatives. More than 90% of our current clientele are FORTUNE 1000 organizations. We specialize in strategy, architecture, delivery, and support of solutions in the ECM, BI, and Mobility space. Position: Infra Engineer - Storage Experience: 8 Yrs - 15 Yrs Job Location: Hyderabad / Gurgaon Job Summary: We are seeking a highly skilled and experienced storage infrastructure engineer to oversee our IT infrastructure operations. The ideal candidate will have a strong background in Windows administration, VMware, endpoint administration, Mobile Device Management (MDM), software management, SolarWinds, people management, and governance. Key Responsibilities: Storage Management: Design, implement, and manage SAN storage solutions, ensuring optimal performance and reliability. Splunk Monitoring: Utilize Splunk for monitoring and analyzing storage infrastructure performance and issues. Performance Optimization: Optimize storage systems for efficiency, including capacity planning and performance tuning. Issue Resolution: Troubleshoot and resolve storage-related issues, ensuring minimal downtime and maximum availability. Backup and Recovery: Implement and manage backup and recovery processes to ensure data integrity and availability. Security: Develop and maintain security measures for storage systems, including access controls and data encryption. Documentation: Document storage configurations, procedures, and protocols for reference and compliance. Collaboration: Work closely with IT teams to understand storage requirements and provide solutions. Updates: Stay updated with the latest storage technologies and implement necessary updates and upgrades. Required Skills: Certifications: Relevant certifications in Splunk, SAN, or other storage technologies are preferred.
8-14 years of experience in storage infrastructure engineering, including SAN storage and Splunk monitoring. Hands-on experience in troubleshooting and optimizing storage performance. Strong knowledge of SAN storage systems and principles. Proficiency in using Splunk for monitoring and analysis. Excellent problem-solving and analytical skills. Effective communication and teamwork abilities. Desired Skills: Experience with data conversion tools and techniques. Ability to advise on process accountability, data monitoring, and exception monitoring. Experience in managing and optimizing technical business performance, including automation and simplification of business processes. Why choose GSPANN: "We GSPANNians" are at the heart of the technology that we pioneer. We do not service our customers, we co-create. With the passion to explore solutions to the most challenging business problems, we support and mentor the technologist in everyone who is a part of our team. This translates into innovations that are path-breaking and inspirational for the marquee clients we co-create a digital future with. GSPANN is a work environment where you are constantly encouraged to sharpen your abilities and shape your growth path. We support you in becoming the best version of yourself by feeding your curiosity, providing a nurturing environment, and giving ample opportunities to take ownership, experiment, learn, and succeed. We're a close-knit family of more than 1,400 people that supports one another and celebrates successes, big or small. We work together, socialize together, and actively serve the communities we live in. We invite you to carry forward the baton of innovation in technology with us. At GSPANN, we do not service. We co-create. Discover your inner technologist - explore and expand the boundaries of tech innovation without the fear of failure. Accelerate your learning - shape your career while scripting the future of tech.
Seize the ample learning opportunities to grow at a rapid pace. Feel included - at GSPANN, everyone is welcome. Age, gender, culture, and nationality do not matter here; what matters is YOU. Inspire and be inspired - when you work with the experts, you raise your game. At GSPANN, you're in the company of marquee clients and extremely talented colleagues. Enjoy life - we love to celebrate milestones and victories, big or small. Ever so often, we come together as one large GSPANN family. Give back - together, we serve communities. We take steps, small and large, so we can do good for the environment, weaving sustainability and social change into our endeavors. We invite you to carry forward the baton of innovation in technology with us. Let's co-create.

Posted 16 hours ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies