5.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Job Title - Senior Ansys Analyst (Aero) Company - Drones Tech Lab™ Experience - 5+ years Industry - Aerospace & Defence (UAV) Location - Kolkata, West Bengal Company Description Drones Tech Lab™ is a leader in India's unmanned aerial systems sector, driving innovation in drone design, manufacturing, pilot training and mission-critical deployments. With a focus on both hardware and simulation excellence, we develop high-performance UAV systems for a range of applications, including surveillance, mapping, defense, and industrial automation. Our in-house R&D and testing capabilities make us a preferred partner for end-to-end drone solutions. About the Role We are looking for a highly experienced Senior Ansys Analyst (Aero) with a strong background in aerodynamics simulation, CFD, and FEA using Ansys suite of tools. You will be responsible for conducting aerodynamic simulations and structural analyses on fixed-wing and rotary UAV platforms to optimize performance, stability, and efficiency. This role involves working closely with mechanical, flight systems, and design teams to validate critical aero-structural performance across all phases of UAV development. Responsibilities Perform CFD (Computational Fluid Dynamics) analysis for fixed-wing and multirotor UAVs under various flight conditions. Conduct FEA (Finite Element Analysis) for structural integrity, vibration, and thermal stress evaluations. Optimize aerodynamic performance for stability, lift-to-drag ratio, and energy efficiency. Analyze flow behavior, pressure distribution, boundary layers, and wake regions. Correlate simulation results with wind tunnel and field data to validate model accuracy. Contribute to design refinement and structural improvements based on simulation insights. Support airframe development by validating load-bearing structures and flight loads. Work with design and systems engineering teams to implement simulation-driven development. Generate detailed reports, validation matrices, and documentation for internal and regulatory use. Contribute to best practices in simulation workflows and automation processes. Qualifications 5+ years of experience in aero-structural simulation using Ansys Fluent, Ansys Mechanical, or equivalent tools. Bachelor’s or Master’s degree in Aerospace Engineering, Mechanical Engineering, or related discipline. Strong background in aerodynamics, fluid mechanics, and structural mechanics. Proven experience with meshing techniques, turbulence models, boundary condition setup, and solver settings. Ability to perform modal, static, dynamic, and thermal analysis for UAV structures. Familiarity with airfoil theory, Reynolds numbers, and aero design constraints in small UAVs. Proficiency in report generation, result interpretation, and integration with CAD tools (e.g., SolidWorks, Fusion 360). Knowledge of scripting (Python/MATLAB) for simulation automation is a plus. Desirable Skills & Interests Prior work experience in UAV airframe design, prototyping or testing environments. Exposure to wind tunnel testing, structural fatigue analysis and material selection. Familiarity with UAV control surfaces, aero-elastic behaviour and flutter analysis. Interest in drone performance tuning, optimization, or flight envelope expansion. 
Skills Ansys Fluent, Ansys Mechanical, CFD, FEA, Aerodynamic Simulation, Structural Analysis, UAV Airframe Validation, Meshing, Lift-to-Drag Optimization, Vibration Analysis, Python, MATLAB, Airfoil Performance, Wind Tunnel Correlation Benefits Competitive salary Work on industry-grade UAV systems used in real-world missions Collaborative work culture with engineers across avionics, mechanical, and embedded teams Access to simulation hardware and cloud compute resources Opportunities to lead advanced aero-R&D projects Join Our Team If you are passionate about aero simulation, drone design and pushing the boundaries of aero-structural performance, Drones Tech Lab™ invites you to join our simulation and analysis team.
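The qualifications for this role mention familiarity with Reynolds numbers and airfoil theory, plus scripting (Python/MATLAB) for simulation automation. As a minimal, hedged illustration of that kind of pre-CFD scripting, the Python sketch below computes a chord-based Reynolds number and a lift-to-drag ratio; the airspeed, chord, and load values are placeholder assumptions, not figures from this posting.

```python
# Minimal sketch: quick aero sanity checks an analyst might script before a CFD run.
# All numeric values below are illustrative placeholders.

def reynolds_number(velocity_ms: float, chord_m: float,
                    rho: float = 1.225, mu: float = 1.81e-5) -> float:
    """Chord-based Reynolds number Re = rho * V * c / mu (sea-level air by default)."""
    return rho * velocity_ms * chord_m / mu

def lift_to_drag(lift_n: float, drag_n: float) -> float:
    """Lift-to-drag ratio, a basic aerodynamic efficiency metric."""
    return lift_n / drag_n

if __name__ == "__main__":
    re = reynolds_number(velocity_ms=20.0, chord_m=0.25)   # small fixed-wing UAV cruise case
    print(f"Re ~ {re:,.0f}")                                # low-Re regime typical of small UAVs
    print(f"L/D = {lift_to_drag(lift_n=45.0, drag_n=3.2):.1f}")
```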
Posted 1 day ago
2.0 years
0 Lacs
Kolkata, West Bengal, India
Remote
Company Description Drones Tech Lab™ is a pioneer in India’s UAV ecosystem, offering end-to-end drone solutions including manufacturing, pilot training, forensics, and drone-as-a-service operations across sectors like surveillance, mapping, precision agriculture, and disaster management. Our team combines technical depth, domain experience, and a passion for innovation to design and deploy reliable, mission-ready unmanned systems for real-world challenges. About the Role As an Integration & Testing Engineer, you will be responsible for performing field testing of Dronestechlab’s Unmanned Aerial System (UAS) products and providing comprehensive system support. Your primary responsibilities will include the integration, testing, and validation of all R&D and new product development efforts. Additionally, you will be tasked with developing flight test plans, preparing remotely piloted aircraft for flight, conducting flight tests as the test pilot, and supporting other field-testing activities as needed. You will collaborate closely with the System Engineer & Product Development Engineering team to advance Dronestechlab’s products through field trials and client demonstrations. Responsibilities Integrate and configure flight controllers, ESCs, GPS, power systems, telemetry modules, and sensors. Ensure seamless communication across modules using UART, I2C, SPI, and PWM protocols. Perform ground testing including system checks, communication verification, and component calibration. Plan and execute system-level tests such as pre-flight checks, mission simulations, and range validation. Assist in firmware flashing, PID tuning, and signal diagnostics. Collaborate with cross-functional teams to identify, replicate, and resolve integration issues. Maintain logs, generate issue reports, and document testing procedures and protocols. Ensure adherence to SOPs and safety standards across the testing lifecycle. Support drone flight teams with mission readiness checks, tuning, and calibration. Assist in outdoor field testing, data logging, and system diagnostics under real-world conditions. Hands-on experience in integration, testing, and validation during R&D and new product development phases, with a focus on Hybrid VTOL and fixed-wing platforms. Manage development and production flight test operations with an emphasis on safety and quality control. Coordinate with other engineering teams for the integration of complementary technologies. Conduct flight testing and performance evaluation of prototypes and final production units. Assist in product documentation and user manual development. Practical, hands-on experience with UAV system assembly processes. Support the development of end-user training materials. Provide training and field support to end-users of UAS products. Qualifications Bachelor’s degree in Aeronautical/Mechanical/Other applicable engineering field. Minimum 2 years of experience in UAV integration, embedded systems, or system testing. Hands-on experience with UAV components: Flight controllers, GPS modules, ESCs, telemetry units, sensors. Basic to intermediate knowledge of communication interfaces: UART, I2C, SPI, PWM. Familiar with GCS tools like Mission Planner, QGroundControl, INAV Configurator. Capable of using diagnostic tools such as multimeters and oscilloscopes. Competent in basic firmware flashing and hardware-software integration workflows. Desirable Skills & Interests Comprehensive understanding of UAV systems. 
Strong knowledge of RC aircraft, UAV, and multirotor dynamics, control, and flight operations. Skilled in diagnosing technical issues and identifying root causes (log analysis capabilities). Capable of troubleshooting and resolving issues effectively. Experienced in performing timely repairs and maintenance. Proficient in documenting system testing procedures and repair activities. Hands-on experience in the construction and assembly of UAVs. Excellent interpersonal and communication skills. Expert in UAV configuration, performance evaluation, payload integration, autopilot tuning, and system repairs. Knowledgeable in UAS hardware, software, and sensor integration. Ability to prioritize and manage tasks to meet critical project timelines in a fast-paced environment. Skills UAV Integration, Ground Testing, Electrical Wiring, Embedded Debugging, Mission Planning Tools, Flight Controller Setup, Telemetry Systems, Firmware Flashing, Troubleshooting, Multimeter use, UART, I2C, SPI, Safety Protocols Benefits Competitive salary Exposure to real-world UAV testing and deployment Skill development in drone electronics and system integration Hands-on experience with cross-disciplinary drone systems Opportunities for travel during field testing and mission support Join Our Team If you are passionate about integrating real-world drone systems and love solving multidisciplinary engineering challenges, this is your chance to contribute to the future of UAV technology with Drones Tech Lab™.
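Since this role covers ground testing, communication verification, and telemetry over interfaces such as UART, here is a hedged sketch of the kind of bench script an integration engineer might use to confirm a flight controller is streaming telemetry. It uses pyserial; the port name and baud rate are assumptions, not values from the posting.

```python
# Minimal sketch: verify that a flight controller is streaming telemetry over UART.
# Assumes pyserial is installed; port and baud rate are placeholders.
import serial

PORT = "/dev/ttyUSB0"   # assumed telemetry radio / flight-controller port
BAUD = 57600            # common telemetry baud rate (assumption)

def check_telemetry(port: str = PORT, baud: int = BAUD, n_bytes: int = 256) -> bool:
    """Return True if any bytes arrive within the timeout window."""
    with serial.Serial(port, baud, timeout=2) as ser:
        data = ser.read(n_bytes)          # blocks for up to 2 s
        print(f"received {len(data)} bytes")
        return len(data) > 0

if __name__ == "__main__":
    ok = check_telemetry()
    print("telemetry link OK" if ok else "no data - check wiring, baud rate, and power")
```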
Posted 1 day ago
5.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Job Description Mandatory Skills 5+ years in designing and developing interfaces/APIs using Oracle Service Bus Hands-on experience in Oracle Service Bus, Java, WebLogic, Oracle Database, PL/SQL Strong technical knowledge and problem-solving skills Extremely strong communication skills Experience in troubleshooting Oracle Service Bus and WebLogic and suggesting configuration changes/performance optimizations Good to Have Skills Experience with IBM Integration Bus Experience in Core Banking Integration Self Test Questions If this role interests you, ask yourself the questions below to check whether you meet the minimum qualifications to apply. Do I have a deep understanding of Oracle Service Bus? Do I have 5 years of experience as a middleware developer? Do I have good experience with integration and integration patterns? Responsibilities Developing a good understanding of the system(s) architecture Fully understanding the project requirements, defining the to-be architecture, and translating them into the design Working independently and performing responsibilities under minimal supervision Working in a global environment with people from diverse cultural backgrounds Identifying the appropriate integration patterns Designing and developing interfaces using Oracle Service Bus v12 and above Performing design reviews, code reviews, performance tuning, and optimization Providing technical support during test phases Working with an Agile-based development methodology About Us As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Posted 1 day ago
4.0 years
0 Lacs
India
Remote
Job Title: Monitoring & Observability Engineer – Datadog Specialist Experience: 4+ Years Location: [Specify Location or Remote] Job Type: Full-Time Job Summary: We are looking for a talented Observability Engineer with hands-on experience in Datadog to enhance our infrastructure and application monitoring capabilities. The ideal candidate will have a strong understanding of performance monitoring, alerting, and observability in cloud-native environments. Key Responsibilities: Design, implement, and maintain observability solutions using Datadog for applications, infrastructure, and cloud services. Set up dashboards, monitors, and alerts to proactively detect and resolve system issues. Collaborate with DevOps, SRE, and application teams to define SLOs, SLIs, and KPIs for performance monitoring. Integrate Datadog with services such as AWS, Kubernetes, CI/CD pipelines, and logging tools. Conduct performance tuning and root cause analysis of production incidents. Automate observability processes using infrastructure-as-code and scripting (e.g., Terraform, Python). Stay up-to-date with the latest features and best practices in Datadog and observability space. Must-Have Skills: 4+ years of experience in monitoring/observability, with 2+ years hands-on experience in Datadog Strong experience with Datadog APM, infrastructure monitoring, custom metrics, and dashboards Familiarity with cloud platforms like AWS, GCP, or Azure Experience monitoring Kubernetes, containers, and microservices Good knowledge of log management, tracing, and alert tuning Proficient with scripting (Python, Shell) and IaC tools (Terraform preferred) Solid understanding of DevOps/SRE practices and incident management Nice-to-Have Skills: Datadog certifications (e.g., Datadog Certified Observability Engineer) Experience integrating Datadog with CI/CD tools, ticketing systems, and chatops Familiarity with other monitoring tools (e.g., Prometheus, Grafana, New Relic, Splunk) Knowledge of performance testing tools (e.g., JMeter, k6)
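For the "dashboards, monitors, and alerts" responsibility above, here is a hedged sketch of creating a metric monitor with the datadogpy client; the metric query, threshold, notification handle, and tags are illustrative assumptions, and the API/app keys are expected in environment variables.

```python
# Minimal sketch: create a Datadog metric monitor with the datadogpy client.
# Assumes DATADOG_API_KEY / DATADOG_APP_KEY are set; metric and threshold are placeholders.
import os
from datadog import initialize, api

initialize(
    api_key=os.environ["DATADOG_API_KEY"],
    app_key=os.environ["DATADOG_APP_KEY"],
)

monitor = api.Monitor.create(
    type="metric alert",
    # Alert if average request latency over the last 5 minutes exceeds 0.5 s (illustrative query).
    query="avg(last_5m):avg:trace.http.request.duration{env:prod} > 0.5",
    name="[prod] High request latency",
    message="Average latency above threshold. Notify @slack-sre-alerts",
    tags=["team:sre", "managed-by:script"],
    options={"thresholds": {"critical": 0.5}, "notify_no_data": False},
)
print("created monitor id:", monitor.get("id"))
```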
Posted 1 day ago
5.0 - 15.0 years
0 Lacs
India
Remote
4-month contract opportunity. Remote - Must work EST hours [flexible and around 11.00 AM EST] A minimum of 5-15 years of experience is preferred. This project is working on a new analytics platform, its consolidation, and implementation. The cloud architect will identify, lead, and deliver data analysis and architecture optimization. Must have experience with data lake infrastructure, data warehousing, and data analytics tools. Expertise in SQL optimization, performance tuning, and development of procedures Experience with database technologies such as SQL, Oracle, or Informatica Working knowledge of Agile-based development, including DevOps and DataOps Good problem-solving skills, including debugging skills Experience leading 3-4 projects as the Technical Architect A basic understanding of General Ledger (GL) accounting, Profit & Loss statements, profit centers, and cost centers is an added advantage. Minimum Requirements: Education required/preferred: Bachelor's degree in a relevant field or equivalent experience Strong analytical and problem-solving skills Excellent written and verbal communication skills Experience in working and delivering within an agile methodology/framework Strong desire to work in cross-functional teams
Posted 1 day ago
0.0 years
0 - 0 Lacs
Chandigarh, Chandigarh
On-site
IMMEDIATE REQUIREMENT Position: Database Administrator (DBA-SQL Server) Qualification: B.E./B.Tech/MCA or equivalent Experience: 5+ yrs Location: Chandigarh Please fill out the form https://lnkd.in/dpbdhG_5 and send your updated resume to hr@edidacs.com Skills: Strong experience as a SQL Server DBA. In-depth knowledge of SQL Server database administration, including database design, installation, configuration, and maintenance. Proficiency in writing complex SQL queries, stored procedures, triggers, and functions. Experience with database performance tuning and optimization techniques. Experience with backups, restores, and recovery models. Familiarity with high availability and disaster recovery solutions, such as database mirroring, log shipping, and clustering. Knowledge of database security best practices and the ability to implement and enforce security measures. Monitor database performance, implement changes, and apply new patches and versions when required Experience with SSIS and SSRS Job Type: Permanent Pay: ₹50,000.00 - ₹80,000.00 per month Work Location: In person
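As an illustration of the performance-tuning and maintenance duties described above, the hedged Python/pyodbc sketch below lists fragmented indexes in a SQL Server database using the sys.dm_db_index_physical_stats DMV; the connection string is a placeholder and the 30% fragmentation threshold is an assumption.

```python
# Minimal sketch: report fragmented indexes on SQL Server via pyodbc.
# The connection string is a placeholder; 30% is a commonly used (assumed) rebuild threshold.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=localhost;"
    "DATABASE=MyDb;Trusted_Connection=yes;TrustServerCertificate=yes;"
)

QUERY = """
SELECT OBJECT_NAME(ips.object_id) AS table_name,
       i.name                     AS index_name,
       ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON i.object_id = ips.object_id AND i.index_id = ips.index_id
WHERE ips.avg_fragmentation_in_percent > 30
ORDER BY ips.avg_fragmentation_in_percent DESC;
"""

with pyodbc.connect(CONN_STR) as conn:
    for table, index, frag in conn.cursor().execute(QUERY):
        print(f"{table}.{index}: {frag:.1f}% fragmented")
```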
Posted 1 day ago
0.0 - 2.0 years
0 Lacs
Jaipur, Rajasthan
On-site
Job Responsibilities: Understanding customer requirements and project KPIs. Implementing various development, testing, automation tools, and IT infrastructure. Planning the team structure, activities, and involvement in project management activities. Setting up tools and required infrastructure. Defining and setting development, test, release, update, and support processes for DevOps operation. Having the technical skill to review, verify, and validate the software code developed in the project. Monitoring the processes during the entire lifecycle for adherence, and updating or creating new processes for improvement and minimizing the wastage of resource usage. Encouraging and building automated processes wherever possible. Identifying and deploying cybersecurity measures by continuously performing vulnerability assessment and risk management. Incident management and root cause analysis. Coordination and communication within the team and with customers. Selecting and deploying appropriate CI/CD tools. Striving for continuous improvement and building a continuous integration, continuous development, and continuous deployment pipeline (CI/CD pipeline). Experience working on Linux-based infrastructure. Experience managing LAMP/LEMP/React-based applications using Docker. Performance tuning of services with load balancing. Configuring and managing databases such as MySQL, MongoDB, Redis, and Elasticsearch. Excellent troubleshooting skills. Job Types: Full-time, Permanent Pay: Up to ₹800,000.00 per year Benefits: Health insurance Provident Fund Schedule: Day shift Weekend availability Experience: Cloud infrastructure: 1 year (Required) OnPrem solutions: 2 years (Required) Location: Jaipur, Rajasthan (Required) Work Location: In person
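For the monitoring, performance-tuning, and troubleshooting duties listed above, here is a hedged Python sketch of a simple service health and latency probe, the sort of check that might back a CI/CD smoke test or load-balancer health endpoint; the endpoint URLs are placeholders, not services from this posting.

```python
# Minimal sketch: probe a few service endpoints and report HTTP status plus latency.
# Endpoint URLs are placeholders.
import time
import requests

ENDPOINTS = [
    "https://example.com/healthz",
    "https://example.com/api/v1/status",
]

def probe(url: str, timeout: float = 3.0) -> None:
    start = time.monotonic()
    try:
        resp = requests.get(url, timeout=timeout)
        elapsed_ms = (time.monotonic() - start) * 1000
        print(f"{url}: HTTP {resp.status_code} in {elapsed_ms:.0f} ms")
    except requests.RequestException as exc:
        print(f"{url}: FAILED ({exc})")

if __name__ == "__main__":
    for endpoint in ENDPOINTS:
        probe(endpoint)
```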
Posted 1 day ago
0 years
15 - 25 Lacs
Pune, Maharashtra, India
On-site
Administer and maintain Oracle SOA Suite and Oracle Service Bus (OSB) environments Deploy, configure, and monitor SOA composites and middleware components Ensure high availability, performance tuning, and capacity planning of SOA infrastructure Troubleshoot and resolve issues related to SOA services, integrations, and middleware Collaborate with development and infrastructure teams to support application deployments Implement and manage security policies, access controls, and SSL configurations Automate routine administrative tasks and streamline deployment processes Maintain documentation for configurations, procedures, and system changes Perform patching, upgrades, and backup/recovery of SOA environments Support incident management, root cause analysis, and continuous improvement initiatives
Posted 1 day ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description: We are looking for a Lead Generative AI Engineer with 3–5 years of experience to spearhead the development of cutting-edge AI systems involving Large Language Models (LLMs), Vision-Language Models (VLMs), and Computer Vision (CV). You will lead model development, fine-tuning, and optimization for text, image, and multi-modal use cases. This is a hands-on leadership role that requires a deep understanding of transformer architectures, generative model fine-tuning, prompt engineering, and deployment in production environments. Roles and Responsibilities: Lead the design, development, and fine-tuning of LLMs for tasks such as text generation, summarization, classification, Q&A, and dialogue systems. Develop and apply Vision-Language Models (VLMs) for tasks like image captioning, VQA, multi-modal retrieval, and grounding. Work on Computer Vision tasks including image generation, detection, segmentation, and manipulation using SOTA deep learning techniques. Leverage frameworks like Transformers, Diffusion Models, and CLIP to build and fine-tune multi-modal models. Fine-tune open-source LLMs and VLMs (e.g., LLaMA, Mistral, Gemma, Qwen, MiniGPT, Kosmos, etc.) using task-specific or domain-specific datasets. Design data pipelines, model training loops, and evaluation metrics for generative and multi-modal AI tasks. Optimize model performance for inference using techniques like quantization, LoRA, and efficient transformer variants. Collaborate cross-functionally with product, backend, and MLOps teams to ship models into production. Stay current with the latest research and incorporate emerging techniques into product pipelines. Requirements: Bachelor’s or Master’s degree in Computer Science, Artificial Intelligence, Machine Learning, or a related field. 3–5 years of hands-on experience in building, training, and deploying deep learning models, especially in LLM, VLM, and/or CV domains. Strong proficiency with Python, PyTorch (or TensorFlow), and libraries like Hugging Face Transformers, OpenCV, Datasets, LangChain, etc. Deep understanding of transformer architecture, self-attention mechanisms, tokenization, embedding, and diffusion models. Experience with LoRA, PEFT, RLHF, prompt tuning, and transfer learning techniques. Experience with multi-modal datasets and fine-tuning vision-language models (e.g., BLIP, Flamingo, MiniGPT, Kosmos, etc.). Familiarity with MLOps tools, containerization (Docker), and model deployment workflows (e.g., Triton Inference Server, TorchServe). Strong problem-solving, architectural thinking, and team mentorship skills.
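The responsibilities above mention LoRA-based fine-tuning of open-source LLMs. Here is a hedged sketch of how that is commonly wired up with Hugging Face Transformers and PEFT; the base model name, target modules, and hyperparameters are illustrative assumptions, not values specified by this role.

```python
# Minimal sketch: attach LoRA adapters to a causal LM with Hugging Face PEFT.
# Model name, target modules, and hyperparameters are illustrative assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

BASE_MODEL = "mistralai/Mistral-7B-v0.1"   # placeholder open-source base model

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

lora_config = LoraConfig(
    r=8,                                   # low-rank dimension
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],   # attention projections (architecture-dependent)
    task_type="CAUSAL_LM",
)

model = get_peft_model(model, lora_config)
model.print_trainable_parameters()          # only the adapter weights are trainable

# From here the wrapped model plugs into a standard Trainer or custom training loop
# over a task- or domain-specific dataset.
```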
Posted 1 day ago
0 years
15 - 25 Lacs
Mumbai Metropolitan Region
On-site
Administer and maintain Oracle SOA Suite and Oracle Service Bus (OSB) environments Deploy, configure, and monitor SOA composites and middleware components Ensure high availability, performance tuning, and capacity planning of SOA infrastructure Troubleshoot and resolve issues related to SOA services, integrations, and middleware Collaborate with development and infrastructure teams to support application deployments Implement and manage security policies, access controls, and SSL configurations Automate routine administrative tasks and streamline deployment processes Maintain documentation for configurations, procedures, and system changes Perform patching, upgrades, and backup/recovery of SOA environments Support incident management, root cause analysis, and continuous improvement initiatives
Posted 1 day ago
0 years
0 Lacs
Surat, Gujarat, India
On-site
About the Role: We are seeking a highly motivated and skilled AI/ML Engineer to join our innovation team focused on developing intelligent systems to improve internal processes, reduce manual workload, and accelerate project delivery timelines. The ideal candidate will have experience in designing, developing, and deploying AI agents, automation tools, and machine learning models that can streamline operations and enhance overall productivity across the organization. Key Responsibilities: Design, build, and deploy intelligent AI agents and automation tools that optimize internal workflows and reduce repetitive tasks Develop machine learning models and algorithms to support decision-making and improve operational efficiency Implement AI-driven solutions for use cases such as task automation, knowledge retrieval, data summarization, and intelligent reporting Research and experiment with various AI/ML approaches to identify the best-fit technologies for business problems Collaborate with cross-functional teams to identify pain points and deliver AI-based tools that support their functions Apply prompt engineering and fine-tuning techniques to maximize the value of large language models (LLMs) Develop systems that enable automated document generation, test case creation, ticketing, and other IT operations Ensure ethical, efficient, and scalable implementation of AI across internal tools and processes Maintain clear documentation of AI workflows, architecture, and operational procedures Required Skills & Qualifications: Bachelor’s or Master’s degree in Computer Science, Artificial Intelligence, Data Science, or a related field Proficiency in Python and strong experience in machine learning and deep learning frameworks Proven experience in developing AI-powered agents, automation systems, or productivity-enhancing tools Strong understanding of NLP, ML algorithms, LLMs, and AI-driven workflow automation Ability to work independently on research and prototyping of new AI capabilities Excellent problem-solving and communication skills Comfortable working in fast-paced and agile environments Preferred Qualifications: Experience in designing multi-agent systems or autonomous workflow assistants Understanding of retrieval-based systems, semantic search, or document intelligence Familiarity with MLOps practices and scalable deployment of ML solutions Experience using AI to automate software development lifecycle tasks such as code review, documentation, and QA Background in building internal tooling for IT or engineering teams
Posted 1 day ago
3.0 years
0 Lacs
Udaipur, Rajasthan, India
On-site
We’re looking for an experienced Audio Engineer to join MSIPL , one of India’s leading live audio production companies. This role is vital to delivering world-class sound across concerts and many live events. What You’ll Do: You’ll be responsible for end-to-end audio production—from equipment planning and system rigging to console setup and live mixing. You’ll work with cutting-edge gear like L-Acoustics and DiGiCo, leading on-site setups and ensuring flawless show execution. Success in This Role: Smooth coordination and execution of complex event setups Flawless live audio performance under pressure Proactive equipment management and client-facing professionalism Who You Are: An audio expert with at least 3 years of hands-on experience in large-format shows, proficient in digital consoles, system tuning, and wireless configuration. You thrive in fast-paced environments, are detail-oriented, and ready to travel as part of a dynamic team.
Posted 1 day ago
1.0 - 4.0 years
0 Lacs
Kerala, India
On-site
About the Role We are looking for a skilled and motivated Odoo Developer to join our team in Ernakulam. If you have 1 to 4 years of hands-on experience with Odoo and a passion for building tailored ERP solutions, this could be the perfect opportunity for you. You’ll collaborate with functional consultants and business analysts to design, develop, and implement Odoo modules that align with a wide range of business needs. Key Responsibilities Develop and customize Odoo modules for core business areas such as Sales, Purchase, Inventory, Accounting, CRM, and HR. Enhance existing functionalities and create new features based on project specifications. Work closely with cross-functional teams to understand and translate technical and functional requirements. Write clean, well-structured code using Python, XML, and Odoo development best practices. Design custom reports, dashboards, and integrate Odoo with external systems. Perform debugging, testing, and performance tuning to ensure high-quality outputs. Manage database operations including data migration, upgrades, and issue resolution. Provide ongoing support and continuous improvements for deployed Odoo solutions. Requirements 1 to 4 years of experience in Odoo development. Strong command of Python, XML, JavaScript, and PostgreSQL. Deep understanding of Odoo’s architecture, ORM, and modular structure. Experience working with both Odoo Community and Enterprise editions. Familiarity with HTML, CSS, and RESTful APIs. Strong analytical, debugging, and problem-solving skills. Effective communication and teamwork abilities. Knowledge of Git or other version control systems is a plus.
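To ground the module-development responsibilities above, here is a hedged, minimal sketch of a custom Odoo model plus an extension of a core model written against Odoo's ORM; the model and field names are hypothetical examples, and a real module would also need a __manifest__.py and XML views.

```python
# Minimal sketch: a custom Odoo model and an inherited extension of sale.order.
# Model/field names are hypothetical; security rules and views are omitted.
from odoo import models, fields, api


class ServiceRequest(models.Model):
    _name = "x.service.request"                 # hypothetical model
    _description = "Customer Service Request"

    name = fields.Char(required=True)
    partner_id = fields.Many2one("res.partner", string="Customer")
    state = fields.Selection(
        [("draft", "Draft"), ("done", "Done")], default="draft"
    )
    display_label = fields.Char(compute="_compute_display_label")

    @api.depends("name", "partner_id.name")
    def _compute_display_label(self):
        for rec in self:
            rec.display_label = f"{rec.name} ({rec.partner_id.name or 'no customer'})"


class SaleOrder(models.Model):
    _inherit = "sale.order"                     # extend a core model

    service_request_id = fields.Many2one("x.service.request")
```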
Posted 1 day ago
2.0 years
0 Lacs
Mumbai, Maharashtra, India
Remote
Argus is where smart people belong and where they can grow. We answer the challenge of illuminating markets and shaping new futures. What We’re Looking For Join our Generative AI team as a Senior Data Scientist, reporting directly to the Lead Data Scientist in India. You will play a crucial role in building, optimizing, and maintaining AI-ready data infrastructure for advanced Generative AI applications. Your focus will be on hands-on implementation of cutting-edge data extraction, curation, and metadata enhancement techniques for both text and numerical data. You will be a key contributor to the development of innovative solutions, ensuring rapid iteration and deployment, and supporting the Lead in achieving the team's strategic goals. What Will You Be Doing AI-Ready Data Development: Design, develop, and maintain high-quality AI-ready datasets, ensuring data integrity, usability, and scalability to support advanced generative AI models. Advanced Data Processing: Drive hands-on efforts in complex data extraction, cleansing, and curation for diverse text and numerical datasets. Implement sophisticated metadata enrichment strategies to enhance data utility and accessibility for AI systems. Algorithm Implementation & Optimization: Implement and optimize state-of-the-art algorithms and pipelines for efficient data processing, feature engineering, and data transformation tailored for LLM and GenAI applications. GenAI Application Development: Apply and integrate frameworks like LangChain and Hugging Face Transformers to build modular, scalable, and robust Generative AI data pipelines and applications. Prompt Engineering Application: Apply advanced prompt engineering techniques to optimize LLM performance for specific data extraction, summarization, and generation tasks, working closely with the Lead's guidance. LLM Evaluation Support: Contribute to the systematic evaluation of Large Language Models (LLMs) outputs, analysing quality, relevance, and accuracy, and supporting the implementation of LLM-as-a-judge frameworks. Retrieval-Augmented Generation (RAG) Contribution: Actively contribute to the implementation and optimization of RAG systems, including working with embedding models, vector databases, and, where applicable, knowledge graphs, to enhance data retrieval for GenAI. Technical Mentorship: Act as a technical mentor and subject matter expert for junior data scientists, providing guidance on best practices in coding and PR reviews, data handling, and GenAI methodologies. Cross-Functional Collaboration: Collaborate effectively with global data science teams, engineering, and product stakeholders to integrate data solutions and ensure alignment with broader company objectives. Operational Excellence: Troubleshoot and resolve data-related issues promptly to minimize potential disruptions, ensuring high operational efficiency and responsiveness. Documentation & Code Quality: Produce clean, well-documented, production-grade code, adhering to best practices for version control and software engineering. Skills And Experience Academic Background: Advanced degree in AI, statistics, mathematics, computer science, or a related field. Programming and Frameworks: 2+ years of hands-on experience with Python, TensorFlow or PyTorch, and NLP libraries such as spaCy and Hugging Face. GenAI Tools: 1+ years Practical experience with LangChain, Hugging Face Transformers, and embedding models for building GenAI applications. 
Prompt Engineering: Deep expertise in prompt engineering, including prompt tuning, chaining, and optimization techniques. LLM Evaluation: Experience evaluating LLM outputs, including using LLM-as-a-judge methodologies to assess quality and alignment. RAG and Knowledge Graphs: Practical understanding and experience using vector databases. In addition, familiarity with graph-based RAG architectures and the use of knowledge graphs to enhance retrieval and reasoning would be a strong plus. Cloud: 2+ years of experience with Gemini/OpenAI models and cloud platforms such as AWS, Google Cloud, or Azure. Proficient with Docker for containerization. Data Engineering: Strong understanding of data extraction, curation, metadata enrichment, and AI-ready dataset creation. Collaboration and Communication: Excellent communication skills and a collaborative mindset, with experience working across global teams. What’s In It For You Our rapidly growing, award-winning business offers a dynamic environment for talented, entrepreneurial professionals to achieve results and grow their careers. Argus recognizes and rewards successful performance and as an Investor in People, we promote professional development and retain a high-performing team committed to building our success. Competitive salary Hybrid Working Policy (3 days in Mumbai office/ 2 days WFH once fully inducted) Group healthcare scheme 18 days annual leave 8 days of casual leave Extensive internal and external training Hours This is a full-time position operating under a hybrid model, with three days in the office and up to two days working remotely. The team supports Argus’ key business processes every day, as such you will be required to work on a shift-based rota with other members of the team supporting the business until 8pm. Typically support hours run from 11am to 8pm with each member of the team participating up to 2/3 times a week. Argus is the leading independent provider of market intelligence to the global energy and commodity markets. We offer essential price assessments, news, analytics, consulting services, data science tools and industry conferences to illuminate complex and opaque commodity markets. Headquartered in London with 1,500 staff, Argus is an independent media organisation with 30 offices in the world’s principal commodity trading hubs. Companies, trading firms and governments in 160 countries around the world trust Argus data to make decisions, analyse situations, manage risk, facilitate trading and for long-term planning. Argus prices are used as trusted benchmarks around the world for pricing transportation, commodities and energy. Founded in 1970, Argus remains a privately held UK-registered company owned by employee shareholders and global growth equity firm General Atlantic.
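The RAG-related duties above involve embedding models and vector search. As a hedged, self-contained sketch (using sentence-transformers and an in-memory cosine-similarity lookup in place of a real vector database), the snippet below shows the basic retrieval step; the model name and toy documents are illustrative assumptions.

```python
# Minimal sketch: embed documents and retrieve the best match for a query.
# The embedding model and toy corpus are illustrative; production RAG would
# use a real vector database rather than an in-memory numpy search.
import numpy as np
from sentence_transformers import SentenceTransformer

docs = [
    "Crude prices rose after supply disruptions in the North Sea.",
    "Natural gas storage levels are above the five-year average.",
    "Freight rates for clean tankers softened this week.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")           # small general-purpose embedder
doc_vecs = model.encode(docs, normalize_embeddings=True)   # unit vectors, so dot == cosine

def retrieve(query: str, top_k: int = 1):
    q_vec = model.encode([query], normalize_embeddings=True)[0]
    scores = doc_vecs @ q_vec
    for idx in np.argsort(scores)[::-1][:top_k]:
        yield docs[idx], float(scores[idx])

for doc, score in retrieve("What happened to oil prices?"):
    print(f"{score:.2f}  {doc}")
```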
Posted 1 day ago
5.0 years
0 Lacs
Mumbai, Maharashtra, India
Remote
Argus is where smart people belong and where they can grow. We answer the challenge of illuminating markets and shaping new futures. What We’re Looking For Join our Generative AI team to lead a new group in India, focused on creating and maintaining AI-ready data. As the point of contact in Mumbai, you will guide the local team and ensure seamless collaboration with our global counterparts. Your contributions will directly impact the development of innovative solutions used by industry leaders worldwide, supporting text and numerical data extraction, curation, and metadata enhancements to accelerate development and ensure rapid response times. You will play a pivotal role in transforming how our data are seamlessly integrated with AI systems, paving the way for the next generation of customer interactions. What Will You Be Doing Lead and Develop the Team: Oversee a team of data scientists in Mumbai. Mentoring and guiding junior team members, fostering their professional growth and development. Strategic Planning: Develop and implement strategic plans for data science projects, ensuring alignment with the company's goals and objectives. AI-Ready Data Development: Design, develop, and maintain high-quality AI-ready datasets, ensuring data integrity, usability, and scalability to support advanced Generative AI models. Advanced Data Processing: Drive hands-on efforts in complex data extraction, cleansing, and curation for diverse text and numerical datasets. Implement sophisticated metadata enrichment strategies to enhance data utility and accessibility for AI systems. Algorithm Implementation & Optimization: Implement and optimize state-of-the-art algorithms and pipelines for efficient data processing, feature engineering, and data transformation tailored for LLM and GenAI applications. GenAI Application Development: Apply and integrate frameworks like LangChain and Hugging Face Transformers to build modular, scalable, and robust Generative AI data pipelines and applications. Prompt Engineering Application: Apply advanced prompt engineering techniques to optimize LLM performance for specific data extraction, summarization, and generation tasks, working closely with the Lead's guidance. LLM Evaluation Support: Contribute to the systematic evaluation of Large Language Models (LLMs) outputs, analysing quality, relevance, and accuracy, and supporting the implementation of LLM-as-a-judge frameworks. Retrieval-Augmented Generation (RAG) Contribution: Actively contribute to the implementation and optimization of RAG systems, including working with embedding models, vector databases, and, where applicable, knowledge graphs, to enhance data retrieval for GenAI. Technical Leadership: Act as a technical leader and subject matter expert for junior data scientists, providing guidance on best practices in coding and PR reviews, data handling, and GenAI methodologies. Cross-Functional Collaboration: Collaborate effectively with global data science teams, engineering, and product stakeholders to integrate data solutions and ensure alignment with broader company objectives. Operational Excellence: Troubleshoot and resolve data-related issues promptly to minimize potential disruptions, ensuring high operational efficiency and responsiveness. Documentation & Code Quality: Produce clean, well-documented, production-grade code, adhering to best practices for version control and software engineering. 
Skills And Experience Leadership Experience: Proven track record in leading and mentoring data science teams, with a focus on strategic planning and operational excellence. Academic Background: Advanced degree in AI, statistics, mathematics, computer science, or a related field. Programming and Frameworks: 5+ years of hands-on experience with Python, TensorFlow or PyTorch, and NLP libraries such as spaCy and Hugging Face. GenAI Tools: 2+ years of Practical experience with LangChain, Hugging Face Transformers, and embedding models for building GenAI applications. Prompt Engineering: Deep expertise in prompt engineering, including prompt tuning, chaining, and optimization techniques. LLM Evaluation: Experience evaluating LLM outputs, including using LLM-as-a-judge methodologies to assess quality and alignment. RAG and Knowledge Graphs: Practical understanding and experience using vector databases. In addition, familiarity with graph-based RAG architectures and the use of knowledge graphs to enhance retrieval and reasoning would be a strong plus. Cloud: 2+ years of experience with Gemini/OpenAI models and cloud platforms such as AWS, Google Cloud, or Azure. Proficient with Docker for containerization. Data Engineering: Strong understanding of data extraction, curation, metadata enrichment, and AI-ready dataset creation. Collaboration and Communication: Excellent communication skills and a collaborative mindset, with experience working across global teams. What’s In It For You Our rapidly growing, award-winning business offers a dynamic environment for talented, entrepreneurial professionals to achieve results and grow their careers. Argus recognizes and rewards successful performance and as an Investor in People, we promote professional development and retain a high-performing team committed to building our success. Competitive salary Hybrid Working Policy (3 days in Mumbai office/ 2 days WFH once fully inducted) Group healthcare scheme 18 days annual leave 8 days of casual leave Extensive internal and external training Hours This is a full-time position operating under a hybrid model, with three days in the office and up to two days working remotely. The team supports Argus’ key business processes every day, as such you will be required to work on a shift-based rota with other members of the team supporting the business until 8pm. Typically support hours run from 11am to 8pm with each member of the team participating up to 2/3 times a week. Argus is the leading independent provider of market intelligence to the global energy and commodity markets. We offer essential price assessments, news, analytics, consulting services, data science tools and industry conferences to illuminate complex and opaque commodity markets. Headquartered in London with 1,500 staff, Argus is an independent media organisation with 30 offices in the world’s principal commodity trading hubs. Companies, trading firms and governments in 160 countries around the world trust Argus data to make decisions, analyse situations, manage risk, facilitate trading and for long-term planning. Argus prices are used as trusted benchmarks around the world for pricing transportation, commodities and energy. Founded in 1970, Argus remains a privately held UK-registered company owned by employee shareholders and global growth equity firm General Atlantic.
Posted 1 day ago
7.0 years
0 Lacs
Gurgaon Rural, Haryana, India
On-site
Minimum of 7+ years of experience in the data analytics field. Proven experience with Azure/AWS Databricks in building and optimizing data pipelines, architectures, and datasets. Strong expertise in Scala or Python, PySpark, and SQL for data engineering tasks. Ability to troubleshoot and optimize complex queries on the Spark platform. Knowledge of structured and unstructured data design, modelling, access, and storage techniques. Experience designing and deploying data applications on cloud platforms such as Azure or AWS. Hands-on experience in performance tuning and optimizing code running in Databricks environments. Strong analytical and problem-solving skills, particularly within Big Data environments. Experience with Big Data management tools and technologies including Cloudera, Python, Hive, Scala, Data Warehouse, Data Lake, AWS, Azure. Technical and Professional Skills: Must Have: Excellent communication skills with the ability to interact directly with customers. Azure/AWS Databricks. Python / Scala / Spark / PySpark. Strong SQL and RDBMS expertise. HIVE / HBase / Impala / Parquet. Sqoop, Kafka, Flume. Airflow.
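Since this role stresses PySpark performance tuning and query optimization on Databricks, here is a hedged sketch of a typical optimization pass (explicit column pruning, broadcasting a small dimension table, and writing partitioned Parquet); the table paths and column names are placeholders.

```python
# Minimal sketch: a common PySpark tuning pattern - prune columns early,
# broadcast the small dimension table to avoid a shuffle, write partitioned Parquet.
# Paths and column names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sales-optimization").getOrCreate()

sales = spark.read.parquet("/mnt/raw/sales")         # large fact table
regions = spark.read.parquet("/mnt/raw/regions")     # small dimension table

result = (
    sales.select("order_id", "region_id", "amount")  # prune columns before the join
         .join(F.broadcast(regions), on="region_id") # broadcast the small side
         .groupBy("region_name")
         .agg(F.sum("amount").alias("total_amount"))
)

result.explain()                                     # inspect the physical plan
result.write.mode("overwrite").partitionBy("region_name").parquet(
    "/mnt/curated/sales_by_region"
)
```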
Posted 1 day ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Dear Candidate, please find the JD below for your reference: Strong experience with Kotlin Multiplatform (KMP) and Kotlin language fundamentals. Proficiency in native Android and iOS development. Experience with shared code architecture, dependency injection, and modularization. Familiarity with Ktor, SQLDelight, Coroutines, and Multiplatform libraries. Understanding of RESTful APIs, JSON, and secure data transmission. Experience with BLE/NFC integrations and sensor-based interfaces. Knowledge of cryptographic APIs and secure storage mechanisms. Strong debugging, profiling, and performance tuning skills. Experience publishing apps to the Google Play Store and Apple App Store.
Posted 1 day ago
5.0 years
0 Lacs
Navi Mumbai, Maharashtra, India
On-site
Job Title: Senior Software Developer As a Senior Software Developer (.NET), you will play a critical role in designing, developing, and deploying scalable, secure enterprise applications aligned with Inevia’s business objectives. This role ensures high performance, maintainability, and reliability of applications built using .NET technologies, microservices, and SQL Server. You will work closely with cross-functional teams, including product owners and stakeholders, to deliver innovative solutions while mentoring team members in a collaborative Agile environment. The candidate must be detail-oriented, organized, and capable of managing multiple priorities in a fast-paced environment. Responsibilities: Application Design & Development: Design, develop, and maintain robust web applications using C#, .NET Core, and .NET 6/7/8. Develop reusable components, services, and libraries following clean coding practices. Build, consume, and secure RESTful APIs and microservices. Integrate with Angular-based frontend applications for seamless backend-frontend communication. Ensure adherence to architecture principles and coding standards. System Optimization, Monitoring, and Quality: Perform application performance tuning and optimization. Conduct unit and integration testing, and participate in code reviews. Ensure high availability, scalability, and reliability of applications. Implement robust logging and monitoring mechanisms. Maintain observability and troubleshooting capabilities across environments. Database and Integration: Write optimized SQL queries, stored procedures, and functions in SQL Server. Collaborate on schema design and query performance tuning. Use ORM tools like Entity Framework Core and Dapper for data access. CI/CD and DevOps: Participate in Agile ceremonies and sprint activities. Support CI/CD pipeline setup using Azure DevOps. Participate in containerization using Docker and deployment on cloud platforms. Manage source code repositories and branching strategies. Troubleshooting and Support: Investigate and resolve issues across development, staging, and production environments. Analyze logs and telemetry data to identify root causes and implement fixes. Collaboration and Communication: Collaborate with development teams and stakeholders to gather and clarify requirements. Mentor developers by providing guidance and technical support. Qualifications: Education and Experience: Bachelor’s degree in Computer Science, IT, Engineering, or a related field 5+ years of professional experience in .NET development. Proven experience in building enterprise-grade web applications and APIs. Knowledge and Skills: Expertise in C#, .NET Core, .NET 6/7/8. Strong knowledge of Microservices architecture, RESTful APIs, asynchronous programming, and authentication mechanisms (JWT, OAuth2). Hands-on experience with SQL Server and complex query writing. Familiarity with Entity Framework Core, LINQ, and clean architecture principles. Experience with version control systems such as Azure DevOps and Git Knowledge of cloud technologies, preferably Azure. Exposure to unit testing and test-driven development (TDD). Knowledge of Angular frontend is a plus. Benefits: Opportunity to work on scalable enterprise applications and backend architecture Room for professional growth and learning. Competitive compensation package. Additional Information: This is a full-time position located in Navi Mumbai. 
Inevia is an equal opportunity employer and encourages applications from candidates of all backgrounds and experiences.
Posted 1 day ago
0 years
0 Lacs
Ahmedabad, Gujarat, India
Remote
Work Level: Individual Core: Responsible Leadership: Team Alignment Industry Type: Information Technology Function: Database Administrator Key Skills: mSQL, SQL Writing, PL/SQL Education: Graduate Note: This is a requirement for one of Workassist's Hiring Partners. This is a Remote Position. Responsibilities Write, optimize, and maintain SQL queries, stored procedures, and functions. Assist in designing and managing relational databases. Perform data extraction, transformation, and loading (ETL) tasks. Ensure database integrity, security, and performance. Work with developers to integrate databases into applications. Support data analysis and reporting by writing complex queries. Document database structures, processes, and best practices. Requirements Currently pursuing or recently completed a degree in Computer Science, Information Technology, or a related field. Strong understanding of SQL and relational database concepts. Experience with databases such as MySQL, PostgreSQL, SQL Server, or Oracle. Ability to write efficient and optimized SQL queries. Basic knowledge of indexing, stored procedures, and triggers. Understanding of database normalization and design principles. Good analytical and problem-solving skills. Ability to work independently and in a team in a remote setting. Preferred Skills (Nice to Have) Experience with ETL processes and data warehousing. Knowledge of cloud-based databases (AWS RDS, Google BigQuery, Azure SQL). Familiarity with database performance tuning and indexing strategies. Exposure to Python or other scripting languages for database automation. Experience with business intelligence (BI) tools like Power BI or Tableau. Company Description Workassist is an online recruitment and employment solution platform based in Lucknow, India. We provide relevant profiles to employers and connect job seekers with the best opportunities across various industries. With a network of 10,000+ recruiters, we help employers recruit talented individuals from sectors such as Banking & Finance, Consulting, Sales & Marketing, HR, IT, Operations, and Legal. We have adapted to the new normal and strive to provide a seamless job search experience for job seekers worldwide. Our goal is to enhance the job-seeking experience by leveraging technology and matching job seekers with the right employers. For a seamless job search experience, visit our website: https://bit.ly/3QBfBU2 (Note: There are many more opportunities apart from this on the portal. Depending on the skills, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
6.0 years
0 Lacs
India
On-site
We are seeking a skilled and proactive Platform Lead with strong Snowflake expertise and AWS cloud exposure to lead the implementation and operational excellence of a scalable, multitenant modern data platform for a leading US-based marketing agency serving nonprofit clients. This role requires hands-on experience in managing Snowflake environments, supporting data pipeline orchestration, enforcing platform-level standards, and ensuring observability, performance, and security across environments. You will collaborate with architects, engineers, and DevOps teams to operationalize the platform’s design and drive its long-term stability and scalability in a cloud-native ecosystem. Job Specific Duties & Responsibilities: Lead the technical implementation and stability of the multitenant Snowflake data platform across dev, QA, and prod environments Design and manage schema isolation, role-based access control (RBAC), masking policies, and cost-optimized Snowflake architecture for multiple nonprofit tenants Implement and maintain CI/CD pipelines for dbt, Snowflake objects, and metadata-driven ingestion processes using GitHub Actions or similar tools Develop and maintain automation accelerators for data ingestion, schema validation, error handling, and onboarding new clients at scale Collaborate with architects and data engineers to ensure seamless integration with source CRMs, ByteSpree connectors, and downstream BI/reporting layers Monitor and optimize performance of Snowflake workloads (e.g., query tuning, warehouse sizing, caching strategy) to ensure reliability and scalability Establish and maintain observability and monitoring practices across data pipelines, ingestion jobs, and platform components (e.g., error tracking, data freshness, job status dashboards) Manage infrastructure-as-code (IaC), configuration templates, and version control practices across the data stack Ensure robust data validation, quality checks, and observability mechanisms are in place across all platform services Support incident response, pipeline failures, and technical escalations in production, coordinating across engineering and client teams Contribute to data governance compliance by implementing platform-level policies for PII, lineage tracking, and tenant-specific metadata tagging Required Skills, Experience & Qualifications: Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related technical field 6+ years of experience in data engineering or platform delivery, including 3+ years of hands-on Snowflake experience in production environments Proven expertise in building and managing multitenant data platforms, including schema isolation, RBAC, and masking policies Solid knowledge of CI/CD practices for data projects, with experience guiding pipeline implementations using tools like GitHub Actions Hands-on experience with dbt, SQL, and metadata-driven pipeline design for large-scale ingestion and transformation workloads Strong understanding of AWS cloud services relevant to data platforms (e.g., S3, IAM, Lambda, CloudWatch, Secrets Manager) Experience optimizing Snowflake performance, including warehouse sizing, caching, and cost control strategies Familiarity with setting up observability frameworks, monitoring tools, and data quality checks across complex pipeline ecosystems Proficient in infrastructure-as-code (IaC) concepts and managing configuration/versioning across environments Awareness of data governance principles, including lineage, PII handling, and tenant-specific metadata tagging
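For the schema-isolation and RBAC responsibilities described above, here is a hedged sketch of provisioning a per-tenant schema, role grant, and PII masking policy with the Snowflake Python connector; the account parameters, object names, and masking rule are illustrative assumptions, not the agency's actual design.

```python
# Minimal sketch: per-tenant schema, role grant, and an email masking policy in Snowflake.
# Connection parameters, object names, and the masking rule are illustrative assumptions.
import os
import snowflake.connector

TENANT = "acme_nonprofit"   # hypothetical tenant identifier

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="PLATFORM_WH",
    database="ANALYTICS",
)

statements = [
    f"CREATE SCHEMA IF NOT EXISTS ANALYTICS.{TENANT}",
    f"CREATE ROLE IF NOT EXISTS ROLE_{TENANT.upper()}",
    f"GRANT USAGE ON SCHEMA ANALYTICS.{TENANT} TO ROLE ROLE_{TENANT.upper()}",
    # Mask email addresses for everyone except an assumed privileged role.
    f"""CREATE MASKING POLICY IF NOT EXISTS ANALYTICS.{TENANT}.mask_email AS (val STRING)
        RETURNS STRING ->
        CASE WHEN CURRENT_ROLE() IN ('PLATFORM_ADMIN') THEN val ELSE '***MASKED***' END""",
]

cur = conn.cursor()
try:
    for stmt in statements:
        cur.execute(stmt)
finally:
    cur.close()
    conn.close()
```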
Posted 1 day ago
6.0 years
8 - 12 Lacs
India
Remote
📍 Location: Remote (India) 📅 Start Date: ASAP 🔹 Type: Contract / Full-time (Flexible) 🏢 About The Company We deliver innovative solutions that help businesses accelerate performance across application development , BPO , data services , and professional services . Our mission is to improve efficiency, reduce costs, increase profitability, and shorten time-to-market for our clients. 📌 Role Overview We are looking for a skilled SQL Scripter with hands-on experience in Flexera database environments . The role involves designing, developing, and optimizing SQL scripts for FlexNet Manager Suite , with a strong focus on reporting, data analysis, and automation. 🛠️ Key Responsibilities Develop and maintain SQL scripts for: Data extraction Custom reporting Database automation within Flexera Execute SQL batches and build reports in FlexNet Manager Suite (FNMS) Optimize and troubleshoot complex SQL queries Collaborate with DBAs and developers to ensure data integrity and performance Support database backup, recovery, and security procedures Align all development with Flexera schema standards and best practices Document all SQL scripts and database-related workflows ✅ Qualifications 3–6 years of hands-on experience as a SQL Developer or Scripter Expert-level knowledge of SQL (queries, stored procedures, triggers, functions) Experience working with Flexera, especially FlexNet Manager Suite Familiarity with database administration and performance tuning Knowledge of PowerShell or Python for automation (a plus) Strong problem-solving and debugging skills Self-motivated and able to work both independently and in a team environment 🎯 Ideal Candidate Has worked on Flexera or licensing compliance tools Can handle SQL-heavy environments with minimal supervision Understands the importance of clean, well-documented scripts 📩 Interested? Send your CV to garima.s@zorbaconsulting.in with subject line: SQL Scripter – Flexera Application Skills: automation,data analysis,python,reporting,powershell,flexnet manager suite,sql scripter,sql,flexera
Posted 1 day ago
5.0 years
0 Lacs
India
Remote
About Company: Our Client is one of the world's fastest-growing AI companies accelerating the advancement and deployment of powerful AI systems. The Client helps customers in two ways: working with the world’s leading AI labs to advance frontier model capabilities in thinking, reasoning, coding, agentic behavior, multimodality, multilinguality, STEM and frontier knowledge; and leveraging that work to build real-world AI systems that solve mission-critical priorities for companies. Powering this growth is the Client's talent cloud—an AI-vetted pool of 4M+ software engineers, data scientists, and STEM experts who can train models and build AI applications. All of this is orchestrated by ALAN—our AI-powered platform for matching and managing talent, and generating high-quality human and synthetic data to improve model performance. ALAN also accelerates workflows for model and agent evals, supervised fine-tuning, reinforcement learning, reinforcement learning with human feedback, preference-pair generation, benchmarking, data capture for pre-training, post-training, and building AI applications. The Client—based in San Francisco, California—was named #1 on The Information's annual list of "Top 50 Most Promising B2B Companies," and has been profiled by Fast Company, TechCrunch, Reuters, Semafor, VentureBeat, Entrepreneur, CNBC, Forbes, and many others. The Client's leadership team includes AI technologists from Meta, Google, Microsoft, Apple, Amazon, X, Stanford, Caltech, and MIT. Job Title: Python Developer Location: Remote Note: The candidate should be comfortable working US shifts/night shifts. Interview Mode: Virtual (two rounds of interviews: 60 min technical + 30 min technical & cultural discussion) Client: Turing Experience: 5+ yrs Job Type: Contract to hire. Notice Period: Immediate joiners. Roles and Responsibilities: Analyze and triage GitHub issues across trending open-source libraries. Set up and configure code repositories, including Dockerization and environment setup. Evaluate unit test coverage and quality. Modify and run codebases locally to assess LLM performance in bug-fixing scenarios. Collaborate with researchers to design and identify repositories and issues that are challenging for LLMs. Opportunities to lead a team of junior engineers and collaborate on projects. Required Skills: Minimum 5+ years of overall experience Strong experience with at least one of the following languages: Python Proficiency with Git, Docker, and basic software pipeline setup. Ability to understand and navigate complex codebases. Comfortable running, modifying, and testing real-world projects locally. Experience contributing to or evaluating open-source projects is a plus.
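To make the issue-triage responsibility above concrete, here is a hedged sketch of pulling open issues from a repository via the public GitHub REST API; the repository name is a placeholder, and the token is an optional assumption read from the environment.

```python
# Minimal sketch: list open issues for a repository with the GitHub REST API.
# The repository is a placeholder; an optional GITHUB_TOKEN raises the rate limit.
import os
import requests

REPO = "pallets/flask"   # placeholder open-source repository
URL = f"https://api.github.com/repos/{REPO}/issues"

headers = {"Accept": "application/vnd.github+json"}
if os.environ.get("GITHUB_TOKEN"):
    headers["Authorization"] = f"Bearer {os.environ['GITHUB_TOKEN']}"

resp = requests.get(URL, headers=headers, params={"state": "open", "per_page": 20})
resp.raise_for_status()

for issue in resp.json():
    if "pull_request" in issue:        # the issues endpoint also returns PRs; skip them
        continue
    labels = ", ".join(label["name"] for label in issue["labels"])
    print(f"#{issue['number']:>5}  [{labels or 'unlabelled'}]  {issue['title']}")
```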
Posted 1 day ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
About Company:
They balance innovation with an open, friendly culture and the backing of a long-established parent company known for its ethical reputation. We guide customers from what's now to what's next by unlocking the value of their data and applications to solve their digital challenges, achieving outcomes that benefit both business and society.
About Client:
Our client is a global digital solutions and technology consulting company headquartered in Mumbai, India. The company generates annual revenue of over $4.29 billion (₹35,517 crore), reflecting a 4.4% year-over-year growth in USD terms. It has a workforce of around 86,000 professionals operating in more than 40 countries and serves a global client base of over 700 organizations.
Our client operates across several major industry sectors, including Banking, Financial Services & Insurance (BFSI), Technology, Media & Telecommunications (TMT), Healthcare & Life Sciences, and Manufacturing & Consumer. In the past year, the company achieved a net profit of $553.4 million (₹4,584.6 crore), marking a 1.4% increase from the previous year. It also recorded a strong order inflow of $5.6 billion, up 15.7% year-over-year, highlighting growing demand across its service lines. Key focus areas include Digital Transformation, Enterprise AI, Data & Analytics, and Product Engineering, reflecting its strategic commitment to driving innovation and value for clients across industries.
Job Title: React.js Developer
Location: Pune
Experience: 5+ years
Job Type: Contract to hire (minimum 1+ year)
Notice Period: Immediate joiners
Job Description:
5+ years of hands-on experience in React.js and Node.js
React class-based components and lifecycle methods
React function components and React Hooks
Redux framework and state management
Express.js middleware, authentication, and authorization
Jest and Enzyme for unit test cases
Styled-components for CSS
HTTP network requests and REST APIs
Proficient in HTML, responsive CSS, JavaScript, and JSON data handling
Proficient in asynchronous programming: Promises, async/await, and callbacks
Strong debugging skills in JavaScript
Hands-on experience with Jenkins, Git, SonarQube, Checkmarx, and Nexus
Good to have: performance tuning, MongoDB
Posted 1 day ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
We are looking for a skilled and hands-on AI Engineer to design, develop, and deploy an in-house AI assistant powered by LLaMA 3 and integrated with our MS SQL-based ERP system (4QT ERP). This role includes responsibility for setting up LLM infrastructure, voice input (Whisper), natural-language-to-SQL translation, and delivering accurate, context-aware responses to ERP-related queries (a minimal sketch of such a pipeline follows this posting).
Key Responsibilities:
Set up and deploy LLaMA 3 (8B/FP16) models using llama-cpp-python or Hugging Face
Integrate the AI model with FastAPI to create secure REST endpoints
Connect to the MS SQL database and design query logic for ERP modules (Sales, Payments, Units, etc.)
Implement prompt engineering or fine-tuning (LoRA) to improve SQL generation accuracy
Build a user-facing interface (React or a basic web UI) for interacting via text or voice
Integrate Whisper (OpenAI) or any STT system to support voice commands
Ensure model responses are secure, efficient, and auditable (only SELECT queries allowed)
Supervise or perform supervised fine-tuning with custom ERP datasets
Optimize for performance (GPU usage) and accuracy (prompt/RAG tuning)
Must-Have Skills:
Strong experience with LLM deployment (LLaMA 3, Mistral, GPT-type models)
Solid Python development experience using FastAPI or Flask
SQL knowledge (especially MS SQL Server) – must know how to write and validate queries
Experience with llama-cpp-python, Hugging Face Transformers, and LoRA fine-tuning
Familiarity with LangChain or similar LLM frameworks
Understanding of Whisper (STT) or equivalent speech-to-text tools
Experience working with GPU inference (NVIDIA 4070/5090, etc.)
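The posting describes a fairly concrete pipeline: a local LLaMA 3 model served through FastAPI, translating natural-language questions into guarded SELECT queries against MS SQL. The sketch below shows one way those pieces could fit together. It is hedged and illustrative only: the model path, connection string, schema hint, and /ask route are invented placeholders rather than the actual 4QT ERP integration, and the SELECT-only check is deliberately simplistic.

```python
# Hedged sketch: model path, connection string, schema hint, and endpoint are
# placeholders, not the actual 4QT ERP integration.
import pyodbc
from fastapi import FastAPI, HTTPException
from llama_cpp import Llama
from pydantic import BaseModel

MODEL_PATH = "/models/llama-3-8b-instruct.Q5_K_M.gguf"            # placeholder path
CONN_STR = ("DRIVER={ODBC Driver 17 for SQL Server};"
            "SERVER=erp-db;DATABASE=ERP;Trusted_Connection=yes;")  # placeholder
SCHEMA_HINT = "Tables: Sales(InvoiceNo, Amount, UnitID), Units(UnitID, UnitName)"  # hypothetical

llm = Llama(model_path=MODEL_PATH, n_ctx=4096, n_gpu_layers=-1)  # offload all layers to GPU
app = FastAPI()

class Question(BaseModel):
    text: str

def draft_sql(question: str) -> str:
    """Ask the local model for a single T-SQL SELECT answering the question."""
    prompt = (f"{SCHEMA_HINT}\n"
              f"Write one T-SQL SELECT statement answering: {question}\nSQL:")
    out = llm(prompt, max_tokens=256, temperature=0.0, stop=[";"])
    return out["choices"][0]["text"].strip() + ";"

@app.post("/ask")
def ask(q: Question) -> dict:
    sql = draft_sql(q.text)
    # Guardrail: execute only a single read-only SELECT statement.
    if not sql.lstrip().lower().startswith("select") or ";" in sql[:-1]:
        raise HTTPException(status_code=400, detail="Only single SELECT statements are allowed.")
    with pyodbc.connect(CONN_STR) as conn:
        rows = conn.cursor().execute(sql).fetchall()
    return {"sql": sql, "rows": [list(r) for r in rows]}
```

A production version would add authentication on the endpoint, a table allow-list, query auditing, and the prompt-engineering or LoRA fine-tuning and Whisper voice input mentioned above.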
Posted 1 day ago
8.0 - 15.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Title: Oracle DBA
Location: Noida
Experience: 8-15 Years
Required Skills
Hands-on experience with Oracle database activities, SQL, OEM, GoldenGate replication (CDC & BDA), and RAC/EXA setup.
Expertise in SQL profiling/performance tuning, database monitoring, backup and restore, Data Guard, the Grid Control toolset, etc.
Responsible for technical database support of infrastructure, applications, and other components and processes.
Participate in planning, development of specifications, and other supporting documentation and processes.
Knowledge of the finance/banking industry, its terminology, data, and data structures is an added advantage.
Knowledge of SQL Server as well as Oracle databases.
Knowledge of Identity Framework.
Experienced technical knowledge in the specialty area, with basic knowledge of complementary infrastructures.
A fast learner with the ability to dive into new products and technologies, develop subject matter expertise, and drive projects to completion.
A team player with good written and verbal communication skills who can mentor other members of the production support group.
Candidates with knowledge of and experience in scripting languages (ksh, Perl, etc.) will be preferred for this role.
Understanding of ITIL processes.
Effective use of monitoring tools.
Posted 1 day ago