4.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Purpose
Design and develop software solutions to meet the functional requirements of innovative products developed by Trimble. Make use of well-established design patterns and architectures that suit the Salesforce platform and follow agile development processes. Responsible for designing, implementing, and maintaining software applications on the Salesforce platform. Perform unit testing of the code developed, and develop unit test cases and test harnesses. Perform system integration and bug fixing, develop user documentation, generate relevant reports, and review similar work done by peers.

Primary Responsibilities
Read and understand high-level product descriptions or requirement documents and propose one or more software designs at the module level that are highly reusable and consistent with the design principles of the software’s target host platform. Decompose design elements into structured code as per prevailing coding guidelines, including preparation and execution of unit test cases and development of test code or test harnesses. Trace back through code and design to resolve issues and bugs. Plan, organize, and execute assignments with little or moderate supervision in an agile environment. Responsible for deliveries within the required deadlines; deliveries can be modules, documentation, customer releases, etc. Collaborate with business analysts and stakeholders to gather and analyze business requirements. Coordinate with the team for the timely delivery of work products. Ensure the quality of work products through reviews. Develop and maintain customizations, configurations, and integrations on the Salesforce platform. Write and maintain clean, efficient, and well-documented code. Ensure data integrity and accuracy through data analysis and data cleansing activities. Create and maintain technical design documentation. Participate in code reviews and provide constructive feedback to peers. Develop and execute unit and integration tests. Support the deployment of solutions into production environments. Provide ongoing maintenance and support for existing Salesforce applications. Take sub-module level responsibility in large projects, module (or component) level responsibility in small/medium-sized projects, and complete responsibility in small-sized projects, depending upon the complexity and decomposition. Work with User Experience and QA teams to ensure the validity of the solution. Stay current with technological and market evolutions. Other responsibilities as assigned by management from time to time.

Skills And Background
Solid understanding of Salesforce development principles, including Apex, Visualforce, Lightning Web Components (LWC), SOQL, and SOSL. Experience with Salesforce integrations using REST and SOAP APIs. Experience with the Salesforce data model and security model. Good understanding of software architectures, algorithms, data structures, and software engineering principles. Good exposure to design, development, and debugging tools, and to agile methodologies. Good analytical and problem-solving skills. Experience with Salesforce best practices for system development and integration. Strong background in database design and system architecture. Experience performing major transformations from Classic to Lightning. 4+ years of experience developing on the full-stack Force.com platform. 2+ years of experience with Lightning Web Components (LWC) programming. 3+ years of experience with Apex Triggers, Batch Classes, @Future Methods, and Controllers. Calling REST web services from Apex, generating and parsing 
JSON in Apex. Visualforce pages and components. Visualforce Remoting. Effective Apex unit testing, including web service mocking. Should have experience with Service Cloud / Sales Cloud / Community Cloud. Experience with Salesforce integration patterns, including application programming interfaces (APIs) and bulk data uploads. Experience with release management, source control, and deployment concepts and technologies such as ANT, the SFDC Metadata API, Jenkins, Git (Code Commit), and DevOps in a Salesforce environment. Excellent communication and interpersonal skills. Upbeat, highly motivated self-starter. Salesforce badges and certifications (preferred). Java experience is a plus.

Work Experience
Must have been a developer for 4 to 7 years or more in relevant areas in a tier-1 or tier-2 ranked product company. Experience working with cross-cultural teams.

Minimum Required Qualification
Bachelor's or Master's degree in Engineering from a tier-1 or tier-2 ranked institute with a major in Computer Science or Information Technology, or a Bachelor's or Master's degree in Engineering from a tier-1 or tier-2 ranked institute with a major in Electrical or Electronics Engineering and a minor in Computer Science or Information Technology.

Reporting
Individuals selected for this role shall report to a Technical Project Manager, Engineering Manager, Engineering Director, or a person designated by the division.

About Trimble
Dedicated to the world’s tomorrow, Trimble is a technology company delivering solutions that enable our customers to work in new ways to measure, build, grow and move goods for a better quality of life. Core technologies in positioning, modeling, connectivity and data analytics connect the digital and physical worlds to improve productivity, quality, safety, transparency and sustainability. From purpose-built products and enterprise lifecycle solutions to industry cloud services, Trimble is transforming critical industries such as construction, geospatial, agriculture and transportation to power an interconnected world of work. For more information about Trimble (NASDAQ: TRMB), visit: www.trimble.com

Trimble’s Inclusiveness Commitment
We believe in celebrating our differences. That is why our diversity is our strength. To us, that means actively participating in opportunities to be inclusive. Diversity, Equity, and Inclusion have guided our current success while also moving our desire to improve. We actively seek to add members to our community who represent our customers and the places we live and work. We have programs in place to make sure our people are seen, heard, and welcomed and, most importantly, that they know they belong, no matter who they are or where they are coming from.
Posted 2 weeks ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Description
About Amazon.com: Amazon.com strives to be Earth's most customer-centric company where people can find and discover virtually anything they want to buy online. By giving customers more of what they want - low prices, vast selection, and convenience - Amazon.com continues to grow and evolve as a world-class e-commerce platform. Amazon's evolution from Web site to e-commerce partner to development platform is driven by the spirit of innovation that is part of the company's DNA. The world's brightest technology minds come to Amazon.com to research and develop technology that improves the lives of shoppers and sellers around the world.

Overview of the role
The Business Research Analyst will be responsible for the data and machine learning part of continuous improvement projects across the compatibility and basket-building space. This will require collaboration with local and global teams that have process and technical expertise. Therefore, the RA should be a self-starter who is passionate about discovering and solving complicated problems, learning complex systems, working with numbers, and organizing and communicating data and reports. In the compatibility program, the RA performs big data analysis to identify patterns and trains models to generate product-to-product and product-to-brand/model relationships. The RA also continuously improves the ML solution for higher accuracy, efficiency, and scalability. The RA writes clear and detailed functional specifications based on business requirements, as well as writes and reviews business cases.

Key job responsibilities
Scoping, driving, and delivering complex projects across multiple teams. Perform root cause analysis by understanding the data need, pulling the data, and analyzing it to form hypotheses and validate them with data. Conduct thorough analysis of large datasets to identify patterns, trends, and insights that can inform the development of NLP applications. Develop and implement machine learning models and deep learning architectures to improve NLP systems. Design and implement core NLP tasks such as named entity recognition, classification, and part-of-speech tagging. Dive deep to drive product pilots, build and analyze large data sets, and construct problem hypotheses that help steer the product feature roadmap, using Python, database tools (e.g. SQL, Spark), and ML platforms (TensorFlow, PyTorch). Conduct regular code reviews and implement quality assurance processes to maintain high standards of code quality and performance optimization. Provide technical guidance and mentorship to junior team members and collaborate with external partners to integrate cutting-edge technologies. Find scalable solutions for business problems by executing pilots and building deterministic and ML models (plug-and-play on ready-made ML models, with Python skills). Perform supporting research, conduct analysis for the larger parts of projects, and effectively interpret reports to identify opportunities, optimize processes, and implement changes within their part of the project. Coordinate design efforts between internal and external teams to develop optimal solutions for their part of the project for Amazon’s network. Ability to convince and interact with stakeholders at all levels, either to gather data and information or to execute and implement according to plan. 
About The Team
Amazon.com operates in a virtual, global eCommerce environment without boundaries, and operates a diverse set of businesses in 14 countries, including Retail, third-party marketplaces, eCommerce platforms, and web services for developers. The Retail Business Service (RBS) organization is a core part of optimizing the customer experience and the selling partner experience. This team is part of the RBS Customer Experience business unit. The team’s primary role is to create and enhance retail selection on the worldwide Amazon online catalog. The compatibility program handled by this team has a direct impact on customer buying decisions and the online user experience. The compatibility program aims to answer customers' purchase questions about whether two products work together, as well as reduce returns due to incompatibility.

Basic Qualifications
Ability to analyse and then articulate business issues to a wide range of audiences using strong data, written, and verbal communication skills. Good mastery of BERT and other NLP frameworks such as GPT-2, XLNet, and Transformer models. Experience in NLP techniques such as tokenization, parsing, lexing, named entity recognition, sentiment analysis, and spellchecking. Strong problem-solving skills, creativity, and ability to overcome challenges. SQL/ETL, automation tools. Relevant bachelor’s degree or higher. 3+ years combined of relevant work experience in related fields (project management, customer advocate, product owner, engineering, business analysis) - diverse experience will be favored, e.g. a mix of experience across different roles. Self-motivated and autonomous, with an ability to prioritize well and remain focused when working within a team located across several countries and time zones.

Preferred Qualifications
3+ years combined of relevant work experience in related fields (project management, customer advocate, product owner, engineering, business analysis) - diverse experience will be favored, e.g. a mix of experience across different roles. Understanding of machine learning concepts, including developing models and tuning hyper-parameters, as well as deploying models and building ML services. Experience with computer vision algorithms and libraries such as OpenCV, TensorFlow, Caffe, or PyTorch. Technical expertise and experience in data science and ML.

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - ADCI - BLR 14 SEZ Job ID: A3031496
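The core NLP tasks this posting names (named entity recognition, classification, part-of-speech tagging) can be illustrated with a minimal Python sketch. The spaCy model name is a common example and the compatibility sentence is invented for illustration; this is not the team's actual pipeline.

```python
# Minimal NER and part-of-speech sketch with spaCy (assumes the small English
# model has been downloaded with: python -m spacy download en_core_web_sm).
import spacy

nlp = spacy.load("en_core_web_sm")

# Hypothetical product-compatibility sentence; real inputs would come from catalog data.
text = "The Canon EOS R10 body is compatible with RF and RF-S mount lenses."
doc = nlp(text)

# Named entities (brand/model mentions often surface as ORG/PRODUCT spans).
for ent in doc.ents:
    print(ent.text, ent.label_)

# Part-of-speech tags for each token.
for token in doc:
    print(token.text, token.pos_)
```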
Posted 3 weeks ago
0 years
4 - 4 Lacs
Bhubaneswar, Odisha, India
On-site
Worked in core PHP and PHP-MVC. In-depth knowledge of the CodeIgniter framework. Database management: MySQL. Parsing techniques: XML, JSON. Knowledge of JavaScript, CSS, HTML5, CSS3, jQuery, Bootstrap. In-depth knowledge of REST APIs.

Work Profile
Meeting with the development team to discuss user interface ideas and applications. Reviewing application requirements and interface designs. Identifying web-based user interactions. Developing and implementing highly responsive user interface components using PHP concepts. Writing application interface code using JavaScript following PHP workflows. Troubleshooting interface software and debugging application code. Developing and implementing front-end architecture to support user interface concepts. Monitoring and improving front-end performance. Documenting application changes and developing updates.

Primary Tasks
Create and maintain new modules and features for websites and CMS. Responsible for maintaining existing websites. Responsible for testing of applications (unit/integration). Developing and implementing highly responsive user interface components using PHP concepts. Writing application interface code using JavaScript following PHP workflows. Coordinate with the project lead/onsite team for clarifications and documentation.

PHP Developer Requirements
Should be open to learning PHP, as the organisation will groom interested candidates in PHP tech for handling upcoming projects.

Required Candidate Profile
Bachelor's degree in computer science, information technology, or a similar field. Previous experience working as a PHP developer. In-depth knowledge of JavaScript, CSS, HTML, and front-end languages. Knowledge of PHP tools, including the CodeIgniter framework. Experience with user interface design. Experience with browser-based debugging and performance testing software. Excellent troubleshooting skills. Good project management skills.

Job Types: Full-time, Permanent
Pay: ₹360,000.00 - ₹450,000.00 per year
Benefits: Health insurance
Schedule: Day shift
Supplemental Pay: Performance bonus
Work Location: In person
Skills: CodeIgniter
Posted 3 weeks ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
We are looking for an experienced application packager to re-package, test, and certify our current desktop application suite. The candidate will work with subject matter experts to ensure the functionality of in-house or third party software is fully tested for the Windows 11 migration project. The candidate should have strong knowledge of Office plug-ins, application compatibility, and operating system user access control (UAC) as it relates to software functionality on Windows 10. The individual will help us shape the process around our application lifecycle and assist in the compilation of user profile components that will follow the end user client. The candidate should have extensive experience packaging applications in support of migrating from Windows 10 to Windows 11. Requirements Application packaging experience in similar environment Experience working on Windows Operating System migrations. Work with developers, application owners and subject matter experts to ensure successful automated installation of software. Ability to work with subject matter experts and end user clients to coordinate UAT. Experience in peer review of a packager’s work and quality assurance of packaged applications to ensure they conform to existing corporate standards. Creation of technical documentation of the packaging work. In depth knowledge and experience using software packaging tools, (e.g. InstallShield, Orca, App-V sequencer, Wise, Vbscript). Automation and scripting using Powershell Strong knowledge of the Windows Registry. An understanding of Windows User Access Control (UAC) and process elevation. Full understanding of file system and registry security in locked down environments. Configuring PowerBroker policies to support application elevation in a locked down environment Knowledge in application discovery of installed applications and integration into CMDB a plus. Strong troubleshooting mind-set (e.g. parsing event log, Sysinternals, analyzing memory dumps) General knowledge of user profile management (Citrix UPM, WEM, Flex, Appsense, UE-V). Coding in SQL is a plus. Knowledge of GPO. Knowledge and experience deploying applications through Altiris ITMS (preferred) or SCCM. Experience building and using application testing tools and frameworks in support of Windows As A Service and automated application lifecycle management
Posted 3 weeks ago
4.0 - 6.0 years
0 Lacs
Greater Chennai Area
On-site
Overview
Prodapt is looking for a Software Engineer (AI/ML) with 4-6 years of experience, well versed in implementing and integrating AI applications.

Responsibilities
Cloud Environment Setup & Integration: Single Sign-On (SSO), the Cosmos platform for LLM/workflow integration, and logging, monitoring, and security services. Deploy the solution on Azure.
RAG Pipeline Development: Build a RAG pipeline to ingest and analyze AuditBoard reports (PDF, text, etc.). Enable querying and LLM-based summarization.
User Interface: Develop an intuitive UI to support interactive querying, result visualization, and report generation/export.

Requirements
Experience: 4-6 years. Key skills: Python, LLM/RAG concepts. Azure Cloud experience (or equivalent cloud). Data ingestion, document parsing (PDF, text). REST API and basic UI development (React, Streamlit, etc.). Understanding of security and integration patterns (OAuth/SSO preferred).
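As a rough illustration of the RAG pipeline this role describes, the sketch below shows only the retrieval step, with a TF-IDF retriever standing in for an embedding model and vector store. The report snippets and query are invented; a full pipeline would ingest parsed PDF/text chunks and pass the assembled prompt to an LLM hosted on Azure.

```python
# Minimal retrieval step of a RAG pipeline, sketched with a TF-IDF retriever
# in place of an embedding model + vector store. Chunks and query are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

chunks = [
    "Audit finding 12: access reviews for the finance system were not completed in Q2.",
    "Audit finding 7: change management tickets lacked approval evidence.",
    "Audit finding 3: backup restoration tests were performed on schedule.",
]

vectorizer = TfidfVectorizer()
chunk_matrix = vectorizer.fit_transform(chunks)

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k chunks most similar to the query."""
    scores = cosine_similarity(vectorizer.transform([query]), chunk_matrix)[0]
    top = scores.argsort()[::-1][:k]
    return [chunks[i] for i in top]

context = retrieve("Which findings relate to access reviews?")
prompt = "Answer using only this context:\n" + "\n".join(context)
print(prompt)  # in a full pipeline, this prompt would be sent to the LLM for summarization
```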
Posted 3 weeks ago
10.0 years
0 Lacs
India
On-site
Company Description
Technocratic Solutions is a leading provider of contract-based technical resources, serving businesses worldwide with top-notch software solutions. Our expert team specializes in cutting-edge technologies such as PHP, Java, JavaScript, Drupal, QA, Blockchain AI, and more. We empower businesses by providing high-quality technical resources that meet their project requirements effectively. Committed to exceptional customer service, we continuously enhance our services to maintain our reputation as a reliable partner focused on customer success. Join us and experience the difference of working with a partner driven by excellence.

Key Responsibilities
Looking for 10 years of experience as an AI Architect.

Discovery Phase Leadership
• Client Engagement: Work directly with the client to understand and document all use cases (that are required to be built) spanning semantic search, document processing, predictive modeling, and agentic analytics
• Requirements Analysis: Translate complex business needs into detailed technical specifications with accuracy requirements (including 100% accuracy for financial compliance use cases)
• Architecture Strategy: Design future-proof, modular architecture that prevents vendor lock-in while maximizing strategic flexibility

Technical Architecture Design
• Prototype Development: Build working demos demonstrating key capabilities and optimization approaches
• Cost-Benefit Analysis: Justify investment in a tech stack by comparing it against other stacks for the long-term roadmap
• Implementation Roadmap: Detailed phased approach from pilot to full production deployment

Strategic Planning
• Long-term Vision: Create a long-term technology evolution plan that prevents costly refactoring
• Risk Assessment: Identify and mitigate stack lock-in risks and technical dependencies
• Go-to-Market Strategy: Define pilot features for rapid market entry while building toward a comprehensive platform

Required Technical Expertise
AI/ML Frameworks
• DSPy: Deep understanding of automated prompt optimization, few-shot learning, and algorithmic tuning
• LangGraph: Experience with multi-agent orchestration and complex workflow design
• Azure AI & PromptFlow: Proficiency in Microsoft's AI services and visual workflow tools
• RAG Architectures: Advanced knowledge of retrieval-augmented generation systems

Cloud & Infrastructure
• Azure Ecosystem: Comprehensive understanding of AI Foundry, Cognitive Services, and enterprise scaling
• Microservices Architecture: Design of modular, swappable components
• API Design: RESTful services and integration patterns
• Performance Optimization: Large-scale system optimization and monitoring
• Hybrid AI Stack: Design and validate integration of DSPy + LangGraph + PromptFlow + Azure AI services
• Scalability Planning: Architect solutions for a 100K user base with cost-effective licensing models
• Integration Strategy: Plan seamless integration with the existing product ecosystem
• Technology Evaluation: Conduct comparative analysis of AI frameworks, providing evidence-based recommendations

Deliverable Creation
• Technical Feasibility Studies: Comprehensive analysis for all use cases of the requirement

Financial Services Domain [Good to have]
• Regulatory Compliance: Understanding of financial data accuracy requirements and audit trails
• Document Processing: Experience with legal document parsing (LPAs, fund documents)
• Predictive Analytics: Investment modeling and risk assessment systems
• CRM Integration: Customer relationship management and 
sentiment analysis Required Experience Professional Background • 8+ years in AI/ML architecture roles with enterprise clients • Hands-on experience with modern AI frameworks (DSPy, LangGraph, or similar) • Proven track record of leading discovery and implementation for complex AI implementations Client Management • Executive Communication: Ability to present technical concepts to C-level stakeholders • Requirements Gathering: Expert in translating business needs to technical specifications • Stakeholder Management: Experience managing demanding, detail-oriented clients • Documentation: Exceptional technical writing and presentation skills Technical Leadership • Architecture Design: Led design of scalable AI systems serving 50K+ users • Technology Evaluation: Experience conducting comparative analysis of AI platforms • Prototype Development: Hands-on coding ability for proof-of-concept development • Cost Estimation: Accurate project scoping and resource planning Preferred Qualifications Advanced Expertise • PhD/MS in Computer Science, AI/ML, or related field • Publications/Patents in AI optimization or enterprise AI architecture • Speaking Experience at AI conferences or industry events • Open Source Contributions to AI frameworks or libraries Industry Experience [Good to have] • Private Equity/Investment Management domain knowledge • Regulatory Technology experience with audit and compliance systems • Enterprise AI Deployments at scale (100K+ users) • Cost Optimization experience with AI workloads and licensing models Key Success Metrics Discovery Phase Outcomes • Client Approval: Scott approves progression to development phase based on discovery results • Technical Validation: All use cases of the requirement deemed technically feasible with proposed architecture • Cost Justification: Clear ROI demonstration for 4x cost premium over SFDC alternative • Timeline Adherence: Discovery completed within agreed timeframe and budget. Application Requirements Portfolio Submission • Architecture Samples: 2-3 examples of complex AI system designs you've led • Case Studies: Detailed examples of discovery phase leadership with measurable outcomes • Technical Writing: Samples of technical documentation for executive audiences • Client References: References from previous discovery/consulting engagements Technical Assessment • Architecture Design: Live design session for a sample use case from Scott's requirements • Framework Knowledge: Deep-dive technical discussion on DSPy optimization approaches • Business Acumen: Case study analysis of technology investment decisions • Client Interaction: Mock discovery session with simulated challenging client requirements. Architecture Quality • Future-Proof Design: Architecture prevents vendor lock-in and supports long-term evolution • Scalability Validation: 100K user performance and cost models validated • Integration Feasibility: Seamless integration strategy with the product confirmed • Accuracy Framework: 100% accuracy requirements for financial compliance addressed
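One way to read the "modular, swappable components" and vendor lock-in requirements above is an adapter layer like the hedged sketch below: callers depend on a small interface rather than any vendor SDK. Class and method names are illustrative assumptions, not part of any named framework.

```python
# Sketch of the swappable-component idea: code depends on a small interface,
# so an LLM provider can be exchanged without refactoring callers.
from typing import Protocol

class TextGenerator(Protocol):
    def generate(self, prompt: str) -> str: ...

class AzureOpenAIGenerator:
    """Placeholder adapter; a real one would wrap the vendor SDK."""
    def generate(self, prompt: str) -> str:
        return f"[azure] response to: {prompt}"

class LocalModelGenerator:
    """Placeholder adapter for a self-hosted model."""
    def generate(self, prompt: str) -> str:
        return f"[local] response to: {prompt}"

def summarize(document: str, llm: TextGenerator) -> str:
    # Callers only know about the TextGenerator interface.
    return llm.generate(f"Summarize:\n{document}")

print(summarize("Fund document text...", AzureOpenAIGenerator()))
print(summarize("Fund document text...", LocalModelGenerator()))
```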
Posted 3 weeks ago
4.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Required Skills & Qualifications Programming Languages & Frameworks: 4-5+ years of professional experience in Python development. 2+ years of experience with Django (Django REST Framework a strong plus). Data & Databases: Strong expertise in PostgreSQL: schema design, writing optimized SQL (indexes, partitions), migrations (Django migrations). Comfortable designing star-schema/dimension-fact table models. Experience with CSV/JSON parsing libraries (e.g., pandas, csv, dictreader) and writing resilient ETL scripts. Web Scraping & Automation: Hands-on experience with headless-browser automation tools such as Selenium or Playwright (Python bindings). Familiarity with handling OTP/2FA flows programmatically (e.g., integrating with Twilio, Vault, or custom prompt workflows). API Development & Security: Proficient building, testing, and documenting RESTful APIs (Django REST Framework, DRF serializers, viewsets). Strong understanding of JWT or token-based authentication, secure session management, and role-based ACL. Scheduling & Background Jobs: Experience setting up CRON, APScheduler, Celery (with Redis/RabbitMQ), or equivalent for periodic job orchestration. Knowledge of implementing retry logic, backoff strategies, and idempotency for long-running tasks. DevOps & Deployment: Familiar with Docker and containerization best practices for Python applications. Experience writing CI/CD pipelines (GitHub Actions, GitLab CI, Jenkins). Exposure to cloud platforms (AWS, GCP, or Azure), specifically RDS or managed PostgreSQL, EC2/ECS, and secrets management (AWS Secrets Manager, Parameter Store). Logging & Monitoring: Skilled in integrating structured logging (with logging, StructLog, or log aggregation services like ELK/Elastic Stack, Splunk). Familiarity with error-tracking tools (e.g., Sentry) and writing health-check endpoints. Other Technical Skills: Proficient in Git version control, code reviews (GitHub/GitLab). Ability to write unit tests (pytest, Django TestCase) and integration tests. Strong understanding of REST API performance optimization and caching strategies (Redis/memcached). Soft Skills: Excellent problem-solving skills and attention to detail. Strong communication skills to collaborate with product owners, data analysts, and frontend developers. Self-motivated, able to prioritize tasks, and deliver on aggressive timelines. Familiarity with Agile/Scrum methodologies; comfortable working in sprints, attending stand-ups, and refining user stories. NP : Immediate to 30 Days preferred.
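A minimal sketch of the "resilient ETL scripts" idea from the posting above: csv.DictReader parsing, row validation, and retry with backoff. The field names and the load_row target are hypothetical placeholders for an idempotent Django model upsert.

```python
# Resilient CSV ingest sketch: parse, validate, retry with exponential backoff.
import csv
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

def with_retries(fn, attempts=3, base_delay=1.0):
    """Retry a callable with exponential backoff."""
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except Exception as exc:
            if attempt == attempts:
                raise
            delay = base_delay * 2 ** (attempt - 1)
            log.warning("attempt %d failed (%s); retrying in %.1fs", attempt, exc, delay)
            time.sleep(delay)

def load_row(row: dict) -> None:
    # Placeholder for an idempotent upsert, e.g. Model.objects.update_or_create(...)
    log.info("upsert %s", row["id"])

def run(path: str) -> None:
    with open(path, newline="") as fh:
        for row in csv.DictReader(fh):
            if not row.get("id"):          # skip malformed rows instead of failing the job
                log.warning("skipping row without id: %r", row)
                continue
            with_retries(lambda r=row: load_row(r))

# run("accounts.csv")
```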
Posted 3 weeks ago
0 years
5 - 16 Lacs
India
On-site
Job description
Senior Embedded Software Engineer - Job Description
We are seeking an experienced Senior Embedded Software Engineer to join our team. The ideal candidate will have expertise in software architecture, design, and implementation for embedded systems. The role requires proficiency in various programming languages, communication protocols, and multimedia technologies.

Key Responsibilities:
Develop and maintain embedded software solutions for various applications. Design and implement software architecture using OOP concepts. Work on video processing and streaming technologies, including ONVIF/RTSP, H264/H265, and related codecs. Integrate and optimize multimedia frameworks like FFmpeg, GStreamer, and LIVE555. Develop GUI applications using QT/QML. Implement and troubleshoot communication protocols, including BLE, WiFi, RS485, and UART. Develop multi-threaded and socket-based applications. Perform image processing and video parsing using tools like OpenCV. Collaborate with cross-functional teams for hardware-software integration.

Key Skills:
Proficiency in C, C++, Java, and Python. Strong understanding of software architecture and design principles. Expertise in multi-threading and socket programming. Knowledge of video codecs, streaming protocols, and multimedia frameworks. Hands-on experience with communication protocols and hardware interfaces.

Qualifications:
Bachelor’s or Master’s degree in Computer Science or Electronics. Proven experience in embedded software development. Strong analytical and problem-solving skills.

Job Type: Full-time
Pay: ₹547,829.17 - ₹1,669,382.12 per year
Schedule: Day shift
Work Location: In person
Speak with the employer: +91 9081068979
Expected Start Date: 12/07/2025
Posted 3 weeks ago
10.0 years
40 - 50 Lacs
Pune, Maharashtra, India
On-site
About Company
With a focus on Identity and Access Management (IAM) and Customer Identity and Access Management (CIAM), we offer cutting-edge solutions to secure your workforce, customers, and partners. Our expertise also includes offering new-age security solutions for popular CMS and project management platforms like Atlassian, WordPress, Joomla, Drupal, Shopify, BigCommerce, and Magento. Our solutions are specific, accurate, and, most importantly, great at doing what they’re supposed to: making you more secure!

Position Details
We are looking for a talented and experienced AI/ML Engineer to join our growing team and contribute to the development of cutting-edge AI-powered products and solutions. The ideal candidate will have 10+ years of hands-on experience in developing and deploying advanced AI and ML models and related software systems.
Status: Full Time, Employee
Experience: 10+ Years
Qualifications: Bachelor's or Master's Degree in Computer Science, Data Science, Computational Linguistics, Natural Language Processing (NLP), or other related fields.
Location: Baner, Pune

Roles & Responsibilities
Develop machine learning and deep learning models and algorithms to solve complex business problems, improve processes, and enhance product functionality. Develop and deploy personalized large language models (LLMs). Develop document parsing, named entity recognition (NER), retrieval-augmented generation (RAG), and chatbot systems. Build robust data and ML pipelines for production scale and performance. Optimize and fine-tune machine learning models for performance, scalability, and accuracy, leveraging techniques such as hyperparameter tuning and model optimization. Write robust, production-quality code using frameworks like PyTorch or TensorFlow. Stay updated on the latest advancements in AI/ML technologies, tools, and methodologies, incorporating best practices into development processes. Collaborate with stakeholders to understand business requirements, define project objectives, and deliver AI/ML solutions that meet customer needs and drive business value.

Requirements
Bachelor's or Master's Degree in Computer Science, Data Science, Computational Linguistics, Natural Language Processing (NLP), or other related fields. 10+ years of experience in developing and deploying machine learning models and algorithms, with hands-on experience with AI/ML frameworks (e.g., TensorFlow, PyTorch, scikit-learn). Experience with Python, spaCy, NLTK, and knowledge graphs. Experience with search, particularly information retrieval, Elasticsearch, and relevance. Experience working with and fine-tuning existing models, especially from Hugging Face. Strong programming skills in languages such as Python, Java, or C++ for AI/ML model development and integration. Familiarity with web frameworks (FastAPI, Flask, Django, etc.) for building APIs. Knowledge of agentic AI frameworks like LangChain, LangGraph, AutoGen, or Crew AI. Outstanding knowledge and experience in Data Science and MLOps in the fields of ML/DL and Generative AI, with experience in containerization (Docker) and orchestration (Kubernetes) for deployment. Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and AI/ML deployment tools for scalable and reliable model deployment. Strong communication and collaboration skills, with the ability to work independently and collaboratively in a dynamic environment.

What We Offer You
A constant stream of new things for you to learn. We're always expanding into new areas and exploring new technologies. 
A set of extraordinarily talented and dedicated peers. A stable, collaborative, and supportive work environment.

Skills: AI/ML technologies, deep learning, AI, Kubernetes, natural language processing (NLP), scikit-learn, TensorFlow, C++, LangChain, Elasticsearch, Python, AutoGen, NLTK, deep learning models, Crew AI, Flask, chatbot, NER, ML, ML pipelines, NLP, MLOps, orchestration, machine learning algorithms, generative AI, data science, containerization, FastAPI, AI/ML, Django, Azure, GCP, machine learning models, APIs, spaCy, algorithms, LLM, Docker, agentic AI, large language models, PyTorch, artificial intelligence, RAG, Java, AWS, machine learning, ML/DL
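For the "robust, production-quality code using frameworks like PyTorch or TensorFlow" mentioned above, a minimal PyTorch training loop on synthetic data might look like the sketch below; the tiny network and data are illustrative only.

```python
# Minimal PyTorch training-loop sketch on synthetic regression data.
import torch
from torch import nn

torch.manual_seed(0)
X = torch.randn(256, 4)                                                   # synthetic features
y = X @ torch.tensor([1.0, -2.0, 0.5, 3.0]).unsqueeze(1) + 0.1 * torch.randn(256, 1)

model = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

for epoch in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()

print(f"final training loss: {loss.item():.4f}")
```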
Posted 3 weeks ago
3.0 years
3 - 8 Lacs
Gurgaon
On-site
DESCRIPTION
Interested in building something new? Join the Amazon Autos team on an exhilarating journey to redefine the vehicle shopping experience. This is an opportunity to be part of Amazon's new business ventures. Our goal is to create innovative automotive discovery and shopping experiences on Amazon, providing customers with greater convenience and a wider selection. You'll work in a creative, fast-paced, and entrepreneurial environment at the center of Amazon's innovation. As a key member, you'll play a pivotal role in helping us achieve our mission. We are looking for a highly accomplished Applied Science professional to drive our science strategy, foster a culture of data-driven decision-making, and drive impactful business outcomes through advanced, state-of-the-art science methodologies. If you're enthusiastic about innovating and delivering exceptional shopping experiences to customers, thrive on new challenges, and excel at solving complex problems using top-notch ML models, LLM and GenAI techniques, then you're the perfect candidate for this role. Strong business acumen and interpersonal skills are a must, as you'll work closely with business owners to understand customer needs and design scalable solutions. Join us on this exhilarating journey and be part of redefining the vehicle shopping experience.

Key job responsibilities
As an Applied Scientist in Amazon Autos, you will: Shape the roadmap and strategy for applying science to solve customer problems in the Amazon AutoStore domain. Drive big-picture innovations with clear roadmaps for intermediate delivery. Apply your skills in areas such as deep learning and reinforcement learning while building scalable solutions for business problems. Produce and deliver models that help build best-in-class customer experiences, and build systems that allow us to deploy these models to production with low latency and high throughput. Utilize your Generative AI, time series and predictive modeling skills, and creative problem-solving skills to drive new projects from ideation to implementation. Interface with business customers, gathering requirements and delivering science solutions. Collaborate with cross-functional teams, including software engineers, data scientists, and product managers, to define project requirements, establish success metrics, and deliver high-quality solutions. Effectively communicate complicated machine learning concepts to multiple partners. Research new and innovative machine learning approaches.

A day in the life
In this role, you will be part of a multidisciplinary team working on one of Amazon's newest business ventures. As a key member, you will collaborate closely with engineering, product, design, operations, and business development to bring innovative solutions to our customers. Your science expertise will be leveraged to research and deliver novel solutions to existing problems, explore emerging problem spaces, and create new knowledge. You will invent and apply state-of-the-art technologies, such as large language models, machine learning, natural language processing, and computer vision, to build next-generation solutions for Amazon. You'll publish papers, file patents, and work closely with engineers to bring your ideas to production.

About the team
This is a critical role for the Amazon Autos team, with a vision to create innovative automotive discovery and shopping experiences on Amazon, providing customers better convenience and more selection. 
We’re collaborating with other experienced teams at Amazon to define the future of how customers research and shop for cars online. BASIC QUALIFICATIONS 3+ years of building models for business application experience PhD, or Master's degree and 4+ years of CS, CE, ML or related field experience Experience in patents or publications at top-tier peer-reviewed conferences or journals Experience programming in Java, C++, Python or related language Experience in any of the following areas: algorithms and data structures, parsing, numerical optimization, data mining, parallel and distributed computing, high-performance computing PREFERRED QUALIFICATIONS Experience using Unix/Linux Experience in professional software development Experience building complex software systems, especially involving deep learning, machine learning and computer vision, that have been successfully delivered to customers Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Job details IND, HR, Gurugram Machine Learning Science
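The time-series and predictive-modeling skills referenced above could be exercised by something as simple as the hedged sketch below, which forecasts the next value of a synthetic series from lag features; real work would use richer features and proper backtesting.

```python
# Tiny lag-feature forecasting sketch with scikit-learn; the demand series is made up.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
series = 100 + np.cumsum(rng.normal(0, 2, size=120))     # synthetic daily demand

n_lags = 7
X = np.array([series[i:i + n_lags] for i in range(len(series) - n_lags)])
y = series[n_lags:]

model = LinearRegression().fit(X[:-14], y[:-14])          # hold out the last two weeks
print("holdout R^2:", round(model.score(X[-14:], y[-14:]), 3))
print("next-day forecast:", round(model.predict(series[-n_lags:].reshape(1, -1))[0], 2))
```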
Posted 3 weeks ago
0 years
1 - 3 Lacs
India
On-site
Job Summary
We are looking for a highly skilled React.js + Node.js Developer to join our development team. The ideal candidate will have strong hands-on experience with Node.js, Express, React.js, and TypeScript, along with ES6 features and a deep understanding of modern JavaScript development practices. This role requires a proactive developer who can build high-performance applications and is open to continuous learning.

Responsibilities
Develop and maintain responsive frontend applications using React.js. Build backend services and REST APIs using Node.js and Express. Build and maintain reusable component libraries for consistency and reusability. Work with JSON data structures, parsing, and dynamic manipulation. Use lazy loading and performance optimization techniques to enhance user experience. Collaborate with the team to deliver scalable and maintainable solutions. Use Git and version control systems to manage code and collaborate effectively. Stay updated with the latest technologies, and always be ready to learn new features or tech. Demonstrate excellent problem-solving and debugging skills.

Skills Required
Strong experience with React.js, Node.js, and Express.js. Solid understanding of JavaScript, TypeScript, and ES6+ features. Experience with SPA architecture and preventing memory leaks. Practical experience with JSON data manipulation. Understanding of lazy loading and code splitting. Knowledge of styling systems (e.g., SCSS, CSS Modules, Tailwind CSS). Hands-on with frontend bundlers like Webpack, Vite, or Parcel. Familiarity with Redux, Context API, and modular architecture. Proficiency in Git and version control best practices. Strong communication skills and an open-minded team player. Always eager to learn new technologies and adopt best practices. Testing and code quality awareness. Good knowledge of writing application documentation. API integration and third-party services.

Nice to Have
Experience with relational and NoSQL databases (e.g., MySQL, MongoDB). Familiarity with CI/CD pipelines. Jira tasks, Confluence documentation, Agile methodology. Knowledge of unit and end-to-end (E2E) testing tools (for full UI flow).

Job Type: Full-time
Pay: ₹15,000.00 - ₹30,000.00 per month
Benefits: Paid sick time, Paid time off
Work Location: In person
Expected Start Date: 14/07/2025
Posted 3 weeks ago
4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At PwC, our people in software and product innovation focus on developing cutting-edge software solutions and driving product innovation to meet the evolving needs of clients. These individuals combine technical experience with creative thinking to deliver innovative software products and solutions. Those in software engineering at PwC will focus on developing innovative software solutions to drive digital transformation and enhance business performance. In this field, you will use your knowledge to design, code, and test cutting-edge applications that revolutionise industries and deliver exceptional user experiences. Focused on relationships, you are building meaningful client connections, and learning how to manage and inspire others. Navigating increasingly complex situations, you are growing your personal brand, deepening technical expertise and awareness of your strengths. You are expected to anticipate the needs of your teams and clients, and to deliver quality. Embracing increased ambiguity, you are comfortable when the path forward isn’t clear, you ask questions, and you use these moments as opportunities to grow. Skills Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to: Respond effectively to the diverse perspectives, needs, and feelings of others. Use a broad range of tools, methodologies and techniques to generate new ideas and solve problems. Use critical thinking to break down complex concepts. Understand the broader objectives of your project or role and how your work fits into the overall strategy. Develop a deeper understanding of the business context and how it is changing. Use reflection to develop self awareness, enhance strengths and address development areas. Interpret data to inform insights and recommendations. Uphold and reinforce professional and technical standards (e.g. refer to specific PwC tax and audit guidance), the Firm's code of conduct, and independence requirements. The Opportunity When you join PwC Acceleration Centers (ACs), you step into a pivotal role focused on actively supporting various Acceleration Center services, from Advisory to Assurance, Tax and Business Services. In our innovative hubs, you’ll engage in challenging projects and provide distinctive services to support client engagements through enhanced quality and innovation. You’ll also participate in dynamic and digitally enabled training that is designed to grow your technical and professional skills. As part of the Software and Product Innovation team you lead the implementation of user stories and solve business problems. As a Senior Associate, you guide and mentor junior team members while maintaining professional and technical standards to deliver quality client solutions. 
Responsibilities Design and implement Java Microservices architecture Collaborate with teams to define project scope and objectives Conduct code reviews to maintain quality standards Mentor junior developers in microservices practices Troubleshoot and resolve application issues promptly Stay updated on microservices trends and technologies Contribute to the software development lifecycle Document technical specifications and workflows What You Must Have Bachelor's Degree 4 years of experience in software engineering Oral and written proficiency in English required What Sets You Apart Proven experience in Java, Spring Boot, and Microservices Familiarity with RESTful APIs and JMS Understanding of financial applications, especially payment/wires Hands-on experience with DevOps practices and tools Demonstrating exceptional communication and interpersonal skills Knowledge of containerized deployments preferred Experience with microservices architecture and related technologies Ability to present technical solutions to executive stakeholders Understanding of message parsing in banking messages preferred
Posted 3 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Reference # 316918BR
Job Type: Full Time

Your role
Are you an analytic thinker? Do you enjoy creating valuable insights with data? Do you want to play a key role in transforming our firm into an agile organization? At UBS, we re-imagine the way we work, the way we connect with each other – our colleagues, clients and partners – and the way we deliver value. Being agile will make us more responsive, more adaptable, and ultimately more innovative.
We’re looking for a Data Engineer to: transform data into valuable insights that inform business decisions, making use of our internal data platforms and applying appropriate analytical techniques; design, model, develop, and improve data pipelines and data products; engineer reliable data pipelines for sourcing, processing, distributing, and storing data in different ways, using data platform infrastructure effectively; develop, train, and apply machine-learning models to make better predictions, automate manual processes, and solve challenging business problems; ensure the quality, security, reliability, and compliance of our solutions by applying our digital principles and implementing both functional and non-functional requirements; build observability into our solutions, monitor production health, help to resolve incidents, and remediate the root cause of risks and issues; understand, represent, and advocate for client needs.

Your team
In our agile operating model, crews are aligned to larger products and services fulfilling client needs and encompass multiple autonomous pods. You’ll be working in the Developer Workspaces Team, focusing on providing compute, development environments and tooling to developers and business users.

Your expertise
Comprehensive understanding and ability to apply data engineering techniques, from event streaming and real-time analytics to computational grids and graph processing engines. Curious to learn new technologies and practices, reuse strategic platforms and standards, evaluate options, and make decisions with long-term sustainability in mind. Strong command of at least one language among Python, Java, Golang. Understanding of data management and database technologies including SQL/NoSQL. Understanding of data products, data structures and data manipulation techniques including classification, parsing, pattern matching. Experience with Databricks, ADLS, Delta Lake/Tables, ETL tools would be an asset. Good understanding of engineering practices and software development lifecycle. Enthusiastic, self-motivated and client-focused.

About Us
UBS is the world’s largest and the only truly global wealth manager. We operate through four business divisions: Global Wealth Management, Personal & Corporate Banking, Asset Management and the Investment Bank. Our global reach and the breadth of our expertise set us apart from our competitors. We have a presence in all major financial centers in more than 50 countries.

How We Hire
We may request you to complete one or more assessments during the application process.

Join us
At UBS, we know that it's our people, with their diverse skills, experiences and backgrounds, who drive our ongoing success. We’re dedicated to our craft and passionate about putting our people first, with new challenges, a supportive team, opportunities to grow and flexible working options when possible. Our inclusive culture brings out the best in our employees, wherever they are on their career journey. We also recognize that great work is never done alone. 
That’s why collaboration is at the heart of everything we do. Because together, we’re more than ourselves. We’re committed to disability inclusion and if you need reasonable accommodation/adjustments throughout our recruitment process, you can always contact us. Disclaimer / Policy Statements UBS is an Equal Opportunity Employer. We respect and seek to empower each individual and support the diverse cultures, perspectives, skills and experiences within our workforce.
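As a small illustration of the "classification, parsing, pattern matching" data-manipulation techniques listed in this posting, the following pandas sketch parses delimited records, pattern-matches instruments with a regex, and applies a rule-based classification; the records and rules are invented for illustration.

```python
# Parse / pattern-match / classify sketch with pandas; trade records are made up.
import pandas as pd

records = pd.DataFrame({
    "raw": ["EQ|AAPL|BUY|100", "FX|EURUSD|SELL|250000", "EQ|MSFT|SELL|50"],
})

# Parse the delimited payload into columns.
parsed = records["raw"].str.split("|", expand=True)
parsed.columns = ["asset_class", "instrument", "side", "quantity"]
parsed["quantity"] = parsed["quantity"].astype(int)

# Pattern matching: flag six-letter FX currency pairs with a regex.
parsed["is_fx_pair"] = parsed["instrument"].str.match(r"^[A-Z]{6}$")

# Simple rule-based classification of ticket size.
parsed["size_bucket"] = pd.cut(parsed["quantity"], [0, 1000, 10**9], labels=["small", "large"])
print(parsed)
```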
Posted 3 weeks ago
5.0 years
0 Lacs
Delhi, India
On-site
Job Description: SDET (Software Development Engineer in Test)
Notice Period Requirement: Immediate to 2 months (official)
Job Locations: Gurgaon/Delhi
Experience: 5 to 8 Years
Skills: SDET, Automation, Java programming, Selenium, Cucumber, Rest Assured, API coding (all mandatory)
Job Type: Full-Time

Job Description
We are seeking an experienced and highly skilled SDET (Software Development Engineer in Test) to join our Quality Engineering team. The ideal candidate will possess a strong background in test automation with API, mobile, or web testing, with hands-on experience in creating robust automation frameworks and scripts. This role demands a thorough understanding of quality engineering practices, microservices architecture, and software testing tools.

Key Responsibilities:
- Design and develop scalable and modular automation frameworks using best industry practices such as the Page Object Model.
- Automate testing for distributed, highly scalable systems.
- Create and execute test scripts for GUI-based, API, and mobile applications.
- Perform end-to-end testing for APIs, ensuring thorough validation of request and response schemas, status codes, and exception handling.
- Conduct API testing using tools like Rest Assured, SOAP UI, NodeJS, and Postman, and validate data with serialization techniques (e.g., POJO classes).
- Implement and maintain BDD/TDD frameworks using tools like Cucumber, TestNG, or JUnit.
- Write and optimize SQL queries for data validation and backend testing.
- Integrate test suites into test management systems and CI/CD pipelines using tools like Maven, Gradle, and Git.
- Mentor team members and quickly adapt to new technologies and tools.
- Select and implement appropriate test automation tools and strategies based on project needs.
- Apply design patterns, modularization, and user libraries for efficient framework creation.
- Collaborate with cross-functional teams to ensure the quality and scalability of microservices and APIs.

Must-Have Skills:
- Proficiency in designing and developing automation frameworks from scratch.
- Strong programming skills in Java, Groovy, or JavaScript with a solid understanding of OOP concepts.
- Hands-on experience with at least one GUI automation tool (desktop/mobile); experience with multiple tools is an advantage.
- In-depth knowledge of API testing and microservices architecture.
- Experience with BDD and TDD methodologies and associated tools.
- Familiarity with SOAP and REST principles.
- Expertise in parsing and validating complex JSON and XML responses.
- Ability to create and manage test pipelines in CI/CD environments.

Nice-to-Have Skills:
- Experience with multiple test automation tools for GUI or mobile platforms.
- Knowledge of advanced serialization techniques and custom test harness implementation.
- Exposure to various test management tools and automation strategies.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 5+ years in software quality engineering and test automation.
- Strong analytical and problem-solving skills with attention to detail.
Posted 3 weeks ago
2.0 - 3.0 years
7 - 8 Lacs
Hyderābād
On-site
Position Title: Data Engineer
Location: Hyderabad
Grade: L3-1
Hiring Manager: Sabya DG

About the Job
At Sanofi, we’re committed to providing the next-gen healthcare that patients and customers need. It’s about harnessing data insights and leveraging AI responsibly to search deeper and solve sooner than ever before. Join our R&D Data & AI Products and Platforms Team as a Data Engineer and you can help make it happen.

What you will be doing:
Sanofi has recently embarked on a vast and ambitious digital transformation program. A cornerstone of this roadmap is the acceleration of its data transformation and of the adoption of artificial intelligence (AI) and machine learning (ML) solutions, to accelerate R&D, manufacturing and commercial performance and bring better drugs and vaccines to patients faster, to improve health and save lives.
The R&D Data & AI Products and Platforms Team is a key team within R&D Digital, focused on developing and delivering Data and AI products for R&D use cases. This team plays a critical role in pursuing broader democratization of data across R&D and providing the foundation to scale AI/ML, advanced analytics, and operational analytics capabilities.
As a Data Engineer, you will join this dynamic team committed to driving strategic and operational digital priorities and initiatives in R&D. You will work as part of a Data & AI Product Delivery Pod, led by a Product Owner, in an agile environment to deliver Data & AI Products. As a part of this team, you will be responsible for the design and development of data pipelines and workflows to ingest, curate, process, and store large volumes of complex structured and unstructured data. You will have the ability to work on multiple data products serving multiple areas of the business.

Our vision for digital, data analytics and AI
Join us on our journey in enabling Sanofi’s Digital Transformation through becoming an AI-first organization. This means:
AI Factory - Versatile Teams Operating in Cross Functional Pods: Utilizing digital and data resources to develop AI products, bringing data management, AI and product development skills to products, programs and projects to create an agile, fulfilling and meaningful work environment.
Leading Edge Tech Stack: Experience building products that will be deployed globally on a leading-edge tech stack.
World Class Mentorship and Training: Working with renowned leaders and academics in machine learning to further develop your skillsets.
We are an innovative global healthcare company with one purpose: to chase the miracles of science to improve people’s lives. We’re also a company where you can flourish and grow your career, with countless opportunities to explore, make connections with people, and stretch the limits of what you thought was possible. Ready to get started? 
Main Responsibilities: Data Product Engineering: Provide input into the engineering feasibility of developing specific R&D Data/AI Products Provide input to Data/AI Product Owner and Scrum Master to support with planning, capacity, and resource estimates Design, build, and maintain scalable and reusable ETL / ELT pipelines to ingest, transform, clean, and load data from sources into central platforms / repositories Structure and provision data to support modeling and data discovery, including filtering, tagging, joining, parsing and normalizing data Collaborate with Data/AI Product Owner and Scrum Master to share Progress on engineering activities and inform of any delays, issues, bugs, or risks with proposed remediation plans Design, develop, and deploy APIs, data feeds, or specific features required by product design and user stories Optimize data workflows to drive high performance and reliability of implemented data products Oversee and support junior engineer with Data/AI Product testing requirements and execution Innovation & Team Collaboration: Stay current on industry trends, emerging technologies, and best practices in data product engineering Contribute to a team culture of innovation, collaboration, and continuous learning within the product team About You: Key Functional Requirements & Qualifications: Bachelor’s degree in software engineering or related field, or equivalent work experience 2-3 years of experience in data engineering, software engineering, or other related fields Experience working in life science/pharmaceutical industry and understanding of R&D business preferred Excellent communication and collaboration skills Working knowledge and comfort working with Agile methodologies Key Technical Requirements & Qualifications: Experience in cloud-based data platforms and analytics engineering stack (AWS, Snowflake and DBT) Experience with job scheduling and orchestration (Airflow is a plus) Working knowledge of scripting languages (Python, Shell scripting) Good knowledge of SQL and relational databases technologies/concepts Understanding of data structures and algorithms Experience working with data models and query tuning Why Choose Us? Bring the miracles of science to life alongside a supportive, future-focused team Discover endless opportunities to grow your talent and drive your career, whether it’s through a promotion or lateral move, at home or internationally Enjoy a thoughtful, well-crafted rewards package that recognizes your contribution and amplifies your impact Take good care of yourself and your family, with a wide range of health and wellbeing benefits including high-quality healthcare, prevention and wellness programs Applications received after the official close date will be reviewed on an individual basis.
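The ingest-transform-load pipelines and orchestration described above might be wired together roughly as in this hedged Airflow 2.x sketch; the dag_id, schedule, and task bodies are placeholder assumptions, not an actual Sanofi pipeline.

```python
# Hedged sketch of a daily ingest -> transform -> load DAG in Airflow 2.x.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest(**_):
    print("pull raw files from source systems into the landing zone")

def transform(**_):
    print("clean, tag, and normalize records (e.g. with pandas or Snowflake SQL via DBT)")

def load(**_):
    print("load curated tables for modeling and data discovery")

with DAG(
    dag_id="rnd_data_product_pipeline",      # placeholder name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="ingest", python_callable=ingest)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3
```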
Posted 3 weeks ago
3.0 years
2 - 7 Lacs
Chennai
On-site
An Amazing Career Opportunity for an AI/ML Engineer
Location: Chennai, India (Hybrid)
Job ID: 39582
Position Summary
A rewarding career at HID Global beckons you! We are looking for an AI/ML Engineer who is responsible for designing, developing, and deploying advanced AI/ML solutions to solve complex business challenges. This role requires expertise in machine learning, deep learning, MLOps, and AI model optimization, with a focus on building scalable, high-performance AI systems. As an AI/ML Engineer, you will work closely with data engineers, software developers, and business stakeholders to integrate AI-driven insights into real-world applications. You will be responsible for model development, system architecture, cloud deployment, and ensuring responsible AI adoption. We are a leading company and the trusted source for innovative products, solutions and services that help millions of customers around the globe create, manage and use secure identities.
Who are we?
HID powers the trusted identities of the world’s people, places, and things, allowing people to transact safely, work productively and travel freely. We are a high-tech software company headquartered in Austin, TX, with over 4,000 worldwide employees. Check us out: www.hidglobal.com and https://youtu.be/23km5H4K9Eo LinkedIn: www.linkedin.com/company/hidglobal/mycompany/
About HID Global, Chennai
HID Global powers the trusted identities of the world’s people, places and things. We make it possible for people to transact safely, work productively and travel freely. Our trusted identity solutions give people secure and convenient access to physical and digital places and connect things that can be accurately identified, verified and tracked digitally. Millions of people around the world use HID products and services to navigate their everyday lives, and over 2 billion things are connected through HID technology. We work with governments, educational institutions, hospitals, financial institutions, industrial businesses and some of the most innovative companies on the planet. Headquartered in Austin, Texas, HID Global has over 3,000 employees worldwide and operates international offices that support more than 100 countries. HID Global® is an ASSA ABLOY Group brand. For more information, visit www.hidglobal.com. HID Global is the trusted source for secure identity solutions for millions of customers and users around the world. In India, we have two Engineering Centres (Bangalore and Chennai) with over 200 engineering staff. The Global Engineering Team is based in Chennai, and one of the Business Unit Engineering teams is based in Bangalore.
Physical Access Control Solutions (PACS)
HID's Physical Access Control Solutions Business Area: HID PAC’s Business Unit focuses on the growth of new and existing clients, where we leverage the latest card and reader technologies to solve the security challenges of our clients. Other areas of focus will include authentication, card subsystems, card encoding, biometrics, location services and all other aspects of a physical access control infrastructure.
Qualifications:
To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
Roles & Responsibilities:
Design, develop, and deploy robust and scalable AI/ML models in production environments. Collaborate with business stakeholders to identify AI/ML opportunities and define measurable success metrics. Design and build Retrieval-Augmented Generation (RAG) pipelines integrating vector stores, semantic search, and document parsing for domain-specific knowledge retrieval (a rough retrieval sketch follows this listing). Integrate Multimodal Conversational AI platforms (MCP) including voice, vision, and text to deliver rich user interactions. Drive innovation through PoCs, benchmarking, and experiments with emerging models and architectures. Optimize models for performance, latency and scalability. Build data pipelines and workflows to support model training and evaluation. Conduct research and experimentation on state-of-the-art techniques (DL, NLP, time series, CV). Partner with MLOps and DevOps teams to implement best practices in model monitoring, versioning and re-training. Lead code reviews and architecture discussions, and mentor junior and peer engineers. Architect and implement end-to-end AI/ML pipelines, ensuring scalability and efficiency. Deploy models in cloud-based (AWS, Azure, GCP) or on-premises environments using tools like Docker, Kubernetes, TensorFlow Serving, or ONNX. Ensure data integrity, quality, and preprocessing best practices for AI/ML model development. Ensure compliance with AI ethics guidelines, data privacy laws (GDPR, CCPA), and corporate AI governance. Work closely with data engineers, software developers, and domain experts to integrate AI into existing systems. Conduct AI/ML training sessions for internal teams to improve AI literacy within the organization. Bring a strong analytical and problem-solving mindset.
Technical Requirements:
Strong expertise in AI/ML engineering and software development. Strong experience with RAG architecture and vector databases. Proficiency in Python and hands-on experience with ML frameworks (TensorFlow, PyTorch, scikit-learn, XGBoost, etc.). Familiarity with MCPs like Google Dialogflow, Rasa, Amazon Lex, or custom-built agents using LLM orchestration. Cloud-based AI/ML experience (AWS SageMaker, Azure ML, GCP Vertex AI, etc.). Solid understanding of the AI/ML life cycle – data preprocessing, feature engineering, model selection, training, validation and deployment. Experience with production-grade ML systems (model serving, APIs, pipelines). Familiarity with data engineering tools (Spark, Kafka, Airflow, etc.). Strong knowledge of statistical modeling, NLP, CV, recommendation systems, anomaly detection and time series forecasting. Hands-on software engineering with knowledge of version control, testing and CI/CD. Hands-on experience in deploying ML models in production using Docker, Kubernetes, TensorFlow Serving, ONNX, and MLflow. Experience in MLOps and CI/CD for ML pipelines, including monitoring, retraining, and model drift detection. Proficiency in scaling AI solutions in cloud environments (AWS, Azure and GCP). Experience in data preprocessing, feature engineering, and dimensionality reduction. Exposure to data privacy, compliance and secure ML practices.
Education and/or Experience:
Graduation or a master’s degree in computer science, information technology, or AI/ML/data science. 3+ years of hands-on experience in AI/ML development, deployment and optimization. Experience in leading AI/ML teams and mentoring junior engineers.
Why apply?
Empowerment: You’ll work as part of a global team in a flexible work environment, learning and enhancing your expertise.
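As a rough, hypothetical sketch of the retrieval half of the RAG pipeline mentioned above (not HID's implementation): embed a small corpus, embed the query, and return the most similar chunks to pass to an LLM as context. The model name, corpus, and in-memory similarity matrix are placeholders; a production system would typically use a vector database such as FAISS or pgvector.

```python
# Minimal retrieval step for a RAG pipeline: embed documents, embed a query, and
# return the top-k most similar chunks. Corpus and model name are illustrative only.
import numpy as np
from sentence_transformers import SentenceTransformer

corpus = [
    "Readers can use OSDP for secure reader-to-controller communication.",
    "Access badges can be provisioned to mobile wallets.",
    "Anomaly detection flags unusual badge-in patterns.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")
doc_vecs = model.encode(corpus, normalize_embeddings=True)


def retrieve(query: str, k: int = 2) -> list[str]:
    q = model.encode([query], normalize_embeddings=True)[0]
    scores = doc_vecs @ q  # cosine similarity, since vectors are normalized
    top = np.argsort(scores)[::-1][:k]
    return [corpus[i] for i in top]


print(retrieve("How do readers talk to controllers securely?"))
```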
We welcome an opportunity to meet you and learn about your unique talents, skills, and experiences. You don’t need to check all the boxes. If you have most of the skills and experience, we want you to apply.
Innovation: You embrace challenges and want to drive change. We are open to ideas, including flexible work arrangements, job sharing or part-time job seekers.
Integrity: You are results-oriented, reliable, and straightforward and value being treated accordingly. We want all our employees to be themselves, to feel appreciated and accepted.
This opportunity may be open to flexible working arrangements. HID is an Equal Opportunity/Affirmative Action Employer – Minority/Female/Disability/Veteran/Gender Identity/Sexual Orientation.
We make it easier for people to get where they want to go! On an average day, think of how many times you tap, twist, tag, push or swipe to get access, find information, connect with others or track something. HID technology is behind billions of interactions, in more than 100 countries. We help you create a verified, trusted identity that can get you where you need to go – without having to think about it.
When you join our HID team, you’ll also be part of the ASSA ABLOY Group, the global leader in access solutions. You’ll have 63,000 colleagues in more than 70 different countries. We empower our people to build their career around their aspirations and our ambitions – supporting them with regular feedback, training, and development opportunities. Our colleagues think broadly about where they can make the most impact, and we encourage them to grow their role locally, regionally, or even internationally. As we welcome new people on board, it’s important to us to have diverse, inclusive teams, and we value different perspectives and experiences. #LI-HIDGlobal
Posted 3 weeks ago
0 years
4 - 5 Lacs
Chennai
On-site
Has excellent knowledge of Node JS. Has worked with Express JS. Knowledge of ORMs like Drizzle, TypeORM or Prisma. Has knowledge of MySQL or PostgreSQL. Has worked with REST APIs extensively. Has knowledge of XML parsing and construction. SAP exposure is good to have.
About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects and opportunities, and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
Posted 3 weeks ago
3.0 years
0 Lacs
India
Remote
GitLab is an open core software company that develops the most comprehensive AI-powered DevSecOps Platform, used by more than 100,000 organizations. Our mission is to enable everyone to contribute to and co-create the software that powers our world. When everyone can contribute, consumers become contributors, significantly accelerating the rate of human progress. This mission is integral to our culture, influencing how we hire, build products, and lead our industry. We make this possible at GitLab by running our operations on our product and staying aligned with our values. Learn more about Life at GitLab. Thanks to products like Duo Enterprise and Duo Workflow, customers get the benefit of AI at every stage of the SDLC. The same principles built into our products are reflected in how our team works: we embrace AI as a core productivity multiplier. All team members are encouraged and expected to incorporate AI into their daily workflows to drive efficiency, innovation, and impact across our global organization.
An Overview Of This Role
As a member of the Secret Detection team, you'll be at the forefront of protecting sensitive data by creating specialized tools that prevent, detect, and remediate leaked secrets in code. Our team focuses on the complete secret management lifecycle - from push protection to pipeline-based scanning, providing automated remediation workflows and audit trails when necessary. We’re passionate about embedding security into the development process seamlessly, allowing developers to focus on innovation while we handle security concerns proactively. You'll help developers safeguard their credentials, API keys, and other sensitive information by building sophisticated detection patterns, reducing false positives, and creating seamless remediation paths when secrets are discovered. Your work will enable organizations to quickly identify exposed secrets, understand their impact, and efficiently revoke and rotate compromised credentials. Your impact will be significant and far-reaching, as our solutions protect both GitLab's ecosystem and the sensitive data of thousands of organizations worldwide, preventing costly data breaches before they happen.
Some Examples Of Our Projects
Prevent secret leaks in source code with GitLab Secret Push Protection. Verify validity of secret detection findings.
What You’ll Do
Lead the design and implementation of fullstack features for our Secret Detection offering, contributing to both the frontend (Vue.js) and backend (Ruby on Rails, GraphQL). Write clean, well-tested code that meets our internal standards for style, maintainability, and best practices for a high-scale web environment. Mentor and support fellow engineers, especially those looking to grow into fullstack contributors. Collaborate with Product Management and other stakeholders within Engineering (Frontend, UX, etc.) to maintain a high bar for quality in a fast-paced, iterative environment. Apply experience with performance and optimization problems and a demonstrated ability to both diagnose and prevent these problems. Contribute to code reviews, RFCs, and Proof-of-Concepts that shape the technical direction of the product. Recognize impediments to our efficiency as a team ("technical debt"), and propose and implement solutions. Work async-first with a globally distributed team, while also participating in necessary sync meetings like high-level planning, engineering brainstorming sessions and pairing sessions.
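To make the "detection patterns with low false positives" idea above concrete, here is a toy illustration written in Python rather than the team's Ruby/Go stack; the two rules and the allowlist entry are example patterns for the sketch, not GitLab's actual ruleset.

```python
# Toy secret scanner: regex rules for a couple of common credential formats plus a
# small allowlist so documented example values do not raise false positives.
import re

SECRET_PATTERNS = {
    "AWS access key ID": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "Private key header": re.compile(r"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----"),
}

ALLOWLIST = {"AKIAIOSFODNN7EXAMPLE"}  # AWS's documented example key should not alert


def scan(text: str) -> list[tuple[int, str, str]]:
    """Return (line_number, rule_name, match) for every suspected secret."""
    findings = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        for name, pattern in SECRET_PATTERNS.items():
            for match in pattern.findall(line):
                if match not in ALLOWLIST:
                    findings.append((lineno, name, match))
    return findings


sample = 'config = {"aws_key": "AKIAABCDEFGHIJKLMNOP"}\n'
print(scan(sample))  # -> [(1, "AWS access key ID", "AKIAABCDEFGHIJKLMNOP")]
```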
What You’ll Bring
3+ years of professional experience with Vue.js, GraphQL, and Ruby on Rails. Proven ability to mentor engineers, lead technical initiatives, and drive frontend and fullstack best practices. Knowledge of security concepts, vulnerabilities, mitigation techniques, and secure coding practices is preferred. Background in developing or using security tools or products. Hands-on experience with reverse engineering tools such as Ghidra, Binary Ninja, or diffoscope for analyzing, unpacking, and extracting data from compiled binaries and executable files. Experience with the Go programming language or strong motivation to learn. Ability to work across the stack to deliver end-to-end solutions. A strong product mindset and ability to collaborate closely with cross-functional teams including Product, Design and Technical Writing. Demonstrated ability to work closely with other parts of the organization. Excellent written and verbal communication skills, especially in async-first, remote environments. A proactive, self-managing approach to work with a bias for action and ownership.
About The Team
GitLab’s Secret Detection team is responsible for the Secret Detection feature category. We want to help developers write better code and worry less about common security mistakes. We do this by helping developers easily identify common security issues as code is being contributed, and mitigate these issues proactively. We work closely with the larger GitLab security product suite while maintaining our specialized focus on the unique challenges of secret detection. Our technical stack spans Rails and Go backends, Vue.js frontends, and custom parsing engines that enable efficient and accurate secret identification. We're committed to making sophisticated security tooling accessible to developers of all skill levels. We'd like to continue to expand our capabilities across these workflows, while also continuously improving the result quality across all types of findings our security tools are responsible for detecting. We balance security best practices with practical developer experience to ensure protection doesn't come at the cost of productivity. Thanks to our Transparency value, you can learn more about us on our Team page.
How GitLab Will Support You
Benefits to support your health, finances, and well-being. All-remote, asynchronous work environment. Flexible Paid Time Off. Team Member Resource Groups. Equity Compensation & Employee Stock Purchase Plan. Growth and Development Fund. Parental leave. Home office support.
Please note that we welcome interest from candidates with varying levels of experience; many successful candidates do not meet every single requirement. Additionally, studies have shown that people from underrepresented groups are less likely to apply to a job unless they meet every single qualification. If you're excited about this role, please apply and allow our recruiters to assess your application.
Remote-Global
The base salary range for this role’s listed level is currently for residents of listed locations only. Grade level and salary ranges are determined through interviews and a review of education, experience, knowledge, skills, abilities of the applicant, equity with other team members, and alignment with market data. See more information on our benefits and equity. Sales roles are also eligible for incentive pay targeted at up to 100% of the offered base salary.
California/Colorado/Hawaii/New Jersey/New York/Washington/DC/Illinois/Minnesota pay range $117,600—$252,000 USD Country Hiring Guidelines: GitLab hires new team members in countries around the world. All of our roles are remote, however some roles may carry specific location-based eligibility requirements. Our Talent Acquisition team can help answer any questions about location after starting the recruiting process. Privacy Policy: Please review our Recruitment Privacy Policy. Your privacy is important to us. GitLab is proud to be an equal opportunity workplace and is an affirmative action employer. GitLab’s policies and practices relating to recruitment, employment, career development and advancement, promotion, and retirement are based solely on merit, regardless of race, color, religion, ancestry, sex (including pregnancy, lactation, sexual orientation, gender identity, or gender expression), national origin, age, citizenship, marital status, mental or physical disability, genetic information (including family medical history), discharge status from the military, protected veteran status (which includes disabled veterans, recently separated veterans, active duty wartime or campaign badge veterans, and Armed Forces service medal veterans), or any other basis protected by law. GitLab will not tolerate discrimination or harassment based on any of these characteristics. See also GitLab’s EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know during the recruiting process.
Posted 3 weeks ago
2.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Job Title: Python Developer – Web Scraping & Automation
Company: Actowiz Solutions
Location: Ahmedabad
Job Type: Full-time
About Us
Actowiz Solutions is a leading provider of data extraction, web scraping, and automation solutions. We empower businesses with actionable insights by delivering clean, structured, and scalable data through cutting-edge technology.
Role Overview
We are looking for a highly skilled Python Developer with expertise in web scraping, automation tools, and related frameworks.
Key Responsibilities
Design, develop, and maintain scalable web scraping scripts and frameworks. Lead a team of Python developers in project planning, task allocation, and code reviews. Work with tools and libraries such as Scrapy, BeautifulSoup, Selenium, Playwright, Requests, etc. Implement robust error handling, data parsing, and storage mechanisms (JSON, CSV, databases, etc.). Optimize scraping performance and ensure compliance with legal and ethical scraping practices. Research new tools and techniques to improve scraping efficiency and scalability.
Requirements
2+ years of experience in Python development with strong expertise in web scraping. Proficiency in scraping frameworks like Scrapy, Playwright, or Selenium. Deep understanding of HTTP, proxies, user agents, browser automation, and anti-bot measures. Experience with REST APIs, asynchronous programming, and multithreading. Familiarity with databases (SQL/NoSQL) and cloud-based data pipelines.
Preferred Qualifications
Knowledge of DevOps tools (Docker, CI/CD) is a plus. Experience with big data platforms or ETL pipelines is advantageous.
Contact us
Mobile: 841366964
Email: komal.actowiz@gmail.com
Website: https://www.actowizsolutions.com/career.php
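As a small, hedged sketch of the fetch-parse-store loop this role describes, using two of the libraries named above (Requests and BeautifulSoup) against a public scraping sandbox; the URL and CSS selectors are specific to that practice site and would differ for real targets.

```python
# Fetch a page, parse the quotes it lists, and store them as JSON.
import json
import requests
from bs4 import BeautifulSoup

URL = "https://quotes.toscrape.com/"  # public scraping practice site
HEADERS = {"User-Agent": "Mozilla/5.0 (compatible; demo-scraper/0.1)"}

resp = requests.get(URL, headers=HEADERS, timeout=10)
resp.raise_for_status()

soup = BeautifulSoup(resp.text, "html.parser")
records = [
    {
        "text": q.select_one("span.text").get_text(strip=True),
        "author": q.select_one("small.author").get_text(strip=True),
        "tags": [t.get_text(strip=True) for t in q.select("a.tag")],
    }
    for q in soup.select("div.quote")
]

with open("quotes.json", "w", encoding="utf-8") as f:
    json.dump(records, f, ensure_ascii=False, indent=2)

print(f"Saved {len(records)} records")
```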
Posted 3 weeks ago
3.0 - 8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About Client:
Our client is a multinational IT services and consulting company headquartered in the USA, with revenues of 19.7 billion USD, a global workforce of 3,50,000, and listed on NASDAQ. It is one of the leading IT services firms globally, known for its work in digital transformation, technology consulting, and business process outsourcing. Its business focus spans Digital Engineering, Cloud Services, AI and Data Analytics, Enterprise Applications (SAP, Oracle, Salesforce), IT Infrastructure, and Business Process Outsourcing. Major delivery centers in India include cities like Chennai, Pune, Hyderabad, and Bengaluru, with offices in over 35 countries. India is a major operational hub.
Job Title: Tosca Automation
Location: Hyderabad
Experience: 3 to 8 years
Job Type: Contract
Notice Period: Immediate joiners
Key skills:
In-depth knowledge of Tosca's functionalities, architecture, and how it integrates with other tools. Ability to analyze manual tasks/processes and identify opportunities for automation. Familiarity with scripting languages such as Python, JavaScript, or VBScript (depending on the automation framework used). Custom scripting within Tosca workflows. Designing and building automated workflows within Tosca. Understanding of triggers, conditions, and actions. Using REST or SOAP APIs to integrate Tosca with other systems (e.g. CRMs, ERPs, databases). Using tools like Excel, SQL, or XML/JSON for data parsing and transformation. Documenting automation logic, workflows, and changes clearly for team collaboration and future maintenance. Troubleshooting automation failures and optimizing performance.
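A brief, hypothetical Python sketch of the REST-integration and JSON-to-tabular parsing step described in the key skills above; the endpoint URL, query parameter, and field names are placeholders, not a real customer system.

```python
# Pull records from a (hypothetical) REST endpoint and flatten the JSON into CSV rows.
import csv
import requests

API_URL = "https://example.com/api/orders"  # placeholder endpoint

resp = requests.get(API_URL, params={"status": "open"}, timeout=10)
resp.raise_for_status()
# Expected shape: [{"id": ..., "customer": {"name": ...}, "total": ...}, ...]
orders = resp.json()

with open("orders.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["id", "customer_name", "total"])
    writer.writeheader()
    for o in orders:
        writer.writerow(
            {
                "id": o.get("id"),
                "customer_name": (o.get("customer") or {}).get("name"),
                "total": o.get("total"),
            }
        )
```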
Posted 3 weeks ago
0 years
0 Lacs
India
Remote
Hiring: Part-Time Developer – Web Scraping Expertise | Remote | Immediate Joiner
Company: PearlThoughts
Role: Part-Time Developer – Web Scraping
Location: Remote
Engagement: Project-Based
Start: Immediate
PearlThoughts is seeking a skilled Part-Time Developer with strong expertise in web scraping to join our team on a project-based engagement. This is a remote position ideal for individuals who are available to start immediately and can contribute a few hours daily based on project needs.
Key Responsibilities
Develop and maintain web scraping scripts and automation workflows. Extract and structure data from dynamic websites using modern scraping techniques. Monitor and update scrapers as websites evolve. Ensure scraped data is clean, accurate, and usable.
Required Skills
Proficiency in Python with experience in libraries such as BeautifulSoup, Selenium, Scrapy. Understanding of anti-scraping mechanisms and the ability to bypass them ethically. Experience with handling APIs, data parsing, and storage formats (JSON, CSV). Strong attention to detail, problem-solving skills, and code optimization. Ability to deliver results independently and meet timelines.
Additional Information
Work Mode: 100% Remote
Compensation: Project-Based (to be discussed during the interview)
Availability: Immediate Joiner Preferred
We are looking for someone who is dependable, efficient, and passionate about working with real-world data extraction challenges. If this sounds like you, we’d love to connect. Let’s build something valuable together.
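Because this role emphasizes dynamic websites, here is a short, assumption-laden Selenium sketch (headless Chrome against a public JavaScript-rendered demo page) showing how content that only appears after client-side rendering can be waited for and extracted; a local Chrome/chromedriver setup is assumed, and the selectors are specific to that demo site.

```python
# Scrape a JS-rendered page: wait for the quotes to appear, then read them out.
import json
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

options = webdriver.ChromeOptions()
options.add_argument("--headless=new")
driver = webdriver.Chrome(options=options)

data = []
try:
    driver.get("https://quotes.toscrape.com/js/")
    WebDriverWait(driver, 10).until(
        EC.presence_of_all_elements_located((By.CSS_SELECTOR, "div.quote"))
    )
    data = [
        {
            "text": q.find_element(By.CSS_SELECTOR, "span.text").text,
            "author": q.find_element(By.CSS_SELECTOR, "small.author").text,
        }
        for q in driver.find_elements(By.CSS_SELECTOR, "div.quote")
    ]
finally:
    driver.quit()

print(json.dumps(data, ensure_ascii=False, indent=2))
```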
Posted 3 weeks ago
2.0 - 3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Position Title: Data Engineer
Location: Hyderabad
Grade: L3-1
Hiring Manager: Sabya DG
About The Job
At Sanofi, we’re committed to providing the next-gen healthcare that patients and customers need. It’s about harnessing data insights and leveraging AI responsibly to search deeper and solve sooner than ever before. Join our R&D Data & AI Products and Platforms Team as a Data Engineer and you can help make it happen.
What You Will Be Doing
Sanofi has recently embarked on a vast and ambitious digital transformation program. A cornerstone of this roadmap is the acceleration of its data transformation and of the adoption of artificial intelligence (AI) and machine learning (ML) solutions, to accelerate R&D, manufacturing and commercial performance and bring better drugs and vaccines to patients faster, to improve health and save lives.
The R&D Data & AI Products and Platforms Team is a key team within R&D Digital, focused on developing and delivering Data and AI products for R&D use cases. This team plays a critical role in pursuing broader democratization of data across R&D and providing the foundation to scale AI/ML, advanced analytics, and operational analytics capabilities.
As a Data Engineer, you will join this dynamic team committed to driving strategic and operational digital priorities and initiatives in R&D. You will work as part of a Data & AI Product Delivery Pod, led by a Product Owner, in an agile environment to deliver Data & AI Products. As part of this team, you will be responsible for the design and development of data pipelines and workflows to ingest, curate, process, and store large volumes of complex structured and unstructured data. You will have the ability to work on multiple data products serving multiple areas of the business.
Our vision for digital, data analytics and AI
Join us on our journey in enabling Sanofi’s Digital Transformation by becoming an AI-first organization. This means:
AI Factory - Versatile Teams Operating in Cross Functional Pods: Utilizing digital and data resources to develop AI products, bringing data management, AI and product development skills to products, programs and projects to create an agile, fulfilling and meaningful work environment.
Leading Edge Tech Stack: Experience building products that will be deployed globally on a leading-edge tech stack.
World Class Mentorship and Training: Working with renowned leaders and academics in machine learning to further develop your skillsets.
We are an innovative global healthcare company with one purpose: to chase the miracles of science to improve people’s lives. We’re also a company where you can flourish and grow your career, with countless opportunities to explore, make connections with people, and stretch the limits of what you thought was possible. Ready to get started?
Main Responsibilities
Data Product Engineering:
Provide input into the engineering feasibility of developing specific R&D Data/AI Products. Provide input to the Data/AI Product Owner and Scrum Master to support planning, capacity, and resource estimates. Design, build, and maintain scalable and reusable ETL / ELT pipelines to ingest, transform, clean, and load data from sources into central platforms / repositories. Structure and provision data to support modeling and data discovery, including filtering, tagging, joining, parsing and normalizing data. Collaborate with the Data/AI Product Owner and Scrum Master to share progress on engineering activities and inform of any delays, issues, bugs, or risks with proposed remediation plans. Design, develop, and deploy APIs, data feeds, or specific features required by product design and user stories. Optimize data workflows to drive high performance and reliability of implemented data products. Oversee and support junior engineers with Data/AI Product testing requirements and execution.
Innovation & Team Collaboration
Stay current on industry trends, emerging technologies, and best practices in data product engineering. Contribute to a team culture of innovation, collaboration, and continuous learning within the product team.
About You
Key Functional Requirements & Qualifications:
Bachelor’s degree in software engineering or a related field, or equivalent work experience. 2-3 years of experience in data engineering, software engineering, or other related fields. Experience working in the life science/pharmaceutical industry and an understanding of the R&D business preferred. Excellent communication and collaboration skills. Working knowledge of, and comfort working with, Agile methodologies.
Key Technical Requirements & Qualifications
Experience in cloud-based data platforms and the analytics engineering stack (AWS, Snowflake and DBT). Experience with job scheduling and orchestration (Airflow is a plus). Working knowledge of scripting languages (Python, Shell scripting). Good knowledge of SQL and relational database technologies/concepts. Understanding of data structures and algorithms. Experience working with data models and query tuning.
Why Choose Us?
Bring the miracles of science to life alongside a supportive, future-focused team. Discover endless opportunities to grow your talent and drive your career, whether it’s through a promotion or lateral move, at home or internationally. Enjoy a thoughtful, well-crafted rewards package that recognizes your contribution and amplifies your impact. Take good care of yourself and your family, with a wide range of health and wellbeing benefits including high-quality healthcare, prevention and wellness programs.
Applications received after the official close date will be reviewed on an individual basis.
Pursue Progress. Discover Extraordinary. Join Sanofi and step into a new era of science - where your growth can be just as transformative as the work we do. We invest in you to reach further, think faster, and do what’s never-been-done-before. You’ll help push boundaries, challenge convention, and build smarter solutions that reach the communities we serve. Ready to chase the miracles of science and improve people’s lives? Let’s Pursue Progress and Discover Extraordinary – together.
At Sanofi, we provide equal opportunities to all regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, protected veteran status or other characteristics protected by law.
Posted 3 weeks ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Has excellent knowledge of Node JS. Has worked with Express JS. Knowledge of ORMs like Drizzle, TypeORM or Prisma. Has knowledge of MySQL or PostgreSQL. Has worked with REST APIs extensively. Has knowledge of XML parsing and construction. SAP exposure is good to have.
Posted 3 weeks ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
An Amazing Career Opportunity for an AI/ML Engineer
Location: Chennai, India (Hybrid)
Job ID: 39582
Position Summary
A rewarding career at HID Global beckons you! We are looking for an AI/ML Engineer who is responsible for designing, developing, and deploying advanced AI/ML solutions to solve complex business challenges. This role requires expertise in machine learning, deep learning, MLOps, and AI model optimization, with a focus on building scalable, high-performance AI systems. As an AI/ML Engineer, you will work closely with data engineers, software developers, and business stakeholders to integrate AI-driven insights into real-world applications. You will be responsible for model development, system architecture, cloud deployment, and ensuring responsible AI adoption. We are a leading company and the trusted source for innovative products, solutions and services that help millions of customers around the globe create, manage and use secure identities.
Who are we?
HID powers the trusted identities of the world’s people, places, and things, allowing people to transact safely, work productively and travel freely. We are a high-tech software company headquartered in Austin, TX, with over 4,000 worldwide employees. Check us out: www.hidglobal.com and https://youtu.be/23km5H4K9Eo LinkedIn: www.linkedin.com/company/hidglobal/mycompany/
About HID Global, Chennai
HID Global powers the trusted identities of the world’s people, places and things. We make it possible for people to transact safely, work productively and travel freely. Our trusted identity solutions give people secure and convenient access to physical and digital places and connect things that can be accurately identified, verified and tracked digitally. Millions of people around the world use HID products and services to navigate their everyday lives, and over 2 billion things are connected through HID technology. We work with governments, educational institutions, hospitals, financial institutions, industrial businesses and some of the most innovative companies on the planet. Headquartered in Austin, Texas, HID Global has over 3,000 employees worldwide and operates international offices that support more than 100 countries. HID Global® is an ASSA ABLOY Group brand. For more information, visit www.hidglobal.com. HID Global is the trusted source for secure identity solutions for millions of customers and users around the world. In India, we have two Engineering Centres (Bangalore and Chennai) with over 200 engineering staff. The Global Engineering Team is based in Chennai, and one of the Business Unit Engineering teams is based in Bangalore.
Physical Access Control Solutions (PACS)
HID's Physical Access Control Solutions Business Area: HID PAC’s Business Unit focuses on the growth of new and existing clients, where we leverage the latest card and reader technologies to solve the security challenges of our clients. Other areas of focus will include authentication, card subsystems, card encoding, biometrics, location services and all other aspects of a physical access control infrastructure.
Qualifications:
To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
Roles & Responsibilities:
Design, develop, and deploy robust and scalable AI/ML models in production environments. Collaborate with business stakeholders to identify AI/ML opportunities and define measurable success metrics. Design and build Retrieval-Augmented Generation (RAG) pipelines integrating vector stores, semantic search, and document parsing for domain-specific knowledge retrieval. Integrate Multimodal Conversational AI platforms (MCP) including voice, vision, and text to deliver rich user interactions. Drive innovation through PoCs, benchmarking, and experiments with emerging models and architectures. Optimize models for performance, latency and scalability. Build data pipelines and workflows to support model training and evaluation. Conduct research and experimentation on state-of-the-art techniques (DL, NLP, time series, CV). Partner with MLOps and DevOps teams to implement best practices in model monitoring, versioning and re-training. Lead code reviews and architecture discussions, and mentor junior and peer engineers. Architect and implement end-to-end AI/ML pipelines, ensuring scalability and efficiency. Deploy models in cloud-based (AWS, Azure, GCP) or on-premises environments using tools like Docker, Kubernetes, TensorFlow Serving, or ONNX. Ensure data integrity, quality, and preprocessing best practices for AI/ML model development. Ensure compliance with AI ethics guidelines, data privacy laws (GDPR, CCPA), and corporate AI governance. Work closely with data engineers, software developers, and domain experts to integrate AI into existing systems. Conduct AI/ML training sessions for internal teams to improve AI literacy within the organization. Bring a strong analytical and problem-solving mindset.
Technical Requirements:
Strong expertise in AI/ML engineering and software development. Strong experience with RAG architecture and vector databases. Proficiency in Python and hands-on experience with ML frameworks (TensorFlow, PyTorch, scikit-learn, XGBoost, etc.). Familiarity with MCPs like Google Dialogflow, Rasa, Amazon Lex, or custom-built agents using LLM orchestration. Cloud-based AI/ML experience (AWS SageMaker, Azure ML, GCP Vertex AI, etc.). Solid understanding of the AI/ML life cycle – data preprocessing, feature engineering, model selection, training, validation and deployment. Experience with production-grade ML systems (model serving, APIs, pipelines). Familiarity with data engineering tools (Spark, Kafka, Airflow, etc.). Strong knowledge of statistical modeling, NLP, CV, recommendation systems, anomaly detection and time series forecasting. Hands-on software engineering with knowledge of version control, testing and CI/CD. Hands-on experience in deploying ML models in production using Docker, Kubernetes, TensorFlow Serving, ONNX, and MLflow. Experience in MLOps and CI/CD for ML pipelines, including monitoring, retraining, and model drift detection. Proficiency in scaling AI solutions in cloud environments (AWS, Azure and GCP). Experience in data preprocessing, feature engineering, and dimensionality reduction. Exposure to data privacy, compliance and secure ML practices.
Education and/or Experience:
Graduation or a master’s degree in computer science, information technology, or AI/ML/data science. 3+ years of hands-on experience in AI/ML development, deployment and optimization. Experience in leading AI/ML teams and mentoring junior engineers.
Why apply?
Empowerment: You’ll work as part of a global team in a flexible work environment, learning and enhancing your expertise.
We welcome an opportunity to meet you and learn about your unique talents, skills, and experiences. You don’t need to check all the boxes. If you have most of the skills and experience, we want you to apply.
Innovation: You embrace challenges and want to drive change. We are open to ideas, including flexible work arrangements, job sharing or part-time job seekers.
Integrity: You are results-oriented, reliable, and straightforward and value being treated accordingly. We want all our employees to be themselves, to feel appreciated and accepted.
This opportunity may be open to flexible working arrangements. HID is an Equal Opportunity/Affirmative Action Employer – Minority/Female/Disability/Veteran/Gender Identity/Sexual Orientation.
We make it easier for people to get where they want to go! On an average day, think of how many times you tap, twist, tag, push or swipe to get access, find information, connect with others or track something. HID technology is behind billions of interactions, in more than 100 countries. We help you create a verified, trusted identity that can get you where you need to go – without having to think about it.
When you join our HID team, you’ll also be part of the ASSA ABLOY Group, the global leader in access solutions. You’ll have 63,000 colleagues in more than 70 different countries. We empower our people to build their career around their aspirations and our ambitions – supporting them with regular feedback, training, and development opportunities. Our colleagues think broadly about where they can make the most impact, and we encourage them to grow their role locally, regionally, or even internationally. As we welcome new people on board, it’s important to us to have diverse, inclusive teams, and we value different perspectives and experiences.
Posted 3 weeks ago
0 years
0 Lacs
Malappuram
On-site
Flutter Developer Intern
Company: Cookee Apps LLP
Location: On-site (Kozhikode, Kerala, India)
Job Type: Internship (Full-time, 6 Months)
Schedule: Day shift
About Us
Cookee Apps LLP is a fast-growing software company that builds innovative web and mobile solutions. We’re passionate about mentoring fresh talent through real-world, hands-on training and support.
Position Overview
We are seeking a proactive and enthusiastic Flutter Developer Intern for a 6-month, full-time internship. You will gain practical experience building cross-platform mobile applications using Flutter and Dart, working alongside our front-end, back-end, and UI/UX teams.
Key Responsibilities
Contribute to the development of mobile apps using Flutter and Dart. Collaborate with design and backend teams to implement responsive UI/UX and integrate RESTful APIs. Write clean, maintainable, and efficient code. Participate in code reviews, troubleshooting, and bug-fixing to improve app stability and performance. Assist in writing unit tests and contribute to documentation. Stay updated with emerging mobile technologies and Flutter best practices.
Required Skills
Strong fundamentals in Dart and Flutter development. Basic understanding of mobile development concepts (UI frameworks, state management, navigation). Familiarity with RESTful API integration and JSON parsing. Proficiency with Git version control. Solid problem-solving abilities and attention to detail. Strong communication skills and a collaborative mindset.
Preferred Qualifications
Pursuing or completed a degree/certification in Computer Science, Software Engineering, or a related field. Portfolio or GitHub showcasing Flutter/Dart projects (academic, personal, or hackathon). Experience using state management solutions (e.g. Provider, BLoC, GetX). Exposure to unit testing in Flutter, CI/CD pipelines, or Firebase integration.
What We Offer
Internship Certificate upon successful completion. Letter of Recommendation for outstanding performers. Real-time exposure to industry-level codebases and agile development processes. Mentorship from senior developers and the possibility of a full-time role post-internship.
Duration & Schedule
6 months full-time commitment. Day shift, on-site at Kozhikode, Kerala.
How to Apply
Submit your resume, GitHub portfolio, and a brief statement of interest to career@cookee.io
Job Type: Internship
Schedule: Day shift
Work Location: In person
Posted 3 weeks ago