0 years
0 Lacs
India
On-site
At Momenta, we're committed to creating a safer digital world by protecting individuals and businesses from voice-based fraud and scams. Through innovative AI technology and community collaboration, we're building a future where communication is secure and trustworthy.

The Role

Key Responsibilities

Design and Deliver Core Detection Pipeline
Lead the development of a robust, modular deepfake detection pipeline capable of ingesting, processing, and classifying real-time audio streams with high accuracy and low latency. Architect the system to operate under telecom-grade conditions with configurable interfaces and scalable deployment strategies.

Model Strategy, Development, and Optimization
Own the experimentation and refinement of state-of-the-art deep learning models for voice fraud detection. Evaluate multiple model families, benchmark performance across datasets, and strategically select or ensemble models that balance precision, robustness, and compute efficiency for real-world deployment.

Latency-Conscious Production Readiness
Ensure the entire detection stack meets strict performance targets, including sub-20 ms inference latency. Apply industry best practices in model compression, preprocessing optimization, and system-level integration to support high-throughput inference on both CPU and GPU environments (a minimal inference sketch follows this listing).

Evaluation Framework and Continuous Testing
Design and implement a comprehensive evaluation suite to validate model accuracy, false positive rates, and environmental robustness. Conduct rigorous testing across domains, including cross-corpus validation, telephony channel effects, adversarial scenarios, and environmental noise conditions.

Deployment Engineering and API Integration
Deliver a fully containerized, production-ready inference service with REST/gRPC endpoints. Build CI/CD pipelines, integration tests, and monitoring hooks to ensure system integrity, traceability, and ease of deployment across environments.

Ideal Profile

Technical Skills
- ML frameworks: PyTorch, TensorFlow, ONNX, OpenVINO, TorchScript
- Audio libraries: Librosa, Torchaudio, FFmpeg
- Model development: CNNs, Transformers, Wav2Vec/WavLM, AASIST, RawNet
- Signal processing: VAD, noise reduction, band-pass filtering, codec simulation
- Optimization: quantization, pruning, GPU acceleration
- DevOps: Git, Docker, CI/CD, FastAPI or Flask, REST/gRPC

Preferred Experience
- Prior work on audio deepfake detection or telephony speech processing
- Experience with real-time ML model deployment
- Understanding of adversarial robustness and domain adaptation
- Familiarity with call center environments or telecom-grade constraints

What's on Offer?
- Excellent career development opportunities
- Leadership role
- Opportunity to make a positive impact
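To make the latency-sensitive inference step concrete, here is a minimal, hypothetical sketch using PyTorch and Torchaudio (both named in the profile above). The tiny stand-in CNN and the 8 kHz telephony settings are illustrative assumptions, not Momenta's actual models or pipeline.

```python
"""Minimal sketch of a low-latency audio spoof-detection inference step.

The classifier below is a toy stand-in; production systems would use
AASIST/RawNet-class models plus VAD, streaming buffers, and calibration.
"""
import torch
import torch.nn as nn
import torchaudio

class TinySpoofNet(nn.Module):
    """Stand-in binary classifier over log-mel features."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 1),
        )

    def forward(self, x):
        return self.net(x)

# Telephony-band front end: 8 kHz audio -> 64 log-mel bins.
mel = torchaudio.transforms.MelSpectrogram(
    sample_rate=8000, n_fft=400, hop_length=160, n_mels=64
)
model = TinySpoofNet().eval()

def score_chunk(waveform: torch.Tensor) -> float:
    """Return a spoof probability for one mono 8 kHz audio chunk."""
    with torch.inference_mode():
        feats = mel(waveform).log1p().unsqueeze(0)  # (1, 1, n_mels, frames)
        return torch.sigmoid(model(feats)).item()

chunk = torch.zeros(1, 8000)  # one second of silence, for illustration
print(f"spoof probability: {score_chunk(chunk):.3f}")
```

In a real deployment, the same function body would sit behind the REST/gRPC service described above, with the model exported via TorchScript or ONNX and quantized to meet the sub-20 ms budget.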
Posted 4 days ago
2.0 years
0 Lacs
India
On-site
Kuration AI is an advanced platform that uses AI to make business research easier and more efficient. It helps you discover new companies, enrich your existing data, and monitor market changes in real time. Whether you're in sales, market research, or competitor tracking, Kuration AI provides the tools to save time and gain deeper insights.

The Role

About The Role
We're hiring an experienced full stack developer to build core applications for our AI tooling platform. You'll collaborate closely with data scientists and product stakeholders to deliver robust infrastructure that supports next-generation AI use cases.

Responsibilities
- Design, develop, and maintain production-grade features and services
- Implement API integrations and back-end logic for AI agent pipelines (see the sketch after this listing)
- Collaborate on system architecture and help drive product iterations
- Ensure scalability, reliability, and performance across the stack
- Support technical decisions and mentor junior team members as needed

Ideal Profile
You're excited by emerging technologies and driven to apply AI in solving real-world business problems.
- 2+ years of professional experience as a full stack or backend-focused developer
- Fluent in Python and comfortable with modern JavaScript frameworks
- Strong understanding of system integration and of working with AI/ML applications
- Track record of building tools or platforms, ideally in a fast-paced environment
- Startup experience or interest in lean, agile teams preferred
- Professional-level English language skills
- Must be able to work on-site in Hong Kong full-time

What's on Offer?
- Work alongside and learn from best-in-class talent
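As an illustration of the back-end API work the responsibilities describe, here is a minimal sketch using FastAPI, assumed here as one common Python choice; the /enrich endpoint and its fields are hypothetical, not Kuration AI's actual API.

```python
"""Minimal sketch of a backend endpoint fronting an AI-agent step.

Endpoint name, fields, and behavior are hypothetical placeholders.
Run with: uvicorn app:app --reload
"""
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class CompanyQuery(BaseModel):
    name: str
    domain: str | None = None

@app.post("/enrich")
def enrich_company(query: CompanyQuery) -> dict:
    # Placeholder: a real handler would dispatch to research/enrichment agents.
    return {"name": query.name, "domain": query.domain, "status": "queued"}
```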
Posted 4 days ago
0 years
0 Lacs
Thiruvananthapuram, Kerala, India
On-site
- Lead the architecture, development, and maintenance of scalable backend systems and RESTful APIs using Node.js.
- Design and optimize NoSQL databases, including MongoDB and Cassandra, ensuring performance, reliability, and data integrity.
- Build and manage cloud-native backend applications deployed on Microsoft Azure, leveraging relevant services and best practices.
- Develop and maintain automated CI/CD pipelines using Azure DevOps, enabling smooth, secure, and efficient delivery processes.
- Establish and enforce version control workflows using Git, including branching strategies, pull requests, and code reviews.
- Provide technical leadership and mentorship to backend developers, supporting code quality, problem-solving, and team development.
- Ensure system scalability, security, and monitoring, and lead incident resolution for backend services.
- Collaborate with frontend, DevOps, QA, and product teams to deliver cohesive and well-integrated software solutions.
- Contribute to or oversee the integration of AI/ML or computer vision capabilities into backend workflows where applicable.
- Knowledge of React.js or Next.js is an added advantage, helping with seamless collaboration with frontend teams and full-stack troubleshooting.
- Maintain comprehensive technical documentation, architectural guidelines, and API specifications for internal use and cross-team coordination.
Posted 4 days ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title and Summary
Lead Data Scientist

Overview
We are the global technology company behind the world's fastest payments processing network. We are a vehicle for commerce, a connection to financial systems for the previously excluded, a technology innovation lab, and the home of Priceless®. We ensure every employee has the opportunity to be a part of something bigger and to change lives. We believe as our company grows, so should you. We believe in connecting everyone to endless, priceless possibilities.

Our Team Within Mastercard
The Data & Services team is a key differentiator for Mastercard, providing the cutting-edge services that are used by some of the world's largest organizations to make multi-million dollar decisions and grow their businesses. Focused on thinking big and scaling fast around the globe, this agile team is responsible for end-to-end solutions for a diverse global customer base. Centered on data-driven technologies and innovation, these services include payments-focused consulting, loyalty and marketing programs, business Test & Learn experimentation, and data-driven information and risk management services.

The New Product Engineering team within Data & Services engineers the transformation of new ideas into software solutions to test in the market, then grows and scales products to meet long-term demand and needs. We achieve this with a "move fast, learn fast" mindset, a relentless focus on customer excellence, and a culture of collaboration and empowerment. Our solutions are instrumental in positioning Mastercard as a data & services market leader.

Role Summary
The Data Scientist on the Data & Services New Product Engineering team creates data and analytics solutions, including deep learning artificial intelligence (AI) and machine learning (ML) models that sit atop large datasets of business and finance operations gathered by mid-size to large companies. The Data Scientist owns the full modeling life cycle: data analysis, feature engineering, model training, testing, serving, and monitoring.

As a Data Scientist, You Will
- Work closely with business owners to understand business requirements and the performance metrics governing data quality and model performance for customer-facing products
- Work with multiple disparate sources of data and storage systems, building processes and pipelines that provide cohesive datasets for analysis and modeling
- Generate, maintain, and optimize data pipelines for model building and model performance evaluation
- Develop, test, and evaluate modern machine learning and AI models (a minimal sketch follows this listing)
- Oversee implementation of models
- Evaluate production models against business metrics to drive continuous improvement

Essential Skills
- Data engineering and data science experience
- Experience with SQL and one or more of the following database technologies: PostgreSQL, Hadoop, Spark
- Good knowledge of a Linux/Bash environment
- Python and one of the following machine learning libraries: Spark ML, TensorFlow, scikit-learn, XGBoost
- Good communication skills
- Highly skilled problem solver
- Exhibits a high degree of initiative
- At least an undergraduate degree in CS or a STEM-related field

Nice to Have
- Bachelor's or Master's in CS, Data Science, ML/AI, or a related STEM field
- Understands and implements methods to evaluate own and others' work for bias, inaccuracy, and error
- Databricks
- Loves working with error-prone, messy, disparate, unstructured data
- Experience working with cloud platforms (e.g., Azure, AWS)
- Experience participating in complex engineering projects in an Agile setting, e.g., Scrum

Mastercard is an equal opportunity employer that values diversity and inclusion. Applicants will be considered and treated without regard to gender, gender identity, race, color, ethnicity, national origin, religion, sexual orientation, veteran or disabled status, or any other characteristic protected by applicable law.

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization; therefore, every person working for, or on behalf of, Mastercard is responsible for information security and must:
- Abide by Mastercard's security policies and practices;
- Ensure the confidentiality and integrity of the information being accessed;
- Report any suspected information security violation or breach; and
- Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.

R-251483
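As an illustration of the model development and evaluation cycle described above, here is a minimal sketch using scikit-learn (one of the libraries listed); the synthetic dataset and the AUC metric are illustrative stand-ins for real business data and metrics.

```python
"""Minimal sketch of a train/evaluate cycle with scikit-learn.

Synthetic data stands in for the business datasets described above.
"""
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic binary-classification data for demonstration.
X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

model = GradientBoostingClassifier().fit(X_train, y_train)
scores = model.predict_proba(X_test)[:, 1]

# A production pipeline would track this metric per release to catch drift.
print(f"holdout AUC: {roc_auc_score(y_test, scores):.3f}")
```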
Posted 4 days ago
0 years
0 Lacs
Delhi, India
Remote
About Us
Astra is a cybersecurity SaaS company that makes otherwise chaotic pentests a breeze with its one-of-a-kind AI-led offensive pentest platform. Astra's continuous vulnerability scanner emulates hacker behavior to scan applications for 13,000+ security tests. CTOs and CISOs love Astra because it helps them achieve continuous security at scale, fix vulnerabilities in record time, and seamlessly transition from DevOps to DevSecOps with Astra's powerful CI/CD integrations.

Astra is loved by 800+ companies across 70+ countries. In 2024, Astra uncovered 2.5 million+ vulnerabilities for its customers, saving them $110M+ in potential losses due to security vulnerabilities. We've been recognized by the President of France, Mr. François Hollande, at the La French Tech program and by the Prime Minister of India, Shri Narendra Modi, at the Global Conference on Cyber Security. Loom, MamaEarth, Muthoot Finance, Canara Robeco, Dream 11, and OLX Autos are a few of Astra's customers.

Job Description
This is a remote position.

Role Overview
As Astra Security's first AI Engineer, you will play a pivotal role in introducing and embedding AI into our security products. You will be responsible for designing, developing, and deploying AI applications leveraging both open-source models (Llama, Mistral, DeepSeek, etc.) and proprietary services (OpenAI, Anthropic). Your work will directly impact how AI is used to enhance threat detection, automate security processes, and improve intelligence gathering. This is an opportunity to not only build future AI models but also define Astra Security's AI strategy, laying the foundation for future AI-driven security solutions.

Key Responsibilities
- Lead the AI integration efforts within Astra Security, shaping the company's AI roadmap
- Develop and optimize Retrieval-Augmented Generation (RAG) pipelines with multi-tenant capabilities (a minimal retrieval sketch follows this listing)
- Build and enhance RAG applications using LangChain, LangGraph, and vector databases (e.g., Milvus, Pinecone, pgvector)
- Implement efficient document chunking, retrieval, and ranking strategies
- Optimize LLM interactions using embeddings, prompt engineering, and memory mechanisms
- Work with graph databases (Neo4j or similar) for structuring and querying knowledge bases
- Design multi-agent workflows using orchestration platforms like LangGraph or other emerging agent frameworks for AI-driven decision-making and reasoning
- Integrate vector search, APIs, and external knowledge sources into agent workflows
- Use the end-to-end AI ecosystem (e.g., Hugging Face) to accelerate AI development; while initial work won't involve extensive model training, you should be ready for fine-tuning, domain adaptation, and LLM deployment when needed
- Design and develop AI applications using LLMs (Llama, Mistral, OpenAI, Anthropic, etc.)
- Build APIs and microservices to integrate AI models into backend architectures
- Collaborate with the product and engineering teams to integrate AI into Astra Security's core offerings
- Stay up to date with the latest advancements in AI and security, ensuring Astra remains at the cutting edge

What We Are Looking For
- Exceptional Python skills for AI/ML development
- Hands-on experience with LLMs and AI frameworks (LangChain, Transformers, RAG-based applications)
- Strong understanding of retrieval-augmented generation (RAG) and knowledge graphs
- Experience with AI orchestration tools (LangChain, LangGraph)
- Familiarity with graph databases (Neo4j or similar)
- Experience with Ollama for efficient AI model deployment in production workloads is a plus
- Experience deploying AI models using Docker
- Hands-on experience with Ollama setup and loading DeepSeek/Llama models
- Strong problem-solving skills and a self-starter mindset: you will be building AI at Astra from the ground up

Nice to Have
- Experience with AI deployment frameworks (e.g., BentoML, FastAPI, Flask, AWS)
- Background in cybersecurity or security-focused AI applications

Software Engineering Mindset
This role requires a strong software engineering mindset to build AI solutions from 0 to 1 and scale them based on business needs. The candidate should be comfortable designing, developing, testing, and deploying production-ready AI systems while ensuring maintainability, performance, and scalability.

Why Join Astra Security?
- Own and drive the AI strategy at Astra Security from day one
- Fully remote, agile working environment
- Good engineering culture with full ownership of the design, development, and release lifecycle
- A wholesome opportunity to build things from scratch, improve them, and ship code to production in hours, not weeks
- A holistic understanding of the SaaS and enterprise security business
- Annual trips to beaches or mountains (the last one was in Wayanad)
- Open and supportive culture
- Health insurance and other benefits for you and your spouse; maternity benefits included
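To make the retrieval step of a RAG pipeline concrete without assuming any particular framework's API, here is a toy sketch. The hashed bag-of-words embedding is a deliberate stand-in for a real embedding model, and a production system would use a vector database such as those named above rather than an in-memory matrix.

```python
"""Toy sketch of the retrieval step in a RAG pipeline.

The embedding function is a deterministic stand-in, not a real model;
documents and the query are illustrative security snippets.
"""
import numpy as np

def embed(text: str, dim: int = 256) -> np.ndarray:
    """Stand-in embedding: L2-normalized hashed bag of words."""
    vec = np.zeros(dim)
    for token in text.lower().split():
        vec[hash(token) % dim] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

docs = [
    "SQL injection lets attackers alter database queries.",
    "Cross-site scripting injects scripts into trusted pages.",
    "Rate limiting mitigates brute-force login attempts.",
]
index = np.stack([embed(d) for d in docs])  # in-memory "vector store"

def retrieve(query: str, k: int = 2) -> list[str]:
    scores = index @ embed(query)  # cosine similarity on unit vectors
    return [docs[i] for i in np.argsort(scores)[::-1][:k]]

context = retrieve("how do attackers tamper with SQL queries?")
prompt = "Answer using this context:\n" + "\n".join(context)
print(prompt)  # this assembled prompt would then be sent to an LLM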
Posted 4 days ago
10.0 years
0 Lacs
Coimbatore, Tamil Nadu, India
Remote
About The Job
Position Name: Senior Data & AI/ML Engineer – GCP Specialization Lead
Minimum Experience: 10+ years
Expected Date of Joining: Immediate
Work Location: Remote

Primary Skills
- GCP services: BigQuery, Dataflow, Pub/Sub, Vertex AI
- ML engineering: end-to-end ML pipelines using Vertex AI / Kubeflow
- Programming: Python and SQL
- MLOps: CI/CD for ML, model deployment and monitoring
- Infrastructure as code: Terraform
- Data engineering: ETL/ELT, real-time and batch pipelines
- AI/ML tools: TensorFlow, scikit-learn, XGBoost

Secondary Skills
- GCP certifications: Professional Data Engineer or ML Engineer
- Data tools: Looker, Dataform, Data Catalog
- AI governance: model explainability, privacy, compliance (e.g., GDPR, fairness)
- GCP partner experience: prior involvement in a specialization journey or partner enablement

What Makes Techjays an Inspiring Place to Work
At Techjays, we are driving the future of artificial intelligence with a bold mission to empower businesses worldwide by helping them build AI solutions that transform industries. As an established leader in the AI space, we combine deep expertise with a collaborative, agile approach to deliver impactful technology that drives meaningful change. Our global team consists of professionals who have honed their skills at leading companies such as Google, Akamai, NetApp, ADP, Cognizant Consulting, and Capgemini. With engineering teams across the globe, we deliver tailored AI software and services to clients ranging from startups to large-scale enterprises.

Be part of a company that's pushing the boundaries of digital transformation. At Techjays, you'll work on exciting projects that redefine industries, innovate with the latest technologies, and contribute to solutions that make a real-world impact. Join us on our journey to shape the future with AI.

We are seeking a Senior Data & AI/ML Engineer with deep expertise in GCP who will not only build intelligent and scalable data solutions but also champion our internal capability building and partner-level excellence. This is a high-impact role for a seasoned engineer who thrives in designing GCP-native AI/ML-enabled data platforms. You'll play a dual role as a hands-on technical lead and a strategic enabler, helping drive our Google Cloud Data & AI/ML specialization track forward through successful implementations, reusable assets, and internal skill development.

Preferred Qualifications
- GCP professional certifications: Data Engineer or Machine Learning Engineer
- Experience contributing to a GCP Partner specialization journey
- Familiarity with Looker, Data Catalog, Dataform, or other GCP data ecosystem tools
- Knowledge of data privacy, model explainability, and AI governance is a plus

Key Responsibilities

Data & AI/ML Architecture
- Design and implement data architectures for real-time and batch pipelines, leveraging GCP services such as BigQuery, Dataflow, Dataproc, Pub/Sub, Vertex AI, and Cloud Storage (a minimal BigQuery sketch follows this listing)
- Lead the development of ML pipelines, from feature engineering to model training and deployment, using Vertex AI, AI Platform, and Kubeflow Pipelines
- Collaborate with data scientists to operationalize ML models and support MLOps practices using Cloud Functions, CI/CD, and Model Registry
- Define and implement data governance, lineage, monitoring, and quality frameworks

Google Cloud Partner Enablement
- Build and document GCP-native solutions and architectures that can be used for case studies and specialization submissions
- Lead client-facing PoCs and MVPs to showcase AI/ML capabilities on GCP
- Contribute to building repeatable solution accelerators in Data & AI/ML
- Work with the leadership team to align with Google Cloud Partner Program metrics

Team Development
- Mentor engineers and data scientists toward achieving GCP certifications, especially in Data Engineering and Machine Learning
- Organize and lead internal GCP AI/ML enablement sessions
- Represent the company in Google partner ecosystem events, tech talks, and joint GTM engagements

What We Offer
- Best-in-class packages
- Paid holidays and flexible time-off policies
- Casual dress code and a flexible working environment
- Opportunities for professional development in an engaging, fast-paced environment
- Medical insurance covering self and family up to 4 lakhs per person
- Diverse and multicultural work environment
- An innovation-driven culture with ample support and resources to succeed
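As a small illustration of the BigQuery-centric pipeline work described above, here is a sketch using the google-cloud-bigquery client library; the `analytics.events` table and its columns are hypothetical, and the client assumes Application Default Credentials.

```python
"""Minimal sketch of a BigQuery batch step in a GCP data pipeline.

Requires: pip install google-cloud-bigquery, plus configured ADC.
The table and columns are hypothetical placeholders.
"""
from google.cloud import bigquery

client = bigquery.Client()  # project inferred from the environment

query = """
    SELECT event_date, COUNT(*) AS event_count
    FROM `analytics.events`
    WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY event_date
    ORDER BY event_date
"""

# Downstream, aggregates like these could feed a Vertex AI feature pipeline.
for row in client.query(query).result():
    print(row.event_date, row.event_count)
```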
Posted 4 days ago
2.0 years
0 Lacs
Greater Chennai Area
On-site
Why CDM Smith?
Check out this video and find out why our team loves to work here!

Join Us! CDM Smith – where amazing career journeys unfold.

Imagine a place committed to offering an unmatched employee experience. Where you work on projects that are meaningful to you. Where you play an active part in shaping your career journey. Where your co-workers are invested in you and your success. Where you are encouraged and supported to do your very best and given the tools and resources to do so. Where it's a priority that the company takes good care of you and your family.

Our employees are the heart of our company. As an employer of choice, our goal is to provide a challenging, progressive and inclusive work environment which fosters personal leadership, career growth and development for every employee. We value passionate individuals who challenge the norm, deliver world-class solutions and bring diverse perspectives. Join our team, and together we will make a difference and change the world.

Job Description
CDM Smith is seeking a Data Engineer to join our Digital Engineering Solutions team. This individual will be part of the Data Technology group within the Digital Engineering Solutions team, helping to drive strategic Architecture, Engineering and Construction (AEC) initiatives using cutting-edge data technologies and analytics to deliver actionable business insights and robust solutions for AEC professionals and client outcomes.

The Data Technology group will lead the firm in AEC-focused Business Intelligence and data services by providing architectural guidance, technological vision, and solution development. The group will specifically utilize advanced analytics, data science, and AI/ML to give our business and our products a competitive advantage. This includes understanding and managing the data, how it interconnects, and architecting and engineering data for self-serve BI and BA opportunities.

This position is for a person who has demonstrated excellence in data engineering capabilities, is experienced with data technology and processes, and enjoys framing a problem, shaping and creating solutions, and helping to lead and champion implementation. As a member of the Digital Engineering Solutions team, the Data Technology group will also engage in research and development and provide guidance and oversight to the AEC practices at CDM Smith, engaging in new product research, testing, and the incubation of data technology-related ideas that arise from around the company.

Key Responsibilities
- Assist in the design, development, and maintenance of scalable data pipelines and workflows to extract, transform, and load (ETL/ELT) data from various sources into target systems
- Contribute to automating workflows to ensure efficiency, scalability, and error reduction in data integration processes
- Test data quality and integrity by implementing processes that validate the completeness, accuracy, and consistency of data (a minimal validation sketch follows this listing)
- Monitor and troubleshoot data pipeline performance and reliability
- Document data engineering processes and workflows
- Collaborate with Data Scientists, Analytics Engineers, and stakeholders to understand business requirements and deliver high-quality data solutions
- Stay abreast of the latest developments and advancements, including new and emerging technologies, best practices, and new tools and software applications, and how they could impact CDM Smith
- Assist with the development of documentation, standards, best practices, and workflows for data technology hardware/software in use across the business
- Perform other duties as required

Skills and Abilities
- Good foundation in the Software Development Life Cycle (SDLC) and Agile development methodologies
- Good foundation in cloud ETL/ELT tools and deployment, including Azure Data Factory, Azure Databricks, and Azure Synapse Analytics
- Good knowledge of data modeling and designing scalable ETL/ELT processes
- Familiarity with CI/CD pipelines and DevOps practices for data solutions
- Knowledge of monitoring tools and techniques for ensuring pipeline observability and reliability
- Excellent problem-solving and critical thinking skills to identify and address technical challenges effectively
- Strong critical thinking skills to generate innovative solutions and improve business processes
- Ability to effectively communicate complex technical concepts to both technical and non-technical audiences
- Detail-oriented, with the ability to assist with executing highly complex or specialized projects

Minimum Qualifications
- Bachelor's degree
- 0–2 years of related experience
- Equivalent additional directly related experience will be considered in lieu of a degree

Amount of Travel Required: 0%

Background Check and Drug Testing Information
CDM Smith Inc. and its divisions and subsidiaries (hereafter collectively referred to as "CDM Smith") reserve the right to require background checks, including criminal, employment, education, and licensure checks, as well as credit and motor vehicle checks when applicable, for certain positions. In addition, CDM Smith may conduct drug testing for designated positions. Background checks are conducted after an offer of employment has been made in the United States. The timing of background checks for candidates for positions outside the United States will vary based on country statutory law, but in no case will the background check precede an interview. CDM Smith will conduct interviews of qualified individuals prior to requesting a criminal background check, and no job application submitted prior to such interview shall inquire into an applicant's criminal history. If this position is subject to a background check for any convictions related to its responsibilities and requirements, employment will be contingent upon successful completion of a background investigation including criminal history. Criminal history will not automatically disqualify a candidate. In addition, during employment, individuals may be required by CDM Smith or a CDM Smith client to successfully complete additional background checks, including motor vehicle record checks, as well as drug testing.

Agency Disclaimer
All vendors must have a signed CDM Smith Placement Agreement from the CDM Smith Recruitment Center Manager to receive payment for your placement. Verbal or written commitments from any other member of the CDM Smith staff will not be considered binding terms. All unsolicited resumes sent to CDM Smith and any resume submitted to any employee outside of the CDM Smith Recruiting Center Team (RCT) will be considered property of CDM Smith. CDM Smith will not be held liable to pay a placement fee.

Business Unit: COR
Group: COR
Assignment Category: Fulltime-Regular
Employment Type: Regular
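As an illustration of the data-quality testing responsibility above, here is a minimal pandas sketch; the column names and rules are hypothetical, and a production pipeline would typically use a dedicated framework (e.g., Great Expectations or dbt tests) instead.

```python
"""Minimal sketch of a data-quality validation step in an ETL pipeline.

Columns and rules are illustrative placeholders.
"""
import pandas as pd

def validate(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable data-quality failures."""
    failures = []
    if df["record_id"].duplicated().any():
        failures.append("duplicate record_id values")
    if df["amount"].isna().any():
        failures.append("missing amount values")
    if (df["amount"] < 0).any():
        failures.append("negative amounts")
    return failures

batch = pd.DataFrame({
    "record_id": [1, 2, 2],
    "amount": [10.0, None, -5.0],
})
for problem in validate(batch):
    print("FAIL:", problem)  # a real pipeline would halt or quarantine the batch
```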
Posted 4 days ago
2.0 years
0 Lacs
Greater Chennai Area
On-site
Why CDM Smith?
Check out this video and find out why our team loves to work here!

Join Us! CDM Smith – where amazing career journeys unfold.

Imagine a place committed to offering an unmatched employee experience. Where you work on projects that are meaningful to you. Where you play an active part in shaping your career journey. Where your co-workers are invested in you and your success. Where you are encouraged and supported to do your very best and given the tools and resources to do so. Where it's a priority that the company takes good care of you and your family.

Our employees are the heart of our company. As an employer of choice, our goal is to provide a challenging, progressive and inclusive work environment which fosters personal leadership, career growth and development for every employee. We value passionate individuals who challenge the norm, deliver world-class solutions and bring diverse perspectives. Join our team, and together we will make a difference and change the world.

Job Description
CDM Smith is seeking an Analytics Engineer to join our Digital Engineering Solutions team. This individual will be part of the Data Technology group within the Digital Engineering Solutions team, helping to drive strategic Architecture, Engineering and Construction (AEC) initiatives using cutting-edge data technologies and analytics to deliver actionable business insights and robust solutions for AEC professionals and client outcomes.

The Data Technology group will lead the firm in AEC-focused Business Intelligence and data services by providing architectural guidance, technological vision, and solution development. The group will specifically utilize advanced analytics, data science, and AI/ML to give our business and our products a competitive advantage. This includes understanding and managing the data, how it interconnects, and architecting and engineering data for self-serve BI and BA opportunities.

This position is for a person who has demonstrated excellence in analytics engineering capabilities, is experienced with data technology and processes, and enjoys framing a problem, shaping and creating solutions, and helping to lead and champion implementation. As a member of the Digital Engineering Solutions team, the Data Technology group will also engage in research and development and provide guidance and oversight to the AEC practices at CDM Smith, engaging in new product research, testing, and the incubation of data technology-related ideas that arise from around the company.

Key Responsibilities
- Perform data profiling, design data flows and data sources, build queries, build basic to intermediate data models, and apply business logic/calculations
- Create dynamic and interactive dashboards, reports, data visualizations, and applications to provide end users with intuitive access to insights (a minimal dashboard sketch follows this listing)
- Ensure data accuracy, consistency, and quality across all analytics deliverables
- Collaborate with stakeholders to gather requirements and design tailored analytics applications and visualizations
- Provide support to business users in utilizing self-service BI and analytics platforms effectively
- Monitor analytics systems and troubleshoot technical issues, providing support and improvements as needed
- Stay abreast of the latest developments and advancements, including new and emerging technologies, best practices, and new tools and software applications, and how they could impact CDM Smith
- Assist with the development of documentation, standards, best practices, and workflows for data technology hardware/software in use across the business
- Perform other duties as required

Skills and Abilities
- Basic ability to create data visualizations and transform concepts into fully realized production applications
- Knowledge of Business Intelligence tools such as Power BI, R Shiny, Dash, or Streamlit
- Familiarity with cloud-based analytics solutions, particularly Microsoft Azure and Databricks
- Basic knowledge of software engineering principles, including modular code design, version control (e.g., Git), and CI/CD pipelines
- Basic knowledge of Agile/Scrum
- Basic ability to write and manipulate SQL queries
- Basic programming skills in Python or R for data manipulation, analysis, and integration
- Excellent problem-solving and critical thinking skills to identify and address technical challenges effectively
- Strong critical thinking skills to generate innovative solutions and improve business processes
- Ability to effectively communicate complex technical concepts to both technical and non-technical audiences
- Detail-oriented, with the ability to assist with executing highly complex or specialized projects

Minimum Qualifications
- Bachelor's degree
- 0–2 years of related experience
- Equivalent additional directly related experience will be considered in lieu of a degree

Amount of Travel Required: 0%

Background Check and Drug Testing Information
CDM Smith Inc. and its divisions and subsidiaries (hereafter collectively referred to as "CDM Smith") reserve the right to require background checks, including criminal, employment, education, and licensure checks, as well as credit and motor vehicle checks when applicable, for certain positions. In addition, CDM Smith may conduct drug testing for designated positions. Background checks are conducted after an offer of employment has been made in the United States. The timing of background checks for candidates for positions outside the United States will vary based on country statutory law, but in no case will the background check precede an interview. CDM Smith will conduct interviews of qualified individuals prior to requesting a criminal background check, and no job application submitted prior to such interview shall inquire into an applicant's criminal history. If this position is subject to a background check for any convictions related to its responsibilities and requirements, employment will be contingent upon successful completion of a background investigation including criminal history. Criminal history will not automatically disqualify a candidate. In addition, during employment, individuals may be required by CDM Smith or a CDM Smith client to successfully complete additional background checks, including motor vehicle record checks, as well as drug testing.

Agency Disclaimer
All vendors must have a signed CDM Smith Placement Agreement from the CDM Smith Recruitment Center Manager to receive payment for your placement. Verbal or written commitments from any other member of the CDM Smith staff will not be considered binding terms. All unsolicited resumes sent to CDM Smith and any resume submitted to any employee outside of the CDM Smith Recruiting Center Team (RCT) will be considered property of CDM Smith. CDM Smith will not be held liable to pay a placement fee.

Business Unit: COR
Group: COR
Assignment Category: Fulltime-Regular
Employment Type: Regular
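As an illustration of the self-service dashboarding work described above, here is a minimal sketch in Streamlit (one of the BI tools named); the project-hours data is synthetic.

```python
"""Minimal sketch of a self-service BI view in Streamlit.

Synthetic data stands in for real project metrics.
Run with: streamlit run dashboard.py
"""
import pandas as pd
import streamlit as st

st.title("Project Hours by Discipline")

data = pd.DataFrame({
    "discipline": ["Civil", "Environmental", "Structural"],
    "hours": [420, 310, 275],
})

# Let end users filter the view themselves (the self-service part).
choice = st.multiselect("Disciplines", data["discipline"].tolist(),
                        default=data["discipline"].tolist())
filtered = data[data["discipline"].isin(choice)]

st.bar_chart(filtered.set_index("discipline")["hours"])
st.dataframe(filtered)
```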
Posted 4 days ago
10.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
POSITION PURPOSE
The Service Delivery Manager (SDM) is accountable for ensuring the consistent and high-quality delivery of IT services and support across IDP's production environments, working in close collaboration with the Service Desk, infrastructure, application support, and CRM teams, and third-party vendors. The SDM is also responsible for driving operational excellence, automation, AI-driven support, continual service improvement (CSI), and enhanced customer satisfaction.

Each responsibility below is paired with the measure used to track it.

Service Management & Customer Satisfaction
- Serve as the single point of accountability for end-to-end service delivery and performance. Measure: Customer Satisfaction Score (CSAT)
- Ensure services align with business expectations and SLAs/OLAs are consistently met. Measure: % SLA compliance
- Lead monthly and quarterly service reviews. Measure: number of improvement actions identified and tracked

Major Incident Management (MIM)
- Act as the Major Incident Manager for Severity 1/critical incidents. Measure: mean time to resolve (MTTR) for Sev 1 incidents
- Conduct After Action Reviews (AARs) within 48 hours. Measure: % of AARs completed on time
- Drive root cause analysis. Measure: % of RCAs implemented

Operations Oversight & Vendor Management
- Collaborate with vendors for efficient service delivery. Measure: vendor SLA adherence
- Monitor infrastructure health. Measure: system uptime %
- Track vendor performance. Measure: quarterly vendor scorecard rating

Data-Driven Insights, Reporting & Automation
- Monitor and report on SLAs, KPIs, and incident trends. Measure: accuracy and timeliness of reports
- Use Now Assist/AI tools for proactive management. Measure: % reduction in manual effort via automation
- Present insights to influence strategic decisions. Measure: number of strategic changes implemented based on insights

CRM & Customer Engagement Systems
- Manage CRM platforms (Salesforce, SAP C4C, MS Dynamics, etc.). Measure: CRM system uptime and response times
- Leverage CRM analytics for service performance. Measure: number of improvements initiated via CRM insights
- Collaborate with CRM owners for integration stability. Measure: CRM incident closure rate within SLA

Continuous Service Improvement
- Lead initiatives for process and service enhancements. Measure: % of CSI actions completed
- Promote automation and AI-driven improvements. Measure: % increase in automated transactions
- Implement ITIL best practices. Measure: % audit compliance to ITIL standards

Change & Release Governance
- Support CAB meetings and change evaluation. Measure: % of changes without backout or failure
- Ensure business alignment for changes. Measure: business approval rate for changes

Project Transitions
- Ensure readiness during new service transitions. Measure: % of projects transitioned without major issues
- Represent service operations at go-live. Measure: transition checklist compliance rate

Team Collaboration & Leadership
- Foster a culture of ownership and responsiveness. Measure: internal engagement/feedback score
- Mentor staff for key roles. Measure: % of team with role-readiness plans

Privacy, Security & Compliance
- Align with data privacy/security standards. Measure: % compliance in security audits
- Identify and escalate security risks. Measure: number of security incidents detected early

Experience & Qualifications
- 10+ years of experience in IT service delivery, production support, or operations
- Proven expertise in managing high-severity incidents and leading support operations
- Strong experience with ITSM tools (e.g., ServiceNow), CRM platforms (e.g., Salesforce, MS Dynamics, SAP C4C), automation frameworks, and AI/ML-powered tools like Now Assist
- Familiarity with cloud platforms (AWS/Azure), DevOps, and modern monitoring tools
- ITIL Foundation certification (v3/v4) mandatory; ITIL Intermediate/Expert preferred

LEADERSHIP & BEHAVIORAL COMPETENCIES
- Demonstrates urgency and accountability in resolving critical issues
- Strong interpersonal and communication skills; can influence at senior levels
- Collaborative and team-oriented, with a focus on stakeholder satisfaction
- Analytical thinker with a continuous improvement mindset
- Ability to work under pressure, manage priorities, and drive results
Posted 4 days ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
For more than three decades, Aeris has been a trusted cellular IoT leader enabling the biggest IoT programs and opportunities across Automotive, Utilities and Energy, Fleet Management and Logistics, Medical Devices, and Manufacturing. Our IoT technology expertise serves a global ecosystem of 7,000 enterprise customers, 30 mobile network operator partners, and 80 million IoT devices across the world. Aeris powers today's connected smart world with innovative technologies and borderless connectivity that simplify management, enhance security, optimize performance, and drive growth.

Aeris provides a SaaS-based connectivity management platform for mobile operators and enterprises. It enables organizations to realize new revenue streams from a vast variety of devices while simplifying the process and reducing the cost of connecting them, so they benefit from economies of scale. The platform provides access to key functionality, including subscription management and eSIM/eUICC management, via world-class APIs and operator and enterprise self-service portals. It is built on core tenets of cloud computing and intelligence (AI/ML). Thanks to our deep expertise and extensive trusted partner network, we are the go-to destination for those wishing to roll out high-quality global IoT deployments.

Our Aeris Noida office is based in the heart of the Noida IT Hub in Sec 62. We have a state-of-the-art collaborative workspace with open desks, recreation space, a round-the-clock fully stocked cafeteria and a terrace garden, to name just a few of our perks. We also have one of the best health and insurance benefit schemes in the industry to ensure that our employees and their families are taken care of.

We are in search of a technical professional with a strong understanding of software product development, telecom domain knowledge, and a passion for quality assurance and test automation. In this role, you'll be part of a small, fast-paced team, collaborating with other engineering groups and product management. You must have a solid understanding of core Java and UI concepts, as your team works the complete stack: front end, backend, web services, and database layers. To this end, Aeris uses the "QualDev" title to emphasize this mindset.

Responsibilities
- Convert requirements into test cases
- Execute test cases/scripts to ensure delivery of quality software, and log/track defects
- Automate test cases using existing test automation frameworks (a minimal sketch follows this listing)
- Contribute to the design and development of automation frameworks
- Assist with troubleshooting issues
- Conduct post-release/post-implementation testing
- Work with cross-functional teams to ensure quality throughout the software development lifecycle

Qualifications
- 5+ years of professional software development experience in test
- Experience with programming languages such as Java or Python
- Experience automating tests using web-app automation tools such as Selenium, Appium, or UFT
- Excellent communication, collaboration, reporting, analytical, and problem-solving skills
- Expertise working with JSON formats
- Proficiency with SQL queries, databases, and large datasets
- Willingness to learn new technologies rapidly

Preferred Qualifications
- Knowledge of REST APIs
- Experience with cloud technologies (Google Cloud Platform, AWS, Azure, etc.)
- Experience testing Big Data applications
- Experience implementing or enhancing test automation frameworks
- Experience with Scrum/Agile development methodologies

As part of our hiring process, Aeris may conduct background checks to verify the information provided in your application and assess your suitability for the role. The scope and type of checks will comply with the applicable laws and regulations of the country where the position is based. Additional detail will be provided via the formal application process.

Aeris walks the walk on diversity. We are a brilliant mix of varying ethnicities, backgrounds, religions, cultures, sexual orientations, gender identities, ages and professional/personal/military experiences – and that's by design. Different perspectives are essential to our culture, innovative process and competitive edge. Aeris is proud to be an equal opportunity employer.
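As an illustration of the web-app test automation this role centers on, here is a minimal sketch using Selenium with pytest (both named in the qualifications); the URL and element locators are hypothetical placeholders.

```python
"""Minimal sketch of automated UI tests with Selenium and pytest.

Requires: pip install selenium pytest, plus a local ChromeDriver.
The target URL and locators are hypothetical.
"""
import pytest
from selenium import webdriver
from selenium.webdriver.common.by import By

@pytest.fixture
def driver():
    drv = webdriver.Chrome()
    yield drv
    drv.quit()

def test_login_page_title(driver):
    # Placeholder URL; a real suite would target a test environment.
    driver.get("https://example.com/login")
    assert "Login" in driver.title

def test_username_field_present(driver):
    driver.get("https://example.com/login")
    field = driver.find_element(By.ID, "username")  # hypothetical locator
    assert field.is_displayed()
```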
Posted 4 days ago
12.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
About Veersa
Veersa Technologies is a US-based IT services and AI enablement company founded in 2020, with a global delivery center in Noida (Sector 142).
- Founded by industry leaders, with an impressive 85% YoY growth
- A profitable company since inception
- Team strength: almost 400 professionals and growing rapidly

Our Services Include
- Digital & Software Solutions: product development, legacy modernization, support
- Data Engineering & AI Analytics: predictive analytics, AI/ML use cases, data visualization
- Tools & Accelerators: AI/ML-embedded tools that integrate with client systems
- Tech Portfolio Assessment: TCO analysis, modernization roadmaps, etc.

Tech Stack
- AI/ML, IoT, Blockchain, MEAN/MERN stack, Python, GoLang, RoR, Java Spring Boot, Node.js
- Databases: PostgreSQL, MySQL, MS SQL, Oracle
- Cloud: AWS & Azure (serverless architecture)

Website: https://veersatech.com
LinkedIn: feel free to explore our company profile

Job Description
Position: Technical Lead – Java / Sr. Technical Lead – Java
Experience Required: 7–12 years
Location: Sector 142, Noida (On-site/Hybrid)

Job Overview
We are looking for an experienced Technical/Sr. Technical Lead – Java who is hands-on in programming and coding (80%) while also providing guidance and mentorship (20%) to our Java development team. The successful candidate will have extensive experience in Java development, excellent leadership abilities, and a track record of delivering high-quality software solutions.

Key Responsibilities
- Lead and mentor a team of Java developers, providing technical guidance and support
- Design, develop, and implement robust and scalable Java applications
- Collaborate with stakeholders to gather requirements and define project scope
- Ensure adherence to best practices in software development, including code quality, testing, and documentation
- Conduct code reviews and ensure compliance with coding standards
- Troubleshoot complex issues and provide effective solutions
- Drive continuous improvement and innovation within the development team
- Stay current with industry trends and advancements in Java and related technologies

Required Skills and Qualifications
- Bachelor's degree in computer science, engineering, or a related field
- Experience with core Spring Boot features such as auto-configuration, dependency injection, and embedded servers
- Working knowledge of related Spring frameworks (e.g., Spring Security, Spring Data JPA, caching)
- Experience with RESTful API development and microservices architecture
- Experience with object-oriented programming principles, design patterns, and best practices
- Experience with unit testing frameworks (e.g., JUnit) and writing clean, maintainable unit tests that account for interactions with Spring Boot and its components
- Experience identifying and correcting performance problems in Java applications
- Proven leadership experience in a software development environment
- Proficiency in relational databases, SQL, and ORM technologies
- Excellent problem-solving skills and the ability to handle complex technical challenges
- Strong communication and interpersonal skills
- Familiarity with version control systems such as Git

Preferred Qualifications
- Experience with cloud platforms like AWS or Azure
- Knowledge of front-end technologies such as HTML, CSS, and JavaScript
- Experience with Agile and Scrum development methodologies
- Relevant certifications in Java development or cloud technologies
- Experience with continuous integration and continuous delivery (CI/CD) pipelines
- Experience with containerization technologies (e.g., Docker)
Posted 4 days ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
About The Company
Veersa is a healthtech company that leverages emerging technology and data science to solve business problems in the US healthcare industry. Veersa has established a niche in serving small and medium entities in the US healthcare space through its tech frameworks, platforms, and tech accelerators. Veersa is known for providing innovative solutions using technology and data science to its client base and is the preferred innovation partner to its clients.

Veersa's rich technology expertise manifests in the various tech accelerators and frameworks developed in-house to assist in rapid solutions delivery and implementations. Its end-to-end data ingestion, curation, transformation, and augmentation framework has helped several clients quickly derive business insights and monetize data assets. Veersa teams work across all emerging technology areas such as AI/ML, IoT, and Blockchain, using tech stacks such as MEAN, MERN, Python, GoLang, and RoR, backends such as Java Spring Boot and NodeJS, and databases such as PostgreSQL, MS SQL, MySQL, and Oracle, on AWS and Azure cloud using serverless architecture.

Veersa has two major business lines: Veersalabs, an in-house R&D and product development platform, and Veersa Tech Consulting, which delivers technical solutions for clients. Veersa's customer base includes large US healthcare software vendors, pharmacy chains, payers, providers, and hospital chains. Though Veersa's focus geography is North America, Veersa also provides product engineering expertise to a few clients in Australia and Singapore.

Key Responsibilities
- Design, develop, and maintain scalable web applications using .NET Core and ASP.NET MVC
- Implement front-end solutions using modern frameworks like Angular or React
- Develop, optimize, and maintain data access layers using Entity Framework
- Build and deploy applications on Azure Cloud, leveraging services such as Azure Functions, Azure App Services, Azure SQL, and more
- Collaborate with cross-functional teams to define, design, and ship new features
- Ensure code quality through automated tests, code reviews, and adherence to best practices
- Provide technical leadership, mentoring, and guidance to team members
- Participate in architectural reviews and contribute to the evolution of the technology stack
- Troubleshoot, debug, and upgrade existing systems as needed
- Document technical designs, workflows, and processes on platforms like Jira/Confluence for team-wide accessibility

Skills and Qualifications

Technical Skills
- Backend development: proficiency in .NET Core, ASP.NET MVC, C#, and building RESTful APIs
- Frontend development: expertise in modern front-end frameworks like Angular or React, including state management tools (e.g., Redux or NgRx)
- Database management: hands-on experience with Entity Framework, SQL Server, and writing optimized queries for performance
- Cloud expertise: strong experience with Azure Cloud services, including Azure App Services, Azure Functions, Azure Storage, Azure SQL, and Azure DevOps
- Architecture: knowledge of microservices architecture, design patterns, and event-driven systems
- DevOps and CI/CD: experience with tools like Azure DevOps, Git, and Jenkins, and implementing CI/CD pipelines
- Testing: familiarity with automated testing frameworks such as NUnit, MSTest, Jasmine, or Karma for both backend and frontend testing
- Version control: proficiency with Git and branching strategies
- Containerization: knowledge of Docker and Kubernetes for containerized application development and deployment

Soft Skills
- Strong analytical and problem-solving skills with attention to detail
- Excellent communication and the ability to articulate complex technical concepts to non-technical stakeholders
- Leadership capabilities, with experience mentoring and guiding junior developers
- Adaptability to work in Agile/Scrum environments and deliver under tight deadlines

Additional Skills
- Knowledge of asynchronous programming and real-time communication using SignalR
- Experience integrating third-party APIs and SDKs
Posted 4 days ago
4.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
About The Company
Veersa is a healthtech company that leverages emerging technology and data science to solve business problems in the US healthcare industry. Veersa has established a niche in serving small and medium entities in the US healthcare space through its tech frameworks, platforms, and tech accelerators. Veersa is known for providing innovative solutions using technology and data science to its client base and is the preferred innovation partner to its clients.

Veersa's rich technology expertise manifests in the various tech accelerators and frameworks developed in-house to assist in rapid solutions delivery and implementations. Its end-to-end data ingestion, curation, transformation, and augmentation framework has helped several clients quickly derive business insights and monetize data assets. Veersa teams work across all emerging technology areas such as AI/ML, IoT, and Blockchain, using tech stacks such as MEAN, MERN, Python, GoLang, and RoR, backends such as Java Spring Boot and NodeJS, and databases such as PostgreSQL, MS SQL, MySQL, and Oracle, on AWS and Azure cloud using serverless architecture.

Veersa has two major business lines: Veersalabs, an in-house R&D and product development platform, and Veersa Tech Consulting, which delivers technical solutions for clients. Veersa's customer base includes large US healthcare software vendors, pharmacy chains, payers, providers, and hospital chains. Though Veersa's focus geography is North America, Veersa also provides product engineering expertise to a few clients in Australia and Singapore.

About The Role
We are seeking talented and detail-oriented Data Engineers with expertise in Informatica MDM to join our fast-growing data engineering team. Depending on your experience, you'll join as a Software Engineer or Senior Software Engineer, contributing to the design, development, and maintenance of enterprise data management solutions that support our business objectives. As a key player, you will be responsible for building reliable data pipelines, working with master data management, and ensuring data quality, governance, and integration across systems.

Responsibilities
- Design, develop, and implement data pipelines using ETL tools like Informatica PowerCenter and IICS, and MDM solutions using Informatica MDM
- Develop and maintain batch and real-time data integration workflows
- Collaborate with data architects, business analysts, and stakeholders to understand data requirements
- Perform data profiling, data quality assessments, and master data matching/merging (a toy match/merge sketch follows this listing)
- Implement governance, stewardship, and metadata management practices
- Optimize the performance of Informatica MDM Hub, IDD, and associated components
- Write complex SQL queries and stored procedures as needed

Senior Software Engineer – Additional Responsibilities
- Lead design discussions and code reviews; mentor junior engineers
- Architect scalable data integration solutions using Informatica and complementary tools
- Drive adoption of best practices in data modeling, governance, and engineering
- Work closely with cross-functional teams to shape the data strategy

Required Qualifications

Software Engineer
- Bachelor's degree in Computer Science, Information Systems, or a related field
- 2–4 years of experience with Informatica MDM (Customer 360, Business Entity Services, match/merge rules)
- Strong SQL and data modeling skills
- Familiarity with ETL concepts, REST APIs, and data integration tools
- Understanding of data governance and quality frameworks

Senior Software Engineer
- Bachelor's or Master's in Computer Science, Data Engineering, or a related field
- 4+ years of experience in Informatica MDM, with at least 2 years in a lead role
- Proven track record of designing scalable MDM solutions in large-scale environments
- Strong leadership, communication, and stakeholder management skills
- Hands-on experience with data lakes, cloud platforms (AWS, Azure, or GCP), and big data tools is a plus

Preferred Skills (Nice to Have)
- Experience with other Informatica products (IDQ, PowerCenter)
- Exposure to cloud MDM platforms or cloud data integration tools
- Agile/Scrum development experience
- Knowledge of industry-standard data security and compliance practices

Immediate joiners will be prioritized.
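To illustrate the match/merge idea at the heart of MDM, here is a deliberately simplified pandas toy, not Informatica MDM itself; real match rules are fuzzy and declarative, and survivorship here is reduced to a "most recently updated wins" rule on synthetic data.

```python
"""Toy illustration of the match/merge idea behind MDM golden records.

A normalized email is the (simplified) match key; Informatica MDM
would apply configurable fuzzy match and survivorship rules instead.
"""
import pandas as pd

records = pd.DataFrame({
    "source": ["CRM", "Billing", "Support"],
    "name": ["Acme Corp.", "ACME CORP", "Acme Corporation"],
    "email": ["ops@acme.com", "ops@acme.com", "ops@acme.com"],
    "updated": pd.to_datetime(["2024-01-05", "2024-03-01", "2024-02-10"]),
})

# Match step: group candidate records by a normalized key.
records["match_key"] = records["email"].str.lower().str.strip()

# Merge (survivorship) step: keep the most recently updated values per key.
golden = (records.sort_values("updated")
                 .groupby("match_key", as_index=False)
                 .last())
print(golden[["match_key", "name", "source", "updated"]])
```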
Posted 4 days ago
6.0 - 10.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
For more than three decades, Aeris has been a trusted cellular IoT leader enabling the biggest IoT programs and opportunities across Automotive, Utilities and Energy, Fleet Management and Logistics, Medical Devices, and Manufacturing. Our IoT technology expertise serves a global ecosystem of 7,000 enterprise customers, 30 mobile network operator partners, and 80 million IoT devices across the world. Aeris powers today's connected smart world with innovative technologies and borderless connectivity that simplify management, enhance security, optimize performance, and drive growth.

Aeris provides a SaaS-based connectivity management platform for mobile operators and enterprises. It enables organizations to realize new revenue streams from a vast variety of devices while simplifying the process and reducing the cost of connecting them, so they benefit from economies of scale. The platform provides access to key functionality, including subscription management and eSIM/eUICC management, via world-class APIs and operator and enterprise self-service portals. It is built on core tenets of cloud computing and intelligence (AI/ML). Thanks to our deep expertise and extensive trusted partner network, we are the go-to destination for those wishing to roll out high-quality global IoT deployments.

We are looking for a Senior Java Developer to join our team. In this critical role, you will be responsible for the design, development, and implementation of new features and services. This new position will play an important role in advancing mission-critical SaaS platforms that power global IoT deployments.

Responsibilities
- Develop software solutions by studying requirements, conferring with users, studying data usage and work processes, investigating problem areas, and following the software development lifecycle
- Demonstrate solutions by developing documentation, design specifications, and architecture diagrams
- Improve operations by conducting systems analysis and recommending changes in policies and procedures
- Provide information by collecting, analyzing, and summarizing development and service issues
- Respond promptly and professionally to support requests and customer needs
- Take ownership of technical issues, collaborating with offshore development groups to resolve advanced issues when necessary
- Document troubleshooting and problem resolution steps
- Provide training to Customer Support teams as required
- Ultimately define and design architecture components within the Aeris machine-to-machine cellular network, participating from inception through support phases of the development and deployment lifecycle

Qualifications
- 6–10 years of total professional experience
- Strong core Java skills (multithreading, memory management, and web services)
- Demonstrated understanding of Java/J2EE development frameworks, best practices, and design patterns
- Working knowledge of Java, Spring, Hibernate, REST, XML, JSON, message queues, and databases, both relational and non-relational
- Exposure to cloud technologies, preferably Google Cloud
- Experience working with large-scale production systems
- Excellent verbal, written, and interpersonal communication
- Proficiency with Agile development
- Strong analytical and problem-solving skills
- Comfortable working in a dynamic environment
- Experience in telecom is an advantage

As part of our hiring process, Aeris may conduct background checks to verify the information provided in your application and assess your suitability for the role. The scope and type of checks will comply with the applicable laws and regulations of the country where the position is based. Additional detail will be provided via the formal application process.

Aeris walks the walk on diversity. We are a brilliant mix of varying ethnicities, backgrounds, religions, cultures, sexual orientations, gender identities, ages and professional/personal/military experiences – and that's by design. Different perspectives are essential to our culture, innovative process and competitive edge. Aeris is proud to be an equal opportunity employer.
Posted 4 days ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
About Veersa
Veersa Technologies is a US-based IT services and AI enablement company founded in 2020, with a global delivery center in Noida (Sector 142).
Founded by industry leaders, with an impressive 85% YoY growth
A profitable company since inception
Team strength: almost 400 professionals and growing rapidly

Our Services Include:
Digital & Software Solutions: Product Development, Legacy Modernization, Support
Data Engineering & AI Analytics: Predictive Analytics, AI/ML Use Cases, Data Visualization
Tools & Accelerators: AI/ML-embedded tools that integrate with client systems
Tech Portfolio Assessment: TCO analysis, modernization roadmaps, etc.

Tech Stack: AI/ML, IoT, Blockchain, MEAN/MERN stack, Python, GoLang, RoR, Java Spring Boot, Node.js
Databases: PostgreSQL, MySQL, MS SQL, Oracle
Cloud: AWS & Azure (Serverless Architecture)
Website: https://veersatech.com
LinkedIn: Feel free to explore our company profile

Job Description
Job Title: ETL Ingestion Engineer (Azure Data Factory)
Department: Data Engineering / Analytics
Employment Type: Full-time
Experience Level: 2-5 years

About The Role
We are looking for a talented ETL Ingestion Engineer with hands-on experience in Azure Data Factory (ADF) to join our Data Engineering team. You will be responsible for building, orchestrating, and maintaining robust data ingestion pipelines from various source systems into our data lake and data warehouse environments.

Key Responsibilities
Design and implement scalable data ingestion pipelines using Azure Data Factory (ADF).
Extract data from a variety of sources such as SQL Server, flat files, APIs, and cloud storage.
Develop ADF pipelines and data flows to support both batch and incremental loads (see the watermark sketch after this listing).
Ensure data quality, consistency, and reliability throughout the ETL process.
Optimize ADF pipelines for performance, cost, and scalability.
Monitor pipeline execution, troubleshoot failures, and ensure data availability meets SLAs.
Document pipeline logic, source-target mappings, and operational procedures.

Required Qualifications
Bachelor's degree in Computer Science, Engineering, or a related field.
2+ years of experience in ETL development and data pipeline implementation.
Strong hands-on experience with Azure Data Factory (ADF), including linked services, datasets, pipelines, and triggers.
Proficiency in SQL and in working with structured and semi-structured data (CSV, JSON, Parquet).
Experience with Azure storage systems (ADLS Gen2, Blob Storage) and data movement.
Familiarity with job monitoring and logging mechanisms in Azure.

Preferred Skills
Experience with Azure Data Lake, Synapse Analytics, or Databricks.
Exposure to Azure DevOps for CI/CD in data pipelines.
Understanding of data governance, lineage, and compliance requirements (GDPR, HIPAA, etc.).
Knowledge of RESTful APIs and API-based ingestion.

Job Title: ETL Lead – Azure Data Factory (ADF)
Department: Data Engineering / Analytics
Employment Type: Full-time
Experience Level: 5+ years

About The Role
We are seeking an experienced ETL Lead with strong expertise in Azure Data Factory (ADF) to lead and oversee data ingestion and transformation projects across the organization. The role demands a mix of technical proficiency and leadership: designing scalable data pipelines, managing a team of engineers, and collaborating with cross-functional stakeholders to ensure reliable data delivery.

Key Responsibilities
Lead the design, development, and implementation of end-to-end ETL pipelines using Azure Data Factory (ADF).
Architect scalable ingestion solutions from multiple structured and unstructured sources (e.g., SQL Server, APIs, flat files, cloud storage).
Define best practices for ADF pipeline orchestration, performance tuning, and cost optimization.
Mentor, guide, and manage a team of ETL engineers, ensuring high-quality deliverables and adherence to project timelines.
Work closely with business analysts, data modelers, and source system owners to understand and translate data requirements.
Establish data quality checks, monitoring frameworks, and alerting mechanisms.
Drive code reviews, CI/CD integration (using Azure DevOps), and documentation standards.
Own delivery accountability across multiple ingestion and data integration workstreams.

Required Qualifications
Bachelor's or Master's degree in Computer Science, Data Engineering, or a related discipline.
5+ years of hands-on ETL development experience, with 3+ years in Azure Data Factory.
Deep understanding of data ingestion, transformation, and warehousing best practices.
Strong SQL skills and experience with cloud-native data storage (ADLS Gen2, Blob Storage).
Proficiency in orchestrating complex data flows, parameterized pipelines, and incremental data loads.
Experience handling large-scale data migration or modernization projects.

Preferred Skills
Familiarity with modern data platforms such as Azure Synapse, Snowflake, and Databricks.
Exposure to Azure DevOps pipelines for CI/CD of ADF pipelines and linked services.
Understanding of data governance, security (RBAC), and compliance requirements.
Experience leading Agile teams and sprint-based delivery models.
Excellent communication, leadership, and stakeholder management skills.
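Illustrative aside (not part of either listing above): the incremental loads these ADF roles describe usually follow a high-watermark pattern: read the last processed timestamp, extract only newer rows, land them, then advance the watermark. The Python sketch below shows that pattern under stated assumptions; the table names, watermark table, lake path, and connection string are hypothetical placeholders, and a real ADF pipeline would express the same steps with Lookup, Copy, and Stored Procedure activities.

```python
# Minimal sketch of a high-watermark incremental load (illustrative only).
# Assumes a hypothetical source table `sales.orders` with a `modified_at`
# column and a one-row watermark table owned by the pipeline.
from datetime import datetime, timezone

import pandas as pd
import pyodbc

CONN_STR = "DRIVER={ODBC Driver 18 for SQL Server};SERVER=...;DATABASE=...;"  # placeholder

def load_increment(conn) -> int:
    # 1. Read the watermark left by the previous successful run.
    wm = pd.read_sql(
        "SELECT last_modified FROM etl.watermark WHERE source = 'orders'", conn
    )
    last_modified = wm["last_modified"].iloc[0]

    # 2. Extract only the rows changed since that watermark.
    rows = pd.read_sql(
        "SELECT * FROM sales.orders WHERE modified_at > ?",
        conn, params=[last_modified],
    )
    if rows.empty:
        return 0

    # 3. Land the delta (here: a Parquet file in a lake folder;
    #    ADF would use a sink dataset instead).
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%S")
    rows.to_parquet(f"/lake/raw/orders/{stamp}.parquet")

    # 4. Advance the watermark only after the write succeeds.
    cur = conn.cursor()
    cur.execute(
        "UPDATE etl.watermark SET last_modified = ? WHERE source = 'orders'",
        rows["modified_at"].max(),
    )
    conn.commit()
    return len(rows)

if __name__ == "__main__":
    with pyodbc.connect(CONN_STR) as conn:
        print(f"ingested {load_increment(conn)} rows")
```

Advancing the watermark only after a successful write keeps the load safe to re-run: a failed run simply re-extracts the same window on the next attempt.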
Posted 4 days ago
2.0 - 10.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
About Veersa
Veersa Technologies is a US-based IT services and AI enablement company founded in 2020, with a global delivery center in Noida (Sector 142).
Founded by industry leaders, with an impressive 85% YoY growth
A profitable company since inception
Team strength: almost 400 professionals and growing rapidly

Our Services Include:
Digital & Software Solutions: Product Development, Legacy Modernization, Support
Data Engineering & AI Analytics: Predictive Analytics, AI/ML Use Cases, Data Visualization
Tools & Accelerators: AI/ML-embedded tools that integrate with client systems
Tech Portfolio Assessment: TCO analysis, modernization roadmaps, etc.

Tech Stack: AI/ML, IoT, Blockchain, MEAN/MERN stack, Python, GoLang, RoR, Java Spring Boot, Node.js
Databases: PostgreSQL, MySQL, MS SQL, Oracle
Cloud: AWS & Azure (Serverless Architecture)
Website: https://veersatech.com
LinkedIn: Feel free to explore our company profile

About The Role
We are seeking highly skilled and experienced Data Engineers and Lead Data Engineers to join our growing data team. This role is ideal for professionals with 2 to 10 years of experience in data engineering and a strong foundation in SQL, Databricks, Spark SQL, PySpark, and BI tools such as Power BI or Tableau. As a Data Engineer, you will be responsible for building scalable data pipelines, optimizing data processing workflows, and enabling insightful reporting and analytics across the organization.

Key Responsibilities
Design and develop robust, scalable data pipelines using PySpark and Databricks (a toy PySpark sketch follows this listing).
Write efficient SQL and Spark SQL queries for data transformation and analysis.
Work closely with BI teams to enable reporting through Power BI or Tableau.
Optimize performance of big data workflows and ensure data quality.
Collaborate with business and technical stakeholders to gather and translate data requirements.
Implement best practices for data integration, processing, and governance.

Required Qualifications
Bachelor's degree in Computer Science, Engineering, or a related field.
2-10 years of experience in data engineering or a similar role.
Strong experience with SQL, Spark SQL, and PySpark.
Hands-on experience with Databricks for big data processing.
Proven experience with BI tools such as Power BI and/or Tableau.
Strong understanding of data warehousing and ETL/ELT concepts.
Good problem-solving skills and the ability to work in cross-functional teams.

Nice To Have
Experience with cloud data platforms (Azure, AWS, or GCP).
Familiarity with CI/CD pipelines and version control tools (e.g., Git).
Understanding of data governance, security, and compliance standards.
Exposure to data lake architectures and real-time streaming data pipelines.
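Illustrative aside (not part of the listing): the posting asks for the same transformation skills in both Spark SQL and PySpark, so this toy sketch expresses one aggregation both ways; the schema and values are invented.

```python
# Toy PySpark aggregation shown via the DataFrame API and via Spark SQL
# (illustrative only; invented schema and data).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-demo").getOrCreate()

orders = spark.createDataFrame(
    [("2024-01-01", "north", 120.0), ("2024-01-01", "south", 80.0),
     ("2024-01-02", "north", 95.5)],
    ["order_date", "region", "amount"],
)

# DataFrame API: daily revenue and order count per region.
daily = (orders.groupBy("order_date", "region")
               .agg(F.sum("amount").alias("revenue"),
                    F.count("*").alias("orders")))

# The equivalent Spark SQL, as BI-facing transformations are often written.
orders.createOrReplaceTempView("orders")
daily_sql = spark.sql("""
    SELECT order_date, region, SUM(amount) AS revenue, COUNT(*) AS orders
    FROM orders
    GROUP BY order_date, region
""")

daily.show()
daily_sql.show()
```

Both forms go through the same Catalyst optimizer and produce the same plan, so the choice is mostly about readability for the team maintaining the pipeline.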
Posted 4 days ago
2.0 - 10.0 years
0 Lacs
Ghaziabad, Uttar Pradesh, India
On-site
About Veersa
Veersa Technologies is a US-based IT services and AI enablement company founded in 2020, with a global delivery center in Noida (Sector 142).
Founded by industry leaders, with an impressive 85% YoY growth
A profitable company since inception
Team strength: almost 400 professionals and growing rapidly

Our Services Include:
Digital & Software Solutions: Product Development, Legacy Modernization, Support
Data Engineering & AI Analytics: Predictive Analytics, AI/ML Use Cases, Data Visualization
Tools & Accelerators: AI/ML-embedded tools that integrate with client systems
Tech Portfolio Assessment: TCO analysis, modernization roadmaps, etc.

Tech Stack: AI/ML, IoT, Blockchain, MEAN/MERN stack, Python, GoLang, RoR, Java Spring Boot, Node.js
Databases: PostgreSQL, MySQL, MS SQL, Oracle
Cloud: AWS & Azure (Serverless Architecture)
Website: https://veersatech.com
LinkedIn: Feel free to explore our company profile

About The Role
We are seeking highly skilled and experienced Data Engineers and Lead Data Engineers to join our growing data team. This role is ideal for professionals with 2 to 10 years of experience in data engineering and a strong foundation in SQL, Databricks, Spark SQL, PySpark, and BI tools such as Power BI or Tableau. As a Data Engineer, you will be responsible for building scalable data pipelines, optimizing data processing workflows, and enabling insightful reporting and analytics across the organization.

Key Responsibilities
Design and develop robust, scalable data pipelines using PySpark and Databricks.
Write efficient SQL and Spark SQL queries for data transformation and analysis.
Work closely with BI teams to enable reporting through Power BI or Tableau.
Optimize performance of big data workflows and ensure data quality.
Collaborate with business and technical stakeholders to gather and translate data requirements.
Implement best practices for data integration, processing, and governance.

Required Qualifications
Bachelor's degree in Computer Science, Engineering, or a related field.
2-10 years of experience in data engineering or a similar role.
Strong experience with SQL, Spark SQL, and PySpark.
Hands-on experience with Databricks for big data processing.
Proven experience with BI tools such as Power BI and/or Tableau.
Strong understanding of data warehousing and ETL/ELT concepts.
Good problem-solving skills and the ability to work in cross-functional teams.

Nice To Have
Experience with cloud data platforms (Azure, AWS, or GCP).
Familiarity with CI/CD pipelines and version control tools (e.g., Git).
Understanding of data governance, security, and compliance standards.
Exposure to data lake architectures and real-time streaming data pipelines.
Posted 4 days ago
12.0 - 15.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
For more than three decades, Aeris has been a trusted cellular IoT leader enabling the biggest IoT programs and opportunities across Automotive, Utilities and Energy, Fleet Management and Logistics, Medical Devices, and Manufacturing. Our IoT technology expertise serves a global ecosystem of 7,000 enterprise customers, 30 mobile network operator partners, and 80 million IoT devices across the world. Aeris powers today's connected smart world with innovative technologies and borderless connectivity that simplify management, enhance security, optimize performance, and drive growth.

Aeris provides a SaaS-based connectivity management platform for mobile operators and enterprises. It enables organizations to realize new revenue streams from a vast variety of devices while simplifying the process, and reducing the cost, of connecting them to benefit from economies of scale. The platform provides access to key functionality, including subscription management and eSIM/eUICC management, via world-class APIs and operator and enterprise self-service portals. It is built on the core tenets of cloud computing and intelligence (AI/ML), and thanks to our deep expertise and our extensive trusted partner network, we are the go-to destination for those wishing to roll out high-quality global IoT deployments.

Aeris is in search of a hands-on leader to guide QA and automation teams, with a strong understanding of software product development, telecom domain knowledge, and a passion for quality assurance and test automation. In this role, you'll be leading a small, fast-paced team and collaborating with other engineering teams and product management. You must have a solid understanding of core Java and UI concepts, as your team works the complete stack: front end, back end, web services, and database layers. To this end, Aeris uses the "QualDev" title to emphasize this mindset.

Responsibilities
Provide leadership and guidance to a team of QualDev Engineers, fostering a collaborative and high-performing environment.
Design, develop, and execute complex JMeter scripts for API testing, load testing, and performance testing (a toy latency-probe sketch follows this listing).
Enhance and maintain our testing framework, integrating Selenium, Appium, and Postman automation into our CI/CD pipelines.
Collaborate with cross-functional teams to ensure seamless integration of automotive systems with the cloud.
Analyze test results to identify bottlenecks and performance issues, driving continuous improvement in system reliability and performance.
Troubleshoot, debug, and resolve complex issues with scripts, applications, and infrastructure components.
Lead test planning and estimation, risk identification and mitigation, and maintain comprehensive test metrics (e.g., defect densities, post-production issues).
Deliver detailed and meaningful reports on test results, highlighting potential areas for improvement.
Stay up to date with the latest industry trends, tools, and techniques related to API and performance testing.
Provide mentorship and professional development guidance to team members.

Qualifications
Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.
12 to 15 years of experience in software testing, including at least 4 years as a QA Manager for a complete product.
Proven leadership experience, preferably as a Testing Manager or Test Architect.
Strong development skills, with proficiency in at least one programming language (Java/Python).
In-depth knowledge of RESTful APIs, messaging protocols such as MQTT and AMQP, Kafka, and Google Cloud Pub/Sub.
Hands-on experience with containerization technologies like Kubernetes.
Proficiency in building and maintaining CI/CD pipelines using Jenkins.
Excellent problem-solving skills, attention to detail, and strong communication and collaboration abilities.
Experience writing SQL queries and a basic understanding of performance testing.

Preferred Qualifications
Experience in the automotive industry or IoT projects involving vehicle connectivity.
Experience with network security products.
Experience working with data analytics products.
Exposure to additional cloud environments (GCP/AWS) and technologies such as Kafka, Elasticsearch, and Redis.
Knowledge of performance testing tools and techniques beyond JMeter.

As part of our hiring process, Aeris may conduct background checks to verify the information provided in your application and assess your suitability for the role. The scope and type of checks will comply with the applicable laws and regulations of the country where the position is based. Additional detail will be provided via the formal application process.

Aeris walks the walk on diversity. We are a brilliant mix of varying ethnicities, backgrounds, religions, cultures, sexual orientations, gender identities, ages and professional/personal/military experiences – and that's by design. Different perspectives are essential to our culture, innovative process and competitive edge. Aeris is proud to be an equal opportunity employer.
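Illustrative aside (not part of the listing): JMeter test plans are XML artifacts, so rather than invent one, here is a minimal Python probe that captures the same idea behind the API load testing mentioned above: fire concurrent requests and summarize latency percentiles. The URL, request count, and concurrency level are placeholders.

```python
# Tiny concurrent latency probe (an illustrative stand-in for a JMeter plan).
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

import requests

URL = "https://example.com/api/health"  # placeholder endpoint
N_REQUESTS, CONCURRENCY = 200, 20

def timed_call(_: int) -> float:
    """Issue one GET and return its wall-clock latency in milliseconds."""
    start = time.perf_counter()
    requests.get(URL, timeout=10)
    return (time.perf_counter() - start) * 1000

with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
    latencies = sorted(pool.map(timed_call, range(N_REQUESTS)))

# Approximate percentiles from the sorted sample.
print(f"p50={statistics.median(latencies):.1f} ms")
print(f"p95={latencies[int(0.95 * len(latencies)) - 1]:.1f} ms")
print(f"max={latencies[-1]:.1f} ms")
```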
Posted 4 days ago
5.0 years
0 Lacs
Delhi, India
On-site
About Veersa
Veersa Technologies is a US-based IT services and AI enablement company founded in 2020, with a global delivery center in Noida (Sector 142).
Founded by industry leaders, with an impressive 85% YoY growth
A profitable company since inception
Team strength: almost 400 professionals and growing rapidly

Our Services Include:
Digital & Software Solutions: Product Development, Legacy Modernization, Support
Data Engineering & AI Analytics: Predictive Analytics, AI/ML Use Cases, Data Visualization
Tools & Accelerators: AI/ML-embedded tools that integrate with client systems
Tech Portfolio Assessment: TCO analysis, modernization roadmaps, etc.

Tech Stack: AI/ML, IoT, Blockchain, MEAN/MERN stack, Python, GoLang, RoR, Java Spring Boot, Node.js
Databases: PostgreSQL, MySQL, MS SQL, Oracle
Cloud: AWS & Azure (Serverless Architecture)
Website: https://veersatech.com
LinkedIn: Feel free to explore our company profile

Job Description
Job Title: ETL Ingestion Engineer (Azure Data Factory)
Department: Data Engineering / Analytics
Employment Type: Full-time
Experience Level: 2-5 years

About The Role
We are looking for a talented ETL Ingestion Engineer with hands-on experience in Azure Data Factory (ADF) to join our Data Engineering team. You will be responsible for building, orchestrating, and maintaining robust data ingestion pipelines from various source systems into our data lake and data warehouse environments.

Key Responsibilities
Design and implement scalable data ingestion pipelines using Azure Data Factory (ADF).
Extract data from a variety of sources such as SQL Server, flat files, APIs, and cloud storage.
Develop ADF pipelines and data flows to support both batch and incremental loads.
Ensure data quality, consistency, and reliability throughout the ETL process.
Optimize ADF pipelines for performance, cost, and scalability.
Monitor pipeline execution, troubleshoot failures, and ensure data availability meets SLAs.
Document pipeline logic, source-target mappings, and operational procedures.

Required Qualifications
Bachelor's degree in Computer Science, Engineering, or a related field.
2+ years of experience in ETL development and data pipeline implementation.
Strong hands-on experience with Azure Data Factory (ADF), including linked services, datasets, pipelines, and triggers.
Proficiency in SQL and in working with structured and semi-structured data (CSV, JSON, Parquet).
Experience with Azure storage systems (ADLS Gen2, Blob Storage) and data movement.
Familiarity with job monitoring and logging mechanisms in Azure.

Preferred Skills
Experience with Azure Data Lake, Synapse Analytics, or Databricks.
Exposure to Azure DevOps for CI/CD in data pipelines.
Understanding of data governance, lineage, and compliance requirements (GDPR, HIPAA, etc.).
Knowledge of RESTful APIs and API-based ingestion.

Job Title: ETL Lead – Azure Data Factory (ADF)
Department: Data Engineering / Analytics
Employment Type: Full-time
Experience Level: 5+ years

About The Role
We are seeking an experienced ETL Lead with strong expertise in Azure Data Factory (ADF) to lead and oversee data ingestion and transformation projects across the organization. The role demands a mix of technical proficiency and leadership: designing scalable data pipelines, managing a team of engineers, and collaborating with cross-functional stakeholders to ensure reliable data delivery.

Key Responsibilities
Lead the design, development, and implementation of end-to-end ETL pipelines using Azure Data Factory (ADF).
Architect scalable ingestion solutions from multiple structured and unstructured sources (e.g., SQL Server, APIs, flat files, cloud storage).
Define best practices for ADF pipeline orchestration, performance tuning, and cost optimization.
Mentor, guide, and manage a team of ETL engineers, ensuring high-quality deliverables and adherence to project timelines.
Work closely with business analysts, data modelers, and source system owners to understand and translate data requirements.
Establish data quality checks, monitoring frameworks, and alerting mechanisms.
Drive code reviews, CI/CD integration (using Azure DevOps), and documentation standards.
Own delivery accountability across multiple ingestion and data integration workstreams.

Required Qualifications
Bachelor's or Master's degree in Computer Science, Data Engineering, or a related discipline.
5+ years of hands-on ETL development experience, with 3+ years in Azure Data Factory.
Deep understanding of data ingestion, transformation, and warehousing best practices.
Strong SQL skills and experience with cloud-native data storage (ADLS Gen2, Blob Storage).
Proficiency in orchestrating complex data flows, parameterized pipelines, and incremental data loads.
Experience handling large-scale data migration or modernization projects.

Preferred Skills
Familiarity with modern data platforms such as Azure Synapse, Snowflake, and Databricks.
Exposure to Azure DevOps pipelines for CI/CD of ADF pipelines and linked services.
Understanding of data governance, security (RBAC), and compliance requirements.
Experience leading Agile teams and sprint-based delivery models.
Excellent communication, leadership, and stakeholder management skills.
Posted 4 days ago
2.0 - 10.0 years
0 Lacs
Delhi, India
On-site
About Veersa
Veersa Technologies is a US-based IT services and AI enablement company founded in 2020, with a global delivery center in Noida (Sector 142).
Founded by industry leaders, with an impressive 85% YoY growth
A profitable company since inception
Team strength: almost 400 professionals and growing rapidly

Our Services Include:
Digital & Software Solutions: Product Development, Legacy Modernization, Support
Data Engineering & AI Analytics: Predictive Analytics, AI/ML Use Cases, Data Visualization
Tools & Accelerators: AI/ML-embedded tools that integrate with client systems
Tech Portfolio Assessment: TCO analysis, modernization roadmaps, etc.

Tech Stack: AI/ML, IoT, Blockchain, MEAN/MERN stack, Python, GoLang, RoR, Java Spring Boot, Node.js
Databases: PostgreSQL, MySQL, MS SQL, Oracle
Cloud: AWS & Azure (Serverless Architecture)
Website: https://veersatech.com
LinkedIn: Feel free to explore our company profile

About The Role
We are seeking highly skilled and experienced Data Engineers and Lead Data Engineers to join our growing data team. This role is ideal for professionals with 2 to 10 years of experience in data engineering and a strong foundation in SQL, Databricks, Spark SQL, PySpark, and BI tools such as Power BI or Tableau. As a Data Engineer, you will be responsible for building scalable data pipelines, optimizing data processing workflows, and enabling insightful reporting and analytics across the organization.

Key Responsibilities
Design and develop robust, scalable data pipelines using PySpark and Databricks.
Write efficient SQL and Spark SQL queries for data transformation and analysis.
Work closely with BI teams to enable reporting through Power BI or Tableau.
Optimize performance of big data workflows and ensure data quality.
Collaborate with business and technical stakeholders to gather and translate data requirements.
Implement best practices for data integration, processing, and governance.

Required Qualifications
Bachelor's degree in Computer Science, Engineering, or a related field.
2-10 years of experience in data engineering or a similar role.
Strong experience with SQL, Spark SQL, and PySpark.
Hands-on experience with Databricks for big data processing.
Proven experience with BI tools such as Power BI and/or Tableau.
Strong understanding of data warehousing and ETL/ELT concepts.
Good problem-solving skills and the ability to work in cross-functional teams.

Nice To Have
Experience with cloud data platforms (Azure, AWS, or GCP).
Familiarity with CI/CD pipelines and version control tools (e.g., Git).
Understanding of data governance, security, and compliance standards.
Exposure to data lake architectures and real-time streaming data pipelines.
Posted 4 days ago
12.0 - 18.0 years
0 Lacs
Tamil Nadu, India
Remote
Join us as we work to create a thriving ecosystem that delivers accessible, high-quality, and sustainable healthcare for all.

This position requires expertise in designing, developing, debugging, and maintaining AI-powered applications and data engineering workflows for both local and cloud environments. The role involves working on large-scale projects, optimizing AI/ML pipelines, and ensuring scalable data infrastructure. As a PMTS, you will be responsible for integrating Generative AI (GenAI) capabilities, building data pipelines for AI model training, and deploying scalable AI-powered microservices. You will collaborate with AI/ML, Data Engineering, DevOps, and Product teams to deliver impactful solutions that enhance our products and services. Experience in retrieval-augmented generation (RAG), fine-tuning pre-trained LLMs, AI model evaluation, data pipeline automation, and optimizing cloud-based AI deployments is desirable.

Responsibilities

AI-Powered Software Development & API Integration
Develop AI-driven applications, microservices, and automation workflows using FastAPI, Flask, or Django, ensuring cloud-native deployment and performance optimization.
Integrate OpenAI APIs (GPT models, Embeddings, Function Calling) and retrieval-augmented generation (RAG) techniques to enhance AI-powered document retrieval, classification, and decision-making (a retrieval sketch follows this listing).

Data Engineering & AI Model Performance Optimization
Design, build, and optimize scalable data pipelines for AI/ML workflows using Pandas, PySpark, and Dask, integrating data sources such as Kafka, AWS S3, Azure Data Lake, and Snowflake.
Enhance AI model inference efficiency by implementing vector retrieval using FAISS, Pinecone, or ChromaDB, and optimize API latency with tuning techniques (temperature, top-k sampling, max-token settings).

Microservices, APIs & Security
Develop scalable RESTful APIs for AI models and data services, ensuring integration with internal and external systems while securing API endpoints using OAuth, JWT, and API key authentication.
Implement AI-powered logging, observability, and monitoring to track data pipelines, model drift, and inference accuracy, ensuring compliance with AI governance and security best practices.

AI & Data Engineering Collaboration
Work with AI/ML, Data Engineering, and DevOps teams to optimize AI model deployments, data pipelines, and real-time/batch processing for AI-driven solutions.
Engage in Agile ceremonies, backlog refinement, and collaborative problem-solving to scale AI-powered workflows in areas like fraud detection, claims processing, and intelligent automation.

Cross-Functional Coordination and Communication
Collaborate with Product, UX, and Compliance teams to align AI-powered features with user needs, security policies, and regulatory frameworks (HIPAA, GDPR, SOC2).
Ensure seamless integration of structured and unstructured data sources (SQL, NoSQL, vector databases) to improve AI model accuracy and retrieval efficiency.

Mentorship & Knowledge Sharing
Mentor junior engineers on AI model integration, API development, and scalable data engineering best practices, and conduct knowledge-sharing sessions.

Education & Experience Required
12-18 years of experience in software engineering or AI/ML development, preferably in AI-driven solutions.
Hands-on experience with Agile development, SDLC, CI/CD pipelines, and AI model deployment lifecycles.
Bachelor's degree or equivalent in Computer Science, Engineering, Data Science, or a related field.
Proficiency in full-stack development, with expertise in Python (preferred for AI) and Java.
Experience with structured and unstructured data:
SQL (PostgreSQL, MySQL, SQL Server)
NoSQL (OpenSearch, Redis, Elasticsearch)
Vector databases (FAISS, Pinecone, ChromaDB)
Cloud & AI infrastructure:
AWS: Lambda, SageMaker, ECS, S3
Azure: Azure OpenAI, ML Studio
GenAI frameworks and tools: OpenAI API, Hugging Face Transformers, LangChain, LlamaIndex, AutoGPT, CrewAI.
Experience in LLM deployment, retrieval-augmented generation (RAG), and AI search optimization.
Proficiency in AI model evaluation (BLEU, ROUGE, BERTScore, cosine similarity) and responsible AI deployment.
Strong problem-solving skills, AI ethics awareness, and the ability to collaborate across AI, DevOps, and data engineering teams.
Curiosity and eagerness to explore new AI models, tools, and best practices for scalable GenAI adoption.

About athenahealth
Here's our vision: to create a thriving ecosystem that delivers accessible, high-quality, and sustainable healthcare for all.

What's unique about our locations? From a historic, 19th-century arsenal to a converted, landmark power plant, all of athenahealth's offices were carefully chosen to represent our innovative spirit and promote the most positive and productive work environment for our teams. Our 10 offices across the United States and India, plus numerous remote employees, all work to modernize the healthcare experience, together.

Our company culture might be our best feature. We don't take ourselves too seriously. But our work? That's another story. athenahealth develops and implements products and services that support US healthcare: it's our chance to create healthier futures for ourselves, for our family and friends, for everyone. Our vibrant and talented employees (or athenistas, as we call ourselves) spark the innovation and passion needed to accomplish our goal. We continue to expand our workforce with amazing people who bring diverse backgrounds, experiences, and perspectives at every level, and foster an environment where every athenista feels comfortable bringing their best selves to work. Our size makes a difference, too: we are small enough that your individual contributions will stand out, but large enough to grow your career with our resources and established business stability.

Giving back is integral to our culture. Our athenaGives platform strives to support food security, expand access to high-quality healthcare for all, and support STEM education to develop providers and technologists who will provide access to high-quality healthcare for all in the future. As part of the evolution of athenahealth's Corporate Social Responsibility (CSR) program, we've selected nonprofit partners that align with our purpose and let us foster long-term partnerships for charitable giving, employee volunteerism, insight sharing, collaboration, and cross-team engagement.

What can we do for you? Along with health and financial benefits, athenistas enjoy perks specific to each location, including commuter support, employee assistance programs, tuition assistance, employee resource groups, and collaborative workspaces; some offices even welcome dogs. In addition to our traditional benefits and perks, we sponsor events throughout the year, including book clubs, external speakers, and hackathons. And we provide athenistas with a company culture based on learning, the support of an engaged team, and an inclusive environment where all employees are valued.

We also encourage a better work-life balance for athenistas with our flexibility. While we know in-office collaboration is critical to our vision, we recognize that not all work needs to be done within an office environment, full-time. With consistent communication and digital collaboration tools, athenahealth enables employees to find a balance that feels fulfilling and productive for each individual situation.
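Illustrative aside (not part of the listing, and not athenahealth's implementation): the retrieval step of the RAG work described above can be sketched in a few lines with FAISS. Random vectors stand in for real embeddings (for example, from an embeddings API), and the document snippets are invented.

```python
# Minimal RAG retrieval sketch with FAISS (illustrative; random vectors
# stand in for real embeddings of the documents and the query).
import faiss
import numpy as np

DIM = 384  # embedding width; depends on the embedding model used
docs = ["refund policy", "claims processing steps", "provider onboarding"]

rng = np.random.default_rng(0)
doc_vecs = rng.standard_normal((len(docs), DIM)).astype("float32")
faiss.normalize_L2(doc_vecs)       # unit-normalize so inner product = cosine

index = faiss.IndexFlatIP(DIM)     # exact inner-product index
index.add(doc_vecs)

query_vec = rng.standard_normal((1, DIM)).astype("float32")
faiss.normalize_L2(query_vec)
scores, ids = index.search(query_vec, 2)   # top-2 nearest documents

for score, i in zip(scores[0], ids[0]):
    print(f"{score:.3f}  {docs[i]}")
```

In a production service, the retrieved snippets would be appended to the LLM prompt as context, and the index would be persisted and refreshed as documents change.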
Posted 4 days ago
7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Who We Are
The Cisco Security AI team delivers AI products and a platform for all Cisco Secure products and portfolios, so businesses around the world can defend against threats and safeguard the most vital aspects of their business with security resilience. We are passionate about making businesses secure and about simplifying security, with zero compromise, using AI and machine learning. We are seeking a strong Machine Learning Engineer who can make a big difference in the cybersecurity industry.

Who You Are
You are an accomplished and visionary Senior Machine Learning Engineer with a track record of leading teams and architecting machine learning solutions that have made a significant impact. You are deeply passionate about machine learning, with a proven track record of successfully developing and implementing models that address real-world problems. You are someone with:

Deep Knowledge of LLM Architecture: comprehensive understanding of the architecture underlying large language models, such as Transformer-based models, including GPT (Generative Pre-trained Transformer) and its variants.
Language Model Training and Fine-Tuning: experience in training large-scale language models from scratch, as well as fine-tuning pre-trained models for specific applications or domains. Experience with agentic frameworks is a plus.
Data Preprocessing for NLP: skills in preprocessing textual data, including tokenization, stemming, lemmatization, and handling of different text encodings (a small tokenize-and-generate sketch follows this listing).
Transfer Learning and Adaptation: proficiency in applying transfer learning techniques to adapt existing LLMs to new languages, domains, or specific business needs.
Handling Ambiguity and Context in Text: ability to design models that effectively handle ambiguities, nuances, and context in natural language processing.
Application of LLMs: experience in creatively applying LLM technology in diverse areas such as chatbots, content creation, semantic search, and more.
Data Annotation and Evaluation: skills in designing and implementing data annotation strategies for training LLMs and evaluating their performance using appropriate metrics.
Scalability and Deployment: experience in scaling LLMs for production environments, ensuring efficiency and robustness in deployment.

What You Will Do
Model Training, Optimization, and Evaluation: own the complete cycle of training, fine-tuning, and validating language models. You will be designing and adapting LLMs for use in virtual assistants, automated chatbots, content recommendation systems, and more.
Algorithm Development for Enhanced Language Understanding: develop or refine algorithms to improve the efficiency and accuracy of language models, especially in natural language understanding and generation tasks.
Applying LLMs to Cybersecurity: tailor language models for cybersecurity purposes, such as analyzing threat intelligence, detecting cyber threats, and automating responses to security incidents.
Deployment Strategy: collaborate with software engineering teams to design and implement deployment strategies for machine learning models in security systems, ensuring scalability, reliability, and efficiency.
Documentation and Best Practices: establish best practices for machine learning and security operations, and maintain clear documentation of models, data pipelines, and metrics.
Experimentation with Emerging Technologies and Methods: actively explore new technologies and methodologies in language model development, including experimental frameworks, software tools, and cutting-edge approaches.
Mentoring and Cross-Functional Collaboration: provide mentorship to team members and work collaboratively with cross-functional teams to ensure cohesive development and implementation of language model projects.

Basic Qualifications
BA/BS degree with 7+ years of experience, or MS degree with 5+ years of experience, as a machine learning engineer.
Solid experience in machine learning engineering, with a strong portfolio of successful projects.
Extensive experience in building machine learning systems and scalable solutions.
Expertise in machine learning algorithms, deep learning, and statistical modeling.

Preferred Qualifications
Advanced degree in Computer Science, Data Science, Statistics, Computational Linguistics, or a related field.
Proficiency in programming languages such as Python or R, and experience with machine learning libraries (e.g., TensorFlow, PyTorch, scikit-learn).
Excellent problem-solving and communication skills, with the ability to explain complex concepts to non-technical stakeholders.
Proven ability to work collaboratively in cross-functional teams.

#CiscoAIJobs
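Illustrative aside (not part of the listing): tokenization and generation, two of the LLM fundamentals this role names, fit in a short Hugging Face sketch. GPT-2 is used only because it is small and public; the prompt is invented.

```python
# Minimal tokenize-and-generate loop with Hugging Face Transformers
# (illustrative only; GPT-2 chosen purely for size and availability).
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "A suspicious login from a new device usually indicates"
inputs = tokenizer(prompt, return_tensors="pt")   # token IDs + attention mask

output_ids = model.generate(
    **inputs,
    max_new_tokens=30,
    do_sample=True,                        # sample instead of greedy decoding
    temperature=0.8,
    pad_token_id=tokenizer.eos_token_id,   # GPT-2 defines no pad token
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Fine-tuning for a domain such as threat intelligence would start from the same pair of objects, continuing training on labeled or curated domain text rather than generating directly.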
Posted 4 days ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
TJX Companies
At TJX Companies, every day brings new opportunities for growth, exploration, and achievement. You'll be part of our vibrant team that embraces diversity, fosters collaboration, and prioritizes your development. Whether you're working in our four global Home Offices, Distribution Centers, or Retail Stores (TJ Maxx, Marshalls, HomeGoods, Homesense, Sierra, Winners, and TK Maxx), you'll find abundant opportunities to learn, thrive, and make an impact. Come join our TJX family, a Fortune 100 company and the world's leading off-price retailer.

Job Description
About TJX: TJX is a Fortune 100 company that operates off-price retailers of apparel and home fashions. TJX India - Hyderabad is the IT home office in the global technology organization of off-price apparel and home fashion retailer TJX, established to deliver innovative solutions that help transform operations globally. At TJX, we strive to build a workplace where our Associates' contributions are welcomed and are embedded in our purpose to provide excellent value to our customers every day. At TJX India, we take a long-term view of your career. We have a high-performance culture that rewards Associates with career growth opportunities, preferred assignments, and upward career advancement. We take well-being very seriously and are committed to offering a great work-life balance for all our Associates.

What You'll Discover
Inclusive culture and career growth opportunities.
A truly global IT organization that collaborates across North America, Europe, Asia, and Australia.
A challenging, collaborative, and team-based environment.

What You'll Do
The Global Supply Chain - Logistics Team is responsible for managing various supply chain logistics related solutions within TJX IT. The organization delivers capabilities that enrich the customer experience and provide business value. We seek a motivated, talented Staff Engineer with a good understanding of cloud, database, and BI concepts to help architect enterprise reporting solutions across global buying, planning, and allocations.

What You'll Need
The Global Supply Chain - Logistics Team thrives on strong relationships with our business partners and works diligently to address their needs, which supports TJX growth and operational stability. On this tightly knit and fast-paced solution delivery team, you will be constantly challenged to stretch and think outside the box. You will be working with product teams, architecture, and business partners to strategically plan and deliver product features by connecting the technical and business worlds. You will need to break down complex problems into steps that drive product development while keeping product quality and security as the priority. You will be responsible for most architecture, design, and technical decisions within the assigned scope.

Key Responsibilities
Design, develop, test, and deploy AI solutions using Azure AI services to meet business requirements, working collaboratively with architects and other engineers.
Train, fine-tune, and evaluate AI models, including large language models (LLMs), ensuring they meet performance criteria and integrate seamlessly into new or existing solutions.
Develop and integrate APIs to enable smooth interaction between AI models and other applications, facilitating efficient model serving.
Collaborate effectively with cross-functional teams, including data scientists, software engineers, and business stakeholders, to deliver comprehensive AI solutions.
Optimize AI and ML model performance through techniques such as hyperparameter tuning and model compression to enhance efficiency and effectiveness.
Monitor and maintain AI systems, providing technical support and troubleshooting to ensure continuous operation and reliability.
Create comprehensive documentation for AI solutions, including design documents, user guides, and operational procedures, to support development and maintenance.
Stay updated with the latest advancements in AI, machine learning, and cloud technologies, demonstrating a commitment to continuous learning and improvement.
Design, code, deploy, and support software components, working collaboratively with AI architects and engineers to build impactful systems and services.
Lead medium-complexity initiatives, prioritizing and assigning tasks, providing guidance, and resolving issues to ensure successful project delivery.

Minimum Qualifications
Bachelor's degree in computer science, engineering, or a related field.
8+ years of experience in data/software engineering, design, implementation, and architecture.
At least 5 years of hands-on experience in developing AI/ML solutions, with a focus on deploying them in a cloud environment.
Deep understanding of AI and ML algorithms, with a focus on operations research / optimization (preferably metaheuristics / genetic algorithms; a toy genetic-algorithm sketch follows this listing).
Strong programming skills in Python with advanced OOP concepts.
Good understanding of structured, semi-structured, and unstructured data; data modelling; data analysis; ETL and ELT.
Proficiency with Databricks and PySpark.
Experience with MLOps practices, including CI/CD for machine learning models.
Knowledge of security best practices for deploying AI solutions, including data encryption and access control.
Knowledge of ethical considerations in AI, including bias detection and mitigation strategies.
This role operates in an Agile/Scrum environment and requires a solid understanding of the full software lifecycle, including functional requirement gathering, design and development, testing of software applications, and documenting requirements and technical specifications.
Fully owns epics with decreasing guidance; takes initiative by identifying gaps and opportunities.
Strong communication and influence skills.
Solid team leadership with mentorship skills.
Ability to understand the work environment and competing priorities in conjunction with developing and meeting project goals.
Shows a positive, open-minded, and can-do attitude.

Experience In The Following Technologies
Advanced Python programming (OOP)
Operations research / optimization (preferably metaheuristics / genetic algorithms)
Databricks with PySpark
Azure / cloud knowledge
GitHub / version control
Functional knowledge of supply chain / logistics is preferred.

In addition to our open-door policy and supportive work environment, we also strive to provide a competitive salary and benefits package. TJX considers all applicants for employment without regard to race, color, religion, gender, sexual orientation, national origin, age, disability, gender identity and expression, marital or military status, or based on any individual's status in any group or class protected by applicable federal, state, or local law. TJX also provides reasonable accommodations to qualified individuals with disabilities in accordance with the Americans with Disabilities Act and applicable state and local law.

Address: Salarpuria Sattva Knowledge City, Inorbit Road
Location: APAC Home Office, Hyderabad, IN
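Illustrative aside (not part of the listing, and not TJX's method): the metaheuristics the qualifications call out can be sketched with a toy genetic algorithm. The fitness function here just counts 1-bits; a real supply-chain use would score a routing or packing plan instead.

```python
# Toy genetic algorithm maximizing the number of 1-bits (illustrative only).
import random

GENOME_LEN, POP_SIZE, GENERATIONS, MUT_RATE = 30, 40, 60, 0.02

def fitness(genome):
    # Stand-in objective; real uses score routes, packings, schedules, etc.
    return sum(genome)

def tournament(pop, k=3):
    # Select the fittest of k randomly chosen individuals.
    return max(random.sample(pop, k), key=fitness)

def crossover(a, b):
    # One-point crossover between two parent genomes.
    cut = random.randrange(1, GENOME_LEN)
    return a[:cut] + b[cut:]

def mutate(genome):
    # Flip each bit independently with small probability.
    return [1 - g if random.random() < MUT_RATE else g for g in genome]

pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    pop = [mutate(crossover(tournament(pop), tournament(pop)))
           for _ in range(POP_SIZE)]

best = max(pop, key=fitness)
print(f"best fitness: {fitness(best)}/{GENOME_LEN}")
```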
Posted 4 days ago
5.0 years
0 Lacs
Greater Hyderabad Area
On-site
Welcome to Single Point Solutions! We are a leading software development company specializing in delivering cutting-edge solutions to our clients worldwide. With a team of highly skilled professionals, we provide a comprehensive range of services, including app development, cloud solutions, data engineering, product engineering, artificial intelligence, machine learning, and testing services.

At SPS, we believe that technology has the power to transform businesses and shape the future. Our dedicated team of experts leverages their deep industry knowledge and technical expertise to design and develop innovative software solutions tailored to meet our clients' unique needs. We are committed to building long-lasting partnerships with our clients. We strive to understand their business objectives, challenges, and aspirations, allowing us to deliver tailored solutions that exceed expectations. Our client-centric approach, combined with our technical prowess, sets us apart and enables us to deliver exceptional results.

Connect with us on LinkedIn to stay updated on the latest trends in software development, AI, ML, and technology. Let's collaborate and unlock the full potential of technology to drive your business forward.

The Role
Key Responsibilities
Design and develop robust backend services using Java and Spring Boot.
Build responsive and interactive web interfaces using React.js.
Collaborate with cross-functional teams to define, design, and deliver new features.
Ensure code quality and maintainability through best practices and code reviews.
Write unit and integration tests to ensure high-quality software delivery.
Participate in Agile/Scrum ceremonies and contribute to continuous improvement.

Ideal Profile
Required Skills & Qualifications
Strong programming skills in Java and familiarity with Spring Boot.
Solid experience with React.js, including hooks, state management, and component-based architecture.
Proficient understanding of RESTful APIs and microservices architecture.
Experience with databases (SQL and/or NoSQL).
Familiarity with Git, CI/CD pipelines, and modern DevOps practices.
Excellent problem-solving and debugging skills.
Strong communication and teamwork abilities.

Nice To Have
At least 5 years' experience, including solid experience in a similar role within IT.
A strong networker and relationship builder.
A strong mentor and coach who builds high-performing teams.
A strong team player who can manage multiple stakeholders.

What's on Offer?
Work in a company with a solid track record of performance
A role that offers a breadth of learning opportunities
Flexible working options
Posted 4 days ago
2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Future of work: Mandate 3 (Onsite) – employees will come to the office all 5 days.

About Swiggy
Swiggy is India's leading on-demand delivery platform, with a tech-first approach to logistics and a solution-first approach to consumer demands. With a presence in 500+ cities across India, partnerships with hundreds of thousands of restaurants, an employee base of over 5,000, and a 2 lakh+ strong independent fleet of Delivery Executives, we deliver unparalleled convenience driven by continuous innovation. Built on the back of robust ML technology and fuelled by terabytes of data processed every day, Swiggy offers a fast, seamless, and reliable delivery experience for millions of customers across India. From starting out as a hyperlocal food delivery service in 2014 to becoming India's leading on-demand convenience platform today, our capabilities result not only in lightning-fast delivery for customers, but also in a productive and fulfilling experience for our employees.

Responsibilities
Develop and manage views for different business metrics: dashboards and scorecards.
Coordinate with MIS and analytics teams for data gathering using SQL/Excel, ensuring data sanity for day-to-day reporting.
Work with large, complex data sets to solve business problems, applying advanced analytical methods as needed (a toy metric-breakdown sketch follows this listing).
Conduct regular planning, review key performance metrics, and help the business act on what changes.
Identify the right metrics to track progress against a given business goal.
Expedite root-cause analyses and insight generation for recurring use cases through automation and self-serve platforms.
Work with cross-functional teams for continuous improvement of data accuracy, through feedback and scoping on instrumentation quality and completeness.

Desired Skills
A bachelor's degree in engineering, business, or a related field.
2+ years of experience in analytics.
Excellent planning, organizational, and time management skills.
Strong problem-solving skills and the ability to work in ambiguous environments with high ownership.
Proficiency in SQL, Python, Power BI, and Excel.
Strong drive to move fast and break barriers.

"We are an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, colour, religion, sex, disability status, or any other characteristic protected by the law."
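Illustrative aside (not part of the listing): a common analytics task this role implies is explaining a metric movement by segment. The pandas sketch below decomposes a week-over-week change in orders by city; all data is invented.

```python
# Toy week-over-week metric decomposition by segment (invented data).
import pandas as pd

df = pd.DataFrame({
    "week":   ["W1"] * 3 + ["W2"] * 3,
    "city":   ["BLR", "DEL", "HYD"] * 2,
    "orders": [1000, 800, 600, 950, 880, 610],
})

# One row per city with a column per week, then the absolute change.
pivot = df.pivot_table(index="city", columns="week", values="orders", aggfunc="sum")
pivot["delta"] = pivot["W2"] - pivot["W1"]

total_delta = int(pivot["delta"].sum())
pivot["contribution_%"] = 100 * pivot["delta"] / total_delta

print(pivot.sort_values("delta"))
print(f"net change: {total_delta:+d} orders")
```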
Posted 4 days ago
India has seen a significant rise in the demand for Machine Learning (ML) professionals in recent years. With the growth of technology companies and increasing adoption of AI-driven solutions, the job market for ML roles in India is thriving. Job seekers with expertise in ML have a wide range of opportunities to explore in various industries such as IT, healthcare, finance, e-commerce, and more.
The average salary range for Machine Learning professionals in India varies based on experience levels. Entry-level positions such as ML Engineers or Data Scientists can expect salaries starting from INR 6-8 lakhs per annum. With experience, Senior ML Engineers or ML Architects can earn upwards of INR 15-20 lakhs per annum.
The career progression in Machine Learning typically follows a path from Junior Data Scientist or ML Engineer to Senior Data Scientist, ML Architect, and eventually to a Tech Lead or Chief Data Scientist role.
In addition to proficiency in Machine Learning itself, professionals in this field are typically expected to bring complementary skills such as programming (most commonly Python), statistics and probability, SQL and data handling, and familiarity with cloud platforms.
As you explore job opportunities in Machine Learning in India, remember to hone your skills, stay updated with the latest trends in the field, and approach interviews with confidence. With the right preparation and mindset, you can land your dream ML job and contribute to the exciting world of AI and data science. Good luck!