4.0 years
0 Lacs
Pune, Maharashtra, India
Remote
About Us: We love going to work and think you should too. Our team is dedicated to trust, customer obsession, agility, and striving to be better every day. These values serve as the foundation of our culture, guiding our actions and driving us towards excellence. We foster a culture of performance and recognition, allowing us to transform growth as we enable our employees to do the best work of their careers. This position is located in Pune, a major tech center in India. Across the globe, our Centers of Energy serve as hubs where we accelerate productivity and collaboration, inspire creativity, and cultivate a culture of connection and celebration. Our teams coordinate their time in Centers of Energy to reflect how they work best. To learn more about life at LogicMonitor, check out our Careers Page.
What You'll Do: LM Envision, LogicMonitor's leading hybrid observability platform powered by AI, helps modern enterprises gain operational visibility into and predictability across their IT stacks, so they can continue to deliver extraordinary employee and customer experiences. LogicMonitor takes a layered approach to intelligence, where AI and machine learning are baked into every facet of the LM Envision platform to help IT teams improve efficiency, minimize alert fatigue, proactively predict trends, and maximize enterprise growth and transformation. Our customers love LogicMonitor's ability to bring cloud and traditional IT together into one view, as seen in minimal churn rates, expansion business, and exciting new customer references. In fact, LogicMonitor has received the highest Net Promoter Score of any IT Infrastructure Management provider. LogicMonitor also boasts high employee satisfaction: we have been certified as a Great Place To Work® and named one of BuiltIn's Best Places to Work for the seventh year in a row!
The Solutions Engineer is responsible for the successful delivery of LogicMonitor Professional Services (PS) technical solutions for new and existing customers. The Solutions Engineer works as a member of the PS team, interfacing directly with LogicMonitor customers while working closely with the internal Customer Success Management, Tech Support, and Training teams. Duties range from crafting advanced configurations of LogicMonitor and leading discovery, design, and deployment working sessions with customers to relaying product features and improvements to the Product/Development teams. Considered a subject matter expert on all things LogicMonitor, the Solutions Engineer strives to help customers adopt best practices and optimal configurations that align with business goals and support a comprehensive IT monitoring strategy. Here's a closer look at this key role:
Lead Implementations: Oversee end-to-end deployments, from small to large-scale, ensuring flawless delivery of LogicMonitor solutions.
Collaborative Engagement: Partner with PS Project and Customer Success Managers to kick off engagements and ensure seamless handoffs.
Drive Customer Adoption: Facilitate remote and onsite working sessions for discovery, design, and deployment to accelerate LogicMonitor adoption.
Optimize Configurations: Advocate and implement best practices to fine-tune customer LogicMonitor configurations.
Automate Processes: Leverage the REST API and scripting to automate routine and bulk tasks for enhanced efficiency.
Provide Critical Feedback: Identify gaps, feature requests, and issues, and escalate insights to Product and Development teams.
Develop Custom Solutions: Create tailored, scripted solutions using LogicMonitor features (Websites, LogicModules, NetScans) and external integrations.
Share Expertise: Serve as a subject matter expert by sharing technical lessons and best practices with peers and customers.
Offer Cross-Functional Support: Occasionally assist Monitoring Engineering with LogicModule development and support Sales/Customer Success in closing deals or retaining customers.
Stay Current: Continuously leverage internal resources to stay updated on new product features, enhancements, and best practices.
What You'll Need:
3–4 years of experience as an IT professional or software engineer.
2+ years in IT operations, including systems administration, network engineering, and DevOps.
1+ years of hands-on scripting/programming experience in Python, PowerShell, Java, or Groovy.
Subject matter expertise in one or more IT domains, such as Windows/Linux server, networking, security, storage, virtualization, or cloud (AWS, Azure, GCP).
Proficiency with web technologies, including HTTP, JSON, RegEx, and REST APIs.
Willingness to travel 10–20%.
Experience using AI tools to enhance productivity, innovation, or problem-solving.
Click here to read our International Applicant Privacy Notice. LogicMonitor is an Equal Opportunity Employer. At LogicMonitor, we believe that innovation thrives when every voice is heard and each individual is empowered to bring their unique perspective. We’re committed to creating a workplace where diversity is celebrated, and all employees feel inspired and supported to contribute their best. For us, equal opportunity means fostering a truly inclusive culture where everyone has the chance to grow and succeed. We don’t just open doors; we invite you to step through and be part of something bigger. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran. Our goal is to ensure an accessible and inclusive experience for every candidate. If you need a reasonable accommodation during the application or interview process under applicable local law, please submit a request via this Accommodation Request Form. Know your rights: workplace discrimination is illegal. Please click here to review LogicMonitor’s U.S. Pay Transparency Nondiscrimination Provision.
Posted 1 week ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Python Developer – Web Scraping & Data Processing
About the Role
We are seeking a skilled and detail-oriented Python Developer with hands-on experience in web scraping, document parsing (PDF, HTML, XML), and structured data extraction. You will be part of a core team working on aggregating biomedical content from diverse sources, including grant repositories, scientific journals, conference abstracts, treatment guidelines, and clinical trial databases.
Key Responsibilities
• Develop scalable Python scripts to scrape and parse biomedical data from websites, pre-print servers, citation indexes, journals, and treatment guidelines.
• Build robust modules for splitting multi-record documents (PDFs, HTML, etc.) into individual content units.
• Implement NLP-based field extraction pipelines using libraries like spaCy, NLTK, or regex for metadata tagging.
• Design and automate workflows using schedulers like cron, Celery, or Apache Airflow for periodic scraping and updates.
• Store parsed data in relational (PostgreSQL) or NoSQL (MongoDB) databases with efficient schema design.
• Ensure robust logging, exception handling, and content quality validation across all processes.
Required Skills and Qualifications
• 3+ years of hands-on experience in Python, especially for data extraction, transformation, and loading (ETL).
  o Strong command over web scraping libraries: BeautifulSoup, Scrapy, Selenium, Playwright
  o Proficiency in PDF parsing libraries: PyMuPDF, pdfminer.six, PDFPlumber
• Experience with HTML/XML parsers: lxml, XPath, html5lib
• Familiarity with regular expressions, NLP, and field extraction techniques.
• Working knowledge of SQL and/or NoSQL databases (MySQL, PostgreSQL, MongoDB).
• Understanding of API integration (RESTful APIs) for structured data sources.
• Experience with task schedulers and workflow orchestrators (cron, Airflow, Celery).
• Version control using Git/GitHub and comfort working in collaborative environments.
Good to Have
• Exposure to biomedical or healthcare data parsing (e.g., abstracts, clinical trials, drug labels).
• Familiarity with cloud environments like AWS (Lambda, S3).
• Experience with data validation frameworks and building QA rules.
• Understanding of ontologies and taxonomies (e.g., UMLS, MeSH) for content tagging.
Why Join Us
• Opportunity to work on cutting-edge biomedical data aggregation for large-scale AI and knowledge graph initiatives.
• Collaborative environment with a mission to improve access and insights from scientific literature.
• Flexible work arrangements and access to industry-grade tools and infrastructure.
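For illustration of the scraping-and-parsing workflow this posting describes, here is a minimal Python sketch that fetches a listing page, splits it into per-abstract records, and applies a regex for field extraction. The URL, the div.abstract selector, and the DOI pattern are assumptions made for the example, not details from the role.

```python
import json
import re

import requests
from bs4 import BeautifulSoup

# Hypothetical listing page and CSS class, purely for illustration.
LISTING_URL = "https://example.org/abstracts"

def scrape_abstracts(url: str) -> list[dict]:
    """Fetch a listing page and emit one structured record per abstract block."""
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    records = []
    for block in soup.select("div.abstract"):  # assumed container class
        title_tag = block.find("h2")
        body = block.get_text(" ", strip=True)
        # Regex-based field extraction: pull out a DOI-like identifier if present.
        doi = re.search(r"\b10\.\d{4,9}/[-._;()/:A-Za-z0-9]+", body)
        records.append({
            "title": title_tag.get_text(strip=True) if title_tag else None,
            "doi": doi.group(0) if doi else None,
            "text": body,
        })
    return records

if __name__ == "__main__":
    # Each record is a self-contained content unit ready to load into PostgreSQL/MongoDB.
    print(json.dumps(scrape_abstracts(LISTING_URL)[:3], indent=2))
```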
Posted 1 week ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
Experience in SonarQube, CI/CD, Tekton, Terraform, GCS, GCP Looker, Google Cloud Build, Cloud Run, Vertex AI, Airflow, TensorFlow, etc.
Experience in training, building, and deploying ML and DL models
Experience in HuggingFace, Chainlit, React
Ability to understand technical, functional, non-functional, and security aspects of business requirements and deliver them end-to-end
Ability to adapt quickly to open-source products and tools to integrate with ML platforms
Building and deploying models (Scikit-learn, DataRobot, TensorFlow, PyTorch, etc.)
Developing and deploying in on-prem and cloud environments: Kubernetes, Tekton, OpenShift, Terraform, Vertex AI
Experience with LLMs such as PaLM, GPT-4, and Mistral (open-source models)
Work through the complete lifecycle of Gen AI model development, from training and testing to deployment and performance monitoring
Developing and maintaining AI pipelines with multiple modalities such as text, image, and audio
Have implemented real-world chatbots or conversational agents at scale handling different data sources
Experience in developing image generation/translation tools using latent diffusion models such as Stable Diffusion or InstructPix2Pix
Expertise in handling large-scale structured and unstructured data
Efficiently handled large-scale generative AI datasets and outputs
Familiarity with Docker tools and pipenv/conda/poetry environments
Comfort level in following Python project management best practices (use of setup.py, logging, pytests, relative module imports, sphinx docs, etc.)
Familiarity with GitHub (clone, fetch, pull/push, raising issues and PRs, etc.)
High familiarity with DL theory/practices in NLP applications
Comfort level to code in HuggingFace, LangChain, Chainlit, TensorFlow and/or PyTorch, Scikit-learn, NumPy, and Pandas
Comfort level to use two or more open-source NLP modules like spaCy, TorchText, fastai.text, farm-haystack, and others
Knowledge of fundamental text data processing (use of regex, token/word analysis, spelling correction/noise reduction in text, segmenting noisy or unfamiliar sentences/phrases at the right places, deriving insights from clustering, etc.)
Have implemented real-world BERT or other transformer fine-tuned models (sequence classification, NER, or QA) from data preparation and model creation through inference and deployment
Use of GCP services like BigQuery, Cloud Functions, Cloud Run, Cloud Build, and Vertex AI
Good working knowledge of other open-source packages to benchmark and derive summaries
Experience in using GPU/CPU of cloud and on-prem infrastructures
Skill set to leverage cloud platforms for Data Engineering, Big Data, and ML needs
Use of Docker (experience with experimental Docker features, docker-compose, etc.)
Familiarity with orchestration tools such as Airflow and Kubeflow
Experience in CI/CD and infrastructure-as-code tools like Terraform
Kubernetes or any other containerization tool, with experience in Helm, Argo Workflows, etc.
Ability to develop APIs with compliant, ethical, secure, and safe AI tools
Good UI skills to visualize and build better applications using Gradio, Dash, Streamlit, React, Django, etc.
A deeper understanding of JavaScript, CSS, Angular, HTML, etc. is a plus
Responsibilities
Design NLP/LLM/GenAI applications/products by following robust coding practices
Explore SoTA models/techniques so that they can be applied to automotive industry use cases
Conduct ML experiments to train/infer models; if need be, build models that abide by memory and latency restrictions
Deploy REST APIs or a minimalistic UI for NLP applications using Docker and Kubernetes tools
Showcase NLP/LLM/GenAI applications in the best way possible to users through web frameworks (Dash, Plotly, Streamlit, etc.)
Converge multiple bots into super apps using LLMs with multiple modalities
Develop agentic workflows using Autogen, Agentbuilder, LangGraph
Build modular AI/ML products that can be consumed at scale
Qualifications
Education: Bachelor’s or Master’s Degree in Computer Science, Engineering, Maths, or Science
Completion of any modern NLP/LLM courses or participation in open competitions is also welcomed
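As a rough illustration of the "deploy REST APIs for NLP applications" responsibility above, the sketch below wraps a HuggingFace pipeline in a FastAPI endpoint. The /classify route and the default sentiment model are illustrative assumptions; a real deployment would substitute the project's own model and add batching, authentication, and error handling before containerising it.

```python
# Requires: pip install fastapi uvicorn transformers torch
from fastapi import FastAPI
from pydantic import BaseModel
from transformers import pipeline

app = FastAPI()
# Loads a default sentiment model on first run; swap in the project's own model as needed.
classifier = pipeline("sentiment-analysis")

class TextIn(BaseModel):
    text: str

@app.post("/classify")
def classify(payload: TextIn):
    # The pipeline returns a list of {"label": ..., "score": ...} dicts.
    result = classifier(payload.text)[0]
    return {"label": result["label"], "score": float(result["score"])}

# Local run:  uvicorn main:app --host 0.0.0.0 --port 8000
# The same app can then be packaged with Docker and deployed to Kubernetes.
```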
Posted 1 week ago
10.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description:
• Overall 10+ years of experience working within a large enterprise consisting of large and diverse teams.
• Minimum of 6 years of experience with APM and monitoring technologies.
• Minimum of 3 years of experience with ELK.
• Design and implement efficient log shipping and data ingestion processes.
• Collaborate with development and operations teams to enhance logging capabilities.
• Implement and configure components of the Elastic Stack, including Filebeat, Metricbeat, Winlogbeat, Logstash, and Kibana.
• Create and maintain comprehensive documentation for Elastic Stack configurations and processes.
• Ensure seamless integration between the various Elastic Stack components.
• Advanced Kibana dashboard and visualization modelling and deployment.
• Create and manage Elasticsearch clusters on premises, including configuration parameters, indexing, search and query performance tuning, RBAC security governance, and administration.
• Hands-on scripting and programming in Python, Ansible, bash, data parsing (regex), etc.
• Experience with security hardening and vulnerability/compliance, OS patching, SSL/SSO/LDAP.
• Understanding of HA design, cross-site replication, local and global load balancers, etc.
• Data ingestion and enrichment from various sources, webhooks, and REST APIs with JSON/YAML/XML payloads, and testing with Postman, etc.
• CI/CD deployment pipeline experience (Ansible, Git).
• Strong knowledge of performance monitoring, metrics, planning, and management.
• Ability to apply a systematic and creative approach to solving problems, with out-of-the-box thinking and a sense of ownership and focus.
• Experience with application onboarding: capturing requirements, understanding data sources, architecture diagrams, application relationships, etc.
• Influencing other teams and engineering groups in adopting logging best practices.
• Effective communication skills with the ability to articulate technical details to different audiences.
• Familiarity with ServiceNow, Confluence, and JIRA.
• Understanding of SRE and DevOps principles.
Technical Skills:
APM Tools – ELK, AppDynamics, PagerDuty
Programming Languages – Java / .Net, Python
Operating Systems – Linux and Windows
Automation – GitLab, Ansible
Container Orchestration – Kubernetes
Cloud – Microsoft Azure and AWS
Interested candidates, please share your resume with balaji.kumar@flyerssoft.com
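A minimal sketch of the log parsing and ingestion work described above: a named-group regex (similar in spirit to a Logstash grok pattern) extracts fields from a log line, and the result is indexed through Elasticsearch's document REST API. The cluster URL, index name, and log format are assumptions for the example.

```python
import re
import requests

ES_URL = "http://localhost:9200"  # assumed local cluster; adjust for your environment
LOG_LINE = "2024-05-01 12:00:03 ERROR auth[1123]: login failed for user=alice ip=10.0.0.7"

# Named-group regex, similar in spirit to a Logstash grok pattern.
PATTERN = re.compile(
    r"(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) "
    r"(?P<level>\w+) (?P<service>\w+)\[(?P<pid>\d+)\]: (?P<message>.*)"
)

def parse_line(line: str):
    """Return a dict of extracted fields, or None if the line does not match."""
    match = PATTERN.match(line)
    return match.groupdict() if match else None

doc = parse_line(LOG_LINE)
if doc:
    # Index the parsed event through Elasticsearch's document REST API.
    resp = requests.post(f"{ES_URL}/logs-demo/_doc", json=doc, timeout=10)
    resp.raise_for_status()
    print(resp.json().get("result"))  # typically "created"
```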
Posted 1 week ago
0.0 - 4.0 years
0 Lacs
Goa
On-site
As an intern at Vumonic, your day-to-day responsibilities will involve extracting and parsing email receipts in HTML and PDF formats to identify key data points. You will be required to develop scripts to convert unstructured data into structured formats such as JSON and CSV. Additionally, you will implement regex, NLP, and AI techniques to enhance data extraction accuracy. Collaborating with the data team, you will refine parsing logic and automate processes. Writing SQL queries to store, retrieve, and manipulate structured data will be part of your tasks. When necessary, you will utilize R for data cleaning, analysis, and visualization. Moreover, you will explore and integrate AI/ML-based approaches to improve data extraction and validation. It is essential to stay updated with the latest advancements in AI, NLP, and data parsing technologies. You will also be responsible for testing, validating, and optimizing data pipelines for scalability. About Company: Vumonic is a provider of global data and market intelligence services that assist companies in making data-driven decisions related to strategy, marketing, sales, investments, and more, ultimately enhancing ROI. Vumonic is a rapidly growing startup in the data analytics field with a flat hierarchy.
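For illustration, here is a minimal sketch of the receipt-parsing task described above: flatten an HTML receipt to text, extract fields with regex, and emit JSON. The receipt markup and field patterns are invented for the example; a production parser would handle many more formats and edge cases.

```python
import json
import re

from bs4 import BeautifulSoup

# Illustrative HTML standing in for a real email receipt.
RECEIPT_HTML = """
<html><body>
  <p>Order #A-10234 placed on 2024-06-12</p>
  <p>Total: INR 1,499.00</p>
</body></html>
"""

def parse_receipt(html: str) -> dict:
    """Flatten the receipt to text, then pull structured fields out with regex."""
    text = BeautifulSoup(html, "html.parser").get_text(" ", strip=True)
    order = re.search(r"Order\s+#([A-Z0-9-]+)", text)
    total = re.search(r"Total:\s*([A-Z]{3})\s*([\d,]+\.\d{2})", text)
    return {
        "order_id": order.group(1) if order else None,
        "currency": total.group(1) if total else None,
        "amount": float(total.group(2).replace(",", "")) if total else None,
    }

print(json.dumps(parse_receipt(RECEIPT_HTML), indent=2))
# {"order_id": "A-10234", "currency": "INR", "amount": 1499.0}
```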
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of yourself. And EY is counting on your unique voice and perspective to help the organization become even better. Join us and build an exceptional experience for yourself, and contribute to creating a better working world for all. As a CMS-TDR Staff at EY, you will be part of the cyber security team and work as a SOC analyst to assist clients in detecting and responding to security incidents with the support of SIEM, EDR, and NSM solutions. **The Opportunity:** We are seeking a Security Analyst with experience in SIEM, EDR, and NSM solutions. **Your key responsibilities include:** - Providing operational support using SIEM solutions (Splunk, Sentinel, CrowdStrike Falcon LogScale), EDR Solution (Defender, CrowdStrike, Carbon Black), NSM (Fidelis, ExtraHop) for multiple customers. - Performing the first level of monitoring and triaging of security alerts. - Conducting initial data gathering and investigation using SIEM, EDR, NSM solutions. - Providing near real-time analysis, investigation, and reporting of security incidents for customers. **Skills and attributes for success:** - Customer Service oriented with a commitment to meeting customer needs and seeking feedback for improvement. - Hands-on knowledge of SIEM technologies like Splunk, Azure Sentinel, CrowdStrike Falcon LogScale from a Security analyst's perspective. - Exposure to IOT/OT monitoring tools like Claroty, Nozomi Networks is a plus. - Good knowledge and experience in Security Monitoring and Cyber Incident Response. - Familiarity with Network monitoring platforms like Fidelis XPS, ExtraHop and endpoint protection tools such as Carbon Black, Tanium, CrowdStrike, Defender ATP, etc. **To qualify for the role, you must have:** - B. Tech./ B.E. with sound technical skills. - Ability to work in 24x7 shifts. - Strong command of verbal and written English language. - Technical acumen and critical thinking abilities. - Strong interpersonal and presentation skills. - Hands-on experience in SIEM, EDR, and NSM solutions. - Certification in any of the SIEM platforms. - Knowledge of RegEx, Perl scripting, and SQL query language. - Certification such as CEH, ECSA, ECIH, Splunk Power User. **What working at EY offers:** At EY, you will work on inspiring and meaningful projects with a focus on education, coaching, and personal development. You will have opportunities for skill development, career progression, and the freedom to handle your role in a way that suits you best. EY offers support, coaching, and feedback from engaging colleagues, along with an environment that emphasizes high quality and knowledge exchange. EY is dedicated to building a better working world, creating value for clients, people, and society, and building trust in the capital markets. With diverse teams in over 150 countries, EY provides trust through assurance and helps clients grow, transform, and operate across various domains.,
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
Data Dynamics is a global leader in enterprise data management, specializing in Digital Trust and Data Democracy. With a client base of over 300 organizations, including 25% of the Fortune 20, Data Dynamics is dedicated to establishing a transparent, unified, and empowered data ecosystem. The company's AI-powered self-service data management software is reshaping traditional data management practices by granting data creators of all proficiency levels ownership and control over their data.
As a Regional Solutions Engineer at Data Dynamics, you will assume a pivotal role in technical pre-sales and customer management. Throughout the pre-sales phase, your responsibilities will involve engaging with customers to elicit business and technical requirements, coordinating technical information, delivering demonstrations, and showcasing product positioning capabilities. You will need to conduct thorough research into the specific needs of customers and assess the feasibility of their requirements. Your proficiency will be crucial in evaluating customers' business demands and translating them into technical and licensing requisites. In instances involving Proof of Concept (POC) or a sale, you will collaborate with the sales representative for review. For existing customers where upselling or cross-selling is the primary objective, you will closely collaborate with the customer success manager allocated to the account and the global support team as necessary. A seamless handover to the implementation team before the customer's go-live phase is essential to ensure a smooth customer experience consistently.
A core expectation is your capability to independently set up and conduct demonstrations and POCs from inception. The POC process necessitates defining and agreeing upon success criteria with the customer and sales representative, alongside monitoring the progress towards meeting the customer's objectives. Post POC, you are expected to secure customer approval through a presentation to Data Dynamics Sales and the customers' decision-makers. Furthermore, following any sale globally, your presence may be required to support or lead the installation, contingent on time zones and available resources. Given your regular interactions with both new and existing customers, understanding their business visions, goals, technical capacities, and constraints, you are anticipated to advocate for customers' needs. Providing feedback on business processes, technical matters, or enhancements to the relevant teams within Data Dynamics is crucial. Driving continuous enhancements across all aspects of our operations is vital to achieving our ultimate goal of ensuring customer success and satisfaction. Your responsibilities extend to scriptwriting when necessary, bug reporting, software testing, personal activity tracking, CRM updates, and communication updates to management. Occasional travel for customer or internal meetings may be part of the role. At Data Dynamics, we are committed to delivering an exceptional customer experience and fostering a collaborative, challenging, fun, and innovative work environment. If you are a customer-centric and passionate individual with a strong commitment to building world-class, scalable, data-driven software, we would like to connect with you.
Qualifications:
- Proficiency in Windows and Linux, Docker, and Kubernetes
- Knowledge of SQL, PostgreSQL, Elasticsearch, File/NAS, and Object Storage
- Familiarity with Data Discovery, Data Science, OCR, NLP, Computer Vision, AI, Keyword Search, Regex, Data Governance, GDPR, HIPAA, CCPA
- Experience with Microservices-based applications, ITILv4, Project Management, and Data Migration
- Prior experience in Presales and Integration
- Strong problem-solving, presentation, and communication skills
- Ability to collaborate effectively in a team setting
- Background in data management or a related field is advantageous
- Bachelor's degree in Computer Science, Engineering, or a related field
Posted 1 week ago
0 years
0 Lacs
India
Remote
Official Title: Data Operations Analyst About YipitData: YipitData is the leading market research and analytics firm for the disruptive economy and recently raised up to $475M from The Carlyle Group at a valuation over $1B. We analyze billions of alternative data points every day to provide accurate, detailed insights on ridesharing, e-commerce marketplaces, payments, and more. Our on-demand insights team uses proprietary technology to identify, license, clean, and analyze the data many of the world’s largest investment funds and corporations depend on. For three years and counting, we have been recognized as one of Inc’s Best Workplaces . We are a fast-growing technology company backed by The Carlyle Group and Norwest Venture Partners. Our offices are located in NYC, Austin, Miami, Denver, Mountain View, Seattle , Hong Kong, Shanghai, Beijing, Guangzhou, and Singapore. We cultivate a people-centric culture focused on mastery, ownership, and transparency. Why You Should Apply NOW: We are seeking a highly skilled and detail-oriented Data Operations Analyst to play a crucial role in developing custom product attribution solutions based on unique customer needs. This position requires a deep understanding of consumer product data, a strong combination of technical abilities, and a proactive approach to delivering high-quality results. The Data Operations Analyst will work closely with our corporate retail and brand customers, who may have different approaches to organizing and structuring their categories. Your primary responsibility will be to map products to customer requirements using a blend of manual tagging and machine-learning tools. About The Role: The Data Operations Analyst plays a critical role in developing custom product attribution based on unique customer needs. Each of our corporate retail and brand customers thinks about the structure of their categories in slightly different ways, and the Data Operations Analyst will execute the mapping of products to their requirements using a combination of manual tagging and machine learning tools. Interpretation and execution design require strong judgment. The ideal candidate should bring a combination of technical skills, knowledge of consumer product data, strong attention to detail, and accountability to plan and deliver projects on time with a high degree of accuracy. This is a fully-remote opportunity based in India. The start date is June 30, 2025. During onboarding and training period, we expect several hours of overlap with US time zones. Afterward, hires should be available for meetings and check-ins with their US managers and colleagues; however, outside of these specific times, standard work hours can be flexible. As Our Data Operations Analyst, You Will: Work with corporate retail and brand customers to understand their category structures and product attribution requirements. Execute the mapping of products to customers' category needs, utilizing both manual tagging and machine learning tools. Apply strong judgment and interpretation skills to ensure that data mapping aligns with customer specifications and business goals. Collaborate with cross-functional teams to plan and deliver projects on time, ensuring high levels of accuracy and precision. Continuously monitor and improve product attribution processes to increase efficiency and quality. Leverage technical skills to enhance existing processes and tools that support data operations. 
Maintain strong attention to detail and ensure accuracy in every aspect of the data mapping process. Take accountability for successfully executing projects, including meeting deadlines and client expectations. You Are Likely To Succeed If You Have… 1-2yrs of experience developing a strong technical background, including applied Python, Pyspark, and/or SQL skills RegEx experience is not mandatory but would be considered a nice-to-have skill Proficiency in data analysis, tagging systems, and machine learning tools. Knowledge of consumer product data and experience working with retail or brand data. Exceptional attention to detail, with a focus on delivering high-quality, accurate results. Excellent problem-solving and critical-thinking skills. The ability to manage multiple tasks and projects while maintaining a high degree of accuracy. Strong communication and collaboration skills to work with internal teams and external clients. Proven ability to execute projects efficiently and meet deadlines. Preferred Skills for This Position Include: Experience with data management platforms, product attribution systems, or machine learning tools. Familiarity with data mapping, tagging, and categorization practices. If you are a proactive, technically skilled individual with a passion for data and product attribution, we invite you to join our dynamic team! Apply today to help shape the future of data operations for our retail and brand customers. What We Offer: Our compensation package includes comprehensive benefits, perks, and a competitive salary: We care about your personal life and we mean it. We offer vacation time, medical insurance, parental leave, learning reimbursement, and more! Your growth at YipitData is determined by the impact that you are making, not by tenure, unnecessary facetime, or office politics. Everyone at YipitData is empowered to learn, self-improve, and master their skills in an environment focused on ownership, respect, and trust. The annual salary for this position is anticipated to be ₹16,60,000 - ₹20,75,000 (INR). The final offer may be determined by a number of factors, including, but not limited to, the applicant's experience, knowledge, skills, and abilities, as well as internal team benchmarks. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, marital status, disability, gender, gender identity or expression, or veteran status. We are proud to be an equal-opportunity employer. Job Applicant Privacy Notice
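A minimal sketch of the rule-based side of the product-attribution work described in this role: regex keyword rules map product titles to a customer's category tree, with unmapped items left for manual tagging or an ML fallback. The categories, patterns, and product titles below are hypothetical.

```python
import re

# Hypothetical customer category rules: each category is defined by keyword patterns.
CATEGORY_RULES = {
    "Beverages > Coffee": [r"\bcoffee\b", r"\bespresso\b", r"\bcold brew\b"],
    "Snacks > Chips": [r"\bchips?\b", r"\bnachos\b"],
}

def tag_product(title: str) -> str:
    """Return the first category whose patterns match the product title."""
    lowered = title.lower()
    for category, patterns in CATEGORY_RULES.items():
        if any(re.search(p, lowered) for p in patterns):
            return category
    # Anything unmapped would go to manual tagging or an ML classifier fallback.
    return "Unmapped"

for product in ["Arabica Ground Coffee 500g", "Salted Potato Chips", "Sparkling Water 1L"]:
    print(product, "->", tag_product(product))
```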
Posted 1 week ago
0 years
0 Lacs
India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. CMS-TDR Staff As part of our EY-cyber security team, who shall work as SOC analyst who will assist clients in detecting and responding to security incidents with support of their SIEM, EDR and NSM solutions. The opportunity We’re looking for Security Analyst with experience in SIEM, EDR and NSM solutions. Your key responsibilities Operational support using SIEM solutions (Splunk, Sentinel), EDR Solution (Defender, CrowdStrike, Carbon Black), NSM (Fidelis, ExtraHop) for multiple customers. First level of monitoring and triaging of security alerts Initial data gathering and investigation using SIEM, EDR, NSM solutions. Provide near real-time analysis, investigation and, reporting security incidents for customer Skills and attributes for success Customer Service oriented - Meets commitments to customers; Seeks feedback from customers to identify improvement opportunities. Good knowledge of SIEM technologies such as Splunk, Azure Sentinel from a Security analyst’s point of view Exposure to IOT/OT monitoring (Claroty, Nozomi Networks etc.) is a plus Good knowledge and experience in Security Monitoring Good knowledge and experience in Cyber Incident Response Knowledge in Network monitoring technology platforms such as Fidelis XPS, ExtraHop Knowledge in endpoint protection tools, techniques, and platforms such as Carbon Black, Tanium, CrowdStrike, Defender ATP etc. To qualify for the role, you must have B. Tech./ B.E. with sound technical skills Ability to work in 24x7 shifts Strong command on verbal and written English language. Demonstrate both technical acumen and critical thinking abilities. Strong interpersonal and presentation skills. Hands-on experience in SIEM, EDR and NSM solution Certification in any of the SIEM platforms Knowledge of RegEx, Perl scripting and SQL query language. Certification - CEH, ECSA, ECIH, Splunk Power User What working at EY offers At EY, we’re dedicated to helping our clients, from start–ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: Support, coaching and feedback from some of the most engaging colleagues around Opportunities to develop new skills and progress your career The freedom and flexibility to handle your role in a way that’s right for you EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. 
Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 week ago
5.0 years
3 - 7 Lacs
Bengaluru
On-site
Department: Cyber Defense
Job posted on: Jul 21, 2025
Employee Type: Permanent
Experience range (Years): 5 - 8 years
Role:
Responsible for the security monitoring and log analysis of multi-vendor security solutions
Continuously assess and recommend the implementation of cutting-edge technologies relevant to cyber defense models to meet our customers' evolving needs
Analyze security alerts to identify potential incidents, such as malware infections, unauthorized access, or data breaches
Formulate and implement monitoring policies, procedures, and standards relating to SecOps and security domains such as network security, data security, cloud security, zero trust, etc.
Automate response to security incidents (malware infections, unauthorized access, malicious emails, DDoS attacks, etc.), together with evaluating the type, nature, and severity of security events (security assurance/security compliance) through the use of a range of security event analysis tools
Threat hunting: analyze security system logs, security tools, and available data sources on a day-to-day basis
Enhance SOC service capabilities and offerings across key security domains and solution areas
Malware reverse engineering, including code or behavior analysis for endpoints and the network
Data security controls, including malware protection, firewalls, intrusion detection systems, content filtering, Internet proxies, encryption controls, and log management solutions
Advanced problem-solving skills and the ability to develop effective long-term solutions to complex problems
Knowledge and implementation of MITRE ATT&CK to map use cases across the initial points of exposure, alert mapping, and incident reporting
Evaluate internal and external environments for threats and changes related to Information Security, and perform the role of Information Security subject matter expert to ensure these are properly addressed and controlled
Skills:
Intermediate knowledge of security operations, incident analysis, incident handling, vulnerability management or testing, system patching, log analysis, and intrusion detection
Develop and implement custom detection rules and use cases to identify and respond to potential security threats
Ability to investigate compromised systems, analyze malware, and collect intrusion artifacts (e.g., source code, trojans) to determine the scope and origin of an attack
Familiarity with forensic tools like Forensic Toolkit (FTK), Wireshark, or Elastic Stack is critical
Conduct detailed forensic analyses to identify the root cause, scope, and impact of security incidents, including malware analysis and artifact collection
Develop and implement incident response plans, playbooks, and procedures to ensure effective threat containment, eradication, and recovery
Document incidents thoroughly and prepare actionable reports for technical and non-technical stakeholders, including management and, if necessary, law enforcement
Collaborate with threat intelligence teams to enhance threat detection capabilities
Solid experience in incident response and data protection incidents
Analyze cloud platform logs (CloudTrail, audit logs, etc.) to identify patterns and anomalies indicative of security threats or unauthorized access
Develop, implement, and maintain detection rules based on cloud platform logs to identify specific activities and events within the cloud environment
Create and optimize alerts and notifications for security incidents identified through log analysis
Perform adversary emulation activities to identify detection gaps in the environment
Knowledge of threat intelligence sources and indicators of compromise (IOCs)
Understanding of DevOps and CI/CD pipelines in cloud environments
Collaborate with security teams to refine detection rules based on the latest threat intelligence
Work closely with teams to discover new detection capabilities
Integrate cloud platform log data with SIEM systems for centralized monitoring and correlation with other security events
Familiarity with field extractions and regex, as well as knowledge of SIEM infrastructure issues, will be an added advantage
Document detection rules, processes, and methodologies for cloud platform log analysis
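As a rough illustration of detection-rule development over cloud platform logs, the sketch below scans CloudTrail-style events for repeated failed console logins and logging-tampering calls. The events are fabricated and the threshold arbitrary; a production rule would run inside the SIEM rather than as a standalone script.

```python
import json
from collections import Counter

# Toy events using CloudTrail-style field names; the data itself is fabricated.
EVENTS = [
    {"eventName": "ConsoleLogin", "errorMessage": "Failed authentication",
     "sourceIPAddress": "203.0.113.9", "userIdentity": {"userName": "alice"}},
    {"eventName": "ConsoleLogin", "errorMessage": "Failed authentication",
     "sourceIPAddress": "203.0.113.9", "userIdentity": {"userName": "alice"}},
    {"eventName": "DeleteTrail", "sourceIPAddress": "198.51.100.4",
     "userIdentity": {"userName": "bob"}},
]

def detect(events, failed_login_threshold=2):
    """Flag repeated failed console logins and any attempt to tamper with logging."""
    alerts = []
    failures = Counter(
        (e["sourceIPAddress"], e["userIdentity"].get("userName"))
        for e in events
        if e.get("eventName") == "ConsoleLogin" and "Failed" in e.get("errorMessage", "")
    )
    for (ip, user), count in failures.items():
        if count >= failed_login_threshold:
            alerts.append(f"Possible brute force: {count} failed logins for {user} from {ip}")
    alerts += [
        f"Logging tampering: {e['eventName']} by {e['userIdentity'].get('userName')}"
        for e in events if e.get("eventName") in {"DeleteTrail", "StopLogging"}
    ]
    return alerts

print(json.dumps(detect(EVENTS), indent=2))
```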
Posted 1 week ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
Comfort level in following Python project management best practices (use of setup.py, logging, pytests, relative module imports, sphinx docs, etc.)
Familiarity with GitHub (clone, fetch, pull/push, raising issues and PRs, etc.)
High familiarity with DL theory/practices in NLP applications
Comfort level to code in HuggingFace, LangChain, Chainlit, TensorFlow and/or PyTorch, Scikit-learn, NumPy, and Pandas
Comfort level to use two or more open-source NLP modules like spaCy, TorchText, fastai.text, farm-haystack, and others
Knowledge of fundamental text data processing (use of regex, token/word analysis, spelling correction/noise reduction in text, segmenting noisy or unfamiliar sentences/phrases at the right places, deriving insights from clustering, etc.)
Have implemented real-world BERT or other transformer fine-tuned models (sequence classification, NER, or QA) from data preparation and model creation through inference and deployment
Use of GCP services like BigQuery, Cloud Functions, Cloud Run, Cloud Build, and Vertex AI
Good working knowledge of other open-source packages to benchmark and derive summaries
Experience in using GPU/CPU of cloud and on-prem infrastructures
Skill set to leverage cloud platforms for Data Engineering, Big Data, and ML needs
Use of Docker (experience with experimental Docker features, docker-compose, etc.)
Familiarity with orchestration tools such as Airflow and Kubeflow
Experience in CI/CD and infrastructure-as-code tools like Terraform
Kubernetes or any other containerization tool, with experience in Helm, Argo Workflows, etc.
Ability to develop APIs with compliant, ethical, secure, and safe AI tools
Good UI skills to visualize and build better applications using Gradio, Dash, Streamlit, React, Django, etc.
A deeper understanding of JavaScript, CSS, Angular, HTML, etc. is a plus
Responsibilities
Design NLP/LLM/GenAI applications/products by following robust coding practices
Explore SoTA models/techniques so that they can be applied to automotive industry use cases
Conduct ML experiments to train/infer models; if need be, build models that abide by memory and latency restrictions
Deploy REST APIs or a minimalistic UI for NLP applications using Docker and Kubernetes tools
Showcase NLP/LLM/GenAI applications in the best way possible to users through web frameworks (Dash, Plotly, Streamlit, etc.)
Converge multiple bots into super apps using LLMs with multiple modalities
Develop agentic workflows using Autogen, Agentbuilder, LangGraph
Build modular AI/ML products that can be consumed at scale
Data Engineering:
Skill sets to perform distributed computing (specifically parallelism and scalability in data processing, modeling, and inferencing through Spark, Dask, RAPIDS AI, or RAPIDS cuDF)
Ability to build Python-based APIs (e.g., use of FastAPI, Flask, or Django for APIs)
Experience in Elasticsearch and Apache Solr is a plus, as are vector databases
Qualifications
Education: Bachelor’s or Master’s Degree in Computer Science, Engineering, Maths, or Science
Completion of any modern NLP/LLM courses or participation in open competitions is also welcomed
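The description above expects hands-on experience taking a fine-tuned transformer classifier through inference; a minimal inference-only sketch with the transformers library follows. The public SST-2 checkpoint is a stand-in, since no specific model is named in the posting.

```python
# Requires: pip install transformers torch
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Public SST-2 checkpoint used purely as a stand-in for a project-specific fine-tuned model.
MODEL_NAME = "distilbert-base-uncased-finetuned-sst-2-english"
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME)
model.eval()

def classify(text: str) -> dict:
    """Tokenize, run a forward pass, and map the top logit back to its label."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    probs = torch.softmax(logits, dim=-1).squeeze()
    label_id = int(torch.argmax(probs))
    return {"label": model.config.id2label[label_id], "score": float(probs[label_id])}

print(classify("The cabin noise at highway speed is noticeably lower than in last year's model."))
```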
Posted 1 week ago
2.0 years
0 Lacs
Jaipur, Rajasthan, India
On-site
Job Title Business Development Executive (BDE) + Counsellor Company REGex Software Location Jaipur (On-site) Employment Type Full-Time Experience Level 1–2 Years (EdTech experience preferred) About the Company REGex Software is a rapidly growing EdTech company focused on bridging the digital skills gap through industry-aligned IT training and placement programs. Our mission is to help learners — regardless of their background — transform their careers through practical knowledge, industry-recognized certifications, and comprehensive career support. Role Overview We are looking for an energetic and motivated Business Development Executive + Counsellor with 1–2 years of experience. This role is ideal for someone who can engage with students, support the business development team, and assist in managing day-to-day operations of training programs. Key Responsibilities 1. Student Counselling & Sales Interact with prospective students to understand their background and career interests Share program details and guide them in selecting the right training Follow up through calls, WhatsApp, and emails Maintain communication logs and support the team in meeting enrollment goals 2. Business Development Support Assist in lead generation via calls and social media outreach Support senior team members in building connections with: • Colleges and training institutes • Companies and recruiters for placement partnerships Help coordinate outreach events like info sessions, webinars, or college visits 3. Program Coordination & Student Support Help onboard students and ensure smooth access to resources Coordinate with trainers, operations, and placement teams to resolve student queries Track attendance, engagement, and share feedback to improve experience 4. Team Collaboration & Learning Work closely with senior counsellors and BDEs to learn and grow Maintain accurate CRM or tracking records for leads and students Attend team meetings and contribute to internal tasks and activities Requirements 1–2 years of relevant experience Excellent communication and interpersonal skills Strong follow-up and coordination abilities Basic knowledge of tools like MS Office, Google Sheets, and CRMs
Posted 2 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Lead the planning, design, and execution of Applied GenAI projects, aligning them with business objectives and technical requirements. Collaborate with cross-functional teams including engineers, data scientists, and business stakeholders to deliver impactful GenAI solutions. Provide technical leadership and mentorship , fostering a culture of innovation and continuous improvement. Conduct thorough assessments of client needs to design tailored GenAI strategies addressing specific business challenges. Configure GenAI models using prompt engineering, keyword tuning, rules, preferences, and weightages for customer-specific datasets. Oversee deployment and integration of GenAI models into production environments, ensuring scalability, reliability, and performance. Demonstrate strong troubleshooting abilities using tools such as SQL, Kibana Logs, and Azure AppInsights. Monitor solution performance and provide data-driven recommendations for enhancements and optimization. Stay current with the latest GenAI advancements , incorporating best practices into implementation. Prepare and present reports, documentation, and demos to clients and senior leadership, showcasing progress and insights. Conduct GenAI proof-of-concepts and demonstrations exploring the art of the possible in AI applications. Qualifications Bachelor’s or Master’s degree in Computer Science, Engineering , or a related field. Extensive experience in AI/ML with a focus on Generative AI technologies . Proven success in leading and delivering complex AI projects. Strong understanding of GenAI frameworks, tools, and methodologies . Excellent problem-solving, strategic thinking , and project management skills. Exceptional communication and collaboration abilities with diverse teams and stakeholders. Experience with cloud platforms and AI deployments in cloud environments (Azure preferred). Preferred Skills Hands-on experience with GenAI-based products and prompt engineering . Proficiency in text analytics/NLP , including machine learning techniques and algorithms. Skilled in text mining, parsing, and classification using state-of-the-art techniques. Expertise in the Microsoft technology stack . Good knowledge of client-side scripting : JavaScript and jQuery. Understanding of ethical AI principles and best practices in implementation. Proficiency in Python, R, Java , and Regex expressions . Experience using requirement management tools like TFS. Strong verbal and written communication skills with the ability to influence peers and leadership .
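As one possible reading of "configure GenAI models using prompt engineering, keyword tuning, rules, preferences, and weightages", here is a dependency-free sketch that assembles a prompt from customer-specific rules and keyword weights. The rules, keywords, and weights are hypothetical, and the resulting prompt would be sent to whichever GenAI endpoint the project uses.

```python
# Hypothetical rules, keywords, and weightages for a customer-specific dataset.
RULES = [
    "Answer only from the provided context.",
    "Respond in formal English and cite the section you used.",
]
KEYWORD_WEIGHTS = {"warranty": 3, "refund": 2, "shipping": 1}

def build_prompt(question: str, context: str) -> str:
    """Assemble a prompt that encodes rules and keyword priorities as plain instructions."""
    ranked = sorted(KEYWORD_WEIGHTS, key=KEYWORD_WEIGHTS.get, reverse=True)
    return "\n".join([
        "System rules:",
        *[f"- {rule}" for rule in RULES],
        f"Give highest attention to these terms, in order: {', '.join(ranked)}.",
        f"Context:\n{context}",
        f"Question: {question}",
    ])

print(build_prompt(
    "Is the battery covered?",
    "Batteries are covered under a two-year limited warranty.",
))
```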
Posted 2 weeks ago
0 years
0 Lacs
Chennai
On-site
The primary expectation for this role as a Linguist for the linguistics team is proficiency in Hebrew, enabling you to effectively manage, develop, and optimize linguistic resources. Your role will be to foster this language and develop them for a multitude of products delivered to customers. Your job will be to build and maintain these languages per our Lightcast standards and help in the development of further features. To fill this role we are looking for a dynamic and multilingual person that will quickly learn the ins and outs of the role in order to become an active part of a multicultural team. Major Responsibilities: Analyze and improve data quality of multilingual text classifiers Translate various taxonomies such as Skills, Titles, and Occupations. Annotate data used for model training and validation Education and Experience: Bachelor’s degree in Linguistics, Data Analytics, Engineering, Computer Science, Statistics, Artificial Intelligence, NLP or similar. Strong linguistics knowledge Skills/Abilities: Understanding of syntax and structural analysis of languages Microsoft Excel experience (including vlookups, data cleanup, and functions) Experience with data analysis using tools such as Excel Knowledge of RegEx is preferred Lightcast is a global leader in labor market insights with headquarters in Moscow (ID) with offices in the United Kingdom, Europe, and India. We work with partners across six continents to help drive economic prosperity and mobility by providing the insights needed to build and develop our people, our institutions and companies, and our communities. Lightcast is proud to be an equal opportunity workplace and is committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. Lightcast has always been, and always will be, committed to diversity, equity and inclusion. We seek dynamic professionals from all backgrounds to join our teams, and we encourage our employees to bring their authentic, original, and best selves to work.
Posted 2 weeks ago
0 years
0 Lacs
Kochi, Kerala, India
On-site
Job brief: We are looking for a candidate who has experience as a DevOps engineer in creating systems software and analyzing data to improve existing systems or drive new innovation, along with developing and maintaining scalable applications, and monitoring, troubleshooting, and resolving issues, including deployments in multiple environments. The candidate must be well-versed in computer systems and network functions. They should be able to work diligently and accurately and should have great problem-solving ability in order to fix issues and keep the client’s business functioning.
Main Responsibilities:
• Develop research programs incorporating current developments to improve existing products and study the potential of new products.
• Research, design, and evaluate materials, assemblies, processes, and equipment.
• Document all phases of research and development.
• Establish and maintain testing procedures for assessing raw materials, in-process, and finished products.
• Assess the scope of research projects and ensure they are delivered on time with result-oriented outcomes.
• Be present at industry conferences on research topics of interest.
• Understand customer expectations for to-be manufactured products.
• Identify and evaluate new technologies that help in building better products or services.
• Maintain user guides and technical documentation.
• Create impactful demonstrations to showcase emerging security technologies.
• Design and build services with a focus on business value and usability.
• Assist in keeping the SIEM platform up to date and contribute to security strategies as and when new threats emerge.
• Stay up to date with emerging security threats, including applicable regulatory security requirements.
• Other responsibilities and additional duties as assigned by the security management team or service delivery manager.
Skills Must-haves:
• ELK development experience
• Dev or DevOps experience on AWS cloud, containers, and serverless code
• Development stack of Wazuh and ELK
• Implement DevOps best practices
• Toolset knowledge required for parser/use case development and plugin customisation – Regex, Python, YAML, XML
• Hands-on experience in DevOps
• Experience with Linux and monitoring/logging tools such as Splunk; strong scripting skills
• Researching and designing new software systems, websites, programs, and applications
• Writing and implementing clean, scalable code
• Troubleshooting and debugging code
• Verifying and deploying software systems
• Evaluating user feedback
• Recommending and executing program improvements
• Maintaining software code and security systems
• Knowledge of cloud systems (AWS, Azure)
• Excellent communication skills
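A minimal sketch of the parser/use-case development named in the must-have skills (Regex, Python, YAML): regex rules are loaded from YAML, compiled once, and applied to log lines to extract fields. The rule name, pattern, and sample log line are illustrative assumptions, not Wazuh or ELK specifics.

```python
# Requires: pip install pyyaml
import re
import yaml

# Illustrative rule file content; real deployments would keep this in a separate YAML file.
RULES_YAML = """
rules:
  - name: ssh_failed_login
    severity: high
    pattern: 'Failed password for (?P<user>\\S+) from (?P<src_ip>\\d+\\.\\d+\\.\\d+\\.\\d+)'
"""

def load_rules(text: str):
    """Compile each rule's regex once so it can be applied to many log lines."""
    return [
        {**rule, "regex": re.compile(rule["pattern"])}
        for rule in yaml.safe_load(text)["rules"]
    ]

def apply_rules(line: str, rules):
    """Return the first matching rule's name, severity, and extracted fields."""
    for rule in rules:
        match = rule["regex"].search(line)
        if match:
            return {"rule": rule["name"], "severity": rule["severity"], **match.groupdict()}
    return None

rules = load_rules(RULES_YAML)
print(apply_rules("sshd[88]: Failed password for root from 192.0.2.10 port 22 ssh2", rules))
```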
Posted 2 weeks ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
The primary expectation for this role as a Linguist for the linguistics team is proficiency in Korean, enabling you to effectively manage, develop, and optimize linguistic resources. Your role will be to foster this language and develop them for a multitude of products delivered to customers. Your job will be to build and maintain these languages per our Lightcast standards and help in the development of further features. To fill this role we are looking for a dynamic and multilingual person that will quickly learn the ins and outs of the role in order to become an active part of a multicultural team. Major Responsibilities: Analyze and improve data quality of multilingual text classifiers Translate various taxonomies such as Skills, Titles, and Occupations. Annotate data used for model training and validation Education and Experience: Bachelor’s degree in Linguistics, Data Analytics, Engineering, Computer Science, Statistics, Artificial Intelligence, NLP or similar. Strong linguistics knowledge Skills/Abilities: Understanding of syntax and structural analysis of languages Microsoft Excel experience (including vlookups, data cleanup, and functions) Experience with data analysis using tools such as Excel Knowledge of RegEx is preferred Lightcast is a global leader in labor market insights with headquarters in Moscow (ID) with offices in the United Kingdom, Europe, and India. We work with partners across six continents to help drive economic prosperity and mobility by providing the insights needed to build and develop our people, our institutions and companies, and our communities. Lightcast is proud to be an equal opportunity workplace and is committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. Lightcast has always been, and always will be, committed to diversity, equity and inclusion. We seek dynamic professionals from all backgrounds to join our teams, and we encourage our employees to bring their authentic, original, and best selves to work.
Posted 2 weeks ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
The primary expectation for this role as a Linguist for the linguistics team is proficiency in Dutch, enabling you to effectively manage, develop, and optimize linguistic resources. Your role will be to foster this language and develop them for a multitude of products delivered to customers. Your job will be to build and maintain these languages per our Lightcast standards and help in the development of further features. To fill this role we are looking for a dynamic and multilingual person that will quickly learn the ins and outs of the role in order to become an active part of a multicultural team. Major Responsibilities: Analyze and improve data quality of multilingual text classifiers Translate various taxonomies such as Skills, Titles, and Occupations. Annotate data used for model training and validation Education and Experience: Bachelor’s degree in Linguistics, Data Analytics, Engineering, Computer Science, Statistics, Artificial Intelligence, NLP or similar. Strong linguistics knowledge Skills/Abilities: Understanding of syntax and structural analysis of languages Microsoft Excel experience (including vlookups, data cleanup, and functions) Experience with data analysis using tools such as Excel Knowledge of RegEx is preferred Lightcast is a global leader in labor market insights with headquarters in Moscow (ID) with offices in the United Kingdom, Europe, and India. We work with partners across six continents to help drive economic prosperity and mobility by providing the insights needed to build and develop our people, our institutions and companies, and our communities. Lightcast is proud to be an equal opportunity workplace and is committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. Lightcast has always been, and always will be, committed to diversity, equity and inclusion. We seek dynamic professionals from all backgrounds to join our teams, and we encourage our employees to bring their authentic, original, and best selves to work.
Posted 2 weeks ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
The primary expectation for this role as a Linguist for the linguistics team is proficiency in Hebrew, enabling you to effectively manage, develop, and optimize linguistic resources. Your role will be to foster this language and develop them for a multitude of products delivered to customers. Your job will be to build and maintain these languages per our Lightcast standards and help in the development of further features. To fill this role we are looking for a dynamic and multilingual person that will quickly learn the ins and outs of the role in order to become an active part of a multicultural team. Major Responsibilities: Analyze and improve data quality of multilingual text classifiers Translate various taxonomies such as Skills, Titles, and Occupations. Annotate data used for model training and validation Education and Experience: Bachelor’s degree in Linguistics, Data Analytics, Engineering, Computer Science, Statistics, Artificial Intelligence, NLP or similar. Strong linguistics knowledge Skills/Abilities: Understanding of syntax and structural analysis of languages Microsoft Excel experience (including vlookups, data cleanup, and functions) Experience with data analysis using tools such as Excel Knowledge of RegEx is preferred Lightcast is a global leader in labor market insights with headquarters in Moscow (ID) with offices in the United Kingdom, Europe, and India. We work with partners across six continents to help drive economic prosperity and mobility by providing the insights needed to build and develop our people, our institutions and companies, and our communities. Lightcast is proud to be an equal opportunity workplace and is committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. Lightcast has always been, and always will be, committed to diversity, equity and inclusion. We seek dynamic professionals from all backgrounds to join our teams, and we encourage our employees to bring their authentic, original, and best selves to work.
Posted 2 weeks ago
3.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

CMS-TDR Senior
As part of our EY cyber security team, you will work as a Senior Analyst, helping clients detect and respond to security incidents with the support of their SIEM, EDR, and NSM solutions.

The Opportunity
We're looking for a Security Analyst with expertise in SIEM, EDR, and NSM solutions.

Your Key Responsibilities
Provide operational support using SIEM solutions (Splunk, Sentinel), EDR (CrowdStrike, Defender, Carbon Black), and NSM (Fidelis, ExtraHop) for multiple customers.
Specialize in second-level incident validation and more detailed investigation.
Perform incident coordination and communication with clients to ensure effective containment, eradication, and recovery.
Provide SIEM support activities, including ad hoc reporting and basic troubleshooting.
Advise customers on best practices and use cases for these solutions to achieve their end-state requirements.
Provide near-real-time analysis, investigation, reporting, remediation, coordination, and tracking of security-related activities for customers.

Skills and Attributes for Success
Customer-service oriented: meets commitments to customers and seeks feedback to identify improvement opportunities.
Good knowledge of SIEM technologies such as Splunk and Azure Sentinel from a security analyst's point of view.
Ability to troubleshoot issues associated with a SIEM solution.
Ability to work with minimal supervision or oversight.
Exposure to IoT/OT monitoring (Claroty, Nozomi Networks, etc.) is a plus.
Good knowledge of and experience in security monitoring.
Good knowledge of and experience in cyber incident response.
Knowledge of the ELK Stack.
Knowledge of network monitoring platforms such as Fidelis XPS and ExtraHop.
Knowledge of endpoint protection tools, techniques, and platforms such as Carbon Black, Tanium, CrowdStrike, and Defender.

To qualify for the role, you must have
B.Tech./B.E. with sound technical skills.
Ability to work in 24x7 shifts.
Strong command of verbal and written English.
Both technical acumen and critical thinking abilities.
Strong interpersonal and presentation skills.
A minimum of 3 years of hands-on experience with SIEM/EDR/NSM solutions.
Certification in any of the SIEM platforms.
Knowledge of RegEx, Perl scripting, and the SQL query language.
Certification: CCSA, CEH, CISSP, GCIH, or GIAC.

Ideally, you'll also have
People/project management skills.

What Working at EY Offers
At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange.
Plus, we offer:
Support, coaching, and feedback from some of the most engaging colleagues around.
Opportunities to develop new skills and progress your career.
The freedom and flexibility to handle your role in a way that's right for you.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
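The RegEx and log-analysis requirements in this role typically come down to parsing raw security events into structured fields for investigation. Below is a minimal, hedged sketch in Python; the sample log lines and field names are invented for illustration and are not EY or customer tooling:

import re
from collections import Counter

# Illustrative syslog-style authentication events (made up for this sketch).
events = [
    "Jan 12 10:01:44 host1 sshd[311]: Failed password for root from 203.0.113.7 port 52100 ssh2",
    "Jan 12 10:01:50 host1 sshd[311]: Failed password for admin from 203.0.113.7 port 52111 ssh2",
    "Jan 12 10:02:05 host2 sshd[412]: Accepted password for deploy from 198.51.100.4 port 40022 ssh2",
]

pattern = re.compile(
    r"(?P<status>Failed|Accepted) password for (?P<user>\S+) from (?P<src_ip>\S+)"
)

failed_by_ip = Counter()
for line in events:
    m = pattern.search(line)
    if m and m.group("status") == "Failed":
        failed_by_ip[m.group("src_ip")] += 1

# Surface candidate brute-force sources for second-level incident validation.
print(failed_by_ip.most_common())  # [('203.0.113.7', 2)]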
Posted 2 weeks ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Particle41 is seeking a talented and versatile Data Engineer to join our innovative team. As a Data Engineer, you will play a key role in designing, building, and maintaining robust data pipelines and infrastructure to support our clients' data needs. You will work on end-to-end data solutions, collaborating with cross-functional teams to ensure high-quality, scalable, and efficient data delivery. This is an exciting opportunity to contribute to impactful projects, solve complex data challenges, and grow your skills in a supportive and dynamic environment.

In This Role, You Will:

Software Development
Design, develop, and maintain scalable ETL (Extract, Transform, Load) pipelines to process large volumes of data from diverse sources.
Build and optimize data storage solutions, such as data lakes and data warehouses, to ensure efficient data retrieval and processing.
Integrate structured and unstructured data from various internal and external systems to create a unified view for analysis.
Ensure data accuracy, consistency, and completeness through rigorous validation, cleansing, and transformation processes.
Maintain comprehensive documentation for data processes, tools, and systems while promoting best practices for efficient workflows.

Requirements Gathering and Analysis
Collaborate with product managers and other stakeholders to gather requirements and translate them into technical solutions.
Participate in requirement analysis sessions to understand business needs and user requirements.
Provide technical insights and recommendations during the requirements-gathering process.

Agile Development
Participate in Agile development processes, including sprint planning, daily stand-ups, and sprint reviews.
Work closely with Agile teams to deliver software solutions on time and within scope.
Adapt to changing priorities and requirements in a fast-paced Agile environment.

Testing and Debugging
Conduct thorough testing and debugging to ensure the reliability, security, and performance of applications.
Write unit tests to validate the functionality of developed features and individual elements.
Write integration tests to ensure the different elements of an application function together as intended and meet requirements.
Identify and resolve software defects, code smells, and performance bottlenecks.

Continuous Learning and Innovation
Stay updated with the latest technologies and trends in full-stack development.
Propose innovative solutions to improve the performance, security, scalability, and maintainability of applications.
Continuously seek opportunities to optimize and refactor the existing codebase for better efficiency.
Stay up to date with cloud platforms such as AWS, Azure, and Google Cloud Platform.

Collaboration
Collaborate effectively with cross-functional teams, including testers and product managers.
Foster a collaborative and inclusive work environment where ideas are shared and valued.

Skills and Experience We Value:
Bachelor's degree in computer science, engineering, or a related field.
Proven experience as a Data Engineer, with a minimum of 3 years of experience.
Proficiency in the Python programming language.
Experience with database technologies such as SQL (e.g., MySQL, PostgreSQL) and NoSQL (e.g., MongoDB) databases.
Strong understanding of programming libraries, frameworks, and technologies such as Flask and other API frameworks, data warehousing/lakehouse principles, databases and ORMs, and data analysis with Databricks, pandas, Spark/PySpark, machine learning, OpenCV, and scikit-learn.
Utilities and tools: logging, requests, subprocess, regex, pytest; ELK Stack, Redis, and distributed task queues.
Strong understanding of data warehousing/lakehouse principles and concurrent/parallel processing concepts.
Familiarity with at least one cloud data engineering stack (Azure, AWS, or GCP) and the ability to quickly learn and adapt to new ETL/ELT tools across cloud providers.
Familiarity with version control systems like Git and collaborative development workflows.
Competence working on Linux and writing shell scripts.
Solid understanding of software engineering principles, design patterns, and best practices.
Excellent problem-solving and analytical skills, with keen attention to detail.
Effective written and verbal communication skills and the ability to collaborate in a team environment.
Adaptability and willingness to learn new technologies and tools as needed.

About Particle41
Our core values of Empowering, Leadership, Innovation, Teamwork, and Excellence drive everything we do to achieve the ultimate outcomes for our clients: Empowering Leadership for Innovation in Teamwork with Excellence (ELITE).
E - Empowering: Enabling individuals to reach their full potential
L - Leadership: Taking initiative and guiding each other toward success
I - Innovation: Embracing creativity and new ideas to stay ahead
T - Teamwork: Collaborating with empathy to achieve common goals
E - Excellence: Striving for the highest quality in everything we do
We seek team members who embody these values and are committed to contributing to our mission. Particle41 welcomes individuals from all backgrounds who are committed to our mission and values. We provide equal employment opportunities to all employees and applicants, ensuring that hiring and employment decisions are based on merit and qualifications without discrimination based on race, color, religion, caste, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, local, or international laws. This policy applies to all aspects of employment and hiring. We appreciate your interest and encourage qualified applicants to apply. If you need any assistance during the application or interview process, please reach out to us at careers@Particle41.com.
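To make the ETL pipeline responsibilities above concrete, here is a minimal extract-transform-load sketch in Python using pandas, one of the libraries the posting lists. The file name, column names, and SQLite target are illustrative assumptions, not Particle41 or client systems:

import sqlite3
import pandas as pd

def run_pipeline(csv_path: str, db_path: str) -> int:
    """Extract raw orders from CSV, clean and aggregate them, and load them into a warehouse table."""
    # Extract: read the raw source file (assumed schema: order_id, customer, amount, order_date).
    raw = pd.read_csv(csv_path, parse_dates=["order_date"])

    # Transform: validate and cleanse before aggregation.
    cleaned = raw.dropna(subset=["order_id", "amount"]).drop_duplicates("order_id")
    cleaned = cleaned[cleaned["amount"] > 0]
    daily = (
        cleaned.groupby(cleaned["order_date"].dt.date)["amount"]
        .sum()
        .reset_index(name="daily_revenue")
    )

    # Load: write the aggregate into a local SQLite "warehouse" table.
    with sqlite3.connect(db_path) as conn:
        daily.to_sql("daily_revenue", conn, if_exists="replace", index=False)
    return len(daily)

if __name__ == "__main__":
    print(run_pipeline("orders.csv", "warehouse.db"), "daily rows loaded")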
Posted 2 weeks ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
About Our Team
The Content and Data Analytics team is part of DataOps, an integral part of Global Operations at Elsevier. We provide data analysis services, primarily using Databricks, and mostly serve product owners and data scientists of Elsevier's Research Data Platform. Our work contributes to the delivery of leading data analytics products for the world of scientific research, including Scopus and SciVal.

About the Role
As a Senior Data Analyst, you have a solid understanding of best practices and can execute projects and initiatives without supervision. You are able to create advanced-level insights and recommendations and to lead highly complex analytics efforts, mostly independently.

Responsibilities
The Senior Data Analyst will support data scientists working within the Domains of the Research Data Platform. Domains are functional units responsible for delivering one or more data products, often through data science algorithms. Supporting this work can lead to a wide range of analytical activities. For example, you may be asked to dive into large datasets to answer questions from product owners or data scientists; to perform large-scale data preparation in order to test hypotheses or support prototypes; or to review the precision and recall of data science algorithms at scale and surface these as dashboard metrics. You will need a keen eye for detail, good analytical skills, and expertise in at least one data analysis system. Above all, you will need curiosity, dedication to high-quality work, and an interest in the world of scientific research and the products that Elsevier creates to serve it. Because you will communicate with a range of stakeholders around the world, we ask candidates to demonstrate a high level of English.

Requirements
Minimum work experience of 5 years
Coding skills in at least one programming language (preferably Python) and SQL
Familiarity with common string manipulation functions or libraries such as regular expressions (regex)
Prior exposure to data analysis in tabular form, for example with Pandas or Apache Spark/Databricks
Experience using basic statistics relevant to data science, such as precision, recall, and statistical significance
Knowledge of visualization tools such as Tableau/Power BI is a plus
Experience working with Agile tools such as JIRA is a plus

Stakeholder Management
Build and maintain strong relationships with Data Scientists and Product Managers.
Align activities with Data Scientists and Product Managers.
Present achievements and project status updates, both written and verbal, to various stakeholders.

Competencies
Collaborates well and works effectively as part of a team
Takes initiative and is proactive in suggesting approaches or solutions to problems
Drives for results by taking a task to a polished conclusion

Key Results
Understand the requirements of a given task
Identify, gather, prepare, and refine data
Interpret and understand large data sets
Report findings to stakeholders through effective storytelling
Formulate recommendations and requirements
Identify and address new opportunities

Way that Works for You
We promote a healthy work-life balance across the organization. We offer numerous well-being initiatives, shared parental leave, study assistance, and sabbaticals to help you meet both your immediate responsibilities and long-term goals.
Working for You
We understand that your well-being and happiness are essential to a successful career. Here are some of the benefits we offer:
Comprehensive Health Insurance
Enhanced Health Insurance Options
Group Life Insurance
Group Accident Insurance
Flexible Working Arrangements
Employee Assistance Program
Medical Screening
Modern Family Benefits, including maternity, paternity, and adoption support
Long Service Awards
New Baby Gift
Subsidized Meals (location-specific)
Various Paid Time Off options, including Casual Leave, Sick Leave, Privilege Leave, Compassionate Leave, Special Sick Leave, and Gazetted Public Holidays
Free Transport for home-office-home commutes (location-specific)

About the Business
We are a global leader in information and analytics, assisting researchers and healthcare professionals in advancing science and improving health outcomes. We combine quality information and extensive data sets with analytics to support science and research, health education, and interactive learning. At our company, your work contributes to addressing the world's grand challenges and fostering a sustainable future. We use innovative technologies to support science and healthcare; partner with us for a better world.
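The role asks for working knowledge of precision and recall when reviewing data science algorithms at scale. A minimal sketch in Python with pandas; the labels and column names are illustrative assumptions, not Elsevier data:

import pandas as pd

# Illustrative evaluation set: algorithm predictions vs. human-reviewed ground truth.
df = pd.DataFrame({
    "predicted_match": [1, 1, 0, 1, 0, 1, 0, 0],
    "actual_match":    [1, 0, 0, 1, 1, 1, 0, 0],
})

tp = int(((df["predicted_match"] == 1) & (df["actual_match"] == 1)).sum())
fp = int(((df["predicted_match"] == 1) & (df["actual_match"] == 0)).sum())
fn = int(((df["predicted_match"] == 0) & (df["actual_match"] == 1)).sum())

precision = tp / (tp + fp)  # share of predicted matches that are correct
recall = tp / (tp + fn)     # share of true matches that were found

print(f"precision={precision:.2f} recall={recall:.2f}")  # precision=0.75 recall=0.75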
Posted 2 weeks ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Freshworks makes it fast and easy for businesses to delight their customers and employees. We do this by taking a fresh approach to building and delivering software that is affordable, quick to implement, and designed for the end user. Headquartered in San Mateo, California, Freshworks has a global team operating from 13 locations to serve more than 65,000 companies, from startups to public companies, that rely on Freshworks software-as-a-service to enable a better customer experience (CRM, CX) and employee experience (ITSM). Freshworks' cloud-based software suite includes Freshdesk (omnichannel customer support), Freshsales (sales automation), Freshmarketer (marketing automation), Freshservice (IT service desk), and Freshchat (AI-powered bots), supported by Neo, our underlying platform of shared services. Freshworks is featured in global and national press, including CNBC, Forbes, Fortune, and Bloomberg, and has been a BuiltIn Best Place to Work in San Francisco and Denver for the last three years. Our customer ratings have earned Freshworks products TrustRadius Top Rated Software ratings and G2 Best of Awards for Best Feature Set, Best Value for the Price, and Best Relationship. Device42, a Freshworks company, is the most trusted, advanced, and complete full-stack agentless discovery, dependency mapping, CMDB, and IT Asset Management platform for hybrid cloud.

Job Description
We are looking for a technical, highly motivated, and dynamic IT Product Information & Lead Research Analyst to play a key role in our growing team. This hands-on position is ideal for a self-starter who enjoys research, data curation, and taxonomy design, with a unique opportunity to transition into a leadership role as the team expands. The team is responsible for building and maintaining a comprehensive, accurate, and curated product catalog of all software and hardware technology products sold and deployed globally.

The Lead Product Research Analyst will be responsible for:
Leading and coaching a team of research analysts working on technology data mining and curation.
Conducting research on technology data and content about software and hardware products, including vendors, manufacturers, product suites, and lifecycle data.
Collecting, analyzing, and curating information to enrich our technology catalog, ensuring accuracy and consistency.
Overseeing the maintenance of a technology data catalog, ensuring comprehensive and up-to-date content.
Designing and managing taxonomies for software and hardware products.
Developing and enforcing rules for normalizing and mapping data between source systems.
Defining and implementing end-to-end research workflows, including methodologies, training programs, and quality control measures.
Responding to customer and internal requests to enrich catalog content and resolving content-related issues promptly.
Acting as the subject matter expert (SME) on technology data, working directly with customers and partners to address data-related issues and enhancement requests.
Collaborating with internal stakeholders, including engineers and product teams, to translate business requirements into new data offerings and features.

Qualifications
Bachelor's degree in computer science, engineering, data management, or a related field.
Experience with Master Data Management or Product Information Management platforms.
Proficiency in writing regular expressions (Regex) - required.
Familiarity with database tools, data entry systems, and processes.
In-depth, subject-matter-expert-level understanding of how software is named, titled, and released by commercial and open-source teams, including EOL and vulnerability disclosures.
Strong background in researching and managing technical content, with an eye for detail.
Ability to identify gaps in data and develop proactive plans to close them.
Experience designing and managing taxonomies for technical products.

Additional Information
At Freshworks, we are creating a global workplace that enables everyone to find their true potential, purpose, and passion, irrespective of their background, gender, race, sexual orientation, religion, or ethnicity. We are committed to providing equal opportunity for all and believe that diversity in the workplace creates a more vibrant, richer work environment that advances the goals of our employees, communities, and the business.
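Because Regex proficiency is called out as required, here is a hedged sketch of the kind of normalization rule the catalog curation work implies. The product strings and the canonical mapping are invented for illustration, not Device42 catalog rules:

import re

# Invented examples of how the same product might appear across source systems.
raw_titles = [
    "Microsoft SQL Server 2019 Enterprise Edition (x64)",
    "MS SQL Server 2019 Std",
    "microsoft sql server 2017",
]

# One illustrative normalization rule: capture the product family and major release year.
pattern = re.compile(r"(?i)\b(?:microsoft|ms)\s+sql\s+server\s+(?P<version>\d{4})")

for title in raw_titles:
    m = pattern.search(title)
    if m:
        print(title, "->", f"Microsoft SQL Server {m.group('version')}")
    else:
        print(title, "-> unmapped (needs manual curation)")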
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
thiruvananthapuram, kerala
On-site
We are seeking skilled and motivated Elasticsearch Engineers to join our team focused on delivering robust search, analytics, and reporting solutions using the ELK Stack (Elasticsearch, Logstash, Kibana). Your role will involve designing, implementing, and maintaining scalable data pipelines and search views to enhance modern observability and reporting capabilities. This position is perfect for engineers who are excited to work with cutting-edge search technologies and contribute to improving system visibility and data-driven decision-making.

Key Responsibilities:
- Collaborate within the Scrum team during the MVP phase.
- Develop and maintain data ingestion pipelines into Elasticsearch from diverse sources.
- Guide and support development teams in enhancing application logging strategies.
- Design and maintain customer-facing search views using Kibana or similar tools.
- Monitor and manage Elasticsearch indexes (e.g., size, performance, growth).
- Assist in the deployment and maintenance of Elasticsearch on Linux environments.
- Work with IT operations to define long-term hosting and scaling strategies.

Required Skills & Qualifications:
- Experience with Elasticsearch / the ELK Stack.
- Good understanding of SQL and NoSQL data models.
- Scripting abilities in Python or Ruby for data transformation and ingestion.
- Knowledge of Regex for building and tuning log parsers.
- 5-8 years of programming experience with C# and/or Java.
- Basic Linux system administration knowledge.
- Understanding of full-text search (FTS) concepts, including token filters/tokenizers (preferred).

If you are interested in one of our vacancies, please send your CV and motivation to hrtvm@arstraffic.com. For more information about working at ARS, please contact us at 0471 6616755.
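As an illustration of the ingestion and log-parsing work described, here is a minimal sketch using the official Python Elasticsearch client. The log format, index name, and local cluster URL are assumptions for the example, not ARS infrastructure:

import re
from datetime import datetime, timezone
from elasticsearch import Elasticsearch

# Assumed local development cluster and an invented application log format.
es = Elasticsearch("http://localhost:9200")
LOG_PATTERN = re.compile(r"^(?P<level>INFO|WARN|ERROR)\s+(?P<component>\S+)\s+(?P<message>.*)$")

def ingest_line(line: str, index: str = "app-logs") -> None:
    """Parse one raw log line with regex and index the structured document."""
    m = LOG_PATTERN.match(line.strip())
    if not m:
        return  # unparsable lines would be routed to a dead-letter index in a real pipeline
    doc = m.groupdict() | {"@timestamp": datetime.now(timezone.utc).isoformat()}
    es.index(index=index, document=doc)

ingest_line("ERROR billing-service payment gateway timeout after 30s")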
Posted 2 weeks ago
8.0 - 13.0 years
27 - 35 Lacs
Bengaluru
Work from Office
Manage CyberArk PAM projects. Drive implementation, onboard privileged accounts, develop connectors/plugins, lead audits, handle automation, support DevOps onboarding, and interact with clients on controls, compliance, and operations.
Required Candidate Profile: CyberArk Project Manager with 10+ years in IT. Skilled in PAM, scripting, plugin development, audits, and tracking. CyberArk Defender certified. Strong in communication, automation, and stakeholder management.
Posted 2 weeks ago
8.0 - 10.0 years
19 - 25 Lacs
Bengaluru
Work from Office
Responsibilities
Run and maintain the 1E Tachyon platform backend infrastructure (on-prem / SaaS)
Improve Digital Employee Experience (DEX)
Communicate findings and recommendations clearly to internal stakeholders and clients
Support project planning, scoping, and execution with a focus on quality and efficiency
Develop custom instructions for workplace experience and self-heal capabilities for devices
Create and deploy Guaranteed State rules
Monitor and troubleshoot device stability and experience
Monitor Tachyon agent and server health
Integrate ITSM with Tachyon
Integrate desktop management (OS patching, software delivery)
Manage inventory for endpoints
Collaborate with other stakeholders to build, test, and release automation and self-heal capabilities
Work with the 1E vendor on any raised issues related to 1E solutions and the platform
Train support staff and help with technical issues as necessary
Work with client organizations to identify and address business needs
Design and execute SQL queries, dashboards, and reports using BI tools (e.g., Power BI, Tableau)
Leverage telemetry data to perform in-depth analysis of system performance, user behavior, and usage patterns to identify actionable insights and optimization opportunities
Ensure accurate and consistent reporting, adhering to data governance standards and aligning with internal and customer expectations
Create visually compelling, outcome-oriented PowerPoint presentations to communicate insights, recommendations, and project progress to internal stakeholders and clients
Analyze incident volume trends, knowledge base (KB) articles, and support operations data to uncover automation and process improvement opportunities
Collaborate with cross-functional teams to drive outcomes, ensuring timely delivery and adherence to project governance frameworks
Develop analytical frameworks and models to support strategic decision-making and continuous improvement initiatives
Maintain a high standard of data quality, integrity, and security across all analytics outputs and dashboards
Mentor and guide junior analysts and contribute to best practices
Lead cross-functional initiatives, coordinating with business and technical stakeholders to ensure timely, outcome-driven delivery of analytical insights
Own end-to-end delivery of analysis projects, from requirement gathering to final presentation, driving clarity and measurable impact
Set best practices, guide the analytics team in prioritizing tasks, and foster a culture of continuous improvement and proactive problem-solving
Monitor, validate, and analyze outputs from various Tachyon modules, ensuring they align with expected behaviors and deliver measurable value to customers

Desired Skills and Qualifications
B.Tech/B.E. in any specialization, MCA in any specialization, M.Tech in any specialization, or any Bachelor's degree
Experience working on 1E Tachyon
Experience working on other 1E tools such as Nomad, Shopping, NightWatchman, etc.
Working knowledge of desktop management technologies and processes, including Microsoft SCCM, Active Directory, networking, monitoring, patching, Group Policy, image development, and advanced desktop troubleshooting
Experience with Nexthink and SysTrack
Thorough knowledge of SQL/SQLite, PowerShell, RESTful services, web services, WMI, Regular Expressions (RegEx), and the Windows operating system; understanding of JSON and XML formats
Experience consuming APIs
Understanding of Windows server and client administration
Good understanding of Group Policy, DNS, and DHCP administration
Experience with enterprise endpoint management solutions, including modern management with Microsoft Intune, SCCM, and VMware Workspace ONE

Nice-to-Have Skills
Problem-solving attitude
Collaborative team spirit
Good communication skills, with the ability to represent the team in global meetings, forums, and customer discussions
Good understanding of network infrastructure
Knowledge of cloud technologies
Excellent problem-solving and analytical skills
Ability to work independently and as part of a team
Ability to multi-task, change direction, set priorities, and meet deadlines.
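The role combines RESTful services, RegEx, and telemetry reporting. Below is a small, hedged Python sketch of that general pattern; the endpoint URL, token, response shape, and naming convention are hypothetical placeholders and not the 1E Tachyon API:

import re
import requests

# Hypothetical endpoint and token; real platforms expose their own documented APIs.
BASE_URL = "https://example.internal/api/devices"
TOKEN = "REPLACE_ME"

def unhealthy_windows_devices() -> list[str]:
    """Fetch device records and keep unhealthy Windows hosts whose names match a site naming convention."""
    resp = requests.get(BASE_URL, headers={"Authorization": f"Bearer {TOKEN}"}, timeout=30)
    resp.raise_for_status()
    devices = resp.json()  # assumed shape: a list of {"name": ..., "os": ..., "healthy": ...}

    site_pattern = re.compile(r"^WIN-(PUN|BLR)-\d{4}$")  # invented device naming convention
    return [
        d["name"]
        for d in devices
        if d.get("os", "").startswith("Windows")
        and not d.get("healthy", True)
        and site_pattern.match(d.get("name", ""))
    ]

if __name__ == "__main__":
    print(unhealthy_windows_devices())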
Posted 2 weeks ago