30.0 years
0 Lacs
Pune, Maharashtra, India
On-site
OVERVIEW
TransPerfect helps organizations navigate the global marketplace. It remains founder-led and has striven to maintain the ethos and drive that have enabled it to grow organically year-on-year for the past 30 years. Starting in an NYU dorm room in the early 1990s, TransPerfect is today recognized as the largest translation and language services company in the world, with thousands of employees located in 100+ offices around the world and over $1 billion in revenue in 2023. The TransPerfect TechOps team has been a vital part of the company's success since its formation 10 years ago, delivering technology and services that have drastically simplified the lives of our clients and colleagues: from workflow improvements for colleagues, to core services that form the basis of the company's GlobalLink platform, to scalable client interfaces that allow novice users to navigate complex ecosystems.
About this Role: In this role, you will help build and enhance our existing web applications and set the standards for code and performance. You will help design and execute the projects that support the company roadmap, and collaborate to build core systems for scalability, reliability, and performance, ensuring that your product delivers an amazing experience for users.
This position offers:
- A significant leadership role in a dynamic, well-functioning technology division of the world's largest provider of language services and technology solutions
- An opportunity to set the standard in product direction and development
- The advantages of working in a team of subject matter experts developing new cutting-edge technology
- Outstanding financial rewards
DESCRIPTION
- Define project deliverables to support delivery of high-quality products on schedule that meet company needs and goals
- Ensure complete understanding of the development estimate and project scope; work with Product Managers and QA leads to assure development teams deliver projects within the framework of the given estimate and assumptions
- Manage software product releases and upgrades
- Provide support for continuous integration, test automation, source code control, and review processes
- Constantly evaluate processes and procedures for inefficiencies, make recommendations for improvement, and drive them through with stakeholders
- Perform diverse duties including analysis, design, development, maintenance, and support of complex applications, web services, and RESTful web services
- Coordinate all software delivery activities with Product Managers and act as the escalation point for all development issues specific to projects
- Work with the QA team to assure project quality and defined measurements of code quality
- Drive product strategy and vision to deliver high-quality products and/or features on schedule that meet the product needs and corporate goals
- Communicate project scope changes, prioritization reasoning, and decisions affecting product delivery
- Collaborate with stakeholders to assure timely completion of all tasks associated with the development process
- Provide constructive feedback to management staff during all phases of the software lifecycle to keep development priorities aligned with business needs
- Operate in an agile environment; communicate and manage internal and external implementation requirements and expectations
REQUIRED SKILLS
- Bachelor's degree preferred; computer science, information technology, math, or a related field preferred
- Strong experience with source control management (Git), continuous integration and deployment (Team Foundation Server, etc.), and best practices
- Experienced professional with 8+ years' experience with .NET, C#, MVC, AJAX, JSON, jQuery, SQL Server, and WCF, developing and/or supporting custom applications, their reporting tools, and integration points
- In-depth knowledge of the Microsoft platform (IIS, .NET, Web Services, SQL Server, Windows Server, Clustering, Active Directory, etc.)
- NoSQL database experience, an asset
- Demonstrated experience coaching and mentoring others to success, developing team members, and being a role model
- Good written and verbal communication skills with the ability to document and communicate technical information to IT professionals
- Good knowledge of techniques for planning, monitoring, and controlling projects
- Deep knowledge of the SDLC and Agile and Scrum methodologies
- Experience managing projects and teams using agile management tools such as Jira, Team Foundation Server, Confluence, Microsoft Project, Trello, etc.
- A problem solver's mentality with the ability to effectively communicate solutions and issues to stakeholders
Posted 2 weeks ago
15.0 years
0 Lacs
India
On-site
About the company
As the world's global leader in innovative and sustainable building materials, Holcim is reinventing how the world builds. With a presence in 60 countries and a global workforce of 60,000 across four key segments (cement, aggregates, ready-mix concrete, and solutions & products), Holcim is committed to building a greener, smarter, and healthier world. We are leading the way in reducing carbon emissions and accelerating the global transition toward low-carbon construction. To support this vision, we have established the Group Center of Excellence (CoE) for Advanced Analytics and AI, which focuses on data science, data engineering, cloud platform engineering, and business intelligence.
About the role
As a Data Science Specialist, you will be a core contributor to Holcim's global Advanced Analytics CoE. You will collaborate with cross-functional teams to build and deploy cutting-edge AI/ML solutions that drive operational excellence and business decision-making. This is an individual contributor role with potential for career progression into people management based on performance and impact.
Responsibilities
- Collaborate with business/domain SMEs to identify opportunities and develop predictive models for key business processes
- Select and apply suitable machine learning and deep learning algorithms, validate results, and derive actionable insights
- Work closely with product development teams to industrialize AI/ML models and rapidly prototype minimum viable solutions
- Conduct exploratory data analysis and hypothesis testing to derive business insights
- Manage end-to-end data science workflows including data acquisition, preprocessing, feature engineering, modeling, and optimization
- Design and implement generative AI solutions
- Build and deploy full-stack machine learning pipelines on cloud platforms such as AWS and GCP
- If required, adapt ML models for edge devices or mobile applications
Qualifications
Education
- BE/B.Tech in Computer Science, Engineering, or relevant STEM fields
- Graduate degree in Data Science or a related quantitative discipline is preferred
- Strong foundation in mathematics, statistics, and algebra
- AWS analytics certification is mandatory
Experience
- 12-15 years of total experience, with at least 8 years in analytics/data science
- Experience in manufacturing, building materials, process industries, or pharmaceuticals is highly desirable
- Proven experience designing and deploying generative AI solutions
- Hands-on expertise in statistical and machine learning techniques
Required Skills
- 8+ years of experience with machine learning and deep learning algorithms: decision trees, random forests, SVMs, regression, clustering, CNNs, RNNs, LSTMs, transformers, etc.
- 6+ years of programming experience in Python, PySpark, or equivalent languages for data manipulation and analysis
- 5+ years of working with cloud platforms, especially AWS, including experience deploying AI/ML solutions in production environments
- Strong knowledge of deep learning frameworks such as TensorFlow, Keras, or PyTorch
- Familiarity with business intelligence tools like QlikView and data engineering frameworks is a plus
Equal opportunity
Holcim is an equal opportunity employer. We value diversity and are committed to fostering an inclusive environment for all employees.
Posted 2 weeks ago
0 years
0 Lacs
India
Remote
Key Responsibilities:
- Provide support and maintenance for Dell VxRail environments.
- Manage and troubleshoot VMware infrastructure.
- Perform basic storage management tasks, including provisioning, monitoring, and maintenance.
- Implement and manage clustering solutions to ensure high availability and reliability.
- Collaborate with other IT team members to resolve technical issues and improve system performance.
- Document system configurations, procedures, and troubleshooting steps.
- Assist in the planning and execution of system upgrades and migrations.
Qualifications:
- Proven experience with Dell VxRail and VMware.
- Strong understanding of storage management principles and practices.
- Experience with clustering technologies and high-availability solutions.
- Proficiency in Windows and Linux technologies.
- Excellent problem-solving skills and attention to detail.
- Ability to work independently and as part of a team.
- Strong communication skills, both written and verbal.
- Relevant certifications (e.g., VMware Certified Professional) are a plus.
Preferred Skills:
- Familiarity with other virtualization technologies.
- Experience with network configuration and management.
- Knowledge of backup and disaster recovery solutions.
- Ability to script and automate tasks using PowerShell or other scripting languages.
Posted 2 weeks ago
3.0 - 5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Title: Data Scientist SDE-2, SDE-3 & SDE-4 (Staff Data Scientist)
Location: Noida & Bangalore
Job Type: Full Time
About us: PhysicsWallah is an Indian online education technology startup based in Delhi, originally created as a YouTube channel in 2014 by Mr. Alakh Pandey. We are the first company aiming to build an affordable online education platform for every Indian student who dreams of IIT & AIIMS but is unable to afford the existing offline/online education providers. We provide e-learning via our YouTube channel and the PhysicsWallah app/website, with lectures for JEE Mains and Advanced, NEET, and board exams. We are India's most-viewed educational channel on YouTube. YouTube Channel: https://youtube.com/c/PhysicsWallah
About the Role:
Qualification & Eligibility: Bachelor's or higher degree in a quantitative discipline (computer science, statistics, engineering, applied mathematics)
Working Experience:
- SDE-2: 3 to 5 years
- SDE-3: 5 to 7 years
- SDE-4 (Staff): 7 to 10 years
Startup experience preferred; EdTech work experience is a bonus
Roles & responsibilities:
- Help teams understand what data science can do for them and set the right expectations.
- Use deep learning, machine learning, and analytical methods to create scalable solutions for business problems.
- Create innovative solutions and applications utilising advanced NLP algorithms/architectures, including (but not limited to) LLMs, for tasks such as text generation, summarization, translation, entity extraction and concept recognition, clustering, and more.
- Contribute to the execution of our vision for NLP-based technology solutions using NLP toolkits like Hugging Face, spaCy, CoreNLP, OpenNLP, etc.
- Perform relevant data analysis and benchmark the NLP solutions to improve our offerings.
- Clearly communicate results and recommendations to various stakeholders.
- Evaluate the effectiveness of the solutions and improve upon them continuously.
- We expect you to have a mix of a strong technical background, the ability to understand the business implications of your work, and the ability to empathise with our users and work towards helping PhysicsWallah give them the best experience.
- Help and mentor junior members to become better data scientists.
Skill Sets:
- Experience in building NLP (ML/DL) models.
- Strong foundational knowledge of transformers (BERT, GPTs, T5s, etc.) and embeddings.
- Expertise in SQL and Python is a must.
- Hands-on experience with the latest GenAI models (GPTs, Mistral, Falcon, and LLaMA) and approaches (RAG, LangChain, etc.).
- Experience using machine learning libraries for structured, text, and audio/video data (preferably in Python) and deep learning frameworks (TensorFlow, PyTorch).
Good to have:
- Foundational knowledge of any cloud (AWS/Azure/GCP)
- Expertise in querying relational, non-relational, and graph databases.
- Experience with big data technologies (Spark, MapReduce, Pig, and Hive)
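The RAG approach named in the skill set above rests on one core step: ranking documents by vector similarity to a query. A minimal illustrative sketch (not from the posting; toy bag-of-words counts stand in for a real embedding model, and the documents are invented):

```python
import math
from collections import Counter

def embed(text):
    # Toy stand-in for a real embedding model: bag-of-words term counts.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=1):
    # Rank documents by similarity to the query and return the top k.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

docs = [
    "JEE Mains physics lecture on rotational motion",
    "NEET biology notes on human physiology",
    "Board exam chemistry revision plan",
]
print(retrieve("physics rotational motion lecture", docs))
```

A production stack would swap `embed` for a transformer embedding model and pass the top-k hits to an LLM as grounding context for generation.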
Posted 2 weeks ago
1.0 - 2.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description
Intern, Global Commercial Pharma Pipeline Analytics
Our Human Health Digital Data and Analytics team is innovating how we understand our patients and their needs. Working cross-functionally, we are inventing new ways of engaging and interacting with our customers and patients, leveraging digital, data, and analytics, and measuring the impact. The Internship Program at our company features Cooperative (Co-op) education that lasts up to 6 months and will include one or more projects. These opportunities in our Human Health division can provide you with great development and a chance to see if we are the right company for your long-term goals.
The program allows students to work on one or more data science projects within our Company's Human Health Commercial division. These opportunities are designed to facilitate the transition from academia to industry for soon-to-be graduates and will involve participation and contribution in real projects being carried out by our company's data science team. This is also a way for you to identify whether our company may be the right fit for your long-term career goals. These positions are typically 3-6 months long and have various start dates throughout the year.
Most of the projects will involve analyses in support of our company's commercial objectives. Candidates will be responsible for providing analytical and technical support, which includes the collection and analysis of internal and external pharmaceutical data to assist in making meaningful business decisions. Candidates will help solve novel commercial analytic questions through the use of Advanced Analytics (AI/ML).
Required Education And Skills
- Candidates must be currently enrolled in a bachelor's/master's/PhD degree program in a quantitative discipline such as Data Science, Statistics, Mathematics, Physics, Economics, Computational Biology, Engineering, Management, or another relevant discipline.
- Candidates must be expected to graduate in the next 1-2 years.
- Candidates must have fluency in at least one programming environment, such as SQL/Python, and a sound understanding of OOP
- Candidates must have familiarity with the basics of data science and advanced statistical methods (clustering, regression, classification, etc.) and the ability to determine the correct method for the task, or the willingness to learn
- Candidates must have demonstrated ability to problem-solve independently on complex analytical projects
- Candidates must have an interest in supporting pharmaceutical data and analytics initiatives like segmentation & targeting, digital analytics, big data analytics, and patient claims analytics
- Candidates will be required to serve as a quantitative methodology professional by developing creative analytical solutions and applying appropriate statistical and/or machine learning methodologies to answer novel commercial analytic questions
- Candidates should be effective oral and written communicators
- Candidates must be able to strike a balance between methodological rigor and project timelines/deliverables
Preferred Experience And Skills
- Knowledge of the MS Office package (Excel and PowerPoint) is a must
- Experience with one or more of the standard machine learning and data manipulation packages such as scikit-learn/NumPy/Pandas
- Familiarity with SQL & databases (Redshift/Athena)
- Familiarity with Natural Language Processing & Large Language Models
Planned Learning Experience And Skills
- Gain experience in healthcare data, insights, and analytics
- Therapeutic area experience in infectious diseases/HIV, immunology, cardio-metabolic, and/or ophthalmology diseases
- Global market exposure
Our Human Health Division maintains a "patient first, profits later" ideology. The organization is comprised of sales, marketing, market access, digital analytics, and commercial professionals who are passionate about their role in bringing our medicines to our customers worldwide.
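As a loose illustration of the clustering methods the requirements list, here is a plain k-means loop on a single synthetic feature (all values are invented for this sketch; real segmentation work would use scikit-learn and many features):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic one-dimensional feature with two well-separated groups
# (purely illustrative numbers, not real claims data).
spend = np.concatenate([rng.normal(100, 10, 50), rng.normal(500, 20, 50)])

def kmeans_1d(x, k=2, iters=20):
    # Plain k-means on a 1-D array: assign each point to its nearest
    # centroid, then move each centroid to the mean of its points.
    centroids = np.sort(rng.choice(x, size=k, replace=False))
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centroids[None, :]), axis=1)
        centroids = np.array([x[labels == j].mean() for j in range(k)])
    return np.sort(centroids)

c = kmeans_1d(spend)
print(np.round(c))  # two centroids, one near each group's mean
```

The point of the exercise is method selection: k-means suits compact, roughly spherical groups, while segmentation with skewed or categorical features calls for other techniques.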
Current Employees apply HERE
Current Contingent Workers apply HERE
Search Firm Representatives Please Read Carefully
Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs / resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails.
Employee Status: Intern/Co-op (Fixed Term)
Relocation:
VISA Sponsorship:
Travel Requirements:
Flexible Work Arrangements: Hybrid
Shift:
Valid Driving License:
Hazardous Material(s):
Required Skills: Business Intelligence (BI), Database Design, Data Engineering, Data Modeling, Data Science, Data Visualization, Machine Learning, Software Development, Stakeholder Relationship Management, Waterfall Model
Preferred Skills:
Job Posting End Date: 06/1/2025
A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date.
Requisition ID: R345715
Posted 2 weeks ago
5.0 years
0 Lacs
Coimbatore, Tamil Nadu, India
On-site
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Data Analysis & Interpretation
Good to have skills: Data Engineering
Minimum 5 year(s) of experience is required
Educational Qualification: Minimum 15 years of full-time education
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with teams to develop innovative solutions and ensure seamless application functionality.
Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead the team in implementing new technologies for application development
- Conduct regular code reviews to ensure quality and efficiency
- Stay updated on industry trends and best practices for application development
Professional & Technical Skills:
- Must-Have Skills: Proficiency in Data Analysis & Interpretation
- Good-To-Have Skills: Experience with Data Engineering
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity
Additional Information:
- The candidate should have a minimum of 5 years of experience in Data Analysis & Interpretation
- This position is based at our Bengaluru office
- A minimum of 15 years of full-time education is required
Posted 2 weeks ago
5.0 years
0 Lacs
Greater Kolkata Area
On-site
Job Title: S&C Global Network - AI - Merchandising Analytics - Manager
Management Level: 7 (Manager)
Location: Bengaluru, BDC7C
Must-have skills: Merchandising
Good to have skills: Ability to leverage design thinking, business process optimization, and stakeholder management skills.
Job Summary: This role involves driving strategic initiatives, managing business transformations, and leveraging industry expertise to create value-driven solutions.
Roles & Responsibilities: Provide strategic advisory services, conduct market research, and develop data-driven recommendations to enhance business performance.
Professional & Technical Skills:
- Relevant experience in the required domain.
- Strong analytical, problem-solving, and communication skills.
- Ability to work in a fast-paced, dynamic environment.
- Must have knowledge of SQL, R, and Python, and at least one cloud-based technology (Azure, AWS, GCP).
- Must have knowledge of building price/discount elasticity models and conducting non-linear optimization.
- Must have good knowledge of state space & mixed-effect modeling techniques and optimization algorithms, and their applicability to industry data.
- Must have AI capability migration experience from one cloud platform to another.
- Manage documentation of data models, architecture, and maintenance processes.
- Understanding of econometric/statistical modeling and analysis techniques such as regression analysis, hypothesis testing, multivariate statistical analysis, time series techniques, and optimization techniques, and statistical packages such as R, Python, SQL, and PySpark.
- Proficient in Excel, MS Word, PowerPoint, etc.
- Strong client and team management, and planning of large-scale projects with risk assessment.
Additional Information: Opportunity to work on innovative projects. Career growth and leadership exposure.
WHAT'S IN IT FOR YOU?
As part of our Analytics practice, you will join a worldwide network of over 20,000 smart and driven colleagues experienced in leading statistical tools, methods, and applications. From data to analytics and insights to actions, our forward-thinking consultants provide analytically informed, issue-based insights at scale to help our clients improve outcomes and achieve high performance. Accenture will continually invest in your learning and growth. You'll work with Data Driven Merchandising experts, and Accenture will support you in growing your own tech stack and certifications. In Applied Intelligence you will understand the importance of sound analytical decision-making and the relationship of tasks to the overall project, and execute projects in the context of a business performance improvement initiative.
What You Would Do In This Role
- Work through the phases of the project.
- Define data requirements for the Data Driven Merchandising capability.
- Clean, aggregate, analyze, and interpret data, and carry out data quality analysis.
- 5+ years of advanced experience in Data Driven Merchandising, which involves setting up Pricing/Promotions/Assortment Optimization capabilities across retail clients.
- Knowledge of price/discount elasticity estimation.
- Experience working with non-linear optimization techniques.
- Proficiency in statistical time series models, store clustering algorithms, and descriptive analytics to support the merch AI capability.
- Hands-on experience in state space modeling and mixed-effect regression.
- Development of AI/ML models in the Azure ML tech stack.
- Develop and manage data pipelines.
- Develop and manage data within different layers of the Snowflake environment.
- Awareness of common design patterns for scalable machine learning architectures, as well as tools for deploying and maintaining machine learning models in production.
- Knowledge of cloud platforms and their usage for pipelining, deploying, and scaling elasticity models.
- Working knowledge of Price/Promo optimization.
- Well versed in creating insights presentations and client-ready decks.
- Able to mentor and guide a team of 10-15 people.
- Manage client relationships and expectations, and communicate insights and recommendations effectively.
- Capability building and thought leadership.
- Logical Thinking: able to think analytically and use a systematic and logical approach to analyze data, problems, and situations; notices discrepancies and inconsistencies in information and materials.
- Task Management: advanced level of task management knowledge and experience; should be able to plan own tasks, discuss and work on priorities, and track and report progress.
About Our Company | Accenture
Experience: 12-14 years
Educational Qualification: Any Degree
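The price/discount elasticity modeling called for above is commonly introduced through a log-log demand regression, where the fitted slope is the elasticity. A hedged sketch on synthetic data (all numbers are invented for illustration; a real engagement would add promotions, seasonality, and the mixed effects the posting mentions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic weekly price/quantity observations with a true
# elasticity of -1.5 (illustrative values, not client data).
price = rng.uniform(2.0, 6.0, size=200)
true_elasticity = -1.5
qty = 500.0 * price ** true_elasticity * rng.lognormal(0.0, 0.05, size=200)

# In a log-log demand model  ln(q) = a + b * ln(p),
# the slope b is directly the price elasticity of demand.
b, a = np.polyfit(np.log(price), np.log(qty), 1)
print(round(b, 2))  # slope lands near the true elasticity of -1.5
```

With the elasticity in hand, promo planning becomes an optimization problem: revenue p * q(p) is maximized subject to margin and inventory constraints, which is where the non-linear optimization experience comes in.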
Posted 2 weeks ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Devo, the cloud-native logging and security analytics company, empowers security and operations teams to maximize the value of all their data. Only the Devo platform delivers the powerful combination of real-time visibility, high-performance analytics, scalability, multitenancy, and low TCO crucial for monitoring and securing business operations as enterprises accelerate their shift to the cloud. Headquartered in Boston, Mass., Devo is backed by Insight Partners, Georgian, and Bessemer Venture Partners. Learn more at www.devo.com. Devo's security products team is developing the world's first and only Autonomous SOC to revolutionize the security industry. Candidates will be working on the most cutting-edge security technology when it comes to autonomous and automated threat detection, behavioral analysis, investigations, and hunts.
Job Summary
We are seeking a highly motivated Sr. Data Scientist with a proven track record of developing threat detection algorithms. The ideal candidate will be responsible for the entire machine learning life cycle, from data acquisition and feature engineering to model evaluation and monitoring.
Responsibilities
- Transfer applied research into technologies.
- Apply cyber security concepts such as malware detection, fraud prevention, adversary tradecraft, and emerging threats.
- Design, develop, and implement scalable data pipelines using Spark/PySpark and big data frameworks to ingest, transform, and load data.
- Optimize and troubleshoot complex queries for efficient data retrieval and performance.
- Build and deploy AI-ML models across diverse data sources to extract valuable insights.
- Use modern tools for data exploration and analysis.
- Collaborate with data scientists, analysts, and stakeholders to understand business needs and translate them into actionable data solutions.
- Document code and processes for clarity and knowledge transfer.
REQUIREMENTS
- Bachelor's/Master's degree in Data Science, Statistics, Computer Science, or a related field (a plus).
- At least 5 years of experience in query tuning, performance tuning, and implementing and debugging Spark/PySpark and other big data solutions.
- Experience with anomaly detection, clustering, statistics, time series analysis, reinforcement learning, generative AI/agentic frameworks, and large language models.
- Experience with cloud platforms like AWS or GCP and containerization technologies like Docker and Kubernetes.
- Strong problem-solving, analytical, and critical thinking skills.
- Excellent communication and collaboration skills.
- Ability to work independently and as part of a team.
DESIRED
- Understanding of Security Operations Center (SOC) and security management workflows in large organizations.
- Experience with data governance and data quality frameworks.
Compensation
The base salary range is what we expect to pay a substantially qualified candidate, with the final offer based on the candidate's relevant experience and skills, as well as location and other factors. Total compensation for the role will include base salary, as well as a bonus or commission target and an equity grant applicable to the level of the role.
WHY WORK AT DEVO?
You'll join a company where we value our people and provide the tremendous opportunities that come with a hyper-growth organization. Be part of an international company with a strong team culture that celebrates success. Share our core values: Be bold - Be inventive - Be humble - Be an ally. Work in an environment that will challenge you and enable you to grow as a professional.
Comprehensive Benefits, Including:
- Top-end hardware
- Employee referral program: get a bonus for helping friends get jobs at Devo!
- Employee Stock Option Plan
- Company offsites and events
- Gender and diversity initiatives to increase visibility, inclusion, and sense of belonging.
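As a toy illustration of the anomaly detection work described above, a simple z-score rule over event counts flags time buckets far from the mean (the data below is invented; production threat detections would be far more sophisticated):

```python
import statistics

def zscore_anomalies(counts, threshold=3.0):
    # Flag time buckets whose event count deviates from the mean
    # by more than `threshold` population standard deviations.
    mu = statistics.mean(counts)
    sigma = statistics.pstdev(counts)
    if sigma == 0:
        return []
    return [i for i, c in enumerate(counts)
            if abs(c - mu) / sigma > threshold]

# Hourly login counts for one account; hour 5 is a sudden burst.
logins = [4, 5, 3, 6, 4, 60, 5, 4, 3, 5, 4, 6]
print(zscore_anomalies(logins))  # flags index 5
```

In practice the same idea is applied per entity (user, host, account) over streaming aggregates, with robust statistics or learned baselines replacing the plain mean and standard deviation.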
Posted 2 weeks ago
1.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At Juniper, we believe the network is the single greatest vehicle for knowledge, understanding, and human advancement the world has ever known. To achieve real outcomes, we know that experience is the most important requirement for networking teams and the people they serve. Delivering an experience-first, AI-Native Network pivots on the creativity and commitment of our people. It requires a consistent and committed practice, something we call the Juniper Way.
Group Description
The team is responsible for driving technology leadership in Juniper's switching products (MX, PTX and ACX), deployed in some of the world's largest enterprises and data centers. JDI is driving Juniper's growth in revenue and market share in the switching and routing space by delivering market-leading products with continuous innovation and relentless execution.
About the Position:
The Software Engineer in Test is a highly technical role in networking product validation and test case automation. The candidate should be well versed in networking fundamentals, CoS, firewalls, L2/L3 protocols, virtualization technologies, and test frameworks and test automation.
Key Responsibilities include but are not limited to:
- Platform features qualification
- Design quality test plans based on requirements
- Build the topology required for testing
- Good analytical skills
- Execute test cases and create automation scripts
- Provide weekly reports on progress
Desired Technology Knowledge:
- Switches and IP networking fundamentals
- Internet protocols: TCP/UDP/IPv4/IPv6
Expertise in one of the below technical areas:
- Layer 2: data center & switching technologies: STP, RSTP, MSTP; access control, DHCP, Dot1X, PoE; FCoE; Virtual Chassis, clustering
- Layer 3: routing and routed protocols: MPLS, LDP, RSVP, L3VPN, L2VPN; OSPF, ISIS, BGP
- Platform-specific features: CoS, firewall, ACLs; schedulers, rate limiters
- Virtualization technologies: multi-chassis, clustering
- Automation is mandatory for all technical areas: Perl, Python
Experience/Qualification:
- BE/BTech/ME/MTech from a reputed university/college
- 1-2+ years of experience in networking product qualification
- Certifications in networking like JNCIA/JNCIS/JNCIP or equivalent will be an added advantage
- Google GTest, Python unit/class-based tests, and machine learning experience will be an added advantage
About Juniper Networks
Juniper Networks challenges the inherent complexity that comes with networking and security in the multicloud era. We do this with products, solutions and services that transform the way people connect, work and live. We simplify the process of transitioning to a secure and automated multicloud environment to enable secure, AI-driven networks that connect the world. Additional information can be found at Juniper Networks (www.juniper.net) or connect with Juniper on Twitter, LinkedIn and Facebook.
WHERE WILL YOU DO YOUR BEST WORK?
Wherever you are in the world, whether it's downtown Sunnyvale or London, Westford or Bengaluru, Juniper is a place that was founded on disruptive thinking, where colleague innovation is not only valued, but expected. We believe that the great task of delivering a new network for the next decade is delivered through the creativity and commitment of our people. The Juniper Way is the commitment to all our colleagues that the culture and company inspire their best work, their life's work. At Juniper we believe this is more than a job; it's an opportunity to help change the world.
At Juniper Networks, we are committed to elevating talent by creating a trust-based environment where we can all thrive together. If you think you have what it takes, but do not necessarily check every single box, please consider applying. We’d love to speak with you. Additional Information for United States jobs: ELIGIBILITY TO WORK AND E-VERIFY In compliance with federal law, all persons hired will be required to verify identity and eligibility to work in the United States and to complete the required employment eligibility verification form upon hire. Juniper Networks participates in the E-Verify program. E-Verify is an Internet-based system operated by the Department of Homeland Security (DHS) in partnership with the Social Security Administration (SSA) that allows participating employers to electronically verify the employment eligibility of new hires and the validity of their Social Security Numbers. Information for applicants about E-Verify / E-Verify Información en español: This Company Participates in E-Verify / Este Empleador Participa en E-Verify Immigrant and Employee Rights Section (IER) - The Right to Work / El Derecho a Trabajar E-Verify® is a registered trademark of the U.S. Department of Homeland Security. Juniper is an Equal Opportunity workplace. We do not discriminate in employment decisions on the basis of race, color, religion, gender (including pregnancy), national origin, political affiliation, sexual orientation, gender identity or expression, marital status, disability, genetic information, age, veteran status, or any other applicable legally protected characteristic. All employment decisions are made on the basis of individual qualifications, merit, and business need.
Posted 2 weeks ago
5.0 - 8.0 years
0 Lacs
India
On-site
Bloomreach is building the world’s premier agentic platform for personalization. We’re revolutionizing how businesses connect with their customers, building and deploying AI agents to personalize the entire customer journey. We're taking autonomous search mainstream, making product discovery more intuitive and conversational for customers, and more profitable for businesses. We’re making conversational shopping a reality, connecting every shopper with tailored guidance and product expertise — available on demand, at every touchpoint in their journey. We're designing the future of autonomous marketing, taking the work out of workflows, and reclaiming the creative, strategic, and customer-first work marketers were always meant to do. And we're building all of that on the intelligence of a single AI engine — Loomi AI — so that personalization isn't only autonomous, it's also consistent. From retail to financial services, hospitality to gaming, businesses use Bloomreach to drive higher growth and lasting loyalty. We power personalization for more than 1,400 global brands, including American Eagle, Sonepar, and Pandora. Our centralized data science team, spread across multiple geographies, is responsible for the data science modules that power all the products of the company, including Search Relevance, Recommendation, User Personalization, User Segmentation, Content Intelligence and Conversational Commerce. We invent and apply machine learning, data mining, and information retrieval algorithms to understand, identify, and improve web content discovery. We have built industry-leading algorithms in search and recommendations for the commerce space that serve the most relevant experiences using artificial intelligence, with the goal of building AI-driven experiences beyond commerce. Your Responsibilities: Design, develop, and enhance ML/AI models which mainly power Search and Recommendation. 
Process historical data, search queries, product catalogs, and images to extract hidden relations and features. Conduct research to explore cutting-edge ML techniques (especially deep learning) and run quick POCs. Work closely with Data Engineers and Senior Data Scientists to integrate and scale ML components to a production level that can handle terabytes of data. Continuously learn and stay up to date with current state-of-the-art techniques by reading research papers and attending AI/ML conferences. Qualifications: BS/MS degree in Computer Science or a related discipline with a strong mathematical foundation and excellent programming skills (primarily Python). 5-8 years of experience building fast, scalable ML/analytical algorithms in a corporate/startup environment. Strong awareness and understanding of recent trends in Generative AI and LLMs; experience working with the GenAI stack is a strong credential. Strong understanding of various machine learning and natural language processing technologies, such as classification, information retrieval, clustering, knowledge graphs, semi-supervised learning and ranking. Excellent exploratory data analysis skills with the ability to slice and dice data at scale using SQL in Redshift/BigQuery. Good problem-solving and analytical skills, with the ability to learn and adapt to newer ML technologies. Exposure to the deep learning stack (PyTorch/Keras/TensorFlow) and techniques (representation/transfer learning, RNN/LSTM, Transformers). Experience working with Big Data in a cloud-based production environment (AWS/GCP/Azure). Effective communication skills in English, both verbal and written. More things you'll like about Bloomreach: Culture: A great deal of freedom and trust. At Bloomreach we don’t clock in and out, and we have neither corporate rules nor long approval processes. This freedom goes hand in hand with responsibility. We are interested in results from day one. 
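The recommendation work described above usually reduces, at its core, to ranking items by similarity in an embedding space. A minimal, hypothetical sketch of that idea (the toy vectors and item names stand in for learned embeddings and a real catalog):

```python
import numpy as np

# Toy item embeddings (rows = products); in practice these would come
# from a trained model, not hand-written vectors.
items = {
    "red_shirt":  np.array([1.0, 0.9, 0.0]),
    "blue_shirt": np.array([1.0, 0.8, 0.1]),
    "toaster":    np.array([0.0, 0.1, 1.0]),
}

def recommend(query_item: str, k: int = 2):
    """Rank the other items by cosine similarity to the query item."""
    q = items[query_item]
    scores = {}
    for name, v in items.items():
        if name == query_item:
            continue
        scores[name] = float(np.dot(q, v) / (np.linalg.norm(q) * np.linalg.norm(v)))
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("red_shirt", k=1))  # the most similar item to the red shirt
```

A production system would compute these similarities with an approximate nearest-neighbour index rather than a full scan, but the ranking principle is the same.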
We have defined our 5 values and the 10 underlying key behaviors that we strongly believe in. We can only succeed if everyone lives these behaviors day to day. We've embedded them in our processes like recruitment, onboarding, feedback, personal development, performance review and internal communication. We believe in flexible working hours to accommodate your working style. We work virtual-first with several Bloomreach Hubs available across three continents. We organize company events to experience the global spirit of the company and get excited about what's ahead. We encourage and support our employees to engage in volunteering activities - every Bloomreacher can take 5 paid days off to volunteer*. The Bloomreach Glassdoor page elaborates on our stellar 4.4/5 rating; the Bloomreach Comparably page Culture score is even higher at 4.9/5. Personal Development: We have a People Development Program -- participating in personal development workshops on various topics run by experts from inside the company. We are continuously developing & updating competency maps for select functions. Our resident communication coach Ivo Večeřa is available to help navigate work-related communications & decision-making challenges.* Our managers are strongly encouraged to participate in the Leader Development Program to develop in the areas we consider essential for any leader. The program includes regular comprehensive feedback, consultations with a coach and follow-up check-ins. Bloomreachers utilize the $1,500 professional education budget on an annual basis to purchase education products (books, courses, certifications, etc.)* Well-being: The Employee Assistance Program -- with counselors -- is available for non-work-related challenges.* Subscription to Calm - sleep and meditation app.* We organize ‘DisConnect’ days where Bloomreachers globally enjoy one additional day off each quarter, allowing us to unwind together and focus on activities away from the screen with our loved ones. 
We facilitate sports, yoga, and meditation opportunities for each other. Extended parental leave up to 26 calendar weeks for Primary Caregivers.* Compensation: Restricted Stock Units or Stock Options are granted depending on a team member’s role, seniority, and location.* Everyone gets to participate in the company's success through the company performance bonus.* We offer an employee referral bonus of up to $3,000 paid out immediately after the new hire starts. We reward & celebrate work anniversaries -- Bloomversaries!* (*Subject to employment type. Interns are exempt from marked benefits, usually for the first 6 months.) Excited? Join us and transform the future of commerce experiences! If this position doesn't suit you, but you know someone who might be a great fit, share it - we will be very grateful! Any unsolicited resumes/candidate profiles submitted through our website or to personal email accounts of employees of Bloomreach are considered property of Bloomreach and are not subject to payment of agency fees.
Posted 2 weeks ago
7.0 - 10.0 years
8 - 14 Lacs
Patna
Work from Office
Design and optimize enterprise database systems to ensure scalability and security. Oversee database architecture, performance tuning, and cloud-based DBMS implementation. Must have expertise in SQL, NoSQL, and database optimization techniques. Collaborate with developers and data engineers to maintain high-performance data pipelines. Strong knowledge of AWS, Azure, or Google Cloud is required. Experience in managing large-scale database environments is preferred.
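The "database optimization techniques" this posting asks for often begin with reading query plans before and after adding an index. A minimal, database-agnostic sketch using SQLite from Python (the table and index names are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                 [(i % 100, i * 1.5) for i in range(1000)])

def plan(sql: str) -> str:
    """Return the flattened EXPLAIN QUERY PLAN detail text for a query."""
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT total FROM orders WHERE customer_id = 42"
before = plan(query)  # without an index: a full table scan, e.g. "SCAN orders"
conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
after = plan(query)   # with the index: e.g. "SEARCH orders USING INDEX idx_orders_customer ..."

print(before)
print(after)
```

The same workflow applies in SQL Server, PostgreSQL, or cloud-managed databases; only the plan-inspection command (`EXPLAIN`, `SET SHOWPLAN`, etc.) changes.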
Posted 2 weeks ago
4.0 years
0 Lacs
Delhi, India
On-site
About Us Bain & Company is a global consultancy that helps the world’s most ambitious change makers define the future. Across 61 offices in 39 countries, we work alongside our clients as one team with a shared ambition to achieve extraordinary results, outperform the competition and redefine industries. Since our founding in 1973, we have measured our success by the success of our clients, and we proudly maintain the highest level of client advocacy in the industry. In 2004, the firm established its presence in the Indian market by opening the Bain Capability Center (BCC) in New Delhi. The BCC is now known as BCN (Bain Capability Network), with its nodes across various geographies. BCN plays a critical role in supporting Bain's case teams globally with analytics and research across all industries, for corporate cases, client development, private equity diligence or Bain intellectual property. The BCN comprises Consulting Services, Knowledge Services and Shared Services. Who You’ll Work With This role is based out of the People & ORG CoE, which sits in the broader Data & Tech cluster at the BCN. The People & ORG CoE works on building and deploying analytical solutions pertaining to the Operating Model and Organization Practice. The team primarily helps Bain case teams, across geographies and industries, solve critical client issues by applying battle-proven diagnostics/solutions that can identify client pain points related to org, culture, and talent. The team also plays a significant role in creating, testing, and contributing to proprietary products and Bain IP within the Org domain. This role will focus on the development, maintenance and evolution of the state-of-the-art org tool and data assets. 
What You’ll Do Become an expert in, own, maintain and evolve advanced internal tools (Python-focused) as well as help develop new tools with LLMs and GenAI, in an individual capacity. Be responsible for end-to-end handling of the entire tool process, i.e., developing Python scripts, troubleshooting errors, etc. Help with case delivery related to those tools and generate meaningful insights for Bain clients. Potentially build and maintain internal web applications using front-end technologies (HTML, CSS, JavaScript) and frameworks like Streamlit; ensure compatibility across devices and browsers. Work under the guidance of a Team Manager / Sr. Team Manager, playing a key role in driving the team’s innovation, especially on GenAI topics – identifying areas for automation and augmentation, helping the team create efficiency gains. Lead internal team calls and effectively communicate data, knowledge, insights and actionable next steps on tool development, relaying implications to your own internal team where necessary. Keep abreast of new and current statistical, database, machine learning, and advanced analytics techniques. About You Candidates should be graduates/post-graduates from a top-tier college with a strong academic background. Must have 4+ years of relevant experience in Python; experience using/building tools with GenAI, LLMs or machine learning is preferred. An advanced understanding of database design and the functioning of Azure/AWS servers is preferred. Good to have experience in SQL and Git, and hands-on experience with statistical and machine learning models (e.g., regression, classification, clustering, NLP, ensemble methods), including practical application in business contexts. Good to have experience with HTML, CSS, JavaScript (ES6+), pgAdmin and low-code development tools such as Streamlit, Mendix, Power Apps. Experience with data science/data analytics and ETL tools such as Alteryx or Informatica will be a plus. Must be able to generate and screen realistic answers 
based on sound reality checks and recommend actionable solutions. Must be willing to own and maintain a high-visibility, high-impact product. Experience in managing productized solutions will be helpful. Excellent oral and written communication skills, including the ability to communicate effectively with both technical and non-technical senior stakeholders. Ability to prioritize projects, manage multiple competing priorities and drive projects to completion under tight deadlines. What Makes Us a Great Place To Work We are proud to be consistently recognized as one of the world's best places to work, a champion of diversity and a model of social responsibility. We are currently ranked the #1 consulting firm on Glassdoor’s Best Places to Work list, and we have maintained a spot in the top four on Glassdoor's list for the last 12 years. We believe that diversity, inclusion and collaboration are key to building extraordinary teams. We hire people with exceptional talents, abilities and potential, then create an environment where you can become the best version of yourself and thrive both professionally and personally. We are publicly recognized by external parties such as Fortune, Vault, Mogul, Working Mother, Glassdoor and the Human Rights Campaign for being a great place to work for diversity and inclusion, women, LGBTQ and parents.
Posted 2 weeks ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
The D. E. Shaw group is a global investment and technology development firm with more than $65 billion in investment capital as of December 1, 2024, and offices in North America, Europe, and Asia. Since our founding in 1988, our firm has earned an international reputation for successful investing based on innovation, careful risk management, and the quality and depth of our staff. We have a significant presence in the world's capital markets, investing in a wide range of companies and financial instruments in both developed and developing economies. The D. E. Shaw group uses a combination of quantitative and qualitative tools to uncover self-governing, hard-to-find sources of return across global public and private markets. We are seeking an analyst to join our Compliance team at the firm’s office based in Hyderabad, Gurugram or Bengaluru, to work closely with a team of attorneys and compliance professionals to help ensure that the firm’s business activities are conducted in strict accordance with regulatory requirements. The Compliance group is an integral part of our global Financial Operations group and is responsible for implementing and enforcing policies and procedures across multiple regulatory requirements. This position affords the opportunity for broad exposure to the firm’s trading, operations, and software development groups, as well as the chance to handle real-time issues related to the firm’s expanding suite of global investment products in a dynamic, constantly evolving regulatory environment. WHAT YOU'LL DO DAY-TO-DAY: In this role, you will be assisting the Front Office Trading desks by conducting thorough research on the firm’s positions and applicable regulations for each asset class across jurisdictions through the deployment of advanced data analytics and data visualization techniques like statistical analysis, predictive modelling, clustering, etc. on tools like PowerBI and others. 
You will be required to perform regulatory filing obligations, requiring complex quantitative and qualitative analysis of the firm’s short position and long ownership reports across US, UK, EU, Japan, etc. jurisdictions. Furthermore, you will be expected to assist in monitoring and implementing controls designed to ensure compliance with global regulations by deploying various data analytics and visualization techniques. You will also be required to conduct a thorough analysis of the fundamental research data collected by the trading groups through their interactions with external industry experts/consultants around various market sectors to avoid potential trading while in possession of material, non-public information. Additionally, you will be expected to assist with long-term projects to maintain the firm’s high standards of compliance with new and existing regulations. WHO WE’RE LOOKING FOR: Basic qualifications: A master’s degree in finance, Chartered Accountancy (CA) or equivalent, along with up to 3 years of work experience in the financial services industry Self-motivation, a results-oriented mindset, robust analytical and problem-solving skills, learning agility, attention to detail, the ability to work in a dynamic environment and the capacity to meet tight deadlines Excellent written and oral communication skills along with exceptional interpersonal skills Interested candidates can apply through our website: https://www.deshawindia.com/recruit/jobs/Adv/Link/AnlyCompApr25 We encourage candidates with relevant experience looking to restart their careers after a break to apply for this position. Learn about Recommence, our gender-neutral return-to-work initiative. The Firm offers excellent benefits, a casual, collegial working environment, and an attractive compensation package. For further information about our recruitment process, including how applicant data will be processed, please visit https://www.deshawindia.com/careers. Members of the D. E. 
Shaw group do not discriminate in employment matters on the basis of sex, race, colour, caste, creed, religion, pregnancy, national origin, age, military service eligibility, veteran status, sexual orientation, marital status, disability, or any other protected class.
Posted 2 weeks ago
1.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Zeta Zeta is a Next-Gen Banking Tech company that empowers banks and fintechs to launch banking products for the future. It was founded by Bhavin Turakhia and Ramki Gaddipati in 2015. Our flagship processing platform - Zeta Tachyon - is the industry’s first modern, cloud-native, and fully API-enabled stack that brings together issuance, processing, lending, core banking, fraud & risk, and many more capabilities as a single-vendor stack. 15M+ cards have been issued on our platform globally. Zeta is actively working with the largest banks and fintechs in multiple global markets, transforming customer experience for multi-million card portfolios. Zeta has 1,700+ employees across the US, EMEA, and Asia, with 70%+ of roles in R&D. Backed by SoftBank, Mastercard, and other investors, we raised $330M at a $2B valuation in 2025. Learn more @ www.zeta.tech, careers.zeta.tech, LinkedIn, Twitter About The Role The BI Engineer will be responsible for organizing and reporting data related to sales numbers, market research, logistics, linguistics, or other behaviors. They utilize technical expertise to ensure the data reported is accurate and of high quality. Data will need to be analyzed, designed, and presented in a way that assists individuals, business owners and customer stakeholders to make better decisions. Responsibilities Technical: Building extremely fast, highly current data reporting and analytical systems that will be used by multiple teams to drive decisions, utilizing typical components such as ETLs with Python and SQL queries on both SQL and NoSQL databases. Ensure consistent optimization of performance and quality so as to enable faster decision making. Dashboard Creation and Reporting: Develop dashboards and comprehensive documentation to effectively communicate results. Regularly monitor key data metrics, facilitating informed decision-making. Business Metrics Identification: Identify and analyze key business metrics, offering strategic insights. 
Recommend product features based on the identified metrics to enhance overall product functionality. Cross-Functional Collaboration: Collaborate seamlessly with Engineering, Product, and Operations teams to conceptualise, design, and construct data reporting and analytical systems. Ideation and Analysis: Generate ideas for exploratory analysis, actively shaping the trajectory of future projects. Provide insightful recommendations for strategic actions based on data-driven insights. Rapid Prototyping and Product Discussions: Drive the rapid prototyping of solutions, actively participating in discussions related to product and feature development. Skills Bachelor’s/Master’s degree in engineering. In-depth expertise in SQL and Python programming. Exceptional quantitative and problem-solving skills. Good command of analytical and visualization tools like Tableau, Metabase, etc. Basic knowledge of data modeling, ETL processes, and statistical and ML techniques such as classification, linear regression modelling, clustering and decision trees. Ability to work with cross-functional and dependent teams and take end-to-end ownership of delivery. Excellent problem-solving skills and ability to work independently or as part of a team. Strong communication and interpersonal skills to collaborate effectively with cross-functional teams. Experience And Qualifications Bachelor’s/Master’s degree in engineering (computer science, information systems). At least 1-2 years of experience working on data, especially reporting and data analysis. Equal Opportunity Zeta is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. We encourage applicants from all backgrounds, cultures, and communities to apply and believe that a diverse workforce is key to our success.
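An "ETL with Python and SQL" of the kind this role describes can be sketched in a few lines: extract raw events, apply a light transform (here, filtering bad rows), load into a database, then run the aggregation a dashboard tile would query. This is a minimal, hypothetical sketch using SQLite; the table name and sample data are invented:

```python
import sqlite3

# Hypothetical source data standing in for raw sales events.
raw_events = [
    ("2024-01-01", "IN", 120.0),
    ("2024-01-01", "US", 250.0),
    ("2024-01-02", "IN", 80.0),
    ("2024-01-02", "US", -5.0),  # bad row: negative revenue
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE daily_sales (day TEXT, country TEXT, revenue REAL)")

# Extract + transform: drop non-positive rows, then load.
clean = [(d, c, r) for d, c, r in raw_events if r > 0]
conn.executemany("INSERT INTO daily_sales VALUES (?, ?, ?)", clean)

# Reporting query a dashboard might run: revenue by country.
by_country = dict(conn.execute(
    "SELECT country, SUM(revenue) FROM daily_sales GROUP BY country"
).fetchall())
print(by_country)  # {'IN': 200.0, 'US': 250.0}
```

In production the same shape would be scheduled (e.g., via an orchestrator), pointed at real sources, and the result surfaced in a BI tool like Tableau or Metabase.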
Posted 2 weeks ago
5.0 - 10.0 years
6 - 13 Lacs
Anupshahr, Pune, Delhi / NCR
Work from Office
We are seeking an experienced and driven Cluster Head to oversee contract farming operations, organic production, and back-end logistics within a designated cluster. The ideal candidate will have a strong background in agriculture, excellent leadership abilities, and a deep understanding of organic farming systems. This position reports to senior leadership and will manage a cluster team of 10-12, including 2-3 direct reports. Key Responsibilities: Lead and manage all production and operational activities for the assigned cluster Oversee planning and execution of farming activities in collaboration with network farmers Ensure adherence to UFCo’s regenerative organic farming protocols and zero chemical usage Collaborate with the sales team to align production with market demand and SKU planning Mobilize, train, and motivate network farmers for organic contract farming Monitor collection centers (RACs) for sorting, grading, quality checks, and dispatch Ensure smooth logistics and delivery of produce to distribution centers Coordinate with the technical CoE to implement best practices and provide field feedback Optimize budgets and resource allocation across the cluster Identify opportunities to expand operations within the region Required Qualifications & Skills: B.Sc. or M.Sc. in Agriculture preferred; other disciplines acceptable with relevant experience 8–10 years of experience in agri-business, farming operations, or contract farming management Prior experience in organic or natural farming is highly desirable Strong communication skills in English, Hindi, and the local language Demonstrated team leadership with experience managing direct reports Proficient in MS Office (Excel, Word, PowerPoint); working knowledge of ERP systems Key Skills: Contract Farming Organic Farming / Natural Farming Cluster / Region-level Operations Farmer Engagement & Training Farm Management & Logistics Agribusiness Strategy Budgeting & Cost Optimization
Posted 2 weeks ago
0 years
0 Lacs
India
Remote
Trident Consulting is seeking a "Machine Learning Engineer" for one of our clients in "India - Remote" Position: Machine Learning Engineer Job Location: India - Remote Job Type: Full-time About the Role: We are seeking a skilled Machine Learning Engineer with expertise in PyTorch and a strong understanding of core machine learning principles to develop and deploy anomaly detection models on top of ELK (Elasticsearch, Logstash, Kibana). The ideal candidate will be responsible for building and integrating machine learning models to detect anomalies in log data using supervised or semi-supervised approaches. Responsibilities: Train and evaluate models using PyTorch for anomaly detection in time-series or log data. Preprocess and transform log data ingested via Logstash and indexed in Elasticsearch for ML consumption. Design and run experiments to benchmark anomaly detection performance. Integrate trained models with the ELK Stack for real-time anomaly scoring and visualization in Kibana. Collaborate with DevOps/SRE teams to understand system behavior and define anomaly patterns. Document model assumptions, behavior, and performance metrics clearly and thoroughly. Required Qualifications: Proven experience with PyTorch for developing machine learning models. Strong grasp of model training fundamentals. Solid understanding of anomaly detection techniques in log or time-series data. Experience working with Elasticsearch, Logstash, and Kibana. Proficiency in Python and common data manipulation libraries (Pandas, NumPy, etc.). Familiarity with data pipeline design, feature engineering, and model deployment. Comfortable working in Linux environments and using version control (Git). Preferred Qualifications: Prior experience integrating ML models with ELK. Familiarity with log data formats and common system logs (e.g., syslog, nginx, application logs). Experience with real-time inference or streaming data frameworks (e.g., Kafka, Flink). 
Knowledge of unsupervised anomaly detection techniques (e.g., autoencoders, clustering). Exposure to MLOps practices. About Trident Trident Consulting is a premier IT staffing firm providing high-impact workforce solutions to Fortune 500 and mid-market clients. Since 2005, we've specialized in sourcing elite technology and engineering talent for contract, direct hire, and managed services roles. Our expertise spans cloud, AI/ML, cybersecurity, and data analytics, supported by a 3M+ candidate database and a 78% fill ratio. With a highly engaged leadership team and a reputation for delivering hard-to-fill, niche talent, we help organizations build agile, high-performing teams that drive innovation and business success. Some of our recent awards include: Trailblazer Women Award 2025 by Consulate General of India in San Francisco Ranked as the #1 Women Owned Business Enterprise in the large category by IT-Serve. Received the Tech-Serve Excellence award. Consistently ranked in the Inc. 5000 list of fastest-growing private companies in America Recognized in the SF Business Times as one of the Largest Bay Area BIPOC/Minority-Owned Businesses in 2022
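The role above scores log or time-series data for anomalies; a trained PyTorch autoencoder would emit reconstruction-error scores, where a large error flags an unusual point. As a dependency-light stand-in for that scoring idea, here is a minimal, hypothetical sketch using a sliding-window z-score on a metric extracted from logs (function and variable names are invented):

```python
import statistics

def anomaly_scores(series, window=5, eps=1e-9):
    """Score each point by how many standard deviations it sits from the
    mean of the preceding window — a simple statistical baseline standing
    in for the reconstruction-error scores an autoencoder would produce."""
    scores = []
    for i, x in enumerate(series):
        hist = series[max(0, i - window):i]
        if len(hist) < 2:
            scores.append(0.0)  # not enough history to score yet
            continue
        mu = statistics.mean(hist)
        sigma = statistics.stdev(hist)
        scores.append(abs(x - mu) / (sigma + eps))
    return scores

# A steady request rate with one spike, e.g. error counts per minute
# aggregated from Logstash-ingested logs.
series = [10, 11, 10, 12, 11, 10, 95, 11, 10]
scores = anomaly_scores(series)
spike_index = scores.index(max(scores))
print(spike_index)  # index of the spike (the value 95)
```

In an ELK deployment, scores like these would be written back into an Elasticsearch index so Kibana can visualize and alert on them; swapping this baseline for a PyTorch model changes only how `scores` is computed.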
Posted 2 weeks ago
40.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Job Description Managing Oracle AIA PIPs (O2C and AABC PIP), Oracle SOA Suite and OSB 12c environments; performing installation, configuration, and clustering tasks; monitoring and troubleshooting composites, pipelines, and integrations; and integrating with Oracle Enterprise Manager (OEM). The administrator will oversee WebLogic domains, JMS resources, and security configurations while conducting health checks, performance tuning, and issue resolution for middleware systems. Documenting processes and adhering to best practices is also a key part of the role. Career Level - IC2 Responsibilities Install, configure, and administer Oracle AIA solutions, including O2C and AABC PIPs, ensuring proper integration with Oracle SOA Suite and Oracle OSB environments. Manage and maintain Oracle SOA/OSB Suite, Oracle AIA 12.x, and WebLogic Server environments, ensuring high availability, performance, and scalability. Monitor, troubleshoot, and resolve issues using Oracle Enterprise Manager (OEM) and other monitoring tools to ensure optimal system performance. Identify and resolve infrastructure-related issues in BPEL, ESB, and XML/XSLT workflows to maintain seamless integration across enterprise systems. Perform regular system upgrades and patches for Oracle AIA, SOA Suite, and WebLogic environments, ensuring minimal downtime and risk mitigation. Work closely with the development team to configure and deploy Oracle AIA PIPs, facilitating the integration of enterprise applications. Provide ongoing support and troubleshooting for AIA-related integrations and workflows, ensuring timely resolution of technical challenges. Document and enforce integration best practices, including change management, version control, and deployment procedures to ensure consistency and reliability across environments. Collaborate with cross-functional teams to understand business requirements and tailor AIA solutions to meet specific integration needs. 
Conduct periodic health checks and performance tuning for Oracle AIA and SOA environments to optimize system efficiency and response times. Ensure compliance with security standards, ensuring that Oracle AIA environments are secured and appropriately configured according to enterprise security policies. About Us As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Posted 2 weeks ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Management Level G Company Overview: Equiniti is a leading international provider of shareholder, pension, remediation, and credit technology. With over 6,000 employees, it supports 37 million people in 120 countries. EQ India began its operations in 2014 as a Global India Captive Centre for Equiniti, a leading fintech company specialising in shareholder management. Within a decade, EQ India strengthened its operations and transformed from being a capability centre to a Global Competency Centre, to support EQ's growth story worldwide. Capitalising on India’s strong reputation as a global talent hub for IT/ITES, EQ India has structured the organisation to be a part of this growth story. Today, EQ India has evolved into an indispensable part of the EQ Group, providing critical fintech services to the US and UK. EQ’s vision is to be the leading global share registrar, offering complementary services to its client base, and our values set the core foundations of our success. We are TRUSTED to deliver on our commitments, COMMERCIAL in building long-term value, COLLABORATIVE in our approach, and we IMPROVE by continually enhancing our skills and services. There has never been a better time to join EQ. Role Summary The SQL Server DBA will be responsible for the implementation, configuration, maintenance, and performance of critical SQL Server RDBMS systems, to ensure the availability and consistent performance of our corporate applications. This is a “hands-on” position requiring solid technical skills, as well as excellent interpersonal and communication skills. The successful candidate will be responsible for the development and sustainment of all elements of SQL Server, ensuring its operational readiness (security, health and performance), executing data loads, and performing data modelling in support of multiple development teams. The databases support an enterprise suite of application and management tools. Must be capable of working independently and collaboratively. 
Core Duties/Responsibilities The successful candidate will be responsible for the following: Manage SQL Server databases through multiple product lifecycle environments, from development to mission-critical production systems, both on-premises and in the public cloud. Configure and maintain database servers and processes, including monitoring of system health and performance, to ensure high levels of performance, availability, and security. Apply data modelling techniques to ensure development and implementation support efforts meet integration and performance expectations. Independently analyse, solve, and correct issues in real time, providing end-to-end problem resolution. Refine and automate regular processes, track issues, and document changes. Assist developers with complex query tuning and schema refinement. Provide 24x7 support for critical production systems (upon request). Perform scheduled maintenance and support release deployment activities after hours. Share domain and technical expertise, providing technical mentorship and cross-training to peers and team members. The role is based in any Equiniti India office, working US hours. There may also be a requirement to participate in a call-out rota and to complete some activities outside of business hours where they may be service-affecting. Home working is an option. 
Skills, Knowledge & Experience The successful candidate will demonstrate the following experience, skills and behaviours: 5+ years MS SQL Server administration experience required Experience of AWS Public Cloud Knowledge of automation techniques and tools (PowerShell, Terraform, Puppet) Experience with Performance Tuning and Optimization (PTO), using native monitoring and troubleshooting tools Experience with backups, restores and recovery models Knowledge of Clustering, Failover Cluster Instances (FCI), High Availability (HA) and Disaster Recovery (DR) options for SQL Server Experience working with Windows Server, including Active Directory Excellent written and verbal communication Flexible, team player, “get-it-done” personality Ability to organize and plan work independently Ability to work in a rapidly changing environment Ability to multi-task and context-switch effectively between different activities and teams Relevant Microsoft certifications a plus Relevant AWS certification a plus A working knowledge of Redgate SQL Monitor Benefits: As a permanent member of the team at EQ you will be rewarded with our company benefits; these are just a few of what is on offer: 3 days of additional leave over and above the statutory requirement, along with 2 days of voluntary leave to pursue CSR initiatives Business-related certification expense reimbursement Comprehensive medical assurance coverage for dependents & parents Cab transport for staff working in UK & US shifts Accidental & life cover of 3 times CTC We are committed to equality of opportunity for all staff and applications from individuals are encouraged regardless of age, disability, sex, gender reassignment, sexual orientation, pregnancy and maternity, race, religion or belief and marriage and civil partnerships. Please note any offer of employment is subject to satisfactory pre-employment screening checks.
Posted 2 weeks ago
40.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Job Description This role involves managing Oracle AIA PIPs (O2C and AABC PIPs), Oracle SOA Suite, and OSB 12c environments; performing installation, configuration, and clustering tasks; monitoring and troubleshooting composites, pipelines, and integrations; and integrating with Oracle Enterprise Manager (OEM). The administrator will oversee WebLogic domains, JMS resources, and security configurations while conducting health checks, performance tuning, and issue resolution for middleware systems. Documenting processes and adhering to best practices is also a key part of the role. Career Level - IC3 Responsibilities Install, configure, and administer Oracle AIA solutions, including O2C and AABC PIPs, ensuring proper integration with Oracle SOA Suite and Oracle OSB environments. Manage and maintain Oracle SOA/OSB Suite, Oracle AIA 12.x, and WebLogic Server environments, ensuring high availability, performance, and scalability. Monitor, troubleshoot, and resolve issues using Oracle Enterprise Manager (OEM) and other monitoring tools to ensure optimal system performance. Identify and resolve infrastructure-related issues in BPEL, ESB, and XML/XSLT workflows to maintain seamless integration across enterprise systems. Perform regular system upgrades and patches for Oracle AIA, SOA Suite, and WebLogic environments, ensuring minimal downtime and risk mitigation. Work closely with the development team to configure and deploy Oracle AIA PIPs, facilitating the integration of enterprise applications. Provide ongoing support and troubleshooting for AIA-related integrations and workflows, ensuring timely resolution of technical challenges. Document and enforce integration best practices, including change management, version control, and deployment procedures to ensure consistency and reliability across environments. Collaborate with cross-functional teams to understand business requirements and tailor AIA solutions to meet specific integration needs. 
Conduct periodic health checks and performance tuning for Oracle AIA and SOA environments to optimize system efficiency and response times. Ensure compliance with security standards, ensuring that Oracle AIA environments are secured and appropriately configured according to enterprise security policies. About Us As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry-leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Posted 2 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Description and Requirements "At BMC trust is not just a word - it's a way of life!" We are an award-winning, equal opportunity, culturally diverse, fun place to be. Giving back to the community drives us to be better every single day. Our work environment allows you to balance your priorities, because we know you will bring your best every day. We will champion your wins and shout them from the rooftops. Your peers will inspire, drive, support you, and make you laugh out loud! We help our customers free up time and space to become an Autonomous Digital Enterprise that conquers the opportunities ahead - and are relentless in the pursuit of innovation! BMC is looking for a Java Tech Lead, an innovator at heart, to join a team of highly skilled software developers. Here is how, through this exciting role, YOU will contribute to BMC's and your own success: Design and develop platform solutions based on Java/J2EE best practices and web standards. Discover, design, and develop analytical methods to support novel approaches of data and information processing Lead/participate in all aspects of product development, from requirements analysis to product release. Lead feature/product engineering teams and participate in architecture and design reviews. Responsible for delivery of high quality commercial software releases to aggressive schedules. Good troubleshooting and debugging skills. Ability to lead and participate on empowered virtual teams to deliver iteration deliverables, and drive the technical direction of the product. Design enterprise platforms using UML, process flows, sequence diagrams, and pseudo-code level details, ensuring solution alignment. Develop and implement software solutions that leverage GPT, LLM, and conversational AI technologies. Integrate GPT and LLM models into the software architecture to enable natural language understanding and generation. 
To ensure you’re set up for success, you will bring the following skillset & experience: You have 10+ years of experience in designing and developing complex framework and platform solutions with practical use of design patterns. You are an expert in server-side concerns such as caching, clustering, persistence, security, SSO, high scalability/availability and failover You have experience in big data engineering technologies such as stream processing frameworks and NoSQL databases. You have experience in open-source Java frameworks such as OSGi, Spring, JMS, JPA, JTA, JDBC, as well as Kubernetes and the AWS, GCP and Azure cloud platforms You have experience with PostgreSQL and aspect-oriented architectures. You have experience with open-source participation and Apache projects, the patent process, and in-depth knowledge of app server architectures and SaaS or PaaS enabling platforms. You are familiar with REST API principles, object-oriented design, and design patterns. You have knowledge of fine-tuning LLMs, including BERT- and GPT-based models Whilst these are nice to have, our team can help you develop the following skills: Familiarity with data warehouse/data lake platforms Snowflake, Databricks, BigQuery Knowledge of cloud platforms Amazon AWS, Google GCP, Oracle OCI, Microsoft Azure Experience in Generative AI frameworks such as LangChain and LlamaIndex CA-DNP Our commitment to you! BMC’s culture is built around its people. We have 6000+ brilliant minds working together across the globe. You won’t be known just by your employee number, but for your true authentic self. BMC lets you be YOU! If after reading the above, you’re unsure if you meet the qualifications of this role but are deeply excited about BMC and this team, we still encourage you to apply! We want to attract talents from diverse backgrounds and experience to ensure we face the world together with the best ideas! 
BMC is committed to equal opportunity employment regardless of race, age, sex, creed, color, religion, citizenship status, sexual orientation, gender, gender expression, gender identity, national origin, disability, marital status, pregnancy, disabled veteran or status as a protected veteran. If you need a reasonable accommodation for any part of the application and hiring process, visit the accommodation request page. BMC Software maintains a strict policy of not requesting any form of payment in exchange for employment opportunities, upholding a fair and ethical hiring process. At BMC we believe in pay transparency and have set the midpoint of the salary band for this role at 4,166,900 INR. Actual salaries depend on a wide range of factors that are considered in making compensation decisions, including but not limited to skill sets; experience and training, licensure, and certifications; and other business and organizational needs. The salary listed is just one component of BMC's employee compensation package. Other rewards may include a variable plan and country specific benefits. We are committed to ensuring that our employees are paid fairly and equitably, and that we are transparent about our compensation practices. ( Returnship@BMC ) Had a break in your career? No worries. This role is eligible for candidates who have taken a break in their career and want to re-enter the workforce. If your expertise matches the above job, visit https://bmcrecruit.avature.net/returnship to know more and how to apply.
Posted 2 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Introduction In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. Your Role And Responsibilities As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You’ll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you’ll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you’ll tackle obstacles related to database integration and untangle complex, unstructured data sets. In This Role, Your Responsibilities May Include Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results Preferred Education Master's Degree Required Technical And Professional Expertise Expertise in designing and implementing scalable data warehouse solutions on Snowflake, including schema design, performance tuning, and query optimization. 
Strong experience in building data ingestion and transformation pipelines using Talend to process structured and unstructured data from various sources. Proficiency in integrating data from cloud platforms into Snowflake using Talend and native Snowflake capabilities. Hands-on experience with dimensional and relational data modelling techniques to support analytics and reporting requirements Preferred Technical And Professional Experience Understanding of optimizing Snowflake workloads, including clustering keys, caching strategies, and query profiling. Ability to implement robust data validation, cleansing, and governance frameworks within ETL processes. Proficiency in SQL and/or Shell scripting for custom transformations and automation tasks
Posted 2 weeks ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Company Qualcomm India Private Limited Job Area Information Technology Group, Information Technology Group > IT Engineering General Summary The candidate in this role will be required to operate in India hours. Manage operational and support responsibilities for numerous critical services supporting Qualcomm's software engineering environment globally. Experience in an operations role, leading delivery for Tier 1/Tier 2 with SLAs and process enhancements. Expected to support and troubleshoot existing systems, solve technical challenges, and enhance the reliability and performance of the services. Provide technical support with an emphasis on customer service for CI/CD, Jenkins, Windows, Linux, UNIX, networking, and enterprise applications. Front-end application and systems support for Software Engineering Compute. Basic troubleshooting of Windows & Linux related issues. Basic troubleshooting of specific applications & tools. Minimum Qualifications 3+ years of IT-related work experience with a Bachelor's degree. OR 5+ years of IT-related work experience without a Bachelor’s degree. Physical Requirements Frequently transports and installs equipment up to 20 lbs. Excellent oral and written communication skills Solid understanding of Linux and Windows server OS fundamentals, NFS, CIFS, file systems, networking, performance tuning, scripting, Active Directory Experience in Microsoft OS (Windows 2012, Windows 2008, Windows 2003, MS Clustering) AND/OR Linux/Unix (Ubuntu 12, 14 & 16, Red Hat Linux ES 5 and 6) Knowledge of DNS, TCP/IP and other networking concepts Must be a self-starter and team player Excellent troubleshooting skills and customer support experience Customer service oriented and able to thrive in a dynamic environment Experience in any of Linux, Ubuntu, Red Hat and clustering technologies, and scripting knowledge of Shell, Python Bachelor's/Master's degree in any stream Applicants: Qualcomm is an equal opportunity employer. 
If you are an individual with a disability and need an accommodation during the application/hiring process, rest assured that Qualcomm is committed to providing an accessible process. You may e-mail disability-accomodations@qualcomm.com or call Qualcomm's toll-free number found here. Upon request, Qualcomm will provide reasonable accommodations to support individuals with disabilities to be able to participate in the hiring process. Qualcomm is also committed to making our workplace accessible for individuals with disabilities. (Keep in mind that this email address is used to provide reasonable accommodations for individuals with disabilities. We will not respond here to requests for updates on applications or resume inquiries). Qualcomm expects its employees to abide by all applicable policies and procedures, including but not limited to security and other requirements regarding protection of Company confidential information and other confidential and/or proprietary information, to the extent those requirements are permissible under applicable law. To all Staffing and Recruiting Agencies: Our Careers Site is only for individuals seeking a job at Qualcomm. Staffing and recruiting agencies and individuals being represented by an agency are not authorized to use this site or to submit profiles, applications or resumes, and any such submissions will be considered unsolicited. Qualcomm does not accept unsolicited resumes or applications from agencies. Please do not forward resumes to our jobs alias, Qualcomm employees or any other company location. Qualcomm is not responsible for any fees related to unsolicited resumes/applications. If you would like more information about this role, please contact Qualcomm Careers. 3073148
Posted 2 weeks ago
3.0 years
0 Lacs
Pune, Maharashtra, India
Remote
About The Role We are seeking a skilled and experienced Database Administrator (DBA) to join our team. The ideal candidate will have 3+ years of hands-on experience with MySQL, ProxySQL, and PostgreSQL, specializing in database administration, performance tuning, query optimization, high availability, backup, and archival. This role involves managing and supporting database systems, ensuring uptime, performance, and scalability for mission-critical applications. What You'll Do Database Management & Support: Manage and support MySQL, ProxySQL, and PostgreSQL databases in production and staging environments. Administer Group Replication clusters and maintain high-availability database architectures. Ensure database environments are optimized for both performance and reliability. High Availability & Clustering: Design, implement, and maintain Group Replication for MySQL databases, ensuring database high availability and disaster recovery. Monitor and troubleshoot database cluster health and replication lag issues. Administer ProxySQL to optimize query routing and provide high availability and load balancing. Query Optimization: Analyze and optimize slow-running queries and troubleshoot performance bottlenecks. Apply best practices for query performance tuning, including indexing strategies and query rewriting. Monitor database performance and recommend tuning adjustments to ensure efficient database operations. Backup and Archival: Implement and manage database backup strategies, ensuring data consistency and integrity. Automate and monitor backup processes for MySQL, ProxySQL, and PostgreSQL systems. Develop and manage archival strategies to comply with data retention policies and ensure efficient storage utilization. Maintenance & Upgrades: Perform regular database patching and version upgrades with minimal downtime. Carry out health checks and maintenance tasks to optimize database performance and security. 
Ensure all security patches are applied and maintain databases in a secure, compliant state. Troubleshooting & Support: Provide 24x7 support for production database environments across three shifts, including on-call support for critical incidents. Diagnose and resolve database-related issues, including replication failures, performance problems, and data integrity issues. Collaborate with developers and other teams to resolve database-related application issues. Documentation & Reporting: Maintain comprehensive documentation for database configurations, backup/restore procedures, and troubleshooting guides. Prepare regular reports on database health, performance, and incident resolutions for management. We'd Love for You to Have 3+ years of experience in database administration with a strong focus on MySQL, ProxySQL, and PostgreSQL. Strong understanding and hands-on experience with Group Replication, high availability, and failover solutions for MySQL databases. Solid experience in query optimization, indexing strategies, and performance tuning. Expertise in backup strategies (full, incremental, and differential backups) and data archival solutions. Proficiency with ProxySQL for high availability and load balancing. Experience with PostgreSQL administration, including performance tuning, replication, and clustering (e.g., using Patroni or pgpool). Experience with database monitoring tools (e.g., Prometheus, Grafana, Percona Monitoring and Management). Familiarity with scripting languages (e.g., Bash, Python) for automation tasks. Excellent problem-solving skills and the ability to troubleshoot complex database-related issues. Strong attention to detail and ability to work independently and as part of a team. Good understanding of networking, storage architectures, and cloud-based database solutions (e.g., AWS RDS, Azure Database Services). Familiarity with NoSQL databases (e.g., MongoDB, Redis). 
Experience with containerized environments (e.g., Docker, Kubernetes). Experience with DevOps practices and database automation tools (e.g., Ansible, Terraform). Knowledge of data security best practices, including encryption, user roles, and access controls. Qualifications Bachelor’s degree in engineering (CS/IT) or equivalent degree from well-known institutes/universities. Certification in MySQL DBA or PostgreSQL DBA is a plus. Additional Information Return to Office: PubMatic employees throughout the globe have returned to our offices via a hybrid work schedule (3 days “in office” and 2 days “working remotely”) that is intended to maximize collaboration, innovation, and productivity among teams and across functions. Benefits: Our benefits package includes the best of what leading organizations provide, such as paternity/maternity leave, healthcare insurance, and broadband reimbursement. As well, when we’re back in the office, we all benefit from a kitchen loaded with healthy snacks and drinks and catered lunches and much more! Diversity and Inclusion: PubMatic is proud to be an equal opportunity employer; we don’t just value diversity, we promote and celebrate it. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status. About PubMatic PubMatic is one of the world’s leading scaled digital advertising platforms, offering more transparent advertising solutions to publishers, media buyers, commerce companies and data owners, allowing them to harness the power and potential of the open internet to drive better business outcomes. Founded in 2006 with the vision that data-driven decisioning would be the future of digital advertising, we enable content creators to run a more profitable advertising business, which in turn allows them to invest back into the multi-screen and multi-format content that consumers demand.
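The indexing and query-tuning duties in the listing above can be illustrated with a minimal, self-contained sketch. It uses Python's standard-library sqlite3 module purely as a stand-in for MySQL/PostgreSQL, and the table and index names are invented for the example; the before/after plan comparison is the same workflow on any RDBMS:

```python
import sqlite3

# In-memory database standing in for a production instance.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(10_000)],
)

query = "SELECT * FROM orders WHERE customer_id = 42"

# Without an index, the planner must scan every row.
before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(before[0][3])  # plan detail mentions a full scan of "orders"

# An index on the filtered column turns the scan into a direct lookup.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(after[0][3])  # plan detail now mentions idx_orders_customer
```

On MySQL the equivalent check is `EXPLAIN SELECT ...` and on PostgreSQL `EXPLAIN (ANALYZE) SELECT ...`; in all three the goal is the same: confirm the filter column is served by an index rather than a full table scan.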
Posted 2 weeks ago
5.0 - 8.0 years
0 Lacs
Delhi, India
On-site
Job Description We are seeking a highly motivated and enthusiastic Senior Data Scientist with 5-8 years of experience to join our dynamic team. The ideal candidate will have a strong background in AI/ML analytics and a passion for leveraging data to drive business insights and innovation. Key Responsibilities Develop and implement machine learning models and algorithms. Work closely with project stakeholders to understand requirements and translate them into deliverables. Utilize statistical and machine learning techniques to analyze and interpret complex data sets. Stay updated with the latest advancements in AI/ML technologies and methodologies. Collaborate with cross-functional teams to support various AI/ML initiatives. Qualifications Bachelor’s degree in Computer Science, Data Science, or a related field. Strong understanding of machine learning, deep learning and Generative AI concepts. Preferred Skills Experience in machine learning techniques such as Regression, Classification, Predictive modeling, Clustering, and the Deep Learning stack using Python Experience with cloud infrastructure for AI/ML on AWS (SageMaker, QuickSight, Athena, Glue). Expertise in building enterprise-grade, secure data ingestion pipelines for unstructured data (ETL/ELT), including indexing, search, and advanced retrieval patterns. Proficiency in Python, TypeScript, NodeJS, ReactJS (and equivalent) and frameworks (e.g., pandas, NumPy, scikit-learn, OpenCV, SciPy), Glue crawler, ETL Experience with data visualization tools (e.g., Matplotlib, Seaborn, QuickSight). Knowledge of deep learning frameworks (e.g., TensorFlow, Keras, PyTorch). Experience with version control systems (e.g., Git, CodeCommit). Strong knowledge and experience in Generative AI/LLM based development. Strong experience working with key LLM model APIs (e.g., AWS Bedrock, Azure OpenAI/OpenAI) and LLM frameworks (e.g., LangChain, LlamaIndex). 
Knowledge of effective text chunking techniques for optimal processing and indexing of large documents or datasets. Proficiency in generating and working with text embeddings, with an understanding of embedding spaces and their applications in semantic search and information retrieval. Experience with RAG concepts and fundamentals (VectorDBs, AWS OpenSearch, semantic search, etc.), and expertise in implementing RAG systems that combine knowledge bases with Generative AI models. Knowledge of training and fine-tuning Foundation Models (Anthropic, Claude, Mistral, etc.), including multimodal inputs and outputs. Good To Have Skills Knowledge and experience in building knowledge graphs in production. Understanding of multi-agent systems and their applications in complex problem-solving scenarios. Equal Opportunity Employer Pentair is an Equal Opportunity Employer. With our expanding global presence, cross-cultural insight and competence are essential for our ongoing success. We believe that a diverse workforce contributes different perspectives and creative ideas that enable us to continue to improve every day.
Posted 2 weeks ago
2.0 - 4.0 years
0 Lacs
Delhi, India
On-site
Job Description We are seeking a highly motivated and enthusiastic Senior Data Scientist with 2-4 years of experience to join our dynamic team. The ideal candidate will have a strong background in AI/ML analytics and a passion for leveraging data to drive business insights and innovation. Key Responsibilities Develop and implement machine learning models and algorithms. Work closely with project stakeholders to understand requirements and translate them into deliverables. Utilize statistical and machine learning techniques to analyze and interpret complex data sets. Stay updated with the latest advancements in AI/ML technologies and methodologies. Collaborate with cross-functional teams to support various AI/ML initiatives. Qualifications Bachelor’s degree in Computer Science, Data Science, or a related field. Strong understanding of machine learning, deep learning and Generative AI concepts. Preferred Skills Experience in machine learning techniques such as Regression, Classification, Predictive modeling, Clustering, and the Deep Learning stack using Python Experience with cloud infrastructure for AI/ML on AWS (SageMaker, QuickSight, Athena, Glue). Expertise in building enterprise-grade, secure data ingestion pipelines for unstructured data (ETL/ELT), including indexing, search, and advanced retrieval patterns. Proficiency in Python, TypeScript, NodeJS, ReactJS (and equivalent) and frameworks (e.g., pandas, NumPy, scikit-learn, OpenCV, SciPy), Glue crawler, ETL Experience with data visualization tools (e.g., Matplotlib, Seaborn, QuickSight). Knowledge of deep learning frameworks (e.g., TensorFlow, Keras, PyTorch). Experience with version control systems (e.g., Git, CodeCommit). Strong knowledge and experience in Generative AI/LLM based development. Strong experience working with key LLM model APIs (e.g., AWS Bedrock, Azure OpenAI/OpenAI) and LLM frameworks (e.g., LangChain, LlamaIndex). 
Knowledge of effective text chunking techniques for optimal processing and indexing of large documents or datasets. Proficiency in generating and working with text embeddings, with an understanding of embedding spaces and their applications in semantic search and information retrieval. Experience with RAG concepts and fundamentals (VectorDBs, AWS OpenSearch, semantic search, etc.), and expertise in implementing RAG systems that combine knowledge bases with Generative AI models. Knowledge of training and fine-tuning Foundation Models (Anthropic, Claude, Mistral, etc.), including multimodal inputs and outputs. Good To Have Skills Knowledge and experience in building knowledge graphs in production. Understanding of multi-agent systems and their applications in complex problem-solving scenarios. Equal Opportunity Employer Pentair is an Equal Opportunity Employer. With our expanding global presence, cross-cultural insight and competence are essential for our ongoing success. We believe that a diverse workforce contributes different perspectives and creative ideas that enable us to continue to improve every day.
Posted 2 weeks ago
The job market for clustering roles in India is thriving, with numerous opportunities available for job seekers with expertise in this area. Clustering professionals are in high demand across various industries, including IT, data science, and research. If you are considering a career in clustering, this article will provide you with valuable insights into the job market in India.
Here are 5 major cities in India actively hiring for clustering roles: 1. Bangalore 2. Pune 3. Hyderabad 4. Mumbai 5. Delhi
The average salary range for clustering professionals in India varies based on experience levels. Entry-level positions may start at around INR 3-6 lakhs per annum, while experienced professionals can earn upwards of INR 12-20 lakhs per annum.
In the field of clustering, a typical career path may look like:
- Junior Data Analyst
- Data Scientist
- Senior Data Scientist
- Tech Lead
Apart from expertise in clustering, professionals in this field are often expected to have skills in:
- Machine Learning
- Data Analysis
- Python/R programming
- Statistics
Here are 25 interview questions for clustering roles:
- What is clustering and how does it differ from classification? (basic)
- Explain the K-means clustering algorithm. (medium)
- What are the different types of distance metrics used in clustering? (medium)
- How do you determine the optimal number of clusters in K-means clustering? (medium)
- What is the Elbow method in clustering? (basic)
- Define hierarchical clustering. (medium)
- What is the purpose of clustering in machine learning? (basic)
- Can you explain the difference between supervised and unsupervised learning? (basic)
- What are the advantages of hierarchical clustering over K-means clustering? (advanced)
- How does the DBSCAN clustering algorithm work? (medium)
- What is the curse of dimensionality in clustering? (advanced)
- Explain the concept of silhouette score in clustering. (medium)
- How do you handle missing values in clustering algorithms? (medium)
- What is the difference between agglomerative and divisive clustering? (advanced)
- How would you handle outliers in clustering analysis? (medium)
- Can you explain the concept of cluster centroids? (basic)
- What are the limitations of K-means clustering? (medium)
- How do you evaluate the performance of a clustering algorithm? (medium)
- What is the role of inertia in K-means clustering? (basic)
- Describe the process of feature scaling in clustering. (basic)
- How does the GMM algorithm differ from K-means clustering? (advanced)
- What is the importance of feature selection in clustering? (medium)
- How can you assess the quality of clustering results? (medium)
- Explain the concept of cluster density in DBSCAN. (advanced)
- How do you handle high-dimensional data in clustering? (medium)
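Several of the questions above (K-means, inertia, the Elbow method) come together in one small worked example. The sketch below is a plain-NumPy implementation of Lloyd's algorithm, not a production K-means; the deterministic initialization is an assumption made to keep the demo reproducible. In practice you would reach for a library implementation with k-means++ initialization.

```python
import numpy as np

def kmeans(X, k, n_iter=100):
    """Plain K-means (Lloyd's algorithm): returns (centroids, labels, inertia)."""
    # Deterministic init for this sketch: k points spread across the array.
    centroids = X[np.linspace(0, len(X) - 1, k).astype(int)]
    for _ in range(n_iter):
        # Assignment step: each point joins its nearest centroid.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: centroids move to the mean of their assigned points.
        centroids = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
            for j in range(k)
        ])
        new_dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        if np.array_equal(new_dists.argmin(axis=1), labels):
            break  # assignments stable: converged
    # Inertia: total squared distance of points to their own centroid.
    inertia = float(((X - centroids[labels]) ** 2).sum())
    return centroids, labels, inertia

# Two well-separated blobs: the "elbow" appears as a sharp inertia drop at k=2,
# after which adding clusters yields only marginal improvement.
rng = np.random.default_rng(42)
X = np.vstack([rng.normal(0.0, 0.5, (50, 2)), rng.normal(5.0, 0.5, (50, 2))])
inertias = {k: kmeans(X, k)[2] for k in (1, 2, 3)}
```

Plotting `inertias` against `k` and picking the bend in the curve is exactly the Elbow method the questions refer to; silhouette score is the common complementary metric when the elbow is ambiguous.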
As you venture into the world of clustering jobs in India, remember to stay updated with the latest trends and technologies in the field. Equip yourself with the necessary skills and knowledge to stand out in interviews and excel in your career. Good luck on your job search journey!