
1241 Clustering Jobs - Page 20

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

6.0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

Source: LinkedIn

Experienced · Chennai · Posted 8 months ago · SolvEdge

We're dedicated to leveraging technology to make a positive impact in healthcare. Our software solutions are crafted to optimize processes, support patient care, and drive better health outcomes. As we continue to innovate, we're seeking a passionate Junior Database Developer to join our team. If you're enthusiastic about database development and eager to contribute to meaningful projects in healthcare technology, we want you on our journey to empower healthcare professionals with advanced tools and insights.

Designation: Database Analyst
Years of Experience: 4–6 years

What You'll Do
We are seeking a talented and motivated Senior Database Developer to join our dynamic team. In this role, you will work closely with our senior developers to design, develop, and maintain databases that support our organization's applications and systems. This is an exciting opportunity for someone passionate about databases and eager to grow their skills in a collaborative environment.

Responsibilities
Database Management and Optimization: Manage and optimize database systems to ensure efficient data storage and retrieval. Participate in database performance tuning and optimization efforts. Troubleshoot database issues and performance bottlenecks to maintain system reliability. Support database maintenance tasks, including backups, restores, and data migrations. Assist in troubleshooting database-related issues and implementing solutions.
Data Integrity and Quality Control: Ensure data accuracy and integrity by implementing rigorous quality control processes. Develop and maintain data models, schemas, and structures to support business requirements.
Query Development and Reporting: Create and execute queries, scripts, and reports for data analysis and visualization. Develop and optimize SQL queries, stored procedures, and database scripts.
Collaboration and Communication: Collaborate with cross-functional teams to identify data needs and deliver actionable insights. Collaborate with software developers to integrate database functionality into applications. Contribute to database design discussions and provide input on best practices.
Continuous Learning and Improvement: Stay up to date with emerging database technologies, trends, and best practices in database management.
Design and Implementation: Assist in the design and implementation of database solutions to meet business requirements.

Qualifications
Bachelor's degree in Computer Science Engineering or an equivalent degree, with a minimum of 3 years of experience. Experience in microservices is an added advantage.

Primary Skills
SQL: Proficiency in SQL (Structured Query Language) is essential for querying, updating, and managing data in relational databases.
Database Management Systems (DBMS): Strong understanding of one or more database management systems such as MySQL, PostgreSQL, SQL Server, or Oracle.
Database Design: Knowledge of database design principles, including schema design, normalization, and data modeling.
Stored Procedures and Functions: Ability to write and optimize stored procedures, functions, triggers, and other database objects to implement business logic within the database.
Query Optimization: Experience in optimizing SQL queries for performance, including index optimization, query tuning, and execution plan analysis.
Database Administration: SQL Server administration (2008 R2, 2016, and 2017) and management of cloud-based SQL Server (AWS/Azure RDS). Strong knowledge of backups, restores, recovery models, database shrink operations, DBCC commands, clustering, database mirroring, and replication.
Index Management and Optimization Expertise: Strong knowledge of indexes, index management, integrity checks, configuration, and patching; how statistics work, how indexes are stored, and how they can be created and managed effectively.
SQL Development: Ability to write and troubleshoot SQL code and design (stored procedures, functions, tables, views, triggers, indexes, constraints).

Secondary Skills
SQL Server tools (Profiler, DTA, SSMS, SAC, SSCM, PerfMon, DMVs, system stored procedures).
Solid familiarity with Windows Server, security delegation, SPNs, and storage components.
Cloud Platforms: Familiarity with cloud platforms like AWS, Azure, or Google Cloud Platform, and their database services (e.g., Amazon RDS, Azure SQL Database) is valuable as more organizations migrate to the cloud.
Problem-Solving Skills: Strong problem-solving skills and the ability to troubleshoot database-related issues efficiently.
Communication: Good communication skills to collaborate effectively with team members, stakeholders, and end users.

Why Apply?
Even if you feel you don't meet every single requirement, we encourage you to apply. We're looking for passionate individuals who might bring diverse perspectives and skills to our team. At SolvEdge, we value talent and dedication and are committed to fostering growth and opportunity within our organization.

How To Apply
Ready to join our mission and make a difference? Submit your resume, a cover letter that highlights your unique qualifications, and any relevant work samples. We're excited to hear from you! SolvEdge is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.

About SolvEdge
SolvEdge: Pioneering the Future of Digital Healthcare.
Our Expertise: SolvEdge stands at the forefront of digital healthcare innovation as a premier healthcare performance company. With over 18 years of dedicated service in the healthcare industry, we specialize in a digital care journey platform that revolutionizes how hospitals and health systems engage, monitor, and connect with patients throughout their healthcare experiences. Our partnership with Fortune 100 medical device companies and hospitals nationwide underscores our position as a trusted partner in healthcare solutions.
Key Features of SolvEdge: Our platform is designed to empower healthcare providers with the tools they need to automate and streamline care delivery, thereby improving clinical outcomes and patient satisfaction.
Personalized Care Plans: Leveraging evidence-based data, SolvEdge delivers digital care plans customized to meet the individual needs and conditions of each patient.
Real-Time Patient Monitoring: Through daily health checks, assessments, surveys, and integration with wearable devices, our platform facilitates continuous monitoring of patient health.
Automated Care Delivery: We automate essential tasks, including appointment scheduling, sending reminders, and delivering educational content, to enhance patient engagement and reduce administrative tasks.
Remote Patient Monitoring: Healthcare providers can monitor vital signs, symptoms, and treatment plan adherence remotely, enabling timely interventions and proactive care management.
The SolvEdge Advantage: Our platform offers significant benefits to healthcare providers and patients alike.
Improved Clinical Outcomes: By facilitating more effective care pathways and enabling early intervention, SolvEdge contributes to reduced readmission rates, fewer emergency department visits, and shorter hospital stays.
Enhanced Patient Satisfaction: Patients enjoy a higher quality of care with SolvEdge, benefiting from improved communication, comprehensive education, and continuous support.
Cost Savings: Healthcare organizations can achieve substantial cost reductions by minimizing unnecessary readmissions, emergency visits, and complications associated with poor care management.
Applications and Impact: SolvEdge's versatility allows for its application across various aspects of healthcare, with a particular emphasis on surgical care. From preparing patients for surgery to monitoring their postoperative recovery, our platform ensures a seamless and supportive care journey. Beyond surgical care, our focus encompasses managing care pathways, enhancing patient engagement through patient-reported outcomes, providing advanced data analytics, integrating with electronic medical records (EMR), and streamlining billing processes. Our comprehensive approach addresses the myriad challenges faced by today's healthcare industry, backed by our commitment to excellence in service, communication, and customer experience.
A Trusted Partner in Healthcare Innovation: Our strategic relationships and deep understanding of healthcare challenges have positioned us as an indispensable ally to healthcare providers nationwide. As we continue to develop innovative solutions, our goal remains unchanged: to simplify healthcare delivery, improve patient outcomes, and enhance the overall patient experience.

Job Category: Developer

Posted 1 week ago

Apply

5.0 years

0 Lacs

India

On-site

Source: LinkedIn

Data Analytics & Insights Analyst

Astreya offers comprehensive IT support and managed services. These services include Data Center and Network Management, Digital Workplace Services (such as Service Desk, Audio Visual, and IT Asset Management), as well as Next-Gen Digital Engineering services encompassing Software Engineering, Data Engineering, and cybersecurity solutions. Astreya's expertise lies in creating seamless interactions between people and technology to help organizations achieve operational excellence and growth.

Job Description
We are seeking an experienced Data and Insights Analyst to join our analytics division. You will be aligned with our Data Analytics and BI vertical and help us generate insights by leveraging the latest analytics techniques to deliver value to our clients. You will also apply your expertise to building world-class solutions, solving business problems, and addressing technical challenges using Google platforms and technologies. You will be required to utilize existing tools, frameworks, standards, and patterns to create the architectural foundations and services necessary for analytics applications that scale from multi-user to enterprise-class. You will work as part of the Google Analytics team, which provides analytics, actionable insights, and recommendations to internal and external organizations to optimize ROI and performance efficiency in operations.

Requirements
Experience & Education: 5+ years of progressive experience in data analytics and business intelligence. Bachelor's degree required, preferably in Computer Science, Analytics, Statistics, or a related field. Proven experience in the IT services industry or a managed services provider environment.
Technical Expertise: Extensive experience in Data Science and Advanced Analytics delivery teams. Strong statistical programming experience (SQL, Python). Experience working with large data sets and big data tools such as GCP (BigQuery, Vertex AI), AWS, and MS Azure. Solid knowledge of at least one of the following: multivariate statistics, reliability models, Markov models, stochastic models; classification, regression, and clustering; ensemble modelling (random forest, boosted trees, etc.) — a minimal clustering sketch is shown below. Experience in at least one of these business domains: Supply Chain, Marketing Analytics, Customer Analytics, Digital Marketing, eCommerce. Extensive experience in client engagement and business development. Ability to work in a global, collaborative team environment.
Industry Knowledge: Understanding of ITIL and IT service management frameworks. Experience with service desk metrics and KPIs. Knowledge of data center and network management analytics. Familiarity with cybersecurity analytics and reporting.
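For readers unfamiliar with the clustering work such a role involves, here is a minimal sketch using scikit-learn. The service-desk metric columns and the cluster count are illustrative assumptions, not details taken from the posting:

```python
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

# Illustrative ticket-level metrics; real feature names would come from the data model.
df = pd.DataFrame({
    "tickets_per_week": [120, 80, 300, 45, 210, 95],
    "avg_resolution_hrs": [4.2, 9.1, 2.8, 12.5, 3.9, 7.4],
    "csat_score": [4.5, 3.8, 4.7, 3.2, 4.4, 4.0],
})

# Standardize so no single metric dominates the distance calculation.
X = StandardScaler().fit_transform(df)

# Fit K-Means with an assumed k=3; in practice k would be chosen via
# silhouette analysis or the elbow method.
km = KMeans(n_clusters=3, n_init=10, random_state=42)
df["cluster"] = km.fit_predict(X)

print(df)
print("silhouette:", silhouette_score(X, df["cluster"]))
```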

Posted 1 week ago

Apply

5.0 - 8.0 years

0 Lacs

Delhi, India

On-site

Source: LinkedIn

Job Description: We are seeking a highly motivated and enthusiastic Senior Data Scientist with 5–8 years of experience to join our dynamic team. The ideal candidate will have a strong background in AI/ML analytics and a passion for leveraging data to drive business insights and innovation.

Key Responsibilities
Develop and implement machine learning models and algorithms. Work closely with project stakeholders to understand requirements and translate them into deliverables. Utilize statistical and machine learning techniques to analyze and interpret complex data sets. Stay updated with the latest advancements in AI/ML technologies and methodologies. Collaborate with cross-functional teams to support various AI/ML initiatives.

Qualifications
Bachelor's degree in Computer Science, Data Science, or a related field. Strong understanding of machine learning, deep learning, and Generative AI concepts.

Preferred Skills
Experience in machine learning techniques such as regression, classification, predictive modeling, clustering, and the deep learning stack using Python. Experience with cloud infrastructure for AI/ML on AWS (SageMaker, QuickSight, Athena, Glue). Expertise in building enterprise-grade, secure data ingestion pipelines for unstructured data (ETL/ELT), including indexing, search, and advanced retrieval patterns. Proficiency in Python, TypeScript, NodeJS, ReactJS (and equivalents) and frameworks (e.g., pandas, NumPy, scikit-learn, OpenCV, SciPy), Glue crawler, ETL. Experience with data visualization tools (e.g., Matplotlib, Seaborn, QuickSight). Knowledge of deep learning frameworks (e.g., TensorFlow, Keras, PyTorch). Experience with version control systems (e.g., Git, CodeCommit). Strong knowledge and experience in Generative AI / LLM-based development. Strong experience working with key LLM model APIs (e.g., AWS Bedrock, Azure OpenAI / OpenAI) and LLM frameworks (e.g., LangChain, LlamaIndex). Knowledge of effective text chunking techniques for optimal processing and indexing of large documents or datasets (a minimal sketch follows below). Proficiency in generating and working with text embeddings, with an understanding of embedding spaces and their applications in semantic search and information retrieval. Experience with RAG concepts and fundamentals (vector databases, AWS OpenSearch, semantic search, etc.), and expertise in implementing RAG systems that combine knowledge bases with Generative AI models. Knowledge of training and fine-tuning foundation models (Anthropic, Claude, Mistral, etc.), including multimodal inputs and outputs.

Good To Have Skills
Knowledge and experience in building knowledge graphs in production. Understanding of multi-agent systems and their applications in complex problem-solving scenarios.

Equal Opportunity Employer
Pentair is an Equal Opportunity Employer. With our expanding global presence, cross-cultural insight and competence are essential for our ongoing success. We believe that a diverse workforce contributes different perspectives and creative ideas that enable us to continue to improve every day.
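The "text chunking" requirement above usually means splitting long documents into overlapping windows before embedding and indexing them for retrieval. Below is a minimal, library-free sketch; the chunk size and overlap are illustrative assumptions, and frameworks such as LangChain or LlamaIndex provide equivalent splitters:

```python
def chunk_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    """Split text into overlapping character windows for embedding/indexing."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += chunk_size - overlap
    return chunks

document = "..." * 1000  # placeholder for an extracted document body
pieces = chunk_text(document, chunk_size=500, overlap=50)
print(len(pieces), "chunks; first chunk length:", len(pieces[0]))
```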

Posted 1 week ago

Apply

3.0 - 5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

As a Data Scientist, you will have the opportunity to manipulate large data sets and integrate diverse data sources, data types, and data structures into solutions under management guidance. You will build experience developing analytical approaches to meet business requirements, which involves translating requests into use cases, test cases, preparation of training data sets, and iterative algorithm development.

What You Will Do
Execute code to generate business results; code efficiency improves solution delivery. Analyze and prepare data for analysis, assembling data from standard and disparate data sources. Research new and advanced predictive modeling techniques as appropriate for a specific solution. Work with cross-functional teams to develop ideas and execute business plans. Develop solution prototypes and help integrate them with the product. Work closely with software development teams to communicate requirements and ensure quality of end-to-end deliverables of developed analytical solutions. Contribute to the team's knowledge by keeping up with the state of the art in machine learning applied to our domain.

What Experience You Need
3-5 years of professional experience as a data scientist or statistical modeler in at least one of the following areas: identity and fraud, credit risk, telecommunications, financial services, payments, ecommerce, B2B or B2C, marketing, insurance, or security analytics. Strong quantitative analytical experience, including a hands-on background in statistics, regression modeling, neural networks, decision trees, random forests, support vector machines, kernel-based methods, clustering, and similar methods. Bachelor's degree required in Mathematics, Computer Science, Engineering, Operations Research, Statistics, or another related discipline.

What Could Set You Apart
A Master's degree or higher in Mathematics, Computer Science, Engineering, Operations Research, Statistics, or another related discipline is a definite plus.

Posted 1 week ago

Apply

6.0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

Source: LinkedIn

Experience: 6+ years. Shift Timings: 2:30 pm - 11:30 pm. Location: Chennai preferred; remote can also be considered.
Skills Required: AEM Technology, Sightly/HTL, OSGI flexi, CMS Development, JCR/CRX, Apache Sling.
Job Description: Experience creating AEM applications using components and templates, workflows, replication strategies, versioning and publishing of pages, and JCR/CRX. Deep understanding of Sightly (HTL), Sling Models, queries, and OSGi services and components. Deep understanding of AEM Sites deployment, dispatcher configuration, clustering and fragmentation, content fragments, and experience fragments. Deep understanding of Java Servlets, JSP, Maven, and version control tools (GitLab). Experience translating client requirements into detailed technical specifications, and developing, implementing, and supporting business applications for clients.

Posted 1 week ago

Apply

9.0 - 14.0 years

20 - 35 Lacs

Pune, Bengaluru, Delhi / NCR

Hybrid

Source: Naukri

We are seeking a candidate with strong working experience as a Data Scientist. The ideal candidate must have utilized Azure services related to data, data engineering, and machine learning. Location: PAN India.

Machine Learning (7+ years of relevant experience): Experience in handling ETRM projects. Good understanding of power trading concepts. Must have good experience in implementing machine learning models such as Prophet, ARIMA, SARIMA, XGBoost, ElasticNet, Ridge, Lasso, Random Forest, and Linear Regression on time-series data. Proficient in using Python ML packages such as scikit-learn, sktime, and darts. Must have strong expertise in key techniques for time-series feature engineering, including lag features, rolling-window statistics, Fourier transforms, and handling seasonality (see the sketch below). Proven ability to tune the performance of existing deployed forecasting models.

Must have experience with the Azure Machine Learning Python SDK v1/v2 to: manage data, models, and environments; build and debug AML pipelines that stitch together multiple tasks (feature engineering, training, registering models, etc.) and production workflows using Azure ML pipelines; schedule Azure ML jobs; and deploy registered models to create endpoints. Good to have: experience with K-Means clustering.
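As an illustration of the lag-feature and rolling-window techniques mentioned above, here is a minimal pandas sketch. The column name, window sizes, and synthetic price series are assumptions for illustration only:

```python
import pandas as pd
import numpy as np

# Hypothetical hourly power-price series.
idx = pd.date_range("2024-01-01", periods=200, freq="h")
df = pd.DataFrame({"price": np.random.default_rng(0).normal(50, 5, len(idx))}, index=idx)

# Lag features: previous observations as predictors.
for lag in (1, 24, 168):  # 1 hour, 1 day, 1 week back
    df[f"lag_{lag}"] = df["price"].shift(lag)

# Rolling-window statistics over the past day (shifted to avoid target leakage).
df["roll_mean_24"] = df["price"].shift(1).rolling(24).mean()
df["roll_std_24"] = df["price"].shift(1).rolling(24).std()

# Simple seasonality encoding from the timestamp.
df["hour"] = df.index.hour
df["dayofweek"] = df.index.dayofweek

model_ready = df.dropna()  # keep rows with complete lag history only
print(model_ready.head())
```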

Posted 1 week ago

Apply

5.0 - 10.0 years

10 - 15 Lacs

Chennai

Work from Office

Source: Naukri

The Senior Analyst, Business Analytics will assist the Manager, Business Analytics in delivering analytics projects centered on the four levers: Acquisition, Activation, Portfolio Development, and Retention. The scope of the role includes the commissioning of new types of data and the use of advanced analytics to drive assisted and predictive decision-making, leading to a data-intelligent future. The candidate should have a minimum of 4 years of experience in the banking domain and proficiency in analytics tools: SAS EG, SAS EMiner, Excel, SQL.

Posted 1 week ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

We are hiring for our client, a GCC based in Hyderabad, who is yet to establish their presence in India.

Job Summary: We are looking for a Senior Data Engineer to join our growing team of analytics experts. As a data engineer, you are responsible for designing and implementing our data pipeline architecture and optimizing data flow and collection for cross-functional groups, with scalability in mind. Data engineering is about building the underlying infrastructure, so being able to pass the limelight to someone else is imperative.

Required Skills: Hands-on experience in data integration and data warehousing. Strong proficiency in Google BigQuery, Python, SQL, Airflow/Cloud Composer, and Ascend or any modern ETL tool. Experience with data quality frameworks or custom-built validations.

Preferred Skills: Knowledge of dbt for data transformation and modeling. Familiarity with Collibra for data cataloging and governance.

Qualifications: Advanced working SQL knowledge, experience working with relational databases, and working familiarity with a variety of databases. Strong analytic skills for working with unstructured datasets. Experience building a serverless data warehouse in GCP or AWS. 5+ years of experience in a Data Engineer role, with a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field.

Responsibilities: Create and maintain optimal data pipeline architecture. Assemble large, complex data sets that meet functional and non-functional business requirements. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc. Design, build, and optimize data pipelines using Google BigQuery, ensuring use of best practices such as query optimization, partitioning, clustering, and scalable data modeling (a minimal sketch follows below). Develop robust ETL/ELT processes using Python and SQL, with an emphasis on reliability, performance, and maintainability. Create and manage Ascend (or equivalent tool) data flows, including setting up read/write connectors for various data sources, implementing custom connectors using Python, and managing scheduling, failure notifications, and data services within Ascend. Implement data quality checks (technical and business level) and participate in defining data testing strategies to ensure data reliability. Perform incremental loads and merge operations in BigQuery. Build and manage Airflow (Cloud Composer) DAGs, configure variables, and handle scheduling as part of orchestration. Work within a CI/CD (DevSecOps) setup to promote code efficiently across environments. Participate in technical solutioning: translate business integration needs into technical user stories, contribute to technical design documents, and provide accurate estimations. Conduct and participate in code reviews, enforce standards, and mentor junior engineers. Collaborate with QA and business teams during UAT; troubleshoot and resolve issues in development, staging, and production environments.
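As a concrete illustration of the partitioning and clustering best practices mentioned above, the sketch below creates a date-partitioned, clustered BigQuery table through the Python client. The project, dataset, and column names are assumptions for illustration only:

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

# Hypothetical table: partition by event date, cluster by customer and event type
# so filtered scans on those columns touch less data and run cheaper/faster.
ddl = """
CREATE TABLE IF NOT EXISTS `my_project.analytics.events_clustered`
PARTITION BY DATE(event_ts)
CLUSTER BY customer_id, event_type
AS
SELECT event_ts, customer_id, event_type, payload
FROM `my_project.raw.events`
"""

job = client.query(ddl)
job.result()  # wait for the DDL job to finish
print("Table created (or already present).")
```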

Posted 1 week ago

Apply

4.0 years

7 - 9 Lacs

Hyderābād

Remote

Source: Glassdoor

Job Description: Job Purpose As a SQL Database Administrator, you will be part of the Global Data Services team with accountability for designing, testing, implementing, and maintaining the organization’s databases across multiple platforms, technologies (e.g., relational, non-relational and warehousing) and computing environments (e.g., host based, distributed systems, client server). The successful candidate will be responsible for all aspects of data and database administration. Responsibilities SQL DBA with 24/7 support for financial applications. Good experience in different MS SQL Server versions 2012,2014, 2016, 2017 & 2019. MS SQL Server installations/Patching on Standalone/Cluster environment. Should support database upgrades and patches, capacity planning and other activities as may be necessary. Database backup and recovery. Creating, configuring, monitoring, scheduling & troubleshooting jobs. Configuring and troubleshooting Log shipping, Mirroring, High Availability. Configuring, Reinitializing, and troubleshooting Replication. Troubleshooting clustering issues. SQL Server Database Performance Tuning. Detecting and troubleshooting Security and remote connectivity issues. RCA Provision and Problem management. Monitored data activities (i.e. database status, logs, space utilization, extents, Checkpoints, locks and long transactions). Understanding of Mount points. Good knowledge on PowerShell scripting would be added advantage. Knowledge and Experience Bachelor’s Degree in Computer Science, Information Technology or related field 4+ years of experience with production database management, preferably with Microsoft SQL Server & Oracle Experience with Microsoft Replication design, implementation, and support Experience with Microsoft SQL Server Always On availability groups - design, implementation, and support Technical architecture and documentation skills, preferably with Visio Ability to independently complete projects with minimal guidance Ability to work an on-call schedule, after hours and weekends as needed. Ability to learn new technology with a willingness to teach.

Posted 1 week ago

Apply

8.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Job Description
Role Purpose: The purpose of the role is to create exceptional architectural solution design and thought leadership, and to enable delivery teams to provide exceptional client engagement and satisfaction.

Mandatory Skills: Data Science, ML, DL, Python for Data Science, TensorFlow, PyTorch, Django, SQL, MLOps
Preferred Skills: NLP, Gen AI, LLM, PowerBI, Advanced Analytics, Banking exposure

Strong understanding of data science, machine learning, and deep learning principles and algorithms. Proficiency in programming languages and frameworks such as Python, TensorFlow, and PyTorch. Experienced data scientist who can use Python to build various AI models for banking product acquisition, deepening, and retention. Drive data-driven personalisation and customer segmentation in accordance with the bank's data privacy and security standards. Expert in applying ML techniques such as classification, clustering, deep learning, optimization methods, and supervised and unsupervised techniques. Optimize model performance and scalability for real-time inference and deployment. Experiment with different hyperparameters and model configurations to improve AI model quality. Ensure AI/ML solutions are developed, and validations are performed, in accordance with Responsible AI guidelines and standards. Working knowledge and experience in MLOps is a must, and an engineering background is preferred. Excellent command of data warehousing concepts and SQL. Knowledge of personal banking products is a plus.

Mandatory Skills: AI Cognitive. Experience: 8-10 Years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention: of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 1 week ago

Apply

4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Position Name EY- Assurance – Senior - Digital Position Level Senior Position Details As part of EY GDS Assurance Digital, you will be responsible for implementing innovative ideas through AI research to develop high growth & impactful products. You will be helping EY’s sector and service line professionals by developing analytics enabled solutions, integrating data science activities with business relevant aspects to gain insight from data. You will work with multi-disciplinary teams across the entire region to support global clients. This is a core full-time AI developer role, responsible for creating innovative solutions by applying AI based techniques for business problems. As our in-house senior AI engineer, your expertise and skills will be vital in our ability to steer one of our Innovation agenda. Responsibilities Requirements (including experience, skills, and additional qualifications) Convert business problem into analytical problem and devise a solution approach. Clean, aggregate, analyze and interpret the data to derive business insights from it. Own the AI/ML implementation process: Model Design, Feature Planning, Testing, Production Setup, Monitoring, and release management. Work closely with the Solution Architects in deployment of the AI POC’s and scaling up to production level applications. Should have solid background in Python and has deployed on open-source models- Work on data extraction techniques from complex PDF/Word Doc/Forms- entities extraction, table extraction, information comparison. Key Requirements/Skills & Qualification: Excellent academic background, including at a minimum a bachelor or a master’s degree in data science, Business Analytics, Statistics, Engineering, Operational Research, or other related field with strong focus on modern data architectures, processes, and environments. Solid background in Python with excellent coding skills. 4+ years of core data science experience in one or more below areas: Machine Learning (Regression, Classification, Decision Trees, Random Forests, Timeseries Forecasting and Clustering) Understanding and usage of Large Language Models like Open AI models like ChatGPT, GPT4, frameworks like LangChain and Llama Index. Good understanding of open source LLM framework like Mistral, Llama, etc. and fine tuning on custom datasets. Deep Learning (DNN, RNN, LSTM, Encoder-Decoder Models) Natural Language Processing- Text Summarization, Aspect Mining, Question Answering, Text Classification, NER, Language Translation, NLG, Sentiment Analysis, Sentence Computer Vision- Image Classification, Object Detection, Tracking etc. SQL/NoSQL Databases and its manipulation components Working knowledge of API Deployment (Flask/FastAPI/Azure Function Apps) and webapps creation, Docker, Kubernetes. Additional skills requirements: Excellent written, oral, presentation and facilitation skills Ability to coordinate multiple projects and initiatives simultaneously through effective prioritization, organization, flexibility, and self-discipline. Must have demonstrated project management experience. Knowledge of firm’s reporting tools and processes. 
Proactive, organized, and self-sufficient, with the ability to prioritize and multitask. Analyses complex or unusual problems and delivers insightful and pragmatic solutions. Ability to quickly and easily create, gather, and analyze data from a variety of sources. A robust and resilient disposition, able to encourage discipline in team behaviors.

What We Look For
A team of people with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment. An opportunity to be part of a market-leading, multi-disciplinary team of 7,200+ professionals, in the only integrated global assurance business worldwide. Opportunities to work with EY GDS Assurance practices globally with leading businesses across a range of industries.

What Working At EY Offers
At EY, we're dedicated to helping our clients, from startups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: support, coaching, and feedback from some of the most engaging colleagues around; opportunities to develop new skills and progress your career; and the freedom and flexibility to handle your role in a way that's right for you.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 week ago

Apply

8.0 years

3 - 8 Lacs

Gurgaon

On-site

Source: Glassdoor

Date: Jun 5, 2025 Job Requisition Id: 61535 Location: Gurgaon, IN YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we’re a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth – bringing real positive changes in an increasingly virtual world and it drives us beyond generational gaps and disruptions of the future. We are looking forward to hire Microsoft Fabric Professionals in the following areas : Experience 8+ Years Job Description Position: Data Analytics Lead. Experience: 8+ Years. Responsibilities: Build, manage, and foster a high-functioning team of data engineers and Data analysts. Collaborate with business and technical teams to capture and prioritize platform ingestion requirements. Experience of working with manufacturing industry in building a centralized data platform for self service reporting. Lead the data analytics team members, providing guidance, mentorship, and support to ensure their professional growth and success. Responsible for managing customer, partner, and internal data on the cloud and on-premises. Evaluate and understand current data technologies and trends and promote a culture of learning. Build and end to end data strategy from collecting the requirements from business to modelling the data and building reports and dashboards Required Skills: Experience in data engineering and architecture, with a focus on developing scalable cloud solutions in Azure Synapse / Microsoft Fabric / Azure Databricks Accountable for the data group’s activities including architecting, developing, and maintaining a centralized data platform including our operational data, data warehouse, data lake, Data factory pipelines, and data-related services. Experience in designing and building operationally efficient pipelines, utilising core Azure components, such as Azure Data Factory, Azure Databricks and Pyspark etc Strong understanding of data architecture, data modelling, and ETL processes. Proficiency in SQL and Pyspark Strong knowledge of building PowerBI reports and dashboards. Excellent communication skills Strong problem-solving and analytical skills. Required Technical/ Functional Competencies Domain/ Industry Knowledge: Basic knowledge of customer's business processes- relevant technology platform or product. Able to prepare process maps, workflows, business cases and simple business models in line with customer requirements with assistance from SME and apply industry standards/ practices in implementation with guidance from experienced team members. Requirement Gathering and Analysis: Working knowledge of requirement management processes and requirement analysis processes, tools & methodologies. Able to analyse the impact of change requested/ enhancement/ defect fix and identify dependencies or interrelationships among requirements & transition requirements for engagement. Product/ Technology Knowledge: Working knowledge of technology product/platform standards and specifications. Able to implement code or configure/customize products and provide inputs in design and architecture adhering to industry standards/ practices in implementation. Analyze various frameworks/tools, review the code and provide feedback on improvement opportunities. 
Architecture tools and frameworks: Working knowledge of architecture Industry tools & frameworks. Able to identify pros/ cons of available tools & frameworks in market and use those as per Customer requirement and explore new tools/ framework for implementation. Architecture concepts and principles : Working knowledge of architectural elements, SDLC, methodologies. Able to provides architectural design/ documentation at an application or function capability level and implement architectural patterns in solution & engagements and communicates architecture direction to the business. Analytics Solution Design: Knowledge of statistical & machine learning techniques like classification, linear regression modelling, clustering & decision trees. Able to identify the cause of errors and their potential solutions. Tools & Platform Knowledge: Familiar with wide range of mainstream commercial & open-source data science/analytics software tools, their constraints, advantages, disadvantages, and areas of application. Required Behavioral Competencies Accountability: Takes responsibility for and ensures accuracy of own work, as well as the work and deadlines of the team. Collaboration: Shares information within team, participates in team activities, asks questions to understand other points of view. Agility: Demonstrates readiness for change, asking questions and determining how changes could impact own work. Customer Focus: Identifies trends and patterns emerging from customer preferences and works towards customizing/ refining existing services to exceed customer needs and expectations. Communication: Targets communications for the appropriate audience, clearly articulating and presenting his/her position or decision. Drives Results: Sets realistic stretch goals for self & others to achieve and exceed defined goals/targets. Resolves Conflict: Displays sensitivity in interactions and strives to understand others’ views and concerns. Certifications Mandatory At YASH, you are empowered to create a career that will take you to where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence aided with technology for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles Flexible work arrangements, Free spirit, and emotional positivity Agile self-determination, trust, transparency, and open collaboration All Support needed for the realization of business goals, Stable employment with a great atmosphere and ethical corporate culture

Posted 1 week ago

Apply

1.0 - 3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Role: SEO Executive. Department: Marketing. Reporting To: Associate Director, Marketing. Location: Bangalore.

Job Purpose
We are looking for a data-driven and innovative SEO Specialist to help scale organic growth and lead generation for LeadSquared, a fast-growing comprehensive SaaS CRM company. The ideal candidate should have hands-on experience in SEO for SaaS businesses, an understanding of SEO strategies (LLMs, content optimization, and technical SEO optimizations), and a passion for leveraging data to improve search visibility and conversions. This role involves executing on-page, off-page, and technical SEO strategies while working closely with the content, performance marketing, and web development teams.

Key Responsibilities
Execute data-driven SEO strategies to improve organic rankings, traffic, and lead generation for LeadSquared's website. Conduct in-depth keyword research and identify content opportunities based on search intent and B2B SaaS industry trends. Optimize website architecture, metadata, internal linking, and structured data markup to improve crawlability and search visibility. Implement AI-powered SEO techniques, such as content optimization using NLP models (ChatGPT, Jasper, SurferSEO) and automated keyword clustering (a minimal clustering sketch follows below). Monitor Core Web Vitals and collaborate with developers to improve site speed, mobile-friendliness, and UX optimizations. Conduct technical SEO audits, fixing issues related to crawl errors, indexing, schema markup, and JavaScript SEO. Track SEO performance metrics (organic traffic, rankings, conversions, backlinks) and generate actionable insights using GA4, Search Console, Ahrefs, and SEMrush. Assist in content optimization, entity-based SEO, and topic cluster strategies to improve authority in B2B SaaS search results. Support link-building campaigns by identifying relevant industry sites, guest posting opportunities, and partnerships. Conduct competitor analysis to benchmark against top-ranking SaaS competitors and recommend growth opportunities. Stay updated on Google's algorithm updates, AI advancements in search, and emerging SEO trends.

Skills & Experience Required
1-3 years of hands-on SEO experience, preferably in B2B SaaS or tech industries. Strong knowledge of on-page, off-page, and technical SEO best practices. Experience with AI-powered SEO tools (ChatGPT, Jasper, Clearscope, SurferSEO, etc.). Hands-on experience with Google Analytics (GA4), Search Console, Ahrefs, SEMrush, and Screaming Frog. Familiarity with content optimization, keyword research, and NLP-driven SEO strategies. Understanding of HTML, CSS, JavaScript SEO, and technical site architecture. Ability to interpret SEO data, track performance, and provide insights for optimization. Experience in B2B keyword strategy, topic clustering, and buyer-journey-based SEO. Strong analytical mindset and the ability to test and implement SEO experiments. Excellent communication and teamwork skills, with the ability to collaborate with content, development, and marketing teams.
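The automated keyword clustering mentioned above can be prototyped by vectorizing keyword strings and grouping them. A minimal sketch with TF-IDF and K-Means follows; the sample keywords and cluster count are illustrative assumptions, and production work would more likely use embedding-based similarity:

```python
from collections import defaultdict
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

keywords = [
    "crm software for sales teams", "best sales crm", "crm pricing comparison",
    "lead management tool", "lead scoring software", "lead tracking app",
    "marketing automation platform", "email marketing automation",
]

# Character n-grams keep near-duplicate keyword variants close together.
vec = TfidfVectorizer(analyzer="char_wb", ngram_range=(3, 5))
X = vec.fit_transform(keywords)

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)

groups = defaultdict(list)
for kw, label in zip(keywords, km.labels_):
    groups[label].append(kw)

for label, kws in groups.items():
    print(f"cluster {label}: {kws}")
```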

Posted 1 week ago

Apply

0 years

0 Lacs

Bengaluru

On-site

Source: Glassdoor

About Lowe’s Lowe’s is a FORTUNE® 100 home improvement company serving approximately 16 million customer transactions a week in the United States. With total fiscal year 2024 sales of more than $83 billion, Lowe’s operates over 1,700 home improvement stores and employs approximately 300,000 associates. Based in Mooresville, N.C., Lowe’s supports the communities it serves through programs focused on creating safe, affordable housing, improving community spaces, helping to develop the next generation of skilled trade experts and providing disaster relief to communities in need. For more information, visit Lowes.com. Job Summary The Manager will be required to partner with US counterparts to develop and drive analysis in the area of Vendor Cost Strategy. This is a high-impact opportunity for someone passionate about solving strategic business problems and driving cost efficiency across critical areas such as: Vendor Negotiations, Forecasting, Margin Improvement, Inventory Cost Analysis, and Supply Chain Financing. Roles & Responsibilities Candidate should have knowledge and hands on experience to play the following role: Cost Optimization by evaluating First Cost and other landing cost Help analyze impact of tariff and commodity prices on product cost Provide insights and recommendations to business for driving better negotiations with vendors Experience on striking negotiations discussion with vendors Competitive Intelligence to the negotiation team Hands on knowledge on product clustering and segmentation to evaluate similar items along with Parent SKU Hands on knowledge on various forecasting techniques e.g. Time Series, Regression etc. Help with changes in reports and other modifications including conducting ad hoc analysis and reports building as per business needs Ensure on time and accurate delivery of all cost related requests Handled a considerable team in the past Be a go-to point for all data and insights needs for the business. Ability to present insights rather than providing numbers to business. Communicate independently and effectively with stakeholders on scope of project, presenting results, reconciliation of data etc. Pursues self-development and effective relationships with team members and partners Lowe's is an equal opportunity employer and administers all personnel practices without regard to race, color, religious creed, sex, gender, age, ancestry, national origin, mental or physical disability or medical condition, sexual orientation, gender identity or expression, marital status, military or veteran status, genetic information, or any other category protected under federal, state, or local law.

Posted 1 week ago

Apply

6.0 - 8.0 years

12 - 15 Lacs

India

On-site

Source: Glassdoor

Location: Noida Job Type: Full-time Job Summary: We are looking for an experienced Database Administrator (DBA) with 6 to 8 years of hands-on experience in managing, maintaining, and optimizing relational and/or NoSQL database systems. The ideal candidate will be responsible for ensuring database performance, availability, scalability, and security across development, testing, and production environments. Key Responsibilities: Install, configure, and maintain database systems (e.g., PostgreSQL, MySQL, SQL Server, Oracle, MongoDB ). Monitor database performance, identify bottlenecks, and implement tuning solutions. Design and implement backup, restore, and disaster recovery strategies. Ensure database security through access control, encryption, and vulnerability assessments. Automate routine maintenance tasks using scripts and scheduled jobs. Collaborate with developers and DevOps teams to optimize query performance and data access patterns. Perform database upgrades and patch management with minimal downtime. Maintain high availability and failover solutions (e.g., clustering, replication). Conduct data migration, transformation, and integration activities. Maintain data integrity and support compliance with regulations (e.g., GDPR, HIPAA). Required Qualifications: 6–8 years of proven experience as a DBA in a complex production environment. Deep knowledge of RDBMS (e.g., PostgreSQL, Oracle, SQL Server, MySQL). Hands-on experience with NoSQL systems (e.g., MongoDB, Redis, Cassandra) is a plus. Expertise in SQL and PL/SQL or procedural extensions. Strong understanding of database architecture, indexing, partitioning, and replication. Experience with monitoring tools (e.g., New Relic, Nagios, Prometheus, pgBadger). Proficient in shell scripting, Python, or PowerShell for automation. Familiarity with cloud databases (e.g., Amazon RDS, Azure SQL, Google Cloud SQL). Solid understanding of backup/recovery solutions , such as RMAN, pg_dump, logical/physical backups. Preferred Qualifications: Experience with high-availability architectures and failover planning. Knowledge of DevOps tools and CI/CD pipelines involving database changes. Familiarity with database infrastructure in containerized environments (Docker, Kubernetes). Certifications such as Oracle Certified Professional (OCP) , Microsoft Certified: Azure Database Administrator Associate , or AWS Certified Database - Specialty . Job Type: Full-time Pay: ₹1,200,000.00 - ₹1,500,000.00 per year Work Location: In person

Posted 1 week ago

Apply

5.0 - 8.0 years

5 - 9 Lacs

Noida

On-site

Source: Glassdoor

Job Description: We are seeking a highly motivated and enthusiastic Senior Data Scientist with 5–8 years of experience to join our dynamic team. The ideal candidate will have a strong background in AI/ML analytics and a passion for leveraging data to drive business insights and innovation.

Key Responsibilities: Develop and implement machine learning models and algorithms. Work closely with project stakeholders to understand requirements and translate them into deliverables. Utilize statistical and machine learning techniques to analyze and interpret complex data sets. Stay updated with the latest advancements in AI/ML technologies and methodologies. Collaborate with cross-functional teams to support various AI/ML initiatives.

Qualifications: Bachelor's degree in Computer Science, Data Science, or a related field. Strong understanding of machine learning, deep learning, and Generative AI concepts.

Preferred Skills: Experience in machine learning techniques such as regression, classification, predictive modeling, clustering, and the deep learning stack using Python. Experience with cloud infrastructure for AI/ML on AWS (SageMaker, QuickSight, Athena, Glue). Expertise in building enterprise-grade, secure data ingestion pipelines for unstructured data (ETL/ELT), including indexing, search, and advanced retrieval patterns. Proficiency in Python, TypeScript, NodeJS, ReactJS (and equivalents) and frameworks (e.g., pandas, NumPy, scikit-learn, OpenCV, SciPy), Glue crawler, ETL. Experience with data visualization tools (e.g., Matplotlib, Seaborn, QuickSight). Knowledge of deep learning frameworks (e.g., TensorFlow, Keras, PyTorch). Experience with version control systems (e.g., Git, CodeCommit). Strong knowledge and experience in Generative AI / LLM-based development. Strong experience working with key LLM model APIs (e.g., AWS Bedrock, Azure OpenAI / OpenAI) and LLM frameworks (e.g., LangChain, LlamaIndex). Knowledge of effective text chunking techniques for optimal processing and indexing of large documents or datasets. Proficiency in generating and working with text embeddings, with an understanding of embedding spaces and their applications in semantic search and information retrieval (a minimal embedding sketch follows below). Experience with RAG concepts and fundamentals (vector databases, AWS OpenSearch, semantic search, etc.), and expertise in implementing RAG systems that combine knowledge bases with Generative AI models. Knowledge of training and fine-tuning foundation models (Anthropic, Claude, Mistral, etc.), including multimodal inputs and outputs.

Good to Have Skills: Knowledge and experience in building knowledge graphs in production. Understanding of multi-agent systems and their applications in complex problem-solving scenarios.

Equal Opportunity Employer: Pentair is an Equal Opportunity Employer. With our expanding global presence, cross-cultural insight and competence are essential for our ongoing success. We believe that a diverse workforce contributes different perspectives and creative ideas that enable us to continue to improve every day.
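For the embedding and semantic-search items listed above, a minimal sketch is shown below. It assumes the sentence-transformers package and an example model name; both are illustrative choices rather than anything specified in the posting:

```python
import numpy as np
from sentence_transformers import SentenceTransformer  # assumed available

docs = [
    "Reset a customer's password from the admin console.",
    "Quarterly revenue grew eight percent year over year.",
    "Steps to configure two-factor authentication.",
]
query = "How do I enable 2FA?"

model = SentenceTransformer("all-MiniLM-L6-v2")  # example embedding model
doc_vecs = model.encode(docs, normalize_embeddings=True)
q_vec = model.encode([query], normalize_embeddings=True)[0]

# Cosine similarity reduces to a dot product on normalized vectors.
scores = doc_vecs @ q_vec
best = int(np.argmax(scores))
print("best match:", docs[best], "score:", round(float(scores[best]), 3))
```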

Posted 1 week ago

Apply

2.0 - 3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Heaps is one of the fastest-growing health tech start-ups in India today. With more than 5 million patient interactions to date, Heaps is revolutionising how insurers, doctors, and patients experience health care. We are building an AI-driven, end-to-end care management platform that coordinates, tracks, and monitors the health of the patient, empowering them with the right information to take the right decisions at the right time about their health. Heaps is backed by marquee investors - titans in the healthcare industry who have invested more than $7.4 million in Series A funding. Heaps is on a path to global scaling; we've already expanded into 5 countries across the globe.

Are you a data scientist with a knack for problem solving and crisp communication? The data science team at Heaps is chartered to drive analytics and machine learning programs across the organisation. Heaps is looking for experienced Data Scientists who will be at the forefront of developing products and solutions for insurance and hospital clients across the globe. You will be responsible for driving use cases and analytics assignments to help clients uncover actionable insights from data, leading to improved care for patients.

Responsibilities
Understand business problems and work on the statistical and analytical approach required for the solution. Work on the various steps of a data science project: data mining and exploratory data analysis (EDA), feature engineering, statistical modelling, integration with the AI/ML platform, visualisation, result inferencing, and presentation to stakeholders. Leverage machine learning techniques such as regression, classification, clustering, Bayesian algorithms, matrix factorization, and graphical models to contribute to the execution of our vision for ML-based technology solutions. Leverage advanced NLP algorithms and architectures to build custom models for entity extraction and concept recognition, relation extraction, summarization, textual classification and clustering, etc. Collaborate with business stakeholders to effectively integrate and communicate analysis findings.

Qualifications
2-3 years of prior data science experience. Data science programming languages and querying databases: Python, SQL/Hive. Deep understanding of various statistical/ML techniques. Expertise in the use of cloud-based infrastructure to manage the volume and veracity of complex data streams. Basic knowledge of API frameworks like Flask/FastAPI (a minimal serving sketch follows below), version control systems like Git, and cloud platforms like AWS/GCP.
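As an illustration of exposing a scikit-learn model through an API framework such as FastAPI, here is a minimal sketch. The model, feature names, and route are assumptions for illustration, not details from the Heaps posting:

```python
from fastapi import FastAPI
from pydantic import BaseModel
from sklearn.linear_model import LogisticRegression
import numpy as np

app = FastAPI()

# Toy model trained on made-up data at startup; a real service would load
# a persisted model artifact instead.
X = np.array([[0.1, 1.0], [0.9, 0.2], [0.4, 0.8], [0.8, 0.1]])
y = np.array([0, 1, 0, 1])
model = LogisticRegression().fit(X, y)

class PatientFeatures(BaseModel):
    adherence_score: float
    risk_index: float

@app.post("/predict")
def predict(features: PatientFeatures):
    proba = model.predict_proba([[features.adherence_score, features.risk_index]])[0, 1]
    return {"readmission_risk": round(float(proba), 3)}

# Run with: uvicorn app:app --reload  (assuming this file is saved as app.py)
```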

Posted 1 week ago

Apply

1.0 years

0 Lacs

Ahmedabad, Gujarat, India

Remote

Source: LinkedIn

We are looking for a highly skilled and forward-thinking SEO Specialist who lives and breathes search algorithms, AI-driven search models, and organic growth strategies. The ideal candidate is not just experienced in traditional SEO but is also keenly aware of the latest Google Core Updates, Search Generative Experience (SGE), and the evolution of Large Language Model (LLM)-based search engines like ChatGPT, Perplexity, and You.com. You'll play a key role in strategizing, implementing, and managing SEO campaigns that drive visibility, traffic, and conversions across platforms. A deep understanding of how search is evolving in the AI era is a must.

Key Responsibilities

Technical SEO & Site Optimization
Conduct in-depth SEO audits for client and internal websites using tools like Screaming Frog, Ahrefs, Semrush, SurferSEO, etc.
Optimize site architecture, internal linking, crawl budget, Core Web Vitals, schema markup, and mobile usability.
Implement and manage structured data (JSON-LD) for enhanced SERP appearance (rich snippets, FAQs, etc.); see the sketch after this listing.

Google Algorithm & AI Search Strategy
Stay on top of Google algorithm updates (Panda, Penguin, BERT, MUM, HCU, etc.) and adjust strategies accordingly.
Understand and adapt SEO strategy for Search Generative Experience (SGE) and AI-powered search outputs.
Monitor emerging search platforms leveraging LLMs (like ChatGPT plug-ins, Perplexity AI, Brave Search) and strategize visibility there.

Content Strategy & Semantic SEO
Work with content teams to optimize content using NLP, TF-IDF, topical clustering, and semantic SEO techniques.
Use tools like Frase, Clearscope, SurferSEO, and MarketMuse to plan, score, and improve long-form content.
Collaborate on keyword maps, content calendars, and on-page optimization strategies that align with user intent and entity-based search.

Performance Analysis & Reporting
Monitor KPIs and performance using Google Search Console, Google Analytics 4 (GA4), Looker Studio, and Matomo.
Create detailed SEO reports, recommendations, and roadmap documents for internal and external stakeholders.
Set up and monitor conversion tracking, click-through rate improvements, and engagement metrics.

On-Page, Off-Page & Link Building
Optimize metadata, headers, keyword placement, alt text, canonicalization, and UX elements.
Design and execute ethical, white-hat link-building campaigns (guest blogging, niche outreach, digital PR).
Identify opportunities for brand mentions, HARO backlinks, and authority building.

Cross-Functional Collaboration
Coordinate with designers, developers, and copywriters to implement SEO changes and campaigns.
Communicate SEO concepts clearly to non-technical stakeholders.
Educate internal teams on the impact of AI and LLMs on search and user behavior.

Key Qualifications & Skills

Educational & Professional Background
Bachelor's degree in Information Technology, Computer Science, Digital Marketing, or a related field.
1+ years of professional experience in technical and strategic SEO.
Certifications from Google, Moz, HubSpot, SEMrush, or similar platforms preferred.

Technical Proficiency
Tools: Ahrefs, Semrush, Screaming Frog, SurferSEO, Frase, MarketMuse; Google Search Console, GA4, Looker Studio, Tag Manager; Ubersuggest, Moz, KWFinder, AnswerThePublic, ChatGPT
CMS Experience: WordPress, Webflow, Shopify, Magento, Joomla, Drupal
Coding & Technical: Basic understanding of HTML, CSS, JavaScript, and server-side rendering vs. client-side rendering.
Familiar with robots.txt, .htaccess, redirects, canonical tags, and sitemap management.
AI & Emerging Tech: Exposure to AI writing tools (Jasper, Copy.ai) and experience with LLM outputs and prompt optimization. Awareness of zero-click search trends, entity SEO, and voice search optimization.

Soft Skills
Excellent written and verbal communication in English.
Strategic thinker with an analytical, data-driven mindset.
Ability to work independently, manage multiple projects, and meet deadlines.
Proactive learner who tracks SEO trends and platform innovations.

What We Offer
Opportunity to lead SEO strategy across diverse industries.
Exposure to cutting-edge AI and search innovation projects.
Flexible working hours and remote-friendly environment.
A collaborative, growth-driven digital culture.

How to Apply
Send your resume, portfolio/case studies, and a short note on "How you would future-proof an SEO strategy in the age of AI and Google SGE?" to: mansi.bagdai@ci360degrees.com
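Illustration (not part of the posting): a minimal Python sketch of the kind of JSON-LD FAQ structured data referenced under Technical SEO above, assuming hypothetical question-and-answer content; the object follows the schema.org FAQPage layout used for FAQ rich results.

```python
import json

# Hypothetical FAQ content; in practice this would come from the page's CMS.
faqs = [
    ("What is Search Generative Experience (SGE)?",
     "SGE is Google's AI-generated summary layer shown above traditional results."),
    ("Does schema markup guarantee rich snippets?",
     "No. Valid markup makes a page eligible; Google decides whether to show them."),
]

# Build a schema.org FAQPage object as JSON-LD.
faq_schema = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": question,
            "acceptedAnswer": {"@type": "Answer", "text": answer},
        }
        for question, answer in faqs
    ],
}

# Emit the <script> tag that would be embedded in the page template.
print('<script type="application/ld+json">')
print(json.dumps(faq_schema, indent=2))
print("</script>")
```

The emitted script block would be placed in the page template and validated with a rich-results testing tool before rollout.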

Posted 1 week ago

Apply

1.0 years

0 Lacs

Kanpur, Uttar Pradesh, India

On-site


Description
We are looking for a data engineer and scientist with a deep understanding of blockchain network protocols, infrastructure, systems engineering, distributed systems, and architectural design. The candidate will lead research and ideation to scale data infrastructure and assess feasibility in production, and will have the opportunity to expand expertise in different areas of engineering and applied research. This is an exciting opportunity to make a measurable impact within a corporate environment that values precision, innovation, and strategic thinking.

Responsibilities
Architect and build a highly scalable, resilient, and distributed system that integrates a variety of data sources.
Standardize data schemas.
Maintain servers and build performant, reliable production-grade systems.
Design and optimize protocol implementations.
Work across the tech stack (ETL, cache, code, deploy).
Mentor fellow team members and participate in code reviews and design reviews.
Collaborate cross-functionally with other data scientists, designers, and product managers to define and implement services.
Develop novel clustering approaches to tackle blockchain-specific challenges around scale, security, and privacy (see the sketch after this listing).
Research privacy-preserving blockchains.
Develop related pipelines using Python and other technologies.
Support product and intelligence teams.

Eligibility
Bachelor's degree or above in Computer Science or a related field.
At least 1 year of experience in the related domain and skills.
Experience in building real-time, highly dynamic, and data-heavy systems.
Ability to work independently, be proactive, prioritize, and communicate novel ideas.
Experience working in fast-paced environments or startups.
Prior experience publishing research-oriented articles in A* journals.
Proficiency and expertise in different data platforms.
Proficiency in Python, Kafka, WebSockets, Elasticsearch, MongoDB, Kubernetes, Docker, Neo4j, Cypher, Wireshark, Git, AWS, Google Cloud, Swagger, Linux, VS, Solidity, and network protocols.
Strong ability to synthesize, interpret, and report complex information clearly and accurately.
Knowledge of blockchain technologies and different platforms, with a focus on tracing transactions and identifying malicious activity; working knowledge of cryptocurrencies, darknet marketplaces, and Tor, and experience in making payments with Bitcoin and Ethereum.
Excellent communication skills to collaborate with cross-functional teams and the ability to explain concepts to people with no technical background.
Understanding of coding standards.

Desired Eligibility
Relevant certifications and experience in cybersecurity or intelligence gathering.
Working knowledge of SQL is preferred.
Prior experience working in fast-paced environments or startups.

Travel
As and when required, across the country, for project execution and monitoring as well as for coordination with geographically distributed teams.
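Illustration (not part of the posting): one widely used blockchain address-clustering idea is the common-input-ownership heuristic, where addresses spent together as inputs to a single transaction are grouped into one entity. A minimal sketch, assuming hypothetical transaction data and using networkx connected components:

```python
import networkx as nx

# Hypothetical simplified transactions: each maps a transaction id to the list of
# input addresses spent together (the multi-input heuristic assumes co-spent
# inputs are controlled by the same entity).
transactions = {
    "tx1": ["addrA", "addrB"],
    "tx2": ["addrB", "addrC"],
    "tx3": ["addrD"],
}

graph = nx.Graph()
for tx_id, input_addresses in transactions.items():
    graph.add_nodes_from(input_addresses)
    # Linking every input to the first one is enough to put co-spent inputs
    # in the same connected component.
    first = input_addresses[0]
    for other in input_addresses[1:]:
        graph.add_edge(first, other)

# Each connected component is one inferred wallet cluster.
clusters = list(nx.connected_components(graph))
print(clusters)  # e.g. [{'addrA', 'addrB', 'addrC'}, {'addrD'}]
```

Production systems layer further heuristics (change-address detection, exchange tagging) on top of this basic grouping.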

Posted 1 week ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


The D. E. Shaw group is a global investment and technology development firm with more than $60 billion in investment capital as of December 1, 2023, and offices in North America, Europe, and Asia. Since our founding in 1988, our firm has earned an international reputation for successful investing based on innovation, careful risk management, and the quality and depth of our staff. We have a significant presence in the world's capital markets, investing in a wide range of companies and financial instruments in both developed and developing economies.

We are looking for a resourceful candidate to join our Financial Operations group, based out of the firm's Hyderabad office.

WHAT YOU'LL DO DAY-TO-DAY:
If hired, you will support one or more of the functions mentioned below:

Compliance – The Compliance group is responsible for implementing and enforcing policies and procedures across multiple regulatory requirements. You will assist front-office trading desks by conducting thorough research on the firm's positions and the applicable regulations for each asset class across jurisdictions, deploying advanced data analytics and data visualization techniques such as statistical analysis, predictive modelling, and clustering on tools like Power BI. You will also perform regulatory filing obligations, requiring complex quantitative and qualitative analysis of the firm's short-position and long-ownership reports across US, UK, EU, Japan, and other jurisdictions, and assist in monitoring and implementing controls designed to ensure compliance with global regulations by deploying various data analytics and visualization techniques. Additionally, you will conduct thorough analysis of the fundamental research data collected by the trading groups through their interactions with external industry experts and consultants across various market sectors, in order to avoid potential trading while in possession of material non-public information. Furthermore, you will assist with long-term projects to maintain the firm's high standards of compliance with new and existing regulations.

Counterparty Relationship Management – This role provides an opportunity to gain a sound understanding of key data points, including the firm's trades, positions, and margins; research, trading, and financing costs; broker interactions and broker vote results; and revenue and client ranks. The team is responsible for coordinating with multiple stakeholders, including the Front Office, Treasury, Middle Office, technology teams, and counterparties, to ensure a seamless infrastructure for calculating, reviewing, and reporting commissions and financing costs. The team works closely with the technology team to develop platforms and tools for reporting and analysis, streamline workflows, and improve the user experience. The team also drives project integrations and analyzes multiple data points to generate actionable insights that may lead to adjustments in the firm's counterparty relationships.

Tax – The Tax team is responsible for timely and accurate filings of federal and state tax returns for U.S. entities and for issuing statements (Schedule K-1s) and projections to our investors. The team does extensive tax compliance work for partnerships and corporations relating to U.S. hedge funds, U.S. private equity, and management company entities. The team performs a detailed review of tax workpapers and tax returns and a detailed analysis of financial products and their tax treatment, analyzes the book income, computes the taxable income, and allocates it to the investors using complex tax allocation methodologies. The team interacts with various internal and external stakeholders to understand the process workflows relating to trade and fund accounting and industry-wide tax practices and their reporting. The team plays a pivotal role in various tax process automation initiatives and generates tax analytics and visualization dashboards, which help create efficiencies and aid alpha generation. Additionally, the team works with the Tax Planning team on entity structuring consultation, analyzing tax implications for new financial products, comprehending new tax regulations, and devising strategies to optimize tax efficiency.

WHO WE'RE LOOKING FOR:
A Chartered Accountancy degree; candidates should have cleared CA Inter/IPCC in their first or second attempt and completed a minimum of one year of articleship experience.
An ability to uphold high standards, analyze rigorously, communicate clearly and concisely, and thrive on collaboration, along with a high degree of intellectual curiosity.
Knowledge of business intelligence tools, along with an understanding of global accounting frameworks, would be an added advantage.
High motivation and the ability to handle high levels of ownership in the respective area of operation, while being comfortable functioning in an agile environment that entails time-bound, high-quality deliverables.

Interested candidates can apply through our website: https://www.deshawindia.com/recruit/jobs/Ads/Linkedin/IndTrainRepFeb25

We encourage candidates with relevant experience looking to restart their careers after a break to apply for this position. Learn about Recommence, our gender-neutral return-to-work initiative.

The firm offers excellent benefits, a casual, collegial working environment, and an attractive compensation package. For further information about our recruitment process, including how applicant data will be processed, please visit https://www.deshawindia.com/careers

Members of the D. E. Shaw group do not discriminate in employment matters on the basis of sex, race, colour, caste, creed, religion, pregnancy, national origin, age, military service eligibility, veteran status, sexual orientation, marital status, disability, or any other protected class.

Posted 1 week ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


To get the best candidate experience, please consider applying for a maximum of 3 roles within 12 months to ensure you are not duplicating efforts.

Job Category
Software Engineering

Job Details

About Salesforce
We're Salesforce, the Customer Company, inspiring the future of business with AI + Data + CRM. Leading with our core values, we help companies across every industry blaze new trails and connect with customers in a whole new way. And, we empower you to be a Trailblazer, too — driving your performance and career growth, charting new paths, and improving the state of the world. If you believe in business as the greatest platform for change and in companies doing well and doing good – you've come to the right place.

Role Description
Our Tech and Product team is tasked with innovating and maintaining a massive distributed systems engineering platform that ships hundreds of features to production for tens of millions of users across all industries every day. Our users count on our platform to be highly reliable, lightning fast, supremely secure, and to preserve all of their customizations and integrations every time we ship. Our platform is deeply customizable to meet the differing demands of our vast user base, creating an exciting environment filled with complex challenges for our hundreds of agile engineering teams every day.

Required Skills And Experience
Salesforce is looking for Site Reliability Engineers to build and manage a multi-substrate Kubernetes and microservices platform which powers Core CRM and a growing set of applications across Salesforce. This platform provides the ability to develop and deploy microservices quickly and efficiently, accelerating their path to production.

In this role, you are responsible for the high availability of a large fleet of clusters running various technologies like Kubernetes, software load balancers, service mesh, and so on. You'll gain valuable experience troubleshooting real production issues, which will expand your knowledge of the architecture of k8s ecosystem services and internals.
You will contribute code wherever possible to drive improvement.
You will drive automation efforts in Python/Golang/Terraform/Spinnaker/Puppet/Jenkins to eliminate manual work in day-to-day operations.
You will help improve the visibility of the platform by implementing necessary monitoring and metrics.
You'll implement self-healing mechanisms to proactively fix issues and reduce manual labor (see the sketch after this listing).
You will get a chance to improve your communication and collaboration skills working with various other infrastructure teams across Salesforce.
You will be interacting with a highly innovative and creative team of developers and architects.
You will evaluate new technologies to solve problems as needed.

You are the ideal candidate if you have a passion for live-site service ownership. You have demonstrated a strong ability to manage large distributed systems. You are comfortable with troubleshooting complex production issues that span multiple disciplines. You bring a solid understanding of how infrastructure software components work. You are able to automate tasks using a modern high-level language. You have good written and spoken communication skills.

Required Skills:
Experience operating large-scale distributed systems, especially in cloud environments
Excellent troubleshooting skills with the ability to learn new technologies in complex distributed systems
Strong working experience with Linux systems administration and good knowledge of Linux internals
Good experience in any of the scripting/programming languages: Python, GoLang, etc.
Basic knowledge of networking protocols and components: TCP/IP stack, switches, routers, load balancers
Experience in any of Puppet, Chef, Ansible, or other DevOps tools
Experience in any of the monitoring tools like Nagios, Grafana, Zabbix, etc.
Experience with Kubernetes, Docker, or service mesh
Experience with AWS, Terraform, Spinnaker
A continuous learner and a critical thinker
A team player with great communication skills

Areas where you may be working include highly scalable, highly performant distributed systems with highly available and durable data storage capabilities that ensure high availability of the stack above, including databases. A thorough understanding of distributed systems, systems programming, and working with system resources is required. Practical knowledge of challenges around clustering solutions, hands-on experience deploying your code in public cloud environments, working knowledge of Kubernetes, and experience working with the APIs provided by various public cloud vendors to handle data are highly desired skills.

Benefits & Perks
Comprehensive benefits package including well-being reimbursement, generous parental leave, adoption assistance, fertility benefits, and more!
World-class enablement and on-demand training with Trailhead.com
Exposure to executive thought leaders and regular 1:1 coaching with leadership
Volunteer opportunities and participation in our 1:1:1 model for giving back to the community
For more details, visit https://www.salesforcebenefits.com/

Accommodations
If you require assistance due to a disability applying for open positions, please submit a request via this Accommodations Request Form.

Posting Statement
Salesforce is an equal opportunity employer and maintains a policy of non-discrimination with all employees and applicants for employment. What does that mean exactly? It means that at Salesforce, we believe in equality for all. And we believe we can lead the path to equality in part by creating a workplace that's inclusive and free from discrimination. Know your rights: workplace discrimination is illegal. Any employee or potential employee will be assessed on the basis of merit, competence, and qualifications – without regard to race, religion, color, national origin, sex, sexual orientation, gender expression or identity, transgender status, age, disability, veteran or marital status, political viewpoint, or other classifications protected by law. This policy applies to current and prospective employees, no matter where they are in their Salesforce employment journey. It also applies to recruiting, hiring, job assignment, compensation, promotion, benefits, training, assessment of job performance, discipline, termination, and everything in between. Recruiting, hiring, and promotion decisions at Salesforce are fair and based on merit. The same goes for compensation, benefits, promotions, transfers, reduction in workforce, recall, training, and education.
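Illustration (not part of the posting): a minimal sketch of the kind of self-healing automation such a role might build, assuming the official Kubernetes Python client, a hypothetical namespace, and a hypothetical restart-count threshold; a production version would sit behind proper alerting, rate limiting, and change controls.

```python
from kubernetes import client, config

RESTART_THRESHOLD = 5   # hypothetical threshold for "crash looping"
NAMESPACE = "core-crm"  # hypothetical namespace


def delete_crash_looping_pods() -> None:
    """Delete pods whose containers keep restarting so their controller recreates them."""
    config.load_kube_config()  # or config.load_incluster_config() when run inside a cluster
    v1 = client.CoreV1Api()

    for pod in v1.list_namespaced_pod(NAMESPACE).items:
        restarts = sum(
            status.restart_count
            for status in (pod.status.container_statuses or [])
        )
        if restarts >= RESTART_THRESHOLD:
            print(f"Recycling {pod.metadata.name} ({restarts} restarts)")
            v1.delete_namespaced_pod(pod.metadata.name, NAMESPACE)


if __name__ == "__main__":
    delete_crash_looping_pods()
```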

Posted 1 week ago

Apply

18.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


Role Description

Role Summary
We are looking for a highly experienced professional (16–18 years) with deep expertise in infrastructure and cloud database engineering, particularly with Azure SQL. The ideal candidate will have a strong SQL DBA background, hands-on experience in modern DevOps environments, and a proven track record of working on large-scale, complex cloud and infrastructure initiatives.

Core Experience & Skills

Infrastructure & Cloud Database Engineering
Cloud DBA experience – Azure SQL (must have)
Senior SQL DBA background

In-Depth Technical Expertise
SQL Server: MS SQL 2005 – 2019 and onwards
SQL query/workload performance tuning and troubleshooting
Partitioning, In-Memory OLTP, Resource Governor, Query Store (at least 2 of these)
SSRS and SSIS
High availability: AlwaysOn Availability Groups, Failover Cluster Instances
Scheduled DB maintenance routines
Debugging code and optimizing queries
Security & access: knowledge of Active Directory (least-privilege access concept) – good to have
Security hardening and vulnerability management
Scripting & automation: PowerShell, T-SQL, Python – expert level; database-related automation (see the sketch after this listing)
Scale & performance: experience managing large databases (10 TB+); strong understanding of metrics and statistics for DB performance and capacity

Cloud & DevOps Skills
Cloud migrations: experience working with dev/delivery teams on application migrations
API services: building and delivering DB services via APIs
DevOps tooling: Git, Azure DevOps, Terraform, ARM, Jenkins, Ansible, Puppet, Docker, Kubernetes
Engineering practices: Infra-as-Code, Compliance-as-Code, CI/CD, Agile, Shift-Left

Infrastructure Knowledge
Virtualisation, networks, Windows/Linux OS, storage
PaaS models and cloud platform architecture
Strong infrastructure knowledge across domains

Collaboration & Leadership
Cross-functional collaboration with analysts, developers, infra engineers, data scientists, and SMEs
Vendor relationship management
Solid communication skills and self-management

Desirable Knowledge & Qualifications
Cloud database engines (any of the following): Oracle, MySQL, PostgreSQL, MariaDB
Azure: Cosmos DB, MySQL Flexible Server, PostgreSQL Flexible Server
AWS: Aurora, DynamoDB
Certifications: cloud DB certifications – Azure SQL (mandatory), AWS RDS (beneficial)
Cloud platform knowledge: Azure, AWS, GCP, Alibaba – service capabilities and strengths
Native cloud backup solutions
BI & analytics tools: Power BI, Azure Power Platform
Kubernetes: hands-on experience in designing, engineering, and supporting
Industry experience: preferably the financial sector
Cloud transformation programs: experience leading or being part of such initiatives

Skills
SQL DBA, Azure
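Illustration (not part of the posting): a minimal sketch of Python-based database automation of the sort listed above, assuming pyodbc, a hypothetical connection string, and a common (but tunable) 30% fragmentation threshold; it only reports candidates, leaving rebuilds to a reviewed maintenance job.

```python
import pyodbc

# Hypothetical connection string; in practice pull credentials from a secrets store.
CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;"
    "DATABASE=mydb;UID=dba_user;PWD=secret;Encrypt=yes"
)

# List indexes whose fragmentation exceeds 30%, a common rebuild threshold.
QUERY = """
SELECT OBJECT_NAME(ips.object_id) AS table_name,
       i.name                     AS index_name,
       ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON i.object_id = ips.object_id AND i.index_id = ips.index_id
WHERE ips.avg_fragmentation_in_percent > 30
  AND i.name IS NOT NULL
ORDER BY ips.avg_fragmentation_in_percent DESC;
"""

with pyodbc.connect(CONN_STR) as conn:
    for table_name, index_name, fragmentation in conn.execute(QUERY):
        print(f"{table_name}.{index_name}: {fragmentation:.1f}% fragmented")
```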

Posted 1 week ago

Apply

2.0 - 7.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Summary
The Software Developer codes software applications for pipeline operations and planning that provide critical operations support for client companies. The software developer should be a self-starter willing to delve into existing code as well as write new applications. This position includes fundamental product analysis and design, source code development, and project-driven development.
The Liquids Management System (LMS) suite of products is a total software solution for liquid hydrocarbon logistics. Synthesis, one of the products within LMS, is an enterprise web-based system that tracks liquid hydrocarbons from customer contracts, orders, and inventory all the way to charges, billing, and invoicing (order to cash). It is a highly configurable and extensible system, which allows it to cater to the unique business processes of energy companies worldwide.

In this Role, Your Responsibilities Will Be:
Determine coding design requirements from functional and detailed specifications
Analyze software bugs and effect code repairs
Design, develop, and deliver specified software features
Produce usable documentation and test procedures
Deal directly with the end clients to assist in software validation and deployment
Explore and evaluate opportunities to integrate AI/ML capabilities into the LMS suite, particularly for predictive analytics, optimization, and automation (see the sketch after this listing)

Who You Are:
You quickly and decisively act in constantly evolving, unexpected situations. You adjust communication content and style to meet the needs of diverse partners. You always keep the end in sight and put in extra effort to meet deadlines. You analyze multiple and diverse sources of information to define problems accurately before moving to solutions. You observe situational and group dynamics and select the best-fit approach.

For This Role, You Will Need:
BS in Computer Science, Engineering, Mathematics, or a technical equivalent
2 to 7 years of experience
Strong problem-solving skills
Strong programming skills (.NET stack, C#, ASP.NET, web development technologies, HTML5, JavaScript, WCF, MS SQL Server Transact-SQL)
Strong communication skills (client facing)
Flexibility to work harmoniously with a small development team
Familiarity with AI/ML concepts and techniques, including traditional machine learning algorithms (e.g., regression, classification, clustering) and modern Large Language Models (LLMs)
Experience with machine learning libraries and frameworks (e.g., TensorFlow, PyTorch, scikit-learn)
Experience in developing and deploying machine learning models
Understanding of data preprocessing, feature engineering, and model evaluation techniques

Preferred Qualifications that Set You Apart:
Experience with liquid pipeline operations or volumetric accounting is a plus
Knowledge of the oil and gas pipeline industry is also a plus
Experience with cloud-based AI/ML services (e.g., Azure Machine Learning, AWS SageMaker, Google Cloud AI Platform) is a plus

Our Culture & Commitment to You
At Emerson, we prioritize a workplace where every employee is valued, respected, and empowered to grow. We foster an environment that encourages innovation, collaboration, and diverse perspectives—because we know that great ideas come from great teams. Our commitment to ongoing career development and growing an inclusive culture ensures you have the support to thrive. Whether through mentorship, training, or leadership opportunities, we invest in your success so you can make a lasting impact.
We believe diverse teams working together are key to driving growth and delivering business results.
We recognize the importance of employee wellbeing. We prioritize providing competitive benefits plans, a variety of medical insurance plans, an Employee Assistance Program, employee resource groups, recognition, and much more. Our culture offers flexible time-off plans, including paid parental leave (maternal and paternal), vacation, and holiday leave.

About Us
WHY EMERSON

Our Commitment to Our People
At Emerson, we are motivated by a spirit of collaboration that helps our diverse, multicultural teams across the world drive innovation that makes the world healthier, safer, smarter, and more sustainable. And we want you to join us in our bold aspiration.
We have built an engaged community of inquisitive, dedicated people who thrive knowing they are welcomed, trusted, celebrated, and empowered to solve the world's most complex problems — for our customers, our communities, and the planet. You'll contribute to this vital work while further developing your skills through our award-winning employee development programs.
We are a proud corporate citizen in every city where we operate and are committed to our people, our communities, and the world at large. We take this responsibility seriously and strive to make a positive impact through every endeavor.
At Emerson, you'll see firsthand that our people are at the center of everything we do. So, let's go. Let's think differently. Learn, collaborate, and grow. Seek opportunity. Push boundaries. Be empowered to make things better. Speed up to break through. Let's go, together.

Accessibility Assistance or Accommodation
If you have a disability and are having difficulty accessing or using this website to apply for a position, please contact: idisability.administrator@emerson.com

About Emerson
Emerson is a global leader in automation technology and software. Through our deep domain expertise and legacy of flawless execution, Emerson helps customers in critical industries like life sciences, energy, power and renewables, chemical, and advanced factory automation operate more sustainably while improving productivity, energy security, and reliability. With global operations and a comprehensive portfolio of software and technology, we are helping companies implement digital transformation to measurably improve their operations, conserve valuable resources, and enhance their safety.
We offer equitable opportunities, celebrate diversity, and embrace challenges with confidence that, together, we can make an impact across a broad spectrum of countries and industries. Whether you're an established professional looking for a career change, an undergraduate student exploring possibilities, or a recent graduate with an advanced degree, you'll find your chance to make a difference with Emerson. Join our team – let's go!
No calls or agencies please.
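Illustration (not part of the posting): a minimal scikit-learn sketch of the predictive-analytics direction mentioned for the LMS suite, using synthetic stand-in features and target rather than any real pipeline data.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Synthetic stand-in data: features might be pressure, temperature, and pump speed;
# the target might be daily throughput. Real data would come from the product's
# volumetric accounting tables.
rng = np.random.default_rng(seed=42)
X = rng.normal(size=(500, 3))
y = 3.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(scale=0.3, size=500)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Fit a simple regressor and check generalization error on held-out data.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("Test MAE:", mean_absolute_error(y_test, model.predict(X_test)))
```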

Posted 1 week ago

Apply

15.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Ready to join the future of innovation in IT at NXP? Become part of the startup of a dynamic team that is leading NXP on a digital transformation journey. Your role is to be an ambassador for the Agile and DevOps way of working within our global NXP organization. There is a lot of room for new ideas and innovation, and you will be supported to have a continuous focus on development, coaching, and creating a supportive environment for your team.

Linux and UNIX Architect / SME
Subject matter expert with in-depth, hands-on experience managing Linux, AIX, Solaris, clustering, and various types of compute hardware, including blade server technologies such as HPE Synergy and C7000.
Subject matter expert with in-depth, hands-on experience managing VMware, LPAR/VPAR, Solaris Zones, KVM-based virtualization, hyper-converged infrastructure, Docker, and Kubernetes.
Design and prepare solution blueprints and high-level and low-level designs of diverse infrastructure solutions for the above technologies to implement, migrate, integrate, or transform services in on-premises, hybrid, and native cloud (Azure and AWS) datacenter environments.
In-depth knowledge and hands-on experience integrating the above technologies with Satellite, Dell, NetApp, and Pure Storage solutions, NetWorker backup, Oracle, MSSQL, and MySQL database solutions, and middleware services.
Extensive experience in datacenter migrations involving the above technologies.
Strong knowledge and hands-on experience with virtualization migrations such as P2V and virtual machine migration across different platform products.
Design, configure, and support active-active datacenters with virtualization and clustering technologies.
Expertise in automating the technology stack using Ansible, Git, Splunk, REST APIs, and native OS scripting for provisioning, upgrades, changes, and management (see the sketch after this listing).
Strong knowledge of monitoring solutions like Splunk, Zabbix, HPE OneView, and native OS monitoring tools.
Good knowledge of storage, backup, networking, and security products and principles.
Ensure license compliance of products.
Research, identify, select, and test technology products required for solution delivery and architectural improvements.
Establish, implement, and document technology implementation, integration, and migration strategies to help the organization achieve strategic goals and objectives.
Design and document DR architecture to ensure business continuity.
Keep current on industry trends and new technologies for the system architecture.
Manage integrated infrastructure solutions to help business functions achieve objectives in a cost-effective and efficient manner.
Harmonize and maintain standardization of IT infrastructure solutions in datacenters in accordance with global IT architecture and security standards.
Identify gaps, strategic impacts, financial impacts, and risks in the technical solution or offering, and provide technical support.
Define monitoring KPIs and thresholds for proactive detection of availability and performance issues in the technology stack.
Prepare, maintain, and track the technology refresh roadmap to improve efficiency, reliability, and performance and to eliminate technical debt and security risks.
Diagnose complex infrastructure issues and drive the support team to ensure zero-impact delivery of services through incident, problem, change, and risk management.
Support technical support teams in fixing critical incidents and performing root cause analysis.
Periodically audit existing systems infrastructure and architecture to ensure quality, compliance, accuracy, and a high-level understanding of present capabilities.
Periodically assess existing systems infrastructure and provide recommendations on capacity and on improving quality, high availability, and performance.
Recommend and coordinate upgrades, assisting business functions in technology planning aligned with growth projections from IT managers.
Work with IT managers to understand requirements and issues, and guide technology support teams through the strategic and technical steps needed to deliver solutions.
Define system solutions based on business function needs, cost, and required integration with existing applications, systems, or platforms.
Report to IT managers and key stakeholders regarding findings, making recommendations and providing clear roadmaps for successful changes and upgrades.
Collaborate with other IT managers, other infrastructure teams, and application eco domains to develop highly available and reliable systems solutions capable of supporting global IT goals.
Oversee the support teams that implement changes in infrastructure, ensuring seamless integration of new technologies.
Coordinate with project teams and IT managers to track and implement infrastructure migrations and changes.
Review infrastructure changes and advise on the steps and plan needed to ensure business continuity.

Qualifications

Education & Experience
Bachelor's degree in Information Technology, Computer Science, or a related field.
15+ years of experience in an IT architecture/SME role.

Preferred Qualifications/Certifications
Related technology certifications are highly desirable.

Leadership & Soft Skills
Excellent leadership, decision-making, and team-building abilities.
Strong problem-solving skills with a focus on root cause analysis and proactive prevention.
Analytical abilities; proficient in analyzing data and creating reports.
Exceptional verbal and written communication and training skills, with the ability to convey technical concepts to non-technical audiences.
Ability to work under pressure in high-stakes situations with a calm and focused approach.

More information about NXP in India...
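Illustration (not part of the posting): a minimal sketch of REST-API-driven provisioning automation of the kind listed above, assuming a hypothetical automation endpoint, payload, and token; the requests library is used for brevity, and nothing here reflects NXP's actual tooling.

```python
import requests

# Hypothetical provisioning gateway; a real setup would target the organisation's
# own automation service (or drive Ansible via its API) and pull credentials from
# a secrets store rather than a constant.
API_BASE = "https://automation.example.com/api/v1"
TOKEN = "REPLACE_ME"


def provision_vm(name: str, cpus: int, memory_gb: int) -> dict:
    """Request a new virtual machine and return the created record."""
    response = requests.post(
        f"{API_BASE}/vms",
        json={"name": name, "cpus": cpus, "memory_gb": memory_gb},
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    response.raise_for_status()  # fail loudly on 4xx/5xx responses
    return response.json()


if __name__ == "__main__":
    print(provision_vm("build-agent-01", cpus=4, memory_gb=16))
```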

Posted 1 week ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Job Description: Data Scientist (Intern)

Position Overview:
As a Data Scientist, you will be responsible for analyzing and interpreting complex datasets to provide valuable insights and solve challenging business problems. You will work closely with cross-functional teams to gather data, build predictive models, develop algorithms, and deliver data-driven solutions. This position is ideal for a fresher who possesses strong analytical skills, programming expertise, and a passion for working with data.

Key Responsibilities:

Data Collection and Preprocessing:
Collect, clean, and preprocess large volumes of structured and unstructured data from various sources.
Conduct data quality assessments and implement data cleaning techniques to ensure accuracy and reliability.

Exploratory Data Analysis (EDA):
Perform exploratory data analysis to understand the characteristics of the data and identify patterns, trends, and outliers.
Utilize statistical techniques and visualizations to gain insights from the data.

Statistical Modeling and Machine Learning:
Develop predictive models and algorithms using statistical techniques and machine learning algorithms.
Apply regression analysis, classification, clustering, time series analysis, and other relevant methods to solve business problems.
Evaluate model performance, fine-tune parameters, and optimize models for better accuracy (see the sketch after this listing).

Feature Engineering:
Identify and engineer relevant features from raw data to enhance model performance.
Apply feature selection techniques to improve model interpretability and efficiency.

Model Deployment and Evaluation:
Collaborate with software engineers to deploy models into production systems.
Monitor model performance, diagnose issues, and implement improvements as needed.

Data Visualization and Reporting:
Communicate findings and insights effectively through clear and concise data visualizations, reports, and presentations.
Present complex data-driven concepts to non-technical stakeholders in a way that is easy to understand.

Continuous Learning and Research:
Stay up-to-date with the latest advancements in data science, machine learning, and related fields.
Conduct research and experimentation to explore new methodologies and approaches.
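Illustration (not part of the posting): a minimal end-to-end sketch of the preprocessing, modelling, and evaluation steps described above, assuming scikit-learn and synthetic data rather than any particular business dataset.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for a real business dataset.
X, y = make_classification(n_samples=1000, n_features=8, n_informative=4, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# Preprocessing (feature scaling) and the model live in one pipeline,
# then generalization is checked on held-out data.
pipeline = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1000)),
])
pipeline.fit(X_train, y_train)
print(classification_report(y_test, pipeline.predict(X_test)))
```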

Posted 1 week ago

Apply

Exploring Clustering Jobs in India

The job market for clustering roles in India is thriving, with numerous opportunities available for job seekers with expertise in this area. Clustering professionals are in high demand across various industries, including IT, data science, and research. If you are considering a career in clustering, this article will provide you with valuable insights into the job market in India.

Top Hiring Locations in India

Here are 5 major cities in India actively hiring for clustering roles:
1. Bangalore
2. Pune
3. Hyderabad
4. Mumbai
5. Delhi

Average Salary Range

The average salary range for clustering professionals in India varies based on experience levels. Entry-level positions may start at around INR 3-6 lakhs per annum, while experienced professionals can earn upwards of INR 12-20 lakhs per annum.

Career Path

In the field of clustering, a typical career path may look like:
- Junior Data Analyst
- Data Scientist
- Senior Data Scientist
- Tech Lead

Related Skills

Apart from expertise in clustering, professionals in this field are often expected to have skills in:
- Machine Learning
- Data Analysis
- Python/R programming
- Statistics

Interview Questions

Here are 25 interview questions for clustering roles (a short code sketch illustrating the elbow method and silhouette score follows the list):
- What is clustering and how does it differ from classification? (basic)
- Explain the K-means clustering algorithm. (medium)
- What are the different types of distance metrics used in clustering? (medium)
- How do you determine the optimal number of clusters in K-means clustering? (medium)
- What is the Elbow method in clustering? (basic)
- Define hierarchical clustering. (medium)
- What is the purpose of clustering in machine learning? (basic)
- Can you explain the difference between supervised and unsupervised learning? (basic)
- What are the advantages of hierarchical clustering over K-means clustering? (advanced)
- How does the DBSCAN clustering algorithm work? (medium)
- What is the curse of dimensionality in clustering? (advanced)
- Explain the concept of silhouette score in clustering. (medium)
- How do you handle missing values in clustering algorithms? (medium)
- What is the difference between agglomerative and divisive clustering? (advanced)
- How would you handle outliers in clustering analysis? (medium)
- Can you explain the concept of cluster centroids? (basic)
- What are the limitations of K-means clustering? (medium)
- How do you evaluate the performance of a clustering algorithm? (medium)
- What is the role of inertia in K-means clustering? (basic)
- Describe the process of feature scaling in clustering. (basic)
- How does the GMM algorithm differ from K-means clustering? (advanced)
- What is the importance of feature selection in clustering? (medium)
- How can you assess the quality of clustering results? (medium)
- Explain the concept of cluster density in DBSCAN. (advanced)
- How do you handle high-dimensional data in clustering? (medium)
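Several of these questions (the elbow method, inertia, silhouette score, choosing k) are easy to explore hands-on. A minimal sketch, assuming scikit-learn and synthetic blob data rather than any particular dataset:

```python
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score

# Synthetic data with a known number of clusters, purely for illustration.
X, _ = make_blobs(n_samples=600, centers=4, cluster_std=1.0, random_state=0)

# Elbow method: inertia (within-cluster sum of squares) for a range of k values.
# Silhouette score: how well separated the resulting clusters are (higher is better).
for k in range(2, 9):
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)
    score = silhouette_score(X, km.labels_)
    print(f"k={k}  inertia={km.inertia_:.1f}  silhouette={score:.3f}")
# Look for the "elbow" where inertia stops dropping sharply; on this data both
# measures typically point to k=4.
```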

Closing Remark

As you venture into the world of clustering jobs in India, remember to stay updated with the latest trends and technologies in the field. Equip yourself with the necessary skills and knowledge to stand out in interviews and excel in your career. Good luck on your job search journey!
