Jobs
Interviews

3040 Clustering Jobs - Page 5

Setup a job Alert
JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Summary Experience in applying machine learning techniques, Natural Language Processing or Computer Vision using TensorFlow, Pytorch Strong analytical and problem-solving skills Solid software engineering skills across multiple languages including but not limited to Java or Python, C/C++ Build and deploy end to end ML models and leverage metrics to support predictions, recommendations, search, and growth strategies Deep understanding of ML techniques such as: classification, clustering, deep learning, optimization methods, supervised and unsupervised techniques Proven ability to apply, debug, and develop machine learning models Establish scalable, efficient, automated processes for data analyses, model development, validation and implementation, Choose suitable DL algorithms, software, hardware and suggest integration methods. Ensure AI ML solutions are developed, and validations are performed in accordance with Responsible AI guidelines & Standards To closely monitor the Model Performance and ensure Model Improvements are done post Project Delivery Coach and mentor our team as we build scalable machine learning solutions Strong communication skills and an easy-going attitude Oversee development and implementation of assigned programs and guide teammates Carry out testing procedures to ensure systems are running smoothly Ensure that systems satisfy quality standards and procedures Build and manage strong relationships with stakeholders and various teams internally and externally, Provide direction and structure to assigned projects activities, establishing clear, precise goals, objectives and timeframes, run Project Governance calls with senior Stakeholders Take care of entire prompt life cycle like prompt design, prompt template creation, prompt tuning/optimization for various Gen-AI base models Design and develop prompts suiting project needs Lead and manage team of prompt engineers Stakeholder management across business and domains as required for the projects Evaluating base models and benchmarking performance Implement prompt gaurdrails to prevent attacks like prompt injection, jail braking and prompt leaking Develop, deploy and maintain auto prompt solutions Design and implement minimum design standards for every use case involving prompt engineering Key Responsibilities Strategy As the ML Engineer of AI ML Delivery team, the candidate is expected to solutionise, Develop Models and Integrate pipeline for Delivery of AIML Use cases. Business Understand the Business requirement and execute the ML solutioning and ensue the delivery commitments are delivered on time and schedule. Processes Design and Delivery of AI ML Use cases RAI, Security & Governance Model Validation & Improvements Stakeholder Management People & Talent Manage terms of project assignments and deadlines Work with the team dedicated for models related unstructured and structured data. Risk Management Ownership of the delivery, highlighting various risks on a timely manner to the stakeholders. Identifying proper remediation plan for the risks with proper risk roadmap. Governance Awareness and understanding of the regulatory framework, in which the Group operates, and the regulatory requirements and expectations relevant to the role. Regulatory & Business Conduct Display exemplary conduct and live by the Group’s Values and Code of Conduct. Take personal responsibility for embedding the highest standards of ethics, including regulatory and business conduct, across Standard Chartered Bank. This includes understanding and ensuring compliance with, in letter and spirit, all applicable laws, regulations, guidelines and the Group Code of Conduct. Lead the [country / business unit / function/XXX [team] to achieve the outcomes set out in the Bank’s Conduct Principles: [Fair Outcomes for Clients; Effective Financial Markets; Financial Crime Compliance; The Right Environment.] * Effectively and collaboratively identify, escalate, mitigate and resolve risk, conduct and compliance matters. [Insert local regulator e.g. PRA/FCA prescribed responsibilities and Rationale for allocation]. [Where relevant - Additionally, for subsidiaries or relevant non -subsidiaries] Serve as a Director of the Board of [insert name of entities] Exercise authorities delegated by the Board of Directors and act in accordance with Articles of Association (or equivalent) Key stakeholders Business Stakeholders AIML Engineering Team AIML Product Team Product Enablement Team SCB Infrastructure Team Interfacing Program Team Skills And Experience Use NLP, Vision and ML techniques to bring order to unstructured data Experience in extracting signal from noise in large unstructured datasets a plus Work within the Engineering Team to design, code, train, test, deploy and iterate on enterprise scale machine learning systems Work alongside an excellent, cross-functional team across Engineering, Product and Design create solutions and try various algorithms to solve the problem. Stakeholder Managemen Must Have Hands on experience in Kubernetes and Docker Hands on in Azure Cloud services (VMSS, Blob, AKS, Azure LB) Azure Devops tools CI/CD Hands on in Terraform Good To Have Azure OpenAI Grafana and monitoring Qualifications Masters with specialisation in Technology 8- 12 years relevant of Hands-on Experience Strong proficiency with Python, DJANGO framework and REGEX Good understanding of Machine learning framework Pytorch and Tensorflow Knowledge of Generative AI and RAG Pipeline Good in microservice design pattern and developing scalable application. Ability to build and consume REST API Fine tune and perform code optimization for better performance. Strong understanding on OOP and design thinking Understanding the nature of asynchronous programming and its quirks and workarounds Good understanding of server-side templating languages Understanding accessibility and security compliance, user authentication and authorization between multiple systems, servers, and environments Integration of APIs, multiple data sources and databases into one system Good knowledge in API Gateways and proxies, such as WSO2, KONG, nginx, Apache HTTP Server. Understanding fundamental design principles behind a scalable and distributed application Creating and managing database schemas that represent and support business processes, Hands-on experience in any SQL queries and Database server wrt managing deployment. Implementing automated testing platforms, unit tests, and CICD Pipeline About Standard Chartered We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents and we can't wait to see the talents you can bring us. Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion. Together We Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do Never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well Are better together, we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term What We Offer In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing. Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations. Time-off including annual leave, parental/maternity (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holiday, which is combined to 30 days minimum. Flexible working options based around home and office locations, with flexible working patterns. Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, global Employee Assistance Programme, sick leave, mental health first-aiders and all sorts of self-help toolkits A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning. Being part of an inclusive and values driven organisation, one that embraces and celebrates our unique diversity, across our teams, business functions and geographies - everyone feels respected and can realise their full potential.

Posted 2 days ago

Apply

1.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Exciting Opportunity at Eloelo: Join the Future of Live Streaming and Social Gaming! Are you ready to be a part of the dynamic world of live streaming and social gaming? Look no further! Eloelo, an innovative Indian platform founded in February 2020 by ex-Flipkart executives Akshay Dubey and Saurabh Pandey, is on the lookout for passionate individuals to join our growing team in Bangalore. About Us: Eloelo stands at the forefront of multi-host video and audio rooms, offering a unique blend of interactive experiences, including chat rooms, PK challenges, audio rooms, and captivating live games like Lucky 7, Tambola, Tol Mol Ke Bol, and Chidiya Udd. Our platform has successfully attracted audiences from all corners of India, providing a space for social connections and immersive gaming. Recent Milestone: In pursuit of excellence, Eloelo has secured a significant milestone by raising $22Mn in the month of October 2023 from a diverse group of investors, including Lumikai, Waterbridge Capital, Courtside Ventures, Griffin Gaming Partners, and other esteemed new and existing contributors. Why Eloelo? Be a part of a team that thrives on creativity and innovation in the live streaming and social gaming space. Rub shoulders with the stars! Eloelo regularly hosts celebrities such as Akash Chopra, Kartik Aryan, Rahul Dua, Urfi Javed, and Kiku Sharda from the Kapil Sharma Show and that's our level of celebrity collaboration. Working with a world class team ,high performance team that constantly pushes boundaries and limits , redefines what is possible Fun and work at the same place with amazing work culture , flexible timings , and vibrant atmosphere We are looking to hire a business analyst to join our growth analytics team. This role sits at the intersection of business strategy, marketing performance, creative experimentation, and customer lifecycle management, with a growing focus on AI-led insights. You’ll drive actionable insights to guide our performance marketing, creative strategy, and lifecycle interventions, while also building scalable analytics foundations for a fast-moving growth team. We’re looking for 1 to 3 years of experience in business/marketing analytics or growth-focused analytics roles Strong grasp of marketing funnel metrics, CAC, ROAS, LTV, retention, and other growth KPIs SQL Mastery: 1+ years of experience writing and optimizing complex SQL queries over large datasets (BigQuery/Redshift/Snowflake) Experience in campaign performance analytics across Meta, Google, Affiliates etc. Comfort working with creative performance data (e.g., A/B testing, video/image-led analysis) Experience with CLM campaign analysis via tools like MoEngage, Firebase. Ability to work with large datasets, break down complex problems, and derive actionable insights Hands-on experience or strong interest in applying AI/ML for automation, personalization, or insight generation is a plus Good business judgment and a strong communication style that bridges data and decision-making Comfort juggling short-term tactical asks and long-term strategic workstreams Experience in a fast-paced consumer tech or startup environment preferred You will Own reporting, insights, and experimentation across performance marketing, creative testing, and CLM Partner with growth, product, and content teams to inform campaign decisions, budget allocation, and targeting strategy Build scalable dashboards and measurement frameworks for marketing and business KPIs Drive insights into user behavior and campaign effectiveness by leveraging cohorting, segmentation, and funnel analytics Evaluate and experiment with AI tools or models to automate insights, build scoring systems, or improve targeting/personalization Be the go-to person for identifying growth levers, inefficiencies, or new opportunities across user acquisition and retention Bonus Points Experience working with marketing attribution tools (Appsflyer, Adjust etc.) Hands-on experience with Python/R for advanced analysis or automation Exposure to AI tools for marketing analytics (e.g., creative scoring, automated clustering, LLMs for insights) Past experience working in analytics for a D2C, gaming, or consumer internet company You’ve built marketing mix models or predictive LTV models

Posted 2 days ago

Apply

2.0 - 3.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

About the Company : METTLER TOLEDO is a global leader in precision instruments and services. We are renowned for innovation and quality across laboratory, process analytics, industrial, product inspection, and retailing applications. Our sales and service network is one of the most extensive in the industry. Our products are sold in more than 140 countries, and we have a direct presence in approximately 40 countries. For more information, please visit www.mt.com. About the Role : We are seeking an ML Engineer to join the Global Finance Carbonation team to design, implement, and maintain classical machine learning models that drive process and system improvements. Responsibilities : Join the Global Finance Carbonation team to design, implement, and maintain classical machine learning models that drive process and system improvements. Improve processes/systems with diagnostic analytics, data mining, process mining. Execute data analytics, predictive analytics, AI/ML projects. Qualifications : B.E./B.Tech. /BSc. /MSc. (Comp/IT). Required Skills : 2-3 years in Classical ML techniques: regression, decision trees, clustering, ensembles, time series forecasting Python programming Large datasets handling & end-to-end ML pipelines Data collection, cleaning, preprocessing Feature engineering Model validation & statistical assessment Interpretation & communication of model insights Shift Timings: 12 - 9 PM (9 hours). Equal Opportunity Employment We promote equal opportunity worldwide and value diversity in our teams in terms of business background, area of expertise, gender and ethnicity.

Posted 2 days ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Description Position Overview At our Companywe are leveraging analytics and technology, as we invent for life on behalf of patients around the world. We are seeking those who have a passion for using data, analytics, and insights to drive decision making, that will allow us to tackle some of the world’s greatest health threats. Within our commercial Insights, Analytics, and Data organization we are transforming to better power decision-making across our end-to-end commercialization process, from business development to late lifecycle management. Integrated Research and Forecasting (IRF) is a global function encompassing long-range pharmaceutical asset forecasting across the product lifecycles of all assets within Human Health (Oncology, Vaccines, Hospital Specialty / Primary care). Assets include early and late-stage molecules in clinical development, companies under considerations by business development for partnering and/ or acquisition as well as currently launched products. Forecasting deliverables support division planning and decision making across multiple functional areas such as finance, manufacturing, product development and commercial. In addition to ensuring high-quality deliverables, our team drives synergies across divisions, fosters innovation and best practices, and creates solutions to bring speed, scale and shareability to our planning processes. As we endeavor, we are seeking a dynamic talent for the role of “Manager – Strategic Forecasting”. We are looking for a team member within strategic forecasting team based out of Pune. Robust forecasting is a priority for businesses, as the product potential has major implications to a wide range of disciplines. While forecasting of realistic potential can be arrived through both qualitative and quantitative methods, the challenge lies in selecting and deploying the right methodology. Thus, it is essential to have someone who understands and aspires to implement advanced analytics techniques such as Monte Carlo simulations, agent-based modeling, conjoint frameworks, NLP, clustering etc. within forecasting vertical. Primary Responsibilities Include, But Are Not Limited To Responsible for one/multiple therapy areas – demonstrating good pharmaceutical knowledge and project management capability Responsible for conceptualizing and delivering forecasts and analytical solutions, using both strategic as well as statistical techniques within area of responsibility Drive continuous enhancements to evolve the existing forecasting capabilities in terms of value-add, risk/ opportunity/uncertainty - identify and elevate key forecasting levers/insights/findings to inform decision making Collaborate across stakeholders – our Manufacturing Division, Human Health, Finance, Research, Country, and senior leadership – to build and robust assumptions, ensuring forecast accuracy improves over time to support decision making Drive innovation and automation to bring in robustness and efficiency gains in forecasting/process; incorporate best-in-class statistical forecasting methods to improve the accuracy Communicate effectively across stakeholders and proactively identify and resolve conflicts by engaging with relevant stakeholders Responsible for delivery of forecasts in a timely manner with allocated resources Determine the optimal method for forecasting, considering the context of the forecast, availability of data, the degree of accuracy desired, and the timeline available Contribute in evolving our offerings through innovation, standardization/ automation of various offerings, models and processes Qualification And Skills Engineering / Management / Pharma post-graduates with 3+ years of experience in the relevant roles; with 1-2 years of experience in pharmaceutical strategic forecasting or analytics Proven ability to work collaboratively across large and diverse functions and stakeholders Ability to manage ambiguous environments, and to adapt to changing needs of business Strong analytical skills; an aptitude for problem solving and strategic thinking Working knowledge of Monte Carlo simulations and range forecasting Ability to synthesize complex information into clear and actionable insights Proven ability to communicate effectively with stakeholders Solid understanding of pharmaceutical development, manufacturing, supply chain and marketing functions We are driven by our purpose to develop and deliver innovative products that save and improve lives. With 69,000 employees operating in more than 140 countries, we offer state of the art laboratories, plants and offices that are designed to Inspire our employees as we learn, develop and grow in our careers. We are proud of our 125 years of service to humanity and continue to be one of the world’s biggest investors in Research & Development. Current Employees apply HERE Current Contingent Workers apply HERE Search Firm Representatives Please Read Carefully Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs / resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails. Employee Status Regular Relocation VISA Sponsorship Travel Requirements Flexible Work Arrangements Hybrid Shift Valid Driving License Hazardous Material(s) Required Skills Business Analysis, Marketing, Numerical Analysis, Stakeholder Relationship Management, Strategic Planning, Waterfall Model Preferred Skills Job Posting End Date 07/31/2025 A job posting is effective until 11 59 59PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date. Requisition ID R337310

Posted 2 days ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Description Sr. Manager, Strategic Forecasting At our company we are leveraging analytics and technology, as we invent for life on behalf of patients around the world. We are seeking those who have a passion for using data, analytics, and insights to drive decision making, that will allow us to tackle some of the world’s greatest health threats. Within our commercial Insights, Analytics, and Data organization we are transforming to better power decision-making across our end-to-end commercialization process, from business development to late lifecycle management. Integrated Research and Forecasting (IRF) is a global function encompassing long-range pharmaceutical asset forecasting across the product lifecycles of all assets within Human Health (Oncology, Vaccines, Hospital Specialty / Primary care). Assets include early and late-stage molecules in clinical development, companies under considerations by business development for partnering and/ or acquisition as well as currently launched products. Forecasting deliverables support division planning and decision making across multiple functional areas such as finance, manufacturing, product development and commercial. In addition to ensuring high-quality deliverables, our team drives synergies across divisions, fosters innovation and best practices, and creates solutions to bring speed, scale and shareability to our planning processes. As we endeavor, we are seeking a dynamic talent for the role of “Senior Specialist – Strategic Forecasting” We are looking for a team member within strategic forecasting team based out of Pune. Robust forecasting is a priority for businesses, as the product potential has major implications to a wide range of disciplines. While forecasting of realistic potential can be arrived through both qualitative and quantitative methods, the challenge lies in selecting and deploying the right methodology. Thus, it is essential to have someone who understands and aspires to implement advanced analytics techniques such as Monte Carlo simulations, agent-based modeling, conjoint frameworks, NLP, clustering etc. within forecasting vertical. Primary Responsibilities Include, But Are Not Limited To Responsible for conceptualizing and delivering forecasts and analytical solutions, using both strategic as well as statistical techniques Drive continuous enhancements to evolve the existing forecasting capabilities in terms of value-add, risk/ opportunity/uncertainty - identify and elevate key forecasting levers/insights/findings to inform decision making Collaborate across stakeholders – our Manufacturing Divisio , Human Health, Finance, Research, Country, and senior leadership – to build and robust assumptions, ensuring forecast accuracy improves over time to support decision making Drive innovation and automation to bring in robustness and efficiency gains in forecasting/process; incorporate best-in-class statistical forecasting methods to improve the accuracy Communicate effectively across stakeholders and proactively identify and resolve conflicts by engaging with relevant stakeholders Responsible for managing team and delivery of forecasts in a timely manner with allocated resources Determine the optimal method for forecasting, considering the context of the forecast, availability of data, the degree of accuracy desired, and the timeline available Contribute in evolving our offerings through standardization/ automation of various offerings, models and processes Participate in selection, talent development and trainings of our company employees Qualification And Skills Engineering / Management / Pharma post-graduates with 8+ years of experience in the relevant roles; with at least 8+ years of experience in pharmaceutical strategic forecasting or analytics Demonstrated leadership and management in driving innovation and automation leveraging advanced statistical and analytical techniques (expertise in Spotfire will be added advantage) Proven ability to work collaboratively across large and diverse functions and stakeholders Ability to manage ambiguous environments, and to adapt to changing needs of business Strong analytical skills; an aptitude for problem solving and strategic thinking Hands on experience on Monte Carlo simulations and range forecasting Exposure/sound understanding of advanced modeling techniques like Agent based and dynamic transmission model Ability to synthesize complex information into clear and actionable insights Proven ability to communicate effectively across all levels of stakeholders Solid understanding of pharmaceutical development, manufacturing, supply chain and marketing functions Our Human Health Division maintains a “patient first, profits later” ideology. The organization is comprised of sales, marketing, market access, digital analytics and commercial professionals who are passionate about their role in bringing our medicines to our customers worldwide. We are proud to be a company that embraces the value of bringing diverse, talented, and committed people together. The fastest way to breakthrough innovation is when diverse ideas come together in an inclusive environment. We encourage our colleagues to respectfully challenge one another’s thinking and approach problems collectively. We are an equal opportunity employer, committed to fostering an inclusive and diverse workplace. Current Employees apply HERE Current Contingent Workers apply HERE Search Firm Representatives Please Read Carefully Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs / resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails. Employee Status Regular Relocation VISA Sponsorship Travel Requirements Flexible Work Arrangements Hybrid Shift Valid Driving License Hazardous Material(s) Required Skills Business Analysis, Marketing, Numerical Analysis, Stakeholder Relationship Management, Strategic Planning, Waterfall Model Preferred Skills Job Posting End Date 04/30/2025 A job posting is effective until 11 59 59PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date. Requisition ID R335627

Posted 2 days ago

Apply

0 years

0 Lacs

Navi Mumbai, Maharashtra, India

Remote

As an expectation a fitting candidate must have/be: Ability to analyze business problem and cut through the data challenges. Ability to churn the raw corpus and develop a data/ML model to provide business analytics (not just EDA), machine learning based document processing and information retrieval Quick to develop the POCs and transform it to high scale production ready code. Experience in extracting data through complex unstructured documents using NLP based technologies. Good to have : Document analysis using Image processing/computer vision and geometric deep learning Technology Stack: Python as a primary programming language. Conceptual understanding of classic ML/DL Algorithms like Regression, Support Vectors, Decision tree, Clustering, Random Forest, CART, Ensemble, Neural Networks, CNN, RNN, LSTM etc. Programming: Must Have: Must be hands-on with data structures using List, tuple, dictionary, collections, iterators, Pandas, NumPy and Object-oriented programming Good to have: Design patterns/System design, cython ML libraries: Must Have: Scikit-learn, XGBoost, imblearn, SciPy, Gensim Good to have: matplotlib/plotly, Lime/sharp Data extraction and handling: Must Have: DASK/Modin, beautifulsoup/scrappy, Multiprocessing Good to have: Data Augmentation, Pyspark, Accelerate NLP/Text analytics: Must Have: Bag of words, text ranking algorithm, Word2vec, language model, entity recognition, CRF/HMM, topic modelling, Sequence to Sequence Good to have: Machine comprehension, translation, elastic search Deep learning: Must Have: TensorFlow/PyTorch, Neural nets, Sequential models, CNN, LSTM/GRU/RNN, Attention, Transformers, Residual Networks Good to have: Knowledge of optimization, Distributed training/computing, Language models Software peripherals: Must Have: REST services, SQL/NoSQL, UNIX, Code versioning Good to have: Docker containers, data versioning Research: Must Have: Well verse with latest trends in ML and DL area. Zeal to research and implement cutting areas in AI segment to solve complex problems Good to have: Contributed to research papers/patents and it is published on internet in ML and DL Morningstar is an equal opportunity employer. Morningstar’s hybrid work environment gives you the opportunity to work remotely and collaborate in-person each week. We’ve found that we’re at our best when we’re purposely together on a regular basis, at least three days each week. A range of other benefits are also available to enhance flexibility as needs change. No matter where you are, you’ll have tools and resources to engage meaningfully with your global colleagues. I10_MstarIndiaPvtLtd Morningstar India Private Ltd. (Delhi) Legal Entity

Posted 2 days ago

Apply

4.0 - 7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Experience Required: 4-7 years Education Qualification: BE/ B. Tech, MCA, MSc (Statistics), MBA from Recognized University Job Description We are looking for a Data Scientist to join our Data Science team. Data science drives all the products we develop. Our products are designed for small to mid-size financial institutions to help them create strategies based on data. The team is responsible for working on predictive model use cases and working closely with technical and functional stakeholders in an agile environment. Role & Responsibilities 3-6 years of relevant work experience in the Data Science/Analytics domain Work with the data scientists’ team, and data engineers. Take ownership of end-to-end data science projects, including problem formulation, data exploration, feature engineering, model development, validation, and deployment, ensuring high-quality deliverables that meet project objectives. Responsible for building analytic systems and predictive models as well as experimenting with new models and techniques. Collaborate with data architects and software engineers to enable deployment of sciences and technologies that will scale across the company’s ecosystem. Responsible for the conception, planning, and prioritizing of data projects Provide support to inexperienced analysts with high-level expertise in an open-source language (e.g., R, Python, etc.) Adhere to stringent quality assurance and documentation standards using version control and code repositories (e.g., Git, GitHub, Markdown) Utilize data visualization tools (Power BI) to deliver insights to stakeholders. Competencies and Technical Skills Degree holder in computer science or related discipline Proficiency in SQL and database querying for data extraction and manipulation. Proficiency in programming languages such as Python/R, and experience with data manipulation and analysis libraries (e.g., NumPy, pandas, scikit-learn). Familiarity with data visualization tools (e.g., Tableau, Power BI) and proficiency in presenting complex data visually. Solid understanding of experimental design, A/B testing, and statistical hypothesis testing. Proven hands-on experience with machine learning algorithms and parameter tuning, including Ensemble methods (Random Forest, XGBoost), Logistic Regression, Support Vector Machines (SVM), and clustering techniques (e.g., K-Means, DBSCAN). Familiarity with Generative AI concepts and AI Agents is a plus. Excellent verbal and written communication skills, with the ability to effectively convey complex concepts to both technical and non-technical stakeholders· Experience with Snowflake is not necessary but preferable.

Posted 2 days ago

Apply

9.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Role Description Tech Lead – Azure/Snowflake & AWS Migration Key Responsibilities Design and develop scalable data pipelines using Snowflake as the primary data platform, integrating with tools like Azure Data Factory, Synapse Analytics, and AWS services. Build robust, efficient SQL and Python-based data transformations for cleansing, enrichment, and integration of large-scale datasets. Lead migration initiatives from AWS-based data platforms to a Snowflake-centered architecture, including: Rebuilding AWS Glue pipelines in Azure Data Factory or using Snowflake-native ELT approaches. Migrating EMR Spark jobs to Snowflake SQL or Python-based pipelines. Migrating Redshift workloads to Snowflake with schema conversion and performance optimization. Transitioning S3-based data lakes (Hudi, Hive) to Snowflake external tables via ADLS Gen2 or Azure Blob Storage. Redirecting Kinesis/MSK streaming data to Azure Event Hubs, followed by ingestion into Snowflake using Streams & Tasks or Snowpipe. Support database migrations from AWS RDS (Aurora PostgreSQL, MySQL, Oracle) to Snowflake, focusing on schema translation, compatibility handling, and data movement at scale. Design modern Snowflake lakehouse-style architectures that incorporate raw, staging, and curated zones, with support for time travel, cloning, zero-copy restore, and data sharing. Integrate Azure Functions or Logic Apps with Snowflake for orchestration and trigger-based automation. Implement security best practices, including Azure Key Vault integration and Snowflake role-based access control, data masking, and network policies. Optimize Snowflake performance and costs using clustering, multi-cluster warehouses, materialized views, and result caching. Support CI/CD processes for Snowflake pipelines using Git, Azure DevOps or GitHub Actions, and SQL code versioning. Maintain well-documented data engineering workflows, architecture diagrams, and technical documentation to support collaboration and long-term platform maintainability. Required Qualifications 9+ years of data engineering experience, with 3+ years on Microsoft Azure stack and hands-on Snowflake expertise. Proficiency in: Python for scripting and ETL orchestration SQL for complex data transformation and performance tuning in Snowflake Azure Data Factory and Synapse Analytics (SQL Pools) Experience in migrating workloads from AWS to Azure/Snowflake, including services such as Glue, EMR, Redshift, Lambda, Kinesis, S3, and MSK. Strong understanding of cloud architecture and hybrid data environments across AWS and Azure. Hands-on experience with database migration, schema conversion, and tuning in PostgreSQL, MySQL, and Oracle RDS. Familiarity with Azure Event Hubs, Logic Apps, and Key Vault. Working knowledge of CI/CD, version control (Git), and DevOps principles applied to data engineering workloads. Preferred Qualifications Extensive experience with Snowflake Streams, Tasks, Snowpipe, external tables, and data sharing. Exposure to MSK-to-Event Hubs migration and streaming data integration into Snowflake. Familiarity with Terraform or ARM templates for Infrastructure-as-Code (IaC) in Azure environments. Certification such as SnowPro Core, Azure Data Engineer Associate, or equivalent. Skills Azure,AWS REDSHIFT,Athena,Azure Data Lake

Posted 2 days ago

Apply

9.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Role Description Tech Lead – Azure/Snowflake & AWS Migration Key Responsibilities Design and develop scalable data pipelines using Snowflake as the primary data platform, integrating with tools like Azure Data Factory, Synapse Analytics, and AWS services. Build robust, efficient SQL and Python-based data transformations for cleansing, enrichment, and integration of large-scale datasets. Lead migration initiatives from AWS-based data platforms to a Snowflake-centered architecture, including: Rebuilding AWS Glue pipelines in Azure Data Factory or using Snowflake-native ELT approaches. Migrating EMR Spark jobs to Snowflake SQL or Python-based pipelines. Migrating Redshift workloads to Snowflake with schema conversion and performance optimization. Transitioning S3-based data lakes (Hudi, Hive) to Snowflake external tables via ADLS Gen2 or Azure Blob Storage. Redirecting Kinesis/MSK streaming data to Azure Event Hubs, followed by ingestion into Snowflake using Streams & Tasks or Snowpipe. Support database migrations from AWS RDS (Aurora PostgreSQL, MySQL, Oracle) to Snowflake, focusing on schema translation, compatibility handling, and data movement at scale. Design modern Snowflake lakehouse-style architectures that incorporate raw, staging, and curated zones, with support for time travel, cloning, zero-copy restore, and data sharing. Integrate Azure Functions or Logic Apps with Snowflake for orchestration and trigger-based automation. Implement security best practices, including Azure Key Vault integration and Snowflake role-based access control, data masking, and network policies. Optimize Snowflake performance and costs using clustering, multi-cluster warehouses, materialized views, and result caching. Support CI/CD processes for Snowflake pipelines using Git, Azure DevOps or GitHub Actions, and SQL code versioning. Maintain well-documented data engineering workflows, architecture diagrams, and technical documentation to support collaboration and long-term platform maintainability. Required Qualifications 9+ years of data engineering experience, with 3+ years on Microsoft Azure stack and hands-on Snowflake expertise. Proficiency in: Python for scripting and ETL orchestration SQL for complex data transformation and performance tuning in Snowflake Azure Data Factory and Synapse Analytics (SQL Pools) Experience in migrating workloads from AWS to Azure/Snowflake, including services such as Glue, EMR, Redshift, Lambda, Kinesis, S3, and MSK. Strong understanding of cloud architecture and hybrid data environments across AWS and Azure. Hands-on experience with database migration, schema conversion, and tuning in PostgreSQL, MySQL, and Oracle RDS. Familiarity with Azure Event Hubs, Logic Apps, and Key Vault. Working knowledge of CI/CD, version control (Git), and DevOps principles applied to data engineering workloads. Preferred Qualifications Extensive experience with Snowflake Streams, Tasks, Snowpipe, external tables, and data sharing. Exposure to MSK-to-Event Hubs migration and streaming data integration into Snowflake. Familiarity with Terraform or ARM templates for Infrastructure-as-Code (IaC) in Azure environments. Certification such as SnowPro Core, Azure Data Engineer Associate, or equivalent. Senior Data Engineer – Azure/Snowflake Migration Key Responsibilities Design and develop scalable data pipelines using Snowflake as the primary data platform, integrating with tools like Azure Data Factory, Synapse Analytics, and AWS services. Build robust, efficient SQL and Python-based data transformations for cleansing, enrichment, and integration of large-scale datasets. Lead migration initiatives from AWS-based data platforms to a Snowflake-centered architecture, including: Rebuilding AWS Glue pipelines in Azure Data Factory or using Snowflake-native ELT approaches. Migrating EMR Spark jobs to Snowflake SQL or Python-based pipelines. Migrating Redshift workloads to Snowflake with schema conversion and performance optimization. Transitioning S3-based data lakes (Hudi, Hive) to Snowflake external tables via ADLS Gen2 or Azure Blob Storage. Redirecting Kinesis/MSK streaming data to Azure Event Hubs, followed by ingestion into Snowflake using Streams & Tasks or Snowpipe. Support database migrations from AWS RDS (Aurora PostgreSQL, MySQL, Oracle) to Snowflake, focusing on schema translation, compatibility handling, and data movement at scale. Design modern Snowflake lakehouse-style architectures that incorporate raw, staging, and curated zones, with support for time travel, cloning, zero-copy restore, and data sharing. Integrate Azure Functions or Logic Apps with Snowflake for orchestration and trigger-based automation. Implement security best practices, including Azure Key Vault integration and Snowflake role-based access control, data masking, and network policies. Optimize Snowflake performance and costs using clustering, multi-cluster warehouses, materialized views, and result caching. Support CI/CD processes for Snowflake pipelines using Git, Azure DevOps or GitHub Actions, and SQL code versioning. Maintain well-documented data engineering workflows, architecture diagrams, and technical documentation to support collaboration and long-term platform maintainability. Required Qualifications 7+ years of data engineering experience, with 3+ years on Microsoft Azure stack and hands-on Snowflake expertise. Proficiency in: Python for scripting and ETL orchestration SQL for complex data transformation and performance tuning in Snowflake Azure Data Factory and Synapse Analytics (SQL Pools) Experience in migrating workloads from AWS to Azure/Snowflake, including services such as Glue, EMR, Redshift, Lambda, Kinesis, S3, and MSK. Strong understanding of cloud architecture and hybrid data environments across AWS and Azure. Hands-on experience with database migration, schema conversion, and tuning in PostgreSQL, MySQL, and Oracle RDS. Familiarity with Azure Event Hubs, Logic Apps, and Key Vault. Working knowledge of CI/CD, version control (Git), and DevOps principles applied to data engineering workloads. Preferred Qualifications Extensive experience with Snowflake Streams, Tasks, Snowpipe, external tables, and data sharing. Exposure to MSK-to-Event Hubs migration and streaming data integration into Snowflake. Familiarity with Terraform or ARM templates for Infrastructure-as-Code (IaC) in Azure environments. Certification such as SnowPro Core, Azure Data Engineer Associate, or equivalent. Skills Aws,Azure Data Lake,Python

Posted 2 days ago

Apply

1.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

The ideal candidate's favorite words are learning, data, scale, and agility. You will leverage your strong collaboration skills and ability to extract valuable insights from highly complex data sets to ask the right questions and find the right answers. Responsibilities Analyze raw data: assessing quality, cleansing, structuring for downstream processing Design accurate and scalable prediction algorithms Collaborate with engineering team to bring analytical prototypes to production Generate actionable insights for business improvements Qualifications Bachelor's degree or equivalent experience in quantative field (Statistics, Mathematics, Computer Science, Engineering, etc.) At least 1 - 2 years' of experience in quantitative analytics or data modeling Deep understanding of predictive modeling, machine-learning, clustering and classification techniques, and algorithms Fluency in a programming language (Python, C,C++, Java, SQL) Familiarity with Big Data frameworks and visualization tools (Cassandra, Hadoop, Spark, Tableau)

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

thiruvananthapuram, kerala

On-site

As an AI/ML Model Developer in this role, your core responsibility will be to build and optimize AI/ML models by leveraging Azure-based AI tools, TensorFlow, PyTorch, and similar frameworks. You will focus on the Logistics & Maritime industries, addressing complex business challenges using AI. Your expertise in cloud-based AI services for data storage, processing, and model deployment, with extensive use of Azure (and optionally AWS), will be crucial in this position. Your key responsibilities will involve designing, developing, and implementing machine learning algorithms. You will collaborate with business teams to translate requirements into technical AI/ML solutions, ensuring data quality for training models through data preprocessing. Using cloud tools like Azure/AWS, you will optimize models for high performance, perform training, tuning, and evaluation of models for continuous improvement. Staying updated on advancements in AI/ML, exploring new techniques, and providing data analysis & visualization to support business decision-making will be essential. You will also enhance existing products with AI-driven features such as automation, prediction, and recommendations while maintaining records for AI models, pipelines, and experiments. To qualify for this position, you should hold a Bachelor's/Master's degree in Engineering Science, Data Science, or a related field, along with 3-4 years of relevant AI/ML experience, preferably in Logistics and Maritime sectors. Key skills and technologies required include proficiency in programming languages like Python, R, or Java, expertise in machine learning techniques such as supervised/unsupervised learning, regression, classification, and clustering. Familiarity with AI frameworks like TensorFlow, PyTorch, Keras, or Scikit-learn, and experience with cloud platforms like AWS or Azure tools for AI services are necessary. Specific experience with Logistics and Maritime products, along with additional skills in deep learning, NLP, and computer vision, will be advantageous. Strong analytical thinking, problem-solving, and communication skills are essential soft skills for this role. This is a full-time, permanent position with a day shift schedule that requires in-person work at the designated location.,

Posted 2 days ago

Apply

0.0 years

0 Lacs

Gurugram, Haryana

On-site

Role: DC LEAD Location : Gurgaon Sammaan Capital's corporate office at Augusta Point is located at 4th Floor, Augusta Point, Golf Course Road, DLF Phase-5, Sector-53, Gurugram, Haryana -122002, India. EXP: 8-9YRS BUDGET : 10-12LPA Working Days : 6 WFO Look for immediate joiners only 1. JOB DESCRIPTION – Data Centre Lead/ Data Centre Operation Manager 1. Windows Server Administration Windows Server (2016/2019/2022) installation, configuration, and troubleshooting Active Directory (AD) management, Group Policy, and Domain Controllers DNS, DHCP, and network services configuration PowerShell scripting for automation 2. Virtualization & Cloud Hyper-V and VMware administration Virtual Machine (VM) provisioning and maintenance 3. Security & Compliance Patch management and Windows Update services (WSUS) Endpoint security, antivirus, and malware protection Compliance with IT security frameworks (ISO 27001, NIST, GDPR) 4. Monitoring & Performance Optimization Performance tuning and resource optimization Monitoring tools (ME , Zabbix) Troubleshooting high CPU, memory, disk, and network utilization issues 5. High Availability & Disaster Recovery Failover clustering and load balancing Disaster recovery planning and execution Windows Server Backup and restore strategies 7. Incident & Problem Management ITIL framework and service management best practices RCA (Root Cause Analysis) and incident handling

Posted 2 days ago

Apply

5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

🛠️ Job Description We’re building a mobile-first consumer product in a high-growth space. We’re looking for a Founding Engineer who can take full ownership of the tech stack and ship sleek, performant applications with little oversight. This is a 0 to 1 role — you’ll work directly with the founders to bring the product to life from scratch. If you're passionate about building clean, intuitive apps and want to have a major say in product and tech decisions, this is your shot. You’ll be responsible for: Leading the architecture, development, and deployment of the core application Designing and implementing a modern, intuitive mobile-first UI Creating fast, scalable APIs and backend logic Building and maintaining integrations (3rd party product APIs, analytics, etc.) Laying the groundwork for user data tracking , tagging, and feedback loops (Optional but a plus) Prototyping lightweight AI/ML-driven recommendation features Setting up the initial CI/CD, deployment, and cloud infrastructure ✅ Requirements Must-Have Tech Skills: Strong experience with mobile frameworks : React Native / Flutter / Swift / Kotlin Frontend: HTML, CSS, JavaScript (React preferred) Backend: Node.js, Express, or Python/Django/Flask Database: MongoDB / PostgreSQL / Firebase / any modern DB Familiarity with RESTful APIs and 3rd-party integration UI/UX intuition: you should care about aesthetics and functionality equally Bonus Skills: AI/ML: experience with any of the following: Recommendation systems Content-based filtering / embeddings User clustering or tagging via behavioral data Experience with analytics tools (e.g., Mixpanel, Segment) Experience with affiliate APIs, scrapers, or e-commerce data Familiarity with cloud infra (AWS, GCP, Vercel, Firebase, etc.) Prior experience in 0→1 product environments or consumer startups Preferred Background: IIT or equivalent Tier-1 engineering institute 1–5 years of experience, or exceptional college grads welcome Hungry, curious, and willing to take product ownership 📩 To Apply No formal cover letter needed. Just send: Your portfolio / GitHub / LinkedIn Any relevant projects you've built to founders.kreate@gmail.com We'll reach out if there’s a fit.

Posted 2 days ago

Apply

8.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

We Are Looking For 8+years experienced candidates for this role. Job Location : Technopark, Trivandrum. Experience : 8+ years of experience in Microsoft SQL Server administration. Primary skills : Strong experience in Microsoft SQL Server : Bachelor's degree in computer science, software engineering or a related field. Microsoft SQL certifications (MTA Database, MCSA: SQL Server, MCSE: Data Management and Analytics) will be an advantage. Secondary Skills Experience in MySQL, PostgreSQL, and Oracle database administration. Exposure to Data Lake, Hadoop, and Azure technologies. Exposure to DevOps or ITIL. Main Duties/responsibilities Optimize database queries to ensure fast and efficient data retrieval, particularly for complex or high-volume operations. Design and implement effective indexing strategies to reduce query execution times and improve overall database performance. Monitor and profile slow or inefficient queries and recommend best practices for rewriting or re-architecting queries. Continuously analyze execution plans for SQL queries to identify bottlenecks and optimize them. Database Maintenance: Schedule and execute regular maintenance tasks, including backups, consistency checks, and index rebuilding. Health Monitoring: Implement automated monitoring systems to track database performance, availability, and critical parameters such as CPU usage, memory, disk I/O, and replication status. Proactive Issue Resolution: Diagnose and resolve database issues (e.g., locking, deadlocks, data corruption) proactively, before they impact users or operations. High Availability: Implement and manage database clustering, replication, and failover strategies to ensure high availability and disaster recovery (e.g., using tools like SQL Server Always On, Oracle RAC, MySQL Group Replication). (ref:hirist.tech)

Posted 2 days ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Associate Job Description & Summary At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations Responsibilities Job Description & Summary – Associate – GenAI – Mumbai Role : Associate Exp : 3—5 Years Location: Mumbai Job Description: Candidate with 3-5 years of exp and a strong background in machine learning, technical expertise, and domain knowledge in Banking, Financial Services, and Insurance (BFSI). Experience with Generative AI (GenAI) is a must have. Key Responsibilities: Collaborate with clients to understand their business needs and provide data-driven solutions. Develop and implement machine learning models to solve complex business problems. Analyze large datasets to extract actionable insights and drive decision-making. Present findings and recommendations to stakeholders in a clear and concise manner. Stay updated with the latest trends and advancements in data science and machine learning. GenAI Experience: Generative AI (GenAI) experience, including working with models like GPT, BERT, and other transformer-based architectures Ability to leverage GenAI for tasks such as text generation, summarization, and conversational AI Experience in developing and deploying GenAI solutions to enhance business processes and customer experiences Technical Skills: Programming Languages: Proficiency in Python, R, and SQL for data manipulation, analysis, and model development. Machine Learning Frameworks: Extensive experience with TensorFlow, PyTorch, and Scikit-learn for building and deploying models. Data Visualization Tools: Strong knowledge of Tableau, Power BI, and Matplotlib to create insightful visualizations. Cloud Platforms: Expertise in AWS, Azure, and Google Cloud for scalable and efficient data solutions. Database Management: Proficiency in SQL and NoSQL databases for data storage and retrieval. Version Control: Experience with Git for collaborative development and code management. APIs and Web Services: Ability to integrate and utilize APIs for data access and model deployment. Machine Learning algorithms: Supervised and Unsupervised Learning Regression Analysis Classification Techniques Clustering Algorithms Natural Language Processing (NLP) Time Series Analysis Deep Learning Reinforcement Learning Mandatory Skill Sets GenAI Preferred Skill Sets GenAI Years Of Experience Required 3—5 Years Education Qualification B.E.(B.Tech)/M.E/M.Tech Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Engineering, Master of Business Administration, Bachelor of Technology Degrees/Field Of Study Preferred Certifications (if blank, certifications not specified) Required Skills Extract Transform Load (ETL), Microsoft Azure Optional Skills Accepting Feedback, Accepting Feedback, Active Listening, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis, Intellectual Curiosity, Java (Programming Language), Market Development {+ 11 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date

Posted 2 days ago

Apply

7.0 years

0 Lacs

Pune, Maharashtra, India

On-site

About The Company TSC Redefines Connectivity with Innovation and IntelligenceDriving the next level of intelligence powered by Cloud, Mobility, Internet of Things, Collaboration, Security, Media services and Network services, we at Tata Communications are envisaging a New World of Communications Linux Administrator L3 Engineer ( IT Operations & Infrastructure ) Employment Type On-roll Reporting Manager Direct reports Role Purpose Linux Administrator L3 Engineer - IT Operations & Infrastructure Key Responsibilities / Accountabilities We are seeking an experienced Linux Administrator Engineer (L3) to lead and manage Linux-based infrastructure across on-premises and cloud environments. This role requires expertise in advanced Linux system administration, performance tuning, security hardening, automation, high availability (HA) configurations, and troubleshooting complex issues. The ideal candidate should have deep knowledge of RHEL, CentOS, Ubuntu, SUSE, Oracle Linux, along with cloud Linux workloads (AWS, GCP, Azure, OCI), containerization (Docker, Kubernetes, OpenShift), and automation (Ansible, Terraform, Python, Bash). Major Duties & Responsibilities Linux Infrastructure Design & Management: Architect, deploy, and maintain enterprise-grade Linux environments (RHEL, CentOS, Ubuntu, SUSE, Oracle Linux). Design and implement scalable, highly available, and secure Linux-based systems. Perform advanced troubleshooting, root cause analysis (RCA), and performance tuning. Ensure system reliability, patching, and security updates for production servers. Cloud & Virtualization Administration: Optimize cloud-based Linux instances, auto-scaling, and cost management strategies. Work with VMware, KVM, Hyper-V, OpenStack for on-prem virtualization. Automation & Configuration Management: Automate Linux system administration tasks using Ansible, Terraform, Bash, Python, PowerShell. Implement Infrastructure as Code (IaC) to automate provisioning and configuration. Develop cron jobs, systemd services, and log rotation scripts. Security & Compliance: Implement Linux system hardening (CIS benchmarks, SELinux, AppArmor, PAM, SSH security). Configure firewall rules (iptables, nftables, firewalld), VPN, and access control policies. Ensure compliance with ISO 27001, PCI-DSS, HIPAA, and NIST security standards. Conduct vulnerability scanning, penetration testing, and security audits. Networking & High Availability (HA) Solutions: Configure and manage DNS, DHCP, NFS, iSCSI, SAN, CIFS, VLANs, and network bonding. Deploy Linux clusters, failover setups, and high-availability solutions (Pacemaker, Corosync, DRBD, Ceph, GlusterFS). Work with load balancing solutions (HAProxy, Nginx, F5, Cloud Load Balancers). Monitoring & Performance Optimization: Set up real-time monitoring tools (Prometheus, Grafana, Nagios, Zabbix, ELK, Site 24x7). Optimize CPU, memory, disk IO, and network performance for Linux workloads. Analyze and resolve kernel panics, memory leaks, and slow system responses. Backup & Disaster Recovery: Design and implement Linux backup & disaster recovery strategies (CommVault, Veeam, Rsync, AWS Backup, GCP Backup & DR, OCI Vaults). Perform snapshot-based recovery, failover testing, and disaster recovery planning. Collaboration & Documentation: Mentor L1 and L2 engineers, provide escalation support for critical incidents. Maintain technical documentation, SOPs, and knowledge base articles. Assist in capacity planning, forecasting, and IT infrastructure roadmaps. Required Knowledge, Skills And Abilities Expert-level knowledge of Linux OS administration, troubleshooting, and performance tuning. Strong hands-on expertise in server patching, automation, and security best practices. Deep understanding of cloud platforms (AWS, GCP, Azure, OCI) and virtualization (VMware, KVM, Hyper-V, OpenStack). Advanced networking skills in firewalls, VLANs, VPN, DNS, and routing. Proficiency in scripting (Bash, Python, Ansible, Terraform, PowerShell). Experience with high-availability architectures and clustering solutions. Strong problem-solving, analytical, and troubleshooting skills for mission-critical environments. Preferred Additional Skills And Abilities Experience with Linux-based Kubernetes clusters (EKS, AKS, GKE, OpenShift, Rancher). Understanding of CI/CD pipelines and DevOps tools (Jenkins, Git, GitLab, ArgoCD, Helm). Knowledge of big data, logging, and analytics tools (Splunk, ELK Stack, Kafka, Hadoop). Familiarity with database management on Linux (MySQL, PostgreSQL, MariaDB, MongoDB, Redis). Qualifications And Experience Following are the key skills and experience expected out of the candidate Bachelors in Communications / Computer Science OR Software Engineering OR related technical degree OR Experience 7+ years of experience in Linux administration and enterprise infrastructure. Proven track record in designing, implementing, and optimizing Linux environments. Experience with multi-cloud Linux workloads, scripting, security, and high availability. Certifications (Preferred But Not Mandatory) Red Hat Certified Engineer (RHCE) or RHCSA LPIC-3 (Linux Professional Institute Certification Level 3)

Posted 3 days ago

Apply

2.0 - 6.0 years

0 Lacs

noida, uttar pradesh

On-site

We are looking for an experienced AI/ML Architect to spearhead the design, development, and deployment of cutting-edge AI and machine learning systems. As the ideal candidate, you should possess a strong technical background in Python and data science libraries, profound expertise in AI and ML algorithms, and hands-on experience in crafting scalable AI solutions. This role demands a blend of technical acumen, leadership skills, and innovative thinking to enhance our AI capabilities. Your responsibilities will include identifying, cleaning, and summarizing complex datasets from various sources, developing Python/PySpark scripts for data processing and transformation, and applying advanced machine learning techniques like Bayesian methods and deep learning algorithms. You will design and fine-tune machine learning models, build efficient data pipelines, and leverage distributed databases and frameworks for large-scale data processing. In addition, you will lead the design and architecture of AI systems, with a focus on Retrieval-Augmented Generation (RAG) techniques and large language models. Your qualifications should encompass 5-7 years of total experience with 2-3 years in AI/ML, proficiency in Python and data science libraries, hands-on experience with PySpark scripting and AWS services, strong knowledge of Bayesian methods and time series forecasting, and expertise in machine learning algorithms and deep learning frameworks. You should also have experience in structured, unstructured, and semi-structured data, advanced knowledge of distributed databases, and familiarity with RAG systems and large language models for AI outputs. Strong collaboration, leadership, and mentorship skills are essential. Preferred qualifications include experience with Spark MLlib, SciPy, StatsModels, SAS, and R, a proven track record in developing RAG systems, and the ability to innovate and apply the latest AI techniques to real-world business challenges. Join our team at TechAhead, a global digital transformation company known for AI-first product design thinking and bespoke development solutions. With over 14 years of experience and partnerships with Fortune 500 companies, we are committed to driving digital innovation and delivering excellence. At TechAhead, you will be part of a dynamic team that values continuous learning, growth, and crafting tailored solutions for our clients. Together, let's shape the future of digital innovation worldwide!,

Posted 3 days ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

As a part of the Talent Acquisition team at Tesco, you will play a crucial role in representing Talent Acquisition in various forums and seminars related to process, compliance, and audit. Additionally, you will be responsible for driving a Continuous Improvement (CI) culture, implementing CI projects, and fostering innovation within the team. Your role will involve engaging with business and functional partners to gain a deep understanding of business priorities. You will be required to ask relevant questions and translate the insights into an analytical solution document. This document will highlight how the application of data science can enhance decision-making processes. To excel in this role, you must possess a strong understanding of techniques for preparing analytical data sets from multiple complex sources. You will be expected to develop statistical models and machine learning algorithms with a high level of competency. Furthermore, you will need to write structured, modularized, and codified algorithms using Continuous Improvement principles. In addition to building algorithms, you will create an easy-to-understand visualization layer on top of the analytical models. This visualization layer will empower end-users to make informed decisions. You will also be responsible for proactively promoting the adoption of solutions developed by the team and identifying areas for improvement within the larger Tesco business. Keeping abreast of the latest trends in data science and retail analytics is essential for this role. You will be expected to share your knowledge with colleagues and mentor a small team of Applied Data Scientists to deliver impactful analytics projects. Your responsibilities will include leading solution scoping and development to facilitate the collaboration between Enterprise Analytics teams and Business teams across Tesco. It is imperative to adhere to the Business Code of Conduct, act with integrity, and fulfill specific risk responsibilities related to Talent Acquisition, process compliance, and audit. To thrive in this role, you will need expertise in Applied Math, including Applied Statistics, Regression, Decision Trees, Forecasting, and Optimization algorithms. Proficiency in SQL, Hadoop, Spark, Python, Tableau, MS Excel, MS PowerPoint, and GitHub is also required. Additionally, having a basic understanding of the Retail domain and soft skills such as Analytical Thinking, Problem-solving, Storyboarding, and Stakeholder engagement will be beneficial. Joining Tesco's team in Bengaluru offers you the opportunity to be part of a multi-disciplinary team that aims to serve customers, communities, and the planet better each day. By standardizing processes, delivering cost savings, leveraging technological solutions, and empowering colleagues, Tesco in Bengaluru strives to create a sustainable competitive advantage. With a focus on reducing complexity and offering high-quality services, you will contribute to Tesco's mission of providing exceptional experiences for customers worldwide. Tesco Technology is a diverse team of over 5,000 experts located in various countries, including India. The Technology division encompasses roles in Engineering, Product Development, Programme Management, Service Desk Operations, Systems Engineering, Security & Capability, Data Science, and more. Established in 2004, Tesco in Bengaluru plays a vital role in enhancing customer experiences and streamlining operations for millions of customers and over 330,000 colleagues globally.,

Posted 3 days ago

Apply

3.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Key Responsibilities: Design, develop, and optimize high-performance backend services using Rust , targeting 1000+ orders per second throughput. Implement scalable architectures with load balancing for high availability and minimal latency. Integrate and optimize Redis for caching, pub/sub, and data persistence. Work with messaging services like Kafka and RabbitMQ to ensure reliable, fault-tolerant communication between microservices. Develop and manage real-time systems with WebSockets for bidirectional communication. Write clean, efficient, and well-documented code with unit and integration tests. Collaborate with DevOps for horizontal scaling and efficient resource utilization. Diagnose performance bottlenecks and apply optimizations at the code, database, and network level. Ensure system reliability, fault tolerance, and high availability under heavy loads. Required Skills & Experience: 3+ years of professional experience with Rust in production-grade systems. Strong expertise in Redis (clustering, pipelines, Lua scripting, performance tuning). Proven experience with Kafka , RabbitMQ , or similar messaging queues. Deep understanding of load balancing, horizontal scaling , and distributed architectures. Experience with real-time data streaming and WebSocket implementations. Knowledge of system-level optimizations, memory management, and concurrency in Rust. Familiarity with high-throughput, low-latency systems and profiling tools. Understanding of cloud-native architectures (AWS, GCP, or Azure) and containerization (Docker/Kubernetes). Preferred Qualifications: Experience with microservices architecture and service discovery . Knowledge of monitoring & logging tools (Prometheus, Grafana, ELK). Exposure to CI/CD pipelines for Rust-based projects. Experience in security and fault-tolerant design for financial or trading platforms (nice to have).

Posted 3 days ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Title: Senior / Lead Data Scientist Locations: Hyderabad Notice Period : Immediate to 15 Days Skills : Data Science, Python, Regression, Classification, Google Cloud Platform Domain : CPG Who we are Tiger Analytics is a global leader in AI and analytics, helping Fortune 1000 companies solve their toughest challenges. We offer full-stack AI and analytics services & solutions to empower businesses to achieve real outcomes and value at scale. We are on a mission to push the boundaries of what AI and analytics can do to help enterprises navigate uncertainty and move forward decisively. Our purpose is to provide certainty to shape a better tomorrow. Our team of 4000+ technologists and consultants are based in the US, Canada, the UK, India, Singapore and Australia, working closely with clients across CPG, Retail, Insurance, BFS, Manufacturing, Life Sciences, and Healthcare. Many of our team leaders rank in Top 10 and 40 Under 40 lists, exemplifying our dedication to innovation and excellence. We are a Great Place to Work-Certified™ (2022-24), recognized by analyst firms such as Forrester, Gartner, HFS, Everest, ISG and others. We have been ranked among the ‘Best’ and ‘Fastest Growing’ analytics firms lists by Inc., Financial Times, Economic Times and Analytics India Magazine Curious about the role? What your typical day would look like? As a Senior Data Scientist, your work is a combination of hands-on contribution to Loreum Ipsum, Loreum Ipsum, etc. More specifically, this will involve: Lead and contribute to developing sophisticated machine learning models, predictive analytics, and statistical analyses to solve complex business problems. Demonstrate proficiency in programming languages such as Python or R, with the ability to write clean, efficient, and maintainable code. Experience with relevant libraries and frameworks (e.g., TensorFlow, PyTorch, scikit-learn) is essential. Use your robust problem-solving skills to develop data-driven solutions, analyse complex datasets, and derive actionable insights that lead to impactful outcomes. Work closely with clients to understand their business objectives, identify opportunities for analytics-driven solutions, and communicate findings clearly and promptly. Take ownership of end-to-end model development, from problem definition and data exploration to model training, validation, and deployment. Collaborate with cross-functional teams, including data engineers, software developers, and business stakeholders, to integrate analytics solutions into business processes. Leverage a profound understanding of mathematical and statistical principles to guide developing and validating advanced data science models. Stay abreast of industry trends, emerging technologies, and best practices in data science, bringing innovative ideas to the team and contributing to continuous improvement. What do we expect? 6-10 years of total DS and model development experience A passion for writing high-quality code (Python), and the code should be modular, scalable, and end-end project execution while planning an active hands-on role Having good problem-solving skills is essential, and it is equally important to have in-depth knowledge to solve complex problems effectively. Comprehensive knowledge of the regression and classification concepts and mathematical backend along with SQL Idea about other machine learning techniques (clustering, regression, ensemble learning, neural nets, time series, optimizations etc.) to their real-world problems Encourage collaboration with various stakeholders and take complete ownership of deliverables. Adept understanding of various data science approaches, machine learning algorithms, and statistical methods. Excellent communication skills with presentability, articulation, storytelling capability, and ability to manage complex client situations Effective mentoring of a team with expertise in industry/domain/functional areas. You are important to us, let’s stay connected! Every individual comes with a different set of skills and qualities so even if you don’t tick all the boxes for the role today, we urge you to apply as there might be a suitable/unique role for you tomorrow. We are an equal opportunity employer. Our diverse and inclusive culture and values guide us to listen, trust, respect, and encourage people to grow the way they desire. Note: The designation will be commensurate with expertise and experience. Compensation packages are among the best in the industry. Additional Benefits: Health insurance (self & family), virtual wellness platform, and knowledge communities.

Posted 3 days ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

AI-Driven SEO Skills The core requirements for the job include the following: Prompt Engineering: Crafting prompts for tools like ChatGPT to generate SEO content ideas, meta descriptions, alt texts, FAQs, etc. SEO Automation Surfer SEO, Clearscope, and Neuron Writer for content optimization. ChatGPT + Zapier for automating reports or content workflows. Python + GPT API to bulk-generate metadata or keyword clusters. AI-Assisted Content Strategy Keyword clustering at scale using tools like Keyword Insights, Low Fruits, or Frase. Programmatic SEO (e. g., generating 100s of location/product pages with templates + AI copy). Data Analysis And Visualization Extracting insights from large keyword datasets using Python, Excel, and Looker Studio Understanding algorithm changes and adapting strategy dynamically Core SEO Expertise (Foundational) Technical SEO: Crawlability, indexing, sitemaps Core Web Vitals and page speed optimization Structured data/schema markup On-Page Optimization Keyword strategy and mapping Content structure, metadata, internal linking Off-Page SEO Link-building strategy Brand reputation and E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness) Analytics And Reporting GA4 GSC, Ahrefs, Semrush, Screaming Frog KPI tracking (CTR, rankings, conversions) Strategic And Soft Skills Growth Mindset and Experimentation: A/B testing landing pages, content formats, and CTAs. Cross-Functional Collaboration: Working with devs for technical fixes, content writers for AI-assisted content, and product teams for feature pages. Content and UX Sensitivity: Balancing AI-generated content with human review. Aligning SEO with actual user intent and experience. Tools Content Optimization: Surfer SEO, Clearscope, NeuronWriter. Keyword Clustering: Keyword Insights, ChatGPT + Python. Automation: Zapier, Make, ChatGPT API, Sheet tools. Analytics: GA4 GSC, Looker Studio, Screaming Frog. Content Generation: ChatGPT, Jasper, Writesonic, Koala AI. This job was posted by Mamta Naagar from Rupeezy.

Posted 3 days ago

Apply

4.0 - 7.0 years

9 - 15 Lacs

Pune

Work from Office

Performing scheduled maintenance and support release deployment activities and upgrades. Writing database documentation, including data standards, procedures and definitions for the data dictionaries Query tuning and process optimisation Required Candidate profile Strong written and spoken English. Knowledge of data backup, recovery, security, integrity & T-SQL Familiarity with database design,documentation and coding. Ability to identify opportunities & issues

Posted 3 days ago

Apply

4.0 - 7.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Company Description The Smart Cube, a WNS company, is a trusted partner for high performing intelligence that answers critical business questions. And we work with our clients to figure out how to implement the answers, faster. Job Description Roles and ResponsibilitiesAssistant Managers are expected to understand client objectives and collaborate with the Project Lead to design appropriate analytical solutions. They should be able to translate business goals into structured deliverables with defined priorities and constraints. The role involves managing, organizing, and preparing data, conducting quality checks, and ensuring readiness for analysis.They should be proficient in applying statistical and machine learning techniques such as regression (linear/non-linear), decision trees, segmentation, time series forecasting, and algorithms like Random Forest, SVM, and ANN. Sanity checks and rigorous self-QC of all outputs, including work from junior analysts, are essential to ensure accuracy.Interpretation of results in the context of the client’s industry is necessary to generate meaningful insights. Assistant Managers should be comfortable handling client calls independently and coordinating regularly with onsite leads when applicable. They should be able to discuss specific deliverables or queries over calls or video conferences.They must manage projects from initiation through closure, ensuring timely and within-budget delivery. This includes collaborating with stakeholders to refine business needs and convert them into technical specifications, managing data teams, conducting performance evaluations, and ensuring high data quality. Effective communication between technical and business stakeholders is key to aligning expectations. Continuous improvement of analytics processes and methodologies is encouraged. The role also involves leading cross-functional teams and overseeing project timelines and deliverables.Client ManagementAssistant Managers will act as the primary point of contact for clients, maintaining strong relationships and making key decisions independently. They will participate in discussions on deliverables and guide project teams on next steps and solution approaches.Technical RequirementsCandidates must have hands-on experience connecting databases with Knime (e.g., Snowflake, SQL DB) and working with SQL concepts such as joins and unions. They should be able to read from and write to databases, utilize macros to automate tasks, and enable schedulers to run workflows. The ability to design and build ETL workflows and datasets in Knime for BI reporting tools is crucial. They must perform end-to-end data validation and maintain documentation supporting BI reports.They should be experienced in developing interactive dashboards and reports using PowerBI and leading analytics projects using PowerBI, Python, and SQL. Presenting insights clearly through PowerPoint or BI dashboards (e.g., Tableau, Qlikview) is also expected.Ideal CandidateThe ideal candidate will have 4 to 7 years of relevant experience in advanced analytics for Marketing, CRM, or Pricing within Retail or CPG; other B2C sectors may also be considered. Experience in managing and analyzing large datasets using Python, R, or SAS is required, along with the use of multiple analytics and machine learning techniques.They should be able to manage client communications independently and understand consumer-facing industries such as Retail, CPG, or Telecom. Familiarity with handling various data formats (flat files, RDBMS) and platforms (Knime, SQL Server, Teradata, Hadoop, Spark) in both on-premise and cloud environments is expected. A solid foundation in advanced statistical techniques such as regressions, decision trees, clustering, forecasting (ARIMA/X), and machine learning is essential.Other SkillsStrong verbal and written communication is a must. The candidate should be able to deliver client-ready outputs using Excel and PowerPoint. Knowledge of optimization techniques (linear/non-linear), supply chain concepts, VBA, Excel Macros, Tableau, and Qlikview is a plus. Qualifications Engineers from top tier institutes (IITs, DCE/NSIT, NITs) or Post Graduates in Maths/Statistics/OR from top Tier Colleges/UniversitiesMBA from top tier B-schools

Posted 3 days ago

Apply

7.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Company Description The Smart Cube, a WNS company, is a trusted partner for high performing intelligence that answers critical business questions. And we work with our clients to figure out how to implement the answers, faster. Job Description Roles and ResponsibilitiesAssistant Managers must understand client objectives and collaborate with the Project Lead to design effective analytical frameworks. They should translate requirements into clear deliverables with defined priorities and constraints. Responsibilities include managing data preparation, performing quality checks, and ensuring analysis readiness. They should implement analytical techniques and machine learning methods such as regression, decision trees, segmentation, forecasting, and algorithms like Random Forest, SVM, and ANN.They are expected to perform sanity checks and quality control of their own work as well as that of junior analysts to ensure accuracy. The ability to interpret results in a business context and identify actionable insights is critical. Assistant Managers should handle client communications independently and interact with onsite leads, discussing deliverables and addressing queries over calls or video conferences.They are responsible for managing the entire project lifecycle from initiation to delivery, ensuring timelines and budgets are met. This includes translating business requirements into technical specifications, managing data teams, ensuring data integrity, and facilitating clear communication between business and technical stakeholders. They should lead process improvements in analytics and act as project leads for cross-functional coordination.Client ManagementThey serve as client leads, maintaining strong relationships and making key decisions. They participate in deliverable discussions and guide project teams on next steps and execution strategy.Technical RequirementsAssistant Managers must know how to connect databases with Knime (e.g., Snowflake, SQL) and understand SQL concepts such as joins and unions. They should be able to read/write data to and from databases and use macros and schedulers to automate workflows. They must design and manage Knime ETL workflows to support BI tools and ensure end-to-end data validation and documentation.Proficiency in PowerBI is required for building dashboards and supporting data-driven decision-making. They must be capable of leading analytics projects using PowerBI, Python, and SQL to generate insights. Visualizing key findings using PowerPoint or BI tools like Tableau or Qlikview is essential.Ideal CandidateCandidates should have 4–7 years of experience in advanced analytics across Marketing, CRM, or Pricing in Retail or CPG. Experience in other B2C domains is acceptable. They must be skilled in handling large datasets using Python, R, or SAS and have worked with multiple analytics or machine learning techniques. Comfort with client interactions and working independently is expected, along with a good understanding of consumer sectors such as Retail, CPG, or Telecom.They should have experience with various data formats and platforms including flat files, RDBMS, Knime workflows and server, SQL Server, Teradata, Hadoop, and Spark—on-prem or in the cloud. Basic knowledge of statistical and machine learning techniques like regression, clustering, decision trees, forecasting (e.g., ARIMA), and other ML models is required.Other SkillsStrong written and verbal communication is essential. They should be capable of creating client-ready deliverables using Excel and PowerPoint. Knowledge of optimization methods, supply chain concepts, VBA, Excel Macros, Tableau, and Qlikview will be an added advantage. Qualifications Engineers from top tier institutes (IITs, DCE/NSIT, NITs) or Post Graduates in Maths/Statistics/OR from top Tier Colleges/UniversitiesMBA from top tier B-schools

Posted 3 days ago

Apply

5.0 years

0 Lacs

Greater Bengaluru Area

On-site

Role : Data Scientist Experience : 5 years to 8 years Location : Hyderabad, Pune, Bangalore Job Description: Mandatory Skills :Machine Learning, Python Programming Tools Python Pandas NumPy Matplotlib Seaborn Scikit learn SQL complex queries joins CTEs window functions Git Data Analysis Visualization Exploratory Data Analysis EDA trend and correlation analysis hypothesis testing dashboards Power BI Tableau storytelling with data Databases Data Handling SQL Server MySQL basic knowledge of Spark BigQuery Hive is a plus Statistics ML Foundational Descriptive and inferential statistics regression classification clustering KMeans model evaluation metrics accuracy precision recall AUC over fitting under fitting cross validation LLMAI Exposure Familiarity with prompt engineering OpenAI APIs basic usage of GPT models for data text automation awareness of LLM limitations and applications in analytics Soft Skills Strong problem solving ability attention to detail stakeholder communication ability to translate business problems into data solutions Experience Highlights Conducted in depth data analysis to uncover trends patterns and anomalies that informed strategic decisions across product marketing and operations teams Designed and implemented scalable data pipelines and automated data workflows using Python and SQL Developed and maintained analytical models and dashboards to track key business metrics and performance indicators Applied statistical methods and machine learning techniques to solve real world business problems such as forecasting segmentation and performance optimization Collaborated with stakeholders to gather requirements translate business questions into analytical approaches and communicate findings with clarity Explored the use of LLMs eg OpenAI GPT for enhancing internal workflows and accelerating data driven tasks such as querying summarization and content generation

Posted 3 days ago

Apply
cta

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies