Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
3.0 - 8.0 years
5 - 10 Lacs
Gurugram
Work from Office
Project Role: Database Administrator
Project Role Description: Administer, develop, test, or demonstrate databases. Perform many related database functions across one or more teams or clients, including designing, implementing and maintaining new databases, backup/recovery and configuration management. Install database management systems (DBMS) and provide input for modification of procedures and documentation used for problem resolution and day-to-day maintenance.
Must have skills: Microsoft SQL Server Administration
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary:
Roles and Responsibilities:
1. As a Database Administrator, you will administer, develop, test, or demonstrate databases.
2. You will perform many related database functions across one or more teams or clients, including designing, implementing and maintaining new databases, backup/recovery and configuration management.
3. Your typical day will involve installing database management systems (DBMS) and providing input for modification of procedures and documentation used for problem resolution and day-to-day maintenance.
4. Always On availability groups, clustering, mirroring, performance tuning, backup and recovery.
Professional & Technical Skills:
1. Failover Cluster, Availability Group and Replication; patching, upgrades, and migration of SQL Server instances.
2. Backup and database recovery processes.
3. SQL Server best practices, performance optimization and tuning.
4. Adaptable/flexible, open for shifts and 24x7 support.
5. Open to working night shifts, able to work independently when required, flexible on week offs.
Additional Information:
1. The candidate should have a minimum of 5 to 8 years of experience in Microsoft SQL Database Administration.
2. This position is based at our Gurugram office.
3. 15 years full time education is required.
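The backup/recovery duties listed above typically revolve around scheduled BACKUP and RESTORE statements. As a minimal illustrative sketch (the database name and path layout are hypothetical, not from the posting), here is how such T-SQL commands might be composed programmatically before being handed to a scheduler:

```python
def full_backup_cmd(db: str, backup_dir: str) -> str:
    # Compose a T-SQL full-backup command (hypothetical file layout).
    return (f"BACKUP DATABASE [{db}] "
            f"TO DISK = N'{backup_dir}/{db}_full.bak' "
            f"WITH CHECKSUM, COMPRESSION, STATS = 10;")

def restore_cmd(db: str, backup_dir: str) -> str:
    # Compose the matching T-SQL restore command.
    return (f"RESTORE DATABASE [{db}] "
            f"FROM DISK = N'{backup_dir}/{db}_full.bak' "
            f"WITH CHECKSUM, RECOVERY;")

print(full_backup_cmd("Sales", "/var/backups"))
```

In practice a DBA would run these through SQL Server Agent jobs rather than ad-hoc scripts; the sketch only shows the shape of the commands being scheduled.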
Posted 2 weeks ago
3.0 - 5.0 years
5 - 7 Lacs
Bengaluru
Work from Office
The SQL DBA role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the SQL DBA domain.
Posted 2 weeks ago
4.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
JOB_POSTING-3-70798-2

Job Description
Role Title: AVP, Senior Product Engineer (L10)

Company Overview:
Synchrony (NYSE: SYF) is a premier consumer financial services company delivering one of the industry’s most complete digitally enabled product suites. Our experience, expertise and scale encompass a broad spectrum of industries including digital, health and wellness, retail, telecommunications, home, auto, outdoors, pet and more. We have recently been ranked #2 among India’s Best Companies to Work for by Great Place to Work. We were among the Top 50 India’s Best Workplaces in Building a Culture of Innovation by All by GPTW and Top 25 among Best Workplaces in BFSI by GPTW. We have also been recognized by AmbitionBox Employee Choice Awards among the Top 20 Mid-Sized Companies, ranked #3 among Top Rated Companies for Women, and Top-Rated Financial Services Companies. Synchrony celebrates ~51% women diversity, 105+ people with disabilities, and ~50 veterans and veteran family members. We offer Flexibility and Choice for all employees and provide best-in-class employee benefits and programs that cater to work-life integration and overall well-being. We provide career advancement and upskilling opportunities, focusing on Advancing Diverse Talent to take up leadership roles.

Organizational Overview:
This role will be part of the Data Architecture & Analytics group in the CTO organization. The Data team is responsible for designing and developing scalable data pipelines for efficient data ingestion, transformation, and loading (ETL). The team owns and manages the tools and platforms that provide an environment for designing and building data solutions, collaborates with cross-functional teams to integrate new data sources and ensure data quality and consistency, and builds and maintains data models to facilitate data access and analysis by Data Scientists and Analysts.

Role Summary/Purpose:
We are looking for a strong individual contributor and platform administrator who will build and manage the Neo4j platform for scanning data sources across on-prem environments. The engineer will work cross-functionally with operations, other data engineers and the product owner to assure that the capabilities delivered meet business needs.

Key Responsibilities
- Experience with Neo4j HA architecture for critical applications (clustering, multiple data centers, etc.) with LDAP-based authentication and authorization for Neo4j.
- Expertise in graph-driven data science applications, such as graph-based feature engineering, graph embeddings, or graph neural networks.
- Experience with encryption solutions for Neo4j.
- Experience in high availability and disaster recovery solutions and deployment of new technology.
- Superior decision-making, client relationship, and vendor management skills.

Required Skills/Knowledge
- Experience in Neo4j product administration and development.
- Basic understanding of the Linux OS (file systems, environment, users and groups).
- Basic understanding of firewalls and ports, and how to check connectivity between two environments.
- Exposure to public cloud ecosystems (AWS, Azure and GCP) and their components.
- Understanding of DevOps pipelines.
- Exposure to operations tasks such as job scheduling, monitoring, platform health checks, automation, etc.
- Understanding of the SAFe methodology/working in an Agile environment.

Desired Skills/Knowledge
- Experience with installation and configuration of Bloom and GDS software.
- Hands-on experience with cloud services such as S3, Redshift, etc.
- Extensive experience with deploying and managing applications running on Kubernetes (experience with administering Kubernetes clusters).
- Experience deploying and working with observability systems such as Prometheus, Grafana, New Relic, and Splunk logging.

Eligibility Criteria
- Bachelor's degree in Computer Science with a minimum of 4+ years of relevant technology experience, or in lieu of a degree, 6+ years of relevant technology experience.
- Minimum 5+ years of financial services experience.
- Minimum 5+ years of experience managing data platforms.
- Hands-on experience with cloud platforms such as S3, Redshift, etc.
- Ability to develop and maintain strong collaborative relationships at all levels across IT and the business.
- Excellent written and oral communication skills, along with a strong ability to lead and influence others.
- Experience working iteratively in a fast-paced agile environment.
- Demonstrated competency linking business strategy with IT technology initiatives.
- Proven track record of leading and executing on critical business initiatives on time and within budget.
- Demonstrated ability to drive change and work effectively across business and geographical boundaries.
- Expertise in evaluating technology and solution engineering, with a strong focus on architecture and deployment of new technology.
- Superior decision-making, client relationship, and vendor management skills.

Work Timings: 3 PM - 12 AM IST
(This role qualifies for Enhanced Flexibility and Choice offered in Synchrony India and will require the incumbent to be available between 06:00 AM Eastern Time and 11:30 AM Eastern Time (timings are anchored to US Eastern hours and will adjust twice a year locally). This window is for meetings with India and US teams. The remaining hours will be flexible for the employee to choose. Exceptions may apply periodically due to business needs. Please discuss this with the hiring manager for more details.)

For Internal Applicants
- Understand the criteria or mandatory skills required for the role before applying.
- Inform your manager and HRM before applying for any role on Workday.
- Ensure that your professional profile is updated (fields such as education, prior experience, other skills); it is mandatory to upload your updated resume (Word or PDF format).
- Must not be on any corrective action plan (First Formal/Final Formal, PIP).
- L8+ employees who have completed 18 months in the organization and 12 months in their current role and level are eligible.
- L08+ employees can apply.

Grade/Level: 10
Job Family Group: Information Technology
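The required skills above include checking connectivity between two environments by firewall and port. A minimal sketch of that check (host, port, and timeout values here are illustrative, not from the posting):

```python
import socket

def can_reach(host: str, port: int, timeout: float = 2.0) -> bool:
    # Return True if a TCP connection to host:port succeeds within the timeout.
    # A False result can mean a closed port, a firewall drop, or an unresolvable host.
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

An administrator might call `can_reach("neo4j-host", 7687)` to confirm the Bolt port is open from an application server before digging into driver-level errors.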
Posted 2 weeks ago
5.0 - 8.0 years
22 - 27 Lacs
Noida, Gurugram, Delhi / NCR
Work from Office
Key Responsibilities

Analytics (Data & Insights):
- Analyze category and activity performance using advanced analytics methods.
- Deliver actionable insights using transactional, financial, and customer data.
- Design and measure experiments and pilots.
- Lead projects in clustering, forecasting, and causal impact analysis.
- Build intuitive dashboards to represent insights effectively.

Operational Excellence:
- Improve data quality with automation tools.
- Develop analytical solutions and dashboards using user-centric design.
- Benchmark against industry standards and enhance business performance.

Stakeholder Management:
- Collaborate with consultants, engineers, and cross-functional teams.
- Communicate complex insights in an understandable format.
- Create clear documentation linking business needs to data solutions.
- Promote a data-driven culture and share learnings across teams.

Qualifications & Experience
- Education: Bachelor's degree in Finance, Mathematics, Statistics, Engineering, or a related analytical discipline.
- Experience: 7+ years in a data analytics or quantitative role.
- Experience with Python, SQL, Spark, and handling large data volumes.
- Experience leading projects or small teams.
- Strong communication skills (English).

Behavioral Competencies
- Delivery excellence
- Innovation and agility
- Business acumen
- Social intelligence

Technical Knowledge & Tools
- Retail, Supply Chain, Marketing, Customer Analytics
- Statistical Modeling & Time Series Analysis
- Python, PySpark, R
- MySQL, Microsoft SQL Server
- Power BI
- Azure, AWS, GCP
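The clustering projects mentioned above usually mean grouping customers or categories by behavior. As a toy sketch of the idea (one-dimensional k-means on invented spend figures; real work would use scikit-learn or Spark MLlib on many features):

```python
import random

def kmeans_1d(points, k, iters=20, seed=0):
    # Tiny 1-D k-means: returns (centroids, labels). Illustrative, not production.
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid.
        labels = [min(range(k), key=lambda c: abs(p - centroids[c])) for p in points]
        # Update step: each centroid moves to the mean of its members.
        for c in range(k):
            members = [p for p, lab in zip(points, labels) if lab == c]
            if members:
                centroids[c] = sum(members) / len(members)
    return centroids, labels

spend = [1.0, 1.2, 0.9, 10.0, 10.5, 9.8]   # two obvious customer-spend groups
centroids, labels = kmeans_1d(spend, k=2)
```

On this toy data the algorithm separates the low-spend and high-spend groups regardless of which points seed the centroids.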
Posted 2 weeks ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
JOB_POSTING-3-70798-5

Job Description
Role Title: AVP, Senior Product Engineer (L10)

Company Overview:
Synchrony (NYSE: SYF) is a premier consumer financial services company delivering one of the industry’s most complete digitally enabled product suites. Our experience, expertise and scale encompass a broad spectrum of industries including digital, health and wellness, retail, telecommunications, home, auto, outdoors, pet and more. We have recently been ranked #2 among India’s Best Companies to Work for by Great Place to Work. We were among the Top 50 India’s Best Workplaces in Building a Culture of Innovation by All by GPTW and Top 25 among Best Workplaces in BFSI by GPTW. We have also been recognized by AmbitionBox Employee Choice Awards among the Top 20 Mid-Sized Companies, ranked #3 among Top Rated Companies for Women, and Top-Rated Financial Services Companies. Synchrony celebrates ~51% women diversity, 105+ people with disabilities, and ~50 veterans and veteran family members. We offer Flexibility and Choice for all employees and provide best-in-class employee benefits and programs that cater to work-life integration and overall well-being. We provide career advancement and upskilling opportunities, focusing on Advancing Diverse Talent to take up leadership roles.

Organizational Overview:
This role will be part of the Data Architecture & Analytics group in the CTO organization. The Data team is responsible for designing and developing scalable data pipelines for efficient data ingestion, transformation, and loading (ETL). The team owns and manages the tools and platforms that provide an environment for designing and building data solutions, collaborates with cross-functional teams to integrate new data sources and ensure data quality and consistency, and builds and maintains data models to facilitate data access and analysis by Data Scientists and Analysts.

Role Summary/Purpose:
We are looking for a strong individual contributor and platform administrator who will build and manage the Neo4j platform for scanning data sources across on-prem environments. The engineer will work cross-functionally with operations, other data engineers and the product owner to assure that the capabilities delivered meet business needs.

Key Responsibilities
- Experience with Neo4j HA architecture for critical applications (clustering, multiple data centers, etc.) with LDAP-based authentication and authorization for Neo4j.
- Expertise in graph-driven data science applications, such as graph-based feature engineering, graph embeddings, or graph neural networks.
- Experience with encryption solutions for Neo4j.
- Experience in high availability and disaster recovery solutions and deployment of new technology.
- Superior decision-making, client relationship, and vendor management skills.

Required Skills/Knowledge
- Experience in Neo4j product administration and development.
- Basic understanding of the Linux OS (file systems, environment, users and groups).
- Basic understanding of firewalls and ports, and how to check connectivity between two environments.
- Exposure to public cloud ecosystems (AWS, Azure and GCP) and their components.
- Understanding of DevOps pipelines.
- Exposure to operations tasks such as job scheduling, monitoring, platform health checks, automation, etc.
- Understanding of the SAFe methodology/working in an Agile environment.

Desired Skills/Knowledge
- Experience with installation and configuration of Bloom and GDS software.
- Hands-on experience with cloud services such as S3, Redshift, etc.
- Extensive experience with deploying and managing applications running on Kubernetes (experience with administering Kubernetes clusters).
- Experience deploying and working with observability systems such as Prometheus, Grafana, New Relic, and Splunk logging.

Eligibility Criteria
- Bachelor's degree in Computer Science with a minimum of 4+ years of relevant technology experience, or in lieu of a degree, 6+ years of relevant technology experience.
- Minimum 5+ years of financial services experience.
- Minimum 5+ years of experience managing data platforms.
- Hands-on experience with cloud platforms such as S3, Redshift, etc.
- Ability to develop and maintain strong collaborative relationships at all levels across IT and the business.
- Excellent written and oral communication skills, along with a strong ability to lead and influence others.
- Experience working iteratively in a fast-paced agile environment.
- Demonstrated competency linking business strategy with IT technology initiatives.
- Proven track record of leading and executing on critical business initiatives on time and within budget.
- Demonstrated ability to drive change and work effectively across business and geographical boundaries.
- Expertise in evaluating technology and solution engineering, with a strong focus on architecture and deployment of new technology.
- Superior decision-making, client relationship, and vendor management skills.

Work Timings: 3 PM - 12 AM IST
(This role qualifies for Enhanced Flexibility and Choice offered in Synchrony India and will require the incumbent to be available between 06:00 AM Eastern Time and 11:30 AM Eastern Time (timings are anchored to US Eastern hours and will adjust twice a year locally). This window is for meetings with India and US teams. The remaining hours will be flexible for the employee to choose. Exceptions may apply periodically due to business needs. Please discuss this with the hiring manager for more details.)

For Internal Applicants
- Understand the criteria or mandatory skills required for the role before applying.
- Inform your manager and HRM before applying for any role on Workday.
- Ensure that your professional profile is updated (fields such as education, prior experience, other skills); it is mandatory to upload your updated resume (Word or PDF format).
- Must not be on any corrective action plan (First Formal/Final Formal, PIP).
- L8+ employees who have completed 18 months in the organization and 12 months in their current role and level are eligible.
- L08+ employees can apply.

Grade/Level: 10
Job Family Group: Information Technology
Posted 2 weeks ago
3.0 - 5.0 years
5 - 7 Lacs
Kochi
Work from Office
The Commvault Backup & Recovery role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Commvault Backup & Recovery domain.
Posted 2 weeks ago
4.0 - 5.0 years
0 Lacs
Vadodara, Gujarat, India
On-site
Job Description
We are seeking an experienced AI Engineer with 4-5 years of hands-on experience in designing and implementing AI solutions. The ideal candidate should have a strong foundation in developing AI/ML-based solutions, including expertise in Computer Vision (OpenCV). Additionally, proficiency in developing, fine-tuning, and deploying Large Language Models (LLMs) is essential. As an AI Engineer, the candidate will work on cutting-edge AI applications, using LLMs like GPT, LLaMA, or custom fine-tuned models to build intelligent, scalable, and impactful solutions, and will collaborate closely with Product, Data Science, and Engineering teams to define, develop, and optimize AI/ML models for real-world business applications.

Key Responsibilities
- Research, design, and develop AI/ML solutions for real-world business applications; RAG experience is a must.
- Collaborate with Product & Data Science teams to define core AI/ML platform features.
- Analyze business requirements and identify pre-trained models that align with use cases.
- Work with multi-agent AI frameworks like LangChain, LangGraph, and LlamaIndex.
- Train and fine-tune LLMs (GPT, LLaMA, Gemini, etc.) for domain-specific tasks.
- Implement Retrieval-Augmented Generation (RAG) workflows and optimize LLM inference.
- Develop NLP-based GenAI applications, including chatbots, document automation, and AI agents.
- Preprocess, clean, and analyze large datasets to train and improve AI models.
- Optimize LLM inference speed, memory efficiency, and resource utilization.
- Deploy AI models in cloud environments (AWS, Azure, GCP) or on-premises infrastructure.
- Develop APIs, pipelines, and frameworks for integrating AI solutions into products.
- Conduct performance evaluations and fine-tune models for accuracy, latency, and scalability.
- Stay updated with advancements in AI, ML, and GenAI technologies.

Required Skills & Experience
- AI & Machine Learning: Strong experience in developing & deploying AI/ML models.
- Generative AI & LLMs: Expertise in LLM pretraining, fine-tuning, and optimization.
- NLP & Computer Vision: Hands-on experience in NLP, Transformers, OpenCV, YOLO, R-CNN.
- AI Agents & Multi-Agent Frameworks: Experience with LangChain, LangGraph, LlamaIndex.
- Deep Learning & Frameworks: Proficiency in TensorFlow, PyTorch, Keras.
- Cloud & Infrastructure: Strong knowledge of AWS, Azure, or GCP for AI deployment.
- Model Optimization: Experience in LLM inference optimization for speed & memory efficiency.
- Programming & Development: Proficiency in Python and experience in API development.
- Statistical & ML Techniques: Knowledge of Regression, Classification, Clustering, SVMs, Decision Trees, Neural Networks.
- Debugging & Performance Tuning: Strong skills in unit testing, debugging, and model evaluation.
- Hands-on experience with Vector Databases (FAISS, ChromaDB, Weaviate, Pinecone).

Good To Have
- Experience with multi-modal AI (text, image, video, speech processing).
- Familiarity with containerization (Docker, Kubernetes) and model serving (FastAPI, Flask, Triton).
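The RAG workflows this role centers on start with a retrieval step: rank stored passages by similarity to the query, then feed the best match to the LLM as context. A minimal sketch of that step (bag-of-words cosine similarity over invented documents; production systems use dense embeddings and a vector database like FAISS or Pinecone):

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy "embedding": bag-of-words counts (real RAG uses dense vectors).
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list, k: int = 1) -> list:
    # Retrieval step of RAG: rank documents by similarity to the query.
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

docs = [
    "refund policy: refunds are issued within 14 days",
    "shipping times vary by region and carrier",
]
top = retrieve("how do I get a refund", docs)
# The retrieved passage would then be prepended to the LLM prompt as context.
```

Swapping `embed` for a real embedding model and `docs` for a vector store turns this skeleton into the augmentation half of a RAG pipeline.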
Posted 2 weeks ago
2.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Responsibilities

Job Description
Perform standard advanced data analyses with minimal supervision, and/or complex projects with some senior supervision. Examples may include, but are not limited to: conjoint analysis (CBC, ACBC, etc.), segmentation, different types of regression, factor analysis, and other group-specific advanced analyses.
- Coordinate with internal teams on analytics requirements, project flow and timely execution.
- Provide and explain the output of statistical analysis to internal teams in simple, non-technical language.
- Work on development and implementation of new statistical models and improvement of existing models.

Knowledge and Skills Requirements
- Strong conceptual understanding in the domain of analytics. Additional experience in the market research industry is an advantage.
- Thorough knowledge of various statistical packages (must: SPSS, R/Python, Sawtooth; optional, seen as an advantage: Latent GOLD, SAS, etc.).
- Complete understanding of traditional multivariate techniques used in market research and commonly used machine learning techniques (neural networks, boosting, splines, SVM, Bayesian analysis, etc.).
- Experience working on segmentation and pricing studies, predictive analytics/propensity-based modelling, and text analytics.
- Understanding of database management systems (DBMS) and hands-on work with large databases would be an added advantage.

Qualifications
- Post-graduation in Management/Business Analytics/Statistics. Graduation in Science, Statistics or Engineering a plus.
- 2 to 4 years of specialized experience in analytics, either in a market research organization, a business consultancy, or client side.
- Hands-on experience with specific analytics techniques like multivariate analysis, factor analysis, regression, conjoint analysis, or clustering.
- Mandatory skills: R, Python, SPSS, and similar analytics/visualization software.
- Experience on projects related to translating data into actionable business insights (e.g., segmentation, customer profiling, driver analysis, etc.).

Additional Information

Our Benefits
- Flexible working environment
- Volunteer time off
- LinkedIn Learning
- Employee Assistance Program (EAP)

About NIQ
NIQ is the world’s leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth. In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights, delivered with advanced analytics through state-of-the-art platforms, NIQ delivers the Full View™. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world’s population. For more information, visit NIQ.com.

Want to keep up with our latest updates? Follow us on: LinkedIn | Instagram | Twitter | Facebook

Our Commitment to Diversity, Equity, and Inclusion
NIQ is committed to reflecting the diversity of the clients, communities, and markets we measure within our own workforce. We exist to count everyone and are on a mission to systematically embed inclusion and diversity into all aspects of our workforce, measurement, and products. We enthusiastically invite candidates who share that mission to join us. We are proud to be an Equal Opportunity/Affirmative Action Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status or any other protected class. Our global non-discrimination policy covers these protected classes in every market in which we do business worldwide. Learn more about how we are driving diversity and inclusion in everything we do by visiting the NIQ News Center: https://nielseniq.com/global/en/news-center/diversity-inclusion
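The driver-analysis and regression work this role calls for reduces, in its simplest form, to fitting a line relating an outcome to a driver. A toy sketch with invented numbers (real studies would use R, SPSS, or statsmodels with many drivers at once):

```python
def ols_simple(xs, ys):
    # One-variable ordinary least squares: returns (intercept, slope).
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return my - slope * mx, slope

# Toy driver analysis: satisfaction score vs. delivery-speed rating.
xs = [1, 2, 3, 4, 5]
ys = [3, 5, 7, 9, 11]          # exactly y = 2x + 1 for illustration
intercept, slope = ols_simple(xs, ys)
```

The slope is the "driver weight" an analyst would report: here each extra point of delivery-speed rating is associated with two extra points of satisfaction.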
Posted 2 weeks ago
2.0 - 5.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Project Role: Modernization Engineer
Project Role Description: Build and test agile and cost-effective hosting solutions. Implement scalable, high-performance hosting solutions that meet the needs of today's corporate and digital applications using both private and public cloud technologies. Develop and deliver legacy infrastructure transformation and migration to drive next-generation business outcomes.
Must have skills: Python (Programming Language)
Good to have skills: PostgreSQL
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As a Modernization Engineer, you will engage in the development and testing of agile and cost-effective hosting solutions. Your typical day will involve collaborating with various teams to implement scalable and high-performance hosting solutions that cater to the demands of contemporary corporate and digital applications. You will work with both private and public cloud technologies, focusing on transforming and migrating legacy infrastructure to achieve next-generation business outcomes. This role requires a proactive approach to problem-solving and a commitment to delivering innovative solutions that enhance operational efficiency and effectiveness.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing and mentorship within the team to foster professional growth.
- Analyze and assess existing systems to identify areas for improvement and optimization.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Python (Programming Language).
- Good To Have Skills: Experience with PostgreSQL.
- Strong understanding of cloud computing principles and practices.
- Experience with agile methodologies and project management.
- Proficiency in developing and deploying applications in cloud environments.
- Familiarity with infrastructure-as-code tools and practices.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Python (Programming Language).
- This position is based at our Hyderabad office.
- A 15 years full time education is required.
Posted 2 weeks ago
3.0 - 8.0 years
10 - 14 Lacs
Gurugram
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Microsoft SQL Server Administration
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving discussions, ensuring that the applications are aligned with business objectives and user needs, while maintaining a focus on quality and efficiency throughout the project lifecycle.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Facilitate communication between stakeholders to ensure project alignment and clarity.
- Mentor junior team members, providing guidance and support in their professional development.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Microsoft SQL Server Administration.
- Strong understanding of database design and management principles.
- Experience with performance tuning and optimization of SQL queries.
- Familiarity with backup and recovery strategies for SQL Server databases.
- Knowledge of security best practices for database management.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Microsoft SQL Server Administration.
- This position is based at our Gurugram office.
- A 15 years full time education is required.
Posted 2 weeks ago
6.0 - 11.0 years
10 - 20 Lacs
Pune
Work from Office
Role & responsibilities: Implementing and maintaining Active Directory; Windows Server; MS Windows Clustering; enterprise SAN and NAS configurations; MS SCCM and VMware VCM; antivirus software, application whitelisting and device control; TCP/IP and other networking principles, including DNS and DHCP; scripting languages. Experience with managing VMware virtualization technologies: Virtual Center management and administration; vSphere Server, vSphere Client, and vCenter Server; installation and support of VMware View, including pool management, entitlements, upgrades, and break/fix; deploying virtual machines and using technologies such as snapshots, clones, and templates. MCSE certified or equivalent.
What are our desired skills and capabilities?
Skills / Knowledge - Having wide-ranging experience, uses professional concepts and company objectives to resolve complex issues in creative and effective ways. Some barriers to entry exist at this level (e.g., dept./peer review).
Job Complexity - Works on complex issues where analysis of situations or data requires an in-depth evaluation of variable factors. Exercises judgment in selecting methods, techniques and evaluation criteria for obtaining results. Networks with key contacts outside their own area of expertise.
Supervision - Determines methods and procedures on new assignments and may coordinate activities of other personnel (Team Lead).
Active Directory, Windows Server, MS Windows Clustering.
NOTE: The candidate should be comfortable with 5 days of work from office and rotational shifts.
Posted 2 weeks ago
40.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Description: This role involves managing Oracle AIA PIPs (O2C and AABC PIP), Oracle SOA Suite, and OSB 12c environments; performing installation, configuration, and clustering tasks; monitoring and troubleshooting composites, pipelines, and integrations; and integrating with Oracle Enterprise Manager (OEM). The administrator will oversee WebLogic domains, JMS resources, and security configurations while conducting health checks, performance tuning, and issue resolution for middleware systems. Documenting processes and adhering to best practices is also a key part of the role.
Career Level - IC2
Responsibilities
- Install, configure, and administer Oracle AIA solutions, including O2C and AABC PIPs, ensuring proper integration with Oracle SOA Suite and Oracle OSB environments.
- Manage and maintain Oracle SOA/OSB Suite, Oracle AIA 12.x, and WebLogic Server environments, ensuring high availability, performance, and scalability.
- Monitor, troubleshoot, and resolve issues using Oracle Enterprise Manager (OEM) and other monitoring tools to ensure optimal system performance.
- Identify and resolve infrastructure-related issues in BPEL, ESB, and XML/XSLT workflows to maintain seamless integration across enterprise systems.
- Perform regular system upgrades and patches for Oracle AIA, SOA Suite, and WebLogic environments, ensuring minimal downtime and risk mitigation.
- Work closely with the development team to configure and deploy Oracle AIA PIPs, facilitating the integration of enterprise applications.
- Provide ongoing support and troubleshooting for AIA-related integrations and workflows, ensuring timely resolution of technical challenges.
- Document and enforce integration best practices, including change management, version control, and deployment procedures to ensure consistency and reliability across environments.
- Collaborate with cross-functional teams to understand business requirements and tailor AIA solutions to meet specific integration needs.
Conduct periodic health checks and performance tuning for Oracle AIA and SOA environments to optimize system efficiency and response times. Ensure compliance with security standards, ensuring that Oracle AIA environments are secured and appropriately configured according to enterprise security policies.
About Us
As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Posted 2 weeks ago
40.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Job Description: This role involves managing Oracle AIA PIPs (O2C and AABC PIP), Oracle SOA Suite, and OSB 12c environments; performing installation, configuration, and clustering tasks; monitoring and troubleshooting composites, pipelines, and integrations; and integrating with Oracle Enterprise Manager (OEM). The administrator will oversee WebLogic domains, JMS resources, and security configurations while conducting health checks, performance tuning, and issue resolution for middleware systems. Documenting processes and adhering to best practices is also a key part of the role.
Career Level - IC2
Responsibilities
- Install, configure, and administer Oracle AIA solutions, including O2C and AABC PIPs, ensuring proper integration with Oracle SOA Suite and Oracle OSB environments.
- Manage and maintain Oracle SOA/OSB Suite, Oracle AIA 12.x, and WebLogic Server environments, ensuring high availability, performance, and scalability.
- Monitor, troubleshoot, and resolve issues using Oracle Enterprise Manager (OEM) and other monitoring tools to ensure optimal system performance.
- Identify and resolve infrastructure-related issues in BPEL, ESB, and XML/XSLT workflows to maintain seamless integration across enterprise systems.
- Perform regular system upgrades and patches for Oracle AIA, SOA Suite, and WebLogic environments, ensuring minimal downtime and risk mitigation.
- Work closely with the development team to configure and deploy Oracle AIA PIPs, facilitating the integration of enterprise applications.
- Provide ongoing support and troubleshooting for AIA-related integrations and workflows, ensuring timely resolution of technical challenges.
- Document and enforce integration best practices, including change management, version control, and deployment procedures to ensure consistency and reliability across environments.
- Collaborate with cross-functional teams to understand business requirements and tailor AIA solutions to meet specific integration needs.
Conduct periodic health checks and performance tuning for Oracle AIA and SOA environments to optimize system efficiency and response times. Ensure compliance with security standards, ensuring that Oracle AIA environments are secured and appropriately configured according to enterprise security policies.
Posted 2 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
We are seeking a skilled and innovative Deep Learning Engineer to join our AI/ML team. As a Deep Learning Engineer, you will develop, train, and deploy computer vision models that solve complex visual problems. You will work on cutting-edge technology involving image processing, object detection, and video analysis, collaborating with cross-functional teams to create impactful real-world applications.
Role: Data Scientist / Deep Learning Engineer
Location: Pune
General Summary of the Role
- Develop and optimize computer vision models for object detection (YOLO, Faster R-CNN, SSD) and image classification (ResNet, MobileNet, EfficientNet, ViTs).
- Work with OCR technologies (Tesseract, EasyOCR, CRNN, TrOCR) for text extraction from images.
- Work with PyTorch, TensorFlow, and OpenCV for deep learning and image processing.
- Implement sequence-based models (RNNs, LSTMs, GRUs) for vision tasks.
- Optimize software for real-time performance on multiple platforms.
- Implement and deploy AI models via Flask/FastAPI and integrate with SQL/NoSQL databases.
- Use Git/GitHub for version control and team collaboration.
- Apply ML algorithms (regression, decision trees, clustering) as needed.
- Review code, mentor team members, and enhance model efficiency.
- Stay updated with advancements in deep learning and multimodal AI.
Required Skills & Qualifications
- Python proficiency for AI development.
- Experience with PyTorch, TensorFlow, and OpenCV.
- Knowledge of object detection (YOLO, Faster R-CNN, SSD) and image classification (ResNet, MobileNet, EfficientNet, ViTs).
- Experience with OCR technologies (Tesseract, EasyOCR, CRNN, TrOCR).
- Experience with RNNs, LSTMs, and GRUs for sequence-based tasks.
- Experience with Generative Adversarial Networks (GANs) and Diffusion Models for image generation.
- Familiarity with REST APIs (Flask/FastAPI) and SQL/NoSQL databases.
- Strong problem-solving and real-time AI optimization skills.
- Experience with Git/GitHub for version control.
- Knowledge of Docker, Kubernetes, and model deployment at scale on serverless and on-prem platforms.
- Understanding of vector databases (FAISS, Milvus).
Preferred Qualifications
- Experience with cloud platforms (AWS, GCP, Azure).
- Experience with Vision Transformers (ViTs) and Generative AI (GANs, Stable Diffusion, LMMs).
- Familiarity with frontend technologies.
Skills: computer vision, object detection, OpenCV, diffusion models, image classification, LSTMs, GRUs, RNNs, TensorFlow, PyTorch, deep learning, Python, REST APIs, OCR technologies, Git, generative adversarial networks, SQL/NoSQL databases, Kubernetes, Docker, vector databases
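The object-detection work this posting describes is conventionally evaluated with the intersection-over-union (IoU) metric between predicted and ground-truth boxes. A minimal, hypothetical sketch (the `(x1, y1, x2, y2)` box format and the `iou` name are illustrative assumptions, not part of the posting):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    # Coordinates of the intersection rectangle (empty if boxes are disjoint).
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0
```

A detection is typically counted as correct when its IoU with a ground-truth box exceeds a threshold such as 0.5; libraries like torchvision provide batched versions of the same computation.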
Posted 2 weeks ago
1.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. About the role: We are seeking a skilled, proactive, and motivated mid-level Data Scientist with good hands-on experience in data wrangling, transformations, and building data-driven insights and statistical models. As part of the PDT DD&T (Data, Digital & Transformation) team, you will leverage advanced analytics to deliver data-driven solutions and strategic insights that drive impactful business outcomes. Collaborating with cross-functional teams—including Pricing, Marketing, and Forecasting—you will analyze datasets, build predictive models, and help shape data strategies to enhance organizational performance. Working closely with the Senior Data Scientist and fellow data scientists, you will contribute to the broader Data and Analytics chapter of the ICC by providing data-backed recommendations that support key strategic initiatives. How you will contribute: Support the Senior Data Scientist and team members in solving problems, analyzing data, and finding solutions. Support the design, development, and deployment of advanced statistics, data science, and Artificial Intelligence (AI/ML) solutions to solve complex business problems. Collaborate with cross-functional teams to integrate data-driven insights into business strategies and operations. Ensure smooth, successful, and timely delivery of statistical/data science solutions, decision support solutions, and AI/ML products.
Preferred requirements: Bachelor’s degree from an accredited institution in Statistics, Applied Mathematics, Data Science, Physics, Computer Science, Economics, or a related quantitative field (Master’s degree preferred). Candidates with a Master's or PhD are expected to have a minimum of 1 year of AI/Machine Learning/Advanced Statistics/Data Science experience in technology companies, Life Sciences, Pharma, Biotech, or relevant regulated industries (banking, insurance, etc.); a minimum of 3 years of production-level experience is expected after a bachelor's degree. Understanding of applied statistical modelling, math modelling, and ML techniques (e.g. Clustering, Classification, Regression, Dimensionality Reduction, Ensemble Methods, Natural Language Processing). Programming & Data Engineering: Experience in Python, PySpark, or R for data curation, transformation, analysis, modeling, and visualization tasks. Analytical Problem-Solving: Strong analytical and problem-solving abilities, focused on delivering actionable business solutions. Strong presentation and communication skills, client service, and technical writing skills for both technical and business audiences. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 2 weeks ago
6.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Our Purpose Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential. Title and Summary Senior Data Scientist, Product Data & Analytics Our Vision: The Product Data & Analytics team builds internal analytic partnerships, strengthening focus on the health of the business, portfolio and revenue optimization opportunities, initiative tracking, new product development, and go-to-market strategies. We are a hands-on global team providing scalable end-to-end data solutions by working closely with the business. We influence decisions across Mastercard through data-driven insights. We are a team of analytics engineers, data architects, BI developers, data analysts and data scientists, and fully manage our own data assets and solutions. Are you excited about data assets and the value they bring to an organization? Are you an evangelist for data-driven decision making? Are you motivated to be part of a global analytics team that builds large-scale analytical capabilities supporting end users across the continents? Are you interested in proactively looking to improve data-driven decisions for a global corporation? Role Responsible for developing data-driven, innovative, scalable analytical solutions and identifying opportunities to support business and client needs in a quantitative manner and facilitate informed recommendations/decisions. Accountable for delivering high-quality project solutions and tools within agreed-upon timelines and budget parameters and conducting post-implementation reviews.
Contributes to the development of custom analyses and solutions, deriving insights from extracted data to solve critical business questions. Activities include developing and creating predictive models, behavioural segmentation frameworks, profitability analyses, ad hoc reporting, and data visualizations. Able to develop AI/ML capabilities, as needed, on large volumes of data to support analytics and reporting needs across products, markets and services. Able to build end-to-end reusable, multi-purpose AI models to drive automated insights and recommendations. Leverages open and closed source technologies to solve business problems. Works closely with global and regional teams to architect, develop, and maintain advanced reporting and data visualization capabilities on large volumes of data to support analytics and reporting needs across products, markets, and services. Supports initiatives in developing predictive models, behavioural segmentation frameworks, profitability analyses, ad hoc reporting, and data visualizations. Translates client/stakeholder needs into technical analyses and/or custom solutions in collaboration with internal and external partners; derives insights and presents findings and outcomes to clients/stakeholders to solve critical business questions. Creates repeatable processes to support development of modelling and reporting. Delegates and reviews work for junior-level colleagues to ensure downstream applications and tools are not compromised or delayed. Serves as a mentor for junior-level colleagues, and develops talent via ongoing technical training, peer review, etc. All About You 6-8 years of experience in data management, data mining, data analytics, data reporting, data product development and quantitative analysis. Advanced SQL skills; ability to write optimized queries for large data sets.
Experience with platforms/environments: Cloudera Hadoop, the big data technology stack, SQL Server, the Microsoft BI stack, cloud, Snowflake, and other relevant technologies. Experience with data visualization tools (Tableau, Domo, and/or Power BI or similar) is a plus. Experience with data validation, quality control, and cleansing processes for new and existing data sources. Experience with classical and deep machine learning algorithms such as Logistic Regression, Decision Trees, Clustering (K-means, Hierarchical, and Self-Organizing Maps), t-SNE, PCA, Bayesian models, Time Series ARIMA/ARMA, Random Forest, GBM, KNN, SVM, text mining techniques, Multilayer Perceptrons, and neural networks (feedforward, CNN, NLP, etc.). Experience with deep learning techniques, open-source tools and technologies, statistical tools, and programming environments such as Python and R, and big data platforms such as Hadoop, Hive, Spark, and GPU clusters for deep learning. Experience in automating and creating data pipelines via tools such as Alteryx and SSIS; NiFi is a plus. Financial institution or payments experience is a plus. Additional Competencies Excellent English, quantitative, technical, and communication (oral/written) skills. Ownership of end-to-end project delivery and risk mitigation. Virtual team management; manages stakeholders by influence. Analytical/problem-solving skills. Able to prioritize and perform multiple tasks simultaneously. Able to work across varying time zones. Strong attention to detail and quality. Creativity/innovation. Self-motivated; operates with a sense of urgency. In-depth technical knowledge, drive, and ability to learn new technologies.
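The behavioural segmentation frameworks mentioned in this role are often built on clustering algorithms such as the K-means listed above. A minimal, hypothetical 1-D sketch of the idea (the `kmeans_1d` helper and the spend values are illustrative, not Mastercard's method): assign each point to its nearest center, then move each center to the mean of its assigned points, and repeat.

```python
from statistics import mean

def kmeans_1d(values, centers, iterations=10):
    """Toy K-means for 1-D data: alternate assignment and center updates."""
    for _ in range(iterations):
        clusters = {c: [] for c in centers}
        for v in values:
            # Assign each value to the nearest current center.
            nearest = min(centers, key=lambda c: abs(v - c))
            clusters[nearest].append(v)
        # Move each center to the mean of its cluster (keep it if empty).
        centers = [mean(pts) if pts else c for c, pts in clusters.items()]
    return sorted(centers)

# Hypothetical monthly-spend values forming two obvious segments.
spend = [10, 12, 11, 9, 200, 210, 190, 205]
print(kmeans_1d(spend, centers=[0, 100]))
```

Production segmentation would use a library implementation (e.g. scikit-learn's `KMeans`) on multi-dimensional behavioural features rather than this toy version.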
Must be able to interact with management and internal stakeholders. Corporate Security Responsibility All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must: Abide by Mastercard’s security policies and practices. Ensure the confidentiality and integrity of the information being accessed. Report any suspected information security violation or breach, and Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines. R-244065
Posted 2 weeks ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description Kaleris is a private equity-backed software firm focused on supply chain optimization, headquartered in Atlanta, Georgia. We are a global leader in the supply chain execution market, focused on accelerating the transformation of digital supply chain for industrial and finished goods shippers and carriers by combining best-in-class solutions for challenges tied to yard management, shipment visibility, and asset management, across rail, truck, and multi-mode transportation. The Kaleris IT Infrastructure Engineer plays a pivotal role in providing comprehensive IT infrastructure support to customers utilizing the N4 software, both in on-premises and cloud-hosted environments. This position involves collaborating with a diverse range of customers to offer strategic planning, technical design, and implementation of IT infrastructure solutions. Key responsibilities include the design of hosting environments, firewall setup, security audits, health checks, network and server maintenance, disaster recovery planning, and managing WAN/communication links to ensure seamless N4 software operation. Key Responsibilities Provide ongoing maintenance, troubleshooting, and support of customer IT infrastructure through Managed Services, including servers, network hardware and software, disaster recovery, storage, WAN/communication links, and cloud hosting environments. Design, implement, and maintain both on-premises and cloud-hosted infrastructure for the N4 Terminal Operating System (TOS). Monitor and diagnose infrastructure incidents affecting N4 TOS performance and the underlying systems, ensuring timely resolution. Consult with customers and troubleshoot N4 TOS-related software and infrastructure issues, providing solutions and expert guidance. Review, audit, and administer customer hardware and cloud configurations to ensure optimal performance. 
Ensure customer issues are resolved in line with Service Level Agreements (SLAs), maintaining high levels of customer satisfaction. Remain on standby for critical P1 incidents and provide weekend or shift-based support as required for 24/7 customer operations. Qualifications Extensive experience in server centralization, consolidation, and virtualization of servers and storage within modern IT architectures. Strong technical expertise in network hardware, protocols, and internet standards, with a solid understanding of operating systems and their configurations. In-depth knowledge of database technologies, including scaling, redundancy, and backup strategies. Proven experience in network capacity planning and network security, with a deep understanding of best practices in these areas. Ability to conduct advanced research into networking issues and provide recommendations for technical improvements. Exceptional troubleshooting skills with a focus on hardware and infrastructure optimization. Expertise and/or qualifications in the following technologies: load balancers; clustering; Tomcat; Oracle 11g or 11g RAC, SQL, and MySQL databases; Red Hat Linux 5; RAID; Microsoft Server 2008; ActiveMQ; Microsoft SQL Server 2012; JMS; firewalls. A minimum of 3 years of consulting experience with the technologies listed above. Knowledge, Skills, and Abilities Experience in the maritime or logistics industry is highly desirable. Familiarity with the N4 Terminal Operating System (TOS) is a significant advantage. Experience working in distributed, virtual teams with strong collaboration skills. Positive attitude, strong work ethic, and a demonstrated commitment to excellence. Exceptional organizational and multitasking skills, with a focus on detail. Outstanding customer service and follow-up abilities, ensuring issues are resolved effectively. Ability to work collaboratively and follow instructions, contributing to a team-oriented environment. Multilingual capabilities are a plus.
This position is ideal for an IT professional with a strong technical foundation, seeking to support and enhance critical infrastructure solutions for global customers in the maritime and logistics sectors. Benefits & Compensation Competitive compensation package Paid Leave (Vacation/Annual, Casual, Volunteering time off) Hospitalization Insurance Life & Accident Insurance Broadband Allowance, IT gadgets Allowance Meal & Fuel Allowance Provident Fund Tuition Reimbursement Employee Assistance Program Career growth and mentorship Kaleris is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.
Posted 2 weeks ago
3.0 - 7.0 years
12 - 16 Lacs
Bengaluru
Work from Office
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results. Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Proof of Concept (POC) Development: Develop POCs to validate and showcase the feasibility and effectiveness of the proposed AI solutions. Help in showcasing the ability of a Gen AI code assistant to refactor/rewrite and document code from one language to another. Document solution architectures, design decisions, implementation details, and lessons learned. Stay up to date with the latest trends and advancements in AI, foundation models, and large language models.
Evaluate emerging technologies, tools, and frameworks to assess their potential impact on solution design and implementation. Preferred technical and professional experience Experience and working knowledge of COBOL & Java would be preferred. Experience in code generation, code matching, and code translation leveraging LLM capabilities would be a big plus. Demonstrate a growth mindset to understand clients' business processes and challenges.
Posted 2 weeks ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Dreaming big is in our DNA. It’s who we are as a company. It’s our culture. It’s our heritage. And more than ever, it’s our future. A future where we’re always looking forward. Always serving up new ways to meet life’s moments. A future where we keep dreaming bigger. We look for people with passion, talent, and curiosity, and provide them with the teammates, resources and opportunities to unleash their full potential. The power we create together – when we combine your strengths with ours – is unstoppable. Are you ready to join a team that dreams as big as you do? AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming Operations through Tech and Analytics. Do You Dream Big? We Need You. Job Description Job Title: Manager – Data Scientist – Trade Marketing Location: Bangalore Reporting to: Trade Marketing Manager Purpose of the role Use statistical and mathematical techniques to measure the impact of promotions, trade programs, or activations through tests. Maintain the Python code for the model that runs the Test vs Control methodology. Conduct exploratory studies to identify opportunities to increase ROI for future promotions. Optimize the sales package and check spends. Enhance and maintain the Python code for the model that runs the Test vs Control strategy. Design and deploy mathematical models to create store-level target lists to implement optimal-ROI promotions in the future. Build recommendation models, adding features on top of the model to create a robust algorithm. Communicate analysis results to business stakeholders through intuitive Power BI solutions. Set the framework for analysis based on the business objective; thorough with business-level presentations.
Build new AI & DS products, projects, and effective business presentations, and contribute to business target achievements. Key tasks & accountabilities Expertise in Python to prepare and update the algorithm. Developing business-friendly presentations, transforming thoughts into key actions for the business and showing the model to justify recommendations. Expertise in Power BI to produce dashboards. Understanding of SQL to connect Power BI to the data lake and create automated outputs/dashboards. Understanding the business problem and translating it into an analytical problem. Ability to solve problems quickly and effectively, applying logical thinking and a creative mindset. Handling large data sets to connect data points and provide insights that drive the business. Be assertive and goal-oriented and drive results with minimal support. Excellent Excel, PowerPoint, and basic VBA and SQL skills to convert unstructured data into structured formats, optimizing time spent on data handling. Exposure to AI/ML methodologies with previous hands-on experience in ML concepts like forecasting, clustering, regression, classification, optimization, and deep learning. Product-building experience would be a plus. Interact with and present to senior internal or external stakeholders around project/service conceptualization and plan of delivery. Ability to work in collaboration with multiple teams within the CP&A tower. Develop a deeper understanding of different concepts of Category Management. Plan and implement different projects with multiple teams in Category Management. BUSINESS ENVIRONMENT Delivering insights and provocation as per given timelines. Following processes and adhering to documentation goals. High-quality presentations. Questions from business stakeholders answered satisfactorily within an agreeable time. Coming up with provocations, proposing what-if or next-level analyses. Challenges: Primary challenges include various sources of data and incomplete data, which can throw off the analysis.
Outliers must be removed and data cleaned before analysing and presenting to stakeholders. 3. Qualifications, Experience, Skills Level Of Educational Attainment Required Master's graduate in Business & Marketing, Engineering, or an equivalent degree, or equivalent work experience; MBA/Engineering in a relevant technical field such as Marketing Previous Work Experience Required 2-3+ years' experience handling data science projects Prior experience in managing multiple files, data cleaning, and maintaining data in structured formats Technical Skills Required Proficient in Python, Power BI, SQL, VBA, Advanced Excel, MS PowerPoint, Power Apps Expert-level proficiency in Python (knowledge of writing end-to-end ML or data pipelines in Python) Proficient in applying AI & ML concepts and optimization techniques to solve end-to-end business problems Familiarity with the Azure tech stack, Databricks, and MLflow in any cloud platform Good understanding of statistical and mathematical concepts And above all of this, an undying love for beer! We dream big to create a future with more cheers.
Posted 2 weeks ago
4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Thank you for your interest in working for our Company. Recruiting the right talent is crucial to our goals. On April 1, 2024, 3M Healthcare underwent a corporate spin-off leading to the creation of a new company named Solventum. We are still in the process of updating our Careers Page and applicant documents, which currently have 3M branding. Please bear with us. In the interim, our Privacy Policy here: https://www.solventum.com/en-us/home/legal/website-privacy-statement/applicant-privacy/ continues to apply to any personal information you submit, and the 3M-branded positions listed on our Careers Page are for Solventum positions. As it was with 3M, at Solventum all qualified applicants will receive consideration for employment without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran. Job Description 3M Health Care is now Solventum At Solventum, we enable better, smarter, safer healthcare to improve lives. As a new company with a long legacy of creating breakthrough solutions for our customers’ toughest challenges, we pioneer game-changing innovations at the intersection of health, material and data science that change patients' lives for the better while enabling healthcare professionals to perform at their best. Because people, and their wellbeing, are at the heart of every scientific advancement we pursue. We partner closely with the brightest minds in healthcare to ensure that every solution we create melds the latest technology with compassion and empathy. Because at Solventum, we never stop solving for you. The Impact You’ll Make in this Role We’re seeking an HR Data Scientist to help build and strengthen our People Analytics capabilities within the People Experience team. Our mission is to create a best-in-class employee experience, with data and analytics at its core. 
As a data scientist, you’ll have the opportunity to collaborate with team members on driving impactful analyses and/or data products that answer key talent and business questions. Key Responsibilities Include Contribute to scalable data management to ensure data accuracy and accessibility, bringing together multiple HR data systems and business-related data sources Perform statistical analysis, selecting the best method for the specific question and context (e.g., ranging from basic regression to advanced clustering or predictive models), maintaining a balance between innovative approaches and interpretability Create compelling data visualizations and data products to ensure insights are clearly and effectively communicated to key stakeholders Collaborate with stakeholders to scope and prioritize requests and drive insights aligned with organizational objectives. Help contribute to raising the overall level of analytical fluency across HR and the business. Your Skills And Expertise To set you up for success in this role from day one, Solventum requires (at a minimum) the following qualifications: Bachelor’s degree in I/O psychology, data science, applied statistics, human resources, business analytics, or a related field. Master’s degree preferred. 4+ years of professional experience in HR, business analytics, or a similar analytical position. Experience with advanced statistical programming tools (R preferred; Python, etc.), query languages, and data visualization tools. Familiarity with a range of statistical and data science methods and techniques, from regression and categorical data analysis to organizational network analysis and predictive modeling.
Additional qualifications that could help you succeed even further in this role include: Experience with various types of people data, e.g., coming from HR information systems (HRIS), timekeeping, employee survey tools, workforce planning systems Experience interpreting complex analyses and findings for audiences with varying levels of analytical fluency, especially with the goal of driving action or supporting decision-making Strong problem-solving skills, including the ability to think creatively about the data and the application of novel or innovative methodologies. Experience working with cloud computing platforms (e.g., AWS, Google Cloud, MS Azure). Solventum is committed to maintaining the highest standards of integrity and professionalism in our recruitment process. Applicants must remain alert to fraudulent job postings and recruitment schemes that falsely claim to represent Solventum and seek to exploit job seekers. Please note that all email communications from Solventum regarding job opportunities with the company will be from an email with a domain of @solventum.com. Be wary of unsolicited emails or messages regarding Solventum job opportunities from emails with other email domains. Please note: your application may not be considered if you do not provide your education and work history, either by: 1) uploading a resume, or 2) entering the information into the application fields directly. Solventum Global Terms of Use and Privacy Statement Carefully read these Terms of Use before using this website. Your access to and use of this website and application for a job at Solventum are conditioned on your acceptance and compliance with these terms. Please access the linked document by clicking here, select the country where you are applying for employment, and review. Before submitting your application you will be asked to confirm your agreement with the terms.
Posted 2 weeks ago
8.0 years
4 - 6 Lacs
Chandigarh
On-site
Location: Chandigarh, India (Office-based) Contract Type: Permanent Join a Global Leader in Multi-Cloud Solutions Cintra is a global leader in enterprise cloud transformation and managed services. With deep partnerships across Oracle, AWS, and Google Cloud, we help the world’s leading organizations modernize and manage their most critical workloads. We're now expanding our team in Chandigarh and looking for a seasoned Principal PostgreSQL DBA who also brings hands-on Oracle DBA experience to support our enterprise clients across the globe. What You’ll Be Doing As a senior member of our DBA team, you’ll combine expert-level technical skills with a consulting mindset to deliver and support complex database environments across PostgreSQL and Oracle: Lead architecture, design, implementation, and support of enterprise-grade PostgreSQL environments Handle PostgreSQL administration: installation, configuration, PITR, replication, clustering, and upgrades/migrations Provide end-to-end Oracle support: CDB/PDB management, cross-platform migrations, patching, performance tuning, and backups Tune databases using tools like OEM 13c, AWR, ADDM, and ORAchk Design and implement automation using Shell, Python, Ansible, and Terraform Collaborate closely with project managers and client stakeholders to ensure seamless delivery and support Proactively resolve incidents and drive performance optimization for mission-critical databases Work across on-premise and cloud environments (AWS, OCI, Azure) What You Bring Must-Have Skills: 8+ years of hands-on experience as a PostgreSQL DBA 5+ years of experience as an Oracle DBA Proven experience supporting production environments and working in consulting or customer-facing roles Solid understanding of PostgreSQL in cloud (especially AWS/Aurora) and on-prem environments Deep experience in database performance tuning and advanced troubleshooting Bachelor's degree in Computer Science, IT, or equivalent hands-on experience Nice-to-Haves: 
Familiarity with Oracle Engineered Systems (Exadata, ODA, ZFS) Working knowledge of OCI, AWS, Azure, and GCP Hands-on experience with automation tools (Ansible, Jenkins, Terraform) Exposure to DevOps practices and CI/CD pipelines Oracle and/or PostgreSQL certifications Strong understanding of Linux, storage, and networking Why Join Cintra? Be part of an award-winning, global cloud innovator Work with Fortune 500 clients and leading-edge technologies Collaborate with a high-performing team of experts Grow your career across cloud, architecture, and database domains Competitive compensation and a culture that values talent and initiative Ready to take the next step in your DBA career? Join #TeamCintra and help shape the future of enterprise multi-cloud solutions. Cintra Software & Services is an Equal Opportunity Employer. We welcome applicants from all backgrounds and ensure fair treatment throughout our hiring process.
Posted 2 weeks ago
8.0 - 10.0 years
2 - 7 Lacs
Hyderābād
On-site
Date: May 28, 2025 Job Requisition Id: 61437 Location: Pune, IN Indore, IN Hyderabad, IN YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we’re a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth – bringing real positive changes in an increasingly virtual world – and it drives us beyond generational gaps and disruptions of the future. We are looking to hire ETL (Extract, Transform, Load) professionals in the following areas: Experience 8-10 Years Job Description Key Responsibilities: Lead the end-to-end migration of databases, stored procedures, views and ETL pipelines from SQL Server to Snowflake. Analyze existing SQL Server schemas, stored procedures, and SSIS packages to design equivalent Snowflake solutions. Re-engineer and optimize ETL workflows using SSIS and Snowflake-native tools (e.g., Snowpipe, Streams, Tasks). Collaborate with data architects, analysts, and business stakeholders to ensure data integrity and performance. Develop and maintain documentation for migration processes, data mappings, and transformation logic. Monitor and troubleshoot data pipelines and performance issues post-migration. Ensure compliance with data governance and security standards during migration. Tools & Technology: SQL Server, SSIS, SQL, Snowflake. Required Skills & Qualifications: 8-10 years of experience with Microsoft SQL Server, including T-SQL and performance tuning. Good hands-on experience with SSIS for ETL development and deployment. Solid experience with Snowflake, including SnowSQL, data modeling, and performance optimization. Strong understanding of data warehousing concepts and cloud data architecture. Experience with version control systems (e.g., Git) and CI/CD pipelines for data workflows.
Excellent problem-solving and communication skills. Good to have: knowledge of Azure Data Factory. Required Technical/Functional Competencies Domain/Industry Knowledge: Basic knowledge of the customer's business processes and the relevant technology platform or product. Able to prepare process maps, workflows, business cases and simple business models in line with customer requirements with assistance from SMEs, and apply industry standards/practices in implementation with guidance from experienced team members. Requirement Gathering and Analysis: Working knowledge of requirement management and requirement analysis processes, tools & methodologies. Able to analyse the impact of a requested change/enhancement/defect fix and identify dependencies or interrelationships among requirements & transition requirements for the engagement. Product/Technology Knowledge: Working knowledge of technology product/platform standards and specifications. Able to implement code or configure/customize products and provide inputs into design and architecture, adhering to industry standards/practices in implementation. Analyze various frameworks/tools, review code and provide feedback on improvement opportunities. Architecture Tools and Frameworks: Working knowledge of industry architecture tools & frameworks. Able to identify the pros and cons of available tools & frameworks in the market, use them per customer requirements, and explore new tools/frameworks for implementation. Architecture Concepts and Principles: Working knowledge of architectural elements, SDLC, and methodologies. Able to provide architectural design/documentation at an application or functional capability level, implement architectural patterns in solutions & engagements, and communicate architecture direction to the business. Analytics Solution Design: Knowledge of statistical & machine learning techniques like classification, linear regression modelling, clustering & decision trees.
Able to identify the cause of errors and their potential solutions. Tools & Platform Knowledge: Familiar with a wide range of mainstream commercial & open-source data science/analytics software tools, their constraints, advantages, disadvantages, and areas of application. Required Behavioral Competencies Accountability: Takes responsibility for and ensures accuracy of own work, as well as the work and deadlines of the team. Collaboration: Shares information within the team, participates in team activities, asks questions to understand other points of view. Agility: Demonstrates readiness for change, asking questions and determining how changes could impact own work. Customer Focus: Identifies trends and patterns emerging from customer preferences and works towards customizing/refining existing services to exceed customer needs and expectations. Communication: Targets communications for the appropriate audience, clearly articulating and presenting his/her position or decision. Drives Results: Sets realistic stretch goals for self & others to achieve and exceed defined goals/targets. Resolves Conflict: Displays sensitivity in interactions and strives to understand others’ views and concerns. Certifications: Mandatory. At YASH, you are empowered to create a career that will take you to where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; stable employment with a great atmosphere and an ethical corporate culture.
Posted 2 weeks ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Looking for a challenging role? If you really want to make a difference - make it with us. Can we energize society and fight climate change at the same time? At Siemens Energy, we can. Our technology is key, but our people make the difference. Brilliant minds innovate. They connect, create, and keep us on track towards changing the world’s energy systems. Their spirit fuels our mission. Our culture is defined by caring, agile, respectful, and accountable individuals. We value excellence of any kind. Sounds like you? Your new role – challenging and future-oriented: Security Implementation and Management: Implementing and maintaining security controls, including firewalls, intrusion detection systems, and data encryption. Hands-on experience with FortiGate & Check Point firewalls. Expert knowledge of firewall clustering, HA, traffic filtering, defining network & security policies, network segmentation (VLANs), IDS/IPS, and NGFW concepts. Log management & forwarding over Syslog. Vulnerability Assessment and Mitigation: Identifying and addressing potential vulnerabilities in systems and networks. Practical knowledge of VAPT tools like Nessus Professional. Incident Response: Investigating and responding to security incidents, including breaches and attacks. Risk Management: Assessing and mitigating cybersecurity risks to the organization. Security Awareness and Training: Providing training and education to project customers on cybersecurity best practices. Disaster Recovery and Business Continuity: Contributing to the development and maintenance of disaster recovery and business continuity plans. Communication Focused: Reporting and Communication: Preparing reports and communicating security status to management and other stakeholders. Vendor Management: Managing relationships with IT service providers and vendors to ensure security standards are met. Collaboration and Liaison: Collaborating with other departments and teams to ensure security policies are followed.
Technical Support and Advice: Providing technical support and advice on security-related issues. Documentation: Documenting security processes, policies, and procedures. Other Important Responsibilities: Staying Up to Date: Keeping abreast of the latest cybersecurity trends, threats, and technologies. Problem Solving: Identifying and resolving security-related issues and problems. Compliance: Ensuring the organization complies with relevant cybersecurity regulations and standards. We’ve got quite a lot to offer. How about you? This role is based at Site (Gurgaon). You’ll also get to visit other locations in India and beyond, so you’ll need to go where this journey takes you. In return, you’ll get the chance to work with teams impacting entire cities, countries – and the shape of things to come. We’re Siemens. A collection of over 379,000 minds building the future, one day at a time in over 200 countries. We're dedicated to equality, and we welcome applications that reflect the diversity of the communities we work in. All employment decisions at Siemens are based on qualifications, merit and business need. Bring your curiosity and imagination and help us shape tomorrow.
Posted 2 weeks ago
0 years
2 - 9 Lacs
Chennai
On-site
Comfort level in following Python project management best practices (use of setup.py, logging, pytest, relative module imports, Sphinx docs, etc.) Familiarity with the use of GitHub (clone, fetch, pull/push, raising issues and PRs, etc.) High familiarity with the use of DL theory/practices in NLP applications Comfort level coding in Hugging Face, LangChain, Chainlit, TensorFlow and/or PyTorch, scikit-learn, NumPy and pandas Comfort level using two or more open-source NLP modules like spaCy, TorchText, fastai.text, farm-haystack, and others Knowledge of fundamental text data processing (like use of regex, token/word analysis, spelling correction/noise reduction in text, segmenting noisy unfamiliar sentences/phrases at the right places, deriving insights from clustering, etc.) Have implemented real-world BERT or other transformer fine-tuned models (sequence classification, NER or QA) from data preparation, model creation and inference through deployment Use of GCP services like BigQuery, Cloud Functions, Cloud Run, Cloud Build, Vertex AI Good working knowledge of other open-source packages to benchmark and derive summaries Experience in using GPU/CPU of cloud and on-prem infrastructures Skillset to leverage cloud platforms for Data Engineering, Big Data and ML needs Use of Docker (experience with experimental Docker features, docker-compose, etc.) Familiarity with orchestration tools such as Airflow and Kubeflow Experience in CI/CD and infrastructure-as-code tools like Terraform Kubernetes or any other containerization tool, with experience in Helm, Argo Workflows, etc. Ability to develop APIs with compliant, ethical, secure and safe AI tools. Good UI skills to visualize and build better applications using Gradio, Dash, Streamlit, React, Django, etc. A deeper understanding of JavaScript, CSS, Angular, HTML, etc. is a plus. Education: Bachelor’s or Master’s Degree in Computer Science, Engineering, Maths or Science. Completion of modern NLP/LLM courses or participation in open competitions is also welcome.
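The fundamental text processing skills named above (regex, token/word analysis) can be sketched in a few lines of plain Python. This is an illustrative sketch only; the function name, pattern, and sample sentence are mine, not from the posting:

```python
import re
from collections import Counter

def tokenize(text):
    """Lowercase the text and pull out word tokens.

    The pattern keeps runs of letters, digits, and apostrophes,
    so contractions like "don't" stay as one token."""
    return re.findall(r"[a-z0-9']+", text.lower())

# Simple word-frequency analysis over a toy sentence
counts = Counter(tokenize("The quick brown fox jumps over the lazy dog. The end."))
print(counts.most_common(2))  # "the" appears three times
```

In practice a library tokenizer (e.g. from spaCy, mentioned above) handles far more edge cases, but a regex pass like this is often enough for quick noise reduction and frequency checks.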
Design NLP/LLM/GenAI applications/products following robust coding practices. Explore SoTA models/techniques so that they can be applied to automotive industry use cases. Conduct ML experiments to train/infer models; if need be, build models that abide by memory & latency restrictions. Deploy REST APIs or a minimalistic UI for NLP applications using Docker and Kubernetes tools. Showcase NLP/LLM/GenAI applications in the best way possible to users through web frameworks (Dash, Plotly, Streamlit, etc.). Converge multiple bots into super apps using LLMs with multimodal capabilities. Develop agentic workflows using AutoGen, Agent Builder, LangGraph. Build modular AI/ML products that can be consumed at scale. Data Engineering: Skillsets to perform distributed computing (specifically parallelism and scalability in data processing, modeling and inferencing through Spark, Dask, or RAPIDS cuDF). Ability to build Python-based APIs (e.g., use of FastAPI, Flask, or Django for APIs). Experience with Elasticsearch, Apache Solr, and vector databases is a plus.
Posted 2 weeks ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At Juniper, we believe the network is the single greatest vehicle for knowledge, understanding, and human advancement the world has ever known. To achieve real outcomes, we know that experience is the most important requirement for networking teams and the people they serve. Delivering an experience-first, AI-Native Network pivots on the creativity and commitment of our people. It requires a consistent and committed practice, something we call the Juniper Way. Job Title: Software Engineer IV/Staff - Data Scientist AI/ML Experience: 5+ Years Location: Bangalore The AIOps team’s mission is to use advanced analytics, including AI/ML, to develop end-to-end solutions to automate (detect, remediate) networking workflows for our customers, and help extend AI/ML across the Juniper portfolio. We are looking for an experienced engineer to join our growing data science team of AI/ML and data-at-scale engineers. Our ideal candidate brings their skills and experience having developed performant inferencing implementations, practiced data science hygiene to develop ML models and is a team player. As a data scientist, you will collaborate with product managers and domain specialists to identify real customer problems, use your background in NLP/ML to develop solutions that scale with terabytes of data. 
Qualifications/Requirements: BS/MS in Computer Science, Data Science, Electrical Engineering, Statistics, Applied Math or equivalent fields with a strong mathematical background Excellent understanding of machine learning techniques and algorithms, including clustering, anomaly detection, optimization, neural networks, graph ML, etc. 4+ years of experience building data science-driven solutions, including data collection, feature selection, model training and post-deployment validation Strong hands-on coding skills (preferably in Python) processing large-scale data sets and developing machine learning models Familiarity with one or more machine learning or statistical modeling tools such as NumPy, scikit-learn, MLlib, TensorFlow Works well in a team setting and is self-driven Desired Experience: Experience with some/equivalent of: AWS, Flink, Spark, Kafka, Elasticsearch, Kubeflow Knowledge of and experience with NLP technology Networking domain experience and/or demonstrable problem-solving ability Conceptual understanding of system design concepts Responsibilities: Collaborate with product management and engineering teams to understand company needs, and work with domain experts to identify relevant “signals” during feature engineering Take end-to-end responsibility to deliver optimized, generic and performant ML solutions Keep up to date with the newest technology trends Communicate results and ideas to key decision makers Implement new statistical or other mathematical methodologies as needed for specific models or analyses Optimize joint development efforts through appropriate database use and project design About Juniper Networks Juniper Networks challenges the inherent complexity that comes with networking and security in the multicloud era. We do this with products, solutions and services that transform the way people connect, work and live.
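Anomaly detection, one of the ML techniques listed above, can be illustrated at interview depth with a robust median/MAD rule. This is a minimal sketch with invented names and toy data, not a description of Juniper's systems:

```python
from statistics import median

def mad_anomalies(values, threshold=3.5):
    """Return indices of points far from the median, in MAD units.

    The median/MAD pair is robust: unlike mean/stdev, a single extreme
    value barely shifts the baseline it is judged against."""
    med = median(values)
    mad = median(abs(v - med) for v in values)
    if mad == 0:
        return []
    # 0.6745 rescales the MAD so the score is comparable to a z-score
    return [i for i, v in enumerate(values)
            if abs(0.6745 * (v - med) / mad) > threshold]

readings = [10.1, 9.9, 10.0, 10.2, 9.8, 42.0]  # last reading is anomalous
print(mad_anomalies(readings))  # -> [5]
```

A mean/stdev z-score would miss this outlier here, because the outlier itself inflates the standard deviation; that contrast is a common interview talking point.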
We simplify the process of transitioning to a secure and automated multicloud environment to enable secure, AI-driven networks that connect the world. Additional information can be found at Juniper Networks (www.juniper.net) or connect with Juniper on Twitter, LinkedIn and Facebook. WHERE WILL YOU DO YOUR BEST WORK? Wherever you are in the world, whether it's downtown Sunnyvale or London, Westford or Bengaluru, Juniper is a place that was founded on disruptive thinking - where colleague innovation is not only valued, but expected. We believe that the great task of delivering a new network for the next decade is delivered through the creativity and commitment of our people. The Juniper Way is the commitment to all our colleagues that the culture and company inspire their best work-their life's work. At Juniper we believe this is more than a job - it's an opportunity to help change the world. At Juniper Networks, we are committed to elevating talent by creating a trust-based environment where we can all thrive together. If you think you have what it takes, but do not necessarily check every single box, please consider applying. We’d love to speak with you. Additional Information for United States jobs: ELIGIBILITY TO WORK AND E-VERIFY In compliance with federal law, all persons hired will be required to verify identity and eligibility to work in the United States and to complete the required employment eligibility verification form upon hire. Juniper Networks participates in the E-Verify program. E-Verify is an Internet-based system operated by the Department of Homeland Security (DHS) in partnership with the Social Security Administration (SSA) that allows participating employers to electronically verify the employment eligibility of new hires and the validity of their Social Security Numbers. 
Information for applicants about E-Verify / E-Verify Información en español: This Company Participates in E-Verify / Este Empleador Participa en E-Verify Immigrant and Employee Rights Section (IER) - The Right to Work / El Derecho a Trabajar E-Verify® is a registered trademark of the U.S. Department of Homeland Security. Juniper is an Equal Opportunity workplace. We do not discriminate in employment decisions on the basis of race, color, religion, gender (including pregnancy), national origin, political affiliation, sexual orientation, gender identity or expression, marital status, disability, genetic information, age, veteran status, or any other applicable legally protected characteristic. All employment decisions are made on the basis of individual qualifications, merit, and business need.
Posted 2 weeks ago
The job market for clustering roles in India is thriving, with numerous opportunities available for job seekers with expertise in this area. Clustering professionals are in high demand across various industries, including IT, data science, and research. If you are considering a career in clustering, this article will provide you with valuable insights into the job market in India.
Here are 5 major cities in India actively hiring for clustering roles: 1. Bangalore 2. Pune 3. Hyderabad 4. Mumbai 5. Delhi
The average salary range for clustering professionals in India varies based on experience levels. Entry-level positions may start at around INR 3-6 lakhs per annum, while experienced professionals can earn upwards of INR 12-20 lakhs per annum.
In the field of clustering, a typical career path may look like: - Junior Data Analyst - Data Scientist - Senior Data Scientist - Tech Lead
Apart from expertise in clustering, professionals in this field are often expected to have skills in: - Machine Learning - Data Analysis - Python/R programming - Statistics
Here are 25 interview questions for clustering roles: - What is clustering and how does it differ from classification? (basic) - Explain the K-means clustering algorithm. (medium) - What are the different types of distance metrics used in clustering? (medium) - How do you determine the optimal number of clusters in K-means clustering? (medium) - What is the Elbow method in clustering? (basic) - Define hierarchical clustering. (medium) - What is the purpose of clustering in machine learning? (basic) - Can you explain the difference between supervised and unsupervised learning? (basic) - What are the advantages of hierarchical clustering over K-means clustering? (advanced) - How does DBSCAN clustering algorithm work? (medium) - What is the curse of dimensionality in clustering? (advanced) - Explain the concept of silhouette score in clustering. (medium) - How do you handle missing values in clustering algorithms? (medium) - What is the difference between agglomerative and divisive clustering? (advanced) - How would you handle outliers in clustering analysis? (medium) - Can you explain the concept of cluster centroids? (basic) - What are the limitations of K-means clustering? (medium) - How do you evaluate the performance of a clustering algorithm? (medium) - What is the role of inertia in K-means clustering? (basic) - Describe the process of feature scaling in clustering. (basic) - How does the GMM algorithm differ from K-means clustering? (advanced) - What is the importance of feature selection in clustering? (medium) - How can you assess the quality of clustering results? (medium) - Explain the concept of cluster density in DBSCAN. (advanced) - How do you handle high-dimensional data in clustering? (medium)
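For the "Explain the K-means clustering algorithm" question above, a strong answer can include a from-scratch sketch like the following. It is a minimal pure-Python illustration on toy data; the function and variable names are mine:

```python
import math
import random

def kmeans(points, k, iters=100, seed=0):
    """Plain K-means: assign each point to its nearest centroid,
    then recompute each centroid as the mean of its members,
    repeating until the assignments stop changing."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)   # initialize from the data
    assign = [None] * len(points)
    for _ in range(iters):
        new_assign = [
            min(range(k), key=lambda c: math.dist(p, centroids[c]))
            for p in points
        ]
        if new_assign == assign:        # converged
            break
        assign = new_assign
        for c in range(k):
            members = [p for p, a in zip(points, assign) if a == c]
            if members:
                centroids[c] = tuple(
                    sum(x) / len(members) for x in zip(*members)
                )
    return assign, centroids

# Two well-separated blobs around (0, 0) and (10, 10)
pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
labels, cents = kmeans(pts, k=2)
```

In interviews this also sets up follow-ups from the list: inertia is the sum of squared distances each point has to its assigned centroid, and the Elbow method plots inertia against k to pick the point where adding clusters stops paying off.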
As you venture into the world of clustering jobs in India, remember to stay updated with the latest trends and technologies in the field. Equip yourself with the necessary skills and knowledge to stand out in interviews and excel in your career. Good luck on your job search journey!