5.0 years
4 Lacs
Ahmedabad
On-site
We are hiring a Senior Software Development Engineer for our platform. We help enterprises and service providers build AI inference platforms for their end users. As a Senior Software Engineer, you will take ownership of backend-heavy, full-stack feature development: building robust services, scalable APIs, and intuitive frontends that power the user experience. You will contribute to the core of our enterprise-grade AI platform, collaborating across teams to ensure our systems are performant, secure, and built to last. This is a high-impact, high-visibility role working at the intersection of AI infrastructure, enterprise software, and developer experience.

Responsibilities:
- Design, develop, and maintain databases, system APIs, system integrations, machine learning pipelines, and web user interfaces.
- Scale algorithms designed by data scientists for deployment in high-performance environments.
- Develop and maintain continuous integration pipelines to deploy the systems.
- Design and implement scalable backend systems using Go, C++, and Python.
- Model and manage data using relational databases (e.g., PostgreSQL, MySQL).
- Build frontend components and interfaces using TypeScript and JavaScript when needed.
- Participate in system architecture discussions and contribute to design decisions.
- Write clean, idiomatic, and well-documented Go code following best practices and design patterns.
- Ensure high code quality through unit testing, automation, code reviews, and documentation.
- Communicate technical concepts clearly to both technical and non-technical stakeholders.

Qualifications and Criteria:
- 5-10 years of professional software engineering experience building enterprise-grade platforms.
- Deep proficiency in Go, with real-world experience building production-grade systems.
- Solid knowledge of software architecture, design patterns, and clean code principles.
- Experience in high-level system design and building distributed systems.
- Expertise in Python and backend development, with experience in PostgreSQL or similar databases.
- Hands-on experience with unit testing, integration testing, and TDD in Go.
- Strong debugging, profiling, and performance optimization skills.
- Excellent communication and collaboration skills.
- Hands-on experience with frontend development using JavaScript, TypeScript, and HTML/CSS.
- Bachelor's degree or equivalent experience in a quantitative field (Computer Science, Statistics, Applied Mathematics, Engineering, etc.).

Skills:
- Understanding of optimisation, predictive modelling, machine learning, clustering and classification techniques, and algorithms.
- Fluency in a programming language (e.g., C++, Go, Python, JavaScript, TypeScript, SQL).
- Docker, Kubernetes, and Linux knowledge are an advantage.
- Experience using Git and knowledge of continuous integration (e.g., GitLab/GitHub).
- Basic familiarity with relational databases, preferably PostgreSQL.
- Strong grounding in applied mathematics.
- A firm understanding of, and experience with, a disciplined engineering approach.
- Ability to collaborate with other team members via code and design documents.
- Ability to work on multiple tasks simultaneously.
- Ability to work in high-pressure environments and meet deadlines.

Compensation: Commensurate with experience
Position Type: Full-time (in-house)
Location: Ahmedabad / Jamnagar, Gujarat, India

Submission Requirements: CV and all academic transcripts. Submit to chintanit22@gmail.com and dipakberait@gmail.com with the name of the position you wish to apply for in the subject line.

Job Type: Full-time
Pay: From ₹40,000.00 per month
Benefits: Paid sick time
Location Type: In-person
Schedule: Day shift, Monday to Friday
Experience: Full-stack development: 5 years (Preferred)
Work Location: In person
Speak with the employer: +91 9904075544
Posted 5 days ago
0 years
7 - 12 Lacs
India
On-site
About Finalrentals
Finalrentals is transforming the global car rental landscape by enabling local car hire companies to thrive in the digital world. Operating in over 50 countries, we’re growing fast, and we’re looking for a smart, results-driven optimization expert to help us own the search space across the web, app stores, and AI ecosystems.

The Role
We’re looking for a multidisciplinary SEO, ASO & AI Search Optimization Expert to lead our organic growth efforts, not just on Google but also on the App Store, Play Store, and AI discovery tools such as ChatGPT, Gemini, Perplexity, and more. You’ll build and execute a full-funnel visibility strategy combining advanced SEO, App Store Optimization (ASO), and AI prompt optimization to ensure we are discovered on every relevant search and AI platform.

What You’ll Be Doing:
- Design and execute a holistic SEO strategy across 50+ country domains: technical SEO, content, link-building, schema, and international SEO.
- Lead App Store Optimization (ASO) for our iOS and Android apps, including keyword targeting, description and visual optimization, A/B testing, and store performance tracking.
- Optimize prompt visibility and keyword ranking on AI platforms, ensuring Finalrentals appears when users search on tools like ChatGPT, Claude, Perplexity, and AI plugin marketplaces.
- Use AI tools like ChatGPT, Jasper, SurferSEO, and SEMrush to automate content generation, keyword clustering, and metadata at scale.
- Run technical audits (crawl errors, speed, indexability) and resolve issues across mobile and web platforms.
- Monitor and report performance via Google Analytics 4, Looker Studio, App Store Connect, and Google Play Console.
- Collaborate closely with product, content, and design teams to unify efforts across web and app channels.

What We’re Looking For:
- Proven experience in SEO and ASO with measurable success (rankings, downloads, traffic, etc.).
- Understanding of how users search on AI tools and marketplaces; experience optimizing content and presence in AI-driven environments.
- Deep familiarity with keyword strategy, metadata, app previews, and user review optimization.
- Strong command of technical SEO, app store algorithms, and content automation tools.
- Experience with prompt engineering and prompt optimization for discoverability.
- Bonus: multilingual SEO or experience optimizing across regions (EMEA, APAC, Americas).

Why Join Finalrentals?
- Global footprint, local impact: a chance to shape how millions of users discover mobility solutions.
- Be on the frontier of AI + search optimization, a truly modern role.
- Autonomy to lead, test, and innovate with real resources and support.

Job Types: Full-time, Permanent
Pay: ₹60,000.00 - ₹100,000.00 per month
Schedule: Day shift, Monday to Friday
Supplemental Pay: Performance bonus, yearly bonus
Posted 5 days ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow: people with a unique combination of skill and passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level.

Job Description
Job Summary: We are seeking a highly skilled MLOps Engineer to design, deploy, and manage machine learning pipelines in Google Cloud Platform (GCP). In this role, you will be responsible for automating ML workflows, optimizing model deployment, ensuring model reliability, and implementing CI/CD pipelines for ML systems. You will work with Vertex AI, Kubernetes (GKE), BigQuery, and Terraform to build scalable and cost-efficient ML infrastructure. The ideal candidate must have a good understanding of ML algorithms and experience in model monitoring, performance optimization, Looker dashboards, and infrastructure as code (IaC), ensuring ML models are production-ready, reliable, and continuously improving. You will interact with multiple technical teams, including architects and business stakeholders, to develop state-of-the-art machine learning systems that create value for the business.

Responsibilities
- Managing the deployment and maintenance of machine learning models in production environments and ensuring seamless integration with existing systems.
- Monitoring model performance using metrics such as accuracy, precision, recall, and F1 score, and addressing issues like performance degradation, drift, or bias.
- Troubleshooting and resolving problems, maintaining documentation, and managing model versions for audit and rollback.
- Analyzing monitoring data to preemptively identify potential issues and providing regular performance reports to stakeholders.
- Optimizing queries and pipelines.
- Modernizing applications whenever required.

Qualifications
- Expertise in programming languages such as Python and SQL.
- Solid understanding of MLOps best practices and concepts for deploying enterprise-level ML systems.
- Understanding of machine learning concepts, models, and algorithms, including traditional regression, clustering models, and neural networks (including deep learning, transformers, etc.).
- Understanding of model evaluation metrics, model monitoring tools, and related practices.
- Experience with GCP MLOps tools such as BigQuery ML, Vertex AI Pipelines (Kubeflow Pipelines on GCP), Model Versioning & Registry, Cloud Monitoring, Kubernetes, etc.
- Solid oral and written communication skills and the ability to prepare detailed technical documentation for new and existing applications.
- Strong ownership and collaborative qualities; takes initiative to identify and drive opportunities for improvement and process streamlining.
- Bachelor’s degree in a quantitative field such as mathematics, computer science, physics, economics, engineering, or statistics (operations research, quantitative social science, etc.), an international equivalent, or equivalent job experience.

Bonus Qualifications
- Experience with Azure MLOps; familiarity with cloud billing.
- Experience setting up or supporting NLP, Gen AI, or LLM applications with MLOps features.
- Experience working in an Agile environment; understanding of Lean Agile principles.

Employee Type: Permanent
UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.
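As an illustration of the model-monitoring metrics the posting names (accuracy, precision, recall, F1), here is a minimal sketch, not part of the listing, that computes them directly from confusion-matrix counts; in practice a library such as scikit-learn would be used.

```python
# Illustrative sketch: compute accuracy, precision, recall, and F1 from
# binary ground-truth labels and model predictions.
def classification_metrics(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return {"accuracy": accuracy, "precision": precision, "recall": recall, "f1": f1}

# Toy example: 8 predictions, with one false positive and one false negative.
m = classification_metrics([1, 0, 1, 1, 0, 1, 0, 0],
                           [1, 0, 1, 0, 0, 1, 1, 0])
```

Tracking these numbers over time on fresh production data is how degradation and drift, mentioned in the responsibilities, are typically detected.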
Posted 5 days ago
5.0 years
0 Lacs
Haveli, Maharashtra, India
On-site
Module Lead - SQL, Snowflake
Job Date: Jun 29, 2025
Job Requisition Id: 61771
Location: Pune, IN; Indore, IN; Hyderabad, IN

YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we’re a cluster of the brightest stars working with cutting-edge technologies.
Our purpose is anchored in a single truth: bringing real positive change in an increasingly virtual world, and it drives us beyond generational gaps and disruptions of the future. We are looking to hire MS SQL professionals in the following areas:

Experience: 5-7 years

Job Summary: We are looking for a skilled SQL Server and Snowflake Developer to join our data and analytics team. The ideal candidate will have strong experience in developing and maintaining data solutions using SQL Server and Snowflake. You will play a key role in building scalable data pipelines, designing data models, and delivering business intelligence solutions.

Key Responsibilities:
- Develop and optimize complex SQL queries, stored procedures, and ETL processes in SQL Server.
- Design and implement data pipelines and models in Snowflake.
- Build and maintain SSIS packages for ETL workflows.
- Migrate and integrate data between on-premises SQL Server and the Snowflake cloud platform.
- Collaborate with business analysts and stakeholders to understand reporting needs.
- Ensure data quality, performance tuning, and error handling across all solutions.
- Maintain technical documentation and support data governance initiatives.

Required Skills & Qualifications:
- 5-7 years of experience with SQL Server (T-SQL).
- 2+ years of hands-on experience with Snowflake.
- Strong understanding of ETL/ELT processes and data warehousing principles.
- Experience with data modeling, performance tuning, and data integration.
- Familiarity with the Azure cloud platform is a plus.
- Good communication and problem-solving skills.

Preferred / Good-to-Have Skills:
- Experience with Azure Data Factory (ADF) for orchestrating data workflows.
- Experience with Power BI or other visualization tools.
- Exposure to CI/CD pipelines and DevOps practices in data environments.

Required Technical/Functional Competencies
Domain/Industry Knowledge: Basic knowledge of the customer's business processes and the relevant technology platform or product.
Able to prepare process maps, workflows, business cases, and simple business models in line with customer requirements with assistance from SMEs, and to apply industry standards/practices in implementation with guidance from experienced team members.

Requirement Gathering and Analysis: Working knowledge of requirement management and analysis processes, tools, and methodologies. Able to analyse the impact of a requested change, enhancement, or defect fix and identify dependencies or interrelationships among requirements and transition requirements for the engagement.

Product/Technology Knowledge: Working knowledge of technology product/platform standards and specifications. Able to implement code or configure/customize products, provide inputs into design and architecture adhering to industry standards/practices, analyze various frameworks/tools, review code, and provide feedback on improvement opportunities.

Architecture Tools and Frameworks: Working knowledge of industry architecture tools and frameworks. Able to identify the pros and cons of available tools and frameworks in the market, use them as per customer requirements, and explore new tools/frameworks for implementation.

Architecture Concepts and Principles: Working knowledge of architectural elements, the SDLC, and methodologies. Able to provide architectural design/documentation at an application or functional-capability level, implement architectural patterns in solutions and engagements, and communicate architecture direction to the business.

Analytics Solution Design: Knowledge of statistical and machine learning techniques such as classification, linear regression modelling, clustering, and decision trees. Able to identify the cause of errors and their potential solutions.

Tools & Platform Knowledge: Familiar with a wide range of mainstream commercial and open-source data science/analytics software tools, their constraints, advantages, disadvantages, and areas of application.
Required Behavioral Competencies
- Accountability: Takes responsibility for and ensures accuracy of own work, as well as the work and deadlines of the team.
- Collaboration: Shares information within the team, participates in team activities, and asks questions to understand other points of view.
- Agility: Demonstrates readiness for change, asking questions and determining how changes could impact own work.
- Customer Focus: Identifies trends and patterns emerging from customer preferences and works towards customizing/refining existing services to exceed customer needs and expectations.
- Communication: Targets communications for the appropriate audience, clearly articulating and presenting his/her position or decision.
- Drives Results: Sets realistic stretch goals for self and others to achieve and exceed defined goals/targets.
- Resolves Conflict: Displays sensitivity in interactions and strives to understand others’ views and concerns.

Certifications: Mandatory

At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided by technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles:
- Flexible work arrangements, free spirit, and emotional positivity
- Agile self-determination, trust, transparency, and open collaboration
- All support needed for the realization of business goals
- Stable employment with a great atmosphere and an ethical corporate culture
Posted 6 days ago
6.0 - 10.0 years
0 Lacs
karnataka
On-site
As a Data Scientist at Rockwell Automation, you will be part of an analytics team working alongside engineers, product managers, and partners to drive AI-powered features for the platform. Collaborating with process engineers and operations leads, you will be responsible for developing predictive-maintenance models, enhancing throughput and yield, and implementing cost-saving strategies across manufacturing lines. Your role will involve managing end-to-end projects, including scoping data-collection architectures, prototyping machine-learning solutions, deploying models into the IIoT platform, and establishing real-time monitoring dashboards. Additionally, you will mentor junior analysts, engage in pilot projects with R&D, and contribute to shaping the roadmap for advanced-analytics capabilities. If you enjoy tackling complex industrial challenges, translating diverse data sources into actionable business insights, and evolving as a trusted analytics partner for clients, Rockwell Automation offers an environment where you can advance both your career and your clients' success. Reporting to the Lead Sr Solution Architect, you will be based at our Electronics City office in Bengaluru, following a hybrid work model.

Your Responsibilities:
- Leading end-to-end data-science projects by defining hypotheses, designing experiments, building features, training models, and deploying them in production.
- Collaborating with Engineering to integrate ML services into the microservices architecture and with Marketing for A/B testing data-driven campaigns.
- Creating scalable ETL pipelines and designing data schemas to support analytics and modeling at scale.
- Developing monitoring dashboards and automated retraining workflows to ensure model accuracy.

Essential Qualifications:
- 6-10 years of experience in Python, SQL, Pandas, scikit-learn, and PySpark.
- Proficiency in supervised and unsupervised ML techniques, advanced statistics, computer vision, and generative-AI projects.
- Familiarity with Docker, Kubernetes, cloud ML platforms, and communicating data insights effectively.

Preferred Qualifications:
- Familiarity with BI tools, MLOps frameworks, FastAPI, and Linux environments.

What We Offer:
- Comprehensive benefits package including mindfulness programs, volunteer time off, donation matching, employee assistance programs, wellbeing initiatives, and professional development resources.
- Commitment to fostering a diverse, inclusive, and authentic workplace.

At Rockwell Automation, we value diversity and encourage candidates who are interested in the role to apply even if their experience does not align perfectly with every qualification listed. You might be the ideal fit for this position or other opportunities within the organization.
Posted 6 days ago
5.0 - 9.0 years
0 Lacs
maharashtra
On-site
As a Senior Specialist in Software Development (Artificial Intelligence) at Accelya, you will lead the design, development, and implementation of AI and machine learning solutions to tackle complex business challenges. Your expertise in AI algorithms, model development, and software engineering best practices will be crucial in working with cross-functional teams to deliver intelligent systems that optimize business operations and decision-making.

Your responsibilities will include designing and developing AI-driven applications and platforms using machine learning, deep learning, and NLP techniques. You will lead the implementation of advanced algorithms for supervised and unsupervised learning, reinforcement learning, and computer vision. Additionally, you will develop scalable AI models, integrate them into software applications, and build APIs and microservices for deployment in cloud environments or on-premise systems.

Collaboration with data scientists and data engineers will be essential in gathering, preprocessing, and analyzing large datasets. You will also implement feature engineering techniques to enhance the accuracy and performance of machine learning models. Regular evaluation of AI models using performance metrics and fine-tuning them for optimal accuracy will be part of your role.

Furthermore, you will collaborate with business stakeholders to identify AI adoption opportunities, provide technical leadership and mentorship to junior team members, and stay updated with the latest AI trends and research to introduce innovative techniques to the team. Ensuring ethical compliance, security, and continuous improvement of AI systems will also be key aspects of your role.

You should hold a Bachelor's degree in Computer Science, Data Science, Artificial Intelligence, or a related field, along with at least 5 years of experience in software development focusing on AI and machine learning.
Proficiency in AI frameworks and libraries, programming languages such as Python, R, or Java, and cloud platforms for deploying AI models is required. Familiarity with Agile methodologies, data structures, and databases is essential. Preferred qualifications include a Master's or PhD in Artificial Intelligence or Machine Learning, experience with NLP techniques and computer vision technologies, and certifications in AI/ML or cloud platforms.

Accelya is looking for individuals who are passionate about shaping the future of the air transport industry through innovative AI solutions. If you are ready to contribute your expertise and drive continuous improvement in AI systems, this role offers you the opportunity to make a significant impact in the industry.
Posted 6 days ago
3.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
Software Engineer - Dev Python AI/ML (REQ/0749)
Job Id: REQ/0749
Location: Chennai
Experience: 3 to 8 years
CTC: 10L to 18L
Date Posted: 15-Jul-2025
Employment Type: Permanent
No. of Openings: 6

We are looking for an ML Engineer to design and implement AI/ML models using Python, TensorFlow, PyTorch, and Scikit-learn; optimise models, run experiments, transform data via classification/clustering, and stay updated on the latest AI/ML advancements.

Desired Candidate Profile
- 3 years of experience in software design and development in Python
- Makes pragmatic technical decisions beyond immediate scope
- Strong in debugging complex issues and mentoring junior engineers
- Solid understanding of data structures and OOP
- Proficient in TDD and unit and integration testing
- Experience with databases, statistics, and data science
- Skilled in Python; can write robust, testable code
- Hands-on with ML frameworks: Keras, PyTorch, scikit-learn
- AutoML experience is a plus
- Familiar with AI cloud platforms: H2O, DataRobot, AWS, Azure

Education/Specific Knowledge: Bachelor's degree or above in any discipline

Key Skills: Python, AI/ML, Keras, PyTorch, scikit-learn, H2O, DataRobot, AWS, Azure, FastAPI/Flask, MySQL or Oracle or PostgreSQL, XML, Unit Testing

Highlights: To know the benefits of Sysvine, please visit the bottom of this page. We are open to considering candidates who are on a long break but are still passionate about restarting their careers.
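As a small illustration of the clustering step the role description mentions ("transform data via classification/clustering"), here is a minimal 1-D k-means sketch, not part of the listing, written in plain Python; in practice scikit-learn's KMeans would be used.

```python
# Illustrative sketch: Lloyd's algorithm for k-means on 1-D data.
def kmeans_1d(points, centroids, iterations=10):
    for _ in range(iterations):
        # Assign each point to its nearest centroid.
        clusters = [[] for _ in centroids]
        for x in points:
            idx = min(range(len(centroids)), key=lambda i: abs(x - centroids[i]))
            clusters[idx].append(x)
        # Move each centroid to the mean of its assigned points.
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids

# Two obvious groups around 1.0 and 9.5; start from rough guesses.
centers = kmeans_1d([1.0, 1.2, 0.8, 9.0, 9.5, 10.0], [0.0, 5.0])
```

The same assign-then-update loop generalizes to higher dimensions by replacing the absolute difference with a Euclidean distance.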
Our Benefits

India:
- Annual Team Trips
- Happy Fridays
- GameVine - Annual Games
- AimVine - Annual Party
- Social Responsibilities - Tree Planting, Volunteering for Orphans, Gadget Donations, Blood Donation Camps, Flood Relief Support, Cyclone Relief Support
- Health Campaigns
- Birthday Celebrations
- First Aid & Fire Safety Training
- Guest Speakers

Benefits:
- Accidental Insurance
- Family Health Insurance
- Parental Health Insurance
- Sick Leave
- Casual Leave
- Privilege Leave
- Floating Leave
- Holidays
- Short Term Disability Insurance
- Long Term Disability Insurance
- Employee Referral Bonus
- Product Referral Bonus
- Sodexo Passes
- Remote Working
- Flexible Working Hours
- Maternity Benefit
- Leave Encashment
- Tuition Reimbursement

Niceties:
- Welcome Kit
- MacBook Pro
- iPhones and Android Phones for Mobile Departments
- Coffee and Biscuits
- Recreation Room
- Resting Room
- Fitness Programmes and Equipment
- International Traditional Day
- Personal Tax Management Sessions
- Shuttle Services from/to Train
- Big Monitor

Recognition:
- Performance Bonus
- Extra Mile Recognition (EMR)
- Annual Achievement Awards
- Special Bonuses
- Overseas Deputations
- Leadership Training Programs
- Technical Conferences
- Engaging Ethical Diverse Team Lunches
- D-Day (Difficult Day Policy)
- I-Day (Inconvenient Day Policy)
- Personal Financial Management Sessions
- Tax Saving Sessions
- Guest Speakers

Benefits:
- Health Insurance
- Unemployment Insurance
- Paid Time Off
- Floating Leaves
- 8 Holidays
- Short Term Disability Insurance
- Workmen Compensation
- Employee Referral Bonus
- Product Referral Bonus
- CalSavers
- Tuition Reimbursement

Recognition:
- Performance Bonus
- Extra Mile Recognition (EMR)
- Annual Achievement Awards
- Special Bonuses
- Technical Conferences
Posted 6 days ago
6.0 - 12.0 years
14 - 19 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
1. Business expansion strategy that leverages enriched location data to identify profitable locations.
2. Optimization of health claims by streamlining the claims process using data and analytics.
3. NPS improvement using the customer loyalty program as well as claims-settlement data and analytics.
4. Identification of fraud/nexus through a claims-optimization and fraud-prevention strategy.
Posted 6 days ago
40.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Jubilant Bhartia Group
Jubilant Bhartia Group is a global conglomerate founded by Mr. Shyam S Bhartia and Mr. Hari S Bhartia with a strong presence in diverse sectors such as Pharmaceuticals, Contract Research and Development Services, Proprietary Novel Drugs, Life Science Ingredients, Agri Products, Performance Polymers, Food Service (QSR), Food, Auto, Consulting in Aerospace, and Oilfield Services. Jubilant Bhartia Group has four flagship companies: Jubilant Pharmova Limited, Jubilant Ingrevia Limited, Jubilant FoodWorks Limited, and Jubilant Industries Limited. Currently the group has a global workforce of around 43,000 employees.

About Jubilant Ingrevia Limited
Jubilant Ingrevia is a global integrated Life Science Products & Innovative Solutions provider serving Pharmaceutical, Agrochemical, Nutrition, Consumer, and Industrial customers with customised products and solutions that are innovative, cost-effective, and conforming to premium quality standards. Ingrevia is born out of a union of “Ingre”, denoting Ingredients, and “vie”, French for Life (i.e., Ingredients for Life).

Jubilant Ingrevia’s history goes back to 1978 with the incorporation of VAM Organics Limited, which later became Jubilant Organosys, then Jubilant Life Sciences, and has now been demerged into an independent entity, Jubilant Ingrevia Limited, which is listed on both of India’s stock exchanges. Over the years, the company has developed global capacities and leadership in its chosen business segments. We have more than 40 years of experience in Life Science Chemicals, 30+ years of experience in Pyridine Chemistry and value-added Specialty Chemicals, and 20+ years of experience in Vitamin B3, B4, and other Nutraceutical products. We have strategically segmented our business into the three business segments explained below, and we are rapidly growing revenue in all three.
Speciality Chemicals Segment: We propose to launch a new platform of Diketene and its value-added derivatives and to forward-integrate our crop protection chemicals into value-added agrochemicals (herbicides, fungicides, and insecticides) by adding new facilities. We are an established ‘partner of choice’ in CDMO, with further investment plans in GMP and non-GMP multi-product facilities for Pharma and Crop Protection customers.

Nutrition & Health Solutions Segment: We propose to expand the existing capacity of Vitamin B3 to remain one of the market leaders and to introduce new branded animal and human nutrition and health premixes.

Chemical Intermediates Segment: We propose to expand our existing acetic anhydride capacity, add value-added anhydrides and aldehydes, and enhance volumes in speciality ethanol.

We have 5 world-class manufacturing facilities: one in Uttar Pradesh at Gajraula, two in Gujarat at Bharuch and Baroda, and two in Maharashtra at Nira and Ambernath. We operate 61 plants across these 5 sites, giving us a multi-plant, multi-location advantage. Find out more about us at www.jubilantingrevia.com

The Position
Organization: Jubilant Ingrevia Limited
Designation: Data Scientist
Location: Noida

Job Summary: Plays a crucial role in helping businesses make informed decisions by leveraging data; will collaborate with stakeholders, design data models, create algorithms, and share meaningful insights to drive business success.

Key Responsibilities:
- Work with the supply chain, manufacturing, sales, customer account management, and quality functions to produce algorithms.
- Gather and interpret data from various sources.
- Clean and verify the accuracy of data sets to ensure data integrity.
- Develop and implement data collection systems and strategies to optimize efficiency and accuracy.
- Apply statistical techniques to analyze and interpret complex data sets.
- Develop and implement statistical models for predictive analysis.
- Build and deploy machine learning models to solve business problems.
- Create visual representations of data through charts, graphs, and dashboards to communicate findings effectively.
- Develop dashboards and reports for ongoing monitoring and analysis.
- Create, modify, and improve complex manufacturing schedules.
- Create scenario-planning models for manufacturing and develop a manufacturing schedule-adherence probability model.
- Regularly monitor and evaluate data quality, making recommendations for improvements as necessary and ensuring compliance with data privacy and security regulations.

Person Profile
Qualification: B.E/M.Sc in Maths/Statistics
Experience: 2-5 years

Desired Skills & Must-Haves:
- 2-5 years of relevant experience in the chemical/manufacturing industry.
- Hands-on Generative AI; exposure to Agentic AI.
- Proficiency in data analysis tools such as Microsoft Excel, SQL, and statistical software (e.g., R or Python).
- Proficiency in programming languages such as Python or R.
- Expertise in statistical analysis, machine learning algorithms, and data manipulation.
- Strong analytical and problem-solving skills with the ability to handle complex data sets.
- Excellent attention to detail and a high level of accuracy in data analysis.
- Solid knowledge of data visualization techniques and experience using visualization tools like Tableau or Power BI.
- Strong communication skills to present findings and insights to non-technical stakeholders effectively.
- Knowledge of statistical methodologies and techniques, including regression analysis, clustering, and hypothesis testing.
- Familiarity with data modeling and database management concepts.
- Experience in manipulating and cleansing large data sets.
- Ability to work collaboratively in a team environment and adapt to changing priorities.
- Experience with big data technologies (e.g., Hadoop, Spark).
- Knowledge of cloud platforms (e.g., AWS, Azure, Google Cloud).
- Familiarity with data engineering and database technologies.
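To illustrate the regression analysis the skills list mentions, here is a minimal sketch, not part of the posting, of ordinary least-squares fitting for a single predictor, written in plain Python; in practice R or a Python library such as statsmodels or scikit-learn would be used.

```python
# Illustrative sketch: closed-form ordinary least squares for y ~ slope*x + intercept.
def fit_line(xs, ys):
    """Return (slope, intercept) minimising the sum of squared residuals."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # slope = covariance(x, y) / variance(x); intercept from the means.
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Points lying exactly on y = 2x + 1, so the fit recovers those coefficients.
slope, intercept = fit_line([1, 2, 3, 4], [3, 5, 7, 9])
```

The same covariance-over-variance formula is what library implementations compute for the one-predictor case.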
Jubilant is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, colour, gender identity or expression, genetic information, marital status, medical condition, national origin, political affiliation, race, ethnicity, religion or any other characteristic protected by applicable local laws, regulations and ordinances.
Posted 6 days ago
5.0 years
0 Lacs
Greater Nashik Area
On-site
Dreaming big is in our DNA. It’s who we are as a company. It’s our culture. It’s our heritage. And more than ever, it’s our future. A future where we’re always looking forward. Always serving up new ways to meet life’s moments. A future where we keep dreaming bigger. We look for people with passion, talent, and curiosity, and provide them with the teammates, resources and opportunities to unleash their full potential. The power we create together – when we combine your strengths with ours – is unstoppable. Are you ready to join a team that dreams as big as you do?

AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming Operations through Tech and Analytics. Do You Dream Big? We Need You.

Job Description
Job Title: Senior Data Scientist
Location: Bangalore
Reporting to: Senior Manager Analytics

Purpose of the role
We seek a highly skilled Senior Machine Learning Engineer / Senior Data Scientist to design, develop, and deploy advanced machine learning models and systems. The ideal candidate will have deep expertise in machine learning algorithms, data processing, and model deployment, with a proven track record of delivering scalable AI solutions in production environments. This role requires strong technical leadership, collaboration with cross-functional teams, and a passion for solving complex problems.

Key tasks & accountabilities
Model Development: Design, develop, and optimize machine learning models for various applications, including but not limited to natural language processing, computer vision, and predictive analytics.
Data Pipeline Management: Build and maintain robust data pipelines for preprocessing, feature engineering, and data augmentation to support model training and evaluation.
Model Deployment: Deploy machine learning models into production environments, ensuring scalability, reliability, and performance using tools like Docker, Kubernetes, or cloud platforms (preferably Azure).
Research and Innovation: Stay updated on the latest advancements in machine learning and AI, incorporating state-of-the-art techniques into projects to improve performance and efficiency.
Collaboration: Work closely with data scientists, software engineers, product managers, and other stakeholders to translate business requirements into technical solutions.
Performance Optimization: Monitor and optimize model performance, addressing issues like model drift, bias, and scalability challenges.
Code Quality: Write clean, maintainable, and well-documented code, adhering to best practices for software development and version control (e.g., Git).
Mentorship: Provide technical guidance and mentorship to junior engineers, fostering a culture of learning and innovation within the team.

Qualifications, Experience, Skills
Level of Educational Attainment Required
Bachelor’s or Master’s degree in Computer Science, Data Science, Machine Learning, or a related field. PhD is a plus.

Previous Work Experience
5+ years of experience in machine learning, data science, or a related field.
Proven experience in designing, training, and deploying machine learning models in production.
Hands-on experience with cloud platforms (AWS, GCP, Azure) and containerization technologies (Docker, Kubernetes).

Technical Skills Required
Proficiency in Python and libraries/frameworks such as TensorFlow, PyTorch, Scikit-learn, or Hugging Face.
Strong understanding of machine learning algorithms (e.g., regression, classification, clustering, deep learning, reinforcement learning, optimization).
Experience with big data technologies (e.g., Hadoop, Spark, or similar) and data processing pipelines.
Familiarity with MLOps practices, including model versioning, monitoring, and CI/CD for ML workflows.
Knowledge of software engineering principles, including object-oriented programming, API development, and microservices architecture.

Other Skills Required
Strong problem-solving and analytical skills.
Excellent communication and collaboration abilities.
Ability to work in a fast-paced, dynamic environment and manage multiple priorities.
Experience with generative AI models or large language models (LLMs).
Familiarity with distributed computing or high-performance computing environments.

And above all of this, an undying love for beer! We dream big to create a future with more cheers.
Posted 6 days ago
5.0 - 8.0 years
0 Lacs
Greater Nashik Area
On-site
Job Description
Job Title: Senior Data Scientist
Location: Bangalore
Reporting to: Senior Manager

Purpose of the role
This role sits at the intersection of data science and revenue growth strategy, focused on developing advanced analytical solutions to optimize pricing, trade promotions, and product mix. The candidate will lead the end-to-end design, deployment, and automation of machine learning models and statistical frameworks that support commercial decision-making, predictive scenario planning, and real-time performance tracking. By leveraging internal and external data sources—including transactional, market, and customer-level data—this role will deliver insights into price elasticity, promotional lift, channel efficiency, and category dynamics. The goal is to drive measurable improvements in gross margin, ROI on trade spend, and volume growth through data-informed strategies.
Key tasks & accountabilities
Design and implement price elasticity models using linear regression, log-log models, and hierarchical Bayesian frameworks to understand consumer response to pricing changes across channels and segments.
Build uplift models (e.g., linear regression, XGBoost for treatment effects) to evaluate promotional effectiveness and isolate true incremental sales vs. base volume.
Develop demand forecasting models using ARIMA, SARIMAX, and Prophet, integrating external factors such as seasonality, promotions, and competitor activity; use time-series clustering and k-means segmentation to group SKUs, customers, and geographies for targeted pricing and promotion strategies.
Construct assortment optimization models using conjoint analysis, choice modeling, and market basket analysis to support category planning and shelf optimization.
Use Monte Carlo simulations and what-if scenario modeling to assess revenue impact under varying pricing, promo, and mix conditions.
Conduct hypothesis testing (t-tests, ANOVA, chi-square) to evaluate the statistical significance of pricing and promotional changes.
Create LTV (lifetime value) and customer churn models to prioritize trade investment decisions and drive customer retention strategies.
Integrate Nielsen, IRI, and internal POS data to build unified datasets for modeling and advanced analytics in SQL, Python (pandas, statsmodels, scikit-learn), and Azure Databricks environments.
Automate reporting processes and real-time dashboards for price pack architecture (PPA), promotion performance tracking, and margin simulation using advanced Excel and Python.
Lead post-event analytics using pre/post experimental designs, including difference-in-differences (DiD) methods, to evaluate business interventions.
Collaborate with Revenue Management, Finance, and Sales leaders to convert insights into pricing corridors, discount policies, and promotional guardrails.
Translate complex statistical outputs into clear, executive-ready insights with actionable recommendations for business impact.
Continuously refine model performance through feature engineering, model validation, and hyperparameter tuning to ensure accuracy and scalability.
Provide mentorship to junior analysts, enhancing their skills in modeling, statistics, and commercial storytelling.
Maintain documentation of model assumptions, business rules, and statistical parameters to ensure transparency and reproducibility.

Other Competencies Required
Presentation Skills: Effectively presenting findings and insights to stakeholders and senior leadership to drive informed decision-making.
Collaboration: Working closely with cross-functional teams, including marketing, sales, and product development, to implement insights-driven strategies.
Continuous Improvement: Actively seeking opportunities to enhance reporting processes and insights generation to maintain relevance and impact in a dynamic market environment.
Data Scope Management: Managing the scope of data analysis, ensuring it aligns with business objectives and insights goals.
Act as a steadfast advisor to leadership, offering expert guidance on harnessing data to drive business outcomes and optimize customer experience initiatives.
Serve as a catalyst for change by advocating for data-driven decision-making and cultivating a culture of continuous improvement rooted in insights gleaned from analysis.
Continuously evaluate and refine reporting processes to ensure the delivery of timely, relevant, and impactful insights to leadership stakeholders, while fostering an environment of ownership, collaboration, and mentorship within the team.

Business Environment
Main characteristics:
Work closely with Zone Revenue Management teams.
Work in a fast-paced environment.
Provide proactive communication to stakeholders.
This is an offshore role and requires comfort with working in a virtual environment.
GCC is referred to as the offshore location. The role requires working collaboratively with Zone/country business heads and GCC commercial teams, summarizing insights and recommendations to be presented back to the business, and continuously improving, automating, and optimizing the process.
Geographical Scope: Europe

Qualifications, Experience, Skills
Level of educational attainment required:
Bachelor's or postgraduate degree in Business & Marketing, Engineering/Solution, or another equivalent degree, or equivalent work experience; MBA/Engineering in a relevant technical field such as Marketing/Finance.
Extensive experience solving business problems using quantitative approaches.
Comfort with extracting, manipulating, and analyzing complex, high-volume, high-dimensionality data from varying sources.

Previous Work Experience Required
5-8 years of experience in the Retail/CPG domain.

Technical Skills Required
Data Manipulation & Analysis: Advanced proficiency in SQL, Python (pandas, NumPy), and Excel for structured data processing.
Data Visualization: Expertise in Power BI and Tableau for building interactive dashboards and performance tracking tools.
Modeling & Analytics: Hands-on experience with regression analysis, time series forecasting, and ML models using scikit-learn or XGBoost.
Data Engineering Fundamentals: Knowledge of data pipelines, ETL processes, and integration of internal/external datasets for analytical readiness.
Proficient in Python (pandas, scikit-learn, statsmodels), SQL, and Power BI.
Skilled in regression, Bayesian modeling, uplift modeling, time-series forecasting (ARIMA, SARIMAX, Prophet), and clustering (k-means).
Strong grasp of hypothesis testing, model validation, and scenario simulation.

And above all of this, an undying love for beer! We dream big to create a future with more cheers.
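As a rough illustration of the log-log price elasticity modeling this posting describes: in a log-log specification, the slope of ln(quantity) regressed on ln(price) is the elasticity itself. A minimal stdlib-only sketch, with synthetic data (real work would fit this with statsmodels or scikit-learn on panel data, with controls for promotion and seasonality):

```python
import math
import statistics

def price_elasticity(prices, quantities):
    """Log-log elasticity: OLS slope of ln(quantity) on ln(price).
    A slope of -2.0 means a 1% price increase cuts volume by ~2%."""
    lp = [math.log(p) for p in prices]
    lq = [math.log(q) for q in quantities]
    mp, mq = statistics.fmean(lp), statistics.fmean(lq)
    cov = sum((x - mp) * (y - mq) for x, y in zip(lp, lq))
    var = sum((x - mp) ** 2 for x in lp)
    return cov / var  # OLS slope = cov(x, y) / var(x)

# Synthetic demand curve generated with a true elasticity of -1.5
prices = [1.0, 1.1, 1.2, 1.3, 1.5]
quantities = [100 * p ** -1.5 for p in prices]
elasticity = price_elasticity(prices, quantities)  # ≈ -1.5
```

The hierarchical Bayesian variant mentioned above generalizes this by pooling elasticity estimates across channels and segments instead of fitting each in isolation.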
Posted 6 days ago
1.0 years
0 Lacs
Greater Nashik Area
On-site
Job Description
Job Title: Junior Data Scientist
Location: Bangalore
Reporting to: Senior Manager – Analytics

Purpose of the role
The Global GenAI Team at Anheuser-Busch InBev (AB InBev) is tasked with constructing competitive solutions utilizing GenAI techniques. These solutions aim to extract contextual insights and meaningful information from our enterprise data assets. The derived data-driven insights play a pivotal role in empowering our business users to make well-informed decisions regarding their respective products.

In the role of a Machine Learning Engineer (MLE), you will operate at the intersection of:
LLM-based frameworks, tools, and technologies
Cloud-native technologies and solutions
Microservices-based software architecture and design patterns

As an additional responsibility, you will be involved in the complete development cycle of new product features, encompassing tasks such as the development and deployment of new models integrated into production systems.
Furthermore, you will have the opportunity to critically assess and influence the product engineering, design, architecture, and technology stack across multiple products, extending beyond your immediate focus.

Key tasks & accountabilities
Large Language Models (LLM): Experience with LangChain and LangGraph; proficiency in building agentic patterns like ReAct, ReWOO, and LLMCompiler.
Multi-modal Retrieval-Augmented Generation (RAG): Expertise in multi-modal AI systems (text, images, audio, video); designing and optimizing chunking strategies and clustering for large data processing.
Streaming & Real-time Processing: Experience with audio/video streaming and real-time data pipelines; low-latency inference and deployment architectures.
NL2SQL: Natural language-driven SQL generation for databases; experience with natural language interfaces to databases and query optimization.
API Development: Building scalable APIs with FastAPI for AI model serving.
Containerization & Orchestration: Proficiency with Docker for containerized AI services; experience with orchestration tools for deploying and managing services.
Data Processing & Pipelines: Experience with chunking strategies for efficient document processing; building data pipelines to handle large-scale data for AI model training and inference.
AI Frameworks & Tools: Experience with AI/ML frameworks like TensorFlow and PyTorch; proficiency in LangChain, LangGraph, and other LLM-related technologies.
Prompt Engineering: Expertise in advanced prompting techniques like Chain-of-Thought (CoT) prompting, LLM-as-judge, and self-reflection prompting; experience with prompt compression and optimization using tools like LLMLingua, AdalFlow, TextGrad, and DSPy; strong understanding of context window management and optimizing prompts for performance and efficiency.

Qualifications, Experience, Skills
Level of educational attainment required (1 or more of the following):
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
Previous Work Experience Required
Proven experience of 1+ years in developing and deploying applications utilizing Azure OpenAI and Redis as a vector database.

Technical Skills Required
Solid understanding of language model technologies, including LangChain, the OpenAI Python SDK, LlamaIndex, Ollama, etc.
Proficiency in implementing and optimizing machine learning models for natural language processing.
Experience with observability tools such as MLflow, LangSmith, Langfuse, Weights & Biases, etc.
Strong programming skills in languages such as Python and proficiency in relevant frameworks.
Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes).

And above all of this, an undying love for beer! We dream big to create a future with more cheers.
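The chunking strategies this posting mentions for RAG and document processing usually start from a fixed-size, overlapping baseline before moving to semantic or structure-aware splitting. A minimal sketch (the sizes are arbitrary; production systems typically split on token counts rather than characters):

```python
def chunk_text(text, chunk_size=200, overlap=50):
    """Fixed-size chunking with overlap, a common RAG baseline:
    consecutive chunks share `overlap` characters, so a sentence that
    straddles a boundary still appears whole in at least one chunk."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    step = chunk_size - overlap
    return [text[i:i + chunk_size] for i in range(0, len(text), step)]

# 500 characters with step 150 -> chunks starting at 0, 150, 300, 450
chunks = chunk_text("x" * 500, chunk_size=200, overlap=50)  # 4 chunks
```

Each chunk would then be embedded and stored in the vector database (Redis, in this posting's stack) for retrieval.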
Posted 6 days ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow—people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level.

Job Description
We are looking for an experienced and motivated Senior GCP Data Engineer to join our dynamic data team. In this role, you will be responsible for designing, building, and optimizing data pipelines, implementing advanced analytics solutions, and maintaining robust data infrastructure using Google Cloud Platform (GCP) services. You will play a key role in enabling data-driven decision-making and enhancing the performance and scalability of our data ecosystem.

Key Responsibilities
Design, implement, and optimize data pipelines using GCP services, including Compute Engine, BigQuery, Cloud Pub/Sub, Dataflow, Cloud Storage, and AlloyDB.
Lead the design and optimization of schemas for large-scale data systems, ensuring data consistency, integrity, and scalability.
Work closely with cross-functional teams to understand data requirements and deliver efficient, high-performance solutions.
Design and execute complex SQL queries for BigQuery and other databases, ensuring optimal performance and efficiency.
Implement efficient data processing workflows and streaming data solutions using Cloud Pub/Sub and Dataflow.
Develop and maintain data models, schemas, and data marts to ensure consistency and scalability across datasets.
Ensure the scalability, reliability, and security of cloud-based data architectures.
Optimize cloud storage, compute, and query performance, driving cost-effective solutions.
Collaborate with data scientists, analysts, and software engineers to create actionable insights and drive business outcomes.
Implement best practices for data management, including governance, quality, and monitoring of data pipelines.
Provide mentorship and guidance to junior data engineers and collaborate with them to achieve team goals.

Required Qualifications
Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent work experience).
5+ years of experience in data engineering, with a strong focus on Google Cloud Platform (GCP).
Extensive hands-on experience with GCP Compute Engine, BigQuery, Cloud Pub/Sub, Dataflow, Cloud Storage, and AlloyDB.
Strong expertise in SQL for query optimization and performance tuning in large-scale datasets.
Solid experience in designing data schemas, data pipelines, and ETL processes.
Strong understanding of data modeling techniques, and experience with schema design for both transactional and analytical systems.
Proven experience optimizing BigQuery performance, including partitioning, clustering, and cost optimization strategies.
Experience with managing and processing streaming data and batch data processing workflows.
Knowledge of AlloyDB for managing transactional databases in the cloud and integrating them into data pipelines.
Familiarity with data security, governance, and compliance best practices on GCP.
Excellent problem-solving skills, with the ability to troubleshoot complex data issues and find efficient solutions.
Strong communication and collaboration skills, with the ability to work with both technical and non-technical stakeholders.

Preferred Qualifications
Bachelor's/Master's degree in Computer Science, Data Engineering, or a related field.
Familiarity with infrastructure-as-code tools like Terraform or Cloud Deployment Manager.
GCP certifications (e.g., Google Cloud Professional Data Engineer or Cloud Architect). Employee Type Permanent UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.
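The BigQuery partitioning and clustering called out in this posting is usually the first lever for cost optimization: partition pruning limits the bytes a query scans, and clustering sorts rows within each partition so selective filters read less data. A sketch that just assembles the corresponding DDL string (the dataset, table, and column names are hypothetical):

```python
def partitioned_table_ddl(table, partition_col, cluster_cols):
    """Build a BigQuery DDL statement for a date-partitioned,
    clustered table. Column definitions here are illustrative."""
    return (
        f"CREATE TABLE `{table}` (\n"
        f"  event_date DATE,\n"
        f"  customer_id STRING,\n"
        f"  amount NUMERIC\n"
        f")\n"
        f"PARTITION BY {partition_col}\n"          # prune by date filter
        f"CLUSTER BY {', '.join(cluster_cols)}"    # sort within partitions
    )

ddl = partitioned_table_ddl("analytics.orders", "event_date", ["customer_id"])
```

Queries that filter on `event_date` then scan only the matching partitions, which is what drives the cost savings.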
Posted 6 days ago
0.0 - 5.0 years
0 Lacs
Ahmedabad, Gujarat
On-site
We are hiring a Senior Software Development Engineer for our platform. We are helping enterprises and service providers build their AI inference platforms for end users. As a Senior Software Engineer, you will take ownership of backend-heavy, full-stack feature development—building robust services, scalable APIs, and intuitive frontends that power the user experience. You’ll contribute to the core of our enterprise-grade AI platform, collaborating across teams to ensure our systems are performant, secure, and built to last. This is a high-impact, high-visibility role working at the intersection of AI infrastructure, enterprise software, and developer experience.

Responsibilities:
Design, develop and maintain databases, system APIs, system integrations, machine learning pipelines and web user interfaces.
Scale algorithms designed by data scientists for deployment in high-performance environments.
Develop and maintain continuous integration pipelines to deploy the systems.
Design and implement scalable backend systems using Go, C++, and Python.
Model and manage data using relational databases (e.g., PostgreSQL, MySQL).
Build frontend components and interfaces using TypeScript and JavaScript when needed.
Participate in system architecture discussions and contribute to design decisions.
Write clean, idiomatic, and well-documented Go code following best practices and design patterns.
Ensure high code quality through unit testing, automation, code reviews, and documentation.
Communicate technical concepts clearly to both technical and non-technical stakeholders.

Qualifications and Criteria:
5–10 years of professional software engineering experience building enterprise-grade platforms.
Deep proficiency in Go (Golang), with real-world experience building production-grade systems.
Solid knowledge of software architecture, design patterns, and clean code principles.
Experience in high-level system design and building distributed systems.
Expertise in Python and backend development, with experience in PostgreSQL or similar databases.
Hands-on experience with unit testing, integration testing, and TDD in Go.
Strong debugging, profiling, and performance optimization skills.
Excellent communication and collaboration skills.
Hands-on experience with frontend development using JavaScript, TypeScript, and HTML/CSS.
Bachelor's degree or equivalent experience in a quantitative field (Computer Science, Statistics, Applied Mathematics, Engineering, etc.).

Skills:
Understanding of optimisation, predictive modelling, machine learning, clustering and classification techniques, and algorithms.
Fluency in a programming language (e.g. C++, Go, Python, JavaScript, TypeScript, SQL).
Docker, Kubernetes, and Linux knowledge are an advantage.
Experience using Git and knowledge of continuous integration (e.g. GitLab/GitHub).
Basic familiarity with relational databases, preferably PostgreSQL.
Strong grounding in applied mathematics.
A firm understanding of and experience with the engineering approach.
Ability to interact with other team members via code and design documents.
Ability to work on multiple tasks simultaneously.
Ability to work in high-pressure environments and meet deadlines.

Compensation: Commensurate with experience
Position Type: Full-time (in-house)
Location: Ahmedabad / Jamnagar, Gujarat, India

Submission Requirements: CV and all academic transcripts. Submit to chintanit22@gmail.com or dipakberait@gmail.com with the name of the position you wish to apply for in the subject line.

Job Type: Full-time
Pay: From ₹40,000.00 per month
Benefits: Paid sick time
Location Type: In-person
Schedule: Day shift, Monday to Friday
Experience: Full-stack development: 5 years (Preferred)
Work Location: In person
Speak with the employer: +91 9904075544
Posted 6 days ago
4.0 - 6.0 years
7 - 10 Lacs
Delhi
On-site
Job Title: Database Administrator (Contractual)
Location: Delhi
Salary Range: ₹60,000 – ₹90,000 per month
Experience: 4 to 6 years
Job Type: Contractual – 1 year (extendable based on performance and project requirements)

Job Summary:
We are looking for a skilled and experienced Database Administrator (DBA) for a contractual position of 1 year, based in Delhi. The role may be extended depending on performance and organizational needs. The ideal candidate will have 4 to 6 years of experience in MS-SQL Server administration and must be capable of managing critical databases in standalone and clustered environments. The candidate will also handle open-source and NoSQL databases, ensuring performance and availability on a 24x7 basis.

Key Responsibilities:
Install, configure, upgrade, monitor, and manage multiple MS-SQL Server instances (2014 to 2022) in both standalone and clustered environments.
Execute patching, replication, log shipping, and database migrations.
Administer MS-SQL Server databases, including structure documentation, operational guidelines, and security.
Design and implement High Availability (HA) and Disaster Recovery (DR) solutions including clustering and SCP.
Monitor system performance and ensure stability and capacity using tools like MS Performance Monitor.
Design physical database layers with features such as partitioning.
Support project teams with guidance on database management, SQL optimization, and performance tuning.
Review developer-written database procedures and oversee deployment of database objects, ensuring backups are in place.
Manage user access and rights across all supported database environments.
Install and manage open-source and NoSQL databases as required.
Ensure compliance with Service Level Requirements (SLRs) and maintain system uptime and performance in a 24x7 environment.
Collaborate with cross-functional teams to build, deliver, and support database solutions aligned to business goals.
Required Skills & Qualifications:
Bachelor's degree in Computer Science, IT, or a related field.
4 to 6 years of hands-on experience in MS-SQL Server database administration.
Proficient with MS-SQL Server 2014/2016/2019/2022.
In-depth understanding of HA/DR, performance tuning, and database security.
Experience with NoSQL/open-source databases is an advantage.
Strong analytical, problem-solving, and troubleshooting skills.
Willingness to work in a contractual role with potential for extension.

Preferred Qualifications:
Microsoft certification in database administration (e.g., Azure Database Administrator Associate).
Experience with cloud-based databases (AWS, Azure, GCP).
Familiarity with monitoring tools and database automation.

Job Type: Full-time
Pay: ₹60,000.00 – ₹90,000.00 per month
Schedule: Day shift
Posted 6 days ago
5.0 years
4 - 7 Lacs
Chennai
On-site
Overview
Our analysts transform data into meaningful insights that drive strategic decision-making. They analyze trends, interpret data, and discover opportunities. Working cross-functionally, they craft narratives from the numbers, directly contributing to our success. Their work influences key business decisions and shapes the direction of Comcast.

Success Profile
What makes a successful Data Analyst 3 at Comcast? Check out these top traits and explore role-specific skills in the job description below: good listener, problem solver, organized, collaborative, perceptive, analytical.

Benefits
We’re proud to offer comprehensive benefits to help support you physically, financially and emotionally through the big milestones and in your everyday life.
Paid Time Off: We know how important it can be to spend time away from work to relax, recover from illness, or take time to care for others’ needs.
Physical Wellbeing: We offer a range of benefits and support programs to ensure that you and your loved ones get the care you need.
Financial Wellbeing: These benefits give you personalized support designed entirely around your unique needs today and for the future.
Emotional Wellbeing: No matter how you’re feeling or what you’re dealing with, there are benefits to help when you need it, in the way that works for you.
Life Events + Family Support: Benefits that support you no matter where you are in life’s journey.

Data Analyst 3
Location: Chennai, India
Req ID: R401060
Job Type: Full Time
Category: Analytics
Date posted: 07/25/2025

Comcast brings together the best in media and technology. We drive innovation to create the world's best entertainment and online experiences. As a Fortune 50 leader, we set the pace in a variety of innovative and fascinating businesses and create career opportunities across a wide range of locations and disciplines.
We are at the forefront of change and move at an amazing pace, thanks to our remarkable people, who bring cutting-edge products and services to life for millions of customers every day. If you share in our passion for teamwork, our vision to revolutionize industries and our goal to lead the future in media and technology, we want you to fast-forward your career at Comcast.

Job Summary
Responsible for working cross-functionally to collect data and develop models to determine trends utilizing a variety of data sources. Retrieves, analyzes and summarizes business, operations, employee, customer and/or economic data in order to develop business intelligence, optimize effectiveness, predict business outcomes and support decision-making. Involved with numerous key business decisions by conducting the analyses that inform our business strategy. This may include: impact measurement of new products or features via normalization techniques; optimization of business processes through robust A/B testing; clustering or segmentation of customers to identify opportunities for differentiated treatment; deep-dive analyses to understand drivers of key business trends; identification of customer sentiment drivers through natural language processing (NLP) of verbatim responses to Net Promoter System (NPS) surveys; and development of frameworks to drive upsell strategy for existing customers by balancing business priorities with customer activity. Has in-depth experience, knowledge and skills in own discipline. Usually determines own work priorities. Acts as a resource for colleagues with less experience.

Job Description
Core Responsibilities
Work with business leaders and stakeholders to understand data and analysis needs and develop technical requirements.
Analyze large, complex data to determine actionable business insights using self-service analytics and reporting tools.
Combine data as needed from disparate data sources to complete analysis from multiple sources.
Identifies key business drivers and insights by conducting exploratory data analysis and hypothesis testing. Develops forecasting models to predict key business metrics. Analyzes the results of campaigns, offers or initiatives to measure their effectiveness and identifies opportunities for improvement. Communicates findings clearly and concisely through narrative-driven presentations and effective data visualizations to Company executives and decision-makers. Stays current with emerging trends in analytics, statistics, and machine learning and applies them to business challenges.

Mandatory Skills: SQL, Tableau, strong storytelling capabilities. Nice-to-have skills: PPT creation, Databricks, Spark, LLMs.

Disclaimer: This information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities and qualifications. Comcast is proud to be an equal opportunity workplace. We will consider all qualified applicants for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, veteran status, genetic information, or any other basis protected by applicable law. Base pay is one part of the Total Rewards that Comcast provides to compensate and recognize employees for their work. Most sales positions are eligible for a Commission under the terms of an applicable plan, while most non-sales positions are eligible for a Bonus. Additionally, Comcast provides best-in-class Benefits to eligible employees. We believe that benefits should connect you to the support you need when it matters most, and should help you care for those who matter most.
That’s why we provide an array of options, expert guidance and always-on tools, that are personalized to meet the needs of your reality – to help support you physically, financially and emotionally through the big milestones and in your everyday life. Please visit the compensation and benefits summary on our careers site for more details. Education Bachelor's Degree While possessing the stated degree is preferred, Comcast also may consider applicants who hold some combination of coursework and experience, or who have extensive related professional experience. Relevant Work Experience 5-7 Years
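The A/B testing and hypothesis-testing duties described above usually come down to a statistic such as the two-proportion z-test. A minimal, stdlib-only Python sketch (the function name and sample counts are illustrative, not from the posting):

```python
import math

def two_proportion_ztest(success_a, n_a, success_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)  # pooled rate under H0
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail
    return z, p_value

# Did the variant's 5.8% conversion beat control's 5.0%?
z, p = two_proportion_ztest(500, 10_000, 580, 10_000)
```

With these invented numbers the test rejects at the 5% level; in practice the sample sizes would come from the experiment's power analysis.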
Posted 6 days ago
5.0 years
4 - 6 Lacs
Jaipur
Remote
Job Description: We are looking for a skilled PostgreSQL Database Administrator (DBA) with 5+ years of experience to join our team. The role will begin at our Jaipur office and shift to remote after the onboarding/training phase. Key Responsibilities: Install, configure, and manage PostgreSQL database servers Ensure database performance, availability, and security Perform regular database tuning and optimization Manage backup, recovery, and disaster recovery planning Troubleshoot and resolve database issues efficiently Implement security policies and access controls Support developers with SQL queries and performance tuning Required Skills: Minimum 5+ years of hands-on experience with PostgreSQL Strong skills in SQL and PL/pgSQL Experience in replication, clustering, and high availability Familiarity with Linux environments Knowledge of monitoring and database optimization tools Good problem-solving and communication skills Nice to Have: Experience with cloud platforms (AWS, Azure, GCP) Scripting knowledge (Shell, Python, etc.) Exposure to automation tools like Ansible Job Type: Full-time Pay: ₹40,000.00 - ₹50,000.00 per month Application Question(s): Notice Period Are you available to join within one week? Education: Bachelor's (Required) Experience: PostgreSQL DBA: 5 years (Required) Location: Jaipur, Rajasthan (Required) Work Location: Remote
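The backup and recovery duties in this role are typically scripted around standard tools such as pg_dump. A hedged sketch of composing one such invocation from Python (the database name, output directory, and job count are hypothetical):

```python
from datetime import datetime, timezone

def pg_dump_command(db, outdir, jobs=4):
    """Compose a pg_dump invocation for a compressed, directory-format backup."""
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    target = f"{outdir}/{db}_{stamp}"
    return ["pg_dump", "--format=directory", f"--jobs={jobs}",
            "--compress=9", f"--file={target}", db]

cmd = pg_dump_command("sales", "/var/backups/pg")
```

The list would be handed to `subprocess.run(cmd, check=True)` from a cron- or systemd-driven wrapper; a real backup strategy would add retention, verification restores, and WAL archiving for point-in-time recovery.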
Posted 6 days ago
2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Overview Intuit is seeking a Sr Data Scientist to join our Data Science team at Intuit India. The AI team develops game-changing technologies and experiments that redefine and disrupt our current product offerings. You’ll be building and prototyping algorithms and applications on top of the collective financial data of 100 million consumers and small businesses. Applications will span multiple business lines, including personal finance, small business accounting, and tax. You thrive on ambiguity and will enjoy the frequent pivoting that’s part of the exploration. Your team will be very small and team members frequently wear multiple hats. In this position you will have close collaboration with the engineering and design teams, as well as the product and data teams in business units. Your role will range from research experimentalist to technology innovator to consultative business facilitator. You must be comfortable partnering with those directly involved with big data infrastructure, software, and data warehousing, as well as product management. What you'll bring MS or PhD in an appropriate technology field (Computer Science, Statistics, Applied Math, Operations Research, etc.). 2+ years of experience with data science for PhD and 5+ years for Masters. Experience in modern advanced analytical tools and programming languages such as R or Python with scikit-learn. Proficient in SQL, Hive, or SparkSQL, etc. Comfortable in a Linux environment. Experience in data mining algorithms and statistical modeling techniques such as clustering, classification, regression, decision trees, neural nets, support vector machines, anomaly detection, recommender systems, sequential pattern discovery, and text mining.
Solid communication skills: Demonstrated ability to explain complex technical issues to both technical and non-technical audiences Preferred Additional Experience Apache Spark The Hadoop ecosystem Java HP Vertica TensorFlow, reinforcement learning Ensemble Methods, Deep Learning, and other topics in the Machine Learning community Familiarity with GenAI and other LLM and DL methods How you will lead Perform hands-on data analysis and modeling with huge data sets. Apply data mining, NLP, and machine learning (both supervised and unsupervised) to improve relevance and personalization algorithms. Work side-by-side with product managers, software engineers, and designers in designing experiments and minimum viable products. Discover data sources, get access to them, import them, clean them up, and make them “model-ready”. You need to be willing and able to do your own ETL. Create and refine features from the underlying data. You’ll enjoy developing just enough subject matter expertise to have an intuition about what features might make your model perform better, and then you’ll lather, rinse and repeat. Run regular A/B tests, gather data, perform statistical analysis, draw conclusions on the impact of your optimizations and communicate results to peers and leaders. Explore new design or technology shifts in order to determine how they might connect with the customer benefits we wish to deliver.
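As a flavor of the clustering work listed above, here is a minimal, dependency-free k-means sketch (the data and seed are invented for illustration; production work would reach for scikit-learn's KMeans):

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means on 2-D points; returns the final centroids."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        # assign each point to its nearest centroid
        clusters = [[] for _ in range(k)]
        for x, y in points:
            i = min(range(k),
                    key=lambda c: (x - centroids[c][0]) ** 2 + (y - centroids[c][1]) ** 2)
            clusters[i].append((x, y))
        # recompute centroids; keep the old one if a cluster emptied out
        centroids = [
            (sum(px for px, _ in c) / len(c), sum(py for _, py in c) / len(c)) if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids

# two well-separated blobs recover their centers
pts = [(0.1, 0.0), (0.0, 0.2), (-0.1, 0.1), (5.0, 5.1), (5.2, 4.9), (4.9, 5.0)]
centers = sorted(kmeans(pts, 2))
```

The same assign-then-recompute loop underlies customer segmentation at scale; the hard parts in practice are feature engineering and choosing k, not the loop itself.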
Posted 6 days ago
5.0 - 8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Dreaming big is in our DNA. It’s who we are as a company. It’s our culture. It’s our heritage. And more than ever, it’s our future. A future where we’re always looking forward. Always serving up new ways to meet life’s moments. A future where we keep dreaming bigger. We look for people with passion, talent, and curiosity, and provide them with the teammates, resources and opportunities to unleash their full potential. The power we create together – when we combine your strengths with ours – is unstoppable. Are you ready to join a team that dreams as big as you do? AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming Operations through Tech and Analytics. Do You Dream Big? We Need You. Job Description Job Title: Senior Data Scientist Location: Bangalore Reporting to: Senior Manager Purpose of the role This role sits at the intersection of data science and revenue growth strategy, focused on developing advanced analytical solutions to optimize pricing, trade promotions, and product mix. The candidate will lead the end-to-end design, deployment, and automation of machine learning models and statistical frameworks that support commercial decision-making, predictive scenario planning, and real-time performance tracking. By leveraging internal and external data sources—including transactional, market, and customer-level data—this role will deliver insights into price elasticity, promotional lift, channel efficiency, and category dynamics. The goal is to drive measurable improvements in gross margin, ROI on trade spend, and volume growth through data-informed strategies. 
Key tasks & accountabilities Design and implement price elasticity models using linear regression, log-log models, and hierarchical Bayesian frameworks to understand consumer response to pricing changes across channels and segments. Build uplift models (e.g., Linear Regression, XGBoost for treatment effect) to evaluate promotional effectiveness and isolate true incremental sales vs. base volume. Develop demand forecasting models using ARIMA, SARIMAX, and Prophet, integrating external factors such as seasonality, promotions, and competitor activity. Apply time-series clustering and k-means segmentation to group SKUs, customers, and geographies for targeted pricing and promotion strategies. Construct assortment optimization models using conjoint analysis, choice modeling, and market basket analysis to support category planning and shelf optimization. Use Monte Carlo simulations and what-if scenario modeling to assess revenue impact under varying pricing, promo, and mix conditions. Conduct hypothesis testing (t-tests, ANOVA, chi-square) to evaluate statistical significance of pricing and promotional changes. Create LTV (lifetime value) and customer churn models to prioritize trade investment decisions and drive customer retention strategies. Integrate Nielsen, IRI, and internal POS data to build unified datasets for modeling and advanced analytics in SQL, Python (pandas, statsmodels, scikit-learn), and Azure Databricks environments. Automate reporting processes and real-time dashboards for price pack architecture (PPA), promotion performance tracking, and margin simulation using advanced Excel and Python. Lead post-event analytics using pre/post experimental designs, including difference-in-differences (DiD) methods to evaluate business interventions. Collaborate with Revenue Management, Finance, and Sales leaders to convert insights into pricing corridors, discount policies, and promotional guardrails.
Translate complex statistical outputs into clear, executive-ready insights with actionable recommendations for business impact. Continuously refine model performance through feature engineering, model validation, and hyperparameter tuning to ensure accuracy and scalability. Provide mentorship to junior analysts, enhancing their skills in modeling, statistics, and commercial storytelling. Maintain documentation of model assumptions, business rules, and statistical parameters to ensure transparency and reproducibility. Other Competencies Required Presentation Skills: Effectively presenting findings and insights to stakeholders and senior leadership to drive informed decision-making. Collaboration: Working closely with cross-functional teams, including marketing, sales, and product development, to implement insights-driven strategies. Continuous Improvement: Actively seeking opportunities to enhance reporting processes and insights generation to maintain relevance and impact in a dynamic market environment. Data Scope Management: Managing the scope of data analysis, ensuring it aligns with the business objectives and insights goals. Act as a steadfast advisor to leadership, offering expert guidance on harnessing data to drive business outcomes and optimize customer experience initiatives. Serve as a catalyst for change by advocating for data-driven decision-making and cultivating a culture of continuous improvement rooted in insights gleaned from analysis. Continuously evaluate and refine reporting processes to ensure the delivery of timely, relevant, and impactful insights to leadership stakeholders while fostering an environment of ownership, collaboration, and mentorship within the team. Business Environment Main Characteristics: Work closely with Zone Revenue Management teams. Work in a fast-paced environment. Provide proactive communication to the stakeholders. This is an offshore role and requires comfort with working in a virtual environment. 
GCC is referred to as the offshore location. The role requires working in a collaborative manner with Zone/country business heads and GCC commercial teams. Summarize insights and recommendations to be presented back to the business. Continuously improve, automate, and optimize the process. Geographical Scope: Europe. Qualifications, Experience, Skills Level of educational attainment required: Bachelor's or postgraduate degree in the field of Business & Marketing, Engineering/Solution, or other equivalent degree or equivalent work experience. MBA/Engg. in a relevant technical field such as Marketing/Finance. Extensive experience solving business problems using quantitative approaches. Comfort with extracting, manipulating, and analyzing complex, high volume, high dimensionality data from varying sources. Previous Work Experience Required 5-8 years of experience in the Retail/CPG domain. Technical Skills Required Data Manipulation & Analysis: Advanced proficiency in SQL, Python (Pandas, NumPy), and Excel for structured data processing. Data Visualization: Expertise in Power BI and Tableau for building interactive dashboards and performance tracking tools. Modeling & Analytics: Hands-on experience with regression analysis, time series forecasting, and ML models using scikit-learn or XGBoost. Data Engineering Fundamentals: Knowledge of data pipelines, ETL processes, and integration of internal/external datasets for analytical readiness. Proficient in Python (pandas, scikit-learn, statsmodels), SQL, and Power BI. Skilled in regression, Bayesian modeling, uplift modeling, time-series forecasting (ARIMA, SARIMAX, Prophet), and clustering (k-means). Strong grasp of hypothesis testing, model validation, and scenario simulation. And above all of this, an undying love for beer! We dream big to create a future with more cheers.
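The price-elasticity modeling above rests on a simple fact: in a log-log demand model, the OLS slope of ln(volume) on ln(price) is the elasticity. A self-contained Python sketch on synthetic data (the true elasticity of -1.8, price range, and noise level are invented):

```python
import math
import random

def log_log_elasticity(prices, volumes):
    """OLS slope of ln(volume) on ln(price) -- the constant-elasticity estimate."""
    lx = [math.log(p) for p in prices]
    ly = [math.log(v) for v in volumes]
    mx, my = sum(lx) / len(lx), sum(ly) / len(ly)
    cov = sum((x - mx) * (y - my) for x, y in zip(lx, ly))
    var = sum((x - mx) ** 2 for x in lx)
    return cov / var

# synthetic demand curve: volume = 1e6 * price^(-1.8) * lognormal noise
rng = random.Random(42)
prices = [round(rng.uniform(80, 120), 2) for _ in range(200)]
volumes = [1_000_000 * p ** -1.8 * math.exp(rng.gauss(0, 0.05)) for p in prices]
elasticity = log_log_elasticity(prices, volumes)
```

Real work would add controls (promotions, seasonality, competitor prices) via multiple regression or the hierarchical Bayesian frameworks the posting mentions, but the interpretation of the price coefficient is the same.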
Posted 6 days ago
7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Experience- 7+ years Location- Hyderabad (preferred), Pune, Mumbai JD- We are seeking a skilled Snowflake Developer with 7+ years of experience in designing, developing, and optimizing Snowflake data solutions. The ideal candidate will have strong expertise in Snowflake SQL, ETL/ELT pipelines, and cloud data integration. This role involves building scalable data warehouses, implementing efficient data models, and ensuring high-performance data processing in Snowflake. Key Responsibilities 1. Snowflake Development & Optimization Design and develop Snowflake databases, schemas, tables, and views following best practices. Write complex SQL queries, stored procedures, and UDFs for data transformation. Optimize query performance using clustering, partitioning, and materialized views. Implement Snowflake features (Time Travel, Zero-Copy Cloning, Streams & Tasks). 2. Data Pipeline Development Build and maintain ETL/ELT pipelines using Snowflake, Snowpark, Python, or Spark. Integrate Snowflake with cloud storage (S3, Blob) and data ingestion tools (Snowpipe). Develop CDC (Change Data Capture) and real-time data processing solutions. 3. Data Modeling & Warehousing Design star schema, snowflake schema, and data vault models in Snowflake. Implement data sharing, secure views, and dynamic data masking. Ensure data quality, consistency, and governance across Snowflake environments. 4. Performance Tuning & Troubleshooting Monitor and optimize Snowflake warehouse performance (scaling, caching, resource usage). Troubleshoot data pipeline failures, latency issues, and query bottlenecks. Work with DevOps teams to automate deployments and CI/CD pipelines. 5. Collaboration & Documentation Work closely with data analysts, BI teams, and business stakeholders to deliver data solutions. Document data flows, architecture, and technical specifications. Mentor junior developers on Snowflake best practices. 
Required Skills & Qualifications · 7+ years in database development, data warehousing, or ETL. · 4+ years of hands-on Snowflake development experience. · Strong SQL or Python skills for data processing. · Experience with Snowflake utilities (SnowSQL, Snowsight, Snowpark). · Knowledge of cloud platforms (AWS/Azure) and data integration tools (Coalesce, Airflow, DBT). · Certifications: SnowPro Core Certification (preferred). Preferred Skills · Familiarity with data governance and metadata management. · Familiarity with DBT, Airflow, SSIS & IICS · Knowledge of CI/CD pipelines (Azure DevOps).
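Snowflake Streams & Tasks, mentioned above, surface row-level changes (exposed through the METADATA$ACTION column) so downstream jobs process only deltas. The CDC bookkeeping they automate can be illustrated with a small, hypothetical Python snapshot diff:

```python
def diff_snapshots(old, new):
    """Classify rows into inserts / updates / deletes between two keyed snapshots --
    the same classification a CDC stream exposes to consumers."""
    inserts = {k: v for k, v in new.items() if k not in old}
    deletes = {k: v for k, v in old.items() if k not in new}
    updates = {k: new[k] for k in old.keys() & new.keys() if old[k] != new[k]}
    return inserts, updates, deletes

# invented example: customer id -> (name, balance)
old = {1: ("alice", 100), 2: ("bob", 250)}
new = {2: ("bob", 300), 3: ("carol", 75)}
ins, upd, dele = diff_snapshots(old, new)
```

Snapshot diffing is how CDC works when the source offers no change log; Snowflake streams avoid the full comparison by tracking table versions internally.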
Posted 6 days ago
10.0 years
0 Lacs
Greater Chennai Area
On-site
About The Role Seeking a highly skilled Database Administrator with 10 years of experience to join our dynamic team. Requirements Should manage and maintain SQL Server databases across development, testing and production environments. Hands-on experience in leading and supporting database migration projects from on-premises SQL Server to Azure SQL Database, Azure Managed Instances or IaaS. Perform performance analysis and tuning of T-SQL queries, indexes and server configurations. Monitor database health and activity using tools like SolarWinds DPA, Dynatrace, and SQL Server native DMVs. Troubleshoot and resolve SQL Server issues including slow performance, job failures and blocked processes. Expertise in developing, scheduling and monitoring SSIS packages and ETL jobs, ensuring data quality and timely delivery. Must maintain, analyse and enhance Ola Hallengren maintenance scripts for backward compatibility or migration. Work collaboratively with application developers, infrastructure teams and business stakeholders. Respond to incidents and service requests via ServiceNow (SNOW) and document resolution steps. Ensure database compliance with security and operational standards. Sound troubleshooting skills in T-SQL, replication, clustering and Always On availability groups. Proficient with SSIS, ETL pipeline development and job monitoring. Participate in disaster recovery planning and testing. Familiarity with DevOps practices, Git integration for SQL code and automation using PowerShell. Exposure to cloud cost optimization techniques and scaling strategies. Experience supporting Agile development teams and CI/CD pipelines for database deployments. Preferred certifications include Azure Database Administrator Associate or similar Azure certifications.
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
karnataka
On-site
You should have a Bachelor's Degree in a technical or quantitative field from an accredited college or university. A Master's or PhD in Computer Science, Statistics, Mathematics, Engineering, Bioinformatics, Physics, Operations Research, or related fields is preferred. You must have at least 10 years of relevant experience in Machine Learning and Statistics. A strong mathematical background is required, with the ability to understand algorithms and methods both mathematically and intuitively. Your primary responsibilities will include translating business objectives into analytic approaches, identifying data sources to support analysis, and analyzing structured and unstructured data using advanced statistical methods, deep learning, and Generative AI. You will need to perform exploratory data analyses, generate and test working hypotheses, prepare and analyze historical data, and identify patterns. Proficiency in Python and SQL is essential, along with experience in working with open-source packages and commercial/enterprise applications. You should be comfortable with machine learning and statistical methods such as regression, classification, clustering, ensemble and tree-based models, and time-series analysis. Prior experience or understanding of ML Ops on any cloud platform is required. You should have a strong background in Generative AI, particularly with tools like ChatGPT and Copilot, and hands-on experience in using APIs to integrate with Copilot and Fabric. Exposure to the Microsoft Tech Stack ecosystem is necessary, along with familiarity with Azure Document Intelligence and Azure Machine Learning for model training and deployment. Additionally, you should be familiar with at least one of the following: Azure AI Language, Azure AI Vision. As part of your role, you will need to communicate results and educate others through reports and presentations.
Familiarity with DevOps, code version control, and software development coding best practices is essential. Candidates must hold at least two of the following Azure certifications: DP-203, MB-260, DP-900, AI-102, AI-900.
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
karnataka
On-site
The Network and Virtualization Engineer is expected to lead the design, deployment, and ongoing optimization of robust network and virtualization infrastructures. This role is critical in ensuring high availability, performance, and security across all network layers and hypervisor environments, directly supporting the scalability and resilience of core business operations. To achieve the goals of this role, you must design, implement, and maintain the company's network and virtualization infrastructure. This includes modifying, patching, and extending the KVM hypervisor to enhance performance, security, isolation, or feature sets aligned with business and product needs. You will work with Linux kernel modules, OS package management tools (yum, apt, dnf, rpm, deb), and system boot processes to maintain a highly optimized virtual infrastructure. Additionally, engineer solutions that span VM image formats (qcow2, vmdk, vhd, ovf, xva), cloud-init integration, and support for high availability and clustering in virtualized environments. You will design and implement advanced virtual networking capabilities using Open vSwitch, VXLAN, L2/L3 routing, and SDN, collaborating with infrastructure teams to ensure seamless LAN/WAN/SAN interactions. Furthermore, analyze hypervisor and kernel-level behavior to troubleshoot complex system issues, apply performance patches, and optimize virtualization throughput and latency. Ensure tight integration of KVM with containers, Kubernetes, and cloud orchestration platforms like OpenStack. Harden the hypervisor and kernel environment, enforce system integrity, and build controls to comply with internal and external security standards. Explore and implement emerging technologies including AIOps, MLOps, and hardware virtualization acceleration to advance the platform's capabilities. Maintain detailed documentation of kernel modifications, system architecture changes, and testing strategies.
Contribute to engineering best practices and mentor peers.
Required Skills and Qualifications:
- Deep experience with Linux kernel development, KVM, and virtualization internals.
- Strong programming skills in C, Python, and Go.
- Experience with hypervisor integration formats (qcow2, vmdk, vhd, xva), cloud-init, and OS image management.
- Familiarity with cloud platforms (AWS, Azure, GCP) and open-source cloud stacks (OpenStack).
- Solid understanding of OSI model, networking protocols, SDN, firewalls, load balancers, and storage protocols (iSCSI, NFS).
- Exposure to DevOps, IaC, CI/CD pipelines, and agile toolchains like GitLab and Jira.
- Working knowledge of clustering, HA, containers, and Kubernetes in a virtualization context.
Educational Qualifications:
- A Bachelor's degree in Computer Science or a closely related technical field.
Experience:
- 7+ years of hands-on experience in network administration, network design, and virtualization technologies.
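Much of the VXLAN work described above comes down to the 8-byte encapsulation header defined in RFC 7348: a flags byte with the I bit set, 24 reserved bits, a 24-bit VNI, and a final reserved byte. A small Python sketch that packs it (the VNI value is arbitrary):

```python
import struct

def vxlan_header(vni):
    """Pack the 8-byte VXLAN header from RFC 7348:
    flags (I bit set), 24 reserved bits, 24-bit VNI, 8 reserved bits."""
    if not 0 <= vni < 2 ** 24:
        raise ValueError("VNI is a 24-bit field")
    flags = 0x08  # only the I (valid-VNI) flag is set
    return struct.pack("!II", flags << 24, vni << 8)

hdr = vxlan_header(5001)  # VNI 5001 = 0x001389
```

In an Open vSwitch deployment this header sits between the outer UDP datagram (destination port 4789) and the encapsulated inner Ethernet frame; tools like scapy build it the same way.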
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
karnataka
On-site
Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world. This role requires 4+ years of experience as a PostgreSQL DBA in production environments. You should possess a strong knowledge of PostgreSQL internals, architecture, and tools, along with proficiency in SQL and PL/pgSQL. Experience with replication (streaming, logical), clustering, and failover mechanisms is essential. Your responsibilities will involve installing, configuring, and upgrading PostgreSQL databases and related tools, monitoring database performance, and proactively addressing issues. Additionally, you will be expected to perform database tuning, indexing, and query optimization, as well as implement and manage backup and recovery strategies. What you will love about working here: Capgemini's global reputation for delivering cutting-edge technology solutions and its strong focus on innovation and digital transformation. Capgemini's diverse client base across industries offers a dynamic environment where you can apply PostgreSQL expertise to solve real-world challenges at scale. Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. With a responsible and diverse group of 340,000 team members in more than 50 countries, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs.
The company delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, generative AI, cloud and data, combined with its deep industry expertise and partner ecosystem.
Posted 1 week ago