6.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Job Summary
We are looking for an experienced R Analytics Lead to manage the day-to-day operations of our R-based analytics environment. The role focuses on monitoring the execution of existing R scripts, resolving failures through root cause analysis and data cleaning, supporting business teams with production data requests, and mentoring the R Ops team for better process efficiency and incident handling.
Key Responsibilities
Monitor Production Jobs: Oversee successful execution of scheduled R scripts; monitor failures, investigate issues, and take corrective actions.
Root Cause Analysis: Troubleshoot script failures and identify data or logic issues; perform necessary fixes and re-execute the process to ensure output delivery.
Data Cleaning: Handle raw or inconsistent production data by applying proper cleaning techniques to ensure smooth script execution.
Production Data Requests: Fulfill various production data and reporting requests raised by business stakeholders using R and SQL.
Issue Resolution & Team Support: Act as the go-to person for any technical issues in the R Ops team. Guide and support team members in identifying problems and resolving them.
Process Improvement: Identify areas to improve existing R code performance, suggest enhancements, and help automate or simplify routine tasks.
Collaboration with Development & QA: Support testing, deployment, and monitoring activities for new script developments or changes in the production environment.
Knowledge Sharing: Train and mentor team members on R coding standards, production support practices, database usage, and debugging techniques.
Required Qualifications
6+ years of experience in analytics, with at least 4 years in a lead or senior operations/support role.
Strong hands-on experience in R programming (especially with packages like dplyr, data.table, readr, lubridate).
Proficiency in SQL for data extraction, transformation, and analysis.
Experience in handling production support, script monitoring, and issue resolution.
Demonstrated ability to lead teams, train junior members, and coordinate across departments.
Desirable Skills
Familiarity with scheduling tools and database connections in a production environment.
Ability to document processes, communicate issues clearly, and interact with business users.
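As a small illustration of the data-cleaning work described above, here is a hedged sketch using dplyr, readr, and lubridate; the file name, column names, and cleaning rules are hypothetical and would depend on the actual production feed.

```r
library(dplyr)
library(lubridate)
library(readr)

# Read a raw production extract; the path and columns are hypothetical.
raw <- read_csv("daily_extract.csv", show_col_types = FALSE)

clean <- raw %>%
  # Drop exact duplicate records that sometimes appear in re-sent feeds
  distinct() %>%
  # Parse day-month-year date strings into proper Date objects
  mutate(txn_date = dmy(txn_date)) %>%
  # Replace missing amounts with 0 and trim stray whitespace in keys
  mutate(
    amount      = coalesce(amount, 0),
    customer_id = trimws(customer_id)
  ) %>%
  # Drop rows whose dates could not be parsed
  filter(!is.na(txn_date))

write_csv(clean, "daily_extract_clean.csv")
```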
Posted 3 days ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.
Title And Summary
Associate Analyst, R Programmer-2
Overview
The Mastercard Economics Institute (MEI) is an economics lab powering scale at Mastercard by owning economic thought leadership in support of Mastercard’s efforts to build a more inclusive and sustainable digital economy. The Economics Institute was launched in 2020 to analyze economic trends through the lens of the consumer to deliver tailored and actionable insights on economic issues for customers, partners and policymakers. The Institute is composed of a team of economists and data scientists that utilize & synthesize the anonymized and aggregated data from the Mastercard network together with public data to bring powerful insights to life, in the form of 1:1 presentations, global thought leadership, media participation, and commercial work through the company’s product suites.
About The Role
We are looking for an R programmer to join Mastercard’s Economics Institute, reporting to the team lead for Economics Technology. An individual who will:
create clear, compelling data visualisations that communicate economic insights to diverse audiences
develop reusable R functions and packages to support analysis and automation
create and format analytical content using R Markdown and/or Quarto
design and build scalable Shiny apps
develop interactive visualisations using JavaScript charting libraries (e.g. Plotly, Highcharts, D3.js) or front-end frameworks (e.g. React, Angular, Vue.js)
work with databases and data platforms (e.g. SQL, Hadoop)
write clear, well-documented code that others can understand and maintain
collaborate using Git for version control
All About You
proficient in R and the RStudio IDE
proficient in R packages like dplyr for data cleaning, transformation, and aggregation
familiarity with dependency management and documentation in R (e.g. roxygen2)
familiar with version control concepts and tools (e.g. Git, GitHub, Bitbucket) for collaborative development
experience writing SQL and working with relational databases
creative and passionate about data, coding, and technology
strong collaborator who can also work independently
organized and able to prioritise work across multiple projects
comfortable working with engineers, product owners, data scientists, economists
Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
Abide by Mastercard’s security policies and practices;
Ensure the confidentiality and integrity of the information being accessed;
Report any suspected information security violation or breach, and
Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.
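To illustrate the "reusable R functions" and roxygen2 documentation items above, here is a minimal sketch of a documented helper; the function name and behaviour are invented for the example and are not taken from any Mastercard codebase.

```r
#' Index a numeric series to its first observation
#'
#' Rescales a time-ordered series so the first value equals 100, which makes
#' growth comparisons across markets easier to read.
#'
#' @param x Numeric vector ordered by time.
#' @return Numeric vector of index values (first element = 100).
#' @export
index_to_base <- function(x) {
  stopifnot(is.numeric(x), length(x) > 0, !is.na(x[1]), x[1] != 0)
  100 * x / x[1]
}

# Example usage
index_to_base(c(250, 270, 310))  # 100 108 124
```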
Posted 4 days ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Purpose:
To evaluate and validate the quality, accuracy, and relevance of the data and analytics outputs, and ensure they adhere to the standards and best practices
To stay updated with the latest trends and developments in the data and analytics field, and leverage them to improve the team's capabilities and performance
To lead and mentor a team of data analysts and engineers, and foster a culture of learning, innovation, and collaboration within the team and across the organization
Key Accountabilities
Tag Management
• Own channel-wise mapping and click-to-visit sanity
• Implement best practices in Tag Management
Adobe Analytics Implementation
• Define and own dimensions and metrics
• Define channel rules and custom implementation as per business objectives
• Own GA/AA versus DB variance in transactions and revenue
• Installing, configuring, testing, and troubleshooting Adobe software products such as Adobe Experience Manager, Adobe Analytics, Adobe Target, and Adobe Campaign.
• Providing technical guidance and best practices for Adobe software implementation and integration.
• Creating and maintaining documentation and reports on Adobe software performance, issues, and solutions.
A/B Testing & Personalization
• Drive the overall conversion optimization and landing page optimization agenda
• Co-own traffic to lead conversion with Product
Attribution and MIS Automation
• Manage the complexity of multiple truths (AdWords, DBM, Facebook, Google Analytics, internal CRM) and converge on a single truth
• Automation of critical dashboards for decision making and business insights
Channel mix modelling & Data driven attribution
• Answer the ultimate questions: How much to spend in which channel? Who to market to, when to spend, and which channel to use? How to successfully move from campaign to audience marketing?
Other Competencies
• Apply advanced statistical and analytical techniques, such as machine learning, predictive modeling, and optimization, to generate insights and solutions for complex business problems and opportunities
Technical Competencies (Preferred domain knowledge)
SQL: Competency to write and execute complex queries, join multiple tables, create views and functions, and optimize database performance.
Python: Ability to use Python for data manipulation, processing, and modeling, as well as for creating web applications and APIs. You should also be proficient in using libraries such as pandas, numpy, scipy, sklearn, matplotlib, seaborn, and flask.
R: Use R for statistical analysis, data visualization, and machine learning. You should also be familiar with popular packages such as tidyverse, ggplot2, dplyr, tidyr, caret, and shiny.
Tableau: Ability to create interactive dashboards and reports using Tableau, as well as connect to various data sources and perform data blending and aggregation.
Power BI: Competency to use Power BI for data visualization and business intelligence, as well as create and share reports and dashboards using Power BI Desktop and Power BI Service.
Familiar with web services and APIs such as REST, SOAP, and OAuth.
Skills/Qualities Required
Strong analytical and critical thinking skills, with proficiency in data analysis tools and techniques
Excellent communication skills, capable of translating complex data into clear, actionable insights for non-technical stakeholders
Detail-oriented with strong organizational skills, able to manage multiple projects and meet deadlines
Keen interest in staying updated with the latest trends in data analytics and e-commerce
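For the A/B testing responsibility listed in the posting above, here is a minimal sketch of a two-sample conversion-rate test in R; the traffic and conversion numbers are made up for illustration.

```r
# Conversions and visitors for control (A) and variant (B); numbers are invented.
conversions <- c(A = 420, B = 468)
visitors    <- c(A = 10000, B = 10000)

# Two-sample test of equal conversion rates
result <- prop.test(conversions, visitors)
result$p.value    # significance of the observed lift
result$estimate   # observed conversion rates: 4.20% vs 4.68%
```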
Posted 5 days ago
1.0 - 5.0 years
0 Lacs
Karnataka
On-site
As an integral part of the MC Analytics team, your role in Real-World Evidence (RWE) generation will be crucial in bridging the gap between clinical effectiveness and the commercial success of pharmaceutical brands. Collaboration between clinical and commercial disciplines is key at MC Analytics, where valuable insights are shared to guide decision-making. The importance of considering commercialization early in the drug development process is emphasized, and we are dedicated to partnering with you to create fully integrated RWE solutions tailored for the peri-approval phase, ensuring the collection of essential data for a successful product launch.
By focusing on Real-World Evidence generation to meet the needs of key stakeholders, you will provide:
- The necessary data to support regulatory approval and licensing processes.
- The evidence required by physicians to confidently prescribe the drug.
Your coding proficiency will play a significant part in this role, including:
- Advanced expertise in R programming with a focus on dplyr and tidyverse packages.
- Skill in managing complex datasets, handling missing values, and performing data pre-processing tasks efficiently.
- Experience with version control systems such as GitHub and GitLab.
- Advanced SQL skills to write optimized and scalable queries.
- Ability to write modular and reusable code, emphasizing best practices and functions.
The desired qualifications and skills for this role include:
- A degree in Statistics/Data Science (B.Sc/M.Sc) with 1-2 years of experience in real-world data or oncology.
- Proficiency in implementing statistical methods for descriptive statistics, Kaplan-Meier curves, visualization plots, etc.
- Demonstrated ownership of deliverables, ensuring timely and high-quality outcomes.
- Capability to work independently while proactively seeking guidance when needed.
- Willingness to follow directions, accept oversight, and collaborate effectively in a team setting.
- Adaptability to thrive in a cross-functional and highly collaborative work environment.
Join us at MC Analytics and be a part of a dynamic team dedicated to delivering impactful Real-World Evidence solutions. Visit our website at https://mcanalytics.co.in/ to learn more about our work and culture.
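As a hedged sketch of the Kaplan-Meier analysis mentioned above, the snippet below uses the survival package on a tiny invented dataset; real RWE work would use curated patient-level data and an agreed statistical analysis plan.

```r
library(survival)

# Toy overall-survival data; values are invented for illustration only.
df <- data.frame(
  time   = c(5, 8, 12, 14, 20, 21, 27, 30),  # months of follow-up
  status = c(1, 1, 0, 1, 0, 1, 1, 0),        # 1 = event, 0 = censored
  arm    = c("treated", "control", "treated", "control",
             "treated", "control", "treated", "control")
)

fit <- survfit(Surv(time, status) ~ arm, data = df)
summary(fit)  # survival estimates by arm

plot(fit, xlab = "Months", ylab = "Survival probability", lty = 1:2)
legend("bottomleft", legend = names(fit$strata), lty = 1:2)
```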
Posted 2 weeks ago
4.0 years
0 Lacs
New Delhi, Delhi, India
Remote
Title: Data Analyst
Nature of position: 3-6 months consultancy contract, 20 hours/week
Location: New Delhi
Role overview: We are looking for a highly proficient Research Analyst with expertise in R programming to lead data analysis and visualization for public health nutrition research. The role involves working with large datasets and translating complex data into actionable, policy-relevant insights using R.
Key Responsibilities:
Perform data cleaning, transformation, and analysis using R
Develop scripts and workflows for automated data processing and reporting
Connect to external data sources, including SQL-based databases hosted on servers
Use R to conduct exploratory and inferential data analysis aligned with research objectives
Document code, analysis steps, and data workflows for transparency and reuse
Build and maintain data dashboards, visualizations, and summary reports using R packages (e.g., ggplot2, plotly)
Ensure data quality, validation, and documentation throughout the analysis.
Essential Qualifications and Skills:
PhD in Data Science, Statistics, Public Health, or Biostatistics
Minimum 2–4 years of experience working in a data analyst or research analyst role
Advanced proficiency in R programming, including data wrangling (dplyr, tidyr), visualization, and reporting
Experience with connecting R to remote servers and databases (e.g., using DBI, RPostgres)
Experience working with health, nutrition, or development datasets preferred
Strong understanding of data cleaning, wrangling, and quality assurance
Desirable Skills:
Prior experience working with national or regional nutrition data and research in India
Experience with Shiny apps, R Markdown, etc.
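To illustrate the requirement of connecting R to remote databases, here is a minimal sketch using DBI and RPostgres; the host, database, table, and column names are placeholders, and credentials are read from environment variables rather than hard-coded.

```r
library(DBI)
library(RPostgres)
library(dplyr)

# Connection details are placeholders; real credentials would come from a
# secured configuration, never from source code.
con <- dbConnect(
  Postgres(),
  host     = Sys.getenv("DB_HOST"),
  dbname   = "nutrition_survey",
  user     = Sys.getenv("DB_USER"),
  password = Sys.getenv("DB_PASSWORD")
)

# Pull one (hypothetical) survey table and summarise it in R
anthro <- dbGetQuery(con, "SELECT state, age_months, weight_kg FROM child_anthro")

anthro %>%
  group_by(state) %>%
  summarise(n = n(), mean_weight = mean(weight_kg, na.rm = TRUE))

dbDisconnect(con)
```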
Posted 2 weeks ago
6.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Description
Years of experience: 6 to 10 years
Location: Noida, Pune, Bangalore, Nagpur
Requirements
Job Requirements:
6+ years of experience in a Data Scientist role
Strong analytic skills related to working with unstructured datasets.
Strong understanding of statistical methods and hypothesis testing.
Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
Experience supporting and working with cross-functional teams in a dynamic environment.
Required Skills:
Programming Languages: Proficiency in Python
API / Framework: Hands-on experience with PySpark
Data Manipulation: Strong experience with libraries like pandas, NumPy, dplyr, etc.
Machine Learning: Hands-on experience with ML algorithms (regression, classification, clustering, etc.)
Machine Learning Models: Experience developing and tuning machine learning models such as Random Forest, Support Vector Machines (SVM), and Natural Language Processing (NLP) techniques
Cloud: Hands-on experience with AWS services for data science and ML, including S3 and EC2
Orchestration tools: Experience with orchestration tools like Luigi to manage data pipelines or ML workflows.
Statistical Analysis: Solid foundation in probability, statistics, hypothesis testing, and experimental design (e.g., A/B testing).
Data Visualization: Experience with visualization tools such as Tableau or Power BI.
SQL & Databases: Ability to write complex SQL queries; familiarity with relational databases (PostgreSQL, MySQL, etc.).
CI/CD: Experience with CI/CD practices and tools (GitHub) for automating data workflows and model deployments.
Other Tools: Proficient in Microsoft Excel for exploratory data analysis, reporting, and data validation.
Soft Skills:
– Strong problem-solving and critical thinking abilities.
– Excellent communication skills to explain technical concepts to non-technical stakeholders.
– Ability to work independently and collaboratively in a fast-paced, agile environment.
– Detail-oriented, with a focus on data quality and accuracy.
Good to have: Domain knowledge.
Job responsibilities
Collaborate with data engineers, analysts, and business stakeholders to understand requirements and deliver solutions.
Monitor model performance and retrain as needed to ensure accuracy and reliability.
Collect, clean, and preprocess large datasets from various sources.
Develop statistical models and machine learning algorithms to solve business problems.
Stay updated with the latest industry trends, tools, and technologies.
What we offer
Culture of caring. At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first. From day one, you’ll experience an inclusive culture of acceptance and belonging, where you’ll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders.
Learning and development. We are committed to your continuous learning and development. You’ll learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. With our Career Navigator tool as just one example, GlobalLogic offers a rich array of programs, training curricula, and hands-on opportunities to grow personally and professionally.
Interesting & meaningful work. GlobalLogic is known for engineering impact for and with clients around the world.
As part of our team, you’ll have the chance to work on projects that matter. Each is a unique opportunity to engage your curiosity and creative problem-solving skills as you help clients reimagine what’s possible and bring new solutions to market. In the process, you’ll have the privilege of working on some of the most cutting-edge and impactful solutions shaping the world today. Balance and flexibility. We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. Your life extends beyond the office, and we always do our best to help you integrate and balance the best of work and life, having fun along the way! High-trust organization. We are a high-trust organization where integrity is key. By joining GlobalLogic, you’re placing your trust in a safe, reliable, and ethical global company. Integrity and trust are a cornerstone of our value proposition to our employees and clients. You will find truthfulness, candor, and integrity in everything we do. About GlobalLogic GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world’s largest and most forward-thinking companies. Since 2000, we’ve been at the forefront of the digital revolution – helping create some of the most innovative and widely used digital products and experiences. Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.
Posted 3 weeks ago
0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.
Title And Summary
Associate Analyst, R Programmer-2
Overview
The Mastercard Economics Institute (MEI) is an economics lab powering scale at Mastercard by owning economic thought leadership in support of Mastercard’s efforts to build a more inclusive and sustainable digital economy. The Economics Institute was launched in 2020 to analyze economic trends through the lens of the consumer to deliver tailored and actionable insights on economic issues for customers, partners and policymakers. The Institute is composed of a team of economists and data scientists that utilize & synthesize the anonymized and aggregated data from the Mastercard network together with public data to bring powerful insights to life, in the form of 1:1 presentations, global thought leadership, media participation, and commercial work through the company’s product suites.
About The Role
We are looking for an R programmer to join Mastercard’s Economics Institute, reporting to the team lead for Economics Technology. An individual who will:
create clear, compelling data visualisations that communicate economic insights to diverse audiences
develop reusable R functions and packages to support analysis and automation
create and format analytical content using R Markdown and/or Quarto
design and build scalable Shiny apps
develop interactive visualisations using JavaScript charting libraries (e.g. Plotly, Highcharts, D3.js) or front-end frameworks (e.g. React, Angular, Vue.js)
work with databases and data platforms (e.g. SQL, Hadoop)
write clear, well-documented code that others can understand and maintain
collaborate using Git for version control
All About You
proficient in R and the RStudio IDE
proficient in R packages like dplyr for data cleaning, transformation, and aggregation
familiarity with dependency management and documentation in R (e.g. roxygen2)
familiar with version control concepts and tools (e.g. Git, GitHub, Bitbucket) for collaborative development
experience writing SQL and working with relational databases
creative and passionate about data, coding, and technology
strong collaborator who can also work independently
organized and able to prioritise work across multiple projects
comfortable working with engineers, product owners, data scientists, economists
Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
Abide by Mastercard’s security policies and practices;
Ensure the confidentiality and integrity of the information being accessed;
Report any suspected information security violation or breach, and
Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.
R-250449
Posted 3 weeks ago
6.0 - 11.0 years
15 - 18 Lacs
Chennai, Mumbai (All Areas)
Work from Office
Job Summary
We are looking for an experienced R Analytics Lead to manage the day-to-day operations of our R-based analytics environment. The role focuses on monitoring the execution of existing R scripts, resolving failures through root cause analysis and data cleaning, supporting business teams with production data requests, and mentoring the R Ops team for better process efficiency and incident handling.
Key Responsibilities
Monitor Production Jobs: Oversee successful execution of scheduled R scripts; monitor failures, investigate issues, and take corrective actions.
Root Cause Analysis: Troubleshoot script failures and identify data or logic issues; perform necessary fixes and re-execute the process to ensure output delivery.
Data Cleaning: Handle raw or inconsistent production data by applying proper cleaning techniques to ensure smooth script execution.
Production Data Requests: Fulfill various production data and reporting requests raised by business stakeholders using R and SQL.
Issue Resolution & Team Support: Act as the go-to person for any technical issues in the R Ops team. Guide and support team members in identifying problems and resolving them.
Process Improvement: Identify areas to improve existing R code performance, suggest enhancements, and help automate or simplify routine tasks.
Collaboration with Development & QA: Support testing, deployment, and monitoring activities for new script developments or changes in the production environment.
Knowledge Sharing: Train and mentor team members on R coding standards, production support practices, database usage, and debugging techniques.
Required Qualifications
6+ years of experience in analytics, with at least 4 years in a lead or senior operations/support role.
Strong hands-on experience in R programming (especially with packages like dplyr, data.table, readr, lubridate).
Proficiency in SQL for data extraction, transformation, and analysis.
Experience in handling production support, script monitoring, and issue resolution.
Demonstrated ability to lead teams, train junior members, and coordinate across departments.
Desirable Skills
Familiarity with scheduling tools and database connections in a production environment.
Ability to document processes, communicate issues clearly, and interact with business users.
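As an illustration of the production-monitoring responsibilities above, here is a hedged sketch of a wrapper that runs a scheduled R script, captures failures with tryCatch, and appends the outcome to a log; the paths and log format are invented for the example.

```r
# Run one scheduled script, capture failures, and log the outcome so the
# operations team can investigate and re-run. Script path and log file are
# illustrative only.
run_job <- function(script_path, log_file = "r_jobs.log") {
  started <- Sys.time()
  outcome <- tryCatch(
    {
      source(script_path, local = new.env())
      "SUCCESS"
    },
    error = function(e) paste("FAILED:", conditionMessage(e))
  )
  line <- sprintf("%s | %s | %s | %.1f sec",
                  format(started), script_path, outcome,
                  as.numeric(difftime(Sys.time(), started, units = "secs")))
  cat(line, "\n", file = log_file, append = TRUE)
  invisible(outcome)
}

run_job("scripts/daily_sales_report.R")
```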
Posted 3 weeks ago
8.0 - 10.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Who We Are
At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.
The Role
As a Data Scientist at Kyndryl, you are the bridge between business problems and innovative solutions, using a powerful blend of well-defined methodologies, statistics, mathematics, domain expertise, consulting, and software engineering. You'll wear many hats, and each day will present a new puzzle to solve, a new challenge to conquer. You will dive deep into the heart of our business, understanding its objectives and requirements – viewing them through the lens of business acumen, and converting this knowledge into a data problem. You’ll collect and explore data, seeking underlying patterns and initial insights that will guide the creation of hypotheses. You are an analytical professional who uses statistical methods, machine learning, and programming skills to extract insights and knowledge from data, with the primary goal of solving complex business problems, making predictions, and driving strategic decision-making by uncovering patterns and trends within large datasets.
In this role, you will embark on a transformative process of business understanding, data understanding, and data preparation. Utilizing statistical and mathematical modeling techniques, you'll have the opportunity to create models that defy convention – models that hold the key to solving intricate business challenges. With an acute eye for accuracy and generalization, you'll evaluate these models to ensure they not only solve business problems but do so optimally. Additionally, you're not just building and validating models – you’re deploying them as code to applications and processes, ensuring that the model(s) you've selected sustains its business value throughout its lifecycle.
Your expertise doesn't stop at data; you'll become intimately familiar with our business processes and have the ability to navigate their complexities, identifying issues and crafting solutions that drive meaningful change in these domains. You will develop and apply standards and policies that protect our organization's most valuable asset – ensuring that data is secure, private, accurate, available, and, most importantly, usable. Your mastery extends to data management, migration, strategy, change management, and policy and regulation.
Key Responsibilities:
Problem Framing: Collaborating with stakeholders to understand business problems and translate them into data-driven questions.
Data Collection and Cleaning: Sourcing, collecting, and cleaning large, often messy, datasets from various sources, preparing them for analysis.
Exploratory Data Analysis (EDA): Performing initial investigations on data to discover patterns, spot anomalies, test hypotheses, and check assumptions with the help of summary statistics and graphical representations.
Model Development: Building, training, and validating machine learning models (e.g., regression, classification, clustering, deep learning) to predict outcomes or identify relationships.
Statistical Analysis: Applying statistical tests and methodologies to draw robust conclusions from data and quantify uncertainty.
Feature Engineering: Creating new variables or transforming existing ones to improve model performance and provide deeper insights.
Model Deployment: Working with engineering teams to deploy models into production environments, making them operational for real-time predictions or insights.
Communication and Storytelling: Presenting complex findings and recommendations clearly and concisely to both technical and non-technical audiences, often through visualizations and narratives.
Monitoring and Maintenance: Tracking model performance in production and updating models as data patterns evolve or new data becomes available.
If you're ready to embrace the power of data to transform our business and embark on an epic data adventure, then join us at Kyndryl. Together, let's redefine what's possible and unleash your potential.
Your Future at Kyndryl
Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won’t find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.
Who You Are
You’re good at what you do and possess the required experience to prove it. However, equally as important – you have a growth mindset; keen to drive your own personal and professional development. You are customer-focused – someone who prioritizes customer success in their work. And finally, you’re open and borderless – naturally inclusive in how you work with others.
Required Technical and Professional Expertise
8-10 years of experience as a Data Scientist.
Programming Languages: Strong proficiency in Python and/or R, with libraries for data manipulation (e.g., Pandas, dplyr), scientific computing (e.g., NumPy), and machine learning (e.g., Scikit-learn, TensorFlow, PyTorch).
Statistics and Probability: A solid understanding of statistical inference, hypothesis testing, probability distributions, and experimental design.
Machine Learning: Deep knowledge of various machine learning algorithms, their underlying principles, and when to apply them.
Database Querying: Proficiency in SQL for extracting and manipulating data from relational databases.
Data Visualization: Ability to create compelling and informative visualizations using tools like Matplotlib, Seaborn, Plotly, or Tableau.
Big Data Concepts: Familiarity with concepts and tools for handling large datasets, though often relying on Data Engineers for infrastructure.
Domain Knowledge: Understanding of the specific industry or business domain to contextualize data and insights.
Preferred Technical And Professional Experience
Degree in a scientific discipline, such as Computer Science, Software Engineering, or Information Technology
Being You
Diversity is a whole lot more than what we look like or where we come from, it’s how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we’re not doing it single-handedly: Our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That’s the Kyndryl Way.
What You Can Expect
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value.
Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you, we want you to succeed so that together, we will all succeed. Get Referred! If you know someone that works at Kyndryl, when asked ‘How Did You Hear About Us’ during the application process, select ‘Employee Referral’ and enter your contact's Kyndryl email address.
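As a hedged illustration of the model development and validation responsibilities described earlier in this posting, the sketch below trains and evaluates a simple classifier with caret on a built-in dataset; it stands in for, and does not represent, any Kyndryl workflow.

```r
library(caret)

# Illustrative classification workflow on a built-in dataset; real engagements
# would use domain data and problem-specific features.
set.seed(42)
idx       <- createDataPartition(iris$Species, p = 0.8, list = FALSE)
train_set <- iris[idx, ]
test_set  <- iris[-idx, ]

model <- train(
  Species ~ .,
  data      = train_set,
  method    = "rf",  # random forest via the randomForest package
  trControl = trainControl(method = "cv", number = 5)
)

pred <- predict(model, newdata = test_set)
confusionMatrix(pred, test_set$Species)  # hold-out accuracy and per-class metrics
```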
Posted 1 month ago
5.0 years
0 Lacs
Jaipur, Rajasthan, India
On-site
Job Summary
Auriga IT is seeking a proactive, problem-solving Data Analyst with 3–5 years of experience owning end-to-end data pipelines. You’ll partner with stakeholders across engineering, product, marketing, and finance to turn raw data into actionable insights that drive business decisions. You must be fluent in the core libraries, tools, and cloud services listed below.
Your Responsibilities
Pipeline Management: Design, build, and maintain ETL/ELT workflows using orchestration frameworks (e.g., Airflow, dbt).
Exploratory Data Analysis & Visualization: Perform EDA and statistical analysis using Python or R. Prototype and deliver interactive charts and dashboards.
BI & Reporting: Develop dashboards and scheduled reports to surface KPIs and trends. Configure real-time alerts for data anomalies or thresholds.
Insights Delivery & Storytelling: Translate complex analyses (A/B tests, forecasting, cohort analysis) into clear recommendations. Present findings to both technical and non-technical audiences.
Collaboration & Governance: Work cross-functionally to define data requirements, ensure quality, and maintain governance. Mentor junior team members on best practices in code, version control, and documentation.
Key Skills
You must know at least one technology from each category below:
Data Manipulation & Analysis: Python: pandas, NumPy; R: tidyverse (dplyr, tidyr)
Visualization & Dashboarding: Python: matplotlib, seaborn, Plotly; R: ggplot2, Shiny
BI Platforms: Commercial or Open-Source (e.g., Tableau, Power BI, Apache Superset, Grafana)
ETL/ELT Orchestration: Apache Airflow, dbt, or equivalent
Cloud Data Services: AWS (Redshift, Athena, QuickSight), GCP (BigQuery, Data Studio), Azure (Synapse, Data Explorer)
Databases & Querying: RDBMS: strong SQL skills (PostgreSQL, MySQL, Snowflake); decent knowledge of NoSQL databases
Additionally
Bachelor’s or Master’s in a quantitative field (Statistics, CS, Economics, etc.).
3–5 years in a data analyst (or similar) role with end-to-end pipeline ownership.
Strong problem-solving mindset and excellent communication skills.
Certification in Power BI or Tableau is a plus.
Desired Skills & Attributes
Familiarity with version control (Git) and CI/CD for analytics code.
Exposure to basic machine-learning workflows (scikit-learn, caret).
Comfortable working in Agile/Scrum environments and collaborating across domains.
About Company
Hi there! We are Auriga IT. We power businesses across the globe through digital experiences, data and insights. From the apps we design to the platforms we engineer, we're driven by an ambition to create world-class digital solutions and make an impact. Our team has been part of building the solutions for the likes of Zomato, Yes Bank, Tata Motors, Amazon, Snapdeal, Ola, Practo, Vodafone, Meesho, Volkswagen, Droom and many more. We are a group of people who just could not leave our college life behind, and the inception of Auriga was solely based on a desire to keep working together with friends and enjoying the extended college life. Who hasn't dreamt of working with friends for a lifetime? Come join in!
Our Website: https://aurigait.com/
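To illustrate the exploratory analysis and visualization work described above, here is a small sketch using dplyr and ggplot2 on a built-in dataset standing in for real business data.

```r
library(ggplot2)
library(dplyr)

# Quick exploratory pass: average highway mileage by vehicle class
mpg %>%
  group_by(class) %>%
  summarise(avg_hwy = mean(hwy), n = n()) %>%
  arrange(desc(avg_hwy))

# Distribution of highway mileage by vehicle class
ggplot(mpg, aes(x = class, y = hwy)) +
  geom_boxplot() +
  labs(title = "Highway mileage by vehicle class",
       x = NULL, y = "Miles per gallon (highway)")
```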
Posted 1 month ago
3.0 - 5.0 years
5 - 7 Lacs
Jaipur
On-site
Job Summary
Auriga IT is seeking a proactive, problem-solving Data Analyst with 3–5 years of experience owning end-to-end data pipelines. You’ll partner with stakeholders across engineering, product, marketing, and finance to turn raw data into actionable insights that drive business decisions. You must be fluent in the core libraries, tools, and cloud services listed below.
Your Responsibilities:
Pipeline Management: Design, build, and maintain ETL/ELT workflows using orchestration frameworks (e.g., Airflow, dbt).
Exploratory Data Analysis & Visualization: Perform EDA and statistical analysis using Python or R. Prototype and deliver interactive charts and dashboards.
BI & Reporting: Develop dashboards and scheduled reports to surface KPIs and trends. Configure real-time alerts for data anomalies or thresholds.
Insights Delivery & Storytelling: Translate complex analyses (A/B tests, forecasting, cohort analysis) into clear recommendations. Present findings to both technical and non-technical audiences.
Collaboration & Governance: Work cross-functionally to define data requirements, ensure quality, and maintain governance. Mentor junior team members on best practices in code, version control, and documentation.
Key Skills:
You must know at least one technology from each category below:
Data Manipulation & Analysis: Python: pandas, NumPy; R: tidyverse (dplyr, tidyr)
Visualization & Dashboarding: Python: matplotlib, seaborn, Plotly; R: ggplot2, Shiny
BI Platforms: Commercial or Open-Source (e.g., Tableau, Power BI, Apache Superset, Grafana)
ETL/ELT Orchestration: Apache Airflow, dbt, or equivalent
Cloud Data Services: AWS (Redshift, Athena, QuickSight), GCP (BigQuery, Data Studio), Azure (Synapse, Data Explorer)
Databases & Querying: RDBMS: strong SQL skills (PostgreSQL, MySQL, Snowflake); decent knowledge of NoSQL databases
Additionally:
Bachelor’s or Master’s in a quantitative field (Statistics, CS, Economics, etc.).
3–5 years in a data analyst (or similar) role with end-to-end pipeline ownership.
Strong problem-solving mindset and excellent communication skills.
Certification in Power BI or Tableau is a plus.
Desired Skills & Attributes
Familiarity with version control (Git) and CI/CD for analytics code.
Exposure to basic machine-learning workflows (scikit-learn, caret).
Comfortable working in Agile/Scrum environments and collaborating across domains.
About Company
Hi there! We are Auriga IT. We power businesses across the globe through digital experiences, data and insights. From the apps we design to the platforms we engineer, we're driven by an ambition to create world-class digital solutions and make an impact. Our team has been part of building the solutions for the likes of Zomato, Yes Bank, Tata Motors, Amazon, Snapdeal, Ola, Practo, Vodafone, Meesho, Volkswagen, Droom and many more. We are a group of people who just could not leave our college life behind, and the inception of Auriga was solely based on a desire to keep working together with friends and enjoying the extended college life. Who hasn't dreamt of working with friends for a lifetime? Come join in!
https://www.aurigait.com/
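As a hedged sketch of the cohort-analysis work mentioned above, the snippet below assigns customers to acquisition cohorts and aggregates monthly revenue with dplyr and lubridate; the orders table and its columns are invented for the example.

```r
library(dplyr)
library(lubridate)

# Toy orders table; in a real pipeline this would arrive from a warehouse query
orders <- data.frame(
  customer_id = c(1, 1, 2, 2, 3, 3, 3),
  order_date  = as.Date(c("2024-01-05", "2024-02-10", "2024-01-20",
                          "2024-03-02", "2024-02-14", "2024-02-28", "2024-03-15")),
  revenue     = c(120, 80, 200, 150, 60, 90, 110)
)

# Assign each customer to the month of their first order, then track monthly revenue
orders %>%
  group_by(customer_id) %>%
  mutate(cohort = floor_date(min(order_date), "month")) %>%
  group_by(cohort, order_month = floor_date(order_date, "month")) %>%
  summarise(customers = n_distinct(customer_id),
            revenue   = sum(revenue),
            .groups   = "drop")
```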
Posted 1 month ago
8.0 - 10.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: R Analytics Lead
Experience: 8-10 years in Analytics (SAS/SPSS/R/Python) with at least the last 4 years in a senior position focusing on R Studio, R Server, and similar
Location: Mumbai, India [Full Time Office Hours]
Department: Business Analytics
Company: Smartavya Analytica Private Limited is a niche Data and AI company. Based in Pune, we are pioneers in data-driven innovation, transforming enterprise data into strategic insights. Established in 2017, our team has experience in handling large datasets of up to 20 PB in a single implementation, delivering many successful data and AI projects across major industries, including retail, finance, telecom, manufacturing, insurance, and capital markets. We are leaders in Big Data, Cloud and Analytics projects with super specialisation in very large Data Platforms. https://smart-analytica.com
Empowering Your Digital Transformation with Data Modernization and AI
Job Summary: We are seeking a highly skilled R Analytics Lead to oversee our analytics team and drive data-driven decision-making processes. The ideal candidate will have extensive experience in R programming, data analysis, and statistical modelling, and will be responsible for leading analytics projects that provide actionable insights to support business objectives.
Responsibilities and Duties:
Lead the development and implementation of advanced analytics solutions using R.
Manage and mentor a team of data analysts and data scientists.
Collaborate with cross-functional teams to identify business needs and translate them into analytical solutions.
Design and execute complex data analyses, including predictive modelling, machine learning, and statistical analysis.
Develop and maintain data pipelines and ETL processes.
Ensure the accuracy and integrity of data and analytical results.
Academic Qualifications: Bachelor’s or Master’s degree in Statistics, Computer Science, Data Science, or a related field.
Skills:
Extensive experience with R programming and related libraries (e.g., ggplot2, dplyr, caret).
Strong background in statistical modelling, machine learning, and data visualization.
Proven experience in leading and managing analytics teams.
Excellent problem-solving skills and attention to detail.
Strong communication skills, with the ability to present complex data insights to non-technical audiences.
Experience with other analytics tools and programming languages (e.g., Python, SQL) is a plus.
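To illustrate the statistical modelling skills listed above, here is a minimal predictive-modelling sketch in base R on a built-in dataset; a production model would be built on business data with proper validation.

```r
# Fit a simple linear model relating fuel efficiency to weight and horsepower
model <- lm(mpg ~ wt + hp, data = mtcars)
summary(model)  # coefficients, R-squared, residual diagnostics

# Predict fuel efficiency for a hypothetical new vehicle
new_car <- data.frame(wt = 2.8, hp = 120)
predict(model, newdata = new_car, interval = "prediction")
```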
Posted 1 month ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Responsibilities:
We are seeking an experienced Data Scientist to lead the development of a Data Science program. You will work closely with various stakeholders to derive deep industry knowledge across the paper, water, leather, and performance chemical industries. You will help develop a data strategy for the company, including collection of the right data, creation of the data science project portfolio, partnering with external providers, and augmenting capabilities with additional internal hires. A large part of the job is communicating and developing relationships with key stakeholders and subject matter experts to tee up proof-of-concept (PoC) projects to demonstrate how data science can solve old problems in unique and novel ways. You will not have a large internal team to rely on, at least initially, so individual expertise, breadth of data science knowledge, and ability to partner with external companies will be essential for success. In addition to the pure data science problems, you will be working closely with a multi-disciplinary team consisting of sensor scientists, software engineers, network engineers, mechanical/electrical engineers, and chemical engineers in the development and deployment of IoT solutions.
Basic Qualifications:
Bachelor’s degree in a quantitative field such as Data Science, Statistics, Applied Mathematics, Physics, Engineering, or Computer Science
5+ years of relevant working experience in an analytical role involving data extraction, analysis, and visualization, and expertise in the following areas:
Expertise in one or more programming languages: R, Python, MATLAB, JMP, Minitab, Java, C++, Scala
Key libraries such as scikit-learn, XGBoost, glmnet, dplyr, ggplot2, RShiny
Experience and knowledge of data mining algorithms, including supervised and unsupervised machine learning techniques in areas such as Gradient Boosting, Decision Trees, Multivariate Regression, Logistic Regression, Neural Networks, Random Forest, SVM, Naive Bayes, Time Series, and Optimization
Microsoft IoT/data science toolkit: Azure Machine Learning, Data Lake, Data Lake Analytics, Workbench, IoT Hub, Stream Analytics, CosmosDB, Time Series Insights, Power BI
Data querying languages: SQL, Hadoop/Hive
A demonstrated record of success with a verifiable portfolio of problems tackled
Preferred Qualifications:
Master’s or PhD degree in a quantitative field such as Data Science, Statistics, Applied Mathematics, Physics, Engineering, or Computer Science
Experience in the specialty chemicals sector or a similar industry
Background in engineering, especially Chemical Engineering
Experience starting up a data science program
Experience working with global stakeholders
Experience working in a start-up environment, preferably in an IoT company
Knowledge of quantitative modeling tools and statistical analysis
Personality Traits:
A strong business focus, ownership, and inner self-drive to develop data science solutions for real-world customers with tangible impact.
Ability to collaborate effectively with multi-disciplinary and passionate team members.
Ability to communicate with a diverse set of stakeholders.
Strong planning and organization skills, with the ability to manage multiple complex projects.
A life-long learner who constantly updates skills.
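As a small illustration of the regularised-regression tooling named above (glmnet), here is a hedged lasso sketch on a built-in dataset; real use would involve plant or sensor data and careful feature preparation.

```r
library(glmnet)

# Build the predictor matrix (no intercept column) and response vector
x <- model.matrix(mpg ~ . - 1, data = mtcars)
y <- mtcars$mpg

set.seed(1)
cv_fit <- cv.glmnet(x, y, alpha = 1)  # lasso with cross-validated lambda
coef(cv_fit, s = "lambda.min")        # coefficients at the best lambda
```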
Posted 1 month ago
5.0 years
0 Lacs
New Delhi, Delhi, India
On-site
Company Description
Technocratic Solutions is a trusted and renowned provider of technical resources on a contract basis, serving businesses globally. With a dedicated team of developers, we deliver top-notch software solutions in cutting-edge technologies such as PHP, Java, JavaScript, Drupal, QA, Blockchain, AI, and more. Our mission is to empower businesses worldwide by offering high-quality technical resources that meet project requirements and objectives. We prioritize exceptional customer service and satisfaction, delivering our services quickly, efficiently, and cost-effectively. Join us and experience the difference of working with a reliable partner driven by excellence and focused on your success.
Job Title: AI/ML Engineer – Generative AI, Databricks, R Programming
Location: Delhi NCR / Pune
Experience Level: 5 years
Job Summary: We are seeking a highly skilled and motivated AI/ML Engineer with hands-on experience in Generative AI, Databricks, and R programming to join our advanced analytics team. The ideal candidate will be responsible for designing, building, and deploying intelligent solutions that drive innovation, automation, and insight generation using modern AI/ML technologies.
Key Responsibilities:
Develop and deploy scalable ML and Generative AI models using Databricks (Spark-based architecture).
Build pipelines for data ingestion, transformation, and model training/inference on Databricks.
Implement and fine-tune Generative AI models (e.g., LLMs, diffusion models) for various use cases like content generation, summarization, and simulation.
Leverage R for advanced statistical modeling, data visualization, and integration with ML pipelines.
Collaborate with data scientists, data engineers, and product teams to translate business needs into technical solutions.
Ensure reproducibility, performance, and governance of AI/ML models.
Stay updated with the latest trends and technologies in AI/ML and GenAI and apply them where applicable.
Required Skills & Qualifications:
Bachelor's/Master’s degree in Computer Science, Data Science, Statistics, or a related field.
5 years of hands-on experience in Machine Learning/AI, with at least 2 years in Generative AI.
Proficiency in Databricks, including Spark MLlib, Delta Lake, and MLflow.
Strong command of R programming, especially for statistical modeling and data visualization (ggplot2, dplyr, caret, etc.).
Experience with LLMs, transformers (HuggingFace, LangChain, etc.), and other GenAI frameworks.
Familiarity with Python, SQL, and cloud platforms (AWS/Azure/GCP) is a plus.
Excellent problem-solving, communication, and collaboration skills.
Preferred:
Certifications in Databricks, ML/AI (e.g., Azure/AWS ML), or R.
Experience in regulated industries (finance, healthcare, etc.).
Exposure to MLOps, CI/CD for ML, and version control (Git).
What We Offer:
Competitive salary and benefits
Flexible work environment
Opportunities for growth and learning in cutting-edge AI/ML
Collaborative and innovative team culture
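To illustrate how R modelling can be tied into the MLflow tracking mentioned above, here is a hedged sketch assuming the mlflow R package and a reachable tracking backend are configured; the model and logged values are illustrative only.

```r
library(mlflow)  # R client for MLflow; assumes a tracking backend is set up

# Fit a small illustrative model and record the run so results are reproducible
mlflow_start_run()

model <- lm(mpg ~ wt + hp, data = mtcars)
rmse  <- sqrt(mean(residuals(model)^2))

mlflow_log_param("formula", "mpg ~ wt + hp")
mlflow_log_metric("rmse", rmse)

mlflow_end_run()
```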
Posted 1 month ago
5.0 years
0 Lacs
India
On-site
R Shiny Web Developer
Job Summary: We are seeking a skilled and results-driven R Shiny Web Developer with 4–5 years of experience in developing interactive web applications using R and Shiny. The ideal candidate will be responsible for building, deploying, and maintaining scalable dashboards and data-driven applications to support business intelligence and analytical needs.
Key Responsibilities:
Design, develop, and maintain interactive web applications using R Shiny.
Collaborate with data scientists and analysts to understand requirements and translate them into user-friendly applications.
Optimize Shiny applications for performance and scalability.
Integrate APIs and external data sources into Shiny apps.
Implement data visualization best practices using ggplot2, plotly, and other R libraries.
Ensure security, accessibility, and cross-browser compatibility of applications.
Write clean, efficient, and well-documented R code.
Troubleshoot, debug, and upgrade existing applications.
Work in an agile environment and contribute to sprint planning, reviews, and stand-ups.
Required Skills & Qualifications:
Bachelor’s or Master’s degree in Computer Science, Statistics, Data Science, or a related field.
4–5 years of hands-on experience in R and Shiny application development.
Proficient in R packages like dplyr, tidyr, data.table, ggplot2, shinyjs, etc.
Experience with HTML, CSS, and JavaScript for front-end customization.
Solid understanding of reactive programming in Shiny.
Experience with version control tools like Git.
Ability to work with large datasets and optimize data handling in R.
Strong problem-solving skills and attention to detail.
Good communication and documentation skills.
Preferred Skills (Nice to Have):
Experience with Docker, AWS, or other cloud platforms.
Familiarity with SQL, Python, or Tableau.
Knowledge of deploying Shiny apps using Shiny Server or RStudio Connect.
Understanding of RESTful APIs and integration techniques.
If interested, please submit your CV to Khushboo@Sourcebae.com or share it via WhatsApp at 8827565832.
Stay updated with our latest job opportunities and company news by following us on LinkedIn: https://www.linkedin.com/company/sourcebae
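To illustrate the reactive programming requirement above, here is a minimal self-contained Shiny app; the dataset and layout are illustrative only.

```r
library(shiny)
library(ggplot2)

ui <- fluidPage(
  titlePanel("Highway mileage explorer"),
  selectInput("cls", "Vehicle class", choices = sort(unique(mpg$class))),
  plotOutput("hwy_plot")
)

server <- function(input, output, session) {
  # Reactive expression: re-filters only when the selected class changes
  filtered <- reactive({
    subset(mpg, class == input$cls)
  })

  output$hwy_plot <- renderPlot({
    ggplot(filtered(), aes(x = displ, y = hwy)) +
      geom_point() +
      labs(x = "Engine displacement (L)", y = "Highway MPG")
  })
}

shinyApp(ui, server)
```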
Posted 1 month ago
0 years
6 - 9 Lacs
Gurgaon
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.
Title and Summary
Associate Analyst, R Programmer-1
Overview
The Mastercard Economics Institute (MEI) is an economics lab powering scale at Mastercard by owning economic thought leadership in support of Mastercard’s efforts to build a more inclusive and sustainable digital economy. The Economics Institute was launched in 2020 to analyze economic trends through the lens of the consumer to deliver tailored and actionable insights on economic issues for customers, partners and policymakers. The Institute is composed of a team of economists and data scientists that utilize & synthesize the anonymized and aggregated data from the Mastercard network together with public data to bring powerful insights to life, in the form of 1:1 presentations, global thought leadership, media participation, and commercial work through the company’s product suites.
About the Role
We are looking for an R programmer to join Mastercard’s Economics Institute, reporting to the team lead for Economics Technology. An individual who will:
create clear, compelling data visualisations that communicate economic insights to diverse audiences
develop reusable R functions and packages to support analysis and automation
create and format analytical content using R Markdown and/or Quarto
design and build scalable Shiny apps
develop interactive visualisations using JavaScript charting libraries (e.g. Plotly, Highcharts, D3.js) or front-end frameworks (e.g. React, Angular, Vue.js)
work with databases and data platforms (e.g. SQL, Hadoop)
write clear, well-documented code that others can understand and maintain
collaborate using Git for version control
All About You
proficient in R and the RStudio IDE
proficient in R packages like dplyr for data cleaning, transformation, and aggregation
familiarity with dependency management and documentation in R (e.g. roxygen2)
familiar with version control concepts and tools (e.g. Git, GitHub, Bitbucket) for collaborative development
experience writing SQL and working with relational databases
creative and passionate about data, coding, and technology
strong collaborator who can also work independently
organized and able to prioritise work across multiple projects
comfortable working with engineers, product owners, data scientists, economists
Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
Abide by Mastercard’s security policies and practices;
Ensure the confidentiality and integrity of the information being accessed;
Report any suspected information security violation or breach, and
Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.
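As a small illustration of the interactive-visualisation work described above, here is a hedged sketch using the plotly R package on a made-up spending index series.

```r
library(plotly)

# Interactive line chart of a small, invented spending index series
spend <- data.frame(
  month = seq(as.Date("2024-01-01"), by = "month", length.out = 6),
  index = c(100, 103, 101, 107, 112, 118)
)

plot_ly(spend, x = ~month, y = ~index,
        type = "scatter", mode = "lines+markers") %>%
  layout(title = "Illustrative spending index",
         yaxis = list(title = "Index (Jan 2024 = 100)"))
```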
Posted 1 month ago
4.0 years
0 Lacs
Greater Kolkata Area
On-site
Company Description
UsefulBI Corporation provides comprehensive solutions across Data Engineering, Data Science, AI/ML, and Business Intelligence. The company's mission is to empower astute business decisions through integrating data insights and cutting-edge AI. UsefulBI excels in data architecture, cloud strategies, Business Intelligence, and Generative AI to deliver outcomes that surpass individual capabilities.
Role Description
We are seeking a skilled R and Python Developer with hands-on experience developing and deploying applications using Posit (formerly RStudio) tools, including Shiny Server, Posit Connect, and R Markdown. The ideal candidate will have a strong background in data analysis, application development, and creating interactive dashboards for data-driven decision-making.
Key Responsibilities
Design, develop, and deploy interactive web applications using R Shiny and Posit Connect.
Write clean, efficient, and modular code in R and Python for data processing and analysis.
Build and maintain R Markdown reports and Python notebooks for business reporting.
Integrate R and Python scripts for advanced analytics and automation workflows.
Collaborate with data scientists, analysts, and business users to gather requirements and deliver scalable solutions.
Troubleshoot application issues and optimize performance on the Posit platform (RStudio Server, Posit Connect).
Work with APIs, databases (SQL, NoSQL), and cloud platforms (e.g., AWS, Azure) as part of application development.
Ensure version control using Git and CI/CD for application deployment.
Required Qualifications
4+ years of development experience using R and Python.
Strong experience with Shiny apps, R Markdown, and Posit Connect.
Proficient in using packages like dplyr, ggplot2, plotly, reticulate, and shiny.
Experience with the Python data stack (pandas, numpy, matplotlib, etc.).
Hands-on experience with deploying apps on Posit Server / Connect.
Familiarity with Git, Docker, and CI/CD tools.
Excellent problem-solving and communication skills.
Posted 1 month ago
0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.
Title And Summary
Associate Analyst, R Programmer-3
Overview
The Mastercard Economics Institute (MEI) is an economics lab powering scale at Mastercard by owning economic thought leadership in support of Mastercard’s efforts to build a more inclusive and sustainable digital economy. The Economics Institute was launched in 2020 to analyze economic trends through the lens of the consumer to deliver tailored and actionable insights on economic issues for customers, partners and policymakers. The Institute is composed of a team of economists and data scientists that utilize & synthesize the anonymized and aggregated data from the Mastercard network together with public data to bring powerful insights to life, in the form of 1:1 presentations, global thought leadership, media participation, and commercial work through the company’s product suites.
About The Role
We are looking for an R programmer to join Mastercard’s Economics Institute, reporting to the team lead for Economics Technology. An individual who will:
create clear, compelling data visualisations that communicate economic insights to diverse audiences
develop reusable R functions and packages to support analysis and automation
create and format analytical content using R Markdown and/or Quarto
design and build scalable Shiny apps
develop interactive visualisations using JavaScript charting libraries (e.g. Plotly, Highcharts, D3.js) or front-end frameworks (e.g. React, Angular, Vue.js)
work with databases and data platforms (e.g. SQL, Hadoop)
write clear, well-documented code that others can understand and maintain
collaborate using Git for version control
All About You
proficient in R and the RStudio IDE
proficient in R packages like dplyr for data cleaning, transformation, and aggregation
familiarity with dependency management and documentation in R (e.g. roxygen2)
familiar with version control concepts and tools (e.g. Git, GitHub, Bitbucket) for collaborative development
experience writing SQL and working with relational databases
creative and passionate about data, coding, and technology
strong collaborator who can also work independently
organized and able to prioritise work across multiple projects
comfortable working with engineers, product owners, data scientists, economists
Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
Abide by Mastercard’s security policies and practices;
Ensure the confidentiality and integrity of the information being accessed;
Report any suspected information security violation or breach, and
Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.
R-250450
Posted 1 month ago
0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.
Title And Summary
Associate Analyst, R Programmer-2
Overview
The Mastercard Economics Institute (MEI) is an economics lab powering scale at Mastercard by owning economic thought leadership in support of Mastercard’s efforts to build a more inclusive and sustainable digital economy. The Economics Institute was launched in 2020 to analyze economic trends through the lens of the consumer to deliver tailored and actionable insights on economic issues for customers, partners and policymakers. The Institute is composed of a team of economists and data scientists that utilize & synthesize the anonymized and aggregated data from the Mastercard network together with public data to bring powerful insights to life, in the form of 1:1 presentations, global thought leadership, media participation, and commercial work through the company’s product suites.
About The Role
We are looking for an R programmer to join Mastercard’s Economics Institute, reporting to the team lead for Economics Technology. An individual who will:
create clear, compelling data visualisations that communicate economic insights to diverse audiences
develop reusable R functions and packages to support analysis and automation
create and format analytical content using R Markdown and/or Quarto
design and build scalable Shiny apps
develop interactive visualisations using JavaScript charting libraries (e.g. Plotly, Highcharts, D3.js) or front-end frameworks (e.g. React, Angular, Vue.js)
work with databases and data platforms (e.g. SQL, Hadoop)
write clear, well-documented code that others can understand and maintain
collaborate using Git for version control
All About You
proficient in R and the RStudio IDE
proficient in R packages like dplyr for data cleaning, transformation, and aggregation
familiarity with dependency management and documentation in R (e.g. roxygen2)
familiar with version control concepts and tools (e.g. Git, GitHub, Bitbucket) for collaborative development
experience writing SQL and working with relational databases
creative and passionate about data, coding, and technology
strong collaborator who can also work independently
organized and able to prioritise work across multiple projects
comfortable working with engineers, product owners, data scientists, economists
Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
Abide by Mastercard’s security policies and practices;
Ensure the confidentiality and integrity of the information being accessed;
Report any suspected information security violation or breach, and
Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.
R-250449
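As an illustration of the reusable, documented R functions and roxygen2 familiarity the posting asks for, here is a minimal sketch. The function name, behaviour, and default are made up for the example.

# Minimal sketch of a reusable helper documented with roxygen2 comments.

#' Index a numeric series to a base period
#'
#' @param x Numeric vector of values (e.g. monthly spend).
#' @param base_position Position in `x` used as the 100 reference point.
#' @return Numeric vector rescaled so that `x[base_position]` equals 100.
#' @examples
#' index_to_base(c(50, 55, 60))
index_to_base <- function(x, base_position = 1) {
  stopifnot(is.numeric(x), base_position >= 1, base_position <= length(x))
  100 * x / x[base_position]
}

index_to_base(c(200, 210, 190))  # 100 105 95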
Posted 1 month ago
0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.
Title And Summary
Associate Analyst, R Programmer-1
Overview
The Mastercard Economics Institute (MEI) is an economics lab powering scale at Mastercard by owning economic thought leadership in support of Mastercard’s efforts to build a more inclusive and sustainable digital economy. The Economics Institute was launched in 2020 to analyze economic trends through the lens of the consumer to deliver tailored and actionable insights on economic issues for customers, partners and policymakers. The Institute is composed of a team of economists and data scientists that utilize & synthesize the anonymized and aggregated data from the Mastercard network together with public data to bring powerful insights to life, in the form of 1:1 presentations, global thought leadership, media participation, and commercial work through the company’s product suites.
About The Role
We are looking for an R programmer to join Mastercard’s Economics Institute, reporting to the team lead for Economics Technology. An individual who will:
create clear, compelling data visualisations that communicate economic insights to diverse audiences
develop reusable R functions and packages to support analysis and automation
create and format analytical content using R Markdown and/or Quarto
design and build scalable Shiny apps
develop interactive visualisations using JavaScript charting libraries (e.g. Plotly, Highcharts, D3.js) or front-end frameworks (e.g. React, Angular, Vue.js)
work with databases and data platforms (e.g. SQL, Hadoop)
write clear, well-documented code that others can understand and maintain
collaborate using Git for version control
All About You
proficient in R and the RStudio IDE
proficient in R packages like dplyr for data cleaning, transformation, and aggregation
familiarity with dependency management and documentation in R (e.g. roxygen2)
familiar with version control concepts and tools (e.g. Git, GitHub, Bitbucket) for collaborative development
experience writing SQL and working with relational databases
creative and passionate about data, coding, and technology
strong collaborator who can also work independently
organized and able to prioritise work across multiple projects
comfortable working with engineers, product owners, data scientists, economists
Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
Abide by Mastercard’s security policies and practices;
Ensure the confidentiality and integrity of the information being accessed;
Report any suspected information security violation or breach, and
Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.
R-250448
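For the interactive-visualisation requirement, here is a minimal sketch using the plotly R package, one of the JavaScript charting libraries the posting lists. The trend data are simulated purely for illustration.

# Minimal sketch of an interactive line chart with plotly.
library(plotly)

trend <- data.frame(
  month = seq(as.Date("2024-01-01"), by = "month", length.out = 12),
  index = 100 + cumsum(rnorm(12, mean = 0.5))
)

plot_ly(trend, x = ~month, y = ~index,
        type = "scatter", mode = "lines+markers",
        hovertemplate = "%{x|%b %Y}: %{y:.1f}<extra></extra>") %>%
  layout(title = "Illustrative spending index",
         yaxis = list(title = "Index (base = 100)"))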
Posted 1 month ago
2.0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
About Chargebee:
Chargebee is a leading provider of billing and monetization solutions for thousands of businesses at every stage of growth — from early-stage startups to global enterprises. With our powerful suite of multi-product solutions, Chargebee helps businesses unlock unparalleled revenue growth, experiment with new offerings and monetization models, and stay globally compliant as they scale. Chargebee counts innovative companies like Zapier, Freshworks, DeepL, Condé Nast, and Pret A Manger amongst its global customer base and is proud to have been consistently recognized by customers as a Leader in Subscription Management on G2. With headquarters in North Bethesda, Maryland, our 1000+ team members work remotely throughout the world, including in India, Europe and the US.
Job Summary:
The Chargebee Product Analytics team provides analytical insights and solutions for critical cross-company functions and processes that affect how we show up in the world for our customers, including SaaS Billing, Retention, Customer Focus, and Payments. As a Data Scientist, you can use your quantitative skill set, empathy towards customers, and collaborative spirit to work closely with Product, Ops, Marketing, Engineering, and Finance to provide data-driven insights. In addition, it will allow you to find new opportunities to take our product and business to the next level.
As a Data Scientist, you'll focus on the following:
The analysis, manipulation, and visualization of data, and data storytelling
At times, you shouldn't hesitate to do data engineering/ETL (extract-transform-load)
Providing product or business insights and analytical strategy
An additional focus on the design and analysis of experiments and pseudo-control techniques
We've noticed that the title "Data Scientist" represents various work across the industry. If you're more passionate about building machine learning (ML) or econometric models, please check out the Machine Learning roles on our career page!
Roles and Responsibilities:
Refine ambiguous questions and generate new hypotheses about the product and business through a deep understanding of the data, our customers, and our business.
Design experiments and interpret the results to draw detailed and impactful conclusions.
Define how our teams measure success by developing Key Performance Indicators, Input/Output metrics, and other customer/business metrics in close partnership with product and other subject areas such as engineering, operations, and marketing.
Collaborate with data scientists, BI teams, and engineers to build and improve the availability, integrity, accuracy, and reliability of data logging/telemetry, data pipelines, and reporting insights.
Develop data-driven business insights and work with cross-functional partners to identify opportunities and recommend prioritization of product, growth, retention, and optimization initiatives.
Must Haves:
M.S. or Bachelor's degree in Math, Economics, Statistics, Engineering, Computer Science, or other quantitative fields. (If an M.S. degree, a minimum of 2+ years of industry experience is required; if a Bachelor's degree, a minimum of 3+ years of industry experience as a Product Analyst, Data Scientist, or equivalent.)
Experience in at least one scripting language (SQL, Python, R, or SAS preferred)
Basic understanding of experimental design (such as A/B experiments) and statistical methods
Willingness to get your hands dirty with messy data to identify product opportunities, streamline data pipelines, automate, and gain efficiency
Ability to communicate effectively and manage relationships with partners coming from both technical and non-technical backgrounds
Strong storytelling: distill interesting and hard-to-find insights into a compelling, concise data story
Experience with experimental design and statistical methods such as causal inference
Strong judgment, critical thinking, and decision-making skills
Ability to solve complex business problems that cross multiple product/project areas and teams
Balance attention to detail with swift execution
Nice to have:
Advanced degrees in Math, Economics, Statistics, Engineering, Computer Science, Operations Research, Quantitative Analysis, Machine Learning, or other quantitative fields.
5+ years of industry experience in consumer-facing product and data science.
Data Science or Product Analytics experience in SaaS companies or the B2B fintech sector.
Technical Skillset:
SQL (Advanced): Highly proficient in writing complex SQL queries for data extraction, manipulation, and analysis from various databases. This includes understanding different joins, subqueries, window functions, and optimization techniques.
Data Analysis & Interpretation: Strong ability to collect, clean, process, analyze, and interpret large and complex datasets to identify trends, patterns, anomalies, and root causes.
Statistical Analysis: Solid understanding of statistical concepts (hypothesis testing, regression, correlation, A/B testing, sampling distributions) and their application to product and business problems.
Programming (Python/R): Proficiency in Python or R for data manipulation (pandas/dplyr), statistical modeling, machine learning (scikit-learn, TensorFlow/PyTorch, depending on the data science depth), and automation of data tasks.
Data Visualization: Expertise in creating clear, compelling, and actionable dashboards and reports using tools like Tableau. Ability to tell a story with data.
A/B Testing & Experimentation: Experience in designing, executing, and analyzing A/B tests to measure the impact of product changes and new features. Understanding of experimental design principles.
Product Analytics Tools: Familiarity with product analytics platforms (e.g., Sentry, Pendo, Google Analytics, Splunk) for tracking user behavior, engagement, and feature adoption.
Business & Product Skills:
Deep understanding of the B2B SaaS fintech business model, sales cycles, customer journey, and revenue drivers.
Product Strategy & Roadmap: Ability to translate data insights into actionable product recommendations and contribute to product strategy and roadmap development.
Market Research & Competitive Analysis: Experience in conducting market research, analyzing competitor products and strategies, and identifying market opportunities.
User Experience (UX) Understanding: Ability to analyze user behavior data to identify pain points, optimize user flows, and enhance overall user experience.
Problem Solving: Highly analytical and methodical approach to identifying problems, breaking them down, and developing data-driven solutions.
Soft Skills:
Logical Thinking: Exceptional ability to think critically, identify underlying issues, and construct sound arguments based on data.
Stakeholder Management: Proven ability to effectively communicate with and manage expectations of diverse stakeholders (product managers, engineers, sales, marketing, leadership). This includes translating complex technical findings into understandable business insights.
Communication (Verbal & Written): Excellent presentation skills, ability to articulate complex concepts clearly and concisely to both technical and non-technical audiences. Strong documentation skills.
Collaboration: Ability to work effectively in cross-functional teams, fostering a collaborative environment.
Proactive & Curious: A self-starter with a strong sense of curiosity, a desire to dig deeper into data, and a proactive approach to identifying opportunities.
Adaptability: Ability to thrive in a fast-paced, dynamic B2B SaaS environment and adapt to evolving priorities.
We are Globally Local: With a diverse team across four continents, and customers in over 60 countries, you get to work closely with a global perspective right from your own neighborhood.
We value Curiosity: We believe the next great idea might just be around the corner. Perhaps it's that random thought you had ten minutes ago. We believe in creating an ecosystem that fosters a desire to seek out hard questions, and then figure out answers to them.
Customer! Customer! Customer! Everything we do is driven towards enabling our customers' growth. This means no matter what you do, you will always be adding real value to a real business problem. It's a lot of responsibility, but also a lot of fun.
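To illustrate the A/B-testing and statistical-methods expectations above, here is a minimal sketch of a two-sample proportion test in R. The conversion counts are made up for the example; it is not Chargebee's analysis workflow.

# Minimal sketch: reading out a simple A/B conversion experiment.
conversions <- c(control = 480, treatment = 530)   # converted users per arm
exposed     <- c(control = 10000, treatment = 10000)  # users exposed per arm

test  <- prop.test(x = conversions, n = exposed, correct = FALSE)
rates <- conversions / exposed
lift  <- rates["treatment"] - rates["control"]

cat(sprintf("Control: %.2f%%  Treatment: %.2f%%  Lift: %.2f pp  p-value: %.3f\n",
            100 * rates["control"], 100 * rates["treatment"],
            100 * lift, test$p.value))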
Posted 1 month ago
8.0 - 13.0 years
19 - 20 Lacs
Hyderabad, Bengaluru
Work from Office
Advanced R programming skills (relevant experience over the last 12-24 months), especially with packages like tidyverse, dplyr, data.table, dtplyr, Rcpp and shiny.
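For context, here is a minimal sketch of the data.table and dtplyr style of work this line refers to; the sales table and columns are invented for the example.

# Minimal sketch: the same aggregation in data.table and via dtplyr.
library(data.table)

sales <- data.table(
  region = sample(c("North", "South", "East", "West"), 1e5, replace = TRUE),
  month  = sample(month.abb, 1e5, replace = TRUE),
  value  = runif(1e5, 10, 1000)
)

# Grouped aggregation with keyby: total and mean value per region and month.
summary_dt <- sales[, .(total = sum(value), avg = mean(value)), keyby = .(region, month)]

# The same logic via dtplyr, which translates dplyr verbs into data.table calls.
library(dtplyr)
library(dplyr)
summary_lazy <- lazy_dt(sales) %>%
  group_by(region, month) %>%
  summarise(total = sum(value), avg = mean(value)) %>%
  as.data.table()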
Posted 1 month ago
0 years
0 Lacs
Kolkata, West Bengal, India
On-site
📢 NOW HIRING: EXPERT TEACHERS FOR JAVA, PYTHON, SPSS, SAS & R PROGRAMMING
We at Dr. Sourav Sir's Classes, one of India's leading coaching institutes, are expanding our technical and data science education team. We are looking for dedicated educators and subject matter experts who can teach one or more of the following subjects:
🔹 Subjects Open for Recruitment
JAVA: Teach fundamentals to advanced concepts like OOP, data structures, file handling, multithreading, and JavaFX. Candidates should be well-versed in IDEs like Eclipse or IntelliJ.
PYTHON: Cover programming basics, data types, control structures, functions, modules, OOP in Python, and libraries like NumPy, Pandas, and Matplotlib. Exposure to Flask or Django is a plus.
SPSS: Teach usage of SPSS for statistical analysis, data visualization, descriptive stats, regression, and ANOVA. Ideal for candidates from a research, statistics, or social sciences background.
SAS: Must cover BASE SAS, DATA step programming, PROC SQL, and statistical procedures. Candidates with certification or industry experience will be given preference.
R Programming: Should be able to teach R from scratch, including data frames, statistical analysis, plotting, ggplot2, and packages like dplyr and tidyr. Exposure to RStudio and R Markdown is necessary.
🎓 Who Can Apply? (Eligibility & Skills Required)
Bachelor's / Master's / Ph.D. in Computer Science, Statistics, Mathematics, Data Science, or related disciplines.
Teaching experience (even informal or tutoring) is highly preferred.
Should have strong command over the subject(s) applied for.
Must be able to communicate concepts in simple and relatable ways.
Prior exposure to academic content development or online course delivery is a bonus.
Enthusiasts who can motivate and mentor students aiming for academic or professional excellence.
📘 Job Description & Responsibilities
Lecture Delivery: Conduct high-quality classes (online or offline) with structured topic progression.
Doubt Clarification: Solve student doubts and guide them through conceptual understanding.
Content Creation: Develop assignments, practice problems, solutions, quizzes, and mock tests.
Project Support: Assist students in practical/academic projects using relevant tools.
Academic Strategy: Participate in course planning and innovation in teaching methods.
🧭 Work Format & Job Type
Job Nature: Full-time / Part-time / Freelance (flexibility based on availability and expertise).
Mode of Teaching: Online (pan-India and global outreach) or Offline (Kolkata center only).
Suitable for: College Professors, Working Professionals, Researchers, Doctoral Candidates, Freelance Tutors.
💼 What You Get (Benefits & Opportunities)
Attractive Pay Package: Compensation as per subject and hours taught—flexible and competitive.
Work-Life Balance: Flexible timing and teaching mode.
Reputation: Join one of India's most trusted educational brands.
Academic Exposure: Opportunity to teach students preparing for IIT JAM, ISI, Actuarial Science, Data Science roles, and global exams.
Growth: Expand your profile through content publication, webinars, and live classes.
✅ How to Apply?
📩 Step 1 – WhatsApp your updated CV to: 8981679014
📞 Step 2 – For any query or further information, call or WhatsApp: 8981679014
Join Dr. Sourav Sir's Classes and be a guide, a mentor, and a changemaker in the world of education.
🔗 Your knowledge deserves a wider platform. Let's build futures together.
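As a small illustration of the beginner R syllabus listed above (data frames, dplyr, tidyr, ggplot2), here is a teaching-style sketch using the built-in iris data; the plot and summary are purely illustrative.

# Minimal teaching sketch: dplyr, tidyr, and ggplot2 on a built-in data frame.
library(dplyr)
library(tidyr)
library(ggplot2)

# dplyr: mean of each numeric measurement per species.
iris_summary <- iris %>%
  group_by(Species) %>%
  summarise(across(where(is.numeric), mean))

# tidyr: reshape to long format for plotting.
iris_long <- iris_summary %>%
  pivot_longer(-Species, names_to = "measure", values_to = "mean_value")

# ggplot2: grouped bar chart of the means.
ggplot(iris_long, aes(x = measure, y = mean_value, fill = Species)) +
  geom_col(position = "dodge") +
  labs(title = "Average iris measurements by species", y = "Mean (cm)")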
Posted 1 month ago
3.0 - 5.0 years
30 - 35 Lacs
Mumbai, New Delhi, Bengaluru
Work from Office
About the Role: We are seeking a skilled and versatile Data Scientist/Software Developer with a strong background in R programming and Shiny app development. This role will involve working on a variety of projects, each with defined objectives and timelines. Successful candidates will be comfortable in a dynamic environment, adapting to different datasets, analytical requirements, and business needs. You will be a key contributor to translating data into actionable insights through the development of robust and user-friendly Shiny applications. Responsibilities: Develop Interactive Shiny Applications: Design, develop, test, and deploy interactive web applications using the R Shiny framework. Focus on creating user-friendly interfaces that effectively present data analysis and insights. R Programming: Utilize strong R programming skills to perform data cleaning, manipulation, analysis, and visualization. This includes implementing statistical models, data mining techniques, and machine learning algorithms as needed. SQL Database Interaction: Write efficient SQL queries to extract, transform, and load data from relational databases. Ensure data integrity and optimal performance for application use. Data Understanding and Interpretation: Work closely with stakeholders to understand their requirements and translate them into robust analytical solutions. Interpret data insights and communicate findings effectively. Code Documentation and Best Practices: Write clean, well-documented, and maintainable code. Adhere to software development best practices, including version control and testing methodologies. Project Management: Manage your time effectively across multiple shorter-term projects. Work independently while keeping stakeholders updated on progress and issues. Problem Solving: Ability to understand business problems and develop data-driven solutions and develop novel approaches, if needed. Collaboration: Work collaboratively with other data scientists, analysts, and business teams as needed. Required Skills & Qualifications: Expertise in R Programming: Proven experience in using R for data manipulation, statistical analysis, and visualization. Minimum of 3-5 years of experience in a relevant technical role. Shiny App Development: Strong understanding and practical experience in developing interactive dashboards and applications using R Shiny. Minimum of 3-5 years of experience in a relevant technical role. SQL Proficiency: Ability to write and optimize SQL queries to interact with relational databases (e.g., PostgreSQL, MySQL, SQL Server). Data Analysis Skills: Ability to perform exploratory data analysis, apply statistical methods, and interpret results. Problem-Solving Aptitude: Strong analytical and problem-solving skills with the ability to translate business requirements into technical solutions. Communication Skills: Excellent verbal and written communication skills, with the ability to explain complex technical concepts to non-technical audiences. Experience with Version Control: Familiarity with Git and GitHub or similar version control systems. Adaptability: Ability to manage multiple projects concurrently and adapt to changing priorities. Remote work: Comfortable working remotely and communicating regularly. Preferred Skills: Experience with other R packages relevant to data analysis and visualization (e.g., dplyr, ggplot2, tidyr). 
Knowledge of basic statistical modeling and machine learning techniques.
Experience with cloud-based data environments (e.g., AWS, Google Cloud, Azure).
Familiarity with deployment of Shiny apps.
Understanding of data security practices.
Location: Remote, Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad.
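To illustrate the Shiny-plus-SQL pattern this role centres on, here is a minimal sketch that serves database rows through a Shiny table, assuming the DBI and RSQLite packages; an in-memory SQLite table stands in for whatever production database the employer actually uses.

# Minimal sketch: a Shiny app backed by a SQL query via DBI.
library(shiny)
library(DBI)
library(RSQLite)

con <- dbConnect(RSQLite::SQLite(), ":memory:")
dbWriteTable(con, "orders",
             data.frame(region = c("APAC", "EMEA", "APAC", "AMER"),
                        amount = c(120, 80, 200, 150)))
onStop(function() dbDisconnect(con))   # close the connection when the app stops

ui <- fluidPage(
  selectInput("region", "Region", choices = c("APAC", "EMEA", "AMER")),
  tableOutput("orders")
)

server <- function(input, output, session) {
  output$orders <- renderTable({
    # Parameterised query keeps the SQL safe from injection.
    dbGetQuery(con, "SELECT region, amount FROM orders WHERE region = ?",
               params = list(input$region))
  })
}

shinyApp(ui, server)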
Posted 2 months ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Role: R Developer
Required Technical Skill Set: R data structures (vectors, lists, matrices, data frames) / data manipulation and cleaning using dplyr / data visualization using ggplot2 and other plotting libraries / working with different data formats (CSV, Excel, JSON, databases) / Jenkins / GitHub / BDD frameworks / Agile Scrum.
Work location: Noida, Bangalore, Thane, Pune, Kolkata, Hyderabad, Chennai / Pan India
Experience: 5+ years
Desired Competencies (Technical/Behavioral Competency)
Must-Have:
1. R
2. R Markdown
Good-to-Have:
1. SQL
2. Posit Workbench, Connect and Package Manager
3. Jenkins
4. Python
5. JavaScript
6. HTML/CSS
7. GitHub
8. Agile Scrum
Responsibility of / Expectations from the Role:
1. Design, develop, test and deploy interactive R web applications.
2. Work with R packages such as data.table, dplyr, tidyr, ggplot2 and shinydashboard, among others.
3. Write clean, efficient and well-documented R code; conduct R code reviews and R programming validation.
4. Implement unit tests and ensure the quality and performance of applications.
5. Benchmark and optimize application performance.
6. Identify inconsistencies and initiate resolution of data, analytical and reporting problems.
7. Ensure applications are robust and reliable.
8. Translate complex data analysis and visualization tasks into clear and user-friendly interfaces.
9. Involvement in Agile, DevOps and automation of testing, build, deployment, CI/CD, etc.
10. Understanding of testing strategies and frameworks (e.g. BDD/Gherkin).
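For the unit-testing expectation in this role, here is a minimal sketch using testthat; the function under test and its cases are invented for the example.

# Minimal sketch: a small helper and a testthat unit test for it.
library(testthat)

clean_amount <- function(x) {
  # Strip currency symbols and thousands separators, then convert to numeric.
  suppressWarnings(as.numeric(gsub("[^0-9.-]", "", x)))
}

test_that("clean_amount handles formatted strings", {
  expect_equal(clean_amount("1,250.75"), 1250.75)
  expect_equal(clean_amount("$99"), 99)
  expect_true(is.na(clean_amount("n/a")))
})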
Posted 2 months ago