Mumbai, Maharashtra, India
Not disclosed
On-site
Full Time
Job Overview: We are seeking a talented R Analytics Support specialist to join our analytics team. The ideal candidate will have a strong background in data analysis and statistical modeling, and proficiency in the R programming language. You will be responsible for analyzing complex datasets, providing insights, and developing statistical models to support business decisions.

Key Responsibilities
• Use R to analyze large and complex datasets, performing data cleaning, transformation, and analysis.
• Develop and implement statistical models (regression, time series, classification, etc.) to provide actionable insights.
• Conduct exploratory data analysis (EDA) to identify trends, patterns, and anomalies.
• Visualize data through plots, charts, and dashboards to communicate results effectively to stakeholders.
• Collaborate with cross-functional teams to define business problems and develop analytical solutions.
• Build and maintain R scripts and automation workflows for repetitive tasks and analyses.
• Stay current with developments in R packages and data science techniques.
• Present findings and insights to stakeholders through clear, concise reports and presentations.
• Provide technical support and guidance to data analysts and scientists on R-related issues.
• Troubleshoot and resolve R code errors and performance issues.
• Develop and maintain R packages and scripts to support data analysis and reporting, and collaborate with data analysts and scientists to design and implement data visualizations and reports.

Qualifications
• Bachelor’s/Master’s degree in Statistics, Mathematics, Data Science, Computer Science, or a related field.
• 3–5 years of recent experience in a senior role focused on the R language, RStudio, and SQL.
• Strong knowledge of statistical techniques (regression, clustering, hypothesis testing, etc.).
• Experience with data visualization tools such as ggplot2, Shiny, or Plotly.
• Familiarity with SQL and database management systems.
• Knowledge of machine learning algorithms and their implementation in R.
• Ability to interpret complex data and communicate insights clearly to non-technical stakeholders.
• Strong problem-solving skills and attention to detail.
• Familiarity with version control tools such as Git is a plus.
Pune, Maharashtra, India
Not disclosed
On-site
Full Time
Job Title: Sr. Product AI Engineer - Front End Software Developer
Experience: 4–8 years
Location: Pune
Type: Full-time

About the Role
We’re looking for a highly motivated Sr. Product AI Engineer - Frontend Developer with proficiency in building AI-based desktop apps using TypeScript and frameworks like Electron, Node.js, or Tauri. You will lead the development of scalable and secure user interfaces, work on local API integrations, and optimize performance for cross-platform environments.

Key Responsibilities
• Develop user-friendly and efficient desktop UI for Windows and macOS.
• Implement and consume local/offline APIs using REST/WebSocket protocols (a minimal sketch follows this listing).
• Integrate AI model workflows into the UI (offline/local deployment).
• Ensure security compliance in application design and data handling.
• Package and deploy desktop apps using cross-platform build tools.
• Optimize app performance for speed and responsiveness.
• Collaborate closely with backend, ML, and DevOps teams.
• Be open to working flexible or extended hours during high-priority phases.

Required Skills
• TypeScript – Expert in scalable UI/application logic.
• Electron or Tauri – Hands-on experience with desktop app frameworks.
• Node.js – Understanding of backend service integration.
• REST/WebSocket – Ability to build and consume APIs for local data exchange.
• Secure Coding – Knowledge of privacy-first and secure app design.
• Linux – Comfortable with Linux-based dev and deployment environments.

Nice-to-Have Skills
• Familiarity with AI/ML model APIs (local or hosted).
• Knowledge of Redis or SQLite for lightweight data storage.
• Experience in plugin/module system architecture.
• Skills in cross-platform build automation (e.g., electron-builder, pkg).
• Experience working in air-gapped or security-restricted environments.

Ideal Candidate Traits
• Curious and proactive; thrives in fast-moving, collaborative teams.
• Strong sense of ownership and accountability.
• Demonstrates a growth mindset and embraces continuous learning.
• Clear communicator, especially in cross-functional settings.
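As an illustration of the local/offline API work described in this listing, below is a minimal TypeScript sketch of a renderer-side client that consumes a hypothetical on-device inference service over WebSocket. The endpoint, message shapes, and field names are assumptions made for illustration only; they are not part of the posting, and the real service contract would differ.

// Minimal sketch: consuming a hypothetical local (offline) AI inference service
// over WebSocket from an Electron/Tauri renderer. Endpoint, message shape, and
// field names are illustrative assumptions, not part of the job posting.

type InferenceRequest = { id: string; prompt: string };
type InferenceResponse = { id: string; text: string };

function requestInference(prompt: string): Promise<InferenceResponse> {
  return new Promise((resolve, reject) => {
    // Assumed local endpoint exposed by the on-device model service.
    const socket = new WebSocket("ws://127.0.0.1:8765/inference");
    const request: InferenceRequest = { id: crypto.randomUUID(), prompt };

    // Send the request once the connection is open.
    socket.onopen = () => socket.send(JSON.stringify(request));

    // Resolve when the matching response arrives, then close the socket.
    socket.onmessage = (event) => {
      const response = JSON.parse(String(event.data)) as InferenceResponse;
      if (response.id === request.id) {
        resolve(response);
        socket.close();
      }
    };

    socket.onerror = () => reject(new Error("Local inference service unreachable"));
  });
}

// Example usage from UI code:
// requestInference("Summarise this document").then((r) => console.log(r.text));

Correlating requests and responses by id, as sketched here, is one simple way to multiplex several UI calls over a single local socket; the same client code runs unchanged in an Electron renderer or a Tauri webview.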
Pune, Maharashtra, India
Not disclosed
On-site
Full Time
Job Title: Data Modeler / Data Analyst
Experience: 6–8 Years
Location: Pune

Job Summary
We are looking for a seasoned Data Modeler / Data Analyst to design and implement scalable, reusable logical and physical data models on Google Cloud Platform, primarily BigQuery. You will partner closely with data engineers, analytics teams, and business stakeholders to translate complex business requirements into performant data models that power reporting, self-service analytics, and advanced data science workloads.

Key Responsibilities
· Gather and analyze business requirements to translate them into conceptual, logical, and physical data models on GCP (BigQuery, Cloud SQL, Cloud Spanner, etc.).
· Design star/snowflake schemas, data vaults, and other modeling patterns that balance performance, flexibility, and cost.
· Implement partitioning, clustering, and materialized views in BigQuery to optimize query performance and cost efficiency (see the sketch after this listing).
· Establish and maintain data modeling standards, naming conventions, and metadata documentation to ensure consistency across analytic and reporting layers.
· Collaborate with data engineers to define ETL/ELT pipelines and ensure data models align with ingestion and transformation strategies (Dataflow, Cloud Composer, Dataproc, dbt).
· Validate data quality and lineage; work with BI developers and analysts to troubleshoot performance issues or data anomalies.
· Conduct impact assessments for schema changes and guide version-control processes for data models.
· Mentor junior analysts/engineers on data modeling best practices and participate in code/design reviews.
· Contribute to capacity planning and cost-optimization recommendations for BigQuery datasets and reservations.

Must-Have Skills
· 6–8 years of hands-on experience in data modeling, data warehousing, or database design, including at least 2 years on GCP BigQuery.
· Proficiency in dimensional modeling, 3NF, and modern patterns such as data vault.
· Expert SQL skills with demonstrable ability to optimize complex analytical queries on BigQuery (partitioning, clustering, sharding strategies).
· Strong understanding of ETL/ELT concepts and experience working with tools such as Dataflow, Cloud Composer, or dbt.
· Familiarity with BI/reporting tools (Looker, Tableau, Power BI, or similar) and how model design impacts dashboard performance.
· Experience with data governance practices: data cataloging, lineage, and metadata management (e.g., Data Catalog).
· Excellent communication skills to translate technical concepts into business-friendly language and collaborate across functions.

Good to Have
· Experience working on Azure Cloud (Fabric, Synapse, Delta Lake).

Education
· Bachelor’s or Master’s degree in Computer Science, Information Systems, Engineering, Statistics, or a related field. Equivalent experience will be considered.
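To make the partitioning and clustering responsibility above concrete, here is a minimal TypeScript sketch using the @google-cloud/bigquery Node.js client to create a day-partitioned, clustered fact table for a star schema. The dataset, table, and column names are illustrative assumptions, and authentication is assumed to come from Application Default Credentials; this is a sketch under those assumptions, not the team's actual model.

// Minimal sketch: creating a partitioned, clustered BigQuery fact table via DDL.
// Dataset, table, and column names below are illustrative assumptions.
import { BigQuery } from "@google-cloud/bigquery";

const bigquery = new BigQuery();

const ddl = `
  CREATE TABLE IF NOT EXISTS analytics.fact_sales (
    sale_id      STRING,
    customer_key INT64,
    product_key  INT64,
    sale_ts      TIMESTAMP,
    amount       NUMERIC
  )
  -- Partition by day so queries filtered on sale_ts scan only the days they need.
  PARTITION BY DATE(sale_ts)
  -- Cluster on the most common filter/join keys to reduce bytes scanned further.
  CLUSTER BY customer_key, product_key
  OPTIONS (partition_expiration_days = 730)
`;

async function createFactTable(): Promise<void> {
  // Runs the DDL as a query job and resolves once the table exists.
  await bigquery.query({ query: ddl });
  console.log("fact_sales created: partitioned by DATE(sale_ts), clustered by keys");
}

createFactTable().catch(console.error);

Downstream materialized views and BI dashboards built over a table modeled this way benefit from the same partition pruning and clustering, which is where much of the BigQuery cost optimization mentioned in the listing comes from.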