31 Job openings at Smartavya Analytica
About Smartavya Analytica

Smartavya Analytica is a data analytics company that specializes in providing advanced analytical solutions to businesses using AI and machine learning. The company focuses on enabling organizations to make data-driven decisions by delivering insights and predictive analytics.

Linux System Admin

Mumbai

2 - 6 years

INR 4.0 - 8.0 Lacs P.A.

Work from Office

Full Time

Work Mode: Work from Office Mandatory (24/7 Rotational Shift)

Responsibilities:
- Assist in System Setup: Help install and configure platform, networking, and Linux servers (e.g., CentOS, Ubuntu, or RHEL) under guidance.
- Basic Monitoring: Monitor system performance and ensure that servers are running smoothly with basic supervision.
- Software Updates & Patches: Assist in applying basic security patches and software updates to maintain system security.
- User Management: Learn to manage user accounts, groups, and permissions on servers.
- Security Assistance: Assist in securing Linux servers by applying encryption and decryption techniques for sensitive data. Implement basic encryption (e.g., file system encryption, data-at-rest) and decryption protocols. Ensure user authentication through methods like SSH keys and password policies.
- Transport Layer Security (TLS/SSL): Help set up and manage SSL/TLS certificates for secure communications over networks (e.g., securing web servers with HTTPS). Learn to configure TLS/SSL on services such as Apache, Nginx, or other server applications (see the certificate-check sketch below).
- Documentation: Help maintain and update system documentation for procedures and configurations.
- Troubleshooting Support: Assist in troubleshooting minor system or hardware issues with guidance from senior administrators.
- Backup & Recovery: Help with backup procedures to ensure data integrity and assist in system recovery processes.
- Collaboration: Work closely with other IT professionals and senior system administrators to learn best practices.

Requirements:
- Understanding of Linux operating systems (e.g., Ubuntu, CentOS, RHEL) and networking fundamentals.
- Familiarity with command-line tools and terminal usage.
- Strong problem-solving attitude and a willingness to learn.
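The TLS/SSL duties above are easy to support with small scripts. Below is a minimal illustrative sketch (not part of the original posting) that reports how many days remain before a server's TLS certificate expires; the hostname is a placeholder.

```python
# Minimal sketch: report days remaining before a server's TLS certificate expires.
# The hostname below is a placeholder; point it at the service being monitored.
import socket
import ssl
from datetime import datetime, timezone

def days_until_cert_expiry(host: str, port: int = 443) -> int:
    context = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    # 'notAfter' looks like 'Jun  1 12:00:00 2026 GMT'; convert it to a UTC datetime.
    expires = datetime.fromtimestamp(ssl.cert_time_to_seconds(cert["notAfter"]), tz=timezone.utc)
    return (expires - datetime.now(timezone.utc)).days

if __name__ == "__main__":
    print(days_until_cert_expiry("example.com"))
```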

SQL Reporting Support

Chennai

5 - 7 years

INR 7.0 - 9.0 Lacs P.A.

Work from Office

Full Time

Key Responsibilities:
- Design, develop, and maintain SQL-based reports using tools such as SSRS, Power BI, or other reporting platforms.
- Troubleshoot and optimize SQL queries for performance and accuracy.
- Collaborate with business users to gather reporting requirements and translate them into functional reports.
- Provide support for existing reports and implement changes as required.
- Maintain data integrity and consistency across reporting platforms.

Required Skills:
- Strong proficiency in SQL, including writing complex queries and stored procedures (see the illustrative sketch below).
- Experience with reporting tools like SSRS, Power BI, or Tableau.
- Familiarity with data warehousing concepts and ETL processes.
- Ability to analyze and troubleshoot data-related issues.
- Good communication and documentation skills.

Preferred Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Experience in a support environment and ticket management systems.
- Knowledge of scripting languages (e.g., Python, PowerShell) is a plus.
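As an illustration of the SQL-plus-scripting skill set described above, here is a small hedged sketch (table and column names are hypothetical) that runs a parameterized aggregate query and exports the result for a reporting tool to pick up.

```python
# Illustrative only: a parameterized monthly report query exported as CSV.
# The sales/region/order_date schema is hypothetical.
import sqlite3

import pandas as pd

QUERY = """
SELECT region,
       strftime('%Y-%m', order_date) AS month,
       SUM(amount)                   AS total_sales,
       COUNT(*)                      AS orders
FROM sales
WHERE order_date >= :start_date
GROUP BY region, month
ORDER BY region, month;
"""

def build_monthly_sales_report(db_path: str, start_date: str) -> pd.DataFrame:
    with sqlite3.connect(db_path) as conn:
        return pd.read_sql_query(QUERY, conn, params={"start_date": start_date})

if __name__ == "__main__":
    df = build_monthly_sales_report("reporting.db", "2024-01-01")
    df.to_csv("monthly_sales_report.csv", index=False)  # hand-off to SSRS/Power BI/etc.
```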

Informatica Lead

Chennai

6 - 10 years

INR 10.0 - 15.0 Lacs P.A.

Work from Office

Full Time

Experience: 5-10 years in ETL development, with 3+ years in a leadership role and extensive hands-on experience in Informatica PowerCenter and Cloud Data Integration.

Job Overview: We are seeking a highly skilled and experienced Informatica Lead to join our IT team. The ideal candidate will lead a team of ETL developers and oversee the design, development, and implementation of ETL solutions using Informatica PowerCenter and Cloud Data Integration. This role requires expertise in data integration, leadership skills, and the ability to work in a dynamic environment to deliver robust data solutions for business needs.

Key Responsibilities:
- ETL Development and Maintenance: Lead the design, development, and maintenance of ETL workflows and mappings using Informatica PowerCenter and Cloud Data Integration. Ensure the reliability, scalability, and performance of ETL solutions to meet business requirements. Optimize ETL processes for data integration, transformation, and loading into data warehouses and other target systems.
- Solution Architecture and Implementation: Collaborate with architects and business stakeholders to define ETL solutions and data integration strategies. Develop and implement best practices for ETL design and development. Ensure seamless integration with on-premises and cloud-based data platforms.
- Data Governance and Quality: Establish and enforce data quality standards and validation processes (a generic reconciliation sketch follows below). Implement data governance and compliance policies to ensure data integrity and security. Perform root cause analysis and resolve data issues proactively.
- Team Leadership: Manage, mentor, and provide technical guidance to a team of ETL developers. Delegate tasks effectively and ensure timely delivery of projects and milestones. Conduct regular code reviews and performance evaluations for team members.
- Automation and Optimization: Develop scripts and frameworks to automate repetitive ETL tasks. Implement performance tuning for ETL pipelines and database queries. Explore opportunities to improve efficiency and streamline workflows.
- Collaboration and Stakeholder Engagement: Work closely with business analysts, data scientists, and application developers to understand data requirements and deliver solutions. Communicate project updates, challenges, and solutions to stakeholders effectively. Act as the primary point of contact for Informatica-related projects and initiatives.

Academic Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or equivalent.
- Relevant certifications (e.g., Informatica Certified Specialist, Informatica Cloud Specialist) are a plus.

Experience:
- 6-10 years of experience in ETL development and data integration, with at least 3 years in a leadership role.
- Proven experience with Informatica PowerCenter, Informatica Cloud Data Integration, and large-scale ETL implementations.
- Experience in integrating data from various sources such as databases, flat files, and APIs.

Technical Skills:
- Strong expertise in Informatica PowerCenter, Informatica Cloud, and ETL frameworks.
- Proficiency in SQL, PL/SQL, and performance optimization techniques.
- Knowledge of cloud platforms like AWS, Azure, or Google Cloud.
- Familiarity with big data tools such as Hive, Spark, or Snowflake is a plus.
- Strong understanding of data modeling concepts and relational database systems.

Soft Skills:
- Excellent leadership and project management skills.
- Strong analytical and problem-solving abilities.
- Effective communication and stakeholder management skills.
- Ability to work under tight deadlines in a fast-paced environment.
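Post-load validation is normally done with Informatica's own facilities, but the idea can be sketched tool-agnostically. The snippet below is an assumption-laden illustration, not the team's actual process: connection URLs, the table name, and the amount column are placeholders, and it simply compares row counts and a numeric total between a source and a target table after an ETL run.

```python
# Generic post-load reconciliation sketch (not Informatica-specific): compare row
# counts and a numeric checksum between source and target copies of a table.
# URLs, table, and column names are trusted placeholders for illustration only.
from sqlalchemy import create_engine, text

def reconcile(source_url: str, target_url: str, table: str, amount_col: str) -> dict:
    checks = {}
    query = text(f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {table}")
    for name, url in (("source", source_url), ("target", target_url)):
        with create_engine(url).connect() as conn:
            rows, total = conn.execute(query).one()
            checks[name] = {"rows": rows, "total": float(total)}
    checks["match"] = checks["source"] == checks["target"]
    return checks

if __name__ == "__main__":
    print(reconcile("sqlite:///source.db", "sqlite:///target.db", "orders", "amount"))
```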

R Programmer

Mumbai

2 - 7 years

INR 4.0 - 9.0 Lacs P.A.

Work from Office

Full Time

Department: Business Analytics

Job Overview: We are seeking a talented R Analyst to join our analytics team. The ideal candidate will possess a strong background in data analysis, statistical modeling, and proficiency in the R programming language. You will be responsible for analyzing complex datasets, providing insights, and developing statistical models to support business decisions.

Key Responsibilities:
- Utilize R programming to analyze large and complex datasets, performing data cleaning, transformation, and analysis.
- Develop and implement statistical models (regression, time series, classification, etc.) to provide actionable insights.
- Conduct exploratory data analysis (EDA) to identify trends, patterns, and anomalies.
- Visualize data through plots, charts, and dashboards to effectively communicate results to stakeholders.
- Collaborate with cross-functional teams to define business problems and develop analytical solutions.
- Build and maintain R scripts and automation workflows for repetitive tasks and analysis.
- Stay updated with the latest developments in R packages and data science techniques.
- Present findings and insights to stakeholders through clear, concise reports and presentations.

Qualifications:
- Bachelor's/Master's degree in Statistics, Mathematics, Data Science, Computer Science, or a related field.
- At least 2-5 years in a senior role focusing specifically on R, RStudio, and SQL.
- Strong knowledge of statistical techniques (regression, clustering, hypothesis testing, etc.).
- Experience with data visualization tools like ggplot2, shiny, or plotly.
- Familiarity with SQL and database management systems.
- Knowledge of machine learning algorithms and their implementation in R.
- Ability to interpret complex data and communicate insights clearly to non-technical stakeholders.
- Strong problem-solving skills and attention to detail.
- Familiarity with version control tools like Git is a plus.

Lead Data Engineer

Pune

8 - 10 years

INR 13.0 - 15.0 Lacs P.A.

Work from Office

Full Time

We are seeking a hands-on Lead Data Engineer to drive the design and delivery of scalable, secure data platforms on Google Cloud Platform (GCP). In this role you will own architectural decisions, guide service selection, and embed best practices across data engineering, security, and performance disciplines. You will partner with data modelers, analysts, security teams, and product owners to ensure our pipelines and datasets serve analytical, operational, and AI/ML workloads with reliability and cost efficiency. Familiarity with Microsoft Azure data services (Data Factory, Databricks, Synapse, Fabric) is valuable, as many existing workloads will transition from Azure to GCP.

Key Responsibilities:
- Lead end-to-end development of high-throughput, low-latency data pipelines and lakehouse solutions on GCP (BigQuery, Dataflow, Pub/Sub, Dataproc, Cloud Composer, Dataplex, etc.); see the orchestration sketch below.
- Define reference architectures and technology standards for data ingestion, transformation, and storage.
- Drive service-selection trade-offs (cost, performance, scalability, and security) across streaming and batch workloads.
- Conduct design reviews and performance tuning sessions; ensure adherence to partitioning, clustering, and query-optimization standards in BigQuery.
- Contribute to the long-term cloud data strategy, evaluating emerging GCP features and multi-cloud patterns (Azure Synapse, Data Factory, Purview, etc.) for future adoption.
- Lead code reviews and oversee development activities delegated to data engineers.
- Implement best practices recommended by Google Cloud.
- Provide effort estimates for data engineering activities.
- Participate in discussions to migrate existing Azure workloads to GCP and provide solutions to migrate the workloads for selected data pipelines.

Must-Have Skills:
- 8-10 years in data engineering, with 3+ years leading teams or projects on GCP.
- Expert in GCP data services (BigQuery, Dataflow/Apache Beam, Dataproc/Spark, Pub/Sub, Cloud Storage) and orchestration with Cloud Composer or Airflow.
- Proven track record designing and optimizing large-scale ETL/ELT pipelines (streaming + batch).
- Strong fluency in SQL and one major programming language (Python, Java, or Scala).
- Deep understanding of data lake / lakehouse architectures, dimensional and data-vault modeling, and data governance frameworks.
- Excellent communication and stakeholder-management skills; able to translate complex technical topics for non-technical audiences.

Nice-to-Have Skills:
- Hands-on experience with Microsoft Azure data services (Azure Synapse Analytics, Data Factory, Event Hub, Purview).
- Experience integrating ML pipelines (Vertex AI, Dataproc ML) or real-time analytics (BigQuery BI Engine, Looker).
- Familiarity with open-source observability stacks (Prometheus, Grafana) and FinOps tooling for cloud cost optimization.

Preferred Certifications:
- Google Professional Data Engineer (strongly preferred) or Google Professional Cloud Architect
- Microsoft Certified: Azure Data Engineer Associate (nice to have)

Education: Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related technical field. Equivalent professional experience will be considered.
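For the Cloud Composer / Airflow orchestration mentioned above, a minimal sketch of a daily BigQuery load DAG is shown below. It assumes the Google provider package is installed in the Composer environment; the project, dataset, and table names are placeholders, not details from the posting.

```python
# Hedged sketch of a Cloud Composer (Airflow) DAG running a daily BigQuery load.
# Requires apache-airflow-providers-google; project/dataset/table names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

DAILY_LOAD_SQL = """
INSERT INTO `my-project.analytics.daily_orders`
SELECT order_id, customer_id, order_ts, amount
FROM `my-project.raw.orders`
WHERE DATE(order_ts) = '{{ ds }}'   -- Airflow's logical date, rendered at runtime
"""

with DAG(
    dag_id="daily_orders_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load_daily_orders = BigQueryInsertJobOperator(
        task_id="load_daily_orders",
        configuration={"query": {"query": DAILY_LOAD_SQL, "useLegacySql": False}},
    )
```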

Hadoop Lead

Mumbai

6 - 10 years

INR 10.0 - 16.0 Lacs P.A.

Work from Office

Full Time

Responsibilities:
- Design and implement Big Data solutions, complex ETL pipelines, and data modernization projects.

Required Past Experience:
- 6+ years of overall experience in developing, testing, and implementing big data projects using Hadoop, Spark, Hive, and Sqoop.
- Hands-on experience playing a lead role in big data projects: responsible for implementing one or more tracks within projects, identifying and assigning tasks within the team, and providing technical guidance to team members.
- Experience in setting up Hadoop services and implementing Extract-Transform-Load / Extract-Load-Transform (ETL/ELT) pipelines, working with terabytes/petabytes of data ingestion and processing from varied systems.
- Experience working in an onshore/offshore model, leading technical discussions with customers, mentoring and guiding teams on technology, and preparing High-Level Design and Low-Level Design (HLD & LLD) documents.

Required Skills and Abilities:
- Mandatory Skills: Spark, Scala/PySpark, Hadoop ecosystem including Hive, Sqoop, Impala, Oozie, Hue, Java, Python, SQL, Flume, bash (shell scripting).
- Secondary Skills: Apache Kafka, Storm, distributed systems, a good understanding of networking and security (platform and data) concepts, Kerberos, Kubernetes.
- Understanding of data governance concepts and experience implementing metadata capture, lineage capture, and business glossaries.
- Experience implementing Continuous Integration / Continuous Delivery (CI/CD) pipelines and working experience with source code management (SCM) tools such as Git, Bitbucket, etc.
- Ability to assign and manage tasks for team members, provide technical guidance, and work with architects on High-Level Design, Low-Level Design (HLD & LLD), and proofs of concept.
- Hands-on experience writing data ingestion and data processing pipelines using Spark and SQL, and experience implementing slowly changing dimensions (SCD) types 1 & 2, auditing, and exception-handling mechanisms (see the SCD Type 2 sketch below).
- Data warehousing project implementation with either a Java- or Scala-based Hadoop programming background.
- Proficient with various development methodologies like waterfall and agile/scrum.
- Exceptional communication, organization, and time management skills.
- Collaborative approach to decision-making and strong analytical skills.
- Good to have: certifications in any of GCP, AWS, Azure, or Cloudera.
- Ability to work on multiple projects simultaneously, prioritizing appropriately.
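The SCD Type 2 requirement above can be illustrated with a simplified PySpark sketch. It assumes a dimension table keyed by customer_id with effective_date / end_date / is_current columns and a staging snapshot sharing the business columns; table names and tracked attributes are placeholders, and real implementations usually add surrogate keys and hash-based change detection.

```python
# Simplified SCD Type 2 sketch in PySpark. Table names, the business key, and the
# tracked attributes are illustrative placeholders.
from functools import reduce

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("scd2_sketch").getOrCreate()

dim = spark.table("dw.dim_customer")           # existing dimension (with SCD columns)
stg = spark.table("staging.customer_updates")  # incoming snapshot (business columns only)

tracked = ["name", "city", "segment"]          # attribute changes that create a new version

# Incoming rows whose tracked attributes differ from the current dimension record.
changed = (
    stg.alias("s")
    .join(dim.filter(F.col("is_current")).alias("d"), "customer_id")
    .where(reduce(lambda a, b: a | b, [F.col(f"s.{c}") != F.col(f"d.{c}") for c in tracked]))
    .select("s.*")
)
changed_keys = changed.select("customer_id").distinct()

# Expire the current versions of the changed keys.
expired = (
    dim.join(changed_keys, "customer_id", "leftsemi")
    .filter(F.col("is_current"))
    .withColumn("end_date", F.current_date())
    .withColumn("is_current", F.lit(False))
)

# Everything else in the dimension (history rows and unchanged keys) stays untouched.
untouched = dim.exceptAll(
    dim.join(changed_keys, "customer_id", "leftsemi").filter(F.col("is_current"))
)

# New current versions for the changed keys.
new_versions = (
    changed
    .withColumn("effective_date", F.current_date())
    .withColumn("end_date", F.lit(None).cast("date"))
    .withColumn("is_current", F.lit(True))
)

result = untouched.unionByName(expired).unionByName(new_versions)
result.write.mode("overwrite").saveAsTable("dw.dim_customer_new")  # swap in after validation
```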

Tableau Developer

Mumbai

3 - 5 years

INR 5.0 - 8.0 Lacs P.A.

Work from Office

Full Time

Department: Business Intelligence & Analytics

Job Summary: We are looking for a Tableau Developer with 3 to 5 years of hands-on experience in designing interactive dashboards and BI solutions. This role requires strong Tableau expertise, solid SQL skills, and a good understanding of integrating Tableau with enterprise data platforms like Cloudera Data Platform (CDP 7.1). You will collaborate with architects, engineers, and business analysts to deliver scalable, high-performance BI solutions.

Responsibilities:
- Design, develop, and maintain interactive dashboards and reports using Tableau Desktop.
- Follow and help enforce best practices in dashboard design, folder structure, naming conventions, and visualization standards.
- Collaborate with senior BI and data engineering teams to optimize Tableau's connectivity with Cloudera Data Platform (CDP 7.1) and other enterprise data sources.
- Implement filters, parameters, extracts, and Hyper extracts to support performance-efficient reporting.
- Support query performance tuning, incremental extract strategies, and dashboard performance improvements.
- Maintain Tableau Server environments by publishing dashboards, managing permissions, and monitoring usage analytics.
- Participate in code reviews, documentation, and internal knowledge-sharing initiatives.
- Align dashboards with optimized data models and collaborate closely with backend data engineering teams.
- Stay current with Tableau feature updates and apply them to enhance dashboard functionality and user experience.

Required Skills & Experience:
- Tableau Expertise:
  - Tableau Desktop: calculated fields, LOD expressions, dual-axis charts, actions, dashboard formatting, and storytelling.
  - Tableau Server/Online: publishing, scheduling, permissioning, and tuning.
  - Tableau Prep: data cleansing and transformation workflows.
- Data & Querying:
  - SQL: writing complex queries with joins, subqueries, CTEs, and aggregations.
  - Data Modeling: star/snowflake schemas, normalization, and data blending.

Hadoop Admin Manager

Chennai

10 - 15 years

INR 8.0 - 14.0 Lacs P.A.

Work from Office

Full Time

Years of Experience: 10-15 years
Shifts: 24/7 (rotational)
Mode: Onsite

Experience: 10+ years of experience in IT, with at least 7+ years of experience in cloud and system administration, and at least 5 years of experience with, and a strong understanding of, 'big data' technologies in the Hadoop ecosystem - Hive, HDFS, MapReduce, Flume, Pig, Cloudera, HBase, Sqoop, Spark, etc.

Job Overview: Smartavya Analytica Private Limited is seeking an experienced Hadoop Administrator to manage and support our Hadoop ecosystem. The ideal candidate will have strong expertise in Hadoop cluster administration, excellent troubleshooting skills, and a proven track record of maintaining and optimizing Hadoop environments.

Key Responsibilities:
- Install, configure, and manage Hadoop clusters, including HDFS, YARN, Hive, HBase, and other ecosystem components.
- Monitor and manage Hadoop cluster performance, capacity, and security (see the health-check sketch below).
- Perform routine maintenance tasks such as upgrades, patching, and backups.
- Implement and maintain data ingestion processes using tools like Sqoop, Flume, and Kafka.
- Ensure high availability and disaster recovery of Hadoop clusters.
- Collaborate with development teams to understand requirements and provide appropriate Hadoop solutions.
- Troubleshoot and resolve issues related to the Hadoop ecosystem.
- Maintain documentation of Hadoop environment configurations, processes, and procedures.

Requirements:
- Experience in installing, configuring, and tuning Hadoop distributions; hands-on experience in Cloudera.
- Understanding of Hadoop design principles and factors that affect distributed system performance, including hardware and network considerations.
- Provide infrastructure recommendations, capacity planning, and workload management.
- Develop utilities to monitor the cluster better (Ganglia, Nagios, etc.).
- Manage large clusters with huge volumes of data.
- Perform cluster maintenance tasks: creation and removal of nodes, cluster monitoring, and troubleshooting.
- Manage and review Hadoop log files.
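Routine cluster monitoring like the responsibilities above is often scripted around the standard HDFS CLI. The sketch below is illustrative only; it assumes it runs on a node with a configured hdfs client and simply surfaces the summary sections administrators usually check.

```python
# Illustrative health-check wrapper around standard HDFS admin commands.
# Assumes a configured `hdfs` client is available on the node running the script.
import subprocess

def run(cmd: list[str]) -> str:
    """Run a command and return its stdout, raising if it exits non-zero."""
    return subprocess.run(cmd, capture_output=True, text=True, check=True).stdout

def hdfs_health_report() -> None:
    # Capacity, live/dead DataNodes, under-replicated and missing blocks.
    report = run(["hdfs", "dfsadmin", "-report"])
    print("\n".join(report.splitlines()[:12]))

    # Filesystem consistency summary; surface the overall status and any corrupt blocks.
    fsck = run(["hdfs", "fsck", "/"])
    for line in fsck.splitlines():
        if "Status:" in line or "CORRUPT" in line or "Missing" in line:
            print(line.strip())

if __name__ == "__main__":
    hdfs_health_report()
```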

Senior Data Engineer - GCP

Pune

4 - 6 years

INR 6.0 - 8.0 Lacs P.A.

Work from Office

Full Time

Job Summary: We are seeking an energetic Senior Data Engineer with hands-on expertise in Google Cloud Platform to build, maintain, and migrate data pipelines that power analytics and AI workloads. You will leverage GCP services (BigQuery, Dataflow, Cloud Composer, Pub/Sub, and Cloud Storage) while collaborating with data modelers, analysts, and product teams to deliver highly reliable, well-governed datasets. Familiarity with Microsoft Azure data services (Data Factory, Databricks, Synapse, Fabric) is valuable, as many existing workloads will transition from Azure to GCP.

Key Responsibilities:
- Design, develop, and optimize batch and streaming pipelines on GCP using Dataflow / Apache Beam, BigQuery, Cloud Composer (Airflow), and Pub/Sub (see the streaming-pipeline sketch below).
- Maintain and enhance existing data workflows—monitoring performance, refactoring code, and automating tests to ensure data quality and reliability.
- Migrate data assets and ETL/ELT workloads from Azure (Data Factory, Databricks, Synapse, Fabric) to corresponding GCP services, ensuring functional parity and cost efficiency.
- Partner with data modelers to implement partitioning, clustering, and materialized-view strategies in BigQuery to meet SLAs for analytics and reporting.
- Conduct root-cause analysis for pipeline failures, implement guardrails for data quality, and document lineage.

Must-Have Skills:
- 4-6 years of data-engineering experience, including 2+ years building pipelines on GCP (BigQuery, Dataflow, Pub/Sub, Cloud Composer).
- Proficiency in SQL and one programming language (Python, Java, or Scala).
- Solid understanding of ETL/ELT patterns, data-warehouse modeling (star, snowflake, data vault), and performance-tuning techniques.
- Experience implementing data-quality checks, observability, and cost-optimization practices in cloud environments.

Nice-to-Have Skills:
- Practical exposure to Azure data services—Data Factory, Databricks, Synapse Analytics, or Microsoft Fabric.

Preferred Certifications:
- Google Professional Data Engineer or Associate Cloud Engineer
- Microsoft Certified: Azure Data Engineer Associate (nice to have)

Education: Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related technical field. Equivalent professional experience will be considered.
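As a concrete (and deliberately simplified) illustration of the Dataflow/Pub/Sub work described above, the sketch below reads JSON events from a Pub/Sub subscription and appends them to a BigQuery table. Subscription, table, and schema names are placeholders, and error handling/dead-lettering is omitted.

```python
# Hedged sketch of a streaming Apache Beam pipeline (runnable on Dataflow):
# Pub/Sub JSON events -> parse -> append to BigQuery. Names are placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def run() -> None:
    options = PipelineOptions(streaming=True)  # add --runner=DataflowRunner etc. to deploy
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/orders-sub")
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="my-project:analytics.orders_stream",
                schema="order_id:STRING,amount:FLOAT,event_ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )

if __name__ == "__main__":
    run()
```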

Hadoop Administrator

Chennai

2 - 5 years

INR 4.0 - 7.0 Lacs P.A.

Work from Office

Full Time

Experience: 2+ years of experience in IT, with at least 1+ years of experience in cloud and system administration, and at least 2 years of experience with, and a strong understanding of, 'big data' technologies in the Hadoop ecosystem - Hive, HDFS, MapReduce, Flume, Pig, Cloudera, HBase, Sqoop, Spark, etc.

Job Overview: Smartavya Analytica Private Limited is seeking an experienced Hadoop Administrator to manage and support our Hadoop ecosystem. The ideal candidate will have strong expertise in Hadoop cluster administration, excellent troubleshooting skills, and a proven track record of maintaining and optimizing Hadoop environments.

Key Responsibilities:
- Install, configure, and manage Hadoop clusters, including HDFS, YARN, Hive, HBase, and other ecosystem components.
- Monitor and manage Hadoop cluster performance, capacity, and security.
- Perform routine maintenance tasks such as upgrades, patching, and backups.
- Implement and maintain data ingestion processes using tools like Sqoop, Flume, and Kafka.
- Ensure high availability and disaster recovery of Hadoop clusters.
- Collaborate with development teams to understand requirements and provide appropriate Hadoop solutions.
- Troubleshoot and resolve issues related to the Hadoop ecosystem.
- Maintain documentation of Hadoop environment configurations, processes, and procedures.

Requirements:
- Experience in installing, configuring, and tuning Hadoop distributions; hands-on experience in Cloudera.
- Understanding of Hadoop design principles and factors that affect distributed system performance, including hardware and network considerations.
- Provide infrastructure recommendations, capacity planning, and workload management.
- Develop utilities to monitor the cluster better (Ganglia, Nagios, etc.).
- Manage large clusters with huge volumes of data.
- Perform cluster maintenance tasks: creation and removal of nodes, cluster monitoring, and troubleshooting.
- Manage and review Hadoop log files.
- Install and implement security for Hadoop clusters.
- Install Hadoop updates, patches, and version upgrades, and automate these through scripts.
- Act as the point of contact for vendor escalation; work with Hortonworks in resolving issues.
- Conceptual/working knowledge of basic data management concepts like ETL, reference/master data, data quality, and RDBMS.
- Working knowledge of a scripting language like Shell, Python, or Perl.
- Experience with orchestration and deployment tools.

Academic Qualification:

Power BI Governance / Platform Engineer

Pune

6 - 8 years

INR 8.0 - 10.0 Lacs P.A.

Work from Office

Full Time

Work Mode: Full-time, Office-based

Job Summary: Drive the health, governance, and performance of the enterprise Power BI landscape. You will manage workspaces, gateways, and capacities; enforce governance policies; and resolve platform-level issues to deliver a secure, reliable, and cost-efficient BI environment.

Key Responsibilities:
- Administer the platform: Monitor and tune Power BI workspaces, on-premises and cloud gateways, and dedicated/shared capacities for optimal performance and cost.
- Enforce governance & security: Apply and refine naming conventions, workspace lifecycle rules, RLS/OLS settings, and data loss prevention policies; manage access through Azure AD security groups.
- Troubleshoot & optimise: Diagnose and resolve dataset refresh failures, connectivity issues, and compute/memory bottlenecks, working closely with data engineers and report authors.
- Automate & report: Use PowerShell and the Power BI REST API to automate deployments and capacity scaling; surface usage metrics and KPIs to guide continual improvement (see the REST API sketch below).
- Champion best practices: Lead platform upgrades, feature roll-outs, and adoption of performance tuning, version control, and CI/CD patterns across the enterprise.

Must-Have Skills:
- 6-8 years in BI platform administration, with 3+ years of hands-on Power BI Service / Report Server experience.
- Deep knowledge of capacity metrics, gateway architecture, dataset refresh mechanics, and DAX & Power Query performance.
- Proven expertise in governance frameworks (naming, lifecycle, security, compliance) and access management via Azure AD.
- Strong scripting/automation skills in PowerShell (preferred), Python, or similar, plus experience with the Power BI REST APIs.
- Familiarity with DevOps / CI/CD pipelines for Power BI assets and basic Azure services (Key Vault, Log Analytics).
- Excellent problem-solving and cross-team communication skills.

Preferred Certifications:
- Microsoft Certified: Power BI Data Analyst Associate (PL-300)
- Microsoft Certified: Azure Enterprise Data Analyst Associate (DP-500)
- Microsoft Certified: Power Platform Solution Architect Expert (PL-600)
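The posting calls for PowerShell plus the Power BI REST API; the same calls can be made from any HTTP client. Below is a hedged Python sketch using the documented /groups and /refreshes endpoints. Acquiring the Azure AD access token (for example via MSAL with a service principal) is out of scope, so the token value is a placeholder.

```python
# Hedged sketch: query Power BI REST API endpoints for workspaces and dataset
# refresh history. The bearer token must be obtained separately (e.g., via MSAL).
import requests

API = "https://api.powerbi.com/v1.0/myorg"

def list_workspaces(token: str) -> list[dict]:
    resp = requests.get(f"{API}/groups",
                        headers={"Authorization": f"Bearer {token}"}, timeout=30)
    resp.raise_for_status()
    return resp.json()["value"]

def refresh_history(token: str, group_id: str, dataset_id: str, top: int = 5) -> list[dict]:
    url = f"{API}/groups/{group_id}/datasets/{dataset_id}/refreshes?$top={top}"
    resp = requests.get(url, headers={"Authorization": f"Bearer {token}"}, timeout=30)
    resp.raise_for_status()
    return resp.json()["value"]

if __name__ == "__main__":
    token = "<access-token>"  # placeholder
    for workspace in list_workspaces(token):
        print(workspace["id"], workspace["name"])
```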

Data Modeler/Data Analyst

Pune

6 - 8 years

INR 8.0 - 10.0 Lacs P.A.

Work from Office

Full Time

Job Summary: We are looking for a seasoned Data Modeler / Data Analyst to design and implement scalable, reusable logical and physical data models on Google Cloud Platform, primarily BigQuery. You will partner closely with data engineers, analytics teams, and business stakeholders to translate complex business requirements into performant data models that power reporting, self-service analytics, and advanced data science workloads.

Key Responsibilities:
- Gather and analyze business requirements to translate them into conceptual, logical, and physical data models on GCP (BigQuery, Cloud SQL, Cloud Spanner, etc.).
- Design star/snowflake schemas, data vaults, and other modeling patterns that balance performance, flexibility, and cost.
- Implement partitioning, clustering, and materialized views in BigQuery to optimize query performance and cost efficiency (see the DDL sketch below).
- Establish and maintain data modeling standards, naming conventions, and metadata documentation to ensure consistency across analytic and reporting layers.
- Collaborate with data engineers to define ETL/ELT pipelines and ensure data models align with ingestion and transformation strategies (Dataflow, Cloud Composer, Dataproc, dbt).
- Validate data quality and lineage; work with BI developers and analysts to troubleshoot performance issues or data anomalies.
- Conduct impact assessments for schema changes and guide version-control processes for data models.
- Mentor junior analysts/engineers on data modeling best practices and participate in code/design reviews.
- Contribute to capacity planning and cost-optimization recommendations for BigQuery datasets and reservations.

Must-Have Skills:
- 6-8 years of hands-on experience in data modeling, data warehousing, or database design, including at least 2 years on GCP BigQuery.
- Proficiency in dimensional modeling, 3NF, and modern patterns such as data vault.
- Expert SQL skills with demonstrable ability to optimize complex analytical queries on BigQuery (partitioning, clustering, sharding strategies).
- Strong understanding of ETL/ELT concepts and experience working with tools such as Dataflow, Cloud Composer, or dbt.
- Familiarity with BI/reporting tools (Looker, Tableau, Power BI, or similar) and how model design impacts dashboard performance.
- Experience with data governance practices—data cataloging, lineage, and metadata management (e.g., Data Catalog).
- Excellent communication skills to translate technical concepts into business-friendly language and collaborate across functions.

Good to Have:
- Experience working on Azure Cloud (Fabric, Synapse, Delta Lake).

Education: Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, Statistics, or a related field. Equivalent experience will be considered.
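The partitioning, clustering, and materialized-view work called out above boils down to DDL along the following lines. This is an illustrative sketch submitted through the official google-cloud-bigquery client; project, dataset, and column names are placeholders, not details from the posting.

```python
# Hedged sketch of BigQuery DDL for a partitioned, clustered fact table plus a
# materialized view, submitted through the official Python client.
from google.cloud import bigquery

DDL = """
CREATE TABLE IF NOT EXISTS `my-project.analytics.fct_orders`
(
  order_id    STRING,
  customer_id STRING,
  order_ts    TIMESTAMP,
  amount      NUMERIC
)
PARTITION BY DATE(order_ts)          -- prune scans by date
CLUSTER BY customer_id;              -- co-locate rows for frequent filters/joins

CREATE MATERIALIZED VIEW IF NOT EXISTS `my-project.analytics.mv_daily_revenue` AS
SELECT DATE(order_ts) AS order_date, SUM(amount) AS revenue
FROM `my-project.analytics.fct_orders`
GROUP BY order_date;
"""

def apply_ddl() -> None:
    client = bigquery.Client()
    for statement in filter(None, (s.strip() for s in DDL.split(";"))):
        client.query(statement).result()  # wait for each DDL statement to finish

if __name__ == "__main__":
    apply_ddl()
```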

Power BI Reporting & Analytics Engineer

Pune

8 - 10 years

INR 10.0 - 12.0 Lacs P.A.

Work from Office

Full Time

Work Mode: Full-time, Office-based

Job Summary: Transform raw data into compelling stories that drive business decisions. You will design, build, and optimize interactive dashboards and reports with Power BI, partner with business stakeholders to define KPIs and data models, and ensure every analytic deliverable meets enterprise reporting standards for accuracy, usability, and performance.

Key Responsibilities:
- Collaborate with business teams to gather requirements, identify key performance indicators, and translate them into intuitive Power BI reports and dashboards.
- Build robust semantic models (defining star/snowflake schemas, measures, and calculated tables) to support self-service analytics.
- Develop advanced DAX calculations, optimized queries, and dynamic visual interactions that deliver near-real-time insights.
- Continuously tune data models, visuals, and refresh schedules to maximise performance and minimise cost.
- Establish and maintain report governance standards (naming conventions, documentation, version control, and accessibility compliance).
- Mentor analysts and citizen developers on Power BI best practices and storytelling techniques.
- Partner with data engineering teams to validate data quality, source new data sets, and enhance the analytics pipeline.

Must-Have Skills:
- 6-8 years in BI/reporting roles, with 3+ years of hands-on Power BI design and development experience.
- Expertise in data modelling concepts (star/snowflake, slowly changing dimensions) and strong command of DAX and Power Query (M).
- Proven ability to translate complex business needs into intuitive KPIs, visuals, and interactive drill-downs.
- Solid SQL skills and familiarity with data warehouse/ETL processes (Azure Synapse, Snowflake, or similar).
- Experience optimising report performance—query folding, aggregation tables, incremental refresh, composite models, etc.
- Strong understanding of data visualisation best practices, UX design, and storytelling principles.
- Excellent stakeholder management, requirements gathering, and presentation abilities.

Preferred Certifications:
- Microsoft Certified: Power BI Data Analyst Associate (PL-300)
- Microsoft Certified: Azure Enterprise Data Analyst Associate (DP-500)

Frontend Developer - TypeScript

Pune

3 - 8 years

INR 5.0 - 10.0 Lacs P.A.

Work from Office

Full Time

Type: Full-time

About the Role: We're looking for a highly motivated Sr. Product AI Engineer (Frontend Developer) with proficiency in building AI-based desktop apps using TypeScript and frameworks like Electron, Node.js, or Tauri. You will lead the development of scalable and secure user interfaces, work on local API integrations, and optimize performance for cross-platform environments.

Key Responsibilities:
- Develop user-friendly and efficient desktop UIs for Windows and macOS.
- Implement and consume local/offline APIs using REST/WebSocket protocols.
- Integrate AI model workflows into the UI (offline/local deployment).
- Ensure security compliance in application design and data handling.
- Package and deploy desktop apps using cross-platform build tools.
- Optimize app performance for speed and responsiveness.
- Collaborate closely with backend, ML, and DevOps teams.
- Be open to working flexible or extended hours during high-priority phases.

Required Skills:
- TypeScript – expert in scalable UI/application logic.
- Electron or Tauri – hands-on experience with desktop app frameworks.
- Node.js – understanding of backend service integration.
- REST/WebSocket – ability to build and consume APIs for local data exchange.
- Secure Coding – knowledge of privacy-first and secure app design.
- Linux – comfortable with Linux-based development and deployment environments.

Nice-to-Have Skills:
- Familiarity with AI/ML model APIs (local or hosted).
- Knowledge of Redis or SQLite for lightweight data storage.
- Experience in plugin/module system architecture.
- Skills in cross-platform build automation (e.g., electron-builder, pkg).
- Experience working in air-gapped or security-restricted environments.

SAP Business Objects (BO) Developer

Chennai

6 - 8 years

INR 8.0 - 10.0 Lacs P.A.

Work from Office

Full Time

Job Summary: We are seeking a dedicated SAP Business Objects (BO) Developer to join our team. The ideal candidate will have extensive experience in providing technical support for SAP BO solutions, including troubleshooting, maintenance, and user assistance. This role requires strong problem-solving skills and the ability to work collaboratively with various stakeholders.

Responsibilities and Duties:
- Provide technical support for SAP Business Objects applications, including troubleshooting and resolving issues.
- Monitor and maintain the performance and availability of SAP BO systems.
- Assist users with report creation, modification, and optimization.
- Implement and maintain data models and universes in Business Objects.
- Conduct regular system audits and ensure data quality and integrity.
- Collaborate with IT and business teams to understand requirements and provide appropriate solutions.
- Document support activities, solutions, and best practices.
- Stay updated with the latest advancements in SAP Business Objects and related technologies.

Qualifications and Skills:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Extensive experience with SAP Business Objects tools (e.g., Web Intelligence, Crystal Reports, Design Studio).
- Strong knowledge of SQL and database management systems.
- Proven experience in troubleshooting and resolving technical issues.
- Excellent problem-solving skills and attention to detail.
- Strong communication skills, with the ability to explain complex technical concepts to non-technical stakeholders.
- Experience with other BI tools (e.g., Tableau, Power BI) is a plus.

AI/ML Engineer

Pune

3 - 5 years

INR 5.0 - 7.0 Lacs P.A.

Work from Office

Full Time

Role Overview: Join our Pune AI Center of Excellence to drive software and product development in the AI space. As an AI/ML Engineer, you'll build and ship core components of our AI products—owning end-to-end RAG pipelines, persona-driven fine-tuning, and scalable inference systems that power next-generation user experiences.

Key Responsibilities:
- Model Fine-Tuning & Persona Design: Adapt and fine-tune open-source large language models (LLMs) (e.g., CodeLlama, StarCoder) to specific product domains. Define and implement "personas" (tone, knowledge scope, guardrails) at inference time to align with product requirements.
- RAG Architecture & Vector Search: Build retrieval-augmented generation systems: ingest documents, compute embeddings, and serve with FAISS, Pinecone, or ChromaDB. Design semantic chunking strategies and optimize context-window management for product scalability (see the retrieval sketch below).
- Software Pipeline & Product Integration: Develop production-grade Python data pipelines (ETL) for real-time vector indexing and updates. Containerize model services in Docker/Kubernetes and integrate into CI/CD workflows for rapid iteration.
- Inference Optimization & Monitoring: Quantize and benchmark models for CPU/GPU efficiency; implement dynamic batching and caching to meet product SLAs. Instrument monitoring dashboards (Prometheus/Grafana) to track latency, throughput, error rates, and cost.
- Prompt Engineering & UX Evaluation: Craft, test, and iterate prompts for chatbots, summarization, and content extraction within the product UI. Define and track evaluation metrics (ROUGE, BLEU, human feedback) to continuously improve the product's AI outputs.

Must-Have Skills:
- ML/AI Experience: 3-4 years in machine learning and generative AI, including 18 months on LLM-based products.
- Programming & Frameworks: Python, PyTorch (or TensorFlow), Hugging Face Transformers.
- RAG & Embeddings: Hands-on with FAISS, Pinecone, or ChromaDB and semantic chunking.
- Fine-Tuning & Quantization: Experience with LoRA/QLoRA, 4-bit/8-bit quantization, and model context protocol (MCP).
- Prompt & Persona Engineering: Deep expertise in prompt-tuning and persona specification for product use cases.
- Deployment & Orchestration: Docker, Kubernetes fundamentals, CI/CD pipelines, and GPU setup.

Nice-to-Have:
- Multi-modal AI combining text, images, or tabular data.
- Agentic AI systems with reasoning and planning loops.
- Knowledge-graph integration for enhanced retrieval.
- Cloud AI services (AWS SageMaker, GCP Vertex AI, or Azure Machine Learning).
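To make the RAG responsibilities concrete, here is a minimal retrieval sketch using FAISS with sentence-transformers embeddings. The model name and toy corpus are illustrative, chunking is trivial, and the final LLM call is omitted.

```python
# Minimal RAG retrieval sketch: embed documents, index them with FAISS, and fetch
# the top-k chunks for a query to place in an LLM prompt. Corpus and model name
# are illustrative; generation (calling the LLM) is omitted.
import faiss
from sentence_transformers import SentenceTransformer

docs = [
    "Invoices are processed within 3 business days.",
    "Refunds require manager approval above 10,000 INR.",
    "Support hours are 9am to 6pm IST, Monday to Friday.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = model.encode(docs, normalize_embeddings=True).astype("float32")

# Inner product on normalized vectors is equivalent to cosine similarity.
index = faiss.IndexFlatIP(embeddings.shape[1])
index.add(embeddings)

def retrieve(query: str, k: int = 2) -> list[str]:
    q = model.encode([query], normalize_embeddings=True).astype("float32")
    _, ids = index.search(q, k)
    return [docs[i] for i in ids[0]]

question = "When do refunds need approval?"
context = "\n".join(retrieve(question))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
print(prompt)  # pass to the fine-tuned LLM of choice
```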

Data Quality Engineer

Pune

4 - 6 years

INR 4.0 - 7.0 Lacs P.A.

Work from Office

Full Time

Job Summary: We are looking for a Data Quality Engineer who will safeguard the integrity of our cloud-native data assets. You will design and execute automated and manual data-quality checks across structured and semi-structured sources on Azure and GCP, validating that our data pipelines deliver accurate, complete, and consistent datasets for analytics, reporting, and AI initiatives.

Key Responsibilities:
- Define, build, and maintain data-quality frameworks that measure accuracy, completeness, timeliness, consistency, and validity of data ingested through ETL/ELT pipelines.
- Develop automated tests using SQL, Python, or similar tools; supplement with targeted manual validation where required (see the validation sketch below).
- Collaborate with data engineers to embed data-quality gates into CI/CD pipelines on Azure Data Factory / Synapse / Fabric and GCP Dataflow / Cloud Composer.
- Profile new data sources (structured and semi-structured: JSON, Parquet, Avro) to establish baselines, detect anomalies, and recommend cleansing or transformation rules.
- Monitor data-quality KPIs and publish dashboards/alerts that surface issues to stakeholders in near-real time.
- Conduct root-cause analysis for data-quality defects, propose remediation strategies, and track resolution to closure.
- Maintain comprehensive documentation of test cases, data-quality rules, lineage, and issue logs for audit and governance purposes.
- Partner with data governance, security, and compliance teams to ensure adherence to regulatory requirements.

Must-Have Skills:
- 4-6 years of experience in data quality, data testing, or data engineering roles within cloud environments.
- Hands-on expertise with at least one major cloud data stack: Azure (Data Factory, Synapse, Databricks/Fabric) or GCP (BigQuery, Dataflow, Cloud Composer).
- Strong SQL skills and proficiency in a scripting language such as Python for building automated validation routines.
- Solid understanding of data-modeling concepts (dimensional, 3NF, data vault) and how they impact data-quality rules.
- Experience testing semi-structured data formats (JSON, XML, Avro, Parquet) and streaming/near-real-time pipelines.
- Excellent analytical and communication skills; able to translate complex data issues into clear, actionable insights for technical and business stakeholders.

Nice-to-Have Skills:
- Familiarity with BI/reporting tools (Power BI, Looker, Tableau) for surfacing data-quality metrics.

Preferred Certifications:
- Google Professional Data Engineer or Associate Cloud Engineer (GCP track), or
- Microsoft Certified: Azure Data Engineer Associate

Education: Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, Mathematics, or a related field. Comparable professional experience will also be considered.

Why Join Us? You will be the guardian of our data's trustworthiness, enabling decision-makers to rely on insights with confidence. If you are passionate about building automated, scalable data-quality solutions in a modern cloud environment, we'd love to meet you.
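A minimal sketch of the kind of automated check referred to above is shown below, using pandas. The column names, 24-hour freshness threshold, and parquet source are assumptions; in practice such assertions would run as a gate inside the pipeline (Cloud Composer, Data Factory, or CI/CD).

```python
# Hedged sketch of automated data-quality checks: completeness, uniqueness,
# validity, and freshness. Column names and thresholds are illustrative; event_ts
# is assumed to be stored as a timezone-aware UTC timestamp.
import pandas as pd

def check_orders(df: pd.DataFrame) -> dict[str, bool]:
    return {
        # Completeness: key business columns must not contain nulls.
        "no_null_keys": bool(df[["order_id", "customer_id"]].notna().all().all()),
        # Uniqueness: order_id is the primary key of the feed.
        "unique_order_id": bool(df["order_id"].is_unique),
        # Validity: amounts must be non-negative.
        "non_negative_amount": bool((df["amount"] >= 0).all()),
        # Freshness: the newest event must be less than 24 hours old.
        "fresh_within_24h": bool(pd.Timestamp.now(tz="UTC") - df["event_ts"].max()
                                 <= pd.Timedelta(hours=24)),
    }

if __name__ == "__main__":
    df = pd.read_parquet("orders.parquet")  # or a staging extract from BigQuery/Synapse
    results = check_orders(df)
    failed = [name for name, ok in results.items() if not ok]
    if failed:
        raise SystemExit(f"Data-quality checks failed: {failed}")
    print("All data-quality checks passed.")
```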

Technical Project Manager - DWH

Pune

10 - 18 years

INR 15.0 - 30.0 Lacs P.A.

Work from Office

Full Time

Job Title: TPM / Project Manager - Data Warehousing
Experience: 10+ years
Location: Pune

Company: Smartavya Analytica Private Limited is a niche Data and AI company. Based in Pune, we are pioneers in data-driven innovation, transforming enterprise data into strategic insights. Established in 2017, our team has experience in handling large datasets of up to 20 PB in a single implementation, delivering many successful data and AI projects across major industries, including retail, finance, telecom, manufacturing, insurance, and capital markets. We are leaders in Big Data, Cloud, and Analytics projects with super-specialization in very large data platforms. https://smart-analytica.com - Empowering Your Digital Transformation with Data Modernization and AI.

Job Summary: We are looking to hire a Technical Project Manager with strong hands-on experience in leading complex and large-scale data integration and consolidation initiatives. This role involves understanding business requirements, analyzing technical options, and delivering the project within timelines. It is a challenging role with the opportunity to build innovative cloud data integration solutions.

Requirements:
- 10+ years of hands-on experience as a TPM / Project Manager - DWH.
- Work closely with the customer for requirement gathering and documentation.
- Liaise with key stakeholders to define the solutions roadmap and prioritize the deliverables.
- Responsible for end-to-end project delivery from project estimation, project planning, resourcing, and support perspectives.
- Participate and contribute in solution design while implementing projects.
- Monitor and review the status of the project and ensure that the deliverables are on track.
- Effectively communicate the project status to all stakeholders on a regular basis.
- Identify and manage risks/issues related to deliverables and arrive at mitigation plans to resolve them.
- Seek proactive feedback continuously to identify areas of improvement.
- Ensure the team is creating and maintaining the knowledge artifacts for the project deliverables.
- Should be hands-on in managing on-premises-to-cloud migration projects and should have successfully executed at least one cloud migration project.

Skills:
- Hands-on and deep experience in managing migration projects.
- Secondary skills: expert in at least one DWH tool and SQL; BigQuery is preferred.
- Good knowledge of dimensional modeling.
- Experience managing medium to large projects, with proven experience in project planning, estimation, execution, and implementation.
- Proficient with various development methodologies like waterfall, agile/scrum, and iterative.
- Good interpersonal skills and excellent communication skills.
- Advanced-level Microsoft Project, PowerPoint, Visio, Excel, and Word.
- Strong team player.

Linux System Administrator

Mumbai

2 - 4 years

INR 3.0 - 5.5 Lacs P.A.

Work from Office

Full Time

Responsibilities:
- Assist in System Setup: Help install and configure platform, networking, and Linux servers (e.g., CentOS, Ubuntu, or RHEL) under guidance.
- Basic Monitoring: Monitor system performance and ensure that servers are running smoothly with basic supervision.
- Software Updates & Patches: Assist in applying basic security patches and software updates to maintain system security.
- User Management: Learn to manage user accounts, groups, and permissions on servers.
- Security Assistance: Assist in securing Linux servers by applying encryption and decryption techniques for sensitive data. Implement basic encryption (e.g., file system encryption, data-at-rest) and decryption protocols. Ensure user authentication through methods like SSH keys and password policies.
- Transport Layer Security (TLS/SSL): Help set up and manage SSL/TLS certificates for secure communications over networks (e.g., securing web servers with HTTPS). Learn to configure TLS/SSL on services such as Apache, Nginx, or other server applications.
- Documentation: Help maintain and update system documentation for procedures and configurations.
- Troubleshooting Support: Assist in troubleshooting minor system or hardware issues with guidance from senior administrators.
- Backup & Recovery: Help with backup procedures to ensure data integrity and assist in system recovery processes.
- Collaboration: Work closely with other IT professionals and senior system administrators to learn best practices.

Requirements:
- Understanding of Linux operating systems (e.g., Ubuntu, CentOS, RHEL) and networking fundamentals.
- Familiarity with command-line tools and terminal usage.
- Strong problem-solving attitude and a willingness to learn.
- Good communication skills and the ability to work in a team.
- Basic networking knowledge (e.g., DNS, IP addressing, TCP/IP).
- Interest in security concepts like encryption, decryption, and authentication methods.
- Familiarity with basic Transport Layer Security (TLS/SSL) and its implementation in web servers is a plus.
- Interest in automation and system management tools (e.g., Bash scripting) is a plus.

Benefits:
- Exposure to working with senior Linux administrators.
- Opportunities for career advancement within the organization.

Cloud DevOps Architect

Pune

10 - 15 years

INR 30.0 - 40.0 Lacs P.A.

Work from Office

Full Time

Job Title: Cloud DevOps Architect
Location: Pune, India
Experience: 10 - 15 Years
Work Mode: Full-time, Office-based
Company: Smartavya Analytica Private Limited

Company Overview: Smartavya Analytica is a niche Data and AI company based in Mumbai, established in 2017. We specialize in data-driven innovation, transforming enterprise data into strategic insights. With expertise spanning 25+ data modernization projects and handling large datasets of up to 24 PB in a single implementation, we have successfully delivered data and AI projects across multiple industries, including retail, finance, telecom, manufacturing, insurance, and capital markets. We are specialists in Cloud, Hadoop, Big Data, AI, and Analytics, with a strong focus on data modernization for on-premises, private, and public cloud platforms. Visit us at: https://smart-analytica.com

Job Summary: We are looking for an accomplished Cloud DevOps Architect to design and implement robust DevOps and infrastructure automation frameworks across Azure, GCP, or AWS environments. The ideal candidate will have a deep understanding of CI/CD, IaC, VPC networking, security, and automation using Terraform or Ansible.

Key Responsibilities:
- Architect and build end-to-end DevOps pipelines using native cloud services (Azure DevOps, AWS CodePipeline, GCP Cloud Build) and third-party tools (Jenkins, GitLab, etc.).
- Define and implement the foundation setup architecture (Azure, GCP, and AWS) as per recommended best practices.
- Design and deploy secure VPC architectures; manage networking, security groups, load balancers, and VPN gateways.
- Implement Infrastructure as Code (IaC) using Terraform or Ansible for scalable and repeatable deployments.
- Establish CI/CD frameworks integrating with Git, containers, and orchestration tools (e.g., Kubernetes, ECS, AKS, GKE).
- Define and enforce cloud security best practices, including IAM, encryption, secrets management, and compliance standards.
- Collaborate with application, data, and security teams to optimize infrastructure, release cycles, and system performance.
- Drive continuous improvement in automation, observability, and incident response practices.

Must-Have Skills:
- 10-15 years of experience in DevOps, Infrastructure, or Cloud Architecture roles.
- Deep hands-on expertise in Azure, GCP, or AWS cloud platforms (any one is mandatory, more is a bonus).
- Strong knowledge of VPC architecture, cloud security, IAM, and networking principles.
- Expertise in Terraform or Ansible for Infrastructure as Code.
- Experience building resilient CI/CD pipelines and automating application deployments.
- Strong troubleshooting skills across networking, compute, storage, and containers.

Preferred Certifications:
- Azure DevOps Engineer Expert / AWS Certified DevOps Engineer Professional / Google Professional DevOps Engineer
- HashiCorp Certified: Terraform Associate (preferred for Terraform users)

Smartavya Analytica

Industry: Data Analytics / Technology
Company size: 51-200 Employees
Open jobs: 31
