
Analytics Saves at Work

8 Job openings at Analytics Saves at Work
Power Platform Developer | India | 6-8 years | Not disclosed | On-site | Full Time

Analytics Saves at Work is looking for a Power Platform Developer to join one of our Big 4 clients. If you thrive in high-impact environments and love building efficient, secure, and scalable infrastructure, this one’s for you!

Location: Bangalore / Pune / Mumbai (On-site)
Experience Required: 6-8 years
Role: Power Platform Developer

Role Overview:
The candidate will play a pivotal role in driving Power Platform-based applications (web and mobile). They will be responsible for overseeing the solution design, development, and management of scalable enterprise-level applications and processes using Microsoft Power Platform tools in combination with Azure Data Functions and AI technologies. The primary focus will be on Power Apps, Power Automate, Power Automate Desktop, Dataverse, SharePoint Forms, AI Builder, Copilot Studio, and AI agents, with expertise in data management, integration, and visualization.

Primary Skills:
- A blend of business-process and technology skills, with strong hands-on experience on the Microsoft Power Apps platform. This position is customer-facing and responsible for delivering business transformation and technology projects.
- Deep understanding and demonstrated hands-on experience with Power Apps (Canvas, Portal, SharePoint Form Apps, and Model-Driven Apps), Power Automate Cloud Flows & Desktop, Power BI, AI Builder, and Copilot Studio.
- Expertise in implementing Power Automate Flows and Power Automate Desktop (Automated, Instant, Business Process Flows, and UI Flows).
- Experience in technical documentation, including solution design architecture, design specifications, and technical standards.
- A problem-solving mindset with the ability to analyze complex data-related challenges and devise effective solutions.
- Project management experience, including scope definition, timeline management, and resource allocation.
- Knowledge of AI agents, agent frameworks, and working with large language models (LLMs).
- Experience with Azure AI Foundry and Azure AI services.

Python & Prompt Engineer | India | 8-11 years | Not disclosed | On-site | Full Time

Analytics Saves at Work is looking for a skilled Python Developer with Prompt Engineering experience to join one of our esteemed Big 4 clients. If you thrive in high-impact environments and are passionate about building intelligent, scalable, and secure solutions using GenAI and Python, we’d love to hear from you!

📍 Location: PAN India
🧑‍💻 Work Model: Hybrid
📅 Notice Period: Immediate to 30 days (strictly)
👔 Experience: 8-11 years

Key Skills:
- 8+ years of experience with a strong background in Python development.
- Hands-on experience in prompt engineering with LLMs such as GPT and Claude (an illustrative sketch follows this posting).
- Knowledge of BigQuery (BQ) is a plus.
- Familiarity with CI/CD frameworks such as Jenkins is advantageous.
- Strong problem-solving and analytical skills.
- Good communication and collaboration abilities in cross-functional teams.

Note: We are seeking candidates who are available to join immediately or within 30 days. Please do not apply if you have fewer than 8 years of relevant experience.
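
For illustration only, not part of the client's requirements: a minimal prompt-engineering sketch in Python, assuming the OpenAI Python SDK (v1+) with an OPENAI_API_KEY environment variable set; the model name, system prompt, and summarize_ticket helper are hypothetical placeholders.

```python
# Minimal prompt-engineering sketch (hypothetical example, not a client requirement).
# Assumes the openai Python SDK (v1+) is installed and OPENAI_API_KEY is set.
from openai import OpenAI

client = OpenAI()

def summarize_ticket(ticket_text: str) -> str:
    """Illustrative helper: a structured prompt with a system role and output constraints."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {
                "role": "system",
                "content": (
                    "You are a support analyst. Summarize the ticket in at most "
                    "three bullet points and end with a one-line severity rating."
                ),
            },
            {"role": "user", "content": ticket_text},
        ],
        temperature=0.2,  # lower temperature for more deterministic summaries
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(summarize_ticket("Checkout page returns HTTP 500 for logged-in users since 09:00."))
```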

Big Data Developer | Hyderabad, Telangana, India | 6+ years | Not disclosed | On-site | Full Time

Job Description:
At Analytics Saves at Work, we are looking for a skilled and experienced Big Data Developer for one of our clients, with a strong background in PySpark and cloud technologies. The ideal candidate should have hands-on experience in Python programming, working with AWS services, and development in Java or Scala.

Key Responsibilities:
- Design, develop, and maintain scalable big data applications using PySpark (an illustrative sketch follows this posting).
- Work with data ingestion, transformation, and storage on AWS cloud infrastructure, including EMR, S3, RDS, and Glue.
- Collaborate with data engineers, architects, and stakeholders to understand data requirements and deliver reliable solutions.
- Optimize and troubleshoot big data applications for performance and scalability.
- Write clean, reusable, and well-documented Python code.
- Implement and follow best practices in big data and cloud-based architecture.

Required Skills:
- Minimum 6 years of experience in Big Data development.
- Strong hands-on experience in PySpark with Java or Scala.
- Solid understanding of AWS services: EMR, S3, RDS, Glue.
- Strong proficiency in Python programming.
- Good problem-solving and analytical skills.
- Ability to work independently and in a team environment.
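
For illustration only, not part of the job requirements: a minimal PySpark sketch of the read-transform-write pattern described above, assuming Parquet data in S3; the bucket paths, column names, and application name are hypothetical placeholders.

```python
# Illustrative PySpark job: read raw events from S3, aggregate, write back to S3.
# Paths, columns, and the app name are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-event-rollup").getOrCreate()

# Ingest: raw Parquet files from a landing bucket (e.g. written by an upstream producer).
events = spark.read.parquet("s3://example-landing-bucket/events/")

# Transform: keep valid rows and roll up to one record per user per day.
daily = (
    events
    .filter(F.col("event_type").isNotNull())
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("user_id", "event_date")
    .agg(F.count("*").alias("event_count"))
)

# Store: partitioned Parquet in a curated bucket for downstream consumers (e.g. Glue/Athena).
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-curated-bucket/daily_event_rollup/"
)

spark.stop()
```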

Technical Manager | Bengaluru, Karnataka, India | 6-10 years | Not disclosed | On-site | Full Time

Analytics Saves at Work is looking for a Technical Manager to join one of our esteemed Big 4 clients. This is an exciting opportunity for a hands-on leader with strong technical skills and proven team-management experience to work on cutting-edge cloud-based streaming solutions.

📍 Location: Bangalore
🧑‍💻 Experience: 6 to 10 years

Key Responsibilities:
- Lead and manage a team of 10+ engineers focused on the development and enhancement of full stack cloud-based applications.
- Architect and implement scalable solutions using Java, Spring Boot, React.js, and AWS.
- Oversee end-to-end project delivery, including planning, execution, monitoring, and risk management.
- Enforce best practices for software development, including code quality, testing, version control, and CI/CD.
- Mentor team members and foster a culture of continuous improvement and innovation.
- Conduct code reviews and support technical troubleshooting as needed.

Skills & Experience:
- 6-10 years of experience in software development with a focus on backend and full stack technologies.
- Strong programming experience in Java and Spring Boot.
- Proficiency with React.js for front-end development.
- Hands-on experience with AWS services and cloud architecture.
- Minimum of 3 years in a managerial role, leading teams of 10+ developers.
- Experience in building or maintaining streaming or real-time data platforms is a plus.
- Strong communication, leadership, and stakeholder management skills.

Looker Developer | Chennai, Tamil Nadu, India | 5-9 years | Not disclosed (INR) | On-site | Full Time

As a Google Looker Developer at Analytics Saves at Work, your primary responsibility will be to develop and maintain data visualization dashboards, create custom reports, and optimize data models for business intelligence and analytics purposes. You will be based in either Chennai or Bangalore.

To excel in this role, you should possess strong skills in data visualization, dashboard development, and reporting. You must have at least 5 years of experience working with Google Looker, Power BI, and Tableau. Proficiency in SQL querying is essential, and experience in React is required. Additionally, you should have knowledge of data modeling and analytics, along with a background in business intelligence.

Problem-solving skills, attention to detail, and the ability to work effectively in a collaborative team environment are key attributes for success in this position. A Bachelor's degree in Computer Science, Information Systems, or a related field is necessary to qualify for this role.

Data Engineer | South Delhi, Delhi, India | 3-7 years | Not disclosed | On-site | Full Time

Job Description:
We are looking for a skilled Data Engineer with 3-7 years of experience and strong expertise in data integration, ETL pipelines, and cloud infrastructure. The ideal candidate will be proficient in SQL, Python, and MongoDB, with hands-on experience in building scalable data pipelines and working across multiple databases. The role requires a platform-agnostic mindset with exposure to AWS services, messaging systems, and monitoring tools. The selected candidate will be working at our client site in Delhi, and this is a Work From Office (WFO) opportunity.

Experience: 3-7 years
Location: Delhi

Key Responsibilities:
- Design, develop, and maintain ETL pipelines and database schemas to support business and analytics needs (an illustrative sketch follows this posting).
- Work with multi-database architectures (SQL, NoSQL, MongoDB), ensuring scalability and efficiency.
- Deploy and manage AWS resources such as Lambda functions and EC2 instances.
- Integrate and optimize streaming/messaging frameworks such as Kafka and caching systems like Redis.
- Collaborate with cross-functional teams to ensure seamless data flow across platforms.
- Monitor infrastructure and system performance using tools such as Grafana, CloudWatch, or equivalent monitoring solutions.
- Ensure data quality, security, and compliance standards are consistently maintained.

Required Skills & Experience:
- Strong programming experience in SQL, Python, and MongoDB.
- Proven experience in building and managing ETL pipelines.
- Ability to work in a platform-agnostic environment.
- Hands-on experience with AWS services (Lambda, EC2).
- Exposure to Kafka / Redis.
- Experience with monitoring tools (Grafana, CloudWatch, etc.).
- Strong problem-solving skills and ability to work in a fast-paced environment.
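
For illustration only, not part of the job requirements: a minimal Python sketch of an AWS Lambda-style ETL step that extracts rows from a SQL source and loads them into MongoDB, assuming the psycopg2 and pymongo libraries; the connection strings, table, and collection names are hypothetical placeholders.

```python
# Illustrative ETL step: extract from a SQL source, light transform, load into MongoDB.
# Connection details, table, and collection names are hypothetical placeholders.
import os

import psycopg2           # assumed SQL driver (PostgreSQL)
from pymongo import MongoClient

def handler(event, context):
    """Lambda-style entry point: copy yesterday's orders into a reporting collection."""
    # Extract: pull rows from the relational source.
    with psycopg2.connect(os.environ["PG_DSN"]) as conn:
        with conn.cursor() as cur:
            cur.execute(
                "SELECT order_id, customer_id, amount, created_at "
                "FROM orders WHERE created_at::date = current_date - 1"
            )
            rows = cur.fetchall()

    # Transform: shape tuples into documents.
    docs = [
        {"order_id": r[0], "customer_id": r[1], "amount": float(r[2]), "created_at": r[3]}
        for r in rows
    ]

    # Load: write into MongoDB for downstream analytics.
    if docs:
        mongo = MongoClient(os.environ["MONGO_URI"])
        mongo["reporting"]["daily_orders"].insert_many(docs)

    return {"loaded": len(docs)}
```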

.NET Azure Full Stack Developer | Karnataka, India | 5-9 years | Not disclosed (INR) | On-site | Full Time

Analytics Saves at Work is looking for a .NET Azure Full Stack Developer (React) to join a client in the Financial Services industry. As a developer, you will play a crucial role in designing and constructing scalable web applications utilizing modern .NET and Azure technologies, with a focus on front-end proficiency in React and TypeScript. Your expertise in backend API development, cloud-native services, and CI/CD practices will be instrumental in driving digital transformation and modernization efforts.

Responsibilities include designing, developing, and maintaining secure backend services using .NET Core (C#) and Web API. You will also be responsible for implementing and optimizing front-end applications with React JS, TypeScript, Redux, Hooks, and Tailwind CSS. Integration and automation of workflows with Azure PaaS services like Azure Functions, Azure Data Factory, and Azure App Services are essential tasks. Collaboration with DevOps teams to establish and manage CI/CD pipelines through Azure DevOps is crucial to the role. Applying software design patterns and architectural best practices is necessary to uphold code quality and performance standards. Participation in code reviews, architecture discussions, and agile ceremonies is expected, along with monitoring and troubleshooting production applications to maintain system reliability and availability. Cross-functional teamwork with product managers, designers, and fellow engineers is essential to deliver high-quality solutions.

Qualifications for this position include a minimum of 5 years of hands-on development experience with .NET Core (C#) and RESTful APIs. Strong proficiency in front-end development using React JS, TypeScript, and modern CSS frameworks like Tailwind is required. Experience with Azure PaaS services, especially Azure Functions and Azure Data Factory, is essential. The ability to build and maintain CI/CD pipelines in Azure DevOps is a must-have skill. A solid grasp of object-oriented design principles, design patterns, Redux, React Hooks, and state management in complex front-end applications is necessary. Familiarity with Git, automated testing, and agile methodologies is beneficial, along with excellent problem-solving, communication, and collaboration abilities.

Power BI & Data Automation Specialist | South Delhi, Delhi, India | 3-7 years | Not disclosed | On-site | Full Time

Job Title: Power BI & Data Automation Specialist
Location: Delhi (on-site at client office)
Experience: 3 to 7 years

Job Summary:
We are looking for experienced professionals to support our reporting and data analytics initiatives. The selected candidates will work as an extended part of our internal team, helping with report automation, Power BI dashboard development, and analytics layer creation from core business data. This is an on-site role in Delhi, and all necessary infrastructure (laptop, software) will be provided.

Key Responsibilities:

🔹 Report Automation
- Automate recurring business reports across different frequencies (hourly, daily, weekly, monthly); an illustrative automation sketch follows this posting.
- Write and schedule SQL queries/jobs to generate output for automated distribution.
- Manually generate reports during initial setup phases while automation is being built.

🔹 Power BI Development
- Build and enhance interactive dashboards in Power BI based on business needs.
- Implement slicers, drill-throughs, KPIs, and advanced DAX measures.
- Design scalable and reusable Power BI reports connected to backend data pipelines.

🔹 Analytics Data Layer Creation
- Work with engineering teams to transform core transactional data into a reporting-friendly analytics layer.
- Design data models and summary tables that support efficient Power BI reporting.
- Ensure data accuracy, consistency, and performance optimization in the reporting layer.

Required Skills & Experience:
- 3 to 7 years of experience in reporting, BI development, or data engineering.
- Strong hands-on experience with Power BI Desktop, DAX, and report design.
- Proficiency in SQL and working with relational databases (e.g., SQL Server, PostgreSQL).
- Experience in scheduling/query automation using tools like SQL Agent, PowerShell, or cloud-based schedulers.
- Understanding of data modeling (star/snowflake), ETL pipelines, and analytics layer design.
- Ability to collaborate with business and technical teams for requirement gathering and delivery.

Preferred Qualifications:
- Experience with Power BI Service, including dataset refresh, RLS, and gateway setup.
- Familiarity with cloud data platforms (Azure, AWS) is a plus.
- Prior experience working in a consulting or extended team model is an advantage.
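
For illustration only, not part of the job requirements: a minimal Python sketch of a scheduled report-automation step that runs a SQL summary query and writes the result to a CSV for distribution or for a Power BI dataset to pick up, assuming pandas and SQLAlchemy; the connection string, query, and output path are hypothetical placeholders.

```python
# Illustrative report-automation step: run a summary query, save output for distribution.
# The connection string, query, and output path are hypothetical placeholders; in practice
# this script would be triggered by SQL Agent, cron, or a cloud-based scheduler.
import os
from datetime import date

import pandas as pd
from sqlalchemy import create_engine

def build_daily_sales_report(output_dir: str = "reports") -> str:
    engine = create_engine(os.environ["REPORTING_DB_URL"])  # e.g. a PostgreSQL or SQL Server URL

    # Summary query feeding the analytics layer / recurring report (PostgreSQL-style date math).
    query = """
        SELECT region, product_line, SUM(amount) AS total_sales, COUNT(*) AS order_count
        FROM sales_orders
        WHERE order_date = CURRENT_DATE - 1
        GROUP BY region, product_line
        ORDER BY total_sales DESC
    """
    df = pd.read_sql(query, engine)

    os.makedirs(output_dir, exist_ok=True)
    path = os.path.join(output_dir, f"daily_sales_{date.today():%Y%m%d}.csv")
    df.to_csv(path, index=False)  # ready for email distribution or Power BI ingestion
    return path

if __name__ == "__main__":
    print(f"Report written to {build_daily_sales_report()}")
```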