Jobs
Interviews

149 QuickSight Jobs

Set up a job alert
JobPe aggregates results for easy application access, but you apply directly on the employer's job portal.

4.0 - 6.0 years

8 - 13 Lacs

Saidapet, Tamil Nadu

Work from Office

Introduction to the Role: Are you passionate about unlocking the power of data to drive innovation and transform business outcomes? Join our cutting-edge Data Engineering team and be a key player in delivering scalable, secure, and high-performing data solutions across the enterprise. As a Data Engineer, you will play a central role in designing and developing modern data pipelines and platforms that support data-driven decision-making and AI-powered products. With a focus on Python, SQL, AWS, PySpark, and Databricks, you'll enable the transformation of raw data into valuable insights by applying engineering best practices in a cloud-first environment. We are looking for a highly motivated professional who can work across teams to build and manage robust, efficient, and secure data ecosystems that support both analytical and operational workloads.

Accountabilities:
- Design, build, and optimize scalable data pipelines using PySpark, Databricks, and SQL on AWS cloud platforms.
- Collaborate with data analysts, data scientists, and business users to understand data requirements and ensure reliable, high-quality data delivery.
- Implement batch and streaming data ingestion frameworks from a variety of sources (structured, semi-structured, and unstructured data).
- Develop reusable, parameterized ETL/ELT components and data ingestion frameworks.
- Perform data transformation, cleansing, validation, and enrichment using Python and PySpark.
- Build and maintain data models, data marts, and logical/physical data structures that support BI, analytics, and AI initiatives.
- Apply best practices in software engineering, version control (Git), code reviews, and agile development processes.
- Ensure data pipelines are well tested, monitored, and robust, with proper logging and alerting mechanisms.
- Optimize the performance of distributed data processing workflows and large datasets.
- Leverage AWS services (such as S3, Glue, Lambda, EMR, Redshift, Athena) for data orchestration and lakehouse architecture design.
- Participate in data governance practices and ensure compliance with data privacy, security, and quality standards.
- Contribute to documentation of processes, workflows, metadata, and lineage using tools such as Data Catalogs or Collibra (if applicable).
- Drive continuous improvement in engineering practices, tools, and automation to increase productivity and delivery quality.

Essential Skills / Experience:
- 4 to 6 years of professional experience in Data Engineering or a related field.
- Strong programming experience with Python, including data wrangling, pipeline automation, and scripting.
- Deep expertise in writing complex, optimized SQL queries on large-scale datasets.
- Solid hands-on experience with PySpark and distributed data processing frameworks.
- Expertise working with Databricks for developing and orchestrating data pipelines.
- Experience with AWS cloud services such as S3, Glue, EMR, Athena, Redshift, and Lambda.
- Practical understanding of ETL/ELT development patterns and data modeling principles (Star/Snowflake schemas).
- Experience with job orchestration tools like Airflow, Databricks Jobs, or AWS Step Functions.
- Understanding of data lake, lakehouse, and data warehouse architectures.
- Familiarity with DevOps and CI/CD tools for code deployment (e.g., Git, Jenkins, GitHub Actions).
- Strong troubleshooting and performance optimization skills in large-scale data processing environments.
- Excellent communication and collaboration skills, with the ability to work in cross-functional agile teams.

Desirable Skills / Experience:
- AWS or Databricks certifications (e.g., AWS Certified Data Analytics, Databricks Data Engineer Associate/Professional).
- Exposure to data observability, monitoring, and alerting frameworks (e.g., Monte Carlo, Datadog, CloudWatch).
- Experience working in healthcare, life sciences, finance, or another regulated industry.
- Familiarity with data governance and compliance standards (GDPR, HIPAA, etc.).
- Knowledge of modern data architectures (Data Mesh, Data Fabric).
- Exposure to streaming data tools like Kafka, Kinesis, or Spark Structured Streaming.
- Experience with data visualization tools such as Power BI, Tableau, or QuickSight.
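To illustrate the "reusable, parameterized" cleansing and validation work this posting describes, here is a minimal sketch in plain Python. It is not the employer's code: in practice this logic would run inside a PySpark/Databricks job, and every record and field name here is hypothetical.

```python
# Illustrative sketch only: a parameterized cleansing/validation step of the
# kind the posting lists. Plain Python stands in for the PySpark equivalent.
def cleanse_records(records, required_fields, defaults=None):
    """Drop records missing required fields; fill optional gaps from defaults."""
    defaults = defaults or {}
    clean, rejected = [], []
    for rec in records:
        if all(rec.get(f) not in (None, "") for f in required_fields):
            # defaults first, then any non-empty values from the record
            merged = {**defaults, **{k: v for k, v in rec.items() if v not in (None, "")}}
            clean.append(merged)
        else:
            rejected.append(rec)  # kept for the logging/alerting the posting requires

    return clean, rejected

raw = [
    {"id": 1, "amount": 120.5, "region": "south"},
    {"id": 2, "amount": None, "region": "north"},  # fails validation: amount missing
    {"id": 3, "amount": 80.0, "region": ""},       # empty region filled from defaults
]
clean, rejected = cleanse_records(raw, required_fields=["id", "amount"],
                                  defaults={"region": "unknown"})
print([r["id"] for r in clean])  # ids that passed validation
```

The same shape (validate, quarantine rejects, enrich with defaults) maps directly onto a PySpark `filter`/`fillna` chain when the data no longer fits in memory.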

Posted 3 days ago

Apply

8.0 - 10.0 years

15 - 25 Lacs

Hyderabad, Secunderabad

Work from Office

Lead Data Analyst

We are seeking a skilled and motivated Lead Data Analyst to oversee the successful implementation and ongoing support of Business Intelligence (BI) solutions for all customers. Leveraging AWS Redshift and Amazon QuickSight, this role will ensure seamless BI integration, data accuracy, and ETL optimization, and will manage release upgrades, daily operations, and internal reporting needs.

Responsibilities:
- Lead and manage the end-to-end implementation of BI solutions across all customer projects, ensuring timely and high-quality delivery.
- Oversee and optimize ETL processes to ensure efficient data handling, storage, and retrieval within the BI framework.
- Manage daily operations, including bug tracking, prioritizing enhancements, and coordinating resolutions with both the technical team and customer contacts.
- Develop and manage internal reports and dashboards to support decision-making, providing insights into KPIs and operational metrics for key stakeholders.
- Coordinate with customers to plan and execute BI-related activities for each new release, ensuring minimal disruption and seamless upgrades.
- Oversee the health and performance of BI systems, working with the infrastructure team to ensure data integrity, speed, and reliability.
- Work closely with Product Engineering and Customer Success teams to understand requirements, align on priorities, and provide feedback for product improvements.
- Troubleshoot technical issues and provide solutions to ensure smooth operation of QuickSight dashboards.
- Identify opportunities to improve data analysis processes and enhance overall data analytics capabilities.
- Ensure comprehensive documentation and knowledge transfer, providing training and guidance to internal teams and end-users as needed.

Requirements:
- Bachelor's or Master's degree in a relevant field such as Data Science, Business Analytics, Information Systems, or Computer Science.
- 6+ years of progressively increasing professional experience in Data Analytics, with at least 3+ years in a senior-level role.
- 3+ years of experience with Amazon QuickSight and SQL.
- Strong experience building visualizations, dashboards, and reports using QuickSight.
- Strong SQL skills for data extraction, transformation, and loading from various data sources.
- Experience working closely with stakeholders to understand needs and define analytic requirements.
- Experience with data warehousing concepts and ETL processes.
- Cloud computing and AWS experience.
- Excellent communication, collaboration, analytical, and problem-solving skills.

Posted 3 days ago

Apply

2.0 - 6.0 years

10 - 14 Lacs

Bengaluru

Work from Office

About The Role

Project Role: Cloud Platform Engineer
Project Role Description: Designs, builds, tests, and deploys cloud application solutions that integrate cloud and non-cloud infrastructure. Can deploy infrastructure and platform environments, and creates a proof of architecture to test architecture viability, security, and performance.
Must have skills: AWS Analytics
Good to have skills: AWS Architecture
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Cloud Platform Engineer, you will engage in the design, construction, testing, and deployment of cloud application solutions that seamlessly integrate both cloud and non-cloud infrastructures. Your typical day will involve collaborating with various teams to ensure the architecture's viability, security, and performance, while also creating proofs of concept to validate your designs. You will be responsible for managing the deployment of infrastructure and platform environments, ensuring that all components work harmoniously together to meet organizational goals. Your role will require a proactive approach to problem-solving and a commitment to delivering high-quality solutions that enhance operational efficiency and effectiveness.

Key Responsibilities:
- Extensive experience in the design, development, and deployment of modern data frameworks.
- Good hands-on experience with AWS Glue, Spark, S3, MSK, Lambda, EMR, Redshift, Athena, and PySpark.
- Good knowledge of building and delivering a comprehensive data strategy roadmap.
- Understanding of homogeneous and heterogeneous on-prem to AWS database migration using AWS native tools (DMS, SCT) and/or third-party data migration and maintenance tools.
- Work with clients to evaluate their current platform and suggest improvements.
- Good understanding of AWS relational and NoSQL database technologies.

Technical Experience:
- 15 - 20 years of experience in the industry, with at least 10 years in AWS.
- Must have AWS Glue.
- Must have an AWS Professional or Specialty level certification.
- Experienced in working in a client-facing role.
- Good to have: knowledge of AWS QuickSight and Tableau.
- Good to have: knowledge of Python.
- Good understanding of cloud migration (the 6 Rs of migration) and application modernization.

Qualification: 15 years full time education

Posted 4 days ago

Apply

3.0 - 5.0 years

10 - 14 Lacs

Bengaluru

Work from Office

About The Role

Project Role: Cloud Platform Engineer
Project Role Description: Designs, builds, tests, and deploys cloud application solutions that integrate cloud and non-cloud infrastructure. Can deploy infrastructure and platform environments, and creates a proof of architecture to test architecture viability, security, and performance.
Must have skills: AWS Analytics
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Cloud Platform Engineer, you will engage in the design, construction, testing, and deployment of cloud application solutions that seamlessly integrate both cloud and non-cloud infrastructures. Your typical day will involve collaborating with cross-functional teams to ensure that the solutions meet performance and security standards, while also validating architectural designs through proofs of concept. You will be responsible for deploying infrastructure and platform environments, ensuring that all components work harmoniously together to deliver robust cloud solutions.

Key Responsibilities:
- Design, develop, and manage robust data pipelines and ETL processes using AWS services (e.g., AWS Glue, Amazon Redshift, Amazon S3, Amazon Kinesis).
- Perform data migration from on-premises to AWS using AWS services (AWS DMS, AWS SCT, etc.).
- Implement and maintain data models, schemas, and storage solutions to meet business intelligence and analytics needs.
- Collaborate with data scientists, analysts, and engineering teams to understand data requirements and deliver effective solutions.
- Monitor, troubleshoot, and optimize data workflows to ensure high performance, accuracy, and availability.
- Develop and enforce data security and compliance measures in line with best practices and organizational policies.
- Contribute to architecture decisions and provide recommendations for improving data infrastructure.
- Document data processes, architectures, and configurations to facilitate knowledge sharing and future reference.
- Stay informed about AWS updates and emerging technologies to integrate new features and tools into our data solutions.

Technical Requirements:
- 3-5 years of experience in data engineering, with a strong focus on AWS cloud services.
- Proficiency with AWS services such as Amazon S3, AWS Glue, Amazon Redshift, AWS Lambda, and Amazon RDS.
- Solid experience with SQL and relational databases (e.g., PostgreSQL, MySQL).
- Strong programming skills in languages such as Python, Java, or Scala for data processing and automation.
- Experience with data modeling, data warehousing, and ETL/ELT processes.
- Understanding of data security practices and regulatory compliance requirements.
- Excellent problem-solving abilities and analytical skills.
- Strong communication skills, with the ability to work effectively in a collaborative environment.

Preferred Qualifications:
- AWS certifications such as AWS Certified Data Analytics - Specialty or AWS Certified Solutions Architect.
- Experience with big data technologies like Apache Hadoop and Apache Spark.
- Familiarity with data visualization tools (e.g., Tableau, Amazon QuickSight).

Qualification: 15 years full time education
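The ETL and data-modeling duties this posting lists (staging raw rows, then loading a star-schema fact/dimension pair) can be sketched as follows. This is an illustrative stand-in only: SQLite substitutes for Amazon Redshift, and all table and column names are hypothetical.

```python
import sqlite3

# Illustrative ETL sketch: extract from a staging table, build a dimension
# with surrogate keys, then load a fact table joined to it.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE staging_sales(order_id INT, city TEXT, amount REAL);
    CREATE TABLE dim_city(city_id INTEGER PRIMARY KEY, city TEXT UNIQUE);
    CREATE TABLE fact_sales(order_id INT, city_id INT, amount REAL);
""")
conn.executemany("INSERT INTO staging_sales VALUES (?, ?, ?)",
                 [(1, "Bengaluru", 100.0), (2, "Hyderabad", 250.0), (3, "Bengaluru", 50.0)])

# Transform/load: populate the dimension, then the fact table keyed by surrogate id.
conn.execute("INSERT OR IGNORE INTO dim_city(city) SELECT DISTINCT city FROM staging_sales")
conn.execute("""
    INSERT INTO fact_sales
    SELECT s.order_id, d.city_id, s.amount
    FROM staging_sales s JOIN dim_city d ON d.city = s.city
""")

# A typical BI query over the star schema: totals per dimension value.
totals = dict(conn.execute("""
    SELECT d.city, SUM(f.amount) FROM fact_sales f
    JOIN dim_city d ON d.city_id = f.city_id GROUP BY d.city
""").fetchall())
print(totals)
```

In a real pipeline the same pattern would typically be expressed as an AWS Glue job writing to Redshift; the schema split (narrow fact table, re-usable dimensions) is what the "Star/Snowflake" modeling requirement refers to.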

Posted 4 days ago

Apply

6.0 - 10.0 years

0 Lacs

Andhra Pradesh

On-site

As a Data Visualisation Analyst (Mobile version) at Pulsus Group, your main role involves creating robust, reusable data querying, transformation, and visualization processes in SQL and Excel. You will be responsible for integrating multiple data sources into one visualization to convey a compelling story. Your expertise in leveraging analytics and visualization tools will be crucial in presenting information in a manner that drives fact-based decision-making. Additionally, you will develop analytical reports and represent large, complex sets of data through visual representations.

Key Responsibilities:
- Evaluate business needs and enhance efficiencies and capabilities through the use of technology, including automation and improvement of operational processes aligned with the strategy of enhancing overall analytics and business.
- Perform data analysis from various sources and generate reports such as dashboards, customer website journey analytics, lead performance analysis, digital and social campaign analytics, customer response analysis, and return on digital marketing investment analysis.
- Contribute to building and maintaining a comprehensive reporting and tracking strategy for optimal marketing campaign performance.
- Utilize visualization software such as Tableau, Qlik Sense, QlikView, d3.js, and QuickSight.
- Work with large data sets; write and execute SQL queries for data extraction, review, and understanding.
- Manage reporting, data management, or data analytics, including standard and ad-hoc reporting.
- Demonstrate passion for analytics and data-driven marketing, the ability to handle multiple projects simultaneously, and strong analytical and detail-oriented skills.
- Exhibit intellectual curiosity and a strong willingness to learn.

Qualifications Required:
- Experience with visualization software (Tableau, Qlik Sense, QlikView, d3.js, QuickSight).
- Proficiency in working with large data sets and the ability to write and execute SQL queries.
- Experience in reporting, data management, or data analytics, including standard and ad-hoc reporting.
- Passion for analytics and data-driven marketing.
- Ability to manage multiple projects concurrently.
- Highly analytical and detail-oriented.
- Intellectual curiosity and a strong willingness to learn.

The salary offered for this position is best in the industry. Pulsus Group was established in 1984, focusing on the American region before expanding its healthcare informatics platform globally. Pulsus Group has received endorsements from medical associations and industries internationally, bridging relations between industries and practicing physicians.

For job-related queries, contact vizag@pulsus.com or hr.vizag@pulsus.com. For general queries, contact contact@pulsus.com or call 0891-3356302 or 8712290488.

Posted 4 days ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Vadodara

Work from Office

JLL supports the Whole You, personally and professionally.

The Compliance Analyst Specialist maintains compliance capabilities while adding dedicated compliance analyst specialist support for: (1) Orbit Data Governance and (2) Asset Management Inventory Governance and File/Edit Document Management.

Key Responsibilities:

GRC Control Controllership Program (GREF GRC Controls)
- Use data analytics to create a trend analysis on GRC Controls linked to GREF systems and processes.
- Create a reporting mechanism (dashboard) for compliance effectiveness and monitoring of repeated issues.
- Identify opportunity areas and create an action plan to minimize the risk of repeated issues and/or identify unmitigated risks (i.e., residual risks).

Data Integrity Program (GREF tools)
- Use data analytics to create trend analysis on ORBIT issues.
- Identify opportunity areas and create action plans to improve productivity and minimize the risk of repeated issues.

Compliance Monitoring and Reporting Dashboard for Regulatory Requirements for Data Governance
- Use data analytics to assess adherence to regulatory requirements (e.g., GDPR, CCPA, internal policies).
- Identify opportunity areas and create action plans to mitigate risks and avoid repeated issues.

Data Cleaning and Preprocessing for Free Trade Zones (FTZ) Project
- Remove duplicates and handle missing values from inputs provided by various stakeholders such as Tax, Accounting, GST, PXT, and IT.
- Standardize data formats to enable automation of reports and returns (~110 monthly reports).

Exploratory Data Analysis (EDA) for FTZ Project
- Create visualizations (histograms, scatter plots, box plots) for leadership and design interactive dashboards using tools like Tableau/Power BI.
- Identify patterns and trends in imports, exports, GST benefits, etc.

Compliance Dashboard Creation and Maintenance for FTZ and GREF Building Compliance
- Create regular performance reports and dashboards for compliance metrics.
- Set up automated reporting systems and monitor KPIs.
- Track compliance rates with customs regulations for FTZ.
- Analyze documentation accuracy and monitor restricted goods movement.
- Create compliance reporting dashboards.

Ad-hoc Analysis and Reporting for FTZ Project
- Respond to specific data-related business questions.
- Create custom reports based on deep-dive analyses.

Document Management System (DMS) Administration for FTZ Project
- Monitor document retention compliance.
- Set up automated filing systems and manage asset documentation throughout its lifecycle.

GREF Building Compliance Assurance Program (B-CAP)
- Build the B-CAP repository and review existing documents across all sites in India.
- Develop, monitor, and report key performance indicators for GREF B-CAP.
- Support GREF B-CAP pre-launch testing.
- Contingent Worker Cost Monitoring and Reporting.

Required Skills and Experience:
- Bachelor's degree in Computer Science, Information Management, or a related field.
- Data Management: Proficiency in database systems, data reporting, and data presentation. Must ensure data integrity and accessibility.
- Analytics: Strong analytical skills with the ability to identify trends and derive actionable insights from complex datasets.
- Visualization: Expertise in data visualization tools to create clear, impactful reports and dashboards.
- Business Acumen: Ability to understand business objectives and translate data findings into valuable recommendations.
- Experience with data source integrations (via API, ETL, virtualization, streaming) is desirable but not required.
- Communication: Excellent verbal and written skills to present complex information clearly to diverse stakeholders.
- Technical Adaptability: Proficiency in relevant data analysis tools and programming languages, with a commitment to continuous learning in this rapidly evolving field.

Desired or preferred experience and technical skills: Proficient in Microsoft Office tools; experience working with cloud-based tools such as Smartsheet, Salesforce, Excel, Tableau, QuickSight, Visio, Power BI, and AppSheet.
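The FTZ data-cleaning tasks this posting describes (removing duplicates and standardizing formats so ~110 monthly reports can be automated) can be sketched in plain Python. This is illustrative only; the field names and the two date formats shown are hypothetical, not JLL's actual schema.

```python
from datetime import datetime

# Illustrative sketch: de-duplicate stakeholder inputs and normalize mixed
# date formats to ISO 8601 so downstream reporting can be automated.
def standardize(rows, date_field="invoice_date"):
    seen, out = set(), []
    for row in rows:
        key = (row.get("invoice_no"), row.get(date_field))
        if key in seen:
            continue  # drop exact duplicates
        seen.add(key)
        raw = row.get(date_field)
        if raw:  # try each known input format; keep the value as-is if none match
            for fmt in ("%d/%m/%Y", "%Y-%m-%d"):
                try:
                    row[date_field] = datetime.strptime(raw, fmt).date().isoformat()
                    break
                except ValueError:
                    pass
        out.append(row)
    return out

rows = [
    {"invoice_no": "A1", "invoice_date": "05/01/2024"},
    {"invoice_no": "A1", "invoice_date": "05/01/2024"},  # duplicate, dropped
    {"invoice_no": "A2", "invoice_date": "2024-01-07"},
]
cleaned = standardize(rows)
print(cleaned)
```

In practice the same step is often done in Excel/Power Query or pandas; the point is that one standardized format upstream is what makes automated monthly reporting feasible.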

Posted 5 days ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Bengaluru

Work from Office

JLL supports the Whole You, personally and professionally.

The Compliance Analyst Specialist maintains compliance capabilities while adding dedicated compliance analyst specialist support for: (1) Orbit Data Governance and (2) Asset Management Inventory Governance and File/Edit Document Management.

Key Responsibilities:

GRC Control Controllership Program (GREF GRC Controls)
- Use data analytics to create a trend analysis on GRC Controls linked to GREF systems and processes.
- Create a reporting mechanism (dashboard) for compliance effectiveness and monitoring of repeated issues.
- Identify opportunity areas and create an action plan to minimize the risk of repeated issues and/or identify unmitigated risks (i.e., residual risks).

Data Integrity Program (GREF tools)
- Use data analytics to create trend analysis on ORBIT issues.
- Identify opportunity areas and create action plans to improve productivity and minimize the risk of repeated issues.

Compliance Monitoring and Reporting Dashboard for Regulatory Requirements for Data Governance
- Use data analytics to assess adherence to regulatory requirements (e.g., GDPR, CCPA, internal policies).
- Identify opportunity areas and create action plans to mitigate risks and avoid repeated issues.

Data Cleaning and Preprocessing for Free Trade Zones (FTZ) Project
- Remove duplicates and handle missing values from inputs provided by various stakeholders such as Tax, Accounting, GST, PXT, and IT.
- Standardize data formats to enable automation of reports and returns (~110 monthly reports).

Exploratory Data Analysis (EDA) for FTZ Project
- Create visualizations (histograms, scatter plots, box plots) for leadership and design interactive dashboards using tools like Tableau/Power BI.
- Identify patterns and trends in imports, exports, GST benefits, etc.

Compliance Dashboard Creation and Maintenance for FTZ and GREF Building Compliance
- Create regular performance reports and dashboards for compliance metrics.
- Set up automated reporting systems and monitor KPIs.
- Track compliance rates with customs regulations for FTZ.
- Analyze documentation accuracy and monitor restricted goods movement.
- Create compliance reporting dashboards.

Ad-hoc Analysis and Reporting for FTZ Project
- Respond to specific data-related business questions.
- Create custom reports based on deep-dive analyses.

Document Management System (DMS) Administration for FTZ Project
- Monitor document retention compliance.
- Set up automated filing systems and manage asset documentation throughout its lifecycle.

GREF Building Compliance Assurance Program (B-CAP)
- Build the B-CAP repository and review existing documents across all sites in India.
- Develop, monitor, and report key performance indicators for GREF B-CAP.
- Support GREF B-CAP pre-launch testing.
- Contingent Worker Cost Monitoring and Reporting.

Required Skills and Experience:
- Bachelor's degree in Computer Science, Information Management, or a related field.
- Data Management: Proficiency in database systems, data reporting, and data presentation. Must ensure data integrity and accessibility.
- Analytics: Strong analytical skills with the ability to identify trends and derive actionable insights from complex datasets.
- Visualization: Expertise in data visualization tools to create clear, impactful reports and dashboards.
- Business Acumen: Ability to understand business objectives and translate data findings into valuable recommendations.
- Experience with data source integrations (via API, ETL, virtualization, streaming) is desirable but not required.
- Communication: Excellent verbal and written skills to present complex information clearly to diverse stakeholders.
- Technical Adaptability: Proficiency in relevant data analysis tools and programming languages, with a commitment to continuous learning in this rapidly evolving field.

Desired or preferred experience and technical skills: Proficient in Microsoft Office tools; experience working with cloud-based tools such as Smartsheet, Salesforce, Excel, Tableau, QuickSight, Visio, Power BI, and AppSheet.

Posted 5 days ago

Apply

2.0 - 4.0 years

5 - 7 Lacs

Mumbai

Work from Office

JLL supports the Whole You, personally and professionally.

The Compliance Analyst Specialist maintains compliance capabilities while adding dedicated compliance analyst specialist support for: (1) Orbit Data Governance and (2) Asset Management Inventory Governance and File/Edit Document Management.

Key Responsibilities:

GRC Control Controllership Program (GREF GRC Controls)
- Use data analytics to create a trend analysis on GRC Controls linked to GREF systems and processes.
- Create a reporting mechanism (dashboard) for compliance effectiveness and monitoring of repeated issues.
- Identify opportunity areas and create an action plan to minimize the risk of repeated issues and/or identify unmitigated risks (i.e., residual risks).

Data Integrity Program (GREF tools)
- Use data analytics to create trend analysis on ORBIT issues.
- Identify opportunity areas and create action plans to improve productivity and minimize the risk of repeated issues.

Compliance Monitoring and Reporting Dashboard for Regulatory Requirements for Data Governance
- Use data analytics to assess adherence to regulatory requirements (e.g., GDPR, CCPA, internal policies).
- Identify opportunity areas and create action plans to mitigate risks and avoid repeated issues.

Data Cleaning and Preprocessing for Free Trade Zones (FTZ) Project
- Remove duplicates and handle missing values from inputs provided by various stakeholders such as Tax, Accounting, GST, PXT, and IT.
- Standardize data formats to enable automation of reports and returns (~110 monthly reports).

Exploratory Data Analysis (EDA) for FTZ Project
- Create visualizations (histograms, scatter plots, box plots) for leadership and design interactive dashboards using tools like Tableau/Power BI.
- Identify patterns and trends in imports, exports, GST benefits, etc.

Compliance Dashboard Creation and Maintenance for FTZ and GREF Building Compliance
- Create regular performance reports and dashboards for compliance metrics.
- Set up automated reporting systems and monitor KPIs.
- Track compliance rates with customs regulations for FTZ.
- Analyze documentation accuracy and monitor restricted goods movement.
- Create compliance reporting dashboards.

Ad-hoc Analysis and Reporting for FTZ Project
- Respond to specific data-related business questions.
- Create custom reports based on deep-dive analyses.

Document Management System (DMS) Administration for FTZ Project
- Monitor document retention compliance.
- Set up automated filing systems and manage asset documentation throughout its lifecycle.

GREF Building Compliance Assurance Program (B-CAP)
- Build the B-CAP repository and review existing documents across all sites in India.
- Develop, monitor, and report key performance indicators for GREF B-CAP.
- Support GREF B-CAP pre-launch testing.
- Contingent Worker Cost Monitoring and Reporting.

Required Skills and Experience:
- Bachelor's degree in Computer Science, Information Management, or a related field.
- Data Management: Proficiency in database systems, data reporting, and data presentation. Must ensure data integrity and accessibility.
- Analytics: Strong analytical skills with the ability to identify trends and derive actionable insights from complex datasets.
- Visualization: Expertise in data visualization tools to create clear, impactful reports and dashboards.
- Business Acumen: Ability to understand business objectives and translate data findings into valuable recommendations.
- Experience with data source integrations (via API, ETL, virtualization, streaming) is desirable but not required.
- Communication: Excellent verbal and written skills to present complex information clearly to diverse stakeholders.
- Technical Adaptability: Proficiency in relevant data analysis tools and programming languages, with a commitment to continuous learning in this rapidly evolving field.

Desired or preferred experience and technical skills: Proficient in Microsoft Office tools; experience working with cloud-based tools such as Smartsheet, Salesforce, Excel, Tableau, QuickSight, Visio, Power BI, and AppSheet.

Posted 5 days ago

Apply

2.0 - 4.0 years

5 - 7 Lacs

Bengaluru

Work from Office

JLL supports the Whole You, personally and professionally.

The Compliance Analyst Specialist maintains compliance capabilities while adding dedicated compliance analyst specialist support for: (1) Orbit Data Governance and (2) Asset Management Inventory Governance and File/Edit Document Management.

Key Responsibilities:

GRC Control Controllership Program (GREF GRC Controls)
- Use data analytics to create a trend analysis on GRC Controls linked to GREF systems and processes.
- Create a reporting mechanism (dashboard) for compliance effectiveness and monitoring of repeated issues.
- Identify opportunity areas and create an action plan to minimize the risk of repeated issues and/or identify unmitigated risks (i.e., residual risks).

Data Integrity Program (GREF tools)
- Use data analytics to create trend analysis on ORBIT issues.
- Identify opportunity areas and create action plans to improve productivity and minimize the risk of repeated issues.

Compliance Monitoring and Reporting Dashboard for Regulatory Requirements for Data Governance
- Use data analytics to assess adherence to regulatory requirements (e.g., GDPR, CCPA, internal policies).
- Identify opportunity areas and create action plans to mitigate risks and avoid repeated issues.

Data Cleaning and Preprocessing for Free Trade Zones (FTZ) Project
- Remove duplicates and handle missing values from inputs provided by various stakeholders such as Tax, Accounting, GST, PXT, and IT.
- Standardize data formats to enable automation of reports and returns (~110 monthly reports).

Exploratory Data Analysis (EDA) for FTZ Project
- Create visualizations (histograms, scatter plots, box plots) for leadership and design interactive dashboards using tools like Tableau/Power BI.
- Identify patterns and trends in imports, exports, GST benefits, etc.

Compliance Dashboard Creation and Maintenance for FTZ and GREF Building Compliance
- Create regular performance reports and dashboards for compliance metrics.
- Set up automated reporting systems and monitor KPIs.
- Track compliance rates with customs regulations for FTZ.
- Analyze documentation accuracy and monitor restricted goods movement.
- Create compliance reporting dashboards.

Ad-hoc Analysis and Reporting for FTZ Project
- Respond to specific data-related business questions.
- Create custom reports based on deep-dive analyses.

Document Management System (DMS) Administration for FTZ Project
- Monitor document retention compliance.
- Set up automated filing systems and manage asset documentation throughout its lifecycle.

GREF Building Compliance Assurance Program (B-CAP)
- Build the B-CAP repository and review existing documents across all sites in India.
- Develop, monitor, and report key performance indicators for GREF B-CAP.
- Support GREF B-CAP pre-launch testing.
- Contingent Worker Cost Monitoring and Reporting.

Required Skills and Experience:
- Bachelor's degree in Computer Science, Information Management, or a related field.
- Data Management: Proficiency in database systems, data reporting, and data presentation. Must ensure data integrity and accessibility.
- Analytics: Strong analytical skills with the ability to identify trends and derive actionable insights from complex datasets.
- Visualization: Expertise in data visualization tools to create clear, impactful reports and dashboards.
- Business Acumen: Ability to understand business objectives and translate data findings into valuable recommendations.
- Experience with data source integrations (via API, ETL, virtualization, streaming) is desirable but not required.
- Communication: Excellent verbal and written skills to present complex information clearly to diverse stakeholders.
- Technical Adaptability: Proficiency in relevant data analysis tools and programming languages, with a commitment to continuous learning in this rapidly evolving field.

Desired or preferred experience and technical skills: Proficient in Microsoft Office tools; experience working with cloud-based tools such as Smartsheet, Salesforce, Excel, Tableau, QuickSight, Visio, Power BI, and AppSheet.

Posted 5 days ago

Apply

3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

You are invited to join Chryselys, a Pharma Analytics & Business consulting company based in Hyderabad with a hybrid work setup. Our focus is on delivering data-driven insights through AI-powered, cloud-native platforms to drive high-impact transformations. At Chryselys, we excel in digital technologies and advanced data science techniques that offer strategic and operational insights. Our team comprises industry veterans, advisors, and senior strategists with diverse backgrounds from top-tier companies. We prioritize delivering the value of a big five consulting firm without the associated costs. Our solutions are business-centric and built on cloud-native technologies. As a Data Scientist at Chryselys, your responsibilities will include designing, developing, and implementing machine learning models and algorithms to analyze pharmaceutical datasets, leveraging platforms such as IQVIA, Symphony, Veeva, and Open data. You will use advanced statistical techniques and data science methodologies to extract insights from complex data sources. Building and maintaining data pipelines and scalable machine learning models on Data Lake architecture using cloud-based platforms like AWS (Redshift, S3, SageMaker) is also part of your role. Collaboration with cross-functional teams to translate business problems into data science solutions is essential. You will be expected to create and present data visualizations, dashboards, and reports using tools like PowerBI, Tableau, Qlik, QuickSight, and ThoughtSpot to effectively communicate findings and recommendations to clients. Conducting exploratory data analysis, data profiling, and feature engineering to prepare datasets for predictive modeling is another key aspect of your role. Evaluating model performance, optimizing algorithms, and ensuring robustness and accuracy in predictions are critical tasks. 
To excel in this role, you are required to have a Bachelor's or Master's degree in data science, statistics, computer science, engineering, or a related quantitative field with a strong academic record. Proficiency in programming languages such as Python and R, along with a deep understanding of libraries like TensorFlow, Scikit-learn, and Pandas, is essential. Strong experience with SQL and cloud-based data processing environments like AWS (Redshift, Athena, S3) is preferred, as is familiarity with Jupyter Notebooks/SageMaker. A demonstrated ability to build data visualizations and communicate insights through various tools, strong analytical skills, and experience in hypothesis testing, A/B testing, and statistical analysis are required. Additionally, you should possess exceptional communication and presentation skills to explain complex data science concepts to non-technical stakeholders. A strong problem-solving mindset and the ability to adapt and innovate in a dynamic consulting environment are also key attributes for this role.
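For candidates brushing up, the A/B-testing skill named above can be illustrated with a minimal two-proportion z-test, sketched in plain Python with the standard library; the function name and sample figures are illustrative, not taken from the posting.

```python
import math

def two_proportion_ztest(successes_a, n_a, successes_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    p_pool = (successes_a + successes_b) / (n_a + n_b)           # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # standard error
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF (via erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Variant B converts at 6.25% vs A's 5.0% over 2,400 users each
z, p = two_proportion_ztest(120, 2400, 150, 2400)
```

Here z comes out around -1.88 with p around 0.06, so at a 5% significance level this difference would not yet be declared significant.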

Posted 6 days ago

Apply

3.0 - 5.0 years

10 - 14 Lacs

pune

Work from Office

Core Responsibilities: Data Analysis & KPI Development: Perform comprehensive analysis of product data to define, track, and maintain key performance indicators (KPIs), evaluating the effectiveness of initiatives and identifying areas for improvement. Reporting & Visualization: Design and maintain dashboards and Excel-based reports to effectively communicate product performance and insights to stakeholders. Cross-functional Collaboration: Partner with product managers, architects, designers, and engineers to align business goals and ensure accurate, efficient, and error-free delivery. Innovation in Data Practices: Continuously seek out new data sources, tools, and analytical techniques to enhance the depth and quality of insights. Insight Generation: Interpret analytical results to uncover trends, patterns, and anomalies, and provide actionable recommendations to support strategic decisions. Regulatory Compliance: Prepare and submit reports to regulatory bodies within established timelines, ensuring full compliance. Data Governance: Collaborate with regulatory and data teams to ensure data handling aligns with relevant compliance standards. Preferred Skills and Experience: Education: Bachelor's or Master's degree in engineering (B.E., B.Tech, M.Tech). Experience: 3+ years of experience in data analysis; familiarity with the finance domain is a plus. Technical Proficiency: Strong skills in data querying and handling large, complex datasets. Proficient in SQL, Python or R, Excel, KNIME, Alteryx, and data visualization tools (e.g., Tableau, QuickSight, or open-source alternatives). Quality Assurance: Experience in implementing QA processes to ensure data accuracy, consistency, and reliability, along with troubleshooting capabilities. Communication: Excellent verbal and written communication skills, with the ability to present complex data clearly and concisely.
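The KPI tracking described above often reduces to simple computations such as week-over-week change; a minimal sketch in plain Python, with made-up numbers for illustration:

```python
def week_over_week_change(metric_by_week):
    """Percentage change of a KPI between consecutive weeks.

    `metric_by_week` is an ordered list of weekly values; the example
    data below is invented for illustration.
    """
    return [
        round(100 * (curr - prev) / prev, 1)
        for prev, curr in zip(metric_by_week, metric_by_week[1:])
    ]

weekly_signups = [200, 220, 209]
deltas = week_over_week_change(weekly_signups)  # +10.0% then -5.0%
```

A dashboard would then surface these deltas next to the raw weekly values so stakeholders can see both level and trend.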

Posted 1 week ago

Apply

7.0 - 12.0 years

15 - 15 Lacs

chennai

Work from Office

KONE Technology and Innovation Unit (KTI) is where the magic happens at KONE. It's where we combine the physical world – escalators and elevators – with smart and connected digital systems. We are changing and improving the way billions of people move within buildings every day. We are on a mission to expand and develop new digital solutions that are based on emerging technologies. KONE's vision is to create the Best People Flow® experience by providing ease, effectiveness and experiences to our customers and users. In line with our strategy, Sustainable Success with Customers, we will focus on increasing the value we create for customers with new intelligent solutions and embed sustainability even deeper across all of our operations. Through closer collaboration with customers and partners, KONE will increase the speed of bringing new services and solutions to the market. The R&D unit in KTI is responsible for developing digital services at KONE. It is the development engine for our Digital Services such as KONE 24/7 Connected Services, Office Flow, and the Partnership Ecosystem. We are looking for a Cloud Automation Architect with strong expertise in automation on the AWS cloud, UI, API, Data, and ML Ops. The ideal candidate will bring hands-on technical leadership, architect scalable automation solutions, and drive end-to-end solution design for enterprise-grade use cases. You will collaborate with cross-functional teams including developers, DevOps engineers, product owners, and business stakeholders to deliver automation-first solutions. Role description: Solution Architecture & Design: Architect and design automation solutions leveraging cloud services and data management. Define end-to-end architecture spanning cloud infrastructure, APIs, UI, and visualization layers. Translate business needs into scalable, secure, and cost-effective technical solutions.
Automation on Cloud: Lead automation initiatives across infrastructure, application workflows, and data pipelines. Implement operations use cases using data visualization and cloud automation. Optimize automation for cost, performance, and security. UI & API Integration: Design and oversee development of APIs and microservices to support automation. Guide teams on UI frameworks (React/Angular) for building dashboards and portals. Ensure seamless integration between APIs, front-end applications, OCR, and cloud services. Data & ML Ops: Define architecture for data ingestion, transformation, and visualization on AWS. Work with tools like Amazon QuickSight and Power BI to enable business insights. Establish ML Ops best practices for data-driven decision-making. Architect and implement end-to-end MLOps pipelines for training, deploying, and monitoring ML models. Use AWS services like SageMaker, Step Functions, Lambda, Kinesis, Glue, S3, and Redshift for ML workflows. Establish best practices for model versioning, reproducibility, CI/CD for ML, and monitoring model drift. Team Leading and Collaboration: Mentor engineering teams on cloud-native automation practices. Collaborate with product owners to prioritize and align technical solutions with business outcomes. Drive POCs and innovation initiatives for automation at scale. Requirements: 8–10 years of experience in cloud architecture, automation, and solution design. Deep expertise in Python for automation use cases and an understanding of ML Ops. Experience with data engineering and visualization tools. Knowledge of UI frameworks (React, Angular, Vue) for portals and dashboards. Expertise in AWS cloud services for compute, data, and ML workloads. Strong understanding of security, IAM, compliance, and networking in AWS. Hands-on experience with MLOps pipelines (model training, deployment, monitoring). At KONE, we are focused on creating an innovative and collaborative working culture where we value the contribution of each individual.
Employee engagement is a key focus area for us and we encourage participation and the sharing of information and ideas. Sustainability is an integral part of our culture and the daily practice. We follow ethical business practices and we seek to develop a culture of working together where co-workers trust and respect each other and good performance is recognized. In being a great place to work, we are proud to offer a range of experiences and opportunities that will help you to achieve your career and personal goals and enable you to live a healthy and balanced life. Read more on www.kone.com/careers

Posted 1 week ago

Apply

7.0 - 12.0 years

15 - 15 Lacs

pune

Work from Office

KONE Technology and Innovation Unit (KTI) is where the magic happens at KONE. It's where we combine the physical world – escalators and elevators – with smart and connected digital systems. We are changing and improving the way billions of people move within buildings every day. We are on a mission to expand and develop new digital solutions that are based on emerging technologies. KONE's vision is to create the Best People Flow® experience by providing ease, effectiveness and experiences to our customers and users. In line with our strategy, Sustainable Success with Customers, we will focus on increasing the value we create for customers with new intelligent solutions and embed sustainability even deeper across all of our operations. Through closer collaboration with customers and partners, KONE will increase the speed of bringing new services and solutions to the market. The R&D unit in KTI is responsible for developing digital services at KONE. It is the development engine for our Digital Services such as KONE 24/7 Connected Services, Office Flow, and the Partnership Ecosystem. We are looking for a Cloud Automation Architect with strong expertise in automation on the AWS cloud, UI, API, Data, and ML Ops. The ideal candidate will bring hands-on technical leadership, architect scalable automation solutions, and drive end-to-end solution design for enterprise-grade use cases. You will collaborate with cross-functional teams including developers, DevOps engineers, product owners, and business stakeholders to deliver automation-first solutions. Role description: Solution Architecture & Design: Architect and design automation solutions leveraging cloud services and data management. Define end-to-end architecture spanning cloud infrastructure, APIs, UI, and visualization layers. Translate business needs into scalable, secure, and cost-effective technical solutions.
Automation on Cloud: Lead automation initiatives across infrastructure, application workflows, and data pipelines. Implement operations use cases using data visualization and cloud automation. Optimize automation for cost, performance, and security. UI & API Integration: Design and oversee development of APIs and microservices to support automation. Guide teams on UI frameworks (React/Angular) for building dashboards and portals. Ensure seamless integration between APIs, front-end applications, OCR, and cloud services. Data & ML Ops: Define architecture for data ingestion, transformation, and visualization on AWS. Work with tools like Amazon QuickSight and Power BI to enable business insights. Establish ML Ops best practices for data-driven decision-making. Architect and implement end-to-end MLOps pipelines for training, deploying, and monitoring ML models. Use AWS services like SageMaker, Step Functions, Lambda, Kinesis, Glue, S3, and Redshift for ML workflows. Establish best practices for model versioning, reproducibility, CI/CD for ML, and monitoring model drift. Team Leading and Collaboration: Mentor engineering teams on cloud-native automation practices. Collaborate with product owners to prioritize and align technical solutions with business outcomes. Drive POCs and innovation initiatives for automation at scale. Requirements: 8–10 years of experience in cloud architecture, automation, and solution design. Deep expertise in Python for automation use cases and an understanding of ML Ops. Experience with data engineering and visualization tools. Knowledge of UI frameworks (React, Angular, Vue) for portals and dashboards. Expertise in AWS cloud services for compute, data, and ML workloads. Strong understanding of security, IAM, compliance, and networking in AWS. Hands-on experience with MLOps pipelines (model training, deployment, monitoring). At KONE, we are focused on creating an innovative and collaborative working culture where we value the contribution of each individual.
Employee engagement is a key focus area for us and we encourage participation and the sharing of information and ideas. Sustainability is an integral part of our culture and the daily practice. We follow ethical business practices and we seek to develop a culture of working together where co-workers trust and respect each other and good performance is recognized. In being a great place to work, we are proud to offer a range of experiences and opportunities that will help you to achieve your career and personal goals and enable you to live a healthy and balanced life. Read more on www.kone.com/careers

Posted 1 week ago

Apply

3.0 - 5.0 years

3 - 7 Lacs

bengaluru

Work from Office

About The Role Skill required: Procure to Pay - Master Data Management (MDM). Designation: Procure to Pay Operations Analyst. Qualifications: Any Graduation. Years of Experience: 3 to 5 years. About Accenture: Accenture is a global professional services company with leading capabilities in digital, cloud and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners and communities. Visit us at www.accenture.com. What would you do? You will be aligned with our Finance Operations vertical and will be helping us in determining financial outcomes by collecting operational data/reports, whilst conducting analysis and reconciling transactions. Reporting Background (Domain Experience): Able to understand and produce SLAs, KPIs & KRIs, and Daily, Weekly and Monthly Reports. Expert in MS-Advanced Excel (including advanced formulas, pivot tables, etc.), MS-PowerPoint and MS-Word. The role supports boosting vendor compliance, cutting savings erosion, improving discount capture using preferred suppliers, and confirming pricing and terms prior to payment. Responsible for accounting of goods and services through requisitioning, purchasing and receiving. Also looks after the order sequence of procurement and the financial process end to end. The Accounts Payable Processing team focuses on designing, implementing, managing and supporting accounts payable activities by applying the relevant processes, policies and applications.
The team is responsible for timely and accurate billing and processing of invoices, managing purchase and non-purchase orders and two-way and three-way matching of invoices. Architect master data solutions across platforms to consolidate content, synchronize data, provide centralized maintenance of unified data, enable rich product content management and print publishing, as well as to synchronize global data ensuring consistency and control of master data elements. What are we looking for? Risk management skills. Roles and Responsibilities: In this role you are required to analyze and solve lower-complexity problems. Your day-to-day interaction is with peers within Accenture before updating supervisors. In this role you may have limited exposure to clients and/or Accenture management. You will be given moderate-level instruction on daily work tasks and detailed instructions on new assignments. The decisions you make impact your own work and may impact the work of others. You will be an individual contributor as a part of a team, with a focused scope of work. Please note that this role may require you to work in rotational shifts. Reporting Background (Domain Experience): Able to understand and produce SLAs, KPIs & KRIs, and Daily, Weekly and Monthly Reports. Expert in MS-Advanced Excel (including advanced formulas, pivot tables, etc.), MS-PowerPoint and MS-Word. BI Tool Exposure and Dashboard Creation: Experience with at least one Business Intelligence (BI) tool and dashboard creation (e.g., Power BI, Tableau, or AWS QuickSight). Database Knowledge: Good understanding of SQL queries and MS-Access databases for data extraction, manipulation, and reporting. G-Suite exposure, Advanced Excel, and Tableau. Qualification: Any Graduation.
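The two-way/three-way invoice matching mentioned above can be sketched as a simple check that the invoice agrees with both the purchase order and the goods receipt; the field names and the 1% price tolerance below are illustrative assumptions, not any specific ERP's actual rules.

```python
def three_way_match(po, receipt, invoice, price_tolerance=0.01):
    """Approve an invoice only if its quantity matches both the goods
    receipt and the purchase order, and its unit price is within
    tolerance of the PO price. All field names are hypothetical."""
    qty_ok = invoice["qty"] == receipt["qty"] == po["qty"]
    price_ok = abs(invoice["unit_price"] - po["unit_price"]) <= price_tolerance * po["unit_price"]
    return qty_ok and price_ok

po = {"qty": 10, "unit_price": 50.00}
receipt = {"qty": 10}
good_invoice = {"qty": 10, "unit_price": 50.25}   # within 1% of the PO price
short_invoice = {"qty": 8, "unit_price": 50.00}   # quantity mismatch -> hold
```

A two-way match is the same idea with the goods-receipt check dropped, comparing the invoice against the purchase order alone.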

Posted 1 week ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

pune

Work from Office

Qualification: Degree in Computer Science (or similar); alternatively, well-founded professional experience in the desired field. Experience Range: 3 to 5 Years. Roles & Responsibilities: As a Senior Data Engineer, you manage and develop the solutions in close alignment with various business and Spoke stakeholders. You are responsible for the implementation of the IT governance guidelines. Collaborate with the Spoke's Data Scientists, Data Analysts, and Business Analysts, when relevant. Tasks: Create and manage data pipeline architecture for data ingestion, pipeline setup and data curation. Experience working with and creating cloud data solutions. Assemble large, complex data sets that meet functional/non-functional business requirements. Implement the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using PySpark, SQL and AWS big data technologies. Build analytics tools that use the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics. Manipulate data at scale: getting data in a ready-to-use state in close alignment with various business and Spoke stakeholders. Must Have: Advanced knowledge of ETL, Data Lake, Data Warehouse, and RDS architectures. Python, SQL (any other OOP language is also valuable). PySpark (preferably) or Spark knowledge. Object-oriented programming, Clean Code and good documentation skills. AWS: S3, Athena, Lambda, Glue, IAM, SQS, EC2, QuickSight, etc. Git. Data Analysis & Visualization. Optional: AWS CDK (Cloud Development Kit), CI/CD knowledge.
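As a flavour of the transformation, cleansing and validation work such a role involves, here is a minimal sketch in plain Python; on the job this logic would typically live inside a PySpark transformation, and the record schema (id, amount, country) is invented for illustration.

```python
def cleanse(records):
    """Split raw records into a curated set and a reject/quarantine set.

    Rejects anything missing an id or amount; normalizes types and
    fills a default country. The schema is illustrative only.
    """
    curated, rejects = [], []
    for r in records:
        if r.get("id") is None or r.get("amount") is None:
            rejects.append(r)  # would be routed to a quarantine path
            continue
        curated.append({
            "id": int(r["id"]),
            "amount": round(float(r["amount"]), 2),
            "country": (r.get("country") or "UNKNOWN").upper(),
        })
    return curated, rejects

raw = [
    {"id": "1", "amount": "19.999", "country": "in"},
    {"id": None, "amount": "5.00"},
]
curated, rejects = cleanse(raw)
```

In PySpark the same split is usually expressed as two filtered DataFrames, with the reject branch written to a quarantine location for later inspection.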

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

maharashtra

On-site

You will be working as a Senior Business Intelligence Engineer at COVU, a venture-backed technology startup focused on modernizing independent insurance agencies using AI and data analytics. Your role will involve designing and building real-time dashboards, optimizing data pipelines, and establishing best practices for BI platforms. As a Senior Business Intelligence Engineer, you will be responsible for owning and operating the BI stack, including architecture, security, and performance. You will design and develop real-time dashboards for various teams within the company and optimize data pipelines using AWS and SQL/NoSQL technologies. Additionally, you will document and promote BI best practices and present insights to both technical and non-technical stakeholders. To qualify for this position, you should have at least 5 years of experience in building BI/analytics solutions on cloud data warehouses. Strong SQL skills and experience with AWS services like QuickSight, Redshift, Glue, and Lambda are required. You should also have expertise in modeling large relational and NoSQL datasets for efficient dashboards and implementing data governance practices with a focus on PII protection. Bonus qualifications include familiarity with P&C insurance data, Applied Epic, and Salesforce. The company offers health and dental insurance benefits, and the position is fully remote, allowing candidates from anywhere in the world to apply.

Posted 1 week ago

Apply

6.0 - 11.0 years

9 - 14 Lacs

gurugram, delhi / ncr

Work from Office

Role & responsibilities: Leverage analytical skills and independent judgement to interpret moderately complex goals, trends, risks, and areas for improvement by collecting, analyzing, and reporting on key metrics and conclusions. May maintain the integrity of reports and/or dashboards to allow the business to operate accurately and efficiently. • Use expanded analytic solutions and knowledge to support customer teams and improve efficiencies. Work on projects/matters of moderate complexity in an independent contributor role. Complexity can vary based on several factors such as client size, number of systems, varying levels of established structures, or dynamics of the customer and/or data. • Work cross-functionally and build internal and/or external relationships to ensure high-quality data is available for analysis and better business understanding. • Develop and deliver data-driven insights and recommendations to internal and/or external stakeholders. • Engage day-to-day with stakeholders for planning, forecasting, and gaining a solid understanding of business questions for appropriate documentation and analysis. • Work well independently and seek counsel and guidance on more complex projects/matters, as needed. Work is generally reliable on routine tasks and assignments. Preferred candidate profile. Required Qualifications: • Proficient knowledge of expanded analysis solutions/tools (such as OLTP/OLAP data structures, advanced Excel, Tableau, Salesforce, Power BI, Business Objects) • Proficient knowledge of domain languages (such as SQL, HiveQL, etc.) • Application of moderately complex statistical methods (such as deviations, quartiles, etc.)
• 2-5 years of related experience to reflect the skills and talent necessary for this role preferred • May require practical sales motion knowledge • May require practical industry and demographic understanding in one of the following: hardware, software, SaaS, healthcare, or industrial • May require strong proficiency in all Microsoft Office applications (especially Word, Excel, and PowerPoint). Preferred Qualifications: • Bachelor's degree/diploma, or the equivalent, preferred; degree/diploma in computer science, finance, or statistics/mathematics a plus • 4+ years of experience in data-driven business insight recommendations, business analytics, and dashboard design.
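The "deviations, quartiles" statistical methods named above are available directly in Python's standard library; the metric values below are invented for illustration.

```python
import statistics

metric_values = [3, 7, 9, 12, 18, 25, 30, 41]  # illustrative data

# Quartiles with the "inclusive" method, which interpolates
# at positions (n - 1) * p over the sorted data
q1, q2, q3 = statistics.quantiles(metric_values, n=4, method="inclusive")

iqr = q3 - q1                             # interquartile range
spread = statistics.stdev(metric_values)  # sample standard deviation
```

The median (q2) always agrees with `statistics.median`, which makes a handy sanity check when validating a dashboard's summary tiles.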

Posted 1 week ago

Apply

6.0 - 10.0 years

27 - 32 Lacs

chennai, gurugram, bengaluru

Work from Office

Join us as a Data & Analytics Analyst. Take on a new challenge in Data & Analytics and help us shape the future of our business. You'll be helping to manage the analysis of complex data to identify business issues and opportunities, and supporting the delivery of high-quality business solutions. We're committed to mapping a career path that works for you, with a focus on helping you build new skills and engage with the latest ideas and technologies in data analytics. We're offering this role at associate vice president level. What you'll do: As a Data & Analytics Analyst, you'll be providing high-quality analytical input to support the development and implementation of innovative processes and problem resolution. You'll be capturing, validating and documenting business and data requirements, making sure they are in line with key strategic principles. We'll look to you to interrogate, interpret and visualise large volumes of data to identify, support and challenge business opportunities and identify solutions. You'll also be: Performing data extraction, storage, manipulation, processing and analysis. Conducting and supporting options analysis, identifying the most appropriate solution. Accountable for the full traceability and linkage of business requirements to analytics outputs. Seeking opportunities to challenge and improve current business processes, ensuring the best result for the customer. Creating and executing quality assurance at various stages of the project in order to validate the analysis, ensure data quality, identify data inconsistencies, and resolve them as needed. The skills you'll need: You'll need a background in business analysis tools and techniques, along with the ability to influence through communications tailored to a specific audience. Additionally, you'll need the ability to use core technical skills.
You'll also demonstrate: Strong analytic and problem-solving abilities. A keen eye for detail in your work. Experience in AWS QuickSight (Advanced), Tableau (Advanced), and Snowflake. Demonstrable experience with stakeholder engagement and the ability to coach and mentor others in the team.

Posted 1 week ago

Apply

3.0 - 7.0 years

7 - 11 Lacs

bengaluru

Work from Office

Job Requirements: Data layer architecting in the Data Architecture domain, responsible for the design and creation of standard and benchmark data models using Layered Scalable Architecture principles / Medallion Architecture to deliver the Data Marts across all businesses of Titan with the highest data quality, performance and cost efficiency. Design optimization to handle high volumes spanning several decades of business data, external data and complex structures. Managing and maintaining the data dictionary/glossary/ERD. Creation of Conceptual, Logical and Physical Data Models. Project management. Work Experience: Strong technical knowledge of SQL (advanced SQL stored procedures, performance tuning), RDBMS, Data Warehousing and Modelling, preferably in the AWS stack. ER Modelling – Forward & Reverse Engineering: should have worked on modelling tools and have extensive knowledge of Conceptual, Logical and Physical Models. Should have independently managed customers and projects of 4 members or more. Minimum 3 projects of implementation experience in Retail/Manufacturing/HR/Finance domains. Good knowledge of ETL tools (min 1 tool), preferably SAP Data Services/IBM DataStage/AWS Glue. Good process knowledge of a minimum of one domain. Experience in leading projects in an agile framework, at least 1 medium-to-complex project. Experience in implementation of SF Customer Data Cloud would be an added advantage. Good to have statistical data modelling / predictive modelling skills. Hands-on experience in Salesforce Customer Data Cloud (creation of customer segmentations for campaigns) would be an added advantage. Should have a good understanding of reporting tools such as Tableau / Power BI / AWS QuickSight. Should have exposure to Data Governance.

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

pune, maharashtra

On-site

The future of work at GoDaddy varies for each team, with some working in the office full-time, others in a hybrid arrangement, and some entirely remotely. As a Senior Financial Analyst, Corporate Finance and Financial Modeling, you will focus on modeling, tracking, and understanding the business drivers that propel GoDaddy forward. This role is crucial, involving the development of long-term strategic plans and contributing to the execution of strategies and initiatives across the organization. You will have the opportunity to collaborate with leadership on key opportunities, build financial models for initiatives, and support strategic decision-making processes. Working closely with Business Unit Finance Teams, your responsibilities will include ensuring accuracy and efficiency in all financial processes, supporting annual and quarterly budgets, forecasts, and variance analyses, identifying cost-saving opportunities to enhance profitability, and implementing financial initiatives to drive business growth. To excel in this role, you should possess at least 4 years of proven experience in finance, including financial modeling, forecasting, and analysis in a global organization. Strong technical skills in Excel, along with knowledge of Tableau, QuickSight, Workday, and Adaptive, are essential. Effective communication of financial information at all levels is crucial, as is past experience collaborating across regions, teams, customers, and external partners to optimize performance and achieve results. A Bachelor's degree in Finance or Accounting or equivalent experience is preferred. Professional certifications such as CFA or CA are advantageous. GoDaddy offers a range of benefits including paid time off, retirement savings, incentive eligibility, equity grants, and family-friendly benefits. We value diversity and offer Employee Resource Groups to foster an inclusive culture.
GoDaddy is committed to providing tools and support to empower entrepreneurs globally and make opportunities more inclusive. As an equal opportunity employer, GoDaddy welcomes applicants with various backgrounds and experiences. Candidates from Colorado can redact age-identifying information from their application materials without penalty. GoDaddy does not accept unsolicited resumes from recruiters or employment agencies.

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

hyderabad, telangana

On-site

As a Sales Operations Analyst at Zenoti, you will have a pivotal role in enhancing sales operations efficiency, supporting the global AE and SDR teams, and driving process optimization for rapid company growth. Your main responsibilities will include managing operational requests, overseeing various sales technology tools like Zuora, DocuSign, and Outreach, and ensuring the smooth execution of workflows to meet company objectives. Your valuable contributions will aid in enhancing data governance, automating processes, and improving overall sales effectiveness. Your tasks will involve fulfilling ad-hoc reporting and dashboarding requests to analyze sales performance, maintaining and optimizing the sales tech stack, evaluating and proposing new technology solutions to enhance sales operations, ensuring proper governance of Salesforce data, acting as a functional expert in sales operations by streamlining workflows and adhering to SOPs, developing and updating documentation such as FAQs, process guides, and training materials, supporting the sales team with operational requests and troubleshooting, and providing guidance on CPQ/Zuora processes to assist the sales team. To excel in this role, you should hold a Bachelor's degree along with an MBA, possess at least 2 years of experience in sales operations or a related function, have a strong understanding of how SaaS organizations operate and scale, be proficient in CRM systems, particularly Salesforce, with experience in automation, demonstrate expertise in building reports and dashboards in Salesforce, and showcase familiarity with owning tech stacks and automation tools.
Additionally, you should have experience working with tech stacks (CPQ/Zuora experience is beneficial), excellent communication and problem-solving skills, an agile mindset with attention to detail, the ability to work independently and proactively identify process improvements, strong time management skills, and readiness to work extended hours during peak periods. Join Zenoti to be part of an innovative company revolutionizing the wellness and beauty industry, work alongside a diverse team that values collaboration, creativity, and growth, lead impactful projects to shape the global success of Zenoti's platform, receive attractive compensation, medical coverage for yourself and your immediate family, access to yoga, meditation, and stress management sessions, engage in regular social activities, and contribute to social work and community initiatives. Make a difference by joining Zenoti on its mission to empower wellness businesses worldwide.

Posted 1 week ago

Apply

2.0 - 4.0 years

6 - 10 Lacs

bengaluru

Work from Office

Create, iterate on, and maintain dashboards that track key metrics and drive business or product recommendations. Develop a deep understanding of the business/product and partner with business stakeholders/Product Managers to identify the key metrics to analyze and track for each problem. Identify and present actionable insights and recommendations to leadership, using high-quality visualizations and concise messaging to influence the direction of the business or product. Constantly monitor and analyze the identified metrics, publishing insights and anomalies along with a hypothesis/RCA. Understand the business implications of the area at hand and structure analyses accordingly. Build customer-facing dashboards within the product: identify product metrics that provide insight to the customer, write the logic, and create dashboards to represent those metrics. Work with engineering/data platforms on designing, building, and deploying data analysis systems for large datasets and fulfilling requirements for the identified metrics.

Requirements: 2-4 years of relevant experience in Analytics. Proven technical, analytical, and quantitative skills. Proficiency in SQL and Excel, and in visualization tools such as AWS QuickSight, Tableau, Superset, and Looker Studio. Strong communication skills and comfort working with various functional teams. Strong problem-solving skills and the flexibility to adapt to diverse problem areas. Ability to work with ambiguous problem statements. Data-oriented, with a strong grasp of Excel/spreadsheets.
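The metric-monitoring and anomaly-publishing duty described above can be sketched as a simple z-score check. This is a hypothetical illustration only; the function name, sample values, and the 2.0-sigma threshold are assumptions, not part of the listing:

```python
from statistics import mean, stdev

def flag_anomalies(daily_values, threshold=2.0):
    """Return indices of days whose metric deviates more than
    `threshold` standard deviations from the series mean."""
    mu = mean(daily_values)
    sigma = stdev(daily_values)
    if sigma == 0:
        return []  # flat series: nothing to flag
    return [i for i, v in enumerate(daily_values)
            if abs(v - mu) / sigma > threshold]

# A spike on day 5 in an otherwise flat series is flagged.
values = [100, 102, 99, 101, 100, 180, 98]
print(flag_anomalies(values))  # → [5]
```

In practice, a flagged index would feed into the RCA write-up rather than being reported raw; the threshold would be tuned per metric.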

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

chennai, tamil nadu

On-site

As an Associate Director MIS and Data Analytics at our organization, you will be at the forefront of our data-driven approach to empowering businesses to navigate the complexities of chargebacks. You will lead our MIS and data strategy efforts, including structuring data, developing advanced reporting frameworks, and implementing business intelligence solutions that enable data-driven decision-making and contribute to our mission of transforming the chargeback landscape. The responsibilities of this role include designing and managing scalable MIS and Data Analytics processes, overseeing reporting automation, and translating raw data into actionable insights. You will play a key role in ensuring data accuracy, optimizing performance metrics, and collaborating across departments to drive business growth through intelligent, data-driven strategies. Your expertise in analytics, dashboarding, and data storytelling will enable key stakeholders to make informed decisions and enhance operational efficiency. To excel in this role, you will need a blend of strategic thinking, technical expertise, and leadership skills, along with a strong understanding of MIS architecture, data visualization, and automation frameworks. Your ability to work collaboratively in a fast-paced environment with creative, smart, and flexible individuals will be crucial to the success of our team. Key responsibilities include developing and implementing a robust MIS strategy to support business intelligence, overseeing end-to-end MIS processes, and driving the automation of real-time dashboards and reports using tools such as Power BI, Tableau, SQL, and QuickSight. You will also analyze large, complex datasets covering financial transactions, customer behavior, chargeback trends, and risk assessment to identify patterns that enhance fraud prevention and operational efficiency.
Moreover, you will be responsible for designing and maintaining advanced MIS reports, implementing reporting automation solutions, and creating customized ad-hoc reports for leadership teams. Collaborating with cross-functional teams such as Sales, Operations, Finance, and Product, you will translate data into actionable business strategies and insights for decision-making. Qualifications for this role include a Bachelor's/Master's degree in Data Science, Computer Science, Business Analytics, Mathematics, Statistics, or a related field. You should have strong expertise in SQL, Python, R, VBA, and data visualization tools, as well as experience in automating reporting and dashboarding. Proficiency in MS Excel, PowerPoint, and Word, experience with cloud-based data platforms, and certifications in Six Sigma, PMP, or Lean would be advantageous. Strong problem-solving skills, leadership abilities, and effective communication skills are essential, along with the flexibility to work in UK/US time zones.

Posted 1 week ago

Apply

5.0 - 8.0 years

12 - 22 Lacs

pune

Hybrid

Key Responsibilities:
Own the design, development, and maintenance of ongoing metrics, reports, analyses, and dashboards on the key drivers of the business.
Partner with operations and business teams to consult on, develop, and implement KPIs, automated reporting/process solutions, and data infrastructure improvements.
Enable effective decision-making by retrieving and aggregating data from multiple sources and compiling it into actionable formats.
Manage timely delivery of regular client reports, including building reports from the data warehouse, reviewing completed reports for anomalies and discrepancies, troubleshooting data issues, and ensuring formatting and delivery standards are met.
Maintain and update Tableau and Excel dashboards for daily/weekly client reporting.
Explore and integrate new data sources into dashboards and reporting tools.
Support data cleansing and manipulation processes, including taxonomy classification, conversion renaming/grouping, and removal of test/ghost impressions.

Desired Skills & Experience:
Minimum 5+ years of experience in Analytics/Business Intelligence.
Strong verbal, written, and data presentation skills, with the ability to communicate effectively with both business and technical teams.
Hands-on experience creating complex Excel reports and SQL queries (with joins across multiple datasets).
Proficiency in data visualization tools such as Tableau, Power BI, QuickSight, or Datorama.
Strong analytical thinking and problem-solving skills with attention to detail.
Ability to work in a fast-paced, ambiguous, and rapidly changing environment.
Experience developing reporting requirements and defining business metrics.
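The "SQL queries with joins across multiple datasets" skill named above can be illustrated with a minimal sketch. The table and column names here are hypothetical and the data is invented; an in-memory SQLite database stands in for the data warehouse:

```python
import sqlite3

# In-memory database with two hypothetical reporting tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE clients (client_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE impressions (client_id INTEGER, day TEXT, count INTEGER);
INSERT INTO clients VALUES (1, 'Acme'), (2, 'Globex');
INSERT INTO impressions VALUES (1, '2024-01-01', 500),
                               (1, '2024-01-02', 650),
                               (2, '2024-01-01', 300);
""")

# Join the two datasets and aggregate per client for a report.
rows = conn.execute("""
    SELECT c.name, SUM(i.count) AS total_impressions
    FROM clients c
    JOIN impressions i ON i.client_id = c.client_id
    GROUP BY c.name
    ORDER BY c.name
""").fetchall()
print(rows)  # → [('Acme', 1150), ('Globex', 300)]
```

In a real reporting pipeline, the same join-and-aggregate shape would run against the warehouse and feed a Tableau or Excel dashboard.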

Posted 1 week ago

Apply

3.0 - 5.0 years

7 - 17 Lacs

pune

Hybrid

Key Responsibilities:
Own the design, development, and maintenance of ongoing metrics, reports, analyses, and dashboards on the key drivers of the business.
Partner with operations and business teams to consult on, develop, and implement KPIs, automated reporting/process solutions, and data infrastructure improvements.
Enable effective decision-making by retrieving and aggregating data from multiple sources and compiling it into actionable formats.
Manage timely delivery of regular client reports, including building reports from the data warehouse, reviewing completed reports for anomalies and discrepancies, troubleshooting data issues, and ensuring formatting and delivery standards are met.
Maintain and update Tableau and Excel dashboards for daily/weekly client reporting.
Explore and integrate new data sources into dashboards and reporting tools.
Support data cleansing and manipulation processes, including taxonomy classification, conversion renaming/grouping, and removal of test/ghost impressions.

Desired Skills & Experience:
Minimum 2.8+ years of experience in Analytics/Business Intelligence.
Strong verbal, written, and data presentation skills, with the ability to communicate effectively with both business and technical teams.
Hands-on experience creating complex Excel reports and SQL queries (with joins across multiple datasets).
Proficiency in data visualization tools such as Tableau, Power BI, QuickSight, or Datorama.
Strong analytical thinking and problem-solving skills with attention to detail.
Ability to work in a fast-paced, ambiguous, and rapidly changing environment.
Experience developing reporting requirements and defining business metrics.

Posted 1 week ago

Apply