
88 Quicksight Jobs - Page 4

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 9.0 years

0 - 3 Lacs

Pune, Chennai, Bengaluru

Hybrid

Primary skill: AWS, QuickSight
Secondary skill: AWS Glue, Lambda, Athena, Redshift, Aurora
Experience: 5-9 years
Location: Pune / Mumbai / Chennai / Noida / Bangalore / Coimbatore
Notice period: Immediate joiners

Design, develop, and maintain large-scale data pipelines using AWS services such as Athena, Aurora, Glue, Lambda, and QuickSight. Develop complex SQL queries to extract insights from massive datasets stored in Amazon Redshift.
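
For context, the pipeline work this role describes usually comes down to orchestrating queries against Athena or Redshift from a Glue or Lambda step. The following is a minimal, hedged sketch of one such step using boto3; the database, table, and S3 bucket names are hypothetical placeholders, not details from the posting.

```python
# Hypothetical sketch: running an Athena query from a pipeline step (e.g. a Lambda
# or Glue Python job). All names below are placeholders.
import time
import boto3

athena = boto3.client("athena", region_name="ap-south-1")

def run_athena_query(sql: str, database: str, output_s3: str) -> list:
    """Start an Athena query, wait for completion, and return the raw result rows."""
    qid = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output_s3},
    )["QueryExecutionId"]

    while True:
        state = athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)  # poll until the query finishes

    if state != "SUCCEEDED":
        raise RuntimeError(f"Athena query {qid} finished in state {state}")
    return athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]

if __name__ == "__main__":
    rows = run_athena_query(
        sql="SELECT order_date, SUM(amount) AS revenue FROM orders GROUP BY order_date",
        database="analytics_db",               # placeholder database
        output_s3="s3://my-athena-results/",   # placeholder results bucket
    )
    print(f"Fetched {len(rows)} result rows")
```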

Posted 2 months ago

Apply

9.0 - 14.0 years

25 - 40 Lacs

Bengaluru

Hybrid

Greetings from tsworks Technologies India Pvt. We are hiring for Sr. Data Engineer - Snowflake with AWS. If you are interested, please share your CV to mohan.kumar@tsworks.io

Position: Senior Data Engineer
Experience: 9+ years
Location: Bengaluru, India (Hybrid)

Mandatory Required Qualifications:
- Strong proficiency in AWS data services such as S3, Glue and Glue Catalog, EMR, Athena, Redshift, DynamoDB, QuickSight, etc.
- Strong hands-on experience building data lakehouse solutions on Snowflake, using features such as streams, tasks, dynamic tables, data masking, and data exchange.
- Hands-on experience using scheduling tools such as Apache Airflow, dbt, and AWS Step Functions, and data governance products such as Collibra.
- Expertise in DevOps and CI/CD implementation.
- Excellent communication skills.

In this role, you will:
- Design, implement, and manage scalable and efficient data architecture on the AWS cloud platform.
- Develop and maintain data pipelines for efficient data extraction, transformation, and loading (ETL) processes.
- Perform complex data transformations and processing using PySpark (AWS Glue, EMR, or Databricks), Snowflake's data processing capabilities, or other relevant tools.
- Work hands-on with data lake solutions such as Apache Hudi, Delta Lake, or Iceberg.
- Develop and maintain data models within Snowflake and related tools to support reporting, analytics, and business intelligence needs.
- Collaborate with cross-functional teams to understand data requirements and design appropriate data integration solutions.
- Integrate data from various sources, both internal and external, ensuring data quality and consistency.

Skills & Knowledge:
- Bachelor's degree in computer science, engineering, or a related field.
- 9+ years of experience in information technology, designing, developing, and executing solutions.
- 4+ years of hands-on experience designing and executing data solutions on AWS and Snowflake cloud platforms as a Data Engineer.
- Strong proficiency in AWS services such as Glue, EMR, and Athena, plus Databricks, with file formats such as Parquet and Avro.
- Hands-on experience in data modelling and batch and real-time pipelines, using Python, Java, or JavaScript, and experience working with RESTful APIs.
- Hands-on experience handling real-time data streams from Kafka or Kinesis.
- Expertise in DevOps and CI/CD implementation.
- Hands-on experience with SQL and NoSQL databases.
- Hands-on experience in data modelling, implementation, and management of OLTP and OLAP systems.
- Knowledge of data quality, governance, and security best practices.
- Familiarity with machine learning concepts and integration of ML pipelines into data workflows.
- Hands-on experience working in an Agile setting.
- Self-driven, naturally curious, and able to adapt to a fast-paced work environment.
- Able to articulate, create, and maintain technical and non-technical documentation.
- AWS and Snowflake certifications are preferred.
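
As a rough illustration of the PySpark transformation work mentioned above (Glue/EMR-style processing feeding a lakehouse), here is a minimal sketch under assumed inputs; the S3 paths and column names are invented placeholders.

```python
# Minimal sketch of a Glue/EMR-style PySpark transformation: read raw Parquet,
# keep the latest record per key, and write a cleansed, partitioned dataset.
from pyspark.sql import SparkSession, functions as F, Window

spark = SparkSession.builder.appName("orders-cleanse").getOrCreate()

raw = spark.read.parquet("s3://example-raw-zone/orders/")  # placeholder path

# Deduplicate: keep only the most recent version of each order_id.
latest = Window.partitionBy("order_id").orderBy(F.col("updated_at").desc())
cleansed = (
    raw.withColumn("rn", F.row_number().over(latest))
       .filter(F.col("rn") == 1)
       .drop("rn")
       .withColumn("order_date", F.to_date("updated_at"))
)

# Write partitioned Parquet for downstream Snowflake/Redshift/QuickSight consumers.
cleansed.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-curated-zone/orders/"
)
```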

Posted 2 months ago

Apply

6.0 - 11.0 years

16 - 20 Lacs

Hyderabad

Work from Office

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.

The Optum Technology Digital team is on a mission to disrupt the healthcare industry, transforming UHG into an industry-leading consumer brand. We deliver hyper-personalized digital solutions that empower direct-to-consumer, digital-first experiences, educating, guiding, and empowering consumers to access the right care at the right time. Our mission is to revolutionize healthcare for patients and providers by delivering cutting-edge, personalized and conversational digital solutions. We're consumer obsessed, ensuring consumers receive exceptional support throughout their healthcare journeys. As we drive this transformation, we're revolutionizing customer interactions with the healthcare system, leveraging AI, cloud computing, and other disruptive technologies to tackle complex challenges.

Serving UnitedHealth Group's digital technology needs, the Consumer Engineering team impacts millions of lives through UnitedHealthcare & Optum. We are seeking a dynamic individual who embodies modern engineering culture - someone with deep engineering expertise within a digital product model, a passion for innovation, and a relentless drive to enhance the consumer experience. Our ideal candidate thrives in an agile, fast-paced rapid-prototyping environment, embraces DevOps and continuous integration/continuous deployment (CI/CD) practices, and champions the Voice of the Customer. If you are driven by the pursuit of excellence, eager to innovate, and excited to make a tangible impact within a team that embraces modern technologies and consumer-centric strategies, while prioritizing robust cyber-security protocols, we invite you to explore this exciting opportunity with us. Join our team and be at the forefront of shaping the future of healthcare, where your unique skills will not only be recognized but celebrated.

Primary Responsibilities:
- Design and implement data models to analyse business, system, and security events for real-time insights and threat detection
- Conduct exploratory data analysis (EDA) to understand patterns and relationships across large data sets, and develop hypotheses for new model development
- Develop dashboards and reports to present actionable insights to business and security teams
- Build and automate near real-time analytics workflows on AWS, leveraging services like Kinesis, Glue, Redshift, and QuickSight
- Collaborate with AI/ML engineers to develop and validate data features for model inputs
- Interpret and communicate complex data trends to stakeholders and provide recommendations for data-driven decision-making
- Ensure data quality and governance standards, collaborating with data engineering teams to build quality data pipelines
- Develop data science algorithms and generate actionable insights as per platform needs, working closely with cross-capability teams throughout the solution development lifecycle, from design to implementation and monitoring
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications:
- B.Tech or Master's degree or equivalent experience
- 12+ years of experience in data engineering roles in data warehousing
- 3+ years of experience as a Data Scientist with a focus on building models for analytics and insights in AWS environments
- Experience with AWS data and analytics services (e.g., Kinesis, Glue, Redshift, Athena, Timestream)
- Hands-on experience with statistical analysis, anomaly detection, and predictive modelling
- Proficiency with SQL, Python, and data visualization tools like QuickSight, Tableau, or Power BI
- Proficiency in data wrangling, cleansing, and feature engineering

Preferred Qualifications:
- Experience in security data analytics, focusing on threat detection and prevention
- Knowledge of AWS security tools and understanding of cloud data security principles
- Familiarity with deploying data workflows using CI/CD pipelines in AWS environments
- Background in working with real-time data streaming architectures and handling high-volume event-based data
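
To make the anomaly-detection part of this role concrete, here is a small, hedged sketch that scores security/system events with an Isolation Forest in scikit-learn; the feature names and numbers are invented for illustration and are not from the posting.

```python
# Illustrative sketch: flag anomalous security/system events with an Isolation Forest.
import pandas as pd
from sklearn.ensemble import IsolationForest

# In practice these features would come from Kinesis/Glue/Redshift pipelines.
events = pd.DataFrame({
    "requests_per_min": [12, 15, 14, 300, 13, 16, 11, 280],
    "distinct_ips":     [1, 1, 2, 40, 1, 2, 1, 35],
    "failed_logins":    [0, 1, 0, 25, 0, 0, 1, 30],
})

model = IsolationForest(contamination=0.25, random_state=42)
events["anomaly"] = model.fit_predict(events)  # -1 = anomalous, 1 = normal

print(events[events["anomaly"] == -1])  # rows flagged for security review
```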

Posted 2 months ago

Apply

5.0 - 10.0 years

9 - 13 Lacs

Gurugram, Delhi / NCR

Work from Office

Role & responsibilities:
- Leverage analytical skills and independent judgement to interpret moderately complex goals, trends, risks, and areas for improvement by collecting, analyzing, and reporting on key metrics and conclusions. May maintain the integrity of reports and/or dashboards to allow the business to operate accurately and efficiently.
- Use expanded analytic solutions and knowledge to support customer teams and improve efficiencies. Work on projects/matters of moderate complexity in an independent contributor role. Complexity can vary based on several factors such as client size, number of systems, varying levels of established structures, or dynamics of the customer and/or data.
- Work cross-functionally and build internal and/or external relationships to ensure high-quality data is available for analysis and better business understanding.
- Develop and deliver data-driven insights and recommendations to internal and/or external stakeholders.
- Engage day-to-day with stakeholders for planning, forecasting, and gaining a solid understanding of business questions for appropriate documentation and analysis.
- Work well independently and seek counsel and guidance on more complex projects/matters as needed. Work is generally reliable on routine tasks and assignments.

Preferred candidate profile

Required Qualifications:
- Proficient knowledge of expanded analysis solutions/tools (such as OLTP/OLAP data structures, advanced Excel, Tableau, Salesforce, Power BI, Business Objects)
- Proficient knowledge of domain languages (such as SQL, HiveQL, etc.)
- Application of moderately complex statistical methods (such as deviations, quartiles, etc.)
- 2-5 years of related experience to reflect the skills and talent necessary for this role preferred
- May require practical sales motion knowledge
- May require practical industry and demographic understanding in one of the following: hardware, software, SaaS, healthcare, or industrial
- May require strong proficiency in all Microsoft Office applications (especially Word, Excel, and PowerPoint)

Preferred Qualifications:
- Bachelor's degree/diploma, or the equivalent, preferred. Degree/diploma in computer science, finance, or statistics/mathematics a plus
- 4+ years of experience in data-driven business insight recommendations, business analytics, and dashboard design

Posted 2 months ago

Apply

3.0 - 6.0 years

13 - 19 Lacs

Hyderabad

Hybrid

Primary Responsibilities:

Data Collection and Cleaning: Data Analysts are responsible for gathering data from multiple sources, ensuring its accuracy and completeness. This involves cleaning and preprocessing data to remove inaccuracies, duplicates, and irrelevant information. Proficiency in data manipulation tools such as SQL, Excel, and Python is essential for efficiently handling large data sets.

Analysis and Interpretation: One of the primary tasks of a Data Analyst is to analyse data to uncover trends, patterns, and correlations. They use statistical techniques and software such as R, SAS, and Tableau to conduct detailed analyses. The ability to interpret results and communicate findings clearly is crucial for guiding business decisions.

Reporting and Visualization: Data Analysts create comprehensive reports and visualizations to present data insights to stakeholders. These visualizations, often created using tools like Power BI and Tableau, make complex data more understandable and actionable. Analysts must be skilled in designing charts, graphs, and dashboards that effectively convey key information.

Collaboration and Communication: Effective collaboration with other departments, such as marketing, finance, and IT, is vital for understanding data needs and ensuring that analysis aligns with organizational goals. Data Analysts must communicate their findings clearly and concisely, often translating technical data into understandable insights for non-technical stakeholders.

Predictive Modelling and Forecasting: Advanced Data Analysts also engage in predictive modelling and forecasting, using machine learning algorithms and statistical methods to predict future trends and outcomes. This requires a solid understanding of data science principles and familiarity with tools like TensorFlow and scikit-learn.

Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
- B.Tech or Master's degree or equivalent degree
- 6+ years of experience in a Data Analyst role in data warehousing
- 3+ years of experience with a focus on building models for analytics and insights in AWS environments
- Data visualization: ability to create effective visualizations using tools like Tableau, Power BI, AWS QuickSight, and other visualization software
- Proficiency in analytical tools: solid knowledge of SQL, Excel, Python, R, and other data manipulation and statistical analysis tools
- Knowledge of database management: understanding of database structures, schemas, and data management practices
- Programming skills: familiarity with programming languages such as Python and R for data analysis and modelling
- Statistical analysis: solid grasp of statistical methods, hypothesis testing, and experimental design

Preferred Qualifications:
- Experience with Terraform to define and manage Infrastructure as Code (IaC)
- Data engineering: working on data architecture, database design, and data warehousing
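
A hedged example of the "data collection and cleaning" step described above, using pandas; the file name and columns are placeholders invented for illustration.

```python
# Sketch of a typical cleaning pass before analysis or a QuickSight dataset refresh.
import pandas as pd

df = pd.read_csv("raw_sales.csv")  # placeholder source file

cleaned = (
    df.drop_duplicates(subset=["order_id"])               # remove duplicate records
      .assign(order_date=lambda d: pd.to_datetime(d["order_date"], errors="coerce"))
      .dropna(subset=["order_id", "order_date"])          # drop rows missing key fields
      .assign(amount=lambda d: d["amount"].fillna(0.0))   # impute missing amounts
)

# Quick profile before handing off to analysis or visualization.
print(cleaned.describe(include="all"))
```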

Posted 2 months ago

Apply

5.0 - 9.0 years

18 - 25 Lacs

Pune

Hybrid

Job Summary: We are seeking an experienced Senior BI Developer with a strong background in AWS QuickSight and advanced SQL to join our analytics team. The ideal candidate will be responsible for designing, developing, and maintaining scalable BI solutions that support data-driven decision-making across the organization. This role involves deep collaboration with stakeholders to transform data into actionable insights through visually compelling dashboards and reports.

Key Responsibilities:
- Design and develop interactive dashboards and reports using AWS QuickSight
- Write, optimize, and maintain complex SQL queries for data extraction and transformation
- Work closely with business users to gather requirements and translate them into technical solutions
- Develop and maintain ETL/ELT pipelines using SQL and cloud data tools
- Perform data modeling, schema design, and documentation for reporting datasets
- Ensure data quality, integrity, and security across BI solutions
- Automate reporting processes to improve efficiency and reduce manual effort
- Monitor performance of BI systems and implement optimizations as needed

Required Skills & Qualifications:
- 6+ years of experience in BI development, data analysis, or related roles
- Hands-on experience with AWS QuickSight for creating dashboards and visual analytics
- Strong SQL skills, including advanced joins, CTEs, window functions, and performance tuning
- Solid understanding of data warehousing concepts and cloud data architectures
- Experience working with AWS services (e.g., Redshift, S3, Athena, Glue) is a plus
- Familiarity with data governance and security best practices
- Strong analytical mindset with excellent problem-solving skills
- Effective communication and stakeholder management abilities
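
As a self-contained illustration of the "advanced SQL" this role calls for (CTEs plus window functions), here is a sketch run against an in-memory SQLite database purely for demonstration; in the job itself the same pattern would target Redshift or Athena behind a QuickSight dataset. The table and data are invented.

```python
# Demo: CTE + window functions computing last order date and lifetime value per customer.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer_id INTEGER, order_date TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, '2024-01-05', 120.0), (1, '2024-02-10', 80.0),
        (2, '2024-01-20', 200.0), (2, '2024-03-02', 150.0);
""")

sql = """
WITH ranked AS (
    SELECT customer_id,
           order_date,
           ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY order_date DESC) AS rn,
           SUM(amount)  OVER (PARTITION BY customer_id)                          AS lifetime_value
    FROM orders
)
SELECT customer_id, order_date AS last_order_date, lifetime_value
FROM ranked
WHERE rn = 1;
"""

for row in conn.execute(sql):
    print(row)
```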

Posted 2 months ago

Apply

3.0 - 4.0 years

5 - 8 Lacs

Bengaluru

Hybrid

Duration: 8 months
Job Type: Contract

Top 3 responsibilities:
- Experience with data visualization using QuickSight or similar tools
- Experience with one or more industry analytics/visualization tools (e.g., Excel, QuickSight, Power BI) and statistical methods (e.g., t-test, chi-squared)
- Experience with a scripting language, e.g., Python

Mandatory requirements:
- 3+ years of experience analyzing and interpreting data with Redshift, NoSQL databases, etc.
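
A small, hedged sketch of the statistical methods named above (t-test and chi-squared) using SciPy; the numbers are invented for illustration.

```python
# Two-sample t-test and a chi-squared test of independence with SciPy.
from scipy import stats

# t-test: e.g. comparing a metric between two experiment variants.
variant_a = [12.1, 11.8, 13.0, 12.5, 11.9]
variant_b = [13.4, 13.9, 12.8, 14.1, 13.6]
t_stat, p_value = stats.ttest_ind(variant_a, variant_b)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# Chi-squared: e.g. churned vs retained counts by plan type.
contingency = [[30, 70],   # plan A: churned, retained
               [55, 45]]   # plan B: churned, retained
chi2, p, dof, _expected = stats.chi2_contingency(contingency)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}, dof = {dof}")
```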

Posted 2 months ago

Apply

2 - 6 years

10 - 14 Lacs

Hyderabad, Secunderabad

Work from Office

Digital Solutions Consultant I - HYD015A

Company: Worley
Primary Location: IND-AP-Hyderabad
Job: Digital Solutions
Schedule: Full-time
Employment Type: Agency Contractor
Job Level: Experienced
Job Posting: May 7, 2025
Unposting Date: May 20, 2025
Reporting Manager Title: Manager

We deliver the world's most complex projects. Work as part of a collaborative and inclusive team. Enjoy a varied and challenging role.

Building on our past. Ready for the future. Worley is a global professional services company of energy, chemicals and resources experts headquartered in Australia. Right now, we're bridging two worlds as we accelerate to more sustainable energy sources, while helping our customers provide the energy, chemicals and resources that society needs now. We partner with our customers to deliver projects and create value over the life of their portfolio of assets. We solve complex problems by finding integrated data-centric solutions from the first stages of consulting and engineering to installation and commissioning, to the last stages of decommissioning and remediation. Join us and help drive innovation and sustainability in our projects.

The Role: As a Power BI Developer with Worley, you will work closely with our existing team to deliver projects for our clients while continuing to develop your skills and experience. We are seeking an experienced Power BI Developer with a strong skillset in creating visually compelling reports and dashboards, data modeling, and UI/UX design. The ideal candidate will have expertise in wireframing, UI design, and front-end development using React and CSS to complement their data analysis and visualization abilities in Power BI.

Power BI Report Development: Design, develop, and maintain interactive dashboards and reports in Power BI that provide business insights. Leverage DAX, Power Query, and advanced data modeling techniques to build robust and scalable solutions. Create custom visuals and optimize Power BI performance for large datasets.

UI/UX Design: Collaborate with product managers and stakeholders to define UI and UX requirements for data visualization. Design wireframes, prototypes, and interactive elements for Power BI reports and applications. Ensure designs are user-friendly, intuitive, and visually appealing.

Data Modeling: Develop and maintain complex data models to support analytical and reporting needs. Ensure the integrity, accuracy, and consistency of data within Power BI reports. Implement ETL processes using Power Query for data transformation.

React & Front-End Development: Develop interactive front-end components and custom dashboards using React. Integrate React applications with Power BI APIs for seamless, embedded analytics experiences. Utilize CSS and modern front-end techniques to ensure responsive and visually engaging interfaces.

Collaboration & Problem-Solving: Work closely with cross-functional teams (data analysts, business analysts, project managers) to understand requirements and deliver solutions. Analyze business needs and translate them into effective data solutions and UI designs. Provide guidance and support on best practices for data visualization, user experience, and data modeling.

About You: To be considered for this role, it is envisaged you will possess the following attributes:
- Experience with AWS services and Power BI Service for deployment and sharing
- Familiarity with other BI tools or frameworks (e.g., Tableau, Qlik, QuickSight)
- Basic understanding of back-end technologies and databases (e.g., SQL, NoSQL)
- Knowledge of Agile development methodologies
- Bachelor's degree in Computer Science, Information Technology, or a related field
- Strong experience in Power BI (Desktop and Service), including Power Query, DAX, and data model design
- Proficiency in UI/UX design with experience in creating wireframes, mockups, and interactive prototypes
- Expertise in React for building interactive front-end applications and dashboards
- Advanced knowledge of CSS for styling and creating visually responsive components
- Strong understanding of data visualization best practices, including the ability to create meaningful and impactful reports
- Experience working with large datasets and optimizing Power BI performance
- Familiarity with Power BI APIs and embedding Power BI reports into web applications
- Excellent communication and collaboration skills to work effectively in a team environment

Moving forward together: We want our people to be energized and empowered to drive sustainable impact. So, our focus is on a values-inspired culture that unlocks brilliance through belonging, connection and innovation. We're building a diverse, inclusive and respectful workplace. Creating a space where everyone feels they belong, can be themselves, and are heard. And we're not just talking about it; we're doing it. We're reskilling our people, leveraging transferable skills, and supporting the transition of our workforce to become experts in today's low carbon energy infrastructure and technology. Whatever your ambition, there's a path for you here. And there's no barrier to your potential career success. Join us to broaden your horizons, explore diverse opportunities, and be part of delivering sustainable change.

Worley takes personal data protection seriously and respects EU and local data protection laws. You can read our full Recruitment Privacy Notice. Please note: If you are being represented by a recruitment agency you will not be considered; to be considered you will need to apply directly to Worley.

Posted 2 months ago

Apply

6 - 10 years

30 - 35 Lacs

Bengaluru

Work from Office

We are seeking an experienced Amazon Redshift Developer / Data Engineer to design, develop, and optimize cloud-based data warehousing solutions. The ideal candidate should have expertise in Amazon Redshift, ETL processes, SQL optimization, and cloud-based data lake architectures. This role involves working with large-scale datasets, performance tuning, and building scalable data pipelines.

Key Responsibilities:
- Design, develop, and maintain data models, schemas, and stored procedures in Amazon Redshift
- Optimize Redshift performance using distribution styles, sort keys, and compression techniques
- Build and maintain ETL/ELT data pipelines using AWS Glue, AWS Lambda, Apache Airflow, and dbt
- Develop complex SQL queries, stored procedures, and materialized views for data transformations
- Integrate Redshift with AWS services such as S3, Athena, Glue, Kinesis, and DynamoDB
- Implement data partitioning, clustering, and query tuning strategies for optimal performance
- Ensure data security, governance, and compliance (GDPR, HIPAA, CCPA, etc.)
- Work with data scientists and analysts to support BI tools like QuickSight, Tableau, and Power BI
- Monitor Redshift clusters, troubleshoot performance issues, and implement cost-saving strategies
- Automate data ingestion, transformations, and warehouse maintenance tasks

Required Skills & Qualifications:
- 6+ years of experience in data warehousing, ETL, and data engineering
- Strong hands-on experience with Amazon Redshift and AWS data services
- Expertise in SQL performance tuning, indexing, and query optimization
- Experience with ETL/ELT tools like AWS Glue, Apache Airflow, dbt, or Talend
- Knowledge of big data processing frameworks (Spark, EMR, Presto, Athena)
- Familiarity with data lake architectures and the modern data stack
- Proficiency in Python, shell scripting, or PySpark for automation
- Experience working in Agile/DevOps environments with CI/CD pipelines
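
A hedged sketch of the table-design side of this role: a Redshift DDL with an explicit distribution style, sort key, and column encodings, submitted through the Redshift Data API. The cluster, database, user, and table names are placeholders, not details from the posting.

```python
# Sketch: create a Redshift fact table tuned with DISTKEY/SORTKEY/ENCODE settings.
import boto3

DDL = """
CREATE TABLE IF NOT EXISTS sales_fact (
    sale_id      BIGINT        ENCODE az64,
    customer_id  BIGINT        ENCODE az64,
    sale_date    DATE          ENCODE az64,
    amount       DECIMAL(12,2) ENCODE az64,
    region       VARCHAR(32)   ENCODE lzo
)
DISTSTYLE KEY
DISTKEY (customer_id)
SORTKEY (sale_date);
"""

client = boto3.client("redshift-data")
response = client.execute_statement(
    ClusterIdentifier="analytics-cluster",  # placeholder cluster
    Database="analytics",                   # placeholder database
    DbUser="etl_user",                      # placeholder user
    Sql=DDL,
)
print("Submitted statement:", response["Id"])
```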

Posted 2 months ago

Apply

7 - 9 years

14 - 24 Lacs

Chennai

Work from Office

Experience Range: 4-8 years in Data Quality Engineering

Job Summary: As a Senior Data Quality Engineer, you will play a key role in ensuring the reliability and accuracy of our data platform and projects. Your primary responsibility will be developing and leading the product testing strategy while leveraging your technical expertise in AWS and big data technologies. You will also guide the team in implementing shift-left testing using Behavior-Driven Development (BDD) methodologies integrated with AWS CodeBuild CI/CD. Your contributions will ensure the successful execution of testing across multiple data platforms and projects.

Key Responsibilities:
- Develop product testing strategy: Collaborate with stakeholders to define and implement the product testing strategy. Identify key platform and project responsibilities, ensuring a comprehensive and effective testing approach.
- Lead testing strategy implementation: Take charge of implementing the testing strategy across data platforms and projects, ensuring thorough coverage and timely completion of tasks.
- BDD & AWS integration: Utilize Behavior-Driven Development (BDD) methodologies to drive shift-left testing and integrate AWS services such as AWS Glue, Lambda, Airflow jobs, Athena, QuickSight, Amazon Redshift, DynamoDB, Parquet, and Spark to improve test effectiveness.
- Test execution & reporting: Design, execute, and document test cases while providing comprehensive reporting on testing results. Collaborate with the team to identify the appropriate data for testing and manage test environments.
- Collaboration with developers: Work closely with application developers and technical support to analyze and resolve identified issues in a timely manner.
- Automation solutions: Create and maintain automated test cases, enhancing the test automation process to improve testing efficiency.

Must-Have Skills:
- Big data platform expertise: At least 2 years of experience as a technical test lead working on a big data platform, preferably with direct experience in AWS.
- Strong programming skills: Proficiency in object-oriented programming, particularly with Python, and the ability to use programming skills to enhance test automation and tooling.
- BDD & AWS integration: Experience with Behavior-Driven Development (BDD) practices and AWS technologies, including AWS Glue, Lambda, Airflow, Athena, QuickSight, Amazon Redshift, DynamoDB, Parquet, and Spark.
- Testing frameworks & tools: Familiarity with testing frameworks such as PyTest and pytest-bdd, and CI/CD tools like AWS CodeBuild and Harness.
- Communication skills: Exceptional communication skills with the ability to convey complex technical concepts to both technical and non-technical stakeholders.

Good-to-Have Skills:
- Automation engineering: Expertise in creating automation testing solutions to improve testing efficiency.
- Test management experience: Knowledge of test management processes, including test case design, execution, and defect tracking.
- Agile methodologies: Experience working in Agile environments, with familiarity in using Agile tools such as Jira to track stories, bugs, and progress.

Minimum Requirements: Bachelor's degree in Computer Science or a related field, or HS/GED with 8 years of experience in Data Quality Engineering. At least 4 years of experience in big data platforms and test engineering, with a strong focus on AWS and Python.

Skills: Test Automation, Python, Data Engineering
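
To illustrate the shift-left BDD testing mentioned above, here is a minimal, hedged pytest-bdd sketch. The feature file path, table names, and counts are invented, and the data-layer queries are stubbed; a real suite would query Athena or Redshift instead.

```python
# Hypothetical feature file referenced below (features/row_counts.feature):
#
#   Feature: Curated table completeness
#     Scenario: Curated row count matches the raw layer
#       Given the raw orders table has been loaded
#       When the curated orders table is built
#       Then the curated row count equals the raw row count

from pytest_bdd import scenario, given, when, then

@scenario("features/row_counts.feature", "Curated row count matches the raw layer")
def test_row_counts():
    pass

@given("the raw orders table has been loaded", target_fixture="raw_count")
def raw_count():
    # In the real platform this would run a COUNT(*) against Athena/Redshift; stubbed here.
    return 1_000

@when("the curated orders table is built", target_fixture="curated_count")
def curated_count():
    return 1_000  # stubbed result of the curated-layer query

@then("the curated row count equals the raw row count")
def counts_match(raw_count, curated_count):
    assert raw_count == curated_count
```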

Posted 2 months ago

Apply

2 - 5 years

6 - 8 Lacs

Bengaluru

Work from Office

BrightCHAMPS - Business Data Analyst

Company Description: BrightCHAMPS is a global live-learning edtech platform for kids aged 6 to 16 that focuses on teaching next-gen life skills such as coding, financial literacy, communication skills, and robotics. The company is currently valued at $650 million with a $63 million investment and serves over 30 countries across 4 verticals. Our mission is to empower the world's 2 billion kids with next-gen life skills to level the playing field for success, regardless of gender, physical, socio-political, cultural, and financial considerations.

Role Description: This is a full-time on-site role for a Business Data Analyst located in the Greater Bengaluru Area. As a Business Data Analyst at BrightCHAMPS, you will be responsible for analysing and interpreting data, performing data modelling, and communicating insights to drive decision-making. Your day-to-day tasks will include conducting data analyses, developing reports and dashboards, and collaborating with cross-functional teams to identify opportunities for improvement.

Key Responsibilities:
1. Collaborate with cross-functional teams on interdepartmental projects to align strategies and achieve organisational goals.
2. Gather, analyse, and interpret data to fulfil various departmental requirements, ensuring data accuracy and completeness.
3. Utilise advanced Excel skills including VLOOKUP, pivot tables, and logical functions to manipulate and analyse large datasets efficiently.
4. Utilise SQL to extract and manipulate data from relational databases, with familiarity in window functions for complex data analysis.
5. Develop and maintain dashboards and reports using preferred visualisation tools such as Google Data Studio, QuickSight, or Tableau to present insights in a visually intuitive manner.
6. Perform root cause analysis (RCA) on key business metrics, identifying trends, patterns, and areas for improvement.
7. Provide statistical analysis and interpretation of data to derive actionable insights and recommendations for optimising business performance.
8. Demonstrate a strong understanding of key business metrics including revenue, conversion rates, and retention/churn, and leverage this knowledge to drive strategic initiatives.

Requirements:
1. Bachelor's degree in Data Analytics, Statistics, or a related or equivalent field.
2. Proficiency in data modelling, advanced Excel, and advanced SQL (must-have); Python is a plus.
3. Experience with data visualisation tools such as QuickSight or Metabase is a plus.
4. Understanding of basic statistics and probability concepts.
5. Excellent communication and collaboration skills with the ability to effectively communicate insights and recommendations to stakeholders.
6. Strong analytical and problem-solving skills with a keen attention to detail.
7. Ability to thrive in a fast-paced startup environment and adapt to changing priorities.
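
As a brief illustration of the retention/churn metric work this role describes, here is a hedged pandas sketch; the activity data is invented.

```python
# Month-over-month retention and churn from a simple user-activity table.
import pandas as pd

activity = pd.DataFrame({
    "user_id": [1, 2, 3, 1, 2, 1],
    "month":   ["2024-01", "2024-01", "2024-01", "2024-02", "2024-02", "2024-03"],
})

jan_users = set(activity.loc[activity["month"] == "2024-01", "user_id"])
feb_users = set(activity.loc[activity["month"] == "2024-02", "user_id"])

retention = len(jan_users & feb_users) / len(jan_users)  # share of Jan users active in Feb
churn = 1 - retention
print(f"Jan -> Feb retention: {retention:.0%}, churn: {churn:.0%}")
```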

Posted 2 months ago

Apply

4 - 6 years

10 - 14 Lacs

Bengaluru

Work from Office

Job Description: We are looking for a self-motivated, highly skilled, and experienced AI/ML Engineer to be part of our growing team. You will be responsible for developing and deploying cutting-edge machine learning models to solve real-world problems. Your responsibilities will include data preparation, model training, evaluation, and deployment, as well as collaborating with data scientists and software engineers to ensure our AI solutions are effective and scalable. As a Machine Learning Engineer, you will develop and optimize pipelines for both inference and training processes. Expertise in Amazon SageMaker is crucial: you will build, train, and deploy machine learning and foundation models at scale on its managed infrastructure.

Experience Level: ~4 years

Key Responsibilities:
- Utilize AI solutions and tools provided by AWS to build segmentation models based on customer behavior and usage patterns.
- Automatically generate periodic reports.
- Develop functionality for defining reusable segmentation criteria tailored to marketing objectives.

Required Skill Set:
- Hands-on experience with AWS S3, Lambda, Glue, SageMaker, Athena, QuickSight, etc.
- Python programming, a conceptual understanding of ML algorithms and deep learning techniques, and prior experience with AWS.
- Understanding of serverless architectures and event-driven processing flows.
- Prior experience working with AI solutions and tools provided by AWS is a must.

Qualifications:
- Bachelor's or Master's degree in Computer Science or a related field.
- Prior industry experience with machine learning frameworks or projects is a must.
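
A hedged sketch of the usage-based customer segmentation described above, shown locally with scikit-learn; in the role this kind of model would typically be trained and deployed via SageMaker. Feature names and values are placeholders.

```python
# Sketch: cluster customers into behavioral segments with scaled features and k-means.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

usage = pd.DataFrame({
    "sessions_per_week": [1, 2, 14, 15, 3, 20, 2, 18],
    "avg_spend":         [5, 8, 60, 75, 10, 90, 6, 80],
    "support_tickets":   [0, 1, 3, 2, 0, 4, 1, 3],
})

features = StandardScaler().fit_transform(usage)
usage["segment"] = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)

# Segment profiles would feed reusable marketing criteria and periodic reports.
print(usage.groupby("segment").mean())
```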

Posted 2 months ago

Apply

7 - 10 years

15 - 25 Lacs

Chennai, Bengaluru

Work from Office

What awaits you / Job Profile:
- Design and develop data visualizations using Amazon QuickSight to present complex data in clear and understandable dashboards.
- Create interactive dashboards and reports that allow end-users to explore data and draw meaningful conclusions.
- Work on data preparation and ensure good-quality data is used in visualization.
- Collaborate with data analysts and business stakeholders to understand data requirements, gather insights, and transform raw data into actionable visualizations.
- Ensure that the data visualizations are user-friendly, intuitive, and aesthetically pleasing, and optimize the user experience by incorporating best practices.
- Identify and address performance bottlenecks in data queries and visualization.
- Ensure compliance with data security policies and governance guidelines when handling sensitive data within QuickSight.
- Provide training and support to end-users and stakeholders on how to interact with dashboards.
- Be self-managing, explore the latest technical developments, and incorporate them into the project.
- Bring experience in analytics, reporting, and business intelligence tools.
- Use the Agile methodology, attend daily standups, and use Agile tools.
- Lead technical discussions with customers to find the best possible solutions.

What should you bring along?

Must have:
- Overall experience of 7-10 years in data visualization development.
- Minimum of 2 years in QuickSight and 5 years in other BI tools such as Tableau, Power BI, or Qlik.
- Strong skills in writing complex SQL scripts and dataset modeling.
- Hands-on with AWS: Athena, RDS, S3, IAM permissions, and logging and monitoring services.
- Experience working with various data sources and databases such as Oracle, MySQL, S3, and Athena.
- Ability to work with large datasets and design efficient data models for visualization.
- Prior experience working in an Agile Scrum/Kanban model.

Nice to have:
- Knowledge of data ingestion and data pipelines in AWS.
- Knowledge of Amazon Q or AWS LLM services to enable AI integration.

Must-have skills: QuickSight, Tableau, SQL, AWS
Good-to-have skills: QlikView, data engineering, AWS LLM
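
For a flavor of working with QuickSight programmatically (e.g., an inventory or governance check across dashboards and datasets), here is a minimal, hedged boto3 sketch; the AWS account ID and region are placeholders.

```python
# Sketch: list QuickSight dashboards and datasets in an account for an audit/inventory.
import boto3

ACCOUNT_ID = "123456789012"  # placeholder AWS account ID
qs = boto3.client("quicksight", region_name="ap-south-1")

for d in qs.list_dashboards(AwsAccountId=ACCOUNT_ID)["DashboardSummaryList"]:
    print("Dashboard:", d["Name"], d["DashboardId"])

for ds in qs.list_data_sets(AwsAccountId=ACCOUNT_ID)["DataSetSummaries"]:
    print("Dataset:", ds["Name"], ds["DataSetId"])
```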

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
