5.0 years
0 Lacs
Greater Kolkata Area
On-site
Client: Ministry of Public and Business Service Delivery and Procurement
Work Location: 5700 Yonge Street, 10th Floor, Toronto, Ontario (Hybrid)
Estimated Start Date: 2025-07-02
Estimated End Date: 2026-04-02
Business Days: 248.00
Extension: Probable after the initial mandate
Hours: 7.25 hours per day
Security Level: CRJMC

Description

Responsibilities:
- Construct a high-level overview of the organization's entire data landscape.
- Illustrate how data moves through the organization and between systems.
- Develop high-level designs outlining the main entities and their relationships.
- Design detailed logical models showing attributes, primary keys, and relationships between entities, independent of physical considerations.
- Implement physical designs showing how data will be stored, including tables, columns, indexes, and relationships.
- Design diagrams showing how different data systems and databases interact and are integrated.
- Document standards and policies for data management, including naming conventions, data quality standards, and data security policies.
- Plan and guide the roles and responsibilities of data stewards in maintaining data quality and integrity.
- Compile detailed descriptions of data elements, their meanings, and their relationships.
- Create a comprehensive list of business terms and their definitions to ensure consistent usage across the organization.
- Design and document processes for extracting, transforming, and loading data from various sources.
- Plan and schedule the migration of data from legacy systems to new systems or platforms.
- Detail the design of the data warehouse, including the schema, dimensions, and fact tables.
- Set up and configure BI tools and dashboards.
- Report regularly on data quality, identifying issues and areas for improvement.
- Develop strategies and methods for cleansing and improving data quality.
- Plan and diagram how data security will be implemented and maintained.
- Document compliance with relevant regulations and standards.
- Develop strategies for improving the efficiency and performance of data systems.
- Detail plans for data architecture projects, including timelines, milestones, and required resources.
- Document technical specifications for all data architecture components and processes.

Experience and Skill Set Requirements

Public Sector Experience – 5 Points
- 5+ years of experience working with federal, provincial, or broader public-sector healthcare providers.
- Knowledge of Public Sector Enterprise Architecture artifacts (or similar), processes, and practices, and the ability to produce technical documentation that complies with industry-standard practices.
- In-depth knowledge of industry standards such as Project Management Institute (PMI) and Public Sector I&IT project management methodologies.
- Knowledge of and experience with Public Sector health-related projects.
- Knowledge and understanding of Ministry policy and IT project approval processes and requirements.
- Experience adopting and adhering to the Public Sector Unified I&IT Project Methodology, Public Sector Enterprise Architecture, Public Sector Gating process, and Public Sector Standard Systems Development Methodologies.
- Experience with large, complex IT health-related projects.

Technical Skills and Experience – 50 Points
- 7+ years of experience in the following:
  - Developing conceptual, logical, and physical data models for structured, semi-structured, and unstructured data with relational, star, and snowflake schemas; proficiency in data modeling methods and tools (e.g., ERwin, Visio, PowerDesigner).
  - Developing and implementing common data models and master data management strategies.
  - Designing schemas that balance agility, schema flexibility, data governance, and quality; expertise in creating schemas that allow flexible querying and analysis across various data formats.
  - Defining data partitioning, clustering, and indexing strategies to optimize query performance and data access patterns.
  - Implementing schemas and metadata structures that support efficient data management.
  - Understanding of data governance principles and practices, especially in designing schemas and metadata structures.
  - Creating architecture artifacts based on enterprise standards.
- 4+ years of experience designing schemas that integrate diverse data types stored in a data lake or data lakehouse.
- 3+ years of experience implementing Medallion architecture or similar frameworks (a minimal illustration follows this posting).
- Familiarity with tools and techniques for managing metadata in a data lakehouse environment.
- Hands-on experience with Azure services, including Azure Data Factory, Azure Data Lake Storage, and Azure Databricks.

Core Skills and Experience – 30 Points
- 10+ years of experience in solutions development.
- Experience addressing complex data integration challenges and designing efficient, scalable data models.
- Experience monitoring and enforcing data modelling/normalization standards.
- Experience with the privacy and security requirements of software development in a health context, or equivalent.
- Demonstrated knowledge of leading technical design, security, and recovery procedures for application development.
- Experience with relational and hierarchical database technologies.
- Demonstrated understanding of and experience with information retrieval packages using query languages.
- Experience troubleshooting and tuning production environments to improve performance.
- Knowledge and experience of information management tools, principles, concepts, policies, and practices.

General Skills – 15 Points
- Experience with at least two different platforms, operating systems, environments, database technologies, languages, and communications protocols.
- Knowledge of performance considerations for different database designs in different environments.
- Experience with structured methodologies for the design, development, and implementation of applications.
- Experience in systems analysis and design in large or medium systems environments.
- Awareness of emerging I&IT trends and directions.
- Excellent analytical, problem-solving, and decision-making skills; a team player with a track record of meeting deadlines.
- Strong communication skills to articulate data architecture concepts and collaborate with stakeholders.
- Experience working with cross-functional teams, including IT, business, and data engineering teams.
- Strong analytical skills to understand and model complex data relationships and structures.
- Meticulous attention to detail in designing and implementing data models and schemas.

Note: This position is currently listed as "Hybrid"; consultants will be required to work onsite at the work location 3 days a week and from home 2 days a week. The details of this arrangement will be confirmed with the Hiring Manager.

Extension/Amendment Attestation: Extensions are only allowed using unused days/funds left on the contract. No additional funds will be added beyond the maximum contract value. The Statement of Work (SOW) shall expire on April 2, 2026. HSC may exercise its option(s) to extend a SOW beyond April 2, 2026 using unused days/funds left on the contract. Such extension(s) will be allowable only if the Master Service Agreement is extended beyond April 5, 2026 and will be upon the same terms, conditions, and covenants contained in the SOW.

Eligibility and Application Steps
If you are enthusiastic about this opportunity, please send the following documents to hrsmss@smsoftconsulting.com; without the mandatory documents, we cannot submit a candidate.
- Updated resume in Word format (mandatory)
- Skills matrix and references (mandatory)
- Expected hourly rate (mandatory)
- Visa status (mandatory)
- LinkedIn profile (mandatory)

Please apply only if you meet the qualifications above. Feel free to share this with your network or tag someone who fits the role. If you have any questions or need further clarification, call or text (647) 408-1348.
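The requirements above reference Medallion architecture on Azure Databricks, star/snowflake-style warehouse schemas, and partitioning strategies. The PySpark sketch below is a minimal, hypothetical illustration of that bronze/silver/gold layering; the paths, table names, and columns are assumptions for illustration only, not the ministry's actual design.

```python
# Hypothetical Medallion-style layering (bronze -> silver -> gold) with PySpark.
# Paths, table names, and columns are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: land raw records as-is (schema inferred, no cleansing).
bronze = spark.read.json("/lake/bronze/encounters/")

# Silver: conform types, drop obvious duplicates, keep an ingestion timestamp.
silver = (
    bronze.dropDuplicates(["encounter_id"])
          .withColumn("encounter_date", F.to_date("encounter_date"))
          .withColumn("ingested_at", F.current_timestamp())
)
silver.write.mode("overwrite").partitionBy("encounter_date").parquet("/lake/silver/encounters/")

# Gold: a star-schema-style daily fact aggregated for reporting.
gold = (
    silver.groupBy("facility_id", "encounter_date")
          .agg(F.count("*").alias("encounter_count"))
)
gold.write.mode("overwrite").partitionBy("encounter_date").parquet("/lake/gold/fact_encounters_daily/")
```

Partitioning the silver and gold layers by date is one common way to support the query-pruning and data-access-pattern goals listed in the requirements; on Databricks the writes would typically target Delta tables rather than plain Parquet.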
Posted 2 weeks ago
0 years
0 Lacs
Bhopal, Madhya Pradesh, India
On-site
We are seeking a highly skilled QA Testing Engineer to join our Quality Assurance team. The ideal candidate will be responsible for developing and executing manual and automated test plans to ensure the delivery of high-quality software solutions. The role requires strong technical proficiency, analytical thinking, and a detail-oriented mindset.

Key Responsibilities
- Analyze business requirements, user stories, and technical specifications to create detailed test plans, test cases, and test scripts.
- Perform functional, integration, regression, system, smoke, and sanity testing across various platforms (web, API, mobile).
- Develop, maintain, and execute automated test scripts using tools like Selenium, Cypress, or Playwright (see the sketch after this posting).
- Identify, log, and track bugs in defect-tracking tools (e.g., JIRA, Bugzilla) and verify fixes through retesting.
- Collaborate with development, DevOps, and product teams to understand features, technical implementation, and business use cases.
- Participate in sprint planning, daily stand-ups, and reviews in an Agile/Scrum environment.
- Ensure test environments are correctly set up and maintained.
- Generate detailed test reports and metrics for internal teams and management.

Technical Skillset
- Testing types: manual testing (functional, UI/UX, smoke, sanity, exploratory); automated testing (Selenium, Cypress, Playwright, etc.); API testing (Postman, Rest Assured, Swagger); regression and re-testing; cross-browser and responsive testing.
- Automation tools: Selenium WebDriver, Cypress, Playwright, TestNG, JUnit.
- API testing: Postman, Rest Assured, Swagger, SoapUI.
- Defect management: JIRA, Bugzilla, TestRail, Azure DevOps.
- Build & CI/CD: Jenkins, GitHub Actions, GitLab CI.
- Version control: Git, Bitbucket.
- Performance testing (good to have): JMeter, LoadRunner, Gatling.
- Databases: SQL; basic querying and data validation using MySQL, PostgreSQL, or SQL Server.

Soft Skills
- Analytical thinking and problem-solving ability.
- Strong attention to detail and ability to catch edge cases.
- Clear and concise written and verbal communication.
- Team player, proactive in resolving testing issues.
- Ability to work independently and handle pressure in fast-paced environments.

Educational Qualification
- Bachelor's degree in Computer Science, Information Technology, or a related discipline.
- ISTQB or equivalent QA certification (preferred).

Good to Have
- Knowledge of BDD (Cucumber, SpecFlow).
- Mobile app testing experience (Android and iOS, using Appium or manual testing).
- Experience with cloud-based testing tools like BrowserStack or Sauce Labs.
- Security or accessibility testing awareness.
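As a hedged illustration of the automation stack named above (Selenium for UI checks plus API checks, driven by pytest), the sketch below shows a minimal smoke suite. The base URL, element locator, and endpoint are placeholder assumptions, not details of any actual product under test.

```python
# Minimal smoke suite: one Selenium UI check and one API check, run with pytest.
# URL, locator, and endpoint are placeholder assumptions.
import pytest
import requests
from selenium import webdriver
from selenium.webdriver.common.by import By

BASE_URL = "https://example.test"  # assumed application under test


@pytest.fixture
def driver():
    d = webdriver.Chrome()  # assumes a local ChromeDriver is available
    yield d
    d.quit()


def test_login_page_loads(driver):
    """UI smoke check: the login page renders and shows its form."""
    driver.get(f"{BASE_URL}/login")
    assert driver.find_element(By.ID, "login-form").is_displayed()


def test_health_endpoint_responds():
    """API smoke check: the service health endpoint returns HTTP 200."""
    resp = requests.get(f"{BASE_URL}/api/health", timeout=10)
    assert resp.status_code == 200
```

In practice the same suite would be wired into the CI tools listed above (Jenkins, GitHub Actions, or GitLab CI) so it runs on every build.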
Posted 2 weeks ago
8.0 - 10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
- Graduation or post-graduation in commerce or accounting, with 8-10 years of relevant experience in MIS reporting, analysis of budget-to-actual variances, and querying financial data (a minimal variance-report sketch follows this posting).
- Perform analysis and provide insights into revenue, assignments, and consultant contributions, and identify key business drivers.
- Prepare FP&A reports in a timely and accurate manner.
- Good oral and written communication skills in English.
- Excellent knowledge of MS Office applications.
- Understanding of end-to-end processes and appreciation of critical parameters.
- Problem identification and analytical ability.
- Self-initiative, drive, and zeal for continuous improvement.
- Ability to discharge responsibilities in a conflicting environment.
- Conformance with policies and compliance requirements.
- Ability to coach and give feedback on an ongoing basis.
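For illustration only, here is a minimal pandas sketch of the budget-to-actual variance reporting mentioned above; the account names, figures, and the 5% flag threshold are invented assumptions.

```python
# Minimal budget-to-actual variance report with pandas; all figures are invented.
import pandas as pd

data = pd.DataFrame({
    "account": ["Revenue", "Salaries", "Travel"],
    "budget":  [120000, 80000, 5000],
    "actual":  [112500, 83200, 4100],
})

data["variance"] = data["actual"] - data["budget"]
data["variance_pct"] = (data["variance"] / data["budget"] * 100).round(1)

# Flag lines breaching an assumed +/-5% threshold for commentary in the MIS pack.
data["flag"] = data["variance_pct"].abs() > 5
print(data.to_string(index=False))
```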
Posted 2 weeks ago
2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About This Role
Wells Fargo is seeking a Finance Analyst.

In This Role, You Will
- Participate in functions related to financial research and reporting.
- Perform forecast analysis of key metrics, as well as other financial consulting related to business performance and operating and strategic reviews.
- Identify opportunities for process improvements within the scope of responsibilities.
- Research moderate to complex financial data in support of management decision-making for a business.
- Create and communicate various activities such as product pricing and product and portfolio performance.
- Exercise independent judgment to guide key metrics forecasting, closing data, and validation.
- Present recommendations for resolving all aspects of delivering key forecasting projections as well as financial reporting to support monthly and quarterly forecasting.
- Develop expertise on reporting that meets brand standards and internal control standards.
- Collaborate and consult with peers, colleagues, and managers to resolve issues and achieve goals.

Required Qualifications:
- 2+ years of finance experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education.

Desired Qualifications:
- Degree in Accounting, or in Engineering with Data Science, or a higher degree in business administration; project management certification.
- Strong experience in projects involving data sourcing, data integration, related data studies, and gap analysis of data elements between key reports and the data available in the application, to aid adoption of a new system toward the larger FTO objective of eliminating data redundancy, standardizing data, and establishing a single source for finance use cases (medium-term priority) and non-finance use cases (long-term priority), in the financial services industry or equivalent, demonstrated through work experience, training, and education.
- Knowledge and understanding of data warehouses and database querying using SQL or similar querying methods.
- Experience in project management related to data sourcing, establishing data integrity controls, and data analysis to enhance quality.
- Experience with Microsoft Office, writing Structured Query Language (MS SQL), and basic knowledge of Python and Power BI.
- Experience working as part of highly agile transformation/change-management teams.
- Strong analytical skills with high attention to detail and accuracy.
- Excellent verbal, written, presentation, and interpersonal communication skills.
- Knowledge and understanding of the financial services industry, with emphasis on capital market products.
- Strong organizational, multi-tasking, and prioritizing skills, as projects involve interaction with senior leadership groups.
- Demonstrated problem-solving and critical thinking.
- Ability to develop solutions where minimal information is available by connecting the dots and collaborating with upstream teams, data domains, technology, and downstream users.
- Ability to train and guide team members and manage transformation initiatives to optimize operational effectiveness and efficiency.

Job Expectations:
- Work in agile teams using Scrum methodologies for product development, facilitating activities and projects related to application support in Finance Data Operations.
- Collaborate and consult with peers, colleagues, product owners, stakeholders, and leadership to serve the downstream user community's data needs and achieve the product vision.
- Use analytical skills and project management techniques to support data integration and minimal curation, facilitating production of reports that enable regulatory report submission and analysis, and gathering insights that support management decision-making.
- Use data analysis techniques on SQL platforms to aid data sourcing from multiple data domains (an illustrative gap-analysis query follows this posting).
- Spearhead efforts to identify risk and implement controls, document the consolidated reporting and control architecture, and establish an effective, G&O-compliant architecture for daily financial reporting.
- Face off with internal and external audit and COSO for a high-risk application as defined under operational risk guidelines.
- Maintain a robust understanding of relational databases; the Analyst will need to analyze large volumes of data via SQL queries to aid gap analysis as part of a data study to eliminate data redundancy and overlap between source domains.
- This is a high-exposure group, and the ideal candidate will be a driver of change in establishing data sourcing, developing application controls, and building effective partnerships within and across cross-functional teams such as the Sourcing team, Data Landing Zone team, Technology team, Risk-Control, Operations, and downstream teams.

@RWF25

Posting End Date: 15 Jun 2025. The job posting may come down early due to the volume of applicants.

We Value Equal Opportunity
Wells Fargo is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other legally protected characteristic.

Employees support our focus on building strong customer relationships balanced with a strong risk-mitigating and compliance-driven culture, which firmly establishes those disciplines as critical to the success of our customers and company. They are accountable for execution of all applicable risk programs (Credit, Market, Financial Crimes, Operational, Regulatory Compliance), which includes effectively following and adhering to applicable Wells Fargo policies and procedures, appropriately fulfilling risk and compliance obligations, timely and effective escalation and remediation of issues, and making sound risk decisions. There is emphasis on proactive monitoring, governance, risk identification and escalation, as well as making sound risk decisions commensurate with the business unit's risk appetite and all risk and compliance program requirements.

Candidates applying to job openings posted in Canada: Applications for employment are encouraged from all qualified candidates, including women, persons with disabilities, aboriginal peoples, and visible minorities. Accommodation for applicants with disabilities is available upon request in connection with the recruitment process.

Applicants With Disabilities
To request a medical accommodation during the application or interview process, visit Disability Inclusion at Wells Fargo.

Drug and Alcohol Policy
Wells Fargo maintains a drug-free workplace. Please see our Drug and Alcohol Policy to learn more.

Wells Fargo Recruitment and Hiring Requirements
Third-party recordings are prohibited unless authorized by Wells Fargo. Wells Fargo requires you to directly represent your own experiences during the recruiting and hiring process.

Reference Number: R-458061
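Purely as an illustration of the SQL-based gap analysis between a report extract and a source application described above, the sketch below uses an in-memory SQLite database so it is self-contained; the table and column names, and the sample trades, are assumptions rather than any actual Wells Fargo schema.

```python
# Illustrative gap analysis: records present in the source application but missing
# from the report extract. Table, column names, and data are assumptions.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE report_extract (trade_id TEXT PRIMARY KEY, notional REAL);
    CREATE TABLE source_app     (trade_id TEXT PRIMARY KEY, notional REAL);
    INSERT INTO report_extract VALUES ('T1', 100.0), ('T2', 250.0);
    INSERT INTO source_app     VALUES ('T1', 100.0), ('T3', 90.0);
""")

gaps = con.execute("""
    SELECT s.trade_id, s.notional
    FROM source_app AS s
    LEFT JOIN report_extract AS r ON r.trade_id = s.trade_id
    WHERE r.trade_id IS NULL
""").fetchall()

print(gaps)  # [('T3', 90.0)] -- trades missing from the report extract
```

The same LEFT JOIN / IS NULL pattern scales to the data-domain comparisons described in the role, with the in-memory tables swapped for the actual SQL platform.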
Posted 2 weeks ago
4.0 years
0 Lacs
India
Remote
Experience: 4.00+ years
Salary: USD 3000 / month (based on experience)
Expected Notice Period: 30 Days
Shift: (GMT+02:00) Europe/Paris (CEST)
Opportunity Type: Remote
Placement Type: Full-time contract for 3 months (40 hrs a week / 160 hrs a month)
(Note: This is a requirement for one of Uplers' clients - Oyster)
What do you need for this opportunity?
Must-have skills: AI/LLM, Zendesk, a BI tool (e.g. Looker), Product Analytics experience (e.g. Google Analytics), Snowflake, Data Analysis, SQL
Oyster is looking for: Data Analyst - Contract Hire
Location: Remote
Duration: 3 months contractual, month-to-month contract, 4-5 days per week
About Us
We’re a data-driven organisation committed to turning raw information into actionable insights. Our analytics team partners with stakeholders across the business to inform strategy, optimise operations, and unlock new growth opportunities.
The Role
Analysis & Reporting
Perform exploratory and ad-hoc analyses to uncover trends, outliers, and opportunities
Design, build, and maintain dashboards and scheduled reports in our BI platform
Stakeholder Engagement
Gather requirements, present findings, and translate data insights into clear, actionable recommendations
Collaborate with product, revenue, and operations teams to prioritise analytics work
Upskill and support stakeholders on analytical tools and data literacy
Work closely with Data Engineering teams for project support
Presentation
Deliver clear, compelling presentations of your analyses to both technical and non-technical audiences
Experience
2+ years’ experience in a data-analysis role (or similar), ideally working with Product teams
Strong SQL skills for querying and transforming large datasets
Hands-on experience with a BI tool (Looker, Power BI, Tableau, Qlik, etc.)
Experience with Product Analytics (Google Analytics, Pendo, Amplitude, etc.)
Excellent presentation skills: able to prepare and deliver concise, effective reports and slide decks
Education & Certifications
Degree or diploma in Data Science, Statistics, Computer Science, or a related field (preferred)
Looker LookML certification (nice to have)
Snowflake certifications (nice to have)
Nice-to-Have / Advantages
Experience supporting Snowflake Cortex or similar AI-driven data transformations
Working with APIs to ingest or expose data
Hands-on Python scripting to automate data-prep steps
Familiarity with AI/LLMs and embedding-oriented data pipelines
Experience working with Zendesk data
Why You’ll Love Working Here
Impact: Your dashboards and analyses will directly influence strategic decisions
Collaboration: Work alongside data engineers, data scientists, and cross-functional teams
Opportunity to develop advanced analytics and ML/AI skills
How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!
About Uplers:
Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal apart from this one. Depending on the assessments you clear, you can apply for them as well.)
So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
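As a rough illustration of the "Python scripting to automate data-prep steps" item in the posting above, the sketch below deduplicates and type-normalises a small dataset before handing it off to a BI layer. The column names, values, and output file name are invented for the example, and it assumes pandas is available.

```python
# Minimal data-prep automation sketch: deduplicate, normalise types, export.
# Columns (signup_date, plan, mrr) and the output file are hypothetical.
import pandas as pd

raw = pd.DataFrame({
    "signup_date": ["2024-01-05", "2024-01-05", "2024-02-10", None],
    "plan": ["pro", "pro", "basic", "basic"],
    "mrr": ["49", "49", "9", "9"],
})

prepared = (
    raw.drop_duplicates()                 # remove exact duplicate rows
       .dropna(subset=["signup_date"])    # drop rows missing a signup date
       .assign(
           signup_date=lambda df: pd.to_datetime(df["signup_date"]),
           mrr=lambda df: df["mrr"].astype(float),
       )
)

prepared.to_csv("prepared_signups.csv", index=False)  # hand-off file for the BI tool
print(prepared.dtypes)
```

In practice a script like this would typically be scheduled (cron, Airflow, or similar) so the BI dashboards always read a clean, consistently typed input.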
Posted 2 weeks ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Description
We are looking for a leader who can own our analytics roadmap, develop and grow the careers of the Business Analysts on our team, and serve as a key partner for our Operations, Product, Engineering, and Science teams. In this role, you will: a) lead a team of Business Analysts to identify and solve business problems using vast amounts of data from disparate sources; b) drive an analytics roadmap, prioritizing business insight requests to balance tactical and strategic information needs; c) work closely with our Ops and Engineering teams to build and maintain pipelines that power critical products and features for Operations, CX, Product, and Science; and d) guide and coach your team to continuously improve operational excellence and the ability to deliver best-in-class solutions.
About The Team
OPTIMA Operations strives to become the most reliable source for dataset generation and annotations. We work in collaboration with Shopping feature teams to enhance customer experience (CX) quality across shopping features, devices, and locales. Our primary focus lies in handling annotations for training, measuring, and improving Artificial Intelligence (AI) and Large Language Models (LLMs), enabling Amazon to deliver a superior shopping experience to customers worldwide. Our mission is to empower Amazon's LLMs through Reinforcement Learning from Human Feedback (RLHF) across various categories at high speed. We aspire to provide an end-to-end data solution for the LLM lifecycle, leveraging technology alongside our operational excellence. By joining us, you will play a pivotal role in shaping the future of the shopping experience for customers worldwide.
Basic Qualifications
Bachelor's degree in Business, Engineering, Statistics, Computer Science, Mathematics, or a related field
5+ years of experience in a business analyst, data analyst, or statistical analyst role
2+ years leading a team of Business Analysts
Highly skilled at querying relational databases, with the ability to pull data from various sources, demonstrated in a business environment
Experience with data visualization platforms
Proven problem-solving skills, project management skills, attention to detail, and exceptional organizational skills
Communication (verbal, written, and data presentation) and interpersonal skills to effectively communicate with both business and technical teams
Preferred Qualifications
Master's degree in Business, Engineering, Statistics, Computer Science, Mathematics, or a related field
Experience conducting advanced statistical analysis (skilled in one or more of R, SPSS, SAS, Python, etc.)
Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
Company - ADCI - Tamil Nadu - A83
Job ID: A2995065
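As a hedged illustration of the "querying relational databases" and "pull from various sources" qualifications above, the following sketch combines two hypothetical sources (annotation throughput and quality audits) and summarises them per locale with pandas. The source names, columns, and numbers are made up for the example; this is not Amazon's tooling.

```python
# Combine two stand-in data sources and summarise per locale.
# All data, source names, and columns are invented for illustration.
import pandas as pd

annotations = pd.DataFrame({
    "locale": ["en_US", "en_US", "de_DE"],
    "items_labelled": [120, 80, 95],
})
quality_audits = pd.DataFrame({
    "locale": ["en_US", "de_DE"],
    "defect_rate": [0.02, 0.04],
})

summary = (
    annotations.groupby("locale", as_index=False)["items_labelled"].sum()
               .merge(quality_audits, on="locale", how="left")  # attach audit results
)
print(summary)
```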
Posted 2 weeks ago
The querying job market in India is thriving with opportunities for professionals skilled in database querying. With the increasing demand for data-driven decision-making, companies across various industries are actively seeking candidates who can effectively retrieve and analyze data through querying. If you are considering a career in querying in India, here is some essential information to help you navigate the job market.
The average salary range for querying professionals in India varies based on experience and skill level. Entry-level positions can expect to earn between INR 3-6 lakhs per annum, while experienced professionals can command salaries ranging from INR 8-15 lakhs per annum.
In the querying domain, a typical career progression may look like:
- Junior Querying Analyst
- Querying Specialist
- Senior Querying Consultant
- Querying Team Lead
- Querying Manager
Apart from strong querying skills, professionals in this field are often expected to have expertise in:
- Database management
- Data visualization tools
- SQL optimization techniques (see the sketch below)
- Data warehousing concepts
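For the SQL optimization point above, here is a small, self-contained sketch using Python's built-in sqlite3 module and a made-up orders table. It shows how adding an index changes the query plan from a full table scan to an index search; the exact plan text varies by SQLite version.

```python
# Index-based SQL optimisation demo on an in-memory SQLite database.
# The orders table and its data are invented for the example.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
cur.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                [(i % 100, i * 1.5) for i in range(1000)])

query = "SELECT SUM(total) FROM orders WHERE customer_id = ?"

# Without an index, the plan typically reports a full scan of the table.
print(cur.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall())

cur.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")

# With the index, the plan typically reports a search using idx_orders_customer.
print(cur.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall())

conn.close()
```

The same workflow applies on real workloads: inspect the plan, add or adjust indexes for the predicates you filter on, and re-check the plan and timings.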
As you venture into the querying job market in India, remember to hone your skills, stay updated with industry trends, and prepare thoroughly for interviews. By showcasing your expertise and confidence, you can position yourself as a valuable asset to potential employers. Best of luck on your querying job search journey!