
11 Data Sets Jobs

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2.0 - 6.0 years

0 - 0 Lacs

Hyderabad, Telangana

On-site

As a Data Scientist at our company, you will be responsible for analyzing large sets of structured and unstructured data to uncover trends and insights that can drive business decisions. Your expertise in data mining, statistical analysis, and machine learning will be crucial in developing and implementing machine learning models and algorithms for predictive and prescriptive analysis. Collaborating with cross-functional teams, you will play a key role in building models, developing algorithms, and enhancing our data infrastructure.

Your key responsibilities will include performing exploratory data analysis (EDA), preparing reports using data visualization tools such as Tableau, Power BI, and Matplotlib, and conducting A/B testing and statistical analysis to assess the impact of business changes. Working closely with data engineers, you will build and maintain scalable data pipelines, ensuring data quality and optimizing models for performance and scalability. Additionally, you will stay up to date with the latest techniques in data science and machine learning, applying them to business challenges. Your ability to present findings and insights to non-technical stakeholders and executives in an understandable format will be essential in driving strategic decision-making. Supporting ad-hoc analysis and providing data-driven recommendations will also be part of your role.

If you have a passion for deriving actionable insights from large data sets and are excited about leveraging data science methodologies to solve business problems, we invite you to join our team as a Data Scientist.
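The A/B-testing responsibility described above typically comes down to asking whether a change in a conversion rate is statistically significant. A minimal sketch, using a standard two-proportion z-test with purely illustrative conversion counts (none of these numbers come from the posting):

```python
from math import erf, sqrt

def two_proportion_z_test(x1, n1, x2, n2):
    """Two-sided z-test for a difference in conversion rates.

    x1/n1: conversions and sample size for the control group.
    x2/n2: conversions and sample size for the variant group.
    Returns (z_statistic, p_value).
    """
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)                        # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))  # standard error
    z = (p2 - p1) / se
    # Standard normal CDF via the error function (stdlib only).
    phi = lambda v: 0.5 * (1 + erf(v / sqrt(2)))
    p_value = 2 * (1 - phi(abs(z)))                       # two-sided
    return z, p_value

# Illustrative numbers: 20% vs 26% conversion on 1,000 users per arm.
z, p = two_proportion_z_test(200, 1000, 260, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05 -> the change is significant
```

In practice one would reach for `statsmodels` or `scipy` rather than hand-rolling the CDF, but the arithmetic is the same.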

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

The Team Lead - Business Analyst will be joining the Merchandising Analytics Team in Dania Beach, FL. Reporting directly to the Sr. Manager of Enterprise Data and BI, you will play a crucial role in delivering scalable data infrastructure, supporting data governance initiatives, and enhancing the self-service merchandising reporting environment. Collaboration with analysts, data engineers, and the corporate IT team will be key to anticipating data infrastructure requirements and proactively developing solutions to drive data-centric insights.

Your primary responsibilities will include demonstrating proficiency in handling and analyzing data sets comprising millions of records using SQL, Excel, and Tableau. You will be tasked with creating metrics and dashboards encompassing customer, marketing, ecommerce, and financial metrics. Additionally, the ability to leverage data visualization techniques to present intricate topics in a clear and concise manner will be essential.

Key Responsibilities:
- Collaborate with analytics, data engineering, and BI teams to develop and quality-assure new data tables and sources across various platforms
- Work closely with business leaders and end-users to enhance the Tableau experience and ensure actionable and insightful KPIs
- Assist the Sr. Manager of Enterprise BI in establishing and upholding stringent data governance standards within Tableau and underlying data sources
- Develop and maintain reusable SQL scripts to cater to diverse business cases and reporting demands
- Create informative and user-friendly Tableau dashboards using multiple data sources, parameters, and measures
- Conduct intricate analyses on customer behavior, marketing mix, and funnel performance by leveraging extensive data sets from multiple systems
- Collaborate with different analytics teams to design and implement scalable, self-service reporting for cross-functional stakeholders
- Partner with stakeholders to define and sequence product release roadmaps using tools such as JIRA, Confluence, and other workflow management and documentation platforms

Posted 1 week ago

Apply

6.0 - 10.0 years

18 - 25 Lacs

Noida

Work from Office

Job Title: Senior Data Warehouse Developer
Location: Noida, India

Position Overview: Working with the Finance Systems Manager, the role will ensure that the ERP system is available and fit for purpose. The ERP Systems Developer will develop the ERP system, provide comprehensive day-to-day support and training, and evolve the current ERP system for the future.

Key Responsibilities:
- As a Sr. DW BI Developer, participate in the design, development, customization, and maintenance of software applications.
- Analyze the different applications/products, and design and implement the DW using best practices.
- Apply rich data governance experience: data security, data quality, provenance/lineage.
- Maintain a close working relationship with the other application stakeholders.
- Develop secured and high-performance web applications.
- Apply knowledge of software development life-cycle methodologies, e.g. Iterative, Waterfall, Agile.
- Design and architect future releases of the platform.
- Participate in troubleshooting application issues.
- Work jointly with other teams and partners handling different aspects of the platform creation.
- Track advancements in software development technologies and apply them judiciously in the solution roadmap.
- Ensure all quality controls and processes are adhered to.
- Plan the major and minor releases of the solution, and ensure robust configuration management.
- Work closely with the Engineering Manager on different aspects of product lifecycle management.
- Work independently in a fast-paced environment requiring multitasking and efficient time management.

Required Skills and Qualifications:
- End-to-end lifecycle of data warehousing, data lakes, and reporting; experience maintaining/managing data warehouses.
- Design and development of large, scaled-out, real-time, high-performing Data Lake / Data Warehouse systems (including big data and cloud).
- Strong SQL and analytical skills; proficiency in writing and debugging complex SQL.
- Experience in Power BI, Tableau, QlikView, Qlik Sense, etc.
- Experience in Microsoft Azure services, including developing and supporting ADF pipelines; Azure SQL Server / Databricks / Azure Analysis Services; developing tabular models.
- Experience working with APIs.
- Minimum 2 years of experience in a similar role; 2-6 years of total experience in building DW/BI systems.
- Experience with data warehousing and data modelling; ETL and working with large-scale datasets.
- Prior experience working with global clients.
- Hands-on experience with Kafka, Flink, Spark, Snowflake, Airflow, NiFi, Oozie, Pig, Hive, Impala, Sqoop.
- Storage such as HDFS, object storage (S3 etc.), RDBMS, MPP, and NoSQL databases.
- Experience with distributed data management and failover, including databases (relational, NoSQL, big data), data analysis, data processing, data transformation, high availability, and scalability.
- End-to-end project implementation in the cloud (Azure / AWS / GCP) as a DW BI Developer.
- Rich data governance experience: data security, data quality, provenance/lineage.
- Understanding of industry trends and products in DataOps, continuous intelligence, augmented analytics, and AI/ML.

Nice-to-have Skills and Qualifications:
- Prior experience of working in a start-up culture
- Prior experience of working in Agile SAFe and PI Planning
- Prior experience of working in Ed-Tech/E-Learning companies
- Any relevant DW/BI certification
- Working knowledge of processing huge amounts of data, performance tuning, cluster administration, high availability and failover, backup/restore

Experience: 6-10 years
Educational Qualification(s): Bachelor's/Master's degree in Computer Science, Engineering, or equivalent
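The "complex SQL" and dimensional-modelling skills this posting asks for are easy to picture with a toy star schema. A minimal sketch using Python's built-in sqlite3; the table and column names are invented for illustration, not taken from the posting:

```python
import sqlite3

# In-memory warehouse with one fact table and one dimension table
# (a miniature star schema; names are illustrative).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales  (product_id INTEGER, amount REAL);
    INSERT INTO dim_product VALUES (1, 'Books'), (2, 'Toys');
    INSERT INTO fact_sales  VALUES (1, 10.0), (1, 15.0), (2, 7.5);
""")

# The bread-and-butter DW/BI query shape: join fact to dimension,
# aggregate a measure, group by a dimension attribute.
rows = conn.execute("""
    SELECT d.category, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_product d USING (product_id)
    GROUP BY d.category
    ORDER BY revenue DESC
""").fetchall()
print(rows)  # [('Books', 25.0), ('Toys', 7.5)]
```

Production warehouses (Azure SQL, Databricks, Snowflake) scale this pattern up, but the fact/dimension join-and-aggregate shape is the same.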

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Haryana

On-site

As a Product Manager at our company, you will be responsible for leveraging your expertise in product analytics and data management to contribute significantly to our product team. Your primary focus will involve analyzing user behavior, tracking product performance, and influencing the product roadmap based on data insights. By effectively extracting, exploring, and evaluating data, you will be able to spot opportunities, drive strategic product decisions, and provide value to our users and the business. In this role, you will be expected to blend analytical thinking with product strategy, seamlessly transitioning between in-depth data analysis and product implementation. Your passion for creating products driven by measurable outcomes will be instrumental in shaping our product offerings and achieving success. This position offers a unique opportunity to make a high impact within the organization and is ideal for individuals who are adept at navigating the intersection of data analytics and product development. If you are enthusiastic about building products that are informed by data-driven decisions and are eager to drive innovation, this role is perfect for you.

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Pune, Maharashtra

On-site

You should have 2-4 years of relevant experience in producing periodic, accurate, and timely client reports and presentations across multiple investment strategies for an asset manager or bank. Your responsibilities will include working with others to draft requirement documents, review data sets, source data accurately, standardize data for consistency, create reporting templates, perform testing, and implement suitable reporting templates in a timely manner. You must have a proven working knowledge of client reporting platforms, processes, and internal and external data providers, as well as a strong sense of risk mitigation. Coordinating information flow between relevant internal business areas to ensure accurate and timely completion of reports is crucial. Additionally, you should continuously maintain reporting templates, suggest ways to detect and resolve data issues, manage project timelines, and support functional testing and deployment. Experience in MS Excel, MS Power Query / BI, automation software like Alteryx, SQL, Simcorp Coric, object-oriented programming, TFS, .NET, and C#, plus an understanding of performance principles and calculations, is required. Your role will involve creating and maintaining SOPs, risk logs, and BCP, suggesting and implementing tools to automate and optimize reporting workflows, and resolving automation issues. You should have excellent communication and interpersonal skills, be highly motivated and independent, have a strong interest in solving complex problems, and be able to manage projects and deadlines effectively. Maintaining strong working relationships with external/internal clients and responding to queries in a timely manner is essential.

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Pune, Maharashtra

On-site

You should have 2-4 years of relevant experience in producing periodic, accurate, and timely client reports and presentations across multiple investment strategies for an asset manager or bank. It is essential to have a proven working knowledge of client reporting platforms, processes, and internal and external data providers, coupled with a strong sense of risk mitigation. Your responsibilities will include working with others to draft requirement documents, reviewing data sets and flows for suitability and quality, accurately sourcing data, standardizing data for consistency, creating reporting templates with accurate configurations, performing functional/integration testing, and implementing suitable reporting templates in a timely manner. Coordinating information flow between all relevant internal business areas to ensure all allocated reports are completed accurately and promptly is crucial.

You will need to continuously maintain reporting templates to ensure they are operating efficiently and fit for purpose. Identifying and resolving data issues promptly is essential. Managing project timelines, supporting functional/integration testing and deployment, and creating and maintaining SOPs, risk logs, and BCP are also part of your responsibilities. Proficiency in MS Excel, MS Power Query/BI, automation software like Alteryx, and SQL is required. You should be able to suggest and implement tools to automate and optimize reporting workflows and mitigate risks. Experience with Simcorp Coric, object-oriented programming, TFS, .NET, and C# would be highly beneficial. A strong understanding of the end-to-end process of data collection, data review, and report production, including performance principles and calculations, is essential. You should demonstrate an understanding of how incorrect data points can impact reporting. Excellent communication and interpersonal skills, along with a command of English, are necessary. You must be highly motivated and independent, have a strong interest in solving complex problems, and demonstrate a proven ability to manage projects and deadlines effectively. Maintaining strong working relationships with all external/internal clients, ensuring clear and accurate communication, and responding to queries promptly are critical aspects of this role.
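The "standardizing data for consistency" step above is the kind of normalization that sits in front of every reporting template: raw feeds arrive with mixed date formats and formatted currency strings. A minimal sketch; the field names and accepted formats are invented for illustration, since real feeds would be mapped from the firm's data providers:

```python
from datetime import datetime

def standardize_row(row):
    """Normalize one raw record before it feeds a reporting template.

    Hypothetical fields: 'as_of_date' (string in one of several formats)
    and 'market_value' (string that may carry thousands separators).
    """
    # Dates can arrive in several formats; emit ISO 8601 consistently.
    for fmt in ("%d/%m/%Y", "%Y-%m-%d", "%d-%b-%Y"):
        try:
            row["as_of_date"] = datetime.strptime(row["as_of_date"], fmt).date().isoformat()
            break
        except ValueError:
            continue
    else:
        # Surface unknown formats instead of silently passing them through.
        raise ValueError(f"unrecognised date: {row['as_of_date']}")

    # Currency values may carry thousands separators; store as float.
    row["market_value"] = float(str(row["market_value"]).replace(",", ""))
    return row

print(standardize_row({"as_of_date": "31/01/2024", "market_value": "1,250,000.50"}))
```

In an MS Power Query or Alteryx workflow the same logic would live in a transformation step; the point is that every downstream report sees one canonical format.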

Posted 3 weeks ago

Apply

14.0 - 18.0 years

30 - 40 Lacs

Bengaluru

Work from Office

Total Experience: 14 to 18 years
Location: Bangalore
Skills Required:
- Programming languages: Python, R, SQL
- Machine learning: predictive maintenance, driver behaviour
- Deep learning: ADAS
- Data visualization: Tableau, Power BI
- Big data: Spark, Hadoop
- Cloud technology: AWS / GCP
- Automotive datasets, CAN data analysis

Posted 3 weeks ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Hyderabad

Hybrid

Job Summary: We are looking for a Quality Engineer - Data to ensure the reliability, accuracy, and performance of data pipelines and AI/ML models in our SmartFM platform. This role is essential for delivering trustworthy data and actionable insights to optimize smart building operations.

Roles and Responsibilities:
- Design and implement QA strategies for data pipelines and ML models.
- Test data ingestion and streaming systems (StreamSets, Kafka) for accuracy and completeness.
- Validate data stored in MongoDB, ensuring schema and data integrity.
- Collaborate with Data Engineers to proactively address data quality issues.
- Work with Data Scientists to test and validate ML/DL/LLM/agentic-workflow models.
- Automate data validation and model testing using tools like Pytest, Great Expectations, and Deepchecks.
- Monitor production pipelines for data drift, model degradation, and performance issues.
- Participate in code reviews and create detailed QA documentation.
- Continuously improve QA processes based on industry best practices.

Required Technical Skills:
- 5-10 years of experience in QA, with a focus on data and ML testing.
- Proficient in SQL for complex data validation.
- Hands-on with StreamSets, Kafka, and MongoDB.
- Python scripting for test automation.
- Familiarity with ML model testing, metrics, and bias detection.
- Experience with cloud platforms (Azure, AWS, or GCP).
- Understanding of Node.js and React-based systems is a plus.
- Experience with QA tools like Pytest, Great Expectations, and Deepchecks.

Additional Qualifications:
- Excellent communication and documentation skills.
- Strong analytical mindset and attention to detail.
- Experience working cross-functionally with Engineers, Scientists, and Product teams.
- Passion for learning new technologies and QA frameworks.
- Domain knowledge in facility management, IoT, or building automation is a plus.
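The automated data-validation work this role describes often starts as plain Pytest-style checks over records before graduating to frameworks like Great Expectations. A minimal sketch; the schema, field names, and range rule are invented for illustration (the posting's SmartFM data model is not public):

```python
# Hypothetical schema for a smart-building sensor record.
REQUIRED_FIELDS = {"sensor_id": str, "reading": float, "building": str}

def validate_record(record):
    """Return a list of schema/integrity problems found in one record."""
    problems = []
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in record or record[field] is None:
            problems.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            problems.append(f"bad type for {field}: {type(record[field]).__name__}")
    # Domain rule (assumed): temperature readings must be physically plausible.
    if isinstance(record.get("reading"), float) and not (-50.0 <= record["reading"] <= 150.0):
        problems.append("reading out of range")
    return problems

# Written as pytest-discoverable test functions; called directly here
# so the sketch runs standalone as well.
def test_clean_record_passes():
    assert validate_record({"sensor_id": "t-1", "reading": 21.5, "building": "HQ"}) == []

def test_bad_record_is_flagged():
    problems = validate_record({"sensor_id": "t-2", "reading": 999.0, "building": None})
    assert "reading out of range" in problems
    assert "missing field: building" in problems

test_clean_record_passes()
test_bad_record_is_flagged()
print("all data-quality checks passed")
```

The same assertions map naturally onto Great Expectations expectations (column types, null checks, value ranges) once the suite needs to run against whole MongoDB collections rather than single records.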

Posted 3 weeks ago

Apply

3.0 - 6.0 years

20 - 35 Lacs

Hyderabad, Bengaluru, Mumbai (All Areas)

Work from Office

YOUR ROLE AND RESPONSIBILITIES:
- Develop and implement comprehensive pricing strategies for new and existing products throughout their lifecycle
- Conduct in-depth market research to inform pricing decisions in partnership with the US team
- Collaborate with cross-functional teams, including Managed Markets/Market Access, Government Affairs, Medical Affairs, Regulatory, and Finance, to align pricing strategies with business objectives
- Partner closely with the US Market Access Payer Team to analyze customer data, develop insights, and provide recommendations to improve formulary coverage through rebates
- Track the performance of improvements in formulary coverage as it relates to gross-to-net
- Assist with conducting monthly, quarterly, and annual financial reviews
- Monitor and analyze pricing performance, providing actionable insights to senior leadership

WHO YOU ARE:
- Strong analytical and problem-solving skills
- 3+ years of experience in a Global Capabilities Center environment supporting remote teams
- MBA / Master's in Finance / Master's in Economics (Health Economics) preferred
- Intermediate to advanced skills in Microsoft programs, specifically Excel and PowerPoint
- Working knowledge of and exposure to developing data input forms on systems like Symphony / IQVIA for the payer team
- Good understanding of the pharmaceutical industry in the United States / Western European market
- Familiarity with health economics and outcomes research (HEOR) methodologies
- Ability to work independently and in a virtual team environment
- Ability to multitask and manage multiple deliverables and projects at the same time

Posted 1 month ago

Apply

8.0 - 13.0 years

35 - 40 Lacs

Bengaluru

Remote

Role & responsibilities: We are looking for an MLOps/ML Engineer with Dataiku DSS platform experience for a permanent, remote position with an MNC.

Preferred candidate profile: We are seeking a skilled MLOps/ML Engineer to serve as our subject matter expert for Dataiku DSS. In this pivotal role, you will manage and scale our end-to-end machine learning operations, all of which are built on the Dataiku platform. Key responsibilities include designing automated data pipelines, deploying models as production APIs, ensuring the reliability of scheduled jobs, and championing platform best practices. Extensive, proven experience with Dataiku is mandatory.

- Data Pipeline Development: Design and implement Extract, Transform, Load (ETL) processes to collect, process, and analyze data from diverse sources.
- Workflow Optimization: Develop, configure, and optimize Dataiku DSS workflows to streamline data processing and machine learning operations.
- Integration: Integrate Dataiku DSS with cloud platforms (e.g., AWS, Azure, Google Cloud Platform) and big data technologies such as Snowflake, Hadoop, and Spark.
- AI/ML Model Development & Implementation: Implement and optimize machine learning models within Dataiku for predictive analytics and AI-driven solutions.
- MLOps & DataOps: Deploy data pipelines and AI/ML models within the Dataiku platform.
- Dataiku Platform Management: Build, manage, and support the Dataiku platform.
- Automation: Automate data workflows, monitor job performance, and ensure scalable execution.
- Customization & Project Management: Develop and maintain custom Python/R scripts within Dataiku to enhance analytics capabilities.

Required Skills and Qualifications:
- Experience Level: 2 to 6 years of hands-on experience with the Dataiku DSS platform and data engineering.
- Educational Background: Bachelor's or Master's degree in Computer Science, Data Science, Information Technology, or a related field.
- Technical Proficiency: Experience with the Dataiku DSS platform; strong programming skills in Python and SQL; familiarity with cloud services (AWS, Azure, GCP) and big data technologies (Hadoop, Spark).
- Analytical Skills: Ability to analyze complex data sets and provide actionable insights.
- Problem-Solving: Strong troubleshooting skills to address and resolve issues in data workflows and models.
- Communication: Effective verbal and written communication skills to collaborate with team members and stakeholders.
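Stripped of the platform, the ETL responsibility named above is always the same three stages: extract raw data, transform it (type-cast, filter, enrich), and load it to a destination. A generic stdlib sketch of that shape; it deliberately does not use the Dataiku API (a real Dataiku recipe would read and write managed datasets instead), and the CSV data is invented:

```python
import csv
import io

# Illustrative raw input; a real pipeline would pull from files, APIs, or tables.
RAW = """city,temp_c
Pune,31
Noida,44
Bengaluru,27
Hyderabad,39
"""

def extract(text):
    """Extract: parse raw CSV into a list of dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows, threshold=35.0):
    """Transform: cast types, then keep only readings above the threshold."""
    typed = [{"city": r["city"], "temp_c": float(r["temp_c"])} for r in rows]
    return [r for r in typed if r["temp_c"] > threshold]

def load(rows):
    """Load: render back to CSV; a real job would write to a target dataset."""
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=["city", "temp_c"])
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()

result = load(transform(extract(RAW)))
print(result)
```

In Dataiku the extract/transform/load stages map onto connections, recipes, and output datasets in the Flow; the code above only illustrates the data movement, not the platform's API.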

Posted 1 month ago

Apply

10.0 - 15.0 years

20 - 27 Lacs

Bengaluru, Delhi / NCR, Mumbai (All Areas)

Work from Office

Role & responsibilities:
- Strong proficiency in IBM Cognos Analytics (Report Studio, Query Studio, Analysis Studio, Framework Manager)
- Create models in Cognos; source the data from the Cognos model (data mart, datasets)
- Develop IBM Cognos data sets, data modules, reports, and dashboards/visualizations
- SQL knowledge (writing/debugging/cleaning); strong SQL skills and experience with data warehousing concepts
- Manage and maintain the Cognos Framework Manager packages
- Create and maintain reports, dashboards, jobs, schedules, and data models using IBM Cognos Analytics

Nice-to-have:
- Knowledge of Oracle RDBMS 12c or higher, logical data models, data modelling
- Certification in IBM Cognos Analytics
- CI/CD tooling: Azure DevOps, Git, Ansible
- Experience in the lending business, the LoanIQ application, and its data model

Posted 1 month ago

Apply