1.0 - 3.0 years
1 - 4 Lacs
Noida
Work from Office
JOB DESCRIPTION FOR DATA ANALYST

Job Title: Data Analyst
Location: Noida
Experience Required: 1–3 Years
Department: Data & Analytics / Business Intelligence
Industry: SaaS / Technology
Employment Type: Full-Time

About the Role:
We are looking for a proactive and curious Data Analyst with 1–3 years of experience to join our growing SaaS team. In this role, you'll work with large volumes of product, customer, and operational data to generate insights that improve product decisions, customer experience, and business growth. You'll partner with teams across Product, Marketing, Sales, and Customer Success to solve real-world business problems with data.

Key Responsibilities:
• Collect, clean, and validate data from various internal SaaS platforms (e.g., CRM, product analytics, marketing tools)
• Generate recurring and ad hoc reports for business stakeholders
• Build dashboards and visualizations using tools like Power BI, Tableau, or Looker
• Analyze user behavior and product performance metrics to support decision-making
• Assist in setting up tracking for key SaaS KPIs such as user engagement, churn, MRR, and CLTV
• Collaborate with Product Owners, Marketers, and Sales teams to define and interpret data requirements
• Identify data inconsistencies or inefficiencies and recommend solutions
• Support data-driven A/B testing initiatives and user behavior segmentation

Requirements:
• Bachelor's degree in Mathematics, Statistics, Computer Science, Engineering, or a related field
• 1–3 years of relevant experience in data analytics, preferably in a SaaS or tech-based environment
• Solid knowledge of SQL and experience querying relational databases
• Hands-on experience with Excel/Google Sheets and at least one visualization tool (Power BI, Tableau, Looker, etc.)
• Strong analytical thinking with attention to detail
• Ability to present data insights clearly to both technical and non-technical stakeholders
• Basic understanding of SaaS metrics such as CAC, MRR, ARR, retention, and funnel analysis

Good to Have:
• Familiarity with product analytics tools like Mixpanel, Amplitude, or Google Analytics
• Exposure to scripting languages like Python or R for data analysis
• Knowledge of CRM tools like Salesforce or HubSpot
• Experience working in an Agile environment with cross-functional teams

What We Offer:
• Opportunity to work in a fast-paced SaaS company with a data-driven culture
• Career development and mentorship in data analytics
• Flexible work hours and remote work options
• Exposure to the entire SaaS business lifecycle and growth operations
• A collaborative environment that encourages continuous learning

If you are interested, please share your CV at harshit.tripathi@gmail.com
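The SaaS KPIs named above (MRR, churn) reduce to straightforward SQL over a subscription table. A minimal sketch using an in-memory SQLite database; the table, columns, and numbers are illustrative, not from any specific CRM:

```python
import sqlite3

# Hypothetical subscription snapshot: one row per customer per month,
# carrying that month's recurring revenue.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE subscriptions (customer_id TEXT, month TEXT, mrr REAL);
INSERT INTO subscriptions VALUES
  ('c1', '2024-01', 100), ('c2', '2024-01', 50), ('c3', '2024-01', 75),
  ('c1', '2024-02', 100), ('c3', '2024-02', 75);  -- c2 churned in Feb
""")

# Total MRR per month
mrr = dict(conn.execute(
    "SELECT month, SUM(mrr) FROM subscriptions GROUP BY month"))

# Customer churn: January customers with no February row
(churned,) = conn.execute("""
    SELECT COUNT(*) FROM subscriptions a
    WHERE a.month = '2024-01'
      AND NOT EXISTS (SELECT 1 FROM subscriptions b
                      WHERE b.customer_id = a.customer_id
                        AND b.month = '2024-02')
""").fetchone()

print(mrr)      # {'2024-01': 225.0, '2024-02': 175.0}
print(churned)  # 1
```

In a real warehouse the same anti-join pattern runs against the billing tables; only the schema names change.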
Posted 1 month ago
3.0 - 7.0 years
9 - 13 Lacs
Jaipur
Work from Office
Job Summary
Auriga IT is seeking a proactive, problem-solving Data Analyst with 3–5 years of experience owning end-to-end data pipelines. You'll partner with stakeholders across engineering, product, marketing, and finance to turn raw data into actionable insights that drive business decisions. You must be fluent in the core libraries, tools, and cloud services listed below.

Your Responsibilities:
• Pipeline Management: design, build, and maintain ETL/ELT workflows using orchestration frameworks (e.g., Airflow, dbt).
• Exploratory Data Analysis & Visualization: perform EDA and statistical analysis using Python or R; prototype and deliver interactive charts and dashboards.
• BI & Reporting: develop dashboards and scheduled reports to surface KPIs and trends; configure real-time alerts for data anomalies or thresholds.
• Insights Delivery & Storytelling: translate complex analyses (A/B tests, forecasting, cohort analysis) into clear recommendations; present findings to both technical and non-technical audiences.
• Collaboration & Governance: work cross-functionally to define data requirements, ensure quality, and maintain governance; mentor junior team members on best practices in code, version control, and documentation.

Key Skills:
You must know at least one technology from each category below:
• Data Manipulation & Analysis: Python (pandas, NumPy) or R (tidyverse: dplyr, tidyr)
• Visualization & Dashboarding: Python (matplotlib, seaborn, Plotly) or R (ggplot2, Shiny)
• BI Platforms: commercial or open-source (e.g., Tableau, Power BI, Apache Superset, Grafana)
• ETL/ELT Orchestration: Apache Airflow, dbt, or equivalent
• Cloud Data Services: AWS (Redshift, Athena, QuickSight), GCP (BigQuery, Data Studio), or Azure (Synapse, Data Explorer)
• Databases & Querying: strong SQL skills with an RDBMS (PostgreSQL, MySQL, Snowflake); decent knowledge of NoSQL databases

Additionally:
• Bachelor's or Master's in a quantitative field (Statistics, CS, Economics, etc.).
• 3–5 years in a data analyst (or similar) role with end-to-end pipeline ownership.
• Strong problem-solving mindset and excellent communication skills.
• Certification in Power BI or Tableau is a plus.

Desired Skills & Attributes
• Familiarity with version control (Git) and CI/CD for analytics code.
• Exposure to basic machine-learning workflows (scikit-learn, caret).
• Comfortable working in Agile/Scrum environments and collaborating across domains.
Posted 1 month ago
2.0 - 6.0 years
7 - 11 Lacs
Pune
Work from Office
Role Overview:
As a Senior Data Analyst at Hevo, you will leverage your SQL skills and analytical expertise to manage, process, and report data, driving insights across the organization. You will focus on reporting, forecasting, and presenting key metrics to business leaders while collaborating with stakeholders to support strategic decision-making.

Key Responsibilities
• Query large datasets using SQL to extract and manipulate data.
• Maintain and optimize databases on the data warehouse.
• Prepare and present weekly business reviews (WBRs), forecasts, and track key metrics.
• Drive analytics projects related to customer funnels and lead acquisition, uncover insights, and report findings to leadership.
• Collaborate with cross-functional teams to execute WBRs and track follow-up actions.
• Lead and manage end-to-end analytics projects with minimal oversight and mentor junior team members.
• Continuously challenge and improve metrics by aligning them with industry standards.

What are we looking for
• 3–6 years of experience in a quantitative analyst role (preferably in B2B SaaS, growth analytics, or revenue operations).
• Proficiency in SQL and experience working with large datasets.
• Experience using Tableau, Looker, or similar tools to create dashboards and report insights.
• Strong communication skills, with the ability to present data to both technical and non-technical audiences.
• Bonus: experience with executive or rev-ops reporting.
• Ability to manage multiple projects simultaneously and drive deliverables with minimal oversight.

Key elements needed to succeed in this role
• Attention to detail
• Diagnosing the problem
• Continuous learning mindset
• Ability to solve complex, open-ended problems
Posted 1 month ago
3.0 - 8.0 years
4 - 8 Lacs
Mumbai, Gurugram, Bengaluru
Work from Office
Job Title: S&C Global Network - AI - Retail - Consultant - Retail Specialized Data Scientist (Level 9, S&C GN Data & AI)
Management Level: 09 - Consultant
Location: Bangalore / Gurgaon / Mumbai / Chennai / Pune / Hyderabad / Kolkata

Must have skills:
• A solid understanding of retail industry dynamics, including key performance indicators (KPIs) such as sales trends, customer segmentation, inventory turnover, and promotions.
• Strong ability to communicate complex data insights to non-technical stakeholders, including senior management, marketing, and operational teams.
• Meticulous in ensuring data quality, accuracy, and consistency when handling large, complex datasets.
• Gather and clean data from various retail sources, such as sales transactions, customer interactions, inventory management, website traffic, and marketing campaigns.
• Strong proficiency in Python for data manipulation, statistical analysis, and machine learning (libraries like Pandas, NumPy, Scikit-learn).
• Expertise in supervised and unsupervised learning algorithms.
• Use advanced analytics to optimize pricing strategies based on market demand, competitor pricing, and customer price sensitivity.

Good to have skills:
• Familiarity with big data processing platforms like Apache Spark, Hadoop, or cloud-based platforms such as AWS or Google Cloud for large-scale data processing.
• Experience with ETL (Extract, Transform, Load) processes and tools like Apache Airflow to automate data workflows.
• Familiarity with designing scalable and efficient data pipelines and architecture.
• Experience with tools like Tableau, Power BI, Matplotlib, and Seaborn to create meaningful visualizations that present data insights clearly.

Job Summary:
The Retail Specialized Data Scientist will play a pivotal role in utilizing advanced analytics, machine learning, and statistical modeling techniques to help our retail business make data-driven decisions. This individual will work closely with teams across marketing, product management, supply chain, and customer insights to drive business strategies and innovations. The ideal candidate should have experience in retail analytics and the ability to translate data into actionable insights.

Roles & Responsibilities:
• Leverage retail knowledge: utilize your deep understanding of the retail industry (merchandising, customer behavior, product lifecycle) to design AI solutions that address critical retail business needs.
• Gather and clean data from various retail sources, such as sales transactions, customer interactions, inventory management, website traffic, and marketing campaigns.
• Apply machine learning algorithms, such as classification, clustering, regression, and deep learning, to enhance predictive models.
• Use AI-driven techniques for personalization, demand forecasting, and fraud detection.
• Use advanced statistical methods to help optimize existing use cases and build new products that serve new challenges and use cases.
• Stay updated on the latest trends in data science and retail technology.
• Collaborate with executives, product managers, and marketing teams to translate insights into business actions.

Professional & Technical Skills:
• Strong analytical and statistical skills.
• Expertise in machine learning and AI.
• Experience with retail-specific datasets and KPIs.
• Proficiency in data visualization and reporting tools.
• Ability to work with large datasets and complex data structures.
• Strong communication skills to interact with both technical and non-technical stakeholders.
• A solid understanding of the retail business and consumer behavior.
• Programming Languages: Python, R, SQL, Scala
• Data Analysis Tools: Pandas, NumPy, Scikit-learn, TensorFlow, Keras
• Visualization Tools: Tableau, Power BI, Matplotlib, Seaborn
• Big Data Technologies: Hadoop, Spark, AWS, Google Cloud
• Databases: SQL, NoSQL (MongoDB, Cassandra)

Additional Information:
Experience: Minimum 3 year(s) of experience is required.
Educational Qualification: Bachelor's or Master's degree in Data Science, Statistics, Computer Science, Mathematics, or a related field.
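As an illustration of the customer-segmentation work this listing describes, here is a minimal, dependency-free RFM (recency, frequency, monetary) scoring sketch. The transactions and the segment cut-offs are invented for the example; real retail segmentation would run over the sales-transaction sources named above, often with clustering rather than fixed rules:

```python
from datetime import date

# Made-up transactions: (customer, purchase date, amount)
transactions = [
    ("alice", date(2024, 3, 1), 120.0),
    ("alice", date(2024, 3, 20), 80.0),
    ("bob",   date(2023, 11, 5), 40.0),
    ("carol", date(2024, 3, 25), 300.0),
]
today = date(2024, 4, 1)

def rfm(customer):
    """Recency (days since last purchase), frequency, and total spend."""
    rows = [t for t in transactions if t[0] == customer]
    recency = min((today - d).days for _, d, _ in rows)
    frequency = len(rows)
    monetary = sum(amt for _, _, amt in rows)
    return recency, frequency, monetary

def segment(customer):
    r, f, m = rfm(customer)
    if r <= 30 and m >= 150:
        return "high-value"   # bought recently, spends a lot
    if r > 90:
        return "at-risk"      # no recent purchases
    return "regular"

print({c: segment(c) for c in ("alice", "bob", "carol")})
```

The same recency/frequency/monetary features are a common starting point for the unsupervised approaches (e.g., k-means on scaled RFM vectors) the listing asks for.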
Posted 1 month ago
8.0 - 10.0 years
5 - 9 Lacs
Noida
Work from Office
We are looking for a skilled Power BI Dashboarding and Visualization Developer with 8 to 10 years of experience. The ideal candidate will have a strong background in designing and developing interactive dashboards and visualizations using Power BI, as well as integrating and optimizing Power BI solutions within cloud environments.

Roles and Responsibility
• Design and develop interactive dashboards and visualizations using Power BI.
• Integrate and optimize Power BI solutions within AWS and Azure environments.
• Collaborate with business users to gather requirements and deliver insights.
• Ensure data accuracy, security, and performance.
• Develop and maintain complex data models and reports using Power BI.
• Troubleshoot and resolve issues related to Power BI dashboard development.

Job Requirements
• Strong experience with Power BI, including DAX, Power Query, and data modeling.
• Proficiency in SQL for querying and data manipulation.
• Familiarity with data warehouses such as Redshift, Snowflake, or Synapse.
• Knowledge of Azure Data Factory and AWS Glue for data integration.
• Understanding of REST APIs and integrating external data sources.
• Experience with Git for version control and CI/CD pipelines.
• Excellent communication and problem-solving skills.
Posted 1 month ago
6.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
We are looking for a skilled professional with 6 to 8 years of experience to join our team as an SME in the Employment Firms/Recruitment Services Firms industry.

Roles and Responsibility
• Collaborate with and manage the team to achieve high performance.
• Make key decisions and contribute to strategic planning.
• Provide solutions to problems across multiple teams.
• Lead the team in implementing new technologies and conducting regular code reviews.
• Engage with multiple teams to drive business growth.
• Develop and implement data analytics and visualization techniques.

Job Requirements
• Proficiency in SAS BI is mandatory.
• Strong understanding of data manipulation and transformation techniques.
• Experience with database management systems.
• Hands-on experience in application development using SAS BI.
• Ability to work collaboratively and manage teams effectively.
• Strong problem-solving skills and attention to detail.
Posted 1 month ago
2.0 - 4.0 years
7 - 11 Lacs
Noida
Work from Office
We are looking for a highly skilled Data Scientist with 2 to 4 years of experience to join our team in Gurugram. The ideal candidate will have a strong background in statistics, mathematics, computer science, or another quantitative field.

Roles and Responsibility
• Participate and collaborate with operations teams to understand business requirements and deliver scalable solutions.
• Conduct thorough research on state-of-the-art generative AI techniques and apply them to enhance LLM models.
• Create models using classification and regression techniques and perform hypothesis testing and feature selection.
• Analyze user behavior, conversion data, and customer journeys to generate hypotheses and build optimization plans.
• Mine and analyze data from company databases to drive product development and business strategy improvements.
• Perform deep-dive analyses on key business trends and package insights into easily consumable presentations and documents.

Job Requirements
• Education in Statistics, Mathematics, Computer Science, or another quantitative field.
• Experience using programming languages (Python, SQL) to manipulate data and draw insights from large datasets.
• Proficiency in working with Large Language Models (LLMs), such as GPT-3, GPT-4, or similar models.
• Strong programming skills in Python and experience with deep learning libraries/frameworks (e.g., TensorFlow, PyTorch).
• Effective communication and collaboration skills, with the ability to work effectively in a multidisciplinary team environment.
• Familiarity with natural language processing (NLP) and understanding of various NLP tasks is a plus.
• Prior experience in handling LLM use cases or working with large-scale language models is highly desirable.
• Ensure data quality and integrity through best practices and compliance with framework, architecture, and coding standards.
• Experience with cloud computing platforms (preferably Azure) for deploying and managing machine learning models.
• Hands-on experience with generative AI models and techniques such as GANs, VAEs, and transformer models.
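For conversion-rate experiments, the hypothesis-testing responsibility above often comes down to a two-proportion z-test. A self-contained sketch using only the standard library; the visitor and conversion counts are made up:

```python
from math import sqrt, erf

def ab_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test. Returns (z, two-sided p-value)
    for H0: the two conversion rates are equal."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Standard normal CDF: Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Variant B converts at 6.5% vs. A's 5.0% on 4,000 visitors each
z, p = ab_test(conv_a=200, n_a=4000, conv_b=260, n_b=4000)
print(round(z, 2), round(p, 4))
```

With these numbers the lift is significant at the usual 5% level; in practice one would also fix the sample size in advance and check guardrail metrics.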
Posted 1 month ago
5.0 - 10.0 years
4 - 8 Lacs
Noida
Work from Office
We are looking for a skilled Database Engineer with 5 to 10 years of experience to design, develop, and maintain our database infrastructure. This position is based remotely.

Roles and Responsibility
• Design, develop, and maintain the database infrastructure to store and manage company data efficiently and securely.
• Work with databases of varying scales, including small-scale and big data processing.
• Implement data security measures to protect sensitive information and comply with relevant regulations.
• Optimize database performance by analyzing query execution plans, implementing indexing strategies, and improving data retrieval and storage mechanisms.
• Collaborate with cross-functional teams to understand data requirements and support the design of the database architecture.
• Migrate data from spreadsheets or other sources to relational database systems or cloud-based solutions like Google BigQuery and AWS.
• Develop import workflows and scripts to automate data import processes.
• Ensure data integrity and enforce data quality standards, including data validation rules, constraints, and referential integrity.
• Monitor database health and resolve issues, while collaborating with the full-stack web developer to implement efficient data access and retrieval mechanisms.
• Demonstrate creativity in problem-solving and contribute ideas for improving data engineering processes and workflows, exploring third-party technologies as alternatives to legacy approaches for efficient data pipelines.
• Embrace a learning mindset, staying updated with emerging database technologies, tools, and best practices, and use Python for tasks such as data manipulation, automation, and scripting.
• Collaborate with the Data Research Engineer to estimate development efforts and meet project deadlines, taking accountability for achieving development milestones.
• Prioritize tasks to ensure timely delivery in a fast-paced environment with rapidly changing priorities, while collaborating with fellow members of the Data Research Engineering Team as required.
• Perform tasks with precision and build reliable systems, leveraging online resources (Stack Overflow, ChatGPT, Bard, etc.) effectively while considering their capabilities and limitations.

Job Requirements
• Proficiency in SQL and relational database management systems like PostgreSQL or MySQL, along with database design principles.
• Strong familiarity with Python for scripting and data manipulation tasks, with additional knowledge of Python OOP being advantageous.
• Demonstrated problem-solving skills with a focus on optimizing database performance and automating data import processes.
• Knowledge of cloud-based databases like AWS RDS and Google BigQuery.
• Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting.
• Skills in working with APIs for data ingestion or connecting third-party systems, which could streamline data acquisition processes.
• Proficiency with tools like Prometheus, Grafana, or the ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting.
• Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions).
• Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration.
• Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL.
• Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark.
• Knowledge of SQL and understanding of database design principles, normalization, and indexing.
• Knowledge of data migration, ETL (Extract, Transform, Load) processes, and integrating data from various sources.
• Knowledge of data security best practices, including access controls, encryption, and compliance standards.
• Strong problem-solving and analytical skills with attention to detail.
• Creative and critical thinking.
• Strong willingness to learn and expand knowledge in data engineering.
• Familiarity with Agile development methodologies is a plus.
• Experience with version control systems, such as Git, for collaborative development.
• Ability to thrive in a fast-paced environment with rapidly changing priorities.
• Ability to work collaboratively in a team environment.
• Good and effective communication skills.
• Comfortable with autonomy and the ability to work independently.

About the Company
Marketplace is an experienced team of industry experts dedicated to helping readers make informed decisions and choose the right products with ease. We arm people with trusted advice and guidance, so they can make confident decisions and get back to doing the things they care about most.
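One responsibility this listing names, analyzing query execution plans and applying indexing strategies, can be sketched with SQLite's `EXPLAIN QUERY PLAN`. The schema, data, and index name are illustrative; the same before/after comparison applies to `EXPLAIN` in PostgreSQL or MySQL:

```python
import sqlite3

# Toy table: 1,000 orders across 100 hypothetical customers
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id TEXT, total REAL)")
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                 [(f"c{i % 100}", float(i)) for i in range(1000)])

def plan(sql):
    """Concatenate the 'detail' column of SQLite's query-plan rows."""
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT SUM(total) FROM orders WHERE customer_id = 'c7'"
before = plan(query)   # without an index: full table scan
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = plan(query)    # with the index: indexed search

print(before)
print(after)
```

The plan text switches from a SCAN of `orders` to a SEARCH using `idx_orders_customer`, which is exactly the signal an engineer looks for when tuning a slow filter or join.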
Posted 1 month ago
5.0 - 8.0 years
12 - 17 Lacs
Pune
Work from Office
Service Control owns service management process definition, standardization, and implementation. As an IT Change Enablement Expert, you will drive the adaptation of the Change Enablement process to align with Continuous Integration and Continuous Delivery requirements. You will also have the opportunity to liaise with experts to align the Change Enablement process with new automation concepts.

Responsibilities
• Develop Change Enablement digital strategy processes with key delivery areas of continuous integration / continuous delivery, DevOps, automation, and data analytics evolution.
• Manage and deliver the day-to-day Change Enablement workflow tasks with the team, such as reviews and approvals, ensuring risk controls are adhered to.
• Prepare deep-dive data analytics and change reporting feeding into the relevant governance meetings.
• Engage and collaborate with the relevant Release Managers, Engineering, Operational Risk, and Audit teams, and drive delivery of tasks in the context of Change Enablement.
• Proactively identify operational readiness and Change Enablement process improvement opportunities.

Skills

Must have
• Experience in IT Change Enablement/Change Management for large organizations.
• ITIL 3 or 4 experience and proficiency with the Microsoft suite of tools.
• A thorough understanding of the risks involved in the System Delivery Lifecycle (SDLC) and change environments.
• Knowledge of Continuous Integration and Continuous Delivery concepts and toolsets such as ServiceNow and GitLab.
• Familiarity with DevOps, agile methodologies, and ways of working.
• Strong analytical, reporting, and data manipulation skills.
• Attention to detail in all tasks, particularly in identifying gaps in operational processes.
• Programming and/or automation skills (irrespective of toolset/technology).
• An enthusiastic and dynamic team player who is able to integrate and work successfully in a global team.
• An excellent communicator, with strong interpersonal and stakeholder management skills.
• Quick to produce accurate, concise information to tight deadlines, and to present that information professionally to the required audience.
• Excellent team player with a "can-do" attitude.
• A balance of engineering and service mindset.
• Experience with ServiceNow and M365 suite tooling.

Nice to have
• DevOps Foundation certification.
• Experience with GitLab CI/CD, AI tooling (Copilot, etc.), and reporting platforms (e.g., Power BI) would be a benefit but is not essential.

Other
Languages: English: C1 Advanced
Seniority: Senior
Posted 1 month ago
5.0 - 10.0 years
7 - 11 Lacs
Pune
Work from Office
About the Role
We're looking for a Data Engineer to help build reliable and scalable data pipelines that power reports, dashboards, and business decisions at Hevo. You'll work closely with engineering, product, and business teams to make sure data is accurate, available, and easy to use.

Key Responsibilities
• Independently design and implement scalable ELT workflows using tools like Hevo, dbt, Airflow, and Fivetran.
• Ensure the availability, accuracy, and timeliness of datasets powering analytics, dashboards, and operations.
• Collaborate with Platform and Engineering teams to address issues related to ingestion, schema design, and transformation logic.
• Escalate blockers and upstream issues proactively to minimize delays for stakeholders.
• Maintain strong documentation and ensure discoverability of all models, tables, and dashboards.
• Own end-to-end pipeline quality, minimizing escalations or errors in models and dashboards.
• Implement data observability practices such as freshness checks, lineage tracking, and incident alerts.
• Regularly audit and improve accuracy across business domains.
• Identify gaps in instrumentation, schema evolution, and transformation logic.
• Ensure high availability and data freshness through monitoring, alerting, and incident resolution processes.
• Set up internal SLAs, runbooks, and knowledge bases (data catalog, transformation logic, FAQs).
• Improve onboarding material and templates for future engineers and analysts.

Required Skills & Experience
• 3–5 years of experience in Data Engineering, Analytics Engineering, or related roles.
• Proficient in SQL and Python for data manipulation, automation, and pipeline creation.
• Strong understanding of ELT pipelines, schema management, and data transformation concepts.
• Experience with the modern data stack: dbt, Airflow, Hevo, Fivetran, Snowflake, Redshift, or BigQuery.
• Solid grasp of data warehousing concepts: OLAP/OLTP, star/snowflake schemas, relational and columnar databases.
• Understanding of REST APIs, webhooks, and event-based data ingestion.
• Strong debugging skills and the ability to troubleshoot issues across systems.

Preferred Background
• Experience in high-growth industries such as eCommerce, FinTech, or hyper-commerce environments.
• Experience working with or contributing to a data platform (ELT/ETL tools, observability, lineage, etc.).

Core Competencies
• Excellent communication and problem-solving skills
• Attention to detail and a self-starter mindset
• High ownership and urgency in execution
• Collaborative and coachable team player
• Strong prioritization and resilience under pressure
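The freshness checks mentioned under data observability boil down to comparing each table's latest load timestamp against an agreed SLA. A minimal sketch; the table names, timestamps, and SLAs are hypothetical, and a real deployment would read the timestamps from warehouse metadata and page an on-call channel instead of printing:

```python
from datetime import datetime, timedelta, timezone

# Fixed "now" so the example is deterministic
now = datetime(2024, 4, 1, 12, 0, tzinfo=timezone.utc)

# Last successful load per table (would come from pipeline metadata)
last_loaded = {
    "fct_orders":    now - timedelta(minutes=20),
    "dim_customers": now - timedelta(hours=5),
}

# Agreed freshness SLA per table
sla = {
    "fct_orders":    timedelta(hours=1),
    "dim_customers": timedelta(hours=4),
}

def stale_tables():
    """Tables whose latest load is older than their SLA allows."""
    return sorted(t for t, ts in last_loaded.items() if now - ts > sla[t])

print(stale_tables())  # ['dim_customers']
```

Tools like dbt source freshness or Airflow sensors implement the same comparison; the value of writing it down explicitly is that the SLA becomes a reviewable artifact rather than tribal knowledge.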
Posted 1 month ago
6.0 - 11.0 years
18 - 33 Lacs
Pune, Chennai, Bangalore/ Bengaluru
Hybrid
Data analysts collect and store data related to sales, market research, logistics, linguistics, or other behaviors. They bring technical expertise to ensure the quality and accuracy of that data, then process and present it in ways that help people and organizations make better decisions.
Posted 1 month ago
1.0 - 3.0 years
2 - 3 Lacs
Jodhpur
Work from Office
We are seeking a detail-oriented and analytical Data Analyst to join our team. The successful candidate will be responsible for collecting, processing, and analyzing data to inform business decisions. This role requires strong analytical skills, attention to detail, and the ability to communicate insights effectively to both technical and non-technical stakeholders.

Key Responsibilities:
• Collect, clean, and validate data from various sources.
• Perform exploratory data analysis and statistical modeling.
• Build and maintain dashboards and reports using tools like Power BI, Tableau, or Looker.
• Interpret data trends and patterns to inform business strategies.
• Collaborate with cross-functional teams to identify data needs and deliver actionable insights.
• Support data-driven decision-making through ad hoc analysis and reporting.
• Maintain data integrity and ensure consistency across systems.
• Document data processes and methodologies used in analysis.

Qualifications:
• Bachelor's degree in Data Science, Statistics, Mathematics, Computer Science, Economics, or a related field.
• 0–1 years of experience in a data analysis or business intelligence role.
• Proficiency in SQL and experience with relational databases.
• Experience with data visualization tools such as Tableau, Power BI, or similar.
• Strong skills in Excel and at least one programming language (e.g., Python, R).
• Familiarity with statistical techniques and machine learning models is a plus.
• Excellent communication and presentation skills.
• Strong problem-solving skills and attention to detail.

Preferred Qualifications:
• Experience with cloud platforms like AWS, GCP, or Azure.
• Knowledge of ETL processes and tools.
• Background in e-commerce, healthcare, or finance.

To apply, call Jyoti on 9929500370 (calls only between 10 am and 6 pm).
Posted 1 month ago
0.0 - 5.0 years
4 - 9 Lacs
Noida
Remote
Identify, analyze, and interpret trends or patterns in complex data sets. Filter and "clean" data by reviewing computer reports, printouts, and performance indicators to locate and correct code problems. Work with management to prioritize business needs.

Required Candidate Profile
Knowledge of and experience with reporting packages, databases (SQL, etc.), and programming (XML, JavaScript, or ETL frameworks). Adept at queries, report writing, and presenting findings.

Perks and Benefits
Flexible work arrangements.
Posted 1 month ago
0.0 - 5.0 years
4 - 9 Lacs
Chennai
Remote
Identify, analyze, and interpret trends or patterns in complex data sets. Filter and "clean" data by reviewing computer reports, printouts, and performance indicators to locate and correct code problems. Work with management to prioritize business needs.

Required Candidate Profile
Knowledge of and experience with reporting packages, databases (SQL, etc.), and programming (XML, JavaScript, or ETL frameworks). Adept at queries, report writing, and presenting findings.

Perks and Benefits
Flexible work arrangements.
Posted 1 month ago
8.0 - 13.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Responsible for monitoring transactions and system health for Razorpay's top merchants as well as the entire organization, and alerting relevant stakeholders whenever there is an issue with the system.
• Send communications and alerts during incidents or anomalies in the system.
• Collaborate with internal teams as and when required.
• Analyze trends in the data and draw inferences.
• Create interactive dashboards for internal monitoring and metric consumption.

Required Qualifications
• Ability to work with large data sets.
• Hands-on experience with BI tools and data manipulation.
• Knowledge of SQL is required.
• Strong communication skills.
• Ability to work 24/7 in shifts.
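The transaction-monitoring and anomaly-alerting duties described here can be sketched as a simple z-score check over per-minute success counts. The data is invented; note that a single extreme point inflates the standard deviation, so the threshold here is loose, and production monitors typically score each point against a trailing baseline instead:

```python
from statistics import mean, stdev

def anomalies(series, threshold=2.0):
    """Indices whose z-score against the whole series exceeds the threshold."""
    mu, sigma = mean(series), stdev(series)
    return [i for i, x in enumerate(series)
            if sigma > 0 and abs(x - mu) / sigma > threshold]

# Successful transactions per minute; minute 5 shows a sudden drop
counts = [100, 102, 98, 101, 99, 20, 100, 103]
print(anomalies(counts))  # [5]
```

In a live system the flagged index would trigger the stakeholder alert the listing describes, typically with the affected merchant and metric attached.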
Posted 1 month ago
5.0 - 8.0 years
5 - 9 Lacs
Hyderabad
Work from Office
• Bachelor's degree in Computer Science, Information Technology, or a related field, with 5-8 years of experience in Data Engineering.
• Proven experience as a Data Engineer with a focus on Oracle ODI.
• Experience with Oracle Data Integrator (ODI) for ETL processes.
• Solid understanding of data modeling concepts and techniques.
• Excellent SQL skills for data manipulation and analysis.
• Familiarity with data warehousing principles and best practices.
• Strong problem-solving and analytical skills.
• Effective communication and collaboration skills.
• Any Graduate or B.Tech.
Posted 1 month ago
1.0 - 5.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Seekho's Team & Culture

Role & Responsibilities:
As a Product Analyst at Seekho, you will:
• Analyze product data: extract, clean, and analyze large datasets to derive meaningful insights that enhance user engagement and product performance.
• Develop data-driven strategies: translate data findings into product improvements and business opportunities, enabling better decision-making.
• SQL mastery: write, optimize, and maintain complex SQL queries to extract insights from databases efficiently.
• Collaborate across teams: work closely with product managers, engineers, content creators, and marketing teams to align data insights with strategic goals.
• Monitor and report metrics: track key performance indicators (KPIs) and develop dashboards/reports to measure product effectiveness.
• Conduct experimentation & A/B testing: design and analyze A/B tests to optimize user engagement and product features.
• Identify trends & patterns: use statistical and analytical techniques to uncover hidden trends, enabling proactive decision-making.

What We're Looking For
• 1+ years of experience in data analysis, product analytics, or a related field.
• Strong SQL skills: the ability to write, optimize, and troubleshoot complex queries efficiently.
Posted 1 month ago
2.0 - 7.0 years
4 - 9 Lacs
Hyderabad
Work from Office
Collect, clean, and analyze structured and unstructured data from various sources. Develop dashboards, reports, and visualizations to track key business metrics. Identify trends, patterns, and anomalies in data to generate actionable insights. Collaborate with cross-functional teams (ML Engineering, Product, Marketing, Engineering) to understand their data needs. Build and maintain automated reporting systems. Support data-driven decision-making through statistical analysis and modeling.

Required Skills & Qualifications:
• Bachelor's degree in Statistics, Mathematics, Computer Science, or a related field.
• 2+ years of experience in data analysis or a similar role.
• Strong SQL skills and experience with data querying.
• Proficiency in Excel and data visualization tools.
• Working knowledge of Python (Pandas) for data manipulation and analysis.
• Excellent communication and presentation skills.
Posted 1 month ago
4.0 - 7.0 years
7 - 12 Lacs
Warangal, Hyderabad, Nizamabad
Work from Office
Description: Sr. Statistical Programmer

Syneos Health is a leading fully integrated biopharmaceutical solutions organization built to accelerate customer success. We translate unique clinical, medical affairs, and commercial insights into outcomes to address modern market realities. Our Clinical Development model brings the customer and the patient to the center of everything that we do. We are continuously looking for ways to simplify and streamline our work to not only make Syneos Health easier to work with, but to make us easier to work for. Whether you join us in a Functional Service Provider partnership or a Full-Service environment, you'll collaborate with passionate problem solvers, innovating as a team to help our customers achieve their goals. We are agile and driven to accelerate the delivery of therapies, because we are passionate to change lives. Discover what our 29,000 employees, across 110 countries already know: WORK HERE MATTERS EVERYWHERE.

Why Syneos Health
We are passionate about developing our people, through career development and progression; supportive and engaged line management; technical and therapeutic area training; peer recognition; and a total rewards program. We are committed to our Total Self culture, where you can authentically be yourself. Our Total Self culture is what unites us globally, and we are dedicated to taking care of our people. We are continuously building the company we all want to work for and our customers want to work with. Why? Because when we bring together diversity of thoughts, backgrounds, cultures, and perspectives, we're able to create a place where everyone feels like they belong.

Job Responsibilities
• Develop and maintain programs to create analysis datasets, tables, and figures, ensuring accuracy and compliance with statistical standards.
• Provide comprehensive statistical programming support for statisticians, clients, or business needs, including troubleshooting and resolving programming issues.
• Source, organize, and interpret complex data sets, utilizing advanced coding techniques to ensure data integrity and usability.
• Collaborate with statisticians and other stakeholders to understand project requirements and deliver high-quality statistical outputs.
• Evaluate existing programming processes, identify areas for improvement, and implement revisions to enhance productivity and efficiency.
• Contribute to the design, implementation, and delivery of processes, programs, and policies, leveraging in-depth knowledge and skills within the statistical programming discipline.
• Direct the work of lower-level professionals, providing guidance and mentorship to ensure the successful completion of projects and tasks.
• Manage processes and programs related to statistical programming, ensuring alignment with organizational goals and objectives.
• Ensure effective communication and collaboration with cross-functional teams to meet the needs of statisticians, clients, or businesses.
• Stay updated with the latest advancements in statistical programming and data analysis techniques, continuously improving skills and knowledge to deliver innovative solutions.

Qualifications
• Advanced degree in Statistics, Computer Science, or a related field.
• Proven experience in statistical programming and data analysis.
• Strong knowledge of programming languages such as SAS, R, or Python.
• Familiarity with data visualization tools and techniques.
• Excellent problem-solving and analytical skills.
• Ability to work independently and manage multiple projects simultaneously.

Certifications
• SAS Certified Advanced Programmer for SAS 9 or equivalent certification.
• Certification in data analysis or statistical programming is preferred.

Necessary Skills
• Proficiency in statistical programming and data manipulation.
• Strong understanding of statistical methodologies and data analysis techniques.
• Ability to develop and implement efficient programming solutions.
• Excellent communication and collaboration skills.
• Attention to detail and commitment to quality.
• Ability to adapt to changing project requirements and priorities.

Get to know Syneos Health
Over the past 5 years, we have worked with 94% of all Novel FDA Approved Drugs, 95% of EMA Authorized Products, and over 200 Studies across 73,000 Sites and 675,000+ Trial patients. No matter what your role is, you'll take the initiative and challenge the status quo with us in a highly competitive and ever-changing environment. Learn more about Syneos Health: http://www.syneoshealth.com

Additional Information
Tasks, duties, and responsibilities as listed in this job description are not exhaustive. The Company, at its sole discretion and with no prior notice, may assign other tasks, duties, and job responsibilities. Equivalent experience, skills, and/or education will also be considered, so qualifications of incumbents may differ from those listed in the Job Description. The Company, at its sole discretion, will determine what constitutes as equivalent to the qualifications described above. Further, nothing contained herein should be construed to create an employment contract. Occasionally, required skills/experiences for jobs are expressed in brief terms. Any language contained herein is intended to fully comply with all obligations imposed by the legislation of each country in which it operates, including the implementation of the EU Equality Directive, in relation to the recruitment and employment of its employees. The Company is committed to compliance with the Americans with Disabilities Act, including the provision of reasonable accommodations, when appropriate, to assist employees or applicants to perform the essential functions of the job.

Summary
Roles within the Statistical Programming job family at the P22 level are responsible for developing programs and providing statistical programming support for statisticians or for client or business use. These roles address needs for sourcing, organizing, and interpreting complex data sets utilizing codes and programs. This includes developing codes that create analysis datasets, tables, and figures, evaluating programming processes, and suggesting revisions geared toward increasing productivity.

Impact and Contribution
Roles within the Statistical Programming job family at the P22 level contribute significantly to the design, implementation, and delivery of processes, programs, and policies. These roles involve in-depth knowledge and skills within a professional discipline, understanding the impact of work on related areas, and may be responsible for entire projects or processes within their area of responsibility. Individuals in these roles may direct the work of lower-level professionals or manage processes and programs, ensuring that statistical programming support is effectively provided to meet the needs of statisticians, clients, or businesses.

Core Focus
• Developing programs and codes to create analysis datasets, tables, and figures.
• Providing statistical programming support for statisticians or for client or business use.
• Sourcing, organizing, and interpreting complex data sets.
• Evaluating programming processes and suggesting revisions to increase productivity.
• Contributing to the design, implementation, and delivery of processes, programs, and policies.
• Directing the work of lower-level professionals or managing processes and programs.
• Ensuring effective statistical programming support to meet the needs of statisticians, clients, or businesses.
Posted 1 month ago
3.0 - 8.0 years
1 - 5 Lacs
Gurugram
Work from Office
• Proficiency in MongoDB, Express.js, Angular (2+), and Node.js for developing scalable and efficient web applications.
• Strong understanding and hands-on experience with JavaScript, TypeScript, HTML5, and CSS3 to create responsive and intuitive user interfaces.
• Experience with modern front-end frameworks such as Angular, React.js, or Vue.js, ensuring seamless integration with back-end services.
• Familiarity with server-side CSS preprocessors like Sass or Less, leveraging their capabilities to streamline styling workflows.
• Thorough knowledge of designing and consuming RESTful APIs, coupled with proficiency in microservices architecture to facilitate modular and scalable application development.

Responsibilities & Skills:
• Hands-on experience with MongoDB or similar NoSQL databases, encompassing schema design, querying optimization, and data manipulation.
• Understanding of agile software development principles and practices, including iterative development, continuous integration, and rapid prototyping.
• Excellent problem-solving abilities with a proactive mindset to identify, troubleshoot, and resolve complex technical issues efficiently.
• Ability to work effectively both independently and collaboratively in a team environment, contributing to shared goals and delivering high-quality solutions.
• Strong verbal and written communication skills, adept at articulating technical concepts and collaborating effectively with cross-functional teams.

Education: Full-time B.E / B.Tech in Computers
Experience: 3+ years
Hiring For: MEAN Programmer
Posted 1 month ago
0.0 - 3.0 years
2 - 4 Lacs
Pune
Work from Office
Job Summary:
We are looking for a skilled and detail-oriented Data Analyst with over 2 years of experience to join our team. The ideal candidate should have a strong foundation in scripting, solid Python & SQL skills, and hands-on experience with data visualization tools such as Power BI, Tableau, or similar platforms.

Key Responsibilities:
• Write, optimize, and troubleshoot SQL queries to extract, transform, and load data from various sources.
• Develop and maintain interactive dashboards and reports using Power BI, Tableau, or other visualization tools.
• Automate data workflows and perform ad-hoc data analysis using scripting languages (e.g., Python, R, or similar).
• Collaborate with cross-functional teams to understand data needs and deliver actionable insights.
• Ensure data quality, consistency, and integrity across reporting solutions.

Required Skills and Qualifications:
• 2+ years of professional experience in a Python, Data Analysis, or BI role.
• Proficiency in SQL for querying and data manipulation.
• Good scripting knowledge in Python.
• Hands-on experience with at least one data visualization tool (Power BI, Tableau, Looker, etc.).
• Strong analytical and problem-solving skills.
• Excellent communication and stakeholder management abilities.

Preferred Qualifications:
• Experience working with large datasets and complex data models.
• Exposure to cloud data platforms (e.g., Azure, AWS, GCP) is a plus.
• Familiarity with version control tools (e.g., Git) is an advantage.

About Aumni Techworks:
Aumni Techworks, established in 2016, is a Software Services Company that partners with product companies to build and manage their dedicated teams in India. So, while you are working for a services company, you are working within a product team and growing with them. We do not take projects, and we have long-term (open-ended) contracts with our clients. When our clients sign up with us, they are looking at a multi-year relationship. For example, some of the clients who signed up 8 or 6 years ago are still with us. We do not move people across client teams, and there is no concept of bench. At Aumni, we believe in quality work, and we truly believe that Indian talent is at par with someone in NY, London, or Germany. 300+ and growing.

Benefits:
• Our award-winning culture reminds us of our engineering days.
• Medical insurance (including parents), life and disability insurance.
• 24 leaves + 10 public holidays + leaves for hospitalisation, maternity, paternity, and bereavement.
• On-site gym, TT, carrom, foosball, and pool table.
• Hybrid work culture.
• Fitness group / rewards.
• Friday socials, annual parties, treks.
Posted 1 month ago
8.0 - 13.0 years
2 - 30 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
At Aviato Consulting, you'll be at the forefront of innovation, building and deploying impactful AI solutions for some of Australia's largest enterprises. As a Data Scientist, you'll leverage the power of Google Cloud and collaborative environments like Colab notebooks to develop, train, and optimize machine learning models that drive real business outcomes. You'll be a key player in shaping our data science practice, working with diverse datasets, and directly contributing to our clients' success and our company's growth.

What Experience is Mandatory:
• Proven Data Science Expertise: Demonstrated experience in applying machine learning techniques, statistical modeling, and predictive analytics to solve complex business problems.
• Google Cloud Platform (GCP) Proficiency: Hands-on experience with key GCP services relevant to data science, such as Vertex AI (for model training and deployment), BigQuery (for data warehousing), Cloud Storage, and Dataflow.
• Model Training and Optimization: Strong background in the end-to-end lifecycle of machine learning models, including data preprocessing, feature engineering, model selection, training, evaluation, and hyperparameter tuning.
• Python and Notebook Environments: Expert-level proficiency in Python for data manipulation, statistical analysis, and machine learning, with extensive experience using interactive environments like Jupyter or Google Colab notebooks for development and collaboration.
• Working with Large Datasets: Experience handling, processing, and deriving insights from large and complex datasets, understanding data governance and security best practices.
• Client-Facing Experience: Previous experience working directly with clients to understand requirements, present findings, and provide technical guidance.

What Experience is Beneficial (but Optional):
• MLOps Practices: Familiarity with MLOps principles and tools for automating and streamlining the machine learning lifecycle, including CI/CD for models.
• Deep Learning Frameworks: Experience with deep learning frameworks such as TensorFlow or PyTorch.
• Data Visualization Tools: Proficiency with data visualization libraries or tools (e.g., Matplotlib, Seaborn, Looker) to communicate insights effectively.

What We Offer:
• Impactful Work & Ownership: You'll work on high-impact projects for large Australian clients, with significant autonomy and the opportunity to see your work directly influence business decisions.
• Culture Built for Growth: Join a company founded by ex-Googlers, fostering a culture of innovation, continuous learning, and collaboration, where your ideas are valued.
• Shared Success: Become a part-owner of Aviato with an Employee Share Ownership Plan (ESOP) after 6 months, aligning your success with ours.
• Rewarding Performance: Benefit from an annual bonus structure that recognizes your contributions and achievements.
• True Flexibility: Enjoy the freedom of a fully remote work environment, allowing you to work from where you're most productive.
• Certified Excellence: Be part of a "Great Place to Work" certified organization, reflecting our commitment to a positive and supportive employee experience.
• Continuous Learning: We invest in your professional development, providing resources and opportunities to expand your skills and stay at the cutting edge of data science and Google Cloud technologies.
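The model training and optimization lifecycle mentioned above can be shown at its smallest scale with a hand-rolled gradient-descent fit. This is a toy sketch in plain Python with no frameworks; the data and learning rate are invented for the example:

```python
# Fit y ≈ w*x + b by gradient descent on mean squared error (toy data: y = 2x + 1)
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]

w, b, lr = 0.0, 0.0, 0.05  # initial parameters and learning rate (assumed values)
for _ in range(2000):
    # Gradients of MSE with respect to w and b, averaged over the dataset
    grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / len(xs)
    grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad_w
    b -= lr * grad_b
# w and b converge toward the true slope 2 and intercept 1
```

Production work would of course use a framework and managed training (e.g., Vertex AI, as the posting lists), but the preprocess-train-evaluate-tune loop it automates follows this same shape.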
Posted 1 month ago
3.0 - 6.0 years
5 - 8 Lacs
Bengaluru
Work from Office
We are looking for a skilled SAS Developer with 3 to 8 years of experience to join our team in Bangalore. The ideal candidate will have expertise in developing and implementing statistical analysis software solutions.

Roles and Responsibilities
• Design, develop, and test SAS programs for data manipulation and analysis.
• Collaborate with cross-functional teams to identify business requirements and develop solutions.
• Develop and maintain complex statistical models and algorithms using SAS.
• Troubleshoot and resolve technical issues related to SAS development.
• Participate in code reviews and contribute to improving overall code quality.
• Stay updated with industry trends and emerging technologies in SAS development.

Job Requirements
• Strong knowledge of the SAS programming language and its applications.
• Experience with data structures, algorithms, and software design patterns.
• Excellent problem-solving skills and attention to detail.
• Ability to work collaboratively in a team environment.
• Strong communication and interpersonal skills.
• Familiarity with database management systems and data visualization tools.

Employee type: CTH
Posted 1 month ago
2.0 - 5.0 years
8 - 12 Lacs
Bengaluru
Work from Office
About the role:
We are seeking an innovative and highly skilled Industrial Solutions Engineer (OPC UA/PCS7 & Python Specialist) to join our team and drive the design, development, and implementation of cutting-edge industrial solutions. This role focuses on leveraging your expertise in PCS7, PIMAQ, Python scripting, and OPC UA technologies to deliver scalable, high-quality systems that optimize performance and exceed project expectations. In this position, you will play a pivotal role in automating processes, analyzing data, and ensuring seamless communication between devices and systems in industrial environments, particularly within the oil and gas sector. Your ability to design accurate and accessible data models, coupled with your problem-solving skills and collaborative approach, will be key to delivering solutions that align with industry standards and project goals. If you are passionate about industrial automation, data-driven innovation, and working at the intersection of engineering and advanced technology, this is your opportunity to make a meaningful impact within a dynamic and forward-thinking team.

Key Responsibilities
• Provide expert knowledge and support for PCS7 and PIMAQ systems, ensuring optimal performance and integration within the project framework.
• Develop and implement Python scripts for automation, data analysis, and system optimization.
• Utilize fundamental knowledge of OPC UA to facilitate seamless communication between devices and systems, ensuring interoperability and data exchange.
• Design and implement data models that support project objectives, ensuring data accuracy, consistency, and accessibility.
• Collaborate with engineering teams to define system requirements and specifications, ensuring alignment with project goals and industry standards.

Qualifications
• Bachelor's degree in Engineering, Computer Science, or a related field.
• Proven experience with PCS7 and PIMAQ in an oil and gas environment.
• Proficiency in Python programming, with a strong understanding of scripting for automation and data manipulation.
• Solid understanding of OPC UA fundamentals and its application in industrial environments.
• Experience in data modeling techniques and best practices.
• Strong analytical and problem-solving skills, with the ability to work independently and as part of a team.
• Excellent communication and interpersonal skills, with the ability to convey complex technical concepts to non-technical stakeholders.

About the Team
Become a part of our mission for sustainability: clean energy for generations to come. We are a global team of diverse colleagues who share a passion for renewable energy and have a culture of trust and empowerment to make our own ideas a reality. We focus on personal and professional development to grow internally within our organization.

Who is Siemens Energy?
At Siemens Energy, we are more than just an energy technology company. We meet the growing energy demand across 90+ countries while ensuring our climate is protected. With more than 96,000 dedicated employees, we not only generate electricity for over 16% of the global community, but we're also using our technology to help protect people and the environment. Our global team is committed to making sustainable, reliable, and affordable energy a reality by pushing the boundaries of what is possible. We uphold a 150-year legacy of innovation that encourages our search for people who will support our focus on decarbonization, new technologies, and energy transformation.
Posted 1 month ago
8.0 - 13.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Responsibilities
• Design and Develop Scalable Data Pipelines: Build and maintain robust data pipelines using Python to process, transform, and integrate large-scale data from diverse sources.
• Orchestration and Automation: Implement and manage workflows using orchestration tools such as Apache Airflow to ensure reliable and efficient data operations.
• Data Warehouse Management: Work extensively with Snowflake to design and optimize data models, schemas, and queries for analytics and reporting.
• Queueing Systems: Leverage message queues like Kafka, SQS, or similar tools to enable real-time or batch data processing in distributed environments.
• Collaboration: Partner with Data Science, Product, and Engineering teams to understand data requirements and deliver solutions that align with business objectives.
• Performance Optimization: Optimize the performance of data pipelines and queries to handle large scales of data efficiently.
• Data Governance and Security: Ensure compliance with data governance and security standards to maintain data integrity and privacy.
• Documentation: Create and maintain clear, detailed documentation for data solutions, pipelines, and workflows.

Qualifications

Required Skills:
• 5+ years of experience in data engineering roles with a focus on building scalable data solutions.
• Proficiency in Python for ETL, data manipulation, and scripting.
• Hands-on experience with Snowflake or equivalent cloud-based data warehouses.
• Strong knowledge of orchestration tools such as Apache Airflow or similar.
• Expertise in implementing and managing messaging queues like Kafka, AWS SQS, or similar.
• Demonstrated ability to build and optimize data pipelines at scale, processing terabytes of data.
• Experience in data modeling, data warehousing, and database design.
• Proficiency in working with cloud platforms like AWS, Azure, or GCP.
• Strong understanding of CI/CD pipelines for data engineering workflows.
• Experience working in an Agile development environment, collaborating with cross-functional teams.

Preferred Skills:
• Familiarity with other programming languages like Scala or Java for data engineering tasks.
• Knowledge of containerization and orchestration technologies (Docker, Kubernetes).
• Experience with stream processing frameworks like Apache Flink.
• Experience with Apache Iceberg for data lake optimization and management.
• Exposure to machine learning workflows and integration with data pipelines.

Soft Skills:
• Strong problem-solving skills with a passion for solving complex data challenges.
• Excellent communication and collaboration skills to work with cross-functional teams.
• Ability to thrive in a fast-paced, innovative environment.
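The extract-transform-load pattern behind the pipeline responsibilities above can be sketched in miniature with plain Python generators. This is a toy illustration only; the CSV source and MRR field are invented, and a real pipeline would run on the orchestration and warehouse tools the posting lists (Airflow, Snowflake, Kafka):

```python
import csv
import io

# Toy source standing in for an upstream extract (hypothetical data)
RAW = "user_id,mrr\n1,49\n2,\n3,99\n"

def extract(text):
    """Extract: yield rows from a CSV source as dicts."""
    yield from csv.DictReader(io.StringIO(text))

def transform(rows):
    """Transform: drop rows with missing MRR and cast types."""
    for r in rows:
        if r["mrr"]:
            yield {"user_id": int(r["user_id"]), "mrr": float(r["mrr"])}

def load(rows, sink):
    """Load: append cleaned rows to the target store."""
    for r in rows:
        sink.append(r)
    return sink

# Stages compose lazily, so data streams through without materializing in full
warehouse = load(transform(extract(RAW)), [])
total_mrr = sum(r["mrr"] for r in warehouse)  # 49 + 99
```

Chaining generators keeps each stage independently testable and memory-bounded, which is the same property distributed pipelines aim for at terabyte scale.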
Posted 1 month ago