Jobs
Interviews

17 Data Analyst Jobs

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

8.0 - 12.0 years

0 Lacs

Pune, Maharashtra

On-site

The opportunity: We are looking for a technical powerhouse - a hands-on Data Science and AI leader who can build, scale, and deploy machine learning systems that drive significant impact. As the key person in charge, you will develop and execute the entire AI/ML strategy, from research to production, while playing a pivotal role in shaping a top-tier data organization.

What you'll own:
- AI Leadership with a Global Lens: Establish and lead a high-performing, multicultural team of data scientists, ML engineers, and analysts. Set a visionary path, foster collaboration, and integrate diverse perspectives to stimulate innovation.
- Production-Grade AI at Scale: Deploy models that not only function effectively but also transform processes such as fraud detection, credit scoring, and personalized finance at scale across multiple markets.
- Data Infrastructure for the Future: Architect scalable, real-time systems that power AI applications across regions, languages, and regulatory landscapes.
- Fintech at Its Core: Contribute beyond the technical to directly impact financial inclusion, risk assessment, and growth in one of the world's most dynamic regions.
- Ethical AI for Real-World Impact: Ensure fairness, transparency, and compliance in every model; trust is the bedrock of the financial sector.

Who you are:
- A Technical Leader Who Thrives in Diversity: Experience building and leading culturally diverse teams, combining technical proficiency with emotional intelligence.
- A Hands-On AI/ML Expert: Expertise in training, deploying, and scaling models, while empowering team members to reach their full potential.
- A Fintech or High-Stakes AI Veteran: Prior work in fraud detection, risk assessment, or financial analytics is advantageous.
- A Communicator & Collaborator: You bridge communication gaps between technical teams, executives, and regulators, conveying complex AI concepts across languages and cultures.

Why this role:
- Lead a Truly Regional Team: Collaborate with top talent from across Asia to develop AI solutions for millions of diverse users.
- Zero Bureaucracy, Maximum Impact: A fast-paced environment where results matter; demonstrate effectiveness and your projects ship swiftly.
- Your Legacy in Fintech: More than a job, this is a chance to leave a lasting mark on the fintech industry.

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As a Data Engineer at GlobalLogic, you will be responsible for architecting, building, and maintaining complex ETL/ELT pipelines for batch and real-time data processing using various tools and programming languages. Your key duties will include optimizing existing data pipelines for performance, cost-effectiveness, and reliability, as well as implementing data quality checks, monitoring, and alerting mechanisms to ensure data integrity. Additionally, you will play a crucial role in ensuring data security, privacy, and compliance with relevant regulations such as GDPR and local data laws.

To excel in this role, you should possess a Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field. Excellent analytical, problem-solving, and critical thinking skills with meticulous attention to detail are essential. Strong communication (written and verbal) and interpersonal skills are also required, along with the ability to collaborate effectively with cross-functional teams. Experience with Agile/Scrum development methodologies is considered a plus.

Your responsibilities will involve providing technical leadership and architecture by designing and implementing robust, scalable, and efficient data architectures that align with organizational strategy and future growth. You will define and enforce data engineering best practices, evaluate and recommend new technologies, and oversee the end-to-end data development lifecycle. As a leader, you will mentor and guide a team of data engineers, conduct code reviews, provide feedback, and promote a culture of engineering excellence. You will collaborate closely with data scientists, data analysts, software engineers, and business stakeholders to understand data requirements and translate them into technical solutions. Your role will also involve communicating complex technical concepts and data strategies effectively to both technical and non-technical audiences.

At GlobalLogic, we offer a culture of caring, continuous learning and development opportunities, interesting and meaningful work, balance and flexibility, and a high-trust environment. By joining our team, you will have the chance to work on impactful projects, engage your curiosity and problem-solving skills, and contribute to shaping cutting-edge solutions that redefine industries. With a commitment to integrity and trust, GlobalLogic provides a safe, reliable, and ethical global environment where you can thrive both personally and professionally.
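The data-quality checks and alerting duties described above can be sketched in plain Python: validate each record against simple rules before it is loaded downstream, routing failures to a reject flow. All names here (check_batch, REQUIRED_FIELDS) are illustrative assumptions, not from any specific GlobalLogic codebase.

```python
REQUIRED_FIELDS = {"id", "amount", "currency"}

def check_record(record: dict) -> list[str]:
    """Return a list of rule violations for one record."""
    errors = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    if "amount" in record and not isinstance(record["amount"], (int, float)):
        errors.append("amount is not numeric")
    return errors

def check_batch(records: list[dict]):
    """Split a batch into clean rows and rejected (row, reasons) pairs."""
    clean, rejected = [], []
    for rec in records:
        errs = check_record(rec)
        if errs:
            rejected.append((rec, errs))  # real pipelines would also log/alert here
        else:
            clean.append(rec)
    return clean, rejected

clean, rejected = check_batch([
    {"id": 1, "amount": 9.5, "currency": "INR"},
    {"id": 2, "currency": "INR"},  # missing amount -> rejected
])
print(len(clean), len(rejected))  # 1 1
```

In a production pipeline the rejected list would feed a dead-letter table and an alerting hook rather than being returned in memory.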

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Pune, Maharashtra

On-site

You are an experienced and dynamic AI & Building Energy Modeling Product Manager responsible for leading the development and management of cutting-edge AI-driven solutions in building energy modeling and sustainability. The role requires a unique blend of expertise in AI/ML technologies, building energy modeling, and product management, along with strong technical skills, a passion for sustainability, and the ability to drive product strategy and development in a fast-paced environment.

Your responsibilities include:
- Staying up to date on AI, building energy modeling, and industry trends, and applying this knowledge to inform product strategy.
- Acting as a domain expert from the building energy and HVAC domain.
- Driving clear product definition and a roadmap to achieve business goals.
- Collaborating closely with AI/ML researchers, engineers, data analysts, annotators, and other product managers.
- Deeply understanding customer needs and priorities.
- Defining and tracking metrics to measure product quality and business impact.
- Ensuring AI products meet legal and ethical standards by working with legal and compliance teams.

Qualifications required:
- A bachelor's or master's degree in architecture, engineering, or a related field, with a minimum of 2+ years of relevant experience in building energy modeling and simulation.
- Proven experience in a similar role dealing with green building, sustainability strategies, energy strategies, energy modeling, and assessment.
- Strong technical skills and knowledge of building energy codes and standards; hands-on experience with IESVE, HAP, or DesignBuilder; knowledge of other software such as EnergyPlus or eQUEST; BEMP, CEM, or equivalent credentials.
- A basic understanding of AI/ML concepts and the ability to code in Python.
- The ability to perform market and competitive analysis in AI.
- Familiarity with project management methodologies (Agile, Scrum) and the ability to work independently and as part of a team.

You will have the opportunity to work with a dynamic and innovative IT organization, experience a collaborative and supportive work environment, and benefit from professional growth and development opportunities.

As a candidate, you should also have a good understanding of different marketing techniques, familiarity with marketing applications (e.g., CRM tools, online analytics, and Google AdWords), a passion for the marketing industry and its best practices, excellent verbal and written communication skills, skills in objection handling and pitching a value proposition, and preferably some knowledge of the valve industry and HVAC. A bachelor's degree in Mechanical Engineering or a related discipline, a BBA/MBA in Marketing, and being a self-motivated, extroverted individual with a strong work ethic and a desire for continuous learning are also recommended.

Posted 2 weeks ago

Apply

12.0 - 16.0 years

0 Lacs

Delhi

On-site

As a Data Architect in our organization, you will play a crucial role in defining the data architecture for key domains within the Data Products Portfolio. Your responsibilities will include evaluating data-related tools and technologies and recommending implementation patterns and standard methodologies to keep our data ecosystem modern. Collaborating with Enterprise Data Architects, you will establish and adhere to enterprise standards and conduct PoCs to ensure their implementation. You will provide technical guidance and mentorship to Data Engineers and Data Analysts; develop and maintain processes, standards, policies, guidelines, and governance to ensure consistency across the company; create and maintain conceptual/logical data models; work with business and IT teams to understand data requirements; and maintain a data dictionary with table and column definitions. Additionally, you will review data models with technical and business audiences and lead the design and build of new models to deliver financial results efficiently to senior management.

This role is primarily technical: you will function as an individual contributor (80%) while also demonstrating leadership capabilities (20%). Your key responsibilities include designing, documenting, and training the team on overall processes and process flows for data architecture; resolving technical challenges in critical situations; developing relationships with external stakeholders; reviewing work from other tech team members; and implementing data architecture and data security policies aligned with governance objectives and regulatory requirements.

**Essential Education**
- A Bachelor's degree in information science, data management, computer science, or a related field is preferred.

**Experience & Qualifications**
- Bachelor's degree or equivalent combination of education and experience.
- 12+ years of IT experience with a major focus on data warehouse/database related projects.
- Expertise in cloud databases like Snowflake/Redshift, data catalogs, MDM, etc.
- Proficiency in SQL, database procedures, data modeling (conceptual, logical, and physical), and documenting architecture-related work.
- Hands-on experience in data storage, ETL/ELT, data analytics tools and technologies, data warehousing design/development, and BI/analytical systems.
- Experience with cloud big data technologies such as AWS, Azure, GCP, and Snowflake.
- Experience with Python is preferable.
- Strong hands-on experience with data and analytics architecture, solution design, and engineering.
- Experience working with Agile methodologies (Scrum, Kanban) and Meta Scrum with cross-functional teams.
- Strong communication and presentation skills for presenting architecture, features, and solution recommendations.

You will work closely with global functional portfolio technical leaders, product owners, functional area teams, Global Data portfolio Management & teams, and consulting and internal Data Tribe teams across the organization.
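The conceptual-to-physical modeling and data-dictionary work described above can be sketched in a few lines, using sqlite3 as a stand-in for a cloud warehouse like Snowflake or Redshift. The table and column names are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Physical DDL for a tiny dimensional model (one dimension, one fact):
conn.executescript("""
    CREATE TABLE dim_customer (
        customer_id   INTEGER PRIMARY KEY,
        customer_name TEXT NOT NULL
    );
    CREATE TABLE fact_sales (
        sale_id     INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES dim_customer(customer_id),
        amount      REAL NOT NULL
    );
""")

# The database catalog doubles as a rudimentary data dictionary:
cols = conn.execute("PRAGMA table_info(fact_sales)").fetchall()
print([c[1] for c in cols])  # ['sale_id', 'customer_id', 'amount']
```

A real data dictionary would add a business definition per column; the catalog gives the structural half (names, types, nullability) for free.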

Posted 2 weeks ago

Apply

10.0 - 15.0 years

0 Lacs

Navi Mumbai, Maharashtra

On-site

As the COE Solution Development Lead at Teradata, you will be a key thought leader responsible for overseeing the detailed design, development, and maintenance of complex data and analytic solutions. Your role will involve strong technical and project management skills, as well as team building and mentoring capabilities, and a deep understanding of Teradata's Solutions Strategy, Technology, Data Architecture, and the partner engagement model. Reporting directly to Teradata's Head of Solution COE, you will lead a team that develops scalable, efficient, and innovative data and analytics solutions to address complex business problems.

Your key responsibilities will include leading the end-to-end process of solution development, designing comprehensive solution architectures, ensuring flexibility for integrating various data sources and platforms, implementing best practices in data analytics solutions, collaborating with senior leadership, and mentoring a team of professionals to foster a culture of innovation and continuous learning. Additionally, you will deliver solutions on time and within budget, facilitate knowledge sharing across teams, and ensure that data solutions are scalable, secure, and aligned with the organization's overall technological roadmap.

You will collaborate with the COE Solutions lead to transform conceptual solutions into detailed designs and lead a team of data scientists, solution engineers, data engineers, and software engineers. You will also work closely with product development, legal, IT, and business teams to ensure seamless integration of data analytics solutions and the protection of related IP.

To qualify for this role, you should have a Bachelor's degree in Computer Science, Engineering, Data Science, or a related field (an MS or MBA is preferred), over 15 years of experience in IT, with at least 10 years in data & analytics solution development and 4+ years in a leadership or senior management position. Along with a proven track record in developing data-driven solutions, you should have experience working with cross-functional teams and a strong understanding of emerging trends in data analytics technologies.

We believe you will thrive at Teradata due to our people-first culture, flexible work model, focus on well-being, and commitment to Diversity, Equity, and Inclusion. If you are a collaborative, analytical, and innovative professional with excellent communication skills and a passion for data analytics, we invite you to join us in solving business challenges and driving enterprise analytics forward.

Posted 3 weeks ago

Apply

7.0 - 9.0 years

8 - 14 Lacs

Coimbatore

Work from Office

Job Summary: We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks.

Key Responsibilities:
- Provide technical leadership across Big Data and Python-based projects
- Architect, design, and implement scalable data pipelines and processing systems
- Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization
- Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions
- Conduct code reviews and mentor junior engineers to improve code quality and skills
- Evaluate and implement new tools and frameworks to enhance data capabilities
- Troubleshoot complex data-related issues and support production deployments
- Ensure compliance with data security and governance standards
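The "scalable data pipelines" this role centres on are typically composable, lazily evaluated stages over an iterator, the same shape a Spark or Beam job takes, reduced here to stdlib Python as a minimal sketch. The stage names and record layout are illustrative assumptions.

```python
from typing import Iterable, Iterator

def parse(lines: Iterable[str]) -> Iterator[dict]:
    """Stage 1: parse raw "user,amount" lines into records."""
    for line in lines:
        user, amount = line.strip().split(",")
        yield {"user": user, "amount": float(amount)}

def drop_invalid(rows: Iterable[dict]) -> Iterator[dict]:
    """Stage 2: filter out non-positive amounts (a toy quality rule)."""
    for row in rows:
        if row["amount"] > 0:
            yield row

def total_by_user(rows: Iterable[dict]) -> dict:
    """Stage 3: aggregate, like a groupBy().sum() in Spark."""
    totals: dict = {}
    for row in rows:
        totals[row["user"]] = totals.get(row["user"], 0.0) + row["amount"]
    return totals

raw = ["a,10.0", "b,-3.0", "a,2.5"]
print(total_by_user(drop_invalid(parse(raw))))  # {'a': 12.5}
```

Because the stages are generators, nothing materializes until the aggregation runs, which is the property that lets the same design scale when swapped onto a distributed engine.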

Posted 3 weeks ago

Apply

7.0 - 9.0 years

8 - 14 Lacs

Kanpur

Work from Office

Job Summary: We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks.

Key Responsibilities:
- Provide technical leadership across Big Data and Python-based projects
- Architect, design, and implement scalable data pipelines and processing systems
- Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization
- Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions
- Conduct code reviews and mentor junior engineers to improve code quality and skills
- Evaluate and implement new tools and frameworks to enhance data capabilities
- Troubleshoot complex data-related issues and support production deployments
- Ensure compliance with data security and governance standards

Posted 3 weeks ago

Apply

7.0 - 9.0 years

8 - 14 Lacs

Chandigarh

Work from Office

Job Summary: We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks.

Key Responsibilities:
- Provide technical leadership across Big Data and Python-based projects
- Architect, design, and implement scalable data pipelines and processing systems
- Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization
- Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions
- Conduct code reviews and mentor junior engineers to improve code quality and skills
- Evaluate and implement new tools and frameworks to enhance data capabilities
- Troubleshoot complex data-related issues and support production deployments
- Ensure compliance with data security and governance standards

Posted 3 weeks ago

Apply

4.0 - 5.0 years

4 - 5 Lacs

Remote, India

On-site

We are seeking a skilled Talend ETL Developer with 4 to 7 years of experience in designing and developing robust ETL processes using Talend. The role requires strong capabilities in data extraction, transformation, and loading, along with expertise in performance optimization, error handling, and collaboration with data teams.

Key Responsibilities:
- ETL Process Design: Design and develop scalable ETL processes using Talend for end-to-end data integration and transformation.
- Data Extraction: Extract data from diverse sources including databases, APIs, and flat files.
- Data Transformation: Transform and cleanse data to align with business requirements and ensure high data quality.
- Data Loading: Load transformed data into data warehouses, data lakes, or other target systems.
- Job Scheduling: Schedule and automate ETL jobs using Talend's scheduling tools or third-party schedulers.
- Performance Optimization: Tune and optimize ETL workflows for efficiency, speed, and resource usage.
- Error Handling: Implement robust error handling, logging, and alerting mechanisms to ensure reliability.
- Data Profiling: Perform data profiling to detect anomalies, inconsistencies, and data quality issues.
- Documentation: Create and maintain comprehensive ETL documentation, including data flow diagrams and technical specifications.
- Collaboration with Data Teams: Work closely with data analysts, data scientists, and business stakeholders to gather requirements and deliver high-quality data solutions.

Required Skills & Qualifications:
- 4 to 7 years of relevant experience in ETL development using Talend.
- Strong understanding of ETL architecture, data integration principles, and data transformation techniques.
- Hands-on experience in data extraction from multiple source types.
- Proficient in SQL and working with relational and non-relational databases.
- Familiarity with data warehousing and data lake architectures.
- Excellent problem-solving, troubleshooting, and communication skills.
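Talend jobs are assembled visually, but the extract-transform-load contract they implement can be sketched in plain Python: extract from a flat-file source, cleanse and type-cast with a reject path for bad rows, then load into a target. sqlite3 stands in for the warehouse; every name here is illustrative.

```python
import csv
import io
import sqlite3

def extract(flat_file: str) -> list[dict]:
    """Extract: read rows from a flat-file (CSV) source."""
    return list(csv.DictReader(io.StringIO(flat_file)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: cleanse and type-cast, skipping bad rows (error handling)."""
    out = []
    for row in rows:
        try:
            out.append((row["name"].strip().title(), int(row["qty"])))
        except (KeyError, ValueError):
            pass  # a real job would route this to a reject flow with logging
    return out

def load(conn: sqlite3.Connection, rows: list[tuple]) -> None:
    """Load: write transformed rows into the target table."""
    conn.executemany("INSERT INTO orders(name, qty) VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (name TEXT, qty INTEGER)")
load(conn, transform(extract("name,qty\n alice ,3\nbob,oops\n")))
print(conn.execute("SELECT * FROM orders").fetchall())  # [('Alice', 3)]
```

The same three-function split maps onto Talend's input, tMap, and output components, which is why separating the stages keeps a job testable.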

Posted 1 month ago

Apply

4.0 - 5.0 years

4 - 5 Lacs

Ahmedabad, Gujarat, India

On-site

We are seeking a skilled Talend ETL Developer with 4 to 7 years of experience in designing and developing robust ETL processes using Talend. The role requires strong capabilities in data extraction, transformation, and loading, along with expertise in performance optimization, error handling, and collaboration with data teams.

Key Responsibilities:
- ETL Process Design: Design and develop scalable ETL processes using Talend for end-to-end data integration and transformation.
- Data Extraction: Extract data from diverse sources including databases, APIs, and flat files.
- Data Transformation: Transform and cleanse data to align with business requirements and ensure high data quality.
- Data Loading: Load transformed data into data warehouses, data lakes, or other target systems.
- Job Scheduling: Schedule and automate ETL jobs using Talend's scheduling tools or third-party schedulers.
- Performance Optimization: Tune and optimize ETL workflows for efficiency, speed, and resource usage.
- Error Handling: Implement robust error handling, logging, and alerting mechanisms to ensure reliability.
- Data Profiling: Perform data profiling to detect anomalies, inconsistencies, and data quality issues.
- Documentation: Create and maintain comprehensive ETL documentation, including data flow diagrams and technical specifications.
- Collaboration with Data Teams: Work closely with data analysts, data scientists, and business stakeholders to gather requirements and deliver high-quality data solutions.

Required Skills & Qualifications:
- 4 to 7 years of relevant experience in ETL development using Talend.
- Strong understanding of ETL architecture, data integration principles, and data transformation techniques.
- Hands-on experience in data extraction from multiple source types.
- Proficient in SQL and working with relational and non-relational databases.
- Familiarity with data warehousing and data lake architectures.
- Excellent problem-solving, troubleshooting, and communication skills.

Posted 1 month ago

Apply

4.0 - 5.0 years

4 - 5 Lacs

Hyderabad, Telangana, India

On-site

We are seeking a skilled Talend ETL Developer with 4 to 7 years of experience in designing and developing robust ETL processes using Talend. The role requires strong capabilities in data extraction, transformation, and loading, along with expertise in performance optimization, error handling, and collaboration with data teams.

Key Responsibilities:
- ETL Process Design: Design and develop scalable ETL processes using Talend for end-to-end data integration and transformation.
- Data Extraction: Extract data from diverse sources including databases, APIs, and flat files.
- Data Transformation: Transform and cleanse data to align with business requirements and ensure high data quality.
- Data Loading: Load transformed data into data warehouses, data lakes, or other target systems.
- Job Scheduling: Schedule and automate ETL jobs using Talend's scheduling tools or third-party schedulers.
- Performance Optimization: Tune and optimize ETL workflows for efficiency, speed, and resource usage.
- Error Handling: Implement robust error handling, logging, and alerting mechanisms to ensure reliability.
- Data Profiling: Perform data profiling to detect anomalies, inconsistencies, and data quality issues.
- Documentation: Create and maintain comprehensive ETL documentation, including data flow diagrams and technical specifications.
- Collaboration with Data Teams: Work closely with data analysts, data scientists, and business stakeholders to gather requirements and deliver high-quality data solutions.

Required Skills & Qualifications:
- 4 to 7 years of relevant experience in ETL development using Talend.
- Strong understanding of ETL architecture, data integration principles, and data transformation techniques.
- Hands-on experience in data extraction from multiple source types.
- Proficient in SQL and working with relational and non-relational databases.
- Familiarity with data warehousing and data lake architectures.
- Excellent problem-solving, troubleshooting, and communication skills.

Posted 1 month ago

Apply

4.0 - 5.0 years

4 - 5 Lacs

Bengaluru, Karnataka, India

On-site

We are seeking a skilled Talend ETL Developer with 4 to 7 years of experience in designing and developing robust ETL processes using Talend. The role requires strong capabilities in data extraction, transformation, and loading, along with expertise in performance optimization, error handling, and collaboration with data teams.

Key Responsibilities:
- ETL Process Design: Design and develop scalable ETL processes using Talend for end-to-end data integration and transformation.
- Data Extraction: Extract data from diverse sources including databases, APIs, and flat files.
- Data Transformation: Transform and cleanse data to align with business requirements and ensure high data quality.
- Data Loading: Load transformed data into data warehouses, data lakes, or other target systems.
- Job Scheduling: Schedule and automate ETL jobs using Talend's scheduling tools or third-party schedulers.
- Performance Optimization: Tune and optimize ETL workflows for efficiency, speed, and resource usage.
- Error Handling: Implement robust error handling, logging, and alerting mechanisms to ensure reliability.
- Data Profiling: Perform data profiling to detect anomalies, inconsistencies, and data quality issues.
- Documentation: Create and maintain comprehensive ETL documentation, including data flow diagrams and technical specifications.
- Collaboration with Data Teams: Work closely with data analysts, data scientists, and business stakeholders to gather requirements and deliver high-quality data solutions.

Required Skills & Qualifications:
- 4 to 7 years of relevant experience in ETL development using Talend.
- Strong understanding of ETL architecture, data integration principles, and data transformation techniques.
- Hands-on experience in data extraction from multiple source types.
- Proficient in SQL and working with relational and non-relational databases.
- Familiarity with data warehousing and data lake architectures.
- Excellent problem-solving, troubleshooting, and communication skills.

Posted 1 month ago

Apply

5.0 - 8.0 years

5 - 8 Lacs

Hyderabad, Telangana, India

On-site

Key Responsibilities:
- Design, develop, and optimize batch and streaming data pipelines primarily using PySpark (Python with Apache Spark).
- Write efficient, reusable, and testable code for large-scale data processing.
- Work with diverse and large-scale datasets from various sources, including but not limited to Kafka, Hive, S3, and Parquet files.
- Collaborate closely with data scientists, data analysts, and DevOps teams to build robust and scalable data pipelines.
- Tune Spark jobs for optimal performance and resource efficiency.
- Implement comprehensive data quality checks, logging, and error-handling mechanisms within data pipelines.
- Contribute to the overall architecture and strategy for big data solutions.
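The streaming side of this role usually means windowed aggregations. The tumbling-window pattern a PySpark `groupBy(window(...))` expresses can be sketched with stdlib Python: events carry timestamps and are counted into fixed 60-second windows. The window size and event fields are illustrative assumptions.

```python
from collections import defaultdict

WINDOW_SECONDS = 60

def tumbling_window_counts(events):
    """Count events per (key, window start), where each timestamp falls
    into exactly one fixed 60-second tumbling window."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // WINDOW_SECONDS) * WINDOW_SECONDS
        counts[(key, window_start)] += 1
    return dict(counts)

events = [(5, "click"), (30, "click"), (65, "click"), (70, "view")]
print(tumbling_window_counts(events))
# {('click', 0): 2, ('click', 60): 1, ('view', 60): 1}
```

In real PySpark the engine additionally handles late data via watermarks; the bucketing arithmetic itself is exactly the floor-division shown here.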

Posted 1 month ago

Apply

4.0 - 5.0 years

12 - 14 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

- ETL Process Design: Designing and developing ETL processes using Talend for data integration and transformation.
- Data Extraction: Extracting data from various sources, including databases, APIs, and flat files.
- Data Transformation: Transforming data to meet business requirements and ensuring data quality.
- Data Loading: Loading transformed data into target systems, such as data warehouses or data lakes.
- Job Scheduling: Scheduling and automating ETL jobs using Talend's scheduling tools.
- Performance Optimization: Optimizing ETL workflows for efficiency and performance.
- Error Handling: Implementing robust error handling and logging mechanisms in ETL processes.
- Data Profiling: Performing data profiling to identify data quality issues and inconsistencies.
- Documentation: Documenting ETL processes, data flow diagrams, and technical specifications.
- Collaboration with Data Teams: Working closely with data analysts, data scientists, and other stakeholders to understand data requirements.

Minimum 4 to maximum 7 years of relevant experience.
Locations: Mumbai, Delhi NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote

Posted 1 month ago

Apply

6.0 - 8.0 years

12 - 16 Lacs

Chennai

Work from Office

What will your day look like?
- Leading a dynamic team to deliver high-impact risk solutions across credit risk (underwriting, exposure controls, and line management).
- Working with stakeholders across product management, data science, and engineering to build relationships with partner teams and drive implementation of risk strategies.
- Managing challenging time constraints to ensure on-time delivery of projects.
- Working closely with partner teams to identify, evaluate, and recommend new data that helps in risk differentiation.
- Analyzing loss trends and simulating risk decisioning strategies that help optimize revenue, approval rates, etc.
- Working closely with the data science team to recommend credit risk decisioning and model deployment strategy.
- Building a risk scorecard that leverages both internal and external performance data for credit decisioning at underwriting and at account management reviews for existing customers.
- Collating analysis and building presentations that articulate the risk strategy for the leadership team.

To Help Us Level Up, You Will Ideally Have:
- A quantitative background in engineering, statistics, math, economics, business, or related disciplines.
- 5+ years of experience analyzing data using a database query language (e.g., SQL) and programming and developer tools such as Python, R, or Databricks in a finance or analytics field.
- 2+ years of experience leading a high-performing team of analysts.
- Experience working with non-traditional data such as social media (a big plus).
- Prior model-building experience (a plus, but not critical).
- An analytical mindset and strong problem-solving skills.
- Attention to detail and the ability to multitask.
- Comfort working in a fast-paced environment and dealing with ambiguity.
- Strong communication, interpersonal, and presentation skills, and the ability to engage and collaborate with multiple stakeholders across teams.
- An extremely proactive communication style: willing to raise flags when needed and keep team members informed of ongoing risk or fraud related activities.
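The "risk scorecard" mentioned above is, at its simplest, a points table over binned applicant attributes: each attribute falls into a bin, the bins map to points, and the points sum to a score used in decisioning. The bins and point values below are invented purely for illustration, not real credit policy.

```python
# Each attribute maps to (upper_bound, points) bins; the first bin whose
# upper bound the value does not exceed wins.
SCORECARD = {
    "utilization": [(0.30, 40), (0.70, 20), (float("inf"), 0)],   # lower is better
    "months_on_book": [(6, 0), (24, 15), (float("inf"), 30)],     # higher is better
}

def score(applicant: dict) -> int:
    """Sum the points of the first matching bin for each attribute."""
    total = 0
    for attr, bins in SCORECARD.items():
        value = applicant[attr]
        for upper, points in bins:
            if value <= upper:
                total += points
                break
    return total

print(score({"utilization": 0.25, "months_on_book": 36}))  # 40 + 30 = 70
```

Production scorecards derive the bins and points from a fitted model (commonly logistic regression on weight-of-evidence features), but the scoring step at decision time is this same table lookup.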

Posted 1 month ago

Apply

2.0 - 5.0 years

6 - 10 Lacs

Bengaluru

Work from Office

What kind of person are we looking for?
We are seeking a highly skilled and motivated Business Intelligence Assistant Manager to join us. The successful candidate will be responsible for overseeing data analysis, developing insights, and supporting business strategies through data-driven decision making. This role requires a strong background in BI analytics, excellent people-management skills, and a deep understanding of fintech industry trends.

What would you get if you worked with us?
You'll work closely on problem statements and influence decisions that impact 42 million merchants. While supporting merchant experience strategy & operations, you'll own data visibility, dashboarding, insights, and the resulting strategic decision making. By creating powerful narratives based on support data & ticket insights, you'll help the merchant experience strategy & operations team prioritize and achieve ambitious goals.

What would you get to do in this role?
- Be an integral part of the merchant experience strategy team and define the critical metrics to understand MX performance.
- Monitor performance trends and analyze data for any interventions done to improve merchant experience.
- Collaborate closely with internal merchant experience teams such as process design, product operations, operations, and automation to highlight problems and inefficiencies through the right analytical problem statements.
- Own the entire insight-generation process and build narratives through deep, thorough analysis that provides unbiased answers on the identified problem statement. Also surface previously unidentified problem statements based on new data insights. These insights and decisions will be influenced by your presentation of the evidence, backed by data-driven hypotheses.
- Identify and help stakeholders prioritize improvement opportunities by co-owning experience and business metrics.
- Move past being just the "data person" and contribute your own ideas on how to improve the critical merchant experience metrics. Act like a business owner and leverage data to influence stakeholder decisions.
- Work with the central analytics team to ensure that dashboards are designed and built in a way that makes it easy for teams to consume the data they need.

What do you need to have to apply for this position?
- Minimum 3-5 years of analytics experience in relevant roles.
- Ability to lead and mentor a team of data analysts, ensuring high performance and continuous development.
- Ability to manage multiple projects/BUs and priorities simultaneously.
- Ability to present findings and insights to senior management and other stakeholders in a clear and concise manner.
- Ability to collaborate with business units to understand their needs and provide data-driven recommendations to support strategic initiatives.
- Strong problem-solving and analytical skills, along with strong stakeholder-management skills.
- Ability to identify opportunities for process improvement and implement best practices in data analysis and reporting.
- A penchant for business and the curiosity to understand how the product works.
- Ability to clearly explain thoughts and ideas, verbally or in writing. Candidates who can explain the story behind their analysis will find themselves at an advantage.
- Intuition for data and the ability to handle big data sources.
- Strong working knowledge of Excel and visualization tools like Power BI, Tableau, and Qlik Sense.
- Ability to write complex SQL queries to manipulate and consolidate multiple data sources for dashboarding and analysis is a must.
- Understanding of data-analysis languages such as R or Python and of core statistical concepts is good to have.
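The SQL-heavy dashboarding work described above boils down to joining and aggregating several sources into one consumable view. A minimal sketch, with sqlite3 standing in for the warehouse and an invented tickets/merchants schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE tickets (merchant_id INTEGER, status TEXT);
    CREATE TABLE merchants (merchant_id INTEGER, segment TEXT);
    INSERT INTO tickets VALUES (1, 'open'), (1, 'closed'), (2, 'open');
    INSERT INTO merchants VALUES (1, 'enterprise'), (2, 'smb');
""")

# One query consolidating both sources for a dashboard tile:
rows = conn.execute("""
    SELECT m.segment,
           COUNT(*) AS total_tickets,
           SUM(CASE WHEN t.status = 'open' THEN 1 ELSE 0 END) AS open_tickets
    FROM tickets t
    JOIN merchants m USING (merchant_id)
    GROUP BY m.segment
    ORDER BY m.segment
""").fetchall()
print(rows)  # [('enterprise', 2, 1), ('smb', 1, 1)]
```

The conditional-SUM pattern (`SUM(CASE WHEN … THEN 1 ELSE 0 END)`) is the workhorse for computing several dashboard metrics in a single pass over the joined data.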

Posted 2 months ago

Apply

4 - 7 years

7 - 11 Lacs

Hyderabad

Work from Office

Responsibilities:
- Design, develop, and maintain scalable data pipelines using Python and AWS Redshift
- Optimize and tune Redshift queries, schemas, and performance for large-scale datasets
- Implement ETL/ELT processes to ensure accurate and timely data availability
- Collaborate with data analysts, engineers, and product teams to understand data requirements
- Ensure data quality, consistency, and integrity across systems
- Automate data workflows and improve pipeline efficiency using scripting and orchestration tools
- Monitor data pipeline performance and troubleshoot issues proactively
- Maintain documentation for data models, pipelines, and system configurations
- Ensure compliance with data governance and security standards
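A recurring concern in Redshift pipelines like these is keeping loads idempotent, so a retried job does not duplicate rows. The idea is sketched here with sqlite3's upsert as a stand-in; in Redshift itself this is typically done with a staging table plus DELETE/INSERT or MERGE. The table and columns are illustrative.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE metrics (day TEXT PRIMARY KEY, clicks INTEGER)")

def load_batch(conn, rows):
    """Upsert so re-running the same batch leaves one row per key."""
    conn.executemany(
        "INSERT INTO metrics(day, clicks) VALUES (?, ?) "
        "ON CONFLICT(day) DO UPDATE SET clicks = excluded.clicks",
        rows,
    )

batch = [("2024-01-01", 10), ("2024-01-02", 7)]
load_batch(conn, batch)
load_batch(conn, batch)  # retry of the same batch: no duplicate rows
print(conn.execute("SELECT COUNT(*) FROM metrics").fetchone()[0])  # 2
```

Choosing a natural key per batch (here, the day) is what makes the retry safe: the second run overwrites rather than appends.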

Posted 3 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies