
1524 Looker Jobs - Page 45

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

1.0 - 2.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Job Description: In this role, the Social Media Assistant will work with the wider social team to provide their subject matter expertise in paid social and assist in community management across our client's social channels. The Social Media Assistant will execute tasks to the highest standard and manage the execution of manual day-to-day tasks for clients. They are a proactive self-starter who gets the job done with exceptional attention to detail, communication and planning skills. The ideal candidate will have experience working in performance marketing and executing paid social campaigns across different platforms, as well as past experience in community management. They must be a self-starter with exceptional time management skills, be an excellent communicator and thrive in a fast-paced environment. We're seeking a reliable team player with the ability to work autonomously when needed and a keen eye for detail.

Paid Social Responsibilities:
- Execution and optimisation of paid social media activity across platforms such as Facebook Ads Manager, Pinterest and TikTok.
- Budget and KPI/results tracking.
- Ensure deadlines and task delivery are met with exceptional standards.
- Monitoring campaign performance and looking for opportunities to scale and improve performance.
- Reporting on campaign performance and making recommendations on how to improve.
- Work closely with Pattern's Social team to ensure campaigns are executed to the highest standard with no errors.

Community Management Responsibilities:
- Monitor client social media channels (Facebook, Twitter, Instagram, LinkedIn, YouTube, Google Maps, etc.) for conversations related to our brand, products, and industry trends.
- Respond promptly to comments, messages, and inquiries in a professional and friendly manner.
- Cultivate and nurture relationships with our online community to build trust and loyalty.
- Assist with scheduling social media content that resonates with our audience.

Skills and Qualifications:
- Marketing, Communications and/or Social Media university graduate.
- Proficient in various social media ad managers including Meta, TikTok, Pinterest, LinkedIn, and YouTube.
- At least 1-2 years of experience in a paid social or performance marketing role, plus experience in community management.
- Experience executing and managing paid social campaigns.
- Experience with third-party social media scheduling and analytical tools.
- Proven experience in social media management and/or community management.
- Excellent written and verbal communication skills.
- Strong understanding of social media platforms and their respective audiences.
- Ability to work independently and as part of a team in a fast-paced environment.
- Knowledge of social media analytics tools is a plus.
- Experience with Google Suite & Google Analytics, and data interpretation.
- Experience with Looker Studio and Zendesk.

Desired Traits & Competencies:
- Attention to detail – does not let important details slip through the cracks or derail a project.
- Analytical skills – able to structure and process qualitative or quantitative data and draw insightful conclusions from it. Exhibits a probing mind and achieves penetrating insights.
- Efficiency – able to produce significant output with minimal wasted effort.
- Proactivity – acts without being told what to do and brings new ideas to the company.
- Intelligence – learns quickly and demonstrates an ability to quickly and proficiently understand and absorb new information.
- Flexibility/adaptability – adjusts quickly to changing priorities and conditions and copes effectively with complexity and change.
- Enthusiasm – exhibits passion and excitement over work and has a can-do attitude.
- Communication – speaks and writes clearly and articulately without being overly verbose or talkative. Maintains this standard in all forms of written communication, including email.
- Teamwork – reaches out to all peers and cooperates with supervisors to establish an overall collaborative working relationship.
- Organisation & planning – plans, organises, schedules and budgets in an efficient, productive manner. Focuses on key priorities.

Pattern is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.

Posted 3 weeks ago

Apply

3.0 - 5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

This is a key position supporting a client organization with strong analytics and data science capabilities. There is significant revenue and future opportunity associated with this role.

Job Description:
- Develop and maintain data tables (management, extraction, harmonizing, etc.) using GCP/SQL/Snowflake. This involves designing, implementing, and writing optimized code, maintaining complex SQL queries to extract, transform, and load (ETL) data from various tables/sources, and ensuring data integrity and accuracy throughout the data pipeline process.
- Create and manage data visualizations using Tableau/Power BI. This involves designing and developing interactive dashboards and reports, ensuring visualizations are user-friendly, insightful, and aligned with business requirements, and regularly updating and maintaining dashboards to reflect the latest data and insights.
- Generate insights and reports to support business decision-making. This includes analyzing data trends and patterns to provide actionable insights, preparing comprehensive reports that summarize key findings and recommendations, and presenting data-driven insights to stakeholders to inform strategic decisions.
- Handle ad-hoc data requests and provide timely solutions. This involves responding to urgent data requests from various departments, quickly gathering, analyzing, and delivering accurate data to meet immediate business needs, and ensuring ad-hoc solutions are scalable and reusable for future requests.
- Collaborate with stakeholders to understand and solve open-ended questions. This includes engaging with business users to identify their data needs and challenges, working closely with cross-functional teams to develop solutions for complex, open-ended problems, and translating business questions into analytical tasks to deliver meaningful results.
- Design and create wireframes and mockups for data visualization projects. This involves developing wireframes and mockups to plan and communicate visualization ideas, collaborating with stakeholders to refine and finalize visualization designs, and ensuring that wireframes and mockups align with user requirements and best practices.
- Communicate findings and insights effectively to both technical and non-technical audiences. This includes preparing clear and concise presentations to share insights with diverse audiences, tailoring communication styles to suit the technical proficiency of the audience, and using storytelling techniques to make data insights more engaging and understandable.
- Perform data manipulation and analysis using Python (see the sketch after this list). This includes utilizing Python libraries such as Pandas, NumPy, and SciPy for data cleaning, transformation, and analysis, developing scripts and automation tools to streamline data processing tasks, and conducting statistical analysis to generate insights from large datasets.
- Implement basic machine learning models using Python. This involves developing and applying basic machine learning models to enhance data analysis, using libraries such as scikit-learn and TensorFlow for model development and evaluation, and interpreting and communicating the results of machine learning models to stakeholders.
- Automate data processes using Python. This includes creating automation scripts to streamline repetitive data tasks, implementing scheduling and monitoring of automated processes to ensure reliability, and continuously improving automation workflows to increase efficiency.
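For context on the Python data-manipulation work described above, the sketch below shows the general shape of a Pandas cleaning-and-aggregation script. It is illustrative only: the file name, column names, and grouping keys are hypothetical assumptions, not details from the posting.

```python
# Minimal illustration of Pandas-based cleaning and aggregation of the kind described above.
# All file, column, and table names are hypothetical placeholders.
import pandas as pd

def load_and_clean(path: str) -> pd.DataFrame:
    """Read raw order data, normalize types, and drop obviously bad rows."""
    df = pd.read_csv(path, parse_dates=["order_date"])
    df = df.dropna(subset=["order_id", "revenue"])       # remove incomplete records
    df["revenue"] = pd.to_numeric(df["revenue"], errors="coerce")
    return df[df["revenue"] >= 0]                         # discard negative/garbled values

def weekly_summary(df: pd.DataFrame) -> pd.DataFrame:
    """Aggregate revenue by week and region for a downstream dashboard."""
    return (
        df.set_index("order_date")
          .groupby([pd.Grouper(freq="W"), "region"])["revenue"]
          .sum()
          .reset_index()
    )

if __name__ == "__main__":
    cleaned = load_and_clean("orders.csv")                # hypothetical input file
    print(weekly_summary(cleaned).head())
```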
Requirements:
- 3 to 5 years of experience in data analysis, reporting, and visualization. This includes a proven track record of working on data projects and delivering impactful results, and experience in a similar role within a fast-paced environment.
- Proficiency in GCP/SQL/Snowflake/Python for data manipulation. This includes strong knowledge of GCP/SQL/Snowflake services and tools, advanced SQL skills for complex query writing and optimization, and expertise in Python for data analysis and automation.
- Strong experience with Tableau/Power BI/Looker Studio for data visualization. This includes a demonstrated ability to create compelling and informative dashboards, and familiarity with best practices in data visualization and user experience design.
- Excellent communication skills, with the ability to articulate complex information clearly. This includes strong written and verbal communication skills, and the ability to explain technical concepts to non-technical stakeholders.
- Proven ability to solve open-ended questions and handle ad-hoc requests. This includes creative problem-solving skills, a proactive approach to challenges, and flexibility to adapt to changing priorities and urgent requests.
- Strong problem-solving skills and attention to detail. This includes a keen eye for detail and accuracy in data analysis and reporting, and the ability to identify and resolve data quality issues.
- Experience in creating wireframes and mockups. This includes proficiency in design tools and effectively translating ideas into visual representations.
- Ability to work independently and as part of a team. This includes being self-motivated, able to manage multiple tasks simultaneously, and having a collaborative mindset and willingness to support team members.

Location: Bangalore
Brand: Merkle
Time Type: Full time
Contract Type: Permanent

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Source: LinkedIn

Job Title: Product Manager – UPI Autopay (Paytm App)
Experience: 4–8 years

About Us: Paytm is India's leading mobile payments and financial services distribution company. Pioneer of the mobile QR payments revolution in India, Paytm builds technologies that help small businesses with payments and commerce. Paytm's mission is to serve half a billion Indians and bring them to the mainstream economy with the help of technology.

About the Role: We are looking for a highly driven and data-oriented Product Manager to own and scale the UPI Autopay experience on the Paytm App. This role is central to growing Paytm's footprint in recurring payments by building seamless, high-performing, and innovative products in collaboration with various TSPs (Technology Service Providers) and internal stakeholders.

Key Responsibilities:
- Product Ownership: Drive the vision, strategy, roadmap, and execution for UPI Autopay on the Paytm App across different use cases (subscriptions, billers, investments, OTT, etc.).
- Partner Collaboration: Coordinate with multiple TSPs (for different handles like @ptsbi, @ptaxis, @ptyes, @ptybl, @paytm) to design, develop, and integrate UPI Autopay flows and features.
- Success Rate Monitoring: Work with engineering, QA, and switch teams to continuously monitor and improve success rates and reliability of mandate creation and execution.
- User Experience: Identify UX gaps and work with design and research teams to enhance discoverability, trust, and ease of use for UPI Autopay flows.
- Data & Insights: Analyze user journeys, funnel drop-offs, transaction data, and feedback to drive data-led product decisions. Define and track key metrics like success rate, activation rate, drop-offs, churn, etc.
- Market Expansion: Identify new opportunities and segments for UPI Autopay adoption, drive merchant and category-specific customizations, and work with GTM teams to grow market share.
- Compliance & NPCI Alignment: Ensure product compliance with NPCI UPI Autopay guidelines, participate in new NPCI initiatives (like e-mandate on RuPay Credit, one-time mandates, BIMA ASBA, etc.), and manage regulatory approvals where required.

What We're Looking For:
- 4–8 years of overall experience, with at least 3 years in product management.
- Prior experience in payments (preferably UPI or recurring payments) is strongly preferred.
- Hands-on experience managing cross-functional teams – tech, design, operations, compliance.
- Strong analytical mindset with proficiency in data tools (SQL, Mixpanel, Looker, etc.) and wireframing/user-journey tools (Figma, Whimsical, Balsamiq, etc.).
- Ability to manage external partners and vendors (TSPs, PSPs, NPCI) and launch features in collaboration with growth, business and marketing.
- Excellent communication, documentation, and stakeholder management skills.
- Self-starter who thrives in a fast-paced, ambiguous, high-growth environment.

Good to Have:
- Experience in consumer fintech or large-scale transactional platforms.
- Understanding of UPI switch architecture or direct bank integrations.
- Exposure to billing/subscription-based products or ecosystems.

Why Join Us? At Paytm, you'll be at the forefront of digital payments innovation in India. If you love building products that touch millions of lives, solving complex technical and operational problems, and want to play a key role in shaping the future of UPI Autopay – we want to hear from you!

Posted 3 weeks ago

Apply

8.0 - 13.0 years

20 - 35 Lacs

Kolkata, Hyderabad, Bengaluru

Work from Office

Source: Naukri

Job Description:

Role Overview: We are looking for an experienced BI/Analytics Lead with a strong background in delivery and extensive experience with multiple BI tools. The ideal candidate will have a proven track record in designing and building semantic data layers for self-service discovery and reports, as well as managing reporting platforms and centers of excellence (CoE).

Key Responsibilities:
- Lead and manage BI and analytics projects, ensuring successful delivery.
- Design and build semantic data layers to enable self-service data discovery and reporting.
- Oversee report migration and rationalization projects.
- Leverage Generative AI (GenAI) for conversational AI and advanced analytics.
- Manage and optimize reporting platforms, ensuring high performance and reliability.
- Establish and lead a Reporting Center of Excellence (CoE).
- Collaborate with stakeholders to understand business requirements and translate them into technical solutions.
- Mentor and guide a team of BI developers and analysts.

Required Skills and Experience:
- Extensive experience with BI tools such as Power BI, Tableau, and Qlik.
- Proven experience in designing and building semantic data layers.
- Strong project management skills with a focus on delivery.
- Hands-on experience with report migration and rationalization.
- Experience leveraging GenAI for conversational AI and analytics.
- Proven track record in designing and building Reporting CoEs.
- Excellent communication and interpersonal skills.
- Strong analytical and problem-solving abilities.
- Ability to work in a fast-paced, dynamic environment.

Preferred Qualifications:
- Certifications in BI tools (e.g., Microsoft Certified: Data Analyst Associate, Tableau Desktop Specialist).
- Experience with other BI tools and platforms.
- Familiarity with data governance and security best practices.
- Knowledge of machine learning and AI technologies.

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

About the Client: Our client is a global IT services company headquartered in Southborough, Massachusetts, USA. Founded in 1996, with revenue of $1.8B and 35,000+ associates worldwide, it specializes in digital engineering and IT services, helping clients modernize their technology infrastructure, adopt cloud and AI solutions, and accelerate innovation. It partners with major firms in banking, healthcare, telecom, and media, and is known for combining deep industry expertise with agile development practices, enabling scalable and cost-effective digital transformation. The company operates in over 50 locations across more than 25 countries, has delivery centers in Asia, Europe, and North America, and is backed by Baring Private Equity Asia.

Job Description: At Storable, we're on a mission to power the future of storage. Our innovative platform helps businesses manage, track, and grow their self-storage operations, and we're looking for a Data Manager to join our data-driven team. Storable is committed to leveraging cutting-edge technologies to improve the efficiency, accessibility, and insights derived from data, empowering our team to make smarter decisions and foster impactful growth. As a Data Manager, you will play a pivotal role in overseeing and shaping our data operations, ensuring that our data is organized, accessible, and effectively managed across the organization. You will lead a talented team, work closely with cross-functional teams, and drive the development of strategies to enhance data quality, availability, and security.

Key Responsibilities:
- Lead Data Management Strategy: Define and execute the data management vision, strategy, and best practices, ensuring alignment with Storable's business goals and objectives.
- Oversee Data Pipelines: Design, implement, and maintain scalable data pipelines using industry-standard tools to efficiently process and manage large-scale datasets (a minimal orchestration sketch follows this listing).
- Ensure Data Quality & Governance: Implement data governance policies and frameworks to ensure data accuracy, consistency, and compliance across the organization.
- Manage Cross-Functional Collaboration: Partner with engineering, product, and business teams to make data accessible and actionable, and ensure it drives informed decision-making.
- Optimize Data Infrastructure: Leverage modern data tools and platforms such as AWS, Apache Airflow, and Apache Iceberg to create an efficient, reliable, and scalable data infrastructure.
- Monitor & Improve Performance.
- Mentorship & Leadership: Lead and develop a team of data engineers and analysts, fostering a collaborative environment where innovation and continuous improvement are valued.

Qualifications:
- Proven Expertise in Data Management: Significant experience in managing data infrastructure, data governance, and optimizing data pipelines at scale.
- Technical Proficiency: Strong hands-on experience with data tools and platforms such as Apache Airflow, Apache Iceberg, and AWS services (S3, Lambda, Redshift, Glue).
- Data Pipeline Mastery: Familiarity with designing, implementing, and optimizing data pipelines and workflows in Python or other languages for data processing.
- Experience with Data Governance: Solid understanding of data privacy, quality control, and governance best practices.
- Leadership Skills: Ability to lead and mentor teams, influence stakeholders, and drive data initiatives across the organization.
- Analytical Mindset: Strong problem-solving abilities and a data-driven approach to improving business operations.
- Excellent Communication: Ability to communicate complex data concepts to both technical and non-technical stakeholders effectively.
- Bonus Points: Experience with visualization tools (Looker, Tableau) and reporting frameworks to provide actionable insights.

Job Title: AWS with Python
Key Skills: AWS, Python, PySpark, AWS S3, AWS Athena, AWS Redshift Spectrum, Apache Airflow, AWS Glue, Apache Iceberg, ETL pipeline design/architecture, data modeling, monitoring/data pipeline optimisation, CDC implementations, AWS Database Migration Service
Job Locations: Any Virtusa location
Experience: 5-7 years
Education Qualification: Any Graduation
Work Mode: Hybrid
Employment Type: Contract
Notice Period: Immediate - 10 Days
Payroll: People Prime Worldwide
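As context for the Airflow-based pipeline skills this listing calls for, here is a minimal, hedged sketch of an Airflow 2.x DAG chaining an extract step and a load step. The DAG id, schedule, and task logic are placeholder assumptions, not details from the posting.

```python
# Hedged sketch of an Airflow 2.x DAG wiring together the kind of extract/load steps described.
# DAG id, schedule, and any S3/warehouse details are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_to_s3(**context):
    # Placeholder: pull a batch from a source system and stage it in object storage.
    print("extracting batch for", context["ds"])

def load_to_warehouse(**context):
    # Placeholder: copy the staged files into warehouse / Iceberg tables.
    print("loading batch for", context["ds"])

with DAG(
    dag_id="example_daily_ingest",     # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # parameter name used by Airflow 2.4+
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_to_s3", python_callable=extract_to_s3)
    load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)
    extract >> load
```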

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

About the Client: Our client is a global IT services company headquartered in Southborough, Massachusetts, USA. Founded in 1996, with revenue of $1.8B and 35,000+ associates worldwide, it specializes in digital engineering and IT services, helping clients modernize their technology infrastructure, adopt cloud and AI solutions, and accelerate innovation. It partners with major firms in banking, healthcare, telecom, and media, and is known for combining deep industry expertise with agile development practices, enabling scalable and cost-effective digital transformation. The company operates in over 50 locations across more than 25 countries, has delivery centers in Asia, Europe, and North America, and is backed by Baring Private Equity Asia.

Job Title: AWS Data Engineer
Location: Pan India
Experience: 6-8 Years
Job Type: Contract to Hire
Notice Period: Immediate Joiners
Mandatory Skills: AWS services (S3, Lambda, Redshift, Glue), Python, PySpark, SQL

Job Description: At Storable, we're on a mission to power the future of storage. Our innovative platform helps businesses manage, track, and grow their self-storage operations, and we're looking for a Data Manager to join our data-driven team. Storable is committed to leveraging cutting-edge technologies to improve the efficiency, accessibility, and insights derived from data, empowering our team to make smarter decisions and foster impactful growth. As a Data Manager, you will play a pivotal role in overseeing and shaping our data operations, ensuring that our data is organized, accessible, and effectively managed across the organization. You will lead a talented team, work closely with cross-functional teams, and drive the development of strategies to enhance data quality, availability, and security.

Key Responsibilities:
- Lead Data Management Strategy: Define and execute the data management vision, strategy, and best practices, ensuring alignment with Storable's business goals and objectives.
- Oversee Data Pipelines: Design, implement, and maintain scalable data pipelines using industry-standard tools to efficiently process and manage large-scale datasets.
- Ensure Data Quality & Governance: Implement data governance policies and frameworks to ensure data accuracy, consistency, and compliance across the organization.
- Manage Cross-Functional Collaboration: Partner with engineering, product, and business teams to make data accessible and actionable, and ensure it drives informed decision-making.
- Optimize Data Infrastructure: Leverage modern data tools and platforms such as AWS, Apache Airflow, and Apache Iceberg to create an efficient, reliable, and scalable data infrastructure.
- Monitor & Improve Performance.
- Mentorship & Leadership: Lead and develop a team of data engineers and analysts, fostering a collaborative environment where innovation and continuous improvement are valued.

Qualifications:
- Proven Expertise in Data Management: Significant experience in managing data infrastructure, data governance, and optimizing data pipelines at scale.
- Technical Proficiency: Strong hands-on experience with data tools and platforms such as Apache Airflow, Apache Iceberg, and AWS services (S3, Lambda, Redshift, Glue).
- Data Pipeline Mastery: Familiarity with designing, implementing, and optimizing data pipelines and workflows in Python or other languages for data processing.
- Experience with Data Governance: Solid understanding of data privacy, quality control, and governance best practices.
- Leadership Skills: Ability to lead and mentor teams, influence stakeholders, and drive data initiatives across the organization.
- Analytical Mindset: Strong problem-solving abilities and a data-driven approach to improving business operations.
- Excellent Communication: Ability to communicate complex data concepts to both technical and non-technical stakeholders effectively.
- Bonus Points: Experience with visualization tools (Looker, Tableau) and reporting frameworks to provide actionable insights.

Posted 3 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

About the Client: Our client is a global IT services company headquartered in Southborough, Massachusetts, USA. Founded in 1996, with revenue of $1.8B and 35,000+ associates worldwide, it specializes in digital engineering and IT services, helping clients modernize their technology infrastructure, adopt cloud and AI solutions, and accelerate innovation. It partners with major firms in banking, healthcare, telecom, and media, and is known for combining deep industry expertise with agile development practices, enabling scalable and cost-effective digital transformation. The company operates in over 50 locations across more than 25 countries, has delivery centers in Asia, Europe, and North America, and is backed by Baring Private Equity Asia.

Key Skills: Scripting (SQL, Python, PySpark), Storage (AWS S3), Querying (AWS Athena, AWS Redshift Spectrum), Orchestration (Apache Airflow, AWS Glue)
Job Locations: Hyderabad (Pan India)
Experience: 5-7 years
Education Qualification: Any Graduation
Work Mode: Hybrid
Employment Type: Contract
Notice Period: Immediate - 10 Days
Payroll: People Prime Worldwide

Skill Bucket – Must Have:
- Scripting: SQL, Python, PySpark
- Storage: AWS S3
- Querying: AWS Athena, AWS Redshift Spectrum
- Orchestration: Apache Airflow, AWS Glue
- Concepts: Apache Iceberg, ETL pipeline design/architecture, data modeling, monitoring/data pipeline optimisation, CDC implementations
- Integration: AWS Database Migration Service

Good to Have:
- CI/CD (Jenkins, GitHub Actions, etc.) and DevOps practices
- Infrastructure as Code (Terraform)
- Looker, Tableau, or other BI/analytics/visualisation tools
- Distributed data processing concepts
- Data quality/governance

Job Description: At Storable, we're on a mission to power the future of storage. Our innovative platform helps businesses manage, track, and grow their self-storage operations, and we're looking for a Data Manager to join our data-driven team. Storable is committed to leveraging cutting-edge technologies to improve the efficiency, accessibility, and insights derived from data, empowering our team to make smarter decisions and foster impactful growth. As a Data Manager, you will play a pivotal role in overseeing and shaping our data operations, ensuring that our data is organized, accessible, and effectively managed across the organization. You will lead a talented team, work closely with cross-functional teams, and drive the development of strategies to enhance data quality, availability, and security.

Key Responsibilities:
- Lead Data Management Strategy: Define and execute the data management vision, strategy, and best practices, ensuring alignment with Storable's business goals and objectives.
- Oversee Data Pipelines: Design, implement, and maintain scalable data pipelines using industry-standard tools to efficiently process and manage large-scale datasets.
- Ensure Data Quality & Governance: Implement data governance policies and frameworks to ensure data accuracy, consistency, and compliance across the organization.
- Manage Cross-Functional Collaboration: Partner with engineering, product, and business teams to make data accessible and actionable, and ensure it drives informed decision-making.
- Optimize Data Infrastructure: Leverage modern data tools and platforms such as AWS, Apache Airflow, and Apache Iceberg to create an efficient, reliable, and scalable data infrastructure.
- Monitor & Improve Performance.
- Mentorship & Leadership: Lead and develop a team of data engineers and analysts, fostering a collaborative environment where innovation and continuous improvement are valued.

Qualifications:
- Proven Expertise in Data Management: Significant experience in managing data infrastructure, data governance, and optimizing data pipelines at scale.
- Technical Proficiency: Strong hands-on experience with data tools and platforms such as Apache Airflow, Apache Iceberg, and AWS services (S3, Lambda, Redshift, Glue).
- Data Pipeline Mastery: Familiarity with designing, implementing, and optimizing data pipelines and workflows in Python or other languages for data processing.
- Experience with Data Governance: Solid understanding of data privacy, quality control, and governance best practices.
- Leadership Skills: Ability to lead and mentor teams, influence stakeholders, and drive data initiatives across the organization.
- Analytical Mindset: Strong problem-solving abilities and a data-driven approach to improving business operations.
- Excellent Communication: Ability to communicate complex data concepts to both technical and non-technical stakeholders effectively.
- Bonus Points: Experience with visualization tools (Looker, Tableau) and reporting frameworks to provide actionable insights.

Posted 3 weeks ago

Apply

6.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Source: LinkedIn

About the Role: We are seeking an experienced and highly organized Digital Program Manager to lead the orchestration of our digital release lifecycle—from web updates to campaign launches. This role bridges the gap between our content, SEO, development, analytics, and paid media teams to ensure every initiative goes live with precision, speed, and measurable impact. You will own the digital release workflow, monitor performance, and drive alignment across departments—ensuring high-quality web experiences that support user engagement and business conversion goals.

Key Responsibilities:

Digital Release Management
· Own and manage the digital release calendar with clear timelines, dependencies, and stakeholders
· Coordinate all steps in the digital workflow: content creation and SEO optimization; development and QA; analytics tracking setup (GA4, UTM); advertising campaign alignment; final approval and go-live
· Serve as the single point of accountability for timely, error-free website and campaign launches

Quality Control & Site Performance
· Conduct pre-launch QA using tools like SEMrush, Screaming Frog, and PageSpeed Insights
· Ensure pages meet SEO, mobile responsiveness, and accessibility standards
· Monitor uptime, site speed, and technical health post-release

Cross-Functional Alignment
· Facilitate weekly syncs across content, SEO, dev, analytics, and media teams
· Act as a communication hub and escalation point for blockers
· Maintain documentation, checklists, and release SOPs to standardize execution

Continuous Improvement
· Run post-release retrospectives to identify gaps and refine processes
· Monitor performance analytics and implement lessons learned into future cycles
· Ensure roadmap visibility and alignment across marketing and product functions

Key Performance Indicators (KPIs):
· Release Management: % of website releases delivered on time
· Website Health: website uptime, load speed, SEO error rate
· User Engagement: bounce rate, average session duration, total page views
· Conversion: form submissions, content downloads, demo requests
· Roadmap Delivery: % of digital roadmap items delivered on schedule
· Internal Alignment: stakeholder satisfaction score or feedback post-launch
· Issue Resolution: average time to resolve web-related issues or bugs

Required Skills & Experience:
· 3–6 years of experience in digital program management, project management, or web operations
· Solid knowledge of SEO, analytics tracking (GA4, GTM), UTM tagging, and CMS workflows
· Proficiency with SEMrush, Google PageSpeed, Search Console, and QA auditing tools
· Familiarity with project tools like Asana, Jira, Notion, or Trello
· Strong written and verbal communication skills; able to align cross-functional teams
· Organized, proactive, and focused on continuous process improvement

Tools You'll Use:
· Google Analytics 4, Tag Manager, Looker Studio
· SEMrush, Screaming Frog, PageSpeed Insights
· CMS platforms (e.g., WordPress, Webflow)
· Project & collaboration tools (Asana, Notion, Jira, Slack)
· Microsoft products (Docs, Sheets, Slides)

Success in This Role Looks Like:
· Digital releases go live on time, fully optimized, and error-free
· Stakeholders are informed, engaged, and aligned
· Website engagement metrics improve with each iteration
· Technical SEO issues and bugs are identified and resolved proactively
· Campaigns launch with full tracking, aligned messaging, and measurable ROI

Posted 3 weeks ago

Apply

0.0 - 2.0 years

0 Lacs

Nagpur, Maharashtra, India

Remote

Source: LinkedIn

About the Job
Job Title: Data Analyst Trainer
Company: EdEdge Groups
Location: Remote
Experience Required: 0-2 years

Company Description: EdEdge Groups is a pioneering data intelligence company founded in 2023, specializing in developing advanced solutions that transform complex data into actionable insights for businesses. Our platforms empower organizations to uncover hidden patterns, optimize operations, and make informed decisions while prioritizing privacy and security.

Role Description: This is a full-time remote role for a Data Analyst Trainer at EdEdge Groups. The Data Analyst Trainer will be responsible for the following:
- Provide project management, documentation, content creation, training, and overall monitoring support to batch members.
- Train users via online webinars on big data tools and database maintenance.
- Ensure direct communication and implementation of corporate policies and procedures across corporate locations, including Australia, the United States, and the United Kingdom.
- Manage projects according to the standards and processes defined in the quality management system.
- Collaborate with sales, marketing, and customer success teams to drive product adoption, gather feedback, and iterate on product enhancements.
- Monitor product performance metrics and user feedback to ensure continuous improvement and optimize user satisfaction.

Key Responsibilities:
- Data Analysis: Analyze and interpret complex data sets to derive actionable insights.
- Market Research: Perform market research to understand trends and inform business strategies.
- Data Analytics: Apply data analytics techniques to generate insights and support decision-making.
- Data Visualization: Develop and maintain data visualizations using tools like Looker, Tableau, Power BI, matplotlib, and seaborn.
- Data Manipulation: Extract, manipulate, and analyze data using SQL and Python.
- Collaboration: Work with various teams to understand their data needs and deliver insights.
- Data Research: Conduct in-depth data research to gather relevant information.

Qualifications:
- Education: Bachelor's degree in Data Science, Statistics, Computer Science, or a related field.
- Experience: 0-2 years in data research, data analytics, or a related field.
- Skills: data research and data analytics skills; analytical skills and market research experience; strong communication skills; ability to extract and manipulate data; experience with data visualization tools (Looker, Tableau, Power BI, matplotlib, seaborn); attention to detail and problem-solving abilities; ability to extract, manipulate, and analyze data using advanced SQL; advanced Excel techniques for data manipulation and analysis.

Posted 3 weeks ago

Apply

12.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Source: LinkedIn

As our Principal Machine Learning (ML) / Personalization Engineer, you will:
- Architect and deploy ML-based personalization systems for our suite of digital news products, including recommender systems for content ranking, homepage personalization, push notification targeting, and audience segmentation.
- Collaborate closely with editors, product managers, and analysts to integrate machine learning into the editorial workflow—making content creation, packaging, and distribution smarter and audience-aware.
- Analyze user behavior and content consumption patterns using large-scale datasets to build user understanding models and inform personalization strategies.
- Own the end-to-end ML pipeline: from data acquisition, feature engineering, model training & evaluation, to deployment and real-time inference.
- Drive experimentation culture: lead A/B testing and iterative optimization of recommendation and ranking models.
- Stay on top of global trends in personalization, news AI, large language models (LLMs), and recommendation systems, and bring best-in-class solutions to our stack.

Who you need to be:
- Bachelor's or Master's degree in Computer Science, Data Science, Statistics, or a related field.
- 8–12 years of experience in machine learning, ideally in recommendation systems, personalization, or search relevance.
- Strong experience with Python and ML frameworks like TensorFlow, PyTorch, or Scikit-learn.
- Hands-on with recommendation engines (collaborative filtering, content-based, hybrid models) and vector similarity models (see the sketch after this list).
- Experience with real-time data processing frameworks and deploying models in production.
- Solid understanding of SQL and data platforms (e.g., Snowflake, BigQuery, or Redshift).
- Exposure to BI tools (Metabase, Looker, Tableau) is a plus.
- Comfortable navigating ambiguous, fast-paced environments and leading cross-functional initiatives.
- Excellent communication and collaboration skills—able to explain complex ML concepts to non-technical stakeholders.
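To illustrate the recommender-system work referenced above, here is a minimal item-based collaborative-filtering sketch using cosine similarity over a user-item matrix. The interaction data and function names are fabricated for illustration; a production news recommender would add recency, diversity, and real-time features on top of something like this.

```python
# Tiny item-based collaborative filtering sketch (cosine similarity over a user-item matrix).
# The interaction data is fabricated for illustration only.
import numpy as np
from sklearn.metrics.pairwise import cosine_similarity

# rows = users, columns = articles; values = implicit feedback (e.g. click counts)
interactions = np.array([
    [3, 0, 1, 0],
    [2, 1, 0, 0],
    [0, 0, 4, 1],
    [0, 2, 1, 3],
])

item_sim = cosine_similarity(interactions.T)          # article-to-article similarity matrix

def recommend(user_idx: int, top_k: int = 2) -> list[int]:
    """Score unseen articles by a similarity-weighted sum of the user's history."""
    user_vec = interactions[user_idx]
    scores = item_sim @ user_vec
    scores[user_vec > 0] = -np.inf                    # do not re-recommend already-seen items
    return list(np.argsort(scores)[::-1][:top_k])

print(recommend(user_idx=0))                          # ranked article indices for user 0
```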

Posted 3 weeks ago

Apply

0.0 - 15.0 years

0 Lacs

Delhi

Remote

Source: Indeed

Delhi, India
Designation: Partner
Position: Data Analyst Instructor Mentor (Part-Time)
Job Type: Consultant
Benefits: Revenue distribution or a fixed hourly rate, with potential for performance-based bonuses tied to training outcomes.
Reports to: Founder/CEO

Job Overview: The Data Analyst Mentor will provide strategic training and mentorship to Eduroids' students on a part-time basis, focusing on equipping them with industry-relevant data analytics skills. This role involves delivering hands-on training sessions, developing course materials, and offering personalised guidance to students to enhance their technical proficiency and career readiness.

Key Responsibilities:
- Training Delivery: Conduct weekend training sessions, focusing on data analysis tools, techniques, and methodologies.
- Curriculum Development: Develop and maintain course content that aligns with industry standards and the latest trends in data analytics.
- Hands-On Learning: Guide participants through practical exercises, real-world case studies, and data analysis projects to enhance their technical skills.
- Mentorship: Provide personalised support, answering queries and helping students understand and apply data analytics concepts.
- Industry Alignment: Ensure that all training sessions and materials reflect the latest advancements in data analytics tools, trends, and best practices.
- Assessment and Feedback: Monitor student progress through regular assessments, offer constructive feedback, and suggest improvement strategies.
- Knowledge Transfer: Share practical insights and real-world examples to bridge the gap between theoretical concepts and industry applications.

Key Measures:
- Student Progress: Track and evaluate student performance through assessments, ensuring skill acquisition and concept mastery.
- Industry Relevance: Maintain and update curriculum content to align with industry needs and emerging technologies in data analytics.
- Feedback Scores: Achieve high satisfaction ratings from participants for training quality and mentorship.
- Hands-On Projects: Ensure students complete practical projects showcasing their analytical capabilities.

Qualifications:
- Education: Bachelor's or Master's degree in Data Science, Statistics, Computer Science, or a related field.
- Experience: Minimum of 15 years of professional experience as a Data Analyst or in a related role; must possess real-time experience working with Fortune 500 companies; proven expertise in data analytics, data visualization, and reporting.
- Technical Skills: Proficiency in tools like SQL, Python, and R; advanced skills in data visualization platforms such as Tableau, Power BI, or Looker; knowledge of statistical analysis, data cleaning, and predictive analytics.
- Soft Skills: Excellent communication and presentation skills; ability to simplify complex data concepts for learners from diverse backgrounds; a passion for teaching, mentoring, and guiding the next generation of data professionals.

Personal Attributes:
- Enthusiastic about sharing knowledge and empowering learners.
- Resilient and adaptable, committed to continuous improvement.
- Collaborative mentor who fosters an engaging and inclusive learning environment.

Benefits:
- Competitive compensation based on hourly or project-based engagement.
- Flexible remote working options.
- Opportunity to shape the next generation of data analytics professionals.
- Supportive and innovative work culture.

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Source: LinkedIn

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibility:
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
- Engineering degree or equivalent experience
- 3+ years of ETL experience using SQL Server Integration Services (SSIS) within Visual Studio
- 3+ years of advanced SQL Server experience; certification in Microsoft SQL Server is a plus
- Experience using the GitHub platform for version control
- Experience using the GitHub Copilot AI-powered code assistant
- Experience developing within an agile (i.e. Scrum or Kanban) framework
- Healthcare experience
- API experience:
  - Proficiency in API design and architecture
  - Understanding of RESTful principles and GraphQL
  - Expertise in programming languages (e.g. JavaScript, Python, Java)
  - Knowledge of API security best practices
  - Proficiency with API documentation tools (e.g. Swagger/OpenAPI)
- Google Cloud Platform experience:
  - Utilizing Google Cloud Dataflow and Google Cloud Dataproc with SQL or Python in Jupyter Notebooks to load data into BigQuery and Google Cloud Storage (see the sketch after this list)
  - Implementing data processing jobs and managing data within BigQuery
  - Creating dashboards and visualizations for business users using Google Data Studio and Looker
  - Utilizing Google AI Platform to build and deploy machine learning models
  - Expertise in cloud data migration, security and administration
  - Migrating SQL Server databases from on-premises to Google Cloud SQL, Google Cloud Spanner, and/or SQL Server on Google Compute Engine
  - Managing database migration projects from on-premises to Google Cloud
- BI report development
- Solid soft skills (e.g. communication, interpersonal, collaborative, resilient)

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
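As a rough illustration of the BigQuery loading work mentioned in the Google Cloud Platform bullets, the sketch below uses the official google-cloud-bigquery client to load a CSV file from Cloud Storage. The project, dataset, table, and bucket names are placeholder assumptions, not details from the posting.

```python
# Minimal sketch: load a CSV from Cloud Storage into BigQuery with the official client.
# Project, dataset, table, and bucket names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")       # assumes default application credentials

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,                                       # let BigQuery infer the schema
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/claims_extract.csv",
    "example-project.analytics.claims_raw",
    job_config=job_config,
)
load_job.result()                                          # block until the load job finishes

table = client.get_table("example-project.analytics.claims_raw")
print(f"{table.num_rows} rows loaded")
```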

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Source: LinkedIn

Position Overview: We are looking for an Analytics Engineer to join our team here at ShyftLabs! The ideal candidate is someone who loves data and analytics and can both understand and develop overall business strategies. A blend of great technical skills and interpersonal skills is highly desirable in this role, which works alongside enterprise clients. ShyftLabs is a growing data product company that was founded in early 2020 and works primarily with Fortune 500 companies. We deliver digital solutions built to help accelerate the growth of businesses in various industries by focusing on creating value through innovation.

Job Responsibilities:
- Develop, optimize, and maintain Looker and Power BI reports, ensuring usability, performance, and alignment with business objectives.
- Lead and support the Looker migration and Power BI development, building robust data models, LookML structures, and interactive dashboards.
- Design and maintain data pipelines and ETL processes, ensuring accurate, efficient, and scalable ingestion of data into Snowflake (see the sketch after this list).
- Collaborate with cross-functional teams to translate business requirements into scalable data solutions, focusing on automation and efficiency.
- Optimize SQL queries, manage database performance, and implement best practices for data modeling and transformation.
- Provide technical support and guidance to business teams, helping them interpret data and troubleshoot reporting issues.
- Work with stakeholders to define KPIs and build actionable insights that drive strategic decision-making.

Basic Qualifications:
- 5+ years of experience in analytics engineering, data engineering, or BI development.
- Strong expertise in Looker (LookML, Explores, Dashboards) and Power BI (DAX, data modeling, dashboarding).
- Proficiency in SQL, with experience optimizing queries and working with large datasets.
- Hands-on experience with Snowflake or other cloud data warehouses (BigQuery, Redshift).
- Experience developing and maintaining ETL pipelines and data transformation processes.
- Proficiency in Python or another scripting language for automation and data manipulation.
- Strong stakeholder engagement and client-facing experience, with the ability to explain technical concepts to non-technical audiences.
- Experience working in Agile environments, with familiarity in version control (Git) and CI/CD best practices.
- Excellent verbal and written communication skills, with keen attention to detail.

Preferred Qualifications:
- Client-facing experience is highly desired; working with executive-level stakeholders is considered an asset.
- Can identify ways to improve data quality and reliability.
- Stays current with the latest data trends and spots inconsistencies to simplify data insights.
- A commitment to teamwork with excellent interpersonal skills.

We are proud to offer a competitive salary alongside a strong insurance package. We pride ourselves on the growth of our employees, offering extensive learning and development resources.
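For the Snowflake side of the work described above, a minimal sketch of querying Snowflake from Python (for example, to validate a modelled result set that feeds a Looker or Power BI dashboard) might look like the following. The account, credentials, warehouse, and table names are placeholder assumptions.

```python
# Minimal sketch of querying Snowflake from Python for a downstream BI model.
# Account, credentials, and object names are hypothetical placeholders.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="MARTS",
)

try:
    cur = conn.cursor()
    cur.execute(
        """
        SELECT order_month, region, SUM(revenue) AS revenue
        FROM fct_orders
        GROUP BY order_month, region
        ORDER BY order_month
        """
    )
    for row in cur.fetchall():        # print the aggregated rows that the dashboard would consume
        print(row)
finally:
    conn.close()
```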

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Title: Lead - Growth Marketing Manager
Experience: 3 to 5 years
Job Location: Hyderabad (work from office)

About the Role: As Growth Marketing Manager, you'll be the engine behind our user/customer acquisition efforts. You'll work cross-functionally, move fast, and take full ownership of key growth channels—from paid media to lifecycle marketing and beyond. This is a high-impact role for someone who's equal parts strategic thinker and scrappy executor.

What You'll Do:
- Design and execute experiments to drive growth across paid, organic, referral, and lifecycle channels
- Own paid acquisition (Google Ads, Meta, LinkedIn, etc.)—from campaign setup to creative iteration and performance optimization
- Analyse funnel data to identify drop-offs, improve conversion rates, and reduce CAC
- Collaborate with product, design, and content teams to develop and test landing pages, onboarding flows, and growth loops
- Build and maintain reporting dashboards to track KPIs (CAC, LTV, ROAS, etc.)
- Scale what works—and sunset what doesn't—based on data-driven insights

What We're Looking For:
- 2–5 years of experience in growth or performance marketing, ideally in a high-growth startup environment
- Proven track record of driving measurable acquisition or revenue growth
- Strong command of marketing tools (Google Ads, Meta Ads Manager, Mixpanel/GA4, HubSpot/CRM tools, etc.)
- Data-driven and comfortable working with metrics, attribution, and testing frameworks
- Experience running A/B tests and growth experiments across channels
- Agile, entrepreneurial mindset—bias toward action, comfortable with ambiguity

Bonus If You Have:
- Experience in B2B SaaS or B2C marketplaces
- Familiarity with SQL, Looker, or similar tools
- Knowledge of product-led growth tactics
- Creative chops (copywriting, landing page optimization, ad creative direction)

Why Join Us?
- Competitive compensation + equity
- High ownership role with leadership growth potential
- Flexible work environment
- Backed by top-tier VCs and seasoned operators
- Opportunity to build something meaningful, fast

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Source: LinkedIn

Our team is looking to add a skilled Lead Python Developer to our ranks. This role emphasizes enhancing our security tools' functionality and scalability through a focus on integration, modularization, and collaboration to boost performance and maintainability.

Responsibilities:
- Improve integration with JIRA for ticket creation and trigger email notifications for detected issues (see the sketch after this listing)
- Refactor and modularize the existing codebase to enhance maintainability and scalability
- Collaborate closely with the team to ensure smooth integration of new features
- Preserve the high performance of the tool
- Craft and implement sturdy Python solutions
- Safeguard the security and integrity of the application

Requirements:
- 5+ years of Python development experience, particularly with security tools
- 1+ years of relevant leadership background
- Proficiency in modular programming and code refactoring
- Knowledge of JIRA
- Proficiency in the use of relational databases
- Strong knowledge of SQL
- Excellent problem-solving capabilities
- Capability to work collaboratively in a team environment

Nice to have:
- Certifications in security or Python development
- Background in CI/CD and ETL/ELT solutions
- Familiarity with Google Cloud BigQuery and Google Cloud Platform
- Proficiency with the Python JIRA API
- Understanding of Looker Studio

Technologies:
- Python for backend development
- GCP BigQuery + SQL for database management
- HTML/JS for web scraping
- JIRA for ticket management

We offer:
- International projects with top brands
- Work with global teams of highly skilled, diverse peers
- Healthcare benefits
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling and certification courses
- Unlimited access to the LinkedIn Learning library and 22,000+ courses
- Global career opportunities
- Volunteer and community involvement opportunities
- Opportunity to join and participate in the life of EPAM's Employee Resource Groups
- Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn
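As context for the JIRA-integration and notification responsibilities above, here is a minimal, hedged sketch using the community jira package and the standard library email modules. The server URL, project key, credentials, and addresses are placeholder assumptions, not details from the posting; in practice the API token would come from a secrets manager rather than being hard-coded.

```python
# Hedged sketch of the JIRA + email workflow this listing describes: open a ticket for a
# detected finding and notify the team. Server, project key, and addresses are placeholders.
import smtplib
from email.message import EmailMessage

from jira import JIRA   # pip install jira

def create_ticket(summary: str, description: str) -> str:
    client = JIRA(server="https://example.atlassian.net",
                  basic_auth=("bot@example.com", "API_TOKEN"))   # placeholder credentials
    issue = client.create_issue(
        project="SEC",                           # hypothetical project key
        summary=summary,
        description=description,
        issuetype={"name": "Bug"},
    )
    return issue.key

def notify(issue_key: str, recipients: list[str]) -> None:
    msg = EmailMessage()
    msg["Subject"] = f"New security finding: {issue_key}"
    msg["From"] = "scanner@example.com"
    msg["To"] = ", ".join(recipients)
    msg.set_content(f"Issue {issue_key} was created for a detected finding.")
    with smtplib.SMTP("smtp.example.com") as smtp:   # placeholder mail relay
        smtp.send_message(msg)

if __name__ == "__main__":
    key = create_ticket("Outdated TLS config on edge proxy", "Details from the scan report")
    notify(key, ["secops@example.com"])
```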

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Source: LinkedIn

We are in search of a skilled Lead Python Developer to be part of our tool development team. The successful applicant will upgrade our current tool tailored for security reporting by modularizing the code and incorporating new features to expand its capabilities.

Responsibilities:
- Integrate JIRA for ticket creation and set up email notifications for detected issues
- Refactor and restructure the existing codebase to enhance maintainability and scalability
- Collaborate closely with the team to integrate new features smoothly and maintain optimal performance of the tool
- Design and apply new features to increase the functionality of the security tool
- Supervise and fine-tune the performance of the application to guarantee unhampered operations

Requirements:
- 5+ years in Python development, particularly within security tools
- 1+ years in leadership roles
- Proficiency in modular programming and code refactoring
- Competency with relational databases; familiarity with SQL
- Background in Google Cloud Platform, Python, and REST APIs
- Understanding of JIRA and flexibility to use it effectively
- Strong problem-solving skills with a collaborative, team-focused mentality
- Certifications in security or Python development are advantageous

Nice to have:
- Understanding of CI/CD and ETL/ELT solutions
- Knowledge of Google Cloud BigQuery and Looker Studio
- Demonstrated experience with the Python JIRA API
- Expertise in PyTorch

We offer:
- International projects with top brands
- Work with global teams of highly skilled, diverse peers
- Healthcare benefits
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling and certification courses
- Unlimited access to the LinkedIn Learning library and 22,000+ courses
- Global career opportunities
- Volunteer and community involvement opportunities
- Opportunity to join and participate in the life of EPAM's Employee Resource Groups
- Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn

Posted 3 weeks ago

Apply

4.0 - 5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

We are seeking a highly skilled Business Intelligence (BI) Delivery Lead with deep expertise in IBM Cognos to spearhead the rollout and delivery of enterprise-scale BI projects. The ideal candidate will be capable of driving end-to-end delivery, ensuring quality, responsiveness, and performance of analytics solutions. Experience with other BI platforms like Power BI, Looker, or Datorama is a strong advantage.

Job Description:

Key Responsibilities:
- Lead the delivery and implementation of scalable BI solutions using IBM Cognos.
- Collaborate with cross-functional teams to define project scope, timelines, and resource requirements.
- Design and implement custom, device-responsive layouts for dashboards and reports.
- Integrate dynamic data sources, enabling real-time or near-real-time data updates.
- Utilize advanced charting libraries to create visually engaging and informative data visualizations.
- Apply frontend and backend scripting frameworks to extend dashboard capabilities and interactivity.
- Ensure BI solutions are optimized for performance, maintainability, and usability.
- Provide leadership and mentoring to junior developers or analysts on BI best practices and delivery standards.

Required Skills & Experience:
- 4 to 5 years of hands-on experience with IBM Cognos BI tools.
- Proven experience leading large-scale BI/analytics projects from design through deployment.
- Strong understanding of responsive dashboard design and UI/UX best practices.
- Proficiency with scripting languages and frameworks for both frontend (JavaScript, HTML/CSS) and backend (e.g., Python, SQL) – good to have, not mandatory.
- Expertise in integrating charting and visualization libraries (e.g., D3.js, Highcharts) – good to have, not mandatory.
- Familiarity with other BI tools such as Power BI, Looker, or Datorama is a plus.

Nice to Have:
- Experience working with marketing analytics platforms or customer data platforms.
- Familiarity with data warehousing and ETL processes.

Location: Chennai
Brand: Paragon
Time Type: Full time
Contract Type: Permanent

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Jaipur, Rajasthan, India

On-site

Source: LinkedIn

Our team is looking to add a skilled Lead Python Developer to our ranks. This role emphasizes enhancing our security tools' functionality and scalability through a focus on integration, modularization, and collaboration to boost performance and maintainability.

Responsibilities:
- Improve integration with JIRA for ticket creation and trigger email notifications for detected issues
- Refactor and modularize the existing codebase to enhance maintainability and scalability
- Collaborate closely with the team to ensure smooth integration of new features
- Preserve the high performance of the tool
- Craft and implement sturdy Python solutions
- Safeguard the security and integrity of the application

Requirements:
- 5+ years of Python development experience, particularly with security tools
- 1+ years of relevant leadership background
- Proficiency in modular programming and code refactoring
- Knowledge of JIRA
- Proficiency in the use of relational databases
- Strong knowledge of SQL
- Excellent problem-solving capabilities
- Capability to work collaboratively in a team environment

Nice to have:
- Certifications in security or Python development
- Background in CI/CD and ETL/ELT solutions
- Familiarity with Google Cloud BigQuery and Google Cloud Platform
- Proficiency with the Python JIRA API
- Understanding of Looker Studio

Technologies:
- Python for backend development
- GCP BigQuery + SQL for database management
- HTML/JS for web scraping
- JIRA for ticket management

We offer:
- International projects with top brands
- Work with global teams of highly skilled, diverse peers
- Healthcare benefits
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling and certification courses
- Unlimited access to the LinkedIn Learning library and 22,000+ courses
- Global career opportunities
- Volunteer and community involvement opportunities
- Opportunity to join and participate in the life of EPAM's Employee Resource Groups
- Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Jaipur, Rajasthan, India

On-site

Linkedin logo

We are in search of a skilled Lead Python Developer to be part of our tool development team. The successful applicant will upgrade our current tool tailored for security reporting by modularizing the code and incorporating new features to expand its capabilities.

Responsibilities
Integrate JIRA for ticket creation and set up email notifications for detected issues
Refactor and restructure the existing codebase to enhance maintainability and scalability
Collaborate closely with the team to integrate new features smoothly and maintain optimal performance of the tool
Design and apply new features to increase the functionality of the security tool
Supervise and fine-tune the performance of the application to guarantee unhampered operations

Requirements
5+ years in Python development, particularly within security tools
1+ years in leadership roles
Proficiency in modular programming and code refactoring
Competency with relational databases and familiarity with SQL
Background in Google Cloud Platform, Python, and REST APIs
Understanding of JIRA and the flexibility to use it effectively
Strong problem-solving skills with a collaborative, team-focused mentality
Certifications in security or Python development are advantageous

Nice to have
Understanding of CI/CD and ETL/ELT solutions
Knowledge of Google Cloud BigQuery and Looker Studio
Experience with the Python JIRA API
Expertise in PyTorch

We offer
International projects with top brands
Work with global teams of highly skilled, diverse peers
Healthcare benefits
Employee financial programs
Paid time off and sick leave
Upskilling, reskilling and certification courses
Unlimited access to the LinkedIn Learning library and 22,000+ courses
Global career opportunities
Volunteer and community involvement opportunities
Opportunity to join and participate in the life of EPAM's Employee Resource Groups
Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

India

On-site

Linkedin logo

We are seeking a seasoned Lead Data Insights Analyst to join our data team and contribute to high-impact analytics initiatives across business units. The ideal candidate brings extensive experience analyzing financial and business data in professional services, and a proven record of leading analytical teams and projects.

Responsibilities
Architect and lead the execution of analytics projects that drive business decision-making.
Take ownership of KPI development, monitoring, and enhancement aligned with strategic goals.
Design and refine scalable data models and pipelines for analysis and reporting.
Apply statistical and machine learning methods to extract insights from diverse datasets.
Deliver clear and interactive visual reports to stakeholders using modern BI tools.
Collaborate with product, engineering, and business teams to define and meet data needs.
Implement data quality checks and ensure compliance with governance practices.
Provide coaching and guidance to junior data team members.
Engage in Agile ceremonies including daily stand-ups, retrospectives, and planning sessions.
Promote modular, scalable data solutions with long-term maintainability in mind.
Lead peer reviews of analytical code and drive continuous process optimization.

Qualifications
Minimum 5 years of professional experience in data analytics, preferably in financial and business domains.
3+ years in a leadership capacity, managing projects and mentoring team members.
Bachelor's degree in a relevant field such as Data Science, Statistics, Engineering, or Business.
Strong command of SQL and at least one of Python, R, or an equivalent scripting language.
High proficiency in data visualization using platforms such as Tableau, Power BI, or Looker.
Solid grasp of data modeling, ETL pipelines, and data warehousing architectures.
Experience working in Agile or Kanban settings with active participation in sprints, code reviews, and iterative delivery models.
Expertise in statistical evaluation, A/B testing, forecasting, and basic machine learning methods (see the sketch after this listing).
Ability to gather business requirements and deliver technical solutions aligned with organizational goals.
Familiarity with performance metrics in tech-focused environments and understanding of key influencing factors.
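
For the statistical-evaluation and A/B-testing expectations listed above, a compact example may be useful. This is a minimal sketch with made-up conversion counts, using a two-proportion z-test from statsmodels; a real analysis would also consider test power, duration, and segment effects.

```python
# Minimal sketch: evaluate an A/B test on conversion rates (made-up numbers).
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical results: (conversions, visitors) for control and variant.
conversions = [420, 468]
visitors = [10_000, 10_000]

z_stat, p_value = proportions_ztest(count=conversions, nobs=visitors)
lift = conversions[1] / visitors[1] - conversions[0] / visitors[0]

print(f"Absolute lift: {lift:.2%}, z = {z_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Difference is statistically significant at the 5% level.")
else:
    print("No significant difference detected; consider a longer test or more traffic.")
```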

Posted 3 weeks ago

Apply

6.0 years

0 Lacs

Kochi, Kerala, India

On-site

Linkedin logo

Skills: Python, SQL
Experience: 6 to 14 years
Location: Kochi (walk-in on 14th June)

Proficient in Python for data analysis and automating reporting workflows.
5+ years of experience in data analytics, reporting, or similar roles.
Strong skills in SQL, with the ability to handle complex queries and large datasets.
Experience with reporting tools (e.g., Looker, Tableau, Power BI) for building dashboards and visualizations.
Familiarity with the Snowflake data warehouse for querying and data management (a minimal sketch follows this listing).
Knowledge of BI tools and dashboard creation.
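
As an illustration of the Python + SQL + Snowflake combination asked for here, the following is a minimal sketch assuming the snowflake-connector-python package (with its pandas extra) and placeholder connection details; the ORDERS table and its columns are invented for the example.

```python
# Minimal sketch: pull a reporting dataset from Snowflake into pandas and export it.
# All connection details and the orders table are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",      # hypothetical
    user="reporting_user",
    password="********",
    warehouse="REPORTING_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)

query = """
    SELECT order_date, region, SUM(amount) AS revenue
    FROM orders
    WHERE order_date >= DATEADD(day, -30, CURRENT_DATE)
    GROUP BY order_date, region
    ORDER BY order_date
"""

try:
    cur = conn.cursor()
    cur.execute(query)
    df = cur.fetch_pandas_all()   # requires the pandas extra of the connector
    df.to_csv("last_30_days_revenue.csv", index=False)
    print(df.head())
finally:
    conn.close()
```

Wrapping a query like this in a scheduled script is one common way to automate a recurring report before it lands in a dashboard tool.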

Posted 3 weeks ago

Apply

4.0 - 9.0 years

6 - 13 Lacs

Pune, Chennai, Bengaluru

Hybrid

Naukri logo

Key Responsibilities:
Lead the delivery and implementation of scalable BI solutions using IBM Cognos.
Collaborate with cross-functional teams to define project scope, timelines, and resource requirements.
Design and implement custom, device-responsive layouts for dashboards and reports.
Integrate dynamic data sources, enabling real-time or near-real-time data updates.
Utilize advanced charting libraries to create visually engaging and informative data visualizations.
Apply frontend and backend scripting frameworks to extend dashboard capabilities and interactivity.
Ensure BI solutions are optimized for performance, maintainability, and usability.
Provide leadership and mentoring to junior developers and analysts on BI best practices and delivery standards.

Preferred candidate profile:
4 to 5 years of hands-on experience with IBM Cognos BI tools.
Proven experience leading large-scale BI/analytics projects from design through deployment.
Strong understanding of responsive dashboard design and UI/UX best practices.
Proficiency with scripting languages and frameworks for both frontend (JavaScript, HTML/CSS) and backend (e.g., Python, SQL) - good to have, not mandatory.
Expertise in integrating charting and visualization libraries (e.g., D3.js, Highcharts) - good to have, not mandatory.

Posted 3 weeks ago

Apply

10.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Linkedin logo

Position: Sr. BI Analyst (Analytics & Insights)

Amherst Overview
Amherst is a vertically integrated real estate investment, development, and operating platform, offering solutions across the U.S. real estate capital stack, including single-family residential (SFR), mortgage-backed securities (MBS), and commercial real estate (CRE). Amherst is headquartered in Austin, TX and New York, NY, in the United States, with regional global offices located in India and Costa Rica. Underpinned by proprietary technology, battle-tested data and mortgage models, and a deep understanding of U.S. real estate markets, Amherst's vertically integrated platform seeks to provide investors a more efficient model to price, finance, and manage real estate, with turnkey execution capabilities across the firm's debt and equity strategies in the public and private residential, commercial, and mortgage-backed securities markets.

Our Single-Family Residential (SFR) strategy has quickly scaled over the last 10 years to own and operate 40,000+ homes in 30+ markets across 20 states, while building a vertically integrated real estate investment and operating platform that manages approximately $18bn in assets. Across the SFR strategy, Amherst acquires, builds, renovates, leases, finances, manages, and disposes of homes on its own account and for its investors. Outside of the SFR strategy, Amherst is engaged in various strategic initiatives and venture businesses, including commercial real estate debt and equity (all things non-SFR) and mortgage-backed securities advisory. For further information about The Amherst Group, please visit https://www.amherst.com/.

Department / Role Overview: Amherst Residential
We are looking for a skilled Sr. BI Analyst with expertise in working with large data sets, applying techniques to extract insights, building predictive models, and delivering actionable business solutions. The ideal candidate will possess strong analytical skills, technical proficiency, and a passion for working with data to solve complex problems.

Job Description (Primary Responsibilities)
As a BI Analyst, you will play a key role in transforming raw data into actionable insights that drive business decisions.
Collaborating with business units (e.g., Risk, Finance, Product) to gather reporting requirements and provide analytical support.
Writing and optimizing SQL queries to extract and manipulate data from complex systems and data warehouses.
Designing and enhancing Tableau dashboards to communicate KPIs, trends, and financial metrics effectively.
Working on data ETL processes in collaboration with data engineering teams to ensure smooth data flow and availability for reporting.
Translating business logic into data-driven solutions by implementing calculations, data transformations, and rules that align with business definitions (a minimal sketch follows this listing).
Developing new metrics and performance indicators to support evolving business strategies and regulatory needs.
Continuously identifying and driving process improvements across reporting workflows, automation, and dashboard performance.
Conducting deep-dive analyses to uncover insights related to customer behavior, product performance, risk exposure, and compliance trends.
Documenting data definitions, business rules, and reporting logic for transparency and consistency.

Desired Skills/Qualifications:
Minimum 4 years of hands-on experience in a BI Analyst or similar role.
Expertise in SQL (T-SQL, PL/SQL, or similar) for data extraction and manipulation.
Strong experience with Tableau, including dashboard design, storytelling, and performance tuning.
Prior experience working in the Banking or Financial Services industry is mandatory.
Familiarity with data warehousing concepts, ETL processes, and BI tools.
Strong analytical thinking and problem-solving skills.
Excellent communication and stakeholder management abilities.
Bachelor's degree in Computer Science, Statistics, Finance, or a related field.

Soft Skills:
Strong problem-solving and critical-thinking abilities.
Excellent communication and presentation skills.
Ability to work independently and as part of a team.
Attention to detail and the ability to work with complex data sets.

Preferred Qualifications:
Proficiency in SQL, Python, VBA, and JavaScript for data analysis, automation, and enhancing reporting functionalities.
Strong experience with Tableau; experience with other BI tools like Power BI or Looker is a plus.
Exposure to Databricks for building and managing data pipelines, performing large-scale data processing, and advanced analytics.

Amherst's core values:
Culture & Conduct: Positive attitude with high integrity; agile in adapting to a dynamic environment with emerging data points. We do the right thing the right way and are accountable for our actions.
Client-Centricity & Business Acumen: Strong team player; manages multiple internal and external stakeholders.
Communication & Connectivity: Strong written and verbal communication skills with clients and management. Collaboration - we align, contribute, and win together.
Execution & Delivery: Self-starter; proactive, motivated, driven personality; excellent organizational and time management skills. Agility - we are nimble and responsive.
Community: We empower and support people to create a sense of belonging for all.

Working Shift / Arrangement: US Shift (1:30 PM - 10:30 PM IST), flexible - hybrid working model
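
To illustrate the kind of metric derivation described above (turning business logic into calculated KPIs before they reach a Tableau dashboard), here is a minimal pandas sketch with made-up lease data; the column names and the occupancy definition are assumptions for the example, not Amherst's actual business rules.

```python
# Minimal sketch: derive a monthly occupancy-rate KPI from made-up lease data.
import pandas as pd

# Hypothetical extract: one row per status bucket per market per month.
data = pd.DataFrame(
    {
        "month": ["2024-01", "2024-01", "2024-02", "2024-02"],
        "market": ["Atlanta", "Atlanta", "Atlanta", "Atlanta"],
        "homes": [1200, 300, 1210, 290],
        "status": ["occupied", "vacant", "occupied", "vacant"],
    }
)

occupied = data[data["status"] == "occupied"].groupby(["month", "market"])["homes"].sum()
total = data.groupby(["month", "market"])["homes"].sum()
kpi = (occupied / total).rename("occupancy_rate").reset_index()

print(kpi)  # feeds a dashboard extract or a validation check against business rules
```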

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Kozhikode, Kerala, India

On-site

Linkedin logo

Job Description
The role involves analyzing large and complex datasets to uncover trends, patterns, and actionable insights that support data-driven decision-making across the organization. Responsibilities include designing and building dashboards to visualize key retail KPIs and operational metrics for business stakeholders. The position also requires developing ad-hoc reports and performing deep-dive analyses to support cross-functional teams. Close collaboration with product managers and stakeholders is essential to gather reporting requirements and deliver customized data solutions.

Requirements
BSc/BCA/MSc/MCA/BTech/MTech
3+ years of experience in Data Engineering
Strong SQL skills and experience working with relational databases (e.g., MySQL, PostgreSQL)
Proficiency in Python for data pipelines and scripting
Proven ability to build reporting dashboards and visualizations for retail KPIs using tools such as Apache Superset, Metabase, Looker, Tableau, or Power BI
Strong data analysis skills with the ability to extract insights from complex datasets
Hands-on experience with ETL/ELT tools (e.g., Airflow, dbt, Talend, or custom pipelines); a minimal Airflow sketch follows this listing
Familiarity with cloud platforms (AWS, GCP, or Azure) and data services like S3, Redshift, BigQuery, or Snowflake
Familiarity with batch and streaming data pipelines (Kafka, Spark, etc.)
Strong understanding of data modeling, warehousing, and performance optimization
Version control with Git and experience working in a CI/CD environment
Ability to write clean, modular, and well-documented code

Soft Skills:
Strong communication and collaboration skills
Ability to work independently and drive tasks to completion
Attention to detail and a problem-solving mindset

Benefits
Competitive Pay
Performance Bonus
Longevity Bonus
Monthly Fun & Entertainment Programs
Office Pantry filled with Tea & Snacks
Paid Time Off
Parental Leave Policy
Medical Coverage - Insurance for Employee and Family
PF / ESI
Education Allowances
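
For the ETL/ELT orchestration experience mentioned above (e.g., Airflow), here is a minimal DAG sketch; the task logic and names are placeholders, and the schedule argument assumes Airflow 2.4 or newer.

```python
# Minimal sketch of an Airflow DAG that refreshes a retail KPI table daily.
# The extract/load functions are placeholders, not a real pipeline.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_sales(**context):
    print("Pull yesterday's sales from the source database (placeholder).")

def load_kpis(**context):
    print("Aggregate and load KPIs into the reporting schema (placeholder).")

with DAG(
    dag_id="retail_kpi_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",          # use schedule_interval on Airflow versions before 2.4
    catchup=False,
    tags=["retail", "reporting"],
) as dag:
    extract = PythonOperator(task_id="extract_sales", python_callable=extract_sales)
    load = PythonOperator(task_id="load_kpis", python_callable=load_kpis)
    extract >> load
```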

Posted 3 weeks ago

Apply

6.0 years

0 Lacs

Chandigarh, India

On-site

Linkedin logo

Head - Product & Market Operations (with Trading Insight)
Company: Wonder Word Solutions
Experience: Minimum 6 Years
CTC: ₹12 - ₹14 LPA

About Us
Wonder Word Solutions is looking for a strategic and detail-driven Head of Product & Market Operations to manage the complete operational lifecycle of our live, market-based digital platform. This role bridges product execution with the dynamic world of user-generated markets, combining tech know-how with real-time market monitoring, user engagement, and financial logic. If you understand how to run scalable products and have a strong grasp of share market principles, market dynamics, and user behavior, this role offers the best of both worlds.

Key Responsibilities:
Lead end-to-end operations of live markets, from market creation and moderation to final resolution
Define and enforce rulesets for market structure, pricing integrity, and user trust
Coordinate closely with product managers and developers to roll out features smoothly
Translate user insights and behavior into actionable product or operational tweaks
Monitor and refine oracle mechanisms and ensure data reliability for market outcomes
Create internal workflows and dashboards to streamline market performance, compliance, and uptime
Build a structured feedback loop from users, market trends, and product analytics
Scale a team to manage operational load and user expectations as the platform grows

Qualifications:
6+ years of experience in Product Ops, Market Ops, or Digital Platform Management
Strong understanding of product development cycles, GTM, and operational excellence
Demonstrated experience managing platforms with dynamic, time-sensitive features
Solid working knowledge of the share market, prediction models, or betting platforms is a major plus
Strong data skills: ability to read dashboards, extract insights, and act on metrics
Familiarity with tools like Jira, Notion, Looker, Mixpanel, or Dune Analytics
Excellent verbal, written, and cross-functional collaboration skills
Prior experience in Web3, fintech, fantasy gaming, or crypto platforms preferred

Bonus Points If You:
Can explain how spread, liquidity, and probability shape market behavior (a minimal sketch follows this listing)
Have worked with or around prediction markets, stock platforms, or financial exchanges
Are comfortable with basic tokenomics, odds, or pricing models
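
Since the bonus points ask candidates to explain how spread and probability shape market behavior, a tiny numeric illustration may help. The quotes and odds below are invented, and the overround calculation is the standard one for a two-outcome market.

```python
# Minimal sketch: implied probability and spread from invented two-sided quotes.
def implied_probability(decimal_odds: float) -> float:
    """Decimal odds of 2.00 imply a 50% chance (before the operator's margin)."""
    return 1.0 / decimal_odds

bid, ask = 0.47, 0.53           # hypothetical prices for a "YES" share in a 0-1 market
spread = ask - bid              # a wide spread usually signals thin liquidity
odds_yes, odds_no = 1.92, 2.05  # hypothetical decimal odds for the two outcomes

p_yes = implied_probability(odds_yes)
p_no = implied_probability(odds_no)
overround = p_yes + p_no - 1.0  # the market's built-in margin

print(f"Spread: {spread:.2f}, implied YES: {p_yes:.1%}, overround: {overround:.1%}")
```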

Posted 3 weeks ago

Apply

Exploring Looker Jobs in India

Looker, a powerful data visualization and business intelligence tool, is gaining popularity in India, leading to a growing demand for professionals with expertise in this area. Companies across various industries are actively seeking skilled individuals who can utilize Looker to analyze data and make informed business decisions.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Delhi
  4. Hyderabad
  5. Pune

Average Salary Range

The salary range for Looker professionals in India varies based on experience levels. Entry-level positions may start at around ₹5-6 lakhs per annum, while experienced professionals can earn up to ₹15-20 lakhs per annum.

Career Path

Career progression in Looker typically follows a path from Junior Analyst to Senior Analyst, and then to roles such as Business Intelligence Manager or Data Analytics Lead. With experience and additional certifications, professionals can advance to roles like Data Scientist or Chief Data Officer.

Related Skills

Aside from proficiency in Looker, professionals in this field are often expected to have knowledge of SQL, data visualization techniques, data modeling, and experience with other BI tools like Tableau or Power BI.

Interview Questions

  • What is LookML? (basic)
  • How do you create a new Look in Looker? (basic)
  • Explain the difference between a Dimension and a Measure in Looker. (basic)
  • How can you optimize Looker queries for better performance? (medium)
  • What are some common pitfalls to avoid when working with Looker? (medium)
  • How do you handle data security in Looker? (medium)
  • Can you explain the concept of Derived Tables in Looker? (advanced)
  • How would you approach building a complex dashboard in Looker? (advanced)
  • How do you schedule data deliveries in Looker? (advanced; see the sketch after this list)
  • Explain the process of data caching in Looker. (advanced)
  • ...
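
For several of the questions above (running and delivering Looks programmatically), a short example can anchor the discussion. This is a minimal sketch assuming the official looker_sdk Python package and an API credentials file (looker.ini); the Look ID is hypothetical, and production deliveries would normally be configured through Looker's built-in scheduling rather than a hand-rolled script.

```python
# Minimal sketch: pull a saved Look's results via the Looker API (Python SDK).
# Assumes a looker.ini with base_url/client_id/client_secret; Look ID is hypothetical.
import looker_sdk

sdk = looker_sdk.init40()  # reads looker.ini by default

# Run Look 123 and fetch CSV output; wrapping this in a scheduler approximates
# what Looker's native scheduled data deliveries do for you automatically.
csv_data = sdk.run_look(look_id="123", result_format="csv")

with open("look_123.csv", "w", encoding="utf-8") as fh:
    fh.write(csv_data)

print("Saved look_123.csv")
```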

Closing Remark

As the demand for Looker professionals continues to rise in India, now is the perfect time to enhance your skills and pursue opportunities in this exciting field. Prepare thoroughly, showcase your expertise, and apply confidently for Looker jobs to advance your career in data analytics and business intelligence.

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
