
1434 Looker Jobs - Page 34

JobPe aggregates listings for convenient browsing; applications are submitted directly on the original job portal.

Experience: 7.0 years
Salary: Not disclosed
Location: Jaipur, Rajasthan, India
Work mode: Remote

Experience: 7.00+ years
Salary: Confidential (based on experience)
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time permanent position
(Note: This is a requirement for one of Uplers' clients - Forbes Advisor.)

Must-have skills: dbt, GCP, affiliate marketing data, SQL, ETL, team handling, data warehousing, Snowflake, BigQuery, Redshift

Job Description: Data Warehouse Lead

Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news, and reviews on all things personal finance, health, business, and everyday life decisions.

Forbes Marketplace is seeking a highly skilled and experienced data warehouse engineer to lead the design, development, implementation, and maintenance of our enterprise data warehouse. The ideal candidate has a strong background in data warehousing principles, ETL/ELT processes, and database technologies. You will guide a team of data warehouse developers and ensure the data warehouse is robust, scalable, performant, and meets the organization's evolving analytical needs.

Responsibilities:
- Lead the design, development, and maintenance of data models optimized for reporting and analysis.
- Ensure data quality, integrity, and consistency throughout the data warehousing process to enable reliable and timely ingestion from source systems.
- Troubleshoot and resolve issues related to data pipelines and data integrity.
- Work closely with business analysts and other stakeholders to understand their data needs and provide solutions.
- Communicate technical concepts effectively to non-technical audiences.
- Ensure the data warehouse scales to accommodate growing data volumes and user demands.
- Ensure adherence to data governance and privacy policies and procedures.
- Implement and monitor data quality metrics and processes.
- Lead and mentor a team of data warehouse developers, providing technical guidance and support.
- Stay up to date with the latest trends and technologies in data warehousing and business intelligence.
- Foster a collaborative, high-performing team environment.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 7-8 years of progressive experience in data warehousing, with at least 3 years in a lead or senior role.
- Deep understanding of data warehousing concepts, principles, and methodologies.
- Strong proficiency in SQL and experience with database platforms such as BigQuery, Redshift, and Snowflake.
- Good understanding of affiliate marketing data (GA4 and paid marketing channels such as Google Ads and Facebook Ads - the more the better).
- Hands-on experience with dbt and other ETL/ELT tools and technologies.
- Experience with data modeling techniques (e.g., dimensional modeling, star schema, snowflake schema).
- Experience with cloud-based data warehousing solutions (e.g., AWS, Azure, GCP); GCP is highly preferred.
- Excellent problem-solving, analytical, and troubleshooting skills.
- Strong communication, presentation, and interpersonal skills.
- Ability to thrive in a fast-paced, dynamic environment.
- Familiarity with business intelligence and reporting tools (e.g., Tableau, Power BI, Looker).
- Experience with data governance and data quality frameworks is a plus.

Perks:
- Day off on the third Friday of every month (one long weekend each month).
- Monthly wellness reimbursement program to promote health and well-being.
- Monthly office commutation reimbursement program.
- Paid paternity and maternity leave.

How to apply:
Step 1: Click "Apply" and register or log in on our portal.
Step 2: Complete the screening form and upload an updated resume.
Step 3: Increase your chances of being shortlisted and meet the client for the interview.

About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal; depending on the assessments you clear, you can apply for those as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
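The qualifications above call out dimensional modeling (star schema) and SQL. As a hypothetical illustration only (not part of the listing; every table and column name here is invented), the sketch below builds a minimal star schema - one fact table keyed to two dimension tables - and runs the kind of join-and-aggregate reporting query such a role involves, using Python's built-in sqlite3:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension tables hold descriptive attributes; the fact table holds
# measures plus foreign keys into each dimension.
cur.executescript("""
CREATE TABLE dim_channel (channel_id INTEGER PRIMARY KEY, channel_name TEXT);
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, iso_date TEXT, month TEXT);
CREATE TABLE fact_clicks (
    click_id   INTEGER PRIMARY KEY,
    date_id    INTEGER REFERENCES dim_date(date_id),
    channel_id INTEGER REFERENCES dim_channel(channel_id),
    revenue    REAL
);
""")

cur.executemany("INSERT INTO dim_channel VALUES (?, ?)",
                [(1, "Google Ads"), (2, "Facebook Ads")])
cur.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                [(1, "2024-05-01", "2024-05"), (2, "2024-05-02", "2024-05")])
cur.executemany("INSERT INTO fact_clicks VALUES (?, ?, ?, ?)",
                [(1, 1, 1, 120.0), (2, 1, 2, 80.0), (3, 2, 1, 50.0)])

# A typical reporting query: join the fact to its dimensions, then aggregate.
rows = cur.execute("""
    SELECT d.month, c.channel_name, SUM(f.revenue) AS revenue
    FROM fact_clicks f
    JOIN dim_date d    ON d.date_id = f.date_id
    JOIN dim_channel c ON c.channel_id = f.channel_id
    GROUP BY d.month, c.channel_name
    ORDER BY c.channel_name
""").fetchall()
print(rows)  # [('2024-05', 'Facebook Ads', 80.0), ('2024-05', 'Google Ads', 170.0)]
```

The design choice a star schema makes is to denormalize reporting attributes into wide dimensions so analytical queries stay simple one-hop joins; the same pattern scales up on BigQuery, Redshift, or Snowflake.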

Posted 2 weeks ago

Apply

Experience: 7.0 years
Salary: Not disclosed
Location: Greater Lucknow Area
Work mode: Remote

Data Warehouse Lead - Forbes Advisor (via Uplers). The role description, required skills, qualifications, perks, application steps, and About Uplers note are identical to the Jaipur listing above.

Posted 2 weeks ago

Apply

Experience: 7.0 years
Salary: Not disclosed
Location: Thane, Maharashtra, India
Work mode: Remote

Data Warehouse Lead - Forbes Advisor (via Uplers). The role description, required skills, qualifications, perks, application steps, and About Uplers note are identical to the Jaipur listing above.

Posted 2 weeks ago

Apply

Experience: 7.0 years
Salary: Not disclosed
Location: Nashik, Maharashtra, India
Work mode: Remote

Data Warehouse Lead - Forbes Advisor (via Uplers). The role description, required skills, qualifications, perks, application steps, and About Uplers note are identical to the Jaipur listing above.

Posted 2 weeks ago

Apply

Experience: 7.0 years
Salary: Not disclosed
Location: Kanpur, Uttar Pradesh, India
Work mode: Remote

Data Warehouse Lead - Forbes Advisor (via Uplers). The role description, required skills, qualifications, perks, application steps, and About Uplers note are identical to the Jaipur listing above.

Posted 2 weeks ago

Apply

Experience: 7.0 years
Salary: Not disclosed
Location: Nagpur, Maharashtra, India
Work mode: Remote

Data Warehouse Lead - Forbes Advisor (via Uplers). The role description, required skills, qualifications, perks, application steps, and About Uplers note are identical to the Jaipur listing above.

Posted 2 weeks ago

Apply

Experience: 3.0 years
Salary: Not disclosed
Location: Bengaluru, Karnataka, India
Work mode: On-site

Minimum qualifications:
- Bachelor's degree or equivalent practical experience.
- 3 years of experience coding in one or more programming languages.
- 3 years of experience working with data infrastructure and data models by performing exploratory queries and scripts.
- Experience in data analysis, database querying (e.g., SQL), and data visualization (dashboards/reports).
- Experience in defining and implementing data governance policies, procedures, and standards, as well as data privacy regulations and compliance requirements.

Preferred qualifications:
- Experience working in a business tooling or operations organization.
- Experience with data analysis at scale, including statistics and machine learning model development (data preparation, model selection, evaluation, tuning).
- Experience with a wide range of data engineering and data governance tools and technologies, including cloud platforms (e.g., GCP, Looker), data warehousing solutions, data quality tools, and metadata management systems.
- Experience in building prototypes or proofs of concept using generative AI models (e.g., LLMs, diffusion models) or agentic workflow frameworks.

About the job:
In this role, you will be responsible for developing and maintaining solutions that streamline data access, enhance data hygiene, and improve telemetry for YouTube's internal business teams. Your work will impact the efficiency and effectiveness of our support organization, enabling it to better serve our users. You will be part of an engineering team focused on building tools, creating proof-of-concept prototypes, and writing and reviewing technical design documents to address the data needs of our support teams.

YouTube/Video Global Solutions is the link between Google video products and sales. Our mission is to fuel innovation that keeps YouTube and Video free and accessible to the world. We do this by translating global market needs into meaningful product solutions that drive business results for content partners and customers.

Responsibilities:
- Lead the design, development, and maintenance of reliable data pipelines.
- Write queries and code in other data manipulation languages such as Python and R.
- Conduct advanced quantitative data analysis.
- Partner with technical and non-technical stakeholders to translate data needs into technical designs.
- Lead technical design reviews and collaborate with cross-functional teams.
- Develop dashboards and reports in partnership with the User Experience (UX) team and establish data visualization standards.
- Drive ownership of the Data Governance charter, defining and implementing policies and ensuring compliance.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and "EEO is the Law". If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.

Posted 2 weeks ago

Apply

Experience: 8.0 years
Salary: Not disclosed
Location: Hyderabad, Telangana, India
Work mode: On-site

Minimum qualifications: Bachelor's degree or equivalent practical experience. 8 years of experience in vendor strategy and operations, either in vendor management (working at the client) or in vendor delivery (working at the supplier). Experience in change management. Experience with data spreadsheets, SQL, or Looker. Preferred qualifications: Experience in executing operational initiatives within a global team. Experience in Enterprise/B2B support delivery models or offshore delivery models, operational efficiency transformation, or management/ops consulting. Knowledge of technologies and leveraging GenAI tools in your day-to-day to enhance the quality deliverables. Ability to influence without direct authority, to drive change at all levels within an organization, and to build working relationships. About The Job The Google Cloud Platform team helps customers transform and build what's next for their business — all with technology built in the cloud. Our products are developed for security, reliability and scalability, running the full stack from infrastructure to applications to devices and hardware. Our teams are dedicated to helping our customers — developers, small and large businesses, educational institutions and government agencies — see the benefits of our technology come to life. As part of an entrepreneurial team in this rapidly growing business, you will play a key role in understanding the needs of our customers and help shape the future of businesses of all sizes use technology to connect with customers, employees and partners. In this role, you will partner with all Cloud teams in Support, Learning, Sales and Engineering to set up new vendor engagements and improve existing ones. Google Cloud is a growing business with continuously evolving needs. Google Cloud accelerates every organization’s ability to digitally transform its business and industry. 
We deliver enterprise-grade solutions that leverage Google's cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.
Responsibilities
- Identify operational gaps using a data-driven approach and develop innovative solutions to address them in collaboration with vendor sites and business stakeholders.
- Understand processes that impact the operation (e.g., contracting, finances, capacity, workflows) and support the relevant stakeholders driving these efforts.
- Act as the point of contact for internal and external escalations regarding performance or supplier relationships.
- Collaborate with executive stakeholders across a global organization, and execute operational initiatives.
- Co-create with suppliers to deliver innovative solutions and tools that improve customer experience and service delivery.
- Oversee the vendor network distributed across different sites or products.
- Own and manage the relationship with vendor service providers, own service delivery performance, and partner with suppliers to ensure quality output and customer outcomes.
Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.

Posted 2 weeks ago


10.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Description: Senior Specialist - Data Visualization
Our Human Health Digital, Data and Analytics (HHDDA) team is innovating how we understand our patients and their needs. Working cross-functionally, we are inventing new ways of communicating, measuring, and interacting with our customers and patients by leveraging digital, data and analytics. Are you passionate about helping people see and understand data? You will take part in an exciting journey to help transform our organization into the premier data-driven company. As a Senior Specialist - Data Visualization, you will lead a team of Data Visualization and Analytics experts focused on designing and developing compelling analytical solutions that enable actionable insights and facilitate intuitive information consumption for internal business stakeholders. The ideal candidate will demonstrate competency in team leadership, stakeholder management, and product management while leading the development of user-centric visualization products that empower stakeholders with data-driven insights and decision-making capability.
Responsibilities
- Lead and manage a team of ~10 visualization experts, designers, data analysts, and insights specialists, fostering a high-performing, inclusive, and engaged work environment.
- Drive team-building initiatives, mentorship, coaching, and performance management while prioritizing diversity, inclusion, and well-being.
- Develop user-centric analytical solutions, leveraging complex data sources to create intuitive and insightful dashboards. Apply best practices in visualization design to enhance user experience and business impact.
- Drive business engagement, collaborating with stakeholders to define key metrics, KPIs, and reporting needs. Facilitate workshops to develop user stories, wireframes, and interactive visualizations.
- Partner with data engineering, data science, and IT teams to develop scalable, business-friendly reporting solutions.
- Ensure adherence to data governance, privacy, and security best practices.
- Foster a culture of innovation and continuous learning, encouraging the adoption of new visualization tools, methodologies, and a product mindset to build scalable and reusable solutions.
- Identify opportunities for automation, streamlining manual reporting processes through modern visualization technologies and self-service analytics enablement.
- Provide thought leadership, driving knowledge-sharing within the Data & Analytics organization while staying ahead of industry trends to enhance visualization capabilities.
Required Experience And Skills
- 10+ years of experience in insight generation, business analytics, business intelligence, and interactive visual storytelling, with a strong focus on infographics and data-driven decision-making.
- Proven leadership and people management skills, including team development, mentoring, and fostering a high-performing, engaged workforce.
- Strong product management mindset, ensuring analytical solutions are scalable, user-centric, and aligned with business needs, with experience in defining product roadmaps and managing solution lifecycles.
- Expertise in agile ways of working, including Agile/Scrum methodologies, iterative development, and continuous improvement in data visualization and analytics solutions.
- Hands-on expertise in BI and visualization tools such as Qlik, Power BI, MicroStrategy, Looker, and ThoughtSpot, with proficiency in PowerPoint and data storytelling for creating impactful presentations.
- Solid understanding of data engineering and modeling, including ETL workflows, Dataiku, Databricks, Informatica, and database technologies like Redshift and Snowflake, with programming skills in SQL and Python.
- Deep knowledge of pharma commercial data sources, including IQVIA, APLD, Claims, Payer, Salesforce, Financials, Veeva, Komodo, IPSOS, and other industry datasets to drive strategic insights.
- Experience in pharmaceutical commercial analytics, including Field Force Effectiveness, customer engagement, and market performance assessment, as well as web, campaign, and digital engagement analytics.
- Proven ability to manage third-party vendors and contractors, ensuring high-quality deliverables and cost-effective solutions.
- Strong problem-solving, communication, and project management skills, with the ability to translate complex data into actionable insights and navigate complex matrix environments efficiently.
Our Human Health Division maintains a "patient first, profits later" ideology. The organization comprises sales, marketing, market access, digital analytics and commercial professionals who are passionate about their role in bringing our medicines to our customers worldwide. We are proud to be a company that embraces the value of bringing diverse, talented, and committed people together. The fastest way to breakthrough innovation is when diverse ideas come together in an inclusive environment. We encourage our colleagues to respectfully challenge one another's thinking and approach problems collectively. We are an equal opportunity employer, committed to fostering an inclusive and diverse workplace.
Current Employees apply HERE. Current Contingent Workers apply HERE.
Search Firm Representatives, Please Read Carefully: Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs/resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails.
Employee Status: Regular
Relocation:
VISA Sponsorship:
Travel Requirements:
Flexible Work Arrangements: Hybrid
Shift:
Valid Driving License:
Hazardous Material(s):
Required Skills: Business Intelligence (BI), Data Visualization, Requirements Management, User Experience (UX) Design
Preferred Skills:
Job Posting End Date: 04/30/2025
A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date.
Requisition ID: R334763

Posted 2 weeks ago


5.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Title: Marketing Analytics Manager
Function: Marketing
Level: M2
What You Will Do
The Marketing Analytics Manager will play a critical role in supporting the Trimble Demand Generation Teams, shaping the future of Marketing analytics by defining and driving a clear vision for the team and ensuring its successful implementation. You will be responsible for building the technical foundation and analytics framework to support Marketing and Publishing initiatives, optimizing processes, and aligning data strategies with broader business goals.
Key Responsibilities
- Define, develop and implement the technical foundation and analytics framework for Marketing and Publishing initiatives, including stakeholder collaboration, process optimization, and data strategy alignment.
- Collaborate with cross-functional teams to define data requirements, objectives, and key performance indicators (KPIs) for user acquisition campaigns, app store strategies, or broader publishing strategies.
- Collaborate with the Marketing Operations team to create and implement data collection strategies for capturing key marketing data from sources like digital ad platforms, e-commerce, marketing automation, product, and other external platforms.
- Develop and maintain data models and dashboards to organize and visualize marketing data, providing actionable insights to stakeholders.
- Conduct in-depth analysis of user acquisition campaigns, including cohort analysis, attribution modeling, and A/B testing, to identify trends, patterns, and opportunities for optimization.
- Perform data modeling for predicting and analyzing customer Lifetime Value (LTV).
- Provide guidance, mentorship, and training to junior analysts, fostering a culture of data-driven decision-making and continuous improvement.
- Manage a team of 3-5 marketing analysts.
- Stay informed about industry trends, best practices, and emerging technologies in marketing analytics and user acquisition, incorporating new techniques and tools as appropriate.
Requirements
- 5-7 years of proven experience in marketing analytics, with a focus on customer acquisition in B2B and B2C industries.
- 3-5 years of direct people management experience.
- Strong analytical skills and proficiency in data analysis tools and programming languages such as SQL, Python, R, or similar platforms.
- High proficiency in statistical analysis, including designing A/B tests and analyzing performance to assess marketing campaign effectiveness.
- High proficiency in data visualization tools like Domo, Salesforce, or Looker, demonstrating the capability to craft engaging and insightful dashboards and reports.
- Deep understanding of digital marketing channels, performance metrics, and attribution models, with a track record of optimizing user acquisition campaigns for maximum ROI.
- Excellent communication and collaboration skills, with the ability to work effectively across teams and influence stakeholders at all levels of the organization.
- Strong project management skills and ability to manage multiple priorities and deadlines in a fast-paced environment.
- Proven experience in mentoring and guiding analysts.
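Several listings on this page ask for A/B-test design and analysis of campaign effectiveness. As a purely illustrative sketch (not part of any posting, with hypothetical numbers), a two-sided two-proportion z-test on conversion rates can be computed with the Python standard library:

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)                # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # standard error
    z = (p_b - p_a) / se
    # Normal CDF via erf: Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical campaign data: variant B converts 2.6% vs 2.0% for A.
z, p = two_proportion_z_test(conv_a=200, n_a=10_000, conv_b=260, n_b=10_000)
# z is about 2.83, so p is well below 0.05: the lift is significant here.
```

Real campaign analysis would also account for multiple testing and sequential peeking; this shows only the core arithmetic.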

Posted 2 weeks ago


0 years

0 Lacs

Gurugram, Haryana, India

On-site


Company Description
Melonleaf Consulting is a leading Salesforce Silver Partner offering Consulting, Implementation, and Innovative Technology services with an offshore support center in India. The company specializes in Salesforce products including Sales Cloud, Service Cloud, Marketing Cloud, Health Cloud, Community Cloud, Commerce Cloud, CPQ, and Pardot.
Role Description
This is a full-time on-site Marketing Manager role located in Gurugram at Melonleaf Consulting. The Marketing Manager will be responsible for developing and implementing marketing strategies, managing marketing campaigns, analyzing market trends, and collaborating with cross-functional teams to drive business growth.
Qualifications
- Marketing Strategy Development and Implementation
- Marketing Campaign Management
- Market Trend Analysis
- Cross-functional Collaboration
- Experience with Salesforce products is a plus
- Bachelor's degree in Marketing, Business, or related field
- Strong analytical and problem-solving skills
- Excellent communication and interpersonal skills
Tools Expertise
• GA4 | Looker Studio | Google Tag Manager
• MOZ | Ahrefs | SEMrush | Screaming Frog, etc.

Posted 2 weeks ago


1.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Data Engineer
What you'll be doing:
● [Data Modeling & Design] You will work with Business, Product and Engineering teams to design, document and implement data tracking and modeling for product features.
● [Data Pipelines & ETL] You will be responsible for building data pipelines using SQL and/or Python, and creating tables and views for business reporting and analysis needs. You are expected not only to construct datasets that are useful and scalable, but also to continually update existing and new data pipeline jobs to reflect new business requirements and optimize ETL pipeline performance.
● [Ad Hoc Analyses] You will be responsible for providing analytical insights to empower Sales, Marketing, Product and Engineering teams.
● [Data Integrity & Data Bug Investigations] You will be responsible for monitoring, investigating and driving intra-team communications on data issues.
● [A/B Testing Design & Execution] You will be responsible for designing and performing A/B tests to track outcomes of recommendation engine video content suggestions, making sense of the testing data, constantly adjusting concurrent tests, and striving to optimize the recommendations towards the set KPI.
We'll be excited if you have:
● High proficiency in SQL
● Extensive experience with data transformation and modeling tools such as dbt
● Proficiency in Python or another programming language
● Working experience with a popular orchestration tool such as Airflow
● 1 to 3+ years of professional work experience in the data or information technology space
● Attention to detail and a bias to action: pursue data quality, troubleshoot data validation, and see issues through to resolution
● Experience partnering with data engineering to develop high-data-integrity, automated reporting and insightful ad-hoc analyses that illuminate traffic volume, consumer engagement, funnel interaction, and conversion-rate opportunities within acquisition and product flows
● A proven problem solver who can quickly assess quantitative data sets and distill key insights using analytics and statistical methods
● Familiarity with data analytics platforms such as Looker or Tableau
● Experience with big data technologies such as SparkSQL, Athena, or Druid (a big plus)
● Outstanding communication skills with both technical and non-technical colleagues
● BS or MS in a quantitative field (statistics, mathematics, economics, finance, operations research, etc.)
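The pipeline duties above include keeping ETL jobs in step with changing requirements; one recurring pattern behind such jobs is incremental (watermark-based) loading, where only rows modified since the last run are upserted. A minimal, hypothetical plain-Python sketch of the idea (real pipelines would express this in SQL, dbt, or an Airflow task; all table and column names here are made up):

```python
# Incremental ("delta") load keyed on an updated_at watermark.
def incremental_load(source_rows, target, watermark):
    """Upsert rows modified after the watermark; return the new watermark."""
    new_watermark = watermark
    for row in source_rows:
        if row["updated_at"] > watermark:
            target[row["id"]] = row                         # insert or overwrite by key
            new_watermark = max(new_watermark, row["updated_at"])
    return new_watermark

target = {}                                                 # stand-in for a warehouse table
batch1 = [{"id": 1, "updated_at": 10}, {"id": 2, "updated_at": 12}]
wm = incremental_load(batch1, target, watermark=0)          # first run behaves as a full load
batch2 = [{"id": 2, "updated_at": 15}, {"id": 3, "updated_at": 14}]
wm = incremental_load(batch2, target, wm)                   # later runs touch only changed rows
```

The watermark would normally be persisted between runs (e.g., in a metadata table) so each scheduled execution resumes where the last one stopped.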

Posted 2 weeks ago


5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Company Description
IQ-EQ is a preeminent service provider to the alternative asset industry. IQ-EQ works with managers in multiple capacities ranging from hedge fund, private equity fund, and mutual fund launches; private equity fund administration; advisory firm set-up, regulatory registration and infrastructure design; ongoing regulatory compliance (SEC, CFTC, and 40 Act); financial controls and operational support services; compliance and operational related projects and reviews; and outsourced CFO/controller and administration services to private equity fund investments, including portfolio companies, real estate assets and energy assets. Our client base is growing, and our existing clients are engaging the firm across the spectrum of our service offerings.
Job Description
About IQ-EQ: IQ-EQ is a leading investor services group that brings together that rare combination of global expertise and a deep understanding of the needs of clients. We have the know-how and the know-you that allow us to provide a comprehensive range of compliance, administration, asset and advisory services to investment funds, global companies, family offices and private clients globally. IQ-EQ employs a global workforce of 5,000+ people located in 23 jurisdictions and has assets under administration (AUA) exceeding US$500 billion. IQ-EQ works with eight of the top 10 global private equity firms.
This is an exciting time to be part of IQ-EQ; we are growing substantially and are currently seeking talented Data individuals to join the Data, Analytics and Reporting Department. You will have the opportunity to utilise IQ-EQ's leading-edge technology stack whilst enjoying our continuous learning and development programme.
What does the Analytics Engineer opportunity look like for you? You will play a pivotal role in the development and maintenance of interactive client-, investor- and operational-team-facing dashboards using Tableau, Power BI and other visual analytics tools.
You will work closely with clients and senior stakeholders to capture their BI requirements, conduct data analysis using SQL, Python (or other open-source languages), and visualise the insights in an impactful manner.
In order to be successful in this role we require the following experience:
- Experience of interacting directly with external clients verbally, and the ability through those interactions to conceptualise what the overall end-to-end database architecture and data model would look like from the source to the destination of the data
- Intermediate experience of working with structured and unstructured data, data warehouses and data lakes, both on-prem and in cloud (Microsoft SQL Server, Azure, AWS or GCP)
- At least 5 years of demonstrable experience in SQL Server and cloud-based data stores
- Intermediate knowledge of SQL Server data tools such as SSIS or Azure Data Factory (at least 5 years of demonstrable experience)
- Intermediate experience of ETL/ELT methods such as incremental and full load, as well as various tools to implement these methods, such as Azure Data Factory, Azure Databricks, SSIS, Python, dbt, Airflow, Alteryx
- Intermediate experience of implementing dimensional data models within analytics databases/data warehouses
- Intermediate knowledge of Python/Spark packages for analytical data modelling and data analysis (pandas, NumPy, scikit-learn, etc.) as well as data visualisation (matplotlib, Plotly, Dash, etc.)
- Intermediate experience of BI tools: Power BI, Tableau (at least 4 years of demonstrable experience)
- Experience of various JavaScript libraries for front-end development and embedding of visuals (e.g. D3, React, Node, etc.)
Tasks (what does the role do on a day-to-day basis)
- Engage with external clients, vendors, project reps, and internal stakeholders from operations as well as client services teams to understand their analytics and dashboard requirements
- Maintain and enhance the existing database architecture of the BI solution
- Conduct in-depth exploratory data analysis and define business-critical features
- Work with key teams within Group Technology to get appropriate access and infrastructure set up to develop advanced visualisations (BI dashboards)
- Optimize SQL Server queries, stored procedures, and views to efficiently and effectively retrieve data from databases, employing the fundamentals of dimensional data modelling
- Ensure updates to tickets and work in progress are well communicated, and that escalations regarding the delivery being provided are kept to a minimum
- Maintain and enhance existing data solutions. Maintain best-practice data warehouse solutions that support business analytics needs, using on-prem or cloud databases and different ETL/ELT programs and software, such as Azure Data Factory, SSIS, Python, PowerShell, Alteryx or other open-source technology
- Create, maintain and document the different analytics solution processes created per project worked on
- Resolve IT and data issues. When database issues arise or development requests come in through the help desk, BI Developers work to resolve these problems. This requires an understanding of legacy solutions and issues.
Key competencies for position and level
- Analytical Reasoning: Ability to identify patterns within a group of facts or rules and use those patterns to determine outcomes about what could or must be true and come to logical conclusions
- Critical Thinking: Ability to conceptualise, analyse, synthesise, evaluate and apply information to reach an answer or conclusion
- Conceptual Thinking and Creative Problem Solving: An original thinker with the ability to go beyond traditional approaches, and the resourcefulness to adapt to new or difficult situations and devise ways to overcome obstacles with a persistence that does not give up easily
- Interpersonal Savvy: Relating comfortably with people across all levels, functions, cultures and geographies, building rapport in an open, friendly and accepting way
- Effective Communication: Adjusting communication style to fit the audience and message, whilst providing timely information to help others across the organisation. Encourages the open expression of diverse ideas and opinions
- Results/Action Oriented and Determination: Readily taking action on challenges without unnecessary planning, identifying new opportunities, and taking ownership of them with a focus on getting the problem solved
Key behaviours we expect to see
In addition to demonstrating our Group Values (Authentic, Bold, and Collaborative), the role holder will be expected to demonstrate the following:
- Facilitate open and frank debate to drive forward improvement
- Willingness to learn, develop, and keep abreast of technological developments
- An analytical mind, excellent problem-solving and diagnostic skills, and attention to detail
Qualifications / Required Experience
Education / Professional Qualifications: Degree-level education in Data Analytics or Computer Science is preferred, but equivalent professional IT certification is acceptable.
Background Experience: A minimum of 5 years' experience in a developer/engineer role or similar DB experience.
- Good understanding of dimensional data modelling methodologies
- Experience with visualization and reporting tools, namely Tableau and Power BI, as well as QlikView, Looker, ThoughtSpot
- Experience with the Microsoft Fabric platform
- Experience with MS Excel, including PowerPivot
Technical
- Experience of supporting a variety of SQL-based applications; hands-on experience with SQL 2016 and above
- Experience with T-SQL and the ability to analyse queries for efficiency
- Experience with the MS SQL Server suite, including SSIS
- Experience in Fabric Data Factory, Azure Data Factory, Azure Synapse
- Experience in both batch (incremental and full load) and near-real-time ETL/ELT data processing
- Experience with version control software, e.g. Git, Bitbucket, as well as software development platforms such as Azure DevOps and Jira
Languages: English

Posted 2 weeks ago


5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Job Role: Senior Data Analytics Engineer – GCP
Experience: 5+ years
Location: Noida & Bhubaneswar
We are seeking a highly skilled Senior Data Analytics Specialist with deep expertise in Google Cloud Platform (GCP) tools to join our dynamic team. The ideal candidate will have strong experience in data analytics, business intelligence, and advanced data processing on GCP, specifically with Firebase Analytics, GA4, and BigQuery.
Key Responsibilities:
- Lead and execute data analytics projects leveraging GCP services, primarily focusing on Firebase Analytics and Google Analytics 4 (GA4).
- Design, develop, and optimize complex SQL queries and scripts in BigQuery for large-scale data processing.
- Build and maintain interactive dashboards and data models using Looker or Looker Studio.
- Collaborate with cross-functional teams to implement custom event tracking and user journey analysis.
- Work with other GCP components such as Cloud Storage, Cloud Functions, Pub/Sub, and Dataflow to streamline data pipelines.
- Ensure compliance with data privacy and governance standards, including GDPR and CCPA.
- Analyze data to uncover trends, insights, and opportunities for business improvement.
- Communicate findings and recommendations clearly to stakeholders across technical and non-technical teams.
Required Qualifications:
- 5 to 7 years of experience in data analytics, business intelligence, or a related domain.
- Proven expertise with Firebase Analytics and GA4, including custom event configuration and user behavior tracking.
- Advanced proficiency in BigQuery, with experience in SQL scripting, query optimization, partitioning, and clustering.
- Hands-on experience with Looker or Looker Studio for dashboard creation and data visualization.
- Familiarity with additional GCP services such as Cloud Storage, Cloud Functions, Pub/Sub, and Dataflow is preferred.
- Strong understanding of data privacy laws and governance frameworks like GDPR and CCPA.
- Excellent analytical and problem-solving skills, with strong attention to detail.
- Strong verbal and written communication skills, with the ability to work collaboratively across teams.
Preferred Qualifications:
- Google Cloud certifications such as Professional Data Engineer or Looker Business Analyst.
- Experience working with A/B testing frameworks and experimentation platforms.
- Background in product analytics or digital marketing analytics.
- Exceptional communication and stakeholder management skills.
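Roles like this one revolve around user-journey and funnel analysis over GA4 events. As a hedged illustration of the underlying idea only, here is a tiny in-memory ordered-funnel computation over GA4-style event names (the event names and data are hypothetical; production work would typically run as SQL against the GA4 BigQuery export):

```python
from collections import defaultdict

def funnel_counts(events, steps):
    """Count users who reach each funnel step in order, given time-ordered events."""
    progress = defaultdict(int)               # user -> index of the next step to hit
    for user, name in events:
        idx = progress[user]
        if idx < len(steps) and name == steps[idx]:
            progress[user] = idx + 1
    counts = [0] * len(steps)
    for reached in progress.values():
        for i in range(reached):
            counts[i] += 1
    return counts

# Hypothetical event stream: u2 never fires sign_up, so it cannot count for purchase.
events = [("u1", "first_open"), ("u1", "sign_up"), ("u2", "first_open"),
          ("u1", "purchase"), ("u2", "purchase")]
counts = funnel_counts(events, ["first_open", "sign_up", "purchase"])
# counts == [2, 1, 1]: both users opened, only u1 completed the ordered funnel.
```

Dividing successive counts gives the step-to-step conversion rates a dashboard would display.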

Posted 2 weeks ago


2.0 years

0 Lacs

Mumbai Metropolitan Region

On-site


Job Description
About The Company
Axis My India is India's foremost Consumer Data Intelligence Company, which, in partnership with Google, is building a single-stop People Empowerment Platform, the "a" app, that aims to change people's awareness, accessibility, and utilization of a slew of services. At Axis, we are dedicated to making a tangible impact on the lives of millions. If you're passionate about creating meaningful changes and aren't afraid to get your hands dirty, we want you on our team!
Job Description
- Implement and manage GA4 tracking for the "a" App, ensuring accurate and comprehensive data collection.
- Integrate GA4 with Firebase for robust mobile app analytics, including event and parameter configuration.
- Design and develop custom dashboards and reports (using Looker Studio or similar tools) to visualize key performance indicators (KPIs) and user behavior trends.
- Analyze user engagement, retention, funnel performance, and other critical app metrics to identify growth opportunities and areas for optimization.
- Collaborate with product, UX, marketing, and engineering teams to translate data insights into actionable strategies for improving app features and user journeys.
- Define and track key performance indicators (KPIs) such as installs, active users, retention rate and session duration.
- Stay current with the latest GA4 features, updates, and industry best practices, proactively recommending enhancements to analytics processes.
- Provide regular reporting and presentations to stakeholders, clearly communicating findings and recommendations.
- Identify trends in user behavior, such as most-viewed screens, popular features, and drop-off points within the app journey.
- Track the effectiveness of marketing campaigns by analyzing acquisition sources, installs, and in-app engagement.
- Collaborate with developers to implement and test analytics updates in new app releases.
Requirements
- Proven hands-on experience with Google Analytics 4 (GA4) implementation and analysis, particularly for mobile apps.
- Strong understanding of mobile app analytics concepts, event-based tracking, and cross-platform user journeys.
- Proficiency in dashboard and data visualization tools (e.g., Looker Studio, Google Data Studio).
- Experience with Firebase SDK integration and custom event setup for mobile apps.
- Familiarity with Google Tag Manager and tag management for app environments.
- Ability to translate complex data into clear, actionable insights for non-technical stakeholders.
- Strong analytical, problem-solving, and communication skills.
- Knowledge of data privacy regulations and best practices for secure data handling.
- Bachelor's degree in Computer Science, Data Science, Analytics, or a related field; advanced degree preferred.
Experience Required
- More than 2 years of experience with GA4 data querying tools.
- Exposure to digital marketing analytics (SEO, PPC, campaign tracking) and attribution modelling.
- Familiarity with data engineering concepts.
- Experience working in fast-paced, cross-functional teams, preferably in a consumer app environment.
(ref:hirist.tech)
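Retention rate is one of the KPIs this posting asks candidates to track. As a rough sketch of the classic day-N retention metric (all users, dates, and the simple "days since install" representation are hypothetical; a real implementation would query app event data):

```python
# Day-N retention: share of an install cohort active exactly N days after install.
def retention_rate(installs, activity, day_n):
    """installs: user -> install day; activity: user -> set of active days."""
    retained = sum(
        1 for user, d0 in installs.items()
        if d0 + day_n in activity.get(user, set())
    )
    return retained / len(installs)

# Hypothetical data: u1-u3 installed on day 0, u4 on day 1.
installs = {"u1": 0, "u2": 0, "u3": 0, "u4": 1}
activity = {"u1": {0, 1, 7}, "u2": {0, 1}, "u3": {0}, "u4": {1, 2}}
d1 = retention_rate(installs, activity, day_n=1)   # 3 of 4 users return next day
```

In practice the same computation is usually broken out per install-date cohort so that retention curves can be compared across releases or campaigns.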

Posted 2 weeks ago


4.0 years

0 Lacs

Mumbai Metropolitan Region

On-site


Job Description
About The Company
Axis My India is India's foremost Consumer Data Intelligence Company, which, in partnership with Google, is building a single-stop People Empowerment Platform, the "a" app, that aims to change people's awareness, accessibility, and utilization of a slew of services. At Axis, we are dedicated to making a tangible impact on the lives of millions. If you're passionate about creating meaningful changes and aren't afraid to get your hands dirty, we want you on our team!
Job Title: Google Analytics Engineer, Data Engineering & Event Analytics
Reporting to: Technical Project Manager
The Google Analytics Engineer (Expert) will be responsible for configuring, maintaining, and optimizing Google Analytics 4 (GA4) and Google Cloud-based data pipelines to track event-driven insights across multiple products and services. The role involves designing scalable data collection strategies, extracting meaningful insights from GA4, and integrating data with business intelligence (BI) dashboards for real-time performance tracking. This position requires deep expertise in GA4, Google Cloud Platform (GCP), BigQuery, and event-driven analytics, ensuring that all business and product activities are accurately measured, analyzed, and optimized through data engineering best practices.
Internal Interaction With
- Technical Project Manager
- MIS & Business Intelligence Teams
- Product & Engineering Teams
- Research & Data Analytics Teams
- Marketing & Customer Experience Teams
External Interaction With
- Google Cloud Consultants & Partners
- BI & Data Vendors
- Technology & API Integration Partners
Qualifications
- Degree: B.E/B.Tech in Computer Science, Data Science, Information Systems, or related fields
- Experience: 4+ years in Google Analytics (GA4), Google Cloud (GCP), Data Engineering, and Event-Based Analytics
- Preferred Certifications: Google Analytics (GA4) Certification, Google Cloud Professional Data Engineer, BigQuery Certification
Certifications Required
- Google Analytics Individual Qualification (GAIQ): Fundamental certification that validates knowledge of Google Analytics, including GA4.
- Google Cloud Certified Professional Data Engineer: Deep expertise in the Google Cloud Platform (GCP), BigQuery, and data pipelines, ensuring that the candidate has the skills and knowledge needed to design, build, and manage data processing systems on GCP.
Requirements & Skills
Google Analytics 4 (GA4) Implementation & Optimization
- Configure and manage GA4 event tracking, data layers, and enhanced measurement strategies.
- Define and implement custom events, parameters, and conversion tracking for various products and services.
- Optimize user engagement tracking through GA4, Google Tag Manager (GTM), and Google Cloud integrations.
Event-Based Data Engineering & Google Cloud (GCP) Integration
- Design and maintain scalable data pipelines on Google Cloud Platform (GCP).
- Extract, transform, and load (ETL) GA4 event data into BigQuery for in-depth analysis.
- Ensure seamless data integration with internal dashboards, reports, and business intelligence tools.
BigQuery & Advanced Data Processing
- Develop complex SQL queries to extract insights from GA4 event data stored in BigQuery.
- Optimize data processing, schema design, and performance tuning in BigQuery.
- Create automated reports and real-time data streams for cross-functional teams.
Business & Product Analytics
- Analyze user journeys, funnel performance, and product interactions using GA4 data.
- Generate customer behavior insights to support growth, retention, and engagement strategies.
Work with marketing and product teams to improve attribution models, campaign tracking, and A/B testing. BI & Dashboard Integration: Work with MIS and data teams to develop interactive dashboards (Google Data Studio, Looker, Tableau, Power BI). Ensure real-time data visualization for product, business, and operational insights. Maintain data governance and compliance across Google Cloud and analytics platforms. Performance Monitoring & Data Automation: Implement data alerts and anomaly detection systems for proactive business insights. Automate data workflows using Google Cloud Functions, Python, and BigQuery ML models. Ensure accuracy, scalability, and efficiency of analytics pipelines. Roles & Responsibilities: Ensure accurate GA4 event tracking, data collection, and reporting. Integrate GA4 insights with BI dashboards in real time for decision-making. Build scalable and optimized data pipelines on Google Cloud Platform (GCP). Carry out data extraction, transformation, and loading (ETL) from GA4 to BigQuery. Write advanced SQL and BigQuery queries for business intelligence. Deliver custom event tracking, audience segmentation, and engagement analysis. Contribute to automation of reporting processes and real-time data alerts, predictive analytics models for user behavior and business intelligence, and improvement of analytics-driven decision-making through deep data insights.
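Much of the funnel-performance analysis described in this role reduces to step-over-step conversion math on event counts. A minimal, tool-agnostic sketch in Python; the event names and counts below are invented for illustration:

```python
# Illustrative only: per-step funnel conversion rates computed from
# GA4-style event user counts (hypothetical data).

def funnel_rates(step_counts):
    """Given ordered (event_name, user_count) pairs, return each step's
    conversion rate relative to the previous step."""
    rates = {}
    prev = None
    for name, count in step_counts:
        if prev is not None and prev > 0:
            rates[name] = round(count / prev, 4)
        prev = count
    return rates

steps = [
    ("page_view", 10000),
    ("add_to_cart", 2500),
    ("begin_checkout", 1000),
    ("purchase", 400),
]
print(funnel_rates(steps))
# {'add_to_cart': 0.25, 'begin_checkout': 0.4, 'purchase': 0.4}
```

In practice the counts would come from a BigQuery query over the GA4 export tables; the arithmetic on top of them stays this simple.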

Posted 2 weeks ago

Apply

5.0 - 8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Role: GCP Cloud Architect. Location: Hyderabad. Notice period: Immediate joiners needed. Shift timings: US time zones. Work Mode: Work from Office. Job Description / Opportunity: We are seeking a highly skilled and experienced GCP Cloud Architect to join our dynamic technology team. You will play a crucial role in designing, implementing, and managing our Google Cloud Platform (GCP) infrastructure, with a primary focus on building a robust and scalable Data Lake in BigQuery. You will be instrumental in ensuring the reliability, security, and performance of our cloud environment, supporting critical healthcare data initiatives. This role requires strong technical expertise in GCP, excellent problem-solving abilities, and a passion for leveraging cloud technologies to drive impactful solutions within the healthcare domain. Cloud Architecture & Design: Design and architect scalable, secure, and cost-effective GCP solutions, with a strong emphasis on BigQuery for our Data Lake. Define and implement best practices for GCP infrastructure management, security, networking, and data governance. Develop and maintain comprehensive architectural diagrams, documentation, and standards. Collaborate with data engineers, data scientists, and application development teams to understand their requirements and translate them into robust cloud solutions. Evaluate and recommend new GCP services and technologies to optimize our cloud environment. Understand and implement the fundamentals of GCP, including resource hierarchy, projects, organizations, and billing. GCP Infrastructure Management: Manage and maintain our existing GCP infrastructure, ensuring high availability, performance, and security. Implement and manage infrastructure-as-code (IaC) using tools like Terraform or Cloud Deployment Manager. Monitor and troubleshoot infrastructure issues, proactively identifying and resolving potential problems. Implement and manage backup and disaster recovery strategies for our GCP environment.
Optimize cloud costs and resource utilization, including BigQuery slot management. Collaboration & Communication: Work closely with cross-functional teams, including data engineering, data science, application development, security, and compliance. Communicate technical concepts and solutions effectively to both technical and non-technical stakeholders. Provide guidance and mentorship to junior team members. Participate in on-call rotation as needed. Develop and maintain thorough and reliable documentation of all cloud infrastructure processes, configurations, and security controls. Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field. Minimum of 5-8 years of experience in designing, implementing, and managing cloud infrastructure, with a strong focus on Google Cloud Platform (GCP). Proven experience in architecting and implementing Data Lakes on GCP, specifically using BigQuery. Hands-on experience with ETL/ELT processes and tools, with strong proficiency in Google Cloud Composer (Apache Airflow). Solid understanding of GCP services such as Compute Engine, Cloud Storage, Networking (VPC, Firewall Rules, Cloud DNS), IAM, Cloud Monitoring, and Cloud Logging. Experience with infrastructure-as-code (IaC) tools like Terraform or Cloud Deployment Manager. Strong understanding of security best practices for cloud environments, including identity and access management, data encryption, and network security. Excellent problem-solving, analytical, and troubleshooting skills. Strong communication, collaboration, and interpersonal skills. Bonus Points: Experience with Apigee for API management. Experience with containerization technologies like Docker and orchestration platforms like Cloud Run. Experience with Vertex AI for machine learning workflows on GCP. Familiarity with GCP Healthcare products and solutions (e.g., Cloud Healthcare API). Knowledge of healthcare data standards and regulations (e.g., HIPAA, HL7, FHIR). GCP Professional Architect certification.
Experience with scripting languages (e.g., Python, Bash). Experience with Looker.
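Cost optimization in BigQuery often starts from on-demand pricing arithmetic: query cost scales with bytes scanned. A rough sketch of that calculation; the per-TiB rate below is an assumption and should be checked against current GCP pricing for your region:

```python
# Rough sketch: estimating BigQuery on-demand query cost from bytes scanned.
# The per-TiB rate is an assumed placeholder, not current official pricing.

ON_DEMAND_USD_PER_TIB = 6.25  # assumption; varies by region and over time

def estimate_query_cost(bytes_scanned: int) -> float:
    """Approximate on-demand cost in USD for a query scanning N bytes."""
    tib = bytes_scanned / 2**40
    return round(tib * ON_DEMAND_USD_PER_TIB, 4)

# A query scanning 500 GiB:
print(estimate_query_cost(500 * 2**30))  # 3.0518
```

Arithmetic like this is also why flat-rate slot reservations become attractive once scanned volume is consistently high.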

Posted 2 weeks ago

Apply

3.0 - 31.0 years

0 - 0 Lacs

Bawana, Delhi-NCR

Remote

Apna logo

Job Title: MIS / Business Operations Executive / Data & Automation Specialist. Salary: ₹20,000 – ₹30,000. Location: [Add your office location]. Working Hours: 9:30 AM to 6:30 PM. Gender: Male Only. Age Requirement: 22 to 32 Years. Job Description: We are seeking a proactive and tech-savvy Business Operations Executive to support and streamline our day-to-day operations using automation tools, dashboards, and marketing systems. The ideal candidate will be responsible for managing data, implementing workflows, supporting marketing campaigns, and ensuring smooth integration across tools like Excel, CRM, and WhatsApp automation. Key Responsibilities: Manage and optimize FMS, IMS, MIS, PMS, and checklist dashboards. Handle CRM dashboard updates and client record management. Set up and manage WhatsApp automation and drip marketing campaigns. Create and manage automated processes using Google Sheets, Apps Script, HTML forms, and Looker Studio. Operate and manage the eSSL biometric system for attendance tracking. Work with Tally ERP 9 for basic accounting and data-entry tasks. Prepare professional documents and visuals using PowerPoint, Photoshop, and video editing tools. Analyze and report data using Advanced Excel and Power BI. Coordinate with team members to delegate and track tasks effectively. Assist with business automation using ChatGPT or AI-based tools. Requirements: Male candidates only (as per company policy). Age between 22 and 32 years. Minimum 1–3 years of experience in a similar operations/automation role (preferred). Strong knowledge of: Advanced Excel; Google Sheets, Apps Script, HTML, Python, C++, data mining; CRM and dashboard systems; Power BI / Looker Studio; WhatsApp automation tools; Photoshop / video editing software. Experience with Tally ERP 9 and biometric systems. Ability to work independently and handle multiple tasks efficiently. Strong analytical and problem-solving skills. Thanks and regards, Recruiter: Jyoti. Contact no.: 9220708293
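Drip-campaign setup of the kind described above ultimately boils down to scheduling messages at fixed offsets from a trigger event (e.g., a signup). A tool-agnostic sketch; the offsets and dates are invented:

```python
# Minimal drip-campaign scheduling logic, independent of any specific
# WhatsApp automation tool. Offsets and dates are hypothetical.
from datetime import date, timedelta

def drip_schedule(signup: date, day_offsets=(0, 3, 7, 14)):
    """Return the dates on which each drip message should be sent."""
    return [signup + timedelta(days=d) for d in day_offsets]

sends = drip_schedule(date(2024, 6, 1))
print([d.isoformat() for d in sends])
# ['2024-06-01', '2024-06-04', '2024-06-08', '2024-06-15']
```

In a Sheets-based workflow, the same logic would live in an Apps Script trigger that compares today's date against these computed send dates.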

Posted 2 weeks ago

Apply

3.0 - 31.0 years

0 - 0 Lacs

Sector 62A, Noida

Remote

Apna logo

Required Skills and Qualifications: 1-2 years of experience in a similar role (MIS, data analysis, or business intelligence). Proficiency in Looker Studio (Google Data Studio) for dashboard creation. Strong knowledge of Google Sheets and advanced formulae (ARRAYFORMULA, QUERY, IMPORTRANGE, etc.). Experience with Google Apps Script to automate workflows and integrate with other Google services (Forms, Calendar, Gmail, Drive). Good understanding of data visualization best practices and ability to present complex data in a clear, actionable format. Familiarity with data cleaning and transformation techniques. Excellent problem-solving and analytical skills.
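As a rough illustration of what a Sheets formula such as =QUERY(A:B, "select A, sum(B) group by A") computes, here is the same group-and-sum aggregation in plain Python (the column data is invented):

```python
# Group-by-and-sum, the core of many Sheets QUERY formulas,
# sketched in plain Python with hypothetical region/value data.
from collections import defaultdict

def group_sum(rows):
    """rows: iterable of (key, value) pairs -> {key: sum_of_values}."""
    totals = defaultdict(float)
    for key, value in rows:
        totals[key] += value
    return dict(totals)

data = [("North", 120.0), ("South", 80.0), ("North", 40.0)]
print(group_sum(data))  # {'North': 160.0, 'South': 80.0}
```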

Posted 2 weeks ago

Apply

10.0 - 12.0 years

0 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Foundit logo

Req ID: 323226 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Digital Solution Architect Sr. Advisor to join our team in Bengaluru, Karnataka (IN-KA), India (IN). Key Responsibilities: Design data platform architectures (data lakes, lakehouses, DWH) using modern cloud-native tools (e.g., Databricks, Snowflake, BigQuery, Synapse, Redshift). Architect data ingestion, transformation, and consumption pipelines using batch and streaming methods. Enable real-time analytics and machine learning through scalable and modular data frameworks. Define data governance models, metadata management, lineage tracking, and access controls. Collaborate with AI/ML, application, and business teams to identify high-impact use cases and optimize data usage. Lead modernization initiatives from legacy data warehouses to cloud-native and distributed architectures. Enforce data quality and observability practices for mission-critical workloads. Required Skills: 10+ years in data architecture, with strong grounding in modern data platforms and pipelines. Deep knowledge of SQL/NoSQL, Spark, Delta Lake, Kafka, ETL/ELT frameworks. Hands-on experience with cloud data platforms (AWS, Azure, GCP). Understanding of data privacy, security, lineage, and compliance (GDPR, HIPAA, etc.). Experience implementing data mesh/data fabric concepts is a plus. Expertise in technical solutions writing and presenting using tools such as Word, PowerPoint, Excel, Visio etc. High level of executive presence to be able to articulate the solutions to CXO level executives. Preferred Qualifications: Certifications in Snowflake, Databricks, or cloud-native data platforms. Exposure to AI/ML data pipelines, MLOps, and real-time data applications. Familiarity with data visualization and BI tools (Power BI, Tableau, Looker, etc.).
About NTT DATA NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at NTT DATA endeavors to make accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click . If you'd like more information on your EEO rights under the law, please click . For Pay Transparency information, please click.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

Linkedin logo

When you join Verizon You want more out of a career. A place to share your ideas freely — even if they’re daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife. What you’ll be doing... We seek a highly motivated and experienced Site Reliability Engineering (SRE) specialist to join our application team as a critical member focused on Application Operational Support. This role encompasses comprehensive monitoring responsibilities, including the implementation of alerts and dashboards utilizing various SRE tools. You will apply your expertise to develop unique solutions for complex problems, improving customer experience. Responsibilities include, but are not limited to: Collaborating with stakeholders to monitor applications using SRE tools and resolve issues. Developing alerts and dashboards for proactive monitoring. Creating an SRE gating process to identify anomalies in test environments, ensuring reliability and preventing issues in later stages. Monitoring activation and order fallouts, working with stakeholders to resolve errors, and identifying root causes to improve success rates. Tracking system outages impacting applications, identifying opportunities, and developing comprehensive proactive alerting mechanisms. Building automation and tools to support the team's operations. Where you’ll be working… In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.
What we’re looking for... Adept at leading teams through change to achieve exceptional outcomes, you foster a culture of innovation and creative problem-solving. Your customer-centric approach ensures phenomenal experiences. A strong communicator, you can clearly articulate complex ideas to all levels of leadership. You are driven by curiosity about emerging technologies and their potential. You’ll Need To Have: Bachelor’s degree with four or more years of work experience. Four or more years of relevant work experience. Proficiency in Splunk, New Relic, Kibana, Catchpoint, Grafana (for dashboard creation), Quantum Metric, or other digital analytics platforms. Strong analytical and problem-solving abilities. Excellent communication and presentation skills, including the ability to convey complex information effectively to diverse audiences. Proven ability to multitask, prioritize tasks, work independently, and drive issues to resolution. Even better if you have one or more of the following: Experience in Python, Java, or relational/non-relational database tools. Experience using data visualization tools such as Qlik Sense, Looker, or Tableau. Proficiency in Microsoft Excel / Google Sheets with knowledge of scripting and automation. If Verizon and this role sound like a fit for you, we encourage you to apply even if you don’t meet every “even better” qualification listed above. Scheduled Weekly Hours 40. Equal Employment Opportunity Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics.
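An SRE gating check of the kind described above can be as simple as flagging values that sit far from the mean of recent observations. A minimal sketch, not tied to any particular monitoring stack (the latency data is invented); real deployments would use the alerting built into tools like Splunk or Grafana:

```python
# Toy anomaly check: flag points more than k population standard
# deviations from the mean. Data below is made up.
from statistics import mean, pstdev

def anomalies(values, k=3.0):
    """Return the values lying more than k sigma from the mean."""
    mu, sigma = mean(values), pstdev(values)
    if sigma == 0:
        return []
    return [v for v in values if abs(v - mu) > k * sigma]

latencies = [100, 102, 98, 101, 99, 100, 180]  # response times in ms
print(anomalies(latencies, k=2.0))  # [180]
```

A z-score gate like this is crude (one outlier inflates sigma), which is why production systems often prefer rolling baselines or percentile thresholds.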

Posted 2 weeks ago

Apply

0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Linkedin logo

Job Summary: We are seeking a talented and experienced Data Engineer to join our team. The ideal candidate will be responsible for designing, building, and maintaining scalable data pipelines and systems to support analytics and data-driven decision-making. This role requires expertise in data processing, data modeling, and big data technologies. Key Responsibilities: Design and develop data pipelines to collect, transform, and load data into data lakes and data warehouses. Optimize ETL workflows to ensure data accuracy, reliability, and scalability. Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements. Implement and manage cloud-based data platforms (e.g., AWS, Azure, or Google Cloud Platform). Develop data models to support analytics and reporting. Monitor and troubleshoot data systems to ensure high performance and minimal downtime. Ensure data quality and security through governance best practices. Document workflows, processes, and architecture to facilitate collaboration and scalability. Stay updated with emerging data engineering technologies and trends. Required Skills and Qualifications: Strong proficiency in SQL and Python for data processing and transformation. Hands-on experience with big data technologies like Apache Spark, Hadoop, or Kafka. Knowledge of data warehousing concepts and tools such as Snowflake, BigQuery, or Redshift. Experience with workflow orchestration tools like Apache Airflow or Prefect. Familiarity with cloud platforms (AWS, Azure, GCP) and their data services. Understanding of data governance, security, and compliance best practices. Strong analytical and problem-solving skills. Excellent communication and collaboration abilities. Preferred Qualifications: Certification in cloud platforms (AWS, Azure, or GCP). Experience with NoSQL databases like MongoDB, Cassandra, or DynamoDB.
Familiarity with DevOps practices and tools like Docker, Kubernetes, and Terraform. Exposure to machine learning pipelines and tools like MLflow or Kubeflow. Knowledge of data visualization tools like Power BI, Tableau, or Looker.
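The extract-transform-load cycle this role centers on, reduced to a toy in-memory example; the records, field names, and "warehouse" are invented stand-ins for real sources and sinks:

```python
# Toy ETL pass: extract raw records, cast and normalize them,
# then load them into a keyed store. All data is hypothetical.

def extract():
    # Stand-in for reading from an API, file, or source database.
    return [
        {"id": 1, "amount": "19.99", "country": "in"},
        {"id": 2, "amount": "5.00", "country": "us"},
    ]

def transform(records):
    # Cast string amounts to floats and normalize country codes.
    return [
        {"id": r["id"], "amount": float(r["amount"]),
         "country": r["country"].upper()}
        for r in records
    ]

def load(records, warehouse):
    # Stand-in for writing to Snowflake/BigQuery/Redshift.
    for r in records:
        warehouse[r["id"]] = r

warehouse = {}
load(transform(extract()), warehouse)
print(warehouse[1])  # {'id': 1, 'amount': 19.99, 'country': 'IN'}
```

Orchestrators like Airflow or Prefect schedule and monitor exactly this kind of extract/transform/load step chain, just across distributed systems rather than dicts.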

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

India

On-site

Linkedin logo

We don't just want reports. We want results. At Found&Chosen, we combine creative firepower with strategic intelligence. We're looking for a Senior Data Analyst who's not just comfortable with data—but obsessed with uncovering stories in the numbers. You'll be at the heart of our Account-Based Marketing (ABM) engine, shaping strategies with insights that drive real impact for global B2B brands. What You'll Do: Own data strategy across multiple ABM accounts using Terminus, DemandScience, Propensity, HubSpot, and Salesforce. Monitor campaign performance, lead scoring, pipeline contribution, and ROI. Build dashboards and visualizations that answer real marketing questions—not just vanity metrics. Partner with client-facing teams to influence campaign targeting and budget decisions. Identify inefficiencies in funnel performance and recommend optimizations. Bring clarity to chaos: consolidate, clean, and standardize data across sources. Stay ahead of trends in B2B analytics, marketing operations, and ABM tools. Requirements You Might Be the One If You Have: 3-5+ years experience in a marketing analytics or data analyst role. ABM experience a must. Strong working knowledge of Salesforce, Terminus, Propensity, and DemandScience (or similar ABM tools). Advanced Excel/Sheets skills, and experience with BI tools like Looker, Power BI, or Tableau. A strategic mindset—comfortable speaking the language of both marketers and data scientists. A love for making complex data simple, visual, and actionable. Strong communication and collaboration skills. Attention to detail and a problem-solving mindset. Bonus: agency or fast-paced marketing environment experience. Benefits Why Join Found&Chosen: We're a strategy-first marketing studio partnering with brands that want more than templates. Our analysts play a central role in shaping smart, high-conversion campaigns for real decision-makers. Here, your work isn't buried—it's valued, seen, and used.
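Two of the metrics named above, ROI and pipeline contribution, have simple definitions worth keeping straight. A sketch with invented campaign numbers:

```python
# Standard marketing-metric definitions, computed on made-up figures.

def roi(revenue: float, cost: float) -> float:
    """Return on investment: net gain divided by cost."""
    return (revenue - cost) / cost

def pipeline_share(campaign_pipeline: float, total_pipeline: float) -> float:
    """Fraction of total pipeline value attributed to one campaign."""
    return campaign_pipeline / total_pipeline

print(roi(150_000, 50_000))                # 2.0  (a 200% return)
print(pipeline_share(300_000, 1_200_000))  # 0.25
```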

Posted 2 weeks ago

Apply

2.0 - 4.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

About TripleLift We're TripleLift, an advertising platform on a mission to elevate digital advertising through beautiful creative, quality publishers, actionable data and smart targeting. Through over 1 trillion monthly ad transactions, we help publishers and platforms monetize their businesses. Our technology is where the world's leading brands find audiences across online video, connected television, display and native ads. Brand and enterprise customers choose us because of our innovative solutions, premium formats, and supportive experts dedicated to maximizing their performance. As part of the Vista Equity Partners portfolio, we are NMSDC certified, qualify for diverse spending goals and are committed to economic inclusion. Find out how TripleLift raises up the programmatic ecosystem at triplelift.com. Job Overview We are looking for a detail-oriented and data-driven Yield Analyst with a focus on publisher-side yield optimization in the programmatic advertising space. The ideal candidate will have expertise in managing and maximizing revenue from programmatic ad inventory, with a focus on supply-side platforms (SSPs), private marketplaces (PMPs), and direct publisher partnerships. As a Yield Analyst, you will be responsible for analyzing inventory performance, optimizing yield strategies, and ensuring our publisher partners achieve their revenue goals. This is an exciting opportunity for someone passionate about maximizing the value of publisher ad inventory and working closely with a collaborative team in a dynamic, fast-paced environment. Key Responsibilities Yield Management & Optimization: Monitor, analyze, and optimize programmatic ad inventory across multiple SSPs and exchanges to maximize yield for publishers. Revenue Forecasting & Reporting: Develop and deliver regular reports on revenue performance, impressions, eCPMs (effective CPM), fill rates, and inventory utilization, and provide actionable insights for yield improvements. 
Inventory & Pricing Strategy: Work closely with the product and supply teams to optimize pricing strategies for programmatic inventory, ensuring proper management of floor prices, private marketplace deals, and remnant inventory. Bidstream & Auction Analysis: Analyze bidstream data to identify trends, anomalies, and opportunities to enhance bidding performance, increase competition, and maximize publisher revenue. Ad Quality & Placement Optimization: Ensure high-quality ad placements that maximize revenue without compromising user experience, brand safety, or ad relevance. Partnership Collaboration: Collaborate with internal teams and external partners (Publisher Client Services, Publishers, Product) to ensure that demand sources are fully leveraged, inventory is being fully monetized, and revenue goals are met. Troubleshooting & Issue Resolution: Identify and address issues related to ad delivery, revenue leakage, or technical problems that may impact yield optimization, collaborating with internal technical teams to resolve them. Market Intelligence & Trend Analysis: Stay up-to-date on the latest trends in programmatic advertising, particularly from a publisher perspective. Benchmark yield performance against industry standards and competitors to suggest new strategies. Continuous Improvement: Continuously improve and refine yield management practices, leveraging data, new technologies, and best practices to optimize monetization efforts. Qualifications Education: Bachelor’s degree in Business, Marketing, Data Science, Economics, or a related field. Experience: 2-4 years of experience in publisher-focused programmatic advertising, yield optimization, or related data analytics roles. Familiarity with SSPs, RTB, and header bidding is a must. Skills: Strong analytical skills with the ability to assess and interpret large datasets to inform decision-making. Proficiency with data analysis and reporting tools (Excel, SQL, Tableau, Looker, PowerBI, etc.). 
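The reporting described above leans on a couple of standard yield metrics (eCPM and fill rate). Their definitions are simple; here is a sketch with made-up numbers:

```python
# Core programmatic yield metrics, computed on hypothetical figures.

def ecpm(revenue: float, impressions: int) -> float:
    """Effective CPM: revenue earned per thousand impressions."""
    return revenue * 1000 / impressions

def fill_rate(filled: int, requests: int) -> float:
    """Share of ad requests that returned a paid impression."""
    return filled / requests

print(ecpm(450.0, 300_000))          # 1.5  (USD per thousand impressions)
print(fill_rate(300_000, 400_000))   # 0.75
```

Because revenue = eCPM × impressions / 1000, yield work is usually a trade-off between the two factors: raising floor prices can lift eCPM while depressing fill rate, and vice versa.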
Deep understanding of the programmatic ad tech ecosystem, including SSPs, RTB, PMP, header bidding, and auction mechanics. Knowledge of key programmatic metrics, such as eCPM, fill rate, ad latency, and inventory yield. Excellent problem-solving skills with the ability to troubleshoot and resolve technical issues quickly. Strong communication skills to explain complex yield strategies and performance data to non-technical stakeholders. Ability to work collaboratively with cross-functional teams (sales, product, account management, and external partners). Technical Knowledge: Familiarity with ad tech platforms such as Google Ad Manager, OpenX, PubMatic, Rubicon, etc. Preferred Skills: Experience with data visualization tools (e.g., Tableau, Power BI), Python, R, or other data manipulation tools for advanced analytics. Life at TripleLift At TripleLift, we’re a team of great people who like who they work with and want to make everyone around them better. This means being positive, collaborative, and compassionate. We hustle harder than the competition and are continuously innovating. Learn more about TripleLift and our culture by visiting our LinkedIn Life page. Establishing People, Culture and Community Initiatives At TripleLift, we are committed to building a culture where people feel connected, supported, and empowered to do their best work. We invest in our people and foster a workplace that encourages curiosity, celebrates shared values, and promotes meaningful connections across teams and communities. We want to ensure the best talent of every background, viewpoint, and experience has an opportunity to be hired, belong, and develop at TripleLift. Through our People, Culture, and Community initiatives, we aim to create an environment where everyone can thrive and feel a true sense of belonging. Privacy Policy Please see our Privacy Policies on our TripleLift and 1plusX websites. 
TripleLift does not accept unsolicited resumes from any type of recruitment search firm. Any resume submitted in the absence of a signed agreement will become the property of TripleLift and no fee shall be due.

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Job Title: Project Data Manager Location: [Your Location] Department: Data & Analytics / Project Management Office Reports To: Head of Data Engineering / Director of Projects Job Summary: We are looking for an experienced and versatile Project Data Manager to lead data-centric projects, combining robust project management with deep technical data expertise. This hybrid role is responsible for overseeing the successful delivery of data initiatives—from understanding stakeholder needs to designing and deploying scalable data pipelines and analytics-ready infrastructure. You will ensure alignment between business goals, data strategy, and technical execution while maintaining high standards of data quality and operational efficiency. Key Responsibilities: Project Management: Manage the planning, execution, and delivery of data engineering projects, ensuring they are completed on time, within scope, and within budget. Define project scopes, objectives, timelines, resource plans, and risk mitigation strategies in coordination with stakeholders. Serve as the primary point of contact between technical teams, business units, and leadership. Track project milestones, deliverables, and KPIs using appropriate tools (e.g., Jira, Asana, MS Project). Facilitate project meetings, status updates, and stakeholder communications. Data Engineering & Technical Responsibilities: Design, develop, and maintain scalable data pipelines and ETL/ELT processes to ingest, transform, and store data from various structured and unstructured sources.
Implement efficient data models and schemas to support business intelligence, reporting, and machine learning use cases. Ensure data accuracy and quality by building validation, monitoring, and error-handling mechanisms. Collaborate closely with analysts, data scientists, and business units to deliver clean, reliable, and accessible data. Optimize performance, scalability, and cost-effectiveness of data infrastructure and cloud-based platforms. Stay up to date with emerging trends in data engineering, cloud computing, and project management best practices. Required Qualifications: Bachelor's degree in Computer Science, Data Engineering, Information Systems, or a related field. Minimum 8 years of experience in data engineering or related technical roles, with at least 2 years of experience in a project management capacity. Proficient in SQL, Python, and data pipeline tools (e.g., Airflow, Kafka, Talend, Informatica). Hands-on experience with relational and NoSQL databases (e.g., Postgres, MongoDB, Oracle), data warehousing platforms (e.g., Snowflake, Redshift, BigQuery), and distributed computing frameworks (e.g., Spark, Flink). Familiarity with cloud services (AWS, Azure, GCP) and infrastructure-as-code concepts. Understanding of data governance, data security, and compliance best practices. Strong working knowledge of Agile, Scrum, or Waterfall methodologies. Excellent analytical thinking, problem-solving skills, and attention to detail. Effective verbal and written communication skills for cross-functional collaboration and stakeholder management. Preferred Qualifications: PMP, PRINCE2, or Agile/Scrum certification. Experience with data visualization platforms such as Power BI, Tableau, or Looker. Experience building CI/CD pipelines for data deployment and managing version control (e.g., Git). Prior experience in leading data transformation initiatives or enterprise-wide data modernization projects. 

Posted 2 weeks ago

Apply

Exploring Looker Jobs in India

Looker, a powerful data visualization and business intelligence tool, is gaining popularity in India, leading to a growing demand for professionals with expertise in this area. Companies across various industries are actively seeking skilled individuals who can utilize Looker to analyze data and make informed business decisions.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Delhi
  4. Hyderabad
  5. Pune

Average Salary Range

The salary range for Looker professionals in India varies based on experience levels. Entry-level positions may start at around ₹5-6 lakhs per annum, while experienced professionals can earn up to ₹15-20 lakhs per annum.

Career Path

Career progression in Looker typically follows a path from Junior Analyst to Senior Analyst, and then to roles such as Business Intelligence Manager or Data Analytics Lead. With experience and additional certifications, professionals can advance to roles like Data Scientist or Chief Data Officer.

Related Skills

Aside from proficiency in Looker, professionals in this field are often expected to have knowledge of SQL, data visualization techniques, data modeling, and experience with other BI tools like Tableau or Power BI.

Interview Questions

  • What is LookML? (basic)
  • How do you create a new Look in Looker? (basic)
  • Explain the difference between a Dimension and a Measure in Looker. (basic)
  • How can you optimize Looker queries for better performance? (medium)
  • What are some common pitfalls to avoid when working with Looker? (medium)
  • How do you handle data security in Looker? (medium)
  • Can you explain the concept of Derived Tables in Looker? (advanced)
  • How would you approach building a complex dashboard in Looker? (advanced)
  • How do you schedule data deliveries in Looker? (advanced)
  • Explain the process of data caching in Looker. (advanced)
  • ...
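Several of the basic questions above hinge on the dimension-vs-measure distinction. As a rough illustration (the view, table, and field names here are invented), a LookML view typically declares both:

```lookml
view: orders {
  sql_table_name: analytics.orders ;;

  # A dimension: a row-level attribute you can group or filter by.
  dimension: status {
    type: string
    sql: ${TABLE}.status ;;
  }

  # A measure: an aggregate computed across the grouped rows.
  measure: total_revenue {
    type: sum
    sql: ${TABLE}.amount ;;
  }
}
```

In short, dimensions describe individual rows, while measures summarize groups of rows at query time.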

Closing Remark

As the demand for Looker professionals continues to rise in India, now is the perfect time to enhance your skills and pursue opportunities in this exciting field. Prepare thoroughly, showcase your expertise, and apply confidently for Looker jobs to advance your career in data analytics and business intelligence.

cta

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies