Home
Jobs

731 BigQuery Jobs - Page 26

JobPe aggregates listings so you can find and compare jobs in one place, but applications are submitted directly on the original job portal.

7 - 12 years

32 - 37 Lacs

Jaipur

Work from Office


About The Role
Job Title: Analytics Senior Analyst
Location: Jaipur, India
Corporate Title: AVP

Role Description
You will be joining the Data & Analytics team as part of the Global Procurement division. The team's purpose is to:
- Deliver trusted third-party data and insights to unlock commercial value and identify risk
- Develop and execute the Global Procurement Data Strategy
- Deliver the golden source of Global Procurement data, analysis and insights via dbPi, our Tableau self-service platform, leveraging automation and scalability on Google Cloud
- Provide data and analytical support to Global Procurement prioritised change initiatives
The team leverages several tools and innovative techniques to create value-added insights for stakeholders across end-to-end Procurement processes including, but not limited to, Third-Party Risk, Contracting, Spend, Performance Management, etc.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for ages 35 and above

Your key responsibilities
- Develop a sound understanding of the various tools and the entire suite of analytical offerings on dbPi, the standard procurement insights platform.
- Support stakeholders by understanding their requirements, challenging appropriately where needed to scope the problem, conceptualizing the optimum approach, and developing solutions using appropriate tools and visualisation techniques.
- Lead small project teams in delivering the analytics change book of work, keeping internal and external stakeholders updated on project progress while driving forward the key change topics.
- For more complex requests, connect the dots and arrive at a solution by establishing linkages across different systems and processes.
- Take end-to-end responsibility for any change request to an existing analytical product or dashboard, from understanding the requirement through development, testing and QA to delivery to stakeholders' satisfaction.
- Deliver automation and clean-data initiatives, such as deployment of a rules engine, data-quality checks enabled through Google Cloud, and bringing Procurement data sources into GCP.
- Act as a thought partner in the Chief Information Office's deployment of Google Cloud Platform to migrate the data infrastructure layer (ETL processes) currently managed by the Analytics team.
- Work in close collaboration with cross-functional teams, including developers, system administrators, and business stakeholders.

Your skills and experience
We are looking for talent with a degree (or equivalent) in Engineering, Mathematics, Statistics or the Sciences from an accredited college or university to develop analytical solutions that support our stakeholders' strategic decision making. Any professional certification in Advanced Analytics, Data Visualisation or a Data Science-related domain is a plus. You have a natural curiosity for numbers and strong quantitative and logical thinking skills. You ensure results are of high data quality and accuracy.
You have working experience on Google Cloud, have worked with cross-functional teams to enable data-source and process migration to GCP, and have working experience with SQL. You are adaptable to emerging technologies, such as leveraging Machine Learning and AI to drive innovation. Procurement experience (useful, though not essential) across vendor management, sourcing, risk, contracts and purchasing, preferably within a global and complex environment, is welcome. You have the aptitude to understand stakeholders' requirements, identify relevant data sources, integrate data, perform analysis and interpret the results by identifying trends and patterns. You enjoy the problem-solving process, think out of the box, and break a problem down into its constituent parts with a view to developing an end-to-end solution. You display enthusiasm to work in the data analytics domain and strive for continuous learning and improvement of your technical and soft skills. You demonstrate working knowledge of analytical tools such as Tableau, databases, Alteryx, Pentaho, Looker and BigQuery in order to work with large datasets and derive insights for decision making. You enjoy working in a team, and your English language skills make it easy for you to work in an international environment and with global, virtual teams.

How we'll support you
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs
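For illustration, a minimal sketch of the kind of Google Cloud data-quality check this role describes, using the google-cloud-bigquery Python client. The project, dataset, table and column names are hypothetical placeholders, not taken from the posting.

# Minimal BigQuery data-quality check: flag NULL and duplicate keys.
from google.cloud import bigquery

client = bigquery.Client(project="my-procurement-project")  # hypothetical project

QUALITY_SQL = """
SELECT
  COUNTIF(vendor_id IS NULL) AS null_ids,
  COUNT(*) - COUNT(DISTINCT vendor_id) AS extra_rows  -- rows beyond one per key (NULL keys land here too)
FROM `my-procurement-project.procurement.vendors`
"""

row = next(iter(client.query(QUALITY_SQL).result()))
if row.null_ids or row.extra_rows:
    raise ValueError(f"Quality check failed: {row.null_ids} null ids, {row.extra_rows} extra rows")
print("vendors table passed basic quality checks")

A check like this can be scheduled (for example from Cloud Composer) so that curated tables are validated on every load.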

Posted 1 month ago

Apply

10 - 15 years

25 - 40 Lacs

Pune

Work from Office


Introduction:
We are seeking a highly skilled and experienced Google Cloud Platform (GCP) Solution Architect. As a Solution Architect, you will play a pivotal role in designing and implementing cloud-based solutions for our team using GCP. The ideal candidate will have a deep understanding of cloud architecture, a proven track record of delivering cloud-based solutions, and experience with GCP technologies. You will work closely with technical teams and clients to ensure the successful deployment and optimization of cloud solutions.

Responsibilities:
- Lead the design and architecture of GCP-based solutions, ensuring scalability, security, performance, and cost-efficiency.
- Collaborate with business stakeholders, engineering teams, and clients to understand technical requirements and translate them into cloud-based solutions.
- Provide thought leadership and strategic guidance on cloud technologies, best practices, and industry trends.
- Design and implement cloud-native applications, data platforms, and microservices on GCP.
- Ensure cloud solutions are aligned with clients' business goals and requirements, with a focus on automation and optimization.
- Conduct cloud assessments, identifying areas for improvement, migration strategies, and cost-saving opportunities.
- Oversee and manage the implementation of GCP solutions, ensuring seamless deployment and operational success.
- Create detailed documentation of cloud architecture, deployment processes, and operational guidelines.
- Engage in pre-sales activities, including solution design, proofs of concept (PoCs), and presenting GCP solutions to clients.
- Ensure compliance with security and regulatory requirements in the cloud environment.

Requirements:
- 2+ years of experience as a Cloud Architect or in a similar role, with strong expertise in Google Cloud Platform.
- In-depth knowledge of GCP services, including Compute Engine, Kubernetes Engine, BigQuery, Cloud Storage, Cloud Functions, and networking.
- Experience with infrastructure-as-code tools such as Terraform.
- Strong understanding of cloud security, identity management, and compliance frameworks (e.g., GDPR, HIPAA).
- Hands-on experience with GCP networking, IAM, and logging/monitoring tools (Cloud Monitoring, Cloud Logging).
- Strong experience in designing and deploying highly available, fault-tolerant, and scalable solutions.
- Proficiency in programming languages such as Java and Go.
- Experience with containerization and orchestration technologies such as Docker, Kubernetes, and GKE (Google Kubernetes Engine).
- Experience in cloud cost management and optimization using GCP tools.

Thanks, Pratap
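On the cost-efficiency point, one concrete BigQuery habit is estimating a query's scan volume with a dry run before executing it. Below is a minimal sketch using the google-cloud-bigquery Python client; the project and table names are hypothetical.

# Dry-run a query to see how many bytes it would scan (on-demand
# BigQuery pricing bills by bytes processed).
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")
config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)

sql = "SELECT order_id, total FROM `my-gcp-project.sales.orders` WHERE order_date >= '2024-01-01'"
job = client.query(sql, job_config=config)  # nothing is executed or billed

print(f"Query would scan {job.total_bytes_processed / 2**30:.2f} GiB")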

Posted 1 month ago

Apply

7 - 10 years

8 - 14 Lacs

Mumbai

Work from Office


We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. Key Responsibilities : - Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration. - Data Engineering: Utilize BigQuery, DataProc, GCS, PySpark, Airflow, and Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows. - Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms. - SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently. - Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs. - Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations. Skills and Qualifications : - Proven experience as a Data Engineer or similar role. - Strong knowledge of Oracle to BigQuery data warehouse migration and modernization. - Proficiency in BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. - In-depth knowledge of Oracle DB and PL/SQL. - Excellent SQL writing skills. - Strong analytical and problem-solving abilities. - Ability to work collaboratively with cross-functional teams. - Excellent communication and interpersonal skills. Preferred Qualifications : - Experience with other data management tools and technologies. - Knowledge of cloud-based data solutions. - Certification in data engineering or related fields.
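As a sketch of what one step of such a migration can look like: reading an Oracle table over JDBC with PySpark and writing it to BigQuery via the spark-bigquery connector. The hosts, credentials, bucket and table names are placeholders, and the Oracle JDBC driver and connector JARs must be on the Spark classpath.

# Copy one Oracle table into BigQuery with PySpark.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("oracle-to-bq").getOrCreate()

orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:oracle:thin:@//oracle-host:1521/ORCLPDB1")
    .option("dbtable", "SALES.ORDERS")
    .option("user", "etl_user")
    .option("password", "********")
    .option("driver", "oracle.jdbc.OracleDriver")
    .load()
)

(
    orders.write.format("bigquery")
    .option("table", "my-project.sales.orders")
    .option("temporaryGcsBucket", "my-staging-bucket")  # GCS staging for the BigQuery load
    .mode("overwrite")
    .save()
)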

Posted 1 month ago

Apply

7 - 10 years

8 - 14 Lacs

Surat

Work from Office


We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. Key Responsibilities : - Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration. - Data Engineering: Utilize BigQuery, DataProc, GCS, PySpark, Airflow, and Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows. - Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms. - SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently. - Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs. - Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations. Skills and Qualifications : - Proven experience as a Data Engineer or similar role. - Strong knowledge of Oracle to BigQuery data warehouse migration and modernization. - Proficiency in BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. - In-depth knowledge of Oracle DB and PL/SQL. - Excellent SQL writing skills. - Strong analytical and problem-solving abilities. - Ability to work collaboratively with cross-functional teams. - Excellent communication and interpersonal skills. Preferred Qualifications : - Experience with other data management tools and technologies. - Knowledge of cloud-based data solutions. - Certification in data engineering or related fields.

Posted 1 month ago

Apply

7 - 10 years

8 - 14 Lacs

Kanpur

Work from Office


Role: Data Engineer. We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. Key Responsibilities : - Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration. - Data Engineering: Utilize BigQuery, DataProc, GCS, PySpark, Airflow, and Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows. - Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms. - SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently. - Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs. - Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations. Skills and Qualifications : - Proven experience as a Data Engineer or similar role. - Strong knowledge of Oracle to BigQuery data warehouse migration and modernization. - Proficiency in BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. - In-depth knowledge of Oracle DB and PL/SQL. - Excellent SQL writing skills. - Strong analytical and problem-solving abilities. - Ability to work collaboratively with cross-functional teams. - Excellent communication and interpersonal skills. Preferred Qualifications : - Experience with other data management tools and technologies. - Knowledge of cloud-based data solutions. - Certification in data engineering or related fields.

Posted 1 month ago

Apply

7 - 10 years

8 - 14 Lacs

Hyderabad

Work from Office


Role: Data Engineer. We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. Key Responsibilities : - Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration. - Data Engineering: Utilize BigQuery, DataProc, GCS, PySpark, Airflow, and Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows. - Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms. - SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently. - Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs. - Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations. Skills and Qualifications : - Proven experience as a Data Engineer or similar role. - Strong knowledge of Oracle to BigQuery data warehouse migration and modernization. - Proficiency in BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. - In-depth knowledge of Oracle DB and PL/SQL. - Excellent SQL writing skills. - Strong analytical and problem-solving abilities. - Ability to work collaboratively with cross-functional teams. - Excellent communication and interpersonal skills. Preferred Qualifications : - Experience with other data management tools and technologies. - Knowledge of cloud-based data solutions. - Certification in data engineering or related fields.

Posted 1 month ago

Apply

5 - 10 years

20 - 35 Lacs

Bengaluru

Hybrid


GCP Data Engineer - 5+ years of experience - GCP (all services needed for Big Data pipelines, such as BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, App Engine), Spark, Scala, Hadoop - Python, PySpark, orchestration (Airflow), SQL - CI/CD (experience with deployment pipelines) - Architecture and design of cloud-based Big Data pipelines and exposure to any ETL tools. Nice to have - GCP certifications
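As an illustration of the orchestration piece, a minimal Airflow DAG that runs a daily BigQuery transformation with BigQueryInsertJobOperator (from the Google provider package). The DAG id, project, datasets and SQL are placeholders.

# Airflow DAG sketch: daily BigQuery rollup.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_sales_rollup",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_sales",
        configuration={
            "query": {
                "query": """
                    SELECT order_date, SUM(total) AS revenue
                    FROM `my-project.sales.orders`
                    GROUP BY order_date
                """,
                "destinationTable": {
                    "projectId": "my-project",
                    "datasetId": "reporting",
                    "tableId": "daily_revenue",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )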

Posted 1 month ago

Apply

3 - 5 years

9 - 18 Lacs

Chennai

Hybrid


Role & responsibilities: As part of the Tonik Data Analytics team, the candidate will be responsible for: Developing and enhancing the Data Lake Framework for ingestion of data from various sources and providing reports to downstream systems/users. Working with the different Tonik IT teams to implement the data requirements. Developing data pipelines based on requirements on key GCP services (BigQuery, Airflow, GCS) using Python/SQL. Ensuring proper GCP standards are followed in the implementation. Preferred candidate profile: Hands-on experience in at least one programming language (Python, Java). Working experience in a cloud platform (AWS/GCP/Azure). Experience in design patterns and designing scalable solutions. Hands-on experience in SQL, with the ability to convert requirements into standard, scalable, cost-effective and well-performing solutions. Should be aware of data engineering principles and data pipeline techniques. Communicates effectively with stakeholders and other team members. Has implemented end-to-end automated pipelines to ingest data in different formats (CSV, fixed-width, JSON, etc.), as sketched below. Works closely with various business teams to design and implement modern, scalable data solutions using a range of new and emerging technologies from the Google Cloud Platform. Works with Agile implementation approaches in delivery. Should have hands-on experience with the following key offerings from GCP or equivalent services from AWS: Composer/Airflow, BigQuery, Dataflow, Cloud Storage, Apache Beam, Dataproc. Should have a good understanding of security-related configuration in BigQuery and how to handle data securely while sharing. Nice to have: exposure to/knowledge of ML and ML pipelines.
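As a sketch of the ingestion step described above, loading a CSV landing file from GCS into BigQuery with the Python client; the project, bucket, dataset and table names are placeholders.

# Load a CSV file from GCS into BigQuery with schema autodetection.
from google.cloud import bigquery

client = bigquery.Client(project="my-data-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,           # skip the header row
    autodetect=True,               # infer the schema from the file
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

job = client.load_table_from_uri(
    "gs://my-landing-bucket/customers/2024-06-01.csv",
    "my-data-project.raw.customers",
    job_config=job_config,
)
job.result()  # block until the load finishes
print(f"Loaded {client.get_table('my-data-project.raw.customers').num_rows} rows")

In a production framework the same load would typically be wrapped in a Composer/Airflow task with explicit schemas and validation rather than autodetection.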

Posted 1 month ago

Apply

7 - 10 years

8 - 14 Lacs

Nagpur

Work from Office


We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. Key Responsibilities : - Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration. - Data Engineering: Utilize BigQuery, DataProc, GCS, PySpark, Airflow, and Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows. - Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms. - SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently. - Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs. - Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations. Skills and Qualifications : - Proven experience as a Data Engineer or similar role. - Strong knowledge of Oracle to BigQuery data warehouse migration and modernization. - Proficiency in BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. - In-depth knowledge of Oracle DB and PL/SQL. - Excellent SQL writing skills. - Strong analytical and problem-solving abilities. - Ability to work collaboratively with cross-functional teams. - Excellent communication and interpersonal skills. Preferred Qualifications : - Experience with other data management tools and technologies. - Knowledge of cloud-based data solutions. - Certification in data engineering or related fields.

Posted 1 month ago

Apply

7 - 10 years

8 - 14 Lacs

Chennai

Work from Office


We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. Key Responsibilities : - Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration. - Data Engineering: Utilize BigQuery, DataProc, GCS, PySpark, Airflow, and Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows. - Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms. - SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently. - Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs. - Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations. Skills and Qualifications : - Proven experience as a Data Engineer or similar role. - Strong knowledge of Oracle to BigQuery data warehouse migration and modernization. - Proficiency in BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. - In-depth knowledge of Oracle DB and PL/SQL. - Excellent SQL writing skills. - Strong analytical and problem-solving abilities. - Ability to work collaboratively with cross-functional teams. - Excellent communication and interpersonal skills. Preferred Qualifications : - Experience with other data management tools and technologies. - Knowledge of cloud-based data solutions. - Certification in data engineering or related fields.

Posted 1 month ago

Apply

2 - 6 years

4 - 8 Lacs

Bengaluru

Work from Office


We are seeking a Data Analyst: an experienced analytics professional who is passionate about unleashing the power of data to inform decision-making, achieve strategic objectives, and support hiring and retention of world-class talent. As an integral part of the team, the Data Analyst will use analytical skills and business acumen to turn data into knowledge and drive business success. Requirements and Qualifications: Minimum 5+ years of experience in Data Analytics, BI Analytics, or BI Engineering, preferably in a globally recognized organization. Expert-level proficiency in writing complex SQL queries to create views in data warehouses like Snowflake, Redshift, SQL Server, Oracle, or BigQuery. Advanced skills in designing and developing data models and dashboards using BI tools such as Tableau, Domo, Looker, etc. Intermediate-level skills with analytical tools such as Excel, Google Sheets, or Power BI (e.g., complex formulas, lookups, pivot tables). Bachelor's or advanced degree in Data Analytics, Data Science, Information Systems, Computer Science, Applied Mathematics, Statistics, or a related field. Willingness to collaborate with internal team members and stakeholders across different time zones. Roles and Responsibilities: Perform advanced analytics such as cohort analysis, scenario analysis, time series analysis, and predictive analysis, and create powerful data visualizations. Clearly articulate assumptions, data interpretations, and analytical findings in a variety of formats for different audiences. Design data models that define the structure and relationship of data elements across various sources based on reporting and analytics needs. Collaborate with BI Engineers to build scalable, high-performing reporting and analytics solutions. Write SQL queries to extract and manipulate data from warehouses such as Snowflake. Conduct data validation and quality assurance checks to ensure high standards of data integrity. Investigate and resolve data issues, including root cause analysis when inconsistencies arise in reporting.
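Since the role calls for cohort and time-series analysis on warehouse data, here is a minimal sketch of pulling a monthly signup-cohort retention table from BigQuery into pandas; the project, table and column names are hypothetical.

# Monthly cohort retention pulled into a pandas pivot table.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")

COHORT_SQL = """
SELECT
  DATE_TRUNC(signup_date, MONTH) AS cohort_month,
  DATE_DIFF(activity_date, signup_date, MONTH) AS months_since_signup,
  COUNT(DISTINCT user_id) AS active_users
FROM `my-analytics-project.product.user_activity`
GROUP BY cohort_month, months_since_signup
ORDER BY cohort_month, months_since_signup
"""

df = client.query(COHORT_SQL).to_dataframe()
retention = df.pivot(index="cohort_month",
                     columns="months_since_signup",
                     values="active_users")
print(retention.head())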

Posted 1 month ago

Apply

8 - 12 years

18 - 25 Lacs

Noida

Remote


Job Title: Data Modeler, Enterprise Data Platform (BigQuery, Retail Domain). Location: Remote. Duration: Contract. Timing: US EST hours. We have an immediate need for an IT Data Modeler for the Enterprise Data Platform (BigQuery, Retail Domain), reporting to the Enterprise Data Platform (EDP) Architecture Team. Job Summary: We are seeking an experienced Data Modeler to support the Enterprise Data Platform (EDP) initiative, focusing on building and optimizing curated data assets on Google BigQuery. This role requires expertise in data modeling, strong knowledge of retail data, and an ability to collaborate with data engineers, business analysts, and architects to create scalable and high-performing data structures. Key Responsibilities: 1. Data Modeling & Curated Layer Design: Design logical, conceptual, and physical data models for EDP's curated layer in BigQuery. Develop fact and dimension tables, ensuring adherence to dimensional modeling best practices (Kimball methodology). Optimize data models for performance, scalability, and query efficiency in a cloud-native environment. Work closely with data engineers to translate models into efficient BigQuery implementations (partitioning, clustering, materialized views), as sketched below. 2. Data Standardization & Governance: Define and maintain data definitions, relationships, and business rules for curated assets. Ensure data integrity, consistency, and governance across datasets. Work with Data Governance teams to align models with enterprise data standards and metadata management policies. 3. Collaboration with Business & Technical Teams: Engage with business analysts and product teams to understand data needs, ensuring models align with business requirements. Partner with data engineers and architects to implement best practices for data ingestion and transformation. Support BI & analytics teams by ensuring curated models are optimized for downstream consumption (e.g., Looker, Tableau, Power BI, AI/ML models, APIs). Required Qualifications: 7+ years of experience in data modeling and architecture on cloud data platforms (BigQuery preferred). Expertise in dimensional modeling (Kimball), data vault, and normalization/denormalization techniques. Strong SQL skills, with hands-on experience in BigQuery performance tuning (partitioning, clustering, query optimization). Understanding of retail data models (e.g., sales, inventory, pricing, supply chain, customer analytics). Experience working with data engineering teams to implement models in ETL/ELT pipelines. Familiarity with data governance, metadata management, and data cataloging. Excellent communication skills and the ability to translate business needs into structured data models. Prior experience working in retail or e-commerce data environments. Exposure to modern data architectures (data lakehouse, event-driven data processing). Familiarity with the GCP ecosystem (Dataflow, Pub/Sub, Cloud Storage) and BigQuery security best practices. Thanks & Regards, Abhinav Krishna Srivastava. Mob: +91-9667680709. Email: asrivastava@fcsltd.com
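To make the partitioning/clustering guidance concrete, a minimal sketch of DDL for a curated fact table, issued through the BigQuery Python client; the project, dataset and schema are illustrative, not from the posting.

# Create a date-partitioned, clustered fact table in BigQuery.
from google.cloud import bigquery

client = bigquery.Client(project="my-edp-project")

DDL = """
CREATE TABLE IF NOT EXISTS `my-edp-project.curated.fact_sales`
(
  sale_date   DATE    NOT NULL,
  store_id    INT64   NOT NULL,
  product_id  INT64   NOT NULL,
  customer_id INT64,
  quantity    INT64,
  net_amount  NUMERIC
)
PARTITION BY sale_date           -- prune scans to the dates queried
CLUSTER BY store_id, product_id  -- co-locate common filter columns
"""

client.query(DDL).result()
print("fact_sales is ready")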

Posted 1 month ago

Apply

7 - 10 years

8 - 14 Lacs

Ahmedabad

Work from Office


We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. Key Responsibilities : - Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration. - Data Engineering: Utilize BigQuery, DataProc, GCS, PySpark, Airflow, and Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows. - Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms. - SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently. - Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs. - Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations. Skills and Qualifications : - Proven experience as a Data Engineer or similar role. - Strong knowledge of Oracle to BigQuery data warehouse migration and modernization. - Proficiency in BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. - In-depth knowledge of Oracle DB and PL/SQL. - Excellent SQL writing skills. - Strong analytical and problem-solving abilities. - Ability to work collaboratively with cross-functional teams. - Excellent communication and interpersonal skills. Preferred Qualifications : - Experience with other data management tools and technologies. - Knowledge of cloud-based data solutions. - Certification in data engineering or related fields.

Posted 1 month ago

Apply

7 - 10 years

8 - 14 Lacs

Kolkata

Work from Office


We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. Key Responsibilities : - Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration. - Data Engineering: Utilize BigQuery, DataProc, GCS, PySpark, Airflow, and Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows. - Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms. - SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently. - Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs. - Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations. Skills and Qualifications : - Proven experience as a Data Engineer or similar role. - Strong knowledge of Oracle to BigQuery data warehouse migration and modernization. - Proficiency in BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. - In-depth knowledge of Oracle DB and PL/SQL. - Excellent SQL writing skills. - Strong analytical and problem-solving abilities. - Ability to work collaboratively with cross-functional teams. - Excellent communication and interpersonal skills. Preferred Qualifications : - Experience with other data management tools and technologies. - Knowledge of cloud-based data solutions. - Certification in data engineering or related fields.

Posted 1 month ago

Apply

7 - 10 years

8 - 14 Lacs

Jaipur

Work from Office


We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. Key Responsibilities : - Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration. - Data Engineering: Utilize BigQuery, DataProc, GCS, PySpark, Airflow, and Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows. - Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms. - SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently. - Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs. - Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations. Skills and Qualifications : - Proven experience as a Data Engineer or similar role. - Strong knowledge of Oracle to BigQuery data warehouse migration and modernization. - Proficiency in BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. - In-depth knowledge of Oracle DB and PL/SQL. - Excellent SQL writing skills. - Strong analytical and problem-solving abilities. - Ability to work collaboratively with cross-functional teams. - Excellent communication and interpersonal skills. Preferred Qualifications : - Experience with other data management tools and technologies. - Knowledge of cloud-based data solutions. - Certification in data engineering or related fields.

Posted 1 month ago

Apply

4 - 9 years

5 - 12 Lacs

Pune

Work from Office


Night Shift: 9:00 PM to 6:00 AM. Hybrid Mode: 3 days WFO & 2 days WFH. Job Overview: We are looking for a savvy Data Engineer to manage in-progress and upcoming data infrastructure projects. The candidate will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder using Python and a data wrangler who enjoys optimizing data systems and building them from the ground up. They must be self-directed and comfortable supporting the data needs of multiple teams, systems and products. Responsibilities for Data Engineer: * Create and maintain optimal data pipeline architecture; assemble large, complex data sets that meet functional and non-functional business requirements using Python and SQL / AWS / Snowflake. * Identify, design, and implement internal process improvements: automating manual processes using Python, optimizing data delivery, re-designing infrastructure for greater scalability, etc. * Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL / AWS / Snowflake technologies. * Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics. * Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs. * Keep our data separated and secure across national boundaries through multiple data centers and AWS regions. * Work with data and analytics experts to strive for greater functionality in our data systems. Qualifications for Data Engineer: * Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases. * Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement. * Strong analytic skills related to working with unstructured datasets. * Build processes supporting data transformation, data structures, metadata, dependency and workload management. * A successful history of manipulating, processing and extracting value from large disconnected datasets. Desired Skillset: * 2+ years of experience in a Python scripting and data-specific role, with a Bachelor's degree. * Experience with data processing and cleaning libraries (e.g., pandas, NumPy), web scraping/web crawling for process automation, and APIs and how they work (see the pandas sketch below). * Debugging code when it fails and finding the solution. Should have basic knowledge of SQL Server job activity monitoring and of Snowflake. * Experience with relational SQL and NoSQL databases, including PostgreSQL and Cassandra. * Experience with most or all of the following cloud services: AWS, Azure, Snowflake, Google Cloud. * Strong project management and organizational skills. * Experience supporting and working with cross-functional teams in a dynamic environment.
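A minimal pandas sketch of the kind of cleaning step described above: type coercion, deduplication and null handling before data moves downstream. The file path and column names are illustrative (reading directly from S3 assumes the s3fs package is installed).

# Clean a raw orders extract with pandas before loading it downstream.
import pandas as pd

raw = pd.read_csv("s3://my-landing-bucket/orders/2024-06-01.csv")

clean = (
    raw.drop_duplicates(subset="order_id")
    .assign(
        order_date=lambda df: pd.to_datetime(df["order_date"], errors="coerce"),
        total=lambda df: pd.to_numeric(df["total"], errors="coerce"),
    )
    .dropna(subset=["order_id", "order_date"])  # drop rows missing key fields
)

print(f"{len(raw) - len(clean)} rows dropped during cleaning")
clean.to_parquet("orders_clean.parquet", index=False)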

Posted 1 month ago

Apply

3 - 7 years

14 - 19 Lacs

Hyderabad

Work from Office


We are looking for a highly skilled and experienced Data Engineer with 3 to 7 years of experience to join our team in Bengaluru. The ideal candidate will have a strong background in building data pipelines using Google Cloud Platform (GCP) and hands-on experience with BigQuery, the GCP SDK, and API scripting. Roles and Responsibilities: Design, develop, and implement data pipelines using BigQuery and the GCP SDK. Build and orchestrate Data Fusion pipelines for data migration from various databases. Develop scripts in Python and write technical architecture documentation. Implement monitoring architecture and test GCP services. Participate in Agile and DevOps practices, including CI/CD pipelines and test-driven frameworks. Collaborate with cross-functional teams to deliver high-quality solutions. Job Requirements: Bachelor's degree in Computer Science, Information Technology, or a related field. Minimum 3 years of experience in data engineering, preferably with GCP. Strong understanding of application architecture and programming languages. Strong scripting and programming skills in Python and Linux shell. Experience working with AWS and/or Azure and/or GCP, with a proven track record of building complex infrastructure programmatically with IaC tooling or vendor libraries. Experience with Agile and DevOps concepts, CI/CD pipelines, and test-driven frameworks. Experience working with CMMI / Agile / SAFe methodologies. Certification as a Google Professional Cloud Data Engineer is desirable. Proactive team player with good English communication and written skills. Experience creating technical architecture documentation. Experience in Linux OS internals, administration, and performance optimization.

Posted 1 month ago

Apply

3 - 8 years

15 - 30 Lacs

Pune, Gurugram, Bengaluru

Hybrid


Salary: 15 to 30 LPA. Experience: 3 to 8 years. Location: Gurgaon/Bangalore/Pune/Chennai. Notice: Immediate to 30 days. Key Responsibilities & Skillsets: Common Skillsets: 3+ years of experience in analytics, PySpark, Python, Spark, SQL and associated data engineering jobs. Must have experience managing and transforming big data sets using PySpark, Spark-Scala, NumPy and pandas. Excellent communication and presentation skills. Experience in managing Python code bases and collaborating with customers on model evolution. Good knowledge of database management and Hadoop/Spark, SQL, Hive, Python (expertise). Superior analytical and problem-solving skills. Should be able to work on a problem independently and prepare client-ready deliverables with minimal or no supervision. Good communication skills for client interaction. Data Management Skillsets: Ability to understand data models and identify ETL optimization opportunities. Exposure to ETL tools is preferred. Should have a strong grasp of advanced SQL functionality (joins, nested queries, and procedures). Strong ability to translate functional specifications/requirements into technical requirements.

Posted 1 month ago

Apply

11 - 12 years

25 - 30 Lacs

Hyderabad

Work from Office


Job Description: Lead Data Engineer
Position: Lead Data Engineer
Location: Hyderabad (Work from Office mandatory)
Experience: 10+ years overall | 8+ years relevant in Data Engineering
Notice Period: Immediate to 30 days

About the Role
We are looking for a strategic and hands-on Lead Data Engineer to architect and lead cutting-edge data platforms that empower business intelligence, analytics, and AI initiatives. This role demands a deep understanding of cloud-based big data ecosystems, excellent leadership skills, and a strong inclination toward driving data quality and governance at scale. You will define the data engineering roadmap, architect scalable data systems, and lead a team responsible for building and optimizing pipelines across structured and unstructured datasets in a secure and compliant environment.

Key Responsibilities
1. Technical Strategy & Architecture: Define the vision and technical roadmap for enterprise-grade data platforms (lakehouse, warehouse, real-time pipelines). Lead evaluation of data platforms and tools, making informed build-vs-buy decisions. Design solutions for long-term scalability, cost-efficiency, and performance.
2. Team Leadership: Mentor and lead a high-performing data engineering team. Conduct performance reviews and technical coaching, and participate in hiring/onboarding. Instill engineering best practices and a culture of continuous improvement.
3. Platform & Pipeline Engineering: Build and maintain data lakes, warehouses, and lakehouses using AWS, Azure, GCP, or Databricks. Architect and optimize data models and schemas tailored for analytics/reporting. Manage large-scale ETL/ELT pipelines for batch and streaming use cases.
4. Data Quality, Governance & Security: Enforce data quality controls: automated validation, lineage, anomaly detection (see the sketch below). Ensure compliance with data privacy and governance frameworks (GDPR, HIPAA, etc.). Manage metadata and documentation for transparency and discoverability.
5. Cross-Functional Collaboration: Partner with Data Scientists, Product Managers, and business teams to understand requirements. Translate business needs into scalable data workflows and delivery mechanisms. Support self-service analytics and democratization of data access.
6. Monitoring, Optimization & Troubleshooting: Implement monitoring frameworks to ensure data reliability and latency SLAs. Proactively resolve bottlenecks and failures, and optimize system performance. Recommend platform upgrades and automation strategies.
7. Technical Leadership & Community Building: Lead code reviews, define development standards, and share reusable components. Promote innovation, experimentation, and cross-team knowledge sharing. Encourage open-source contributions and thought leadership.

Required Skills & Experience
10+ years of experience in data engineering or related domains. Expert in PySpark, Python, and SQL. Deep expertise in Apache Spark and other distributed processing frameworks. Hands-on experience with cloud platforms (AWS, Azure, or GCP) and services like S3, EMR, Glue, Databricks, and Data Factory. Proficient in data warehouse solutions (e.g., Snowflake, Redshift, BigQuery) and RDBMS like PostgreSQL or SQL Server. Knowledge of orchestration tools (Airflow, Dagster, or cloud-native schedulers). Familiarity with CI/CD tools, Git, and Infrastructure as Code (Terraform, CloudFormation). Strong data modeling and lifecycle management understanding.
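A minimal sketch of the anomaly-detection idea referenced above: compare yesterday's row count for a table against its trailing seven-day average and fail loudly on a large drop. The project, table and threshold are hypothetical.

# Volume-anomaly check for a daily-loaded BigQuery table.
from google.cloud import bigquery

client = bigquery.Client(project="my-platform")

SQL = """
WITH daily AS (
  SELECT DATE(event_ts) AS d, COUNT(*) AS n
  FROM `my-platform.events.raw_events`
  WHERE DATE(event_ts) BETWEEN DATE_SUB(CURRENT_DATE(), INTERVAL 8 DAY)
                           AND DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
  GROUP BY d
)
SELECT
  (SELECT n FROM daily
   WHERE d = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)) AS yesterday,
  AVG(n) AS trailing_avg
FROM daily
WHERE d < DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
"""

row = next(iter(client.query(SQL).result()))
if row.yesterday is None or row.yesterday < 0.5 * row.trailing_avg:
    raise RuntimeError(f"Volume anomaly: {row.yesterday} rows vs trailing avg {row.trailing_avg:.0f}")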

Posted 1 month ago

Apply

8 - 12 years

25 - 40 Lacs

Hyderabad

Remote


Senior GCP Cloud Administrator. Experience: 8-12 years. Expected Salary: Competitive. Preferred Notice Period: Within 30 days. Shift: 10:00 AM to 7:00 PM IST. Opportunity Type: Remote. Placement Type: Permanent. (Note: This is a requirement for one of Uplers' clients.) Must-have skills: GCP, Identity and Access Management (IAM), BigQuery, SRE, GKE, GCP certification. Good-to-have skills: Terraform, Cloud Composer, Dataproc, Dataflow, AWS. Forbes Advisor (one of Uplers' clients) is looking for a Senior GCP Cloud Administrator who is passionate about their work, eager to learn and grow, and committed to delivering exceptional results. If you are a team player with a positive attitude and a desire to make a difference, then we want to hear from you. Role Overview: Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most. We are looking for an experienced GCP Administrator to join our team. The ideal candidate will have strong hands-on experience with IAM administration, multi-account management, BigQuery administration, performance optimization, monitoring, and cost management within Google Cloud Platform (GCP). Responsibilities: Manage and configure roles/permissions in GCP IAM, following the principle of least-privileged access (see the sketch below). Manage the BigQuery service by optimizing slot assignments and SQL queries, adopting FinOps practices for cost control, troubleshooting and resolving critical data queries, etc. Collaborate with teams like Data Engineering, Data Warehousing, Cloud Platform Engineering and SRE for efficient data management and operational practices in GCP. Create automations and monitoring mechanisms for GCP data-related services, processes and tasks. Work with development teams to design the GCP-specific cloud architecture. Provision and de-provision GCP accounts and resources for internal projects. Manage and operate multiple GCP subscriptions. Keep technical documentation up to date. Proactively stay up to date on GCP announcements, services and developments. Requirements: Must have 5+ years of work experience provisioning, operating, and maintaining systems in GCP. Must hold a valid GCP Associate Cloud Engineer or GCP Professional Cloud Architect certification. Must have hands-on experience with GCP services such as Identity and Access Management (IAM), BigQuery, and Google Kubernetes Engine (GKE). Must be capable of providing support and guidance on GCP operations and services, depending on enterprise needs. Must have working knowledge of Docker containers and Kubernetes. Must have strong communication skills and the ability to work both independently and in a collaborative environment. Fast learner and achiever who sets high personal goals. Must be able to work on multiple projects and consistently meet project deadlines. Must be willing to work on a shift basis, depending on project requirements. Good to have: Experience in Terraform automation over GCP infrastructure provisioning. Experience in Cloud Composer, Dataproc, Dataflow, Storage and Monitoring services. Experience in building and supporting any form of data pipeline. Multi-cloud experience with AWS. New Relic monitoring. Perks: Day off on the 3rd Friday of every month (one long weekend each month). Monthly Wellness Reimbursement Program to promote health and well-being. Paid paternity and maternity leave. How to apply for this opportunity (an easy 3-step process): 1. Click on Apply and register or log in on our portal. 2. Upload an updated resume and complete the screening form. 3. Increase your chances of being shortlisted and meet the client for the interview! About our client: Forbes Advisor is a global platform dedicated to helping consumers make the best financial choices for their individual lives. About Uplers: Our goal is to make hiring and getting hired reliable, simple, and fast. Our role is to help all our talents find and apply for relevant product and engineering job opportunities and progress in their careers. (Note: There are many more opportunities on the portal besides this one.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
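On the least-privilege point, a minimal sketch of granting a group read-only access to a single BigQuery dataset (rather than a project-wide role), using the google-cloud-bigquery Python client; the project, dataset and group email are placeholders.

# Grant dataset-scoped READER access instead of a project-wide role.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")
dataset = client.get_dataset("my-gcp-project.finance_reporting")

entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(
        role="READER",
        entity_type="groupByEmail",
        entity_id="finance-analysts@example.com",
    )
)
dataset.access_entries = entries
client.update_dataset(dataset, ["access_entries"])  # persist the new ACL
print("Granted dataset-level READER to finance-analysts@example.com")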

Posted 1 month ago

Apply

3 - 5 years

10 - 12 Lacs

Bengaluru

Work from Office


Overview. About us: We are an integral part of Annalect Global and Omnicom Group, one of the largest media and advertising agency holding companies in the world. Omnicom's branded networks and numerous specialty firms provide advertising, strategic media planning and buying, digital and interactive marketing, direct and promotional marketing, public relations, and other specialty communications services. Our agency brands are consistently recognized as being among the world's creative best. Annalect India plays a key role for our group companies and global agencies by providing stellar products and services in the areas of Creative Services, Technology, Marketing Science (data & analytics), Market Research, Business Support Services, Media Services, and Consulting & Advisory Services. We currently have 2500+ awesome colleagues (in Annalect India) who are committed to solving our clients' pressing business issues. We are growing rapidly and looking for talented professionals like you to be part of this journey. Let us build this, together. Responsibilities: This is an exciting role and would entail you to: Partner with internal and external clients in their desire to create best-in-class data & analytics to support their business decisions. Be a passionate champion of data-driven marketing and create a data- and insight-led culture across teams. Gather requirements and evaluate clients' business situations in order to implement appropriate analytic solutions. Manage data and reporting using different tools and techniques like Alteryx. Strong knowledge of media metrics, custom calculations, and metric correlation. Good to have (not mandatory): data visualization using Excel. Ability to identify and determine key performance indicators for clients. QA process: maintain, create, and review QA plans for deliverables to align with requirements, identify discrepancies if any, and troubleshoot issues. Responsible for maintaining reporting requirements as per the delivery cadence defined by the client. Create and maintain project-specific documents such as process, quality, and learning documents. Able to work successfully with teams, handling multiple projects and meeting client expectations. Qualifications: You will be working closely with our global marketing agency teams. You will also collaborate closely with the Manager and colleagues within the Performance Reporting function. This may be the right role for you if you have: A Bachelor's degree (required). 4-6 years' experience in data management and analysis in media or a relevant domain, with strong problem-solving ability. Good analytical ability and logical reasoning. Strong working knowledge of MS Excel and Advanced Excel. Strong working knowledge and hands-on experience in data visualization and report generation using Power BI (mandatory). Proficiency in PPT and SharePoint. Experience in data processing tools like SQL, Python, Alteryx, etc. would be beneficial. Knowledge of media/advertising is beneficial but not mandatory. Strong written and verbal communication. Familiarity working with large data sets and creating cohesive stories. Understanding of the media domain and channels like Display, Search, Social, and Competitive. Experience creating tables in databases like AWS or Google BigQuery. Knowledge of a scripting language like Python or SQL is preferred. Familiarity with data platforms like DoubleClick Campaign Manager, DV360, SA360, MOAT, IAS, Facebook Business Manager, Twitter, Innovid, Sizmek, Kenshoo, Nielsen, Kantar, MediaMath, Prisma, and AppNexus.

Posted 1 month ago

Apply

1 - 4 years

8 - 15 Lacs

Bengaluru

Work from Office


Job Title: Data Engineer (1-4 Years Experience). Location: Bangalore. Company: Lenskart. About the Role: We are looking for a hands-on Data Engineer to help us scale our data infrastructure and platforms. In this role, you'll work closely with engineering, analytics, and product teams to build reliable data pipelines and deliver high-quality datasets for analytics and reporting. If you're passionate about cloud data engineering, writing efficient code in Python, and working with technologies like BigQuery and GCP, this is the perfect role for you. Key Responsibilities: 1. Build and maintain scalable ETL/ELT data pipelines using Python and cloud-native tools. 2. Design and optimize data models and queries on Google BigQuery for analytical workloads. 3. Develop, schedule, and monitor workflows using orchestration tools like Apache Airflow or Cloud Composer. 4. Ingest and integrate data from multiple structured and semi-structured sources, including MySQL, MongoDB, APIs, and cloud storage (see the sketch below). 5. Ensure data integrity, security, and quality through validation, logging, and monitoring systems. 6. Collaborate with analysts and data consumers to understand requirements and deliver clean, usable datasets. 7. Implement data governance, lineage tracking, and documentation as part of platform hygiene. Must-Have Skills: 1. 1-4 years of experience in data engineering or backend development. 2. Strong experience with Google BigQuery and GCP (Google Cloud Platform). 3. Proficiency in Python for scripting, automation, and data manipulation. 4. Solid understanding of SQL and experience with relational databases like MySQL. 5. Experience working with MongoDB and semi-structured data (e.g., JSON, nested formats). 6. Exposure to data warehousing, data modeling, and performance tuning. 7. Familiarity with Git-based version control and CI/CD practices.
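A minimal sketch of the semi-structured ingestion mentioned above: pulling documents from MongoDB and loading them into BigQuery as a JSON load job. The connection string, database, collection, project and table names are placeholders.

# MongoDB documents -> BigQuery via a JSON load job.
from google.cloud import bigquery
from pymongo import MongoClient

mongo = MongoClient("mongodb://mongo-host:27017")
docs = [
    {"order_id": str(d["_id"]), "status": d.get("status"), "items": d.get("items", [])}
    for d in mongo.shop.orders.find().limit(1000)
]

bq = bigquery.Client(project="my-retail-project")
job = bq.load_table_from_json(
    docs,
    "my-retail-project.raw.orders",
    job_config=bigquery.LoadJobConfig(
        autodetect=True,  # infer nested/repeated fields from the JSON
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    ),
)
job.result()
print(f"Loaded {len(docs)} documents")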

Posted 1 month ago

Apply


3 - 8 years

5 - 9 Lacs

Bengaluru

Work from Office


Project Role: Application Developer. Project Role Description: Design, build and configure applications to meet business process and application requirements. Must-have skills: Google BigQuery. Good-to-have skills: React.js, Cloud Network Operations. A minimum of 3 years of experience is required. Educational Qualification: 15 years of full-time education. Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality. Roles & Responsibilities: Expected to perform independently and become an SME. Active participation and contribution in team discussions is required. Contribute to providing solutions to work-related problems. Develop and implement scalable applications using Google BigQuery. Collaborate with cross-functional teams to ensure application functionality. Conduct code reviews and provide technical guidance to junior developers. Stay updated on industry trends and best practices in application development. Troubleshoot and resolve application issues in a timely manner. Professional & Technical Skills (project-specific): BQ, BQ Geospatial, Python, Dataflow, Composer; secondary skill: geospatial domain knowledge. Must-have skills: Proficiency in Google BigQuery. Strong understanding of statistical analysis and machine learning algorithms. Experience with data visualization tools such as Tableau or Power BI. Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms. Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity. Additional Information: The candidate should have a minimum of 3 years of experience in Google BigQuery. This position is based at our Bengaluru office. 15 years of full-time education is required.
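As a small illustration of the BQ Geospatial skill named above, a query that finds stores within five kilometres of a point using BigQuery's geography functions; the project, table, columns and coordinates are hypothetical.

# Find stores within 5 km of a point with BigQuery geography functions.
from google.cloud import bigquery

client = bigquery.Client(project="my-geo-project")

GEO_SQL = """
SELECT store_id, store_name
FROM `my-geo-project.retail.stores`
WHERE ST_DWITHIN(
  ST_GEOGPOINT(longitude, latitude),
  ST_GEOGPOINT(77.5946, 12.9716),  -- a reference point (lon, lat)
  5000                             -- distance in metres
)
"""

for row in client.query(GEO_SQL).result():
    print(row.store_id, row.store_name)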

Posted 1 month ago

Apply

3 - 6 years

12 - 14 Lacs

Hyderabad

Work from Office


Overview: Lead, Biddable (Reporting). This exciting role requires you to creatively manage biddable media campaigns for our global brands. Your expertise in DSPs and knowledge of the Digital Market Cycle make you a great fit for this position. This is a great opportunity to work closely with top global brands and own large and reputed accounts. About us: We are an integral part of Annalect Global and Omnicom Group, one of the largest media and advertising agency holding companies in the world. Omnicom's branded networks and numerous specialty firms provide advertising, strategic media planning and buying, digital and interactive marketing, direct and promotional marketing, public relations, and other specialty communications services. Our agency brands are consistently recognized as being among the world's creative best. Annalect India plays a key role for our group companies and global agencies by providing stellar products and services in the areas of Creative Services, Technology, Marketing Science (data & analytics), Market Research, Business Support Services, Media Services, and Consulting & Advisory Services. We are growing rapidly and looking for talented professionals like you to be part of this journey. Let us build this, together. Responsibilities: Work with clients and stakeholders on gathering requirements around reporting. Design solutions and mock-ups of reports based on requirements that define every detail. Develop reporting based on marketing data in Excel and Power BI. Collaborate with other members of the reporting design team and the data & automation team to build and manage complex data lakes that support reporting. Extract reports from platforms like DV360, LinkedIn, TikTok, Reddit, Snapchat, Meta, and Twitter. Organize the reports into easily readable tables. Offer comprehensive insights based on the reporting data. Assess data against previous benchmarks and provide judgments/recommendations. Communicate directly with agencies for projects. Manage communication for end clients. Preferred: Experience in digital marketing such as paid search, paid social, or programmatic display is extremely helpful. Qualifications: A full-time graduate degree (mandatory). A proven history of 6+ years as a marketing reporting analyst, or experience in a similar role. A solid understanding of paid digital marketing functions is essential to this job. Strong experience working with data in Excel (VLOOKUPs, SUMIFS, and advanced functions are a must). Experience working with web-based reporting platforms such as Looker is preferred. Strong communication skills, with a strong preference for candidates who have collaborated with teams in the United States or United Kingdom, including gathering requirements or collaborating on solution design.

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies