
520 BigQuery Jobs - Page 16

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

6.0 - 10.0 years

6 - 11 Lacs

Bengaluru

Work from Office


Role: Senior Data Analyst. Experience: 6 to 10 years. Location: Bangalore, Pune, Hyderabad, Gurgaon, Noida. Notice: Immediate joiners only.

About the Role: Data Analyst with EDA (Exploratory Data Analysis), communication, strong hands-on SQL, documentation experience, GCP experience, and data pipeline experience.

Requirements:
- 8+ years of experience in data mining, working with large relational databases using advanced data extraction and manipulation tools (for example, BigQuery, Teradata) on both structured and unstructured data.
- Excellent communication skills, both written and verbal; able to explain solutions and problems in a clear and concise manner.
- Experience conducting business analysis to capture requirements from non-technical partners.
- Superb analytical and conceptual thinking skills; able not only to manipulate data but also to derive relevant interpretations from it.
- Proven knowledge of the data management lifecycle, including experience with data quality and metadata management.
- Hands-on background in Computer Science, Statistics, Mathematics, or Information Systems.
- Experience in cloud, specifically GCP BigQuery, including but not limited to complex SQL querying.
- 1-2 years of experience/exposure in the following:
1. CI/CD release processes using GitLab, Jira, and Confluence.
2. Creating YAML files and working with unstructured data such as JSON.
3. Experience with Looker Studio and Dataplex is a plus.
- Hands-on engineering experience is an asset.
- Exposure to Python or Java is nice to have.
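To illustrate the kind of complex BigQuery SQL work this role calls for, here is a minimal sketch using the google-cloud-bigquery Python client and a window function; the project, dataset, table, and column names are hypothetical, and application-default credentials are assumed to be configured.

```python
from google.cloud import bigquery

client = bigquery.Client()  # picks up the default GCP project and credentials

# Rank each customer's orders by recency with a window function.
query = """
    SELECT
      customer_id,
      order_id,
      order_ts,
      ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY order_ts DESC) AS recency_rank
    FROM `my_project.sales.orders`  -- hypothetical table
    WHERE order_ts >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 90 DAY)
"""

for row in client.query(query).result():
    print(row.customer_id, row.order_id, row.recency_rank)
```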

Posted 3 weeks ago

Apply

3.0 - 5.0 years

3 - 7 Lacs

Mumbai

Work from Office


Job Summary: We are seeking a highly analytical and detail-oriented Data Specialist with deep expertise in SQL, Python, statistics, and automation. The ideal candidate will design robust data pipelines, analyze large datasets, drive insights through statistical methods, and automate workflows to enhance data accessibility and business decision-making.

Key Responsibilities:
- Write and optimize complex SQL queries for data extraction, transformation, and reporting.
- Develop and maintain Python scripts for data analysis, ETL processes, and automation tasks.
- Conduct statistical analysis to identify trends, anomalies, and actionable insights.
- Build and manage automated dashboards and data pipelines using tools such as Airflow, Pandas, or Apache Spark.
- Collaborate with cross-functional teams (product, engineering, business) to understand data needs and deliver scalable solutions.
- Implement data quality checks and validation procedures to ensure accuracy and consistency.
- Support machine learning model deployment and performance tracking (if applicable).
- Document data flows, models, and processes for internal knowledge sharing.

Key Requirements:
- Strong proficiency in SQL (joins, CTEs, window functions, performance tuning).
- Solid experience with Python (data manipulation using Pandas and NumPy, scripting, and automation).
- Applied knowledge of statistics (hypothesis testing, regression, probability, distributions).
- Experience with data automation tools (Airflow, dbt, or equivalent).
- Familiarity with data visualization tools (Tableau, Power BI, or Plotly) is a plus.
- Understanding of data warehousing concepts (e.g., Snowflake, BigQuery, Redshift).
- Strong problem-solving skills and the ability to work independently.

Preferred Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, Statistics, or a related field.
- Exposure to cloud platforms like AWS, GCP, or Azure.
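As a flavour of the statistics-plus-Python work described above, here is a small, self-contained sketch of a two-sample hypothesis test with pandas and SciPy; the data is synthetic and the group/column names are invented for illustration.

```python
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(42)
df = pd.DataFrame({
    "group": ["control"] * 500 + ["variant"] * 500,
    "revenue": np.concatenate([
        rng.normal(100, 20, 500),   # control group
        rng.normal(104, 20, 500),   # variant group, slightly higher mean
    ]),
})

control = df.loc[df["group"] == "control", "revenue"]
variant = df.loc[df["group"] == "variant", "revenue"]

# Welch's t-test: does not assume equal variances between groups.
t_stat, p_value = stats.ttest_ind(variant, control, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```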

Posted 3 weeks ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Kolkata, Mumbai, New Delhi

Work from Office


Category: Technology. Shuru is a technology-consulting company that embeds senior product and engineering teams into fast-growing companies worldwide to accelerate growth and de-risk strategy. Our work is global, high-stakes, and unapologetically business-first.

Role Overview: You'll join a lean, senior-only business intelligence team as a Senior Data Analyst, sitting shoulder-to-shoulder with our clients and operating as their in-house analytics brain trust. Your mandate: design the data questions worth asking, own the pipelines that answer them, and convert findings into clear, bottom-line actions. If you need daily direction, this isn't for you. If you see a vague brief as oxygen, read on.

Key Responsibilities:
- Frame the right questions: translate ambiguous product or commercial goals into testable hypotheses, selecting the metrics that truly explain user behaviour and unit economics.
- Own data end-to-end: model, query, and transform data in SQL and dbt, pushing to cloud warehouses such as Snowflake/BigQuery, with zero babysitting.
- Build self-service BI: deliver dashboards in Metabase/Looker that non-technical stakeholders can tweak without coming back to you every week.
- Tell unforgettable stories: turn complex analyses into visuals and narratives that drive decisions in the C-suite and on the sprint board.
- Guard the data moat: champion data governance, privacy, and quality controls that scale across multiple client engagements.
- Mentor & multiply: level up engineers and product managers on analytical thinking, setting coding and insight standards for future analysts.

Requirements (Must-Have Skills & Experience):
- Minimum experience of 3 years.
- Core analytics: expert SQL; comfort with Python or R for advanced analysis; solid grasp of statistical inference and experimentation.
- Modern data stack: hands-on with dbt, Snowflake/BigQuery/Redshift, and at least one orchestration tool (Airflow, Dagster, or similar).
- BI & visualisation: proven delivery in Metabase, Looker, or Tableau (including performance tuning for big data models).
- Product & growth metrics: demonstrated ability to define retention, activation, and LTV/payback KPIs for SaaS or consumer-tech products (see the sketch after this listing).
- Communication: relentless clarity; you can defend an insight to both engineers and the CFO, and change course when the data disproves you.
- Independence: a history of thriving with "figure it out" briefs and distributed teams across time zones.

Bonus Points:
- Feature-flag experimentation at scale (e.g., Optimizely, LaunchDarkly).
- Familiarity with privacy-enhancing tech (differential privacy, data clean rooms).

Benefits:
- Work on international projects: execute with founders and execs from around the globe, stacking your playbook fast.
- Regular team outings: we fund quarterly off-sites and virtual socials to keep the remote vibe human.
- Collaborative & growth-oriented: learn directly from CXOs, leads, and seasoned PMs; no silos, no artificial ceilings.
- Competitive salary & benefits: benchmarked around the 90th percentile for similar-stage firms, plus performance upside.
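The retention metric referenced above can be computed along these lines. This is a hedged pandas sketch of a monthly retention cohort, not Shuru's actual tooling; the events table and its columns are hypothetical.

```python
import pandas as pd

# Tiny hypothetical activity log: one row per user event.
events = pd.DataFrame({
    "user_id": [1, 1, 2, 2, 3],
    "event_ts": pd.to_datetime([
        "2025-01-05", "2025-02-10", "2025-01-20", "2025-03-02", "2025-02-14",
    ]),
})

# Cohort = month of each user's first event; period = month of the event itself.
events["cohort"] = events.groupby("user_id")["event_ts"].transform("min").dt.to_period("M")
events["period"] = events["event_ts"].dt.to_period("M")
events["months_since"] = (events["period"] - events["cohort"]).apply(lambda d: d.n)

# Distinct users active per cohort per month, then normalize by month 0.
cohort_counts = (
    events.drop_duplicates(["user_id", "months_since"])
          .pivot_table(index="cohort", columns="months_since",
                       values="user_id", aggfunc="nunique")
)
retention = cohort_counts.div(cohort_counts[0], axis=0)  # share of cohort retained
print(retention)
```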

Posted 3 weeks ago

Apply

1.0 - 4.0 years

3 - 6 Lacs

Kolkata, Mumbai, New Delhi

Work from Office


In the minute it takes you to read this job description, Bluecore has launched over 100,000 individually personalized marketing campaigns for our retail ecommerce customers!

Job Title: Product Support Engineer (PSE). Location: Remote.

About Us: At Bluecore, we are revolutionizing the digital marketing space. As a Product Support Engineer (PSE), you will help our customers optimize their use of our platform by resolving technical issues, setting up campaigns, and ensuring they get the most value from the tools and features we offer.

Who You Are:
- Pride yourself on a job well done: you take ownership of the task at hand, ensuring you deliver accurate and effective solutions every time. Customer support is a team effort, and you embrace feedback, actively listening to customers and colleagues alike.
- Collaborative and empathetic: you put others first and commit to the right solution, not just your own. You enjoy collaborating with others, learning from them, and sharing your own knowledge in a way that benefits the team.
- Disciplined curiosity: when something's unclear, you approach it head-on, asking the right questions and seeking to expand your technical knowledge. You're passionate about learning and improving, always curious to explore new technologies and share your insights.
- Customer-focused: you understand the bigger picture of what matters to our customers and communicate clear solutions that address their needs. Every interaction is intentional and designed to build confidence toward solving their challenges.
- Proactive and adaptable: you stay ahead of issues, identifying patterns in client problems, and work with internal teams to address them swiftly. You're comfortable working in a 24x7 shift culture to ensure that client issues are addressed around the clock.

What You'll Do:
- Client support & troubleshooting: provide expert technical support for clients using Bluecore, BigQuery, Datadog, and other tools. Help them resolve issues, optimize campaigns, and maximize the platform's capabilities.
- Campaign management: assist clients in configuring, optimizing, and troubleshooting email/SMS campaigns, including segmentation, automation, and reporting.
- Problem resolution: quickly identify technical issues, solve them, and communicate the solution clearly to clients. Whether it's a data issue or a platform error, you'll ensure it's resolved efficiently.
- Collaboration & knowledge sharing: collaborate with Product, Engineering, and Technical Support teams to escalate and resolve complex issues. Share patterns, trends, and learnings with your team to help improve the overall customer experience.
- Continuous learning: develop your technical skills through hands-on experience with our tools and contribute to our internal knowledge base. Share insights and best practices with the team.

Qualifications:
- 1+ years in product support, technical support, or related roles (preferably SaaS, eCommerce, or digital marketing environments).
- Hands-on experience with tools like Bluecore, BigQuery, Datadog, Looker, or similar platforms.
- Strong technical troubleshooting skills in a customer-facing role.
- Excellent written and verbal communication skills, with the ability to simplify complex technical issues for clients.
- Customer-first attitude, ensuring every interaction is aligned with the customer's needs and provides a clear path to resolution.
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
- Open to a 24x7 shift work culture.

Why Join Us:
- Work with cutting-edge tools and technologies in the eCommerce and digital marketing space.
- Competitive salary and benefits.
- Collaborative and innovative team culture where your contributions make a direct impact.
- Growth opportunities for continued professional development and learning.

More About Us: Bluecore is a multi-channel personalization platform that gives retailers a competitive advantage in a digital-first world. Unlike systems built for mass marketing and a physical-first world, Bluecore unifies shopper and product data in a single platform and, using easy-to-deploy predictive models, activates welcomed one-to-one experiences at the speed and scale of digital. Through Bluecore's dynamic shopper and product matching, brands can personalize 100% of communications delivered to consumers through their shopping experiences, anywhere.

This comes to life in three core product lines:
- Bluecore Communicate: a modern email service provider (ESP) + SMS.
- Bluecore Site: an onsite capture and personalization product.
- Bluecore Advertise: a paid media product.

Bluecore is credited with increasing lifetime value of shoppers and overall speed to marketing for more than 400 brands, including Express, Tommy Hilfiger, The North Face, Teleflora, and Bass Pro Shops. We have been recognized as one of the Best Places to Work by Fortune, Crain's, Forbes, and BuiltIn, as well as ranked on the Inc. 5000, the most prestigious ranking of the nation's fastest-growing private companies.

We are proud of the culture of flexibility, inclusivity, and trust that we have built around our workforce. We are a remote-first organization with the option to potentially work in our New York headquarters on occasion moving forward. We love the opportunity to come together, but employees will always have the option of working where they work best.

At Bluecore we believe in encouraging an inclusive environment in which employees feel encouraged to share their unique perspectives, demonstrate their strengths, and act authentically. We know that diverse teams are strong teams, and welcome those from all backgrounds and varying experiences. Bluecore is a proud equal opportunity employer. We are committed to fair hiring practices and to building a welcoming environment for all team members. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, disability, age, familial status, or veteran status. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

9 - 13 Lacs

Bengaluru

Work from Office


About The Role
Job Title: DevOps Engineer, AS. Location: Bangalore, India.

Role Description: Deutsche Bank has set itself ambitious goals in the areas of Sustainable Finance, ESG Risk Mitigation, and Corporate Sustainability. As climate change brings new challenges and opportunities, the Bank has set out to invest in developing a Sustainability Technology Platform, sustainability data products, and various sustainability applications that will aid the Bank's goals. As part of this initiative, we are building an exciting global team of technologists who are passionate about climate change and want to contribute to the greater good by leveraging their technology skill set in cloud/hybrid architecture. We are seeking a highly skilled and experienced DevOps Engineer to join our growing team. In this role, you will play a pivotal part in managing and optimizing cloud infrastructure, facilitating continuous integration and delivery, and ensuring system reliability.

What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy.
- Gender-neutral parental leaves.
- 100% reimbursement under childcare assistance benefit (gender neutral).
- Sponsorship for industry-relevant certifications and education.
- Employee Assistance Program for you and your family members.
- Comprehensive hospitalization insurance for you and your dependents.
- Accident and term life insurance.
- Complimentary health screening for ages 35 and above.

Your key responsibilities:
- Create, implement, and oversee scalable, secure, and cost-efficient cloud infrastructure on Google Cloud Platform (GCP).
- Utilize Infrastructure as Code (IaC) methodologies with tools such as Terraform, Deployment Manager, or alternatives.
- Implement robust security measures to ensure data access control and compliance with regulations; adopt security best practices, establish IAM policies, and ensure adherence to both organizational and regulatory requirements.
- Set up and manage Virtual Private Clouds (VPCs), subnets, firewalls, VPNs, and interconnects to facilitate secure cloud networking.
- Establish continuous integration and continuous deployment (CI/CD) pipelines using Jenkins, GitHub Actions, or comparable tools for automated application deployments.
- Implement monitoring and alerting solutions through Stackdriver (Cloud Operations), Prometheus, or other third-party applications.
- Evaluate and optimize cloud expenditure by utilizing committed-use discounts, autoscaling features, and resource rightsizing.
- Manage and deploy containerized applications through Google Kubernetes Engine (GKE) and Cloud Run (see the sketch after this listing).
- Deploy and manage GCP databases such as Cloud SQL and BigQuery.

Your skills and experience:
- Minimum of 5+ years of experience in DevOps or similar roles, with hands-on experience in GCP.
- In-depth knowledge of Google Cloud services (e.g., GCE, GKE, Cloud Functions, Cloud Run, Pub/Sub, BigQuery, Cloud Storage) and the ability to architect, deploy, and manage cloud-native applications.
- Proficiency with tools like Jenkins, GitLab, Terraform, Ansible, Docker, and Kubernetes.
- Experience with Infrastructure as Code (IaC) tools like Terraform, CloudFormation, or GCP-native Deployment Manager.
- Solid understanding of security protocols, IAM, networking, and compliance requirements within cloud environments.
- Strong problem-solving skills and the ability to troubleshoot cloud-based infrastructure.
- Google Cloud certifications (e.g., Associate Cloud Engineer, Professional Cloud Architect, or Professional DevOps Engineer) are a plus.

How we'll support you:
- Training and development to help you excel in your career.
- Coaching and support from experts in your team.
- A culture of continuous learning to aid progression.
- A range of flexible benefits that you can tailor to suit your needs.

About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm

We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative, and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair, and inclusive work environment.
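For a concrete flavour of the CI/CD-to-Cloud-Run workflow referenced above, here is an illustrative Python wrapper around the real `gcloud run deploy` command, as a Jenkins or GitHub Actions step might invoke it. This is a sketch rather than the bank's actual pipeline; the service name, image, and region are hypothetical.

```python
import subprocess

def deploy_cloud_run(service: str, image: str, region: str = "asia-south1") -> None:
    """Deploy a container image to Cloud Run and fail loudly on error."""
    subprocess.run(
        [
            "gcloud", "run", "deploy", service,
            "--image", image,
            "--region", region,
            "--platform", "managed",
            "--quiet",  # non-interactive, suitable for CI
        ],
        check=True,  # raise CalledProcessError if the deploy fails
    )

if __name__ == "__main__":
    # Hypothetical service and image names.
    deploy_cloud_run("sustainability-api", "gcr.io/my-project/sustainability-api:1.4.2")
```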

Posted 3 weeks ago

Apply

7.0 - 12.0 years

30 - 35 Lacs

Pune

Work from Office


About The Role
Job Title: Production Specialist, AVP. Location: Pune, India.

Role Description: Our organization within Deutsche Bank is AFC Production Services. We are responsible for providing technical L2 application support for business applications. The AFC (Anti-Financial Crime) line of business has a current portfolio of 25+ applications, and the organization is transforming itself using Google Cloud and many new technology offerings. As an Assistant Vice President, your role will include hands-on production support and active involvement in technical issue resolution across multiple applications. You will also work as an application lead, responsible for technical and operational processes for all applications you support.

Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance, and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team, bringing an innovative approach to software development and focusing on the latest technologies and practices as part of a relentless focus on business value. You will see engineering as a team activity, with a predisposition to open code, open discussion, and a supportive, collaborative environment, and you will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.

What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy.
- Gender-neutral parental leaves.
- 100% reimbursement under childcare assistance benefit (gender neutral).
- Sponsorship for industry-relevant certifications and education.
- Employee Assistance Program for you and your family members.
- Comprehensive hospitalization insurance for you and your dependents.
- Accident and term life insurance.
- Complimentary health screening for ages 35 and above.

Your key responsibilities:
- Provide technical support by handling and consulting on BAU, incidents, emails, and alerts for the respective applications.
- Perform post-mortems and root cause analysis using ITIL standards of Incident Management, Service Request Fulfillment, Change Management, Knowledge Management, and Problem Management.
- Manage the regional L2 team and vendor teams supporting the application; ensure the team is up to speed and picks up the support duties.
- Build technical subject matter expertise on the applications being supported, including business flows, application architecture, and hardware configuration.
- Define and track KPIs, SLAs, and operational metrics to measure and improve application stability and performance.
- Conduct real-time monitoring to ensure application SLAs are achieved and application availability (uptime) is maximized, using an array of monitoring tools.
- Build and maintain effective and productive relationships with stakeholders in business, development, infrastructure, and third-party systems/data providers and vendors.
- Assist in the process to approve application code releases as well as tasks assigned to support; keep key stakeholders informed using communication templates.
- Approach support with a proactive attitude and a desire to seek root cause through in-depth analysis, striving to reduce inefficiencies and manual effort.
- Mentor and guide junior team members, fostering technical upskilling and knowledge sharing.
- Provide strategic input into disaster recovery planning, failover strategies, and business continuity procedures.
- Collaborate and deliver on initiatives, and institutionalize these initiatives to drive stability in the environment.
- Perform reviews of all open production items with the development team and push for updates and resolutions to outstanding tasks and recurring issues.
- Drive service resilience by implementing SRE (site reliability engineering) principles, ensuring proactive monitoring, automation, and operational efficiency.
- Ensure regulatory and compliance adherence, managing audits, access reviews, and security controls in line with organizational policies.
- The candidate will have to work in shifts as part of a rota covering APAC and EMEA hours between 07:00 IST and 09:00 PM IST (2 shifts). In the event of major outages or issues, we may ask for flexibility to help provide appropriate cover. Weekend on-call coverage needs to be provided on a rotational/need basis.

Your skills and experience:
- 9-15 years of experience in providing hands-on IT application support.
- Experience in managing vendor teams providing 24x7 support. Team lead role experience preferred; experience in an investment bank or financial institution.
- Bachelor's degree from an accredited college or university with a concentration in Computer Science or an IT-related discipline (or equivalent work experience/diploma/certification).
- ITIL v3 Foundation certification or higher preferred.
- Knowledgeable in cloud products like Google Cloud Platform (GCP) and hybrid applications.
- Strong understanding of ITIL/SRE/DevOps best practices for supporting a production environment.
- Understanding of KPIs, SLOs, SLAs, and SLIs (see the sketch after this listing).
- Monitoring tools: knowledge of Elastic Search, Control-M, Grafana, Geneos, OpenShift, Prometheus, Google Cloud Monitoring, Airflow, and Splunk.
- Working knowledge of creating dashboards and reports for senior management.
- Red Hat Enterprise Linux (RHEL): professional skill in searching logs, process commands, starting/stopping processes, and using OS commands to aid in tasks needed to resolve or investigate issues. Shell scripting knowledge is a plus.
- Understanding of database concepts and exposure to working with Oracle, MS SQL, BigQuery, and similar databases.
- Ability to work across countries, regions, and time zones with a broad range of cultures and technical capability.

Skills that will help you excel:
- Strong written and oral communication skills, including the ability to communicate technical information to a non-technical audience, plus good analytical and problem-solving skills.
- Proven experience in leading L2 support teams, including managing vendor teams and offshore resources.
- Able to train, coach, and mentor, and know where each technique is best applied.
- Experience with GCP or another public cloud provider to build applications.
- Experience in an investment bank, financial institution, or large corporation using enterprise hardware and software.
- Knowledge of Actimize, Mantas, and case management software is good to have.
- Working knowledge of Big Data (Hadoop/Secure Data Lake) is a plus.
- Prior experience in automation projects is great to have.
- Exposure to Python, shell, Ansible, or other scripting languages for automation and process improvement.
- Strong stakeholder management skills, ensuring seamless coordination between business, development, and infrastructure teams.
- Ability to manage high-pressure issues, coordinating across teams to drive swift resolution.
- Strong negotiation skills with interface teams to drive process improvements and efficiency gains.

How we'll support you:
- Training and development to help you excel in your career.
- Coaching and support from experts in your team.
- A culture of continuous learning to aid progression.
- A range of flexible benefits that you can tailor to suit your needs.
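The SLO/SLA/SLI bullet above comes down to simple error-budget arithmetic. The following sketch shows it, with example availability targets rather than the bank's actual ones.

```python
def error_budget_minutes(slo: float, window_days: int = 30) -> float:
    """Allowed downtime for a given availability SLO over a rolling window."""
    total_minutes = window_days * 24 * 60
    return total_minutes * (1 - slo)

# Example targets only: a 99.9% SLO leaves ~43 minutes of downtime per month.
for slo in (0.999, 0.9995, 0.9999):
    print(f"SLO {slo:.2%}: {error_budget_minutes(slo):.1f} min/month of error budget")
```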

Posted 3 weeks ago

Apply

3.0 - 7.0 years

8 - 12 Lacs

Pune

Work from Office


About The Role
Job Title: Product and Change Specialist, VP. Location: Pune, India.

Role Description: Our organization within Deutsche Bank is AFC Production Services. We are responsible for providing technical L2 application support for business applications. The AFC (Anti-Financial Crime) line of business has a current portfolio of 25+ applications, and the organization is transforming itself using Google Cloud and many new technology offerings. As a Vice President, your role will include hands-on production support and active involvement in technical issue resolution across multiple applications. You will also work as an application lead, responsible for technical and operational processes for all applications you support.

Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance, and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team, bringing an innovative approach to software development and focusing on the latest technologies and practices as part of a relentless focus on business value. You will see engineering as a team activity, with a predisposition to open code, open discussion, and a supportive, collaborative environment, and you will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.

What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy.
- Gender-neutral parental leaves.
- 100% reimbursement under childcare assistance benefit (gender neutral).
- Sponsorship for industry-relevant certifications and education.
- Employee Assistance Program for you and your family members.
- Comprehensive hospitalization insurance for you and your dependents.
- Accident and term life insurance.
- Complimentary health screening for ages 35 and above.

Your key responsibilities:
- Lead and drive production support strategy, ensuring alignment with business objectives and SRE/RTB transformation goals.
- Provide thought leadership in implementing ITIL principles to enhance automation, monitoring, and operational efficiency.
- Manage the regional L2 team and vendor teams supporting the application; ensure the team is up to speed and picks up the support duties.
- Guide technical subject matter experts on the applications being supported, including business flows, application architecture, and hardware configuration.
- Own, define, and track KPIs, SLAs, dashboards, and operational metrics to measure and improve application stability and performance.
- Build and maintain effective and productive relationships with stakeholders in business, development, infrastructure, and third-party systems/data providers and vendors.
- Foster a culture of continuous learning, proactive monitoring, and incident prevention.
- Establish governance frameworks for production support operations, ensuring effective tracking and reporting of incidents, problems, and changes.
- Mentor and guide AVPs, fostering technical upskilling and knowledge sharing.
- Provide strategic input into disaster recovery planning, failover strategies, and business continuity procedures.
- Collaborate and deliver on initiatives, and institutionalize these initiatives to drive stability in the environment.
- Perform reviews of all open production items with the development team and push for updates and resolutions to outstanding tasks and recurring issues.
- Evaluate and implement emerging technologies to enhance production support capabilities.
- Ensure regulatory and compliance adherence, managing audits, access reviews, and security controls in line with organizational policies.
- Drive programs and projects for the RTB function across domains.
- Lead application onboarding for all new applications coming into the RTB remit to ensure safe and timely transition.
- Develop executive-level reporting on production health, risk, and stability metrics for senior leadership.
- The candidate will have to work in shifts as part of a rota covering APAC and EMEA hours between 07:00 IST and 09:00 PM IST (2 shifts). In the event of major outages or issues, we may ask for flexibility to help provide appropriate cover. Weekend on-call coverage needs to be provided on a rotational/need basis.

Your skills and experience:
- 13-20+ years of experience in providing hands-on IT application support.
- Experience in managing vendor teams providing 24x7 support. VP or head-of-domain role experience preferred; experience in an investment bank, financial institution, or managed-service industry.
- Bachelor's degree from an accredited college or university with a concentration in Computer Science or an IT-related discipline (or equivalent work experience/diploma/certification).
- ITIL v3 Foundation certification or higher preferred.
- Knowledgeable in cloud products like Google Cloud Platform (GCP), AWS, and hybrid applications.
- Understanding of SII and audit concepts and the ability to drive audit calls.
- Strong understanding of ITIL/DevOps best practices for supporting a production environment.
- Monitoring tools: knowledge of Control-M, Grafana, Geneos, and Google Cloud Monitoring.
- Understanding of database concepts and exposure to working with Oracle, MS SQL, BigQuery, and similar databases.
- Ability to work across countries, regions, and time zones with a broad range of cultures and technical capability.

Skills that will help you excel:
- Strong written and oral communication skills, including the ability to communicate technical information to a non-technical audience, plus good analytical and problem-solving skills.
- Proven experience in leading and managing large L2/L3 support teams across multiple geographies, including managing vendor teams and offshore resources.
- Able to train, coach, and mentor, and know where each technique is best applied.
- Experience with GCP or another public cloud provider to build applications.
- Experience in an investment bank, financial institution, or large corporation using enterprise hardware and software.
- Knowledge of Actimize, Mantas, and case management software is good to have.
- Prior experience in automation projects is great to have.
- Budget and resource planning experience, optimizing operational costs and workforce efficiency.
- Strong stakeholder management skills, ensuring seamless coordination between business, development, and infrastructure teams.
- Ability to manage high-pressure issues, coordinating across teams to drive swift resolution.
- Strong negotiation skills with interface teams to drive process improvements and efficiency gains.

How we'll support you:
- Training and development to help you excel in your career.
- Coaching and support from experts in your team.
- A culture of continuous learning to aid progression.
- A range of flexible benefits that you can tailor to suit your needs.

Posted 3 weeks ago

Apply

8.0 - 12.0 years

9 - 14 Lacs

Pune

Work from Office


Responsibilities:
- Design, develop, and maintain high-performance data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, and Composer (Airflow); a minimal Composer sketch follows this listing.
- Design and develop Looker dashboards with appropriate security provisioning and drill-down capabilities.
- Ensure data security, lineage, quality, and compliance across GCP data ecosystems through IAM, audit logging, data encryption, and schema management.
- Monitor, troubleshoot, and optimize pipeline and warehouse performance using GCP-native tools such as Cloud Monitoring, Cloud Logging, and BigQuery Optimizer.
- Write SQL queries, dbt models, or Dataflow pipelines to transform raw data into analytics-ready datasets.
- Develop and optimize SQL queries and data transformation scripts for data warehousing and reporting purposes.
- Lead proof-of-concepts (POCs) and best-practice implementations for modern data architecture, including data lakes and cloud-native data warehouses.
- Ensure data quality, governance, and security best practices across all layers of the data stack.
- Write clean, maintainable, and efficient code following best practices.

Requirements

Data Engineering:
- 8-12 years of experience in data engineering, with at least 3-5 years of hands-on experience specifically in Google Cloud Platform (GCP) and BI tools like Looker.
- BigQuery (data modeling, optimization, security); advanced SQL proficiency with complex data transformation, windowing functions, and analytical querying.
- Ability to design and develop modular, maintainable SQL models using dbt best practices.
- Basic to intermediate knowledge of Python for scripting and automation.
- Exposure to ETL and batch scheduling/orchestration solutions.
- Strong understanding of data architecture patterns: data lakes, cloud-native data warehouses, event-driven architectures.
- Experience with version control systems like Git and branching strategies.

Looker:
- Hands-on experience in Looker with design, development, configuration/setup, dashboarding, and reporting techniques.
- Experience building and maintaining LookML models, Explores, PDTs, and semantic layers.
- Understanding of security provisioning and access controls, performance tuning of dashboards/reports based on large datasets, and building drill-down capabilities.
- Proven ability to design scalable, user-friendly dashboards and self-service analytics environments.
- Expertise in optimizing Looker performance: materialized views, query tuning, aggregate tables.
- Strong command of Row-Level Security, Access Filters, and permission sets in Looker to support enterprise-grade data governance.

General:
- Experience with Agile delivery methodologies (e.g., Scrum, Kanban).
- Demonstrable track record of dealing well with ambiguity, prioritizing needs, and delivering results in a dynamic environment.
- Conduct regular workshops, demos, and stakeholder reviews to showcase data solutions and capture feedback.
- Excellent communication and collaboration skills.
- Collaborate with development teams to streamline the software delivery process and improve system reliability.
- Mentor and upskill junior engineers and analysts on GCP tools, Looker modeling best practices, and advanced visualization techniques.
- Ability to translate business objectives into data solutions with a focus on delivering measurable business value.
- Flexible to work in shifts and provide on-call support, owning the smooth operation of applications and systems in a production environment.
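As referenced in the first responsibility, here is a minimal Cloud Composer (Airflow) DAG sketch for the GCS-to-BigQuery pattern the posting describes. The bucket, dataset, and table names are hypothetical, and the stored-procedure call stands in for whatever transformation the team actually runs.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_sales_load",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Load the day's raw CSVs from GCS into a staging table.
    load = GCSToBigQueryOperator(
        task_id="load_raw",
        bucket="my-landing-bucket",                          # hypothetical
        source_objects=["sales/{{ ds }}/*.csv"],
        destination_project_dataset_table="my_project.raw.sales",
        write_disposition="WRITE_TRUNCATE",
        autodetect=True,
    )
    # Run a transformation in BigQuery (hypothetical stored procedure).
    transform = BigQueryInsertJobOperator(
        task_id="build_mart",
        configuration={
            "query": {
                "query": "CALL `my_project.marts.refresh_daily_sales`()",
                "useLegacySql": False,
            }
        },
    )
    load >> transform
```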

Posted 3 weeks ago

Apply

5.0 - 8.0 years

15 - 30 Lacs

Gurugram, Chennai

Work from Office


Key Responsibilities:
- Lead the design and development of scalable data pipelines using PySpark and ETL frameworks on Google Cloud Platform (GCP); a minimal PySpark sketch follows this listing.
- Own end-to-end data architecture and solutions, ensuring high availability, performance, and reliability.
- Collaborate with data scientists, analysts, and stakeholders to understand data needs and deliver actionable insights.
- Optimize complex SQL queries and support advanced data transformations.
- Ensure best practices in data governance, data quality, and security.
- Mentor junior engineers and contribute to team capability development.

Requirements:
- 8+ years of experience in data engineering roles.
- Strong expertise in GCP data services (BigQuery, Dataflow, Pub/Sub, Composer, etc.).
- Hands-on experience with PySpark and building ETL pipelines at scale.
- Proficiency in SQL with the ability to write and optimize complex queries.
- Solid understanding of data modeling, warehousing, and performance tuning.
- Experience with CI/CD pipelines, version control, and infrastructure-as-code is a plus.
- Excellent problem-solving and communication skills.

Preferred Qualifications:
- GCP Certification (e.g., Professional Data Engineer).
- Experience with Airflow, Kubernetes, or Terraform.
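A minimal PySpark sketch of the ETL pattern described above, assuming a Dataproc-style cluster with the Spark-BigQuery connector available; the paths, bucket, and table names are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-events-etl").getOrCreate()

# Read raw JSON events landed in GCS (hypothetical path).
events = spark.read.json("gs://my-landing-bucket/events/2025-06-01/")

# Filter bad rows and aggregate to one row per user per day.
daily = (
    events.where(F.col("event_type").isNotNull())
          .groupBy("user_id", F.to_date("event_ts").alias("event_date"))
          .agg(F.count("*").alias("event_count"))
)

# Write to BigQuery via the Spark-BigQuery connector (assumed installed).
(daily.write
      .format("bigquery")
      .option("table", "my_project.analytics.daily_events")
      .option("temporaryGcsBucket", "my-temp-bucket")
      .mode("overwrite")
      .save())
```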

Posted 3 weeks ago

Apply

12.0 - 22.0 years

25 - 40 Lacs

Bangalore Rural, Bengaluru

Work from Office


Role & responsibilities

Requirements:
- Data modeling (conceptual, logical, physical): minimum 5 years
- Database technologies (SQL Server, Oracle, PostgreSQL, NoSQL): minimum 5 years
- Cloud platforms (AWS, Azure, GCP): minimum 3 years
- ETL tools (Informatica, Talend, Apache NiFi): minimum 3 years
- Big data technologies (Hadoop, Spark, Kafka): minimum 5 years
- Data governance & compliance (GDPR, HIPAA): minimum 3 years
- Master Data Management (MDM): minimum 3 years
- Data warehousing (Snowflake, Redshift, BigQuery): minimum 3 years
- API integration & data pipelines: good to have
- Performance tuning & optimization: minimum 3 years
- Business intelligence (Power BI, Tableau): minimum 3 years

Job Description: We are seeking experienced Data Architects to design and implement enterprise data solutions, ensuring data governance, quality, and advanced analytics capabilities. The ideal candidate will have expertise in defining data policies, managing metadata, and leading data migrations from legacy systems to Microsoft Fabric or Databricks; experience and deep knowledge of at least one of these platforms is critical. Additionally, they will play a key role in identifying use cases for advanced analytics and developing machine learning models to drive business insights.

Key Responsibilities:

1. Data Governance & Management
- Establish and maintain a data usage hierarchy to ensure structured data access.
- Define data policies, standards, and governance frameworks to ensure consistency and compliance.
- Implement data quality management practices to improve accuracy, completeness, and reliability.
- Oversee metadata and Master Data Management (MDM) to enable seamless data integration across platforms.

2. Data Architecture & Migration
- Lead the migration of data systems from legacy infrastructure to Microsoft Fabric.
- Design scalable, high-performance data architectures that support business intelligence and analytics.
- Collaborate with IT and engineering teams to ensure efficient data pipeline development.

3. Advanced Analytics & Machine Learning
- Identify and define use cases for advanced analytics that align with business objectives.
- Design and develop machine learning models to drive data-driven decision-making.
- Work with data scientists to operationalize ML models and ensure real-world applicability.

Required Qualifications:
- Proven experience as a Data Architect or in a similar role in data management and analytics.
- Strong knowledge of data governance frameworks, data quality management, and metadata management.
- Hands-on experience with Microsoft Fabric and data migration from legacy systems.
- Expertise in advanced analytics, machine learning models, and AI-driven insights.
- Familiarity with data modelling, ETL processes, and cloud-based data solutions (Azure, AWS, or GCP).
- Strong communication skills with the ability to translate complex data concepts into business insights.

Preferred candidate profile: Immediate joiner.

Posted 3 weeks ago

Apply

6.0 - 10.0 years

20 - 25 Lacs

Bengaluru

Work from Office


Required Skills & Qualifications:
- Education: Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.
- Experience: 7+ years of experience in data engineering, with at least 2 years working with GCP.

Technical Skills:
- Proficiency in GCP services: BigQuery, Dataflow, Dataproc, Cloud Storage, Pub/Sub, and Cloud Functions; a minimal Pub/Sub sketch follows this listing.
- Strong programming skills in Python, SQL, and PySpark, and familiarity with Java/Scala.
- Experience with orchestration tools like Apache Airflow.
- Knowledge of ETL/ELT processes and tools.
- Experience with data modeling and designing data warehouses in BigQuery.
- Familiarity with CI/CD pipelines and version control systems like Git.
- Understanding of data governance, security, and compliance.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration abilities.
- Ability to work in a fast-paced environment and manage multiple priorities.

Preferred Qualifications:
- Certifications: GCP Professional Data Engineer or GCP Professional Cloud Architect certification.
- Domain Knowledge: experience in the finance, e-commerce, or healthcare domains is a plus.
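As referenced above, here is a minimal Pub/Sub publishing sketch with the google-cloud-pubsub client; the project ID, topic name, and message shape are hypothetical, and credentials are assumed to be configured.

```python
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
# Hypothetical project and topic.
topic_path = publisher.topic_path("my-project", "raw-events")

message = {"user_id": 42, "event_type": "page_view"}
future = publisher.publish(topic_path, json.dumps(message).encode("utf-8"))

# result() blocks until the server acknowledges and returns the message ID.
print("Published message id:", future.result())
```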

Posted 3 weeks ago

Apply

3.0 - 8.0 years

15 - 30 Lacs

Noida, Gurugram, Delhi / NCR

Hybrid


Salary: 15 to 30 LPA. Experience: 3 to 8 years. Location: Gurgaon (Hybrid). Notice: Immediate to 30 days. Key Skills: GCP, Cloud, Pub/Sub, Data Engineering.

Posted 3 weeks ago

Apply

4.0 - 8.0 years

10 - 19 Lacs

Chennai

Hybrid


Greetings from Getronics! We have permanent opportunities for GCP Data Engineers in Chennai.

Position Description: We are currently seeking a seasoned GCP Cloud Data Engineer with 3 to 5 years of experience in leading and implementing GCP data projects, preferably having implemented a complete data-centric model. This position is to design and deploy a data-centric architecture in GCP for a Materials Management platform that exchanges data with multiple modern and legacy applications in Product Development, Manufacturing, Finance, Purchasing, N-tier Supply Chain, and Supplier Collaboration.

Responsibilities:
- Design and implement data-centric solutions on Google Cloud Platform (GCP) using various GCP tools such as Storage Transfer Service, Cloud Data Fusion, Pub/Sub, Dataflow, cloud compression, Cloud Scheduler, gsutil, FTP/SFTP, Dataproc, and Bigtable.
- Build ETL pipelines to ingest data from heterogeneous sources into our system; a minimal Dataflow-style sketch follows this listing.
- Develop data processing pipelines using programming languages like Java and Python to extract, transform, and load (ETL) data.
- Create and maintain data models, ensuring efficient storage, retrieval, and analysis of large datasets.
- Deploy and manage databases, both SQL and NoSQL, such as Bigtable, Firestore, or Cloud SQL, based on project requirements and infrastructure.

Skills Required:
- GCP Data Engineer, Hadoop, Spark/PySpark, and Google Cloud Platform services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine.
- 4+ years of professional experience in data engineering, data product development, and software product launches.
- 3+ years of cloud data/software engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines using: data warehouses like Google BigQuery; workflow orchestration tools like Airflow; relational database management systems like MySQL, PostgreSQL, and SQL Server; and real-time data streaming platforms like Apache Kafka and GCP Pub/Sub.

Education Required: Any Bachelor's degree. Candidates should be willing to take a GCP assessment (1-hour online test).

LOOKING FOR IMMEDIATE TO 30 DAYS NOTICE CANDIDATES ONLY.

Regards,
Narmadha
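The ETL-ingestion bullet above could look like the following Apache Beam (Dataflow) sketch in Python; the bucket, table, schema, and parsing logic are hypothetical simplifications, not the client's actual pipeline.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_line(line: str) -> dict:
    """Parse one CSV line into a BigQuery row (simplified two-column schema)."""
    part_no, qty = line.split(",")
    return {"part_no": part_no, "qty": int(qty)}

# Hypothetical project/region/buckets; swap runner for "DirectRunner" to test locally.
options = PipelineOptions(runner="DataflowRunner", project="my-project",
                          region="asia-south1", temp_location="gs://my-temp/tmp")

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://my-landing/materials/*.csv")
        | "Parse" >> beam.Map(parse_line)
        | "Write" >> beam.io.WriteToBigQuery(
            "my-project:materials.inventory",
            schema="part_no:STRING,qty:INTEGER",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```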

Posted 3 weeks ago

Apply

5 - 10 years

15 - 20 Lacs

Hyderabad

Work from Office


DBA Role:
- Expertise in writing and optimizing queries for performance, including but not limited to Redshift, Postgres, SQL, and BigQuery, and in using query plans; a minimal query-plan sketch follows this listing.
- Expertise in database permissions, including but not limited to Redshift, BigQuery, Postgres, SQL, and Windows AD.
- Knowledge of database design; ability to work with data architects and other IT specialists to set up, maintain, and monitor data networks, storage, and metrics.
- Expertise in backup and recovery, including AWS Redshift snapshot restores.
- Redshift (provisioned and serverless) configuration and creation; Redshift Workload Management; Redshift table statistics.
- Experience working with third-party vendors, coordinating with third parties and internal stakeholders to troubleshoot issues.
- Experience working with internal stakeholders and business partners on both long- and short-term projects related to efficiency, optimization, and cost reduction.
- Expertise in database management best practices and IT security best practices.
- Experience with the following is a plus: Harness, Git, CloudWatch, Cloudability, and other monitoring dashboard configurations.
- Experience with a variety of computer information systems.
- Excellent attention to detail; problem-solving and critical thinking.
- Ability to explain complex ideas in simple terms.
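Query-plan inspection, as mentioned in the first bullet, can be scripted like this with psycopg2 against Redshift; the cluster endpoint, credentials, and table are placeholders for illustration.

```python
import psycopg2

# Placeholder connection details; Redshift speaks the Postgres wire protocol.
conn = psycopg2.connect(
    host="my-cluster.abc123.ap-south-1.redshift.amazonaws.com",
    port=5439, dbname="analytics", user="dba_user", password="***",
)

with conn, conn.cursor() as cur:
    # EXPLAIN surfaces the plan (scan types, joins, data redistribution)
    # without executing the query itself.
    cur.execute("""
        EXPLAIN
        SELECT customer_id, SUM(amount)
        FROM sales
        WHERE sale_date >= '2025-01-01'
        GROUP BY customer_id
    """)
    for (plan_line,) in cur.fetchall():
        print(plan_line)
```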

Posted 1 month ago

Apply

8 - 13 years

14 - 24 Lacs

Chennai

Hybrid


Greetings from Getronics! We have permanent opportunities for GCP Data Engineers in Chennai.

Hope you are doing well! This is Abirami from the Getronics Talent Acquisition team. We have multiple opportunities for Senior GCP Data Engineers for our automotive client in Chennai (Sholinganallur) location. Please find the company profile and job description below. If interested, please share your updated resume, a recent professional photograph, and Aadhaar proof at the earliest to abirami.rsk@getronics.com.

Company: Getronics (permanent role)
Client: Automobile industry
Experience Required: 8+ years in IT and a minimum of 4+ years in GCP data engineering
Location: Chennai (ELCOT - Sholinganallur)
Work Mode: Hybrid

Position Description: We are currently seeking a seasoned GCP Cloud Data Engineer with 4+ years of experience in leading and implementing GCP data projects, preferably having implemented a complete data-centric model. This position is to design and deploy a data-centric architecture in GCP for a Materials Management platform that exchanges data with multiple modern and legacy applications in Product Development, Manufacturing, Finance, Purchasing, N-tier Supply Chain, and Supplier Collaboration.

Responsibilities:
- Design and implement data-centric solutions on Google Cloud Platform (GCP) using various GCP tools such as Storage Transfer Service, Cloud Data Fusion, Pub/Sub, Dataflow, cloud compression, Cloud Scheduler, gsutil, FTP/SFTP, Dataproc, and Bigtable.
- Build ETL pipelines to ingest data from heterogeneous sources into our system.
- Develop data processing pipelines using programming languages like Java and Python to extract, transform, and load (ETL) data.
- Create and maintain data models, ensuring efficient storage, retrieval, and analysis of large datasets.
- Deploy and manage databases, both SQL and NoSQL, such as Bigtable, Firestore, or Cloud SQL, based on project requirements.
- Collaborate with cross-functional teams to understand data requirements and design scalable solutions that meet business needs.
- Implement security measures and data governance policies to ensure the integrity and confidentiality of data.
- Optimize data workflows for performance, reliability, and cost-effectiveness on the GCP infrastructure.

Skills Required:
- GCP Data Engineer, Hadoop, Spark/PySpark, and Google Cloud Platform services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine.
- 8+ years of professional experience in data engineering, data product development, and software product launches.
- 4+ years of cloud data engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines using: data warehouses like Google BigQuery; workflow orchestration tools like Airflow; relational database management systems like MySQL, PostgreSQL, and SQL Server; and real-time data streaming platforms like Apache Kafka and GCP Pub/Sub.

Education Required: Any Bachelor's degree.

LOOKING FOR IMMEDIATE TO 30 DAYS NOTICE CANDIDATES ONLY.

Regards,
Abirami
Getronics Recruitment Team

Posted 1 month ago

Apply

3 - 6 years

4 - 8 Lacs

Bengaluru

Work from Office


Location: IN - Bangalore. Posted today. End date to apply: May 22, 2025 (5 days left to apply). Job requisition ID: R140300.

Company Overview: A.P. Moller - Maersk is an integrated container logistics company and a member of the A.P. Moller Group, connecting and simplifying trade to help our customers grow and thrive. With a dedicated team of over 95,000 employees operating in 130 countries, we go all the way to enable global trade for a growing world. From the farm to your refrigerator, or the factory to your wardrobe, A.P. Moller - Maersk is developing solutions that meet customer needs from one end of the supply chain to the other.

About the Team: At Maersk, the Global Ocean Manifest team is at the heart of global trade compliance and automation. We build intelligent, high-scale systems that seamlessly integrate customs regulations across 100+ countries, ensuring smooth cross-border movement of cargo by ocean, rail, and other transport modes. Our mission is to digitally transform customs documentation, reducing friction, optimizing workflows, and automating compliance for a complex web of regulatory bodies, ports, and customs authorities. We deal with real-time data ingestion, document generation, regulatory rule engines, and multi-format data exchange, all while ensuring resilience and security at scale.

Key Responsibilities:
- Work with large, complex datasets and ensure efficient data processing and transformation.
- Collaborate with cross-functional teams to gather and understand data requirements.
- Ensure data quality, integrity, and security across all processes.
- Implement data validation, lineage, and governance strategies to ensure data accuracy and reliability.
- Build, optimize, and maintain ETL pipelines for structured and unstructured data, ensuring high throughput, low latency, and cost efficiency.
- Build scalable, distributed data pipelines for processing real-time and historical data.
- Contribute to the architecture and design of data systems and solutions.
- Write and optimize SQL queries for data extraction, transformation, and loading (ETL).
- Advise Product Owners to identify and manage risks, debt, issues, and opportunities for technical improvement.
- Provide continuous improvement suggestions for internal code frameworks, best practices, and guidelines.
- Contribute to engineering innovations that fuel Maersk's vision and mission.

Required Skills & Qualifications:
- 4+ years of experience in data engineering or a related field.
- Strong problem-solving and analytical skills.
- Experience with Java and the Spring framework.
- Experience building data processing pipelines using Apache Flink and Spark.
- Experience in distributed data lake environments (Dremio, Databricks, Google BigQuery, etc.).
- Experience with Apache Kafka and Kafka Streams; a minimal consumer sketch follows this listing.
- Experience working with databases, PostgreSQL preferred, with solid experience in writing and optimizing SQL queries.
- Hands-on experience in cloud environments such as Azure Cloud (preferred), AWS, Google Cloud, etc.
- Experience with data warehousing and ETL processes.
- Experience designing and integrating data APIs (REST/GraphQL) for real-time and batch processing.
- Knowledge of Great Expectations, Apache Atlas, or DataHub would be a plus.
- Knowledge of RBAC, encryption, and GDPR compliance would be a plus.

Business skills:
- Excellent communication and collaboration skills.
- Ability to translate between technical language and business language, and to communicate to different target groups.
- Ability to understand complex designs.
- Ability to balance competing forces and opinions within the development team.

Personal profile:
- Fact-based and result-oriented.
- Ability to work independently and guide the team.
- Excellent verbal and written communication.

Maersk is committed to a diverse and inclusive workplace, and we embrace different styles of thinking. Maersk is an equal opportunities employer and welcomes applicants without regard to race, colour, gender, sex, age, religion, creed, national origin, ancestry, citizenship, marital status, sexual orientation, physical or mental disability, medical condition, pregnancy or parental leave, veteran status, gender identity, genetic information, or any other characteristic protected by applicable law. We will consider qualified applicants with criminal histories in a manner consistent with all legal requirements. We are happy to support your need for any adjustments during the application and hiring process. If you need special assistance or an accommodation to use our website, apply for a position, or perform a job, please contact us by email.
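For the Kafka skills referenced above, here is a minimal kafka-python consumer sketch; the broker address, topic, group ID, and validation rule are hypothetical, not Maersk's actual setup.

```python
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "customs-manifests",                 # hypothetical topic
    bootstrap_servers="localhost:9092",  # hypothetical broker
    group_id="manifest-validators",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
)

for record in consumer:
    manifest = record.value
    # Validate a required field before any downstream processing.
    if "shipment_id" not in manifest:
        print("Rejected malformed manifest at offset", record.offset)
        continue
    print("Processing shipment", manifest["shipment_id"])
```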

Posted 1 month ago

Apply

10 - 18 years

25 - 35 Lacs

Hyderabad

Work from Office


Roles and Responsibilities:
- 10+ years of relevant work experience, including previous experience leading data-related projects in the field of reporting and analytics.
- Design, build, and maintain a scalable data lake and data warehouse in the cloud (GCP).
- Expertise in gathering business requirements, analyzing business needs, and defining the BI/DW architecture to support and help deliver technical solutions to complex business and technical requirements.
- Create solution prototypes and participate in technology selection; perform POCs and technical presentations.
- Architect, develop, and test scalable data warehouse and data pipeline architectures in cloud technologies (GCP).
- Experience in SQL and NoSQL DBMSs such as MS SQL Server, MySQL, PostgreSQL, DynamoDB, Cassandra, and MongoDB.
- Design and develop scalable ETL processes, including error handling.
- Expert in query and programming languages: MS SQL Server, T-SQL, PostgreSQL, MySQL, Python, and R.
- Prepare data structures for advanced analytics and self-service reporting using MS SQL, SSIS, and SSRS.
- Write scripts for stored procedures, database snapshot backups, and data archiving.
- Experience with any of these cloud-based technologies: Power BI/Tableau, Azure Data Factory, Azure Synapse, Azure Data Lake; AWS Redshift, Glue, Athena, AWS QuickSight; Google Cloud Platform.

Good to have:
- An Agile development environment pairing DevOps with CI/CD pipelines.
- AI/ML background.

Posted 1 month ago

Apply

2 - 5 years

5 - 9 Lacs

Hyderabad

Work from Office


Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients' most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention: of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA; as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 1 month ago

Apply

5 - 8 years

10 - 14 Lacs

Chennai

Work from Office

Naukri logo

Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients’ most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.
About The Role
Role Purpose: The purpose of this role is to provide solutions and bridge the gap between technology and business know-how to deliver any client solution.
Please find the below JD (Experience: 5-8 years):
- Good understanding of DWH
- GCP (Google Cloud Platform) BigQuery knowledge
- Knowledge of GCP Storage
- GCP Workflows and Functions
- Python
- CDC extractor tools (Qlik/NiFi)
- BI knowledge (e.g. Power BI or Looker)
2. Skill upgradation and competency building
- Clear Wipro exams and internal certifications from time to time to upgrade the skills
- Attend trainings and seminars to sharpen knowledge in the functional/technical domain
- Write papers, articles and case studies and publish them on the intranet
Deliver:
1. Contribution to customer projects - Measure: quality, SLA, ETA, no. of tickets resolved, problems solved, no. of change requests implemented, zero customer escalations, CSAT
2. Automation - Measure: process optimization, reduction in process steps, reduction in no. of tickets raised
3. Skill upgradation - Measure: no. of trainings and certifications completed, no. of papers and articles written in a quarter
Mandatory Skills: Cloud-PaaS-GCP-Google Cloud Platform. Experience: 5-8 years.
Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 1 month ago

Apply

10 - 15 years

10 - 20 Lacs

Mumbai, Gurugram, Bengaluru

Work from Office

Naukri logo

Job Title: Data Scientist and Analytics, Level 7 (Manager) - Ind & Func AI Decision Science Manager, S&C
Management Level: 07 - Manager
Location: Bangalore/Gurgaon/Hyderabad/Mumbai
Must have skills: Technical (Python, SQL, ML and AI); Functional (data science and B2B analytics, preferably in the Telco and S&P industries)
Good to have skills: GenAI, Agentic AI, cloud (AWS/Azure/GCP)
Job Summary:
About Global Network Data & AI: The Accenture Strategy & Consulting Global Network - Data & AI practice helps our clients grow their business in entirely new ways. Analytics enables our clients to achieve high performance through insights from data - insights that inform better decisions and strengthen customer relationships. From strategy to execution, Accenture works with organizations to develop analytic capabilities - from accessing and reporting on data to predictive modelling - to outperform the competition.
About Comms & Media practice: The Accenture Center for Data and Insights (CDI) team helps businesses integrate data and AI into their operations to drive innovation and business growth by designing and implementing data strategies, generating actionable insights from data, and enabling clients to make informed decisions. In CDI, we leverage AI (predictive + generative), analytics, and automation to build innovative and practical solutions, tools and capabilities. The team is also working on building and socializing a Marketplace to democratize data and AI solutions within Accenture and for clients. Globally, the CDI practice works across industries to develop value growth strategies for its clients and infuse AI and GenAI to help deliver on their top business imperatives, i.e., revenue growth and cost reduction. From multi-year Data & AI transformation projects to shorter, more agile engagements, we have a rapidly expanding portfolio of hyper-growth clients and an increasing footprint with next-gen solutions and industry practices.
Roles & Responsibilities:
- Experienced in analytics in the B2B domain and responsible for helping clients design and deliver AI/ML solutions. He/she should be strong in the Telco and S&P domains and in AI fundamentals, with good hands-on experience in: working with large data sets and presenting conclusions to key stakeholders; data management using SQL; data manipulation and aggregation using Python; propensity modelling using various ML algorithms; and text mining using NLP/AI techniques.
- Propose solutions to the client, based on gap analysis of existing Telco platforms, that can generate long-term and sustainable value for the client.
- Gather business requirements from client stakeholders via interactions such as interviews and workshops with all stakeholders.
- Track down and read all previous information on the problem or issue in question; explore obvious and known avenues thoroughly; ask a series of probing questions to get to the root of a problem.
- Understand the as-is process, identify issues that can be resolved through Data & AI or process solutions, and design the to-be state at a detailed level.
- Understand customer needs and translate them into business requirements (business requirement definition), business process flows and functional requirements, and inform the best approach to the problem.
- Adopt a clear and systematic approach to complex issues (i.e., A leads to B leads to C); analyze relationships between several parts of a problem or situation; anticipate obstacles and identify a critical path for a project.
- Independently deliver products and services that empower clients to implement effective solutions; make specific changes and improvements to processes or own work to achieve more.
- Work with other team members and make deliberate efforts to keep others up to date.
- Establish a consistent and collaborative presence with clients and act as the primary point of contact for assigned clients; escalate, track, and solve client issues.
- Partner with clients to understand end clients' business goals, marketing objectives, and competitive constraints.
- Storytelling: crunch the data and numbers to craft a story to be presented to senior client stakeholders.
Professional & Technical Skills:
- Overall 10+ years of experience in data science.
- B.Tech in Engineering from a Tier 1 school or MSc in Statistics/Data Science from a Tier 1/Tier 2 institute.
- Demonstrated experience in solving real-world data problems through Data & AI.
- Direct onsite experience (i.e., client-facing experience inside client offices in India or abroad) is mandatory; please note we are looking for client-facing roles.
- Proficiency in data mining, mathematics, and statistical analysis.
- Advanced pattern recognition and predictive modelling experience; knowledge of advanced analytical fields such as text mining, image recognition, video analytics, IoT, etc.
- Execution-level understanding of econometric/statistical modelling packages: traditional techniques like linear/logistic regression, multivariate statistical analysis, time series techniques, fixed/random effect modelling; machine learning techniques like Random Forest, Gradient Boosting, XGBoost, decision trees, clustering, etc.; knowledge of deep learning modelling techniques like RNNs and CNNs.
- Experience using digital and statistical modelling software: Python (must), R, PySpark, SQL (must), BigQuery, Vertex AI.
- Proficient in Excel, MS Word, PowerPoint, and corporate soft skills; knowledge of dashboard creation platforms such as Excel, Tableau, Power BI, etc.
- Excellent written and oral communication skills, with the ability to clearly communicate ideas and results to non-technical stakeholders.
- Strong analytical and problem-solving skills and good communication skills; self-starter with the ability to work independently across multiple projects and set priorities; strong team player; proactive and solution-oriented, able to guide junior team members.
- Execution knowledge of optimization techniques is a good-to-have: exact optimization (linear and non-linear techniques) and evolutionary optimization (both population- and search-based algorithms).
- Cloud platform certification and experience in computer vision are good-to-haves.
Qualifications
Experience: B.Tech in Engineering from a Tier 1 school or MSc in Statistics/Data Science from a Tier 1/Tier 2 institute.
Educational Qualification: B.Tech, or MSc in Statistics/Data Science.

Posted 1 month ago

Apply

7 - 11 years

6 - 10 Lacs

Mumbai

Work from Office

Naukri logo

Skill required: Procure to Pay Processing - Invoice Processing Operations
Designation: Management Level - Team Lead/Consultant
Job Location: Mumbai
Qualifications: Any Graduation
Years of Experience: 7 to 11 years
What would you do?
The incumbent should be an expert in the accounts payable lifecycle and will be responsible for the following. They must be flexible in working hours UK/US (EST hours in US shift if required), manage a team of 30-35 FTEs for the end-to-end process, and efficiently deliver the service for the end-to-end PTP process, which includes invoice processing, payments, AP helpdesk, AP account reconciliation, vendor statement reconciliation and T&E. The role is also expected to perform the smooth transition of PTP sub-processes. He/she must have independently managed the accounts payable process for an international client and worked in a BPO organization in prior assignments for at least 7-8 years out of 10-12 years.
The Procure to Pay Processing team helps clients and organizations by boosting vendor compliance, cutting savings erosion, improving discount capture using preferred suppliers, and confirming pricing and terms prior to payment. The team is responsible for accounting of goods and services through requisitioning, purchasing and receiving. They also look after the order sequence of procurement and the financial process end to end. In Invoice Processing Operations you will ensure efficient and accurate processing of expense invoices/claims in adherence with client policy and procedures. You will be working on audit claims in accordance with client policies and procedures. You will save/post invoices in the ERP, verify WHT, and resolve VAT/WHT discrepancies. You will also be required to post invoices for payment and work on the PO process, non-PO invoices, credit notes, 2-way and 3-way match, email management and ERP knowledge.
What are we looking for?
- Adaptable and flexible
- Ability to perform under pressure
- Problem-solving skills
- Detail orientation
- Ability to establish strong client relationships
- Minimum 10-12 years of AP experience in BPO, of which a minimum of 7-8 years in lead roles in different capacities
- Minimum Bachelor's degree in Finance, Accounting or a related field
- Advanced knowledge of AP concepts and applications
- Strong understanding of AP metrics and SLAs and the factors that influence them
- Systems and applications: experience of working in SAP/Oracle ERP would be an added advantage; intermediate knowledge of MS Office tools (Excel/Word/PPT) is a must, and advanced Excel knowledge would be an added advantage
- Ability to run/support automation/RPA/process improvement initiatives parallel to the core job
- Ability to interact with client finance leads and understand the business and process
- Excellent communication skills, both oral and written, as the role needs to interact with client leadership; should be able to articulate points clearly
- Good understanding of risks and issues, with the thought process to anticipate potential risks in a process and set mitigation plans/controls to eliminate or minimize them
Roles and Responsibilities
The Role: The incumbent should be an expert in the accounts payable lifecycle and will be responsible for: being flexible in working hours UK/US (EST hours in US shift if required); managing a team of 30-35 FTEs for the end-to-end process; efficiently delivering the service for the end-to-end PTP process, which includes invoice processing, payments, AP helpdesk, AP account reconciliation, vendor statement reconciliation and T&E; and performing the smooth transition of PTP sub-processes. He/she must have independently managed the accounts payable process for an international client and worked in a BPO organization in prior assignments for at least 7-8 years out of 10-12 years.
Functional Responsibilities:
- Complete understanding of the accounts payable life cycle, with in-depth knowledge of processing all categories of invoices (PO, non-PO, OTP invoices, utility invoices, statutory payments, payments, vendor master, AP helpdesk).
- Should be an expert in managing all PTP sub-processes.
- Should have experience of handling international clients in a BPM organization; must possess great interpersonal skills and have experience of speaking to client leads and running regular governance.
- Manage AP teams and processes in accordance with documented procedures and policies.
- Participate in the weekly and monthly governance calls and manage the status call.
- Lead the resolution of complex or sensitive issues from client, senior management, or vendor queries on a timely basis.
- Track the progress of knowledge transfer and transition, and proactively work to fix any deviations.
- Monitor process and operational KPIs to ensure effective delivery against targets and benchmarks.
- Manage and oversee control procedures and practices to ensure no significant SOX control deficiencies in the AP delivery sub-function.
- Drive controls and compliance in the process and ensure 100% noiseless operations.
- Identify and support AP improvement initiatives to drive operational efficiencies and improved controls.
- Manage required and appropriate reporting to facilitate informed decision making (e.g. aging, forecasted payables).
- Support regional leadership through business partnering by providing metrics, problem resolution, and reporting on process performance.
- Maintain files and documentation thoroughly and accurately, in accordance with company policy.
People Management Responsibilities:
- Supervise and manage a PTP team with multiple sub-processes, approximately 30-35 team members, ensuring communication and coordination across teams.
- Work closely with team leads and SMEs to drive the business transformation.

Posted 1 month ago

Apply

5 - 10 years

7 - 11 Lacs

Mumbai, Hyderabad

Work from Office

Naukri logo

EDS Specialist - NAV02KL
Company: Worley | Primary Location: IND-MM-Navi Mumbai | Job: Engineering Design Systems (EDS) | Schedule: Full-time | Employment Type: Employee | Job Level: Experienced | Job Posting: Apr 7, 2025 | Unposting Date: May 30, 2025 | Reporting Manager Title: Manager
We deliver the world's most complex projects. Work as part of a collaborative and inclusive team. Enjoy a varied and challenging role.
Building on our past. Ready for the future. Worley is a global professional services company of energy, chemicals and resources experts headquartered in Australia. Right now, we're bridging two worlds as we accelerate to more sustainable energy sources, while helping our customers provide the energy, chemicals and resources that society needs now. We partner with our customers to deliver projects and create value over the life of their portfolio of assets. We solve complex problems by finding integrated data-centric solutions from the first stages of consulting and engineering to installation and commissioning, to the last stages of decommissioning and remediation. Join us and help drive innovation and sustainability in our projects.
The Role
As an EDS Specialist with Worley, you will work closely with our existing team to deliver projects for our clients while continuing to develop your skills and experience.
Duties and responsibilities
- The AVEVA Engineering Senior Administrator is responsible for project set-up, maintenance and support of the system. The Senior Administrator shall ensure the set-up, configuration and deliverables are in line with organization/project/client standards.
- Gain a full understanding of the scope, overall schedule, deliverables, milestones and coordination procedure.
- Understand, document and manage the functional requirements (business scope) for an AVEVA Engineering implementation.
- Perform AVEVA Engineering support tasks.
- Perform project implementations including configurations, reports and gateway.
- Suggest how to improve AVEVA Engineering or optimize the implementation.
- Provide advanced support and troubleshooting.
- Continually seek opportunities to increase end-user satisfaction.
- Promote the use of AVEVA Engineering and the value it brings to projects within the organization.
Qualifications:
- Bachelor's degree in Engineering with at least 10 years of experience
- 5+ years of relevant experience in AVEVA Engineering
- 5+ years of relevant experience in AVEVA PDMS/E3D administration
- In-depth working knowledge of configuration and management of AVEVA Engineering, including project administration
- Fully proficient with the management of Dabacon databases
- Knowledge of engineering workflow in an EPC environment
- Strong analytical and problem-solving skills
- Ability to work in a fast-paced environment
- Effective oral and written communication skills
- Experience with setting up integration between AVEVA Engineering and other AVEVA and Hexagon design applications
- Good understanding of the engineering data flow between various engineering applications will be a plus
- Proficient in PML programming
Good to have:
- Knowledge of writing PML1/PML2, .NET and C# programs, and Visual Basic .NET
- Previous experience of AVEVA NET
- Previous experience of AVEVA CAT/SPEC, ERM
Moving forward together
We're committed to building a diverse, inclusive and respectful workplace where everyone feels they belong, can bring themselves, and are heard.
We provide equal employment opportunities to all qualified applicants and employees without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by law. We want our people to be energized and empowered to drive sustainable impact. So, our focus is on a values-inspired culture that unlocks brilliance through belonging, connection and innovation. And we're not just talking about it; we're doing it. We're reskilling our people, leveraging transferable skills, and supporting the transition of our workforce to become experts in today's low carbon energy infrastructure and technology. Whatever your ambition, there's a path for you here. And there's no barrier to your potential career success. Join us to broaden your horizons, explore diverse opportunities, and be part of delivering sustainable change.

Posted 1 month ago

Apply

3 - 5 years

9 - 13 Lacs

Bengaluru

Work from Office

Naukri logo

About The Role
Role Purpose: The purpose of this role is to design, test and maintain software programs for operating systems or applications which need to be deployed at a client end, and ensure they meet 100% quality assurance parameters.
Do
1. Be instrumental in understanding the requirements and design of the product/software:
- Develop software solutions by studying information needs, systems flow, data usage and work processes
- Investigate problem areas, following the software development life cycle
- Facilitate root cause analysis of system issues and the problem statement
- Identify ideas to improve system performance and impact availability
- Analyze client requirements and convert requirements into feasible design
- Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements
- Confer with project managers to obtain information on software capabilities
2. Perform coding and ensure optimal software/module development:
- Determine operational feasibility by evaluating analysis, problem definition, requirements, software development and proposed software
- Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases, and executing these cases
- Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces
- Analyze information to recommend and plan the installation of new systems or modifications of an existing system
- Ensure that code is error-free, has no bugs and passes testing
- Prepare reports on programming project specifications, activities and status
- Ensure all the codes are raised as per the norm defined for the project/program/account with clear descriptions and replication patterns
- Compile timely, comprehensive and accurate documentation and reports as requested
- Coordinate with the team on daily project status and progress, and document it
- Provide feedback on usability and serviceability, trace the result to quality risk and report it to concerned stakeholders
3. Status reporting and customer focus on an ongoing basis with respect to the project and its execution:
- Capture all requirements and clarifications from the client for better quality work
- Take feedback on a regular basis to ensure smooth and on-time delivery
- Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members
- Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements
- Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments and clear code
- Document necessary details and reports in a formal way for proper understanding of the software, from client proposal to implementation
- Ensure good quality of interaction with the customer w.r.t. e-mail content, fault report tracking, voice calls, business etiquette, etc.
- Respond to customer requests on time, with no instances of complaints either internally or externally
Deliver:
1. Continuous integration, deployment and monitoring of software - Measure: 100% error-free onboarding and implementation, throughput %, adherence to the schedule/release plan
2. Quality and CSAT - Measure: on-time delivery, manage software, troubleshoot queries, customer experience, completion of assigned certifications for skill upgradation
3. MIS and reporting - Measure: 100% on-time MIS and report generation
Mandatory Skills: Google BigQuery. Experience: 3-5 years.
Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 1 month ago

Apply

3 - 6 years

4 - 8 Lacs

Bengaluru

Work from Office

Naukri logo

About The Role
Data engineers are responsible for building reliable and scalable data infrastructure that enables organizations to derive meaningful insights, make data-driven decisions, and unlock the value of their data assets.
About The Role - Grade Specific
The primary focus is to help organizations design, develop, and optimize their data infrastructure and systems. They help organizations enhance data processes and leverage data effectively to drive business outcomes.
Skills (competencies): Industry Standard Data Modeling (FSLDM), Ab Initio, Industry Standard Data Modeling (IBM FSDM), Agile (Software Development Framework), Influencing, Apache Hadoop, Informatica IICS, AWS Airflow, Inmon methodology, AWS Athena, JavaScript, AWS Code Pipeline, Jenkins, AWS EFS, Kimball, AWS EMR, Linux - RedHat, AWS Redshift, Negotiation, AWS S3, Netezza, Azure ADLS Gen2, NewSQL, Azure Data Factory, Oracle Exadata, Azure Data Lake Storage, Performance Tuning, Azure Databricks, Perl, Azure Event Hub, Platform Update Management, Azure Stream Analytics, Project Management, Azure Synapse, PySpark, Bitbucket, Python, Change Management, R, Client Centricity, RDD Optimization, Collaboration, CentOS, Continuous Integration and Continuous Delivery (CI/CD), SAS, Data Architecture Patterns, Scala Spark, Data Format Analysis, Shell Script, Data Governance, Snowflake, Data Modeling, Spark, Data Validation, Spark Code Optimization, Data Vault Modeling, SQL, Database Schema Design, Stakeholder Management, Decision-Making, Sun Solaris, DevOps, Synapse, Dimensional Modeling, Talend, GCP Bigtable, Teradata, GCP BigQuery, Time Management, GCP Cloud Storage, Ubuntu, GCP Dataflow, Vendor Management, GCP Dataproc, Git, Google Bigtable, Google Dataproc, Greenplum, HQL, IBM DataStage, IBM DB2.

Posted 1 month ago

Apply

- 2 years

2 - 4 Lacs

Gurugram

Remote

Naukri logo

What does the team do?
The Ad Operations team is responsible for setting up, managing, analysing and optimising digital advertising campaigns. They ensure ads are delivered correctly, track performance, and troubleshoot issues to maximise campaign effectiveness and revenue consumption.
What You'll Do?
Data Management: Gather, organize, and maintain data related to advertising campaigns and their revenue, ensuring accuracy and consistency.
Querying the Database: Use SQL/BigQuery to run queries on ShareChat's analytical engine.
Scripting: Write scalable scripts to fetch or modify data from API endpoints. Collaborate with data teams to ensure proper integration and flow of data between different systems and platforms.
Reporting and Insights: Create reports and dashboards to visualize key performance metrics. Generate regular and ad-hoc reports that provide insights into monthly/quarterly/annual revenue, campaign performance and key metrics. Communicate findings and insights to cross-functional teams, including AdOps, sales and management, to drive data-informed decision-making.
Ad Strategy: Work with the Strategy team, using data insights, to develop go-to-market strategies for key clients. Monitor ad inventory levels and work with Strategy teams to ensure ad space is efficiently utilized. Assist in forecasting future ad inventory needs based on historical data. Identify opportunities for process improvements and automation in ad operations workflows. Contribute to the development and implementation of best practices and standard operating procedures for ad operations.
Salesforce Administration, Integration and Automation: Configure, customize and maintain the Salesforce CRM system to meet the specific needs of the advertising team. Create and manage custom objects, fields, and workflows to support advertising operations. Integrate Salesforce with other advertising and marketing tools and platforms for seamless data flow. Automate routine tasks and processes to improve efficiency and reduce manual work.
Who are you?
A BS in Mathematics, Economics, Computer Science, Information Management or Statistics is preferred. Proven working experience as a data analyst or business data analyst. Strong knowledge of and experience with SQL/BigQuery and Excel. Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy. Experience with Salesforce would be an advantage.
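For a flavour of the querying and scripting duties above, here is a minimal sketch using the google-cloud-bigquery Python client. The project, dataset, table and column names are hypothetical placeholders, not ShareChat's actual schema.

    # Minimal sketch: pull monthly ad revenue with the google-cloud-bigquery client.
    # All project/dataset/table/column names below are hypothetical placeholders.
    from google.cloud import bigquery

    client = bigquery.Client()  # credentials are picked up from the environment

    query = """
        SELECT advertiser_id,
               DATE_TRUNC(event_date, MONTH) AS month,
               SUM(revenue_usd) AS monthly_revenue
        FROM `my-project.adops.campaign_events`  -- hypothetical table
        WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 90 DAY)
        GROUP BY advertiser_id, month
        ORDER BY monthly_revenue DESC
    """

    # client.query() submits the job; .result() waits and returns the rows.
    for row in client.query(query).result():
        print(row.advertiser_id, row.month, row.monthly_revenue)

In practice a script like this would be wrapped in a scheduled job so the revenue reports and dashboards mentioned above refresh without manual work.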

Posted 1 month ago

Apply

Exploring BigQuery Jobs in India

BigQuery, Google Cloud's fully managed, serverless data warehouse, is in high demand in the Indian job market. Companies increasingly rely on BigQuery to analyze and manage large datasets, driving the need for skilled professionals in this area.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Delhi
  4. Hyderabad
  5. Pune

Average Salary Range

The average salary range for BigQuery professionals in India varies based on experience level. Entry-level positions may start at around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 15-20 lakhs per annum.

Career Path

In the field of BigQuery, a typical career progression may include roles such as Junior Developer, Developer, Senior Developer, Tech Lead, and eventually moving into managerial positions such as Data Architect or Data Engineering Manager.

Related Skills

Alongside BigQuery, professionals in this field often benefit from having skills in SQL, data modeling, data visualization tools like Tableau or Power BI, and cloud platforms like Google Cloud Platform or AWS.

Interview Questions

  • What is BigQuery and how does it differ from traditional databases? (basic)
  • How can you optimize query performance in BigQuery? (medium)
  • Explain the concepts of partitions and clustering in BigQuery. (medium; an illustrative sketch follows this list)
  • What are some best practices for designing schemas in BigQuery? (medium)
  • How does BigQuery handle data encryption at rest and in transit? (advanced)
  • Can you explain how BigQuery pricing works? (basic)
  • What are the limitations of BigQuery in terms of data size and query complexity? (medium)
  • How can you schedule and automate tasks in BigQuery? (medium)
  • Describe your experience with BigQuery ML and its applications. (advanced)
  • How does BigQuery handle nested and repeated fields in a schema? (basic)
  • Explain the concept of slots in BigQuery and how they impact query processing. (medium)
  • What are some common use cases for BigQuery in real-world scenarios? (basic)
  • How does BigQuery handle data ingestion from various sources? (medium)
  • Describe your experience with BigQuery scripting and stored procedures. (medium)
  • What are the benefits of using BigQuery over traditional on-premises data warehouses? (basic)
  • How do you troubleshoot and optimize slow-running queries in BigQuery? (medium)
  • Can you explain the concept of streaming inserts in BigQuery? (medium)
  • How does BigQuery handle data security and access control? (advanced)
  • Describe your experience with BigQuery Data Transfer Service. (medium)
  • What are the differences between BigQuery and other cloud-based data warehousing solutions? (basic)
  • How do you handle data versioning and backups in BigQuery? (medium)
  • Explain how you would design a data pipeline using BigQuery and other GCP services. (advanced)
  • What are some common challenges you have faced while working with BigQuery and how did you overcome them? (medium)
  • How do you monitor and optimize costs in BigQuery? (medium)
  • Can you walk us through a recent project where you used BigQuery to derive valuable insights from data? (advanced)
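To help with the partitioning, clustering, optimization and cost questions above, here is an illustrative sketch using the google-cloud-bigquery Python client. Every project, dataset, table and column name is a hypothetical example, not a reference implementation.

    # Illustrative sketch: partitioning, clustering and dry-run cost estimation.
    # All names below are hypothetical examples.
    from google.cloud import bigquery

    client = bigquery.Client()

    # Partition by day and cluster on a frequently filtered column: queries that
    # filter on event_date and user_id then scan (and bill for) fewer bytes.
    ddl = """
        CREATE TABLE IF NOT EXISTS `my-project.analytics.events` (
          event_date DATE,
          user_id STRING,
          event_name STRING,
          value FLOAT64
        )
        PARTITION BY event_date
        CLUSTER BY user_id
    """
    client.query(ddl).result()

    # Dry run: BigQuery validates the query and reports the bytes it WOULD
    # process, without running it or incurring query charges.
    job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
    job = client.query(
        """
        SELECT event_name, COUNT(*) AS n
        FROM `my-project.analytics.events`
        WHERE event_date = '2024-01-01' AND user_id = 'u123'
        GROUP BY event_name
        """,
        job_config=job_config,
    )
    print(f"Estimated bytes processed: {job.total_bytes_processed}")

Being able to explain why the dry-run estimate shrinks when the partition filter is present is exactly what the optimization and cost questions are probing for.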

Closing Remark

As you explore opportunities in the BigQuery job market in India, remember to continuously upskill and stay updated with the latest trends in data analytics and cloud computing. Prepare thoroughly for interviews by practicing common BigQuery concepts and showcase your hands-on experience with the platform. With dedication and perseverance, you can excel in this dynamic field and secure rewarding career opportunities. Good luck!

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
