
962 BigQuery Jobs - Page 3

JobPe aggregates listings for easy access; you apply directly on the original job portal.

4.0 - 9.0 years

13 - 17 Lacs

Bengaluru

Work from Office


Req ID: 327298. We are currently seeking a GCP Solution Architect to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Primary skill: Cloud Infrastructure - Google Cloud Platform
Minimum work experience: 4+ years
Total experience: 4+ years

Mandatory skills (technical qualification/knowledge):
- Expertise in assessing, designing, and implementing GCP solutions, covering compute, network, storage, identity, security, DR/business-continuity strategy, migration, templates, cost optimization, PowerShell, Terraform, Ansible, etc.
- Must have GCP Solution Architect certification.
- Prior experience executing large, complex cloud transformation programs, including discovery, assessment, business-case creation, design, build, migration planning, and migration execution.
- Prior experience using industry-leading or native discovery, assessment, and migration tools.
- Good knowledge of cloud technology, deployment patterns and methods, and application compatibility.
- Good knowledge of GCP technologies and associated components and variations, including the Anthos application platform.
- Working knowledge of GCE, GAE, GKE, and GCS.
- Hands-on experience creating and provisioning compute instances using the GCP console, Terraform, and the Google Cloud SDK.
- Creating databases in GCP and in VMs.
- Knowledge of data analysis tooling (BigQuery), cost analysis and cost optimization, Git and GitHub, and Terraform and Jenkins.
- Monitoring VMs and applications using Stackdriver.
- Working knowledge of VPN and Interconnect setup.
- Hands-on experience setting up HA environments, creating VM instances in Google Cloud Platform, and managing Cloud Storage and its retention policies.
- Managing users in Google IAM and granting them appropriate permissions.

GKE:
- Install tools: set up Kubernetes tooling.
- Administer a cluster.
- Configure Pods and containers: perform common configuration tasks.
- Monitoring, logging, and debugging.
- Inject data into applications: specify configuration and other data for the Pods that run your workload.
- Run applications: run and manage both stateless and stateful applications.
- Run Jobs using parallel processing.
- Access applications in a cluster.
- Extend Kubernetes: adapt your cluster to the needs of your work environment.
- Manage cluster daemons: perform common DaemonSet tasks, such as rolling updates.
- Extend kubectl by creating and installing kubectl plugins.
- Manage HugePages: configure and manage huge pages as a schedulable cluster resource.
- Schedule GPUs: configure and schedule GPUs for use as a resource by nodes in a cluster.

Certification: GCP Engineer & GKE
Academic qualification: B.Tech (or equivalent) or MCA

Process/quality knowledge:
- Must have clear knowledge of ITIL-based service delivery; ITIL certification is desired.
- Knowledge of quality and security processes.

Soft skills:
- Good communication skills and the ability to work directly with global customers.
- Timely and accurate communication.
- Demonstrated ownership of technical issues, engaging the right stakeholders for timely resolution.
- Flexibility to learn and lead other technology areas, such as other public clouds, private cloud, and automation.
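For illustration only: a minimal sketch of the "provisioning compute instances using the Google Cloud SDK" item, using the google-cloud-compute Python client. The project, zone, image, and instance names are hypothetical stand-ins, not values from this posting.

```python
# Hedged sketch: create a GCE VM with the google-cloud-compute client.
from google.cloud import compute_v1

def create_instance(project: str, zone: str, name: str) -> None:
    instance = compute_v1.Instance()
    instance.name = name
    instance.machine_type = f"zones/{zone}/machineTypes/e2-medium"

    # Boot disk from a public Debian image family.
    disk = compute_v1.AttachedDisk()
    disk.boot = True
    disk.auto_delete = True
    init = compute_v1.AttachedDiskInitializeParams()
    init.source_image = "projects/debian-cloud/global/images/family/debian-12"
    disk.initialize_params = init
    instance.disks = [disk]

    # Attach to the default VPC network.
    nic = compute_v1.NetworkInterface()
    nic.network = "global/networks/default"
    instance.network_interfaces = [nic]

    client = compute_v1.InstancesClient()
    op = client.insert(project=project, zone=zone, instance_resource=instance)
    op.result()  # block until the create operation completes

create_instance("my-project", "asia-south1-a", "demo-vm")  # hypothetical values
```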

Posted 6 days ago

Apply

5.0 - 10.0 years

7 - 11 Lacs

Chennai, Gurugram, Bengaluru

Work from Office


Req ID: 321996. We are currently seeking a GCP Cloud Run Engineer to join our team in Hyderabad, Telangana (IN-TG), India (IN).

Experience: 5+ years
Notice period: immediate to 30 days

Senior Application Developer with Google Cloud Platform experience in BigQuery, SQL, and Cloud Run, for a project involving the re-design and re-platforming of a legacy Revenue Allocation system.

Mandatory skills: GCP BigQuery, SQL, Cloud Run
Desired skills: Linux shell scripting is a huge plus; nice to have: Kafka, MQ Series, Oracle PL/SQL
Location: Bengaluru, Chennai, Gurugram, Hyderabad, Noida, Pune
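For illustration only: a short hedged sketch of the BigQuery work this role centres on, running a parameterized SQL query from Python (e.g., inside a Cloud Run service). The project, dataset, and table names are hypothetical.

```python
# Hedged sketch: parameterized BigQuery query via the official client.
from google.cloud import bigquery

client = bigquery.Client()  # picks up the project from the environment

query = """
    SELECT region, SUM(amount) AS allocated
    FROM `my_project.finance.revenue_allocations`   -- hypothetical table
    WHERE allocation_date >= @start_date
    GROUP BY region
    ORDER BY allocated DESC
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01"),
    ]
)

for row in client.query(query, job_config=job_config).result():
    print(f"{row.region}: {row.allocated}")
```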

Posted 6 days ago

Apply

5.0 - 8.0 years

3 - 7 Lacs

Chennai

Work from Office


Role Purpose
The purpose of this role is to interpret data and turn it into information (reports, dashboards, interactive visualizations, etc.) that can offer ways to improve a business, thus affecting business decisions.

Do
1. Manage the technical scope of the project in line with the requirements at all stages:
a. Gather information from various sources (data warehouses, databases, data integration and modelling) and interpret patterns and trends.
b. Develop record-management processes and policies.
c. Build and maintain relationships at all levels within the client base and understand their requirements.
d. Provide sales data, proposals, data insights, and account reviews to the client base.
e. Identify areas to increase efficiency and automation of processes.
f. Set up and maintain automated data processes.
g. Identify, evaluate, and implement external services and tools to support data validation and cleansing.
h. Produce and track key performance indicators.
2. Analyze data sets and provide adequate information:
a. Liaise with internal and external clients to fully understand data content.
b. Design and carry out surveys, and analyze survey data per customer requirements.
c. Analyze and interpret complex data sets relating to the customer's business, and prepare reports for internal and external audiences using business analytics reporting tools.
d. Create data dashboards, graphs, and visualizations to showcase business performance, and provide sector and competitor benchmarking.
e. Mine and analyze large datasets, draw valid inferences, and present them successfully to management using a reporting tool.
f. Develop predictive models and share insights with clients per their requirements.

Deliver
No. | Performance parameter | Measure
1 | Analyzes data sets and provides relevant information to the client | No. of automations done, on-time delivery, CSAT score, zero customer escalations, data accuracy

Mandatory Skills: Google BigQuery.

Posted 6 days ago

Apply

7.0 - 12.0 years

35 - 40 Lacs

Pune

Work from Office


Greetings from Peoplefy Infosolutions!

We are hiring for one of our reputed MNC clients based in Pune, and are looking for candidates with 7+ years of experience in the skills below.

Primary skills:
- Understanding of AI/ML in data engineering
- Python
- Data engineering
- Databases: BigQuery or Snowflake

Interested candidates may share their CVs at chitralekha.so@peoplefy.com with the following details: experience, CTC, expected CTC, notice period, location.

Posted 6 days ago

Apply

2.0 - 4.0 years

8 - 12 Lacs

Mumbai

Work from Office


The SAS to Databricks Migration Developer will be responsible for migrating existing SAS code, data processes, and workflows to the Databricks platform. This role requires expertise in both SAS and Databricks, with a focus on converting SAS logic into scalable PySpark and Python code. The developer will design, implement, and optimize data pipelines, ensuring seamless integration and functionality within the Databricks environment. Collaboration with various teams is essential to understand data requirements and deliver solutions that meet business needs.
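As a hedged illustration of the core migration task (not this employer's actual code), here is how a simple SAS DATA-step conditional might be re-expressed in PySpark; the paths and column names are hypothetical:

```python
# Hedged sketch: SAS DATA-step logic rewritten as a PySpark transformation.
# The SAS step
#   data out; set in; if sales > 1000 then tier = "high"; else tier = "low"; run;
# maps to a DataFrame transformation like this:
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sas-migration-sketch").getOrCreate()

df_in = spark.read.parquet("/mnt/landing/sales")  # hypothetical path
df_out = df_in.withColumn(
    "tier",
    F.when(F.col("sales") > 1000, "high").otherwise("low"),
)
df_out.write.mode("overwrite").parquet("/mnt/curated/sales_tiered")  # hypothetical path
```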

Posted 6 days ago

Apply

2.0 - 5.0 years

4 - 8 Lacs

New Delhi, Chennai, Bengaluru

Hybrid


Your day at NTT DATA
Senior GenAI Data Engineer

We are seeking an experienced Senior Data Engineer to join our team in delivering cutting-edge Generative AI (GenAI) solutions to clients. The successful candidate will be responsible for designing, developing, and deploying data pipelines and architectures that support the training, fine-tuning, and deployment of LLMs for various industries. This role requires strong technical expertise in data engineering, problem-solving skills, and the ability to work effectively with clients and internal teams.

What you'll be doing
Key responsibilities:
- Design, develop, and manage data pipelines and architectures to support GenAI model training, fine-tuning, and deployment.
- Data ingestion and integration: develop data ingestion frameworks to collect data from various sources, transform it, and integrate it into a unified data platform for GenAI model training and deployment.
- GenAI model integration: collaborate with data scientists to integrate GenAI models into production-ready applications, ensuring seamless model deployment, monitoring, and maintenance.
- Cloud infrastructure management: design, implement, and manage cloud-based data infrastructure (e.g., AWS, GCP, Azure) to support large-scale GenAI workloads, ensuring cost-effectiveness, security, and compliance.
- Write scalable, readable, and maintainable code using object-oriented programming in languages like Python, and use libraries like Hugging Face Transformers, PyTorch, or TensorFlow.
- Performance optimization: optimize data pipelines, GenAI model performance, and infrastructure for scalability, efficiency, and cost-effectiveness.
- Data security and compliance: ensure data security, privacy, and compliance with regulatory requirements (e.g., GDPR, HIPAA) across data pipelines and GenAI applications.
- Client collaboration: collaborate with clients to understand their GenAI needs, design solutions, and deliver high-quality data engineering services.
- Innovation and R&D: stay up to date with the latest GenAI trends, technologies, and innovations, applying research and development skills to improve data engineering services.
- Knowledge sharing: share knowledge, best practices, and expertise with team members, contributing to the growth and development of the team.

Requirements:
- Bachelor's degree in computer science, engineering, or related fields (Master's recommended).
- Experience with vector databases (e.g., Pinecone, Weaviate, Faiss, Annoy) for efficient similarity search and storage of dense vectors in GenAI applications.
- 5+ years of experience in data engineering, with a strong emphasis on cloud environments (AWS, GCP, Azure, or cloud-native platforms).
- Proficiency in programming languages like SQL, Python, and PySpark.
- Strong data architecture, data modeling, and data governance skills.
- Experience with big data platforms (Hadoop, Databricks, Hive, Kafka, Apache Iceberg), data warehouses (Teradata, Snowflake, BigQuery), and lakehouses (Delta Lake, Apache Hudi).
- Knowledge of DevOps practices, including Git workflows and CI/CD pipelines (Azure DevOps, Jenkins, GitHub Actions).
- Experience with GenAI frameworks and tools (e.g., TensorFlow, PyTorch, Keras).

Nice to have:
- Experience with containerization and orchestration tools like Docker and Kubernetes.
- Experience integrating vector databases and implementing similarity-search techniques; a focus on GraphRAG is a plus.
- Familiarity with API gateway and service mesh architectures.
- Experience with low-latency/streaming, batch, and micro-batch processing.
- Familiarity with Linux-based operating systems and REST APIs.

Location: Delhi or Bangalore
Workplace type: Hybrid
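For candidates new to the vector-database requirement, a minimal sketch of similarity search with Faiss (one of the stores the posting lists), using random vectors as stand-ins for real embeddings:

```python
# Hedged sketch: exact nearest-neighbour search over dense vectors with Faiss.
import numpy as np
import faiss

dim = 384                      # embedding dimensionality (assumed)
rng = np.random.default_rng(0)
corpus = rng.random((1000, dim), dtype=np.float32)  # stand-in embeddings

index = faiss.IndexFlatL2(dim)  # exact L2 index; production systems often use ANN
index.add(corpus)

query = rng.random((1, dim), dtype=np.float32)
distances, ids = index.search(query, 5)  # 5 nearest neighbours
print(ids[0], distances[0])
```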

Posted 1 week ago

Apply

1.0 - 3.0 years

3 - 5 Lacs

New Delhi, Chennai, Bengaluru

Hybrid


Your day at NTT DATA

We are seeking an experienced Data Engineer to join our team in delivering cutting-edge Generative AI (GenAI) solutions to clients. The successful candidate will be responsible for designing, developing, and deploying data pipelines and architectures that support the training, fine-tuning, and deployment of LLMs for various industries. This role requires strong technical expertise in data engineering, problem-solving skills, and the ability to work effectively with clients and internal teams.

What you'll be doing
Key responsibilities:
- Design, develop, and manage data pipelines and architectures to support GenAI model training, fine-tuning, and deployment.
- Data ingestion and integration: develop data ingestion frameworks to collect data from various sources, transform it, and integrate it into a unified data platform for GenAI model training and deployment.
- GenAI model integration: collaborate with data scientists to integrate GenAI models into production-ready applications, ensuring seamless model deployment, monitoring, and maintenance.
- Cloud infrastructure management: design, implement, and manage cloud-based data infrastructure (e.g., AWS, GCP, Azure) to support large-scale GenAI workloads, ensuring cost-effectiveness, security, and compliance.
- Write scalable, readable, and maintainable code using object-oriented programming in languages like Python, and use libraries like Hugging Face Transformers, PyTorch, or TensorFlow.
- Performance optimization: optimize data pipelines, GenAI model performance, and infrastructure for scalability, efficiency, and cost-effectiveness.
- Data security and compliance: ensure data security, privacy, and compliance with regulatory requirements (e.g., GDPR, HIPAA) across data pipelines and GenAI applications.
- Client collaboration: collaborate with clients to understand their GenAI needs, design solutions, and deliver high-quality data engineering services.
- Innovation and R&D: stay up to date with the latest GenAI trends, technologies, and innovations, applying research and development skills to improve data engineering services.
- Knowledge sharing: share knowledge, best practices, and expertise with team members, contributing to the growth and development of the team.

Requirements:
- Bachelor's degree in computer science, engineering, or related fields (Master's recommended).
- Experience with vector databases (e.g., Pinecone, Weaviate, Faiss, Annoy) for efficient similarity search and storage of dense vectors in GenAI applications.
- 5+ years of experience in data engineering, with a strong emphasis on cloud environments (AWS, GCP, Azure, or cloud-native platforms).
- Proficiency in programming languages like SQL, Python, and PySpark.
- Strong data architecture, data modeling, and data governance skills.
- Experience with big data platforms (Hadoop, Databricks, Hive, Kafka, Apache Iceberg), data warehouses (Teradata, Snowflake, BigQuery), and lakehouses (Delta Lake, Apache Hudi).
- Knowledge of DevOps practices, including Git workflows and CI/CD pipelines (Azure DevOps, Jenkins, GitHub Actions).
- Experience with GenAI frameworks and tools (e.g., TensorFlow, PyTorch, Keras).

Nice to have:
- Experience with containerization and orchestration tools like Docker and Kubernetes.
- Experience integrating vector databases and implementing similarity-search techniques; a focus on GraphRAG is a plus.
- Familiarity with API gateway and service mesh architectures.
- Experience with low-latency/streaming, batch, and micro-batch processing.
- Familiarity with Linux-based operating systems and REST APIs.

Posted 1 week ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Pune, Chennai, Bengaluru

Hybrid


Role: GCP Data Engineer
Experience: 4+ years (data engineering background preferred)
Location: Bangalore, Chennai, Pune, Gurgaon, Kolkata
Required skills: GCP data engineering experience, BigQuery, SQL, Cloud Composer/Python, Cloud Functions, Dataproc + PySpark, Python ingestion, Dataflow + Pub/Sub

Job requirements:
- Have implemented and architected solutions on Google Cloud Platform using GCP components.
- Experience with Apache Beam/Google Dataflow/Apache Spark in creating end-to-end data pipelines.
- Experience in some of the following: Python, Hadoop, Spark, SQL, BigQuery, Bigtable, Cloud Storage, Datastore, Spanner, Cloud SQL, machine learning.
- Experience programming in Java, Python, etc.
- Expertise in at least two of these technologies: relational databases, analytical databases, NoSQL databases.
- Google Professional Data Engineer/Solution Architect certification is a major advantage.

Skills required:
- 3-13 years of experience in IT or professional services, in IT delivery or large-scale IT analytics projects.
- 3+ years of expert knowledge of Google Cloud Platform; other cloud platforms are nice to have.
- Expert knowledge in SQL development.
- Expertise in building data integration and preparation tools using cloud technologies (such as SnapLogic, Google Dataflow, Cloud Dataprep, Python, etc.).
- Identify downstream implications of data loads/migration (e.g., data quality, regulatory, etc.).
- Implement data pipelines to automate the ingestion, transformation, and augmentation of data sources, and provide best practices for pipeline operations.
- Capability to work in a rapidly changing business environment and to enable simplified user access to massive data by building scalable data solutions.
- Advanced SQL writing and experience in data mining (SQL, ETL, data warehouse, etc.) and in using databases in a business environment with complex datasets.
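As an illustrative sketch of the "end-to-end data pipelines" requirement (not this employer's pipeline), a tiny Apache Beam job that can run locally with the DirectRunner or on Dataflow; the bucket paths are hypothetical:

```python
# Hedged sketch: minimal Apache Beam pipeline (read, clean, write).
# Running against gs:// paths requires GCP credentials; paths are hypothetical.
import apache_beam as beam

with beam.Pipeline() as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://my-bucket/events/*.json")
        | "Strip" >> beam.Map(lambda line: line.strip())
        | "DropEmpty" >> beam.Filter(bool)
        | "Write" >> beam.io.WriteToText("gs://my-bucket/clean/events")
    )
```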

Posted 1 week ago

Apply

6.0 - 8.0 years

17 - 18 Lacs

Hyderabad, Pune

Work from Office


Responsibilities:
- Develop, implement, and optimize ETL/ELT pipelines for processing large datasets efficiently.
- Work extensively with BigQuery for data processing, querying, and optimization.
- Utilize Cloud Storage, Cloud Logging, Dataproc, and Pub/Sub for data ingestion, storage, and event-driven processing.
- Perform performance tuning and testing of the ELT platform to ensure high efficiency and scalability.
- Debug technical issues, perform root cause analysis, and provide solutions for production incidents.
- Ensure data quality, accuracy, and integrity across data pipelines.
- Collaborate with cross-functional teams to define technical requirements and deliver solutions.
- Work independently on assigned tasks while maintaining high levels of productivity and efficiency.

Skills required:
- Proficiency in SQL and PL/SQL for querying and manipulating data.
- Experience in Python for data processing and automation.
- Hands-on experience with Google Cloud Platform (GCP), particularly BigQuery (must-have), Cloud Storage, Cloud Logging, Dataproc, and Pub/Sub.
- Experience with GitHub and CI/CD pipelines for automation and deployment.
- Performance tuning and performance testing of ELT processes.
- Strong analytical and debugging skills to resolve data and pipeline issues efficiently.
- Self-motivated and able to work independently as an individual contributor.
- Good understanding of data modeling, database design, and data warehousing concepts.
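For illustration, a minimal sketch of the Pub/Sub side of the event-driven ingestion listed above, using the google-cloud-pubsub client; the project and topic IDs are hypothetical:

```python
# Hedged sketch: publish one message to a Pub/Sub topic.
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "ingest-events")  # hypothetical IDs

# Attributes must be strings; data must be bytes.
future = publisher.publish(topic_path, b'{"order_id": 42}', source="batch")
print("published message id:", future.result())
```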

Posted 1 week ago

Apply

6.0 - 8.0 years

8 - 16 Lacs

Hyderabad, Pune, Mumbai (All Areas)

Work from Office


Role & responsibilities:
- Develop, implement, and optimize ETL/ELT pipelines for processing large datasets efficiently.
- Work extensively with BigQuery for data processing, querying, and optimization.
- Utilize Cloud Storage, Cloud Logging, Dataproc, and Pub/Sub for data ingestion, storage, and event-driven processing.
- Perform performance tuning and testing of the ELT platform to ensure high efficiency and scalability.
- Debug technical issues, perform root cause analysis, and provide solutions for production incidents.
- Ensure data quality, accuracy, and integrity across data pipelines.
- Collaborate with cross-functional teams to define technical requirements and deliver solutions.
- Work independently on assigned tasks while maintaining high levels of productivity and efficiency.

Skills required:
- Proficiency in SQL and PL/SQL for querying and manipulating data.
- Experience in Python for data processing and automation.
- Hands-on experience with Google Cloud Platform (GCP), particularly BigQuery (must-have), Cloud Storage, Cloud Logging, Dataproc, and Pub/Sub.
- Experience with GitHub and CI/CD pipelines for automation and deployment.
- Performance tuning and performance testing of ELT processes.
- Strong analytical and debugging skills to resolve data and pipeline issues efficiently.
- Self-motivated and able to work independently as an individual contributor.
- Good understanding of data modeling, database design, and data warehousing concepts.

Posted 1 week ago

Apply

3.0 - 6.0 years

8 - 16 Lacs

Pune, Bengaluru, Delhi / NCR

Hybrid


Job Description: Node.js and JavaScript development. Proficiency in Python for scripting and automation. Experience in BigQuery SQL scripting or equivalent SQL platforms. Familiarity with CI/CD pipelines and tools like CircleCI. Strong understanding of Test-Driven Development (TDD) principles. Basic experience with cloud platforms (e.g., GCP, AWS, Azure).

Posted 1 week ago

Apply

6.0 - 10.0 years

8 - 12 Lacs

Gurugram

Work from Office


About the Role
Role purpose: Data Analyst - data modeling, data pipelines, ETL processes, Tableau, SQL, Snowflake.

Do:
- Strong expertise in data modeling, data warehousing, and ETL processes.
- Proficient in SQL and experienced with data warehousing tools (e.g., Snowflake, Redshift, BigQuery) and ETL tools (e.g., Talend, Informatica, SSIS).
- Demonstrated ability to lead and manage complex projects involving cross-functional teams.
- Excellent analytical, problem-solving, and organizational skills.
- Strong communication and leadership abilities, with a track record of mentoring and developing team members.
- Experience with data visualization tools (e.g., Tableau, Power BI) is a plus.
- Preference for candidates with ETL experience using Python, Airflow, or dbt.
- Build capability to ensure operational excellence and maintain superior customer service levels for the existing account/client.
- Undertake product trainings to stay current with product features, changes, and updates.
- Enroll in product-specific and other trainings per client requirements/recommendations.
- Partner with team leaders to brainstorm and identify training themes and learning issues to better serve the client.
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks.

Deliver
No. | Performance parameter | Measure
1 | Process | No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback
2 | Self-management | Productivity, efficiency, absenteeism, training hours, no. of technical trainings completed

Posted 1 week ago

Apply

5.0 - 8.0 years

5 - 9 Lacs

Hyderabad

Work from Office


About the Role
Role purpose: Support process delivery by ensuring the daily performance of Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialist team.

Do
Oversee and support the process by reviewing daily transactions on performance parameters:
- Review the performance dashboard and the team's scores.
- Support the team in improving performance parameters by providing technical support and process guidance.
- Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions.
- Ensure standard processes and procedures are followed to resolve all client queries.
- Resolve client queries within the SLAs defined in the contract.
- Develop an understanding of the process/product so team members can improve client interaction and troubleshooting.
- Document and analyze call logs to spot recurring trends and prevent future problems.
- Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution.
- Ensure all product information and disclosures are given to clients before and after call/email requests.
- Avoid legal challenges by monitoring compliance with service agreements.

Handle technical escalations through effective diagnosis and troubleshooting of client queries:
- Manage and resolve technical roadblocks/escalations per SLA and quality requirements.
- If unable to resolve an issue, escalate it to TA & SES in a timely manner.
- Provide product support and resolution to clients by diagnosing queries and guiding users through step-by-step solutions.
- Troubleshoot all client queries in a user-friendly, courteous, and professional manner.
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business.
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations.
- Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs.

Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client:
- Mentor and guide Production Specialists on improving technical knowledge.
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with Production Specialists.
- Develop and conduct trainings (triages) within products for Production Specialists per target, and inform the client about the triages being conducted.
- Undertake product trainings to stay current with product features, changes, and updates; enroll in product-specific and other trainings per client requirements/recommendations.
- Identify and document the most common problems, and recommend appropriate resolutions to the team.
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks.

Deliver
No. | Performance parameter | Measure
1 | Process | No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2 | Team management | Productivity, efficiency, absenteeism
3 | Capability development | Triages completed, technical test performance

Mandatory skills: Google BigQuery.
Experience: 5-8 years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention: of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA: as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 1 week ago

Apply

3.0 - 5.0 years

5 - 9 Lacs

Bengaluru

Work from Office


About the Role
Role purpose: Design, test, and maintain software programs for operating systems or applications to be deployed at the client end, ensuring they meet 100% quality assurance parameters.

Do
1. Be instrumental in understanding the requirements and design of the product/software:
- Develop software solutions by studying information needs, systems flow, data usage, and work processes.
- Investigate problem areas across the software development life cycle.
- Facilitate root-cause analysis of system issues and problem statements.
- Identify ideas to improve system performance and availability.
- Analyze client requirements and convert them into feasible designs.
- Collaborate with functional teams or systems analysts carrying out detailed investigation into software requirements, and confer with project managers to obtain information on software capabilities.

2. Perform coding and ensure optimal software/module development:
- Determine operational feasibility by evaluating analysis, problem definition, requirements, and proposed software.
- Develop and automate processes for software validation by setting up, designing, and executing test cases/scenarios/usage cases.
- Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces.
- Analyze information to recommend and plan the installation of new systems or modifications to existing ones.
- Ensure code is error-free, with no bugs or test failures.
- Prepare reports on programming project specifications, activities, and status.
- Ensure all the codes are raised per the norms defined for the project/program/account, with clear descriptions and replication patterns.
- Compile timely, comprehensive, and accurate documentation and reports as requested.
- Coordinate with the team on daily project status and progress, and document it.
- Provide feedback on usability and serviceability, trace results to quality risk, and report to the concerned stakeholders.

3. Status reporting and customer focus on an ongoing basis with respect to the project and its execution:
- Capture all requirements and clarifications from the client for better-quality work.
- Take feedback regularly to ensure smooth and on-time delivery.
- Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members.
- Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements.
- Document and demonstrate solutions via documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code.
- Document all necessary details and reports formally, for proper understanding of the software from client proposal to implementation.
- Ensure good quality of customer interaction (e-mail content, fault report tracking, voice calls, business etiquette, etc.).
- Respond to customer requests on time, with no complaints, internal or external.

Deliver
No. | Performance parameter | Measure
1 | Continuous integration, deployment & monitoring of software | 100% error-free onboarding & implementation, throughput %, adherence to the schedule/release plan
2 | Quality & CSAT | On-time delivery, manage software, troubleshoot queries, customer experience, completion of assigned certifications for skill upgradation
3 | MIS & reporting | 100% on-time MIS & report generation

Mandatory skills: Google BigQuery.
Experience: 3-5 years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention: of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA: as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 1 week ago

Apply

4.0 - 8.0 years

4 - 9 Lacs

Mumbai

Work from Office


Key responsibilities:
- Monitor and analyze website traffic, user engagement, and conversion rates using Piwik PRO and other relevant analytics tools.
- Develop and maintain dashboards and reports to track key performance indicators (KPIs) for web and digital marketing initiatives.
- Define, own, and drive the tagging and tracking strategy, roadmap, and operational plan, in line with business needs, for website and website-funnel performance.
- Ensure accurate tagging, tracking, and data collection by working closely with developers to implement Piwik PRO Tag Manager or other tracking solutions.
- Analyze user paths and optimize completion of desired goals.
- Provide insights and recommendations based on data analysis to support business goals.
- Identify trends, patterns, and areas for improvement to enhance user experience and site performance.
- Track and report the effectiveness of online campaigns, landing pages, and digital assets.
- Collaborate with marketing, UX/UI designers, developers, and product managers to implement data-driven website enhancements.

Qualifications & skills:
- Experience in web analytics, digital marketing analytics, or a similar role.
- Proficiency in Piwik PRO, Google Tag Manager, or equivalent web analytics tools.
- Experience with data visualization tools like Excel and Power BI (Looker Studio is a plus).
- Strong understanding of SEO, digital marketing strategies, and user experience principles.
- Knowledge of HTML, CSS, and web tracking implementation; knowledge of JavaScript or SQL is a plus.
- Strong analytical and problem-solving skills with attention to detail.

Posted 1 week ago

Apply

5.0 - 10.0 years

14 - 22 Lacs

Hyderabad

Work from Office


Role: Machine Learning Engineer

Required skills & experience:
- 5+ years of hands-on experience building, training, and deploying machine learning models in a professional, production-oriented setting.
- Demonstrable experience with database creation and advanced querying (e.g., SQL, NoSQL), with a strong understanding of data warehousing concepts.
- Proven expertise in data blending, transformation, and feature engineering, adept at integrating and harmonizing both structured (e.g., relational databases, CSVs) and unstructured (e.g., text, logs, images) data.
- Strong practical experience with cloud platforms for machine learning development and deployment; significant experience with Google Cloud Platform (GCP) services (e.g., Vertex AI, BigQuery, Dataflow) is highly desirable.
- Proficiency in programming languages commonly used in data science (Python preferred; R).
- Solid understanding of various machine learning algorithms (e.g., regression, classification, clustering, dimensionality reduction) and experience with advanced techniques like deep learning, natural language processing (NLP), or computer vision.
- Experience with machine learning libraries and frameworks (e.g., scikit-learn, TensorFlow, PyTorch).
- Familiarity with MLOps tools and practices, including model versioning, monitoring, A/B testing, and continuous integration/continuous deployment (CI/CD) pipelines.
- Experience with containerization technologies like Docker and orchestration tools like Kubernetes for deploying ML models as REST APIs.
- Proficiency with version control systems (e.g., Git, GitHub/GitLab) for collaborative development.

Interested candidates may share their CV at dikshith.nalapatla@motivitylabs.com.
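A compact, illustrative sketch of the build-train-evaluate loop such roles assess, using scikit-learn on a bundled dataset (no claims about the employer's actual stack or data):

```python
# Hedged sketch: train and evaluate a classifier on a public toy dataset.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_tr, y_tr)

print("holdout accuracy:", accuracy_score(y_te, model.predict(X_te)))
```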

Posted 1 week ago

Apply

0.0 - 2.0 years

2 - 4 Lacs

Mumbai

Work from Office


Responsibilities:
- Design and conduct statistical analysis of large and complex educational datasets to generate actionable insights.
- Build analytical models to uncover trends, test hypotheses, and support data-driven policy decisions.
- Develop and maintain interactive dashboards and visualizations that communicate insights clearly to non-technical stakeholders.
- Conduct diagnostic and predictive analysis using sound statistical methodologies.
- Collaborate with internal teams and government stakeholders to translate insights into strategic program or policy recommendations.
- Support data cleaning, preparation, and management processes to ensure high data integrity across sources.
- Contribute to the design of data collection frameworks for new education initiatives.
- Prepare high-quality analytical reports and documents that summarize findings.
- Document methodologies, analysis processes, and assumptions used during data analysis to ensure reproducibility and transparency.

Who we are looking for:
We are seeking a recent graduate or early-career professional with strong technical and analytical skills. The ideal candidate will have:
- Up to 2 years of experience (freshers welcome).
- A strong academic background in Statistics, Econometrics, or a related quantitative field (e.g., M.Sc. in Statistics).
- Deep understanding of statistical techniques, including regression, clustering, hypothesis testing, time-series analysis, and multivariate analysis.
- Proficiency in at least one programming language for statistical analysis: R or Python.
- Strong data visualization skills using tools such as Tableau, Looker Studio, and Power BI, or libraries like ggplot, matplotlib, and seaborn.
- Hands-on experience with SQL-based databases (e.g., MySQL, PostgreSQL) and familiarity with NoSQL databases (e.g., MongoDB, Firebase).
- Ability to clearly present complex data in compelling and accessible formats to diverse stakeholders.
- Excellent problem-solving, analytical thinking, and communication skills.
- High level of self-motivation, attention to detail, and alignment with EdIndia's mission of transforming public education through data.

Preferred skills:
- Experience with cloud platforms such as AWS or Google BigQuery.
- Exposure to software development tools like Python (for scripting or automation) or Flutter (for app or dashboard development).
- Understanding of education data systems, or prior experience working with government/public-sector datasets.

Why join us:
- Opportunity to work on impactful, data-driven education projects.
- Collaborative and growth-oriented work environment.
- Hands-on experience with real-world datasets and decision-making processes.
- A chance to contribute to the transformation of education in India.
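As a small illustration of the statistical toolkit listed (regression and hypothesis testing in Python), with synthetic data standing in for real education datasets:

```python
# Hedged sketch: OLS regression and a two-sample t-test on synthetic data.
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(7)
study_hours = rng.uniform(0, 10, 200)
scores = 40 + 4.5 * study_hours + rng.normal(0, 5, 200)  # known true effect

X = sm.add_constant(study_hours)   # intercept + predictor
ols = sm.OLS(scores, X).fit()
print(ols.params)                   # estimates should be near [40, 4.5]

t, p = stats.ttest_ind(scores[:100], scores[100:])
print("t =", round(t, 3), "p =", round(p, 3))
```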

Posted 1 week ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Pune

Work from Office


Capgemini Invent
Capgemini Invent is the digital innovation, consulting, and transformation brand of the Capgemini Group: a global business line that combines market-leading expertise in strategy, technology, data science, and creative design to help CxOs envision and build what's next for their businesses.

Your Role:
- Data pipeline implementation experience with any of these cloud providers: AWS, Azure, GCP.
- Experience with cloud storage, cloud databases, cloud data warehousing, and data lake solutions such as Snowflake, BigQuery, AWS Redshift, ADLS, and S3.
- Good knowledge of cloud compute services and load balancing.
- Good knowledge of cloud identity management, authentication, and authorization.
- Proficiency with cloud utility functions such as AWS Lambda, AWS Step Functions, Cloud Run, Cloud Functions, and Azure Functions.
- Experience using cloud data integration services for structured, semi-structured, and unstructured data, such as Azure Databricks, Azure Data Factory, Azure Synapse Analytics, AWS Glue, AWS EMR, Dataflow, and Dataproc.

Your Profile:
- Good knowledge of infra capacity sizing and costing of cloud services, to drive optimized solution architecture and the right trade-off between infra investment and performance/scaling.
- Able to contribute to architectural choices using various cloud services and solution methodologies.
- Expertise in programming using Python.
- Very good knowledge of cloud DevOps practices such as infrastructure as code, CI/CD components, and automated deployments on cloud.
- Must understand networking, security, design principles, and best practices in cloud.

What you will love about working here:
We recognize the significance of flexible work arrangements. Be it remote work or flexible work hours, you will get an environment that supports a healthy work-life balance. At the heart of our mission is your career growth: our array of career growth programs and diverse professions is crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies, such as Generative AI.

About Capgemini:
Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over-55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud, and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.

Posted 1 week ago

Apply

15.0 - 20.0 years

14 - 19 Lacs

Hyderabad

Work from Office


Project role: Business and Integration Architect
Project role description: Designs the integration strategy, endpoints, and data flow to align technology with business strategy and goals. Understands the entire project life cycle, including requirements analysis, coding, testing, deployment, and operations, to ensure successful integration.
Must-have skills: SAP EWM
Good-to-have skills: NA
Minimum 5 years of experience is required.
Educational qualification: 15 years of full-time education.

Summary: As a Business and Integration Architect, you will be responsible for designing the integration strategy, endpoints, and data flow to align technology with business strategy and goals. Your typical day will involve collaborating with various teams to understand their needs, analyzing project requirements, and ensuring that integration processes are seamless and efficient. You will engage in discussions to refine strategies, oversee the implementation of solutions, and monitor the overall project lifecycle to ensure that all aspects, from coding to deployment, are executed effectively. Your role will also require you to stay updated with the latest technologies and methodologies to enhance integration practices and drive business success.

Roles & responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform; responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with business objectives.

Professional & technical skills:
- Must-have: proficiency in SAP EWM.
- Strong understanding of integration strategies and data flow management.
- Experience with project lifecycle management, including requirements analysis and deployment.
- Ability to analyze complex business requirements and translate them into technical solutions.
- Familiarity with various integration tools and technologies.

Additional information:
- The candidate should have a minimum of 5 years of experience in SAP EWM.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

8.0 - 12.0 years

4 - 8 Lacs

Pune

Work from Office


Roles & responsibilities:
- Total 8-10 years of working experience, including 8-10 years with big data tools such as Spark, Kafka, and Hadoop.
- Design and deliver consumer-centric, high-performance systems, dealing with huge volumes of data arriving through batch and streaming platforms.
- Build and deliver data pipelines that process, transform, integrate, and enrich data to meet various business demands.
- Mentor the team on infrastructure, networking, data migration, monitoring, and troubleshooting.
- Focus on automation using Infrastructure as Code (IaC), Jenkins, DevOps, etc.
- Design, build, test, and deploy streaming pipelines that process data in real time and at scale.
- Experience with stream-processing systems such as Storm, Spark Streaming, and Flink.
- Experience with object-oriented/functional scripting languages: Scala, Java, etc.
- Develop software systems using test-driven development and CI/CD practices.
- Partner with other engineers and team members to develop software that meets business needs, following Agile methodology for software development and technical documentation.
- Banking/finance domain knowledge is good to have.
- Strong written and oral communication, presentation, and interpersonal skills; exceptional analytical, conceptual, and problem-solving abilities.
- Able to prioritize and execute tasks in a high-pressure environment; experience working in a team-oriented, collaborative environment.

Technical requirements:
- 8-10 years of hands-on coding experience; proficient in Java, with good knowledge of its ecosystem.
- Experience writing Spark code in Scala.
- Experience with big data tools such as Sqoop, Hive, Pig, and Hue.
- Solid understanding of object-oriented programming and HDFS concepts; familiar with various design and architectural patterns.
- Experience with big data tools: Hadoop, Spark, Kafka, Flink, Hive, Sqoop, etc.
- Experience with relational SQL and NoSQL databases, including MySQL, PostgreSQL, MongoDB, and Cassandra.
- Experience with data pipeline tools such as Airflow.
- Experience with AWS cloud services: EC2, S3, EMR, RDS, Redshift, BigQuery.
- Experience with stream-processing systems: Storm, Spark Streaming, Flink, etc.
- Experience with object-oriented/functional scripting languages: Python, Java, Scala, etc.
- Expertise in designing/developing platform components such as caching, messaging, event processing, automation, transformation, and tooling frameworks.

Location: Pune / Mumbai / Bangalore / Chennai
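An illustrative sketch of the streaming requirement: a Spark Structured Streaming job reading from Kafka. The broker and topic names are hypothetical, the job needs the spark-sql-kafka package on the classpath, and PySpark is used here for brevity even though the posting emphasises Scala:

```python
# Hedged sketch: Structured Streaming from Kafka to the console sink.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("stream-sketch").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "orders")                     # hypothetical topic
    .load()
)

query = (
    events.selectExpr("CAST(value AS STRING) AS payload")
    .writeStream.format("console")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```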

Posted 1 week ago

Apply

6.0 - 10.0 years

2 - 6 Lacs

Hyderabad

Work from Office


Develop SAP BW-IP data flows in the S/4HANA system. Provide inputs on data modelling between BW 7.4 on HANA and native HANA using CompositeProviders, ADSOs, and Open ODS views. Excellent verbal and written communication skills in English are required. Self-motivated, and capable of managing your own workload with minimum supervision. Create complex, enterprise-transforming applications within a dynamic, progressive, technically diverse environment. Location: Pan India.

Posted 1 week ago

Apply

2.0 - 6.0 years

7 - 11 Lacs

Bengaluru

Work from Office


As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating and integrating code into enterprise systems. You will have access to the latest education, tools, and technology, and a limitless career path with the world's technology leader. Come to IBM and make a global impact!

IBM's Cloud Services are focused on supporting clients on their cloud journey across any platform to achieve their business goals. This encompasses Cloud Advisory, Architecture, Cloud Native Development, Application Portfolio Migration, Modernization, and Rationalization, as well as Cloud Operations. Cloud Services supports all public/private/hybrid cloud deployments: IBM Bluemix/IBM Cloud/Red Hat/AWS/Azure/Google and client private environments. Cloud Services has the best cloud developer, architect, complex SI, SysOps, and delivery talent, delivered through our GEO CIC Factory model.

As a member of our Cloud Practice, you will be responsible for defining and implementing application cloud migration, modernization, and rationalization solutions for clients across all sectors. You will support mobilization, help to lead the quality of our programmes and services, liaise with clients, and provide consulting services, including:
- Create cloud migration strategies: define the delivery architecture, create migration plans, design orchestration plans, and more.
- Assist in creating and executing migration runbooks.
- Evaluate source (physical, virtual, and cloud) and target workloads.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- GCP using Pub/Sub, BigQuery, Dataflow, Cloud Workflows/Cloud Scheduler, Cloud Run, Dataproc, and Cloud Functions.
- Cloud data engineers with GCP PDE certification and working experience with GCP, building end-to-end data pipelines using the services above.
- Experience in logging and monitoring of GCP services, and experience in Terraform and infrastructure automation.
- Expertise in the Python coding language.
- Develops, supports, and maintains data engineering solutions on the Google Cloud ecosystem.

Preferred technical and professional experience:
- Stay updated with the latest trends and advancements in cloud technologies, frameworks, and tools.
- Conduct code reviews and provide constructive feedback to maintain code quality and ensure adherence to best practices.
- Troubleshoot and debug issues, and deploy applications to the cloud platform.

Posted 1 week ago

Apply

5.0 - 7.0 years

13 - 17 Lacs

Hyderabad

Work from Office


Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and to front-face the customer.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.

Preferred technical and professional experience:
- Intuitive individual with an ability to manage change and proven time management.
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed.
- Up-to-date technical knowledge from attending educational workshops and reviewing publications.

Posted 1 week ago

Apply

3.0 - 6.0 years

10 - 14 Lacs

Gurugram

Work from Office


As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys.

Your primary responsibilities include:
- Comprehensive feature development and issue resolution: work on end-to-end feature development and solve challenges faced during implementation.
- Stakeholder collaboration and issue resolution: collaborate with key stakeholders, internal and external, to understand problems and issues with the product and features, and resolve them per defined SLAs.
- Continuous learning and technology integration: be eager to learn new technologies and apply them in feature development.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- SQL authoring, query, and cost optimisation, primarily on BigQuery.
- Python as an object-oriented scripting language.
- Data pipeline, data streaming, and workflow management tools: Dataflow, Pub/Sub, Hadoop, Spark Streaming.
- Version control systems: Git; knowledge of Infrastructure as Code (Terraform) preferred.
- Advanced working SQL knowledge and experience with relational databases, query authoring (SQL), and working familiarity with a variety of databases.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions.

Preferred technical and professional experience:
- Experience building and optimising data pipelines, architectures, and data sets.
- Build processes supporting data transformation, data structures, metadata, dependency, and workload management.
- Working knowledge of message queuing, stream processing, and highly scalable data stores.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- We are looking for a candidate with experience in a Data Engineer role who is also familiar with Google Cloud Platform.
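For illustration, a short sketch of the "query and cost optimisation" item: a BigQuery dry run that reports how many bytes a query would scan before it is actually billed. The table name is hypothetical:

```python
# Hedged sketch: estimate BigQuery scan cost with a dry run.
from google.cloud import bigquery

client = bigquery.Client()
config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)

job = client.query(
    "SELECT user_id FROM `my_project.events.clicks` WHERE dt = '2024-06-01'",
    job_config=config,
)
# A dry-run job never executes; it only reports the bytes it would process.
print(f"query would process {job.total_bytes_processed / 1e9:.2f} GB")
```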

Posted 1 week ago

Apply

5.0 - 7.0 years

13 - 17 Lacs

Bengaluru

Work from Office


Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and to front-face the customer.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.

Preferred technical and professional experience:
- Intuitive individual with an ability to manage change and proven time management.
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed.
- Up-to-date technical knowledge from attending educational workshops and reviewing publications.

Posted 1 week ago

Apply