962 BigQuery Jobs - Page 6

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

6.0 - 10.0 years

2 - 6 Lacs

Hyderabad

Work from Office

Naukri logo

Job Information
Job Opening ID: ZR_1958_JOB | Date Opened: 16/05/2023 | Industry: Technology | Work Experience: 6-10 years | Job Title: SAP BW IP | City: Hyderabad | Province: Telangana | Country: India | Postal Code: 500001 | Number of Positions: 5

- Develop SAP BW-IP data flows in the S/4HANA system.
- Provide input on data modelling between BW 7.4 on HANA and native HANA using CompositeProviders, ADSOs, and Open ODS views.
- Excellent verbal and written communication skills in English are required.
- Self-motivated and capable of managing your own workload with minimal supervision.
- Create complex, enterprise-transforming applications within a dynamic, progressive, technically diverse environment.

Location: Pan India

Posted 1 week ago

Apply

4.0 - 6.0 years

7 - 11 Lacs

Pune

Work from Office

Job Information
Job Opening ID: ZR_2109_JOB | Date Opened: 09/02/2024 | Industry: Technology | Work Experience: 4-6 years | Job Title: CXO Analyst | City: Pune | Province: Maharashtra | Country: India | Postal Code: 411001 | Number of Positions: 4

Location: Bangalore, Mumbai, Pune, Chennai, Hyderabad, Kolkata, Delhi

Technical skills:
- Data strategist with expertise in product growth, Conversion Rate Optimization (experiments), and personalisation, using digital analytics tools such as Adobe Analytics and Google Analytics (quantitative and qualitative).
- Proven success in acquisition and retention marketing through strategic and tactical implementation of Conversion Rate Optimization (A/B testing) and product optimization (landing page optimization, product-market fit) driven by data insights.
- Personalisation: customer segmentation, targeted messaging, dynamic content, recommendations, behavioral targeting, and AI-powered personalization.
- Skilled in leveraging advanced analytics tools to extract actionable insights from extensive datasets: SQL, BigQuery, Excel, Python for data analysis, Power BI, and Data Studio.
- Proficient in implementing digital analytics measurement across diverse domains using tools such as Adobe Launch, Google Tag Manager, and Tealium.

Soft skills:
- Experience in client-facing projects and stakeholder management; excellent communication skills.
- Collaborative team player, aligning product vision with business objectives cross-functionally.
- Avid learner, staying current with industry trends and emerging technologies.
- Committed to delivering measurable results through creative thinking, exceeding performance metrics, and fostering growth.
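As a flavor of the A/B-testing work this role describes, here is a minimal two-proportion z-test sketch in plain Python. The numbers and the `ab_test` helper are hypothetical, illustrative only, and not taken from the posting:

```python
from math import sqrt, erf

# Two-proportion z-test for an A/B conversion experiment (synthetic
# numbers; a hedged sketch of the CRO experimentation the role mentions).
def ab_test(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)           # pooled conversion rate
    se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))  # pooled standard error
    z = (p_b - p_a) / se
    # two-sided p-value from the normal CDF, Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Variant B converts 6.5% vs. A's 5.0% on 4,000 visitors each.
z, p_value = ab_test(conv_a=200, n_a=4000, conv_b=260, n_b=4000)
print(round(z, 2), p_value < 0.05)
```

With these synthetic counts the lift is statistically significant at the usual 5% level, which is the kind of call a CRO analyst makes before rolling out a variant.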

Posted 1 week ago

Apply

6.0 - 10.0 years

3 - 6 Lacs

Chennai

Work from Office

Job Information
Job Opening ID: ZR_2412_JOB | Date Opened: 04/02/2025 | Industry: IT Services | Work Experience: 6-10 years | Job Title: Data Modeller | City: Chennai | Province: Tamil Nadu | Country: India | Postal Code: 600001 | Number of Positions: 1

Skill set required: GCP, data modelling (OLTP, OLAP), indexing, DBSchema, CloudSQL, BigQuery

- Hands-on data modelling for OLTP and OLAP systems.
- In-depth knowledge of conceptual, logical, and physical data modelling.
- Strong understanding of indexing, partitioning, and data sharding, with practical experience applying them.
- Strong understanding of the variables impacting database performance for near-real-time reporting and application interaction.
- Working experience with at least one data modelling tool, preferably DBSchema.
- Functional knowledge of the mutual fund industry is a plus.
- Good understanding of GCP databases such as AlloyDB, CloudSQL, and BigQuery.
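To illustrate the indexing concept this role emphasizes, here is a minimal sketch using Python's built-in sqlite3. The `transactions` table and `fund_code` column are hypothetical examples, not from the posting; the point is how an index changes the query access path:

```python
import sqlite3

# Hypothetical "transactions" table; shows a full scan turning into an
# index search once an index on the filter column exists.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transactions (id INTEGER PRIMARY KEY, fund_code TEXT, amount REAL)")
conn.executemany("INSERT INTO transactions (fund_code, amount) VALUES (?, ?)",
                 [(f"F{i % 10}", i * 1.5) for i in range(1000)])

def plan(sql):
    # EXPLAIN QUERY PLAN rows carry the access-path detail in column 3
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT SUM(amount) FROM transactions WHERE fund_code = 'F3'"
before = plan(query)                                      # table scan
conn.execute("CREATE INDEX idx_fund ON transactions(fund_code)")
after = plan(query)                                       # search via idx_fund

print(before)
print(after)
```

The same idea scales up: in BigQuery the analogous levers are partitioning and clustering rather than B-tree indexes, but the goal (pruning what the engine must read) is identical.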

Posted 1 week ago

Apply

5.0 - 8.0 years

2 - 6 Lacs

Mumbai

Work from Office

Job Information
Job Opening ID: ZR_1963_JOB | Date Opened: 17/05/2023 | Industry: Technology | Work Experience: 5-8 years | Job Title: Neo4j GraphDB Developer | City: Mumbai | Province: Maharashtra | Country: India | Postal Code: 400001 | Number of Positions: 5

Graph data engineer required for a complex supply chain project.

Key required skills:
- Graph data modelling: experience with graph data models (LPG, RDF) and the Cypher graph language; exposure to various graph data modelling techniques.
- Experience with Neo4j Aura and optimizing complex queries.
- Experience with GCP stack components such as BigQuery, GCS, and Dataproc.
- Experience in PySpark and SparkSQL is desirable.
- Experience exposing graph data to visualisation tools such as NeoDash, Tableau, and Power BI.

The expertise you have:
- Bachelor's or Master's degree in a technology-related field (e.g. Engineering, Computer Science).
- Demonstrable experience implementing data solutions in the graph database space.
- Hands-on experience with graph databases (Neo4j preferred, or any other).
- Experience tuning graph databases.
- Understanding of graph data model paradigms (LPG, RDF) and graph languages; hands-on experience with Cypher is required.
- Solid understanding of graph data modelling, graph schema development, and graph data design.
- Relational database experience; hands-on SQL experience is required.

Desirable (optional) skills:
- Data ingestion technologies (ETL/ELT), messaging/streaming technologies (GCP Data Fusion, Kinesis/Kafka), API and in-memory technologies.
- Understanding of developing highly scalable distributed systems using open-source technologies.
- Experience with supply chain data is desirable but not essential.

Location: Pune, Mumbai, Chennai, Bangalore, Hyderabad
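The supplier-graph traversals this role involves can be sketched in plain Python. The node names and `SUPPLIES`-style edges below are invented for illustration; a real project would express this as a Cypher path query in Neo4j:

```python
from collections import deque

# Toy supply-chain graph (hypothetical nodes/edges). The traversal plays
# the role of a Cypher variable-length path: MATCH (s)-[:SUPPLIES*]->(d).
edges = {"RawCo": ["PartsInc"], "PartsInc": ["AssemblyLtd"], "AssemblyLtd": ["Retailer"]}

def downstream(start):
    """Breadth-first search: every node reachable from `start`."""
    seen, queue = [], deque([start])
    while queue:
        node = queue.popleft()
        for nxt in edges.get(node, []):
            if nxt not in seen:
                seen.append(nxt)
                queue.append(nxt)
    return seen

print(downstream("RawCo"))  # ['PartsInc', 'AssemblyLtd', 'Retailer']
```

A graph database makes exactly this kind of multi-hop reachability question a first-class, index-backed operation instead of a chain of relational self-joins.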

Posted 1 week ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Chennai

Work from Office

Position Description: Full Stack Data Engineers will work with Data Scientists and Product Development teams.
Tech stack: Python, Dataproc, Airflow, PySpark, Cloud Storage, dbt, Dataform, NAS, Pub/Sub, Terraform, API, BigQuery, Data Fusion, GCP, Tekton

Posted 1 week ago

Apply

6.0 - 10.0 years

8 - 12 Lacs

Hyderabad

Work from Office

Mandatory skills: ETL, GCP, BigQuery

Responsibilities:
- Develop, implement, and optimize ETL/ELT pipelines for processing large datasets efficiently.
- Work extensively with BigQuery for data processing, querying, and optimization.
- Utilize Cloud Storage, Cloud Logging, Dataproc, and Pub/Sub for data ingestion, storage, and event-driven processing.
- Perform performance tuning and testing of the ELT platform to ensure high efficiency and scalability.
- Debug technical issues, perform root cause analysis, and provide solutions for production incidents.
- Ensure data quality, accuracy, and integrity across data pipelines.
- Collaborate with cross-functional teams to define technical requirements and deliver solutions.
- Work independently on assigned tasks while maintaining high productivity and efficiency.

Skills required:
- Proficiency in SQL and PL/SQL for querying and manipulating data.
- Experience in Python for data processing and automation.
- Hands-on experience with Google Cloud Platform (GCP), particularly BigQuery (must-have), Cloud Storage, Cloud Logging, Dataproc, and Pub/Sub.
- Experience with GitHub and CI/CD pipelines for automation and deployment.
- Performance tuning and performance testing of ELT processes.
- Strong analytical and debugging skills to resolve data and pipeline issues efficiently.
- Self-motivated and able to work independently as an individual contributor.
- Good understanding of data modeling, database design, and data warehousing concepts.
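The extract-transform-load pattern behind such pipelines can be sketched with the standard library alone. The field names and the in-memory SQLite sink are hypothetical stand-ins; a real pipeline for this role would read from Cloud Storage and load into BigQuery:

```python
import csv, io, sqlite3

# Synthetic raw input, including one malformed record to show the
# data-quality step the posting calls for.
RAW = "user_id,amount\n1, 10.5 \n2,not_a_number\n1,4.5\n"

def extract(text):
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    clean = []
    for r in rows:
        try:
            clean.append((int(r["user_id"]), float(r["amount"])))
        except ValueError:
            continue  # drop malformed records instead of failing the load
    return clean

def load(rows):
    db = sqlite3.connect(":memory:")  # stand-in for the warehouse
    db.execute("CREATE TABLE events (user_id INTEGER, amount REAL)")
    db.executemany("INSERT INTO events VALUES (?, ?)", rows)
    return db

db = load(transform(extract(RAW)))
total = db.execute("SELECT SUM(amount) FROM events WHERE user_id = 1").fetchone()[0]
print(total)  # 15.0
```

Keeping extract, transform, and load as separate functions is what makes each stage independently testable and tunable, which is the core of the performance and debugging work described above.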

Posted 1 week ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Bengaluru

Work from Office

Data Analyst
Location: Bangalore | Experience: 8-15 years | Type: Full-time

Role Overview
We are seeking a skilled Data Analyst to support our platform powering operational intelligence across airports and similar sectors. The ideal candidate will have experience working with time-series datasets and operational information to uncover trends, anomalies, and actionable insights. This role works closely with data engineers, ML teams, and domain experts to turn raw data into meaningful intelligence for business and operations stakeholders.

Key Responsibilities
- Analyze time-series and sensor data from various sources.
- Develop and maintain dashboards, reports, and visualizations to communicate key metrics and trends.
- Correlate data from multiple systems (vision, weather, flight schedules, etc.) to provide holistic insights.
- Collaborate with AI/ML teams to support model validation and interpret AI-driven alerts (e.g., anomalies, intrusion detection).
- Prepare and clean datasets for analysis and modeling; ensure data quality and consistency.
- Work with stakeholders to understand reporting needs and deliver business-oriented outputs.

Qualifications & Required Skills
- Bachelor's or Master's degree in Data Science, Statistics, Computer Science, Engineering, or a related field.
- 5+ years of experience in a data analyst role, ideally in a technical/industrial domain.
- Strong SQL skills and proficiency with BI/reporting tools (e.g., Power BI, Tableau, Grafana).
- Hands-on experience analyzing structured and semi-structured data (JSON, CSV, time-series).
- Proficiency in Python or R for data manipulation and exploratory analysis.
- Understanding of time-series databases or streaming data (e.g., InfluxDB, Kafka, Kinesis).
- Solid grasp of statistical analysis and anomaly detection methods.
- Experience working with data from industrial systems or large-scale physical infrastructure.

Good-to-Have Skills
- Domain experience in airports, smart infrastructure, transportation, or logistics.
- Familiarity with data platforms (Snowflake, BigQuery, or custom-built on open source).
- Exposure to tools like Airflow, Jupyter notebooks, and data quality frameworks.
- Basic understanding of AI/ML workflows and data preparation requirements.
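The anomaly detection methods listed above can be as simple as a z-score check over a sensor series. The readings below are synthetic and the 2-sigma threshold is a common but arbitrary choice, not something from the posting:

```python
from statistics import mean, stdev

# Synthetic sensor readings with one obvious spike.
readings = [20.1, 19.8, 20.4, 20.0, 35.7, 20.2, 19.9]

def anomalies(series, threshold=2.0):
    """Flag points more than `threshold` sample standard deviations from the mean."""
    mu, sigma = mean(series), stdev(series)
    return [x for x in series if abs(x - mu) / sigma > threshold]

print(anomalies(readings))  # [35.7] -- only the spike exceeds the threshold
```

Production systems layer rolling windows, seasonality handling, and alert deduplication on top, but this is the statistical core a data analyst validates against AI-driven alerts.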

Posted 1 week ago

Apply

8.0 - 13.0 years

30 - 35 Lacs

Bengaluru

Work from Office

Data Engineer
Location: Bangalore (onsite) | Experience: 8-15 years | Type: Full-time

Role Overview
We are seeking an experienced Data Engineer to build and maintain scalable, high-performance data pipelines and infrastructure for our next-generation data platform. The platform ingests and processes real-time and historical data from diverse industrial sources such as airport systems, sensors, cameras, and APIs. You will work closely with AI/ML engineers, data scientists, and DevOps to enable reliable analytics, forecasting, and anomaly detection use cases.

Key Responsibilities
- Design and implement real-time (Kafka, Spark/Flink) and batch (Airflow, Spark) pipelines for high-throughput data ingestion, processing, and transformation.
- Develop data models and manage data lakes and warehouses (Delta Lake, Iceberg, etc.) to support both analytical and ML workloads.
- Integrate data from diverse sources: IoT sensors, databases (SQL/NoSQL), REST APIs, and flat files.
- Ensure pipeline scalability, observability, and data quality through monitoring, alerting, validation, and lineage tracking.
- Collaborate with AI/ML teams to provision clean, ML-ready datasets for training and inference.
- Deploy, optimize, and manage pipelines and data infrastructure across on-premise and hybrid environments.
- Participate in architectural decisions to ensure resilient, cost-effective, and secure data flows.
- Contribute to infrastructure-as-code and automation for data deployment using Terraform, Ansible, or similar tools.

Qualifications & Required Skills
- Bachelor's or Master's in Computer Science, Engineering, or a related field.
- 6+ years in data engineering roles, with at least 2 years handling real-time or streaming pipelines.
- Strong programming skills in Python/Java and SQL.
- Experience with Apache Kafka, Apache Spark, or Apache Flink for real-time and batch processing.
- Hands-on with Airflow, dbt, or other orchestration tools.
- Familiarity with data modeling (OLAP/OLTP), schema evolution, and format handling (Parquet, Avro, ORC).
- Experience with hybrid/on-prem and cloud-platform (AWS/GCP/Azure) deployments.
- Proficient in working with data lakes/warehouses such as Snowflake, BigQuery, Redshift, or Delta Lake.
- Knowledge of DevOps practices: Docker/Kubernetes, Terraform or Ansible.
- Exposure to data observability, data cataloging, and quality tools (e.g., Great Expectations, OpenMetadata).

Good-to-Have
- Experience with time-series databases (e.g., InfluxDB, TimescaleDB) and sensor data.
- Prior experience in domains such as aviation, manufacturing, or logistics is a plus.

Posted 1 week ago

Apply

2.0 - 5.0 years

9 - 13 Lacs

Coimbatore

Work from Office

Overview
Role: Data Support - Sr Analyst
Skills: SQL, AWS, Meta Ads, Google Ads or TikTok Ads
Location: Bangalore, Hyderabad, Chennai & Coimbatore
Work model: Hybrid

Annalect is continuously evolving its Technology Operations function, and as part of this expansion we are seeking a motivated and dynamic individual for the Senior Analyst role. In this position, you will provide technical support for multiple applications, both developed by our internal Annalect Technology team and integrated into Annalect's ecosystem as SaaS and PaaS solutions. This role is essential for meeting the daily support demands of our global user community and is critical to the organization's success in deploying and supporting its technical stack across the company.

Key Responsibilities
- Maintain and document technology/application support and service level agreements.
- Manage user access and onboarding for platforms, including Active Directory (AD) and cloud-based tools.
- Develop, implement, and manage IT solutions to improve visibility, automate workflows, and optimize IT operations.
- Work closely with onshore/offshore/cross-functional teams, providing ongoing support for the Annalect technology and business teams using existing or newly built tools and applications.
- Demonstrate a strong understanding of ad platforms such as Google Ads, Meta, TikTok, Amazon DSP, DV360, and The Trade Desk.
- Apply good QA skills to compare key advertising metrics (clicks, impressions, cost, etc.) between the platform and the destination data.
- Document, implement, and manage statistics to ensure the AOS team operates at a highly efficient and effective level.
- Monitor and handle incident response for infrastructure, platforms, and core engineering services.
- Troubleshoot infrastructure, network, and application issues; help identify and resolve problems within the environment.
- Work both independently and as a productive team member; lead team projects and activities using project management methodology. This position is expected to be roughly 40% process/procedure and 60% technology skills.
- Be available 24x7 for occasional technology/application-related issues.
- Support robots encountering issues by taking on tickets and performing root cause analysis; maintain and update documentation whenever changes are made to the robots.
- Be knowledgeable about, and able to support, RPA infrastructure and architecture.

Required Skills
- 5-7 years of relevant and progressive experience as a Technology Operations Analyst or in a similar role.
- Self-motivated and action-driven, with the ability to take initiative, execute, and follow through.
- Ability to clearly articulate technical and functional information to various audiences, both verbally and in writing.
- High degree of organizational skill and the ability to reprioritize based on business needs.
- Excellent written and verbal communication skills.
- Strong understanding of ad platform ecosystems, including campaign management, Ad Manager and Business Manager, tracking methodologies, data ingestion, and reporting workflows.
- Knowledge of ad operations, audience targeting, and attribution models.
- Proficiency in Excel, with a demonstrated ability to organize and consolidate multiple data sources for analysis.
- Critical thinking and problem-solving skills for technical and software-related issues.
- Experience with a single sign-on platform, including user and application setup/support.
- Good understanding of methodologies such as DevOps, CI/CD (continuous integration, continuous delivery), Agile/Kanban, and AWS.
- Good working knowledge of Microsoft tools (Office, SharePoint), CRM (JIRA, HubSpot), and reporting tools (Power BI, Tableau, etc.).
- Proficiency in SQL, Google BigQuery, and Starburst for querying and analyzing large datasets.
- Strong understanding of APIs and API troubleshooting.
- Exposure to generative AI models (e.g., OpenAI GPT-based models).

Posted 1 week ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Hyderabad

Work from Office

Key Responsibilities
Cloud Infrastructure Management:
- Design, deploy, and manage scalable and secure infrastructure on Google Cloud Platform (GCP).
- Implement best practices for GCP IAM, VPCs, Cloud Storage, ClickHouse, Apache Superset tool onboarding, and other GCP services.

Kubernetes and Containerization:
- Manage and optimize Google Kubernetes Engine (GKE) clusters for containerized applications.
- Implement Kubernetes best practices, including pod scaling, resource allocation, and security policies.

CI/CD Pipelines:
- Build and maintain CI/CD pipelines using tools like Cloud Build, Stratus, GitLab CI/CD, or ArgoCD.
- Automate deployment workflows for containerized and serverless applications.

Security and Compliance:
- Ensure adherence to security best practices for GCP, including IAM policies, network security, and data encryption.
- Conduct regular audits to ensure compliance with organizational and regulatory standards.

Collaboration and Support:
- Work closely with development teams to containerize applications and ensure smooth deployment on GCP.
- Provide support for troubleshooting and resolving infrastructure-related issues.

Cost Optimization:
- Monitor and optimize GCP resource usage to ensure cost efficiency.
- Implement strategies to reduce cloud spend without compromising performance.

Required Skills and Qualifications
Certifications:
- Must hold a Google Cloud Professional DevOps Engineer or Google Cloud Professional Cloud Architect certification.

Cloud Expertise:
- Strong hands-on experience with GCP services, including GKE, Cloud Functions, Cloud Storage, BigQuery, and Cloud Pub/Sub.

DevOps Tools:
- Proficiency in DevOps tools like Terraform, Ansible, Stratus, GitLab CI/CD, or Cloud Build.
- Experience with containerization tools like Docker.

Kubernetes Expertise:
- In-depth knowledge of Kubernetes concepts such as pods, deployments, services, ingress, config maps, and secrets.
- Familiarity with Kubernetes tools like kubectl, Helm, and Kustomize.

Programming and Scripting:
- Strong scripting skills in Python, Bash, or Go.
- Familiarity with YAML and JSON for configuration management.

Monitoring and Logging:
- Experience with monitoring tools like Prometheus, Grafana, or Google Cloud Operations Suite.

Networking:
- Understanding of cloud networking concepts, including VPCs, subnets, firewalls, and load balancers.

Soft Skills:
- Strong problem-solving and troubleshooting skills.
- Excellent communication and collaboration abilities.
- Ability to work in an agile, fast-paced environment.

Posted 1 week ago

Apply

8.0 - 10.0 years

9 - 13 Lacs

Bengaluru

Work from Office

What you'll be doing:
- Assist in developing machine learning models based on project requirements.
- Work with datasets by preprocessing, selecting appropriate data representations, and ensuring data quality.
- Perform statistical analysis and fine-tuning using test results.
- Support training and retraining of ML systems as needed.
- Help build data pipelines for collecting and processing data efficiently.
- Follow coding and quality standards while developing AI/ML solutions.
- Contribute to frameworks that help operationalize AI models.

What we seek in you:
- 8+ years of experience in the IT industry.
- Strong in programming languages like Python.
- Hands-on experience with one cloud (GCP preferred).
- Experience working with Docker.
- Environment management (e.g., venv, pip, poetry).
- Experience with orchestrators like Vertex AI Pipelines, Airflow, etc.
- Understanding of the full ML lifecycle end to end.
- Data engineering and feature engineering techniques.
- Experience with ML modelling and evaluation metrics.
- Experience with TensorFlow, PyTorch, or another framework.
- Experience with model monitoring.
- Advanced SQL knowledge.
- Awareness of streaming concepts such as windowing, late arrival, and triggers.
- Storage: CloudSQL, Cloud Storage, Cloud Bigtable, BigQuery, Cloud Spanner, Cloud Datastore, vector databases.
- Ingest: Pub/Sub, Cloud Functions, App Engine, Kubernetes Engine, Kafka, microservices.
- Scheduling: Cloud Composer, Airflow.
- Processing: Cloud Dataproc, Cloud Dataflow, Apache Spark, Apache Flink.
- CI/CD: Bitbucket + Jenkins / GitLab. Infrastructure as code: Terraform.

Life at Next:
At our core, we're driven by the mission of tailoring growth for our customers by enabling them to transform their aspirations into tangible outcomes. We're dedicated to empowering them to shape their futures and achieve ambitious goals. To fulfil this commitment, we foster a culture defined by agility, innovation, and an unwavering commitment to progress. Our organizational framework is both streamlined and vibrant, characterized by a hands-on leadership style that prioritizes results and fosters growth.

Perks of working with us:
- Clear objectives to ensure alignment with our mission, fostering your meaningful contribution.
- Abundant opportunities for engagement with customers, product managers, and leadership.
- Progressive career paths, with insightful guidance from managers through ongoing feedforward sessions.
- Cultivate and leverage robust connections within diverse communities of interest.
- Choose your mentor to navigate your current endeavors and steer your future trajectory.
- Continuous learning and upskilling opportunities through Nexversity.
- Flexibility to explore various functions, develop new skills, and adapt to emerging technologies.
- A hybrid work model promoting work-life balance.
- Comprehensive family health insurance coverage, prioritizing the well-being of your loved ones.
- Accelerated career paths to actualize your professional aspirations.

Who we are:
We enable high-growth enterprises to build hyper-personalized solutions that transform their vision into reality. With a keen eye for detail, we apply creativity, embrace new technology, and harness the power of data and AI to co-create solutions tailor-made to meet our customers' unique needs. Join our passionate team and tailor your growth with us!
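The streaming concepts the role lists (windowing, late arrival) can be sketched framework-free in plain Python. The event tuples, window size, and lateness cutoff below are invented for illustration; real pipelines would use Dataflow or Flink windowing:

```python
from collections import defaultdict

WINDOW = 10  # seconds per tumbling window (arbitrary choice for the sketch)

def window_sums(events, allowed_lateness=5):
    """events: (event_time, value) pairs in *arrival* order.

    Assigns each event to a tumbling window by its event time and drops
    events arriving more than `allowed_lateness` behind the latest
    timestamp seen -- a toy version of a streaming lateness trigger.
    """
    sums = defaultdict(float)
    max_seen = 0
    for t, v in events:
        max_seen = max(max_seen, t)
        if max_seen - t > allowed_lateness:
            continue  # too late: beyond the allowed-lateness cutoff
        sums[(t // WINDOW) * WINDOW] += v  # key by window start time
    return dict(sums)

# (3, 5.0) and (2, 1.0) arrive after time 12 and 25 respectively: dropped.
events = [(1, 2.0), (4, 3.0), (12, 1.0), (3, 5.0), (25, 4.0), (2, 1.0)]
print(window_sums(events))  # {0: 5.0, 10: 1.0, 20: 4.0}
```

Real engines refine this with watermarks and configurable triggers that can re-emit corrected window results, but the event-time-versus-arrival-time distinction shown here is the heart of the concept.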

Posted 1 week ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Gurugram

Hybrid

We are looking for a highly skilled engineer with solid experience building big data, GCP cloud-based marketing ODL applications. The engineer will play a crucial role in designing, implementing, and optimizing data solutions to support our organization's data-driven initiatives. This role requires expertise in data engineering, strong problem-solving abilities, and a collaborative mindset to work effectively with various stakeholders.

Technical Skills
1. Core data engineering skills
- Proficiency with GCP's big data tools: BigQuery for data warehousing and SQL analytics; Dataproc for running Spark and Hadoop clusters.
- Expertise in building automated, scalable, and reliable pipelines using custom Python/Scala solutions or Cloud Data Functions.
2. Programming and scripting
- Strong coding skills in SQL and Java.
- Familiarity with APIs and SDKs for GCP services to build custom data solutions.
3. Cloud infrastructure
- Understanding of GCP services such as Cloud Storage, Compute Engine, and Cloud Functions.
- Familiarity with Kubernetes (GKE) and containerization for deploying data pipelines (optional but good to have).
4. DevOps and CI/CD
- Experience setting up CI/CD pipelines using Cloud Build, GitHub Actions, or other tools.
- Monitoring and logging tools such as Cloud Monitoring and Cloud Logging for production workflows.

Soft Skills
1. Innovation and problem-solving
- Ability to think creatively and design innovative solutions for complex data challenges.
- Experience prototyping and experimenting with cutting-edge GCP tools or third-party integrations.
- Strong analytical mindset to transform raw data into actionable insights.
2. Collaboration
- Teamwork: ability to collaborate effectively with data analysts and business stakeholders.
- Communication: strong verbal and written skills to explain technical concepts to non-technical audiences.
3. Adaptability and continuous learning
- Open to exploring new GCP features and rapidly adapting to changes in cloud technology.

Posted 1 week ago

Apply

7.0 - 12.0 years

25 - 37 Lacs

Hyderabad, Pune

Work from Office

GCP Data Engineer (BigQuery + SQL + ETL knowledge + Python, Dataflow, Pub/Sub, CI/CD). Contact: KASHIF@D2NSOLUTIONS.COM

Posted 1 week ago

Apply

4.0 - 9.0 years

10 - 15 Lacs

Gurugram

Hybrid

Department Overview
automotiveMastermind provides U.S. automotive dealers with AI/behavior-prediction analytics software and marketing solutions that improve the vehicle purchase process and results. The company's cloud-based technology helps dealers precisely predict automobile-buying behavior and automates the creation of micro-targeted customer communications, leading to proven higher sales and more consistent customer retention.

Responsibilities:
- Work closely with Product Management and Data Strategy leadership to understand short- and long-term roadmaps and the overall data product strategy.
- Drive the backlog grooming agile sprint ceremony, acting as a bridge between business needs and technical implementation.
- Present on behalf of agile teams in sprint review, reiterating the business value delivered with each completed work increment.
- Develop expertise on the existing aM ecosystem of integrations and the data available within the system.
- Collaborate with data analysts, data management, data science, and engineering teams to develop short- and long-term solutions that meet business needs and solve distinct problems.
- Apply deep, creative, rigorous thinking to solve broad, platform-wide technical and/or business problems.
- Identify key value drivers and key opportunities for, and sources of, error across products and processes.
- Develop short-term preventive or detective measures, and lead medium/long-term product improvement initiatives arrived at via close collaboration with engineering, QA, and data support.
- Coordinate with data engineers as appropriate to design and enable repeatable processes and generate deliverables that answer routine business questions.

What We're Looking For:
Basic Required Qualifications:
- Minimum 4 years of working experience as a Product Owner or Product Manager in an Agile Scrum framework.
- Experience using data and analytical processes to drive decision-making, with the ability to explain to an executive audience how the analysis was done.
- Strong knowledge of the Agile development framework, with practical experience supporting flexible application of its principles.
- Strong conceptual understanding of data integration technologies and standards.
- Working familiarity with road-mapping and issue-tracking applications (Aha!, MS Azure DevOps, Salesforce).
- Familiarity with Microsoft Excel, SQL, BigQuery, MongoDB, and Postman preferred.
- An advocate for leveraging data, a supporter of data analysis in decision-making, and a fierce promoter of data and engineering best practices throughout the organization; passionate about empirical research.
- A team player comfortable working with a globally distributed team across time zones.
- A solid communicator, both with technology teams and with non-technical stakeholders.

Preferred:
- Experience with, or awareness of and interest in, dimensional data modeling concepts.
- B.Tech/M.Tech qualified.

Grade: 9
Location: Gurgaon
Hybrid mode: twice-a-week work from office
Shift time: 12 pm to 9 pm IST

Posted 1 week ago

Apply

4.0 - 9.0 years

10 - 17 Lacs

Ahmedabad

Work from Office

Job Summary
Key Responsibilities:
- Design, develop, and maintain interactive, user-friendly Power BI dashboards and reports.
- Translate business requirements into functional and technical specifications.
- Perform data modeling, DAX calculations, and Power Query transformations.
- Integrate data from multiple sources, including SQL Server, Excel, SharePoint, and APIs.
- Optimize Power BI datasets, reports, and dashboards for performance and usability.
- Collaborate with business analysts, data engineers, and stakeholders to ensure data accuracy and relevance.
- Ensure security and governance best practices in Power BI workspaces and datasets.
- Provide ongoing support and troubleshooting for existing Power BI solutions.
- Stay current with Power BI updates, best practices, and industry trends.

Required Skills & Qualifications:
- Bachelor's degree in Computer Science, Information Technology, Data Analytics, or a related field.
- 4+ years of professional experience in data analytics or business intelligence.
- 3+ years of hands-on experience with Power BI (Power BI Desktop, Power BI Service).
- Strong expertise in DAX, Power Query (M language), and data modeling (star/snowflake schema).
- Proficiency in writing complex SQL queries and optimizing them for performance.
- Experience working with large and complex datasets.
- Experience with BigQuery, MySQL, or Looker Studio is a plus.
- E-commerce industry experience is an added advantage.
- Solid understanding of data warehousing concepts and ETL processes.
- Experience with Power Apps and Power Automate would be a plus.

Preferred Qualifications:
- Microsoft Power BI certification (PL-300 or equivalent is a plus).
- Experience with Azure Data Services (Azure Data Factory, Azure SQL, Synapse).
- Knowledge of other BI tools (Tableau, Qlik) is a plus.
- Familiarity with scripting languages (Python, R) for data analysis is a bonus.
- Experience integrating Power BI into web portals using Power BI Embedded.
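The star-schema shape this role calls for (one central fact table joined to dimension tables) can be shown with a tiny stdlib sketch. The `fact_sales`/`dim_product` names and data are hypothetical; Power BI builds the same shape in its own model layer:

```python
import sqlite3

# Minimal star schema: a fact table of transactions plus one dimension
# table describing products (hypothetical names and data).
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales (product_id INTEGER REFERENCES dim_product, qty INTEGER);
INSERT INTO dim_product VALUES (1, 'Books'), (2, 'Games');
INSERT INTO fact_sales VALUES (1, 3), (2, 5), (1, 2);
""")

# A report query: slice the fact measure by a dimension attribute,
# which is what a DAX measure grouped by a dimension column does.
rows = db.execute("""
    SELECT d.category, SUM(f.qty)
    FROM fact_sales f JOIN dim_product d USING (product_id)
    GROUP BY d.category ORDER BY d.category
""").fetchall()
print(rows)  # [('Books', 5), ('Games', 5)]
```

Keeping measures in narrow fact tables and descriptive attributes in dimensions is what lets BI engines aggregate quickly and filter cheaply.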

Posted 1 week ago

Apply

2.0 - 6.0 years

4 - 8 Lacs

Gurugram

Work from Office

About the Role: Grade Level (for internal use): 09 The Team: Automotive Mastermind was founded on the idea that there are patterns in peoples behavior that, with the right logic, can be used to predict future outcomes.Our software helps automotive dealerships and sales teams better understand and predict exactly which customers are ready to buy, the reasons why, and the key offers and incentives most likely to close the sale.Our culture is creative and entrepreneurial where everyone contributes to company goals in a very real way. We are a hardworking group, but we have a lot of fun with what we do and are looking for new people with a similar mindset to join the organization. The Impact: As a Quality Engineer you will collaborate with members of both, Product and Development Teams to help them make informed decisions on releases of one of the best tools there is for car dealerships in the United States. Whats in it for you: Possibility to work on a project in a very interesting domain - Automotive industry in the United States, and influence the quality of one of the best tools there is for car dealerships. Affect processes and tools used for Quality Engineering. Our Team has a high degree of autonomy in automotive Mastermind organization to decide what tools and processes we will use. Responsibilities: Own and be responsible for testing and delivery of product or core modules. Assessing the quality, usability and functionality of each release. 
- Review software requirements and prepare test scenarios for complex business rules.
- Interact with stakeholders to understand detailed requirements and expectations.
- Gain technical knowledge and aim to be a quality SME in core functional components.
- Develop and organize QA processes for assigned projects to align with overall QA goals.
- Design and implement a test automation strategy supporting multiple product development teams.
- Lead efforts for related automation projects, design reviews, and code reviews.
- Produce regular reports on the status and quality of software releases, and be prepared to speak to findings in an informative way to all levels of audiences.

What We're Looking For:
- Participate in and improve the whole lifecycle of services, from inception and design through deployment, operation, and refinement.
- Participate in the release planning process to review functional specifications and create release plans.
- Collaborate with software engineers to design verification test plans.
- Design regression test suites and review them with engineering, applications, and the field organization.
- Produce regular reports on the status and quality of software releases, and be prepared to speak to findings in an informative way to all levels of audience.
- Assess the quality, usability, and functionality of each release.
- Develop and organize QA processes for assigned projects to align with overall QA goals.
- Lead and train a dynamically changing team of colleagues who participate in testing processes.
- Exhibit expertise in handling large-scale programs/projects that involve multiple stakeholders (Product, Dev, DevOps).
- Maintain a leading-edge understanding of QA best practices as they relate to interactive technologies.
- Design and implement a test automation strategy for multiple product development teams at the onset of the project.
- Work closely with leadership and IT to provide input into the design and implementation of the automation framework.
- Work with Architecture, Engineering, Quality Engineering, IT, and Product Operations leaders to create and implement processes that accelerate the delivery of new features and products with high quality and at scale.
- Develop and contribute to a culture of high performance, transparency, and continuous improvement as it relates to infrastructure services and streamlining of the development pipeline.
- Participate in a diverse team of talented engineers globally, providing guidance, support, and clear priorities.

Who you are:
- Total experience: 2 to 6 years.
- Hands-on experience with at least two leading testing tools/frameworks such as Playwright, Robot Framework, k6, or JMeter.
- Hands-on experience working with Python.
- Experience with SQL/NoSQL databases.
- Experience working on cloud-native applications.
- Hands-on experience with Google Cloud services such as Kubernetes, Composer, Dataplex, Pub/Sub, BigQuery, AlloyDB, Cloud SQL, and Looker Studio.
- Strong analytical skills and the ability to solve complex technical problems.
- API testing: must have an understanding of RESTful design and best practices, with hands-on experience testing APIs and using test tools.
- Experience with load/stress/performance testing and tools.
- Experience with Azure DevOps (or other similar issue/bug tracking systems) is required.
- Ability to think abstractly; conforming to the norm does not find bugs quickly.
- Experience working in an Agile software development organization.
- Experience supporting development and product teams.
- Excellent verbal, written, and interpersonal communication skills; ability to interact with all levels of an organization.
- Ability to work in an advisory capacity to identify key technical and business problems, and to develop and evaluate solutions.
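As a sketch of the RESTful API testing this role describes, the assertion style below validates a response contract against a canned JSON payload. The endpoint, field names, and values are hypothetical; a real suite would fetch the body through an HTTP client or Playwright's API request context:

```python
import json

# Canned response body, standing in for GET /api/v1/dealers/42 (hypothetical endpoint).
raw = '{"id": 42, "name": "Main Street Motors", "active": true, "score": 0.87}'

def check_dealer_contract(body: str) -> dict:
    """Assert that the response satisfies the expected schema and value ranges."""
    payload = json.loads(body)
    assert isinstance(payload["id"], int) and payload["id"] > 0
    assert isinstance(payload["name"], str) and payload["name"]
    assert isinstance(payload["active"], bool)
    assert 0.0 <= payload["score"] <= 1.0  # assume scores are normalized to [0, 1]
    return payload

dealer = check_dealer_contract(raw)
print(dealer["name"])  # Main Street Motors
```

Contract checks like this catch schema drift between releases before it reaches downstream consumers.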
Grade: 08 / 09
Job Location: Gurugram
Hybrid Mode: twice a week work from office.
Shift Time: 12 pm to 9 pm IST.

Posted 1 week ago


5.0 - 10.0 years

20 - 35 Lacs

Hyderabad

Work from Office


Greetings from TechnoGen! Thank you for taking the time to share your competencies and skills, and for allowing us an opportunity to tell you about TechnoGen. We understand that your experience and expertise are relevant to the current opening with our client.

TechnoGen Brief Overview: TechnoGen, Inc. is an ISO 9001:2015, ISO 20000-1:2011, ISO 27001:2013, and CMMI Level 3 global IT services company headquartered in Chantilly, Virginia. TechnoGen, Inc. (TGI) is a Minority- and Women-Owned Small Business with over 20 years of experience providing end-to-end IT services and solutions to the public and private sectors. TGI provides highly skilled and certified professionals and has successfully executed more than 345 projects. TechnoGen is committed to helping our clients solve complex problems and achieve their goals, on time and under budget.
LinkedIn: https://www.linkedin.com/company/technogeninc/about/

Job Title: Senior Data Engineer
Location: Hyderabad
Required Experience: 5+ years

Job Summary: Seeking a Senior Data Engineer to design and optimize scalable data pipeline architectures and support analytics needs across cross-functional teams.
Key Responsibilities:
- Design, build, and maintain data pipelines (ETL/ELT) using BigQuery, Python, and SQL.
- Optimize data flow, automate processes, and scale infrastructure.
- Develop and manage workflows in Airflow/Cloud Composer and Ascend (or similar ETL tools).
- Implement data quality checks and testing strategies.
- Support CI/CD (DevSecOps) processes, conduct code reviews, and mentor junior engineers.
- Collaborate with QA/business teams and troubleshoot issues across environments.

Core Skills:
- BigQuery, Python, SQL, Collibra, Airflow/Cloud Composer, Ascend or similar ETL tools.
- Data integration, warehousing, and pipeline orchestration.
- Data quality frameworks and incremental load strategies.
- Strong experience with GCP or AWS serverless data warehouse environments.

Preferred Skills:
- dbt for transformation.
- Collibra for data governance.
- Working with unstructured datasets.

Qualifications:
- 5+ years in data engineering.
- Graduate degree in CS, Statistics, or a related field.
- Strong analytical and SQL expertise.

Best Regards,
Rampelli Kiran Kumar | Sr. IT Recruiter
kiran.r@technogenindia.com
www.technogenindia.com | Follow us on LinkedIn
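The incremental load strategies and data quality checks this role calls for can be sketched in plain Python. Table shapes and column names here are invented for illustration; in practice the watermark filter would be a BigQuery WHERE clause and the watermark itself would live in pipeline state:

```python
from datetime import date

# Rows already loaded into the warehouse, and a new source extract.
target = [{"id": 1, "updated": date(2024, 1, 1), "amount": 10.0}]
source = [
    {"id": 1, "updated": date(2024, 1, 1), "amount": 10.0},   # unchanged, skipped
    {"id": 2, "updated": date(2024, 1, 5), "amount": 25.0},   # new, loaded
    {"id": 3, "updated": date(2024, 1, 6), "amount": None},   # new, fails the quality gate
]

def incremental_load(target_rows, source_rows):
    """Append only rows newer than the current watermark, after a quality check."""
    watermark = max(r["updated"] for r in target_rows)
    fresh = [r for r in source_rows if r["updated"] > watermark]
    # Simple data quality gate: reject rows with a missing measure.
    good = [r for r in fresh if r["amount"] is not None]
    rejected = len(fresh) - len(good)
    return target_rows + good, rejected

loaded, rejected = incremental_load(target, source)
print(len(loaded), rejected)  # 2 1
```

Scanning only rows past the watermark is what keeps incremental loads cheap relative to full reloads.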

Posted 1 week ago


7.0 - 12.0 years

9 - 14 Lacs

Ahmedabad

Work from Office


Project Role: Program/Project Management Lead
Project Role Description: Lead business and technology outcomes for the assigned program, project, or contracted service. Leverage standard tools, methodologies, and processes to deliver, monitor, and control service level agreements.
Must-have skills: Google Cloud Platform Architecture
Good-to-have skills: Google Kubernetes Engine
Minimum 7.5 year(s) of experience is required.
Educational Qualification: M Tech

Summary: As a Program/Project Management Lead, you will be responsible for leading business and technology outcomes for assigned programs, projects, or contracted services. Your typical day will involve leveraging standard tools, methodologies, and processes to deliver, monitor, and control service level agreements.

Roles & Responsibilities:
- Lead the planning and execution of programs and projects, ensuring adherence to timelines, budgets, and quality standards.
- Collaborate with cross-functional teams to identify and prioritize project requirements, risks, and dependencies.
- Develop and maintain project plans, status reports, and other project-related documentation.
- Manage project budgets, forecasts, and financial reporting, ensuring accurate and timely delivery of financial information.
- Provide leadership and guidance to project team members, ensuring effective communication and collaboration throughout the project lifecycle.

Professional & Technical Skills:
- Must-have: Expertise in Google Cloud Platform Architecture.
- Good-to-have: Experience with Google Kubernetes Engine.
- Strong understanding of program and project management methodologies and tools.
- Experience in managing large-scale, complex programs and projects.
- Excellent communication, leadership, and stakeholder management skills.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Google Cloud Platform Architecture.
- The ideal candidate will possess a strong educational background in computer science, engineering, or a related field, along with a proven track record of delivering successful programs and projects.
- This position is based at our Bengaluru office.

Qualifications: M Tech

Posted 1 week ago


13.0 - 18.0 years

13 - 17 Lacs

Bengaluru

Work from Office


Skill required: Tech for Operations - Agile Project Management
Designation: AI/ML Computational Science Manager
Qualifications: Any Graduation
Years of Experience: 13 to 18 years

What would you do? You will be part of the Technology for Operations (TFO) team that acts as a trusted advisor and partner to Accenture Operations. The team provides innovative and secure technologies to help clients build an intelligent operating model, driving exceptional results, and works closely with the sales, offering, and delivery teams to identify and build innovative solutions. Major sub-deals include AHO (Application Hosting Operations), ISMT (Infrastructure Management), and Intelligent Automation.

Agile project management is an iterative, incremental method of managing the design and build activities of engineering, information technology, and other business areas, aiming to provide new product or service development in a highly flexible and interactive manner. It requires individuals and interactions from the relevant business to respond to change, customer collaboration, and management openness to non-hierarchical forms of leadership.

What are we looking for?
- Results orientation
- Problem-solving skills
- Ability to perform under pressure
- Strong analytical skills
- Written and verbal communication

Roles and Responsibilities:
- In this role you are required to identify and assess complex problems for your area of responsibility.
- You would create solutions in situations in which analysis requires an in-depth evaluation of variable factors.
- The role requires adherence to the strategic direction set by senior management when establishing near-term goals.
- Your interaction is with senior management at a client and/or within Accenture, involving matters that may require acceptance of an alternate approach.
- Some latitude in decision-making is involved; you will act independently to determine methods and procedures on new assignments.
- Decisions made at this level have a major day-to-day impact on the area of responsibility.
- You will manage large to medium-sized teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.
- Please note that this role may require you to work in rotational shifts.

Qualification: Any Graduation

Posted 1 week ago


5.0 - 10.0 years

7 - 12 Lacs

Mumbai

Work from Office


Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Google BigQuery
Good-to-have skills: Teradata BI
Minimum 5 year(s) of experience is required.
Educational Qualification: minimum 15 years full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead application development projects.
- Conduct code reviews and ensure coding standards are met.

Professional & Technical Skills:
- Must-have: Proficiency in Google BigQuery.
- Strong understanding of data warehousing concepts.
- Experience with cloud-based data platforms.
- Hands-on experience in SQL and database management.
- Good-to-have: Experience with Teradata BI.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Google BigQuery.
- This position is based at our Mumbai office.
- A minimum of 15 years of full-time education is required.

Qualifications: minimum 15 years full-time education

Posted 1 week ago


5.0 - 10.0 years

7 - 12 Lacs

Pune

Work from Office


Project Role: Cloud Services Engineer
Project Role Description: Act as a liaison between the client and Accenture operations teams for support and escalations. Communicate service delivery health to all stakeholders and explain any performance issues or risks. Ensure the cloud orchestration and automation capability operates to target SLAs with minimal downtime. Hold performance meetings to share performance and consumption data and trends.
Must-have skills: Managed File Transfer
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As a Cloud Services Engineer, you will act as a liaison between the client and Accenture operations teams for support and escalations. You will communicate service delivery health to all stakeholders and explain any performance issues or risks, ensure the cloud orchestration and automation capability operates to target SLAs with minimal downtime, and hold performance meetings to share performance and consumption data and trends.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Ensure effective communication between client and operations teams.
- Analyze service delivery health and address performance issues.
- Conduct performance meetings to share data and trends.

Professional & Technical Skills:
- Must-have: Proficiency in Managed File Transfer.
- Strong understanding of cloud orchestration and automation.
- Experience in SLA management and performance analysis.
- Knowledge of IT service delivery and escalation processes.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Managed File Transfer.
- This position is based at our Pune office.
- 15 years of full-time education is required.

Qualifications: 15 years full-time education

Posted 1 week ago


7.0 - 11.0 years

9 - 13 Lacs

Bengaluru

Work from Office


Skill required: Delivery - Customer Insight & Marketing Analytics
Designation: I&F Decision Sci Practitioner Specialist
Qualifications: Any Graduation
Years of Experience: 7 to 11 years

What would you do? Data & AI: the process by which data on customer behavior is used to help make key business decisions via market segmentation and predictive analytics. This information is used by businesses for direct marketing, site selection, and customer relationship management.

What are we looking for?
- Data analytics, with a specialization in the marketing domain.
- Ability and experience working with paid media, CRM, and digital advertising analytics.
- Website clickstream data and GA4 knowledge.
- Highly experienced with SQL, Python, and BigQuery for exploring large datasets.
- Data storytelling; familiarity with Tableau and Looker is a plus.
- Problem-solving skills.
- Ability to establish strong client relationships.
- Ability to manage multiple stakeholders.

Roles and Responsibilities:
- In this role you are required to analyze and solve moderately complex problems.
- You may create new solutions, leveraging and, where needed, adapting existing methods and procedures.
- The role requires understanding the strategic direction set by senior management as it relates to team goals.
- Your primary upward interaction is with your direct supervisor; you may interact with peers and/or management levels at a client and/or within Accenture.
- Guidance would be provided when determining methods and procedures on new assignments.
- Decisions you make will often impact the team in which you reside.
- You would manage small teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.
- Please note that this role may require you to work in rotational shifts.

Qualifications: Any Graduation
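A toy sketch of the market segmentation such a role performs: scoring customers on recency and spend and bucketing them into segments. The thresholds and labels are invented for illustration; production work would typically run this as SQL over BigQuery:

```python
def segment(customers):
    """Assign a coarse segment from days-since-last-purchase and total spend."""
    out = {}
    for name, (recency_days, spend) in customers.items():
        if recency_days <= 30 and spend >= 500:
            out[name] = "champion"   # recent and high-value
        elif recency_days <= 90:
            out[name] = "active"     # recent enough to retain cheaply
        else:
            out[name] = "lapsed"     # candidate for win-back campaigns
    return out

customers = {"a": (10, 900), "b": (45, 120), "c": (200, 60)}
print(segment(customers))  # {'a': 'champion', 'b': 'active', 'c': 'lapsed'}
```

Each segment then maps to a different direct-marketing treatment, which is the decision the analytics feeds.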

Posted 1 week ago


5.0 - 8.0 years

7 - 10 Lacs

Bengaluru

Work from Office


Skill required: Delivery - Marketing Analytics and Reporting
Designation: I&F Decision Sci Practitioner Sr Analyst
Qualifications: Any Graduation
Years of Experience: 5 to 8 years

What would you do? Data & AI: analytical processes and technologies applied to marketing-related data to help businesses understand and deliver relevant experiences for their audiences, understand their competition, measure and optimize marketing campaigns, and optimize their return on investment.

What are we looking for? Data analytics, with a specialization in the marketing domain.

Domain-specific skills:
- Familiarity with ad tech and B2B sales.

Technical skills:
- Proficiency in SQL and Python.
- Experience in efficiently building, publishing, and maintaining robust data models and warehouses for self-serve querying and advanced data science and ML analytic purposes.
- Experience in conducting ETL/ELT with very large and complicated datasets and handling DAG data dependencies.
- Strong proficiency with SQL dialects on distributed or data-lake-style systems (Presto, BigQuery, Spark/Hive SQL, etc.), including SQL-based experience in nested data structure manipulation, windowing functions, query optimization, data partitioning techniques, etc. Knowledge of Google BigQuery optimization is a plus.
- Experience in schema design and data modeling strategies (e.g., dimensional modeling, data vault, etc.).
- Significant experience with dbt (or similar tools) and Spark-based (or similar) data pipelines.
- General knowledge of Jinja templating in Python.
- Hands-on experience with cloud provider integration and automation via CLIs and APIs.

Soft skills:
- Ability to work well in a team.
- Agility for quick learning.
- Written and verbal communication.

Roles and Responsibilities:
- In this role you are required to analyze and solve increasingly complex problems.
- Your day-to-day interactions are with peers within Accenture; you are likely to have some interaction with clients and/or Accenture management.
- You will be given minimal instruction on daily work/tasks and a moderate level of instruction on new assignments.
- Decisions that you make impact your own work and may impact the work of others.
- In this role you would be an individual contributor and/or oversee a small work effort and/or team.
- Please note that this role may require you to work in rotational shifts.

Qualifications: Any Graduation
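The windowing functions named above, for example a running total per partition as SUM(amount) OVER (PARTITION BY account ORDER BY day) would compute, can be mimicked in plain Python to show the semantics. The column names are hypothetical:

```python
from itertools import groupby
from operator import itemgetter

rows = [  # (account, day, amount) -- deliberately unordered input
    ("acme", 2, 30.0), ("acme", 1, 10.0), ("beta", 1, 5.0), ("acme", 3, 7.0),
]

def running_total_per_partition(data):
    """SUM(amount) OVER (PARTITION BY account ORDER BY day), row by row."""
    out = []
    data = sorted(data, key=itemgetter(0, 1))        # partition key, then order key
    for account, grp in groupby(data, key=itemgetter(0)):
        total = 0.0                                   # the frame resets per partition
        for _, day, amount in grp:
            total += amount
            out.append((account, day, total))
    return out

print(running_total_per_partition(rows))
# [('acme', 1, 10.0), ('acme', 2, 40.0), ('acme', 3, 47.0), ('beta', 1, 5.0)]
```

Unlike GROUP BY, a window function keeps one output row per input row, which is why the per-row running total is expressible at all.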

Posted 1 week ago


5.0 - 10.0 years

7 - 12 Lacs

Coimbatore

Work from Office


Project Role: Infrastructure Architect
Project Role Description: Lead the definition, design, and documentation of technical environments. Deploy solution architectures, conduct analysis of alternative architectures, create architectural standards, define processes to ensure conformance with standards, institute solution-testing criteria, define a solution's cost of ownership, and promote a clear and consistent business vision through technical architectures.
Must-have skills: Microsoft 365 Security & Compliance
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As a Cloud Services Engineer in the Security Delivery job family group, you will be responsible for ensuring the cloud orchestration and automation capability operates to target SLAs with minimal downtime. Your typical day will involve acting as a liaison between the client and Accenture operations teams for support and escalations, communicating service delivery health to all stakeholders, and holding performance meetings to share performance and consumption data and trends.

Roles & Responsibilities:
- Act as a liaison between the client and Accenture operations teams for support and escalations.
- Communicate service delivery health to all stakeholders and explain any performance issues or risks.
- Ensure the cloud orchestration and automation capability operates to target SLAs with minimal downtime.
- Hold performance meetings to share performance and consumption data and trends.

Professional & Technical Skills:
- Must-have: Experience in Microsoft 365 Security & Compliance, including Defender for O365, Defender for Identity, Defender for Endpoint, Defender for Cloud Apps, Defender for Cloud, Microsoft Purview, DLP, eDiscovery, Microsoft Priva, and Microsoft Sentinel.
- Good-to-have: Experience in cloud orchestration and automation.
- Strong understanding of cloud technologies and security principles.
- Experience in managing and monitoring cloud infrastructure.
- Experience in incident management and problem resolution.

Additional Information:
- The candidate should have a minimum of 6 years of experience in Microsoft 365 Security & Compliance.

Qualifications: 15 years full-time education

Posted 1 week ago


5.0 - 10.0 years

12 - 17 Lacs

Hyderabad, Pune

Hybrid


Role 1 - GCP Data Engineer. Must-have/mandatory skills: GCP, BigQuery, Dataflow, Cloud Composer
Role 2 - Big Data Engineer. Must-have/mandatory skills: Big Data, PySpark, Scala, Python
Role 3 - GCP DevOps Engineer. Must-have/mandatory skills: GCP DevOps

Experience Range – 5+ years
Location – Only Pune & Hyderabad; if you are applying from outside Pune or Hyderabad, you will have to relocate to Pune or Hyderabad.
Work Mode – A minimum of 2 days of work from home is mandatory.
Salary – 12-16 LPA

Points to remember:
- Please fill in the Candidate Summary Sheet.
- Not considering more than 30 days' notice period.

Highlights of this role:
- It's a long-term role.
- High possibility of conversion within 6 months, or after 6 months if you perform well.
- Interview: two rounds in total (both virtual), but one face-to-face meeting is mandatory at any of these locations: Pune/Hyderabad/Bangalore/Chennai.
- UAN verification will be done in the background check; any overlap in past employment will eliminate you.
- Continuous PF deduction for the last 4 years is mandatory.
- One face-to-face meeting is mandatory; otherwise we can't onboard you.

Client Company – one of the leading technology consulting firms.
Payroll Company – one of the leading IT services & staffing companies, with a presence in India, UK, Europe, Australia, New Zealand, US, Canada, Singapore, Indonesia, and the Middle East.

Do not change the subject line or create a new email while sharing/applying for this position; please reply on this email thread only.

Role 1 - GCP Data Engineer
Must-have/mandatory skills – GCP, BigQuery, Dataflow, Cloud Composer
About the Role: We are seeking a highly skilled and passionate GCP Data Engineer to join our growing data team. In this role, you will be instrumental in designing, building, and maintaining scalable and robust data pipelines and solutions on Google Cloud Platform (GCP).
You will work closely with data scientists, analysts, and other stakeholders to translate business requirements into efficient data architectures, enabling data-driven decision-making across the organization.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related quantitative field.
- A minimum of 5 years of experience (e.g., 3-8 years) as a Data Engineer, with a strong focus on Google Cloud Platform (GCP).
- Mandatory hands-on experience with core GCP data services:
  - BigQuery (advanced SQL, data modeling, query optimization)
  - Dataflow (Apache Beam, Python/Java SDK)
  - Cloud Composer / Apache Airflow for workflow orchestration
  - Cloud Storage (GCS)
  - Cloud Pub/Sub for messaging/streaming
- Strong programming skills in Python (preferred) or Java/Scala for data manipulation and pipeline development.
- Proficiency in SQL and experience with relational and NoSQL databases.
- Experience with data warehousing concepts, ETL/ELT processes, and data modeling techniques.
- Understanding of distributed systems and big data technologies (e.g., Spark, Hadoop concepts, Kafka).
- Familiarity with CI/CD practices and tools.

Role 2 - Big Data Engineer
Must-have/mandatory skills – Big Data, PySpark, Scala, Python
About the Role: We are looking for an experienced and passionate Big Data Engineer to join our dynamic team. In this role, you will be responsible for designing, building, and maintaining scalable, high-performance data processing systems and pipelines capable of handling vast volumes of structured and unstructured data. You will play a crucial role in enabling our data scientists, analysts, and business teams to derive actionable insights from complex datasets.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
- A minimum of 5 years of proven experience as a Big Data Engineer or in a similar role.
- Extensive hands-on experience with Apache Spark (PySpark, Scala) for data processing.
- Strong expertise in the Hadoop ecosystem (HDFS, Hive, MapReduce).
- Proficiency in Python and/or Scala/Java.
- Solid SQL skills and experience with relational databases.
- Experience designing and building complex ETL/ELT pipelines.
- Familiarity with data warehousing concepts and data modeling techniques (star schema, snowflake, data vault).
- Understanding of distributed computing principles.
- Excellent problem-solving, analytical, and communication skills.

Role 3 - GCP DevOps Engineer
Must-have/mandatory skills – GCP DevOps
We are seeking a highly motivated and experienced GCP DevOps Engineer to join our innovative engineering team. You will be responsible for designing, implementing, and maintaining robust, scalable, and secure cloud infrastructure and automation pipelines on Google Cloud Platform (GCP). This role involves working closely with development, operations, and QA teams to streamline the software delivery lifecycle, enhance system reliability, and promote a culture of continuous improvement.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
- 5 years of experience in a DevOps or SRE role, with significant hands-on experience on Google Cloud Platform (GCP).
- Strong expertise in core GCP services relevant to DevOps: Compute Engine, GKE, Cloud SQL, Cloud Storage, VPC, Cloud Load Balancing, IAM.
- Proficiency with Infrastructure as Code (IaC) tools, especially Terraform.
- Extensive experience in designing and implementing CI/CD pipelines using tools like Cloud Build, Jenkins, or GitLab CI.
- Hands-on experience with containerization (Docker) and container orchestration (Kubernetes/GKE).
- Strong scripting skills in Python and Bash/shell.
- Experience with monitoring and logging tools (Cloud Monitoring, Prometheus, Grafana, ELK stack).
- Solid understanding of networking concepts (TCP/IP, DNS, load balancers, VPNs) in a cloud environment.
- Familiarity with database concepts and experience managing cloud databases (e.g., Cloud SQL, Firestore).

*** Mandatory to share: Candidate Summary Sheet ***
Interested parties can share their resume at shant@harel-consulting.com along with the details below.

- Applying for which role (please mention the role name) –
- Your Name –
- Contact No. –
- Email ID –
- Do you have a valid passport –
- Total Experience –

Role 1:
- Experience in GCP –
- Experience in BigQuery –
- Experience in Dataflow –
- Experience in Cloud Composer –
- Experience in Apache Airflow –
- Experience in Python or Java or Scala, and how much –

Role 2:
- Experience in Big Data –
- Experience in Hive –
- Experience in Python or Java or Scala, and how much –
- Experience in PySpark –

Role 3:
- Experience in GCP DevOps –
- Experience in Python –

- Current CTC –
- Expected CTC –
- What is your notice period in your current company –
- Are you currently working or not –
- If not working, when did you leave your last company –
- Current location –
- Preferred location –
- It's a contract-to-hire (C2H) role; are you OK with that –
- Highest qualification –
- Current employer (payroll company name) –
- Previous employer (payroll company name) –
- 2nd previous employer (payroll company name) –
- 3rd previous employer (payroll company name) –
- Are you holding any offer –
- Are you expecting any offer –
- PF deduction is happening in current company –
- PF deduction happened with 2nd-last employer –
- PF deduction happened with 3rd-last employer –
- Latest photo –

If you are working with a company whose employee strength is less than 2,000 employees, then it is mandatory to share your UAN service history.

BR
Shantpriya
Harel Consulting
shant@harel-consulting.com
9582718094

Posted 1 week ago
