Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
5.0 - 7.0 years
7 - 9 Lacs
Bengaluru
Work from Office
Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations; able to analyse data for functional business requirements and interface directly with customers. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work. Preferred technical and professional experience: Intuitive individual with an ability to manage change and proven time management. Proven interpersonal skills while contributing to team effort by accomplishing related results as needed. Keeps technical knowledge up to date by attending educational workshops and reviewing publications.
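The analyst work described above centers on SQL over BigQuery. As a runnable illustration of that kind of aggregation query, here is a minimal sketch; BigQuery has its own Standard SQL dialect and client library, so sqlite3 stands in here purely so the example executes locally, and the table and columns are hypothetical:

```python
# Runnable stand-in for analyst-style SQL work. BigQuery uses its own
# Standard SQL dialect and client; sqlite3 is used only so this sketch
# runs locally. The orders table and its columns are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("south", 120.0), ("north", 80.0), ("south", 40.0)],
)

# Aggregate revenue per region - the shape of query an analyst would
# run in BigQuery over data landed from GCS.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('north', 80.0), ('south', 160.0)]
```

The same GROUP BY statement would run unchanged in the BigQuery console against a real dataset.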
Posted 19 hours ago
5.0 - 7.0 years
7 - 9 Lacs
Gurugram
Work from Office
Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations; able to analyse data for functional business requirements and interface directly with customers. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work. Preferred technical and professional experience: Intuitive individual with an ability to manage change and proven time management. Proven interpersonal skills while contributing to team effort by accomplishing related results as needed. Keeps technical knowledge up to date by attending educational workshops and reviewing publications.
Posted 19 hours ago
5.0 - 7.0 years
13 - 17 Lacs
Bengaluru
Work from Office
Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations; able to analyse data for functional business requirements and interface directly with customers. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work. Preferred technical and professional experience: Intuitive individual with an ability to manage change and proven time management. Proven interpersonal skills while contributing to team effort by accomplishing related results as needed. Keeps technical knowledge up to date by attending educational workshops and reviewing publications.
Posted 4 days ago
8.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
What you’ll be doing: Assist in developing machine learning models based on project requirements. Work with datasets by preprocessing, selecting appropriate data representations, and ensuring data quality. Perform statistical analysis and fine-tuning using test results. Support training and retraining of ML systems as needed. Help build data pipelines for collecting and processing data efficiently. Follow coding and quality standards while developing AI/ML solutions. Contribute to frameworks that help operationalize AI models. What we seek in you: 8+ years of experience in the IT industry. Strong in programming languages such as Python. Hands-on experience with at least one cloud (GCP preferred). Experience working with Docker. Environment management (e.g. venv, pip, poetry). Experience with orchestrators such as Vertex AI Pipelines, Airflow, etc. Understanding of the full ML lifecycle end-to-end. Data engineering and feature engineering techniques. Experience with ML modelling and evaluation metrics. Experience with TensorFlow, PyTorch, or another framework. Experience with model monitoring. Advanced SQL knowledge. Aware of streaming concepts such as windowing, late arrival, and triggers. Storage: Cloud SQL, Cloud Storage, Cloud Bigtable, BigQuery, Cloud Spanner, Cloud Datastore, vector databases. Ingest: Pub/Sub, Cloud Functions, App Engine, Kubernetes Engine, Kafka, microservices. Scheduling: Cloud Composer, Airflow. Processing: Cloud Dataproc, Cloud Dataflow, Apache Spark, Apache Flink. CI/CD: Bitbucket + Jenkins / GitLab. Infrastructure as Code: Terraform. Life at Next: At our core, we're driven by the mission of tailoring growth for our customers by enabling them to transform their aspirations into tangible outcomes. We're dedicated to empowering them to shape their futures and achieve ambitious goals. To fulfil this commitment, we foster a culture defined by agility, innovation, and an unwavering commitment to progress.
Our organizational framework is both streamlined and vibrant, characterized by a hands-on leadership style that prioritizes results and fosters growth. Perks of working with us: Clear objectives to ensure alignment with our mission, fostering your meaningful contribution. Abundant opportunities for engagement with customers, product managers, and leadership. You'll be guided by progressive paths while receiving insightful guidance from managers through ongoing feedforward sessions. Cultivate and leverage robust connections within diverse communities of interest. Choose your mentor to navigate your current endeavors and steer your future trajectory. Embrace continuous learning and upskilling opportunities through Nexversity. Enjoy the flexibility to explore various functions, develop new skills, and adapt to emerging technologies. Embrace a hybrid work model promoting work-life balance. Access comprehensive family health insurance coverage, prioritizing the well-being of your loved ones. Embark on accelerated career paths to actualize your professional aspirations. Who we are: We enable high-growth enterprises to build hyper-personalized solutions that transform their vision into reality. With a keen eye for detail, we apply creativity, embrace new technology, and harness the power of data and AI to co-create solutions tailor-made to meet our customers' unique needs. Join our passionate team and tailor your growth with us!
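The streaming concepts listed in this role (windowing, late arrival, triggers) can be sketched in a few lines of plain Python. This is an illustrative toy with made-up window sizes, not production code; a real pipeline would use a framework such as Apache Beam on Dataflow or Flink:

```python
# Toy tumbling-window aggregation with an allowed-lateness cutoff.
# WINDOW_SIZE and ALLOWED_LATENESS are hypothetical values.
WINDOW_SIZE = 60          # 1-minute tumbling windows (seconds)
ALLOWED_LATENESS = 30     # seconds past window end before events are dropped

def window_start(event_time: int) -> int:
    """Map an event timestamp to the start of its tumbling window."""
    return event_time - (event_time % WINDOW_SIZE)

def aggregate(events, watermark: int):
    """Count events per window, dropping events whose window closed
    more than ALLOWED_LATENESS seconds before the watermark."""
    counts = {}
    for event_time, _value in events:
        start = window_start(event_time)
        window_end = start + WINDOW_SIZE
        if watermark - window_end > ALLOWED_LATENESS:
            continue  # too late: window closed and lateness budget exhausted
        counts[start] = counts.get(start, 0) + 1
    return counts

events = [(5, "a"), (61, "b"), (62, "c"), (10, "late")]
print(aggregate(events, watermark=100))  # {60: 2}: window [0, 60) already closed
```

With an earlier watermark of 80, window [0, 60) is still within its lateness budget, so the same call returns counts for both windows.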
Posted 4 days ago
3.0 - 5.0 years
5 - 8 Lacs
Bengaluru
Work from Office
What you’ll be doing: Assist in developing machine learning models based on project requirements. Work with datasets by preprocessing, selecting appropriate data representations, and ensuring data quality. Perform statistical analysis and fine-tuning using test results. Support training and retraining of ML systems as needed. Help build data pipelines for collecting and processing data efficiently. Follow coding and quality standards while developing AI/ML solutions. Contribute to frameworks that help operationalize AI models. What we seek in you: Strong in programming languages such as Python and Java. Hands-on experience with at least one cloud (GCP preferred). Experience working with Docker. Environment management (e.g. venv, pip, poetry). Experience with orchestrators such as Vertex AI Pipelines, Airflow, etc. Understanding of the full ML lifecycle end-to-end. Data engineering and feature engineering techniques. Experience with ML modelling and evaluation metrics. Experience with TensorFlow, PyTorch, or another framework. Experience with model monitoring. Advanced SQL knowledge. Aware of streaming concepts such as windowing, late arrival, and triggers. Storage: Cloud SQL, Cloud Storage, Cloud Bigtable, BigQuery, Cloud Spanner, Cloud Datastore, vector databases. Ingest: Pub/Sub, Cloud Functions, App Engine, Kubernetes Engine, Kafka, microservices. Scheduling: Cloud Composer, Airflow. Processing: Cloud Dataproc, Cloud Dataflow, Apache Spark, Apache Flink. CI/CD: Bitbucket + Jenkins / GitLab. Infrastructure as Code: Terraform. Life at Next: At our core, we're driven by the mission of tailoring growth for our customers by enabling them to transform their aspirations into tangible outcomes. We're dedicated to empowering them to shape their futures and achieve ambitious goals. To fulfil this commitment, we foster a culture defined by agility, innovation, and an unwavering commitment to progress.
Our organizational framework is both streamlined and vibrant, characterized by a hands-on leadership style that prioritizes results and fosters growth. Perks of working with us: Clear objectives to ensure alignment with our mission, fostering your meaningful contribution. Abundant opportunities for engagement with customers, product managers, and leadership. You'll be guided by progressive paths while receiving insightful guidance from managers through ongoing feedforward sessions. Cultivate and leverage robust connections within diverse communities of interest. Choose your mentor to navigate your current endeavors and steer your future trajectory. Embrace continuous learning and upskilling opportunities through Nexversity. Enjoy the flexibility to explore various functions, develop new skills, and adapt to emerging technologies. Embrace a hybrid work model promoting work-life balance. Access comprehensive family health insurance coverage, prioritizing the well-being of your loved ones. Embark on accelerated career paths to actualize your professional aspirations. Who we are: We enable high-growth enterprises to build hyper-personalized solutions that transform their vision into reality. With a keen eye for detail, we apply creativity, embrace new technology, and harness the power of data and AI to co-create solutions tailor-made to meet our customers' unique needs. Join our passionate team and tailor your growth with us!
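The ML evaluation-metrics experience this posting asks for boils down to a few standard quantities. As a minimal, framework-free sketch (toy labels, not any team's actual evaluation code), precision, recall, and F1 can be computed directly:

```python
# Minimal classification metrics from scratch - a toy sketch, not a
# substitute for sklearn.metrics in real evaluation pipelines.
def precision_recall_f1(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

y_true = [1, 0, 1, 1, 0, 1]   # hypothetical ground-truth labels
y_pred = [1, 1, 1, 0, 0, 1]   # hypothetical model predictions
print(precision_recall_f1(y_true, y_pred))  # (0.75, 0.75, 0.75)
```

Model monitoring, also listed above, typically tracks exactly these numbers over time and alerts when they drift.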
Posted 4 days ago
4.0 - 9.0 years
10 - 15 Lacs
Bengaluru
Work from Office
Senior Software Engineer - DevOps Bangalore, India Who we are: INVIDI Technologies Corporation is the world's leading developer of software transforming television all over the world. Our two-time Emmy Award-winning technology is widely deployed by cable, satellite, and telco operators. We provide a device-agnostic solution delivering ads to the right household no matter what program or network you're watching, how you're watching, or whether you're in front of your TV, laptop, cell phone or any other device. INVIDI created the multi-billion-dollar addressable television business that today is growing rapidly globally. INVIDI is right at the heart of the very exciting and fast-paced world of commercial television; companies benefiting from our software include DirecTV and Dish Network, networks such as CBS/Viacom and A&E, advertising agencies such as Ogilvy and Publicis, and advertisers such as Chevrolet and Verizon. INVIDI's world-class technology solutions are known for their flexibility and adaptability. These traits allow INVIDI partners to transform their video content delivery network, revamping legacy systems without significant capital or hardware investments. Our clients count on us to provide superior capabilities, excellent service, and ease of use. The goal of developing a unified video ad tech platform is a big one, and the right DevOps engineer - like you - will flourish in INVIDI's creative, inspiring, and supportive culture. It is a demanding, high-energy, and fast-paced environment. INVIDI's developers are self-motivated quick studies, can-do individuals who embrace the challenge of solving difficult and complex problems. About the role: We are a modern agile product organization looking for an excellent DevOps engineer who can support and offload a remote product development team. Our platform handles tens of thousands of requests per second with sub-second response times across the globe.
We serve ads to some of the biggest live events in the world, providing reports and forecasts based on billions of log rows. These are some of the complex challenges that make development and operational work at INVIDI interesting and rewarding. To accomplish this, we use the best frameworks and tools out there or, when they are not good enough, we write our own. Most of the code we write is Java or Kotlin on top of Dropwizard, but every problem is unique, and we always evaluate the best tools for the job. We work with technologies such as Kafka, Google Cloud (GKE, Pub/Sub), Bigtable, Terraform, Jsonnet and a lot more. The position will report directly to the Technical Manager of Software Development and will be based in our Chennai, India office. Key responsibilities: You will maintain, deploy and operate backend services in Java and Kotlin that are scalable, durable and performant. You will proactively evolve deployment pipelines and artifact generation. You will have a commitment to Kubernetes and infrastructure maintenance. You will troubleshoot incoming issues from support and clients, fixing and resolving what you can. You will collaborate closely with peers and product owners in your team. You will help other team members grow as engineers through code review, pairing, and mentoring. Our requirements: You are an outstanding DevOps engineer who loves to work with distributed high-volume systems. You care about the craft and cherish the opportunity to work with smart, supportive, and highly motivated colleagues. You are curious; you like to learn new things, mentor and share knowledge with team members. Like us, you strive to handle complexity by keeping things simple and elegant. As a part of the DevOps team, you will be on call for the services and clusters that the team owns. You are on call for one week, approximately once or twice per month. While on call, you are required to be reachable by telephone and able to act upon alarms using your laptop.
Skills and qualifications: Master's degree in computer science, or equivalent. 4+ years of experience in the computer science industry. Strong development and troubleshooting skill sets. Ability to support a SaaS environment to meet service objectives. Ability to collaborate effectively and work well in an Agile environment. Excellent oral and written communication skills in English. Ability to quickly learn new technologies and work in a fast-paced environment. Highly preferred: Experience building service applications with Dropwizard/Spring Boot. Experience with cloud services such as GCP and/or AWS. Experience with Infrastructure as Code tools such as Terraform. Experience in a Linux environment. Experience working with technologies such as SQL, Kafka, Kafka Streams. Experience with Docker. Experience with SCM and CI/CD tools such as Git and Bitbucket. Experience with build tools such as Gradle or Maven. Experience in writing Kubernetes deployment manifests and troubleshooting cluster- and application-level issues. Physical requirements: INVIDI is a conscious, clean, well-organized, and supportive office environment. Prolonged periods of sitting at a desk and working on a computer are normal. Note: Final candidates must successfully pass INVIDI's background screening requirements. Final candidates must be legally authorized to work in India. INVIDI has reopened its offices on a flexible hybrid model. Ready to join our team? Apply today!
Posted 5 days ago
5.0 - 7.0 years
13 - 17 Lacs
Hyderabad
Work from Office
Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations; able to analyse data for functional business requirements and interface directly with customers. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work. Preferred technical and professional experience: Intuitive individual with an ability to manage change and proven time management. Proven interpersonal skills while contributing to team effort by accomplishing related results as needed. Keeps technical knowledge up to date by attending educational workshops and reviewing publications.
Posted 1 week ago
5.0 - 7.0 years
13 - 17 Lacs
Bengaluru
Work from Office
Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations; able to analyse data for functional business requirements and interface directly with customers. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work. Preferred technical and professional experience: Intuitive individual with an ability to manage change and proven time management. Proven interpersonal skills while contributing to team effort by accomplishing related results as needed. Keeps technical knowledge up to date by attending educational workshops and reviewing publications.
Posted 1 week ago
4.0 - 9.0 years
6 - 10 Lacs
Mumbai
Work from Office
Your Role: As a senior software engineer with Capgemini, you should have 4+ years of experience as a GCP data engineer with a strong project track record. In this role you will demonstrate: Strong customer orientation, decision making, problem solving, communication and presentation skills. Very good judgement and the ability to shape compelling solutions and solve unstructured problems with assumptions. Very good collaboration skills and the ability to interact with multi-cultural and multi-functional teams spread across geographies. Strong executive presence and spirit. Superb leadership and team-building skills with the ability to build consensus and achieve goals through collaboration rather than direct line authority. Your Profile: Minimum 4 years' experience in GCP data engineering. Strong data engineering experience using Java or Python programming languages or Spark on Google Cloud. Should have worked on handling big data. Strong communication skills. Experience in Agile methodologies. ETL/ELT skills, data movement skills, data processing skills. Certification as a Professional Google Cloud Data Engineer will be an added advantage. Proven analytical skills and a problem-solving attitude. Ability to function effectively in a cross-team environment. Primary Skills: GCP data engineering; Java/Python/Spark on GCP; programming experience in any one language - Python, Java or PySpark; GCS (Cloud Storage), Composer (Airflow) and BigQuery experience.
Experience building data pipelines using the above skills. Skills (competencies): Ab Initio, Agile (Software Development Framework), Apache Hadoop, AWS Airflow, AWS Athena, AWS CodePipeline, AWS EFS, AWS EMR, AWS Redshift, AWS S3, Azure ADLS Gen2, Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Event Hub, Azure Stream Analytics, Azure Synapse, Bitbucket, Change Management, Client Centricity, Collaboration, Continuous Integration and Continuous Delivery (CI/CD), Data Architecture Patterns, Data Format Analysis, Data Governance, Data Modeling, Data Validation, Data Vault Modeling, Database Schema Design, Decision-Making, DevOps, Dimensional Modeling, GCP Bigtable, GCP BigQuery, GCP Cloud Storage, GCP Dataflow, GCP Dataproc, Git, Google Bigtable, Google Dataproc, Greenplum, HQL, IBM DataStage, IBM DB2, Industry Standard Data Modeling (FSLDM), Industry Standard Data Modeling (IBM FSDM), Influencing, Informatica IICS, Inmon methodology, JavaScript, Jenkins, Kimball, Linux - Red Hat, Negotiation, Netezza, NewSQL, Oracle Exadata, Performance Tuning, Perl, Platform Update Management, Project Management, PySpark, Python, R, RDD Optimization, CentOS, SAS, Scala, Spark, Shell Script, Snowflake, Spark Code Optimization, SQL, Stakeholder Management, Sun Solaris, Synapse, Talend, Teradata, Time Management, Ubuntu, Vendor Management
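The ETL/ELT skills this profile calls for follow a standard extract-transform-load shape. Here is a minimal, self-contained sketch in plain Python; the record fields and the in-memory "warehouse" are hypothetical stand-ins for what would really be Spark, Dataflow, or a BigQuery ELT pattern:

```python
# Minimal ETL sketch: extract raw records, transform them, load to a sink.
# Field names and the in-memory sink are hypothetical illustrations only.
def extract(rows):
    """Extract: parse raw CSV-style records into dicts."""
    return [dict(zip(("id", "amount", "country"), r.split(","))) for r in rows]

def transform(records):
    """Transform: coerce types and keep only positive amounts."""
    out = []
    for rec in records:
        amount = float(rec["amount"])
        if amount > 0:
            out.append({"id": int(rec["id"]), "amount": amount,
                        "country": rec["country"].upper()})
    return out

def load(records, sink):
    """Load: append to a sink standing in for a warehouse table."""
    sink.extend(records)
    return len(records)

warehouse = []
raw = ["1,10.5,in", "2,-3.0,us", "3,7.25,uk"]
loaded = load(transform(extract(raw)), warehouse)
print(loaded)  # 2 - one row is filtered out by the transform
```

In an Airflow (Composer) deployment, each of these three functions would typically become its own task in a DAG.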
Posted 1 week ago
5.0 - 7.0 years
13 - 17 Lacs
Bengaluru
Work from Office
Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations; able to analyse data for functional business requirements and interface directly with customers. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work. Preferred technical and professional experience: Intuitive individual with an ability to manage change and proven time management. Proven interpersonal skills while contributing to team effort by accomplishing related results as needed. Keeps technical knowledge up to date by attending educational workshops and reviewing publications.
Posted 2 weeks ago
12.0 - 15.0 years
40 - 45 Lacs
Chennai
Work from Office
Skill & Experience: Strategic planning and direction. Maintain architecture principles, guidelines and standards. Project & program management. Data warehousing, big data, data analytics & data science for solutioning. Expert in BigQuery, Dataproc, Data Fusion, Dataflow, Bigtable, Firestore, Cloud SQL, Cloud Spanner, Google Cloud Storage, Cloud Composer, Cloud Interconnect, etc. Strong experience in big data - data modelling, design, architecting & solutioning. Understands programming languages like SQL, Python, R, Scala. Good Python skills; experience with data visualisation tools such as Google Data Studio or Power BI. Knowledge of A/B testing, statistics, Google Cloud Platform, Google BigQuery, Agile development, DevOps, data engineering, ETL data processing. Strong migration experience of production Hadoop clusters to Google Cloud. Experience in designing & implementing solutions in the mentioned areas: strong Google Cloud Platform data components - BigQuery, Bigtable, Cloud SQL, Dataproc, Dataflow, Data Fusion, etc.
Posted 2 weeks ago
4.0 - 7.0 years
17 - 22 Lacs
Bengaluru
Work from Office
Overview: This role will drive the technical architecture of services engineering teams within the Software Solutions business unit. These products focus on empowering front-line workers using Zebra (and other) hardware to be more effective in their roles. Our software solutions combine the power of the cloud with IoT, big data analytics, and machine learning to put powerful insights and capabilities into the hands of our customers' workers in real time. We serve customers in various verticals, including transport & logistics, healthcare, retail and manufacturing. Responsibilities: • Work with cross-functional, Agile development teams to evaluate, design and architect cloud-native solutions. • Apply a sound working knowledge of Google Cloud Platform services and capabilities, microservice design and domain-driven design. • Collaborate across functions to define the system architecture and ensure the integrity of the design by defining SLOs. • Take overall responsibility for setting the technical direction of a solution, including selecting frameworks, identifying dependencies, writing technical requirements and driving continuous improvement. • Understand DevOps and drive requirements for CI/CD, automation scripts, load/performance testing, auto-scaling and monitoring tools capable of operating solutions with minimal manual intervention. • Evaluate work upon completion to ensure technical KPIs have been met, and drive functional-excellence initiatives. • Work effectively in a fast-paced, global environment supporting high-demand solutions.
Qualifications Minimum Education: • Bachelor's in Computer Science, Computer Engineering, or Computer Information Systems • Master's degree preferred Minimum Work Experience (years): • 15+ years of experience • 5+ years of experience in cloud-based enterprise-systems architecture • Experience in product engineering Key Skills and Competencies: • Experienced hands-on architect. Enjoys active prototyping, design and code reviews, production incident handling and service evolution. • Experience with highly Agile development teams and large-scale system creation • Experience in GCP and API development, including externally exposed APIs • Excellent communication skills and the ability to create architecture diagrams and distill complex technical subjects in order to present updates to executive leaders • Real-world experience in cloud-native product development, including delivering a software solution through to production • Proven problem solver and impediment remover; identifies issues, finds solutions to blockers and helps resolve technical issues quickly to achieve goals. • Eager to evaluate the pros and cons of different technical solutions and determine how best to apply different cloud capabilities. • Drives innovation and acts as an evangelist for technology improvements • Knowledge of a variety of database and datastore technologies and how best to utilize each (RDBMS, Distributed SQL, Delta Lake, bucket/blob storage, and document stores such as MongoDB and Bigtable). • Working knowledge of building compelling back-end application frameworks. • Strong interest in product development and a platform approach to system design. • Understands ways to incorporate machine learning and AI into a larger system design.
• Familiarity with a variety of API and client/server communication technologies such as gRPC/Protocol Buffers, OpenAPI, GraphQL, MQTT, Kafka, MongoDB/Realm • Working knowledge of Kubernetes and service mesh • Familiar with load testing, scaling services, right-sizing and cost management • Familiar with the Infrastructure-as-Code approach to cloud deployment via Terraform and Helm • Working knowledge of several computer languages, especially modern Java and JavaScript • Solid understanding of continuous integration, continuous deployment and operations concepts • Experience in a broad range of Google Cloud Platform services and capabilities preferred Position Specific Information: • Travel Requirements (as a % of time): 10%
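Defining SLOs, as this role requires, usually means translating an availability target into a concrete error budget. A minimal sketch of that arithmetic, with illustrative numbers only:

```python
# Toy SLO error-budget calculation. The 99.9% target and 30-day window
# are illustrative, not any team's actual service objectives.
def error_budget_minutes(slo: float, days: int = 30) -> float:
    """Minutes of allowed downtime in the window for an availability SLO."""
    total_minutes = days * 24 * 60
    return total_minutes * (1 - slo)

# A 99.9% SLO over 30 days leaves roughly 43 minutes of budget.
print(round(error_budget_minutes(0.999), 1))  # 43.2
```

Monitoring and alerting thresholds are then commonly set as a fraction of this budget rather than against raw uptime.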
Posted 2 weeks ago
3.0 - 5.0 years
5 - 7 Lacs
Chennai, Tamil Nadu
Work from Office
Duration: 12 months. Work Type: Onsite. Position Description: We are seeking an experienced GCP Data Engineer who can build a cloud analytics platform to meet ever-expanding business requirements with speed and quality using lean Agile practices. You will work on analyzing and manipulating large datasets supporting the enterprise by activating data assets to support Enabling Platforms and Analytics on Google Cloud Platform (GCP). You will be responsible for designing the transformation and modernization on GCP, as well as landing data from source applications to GCP. Experience with large-scale solutions and operationalization of data warehouses, data lakes and analytics platforms on Google Cloud Platform or another cloud environment is a must. We are looking for candidates with a broad set of technology skills across these areas who can demonstrate an ability to design the right solutions with the appropriate combination of GCP and third-party technologies for deployment on Google Cloud Platform. Skills Required: Experience working in an implementation team from concept to operations, providing deep technical subject-matter expertise for successful deployment. Implement methods for automation of all parts of the pipeline to minimize labor in development and production. Experience in analyzing complex data, organizing raw data and integrating massive datasets from multiple data sources to build subject areas and reusable data products. Experience in working with architects to evaluate and productionalize appropriate GCP tools for data ingestion, integration, presentation, and reporting. Experience in working with all stakeholders to formulate business problems as technical data requirements, and to identify and implement technical solutions while ensuring key business drivers are captured in collaboration with product management. Proficient in machine learning model architecture, data pipeline interaction and metrics interpretation.
This includes designing and deploying a pipeline with automated data lineage. Identify, develop, evaluate and summarize proofs of concept to prove out solutions. Test and compare competing solutions and report out a point of view on the best solution. Integration between GCP Data Catalog and Informatica EDC. Design and build production data engineering solutions to deliver pipeline patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine. Skills Preferred: Strong drive for results and ability to multi-task and work independently. Self-starter with proven innovation skills. Ability to communicate and work with cross-functional teams and all levels of management. Demonstrated commitment to quality and project timing. Demonstrated ability to document complex systems. Experience in creating and executing detailed test plans. Experience Required: 3 to 5 years. Education Required: BE or equivalent.
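A pipeline with automated data lineage, as described in this posting, can be sketched as steps that record what they consumed and produced as they run. The step names and in-memory lineage list below are hypothetical; a real deployment would publish these records to a catalog such as GCP Data Catalog:

```python
# Toy sketch of automated lineage capture: a decorator records each
# pipeline step's name and row counts. Step names are hypothetical, and
# the in-memory list stands in for a real metadata catalog.
lineage = []

def traced(name):
    """Decorator that appends a lineage record every time a step runs."""
    def wrap(fn):
        def inner(rows):
            out = fn(rows)
            lineage.append({"step": name,
                            "rows_in": len(rows), "rows_out": len(out)})
            return out
        return inner
    return wrap

@traced("dedupe")
def dedupe(rows):
    return list(dict.fromkeys(rows))  # preserves first-seen order

@traced("filter_positive")
def filter_positive(rows):
    return [r for r in rows if r > 0]

result = filter_positive(dedupe([3, 3, -1, 5]))
print(result)   # [3, 5]
print(lineage)  # one record per step, in execution order
```

Because lineage is captured by the decorator rather than inside each step, adding a new step automatically extends the lineage trail.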
Posted 3 weeks ago
5 - 10 years
9 - 19 Lacs
Chennai, Bengaluru, Mumbai (All Areas)
Hybrid
Google BigQuery Location: Pan India Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work. Key Responsibilities: Analyze and model client market and key performance data. Use analytical tools and techniques to develop business insights and improve decision-making. 1: Dataproc, Pub/Sub, Dataflow, Kafka Streaming, Looker, SQL (No FLEX). 2: Proven track record of delivering data integration and data warehousing solutions. 3: Strong SQL and hands-on proficiency in the BigQuery SQL language; experience in Shell Scripting and Python (No FLEX). 4: Experience with data integration and migration projects; Oracle SQL. Technical Experience: Google BigQuery. 1: Expert in Python (No FLEX); strong hands-on knowledge of SQL (No FLEX); Python programming using Pandas and NumPy; deep understanding of various data structures (dictionary, array, list, tree, etc.); experience in pytest and code-coverage skills. 2: Experience with building solutions using cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer and Kubernetes (No FLEX). 3: Proficiency with tools to automate AzDO CI/CD pipelines such as Control-M, GitHub, JIRA and Confluence. Professional Attributes:
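The posting above asks for a deep understanding of data structures (dictionary, list, tree, etc.). A small, testable sketch of that kind of structure manipulation: recursively flattening a nested dict (a tree) into dotted keys, as one often does before loading semi-structured records into a tabular store. The record shown is invented for illustration.

```python
def flatten(tree: dict, prefix: str = "") -> dict:
    """Recursively flatten a nested dict into dotted keys,
    e.g. {"user": {"id": 7}} -> {"user.id": 7}."""
    flat = {}
    for key, value in tree.items():
        name = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            flat.update(flatten(value, name))
        else:
            flat[name] = value
    return flat

record = {"user": {"id": 7, "geo": {"city": "Chennai"}}, "score": 0.9}
flat = flatten(record)
```

The same idea underlies how tools map nested JSON onto flat column names; pytest assertions over functions like this are exactly the code-coverage skill the role mentions.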
Posted 1 month ago
5 - 10 years
0 - 1 Lacs
Pune
Work from Office
Position Overview: Cloud Data Engineer with expertise in the Google Cloud Platform (GCP) Data Stack, including Event Hub, MS SQL DB, Azure Redis, and GCP Big Table Storage. The ideal candidate should have strong experience in Big Data architecture, data migration, and large-scale data processing using tools like Hadoop, Hive, HDFS, Impala, Spark, MapReduce, MS SQL, Kafka, and Redis. Familiarity with Cloudera, HBase, MongoDB, MariaDB, Python scripts, and Unix shell scripting is a plus. Key Responsibilities: Design, develop, and optimize Big Data solutions on GCP and cloud-based architectures. Lead and execute data migration projects from on-premise systems to GCP or hybrid cloud environments. Build and maintain ETL pipelines using Hadoop, Hive, Spark, Kafka, and SQL databases. Work with GCP Big Table Storage, Azure Redis, and Event Hub for data processing and storage. Implement real-time streaming solutions using Kafka and Event Hub. Optimize performance and security for Hadoop clusters, HDFS, and cloud storage solutions. Develop and automate Python and Unix shell scripts for data processing and workflow orchestration. Collaborate with data analysts, data scientists, and DevOps teams to improve data infrastructure. Required Skills & Experience: 5-10 years of experience as a Cloud Data Engineer. Strong hands-on experience in the GCP Data Stack (Big Table Storage, Event Hub, Azure Redis, MS SQL DB). Proficiency in Big Data technologies (Hadoop, Hive, HDFS, Impala, Spark, MapReduce). Experience with Kafka, Redis, and real-time data processing. Hands-on knowledge of SQL and NoSQL databases (MS SQL, HBase, MongoDB, MariaDB). Experience in data migration projects across cloud and on-premise environments. Strong scripting skills in Python and Unix shell scripting. Understanding of Big Data security, performance tuning, and scalability best practices. Location: Koregaon Park, Pune, Maharashtra (India) Shift Timings: USA Time Zone (06:30 PM IST to 03:30 AM IST)
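The real-time streaming work this role describes (Kafka/Event Hub) usually reduces to windowed aggregation over a keyed event stream. A stdlib stand-in for a tumbling-window count, with invented timestamps and keys, shows the core logic a Spark Structured Streaming or Kafka Streams job would run at scale:

```python
from collections import defaultdict

def tumbling_counts(events, window_secs=60):
    """Aggregate (timestamp, key) events into per-window counts.

    Each event lands in the window [window_start, window_start + window_secs);
    real streaming engines add watermarking and state cleanup on top of this.
    """
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_secs)
        counts[(window_start, key)] += 1
    return dict(counts)

stream = [(5, "click"), (42, "click"), (61, "view"), (70, "click")]
agg = tumbling_counts(stream)
```

Here the two clicks at t=5 and t=42 fall in the first 60-second window, while the events at t=61 and t=70 fall in the second.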
Posted 2 months ago
5 - 7 years
8 - 12 Lacs
Bengaluru
Work from Office
Job Title Cloud Developer (AWS/ AZURE/ GCP) Responsibilities A day in the life of an Infoscion As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions and facilitate deployment resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs in solution design based on areas of expertise. You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots and will assist in resolving any queries related to requirements and solution design You will conduct solution/product demonstrations, POC/Proof of Technology workshops and prepare effort estimates which suit the customer budgetary requirements and are in line with organization’s financial guidelines Actively lead small projects and contribute to unit-level and organizational initiatives with an objective of providing high quality value adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! 
Technical and Professional Requirements: Primary skills: Technology -> Big Data -> Big Table; Technology -> Cloud Integration -> Azure Data Factory (ADF); Technology -> Data On Cloud - Platform -> AWS. Preferred Skills: Technology -> Big Data -> Big Table -> GCP; Technology -> Data On Cloud - Platform -> AWS; Technology -> Cloud Integration -> Azure Data Factory (ADF). Additional Responsibilities: Ability to develop value-creating strategies and models that enable clients to innovate, drive growth and increase their business profitability. Good knowledge of software configuration management systems. Awareness of the latest technologies and industry trends. Logical thinking and problem-solving skills along with an ability to collaborate. Understanding of the financial processes for various types of projects and the various pricing models available. Ability to assess current processes, identify improvement areas and suggest technology solutions. Knowledge of one or two industry domains. Client interfacing skills. Project and team management. Educational Requirements: Bachelor of Engineering. Service Line: Application Development and Maintenance. * Location of posting is subject to business requirements
Posted 2 months ago
2 - 5 years
4 - 7 Lacs
Ahmedabad
Work from Office
Role Overview: As a Senior Backend Developer (Node.js), you will play a key role in designing, developing, and optimizing backend infrastructure for real-time applications and Web3 solutions. You will collaborate with cross-functional teams to ensure the seamless performance, security, and scalability of our backend systems. Key Responsibilities: Backend Development: Design, develop, and maintain scalable backend systems using Node.js. Real-Time Communication: Implement real-time data transmission using WebSocket, WebRTC, and other relevant protocols to enhance user experience. Database Management: Work with databases such as Redis, BigQuery, BigTable, PubSub, and DataFlow to efficiently manage and optimize application data. Scalability & Performance: Architect solutions that can handle high concurrency while ensuring optimal performance and reliability. Security: Implement best practices for security to prevent vulnerabilities, data breaches, and unauthorized access. Collaboration: Work closely with frontend developers, designers, and other stakeholders to seamlessly integrate backend solutions. Testing & Debugging: Ensure robustness and reliability through thorough testing and debugging processes. Documentation: Maintain comprehensive documentation for APIs, backend services, and system architecture. Ideal Candidate Profile: Proven experience in backend development using Node.js. Hands-on experience in real-time applications or multiplayer game development is a plus. Strong expertise in WebSocket, WebRTC, and other real-time communication technologies. Deep understanding of databases like Redis, BigQuery, BigTable, PubSub, DataFlow. Experience working with Google Cloud Platform (GCP). Familiarity with serverless architecture is a strong advantage. Strong problem-solving skills with a results-driven mindset. Excellent communication and teamwork skills. Fluency in English. Knowledge of the gaming or mobile apps industry is a plus.
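The real-time fan-out this role works with (WebSocket broadcasts, Pub/Sub topics) rests on a publish/subscribe pattern. A minimal in-process sketch of that pattern follows; it is written in Python for consistency with the other examples on this page even though the role itself is Node.js, and the topic and message shapes are invented. Real brokers like Cloud Pub/Sub add acknowledgement, retry, and persistence on top of this core.

```python
from collections import defaultdict

class Broker:
    """Minimal in-process pub/sub broker: subscribers register a handler
    per topic, and publish() fans each message out to all of them."""

    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subs[topic].append(handler)

    def publish(self, topic, message):
        for handler in self._subs[topic]:
            handler(message)

received = []
broker = Broker()
broker.subscribe("moves", received.append)
broker.publish("moves", {"player": 1, "x": 3})
```

In a multiplayer game backend, the handlers would be per-connection WebSocket send functions rather than a list append.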
Interest or experience in the Web3 industry is an added advantage.
Posted 2 months ago
15 - 24 years
30 - 45 Lacs
Pune
Hybrid
Minimum of 5 years of experience in a Data Architect role, supporting warehouse and Cloud data platforms/environments. Experience with common GCP services such as BigQuery, Dataflow, Dataproc, GCS, Cloud Functions and related CI/CD processes
Posted 2 months ago
15 - 24 years
30 - 45 Lacs
Bengaluru
Hybrid
Minimum of 5 years of experience in a Data Architect role, supporting warehouse and Cloud data platforms/environments. Experience with common GCP services such as BigQuery, Dataflow, Dataproc, GCS, Cloud Functions and related CI/CD processes
Posted 2 months ago
7 - 10 years
25 - 35 Lacs
Indore, Bengaluru, Hyderabad
Work from Office
Job Description: GCP Lead Developer Location: Hyderabad, Bangalore and Indore Notice period: 15-30 days max What we are looking for: 7+ years of experience in data analytics with at least 4+ years in a tech-lead role. Expertise in building solution architecture, provisioning infrastructure, and secure, reliable data-centric services and applications in GCP. Minimum 3-5 years of designing, building and operationalizing enterprise data solutions and applications using one or more GCP data and analytics services in combination with 3rd parties - Spark, Cloud DataProc, Cloud Dataflow, Apache Beam, BigTable, Cloud BigQuery, Cloud PubSub, Cloud Functions, etc. Minimum 2 years of designing and building production data pipelines from ingestion to consumption within a hybrid big data architecture, using Java, Python, Scala etc. Ability to formulate business problems as technical data problems while ensuring key business drivers are captured in collaboration with business users. Work with the data team to efficiently use Cloud infrastructure to analyze data, build models, and generate reports/visualizations. Extracting, loading, transforming, cleaning, and validating data; integrating datasets from multiple data sources for data modelling. Excellent programming skills with SQL and Python. Strong communication and analytical skills. GCP Professional Data Engineer certification preferred. Able to drive workshops with business users to gather requirements and translate them into analytics solutions. Drive and contribute to presales activities including proposals, POCs and demos. Working at TekLink: At TekLink our employees get an open and collaborative environment and gain experience working for customers in various industries while solving complex business problems. They get support to learn new technologies as well as to enhance existing skills to further their career growth. We offer: Competitive remuneration. Excellent benefits including health, accidental and life coverage.
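The dataset-integration step this posting describes (combining datasets from multiple sources for modelling) boils down to a keyed join with handling for unmatched rows. A stdlib sketch with invented field names (`order_id`, `customer_id`, `region`):

```python
def integrate(orders, customers):
    """Join two source datasets on customer_id, enriching each order
    with the customer's region; unmatched orders are skipped here,
    though a production pipeline would route them to a reconciliation
    report instead of dropping them silently."""
    by_id = {c["customer_id"]: c for c in customers}
    merged = []
    for order in orders:
        customer = by_id.get(order["customer_id"])
        if customer is None:
            continue
        merged.append({**order, "region": customer["region"]})
    return merged

result = integrate(
    [{"order_id": 10, "customer_id": "A"}, {"order_id": 11, "customer_id": "Z"}],
    [{"customer_id": "A", "region": "south"}],
)
```

At warehouse scale the same join is expressed as SQL in BigQuery or as a `CoGroupByKey` in Beam; the in-memory version just makes the semantics concrete.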
Excellent performance-based annual increments. Fast-paced growth opportunities. International work experience. Opportunity to participate in various sports and CSR activities.
Posted 3 months ago
6 - 11 years
13 - 18 Lacs
Mumbai
Work from Office
Job Summary This position provides leadership in full systems life cycle management (e.g., analyses, technical requirements, design, coding, testing, implementation of systems and applications software, etc.) to ensure delivery is on time and within budget. He/She directs component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements and ensure compliance. This position develops and leads AD project activities and integrations. He/She guides teams to ensure effective communication and achievement of objectives. This position researches and supports the integration of emerging technologies. He/She provides knowledge and support for applications development, integration, and maintenance. This position leads junior team members with project-related activities and tasks. He/She guides and influences department and project teams. This position facilitates collaboration with stakeholders. Responsibilities: GCP services (like BigQuery, GKE, Spanner, Cloud Run, Dataflow, etc.), Angular, Java (REST API), SQL, Python, Terraform, Azure DevOps CI/CD Pipelines. Leads systems analysis and design. Leads design and development of applications. Develops and ensures creation of application documents. Defines and produces integration builds. Monitors emerging technology trends. Leads maintenance and support. Primary Skills: Minimum 5 years Java-Spring Boot/J2EE (Full Stack Developer). Minimum 2 years on the GCP platform (Cloud Pub/Sub, GKE, BigQuery) - experience in BigTable and Spanner will be a plus. Working in an Agile environment, CI/CD experience. Qualifications: Bachelor's Degree or international equivalent; Bachelor's Degree or international equivalent in Computer Science, Information Systems, Mathematics, Statistics, or related field - preferred. Employee Type: Permanent
Posted 3 months ago
7 - 12 years
9 - 14 Lacs
Gurgaon
Work from Office
Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience. Proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and interface directly with customers. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Logs Explorer. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. Ambitious individual who can work under their own direction towards agreed targets/goals and with a creative approach to work. Preferred technical and professional experience: Intuitive individual with an ability to manage change and proven time management. Proven interpersonal skills while contributing to team effort by accomplishing related results as needed. Up-to-date technical knowledge gained by attending educational workshops and reviewing publications.
Posted 3 months ago
3 - 8 years
9 - 12 Lacs
Hyderabad
Work from Office
Role: Google Cloud Developer & Architect Business Vertical: Digital Operations Technology Qualifications and Experience: Graduate in engineering or computer science. 2-5 years of cloud-based development experience (GCP preferable). Knowledge in cloud platforms: GCP & Firebase. Languages: Python & JavaScript. Database: SQL & BigQuery. Data Science: ML and AI. In-depth knowledge of key GCP services - GCE, GKE, GAE, GCS, Cloud SQL, VPC, Resource Manager, Stackdriver, Cloud CDN, Cloud IAM, GSuite (good to have), Infrastructure as Code, etc. Must have worked on workflow monitoring in a distributed computing environment for implementing Big Data solutions; experience in Google Cloud Platform modules like Dataproc, Dataflow, Composer, BigQuery, GCS. In-depth knowledge of design, implementation, engineering, automation and DevOps implementation, service operation and service improvement initiatives. Experience in emerging reporting tools and technologies like Tableau, Bigtable, Data Studio & analytics dashboards etc. Experience in collaborating with teams across geographies and functions. Responsibilities (not limited to): Responsible for design, development, implementation, operational improvement and debugging of cloud environments in GCP and the Cloud Management Platform. Performs engineering design evaluations for new environment builds. Recommends alterations to development and design to improve the quality of products and/or procedures. Implementation of industry-standard security practices during implementation, maintained throughout the lifecycle. Has knowledge of commonly used concepts, practices, and procedures through automation tools. Relies on instructions and pre-established guidelines to perform the functions of the job. Has a good understanding of Google best practices/recommendations and should be able to align them with customer requirements to deliver a best-in-class solution for the customer.
Personal Skills: Good time management and organizational skills Good interpersonal skills along with people management skills Flexible to adapt to any changes in the organization / process Self-starter, Ambitious, go-getter and deadline driven Keen to learn and implement learning Maintain Organizational and Client confidentiality.
Posted 3 months ago
5 - 7 years
8 - 10 Lacs
Chennai, Pune, Delhi
Work from Office
The ideal candidate should possess technical expertise in the following areas, along with soft skills such as communication, collaboration, time management, and organizational abilities. Key Skills & Experience: Soft Skills: Communication, Collaboration, Time Management, Organizational Skills, Positive Attitude. Experience: Proficiency in Data Engineering, SQL, and Cloud Technologies. Must-Have Technical Skills: Talend; SQL, SQL Server, T-SQL; SQL Agent; Snowflake / BigQuery; GCP (Google Cloud Platform); SSIS; Dataproc; Composer / Airflow; Python. Nice-to-Have Technical Skills: Dataplex; Dataflow; BigLake, Lakehouse, BigTable; GCP Pub/Sub; BQ API, BQ Connection API.
Posted 3 months ago
5 - 10 years
20 - 35 Lacs
Bengaluru
Hybrid
GCP Data Engineer - 5+ years of experience - GCP (all services needed for Big Data pipelines, like BigQuery, DataFlow, Pub/Sub, BigTable, Data Fusion, DataProc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, App Engine), Spark, Scala, Hadoop - Python, PySpark, orchestration (Airflow), SQL. CI/CD (experience with deployment pipelines). Architecture and design of cloud-based Big Data pipelines and exposure to any ETL tools. Nice to have - GCP certifications.
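The orchestration skill this posting lists (Airflow / Cloud Composer) is, at its core, scheduling tasks in dependency order over a DAG. The stdlib `graphlib` module can sketch that ordering; the task names below are invented placeholders for typical pipeline stages, not part of the posting.

```python
from graphlib import TopologicalSorter

# A DAG of pipeline tasks, mapping each task to the set of tasks it
# depends on - the same shape an Airflow DAG definition encodes.
dag = {
    "load_to_bq": {"extract_gcs", "transform_dataflow"},
    "transform_dataflow": {"extract_gcs"},
    "extract_gcs": set(),
}

# static_order() yields tasks with every dependency before its dependents.
order = list(TopologicalSorter(dag).static_order())
```

Airflow adds scheduling, retries, and backfills on top, but any valid run of the DAG must respect exactly this topological order.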
Posted 1 month ago