Bengaluru
INR 15.0 - 30.0 Lacs P.A.
Remote
Full Time
Job Title: Senior GCP Data DevOps Engineer
Job Type: Remote
Experience: 4+ years

Position Overview:
As a Senior DevOps Engineer specializing in Google Cloud Platform (GCP), you will play a crucial role in designing, implementing, and managing our cloud infrastructure to ensure optimal performance, scalability, and reliability. You will collaborate closely with cross-functional teams to streamline development processes, automate deployment pipelines, and enhance overall system efficiency.

Responsibilities:
- Design, implement, and manage scalable and highly available cloud infrastructure on Google Cloud Platform (GCP) to support our applications and services.
- Develop and maintain CI/CD pipelines to automate the deployment, testing, and monitoring of applications and microservices.
- Collaborate with software engineering teams to optimize application performance, troubleshoot issues, and ensure smooth deployment processes.
- Implement and maintain infrastructure as code (IaC) using tools such as Terraform, Ansible, or Google Deployment Manager.
- Monitor system health, performance, and security metrics, and implement proactive measures to ensure reliability and availability.
- Implement best practices for security, compliance, and data protection in cloud environments.
- Continuously evaluate emerging technologies and industry trends to drive innovation and improve infrastructure efficiency.
- Mentor junior team members and provide technical guidance and support as needed.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 4-8 years of experience in a DevOps role, with a focus on Google Cloud Platform (GCP).
- In-depth knowledge of GCP services such as Compute Engine, Kubernetes Engine, Cloud Storage, Cloud SQL, Pub/Sub, and BigQuery.
- Proficiency in scripting languages such as Python, Bash, or PowerShell.
- Experience with containerization technologies such as Docker and container orchestration platforms like Kubernetes.
- Strong understanding of CI/CD concepts and experience with CI/CD tools such as Jenkins, GitLab CI/CD, or CircleCI.
- Solid understanding of infrastructure as code (IaC) principles and experience with tools such as Terraform, Ansible, or Google Deployment Manager.
- Experience with monitoring and logging tools such as Prometheus, Grafana, Stackdriver, or the ELK Stack.
- Knowledge of security best practices and experience implementing security controls in cloud environments.
- Excellent problem-solving skills and the ability to troubleshoot complex issues in distributed systems.
- Strong communication skills and the ability to collaborate effectively with cross-functional teams.

Preferred Qualifications:
- Google Cloud certification (e.g., Professional Cloud DevOps Engineer, Professional Cloud Architect).
- Experience with other cloud platforms such as AWS or Azure.
- Familiarity with agile methodologies and DevOps practices.
- Experience with software development in languages such as Java, Node.js, or Go.
- Knowledge of networking concepts and experience configuring network services in cloud environments.

Skills: GCP, Cloud SQL, BigQuery, Kubernetes, IaC Tools, CI/CD Pipelines, Terraform, Python, Airflow, Snowflake, Power BI, Dataflow, Pub/Sub, Cloud Storage, Cloud Computing
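To give candidates a flavor of the day-to-day GCP automation this role involves, here is a minimal Python sketch that publishes a deployment event to Pub/Sub using the official client library. The project and topic names are hypothetical placeholders.

```python
# Minimal sketch: publish a deployment event to a Pub/Sub topic,
# the kind of pipeline glue a GCP DevOps engineer writes.
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
# "my-gcp-project" and "deploy-events" are illustrative names only.
topic_path = publisher.topic_path("my-gcp-project", "deploy-events")

future = publisher.publish(
    topic_path,
    data=b"service=checkout version=1.4.2 status=deployed",
)
print(f"Published message ID: {future.result()}")
```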
Bengaluru
INR 10.0 - 20.0 Lacs P.A.
Remote
Full Time
Job Title: Senior Digital Designer (UI/UX)
Experience: 5+ Years
Location: Remote (Night shift)
Job Type: Full-time

Job Description:
We are looking for a Senior Digital Designer (UI/UX) with 5+ years of hands-on experience to join our creative team. You will work closely with the Creative Director and internal teams to design visually stunning and user-friendly digital assets for web applications. The role demands strong design skills, UX thinking, and a keen eye for detail.

As a Senior Digital Designer, you will play a key role in concept development, wireframing, UI design, and maintaining visual consistency across all platforms. You will also collaborate with developers, clients, and project managers to ensure seamless delivery of high-quality designs.

Responsibilities:
- Collaborate with the Creative Director and project teams to understand requirements and target audiences.
- Develop wireframes, mockups, and prototypes for websites and digital platforms.
- Design visual elements including logos, banners, UI components, and branding assets.
- Apply UX principles to enhance user interaction and experience.
- Maintain brand consistency across all design outputs.
- Optimize images and graphics for web performance.
- Assist in copy generation for branding and digital narratives when required.
- Stay up to date on design trends, tools, and technologies.
- Work with developers to ensure smooth integration of design assets.
- Conduct usability testing and implement feedback-driven improvements.

Requirements:
- Bachelor's degree in Graphic Design, UX/UI, or a related field (or equivalent experience).
- 5+ years of experience in digital or UI/UX design roles.
- Strong portfolio showcasing web and mobile design projects.
- Proficiency in Figma, Adobe Photoshop, Illustrator, and Sketch.
- Experience using Elementor and working with the WordPress CMS.
- Knowledge of HTML, CSS, and responsive design.
- Strong visual and creative skills in layout, color, and typography.
- Ability to work on multiple projects and meet deadlines.
- Strong communication and collaboration skills.
- Experience with 3D design and generative AI tools is a plus.

Key Skills: Digital Design, UI/UX Design, Figma, Adobe Photoshop, Adobe Illustrator, Sketch, Wireframing, Prototyping, User Experience, User Interface, Visual Design, Typography, Color Theory, Responsive Web Design, Elementor, WordPress, HTML, CSS, Graphic Design, Branding, CMS, Usability Testing, 3D Design, Generative AI
Bengaluru
INR 20.0 - 35.0 Lacs P.A.
Remote
Full Time
Job Title: Senior Machine Learning Engineer
Work Mode: Remote
Base Location: Bengaluru
Experience: 5+ Years

Requirements:
- Strong problem-solving skills and the ability to work in a fast-paced, collaborative environment.
- Strong programming skills in Python and experience with ML frameworks.
- Proficiency in containerization (Docker) and orchestration (Kubernetes) technologies.
- Solid understanding of CI/CD principles and tools (e.g., Jenkins, GitLab CI, GitHub Actions).
- Knowledge of data engineering concepts and experience building data pipelines.
- Strong understanding of compute, storage, and orchestration resources on cloud platforms.
- Experience deploying and managing ML models, especially on GCP services such as Cloud Run, Cloud Functions, and Vertex AI (the role is otherwise cloud-platform agnostic).
- Experience implementing MLOps best practices, including model version tracking, governance, and monitoring for performance degradation and drift.
- Experience creating and using benchmarks, metrics, and monitoring to measure and improve services.
- Experience collaborating with data scientists and engineers to integrate ML workflows from onboarding to decommissioning.
- Experience with MLOps tools like Kubeflow, MLflow, and Data Version Control (DVC).
- Experience managing ML models on any of the following: AWS (SageMaker), Azure (Machine Learning), or GCP (Vertex AI).

Tech Stack: AWS, GCP, or Azure experience (GCP preferred); PySpark required; Databricks a plus; ML experience; Docker and Kubernetes.
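As a concrete illustration of the model version tracking mentioned above, here is a minimal MLflow sketch in Python. The run name, hyperparameter, and toy dataset are illustrative placeholders, not part of the role's actual stack.

```python
# Minimal sketch of MLOps-style experiment and model version tracking
# with MLflow. All names and values are illustrative.
import mlflow
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=200, random_state=42)
model = LogisticRegression(max_iter=200).fit(X, y)

with mlflow.start_run(run_name="baseline"):           # hypothetical run name
    mlflow.log_param("max_iter", 200)                 # tracked hyperparameter
    mlflow.log_metric("train_accuracy", model.score(X, y))
    mlflow.sklearn.log_model(model, "model")          # stores a versioned artifact
```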
Bengaluru
INR 15.0 - 30.0 Lacs P.A.
Remote
Full Time
Job Requirement for Offshore Data Engineer (with ML Expertise)
Work Mode: Remote
Base Location: Bengaluru
Experience: 5+ Years

Technical Skills & Expertise:

PySpark & Apache Spark:
- Extensive experience with PySpark and Spark for big data processing and transformation.
- Strong understanding of Spark architecture, optimization techniques, and performance tuning.
- Ability to work with Spark jobs in distributed computing environments like Databricks.

Data Mining & Transformation:
- Hands-on experience in designing and implementing data mining workflows.
- Expertise in data transformation processes, including ETL (Extract, Transform, Load) pipelines.
- Experience in large-scale data ingestion, aggregation, and cleaning.

Programming Languages:
- Python & Scala: proficiency in Python for data engineering tasks, including libraries like Pandas and NumPy; Scala proficiency is preferred for Spark job development.

Big Data Concepts:
- In-depth knowledge of big data frameworks and paradigms, such as distributed file systems, parallel computing, and data partitioning.

Big Data Technologies:
- Cassandra & Hadoop: experience with NoSQL databases like Cassandra and distributed storage systems like Hadoop.
- Data warehousing tools: proficiency with Hive for data warehousing solutions and querying.
- ETL tools: experience with the Beam architecture and other ETL tools for large-scale data workflows.

Cloud Technologies (GCP):
- Expertise in Google Cloud Platform (GCP), including core services like Cloud Storage, BigQuery, and Dataflow.
- Experience with Dataflow jobs for batch and stream processing.
- Familiarity with managing workflows using Airflow for task scheduling and orchestration in GCP.

Machine Learning & AI:
- GenAI experience: familiarity with Generative AI and its applications in ML pipelines.
- ML model development: knowledge of basic ML model building using tools like Pandas and NumPy, with visualization in Matplotlib.
- MLOps pipelines: experience managing end-to-end MLOps pipelines for deploying models in production, particularly LLM (Large Language Model) deployments.
- RAG architecture: understanding and experience in building pipelines using Retrieval-Augmented Generation (RAG) architecture to enhance model performance and output.

Tech Stack: Spark, PySpark, Python, Scala, GCP Dataflow, Cloud Composer (Airflow), ETL, Databricks, Hadoop, Hive, GenAI, basic ML modeling, MLOps, LLM deployment, RAG
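For a taste of the PySpark ingestion-and-aggregation work described above, here is a minimal sketch. The bucket paths, column names, and business logic are hypothetical.

```python
# Minimal PySpark sketch: read raw data, apply a cleaning filter,
# aggregate, and write a curated output. All names are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-daily-agg").getOrCreate()

# Hypothetical raw input location.
orders = spark.read.parquet("gs://example-bucket/raw/orders/")

daily_revenue = (
    orders
    .filter(F.col("status") == "COMPLETED")      # basic cleaning step
    .groupBy("order_date")                       # aggregation key
    .agg(F.sum("amount").alias("revenue"))
)

daily_revenue.write.mode("overwrite").parquet(
    "gs://example-bucket/curated/daily_revenue/"  # hypothetical output
)
```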
Bengaluru
INR 10.0 - 20.0 Lacs P.A.
Remote
Full Time
Job Title: Full Stack Developer
Location: Remote
Experience: 5+ Years

Job Description:
We are looking for a skilled Full Stack Developer with strong experience in Python, React.js, and Node.js. Familiarity with Java is a plus. The ideal candidate should be self-driven, collaborative, and capable of working in a fast-paced environment.

Key Responsibilities:
- Develop, test, and maintain scalable web applications.
- Collaborate with frontend and backend teams to design robust solutions.
- Ensure the performance, quality, and responsiveness of applications.
- Write clean, reusable, and well-documented code.
- Participate in code reviews and contribute to best practices.

Required Skills:
- 5+ years of hands-on experience in full stack development.
- Proficiency in Python, React.js, and Node.js.
- Good understanding of REST APIs and microservices.
- Experience with Git and version control workflows.
- Familiarity with databases (SQL and NoSQL).

Good to Have:
- Experience with Java.
- Exposure to cloud platforms (AWS, GCP, or Azure).
- CI/CD, Docker, or containerization knowledge.

Primary Skill Set: Python, React.js, Node.js
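To illustrate the REST API work this role involves, here is a minimal Python sketch using FastAPI, one common framework choice; the route, model, and in-memory store are hypothetical.

```python
# Minimal sketch of a REST resource: create and fetch an item.
# FastAPI is one framework option; names here are illustrative.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Item(BaseModel):
    name: str
    price: float

ITEMS: dict[int, Item] = {}  # in-memory store, for illustration only

@app.post("/items/{item_id}")
def create_item(item_id: int, item: Item) -> Item:
    ITEMS[item_id] = item
    return item

@app.get("/items/{item_id}")
def read_item(item_id: int) -> Item:
    return ITEMS[item_id]
```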
Bengaluru
INR 10.0 - 20.0 Lacs P.A.
Remote
Full Time
Job Title: Offshore Data Engineer
Base Location: Bangalore
Work Mode: Remote
Experience: 5+ Years

Job Description:
We are looking for a skilled Offshore Data Engineer with strong experience in Python, SQL, and Apache Beam. Familiarity with Java is a plus. The ideal candidate should be self-driven, collaborative, and able to work in a fast-paced environment.

Key Responsibilities:
- Design and implement reusable, scalable ETL frameworks using Apache Beam and GCP Dataflow.
- Develop robust data ingestion and transformation pipelines using Python and SQL.
- Integrate Kafka for real-time data streams alongside batch workloads.
- Optimize pipeline performance and manage costs within GCP services.
- Work closely with data analysts, data architects, and product teams to gather and understand data requirements.
- Manage and monitor BigQuery datasets, tables, and partitioning strategies.
- Implement error handling, resiliency, and observability mechanisms across pipeline components.
- Collaborate with DevOps teams to enable automated delivery (CI/CD) for data pipeline components.

Required Skills:
- 5+ years of hands-on experience in data engineering or software engineering.
- Proficiency in Python and SQL.
- Good understanding of Java (for reading or modifying codebases).
- Experience building ETL pipelines with Apache Beam and Google Cloud Dataflow.
- Hands-on experience with Apache Kafka for stream processing.
- Solid understanding of BigQuery and data modeling on GCP.
- Experience with GCP services (Cloud Storage, Pub/Sub, Cloud Composer, etc.).

Good to Have:
- Experience building reusable ETL libraries or framework components.
- Knowledge of data governance, data quality checks, and pipeline observability.
- Familiarity with Apache Airflow or Cloud Composer for orchestration.
- Exposure to CI/CD practices in a cloud-native environment (Docker, Terraform, etc.).

Tech Stack: Python, SQL, Java, GCP (BigQuery, Pub/Sub, Cloud Storage, Cloud Composer, Dataflow), Apache Beam, Apache Kafka, Apache Airflow, CI/CD (Docker, Terraform)
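For a concrete picture of the Beam pipelines described above, here is a minimal batch sketch that runs locally on the DirectRunner; the same code targets Dataflow by supplying runner, project, and region options. File paths and the validation rule are hypothetical.

```python
# Minimal Apache Beam sketch: read JSON lines, drop invalid records,
# write the survivors back out. All paths and fields are illustrative.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions()  # add --runner=DataflowRunner etc. for GCP

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("events.jsonl")         # hypothetical input
        | "Parse" >> beam.Map(json.loads)
        | "KeepValid" >> beam.Filter(lambda e: "user_id" in e)   # simple quality check
        | "Serialize" >> beam.Map(json.dumps)
        | "Write" >> beam.io.WriteToText("valid_events")         # hypothetical output
    )
```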
Bengaluru
INR 10.0 - 20.0 Lacs P.A.
Remote
Full Time
Job Title: Software Engineer - GCP Data Engineering
Work Mode: Remote
Base Location: Bengaluru
Experience Required: 4 to 6 Years

Job Summary:
We are seeking a Software Engineer with a strong background in GCP data engineering and a solid understanding of how to build scalable data processing frameworks. The ideal candidate will be proficient in data ingestion, transformation, and orchestration using modern cloud-native tools and technologies. This role requires hands-on experience in designing and optimizing ETL pipelines, managing big data workloads, and supporting data quality initiatives.

Key Responsibilities:
- Design and develop scalable data processing solutions using Apache Beam, Spark, and other modern frameworks.
- Build and manage data pipelines on Google Cloud Platform (GCP) using services like Dataflow, Dataproc, Composer (Airflow), and BigQuery.
- Collaborate with data architects and analysts to understand data models and implement efficient ETL solutions.
- Leverage DevOps and CI/CD best practices for code management, testing, and deployment using tools like GitHub and Cloud Build.
- Ensure data quality, performance tuning, and reliability of data processing systems.
- Work with cross-functional teams to understand business requirements and deliver robust data infrastructure to support analytical use cases.

Required Skills:
- 4 to 6 years of professional experience as a Data Engineer working on cloud platforms, preferably GCP.
- Proficiency in Java and Python with strong problem-solving and analytical skills.
- Hands-on experience with Apache Beam, Apache Spark, Dataflow, Dataproc, Composer (Airflow), and BigQuery.
- Strong understanding of data warehousing concepts and ETL pipeline optimization techniques.
- Experience in cloud-based architectures and DevOps practices.
- Familiarity with version control (GitHub) and CI/CD pipelines.

Preferred Skills:
- Exposure to modern ETL tools and data integration platforms.
- Experience with data governance, data quality frameworks, and metadata management.
- Familiarity with performance tuning in distributed data processing systems.

Tech Stack:
- Cloud: GCP (Dataflow, BigQuery, Dataproc, Composer)
- Programming: Java, Python
- Frameworks: Apache Beam, Apache Spark
- DevOps: GitHub, CI/CD tools, Composer (Airflow)
- ETL/Data Tools: data ingestion, transformation, and warehousing on GCP
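As one small building block of the BigQuery work above, here is a minimal Python sketch that runs a query with the official client library. The project, dataset, and table names are hypothetical, and credentials are assumed to come from the application-default environment.

```python
# Minimal sketch: run an aggregation query against BigQuery and
# print the results. Table and column names are illustrative.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

query = """
    SELECT order_date, SUM(amount) AS revenue
    FROM `example-project.sales.orders`  -- hypothetical table
    GROUP BY order_date
    ORDER BY order_date
"""

for row in client.query(query).result():
    print(row["order_date"], row["revenue"])
```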
Bengaluru
INR 8.0 - 18.0 Lacs P.A.
Remote
Full Time
Job Title: System & Security Support Engineer (L2/L3)
Base Location: Bangalore
Work Mode: Remote
Experience Required: 5+ Years

Job Summary:
We are looking for a seasoned L2/L3 Systems/Support Engineer with a strong background in IT infrastructure support and cybersecurity operations. The ideal candidate should have hands-on experience in monitoring, troubleshooting, and securing enterprise systems. You will be responsible for providing advanced technical support, maintaining system health, and enhancing the security posture across platforms.

Key Responsibilities:
- Provide L2/L3 technical support for infrastructure systems including servers, endpoints, and networks.
- Monitor and respond to incidents related to system performance, availability, and security.
- Troubleshoot hardware, software, and network-related issues in a timely manner.
- Collaborate with security teams to investigate threats, perform vulnerability assessments, and support mitigation strategies.
- Maintain system documentation, configurations, and change logs.
- Support patch management, endpoint protection, and compliance initiatives.
- Coordinate with cross-functional teams for system upgrades, migrations, and recovery planning.

Required Skills:
- Minimum of 5 years of experience in system/support engineering with exposure to cybersecurity.
- Strong knowledge of Windows/Linux servers, Active Directory, and network protocols.
- Experience with security monitoring tools, SIEM, and incident response processes.
- Familiarity with endpoint protection, firewalls, and intrusion detection/prevention systems.
- Proficiency in troubleshooting Tier 2/3 technical issues across infrastructure and application layers.
- Strong communication skills and the ability to collaborate effectively across teams.

Good to Have:
- Industry certifications such as CompTIA Security+, CEH, Microsoft Certified, or Cisco security certifications.
- Experience with cloud security practices (AWS, GCP, or Azure).
- Knowledge of ITIL practices and ticketing systems (e.g., ServiceNow, Jira).
- Scripting knowledge (PowerShell, Bash, Python) for automating routine tasks.

Tech Stack:
- Operating Systems: Windows, Linux
- Security Tools: SIEM, EDR, IDS/IPS, antivirus
- Networking: TCP/IP, DNS, VPN, firewall configurations
- Systems & Platforms: Active Directory, VMware, cloud (AWS/GCP/Azure)
- Support Tools: ServiceNow, Jira, Remote Desktop, SSH
- Scripting: PowerShell, Bash, Python (preferred)
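As an example of the routine-task scripting mentioned above, here is a minimal Python sketch that counts failed SSH logins per source IP from a Linux auth log. The log path and message format vary by distribution, so treat both as assumptions.

```python
# Minimal sketch: summarize failed SSH login attempts by source IP.
# /var/log/auth.log and the "Failed password" format are Debian-style
# assumptions; adjust for your distribution.
import re
from collections import Counter

PATTERN = re.compile(r"Failed password .* from (\d+\.\d+\.\d+\.\d+)")
failures = Counter()

with open("/var/log/auth.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        match = PATTERN.search(line)
        if match:
            failures[match.group(1)] += 1

# Report the ten noisiest sources for triage.
for ip, count in failures.most_common(10):
    print(f"{ip}: {count} failed attempts")
```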
Bengaluru
INR 10.0 - 20.0 Lacs P.A.
Remote
Full Time
Job Title: Apache Beam Software Engineer
Work Mode: Remote
Base Location: Bengaluru
Experience Required: 4 to 6 Years

Job Summary:
We are looking for a Software Engineer with hands-on experience in Apache Beam, Google Cloud Dataflow, and Dataproc, focusing on building reusable data processing frameworks. This is not a traditional data engineering role. The ideal candidate will have strong software development skills in Java or Python and experience building scalable, modular data processing components and frameworks for batch and streaming use cases.

Key Responsibilities:
- Design and develop framework-level components using Apache Beam, GCP Dataflow, and Dataproc.
- Build scalable, reusable libraries and abstractions in Python or Java for distributed data processing.
- Work closely with architects to implement best practices for designing high-performance data frameworks.
- Ensure software reliability, maintainability, and testability through strong coding and automation practices.
- Participate in code reviews, architectural discussions, and performance tuning initiatives.
- Contribute to internal tooling or SDK development for data engineering platforms.

Required Skills:
- 4 to 6 years of experience as a Software Engineer working on distributed systems or data processing frameworks.
- Strong programming skills in Java and/or Python.
- Deep experience with Apache Beam and GCP Dataflow.
- Hands-on experience with GCP Dataproc, especially for building scalable custom batch or streaming jobs.
- Solid understanding of streaming vs. batch processing concepts.
- Familiarity with CI/CD pipelines, GitHub, and test automation.

Preferred Skills:
- Experience with workflow orchestration tools such as Airflow (Composer).
- Exposure to Pub/Sub and BigQuery (from a system integration perspective).
- Understanding of observability, logging, and error handling in distributed data pipelines.
- Experience building internal libraries, SDKs, or tools to support data teams.

Tech Stack:
- Cloud: GCP (Dataflow, Dataproc, Pub/Sub, Composer)
- Programming: Java, Python
- Frameworks: Apache Beam
- DevOps: GitHub, CI/CD (Cloud Build, Jenkins)
- Focus Areas: framework/library development, scalable distributed data processing, component-based architecture
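To make the "framework-level component" idea concrete, here is a minimal Python sketch of a reusable Beam PTransform that bundles parsing and validation so any pipeline can compose it as a single step. The class and field names are hypothetical.

```python
# Minimal sketch of a reusable Beam component: a PTransform that
# parses JSON lines and keeps only records containing required fields.
import json

import apache_beam as beam

class ParseAndValidate(beam.PTransform):
    """Parse JSON lines and filter out records missing required fields."""

    def __init__(self, required_fields):
        super().__init__()
        self.required_fields = set(required_fields)

    def expand(self, pcoll):
        required = self.required_fields  # bind locally for clean pickling
        return (
            pcoll
            | "Parse" >> beam.Map(json.loads)
            | "Validate" >> beam.Filter(lambda rec: required.issubset(rec))
        )

# Usage inside any pipeline (names illustrative):
#   lines | ParseAndValidate(required_fields=["event_id", "user_id"])
```

Packaging steps like this into a shared library is exactly the component-based architecture the role targets: pipelines stay thin, while parsing and validation logic is tested once and reused everywhere.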