0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Roles and Responsibilities
- Contribute to designing cloud architectures and integration modules for enterprise-level systems.
- Manage and mentor the technical team, ensuring quality delivery of solutions as per the process.
- Lead engagements with partners and customers, including stakeholder management, requirements gathering, and designing solutions along with development and delivery.
- Collaborate with Program Management, Engineering, User Experience, and Product teams to identify gaps and work with cross-functional teams to design solutions.
Essential Skills
- Experience developing and managing scalable, high-performance production systems.
- Some experience working in Artificial Intelligence (AI) or RPA.
- Developing, designing, and maintaining high-quality production applications written in NodeJS or Python, with a solid grounding in data structures, ML algorithms, and software design.
- Experience with complex API integrations and application development modules.
- Strong skills in designing database schemas, for both SQL and NoSQL databases.
- Experience with cloud technologies like GCP/AWS/Azure, leveraging serverless architectures and services such as Cloud Functions, AWS Lambda, Google Dataflow, and Google Pub/Sub (a minimal sketch follows this list).
- Design, build, manage, and operate the continuous delivery framework and tools, acting as a subject matter expert on CI/CD for developer teams.
- Full-stack development background with front-end experience in Angular, jQuery, or any JS framework.
- Strong problem-solving ability, with the ability to multitask and prioritize.
- Experience in Test-Driven Development and Agile methodologies.
- Good communication skills.
- Experience using tools like Git and Jira; Confluence is a plus.
- A team player who can collaborate with all stakeholders, with strong interpersonal skills.
- Self-starter with a drive to technically mentor your cohort of developers.
Good to Have
- Experience in designing reusable and scalable architecture for cloud applications.
- Experience in managing the design and production implementation of chat and voice bots.
- Exposure to developing, maintaining, and monitoring microservices.
- Experience in one or more of: chat/voice bot development, machine learning, Natural Language Processing (NLP), and contact center technologies.
- Application security.
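For context on the serverless stack this posting references (a Cloud Function publishing to Pub/Sub), here is a minimal Python sketch. It is an illustration only, not part of the posting; the project ID, topic name, and payload fields are hypothetical, and it assumes the functions-framework and google-cloud-pubsub packages are installed.

```python
import json

import functions_framework
from google.cloud import pubsub_v1

# Hypothetical project and topic names, for illustration only.
PROJECT_ID = "my-project"
TOPIC_ID = "order-events"

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT_ID, TOPIC_ID)


@functions_framework.http
def ingest_order(request):
    """HTTP-triggered Cloud Function that validates a JSON payload
    and forwards it to a Pub/Sub topic for downstream processing."""
    payload = request.get_json(silent=True)
    if not payload or "order_id" not in payload:
        return ("Missing or invalid JSON body", 400)

    # Publish the event; .result() blocks until the message is accepted.
    future = publisher.publish(topic_path, json.dumps(payload).encode("utf-8"))
    message_id = future.result()
    return (json.dumps({"published_message_id": message_id}), 200)
```

A Dataflow or Cloud Run consumer could then subscribe to the same topic, which is the usual decoupling pattern behind the serverless integrations listed above.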
Posted 3 weeks ago
5.0 years
0 Lacs
India
Remote
Client Type: US Client
Location: Remote
The hourly rate is negotiable.

About the Role
We’re creating a new certification: Google AI Ecosystem Architect (Gemini & DeepMind) - Subject Matter Expert. This course is designed for technical learners who want to understand and apply the capabilities of Google’s Gemini models and DeepMind technologies to build powerful, multimodal AI applications. We’re looking for a Subject Matter Expert (SME) who can help shape this course from the ground up. You’ll work closely with a team of learning experience designers, writers, and other collaborators to ensure the course is technically accurate, industry-relevant, and instructionally sound.

Responsibilities
As the SME, you’ll partner with learning experience designers and content developers to:
- Translate real-world Gemini and DeepMind applications into accessible, hands-on learning for technical professionals.
- Guide the creation of labs and projects that allow learners to build pipelines for image-text fusion, deploy Gemini APIs, and experiment with DeepMind’s reinforcement learning libraries.
- Contribute technical depth across activities, from high-level course structure down to example code, diagrams, voiceover scripts, and data pipelines.
- Ensure all content reflects current, accurate usage of Google’s multimodal tools and services.
- Be available during U.S. business hours to support project milestones, reviews, and content feedback.
This role is an excellent fit for professionals with deep experience in AI/ML, Google Cloud, and a strong familiarity with multimodal systems and the DeepMind ecosystem.

Essential Tools & Platforms
A successful SME in this role will demonstrate fluency and hands-on experience with the following:
Google Cloud Platform (GCP)
- Vertex AI (particularly Gemini integration, model tuning, and multimodal deployment)
- Cloud Functions, Cloud Run (for inference endpoints)
- BigQuery and Cloud Storage (for handling large image-text datasets)
- AI Platform Notebooks or Colab Pro
Google DeepMind Technologies
- JAX and Haiku (for neural network modeling and research-grade experimentation)
- DeepMind Control Suite or DeepMind Lab (for reinforcement learning demonstrations)
- RLax or TF-Agents (for building and modifying RL pipelines)
AI/ML & Multimodal Tooling
- Gemini APIs and SDKs (image-text fusion, prompt engineering, output formatting)
- TensorFlow 2.x and PyTorch (for model interoperability)
- Label Studio, Cloud Vision API (for annotation and image-text preprocessing)
Data Science & MLOps
- DVC or MLflow (for dataset and model versioning)
- Apache Beam or Dataflow (for processing multimodal input streams)
- TensorBoard or Weights & Biases (for visualization)
Content Authoring & Collaboration
- GitHub or Cloud Source Repositories
- Google Docs, Sheets, Slides
- Screen recording tools like Loom or OBS Studio

Required skills and experience:
- Demonstrated hands-on experience building, deploying, and maintaining sophisticated AI-powered applications using Gemini APIs/SDKs within the Google Cloud ecosystem, especially in Firebase Studio and VS Code.
- Proficiency in designing and implementing agent-like application patterns, including multi-turn conversational flows, state management, and complex prompting strategies (e.g., Chain-of-Thought, few-shot, zero-shot).
- Experience integrating Gemini with Google Cloud services (Firestore, Cloud Functions, App Hosting) and external APIs for robust, production-ready solutions.
- Proven ability to engineer applications that process, integrate, and generate content across multiple modalities (text, images, audio, video, code) using Gemini’s native multimodal capabilities (a minimal sketch follows this list).
- Skilled in building and orchestrating pipelines for multimodal data handling, synchronization, and complex interaction patterns within application logic.
- Experience designing and implementing production-grade RAG systems, including integration with vector databases (e.g., Pinecone, ChromaDB) and engineering data pipelines for indexing and retrieval.
- Ability to manage agent state, memory, and persistence for multi-turn and long-running interactions.
- Proficiency leveraging AI-assisted coding features in Firebase Studio (chat, inline code, command execution) and using App Prototyping agents or frameworks like Genkit for rapid prototyping and structuring agentic logic.
- Strong command of modern development workflows, including Git/GitHub, code reviews, and collaborative development practices.
- Experience designing scalable, fault-tolerant deployment architectures for multimodal and agentic AI applications using Firebase App Hosting, Cloud Run, or similar serverless/cloud platforms.
- Advanced MLOps skills, including monitoring, logging, alerting, and versioning for generative AI systems and agents.
- Deep understanding of security best practices: prompt injection mitigation (across modalities), secure API key management, authentication/authorization, and data privacy.
- Demonstrated ability to engineer for responsible AI, including bias detection, fairness, transparency, and implementation of safety mechanisms in agentic and multimodal applications.
- Experience addressing ethical challenges in the deployment and operation of advanced AI systems.
- Proven success designing, reviewing, and delivering advanced, project-based curriculum and hands-on labs for experienced software developers and engineers.
- Ability to translate complex engineering concepts (RAG, multimodal integration, agentic patterns, MLOps, security, responsible AI) into clear, actionable learning materials and real-world projects.
- 5+ years of professional experience in AI-powered application development, with a focus on generative and multimodal AI.
- Strong programming skills in Python and JavaScript/TypeScript; experience with modern frameworks and cloud-native development.
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, AI, or a related technical field.
- Ability to explain advanced technical concepts (e.g., fusion transformers, multimodal embeddings, RAG workflows) to learners in an accessible way.
- Strong programming experience in Python and experience deploying machine learning pipelines.
- Ability to work independently, take ownership of deliverables, and collaborate closely with designers and project managers.

Preferred:
- Experience with Google DeepMind tools (JAX, Haiku, RLax, DeepMind Control Suite/Lab) and reinforcement learning pipelines.
- Familiarity with open data formats (Delta, Parquet, Iceberg) and scalable data engineering practices.
- Prior contributions to open-source AI projects or technical community engagement.
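As an illustration of the Gemini multimodal capability the posting asks for, here is a minimal Python sketch using the google-generativeai SDK. The model name, API key handling, and image file are assumptions for demonstration only; a production build would load credentials from a secret manager and add error handling.

```python
import google.generativeai as genai
from PIL import Image

# Assumption: the API key would normally come from an environment variable or secret store.
genai.configure(api_key="YOUR_API_KEY")

# Model name is an assumption; pick whichever Gemini model the course targets.
model = genai.GenerativeModel("gemini-1.5-flash")

image = Image.open("product_photo.jpg")  # hypothetical local image file
prompt = "Describe the product in this photo and draft a one-sentence listing title."

# generate_content accepts a mixed list of text and images for multimodal input.
response = model.generate_content([prompt, image])
print(response.text)
```

In a RAG-style lab, the same call would typically be preceded by a vector-database lookup whose retrieved passages are appended to the prompt list before generation.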
Posted 3 weeks ago
7.0 years
4 - 7 Lacs
Thiruvananthapuram
On-site
Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology. What you’ll do Design, develop, and operate high scale applications across the full engineering stack Design, develop, test, deploy, maintain, and improve software. Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.) Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset. Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality. Participate in a tight-knit, globally distributed engineering team. Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network, or service operations and quality. Manage sole project priorities, deadlines, and deliverables. Research, create, and develop software applications to extend and improve on Equifax Solutions Collaborate on scalability issues involving access to data and information. Actively participate in Sprint planning, Sprint Retrospectives, and other team activity What experience you need Bachelor's degree or equivalent experience 7+ years of software engineering experience 7+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript 5+ years experience with Cloud technology: GCP, AWS, or Azure 5+ years experience designing and developing cloud-native solutions 5+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes 5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs What could set you apart Self-starter that identifies/responds to priority shifts with minimal supervision. Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others UI development (e.g. HTML, JavaScript, Angular and Bootstrap) Experience with backend technologies such as JAVA/J2EE, SpringBoot, SOA and Microservices Source code control management systems (e.g. SVN/Git, Github) and build tools like Maven & Gradle. Agile environments (e.g. Scrum, XP) Relational databases (e.g. SQL Server, MySQL) Atlassian tooling (e.g. JIRA, Confluence, and Github) Developing with modern JDK (v1.7+) Automated Testing: JUnit, Selenium, LoadRunner, SoapUI
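The "what could set you apart" list above mentions big data processing with Dataflow/Apache Beam. Below is a hedged Python sketch of a small Beam batch pipeline; the bucket paths, CSV layout, and aggregation are hypothetical, and running it on Dataflow would require the usual --runner=DataflowRunner and GCP project options.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_line(line):
    # Hypothetical CSV layout: customer_id,amount
    customer_id, amount = line.split(",")
    return customer_id, float(amount)


options = PipelineOptions()  # add Dataflow runner and project flags to run on GCP

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://my-bucket/transactions.csv", skip_header_lines=1)
        | "Parse" >> beam.Map(parse_line)
        | "SumPerCustomer" >> beam.CombinePerKey(sum)
        | "Format" >> beam.MapTuple(lambda cid, total: f"{cid},{total}")
        | "Write" >> beam.io.WriteToText("gs://my-bucket/output/customer_totals")
    )
```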
Posted 3 weeks ago
0 years
13 - 18 Lacs
India
On-site
Job Title: Pharmacist – Hospital (Oman)
Location: Oman (Hospital)
Job Type: Full-Time
Contact: hr@meridiantradelinksuae.com | +971 50 663 0283
Salary: OMR 500 (INR 115,000) for Pharmacists; OMR 550–650 (INR 115,000–150,000) for Licensed Pharmacists (based on qualifications and experience)

About the Role: We are urgently hiring Pharmacists for a reputable hospital in Oman. The role involves dispensing medications accurately, counseling patients on medicine usage, and ensuring compliance with hospital and pharmacy standards.

Key Responsibilities:
- Dispense prescribed medications with accuracy.
- Counsel patients regarding safe medication usage and side effects.
- Maintain accurate records and inventory of medications.
- Collaborate with doctors and nurses for optimal patient care.

Requirements:
- Bachelor’s or Master’s in Pharmacy.
- Preferably candidates with an Oman Pharmacist License.
- Candidates who have completed Prometric and Dataflow are highly preferred.
- Ability to work efficiently in a hospital setting.
- Strong communication and patient counseling skills.

Benefits:
- Competitive tax-free salary based on license and experience.
- Opportunity to work in a reputable healthcare environment in Oman.
- Professional development opportunities.

How to Apply: Interested candidates are invited to send their updated CV to hr@meridiantradelinksuae.com with the subject line: “Pharmacist – Oman Application.”

Job Types: Full-time, Permanent
Pay: ₹115,000.00 - ₹150,000.00 per month
Posted 3 weeks ago
0 years
0 Lacs
Hyderābād
On-site
Job description
Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Senior Consultant Specialist.

In this role, you will:
The DevOps Engineering job is responsible for developing automations across the Technology delivery lifecycle, including construction, testing, release and ongoing service management, and monitoring of a product or service within a Technology team. They will be required to continually enhance their skills within a number of specialisms, which include CI/CD, automation, pipeline development, security, testing, and operational support. This role will carry out some or all of the following activities:
- Facilitate application teams across the Bank to deploy their applications across GCP services like GKE containers, BigQuery, Dataflow, Pub/Sub, and Kafka.
- Be the go-to person in case an application team faces any issue during platform adoption, onboarding, deployment, and environment troubleshooting.
- Ensure service resilience, service sustainability, and recovery time objectives are met for all the software solutions delivered.
- Be responsible for automating the continuous integration / continuous delivery pipeline within a DevOps product/service team, driving a culture of continuous improvement.
- Keep up to date and maintain expertise in current tools, technologies, and areas like cyber security and regulations pertaining to aspects such as data privacy, consent, and data residency.
- Take end-to-end accountability for a product or service, identifying and developing the most appropriate technology solutions to meet customer needs as part of the customer journey.
- Liaise with other engineers, architects, and business stakeholders to understand and drive the product or service’s direction.
- Analyze production errors to define and create tools that help mitigate problems in the system design stage, applying user-defined integrations and improving the user experience.

Requirements
To be successful in this role, you should meet the following requirements:
- Bachelor's degree in Computer Science or related disciplines
- 6 or more years of hands-on development experience building fully self-serve, observable solutions using Infrastructure as Code and Policy as Code
- Proficiency developing with modern programming languages and the ability to rapidly develop proof-of-concepts
- Ability to work with geographically distributed and cross-functional teams
- Expert in code deployment tools (Jenkins, Puppet, Ansible, Git, Selenium, and Chef)
- Expert in automation tools (CloudFormation, Terraform, shell script, Helm, Ansible)
- Familiar with containers (Docker, Docker Compose, Kubernetes, GKE)
- Familiar with monitoring (Datadog, Grafana, Prometheus, AppDynamics, New Relic, Splunk)

The successful candidate will also meet the following requirements:
- Good understanding of GCP Cloud or hybrid cloud approach implementations
- Good understanding of and experience with MuleSoft / PCF / any gateway server implementations
- Hands-on experience with the Kong API Gateway platform
- Good understanding of and experience with middleware and MQ areas
- Familiar with infrastructure support: Apache Gateway, runtime server configurations, SSL cert setup, etc.

You’ll achieve more when you join HSBC. www.hsbc.com/careers HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by – HSBC Software Development India
Posted 3 weeks ago
7.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology. What You’ll Do Design, develop, and operate high scale applications across the full engineering stack Design, develop, test, deploy, maintain, and improve software. Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.) Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset. Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality. Participate in a tight-knit, globally distributed engineering team. Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network, or service operations and quality. Manage sole project priorities, deadlines, and deliverables. Research, create, and develop software applications to extend and improve on Equifax Solutions Collaborate on scalability issues involving access to data and information. Actively participate in Sprint planning, Sprint Retrospectives, and other team activity What Experience You Need Bachelor's degree or equivalent experience 7+ years of software engineering experience 7+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript 5+ years experience with Cloud technology: GCP, AWS, or Azure 5+ years experience designing and developing cloud-native solutions 5+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes 5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs What could set you apart Self-starter that identifies/responds to priority shifts with minimal supervision. Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others UI development (e.g. HTML, JavaScript, Angular and Bootstrap) Experience with backend technologies such as JAVA/J2EE, SpringBoot, SOA and Microservices Source code control management systems (e.g. SVN/Git, Github) and build tools like Maven & Gradle. Agile environments (e.g. Scrum, XP) Relational databases (e.g. SQL Server, MySQL) Atlassian tooling (e.g. JIRA, Confluence, and Github) Developing with modern JDK (v1.7+) Automated Testing: JUnit, Selenium, LoadRunner, SoapUI
Posted 3 weeks ago
0 years
0 Lacs
Andhra Pradesh, India
On-site
Design and develop robust ETL pipelines using Python, PySpark, and GCP services. Build and optimize data models and queries in BigQuery for analytics and reporting. Ingest, transform, and load structured and semi-structured data from various sources. Collaborate with data analysts, scientists, and business teams to understand data requirements. Ensure data quality, integrity, and security across cloud-based data platforms. Monitor and troubleshoot data workflows and performance issues. Automate data validation and transformation processes using scripting and orchestration tools. Required Skills & Qualifications Hands-on experience with Google Cloud Platform (GCP), especially BigQuery. Strong programming skills in Python and/or PySpark. Experience in designing and implementing ETL workflows and data pipelines. Proficiency in SQL and data modeling for analytics. Familiarity with GCP services such as Cloud Storage, Dataflow, Pub/Sub, and Composer. Understanding of data governance, security, and compliance in cloud environments. Experience with version control (Git) and agile development practices.
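To make the ETL stack in this posting concrete, here is a minimal PySpark sketch that reads raw CSV files from Cloud Storage, applies light cleansing, and writes to BigQuery. The bucket, table, and column names are hypothetical, and the write step assumes the spark-bigquery connector is on the classpath (it ships with Dataproc by default).

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: read raw CSV landed in Cloud Storage (hypothetical bucket/path).
raw = spark.read.option("header", True).csv("gs://my-raw-bucket/orders/*.csv")

# Transform: basic typing and cleansing.
orders = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .dropna(subset=["order_id", "amount"])
)

# Load: write to BigQuery via a temporary GCS bucket (indirect write path).
(
    orders.write.format("bigquery")
          .option("table", "my-project.analytics.orders")
          .option("temporaryGcsBucket", "my-temp-bucket")
          .mode("append")
          .save()
)
```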
Posted 3 weeks ago
7.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you. Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology. What You’ll Do Design, develop, and operate high scale applications across the full engineering stack Design, develop, test, deploy, maintain, and improve software. Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.) Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset. Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality. Participate in a tight-knit, globally distributed engineering team. Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network, or service operations and quality. Manage sole project priorities, deadlines, and deliverables. Research, create, and develop software applications to extend and improve on Equifax Solutions Collaborate on scalability issues involving access to data and information. Actively participate in Sprint planning, Sprint Retrospectives, and other team activity What Experience You Need Bachelor's degree or equivalent experience 7+ years of software engineering experience 7+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript 5+ years experience with Cloud technology: GCP, AWS, or Azure 5+ years experience designing and developing cloud-native solutions 5+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes 5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs What could set you apart Self-starter that identifies/responds to priority shifts with minimal supervision. Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others UI development (e.g. HTML, JavaScript, Angular and Bootstrap) Experience with backend technologies such as JAVA/J2EE, SpringBoot, SOA and Microservices Source code control management systems (e.g. SVN/Git, Github) and build tools like Maven & Gradle. Agile environments (e.g. Scrum, XP) Relational databases (e.g. SQL Server, MySQL) Atlassian tooling (e.g. JIRA, Confluence, and Github) Developing with modern JDK (v1.7+) Automated Testing: JUnit, Selenium, LoadRunner, SoapUI We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference! Who is Equifax? At Equifax, we believe knowledge drives progress. 
As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best. Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
Posted 3 weeks ago
4.0 - 8.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
CloudWerx is looking for a dynamic SENIOR ENGINEER, DATA to become a vital part of our vibrant DATA ANALYTICS & ENGINEERING TEAM, working in HYDERABAD, INDIA. Join the energy and come be part of the momentum! As a Senior Cloud Data Engineer you will be at the forefront of cloud technology, architecting and implementing cutting-edge data solutions that drive business transformation. You'll have the opportunity to work with a diverse portfolio of clients, from innovative startups to industry leaders, solving complex data challenges using the latest GCP technologies. This role offers a unique blend of technical expertise and client interaction, allowing you to not only build sophisticated data systems but also to consult directly with clients, shaping their data strategies and seeing the real-world impact of your work. If you're passionate about pushing the boundaries of what's possible with cloud data engineering and want to be part of a team that's shaping the future of data-driven decision making, this is your chance to make a significant impact in a rapidly evolving field. Our goal is to have a sophisticated team equipped with expert technical skills in addition to keen business acumen. Each member of our team adds unique value to the business and the customer. CloudWerx is committed to a culture where we attract the best talent in the industry. We aim to be second-to-none when it comes to cloud consulting and business acceleration. This is an incredible opportunity to get involved in an engineering-focused cloud consulting company that provides the most elite technology resources to solve the toughest challenges. This role is a full-time opportunity in our Hyderabad Office.

INSIGHT ON YOUR IMPACT
- Lead technical discussions with clients, translating complex technical concepts into clear, actionable strategies that align with their business goals.
- Architect and implement innovative data solutions that transform our clients' businesses, enabling them to harness the full power of their data assets.
- Collaborate with cross-functional teams to design and optimize data pipelines that process petabytes of data, driving critical business decisions and insights.
- Mentor junior engineers and contribute to the growth of our data engineering practice, fostering a culture of continuous learning and innovation.
- Drive the adoption of cutting-edge GCP technologies, positioning our company and clients at the forefront of the cloud data revolution.
- Identify opportunities for process improvements and automation, increasing the efficiency and scalability of our consulting services.
- Collaborate with sales and pre-sales teams to scope complex data engineering projects, ensuring technical feasibility and alignment with client needs.

YOUR QUALIFICATION, YOUR INFLUENCE
To be successful in the role, you must possess the following skills:
- Proven experience (typically 4-8 years) in data engineering, with a strong focus on Google Cloud Platform technologies.
- Deep expertise in GCP data services, particularly tools like BigQuery, Cloud Composer, Cloud SQL, and Dataflow, with the ability to architect complex data solutions (a brief BigQuery sketch follows this description).
- Strong proficiency in Python and SQL, with the ability to write efficient, scalable, and maintainable code.
- Demonstrated experience in data modeling, database performance tuning, and cloud migration projects.
- Excellent communication skills, capable of explaining complex technical concepts to both technical and non-technical stakeholders.
- Proven ability to work directly with clients, understanding their business needs and translating them into technical solutions.
- Strong project management skills, including experience with Agile methodologies and tools like Jira.
- Ability to lead and mentor junior team members, fostering a culture of knowledge sharing and continuous improvement.
- Track record of staying current with emerging technologies and best practices in cloud data engineering.
- Experience working in a consulting or professional services environment, with the ability to manage multiple projects and priorities.
- Demonstrated problem-solving skills, with the ability to think creatively and innovatively to overcome technical challenges.
- Willingness to obtain relevant Google Cloud certifications if not already held.
- Ability to work collaboratively in a remote environment, with excellent time management and self-motivation skills.
- Cultural sensitivity and adaptability, with the ability to work effectively with diverse teams and clients across different time zones.

Our Diversity and Inclusion Commitment
At CloudWerx, we are dedicated to creating a workplace that values and celebrates diversity. We believe that a diverse and inclusive environment fosters innovation, collaboration, and mutual respect. We are committed to providing equal employment opportunities for all individuals, regardless of background, and actively promote diversity across all levels of our organization. We welcome all walks of life, as we are committed to building a team that embraces and mirrors a wide range of perspectives and identities. Join us in our journey toward a more inclusive and equitable workplace.

Background Check Requirement
All candidates for employment will be subject to pre-employment background screening for this position. All offers are contingent upon the successful completion of the background check. For additional information on the background check requirements and process, please reach out to us directly.

Our Story
CloudWerx is an engineering-focused cloud consulting firm born in Silicon Valley - in the heart of hyper-scale and innovative technology. In a cloud environment we help businesses looking to architect, migrate, optimize, secure or cut costs. Our team has unique experience working in some of the most complex cloud environments at scale and can help businesses accelerate with confidence.
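A hedged sketch of the kind of BigQuery work described above: a parameterized query preceded by a dry run to estimate bytes scanned, which supports the cost-management side of the role. The project, dataset, and column names are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

sql = """
    SELECT customer_id, SUM(amount) AS total_spend
    FROM `my-project.analytics.orders`
    WHERE order_date >= @start_date
    GROUP BY customer_id
"""

params = [bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01")]

# Dry run first to estimate bytes scanned before paying for the query.
dry = client.query(sql, job_config=bigquery.QueryJobConfig(query_parameters=params, dry_run=True))
print(f"Estimated bytes processed: {dry.total_bytes_processed}")

# Execute for real and iterate over the result rows.
rows = client.query(sql, job_config=bigquery.QueryJobConfig(query_parameters=params)).result()
for row in rows:
    print(row.customer_id, row.total_spend)
```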
Posted 3 weeks ago
6.0 years
0 Lacs
Greater Kolkata Area
On-site
GCP Data Engineer (6 Years Experience)
As a GCP Data Engineer, you will design, develop, and support data pipelines and related data products and platforms. Your primary responsibilities include designing and building data extraction, loading, and transformation pipelines across cloud platforms. You will perform application impact assessments, requirements reviews, and develop work estimates. Additionally, you will develop test strategies and site reliability engineering measures for data products and solutions, participate in agile development and solution reviews, mentor junior Data Engineering Specialists, lead the resolution of critical operations issues, and perform technical data stewardship tasks, including metadata management, security, and privacy by design.

Skills:
- Design, develop, and support data pipelines and related data products and platforms.
- Design and build data extraction, loading, and transformation pipelines and data products across on-prem and cloud platforms.
- Perform application impact assessments, requirements reviews, and develop work estimates.
- Develop test strategies and site reliability engineering measures for data products and solutions.
- Participate in agile development and solution reviews.
- Mentor junior Data Engineers.
- Lead the resolution of critical operations issues, including post-implementation reviews.
- Perform technical data stewardship tasks, including metadata management, security, and privacy by design.
- Design and build data extraction, loading, and transformation pipelines using Python and other GCP data technologies.
- Demonstrate SQL and database proficiency in various data engineering tasks.
- Automate data workflows by setting up DAGs in tools like Control-M, Apache Airflow, and Prefect (a minimal Airflow sketch follows this description).
- Develop Unix scripts to support various data operations.
- Model data to support business intelligence and analytics initiatives.
- Utilize infrastructure-as-code tools such as Terraform, Puppet, and Ansible for deployment automation.
- Expertise in GCP data warehousing technologies, including BigQuery, Cloud SQL, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage, IAM, Compute Engine, Cloud Data Fusion, and Dataproc (good to have).

Qualifications:
- Bachelor's degree in Software Engineering, Computer Science, Business, Mathematics, or a related field.
- 6+ years of GCP data engineering experience.
- GCP Certified Data Engineer (preferred).

(ref:hirist.tech)
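A minimal Apache Airflow DAG sketch for the workflow-automation item above. The DAG name, schedule, and task bodies are placeholders; in practice the callables would invoke real extract and load logic (for example, BigQuery load jobs triggered from Cloud Composer).

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder for pulling data from a source system.
    print("extracting source data")


def load_to_bigquery():
    # Placeholder for loading transformed data into BigQuery.
    print("loading into BigQuery")


with DAG(
    dag_id="daily_orders_pipeline",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load_to_bigquery", python_callable=load_to_bigquery)

    extract_task >> load_task
```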
Posted 3 weeks ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Skills: Python, Apache Spark, Snowflake, Kafka, Azure, Data Engineering

Overview Of The Company
Jio Platforms Ltd. is a revolutionary Indian multinational tech company, often referred to as India's biggest startup, headquartered in Mumbai. Launched in 2019, it's the powerhouse behind Jio, India's largest mobile network with over 400 million users. But Jio Platforms is more than just telecom. It's a comprehensive digital ecosystem, developing cutting-edge solutions across media, entertainment, and enterprise services through popular brands like JioMart, JioFiber, and JioSaavn. Join us at Jio Platforms and be part of a fast-paced, dynamic environment at the forefront of India's digital transformation. Collaborate with brilliant minds to develop next-gen solutions that empower millions and revolutionize industries.

Team Overview
The Data Platforms Team is the launchpad for a data-driven future, empowering the Reliance Group of Companies. We're a passionate group of experts architecting an enterprise-scale data mesh to unlock the power of big data, generative AI, and ML modelling across various domains. We don't just manage data; we transform it into intelligent actions that fuel strategic decision-making. Imagine crafting a platform that automates data flow, fuels intelligent insights, and empowers the organization: that's what we do. Join our collaborative and innovative team, and be a part of shaping the future of data for India's biggest digital revolution!

About the Role
Title: Lead Data Engineer
Location: Mumbai

Responsibilities
- End-to-End Data Pipeline Development: Design, build, optimize, and maintain robust data pipelines across cloud, on-premises, or hybrid environments, ensuring performance, scalability, and seamless data flow.
- Reusable Components & Frameworks: Develop reusable data pipeline components and contribute to the team's data pipeline framework evolution.
- Data Architecture & Solutions: Contribute to data architecture design, applying data modelling, storage, and retrieval expertise.
- Data Governance & Automation: Champion data integrity, security, and efficiency through metadata management, automation, and data governance best practices.
- Collaborative Problem Solving: Partner with stakeholders, data teams, and engineers to define requirements, troubleshoot, optimize, and deliver data-driven insights.
- Mentorship & Knowledge Transfer: Guide and mentor junior data engineers, fostering knowledge sharing and professional growth.

Qualification Details
- Education: Bachelor's degree or higher in Computer Science, Data Science, Engineering, or a related technical field.
- Core Programming: Excellent command of a primary data engineering language (Scala, Python, or Java) with a strong foundation in OOPS and functional programming concepts.
- Big Data Technologies: Hands-on experience with data processing frameworks (e.g., Hadoop, Spark, Apache Hive, NiFi, Ozone, Kudu), ideally including streaming technologies (Kafka, Spark Streaming, Flink, etc.); a minimal streaming sketch follows this description.
- Database Expertise: Excellent querying skills (SQL) and strong understanding of relational databases (e.g., MySQL, PostgreSQL). Experience with NoSQL databases (e.g., MongoDB, Cassandra) is a plus.
- End-to-End Pipelines: Demonstrated experience in implementing, optimizing, and maintaining complete data pipelines, integrating varied sources and sinks, including streaming real-time data.
- Cloud Expertise: Knowledge of cloud technologies like Azure HDInsight, Synapse, EventHub and GCP Dataproc, Dataflow, BigQuery.
- CI/CD Expertise: Experience with CI/CD methodologies and tools, including strong Linux and shell scripting skills for automation.

Desired Skills & Attributes
- Problem-Solving & Troubleshooting: Proven ability to analyze and solve complex data problems and troubleshoot data pipeline issues effectively.
- Communication & Collaboration: Excellent communication skills, both written and verbal, with the ability to collaborate across teams (data scientists, engineers, stakeholders).
- Continuous Learning & Adaptability: A demonstrated passion for staying up-to-date with emerging data technologies and a willingness to adapt to new tools.
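To illustrate the streaming technologies mentioned in the qualifications (Kafka consumed via Spark Structured Streaming), here is a hedged PySpark sketch. Broker addresses, topic, and sink paths are hypothetical, and the job assumes the spark-sql-kafka package is on the classpath.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("clickstream-ingest").getOrCreate()

events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker-1:9092")  # hypothetical brokers
         .option("subscribe", "clickstream")                   # hypothetical topic
         .option("startingOffsets", "latest")
         .load()
)

# Kafka delivers key/value as binary; cast the value to string for downstream parsing.
decoded = events.selectExpr("CAST(value AS STRING) AS json_payload", "timestamp")

query = (
    decoded.writeStream.format("parquet")
           .option("path", "/data/clickstream/raw")            # hypothetical sink path
           .option("checkpointLocation", "/data/clickstream/_checkpoints")
           .outputMode("append")
           .start()
)
query.awaitTermination()
```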
Posted 3 weeks ago
25.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
The Company PayPal has been revolutionizing commerce globally for more than 25 years. Creating innovative experiences that make moving money, selling, and shopping simple, personalized, and secure, PayPal empowers consumers and businesses in approximately 200 markets to join and thrive in the global economy. We operate a global, two-sided network at scale that connects hundreds of millions of merchants and consumers. We help merchants and consumers connect, transact, and complete payments, whether they are online or in person. PayPal is more than a connection to third-party payment networks. We provide proprietary payment solutions accepted by merchants that enable the completion of payments on our platform on behalf of our customers. We offer our customers the flexibility to use their accounts to purchase and receive payments for goods and services, as well as the ability to transfer and withdraw funds. We enable consumers to exchange funds more safely with merchants using a variety of funding sources, which may include a bank account, a PayPal or Venmo account balance, PayPal and Venmo branded credit products, a credit card, a debit card, certain cryptocurrencies, or other stored value products such as gift cards, and eligible credit card rewards. Our PayPal, Venmo, and Xoom products also make it safer and simpler for friends and family to transfer funds to each other. We offer merchants an end-to-end payments solution that provides authorization and settlement capabilities, as well as instant access to funds and payouts. We also help merchants connect with their customers, process exchanges and returns, and manage risk. We enable consumers to engage in cross-border shopping and merchants to extend their global reach while reducing the complexity and friction involved in enabling cross-border trade. Our beliefs are the foundation for how we conduct business every day. We live each day guided by our core values of Inclusion, Innovation, Collaboration, and Wellness. Together, our values ensure that we work together as one global team with our customers at the center of everything we do – and they push us to ensure we take care of ourselves, each other, and our communities. Job Description Summary: What you need to know about the role As a senior member in Risk & Servicing Business Intelligence team, you will be responsible for building datasets that are key for measuring vital servicing and risk metrics for the company. You will get exposure to solving complex problems in big data environment, use data to provide meaningful insights and drive implementable solutions that are cloud optimized. This is a great opportunity to be part of critical business decisions being made and delight our customers with the best experiences. Willingness to mentor & develop more junior members on the team is an essential component for this role. Meet our team Business Intelligence team within Global Financial Crimes and Customer Protection organization caters to all risk and servicing related analytical needs for the organization. We enable business to take informed data driven decisions. We collaborate with a wide variety of partners in the company, including product experts, risk management specialists and finance specialists to bring our analytical insights to life, impacting the experience and security of millions of users around the globe. 
Job Description:

Your way to impact
Global Financial Crimes and Customer Protection (GFCCP) is responsible for fulfilling PayPal’s commitment to combat money laundering, terrorism financing, and related financial crimes around the world, including sanctions enforcement. This is achieved through a global network of best-in-class investigators and specialized teams dedicated to the development and implementation of strategic policies, advanced analytics, robust reporting, and cutting-edge technology.

Your day-to-day
- Dive into extensive datasets, defining ETL pipelines to build datasets for tracking essential KPIs to drive actionable insights for informed business decisions.
- Develop controls and monitoring mechanisms to ensure performance and quality of the pipelines, data, and its sources against business goals, regulatory requirements, and business priorities.
- Drive and lead data governance to ensure sanity of the data and its controls.
- Architect and oversee the entire data infrastructure lifecycle, ensuring seamless automation of processes from end to end, ensuring efficiency and accuracy.
- Lead or participate in ad-hoc projects and initiatives related to risk management and operations optimization, leveraging data-driven approaches to address specific business challenges or opportunities.
- Work on assignments of intermediate complexity with minimum supervision and constantly seek improvement within defined tasks.
- Good understanding of general business trends and directions to be able to put your own work in a broad business context.
- Drive effective communication with collaborators to achieve the optimal level of decisioning and business results.
- Ability to facilitate and collaborate in proactive derivation of actionable insights to enhance and influence operational decision making.

What do you need to bring
- 5+ years of experience in data engineering with expertise in distributed computing (cloud/on-prem).
- High proficiency in BigQuery, SQL, Python, Spark.
- Familiarity with core Google Cloud Platform (GCP) data services, such as BigQuery, Cloud Storage, and potentially Dataflow or Dataproc.
- Proficiency in data modelling and documentation.
- Experience in leading and delivering end-to-end Business Intelligence solutions (requirement analysis, analyzing large, multi-dimensional data sets, and synthesizing quality checks into actions).
- Strong acquaintance with statistics; critical and analytical thinking, with a focus on problem-solving and attention to detail.
- Industry experience in payments, e-commerce, or financial services is an advantage.
- Highly motivated, result-oriented self-starter who enjoys working in a fast-paced environment.
- Proven ability to function well independently as well as in a team.
- Strong interpersonal skills, including the ability to present insights and recommendations persuasively. Skill and confidence in dealing with people at all levels of the organization are essential.
- Clear, strategic thinker with vision, with the ability to execute on priorities.

"We know the confidence gap and imposter syndrome can get in the way of meeting spectacular candidates. Please don’t hesitate to apply."

For the majority of employees, PayPal's balanced hybrid work model offers 3 days in the office for effective in-person collaboration and 2 days at your choice of either the PayPal office or your home workspace, ensuring that you equally have the benefits and conveniences of both locations.

Our Benefits: At PayPal, we’re committed to building an equitable and inclusive global economy.
And we can’t do this without our most important asset—you. That’s why we offer benefits to help you thrive in every stage of life. We champion your financial, physical, and mental health by offering valuable benefits and resources to help you care for the whole you. We have great benefits including a flexible work environment, employee shares options, health and life insurance and more. To learn more about our benefits please visit https://www.paypalbenefits.com Who We Are: To learn more about our culture and community visit https://about.pypl.com/who-we-are/default.aspx Commitment to Diversity and Inclusion PayPal provides equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, pregnancy, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic protected by federal, state, or local law. In addition, PayPal will provide reasonable accommodations for qualified individuals with disabilities. If you are unable to submit an application because of incompatible assistive technology or a disability, please contact us at paypalglobaltalentacquisition@paypal.com. Belonging at PayPal: Our employees are central to advancing our mission, and we strive to create an environment where everyone can do their best work with a sense of purpose and belonging. Belonging at PayPal means creating a workplace with a sense of acceptance and security where all employees feel included and valued. We are proud to have a diverse workforce reflective of the merchants, consumers, and communities that we serve, and we continue to take tangible actions to cultivate inclusivity and belonging at PayPal. Any general requests for consideration of your skills, please Join our Talent Community. We know the confidence gap and imposter syndrome can get in the way of meeting spectacular candidates. Please don’t hesitate to apply. REQ ID R0128180
Posted 3 weeks ago
4.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
If you are looking for a career at a dynamic company with a people-first mindset and a deep culture of growth and autonomy, ACV is the right place for you! Competitive compensation packages and learning and development opportunities, ACV has what you need to advance to the next level in your career. We will continue to raise the bar every day by investing in our people and technology to help our customers succeed. We hire people who share our passion, bring innovative ideas to the table, and enjoy a collaborative atmosphere. Who We Are ACV is a technology company that has revolutionized how dealers buy and sell cars online. We are transforming the automotive industry. ACV Auctions Inc. (ACV), has applied innovation and user-designed, data driven applications and solutions. We are building the most trusted and efficient digital marketplace with data solutions for sourcing, selling and managing used vehicles with transparency and comprehensive insights that were once unimaginable. We are disruptors of the industry and we want you to join us on our journey. Our network of brands include ACV Auctions, ACV Transportation, ClearCar, MAX Digital and ACV Capital within its Marketplace Products, as well as, True360 and Data Services. ACV Auctions in Chennai, India are looking for talented individuals to join our team. As we expand our platform, we're offering a wide range of exciting opportunities across various roles in corporate, operations, and product and technology. Our global product and technology organization spans product management, engineering, data science, machine learning, DevOps and program leadership. What unites us is a deep sense of customer centricity, calm persistence in solving hard problems, and a shared passion for innovation. If you're looking to grow, lead, and contribute to something larger than yourself, we'd love to have you on this journey. Let's build something extraordinary together. Join us in shaping the future of automotive! At ACV we focus on the Health, Physical, Financial, Social and Emotional Wellness of our Teammates and to support this we offer industry leading benefits and wellness programs. Who We Are Looking For The data engineering team's mission is to provide high availability and high resiliency as a core service to our ACV applications. The team is responsible for ETL’s using different ingestion and transformation techniques. We are responsible for a range of critical tasks aimed at ensuring smooth and efficient functioning and high availability of ACVs data platforms. We are a crucial bridge between Infrastructure Operations, Data Infrastructure, Analytics, and Development teams providing valuable feedback and insights to continuously improve platform reliability, functionality, and overall performance. We are seeking a talented data professional as a Senior Data Engineer to join our Data Engineering team. This role requires a strong focus and experience in software development, multi-cloud based technologies, in memory data stores, and a strong desire to learn complex systems and new technologies. It requires a sound foundation in database and infrastructure architecture, deep technical knowledge, software development, excellent communication skills, and an action-based philosophy to solve hard software engineering problems. What You Will Do As a Data Engineer at ACV Auctions you HAVE FUN !! You will design, develop, write, and modify code. You will be responsible for development of ETLs, application architecture, optimizing databases & SQL queries. 
You will work alongside other data engineers and data scientists in the design and development of solutions to ACV’s most complex software problems. It is expected that you will be able to operate in a high performing team, that you can balance high quality delivery with customer focus, and that you will have a record of delivering and guiding team members in a fast-paced environment. Design, develop, and maintain scalable ETL pipelines using Python and SQL to ingest, process, and transform data from diverse sources. Write clean, efficient, and well-documented code in Python and SQL. Utilize Git for version control and collaborate effectively with other engineers. Implement and manage data orchestration workflows using industry-standard orchestration tools (e.g., Apache Airflow, Prefect).. Apply a strong understanding of major data structures (arrays, dictionaries, strings, trees, nodes, graphs, linked lists) to optimize data processing and storage. Support multi-cloud application development. Contribute, influence, and set standards for all technical aspects of a product or service including but not limited to, testing, debugging, performance, and languages. Support development stages for application development and data science teams, emphasizing in MySQL and Postgres database development. Influence company wide engineering standards for tooling, languages, and build systems. Leverage monitoring tools to ensure high performance and availability; work with operations and engineering to improve as required. Ensure that data development meets company standards for readability, reliability, and performance. Collaborate with internal teams on transactional and analytical schema design. Conduct code reviews, develop high-quality documentation, and build robust test suites Respond-to and troubleshoot highly complex problems quickly, efficiently, and effectively. Mentor junior data engineers. Assist/lead technical discussions/innovation including engineering tech talks Assist in engineering innovations including discovery of new technologies, implementation strategies, and architectural improvements. Participate in on-call rotation What You Will Need Bachelor’s degree in computer science, Information Technology, or a related field (or equivalent work experience) Ability to read, write, speak, and understand English. 4+ years of experience programming in Python 3+ years of experience with ETL workflow implementation (Airflow, Python) 3+ years work with continuous integration and build tools. 3+ years of experience with Cloud platforms preferably in AWS or GCP Knowledge of database architecture, infrastructure, performance tuning, and optimization techniques. Deep Knowledge in day-day tools and how they work including deployments, k8s, monitoring systems, and testing tools. Proficient in databases (RDB), SQL, and can contribute to schema definitions. Self-sufficient debugger who can identify and solve complex problems in code. Deep understanding of major data structures (arrays, dictionaries, strings). Experience with Domain Driven Design. Experience with containers and Kubernetes. Experience with database monitoring and diagnostic tools, preferably Data Dog. Hands-on skills and the ability to drill deep into the complex system design and implementation. Proficiency in SQL query writing and optimization. Experience with database security principles and best practices. 
What You Will Need
Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience).
Ability to read, write, speak, and understand English.
4+ years of experience programming in Python.
3+ years of experience with ETL workflow implementation (Airflow, Python).
3+ years of experience with continuous integration and build tools.
3+ years of experience with cloud platforms, preferably AWS or GCP.
Knowledge of database architecture, infrastructure, performance tuning, and optimization techniques.
Deep knowledge of day-to-day tools and how they work, including deployments, Kubernetes, monitoring systems, and testing tools.
Proficient in relational databases and SQL; able to contribute to schema definitions.
Self-sufficient debugger who can identify and solve complex problems in code.
Deep understanding of major data structures (arrays, dictionaries, strings).
Experience with Domain-Driven Design.
Experience with containers and Kubernetes.
Experience with database monitoring and diagnostic tools, preferably Datadog.
Hands-on skills and the ability to drill deep into complex system design and implementation.
Proficiency in SQL query writing and optimization.
Experience with database security principles and best practices.
Experience with in-memory data processing.
Experience working with data warehousing concepts and technologies, including dimensional modeling and ETL frameworks.
Strong communication and collaboration skills, with the ability to work effectively in a fast-paced global team environment.
Experience working with:
SQL data-layer development; OLTP schema design.
Using and integrating with cloud services, specifically AWS RDS, Aurora, S3, and GCP.
GitHub, Jenkins, Python, Docker, Kubernetes.

Nice To Have Qualifications
Experience with Airflow, Docker, Visual Studio, PyCharm, Redis, Kubernetes, Fivetran, Spark, Dataflow, Dataproc, EMR.
Experience with database monitoring and diagnostic tools, preferably Datadog.
Hands-on experience with Kafka or other event-streaming technologies.
Hands-on experience with microservice architecture.

Our Values
Trust & Transparency | People First | Positive Experiences | Calm Persistence | Never Settling

At ACV, we are committed to an inclusive culture in which every individual is welcomed and empowered to celebrate their true selves. We achieve this by fostering a work environment of acceptance and understanding that is free from discrimination. ACV is committed to being an equal opportunity employer regardless of sex, race, creed, color, religion, marital status, national origin, age, pregnancy, sexual orientation, gender, gender identity, gender expression, genetic information, disability, military status, status as a veteran, or any other protected characteristic. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. If you have a disability or special need that requires reasonable accommodation, please let us know.

Data Processing Consent
When you apply to a job on this site, the personal data contained in your application will be collected by ACV Auctions Inc. and/or one of its subsidiaries ("ACV Auctions"). By clicking "apply", you hereby provide your consent to ACV Auctions and/or its authorized agents to collect and process your personal data for the purpose of your recruitment at ACV Auctions and processing your job application. ACV Auctions may use services provided by a third-party service provider to help manage its recruitment and hiring process. For more information about how your personal data will be processed by ACV Auctions and any rights you may have, please review ACV Auctions' candidate privacy notice here. If you have any questions about our privacy practices, please contact datasubjectrights@acvauctions.com.
Posted 3 weeks ago
5.0 - 10.0 years
15 - 30 Lacs
Bengaluru
Hybrid
Job Description:
We are seeking an experienced Data Engineer for a contract/consulting engagement to design, build, and maintain scalable data infrastructure using Google Cloud Platform technologies and advanced analytics visualization. This role requires 5-8 years of hands-on experience in modern data engineering practices with a strong focus on cloud-native solutions and business intelligence.

Key Responsibilities:
Data Infrastructure & Engineering (70%)
Design tables and work with complex queries in Google BigQuery.
Build and maintain data transformation workflows using Dataflow and Dataform.
Design and implement robust data pipelines using Apache Airflow for workflow orchestration and scheduling.
Architect scalable ETL/ELT processes handling large-scale data ingestion from multiple sources.
Optimize BigQuery performance through partitioning, clustering, and cost management strategies (see the sketch after this posting).
Collaborate with DevOps teams to implement CI/CD pipelines for data infrastructure.
Solid technical background with a complete understanding of data warehouse modeling, architectures, and OLAP/OLTP data sets.
Experience with Java or Python is a plus.

Analytics & Visualization (30%)
Create compelling data visualizations and interactive dashboards using Tableau.
Design Tableau models with both Live and Extract data sources; experience with Tableau Prep is good to have.
Partner with business stakeholders to translate requirements into analytical solutions.
Design and implement self-service analytics capabilities for end users.
Optimize Tableau workbooks for performance and user experience.
Integrate Tableau with BigQuery for real-time analytics and reporting.

Technical Skills
Core Data Engineering (Must Have)
5-8 years of progressive experience in data engineering roles.
Expert-level proficiency in SQL with complex query optimization experience.
Hands-on experience with Google BigQuery for data warehousing and analytics.
Proven experience with Apache Airflow for workflow orchestration and pipeline management.
Working knowledge of Dataflow and Dataform for data transformation and modeling.
Experience with GCP services: Cloud Storage, Pub/Sub, Cloud Functions, Cloud Composer.

Visualization & Analytics
Strong proficiency in Tableau for data modeling, data visualization, and dashboard development.
Experience integrating Tableau with cloud data platforms.
Understanding of data visualization best practices and UX principles.
Knowledge of Tableau Server/Cloud administration and governance.

Additional Technical Requirements
Experience with version control systems (Git) and collaborative development practices.
Knowledge of data modeling techniques (dimensional modeling, data vault).
Understanding of data governance, security, and compliance frameworks.
Experience with infrastructure as code (Terraform preferred).
Familiarity with scripting languages (Python/Java) for data processing.

Preferred Qualifications
Google Cloud Professional Data Engineer certification.
Tableau Desktop Certified Professional or equivalent certification.
Experience with real-time data processing and streaming analytics.
Knowledge of machine learning workflows and MLOps practices.
Previous experience in agile development environments.
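For context on the partitioning and clustering point above, here is a minimal sketch using the google-cloud-bigquery client to create a date-partitioned, clustered table; the project, dataset, and column names are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")  # hypothetical project

schema = [
    bigquery.SchemaField("event_date", "DATE"),
    bigquery.SchemaField("customer_id", "STRING"),
    bigquery.SchemaField("revenue", "NUMERIC"),
]

table = bigquery.Table("my-analytics-project.sales.daily_events", schema=schema)
# Partition by date so queries can prune partitions and control scan cost.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_date",
)
# Cluster on a frequently filtered column to reduce bytes scanned further.
table.clustering_fields = ["customer_id"]

table = client.create_table(table)
print(f"Created {table.full_table_id}")
```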
Posted 3 weeks ago
8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Summary
We are looking for a seasoned Project Manager with a strong background in Google Cloud Platform (GCP) and DevOps methodologies. The ideal candidate will be responsible for planning, executing, and finalizing projects according to strict deadlines and within budget. This includes acquiring resources and coordinating the efforts of team members and third-party contractors or consultants in order to deliver projects according to plan. The GCP DevOps Project Manager will also define the project's objectives and oversee quality control throughout its life cycle.

Key Responsibilities:
Lead end-to-end planning, execution, and delivery of Data Foundation initiatives across multiple workstreams (e.g., Data Lake, Observability, IAM, Metadata, Ingestion Pipelines).
Coordinate across platform, engineering, data governance, cloud infrastructure, and business teams to ensure alignment on scope, dependencies, and delivery timelines.
Own program-level tracking of deliverables, milestones, risks, and mitigation plans.
Drive platform enablement efforts (e.g., GCP/AWS setup, Kafka, BigQuery, Snowflake, IAM, monitoring tooling) and ensure their operational readiness.
Manage stakeholder communications, steering committee updates, and executive reporting.
Define and manage program OKRs, KPIs, and success metrics.
Lead technical discussions to assess readiness, unblock execution, and ensure architectural alignment.
Support cross-team collaboration on data security, access management, observability (Grafana, Prometheus, SIEM), and operational automation.
Manage vendor relationships and coordinate delivery with third-party partners where applicable.

Required Skills and Qualifications
8+ years of experience in Technical Program Management or Engineering Program Management roles.
Proven experience in leading data platform or data foundation programs in a cloud-native environment (GCP, AWS, or Azure).
Strong knowledge of data platform components: data lakes, ingestion pipelines, metadata tools (e.g., Marquez, Collibra), observability (Grafana, Prometheus), lineage, and data access governance.
Experience working with DevOps, Security, and Architecture teams to align on infrastructure and platform requirements.
Familiarity with Agile/Scrum methodologies, Jira/Confluence, and project tracking tools.
Excellent communication, stakeholder management, and leadership skills.

Preferred Qualifications:
Experience with GCP-native data services (BigQuery, Dataflow, Dataproc, Pub/Sub).
Working knowledge of IAM models, RBAC/ABAC, and cloud-native security controls.
Certification in cloud platforms (GCP, AWS, or Azure) or PMP/CSM.
Exposure to DataOps, CI/CD pipelines, and infrastructure-as-code tools (e.g., Terraform).

Thanks & Regards
Prashant Awasthi
Vastika Technologies PVT LTD
9711189829
Posted 3 weeks ago
3.0 - 8.0 years
7 - 13 Lacs
Hyderabad
Work from Office
Role: Machine Learning Engineer

Required Skills & Experience
3+ years of hands-on experience in building, training, and deploying machine learning models in a professional, production-oriented setting (a minimal training-and-persistence sketch follows this posting).
Demonstrable experience with database creation and advanced querying (e.g., SQL, NoSQL), with a strong understanding of data warehousing concepts.
Proven expertise in data blending, transformation, and feature engineering, adept at integrating and harmonizing both structured (e.g., relational databases, CSVs) and unstructured (e.g., text, logs, images) data.
Strong practical experience with cloud platforms for machine learning development and deployment; significant experience with Google Cloud Platform (GCP) services (e.g., Vertex AI, BigQuery, Dataflow) is highly desirable.
Proficiency in programming languages commonly used in data science (Python preferred; R).
Solid understanding of various machine learning algorithms (e.g., regression, classification, clustering, dimensionality reduction) and experience with advanced techniques such as deep learning, Natural Language Processing (NLP), or computer vision.
Experience with machine learning libraries and frameworks (e.g., scikit-learn, TensorFlow, PyTorch).
Familiarity with MLOps tools and practices, including model versioning, monitoring, A/B testing, and continuous integration/continuous deployment (CI/CD) pipelines.
Experience with containerization technologies like Docker and orchestration tools like Kubernetes for deploying ML models as REST APIs.
Proficiency with version control systems (e.g., Git, GitHub/GitLab) for collaborative development.

Interested candidates, please share your CV with dikshith.nalapatla@motivitylabs.com
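As a small, hedged illustration of the train-and-deploy workflow this role describes, the sketch below trains a scikit-learn classifier on stand-in data and persists it with joblib, the kind of artifact a containerized REST service (e.g., FastAPI behind Docker/Kubernetes) would load; the dataset, hyperparameters, and file path are placeholders.

```python
import joblib
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Stand-in data; in practice features would come from BigQuery or a feature store.
X, y = make_classification(n_samples=1_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print("holdout accuracy:", accuracy_score(y_test, model.predict(X_test)))

# Persist the fitted model; the serving container would load this file at startup.
joblib.dump(model, "model.joblib")
```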
Posted 3 weeks ago
35.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
About us
One team. Global challenges. Infinite opportunities. At Viasat, we’re on a mission to deliver connections with the capacity to change the world. For more than 35 years, Viasat has helped shape how consumers, businesses, governments and militaries around the globe communicate. We’re looking for people who think big, act fearlessly, and create an inclusive environment that drives positive impact to join our team.

What you'll do
The Enterprise Architecture team is focused on providing solutions to enable an effective software engineering workforce that can scale to the business needs. This includes exploring how the business needs map to the application portfolio, business processes, APIs, and data elements across the organization. As a member of this team, you will build up a vast knowledge in software development, cloud application engineering, automation, and container orchestration. Our ideal candidate values communication, learning, adaptability, creativity, and ingenuity. They also enjoy working on challenging technical issues and use creative, innovative techniques to develop and automate solutions. This team is focused on providing our executive and business leadership with visibility into how the software organization is functioning and what opportunities lie ahead to transform the business. In this position you will be an integral part of the Enterprise Architecture team and make meaningful impacts in our journey towards digital transformation.

The day-to-day
A strong data model and high-quality data are prerequisites for better insights and solid data-driven decision making. They are also key to taking advantage of technological advances in Artificial Intelligence and Machine Learning. Your responsibilities will involve building out data models for various aspects of our enterprise in conjunction with domain experts. Examples include, but are not limited to, Network, Capacity, Finance, and Business Support Systems. Responsibilities also include working with software product teams to improve data quality across the organization.

What you'll need
Bachelor's degree or higher in Computer Science & Applications, Computer Science and Computer & Systems Engineering, Computer Science & Engineering, Computer Science & Mathematics, Computer Science & Network Security, Math & Computer Science, and/or a related field.
Solid understanding of Data Architecture and Data Engineering principles.
Experience building out data models.
Experience performing data analysis and presenting data in an easy-to-comprehend manner.
Experience working with relational databases, NoSQL, and large-scale data technologies (Kafka, BigQuery, Snowflake, etc.).
Experience with digital transformation across multiple cloud platforms like AWS and GCP; experience modernizing data platforms, especially in GCP, is highly preferred.
Partner with members of the Data Platform team and others to build out a Data Catalog and map it to the data model.
Detail oriented, to ensure that the catalog represents quality data.
Solid communication skills and ability to work on a distributed team.
Tenacity to remain focused on the mission and overcome obstacles.
Ability to perform hands-on work with development teams and guide them in building the necessary data models.
Experience setting up governance structures and changing the organization culture by influence.

What will help you on the job
Experience with cloud technologies: AWS, GCP, and/or Azure, etc.
Expertise in GCP data services like Cloud Pub/Sub, Dataproc, Dataflow, BigQuery, and related technologies preferred.
Experience with Airflow, DBT, and SQL.
Experience with open-source software like Logstash, the ELK stack, Telegraf, Prometheus, and OpenTelemetry is a plus.
Passionate about delivering solutions that improve developer experience and promote API-first principles and microservices architecture.
Experience with Enterprise Architecture and related principles.
Posted 3 weeks ago
0 years
0 Lacs
India
On-site
Key Requirements

Technical Skills
Expert in GCP services including BigQuery, Dataflow, Pub/Sub, Cloud Composer, Cloud Storage, and Cloud Functions.
GCP Professional Data Engineer Certification is highly favourable.
Advanced knowledge of SQL for complex data transformation and query optimization.
Proven experience in Python for scalable data pipeline development and orchestration following best practices.
Experience implementing Terraform for Infrastructure as Code (IaC) to automate GCP resource management.
Knowledge of CI/CD pipelines and automated deployment practices.
Experience with containerization technologies (e.g., Docker, Kubernetes).
Experience building and optimizing batch and streaming data pipelines.
Understanding of data governance principles, GCP security (IAM, VPC), and compliance requirements.

Soft Skills
Demonstrates a growth mindset by actively seeking to learn from peers and stakeholders, fostering a culture of open communication and shared knowledge.
Works effectively across teams, including Data Science, Engineering, and Analytics, to understand their needs and deliver impactful data solutions.
Actively participates in design discussions, brainstorming sessions, and cross-functional projects, always striving for continuous improvement and innovation.
Builds strong relationships across the organization, using empathy and active listening to ensure alignment on goals and deliverables.
Approaches challenges with a growth mindset, viewing obstacles as opportunities to innovate and improve processes.
Applies a structured and analytical approach to solving complex problems, balancing immediate needs with long-term scalability and efficiency.
Demonstrates resilience under pressure, maintaining a positive and solution-focused attitude when faced with tight deadlines or ambiguity.
Actively seeks feedback and lessons learned from past projects to continuously refine problem-solving strategies and improve outcomes.
Shares expertise generously, guiding team members in adopting best practices and helping them overcome technical challenges.
Leads by example, demonstrating how to approach complex problems pragmatically while promoting curiosity and a willingness to explore new tools and technologies.
Encourages professional development within the team, supporting individuals in achieving their career goals and obtaining certifications, especially within the Google Cloud ecosystem.

Main duties and responsibilities
Design, develop, and maintain scalable data pipelines using modern data engineering tools and technologies on our GCP stack (a minimal pipeline sketch follows this posting).
Build and optimize our lake house on Google Cloud Platform (GCP).
Implement data ingestion, transformation, and loading processes for various data sources (e.g., databases, APIs, cloud storage).
Ensure data quality, consistency, and security throughout the data pipeline.
Leverage GCP services (e.g., Dataflow, Dataproc, BigQuery, Cloud Storage) to build and maintain cloud-native data solutions.
Implement infrastructure as code (IaC) principles using Terraform to automate provisioning and configuration.
Manage and optimize cloud resources to ensure cost-efficiency and performance.
Design and implement efficient data models following a star schema approach to support analytical and operational workloads.
Collaborate with data analysts to develop advanced analytics solutions.
Conduct data quality analysis to drive better data management of outputs in our Curated Layer.
Mentor junior data engineers and provide technical guidance.
Contribute to the development of data engineering best practices and standards.
Collaborate with cross-functional teams to deliver complex data projects.
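To make the batch/streaming pipeline duties above concrete, here is a minimal Apache Beam sketch that could run locally or on Dataflow; the bucket paths, column positions, and filter condition are hypothetical.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Pass --runner=DataflowRunner, --project, --region, etc. to execute on GCP.
options = PipelineOptions()

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/raw/orders.csv", skip_header_lines=1)
        | "Parse" >> beam.Map(lambda line: line.split(","))
        | "KeepPaid" >> beam.Filter(lambda row: row[3] == "PAID")   # hypothetical status column
        | "Project" >> beam.Map(lambda row: f"{row[0]},{row[2]}")   # keep order id and amount
        | "Write" >> beam.io.WriteToText("gs://example-bucket/curated/paid_orders")
    )
```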
Posted 3 weeks ago
5.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
What you'll do
Design, develop, and operate high-scale applications across the full engineering stack.
Design, develop, test, deploy, maintain, and improve software.
Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.).
Work across teams to integrate our systems with existing internal systems, Data Fabric, and the CSA Toolset.
Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality.
Participate in a tight-knit, globally distributed engineering team.
Triage product or system issues and debug/track/resolve them by analyzing the sources of issues and their impact on network or service operations and quality.
Research, create, and develop software applications to extend and improve on Equifax solutions.
Manage sole project priorities, deadlines, and deliverables.
Collaborate on scalability issues involving access to data and information.
Actively participate in Sprint planning, Sprint retrospectives, and other team activities.

What Experience You Need
Bachelor's degree or equivalent experience.
5+ years of relevant software engineering experience.
5+ years of experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, and CSS.
5+ years of experience with cloud technology: GCP, AWS, or Azure.
5+ years of experience designing and developing cloud-native solutions.
5+ years of experience designing and developing microservices using Java, SpringBoot, GCP SDKs, and GKE/Kubernetes.
5+ years of experience deploying and releasing software using Jenkins CI/CD pipelines, with an understanding of infrastructure-as-code concepts, Helm Charts, and Terraform constructs.

What could set you apart
A self-starter who identifies and responds to priority shifts with minimal supervision.
Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others (a minimal Pub/Sub sketch follows this posting).
UI development (e.g., HTML, JavaScript, Angular, and Bootstrap).
Experience with backend technologies such as Java/J2EE, SpringBoot, SOA, and microservices.
Source code control management systems (e.g., SVN/Git, GitHub) and build tools like Maven and Gradle.
Agile environments (e.g., Scrum, XP).
Relational databases (e.g., SQL Server, MySQL).
Atlassian tooling (e.g., JIRA, Confluence, and GitHub).
Developing with a modern JDK (v1.7+).
Automated testing: JUnit, Selenium, LoadRunner, SoapUI.
Cloud certification strongly preferred.
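As a brief illustration of the Pub/Sub item in the list above, this sketch publishes a JSON event with the official google-cloud-pubsub client; the project, topic, and payload are hypothetical.

```python
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "order-events")  # hypothetical topic

payload = json.dumps({"order_id": "1234", "status": "CREATED"}).encode("utf-8")
# Attributes (e.g. source) ride along with the message for filtering downstream.
future = publisher.publish(topic_path, data=payload, source="checkout-service")
print("published message id:", future.result())
```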
Posted 3 weeks ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
About the Company
Strong experience in Big Data: data modelling, design, architecting, and solutioning.

About the Role
Understands programming languages like SQL, Python, and R/Scala.

Responsibilities
Good Python skills.
Experience with data visualisation tools such as Google Data Studio or Power BI.
Knowledge of A/B testing, statistics, Google Cloud Platform, Google BigQuery, agile development, DevOps, data engineering, and ETL data processing.
Strong experience migrating a production Hadoop cluster to Google Cloud.

Required Skills (Good To Have)
Expert in BigQuery, Dataproc, Data Fusion, Dataflow, Bigtable, Firestore, Cloud SQL, Cloud Spanner, Google Cloud Storage, Cloud Composer, Cloud Interconnect, etc.
Posted 3 weeks ago
5.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.

What you'll do
Design, develop, and operate high-scale applications across the full engineering stack.
Design, develop, test, deploy, maintain, and improve software.
Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.).
Work across teams to integrate our systems with existing internal systems, Data Fabric, and the CSA Toolset.
Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality.
Participate in a tight-knit, globally distributed engineering team.
Triage product or system issues and debug/track/resolve them by analyzing the sources of issues and their impact on network or service operations and quality.
Research, create, and develop software applications to extend and improve on Equifax solutions.
Manage sole project priorities, deadlines, and deliverables.
Collaborate on scalability issues involving access to data and information.
Actively participate in Sprint planning, Sprint retrospectives, and other team activities.

What Experience You Need
Bachelor's degree or equivalent experience.
5+ years of relevant software engineering experience.
5+ years of experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, and CSS.
5+ years of experience with cloud technology: GCP, AWS, or Azure.
5+ years of experience designing and developing cloud-native solutions.
5+ years of experience designing and developing microservices using Java, SpringBoot, GCP SDKs, and GKE/Kubernetes.
5+ years of experience deploying and releasing software using Jenkins CI/CD pipelines, with an understanding of infrastructure-as-code concepts, Helm Charts, and Terraform constructs.

What could set you apart
A self-starter who identifies and responds to priority shifts with minimal supervision.
Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others.
UI development (e.g., HTML, JavaScript, Angular, and Bootstrap).
Experience with backend technologies such as Java/J2EE, SpringBoot, SOA, and microservices.
Source code control management systems (e.g., SVN/Git, GitHub) and build tools like Maven and Gradle.
Agile environments (e.g., Scrum, XP).
Relational databases (e.g., SQL Server, MySQL).
Atlassian tooling (e.g., JIRA, Confluence, and GitHub).
Developing with a modern JDK (v1.7+).
Automated testing: JUnit, Selenium, LoadRunner, SoapUI.
Cloud certification strongly preferred.

We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!

Who is Equifax?
At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car.

Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best.

Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
Posted 3 weeks ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Data Scientist – Recommender Systems
Location: Bengaluru (Hybrid)

Role Summary
We’re seeking a skilled Data Scientist with deep expertise in recommender systems to design and deploy scalable personalization solutions. This role blends research, experimentation, and production-level implementation, with a focus on content-based and multi-modal recommendations using deep learning and cloud-native tools.

Responsibilities
Research, prototype, and implement recommendation models: two-tower, multi-tower, and cross-encoder architectures.
Utilize text/image embeddings (CLIP, ViT, BERT) for content-based retrieval and matching.
Conduct semantic similarity analysis and deploy vector-based retrieval systems (FAISS, Qdrant, ScaNN); a minimal retrieval sketch follows this posting.
Perform large-scale data prep and feature engineering with Spark/PySpark and Dataproc.
Build ML pipelines using Vertex AI, Kubeflow, and orchestration on GKE.
Evaluate models using recommender metrics (nDCG, Recall@K, Hit Rate, MAP) and offline frameworks.
Drive model performance through A/B testing and real-time serving via Cloud Run or Vertex AI.
Address cold-start challenges with metadata and multi-modal input.
Collaborate with engineering on CI/CD, monitoring, and embedding lifecycle management.
Stay current with trends in LLM-powered ranking, hybrid retrieval, and personalization.

Required Skills
Python proficiency with pandas, polars, numpy, scikit-learn, TensorFlow, PyTorch, and transformers.
Hands-on experience with deep learning frameworks for recommender systems.
Solid grounding in embedding retrieval strategies and approximate nearest neighbor search.
GCP-native workflows: Vertex AI, Dataproc, Dataflow, Pub/Sub, Cloud Functions, Cloud Run.
Strong foundation in semantic search, user modeling, and personalization techniques.
Familiarity with MLOps best practices: CI/CD, infrastructure automation, monitoring.
Experience deploying models in production using containerized environments and Kubernetes.

Nice to Have
Knowledge of ranking models: DLRM, XGBoost, LightGBM.
Multi-modal retrieval experience (text + image + tabular features).
Exposure to LLM-powered personalization or hybrid recommendation systems.
Understanding of real-time model updates and streaming ingestion.
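As a minimal sketch of the vector-retrieval step mentioned above, the snippet below builds an exact inner-product FAISS index over stand-in item embeddings and queries it; in a real system the vectors would come from a two-tower or CLIP-style encoder, and the sizes and dimensions here are arbitrary.

```python
import faiss
import numpy as np

dim = 128
rng = np.random.default_rng(0)

# Stand-in catalogue embeddings; normalise so inner product equals cosine similarity.
item_vectors = rng.random((10_000, dim), dtype=np.float32)
faiss.normalize_L2(item_vectors)

index = faiss.IndexFlatIP(dim)
index.add(item_vectors)

# Stand-in user/query embedding; retrieve the top-10 most similar items.
query = rng.random((1, dim), dtype=np.float32)
faiss.normalize_L2(query)
scores, ids = index.search(query, 10)
print(ids[0], scores[0])
```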
Posted 3 weeks ago
6.0 - 10.0 years
0 Lacs
Thiruvananthapuram, Kerala, India
On-site
UST is looking for a talented GCP Data Engineer with 6 to 10 years of experience to join our team and play a crucial role in designing and implementing efficient data solutions on the Google Cloud Platform (GCP). The ideal candidate should possess strong data engineering skills, expertise in GCP services, and proficiency in data processing technologies, particularly PySpark.

Responsibilities:
Data Pipeline Development:
Design, implement, and optimize end-to-end data pipelines on GCP, focusing on scalability and performance.
Develop and maintain ETL workflows for seamless data processing.
GCP Cloud Expertise:
Utilize GCP services such as BigQuery, Cloud Storage, and Dataflow for effective data engineering.
Implement and manage data storage solutions on GCP.
Data Transformation with PySpark:
Leverage PySpark for advanced data transformations, ensuring high-quality and well-structured output (a minimal transformation sketch follows this posting).
Implement data cleansing, enrichment, and validation processes using PySpark.

Requirements:
Proven experience as a Data Engineer, with a strong emphasis on GCP.
Proficiency in GCP services such as BigQuery, Cloud Storage, and Dataflow.
Expertise in PySpark for data processing and analytics is a must.
Experience with data modeling, ETL processes, and data warehousing.
Proficiency in programming languages such as Python, SQL, or Scala for data processing.
Relevant certifications in GCP or data engineering are a plus.

Skills
GCP, PySpark
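A minimal PySpark sketch of the cleansing, enrichment, and validation flow described above, assuming hypothetical GCS paths and column names.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-curation").getOrCreate()

orders = spark.read.parquet("gs://example-bucket/raw/orders/")          # raw ingested data

curated = (
    orders
    .dropDuplicates(["order_id"])                                       # cleansing
    .withColumn("order_date", F.to_date("order_ts"))                    # enrichment
    .filter(F.col("amount") > 0)                                        # validation
    .groupBy("order_date")
    .agg(F.sum("amount").alias("daily_revenue"))
)

curated.write.mode("overwrite").parquet("gs://example-bucket/curated/daily_revenue/")
```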
Posted 3 weeks ago
Upload Resume
Drag or click to upload
Your data is secure with us, protected by advanced encryption.
Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
We have sent an OTP to your contact. Please enter it below to verify.