
1348 BigQuery Jobs - Page 3

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

2.0 - 5.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Python (Programming Language)
Good-to-have skills: NA
Minimum 5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with various stakeholders to gather requirements, developing application features, and ensuring that the applications function seamlessly within the existing infrastructure. You will also participate in testing and debugging to enhance application performance and user experience, while continuously seeking opportunities for improvement and innovation in application development.

Roles & Responsibilities:
- Act as a subject matter expert (SME).
- Collaborate with and manage the team to perform.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Analyze application performance and implement enhancements as needed.

Professional & Technical Skills:
- Must-have: Proficiency in Python (Programming Language).
- Good to have: Experience with Oracle Procedural Language Extensions to SQL (PLSQL), Google BigQuery, and Google Cloud Platform architecture.
- Strong understanding of application development methodologies.
- Experience with version control systems such as Git.
- Familiarity with RESTful APIs and web services.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Python.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
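For context on what the good-to-have BigQuery skill typically involves in a Python role like this, here is a minimal, hypothetical sketch of a parameterized BigQuery query run through the google-cloud-bigquery client; the project, dataset, and column names are placeholders, not details taken from the posting.

```python
# Minimal sketch: a parameterized BigQuery query from Python.
# Project, dataset, and column names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

query = """
    SELECT order_id, order_total
    FROM `my-project.sales.orders`
    WHERE order_date >= @start_date
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01"),
    ]
)

for row in client.query(query, job_config=job_config).result():
    print(row.order_id, row.order_total)
```

Query parameters, rather than string formatting, keep the SQL safe from injection and let BigQuery cache query plans across runs.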

Posted 4 days ago

Apply

2.0 - 5.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Axway API Management Platform
Good-to-have skills: NA
Minimum 5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving, providing support and enhancements to existing applications while maintaining a focus on quality and efficiency.

Roles & Responsibilities:
- Act as a subject matter expert (SME).
- Collaborate with and manage the team to perform.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of milestones.

Professional & Technical Skills:
- Must-have: Proficiency in the Axway API Management Platform.
- Strong understanding of API design and development principles.
- Experience with application integration and data-flow management.
- Familiarity with cloud-based services and deployment strategies.
- Ability to troubleshoot and resolve application issues efficiently.

Additional Information:
- The candidate should have a minimum of 5 years of experience with the Axway API Management Platform.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Posted 4 days ago

Apply

5.0 - 10.0 years

10 - 16 Lacs

Bengaluru

Work from Office

Description:
- 5+ years of experience in Java, Spring Boot, microservices, ReactJS, product development and sustenance
- Troubleshooting and debugging of existing code when required
- Proficient in code quality, security compliance, and application performance management
- Participation in the agile planning process and estimation of planned tasks
- Good verbal and written communication skills
- Good expertise in unit testing (JUnit)

Requirements:

Qualifications & Experience
• 5+ years of experience developing and designing software applications using Java
• Expert understanding of core computer science fundamentals, including data structures, algorithms, and concurrent programming
• Expert in analyzing, designing, implementing, and troubleshooting software solutions for highly transactional systems
• Expert in OOAD and design principles, implementing microservices architecture using JEE, Spring, Spring Boot, Spring Cloud, Hibernate, Oracle, Cloud SQL, PostgreSQL, Bigtable, BigQuery, NoSQL, Git, IntelliJ IDEA, Pub/Sub, and Dataflow
• Experience working in native and hybrid cloud environments
• Experience with Agile development methodology
• Proficiency in agile software development, including technical skill sets such as programming (e.g., Python, Java), multi-tenant cloud technologies, and product management tools (e.g., Jira)
• Strong collaboration and communication skills to work effectively across the product team and clearly articulate technical ideas
• Ability to translate strategic priorities, features, and user stories into scalable solutions that are structured, efficient, and user-centric
• Detail-oriented problem solver who can break down complex issues to deliver effectively
• Excellent communicator and team player with a can-do attitude
• Ability to analyze user and business requirements to create technical design requirements and software architecture
• Experience must also include: Java; a Java IDE such as Eclipse or IntelliJ; Java EE application servers such as Apache Tomcat; object-oriented design, Git, Maven, and a popular scripting language; JSON, XML, YAML, and Terraform

Preferred Skills/Experience:
• Champion of Agile Scrum methodologies
• Experience with continuous integration systems such as Jenkins or GitHub CI
• Experience with SAFe methodologies
• Deep knowledge and understanding of how to create secure solutions by design
• Multi-threaded backend environments with concurrent users
• Experience with tools and languages such as Ruby, Python, Perl, Node.js, and bash scripting; Spring and Spring Boot; C, C++, Java, and Java EE; Oracle; Docker; Kubernetes

Job Responsibilities:

Key Responsibilities & Deliverables
• Feature implementation and production-ready code
• Technical documentation and system diagrams
• Debugging reports and fixes
• Performance optimizations

What We Offer:
Exciting Projects: We focus on industries like high-tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.
Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even abroad in one of our global centers or client facilities!
Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toastmasters), stress management programs, professional certifications, and technical and soft-skill trainings.
Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, group term life insurance, group personal accident insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses.
Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidized rates, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks, and a GL Club where you can enjoy coffee or tea with your colleagues over a game, plus discounts at popular stores and restaurants!

Posted 4 days ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Oracle Procedural Language Extensions to SQL (PLSQL)
Good-to-have skills: Google Cloud Platform Architecture
Minimum 3 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs. Additionally, you will monitor and optimize data workflows to enhance performance and reliability, ensuring that data is accessible and usable for stakeholders.

Roles & Responsibilities:
- Perform independently and become an SME.
- Actively participate in and contribute to team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and maintain robust data pipelines to support data processing and analytics.
- Collaborate with data architects and analysts to design data models that meet business requirements.

Professional & Technical Skills:
- Must-have: Proficiency in Oracle Procedural Language Extensions to SQL (PLSQL).
- Good to have: Experience with Google BigQuery and Google Cloud Platform architecture.
- Strong understanding of ETL processes and data integration techniques.
- Experience with data quality assurance and data governance practices.
- Familiarity with data warehousing concepts and technologies.

Additional Information:
- The candidate should have a minimum of 3 years of experience in PLSQL.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
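As a rough illustration of the Oracle-to-GCP data movement this posting describes, here is a minimal ETL sketch, assuming the oracledb, pandas, and google-cloud-bigquery libraries; the connection details and table names are hypothetical.

```python
# Minimal ETL sketch: extract from Oracle, light transform, load to BigQuery.
# Connection details and table names are hypothetical.
import oracledb
import pandas as pd
from google.cloud import bigquery

# Extract: pull rows from an Oracle source table.
conn = oracledb.connect(user="etl_user", password="...", dsn="dbhost/ORCLPDB1")
df = pd.read_sql("SELECT customer_id, amount, created_at FROM sales", conn)

# Transform: basic cleanup before loading.
df = df.dropna(subset=["customer_id"])
df["amount"] = df["amount"].astype(float)

# Load: append into a BigQuery staging table.
client = bigquery.Client()
job = client.load_table_from_dataframe(df, "my-project.staging.sales")
job.result()  # wait for the load job to finish
```

In production, heavy transformations would usually be pushed into PL/SQL on the source side or SQL on the BigQuery side rather than done in pandas.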

Posted 4 days ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Oracle Procedural Language Extensions to SQL (PLSQL)
Good-to-have skills: Google BigQuery, Google Cloud Platform Architecture
Minimum 3 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will engage in the design, development, and maintenance of data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and contribute to the overall data strategy of the organization, ensuring that data solutions are robust, scalable, and aligned with business objectives.

Roles & Responsibilities:
- Perform independently and become an SME.
- Actively participate in and contribute to team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and optimize data pipelines to ensure efficient data flow and processing.
- Monitor and troubleshoot data quality issues, implementing corrective actions as necessary.

Professional & Technical Skills:
- Must-have: Proficiency in Oracle Procedural Language Extensions to SQL (PLSQL).
- Good to have: Experience with Google BigQuery and Google Cloud Platform architecture.
- Strong understanding of ETL processes and data integration techniques.
- Experience with data modeling and database design principles.
- Familiarity with data warehousing concepts and best practices.

Additional Information:
- The candidate should have a minimum of 3 years of experience in PLSQL.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
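Since this listing emphasizes monitoring and troubleshooting data quality, here is a minimal, framework-free sketch of rule-based quality checks run on a batch before it is loaded; the column names and rules are hypothetical.

```python
# Minimal data-quality sketch: rule-based checks on a batch before loading.
# Column names and rules are hypothetical.
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> list:
    failures = []
    if df["order_id"].isna().any():
        failures.append("order_id contains NULLs")
    if df["order_id"].duplicated().any():
        failures.append("order_id is not unique")
    if (df["amount"] < 0).any():
        failures.append("amount contains negative values")
    return failures

batch = pd.DataFrame({"order_id": [1, 2, 2], "amount": [10.0, -5.0, 7.5]})
problems = run_quality_checks(batch)
if problems:
    # In a real pipeline this would block the load and alert the on-call team.
    raise ValueError("Quality checks failed: " + "; ".join(problems))
```

Dedicated tools (Great Expectations, dbt tests) formalize the same idea, but the pattern of asserting rules before the load step is the same.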

Posted 4 days ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Oracle Procedural Language Extensions to SQL (PLSQL)
Good-to-have skills: Google BigQuery, Google Cloud Platform Architecture
Minimum 3 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs. Additionally, you will monitor and optimize data workflows to enhance performance and reliability, ensuring that data is accessible and actionable for stakeholders.

Roles & Responsibilities:
- Perform independently and become an SME.
- Actively participate in and contribute to team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and maintain robust data pipelines to support data processing and analytics.
- Collaborate with data architects and analysts to design data models that meet business requirements.

Professional & Technical Skills:
- Must-have: Proficiency in Oracle Procedural Language Extensions to SQL (PLSQL).
- Good to have: Experience with Google BigQuery and Google Cloud Platform architecture.
- Strong understanding of ETL processes and data integration techniques.
- Experience with data quality assurance and data governance practices.
- Familiarity with data warehousing concepts and technologies.

Additional Information:
- The candidate should have a minimum of 3 years of experience in PLSQL.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 4 days ago

Apply

2.0 - 4.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Data Engineering
Good-to-have skills: Oracle Procedural Language Extensions to SQL (PLSQL), Google BigQuery, Google Cloud Platform Architecture
Minimum 5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. A typical day involves creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and optimize data workflows, ensuring that the data infrastructure supports the organization's analytical needs effectively.

Roles & Responsibilities:
- Act as a subject matter expert (SME).
- Collaborate with and manage the team to perform.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.
- Continuously evaluate and improve data processes to enhance efficiency and effectiveness.

Professional & Technical Skills:
- Must-have: Proficiency in data engineering.
- Good to have: Experience with Oracle Procedural Language Extensions to SQL (PLSQL), Google BigQuery, and Google Cloud Platform architecture.
- Strong understanding of data modeling and database design principles.
- Experience with data warehousing solutions and data lake architectures.
- Familiarity with data integration tools and ETL frameworks.

Additional Information:
- The candidate should have a minimum of 5 years of experience in data engineering.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 4 days ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Noida

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Google BigQuery
Good-to-have skills: Microsoft SQL Server, Google Cloud Data Services
Minimum 3 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing. You will create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems.

Roles & Responsibilities:
- Perform independently and become an SME.
- Actively participate in and contribute to team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and maintain data pipelines.
- Ensure data quality throughout the data lifecycle.
- Implement ETL processes for data migration and deployment.
- Collaborate with cross-functional teams to understand data requirements.
- Optimize data storage and retrieval processes.

Professional & Technical Skills:
- Must-have: Proficiency in Google BigQuery.
- Strong understanding of data engineering principles.
- Experience with cloud-based data services.
- Knowledge of SQL and database management systems.
- Hands-on experience with data modeling and schema design.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Google BigQuery.
- This position is based at our Mumbai office.
- 15 years of full-time education is required.
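As an illustration of the schema-design skill this posting asks for, here is a minimal sketch that defines a date-partitioned, clustered BigQuery table with the google-cloud-bigquery client; the dataset and field names are hypothetical.

```python
# Minimal sketch: defining a date-partitioned, clustered BigQuery table.
# Dataset and field names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

schema = [
    bigquery.SchemaField("event_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("user_id", "STRING"),
    bigquery.SchemaField("event_ts", "TIMESTAMP", mode="REQUIRED"),
]

table = bigquery.Table("my-project.analytics.events", schema=schema)
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="event_ts"
)
table.clustering_fields = ["user_id"]  # cluster to cheapen common filters

client.create_table(table)  # raises if the table already exists
```

Partitioning on the timestamp lets queries that filter on a date range scan only the relevant partitions, which is the main cost and performance lever in BigQuery schema design.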

Posted 4 days ago

Apply

3.0 - 7.0 years

25 - 34 Lacs

Pune

Work from Office

Backend Engineer (Python, Node.js, Agentic AI)
Experience: 3-7 years
Location: Pune
- Minimum 3 years of experience as a backend developer, with exposure to AI/ML
- Minimum 2+ years of experience in backend development using Python and Node.js
For the full JD, write to amit.gaine@talentxo.in

Posted 4 days ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

The role involves working on Agentic AI Platform Delivery, where you will be responsible for developing and maintaining autonomous software agents using modern LLM frameworks. You will build reusable components for business process automation; design agent orchestration, prompt engineering, and LLM integrations; and enable deployment across CRM systems and enterprise data platforms. Additionally, you will work on Generative AI and model optimization by fine-tuning LLMs/SLMs with proprietary NBFC data, focusing on model distillation, quantization, and edge-deployment readiness. You will also create self-learning systems by developing adaptive frameworks that learn from interaction outcomes and implementing lightweight models to support real-time decision-making.

The ideal candidate should hold a B.E./B.Tech/M.Tech degree in Computer Science or a related field, with 3-7 years of experience in AI/ML roles. Proficiency in languages such as Python, Node.js, JavaScript, React, and Java is required, along with experience using tools and frameworks like LangChain, Semantic Kernel, LangGraph, and CrewAI. Familiarity with platforms such as GCP, MS Foundry, Copilot Studio, BigQuery, and Power Apps/BI is essential. Knowledge of agent tools like the Agent Development Kit (ADK) and Multi-agent Communication Protocol (MCP), as well as a strong understanding of prompt engineering, LLM integration, and orchestration, is also expected.

In addition to the technical requirements, the candidate should possess strong communication, teamwork, and problem-solving skills. Being proactive, detail-oriented, and able to work under pressure are desirable traits for this role.

Joining this dynamic team at Bajaj Finance Limited offers numerous perks, benefits, and a vibrant work culture. The company values its employees and recognizes their contributions, making it a rewarding and challenging environment in which to grow and excel. With a presence in over 500 locations across India, you will have the opportunity to be part of one of Asia's top 10 large workplaces and contribute to the success and innovation of a leading non-banking financial company.
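As a framework-agnostic illustration of the agent orchestration this role describes, here is a minimal tool-calling agent loop in plain Python; call_llm is a stub standing in for any hosted model client (LangChain, ADK, and similar frameworks wrap the same pattern), and the tool and its data are hypothetical.

```python
# Conceptual, framework-agnostic sketch of a tool-calling agent loop.
# call_llm is a stub standing in for any hosted LLM client.
import json

def get_account_balance(customer_id: str) -> str:
    # Hypothetical tool: a real agent would call a CRM or core-banking API.
    return json.dumps({"customer_id": customer_id, "balance": 12500})

TOOLS = {"get_account_balance": get_account_balance}

def call_llm(messages: list) -> dict:
    # Stub: a real call would send `messages` to a model and parse its reply
    # into either a final answer or a tool call chosen from TOOLS.
    if any(m["role"] == "tool" for m in messages):
        return {"content": "Customer C123 has a balance of 12,500."}
    return {"tool": "get_account_balance", "args": {"customer_id": "C123"}}

def run_agent(user_query: str, max_steps: int = 3) -> str:
    messages = [{"role": "user", "content": user_query}]
    for _ in range(max_steps):
        decision = call_llm(messages)
        if "tool" not in decision:  # model produced a final answer
            return decision["content"]
        result = TOOLS[decision["tool"]](**decision["args"])
        messages.append({"role": "tool", "content": result})
    return "Stopped after max_steps without a final answer."

print(run_agent("What is the balance for customer C123?"))
```

The max_steps bound is the essential safety valve: production agents cap tool-call depth so a looping model cannot run indefinitely.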

Posted 5 days ago

Apply

5.0 - 10.0 years

0 Lacs

Maharashtra

On-site

You are a highly skilled and motivated Lead Data Scientist / Machine Learning Engineer joining a team pivotal to the development of a cutting-edge reporting platform designed to measure and optimize online marketing campaigns. Your role will focus on data engineering, the ML model lifecycle, and cloud-native technologies.

You will be responsible for designing, building, and maintaining scalable ELT pipelines, ensuring high data quality, integrity, and governance. You will develop and validate predictive and prescriptive ML models to enhance marketing campaign measurement and optimization; experimenting with different algorithms and leveraging various models will be crucial to driving insights and recommendations. You will deploy and monitor ML models in production and implement CI/CD pipelines for seamless updates and retraining. You will work closely with data analysts, marketing teams, and software engineers to align ML and data solutions with business objectives. Translating complex model insights into actionable business recommendations and presenting findings to stakeholders are also part of your responsibilities.

Qualifications & Skills:

Educational Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, Machine Learning, Artificial Intelligence, Statistics, or a related field.
- Google Cloud certifications (Professional Data Engineer, ML Engineer) are a plus.

Must-Have Skills:
- Experience: 5-10 years with the skill set below and relevant hands-on experience.
- Data Engineering: Experience with ETL/ELT pipelines, data ingestion, transformation, and orchestration (Airflow, Dataflow, Composer).
- ML Model Development: Strong grasp of statistical modeling, supervised/unsupervised learning, time-series forecasting, and NLP.
- Programming: Proficiency in Python (Pandas, NumPy, Scikit-learn, TensorFlow/PyTorch) and SQL for large-scale data processing.
- Cloud & Infrastructure: Expertise in GCP (BigQuery, Vertex AI, Dataflow, Pub/Sub, Cloud Storage) or equivalent cloud platforms.
- MLOps & Deployment: Hands-on experience with CI/CD pipelines, model monitoring, and version control (MLflow, Kubeflow, Vertex AI, or similar tools).
- Data Warehousing & Real-time Processing: Strong knowledge of modern data platforms for batch and streaming data processing.

Nice-to-Have Skills:
- Experience with graph ML, reinforcement learning, or causal inference modeling.
- Working knowledge of BI tools (Looker, Tableau, Power BI) for integrating ML insights into dashboards.
- Familiarity with marketing analytics, attribution modeling, and A/B testing methodologies.
- Experience with distributed computing frameworks (Spark, Dask, Ray).

Location: Bengaluru
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
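To make the model-development half of this role concrete, here is a minimal sketch of training and validating a campaign-response classifier with scikit-learn; the features and data are synthetic placeholders, not anything from the actual platform.

```python
# Minimal sketch: training and validating a campaign-response model.
# Features and labels are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 5))  # e.g., spend, impressions, recency, ...
y = (X[:, 0] + rng.normal(size=1000) > 0).astype(int)  # synthetic conversions

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Holdout AUC: {auc:.3f}")
```

In the MLOps setting the posting describes, the same train-and-score step would be wrapped in a pipeline (Vertex AI, Kubeflow, or MLflow) so retraining and holdout evaluation run automatically on fresh data.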

Posted 5 days ago

Apply

2.0 - 6.0 years

0 Lacs

Kolkata, West Bengal

On-site

As a Solution Architect & Technical Lead at RebusCode, you will play a crucial role in driving the design and architecture of our Big Data analytics solutions for the market research industry. Your responsibilities include providing technical leadership, ensuring governance, documenting solutions, and sharing knowledge effectively. You will also be actively involved in project management and ensuring the timely delivery of projects.

To excel in this role, you should have a minimum of 5 years of experience in software development, of which at least 2 years should be in architecture or technical leadership positions. A proven track record of delivering enterprise-grade, cloud-native SaaS applications on Azure and/or GCP is essential.

Your technical skills should cover:
- Cloud & Infrastructure: Azure App Services, Functions, Kubernetes; GKE, Cloud Functions; Service Bus, Pub/Sub; Blob Storage, Cloud Storage; Key Vault, Secret Manager; CDN
- Development Stack: C#/.NET 6/7/8, ASP.NET Core Web API, Docker, container orchestration
- Data & Integration: SQL Server, Oracle, Cosmos DB, Spanner, BigQuery, ETL patterns, message-based integration
- CI/CD & IaC: Azure DevOps, Cloud Build, GitHub Actions; ARM/Bicep, Terraform; container registries, automated testing
- Security & Compliance: TLS/SSL certificate management, API gateway policies, encryption standards
- Monitoring & Performance: Azure Application Insights, Log Analytics, Stackdriver, performance profiling, load-testing tools

Nice-to-have qualifications include certifications such as Azure Solutions Architect Expert, Google Professional Cloud Architect, PMP, or PMI-ACP. Familiarity with front-end frameworks like Angular and React, as well as API client SDK generation, would be an advantage. Prior experience building low-code/no-code integration platforms or automation engines is also beneficial, as is exposure to alternative clouds like AWS or on-prem virtualization platforms like VMware and OpenShift.

Join us at RebusCode, where you will have the opportunity to work on cutting-edge Big Data analytics solutions and contribute to the growth and success of our market research offerings.

Posted 5 days ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

We are looking for a skilled Data Governance Engineer to take charge of developing and overseeing robust data governance frameworks on Google Cloud Platform (GCP). The role calls for expertise in data management, metadata frameworks, compliance, and security within cloud environments, ensuring high-quality, secure, and compliant data practices aligned with organizational objectives.

Requirements:
- A minimum of 4 years of experience in data governance, data management, or data security.
- Hands-on proficiency with GCP tools such as BigQuery, Dataflow, Dataproc, and Google Data Catalog.
- Strong command of metadata management, data lineage, and data quality tools like Collibra and Informatica.
- Deep understanding of data privacy laws and compliance frameworks.
- Proficiency in SQL and Python for governance automation.
- Experience with RBAC, encryption, and data masking techniques; familiarity with ETL/ELT pipelines and data warehouse architectures is advantageous.

Responsibilities:
- Develop and execute comprehensive data governance frameworks with a focus on metadata management, lineage tracking, and data quality.
- Define, document, and enforce data governance policies, access control mechanisms, and security standards using GCP-native services such as IAM, DLP, and KMS.
- Manage metadata repositories using tools such as Collibra, Informatica, Alation, or Google Data Catalog.
- Collaborate with data engineering and analytics teams to ensure compliance with regulatory standards like GDPR, CCPA, and SOC 2, and automate data classification, monitoring, and reporting using Python and SQL.
- Support data stewardship initiatives and optimize ETL/ELT pipelines and data workflows to adhere to governance best practices.

At GlobalLogic, we offer a culture of caring that emphasizes inclusivity and personal growth. You will have access to continuous learning and development opportunities, engaging and meaningful work, and a healthy work-life balance. Join our high-trust organization, where integrity is paramount, and collaborate with us to engineer innovative solutions that have a lasting impact on industries worldwide.
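As a small illustration of the governance automation in Python that this posting mentions, here is a sketch of deterministic PII masking; the column names and salt are hypothetical, and a real deployment would rely on DLP or KMS-managed keys rather than a hard-coded value.

```python
# Minimal sketch: deterministic masking of a PII column before sharing data.
# Column names and the salt are hypothetical placeholders.
import hashlib
import pandas as pd

SALT = "example-salt"  # placeholder; never hard-code secrets in practice

def mask_value(value: str) -> str:
    # Same input always yields the same token, so masked tables stay joinable.
    return hashlib.sha256((SALT + value).encode()).hexdigest()[:16]

df = pd.DataFrame({"email": ["a@x.com", "b@y.com"], "amount": [10, 20]})
df["email"] = df["email"].map(mask_value)
print(df)
```

Deterministic tokens preserve referential integrity across datasets while removing the raw identifier, which is why this pattern pairs naturally with the RBAC and classification policies the role covers.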

Posted 5 days ago

Apply

7.0 - 11.0 years

0 Lacs

Karnataka

On-site

Job Description: As a Backend Engineer at Lifesight in Bangalore, you will be a key member of a fast-growing SaaS company focused on leveraging data and AI to enhance customer acquisition and retention. Within a team of 130 professionals serving 300+ clients across multiple global offices, you will play a crucial role in developing new software products while improving existing non-functional aspects. The ideal candidate is self-motivated, self-managed, and a strong team player.

Your responsibilities will include leading the development of modules and services: coding, designing, prototyping, and consulting on building scalable, reliable, and fault-tolerant systems. As a Senior Software Engineer on the platform team, you will build core microservices and scalable APIs and evolve architectures to handle millions of notifications daily. You will also engage in continuous refactoring, stay current with the latest technologies in distributed systems, and collaborate with product management on feature roadmaps.

Requirements:
- 7+ years of hands-on experience in designing, developing, testing, and deploying large-scale applications and microservices (preferably in Java/Spring Boot).
- Proficiency with cloud and NoSQL stores (e.g., Google Cloud, Kubernetes, BigQuery, messaging systems).
- Strong team player with a willingness to learn and a positive attitude.
- Experience building low-latency, high-volume REST API request handling.
- Familiarity with distributed caches like Redis.
- Ability to deliver results efficiently.

Bonus points if you have:
- Experience with containerization technologies like Docker and Kubernetes.
- Previous work experience with cloud platforms, preferably GCP.
- Knowledge of NoSQL stores such as Cassandra, ClickHouse, and BigQuery.

Benefits: Joining Lifesight as a Backend Engineer offers the opportunity to contribute to a dynamic and highly analytical team, where you can deepen your skills in Java, cloud engineering, and distributed systems. You will be part of an innovative company culture that values personal growth, well-being, and work-life balance. You can expect a non-bureaucratic environment, high levels of empowerment, and the chance to shape your career within a profitable and growing organization. Competitive compensation, benefits, and a fun working atmosphere complete the package at Lifesight.

Posted 5 days ago

Apply

12.0 - 16.0 years

0 Lacs

Karnataka

On-site

About KPMG in India
KPMG entities in India are professional services firms affiliated with KPMG International Limited. KPMG was established in India in August 1993. Our professionals leverage the global network of firms and are conversant with local laws, regulations, markets, and competition. KPMG has offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara, and Vijayawada. KPMG entities in India offer services to national and international clients across sectors, striving to provide rapid, performance-based, industry-focused, and technology-enabled services that reflect a shared knowledge of global and local industries and our experience of the Indian business environment.

We are seeking an experienced and highly skilled Senior Google Cloud Analytics & Vertex AI Specialist for the position of Associate Director, with 12-15 years of experience and a specific focus on Google Vertex AI. The ideal candidate will have a deep understanding of Google Cloud Platform (GCP) and extensive hands-on experience with Google Cloud analytics services and Vertex AI. The role involves leading projects, designing scalable data solutions, driving the adoption of AI and machine learning practices within the organization, and supporting pre-sales activities. A minimum of 2 years of hands-on experience with Vertex AI is required.

Key Responsibilities:
- Architect and implement: Design and implement end-to-end data analytics solutions using Google Cloud services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
- Vertex AI development: Develop, train, and deploy machine learning models using Vertex AI. Use Vertex AI's integrated tools for model monitoring, versioning, and CI/CD pipelines. Implement custom machine learning pipelines using Vertex AI Pipelines, the Vertex AI Feature Store for feature management, and the Vertex AI Model Registry for model tracking.
- Data integration: Integrate data from various sources, ensuring data quality and consistency across systems.
- Performance optimization: Optimize data pipelines and analytics processes for maximum efficiency and performance.
- Leadership and mentorship: Lead and mentor a team of data engineers and data scientists, providing guidance on best practices in GCP and AI/ML.
- Collaboration: Work closely with stakeholders to understand business requirements and translate them into technical solutions.
- Innovation: Stay current with the latest trends in Google Cloud services and AI technologies, advocating for their adoption when beneficial.
- Pre-sales support: Collaborate cross-functionally to understand client requirements, design tailored solutions, prepare and deliver technical presentations and product demonstrations, and assist with proposal and RFP responses.
- Project delivery: Manage and oversee the delivery of data analytics and AI/ML projects, ensuring completion on time and within budget while coordinating cross-functional teams.

Qualifications:
- Experience: 12-15 years in data engineering, data analytics, and AI/ML with a focus on Google Cloud Platform.
- Technical skills: Proficiency in Google Cloud services (BigQuery, Dataflow, Pub/Sub, Cloud Storage, Vertex AI); strong programming skills in Python and SQL; experience with machine learning frameworks (TensorFlow, PyTorch) and data visualization tools (Looker, Data Studio).
- Pre-sales and delivery skills: Experience supporting pre-sales activities and managing and delivering complex data analytics and AI/ML projects.
- Certifications: Google Cloud Professional Data Engineer or Professional Machine Learning Engineer certification is a plus.
- Soft skills: Excellent problem-solving, communication, and leadership skills.
- Education: B.E./B.Tech/postgraduate degree.
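To show roughly what a custom pipeline for Vertex AI Pipelines looks like, here is a minimal sketch using the KFP v2 SDK, whose compiled output Vertex AI Pipelines can execute; the component logic and names are hypothetical placeholders, not anything from this role.

```python
# Minimal sketch of a pipeline defined with the KFP v2 SDK.
# Component logic and names are hypothetical placeholders.
from kfp import compiler, dsl

@dsl.component
def prepare_data(rows: int) -> int:
    # Placeholder for a real data-prep step (e.g., a BigQuery extract).
    return rows

@dsl.component
def train_model(rows: int) -> str:
    # Placeholder for a real training step.
    return f"trained on {rows} rows"

@dsl.pipeline(name="example-training-pipeline")
def training_pipeline(rows: int = 1000):
    prepared = prepare_data(rows=rows)
    train_model(rows=prepared.output)

# Compile to a spec that Vertex AI Pipelines can run.
compiler.Compiler().compile(training_pipeline, "pipeline.json")
```

Each decorated component runs in its own container, which is what gives pipeline steps the independent versioning and monitoring the responsibilities above describe.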

Posted 5 days ago

Apply

8.0 - 12.0 years

0 Lacs

Karnataka

On-site

As a Principal Engineer at our company, you will lead the architecture and execution of our GenAI-powered, self-serve marketing platforms. Working closely with the CEO, you will shape, build, and scale products that change how marketers interact with data and AI. The position offers real-world product development with traction, velocity, and high impact.

Your responsibilities include co-owning the product architecture and direction with the CEO, and building GenAI-native full-stack platforms from MVP to scale using LLMs, agents, and predictive AI. You will own the full stack: React for the frontend, Node.js/Python for the backend, GCP for infrastructure, BigQuery for data, and vector databases for AI. Leading a lean, high-caliber team with a hands-on, unblock-and-coach mindset, you will drive rapid iteration with rigor while balancing short-term delivery against long-term resilience.

To excel in this role, you should have 8-12 years of experience building and scaling full-stack, data-heavy, or AI-driven products. Proficiency in React, Node.js, and Google Cloud is essential, and hands-on experience with GenAI tools is a bonus. You should have a track record of shipping products from ambiguity to impact, along with the ability to architect and build scalable microservices and APIs that are robust, maintainable, and performant. Collaboration with cross-functional teams, promotion of engineering best practices, and mentorship of junior and mid-level engineers are also key aspects of the role. Additionally, you will proactively identify tech debt and propose scalable solutions, ensure strong engineering processes through Agile practices and DevOps automation, and stay current with emerging technology trends. Your contributions to hiring and talent development within the engineering team will further strengthen our capabilities and drive innovation in our products.

Posted 5 days ago

Apply

6.0 - 10.0 years

0 Lacs

Haryana

On-site

Join GlobalLogic as a valuable member of the team working on a significant software project for a world-class company providing M2M/IoT 4G/5G modules to industries such as automotive, healthcare, and logistics. Your engagement will involve contributing to the development of end-user modules' firmware, implementing new features, maintaining compatibility with the latest telecommunication and industry standards, and analyzing and estimating customer requirements.

Requirements:
- BA/BS degree in Computer Science, Mathematics, or a related technical field, or equivalent practical experience.
- Proficiency in Cloud SQL and Cloud Bigtable.
- Experience with Dataflow, BigQuery, Dataproc, Datalab, Dataprep, Pub/Sub, and Genomics.
- Familiarity with Google Transfer Appliance, Cloud Storage Transfer Service, and BigQuery Data Transfer.
- Knowledge of data processing software (such as Hadoop, Kafka, Spark, Pig, Hive) and data processing algorithms (MapReduce, Flume).
- Previous experience working with technical customers.
- Proficiency in writing software in languages like Java or Python.
- 6-10 years of relevant consulting, industry, or technology experience.
- Strong problem-solving and troubleshooting skills.
- Excellent communication skills.

Job Responsibilities:
- Hands-on experience working with data warehouses, including technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools.
- Experience in technical consulting.
- Proficiency in architecting and developing software or internet-scale Big Data solutions in virtualized environments such as Google Cloud Platform (mandatory) and AWS/Azure (good to have).
- Familiarity with big data, information retrieval, data mining, machine learning, and building high-availability applications with modern web technologies.
- Working knowledge of ITIL and/or agile methodologies.
- Google Data Engineer certification.

What We Offer:
- Culture of caring: We prioritize a culture of caring, where people come first, fostering an inclusive environment of acceptance and belonging.
- Learning and development: We are committed to continuous learning and growth, offering programs, training curricula, and hands-on opportunities for personal and professional advancement.
- Interesting and meaningful work: Engage in impactful projects that allow for creative problem-solving and exploration of new solutions.
- Balance and flexibility: Embrace work-life balance with diverse career areas, roles, and work arrangements that support personal well-being.
- High-trust organization: Join a high-trust organization with a focus on integrity, trustworthiness, and ethical practices.

About GlobalLogic: GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner known for collaborating with forward-thinking companies to create innovative digital products and experiences. Join the team in transforming businesses and industries through intelligent products, platforms, and services.
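As a brief illustration of the Pub/Sub experience the requirements list, here is a minimal sketch of publishing device telemetry with the google-cloud-pubsub client; the project and topic names are hypothetical.

```python
# Minimal sketch: publishing device telemetry to Cloud Pub/Sub.
# Project and topic names are hypothetical.
import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "device-telemetry")

payload = json.dumps({"device_id": "mod-42", "signal_dbm": -71}).encode("utf-8")
future = publisher.publish(topic_path, payload, device_id="mod-42")
print("Published message ID:", future.result())  # blocks until publish completes
```

In a pipeline like the ones this posting describes, a Dataflow job would typically subscribe to the same topic and stream the messages into BigQuery.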

Posted 6 days ago

Apply

5.0 - 10.0 years

20 - 30 Lacs

Chennai

Work from Office

• Experience with cloud-based systems (GCP, BigQuery)
• Strong SQL programming skills
• Expertise in database programming and performance-tuning techniques
• Knowledge of data warehouse architectures, ETL, and reporting/analytic tools

Posted 6 days ago

Apply

3.0 - 7.0 years

0 Lacs

Thiruvananthapuram, Kerala

On-site

As a Google Cloud Engineer at our company, you will play a crucial role in designing, building, deploying, and maintaining our cloud infrastructure and applications on Google Cloud Platform (GCP). You will collaborate with development, operations, and security teams to ensure that our cloud environment is scalable, secure, highly available, and cost-optimized. If you are enthusiastic about cloud-native technologies, automation, and solving intricate infrastructure challenges, we welcome you to apply.

Your responsibilities will include:
- Designing, implementing, and managing robust, scalable, and secure cloud infrastructure on GCP using Infrastructure as Code (IaC) tools like Terraform.
- Deploying, configuring, and managing core GCP services such as Compute Engine, Kubernetes Engine (GKE), Cloud SQL, Cloud Storage, Cloud Functions, BigQuery, Pub/Sub, and networking components.
- Developing and maintaining CI/CD pipelines for automated deployment and release management.
- Implementing and enforcing security best practices within the GCP environment, including IAM, network security, data encryption, and compliance adherence.
- Monitoring cloud infrastructure and application performance, identifying bottlenecks, and implementing optimizations.
- Troubleshooting and resolving complex infrastructure and application issues in production and non-production environments.
- Collaborating with development teams to ensure cloud-native deployment, scalability, and resilience of applications.
- Participating in on-call rotations for critical incident response and the timely resolution of production issues.
- Creating and maintaining comprehensive documentation for cloud architecture, configurations, and operational procedures.
- Keeping up to date with new GCP services, features, and industry best practices, and proposing and implementing improvements.
- Contributing to cost optimization by identifying and implementing efficiencies in cloud resource utilization.

Requirements:
- A Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.
- 6+ years of experience with C#, .NET Core, .NET Framework, MVC, Web API, Entity Framework, and SQL Server.
- 3+ years of experience with cloud platforms, preferably GCP, including designing and deploying cloud-native applications.
- 3+ years of experience with source code management, CI/CD pipelines, and Infrastructure as Code.
- Strong experience with JavaScript and a modern JavaScript framework, VueJS preferred.
- Proven leadership and mentoring skills with development teams.
- Strong understanding of microservices architecture and serverless computing.
- Experience with relational databases like SQL Server and PostgreSQL.
- Excellent problem-solving, analytical, and communication skills, along with experience in Agile/Scrum environments.

What can make you stand out:
- GCP cloud certification.
- UI development experience with HTML, JavaScript, Angular, and Bootstrap.
- Agile environment experience (Scrum, XP).
- Proficiency with Atlassian tools like JIRA, Confluence, and GitHub.
- Working knowledge of Python, exceptional problem-solving and analytical abilities, and strong teamwork skills.
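To illustrate the Cloud Functions item in the responsibilities above, here is a minimal sketch of an HTTP-triggered function written against the functions-framework library; the endpoint logic is hypothetical.

```python
# Minimal sketch of an HTTP-triggered Cloud Function in Python,
# using the functions-framework library. Endpoint logic is hypothetical.
import functions_framework

@functions_framework.http
def health_check(request):
    # 'request' is a Flask request object; the return value becomes the response.
    name = request.args.get("name", "world")
    return {"status": "ok", "hello": name}, 200
```

A function like this is typically deployed with gcloud functions deploy using an HTTP trigger, after which GCP handles scaling and TLS termination.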

Posted 6 days ago

Apply

4.0 - 8.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

We are looking for a skilled Data Governance Engineer to spearhead the development and supervision of robust data governance frameworks on Google Cloud Platform (GCP). You should have a deep understanding of data management, metadata frameworks, compliance, and security within cloud environments to ensure high-quality, secure, and compliant data practices aligned with organizational objectives.

The ideal candidate should possess:
- Over 4 years of experience in data governance, data management, or data security.
- Hands-on expertise with GCP tools like BigQuery, Dataflow, Dataproc, and Google Data Catalog.
- Proficiency in metadata management, data lineage, and data quality tools such as Collibra and Informatica.
- Comprehensive knowledge of data privacy laws and compliance frameworks.
- Strong skills in SQL and Python for governance automation.
- Experience with RBAC, encryption, and data masking techniques.
- Familiarity with ETL/ELT pipelines and data warehouse architectures.

Your main responsibilities will include:
- Developing and implementing comprehensive data governance frameworks emphasizing metadata management, lineage tracking, and data quality.
- Defining, documenting, and enforcing data governance policies, access control mechanisms, and security standards using GCP-native services like IAM, DLP, and KMS.
- Managing metadata repositories using tools like Collibra, Informatica, Alation, or Google Data Catalog.
- Collaborating with data engineering and analytics teams to ensure compliance with GDPR, CCPA, SOC 2, and other regulatory standards.
- Automating data classification, monitoring, and reporting using Python and SQL.
- Supporting data stewardship initiatives, including the creation of data dictionaries and governance documentation.
- Optimizing ETL/ELT pipelines and data workflows to adhere to governance best practices.

At GlobalLogic, we offer a culture of caring that prioritizes inclusivity, acceptance, and personal connections; continuous learning and development opportunities; engaging and meaningful work on cutting-edge solutions; and the balance and flexibility to integrate work and life effectively. GlobalLogic, a Hitachi Group Company, is a leading digital engineering partner to world-renowned companies, focusing on creating innovative digital products and experiences. Join us to collaborate on transforming businesses through intelligent products, platforms, and services.

Posted 6 days ago

Apply

3.0 - 7.0 years

0 Lacs

Kolkata, West Bengal

On-site

You are a Data Engineer with 3+ years of experience, proficient in SQL and Python development. You will design, develop, and maintain scalable data pipelines to support ETL processes using tools like Apache Airflow, AWS Glue, or similar, and optimize and manage relational and NoSQL databases such as MySQL, PostgreSQL, MongoDB, or Cassandra for high performance and scalability. You will write advanced SQL queries, stored procedures, and functions to efficiently extract, transform, and analyze large datasets, and implement and manage data solutions on cloud platforms like AWS, Azure, or Google Cloud, utilizing services such as Redshift, BigQuery, or Snowflake. Your contributions to designing and maintaining data warehouses and data lakes will support analytics and BI requirements, and you will automate data processing tasks through script and application development in Python or other languages.

You will implement data quality checks, monitoring, and governance policies to ensure data accuracy, consistency, and security. Collaboration with data scientists, analysts, and business stakeholders to understand data needs and translate them into technical solutions is essential, as is identifying and resolving performance bottlenecks in data systems and optimizing data storage and retrieval. Maintaining comprehensive documentation for data processes, pipelines, and infrastructure is crucial, and you are expected to stay up to date with the latest trends in data engineering, big data technologies, and cloud services.

You should hold a Bachelor's or Master's degree in Computer Science, Information Technology, Data Engineering, or a related field. Proficiency in SQL, relational and NoSQL databases, and Python programming, along with experience with data pipeline tools and cloud platforms, is required. Knowledge of big data tools like Apache Spark, Hadoop, or Kafka is a plus. Strong analytical and problem-solving skills with a focus on performance optimization and scalability are essential, as are excellent verbal and written communication skills for conveying technical concepts to non-technical stakeholders and the ability to work collaboratively in cross-functional teams. Preferred certifications include AWS Certified Data Analytics, Google Professional Data Engineer, or similar. An eagerness to learn new technologies and adapt quickly in a fast-paced environment will be valuable in this role.
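Since the posting names Apache Airflow as a pipeline tool, here is a minimal sketch of a DAG wiring an extract-transform-load sequence with the Airflow 2 API; the task bodies, names, and schedule are hypothetical stubs.

```python
# Minimal sketch of an Airflow DAG wiring extract -> transform -> load.
# Task bodies, names, and schedule are hypothetical stubs.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull rows from the source system")

def transform():
    print("clean and reshape the batch")

def load():
    print("write the batch to the warehouse")

with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # 'schedule_interval' on older Airflow 2.x releases
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3
```

The `>>` operator declares task dependencies, so Airflow retries and backfills each step independently, which is what distinguishes an orchestrated pipeline from a plain script.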

Posted 6 days ago

Apply

8.0 - 12.0 years

0 Lacs

Punjab

On-site

You are an experienced and results-driven ETL & DWH Engineer/Data Analyst with over 8 years of expertise in data integration, warehousing, and analytics. The role requires deep technical knowledge of ETL tools, strong data modeling skills, and the ability to lead intricate data engineering projects from inception to implementation.

Key skills:
- More than 4 years with ETL tools such as SSIS, Informatica, DataStage, or Talend.
- Proficiency with relational databases like SQL Server and MySQL.
- Comprehensive understanding of Data Mart/EDW methodologies.
- Designing star schemas, snowflake schemas, and fact and dimension tables (see the sketch after this list).
- Experience with Snowflake or BigQuery.
- Familiarity with reporting and analytics tools like Tableau and Power BI.
- Proficient in scripting and programming with Python.
- Knowledge of cloud platforms like AWS or Azure.
- Leading recruitment, estimation, and project execution.
- Exposure to sales and marketing data domains.
- Experience working with cross-functional and geographically distributed teams.
- Ability to translate complex data issues into actionable insights.
- Strong communication and client management abilities, with an initiative-driven, collaborative, problem-solving mindset.

Roles & responsibilities:
- Create high-level and low-level design documents for middleware and ETL architecture.
- Design and review data integration components, ensuring compliance with standards and best practices.
- Ensure delivery quality and timeliness for one or more complex projects.
- Provide functional and non-functional assessments for global data implementations.
- Offer technical guidance and support to junior team members for problem-solving.
- Lead QA processes for deliverables and validate progress against project timelines.
- Manage issue escalation, status tracking, and continuous improvement initiatives.
- Support planning, estimation, and resourcing for data engineering efforts.
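As a worked illustration of the star-schema design skill listed above, here is a minimal, hypothetical sketch: one additive fact table keyed to two dimension tables, with the DDL held in a Python string as it might be submitted through a database cursor.

```python
# Minimal star-schema sketch: one fact table keyed to two dimensions.
# Table and column names are hypothetical; DDL is held as a SQL string.
STAR_SCHEMA_DDL = """
CREATE TABLE dim_customer (
    customer_key  INT PRIMARY KEY,
    customer_name VARCHAR(100),
    region        VARCHAR(50)
);

CREATE TABLE dim_date (
    date_key   INT PRIMARY KEY,   -- e.g., 20240131
    full_date  DATE,
    month_name VARCHAR(20)
);

CREATE TABLE fact_sales (
    customer_key INT REFERENCES dim_customer (customer_key),
    date_key     INT REFERENCES dim_date (date_key),
    quantity     INT,
    sales_amount DECIMAL(12, 2)   -- additive measure
);
"""
# In practice each statement would be executed via a database cursor,
# e.g., cursor.execute(statement) per statement.
```

The fact table stores only keys and additive measures, while descriptive attributes live in the dimensions; this keeps the fact table narrow and makes typical slice-and-aggregate queries fast.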

Posted 6 days ago

Apply

5.0 - 13.0 years

0 Lacs

Pune, Maharashtra

On-site

You are a highly skilled and experienced Cloud Architect/Engineer with deep expertise in Google Cloud Platform (GCP). Your primary responsibility is to design, build, and manage scalable and reliable cloud infrastructure on GCP, leveraging services such as Compute Engine, Cloud Run, BigQuery, Pub/Sub, Cloud Functions, Dataflow, Dataproc, IAM, and Cloud Storage to deliver high-performance cloud solutions. You will develop and maintain CI/CD pipelines, automate infrastructure deployment using Infrastructure as Code (IaC) principles, and implement best practices in cloud security, monitoring, performance tuning, and logging. Collaboration with cross-functional teams to deliver cloud solutions aligned with business objectives is essential, as is contributing to system reliability, backup, and disaster recovery strategies.

You should have 5+ years of hands-on experience in cloud architecture and engineering, with at least 3 years of practical experience on GCP and in-depth expertise in the services listed above. A strong understanding of networking, security, containerization (Docker, Kubernetes), and CI/CD pipelines is essential. Experience with monitoring, performance tuning, and logging in cloud environments is preferred, and familiarity with DevSecOps practices and tools such as HashiCorp Vault is a plus. This hybrid role is based in Pune and requires 10 to 13 years of total relevant experience.

Posted 6 days ago

Apply

5.0 - 9.0 years

0 Lacs

Haryana

On-site

You will be working as a Technical Lead Data Engineer for a leading data and AI/ML solutions provider based in Gurgaon. In this role, you will design, develop, and lead complex data projects, primarily on Google Cloud Platform and other modern data stacks.

Your key responsibilities include leading the design and implementation of robust data pipelines, collaborating with cross-functional teams to deliver end-to-end data solutions, owning project modules, developing technical roadmaps, and implementing data governance frameworks on GCP. You will integrate GCP data services such as BigQuery, Dataflow, Dataproc, Cloud Composer, Vertex AI Studio, and GenAI with platforms such as Snowflake. Additionally, you will write efficient code in Python, SQL, and ETL/orchestration tools; utilize containerized solutions for scalable deployments; and apply expertise in PySpark, Kafka, and advanced data querying in high-volume data environments. Monitoring, optimizing, and troubleshooting system performance, reducing job run times through architecture optimization, developing data warehouses, and mentoring team members are also part of the role.

To succeed in this position, you should have a Bachelor's or Master's degree in Computer Science, Engineering, or a related field; extensive hands-on experience with Google Cloud Platform data services and Snowflake integration; strong programming skills in Python and SQL; proficiency in PySpark, Kafka, and data querying tools; and experience with containerized solutions using Google Kubernetes Engine. Strong communication and documentation skills, experience with large distributed datasets, and the ability to balance short-term deliverables with long-term technical sustainability are also required. Prior leadership experience in data engineering teams and exposure to cloud data platforms are desirable.

This role offers the opportunity to lead high-impact data projects for reputed clients in a fast-growing data consulting environment, work with cutting-edge technologies, and collaborate in an innovative, growth-oriented culture.
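To make the PySpark requirement concrete, here is a minimal sketch of a high-volume aggregation job; the bucket path and column names are hypothetical.

```python
# Minimal PySpark sketch: aggregating a high-volume dataset.
# Bucket path and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-revenue").getOrCreate()

orders = spark.read.parquet("gs://example-bucket/orders/")  # hypothetical path

daily = (
    orders
    .filter(F.col("status") == "COMPLETE")
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"))
)

daily.write.mode("overwrite").parquet("gs://example-bucket/marts/daily_revenue/")
```

On GCP, a job like this would typically run on Dataproc, with the output tables registered in BigQuery or Snowflake for downstream querying.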

Posted 1 week ago

Apply

4.0 - 7.0 years

9 - 13 Lacs

Pune

Work from Office

We are looking for a skilled Java + GCP developer with experience in shell scripting, Python, Java, Spring Boot, and BigQuery. The ideal candidate should have hands-on experience with Java, Spring Boot, and Google Cloud Platform (GCP).

Posted 1 week ago

Apply