Jobs
Interviews

279 Cloud SQL Jobs - Page 2

Set up a Job Alert
JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

6.0 - 10.0 years

9 - 13 Lacs

Jaipur

Work from Office

Job Summary
Auriga is seeking a Senior Fullstack Engineer to drive backend development, cloud architecture, and API design for our Composable DXP on GCP. This role involves mentoring engineers, leading architectural decisions, and collaborating across teams to deliver scalable, secure solutions. Expertise in modern backend technologies, security best practices, and cloud-native architectures is key. If you're driven by scalable cloud solutions and technical leadership, we look forward to connecting with you.

Your Responsibilities
- Architect and develop scalable, reliable, and secure backend services using modern TypeScript and Node.js frameworks (NestJS preferred).
- Design, implement, and optimize APIs (REST and GraphQL), ensuring high performance, scalability, and efficient data exchange.
- Architect, manage, and optimize databases, primarily PostgreSQL and BigQuery, ensuring high availability and performance.
- Develop and maintain cloud-native applications on Google Cloud Platform (GCP), leveraging services such as Cloud Run, Pub/Sub, Cloud Storage, Cloud SQL, and IAM.
- Implement event-driven and microservices-based architectures using Google Cloud Pub/Sub.
- Optimize API and data-layer performance, collaborating with front-end engineers to ensure efficient data fetching and caching strategies.
- Lead and mentor backend engineers, providing technical guidance, architecture reviews, and career development support.
- Work with our Platform Engineering teams to implement CI/CD best practices using GitHub Actions and Terraform, ensuring automated, reliable deployments on GCP Cloud Run.
- Ensure security best practices across backend systems, ensuring compliance with OWASP, OAuth, JWT, and GCP security frameworks.
- Integrate third-party services such as Contentful, Talon.One, Algolia, and Segment, ensuring seamless composable architecture integrations.
- Collaborate with Tech Directors, Product Managers, and other teams to align technical execution with business goals.
- Stay updated on cutting-edge backend technologies, continuously improving our architecture, tooling, and processes.

Preferred Skills
- Strong proficiency in English (written and verbal communication) is required.
- Experience managing and mentoring other engineers.
- Experience working with remote teams across North America and Latin America, ensuring smooth collaboration across time zones.
- Expertise in TypeScript and Node.js, with deep experience in modern NestJS (9+) or similar backend frameworks.
- Strong database expertise with PostgreSQL (including performance tuning, indexing, and partitioning) and an understanding of BigQuery.
- Experience designing, implementing, and securing RESTful and GraphQL APIs.
- Extensive experience with Google Cloud Platform (GCP), including Cloud Run, Pub/Sub, Cloud Storage, and IAM (equivalent AWS experience considered).
- Proficiency with Infrastructure as Code (Terraform) and CI/CD pipelines (GitHub Actions preferred).
- Deep understanding of microservices, event-driven systems, and distributed architectures.
- Experience with authentication, authorization, and security best practices (OAuth, JWT, OWASP, API Gateway security, GCP security best practices).
- Ability to troubleshoot and optimize high-traffic applications running on GCP.
- Experience working in Agile environments, balancing priorities across multiple projects.

Nice to Have
- Experience with edge computing, serverless functions, and Cloudflare Workers.
- Knowledge of message brokers like Kafka or RabbitMQ for event-driven architectures.
- Experience implementing observability and monitoring tools (Datadog, Grafana, Prometheus, or equivalent).
- Experience with headless eCommerce platforms such as commercetools.
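The posting above calls for event-driven, microservices-based architectures built on Google Cloud Pub/Sub. As a rough illustration of that pattern only (the role itself targets TypeScript/NestJS), here is a minimal Python sketch using the google-cloud-pubsub client; the project, topic, and subscription names are hypothetical placeholders, not anything specified by the employer.

```python
# Illustrative sketch only: publish an event to Pub/Sub and consume it from a
# subscription. Project, topic, and subscription names are hypothetical.
import json
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

PROJECT_ID = "my-gcp-project"          # hypothetical
TOPIC_ID = "order-events"              # hypothetical
SUBSCRIPTION_ID = "order-events-sub"   # hypothetical


def publish_order_created(order_id: str) -> None:
    """Publish an 'order.created' event to the topic."""
    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(PROJECT_ID, TOPIC_ID)
    payload = json.dumps({"type": "order.created", "order_id": order_id}).encode("utf-8")
    publisher.publish(topic_path, payload).result()  # wait until accepted


def consume(timeout: float = 30.0) -> None:
    """Pull events from the subscription and acknowledge each one."""
    subscriber = pubsub_v1.SubscriberClient()
    sub_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)

    def handler(message: pubsub_v1.subscriber.message.Message) -> None:
        event = json.loads(message.data.decode("utf-8"))
        print("received", event["type"], event.get("order_id"))
        message.ack()

    streaming_pull = subscriber.subscribe(sub_path, callback=handler)
    try:
        streaming_pull.result(timeout=timeout)
    except TimeoutError:
        streaming_pull.cancel()


if __name__ == "__main__":
    publish_order_created("ord-123")
    consume()
```

In a real service the publisher and consumer would live in separate deployables (for example, separate Cloud Run services), which is the decoupling the posting's event-driven requirement is pointing at.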

Posted 1 week ago

Apply

10.0 - 20.0 years

15 - 25 Lacs

Hyderabad

Work from Office

We're Hiring: Senior Database Administrator (PostgreSQL/Oracle, GCP Cloud) | Hyderabad

Are you a seasoned Database Administrator with a strong foundation in Oracle DBA work and a proven track record in PostgreSQL on GCP? Do you thrive on solving complex migration challenges and driving cloud transformation? If yes, we'd love to connect with you!

Job Title: Senior Database Administrator (PostgreSQL/Oracle - GCP Cloud)

About the Role
We are seeking a highly experienced Senior Database Administrator (DBA) with strong expertise in PostgreSQL and Oracle, particularly in cloud-based environments on GCP (Google Cloud Platform). The ideal candidate should have started their career as an Oracle DBA and grown into a PostgreSQL DBA with proven success in end-to-end database migration projects. This role requires not only technical excellence in PostgreSQL/AlloyDB but also the ability to drive database transformation initiatives, optimize performance, and ensure database reliability at scale.

Key Responsibilities
- Lead and execute full-lifecycle database migration projects from Oracle to PostgreSQL on GCP.
- Administer, monitor, and optimize PostgreSQL and AlloyDB clusters to ensure high availability, scalability, and performance.
- Design and implement backup, recovery, and disaster recovery strategies in line with enterprise standards.
- Perform database tuning, indexing strategies, query optimization, and partitioning for large-scale workloads.
- Ensure security, compliance, and best practices for data governance across multi-database environments.
- Collaborate with development and infrastructure teams to support data-driven applications and analytics.
- Provide mentorship, technical guidance, and knowledge sharing within the DBA team.

Key Requirements
- 12+ years of experience in database administration, with strong expertise in Oracle and PostgreSQL.
- Must have started as an Oracle DBA and currently be working as a PostgreSQL DBA in a cloud-native environment.
- Hands-on experience with GCP (must) and AlloyDB (preferred; willingness to adopt if not yet experienced).
- Proven track record of leading at least 1-2 full-lifecycle database migration projects from Oracle to PostgreSQL on GCP.
- Strong knowledge of database performance tuning, query optimization, partitioning, replication, and clustering.
- Flexibility to work with other databases such as MySQL and emerging database technologies.
- Solid understanding of security, backup/recovery, and HA/DR solutions in cloud environments.
- Excellent communication skills with the ability to interact with stakeholders and present technical solutions clearly.

Good to Have
- Knowledge of Kubernetes, Terraform, or automation frameworks for database provisioning.
- Exposure to BigQuery, Dataflow, or other GCP data services.
- Experience with multi-cloud database migration strategies.

Why Join Us?
- Opportunity to work on cutting-edge GCP Cloud migration projects.
- Be part of a team driving enterprise-scale digital transformation.
- Competitive compensation with opportunities for growth and leadership.
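One recurring step in the Oracle-to-PostgreSQL migrations this posting describes is post-cutover validation. The sketch below is a minimal, hypothetical example of one such check (row-count parity between the Oracle source and the PostgreSQL target); connection details and table names are placeholders, and real migrations validate far more than counts.

```python
# A minimal post-migration validation sketch: compare row counts per table
# between the Oracle source and the PostgreSQL target (e.g., Cloud SQL).
# Connection details and table names are hypothetical placeholders.
import oracledb
import psycopg2

TABLES = ["customers", "accounts", "transactions"]  # hypothetical


def count_rows(conn, table: str) -> int:
    """Return SELECT COUNT(*) for a table on either connection type."""
    with conn.cursor() as cur:
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        return cur.fetchone()[0]


def main() -> None:
    ora = oracledb.connect(user="app", password="***", dsn="onprem-db:1521/ORCLPDB1")
    pg = psycopg2.connect(host="10.0.0.5", dbname="appdb", user="app", password="***")
    try:
        for table in TABLES:
            src, dst = count_rows(ora, table), count_rows(pg, table)
            status = "OK" if src == dst else "MISMATCH"
            print(f"{table}: oracle={src} postgres={dst} [{status}]")
    finally:
        ora.close()
        pg.close()


if __name__ == "__main__":
    main()
```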

Posted 1 week ago

Apply

5.0 - 10.0 years

3 - 6 Lacs

Mumbai

Work from Office

The primary objective of this role is to design, develop, and maintain databases that meet the organization's requirements for storing and analyzing financial data. You will also be responsible for ensuring data integrity, security, and performance across various database platforms.

Your tasks
- Design, develop, and optimize relational and non-relational databases to support the organization's financial data needs.
- Implement data models, schemas, and indexing strategies to optimize database performance and scalability.
- Collaborate with data engineering and software development teams to integrate database solutions into our applications and services.
- Perform database tuning, monitoring, and troubleshooting to ensure high availability and reliability.
- Implement data security measures, including access control and encryption, to protect sensitive financial information.
- Develop and maintain documentation for database design, configuration, and best practices.
- Stay current with emerging database technologies and trends to drive continuous improvement and innovation.

You need to have
- Bachelor's degree in Software Engineering, Computer Science, or a related field.
- Minimum 5 years of experience as a database developer.
- Proven experience as a database developer or administrator, with expertise in relational databases such as MySQL and non-relational databases such as MongoDB, Elasticsearch, and Redis.
- Strong SQL skills and experience with database optimization techniques.
- Experience working with large datasets and complex data models in a financial or similar domain.
- Proficiency in database performance tuning, monitoring, and troubleshooting.
- Excellent problem-solving and analytical skills, with the ability to collaborate effectively in a team environment.
- Familiarity with data security best practices and compliance standards (e.g., GDPR, PCI DSS).
- Ability to work on multiple projects simultaneously.
- Experience with cloud-based database platforms such as Amazon RDS, Google Cloud SQL, or Azure Cosmos DB.
- Knowledge of distributed database systems and big data technologies (e.g., Hadoop, Spark).
- Experience with data warehousing solutions and ETL processes.
- Familiarity with DevOps practices and tools for database automation and CI/CD.
- Previous experience in the financial services industry or a similar regulated environment.
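The indexing and optimization work this posting describes often boils down to inspecting a query plan, adding an index, and inspecting it again. The sketch below illustrates that loop against MySQL (named in the posting); the table, column, and connection details are hypothetical.

```python
# Minimal optimization sketch: check the plan for a frequent lookup, add an
# index, and check the plan again. Schema and credentials are hypothetical.
import mysql.connector

QUERY = "SELECT id, amount, created_at FROM transactions WHERE account_id = %s"


def show_plan(cur, account_id: int) -> None:
    cur.execute("EXPLAIN " + QUERY, (account_id,))
    for row in cur.fetchall():
        print(row)


def main() -> None:
    conn = mysql.connector.connect(
        host="127.0.0.1", user="app", password="***", database="finance"
    )
    cur = conn.cursor()

    show_plan(cur, 42)  # likely a full table scan without an index
    cur.execute("CREATE INDEX idx_tx_account ON transactions (account_id)")
    show_plan(cur, 42)  # should now show an index lookup on idx_tx_account

    cur.close()
    conn.close()


if __name__ == "__main__":
    main()
```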

Posted 1 week ago

Apply

5.0 - 8.0 years

25 - 40 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Salary: 25 to 40 LPA | Experience: 5 to 10 years | Location: Gurgaon/Pune/Bengaluru | Notice: Immediate to 30 days

Job Profile
Experienced Data Engineer with a strong foundation in designing, building, and maintaining scalable data pipelines and architectures. Skilled in transforming raw data into clean, structured formats for analytics and business intelligence. Proficient in modern data tools and technologies such as SQL, T-SQL, Python, Databricks, and cloud platforms (Azure). Adept at data wrangling, modeling, ETL/ELT development, and ensuring data quality, integrity, and security. Collaborative team player with a track record of enabling data-driven decision-making across business units.

As a Data Engineer, the candidate will work on assignments for one of our Utilities clients. Collaborating with cross-functional teams and stakeholders involves gathering data requirements, aligning business goals, and translating them into scalable data solutions. The role includes working closely with data analysts, scientists, and business users to understand needs, designing robust data pipelines, and ensuring data is accessible, reliable, and well documented. Regular communication, iterative feedback, and joint problem-solving are key to delivering high-impact, data-driven outcomes that support organizational objectives. This position requires a proven track record of transforming processes and driving customer value and cost savings, with experience running end-to-end analytics for large-scale organizations.

Responsibilities
- Design, build, and maintain scalable data pipelines to support analytics, reporting, and advanced modeling needs.
- Collaborate with consultants, analysts, and clients to understand data requirements and translate them into effective data solutions.
- Ensure data accuracy, quality, and integrity through validation, cleansing, and transformation processes.
- Develop and optimize data models, ETL workflows, and database architectures across cloud and on-premises environments.
- Support data-driven decision-making by delivering reliable, well-structured datasets and enabling self-service analytics.
- Provide seamless integration with cloud platforms (Azure), making it easy to build and deploy end-to-end data pipelines in the cloud.
- Configure scalable Databricks clusters for handling large datasets and complex computations, optimizing performance and cost.

Must Have
- Client engagement experience and collaboration with cross-functional teams.
- Data engineering background in Databricks.
- Capable of working effectively as an individual contributor or in collaborative team environments.
- Effective communication and thought leadership with a proven record.

Candidate Profile
- Bachelor's/Master's degree in economics, mathematics, computer science/engineering, operations research, or related analytics areas.
- 3+ years of experience must be in data engineering.
- Hands-on experience with SQL, Python, Databricks, and cloud platforms such as Azure.
- Prior experience managing and delivering end-to-end projects.
- Outstanding written and verbal communication skills.
- Able to work in a fast-paced, continuously evolving environment and ready to take on uphill challenges.
- Able to understand cross-cultural differences and work with clients across the globe.
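The "raw data into clean, structured formats" work this posting centres on is typically expressed as a small Databricks/PySpark transformation. Here is a minimal, hypothetical sketch of such a cleansing step; file paths and column names are invented for illustration, and on Databricks the SparkSession is normally provided as `spark` already.

```python
# Minimal PySpark cleansing/aggregation sketch for a utilities-style dataset.
# Paths and columns are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("meter-readings-etl").getOrCreate()

raw = spark.read.option("header", True).csv("/mnt/raw/meter_readings/")

clean = (
    raw.dropDuplicates(["meter_id", "reading_ts"])
       .withColumn("reading_kwh", F.col("reading_kwh").cast("double"))
       .withColumn("reading_date", F.to_date("reading_ts"))
       .filter(F.col("reading_kwh").isNotNull())
)

daily = (
    clean.groupBy("meter_id", "reading_date")
         .agg(F.sum("reading_kwh").alias("total_kwh"))
)

# Write the curated output partitioned by date for downstream BI consumption.
daily.write.mode("overwrite").partitionBy("reading_date").parquet("/mnt/curated/daily_usage/")
```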

Posted 1 week ago

Apply

4.0 - 9.0 years

20 - 35 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Salary: 20 to 35 LPA | Experience: 5 to 8 years | Location: Gurgaon/Pune/Bangalore | Notice: Immediate to 30 days

Roles and Responsibilities
- Design, develop, test, deploy, and maintain large-scale data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, Dataproc, and Cloud Storage.
- Collaborate with cross-functional teams to identify business requirements and design solutions that meet those needs.
- Develop complex SQL queries to extract insights from large datasets stored in Google Cloud SQL databases.
- Troubleshoot issues related to data processing workflows and provide timely resolutions.

Desired Candidate Profile
- 5-9 years of experience in data engineering, with expertise in GCP and BigQuery data engineering.
- Strong understanding of GCP platform administration, including Compute Engine (Dataproc), Kubernetes Engine (GKE), Cloud Storage, and Cloud SQL.
- Experience working on big data analytics projects involving ETL processes using tools such as Airflow or similar technologies.
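For the "complex SQL queries to extract insights" responsibility above, analysis on GCP is commonly run through the BigQuery Python client. The sketch below is illustrative only; the project, dataset, and table names are hypothetical.

```python
# Minimal BigQuery query sketch: aggregate recent order data and print the
# top customers. Project, dataset, and table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # hypothetical project

SQL = """
SELECT customer_id, COUNT(*) AS orders, SUM(order_value) AS revenue
FROM `my-gcp-project.sales.orders`
WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 90 DAY)
GROUP BY customer_id
ORDER BY revenue DESC
LIMIT 20
"""

for row in client.query(SQL).result():
    print(row["customer_id"], row["orders"], row["revenue"])
```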

Posted 1 week ago

Apply

6.0 - 10.0 years

20 - 22 Lacs

Indore, Pune, Chennai

Work from Office

We are looking for a skilled and experienced Data Engineer to support data platform modernization initiatives, with a focus on advisory and implementation within the Google Cloud Platform (GCP). The ideal candidate will have strong expertise in working with relational and NoSQL databases such as SQL Server, Oracle, and optionally MongoDB, along with hands-on experience in data pipeline development, cloud migration, and performance optimization. In this role, you will be responsible for advising and supporting data engineering efforts on GCP, including services like Compute Engine, Cloud SQL, and BigQuery. You will play a key role in designing scalable data solutions, defining backup and disaster recovery strategies, and ensuring observability across data systems. A significant part of the role involves providing guidance on GCVE (Google Cloud VMware Engine) environments and creating robust, cloud-native data migration plans. You will collaborate with cross-functional teams, including cloud architects and business stakeholders, to ensure data solutions align with business goals. Your input will help shape data architecture, optimize storage and compute costs, and ensure data integrity, security, and high availability.

Posted 1 week ago

Apply

6.0 - 10.0 years

20 - 22 Lacs

Hyderabad, Ahmedabad, Bengaluru

Work from Office

We are looking for a skilled and experienced Data Engineer to support data platform modernization initiatives, with a focus on advisory and implementation within the Google Cloud Platform (GCP). The ideal candidate will have strong expertise in working with relational and NoSQL databases such as SQL Server, Oracle, and optionally MongoDB, along with hands-on experience in data pipeline development, cloud migration, and performance optimization. In this role, you will be responsible for advising and supporting data engineering efforts on GCP, including services like Compute Engine, Cloud SQL, and BigQuery. You will play a key role in designing scalable data solutions, defining backup and disaster recovery strategies, and ensuring observability across data systems. A significant part of the role involves providing guidance on GCVE (Google Cloud VMware Engine) environments and creating robust, cloud-native data migration plans. You will collaborate with cross-functional teams, including cloud architects and business stakeholders, to ensure data solutions align with business goals. Your input will help shape data architecture, optimize storage and compute costs, and ensure data integrity, security, and high availability.

Posted 1 week ago

Apply

3.0 - 6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job description
Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.

HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Senior Software Engineer.

In this role, you will:
- Migrate and re-engineer existing services from on-premises data centres to the cloud (GCP/AWS).
- Understand business requirements and provide real-time solutions.
- Follow project development tools such as Jira, Confluence and Git.
- Write Python/shell scripts to automate operations and server management.
- Build and maintain operations tools for monitoring, notifications, trending, and analysis.
- Define, create, test, and execute operations procedures.
- Document current and future configuration processes and policies.

Requirements
To be successful in this role, you should meet the following requirements:
- 3 to 6 years of experience.
- Hadoop knowledge and NiFi/Kafka experience.
- Python/Java at an intermediate level.
- Good experience/knowledge of GCP components such as GCS, BigQuery, Airflow, Cloud SQL, Pub/Sub/Kafka, Dataflow and the Google Cloud SDK.
- Experience in Java, the Spring Boot framework and JPA/Hibernate is preferable.
- Understanding of Terraform and shell scripting.
- Experience with at least one RDBMS.
- GCP Data Engineer certification is an added advantage.
- Knowledge of data warehouse/ETL and big data technologies such as Hive, Spark and NiFi.
- Flexible to work in a Linux/Unix environment for handling support and execution activities.
- Good understanding of DWH, data ingestion and data engineering concepts.
- Good to have: knowledge of Jenkins, Ansible, Git and CI/CD.
- Flexible to adopt new technologies and skills.
- Good to have: knowledge of GCP and scheduling tools such as Control-M and TWS.

You'll achieve more when you join HSBC. www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued and respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by - HSBC Software Development India
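The GCS/BigQuery/Airflow stack listed above is usually stitched together as an orchestrated load job. Below is a minimal, hypothetical Airflow DAG sketch using the Google provider's GCS-to-BigQuery transfer operator; bucket, dataset, and table names are placeholders, and a production DAG would add retries, alerting, and schema management.

```python
# Minimal Airflow DAG sketch: load a daily file from Cloud Storage into
# BigQuery. Bucket, dataset, and table names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="daily_gcs_to_bigquery",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load_events = GCSToBigQueryOperator(
        task_id="load_events",
        bucket="my-landing-bucket",                                # hypothetical
        source_objects=["events/{{ ds }}/*.json"],                 # one folder per run date
        destination_project_dataset_table="my-project.analytics.events",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_APPEND",
        autodetect=True,
    )
```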

Posted 1 week ago

Apply

5.0 - 7.0 years

13 - 17 Lacs

Bengaluru

Work from Office

Skills
- Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc.
- Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations.
- Ability to analyse data for functional business requirements and to interface directly with the customer.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.

Preferred technical and professional experience
- An individual with the ability to manage change and proven time management skills.
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed.
- Keeps technical knowledge up to date by attending educational workshops and reviewing publications.

Posted 1 week ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Hyderabad, Telangana, India

On-site

Job Title: Senior Google Cloud Engineer

What You Will Do
Let's do this. Let's change the world. In this vital role, you will be responsible for designing, building, and maintaining scalable, secure, and reliable Google Cloud infrastructure.
- Architect, implement, and manage a highly available Google Cloud environment.
- Design and implement VPC, Cloud DNS, VPN, Cloud Interconnect, Cloud CDN, and IAM policies to enforce security standards and processes.
- Implement robust security practices and enforce security policies using Identity and Access Management (IAM), VPC Service Controls, and Security Command Center.
- Architect solutions with cost optimization in mind using Google Cloud Billing and cloud cost management tools.

Infrastructure as Code (IaC) & Automation
- Deploy and maintain Infrastructure as Code (IaC) and Site Reliability Engineering (SRE) principles using tools like Terraform and Google Cloud Deployment Manager.
- Automate deployment, scaling, and monitoring using GCP-native tools and scripting.
- Implement and manage CI/CD pipelines for infrastructure and application deployments.

Cloud Security & Compliance
- Enforce best practices in IAM, encryption, and network security.
- Ensure compliance with SOC 2, ISO 27001, and NIST standards.
- Implement Security Command Center, Cloud Armor, and Cloud IDS for threat detection and response.

Monitoring & Performance Optimization
- Set up Cloud Monitoring, Cloud Logging, Cloud Trace, and Cloud Profiler to enable proactive monitoring, trace analysis, and performance tuning of GCP resources.
- Implement autoscaling, Cloud Load Balancing, and caching strategies for performance optimization.
- Troubleshoot cloud infrastructure issues and conduct root cause analysis.

Collaboration & DevOps Practices
- Work closely with software engineers, SREs, and DevOps teams to support deployments.
- Maintain GitOps standard processes for cloud infrastructure versioning.
- Support an on-call rotation for high-priority cloud incidents.

What We Expect of You
We are all different, yet we all use our unique contributions to serve patients. This is a hands-on engineering role requiring deep expertise in Infrastructure as Code (IaC), automation, cloud networking, and security. Blending cloud engineering and operations expertise, the individual will ensure that our cloud environment is running efficiently and securely while also being responsible for the day-to-day operational management, support, and maintenance of the cloud infrastructure.

Must-Have Skills
- Deep hands-on experience with GCP (IAM, Compute Engine, Google Kubernetes Engine (GKE), Cloud Functions, Cloud Pub/Sub, BigQuery, Cloud SQL, Cloud Storage, Cloud Firestore, Cloud Load Balancing, VPC, etc.).
- Expertise in Terraform for GCP infrastructure automation.
- Strong knowledge of GCP networking (VPC, Cloud DNS, VPN, Cloud Interconnect, Cloud CDN).
- Experience with Linux administration, scripting (Python, Bash), and CI/CD tools (Jenkins, GitHub Actions, GitLab, etc.).
- Strong troubleshooting and debugging skills in cloud networking, storage, and security.

Good-to-Have Skills
- Prior experience with containerization (Docker, Kubernetes) and serverless architectures.
- Familiarity with Cloud CDK, Ansible, or Packer for cloud automation.
- Exposure to hybrid and multi-cloud environments (AWS, Azure).
- Familiarity with HPC and DGX Cloud.

Basic Qualifications
- Bachelor's degree in Computer Science, IT, or a related field with 6-8 years of hands-on cloud experience.

Professional Certifications (Preferred)
- Certifications in GCP (e.g., Google Cloud Certified Professional Cloud Architect, Professional Cloud DevOps Engineer) are a plus.
- Terraform Associate certification.

Soft Skills
- Strong analytical and problem-solving skills.
- Ability to work effectively with global, virtual teams.
- Effective communication and collaboration with cross-functional teams.
- Ability to work in a fast-paced, cloud-first environment.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

This role is for a GCP Data Engineer who can build cloud analytics platforms to meet expanding business requirements with speed and quality using lean Agile practices. You will work on analysing and manipulating large datasets, supporting the enterprise by activating data assets to support Enabling Platforms and Analytics in GCP. You will be responsible for designing the transformation and modernization on GCP. Experience with large-scale solutions and operationalizing data warehouses, data lakes and analytics platforms on Google Cloud Platform or another cloud environment is a must. We are looking for candidates who have a broad set of technology skills across these areas and who can demonstrate an ability to design the right solutions with an appropriate combination of GCP and third-party technologies for deployment on the Google Cloud Platform.

Responsibilities
- Develop technical solutions for Data Engineering and work between 1 PM and 10 PM IST to enable more overlap time with European and North American counterparts. This role will work closely with teams in the US as well as Europe to ensure robust, integrated migration aligned with Global Data Engineering patterns and standards.
- Design and deploy data pipelines with automated data lineage.
- Develop reusable Data Engineering patterns.
- Design and build production data engineering solutions to deliver pipeline patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine.
- Ensure timely migration of the Ford Credit Europe (FCE) Teradata warehouse to GCP and enable Teradata platform decommissioning by end of 2025, with a strong focus on ensuring continued, robust, and accurate regulatory reporting capability.

Position Opportunities
The Data Engineer role within FC Data Engineering offers the following opportunities for successful individuals:
- Key player in a high-priority program to unlock the potential of Data Engineering products and services and secure operational resilience for Ford Credit Europe.
- Explore and implement leading-edge technologies, tooling and software development best practices.
- Experience of managing data warehousing and product delivery within a financially regulated environment.
- Experience of collaborative development practices within an open-plan, team-designed environment.
- Experience of working with third-party suppliers / supplier management.
- Continued personal and professional development, with support and encouragement for further certification.

Qualifications
Essential:
- 5+ years of experience in data engineering, with a focus on data warehousing and ETL development (including data modelling, ETL processes, and data warehousing principles).
- 5+ years of SQL development experience.
- 3+ years of cloud experience (GCP preferred) with solutions designed and implemented at production scale.
- Strong understanding of key GCP services, especially those related to data processing (batch/real time) leveraging Terraform, BigQuery, Dataflow, Data Fusion, Dataproc, Cloud Build, Airflow, and Pub/Sub, alongside storage services including Cloud Storage, Bigtable, and Cloud Spanner.
- Excellent problem-solving skills, with the ability to design and optimize complex data pipelines.
- Strong communication and collaboration skills, capable of working effectively with both technical and non-technical stakeholders as part of a large, global, diverse team.
- Experience developing with microservice architecture from a container orchestration framework.
- Designing pipelines and architectures for data processing.
- Strong evidence of self-motivation to continuously develop own engineering skills and those of the team.
- Proven record of working autonomously in areas of high ambiguity, without day-to-day supervisory support.
- Evidence of a proactive mindset to problem solving and willingness to take the initiative.
- Strong prioritization, coordination, organizational and communication skills, and a proven ability to balance workload and competing demands to meet deadlines.

Desired:
- Professional certification in GCP (e.g., Professional Data Engineer).
- Data engineering or development experience gained in a regulated, financial environment.
- Experience with Teradata to GCP migrations is a plus.
- Strong expertise in SQL and experience with programming languages such as Python, Java, and/or Apache Beam.
- Experience coaching and mentoring Data Engineers.
- Experience with data security, governance, and compliance best practices in the cloud.
- An understanding of current architecture standards and digital platform services strategy.
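The posting repeatedly references Dataflow pipelines and Apache Beam. As a minimal, hypothetical illustration of that pattern only (not Ford's actual pipeline), here is a small Beam batch job that reads JSON lines from Cloud Storage, transforms them, and writes to BigQuery; bucket, project, and table names are placeholders.

```python
# Minimal Apache Beam sketch: GCS -> transform -> BigQuery. Names are
# hypothetical; swap DirectRunner for DataflowRunner to execute on GCP.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse(line: str) -> dict:
    record = json.loads(line)
    return {"account_id": record["account_id"], "balance": float(record["balance"])}


def run() -> None:
    options = PipelineOptions(
        runner="DirectRunner",                 # DataflowRunner in production
        project="my-gcp-project",              # hypothetical
        temp_location="gs://my-temp-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://my-landing-bucket/balances/*.json")
            | "Parse" >> beam.Map(parse)
            | "Write" >> beam.io.WriteToBigQuery(
                "my-gcp-project:finance.balances",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```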

Posted 1 week ago

Apply

5.0 - 10.0 years

1 - 1 Lacs

Chennai

Hybrid

Overview: TekWissen is a global workforce management provider operating throughout India and many other countries in the world. The client below is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place, one that benefits lives, communities and the planet.

Job Title: Data Engineering Engineer II
Location: Chennai
Work Type: Hybrid

Position Description:
Employees in this job function are responsible for designing, building, and maintaining data solutions, including data infrastructure, pipelines, etc., for collecting, storing, processing and analyzing large volumes of data efficiently and accurately.

Key Responsibilities:
- Collaborate with business and technology stakeholders to understand current and future data requirements.
- Design, build and maintain reliable, efficient and scalable data infrastructure for data collection, storage, transformation, and analysis.
- Plan, design, build and maintain scalable data solutions, including data pipelines, data models, and applications, for efficient and reliable data workflows.
- Design, implement and maintain existing and future data platforms such as data warehouses, data lakes and data lakehouses for structured and unstructured data.
- Design and develop analytical tools, algorithms, and programs to support data engineering activities, such as writing scripts and automating tasks.
- Ensure optimum performance and identify improvement opportunities.

Skills Required: Google Cloud Platform - BigQuery, Dataflow, Dataproc, Data Fusion, Terraform, Tekton, Cloud SQL, Airflow, Postgres, PySpark, Python, APIs
Skills Preferred: GenAI
Experience Required: Engineer II, 4+ years of data engineering work experience

Experience Preferred:
- Strong proficiency and hands-on experience in Python (must-have) and Java (nice-to-have).
- Experience building and maintaining data pipelines (batch or streaming), preferably on cloud platforms (especially GCP).
- Experience with at least one major distributed data processing framework (e.g., DBT, Dataform, Apache Spark, Apache Flink, or similar).
- Experience with workflow orchestration tools (e.g., Apache Airflow, Qlik Replicate, etc.).
- Experience working with relational databases (SQL) and understanding of data modeling principles.
- Experience with cloud platforms (preferably GCP; AWS or Azure will also do) and relevant data services (e.g., BigQuery, GCS, Data Factory, Dataproc, Dataflow, S3, EMR, Glue, etc.).
- Experience with data warehousing concepts and platforms (BigQuery, Snowflake, Redshift, etc.).
- Understanding of concepts related to integrating or deploying machine learning models into production systems.
- Experience working in an Agile development environment and hands-on experience with an Agile work management tool (Rally, Jira, etc.).
- Experience with version control systems, particularly Git.
- Solid problem-solving, debugging, and analytical skills.
- Excellent communication and collaboration skills.
- Experience working in a production support team (L2/L3) for operational support.

Preferred Skills and Qualifications (Nice to Have):
- Familiarity with data quality and data governance concepts.
- Experience building and consuming APIs (REST, gRPC) related to data or model serving.
- Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related field.

Education Required: Bachelor's Degree
Education Preferred: Bachelor's Degree

TekWissen Group is an equal opportunity employer supporting workforce diversity.

Posted 2 weeks ago

Apply

6.0 - 10.0 years

18 - 20 Lacs

Chennai

Remote

We are seeking a highly experienced Cloud Database & Infrastructure Advisor with deep expertise in SQL Server and Oracle databases, and working knowledge of MongoDB (optional). The ideal candidate will act as a trusted advisor in guiding customers through critical infrastructure decisions, focusing on Google Cloud Platform (GCP) services including Compute Engine, Cloud SQL, Observability tools, Backup and Disaster Recovery (DR) strategies, and Google Cloud VMware Engine (GCVE). This role is predominantly advisory, requiring strong architectural insight and practical experience in helping enterprises plan and execute cloud migration strategies. You will work directly with customer technical teams and senior stakeholders to assess current-state infrastructure, recommend best-fit GCP services, and guide the migration and modernization journey. A major part of your role will involve advising on high availability, data replication, backup configurations, and observability frameworks using native GCP tools. You will provide governance and best practices for deploying SQL Server and Oracle workloads on GCP, ensuring performance, security, and scalability in both hybrid and fully cloud-native environments. Experience with GCVE is highly desirable, as you will be expected to offer guidance on VMware-based migration paths and workloads, ensuring seamless integration and minimal disruption during cloud transitions. Your insights will help customers align infrastructure decisions with their business continuity and compliance requirements. Candidates must have excellent communication and interpersonal skills, with the ability to clearly articulate technical concepts to both technical and non-technical audiences. A background in cloud advisory or enterprise IT consulting is highly preferred.

Posted 2 weeks ago

Apply

4.0 - 6.0 years

20 - 25 Lacs

Chennai

Work from Office

Position Description:
Employees in this job function are responsible for designing, building, and maintaining data solutions, including data infrastructure, pipelines, etc., for collecting, storing, processing and analyzing large volumes of data efficiently and accurately.

Key Responsibilities:
1) Collaborate with business and technology stakeholders to understand current and future data requirements.
2) Design, build and maintain reliable, efficient and scalable data infrastructure for data collection, storage, transformation, and analysis.
3) Plan, design, build and maintain scalable data solutions, including data pipelines, data models, and applications, for efficient and reliable data workflows.
4) Design, implement and maintain existing and future data platforms such as data warehouses, data lakes and data lakehouses for structured and unstructured data.
5) Design and develop analytical tools, algorithms, and programs to support data engineering activities, such as writing scripts and automating tasks.
6) Ensure optimum performance and identify improvement opportunities.

Posted 2 weeks ago

Apply

5.0 - 7.0 years

20 - 25 Lacs

Chennai

Work from Office

Position Description:
Representing the Ford Credit (FC) Data Engineering organization as a Google Cloud Platform (GCP) Data Engineer specializing in migration and transformation, you will be a developer on a global team building a complex data warehouse in the Google Cloud Platform. This role involves designing, implementing, and optimizing data pipelines, ensuring data integrity during migration, and leveraging GCP services to enhance data transformation processes for scalability and efficiency.

This role is for a GCP Data Engineer who can build cloud analytics platforms to meet expanding business requirements with speed and quality using lean Agile practices. You will work on analyzing and manipulating large datasets, supporting the enterprise by activating data assets to support Enabling Platforms and Analytics in GCP. You will be responsible for designing the transformation and modernization on GCP. Experience with large-scale solutions and operationalizing data warehouses, data lakes and analytics platforms on Google Cloud Platform or another cloud environment is a must. We are looking for candidates who have a broad set of technology skills across these areas and who can demonstrate an ability to design the right solutions with an appropriate combination of GCP and third-party technologies for deployment on the Google Cloud Platform.

Experience Required:
- 5+ years of experience in data engineering, with a focus on data warehousing and ETL development (including data modelling, ETL processes, and data warehousing principles).
- 5+ years of SQL development experience.
- 3+ years of cloud experience (GCP preferred) with solutions designed and implemented at production scale.
- Strong understanding of and experience with key GCP services, especially those related to data processing (batch/real time) leveraging Terraform, BigQuery, Dataflow, Data Fusion, Dataproc, Cloud Build, Airflow, and Pub/Sub, alongside storage services including Cloud Storage, Bigtable, and Cloud Spanner.
- Experience developing with microservice architecture from a container orchestration framework.
- Designing pipelines and architectures for data processing.
- Excellent problem-solving skills, with the ability to design and optimize complex data pipelines.
- Strong communication and collaboration skills, capable of working effectively with both technical and non-technical stakeholders as part of a large, global, diverse team.
- Strong evidence of self-motivation to continuously develop own engineering skills and those of the team.
- Proven record of working autonomously in areas of high ambiguity, without day-to-day supervisory support.
- Evidence of a proactive mindset to problem solving and willingness to take the initiative.
- Strong prioritization, coordination, organizational and communication skills, and a proven ability to balance workload and competing demands to meet deadlines.

Posted 2 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

Hyderabad, Telangana

On-site

You have over 10 years of experience in data engineering, specializing in cloud-based solutions. Your role involves designing solutions, reviewing the team's work, and providing guidance. Proficiency in Google Cloud Platform (GCP) and its various data services, such as BigQuery, DBT and streaming, Dataflow, Pub/Sub, Cloud Storage, and Cloud Composer, is essential. Your track record should demonstrate your ability to create scalable data pipelines and architectures, along with experience with ETL tools and processes, including implementing ETL processes to transfer data into GCP warehouses such as BigQuery.

Your technical skills should include proficiency in DBT and streaming, Dataflow, Cloud Storage, Cloud Composer, BigQuery, Cloud SQL, Cloud Firestore, Cloud Bigtable, Airflow, and GCP Data Catalog. You must be adept at SQL, database design, and optimization. Strong programming skills in Python, Java, or other relevant languages are necessary. Familiarity with data modeling, data warehousing, big data processing frameworks, data visualization tools like Looker and Data Studio, and machine learning workflows and tools will be advantageous.

In addition to technical expertise, soft skills are crucial for this role. You should possess strong analytical and problem-solving skills, excellent communication and collaboration abilities, and the capacity to work efficiently in a fast-paced environment while managing multiple tasks. Leadership skills are key, as you will be guiding and mentoring team members, providing assistance, and breaking down milestones into technical tasks. Estimation skills are also essential for successful project management.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

3 - 7 Lacs

Bengaluru

Work from Office

About The Role
Project Role: Application Support Engineer
Project Role Description: Act as software detectives; provide a dynamic service identifying and solving issues within multiple components of critical business systems.
Must-have skills: Microsoft SQL Server Administration
Good-to-have skills: NA
Minimum 2 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary:
As an MS SQL DBA, you will be responsible for overall SQL database administration with complete infrastructure management, ensuring the availability and performance of the databases that support the system. You will proactively monitor the database systems to ensure secure services with minimum downtime, and will be responsible for troubleshooting and problem solving, as well as improvement and maintenance, of the databases. Working knowledge of database architecture, database backup and restore, DB maintenance, and handling BAU activities is expected. You will perform many related database functions, including designing, implementing, and maintaining new databases, backup/recovery and configuration management, install database management systems (DBMS), and provide input for modification of procedures and documentation used for problem resolution and day-to-day maintenance.

Roles & Responsibilities
- In-depth SQL database administration experience.
- Responsible for ensuring the availability and performance of the databases in the system.
- Practical experience in database monitoring; proactively monitor the database systems to ensure services with minimum downtime.
- Good knowledge of SQL database architecture.
- Hands-on experience and troubleshooting for L2 and L3 tasks; provide SME support.
- Troubleshooting performance tuning issues.
- Good knowledge of implementing database-level security.
- Good interpersonal, written, and oral communication skills; able to document technical knowledge.
- Able to prioritize and execute tasks in a high-pressure environment.
- Experience working in a team-oriented, collaborative environment.
- Manage MS SQL Server databases; configure and maintain database servers and processes; monitor system health and performance.
- Good experience with SQL upgrades, SQL migrations, and SQL Server patching.
- Good experience with Performance Tuning and Optimization (PTO).
- Knowledge of High Availability (HA) and Disaster Recovery (DR) options for SQL Server.
- Ensure high levels of performance, availability, sustainability and security; analyze, solve, and correct issues in real time; provide suggestions for solutions.
- Refine and automate regular processes, track issues, and document changes.
- Assist developers with query tuning.
- Provide 24x7 support for critical production systems.

Professional & Technical Skills
- Good communication skills.
- Strong understanding of database management and optimization.
- Experience with performance tuning and troubleshooting database issues.
- Flexible with rotational shifts in a 24x7 environment.
- Good analytical skills and the ability to document technical knowledge.
- Good to have: cloud experience (Azure PaaS SQL, AWS SQL RDS, Google Cloud SQL, etc.), PowerShell scripting knowledge, and DB automation.

Additional Information
- The candidate should have a minimum of 5 years of experience in MS SQL DB administration.
- This position is based at our Bangalore office.
- A minimum of 15 years of full-time education is required.
- Experience with Google Cloud SQL and cron jobs will be an advantage.

Qualification: 15 years of full-time education

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Engineer at Aptiv, you will play a crucial role in designing, developing, and implementing a cost-effective, scalable, reusable, and secure ingestion framework. Your primary responsibility will be to work closely with business leaders, stakeholders, and source system subject matter experts (SMEs) to understand and define the business needs, translate them into technical specifications, and ingest data into Google Cloud Platform, specifically BigQuery. You will be involved in designing and implementing processes for data ingestion, transformation, storage, analysis, modeling, reporting, monitoring, availability, governance, and security of high volumes of structured and unstructured data.

Your role will involve developing and deploying high-throughput data pipelines using the latest Google Cloud Platform (GCP) technologies, serving as a specialist in data engineering and GCP data technologies, and engaging with clients to understand their requirements and translate them into technical data solutions. You will also be responsible for analyzing business requirements, creating source-to-target mappings, enhancing ingestion frameworks, and transforming data according to business rules. Additionally, you will develop capabilities to support enterprise-wide data cataloging, design data solutions with a focus on security and privacy, and utilize Agile and DataOps methodologies in project delivery.

To qualify for this role, you should have a Bachelor's or Master's degree in Computer Science, Data & Analytics, or a similar relevant field, along with at least 4 years of hands-on IT experience in a similar role. You should possess proven expertise in SQL (including subqueries, aggregations, functions, triggers, indexes, and database optimization) and deep experience working with various Google data products such as BigQuery, Dataproc, Data Catalog, Dataflow, and Cloud SQL, among others. Experience with tools like Qlik Replicate, Spark, and Kafka is also required. Strong communication skills, the ability to work with globally distributed teams, and knowledge of statistical methods and data modeling are essential for this role. Experience with designing and creating Tableau, Qlik, or Power BI dashboards, as well as knowledge of Alteryx and Informatica Data Quality, will be beneficial.

Aptiv provides an inclusive work environment where individuals can grow and develop, irrespective of gender, ethnicity, or beliefs. Safety is a core value at Aptiv, aiming for a world with zero fatalities, zero injuries, and zero accidents. The company offers a competitive health insurance package to support the physical and mental health of its employees. Additionally, Aptiv provides benefits such as personal holidays, healthcare, pension, a tax saver scheme, free onsite breakfast, discounted corporate gym membership, and access to transportation options at the Grand Canal Dock location.

If you are passionate about data engineering, GCP technologies, and driving value creation through data analytics, Aptiv offers a challenging and rewarding opportunity to grow and make a meaningful impact in a dynamic and innovative environment.
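The ingestion-into-BigQuery work described above usually starts with a simple, reusable load step. The sketch below is a minimal, hypothetical example of such a step using the BigQuery Python client; the bucket, dataset, and table names are invented, and a real ingestion framework would layer schema management, validation, and monitoring on top.

```python
# Minimal ingestion sketch: load a CSV file from Cloud Storage into BigQuery
# and report the resulting row count. Bucket and table names are hypothetical.
from google.cloud import bigquery


def load_csv_to_bigquery(uri: str, table_id: str) -> int:
    """Load a GCS CSV into a BigQuery table and return the table's row count."""
    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )
    load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
    load_job.result()  # wait for the load job to finish
    return client.get_table(table_id).num_rows


if __name__ == "__main__":
    rows = load_csv_to_bigquery(
        "gs://example-landing/sensor_readings/2024-01-01.csv",  # hypothetical
        "my-project.manufacturing.sensor_readings",             # hypothetical
    )
    print(f"table now has {rows} rows")
```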

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

About GoKwik
GoKwik is a growth operating system designed to power D2C and eCommerce brands, from checkout optimisation and reducing return-to-origin (RTO) to payments, retention, and post-purchase engagement. Today, GoKwik enables over 12,000 merchants worldwide, processes around $2 billion in GMV, and is strengthening its AI-powered infrastructure. Backed by RTP Global, Z47, Peak XV, and Think Investments, and bolstered by a $13 million growth round in June 2025 (total funding: $68 million), GoKwik is scaling aggressively across India and the UK.

Why This Role Matters
At GoKwik, data isn't just stored; it's the lifeline of trust and growth for thousands of merchants. As a Senior Database Reliability Engineer, you'll ensure our mission-critical databases are highly available, scalable, and resilient by design. From building HA/DR strategies and automating DB-as-code to optimising performance in high-concurrency environments, you'll directly shape how reliably and securely billions in GMV flow through our systems. This isn't about being reactive; it's about designing reliability into the foundation of our data infrastructure.

What You'll Own
- Availability & scalability: manage and optimise critical databases (MySQL, Postgres, MongoDB, etc.) for always-on performance at scale.
- Resilience by design: architect and implement HA/DR, backup, and recovery strategies that ensure zero data loss and minimal downtime.
- Automation everywhere: partner with DevOps to enable DB-as-code deployments, and automate patching, upgrades, and scaling with Terraform/Ansible.
- Performance optimisation: fine-tune SQL queries, indexing, and schema designs for efficiency in high-concurrency workloads.
- Observability & drills: monitor databases proactively and lead cross-functional disaster recovery exercises to validate resilience.
- Cost & vendor management: take ownership of database licensing, vendor relationships, and spend optimisation.
- Governance & compliance: enforce data retention, privacy, and governance policies to meet regulatory standards.

Who You Are
- 3-7 years of experience as a DBA / Database Engineer in high-scale SaaS or eCommerce environments.
- Strong expertise in relational (MySQL/Postgres) and NoSQL (MongoDB) systems.
- Hands-on with replication, sharding, and clustering for scale and resiliency.
- Experienced with cloud-managed DB services (RDS, Aurora, Cloud SQL, MongoDB Atlas).
- Skilled in automating DB provisioning and management using tools like Terraform, Ansible, or similar.
- Strong problem-solver, comfortable debugging performance bottlenecks in production at scale.
- At higher experience levels: proven ability to design resilient DB architectures, run DR drills, mentor peers, and drive best practices across teams.

How You'll Thrive At GoKwik
- You treat merchant data as sacred, ensuring stability without compromise.
- You believe in blameless learning and continuous improvement when issues arise.
- You balance uptime with velocity, enabling teams to ship fast without risking reliability.
- You stay proactive, spotting risks early and designing systems to prevent outages.
- You thrive in a high-trust, high-ownership culture, where database engineers are enablers, not gatekeepers.

Why GoKwik
At GoKwik, we aren't just building tools; we're rewriting the playbook for eCommerce in India.
We exist to solve some of the most complex challenges faced by digital-first brands: low conversion rates, high RTO, and poor post-purchase experience. Our checkout and conversion stack powers 500+ leading D2C brands and marketplaces, and we're just getting started.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Thiruvananthapuram, Kerala

On-site

This position is for a Senior Software Engineer responsible for developing and deploying a Node.js backend in a long-term software project for a US client. You will be working in Trivandrum, India, collaborating with the existing project team on technical and management aspects. Your responsibilities will include requirement elicitation, software architecture design, implementation, code reviews, and supporting deployment in a cloud environment. It is crucial to take each assigned task to completion while ensuring the quality of deliverables. Self-initiative, decision-making skills, self-directing capabilities, and a go-getter attitude are essential for success in this role.

You will be responsible for performing software requirements analysis, determining functional and non-functional requirements, analyzing requirements to create solutions and software architecture designs, writing high-quality code, and deploying applications in a cloud environment by selecting the relevant cloud services. Effective communication with stakeholders to clarify requirements and expectations is vital, as is timely delivery of the product with high quality. Collaboration with stakeholders, including customers, is necessary to ensure the successful execution of the project. Managing priority changes and conflicts gracefully and addressing customer escalations promptly are part of your responsibilities. Proactively suggesting tools and systems to enhance quality and productivity is encouraged, along with staying updated on relevant technology and process advancements.

To qualify for this role, you should have more than three years of experience in Node.js development; expertise in developing Web APIs and RESTful services; familiarity with relational databases like MySQL and PostgreSQL and non-relational databases such as MongoDB; and experience with code quality tools and unit testing. Knowledge of Kubernetes, Docker, and GCP services (e.g., Cloud Healthcare API, GKE, Cloud Run, Cloud Functions, Firestore, Cloud SQL, etc.) is required, along with experience deploying, scaling, and monitoring applications in GCP. You should also be proficient with code versioning tools like Git; understand software development lifecycles (SDLC), version control, and traceability; have experience with Agile development methodology; and be proficient in the development tools used for designing, coding, debugging, testing, bug tracking, collaboration, and source control. Additional knowledge of the healthcare domain and protocols like DICOM and HL7 will be an added advantage.

Posted 2 weeks ago

Apply

10.0 - 12.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

We are seeking a highly skilled and motivated GCP Data Architect to join our team. The Google Cloud Platform (GCP) Data Architect is responsible for designing and implementing cloud-based solutions for enterprise-level clients using GCP. The role involves understanding clients' business requirements and translating them into technical solutions that meet their needs. The GCP Data Architect should have a strong understanding of cloud architecture, including compute, networking, storage, security, and automation. They should also have a deep understanding of GCP services such as App Engine, Compute Engine, Kubernetes Engine, BigQuery, Cloud Storage, Cloud SQL, and Cloud Pub/Sub, and tools such as Dataform, Composer, Looker and Power BI.

Job Description:

Key Responsibilities:
- Leading technical discussions with clients to understand their business requirements and recommend appropriate cloud-based solutions.
- Creating technical specifications and architecture documents for GCP-based solutions.
- Designing and implementing cloud solutions using GCP services.
- Advising clients on best practices for cloud architecture, security, and governance.
- Troubleshooting and resolving issues with cloud-based solutions.
- Collaborating with cross-functional teams, including sales, product, engineering, and support, to ensure successful project delivery.
- Staying up to date with the latest developments in cloud technologies and GCP services.

Qualifications and Certifications:
- Education: Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.
- Experience: minimum 10 years of experience in designing and implementing cloud-based solutions using GCP.
- Experience architecting and designing solutions in one or more of the following areas: Infrastructure Modernization, Data Management and Analytics, Application Modernization, or DevOps.
- Strong understanding of cloud architecture, including compute, networking, storage, security, and automation.
- Proven experience designing and implementing data workflows using GCP services like BigQuery, Dataform, Cloud Dataflow, Cloud Pub/Sub, and Cloud Composer.
- Experience with a cloud automation tool such as Terraform.
- Excellent communication skills and the ability to work effectively with clients and cross-functional teams.
- Certifications: Google Cloud Professional Data Engineer certification preferred.

Key Skills:

Mandatory Skills:
- Advanced proficiency in Python for data pipelines and automation.
- Strong SQL skills for querying, transforming, and analyzing large datasets.
- Strong hands-on experience with GCP services, including Cloud Storage, Dataflow, Cloud Pub/Sub, Cloud SQL, BigQuery, Dataform, Compute Engine, and Kubernetes Engine (GKE).
- Hands-on experience with CI/CD tools such as Jenkins, GitHub or Bitbucket.
- Proficiency in Terraform, Ansible, or Google Cloud Deployment Manager for infrastructure automation.
- Experience using Docker and Kubernetes for containerization and orchestration.
- Familiarity with workflow orchestration tools like Apache Airflow or Cloud Composer.
- Strong understanding of Agile/Scrum methodologies.

Nice-to-Have Skills:
- Experience with other cloud platforms like AWS or Azure.
- Knowledge of data visualization tools (e.g., Power BI, Looker, Tableau).
- Understanding of machine learning workflows and their integration with data pipelines.

Soft Skills:
- Strong problem-solving and critical-thinking abilities.
- Excellent communication skills to collaborate with technical and non-technical stakeholders.
Proactive attitude towards innovation and learning. Ability to work independently and as part of a collaborative team. Location: Bengaluru Brand: Merkle Time Type: Full time Contract Type: Permanent Show more Show less
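For illustration only, here is a minimal sketch of the kind of data workflow this role describes: a Cloud Composer (Apache Airflow) DAG that lands a CSV from Cloud Storage into BigQuery and then runs a transformation query. The bucket, project, dataset, and table names are placeholders, not part of the posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_sales_load",          # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Stage raw CSV files from a GCS bucket into a BigQuery staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_csv",
        bucket="example-landing-bucket",                        # placeholder
        source_objects=["sales/{{ ds }}/*.csv"],
        destination_project_dataset_table="analytics.staging_sales",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
    )

    # Aggregate the staged rows into a reporting table with a SQL transform.
    transform = BigQueryInsertJobOperator(
        task_id="aggregate_sales",
        configuration={
            "query": {
                "query": """
                    SELECT store_id, SUM(amount) AS total_amount
                    FROM `analytics.staging_sales`
                    GROUP BY store_id
                """,
                "useLegacySql": False,
                "destinationTable": {
                    "projectId": "example-project",             # placeholder
                    "datasetId": "analytics",
                    "tableId": "daily_sales_summary",
                },
                "writeDisposition": "WRITE_TRUNCATE",
            }
        },
    )

    load_raw >> transform
```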

Posted 2 weeks ago

Apply

3.0 - 5.0 years

0 Lacs

mumbai, maharashtra, india

On-site

Job Description: Sr. Software Engineer

This opportunity is within dentsu's integrated global capability that delivers world-class, agile services across the dentsu network - Dentsu Creative, Dentsu Media and Merkle. In this role, you will be part of Merkle, enabling delivery of innovative data and technology solutions for Merkle's global brands.

About Merkle
Merkle, a dentsu company, is a leading data-driven customer experience management (CXM) company that specializes in the delivery of unique, personalized customer experiences across platforms and devices. For more than 30 years, Fortune 1000 companies and leading nonprofit organizations have partnered with Merkle to maximize the value of their customer portfolios. The company's heritage in data, technology, and analytics forms the foundation for its unmatched skills in understanding consumer insights that drive hyper-personalized marketing strategies. Its combined strengths in consulting, creative, media, analytics, data, identity, CX/commerce, technology, and loyalty & promotions drive improved marketing results and competitive advantage. With more than 14,000 employees, Merkle has offices throughout the Americas, EMEA, and APAC. For more information, visit www.merkle.com.

Job Description:
The Full-stack Developer within the Digital Innovation & Product Hub will support the build of proprietary software. Collaborating closely with the Product Manager, Lead Full-stack Developer and Frontend Developer, they will support scoping, building and deployment, and help maintain standards and best practices across the project.

Key Responsibilities:
Leading the development of features across the full software life cycle - from conception to scoping, deployment and ongoing maintenance.
Working alongside UX/UI designers to develop the user experience.
Maintaining development best practices (including infrastructure and documentation).
Maintaining tools and scripts, and debugging production issues.

Skills:
Professional: Good communication skills and the ability to communicate complex ideas. Strong attention to detail and highly organized. Ability to self-manage, working as part of the wider team. Belief in clean coding and simple solutions. Any knowledge around Performance Marketing (desired).
Technical: Experience with the following: Python; HTML5, CSS and JS; Google APIs; Google Cloud Platform (BigQuery, Cloud Functions, Cloud Run, Cloud Scheduler, Cloud SQL); Mongo; SQL; React; Node; Express. Well versed in agile methodologies, git and version control best practices. Experience in testing responsiveness, performance, and resilience of applications.

Qualifications:
Bachelor's or Master's degree in Computer Science.
>= 3 years of IT experience.

Location: DGS India - Bengaluru - Manyata N1 Block
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
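For illustration only, a minimal sketch of the Python + GCP stack this posting lists: an HTTP Cloud Function that runs a BigQuery query and returns JSON. The project, dataset, and table names are placeholders; deployment and scheduling details are omitted.

```python
import functions_framework
from google.cloud import bigquery

bq = bigquery.Client()

@functions_framework.http
def campaign_summary(request):
    """Return spend per campaign from a hypothetical marketing table."""
    query = """
        SELECT campaign_id, SUM(spend) AS total_spend
        FROM `example_project.marketing.daily_spend`   -- placeholder table
        GROUP BY campaign_id
        ORDER BY total_spend DESC
        LIMIT 10
    """
    rows = bq.query(query).result()
    # functions-framework is Flask-based, so a dict is returned as a JSON response.
    return {"campaigns": [dict(row) for row in rows]}
```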

Posted 2 weeks ago

Apply

5.0 - 10.0 years

6 - 15 Lacs

chennai, bengaluru, mumbai (all areas)

Work from Office

Educational Requirements: MCA, MSc, MTech, Bachelor of Engineering, BCA, BSc
Service Line: Data & Analytics Unit

Responsibilities - A day in the life of an Infoscion:
• As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions and facilitate deployment, resulting in client delight.
• You will develop a proposal by owning parts of the proposal document and by giving inputs in solution design based on areas of expertise.
• You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots and assist in resolving any queries related to requirements and solution design.
• You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates which suit the customer's budgetary requirements and are in line with the organization's financial guidelines.
• You will actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers.
If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
• Ability to develop value-creating strategies and models that enable clients to innovate, drive growth and increase their business profitability.
• Good knowledge of software configuration management systems.
• Awareness of the latest technologies and industry trends.
• Logical thinking and problem-solving skills, along with an ability to collaborate.
• Understanding of the financial processes for various types of projects and the various pricing models available.
• Ability to assess current processes, identify improvement areas and suggest technology solutions.
• Knowledge of one or two industry domains.
• Client interfacing skills.
• Project and team management.

Technical and Professional Requirements:
Technology -> Cloud Platform -> GCP Data Analytics -> Looker
Technology -> Cloud Platform -> GCP Database -> Google BigQuery

Preferred Skills:
Technology -> Cloud Platform -> Google Big Data
Technology -> Cloud Platform -> GCP Data Analytics
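For illustration only, a minimal sketch of the Google BigQuery skill this posting lists: a parameterized query run with the BigQuery Python client, the kind of dataset a Looker dashboard would sit on. The project, dataset, and table names are placeholders, not from the posting.

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")   # placeholder project

# Bind the threshold as a query parameter instead of interpolating it into SQL.
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("min_orders", "INT64", 10),
    ]
)

query = """
    SELECT customer_id, COUNT(*) AS orders
    FROM `example-project.sales.orders`      -- placeholder table
    GROUP BY customer_id
    HAVING orders >= @min_orders
"""

for row in client.query(query, job_config=job_config).result():
    print(row.customer_id, row.orders)
```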

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

chennai, tamil nadu

On-site

You are a Sr. Developer with 8 to 12 years of experience, specializing in BigQuery Data Transfer Service, Python, Cloud SQL, and BigQuery. Your role involves contributing to data-driven projects, ensuring high-quality deliverables and impactful outcomes in a hybrid work model with a day shift.

Your responsibilities include:
- Leading the development of data solutions using BigQuery Data Transfer Service
- Optimizing SQL queries in Cloud SQL
- Utilizing Python for data processing and automation
- Collaborating with teams to deliver solutions
- Ensuring data integrity through testing
- Developing ETL pipelines and monitoring data workflows
- Implementing data security practices and improving data architecture
- Mentoring junior developers, participating in code reviews, documenting progress, and staying updated with industry trends

Key Qualifications:
- Strong experience in BigQuery Data Transfer Service
- Proficiency in Python for data tasks
- Expertise in Cloud SQL and SQL query optimization
- Nice to have: experience in BigQuery and data warehousing
- Knowledge of data security best practices
- Experience in mentoring junior developers

Certifications Required:
- Google Cloud Professional Data Engineer Certification
- Python Programming Certification
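For illustration only, a minimal sketch of the BigQuery Data Transfer Service work this role centres on: registering a scheduled query that refreshes a reporting table every 24 hours using the Python client. The project, dataset, and query are placeholders, not the employer's code.

```python
from google.cloud import bigquery_datatransfer

client = bigquery_datatransfer.DataTransferServiceClient()
project_id = "example-project"                      # placeholder

transfer_config = bigquery_datatransfer.TransferConfig(
    destination_dataset_id="reporting",             # placeholder dataset
    display_name="Nightly orders rollup",
    data_source_id="scheduled_query",               # DTS data source for scheduled queries
    params={
        "query": (
            "SELECT order_date, SUM(total) AS revenue "
            "FROM `example-project.sales.orders` GROUP BY order_date"
        ),
        "destination_table_name_template": "orders_rollup",
        "write_disposition": "WRITE_TRUNCATE",
    },
    schedule="every 24 hours",
)

created = client.create_transfer_config(
    parent=client.common_project_path(project_id),
    transfer_config=transfer_config,
)
print(f"Created transfer config: {created.name}")
```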

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

chennai, tamil nadu

On-site

You will be responsible for designing and implementing cloud-native and hybrid solutions using GCP services such as Compute Engine, Kubernetes (GKE), Cloud Functions, BigQuery, Pub/Sub, Cloud SQL, and Cloud Storage. Additionally, you will define cloud adoption strategies, migration plans, and best practices for performance, security, and scalability. You will also be required to implement and manage Terraform, Cloud Deployment Manager, or Ansible for automated infrastructure provisioning. The ideal candidate has expertise as a GCP data architect, with strong skills in GCP data services (Dataproc, Cloud Composer, Dataflow, BigQuery), Python, and Spark/PySpark, along with hands-on experience in the network domain, specifically 4G, 5G, LTE, and RAN technologies; knowledge and work experience in these areas are preferred. You should also be well-versed in ETL architecture and data pipeline management. This is a full-time position with a day shift schedule from Monday to Friday. The work location is remote, and the application deadline is 15/04/2025.
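For illustration only, a minimal PySpark sketch of the Dataproc-style work this posting describes: aggregating 4G/5G cell KPIs from raw network records and writing the result to BigQuery. The schema, bucket, and table names are assumptions, not from the posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ran-kpi-rollup").getOrCreate()

# Raw per-cell measurement records landed in Cloud Storage (placeholder path/schema).
raw = spark.read.parquet("gs://example-network-landing/ran_measurements/")

# Average throughput and drop rate per cell, technology, and hour.
kpis = (
    raw.withColumn("hour", F.date_trunc("hour", F.col("event_time")))
       .groupBy("cell_id", "technology", "hour")      # technology: e.g. LTE / NR
       .agg(
           F.avg("throughput_mbps").alias("avg_throughput_mbps"),
           F.avg("drop_rate").alias("avg_drop_rate"),
       )
)

# Write to BigQuery via the Spark-BigQuery connector (available on Dataproc).
(
    kpis.write.format("bigquery")
        .option("table", "example-project.network.ran_hourly_kpis")
        .option("temporaryGcsBucket", "example-temp-bucket")
        .mode("overwrite")
        .save()
)
```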

Posted 2 weeks ago

Apply