
30 CloudSQL Jobs

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

10.0 - 14.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

Role Overview: You will work as a GCP Data Architect with 12+ years of total experience, of which at least 10 years should be relevant to this engagement. Your primary responsibilities include maintaining architecture principles, guidelines, and standards; data warehousing; programming in Python/Java; and working with Big Data, Data Analytics, and GCP services. You will design and implement solutions across Google Cloud Platform data components such as BigQuery, Bigtable, CloudSQL, Dataproc, Dataflow, and Data Fusion.

Key Responsibilities:
- Maintain architecture principles, guidelines, and standards
- Work on data warehousing projects
- Program in Python and Java for data-related tasks
- Use Big Data technologies for data processing and analysis
- Implement solutions using GCP services such as BigQuery, Bigtable, CloudSQL, Dataproc, Dataflow, and Data Fusion

Qualifications Required:
- Strong Big Data experience covering data modeling, design, architecting, and solutioning
- Proficiency in programming languages such as SQL, Python, and R/Scala
- Good Python skills, with experience in data visualization tools such as Google Data Studio or Power BI
- Knowledge of A/B testing, statistics, Google Cloud Platform, Google BigQuery, Agile development, DevOps, data engineering, and ETL data processing
- Experience migrating a production Hadoop cluster to Google Cloud is an added advantage

Additional Company Details: The company is looking for experts in BigQuery, Dataproc, Data Fusion, Dataflow, Bigtable, Firestore, CloudSQL, Cloud Spanner, Google Cloud Storage, Cloud Composer, and Cloud Interconnect. Relevant certifications such as Google Professional Cloud Architect are preferred.
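
As a flavor of the day-to-day BigQuery work such a role implies, here is a minimal sketch (not taken from the posting) that runs a query with the official Python client; the project, dataset, table, and columns are hypothetical.

```python
# Minimal sketch: query BigQuery with the official Python client.
# Assumes `pip install google-cloud-bigquery` and application-default
# credentials; all names below are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

sql = """
    SELECT region, COUNT(*) AS orders
    FROM `example-project.sales.orders`   -- hypothetical table
    WHERE order_date >= '2024-01-01'
    GROUP BY region
    ORDER BY orders DESC
"""

for row in client.query(sql).result():
    print(f"{row.region}: {row.orders}")
```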

Posted 5 days ago

1.0 - 6.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a GCP Data Engineer, you will play a crucial role in developing, optimizing, and maintaining data pipelines and infrastructure. Your proficiency in SQL and Python will be pivotal in managing and transforming data, and your familiarity with cloud technologies will help us improve our data engineering processes.

You will build scalable data pipelines: designing, implementing, and maintaining end-to-end pipelines to extract, transform, and load (ETL) data from various sources, while ensuring those pipelines are reliable, scalable, and performant. You will write and optimize complex SQL queries for data extraction, transformation, and reporting, and collaborate with analysts and data scientists to provide structured data for analysis.

Experience with cloud platforms, particularly GCP services such as BigQuery, Dataflow, GCS, and Postgres, will be valuable. You will leverage cloud services to enhance data processing and storage capabilities, integrate tools into the data ecosystem, and document pipelines, procedures, and best practices for knowledge sharing within the team. You will work closely with cross-functional teams to understand data requirements and deliver effective solutions.

The ideal candidate has at least 3 years of experience with SQL and Python, at least 1 year with GCP services such as BigQuery, Dataflow, GCS, and Postgres, and 2+ years building data pipelines from scratch in a highly distributed, fault-tolerant manner. Comfort with a variety of relational and non-relational databases is essential. Proven experience building applications in a data-focused role, in both cloud and traditional data warehouse environments, is preferred. Familiarity with CloudSQL, Cloud Functions, Pub/Sub, and Cloud Composer, a willingness to learn new tools and techniques, and comfort with big data and machine learning platforms, including open-source technologies such as Apache Spark, Hadoop, and Kafka, are all advantages. Strong oral, written, and interpersonal communication skills are crucial for collaborating in a dynamic environment on loosely defined problems.

If you are an inquisitive, proactive individual with a passion for data engineering and a desire to continuously learn and grow, we invite you to join our team in Chennai, Tamil Nadu, India.
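
To illustrate the kind of pipeline step this posting describes, here is a minimal, hypothetical Python sketch that extracts rows, applies a small transformation, and loads them into BigQuery; all table and field names are assumptions, not details from the posting.

```python
# Minimal ETL sketch: extract -> transform -> load into BigQuery.
# Assumes `pip install google-cloud-bigquery`; names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()
table_id = "example-project.analytics.daily_events"  # hypothetical table

def extract():
    # Stand-in for reading from a real source (API, CloudSQL, files).
    return [{"user": "a", "amount": "12.5"}, {"user": "b", "amount": "7"}]

def transform(rows):
    # Normalize types before loading.
    return [{"user": r["user"], "amount": float(r["amount"])} for r in rows]

errors = client.insert_rows_json(table_id, transform(extract()))
if errors:
    raise RuntimeError(f"BigQuery insert failed: {errors}")
```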

Posted 6 days ago

6.0 - 10.0 years

0 Lacs

Karnataka

On-site

As a Cloud Site Reliability Engineer (SRE), you will be responsible for the reliability, scalability, and performance of our cloud infrastructure. Working with a team of highly skilled professionals, you will contribute to the maintenance and optimization of our cloud-based applications in a dynamic environment that encourages continuous learning and innovation.

You bring 6-10 years of experience in the IT industry, including at least 2-3 years as a Cloud SRE/Engineer, and hold a Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Your expertise lies in managing applications that use services such as Cloud Build, Cloud Functions, GKE (Google Kubernetes Engine), Logging, Monitoring, GCS (Google Cloud Storage), CloudSQL, and IAM (Identity and Access Management). Proficiency in Python, with experience in a secondary language such as Golang or Java, enables you to manage codebases and configurations effectively. You have a strong background in GKE/Kubernetes and Docker, experience implementing CI/CD pipelines using GCP Cloud Build and other cloud-native services, and familiarity with Infrastructure as Code (IaC) tools like Terraform, security best practices, and RBAC.

In this position, you will define, monitor, and achieve Service Level Objectives (SLOs) and Service Level Agreements (SLAs). Your experience with source control tools like GitHub Enterprise and monitoring tools such as Grafana, Prometheus, Splunk, and GCP-native logging solutions will be instrumental in maintaining the reliability of our cloud infrastructure. A commitment to continuous improvement and to automating manual tasks, along with a willingness to provide additional support when necessary, will be highly valued. Experience with secrets management tools such as HashiCorp Vault and tracing tools such as Google Cloud Trace and Honeycomb is an advantage.

This is a full-time position. Benefits include health insurance, Provident Fund, and a work schedule aligned with US shift timings. The role is hybrid, with 2 days per week in the office. Join us in our journey towards innovation and excellence in cloud engineering.
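
To make the SLO/SLA responsibility concrete, here is a small, hypothetical Python sketch of an error-budget calculation of the kind SRE teams automate; the SLO target and request counts are invented.

```python
# Hypothetical error-budget check for a 99.9% availability SLO.
# The error budget is the allowed failure fraction over the window.
def error_budget_remaining(slo: float, total: int, failed: int) -> float:
    """Return the fraction of the error budget still unspent."""
    allowed_failures = (1.0 - slo) * total
    if allowed_failures == 0:
        return 0.0
    return max(0.0, 1.0 - failed / allowed_failures)

# Invented example: 1,000,000 requests this month, 400 failures.
remaining = error_budget_remaining(0.999, 1_000_000, 400)
print(f"Error budget remaining: {remaining:.1%}")  # 60.0%
```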

Posted 1 week ago

9.0 - 11.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job description

Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.

HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Senior Consultant Specialist.

In this role, you will:
- Serve as a key DevOps and cross-functional engineer responsible for delivering quality, performance-efficient code components, supported by requirement gathering, application design, development and testing through to production implementation, and subsequent support/maintenance in production
- Ensure a strong operating environment in collaboration with multiple stakeholders and third-party management teams
- Provide key inputs on the latest DevOps and Agile development methodologies, innovation and transformation, and automation, with the ability to evolve, modify and adapt in a dynamic environment
- Liaise with other engineers, architects, and business stakeholders to understand and drive the product or service's direction
- Work with Ops, Dev and Test engineers to ensure operational issues (performance, operator intervention, alerting, design-defect-related issues, etc.) are identified and addressed at all stages of a product or service release/change
- Support identification and resolution of all incidents associated with the IT service, as directed by the DevOps team's leadership
- Ensure service resilience, service sustainability and recovery time objectives are met for all software solutions delivered
- Automate the continuous integration/continuous delivery pipeline within a DevOps product/service team, driving a culture of continuous improvement

Requirements

To be successful in this role, you should meet the following requirements:
- Graduate degree / Bachelor of Engineering with 9+ years of IT experience
- Strong experience on Google Cloud Platform (GCP): able to self-investigate cloud services and adapt to development and production environments
- Strong knowledge of Infrastructure as Code: understand the Terraform language, and be able to create and debug code and integrate GCP service creation with it
- Experience with Continuous Integration / Continuous Delivery (CI/CD): able to adapt and debug Terraform scripts within CI/CD for production changes
- Strong knowledge of GKE orchestration (Kubernetes), GKE containers, and Terraform
- Strong knowledge of CloudSQL/Postgres and Unix/Python scripting
- Quick learner with good troubleshooting skills
- Experience in performance testing, query and performance optimization, and Control-M
- Excellent communication and problem-solving skills, and the ability to work independently as part of a team
- Strong analytical and critical-thinking skills, with the ability to work under pressure and resolve complex issues
- JIRA and wiki collaboration (e.g. Confluence)
- Agile methodologies (e.g. Scrum)

You'll achieve more when you join HSBC. HSBC is committed to building a culture where all employees are valued and respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by - HSDI
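
For context on the CloudSQL/Postgres plus Python scripting requirement, here is a minimal, hypothetical sketch using the Cloud SQL Python Connector; the instance name and credentials are placeholders, not details from the posting.

```python
# Hypothetical sketch: connect to a CloudSQL Postgres instance using the
# Cloud SQL Python Connector. Assumes
# `pip install "cloud-sql-python-connector[pg8000]"` and suitable IAM access.
from google.cloud.sql.connector import Connector

connector = Connector()
conn = connector.connect(
    "example-project:asia-south1:example-instance",  # placeholder instance
    "pg8000",
    user="app_user",          # placeholder credentials
    password="change-me",
    db="appdb",
)

cur = conn.cursor()
cur.execute("SELECT version()")
print(cur.fetchone())
conn.close()
connector.close()
```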

Posted 1 week ago

5.0 - 10.0 years

0 Lacs

Hyderabad, Telangana

On-site

As the Senior GCP DevOps Lead (Spring Boot), you will solve complex problems and ensure timely resolution within SLA agreements. Your role involves implementing system change requests, managing application releases effectively, and diagnosing, analyzing, and fixing bugs promptly. You will conduct root cause analysis and impact assessments for issues encountered during project deployment and support cycles. Collaboration with development and support teams is essential to ensure smooth application deployment and support. You will also perform thorough testing of applications and change releases to maintain quality standards, and document daily tasks, processes, and resolutions to build and maintain a comprehensive knowledge base for support activities.

Mandatory skills:
- Strong experience in Java and the Spring / Spring Boot frameworks
- Hands-on expertise in Google Cloud Platform (GCP), with a focus on Kubernetes and Managed Instance Groups (MIG)
- Experience with BigQuery, Jenkins, G3, and CloudSQL
- Working knowledge of MSSQL Server databases
- Experience developing and managing Kubernetes Operators using Java Spring is highly beneficial

Experience with MongoDB is an advantage.

This is a full-time, in-person position based in Hyderabad or Pune, working Monday to Friday on day shifts. Candidates must have 7 years of experience with Spring Boot and 5 years with Google Cloud Platform, and be able to start immediately. Benefits include Provident Fund.

Posted 2 weeks ago

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Data Modeler with 6 to 9 years of experience, you will be based in Chennai and work from the client office. The budget for this position is 10.5 LPA. Your skills in GCP, data modelling (OLTP, OLAP), indexing, DBSchema, CloudSQL, and BigQuery will be applied in hands-on data modelling for OLTP and OLAP systems. An in-depth understanding of conceptual, logical, and physical data modelling is essential, along with practical experience in indexing, partitioning, and data sharding. You will also need a strong understanding of the variables that affect database performance for near-real-time reporting and application interaction. Prior experience with at least one data modelling tool, preferably DBSchema, is required, and a good grasp of GCP databases such as AlloyDB, CloudSQL, and BigQuery is necessary. Candidates with functional knowledge of the mutual fund industry are preferred.
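
As a concrete example of the partitioning and clustering skills such postings ask for, here is a minimal, hypothetical sketch that creates a date-partitioned, clustered BigQuery table with the Python client; the schema is invented for illustration.

```python
# Hypothetical sketch: create a date-partitioned, clustered BigQuery table.
# Partitioning prunes scans by date; clustering co-locates rows by fund_id.
from google.cloud import bigquery

client = bigquery.Client()
table = bigquery.Table(
    "example-project.funds.nav_history",  # hypothetical table
    schema=[
        bigquery.SchemaField("fund_id", "STRING"),
        bigquery.SchemaField("nav", "NUMERIC"),
        bigquery.SchemaField("nav_date", "DATE"),
    ],
)
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="nav_date"
)
table.clustering_fields = ["fund_id"]
client.create_table(table)
```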

Posted 2 weeks ago

8.0 - 12.0 years

0 Lacs

Hyderabad, Telangana

On-site

The Senior GCP DevOps Lead (Spring Boot) position, based in Hyderabad / Pune, requires 8 to 10 years of experience. As a Senior GCP DevOps Lead, you will be responsible for solving complex problems, implementing system change requests, and managing application releases effectively. Your role involves diagnosing, analyzing, and fixing bugs promptly, and conducting root cause analysis and impact assessments for project deployment and support issues. Collaborating with development and support teams is crucial to ensure smooth application deployment and support. Your responsibilities also include thorough testing of applications and change releases to maintain quality standards, and documenting daily tasks, processes, and resolutions to build and maintain a comprehensive knowledge base for support activities.

Mandatory skills:
- Strong experience in Java and the Spring / Spring Boot frameworks
- Hands-on expertise in Google Cloud Platform (GCP), with a focus on Kubernetes and Managed Instance Groups (MIG)
- Familiarity with BigQuery, Jenkins, G3, and CloudSQL
- Working knowledge of MSSQL Server databases
- Experience developing and managing Kubernetes Operators using Java Spring

Good to have:
- Experience with MongoDB is considered an added advantage

Posted 2 weeks ago

6.0 - 10.0 years

0 Lacs

Karnataka

On-site

You should have at least 6 years of experience in Java, Spring Boot, microservices, ReactJS, and product development and sustenance. Troubleshooting and debugging existing code will be part of your responsibilities, and proficiency in code quality, security compliance, and application performance management is essential. The role involves participating in the agile planning process and estimating planned tasks. Good verbal and written communication skills are necessary, along with expertise in unit testing (JUnit).

Key responsibilities and deliverables:
- Feature implementation and production-ready code
- Technical documentation and system diagrams
- Debugging reports and fixes
- Performance optimizations

Qualifications and experience:
- 6+ years of experience developing and designing software applications using Java
- Expert understanding of core computer science fundamentals such as data structures, algorithms, and concurrent programming
- Experience analyzing, designing, implementing, and troubleshooting software solutions for highly transactional systems
- Proficiency in OOAD and design principles, implementing microservices architecture using technologies including JEE, Spring, Spring Boot, Spring Cloud, Hibernate, Oracle, CloudSQL PostgreSQL, Bigtable, BigQuery, NoSQL, Git, IntelliJ IDEA, Pub/Sub, and Dataflow
- Experience working in native and hybrid cloud environments
- Familiarity with Agile development methodology
- Strong collaboration and communication skills to work effectively across product and technology teams
- Ability to translate strategic priorities into scalable and user-centric solutions
- Detail-oriented problem solver with excellent communication skills and a can-do attitude
- Experience with Java, Java IDEs like Eclipse or IntelliJ, Java EE application servers, object-oriented design, Git, Maven, scripting languages, JSON, XML, YAML, Terraform, etc.

Preferred skills/experience:
- Agile Scrum methodologies, continuous integration systems like Jenkins or GitHub CI, and SAFe methodologies
- Deep knowledge of creating secure solutions by design, multi-threaded backend environments, and tools/languages such as Ruby, Python, Perl, Node.js, bash, Spring, Spring Boot, C, C++, Docker, Kubernetes, and Oracle

Working with GlobalLogic offers a culture of caring, learning and development opportunities, interesting and meaningful work, balance and flexibility, and a high-trust organization. You'll have the chance to collaborate with innovative clients and work on cutting-edge solutions that shape the world today. GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner known for creating impactful digital products and experiences, collaborating with clients to transform businesses through intelligent products and services.
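
The Pub/Sub integration mentioned above is the kind of event publishing a microservice performs; here is a minimal, hypothetical sketch (in Python rather than the posting's Java, purely for brevity) using the official client, with an invented topic and payload.

```python
# Hypothetical sketch: publish an event to a Pub/Sub topic.
# Assumes `pip install google-cloud-pubsub`; names are placeholders.
import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("example-project", "order-events")

event = {"order_id": "o-123", "status": "CREATED"}
future = publisher.publish(topic_path, json.dumps(event).encode("utf-8"))
print(f"Published message id: {future.result()}")  # blocks until acked
```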

Posted 1 month ago

6.0 - 10.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Java Developer with over 6 years of experience in Java, Spring Boot, microservices, and ReactJS, you will troubleshoot and debug existing code as necessary. Your proficiency in code quality, security compliance, and application performance management will be crucial to project success. You will actively participate in the agile planning process and estimate planned tasks, and good verbal and written communication skills are expected. Expertise in unit testing, particularly with JUnit, is essential to overall software quality.

Your key responsibilities include feature implementation and delivering production-ready code, creating technical documentation and system diagrams, generating debugging reports, implementing fixes, and optimizing performance to improve overall system efficiency.

To excel in this role, you should have a solid foundation in core computer science fundamentals, including data structures, algorithms, and concurrent programming, and a deep understanding of software design principles, microservices architecture, and technologies such as JEE, Spring, Hibernate, Oracle, CloudSQL PostgreSQL, Bigtable, BigQuery, NoSQL, Git, IntelliJ IDEA, Pub/Sub, and Dataflow. Experience with native and hybrid cloud environments, Agile development methodologies, and proficiency in programming languages like Python and Java will be beneficial.

You are expected to collaborate effectively with product and technology teams, translating strategic priorities into scalable and user-centric solutions. Attention to detail, problem-solving skills, strong communication, and a proactive, team-oriented attitude are essential. Preferred skills include familiarity with Agile Scrum methodologies, continuous integration systems like Jenkins or GitHub CI, SAFe methodologies, and creating secure solutions by design, along with experience in multi-threaded backend environments, Docker, Kubernetes, and scripting languages such as Ruby, Python, Perl, Node.js, and bash.

At GlobalLogic, we value a culture of caring, continuous learning and development, meaningful work, balance, flexibility, and integrity. As part of our team, you will work on impactful projects, grow personally and professionally, and collaborate with forward-thinking clients on cutting-edge solutions that shape the world today. Join us and be part of our commitment to engineering impact and transforming businesses through intelligent digital products and services.

Posted 1 month ago

1.0 - 5.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a skilled Data Engineer, you will contribute to the development of data modeling, ETL processes, and reporting systems. With over 3 years of hands-on experience in areas such as ETL, BigQuery, SQL, Python, or Alteryx, you will play a crucial role in enhancing our data engineering processes, and your advanced knowledge of SQL programming and database management will be key to efficient data operations.

You will use Business Intelligence reporting tools such as Power BI, Qlik Sense, Looker, or Tableau to create insightful reports and analytics, and apply data warehousing concepts and best practices to design robust data solutions. Your problem-solving skills and attention to detail will be instrumental in addressing data quality issues and proposing effective BI solutions.

Collaboration and communication are essential: you will work closely with stakeholders to define requirements and develop data-driven insights, working both independently and as part of a team. A proactive approach to learning new tools and techniques will help you stay ahead in a dynamic environment.

Preferred skills include experience with GCP cloud services, Python, Hive, Spark, Scala, JavaScript, and various BI/reporting tools. Strong oral, written, and interpersonal communication skills will enable you to convey insights and solutions effectively. A Bachelor's degree in Computer Science, Computer Information Systems, or a related field is required.

Overall, you will develop and maintain data pipelines, reporting systems, and dashboards. Your expertise in SQL, BI tools, and data validation will help ensure data accuracy and integrity across all systems, and your analytical mindset and root cause analysis skills will identify opportunities for improvement and drive data-driven decision-making within the organization.
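
Since the posting emphasizes data validation and accuracy, here is a small, hypothetical sketch of a row-count reconciliation check between a source extract and a BigQuery target; all names and counts are invented.

```python
# Hypothetical data-quality check: compare the source row count with the
# loaded BigQuery table before publishing a report.
from google.cloud import bigquery

client = bigquery.Client()

def target_row_count(table_id: str) -> int:
    sql = f"SELECT COUNT(*) AS n FROM `{table_id}`"
    return next(iter(client.query(sql).result())).n

source_rows = 10_000  # stand-in for a count from the upstream system
target_rows = target_row_count("example-project.reporting.sales")  # placeholder

if source_rows != target_rows:
    raise ValueError(f"Row count mismatch: source={source_rows}, target={target_rows}")
print("Reconciliation passed")
```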

Posted 1 month ago

1.0 - 5.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You should have at least 3 years of hands-on experience in data modeling, ETL processes, developing reporting systems, and data engineering using tools such as ETL, BigQuery, SQL, Python, or Alteryx, along with advanced knowledge of SQL programming and database management. A minimum of 3 years of solid experience with Business Intelligence reporting tools like Power BI, Qlik Sense, Looker, or Tableau is also required, together with a good understanding of data warehousing concepts and best practices. Excellent problem-solving and analytical skills are essential, as are attention to detail, strong communication and collaboration skills, and the ability to work both independently and as part of a team.

Preferred skills include experience with GCP cloud services such as BigQuery, Cloud Composer, Dataflow, CloudSQL, Looker, LookML, and Data Studio, plus Qlik Sense on GCP; strong SQL skills and proficiency in various BI/reporting tools to build self-serve reports, analytic dashboards, and ad-hoc packages on enterprise data warehouses; and at least 1 year of experience in Python and Hive/Spark/Scala/JavaScript. You should also have a solid understanding of consuming data models, developing SQL, addressing data quality issues, proposing BI solution architecture, and articulating best practices in end-user visualizations, backed by development delivery experience. A good grasp of BI tools, architectures, and visualization solutions, an inquisitive and proactive approach to learning new tools and techniques, and strong oral, written, and interpersonal communication skills round out the profile; you should be comfortable working in a dynamic environment where problems are not always well-defined.

Posted 1 month ago

6.0 - 10.0 years

8 - 12 Lacs

Bengaluru, Karnataka, India

On-site

Job Description

Job Title: Data Modeller
Experience: 6+ Years
Location: Bangalore
Work Mode: Onsite

Job Role: We are seeking a skilled Data Modeller with expertise in designing data models for both OLTP and OLAP systems. The ideal candidate will have deep knowledge of data modelling principles and a strong understanding of database performance optimization, especially in near-real-time reporting environments. Prior experience with GCP databases and data modelling tools is essential.

Responsibilities:
- Design and implement data models (conceptual, logical, and physical) for complex business requirements
- Develop scalable OLTP and OLAP models to support enterprise data needs
- Optimize database performance through effective indexing, partitioning, and data sharding techniques
- Work closely with development and analytics teams to ensure alignment of models with application and reporting needs
- Use data modelling tools like Erwin, DBSchema, or similar to create and maintain models
- Implement best practices for data quality, governance, and consistency across systems
- Leverage GCP database solutions such as AlloyDB, CloudSQL, and BigQuery
- Collaborate with business stakeholders, especially within the mutual fund domain (preferred), to understand data requirements

Requirements:
- 6+ years of hands-on experience in data modelling for OLTP and OLAP systems
- Strong command of data modelling fundamentals (conceptual, logical, physical)
- In-depth knowledge of indexing, partitioning, and data sharding strategies
- Experience with real-time and near-real-time reporting systems
- Proficiency in data modelling tools, preferably DBSchema or Erwin
- Familiarity with GCP databases like AlloyDB, CloudSQL, and BigQuery
- Functional understanding of the mutual fund industry is a plus
- Must be willing to work from the Chennai office; presence is mandatory

Technical skills: Data Modelling (Conceptual, Logical, Physical), OLTP, OLAP, Indexing, Partitioning, Data Sharding, Database Performance Tuning, Real-Time/Near-Real-Time Reporting, DBSchema, Erwin, AlloyDB, CloudSQL, BigQuery.
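
To illustrate the data-sharding technique the requirements list, here is a small, hypothetical Python sketch of hash-based shard routing; the shard count and key format are invented for illustration.

```python
# Hypothetical hash-based shard router: map a record key to one of N
# shards so rows for the same fund always land on the same shard.
import hashlib

NUM_SHARDS = 8  # invented shard count

def shard_for(key: str, num_shards: int = NUM_SHARDS) -> int:
    """Stable shard assignment via a hash of the key."""
    digest = hashlib.sha256(key.encode("utf-8")).digest()
    return int.from_bytes(digest[:8], "big") % num_shards

print(shard_for("FUND-00042"))  # same key -> same shard on every run
```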

Posted 1 month ago

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You should be proficient in GCP, data modelling (OLTP, OLAP), indexing, DBSchema, CloudSQL, and BigQuery. As a Data Modeller, you will be responsible for hands-on data modelling for OLTP and OLAP systems, which requires an in-depth understanding of conceptual, logical, and physical data modelling. A strong grasp of indexing, partitioning, and data sharding, supported by practical experience, is essential, as is a solid understanding of the variables that affect database performance for near-real-time reporting and application interaction. Experience with at least one data modelling tool, preferably DBSchema, is necessary, and a good understanding of GCP databases like AlloyDB, CloudSQL, and BigQuery would be beneficial. Candidates with functional knowledge of the mutual fund industry will be preferred.

Posted 1 month ago

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Data Modeling Engineer specializing in near-real-time reporting, you will create robust, optimized schemas to support near-real-time data flows for operational and analytical purposes within Google Cloud environments. Your primary focus will be on designing models that deliver agility, speed, and scalability for high-throughput, low-latency data access.

Key responsibilities include designing data models that align with streaming pipelines, developing logical and physical models tailored for near-real-time reporting, implementing strategies such as caching, indexing, and materialized views to enhance performance, and ensuring data integrity, consistency, and schema quality under rapid change.

To excel in this role, you must have experience building data models for real-time or near-real-time reporting systems, hands-on expertise with GCP platforms such as BigQuery, CloudSQL, and AlloyDB, and a solid understanding of Pub/Sub, streaming ingestion frameworks, and event-driven design. Proficiency in indexing strategies and in adapting schemas in high-velocity environments is crucial.

Preferred skills include exposure to monitoring, alerting, and observability tools, as well as functional familiarity with financial reporting workflows. Soft skills such as proactive adaptability in fast-paced data environments, effective verbal and written communication, and a collaborative, solution-focused mindset are highly valued.

By joining our team, you will design the foundational schema for mission-critical real-time systems, contribute to the performance and reliability of enterprise data workflows, and be part of a dynamic GCP-focused engineering team. Skills required: streaming ingestion frameworks, BigQuery, reporting, modeling, AlloyDB, Pub/Sub, CloudSQL, Google Cloud Platform (GCP), data management, real-time reporting, indexing strategies, and event-driven design.
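
As a sketch of the materialized-view strategy mentioned above, here is a hypothetical BigQuery DDL statement issued through the Python client; the tables and columns are invented.

```python
# Hypothetical sketch: create a BigQuery materialized view so near-real-time
# dashboards read a pre-aggregated result instead of rescanning raw events.
from google.cloud import bigquery

client = bigquery.Client()
ddl = """
CREATE MATERIALIZED VIEW `example-project.reporting.orders_by_region_mv` AS
SELECT region, COUNT(*) AS orders, SUM(amount) AS revenue
FROM `example-project.reporting.orders`
GROUP BY region
"""
client.query(ddl).result()  # BigQuery then refreshes the view incrementally
```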

Posted 1 month ago

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Data Modeller with 6-9 years of experience, you will be responsible for hands-on data modelling for both OLTP and OLAP systems. The role requires in-depth knowledge of conceptual, logical, and physical data modelling, along with a strong, practical understanding of indexing, partitioning, and data sharding. You will need a solid grasp of the variables that affect database performance, specifically for near-real-time reporting and application interaction, and working experience with at least one data modelling tool, preferably DBSchema. Functional knowledge of the mutual fund industry is considered a plus, and a good understanding of GCP databases such as AlloyDB, CloudSQL, and BigQuery is necessary.

This is a full-time, in-person position working day shifts from the client's office in Chennai. Benefits include health insurance and work-from-home opportunities. If you meet these requirements and want to apply your data modelling expertise in a dynamic environment, we encourage you to apply.

Posted 1 month ago

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You are a Database Performance & Data Modeling Specialist focused on optimizing schema structures, tuning SQL queries, and preparing data models for high-volume, real-time systems. Your responsibilities include designing data models that balance performance, flexibility, and scalability; running performance benchmarks to identify bottlenecks and propose improvements; analyzing slow queries and recommending indexing, denormalization, or schema revisions; monitoring query plans, memory usage, and caching strategies for cloud databases; and collaborating with developers and analysts to optimize application-to-database workflows.

You must have strong experience in database performance tuning, especially on GCP platforms such as BigQuery, CloudSQL, and AlloyDB, along with proficiency in schema refactoring, partitioning, clustering, and sharding techniques. Familiarity with profiling tools, slow-query logs, and GCP monitoring solutions is required, as are SQL optimization skills including query rewriting and execution plan analysis.

Preferred skills include a background in mutual fund or high-frequency financial data modeling, and hands-on experience with relational databases such as PostgreSQL and MySQL, distributed caching, materialized views, and hybrid model structures. A precision-driven, analytical mindset, clear communication with attention to detail, and strong problem-solving and troubleshooting abilities are crucial.

By joining this role, you will shape high-performance data systems from the ground up, play a critical part in system scalability and responsiveness, and work with high-volume data in a cloud-native enterprise setting.
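
One common tuning step on BigQuery is checking how many bytes a query would scan before running it; here is a minimal, hypothetical sketch using a dry run, with an invented query and table.

```python
# Hypothetical sketch: estimate a query's scan cost with a BigQuery dry run
# before executing it - a cheap first check when tuning slow queries.
from google.cloud import bigquery

client = bigquery.Client()
config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)

sql = """
SELECT fund_id, AVG(nav) AS avg_nav
FROM `example-project.funds.nav_history`   -- hypothetical table
GROUP BY fund_id
"""
job = client.query(sql, job_config=config)  # validates and prices, runs nothing

print(f"Query would scan {job.total_bytes_processed / 1e9:.2f} GB")
```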

Posted 2 months ago

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Cloud Data Modeler with 6 to 9 years of experience in GCP environments, you will design schema architecture, create performance-efficient data models, and guide teams on cloud-based data integration best practices, with a focus on GCP data platforms such as BigQuery, CloudSQL, and AlloyDB.

Your responsibilities include architecting and implementing scalable data models for cloud data warehouses and databases, optimizing OLTP/OLAP systems for reporting and analytics, supporting cloud data lake and warehouse architecture, and reviewing and optimizing existing schemas for cost and performance on GCP. You will also define documentation standards, ensure model version tracking, and collaborate with DevOps and DataOps teams for deployment consistency.

Key requirements:
- Deep knowledge of GCP data platforms, including BigQuery, CloudSQL, and AlloyDB
- Expertise in data modeling, normalization, and dimensional modeling
- Understanding of distributed query engines, table partitioning, and clustering
- Familiarity with DBSchema or similar tools

Preferred skills:
- Prior experience in the BFSI or asset management industries
- Working experience with data catalogs, lineage, and governance tools

Soft skills:
- Collaborative and consultative mindset
- Strong communication and requirements-gathering skills
- Organized and methodical approach to data architecture challenges

By joining our team, you will contribute to modern data architecture in a cloud-first enterprise, influence critical decisions around GCP-based data infrastructure, and be part of a future-ready data strategy implementation team.

Posted 2 months ago

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You are a skilled Data Modeler with expertise in using DBSchema within GCP environments. In this role, you will create and optimize data models for both OLTP and OLAP systems, ensuring they are designed for performance and maintainability. Key responsibilities include developing conceptual, logical, and physical models in DBSchema, aligning schema design with application requirements, optimizing models in BigQuery, CloudSQL, and AlloyDB, and supporting schema documentation, reverse engineering, and visualization tasks.

Must-have skills include proficiency with the DBSchema modeling tool, strong experience with GCP databases such as BigQuery, CloudSQL, and AlloyDB, knowledge of OLTP and OLAP system structures and performance tuning, and expertise in SQL and schema evolution/versioning best practices. Preferred skills include experience integrating DBSchema with CI/CD pipelines and knowledge of real-time ingestion pipelines and federated schema design.

You should be detail-oriented, organized, and communicative, and comfortable presenting schema designs to cross-functional teams. This role offers the opportunity to work with industry-leading tools in modern GCP environments, enhance modeling workflows, and contribute to enterprise data architecture with visibility and impact.

Posted 2 months ago

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You are a versatile Data Model Developer with 6 to 9 years of experience designing robust data models across cloud (GCP) and traditional RDBMS environments, collaborating with cross-functional teams to develop schemas that serve both operational systems and analytical use cases.

Key responsibilities include designing and implementing scalable data models for GCP and traditional RDBMS; supporting hybrid data architectures that integrate real-time and batch workflows; collaborating with engineering teams on seamless schema implementation; documenting conceptual, logical, and physical models; aligning ETL and data pipelines with schema definitions; and monitoring and refining performance through partitioning and indexing strategies.

You must have experience with GCP data services such as BigQuery, CloudSQL, and AlloyDB; proficiency in relational databases such as PostgreSQL, MySQL, or Oracle; solid grounding in OLTP/OLAP modeling principles; familiarity with schema design tools like DBSchema or ER/Studio; and the SQL expertise to optimize query performance. Preferred skills include experience with hybrid cloud/on-prem data architectures, functional knowledge of the BFSI or asset management domains, and knowledge of metadata management and schema versioning. Adaptability across cloud and legacy tech stacks, clear communication with engineers and analysts, and strong documentation and collaboration skills are essential.

Joining this role will allow you to contribute to a dual-mode data architecture (cloud + on-prem), solve real-world data design challenges in regulated industries, and influence platform migration and modernization.

Posted 2 months ago

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Data Architect specializing in OLTP and OLAP systems, you will design, optimize, and govern data models for both environments. Your responsibilities include architecting end-to-end data models across layers, defining conceptual, logical, and physical data models, and collaborating closely with stakeholders to capture functional and performance requirements. You will optimize database structures for real-time and analytical workloads, enforce data governance, security, and compliance best practices, enable schema versioning, lineage tracking, and change control, and review query plans and indexing strategies to enhance performance.

To excel in this role, you must have a deep understanding of OLTP and OLAP systems architecture, proven experience with GCP databases such as BigQuery, CloudSQL, and AlloyDB, expertise in database tuning, indexing, sharding, and normalization/denormalization, proficiency in data modeling tools like DBSchema, ERwin, or equivalent, and familiarity with schema evolution, partitioning, and metadata management.

Experience in the BFSI or mutual fund domain, knowledge of near-real-time reporting and streaming analytics architectures, and familiarity with CI/CD for database model deployments are preferred skills that will set you apart. Strong communication, stakeholder management, strategic thinking, and the ability to mentor data modelers and engineers are essential soft skills for this position.

By joining our team, you will own the core data architecture for a cloud-first enterprise, bridge business goals with robust data design, and work with modern data platforms and tools.

Posted 2 months ago

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Functional Data Modeler in the mutual fund industry, you will design data models that accurately represent fund structures, NAV calculations, asset allocation, and compliance workflows. Your data modeling expertise, combined with a deep understanding of the mutual fund and BFSI domains, will be instrumental in creating schemas that meet operational and regulatory requirements.

You will collaborate with business analysts and product teams to translate functional requirements into effective data structures, and ensure your models comply with data privacy regulations, regulatory reporting standards, and audit requirements. You will also build OLTP and OLAP data models to support real-time and aggregated reporting needs, and document metadata, lineage, and data dictionaries for business use.

The role requires strong domain expertise in mutual fund/BFSI operations, a proven track record in data modeling for financial and regulatory systems, proficiency in schema design on GCP platforms such as BigQuery and CloudSQL, and hands-on experience with modeling tools like DBSchema or ER/Studio. Experience with fund management platforms or reconciliation engines and familiarity with financial compliance standards such as SEBI and FATCA are preferred. Strong business acumen and effective documentation skills will help you liaise successfully between functional and technical teams.

By joining our team, you will own critical financial data architecture, influence domain-driven modeling for financial ecosystems, and be part of a fast-paced data transformation journey in the BFSI sector.
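
Since the posting centers on NAV calculations, here is a minimal sketch of the standard NAV formula (net assets divided by units outstanding) in Python; the figures are invented for illustration.

```python
# Minimal sketch of a mutual fund NAV calculation:
# NAV = (total assets - total liabilities) / units outstanding.
from decimal import Decimal

def nav(assets: Decimal, liabilities: Decimal, units: Decimal) -> Decimal:
    if units <= 0:
        raise ValueError("units outstanding must be positive")
    return (assets - liabilities) / units

# Invented example figures:
print(nav(Decimal("105000000"), Decimal("5000000"), Decimal("10000000")))
# -> 10, i.e. a NAV of 10.00 per unit
```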

Posted 2 months ago

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

We are looking for a Data Modelling Consultant with 6 to 9 years of experience to work from our Chennai office. As a Data Modelling Consultant, you will provide end-to-end modeling support for OLTP and OLAP systems hosted on Google Cloud. Your responsibilities include designing and validating conceptual, logical, and physical models for cloud databases, translating requirements into efficient schema designs, and supporting data model reviews, tuning, and implementation. You will also guide teams on best practices for schema evolution, indexing, and governance, enabling the models to serve real-time applications and analytics platforms.

To succeed in this role, you must have strong modeling experience across OLTP and OLAP systems, hands-on experience with GCP tools like BigQuery, CloudSQL, and AlloyDB, and the ability to understand business rules and translate them into scalable structures. Familiarity with partitioning, sharding, materialized views, and query optimization is essential. Preferred skills include experience with BFSI or financial-domain data schemas and familiarity with modeling methodologies and standards such as 3NF and star schema. Excellent stakeholder communication, collaboration, strategic thinking, and attention to scalability are also important.

Joining this role will allow you to deliver advisory value across critical data initiatives, influence the modeling direction of a data-driven organization, and be at the forefront of GCP-based enterprise data transformation.

Posted 2 months ago

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Cloud Data Architect specializing in BigQuery and CloudSQL at our Chennai office, you will lead the design and implementation of scalable, secure, and high-performing data architectures on Google Cloud. Your expertise will shape architectural direction and ensure data solutions meet enterprise-grade standards.

Your responsibilities include designing data architectures that balance performance, cost-efficiency, and scalability; implementing data models, security controls, and access policies across GCP platforms; leading cloud database selection, schema design, and tuning for analytical and transactional workloads; collaborating with DevOps and DataOps teams to deploy and manage data environments; ensuring best practices for data governance, cataloging, and versioning; and enabling real-time and batch integrations using GCP-native tools.

To excel in this role, you must have deep knowledge of BigQuery, CloudSQL, and the GCP data ecosystem, along with strong experience in schema design, partitioning, clustering, and materialized views. Hands-on experience implementing data encryption, IAM policies, and VPC configurations is crucial, as is an understanding of hybrid and multi-cloud data architecture strategies and data lifecycle management. Proficiency in GCP cost optimization is also required. Preferred skills include experience with AlloyDB, Firebase, or Spanner, familiarity with LookML, dbt, or DAG-based orchestration tools, and exposure to the BFSI domain or financial services architecture. Visionary thinking paired with practical implementation skills, strong communication, and cross-functional leadership are highly valued, and previous experience guiding data strategy in enterprise settings is advantageous.

Joining our team will give you the opportunity to own data architecture initiatives in a cloud-native ecosystem, drive innovation through scalable and secure GCP designs, and collaborate with forward-thinking data and engineering teams. Skills required: IAM policies, Spanner, schema design, data architecture, the GCP data ecosystem, dbt, GCP cost optimization, AlloyDB, data encryption, data lifecycle management, BigQuery, LookML, VPC configurations, partitioning, clustering, materialized views, DAG-based orchestration tools, Firebase, and CloudSQL.

Posted 2 months ago

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Data Modeller specializing in GCP and cloud databases, you will design and optimize data models for both OLTP and OLAP systems. Your expertise in cloud databases, data architecture, and modeling will be essential as you collaborate with engineering and analytics teams on efficient operational systems and real-time reporting pipelines.

You will design conceptual, logical, and physical data models tailored for OLTP and OLAP systems; develop and refine models that support performance-optimized cloud data pipelines; implement models in BigQuery, CloudSQL, and AlloyDB; and design schemas with indexing, partitioning, and data-sharding strategies. Translating business requirements into scalable data architecture and schemas is a key aspect of the role, along with optimizing for near-real-time ingestion, transformation, and query performance. You will use tools like DBSchema for collaborative modeling and documentation, and create and maintain metadata and documentation around the models.

Required skills include hands-on experience with GCP databases (BigQuery, CloudSQL, AlloyDB), a strong understanding of OLTP and OLAP systems, proficiency in database performance tuning, familiarity with modeling tools such as DBSchema or ERwin, and proficiency in SQL, schema definition, and normalization/denormalization techniques. Preferred skills include functional knowledge of the mutual fund or BFSI domain, experience integrating with cloud-native ETL and data orchestration pipelines, and familiarity with schema version control and CI/CD in a data context. Strong analytical and communication abilities, attention to detail, and a collaborative approach across engineering, product, and analytics teams are highly valued.

Joining this role will give you the opportunity to work on enterprise-scale cloud data architectures, drive performance-oriented data modeling for advanced analytics, and collaborate with high-performing cloud-native data teams.

Posted 2 months ago

3.0 - 5.0 years

7 - 11 Lacs

Pune

Work from Office

Role Description

The Engineer is responsible for managing or performing work across multiple areas of the bank's overall IT platform/infrastructure, including analysis, development, and administration, and may take functional oversight of engineering delivery for specific departments. The work includes:
- Planning and developing entire engineering solutions to accomplish business goals
- Building reliability and resiliency into solutions, with appropriate testing and reviewing throughout the delivery lifecycle
- Ensuring maintainability and reusability of engineering solutions
- Ensuring solutions are well architected and can be integrated successfully into the end-to-end business process flow
- Reviewing engineering plans and quality to drive re-use and improve engineering capability
- Participating in industry forums to drive adoption of innovative technologies, tools and solutions in the bank

What we'll offer you

As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for ages 35 and above

Your key responsibilities
- Designing and implementing GCP infrastructure using IaC
- Automating recurring processes
- Working closely with development teams and implementing CI/CD pipelines for building, testing and deploying applications
- Containerizing applications and orchestrating containers
- Designing and implementing application environments to ease development, testing, and release processes
- Monitoring the infrastructure and applications for improvements
- Maintaining and upgrading current processes
- Cost-cutting analysis

Your skills and experience
- Experience working with Google Cloud Platform
- Experience in containerization and orchestration (Docker, GKE, Artifact Registry, Cloud Run, CloudSQL)
- Experience with IaC (Terraform)
- Experience writing CI/CD for applications and infrastructure (GitHub workflows, Jenkins, etc.)
- Experience using monitoring tools (Cloud Monitoring)
- Knowledge of at least one scripting language
- Basic DevSecOps skills
- Experience in user and permissions management (IAM)
- 3-5 years of experience as a DevOps engineer
- GCP certification preferred

How we'll support you
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

Posted 3 months ago
Page 1 of 2