
279 Cloud SQL Jobs - Page 5

Set up a job alert

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

7.0 - 9.0 years

8 - 10 Lacs

Hyderabad, Telangana, India

On-site

Entity: Google Cloud Architect
- Experience in GCP architecture, landing zone and foundation design and implementation.
- Experience in GCVE.
- Experience creating infrastructure using Terraform, including writing modules and resources and automating resource creation using Cloud Build.
- Experience in GCP services like VPC networks, firewalls, Compute Engine, App Engine, Kubernetes Engine, Cloud SQL, Cloud Storage, Filestore, monitoring, and logging.
- Experience in establishing hybrid connectivity from on-prem to GCP (including VPN and Interconnect).
- Experience with DevOps practices and Infrastructure-as-Code (IaC) scripts.
- Experience with observability / logging & monitoring.
- Experience with designing and implementing infrastructure components such as Kubernetes (GKE).
- Design and delivery of backup and DR (BDR) solutions in GCP.
- Experience with database modernization (Cloud SQL).
- Understanding of GCP networking.

Data Platform Architect
- Proven experience as a Data Architect, with a focus on hyperscalers (AWS, GCP).
- Strong understanding of GCP services such as BigQuery, Cloud Storage, Dataproc, Dataflow, and Pub/Sub.
- Experience with data modeling, ETL processes, and data warehousing.
- Experience with data cataloging, classification, labeling, sensitive data inspection and redaction, and Data Loss Prevention (DLP) on GCP.
- Knowledge of security best practices and regulatory compliance.
- Excellent problem-solving skills and the ability to work in a fast-paced environment.
- Strong leadership and collaboration skills.

Data Center Architect
- Data Center Design: Develop and implement comprehensive data center solutions, including server, storage, network, and infrastructure components. Ensure scalability, performance, and reliability to meet business needs.
- Infrastructure Planning & Optimization: Assess existing infrastructure, identify areas of improvement, and design solutions that enhance performance, availability, and scalability while reducing operational costs.
- Technology Integration: Lead the integration of cloud services, virtualization technologies, and on-premise systems, ensuring seamless operations across multiple platforms and regions.
- Security & Compliance: Ensure that data center designs adhere to the latest security standards, policies, and regulatory compliance (e.g., ISO, PCI DSS, GDPR). Perform vulnerability assessments and recommend security enhancements.
- Disaster Recovery & High Availability: Design and implement disaster recovery (DR) and business continuity solutions to minimize downtime and data loss. Ensure systems are highly available and resilient.
- Capacity Planning: Analyze workload trends and performance data to ensure the data center can meet future demand. Provide recommendations on hardware and software upgrades to support growth.
- Collaboration & Leadership: Work closely with various teams, including IT operations, network engineers, cloud architects, and third-party vendors, to ensure the data center infrastructure aligns with business objectives.
- Documentation & Compliance: Maintain detailed documentation of the data center architecture, configurations, and processes. Ensure designs are audit-ready and compliant with industry standards.

Network Architect
- Network Design & Architecture: Design, develop, and implement scalable and secure network solutions, including LAN, WAN, VPN, and cloud networking. Evaluate the current network setup and recommend improvements to enhance performance, security, and cost-efficiency. Develop detailed network blueprints and architectural models to support business requirements and future scalability.
- Technology Integration & Innovation: Integrate new networking technologies (such as SDN, SD-WAN, cloud networking) with legacy systems while ensuring optimal performance. Lead the adoption of emerging technologies to improve network automation, monitoring, and operational efficiency. Collaborate with cloud architects to ensure seamless hybrid cloud network integration.
- Network Security: Design and enforce security protocols, including firewalls, encryption, access control, and network segmentation, to protect against cyber threats. Ensure that network designs comply with industry standards and regulatory requirements (e.g., PCI DSS, ISO 27001, GDPR).
- Capacity Planning & Scalability: Analyze network traffic patterns, usage, and performance to predict future capacity needs and plan for expansion. Implement strategies to optimize network performance during peak times and reduce latency.
- Disaster Recovery & High Availability: Design and implement disaster recovery solutions and high-availability architectures to ensure minimal downtime and uninterrupted network operations. Develop failover strategies and redundancy plans for critical systems and applications.
- Collaboration & Leadership: Collaborate with IT, security, and cloud teams to ensure network architecture aligns with organizational goals. Act as a technical lead, guiding network engineers and support teams in troubleshooting, deployment, and maintenance. Engage with stakeholders to understand business needs and translate them into technical network requirements.
- Documentation & Compliance: Create and maintain detailed documentation of the network architecture, including diagrams, configurations, and operational procedures. Ensure the network design is compliant with industry standards and organizational policies.

Posted 1 month ago

Apply

10.0 - 12.0 years

9 - 10 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

GCP certified, with designing and architecture experience. GCP services (Compute Engine, GKE, Cloud SQL, BigQuery, etc.); containers and orchestration; Terraform, Cloud Build, Cloud Functions, or other GCP-native tools; IAM, VPCs, firewall rules, service accounts, and Cloud Identity; Grafana or any monitoring tool.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

66degrees is a leading consulting and professional services company specializing in developing AI-focused, data-led solutions leveraging the latest advancements in cloud technology. With unmatched engineering capabilities and vast industry experience, we help the world's leading brands transform their business challenges into opportunities and shape the future of work.

For this role at 66degrees, we are seeking a senior contractor to engage in a 2.5-month remote assignment with the potential to extend. Candidates with the required skills and the ability to work independently as well as within a team environment are encouraged to apply. As part of the responsibilities, you will be expected to facilitate, guide, and influence the client and teams towards an effective architectural pattern. You will serve as an interface between business leadership, technology leadership, and the delivery teams. Your role will involve performing Migration Assessments and producing Migration Plans that include Total Cost of Ownership (TCO), Migration Architecture, Migration Timelines, and Application Waves. Additionally, you will be responsible for designing a solution architecture on Google Cloud to support critical workloads, including heterogeneous Oracle migrations to Postgres or Spanner. You will need to design a migration path that accounts for the conversion of application dependencies, database objects, data, data pipelines, orchestration, users, and security. You will oversee migration activities and provide troubleshooting support, including translation of DDL and DML, executing data transfers using native Google Cloud and 3rd-party tools, and setting up and configuring relative Google Cloud components. Furthermore, you will engage with customer teams as a Google Cloud expert to provide Education Workshops, Architectural Recommendations, and Technology reviews and recommendations.

Qualifications:
- 5+ years of experience with data engineering, cloud architecture, or working with data infrastructure.
- 5+ years of Oracle database management and IT experience.
- Experience with Oracle Database adjacent products like Golden Gate and Data Guard.
- 3+ years of PostgreSQL experience.
- Proven experience in performing performance testing and applying remediations to address performance issues.
- Experience in designing data models.
- Proficiency in the Python programming language and SQL.
- Advanced SQL skills, including the ability to write, tune, and interpret SQL queries; tool-specific experience in the database platforms listed above is ideal.
- Proven experience in migrating and/or implementing cloud databases like Cloud SQL, Spanner, and Bigtable.

Desired Skills:
- Google Cloud Professional Architect and/or Data Engineer Certification is preferred.

66degrees is committed to protecting your privacy. Your personal information is collected, used, and shared in accordance with the California Consumer Privacy Act (CCPA).
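The Oracle-to-Postgres data transfer work described above can be illustrated with a minimal, hedged Python sketch: a batched table copy using the python-oracledb and psycopg2 drivers. The connection details, table name, and columns are hypothetical placeholders; a real migration would typically lean on the native Google Cloud and 3rd-party tools the posting mentions, with something like this reserved for small reference tables.

```python
import oracledb
import psycopg2
from psycopg2.extras import execute_values

# Hypothetical source (Oracle) and target (Cloud SQL for PostgreSQL) connections.
src = oracledb.connect(user="app", password="...", dsn="ora-host/ORCLPDB1")
dst = psycopg2.connect(host="10.0.0.5", dbname="appdb", user="app", password="...")

BATCH = 5000  # stream in batches so large tables never sit fully in memory

with src.cursor() as ocur, dst.cursor() as pcur:
    ocur.execute("SELECT id, name, created_at FROM customers")  # assumed schema
    while True:
        rows = ocur.fetchmany(BATCH)
        if not rows:
            break
        # execute_values expands the batch into a single multi-row INSERT
        execute_values(pcur, "INSERT INTO customers (id, name, created_at) VALUES %s", rows)
        dst.commit()
```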

Posted 1 month ago

Apply

3.0 - 6.0 years

4 - 8 Lacs

Hyderabad

Work from Office

- Comprehensive understanding of BGP: configure and troubleshoot in large global environments.
- Comprehensive understanding of routing concepts: inter-VR routing, policy-based routing, routing protocols, route filtering, path selection, access-lists, etc.
- Comprehensive understanding of switching concepts: VLANs, Layer 2, MAC forwarding, VLAN trunking, VRRP, Gratuitous ARP.
- Comprehensive understanding of firewall/security concepts: L2-L7, all versions of NAT, failover scenarios, zonal concepts, IPSec, L7 encryption concepts, URL filtering, DNS, security profiles and rules, proxying.
- Comprehensive understanding of load balancing concepts: cloud LB and conventional LB and their differences in functionality.
- Good understanding of public cloud platforms: preferably GCP, specifically networking, firewalling, IAM, and how they relate to cloud-native services (PSA, Cloud SQL, GCVE, Cloud Interconnects, BMS, Filestore, NetApp, etc.).
- Good understanding of Infrastructure as Code (IaC) to provision resources: must be able to customize and optimize the codebase to simplify deliveries.
- Good understanding of Linux administration: using Linux to bridge technical gaps in Windows and understanding the tools available to troubleshoot network connectivity.
- Understanding of APIs in order to expedite data collection and configuration and eliminate human error.
- Understanding of DevOps and how it can improve delivery and operation.

Primary Skills
- Skills required: network security, switches, routers, firewalls, and cloud.
- Products: Juniper (MX, SRX, QFX); Palo Alto (physical and virtual firewalls, Panorama).
- Tools: Terraform; AlgoSec or a similar tool for traffic flow governance.
- Mandatory languages: Python, HCL (HashiCorp Configuration Language).
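The API-driven data collection this posting calls for can be sketched in a few lines of Python. This is a hedged illustration only: the management endpoint, token, and JSON shape are hypothetical placeholders, not a specific Juniper or Palo Alto API.

```python
import requests

BASE = "https://fw-mgmt.example.net/api/v1"  # hypothetical management endpoint
TOKEN = "..."                                # assume a token issued by the platform

def get_rules(device: str) -> list[dict]:
    # Pull the security rulebase for one device; the response shape is assumed.
    resp = requests.get(
        f"{BASE}/devices/{device}/security-rules",
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["rules"]

# Flag overly permissive rules before they reach change review,
# replacing error-prone manual inspection.
for rule in get_rules("edge-fw-01"):
    if rule.get("source") == "any" and rule.get("service") == "any":
        print(f"review: {rule['name']} allows any/any")
```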

Posted 1 month ago

Apply

3.0 - 5.0 years

5 - 8 Lacs

Chennai

Work from Office

" PLEASE READ THE JOB DESTCRIPTION AND APPLY" Data Engineer Job Description Position Overview Yesterday is history, tomorrow is a mystery, but today is a gift. That's why we call it the present. - Master Oogway Join CustomerLabs' dynamic data team as a Data Engineer and play a pivotal role in transforming raw marketing data into actionable insights that power our digital marketing platform. As a key member of our data infrastructure team, you will design, develop, and maintain robust data pipelines, data warehouses, and analytics platforms that serve as the backbone of our digital marketing product development. Sometimes the hardest choices require the strongest wills. - Thanos (but we promise, our data decisions are much easier! ) In this role, you will collaborate with cross-functional teams including Data Scientists, Product Managers, and Marketing Technology specialists to ensure seamless data flow from various marketing channels, ad platforms, and customer touchpoints to our analytics dashboards and reporting systems. You'll be responsible for building scalable, reliable, and efficient data solutions that can handle high-volume marketing data processing and real-time campaign analytics. What You'll Do: - Design and implement enterprise-grade data pipelines for marketing data ingestion and processing - Build and optimize data warehouses and data lakes to support digital marketing analytics - Ensure data quality, security, and compliance across all marketing data systems - Create data models and schemas that support marketing attribution, customer journey analysis, and campaign performance tracking - Develop monitoring and alerting systems to maintain data pipeline reliability for critical marketing operations - Collaborate with product teams to understand digital marketing requirements and translate them into technical solutions Why This Role Matters: I can do this all day. - Captain America (and you'll want to, because this role is that rewarding!) You'll be the backbone behind the data infrastructure that powers CustomerLabs' digital marketing platform, making marketers' lives easier and better. Your work directly translates to smarter automation, clearer insights, and more successful campaigns - helping marketers focus on what they do best while we handle the complex data heavy lifting. Sometimes you gotta run before you can walk. - Iron Man (and sometimes you gotta build the data pipeline before you can analyze the data! ) Our Philosophy: We believe in the power of data to transform lives, just like the Dragon Warrior transformed the Valley of Peace. Every line of code you write, every pipeline you build, and every insight you enable has the potential to change how marketers work and succeed. We're not just building data systems - we're building the future of digital marketing, one insight at a time. Your story may not have such a happy beginning, but that doesn't make you who you are. It is the rest of your story, who you choose to be. - Soothsayer What Makes You Special: We're looking for someone who embodies the spirit of both Captain America's unwavering dedication and Iron Man's innovative genius. You'll need the patience to build robust systems (like Cap's shield ) and the creativity to solve complex problems (like Tony's suit). Most importantly, you'll have the heart to make a real difference in marketers' lives. Inner peace... Inner peace... Inner peace... - Po (because we know data engineering can be challenging, but we've got your back! 
) Key Responsibilities Data Pipeline Development - Design, build, and maintain robust, scalable data pipelines and ETL/ELT processes - Develop data ingestion frameworks to collect data from various sources (databases, APIs, files, streaming sources) - Implement data transformation and cleaning processes to ensure data quality and consistency - Optimize data pipeline performance and reliability Data Infrastructure Management - Design and implement data warehouse architectures - Manage and optimize database systems (SQL and NoSQL) - Implement data lake solutions and data governance frameworks - Ensure data security, privacy, and compliance with regulatory requirements Data Modeling and Architecture - Design and implement data models for analytics and reporting - Create and maintain data dictionaries and documentation - Develop data schemas and database structures - Implement data versioning and lineage tracking Data Quality, Security, and Compliance - Ensure data quality, integrity, and consistency across all marketing data systems - Implement and monitor data security measures to protect sensitive information - Ensure privacy and compliance with regulatory requirements (e.g., GDPR, CCPA) - Develop and enforce data governance policies and best practices Collaboration and Support - Work closely with Data Scientists, Analysts, and Business stakeholders - Provide technical support for data-related issues and queries Monitoring and Maintenance - Implement monitoring and alerting systems for data pipelines - Perform regular maintenance and optimization of data systems - Troubleshoot and resolve data pipeline issues - Conduct performance tuning and capacity planning Required Qualifications Experience - 2+ years of experience in data engineering or related roles - Proven experience with ETL/ELT pipeline development - Experience with cloud data platform (GCP) - Experience with big data technologies Technical Skills - Programming Languages : Python, SQL, Golang (preferred) - Databases: PostgreSQL, MySQL, Redis - Big Data Tools: Apache Spark, Apache Kafka, Apache Airflow, DBT, Dataform - Cloud Platforms: GCP (BigQuery, Dataflow, Cloud run, Cloud SQL, Cloud Storage, Pub/Sub, App Engine, Compute Engine etc.) - Data Warehousing: Google BigQuery - Data Visualization: Superset, Looker, Metabase, Tableau - Version Control: Git, GitHub - Containerization: Docker Soft Skills - Strong problem-solving and analytical thinking - Excellent communication and collaboration skills - Ability to work independently and in team environments - Strong attention to detail and data quality - Continuous learning mindset Preferred Qualifications Additional Experience - Experience with real-time data processing and streaming - Knowledge of machine learning pipelines and MLOps - Experience with data governance and data catalog tools - Familiarity with business intelligence tools (Tableau, Power BI, Looker, etc.) - Experience using AI-powered tools (such as Cursor, Claude, Copilot, ChatGPT, Gemini, etc.) to accelerate coding, automate tasks, or assist in system design ( We belive run with machine, not against machine ) Interview Process 1. Initial Screening: Phone/video call with HR 2. Technical Interview: Deep dive into data engineering concepts 3. Final Interview: Discussion with senior leadership Note: This job description is intended to provide a general overview of the position and may be modified based on organizational needs and candidate qualifications. Our Team Culture We are Groot. 
- We work together, we grow together, we succeed together. We believe in: - Innovation First - Like Iron Man, we're always pushing the boundaries of what's possible - Team Over Individual - Like the Avengers, we're stronger together than apart - Continuous Learning - Like Po learning Kung Fu, we're always evolving and improving - Making a Difference - Like Captain America, we fight for what's right (in this case, better marketing!) Growth Journey There is no charge for awesomeness... or attractiveness. - Po Your journey with us will be like Po's transformation from noodle maker to Dragon Warrior: - Level 1 : Master the basics of our data infrastructure - Level 2: Build and optimize data pipelines - Level 3 : Lead complex data projects and mentor others - Level 4: Become a data engineering legend (with your own theme music! ) What We Promise I am Iron Man. - We promise you'll feel like a superhero every day! - Work that matters - Every pipeline you build helps real marketers succeed - Growth opportunities - Learn new technologies and advance your career - Supportive team - We've got your back, just like the Avengers - Work-life balance - Because even superheroes need rest!
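To make the pipeline responsibilities above concrete, here is a minimal, hedged sketch of a daily marketing-ingestion DAG in Apache Airflow (one of the tools the posting lists). The DAG name, task bodies, and schedule are illustrative assumptions written for Airflow 2.x, not CustomerLabs' actual pipeline.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_ad_spend(**context):
    # Pull yesterday's spend from an ad platform API (stubbed here).
    ...

def load_to_warehouse(**context):
    # Write the transformed records to the analytics warehouse (stubbed here).
    ...

with DAG(
    dag_id="marketing_spend_daily",   # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_ad_spend", python_callable=extract_ad_spend)
    load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)
    extract >> load  # load runs only after extraction succeeds
```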

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

Wipro Limited is a leading technology services and consulting company dedicated to developing innovative solutions that cater to the most complex digital transformation needs of clients. Our comprehensive range of consulting, design, engineering, and operational capabilities enables us to assist clients in achieving their most ambitious goals and establishing sustainable, future-ready businesses. With a global presence of over 230,000 employees and business partners spanning 65 countries, we remain committed to supporting our customers, colleagues, and communities in navigating an ever-evolving world.

We are currently seeking an individual with hands-on experience in data modeling for both OLTP and OLAP systems. The ideal candidate should possess a deep understanding of conceptual, logical, and physical data modeling, coupled with a robust grasp of indexing, partitioning, and data sharding, supported by practical experience. Experience in identifying and mitigating factors impacting database performance for near-real-time reporting and application interaction is essential. Proficiency in at least one data modeling tool, preferably DB Schema, is required. Additionally, functional knowledge of the mutual fund industry would be beneficial. Familiarity with GCP databases such as AlloyDB, Cloud SQL, and BigQuery is preferred. The role is based at the customer site in Chennai, with a mandatory on-site presence five days per week. Cloud-PaaS-GCP-Google Cloud Platform is a mandatory skill set for this position. The successful candidate should have 5-8 years of relevant experience and should be prepared to contribute to the reimagining of Wipro as a modern digital transformation partner.

We are looking for individuals who are inspired by reinvention - of themselves, their careers, and their skills. At Wipro, we encourage continuous evolution, reflecting our commitment to adapt to the changing world around us. Join us in a business driven by purpose, where you have the freedom to shape your own reinvention. Realize your ambitions at Wipro. We welcome applications from individuals with disabilities. For more information, please visit www.wipro.com.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Software Engineer Practitioner at TekWissen in Chennai, you will be a crucial part of the team responsible for the development and maintenance of the Enterprise Data Platform. Your main focus will be on designing, building, and optimizing scalable data pipelines within the Google Cloud Platform (GCP) environment. Working with GCP-native technologies such as BigQuery, Dataform, Dataflow, and Pub/Sub, you will ensure data governance, security, and optimal performance. This role offers you the opportunity to utilize your full-stack expertise, collaborate with talented teams, and establish best practices for data engineering at the client.

To be successful in this role, you should possess a Bachelor's or Master's degree in Computer Science, Engineering, or a related field of study. You should have at least 5 years of experience with a strong understanding of database concepts and multiple database technologies to optimize query and data processing performance. Proficiency in SQL, Python, and Java is essential, along with experience in programming engineering transformations in Python or similar languages. Additionally, you should have the ability to work effectively across different organizations, product teams, and business partners, along with knowledge of Agile (Scrum) methodology and experience in writing user stories. Your skills should include expertise in data architecture, data warehousing, and Google Cloud Platform tools such as BigQuery, Dataflow, Dataproc, Data Fusion, and others. Experience with Data Warehouse concepts, ETL processes, and data service ecosystems is crucial for this role. Strong communication skills are necessary for both internal team collaboration and external stakeholder interactions. Your role will involve advocating for user experience through empathetic stakeholder relationships and ensuring effective communication within the team and with stakeholders.

As a Software Engineer Practitioner, you should have excellent communication, collaboration, and influence skills to energize the team. Your knowledge of data, software, architecture operations, data engineering, and data management standards will be valuable in this role. Hands-on experience in Python using libraries like NumPy and Pandas is required, along with extensive knowledge of GCP offerings and bundled services related to data operations. You should also have experience in re-developing and optimizing data operations, data science, and analytical workflows and products.

TekWissen Group is an equal opportunity employer that supports workforce diversity, and we encourage applicants from diverse backgrounds to apply. Join us in shaping the future of data engineering and making a positive impact on lives, communities, and the planet.

Posted 1 month ago

Apply

6.0 - 9.0 years

8 - 13 Lacs

Bengaluru

Work from Office

Developing cloud-based, state-of-the-art compliance archive products to archive and retain real-time communications data in line with internal and external regulatory requirements. The product is developed in-house in an agile development setup based on business requirements from multiple stakeholder parties. By employing continuous development and deployment principles, the team is aiming to transition from project to product management to support the bank with robust compliance archive solutions for the next decade.

For a large global investment bank, we are looking for GCP-qualified cloud engineers to help with the FIC (fixed income and currencies) cloud migrations under Project Cirrus. Knowledge of Financial Services/FIC would be great, but the primary skills we need are in building, migrating, and deploying applications to GCP, Terraform module coding, Google infrastructure, cloud-native services such as GCE, GKE, Cloud SQL/Postgres, logging and monitoring, etc., and good written and spoken English, as we would like these engineers to help with knowledge transfer to our existing development and support teams. We would like to place people alongside the engineers they'll be working with in the bank. You should have extensive experience with Google Cloud Platform (GCP), Kubernetes, and Docker. The role involves working closely with our development and operations teams to ensure seamless integration and deployment of applications.

Responsibilities
- Design, implement, and manage CI/CD pipelines on GCP.
- Automate infrastructure provisioning, configuration, and deployment using tools like Terraform and Ansible.
- Manage and optimize Kubernetes clusters for high availability and scalability.
- Containerize applications using Docker and manage container orchestration.
- Monitor system performance, troubleshoot issues, and ensure system reliability and security.
- Collaborate with development teams to ensure smooth and reliable operation of software and systems.
- Implement and manage logging, monitoring, and alerting solutions.
- Stay updated with the latest industry trends and best practices in DevOps and cloud technologies.

Skills - Must have
- 6 to 9 years of experience as a DevOps Engineer and a minimum of 4 years of relevant experience in GCP.
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Strong expertise in Kubernetes and Docker.
- Experience with infrastructure as code (IaC) tools such as Terraform and Ansible.
- Proficiency in scripting languages like Python, Bash, or Go.
- Familiarity with CI/CD tools such as Jenkins, GitLab CI, or CircleCI.
- Knowledge of networking, security, and database management.
- Excellent problem-solving skills and attention to detail.

Nice to have
- Strong communication and collaboration skills.
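For the cluster-management side of this role, a small Python sketch using the google-cloud-container client can inventory GKE clusters and node pools, the kind of scripted check that feeds monitoring and migration planning. The project ID is a placeholder, and this is a hedged illustration rather than the bank's tooling.

```python
from google.cloud import container_v1

client = container_v1.ClusterManagerClient()
# "-" means all locations; the project ID is a placeholder.
parent = "projects/my-project/locations/-"

for cluster in client.list_clusters(parent=parent).clusters:
    # Status is an enum (PROVISIONING, RUNNING, ERROR, ...).
    print(cluster.name, cluster.status.name, cluster.current_node_count)
    for pool in cluster.node_pools:
        print("  node pool:", pool.name, pool.initial_node_count)
```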

Posted 2 months ago

Apply

5.0 - 10.0 years

6 - 16 Lacs

Kolkata, Bengaluru, Mumbai (All Areas)

Work from Office

Responsibilities: A day in the life of an Infoscion
- As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions, and facilitate deployment resulting in client delight.
- You will develop a proposal by owning parts of the proposal document and by giving inputs in solution design based on areas of expertise.
- You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots, and assist in resolving any queries related to requirements and solution design.
- You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates which suit the customer's budgetary requirements and are in line with the organization's financial guidelines.
- Actively lead small projects and contribute to unit-level and organizational initiatives with an objective of providing high-quality, value-adding solutions to customers.

If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Ability to develop value-creating strategies and models that enable clients to innovate, drive growth, and increase their business profitability.
- Good knowledge of software configuration management systems.
- Awareness of the latest technologies and industry trends.
- Logical thinking and problem-solving skills along with an ability to collaborate.
- Understanding of the financial processes for various types of projects and the various pricing models available.
- Ability to assess current processes, identify improvement areas, and suggest technology solutions.
- Knowledge of one or two industry domains.
- Client interfacing skills.
- Project and team management.

Technical and Professional Requirements:
- Technology -> Cloud Platform -> GCP Data Analytics -> Looker
- Technology -> Cloud Platform -> GCP Database -> Google BigQuery

Preferred Skills:
- Technology -> Cloud Platform -> Google Big Data
- Technology -> Cloud Platform -> GCP Data Analytics

Posted 2 months ago

Apply

0.0 years

0 Lacs

Pune, Maharashtra, India

On-site

- Work with the team in the capacity of a GCP Data Engineer on day-to-day activities.
- Solve problems at hand with utmost clarity and speed.
- Train and coach other team members.
- Ability to turn around quickly.
- Work with data analysts and architects to help them solve any specific issues with tooling/processes.
- Design, build, and operationalize large-scale enterprise data solutions and applications using one or more GCP data and analytics services in combination with 3rd parties - Python/Java/React.js, Airflow, ETL skills.
- GCP services: BigQuery, Dataflow, Cloud SQL, Cloud Functions, data lake.
- Design and build production data pipelines from ingestion to consumption within a big data architecture.
- GCP BigQuery modeling and performance tuning techniques.
- RDBMS and NoSQL database experience.
- Knowledge of orchestrating workloads on cloud.
- Implement data warehouse and big/small data designs and data lake solutions with very good data quality capabilities.
- Understanding and knowledge of CI/CD deployment strategies.
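The BigQuery modeling and performance-tuning point above is the kind of thing that is easiest to see in code. Here is a hedged Python sketch using the google-cloud-bigquery client: a day-partitioned, clustered table plus a partition-pruned query. The dataset, table, and columns are invented for illustration.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project ID

# Partition by day and cluster by a frequent filter column so queries
# scan only the slices they need - a standard BQ tuning technique.
ddl = """
CREATE TABLE IF NOT EXISTS analytics.events (
  event_ts   TIMESTAMP,
  user_id    STRING,
  event_name STRING
)
PARTITION BY DATE(event_ts)
CLUSTER BY user_id
"""
client.query(ddl).result()

# The partition filter lets BigQuery prune all other days from the scan.
sql = """
SELECT event_name, COUNT(*) AS n
FROM analytics.events
WHERE DATE(event_ts) = CURRENT_DATE()
GROUP BY event_name
"""
for row in client.query(sql).result():
    print(row.event_name, row.n)
```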

Posted 2 months ago

Apply

5.0 - 7.0 years

14 - 17 Lacs

Pune

Work from Office

A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat.

In your role, you will be responsible for:
- Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Logs Explorer, etc.
- Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations.
- Ability to analyse data for functional business requirements and front-face the customer.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Logs Explorer.
- Ambitious individual who can work under their own direction towards agreed targets/goals and with a creative approach to work.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- End-to-end functional knowledge of the data pipeline/transformation implementation that the candidate has done; should understand the purpose/KPIs for which the data transformation was done.

Preferred technical and professional experience:
- Experience with AEM Core Technologies: OSGi Services, Apache Sling, Granite Framework, Java Content Repository API, Java 8+, Localization.
- Familiarity with build tools such as Jenkins and Maven; knowledge of version control tools, especially Git; knowledge of patterns and good practices to design and develop quality, clean code; knowledge of HTML, CSS, JavaScript, and jQuery.
- Familiarity with task management, bug tracking, and collaboration tools like JIRA and Confluence.
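Pub/Sub appears in both skills lists above; as a point of reference, here is a minimal hedged sketch of publishing a message with the google-cloud-pubsub Python client. The project, topic, and payload are placeholders, not IBM's environment.

```python
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
# Project ID and topic name are placeholders.
topic_path = publisher.topic_path("my-project", "ingest-events")

payload = json.dumps({"order_id": 123, "status": "NEW"}).encode("utf-8")

# Extra keyword arguments become message attributes, usable for filtering.
future = publisher.publish(topic_path, payload, source="orders-api")
print("published message", future.result())  # blocks until the server acks
```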

Posted 2 months ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

We are looking for experienced professionals with strong expertise in Google Cloud Platform (GCP) database services. The role involves designing, implementing, and troubleshooting scalable database solutions on GCP.

Responsibilities:
- Proven experience as a Subject Matter Expert in Google Cloud native databases and managed SQL solutions or a similar role.
- In-depth knowledge of Google Cloud Platform (GCP) and its database tools, including Cloud SQL, BigQuery, and Spanner.
- Strong analytical and problem-solving skills.
- Excellent communication and presentation skills.
- Proficiency in relevant programming languages such as SQL, Python, or Go.
- Familiarity with cloud-native architectures and database best practices.
- Provide technical expertise on GCP database tools.
- Design and support cloud-native database architectures.
- Resolve complex database issues.
- Collaborate with cross-functional teams.

Good to Have:
- Google Cloud certifications.
- Experience in DB migration.
- Knowledge of data security/compliance.

Posted 2 months ago

Apply

0.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Ready to shape the future of work? At Genpact, we don't just adapt to change, we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Assistant Vice President (AVP), Google Cloud Platform Pre-Sales Solution Architect - Data & AI

About Genpact: Genpact (NYSE: G) is a global professional services firm delivering outcomes that transform businesses. With a proud 25-year history and over 125,000 diverse professionals in 30+ countries, we are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for our clients. We serve leading global enterprises, including the Fortune Global 500, leveraging our deep industry expertise, digital innovation, and cutting-edge capabilities in data, technology, and AI. Join our team to shape the future of business through intelligent operations and drive meaningful impact.

The Opportunity: Genpact is seeking a highly accomplished and visionary Assistant Vice President (AVP), Google Cloud Platform Pre-Sales Solution Architect, specializing in Data and Artificial Intelligence. This pivotal role will be instrumental in driving Genpact's growth in the GCP ecosystem by leading complex pre-sales engagements, designing transformative data and AI solutions, and fostering executive-level client relationships. You will operate at the intersection of business strategy and cutting-edge technology, translating intricate client challenges into compelling, implementable solutions on Google Cloud.

Responsibilities:
- Executive Solutioning & Strategy: Lead the end-to-end technical pre-sales cycle for Genpact's most strategic data and AI opportunities on GCP. Engage at the CXO level and with senior business and IT stakeholders to deeply understand their strategic objectives, pain points, and competitive landscape.
- Architectural Leadership: Design and articulate highly scalable, secure, and resilient enterprise-grade data and AI architectures on Google Cloud Platform. This includes expertise in BigQuery, Dataflow, Dataproc, Vertex AI (MLOps, Generative AI), Cloud AI services, Looker, Pub/Sub, Cloud Storage, Data Catalog, and other relevant GCP services.
- Value Proposition & Storytelling: Develop and deliver highly impactful presentations, workshops, and proof-of-concepts (POCs) that clearly demonstrate the business value and ROI of Genpact's data and AI solutions on GCP. Craft compelling narratives that resonate with both technical and non-technical audiences.
- Deal Ownership & Closure: Work collaboratively with sales teams to own the technical solutioning and commercial structuring of deals from qualification to closure. Lead the estimation, negotiation, and transition of deals to the delivery organization, ensuring alignment and seamless execution.
- Technical Deep Dive & Expertise: Provide deep technical expertise on Google Cloud's Data & AI portfolio, staying at the forefront of new service offerings, product roadmaps, and competitive differentiators. Act as the subject matter expert in client discussions and internal enablement.
- Cross-Functional Collaboration: Partner effectively with Genpact's sales, delivery, product development, and industry vertical teams to ensure that proposed solutions are innovative, deliverable, and aligned with market demands and Genpact's capabilities.
- Thought Leadership: Contribute to Genpact's market presence and intellectual property through whitepapers, conference presentations, industry events, and client advisory sessions. Position Genpact as a leader in data-driven transformation on GCP.
- Team Mentorship & Enablement: Provide mentorship and technical guidance to junior pre-sales architects and delivery teams, fostering a culture of continuous learning and excellence in GCP Data & AI.

Qualifications we seek in you!

Minimum Qualifications:
- Progressive experience in data, analytics, artificial intelligence, and cloud technologies, with a strong focus on technical pre-sales, solution architecture, or consulting leadership roles.
- Hands-on experience architecting, designing, and delivering complex data and AI solutions on Google Cloud Platform.
- Deep and demonstrable expertise across the Google Cloud Data & AI stack:
  - Core Data Services: BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, Cloud SQL, Cloud Spanner, Composer, Data Catalog, Dataplex.
  - AI/ML Services: Vertex AI (including MLOps, Workbench, Training, Prediction, Explainable AI), Generative AI offerings (e.g., Gemini, Imagen), Natural Language API, Vision AI, Speech-to-Text, Dialogflow, Recommendation AI.
  - BI & Visualization: Looker, Data Studio.
- Proven track record of successfully leading and closing multi-million dollar deals involving complex data and AI solutions on cloud platforms.
- Exceptional executive presence with the ability to engage, influence, and build trusted relationships with C-level executives and senior stakeholders.
- Strong commercial acumen and experience in structuring complex deals, including pricing models, risk assessment, and contract negotiation.
- Outstanding communication, presentation, and storytelling skills, with the ability to articulate complex technical concepts into clear, concise business benefits.
- Demonstrated ability to lead cross-functional teams and drive consensus in dynamic and ambiguous environments.
- Bachelor's degree in Computer Science, Engineering, or a related technical field. Master's degree or MBA preferred.
- Google Cloud Professional Certifications are highly preferred (e.g., Professional Cloud Architect, Professional Data Engineer, Professional Machine Learning Engineer).
- Ability to travel as required to client sites and internal meetings.

Why join Genpact?
- Be a transformation leader - Work at the cutting edge of AI, automation, and digital innovation.
- Make an impact - Drive change for global enterprises and solve business challenges that matter.
- Accelerate your career - Get hands-on experience, mentorship, and continuous learning opportunities.
- Work with the best - Join 140,000+ bold thinkers and problem-solvers who push boundaries every day.
- Thrive in a values-driven culture - Our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 2 months ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Mumbai

Work from Office

Objectives of this role
The primary objective of this role is to design, develop, and maintain databases that meet the organization's requirements for storing and analyzing financial data. You will also be responsible for ensuring data integrity, security, and performance across various database platforms.

Your tasks
- Design, develop, and optimize relational and non-relational databases to support the organization's financial data needs.
- Implement data models, schemas, and indexing strategies to optimize database performance and scalability.
- Collaborate with data engineering and software development teams to integrate database solutions into our applications and services.
- Perform database tuning, monitoring, and troubleshooting to ensure high availability and reliability.
- Implement data security measures, including access control and encryption, to protect sensitive financial information.
- Develop and maintain documentation for database design, configuration, and best practices.
- Stay current with emerging database technologies and trends to drive continuous improvement and innovation.

You need to have
- Bachelor's degree in Software Engineering, Computer Science, or a related field.
- Minimum 5+ years of experience as a database developer.
- Proven experience as a database developer or administrator, with expertise in relational databases such as MySQL and non-relational databases such as MongoDB, Elasticsearch, and Redis.
- Strong SQL skills and experience with database optimization techniques.
- Experience working with large datasets and complex data models in a financial or similar domain.
- Proficiency in database performance tuning, monitoring, and troubleshooting.
- Excellent problem-solving and analytical skills, with the ability to collaborate effectively in a team environment.
- Familiarity with data security best practices and compliance standards (e.g., GDPR, PCI DSS).
- Capability to work on multiple projects simultaneously.
- Experience with cloud-based database platforms such as Amazon RDS, Google Cloud SQL, or Azure Cosmos DB.
- Knowledge of distributed database systems and big data technologies (e.g., Hadoop, Spark).
- Experience with data warehousing solutions and ETL processes.
- Familiarity with DevOps practices and tools for database automation and CI/CD.
- Previous experience in the financial services industry or a similar regulated environment.

About Us
NSE Cogencis is a leading provider of data, news, and actionable insights and analytics. Professionals across commercial banks, asset management companies, insurance companies, conglomerates, and large corporates use our products to trade, to manage funds, and to hedge risks. As part of the NSE Group and a 100% subsidiary of NSE Data, we play an important role in the Indian financial market ecosystem. Curiosity is our biggest asset, and it's in our DNA. Our curiosity to understand the market trends and challenges faced by today's market professionals drives us to build and manage the most comprehensive database on the Indian financial market, bring exclusive market-moving news to our platform, and continuously upgrade our analytical capability. It is CURIOSITY that drives everything we do at Cogencis. Together we learn, innovate, and thrive professionally. We are an equal opportunity employer, and we strive to create a workplace that is not only employee-friendly but puts our employees at the centre of our organisation. Wellbeing and mental health of our employees are a clear priority for us at NSE Cogencis.

Posted 2 months ago

Apply

5.0 - 7.0 years

6 - 7 Lacs

Chennai

Hybrid

Overview: TekWissen is a global workforce management provider with operations throughout India and many other countries in the world. The below client is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place - one that benefits lives, communities, and the planet.

Job Title: Specialty Development Practitioner
Location: Chennai
Work Type: Hybrid

Position Description: This role is for a proactive Full Stack Software Engineer responsible for creating products to host Supply Chain Analytics algorithms. You will ensure software engineering excellence while developing web applications and tools, employing practices like pair programming and Test-Driven Development (TDD) within an Agile environment. Key responsibilities include acting as a change agent, mentoring teams on Agile methodologies, and contributing to the client's institutional knowledge. Strong written and oral communication skills are essential for interacting with client leadership, along with a self-starting approach.

Required Skills: Bachelor's or Master's degree in Computer Science, Computer Engineering, or a related technical field. 5-7+ years of software engineering and testing experience, including Agile methodologies and Jira. Technical requirements include 3+ years in Python, Java, and Spring Boot development; 3+ years with REST APIs; and 3+ years developing web-based UIs using JavaScript, React, Angular, Vue, or TypeScript, along with Pub/Sub, APIGEE, and Cloud Storage. Experience with relational (e.g., PostgreSQL, SQL Server), NoSQL, and columnar databases (e.g., BigQuery) is necessary. At least 1 year of experience developing and deploying to cloud platforms such as Google Cloud Platform, Pivotal Cloud Foundry, Amazon Web Services, and Microsoft Azure is also required. A passion for clean code and a strong desire for continuous learning are key.

Desired Skills: Full-stack expertise, automated testing (Unit, Integration, E2E), Cloud Computing/Infrastructure experience (especially Google Cloud Platform, Cloud Run containerization, and Google Cloud Storage), and proficiency with Continuous Integration/Continuous Delivery tools like Jenkins, Tekton, or Gradle.

Skills Required: BigQuery, Python, Angular, Relational Databases, Google Cloud Platform, Data Flow, Dataproc, Data Fusion, Terraform, Tekton, Cloud SQL, Airflow, Postgres, PySpark, API
Experience Required: 5+ Years
Education Required: Bachelor's Degree

TekWissen® Group is an equal opportunity employer supporting workforce diversity.
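The posting combines REST API development with BigQuery, so a short hedged sketch of that pairing may help: a FastAPI endpoint that serves analytics rows from BigQuery. The route, dataset, table, and columns are invented placeholders, and FastAPI stands in for whichever web framework the client actually uses.

```python
from fastapi import FastAPI
from google.cloud import bigquery

app = FastAPI()
bq = bigquery.Client()  # uses ambient GCP credentials

@app.get("/forecast/{plant_id}")
def forecast(plant_id: str):
    # A parameterized query keeps the endpoint safe from SQL injection.
    job = bq.query(
        "SELECT week, predicted_units FROM supply_chain.forecasts "
        "WHERE plant_id = @p ORDER BY week",
        job_config=bigquery.QueryJobConfig(
            query_parameters=[bigquery.ScalarQueryParameter("p", "STRING", plant_id)]
        ),
    )
    # Each Row behaves like a mapping, so dict() gives a JSON-serializable record.
    return [dict(row) for row in job.result()]
```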

Posted 2 months ago

Apply

5.0 - 6.0 years

5 - 6 Lacs

Chennai

Hybrid

Overview: TekWissen is a global workforce management provider with operations throughout India and many other countries in the world. The below client is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place - one that benefits lives, communities, and the planet.

Job Title: Software Engineer Practitioner
Location: Chennai
Work Type: Hybrid

Position Description: We're seeking a highly skilled and experienced Full Stack Data Engineer to play a pivotal role in the development and maintenance of our Enterprise Data Platform. In this role, you'll be responsible for designing, building, and optimizing scalable data pipelines within our Google Cloud Platform (GCP) environment. You'll work with GCP-native technologies like BigQuery, Dataform, Dataflow, and Pub/Sub, ensuring data governance, security, and optimal performance. This is a fantastic opportunity to leverage your full-stack expertise, collaborate with talented teams, and establish best practices for data engineering at the client.

Basic Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field of study.
- 5+ years - Strong understanding of database concepts and experience with multiple database technologies, optimizing query and data processing performance.
- 5+ years - Full-stack data engineering competency in a public cloud (Google); critical thinking skills to propose data solutions, test them, and make them a reality.
- 5+ years - Highly proficient in SQL, Python, and Java; experience programming engineering transformations in Python or a similar language.
- 5+ years - Ability to work effectively across organizations, product teams, and business partners.
- 5+ years - Knowledge of Agile (Scrum) methodology, experience in writing user stories.
- Deep understanding of data service ecosystems, including data warehousing, lakes, and marts.
- User experience advocacy through empathetic stakeholder relationships.
- Effective communication both internally (with team members) and externally (with stakeholders).
- Knowledge of Data Warehouse concepts and experience with Data Warehouse/ETL processes.
- Strong process discipline and thorough understanding of IT processes (ISP, Data Security).

Skills Required: Data Architecture, Data Warehousing, Dataform, Google Cloud Platform - BigQuery, Data Flow, Dataproc, Data Fusion, Terraform, Tekton, Cloud SQL, Airflow, Postgres, PySpark, Python, API

Additional skills:
- Excellent communication, collaboration, and influence skills; ability to energize a team.
- Knowledge of data, software and architecture operations, data engineering, and data management standards, governance, and quality.
- Hands-on experience in Python using libraries like NumPy, Pandas, etc.
- Extensive knowledge and understanding of GCP offerings and bundled services, especially those associated with data operations: Cloud Console, BigQuery, Dataflow, Dataform, Pub/Sub.
- Experience with recoding, re-developing, and optimizing data operations, data science, and analytical workflows and products.

Experience Required: 5+ Years
Education Required: Bachelor's Degree

TekWissen Group is an equal opportunity employer supporting workforce diversity.

Posted 2 months ago

Apply

7.0 - 12.0 years

20 - 30 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Work Location: Bangalore/Pune/Hyderabad/NCR
Experience: 5-12 yrs

Required Skills:
- Proven experience as a Data Engineer with expertise in GCP.
- Strong understanding of data warehousing concepts and ETL processes.
- Experience with BigQuery, Dataflow, and other GCP data services.
- Design, develop, and maintain data pipelines on GCP.
- Implement data storage solutions and optimize data processing workflows.
- Ensure data quality and integrity throughout the data lifecycle.
- Collaborate with data scientists and analysts to understand data requirements.
- Monitor and maintain the health of the data infrastructure.
- Troubleshoot and resolve data-related issues.

Thanks & Regards,
Suganya R
Suganya@spstaffing.in
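Dataflow pipelines of the kind listed above are written with Apache Beam. Below is a minimal, hedged Python sketch of a CSV-to-BigQuery pipeline; the bucket path, table, and columns are invented, the target table is assumed to already exist, and DirectRunner is used locally where a real deployment would use DataflowRunner.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse(line: str) -> dict:
    # Each CSV line is assumed to be "user_id,amount".
    user_id, amount = line.split(",")
    return {"user_id": user_id, "amount": float(amount)}

opts = PipelineOptions(runner="DirectRunner")  # swap for DataflowRunner on GCP
with beam.Pipeline(options=opts) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://my-bucket/input/*.csv")  # placeholder path
        | "Parse" >> beam.Map(parse)
        | "Write" >> beam.io.WriteToBigQuery(
            "my-project:analytics.transactions",  # placeholder table, assumed to exist
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```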

Posted 2 months ago

Apply

5.0 - 9.0 years

10 - 20 Lacs

Pune

Hybrid

Role & responsibilities Minimum of 5 years of experience in a DevOps, SRE, or Infrastructure Engineering role. • Solid understanding of Terraform and experience maintaining reusable module libraries. • Hands-on experience managing workloads on Kubernetes (preferably GKE). • Working knowledge of CI/CD tools such as GitHub Actions and Helm. • Familiarity with Google Cloud services, including networking, Cloud SQL (Postgres), and container security. • Competence in observability tooling, especially Datadog dashboards and alert configurations. • Strong operational mindset with attention to detail in release processes and deployment integrity. Desirable Experience • Exposure to GitOps tool. • Experience developing or integrating Kubernetes operators. • Familiarity with service-level indicators (SLIs), service-level objectives (SLOs), and structured alerting. Tools and Expectations • Terraform / HCP Terraform - Core to infrastructure provisioning. Required to build, refactor, and maintain reusable infrastructure modules across environments, enforce naming/tagging standards, and leverage state management for drift detection and rollback. • GitHub / GitLab / GitHub Actions - Central to CI/CD workflows. Expected to enforce secure release procedures, set up integration with code quality tools, and prevent direct changes to critical branches. • Helm - Used for Kubernetes application packaging and deployment. Must implement pre/post deployment logic, rollback plans, and chart lifecycle automation. • GKE / Kubernetes - Platform for hosting applications. The engineer must manage node pools, service networking, security contexts, and namespace segmentation. • GCP Services (CloudSQL, VPC, IAM) - Backend for infrastructure workloads.

Posted 2 months ago

Apply

3.0 - 4.0 years

3 - 7 Lacs

Mumbai

Work from Office

Job Summary
We are seeking an experienced and motivated Data Engineer to join our growing team, preferably with experience in the Banking, Financial Services, and Insurance (BFSI) sector. The ideal candidate will have a strong background in designing, building, and maintaining robust and scalable data infrastructure. You will play a crucial role in developing our data ecosystem, ensuring data quality, and empowering data-driven decisions across the organization. This role requires hands-on experience with the Google Cloud Platform (GCP) and a passion for working with cutting-edge data technologies.

Responsibilities
- Design and Develop End-to-End Data Engineering Pipelines: Build and maintain scalable and reliable data pipelines to ingest, process, and transform large volumes of structured and unstructured data from various sources.
- Implement Data Quality and Governance: Establish and enforce processes for data validation, transformation, auditing, and reconciliation to ensure data accuracy, completeness, and consistency.
- Build and Maintain Data Storage Solutions: Design, implement, and manage data vaults and data marts to support business intelligence, analytics, and reporting requirements.
- Orchestrate and Automate Workflows: Utilize workflow management tools to schedule, monitor, and automate complex data workflows and ETL processes.
- Optimize Data Infrastructure: Continuously evaluate and improve the performance, reliability, and cost-effectiveness of our data infrastructure and pipelines.
- Collaborate with Stakeholders: Work closely with data analysts, data scientists, and business stakeholders to understand their data needs and deliver effective data solutions.
- Documentation: Create and maintain comprehensive documentation for data pipelines, processes, and architectures.

Key Skills
- Python: Proficient in Python for data engineering tasks, including scripting, automation, and data manipulation.
- PySpark: Strong experience with PySpark for large-scale data processing and analytics.
- SQL: Expertise in writing complex SQL queries for data extraction, transformation, and analysis.

Tech Stack (Must Have)
Google Cloud Platform (GCP):
- Dataproc: For managing and running Apache Spark and Hadoop clusters.
- Composer (Airflow): For creating, scheduling, and monitoring data workflows.
- Cloud Functions: For event-driven serverless data processing.
- Cloud Run: For deploying and scaling containerized data applications.
- Cloud SQL: For managing relational databases.
- BigQuery: For data warehousing, analytics, and large-scale SQL queries.

Qualifications
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
- 3+ years of proven experience in a Data Engineer role.
- Demonstrable experience with the specified "must-have" tech stack.
- Strong problem-solving skills and the ability to work independently and as part of a team.
- Excellent communication and interpersonal skills.

Good to Have
- Experience in the BFSI (Banking, Financial Services, and Insurance) domain.
- Apache NiFi: Experience with data flow automation and management.
- Qlik: Familiarity with business intelligence and data visualization tools.
- AWS: Knowledge of Amazon Web Services data services.
- DevOps and FinOps: Understanding of DevOps principles and practices (CI/CD, IaC) and cloud financial management (FinOps) to optimize cloud spending.
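The auditing and reconciliation responsibility above is commonly implemented as a row-count comparison between staging and mart tables. Here is a hedged Python sketch against BigQuery; the project, datasets, tables, and date columns are placeholders, and real reconciliation would usually add checksums or column-level comparisons.

```python
from google.cloud import bigquery

bq = bigquery.Client()

def row_count(table: str, date_col: str, day: str) -> int:
    # Count rows for one business date; table and column names are assumed.
    sql = f"SELECT COUNT(*) AS n FROM `{table}` WHERE DATE({date_col}) = '{day}'"
    return next(iter(bq.query(sql).result())).n

src = row_count("my-project.staging.transactions", "ingest_ts", "2024-06-01")
tgt = row_count("my-project.mart.transactions", "load_ts", "2024-06-01")

if src != tgt:
    # Surface the mismatch so the orchestrator marks the run failed.
    raise ValueError(f"reconciliation failed: staging={src}, mart={tgt}")
print("reconciliation ok:", src, "rows")
```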

Posted 2 months ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Chennai

Work from Office

Hands-on experience in data modelling for both OLTP and OLAP systems.
In-depth knowledge of Conceptual, Logical, and Physical data modelling.
Strong understanding of indexing, partitioning, and data sharding, with practical experience (see the sketch below).
Experience in identifying and addressing factors affecting database performance for near-real-time reporting and application interaction.
Proficiency with at least one data modelling tool (preferably DB Schema).
Functional knowledge of the mutual fund industry is a plus.
Familiarity with GCP databases such as AlloyDB, Cloud SQL, and BigQuery.
Willingness to work from the Chennai customer site is mandatory, with five days of on-site work each week.
Mandatory Skills: Cloud-PaaS-GCP-Google Cloud Platform
Experience: 5-8 Years
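For illustration only: a small sketch of the partitioning and clustering choices this role calls out, using the google-cloud-bigquery client; the project, dataset, and table names are hypothetical, and the mutual-fund flavour is just for context.

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

# Hypothetical table: date partitioning prunes scans for near-real-time
# reporting; clustering co-locates rows that are read together.
ddl = """
CREATE TABLE IF NOT EXISTS `my_project.funds.nav_history` (
  fund_id    STRING,
  nav        NUMERIC,
  as_of_date DATE
)
PARTITION BY as_of_date
CLUSTER BY fund_id
"""
client.query(ddl).result()  # block until the DDL job completes
```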

Posted 2 months ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

66degrees is a leading consulting and professional services company specializing in developing AI-focused, data-led solutions that leverage the latest advancements in cloud technology. With unmatched engineering capabilities and vast industry experience, we help the world's leading brands transform their business challenges into opportunities and shape the future of work.
In this role, you will be a senior contractor engaged on a 2.5-month remote assignment with the potential to extend. We are looking for candidates with the required skills who can work independently as well as within a team environment.
Your responsibilities will include facilitating, guiding, and influencing the client and teams towards an effective architectural pattern, acting as an interface between business leadership, technology leadership, and the delivery teams. You will perform Migration Assessments and produce Migration Plans covering Total Cost of Ownership (TCO), migration architecture, migration timelines, and application waves. You will design solution architecture on Google Cloud to support critical workloads, including heterogeneous Oracle migrations to Postgres or Spanner, and design a migration path that accounts for the conversion of application dependencies, database objects, data, data pipelines, orchestration, users, and security.
You will also oversee migration activities and provide troubleshooting support, including translation of DDL and DML, executing data transfers using native Google Cloud and third-party tools, and setting up and configuring the relevant Google Cloud components (a reconciliation sketch follows this listing). Furthermore, you will engage with customer teams as a Google Cloud expert to provide education workshops, architectural recommendations, and technology reviews.
Qualifications:
- 5+ years of experience with data engineering, cloud architecture, or working with data infrastructure.
- 5+ years of Oracle database management and IT experience.
- Experience with Oracle-adjacent products such as GoldenGate and Data Guard.
- 3+ years of PostgreSQL experience.
- Proven experience in performing performance testing and applying remediations to address performance issues.
- Experience in designing data models.
- Proficiency in the Python programming language and SQL.
- Advanced SQL skills, including the ability to write, tune, and interpret SQL queries; tool-specific experience in the database platforms listed above is ideal.
- Proven experience in migrating and/or implementing cloud databases such as Cloud SQL, Spanner, and Bigtable.
Desired Skills:
- Google Cloud Professional Architect and/or Data Engineer certification is preferred.
66degrees is committed to protecting your privacy and handles personal information in accordance with the California Consumer Privacy Act (CCPA).
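For illustration only: a sketch of the kind of post-migration row-count reconciliation an Oracle-to-Postgres migration implies, assuming the python-oracledb and psycopg2 drivers. The table list and connection details are hypothetical placeholders.

```python
import oracledb   # python-oracledb, Oracle's maintained Python driver
import psycopg2

TABLES = ["customers", "accounts", "transactions"]  # hypothetical list

def count_rows(cursor, table: str) -> int:
    cursor.execute(f"SELECT COUNT(*) FROM {table}")
    return cursor.fetchone()[0]

# Connection details are placeholders for illustration.
ora = oracledb.connect(user="app", password="***", dsn="orahost/ORCLPDB1")
pg = psycopg2.connect(host="pghost", dbname="app", user="app", password="***")

with ora.cursor() as oc, pg.cursor() as pc:
    for table in TABLES:
        src, dst = count_rows(oc, table), count_rows(pc, table)
        status = "OK" if src == dst else "MISMATCH"
        print(f"{table}: oracle={src} postgres={dst} [{status}]")

ora.close()
pg.close()
```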

Posted 2 months ago

Apply

5.0 - 9.0 years

9 - 18 Lacs

Bengaluru

Hybrid

Job Description
5+ years of IT experience.
Good understanding of analytics tools for effective analysis of data.
Should be able to lead teams.
Should have been part of a production deployment team and a production support team.
Experience with Big Data tools: Hadoop, Spark, Apache Beam, Kafka, etc.
Experience with object-oriented/functional scripting languages: Python, Java, C++, Scala, etc.
Experience with DW tools such as BigQuery, Redshift, Synapse, or Snowflake.
Experience in ETL and Data Warehousing.
Experience and firm understanding of relational and non-relational databases such as MySQL, MS SQL Server, Postgres, MongoDB, Cassandra, etc.
Experience with cloud platforms such as AWS, GCP, and Azure.
Experience with workflow management using tools like Apache Airflow.
Roles & Responsibilities
Develop high-performance and scalable solutions using GCP that extract, transform, and load big data (a minimal pipeline sketch follows this listing).
Design and build production-grade data solutions from ingestion to consumption using Java / Python.
Design and optimize data models on the GCP cloud using GCP data stores such as BigQuery.
Should be able to handle the deployment process.
Optimize data pipelines for performance and cost for large-scale data lakes.
Write complex, highly optimized queries across large data sets and create data processing layers.
Closely interact with Data Engineers to identify the right tools to deliver product features by performing POCs.
Collaborative team player who interacts with business, BAs, and other Data/ML engineers.
Research new use cases for existing data.
Preferred:
Awareness of design best practices for OLTP and OLAP systems.
Should be part of the team designing the DB and pipelines.
Exposure to load-testing methodologies, debugging pipelines, and delta load handling.
Worked on heterogeneous migration projects.
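For illustration only: a minimal Apache Beam sketch of an ingest-transform-load pipeline of the kind this posting describes. The bucket path, schema, and destination table are hypothetical; running on Dataflow would require the usual `--runner=DataflowRunner` options.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_line(line: str) -> dict:
    # Placeholder parser; a real pipeline would validate the schema.
    user_id, amount = line.split(",")
    return {"user_id": user_id, "amount": float(amount)}

options = PipelineOptions()  # pass runner/project/region flags as needed

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://my-bucket/events/*.csv")  # hypothetical path
        | "Parse" >> beam.Map(parse_line)
        | "Load" >> beam.io.WriteToBigQuery(
            "my_project:analytics.events",  # hypothetical table
            schema="user_id:STRING,amount:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```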

Posted 2 months ago

Apply

4.0 - 9.0 years

10 - 18 Lacs

Chennai

Hybrid

Role & responsibilities
Bachelor's Degree
2+ years in GCP services: BigQuery, Dataflow, Dataproc, Dataplex, Data Fusion, Terraform, Tekton, Cloud SQL, Memorystore for Redis, Airflow, Cloud Storage
2+ years in data transfer utilities
2+ years in Git or any other version control tool
2+ years in Confluent Kafka
1+ years of experience in API development
2+ years in an Agile framework
4+ years of strong experience in Python and PySpark development (a short sketch follows this listing)
4+ years of shell scripting to develop ad-hoc jobs for data importing/exporting
Google Cloud Platform: BigQuery, Dataflow, Dataproc, Data Fusion, Terraform, Tekton, Cloud SQL, Airflow, Postgres, PySpark, Python, API
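For illustration only: a small PySpark sketch of the export-style job this stack suggests, assuming a Dataproc-like environment with the spark-bigquery connector and GCS connector available. The table and bucket names are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-export").getOrCreate()

# Hypothetical source table; requires the spark-bigquery connector jar.
df = (
    spark.read.format("bigquery")
    .option("table", "my_project.sales.orders")
    .load()
)

# Aggregate daily totals before export.
daily = df.groupBy("order_date").agg(F.sum("amount").alias("total"))

# Hypothetical GCS destination for downstream import jobs.
daily.write.mode("overwrite").parquet("gs://my-bucket/exports/daily_totals")
```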

Posted 2 months ago

Apply

8.0 - 13.0 years

17 - 22 Lacs

Bengaluru

Work from Office

Your Impact:
We are looking for an experienced PostgreSQL database administrator who will be responsible for the performance, availability, security, and backup/recovery of clusters of PostgreSQL instances, along with the opportunity to learn to support Oracle and/or MS SQL instances.
What the role offers:
Set up and manage highly available Crunchy Data HA-based PostgreSQL clusters.
Patch, upgrade, and maintain PostgreSQL software.
Implement minimal-downtime database upgrades using various technologies.
Design and implement application-specific data migration solutions for database upgrades to minimize customer impact.
Establish PostgreSQL best practices across various deployments.
Act as a Tech Lead within the team to drive our PostgreSQL delivery roadmap and strategy.
Proactively review database metrics, identify bottlenecks, and tune the database and queries (a monitoring sketch follows this listing).
Configure and customize monitoring configurations for PostgreSQL databases.
Implement backup/recovery strategies with point-in-time restore capability to meet customers' SLAs. Periodically perform data restores to ensure recoverability.
Implement and maintain data replication to a disaster recovery environment and execute a disaster recovery exercise annually.
Automate routine tasks such as software installation, standby database validation, log rotation, and security auditing.
Develop and maintain documented procedures to ensure consistent and effective database operations in the team.
Respond to page-outs as part of the on-call rotation, perform incident recovery and root cause analysis, and identify and implement corrective actions.
Act as a PostgreSQL SME, supporting your peers with expertise, input, and insights as needed.
Support database environments used by customer-facing OpenText applications aligned to our multi-tenant SaaS stack of products.
What you need to succeed:
Bachelor's Degree in Computer Engineering or a related field.
At least 8 years of information technology experience.
3+ years of PostgreSQL operations experience.
Expert skills in setting up and managing PostgreSQL HA environments.
Expert skills in PostgreSQL troubleshooting and performance management.
Expert skills in PostgreSQL backup/recovery.
Strong Unix skills, especially in writing automation scripts for remote execution.
Proficiency in writing and optimizing SQL statements.
Ability to thrive in a fast-paced environment working on projects against strict deadlines.
Experience supporting enterprise-level database environments.
Additional Value-Added Qualifications:
Skills in Oracle database administration.
Skills in MS SQL administration.
Experience with Terraform, Ansible, or other automation technologies.
Experience with GCP Cloud SQL or AWS RDS services.
Strong understanding of ITIL principles; certification is a plus.
Experience with database monitoring through tools such as Nagios, Zabbix, or New Relic.
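For illustration only: a sketch of the kind of proactive replication check this role describes, using psycopg2 to query `pg_stat_replication` on the primary. The host, role, and alert threshold are hypothetical; the connecting role needs monitoring privileges (e.g., pg_monitor) to see full rows.

```python
import psycopg2

LAG_THRESHOLD_BYTES = 16 * 1024 * 1024  # hypothetical 16 MiB alert threshold

# Connection details are placeholders for illustration.
conn = psycopg2.connect(host="pg-primary", dbname="postgres", user="monitor")
with conn.cursor() as cur:
    # Byte lag between the primary's current WAL position and what each
    # standby has replayed (PostgreSQL 10+ column names).
    cur.execute("""
        SELECT application_name,
               pg_wal_lsn_diff(pg_current_wal_lsn(), replay_lsn) AS lag_bytes
        FROM pg_stat_replication
    """)
    for name, lag in cur.fetchall():
        status = "ALERT" if lag and lag > LAG_THRESHOLD_BYTES else "ok"
        print(f"standby={name} lag_bytes={lag} [{status}]")
conn.close()
```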

Posted 2 months ago

Apply

2.0 - 7.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Provision MySQL instances, in both clustered and non-clustered configurations.
Ensure performance, security, and availability of databases.
Work with the various engineering groups to ensure database changes are in line with operational standards and meet the strategies needed to scale.
Data mining and data analysis.
Prepare documentation and specifications.
Handle common database procedures such as upgrade, backup, recovery, and migration.
Profile server resource usage; optimize and tweak as necessary.
Collaborate with other team members and stakeholders.
Skills and Qualifications
Strong experience in writing SQL queries, Cloud SQL, procedures, and functions. (Mandatory)
Experience in administering MySQL replication, configuration, and deployment strategies. (Mandatory; a replication health-check sketch follows this listing)
Should have experience in data modeling and database design.
Experience in scripting preferred (shell, Python, etc.).
Must be highly proficient in all aspects of database administration, including backup/recovery/replication, clustering, advanced performance tuning, and proactive monitoring.
Experience in handling databases as a service when production environments are in the cloud.
Metadata management and repository usage.
Ensuring data integrity; performance management and tuning.
General systems management and networking skills.
Strong knowledge of database architecture design, including data partitioning.
Strong understanding of distributed systems and different levels of data consistency.
Experience in any NoSQL database such as MongoDB is preferred.
Expert knowledge in maintaining, building, supporting, tuning, and monitoring production MySQL database servers.
Understand data locking concepts and the different levels of locking in MySQL.
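For illustration only: a sketch of a replica health probe using mysql-connector-python. The connection details are placeholders, and the statement and column names assume MySQL 8.0.22+ (older servers use `SHOW SLAVE STATUS` with the corresponding legacy column names).

```python
import mysql.connector

# Placeholder connection details for illustration.
conn = mysql.connector.connect(host="mysql-replica", user="monitor", password="***")
cur = conn.cursor(dictionary=True)

cur.execute("SHOW REPLICA STATUS")  # MySQL 8.0.22+
row = cur.fetchone()

if row is None:
    print("Not configured as a replica.")
else:
    io_ok = row["Replica_IO_Running"] == "Yes"    # receiver thread healthy
    sql_ok = row["Replica_SQL_Running"] == "Yes"  # applier thread healthy
    lag = row["Seconds_Behind_Source"]
    print(f"io={io_ok} sql={sql_ok} seconds_behind={lag}")

cur.close()
conn.close()
```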

Posted 2 months ago

Apply