5.0 - 7.0 years
7 - 9 Lacs
Bengaluru
Work from Office
Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Able to analyse data for functional business requirements and interface directly with customers.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise:
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform
- Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work
Preferred technical and professional experience:
- Intuitive individual with an ability to manage change and proven time management
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed
- Up-to-date technical knowledge gained by attending educational workshops and reviewing publications
Posted 1 month ago
5.0 - 7.0 years
7 - 9 Lacs
Gurugram
Work from Office
Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Able to analyse data for functional business requirements and interface directly with customers.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise:
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform
- Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work
Preferred technical and professional experience:
- Intuitive individual with an ability to manage change and proven time management
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed
- Up-to-date technical knowledge gained by attending educational workshops and reviewing publications
Posted 1 month ago
5.0 - 9.0 years
20 - 35 Lacs
Noida, Gurugram, Delhi / NCR
Hybrid
Salary: 20 to 35 LPA | Experience: 5 to 8 years | Location: Gurgaon (Hybrid) | Notice period: immediate to 30 days
Roles and Responsibilities:
- Design, develop, test, deploy, and maintain large-scale data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, Dataproc, and Cloud Storage
- Collaborate with cross-functional teams to identify business requirements and design solutions that meet those needs
- Develop complex SQL queries to extract insights from large datasets stored in Google Cloud SQL databases (see the sketch below)
- Troubleshoot issues related to data processing workflows and provide timely resolutions
Desired Candidate Profile:
- 5-9 years of experience in data engineering, with expertise in GCP and BigQuery
- Strong understanding of GCP platform administration, including Compute Engine (Dataproc), Kubernetes Engine (K8s), Cloud Storage, Cloud SQL, etc.
- Experience on big data analytics projects involving ETL processes using tools like Airflow or similar technologies
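As a hedged illustration of the "complex SQL queries" bullet above, here is a minimal sketch using the google-cloud-bigquery Python client; the project, dataset, and column names are placeholders, not this employer's schema.

```python
# Minimal sketch of an analytical query run through the BigQuery client.
# "my-gcp-project", "analytics.events", and the columns are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # placeholder project ID

query = """
    SELECT user_id, COUNT(*) AS events, MAX(event_ts) AS last_seen
    FROM `my-gcp-project.analytics.events`
    WHERE event_ts >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
    GROUP BY user_id
    ORDER BY events DESC
    LIMIT 100
"""

# query() submits the job; result() blocks until it finishes, then iterates rows.
for row in client.query(query).result():
    print(row.user_id, row.events, row.last_seen)
```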
Posted 1 month ago
4.0 - 8.0 years
20 - 35 Lacs
Noida, Gurugram, Delhi / NCR
Hybrid
Salary: 20 to 35 LPA | Experience: 4 to 8 years | Location: Gurgaon | Notice period: immediate to 30 days
Key Skills: GCP, Cloud, Pub/Sub, Data Engineer
Posted 1 month ago
7.0 - 10.0 years
15 - 30 Lacs
Pune
Hybrid
Looking for 7–10 yrs exp (4+ in data modeling, 2–3 in Data Vault 2.0). Must know DBT, Dagster/Airflow, GCP (BigQuery, CloudSQL), and data modeling. DV 2.0 hands-on is a must. Docker is a plus.
Posted 1 month ago
5.0 - 8.0 years
20 - 30 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
We are seeking a highly skilled and experienced Senior Cloud Native Developer to join our team and drive the design, development, and delivery of cutting-edge cloud-based solutions on Google Cloud Platform (GCP). This role emphasizes technical expertise, best practices in cloud-native development, and a proactive approach to implementing scalable and secure cloud solutions.
Responsibilities:
- Design, develop, and deploy cloud-based solutions using GCP, adhering to architecture standards and best practices
- Code and implement Java applications using GCP native services like GKE, Cloud Run, Functions, Firestore, Cloud SQL, and Pub/Sub (see the sketch below)
- Select appropriate GCP services to address functional and non-functional requirements
- Demonstrate deep expertise in GCP PaaS, serverless, and database services
- Ensure compliance with security and regulatory standards across all cloud solutions
- Optimize cloud-based solutions to enhance performance, scalability, and cost-efficiency
- Stay updated on emerging cloud technologies and trends in the industry
- Collaborate with cross-functional teams to architect and deliver successful cloud implementations
- Leverage foundational knowledge of GCP AI services, including Vertex AI, Code Bison, and Gemini models, when applicable
Requirements:
- 5+ years of extensive experience in designing, implementing, and maintaining applications on GCP
- Comprehensive expertise in GCP services, including GKE, Cloud Run, Functions, Firestore, Firebase, and Cloud SQL
- Knowledge of advanced GCP services such as Apigee, Spanner, Memorystore, Service Mesh, Gemini Code Assist, Vertex AI, and Cloud Monitoring
- Solid understanding of cloud security best practices and expertise in implementing security controls in GCP
- Proficiency in cloud architecture principles and best practices, with a focus on scalable and reliable solutions
- Experience with automation and configuration management tools, particularly Terraform, along with a strong grasp of DevOps principles
- Familiarity with front-end technologies like Angular or React
Nice to have:
- Familiarity with GCP GenAI solutions and models, including Vertex AI, Codebison, and Gemini models
- Background in working with front-end frameworks and technologies to complement back-end cloud development
- Capability to design end-to-end solutions integrating modern AI and cloud technologies
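The role itself is Java-focused; purely as an illustration, and to keep all sketches on this page in one language, here is a minimal Python sketch of publishing to the Pub/Sub service named above. The project and topic names are placeholders.

```python
# Hedged sketch: publishing one message to a Pub/Sub topic.
# "my-gcp-project" and "orders" are hypothetical names.
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-gcp-project", "orders")

# publish() is asynchronous; extra keyword args become message attributes.
future = publisher.publish(topic_path, b'{"order_id": 42}', source="checkout")
print("Published message ID:", future.result())  # blocks until the server acks
```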
Posted 1 month ago
4.0 - 9.0 years
10 - 15 Lacs
Pune
Work from Office
DevOps Engineer (Google Cloud Platform)
About Us: IntelligentDX is a dynamic and innovative company dedicated to changing the software landscape in the healthcare industry. We are looking for a talented and experienced DevOps Engineer to join our growing team and help us build and maintain our scalable, reliable, and secure cloud infrastructure on Google Cloud Platform.
Job Summary: We are seeking a highly skilled DevOps Engineer with 4 years of hands-on experience, specifically with Google Cloud technologies. The ideal candidate will be responsible for designing, implementing, and maintaining our cloud infrastructure, ensuring the scalability, reliability, and security of our microservices-based software services. You will play a crucial role in automating our development and deployment pipelines, managing cloud resources, and supporting our engineering teams in delivering high-quality applications.
Responsibilities:
- Design, implement, and manage robust, scalable, and secure cloud infrastructure on Google Cloud Platform (GCP)
- Implement and enforce best practices for GCP Identity and Access Management (IAM) to ensure secure access control
- Deploy, manage, and optimize applications leveraging Google Cloud Run for serverless deployments
- Configure and maintain Google Cloud API Gateway for efficient and secure API management
- Implement and monitor security measures across our GCP environment, including network security, data encryption, and vulnerability management
- Manage and optimize cloud-based databases, primarily Google Cloud SQL, ensuring data integrity, performance, and reliability
- Lead the setup and implementation of new applications and services within our GCP environment
- Troubleshoot and resolve issues related to Cross-Origin Resource Sharing (CORS) configurations and other API connectivity problems (see the sketch after this listing)
- Provide ongoing API support to development teams, ensuring smooth integration and operation
- Continuously improve the scalability and reliability of our software services, which are built as microservices
- Develop and maintain CI/CD pipelines to automate software delivery and infrastructure provisioning
- Monitor system performance, identify bottlenecks, and implement solutions to optimize resource utilization
- Collaborate closely with development, QA, and product teams to ensure seamless deployment and operation of applications
- Participate in on-call rotations to provide timely support for critical production issues
Qualifications - Required Skills & Experience:
- Minimum of 4 years of hands-on experience as a DevOps Engineer with a strong focus on Google Cloud Platform (GCP)
- Proven expertise in GCP services, including:
  - GCP IAM: strong understanding of roles, permissions, service accounts, and best practices
  - Cloud Run: experience deploying and managing containerized applications
  - API Gateway: experience setting up and managing APIs
  - Security: solid understanding of cloud security principles, network security (VPCs, firewall rules), and data protection
  - Cloud SQL: hands-on experience with database setup, management, and optimization
- Demonstrated experience with the setup and implementation of cloud-native applications
- Familiarity with addressing and resolving CORS issues
- Experience providing API support and ensuring API reliability
- Deep understanding of microservices architecture and best practices for their deployment and management
- Strong commitment to building scalable and reliable software services
- Proficiency in scripting languages (e.g., Python, Bash) and automation tools
- Experience with Infrastructure as Code (IaC) tools (e.g., Terraform, Cloud Deployment Manager)
- Familiarity with containerization technologies (e.g., Docker, Kubernetes)
- Excellent problem-solving skills and a proactive approach to identifying and resolving issues
- Strong communication and collaboration abilities
Preferred Qualifications:
- GCP certification (e.g., Professional Cloud DevOps Engineer, Professional Cloud Architect)
- Experience with monitoring and logging tools (e.g., Cloud Monitoring, Cloud Logging, Prometheus, Grafana)
- Knowledge of other cloud platforms (AWS, Azure) is a plus
- Experience with Git and CI/CD platforms (e.g., GitLab CI, Jenkins, Cloud Build)
What We Offer:
- Health insurance, paid time off, and professional development opportunities
- Fun working environment
- Flattened hierarchy, where everyone has a say
- Free snacks, games, and happy hour outings
If you are a passionate DevOps Engineer with a proven track record of building and managing robust systems on Google Cloud Platform, we encourage you to apply!
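Since the role calls out troubleshooting CORS issues, here is a hedged sketch of one common application-layer fix using the flask-cors package; the origin and route are illustrative assumptions, and real deployments may instead configure CORS at API Gateway or the load balancer.

```python
# Hedged sketch: allowing a known frontend origin on the API routes.
from flask import Flask, jsonify
from flask_cors import CORS  # pip install flask-cors

app = Flask(__name__)
# Restrict cross-origin access to /api/ routes from one trusted origin.
# A wildcard "*" is a common debugging fallback, not a production setting.
CORS(app, resources={r"/api/*": {"origins": "https://app.example.com"}})

@app.route("/api/health")
def health():
    # Responses on matched routes now carry Access-Control-Allow-Origin.
    return jsonify(status="ok")

if __name__ == "__main__":
    app.run(port=8080)
```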
Posted 1 month ago
5.0 - 7.0 years
13 - 17 Lacs
Bengaluru
Work from Office
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat.
In your role, you will be responsible for:
- Multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer, etc.
- Python and SQL work experience is a must; proactive, collaborative, and able to respond to critical situations
- Analysing data for functional business requirements and interfacing directly with customers
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise:
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform
- Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies
- End-to-end functional knowledge of the data pipeline/transformation implementations you have delivered, including the purpose and KPIs the transformations served
Preferred technical and professional experience:
- Experience with AEM core technologies: OSGi services, Apache Sling, Granite framework, Java Content Repository API, Java 8+, localization
- Familiarity with build tools such as Jenkins and Maven; knowledge of version control tools, especially Git
- Knowledge of patterns and good practices for designing and developing quality, clean code; knowledge of HTML, CSS, JavaScript, and jQuery
- Familiarity with task management, bug tracking, and collaboration tools like JIRA and Confluence
Posted 1 month ago
5.0 - 7.0 years
13 - 17 Lacs
Bengaluru
Work from Office
Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Able to analyse data for functional business requirements and interface directly with customers.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise:
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform
- Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work
Preferred technical and professional experience:
- Intuitive individual with an ability to manage change and proven time management
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed
- Up-to-date technical knowledge gained by attending educational workshops and reviewing publications
Posted 1 month ago
8.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
What you'll be doing:
- Assist in developing machine learning models based on project requirements
- Work with datasets by preprocessing, selecting appropriate data representations, and ensuring data quality
- Perform statistical analysis and fine-tuning using test results
- Support training and retraining of ML systems as needed
- Help build data pipelines for collecting and processing data efficiently
- Follow coding and quality standards while developing AI/ML solutions
- Contribute to frameworks that help operationalize AI models
What we seek in you:
- 8+ years of experience in the IT industry
- Strong programming skills in languages like Python
- Hands-on experience with one cloud (GCP preferred)
- Experience working with Docker
- Environment management (e.g., venv, pip, poetry)
- Experience with orchestrators like Vertex AI Pipelines, Airflow, etc. (see the sketch after this listing)
- Understanding of the full ML cycle end to end
- Data engineering and feature engineering techniques
- Experience with ML modelling and evaluation metrics
- Experience with TensorFlow, PyTorch, or another framework
- Experience with model monitoring
- Advanced SQL knowledge
- Awareness of streaming concepts like windowing, late arrival, triggers, etc.
Tech stack:
- Storage: Cloud SQL, Cloud Storage, Cloud Bigtable, BigQuery, Cloud Spanner, Cloud Datastore, vector databases
- Ingest: Pub/Sub, Cloud Functions, App Engine, Kubernetes Engine, Kafka, microservices
- Schedule: Cloud Composer, Airflow
- Processing: Cloud Dataproc, Cloud Dataflow, Apache Spark, Apache Flink
- CI/CD: Bitbucket + Jenkins / GitLab; Infrastructure as Code: Terraform
Life at Next: At our core, we're driven by the mission of tailoring growth for our customers by enabling them to transform their aspirations into tangible outcomes. We're dedicated to empowering them to shape their futures and achieve ambitious goals. To fulfil this commitment, we foster a culture defined by agility, innovation, and an unwavering commitment to progress. Our organizational framework is both streamlined and vibrant, characterized by a hands-on leadership style that prioritizes results and fosters growth.
Perks of working with us:
- Clear objectives to ensure alignment with our mission, fostering your meaningful contribution
- Abundant opportunities for engagement with customers, product managers, and leadership
- Guidance along progressive paths, with insightful feedback from managers through ongoing feedforward sessions
- Cultivate and leverage robust connections within diverse communities of interest
- Choose your mentor to navigate your current endeavors and steer your future trajectory
- Continuous learning and upskilling opportunities through Nexversity
- Flexibility to explore various functions, develop new skills, and adapt to emerging technologies
- A hybrid work model promoting work-life balance
- Comprehensive family health insurance coverage, prioritizing the well-being of your loved ones
- Accelerated career paths to actualize your professional aspirations
Who we are: We enable high-growth enterprises to build hyper-personalized solutions that transform their vision into reality. With a keen eye for detail, we apply creativity, embrace new technology, and harness the power of data and AI to co-create tailor-made solutions that meet our customers' unique needs. Join our passionate team and tailor your growth with us!
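As a hedged sketch of the Vertex AI Pipelines / orchestration bullet above, here is a minimal Kubeflow Pipelines (kfp v2) definition; the component bodies, names, and the GCS path are illustrative placeholders, not this team's actual workflow.

```python
# Hedged sketch of a two-step training pipeline compiled for Vertex AI.
from kfp import dsl, compiler

@dsl.component
def preprocess(raw_path: str) -> str:
    # Placeholder: real code would read raw_path, clean features, write output.
    return raw_path + "/clean"

@dsl.component
def train(data_path: str) -> str:
    # Placeholder for model training; returns a model artifact location.
    return data_path + "/model"

@dsl.pipeline(name="toy-training-pipeline")
def pipeline(raw_path: str = "gs://my-bucket/raw"):  # hypothetical bucket
    cleaned = preprocess(raw_path=raw_path)
    train(data_path=cleaned.output)  # wires step 1's output into step 2

# The compiled JSON spec can then be submitted as a Vertex AI pipeline run.
compiler.Compiler().compile(pipeline_func=pipeline, package_path="pipeline.json")
```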
Posted 1 month ago
3.0 - 5.0 years
5 - 8 Lacs
Bengaluru
Work from Office
What you'll be doing:
- Assist in developing machine learning models based on project requirements
- Work with datasets by preprocessing, selecting appropriate data representations, and ensuring data quality
- Perform statistical analysis and fine-tuning using test results
- Support training and retraining of ML systems as needed
- Help build data pipelines for collecting and processing data efficiently
- Follow coding and quality standards while developing AI/ML solutions
- Contribute to frameworks that help operationalize AI models
What we seek in you:
- Strong programming skills in languages like Python and Java
- Hands-on experience with one cloud (GCP preferred)
- Experience working with Docker
- Environment management (e.g., venv, pip, poetry)
- Experience with orchestrators like Vertex AI Pipelines, Airflow, etc.
- Understanding of the full ML cycle end to end
- Data engineering and feature engineering techniques
- Experience with ML modelling and evaluation metrics
- Experience with TensorFlow, PyTorch, or another framework
- Experience with model monitoring
- Advanced SQL knowledge
- Awareness of streaming concepts like windowing, late arrival, triggers, etc.
Tech stack:
- Storage: Cloud SQL, Cloud Storage, Cloud Bigtable, BigQuery, Cloud Spanner, Cloud Datastore, vector databases
- Ingest: Pub/Sub, Cloud Functions, App Engine, Kubernetes Engine, Kafka, microservices
- Schedule: Cloud Composer, Airflow
- Processing: Cloud Dataproc, Cloud Dataflow, Apache Spark, Apache Flink
- CI/CD: Bitbucket + Jenkins / GitLab; Infrastructure as Code: Terraform
Life at Next: At our core, we're driven by the mission of tailoring growth for our customers by enabling them to transform their aspirations into tangible outcomes. We're dedicated to empowering them to shape their futures and achieve ambitious goals. To fulfil this commitment, we foster a culture defined by agility, innovation, and an unwavering commitment to progress. Our organizational framework is both streamlined and vibrant, characterized by a hands-on leadership style that prioritizes results and fosters growth.
Perks of working with us:
- Clear objectives to ensure alignment with our mission, fostering your meaningful contribution
- Abundant opportunities for engagement with customers, product managers, and leadership
- Guidance along progressive paths, with insightful feedback from managers through ongoing feedforward sessions
- Cultivate and leverage robust connections within diverse communities of interest
- Choose your mentor to navigate your current endeavors and steer your future trajectory
- Continuous learning and upskilling opportunities through Nexversity
- Flexibility to explore various functions, develop new skills, and adapt to emerging technologies
- A hybrid work model promoting work-life balance
- Comprehensive family health insurance coverage, prioritizing the well-being of your loved ones
- Accelerated career paths to actualize your professional aspirations
Who we are: We enable high-growth enterprises to build hyper-personalized solutions that transform their vision into reality. With a keen eye for detail, we apply creativity, embrace new technology, and harness the power of data and AI to co-create tailor-made solutions that meet our customers' unique needs. Join our passionate team and tailor your growth with us!
Posted 1 month ago
8.0 - 12.0 years
50 - 65 Lacs
Bengaluru
Work from Office
Job Title: Staff Engineer, Gen-AI
Experience: 8 to 10 years
CTC: 50.00 LPA to 65.00 LPA
Location: Bengaluru/Bangalore
Job Description:
- Build Gen-AI native products: architect, build, and ship platforms powered by LLMs, agents, and predictive AI
- Stay hands-on: design systems, write code, debug, and drive product excellence
- Lead with depth: mentor a high-caliber team of full stack engineers
- Speed to market: rapidly ship and iterate on MVPs to maximize learning and feedback
- Own the full stack: from backend data pipelines to intuitive UIs, from Airflow to React, from BigQuery to embeddings
- Scale what works: ensure scalability, security, and performance in multi-tenant, cloud-native environments (GCP)
- Collaborate deeply: work closely with product, growth, and leadership to align tech with business priorities
What You Bring:
- 8+ years of experience building and scaling full-stack, data-driven products
- Proficiency in backend (Node.js, Python) and frontend (React), with solid GCP experience
- Strong grasp of data pipelines, analytics, and real-time data processing
- Familiarity with Gen-AI frameworks (LangChain, LlamaIndex, OpenAI APIs, vector databases; see the sketch below)
- Proven architectural leadership and technical ownership
- Product mindset with a bias for execution and iteration
Our Tech Stack:
- Cloud: Google Cloud Platform
- Backend: Node.js, Python, Airflow
- Data: BigQuery, Cloud SQL
- AI/ML: TensorFlow, OpenAI APIs, custom agents
- Frontend: React.js
Interested professionals can share their resume at harshita.g@recex.co
Thanks & Regards, Harshita, Recex
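As a hedged illustration of the embeddings work mentioned above, a minimal OpenAI embeddings call follows; the model name and input text are assumptions, and OPENAI_API_KEY is assumed to be set in the environment.

```python
# Hedged sketch: turning text into a vector for storage in a vector database.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
resp = client.embeddings.create(
    model="text-embedding-3-small",  # illustrative model choice
    input="GCP-native, multi-tenant analytics platform",
)

vector = resp.data[0].embedding  # a list of floats, ready for a vector DB
print("embedding dimensions:", len(vector))
```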
Posted 1 month ago
12.0 - 15.0 years
12 - 15 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: Data Modeler
Key Responsibilities - as a Data Modeler, you will:
- Data model design: perform hands-on data modeling for both OLTP (Online Transaction Processing) and OLAP (Online Analytical Processing) systems, covering Conceptual, Logical, and Physical data modeling
- Database performance optimization: apply a strong understanding of indexing, partitioning, and data sharding, with practical experience, to optimize database performance, especially for near-real-time reporting and application interaction
- Tool utilization: work with at least one data modeling tool, preferably DBSchema or Erwin
- GCP database understanding: leverage a good understanding of GCP databases like AlloyDB, Cloud SQL, and BigQuery
- Collaboration: work with teams to understand requirements and translate them into data models
Mandatory Skills & Experience:
- Data modeling: hands-on experience in data modeling for OLTP and OLAP systems
- Modeling concepts: in-depth knowledge of Conceptual, Logical, and Physical data modeling
- Database performance: strong understanding of indexing, partitioning, and data sharding, with practical experience (see the sketch below)
- Performance variables: strong understanding of the variables impacting database performance for near-real-time reporting and application interaction
- Modeling tools: working experience with at least one data modeling tool, preferably DBSchema or Erwin
- GCP databases: good understanding of GCP databases like AlloyDB, Cloud SQL, and BigQuery
- Demonstrated experience applying database optimization techniques (indexing, partitioning, sharding)
Domain knowledge (a plus): functional knowledge of the mutual fund industry.
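As a hedged sketch of the partitioning and clustering decisions described above, here is illustrative BigQuery DDL issued through the Python client; the project, dataset, and columns are hypothetical.

```python
# Hedged sketch: a partitioned, clustered fact table for near-real-time reporting.
from google.cloud import bigquery

client = bigquery.Client()
ddl = """
    CREATE TABLE IF NOT EXISTS `my-project.marts.fund_transactions` (
      txn_id STRING,
      fund_id STRING,
      txn_ts TIMESTAMP,
      amount NUMERIC
    )
    PARTITION BY DATE(txn_ts)  -- prunes scans for date-bounded queries
    CLUSTER BY fund_id         -- co-locates rows on the common filter key
"""
client.query(ddl).result()  # waits for the DDL job to complete
```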
Posted 1 month ago
8.0 - 10.0 years
10 - 12 Lacs
Bengaluru
Work from Office
We are currently seeking a Google Kubernetes Engine Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).
Job Title / Role: GCP & GKE Staff Engineer
NTT DATA Services strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now.
Job Description:
Primary skill: Cloud Infrastructure - Google Cloud Platform. Minimum work experience: 8+ years. Total experience: 8+ years.
Must have GCP Solution Architect Certification & GKE.
Mandatory Skills - Technical Qualification/Knowledge:
- Expertise in assessing, designing, and implementing GCP solutions, including compute, network, storage, identity, security, DR/business continuity strategy, migration, templates, cost optimization, PowerShell, Terraform, Ansible, etc.
- Must have GCP Solution Architect Certification
- Prior experience executing large, complex cloud transformation programs, including discovery, assessment, business case creation, design, build, migration planning, and migration execution
- Prior experience using industry-leading or native discovery, assessment, and migration tools
- Good knowledge of cloud technology, different patterns, deployment methods, and application compatibility
- Good knowledge of GCP technologies and associated components and variations:
  - Anthos Application Platform
  - Compute Engine, Compute Engine Managed Instance Groups, Kubernetes
  - Cloud Storage, Cloud Storage for Firebase, Persistent Disk, Local SSD, Filestore, Transfer Service
  - Virtual Private Cloud (VPC), Cloud DNS, Cloud Interconnect, Cloud VPN Gateway, Network Load Balancing, global load balancing, firewall rules, Cloud Armor
  - Cloud IAM, Resource Manager, Multi-factor Authentication, Cloud KMS
  - Cloud Billing, Cloud Console, Stackdriver
  - Cloud SQL, Cloud Spanner, Cloud Bigtable
  - Cloud Run container services, Kubernetes Engine (GKE), Anthos Service Mesh, Cloud Functions, PowerShell on GCP
- Solid understanding of and experience in cloud-computing services architecture, technical design, and implementations, including IaaS, PaaS, and SaaS
- Design of clients' cloud environments, focused mainly on GCP, demonstrating technical cloud architectural knowledge
- Play a vital role in the design of production, staging, QA, and development cloud infrastructures running in 24x7 environments
- Deliver customer cloud strategies aligned with customers' business objectives, with a focus on cloud migrations and DR strategies
- Nurture cloud computing expertise internally and externally to drive cloud adoption
- Deep understanding of the IaaS and PaaS services offered on cloud platforms, and how to use them together to build complex solutions
- Ensure that all cloud solutions follow security and compliance controls, including data sovereignty
- Deliver cloud platform architecture documents detailing the vision for how GCP infrastructure and platform services support the overall application architecture, and interact with application, database, and testing teams to give the customer a holistic view
- Collaborate with application architects and DevOps to modernize infrastructure-as-a-service (IaaS) applications to platform-as-a-service (PaaS)
- Create solutions that support a DevOps approach for delivery and operations of services
- Interact with and advise business representatives of the application regarding functional and non-functional requirements
- Create proof-of-concepts to demonstrate viability of solutions under consideration
- Develop enterprise-level conceptual solutions and sponsor consensus/approval for global applications
- Working knowledge of other architecture disciplines, including application, database, infrastructure, and enterprise architecture
- Identify and implement best practices, tools, and standards
- Provide consultative support to the DevOps team for production incidents
- Drive and support system reliability, availability, scale, and performance activities
- Evangelize cloud automation and be a thought leader and expert defining standards for building and maintaining cloud platforms
- Knowledgeable about configuration management tools such as Chef/Puppet/Ansible
- Automation skills using CLI scripting in any language (bash, perl, python, ruby, etc.)
- Ability to develop a robust design to meet customer business requirements with scalability, availability, performance, and cost-effectiveness using GCP offerings
- Ability to identify and gather requirements to define an architectural solution that can be successfully built and operated on GCP
- Ability to conclude high-level and low-level design for the GCP platform, which may also include data center design as necessary
- Ability to provide GCP operations and deployment guidance and best practices throughout the lifecycle of a project
- Understanding of the significance of the different monitoring metrics and their threshold values, and the corrective measures to take when thresholds are breached
- Knowledge of automation to reduce incident volume, or repetitive incidents, is preferred
- Good knowledge of cloud center operations, monitoring tools, and backup solutions
GKE (a Python sketch follows this listing):
- Set up monitoring and logging to troubleshoot a cluster or debug a containerized application
- Manage Kubernetes objects: declarative and imperative paradigms for interacting with the Kubernetes API
- Manage Secrets: managing confidential settings data using Secrets
- Configure load balancing, port forwarding, or firewall or DNS configurations to access applications in a cluster
- Configure networking for your cluster
- Hands-on experience with Terraform; ability to write reusable Terraform modules
- Hands-on Python and Unix shell scripting is required
- Understanding of CI/CD pipelines in a globally distributed environment using Git, Artifactory, Jenkins, and Docker registry
- Experience with GCP services and writing Cloud Functions
- Hands-on experience deploying and managing Kubernetes infrastructure with Terraform Enterprise; ability to write reusable Terraform modules
- Certified Kubernetes Administrator (CKA) and/or Certified Kubernetes Application Developer (CKAD) is a plus
- Experience using Docker within container orchestration platforms such as GKE
- Knowledge of setting up Splunk
- Knowledge of Spark on GKE
Certification: GCP Solution Architect & GKE
Process/Quality Knowledge:
- Must have clear knowledge of ITIL-based service delivery; ITIL certification is desired
- Knowledge of quality and security processes
Soft Skills:
- Excellent communication skills and the ability to work directly with global customers
- Strong technical leadership skills to drive solutions
- Focus on quality, cost, and timeliness of deliverables; timely and accurate communication
- Demonstrated ownership of technical issues, engaging the right stakeholders for timely resolution
- Flexibility to learn and lead other technology areas, such as other public cloud technologies, private cloud, and automation
- Good reporting skills
- Willingness to work in different time zones as per project requirements
- Good attitude for working in a team and as an individual contributor, depending on the project and situation
- Focused, result-oriented, and self-motivated
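As a hedged sketch of the scripted cluster inspection these GKE bullets describe, here is a minimal example with the official Python kubernetes client; the namespace and secret name are placeholders, and credentials are assumed to come from a local kubeconfig.

```python
# Hedged sketch: listing pods and reading a Secret in a GKE cluster.
from kubernetes import client, config

# Assumes credentials exist locally, e.g. after
# `gcloud container clusters get-credentials <cluster>`.
config.load_kube_config()
v1 = client.CoreV1Api()

for pod in v1.list_namespaced_pod(namespace="default").items:
    print(pod.metadata.name, pod.status.phase)

# Secret values are base64-encoded in the API; decode before use.
secret = v1.read_namespaced_secret(name="app-config", namespace="default")  # hypothetical name
print("secret keys:", sorted(secret.data or {}))
```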
Posted 1 month ago
8.0 - 13.0 years
30 - 35 Lacs
Noida
Work from Office
Must have GCP Solution Architect Certification & GKE.
Mandatory Skills - Technical Qualification/Knowledge:
- Expertise in assessing, designing, and implementing GCP solutions, including compute, network, storage, identity, security, DR/business continuity strategy, migration, templates, cost optimization, PowerShell, Terraform, Ansible, etc.
- Must have GCP Solution Architect Certification
- Prior experience executing large, complex cloud transformation programs, including discovery, assessment, business case creation, design, build, migration planning, and migration execution
- Prior experience using industry-leading or native discovery, assessment, and migration tools
- Good knowledge of cloud technology, different patterns, deployment methods, and application compatibility
- Good knowledge of GCP technologies and associated components and variations:
  - Anthos Application Platform
  - Compute Engine, Compute Engine Managed Instance Groups, Kubernetes
  - Cloud Storage, Cloud Storage for Firebase, Persistent Disk, Local SSD, Filestore, Transfer Service
  - Virtual Private Cloud (VPC), Cloud DNS, Cloud Interconnect, Cloud VPN Gateway, Network Load Balancing, global load balancing, firewall rules, Cloud Armor
  - Cloud IAM, Resource Manager, Multi-factor Authentication, Cloud KMS
  - Cloud Billing, Cloud Console, Stackdriver
  - Cloud SQL, Cloud Spanner, Cloud Bigtable
  - Cloud Run container services, Kubernetes Engine (GKE), Anthos Service Mesh, Cloud Functions, PowerShell on GCP
- Solid understanding of and experience in cloud-computing services architecture, technical design, and implementations, including IaaS, PaaS, and SaaS
- Design of clients' cloud environments, focused mainly on GCP, demonstrating technical cloud architectural knowledge
- Play a vital role in the design of production, staging, QA, and development cloud infrastructures running in 24x7 environments
- Deliver customer cloud strategies aligned with customers' business objectives, with a focus on cloud migrations and DR strategies
- Nurture cloud computing expertise internally and externally to drive cloud adoption
- Deep understanding of the IaaS and PaaS services offered on cloud platforms, and how to use them together to build complex solutions
- Ensure that all cloud solutions follow security and compliance controls, including data sovereignty
- Deliver cloud platform architecture documents detailing the vision for how GCP infrastructure and platform services support the overall application architecture, and interact with application, database, and testing teams to give the customer a holistic view
- Collaborate with application architects and DevOps to modernize infrastructure-as-a-service (IaaS) applications to platform-as-a-service (PaaS)
- Create solutions that support a DevOps approach for delivery and operations of services
- Interact with and advise business representatives of the application regarding functional and non-functional requirements
- Create proof-of-concepts to demonstrate viability of solutions under consideration
- Develop enterprise-level conceptual solutions and sponsor consensus/approval for global applications
- Working knowledge of other architecture disciplines, including application, database, infrastructure, and enterprise architecture
- Identify and implement best practices, tools, and standards
- Provide consultative support to the DevOps team for production incidents
- Drive and support system reliability, availability, scale, and performance activities
- Evangelize cloud automation and be a thought leader and expert defining standards for building and maintaining cloud platforms
- Knowledgeable about configuration management tools such as Chef/Puppet/Ansible
- Automation skills using CLI scripting in any language (bash, perl, python, ruby, etc.)
- Ability to develop a robust design to meet customer business requirements with scalability, availability, performance, and cost-effectiveness using GCP offerings
- Ability to identify and gather requirements to define an architectural solution that can be successfully built and operated on GCP
- Ability to conclude high-level and low-level design for the GCP platform, which may also include data center design as necessary
- Ability to provide GCP operations and deployment guidance and best practices throughout the lifecycle of a project
- Understanding of the significance of the different monitoring metrics and their threshold values, and the corrective measures to take when thresholds are breached
- Knowledge of automation to reduce incident volume, or repetitive incidents, is preferred
- Good knowledge of cloud center operations, monitoring tools, and backup solutions
GKE:
- Set up monitoring and logging to troubleshoot a cluster or debug a containerized application
- Manage Kubernetes objects: declarative and imperative paradigms for interacting with the Kubernetes API
- Manage Secrets: managing confidential settings data using Secrets
- Configure load balancing, port forwarding, or firewall or DNS configurations to access applications in a cluster
- Configure networking for your cluster
- Hands-on experience with Terraform; ability to write reusable Terraform modules
- Hands-on Python and Unix shell scripting is required
- Understanding of CI/CD pipelines in a globally distributed environment using Git, Artifactory, Jenkins, and Docker registry
- Experience with GCP services and writing Cloud Functions
- Hands-on experience deploying and managing Kubernetes infrastructure with Terraform Enterprise; ability to write reusable Terraform modules
- Certified Kubernetes Administrator (CKA) and/or Certified Kubernetes Application Developer (CKAD) is a plus
- Experience using Docker within container orchestration platforms such as GKE
- Knowledge of setting up Splunk
- Knowledge of Spark on GKE
Certification: GCP Solution Architect & GKE
Process/Quality Knowledge:
- Must have clear knowledge of ITIL-based service delivery; ITIL certification is desired
- Knowledge of quality and security processes
Posted 1 month ago
7.0 - 9.0 years
8 - 15 Lacs
Hyderabad
Hybrid
Role & Responsibilities
Role Overview: We are seeking a talented and forward-thinking Data Engineer for one of the large financial services GCCs, based in Hyderabad. Responsibilities include designing and constructing data pipelines, integrating data from multiple sources, developing scalable data solutions, optimizing data workflows, collaborating with cross-functional teams, implementing data governance practices, and ensuring data security and compliance.
Technical Requirements:
- Proficiency in ETL, batch, and streaming processing (see the pipeline sketch after this listing)
- Experience with BigQuery, Cloud Storage, and Cloud SQL
- Strong programming skills in Python, SQL, and Apache Beam for data processing
- Understanding of data modeling and schema design for analytics
- Knowledge of data governance, security, and compliance in GCP
- Familiarity with machine learning workflows and integration with GCP ML tools
- Ability to optimize performance within data pipelines
Functional Requirements:
- Ability to collaborate with data operations, software engineers, data scientists, and business SMEs to develop data product features
- Experience in leading and mentoring peers within an existing development team
- Strong communication skills to craft and communicate robust solutions
- Proficient in working with engineering leads, enterprise and data architects, and business architects to build appropriate data foundations
- Willingness to work on contemporary data architecture in public and private cloud environments
This role offers a compelling opportunity for a seasoned Data Engineer to drive transformative cloud initiatives within the financial sector, leveraging experience and expertise to deliver innovative cloud solutions that align with business imperatives and regulatory requirements.
Qualification: Engineering graduate / postgraduate
Criteria:
- Proficient in ETL, Python, and Apache Beam for data processing efficiency
- Demonstrated expertise in BigQuery, Cloud Storage, and Cloud SQL utilization
- Strong collaboration skills with cross-functional teams for data product development
- Comprehensive knowledge of data governance, security, and compliance in GCP
- Experienced in optimizing performance within data pipelines for efficiency
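Since the posting names Apache Beam, Pub/Sub, and BigQuery explicitly, here is a minimal streaming sketch of that path; the topic, table, and schema are illustrative assumptions, not the client's actual pipeline.

```python
# Hedged sketch: Pub/Sub -> parse JSON -> BigQuery, as a streaming Beam job.
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# streaming=True is required for the unbounded Pub/Sub source.
opts = PipelineOptions(streaming=True, project="my-gcp-project")  # placeholder

with beam.Pipeline(options=opts) as p:
    (
        p
        | "Read" >> beam.io.ReadFromPubSub(topic="projects/my-gcp-project/topics/txns")
        | "Parse" >> beam.Map(lambda b: json.loads(b.decode("utf-8")))
        | "Write" >> beam.io.WriteToBigQuery(
            "my-gcp-project:finance.transactions",  # hypothetical table
            schema="txn_id:STRING,amount:FLOAT,ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```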
Posted 1 month ago
1.0 - 6.0 years
1 - 5 Lacs
Bengaluru
Work from Office
Req ID: 328302
We are currently seeking an AWS Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).
Job Title: Digital Engineering Sr. Associate
NTT DATA Services strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now.
Basic Qualifications:
- 1+ years' experience in AWS infrastructure
Preferred Experience:
- Excellent communication and collaboration skills; AWS certifications are preferred
- Expertise in AWS EC2: creating, managing, patching, and troubleshooting instances
- Good knowledge of access and identity management
- Monitoring tools: CloudWatch (New Relic or other monitoring) and logging
- AWS storage - EBS, EFS, S3, Glacier; adding and extending disks
- AWS backup and restoration
- Strong understanding of networking concepts (VPCs, subnets, ACLs, security groups) and security best practices in cloud environments
- Knowledge of PaaS-to-IaaS migration strategies
- Scripting experience (must be fluent in a scripting language such as Python)
- Detail-oriented self-starter capable of working independently
- Knowledge of IaC (Terraform) and best practices
- Experience with container orchestration using ECS, EKS, Kubernetes, or Docker Swarm
- Experience with one or more configuration management tools: Ansible, Chef, Salt, Puppet
- Familiarity with infrastructure, networking, and AWS databases
- Familiarity with containerization and orchestration tools such as Docker and Kubernetes
- Bachelor's degree in computer science or a related field
- Any of the AWS Associate certifications
GCP Knowledge:
- Cloud IAM, Resource Manager, Multi-factor Authentication, Cloud KMS
- Cloud Billing, Cloud Console, Stackdriver
- Cloud SQL, Cloud Spanner, Cloud Bigtable
- Cloud Run container services, Kubernetes Engine (GKE), Anthos Service Mesh, Cloud Functions, PowerShell on GCP
Ideal Mindset:
- Lifelong learner: you are always seeking to improve your technical and non-technical skills
- Team player: you want to see everyone on the team succeed and are willing to go the extra mile to help a teammate in need
- Listener: you listen to the needs of the customer and make those the priority throughout development
Posted 1 month ago
7.0 - 12.0 years
6 - 10 Lacs
Noida
Work from Office
Req ID: 327205
We are currently seeking a Lead Engineer to join our team in Noida, Uttar Pradesh (IN-UP), India (IN). Grade 8.
At NTT DATA, we know that with the right people on board, anything is possible. The quality, integrity, and commitment of our employees are key factors in our company's growth, market presence, and our ability to help our clients stay a step ahead of the competition. By hiring the best people and helping them grow both professionally and personally, we ensure a bright future for NTT DATA and for the people who work here.
Preferred Experience:
- The ideal candidate has been supporting traditional server-based relational databases (PostgreSQL and MongoDB) for over 7 years, of which the last 4+ years are in public cloud environments (GCP)
- Hands-on experience with PostgreSQL/MongoDB, including installation, configuration, performance tuning, and troubleshooting
- Demonstrated expertise in managing PostgreSQL databases on Azure, GCP, and AWS RDS
- Experience with features such as automated backups, maintenance, and scaling (PostgreSQL)
- Ability to analyze and optimize complex SQL queries for performance improvement
- Proficiency in setting up and managing monitoring tools for PostgreSQL on GCP; experience configuring alerts based on performance metrics
- Experience implementing and testing backup and recovery strategies for PostgreSQL databases on AWS RDS / Azure SQL / GCP Cloud SQL
- Knowledge and experience in designing and implementing disaster recovery plans for PostgreSQL databases on AWS RDS / Azure SQL / GCP Cloud SQL
- Good understanding of database security principles and best practices
- Proven ability to identify and resolve performance bottlenecks in PostgreSQL databases; experience optimizing database configurations for better performance
- Able to provide 24x7 shift support at L2/L3 level
- Experience updating KB articles, problem management articles, and SOPs/runbooks
- Passion for delivering timely and outstanding customer service
- Great written and oral communication skills with internal and external customers
- Strong ITIL foundation experience
- Ability to work independently with no direct supervision
- Share domain and technical expertise, providing technical mentorship and cross-training to peers and team members
- Work directly with end customers and business stakeholders as well as technical resources
Basic Qualifications:
- 7+ years of overall operational experience
- 4+ years of GCP experience as a cloud DBA (PostgreSQL/MongoDB)
- 3+ years of experience working in diverse cloud-support database environments in a 24x7 production support model
- Query fine-tuning (MongoDB)
- Shell scripts for monitoring slow queries, replication lag, node failures, disk usage, etc. (a Python sketch follows this listing)
- Backups and restores (backups should be automated with shell scripts/Ops Manager)
- Database health checks (complete review of slow queries, fragmentation, index usage, etc.)
- Upgrades (Java version, Mongo version, etc.)
- Maintenance (data centre outages, etc.)
- Architecture design as per application requirements
- Writing best-practice documents on sharding and replication for dev/app teams
- Log rotation and maintenance (mongos, mongod, config, etc.)
- Segregation of duties (user management: designing user roles and responsibilities)
- Designing DR (Disaster Recovery) / COB (Continuity of Business) plans as applicable
- Database profiling: locks, memory usage, number of connections, page faults, etc.
- Export and import of data to and from MongoDB; runtime configuration of MongoDB
- Data management in MongoDB: capped collections, expiring data with TTL
- Monitoring of various database issues at the server, database, and collection level, and various MongoDB monitoring tools
- Database software installation and configuration in accordance with client-defined standards
- Database migrations and updates
- Capacity management (MongoDB)
- Hands-on experience in server performance tuning and recommendations
- High-availability solutions and recommendations
- Hands-on experience in root cause analysis for business-impacting issues
- Experience with SQL, SQL Developer, TOAD, pgAdmin, MongoDB Atlas
- Experience with Python/PowerShell scripting (preferred)
- Secondary skill in MySQL/Oracle (preferred)
- Installation, configuration, and upgrading of PostgreSQL server software and related products
Preferred Certifications:
- Azure Fundamentals (AZ-900) - REQUIRED
- Google Cloud Associate Engineer - REQUIRED
- Azure Database Certification (DP-300) - preferred
- AWS Certified Database Specialty - preferred
- PostgreSQL certification a plus
- MongoDB certification a plus
B.Tech/BE/MCA in Information Technology degree or equivalent experience
Posted 1 month ago
1.0 - 6.0 years
1 - 5 Lacs
Noida, Chennai, Bengaluru
Work from Office
Req ID: 328301
We are currently seeking an AWS Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).
Job Title: Digital Engineering Sr. Associate
NTT DATA Services strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now.
Basic Qualifications:
- 1+ years' experience in AWS infrastructure
Preferred Experience:
- Excellent communication and collaboration skills; AWS certifications are preferred
- Expertise in AWS EC2: creating, managing, patching, and troubleshooting instances
- Good knowledge of access and identity management
- Monitoring tools: CloudWatch (New Relic or other monitoring) and logging
- AWS storage - EBS, EFS, S3, Glacier; adding and extending disks
- AWS backup and restoration
- Strong understanding of networking concepts (VPCs, subnets, ACLs, security groups) and security best practices in cloud environments
- Knowledge of PaaS-to-IaaS migration strategies
- Scripting experience (must be fluent in a scripting language such as Python)
- Detail-oriented self-starter capable of working independently
- Knowledge of IaC (Terraform) and best practices
- Experience with container orchestration using ECS, EKS, Kubernetes, or Docker Swarm
- Experience with one or more configuration management tools: Ansible, Chef, Salt, Puppet
- Familiarity with infrastructure, networking, and AWS databases
- Familiarity with containerization and orchestration tools such as Docker and Kubernetes
- Bachelor's degree in computer science or a related field
- Any of the AWS Associate certifications
GCP Knowledge:
- Cloud IAM, Resource Manager, Multi-factor Authentication, Cloud KMS
- Cloud Billing, Cloud Console, Stackdriver
- Cloud SQL, Cloud Spanner, Cloud Bigtable
- Cloud Run container services, Kubernetes Engine (GKE), Anthos Service Mesh, Cloud Functions, PowerShell on GCP
Ideal Mindset:
- Lifelong learner: you are always seeking to improve your technical and non-technical skills
- Team player: you want to see everyone on the team succeed and are willing to go the extra mile to help a teammate in need
- Listener: you listen to the needs of the customer and make those the priority throughout development
Posted 1 month ago
10.0 - 15.0 years
6 - 9 Lacs
Noida
Work from Office
Req ID: 327207 We are currently seeking a Staff Engineer to join our team in Noida, Uttar Pradesh (IN-UP), India (IN). Grade 10

At NTT DATA, we know that with the right people on board, anything is possible. The quality, integrity, and commitment of our employees are key factors in our company's growth, market presence and our ability to help our clients stay a step ahead of the competition. By hiring the best people and helping them grow both professionally and personally, we ensure a bright future for NTT DATA and for the people who work here.

Preferred Experience
• The ideal candidate has been supporting traditional server-based databases (PostgreSQL and MongoDB) for 10+ years, with the last 5+ years in public cloud environments (GCP).
• Hands-on experience with PostgreSQL/MongoDB, including installation, configuration, performance tuning, and troubleshooting.
• Demonstrated expertise in managing PostgreSQL databases on Azure, GCP, and AWS RDS.
• Experience with features such as automated backups, maintenance, and scaling (PostgreSQL).
• Ability to analyze and optimize complex SQL queries for performance improvement.
• Proficiency in setting up and managing monitoring tools for PostgreSQL on GCP.
• Experience with configuring alerts based on performance metrics.
• Experience in implementing and testing backup and recovery strategies for PostgreSQL databases on AWS RDS/Azure SQL/GCP Cloud SQL.
• Knowledge and experience in designing and implementing disaster recovery plans for PostgreSQL databases on AWS RDS/Azure SQL/GCP Cloud SQL.
• Good understanding of database security principles and best practices.
• Proven ability to identify and resolve performance bottlenecks in PostgreSQL databases.
• Experience in optimizing database configurations for better performance.
• Able to provide 24x7 shift-hours support at L2/L3 level.
• Experience in updating KB articles, Problem Management articles, and SOPs/runbooks.
• Passion for delivering timely and outstanding customer service.
• Great written and oral communication skills with internal and external customers.
• Strong ITIL foundation experience.
• Ability to work independently with no direct supervision.
• Share domain and technical expertise, providing technical mentorship and cross-training to other peers and team members.
• Work directly with end customers, business stakeholders, and technical resources.

Basic Qualifications
• 10+ years of overall operational experience
• 5+ years of GCP experience as a cloud DBA (PostgreSQL/MongoDB)
• 3+ years of experience working in diverse cloud-support database environments in a 24x7 production-support model
• Query fine-tuning (MongoDB)
• Shell scripts for monitoring, e.g. slow queries, replication lag, node failures, disk usage, etc. (see the Python sketch below)
• Backups and restores (backups should be automated with shell scripts/Ops Manager)
• Database health checks (complete review of slow queries, fragmentation, index usage, etc.)
• Upgrades (Java version, Mongo version, etc.)
• Maintenance (data centre outages, etc.)
• Architecture design per application requirements
• Writing best-practice documents on sharding and replication for Dev/App teams
• Log rotation/maintenance (mongos, mongod, config, etc.)
• Segregation of duties (user management: designing user roles and responsibilities)
• Designing DR (Disaster Recovery)/COB (Continuity of Business) plans as applicable
• Database profiling, locks, memory usage, number of connections, page faults, etc.
• Export and import of data to and from MongoDB; runtime configuration of MongoDB
• Data management in MongoDB: capped collections, expiring data with TTL
• Monitoring of various issues related to the database
• Monitoring at server, database, and collection level, and various monitoring tools related to MongoDB
• Database software installation and configuration in accordance with client-defined standards
• Database migrations and updates
• Capacity management (MongoDB)
• Hands-on experience in server performance tuning and recommendations
• High-availability solutions and recommendations
• Hands-on experience in root cause analysis for business-impacting issues
• Experience with SQL, SQL Developer, TOAD, pgAdmin, MongoDB Atlas
• Experience with Python/PowerShell scripting (preferred)
• Secondary skill in MySQL/Oracle (preferred)
• Installation, configuration, and upgrading of PostgreSQL server software and related products
• Secondary skill: DB2 is a plus

Preferred Certifications
• Azure Fundamentals (AZ-900) - REQUIRED
• Google Cloud Associate Engineer - REQUIRED
• Azure Database Certification (DP-300) - preferred
• AWS Certified Database Specialty - preferred
• PostgreSQL certification a plus
• MongoDB certification a plus
• B.Tech/BE/MCA in Information Technology degree or equivalent experience
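For the replication-lag monitoring bullet above, here is a hedged Python sketch using pymongo (the posting lists Python scripting as a preferred skill; the shell-script equivalent would follow the same logic). The connection string and alert threshold are placeholders, and it assumes the connecting user may run replSetGetStatus.

```python
# Minimal sketch: report MongoDB replication lag per secondary via pymongo.
# Connection string and threshold are placeholders, not production values.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017/")
status = client.admin.command("replSetGetStatus")

# Use the primary's last applied op time as the reference point.
primary = next(
    (m for m in status["members"] if m["stateStr"] == "PRIMARY"), None
)
if primary is None:
    raise SystemExit("No primary found; replica set may be mid-election")

for member in status["members"]:
    if member["stateStr"] == "SECONDARY":
        lag = (primary["optimeDate"] - member["optimeDate"]).total_seconds()
        print(f"{member['name']}: {lag:.0f}s behind primary")
        if lag > 60:  # hypothetical alert threshold
            print(f"ALERT: {member['name']} replication lag exceeds 60s")
```

A real monitor would run this on a schedule and feed the lag value into the alerting stack (Ops Manager, Cloud Monitoring, or similar) rather than printing it.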
Posted 1 month ago
6.0 - 11.0 years
22 - 35 Lacs
Chennai
Hybrid
Role & responsibilities
• Candidates should have a minimum of 6 years of experience.
• Designs, develops, debugs, evaluates, modifies, deploys, and documents software and systems that meet the needs of customer-facing and business applications, solving problems as they arise.
• Develop and maintain large data-processing pipelines and marketing campaign optimisation strategies.
• Track digital marketing campaigns using Adobe Marketing products.
• Build campaign-analysis dashboards for Marketing and Finance teams using Adobe Analytics.
• Design and implement a solution for cookie-based tracking using the Adobe marketing suite; manage tag containers and variables and implement content tagging.
• Use Cloud SQL and Dataproc to migrate existing workloads to Google Cloud or any other cloud.
• Proficient in BigQuery for batch and interactive data analysis (see the Python sketch below).
• Hands-on in writing code, conducting code reviews, and testing in ongoing sprints; familiar with proofs of concept and automation tools.
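To illustrate the BigQuery batch-analysis skill above, here is a minimal Python sketch using the google-cloud-bigquery client. The project, dataset, and table names are hypothetical, and it assumes application-default credentials with BigQuery access.

```python
# Minimal sketch: run a BigQuery query at BATCH priority and read the results.
# Table name is a hypothetical placeholder; assumes default GCP credentials.
from google.cloud import bigquery

client = bigquery.Client()

# BATCH priority queues the job to run on idle capacity instead of
# interactively, which suits scheduled campaign-analysis workloads.
job_config = bigquery.QueryJobConfig(priority=bigquery.QueryPriority.BATCH)

sql = """
    SELECT campaign_id, COUNT(*) AS clicks
    FROM `my-project.marketing.click_events`
    GROUP BY campaign_id
    ORDER BY clicks DESC
    LIMIT 10
"""
job = client.query(sql, job_config=job_config)

for row in job.result():  # blocks until the batch job finishes
    print(row.campaign_id, row.clicks)
```

Dropping the job_config argument runs the same query at the default INTERACTIVE priority, which is the other half of the batch/interactive split the bullet mentions.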
Posted 1 month ago
5.0 - 7.0 years
13 - 17 Lacs
Hyderabad
Work from Office
Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. (see the Pub/Sub sketch below). Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and interface directly with the customer.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
• 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
• Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer.
• You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
• Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.

Preferred technical and professional experience
• Intuitive individual with an ability to manage change and proven time management.
• Proven interpersonal skills while contributing to team effort by accomplishing related results as needed.
• Up-to-date technical knowledge from attending educational workshops and reviewing publications.
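As a hedged example of the Pub/Sub skill listed above, here is a minimal Python sketch publishing a message with the google-cloud-pubsub client. The project and topic IDs and the message payload are hypothetical placeholders, and it assumes default GCP credentials with publish permission on the topic.

```python
# Minimal sketch: publish a JSON message to a Pub/Sub topic.
# Project and topic IDs are hypothetical; assumes default GCP credentials.
import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "orders-topic")

payload = json.dumps({"order_id": 42, "status": "created"}).encode("utf-8")

# publish() returns a future; result() blocks until the server acknowledges
# the message and returns its message ID. Extra keyword args become
# string-valued message attributes.
future = publisher.publish(topic_path, data=payload, origin="batch-loader")
print("Published message", future.result())
```

A Dataflow or Cloud Run subscriber on the other end of the topic would complete the pipeline pattern these listings describe.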
Posted 1 month ago