3.0 - 7.0 years
11 - 15 Lacs
Bengaluru
Work from Office
Project description
Developing state-of-the-art, cloud-based compliance archive products to archive and retain real-time communications data in line with internal and external regulatory requirements. The product is developed in-house in an agile development setup based on business requirements from multiple stakeholder parties. By employing continuous development and deployment principles, the team is aiming to transition from project to product management to support the bank with robust compliance archive solutions for the next decade. For a large global investment bank, we are looking for GCP-qualified cloud engineers to help with the FIC (fixed income and currencies) cloud migrations under Project Cirrus. Knowledge of Financial Services/FIC would be great, but the primary skills we need are in building, migrating, and deploying applications to GCP; Terraform module coding; Google infrastructure; cloud-native services such as GCE, GKE, and Cloud SQL/Postgres; and logging and monitoring. Good written and spoken English is also needed, as we would like these engineers to help with knowledge transfer to our existing development and support teams. We would like to place people alongside the engineers they'll be working with in the bank.

Responsibilities
- Participate in the full application development lifecycle for the development of Java applications, microservices, and reusable components to support overall project objectives
- Leverage design patterns, test-driven development (TDD), and behaviour-driven development (BDD) to build software that is reliable and easy to support in production
- Adapt to different responsibilities and communicate effectively with team members and stakeholders
- Design and deliver front-to-back technical solutions and integrate them into business processes
- Participate in hands-on coding, code reviews, and architectural decisions and reviews
- Work in an Agile systems development life cycle

Skills
Must have
- Overall 6-9 years of experience as a Java developer
- 6+ years of experience developing in Core Java and the Spring Framework
- Google Cloud Platform experience
- Hands-on development with the latest features of Java 8, 11, and 17
- Solid understanding of data structures; good hands-on coding skills
- Experience in Kafka or other messaging systems
- Knowledge of key APIs: JPA, JTA, CDI, etc.
- Knowledge of various design and architectural patterns
- Understanding of microservices architecture
- Containerization solutions (e.g., Docker, Kubernetes, OpenShift)
- Build tools (e.g., Maven, Gradle)
- Version control (e.g., Git)
- Continuous integration systems (e.g., TeamCity, Jenkins)
- English: Upper-Intermediate
- Well versed in references, class instances, methods, objects, constructors, mutable and immutable class concepts, functional interfaces, ArrayList, LinkedList, HashMap, collections, the difference between recoverable and non-recoverable exceptions, inversion of control, and designing a data structure that supports insert, delete, and search in constant time complexity (a Python sketch follows below), etc.

Nice to have
- Banking domain
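The last must-have item names a classic exercise: a set-like structure with constant-time insert, delete, and search. A minimal Python sketch, illustrative only (the role itself is Java-centric, and the class and method names here are invented):

```python
class ConstantTimeSet:
    """Insert, delete, and search in O(1) average time.

    A list stores the values; a dict maps each value to its index in
    the list. Deletion swaps the target with the last element so the
    list never has holes.
    """

    def __init__(self):
        self.values = []       # backing array of stored values
        self.index_of = {}     # value -> position in self.values

    def insert(self, value) -> bool:
        if value in self.index_of:
            return False
        self.index_of[value] = len(self.values)
        self.values.append(value)
        return True

    def delete(self, value) -> bool:
        pos = self.index_of.pop(value, None)
        if pos is None:
            return False
        last = self.values.pop()
        if pos < len(self.values):   # deleted value was not the last slot
            self.values[pos] = last
            self.index_of[last] = pos
        return True

    def search(self, value) -> bool:
        return value in self.index_of


s = ConstantTimeSet()
s.insert(10); s.insert(20)
s.delete(10)
print(s.search(10), s.search(20))  # False True
```

The same dict-plus-array trick translates directly to Java with a HashMap and an ArrayList.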
Posted 4 weeks ago
5.0 - 9.0 years
13 - 17 Lacs
Bengaluru
Work from Office
Project description
Developing state-of-the-art, cloud-based compliance archive products to archive and retain real-time communications data in line with internal and external regulatory requirements. The product is developed in-house in an agile development setup based on business requirements from multiple stakeholder parties. By employing continuous development and deployment principles, the team is aiming to transition from project to product management to support the bank with robust compliance archive solutions for the next decade. For a large global investment bank, we are looking for GCP-qualified cloud engineers to help with the FIC (fixed income and currencies) cloud migrations under Project Cirrus. Knowledge of Financial Services/FIC would be great, but the primary skills we need are in building, migrating, and deploying applications to GCP; Terraform module coding; Google infrastructure; cloud-native services such as GCE, GKE, and Cloud SQL/Postgres; and logging and monitoring. Good written and spoken English is also needed, as we would like these engineers to help with knowledge transfer to our existing development and support teams. We would like to place people alongside the engineers they'll be working with in the bank.

Responsibilities
- Develop solutions following established technical design, application development standards, and quality processes in projects
- Assess the impact of changes in functional requirements on the technical design
- Perform independent code reviews and execute unit tests on modules developed by self and other junior team members on the project
- Write well-designed, efficient, and testable code
- Interact with stakeholders including end-user clients, the project manager or scrum master, business analysts, and offshore development, testing, and other cross-functional teams

Skills
Must have
- 10+ years of Java development experience, with 3+ years of architecture design experience
- BS/MS degree in Computer Science, Software Engineering, or a related subject
- Google Cloud Platform experience
- Comfortable practicing TDD and pair programming; well versed in DevOps
- Good knowledge of object-oriented design principles and hands-on experience with object-oriented programming
- Good knowledge of the Java standard library; hands-on experience with Spring and/or Spring Boot is a big plus
- Experience in agile software development
- Well versed in solution architecture and principles including, but not limited to: SOLID; hexagonal (ports and adapters, sketched below); cloud-native; microservices patterns
- Experience in large enterprise system integrations and architecture
- Strong understanding of, and hands-on experience with, designing scalable, highly available, reliable, resilient, secure, and performant systems
- Good presentation, documentation, and communication skills
- Knowledge of Linux is a plus; knowledge of cloud platforms is a plus
- Knowledge of TOGAF and Zachman frameworks is desirable
- Understanding of application security frameworks and standards (e.g., OWASP, NIST) is good to have
- 4+ progressive years of experience building and implementing model-driven, enterprise-level business solutions and applications in PRPC
- Excellent time management and organization skills, and the ability to manage multiple competing priorities
- Exceptional interpersonal skills and the ability to communicate, partner, and collaborate
- Dedication to achieving outstanding customer results with a team-oriented drive and a demonstrated ability to lead by example
- Exposure to a variety of technologies, including object-oriented techniques/principles, database design, and application and web servers
- Aptitude to pick up new concepts and technology rapidly, and the ability to explain them to both business and IT stakeholders
- Ability to match technology solutions to customer needs

Nice to have
- Banking domain
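The hexagonal (ports and adapters) style named above keeps the domain core independent of infrastructure behind interfaces. A minimal Python sketch, with every name invented for illustration:

```python
from dataclasses import dataclass
from typing import Protocol


# Port: the core's view of the outside world, defined by the domain.
class ArchiveStore(Protocol):
    def save(self, record_id: str, payload: bytes) -> None: ...


# Domain core: depends only on the port, never on infrastructure.
@dataclass
class ArchiveService:
    store: ArchiveStore

    def archive_message(self, record_id: str, body: str) -> None:
        if not body:
            raise ValueError("empty message cannot be archived")
        self.store.save(record_id, body.encode("utf-8"))


# Adapter: one concrete implementation of the port (in-memory here;
# a Cloud Storage or Cloud SQL adapter would implement the same interface).
class InMemoryStore:
    def __init__(self) -> None:
        self.records: dict[str, bytes] = {}

    def save(self, record_id: str, payload: bytes) -> None:
        self.records[record_id] = payload


service = ArchiveService(store=InMemoryStore())
service.archive_message("msg-1", "hello compliance archive")
```

Swapping the adapter never touches the domain core, which is the point of the pattern.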
Posted 4 weeks ago
3.0 - 6.0 years
9 - 14 Lacs
Bengaluru
Work from Office
Project description
Developing state-of-the-art, cloud-based compliance archive products to archive and retain real-time communications data in line with internal and external regulatory requirements. The product is developed in-house in an agile development setup based on business requirements from multiple stakeholder parties. By employing continuous development and deployment principles, the team is aiming to transition from project to product management to support the bank with robust compliance archive solutions for the next decade. For a large global investment bank, we are looking for GCP-qualified cloud engineers to help with the FIC (fixed income and currencies) cloud migrations under Project Cirrus. Knowledge of Financial Services/FIC would be great, but the primary skills we need are in building, migrating, and deploying applications to GCP; Terraform module coding; Google infrastructure; cloud-native services such as GCE, GKE, and Cloud SQL/Postgres; and logging and monitoring. Good written and spoken English is also needed, as we would like these engineers to help with knowledge transfer to our existing development and support teams. We would like to place people alongside the engineers they'll be working with in the bank.

Responsibilities
- Participate in the full application development lifecycle for the development of Java applications, microservices, and reusable components to support overall project objectives
- Leverage design patterns, test-driven development (TDD), and behaviour-driven development (BDD) to build software that is reliable and easy to support in production
- Adapt to different responsibilities and communicate effectively with team members and stakeholders
- Design and deliver front-to-back technical solutions and integrate them into business processes
- Participate in hands-on coding, code reviews, and architectural decisions and reviews
- Work in an Agile systems development life cycle

Skills
Must have
- 8+ years of experience in implementing and testing practices across the full software development lifecycle
- 6+ years of experience in Java development/maintenance/testing
- Technology knowledge and experience in automation testing using Java is highly needed
- Good experience in BDD with Cucumber
- Good experience in Selenium (a minimal sketch follows below)
- Experience with industry-standard test tools (e.g., HP ALM 11)
- Experience in planning and executing testing across projects
- Customer and service orientation to support interaction with team resources and clients
- Performance- and productivity-oriented, to drive toward quality testing outcomes and results
- Proactively initiate, develop, and maintain effective working relationships with team members
- Demonstrated ability to work with a variety of people and achieve results

Nice to have
- Banking domain
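The posting asks for Java-based automation with Cucumber and Selenium; as a compact stand-in, here is a hedged BDD-style scenario using Selenium's Python bindings. It assumes a local Chrome install (Selenium 4.6+ resolves the driver automatically), and the target page and assertion are placeholders:

```python
# pip install selenium
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
try:
    # Given the user opens the landing page
    driver.get("https://example.com")
    # When the page has loaded
    heading = driver.find_element(By.TAG_NAME, "h1")
    # Then the main heading is visible and correct
    assert "Example" in heading.text, f"unexpected heading: {heading.text}"
    print("scenario passed")
finally:
    driver.quit()
```

In a real Cucumber/Java suite the Given/When/Then comments would live in a feature file bound to step definitions; the structure of the checks is the same.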
Posted 4 weeks ago
3.0 - 7.0 years
12 - 16 Lacs
Bengaluru
Work from Office
Project description
Developing state-of-the-art, cloud-based compliance archive products to archive and retain real-time communications data in line with internal and external regulatory requirements. The product is developed in-house in an agile development setup based on business requirements from multiple stakeholder parties. By employing continuous development and deployment principles, the team is aiming to transition from project to product management to support the bank with robust compliance archive solutions for the next decade. For a large global investment bank, we are looking for GCP-qualified cloud engineers to help with the FIC (fixed income and currencies) cloud migrations under Project Cirrus. Knowledge of Financial Services/FIC would be great, but the primary skills we need are in building, migrating, and deploying applications to GCP; Terraform module coding; Google infrastructure; cloud-native services such as GCE, GKE, and Cloud SQL/Postgres; and logging and monitoring. Good written and spoken English is also needed, as we would like these engineers to help with knowledge transfer to our existing development and support teams. We would like to place people alongside the engineers they'll be working with in the bank.

Responsibilities
- CI/CD setup, management, and improvement: automate build and deployment pipelines and procedures
- Application infrastructure support, including hardware, networks, OS, frameworks, and related tools
- Test environment support
- Follow internal security operations procedures
- Application and infrastructure audit compliance
- Deploy application releases; provide SL3 application on-call rota/weekend support
- Perform root cause analysis of production errors and resolve technical issues
- Cover monitoring and alerting needs (a certificate-expiry check is sketched below)
- Certificate installations and renewals
- Compliance-related changes to the application based on internal compliance requirements
- Work as a self-motivated individual able to perform tasks independently

Skills
Must have
- 6+ years of experience as a DevOps/SRE engineer or in a similar software engineering role
- 6+ years of experience with Linux diagnosis and support; deep Bash knowledge
- 5+ years of experience with Python coding
- Strong TCP/IP knowledge
- 5+ years of experience with CI/CD: Jenkins, full build/test flow support (support, development, and improvement), Groovy, Artifactory
- Application release cycle experience: release engineer, SL3 support
- IaC: Ansible, Git
- Docker experience
- Proficiency with monitoring
- Basic SQL
- Analytical thinking
- Good verbal and written communication skills
- Proactive approach to identifying problems, performance bottlenecks, and areas for improvement
- Independence in assigned tasks and projects; teamwork within the area of responsibility

Nice to have
- Banking domain
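One listed duty is certificate installation and renewal alongside monitoring and alerting. A small, hedged Python sketch of a TLS certificate expiry check using only the standard library; the host and the 30-day threshold are placeholders:

```python
import socket
import ssl
import time


def days_until_cert_expiry(host: str, port: int = 443) -> int:
    """Return how many days remain before the host's TLS cert expires."""
    context = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    expires_at = ssl.cert_time_to_seconds(cert["notAfter"])
    return int((expires_at - time.time()) // 86400)


remaining = days_until_cert_expiry("example.com")
threshold = 30  # alert threshold in days; pick to fit the renewal process
print(f"{'WARNING' if remaining < threshold else 'OK'}: {remaining} days left")
```

Wired into a cron job or a Jenkins stage, a check like this turns silent certificate expiry into an actionable alert.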
Posted 4 weeks ago
3.0 - 7.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Software Engineer at HSBC, you will play a crucial role in designing, implementing, and managing scalable, secure, and reliable cloud infrastructure on the Google Cloud Platform (GCP). Your responsibilities will include collaborating with development teams to optimize applications for cloud deployment, setting up and configuring various GCP services, and ensuring compliance with security policies and best practices.

To excel in this role, you should have proven experience as a Cloud Engineer with a strong focus on GCP. Your expertise should include a deep understanding of cloud architecture and services such as Compute Engine, Kubernetes Engine, Cloud Storage, and BigQuery. Additionally, you will be expected to automate infrastructure provisioning using tools like Terraform, Google Cloud Deployment Manager, or similar, and implement CI/CD pipelines for efficient software delivery.

The successful candidate will possess proficiency in scripting languages like Python and Bash, as well as the ability to troubleshoot and resolve issues related to cloud infrastructure and services. Google Cloud certifications, particularly the Google Cloud Professional Cloud Architect certification, are considered a plus. Staying updated with the latest GCP features, services, and best practices is essential for this role. Any knowledge of other cloud platforms like AWS or Azure will be an added advantage.

If you are a skilled and experienced Google Cloud Engineer seeking a career where your expertise is valued, HSBC offers an inclusive and diverse environment where employees are respected, valued, and provided with opportunities for continuous professional development and growth. Join us at HSBC and realize your ambitions in a workplace that prioritizes employee well-being and career advancement. For more information about career opportunities at HSBC, visit www.hsbc.com/careers.
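As a hedged illustration of the Python scripting side of such a role, a minimal sketch using the google-cloud-storage client; the project and bucket names are placeholders, and application-default credentials are assumed (e.g., via `gcloud auth application-default login`):

```python
from google.cloud import storage  # pip install google-cloud-storage

# Placeholder project; relies on application-default credentials.
client = storage.Client(project="my-gcp-project")

# List the buckets visible in the project.
for bucket in client.list_buckets():
    print(bucket.name)

# Upload a local file to a bucket (names are invented).
bucket = client.bucket("my-app-artifacts")
blob = bucket.blob("releases/app-v1.txt")
blob.upload_from_filename("app-v1.txt")
print(f"Uploaded to gs://{bucket.name}/{blob.name}")
```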
Posted 1 month ago
8.0 - 12.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As Ford Motor Company embarks on a significant multi-year Platform Lifecycle Management (PLM) program to modernize critical IT applications across the enterprise, you have a unique opportunity to join the team as a GenAI Technical Manager (LL6). In this role, you will play a pivotal part in driving the practical adoption of Generative AI (GenAI) at Ford, with a focus on creating accelerators for the PLM modernization effort and enhancing the Ford Developer Experience (DX). Your expertise will be crucial in leading the technical development and implementation of GenAI solutions within this strategic program.

You will collaborate closely with various teams including PLM program leaders, GDIA (Global Data Insight & Analytics), architecture teams, PDOs (Product Driven Organizations), and engineering teams to design, build, and deploy cutting-edge GenAI tools and platforms. Your responsibilities will include leading the technical design, development, testing, and deployment of GenAI solutions, translating GenAI strategy into actionable projects, managing the technical lifecycle of GenAI tools, and overseeing the integration of GenAI capabilities into existing workflows and processes.

As a technical expert in GenAI models and frameworks, you will provide guidance to development teams, architects, and stakeholders on best practices, architecture patterns, security considerations, and ethical AI principles. You will stay updated on the evolving GenAI landscape, evaluate new tools and models, and lead the development of GenAI-powered accelerators and tools to automate and streamline processes within the PLM program.

Collaboration and stakeholder management will be key aspects of your role, requiring effective communication of complex technical concepts to diverse audiences. You will also lead proof-of-concept projects with emerging GenAI technologies, champion experimentation and adoption of successful tools and practices, and mentor junior team members in GenAI development tasks.

To qualify for this role, you should have a Bachelor's or Master's degree in Computer Science, Software Engineering, Artificial Intelligence, or a related field, along with 8-10+ years of experience in software development/engineering with a focus on AI/ML and Generative AI solutions. Deep practical expertise in GenAI, a strong software development foundation, and familiarity with enterprise application context are essential qualifications. Preferred qualifications include GCP certifications, experience with Agile methodologies, and familiarity with PLM concepts or the automotive industry.

If you are passionate about innovation in the AI space, possess strong analytical and strategic thinking skills, and excel in a fast-paced, global environment, we invite you to join us in shaping the future of AI at Ford Motor Company.
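As a rough illustration of the kind of building block such GenAI accelerators rest on, a minimal Vertex AI sketch in Python; the project, region, model choice, and prompt are all placeholder assumptions, not details from the posting:

```python
# pip install google-cloud-aiplatform
import vertexai
from vertexai.generative_models import GenerativeModel

# Placeholder project/region; assumes application-default credentials.
vertexai.init(project="my-gcp-project", location="us-central1")

model = GenerativeModel("gemini-1.5-pro")  # assumed model choice
response = model.generate_content(
    "Summarize the modernization risks in this legacy module: ..."
)
print(response.text)
```

A PLM accelerator would wrap calls like this with retrieval over application inventories, prompt templates, and output validation rather than a single ad-hoc prompt.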
Posted 1 month ago
10.0 - 14.0 years
0 Lacs
Pune, Maharashtra
On-site
One of our prestigious clients, a TOP MNC Giant with a global presence, is currently seeking a Lead Enterprise Architect to join their team in Pune, Mumbai, or Bangalore.

**Qualifications and Certifications:**

**Education:**
- Bachelor's or master's degree in Computer Science, Information Technology, Engineering, or a related field.

**Experience:**
- A minimum of 10+ years of experience in data engineering, with at least 4 years of hands-on experience with GCP cloud platforms.
- Proven track record in designing and implementing data workflows using GCP services such as BigQuery, Dataform, Cloud Dataflow, Cloud Pub/Sub, and Cloud Composer.

**Certifications:**
- Google Cloud Professional Data Engineer certification is preferred.

**Key Skills:**

**Mandatory Skills:**
- Advanced proficiency in Python for developing data pipelines and automation.
- Strong SQL skills for querying, transforming, and analyzing large datasets (a BigQuery sketch follows below).
- Hands-on experience with various GCP services including Cloud Storage, Dataflow, Cloud Pub/Sub, Cloud SQL, BigQuery, Dataform, Compute Engine, and Kubernetes Engine (GKE).
- Familiarity with CI/CD tools like Jenkins, GitHub, or Bitbucket.
- Proficiency in Docker, Kubernetes, Terraform, or Ansible for containerization, orchestration, and infrastructure as code (IaC).
- Knowledge of workflow orchestration tools such as Apache Airflow or Cloud Composer.
- Strong understanding of Agile/Scrum methodologies.

**Nice-to-Have Skills:**
- Experience with other cloud platforms like AWS or Azure.
- Familiarity with data visualization tools such as Power BI, Looker, or Tableau.
- Understanding of machine learning workflows and their integration with data pipelines.

**Soft Skills:**
- Strong problem-solving and critical-thinking abilities.
- Excellent communication skills to collaborate effectively with both technical and non-technical stakeholders.
- Proactive attitude toward innovation and continuous learning.
- Ability to work independently and as part of a collaborative team.

If you are interested in this opportunity, please reply with your updated CV and provide the following details:
- Total experience:
- Relevant experience in data engineering:
- Relevant experience with GCP cloud platforms:
- Relevant experience as an Enterprise Architect:
- Availability to join (ASAP):
- Preferred location (Pune / Mumbai / Bangalore):

We will contact you once we receive your CV along with the above-mentioned details.

Thank you,
Kavita.A
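For the SQL-on-BigQuery requirement above, a minimal hedged sketch using the google-cloud-bigquery client; the project, dataset, table, and columns are invented placeholders, and application-default credentials are assumed:

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client(project="my-gcp-project")  # placeholder project

query = """
    SELECT customer_id, COUNT(*) AS orders
    FROM `my-gcp-project.sales.orders`
    GROUP BY customer_id
    ORDER BY orders DESC
    LIMIT 10
"""
# query() submits the job; result() blocks until rows are available.
for row in client.query(query).result():
    print(row.customer_id, row.orders)
```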
Posted 1 month ago
7.0 - 9.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow: people with a unique combination of skill and passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level.

Job Description
We are looking for an experienced GCP Cloud/DevOps Engineer (and/or OpenShift) to design, implement, and manage cloud infrastructure and services across multiple environments. This role requires deep expertise in Google Cloud Platform (GCP) services, DevOps practices, and Infrastructure as Code (IaC). The candidate will deploy, automate, and maintain high-availability systems and implement best practices for cloud architecture, security, and DevOps pipelines.

Requirements
- Bachelor's or master's degree in Computer Science, Information Technology, or a similar field
- 7+ years of extensive experience in designing, implementing, and maintaining applications on GCP and OpenShift
- Comprehensive expertise in GCP services such as GKE, Cloud Run, Cloud Functions, Cloud SQL, Firestore, Firebase, Apigee, App Engine, Gemini Code Assist, Vertex AI, Spanner, Memorystore, Service Mesh, and Cloud Monitoring
- Solid understanding of cloud security best practices and experience implementing security controls in GCP
- Thorough understanding of cloud architecture principles and best practices
- Experience with automation and configuration management tools like Terraform, and a sound understanding of DevOps principles
- Proven leadership skills and the ability to mentor and guide a technical team

Key Responsibilities

Cloud Infrastructure Design and Deployment
- Architect, design, and implement scalable, reliable, and secure solutions on GCP.
- Deploy and manage GCP services in both development and production environments, ensuring seamless integration with existing infrastructure.
- Implement and manage core services such as BigQuery, Data Fusion, Cloud Composer (Airflow), Cloud Storage, Compute Engine, App Engine, Cloud Functions, and more.

Infrastructure as Code (IaC) and Automation
- Develop and maintain infrastructure as code using Terraform or CLI scripts to automate provisioning and configuration of GCP resources.
- Establish and document best practices for IaC to ensure consistent and efficient deployments across environments.

DevOps and CI/CD Pipeline Development
- Create and manage DevOps pipelines for automated build, test, and release management, integrating with tools such as Jenkins, GitLab CI/CD, or equivalent.
- Work with development and operations teams to optimize deployment workflows, manage application dependencies, and improve delivery speed.

Security and IAM Management
- Handle user and service account management in Google Cloud IAM.
- Set up and manage Secret Manager and Cloud Key Management for secure storage of credentials and sensitive information.
- Implement network and data security best practices to ensure compliance and security of cloud resources.

Performance Monitoring and Optimization
- Monitoring and security: set up observability tools like Prometheus and Grafana, and integrate security tools (e.g., SonarQube, Trivy).
- Networking and storage: configure DNS, networking, and persistent storage solutions in Kubernetes.
- Set up monitoring and logging (e.g., Cloud Monitoring, Cloud Logging, Error Reporting) to ensure systems perform optimally.
- Troubleshoot and resolve issues related to cloud services and infrastructure as they arise.

Workflow Orchestration
- Orchestrate complex workflows using the Argo Workflow Engine.
- Containerization: work extensively with Docker for containerization and image management.
- Optimization: troubleshoot and optimize containerized applications for performance and security.

Technical Skills
- Expertise with GCP and OCP (OpenShift) services, including but not limited to Compute Engine, Kubernetes Engine (GKE), BigQuery, Cloud Storage, Pub/Sub, Data Fusion, Airflow, Cloud Functions, and Cloud SQL.
- Proficiency in scripting languages like Python, Bash, or PowerShell for automation (a Pub/Sub sketch follows below).
- Familiarity with DevOps tools and CI/CD processes (e.g., GitLab CI, Cloud Build, Azure DevOps, Jenkins).

Employee Type
Permanent

UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.
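As a small illustration of the Python-for-automation requirement, a hedged Pub/Sub publishing sketch; the project and topic are placeholders, the topic is assumed to already exist, and application-default credentials are assumed:

```python
from google.cloud import pubsub_v1  # pip install google-cloud-pubsub

project_id = "my-gcp-project"   # placeholder
topic_id = "deploy-events"      # placeholder

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(project_id, topic_id)

# Publish a small JSON payload; keyword args become message attributes.
future = publisher.publish(
    topic_path,
    data=b'{"service": "billing", "version": "1.4.2"}',
    environment="staging",
)
print(f"Published message id: {future.result()}")
```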
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Technology Service Specialist, AVP at our Pune location, you will be an integral part of the Technology, Data, and Innovation (TDI) Private Bank team. In this role, you will be responsible for providing 2nd-level application support for business applications used in branches, by mobile sales, or via the internet. Your expertise in incident management and problem management will be crucial in ensuring the stability of these applications.

Partnerdata, the central client reference data system in Germany, is a core banking system that integrates many banking processes and applications through numerous interfaces. With the recent migration to Google Cloud (GCP), you will be involved in operating and further developing applications and functionalities on the cloud platform. Your focus will also extend to regulatory topics surrounding partner/client relationships. We are seeking individuals who can contribute to this contemporary and emerging cloud application area.

Key Responsibilities:
- Ensure optimum service level to supported business lines
- Oversee resolution of incidents and problems within the team
- Assist in managing business stakeholder relationships
- Define and manage OLAs with relevant stakeholders
- Monitor team performance, adherence to processes, and alignment with business SLAs
- Manage escalations and work with relevant functions to resolve issues quickly
- Identify areas for improvement and implement best practices in your area of expertise
- Mentor and coach Production Management Analysts within the team
- Fulfill service requests, communicate with the Service Desk function, and participate in major incident calls
- Document tasks, incidents, problems, changes, and knowledge bases
- Improve monitoring of applications and implement automation of tasks

Skills and Experience:
- Service operations specialist experience in a global operations context
- Extensive experience supporting complex application and infrastructure domains
- Ability to manage and mentor service operations teams
- Strong ITIL/best-practice service context knowledge
- Proficiency in interface technologies, communication protocols, and ITSM tools
- Bachelor's degree in IT or a Computer Science related discipline
- ITIL certification and experience with the ITSM tool ServiceNow preferred
- Knowledge of the banking domain and regulatory topics
- Experience with databases like BigQuery and understanding of Big Data and GCP technologies
- Proficiency in tools like GitHub, Terraform, Cloud SQL, Cloud Storage, Dataproc, and Dataflow
- Architectural skills for big data solutions and interface architecture

Area-Specific Tasks/Responsibilities:
- Handle incident/problem management and service request fulfilment
- Analyze and resolve incidents escalated from 1st-level support
- Support the resolution of high-impact incidents and escalate when necessary
- Provide solutions for open problems and support service transition for new projects/applications

Joining our team, you will receive training, development opportunities, coaching from experts, and a culture of continuous learning to support your career progression. We value diversity and promote a positive, fair, and inclusive work environment at Deutsche Bank Group. Visit our company website for more information.
Posted 1 month ago
6.0 - 10.0 years
1 - 1 Lacs
Chennai
Hybrid
Overview: TekWissen is a global workforce management provider throughout India and many other countries in the world. The below client is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place, one that benefits lives, communities, and the planet.

Job Title: Specialty Development Practitioner
Location: Chennai
Work Type: Hybrid

Position Description:
At the client's Credit Company, we are modernizing our enterprise data warehouse in Google Cloud to enhance data, analytics, and AI/ML capabilities, improve customer experience, ensure regulatory compliance, and boost operational efficiencies. As a GCP Data Engineer, you will integrate data from various sources into novel data products. You will build upon existing analytical data, including merging historical data from legacy platforms with data ingested from new platforms. You will also analyze and manipulate large datasets, activating data assets to enable enterprise platforms and analytics within GCP. You will design and implement the transformation and modernization on GCP, creating scalable data pipelines that land data from source applications, integrate it into subject areas, and build data marts and products for analytics solutions (a minimal pipeline sketch follows below). You will also conduct deep-dive analysis of current-state Receivables and Originations data in our data warehouse, performing impact analysis related to the client's Credit North America modernization and providing implementation solutions. Moreover, you will partner closely with our AI, data science, and product teams, developing creative solutions that build the future for the client's Credit. Experience with large-scale solutions and operationalizing data warehouses, data lakes, and analytics platforms on Google Cloud Platform or other cloud environments is a must. We are looking for candidates with a broad set of analytical and technology skills across these areas who can demonstrate an ability to design the right solutions with the appropriate combination of GCP and third-party technologies for deployment on Google Cloud Platform.

Skills Required:
BigQuery, Dataflow, Dataform, Data Fusion, Dataproc, Cloud Composer, Airflow, Cloud SQL, Compute Engine, Google Cloud Platform

Experience Required:
- GCP Data Engineer Certified
- Successfully designed and implemented data warehouses and ETL processes for over five years, delivering high-quality data solutions
- 5+ years of complex SQL development experience
- 2+ years of experience with programming languages such as Python, Java, or Apache Beam
- Experienced cloud engineer with 3+ years of GCP expertise, specializing in managing cloud infrastructure and applications into production-scale solutions
- Technologies: BigQuery, Dataflow, Dataproc, Data Fusion, Terraform, Tekton, Cloud SQL, Airflow, Postgres, PySpark, Python, APIs, Cloud Build, App Engine, Apache Kafka, Pub/Sub, AI/ML, Kubernetes

Experience Preferred:
- In-depth understanding of GCP's underlying architecture and hands-on experience with crucial GCP services, especially those related to data processing (batch/real-time), leveraging Terraform, BigQuery, Dataflow, Pub/Sub, Dataform, Astronomer, Data Fusion, Dataproc, PySpark, Cloud Composer/Airflow, Cloud SQL, Compute Engine, Cloud Functions, Cloud Run, Cloud Build, and App Engine, alongside storage including Cloud Storage
- DevOps tools such as Tekton, GitHub, Terraform, Docker
- Expert in designing, optimizing, and troubleshooting complex data pipelines
- Experience developing with microservice architecture from a container orchestration framework
- Experience in designing pipelines and architectures for data processing
- Passion and self-motivation to develop, experiment with, and implement state-of-the-art data engineering methods and techniques
- Self-directed; works independently with minimal supervision and adapts to ambiguous environments
- Evidence of a proactive problem-solving mindset and willingness to take the initiative
- Strong prioritization, collaboration, and coordination skills, and the ability to simplify and communicate complex ideas with cross-functional teams and all levels of management
- Proven ability to juggle multiple responsibilities and competing demands while maintaining a high level of productivity
- Data engineering or development experience gained in a regulated financial environment
- Experience coaching and mentoring data engineers
- Project management tools like Atlassian JIRA
- Experience working on an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment
- Experience with data security, governance, and compliance best practices in the cloud
- Experience with AI solutions or platforms that support AI solutions
- Experience using data science concepts on production datasets to generate insights

Experience Range: 5+ years
Education Required: Bachelor's Degree

TekWissen® Group is an equal opportunity employer supporting workforce diversity.
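As a hedged sketch of the pipeline work described, a minimal Apache Beam job in Python that aggregates a CSV locally with the DirectRunner; on Dataflow the same pipeline would run by switching the runner and adding project options, and all file paths and fields are placeholders:

```python
import apache_beam as beam  # pip install apache-beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(runner="DirectRunner")

# Assumes orders.csv rows look like: customer_id,order_id,amount
with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("orders.csv", skip_header_lines=1)
        | "Parse" >> beam.Map(lambda line: line.split(","))
        | "KeyByCustomer" >> beam.Map(lambda f: (f[0], float(f[2])))
        | "SumPerCustomer" >> beam.CombinePerKey(sum)
        | "Format" >> beam.MapTuple(lambda cust, total: f"{cust},{total}")
        | "Write" >> beam.io.WriteToText("customer_totals")
    )
```

The same transform graph lands unchanged on Dataflow, which is the main draw of Beam for GCP pipeline work.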
Posted 1 month ago
9.0 - 14.0 years
7 - 14 Lacs
Hyderabad, Pune
Hybrid
Role & responsibilities

Key skills required:
- 8 years of hands-on experience in cloud application architecture, with a focus on creating scalable and reliable software systems
- 8 years of experience using Google Cloud Platform (GCP), including but not restricted to services like BigQuery, Cloud SQL, Firestore, and Cloud Composer
- Experience in security, identity, and access management
- Networking protocols such as TCP/IP and HTTPS
- Network security design, including segmentation, encryption, logging, and monitoring
- Network topologies, load balancing, and segmentation
- Python for REST APIs and microservices; design and development guidance (a minimal sketch follows below)
- Python with GCP: Cloud SQL/PostgreSQL, BigQuery
- Integration of Python APIs with front-end applications built on React JS
- Unit testing frameworks: Python (unittest, pytest), Java (JUnit, Spock, and Groovy)
- DevOps automation processes such as Jenkins and Docker deployments
- Code deployments on VMs; validating an overall solution from the perspective of infrastructure performance, scalability, security, and capacity, and creating effective mitigation plans
- Automation technologies: Terraform or Google Cloud Deployment Manager, Ansible
- Implementing solutions and processes to manage cloud costs
- Experience providing solutions for web applications; requirements and design knowledge
- React JS, Elastic Cache, GCP IAM, Managed Instance Groups, VMs, and GKE

Responsibilities:
- Own the end-to-end delivery of solutions, including developing, testing, and releasing Infrastructure as Code
- Translate business requirements/user stories into practical, scalable solutions that leverage the functionality and best practices of HSBC
- Execute technical feasibility assessments, solution estimations, and proposal development for moving identified workloads to GCP
- Design and implement secure, scalable, and innovative solutions to meet the bank's requirements
- Interact and influence across all organizational levels on technical or business solutions
- Create and own scaling, capacity planning, configuration management, and monitoring of processes and procedures
- Create, put into practice, and use cloud-native solutions
- Lead the adoption of new cloud technologies and establish best practices for them
- Establish technical strategy and architecture at the enterprise level
- Lead GCP cloud project delivery
- Collaborate with IT security to monitor cloud privacy; work with architecture, DevOps, data, and integration teams to ensure best practices are followed throughout cloud adoption
- Respond to technical issues and provide guidance to the technical team
- Certified Google Cloud Architect would be an add-on

Mandatory Skills: GCP Storage, GCP BigQuery, GCP Dataproc, GCP Vertex AI, GCP Spanner, GCP Dataprep, GCP Datastream, Google Analytics Hub, GCP Dataform, GCP Dataplex/Catalog, GCP Cloud Datastore/Firestore, GCP Data Fusion, GCP Pub/Sub, GCP Cloud SQL, GCP Cloud Composer, Google Looker, GCP Data Architecture, Google Cloud IAM, GCP Bigtable, GCP Dataflow
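For the "Python for REST APIs and microservices" requirement, a minimal hedged FastAPI sketch; the service, model, and route names are invented:

```python
# pip install fastapi uvicorn
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="accounts-service")  # service name is invented


class Account(BaseModel):
    id: int
    owner: str


# In-memory stand-in for a Cloud SQL/PostgreSQL table.
ACCOUNTS: dict[int, Account] = {1: Account(id=1, owner="alice")}


@app.get("/accounts/{account_id}")
def get_account(account_id: int) -> Account:
    account = ACCOUNTS.get(account_id)
    if account is None:
        raise HTTPException(status_code=404, detail="account not found")
    return account

# Run with: uvicorn main:app --reload
```

A React JS front end would consume this endpoint directly, which is the integration pattern the posting describes.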
Posted 1 month ago
6.0 - 10.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
You should have 6-10 years of experience in development, specifically in Java/J2EE, with strong knowledge of core Java. Additionally, you must be proficient in Spring frameworks, particularly Spring MVC, Spring Boot, and JPA + Hibernate. It is essential to have hands-on experience with microservice technology, including development of RESTful and SOAP web services. A good understanding of Oracle DB is required. Your communication skills, especially when interacting with clients, should be excellent. Experience with build tools like Maven, deployment, and troubleshooting issues is necessary. Knowledge of CI/CD tools such as Jenkins and experience with Git or similar source control tools is expected. You should also be familiar with Agile/Scrum software development methodologies using tools like Jira, Confluence, and Bitbucket, and have performed requirement analysis. It would be beneficial to have knowledge of frontend stacks like React or Angular, as well as frontend and backend API integration. Experience with AWS, CI/CD best practices, and designing security reference architectures for AWS infrastructure applications is advantageous. You should possess good verbal and written communication skills, the ability to multitask in a fast-paced environment, and be highly organized and detail-oriented. Awareness of common information security principles and practices is required. TELUS International is committed to creating a diverse and inclusive workplace and is an equal opportunity employer. All employment decisions are based on qualifications, merits, competence, and performance without regard to any characteristic related to diversity.
Posted 1 month ago
5.0 - 10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
It's fun to work in a company where people truly BELIEVE in what they are doing! We're committed to bringing passion and customer focus to the business.

Location - Open

Position: Data Engineer (GCP) Technology

If you are an extraordinary developer who loves to push the boundaries to solve complex business problems using creative solutions, then we wish to talk with you. As an Analytics Technology Engineer, you will work on the Technology team that helps deliver our Data Engineering offerings at large scale to our Fortune clients worldwide. The role is responsible for innovating, building, and maintaining technology services.

Responsibilities:
- Be an integral part of large-scale client business development and delivery engagements
- Develop the software and systems needed for end-to-end execution on large projects
- Work across all phases of the SDLC, and use software engineering principles to build scaled solutions
- Build the knowledge base required to deliver increasingly complex technology projects
- Develop data pipelines (batch/streaming) and complex data transformations
- ETL orchestration and data migration (an Airflow DAG sketch follows below)
- Develop and maintain data warehouses / data lakes

Qualifications & Experience:
- A bachelor's degree in Computer Science or a related field with 5 to 10 years of technology experience

Desired Technical Skills:
- Data engineering and analytics on Google Cloud Platform; basic cloud computing concepts
- BigQuery, Google Cloud Storage, Cloud SQL, Pub/Sub, Dataflow, Cloud Composer, GCP Data Transfer, gcloud CLI
- Python, Google Cloud Python SDK, SQL
- Experience working with any NoSQL/columnar/MPP database
- Experience working with any ETL tool (Informatica/DataStage/Talend/Pentaho, etc.)
- Strong knowledge of database concepts and data modeling: RDBMS vs. NoSQL, OLTP vs. OLAP, MPP architecture

Other Desired Skills:
- Excellent communication and coordination skills
- Problem understanding, articulation, and solutioning
- Quick learner, adaptable to new technologies
- Ability to research and solve technical issues

Good to Have:
- Experience working with Apache Spark / Kafka
- Machine learning concepts
- Google Cloud Professional Data Engineer certification

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us! Not the right fit? Let us know you're interested in a future opportunity by clicking Introduce Yourself in the top-right corner of the page, or create an account to set up email alerts as new job postings become available that meet your interest!
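For the ETL orchestration responsibility, a minimal hedged Cloud Composer/Airflow DAG sketch; the DAG id, schedule, and task bodies are placeholders, and it assumes Airflow 2.4+ (where the `schedule` parameter is available):

```python
# pip install apache-airflow
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pulling rows from the source system")  # placeholder step


def load():
    print("loading rows into the warehouse")      # placeholder step


with DAG(
    dag_id="daily_orders_pipeline",   # invented name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task   # load runs only after extract succeeds
```

Dropped into a Cloud Composer environment's DAGs bucket, a file like this is picked up and scheduled automatically.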
Posted 1 month ago
3.0 - 5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Overview of 66degrees
66degrees is a leading consulting and professional services company specializing in developing AI-focused, data-led solutions leveraging the latest advancements in cloud technology. With our unmatched engineering capabilities and vast industry experience, we help the world's leading brands transform their business challenges into opportunities and shape the future of work. At 66degrees, we believe in embracing the challenge and winning together. These values not only guide us in achieving our goals as a company but also for our people. We are dedicated to creating a significant impact for our employees by fostering a culture that sparks innovation and supports professional and personal growth along the way.

Overview of Role
As a Data Engineer specializing in AI/ML, you'll be instrumental in designing, building, and maintaining the data infrastructure crucial for training, deploying, and serving our advanced AI and Machine Learning models. You'll work closely with Data Scientists, ML Engineers, and Cloud Architects to ensure data is accessible, reliable, and optimized for high-performance AI/ML workloads, primarily within the Google Cloud ecosystem.

Responsibilities
- Data pipeline development: design, build, and maintain robust, scalable, and efficient ETL/ELT data pipelines to ingest, transform, and load data from various sources into data lakes and data warehouses, specifically optimized for AI/ML consumption.
- AI/ML data infrastructure: architect and implement the underlying data infrastructure required for machine learning model training, serving, and monitoring within GCP environments.
- Google Cloud ecosystem: leverage a broad range of Google Cloud Platform (GCP) data services including BigQuery, Dataflow, Dataproc, Cloud Storage, Pub/Sub, Vertex AI, Composer (Airflow), and Cloud SQL.
- Data quality & governance: implement best practices for data quality, data governance, data lineage, and data security to ensure the reliability and integrity of AI/ML datasets.
- Performance optimization: optimize data pipelines and storage solutions for performance, cost-efficiency, and scalability, particularly for large-scale AI/ML data processing.
- Collaboration with AI/ML teams: work closely with Data Scientists and ML Engineers to understand their data needs, prepare datasets for model training, and assist in deploying models into production.
- Automation & MLOps support: contribute to the automation of data pipelines and support MLOps initiatives, ensuring seamless integration from data ingestion to model deployment and monitoring.
- Troubleshooting & support: troubleshoot and resolve data-related issues within the AI/ML ecosystem, ensuring data availability and pipeline health.
- Documentation: create and maintain comprehensive documentation for data architectures, pipelines, and data models.

Qualifications
- 1-2+ years of experience in Data Engineering, with at least 2-3 years directly focused on building data pipelines for AI/ML workloads.
- Deep, hands-on experience with core GCP data services such as BigQuery, Dataflow, Dataproc, Cloud Storage, Pub/Sub, and Composer/Airflow.
- Strong proficiency in at least one relevant programming language for data engineering (Python is highly preferred); strong SQL skills for complex data manipulation, querying, and optimization.
- Solid understanding of data warehousing concepts, data modeling (dimensional, 3NF), and schema design for analytical and AI/ML purposes.
- Proven experience designing, building, and optimizing large-scale ETL/ELT processes.
- Familiarity with big data processing frameworks (e.g., Apache Spark, Hadoop) and concepts.
- Exceptional analytical and problem-solving skills, with the ability to design solutions for complex data challenges.
- Excellent verbal and written communication skills, capable of explaining complex technical concepts to both technical and non-technical stakeholders.

66degrees is an Equal Opportunity employer. All qualified applicants will receive consideration for employment without regard to actual or perceived race, color, religion, sex, gender, gender identity, national origin, age, weight, height, marital status, sexual orientation, veteran status, disability status or other legally protected class.
Posted 1 month ago
3.0 - 7.0 years
13 - 18 Lacs
Pune
Work from Office
About the Role:

Job Title: Technical Specialist - Big Data (PySpark) Developer
Location: Pune, India

Role Description
This role is for an engineer responsible for the design, development, and unit testing of software applications. The candidate is expected to ensure that good-quality, maintainable, scalable, and high-performing software applications are delivered to users in an Agile development environment. The candidate should come from a strong technological background, with good working experience in Python and Spark technology, should be hands-on and able to work independently with minimal technical/tool guidance, and should be able to technically guide and mentor junior resources in the team. As a developer, you will bring extensive design and development skills to reinforce the group of developers within the team, and will extensively use and apply continuous integration tools and practices in the context of Deutsche Bank's digitalization journey.

What we'll offer you
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities
- Design and discuss your own solution for addressing user stories and tasks
- Develop, unit-test, integrate, deploy, maintain, and improve software
- Perform peer code review
- Actively participate in sprint activities and ceremonies, e.g., daily stand-up/scrum meeting, sprint planning, retrospectives, etc.
- Apply continuous integration best practices in general (SCM, build automation, unit testing, dependency management)
- Collaborate with other team members to achieve the sprint objectives
- Report progress and update Agile team management tools (JIRA/Confluence)
- Manage individual task priorities and deliverables
- Take responsibility for the quality of the solutions you provide
- Contribute to planning and continuous improvement activities; support the PO, ITAO, developers, and Scrum Master

Your skills and experience
- Engineer with good development experience on a Big Data platform for at least 5 years
- Hands-on experience in Spark (Hive, Impala); see the PySpark sketch below
- Hands-on experience in the Python programming language
- Preferably, experience in BigQuery, Dataproc, Composer, Terraform, GKE, Cloud SQL, and Cloud Functions
- Experience in set-up, maintenance, and ongoing development of continuous build/integration infrastructure as part of DevOps
- Create and maintain fully automated CI build processes and write build and deployment scripts
- Experience with development platforms: OpenShift/Kubernetes/Docker configuration and deployment with DevOps tools, e.g., Git, TeamCity, Maven, SONAR
- Good knowledge of core SDLC processes and tools such as HP ALM, Jira, ServiceNow
- Strong analytical skills
- Proficient communication skills; fluent in English (written/verbal)
- Ability to work in virtual teams and in matrixed organizations
- Excellent team player; open-minded and willing to learn business and technology
- Keeps pace with technical innovation and understands the relevant business area
- Ability to share information and transfer knowledge and expertise to team members
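As a compact illustration of the hands-on Spark requirement, a hedged PySpark sketch with inline placeholder data standing in for a Hive-backed table:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-demo").getOrCreate()

# Placeholder rows; in practice this would be spark.table("db.orders")
# or a read from BigQuery/Hive.
df = spark.createDataFrame(
    [("alice", 120.0), ("bob", 75.5), ("alice", 30.0)],
    ["customer", "amount"],
)

totals = (
    df.groupBy("customer")
      .agg(F.sum("amount").alias("total_amount"))
      .orderBy(F.col("total_amount").desc())
)
totals.show()

spark.stop()
```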
Posted 1 month ago
8.0 - 12.0 years
30 - 35 Lacs
Pune
Work from Office
About the Role:

Job Title: Senior Engineer PD, AVP
Location: Pune, India

Role Description
Our team is part of the area Technology, Data, and Innovation (TDI) Private Bank. Within TDI, Partner data is the central client reference data system in Germany. As a core banking system, many banking processes and applications are integrated with it and communicate via more than 2,000 interfaces. From a technical perspective, we focus on mainframe but also build solutions on-premise and in the cloud, with RESTful services and an Angular frontend. Next to maintenance and the implementation of new CTB requirements, the content focus also lies on the regulatory and tax topics surrounding a partner/client. We are looking for a very motivated candidate for the Cloud Data Engineer area.

What we'll offer you
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities
- Implementation of the new project on GCP (Spark, Dataproc, Dataflow, BigQuery, Terraform, etc.) across the whole SDLC chain
- Support the migration of current functionalities to Google Cloud
- Ensure the stability of the application landscape and support software releases
- Support L3 topics and application governance
- Coding in the CTM area as part of an agile team (Java, Scala, Spring Boot)

Your skills and experience
- Experience with databases (BigQuery, Cloud SQL, NoSQL, Hive, etc.) and development, preferably with Big Data and GCP technologies
- Strong understanding of the Data Mesh approach and integration patterns
- Understanding of Party data and integration with Product data
- Architectural skills for big data solutions, especially interface architecture, enabling a fast start
- Experience in at least: Spark, Java, Scala, and Python; Maven; Artifactory; the Hadoop ecosystem; GitHub Actions; GitHub; Terraform scripting
- Knowledge of customer reference data, customer opening processes, and preferably regulatory topics around know-your-customer processes
- Works well in teams but also independently; constructive and target-oriented
- Good English skills, able to communicate professionally as well as informally in small talk with the team

About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm

We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative, and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair, and inclusive work environment.
Posted 1 month ago
6.0 - 10.0 years
0 Lacs
Haryana
On-site
Join GlobalLogic as a valuable member of the team working on a significant software project for a world-class company that provides M2M / IoT 4G/5G modules to industries such as automotive, healthcare, and logistics. Your engagement will involve contributing to the development of the end-user modules' firmware, implementing new features, maintaining compatibility with the latest telecommunication and industry standards, and analyzing and estimating customer requirements.

Requirements
- BA/BS degree in Computer Science, Mathematics, or a related technical field, or equivalent practical experience
- Proficiency in Cloud SQL and Cloud Bigtable
- Experience with Dataflow, BigQuery, Dataproc, Datalab, Dataprep, Pub/Sub, and Genomics
- Familiarity with Google Transfer Appliance, Cloud Storage Transfer Service, and BigQuery Data Transfer
- Knowledge of data processing software (such as Hadoop, Kafka, Spark, Pig, Hive) and data processing algorithms (MapReduce, Flume)
- Previous experience working with technical customers
- Proficiency in writing software in languages like Java or Python
- 6-10 years of relevant consulting, industry, or technology experience
- Strong problem-solving and troubleshooting skills
- Excellent communication skills

Job Responsibilities
- Hands-on experience working with data warehouses, including technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools
- Experience in technical consulting
- Proficiency in architecting and developing software or internet-scale Big Data solutions in virtualized environments like Google Cloud Platform (mandatory) and AWS/Azure (good to have)
- Familiarity with big data, information retrieval, data mining, machine learning, and building high-availability applications with modern web technologies
- Working knowledge of ITIL and/or agile methodologies
- Google Data Engineer certification

What We Offer
- Culture of caring: we prioritize a culture of caring, where people come first, fostering an inclusive environment of acceptance and belonging.
- Learning and development: commitment to continuous learning and growth, offering various programs, training curricula, and hands-on opportunities for personal and professional advancement.
- Interesting & meaningful work: engage in impactful projects that allow for creative problem-solving and exploration of new solutions.
- Balance and flexibility: embrace work-life balance with diverse career areas, roles, and work arrangements to support personal well-being.
- High-trust organization: join a high-trust organization with a focus on integrity, trustworthiness, and ethical practices.

About GlobalLogic
GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner known for collaborating with forward-thinking companies to create innovative digital products and experiences. Join the team in transforming businesses and industries through intelligent products, platforms, and services, contributing to cutting-edge solutions that shape the world today.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Thiruvananthapuram, Kerala
On-site
As a Google Cloud Engineer at our company, you will play a crucial role in designing, building, deploying, and maintaining our cloud infrastructure and applications on Google Cloud Platform (GCP). Your collaboration with development, operations, and security teams will ensure that our cloud environment is scalable, secure, highly available, and cost-optimized. If you are enthusiastic about cloud-native technologies, automation, and overcoming intricate infrastructure challenges, we welcome you to apply.

Your responsibilities will include:
- Designing, implementing, and managing robust, scalable, and secure cloud infrastructure on GCP utilizing Infrastructure as Code (IaC) tools like Terraform.
- Deploying, configuring, and managing core GCP services such as Compute Engine, Kubernetes Engine (GKE), Cloud SQL, Cloud Storage, Cloud Functions, BigQuery, Pub/Sub, and networking components.
- Developing and maintaining CI/CD pipelines for automated deployment and release management using various tools.
- Implementing and enforcing security best practices within the GCP environment, including IAM, network security, data encryption, and compliance adherence.
- Monitoring cloud infrastructure and application performance, identifying bottlenecks, and implementing optimization solutions.
- Troubleshooting and resolving complex infrastructure and application issues in production and non-production environments.
- Collaborating with development teams to ensure cloud-native deployment, scalability, and resilience of applications.
- Participating in on-call rotations for critical incident response and timely resolution of production issues.
- Creating and maintaining comprehensive documentation for cloud architecture, configurations, and operational procedures.
- Keeping up to date with new GCP services, features, and industry best practices to propose and implement improvements.
- Contributing to cost optimization efforts by identifying and implementing efficiencies in cloud resource utilization.

We require you to have:
- A Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.
- 6+ years of experience with C#, .NET Core, .NET Framework, MVC, Web API, Entity Framework, and SQL Server.
- 3+ years of experience with cloud platforms, preferably GCP, including designing and deploying cloud-native applications.
- 3+ years of experience with source code management, CI/CD pipelines, and Infrastructure as Code.
- Strong experience with JavaScript and a modern JavaScript framework, with Vue.js preferred.
- Proven leadership and mentoring skills with development teams.
- Strong understanding of microservices architecture and serverless computing.
- Experience with relational databases like SQL Server and PostgreSQL.
- Excellent problem-solving, analytical, and communication skills, along with Agile/Scrum environment experience.

What can make you stand out:
- GCP Cloud certification.
- UI development experience with HTML, JavaScript, Angular, and Bootstrap.
- Agile environment experience with Scrum, XP.
- Relational database experience with SQL Server, PostgreSQL.
- Proficiency in Atlassian tools like JIRA, Confluence, and GitHub.
- Working knowledge of Python and exceptional problem-solving and analytical abilities, along with strong teamwork skills.
Posted 1 month ago
10.0 - 14.0 years
0 Lacs
kochi, kerala
On-site
Joining Gadgeon offers a dynamic and rewarding career experience that fosters both personal and professional growth. Our collaborative culture encourages innovation, empowering team members to contribute their ideas and expertise to cutting-edge projects.

We are seeking a Principal Data Engineer with over 10 years of experience. In this high-impact role, you will be responsible for defining and driving the data architecture strategy, guiding technical teams, and ensuring robust, scalable, and secure database systems that support business-critical applications. You will work autonomously and directly with stakeholders across engineering, infrastructure, and product leadership, influencing decisions related to GCP Cloud SQL, GKE, and relational and non-relational databases (including Cassandra and Couchbase), while also managing complex database challenges.

Key Responsibilities:
- Define and lead the overall data architecture strategy across cloud and hybrid environments, in alignment with business and technical goals.
- Collaborate directly with leadership to drive key architectural decisions.
- Provide deep, hands-on expertise in GCP Cloud SQL and containerized database deployment using Google Kubernetes Engine (GKE).
- Architect and optimize relational database solutions using PostgreSQL, MySQL, and MS SQL Server.
- Design and implement NoSQL data solutions including Cassandra, Couchbase, and other scalable data stores.
- Lead end-to-end planning and execution of complex data engineering tasks, including schema design, performance tuning, high availability, and disaster recovery.
- Handle and coordinate P1 incident responses, conducting root cause analysis and designing long-term mitigation plans.
- Guide and mentor internal database teams, fostering technical quality and architectural consistency.
- Automate database provisioning, scaling, and maintenance using modern DevOps and infrastructure-as-code tools.
- Provide architectural documentation and recommendations for scalability, security, compliance, and cost efficiency.

Skills:
- 10+ years of experience in data engineering and architecture, including hands-on cloud and database infrastructure leadership.
- Proven expertise in Google Cloud Platform (GCP), especially Cloud SQL and GKE.
- Strong hands-on background in relational databases: PostgreSQL, MySQL, MS SQL Server.
- Deep knowledge and implementation experience with NoSQL technologies: Cassandra, Couchbase (optionally MongoDB, Redis, etc.).
- Experience in independent or consulting roles, capable of working with minimal supervision and maximum ownership.
- Proven ability to manage and resolve high-severity incidents and communicate resolution paths to leadership.
- Strong scripting and automation skills (e.g., Python, Terraform, Bash).
- Familiarity with CI/CD, container orchestration, and observability practices for modern data infrastructure.
- Background in supporting data compliance, governance, and cost optimization in cloud environments.

Why Join Us:
- Work with a team that values deep security thinking over checkbox compliance.
- Experiment with tools, techniques, and platforms to continuously simulate, validate, and improve.
- Collaborate across teams that respect security as a partner, not a blocker.
- Build scalable security strategies in a growing, tech-driven organization.

Experience:
- 10+ years.
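As a flavor of the NoSQL side of the role, below is a minimal, hedged sketch of a health-check query against Cassandra using the DataStax Python driver — illustrative only; the contact point and keyspace are hypothetical.

```python
# Minimal sketch: a health-check style query against Cassandra using the
# DataStax Python driver (`pip install cassandra-driver`). The contact
# point and keyspace below are hypothetical.
from cassandra.cluster import Cluster

cluster = Cluster(["10.0.0.5"])   # hypothetical contact point
session = cluster.connect("ops")  # hypothetical keyspace

# system.local is a built-in table; release_version is a standard column.
rows = session.execute("SELECT release_version FROM system.local")
print("Cassandra release:", rows.one().release_version)

cluster.shutdown()
```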
Posted 1 month ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
As a GCP Senior Data Engineer/Architect, you will play a crucial role in our team by designing, developing, and implementing robust and scalable data solutions on the Google Cloud Platform (GCP). Collaborating closely with Architects and Business Analysts, especially for our US clients, you will translate data requirements into effective technical solutions.

Your responsibilities will include designing and implementing scalable data warehouse and data lake solutions, orchestrating complex data pipelines, leading cloud data lake implementation projects, participating in cloud migration projects, developing containerized applications, optimizing SQL queries, writing automation scripts in Python, and utilizing various GCP data services such as BigQuery, Bigtable, and Cloud SQL.

Your expertise in data warehouse and data lake design and implementation, experience in data pipeline development and tuning, hands-on involvement in cloud migration and data lake projects, proficiency in Docker and GKE, strong SQL and Python scripting skills, and familiarity with GCP services like BigQuery, Cloud SQL, Dataflow, and Composer will be essential for this role. Additionally, knowledge of data governance principles, experience with dbt, and the ability to work effectively within a team and adapt to project needs are highly valued. Strong communication skills, the willingness to work in UK shift timings, and the openness to giving and receiving feedback are important traits that will contribute to your success in this role.
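Since the role leans on Composer for orchestration, here is a minimal, hedged sketch of a Cloud Composer (Airflow) DAG running a daily BigQuery job — illustrative only. It assumes the Google provider package and Airflow 2.4+ (older versions use schedule_interval); the DAG name, project, dataset, and SQL are hypothetical.

```python
# Minimal sketch: a Cloud Composer (Airflow) DAG that runs a daily
# BigQuery rollup. Assumes apache-airflow-providers-google is installed;
# all names and the SQL below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_sales_rollup",       # hypothetical pipeline name
    schedule="@daily",                 # Airflow 2.4+; older: schedule_interval
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_sales",
        configuration={
            "query": {
                "query": "SELECT region, SUM(amount) AS total "
                         "FROM `example-project.sales.orders` GROUP BY region",
                "useLegacySql": False,
            }
        },
    )
```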
Posted 1 month ago
2.0 - 5.0 years
13 - 17 Lacs
Bengaluru
Work from Office
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat.

In your role, you will be responsible for working across multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer, etc. You must have Python and SQL work experience, be proactive and collaborative, be able to respond to critical situations, and be able to analyse data against functional business requirements while interfacing directly with the customer.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer.
- An ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- End-to-end functional knowledge of the data pipeline/transformation implementations the candidate has delivered, including the purpose/KPIs for which each data transformation was done.

Preferred technical and professional experience:
- Experience with AEM Core Technologies: OSGi Services, Apache Sling, Granite Framework, Java Content Repository API, Java 8+, localization.
- Familiarity with build tools such as Jenkins and Maven; knowledge of version control tools, especially Git.
- Knowledge of patterns and good practices to design and develop quality, clean code.
- Knowledge of HTML, CSS, JavaScript, and jQuery.
- Familiarity with task management, bug tracking, and collaboration tools like JIRA and Confluence.
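To illustrate the GCS side of the service list, here is a minimal, hedged sketch of listing and inspecting objects in a Cloud Storage bucket — illustrative only; it assumes the google-cloud-storage client library, and the bucket, prefix, and file names are hypothetical.

```python
# Minimal sketch: listing and reading objects in a GCS bucket with the
# google-cloud-storage client (`pip install google-cloud-storage`).
# Bucket, prefix, and object names are hypothetical.
from google.cloud import storage

client = storage.Client(project="example-project")
bucket = client.bucket("example-landing-zone")

# Enumerate objects under a prefix, e.g. a daily landing folder.
for blob in client.list_blobs(bucket, prefix="incoming/"):
    print(blob.name, blob.size)

# Pull one object's contents as text for a quick inspection.
text = bucket.blob("incoming/sample.csv").download_as_text()
print(text[:200])
```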
Posted 1 month ago
4.0 - 8.0 years
20 - 35 Lacs
Pune, Gurugram, Bengaluru
Hybrid
Salary: 20 to 35 LPA
Exp: 3 to 7 years
Location: Gurgaon/Pune/Bengaluru
Notice: Immediate to 30 days

Job Profile:
Experienced Data Engineer with a strong foundation in designing, building, and maintaining scalable data pipelines and architectures. Skilled in transforming raw data into clean, structured formats for analytics and business intelligence. Proficient in modern data tools and technologies such as SQL, T-SQL, Python, Databricks, and cloud platforms (Azure). Adept at data wrangling, modeling, ETL/ELT development, and ensuring data quality, integrity, and security. Collaborative team player with a track record of enabling data-driven decision-making across business units.

As a Data Engineer, the candidate will work on assignments for one of our Utilities clients. Collaborating with cross-functional teams and stakeholders involves gathering data requirements, aligning business goals, and translating them into scalable data solutions. The role includes working closely with data analysts, scientists, and business users to understand needs, designing robust data pipelines, and ensuring data is accessible, reliable, and well-documented. Regular communication, iterative feedback, and joint problem-solving are key to delivering high-impact, data-driven outcomes that support organizational objectives. This position requires a proven track record of transforming processes and driving customer value and cost savings, with experience running end-to-end analytics for large-scale organizations.

- Design, build, and maintain scalable data pipelines to support analytics, reporting, and advanced modeling needs.
- Collaborate with consultants, analysts, and clients to understand data requirements and translate them into effective data solutions.
- Ensure data accuracy, quality, and integrity through validation, cleansing, and transformation processes.
- Develop and optimize data models, ETL workflows, and database architectures across cloud and on-premises environments.
- Support data-driven decision-making by delivering reliable, well-structured datasets and enabling self-service analytics.
- Databricks provides seamless integration with cloud platforms (Azure), making it easy to build and deploy end-to-end data pipelines in the cloud, with scalable clusters for handling large datasets and complex computations while optimizing performance and cost. A sketch of a typical Databricks-style transformation follows this posting.

Must have:
- Client engagement experience and collaboration with cross-functional teams.
- Data engineering background in Databricks.
- Capable of working effectively as an individual contributor or in collaborative team environments.
- Effective communication and thought leadership with a proven record.

Candidate Profile:
- Bachelor's/Master's degree in economics, mathematics, computer science/engineering, operations research, or related analytics areas.
- 3+ years of experience in data engineering.
- Hands-on experience with SQL, Python, Databricks, and cloud platforms like Azure.
- Prior experience managing and delivering end-to-end projects.
- Outstanding written and verbal communication skills.
- Able to work in a fast-paced, continuously evolving environment and ready to take up uphill challenges.
- Able to understand cross-cultural differences and work with clients across the globe.
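A minimal, hedged sketch of the kind of PySpark transformation a Databricks pipeline might run — raw events cleaned, deduplicated, and written out partitioned. Illustrative only; the mount paths and column names are hypothetical.

```python
# Minimal sketch: a PySpark job of the sort a Databricks pipeline runs -
# raw JSON events to a clean, deduplicated, partitioned Parquet table.
# Paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("clean_events").getOrCreate()

raw = spark.read.json("/mnt/raw/events/")  # hypothetical mount path

clean = (
    raw.dropDuplicates(["event_id"])                 # drop replayed events
       .withColumn("event_date", F.to_date("event_ts"))
       .filter(F.col("amount") > 0)                  # discard invalid rows
)

clean.write.mode("overwrite").partitionBy("event_date").parquet(
    "/mnt/curated/events/"
)
```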
Posted 1 month ago
15.0 - 20.0 years
4 - 8 Lacs
Chennai
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Agile Project Management
Good-to-have skills: Apache Spark
Minimum 7.5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary:
As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in ensuring that data is accessible, reliable, and ready for analysis, contributing to informed decision-making across the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering practices.
- Continuously evaluate and improve data processes to enhance efficiency and effectiveness.

Professional & Technical Skills:
- Must-have: Proficiency in Agile Project Management.
- Good to have: Experience with Apache Spark, Google Cloud SQL, Python (programming language).
- Strong understanding of data pipeline architecture and design principles.
- Experience with ETL tools and data integration techniques.
- Familiarity with data quality frameworks and best practices.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Agile Project Management.
- This position is based in Chennai (mandatory).
- 15 years of full-time education is required.
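Since the role stresses data quality in ETL, here is a minimal, hedged sketch of a data-quality gate a pipeline might enforce before loading downstream — illustrative only; the staging path and column names are hypothetical.

```python
# Minimal sketch: a lightweight data-quality gate for an ETL step -
# null-key and row-count checks in PySpark before loading downstream.
# The staging path and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq_gate").getOrCreate()
df = spark.read.parquet("/mnt/staging/customers/")  # hypothetical path

row_count = df.count()
null_keys = df.filter(F.col("customer_id").isNull()).count()

# Fail loudly so the orchestrator marks the run as failed.
if row_count == 0 or null_keys > 0:
    raise ValueError(
        f"DQ gate failed: rows={row_count}, null customer_id={null_keys}"
    )
print("DQ gate passed; safe to load downstream.")
```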
Posted 1 month ago
3.0 - 8.0 years
6 - 13 Lacs
Mohali
Work from Office
Job Profile: Senior SQL Cloud Database Administrator (DBA)

Responsibilities:
- Manage, optimize, and secure our cloud-based SQL databases, ensuring high availability and performance.
- Design and implement scalable and secure SQL database structures in AWS and GCP environments.
- Plan and execute data migration from on-premises or legacy systems to AWS and GCP cloud platforms.
- Monitor database performance, identify bottlenecks, and fine-tune queries and indexes for optimal efficiency.
- Implement and manage database security protocols, including encryption, access controls, and compliance with regulations.
- Develop and maintain robust backup and recovery strategies to ensure data integrity and availability (see the automation sketch after this posting).
- Perform regular maintenance tasks such as patching, updates, and troubleshooting database issues.
- Work closely with developers, DevOps, and data engineers to support application development and deployment.
- Ensure data quality, consistency, and governance across distributed systems.
- Keep up with emerging technologies, cloud services, and best practices in database management.

Required Skills:
- Proven experience as a SQL Database Administrator with expertise in AWS and GCP cloud platforms.
- Strong knowledge of SQL database design, implementation, and optimization.
- Experience with data migration to cloud environments.
- Proficiency in performance monitoring and query optimization.
- Knowledge of database security protocols and compliance regulations.
- Familiarity with backup and disaster recovery strategies.
- Excellent troubleshooting and problem-solving skills.
- Strong collaboration and communication skills.
- Knowledge of DevOps integration.
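A minimal, hedged sketch of the backup automation this role routinely scripts — a PostgreSQL dump driven from Python. Illustrative only; the host, user, database, and backup path are hypothetical, and credentials are assumed to come from the environment (e.g., PGPASSWORD or IAM-based auth).

```python
# Minimal sketch: automating a PostgreSQL backup with pg_dump from
# Python. Host, user, database, and paths are hypothetical; credentials
# are assumed to come from the environment.
import subprocess
from datetime import datetime, timezone

stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
dump_file = f"/backups/appdb-{stamp}.dump"

subprocess.run(
    [
        "pg_dump",
        "--host", "10.0.0.10",    # hypothetical Cloud SQL private IP
        "--username", "backup_svc",
        "--format", "custom",     # compressed, pg_restore-friendly format
        "--file", dump_file,
        "appdb",                  # hypothetical database name
    ],
    check=True,  # raise if pg_dump exits non-zero
)
print(f"Backup written to {dump_file}")
```

In practice a job like this would run on a schedule, ship the dump to object storage, and be paired with periodic restore drills.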
Posted 1 month ago