1.0 - 4.0 years
2 - 6 Lacs
Gurugram
Work from Office
- Desired technical and interpersonal skills include, but are not limited to:
1. BE with hands-on experience in Cisco technologies
2. CCNA and/or CCNP Routing & Switching certifications (preferred)
3. Strong communication skills
4. Very good understanding of Cisco architectures (EN/Sec/SP) and solutions
5. Desire and ability to learn new technologies and solutions
Specialized experience requirements:
- 6+ years of experience on any one EN/Sec/SP architecture
- Understanding of, and preferably hands-on experience with, the detailed sub-technologies in that architecture
- Ability to understand and capture technical as well as business requirements
- Self-starter with excellent presentation and consultative skills
- Strong analytical skills and strong business communication, both written and verbal
Posted 3 weeks ago
4.0 - 9.0 years
11 - 15 Lacs
Bengaluru
Work from Office
Project description
You will join an Integration Centre of Excellence within a financial organisation. Our IT teams design and develop modern systems and customer tools, handle transactions, and provide integration support to core banking and online channel applications. The team is responsible for the operation and continuous development of an integration layer built on the IBM API Connect platform. The systems implement business processes spanning in-house applications, third-party providers, and regulatory authorities.
Responsibilities
- Develop integration applications based on user requirements on our existing IIB (IBM Integration Bus) and ACE (App Connect Enterprise) platforms
- Analyse and design existing processes and systems
- Monitor and maintain the technical environment
- Contribute collaboratively to the team and understand user needs in order to recommend and develop sound technical solutions
- Offer good communication skills, as you will be expected to build strong and effective relationships with other teams
Skills
Must have
- 4+ years of experience in software development, with proficiency in designing, modelling, and developing enterprise applications in an integration layer using IBM Integration Bus (IIB), IBM App Connect Enterprise (ACE), WebSphere MQ, WebSphere DataPower, IBM API Connect, and IBM Cloud Pak for Integration
- Good knowledge of Core Java debugging and Java-based IIB APIs
- Strong communication skills and high motivation to learn new technology, with excellent problem-solving skills
- Excellent knowledge of REST, JSON, and XML
- Extensive experience with software development methodologies such as Waterfall, Scrum, and Agile
- Experience developing microservices and RESTful web services/APIs that interact with a wide range of systems
- Extensive experience migrating on-prem services developed on the IBM Integration Bus, IBM DataPower, and IBM API Connect platforms
- Implemented robust security patterns using authentication protocols such as two-way SSL, OAuth, SAML, Kerberos, LDAP, TLS, and JWT
- Extensive experience with SOA, web services, SOAP, WSDL, WS-Security, XSLT stylesheets, XML Schema, and LDAP
- Conversant with all phases of the SDLC: requirements gathering, analysis, design, development, implementation, and testing
- In-depth knowledge of designing, developing, and testing EAI applications
- Extensive experience with point-to-point and pub/sub messaging features
- Experience tuning applications for optimal performance
- Able to work collaboratively with developers, testers, technical support engineers, and other team members to enhance overall software product quality
Nice to have
- ITIL knowledge and experience with ticketing systems (ServiceNow, Tivoli, etc.)
- Banking domain knowledge
- Knowledge of Microsoft Azure and cloud concepts
Posted 3 weeks ago
3.0 - 7.0 years
6 - 16 Lacs
Chennai
Hybrid
Greetings from Getronics! We have permanent opportunities for GCP Data Engineers in Chennai. Hope you are doing well! This is Jogeshwari from the Getronics Talent Acquisition team. Please find the company profile and job description below. If interested, please share your updated resume, a recent professional photograph, and Aadhaar proof at the earliest to jogeshwari.k@getronics.com.
Company: Getronics (permanent role)
Client: Automobile industry
Experience required: 3+ years in IT and a minimum of 2+ years in GCP data engineering
Location: Chennai
Skills required:
- GCP Data Engineer: Hadoop, Spark/PySpark, and Google Cloud Platform services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine
- 6+ years of professional experience in data engineering, data product development, and software product launches
- 4+ years of cloud data engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines using: data warehouses like Google BigQuery; workflow orchestration tools like Airflow (see the DAG sketch below); relational database management systems like MySQL, PostgreSQL, and SQL Server; real-time data streaming platforms like Apache Kafka and GCP Pub/Sub
LOOKING FOR IMMEDIATE TO 30 DAYS NOTICE CANDIDATES ONLY.
Regards, Jogeshwari, Senior Specialist
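To illustrate the Airflow-orchestrated BigQuery batch pattern this posting describes, here is a minimal sketch. It is not code from Getronics or the client; the project, dataset, table, and DAG names (my-project, raw.orders, mart.daily_revenue, daily_sales_aggregate) are hypothetical placeholders.

```python
# Minimal Airflow DAG running a daily BigQuery aggregation job.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_sales_aggregate",   # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    aggregate = BigQueryInsertJobOperator(
        task_id="aggregate_to_mart",
        configuration={
            "query": {
                "query": """
                    SELECT store_id, DATE(order_ts) AS day, SUM(amount) AS revenue
                    FROM `my-project.raw.orders`
                    GROUP BY store_id, day
                """,
                "destinationTable": {
                    "projectId": "my-project",
                    "datasetId": "mart",
                    "tableId": "daily_revenue",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )
```

The configuration dict follows the standard BigQuery insert-job format (query, destinationTable, writeDisposition), so the same structure works whether the operator is triggered on a schedule or backfilled.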
Posted 3 weeks ago
1.0 - 3.0 years
3 - 6 Lacs
Mumbai, Mangaluru
Hybrid
- 6 months to 3 years of IT experience
- Knowledge of BigQuery, SQL, or similar tools
- Aware of ETL and data warehouse concepts
- Good oral and written communication skills
- Great team player, able to work efficiently with minimal supervision
- Good knowledge of Java or Python for data cleansing
Preferred:
- Good communication and problem-solving skills
- Experience with Spring Boot would be an added advantage
- Apache Beam development with Google Cloud Bigtable and Google BigQuery is desirable
- Experience with Google Cloud Platform (GCP)
- Skills in writing batch and stream processing jobs using the Apache Beam framework (Dataflow); see the sketch after this listing
- Knowledge of microservices, Pub/Sub, Cloud Run, Cloud Functions
Roles and responsibilities
- Develop high-performance, scalable solutions on GCP that extract, transform, and load big data
- Design and build production-grade data solutions, from ingestion to consumption, using Java/Python
- Design and optimize data models on GCP using data stores such as BigQuery
- Optimize data pipelines for performance and cost in large-scale data lakes
- Write complex, highly optimized queries across large data sets and create data processing layers
- Interact closely with data engineers to identify the right tools for delivering product features by performing POCs
- Collaborate with business stakeholders, BAs, and other data/ML engineers
- Research new use cases for existing data
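The Apache Beam (Dataflow) requirement above refers to pipelines like the following minimal streaming sketch, which reads JSON events from Pub/Sub and appends them to BigQuery. The topic, table, and schema are hypothetical placeholders, not details from this posting.

```python
# Minimal Apache Beam streaming pipeline: Pub/Sub -> parse JSON -> BigQuery.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions(
        streaming=True,
        # For Dataflow you would also set runner="DataflowRunner",
        # plus project, region, and temp_location.
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/events")
            | "Parse" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                schema="user_id:STRING,event:STRING,ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```

Run locally with the DirectRunner for testing; the same code becomes a batch job by swapping ReadFromPubSub for a bounded source such as ReadFromText.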
Posted 3 weeks ago
2.0 - 5.0 years
3 - 6 Lacs
Mumbai
Work from Office
Skill required: Tech for Operations - Automation Anywhere
Designation: App Automation Eng Analyst
Qualifications: BE
Years of experience: 3 to 5 years
About Accenture: Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com
What would you do? You will be part of the Technology for Operations team, which acts as a trusted advisor and partner to Accenture Operations. The team provides innovative and secure technologies to help clients build an intelligent operating model, driving exceptional results, and works closely with the sales, offering, and delivery teams to identify and build innovative solutions. Major sub-deals include AHO (Application Hosting Operations), ISMT (Infrastructure Management), and Intelligent Automation. Automate any process end-to-end with cognitive software robots using the robotic process automation software Automation Anywhere Enterprise.
What are we looking for?
- Adaptable and flexible
- Ability to perform under pressure
- Problem-solving skills
- Ability to establish strong client relationships
- Agility for quick learning
This request is raised for contract conversion.
Roles and responsibilities: In this role you are required to analyse and solve lower-complexity problems. Your day-to-day interaction is with peers within Accenture before updating supervisors. You may have limited exposure to clients and/or Accenture management. You will be given moderate-level instruction on daily work tasks and detailed instructions on new assignments. The decisions you make impact your own work and may impact the work of others. You will be an individual contributor as part of a team, with a focused scope of work. Please note that this role may require you to work in rotational shifts.
Qualification: BE
Posted 3 weeks ago
3.0 - 8.0 years
17 - 30 Lacs
Bengaluru
Work from Office
Key responsibilities:
- Design, develop, and maintain high-performance, scalable, and secure Java-based applications using Spring Boot, JPA, and Hibernate
- Work with both SQL (MySQL, PostgreSQL, Oracle) and NoSQL (MongoDB, Cassandra, DynamoDB) databases
- Implement and optimize RESTful APIs, microservices, and event-driven architectures
- Leverage cloud platforms (AWS/Azure/GCP) for deploying, monitoring, and scaling applications
- Integrate message queue systems (Pub/Sub, Kafka, RabbitMQ, SQS, Azure Service Bus) for asynchronous processing
- Contribute to data lake and data warehouse solutions, ensuring efficient data ingestion, processing, and retrieval
- Collaborate with frontend teams where needed (knowledge of React/Angular is a plus)
- Troubleshoot and debug complex issues, ensuring optimal performance and reliability
- Follow industry best practices in coding standards, security (OWASP), CI/CD, and DevOps methodologies
- Own the delivery of an integral piece of a system or application
Mandatory skills and qualifications:
- 3-5 years of hands-on experience in Java/J2EE, Spring Boot, Hibernate, JPA
- Strong expertise in SQL and NoSQL databases, query optimization, and data modeling
- Proven experience with cloud platforms (AWS/Azure/GCP): Lambda, EC2, S3, Azure Functions, GCP Cloud Run, etc.
- Knowledge of at least one message queue system (Pub/Sub, Kafka, RabbitMQ, ActiveMQ, SQS)
- Familiarity with data lakes (Delta Lake, Snowflake, Databricks) and data warehouses (BigQuery, Redshift, Synapse)
- Experience with Docker, Kubernetes, and CI/CD pipelines (Jenkins/GitHub Actions/Azure DevOps)
- Strong problem-solving, debugging, and performance tuning skills
- Experience in eCommerce and deep hands-on technical expertise
Good to have (plus skills):
- Frontend experience with React.js/Angular
- Knowledge of GraphQL, gRPC, or WebSockets
- Understanding of AI/ML integration in backend systems
- Certifications in cloud (AWS/Azure/GCP) or Java/Spring
- Experience with monitoring and logging tools in GCP (Cloud Monitoring, Cloud Logging)
Soft skills:
- Strong analytical and communication skills
- Ability to work in a fast-paced, collaborative environment
- Proactive mindset with a focus on continuous learning
Posted 3 weeks ago
15.0 - 20.0 years
32 - 40 Lacs
Pune
Work from Office
Job Title: Senior Engineer, VP
Location: Pune, India
Role description: The engineer is responsible for managing or performing work across multiple areas of the bank's overall IT platform/infrastructure, including analysis, development, and administration. It may also involve taking functional oversight of engineering delivery for specific departments. Work includes:
- Planning and developing entire engineering solutions to accomplish business goals
- Building reliability and resiliency into solutions, with appropriate testing and review throughout the delivery lifecycle
- Ensuring maintainability and reusability of engineering solutions
- Ensuring solutions are well architected and can be integrated successfully into the end-to-end business process flow
- Reviewing engineering plans and quality to drive re-use and improve engineering capability
- Participating in industry forums to drive adoption of innovative technologies, tools, and solutions in the bank
Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance, and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion, and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.
What we'll offer you:
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance
Your key responsibilities:
- Hands-on engineering lead involved in analysis, design, design/code reviews, coding, and release activities
- Champion engineering best practices and guide/mentor the team to achieve high performance
- Work closely with business stakeholders, the tribe lead, the product owner, and the lead architect to successfully deliver the business outcomes
- Acquire functional knowledge of the business capability being digitized/re-engineered
- Demonstrate ownership, inspire others, think innovatively, maintain a growth mindset, and collaborate for success
Your skills and experience:
- Minimum 15 years of IT industry experience in full-stack development
- Expert in Java, Spring Boot, NodeJS, ReactJS
- Strong experience in big data processing: Apache Spark, Hadoop, BigQuery, Dataproc, Dataflow, etc.
- Strong experience with Kubernetes and the OpenShift container platform
- Experience in data streaming, i.e. Kafka, Pub/Sub, etc.
- Experience working on public cloud: GCP preferred, AWS or Azure
- Knowledge of various distributed/multi-tiered architecture styles: microservices, data mesh, integration patterns, etc.
- Experience with modern software product delivery practices, processes, and tooling, and BizDevOps skills such as CI/CD pipelines using Jenkins, GitHub Actions, etc.
- Experience leading teams and mentoring developers
Key skills: Java, Spring Boot, NodeJS, SQL/PLSQL, ReactJS
Advantageous:
- Prior experience in the banking/finance domain
- Experience with hybrid cloud solutions, preferably using GCP
- Experience with product development
About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative, and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 3 weeks ago
2.0 - 5.0 years
5 - 7 Lacs
Noida
Work from Office
Key responsibilities:
- Develop and maintain scalable full-stack applications using Java, Spring Boot, and Angular for building rich UI screens and custom/reusable components
- Design and implement cloud-based solutions leveraging Google Cloud Platform (GCP) services such as BigQuery, Google Cloud Storage, Cloud Run, and Pub/Sub
- Manage and optimize CI/CD pipelines using Tekton to ensure smooth and efficient development workflows
- Deploy and manage Google Cloud services using Terraform, following infrastructure-as-code principles
- Mentor and guide junior software engineers, fostering professional development and promoting systemic change across the development team
- Collaborate with cross-functional teams to design, build, and maintain efficient, reusable, and reliable code
- Drive best practices and improvements in software engineering processes, including coding standards, testing, and deployment strategies
Required skills:
- Java/Spring Boot (5+ years): in-depth experience developing backend services and APIs
- Angular (3+ years): proven ability to build rich, dynamic user interfaces and custom/reusable components
- Google Cloud Platform (2+ years): hands-on experience with GCP services like BigQuery, Google Cloud Storage, Cloud Run, and Pub/Sub
- CI/CD pipelines (2+ years): experience with tools like Tekton for automating build and deployment processes
- Terraform (1-2 years): experience deploying and managing GCP services using Terraform
- J2EE (5+ years): strong experience with Java Enterprise Edition for building large-scale applications
- Experience mentoring and delivering organizational change within a software development team
Posted 3 weeks ago
6.0 - 11.0 years
12 - 22 Lacs
Chennai
Hybrid
Greetings from Getronics! We have permanent opportunities for GCP Data Engineers in Chennai. Hope you are doing well! This is Jogeshwari from the Getronics Talent Acquisition team. Please find the company profile and job description below. If interested, please share your updated resume, a recent professional photograph, and Aadhaar proof at the earliest to jogeshwari.k@getronics.com.
Company: Getronics (permanent role)
Client: Automobile industry
Experience required: 6+ years in IT and a minimum of 4+ years in GCP data engineering
Location: Chennai
Skills required:
- GCP Data Engineer: Hadoop, Spark/PySpark, and Google Cloud Platform services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine
- 6+ years of professional experience in data engineering, data product development, and software product launches
- 4+ years of cloud data engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines using: data warehouses like Google BigQuery; workflow orchestration tools like Airflow; relational database management systems like MySQL, PostgreSQL, and SQL Server; real-time data streaming platforms like Apache Kafka and GCP Pub/Sub
LOOKING FOR IMMEDIATE TO 30 DAYS NOTICE CANDIDATES ONLY.
Regards, Jogeshwari, Senior Specialist
Posted 3 weeks ago
5.0 - 8.0 years
15 - 25 Lacs
Bengaluru, Mumbai (All Areas)
Hybrid
Dear candidates, we have an excellent opportunity with V2Solutions for a MuleSoft Developer. Please find the JD below.
Overview: We are seeking an experienced MuleSoft Developer/Administrator to join our team. The ideal candidate will have over 6 years of hands-on experience with MuleSoft, including development, administration, and deployment tasks. You will be responsible for analyzing user and business needs, coordinating with various stakeholders, and managing the MuleSoft integration platform.
Key responsibilities:
- 6+ years of experience with MuleSoft (development)
- Good with documentation, RCA, JIRA, and Agile methodologies
- Experience with the pub-sub model
- Good understanding of APIs, REST, and SOAP
- Good knowledge of file formats (JSON, XML, CSV)
- Good with MuleSoft Platform APIs
- Good knowledge of file transfer concepts (SFTP)
- Good understanding of Anypoint Platform features
- Architectural and detailed design experience, plus enterprise business experience, within Mule ESB
- Proven industry experience with focused integration experience
- Good database knowledge (SQL, Postgres)
- Experience with Object Store, MQ, and dashboards (MuleSoft)
- Good administration skills
- Experience with CI/CD deployment
- Experience with API-based development and its security aspects
- Experience developing IT projects in the real-time and data integration domain will be valued
Note: We are looking for immediate to 30-day joiners only because of business needs. Interested candidates, please share your resumes at priyanka.singh@v2solutions.com
Posted 3 weeks ago
0.0 - 1.0 years
2 - 3 Lacs
Bengaluru
Work from Office
Key responsibilities:
- Develop and maintain scalable full-stack applications using Java, Spring Boot, and Angular for building rich UI screens and custom/reusable components
- Design and implement cloud-based solutions leveraging Google Cloud Platform (GCP) services such as BigQuery, Google Cloud Storage, Cloud Run, and Pub/Sub
- Manage and optimize CI/CD pipelines using Tekton to ensure smooth and efficient development workflows
- Deploy and manage Google Cloud services using Terraform, following infrastructure-as-code principles
- Mentor and guide junior software engineers, fostering professional development and promoting systemic change across the development team
- Collaborate with cross-functional teams to design, build, and maintain efficient, reusable, and reliable code
- Drive best practices and improvements in software engineering processes, including coding standards, testing, and deployment strategies
Required skills:
- Java/Spring Boot (5+ years): in-depth experience developing backend services and APIs
- Angular (3+ years): proven ability to build rich, dynamic user interfaces and custom/reusable components
- Google Cloud Platform (2+ years): hands-on experience with GCP services like BigQuery, Google Cloud Storage, Cloud Run, and Pub/Sub
- CI/CD pipelines (2+ years): experience with tools like Tekton for automating build and deployment processes
- Terraform (1-2 years): experience deploying and managing GCP services using Terraform
- J2EE (5+ years): strong experience with Java Enterprise Edition for building large-scale applications
- Experience mentoring and delivering organizational change within a software development team
Posted 3 weeks ago
5.0 - 8.0 years
5 - 9 Lacs
Chennai
Work from Office
GCP Engineer: The GCP developer should have expertise in components such as Cloud Scheduler, Dataflow, BigQuery, Pub/Sub, and Cloud SQL.
- Good understanding of the GCP cloud environment/services (IAM, networking, Pub/Sub, Cloud Run, Cloud Storage, Cloud SQL/PostgreSQL, Cloud Spanner, etc.) based on real migration projects
- Knowledge of Java and Java frameworks; has leveraged/worked with any or all technology areas like Spring Boot, Spring Batch, Spring Boot Cloud, etc.
- Experience with API and microservice design principles, and has leveraged them in actual project implementations for integration
- Deep understanding of architecture and design patterns
- Knowledge of implementing event-driven architecture, data integration, event streaming architecture, and API-driven architecture (see the Pub/Sub sketch below)
- Well versed in DevOps principles, with working experience in Docker/containerization
- Experience in the solutioning and execution of IaaS-, PaaS-, and SaaS-based deployments
- Conceptual thinking to create 'out of the box' solutions
- Good communication; able to work with both the customer and the development team to deliver an outcome
Mandatory skills: App-Cloud-Google
Experience: 5-8 years
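As one concrete reading of the event-driven architecture item, here is a hedged sketch of basic GCP Pub/Sub publish and subscribe using the google-cloud-pubsub Python client. The project, topic, and subscription names are placeholders, not details from this posting.

```python
# Publish an event to a Pub/Sub topic and consume it via a streaming pull.
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

project_id = "my-project"        # placeholder
topic_id = "order-events"        # placeholder

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(project_id, topic_id)

# publish() returns a future that resolves to the server-assigned message ID.
future = publisher.publish(topic_path, b'{"order_id": 42, "status": "CREATED"}')
print("published message", future.result())

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(project_id, "order-events-sub")


def callback(message):
    print("received:", message.data)
    message.ack()                # acknowledge so the message is not redelivered


# subscribe() returns a StreamingPullFuture; block briefly for this demo.
streaming_pull = subscriber.subscribe(subscription_path, callback=callback)
try:
    streaming_pull.result(timeout=10)
except TimeoutError:
    streaming_pull.cancel()
```

The decoupling shown here (publisher knows only the topic, consumers attach their own subscriptions) is what makes the pattern event-driven rather than point-to-point.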
Posted 3 weeks ago
3.0 - 6.0 years
10 - 14 Lacs
Gurugram
Work from Office
As an entry-level Application Developer at IBM, you'll work with clients to co-create solutions to major real-world challenges by using best-practice technologies, tools, techniques, and products to translate system requirements into the design and development of customized systems. In your role, you may be responsible for:
- Working across the entire system architecture to design, develop, and support high-quality, scalable products and interfaces for our clients
- Collaborating with cross-functional teams to understand requirements and define technical specifications for generative AI projects
- Employing IBM's Design Thinking to create products that provide a great user experience along with high performance, security, quality, and stability
- Working with a variety of databases (SQL, Postgres, DB2, MongoDB), operating systems (Linux, Windows, iOS, Android), and modern UI frameworks (Backbone.js, AngularJS, React, Ember.js, Bootstrap, and jQuery)
- Creating everything from mockups and UI components to algorithms and data structures as you deliver a viable product
Required education: Bachelor's degree. Preferred education: Master's degree.
Required technical and professional expertise:
- SQL authoring, query, and cost optimisation, primarily on BigQuery (a dry-run cost check is sketched below)
- Python as an object-oriented scripting language
- Data pipeline, data streaming, and workflow management tools: Dataflow, Pub/Sub, Hadoop, Spark Streaming
- Version control system: Git; knowledge of Infrastructure as Code (Terraform) is preferable
- Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), and working familiarity with a variety of databases
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions
Preferred technical and professional experience:
- Experience building and optimising data pipelines, architectures, and data sets
- Building processes supporting data transformation, data structures, metadata, dependency, and workload management
- Working knowledge of message queuing, stream processing, and highly scalable data stores
- Experience supporting and working with cross-functional teams in a dynamic environment
We are looking for a candidate with experience in a Data Engineer role who is also familiar with Google Cloud Platform.
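For the BigQuery query-and-cost-optimisation bullet, one standard technique is a dry run, which reports the bytes a query would scan (the driver of on-demand cost) without executing it. A minimal sketch, assuming a placeholder partitioned table:

```python
# Estimate BigQuery scan cost with a dry run before executing a query.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
query = """
    SELECT user_id, COUNT(*) AS events
    FROM `my-project.analytics.events`
    WHERE DATE(ts) = '2024-01-01'   -- filter on a partition column to cut cost
    GROUP BY user_id
"""
job = client.query(query, job_config=job_config)

# A dry run performs no work but reports how many bytes the query would scan,
# which maps directly to on-demand query pricing.
print(f"Estimated bytes processed: {job.total_bytes_processed:,}")
```

Comparing the estimate with and without the partition filter is a quick way to verify that a query is actually pruning partitions.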
Posted 4 weeks ago
5.0 - 10.0 years
20 - 35 Lacs
Pune
Hybrid
Role & responsibilities (Solace Developer):
- Minimum 2 years of experience working on Solace PubSub+
- Design and implement event-driven solutions using Solace PubSub+
- Develop and maintain event brokers, topic hierarchies, and queue networks
- Optimize message routing, filtering, and transformation for high-performance systems
- Collaborate with cross-functional teams to integrate Solace with various applications
- Hands-on experience with CI/CD implementation on Solace is a must
- Must be comfortable with SSO configuration and push-config feature enablement on Solace
- Should be well versed in the new features of Solace PubSub+
Posted 4 weeks ago
4.0 - 9.0 years
20 - 35 Lacs
Pune, Gurugram, Bengaluru
Hybrid
Salary: 20 to 35 LPA
Experience: 5 to 8 years
Location: Gurgaon (hybrid)
Notice: Immediate to 30 days
Roles and responsibilities:
- Design, develop, test, deploy, and maintain large-scale data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, Dataproc, and Cloud Storage
- Collaborate with cross-functional teams to identify business requirements and design solutions that meet those needs
- Develop complex SQL queries to extract insights from large datasets stored in Google Cloud SQL databases
- Troubleshoot issues related to data processing workflows and provide timely resolutions
Desired candidate profile:
- 5-9 years of experience in data engineering, with expertise in GCP and BigQuery data engineering
- Strong understanding of GCP platform administration, including Compute Engine (Dataproc), Kubernetes Engine (K8s), Cloud Storage, Cloud SQL, etc.
- Experience working on big data analytics projects involving ETL processes using tools like Airflow or similar technologies
Posted 4 weeks ago
3.0 - 8.0 years
14 - 24 Lacs
Chennai
Hybrid
Greetings! We have permanent opportunities for GCP Data Engineers in Chennai.
Experience required: 3 years and above
Location: Chennai (Elcot - Sholinganallur)
Work mode: Hybrid
Skills required: GCP Data Engineer, advanced SQL, ETL data pipelines, BigQuery, Dataflow, Bigtable, Data Fusion, Cloud Spanner, Python, Java, JavaScript
If interested, kindly share your updated CV along with the requested details to Narmadha.baskar@getronics.com
Regards, Narmadha, Getronics Recruitment team
Posted 1 month ago
8.0 - 13.0 years
14 - 24 Lacs
Chennai
Hybrid
Greetings from Getronics!
- Solid experience designing, building, and maintaining cloud-based data platforms and infrastructure
- Deep proficiency in GCP cloud services, including significant experience with BigQuery, Cloud Storage, Dataproc, Apigee, Cloud Run, Google Kubernetes Engine (GKE), Postgres, Artifact Registry, Secret Manager, and access management (IAM)
- Hands-on experience implementing and managing CI/CD pipelines using tools like Tekton and potentially Astronomer
- Strong experience with job scheduling and workflow orchestration using Airflow
- Proficiency with version control systems, specifically Git
- Strong programming skills in Python
- Expertise in SQL and experience with relational databases like SQL Server, MySQL, and PostgreSQL
- Experience with, or knowledge of, data visualization tools like Power BI
- Familiarity with code quality and security scanning tools such as FOSSA and SonarQube
- Foundational knowledge of artificial intelligence and machine learning concepts and workflows
- Strong problem-solving skills and the ability to troubleshoot complex distributed systems
- Strong communication and collaboration skills
- Knowledge of other cloud providers (AWS, Azure)
Skills required: GCP, BigQuery, AI/ML
Company: Getronics (permanent role)
Client: Automobile industry
Experience required: 4+ years in IT and minimum 3+ years in GCP data engineering/AIML
Location: Chennai (Elcot - Sholinganallur)
Work mode: Hybrid
LOOKING FOR IMMEDIATE TO 30 DAYS NOTICE CANDIDATES ONLY.
Thanks, Durga.
Posted 1 month ago
8.0 - 13.0 years
14 - 24 Lacs
Chennai
Hybrid
Greetings from Getronics! We have permanent opportunities for GCP Data Engineers in Chennai.
Company: Getronics (permanent role)
Client: Automobile industry
Experience required: 8+ years in IT and minimum 4+ years in GCP data engineering
Location: Chennai (Elcot - Sholinganallur)
Work mode: Hybrid
Position description: We are currently seeking a seasoned GCP Cloud Data Engineer with 4+ years of experience leading/implementing GCP data projects, preferably implementing a complete data-centric model. This position is to design and deploy a data-centric architecture in GCP for a materials management platform that exchanges data with multiple applications, modern and legacy, across product development, manufacturing, finance, purchasing, n-tier supply chain, and supplier collaboration.
- Design and implement data-centric solutions on Google Cloud Platform (GCP) using various GCP tools like Storage Transfer Service, Cloud Data Fusion, Pub/Sub, Dataflow, cloud compression, Cloud Scheduler, gsutil, FTP/SFTP, Dataproc, Bigtable, etc.
- Build ETL pipelines to ingest data from heterogeneous sources into our system (a minimal load-job sketch follows this listing)
- Develop data processing pipelines using programming languages like Java and Python to extract, transform, and load (ETL) data
- Create and maintain data models, ensuring efficient storage, retrieval, and analysis of large datasets
- Deploy and manage databases, both SQL and NoSQL, such as Bigtable, Firestore, or Cloud SQL, based on project requirements
- Collaborate with cross-functional teams to understand data requirements and design scalable solutions that meet business needs
- Implement security measures and data governance policies to ensure the integrity and confidentiality of data
- Optimize data workflows for performance, reliability, and cost-effectiveness on the GCP infrastructure
Skills required:
- GCP Data Engineer: Hadoop, Spark/PySpark, and Google Cloud Platform services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine
- 8+ years of professional experience in data engineering, data product development, and software product launches
- 4+ years of cloud data engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines using: data warehouses like Google BigQuery; workflow orchestration tools like Airflow; relational database management systems like MySQL, PostgreSQL, and SQL Server; real-time data streaming platforms like Apache Kafka and GCP Pub/Sub
Education required: Any Bachelor's degree
LOOKING FOR IMMEDIATE TO 30 DAYS NOTICE CANDIDATES ONLY.
Thanks, Durga.
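As a minimal sketch of the ETL-ingestion bullet above, the following loads CSV files staged in Cloud Storage into BigQuery using the standard load-job API. The bucket, dataset, and table names are hypothetical placeholders, not details of the client's platform.

```python
# Load CSV files from Cloud Storage into a BigQuery table.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,                   # skip the header row
    autodetect=True,                       # infer the schema for this sketch
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://my-bucket/materials/*.csv",      # heterogeneous sources staged in GCS
    "my-project.materials.supplier_data",
    job_config=job_config,
)
load_job.result()                          # block until the load completes

table = client.get_table("my-project.materials.supplier_data")
print("loaded rows:", table.num_rows)
```

In production the schema would normally be declared explicitly rather than autodetected, so that malformed files fail the load instead of silently changing column types.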
Posted 1 month ago
4.0 - 8.0 years
10 - 19 Lacs
Chennai
Hybrid
Greetings from Getronics! We have permanent opportunities for GCP Data Engineers in Chennai.
Company: Getronics (permanent role)
Client: Automobile industry
Experience required: 4+ years in IT and minimum 3+ years in GCP data engineering
Location: Chennai (Elcot - Sholinganallur)
Work mode: Hybrid
Position description: We are currently seeking a seasoned GCP Cloud Data Engineer with 3 to 5 years of experience leading/implementing GCP data projects, preferably implementing a complete data-centric model. This position is to design and deploy a data-centric architecture in GCP for a materials management platform that exchanges data with multiple applications, modern and legacy, across product development, manufacturing, finance, purchasing, n-tier supply chain, and supplier collaboration.
- Design and implement data-centric solutions on Google Cloud Platform (GCP) using various GCP tools like Storage Transfer Service, Cloud Data Fusion, Pub/Sub, Dataflow, cloud compression, Cloud Scheduler, gsutil, FTP/SFTP, Dataproc, Bigtable, etc.
- Build ETL pipelines to ingest data from heterogeneous sources into our system
- Develop data processing pipelines using programming languages like Java and Python to extract, transform, and load (ETL) data
- Create and maintain data models, ensuring efficient storage, retrieval, and analysis of large datasets
- Deploy and manage databases, both SQL and NoSQL, such as Bigtable, Firestore, or Cloud SQL, based on project requirements
Skills required:
- GCP Data Engineer: Hadoop, Spark/PySpark, and Google Cloud Platform services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine
- 4+ years of professional experience in data engineering, data product development, and software product launches
- 3+ years of cloud data/software engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines using: data warehouses like Google BigQuery; workflow orchestration tools like Airflow; relational database management systems like MySQL, PostgreSQL, and SQL Server; real-time data streaming platforms like Apache Kafka and GCP Pub/Sub
Education required: Any Bachelor's degree. Candidates should be willing to take a GCP assessment (1-hour online video test).
LOOKING FOR IMMEDIATE TO 30 DAYS NOTICE CANDIDATES ONLY.
Thanks, Durga.
Posted 1 month ago
3.0 - 7.0 years
3 - 7 Lacs
Bengaluru, Karnataka, India
On-site
- Be an integral part of large-scale client business development and delivery engagements by understanding the business requirements
- Hands-on with Dataflow/Apache Beam and real-time data streaming
- Engineer ingestion and processing pipelines on GCP using Python libraries, Java, BigQuery, and Composer
- Automate repeatable tasks into a framework that can be reused in other parts of the project
- Handle data quality, governance, and reconciliation during the development phases
- Communicate with internal/external customers; desire to develop communication and client-facing skills
- Understand and contribute to all the agile ceremonies to ensure efficient delivery
Qualification & experience:
- A bachelor's degree in Computer Science or a related field
- Minimum 5 years of experience in software development
- Minimum 3 years of technology experience in data engineering projects
- Minimum 3 years of experience in GCP
- Minimum 3 years of experience in Python programming
- Minimum 3 years of experience in SQL/PL SQL scripting
- Minimum 3 years of experience in data warehouse/ETL
- Ability to build streaming/batching solutions
- Exposure to project management tools like JIRA, Confluence, and Git
- Ability to define, create, test, and execute operations procedures
Must-have skills:
- Strong understanding of real-time streaming concepts
- Strong problem-solving and analytical skills
- Good communication skills
- Understanding of message queues like Kafka, RabbitMQ, Pub/Sub
- Understanding of fast data caching systems like Redis/Memorystore (a cache-aside sketch follows this listing)
- GCP experience: 3+ years
- Dataflow/Apache Beam hands-on experience, including custom templates
- Understanding of Composer
- Good experience with BigQuery and Pub/Sub
- Good hands-on experience with Python
- Hands-on experience with modular Java code development involving design patterns (Factory, Reflection, etc.)
Good-to-have skills:
- GCP Professional Data Engineer certification is an added advantage
- Understanding of Terraform scripts
- Understanding of DevOps pipelines
- Identity and access management, authentication protocols
- Google Drive APIs, OneDrive APIs
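For the Redis/Memorystore item, the common pattern is cache-aside: read from the cache first and fall back to the source of truth on a miss. A hedged sketch using the redis-py client, where the host and the database-lookup function are placeholders:

```python
# Cache-aside lookup with Redis (Memorystore-compatible).
import json

import redis

r = redis.Redis(host="10.0.0.3", port=6379, decode_responses=True)


def fetch_profile_from_db(user_id: str) -> dict:
    # Placeholder for a real database lookup.
    return {"user_id": user_id, "tier": "gold"}


def get_profile(user_id: str) -> dict:
    key = f"profile:{user_id}"
    cached = r.get(key)
    if cached is not None:                  # cache hit: skip the database
        return json.loads(cached)
    profile = fetch_profile_from_db(user_id)
    r.setex(key, 300, json.dumps(profile))  # cache for 5 minutes
    return profile
```

The TTL bounds staleness; choosing it is the main design trade-off between read latency and how quickly updates become visible.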
Posted 1 month ago
4.0 - 9.0 years
11 - 19 Lacs
Chennai
Work from Office
Role & responsibilities: Python, Dataproc, Airflow, PySpark, Cloud Storage, DBT, Dataform, NAS, Pub/Sub, Terraform, APIs, BigQuery, Data Fusion, GCP, Tekton
Preferred candidate profile: Data Engineer in Python and GCP. Location: Chennai only. 4+ years of experience.
Posted 1 month ago
10.0 - 15.0 years
30 - 40 Lacs
Noida, Pune, Bengaluru
Hybrid
- Strong experience in big data: data modelling, design, architecting, and solutioning
- Understands programming languages like SQL, Python, R, and Scala
- Good Python skills; experience with data visualisation tools such as Google Data Studio or Power BI
- Knowledge of A/B testing and statistics (a worked significance test follows this listing), Google Cloud Platform, Google BigQuery, agile development, DevOps, data engineering, and ETL data processing
- Strong experience migrating production Hadoop clusters to Google Cloud
Good to have:
- Expertise in BigQuery, Dataproc, Data Fusion, Dataflow, Bigtable, Firestore, Cloud SQL, Cloud Spanner, Google Cloud Storage, Cloud Composer, Cloud Interconnect, etc.
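The A/B testing and statistics requirement typically comes down to comparing conversion rates between a control and a variant. A worked two-proportion z-test with made-up counts (not data from this posting):

```python
# Two-proportion z-test for an A/B conversion comparison.
from math import sqrt

from scipy.stats import norm

# Hypothetical experiment results.
conv_a, n_a = 480, 10_000    # control: 4.8% conversion
conv_b, n_b = 560, 10_000    # variant: 5.6% conversion

p_a, p_b = conv_a / n_a, conv_b / n_b
p_pool = (conv_a + conv_b) / (n_a + n_b)     # pooled rate under H0
se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))

z = (p_b - p_a) / se
p_value = 2 * norm.sf(abs(z))                # two-sided p-value

print(f"z = {z:.2f}, p-value = {p_value:.4f}")
# Here z is about 2.55 and p is about 0.011, so the lift is
# significant at the usual 5% level.
```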
Posted 1 month ago
5.0 - 15.0 years
6 - 7 Lacs
Mumbai
Work from Office
Acquire new franchisees and service existing franchisees to achieve set IR targets. Ensure all the paperwork for acquired franchisees is completed, including authorized-person or sub-broker registration. Visit mapped franchisees at a pre-defined frequency, ensure they are activated in currency and commodity, and cross-sell all the products. Make sure all payouts and other queries of the franchisees are addressed. Report to the AVP / internal audit any suspicious activity seen or heard during franchisee visits or otherwise. Conduct training events for franchisees and their employees to help them go out, acquire more customers, and increase their business. Continuously provide feedback on competitive activities and track market developments. Help franchisees conduct investor meets in their locations.
Posted 1 month ago
11.0 - 16.0 years
40 - 45 Lacs
Pune
Work from Office
Role description: This role is for a senior business functional analyst for Group Architecture. The role will be instrumental in establishing and maintaining bank-wide data policies, principles, standards, and tool governance. The senior business functional analyst acts as a link between the business divisions and the data solution providers, aligning the target data architecture with the enterprise data architecture principles and applying agreed best practices and patterns. Group Architecture partners with each division of the bank to ensure that architecture is defined, delivered, and managed in alignment with the bank's strategy and in accordance with the organization's architectural standards.
Your key responsibilities:
- Data architecture: Work closely with stakeholders to understand their data needs, break business requirements into implementable building blocks, and design the solution's target architecture.
- AI/ML: Identify and support the creation of AI use cases focused on delivering the data architecture strategy and data governance tooling. Identify AI/ML use cases and architect pipelines that integrate data flows, data lineage, and data quality. Embed AI-powered data quality, detection, and metadata enrichment to accelerate data discoverability. Assist in defining and driving the data architecture standards and requirements for AI that need to be enabled and used.
- GCP data architecture and migration: Strong working experience with GCP data architecture is a must (BigQuery, Dataplex, Cloud SQL, Dataflow, Apigee, Pub/Sub, ...), along with an appropriate GCP architecture-level certification. Experience handling hybrid architectures and patterns that address non-functional requirements like data residency, compliance (e.g. GDPR), and security and access control. Experience developing reusable components and reference architectures using IaC (Infrastructure as Code) platforms such as Terraform.
- Data mesh: Proficiency in Data Mesh design strategies that embrace the decentralized nature of data ownership, with good domain knowledge to ensure that the data products developed are aligned with business goals and provide real value.
- Data management tooling: Assess various tools and solutions comprising data governance capabilities like data catalogue, data modelling and design, metadata management, data quality and lineage, and fine-grained data access management. Assist in developing the medium- to long-term target state of the technologies within the data governance domain.
- Collaboration: Collaborate with stakeholders, including business leaders, project managers, and development teams, to gather requirements and translate them into technical solutions.
Your skills and experience:
- Demonstrable experience in designing and deploying AI tooling architectures and use cases
- Extensive experience in data architecture within financial services
- Strong technical knowledge of data integration patterns, batch and stream processing, data lake/lakehouse/data warehouse/data mart, caching patterns, and policy-based fine-grained data access
- Proven experience with data management principles, data governance, data quality, data lineage, and data integration, with a focus on Data Mesh
- Knowledge of data modelling concepts like dimensional modelling and 3NF; experience with systematic, structured review of data models to enforce conformance to standards
- High-level understanding of data management solutions, e.g. Collibra, Informatica Data Governance, etc.
- Proficiency in data modelling and experience with different data modelling tools
- Very good understanding of streaming and non-streaming ETL and ELT approaches for data ingest
- Strong analytical and problem-solving skills, with the ability to identify complex business requirements and translate them into technical solutions
Posted 1 month ago
6.0 - 9.0 years
9 - 15 Lacs
Bengaluru
Hybrid
Job description: We are hiring a Java Developer with strong GCP experience, or a GCP Engineer proficient in Java. The candidate should be capable of developing scalable cloud-native applications using Google Cloud services.
Key skills:
- Java, Spring Boot, RESTful APIs
- Google Cloud Platform (GCP)
- Cloud Functions, Pub/Sub, BigQuery (preferred)
- CI/CD, Docker, Kubernetes
Posted 1 month ago