Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
2.0 - 7.0 years
4 - 9 Lacs
Coimbatore
Work from Office
Project Role: AI/ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Able to apply GenAI models as part of the solution; may also include deep learning, neural networks, chatbots, and image processing.
Must have skills: Google Cloud Machine Learning Services
Good to have skills: GCP Dataflow, Google Pub/Sub, Google Dataproc
Minimum 2 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: We are seeking a skilled GCP Data Engineer to join our dynamic team. The ideal candidate will design, build, and maintain scalable data pipelines and solutions on Google Cloud Platform (GCP). This role requires expertise in cloud-based data engineering and hands-on experience with GCP tools and services, ensuring efficient data integration, transformation, and storage for various business use cases.

Roles & Responsibilities:
- Design, develop, and deploy data pipelines using GCP services such as Dataflow, BigQuery, Pub/Sub, and Cloud Storage.
- Optimize and monitor data workflows for performance, scalability, and reliability.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and implement solutions.
- Implement data security and governance measures, ensuring compliance with industry standards.
- Automate data workflows and processes for operational efficiency.
- Troubleshoot and resolve technical issues related to data pipelines and platforms.
- Document technical designs, processes, and best practices to ensure maintainability and knowledge sharing.

Professional & Technical Skills:
a) Must Have:
- Proficiency in GCP tools such as BigQuery, Dataflow, Pub/Sub, Cloud Composer, and Cloud Storage.
- Expertise in SQL and experience with data modeling and query optimization.
- Solid programming skills in Python for data processing and ETL development.
- Experience with CI/CD pipelines and version control systems (e.g., Git).
- Knowledge of data warehousing concepts, ELT/ETL processes, and real-time streaming.
- Strong understanding of data security, encryption, and IAM policies on GCP.
b) Good to Have:
- Experience with Dialogflow or CCAI tools.
- Knowledge of machine learning pipelines and integration with AI/ML services on GCP.
- Certifications such as Google Professional Data Engineer or Google Cloud Architect.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Google Cloud Machine Learning Services, with 3-5 years of overall experience.
- The ideal candidate will possess a strong educational background in computer science, mathematics, or a related field, along with a proven track record of delivering impactful data-driven solutions.
Qualifications: 15 years full time education
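As context for candidates, the extract-transform-load responsibility described above can be sketched in a few lines of pure Python. This is an illustrative, in-memory sketch only; the function names and sample records are invented, and a real pipeline would use the GCP client libraries (Dataflow, BigQuery) rather than plain lists and dicts.

```python
# Minimal in-memory sketch of the extract -> transform -> load stages a GCP
# data pipeline performs. All names and sample data are hypothetical.

def extract(rows):
    """Simulate reading raw records from a source such as Pub/Sub or GCS."""
    return list(rows)

def transform(rows):
    """Normalize fields and drop records that fail a basic quality check."""
    out = []
    for r in rows:
        if r.get("amount") is None:
            continue  # quality gate: skip incomplete records
        out.append({"id": r["id"], "amount": round(float(r["amount"]), 2)})
    return out

def load(rows, sink):
    """Simulate appending rows to a warehouse table (e.g. BigQuery)."""
    sink.extend(rows)
    return len(rows)

warehouse = []
raw = [{"id": 1, "amount": "10.456"}, {"id": 2, "amount": None}]
loaded = load(transform(extract(raw)), warehouse)
```

The same three-stage shape holds whether the transport is a message queue or object storage; only the I/O adapters change.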
Posted 2 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Nagpur
Work from Office
Project Role: Advanced Application Engineer
Project Role Description: Utilize modular architectures, next-generation integration techniques and a cloud-first, mobile-first mindset to provide vision to Application Development Teams. Work with an Agile mindset to create value across projects of multiple scopes and scale.
Must have skills: SAP FI CO Finance
Good to have skills: SAP CO Product Cost Controlling
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

About The Role: Sr. SAP S4H FICO Consultant

Job Duties & Responsibilities:
- In-depth SAP solutions and process knowledge, including industry best practices.
- Leads fit/gap and other types of working sessions to understand needs driven by business process requirements.
- Translates requirements into solutions, using SAP Best Practices or Navisite solutions as a baseline.
- Leads their respective workstream on assigned projects.
- Works in conjunction with the Navisite Service Delivery Lead to establish the overall plan for their respective work for the customer.
- SAP configuration experience, primarily in the FI/CO modules.
- Configures SAP CO systems to meet client business requirements, including connection points with SD, PP, MM and other modules and implementation of SAP best practices. At least two full lifecycle implementations as an SAP CO functional consultant and a minimum of 5 support projects. S4 HANA experience is a must.
- Applies strong knowledge of business processes for designing, developing, and testing SAP functions associated with financial operations, including expertise in cost center accounting (CCA), internal order accounting (IOA), product cost controlling (CO-PC), profitability analysis (CO-PA), and profit center accounting (PCA).
- Focuses on business process re-engineering efforts and technology enablement.
- Serves as the subject matter expert on product systems, processes, network architecture and interface capabilities.
- Should have in-depth understanding and execution skills in FI and CO sub-modules. SAP FI: General Ledger accounting, Accounts Receivable, Accounts Payable, Asset Accounting.
- Experience in developing specifications for interfaces and custom reports.
- Creates functional specifications for development objects.
- Conducts unit testing on the overall solution, including technical objects.
- Supports integration testing and user acceptance testing with the customer.
- Explores new SAP applications as a subject matter expert and may be a first adopter for emerging SAP technologies.
- Supports Navisite Application Managed Services (AMS) by working and resolving tickets as assigned.
- Sustains adequate product knowledge through formal training, webinars, SAP publications, collaboration among colleagues and self-study.
- Enforces the core competencies and professional standards of Navisite in all client engagements.
- Supports internal projects as assigned.
- Collaborates with colleagues to grow product knowledge.
- Assists in the continual improvement of Navisite methods and tools.
- Adheres to Navisite professional standards.
- Willing to travel as per business needs.

Key Competencies: Customer Focus, Results Driven, Business Acumen, Trusted Advisor, Task Management, Problem Solving Skills, Communication Skills, Priority Setting, Presentation Skills, Mentorship and Collaboration, Ability to work regularly scheduled shifts, After-hours coverage for critical issues as needed.

Qualifications: 15 years full time education
Posted 2 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Hyderabad
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Google BigQuery
Good to have skills: No Function Specialty
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for creating efficient and scalable solutions using Google BigQuery. Your typical day will involve collaborating with the team, analyzing business requirements, designing and implementing application features, and ensuring the applications meet quality standards and performance goals.

Roles & Responsibilities:
1. Design, create, code, and support a variety of data pipelines and models on GCP cloud technology.
2. Strong hands-on exposure to GCP services like BigQuery, Composer, etc.
3. Partner with business/data analysts, architects, and other key project stakeholders to deliver data requirements.
4. Develop data integration and ETL (Extract, Transform, Load) processes.
5. Support existing data warehouses and related pipelines.
6. Ensure data quality, security, and compliance.
7. Optimize data processing and storage efficiency; troubleshoot issues in the data space.
8. Seek to learn new skills/tools utilized in the data space (e.g., dbt, Monte Carlo).
9. Excellent communication skills, verbal and written; excellent analytical skills with an Agile mindset.
10. Strong attention to detail and delivery accuracy.
11. Self-motivated team player with the ability to overcome challenges and achieve desired results.
12. Work effectively in a globally distributed environment.
Professional & Technical Skills:
Skill Proficiency Expectation:
- Expert: Data Storage, BigQuery, SQL, Composer, Data Warehousing Concepts
- Intermediate: Python
- Basic/Preferred: DB, Kafka, Pub/Sub

Must Have Skills: Proficiency in Google BigQuery. Strong understanding of statistical analysis and machine learning algorithms. Experience with data visualization tools such as Tableau or Power BI. Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms. Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information: The candidate should have a minimum of 5 years of experience in Google BigQuery. This position is based at our Hyderabad office. A 15 years full time education is required.
Qualifications: 15 years full time education
Posted 2 weeks ago
5.0 - 8.0 years
20 - 22 Lacs
Chennai
Work from Office
Minimum 5 years of in-depth experience in Java/Spring Boot. Minimum 3 years of experience in Angular, with the ability to develop rich UI screens and custom/re-usable components. Minimum 2 years of GCP experience working with BigQuery, Google Cloud Storage, Cloud Run, and Pub/Sub. Minimum 2 years of experience using CI/CD pipelines like Tekton. 1-2 years of experience deploying Google Cloud services using Terraform. Experience mentoring other software engineers and delivering systemic change. 5+ years of experience in J2EE.
Posted 2 weeks ago
7.0 - 12.0 years
0 - 3 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Required Skills: Python, ETL, SQL, GCP, BigQuery, Pub/Sub, Airflow. Good to Have: dbt, Data Mesh

Job Title: Senior GCP Engineer - Data Mesh & Data Product Specialist

We are hiring a Senior GCP Developer to join our high-performance data engineering team. This is a mission-critical role where you will design, build, and maintain scalable ETL pipelines and frameworks in a Data Mesh architecture. You will work with modern tools like Python, dbt, BigQuery (GCP), and SQL to deliver high-quality data products that power decision-making across the organization. We are looking for a highly skilled professional who thrives in demanding environments, takes ownership of their work, and delivers results with precision and reliability.

Key Responsibilities:
- Design, build, and maintain ETL pipelines: develop robust, scalable, and efficient ETL workflows to ingest, transform, and load data into distributed data products within the Data Mesh architecture.
- Data transformation with dbt: use dbt to build modular, reusable transformation workflows that align with the principles of data products.
- Cloud expertise: leverage Google Cloud Platform (GCP) services such as BigQuery, Cloud Storage, Pub/Sub, and Dataflow to implement highly scalable data solutions.
- Data quality & governance: enforce strict data quality standards by implementing validation checks, anomaly detection mechanisms, and monitoring frameworks.
- Performance optimization: continuously optimize ETL pipelines for speed, scalability, and cost efficiency.
- Collaboration & ownership: work closely with data product owners, BI developers, and stakeholders to understand requirements and deliver on expectations; take full ownership of your deliverables.
- Documentation & standards: maintain detailed documentation of ETL workflows, enforce coding standards, and adhere to best practices in data engineering.
- Troubleshooting & issue resolution: proactively identify bottlenecks or issues in pipelines and resolve them quickly with minimal disruption.

Required Skills & Experience:
- 10+ years (Lead) or 7+ years (Developer) of hands-on experience in designing and implementing ETL workflows in large-scale environments.
- Advanced proficiency in Python for scripting, automation, and data processing.
- Expert-level knowledge of SQL for querying large datasets with performance optimization techniques.
- Deep experience working with modern transformation tools like dbt in production environments.
- Strong expertise in cloud platforms like Google Cloud Platform (GCP) with hands-on experience using BigQuery.
- Familiarity with Data Mesh principles and distributed data architectures is mandatory.
- Proven ability to handle complex projects under tight deadlines while maintaining high-quality standards.
- Exceptional problem-solving skills with a strong focus on delivering results.

What We Expect: This is a demanding role that requires:
1. A proactive mindset: you take initiative without waiting for instructions.
2. A commitment to excellence: no shortcuts or compromises on quality.
3. Accountability: you own your work end-to-end and deliver on time.
4. Attention to detail: precision matters; mistakes are not acceptable.
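The "validation checks" responsibility named in this posting can be illustrated with a tiny pure-Python sketch. The rule names and sample rows below are invented for the example; production teams would express the same checks as dbt tests or with a framework such as Great Expectations rather than hand-rolled functions.

```python
# Hedged sketch of basic data-quality checks (not-null, uniqueness, range).
# Rules and sample data are illustrative assumptions, not a real framework.

def check_not_null(rows, column):
    """Every row must have a non-null value in `column`."""
    return all(r.get(column) is not None for r in rows)

def check_unique(rows, column):
    """No duplicate values allowed in `column`."""
    values = [r[column] for r in rows]
    return len(values) == len(set(values))

def check_range(rows, column, lo, hi):
    """All values in `column` must fall within [lo, hi]."""
    return all(lo <= r[column] <= hi for r in rows)

rows = [{"order_id": 1, "qty": 3}, {"order_id": 2, "qty": 5}]
results = {
    "order_id_not_null": check_not_null(rows, "order_id"),
    "order_id_unique": check_unique(rows, "order_id"),
    "qty_in_range": check_range(rows, "qty", 1, 100),
}
```

A pipeline would typically run such checks after each load step and fail the run (or quarantine rows) when any result is False.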
Posted 2 weeks ago
3.0 - 8.0 years
5 - 10 Lacs
Noida
Work from Office
Required Skills & Experience: Bachelor's degree in computer science or a related field. 3+ years of experience in web and backend development. Expertise in multiple programming languages (Java, Python, Golang). Strong understanding of databases (e.g., PostgreSQL and MongoDB). Strong understanding of data structures, algorithms, and their applications in software development. Experience with or knowledge of Pub/Sub.
Posted 2 weeks ago
5.0 - 9.0 years
19 - 25 Lacs
Chennai
Work from Office
5+ years of experience in Java/J2EE development, including strong object-oriented design principles. ERP implementation experience preferred; MBC and BTP knowledge appreciated. Expertise in Java 8 and above, including functional programming concepts. Expertise in the Spring platform (Spring MVC, Spring Boot, Spring JDBC, Spring Cloud) and RESTful/SOAP web services. In-depth knowledge of GCP services (Cloud Run, Redis, Pub/Sub, Kubernetes, Cloud Scheduler). Experience with enterprise SSO technology. Mandatory Key Skills: Java, functional programming, object-oriented design, SOAP web services, J2EE development, ERP implementation, Java 8, Spring platform, MBC, BTP, RESTful, GCP services.
Posted 2 weeks ago
10.0 - 15.0 years
30 - 40 Lacs
Bhopal, Pune, Gurugram
Hybrid
Job Title: Senior Data Engineer GCP | Big Data | Airflow | dbt
Company: Xebia
Location: All Xebia locations
Experience: 10+ Years
Employment Type: Full Time
Notice Period: Immediate to Max 30 Days Only

Job Summary: Join the digital transformation journey of one of the world's most iconic global retail brands! As a Senior Data Engineer, you'll be part of a dynamic Digital Technology organization, helping build modern, scalable, and reliable data products to power business decisions across the Americas. You'll work in the Operations Data Domain, focused on ingesting, processing, and optimizing high-volume data pipelines using Google Cloud Platform (GCP) and other modern tools.

Key Responsibilities:
- Design, develop, and maintain highly scalable big data pipelines (batch & streaming)
- Collaborate with cross-functional teams to understand data needs and deliver efficient solutions
- Architect robust data solutions using GCP-native services (BigQuery, Pub/Sub, Cloud Functions, etc.)
- Build and manage modern Data Lake/Lakehouse platforms
- Create frameworks and reusable components for scalable ingestion and processing
- Implement data governance and security, and ensure regulatory compliance
- Mentor junior engineers and lead an offshore team of 8+ engineers
- Monitor pipeline performance, troubleshoot bottlenecks, and ensure data quality
- Engage in code reviews, CI/CD deployments, and agile product releases
- Contribute to internal best practices and engineering standards

Must-Have Skills & Qualifications:
- 8+ years in data engineering with strong hands-on experience in production-grade pipelines
- Expertise in GCP data services: BigQuery, Vertex AI, Pub/Sub, etc.
- Proficiency in dbt (Data Build Tool) for data transformation
- Strong programming skills in Python, Java, or Scala
- Advanced SQL & NoSQL knowledge
- Experience with Apache Airflow for orchestration
- Hands-on with Git, GitHub Actions, Jenkins for CI/CD
- Solid understanding of data warehousing (BigQuery, Snowflake, Redshift)
- Exposure to tools like Hadoop, Spark, Kafka, Databricks (nice to have)
- Familiarity with BI tools like Tableau, Power BI, or Looker (optional)
- Strong leadership qualities to manage offshore engineering teams
- Excellent communication skills and stakeholder management experience

Preferred Education: Bachelor's or Master's degree in Computer Science, Engineering, Mathematics, or a related field

Notice Period Requirement: Only immediate joiners or candidates with a maximum 30 days notice period will be considered.

How to Apply: If you are passionate about solving real-world data problems and want to be part of a global data-driven transformation, apply now by sending your resume to vijay.s@xebia.com with the subject line "Sr Data Engineer Application – [Your Name]". Kindly include the following details in your email: Full Name, Total Experience, Current CTC, Expected CTC, Current Location, Preferred Location, Notice Period / Last Working Day, Key Skills. Please do not apply if you are currently in process with any other role at Xebia or have recently interviewed.
Posted 3 weeks ago
5.0 - 10.0 years
15 - 27 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Job Title: Data Engineer GCP
Company: Xebia
Location: Hybrid, any Xebia location
Experience: 5+ Years
Salary: As per industry standards
Job Type: Full Time

About the Role: Xebia is hiring a seasoned Data Engineer (L4) to join a high-impact team building scalable data platforms using GCP, Databricks, and Airflow. If you thrive on architecting future-ready solutions and have strong experience in big data transformations, we'd love to hear from you.

Project Overview: We currently manage 1000+ data pipelines using Databricks clusters for end-to-end data transformation (Raw → Silver → Gold), with orchestration handled via Airflow, all on Google Cloud Platform (GCP). Curated datasets are delivered through BigQuery and Databricks Notebooks. Our roadmap includes migrating to a GCP-native data processing framework optimized for Spark workloads.

Key Responsibilities:
- Design and implement a GCP-native data processing framework
- Analyze and plan migration of existing workloads to cloud-native architecture
- Ensure data availability, integrity, and consistency
- Build reusable tools and standards for the Data Engineering team
- Collaborate with stakeholders and document processes thoroughly

Required Experience:
- 5+ years in Data Engineering with strong data architecture experience
- Hands-on expertise in Databricks, Airflow, BigQuery, and PySpark
- Deep knowledge of GCP services for data processing (Dataflow, Dataproc, etc.)
- Familiarity with data lake table formats like Delta and Iceberg
- Experience with orchestration tools (Airflow, Dagster, or similar)

Key Skills: Python programming; strong understanding of data lake architectures and cloud-native best practices; excellent problem-solving and communication skills.

Notice Period Requirement: Only immediate joiners or candidates with a maximum 30 days notice period will be considered.

How to Apply: Interested candidates can share their details and updated resume with vijay.s@xebia.com in the following format: Full Name; Total Experience (must be 5+ years); Current CTC; Expected CTC; Current Location; Preferred Location; Notice Period / Last Working Day (if serving notice); Primary Skill Set. Note: Please apply only if you have not recently applied or interviewed for any open roles at Xebia.
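The Raw → Silver → Gold flow this posting describes (the medallion pattern) can be sketched in plain Python. In the actual stack these would be Databricks/PySpark jobs orchestrated by Airflow; the record shapes and the revenue aggregate below are invented purely for illustration.

```python
# Illustrative pure-Python sketch of a Raw -> Silver -> Gold transformation.
# Real implementations run as Spark jobs; data shapes here are assumptions.

raw = [
    {"sku": "A", "price": "10.0", "qty": "2"},
    {"sku": "A", "price": "10.0", "qty": "1"},
    {"sku": None, "price": "5.0", "qty": "9"},  # bad record, dropped at Silver
]

# Silver: cleanse and type-cast, dropping records that fail validation.
silver = [
    {"sku": r["sku"], "price": float(r["price"]), "qty": int(r["qty"])}
    for r in raw
    if r["sku"] is not None
]

# Gold: business-level aggregate (revenue per SKU) served to consumers.
gold = {}
for r in silver:
    gold[r["sku"]] = gold.get(r["sku"], 0.0) + r["price"] * r["qty"]
```

Each layer is persisted as its own table (Delta or Iceberg, per the formats listed above), so downstream consumers can query Gold without re-running the cleansing logic.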
Posted 3 weeks ago
1.0 - 3.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Skills required:
- Strong understanding of back-end and front-end technologies such as NodeJS, ReactJS, Pug, GraphQL, PubSub, TypeScript, HTML5, CSS and CSS-in-JS
- Strong knowledge of React Hooks, Styled Components, Apollo Client, GraphQL, and TypeScript
- Understanding of NoSQL & SQL databases
- Experience implementing authorization and authentication workflows using JWT, etc.
- Experience with bundlers like Webpack, Rollup, Grunt or Gulp
- Previous experience with cloud platforms like AWS and tools like Git
- Ability to write performant code in an easily understandable structure

Roles and Responsibilities:
- Design and develop highly complex application components, and integrate software packages, programs, and reusable objects capable of running on multiple platforms
- Leverage open-source code and libraries to quickly experiment and build novel solutions
- Independently think of solutions to complex requirements; possess exceptional logical skills
- Analyze current products in development, including performance, diagnosis, and troubleshooting
- Work with the existing framework and help evolve it by building reusable code and libraries
- Search for and introduce new software-related technologies, processes, and tools to the team

Brownie Points:
- Knowledge of Docker & Kubernetes
- Understanding of React Native
- Familiarity with Python

What we have to offer:
- Work with a performance-oriented team driven by ownership and open to experiments with cutting-edge technologies
- Learn to design systems for high accuracy, efficiency, and scalability
- Flexible hours
- Learn to innovate
- Meritocracy-driven, candid startup culture
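The JWT-based authentication workflow this posting asks about reduces to signing and verifying a `header.payload.signature` token. Below is a minimal HS256 sketch using only the Python standard library; real services should use a maintained library (e.g. jsonwebtoken for Node or PyJWT) with expiry and claim validation, which this sketch deliberately omits.

```python
# Minimal HS256 JWT sign/verify sketch (standard library only).
# Illustrative: no exp/claim validation; do not use as-is in production.
import base64
import hashlib
import hmac
import json

def _b64url(data: bytes) -> str:
    """Base64url-encode without padding, as JWT requires."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign(payload: dict, secret: bytes) -> str:
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64url(json.dumps(payload).encode())
    msg = f"{header}.{body}".encode()
    sig = _b64url(hmac.new(secret, msg, hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def verify(token: str, secret: bytes) -> bool:
    header, body, sig = token.split(".")
    msg = f"{header}.{body}".encode()
    expected = _b64url(hmac.new(secret, msg, hashlib.sha256).digest())
    return hmac.compare_digest(sig, expected)  # constant-time comparison

token = sign({"sub": "user-42"}, b"shared-secret")
```

The constant-time comparison in `verify` matters: a naive `==` on signatures can leak timing information.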
Posted 3 weeks ago
1.0 - 3.0 years
10 - 15 Lacs
Kolkata, Gurugram, Bengaluru
Hybrid
Salary: 10 to 16 LPA Exp: 1 to 3 years Location: Gurgaon / Bangalore/ Kolkata (Hybrid) Notice: Immediate to 30 days..!! Key Skills: GCP, Cloud, Pubsub, Data Engineer
Posted 3 weeks ago
3.0 - 8.0 years
15 - 30 Lacs
Gurugram, Bengaluru
Hybrid
Salary: 15 to 30 LPA Exp: 3 to 8 years Location: Gurgaon / Bangalore (Hybrid) Notice: Immediate to 30 days..!! Key Skills: GCP, Cloud, Pubsub, Data Engineer
Posted 3 weeks ago
4.0 - 9.0 years
25 - 37 Lacs
Hyderabad, Bengaluru, Mumbai (All Areas)
Hybrid
Location: Pan India (any location)

Job Summary: We are seeking a skilled Solace PubSub+ Developer with a minimum of 2 years of experience in designing and implementing event-driven solutions using Solace PubSub+. The ideal candidate will be responsible for developing and maintaining event brokers, optimizing message routing, and integrating Solace with various applications. Expertise in CI/CD implementation on Solace, SSO configuration, and push config feature enablement is essential for this role.

Key Responsibilities:
- Design and implement event-driven architecture solutions using Solace PubSub+
- Develop and maintain event brokers, topic hierarchies, and queue networks to ensure efficient message flow
- Optimize message routing, filtering, and transformation to enhance system performance and scalability
- Collaborate with cross-functional teams to integrate Solace PubSub+ with various enterprise applications
- Implement and manage CI/CD pipelines for Solace, ensuring automated and seamless deployments
- Configure and enable SSO (Single Sign-On) and push config features on Solace
- Stay up to date with the latest features and updates of Solace PubSub+ and implement best practices

Required Qualifications & Skills:
- Minimum 2 years of experience working with Solace PubSub+
- Strong knowledge of event-driven architecture, messaging patterns, and distributed systems
- Hands-on experience in CI/CD implementation on Solace
- Expertise in configuring SSO and push config features within Solace
- Strong understanding of message brokers, topic-based routing, and queue management
- Ability to troubleshoot and optimize message flows for high-performance applications
- Excellent communication and collaboration skills to work with cross-functional teams

Preferred Qualifications:
- Experience integrating Solace with cloud platforms such as AWS, Azure, or GCP
- Familiarity with REST, MQTT, and JMS messaging protocols
Knowledge of containerization technologies (Docker, Kubernetes) for deploying Solace in cloud-native environments.
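The topic-hierarchy design work described above rests on hierarchical topic matching. The sketch below uses a simplified convention assumed for illustration: `*` matches exactly one level and a trailing `>` matches one or more remaining levels. This is not a faithful reproduction of Solace's wildcard rules (which, for example, also allow `*` as a within-level prefix); consult the broker documentation for the real semantics.

```python
# Simplified hierarchical topic matcher. Wildcard semantics vary by broker;
# this sketch assumes: `*` = exactly one level, trailing `>` = one or more
# remaining levels. A deliberate simplification of real Solace rules.

def matches(subscription: str, topic: str) -> bool:
    sub_levels = subscription.split("/")
    top_levels = topic.split("/")
    for i, s in enumerate(sub_levels):
        if s == ">":
            # Only valid as the last level; must cover at least one level.
            return i == len(sub_levels) - 1 and len(top_levels) > i
        if i >= len(top_levels) or (s != "*" and s != top_levels[i]):
            return False
    return len(sub_levels) == len(top_levels)

# e.g. "orders/*/created" matches "orders/emea/created" but not
# "orders/emea/cancelled"; "orders/>" matches any deeper orders topic.
```

Designing the level hierarchy (e.g. `domain/region/event`) up front is what makes such wildcard subscriptions useful for routing and filtering.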
Posted 3 weeks ago
2.0 - 4.0 years
6 - 9 Lacs
Bengaluru
Work from Office
Role: Finance Controller Lead. Lead cross-functional global teams in developing finance strategies to support strategic alignment with the company's Business Operations and Corporate departments on company goals & initiatives. Manage financial goals that result in strong customer satisfaction, align with company strategy, and optimize costs and supplier relations. Influence senior leaders in setting direction for their functional areas by linking finance and business strategies to optimize business results.
Posted 3 weeks ago
5.0 - 10.0 years
14 - 24 Lacs
Bengaluru, Mumbai (All Areas)
Hybrid
Roles & Responsibilities:
1. Experience in a webMethods support role
2. Experience with L1/L2/L3 support
3. Experience with webMethods integrations
4. Experience handling P1/P2/P3 incidents
5. Experience with the webMethods Pub-Sub model
Posted 3 weeks ago
4.0 - 9.0 years
8 - 18 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Required Skills: SQL, GCP (BigQuery, Composer, Dataflow), Big Data (Scala, Kafka). You'll need to have: experience in Big Data technologies (GCP/Composer/BigQuery/Dataflow); understanding of business requirements and converting them to technical design; working on data ingestion, preparation, and transformation; developing data streaming applications; debugging production failures and identifying solutions; working on ETL/ELT development; experience with data warehouse concepts and the data management life cycle.
Posted 3 weeks ago
1.0 - 3.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Skills required:
- Strong understanding of back-end and front-end technologies such as NodeJS, ReactJS, Pug, GraphQL, PubSub, TypeScript, HTML5, CSS and CSS-in-JS
- Strong knowledge of React Hooks, Styled Components, Apollo Client, GraphQL, and TypeScript
- Understanding of NoSQL & SQL databases
- Experience implementing authorization and authentication workflows using JWT, etc.
- Experience with bundlers like Webpack, Rollup, Grunt or Gulp
- Previous experience with cloud platforms like AWS and tools like Git
- Ability to write performant code in an easily understandable structure

Roles and Responsibilities:
- Design and develop highly complex application components, and integrate software packages, programs, and reusable objects capable of running on multiple platforms
- Leverage open-source code and libraries to quickly experiment and build novel solutions
- Independently think of solutions to complex requirements; possess exceptional logical skills
- Analyze current products in development, including performance, diagnosis, and troubleshooting
- Work with the existing framework and help evolve it by building reusable code and libraries
- Search for and introduce new software-related technologies, processes, and tools to the team

Brownie Points:
- Knowledge of Docker & Kubernetes
- Understanding of React Native
- Familiarity with Python

What we have to offer:
- Work with a performance-oriented team driven by ownership and open to experiments with cutting-edge technologies
- Learn to design systems for high accuracy, efficiency, and scalability
- Flexible hours
- Learn to innovate
- Meritocracy-driven, candid startup culture
Posted 3 weeks ago
3.0 - 8.0 years
15 - 30 Lacs
Noida, Gurugram, Delhi / NCR
Hybrid
Salary: 15 to 30 LPA Exp: 3 to 8 years Location: Gurgaon (Hybrid) Notice: Immediate to 30 days..!! Key Skills: GCP, Cloud, Pubsub, Data Engineer
Posted 3 weeks ago
4.0 - 8.0 years
10 - 19 Lacs
Chennai
Hybrid
Greetings from Getronics! We have permanent opportunities for GCP Data Engineers in Chennai.

Position Description: We are currently seeking a seasoned GCP Cloud Data Engineer with 3 to 5 years of experience in leading/implementing GCP data projects, preferably having implemented a complete data-centric model. This position is to design and deploy a data-centric architecture in GCP for a Materials Management platform which would exchange data with multiple applications, modern and legacy, in Product Development, Manufacturing, Finance, Purchasing, N-Tier Supply Chain, and Supplier Collaboration.

- Design and implement data-centric solutions on Google Cloud Platform (GCP) using various GCP tools like Storage Transfer Service, Cloud Data Fusion, Pub/Sub, Dataflow, Cloud compression, Cloud Scheduler, gsutil, FTP/SFTP, Dataproc, Bigtable, etc.
- Build ETL pipelines to ingest data from heterogeneous sources into our system
- Develop data processing pipelines using programming languages like Java and Python to extract, transform, and load (ETL) data
- Create and maintain data models, ensuring efficient storage, retrieval, and analysis of large datasets
- Deploy and manage databases, both SQL and NoSQL, such as Bigtable, Firestore, or Cloud SQL, based on project requirements

Skills Required:
- GCP Data Engineer, Hadoop, Spark/PySpark, Google Cloud Platform services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine
- 4+ years of professional experience in data engineering, data product development and software product launches
- 3+ years of cloud data/software engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines using: data warehouses like Google BigQuery; workflow orchestration tools like Airflow; relational database management systems like MySQL, PostgreSQL, and SQL Server
- Real-time data streaming platforms like Apache Kafka and GCP Pub/Sub

Education Required: Any Bachelor's degree. Candidate should be willing to take a GCP assessment (1-hour online test).

LOOKING FOR IMMEDIATE TO 30 DAYS NOTICE CANDIDATES ONLY.

Regards,
Narmadha
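The streaming-pipeline skills listed above (Kafka, Pub/Sub, Dataflow) ultimately come down to windowed aggregation over event streams. Here is a pure-Python sketch of a tumbling-window count; the event shapes and the 60-second window size are illustrative assumptions, and a real pipeline would express this with Beam/Dataflow windowing primitives.

```python
# Pure-Python sketch of a tumbling-window count, the basic building block of
# streaming aggregation. Event shapes and window size are assumptions.

def tumbling_window_counts(events, window_seconds=60):
    """Group (timestamp, key) events into fixed windows and count per key."""
    counts = {}
    for ts, key in events:
        # Align the timestamp down to the start of its window.
        window_start = ts - (ts % window_seconds)
        counts.setdefault((window_start, key), 0)
        counts[(window_start, key)] += 1
    return counts

events = [(5, "click"), (42, "click"), (61, "click"), (70, "view")]
counts = tumbling_window_counts(events)
```

Production systems additionally handle late and out-of-order events (watermarks, allowed lateness), which this sketch ignores.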
Posted 3 weeks ago
8 - 13 years
14 - 24 Lacs
Chennai
Hybrid
Greetings from Getronics! We have permanent opportunities for GCP Data Engineers in Chennai. This is Abirami from the Getronics Talent Acquisition team. We have multiple opportunities for Senior GCP Data Engineers for our automotive client in Chennai (Sholinganallur). Please find below the company profile and job description. If interested, please share your updated resume, recent professional photograph and Aadhaar proof at the earliest to abirami.rsk@getronics.com.

Company: Getronics (permanent role)
Client: Automobile industry
Experience Required: 8+ years in IT and a minimum of 4+ years in GCP Data Engineering
Location: Chennai (Elcot, Sholinganallur)
Work Mode: Hybrid

Position Description: We are currently seeking a seasoned GCP Cloud Data Engineer with 4+ years of experience in leading/implementing GCP data projects, preferably having implemented a complete data-centric model. This position is to design and deploy a data-centric architecture in GCP for a Materials Management platform which would exchange data with multiple applications, modern and legacy, in Product Development, Manufacturing, Finance, Purchasing, N-Tier Supply Chain, and Supplier Collaboration.

- Design and implement data-centric solutions on Google Cloud Platform (GCP) using various GCP tools like Storage Transfer Service, Cloud Data Fusion, Pub/Sub, Dataflow, Cloud compression, Cloud Scheduler, gsutil, FTP/SFTP, Dataproc, Bigtable, etc.
- Build ETL pipelines to ingest data from heterogeneous sources into our system
- Develop data processing pipelines using programming languages like Java and Python to extract, transform, and load (ETL) data
- Create and maintain data models, ensuring efficient storage, retrieval, and analysis of large datasets
- Deploy and manage databases, both SQL and NoSQL, such as Bigtable, Firestore, or Cloud SQL, based on project requirements
- Collaborate with cross-functional teams to understand data requirements and design scalable solutions that meet business needs
- Implement security measures and data governance policies to ensure the integrity and confidentiality of data
- Optimize data workflows for performance, reliability, and cost-effectiveness on the GCP infrastructure

Skills Required:
- GCP Data Engineer, Hadoop, Spark/PySpark, Google Cloud Platform services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine
- 8+ years of professional experience in data engineering, data product development and software product launches
- 4+ years of cloud data engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines using: data warehouses like Google BigQuery; workflow orchestration tools like Airflow; relational database management systems like MySQL, PostgreSQL, and SQL Server; real-time data streaming platforms like Apache Kafka and GCP Pub/Sub

Education Required: Any Bachelor's degree

LOOKING FOR IMMEDIATE TO 30 DAYS NOTICE CANDIDATES ONLY.

Regards,
Abirami
Getronics Recruitment team
Posted 4 weeks ago
7 - 11 years
6 - 10 Lacs
Mumbai
Work from Office
Skill required: Procure to Pay Processing - Invoice Processing Operations Designation: Management Level - Team Lead/Consultant Job Location: Mumbai Qualifications: Any Graduation Years of Experience: 7 to 11 years What would you do? The incumbent should be an expert in the accounts payable lifecycle and will be responsible for: Must be flexible in working hours, UK/US (EST hours in US shift if required). Managing a team of 30-35 FTEs for the end-to-end process. Efficiently delivering the service for the end-to-end PTP process, which includes invoice processing, payments, AP helpdesk, AP account reconciliation, vendor statement reconciliation, and T&E. The role is also expected to perform the smooth transition of PTP sub-processes. He/she must have independently managed the accounts payable process for an international client and worked in a BPO organization in prior assignments, at least 7-8 years out of 10-12 years. The Procure to Pay Processing team helps clients and organizations by boosting vendor compliance, cutting savings erosion, improving discount capture using preferred suppliers, and confirming pricing and terms prior to payment. The team is responsible for accounting of goods and services through requisitioning, purchasing, and receiving. They also look after the order sequence of procurement and the financial process end to end. In Invoice Processing Operations you will ensure efficient and accurate processing of expense invoices/claims in adherence with client policy and procedures. You will be working on audit claims in accordance with client policies and procedures. You will save/post invoices in the ERP, verify WHT, and resolve VAT/WHT discrepancies. You will also be required to post invoices for payment and work on the PO process, non-PO, credit notes, 2-way match & 3-way match, email management, and ERP knowledge. What are we looking for?
Adaptable and flexible Ability to perform under pressure Problem-solving skills Detail orientation Ability to establish strong client relationships Minimum 10-12 years of AP experience in BPO, of which a minimum of 7-8 years in lead roles in different capacities Minimum Bachelor's degree in Finance, Accounting, or a related field Advanced knowledge of AP concepts and applications Strong understanding of AP metrics and SLAs and the factors that influence them Systems and applications: experience working in SAP/Oracle ERP would be an added advantage. Intermediate knowledge of MS Office tools (Excel/Word/PPT) is a must; advanced Excel knowledge would be an added advantage. Ability to run/support automation/RPA/process improvement initiatives in parallel to the core job Ability to interact with client finance leads and understand the business and process. Excellent communication skills, both oral and written, as the role requires interacting with client leadership; should be able to articulate things clearly. Good understanding of risks and issues, with the ability to anticipate potential risks in a process and set mitigation plans/controls to eliminate or minimize them. Roles and Responsibilities The Role: The incumbent should be an expert in the accounts payable lifecycle and will be responsible for: Must be flexible in working hours, UK/US (EST hours in US shift if required). Managing a team of 30-35 FTEs for the end-to-end process. Efficiently delivering the service for the end-to-end PTP process, which includes invoice processing, payments, AP helpdesk, AP account reconciliation, vendor statement reconciliation, and T&E. The role is also expected to perform the smooth transition of PTP sub-processes.
He/she must have independently managed the accounts payable process for an international client and worked in a BPO organization in prior assignments, at least 7-8 years out of 10-12 years. Functional Responsibilities: Complete understanding of the accounts payable life cycle, with in-depth knowledge of processing all categories of invoices (PO, non-PO, OTP invoices, utility invoices, statutory payments, payments, vendor master, AP helpdesk). Should be an expert in managing all PTP sub-processes. Should have experience handling international clients in a BPM organization. Must possess great interpersonal skills, have experience speaking to client leads, and hold regular governance. Manage AP teams and processes in accordance with documented procedures and policies. Participate in the weekly and monthly governance calls and manage the status call. Lead the resolution of complex or sensitive issues from client, senior management, or vendor queries on a timely basis. Track the progress of knowledge transfer and transition, and proactively work to fix any deviations. Monitor process and operational KPIs to ensure effective delivery against targets and benchmarks. Manage and oversee control procedures and practices to ensure no significant SOX control deficiencies in the AP delivery sub-function. Drive controls and compliance in the process and ensure 100% noiseless operations. Identify and support AP improvement initiatives to drive operational efficiencies and improved controls. Manage required and appropriate reporting to facilitate informed decision making (e.g. aging, forecasted payables). Support regional leadership through business partnering by providing metrics, problem resolution, and reporting on process performance. Maintain files and documentation thoroughly and accurately, in accordance with company policy.
People Management Responsibilities: Supervise and manage a PTP team with multiple sub-processes, approximately 30-35 team members, ensuring communication and coordination across teams. Work closely with team leads and SMEs to drive business transformation.
Posted 1 month ago
5 - 10 years
20 - 35 Lacs
Bengaluru
Hybrid
GCP Data Engineer - 5+ years of experience - GCP (all services needed for big data pipelines, such as BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, App Engine), Spark, Scala, Hadoop - Python, PySpark, orchestration (Airflow), SQL - CI/CD (experience with deployment pipelines) - Architecture and design of cloud-based big data pipelines and exposure to any ETL tools. Nice to have: GCP certifications
Posted 1 month ago
4 - 7 years
10 - 19 Lacs
Indore, Gurugram, Bengaluru
Work from Office
We need GCP engineers for capacity building. - The candidate should have extensive production experience (1-2 years) in GCP; other cloud experience would be a strong bonus. - Strong background in data engineering, with 2-3 years of experience in big data technologies including Hadoop, NoSQL, Spark, Kafka, etc. - Exposure to enterprise application development is a must. Roles and Responsibilities: 4-7 years of IT experience range is preferred. Able to effectively use GCP managed services, e.g. Dataproc, Dataflow, Pub/Sub, Cloud Functions, BigQuery, GCS - at least 4 of these services. Good to have knowledge of Cloud Composer, Cloud SQL, Bigtable, Cloud Functions. Strong experience in big data technologies - Hadoop, Sqoop, Hive, and Spark, including DevOps. Good hands-on expertise in either Python or Java programming. Good understanding of GCP core services like Google Cloud Storage, Google Compute Engine, Cloud SQL, Cloud IAM. Good to have knowledge of GCP services like App Engine, GKE, Cloud Run, Cloud Build, Anthos. Ability to drive the deployment of customers' workloads into GCP and provide guidance, a cloud adoption model, service integrations, appropriate recommendations to overcome blockers, and technical roadmaps for GCP cloud implementations. Experience with technical solutions based on industry standards using GCP IaaS, PaaS, and SaaS capabilities. Extensive real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures with a focus on cloud technologies. Act as a subject-matter expert or developer around GCP and become a trusted advisor to multiple teams. Technical ability to obtain the required GCP technical certifications.
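Production batch pipelines of the kind this posting describes generally need idempotent loads, so that replaying a batch after a failure does not duplicate rows. A small, GCP-free sketch of upsert-by-key merge logic in Python (the table rows and key name are illustrative assumptions; in BigQuery the equivalent would typically be a MERGE statement or a truncate-and-load write disposition):

```python
def merge_batches(existing, incoming, key="id"):
    """Upsert incoming rows into existing rows by key, so replaying
    the same batch twice leaves the result unchanged (idempotent)."""
    merged = {row[key]: row for row in existing}
    for row in incoming:
        merged[row[key]] = row  # last write wins per key
    return sorted(merged.values(), key=lambda r: r[key])

existing = [{"id": 1, "qty": 5}]
batch = [{"id": 1, "qty": 7}, {"id": 2, "qty": 3}]
once = merge_batches(existing, batch)
twice = merge_batches(once, batch)  # replaying the batch changes nothing
```

Making every load step idempotent like this is what lets an orchestrator such as Airflow safely retry a failed task without manual cleanup.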
Posted 1 month ago
10 - 14 years
12 - 16 Lacs
Bengaluru
Work from Office
Skill required: Tech for Operations - Artificial Intelligence (AI) Designation: AI/ML Computational Science Associate Manager Qualifications: Any Graduation Years of Experience: 10 to 14 years What would you do? You will be part of the Technology for Operations team that acts as a trusted advisor and partner to Accenture Operations. The Tech For Operations (TFO) team provides innovative and secure technologies to help clients build an intelligent operating model, driving exceptional results, and works closely with the sales, offering, and delivery teams to identify and build innovative solutions. Major sub-deals include AHO (Application Hosting Operations), ISMT (Infrastructure Management), and Intelligent Automation. In Artificial Intelligence, you will enhance business results by using AI tools and techniques to perform tasks that require human intelligence, such as visual perception, speech recognition, decision-making, and translation between languages. What are we looking for?
Artificial Neural Networks (ANNs) Machine Learning Results orientation Problem-solving skills Ability to perform under pressure Strong analytical skills Written and verbal communication Roles and Responsibilities: In this role you are required to analyze and solve moderately complex problems. Typically creates new solutions, leveraging and, where needed, adapting existing methods and procedures. The person requires understanding of the strategic direction set by senior management as it relates to team goals. Primary upward interaction is with the direct supervisor or team leads. Generally interacts with peers and/or management levels at a client and/or within Accenture. The person should require minimal guidance when determining methods and procedures on new assignments. Decisions often impact the team in which they reside and occasionally impact other teams. The individual would manage medium to small sized teams and/or work efforts (if in an individual contributor role) at a client or within Accenture. Please note that this role may require you to work in rotational shifts. Qualifications: Any Graduation
Posted 1 month ago
16 - 25 years
15 - 20 Lacs
Bengaluru
Work from Office
Skill required: Tech for Operations - Technological Innovation Designation: Program & Project Management Senior Manager Qualifications: Any Graduation Years of Experience: 16 to 25 years What would you do? You will be part of the Technology for Operations team that acts as a trusted advisor and partner to Accenture Operations. The Tech For Operations (TFO) team provides innovative and secure technologies to help clients build an intelligent operating model, driving exceptional results, and works closely with the sales, offering, and delivery teams to identify and build innovative solutions. Major sub-deals include AHO (Application Hosting Operations), ISMT (Infrastructure Management), and Intelligent Automation. In Technological Innovation, you will work in the scientific field of innovation studies, which serves to explain the nature and rate of technological change. You will have to understand new products, processes, and significant technological changes of products and processes. What are we looking for?
Roles and Responsibilities: In this role you are required to identify and assess complex problems for your area(s) of responsibility. The individual should create solutions in situations in which analysis requires in-depth knowledge of organizational objectives. The role requires involvement in setting strategic direction to establish near-term goals for the area(s) of responsibility. Interaction is with senior management levels at a client and/or within Accenture, involving negotiating or influencing on significant matters. The individual should have latitude in decision-making and in determining objectives and approaches to critical assignments. Their decisions have a lasting impact on the area of responsibility, with the potential to impact areas outside of their own responsibility. The individual manages large teams and/or work efforts (if in an individual contributor role) at a client or within Accenture. Please note that this role may require you to work in rotational shifts.
Posted 1 month ago