4 - 9 years
14 - 19 Lacs
Pune
Work from Office
About The Role: We are looking for a passionate and self-motivated Technology Leader to join our team in the Accounting domain. As part of a diverse, multi-disciplinary global team, you will collaborate with other disciplines to shape technology strategy, drive engineering excellence and deliver business outcomes.
What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
* Best in class leave policy
* Gender neutral parental leaves
* 100% reimbursement under childcare assistance benefit (gender neutral)
* Sponsorship for industry-relevant certifications and education
* Employee Assistance Program for you and your family members
* Comprehensive hospitalization insurance for you and your dependents
* Accident and term life insurance
* Complimentary health screening for those aged 35 and above
This role is responsible for the design and implementation of high-quality technology solutions. The candidate should have demonstrated technical expertise and excellent problem-solving skills. The candidate is expected to:
* Be a hands-on engineering lead involved in analysis, design, design/code reviews, coding and release activities
* Champion engineering best practices and guide/mentor the team to achieve high performance
* Work closely with business stakeholders, the Tribe Lead, the Product Owner and the Lead Architect to successfully deliver the business outcomes
* Acquire functional knowledge of the business capability being digitized/re-engineered
* Demonstrate ownership, inspire others, show innovative thinking and a growth mindset, and collaborate for success
* Focus on upskilling people, team building and career development
* Keep up to date with industry trends and developments
Your Skills & Experience:
* Minimum 15 years of IT industry experience in full-stack development
* Expert in Java, Spring Boot, NodeJS, SQL/PLSQL, ReactJS
* Strong experience in big data processing: Apache Spark, Hadoop, BigQuery, DataProc, Dataflow etc.
* Strong experience in Kubernetes, OpenShift container platform
* Experience with databases (Oracle, PostgreSQL, MongoDB, Redis/Hazelcast); should understand data modeling, normalization, and performance optimization
* Experience in message queues (RabbitMQ/IBM MQ, JMS) and data streaming, i.e. Kafka, Pub/Sub etc.
* Experience of working on public cloud (GCP preferred), AWS or Azure
* Knowledge of various distributed/multi-tiered architecture styles: microservices, data mesh, integration patterns etc.
* Experience with modern software product delivery practices, processes and tooling, and BizDevOps skills such as CI/CD pipelines using Jenkins, GitHub Actions etc.
* Experience designing solutions based on DDD and implementing Clean/Hexagonal Architecture for efficient systems that can handle large-scale operation
* Experience leading teams and mentoring developers
* Focus on quality: experience with TDD, BDD, stress and contract tests
* Proficient in working with APIs (Application Programming Interfaces) and data formats like JSON, XML, YAML, Parquet etc.
Advantageous:
* Prior experience in the Banking/Finance domain
* Experience with hybrid cloud solutions, preferably using GCP
* Experience with product development
How we'll support you:
* Training and development to help you excel in your career
* Coaching and support from experts in your team
* A culture of continuous learning to aid progression
* A range of flexible benefits that you can tailor to suit your needs
About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 1 month ago
11 - 21 years
25 - 40 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Warm greetings from SP Staffing Services Private Limited!! We have an urgent opening with our CMMI Level 5 client for the below position. Please send your updated profile if you are interested.
Relevant Experience: 3 - 20 Yrs
Location: Pan India
Job Description:
Skills: GCP, BigQuery, Cloud Composer, Cloud Data Fusion, Python, SQL
* 5-20 years of overall experience, mainly in the data engineering space
* 2+ years of hands-on experience in GCP cloud data implementation
* Experience of working in client-facing roles in a technical capacity as an Architect
* Must have implementation experience of a GCP-based cloud data project/program as a solution architect
* Proficiency in using the Google Cloud Architecture Framework in a data context
* Expert knowledge and experience of the core GCP data stack including BigQuery, DataProc, DataFlow, Cloud Composer etc.
* Exposure to the overall Google tech stack of Looker/Vertex AI/Dataplex etc.
* Expert-level knowledge of Spark; extensive hands-on experience working with data using SQL and Python
* Strong experience and understanding of very large-scale data architecture, solutioning, and operationalization of data warehouses, data lakes, and analytics platforms (both cloud and on-premise)
* Excellent communication skills with the ability to clearly present ideas, concepts, and solutions
If interested, please forward your updated resume to sankarspstaffings@gmail.com / Sankar@spstaffing.in or you can reach me @ 8939853050.
With Regards,
Sankar G
Sr. Executive - IT Recruitment
Posted 1 month ago
5 - 10 years
9 - 13 Lacs
Chennai
Work from Office
Overview: GCP all services (Pub/Sub, BigQuery, Airflow, Dataproc, Cloud Composer, GCS) + Teradata
Responsibilities: GCP all services (Pub/Sub, BigQuery, Airflow, Dataproc, Cloud Composer, GCS) + Teradata
Requirements: GCP all services (Pub/Sub, BigQuery, Airflow, Dataproc, Cloud Composer, GCS) + Teradata
Posted 1 month ago
2 - 6 years
7 - 11 Lacs
Hyderabad
Work from Office
As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys. Your primary responsibilities include:
* Comprehensive feature development and issue resolution: working on end-to-end feature development and solving challenges faced in the implementation
* Stakeholder collaboration and issue resolution: collaborating with key stakeholders, internal and external, to understand the problems and issues with the product and features, and solving them as per defined SLAs
* Continuous learning and technology integration: being eager to learn new technologies and implementing them in feature development
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
* GCP using Pub/Sub, BigQuery, Dataflow, Cloud Workflow/Cloud Scheduler, Cloud Run, Dataproc, Cloud Functions
* Cloud data engineers with GCP PDE certification and working experience with GCP
* Building end-to-end data pipelines in GCP using Pub/Sub, BigQuery, Dataflow, Cloud Workflow/Cloud Scheduler, Cloud Run, Dataproc, Cloud Functions
* Experience in logging and monitoring of GCP services, and experience in Terraform and infrastructure automation
* Expertise in the Python coding language
* Develops, supports and maintains data engineering solutions on the Google Cloud ecosystem
Preferred technical and professional experience:
* Stay updated with the latest trends and advancements in cloud technologies, frameworks, and tools
* Conduct code reviews and provide constructive feedback to maintain code quality and ensure adherence to best practices
* Troubleshoot and debug issues, and deploy applications to the cloud platform
Posted 1 month ago
3 - 6 years
9 - 13 Lacs
Bengaluru
Work from Office
locations: Tower 02, Manyata Embassy Business Park, Racenahali & Nagawara Villages, Outer Ring Rd, Bangalore 540065
time type: Full time
posted on: Posted 5 Days Ago
job requisition id: R0000388711
About us: As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values different voices and lifts each other up. Here, we believe your unique perspective is important, and you'll build relationships by being authentic and respectful.
Overview about TII: At Target, we have a timeless purpose and a proven strategy, and that hasn't happened by accident. Some of the best minds from different backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Target's global team and has more than 4,000 team members supporting the company's global strategy and operations.
Team Overview: Every time a guest enters a Target store or browses Target.com or the app, they experience the impact of Target's investments in technology and innovation. We're the technologists behind one of the most loved retail brands, delivering joy to millions of our guests, team members, and communities. Join our global in-house technology team of more than 5,000 engineers, data scientists, architects and product managers striving to make Target the most convenient, safe and joyful place to shop. We use agile practices and leverage open-source software to adapt and build best-in-class technology for our team members and guests, and we do so with a focus on diversity and inclusion, experimentation and continuous learning. At Target, we are gearing up for exponential growth and continuously expanding our guest experience. To support this expansion, Data Engineering is building robust warehouses and enhancing existing datasets to meet business needs across the enterprise. We are looking for talented individuals who are passionate about innovative technology and data warehousing, and are eager to contribute to data engineering.
Position Overview:
* Assess client needs and convert business requirements into a business intelligence (BI) solutions roadmap relating to complex issues involving long-term or multi-work streams
* Analyze technical issues and questions, identifying data needs and delivery mechanisms
* Implement data structures using best practices in data modeling, ETL/ELT processes, Spark, Scala, SQL, database, and OLAP technologies
* Manage the overall development cycle, driving best practices and ensuring development of high-quality code for common assets and framework components
* Develop test-driven solutions and BI applications that can be deployed quickly and in an automated fashion, providing technical guidance and contributing heavily to a team of high-caliber Data Engineers
* Manage and execute against agile plans and set deadlines based on client, business, and technical requirements
* Drive resolution of technology roadblocks including code, infrastructure, build, deployment, and operations
* Ensure all code adheres to development & security standards
About you:
* 4-year degree or equivalent experience
* 5+ years of software development experience, preferably in data engineering/Hadoop development (Hive, Spark etc.)
* Hands-on experience in object-oriented or functional programming such as Scala / Java / Python
* Knowledge or experience with a variety of database technologies (Postgres, Cassandra, SQL Server)
* Knowledge of design of data integration using API and streaming technologies (Kafka) as well as ETL and other data integration patterns
* Experience with cloud platforms like Google Cloud, AWS, or Azure; hands-on experience with BigQuery will be an added advantage
* Good understanding of distributed storage (HDFS, Google Cloud Storage, Amazon S3) and processing (Spark, Google Dataproc, Amazon EMR or Databricks)
* Experience with a CI/CD toolchain (Drone, Jenkins, Vela, Kubernetes) a plus
* Familiarity with data warehousing concepts and technologies; maintains technical knowledge within areas of expertise
* Constant learner and team player who enjoys solving tech challenges with a global team
* Hands-on experience in building complex data pipelines and flow optimizations
* Able to understand the data, draw insights, make recommendations, and identify any data quality issues upfront
* Experience with test-driven development and software test automation
* Follows best coding practices & engineering guidelines as prescribed
* Strong written and verbal communication skills with the ability to present complex technical information in a clear and concise manner to a variety of audiences
Posted 1 month ago
5 - 10 years
9 - 19 Lacs
Chennai, Bengaluru, Mumbai (All Areas)
Hybrid
Google BigQuery
Location: Pan India
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Key Responsibilities: Analyze and model client market and key performance data. Use analytical tools and techniques to develop business insights and improve decision-making.
1: Dataproc, Pub/Sub, Dataflow, Kafka Streaming, Looker, SQL (No FLEX)
2: Proven track record of delivering data integration and data warehousing solutions
3: Strong SQL and hands-on proficiency in the BigQuery SQL language; experience in shell scripting and Python (No FLEX)
4: Experience with data integration and migration projects; Oracle SQL
Technical Experience: Google BigQuery
1: Expert in Python (No FLEX); strong hands-on knowledge of SQL (No FLEX); Python programming using Pandas and NumPy; deep understanding of various data structures (dictionary, array, list, tree etc.); experience in pytest and code coverage
2: Experience building solutions using cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer and Kubernetes (No FLEX)
3: Proficiency with tools to automate AZDO CI/CD pipelines, such as Control-M, GitHub, JIRA, Confluence
Professional Attributes:
Posted 1 month ago
8 - 13 years
12 - 20 Lacs
Bengaluru
Hybrid
Project Role: Cloud Platform Engineer
Project Role Description: Designs, builds, tests, and deploys cloud application solutions that integrate cloud and non-cloud infrastructure. Can deploy infrastructure and platform environments, and creates a proof of architecture to test architecture viability, security and performance.
Must have skills: Google Cloud Platform Architecture
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: Any bachelor's degree
Summary: As a Cloud Platform Engineer, you will be responsible for designing, building, testing, and deploying cloud application solutions that integrate cloud and non-cloud infrastructure. You will deploy infrastructure and platform environments and create proofs of architecture to test architecture viability, security, and performance.
Roles & Responsibilities:
- Expected to be an SME
- Collaborate and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute on key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead the implementation of cloud solutions
- Optimize cloud infrastructure for performance and cost-efficiency
- Troubleshoot and resolve technical issues
Professional & Technical Skills:
- Must Have Skills: Proficiency in Google Cloud Platform Architecture
- Strong understanding of cloud architecture principles
- Experience with DevOps practices
- Experience with Google Cloud SQL
- Hands-on experience in cloud deployment and management
- Knowledge of security best practices in cloud environments
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Google Cloud Platform Architecture
- This position is based at our Bengaluru office
- A bachelor's degree is required
Posted 1 month ago
4 - 7 years
6 - 9 Lacs
Chennai
Work from Office
Skills: Google BigQuery, SQL, Python, Apache Airflow, Oracle to BigQuery DWH migration and modernization, DataProc, GCS, PySpark, Oracle DB and PL/SQL
Required Candidate profile
Notice Period: 0-15 days
Education: BE, B.Tech, ME, M.Tech
Posted 2 months ago
5 - 10 years
8 - 12 Lacs
Chennai, Bengaluru, Hyderabad
Work from Office
Skills: GCP Hadoop, handling batch data processing on the Hadoop ecosystem, Scala Spark, Hive, GCP skills like BigQuery, GCS and DataProc, and coding in Java, Python and Hadoop
Location: Pune, Noida, Gurugram
Required Candidate profile
Notice Period: Not Available
Education: Not Available
Posted 2 months ago
5 - 10 years
0 - 1 Lacs
Chennai
Hybrid
Must have: Gen AI, Agentic AI, Computer Vision, TensorFlow, PyTorch, etc.
* Proficiency in programming languages such as Python, with hands-on experience in AI/ML libraries, Gen AI, Cloud ML, Vertex AI services and frameworks (e.g., TensorFlow, PyTorch, scikit-learn)
* Solid understanding of Gen AI, Agentic AI, prompt engineering, and AI/ML concepts, algorithms, and techniques
* Experience with data preprocessing, feature engineering, and data visualization
* Strong analytical and problem-solving skills, with the ability to analyze complex datasets and extract actionable insights
* Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and experience deploying AI/ML models in cloud environments
* Knowledge of software development practices, version control systems (e.g., Git), and agile methodologies
* Excellent communication and collaboration skills to work effectively in cross-functional teams
* Strong organizational and time management skills, with the ability to prioritize and manage multiple projects simultaneously
* Good to have: hands-on experience with NLP projects
Responsibilities:
* Design and develop AI/ML models, algorithms, and applications to solve complex business problems
* Collaborate with data scientists and subject matter experts to understand business requirements and translate them into AI/ML solutions
* Implement and optimize AI/ML algorithms and models using programming languages such as Python, R, or Java
* Conduct data preprocessing, feature engineering, and data exploration to ensure the availability of high-quality data for training and evaluation
* Evaluate and benchmark different AI/ML models, frameworks, and tools to identify the most suitable solutions for specific use cases
* Train, validate, and fine-tune AI/ML models using various techniques such as deep learning, reinforcement learning, and natural language processing (a small illustrative example follows this posting)
* Deploy AI/ML models in production environments, ensuring scalability, performance, and reliability
* Collaborate with software engineers to integrate AI/ML solutions into existing applications or develop new applications
* Stay up to date with the latest advancements in AI/ML technologies, research papers, and industry trends
* Provide technical guidance and support to junior team members and promote knowledge sharing within the COE team
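For context on the train/validate/evaluate workflow this role describes, here is a minimal, hedged sketch using scikit-learn on a synthetic dataset; the dataset, model choice and metric are illustrative assumptions, not details from the posting.

```python
# Minimal illustrative sketch of a train/validate/evaluate loop with scikit-learn.
# The synthetic data, model choice and metric are assumptions for demonstration only.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, classification_report

# Generate a toy dataset standing in for preprocessed, feature-engineered business data.
X, y = make_classification(n_samples=2_000, n_features=20, n_informative=8, random_state=42)

# Hold out a validation split to estimate generalization before any deployment step.
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

preds = model.predict(X_val)
print("Validation accuracy:", accuracy_score(y_val, preds))
print(classification_report(y_val, preds))
```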
Posted 2 months ago
8 - 13 years
25 - 40 Lacs
Bengaluru
Remote
Senior GCP Cloud Administrator
Experience: 8 - 12 Years
Salary: Competitive
Preferred Notice Period: Within 30 Days
Shift: 10:00 AM to 7:00 PM IST
Opportunity Type: Remote
Placement Type: Permanent
(*Note: This is a requirement for one of Uplers' clients)
Must have skills required: GCP, Identity and Access Management (IAM), BigQuery, SRE, GKE, GCP certification
Good to have skills: Terraform, Cloud Composer, Dataproc, Dataflow, AWS
Forbes Advisor (one of Uplers' clients) is looking for a Senior GCP Cloud Administrator who is passionate about their work, eager to learn and grow, and committed to delivering exceptional results. If you are a team player with a positive attitude and a desire to make a difference, then we want to hear from you.
Role Overview: Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most. We are looking for an experienced GCP Administrator to join our team. The ideal candidate will have strong hands-on experience with IAM administration, multi-account management, BigQuery administration, performance optimization, monitoring and cost management within Google Cloud Platform (GCP).
Responsibilities:
* Manage and configure roles/permissions in GCP IAM, following the principle of least-privileged access
* Manage the BigQuery service by optimizing slot assignments and SQL queries, adopting FinOps practices for cost control, and troubleshooting and resolving critical data queries (see the query sketch after this posting)
* Collaborate with teams like Data Engineering, Data Warehousing, Cloud Platform Engineering, SRE, etc. for efficient data management and operational practices in GCP
* Create automations and monitoring mechanisms for GCP data-related services, processes and tasks
* Work with development teams to design the GCP-specific cloud architecture
* Provision and de-provision GCP accounts and resources for internal projects
* Manage and operate multiple GCP subscriptions
* Keep technical documentation up to date
* Proactively stay up to date on GCP announcements, services and developments
Requirements:
* Must have 5+ years of work experience provisioning, operating, and maintaining systems in GCP
* Must have a valid certification as either a GCP Associate Cloud Engineer or GCP Professional Cloud Architect
* Must have hands-on experience with GCP services such as Identity and Access Management (IAM), BigQuery, Google Kubernetes Engine (GKE), etc.
* Must be capable of providing support and guidance on GCP operations and services depending upon enterprise needs
* Must have a working knowledge of Docker containers and Kubernetes
* Must have strong communication skills and the ability to work both independently and in a collaborative environment
* Fast learner, achiever, sets high personal goals
* Must be able to work on multiple projects and consistently meet project deadlines
* Must be willing to work on a shift basis, based on project requirements
Good to Have:
* Experience in Terraform automation over GCP infrastructure provisioning
* Experience in Cloud Composer, Dataproc, Dataflow, storage and monitoring services
* Experience in building and supporting any form of data pipeline
* Multi-cloud experience with AWS
* New Relic monitoring
Perks:
* Day off on the 3rd Friday of every month (one long weekend each month)
* Monthly Wellness Reimbursement Program to promote health and well-being
* Paid paternity and maternity leaves
How to apply for this opportunity, an easy 3-step process:
1. Click on Apply! and register or log in on our portal
2. Upload your updated resume and complete the screening form
3. Increase your chances of getting shortlisted and meet the client for the interview!
About Our Client: Forbes Advisor is a global platform dedicated to helping consumers make the best financial choices for their individual lives. We support your pursuit of success by making smart financial decisions simple, to help you get back to doing the things you care about most.
About Uplers: Our goal is to make hiring and getting hired reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant product and engineering job opportunities and progress in their careers. (Note: There are many more opportunities apart from this on the portal.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
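As an illustration of the FinOps-style BigQuery cost monitoring this role mentions, here is a minimal sketch using the google-cloud-bigquery client; the project id and the "us" multi-region are assumptions, not details from the posting.

```python
# Minimal sketch: surface the most expensive recent BigQuery jobs for cost review.
# Assumes the google-cloud-bigquery client library, default credentials, and the
# "us" multi-region; the project id "my-project" is a placeholder.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

# INFORMATION_SCHEMA.JOBS_BY_PROJECT exposes per-job billing and slot usage.
query = """
SELECT
  user_email,
  job_id,
  total_bytes_billed / POW(10, 9) AS gb_billed,
  total_slot_ms / 1000 AS slot_seconds
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
  AND job_type = 'QUERY'
ORDER BY total_bytes_billed DESC
LIMIT 20
"""

for row in client.query(query).result():
    print(f"{row.user_email}  {row.job_id}  {row.gb_billed:.2f} GB  {row.slot_seconds:.0f} slot-s")
```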
Posted 2 months ago
4 - 6 years
5 - 10 Lacs
Bengaluru
Work from Office
At Sogeti, we believe the best is inside every one of us. Whether you are early in your career or at the top of your game, we'll encourage you to fulfill your potential to be better. Through our shared passion for technology, our entrepreneurial culture, and our focus on continuous learning, we'll provide everything you need to do your best work and become the best you can be.
About The Role: Hands-on experience in Oracle DBA
About The Role - Grade Specific: Hands-on experience in Oracle DBA
Skills (competencies): Ab Initio, Agile (Software Development Framework), Apache Hadoop, AWS Airflow, AWS Athena, AWS Code Pipeline, AWS EFS, AWS EMR, AWS Redshift, AWS S3, Azure ADLS Gen2, Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Event Hub, Azure Stream Analytics, Azure Synapse, Bitbucket, Change Management, Client Centricity, Collaboration, Continuous Integration and Continuous Delivery (CI/CD), Data Architecture Patterns, Data Format Analysis, Data Governance, Data Modeling, Data Validation, Data Vault Modeling, Database Schema Design, Decision-Making, DevOps, Dimensional Modeling, GCP Big Table, GCP BigQuery, GCP Cloud Storage, GCP DataFlow, GCP DataProc, Git, Google Big Table, Google Data Proc, Greenplum, HQL, IBM Data Stage, IBM DB2, Industry Standard Data Modeling (FSLDM), Industry Standard Data Modeling (IBM FSDM), Influencing, Informatica IICS, Inmon methodology, JavaScript, Jenkins, Kimball, Linux - Redhat, Negotiation, Netezza, NewSQL, Oracle Exadata, Performance Tuning, Perl, Platform Update Management, Project Management, PySpark, Python, R, RDD Optimization, CentOS, SAS, Scala, Spark, Shell Script, Snowflake, Spark Code Optimization, SQL, Stakeholder Management, Sun Solaris, Synapse, Talend, Teradata, Time Management, Ubuntu, Vendor Management
Part of the Capgemini Group, Sogeti makes business value through technology for organizations that need to implement innovation at speed and want a local partner with global scale. With a hands-on culture and close proximity to its clients, Sogeti implements solutions that will help organizations work faster, better, and smarter. By combining its agility and speed of implementation through a DevOps approach, Sogeti delivers innovative solutions in quality engineering, cloud and application development, all driven by AI, data and automation.
Posted 2 months ago
4 - 8 years
10 - 19 Lacs
Chennai
Hybrid
Greetings from Getronics! We have permanent opportunities for GCP Data Engineers in Chennai. Hope you are doing well! This is Abirami from the Getronics Talent Acquisition team. We have multiple opportunities for GCP Data Engineers for our automotive client in Chennai (Sholinganallur) location. Please find below the company profile and job description. If interested, please share your updated resume, a recent professional photograph and Aadhaar proof at the earliest to abirami.rsk@getronics.com.
Company: Getronics (Permanent role)
Client: Automobile Industry
Experience Required: 4+ years in IT and a minimum of 3+ years in GCP Data Engineering
Location: Chennai (Elcot - Sholinganallur)
Work Mode: Hybrid
Position Description: We are currently seeking a seasoned GCP Cloud Data Engineer with 3 to 5 years of experience in leading/implementing GCP data projects, preferably implementing a complete data-centric model. This position is to design and deploy a data-centric architecture in GCP for a Materials Management platform which would get/give data from multiple applications, modern and legacy, in Product Development, Manufacturing, Finance, Purchasing, N-Tier Supply Chain and Supplier Collaboration.
* Design and implement data-centric solutions on Google Cloud Platform (GCP) using various GCP tools like Storage Transfer Service, Cloud Data Fusion, Pub/Sub, Dataflow, cloud compression, Cloud Scheduler, gsutil, FTP/SFTP, Dataproc, Bigtable etc.
* Build ETL pipelines to ingest the data from heterogeneous sources into our system (a simplified sketch of such a pipeline follows this posting)
* Develop data processing pipelines using programming languages like Java and Python to extract, transform, and load (ETL) data
* Create and maintain data models, ensuring efficient storage, retrieval, and analysis of large datasets
* Deploy and manage databases, both SQL and NoSQL, such as Bigtable, Firestore, or Cloud SQL, based on project requirements and infrastructure
Skills Required:
- GCP Data Engineer, Hadoop, Spark/PySpark, Google Cloud Platform services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine
- 4+ years of professional experience in data engineering, data product development and software product launches
- 3+ years of cloud data/software engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines using: data warehouses like Google BigQuery; workflow orchestration tools like Airflow; relational database management systems like MySQL, PostgreSQL, and SQL Server; real-time data streaming platforms like Apache Kafka and GCP Pub/Sub
Education Required: Any Bachelor's degree. Candidates should be willing to take a GCP assessment (1-hour online video test).
LOOKING FOR IMMEDIATE TO 30 DAYS NOTICE CANDIDATES ONLY.
Regards,
Abirami
Getronics Recruitment team
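To make the ETL responsibility above concrete, here is a minimal, hedged Apache Beam sketch that reads CSV lines from Cloud Storage and appends rows to BigQuery; the bucket, table and schema are illustrative assumptions, and a real Dataflow job would add pipeline options for project, region and runner.

```python
# Minimal illustrative batch ETL: GCS CSV -> transform -> BigQuery.
# Bucket, table and schema names are placeholders; run with the DirectRunner locally
# or pass --runner=DataflowRunner plus project/region/temp_location options for Dataflow.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_line(line: str) -> dict:
    """Turn one CSV line into a BigQuery row dict (assumed 3-column layout)."""
    material_id, plant, qty = line.split(",")
    return {"material_id": material_id, "plant": plant, "quantity": int(qty)}


def run() -> None:
    options = PipelineOptions()  # supply --project/--region/--temp_location for Dataflow
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromGCS" >> beam.io.ReadFromText("gs://example-bucket/materials/*.csv")
            | "ParseCsv" >> beam.Map(parse_line)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:materials.inventory",
                schema="material_id:STRING,plant:STRING,quantity:INTEGER",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```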
Posted 2 months ago
4 - 9 years
12 - 22 Lacs
Pune, Hyderabad, Gurgaon
Work from Office
How to Apply: Send your resume to heena.ruchwani@gspann.com to apply now!
Roles and Responsibilities:
* Design, develop, test, deploy, and maintain large-scale data pipelines using Airflow on Google Cloud Platform (GCP)
* Collaborate with cross-functional teams to identify business requirements and design solutions that meet those needs
* Develop complex workflows using Python scripts and PySpark to process large datasets stored in BigQuery (see the sketch below for the general shape of such a job)
* Ensure high availability, scalability, and performance of data processing systems by monitoring logs, troubleshooting issues, and optimizing resource utilization
* Participate in code reviews to ensure adherence to coding standards and best practices
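As a rough illustration of the PySpark-over-BigQuery workflow mentioned above, the sketch below reads a table with the spark-bigquery connector, aggregates it, and writes the result back; the table names and temporary bucket are assumptions, and the connector jar must be on the classpath (it is preinstalled on Dataproc images).

```python
# Minimal sketch: read a BigQuery table with PySpark, aggregate, write back.
# Table names and the temporary GCS bucket are placeholders; assumes the
# spark-bigquery connector is available (default on Dataproc).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("bq-aggregation-sketch").getOrCreate()

orders = (
    spark.read.format("bigquery")
    .option("table", "example-project.sales.orders")
    .load()
)

# Aggregate daily revenue per region; column names are illustrative.
daily_revenue = (
    orders.groupBy("order_date", "region")
    .agg(F.sum("amount").alias("revenue"))
)

(
    daily_revenue.write.format("bigquery")
    .option("table", "example-project.sales.daily_revenue")
    .option("temporaryGcsBucket", "example-temp-bucket")  # needed for the indirect write path
    .mode("overwrite")
    .save()
)
```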
Posted 2 months ago
5 - 10 years
7 - 12 Lacs
Bengaluru
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Apache Spark, Java, Google Dataproc
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Engineer, you will design, develop, and maintain data solutions for data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across systems. You will be responsible for the design and development of data solutions, collaborating with multiple teams, and providing solutions to problems for your immediate team and across multiple teams.
Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute on key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Design, develop, and maintain data solutions for data generation, collection, and processing
- Create data pipelines to migrate and deploy data across systems
- Ensure data quality by implementing ETL processes
- Collaborate with multiple teams to provide solutions to data-related problems
Professional & Technical Skills:
- Must Have Skills: Proficiency in Apache Spark, Java, Google Dataproc
- Good To Have Skills: Experience with Apache Airflow
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity
- Strong experience with multiple database models (SQL, NoSQL, OLTP and OLAP)
- Strong experience with data streaming architecture (Kafka, Spark, Airflow); a brief streaming sketch follows this posting
- Strong knowledge of cloud data platforms and technologies such as GCS, BigQuery, Cloud Composer, Dataproc and other cloud-native offerings
- Knowledge of Infrastructure as Code (IaC) and associated tools (Terraform, Ansible etc.)
- Experience pulling data from a variety of data source types including Mainframe (EBCDIC), fixed-length and delimited files, and databases (SQL, NoSQL, time-series)
- Experience performing analysis with large datasets in a cloud-based environment, preferably with an understanding of Google Cloud Platform (GCP)
- Comfortable communicating with various stakeholders (technical and non-technical)
- GCP Data Engineer Certification is a nice to have
Additional Information:
- The candidate should have a minimum of 5 years of experience in Apache Spark
- This position is based in Bengaluru
- A 15 years full time education is required
Qualifications: 15 years full time education
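For the streaming-architecture skill mentioned above, here is a minimal, hedged Spark Structured Streaming sketch that consumes a Kafka topic and writes a running aggregate to the console; the broker address, topic name and JSON layout are assumptions, and the Kafka SQL connector package must be supplied to spark-submit.

```python
# Minimal sketch: consume a Kafka topic with Spark Structured Streaming and
# maintain a per-key running aggregate. Broker, topic and JSON fields are placeholders;
# submit with --packages org.apache.spark:spark-sql-kafka-0-10_2.12:<spark-version>.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("kafka-streaming-sketch").getOrCreate()

event_schema = StructType([
    StructField("event_type", StringType()),
    StructField("amount", DoubleType()),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
)

# Kafka delivers bytes; decode the value column and parse the assumed JSON payload.
events = (
    raw.select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

counts = events.groupBy("event_type").agg(F.count("*").alias("n"), F.sum("amount").alias("total"))

query = (
    counts.writeStream.outputMode("complete")
    .format("console")
    .start()
)
query.awaitTermination()
```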
Posted 2 months ago
3 - 8 years
5 - 10 Lacs
Bengaluru
Work from Office
Project Role: AI / ML Engineer
Project Role Description: Develops applications and systems that utilize AI to improve performance and efficiency, including but not limited to deep learning, neural networks, chatbots, natural language processing.
Must have skills: Google Cloud Machine Learning Services
Good to have skills: GCP Dataflow, Google Dataproc, Google Pub/Sub
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education
Key Responsibilities:
A: Implement and maintain data engineering solutions using BigQuery, Dataflow, Vertex AI, Dataproc, and Pub/Sub
B: Collaborate with data scientists to deploy machine learning models
C: Ensure the scalability and efficiency of data processing pipelines
Technical Experience:
A: Expertise in BigQuery, Dataflow, Vertex AI, Dataproc, and Pub/Sub
B: Hands-on experience with data engineering in a cloud environment
Professional Attributes:
A: Strong problem-solving skills in optimizing data workflows
B: Effective collaboration with data science and engineering teams
Qualifications: 15 years full time education
Posted 2 months ago
2 - 7 years
4 - 9 Lacs
Coimbatore
Work from Office
Project Role: AI / ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Be able to apply GenAI models as part of the solution. Could also include, but is not limited to, deep learning, neural networks, chatbots, image processing.
Must have skills: Google Cloud Machine Learning Services
Good to have skills: GCP Dataflow, Google Pub/Sub, Google Dataproc
Minimum 2 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: We are seeking a skilled GCP Data Engineer to join our dynamic team. The ideal candidate will design, build, and maintain scalable data pipelines and solutions on Google Cloud Platform (GCP). This role requires expertise in cloud-based data engineering and hands-on experience with GCP tools and services, ensuring efficient data integration, transformation, and storage for various business use cases.
Roles & Responsibilities:
* Design, develop, and deploy data pipelines using GCP services such as Dataflow, BigQuery, Pub/Sub, and Cloud Storage (a minimal Pub/Sub example follows this posting)
* Optimize and monitor data workflows for performance, scalability, and reliability
* Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and implement solutions
* Implement data security and governance measures, ensuring compliance with industry standards
* Automate data workflows and processes for operational efficiency
* Troubleshoot and resolve technical issues related to data pipelines and platforms
* Document technical designs, processes, and best practices to ensure maintainability and knowledge sharing
Professional & Technical Skills:
a) Must Have:
* Proficiency in GCP tools such as BigQuery, Dataflow, Pub/Sub, Cloud Composer, and Cloud Storage
* Expertise in SQL and experience with data modeling and query optimization
* Solid programming skills in Python for data processing and ETL development
* Experience with CI/CD pipelines and version control systems (e.g., Git)
* Knowledge of data warehousing concepts, ELT/ETL processes, and real-time streaming
* Strong understanding of data security, encryption, and IAM policies on GCP
b) Good to Have:
* Experience with Dialogflow or CCAI tools
* Knowledge of machine learning pipelines and integration with AI/ML services on GCP
* Certifications such as Google Professional Data Engineer or Google Cloud Architect
Additional Information:
- The candidate should have a minimum of 3 years of experience in Google Cloud Machine Learning Services, with overall experience of 3-5 years
- The ideal candidate will possess a strong educational background in computer science, mathematics, or a related field, along with a proven track record of delivering impactful data-driven solutions
Qualifications: 15 years full time education
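To illustrate the Pub/Sub piece of the pipeline work above, here is a minimal, hedged publisher/subscriber sketch using the google-cloud-pubsub client; the project, topic and subscription names are placeholders and are assumed to already exist.

```python
# Minimal sketch: publish JSON events to a Pub/Sub topic and pull a few back.
# Project, topic and subscription ids are placeholders and assumed to already exist.
import json
from google.cloud import pubsub_v1

PROJECT = "example-project"

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT, "ingest-events")

# publish() returns a future that resolves to the server-assigned message id.
event = {"source": "erp", "order_id": 42, "amount": 19.99}
future = publisher.publish(topic_path, json.dumps(event).encode("utf-8"))
print("published message id:", future.result())

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(PROJECT, "ingest-events-sub")

# Synchronously pull up to 10 messages and acknowledge whatever arrived.
response = subscriber.pull(request={"subscription": subscription_path, "max_messages": 10})
for msg in response.received_messages:
    print("received:", msg.message.data.decode("utf-8"))

if response.received_messages:
    subscriber.acknowledge(
        request={
            "subscription": subscription_path,
            "ack_ids": [m.ack_id for m in response.received_messages],
        }
    )
```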
Posted 2 months ago
4 - 9 years
7 - 14 Lacs
Chennai
Work from Office
Role & responsibilities:
* Bachelor's Degree
* 4+ years in GCP services: BigQuery, Dataflow, Dataproc, Dataplex, Data Fusion, Terraform, Tekton, Cloud SQL, Redis Memorystore, Airflow, Cloud Storage
* 2+ years in data transfer utilities
* 2+ years in Git / any other version control tool
* 2+ years in Confluent Kafka
* 1+ years of experience in API development
* 2+ years in an Agile framework
* 4+ years of strong experience in Python and PySpark development
* 4+ years of shell scripting to develop ad-hoc jobs for data importing/exporting
Skills Required: Google Cloud Platform - BigQuery, Dataflow, Dataproc, Dataplex, Data Fusion, Terraform, Tekton, Cloud SQL, Confluent Kafka, Airflow, PySpark, Python, Cloud Storage
Skills Preferred: API, Python
Posted 2 months ago
9 - 12 years
25 - 30 Lacs
Delhi NCR, Bengaluru, Hyderabad
Hybrid
This role is ideal for someone with strong technical skills in cloud computing, data engineering, and analytics, who is passionate about working with cutting-edge technologies in GCP to build robust and scalable data solutions.
Key Responsibilities:
Data Architecture and Design:
* Design and implement scalable, reliable, and high-performance data pipelines on Google Cloud
* Define and implement data architecture strategies to store, process, and analyze large datasets efficiently
* Create optimized schemas and ensure data structures meet business requirements
Data Pipeline Development:
* Build and maintain ETL (Extract, Transform, Load) pipelines using tools like Google Cloud Dataflow, Apache Beam, and Cloud Dataproc
* Work with Google Cloud Storage (GCS), BigQuery, and Pub/Sub for data ingestion, storage, and analysis
* Automate and orchestrate data workflows using tools such as Apache Airflow or Google Cloud Composer (a minimal Airflow sketch follows this posting)
Data Processing and Transformation:
* Develop and manage batch and real-time data processing solutions
* Transform raw data into useful formats for analysis or machine learning models using BigQuery, Dataflow, or Dataproc
Collaboration:
* Collaborate with data scientists, analysts, and other stakeholders to understand business needs and deliver data solutions
* Provide data support for machine learning and AI models, ensuring the data is clean, structured, and properly ingested
Optimization and Monitoring:
* Monitor and optimize the performance of data pipelines, ensuring minimal downtime and efficient use of resources
* Troubleshoot data issues and resolve bottlenecks in the pipeline or storage systems
Required Skills and Qualifications:
* Experience with GCP: strong experience with GCP services like BigQuery, Google Cloud Storage (GCS), Dataflow, Dataproc, Pub/Sub, Cloud Composer, and Cloud Functions
* Programming languages: proficient in languages such as Python, Java, or Scala for developing data pipelines and processing
* ETL tools: experience with ETL frameworks and tools like Apache Beam, Airflow, or Cloud Data Fusion
* Data modeling and warehousing: understanding of data modeling, relational databases, and data warehousing concepts
* SQL and NoSQL databases: strong proficiency in SQL, with experience in data analysis using BigQuery or other relational databases; familiarity with NoSQL databases is a plus
* Cloud infrastructure: knowledge of cloud architecture and infrastructure best practices on GCP
* Data analytics and BI tools: experience working with data visualization tools like Google Data Studio, Tableau, or Looker is a plus
* DevOps practices: experience with CI/CD pipelines, version control systems (e.g., Git), and automated testing
Preferred Skills:
* Experience with containerized environments, including Docker and Kubernetes
* Familiarity with machine learning tools like AI Platform on GCP
* Ability to manage large datasets efficiently and design solutions that scale with growing data volumes
Education and Certifications:
* A bachelor's or master's degree in Computer Science, Information Technology, Data Science, or a related field
* Google Cloud Certified Professional Data Engineer or other relevant certifications are often preferred
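As a sketch of the Airflow/Cloud Composer orchestration mentioned above, the DAG below runs a daily BigQuery transformation; the project, dataset, SQL and schedule are assumptions, and it relies on the standard Google provider package (apache-airflow-providers-google).

```python
# Minimal sketch: a daily Cloud Composer / Airflow DAG that runs one BigQuery
# transformation. Project, dataset and SQL are placeholders; requires the
# apache-airflow-providers-google package (preinstalled on Cloud Composer).
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

TRANSFORM_SQL = """
CREATE OR REPLACE TABLE `example-project.analytics.daily_orders` AS
SELECT order_date, region, SUM(amount) AS revenue
FROM `example-project.raw.orders`
GROUP BY order_date, region
"""

with DAG(
    dag_id="daily_orders_transform",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",   # one run per day; tune to the real SLA
    catchup=False,
    tags=["gcp", "bigquery"],
) as dag:
    transform = BigQueryInsertJobOperator(
        task_id="transform_orders",
        configuration={"query": {"query": TRANSFORM_SQL, "useLegacySql": False}},
        location="US",
    )
```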
Posted 2 months ago
8 - 12 years
30 - 40 Lacs
Pune
Work from Office
Who we are? Searce means a fine sieve and indicates to refine, to analyze, to improve. It signifies our way of working: to improve to the finest degree of excellence, solving for better every time. Searcians are passionate improvers and solvers who love to question the status quo. The primary purpose of all of us at Searce is driving intelligent, impactful and futuristic business outcomes using new-age technology. This purpose is driven passionately by HAPPIER people who aim to become better, every day.
What are we looking for? Searce is looking for a Lead Engineer who is able to work with business leads, analysts, data scientists and fellow engineers to build data products that empower better decision making; someone who is passionate about the data quality of our business metrics and geared up to provide flexible solutions that can be scaled up to respond to broader business questions.
What you'll do as a Lead Data Engineer with us?
1. Understand the business requirements and translate these into data services to solve the business and data problems
2. Develop and manage the transports/data pipelines (ETL/ELT jobs) and retrieve applicable datasets for specific use cases using cloud data platforms and tools
3. Explore new technologies and tools to design complex data modelling scenarios and transformations, and provide optimal data engineering solutions
4. Build data integration layers to connect with different heterogeneous sources using various approaches
5. Understand data and metadata to support consistency of information retrieval, combination, analysis and reporting
6. Troubleshoot and monitor data pipelines to ensure high availability of the reporting layer
7. Collaborate with many teams, engineering and business, to build scalable and optimized data solutions, and propose ways to improve the platform and tools
8. Effectively manage workstreams for self and 2 to 3 analysts to support delivery
What are the must-haves to join us? Is education overrated? Yes, we believe so. But there is no way to locate you otherwise. So we might look for at least a Bachelor's degree in Computer Science and:
1. 5-9 years of experience with building data pipelines or data ingestion for both batch/streaming data from different sources to a data warehouse / data lake
2. Experience leading and delivering data warehousing and analytics projects, including using cloud technologies such as EMR, Lambda, Cloud Storage, BigQuery, etc.
3. Experience on at least 2 to 3 projects building and optimizing big data pipelines, architectures and datasets
4. Hands-on experience/knowledge of SQL/Python/Java/Scala programming; understanding of SQL is a must
5. Experience with any cloud computing platforms like AWS (S3, Lambda functions, Redshift, Athena), GCP (GCS, Dataflow, Dataproc, Pub/Sub, BigQuery), Azure (Blob/Azure Synapse) etc.
6. Knowledge of version control software such as Git and experience in working with relevant hosting services (e.g. Azure DevOps, GitHub, Bitbucket, GitLab)
7. Strong experience working with relevant Python & R packages and ecosystems (e.g. pandas, NumPy, scikit-learn, tidyverse, data tables)
8. Extensive experience in developing complex stored procedures and transformations
9. Experience/knowledge of Big Data tools (Hadoop, Hive, Spark, Presto)
Posted 2 months ago
4 - 7 years
6 - 9 Lacs
Mumbai
Work from Office
Data engineers are responsible for building reliable and scalable data infrastructure that enables organizations to derive meaningful insights, make data-driven decisions, and unlock the value of their data assets.
Job Description - Grade Specific: The role involves leading and managing a team of data engineers, overseeing data engineering projects, ensuring technical excellence, and fostering collaboration with stakeholders. They play a critical role in driving the success of data engineering initiatives and ensuring the delivery of reliable and high-quality data solutions to support the organization's data-driven objectives.
Skills (competencies): Ab Initio, Agile (Software Development Framework), Apache Hadoop, AWS Airflow, AWS Athena, AWS Code Pipeline, AWS EFS, AWS EMR, AWS Redshift, AWS S3, Azure ADLS Gen2, Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Event Hub, Azure Stream Analytics, Azure Synapse, Bitbucket, Change Management, Client Centricity, Collaboration, Continuous Integration and Continuous Delivery (CI/CD), Data Architecture Patterns, Data Format Analysis, Data Governance, Data Modeling, Data Validation, Data Vault Modeling, Database Schema Design, Decision-Making, DevOps, Dimensional Modeling, GCP Big Table, GCP BigQuery, GCP Cloud Storage, GCP DataFlow, GCP DataProc, Git, Google Big Table, Google Data Proc, Greenplum, HQL, IBM Data Stage, IBM DB2, Industry Standard Data Modeling (FSLDM), Industry Standard Data Modeling (IBM FSDM), Influencing, Informatica IICS, Inmon methodology, JavaScript, Jenkins, Kimball, Linux - Redhat, Negotiation, Netezza, NewSQL, Oracle Exadata, Performance Tuning, Perl, Platform Update Management, Project Management, PySpark, Python, R, RDD Optimization, CentOS, SAS, Scala, Spark, Shell Script, Snowflake, Spark Code Optimization, SQL, Stakeholder Management, Sun Solaris, Synapse, Talend, Teradata, Time Management, Ubuntu, Vendor Management
Posted 2 months ago
15 - 20 years
35 - 40 Lacs
Pune
Work from Office
Job Title: Cloud Architecture and Engineering
Corporate Title: Director
Overview: Corporate Banking is a technology-centric business, with an increasing move to real-time processing, an increasing appetite from customers for integrated systems and access to supporting data. This means that technology is more important than ever for the business. Deutsche Bank is one of the few banks with the scale and network to compete aggressively in this space, and the breadth of investment in this area is unmatched by our peers. Joining the team is a unique opportunity to help rebuild the core of some of our most mission-critical processing systems from the ground up. Our Corporate Bank Technology team is a global team of 3000 engineers (and growing!) across 30 countries. The primary businesses that we support within Corporate Bank are Cash Management, Securities Services, Trade Finance and Trust & Agency Services. CB Technology supports these businesses through vertical CIO-aligned teams and by horizontals such as Client Connectivity, Surveillance and Regulatory, Infrastructure, Architecture, Production, and Risk & Control.
Your Role - What You'll Do: As a GenAI and Google Cloud Architecture and Engineering lead for Corporate Bank domains, you will be responsible for helping direct, create, review, and approve architectural designs for applications in the tribe. You are expected to be a hands-on developer with practical architecture depth to meet business and technical requirements, as well as provide coaching and guidance to the team to enable future success.
Key Responsibilities:
- Act as an experienced architect leading our global cloud infrastructure architecture across domains in Corporate Bank Technology
- Create domain-level architecture roadmaps that ensure long-term business strategy success while reducing complexity and managing cost
- Build solutions and deliver to production as needed for the TAS business
- Play a critical role in design and blueprints for GenAI and cloud infrastructure and migration projects
- Provide technical leadership to globally distributed and diverse development team(s) across multiple TAS Tribes and potentially other Corporate Bank domains
- Lead solution design and work with group architecture to deliver complex high- and low-level designs
- Work closely with senior architects, engineering heads, project/program management, as well as managers of related applications to build and deliver the software
- Continually help to improve the performance of the team regarding SDLC, QA, CICD, DORA KPIs and post-release activities
- Act as an expert on the platform in various capacities, strategy meetings, and product development opportunities
- Coordinate with other Lead Software Development Engineers to create and deliver an application release plan with a focus on cloud implementation
- Work with SRE and L3 support for the application to help bring architectural improvements
Skills You'll Need:
- Expert-level knowledge of GenAI and cloud architecture at the solution implementation level
- Overall experience of 15+ years with hands-on coding/engineering skills, with 6+ years building cloud applications in production
- Excellent communication skills and experience broadcasting complex architecture strategies and implementation details
- Prior experience with large-scale enterprise architecture, road mapping, and design with applications in production on cloud (not just PoCs)
- Architecture certifications for GCP (preferred) or AWS
- Deep knowledge of architecture and design principles, algorithms and data structures, and UI accessibility for both on-prem and cloud-native solutions (GCP preferred)
- Good knowledge of micro-frontend and microservices architecture, Kubernetes, Docker, cloud-native applications, Grafana/New Relic or other monitoring tools, APIs, REST services and Google Cloud Platform
- Working knowledge of Git, Jenkins, CICD, Gradle, DevOps and SRE techniques
- Understanding of cryptography and principles of information security
- Strong knowledge of JavaScript, React, Node, TypeScript, HTML, CSS
- Strong knowledge of Core Java, Spring Boot, Oracle, MySQL and/or PostgreSQL, data processing, Kafka, MQ to help with migration of on-prem applications to cloud
- Cloud components (e.g. BigQuery, Dataflow, Dataproc, DLP, Bigtable, Pub/Sub, Composer, Azure, AWS etc.)
- Experience with high-load, high-traffic systems and performance testing and tuning
- Experience building highly available distributed applications with a zero-downtime release cycle
- Knowledge of and experience with modern security ecosystems
- GCP cloud architect certification and implementation of mid- to large-size systems
Posted 2 months ago
3 - 8 years
17 - 22 Lacs
Pune, Bengaluru, Hyderabad
Work from Office
Role & responsibilities:
1. Strong experience in GCP Data Engineering
2. Experience in BigQuery
3. Experience in Python/PySpark
Posted 2 months ago
8 - 12 years
10 - 20 Lacs
Bengaluru
Work from Office
Senior Data Engineer | Total 8+ years (must) | A senior-level Data Engineer capable of developing data pipelines and data delivery solutions on GCP (BigQuery, Dataflow, Pub/Sub, Cloud Storage). Proficiency in SQL, Python, and ETL processes.
Posted 2 months ago
2 - 6 years
7 - 11 Lacs
Bengaluru
Work from Office
As an entry level Application Developer at IBM, you'll work with clients to co-create solutions to major real-world challenges by using best practice technologies, tools, techniques, and products to translate system requirements into the design and development of customized systems. In your role, you may be responsible for:
* Working across the entire system architecture to design, develop, and support high-quality, scalable products and interfaces for our clients
* Collaborating with cross-functional teams to understand requirements and define technical specifications for generative AI projects
* Employing IBM's Design Thinking to create products that provide a great user experience along with high performance, security, quality, and stability
* Working with a variety of relational databases (SQL, Postgres, DB2, MongoDB), operating systems (Linux, Windows, iOS, Android), and modern UI frameworks (Backbone.js, AngularJS, React, Ember.js, Bootstrap, and jQuery)
* Creating everything from mockups and UI components to algorithms and data structures as you deliver a viable product
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
* GCP using Pub/Sub, BigQuery, Dataflow, Cloud Workflow/Cloud Scheduler, Cloud Run, Dataproc, Cloud Functions
* Cloud data engineers with GCP PDE certification and working experience with GCP
* Building end-to-end data pipelines in GCP using Pub/Sub, BigQuery, Dataflow, Cloud Workflow/Cloud Scheduler, Cloud Run, Dataproc, Cloud Functions
* Experience in logging and monitoring of GCP services, and experience in Terraform and infrastructure automation
* Expertise in the Python coding language
* Develops, supports and maintains data engineering solutions on the Google Cloud ecosystem
Preferred technical and professional experience:
* Stay updated with the latest trends and advancements in cloud technologies, frameworks, and tools
* Conduct code reviews and provide constructive feedback to maintain code quality and ensure adherence to best practices
* Troubleshoot and debug issues, and deploy applications to the cloud platform
Posted 2 months ago