3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
As a qualified candidate for this role, you should have in-depth expertise in Google Cloud Platform (GCP) services such as Pub/Sub, BigQuery, Airflow, Dataproc, Cloud Composer, and Google Cloud Storage (GCS). Proficiency in Dataflow and Java is also a must for this position. Experience with Kafka would be considered a plus. Your responsibilities will include working with these technologies to design, develop, and maintain scalable and efficient data processing systems. If you meet these requirements and are eager to work in a dynamic and innovative environment, we look forward to reviewing your application.
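As a rough illustration of the Pub/Sub-to-BigQuery streaming pattern this listing centres on, here is a minimal Apache Beam (Dataflow) sketch. It is written in Python rather than the Java the role asks for, and the project, subscription, bucket, and table names are placeholders, not details from the posting.

```python
# Minimal Beam/Dataflow streaming sketch: Pub/Sub -> parse JSON -> BigQuery.
# All resource names below are hypothetical.
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions(
        streaming=True,
        project="my-project",
        runner="DataflowRunner",
        region="us-central1",
        temp_location="gs://my-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub")
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",            # assumes the table already exists
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER)
        )


if __name__ == "__main__":
    run()
```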
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
bengaluru, karnataka, india
On-site
Role: GCP Data Engineer
Experience: 5-9 years
Notice: 15 days or less
Interview Mode: first round virtual, second round face to face (mandatory)
Location: Bangalore

Job Description
Data Ingestion, Storage, Processing and Migration
- Acquire, cleanse, and ingest structured and unstructured data on cloud platforms (in batch or real time) from internal and external data sources.
- Combine data from disparate sources into a single, unified, authoritative view of data (e.g., a data lake).
- Create, maintain, and provide test data to support fully automated testing.
- Enable and support data movement from one system/service to another.

Reporting
- Design, develop, and maintain high-performance LookML models that provide comprehensive data visibility across business functions.
- Build interactive dashboards and data visualizations that tell compelling stories and drive decision making.
- Stay up to date with the latest Looker features and best practices, sharing your knowledge with the team.

Skills & Software Requirements:
- GCP data services (BigQuery, Dataflow, Data Fusion, Dataproc, Cloud Composer, Pub/Sub, Google Cloud Storage, Looker, LookML)
- Programming languages, e.g., Python, Java, SQL
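As a hedged sketch of the batch-ingestion side of this role (landing files from Cloud Storage into a unified BigQuery layer), the following uses the BigQuery Python client; the project, bucket, and dataset names are illustrative only.

```python
# Minimal batch-ingestion sketch: load CSV files from GCS into a BigQuery table.
# Resource names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,                     # infer the schema from the file header
    write_disposition="WRITE_TRUNCATE",  # rebuild the unified table on each run
)

load_job = client.load_table_from_uri(
    "gs://my-bucket/raw/orders/*.csv",
    "my-project.data_lake.orders",
    job_config=job_config,
)
load_job.result()  # block until the load job finishes

table = client.get_table("my-project.data_lake.orders")
print(f"Loaded {table.num_rows} rows")
```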
Posted 2 weeks ago
15.0 - 17.0 years
0 Lacs
pune, maharashtra, india
On-site
Job description
Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.

HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Sr Associate Director, Software Engineering.

In this role, you will:
This role is part of the Innovation and GenAI tech team and is responsible for supporting use-case delivery on Innovation and GenAI, including:
- New use case onboarding: design, development/coding, testing and deployment into production.
- Understand the Group and Compliance/Enterprise Technology Innovation strategy, help define the technical strategy, and execute it to fulfil business needs and aspirations on growth, management and control.
- Work with the global Risk and Compliance IT architecture team to ensure that technical solutions, development, and integrations adhere to group standards.
- Establish and execute a vision to plan, deliver, and support solutions in a complex, distributed technology environment. The ideal candidate must be able to communicate clearly and effectively with both technical and non-technical individuals.
- Plan for people and project management; coach and guide teams of developers, testers, analysts and architects, directly or indirectly, by giving clear direction, feedback and timely suggestions to ensure a high quality standard of deliverables in line with HSBC standards and best practices.
- Address existing technical debt and drive technical evolution, innovation and digital transformation for the teams by working closely with various parties, including business, Transformation and Solution Architects globally.
- Establish and maintain trustworthy relationships with the business and relevant stakeholders; manage the expectations of key stakeholders and work jointly to maximise the interests of business and customers.
- Manage the supply and demand pipeline and provide guidance and direction for decisions that achieve goals and deliver products aligned with the business.

Requirements
To be successful in this role, you should meet the following requirements: You will be tenacious about doing things the right way and building efficient and brilliantly simple business solutions. You will also be adept at working in customer-facing roles within enterprise environments, helping clients to capitalise on technology for their commercial benefit.

High-level and holistic capabilities sought:
- Bachelor's degree in Computer Science, Engineering or an equivalent advanced degree is preferred
- GCP or AWS certified (GCP preferred)
- Prompt engineering experience preferred
- Expert-level core Java/Python skills
- Minimum 15+ years of software development experience with both waterfall and Agile methodologies
- Experience leading and managing agile, cross-functional delivery teams of 30+ staff (direct and indirect reports)
- Strong technical capabilities (Big Data, AI/ML, APIs, microservices), plus knowledge and experience of DevOps, Disciplined Agile Delivery (DAD) and Agile control frameworks
- Passionate about technology and looking for opportunities to learn and bring new ideas to the team
- Sound understanding of Azure/Google Cloud platforms
- Experience working with Kubernetes, Docker, Storage, and Cloud Functions
- Experience performing data analysis on databases (SQL, NoSQL, DBT)
- CI/CD pipelines
- Dataflow, Dataproc, and BigQuery skills an advantage
- Security (IAM, AD, ADLDS, roles, service accounts, entitlements)
- Events and data streaming: Dataproc, Pub/Sub, Confluent/Kafka
- Ability to communicate and explain complex ideas in both oral and written English

You'll achieve more when you join HSBC. www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by - HSDI
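As a small, hypothetical illustration of the events and data streaming item above, here is a minimal Pub/Sub publishing sketch using the Python client; the project, topic, and payload are assumptions, not part of the posting.

```python
# Minimal Pub/Sub publish sketch; project, topic, and event fields are placeholders.
import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "transaction-events")

event = {"account_id": "A123", "amount": 250.0, "currency": "GBP"}
future = publisher.publish(
    topic_path,
    json.dumps(event).encode("utf-8"),
    source="payments",  # optional message attribute
)
print(f"Published message id: {future.result()}")
```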
Posted 2 weeks ago
3.0 - 5.0 years
5 - 15 Lacs
pune
Hybrid
Responsibilities:
- Design, implement, and manage ETL pipelines on Google Cloud Platform (BigQuery, Dataflow, Pub/Sub, Composer).
- Write complex SQL queries and optimize them for BigQuery performance.
- Work with structured/unstructured data from multiple sources (databases, APIs, streaming).
- Build reusable data frameworks for transformation, validation, and quality checks.
- Collaborate with stakeholders to understand business requirements and deliver analytics-ready datasets.
- Implement best practices in data governance, security, and cost optimization.

Requirements:
- Bachelor's in Computer Science, IT, or a related field.
- Experience in ETL/data engineering.
- Strong Python and SQL skills.
- Hands-on with GCP (BigQuery, Dataflow, Composer, Pub/Sub, Dataproc).
- Experience with orchestration tools (Airflow preferred).
- Knowledge of data modeling and data warehouse design.
- Exposure to CI/CD, Git, and DevOps practices is a plus.
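As an illustrative sketch of the Composer/Airflow orchestration plus BigQuery SQL work described above, here is a minimal daily DAG that runs one transformation job. The DAG id, dataset, and table names are assumptions, not taken from the posting.

```python
# Minimal Cloud Composer / Airflow DAG: run a daily BigQuery transformation.
# Dataset and table names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

TRANSFORM_SQL = """
CREATE OR REPLACE TABLE analytics.daily_orders AS
SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
FROM raw.orders
GROUP BY order_date
"""

with DAG(
    dag_id="daily_orders_transform",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",  # newer Airflow versions use `schedule=` instead
    catchup=False,
) as dag:
    build_daily_orders = BigQueryInsertJobOperator(
        task_id="build_daily_orders",
        configuration={"query": {"query": TRANSFORM_SQL, "useLegacySql": False}},
    )
```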
Posted 3 weeks ago
10.0 - 14.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Engineer at Deutsche Bank in Pune, India, you will be responsible for developing and delivering engineering solutions to achieve business objectives. You are expected to have a strong grasp of essential engineering principles and possess root cause analysis skills to address enhancements and fixes in product reliability and resiliency. You should be capable of working independently on medium to large projects with strict deadlines and adapt to a cross-application mixed technical environment. Your role involves hands-on development experience in ETL, Big Data, Hadoop, Spark, and GCP while following an agile methodology. Collaboration with a geographically dispersed team is essential in this role.

The position is part of the Compliance tech internal development team in India, focusing on delivering improvements in compliance tech capabilities to meet regulatory commitments and mandates. You will be involved in analyzing data sets, designing stable data ingestion workflows, and integrating them into existing workflows. Additionally, you will work closely with team members and stakeholders to provide ETL solutions, develop analytics algorithms, and handle data sourcing in Hadoop and GCP. Your responsibilities include unit testing, UAT deployment, end-user sign-off, and supporting production and release management teams.

To excel in this role, you should have over 10 years of coding experience in reputable organizations, proficiency in technologies such as Hadoop, Python, Spark, SQL, Unix, and Hive, as well as hands-on experience in Bitbucket and CI/CD pipelines. Knowledge of data security in on-prem and GCP environments, cloud services, and data quality dimensions is crucial. Experience in regulatory delivery environments, banking, test-driven development, and data visualization tools like Tableau would be advantageous.

At Deutsche Bank, you will receive support through training, coaching, and a culture of continuous learning to enhance your career progression. The company fosters a collaborative environment where employees are encouraged to act responsibly, think commercially, and take initiative. Together, we strive for excellence and celebrate the achievements of our diverse workforce. Deutsche Bank promotes a positive, fair, and inclusive work environment and welcomes applications from all individuals. For more information about Deutsche Bank and our values, please visit our company website: https://www.db.com/company/company.htm
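A hedged sketch of the kind of Spark-based ingestion workflow this role describes (Hive source, basic cleansing, partitioned output on Cloud Storage); the table and bucket names are placeholders.

```python
# Minimal PySpark ingestion sketch: read a raw Hive table, cleanse, write Parquet to GCS.
# Table, column, and bucket names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("trade_ingestion")
    .enableHiveSupport()
    .getOrCreate()
)

raw = spark.table("raw_db.trades")

clean = (
    raw.dropDuplicates(["trade_id"])                  # remove duplicate records
       .filter(F.col("amount").isNotNull())           # basic quality filter
       .withColumn("ingest_date", F.current_date())   # partition column
)

(clean.write
      .mode("overwrite")
      .partitionBy("ingest_date")
      .parquet("gs://my-bucket/curated/trades"))
```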
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
hyderabad, telangana
On-site
As a GCP Developer, you will be responsible for maintaining the stability of production platforms, delivering new features, and minimizing technical debt across various technologies. You should have a minimum of 4 years of experience in the field. You must have a strong commitment to maintaining high standards and a genuine passion for ensuring quality in your work. Proficiency in GCP, Python, Hadoop, Spark, Cloud, Scala, streaming (Pub/Sub), Kafka, SQL, Dataproc, and Dataflow is essential for this role. Additionally, familiarity with data warehouses, distributed data platforms, and data lakes is required. You should possess knowledge of database definition, schema design, and Looker views and models. An understanding of data structures and algorithms is crucial for success in this position. Experience with CI/CD practices would be advantageous. This position involves working in a dynamic environment across multiple locations such as Chennai, Hyderabad, and Bangalore. A total of 20 positions are available for qualified candidates.
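As a small, hypothetical example of the database definition and schema design skills mentioned above, the following creates a date-partitioned BigQuery table that downstream Looker views could model; all names are illustrative.

```python
# Minimal schema-definition sketch: create a partitioned BigQuery table.
# Project, dataset, table, and field names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

schema = [
    bigquery.SchemaField("event_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("user_id", "STRING"),
    bigquery.SchemaField("event_ts", "TIMESTAMP", mode="REQUIRED"),
    bigquery.SchemaField("payload", "STRING"),  # raw JSON payload kept as text
]

table = bigquery.Table("my-project.analytics.events", schema=schema)
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_ts",  # partition by event timestamp for cheaper date-bounded queries
)

client.create_table(table, exists_ok=True)
print(f"Created or confirmed table {table.full_table_id}")
```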
Posted 1 month ago
10.0 - 14.0 years
0 Lacs
karnataka
On-site
As a Data Engineer (ETL, Big Data, Hadoop, Spark, GCP) at Assistant Vice President level, located in Pune, India, you will be responsible for developing and delivering engineering solutions to achieve business objectives. You are expected to have a strong understanding of crucial engineering principles within the bank and be skilled in root cause analysis, addressing enhancements and fixes in product reliability and resiliency. Working independently on medium to large projects with strict deadlines, you will collaborate in a cross-application technical environment, demonstrating a solid hands-on development track record within an agile methodology. Furthermore, this role involves collaborating with a globally dispersed team and is integral to the Compliance tech internal development team in India, delivering enhancements in compliance tech capabilities to meet regulatory commitments.

Your key responsibilities will include analyzing data sets, designing and coding stable and scalable data ingestion workflows, integrating them with existing workflows, and developing analytics algorithms on ingested data. You will also work on data sourcing in Hadoop and GCP, owning unit testing, UAT deployment, end-user sign-off, and production go-live. Root cause analysis skills will be essential for identifying bugs and issues, and for supporting the production support and release management teams. You will operate in an agile scrum team and ensure that new code is thoroughly tested at both unit and system levels.

To excel in this role, you should have over 10 years of coding experience with reputable organizations, hands-on experience with Bitbucket and CI/CD pipelines, and proficiency in Hadoop, Python, Spark, SQL, Unix, and Hive. A basic understanding of on-prem and GCP data security, as well as hands-on development experience with large ETL/big data systems (with GCP experience being a plus), is required. Familiarity with cloud services such as Cloud Build, Artifact Registry, Cloud DNS, and Cloud Load Balancing, along with Dataflow, Cloud Composer, Cloud Storage, and Dataproc, is essential. Additionally, knowledge of data quality dimensions and data visualization is beneficial.

You will receive comprehensive support, including training and development opportunities, coaching from experts in your team, and a culture of continuous learning to facilitate your career progression. The company fosters a collaborative and inclusive work environment, empowering employees to excel together every day. As part of Deutsche Bank Group, we encourage applications from all individuals and promote a positive and fair workplace culture. For further details about our company and teams, please visit our website: https://www.db.com/company/company.htm
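A minimal sketch, under assumed project, region, cluster, and bucket names, of submitting a PySpark job to Dataproc with the Python client, in the spirit of the Hadoop/Spark-on-GCP data sourcing described above.

```python
# Minimal Dataproc job-submission sketch; all resource names are placeholders.
from google.cloud import dataproc_v1

REGION = "us-central1"

job_client = dataproc_v1.JobControllerClient(
    client_options={"api_endpoint": f"{REGION}-dataproc.googleapis.com:443"}
)

job = {
    "placement": {"cluster_name": "etl-cluster"},
    "pyspark_job": {
        # Main driver script staged on GCS (e.g., the ingestion job sketched earlier).
        "main_python_file_uri": "gs://my-bucket/jobs/trade_ingestion.py",
    },
}

operation = job_client.submit_job_as_operation(
    request={"project_id": "my-project", "region": REGION, "job": job}
)
result = operation.result()  # wait for the job to finish
print(f"Job finished with state: {result.status.state.name}")
```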
Posted 2 months ago
4.0 - 6.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Major skillset: GCP, PySpark, SQL, Python, Cloud Architecture, ETL, Automation
- 4+ years of experience in data engineering and data management, with a strong focus on Spark for building production-ready data pipelines.
- Experienced in analyzing large data sets from multiple data sources and building automated testing and validations.
- Knowledge of the Hadoop ecosystem and components like HDFS, Spark, Hive, and Sqoop.
- Strong Python experience.
- Hands-on SQL/HQL skills to write optimized queries.
- Strong hands-on experience with GCP: BigQuery, Dataproc, Airflow DAGs, Dataflow, GCS, Pub/Sub, Secret Manager, Cloud Functions, Beam.
- Ability to work in a fast-paced, collaborative environment and work with various stakeholders to define strategic optimization initiatives.
- Deep understanding of distributed computing, memory tuning, and Spark optimization.
- Familiar with CI/CD workflows and Git.
- Experience in designing modular, automated, and secure ETL frameworks.
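As a hedged illustration of the "automated testing and validations" item above, here is a small PySpark check that gates a load on row-count, null, and duplicate-key rules; the path and column names are assumptions.

```python
# Minimal PySpark validation sketch; dataset path and key column are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("pipeline_validations").getOrCreate()

df = spark.read.parquet("gs://my-bucket/curated/trades")

checks = {
    "non_empty": df.count() > 0,
    "no_null_keys": df.filter(F.col("trade_id").isNull()).count() == 0,
    "unique_keys": df.groupBy("trade_id").count().filter("count > 1").count() == 0,
}

failed = [name for name, ok in checks.items() if not ok]
if failed:
    # Fail the pipeline run so downstream consumers never see bad data.
    raise ValueError(f"Validation failed: {failed}")
print("All validations passed")
```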
Posted 2 months ago
7.0 - 12.0 years
15 - 30 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Warm greetings from SP Staffing Services Private Limited!
We have an urgent opening with our CMMI Level 5 client for the below position. Please send your updated profile if you are interested.
Relevant Experience: 5 - 15 Yrs
Location: Pan India
Job Description: Minimum 2 years of hands-on experience in GCP development (data engineering)
Position: Developer / Tech Lead / Architect
Interested candidates can share their resume to sankarspstaffings@gmail.com with the below details inline:
Overall Exp:
Relevant Exp:
Current CTC:
Expected CTC:
Notice Period:
Posted 2 months ago
0.0 years
0 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Introduction
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat.

Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be encouraged to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and development opportunities in an environment that embraces your unique skills and experience.

In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your role and responsibilities
As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating and integrating code into enterprise systems. You will have access to the latest education, tools and technology, and a limitless career path with the world's technology leader. Come to IBM and make a global impact!

IBM's Cloud Services are focused on supporting clients on their cloud journey across any platform to achieve their business goals. This encompasses Cloud Advisory, Architecture, Cloud Native Development, Application Portfolio Migration, Modernization, and Rationalization, as well as Cloud Operations. Cloud Services supports all public/private/hybrid cloud deployments: IBM Bluemix/IBM Cloud/Red Hat/AWS/Azure/Google and client private environments. Cloud Services has the best cloud developer, architect, complex SI, Sys Ops, and delivery talent, delivered through our GEO CIC Factory model.

As a member of our Cloud Practice you will be responsible for defining and implementing application cloud migration, modernisation and rationalisation solutions for clients across all sectors. You will support mobilisation and help to lead the quality of our programmes and services, liaise with clients and provide consulting services including:
- Create cloud migration strategies: defining delivery architecture, creating the migration plans, designing the orchestration plans, and more.
- Assist in creating and executing migration run books.
- Evaluate source (physical, virtual, and cloud) and target workloads.

Required education
Bachelor's Degree
Preferred education
Master's Degree

Required technical and professional expertise
- GCP using Pub/Sub, BigQuery, Dataflow, Cloud Workflows/Cloud Scheduler, Cloud Run, Dataproc, and Cloud Functions
- Cloud data engineers with GCP PDE certification and working experience with GCP
- Building end-to-end data pipelines in GCP using Pub/Sub, BigQuery, Dataflow, Cloud Workflows/Cloud Scheduler, Cloud Run, Dataproc, and Cloud Functions
- Experience in logging and monitoring of GCP services
- Experience in Terraform and infrastructure automation
- Expertise in the Python coding language
- Develops, supports, and maintains data engineering solutions on the Google Cloud ecosystem

Preferred technical and professional experience
- Stay updated with the latest trends and advancements in cloud technologies, frameworks, and tools.
- Conduct code reviews and provide constructive feedback to maintain code quality and ensure adherence to best practices.
- Troubleshoot and debug issues, and deploy applications to the cloud platform.
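A minimal, assumption-laden sketch of one event-driven piece of such a pipeline: a Python Cloud Function triggered by Pub/Sub that decodes the message and writes it into BigQuery. The table and field names are placeholders, not part of the listing.

```python
# Minimal Pub/Sub-triggered Cloud Function sketch (2nd gen, CloudEvents style).
# Destination table and message fields are hypothetical.
import base64
import json

import functions_framework
from google.cloud import bigquery

bq = bigquery.Client()
TABLE_ID = "my-project.analytics.events"  # placeholder destination table


@functions_framework.cloud_event
def handle_event(cloud_event):
    # Pub/Sub delivers the payload base64-encoded inside the CloudEvent.
    payload = base64.b64decode(cloud_event.data["message"]["data"])
    row = json.loads(payload)

    errors = bq.insert_rows_json(TABLE_ID, [row])
    if errors:
        # Raising makes the function report failure so the message can be retried.
        raise RuntimeError(f"BigQuery insert failed: {errors}")
```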
Posted 3 months ago
3.0 - 5.0 years
10 - 13 Lacs
Chennai
Work from Office
- 3+ years of experience as an engineer who has worked in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.)
- 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.)
- 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.)
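As a small illustration of the SQL skills named above (a CTE feeding a window function), run through the BigQuery Python client; the project and table names are hypothetical.

```python
# Minimal CTE + window-function sketch executed via the BigQuery client.
# Project and table names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

sql = """
WITH daily AS (
  SELECT customer_id, DATE(order_ts) AS order_date, SUM(amount) AS daily_spend
  FROM `my-project.sales.orders`
  GROUP BY customer_id, order_date
)
SELECT customer_id, order_date, daily_spend,
       SUM(daily_spend) OVER (
         PARTITION BY customer_id ORDER BY order_date
       ) AS running_spend
FROM daily
ORDER BY customer_id, order_date
"""

for row in client.query(sql).result():
    print(row.customer_id, row.order_date, row.running_spend)
```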
Posted 3 months ago
5.0 - 7.0 years
8 - 10 Lacs
gurugram, chennai
Work from Office
Proficiency in GCP services including BigQuery, Cloud Storage, Dataflow, Dataproc, and Data Mesh. Strong in Hive, Hadoop, PySpark, Scala, and other big data technologies.
Posted Date not available