0.0 years
0 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Introduction
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be encouraged to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and development opportunities in an environment that embraces your unique skills and experience. In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your role and responsibilities
As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating and integrating code into enterprise systems. You will have access to the latest education, tools and technology, and a limitless career path with the world's technology leader. Come to IBM and make a global impact!

IBM's Cloud Services are focused on supporting clients on their cloud journey across any platform to achieve their business goals. This encompasses Cloud Advisory, Architecture, Cloud Native Development, Application Portfolio Migration, Modernization, and Rationalization, as well as Cloud Operations. Cloud Services supports all public/private/hybrid cloud deployments: IBM Bluemix/IBM Cloud, Red Hat, AWS, Azure, Google, and client private environments. Cloud Services has the best cloud developer, architect, complex SI, SysOps, and delivery talent, delivered through our GEO CIC Factory model.

As a member of our Cloud Practice, you will be responsible for defining and implementing application cloud migration, modernisation, and rationalisation solutions for clients across all sectors. You will support mobilisation, help to lead the quality of our programmes and services, liaise with clients, and provide consulting services including:
- Create cloud migration strategies: defining the delivery architecture, creating the migration plans, designing the orchestration plans, and more.
- Assist in creating and executing migration runbooks.
- Evaluate source environments (physical, virtual, and cloud) and target workloads.

Required education
Bachelor's Degree

Preferred education
Master's Degree

Required technical and professional expertise
- Cloud data engineers with GCP Professional Data Engineer (PDE) certification and working experience with GCP.
- Building end-to-end data pipelines in GCP using Pub/Sub, BigQuery, Dataflow, Cloud Workflows/Cloud Scheduler, Cloud Run, Dataproc, and Cloud Functions (a minimal pipeline sketch follows this listing).
- Experience in logging and monitoring of GCP services.
- Experience in Terraform and infrastructure automation.
- Expertise in the Python coding language.
- Develops, supports, and maintains data engineering solutions on the Google Cloud ecosystem.

Preferred technical and professional experience
- Stay updated with the latest trends and advancements in cloud technologies, frameworks, and tools.
- Conduct code reviews and provide constructive feedback to maintain code quality and ensure adherence to best practices.
- Troubleshoot and debug issues, and deploy applications to the cloud platform.
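To illustrate the kind of end-to-end pipeline this listing describes, here is a minimal sketch of a streaming Pub/Sub-to-BigQuery job using the Apache Beam Python SDK on Dataflow. The project, topic, table, and schema names are placeholders, not part of the posting.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Hypothetical project/region; a real job would take these from flags or config.
    options = PipelineOptions(
        streaming=True,
        project="my-gcp-project",
        region="us-central1",
        runner="DataflowRunner",
    )

    with beam.Pipeline(options=options) as p:
        (
            p
            # Read raw messages (bytes) from a Pub/Sub topic.
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/my-gcp-project/topics/events"
            )
            # Decode and parse each message; assumes JSON payloads matching the schema below.
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            # Append rows into a BigQuery table, creating it if it does not exist.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="my-gcp-project:analytics.events",
                schema="event_id:STRING,payload:STRING,event_ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```

Run with the DirectRunner locally for testing, or submit to Dataflow by keeping the DataflowRunner option and supplying staging/temp GCS locations.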
Posted 2 weeks ago
12 - 20 years
30 - 45 Lacs
Hyderabad
Hybrid
Job Description:
We are seeking a highly experienced Data Architect with 15-20 years of experience to lead the design and implementation of data solutions at scale. The ideal candidate will have deep expertise in cloud technologies, particularly GCP, along with a broad skill set in SQL, BigQuery, Cloud Storage, Cloud Functions, Pub/Sub, DLP, Dataproc, Cloud Composer, Python, ETL, and big data technologies such as MapR/Hadoop, Hive, Spark, and Scala.

Key Responsibilities:
- Lead the design and implementation of complex data architectures across cloud platforms, ensuring scalability, performance, and cost-efficiency.
- Architect data solutions using Google Cloud Platform (GCP) services such as BigQuery, Cloud Storage, Cloud Functions, Pub/Sub, Dataproc, Cloud Composer, and DLP.
- Design and optimize ETL (Ab Initio) processes and data pipelines using Python and related technologies, ensuring seamless data integration across multiple systems (see the load-and-transform sketch after this listing).
- Work with big data technologies including Hadoop (MapR), Hive, Spark, and Scala to build and manage large-scale, distributed data systems.
- Oversee the end-to-end data flow from ingestion to processing, transformation, and storage, ensuring high availability and disaster recovery.
- Lead and mentor a team of engineers, guiding them in adopting best practices in data architecture, security, and governance.
- Define and enforce data governance, security, and compliance standards to ensure data privacy and integrity.
- Collaborate with cross-functional teams to understand business requirements and translate them into data architecture and technical solutions.
- Design and implement data lake, data warehouse, and analytics solutions to support business intelligence and advanced analytics.
- Lead the integration of cloud-native tools and services for real-time and batch processing, using Pub/Sub, Dataproc, and Cloud Composer.
- Conduct performance tuning and optimization for SQL, BigQuery, and big data technologies to ensure efficient query execution and resource usage.
- Provide strategic direction on new data technologies, trends, and best practices to ensure the organization remains competitive and innovative.

Required Skills:
- 15-20 years of experience in data architecture, data engineering, or related roles, with a focus on cloud solutions.
- Extensive experience with Google Cloud Platform (GCP) services, particularly BigQuery, Cloud Storage, Cloud Functions, Pub/Sub, Dataproc, Cloud Composer, and DLP.
- Strong experience in Ab Initio ETL.
- Proficiency in SQL and experience with cloud-native data storage and processing technologies (BigQuery, Hive, Hadoop, Spark).
- Expertise in Python for ETL pipeline development and data manipulation.
- Solid understanding of big data technologies such as MapR, Hadoop, Hive, Spark, and Scala.
- Experience in designing and implementing scalable, high-performance data architectures and data lakes/warehouses.
- Deep understanding of data governance, security, privacy (DLP), and compliance standards.
- Proven experience in leading teams and delivering large-scale data solutions in cloud environments.
- Excellent problem-solving, communication, and leadership skills.
- Ability to work with senior business and technical leaders to align data solutions with organizational goals.

Preferred Skills:
- Experience with other cloud platforms (AWS, Azure).
- Knowledge of machine learning and AI data pipelines.
- Familiarity with containerized environments and orchestration tools (e.g., Kubernetes).
- Experience with advanced analytics or data science initiatives.
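As a rough illustration of the load-and-transform (ELT) pattern this role centres on, the sketch below uses the google-cloud-bigquery Python client to load Parquet files from Cloud Storage into a staging table and then materialise a curated table with SQL. All project, bucket, dataset, and table names are assumptions for the example.

```python
from google.cloud import bigquery

# Hypothetical project, bucket, and dataset names for illustration only.
client = bigquery.Client(project="my-gcp-project")

# Load raw files from Cloud Storage into a staging table.
load_job = client.load_table_from_uri(
    "gs://my-raw-bucket/orders/2024/*.parquet",
    "my-gcp-project.staging.orders_raw",
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.PARQUET,
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    ),
)
load_job.result()  # block until the load finishes

# Transform with SQL inside BigQuery and materialise a curated table (ELT pattern).
transform_sql = """
CREATE OR REPLACE TABLE `my-gcp-project.curated.daily_order_totals` AS
SELECT order_date, SUM(amount) AS total_amount
FROM `my-gcp-project.staging.orders_raw`
GROUP BY order_date
"""
client.query(transform_sql).result()
```

In practice this kind of step would typically be wrapped in an orchestrated pipeline (for example a Cloud Composer DAG) rather than run as a standalone script.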
Posted 2 months ago
10 - 17 years
22 - 25 Lacs
Bengaluru, Hyderabad, Mumbai (All Areas)
Work from Office
Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Minimum of 10 years of IT experience for an engineer-level position.

Basic Skills:
- 2-3 years of experience as a Cloud SRE/Engineer with applications utilizing services such as Cloud Build, Cloud Functions, GKE (Google Kubernetes Engine), Logging, Monitoring, GCS (Google Cloud Storage), Cloud SQL, and IAM (Identity and Access Management).
- Proficiency in Python, with experience in a secondary language such as Golang or Java.
- Proven ability to manage codebases and configurations effectively.

Key Skills:
- Strong knowledge and hands-on experience with GKE/Kubernetes and Docker.
- Experience in implementing and maintaining CI/CD pipelines using GCP Cloud Build and other cloud-native services.
- Proficiency in Infrastructure as Code (IaC) tools such as Terraform.
- Knowledge of security best practices and RBAC.
- Experience in defining, monitoring, and achieving Service Level Objectives (SLOs) and Service Level Agreements (SLAs) (an error-budget sketch follows this listing).
- Proficiency with source control tools such as GitHub Enterprise.
- Commitment to continuous improvement and automation of manual tasks.
- Familiarity with monitoring tools such as Grafana, Prometheus, Splunk, and GCP-native logging solutions.
- Willingness to provide extra hours of support when necessary.

Nice to Have Skills:
- Experience in secrets management using HashiCorp Vault.
- Experience with tracing tools such as Google Cloud Trace or Honeycomb.
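A minimal sketch of the SLO/error-budget bookkeeping mentioned above, in Python. The target and request counts are made-up numbers; a real implementation would pull them from Prometheus, Cloud Monitoring, or another metrics backend.

```python
from dataclasses import dataclass


@dataclass
class SloReport:
    """Availability SLO summary over a rolling window (illustrative values only)."""

    target: float          # e.g. 0.999 for a 99.9% availability SLO
    total_requests: int
    failed_requests: int

    @property
    def availability(self) -> float:
        # Fraction of requests that succeeded in the window.
        if self.total_requests == 0:
            return 1.0
        return 1.0 - self.failed_requests / self.total_requests

    @property
    def error_budget_consumed(self) -> float:
        """Fraction of the error budget spent (1.0 means the budget is exhausted)."""
        allowed_failure_rate = 1.0 - self.target
        if allowed_failure_rate == 0:
            return float("inf") if self.failed_requests else 0.0
        actual_failure_rate = 1.0 - self.availability
        return actual_failure_rate / allowed_failure_rate


if __name__ == "__main__":
    report = SloReport(target=0.999, total_requests=2_000_000, failed_requests=1_200)
    print(f"availability={report.availability:.5f}")          # 0.99940
    print(f"error budget consumed={report.error_budget_consumed:.0%}")  # 60%
```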
Posted 3 months ago
12 - 17 years
35 - 60 Lacs
Chennai, Bengaluru
Hybrid
At ZoomInfo, we encourage creativity, value innovation, demand teamwork, expect accountability and cherish results. We value your take-charge, take-initiative, get-stuff-done attitude and will help you unlock your growth potential. One great choice can change everything. Thrive with us at ZoomInfo.

ZoomInfo is a rapidly growing data-driven company, and as such we understand the importance of a comprehensive and solid data solution to support decision making in our organization. Our vision is to have a consistent, democratized, and accessible single source of truth for all company data analytics and reporting. Our goal is to improve decision-making processes by having the right information available when it is needed. As a Principal Software Engineer in our Data Platform infrastructure team, you'll have a key role in building and designing the strategy of our Enterprise Data Engineering group.

What You'll do:
- Design and build a highly scalable data platform to support data pipelines for diversified and complex data flows.
- Track and identify relevant new technologies in the market and push their implementation into our pipelines through research and POC activities.
- Deliver scalable, reliable and reusable data solutions.
- Lead, build and continuously improve our data gathering, modeling, and reporting capabilities and self-service data platforms.
- Work closely with Data Engineers, Data Analysts, Data Scientists, Product Owners, and Domain Experts to identify data needs.
- Develop processes and tools to monitor, analyze, maintain and improve data operation, performance and usability.

What you bring:
- Relevant Bachelor's degree or other equivalent Software Engineering background.
- 12+ years of experience as an infrastructure / data platform / big data software engineer.
- Experience with AWS/GCP cloud services such as GCS/S3, Lambda/Cloud Functions, EMR/Dataproc, Glue/Dataflow, and Athena.
- IaC design and hands-on experience.
- Familiarity designing CI/CD pipelines with Jenkins, GitHub Actions, or similar tools.
- Experience in designing, building and maintaining enterprise systems in a big data environment on public cloud.
- Strong SQL abilities and hands-on experience with SQL, performing analysis and performance optimizations.
- Hands-on experience in Python or an equivalent programming language.
- Experience with administering data warehouse solutions (such as BigQuery, Redshift, or Snowflake).
- Experience with data modeling, data catalog concepts, data formats, and data pipeline/ETL design, implementation and maintenance.
- Experience with Airflow and dbt - advantage (see the DAG sketch after this listing).
- Experience with Kubernetes using GKE or EKS - advantage.
- Experience with development practices such as Agile and TDD - advantage.
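As a small illustration of the Airflow plus dbt combination flagged as an advantage above, here is a sketch of a daily ingestion DAG that loads files from GCS into BigQuery and then runs dbt models. The bucket, dataset, and dbt project path are placeholders, and the example assumes Airflow 2.x with the Google provider and dbt installed on the worker.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

# Illustrative daily ingestion DAG; names and paths are assumptions, not real infrastructure.
with DAG(
    dag_id="daily_events_ingestion",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Load the day's raw JSON files from Cloud Storage into a raw BigQuery table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_events",
        bucket="my-landing-bucket",
        source_objects=["events/{{ ds }}/*.json"],
        destination_project_dataset_table="my-gcp-project.raw.events",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_TRUNCATE",
        autodetect=True,
    )

    # Rebuild downstream models with dbt once the raw load has landed.
    run_dbt_models = BashOperator(
        task_id="run_dbt_models",
        bash_command="dbt run --project-dir /opt/dbt/analytics --select staging+",
    )

    load_raw >> run_dbt_models
```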
Posted 3 months ago