2.0 - 6.0 years
0 Lacs
Karnataka
On-site
As a software engineer at Google, you will be part of a dynamic team developing cutting-edge technologies that impact billions of users worldwide. Your role will involve working on projects that require handling massive amounts of data and creating innovative solutions that go beyond traditional web search. We are seeking individuals with expertise in areas such as information retrieval, distributed computing, system design, networking, security, artificial intelligence, and more.

Your responsibilities will include writing code for product and system development, participating in design reviews, and providing feedback on code quality and best practices. You will contribute to documentation and educational content, adapt content based on feedback, and troubleshoot and resolve product or system issues effectively.

To be successful in this role, you should have a Bachelor's degree in a relevant field and at least 2 years of experience in software development, with proficiency in data structures and algorithms. Experience with machine learning and AI algorithms, and coding in languages like C, C++, Java, or Python, will be beneficial. A Master's degree or PhD in Computer Science, along with experience in building developer tools and knowledge of machine learning methods, is preferred.

As a software engineer at Google, you will have the opportunity to work on critical projects that align with Google's needs, with the flexibility to switch teams and projects as the business evolves. Your ability to manage project priorities, deadlines, and deliverables will be essential, along with your skills in designing, developing, testing, deploying, and maintaining software solutions. If you are a versatile engineer with leadership qualities and a passion for innovation, we encourage you to join our team and help drive technology forward.
Posted 3 weeks ago
0.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Ready to shape the future of work? At Genpact, we don't just adapt to change; we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Senior Principal Consultant - Databricks Architect! In this role, the Databricks Architect is responsible for providing technical direction and leading a group of one or more developers toward a goal.

Responsibilities
- Architect and design solutions to meet functional and non-functional requirements.
- Create and review architecture and solution design artifacts.
- Evangelize re-use through the implementation of shared assets.
- Enforce adherence to architectural standards/principles, global product-specific guidelines, usability design standards, etc.
- Proactively guide engineering methodologies, standards, and leading practices.
- Guide engineering staff and review as-built configurations during the construction phase.
- Provide insight and direction on the roles and responsibilities required for solution operations.
- Identify, communicate, and mitigate risks, assumptions, issues, and decisions throughout the full lifecycle.
- Consider the art of the possible, compare architectural options based on feasibility and impact, and propose actionable plans.
- Demonstrate strong analytical and technical problem-solving skills.
- Analyze and operate at various levels of abstraction.
- Balance what is strategically right with what is practically realistic.
- Grow the Data Engineering business by helping customers identify opportunities to deliver improved business outcomes, and by designing and driving the implementation of those solutions.
- Grow and retain the Data Engineering team with the appropriate skills and experience to deliver high-quality services to our customers.
- Support and develop our people, including learning and development, certification, and career development plans.
- Provide technical governance and oversight for solution design and implementation.
- Maintain the technical foresight to understand new technologies and advancements.
- Lead the team in defining best practices and repeatable methodologies in Cloud Data Engineering, including data storage, ETL, data integration and migration, data warehousing, and data governance.
- Bring technical experience in Azure, AWS, and GCP cloud data engineering services and solutions.
- Contribute to sales and pre-sales activities, including proposals, pursuits, demonstrations, and proof-of-concept initiatives.
- Evangelize the Data Engineering service offerings to both internal and external stakeholders.
- Develop whitepapers, blogs, webinars, and other thought leadership material.
- Develop go-to-market and service offering definitions for Data Engineering.
- Work with Learning & Development teams to establish appropriate learning and certification paths for the domain.
- Expand the business within existing accounts and help clients by building and sustaining strategic executive relationships, doubling up as their trusted business technology advisor.
- Position differentiated and custom solutions to clients based on market trends, the specific needs of the clients, and the supporting business cases.
- Build new data capabilities, solutions, assets, accelerators, and team competencies.
- Manage multiple opportunities through the entire business cycle simultaneously, working with cross-functional teams as necessary.

Qualifications we seek in you!

Minimum qualifications
- Excellent technical architecture skills, enabling the creation of future-proof, complex global solutions.
- Excellent interpersonal communication and organizational skills, required to operate as a leading member of global, distributed teams that deliver quality services and solutions.
- Ability to rapidly gain knowledge of the organizational structure of the firm to facilitate work with groups outside of the immediate technical team.
- Knowledge of and experience with the IT methodologies and life cycles that will be used.
- Familiarity with solution implementation/management, service/operations management, etc.
- Leadership skills: the ability to inspire and persuade others.
- Close awareness of new and emerging technologies and their potential application to service offerings and products.
- Bachelor's degree (CS, CE, CIS, IS, MIS, or an engineering discipline) or equivalent work experience.
- Experience in a solution architecture role using service and hosting solutions such as private/public cloud IaaS, PaaS, and SaaS platforms.
- Experience architecting and designing technical solutions for cloud-centric deployments based on industry standards using IaaS, PaaS, and SaaS capabilities.
- Strong hands-on experience with cloud services such as ADF/Lambda and ADLS/S3, plus security, monitoring, and governance.
- Must have experience designing platforms on Databricks.
- Hands-on experience designing and building Databricks-based solutions on any cloud platform.
- Hands-on experience designing and building solutions powered by DBT models and integrating them with Databricks.
- Must be very good at designing end-to-end solutions on a cloud platform.
- Must have good knowledge of data engineering concepts and related cloud services.
- Must have good experience in Python and Spark.
- Must have good experience setting up development best practices.
- Intermediate-level knowledge of data modelling is required.
- Good to have: knowledge of Docker and Kubernetes.
- Experience with claims-based authentication (SAML/OAuth/OIDC), MFA, RBAC, SSO, etc.
- Knowledge of cloud security controls, including tenant isolation, encryption at rest, encryption in transit, key management, vulnerability assessments, application firewalls, SIEM, etc.
- Experience building and supporting mission-critical technology components with DR capabilities.
- Experience with multi-tier system and service design and development for large enterprises.
- Extensive real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures with a focus on cloud technologies.
- Exposure to infrastructure and application security technologies and approaches.
- Familiarity with requirements gathering techniques.

Preferred qualifications
- Must have designed the end-to-end architecture of a unified data platform covering all aspects of the data lifecycle: ingestion, transformation, serving, and consumption.
- Must have excellent coding skills in Python or Scala, preferably Python.
- Must have experience in the Data Engineering domain.
- Must have designed and implemented at least 2-3 end-to-end projects in Databricks.
- Must have experience with Databricks components including:
  - Delta Lake
  - dbConnect
  - DB API 2.0
  - SQL endpoint (Photon engine)
  - Unity Catalog
  - Databricks Workflows orchestration
  - Security management
  - Platform governance
  - Data security
- Must have knowledge of new features available in Databricks, their implications, and possible use cases.
- Must have applied various architectural principles to design the best fit for each problem.
- Must be well versed in the Databricks Lakehouse concept and its implementation in enterprise environments.
- Must have a strong understanding of data warehousing and the governance and security standards around Databricks.
- Must have knowledge of cluster optimization and its integration with various cloud services.
- Must have a good understanding of building complex data pipelines.
- Must be strong in SQL and Spark SQL.
- Must have strong performance-optimization skills to improve efficiency and reduce cost.
- Must have designed both batch and streaming data pipelines.
- Must have extensive knowledge of the Spark and Hive data processing frameworks.
- Must have worked on any cloud (Azure, AWS, GCP) and the most common services, such as ADLS/S3, ADF/Lambda, Cosmos DB/DynamoDB, ASB/SQS, and cloud databases.
- Must be strong in writing unit and integration tests.
- Must have strong communication skills and experience working with cross-platform teams.
- Must have a great attitude toward learning new skills and upskilling existing ones.
- Responsible for setting best practices around Databricks CI/CD.
- Must understand composable architecture to take fullest advantage of Databricks capabilities.
- Good to have: REST API knowledge.
- Good to have: an understanding of cost distribution.
- Good to have: experience on a migration project building a unified data platform.
- Good to have: knowledge of DBT.
- Experience with DevSecOps, including Docker and Kubernetes.
- Software development full-lifecycle methodologies, patterns, frameworks, libraries, and tools.
- Knowledge of programming and scripting languages such as JavaScript, PowerShell, Bash, SQL, Java, and Python.
- Experience with data ingestion technologies such as Azure Data Factory, SSIS, Pentaho, and Alteryx.
- Experience with visualization tools such as Tableau and Power BI.
- Experience with machine learning tools such as MLflow, Databricks AI/ML, Azure ML, and AWS SageMaker.
- Experience distilling complex technical challenges into actionable decisions for stakeholders and guiding project teams by building consensus and mediating compromises when necessary.
- Experience coordinating the intersection of complex system dependencies and interactions.
- Experience in solution delivery using common methodologies, especially SAFe Agile, but also Waterfall, Iterative, etc.
- Demonstrated knowledge of relevant industry trends and standards.

Why join Genpact?
- Be a transformation leader: work at the cutting edge of AI, automation, and digital innovation.
- Make an impact: drive change for global enterprises and solve business challenges that matter.
- Accelerate your career: get hands-on experience, mentorship, and continuous learning opportunities.
- Work with the best: join 140,000+ bold thinkers and problem-solvers who push boundaries every day.
- Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Posted 3 weeks ago
8.0 - 13.0 years
40 - 50 Lacs
Pune
Work from Office
Required Skills and Attributes:
- Experienced in data modeling.
- Experience with Azure Cloud and relevant data technologies.
- Experience building reports and dashboards, preferably in Power BI.
- Ability to work directly with, and communicate well with, all levels of leadership and business partners.
- Must be comfortable balancing several projects at once and able to pivot as business needs arise.
Posted 1 month ago
5.0 - 10.0 years
20 - 25 Lacs
Bengaluru
Work from Office
The Platform Data Engineer will be responsible for designing and implementing robust data platform architectures, integrating diverse data technologies, and ensuring scalability, reliability, performance, and security across the platform. The role involves setting up and managing infrastructure for data pipelines, storage, and processing, developing internal tools to enhance platform usability, implementing monitoring and observability, collaborating with software engineering teams for seamless integration, and driving capacity planning and cost optimization initiatives.
Posted 1 month ago
5.0 - 7.0 years
12 - 15 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
Job Title: Power Apps QA
Location: Remote (Hyderabad, Ahmedabad, Pune, Chennai, Kolkata)
Notice Period: Immediate

iSource Services is hiring for one of their clients for the position of Power Apps QA.

About the Role - We are looking for an experienced Power Apps QA with 5-7 years of experience to join our team remotely. In this role, you will be responsible for designing, developing, and managing workflows and data pipelines to streamline business processes and ensure quality automation solutions. The ideal candidate will have in-depth experience with Microsoft Power Automate, Azure Data Factory (ADF), and other Azure services, alongside a strong ability to troubleshoot and optimize automated solutions. You will be integral in ensuring that the automation and data workflows run smoothly, efficiently, and in compliance with security and quality standards.

1. Power Automate: Experience creating and managing workflows using Microsoft Power Automate. Proficiency in integrating Power Automate with other Microsoft tools like SharePoint and Microsoft Teams. Ability to design, develop, and implement automated solutions to streamline business processes. Strong understanding of connectors and their configurations in Power Automate.
2. Azure Data Factory: In-depth knowledge of Azure Data Factory (ADF), including data pipelines, data flows, and data integration activities. Experience creating, scheduling, and managing data pipelines. Understanding of various data storage services (e.g., Azure Blob Storage, Azure SQL Database) and their integration in ADF. Expertise in debugging, performance tuning, and troubleshooting data workflows in ADF. Familiarity with version control systems like Git for managing ADF assets.
3. General Azure Services: General knowledge of other Azure services like Azure Functions, Logic Apps, and Azure DevOps. Understanding of Azure security and compliance standards.
Posted 1 month ago
1.0 - 4.0 years
1 - 4 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Drive adoption of cloud technology for data processing and warehousing. You will drive SRE strategy for some of GS's largest platforms, including Lakehouse and Data Lake, engage with data consumers and producers to match reliability and cost requirements, and drive strategy with data.

Relevant Technologies: Snowflake, AWS, Grafana, PromQL, Python, Java, OpenTelemetry, GitLab

Basic Qualifications
- A Bachelor's or Master's degree in a computational field (Computer Science, Applied Mathematics, Engineering, or a related quantitative discipline)
- 1-4+ years of relevant work experience in a team-focused environment
- 1-2 years of hands-on developer experience at some point in your career
- Understanding and experience of DevOps and SRE principles and automation, and of managing technical and operational risk
- Experience with cloud infrastructure (AWS, Azure, or GCP)
- Proven experience in driving strategy with data
- Deep understanding of the multi-dimensionality of data, data curation, and data quality, such as traceability, security, performance latency, and correctness across supply and demand processes
- In-depth knowledge of relational and columnar SQL databases, including database design
- Expertise in data warehousing concepts (e.g., star schema, entitlement implementations, SQL vs. NoSQL modelling, milestoning, indexing, partitioning)
- Excellent communication skills and the ability to work with subject matter experts to extract critical business concepts
- Independent thinker, willing to engage, challenge, or learn
- Ability to stay commercially focused and to always push for quantifiable commercial impact
- Strong work ethic, a sense of ownership, and urgency
- Strong analytical and problem-solving skills
- Ability to build trusted partnerships with key contacts and users across business and engineering teams

Preferred Qualifications
- Understanding of Data Lake / Lakehouse technologies, including Apache Iceberg
- Experience with cloud databases (e.g., Snowflake, BigQuery)
- Understanding of data modelling concepts
- Working knowledge of open-source tools such as AWS Lambda and Prometheus
- Experience coding in Java or Python
Posted 1 month ago
2.0 - 6.0 years
4 - 9 Lacs
Hyderabad
Work from Office
Responsibilities: design and develop data flows; integration with data sources; data transformation; error handling and monitoring; performance optimization; collaboration; documentation; security and compliance.

Required Candidate profile
- Apache NiFi and data integration tools
- ETL concepts
- Data formats like JSON, XML, and Avro
- Programming languages such as Java, Python, or Groovy
- Data storage solutions such as Hadoop and Kafka
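As a loose illustration of the data-format handling this profile asks for, here is a minimal Python sketch that flattens a nested JSON record into a tabular row, a common first step before landing semi-structured data in a store such as Hadoop or a Kafka topic. XML and Avro follow the same pattern with their own parsers. The function name and the sample payload are hypothetical, not taken from the posting:

```python
import json

def flatten(record: dict, prefix: str = "") -> dict:
    """Flatten a nested JSON-style dict into dot-separated keys,
    so each record maps to one flat row for downstream ETL."""
    flat = {}
    for key, value in record.items():
        name = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            # Recurse into nested objects, carrying the path prefix.
            flat.update(flatten(value, name))
        else:
            flat[name] = value
    return flat

# Hypothetical event payload, purely illustrative.
raw = '{"id": 1, "user": {"name": "asha", "city": "Hyderabad"}}'
row = flatten(json.loads(raw))
print(row)  # {'id': 1, 'user.name': 'asha', 'user.city': 'Hyderabad'}
```

In a real NiFi flow this logic would typically live in a scripted processor or be replaced by built-in record readers; the sketch only shows the transformation idea.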
Posted 1 month ago
5.0 - 10.0 years
20 - 30 Lacs
Bengaluru
Hybrid
Role & responsibilities
- Product Strategy & Roadmap: Develop and maintain a compelling, customer-centric product roadmap that aligns with strategic business goals and customer needs.
- Customer and Market Insights: Conduct detailed market analysis, competitive assessments, and customer feedback reviews to drive product innovation and positioning.
- Cross-Functional Collaboration: Collaborate effectively with engineering, product marketing, sales, and support teams globally to ensure successful product execution and market introduction.
- Requirement Definition: Develop clear and detailed product requirements, user stories, and specifications, ensuring alignment between customer requirements and technical feasibility.
- Go-to-Market Execution: Partner with product marketing and sales teams to craft and execute effective launch strategies and materials, ensuring successful adoption and revenue growth.
- Performance Analysis: Continuously track product performance, revenue, customer adoption, and satisfaction metrics to inform roadmap adjustments and ongoing improvements.

Preferred candidate profile
- Bachelor's degree in Engineering, Computer Science, Business, or a related field; MBA preferred.
- 5+ years of proven product management experience in the data storage, backup, or enterprise IT industry.
- Strong understanding of backup, deduplication technologies, and enterprise storage ecosystems.
- Exceptional communication and presentation skills, with demonstrated experience engaging global stakeholders.
- Proven analytical and strategic thinking skills, capable of translating market data into actionable product insights and strategies.
- Experience working with globally distributed teams, particularly collaborating effectively with stakeholders based in the U.S.

Desired Skills
- Prior experience managing enterprise backup or deduplication products.
- Technical depth and familiarity with storage infrastructure and software-defined storage architectures.
- Strong customer orientation with a track record of effectively representing the voice of the customer within product decisions.

If you're passionate about product leadership, customer success, and cutting-edge enterprise technologies, we welcome you to apply and lead Quantum DXi into its next stage of growth.
Posted 1 month ago
3.0 - 5.0 years
3 - 5 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Required technical and professional expertise
- Work in close cooperation with the Product Management executive team, other executives, peers, and cross-functional colleagues to meet existing and evolving product and business objectives.
- Provide direction to IBM Marketing to ensure effective messaging and content development.
- Work with the program management leads to execute and deliver on launches and the overall AI roadmap.
- Support any required business and financial planning, including working with IBM Finance to develop plans to meet financial objectives.
- At least 2 years of AI or industry business experience.
- At least 2 years of offering management experience in IT systems development and/or product (offering) management.
- Experience with Agile development and design thinking.
- 3-5 years of professional or academic business or engineering experience.
- Strong written and oral communication skills.
- Experience speaking with and presenting to customers or business partners, both pre- and post-sales.

Preferred technical and professional experience
- Bachelor's degree in Computer Science, Engineering, or Business.
- Experience with AI industry models, particularly generative AI models.
- Foundational knowledge of data storage and data protection.
- Foundational knowledge of compute or Power Systems.
- Foundational knowledge of cloud computing and cloud services.
- Foundational knowledge of the technology industry or IBM GTM and distribution structures.
Posted 1 month ago
0.0 - 5.0 years
0 - 5 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your role and responsibilities
As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- Proof of Concept (POC) development: develop POCs to validate and showcase the feasibility and effectiveness of proposed AI solutions.
- Help showcase the ability of a Gen AI code assistant to refactor/rewrite and document code from one language to another.
- Document solution architectures, design decisions, implementation details, and lessons learned.
- Stay up to date with the latest trends and advancements in AI, foundation models, and large language models.
- Evaluate emerging technologies, tools, and frameworks to assess their potential impact on solution design and implementation.

Preferred technical and professional experience
- Experience and working knowledge in COBOL and Java would be preferred.
- Experience in code generation, code matching, and code translation leveraging LLM capabilities would be a big plus.
- A growth mindset toward understanding clients' business processes and challenges.
Posted 1 month ago
2.0 - 4.0 years
2 - 4 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Introduction
Ready to re-imagine the product management role? At IBM, as Product Manager/Market Strategy, you are responsible for validating the market and strategic direction of offerings for IBM. We are looking for someone who is ready to roll up their sleeves, dive deep into understanding the AI market and strategy, and help drive that back into the infrastructure team to lay the path for AI offerings.

Your role and responsibilities
The Product Manager/Market Strategy for IBM AI Infrastructure will play a critical role in driving the growth of the IBM Infrastructure business. They will partner closely with the AI Infrastructure team as well as the Strategy and Development teams across the IBM brands, influencing the strategy and offerings for AI across the Infrastructure portfolio. They will own the view of the AI market and set the path IBM should take to be a key player in AI and infrastructure.

In this role, you will own the AI infrastructure market strategy and drive these views into the existing and future offering portfolio in conjunction with other product leaders across systems. You will play a role in influencing and driving the teams to deliver AI offerings that will propel IBM in the market. The team will consist of strategists, technical product managers, and product owners focused on IBM AI Infrastructure offerings, platform, and solutions.

Required education: Bachelor's Degree

Required technical and professional expertise
- Work in close cooperation with the Product Management executive team, other executives, peers, and cross-functional colleagues to meet existing and evolving product and business objectives.
- Work with the program management leads to execute and deliver on launches and the overall AI roadmap.
- At least 1 year of AI or industry business experience.
- At least 1 year of product management experience in IT systems development and/or product (offering) management.
- 2-4 years of professional or academic business or engineering experience.
- Strong written and oral communication skills.
- Experience speaking with and presenting to customers or business partners, both pre- and post-sales.

Preferred technical and professional experience
- Bachelor's degree in Computer Science, Engineering, or Business.
- Experience with AI industry models, particularly generative AI models.
- Foundational knowledge of data storage and data protection.
- Foundational knowledge of Power Systems and IBM Z.
- Foundational knowledge of cloud computing and cloud services.
Posted 1 month ago
3.0 - 6.0 years
2 - 6 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your role and responsibilities
- Establish and implement best practices for DBT workflows, ensuring efficiency, reliability, and maintainability.
- Collaborate with data analysts, engineers, and business teams to align data transformations with business needs.
- Monitor and troubleshoot data pipelines to ensure accuracy and performance.
- Work with Azure-based cloud technologies to support data storage, transformation, and processing.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- Strong MS SQL and Azure Databricks experience.
- Implement and manage data models in DBT, ensuring data transformations align with business requirements.
- Ingest raw, unstructured data into structured datasets in a cloud object store.
- Utilize DBT to convert raw, unstructured data into structured datasets, enabling efficient analysis and reporting.
- Write and optimize SQL queries within DBT to enhance data transformation processes and improve overall performance.

Preferred technical and professional experience
- Establish best DBT processes to improve performance, scalability, and reliability.
- Design, develop, and maintain scalable data models and transformations using DBT in conjunction with Databricks.
- Proven interpersonal skills, contributing to team effort by accomplishing related results as required.
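The DBT duties above boil down to expressing transformations as SQL models over raw tables. As a rough sketch of that idea (using Python's built-in sqlite3 in place of Databricks, with a hypothetical raw_orders table), the SELECT below plays the role a models/*.sql file would in a DBT project:

```python
import sqlite3

# Stand-in for a raw landing table; in DBT this would be declared
# as a source, and the SELECT below would live in its own model file.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount TEXT, status TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, "19.99", "complete"), (2, "5.00", "cancelled"), (3, "12.50", "complete")],
)

# Transformation in the spirit of a DBT model: cast, filter, aggregate.
model_sql = """
SELECT status,
       COUNT(*) AS order_count,
       ROUND(SUM(CAST(amount AS REAL)), 2) AS total_amount
FROM raw_orders
WHERE status = 'complete'
GROUP BY status
"""
rows = conn.execute(model_sql).fetchall()
print(rows)  # [('complete', 2, 32.49)]
```

DBT's contribution on top of plain SQL like this is materialization, dependency ordering between models, and testing; the sketch only shows the transformation layer itself.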
Posted 1 month ago
3.0 - 6.0 years
1 - 6 Lacs
Remote, India
On-site
We are seeking a skilled and motivated RPA Developer to join our team. The ideal candidate will have hands-on experience developing and deploying RPA solutions using Blue Prism and Python. You will play a key role in automating business processes, reducing manual effort, and driving operational efficiency.
Posted 1 month ago
2.0 - 4.0 years
0 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Introduction
Ready to re-imagine the product management role? At IBM, as Product Manager/Market Strategy, you are responsible for validating the market and strategic direction of offerings for IBM. We are looking for someone ready to roll up their sleeves, dive deep into the AI market and strategy, and feed that understanding back into the infrastructure team to lay the path for AI offerings.

Your role and responsibilities
The Product Manager/Market Strategy for IBM AI Infrastructure will play a critical role in driving the growth of the IBM Infrastructure business. They will partner closely with the AI Infrastructure team as well as the Strategy and Product Management teams across the IBM brands, own the view of the AI market, and set the path IBM should follow to be a key player in AI and Infrastructure. In this role, you will own the AI infrastructure market strategy and drive these views into the existing and future offering portfolio in conjunction with other product leaders across systems. You will influence and drive the teams to deliver AI offerings that will propel IBM in the market. The team will consist of strategists, technical product managers, and product owners focused on IBM AI Infrastructure offerings, platform, and solutions.
Required education : Bachelor's Degree

Required technical and professional expertise :
- Work in close cooperation with the Product Management executive team, other executives, peers, and cross-functional colleagues to meet existing and evolving product and business objectives.
- Work with the program management leads to execute and deliver on launches and the overall AI roadmap.
- At least 1 year of AI or industry business experience.
- At least 1 year of product management experience in IT systems development and/or product (offering) management.
- 2-4 years of professional or academic business or engineering experience.
- Strong written and oral communication skills.
- Experience speaking with and presenting to customers or business partners, both pre- and post-sales.

Preferred technical and professional experience :
- Bachelor's degree in Computer Science, Engineering, or Business.
- Experience with AI industry models, particularly generative AI models.
- Foundational knowledge of data storage and data protection.
- Foundational knowledge of Power Systems and IBM Z.
- Foundational knowledge of cloud computing and cloud services.
Posted 2 months ago
5.0 - 8.0 years
8 - 14 Lacs
Bengaluru
Remote
Job Overview :
We are looking for an experienced GCP Data Engineer with deep expertise in BigQuery, DataFlow, DataProc, Pub/Sub, and GCS to build, manage, and optimize large-scale data pipelines. The ideal candidate should have a strong background in cloud data storage, real-time data streaming, and orchestration.

Key Responsibilities :

Data Storage & Management :
- Manage Google Cloud Storage (GCS) buckets, set up permissions, and optimize storage solutions for handling large datasets.
- Ensure data security, access control, and lifecycle management.

Data Processing & Analytics :
- Design and optimize BigQuery for data warehousing, querying large datasets, and performance tuning.
- Implement ETL/ELT pipelines for structured and unstructured data.
- Work with DataProc (Apache Spark, Hadoop) for batch processing of large datasets.

Real-Time Data Streaming :
- Use Pub/Sub for building real-time, event-driven streaming pipelines.
- Implement Dataflow (Apache Beam) for real-time and batch data processing.

Workflow Orchestration & Automation :
- Use Cloud Composer (Apache Airflow) for scheduling and automating data workflows.
- Build monitoring solutions to ensure data pipeline health and performance.

Cloud Infrastructure & DevOps :
- Implement Terraform for provisioning and managing cloud infrastructure.
- Work with Google Kubernetes Engine (GKE) for container orchestration and managing distributed applications.

Advanced SQL & Data Engineering :
- Write efficient SQL queries for data transformation, aggregation, and analysis.
- Optimize query performance and cost efficiency in BigQuery.
Required Skills & Qualifications :
- 4-8 years of experience in GCP Data Engineering
- Strong expertise in BigQuery, DataFlow, DataProc, Pub/Sub, and GCS
- Experience in SQL, Python, or Java for data processing and transformation
- Proficiency in Airflow (Cloud Composer) for scheduling workflows
- Hands-on experience with Terraform for cloud infrastructure automation
- Familiarity with NoSQL databases like Bigtable for high-scale data handling
- Knowledge of GKE for containerized applications and distributed processing

Preferred Qualifications :
- Experience with CI/CD pipelines for data deployment
- Familiarity with Cloud Functions or Cloud Run for serverless execution
- Understanding of data governance, security, and compliance

Why Join Us ?
- Work on cutting-edge GCP data projects in a cloud-first environment
- Competitive salary and career growth opportunities
- Collaborative and innovative work culture
- Exposure to big data, real-time streaming, and advanced analytics.
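The Pub/Sub plus Dataflow pattern in the responsibilities above amounts to pulling events from a subscription and aggregating them per key, the same shape as a Beam GroupByKey followed by a Sum. A minimal, cloud-free sketch of that consumer logic using Python's stdlib `queue` (the event fields are hypothetical; a real pipeline would use the google-cloud-pubsub client and the Apache Beam SDK):

```python
import queue
from collections import defaultdict

# Stand-in for a Pub/Sub subscription: producers put messages, a consumer pulls.
subscription = queue.Queue()
for event in [
    {"sensor": "a", "value": 3},
    {"sensor": "b", "value": 5},
    {"sensor": "a", "value": 4},
    None,  # sentinel marking end of stream; real subscriptions run indefinitely
]:
    subscription.put(event)

# Consumer loop: pull each message, aggregate per key, then move on (where a
# real consumer would also ack). This mirrors a Beam GroupByKey + Sum stage.
totals = defaultdict(int)
while (event := subscription.get()) is not None:
    totals[event["sensor"]] += event["value"]

print(dict(totals))  # {'a': 7, 'b': 5}
```

In production the aggregation would also be windowed in time, since an unbounded stream never "finishes" the way this sketch does.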
Posted 2 months ago
7.0 - 10.0 years
2 - 6 Lacs
Pune
Work from Office
Responsibilities :
- Design, develop, and deploy data pipelines using Databricks, including data ingestion, transformation, and loading (ETL) processes.
- Develop and maintain high-quality, scalable, and maintainable Databricks notebooks using Python.
- Work with Delta Lake and other advanced features.
- Leverage Unity Catalog for data governance, access control, and data discovery.
- Develop and optimize data pipelines for performance and cost-effectiveness.
- Integrate with various data sources, including but not limited to databases, cloud storage (Azure Blob Storage, ADLS, Synapse), and APIs.
- Experience working with Parquet files for data storage and processing.
- Experience with data integration from Azure Data Factory, Azure Data Lake, and other relevant Azure services.
- Perform data quality checks and validation to ensure data accuracy and integrity.
- Troubleshoot and resolve data pipeline issues effectively.
- Collaborate with data analysts, business analysts, and business stakeholders to understand their data needs and translate them into technical solutions.
- Participate in code reviews and contribute to best practices within the team.
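The data quality checks mentioned in the responsibilities above are usually a set of rules run against each loaded batch: required fields present, values in range, keys unique. A minimal sketch in plain Python with hypothetical column names and rules; in a Databricks notebook the same checks would typically be PySpark DataFrame filters:

```python
def quality_report(rows):
    """Return (row_index, problem) pairs for records that fail the rules."""
    errors = []
    seen_ids = set()
    for i, row in enumerate(rows):
        # Rule 1: 'id' must be present and unique across the batch.
        if row.get("id") is None:
            errors.append((i, "missing id"))
        elif row["id"] in seen_ids:
            errors.append((i, "duplicate id"))
        else:
            seen_ids.add(row["id"])
        # Rule 2: 'amount' must be non-negative.
        if row.get("amount", 0) < 0:
            errors.append((i, "negative amount"))
    return errors

batch = [
    {"id": 1, "amount": 10.0},
    {"id": 1, "amount": -2.5},    # duplicate id AND negative amount
    {"id": None, "amount": 3.0},  # missing id
]
print(quality_report(batch))
```

A pipeline would typically fail the load (or quarantine the offending rows) when the report is non-empty, rather than silently writing bad data downstream.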
Posted 2 months ago