8.0 - 14.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Introduction
A Data and AI Technology Sales Engineer role (what we internally call a Brand Technical Specialist) within IBM's zStack brand means accelerating enterprises' success by improving their ability to understand their data. It means providing solutions that enable people across organizations, in multiple roles, to turn data into actionable insights without having to wait for IT. And it means solutioning and selling multi-award-winning software deployed on the IBM z/LinuxONE platform, and world-class design practices that enable business analysts to ask new questions, the answers to which are literally shaping the future and changing the world.

Excellent onboarding and an industry-leading learning culture will set you up for positive impact and success, while ongoing development will advance your career through an upward trajectory. Our sales environment is collaborative and experiential. As part of a team, you'll be surrounded by bright minds and keen co-creators, always willing to help and be helped, as you apply passion to work that will compel our clients to invest in IBM's products and services.

Your role and responsibilities
Applying excellent communication and empathy, you'll act as a trusted strategic advisor to some of the world's most transformational enterprises and culturally influential brands as they rely on your expertise and our technology to solve some of their hardest problems. With your focus on the front end of the solution lifecycle, you'll be a master at listening to stakeholders, grasping their business challenges and requirements, and forming more detailed definitions of the new architectural structures that will make up their best-fit, value-adding solutions. We're committed to success. In this role, your achievements will drive your career, team, and clients to thrive.

A typical day may involve:
- Understanding client needs and aligning them with IBM Z solutions.
- Creating effective end-to-end architecture using IBM Z.
- Ensuring architectural viability and conducting assessments.
- Identifying and resolving project risks and conflicts.

Your primary responsibilities will include:
- Client Strategy Design: Creating client strategies for Data & AI infrastructure around the IBM z and LinuxONE platform.
- Solution Definition: Defining IBM Data & AI solutions covering functionality such as Data Integration (ETL), Data Stores (Db2, Oracle, MySQL), and Data Science (Watson Studio, Watson ML), leveraging the strengths of the IBM z and LinuxONE platform.
- Proofs of Concept: Providing proofs of concept and simplifying complex topics to meet clients' business requirements in data platform modernization and analytics.
- Credibility Building: Establishing credibility and trust to communicate architecture and solution benefits that drive revenue and technical business objectives.

Required education
Bachelor's Degree

Required technical and professional expertise
- Minimum 8-14 years of experience in Data and AI technologies, including infrastructure for analytics and advanced analytics solutions such as data lakes, data warehouses, business analytics, AI, GenAI, and data fabric.
- Experiential selling, including co-creation and hands-on technical sales methods such as demos, custom demos, proofs of concept, Minimum Viable Products (MVPs), or other technical proofs.
- Ability to build deep brand (Data & AI) expertise to help partners deliver PoX (custom demo, PoC, MVP, etc.) to progress opportunities, and to identify partners with the right skills, expertise, and experience.
- Exceptional interpersonal and communication skills, and an ability to collaborate effectively with ecosystem partners, clients, and sales professionals.
- Understanding of Governance, Risk, and Controls is a bonus.
- Experience with the AI landscape and AI technologies at work across Banking/Finance.
- Know-how, technical capability, and working experience with similar Data & AI products such as Cloudera, Teradata, Oracle, Informatica, SAS, etc.

Preferred technical and professional experience
- Knowledge of IBM Z and how it fits into digital transformation (training in IBM's Z products will be provided).
Posted 2 weeks ago
11 - 14 years
13 - 16 Lacs
Pune
Work from Office
GCP Data Architect
- GCP Certification (either GCP Data Engineer or GCP Cloud Architect)
- 15+ years of experience architecting data projects and knowledge of multiple ETL tools (e.g., Informatica, Talend, DataStage)
- 5+ years of experience in data modeling and data warehouse and data lake implementation; experience implementing a Teradata-to-BigQuery migration project
- Ability to identify and gather requirements to define a solution to be built and operated on GCP, and to perform high-level and low-level design for the GCP platform
- Ability to implement and provide GCP operations and deployment guidance and best practices throughout the lifecycle of a project
- 3+ years of strong experience in GCP technology areas: Datastore, BigQuery, Cloud Storage, Persistent Disk, IAM, Roles, Projects, and Organization; databases including Bigtable, Cloud SQL, Cloud Spanner, and Memorystore; data analytics with Dataflow, Dataproc, and Cloud Pub/Sub; Kubernetes, Docker, managing containers, container autoscaling, and container security
- Experience in design, deployment, configuration, and integration of application infrastructure resources including GKE clusters, Anthos, Apigee, and DevOps platforms
- Application development concepts and technologies (e.g., CI/CD, Java, Python)
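A Teradata-to-BigQuery migration of the kind this role describes usually involves translating Teradata's SQL dialect into BigQuery standard SQL. Below is a minimal, illustrative sketch of that idea; the two rewrite rules shown are a tiny hypothetical subset (real migrations rely on dedicated translation tooling, not a handful of regexes):

```python
import re

# Two well-known Teradata-isms and illustrative rewrites; a toy subset
# for demonstration only, not a complete or production translator.
REWRITES = [
    (re.compile(r"^\s*SEL\b", re.IGNORECASE), "SELECT"),     # Teradata's SEL shorthand
    (re.compile(r"\bFORMAT\s+'[^']*'", re.IGNORECASE), ""),  # drop Teradata FORMAT clauses
]

def translate(teradata_sql: str) -> str:
    """Apply simple line-by-line rewrites to Teradata SQL (illustrative only)."""
    out_lines = []
    for line in teradata_sql.splitlines():
        for pattern, replacement in REWRITES:
            line = pattern.sub(replacement, line)
        out_lines.append(line)
    return "\n".join(out_lines)

print(translate("SEL user_id FROM prod.users"))  # SELECT user_id FROM prod.users
```

In practice, statements that have no direct BigQuery equivalent (e.g., Teradata `QUALIFY` before it was supported, or `MULTISET` table options) need case-by-case redesign rather than textual rewriting, which is where the architect's judgment comes in.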
Posted 3 months ago
5 - 10 years
20 - 25 Lacs
Bengaluru
Work from Office
Job Description: AWS Data Engineer, Hadoop Migration
We are seeking an experienced AWS Principal Data Architect to lead the migration of Hadoop DWH workloads from on-premise to AWS EMR. As an AWS Data Architect, you will be a recognized expert in cloud data engineering, developing solutions designed for the effective data processing and warehousing requirements of large enterprises. You will be responsible for designing, implementing, and optimizing the data architecture in AWS, ensuring highly scalable, flexible, secure, and resilient cloud architectures that solve business problems and help accelerate the adoption of our clients' data initiatives on the cloud.

Key Responsibilities:
- Lead the migration of Hadoop workloads from on-premise to the AWS EMR stack.
- Design and implement data architectures on AWS, including data pipelines, storage, and security.
- Collaborate with cross-functional teams to ensure seamless migration and integration.
- Optimize data architectures for scalability, performance, and cost-effectiveness.
- Develop and maintain technical documentation and standards.
- Provide technical leadership and mentorship to junior team members.
- Work closely with stakeholders to understand business requirements and ensure data architectures meet business needs.
- Work alongside customers to build enterprise data platforms using AWS data services such as Elastic MapReduce (EMR), Redshift, Kinesis, Data Exchange, DataSync, RDS, Data Store, Amazon MSK, DMS, Glue, AppFlow, AWS Zero-ETL, Glue Data Catalog, Athena, Lake Formation, S3, RMS, DataZone, Amazon MWAA, and APIs (Kong).
- Apply a deep understanding of Hadoop components, conceptual processes, and system functioning, and of their relative components in AWS EMR and other AWS services.
- Good experience with Spark on EMR; experience with Snowflake/Redshift.
- Good grasp of the AWS system engineering aspects of setting up CI/CD pipelines on AWS using CloudWatch, CloudTrail, KMS, IAM Identity Center, Secrets Manager, etc.
- Extract best-practice knowledge, reference architectures, and patterns from these engagements for sharing with the worldwide AWS solution architect community.

Basic Qualifications:
- 10+ years of IT experience, with 5+ years of experience in Data Engineering and 5+ years of hands-on experience in AWS Data/EMR services (e.g., S3, Glue, Glue Catalog, Lake Formation).
- Strong understanding of Hadoop architecture, including HDFS, YARN, MapReduce, Hive, and HBase.
- Experience with data migration tools such as Glue and DataSync.
- Excellent knowledge of data modeling, data warehousing, ETL processes, and other data management systems.
- Strong understanding of security and compliance requirements in the cloud.
- Experience with Agile development methodologies and version control systems.
- Excellent communication and leadership skills.
- Ability to work effectively across internal and external organizations and virtual teams.
- Deep experience with AWS native data services, including Glue, Glue Catalog, EMR, Spark on EMR, DataSync, RDS, Data Exchange, Lake Formation, and Athena.
- AWS Certified Data Analytics – Specialty; AWS Certified Solutions Architect – Professional.
- Experience with containerization and serverless computing.
- Familiarity with DevOps practices and automation tools.
- Experience with Snowflake/Redshift implementation is additionally preferred.

Preferred Qualifications:
- Technical degree in computer science, software engineering, or mathematics.
- Cloud and data engineering background with migration experience.

Other Skills:
- A critical thinker with strong research, analytics, and problem-solving skills.
- Self-motivated with a positive attitude and an ability to work independently or in a team.
- Able to work under tight timelines and deliver on complex problems.
- Must be able to work flexible hours (including weekends and nights) as needed.
- A strong team player.
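The Hadoop-to-EMR migration described above typically starts by defining transient EMR clusters for the migrated Spark/Hive workloads. A minimal sketch of the kind of cluster definition involved is below; the instance types, release label, and bucket name are hypothetical placeholders, and in practice this dict would be passed to boto3's EMR client via `run_job_flow()`:

```python
# Sketch of a run_job_flow-style parameter dict for a Spark-on-EMR cluster.
# All concrete values (release label, instance types, roles) are assumptions
# for illustration, not recommendations.

def emr_cluster_config(name: str, log_bucket: str, workers: int) -> dict:
    """Build a transient Spark/Hive EMR cluster definition (illustrative)."""
    return {
        "Name": name,
        "ReleaseLabel": "emr-6.15.0",           # assumed EMR release
        "LogUri": f"s3://{log_bucket}/logs/",
        "Applications": [{"Name": "Spark"}, {"Name": "Hive"}],
        "Instances": {
            "InstanceGroups": [
                {"InstanceRole": "MASTER", "InstanceType": "m5.xlarge",
                 "InstanceCount": 1},
                {"InstanceRole": "CORE", "InstanceType": "m5.xlarge",
                 "InstanceCount": workers},
            ],
            # Transient cluster: terminate once all steps finish, so cost
            # tracks the workload rather than an always-on Hadoop estate.
            "KeepJobFlowAliveWhenNoSteps": False,
        },
        "JobFlowRole": "EMR_EC2_DefaultRole",
        "ServiceRole": "EMR_DefaultRole",
    }

config = emr_cluster_config("hadoop-migration-poc", "my-emr-logs", workers=4)
print(config["LogUri"])  # s3://my-emr-logs/logs/
```

Making clusters transient rather than long-lived is one of the main cost and scalability levers this role's "optimize for cost-effectiveness" responsibility refers to.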
Posted 3 months ago