12.0 - 15.0 years
13 - 17 Lacs
Pune
Work from Office
Title: Data & AI Technical Solution Architect
Req ID: 323749 We are currently seeking a Data & AI Technical Solution Architect to join our team in Pune, Maharashtra (IN-MH), India (IN).
Job Duties
The Data & AI Architect is a seasoned-level expert responsible for participating in the delivery of multi-technology consulting services to clients by providing strategies and solutions on all aspects of infrastructure and related technology components. This role collaborates with other stakeholders on the development of the architectural approach for one or more layers of a solution, with the primary objective of working on strategic projects that ensure the optimal functioning of the client's technology infrastructure.
Key Responsibilities:
• Hold conversations with the CEO, business owners, and the CTO/CDO.
• Break down intricate business challenges, devise effective solutions, and focus on client needs.
• Craft high-level, innovative solution approaches for complex business problems.
• Utilize best practices and creativity to address challenges.
• Leverage market research, formulate perspectives, and communicate insights to clients.
• Establish strong client relationships.
• Interact at appropriate levels to ensure client satisfaction.
Knowledge and Attributes:
• Ability to focus on detail with an understanding of how it impacts the business strategically.
• Excellent client service orientation.
• Ability to work in high-pressure situations.
• Ability to establish and manage processes and practices through collaboration and an understanding of the business.
• Ability to create new and repeat business for the organization.
• Ability to contribute information on relevant vertical markets.
• Ability to improve internal effectiveness by contributing to the improvement of current methodologies, processes, and tools.
Minimum Skills Required
Academic Qualifications and Certifications:
• BE/BTech or equivalent in Information Technology and/or Business Management or a related field.
• Scaled Agile certification desirable.
• Relevant consulting and technical certifications preferred, for example TOGAF.
Required Experience: 12-15 years
• Seasoned, demonstrable experience in a similar role within a large-scale (preferably multinational) technology services environment.
• Very good understanding of Data, AI, GenAI, and Agentic AI.
• Must have Data Architecture and solutioning experience; capable of end-to-end Data Architecture and GenAI solution design.
• Must be able to work on Data & AI RFP responses as a Solution Architect.
• 10+ years of experience as a Technical Architect solutioning Data & Analytics, AI/ML, and GenAI engagements.
• Develop cloud-native technical approaches and proposal plans that identify best-practice solutions meeting the requirements for a successful proposal; create, edit, and review documents, diagrams, and other artifacts in response to RFPs and RFQs; contribute to and participate in presentations to customers regarding proposed solutions.
• Proficient with Snowflake, Databricks, Azure, AWS, GCP, and Data Engineering & AI tools.
• Experience with large-scale consulting and program execution engagements in AI and data.
• Seasoned multi-technology infrastructure design experience.
• Seasoned, demonstrable expertise coupled with consulting and client engagement experience, including client needs assessment and change management.
Career Level Description:
Knowledge and application:
• Seasoned, experienced professional with complete knowledge and understanding of the area of specialization.
• Uses evaluation, judgment, and interpretation to select the right course of action.
Problem solving:
• Works on problems of diverse scope where analysis of information requires evaluation of identifiable factors.
• Resolves and assesses a wide range of issues in creative ways and suggests variations in approach.
Interaction:
• Enhances relationships and networks with senior internal/external partners who are not familiar with the subject matter, often requiring persuasion.
Posted 6 days ago
8.0 - 13.0 years
10 - 14 Lacs
Chennai
Work from Office
Req ID: 324664 We are currently seeking a Data Architect to join our team in Chennai, Tamil Nadu (IN-TN), India (IN).
Key Responsibilities:
• Develop and articulate long-term strategic goals for the data architecture vision and establish data standards for enterprise systems.
• Utilize various cloud technologies, including Azure, AWS, and GCP, and data platforms such as Databricks and Snowflake.
• Conceptualize and create an end-to-end vision outlining the seamless flow of data through successive stages.
• Institute processes for governing the identification, collection, and utilization of corporate metadata, ensuring accuracy and validity.
• Implement methods and procedures for tracking data quality, completeness, redundancy, compliance, and continuous improvement.
• Evaluate and determine governance, stewardship, and frameworks for effective data management across the enterprise.
• Develop comprehensive strategies and plans for data capacity planning, data security, life-cycle data management, scalability, backup, disaster recovery, business continuity, and archiving.
• Identify potential areas for policy and procedure enhancements, initiating changes where required for optimal data management.
• Formulate and maintain data models and establish policies and procedures for functional design.
• Offer technical recommendations to senior managers and technical staff in the development and implementation of databases and documentation.
• Stay informed about upgrades and emerging database technologies through continuous research.
• Collaborate with project managers and business leaders on all projects involving enterprise data.
• Document the data architecture and environment to ensure a current and accurate understanding of the overall data landscape.
• Design and implement data solutions tailored to meet customer needs and specific use cases.
• Provide thought leadership by recommending the most suitable technologies and solutions for various use cases, spanning from the application layer to infrastructure.
Basic Qualifications:
• 8+ years of hands-on experience with various database technologies.
• 6+ years of experience with cloud-based systems and Enterprise Data Architecture, driving end-to-end technology solutions.
• Experience with Azure, Databricks, and Snowflake.
• Knowledgeable about GenAI concepts.
• Ability to travel at least 25%.
Preferred Skills:
• Certifications in AWS, Azure, and GCP to complement extensive hands-on experience.
• Demonstrated expertise with certifications in Snowflake.
• Valuable "Big 4" management consulting experience or exposure to multiple industries.
• Undergraduate or graduate degree preferred.
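For illustration only, a minimal sketch of the kind of data-quality tracking (completeness, redundancy) this posting describes, using pandas; the dataset, column names, and key columns are hypothetical placeholders, not part of the role's actual tooling:

```python
import pandas as pd

def profile_quality(df: pd.DataFrame, key_columns: list[str]) -> dict:
    """Compute simple completeness and redundancy metrics for a dataset."""
    completeness = {
        col: 1.0 - df[col].isna().mean()  # share of non-null values per column
        for col in df.columns
    }
    duplicate_rows = int(df.duplicated(subset=key_columns).sum())
    return {
        "row_count": len(df),
        "completeness": completeness,
        "duplicate_key_rows": duplicate_rows,
    }

if __name__ == "__main__":
    # Hypothetical customer extract used only to demonstrate the checks.
    customers = pd.DataFrame(
        {"customer_id": [1, 2, 2, 4], "email": ["a@x.com", None, "b@x.com", "c@x.com"]}
    )
    print(profile_quality(customers, key_columns=["customer_id"]))
```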
Posted 6 days ago
12.0 - 17.0 years
7 - 11 Lacs
Chennai
Work from Office
Req ID: 303369 We are currently seeking an Enterprise Resource Planning Advisor to join our team in Chennai, Tamil Nadu (IN-TN), India (IN).
The candidate has more than 12 years of relevant experience with Oracle ERP Cloud data migration and a minimum of 4 end-to-end ERP Cloud implementations.
• Analyze Data and Mapping: Work with functional teams to understand data requirements and develop mappings to enable smooth migration using FBDI (File-Based Data Import) and ADFdi.
• Develop and Manage FBDI Templates: Design, customize, and manage FBDI templates to facilitate data import into Oracle SaaS applications, ensuring data structure compatibility and completeness.
• Configure ADFdi for Data Uploads: Use ADFdi (Application Development Framework Desktop Integration) for interactive data uploads, enabling users to manipulate and validate data directly within Excel.
• Data Validation and Quality Checks: Implement data validation rules and perform pre- and post-load checks to maintain data integrity and quality, reducing errors during migration.
• Execute and Troubleshoot Data Loads: Run data loads, monitor progress, troubleshoot errors, and optimize performance during the data migration process.
• Collaborate with Stakeholders: Coordinate with cross-functional teams, including project managers and business analysts, to align on timelines, resolve data issues, and provide migration status updates.
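For illustration only, a minimal sketch of the kind of pre-load validation described above, run against a hypothetical FBDI-style CSV extract; the file name and required columns are placeholders and do not reflect Oracle's actual template layout:

```python
import csv
from pathlib import Path

# Placeholder columns for an illustrative AP-invoice extract, not Oracle's real FBDI spec.
REQUIRED_COLUMNS = ["BUSINESS_UNIT", "INVOICE_NUMBER", "INVOICE_AMOUNT", "CURRENCY_CODE"]

def validate_fbdi_csv(path: Path) -> list[str]:
    """Run simple pre-load checks on an FBDI-style CSV extract."""
    errors: list[str] = []
    with path.open(newline="", encoding="utf-8") as handle:
        reader = csv.DictReader(handle)
        missing = [c for c in REQUIRED_COLUMNS if c not in (reader.fieldnames or [])]
        if missing:
            return [f"missing columns: {missing}"]
        for line_no, row in enumerate(reader, start=2):
            if not (row["INVOICE_NUMBER"] or "").strip():
                errors.append(f"line {line_no}: empty INVOICE_NUMBER")
            try:
                float(row["INVOICE_AMOUNT"])
            except (TypeError, ValueError):
                errors.append(f"line {line_no}: non-numeric INVOICE_AMOUNT")
    return errors

if __name__ == "__main__":
    # "ap_invoices_fbdi.csv" is a hypothetical extract file name.
    print(validate_fbdi_csv(Path("ap_invoices_fbdi.csv")))
```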
Posted 6 days ago
5.0 - 10.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Req ID: 306668 We are currently seeking a Cloud Solution Delivery Sr Advisor to join our team in Bangalore, Karnataka (IN-KA), India (IN).
Position Overview
We are seeking a highly skilled and experienced Lead Data Engineer to join our dynamic team. The ideal candidate will have a strong background in implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies, leading teams and directing engineering workloads. This role requires a deep understanding of data engineering and cloud services, and the ability to implement high-quality solutions.
Key Responsibilities
Lead and direct a small team of engineers engaged in:
- Engineering end-to-end data solutions using AWS services, including Lambda, S3, Snowflake, DBT, and Apache Airflow
- Cataloguing data
- Collaborating with cross-functional teams to understand business requirements and translate them into technical solutions
- Providing best-in-class documentation for downstream teams to develop, test, and run data products built using our tools
- Testing our tooling, and providing a framework for downstream teams to test their utilisation of our products
- Helping to deliver CI, CD, and IaC for both our own tooling and as templates for downstream teams
- Using DBT projects to define re-usable pipelines
Required Skills and Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field
- 5+ years of experience in data engineering
- 2+ years of experience in leading a team of data engineers
- Experience in AWS cloud services
- Expertise with Python and SQL
- Experience using Git/GitHub for source control management
- Experience with Snowflake
- Strong understanding of lakehouse architectures and best practices
- Strong problem-solving skills and ability to think critically
- Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders
- Strong use of version control and proven ability to govern a team in the best-practice use of version control
- Strong understanding of Agile and proven ability to govern a team in the best-practice use of Agile methodologies
Preferred Skills and Qualifications
- An understanding of lakehouses
- An understanding of Apache Iceberg tables
- An understanding of data cataloguing
- Knowledge of Apache Airflow for data orchestration
- An understanding of DBT
- SnowPro Core certification
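For illustration only, a minimal sketch of orchestrating a DBT project from Apache Airflow, assuming a recent Airflow 2.4+ install; the DAG id, schedule, and project path are hypothetical placeholders:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical path; point this at the real dbt project in practice.
DBT_PROJECT_DIR = "/opt/airflow/dbt/analytics"

with DAG(
    dag_id="dbt_daily_build",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+ parameter; older versions use schedule_interval
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command=f"dbt run --project-dir {DBT_PROJECT_DIR}",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command=f"dbt test --project-dir {DBT_PROJECT_DIR}",
    )
    dbt_run >> dbt_test  # build the models, then validate them
```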
Posted 6 days ago
1.0 - 4.0 years
1 - 5 Lacs
Bengaluru
Work from Office
Req ID: 321505 We are currently seeking a Test Analyst to join our team in Bangalore, Karnataka (IN-KA), India (IN).
Job Duties
• Understand business requirements and develop test cases.
• Work with the tech team and client to validate and finalise test cases.
• Use Jira or an equivalent test management tool to record test cases, expected results, and outcomes, and to assign defects.
• Run the testing phases (SIT and UAT).
• Test reporting and documentation.
• Basic knowledge of Snowflake, SQL, ADF (optional), and Fivetran (optional).
Minimum Skills Required
• Test case development.
• Jira knowledge for recording test cases, expected results, and outcomes, and assigning defects.
• Test reporting and documentation.
• Basic knowledge of Snowflake, SQL, ADF (optional), and Fivetran (optional).
Posted 6 days ago
12.0 - 15.0 years
12 - 16 Lacs
Bengaluru
Work from Office
Req ID: 323777 We are currently seeking a Data Architect Sr. Advisor to join our team in Bangalore, Karnataka (IN-KA), India (IN).
Job Duties
The Data & AI Architect is a seasoned-level expert responsible for participating in the delivery of multi-technology consulting services to clients by providing strategies and solutions on all aspects of infrastructure and related technology components. This role collaborates with other stakeholders on the development of the architectural approach for one or more layers of a solution, with the primary objective of working on strategic projects that ensure the optimal functioning of the client's technology infrastructure.
Key Responsibilities:
• Hold conversations with the CEO, business owners, and the CTO/CDO.
• Break down intricate business challenges, devise effective solutions, and focus on client needs.
• Craft high-level, innovative solution approaches for complex business problems.
• Utilize best practices and creativity to address challenges.
• Leverage market research, formulate perspectives, and communicate insights to clients.
• Establish strong client relationships.
• Interact at appropriate levels to ensure client satisfaction.
Knowledge and Attributes:
• Ability to focus on detail with an understanding of how it impacts the business strategically.
• Excellent client service orientation.
• Ability to work in high-pressure situations.
• Ability to establish and manage processes and practices through collaboration and an understanding of the business.
• Ability to create new and repeat business for the organization.
• Ability to contribute information on relevant vertical markets.
• Ability to improve internal effectiveness by contributing to the improvement of current methodologies, processes, and tools.
Minimum Skills Required
Academic Qualifications and Certifications:
• BE/BTech or equivalent in Information Technology and/or Business Management or a related field.
• Scaled Agile certification desirable.
• Relevant consulting and technical certifications preferred, for example TOGAF.
Required Experience: 12-15 years
• Seasoned, demonstrable experience in a similar role within a large-scale (preferably multinational) technology services environment.
• Very good understanding of Data, AI, GenAI, and Agentic AI.
• Must have Data Architecture and solutioning experience; capable of end-to-end Data Architecture and GenAI solution design.
• Must be able to work on Data & AI RFP responses as a Solution Architect.
• 10+ years of experience as a Technical Architect solutioning Data & Analytics, AI/ML, and GenAI engagements.
• Develop cloud-native technical approaches and proposal plans that identify best-practice solutions meeting the requirements for a successful proposal; create, edit, and review documents, diagrams, and other artifacts in response to RFPs and RFQs; contribute to and participate in presentations to customers regarding proposed solutions.
• Proficient with Snowflake, Databricks, Azure, AWS, GCP, and Data Engineering & AI tools.
• Experience with large-scale consulting and program execution engagements in AI and data.
• Seasoned multi-technology infrastructure design experience.
• Seasoned, demonstrable expertise coupled with consulting and client engagement experience, including client needs assessment and change management.
Career Level Description:
Knowledge and application:
• Seasoned, experienced professional with complete knowledge and understanding of the area of specialization.
• Uses evaluation, judgment, and interpretation to select the right course of action.
Problem solving:
• Works on problems of diverse scope where analysis of information requires evaluation of identifiable factors.
• Resolves and assesses a wide range of issues in creative ways and suggests variations in approach.
Interaction:
• Enhances relationships and networks with senior internal/external partners who are not familiar with the subject matter, often requiring persuasion.
Posted 6 days ago
7.0 - 12.0 years
1 - 5 Lacs
Bengaluru
Work from Office
Req ID: 325298 We are currently seeking an AWS Redshift Administrator Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).
Job Duties:
• Administer and maintain scalable cloud environments and applications for the data organization.
• Understand the company's business objectives and create cloud-based solutions to facilitate those objectives.
• Implement Infrastructure as Code and deploy code using Terraform and GitLab.
• Install and maintain software, services, and applications by identifying system requirements.
• Hands-on AWS services experience and DB and server troubleshooting experience.
• Extensive database experience with RDS, AWS Redshift, and MySQL.
• Maintain the environment by identifying system requirements, installing upgrades, and monitoring system performance.
• Knowledge of day-to-day database operations, deployments, and development.
• Experienced in Snowflake.
• Knowledge of SQL and performance tuning.
• Knowledge of Linux shell scripting or Python.
• Migrate systems from one AWS account to another.
• Maintain system performance through system monitoring, analysis, and performance tuning.
• Troubleshoot system hardware, software, operating systems, and system management tools.
• Secure the web system by developing system access, monitoring, control, and evaluation.
• Test disaster recovery policies and procedures, complete back-ups, and maintain documentation.
• Upgrade systems and services, and develop, test, evaluate, and install enhancements and new software.
• Communicate with internal teams, such as EIMO, Operations, and Cloud Architects.
• Communicate with stakeholders and build applications to meet project needs.
Minimum Skills Required:
• Bachelor's degree in computer science or engineering.
• Minimum of 7 years of experience in system, platform, and AWS cloud administration.
• Minimum of 5 to 7 years of database administration and AWS experience using the latest AWS technologies: AWS EC2, Redshift, VPC, S3, and AWS RDS.
• Experience with Java, Python, Redshift, MySQL, or equivalent database tools.
• Experience with Agile software development using JIRA.
• Experience on multiple OS platforms with a strong emphasis on Linux and Windows systems.
• Experience with OS-level scripting environments such as KSH shell and PowerShell.
• Experience with version management tools and CI/CD pipelines.
• In-depth knowledge of the TCP/IP protocol suite, security architecture, and securing and hardening operating systems, networks, databases, and applications.
• Advanced SQL knowledge and experience working with relational databases, query authoring (SQL), and query performance tuning.
• Experience supporting and optimizing data pipelines and data sets.
• Knowledge of the Incident Response life cycle.
• AWS Solutions Architect certification.
• Strong written and verbal communication skills.
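For illustration only, a minimal monitoring sketch of the day-to-day administration described above, using boto3 to read basic Redshift cluster health; the cluster identifier and region are hypothetical placeholders:

```python
import boto3

def redshift_cluster_status(cluster_id: str, region: str = "us-east-1") -> dict:
    """Return basic health information for a Redshift cluster."""
    client = boto3.client("redshift", region_name=region)
    cluster = client.describe_clusters(ClusterIdentifier=cluster_id)["Clusters"][0]
    return {
        "status": cluster.get("ClusterStatus"),
        "availability": cluster.get("ClusterAvailabilityStatus"),
        "nodes": cluster.get("NumberOfNodes"),
        "publicly_accessible": cluster.get("PubliclyAccessible"),
    }

if __name__ == "__main__":
    # "analytics-prod" is a placeholder cluster identifier.
    print(redshift_cluster_status("analytics-prod"))
```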
Posted 6 days ago
18.0 - 23.0 years
15 - 19 Lacs
Pune
Work from Office
Req ID: 317103 We are currently seeking a Digital Consultant - Innovation Group to join our team in Pune, Maharashtra (IN-MH), India (IN).
Job Duties
We are seeking a highly skilled and experienced Digital Consultant to join our Innovation Group. The ideal candidate will have a strong background in Big Data, Cloud, and AI/ML projects, with a focus on the health insurance, retail, or manufacturing domains. This role involves engaging with clients for architecture and design, building accelerators for cloud migration, and developing innovative solutions using GenAI technologies.
Key Responsibilities:
• Engage with clients to understand their requirements and provide architectural and design solutions.
• Develop and implement accelerators to facilitate faster cloud migration.
• Create innovative use cases or solutions to solve day-to-day data engineering problems using AI and GenAI tools.
• Develop reference architectures for various use cases using modern cloud data platforms.
• An understanding of legacy toolsets, be it ETL, reporting, etc., is needed.
• Create migration suites for cataloging, migrating, and verifying data from legacy systems to modern platforms like Databricks and Snowflake.
Minimum Skills Required
Qualifications:
• Education: B.E. in Electronics & Telecommunication or a related field.
• Experience: 18+ years in IT, with significant experience in Big Data, Cloud, and AI/ML projects.
• Technical Skills: Proficiency in Databricks, Snowflake, AWS, GenAI (RAG and GANs), Python, C/C++/C
Posted 6 days ago
12.0 - 15.0 years
13 - 17 Lacs
Bengaluru
Work from Office
Req ID: 323775 We are currently seeking a Data & AI Technical Solution Architect to join our team in Bangalore, Karnataka (IN-KA), India (IN).
Job Duties
The Data & AI Architect is a seasoned-level expert responsible for participating in the delivery of multi-technology consulting services to clients by providing strategies and solutions on all aspects of infrastructure and related technology components. This role collaborates with other stakeholders on the development of the architectural approach for one or more layers of a solution, with the primary objective of working on strategic projects that ensure the optimal functioning of the client's technology infrastructure.
Key Responsibilities:
• Hold conversations with the CEO, business owners, and the CTO/CDO.
• Break down intricate business challenges, devise effective solutions, and focus on client needs.
• Craft high-level, innovative solution approaches for complex business problems.
• Utilize best practices and creativity to address challenges.
• Leverage market research, formulate perspectives, and communicate insights to clients.
• Establish strong client relationships.
• Interact at appropriate levels to ensure client satisfaction.
Knowledge and Attributes:
• Ability to focus on detail with an understanding of how it impacts the business strategically.
• Excellent client service orientation.
• Ability to work in high-pressure situations.
• Ability to establish and manage processes and practices through collaboration and an understanding of the business.
• Ability to create new and repeat business for the organization.
• Ability to contribute information on relevant vertical markets.
• Ability to improve internal effectiveness by contributing to the improvement of current methodologies, processes, and tools.
Minimum Skills Required
Academic Qualifications and Certifications:
• BE/BTech or equivalent in Information Technology and/or Business Management or a related field.
• Scaled Agile certification desirable.
• Relevant consulting and technical certifications preferred, for example TOGAF.
Required Experience: 12-15 years
• Seasoned, demonstrable experience in a similar role within a large-scale (preferably multinational) technology services environment.
• Very good understanding of Data, AI, GenAI, and Agentic AI.
• Must have Data Architecture and solutioning experience; capable of end-to-end Data Architecture and GenAI solution design.
• Must be able to work on Data & AI RFP responses as a Solution Architect.
• 10+ years of experience as a Technical Architect solutioning Data & Analytics, AI/ML, and GenAI engagements.
• Develop cloud-native technical approaches and proposal plans that identify best-practice solutions meeting the requirements for a successful proposal; create, edit, and review documents, diagrams, and other artifacts in response to RFPs and RFQs; contribute to and participate in presentations to customers regarding proposed solutions.
• Proficient with Snowflake, Databricks, Azure, AWS, GCP, and Data Engineering & AI tools.
• Experience with large-scale consulting and program execution engagements in AI and data.
• Seasoned multi-technology infrastructure design experience.
• Seasoned, demonstrable expertise coupled with consulting and client engagement experience, including client needs assessment and change management.
Career Level Description:
Knowledge and application:
• Seasoned, experienced professional with complete knowledge and understanding of the area of specialization.
• Uses evaluation, judgment, and interpretation to select the right course of action.
Problem solving:
• Works on problems of diverse scope where analysis of information requires evaluation of identifiable factors.
• Resolves and assesses a wide range of issues in creative ways and suggests variations in approach.
Interaction:
• Enhances relationships and networks with senior internal/external partners who are not familiar with the subject matter, often requiring persuasion.
Posted 6 days ago
1.0 - 4.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Req ID: 321498 We are currently seeking a Data Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).
Job Duties
• Work closely with the Lead Data Engineer to understand business requirements, analyse them, and translate them into technical specifications and solution designs.
• Work closely with the Data Modeller to ensure data models support the solution design.
• Develop, test, and fix ETL code using Snowflake, Fivetran, SQL, and stored procedures.
• Analyse the data and ETL for defects and service tickets raised against the solution in production.
• Develop documentation and artefacts to support projects.
Minimum Skills Required
• ADF
• Fivetran (orchestration & integration)
• SQL
• Snowflake DWH
Posted 6 days ago
8.0 - 13.0 years
13 - 17 Lacs
Bengaluru
Work from Office
We are currently seeking a Cloud Solution Delivery Lead Consultant to join our team in Bangalore, Karnataka (IN-KA), India (IN).
Data Engineer Lead
• Robust hands-on experience with industry-standard tooling and techniques, including SQL, Git, and CI/CD pipelines (mandatory)
• Management, administration, and maintenance of data streaming tools such as Kafka/Confluent Kafka and Flink
• Experienced with software support for applications written in Python and SQL
• Administration, configuration, and maintenance of Snowflake and DBT
• Experience with data product environments that use tools such as Kafka Connect, Snyk, Confluent Schema Registry, Atlan, IBM MQ, SonarQube, Apache Airflow, Apache Iceberg, DynamoDB, Terraform, and GitHub
• Debugging issues, root cause analysis, and applying fixes
• Management and maintenance of ETL processes (bug fixing and batch job monitoring)
Training & Certification
• Apache Kafka Administration
• Snowflake Fundamentals/Advanced Training
Experience
• 8 years of experience in a technical role working with AWS
• At least 2 years in a leadership or management role
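For illustration only, a minimal sketch of the kind of streaming support work this posting describes: a confluent-kafka consumer that reads and logs records. The broker address, consumer group, and topic name are hypothetical placeholders:

```python
from confluent_kafka import Consumer, KafkaError

# Placeholder broker, group, and topic names for illustration only.
conf = {
    "bootstrap.servers": "broker-1:9092",
    "group.id": "data-platform-support",
    "auto.offset.reset": "earliest",
}

consumer = Consumer(conf)
consumer.subscribe(["orders.events"])

try:
    while True:
        msg = consumer.poll(timeout=1.0)  # wait up to 1 second for a record
        if msg is None:
            continue
        if msg.error():
            if msg.error().code() == KafkaError._PARTITION_EOF:
                continue  # reached end of partition; not a real error
            print(f"consumer error: {msg.error()}")
            continue
        print(f"{msg.topic()}[{msg.partition()}] offset={msg.offset()}: {msg.value()!r}")
finally:
    consumer.close()
```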
Posted 6 days ago
12.0 - 15.0 years
13 - 17 Lacs
Pune
Work from Office
Req ID: 323754 We are currently seeking a Data & AI Technical Solution Architect to join our team in Pune, Maharashtra (IN-MH), India (IN).
Job Duties
The Data & AI Architect is a seasoned-level expert responsible for participating in the delivery of multi-technology consulting services to clients by providing strategies and solutions on all aspects of infrastructure and related technology components. This role collaborates with other stakeholders on the development of the architectural approach for one or more layers of a solution, with the primary objective of working on strategic projects that ensure the optimal functioning of the client's technology infrastructure.
Key Responsibilities:
• Hold conversations with the CEO, business owners, and the CTO/CDO.
• Break down intricate business challenges, devise effective solutions, and focus on client needs.
• Craft high-level, innovative solution approaches for complex business problems.
• Utilize best practices and creativity to address challenges.
• Leverage market research, formulate perspectives, and communicate insights to clients.
• Establish strong client relationships.
• Interact at appropriate levels to ensure client satisfaction.
Knowledge and Attributes:
• Ability to focus on detail with an understanding of how it impacts the business strategically.
• Excellent client service orientation.
• Ability to work in high-pressure situations.
• Ability to establish and manage processes and practices through collaboration and an understanding of the business.
• Ability to create new and repeat business for the organization.
• Ability to contribute information on relevant vertical markets.
• Ability to improve internal effectiveness by contributing to the improvement of current methodologies, processes, and tools.
Minimum Skills Required
Academic Qualifications and Certifications:
• BE/BTech or equivalent in Information Technology and/or Business Management or a related field.
• Scaled Agile certification desirable.
• Relevant consulting and technical certifications preferred, for example TOGAF.
Required Experience: 12-15 years
• Seasoned, demonstrable experience in a similar role within a large-scale (preferably multinational) technology services environment.
• Very good understanding of Data, AI, GenAI, and Agentic AI.
• Must have Data Architecture and solutioning experience; capable of end-to-end Data Architecture and GenAI solution design.
• Must be able to work on Data & AI RFP responses as a Solution Architect.
• 10+ years of experience as a Technical Architect solutioning Data & Analytics, AI/ML, and GenAI engagements.
• Develop cloud-native technical approaches and proposal plans that identify best-practice solutions meeting the requirements for a successful proposal; create, edit, and review documents, diagrams, and other artifacts in response to RFPs and RFQs; contribute to and participate in presentations to customers regarding proposed solutions.
• Proficient with Snowflake, Databricks, Azure, AWS, GCP, and Data Engineering & AI tools.
• Experience with large-scale consulting and program execution engagements in AI and data.
• Seasoned multi-technology infrastructure design experience.
• Seasoned, demonstrable expertise coupled with consulting and client engagement experience, including client needs assessment and change management.
Career Level Description:
Knowledge and application:
• Seasoned, experienced professional with complete knowledge and understanding of the area of specialization.
• Uses evaluation, judgment, and interpretation to select the right course of action.
Problem solving:
• Works on problems of diverse scope where analysis of information requires evaluation of identifiable factors.
• Resolves and assesses a wide range of issues in creative ways and suggests variations in approach.
Interaction:
• Enhances relationships and networks with senior internal/external partners who are not familiar with the subject matter, often requiring persuasion.
Posted 6 days ago
2.0 - 7.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Req ID: 310924 We are currently seeking a Technical Lead - Data Foundations Squad to join our team in Bangalore, Karnataka (IN-KA), India (IN).
Technical Lead
We are looking for a highly experienced technical lead to run (jointly with a solution architect) a medium-sized squad delivering data-as-a-product integrations and flows. This squad will be the first to utilise new standards and tools as they are delivered by the squads working on the core platform, both delivering value and working with upstream teams to ensure that the baseline you are being asked to follow is high quality and fit for purpose. As a technical lead at NTT DATA UK, you will be expected to mentor and lead quality across the engineering activities with your team, working closely with the programme-wide engineering lead to ensure consistent standards across the platform. As this team will deliver fully end-to-end data products, it is highly likely that unfamiliar technologies will need to be handled at times, and a successful candidate will need to be comfortable not just quickly upskilling themselves as requirements evolve, but bringing their team along with them.
Required skills
• A deep understanding of data products and/or DaaP concepts
• A willingness and ability to quickly upskill yourself and others on new technologies
• Experience at the level of technical lead, or lead engineer, for 2+ years
• Significant expertise with Python, including design paradigms
• Experience with Kafka (or significant experience with another log-based streaming technology)
• Experience with Snowflake (or significant experience with another cloud-scale warehousing solution)
• Experience with Terraform (or significant experience with another IaC tool)
• Experience with unit testing within a data landscape
• Experience of an agile workflow
• Strong communication skills
Extra useful skills
• Experience working in a multi-vendor environment
• A working understanding of cloud security
• An understanding of architectural patterns for real-time data
• A history of authoring formal documentation and low-level designs for a wide range of target audiences
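For illustration only, a minimal sketch of unit testing within a data landscape, as mentioned in the required skills: a small pytest module exercising a hypothetical record-normalisation transform (the function, fields, and test data are placeholders):

```python
# test_transforms.py — a minimal pytest sketch for unit testing a data transformation.
import pytest

def normalise_customer(record: dict) -> dict:
    """Illustrative transform: require an id and lower-case/trim the email."""
    if "customer_id" not in record:
        raise ValueError("customer_id is required")
    return {
        "customer_id": record["customer_id"],
        "email": (record.get("email") or "").strip().lower(),
    }

def test_email_is_normalised():
    out = normalise_customer({"customer_id": 42, "email": "  Person@Example.COM "})
    assert out["email"] == "person@example.com"

def test_missing_id_is_rejected():
    with pytest.raises(ValueError):
        normalise_customer({"email": "person@example.com"})
```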
Posted 6 days ago
12.0 - 15.0 years
13 - 17 Lacs
Hyderabad
Work from Office
Req ID: 323774 We are currently seeking a Data & AI Technical Solution Architect to join our team in Hyderabad, Telangana (IN-TG), India (IN).
Job Duties
The Data & AI Architect is a seasoned-level expert responsible for participating in the delivery of multi-technology consulting services to clients by providing strategies and solutions on all aspects of infrastructure and related technology components. This role collaborates with other stakeholders on the development of the architectural approach for one or more layers of a solution, with the primary objective of working on strategic projects that ensure the optimal functioning of the client's technology infrastructure.
Key Responsibilities:
• Hold conversations with the CEO, business owners, and the CTO/CDO.
• Break down intricate business challenges, devise effective solutions, and focus on client needs.
• Craft high-level, innovative solution approaches for complex business problems.
• Utilize best practices and creativity to address challenges.
• Leverage market research, formulate perspectives, and communicate insights to clients.
• Establish strong client relationships.
• Interact at appropriate levels to ensure client satisfaction.
Knowledge and Attributes:
• Ability to focus on detail with an understanding of how it impacts the business strategically.
• Excellent client service orientation.
• Ability to work in high-pressure situations.
• Ability to establish and manage processes and practices through collaboration and an understanding of the business.
• Ability to create new and repeat business for the organization.
• Ability to contribute information on relevant vertical markets.
• Ability to improve internal effectiveness by contributing to the improvement of current methodologies, processes, and tools.
Minimum Skills Required
Academic Qualifications and Certifications:
• BE/BTech or equivalent in Information Technology and/or Business Management or a related field.
• Scaled Agile certification desirable.
• Relevant consulting and technical certifications preferred, for example TOGAF.
Required Experience: 12-15 years
• Seasoned, demonstrable experience in a similar role within a large-scale (preferably multinational) technology services environment.
• Very good understanding of Data, AI, GenAI, and Agentic AI.
• Must have Data Architecture and solutioning experience; capable of end-to-end Data Architecture and GenAI solution design.
• Must be able to work on Data & AI RFP responses as a Solution Architect.
• 10+ years of experience as a Technical Architect solutioning Data & Analytics, AI/ML, and GenAI engagements.
• Develop cloud-native technical approaches and proposal plans that identify best-practice solutions meeting the requirements for a successful proposal; create, edit, and review documents, diagrams, and other artifacts in response to RFPs and RFQs; contribute to and participate in presentations to customers regarding proposed solutions.
• Proficient with Snowflake, Databricks, Azure, AWS, GCP, and Data Engineering & AI tools.
• Experience with large-scale consulting and program execution engagements in AI and data.
• Seasoned multi-technology infrastructure design experience.
• Seasoned, demonstrable expertise coupled with consulting and client engagement experience, including client needs assessment and change management.
Career Level Description:
Knowledge and application:
• Seasoned, experienced professional with complete knowledge and understanding of the area of specialization.
• Uses evaluation, judgment, and interpretation to select the right course of action.
Problem solving:
• Works on problems of diverse scope where analysis of information requires evaluation of identifiable factors.
• Resolves and assesses a wide range of issues in creative ways and suggests variations in approach.
Interaction:
• Enhances relationships and networks with senior internal/external partners who are not familiar with the subject matter, often requiring persuasion.
Posted 6 days ago
5.0 - 10.0 years
13 - 23 Lacs
Hyderabad
Work from Office
Skill: Snowflake
Location: Hyderabad
Experience: 5 to 10 years
JD
• IT development experience, with a minimum of 3+ years of hands-on experience in Snowflake.
• Strong experience in building/designing data warehouses or data lakes, and end-to-end data mart implementation experience, focusing on large enterprise-scale Snowflake implementations on any of the hyperscalers.
• Strong experience with building productionized data ingestion and data pipelines in Snowflake.
• Good knowledge of Snowflake's architecture, features like Zero-Copy Cloning and Time Travel, and performance tuning capabilities.
• Should have good experience with Snowflake RBAC and data security.
• Strong experience with Snowflake features, including newly released capabilities.
• Should have good experience in Python/PySpark.
• Should have experience in AWS services (S3, Glue, Lambda, Secrets Manager, DMS) and a few Azure services (Blob Storage, ADLS, ADF).
• Should have experience/knowledge of orchestration and scheduling tools like Airflow.
• Should have a good understanding of ETL or ELT processes and ETL tools.
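For illustration only, a minimal sketch of the Zero-Copy Cloning and Time Travel features named above, run through snowflake-connector-python; the connection parameters, warehouse, database, and table names are placeholders, and credentials should come from a secrets manager in practice:

```python
import snowflake.connector

# Placeholder credentials and identifiers for illustration only.
conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="***",
    warehouse="ANALYTICS_WH", database="SALES_DB", schema="PUBLIC",
)

try:
    cur = conn.cursor()
    # Zero-Copy Cloning: an instant copy of a table without duplicating storage.
    cur.execute("CREATE OR REPLACE TABLE ORDERS_DEV CLONE ORDERS")
    # Time Travel: query the table as it looked one hour ago.
    cur.execute("SELECT COUNT(*) FROM ORDERS AT(OFFSET => -3600)")
    print("row count one hour ago:", cur.fetchone()[0])
finally:
    conn.close()
```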
Posted 6 days ago
4.0 - 6.0 years
15 - 25 Lacs
Noida
Work from Office
We are looking for a highly experienced Senior Data Engineer with deep expertise in Snowflake to lead efforts in optimizing the performance of our data warehouse to enable faster, more reliable reporting. You will be responsible for improving query efficiency, data pipeline performance, and overall reporting speed by tuning Snowflake environments, optimizing data models, and collaborating with Application development teams. Roles and Responsibilities Analyze and optimize Snowflake data warehouse performance to support high-volume, complex reporting workloads. Identify bottlenecks in SQL queries, ETL/ELT pipelines, and data models impacting report generation times. Implement performance tuning strategies including clustering keys, materialized views, result caching, micro-partitioning, and query optimization. Collaborate with BI teams and business analysts to understand reporting requirements and translate them into performant data solutions. Design and maintain efficient data models (star schema, snowflake schema) tailored for fast analytical querying. Develop and enhance ETL/ELT processes ensuring minimal latency and high throughput using Snowflake’s native features. Monitor system performance and proactively recommend architectural improvements and capacity planning. Establish best practices for data ingestion, transformation, and storage aimed at improving report delivery times. Experience with Unistore will be an added advantage
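For illustration only, a minimal sketch of the clustering-key tuning mentioned above, executed through snowflake-connector-python; the connection details, table, and column names are hypothetical placeholders, and clustering choices should follow the actual reporting query patterns:

```python
import snowflake.connector

# Placeholder connection details for illustration only.
conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="***",
    warehouse="REPORTING_WH", database="DWH", schema="MART",
)

try:
    cur = conn.cursor()
    # Define a clustering key on the columns most reporting queries filter by.
    cur.execute("ALTER TABLE FACT_SALES CLUSTER BY (SALE_DATE, REGION_ID)")
    # Inspect how well the table is clustered on those columns.
    cur.execute(
        "SELECT SYSTEM$CLUSTERING_INFORMATION('FACT_SALES', '(SALE_DATE, REGION_ID)')"
    )
    print(cur.fetchone()[0])
finally:
    conn.close()
```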
Posted 6 days ago
10.0 - 16.0 years
25 - 35 Lacs
Hyderabad
Work from Office
Required Skills & Qualifications:
• 10-12 years of experience in data architecture, data warehousing, and cloud technologies.
• Strong expertise in Snowflake architecture, data modeling, and optimization.
• Solid hands-on experience with cloud platforms: AWS, Azure, and GCP.
• In-depth knowledge of SQL, Python, PySpark, and related data engineering tools.
• Expertise in data modeling (both dimensional and normalized models).
• Strong experience with data integration, ETL processes, and pipeline development.
• Certification in Snowflake, AWS, Azure, or related cloud technologies.
• Experience working with large-scale data processing frameworks and platforms.
• Experience in data visualization tools and BI platforms (e.g., Tableau, Power BI).
• Experience in Agile methodologies and project management.
• Strong problem-solving skills with the ability to address complex technical challenges.
• Excellent communication skills and ability to work collaboratively with cross-functional teams.
Posted 6 days ago
5.0 - 8.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Role Purpose
The purpose of the role is to support process delivery by ensuring the daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.
Do
Oversee and support the process by reviewing daily transactions on performance parameters:
• Review the performance dashboard and the scores for the team
• Support the team in improving performance parameters by providing technical support and process guidance
• Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
• Ensure standard processes and procedures are followed to resolve all client queries
• Resolve client queries as per the SLAs defined in the contract
• Develop an understanding of the process/product for team members to facilitate better client interaction and troubleshooting
• Document and analyze call logs to spot the most frequently occurring trends and prevent future problems
• Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution
• Ensure all product information and disclosures are given to clients before and after the call/email requests
• Avoid legal challenges by monitoring compliance with service agreements
Handle technical escalations through effective diagnosis and troubleshooting of client queries:
• Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
• If unable to resolve an issue, escalate it to TA & SES in a timely manner
• Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions
• Troubleshoot all client queries in a user-friendly, courteous, and professional manner
• Offer alternative solutions to clients (where appropriate) with the objective of retaining the customer's and client's business
• Organize ideas and effectively communicate oral messages appropriate to listeners and situations
• Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs
Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client:
• Mentor and guide Production Specialists on improving technical knowledge
• Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
• Develop and conduct trainings (triages) within products for Production Specialists as per target
• Inform the client about the triages being conducted
• Undertake product trainings to stay current with product features, changes, and updates
• Enroll in product-specific and any other trainings per client requirements/recommendations
• Identify and document the most common problems and recommend appropriate resolutions to the team
• Update job knowledge by participating in self-learning opportunities and maintaining personal networks
Deliver
No. | Performance Parameter | Measure
1 | Process | No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2 | Team Management | Productivity, efficiency, absenteeism
3 | Capability Development | Triages completed, Technical Test performance
Mandatory Skills: Snowflake.
Posted 6 days ago
7.0 - 12.0 years
20 - 35 Lacs
Kolkata, Hyderabad, Bengaluru
Hybrid
Skill set: Snowflake, AWS, Cortex AI, Horizon Catalog; or Snowflake, AWS, (Cortex AI or Horizon Catalog); or Snowflake, Azure, Cortex AI, Horizon Catalog; or Snowflake, Azure, (Cortex AI or Horizon Catalog)
Preferred Qualifications:
• Bachelor's degree in Computer Science, Data Engineering, or a related field.
• Experience in data engineering, with at least 3 years of experience working with Snowflake.
• Proven experience in Snowflake and Cortex AI/Horizon Catalog, focusing on data extraction, chatbot development, and conversational AI.
• Strong proficiency in SQL, Python, and data modeling.
• Experience with data integration tools (e.g., Matillion, Talend, Informatica).
• Knowledge of cloud platforms such as AWS, Azure, or GCP.
• Excellent problem-solving skills, with a focus on data quality and performance optimization.
• Strong communication skills and the ability to work effectively in a cross-functional team.
• Proficiency in using DBT's testing and documentation features to ensure the accuracy and reliability of data transformations.
• Understanding of data lineage and metadata management concepts, and the ability to track and document data transformations using DBT's lineage capabilities.
• Understanding of software engineering best practices and the ability to apply these principles to DBT development, including version control, code reviews, and automated testing.
• Should have experience building data ingestion pipelines.
• Should have experience with Snowflake utilities such as SnowSQL, Snowpipe, bulk copy, Snowpark, tables, Tasks, Streams, Time Travel, Cloning, Optimizer, Metadata Manager, data sharing, stored procedures, UDFs, and Snowsight.
• Should have good experience implementing CDC or SCD Type 2.
• Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs.
• Good to have: experience with repository tools like GitHub/GitLab and Azure Repos.
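For illustration only, a minimal sketch of the SCD Type 2 pattern mentioned above, run through snowflake-connector-python. The staging and dimension tables, their columns, and the two-step expire-then-insert approach are simplified placeholders; a real pipeline would typically drive this from Streams/Tasks or an orchestration tool and handle NULL comparisons more carefully:

```python
import snowflake.connector

# Placeholder connection and table names (STG_CUSTOMER feed, DIM_CUSTOMER SCD2 dimension).
conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="***",
    warehouse="ETL_WH", database="DWH", schema="CORE",
)

EXPIRE_CHANGED_ROWS = """
MERGE INTO DIM_CUSTOMER d
USING STG_CUSTOMER s
  ON d.CUSTOMER_ID = s.CUSTOMER_ID AND d.IS_CURRENT = TRUE
WHEN MATCHED AND (d.EMAIL <> s.EMAIL OR d.SEGMENT <> s.SEGMENT) THEN UPDATE SET
  IS_CURRENT = FALSE,
  VALID_TO   = CURRENT_TIMESTAMP()
"""

INSERT_NEW_VERSIONS = """
INSERT INTO DIM_CUSTOMER (CUSTOMER_ID, EMAIL, SEGMENT, VALID_FROM, VALID_TO, IS_CURRENT)
SELECT s.CUSTOMER_ID, s.EMAIL, s.SEGMENT, CURRENT_TIMESTAMP(), NULL, TRUE
FROM STG_CUSTOMER s
LEFT JOIN DIM_CUSTOMER d
  ON d.CUSTOMER_ID = s.CUSTOMER_ID AND d.IS_CURRENT = TRUE
WHERE d.CUSTOMER_ID IS NULL
"""

try:
    cur = conn.cursor()
    cur.execute(EXPIRE_CHANGED_ROWS)   # step 1: close out rows whose attributes changed
    cur.execute(INSERT_NEW_VERSIONS)   # step 2: insert current versions for new/changed keys
finally:
    conn.close()
```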
Posted 6 days ago
5.0 - 10.0 years
10 - 20 Lacs
Pune
Work from Office
Role & Responsibilities: Senior Site Reliability Engineer - Data Platform
Role Summary
The primary responsibility of the Senior Site Reliability Engineer (SRE) is to ensure the reliability and performance of data systems while working on the development, automation, and testing of data pipelines, from extract through to population of the consumption layer, for the GPN Lakehouse. The role combines data analytics, testing, and system architecture tasks to provide reliable data pipelines that enable business solutions. SRE engineers are expected to perform, at a minimum, the following tasks: ETL process management, data modeling, data warehouse/lake architecture, ETL tool implementation, data pipeline development, system monitoring, incident response, data lineage tracking, and ETL unit testing.
Notice Period: Immediate joiners only.
Posted 6 days ago
3.0 - 6.0 years
10 - 13 Lacs
Pune
Hybrid
When visionary companies need to know how their world-changing ideas will perform, they close the gap between design and reality with Ansys simulation. For more than 50 years, Ansys software has enabled innovators across industries to push boundaries by using the predictive power of simulation. From sustainable transportation to advanced semiconductors, from satellite systems to life-saving medical devices, the next great leaps in human advancement will be powered by Ansys. Innovate With Ansys, Power Your Career.
Summary / Role Purpose
The Business Operations Specialist II works with the Sales, Sales Ops, Legal, Accounting, Export Compliance, and other departments to process customer orders and generate license keys. This role is responsible for verifying and reviewing the accuracy of orders, completing and maintaining associated records, and preparing related reports. Little direction is required; the Business Operations Specialist II is able to handle some complex tasks and accomplish straightforward work without assistance.
Key Duties and Responsibilities
• Processes software license orders and stock orders via multiple CRM systems and verifies license agreements in accordance with ANSYS, Inc. policies and procedures
• Generates timely, accurate license keys and software license entitlement information, and delivers them to sales channels and customers
• Assists customers attempting to enroll for the ANSYS, Inc. Customer Portal
• Utilizes CRM checks to strive for succinct data integrity
• Acts as liaison to the ANSYS, Inc. sales channel by providing quality customer service and support and resolving customer issues
• Provides assistance to sales personnel for proper order submission and documentation
• Interfaces with legal, accounting, and sales departments to facilitate procedural and policy adherence
• Proactively seeks ways to improve workflow, including identification of better ways to provide value-added customer service
• Participates in department projects such as developing rollout plans for product delivery
Minimum Education/Certification Requirements and Experience
• Associate's degree or a minimum of 4 years of experience in a billing, order processing, or customer service environment
• Excellent customer service skills and orientation
• Demonstrated organizational and analytical skills
• Experience working in a database environment, including report generation responsibilities
• Demonstrated ability and experience in a detail-oriented position
• Ability and willingness to perform in a fast-paced, rapidly changing environment
• Excellent communication and interpersonal skills
• Demonstrated ability to multi-task in a deadline-driven environment
• Microsoft Office experience required
Preferred Qualifications and Skills
• Prior CRM experience preferred
• Bachelor's degree in Accounting or Business is preferred
• Previous experience with servicing global customers is highly preferred
• Experience working with Salesforce, Snowflake, and Power BI
• Experience improving processes
At Ansys, we know that changing the world takes vision, skill, and each other. We fuel new ideas, build relationships, and help each other realize our greatest potential in the knowledge that every day is an opportunity to observe, teach, inspire, and be inspired. Together as One Ansys, we are powering innovation that drives human advancement.
Our Commitments:
• Amaze with innovative products and solutions
• Make our customers incredibly successful
• Act with integrity
• Ensure employees thrive and shareholders prosper
Our Values:
• Adaptability: Be open, welcome what's next
• Courage: Be courageous, move forward passionately
• Generosity: Be generous, share, listen, serve
• Authenticity: Be you, make us stronger
Our Actions:
• We commit to audacious goals
• We work seamlessly as a team
• We demonstrate mastery
• We deliver outstanding results
OUR ONE ANSYS CULTURE HAS INCLUSION AT ITS CORE
We believe diverse thinking leads to better outcomes. We are committed to creating and nurturing a workplace that fuels this by welcoming people, no matter their background, identity, or experience, to a workplace where they are valued and where diversity, inclusion, equity, and belonging thrive.
At Ansys, you will find yourself among the sharpest minds and most visionary leaders across the globe. Collectively we strive to change the world with innovative technology and transformational solutions. With a prestigious reputation for working with well-known, world-class companies, standards at Ansys are high, met by those willing to rise to the occasion and meet those challenges head-on. Our team is passionate about pushing the limits of world-class simulation technology, empowering our customers to turn their design concepts into successful, innovative products faster and at a lower cost. At Ansys, it's about the learning, the discovery, and the collaboration. It's about what is next as much as the mission accomplished. And it's about the melding of disciplined intellect with strategic direction and results that have, can, and do impact real people in real ways. All this is forged within a working environment built on respect, autonomy, and ethics.
CREATING A PLACE WE'RE PROUD TO BE
Ansys is an S&P 500 company and a member of the NASDAQ-100. We are proud to have been recognized for the following recent awards, although our list goes on: America's Most Loved Workplaces, Gold Stevie Award Winner, America's Most Responsible Companies, Fast Company World Changing Ideas, Great Place to Work Certified (China, Greece, France, India, Japan, Korea, Spain, Sweden, Taiwan, U.K.). For more information, please visit us at www.ansys.com
Ansys is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, veteran status, and other protected characteristics. Ansys does not accept unsolicited referrals for vacancies, and any unsolicited referral will become the property of Ansys. Upon hire, no fee will be owed to the agency, person, or entity.
Posted 6 days ago
6.0 - 11.0 years
15 - 30 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
ETL testers with automation testing experience in DBT and Snowflake, along with experience in Talend.
Posted 6 days ago
8.0 - 13.0 years
15 - 25 Lacs
Pune
Hybrid
About This Role:
We are looking for a talented and experienced Data Engineer / Tech Lead with hands-on expertise in any ETL tool, full knowledge of CI/CD practices, experience technically leading a client-facing team of more than 5, and the ability to create Data Engineering and Data Quality frameworks. As a tech lead, you must build ETL jobs, Data Quality jobs, and Big Data jobs, perform performance optimization based on an understanding of the requirements, create reusable assets, and be able to carry out production deployments; experience with DWH appliances such as Snowflake, Redshift, or Synapse is preferred.
Responsibilities
• Work with a team of engineers in designing, developing, and maintaining scalable and efficient data solutions using any data integration tool (e.g., Talend or Informatica) and any Big Data technologies.
• Design, develop, and maintain end-to-end data pipelines using any ETL data integration tool (e.g., Talend or Informatica) to ingest, process, and transform large volumes of data from heterogeneous sources.
• Have good experience in designing cloud pipelines using Azure Data Factory or AWS Glue/Lambda.
• Implement data integration end-to-end with any ETL technologies.
• Implement database solutions for storing, processing, and querying large volumes of structured, unstructured, and semi-structured data.
• Implement job migrations of ETL jobs from older versions to new versions.
• Implement and write advanced SQL scripts in SQL databases at a medium to expert level.
• Work with the client's technical team and provide guidance during technical challenges.
• Integrate and optimize data flows between various databases, data warehouses, and Big Data platforms.
• Collaborate with cross-functional teams to gather data requirements and translate them into scalable and efficient data solutions.
• Optimize ETL and data load performance, scalability, and cost-effectiveness through optimization techniques.
• Interact with the client on a daily basis, provide technical progress updates, and respond to technical questions.
• Implement best practices for data integration.
• Implement complex ETL data pipelines or similar frameworks to process and analyze massive datasets.
• Ensure data quality, reliability, and security across all stages of the data pipeline.
• Troubleshoot and debug data-related issues in production systems and provide timely resolution.
• Stay current with emerging technologies and industry trends in data engineering and CI/CD, and incorporate them into our data architecture and processes.
• Optimize data processing workflows and infrastructure for performance, scalability, and cost-effectiveness.
• Provide technical guidance and foster a culture of continuous learning and improvement.
• Implement and automate CI/CD pipelines for data engineering workflows, including testing, deployment, and monitoring.
• Perform migration to production deployment from lower environments; test and validate.
Must Have Skills
• Must be certified in any ETL tool, database, and cloud (Snowflake certification is preferred).
• Must have implemented at least 3 end-to-end projects in Data Engineering.
• Must have worked on performance management, optimization, and tuning for data loads, data processes, and data transformations in Big Data.
• Must be flexible to write code using Java/Scala/Python etc. as required.
• Must have implemented CI/CD pipelines using tools like Jenkins, GitLab CI, or AWS CodePipeline.
• Must have technically managed a team of at least 5 members and guided the team technically.
• Must have the technical ownership capability for Data Engineering delivery.
• Strong client-facing communication capabilities.
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• 5 years of experience in software engineering or a related role, with a strong focus on any ETL tool, databases, and integration.
• Proficiency in any ETL tool such as Talend or Informatica for data integration, for building and orchestrating data pipelines.
• Hands-on experience with relational databases such as MySQL, PostgreSQL, or Oracle, and NoSQL databases such as MongoDB, Cassandra, or Redis.
• Solid understanding of database design principles, data modeling, and SQL query optimization.
• Experience with data warehousing, Data Lake, and Delta Lake concepts and technologies, data modeling, and relational databases.
Posted 6 days ago
5.0 - 10.0 years
7 - 12 Lacs
Mumbai, Hyderabad, Bengaluru
Hybrid
Your day at NTT DATA
The Software Applications Development Engineer is a seasoned subject matter expert, responsible for developing new applications and improving existing applications based on the needs of the internal organization and/or external clients.
What you'll be doing
Years of experience: 5
Data Engineer
• Work closely with the Lead Data Engineer to understand business requirements, analyse them, and translate them into technical specifications and solution designs.
• Work closely with the Data Modeller to ensure data models support the solution design.
• Develop, test, and fix ETL code using Snowflake, Fivetran, SQL, and stored procedures.
• Analyse the data and ETL for defects and service tickets raised against the solution in production.
• Develop documentation and artefacts to support projects.
Posted 1 week ago
1.0 - 3.0 years
3 - 5 Lacs
New Delhi, Chennai, Bengaluru
Hybrid
Your day at NTT DATA
We are seeking an experienced Data Engineer to join our team in delivering cutting-edge Generative AI (GenAI) solutions to clients. The successful candidate will be responsible for designing, developing, and deploying data pipelines and architectures that support the training, fine-tuning, and deployment of LLMs for various industries. This role requires strong technical expertise in data engineering, problem-solving skills, and the ability to work effectively with clients and internal teams.
What you'll be doing
Key Responsibilities:
• Design, develop, and manage data pipelines and architectures to support GenAI model training, fine-tuning, and deployment.
• Data Ingestion and Integration: Develop data ingestion frameworks to collect data from various sources, transform it, and integrate it into a unified data platform for GenAI model training and deployment.
• GenAI Model Integration: Collaborate with data scientists to integrate GenAI models into production-ready applications, ensuring seamless model deployment, monitoring, and maintenance.
• Cloud Infrastructure Management: Design, implement, and manage cloud-based data infrastructure (e.g., AWS, GCP, Azure) to support large-scale GenAI workloads, ensuring cost-effectiveness, security, and compliance.
• Write scalable, readable, and maintainable code using object-oriented programming concepts in languages like Python, and utilize libraries like Hugging Face Transformers, PyTorch, or TensorFlow.
• Performance Optimization: Optimize data pipelines, GenAI model performance, and infrastructure for scalability, efficiency, and cost-effectiveness.
• Data Security and Compliance: Ensure data security, privacy, and compliance with regulatory requirements (e.g., GDPR, HIPAA) across data pipelines and GenAI applications.
• Client Collaboration: Collaborate with clients to understand their GenAI needs, design solutions, and deliver high-quality data engineering services.
• Innovation and R&D: Stay up to date with the latest GenAI trends, technologies, and innovations, applying research and development skills to improve data engineering services.
• Knowledge Sharing: Share knowledge, best practices, and expertise with team members, contributing to the growth and development of the team.
• Bachelor's degree in Computer Science, Engineering, or related fields (Master's recommended).
• Experience with vector databases (e.g., Pinecone, Weaviate, Faiss, Annoy) for efficient similarity search and storage of dense vectors in GenAI applications.
• 5+ years of experience in data engineering, with a strong emphasis on cloud environments (AWS, GCP, Azure, or cloud-native platforms).
• Proficiency in programming languages like SQL, Python, and PySpark.
• Strong data architecture, data modeling, and data governance skills.
• Experience with Big Data platforms (Hadoop, Databricks, Hive, Kafka, Apache Iceberg), data warehouses (Teradata, Snowflake, BigQuery), and lakehouses (Delta Lake, Apache Hudi).
• Knowledge of DevOps practices, including Git workflows and CI/CD pipelines (Azure DevOps, Jenkins, GitHub Actions).
• Experience with GenAI frameworks and tools (e.g., TensorFlow, PyTorch, Keras).
Nice to have:
• Experience with containerization and orchestration tools like Docker and Kubernetes.
• Ability to integrate vector databases and implement similarity search techniques, with a focus on GraphRAG, is a plus.
• Familiarity with API gateway and service mesh architectures.
• Experience with low-latency/streaming, batch, and micro-batch processing.
• Familiarity with Linux-based operating systems and REST APIs.
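For illustration only, a minimal similarity-search sketch of the kind the vector-database requirement refers to, using Faiss; the dimensions, counts, and random vectors are placeholders standing in for real embeddings produced by an embedding model:

```python
import numpy as np
import faiss  # pip install faiss-cpu

dim = 384          # placeholder embedding size, typical of small sentence encoders
n_documents = 1000

# Random vectors stand in for real document embeddings from an embedding model.
rng = np.random.default_rng(seed=0)
doc_vectors = rng.random((n_documents, dim), dtype=np.float32)

index = faiss.IndexFlatL2(dim)   # exact L2 nearest-neighbour search
index.add(doc_vectors)

query = rng.random((1, dim), dtype=np.float32)
distances, ids = index.search(query, 5)   # top-5 most similar documents
print("nearest document ids:", ids[0], "distances:", distances[0])
```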
Posted 1 week ago