5.0 years
0 Lacs
chennai, tamil nadu, india
Remote
Req ID: 336026. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Data Engineer - Azure to join our team in Chennai, Tamil Nadu (IN-TN), India (IN).
Job Duties:
Key Responsibilities:
Design and implement tailored data solutions to meet customer needs and use cases, spanning from streaming to data lakes, analytics, and beyond within a dynamically evolving technical stack.
Provide thought leadership by recommending the most appropriate technologies and solutions for a given use case, covering the entire spectrum from the application layer to infrastructure.
Demonstrate proficient coding skills in languages such as Python, Java, and Scala to move solutions into production efficiently, prioritizing performance, security, scalability, and robust data integrations.
Collaborate seamlessly across diverse technical stacks, including Azure and Databricks.
Develop and deliver detailed presentations to effectively communicate complex technical concepts.
Generate comprehensive solution documentation, including sequence diagrams, class hierarchies, logical system views, etc.
Adhere to Agile practices throughout the solution development process.
Design, build, and deploy databases and data stores to support organizational requirements.
Basic Qualifications:
5+ years of experience supporting Software Engineering, Data Engineering, or Data Analytics projects.
Ability to travel at least 25%.
Preferred Skills:
Production experience in core data platforms such as Azure and Databricks.
Hands-on knowledge of cloud and distributed data storage, including Azure data services, ADLS, ADF, Databricks, data quality, and ETL/ELT.
Strong understanding of data integration technologies, including Spark, Kafka, eventing/streaming, StreamSets, NiFi, AWS Database Migration Service, Azure Data Factory, and Google Dataproc.
Professional written and verbal communication skills to effectively convey complex technical concepts.
Undergraduate or graduate degree preferred.
About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com.
Whenever possible, we hire locally to NTT DATA offices or client sites. This ensures we can provide timely and effective support tailored to each client's needs. While many positions offer remote or hybrid work options, these arrangements are subject to change based on client requirements. For employees near an NTT DATA office or client site, in-office attendance may be required for meetings or events, depending on business needs.
At NTT DATA, we are committed to staying flexible and meeting the evolving needs of both our clients and employees. NTT DATA recruiters will never ask for payment or banking information and will only use @nttdata.com and @talent.nttdataservices.com email addresses. If you are asked to provide payment or disclose banking information, please submit a contact-us form at https://us.nttdata.com/en/contact-us. NTT DATA endeavors to make https://us.nttdata.com accessible to all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
Posted 2 weeks ago
5.0 years
0 Lacs
chennai, tamil nadu, india
Remote
Req ID: 336025. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Data Engineer - Azure to join our team in Chennai, Tamil Nadu (IN-TN), India (IN).
Job Duties:
Key Responsibilities:
Design and implement tailored data solutions to meet customer needs and use cases, spanning from streaming to data lakes, analytics, and beyond within a dynamically evolving technical stack.
Provide thought leadership by recommending the most appropriate technologies and solutions for a given use case, covering the entire spectrum from the application layer to infrastructure.
Demonstrate proficient coding skills in languages such as Python, Java, and Scala to move solutions into production efficiently, prioritizing performance, security, scalability, and robust data integrations.
Collaborate seamlessly across diverse technical stacks, including Azure and Databricks.
Develop and deliver detailed presentations to effectively communicate complex technical concepts.
Generate comprehensive solution documentation, including sequence diagrams, class hierarchies, logical system views, etc.
Adhere to Agile practices throughout the solution development process.
Design, build, and deploy databases and data stores to support organizational requirements.
Basic Qualifications:
5+ years of experience supporting Software Engineering, Data Engineering, or Data Analytics projects.
Ability to travel at least 25%.
Preferred Skills:
Production experience in core data platforms such as Azure and Databricks.
Hands-on knowledge of cloud and distributed data storage, including Azure data services, ADLS, ADF, Databricks, data quality, and ETL/ELT.
Strong understanding of data integration technologies, including Spark, Kafka, eventing/streaming, StreamSets, NiFi, AWS Database Migration Service, Azure Data Factory, and Google Dataproc.
Professional written and verbal communication skills to effectively convey complex technical concepts.
Undergraduate or graduate degree preferred.
About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com.
Whenever possible, we hire locally to NTT DATA offices or client sites. This ensures we can provide timely and effective support tailored to each client's needs. While many positions offer remote or hybrid work options, these arrangements are subject to change based on client requirements. For employees near an NTT DATA office or client site, in-office attendance may be required for meetings or events, depending on business needs.
At NTT DATA, we are committed to staying flexible and meeting the evolving needs of both our clients and employees. NTT DATA recruiters will never ask for payment or banking information and will only use @nttdata.com and @talent.nttdataservices.com email addresses. If you are asked to provide payment or disclose banking information, please submit a contact-us form at https://us.nttdata.com/en/contact-us. NTT DATA endeavors to make https://us.nttdata.com accessible to all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
Posted 2 weeks ago
7.0 - 12.0 years
12 - 15 Lacs
dehradun, hyderabad, ahmedabad
Hybrid
Azure Integration Services: JSON, XML, SOAP, REST APIs.
Must be Azure certified.
Logic Apps, API Management, Event Hub, Service Bus, Data Factory.
Secure integrations and authentication protocols (OAuth2, OIDC, SAML).
Agile methodologies.
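The authentication-protocol requirement above can be made concrete with a small, hedged sketch: building the form body for an OAuth2 client-credentials token request against the Microsoft Entra ID v2.0 token endpoint. The tenant, client ID, secret, and scope values are hypothetical placeholders, not taken from the posting.

```python
import urllib.parse


def build_token_request(tenant: str, client_id: str, client_secret: str, scope: str):
    """Build the URL and urlencoded form body for an OAuth2 client-credentials
    token request (Microsoft identity platform v2.0 endpoint shape)."""
    url = f"https://login.microsoftonline.com/{tenant}/oauth2/v2.0/token"
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",  # machine-to-machine flow, no user
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": scope,                      # e.g. "<resource>/.default"
    })
    return url, body
```

In practice the body would be POSTed with a `Content-Type: application/x-www-form-urlencoded` header and the returned `access_token` attached as a Bearer token; that part is omitted here to keep the sketch offline.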
Posted 2 weeks ago
4.0 - 7.0 years
9 - 13 Lacs
bengaluru
Work from Office
Reference 25000GFH
Responsibilities
The position of back-office cash management agent consists of recording in our tools the information contained in the contracts or amendments signed with clients. These actions enable the implementation or updating of cash management solutions contracts for major Societe Generale clients in France and internationally.
MAIN ACCOUNTABILITIES:
Analyze and implement contracts and amendments for national and international cash management products for Corporate and Bank customers, in dedicated IT tools.
Record all process steps in the dedicated tool until complete validation of each contract or amendment.
Follow up on implemented contracts and amendments in relation with SG, Implementation and Commercial agencies.
Participate in drafting and/or updating the operating procedures as well as the knowledge base and FAQs.
Track and support the shared mailbox and the team's phone line.
Keep your knowledge up to date.
Provide quality service (SLA) to avoid customer complaints and operational risk.
In the event of a problem, notify the French teams without delay in order to respect commitments.
COMPETENCIES:
Teamwork. Rigor. Customer sense. Analytical capacity. Knowledge sharing. French language is an added advantage. Knowledge of means of payment and Societe Generale IT tools would be appreciated but is not a prerequisite.
Required Profile
PRIOR WORK EXPERIENCE: Relevant experience.
EDUCATION: Graduate / Post Graduate from a reputed institute / university.
Why join us
We are committed to creating a diverse environment and are proud to be an equal opportunity employer. All qualified applicants receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status.
Business insight
At Societe Generale, we are convinced that people are drivers of change, and that the world of tomorrow will be shaped by all their initiatives, from the smallest to the most ambitious. Whether you're joining us for a period of months, years or your entire career, together we can have a positive impact on the future. Creating, daring, innovating, and taking action are part of our DNA. If you too want to be directly involved, grow in a stimulating and caring environment, feel useful on a daily basis and develop or strengthen your expertise, you will feel right at home with us!
Still hesitating? You should know that our employees can dedicate several days per year to solidarity actions during their working hours, including sponsoring people struggling with their orientation or professional integration, participating in the financial education of young apprentices, and sharing their skills with charities. There are many ways to get involved.
We are committed to supporting the acceleration of our Group's ESG strategy by implementing ESG principles in all our activities and policies. These are translated into our business activity (ESG assessment, reporting, project management or IT activities), our work environment and our responsible practices for environmental protection, diversity and inclusion.
We are an equal opportunities employer and we are proud to make diversity a strength for our company. Societe Generale is committed to recognizing and promoting all talents, regardless of their beliefs, age, disability, parental status, ethnic origin, nationality, gender identity, sexual orientation, membership of a political, religious, trade union or minority organisation, or any other characteristic that could be subject to discrimination.
Posted 2 weeks ago
4.0 - 8.0 years
10 - 15 Lacs
bengaluru
Work from Office
As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys. Your primary responsibilities include:
Comprehensive Feature Development and Issue Resolution: Working on end-to-end feature development and solving challenges faced in the implementation.
Stakeholder Collaboration and Issue Resolution: Collaborating with key stakeholders, internal and external, to understand problems and issues with the product and its features, and resolving them within the defined SLAs.
Continuous Learning and Technology Integration: Staying eager to learn new technologies and applying them in feature development.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
Proficient in .NET Core with React or Angular.
Experience in Agile teams applying the best architectural, design, and unit-testing patterns and practices, with an eye for code quality and standards.
Azure Functions, Azure Service Bus, and Azure Storage Account (mandatory); Azure Durable Functions; Azure Data Factory; Azure SQL or Cosmos DB (database) required.
Ability to write calculation rules and configurable consolidation rules.
Preferred technical and professional experience:
Excellent written and verbal interpersonal skills for coordinating across teams.
At least two end-to-end implementation experiences.
Ability to write and update the rules of historical overrides.
Posted 2 weeks ago
4.0 - 8.0 years
10 - 15 Lacs
kochi
Work from Office
As a Brand Technical Specialist, you'll work closely with clients to develop relationships, understand their needs, earn their trust and show them how IBM's industry-leading solutions will solve their problems while delivering value to their business. We're committed to success. In this role, your achievements will drive your career, team, and clients to thrive. A typical day may involve:
Strategic Mainframe Solutions: Crafting client strategies for mainframe infrastructure and applications.
Comprehensive zStack Solutions: Defining and detailing IBM zStack solutions for client enhancement.
Effective Client Education: Delivering simplified proofs of concept and educating clients.
Building Trust for Cloud Deals: Building trust for closing complex Cloud technology deals.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
Proficient in .NET Core with React or Angular.
Experience in Agile teams applying the best architectural, design, and unit-testing patterns and practices, with an eye for code quality and standards.
Azure Functions, Azure Service Bus, and Azure Storage Account (mandatory); Azure Durable Functions; Azure Data Factory; Azure SQL or Cosmos DB (database) required.
Ability to write calculation rules and configurable consolidation rules.
Preferred technical and professional experience:
Excellent written and verbal interpersonal skills for coordinating across teams.
At least two end-to-end implementation experiences.
Ability to write and update the rules of historical overrides.
Posted 2 weeks ago
4.0 - 8.0 years
10 - 15 Lacs
bengaluru
Work from Office
As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys. Your primary responsibilities include:
Comprehensive Feature Development and Issue Resolution: Working on end-to-end feature development and solving challenges faced in the implementation.
Stakeholder Collaboration and Issue Resolution: Collaborating with key stakeholders, internal and external, to understand problems and issues with the product and its features, and resolving them within the defined SLAs.
Continuous Learning and Technology Integration: Staying eager to learn new technologies and applying them in feature development.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
Proficient in .NET Core with React or Angular.
Experience in Agile teams applying the best architectural, design, and unit-testing patterns and practices, with an eye for code quality and standards.
Azure Functions, Azure Service Bus, and Azure Storage Account (mandatory); Azure Durable Functions; Azure Data Factory; Azure SQL or Cosmos DB (database) required.
Ability to write calculation rules and configurable consolidation rules.
Preferred technical and professional experience:
Excellent written and verbal interpersonal skills for coordinating across teams.
At least two end-to-end implementation experiences.
Ability to write and update the rules of historical overrides.
Posted 2 weeks ago
4.0 - 8.0 years
10 - 15 Lacs
coimbatore
Work from Office
As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys. Your primary responsibilities include:
Comprehensive Feature Development and Issue Resolution: Working on end-to-end feature development and solving challenges faced in the implementation.
Stakeholder Collaboration and Issue Resolution: Collaborating with key stakeholders, internal and external, to understand problems and issues with the product and its features, and resolving them within the defined SLAs.
Continuous Learning and Technology Integration: Staying eager to learn new technologies and applying them in feature development.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
Proficient in .NET Core with React or Angular.
Experience in Agile teams applying the best architectural, design, and unit-testing patterns and practices, with an eye for code quality and standards.
Azure Functions, Azure Service Bus, and Azure Storage Account (mandatory); Azure Durable Functions; Azure Data Factory; Azure SQL or Cosmos DB (database) required.
Ability to write calculation rules and configurable consolidation rules.
Preferred technical and professional experience:
Excellent written and verbal interpersonal skills for coordinating across teams.
At least two end-to-end implementation experiences.
Ability to write and update the rules of historical overrides.
Posted 2 weeks ago
5.0 - 10.0 years
22 - 27 Lacs
kochi
Work from Office
Create Solution Outline and Macro Design to describe end-to-end product implementation in data platforms, including system integration, data ingestion, data processing, the serving layer, design patterns, and platform architecture principles.
Contribute to pre-sales and sales support through RfP responses, solution architecture, planning and estimation.
Contribute to reusable component / asset / accelerator development to support capability development.
Participate in customer presentations as a Platform Architect / Subject Matter Expert on Big Data, Azure Cloud and related technologies.
Participate in customer PoCs to deliver the outcomes.
Participate in delivery reviews / product reviews and quality assurance, and act as design authority.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
Experience in designing data products providing descriptive, prescriptive, and predictive analytics to end users or other systems.
Experience in data engineering and architecting data platforms.
Experience in architecting and implementing data platforms on the Azure Cloud Platform. Azure experience is mandatory (ADLS Gen1 / Gen2, Data Factory, Databricks, Synapse Analytics, Azure SQL, Cosmos DB, Event Hub, Snowflake), along with Azure Purview, Microsoft Fabric, Kubernetes, Terraform, and Airflow.
Experience in the Big Data stack (Hadoop ecosystem: Hive, HBase, Kafka, Spark, Scala/PySpark, Python, etc.) with Cloudera or Hortonworks.
Preferred technical and professional experience:
Experience in architecting complex data platforms on the Azure Cloud Platform and on-premises.
Experience and exposure to implementing Data Fabric and Data Mesh concepts with solutions such as Microsoft Fabric, Starburst, Denodo, IBM Data Virtualization, Talend, or Tibco Data Fabric.
Exposure to data cataloging and governance solutions such as Collibra, Alation, Watson Knowledge Catalog, Databricks Unity Catalog, Apache Atlas, Snowflake data glossary, etc.
Posted 2 weeks ago
4.0 - 8.0 years
10 - 15 Lacs
bengaluru
Work from Office
As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys. Your primary responsibilities include:
Comprehensive Feature Development and Issue Resolution: Working on end-to-end feature development and solving challenges faced in the implementation.
Stakeholder Collaboration and Issue Resolution: Collaborating with key stakeholders, internal and external, to understand problems and issues with the product and its features, and resolving them within the defined SLAs.
Continuous Learning and Technology Integration: Staying eager to learn new technologies and applying them in feature development.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
Proficient in .NET Core with React or Angular.
Experience in Agile teams applying the best architectural, design, and unit-testing patterns and practices, with an eye for code quality and standards.
Azure Functions, Azure Service Bus, and Azure Storage Account (mandatory); Azure Durable Functions; Azure Data Factory; Azure SQL or Cosmos DB (database) required.
Ability to write calculation rules and configurable consolidation rules.
Preferred technical and professional experience:
Excellent written and verbal interpersonal skills for coordinating across teams.
At least two end-to-end implementation experiences.
Ability to write and update the rules of historical overrides.
Posted 2 weeks ago
3.0 - 6.0 years
6 - 10 Lacs
mumbai, pune, chennai
Work from Office
ETL Data Engineer with Python
Work Schedule: 5 days onsite
Required Skills:
Strong programming experience in Python.
Proficiency in Pandas, NumPy, and working with SQL queries.
Experience working with a Data Lake and processing large volumes of data.
Ability to parse and consolidate data from Excel, CSV, and plain-text formats.
Hands-on experience in writing unit tests, performing regression testing, and implementing error/exception handling.
Key Responsibilities:
Develop effective Python scripts for data extraction, transformation, and loading (ETL).
Perform data manipulation and analysis using Pandas, NumPy, and SQL.
Work with structured and unstructured data sources (Excel, CSV, text files) in a Data Lake environment.
Implement robust error and exception handling in Python scripts.
Design and maintain unit tests and regression tests to ensure data accuracy and code stability.
Collaborate with data engineers, analysts, and business teams to gather requirements and deliver data solutions.
Optimize and refactor existing code for improved performance and scalability.
What We Offer:
Competitive salary and benefits package.
Opportunities for professional growth and development.
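The responsibilities above describe a small, testable ETL pattern: parse heterogeneous sources, clean them, consolidate, and guard each step with explicit errors. A minimal, hedged sketch in pandas follows; the `id` key column and the file extensions handled are illustrative assumptions, not requirements from the posting.

```python
import pandas as pd


def load_source(path: str) -> pd.DataFrame:
    """Extract: parse a CSV or Excel file, raising a clear error for anything else."""
    if path.endswith(".csv"):
        return pd.read_csv(path)
    if path.endswith((".xls", ".xlsx")):
        return pd.read_excel(path)
    raise ValueError(f"Unsupported source format: {path}")


def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Transform: lowercase column names and drop rows missing the (assumed) 'id' key."""
    df = df.rename(columns=str.lower)
    return df.dropna(subset=["id"]).reset_index(drop=True)


def consolidate(frames: list) -> pd.DataFrame:
    """Load step input: union all cleaned sources, de-duplicated on 'id'."""
    combined = pd.concat(frames, ignore_index=True)
    return combined.drop_duplicates(subset=["id"]).reset_index(drop=True)
```

The same functions double as regression-test targets: unit tests can feed small in-memory DataFrames through `transform` and `consolidate` and assert on the result, with no files or Data Lake access needed.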
Posted 2 weeks ago
10.0 - 15.0 years
7 - 11 Lacs
bengaluru
Work from Office
About the Role
We are looking for a Data Warehouse Engineer with 10-15 years of experience and strong expertise across the Azure Data Platform to design, build, and maintain modern data warehouse and analytics solutions. This role requires hands-on experience with Azure Synapse Analytics, Data Factory, Data Lake, Azure Analysis Services, and Power BI. The ideal candidate will ensure seamless data ingestion, storage, transformation, analysis, and visualization, enabling the business to make data-driven decisions.
Key Responsibilities
Data Ingestion & Orchestration:
Design and build scalable ingestion pipelines using Azure Data Factory.
Integrate data from multiple sources (SQL Server, relational databases, Azure SQL DB, Cosmos DB, Table Storage).
Manage batch and real-time ingestion into Azure Data Lake Storage.
Data Storage & Modelling:
Develop and optimize data warehouse solutions in Azure Synapse Analytics.
Implement robust ETL/ELT processes to ensure data quality and consistency.
Create data models for analytical and reporting needs.
Data Analysis & Security:
Build semantic data models using Azure Analysis Services for enterprise reporting.
Collaborate with BI teams to deliver well-structured datasets for reporting in Power BI.
Implement Azure Active Directory for authentication, access control, and security best practices.
Visualization & Business Support:
Support business teams in building insightful Power BI dashboards and reports.
Translate business requirements into scalable and optimized BI solutions.
Provide data-driven insights in a clear, business-friendly manner.
Optimization & Governance:
Monitor system performance and optimize pipelines for efficiency and cost control.
Establish standards for data governance, data quality, and metadata management.
Qualifications & Skills
Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
Proven experience as a Data Warehouse Engineer / Data Engineer with strong expertise in Azure Synapse Analytics, Azure Data Factory, Azure Data Lake Storage, Azure Analysis Services, Azure SQL Database / SQL Server, and Power BI (reporting and dashboarding).
Strong proficiency in SQL and data modelling (star schema, snowflake schema, dimensional modelling).
Knowledge of Azure Active Directory for authentication and role-based access control.
Excellent problem-solving skills and ability to optimize large-scale data solutions.
Strong communication skills to collaborate effectively with both technical and business stakeholders.
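The dimensional-modelling requirement (star schema) can be illustrated with a minimal, hedged sketch: splitting a flat sales extract into a dimension table with a surrogate key and a fact table that references it. The table and column names (`product_name`, `category`, `amount`, and so on) are invented for illustration, not taken from the posting; in production this would run in Synapse SQL rather than pandas.

```python
import pandas as pd


def build_star_schema(flat: pd.DataFrame):
    """Split a flat extract into a product dimension and a sales fact table."""
    # Dimension: one row per distinct product, with a surrogate key.
    dim_product = (
        flat[["product_name", "category"]]
        .drop_duplicates()
        .reset_index(drop=True)
    )
    dim_product["product_key"] = dim_product.index + 1

    # Fact: measures plus the foreign key into the dimension,
    # with the natural-key columns dropped after the lookup.
    fact_sales = flat.merge(dim_product, on=["product_name", "category"])[
        ["product_key", "order_date", "amount"]
    ]
    return dim_product, fact_sales
```

The same split generalizes to multiple dimensions (date, customer, region); each gets its own surrogate key, and the fact table keeps only keys and measures, which is what keeps star-schema queries and Analysis Services models compact.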
Posted 3 weeks ago
4.0 - 8.0 years
10 - 15 Lacs
bengaluru
Work from Office
As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys. Your primary responsibilities include:
Comprehensive Feature Development and Issue Resolution: Working on end-to-end feature development and solving challenges faced in the implementation.
Stakeholder Collaboration and Issue Resolution: Collaborating with key stakeholders, internal and external, to understand problems and issues with the product and its features, and resolving them within the defined SLAs.
Continuous Learning and Technology Integration: Staying eager to learn new technologies and applying them in feature development.
Required education: Bachelor's Degree
Preferred education: Non-Degree Program
Required technical and professional expertise:
Proficient in .NET Core with React or Angular.
Experience in Agile teams applying the best architectural, design, and unit-testing patterns and practices, with an eye for code quality and standards.
Azure Functions, Azure Service Bus, and Azure Storage Account (mandatory); Azure Durable Functions; Azure Data Factory; Azure SQL or Cosmos DB (database) required.
Ability to write calculation rules and configurable consolidation rules.
Preferred technical and professional experience:
Excellent written and verbal interpersonal skills for coordinating across teams.
At least two end-to-end implementation experiences.
Ability to write and update the rules of historical overrides.
Posted 3 weeks ago
6.0 - 9.0 years
4 - 8 Lacs
pune, bengaluru, mumbai (all areas)
Work from Office
We are looking for candidates with 4+ years of experience. Mandatory skills: PySpark, ADF, and Databricks.
Posted 3 weeks ago
4.0 - 8.0 years
10 - 15 Lacs
hyderabad
Work from Office
A proven and credible practitioner, your deep solution experience will help you lead a team of go-to subject matter experts. Fostering a culture of candour, collaboration, and growth-mindedness, you'll ensure co-creation across IBM Sales and client teams that drives investment in, and adoption of, IBM's strategic platforms. Overseeing your team's fusion of innovative solutions with modern IT architectures, integrated solutions, and offerings, you'll ensure they're helping to solve some of their clients' most complex business challenges. We're committed to success. In this role, your achievements will drive your career, team, and clients to thrive. A typical day may involve:
- Strategic Team Leadership: Leading a team of technical sales experts to co-create innovative solutions with clients.
- Partnership and Prototype Excellence: Collaborating with IBM and partners to deliver compelling prototypes.
- Optimizing Resource Utilization: Promoting maximum use of IBM's Technology Sales resources.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Proficient in .NET Core with React or Angular
- Experience in Agile teams applying the best architectural, design, and unit-testing patterns and practices, with an eye for code quality and standards
- Azure Functions, Azure Service Bus, Azure Storage Account (mandatory); Azure Durable Functions
- Azure Data Factory and Azure SQL or Cosmos DB (database) required
- Ability to write calculation rules and configurable consolidation rules
Preferred technical and professional experience:
- Excellent written and verbal interpersonal skills for coordinating across teams
- At least two end-to-end implementation experiences
- Ability to write and update the rules of historical overrides
Posted 3 weeks ago
5.0 - 10.0 years
22 - 27 Lacs
bengaluru
Work from Office
Create Solution Outlines and Macro Designs that describe end-to-end product implementation in data platforms, including system integration, data ingestion, data processing, serving layer, design patterns, and platform architecture principles. Contribute to pre-sales and sales support through RFP responses, solution architecture, planning, and estimation. Contribute to reusable components / assets / accelerators to support capability development. Participate in customer presentations as a Platform Architect / Subject Matter Expert on Big Data, Azure Cloud, and related technologies. Participate in customer PoCs to deliver the outcomes. Participate in delivery and product reviews and quality assurance, and act as design authority.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Experience in designing data products providing descriptive, prescriptive, and predictive analytics to end users or other systems
- Experience in data engineering and in architecting and implementing data platforms
- Azure Cloud Platform experience is mandatory: ADLS Gen1/Gen2, Data Factory, Databricks, Synapse Analytics, Azure SQL, Cosmos DB, Event Hub, Snowflake, Azure Purview, Microsoft Fabric, Kubernetes, Terraform, Airflow
- Experience in the Big Data stack (Hadoop ecosystem: Hive, HBase, Kafka, Spark, Scala, PySpark, Python, etc.) with Cloudera or Hortonworks
Preferred technical and professional experience:
- Experience architecting complex data platforms on Azure Cloud Platform and on-prem
- Exposure to implementation of Data Fabric and Data Mesh concepts and solutions such as Microsoft Fabric, Starburst, Denodo, IBM Data Virtualisation, Talend, or Tibco Data Fabric
- Exposure to data cataloging and governance solutions such as Collibra, Alation, Watson Knowledge Catalog, Databricks Unity Catalog, Apache Atlas, Snowflake data glossary, etc.
Posted 3 weeks ago
4.0 - 8.0 years
10 - 15 Lacs
bengaluru
Work from Office
As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys. Your primary responsibilities include:
- Comprehensive Feature Development and Issue Resolution: Working on end-to-end feature development and solving challenges faced in the implementation.
- Stakeholder Collaboration and Issue Resolution: Collaborating with key stakeholders, internal and external, to understand problems and issues with the product and features, and resolving them within the defined SLAs.
- Continuous Learning and Technology Integration: Being eager to learn new technologies and applying them in feature development.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Proficient in .NET Core with React or Angular
- Experience in Agile teams applying the best architectural, design, and unit-testing patterns and practices, with an eye for code quality and standards
- Azure Functions, Azure Service Bus, Azure Storage Account (mandatory); Azure Durable Functions
- Azure Data Factory and Azure SQL or Cosmos DB (database) required
- Ability to write calculation rules and configurable consolidation rules
Preferred technical and professional experience:
- Excellent written and verbal interpersonal skills for coordinating across teams
- At least two end-to-end implementation experiences
- Ability to write and update the rules of historical overrides
Posted 3 weeks ago
4.0 - 8.0 years
10 - 15 Lacs
bengaluru
Work from Office
As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys. Your primary responsibilities include:
- Comprehensive Feature Development and Issue Resolution: Working on end-to-end feature development and solving challenges faced in the implementation.
- Stakeholder Collaboration and Issue Resolution: Collaborating with key stakeholders, internal and external, to understand problems and issues with the product and features, and resolving them within the defined SLAs.
- Continuous Learning and Technology Integration: Being eager to learn new technologies and applying them in feature development.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Proficient in .NET Core with React or Angular
- Experience in Agile teams applying the best architectural, design, and unit-testing patterns and practices, with an eye for code quality and standards
- Azure Functions, Azure Service Bus, Azure Storage Account (mandatory); Azure Durable Functions
- Azure Data Factory and Azure SQL or Cosmos DB (database) required
- Ability to write calculation rules and configurable consolidation rules
Preferred technical and professional experience:
- Excellent written and verbal interpersonal skills for coordinating across teams
- At least two end-to-end implementation experiences
- Ability to write and update the rules of historical overrides
Posted 3 weeks ago
10.0 years
0 Lacs
india
Remote
Job Title: Azure Data Architect
Location: India only (remote)
Seniority: Senior/Architect
Duration: Long term
Soft Skills: Strong communication skills; fluent English preferred
Interviews: HR interview and 1-2 technical interviews
Overlap/Timings: 7:00 AM to 4:00 PM IST, subject to project demand
Office: 100% remote work
Dual Employment: The client does not allow dual employment; any existing engagement must be terminated if applicable.
Job Description: As an Azure Data Architect, your responsibilities are to:
- Design and implement comprehensive data solutions with Azure Databricks, focusing on scalability, performance, and cost savings.
- Lead the development of data mesh, data lake, and data warehouse architectures customized for the manufacturing sector.
- Work closely with cross-functional teams to capture requirements and convert them into detailed technical designs and practical solutions.
- Offer hands-on support in data ingestion, transformation, and visualization using Azure tools.
- Tune Databricks clusters and jobs for optimal performance and cost efficiency, following Spark and PySpark best practices.
- Ensure adherence to data governance, security protocols, and disaster recovery standards within Azure.
- Create and update technical design documents that align with business goals.
- Guide development teams on CI/CD processes, data engineering, and analytics best practices.
Profile Requirements: For this position of Azure Data Architect, we are looking for someone with:
1. 10+ years of experience as a Data Architect, specializing in cloud-based data platforms.
2. 4+ years of hands-on experience with Azure technologies such as Data Factory, Databricks, and Data Lake Storage.
3. Expertise in the manufacturing sector; skilled in using Apache Spark, PySpark, and SQL for large-scale data processing tasks.
4. A thorough understanding of Azure Databricks architecture, including control-plane and compute-plane functionality.
5. Experience designing secure networking configurations and governance strategies for Databricks environments.
6. Familiarity with data warehousing approaches such as Kimball, Inmon, and Data Vault.
7. Practical experience with CI/CD pipelines for data solutions.
8. Knowledge of SAP ERP is a significant plus.
9. A bachelor's degree in Computer Science, Information Technology, or a related discipline.
10. Strong analytical and problem-solving abilities complemented by excellent communication skills to connect technical and non-technical stakeholders.
11. A collaborative approach focused on ongoing improvement.
Apply now! Share your profile only if you match the JD at hiring@khey-digit.com. #HiringAlert #UrgentHiring #AzureDataArchitect #SeniorArchitect #Databricks #DataFactory #DataLakeStorage #DataWarehousing #PySpark #SQL #KheyDigit
Posted 3 weeks ago
0 years
0 Lacs
india
Remote
Join phData, a dynamic and innovative leader in the modern data stack. We partner with major cloud data platforms like Snowflake, AWS, Azure, GCP, Fivetran, Pinecone, Glean and dbt to deliver cutting-edge services and solutions. We're committed to helping global enterprises overcome their toughest data challenges. phData is a remote-first global company with employees based in the United States, Latin America and India. We celebrate the culture of each of our team members and foster a community of technological curiosity, ownership and trust. Even though we're growing extremely fast, we maintain a casual, exciting work environment. We hire top performers and allow you the autonomy to deliver results.
- 6x Snowflake Partner of the Year (2020, 2021, 2022, 2023, 2024, 2025)
- Fivetran, dbt, Alation, and Matillion Partner of the Year
- #1 Partner in Snowflake Advanced Certifications
- 600+ expert cloud certifications (Sigma, AWS, Azure, Dataiku, etc.)
- Recognized as an award-winning workplace in the US, India, and LATAM
Responsibilities:
- Propose, design, and provision cloud-native data solutions on AWS/Azure, leveraging a deep understanding of Snowflake, IAM, S3, EC2, Kinesis, SageMaker, Airflow, Kafka, Azure Data Factory, ADLS, Fivetran, Matillion, dbt, and/or other services and tools in designing and enhancing these solutions.
- Lead a technical team operating and managing modern data platforms - from streaming, to data lakes, to analytics, and beyond - across a progressively evolving technical stack.
- Provide clear ownership of multiple simultaneous customer accounts across a variety of technical stacks as the technical leader.
- Skillfully navigate complex customer environments and build Epics, Stories, and Tasks in order to mature and improve our customers' data platforms. Delegate to and coach Engineers and ensure the successful and timely delivery of these Epics.
- Provide thought leadership by recommending the right technologies and approaches to maturing and solving problems to help ensure performance, security, scalability, and user satisfaction.
- Lead discussions and contribute to design recommendations for data model design for Data Warehouses and Data Lakes.
- Continually hunt for ways to automate, optimize, and expand our customers' data platform and our service offering.
Required Experience:
- Advanced data platform expertise in enterprise data platforms such as Snowflake, AWS, Azure, or Databricks.
- Extensive experience providing architectural and operational guidance across a large user base for a cloud-native data warehouse (Snowflake and/or Redshift).
- Production experience with cloud and distributed data storage technologies such as S3, ADLS, Cassandra, or other NoSQL storage systems.
- Production experience with data integration technologies such as Spark, Kafka, eventing/streaming, Matillion, Fivetran, HVR, AWS Data Migration Services, or Azure Data Factory.
- Production experience with workflow management and orchestration, such as Airflow, AWS Managed Airflow, and Luigi.
- Strong working knowledge of SQL and the ability to write, debug, and optimize SQL queries.
- Deep expertise with cloud-native data technologies in AWS or Azure.
- Extensive experience with infrastructure as code using Terraform or CloudFormation.
- Experience building automated solutions for repetitive administrative tasks using Python or a similar language.
- Well-versed in continuous integration and deployment frameworks, with hands-on experience using CI/CD tools like Bitbucket, GitHub, Flyway, and Liquibase.
- A professional track record of creating, challenging, and improving processes and procedures.
- Unmatched troubleshooting and performance-tuning skills.
- Proven experience as a technical team lead and mentor of engineers.
- Passion for learning new technology stacks and up-skilling/training team members.
- Create and deliver detailed technical presentations for an executive audience.
- Excellent client-facing written and verbal communication skills and experience.
Preferred Experience:
- Snowflake SnowPro Core certification
- Any Snowflake Advanced Certification
- Bachelor's degree in Computer Science or a related field
Ideal opportunity if you enjoy…
- Seeing the big picture. You recognize that data challenges are multifaceted, and you understand the technical, business, operational, and human issues at play.
- Fast-paced operations work. You get energized by working across multiple projects and customers at once as requirements and situations evolve.
- Solving complex problems. You're eager to take on big issues that have stumped some of the world's biggest companies.
- Venturing boldly into uncharted territory. When facing a problem you haven't seen before, you enjoy the challenge of navigating through rocky terrain to build an ideal solution.
- Working across technology stacks and organizational silos. You've seen what happens when people get stuck in one platform, application, or way of thinking. You can integrate disparate functions and technologies to make your solutions work.
- Thinking on your feet. You don't get rattled by the unexpected. You quickly assess the situation and formulate a plan of action.
- Following up and following through. Not only can you envision the solution, you can break it into actionable steps and keep pushing toward the final outcome, no matter what gets in the way.
- Managing the human side. You've got the presence and poise to work confidently and respectfully with your teammates, your customers, and more, even when it's time for difficult conversations.
- Collaborating with others and being yourself. You value an environment where it's safe to ask questions, take calculated risks, and be authentic. You support your teammates and expect others to do the same.
phData celebrates diversity and is committed to creating an inclusive environment for all employees.
Our approach helps us to build a winning team that represents a variety of backgrounds, perspectives, and abilities. So, regardless of how your diversity expresses itself, you can find a home here at phData. We are proud to be an equal opportunity employer. We prohibit discrimination and harassment of any kind based on race, color, religion, national origin, sex (including pregnancy), sexual orientation, gender identity, gender expression, age, veteran status, genetic information, disability, or other applicable legally protected characteristics. If you would like to request an accommodation due to a disability, please contact us at People Operations.
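One of the required skills in the role above is building automated solutions for repetitive administrative tasks in Python. A minimal, hedged sketch of that pattern: generating per-schema GRANT statements instead of writing them by hand. The role and schema names are hypothetical, and the SQL is generic warehouse-flavoured DDL rather than tied to any one platform.

```python
# Hedged sketch of repetitive-admin automation: generating GRANT statements
# for every (role, schema) pair instead of hand-writing each one.
# Role and schema names are hypothetical.
def grant_statements(roles_to_schemas):
    """Yield GRANT USAGE/SELECT statements for each (role, schema) pair."""
    for role, schemas in roles_to_schemas.items():
        for schema in schemas:
            yield f"GRANT USAGE ON SCHEMA {schema} TO ROLE {role};"
            yield f"GRANT SELECT ON ALL TABLES IN SCHEMA {schema} TO ROLE {role};"

stmts = list(grant_statements({
    "ANALYST": ["MARKETING", "FINANCE"],
    "ENGINEER": ["RAW"],
}))
print(len(stmts))  # 6 statements: two per (role, schema) pair
```

In practice the generated statements would be reviewed and executed through the warehouse's client or a migration tool rather than printed.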
Posted 3 weeks ago
6.0 - 8.0 years
18 - 20 Lacs
bengaluru
Work from Office
The Development Lead will oversee the design, development, and delivery of advanced data solutions using Azure Databricks, SQL, and data visualization tools like Power BI. The role involves leading a team of developers, managing data pipelines, and creating insightful dashboards and reports to drive data-driven decision-making across the organization. The individual will ensure best practices are followed in data architecture, development, and reporting while maintaining alignment with business objectives. Key Responsibilities: Data Integration & ETL Processes: Design, build, and optimize ETL pipelines to manage the flow of data from various sources into data lakes, data warehouses, and reporting platforms. Data Visualization & Reporting: Lead the development of interactive dashboards and reports using Power BI, ensuring that business users have access to actionable insights and performance metrics. SQL Development & Optimization: Write, optimize, and review complex SQL queries for data extraction, transformation, and reporting, ensuring high performance and scalability across large datasets. Azure Cloud Solutions: Implement and manage cloud-based solutions using Azure services (Azure Databricks, Azure SQL Database, Data Lake) to support business intelligence and reporting initiatives. Collaboration with Stakeholders: Work closely with business leaders and cross-functional teams to understand reporting and analytics needs, translating them into technical requirements and actionable data solutions. Quality Assurance & Best Practices: Implement and maintain best practices in development, ensuring code quality, version control, and adherence to data governance standards. Performance Monitoring & Tuning: Continuously monitor the performance of data systems, reporting tools, and dashboards to ensure they meet SLAs and business requirements. 
Documentation & Training: Create and maintain comprehensive documentation for all data solutions, including architecture diagrams, ETL workflows, and data models. Provide training and support to end users on Power BI reports and dashboards.
Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field
- Proven experience as a Development Lead or Senior Data Engineer with expertise in Azure Databricks, SQL, Power BI, and data reporting/visualization
- Hands-on experience with Azure Databricks for large-scale data processing and analytics, including Delta Lake, Spark SQL, and integration with Azure Data Lake
- Strong expertise in SQL for querying, data transformation, and database management
- Proficiency in Power BI for developing advanced dashboards, data models, and reporting solutions
- Experience in ETL design and data integration across multiple systems, with a focus on performance optimization
- Knowledge of Azure cloud architecture, including Azure SQL Database, Data Lake, and other relevant services
- Experience leading agile development teams, with a strong focus on delivering high-quality, scalable solutions
- Strong problem-solving skills, with the ability to troubleshoot and resolve complex data and reporting issues
- Excellent communication skills, with the ability to interact with both technical and non-technical stakeholders
Preferred Qualifications:
- Knowledge of additional Azure services (e.g., Azure Synapse, Data Factory, Logic Apps) is a plus
- Experience in Power BI for data visualization and custom calculations
Keywords: Data Factory, Power BI*, Spark SQL, Logic Apps, Azure Databricks*, ETL design, agile development, SQL*, Synapse, data reporting*, Delta Lake, Azure Data Lake, Azure cloud architecture, database management, Microsoft Azure, ETL, Azure SQL, Spark, Azure Logic Apps, BI, data visualization*, data transformation*
Mandatory Key Skills: Power BI*, SQL*, Azure Databricks*, data reporting*, data visualization*, data transformation*
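The SQL tuning work the Development Lead role above describes (writing, reviewing, and optimizing queries against large datasets) often comes down to checking a query plan before and after adding an index. A hedged sketch of that workflow, with SQLite standing in for Azure SQL and all table and index names hypothetical:

```python
import sqlite3

# Sketch of query tuning: compare the query plan before and after
# adding an index. SQLite stands in for Azure SQL; names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("west", i * 1.0) for i in range(1000)])

def plan(sql):
    # EXPLAIN QUERY PLAN rows carry the human-readable detail in column 3.
    return " ".join(r[3] for r in conn.execute("EXPLAIN QUERY PLAN " + sql))

q = "SELECT SUM(amount) FROM sales WHERE region = 'west'"
before = plan(q)   # expected: a full table scan
conn.execute("CREATE INDEX ix_sales_region ON sales(region)")
after = plan(q)    # expected: a search using the new index
print(before)
print(after)
```

The same before/after discipline applies on Azure SQL or Synapse via their execution-plan views; the point is measuring the plan change, not trusting the index blindly.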
Posted 3 weeks ago
5.0 - 10.0 years
10 - 20 Lacs
pune
Work from Office
We are looking for a skilled Azure Data Engineer to design, build, and maintain scalable data pipelines and cloud-based data solutions on Microsoft Azure. You will work closely with data analysts, scientists, and business stakeholders to ensure efficient data processing, storage, and access.
Key Responsibilities:
- Design and develop robust, scalable data pipelines using Azure Data Factory, Databricks, and Azure Synapse Analytics
- Develop and optimize ETL/ELT processes
- Implement data lake and data warehouse solutions using Azure Data Lake Storage, SQL Server, and Synapse
- Work with structured and unstructured data from various sources (e.g., APIs, databases, files)
- Ensure data quality, integrity, and security
- Collaborate with DevOps on CI/CD pipelines and version control using Azure DevOps / Git
- Participate in requirements gathering, solution design, and architecture discussions
Required Skills & Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field
- 4+ years of experience in data engineering, with hands-on experience in Azure cloud services
- Strong knowledge of Azure Data Factory, Azure Data Lake (Gen1/Gen2), Azure Synapse Analytics, Databricks / Spark SQL, and Python / PySpark
- Experience with data modeling, metadata management, and performance tuning
- Familiarity with Azure DevOps, Git, and CI/CD practices
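The incremental ETL/ELT work described above is often watermark-driven: load only rows modified since the last successful run, then advance the watermark. A hedged, stand-alone sketch of that logic in plain Python; the in-memory source and field names are hypothetical, and in a real pipeline the watermark would live in ADF pipeline state or a control table rather than a local variable.

```python
from datetime import datetime, timezone

# Hedged sketch of a watermark-driven incremental load, the pattern an
# ADF copy activity or Databricks job typically implements. The source
# is an in-memory stand-in; all field names are hypothetical.
source = [
    {"id": 1, "updated": "2024-01-01T00:00:00+00:00"},
    {"id": 2, "updated": "2024-03-01T00:00:00+00:00"},
]

def incremental_load(rows, watermark):
    """Return rows modified after the watermark, plus the new watermark."""
    fresh = [r for r in rows
             if datetime.fromisoformat(r["updated"]) > watermark]
    new_wm = max((datetime.fromisoformat(r["updated"]) for r in fresh),
                 default=watermark)
    return fresh, new_wm

wm = datetime(2024, 2, 1, tzinfo=timezone.utc)
fresh, wm = incremental_load(source, wm)
print([r["id"] for r in fresh])  # [2] - only the row updated after Feb 1
```

Persisting `new_wm` only after the load commits is what makes the pattern safely re-runnable: a failed run leaves the old watermark in place and the next run picks the same rows up again.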
Posted 3 weeks ago
3.0 years
0 Lacs
delhi
On-site
About us: Bain & Company is a global management consulting firm that helps the world's most ambitious change makers define the future. Across 65 offices in 40 countries, we work alongside our clients as one team with a shared ambition to achieve extraordinary results, outperform the competition, and redefine industries. Since our founding in 1973, we have measured our success by the success of our clients, and we proudly maintain the highest level of client advocacy in the industry. In 2004, the firm established its presence in the Indian market by opening the Bain Capability Center (BCC) in New Delhi. The BCC is now known as the BCN (Bain Capability Network), with nodes across various geographies. The BCN is an integral part, and the largest unit, of Expert Client Delivery (ECD). ECD plays a critical role by supporting Bain's case teams globally with analytics and research solutioning across all industries and specific domains, for corporate cases, client development, private equity diligence, and Bain intellectual property. The BCN comprises Consulting Services, Knowledge Services, and Shared Services.
Who you will work with: This role is based out of the Visualization Centre of Excellence (CoE) at the BCN. The Visualization CoE works closely with global Bain case teams, Bain Partners, and end clients, providing data analytics and business intelligence support using advanced data analysis and visualization tools (e.g., SQL, Python, Azure, AWS, Tableau, Power BI, Alteryx). The CoE is a one-stop shop for all case requests related to converting data into insightful visualization tools (e.g., survey analytics, leadership KPI dashboards).
What you'll do: Be responsible for designing, building, and maintaining infrastructure and systems that enable the extraction, transformation, and storage of large datasets for analysis. Work with Bain teams or end clients as an expert on a specific platform/tool/language (Azure, AWS, Python, SQL, etc.) in an individual capacity, or lead teams of analysts to design and deliver impactful insights. Support the project lead in end-to-end handling of the entire process, i.e., requirement gathering, data cleaning, processing, and automation. Investigate data to identify potential issues within ETL pipelines, notify end users, and propose adequate solutions. Ensure that data architecture is scalable and maintainable. Apply knowledge of data analysis tools like Azure Databricks, AWS Athena, Alteryx, etc. to support case teams with analysis of KPIs. Prepare documentation for further reference. The working style of the team supports product development, so the pipelines and algorithms built should be scalable and automated. Support case leads in managing internal and external stakeholders across instruments and workstreams to provide expertise in data management and tooling. Work under the guidance of a Team Lead / Team Manager / Sr. Team Manager, playing a key role in driving the team's overall answer and final materials, client communication, work planning, and team management. May also take responsibility for assigning work streams to Analysts and monitoring workload; provide tool-based technical expertise to junior team members when required. May deploy data engineering solutions using CI/CD pipelines (GitHub, cloud servers on Azure/AWS). May lead client/case team calls and communicate data, knowledge, insights, and actionable next steps to the case team, relaying implications to his/her own internal team. Keep abreast of new and current statistical, database, and data warehousing tools and techniques.
About you: Candidates should be a Graduate/Post-Graduate from a top-tier college with strong academic records and 3-5 years of relevant work experience in areas related to Data Management, Business Intelligence, or Business Analytics.
Hands-on experience in data handling and ETL workstreams. A concentration in a quantitative discipline such as Statistics, Mathematics, Engineering, Computer Science, Econometrics, Business Analytics, or Market Research is strongly preferred. Minimum 2+ years of experience in database development on cloud-based platforms such as AWS/Azure. Working experience with Python and advanced SQL (queries, stored procedures, query performance tuning, index maintenance, etc.). Experience with data modeling and data warehousing principles. Experience with at least one ETL tool such as Azure Data Factory, Databricks, AWS Glue, etc. Experience reading data from different data sources, including on-premise data servers, cloud services, and several file formats. Understanding of database architecture. Ability to prioritize projects, manage multiple competing priorities, and drive projects to completion under tight deadlines. Should be a motivated and collaborative team player who is a role model and at-cause individual within the team and office. Excellent oral and written communication skills, including the ability to communicate effectively with both technical and non-technical senior stakeholders.
Good to Have: Exposure to CI/CD pipelines (GitHub, Docker, and containerization) is a plus. Candidates with advanced certifications in AWS and Azure will be preferred. Experience with Snowflake/GCP is a plus.
What makes us a great place to work: We are proud to be consistently recognized as one of the world's best places to work, a champion of diversity, and a model of social responsibility. We are currently ranked the #1 consulting firm on Glassdoor's Best Places to Work list, and we have maintained a spot in the top four on Glassdoor's list for the last 12 years. We believe that diversity, inclusion and collaboration are key to building extraordinary teams.
We hire people with exceptional talents, abilities and potential, then create an environment where you can become the best version of yourself and thrive both professionally and personally. We are publicly recognized by external parties such as Fortune, Vault, Mogul, Working Mother, Glassdoor and the Human Rights Campaign for being a great place to work for diversity and inclusion, women, LGBTQ and parents.
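The ETL responsibility called out in this role, investigating data to identify potential issues within pipelines, can be illustrated with a small pre-load check for nulls and duplicate keys. This is a hedged sketch only: the field names are hypothetical, and a production pipeline would typically run such checks in the warehouse or a framework rather than in plain Python.

```python
# Hedged sketch of pipeline data-quality checks: flag null required fields
# and duplicate keys before loading. All field names are hypothetical.
def quality_report(rows, key="id", required=("id", "amount")):
    """Return a list of (row_index, issue) pairs found in the batch."""
    issues = []
    seen = set()
    for i, row in enumerate(rows):
        for field in required:
            if row.get(field) is None:
                issues.append((i, f"null {field}"))
        k = row.get(key)
        if k in seen:
            issues.append((i, f"duplicate {key}={k}"))
        seen.add(k)
    return issues

rows = [{"id": 1, "amount": 10.0},
        {"id": 1, "amount": 5.0},
        {"id": 2, "amount": None}]
report = quality_report(rows)
print(report)  # [(1, 'duplicate id=1'), (2, 'null amount')]
```

Surfacing the row index alongside the issue is what makes the report actionable for end users, matching the "notify end users and propose adequate solutions" responsibility above.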
Posted 3 weeks ago
10.0 years
10 - 45 Lacs
hyderabad, telangana, india
On-site
Experience: 10-15 years
Work Mode: Pune & Hyderabad
Job Type: Full-time
Mandatory Skills: Solution Architect, Gen AI, LLMs, AI/ML, Python, Azure Cloud (Databricks, Data Factory, Azure Purview) or GCP (BigQuery, Vertex AI, Gemini). Domain: BFSI, Retail, Supply Chain, or Manufacturing.
Role Overview: We are seeking a highly experienced Principal Solution Architect to lead the design, development, and implementation of sophisticated cloud-based data solutions for our key clients. The ideal candidate will possess deep technical expertise across multiple cloud platforms (AWS, Azure, GCP), data architecture paradigms, and modern data technologies. You will be instrumental in shaping data strategies, driving innovation through areas like GenAI and LLMs, and ensuring the successful delivery of complex data projects across various industries.
Required Qualifications & Skills:
- Experience: 10+ years of experience in IT, with a significant focus on data architecture, solution architecture, and data engineering. Proven experience in a principal-level or lead architect role.
- Cloud Expertise: Deep, hands-on experience with major cloud platforms. Azure: Microsoft Fabric, Data Lake, Power BI, Data Factory, Azure Purview; good understanding of Azure Service Foundry, Agentic AI, Copilot. GCP: BigQuery, Vertex AI, Gemini.
- Data Science Leadership: Understanding and experience in integrating AI/ML capabilities, including GenAI and LLMs, into data solutions.
- Leadership & Communication: Exceptional communication, presentation, and interpersonal skills. Proven ability to lead technical teams and manage client relationships.
- Problem-Solving: Strong analytical and problem-solving abilities with a strategic mindset.
- Education: Bachelor's or Master's degree in Computer Science, Engineering, Information Technology, or a related field.
Key Responsibilities:
- Solution Design & Architecture: Lead the architecture and design of robust, scalable, and secure enterprise-grade data solutions, including data lakes, data warehouses, data mesh, and real-time data pipelines on AWS, Azure, and GCP.
- Client Engagement & Pre-Sales: Collaborate closely with clients to understand their business challenges, translate requirements into technical solutions, and present compelling data strategies. Support pre-sales activities, including proposal development and solution demonstrations.
- Data Strategy & Modernization: Drive data and analytics modernization initiatives, leveraging cloud-native services, Big Data technologies, GenAI, and LLMs to deliver transformative business value.
- Industry Expertise: Apply data architecture best practices across various industries (e.g., BFSI, Retail, Supply Chain, Manufacturing).
Preferred Qualifications:
- Relevant certifications in AWS, Azure, GCP, Snowflake, or Databricks
- Experience with Agentic AI and hyper-intelligent automation
Skills: data, azure, architecture, cloud, gcp, aws, data architecture, data solutions, design, ml, solution architecture, gen ai, llms, ai, python, azure cloud, azure datafactory, azure databricks, data science, problem solving
Posted 3 weeks ago
10.0 years
10 - 45 Lacs
pune, maharashtra, india
On-site