10.0 - 12.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Entity: Technology
Job Family Group: IT&S Group

Job Description:

About bp
Our purpose is to deliver energy to the world, today and tomorrow. For over 100 years, bp has focused on discovering, developing, and producing oil and gas in the nations where we operate. We are one of the few companies globally that can provide governments and customers with an integrated energy offering. Delivering our strategy sustainably is fundamental to achieving our ambition to be a net zero company by 2050 or sooner!

We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform crucial job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

bp is reinventing itself, and digital capability is at the core of this vision. As a Senior Enterprise Technology Engineer, you are a digital expert bringing deep specialist expertise to bp. Enterprise Technology Engineers work on the strategic technology platforms we exploit from the market, or come with deep skills in the implementation and integration of market solutions into our overall technology landscape. You will bring a broad base of digital technical knowledge and a strong understanding of software delivery principles. You will be familiar with lifecycle methods, with Agile delivery and the DevOps approach at the core. You will be skilled in applying approaches such as Site Reliability Engineering to the delivery and operation of the technologies you deliver, working as part of multidisciplinary squads.

You thrive in a culture of continuous improvement within teams, encouraging and empowering innovation and the delivery of changes that optimise operational efficiency and user experience. You are curious and improve your skills through continuous learning of new technologies, trends, and methods, applying the knowledge gained to improve bp standards and the capabilities of the engineering community. You coach others in the field to drive improved performance across our business. You embrace a culture of change and agility, evolving continuously and adapting to our changing world. You are an effective teammate, looking beyond your own area and organizational boundaries to consider the bigger picture and the perspective of others, while understanding cultural differences. You continually enhance your self-awareness and seek guidance from others on your impact and effectiveness. Well organized, you balance proactive and reactive approaches and multiple priorities to complete tasks on time. You use insight and good judgment to inform actions and respond to situations as they arise.

What You Will Deliver
- Design and implement enterprise technology architecture, security frameworks, and platform engineering.
- Strengthen platform security and ensure compliance with industry standards and regulations.
- Optimize system performance, availability, and scalability.
- Advance enterprise modernization and drive seamless integration with enterprise IT.
- Establish governance, security standards, and risk management strategies.
- Develop automated security monitoring, vulnerability assessments, and identity management solutions.
- Drive adoption of CI/CD, DevOps, and Infrastructure-as-Code methodologies.
- Enhance disaster recovery and resilience planning for enterprise platforms.
- Partner with technology teams and external vendors to align enterprise solutions with business goals.
- Lead and mentor engineering teams, fostering a culture of innovation and excellence.
- Shape strategies for enterprise investments, cybersecurity risk mitigation, and operational efficiency.
- Collaborate across teams to implement scalable solutions and long-term technology roadmaps.

What you will need to be successful (experience and qualifications)

Technical Skills We Need From You
- Bachelor's degree in Technology, Engineering, or a related technical discipline.
- 10+ years of experience in enterprise technology, security, and operations in large-scale global environments.
- Experience implementing CI/CD pipelines, DevOps methodologies, and Infrastructure-as-Code (AWS Cloud Development Kit, Azure Bicep, etc.); see the sketch after this posting.
- Deep knowledge of ITIL, Agile, and enterprise IT governance frameworks.
- Proficiency in programming languages such as Python, Java, or Scala.
- Experience with data pipeline frameworks (e.g., Apache Airflow, Kafka, Spark) and cloud-based data platforms, preferably AWS, with exposure to GCP and Azure.
- Expertise in database technologies (SQL, NoSQL, data lakes) and in data design and data modeling principles.

Essential Skills
- Proven technical expertise in AWS and Databricks, with exposure to Microsoft Azure and Palantir.
- Strong understanding of data ingestion, pipelines, governance, security, and visualization.
- Experience designing, deploying, and optimizing multi-cloud data platforms that support large-scale, cloud-native workloads, balancing cost efficiency with performance and resilience.
- Hands-on performance tuning, data indexing, and distributed query optimization.
- Experience with real-time and batch data streaming architectures.

Skills That Set You Apart
- Proven success navigating global, highly regulated environments, ensuring compliance, security, and enterprise-wide risk management.
- AI/ML-driven data engineering expertise, applying intelligent automation to optimize workflows.
- You are comfortable operating in an environment that is loosely coupled but tightly aligned toward a shared vision.

Travel Requirement: No travel is expected with this role.
Relocation Assistance: This role is not eligible for relocation.
Remote Type: This position is a hybrid of office/remote working.

Skills: Agility core practices, Analytics, API and platform design, Business Analysis, Cloud Platforms, Coaching, Communication, Configuration management and release, Continuous deployment and release, Data Structures and Algorithms (Inactive), Digital Project Management, Documentation and knowledge sharing, Facilitation, Information Security, iOS and Android development, Mentoring, Metrics definition and instrumentation, NoSQL data modelling, Relational Data Modelling, Risk Management, Scripting, Service operations and resiliency, Software Design and Development, Source control and code management + 4 more

Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status. Individuals with an accessibility need may request an adjustment/accommodation related to bp's recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us.
If you are selected for a position and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.
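To make the Infrastructure-as-Code requirement above concrete, here is a minimal AWS CDK v2 sketch in Python of the kind of declarative resource definition such a role involves; the stack and bucket names are hypothetical, not bp's actual codebase:

```python
# Minimal AWS CDK v2 sketch: one stack provisioning an encrypted, versioned
# S3 bucket. Stack and bucket names are hypothetical.
from aws_cdk import App, Stack, RemovalPolicy
from aws_cdk import aws_s3 as s3
from constructs import Construct

class AuditLogStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        # Versioning, encryption, and blocked public access are common
        # baseline controls for audit/logging buckets.
        s3.Bucket(
            self,
            "AuditLogBucket",
            versioned=True,
            encryption=s3.BucketEncryption.S3_MANAGED,
            block_public_access=s3.BlockPublicAccess.BLOCK_ALL,
            removal_policy=RemovalPolicy.RETAIN,
        )

app = App()
AuditLogStack(app, "audit-log-stack")
app.synth()
```

In practice a definition like this would be deployed with `cdk deploy` from the CI/CD pipelines the posting describes, rather than applied by hand.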
Posted 3 hours ago
2.0 - 5.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Entity: Technology
Job Family Group: IT&S Group

Job Description:

What You Will Deliver
- Develop scripts and code to automate infrastructure provisioning and configuration using Infrastructure-as-Code (IaC) principles and best practices.
- Optimize the capacity, performance, and cost of cloud resources based on business needs and budget constraints.
- Manage and ingest persistent data for logging and audit purposes while ensuring data security and compliance.
- Design and engineer cloud solutions from concept through re-engineering: reducing complexity, reusing code, improving efficiency, and adopting modern technologies.
- Configure and manage network connectivity, control planes, and internal resource communication across cloud and hybrid environments.
- Enhance the developer and customer experience by applying engineering best practices, tooling, testing frameworks, and effective written and verbal communication.
- Implement cloud security controls including Zero Trust, IAM, encryption, firewalls, and thorough code reviews, especially for AI-generated code or configurations.

What you will need to be successful (experience and qualifications)
- A bachelor's degree in computer science, engineering, or a related field, or equivalent work experience.
- 2 to 5 years of experience in IT, including up to 2 years as a Cloud Engineer or in a similar role.
- Proficiency in scripting and coding languages such as PowerShell, Python, or C# (see the Python sketch after this posting).
- Strong knowledge of core cloud services, including virtual machines, containers, PaaS offerings, monitoring, storage, and networking.
- Experience with CI/CD tools such as Azure DevOps (ADO) or similar platforms for continuous integration and delivery.
- Familiarity with data platforms including SQL Server, data lakes, and PaaS-based databases.
- Ability to work both independently and collaboratively within cross-functional teams.

About bp
Our purpose is to deliver energy to the world, today and tomorrow. For over 100 years, bp has focused on discovering, developing, and producing oil and gas in the nations where we operate. We are one of the few companies globally that can provide governments and customers with an integrated energy offering. Delivering our strategy sustainably is fundamental to achieving our ambition to be a net zero company by 2050 or sooner!

We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform crucial job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status. Even though the job is advertised as full time, please contact the hiring manager or the recruiter, as flexible working arrangements may be considered.
Travel Requirement: Up to 10% travel should be expected with this role.
Relocation Assistance: This role is eligible for relocation within country.
Remote Type: This position is a hybrid of office/remote working.

Skills: Agility core practices, Analytics, API and platform design, Business Analysis, Cloud Platforms, Coaching, Communication, Configuration management and release, Continuous deployment and release, Data Structures and Algorithms (Inactive), Digital Project Management, Documentation and knowledge sharing, Facilitation, Information Security, iOS and Android development, Mentoring, Metrics definition and instrumentation, NoSQL data modelling, Relational Data Modelling, Risk Management, Scripting, Service operations and resiliency, Software Design and Development, Source control and code management + 4 more

Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status. Individuals with an accessibility need may request an adjustment/accommodation related to bp's recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.
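As a loose illustration of the scripting this role calls for, here is a minimal Python sketch using the Azure SDK to provision a resource group; the subscription ID, names, and tags are placeholders, and DefaultAzureCredential assumes a configured local or environment login:

```python
# Minimal Azure SDK for Python sketch: authenticate and create (or update)
# a resource group. Subscription ID and all names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

credential = DefaultAzureCredential()  # picks up az CLI / environment credentials
client = ResourceManagementClient(credential, "00000000-0000-0000-0000-000000000000")

resource_group = client.resource_groups.create_or_update(
    "rg-cloud-eng-dev",
    {"location": "westeurope", "tags": {"env": "dev", "owner": "cloud-team"}},
)
print(resource_group.name, resource_group.properties.provisioning_state)
```

The same operation is usually expressed declaratively (Bicep, CDK, Terraform) in production pipelines; the SDK form above is the scripting counterpart the listing mentions.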
Posted 3 hours ago
14.0 - 20.0 years
0 Lacs
Maharashtra
On-site
As a Senior Architect - Data & Cloud at our company, you will be responsible for architecting, designing, and implementing end-to-end data pipelines and data integration solutions for varied structured and unstructured data sources and targets. You will need more than 15 years of experience in Technical, Solutioning, and Analytical roles, with 5+ years specifically in building and managing Data Lakes, Data Warehouses, Data Integration, Data Migration, and Business Intelligence/Artificial Intelligence solutions on Cloud platforms like GCP, AWS, or Azure.

Key Responsibilities:
- Translate business requirements into functional and non-functional areas, defining boundaries in terms of Availability, Scalability, Performance, Security, and Resilience.
- Architect and design scalable data warehouse solutions on cloud platforms like BigQuery or Redshift (see the BigQuery sketch after this posting).
- Work with various Data Integration and ETL technologies on Cloud such as Spark, PySpark/Scala, Dataflow, DataProc, EMR, etc.
- Apply deep knowledge of Cloud and On-Premise databases like Cloud SQL, Cloud Spanner, Bigtable, RDS, Aurora, DynamoDB, Oracle, Teradata, MySQL, DB2, SQL Server, etc.
- Draw on exposure to NoSQL databases like MongoDB, CouchDB, Cassandra, graph databases, etc.
- Use traditional ETL tools like Informatica, DataStage, OWB, Talend, etc.
- Collaborate with internal and external stakeholders to design optimized data analytics solutions.
- Mentor young talent within the team and contribute to building assets and accelerators.

Qualifications Required:
- 14-20 years of relevant experience in the field.
- Strong understanding of Cloud solutions for IaaS, PaaS, SaaS, Containers, and Microservices Architecture and Design.
- Experience with BI Reporting and Dashboarding tools like Looker, Tableau, Power BI, SAP BO, Cognos, Superset, etc.
- Knowledge of security features and policies in Cloud environments like GCP, AWS, or Azure.
- Ability to compare products and tools across technology stacks on Google, AWS, and Azure Cloud.

In this role, you will lead multiple data engagements on GCP Cloud for data lakes, data engineering, data migration, data warehouses, and business intelligence. You will interface with multiple stakeholders within IT and business to understand data requirements and take complete responsibility for the successful delivery of projects. Additionally, you will have the opportunity to work in a high-growth startup environment, contribute to the digital transformation journey of customers, and collaborate with a diverse and proactive team of techies. Please note that flexible, remote working options are available to foster productivity and work-life balance.
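For flavor, here is a minimal sketch of querying a cloud data warehouse with the google-cloud-bigquery client; the project, dataset, and table names are hypothetical:

```python
# Minimal BigQuery client sketch: run an aggregate query and print results.
# Project, dataset, and table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

query = """
    SELECT region, SUM(net_sales) AS total_sales
    FROM `example-project.sales_lake.orders`
    WHERE order_date >= '2024-01-01'
    GROUP BY region
    ORDER BY total_sales DESC
"""

for row in client.query(query).result():  # blocks until the query job finishes
    print(f"{row.region}: {row.total_sales}")
```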
Posted 2 days ago
5.0 - 7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description:

YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth - bringing real positive changes in an increasingly virtual world - and it drives us beyond generational gaps and disruptions of the future. We are looking forward to hiring Palantir professionals in the following areas:

We are seeking a highly motivated and technically skilled Senior Data Engineer to join our data team. The ideal candidate will have extensive experience designing, building, and optimizing scalable data pipelines and analytics solutions using Palantir Foundry, along with broader expertise in data architecture, governance, and processing. This role offers the opportunity to work on cutting-edge data platforms and deliver impactful solutions for enterprise data integration, transformation, and advanced analytics.

Key Responsibilities:
- Design, implement, and maintain scalable and reliable data pipelines using Palantir Foundry (see the Foundry sketch after this posting).
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and translate them into technical solutions.
- Build reusable data assets, integrate data from multiple sources, and ensure data quality, integrity, and security.
- Develop and maintain transformation workflows, ontology models, and data governance frameworks within Foundry.
- Optimize data pipelines for performance, scalability, and cost efficiency.
- Troubleshoot, monitor, and resolve data issues and pipeline failures in real time.
- Implement best practices in data versioning, lineage, and observability.
- Assist in defining data architecture strategies and tooling roadmaps.
- Mentor junior engineers and contribute to team knowledge sharing and documentation.

Required Qualifications:
- 5+ years of experience in data engineering, analytics, or software engineering roles.
- Strong hands-on experience with Palantir Foundry, including ontology modeling, pipelines, data fusion, and transformation logic.
- Proficiency in Python, SQL, and data processing frameworks.
- Solid understanding of data integration, ETL workflows, and data modeling principles.
- Experience working with relational databases, data lakes, and cloud data warehouses (AWS, Azure, GCP).
- Strong problem-solving skills and ability to debug complex data workflows.
- Familiarity with CI/CD pipelines and version control tools like Git.
- Excellent communication skills and ability to work collaboratively with cross-functional teams.

At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles:
- Flexible work arrangements, free spirit, and emotional positivity
- Agile self-determination, trust, transparency, and open collaboration
- All support needed for the realization of business goals
- Stable employment with a great atmosphere and ethical corporate culture
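As an illustration of Foundry pipeline code, here is a minimal sketch using the Python transforms API; the dataset paths are hypothetical, not an actual YASH or client project:

```python
# Minimal Palantir Foundry transform sketch using the Python transforms API.
# Dataset paths are hypothetical. The function receives and returns PySpark
# DataFrames; Foundry records lineage between the input and output datasets.
from transforms.api import transform_df, Input, Output


@transform_df(
    Output("/Example/datasets/clean/orders_clean"),
    raw_orders=Input("/Example/datasets/raw/orders"),
)
def clean_orders(raw_orders):
    # Basic quality gates: drop records missing the key, then deduplicate on it.
    return (
        raw_orders
        .filter(raw_orders["order_id"].isNotNull())
        .dropDuplicates(["order_id"])
    )
```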
Posted 3 days ago
7.0 - 9.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Category: Software Engineering

Job Details

About Salesforce
Salesforce is the #1 AI CRM, where humans with agents drive customer success together. Here, ambition meets action. Tech meets trust. And innovation isn't a buzzword - it's a way of life. The world of work as we know it is changing, and we're looking for Trailblazers who are passionate about bettering business and the world through AI, driving innovation, and keeping Salesforce's core values at the heart of it all. Ready to level up your career at the company leading workforce transformation in the agentic era? You're in the right place! Agentforce is the future of AI, and you are the future of Salesforce.

We are seeking an experienced Quality Engineer for Salesforce Agentforce to join our Digital Success Engineering team. The ideal candidate should have a strong background in testing AI, retrieval-augmented generation (RAG), and agentic technologies, along with expertise in Salesforce and programming languages like Apex, Python, or Java. A passion for building robust automated testing solutions for the agentic user experience and application programming interfaces (APIs) is essential. In this role, you will develop automated quality assurance practices for scalable customer-facing and internal Agentforce implementations. Proficiency in testing frameworks, JavaScript, modern UI frameworks, WebDriver, Selenium, and API testing is required. You will create functional and automation tests for the Agent UI and APIs while collaborating with team members to enhance existing code and frameworks. Experience with Jenkins or other CI/CD tools is a plus.

Salesforce, the world's #1 AI CRM, has recently unveiled Agentforce, a groundbreaking suite of autonomous AI agents that augment employees and handle tasks in service, sales, marketing, and commerce, driving unprecedented efficiency and customer satisfaction. Leading with our core values, we help companies across every industry blaze new trails and connect with customers in a whole new way. And we empower you to be a Trailblazer, too - driving your performance and career growth, charting new paths, and improving the state of the world. If you believe in business as the greatest platform for change and in companies doing well and doing good, you've come to the right place. This role is crucial in redefining how we enable innovative experiences driven by Agentforce. You will be key to the evolution of Customer Success in the realm of Agentforce.

The Team
Our Digital Success Engineering team is an interdisciplinary mix of distributed software engineers, architects, and engineering managers working collaboratively to build unified experiences for our Trailblazers. As customer zero, we drive innovation by harnessing Salesforce technology to create an easy and expert self-service experience that fuels Trailblazer success.

The Role
You will develop and implement a quality engineering and testing strategy and framework, managing manual and automated test cases to ensure seamless human-agent experiences with Salesforce Agentforce and Data Cloud. Your responsibilities include defining and executing a comprehensive testing plan and collaborating with Product Managers, Developers, and cross-functional teams to deliver exceptional Customer Success experiences.

PRIMARY RESPONSIBILITIES
- Quality Engineering Strategy & Leadership: Define, develop, and execute a comprehensive quality engineering and testing strategy specifically for Salesforce Agentforce, Data Cloud, and other relevant Salesforce platform capabilities.
- Test Automation Expertise: Design, develop, and maintain robust automated test scripts for the Agent UI, Experience Site UI, and APIs using industry-standard tools and frameworks such as Selenium, WebDriver, JavaScript frameworks, and API testing tools (see the WebDriver sketch after this posting).
- AI & Agentforce Testing: Develop specialized testing strategies and test cases for AI-powered features, including Retrieval-Augmented Generation (RAG), agentic workflows, single-turn/multi-turn conversations, Natural Language Processing (NLP), and Large Language Models (LLMs). Use AI and RAG testing metrics to define and measure testing outcomes.
- Data-Driven Quality: Build and maintain datasets for training, fine-tuning, and evaluating the performance and accuracy of LLMs and other AI models. Implement data-driven approaches to measure and improve the quality of AI features.
- Functional & Salesforce Testing: Design and execute functional test plans and test cases tailored to Salesforce applications, including customizations using Flows, Triggers, Apex jobs, and more. Possess a strong understanding of Salesforce best practices and governor limits.
- Continuous Integration & Continuous Delivery (CI/CD): Build and maintain automated quality control pipelines using Jenkins or other CI/CD tools to ensure seamless integration and deployment of high-quality software.
- Defect Management: Effectively identify, document, track, and manage defects to ensure timely resolution and contribute to root cause analysis.
- Collaboration & Communication: Partner effectively with Product Managers, Developers, UX Designers, and other stakeholders to understand requirements, provide timely feedback, communicate testing progress, and collaborate on solutions.
- Best Practices & Standards: Establish and champion quality standards, methodologies, and best practices within the team and across the organization.
- Performance Optimization: Continuously assess the performance of Agentforce implementations, identify potential bottlenecks, and recommend and implement proactive measures for optimization.
- Technical Expertise: Stay up to date with the latest Salesforce releases, AI advancements, and emerging testing trends.
- Stakeholder Management: Communicate testing progress, risks, and mitigation strategies effectively to various stakeholders, including both technical and non-technical audiences.
- Agile Participation: Actively participate in Agile development methodologies, contributing to sprint planning, daily stand-ups, and retrospectives.
- Problem-Solving & Analytical Skills: Demonstrate strong analytical and problem-solving skills to identify and resolve complex quality issues effectively.

Professional Experience/Skills Required:
- Overall 7+ years of experience in Software Quality Engineering.
- 2+ years of hands-on experience with the Salesforce platform, including configuration and development across various Salesforce Customer 360 products (e.g., Sales Cloud, Service Cloud, Data Cloud, Einstein).
- 3+ years of proven experience in testing Machine Learning (ML), Artificial Intelligence (AI) systems, and/or services.
- Strong proficiency in at least one programming language such as Python or Java, particularly relevant for test automation and AI/ML testing.
- Experience with Large Language Models (LLMs) and prompt engineering testing frameworks.
- Solid understanding of Retrieval-Augmented Generation (RAG) and agentic technologies.
- Demonstrated experience in building and maintaining robust automated testing frameworks for UI and APIs.
- Proficiency in JavaScript and modern UI frameworks (e.g., React, Angular, Vue.js) for UI test automation.
- Experience with WebDriver and Selenium for browser automation.
- Strong analytical mindset with experience in using a data-driven approach to measure and improve the quality of complex ML/AI products.
- Proven experience with Salesforce DevOps, CI/CD processes, and automation tools.
- Deep understanding of Salesforce architecture, functionalities, and various clouds.
- Strong understanding of AI, machine learning, and natural language processing principles.
- Experience with data management principles and Customer Data Platforms (CDPs) or Data Lakes.
- Familiarity with development tools like SLDS, XML, HTML, SQL, JavaScript, JSON, and CSS.
- Proven ability to work effectively within cross-functional teams to achieve common goals and business objectives.
- Experience with Agile development or Scrum project management methodologies and tools.
- Solid understanding of software engineering principles, key design patterns, and best practices.
- Strategic thinker with the ability to see the big picture, innovate, and adapt to constant change.
- Ability to work under pressure, highly adaptable, and well-organized.
- Highly effective written and verbal communication skills with audiences across all levels of the organization.
- Strong organizational skills with the ability to establish and manage priorities in a complex and fast-paced environment.
- Strong knowledge of data management concepts and architectures.
- Bachelor's or Master's degree in Computer Science, Software Engineering, or equivalent experience.

Salesforce Certifications (Preferred):
- Salesforce Platform App Builder
- Salesforce Administrator
- Salesforce-specific certifications related to Agentforce (if and when available)

Unleash Your Potential
When you join Salesforce, you'll be limitless in all areas of your life. Our benefits and resources support you to find balance, and our AI agents accelerate your impact. Together, we'll bring the power of Agentforce to organizations of all sizes and deliver amazing experiences that customers love. Apply today to not only shape the future, but to redefine what's possible - for yourself, for AI, and the world.

Accommodations
If you require assistance due to a disability applying for open positions, please submit an accommodation request.

Posting Statement
Salesforce is an equal opportunity employer and maintains a policy of non-discrimination with all employees and applicants for employment. What does that mean exactly? It means that at Salesforce, we believe in equality for all. And we believe we can lead the path to equality in part by creating a workplace that's inclusive and free from discrimination. Any employee or potential employee will be assessed on the basis of merit, competence, and qualifications, without regard to race, religion, color, national origin, sex, sexual orientation, gender expression or identity, transgender status, age, disability, veteran or marital status, political viewpoint, or other classifications protected by law. This policy applies to current and prospective employees, no matter where they are in their Salesforce employment journey. It also applies to recruiting, hiring, job assignment, compensation, promotion, benefits, training, assessment of job performance, discipline, termination, and everything in between. Recruiting, hiring, and promotion decisions at Salesforce are fair and based on merit.
The same goes for compensation, benefits, promotions, transfers, reduction in workforce, recall, training, and education.
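To ground the UI-automation requirement, here is a minimal Python WebDriver sketch of the kind of check described above; the URL and CSS selectors are hypothetical placeholders, not Salesforce's actual markup:

```python
# Minimal Selenium WebDriver sketch: open a chat UI, send a message, and
# assert a non-empty agent reply appears. URL and selectors are hypothetical.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.common.keys import Keys
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Chrome()
try:
    driver.get("https://example.my.site.com/agent-chat")  # hypothetical Experience Site
    wait = WebDriverWait(driver, timeout=30)

    chat_input = wait.until(
        EC.visibility_of_element_located((By.CSS_SELECTOR, "textarea.chat-input"))
    )
    chat_input.send_keys("Where is my order?", Keys.RETURN)

    reply = wait.until(
        EC.visibility_of_element_located((By.CSS_SELECTOR, "div.agent-reply"))
    )
    assert reply.text.strip(), "Agent returned an empty reply"
finally:
    driver.quit()
```

In a real suite this check would live in a test framework (pytest, TestNG, etc.) and run in the CI/CD pipelines the posting describes, with LLM-specific assertions layered on top.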
Posted 3 days ago
5.0 - 8.0 years
8 - 10 Lacs
Chennai, Tamil Nadu, India
On-site
Required Skills:
- Programming: Proficiency in Python is mandatory.
- Database: Strong SQL skills with the ability to write complex queries.
- Big Data: Experience with Spark and Hive, including optimization techniques.
- Data Orchestration: Expertise in Apache Airflow or equivalent tools (see the DAG sketch after this posting).
- Data Lake Development: Hands-on experience in creating and managing data lakes.

Preferred Skills:
- Trino or AWS Athena experience.
- Snowflake knowledge.
- Familiarity with data quality frameworks.
- Experience with file storage systems like AWS S3.

Qualifications:
- Proven experience in data engineering roles.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration abilities.
- Ability to work independently and deliver high-quality solutions.
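As a sketch of the orchestration work this listing names, here is a minimal Apache Airflow DAG; the DAG, task, and path names are hypothetical, not this employer's codebase:

```python
# Minimal Apache Airflow sketch: a daily DAG that extracts a file to a data
# lake path and then registers the partition. All names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(ds: str, **_) -> None:
    # 'ds' is Airflow's logical date (YYYY-MM-DD) for the run.
    print(f"Extracting orders for {ds} to s3://example-lake/raw/orders/{ds}/")


def publish_partition(ds: str, **_) -> None:
    print(f"Registering partition dt={ds} in the catalog")


with DAG(
    dag_id="orders_daily",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    publish = PythonOperator(task_id="publish_partition", python_callable=publish_partition)
    extract >> publish  # publish runs only after a successful extract
```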
Posted 4 days ago
5.0 - 8.0 years
8 - 10 Lacs
Hyderabad, Telangana, India
On-site
Required Skills:
- Programming: Proficiency in Python is mandatory.
- Database: Strong SQL skills with the ability to write complex queries.
- Big Data: Experience with Spark and Hive, including optimization techniques.
- Data Orchestration: Expertise in Apache Airflow or equivalent tools.
- Data Lake Development: Hands-on experience in creating and managing data lakes.

Preferred Skills:
- Trino or AWS Athena experience.
- Snowflake knowledge.
- Familiarity with data quality frameworks.
- Experience with file storage systems like AWS S3.

Qualifications:
- Proven experience in data engineering roles.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration abilities.
- Ability to work independently and deliver high-quality solutions.
Posted 4 days ago
14.0 - 20.0 years
0 Lacs
Maharashtra
On-site
Role Overview:
As a Principal Architect - Data & Cloud at Quantiphi, you will be responsible for leveraging your extensive experience in technical, solutioning, and analytical roles to architect and design end-to-end data pipelines and data integration solutions for structured and unstructured data sources and targets. You will play a crucial role in building and managing data lakes, data warehouses, data integration, and business intelligence/artificial intelligence solutions on Cloud platforms like GCP, AWS, and Azure. Your expertise will be instrumental in designing scalable data warehouse solutions on BigQuery or Redshift and working with various data integration, storage, and pipeline tools on Cloud. Additionally, you will serve as a trusted technical advisor to customers, lead multiple data engagements on GCP Cloud, and contribute to the development of assets and accelerators.

Key Responsibilities:
- Bring more than 15 years of experience in technical, solutioning, and analytical roles.
- Bring 5+ years of experience in building and managing data lakes, data warehouses, data integration, data migration, and business intelligence/artificial intelligence solutions on Cloud platforms like GCP, AWS, and Azure.
- Understand business requirements, translate them into functional and non-functional areas, and define boundaries in terms of availability, scalability, performance, security, and resilience.
- Architect, design, and implement end-to-end data pipelines and data integration solutions for structured and unstructured data sources and targets (a hedged PySpark sketch follows this posting).
- Work with distributed computing and enterprise environments like Hadoop and Cloud platforms.
- Use various data integration and ETL technologies on Cloud such as Spark, PySpark/Scala, Dataflow, DataProc, EMR, etc.
- Apply deep knowledge of Cloud and On-Premise databases like Cloud SQL, Cloud Spanner, Bigtable, RDS, Aurora, DynamoDB, Oracle, Teradata, MySQL, DB2, SQL Server, etc.
- Draw on exposure to NoSQL databases like MongoDB, CouchDB, Cassandra, graph databases, etc.
- Design scalable data warehouse solutions on Cloud with tools like S3, Cloud Storage, Athena, Glue, Sqoop, Flume, Hive, Kafka, Pub/Sub, Kinesis, Dataflow, DataProc, Airflow, Composer, Spark SQL, Presto, EMRFS, etc.
- Apply experience with Machine Learning frameworks like TensorFlow and PyTorch.
- Understand Cloud solutions for IaaS, PaaS, SaaS, Containers, and Microservices Architecture and Design.
- Bring a good understanding of BI Reporting and Dashboarding tools like Looker, Tableau, Power BI, SAP BO, Cognos, Superset, etc.
- Apply knowledge of security features and policies in Cloud environments like GCP, AWS, and Azure.
- Work on business transformation projects for moving On-Premise data solutions to Cloud platforms.
- Serve as a trusted technical advisor to customers, providing solutions for complex Cloud and Data-related technical challenges.
- Be a thought leader in architecture design and development of cloud data analytics solutions.
- Liaise with internal and external stakeholders to design optimized data analytics solutions.
- Collaborate with SMEs and Solutions Architects from leading cloud providers to present solutions to customers.
- Support Quantiphi Sales and GTM teams from a technical perspective in building proposals and SOWs.
- Lead discovery and design workshops with potential customers globally.
- Design and deliver thought leadership webinars and tech talks with customers and partners.
- Identify areas for productization and feature enhancement for Quantiphi's product assets.

Qualifications Required:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 14-20 years of experience in technical, solutioning, and analytical roles.
- Strong expertise in building and managing data lakes, data warehouses, data integration, and business intelligence/artificial intelligence solutions on Cloud platforms like GCP, AWS, and Azure.
- Proficiency in various data integration and ETL technologies on Cloud, and in Cloud and On-Premise databases.
- Experience with Cloud solutions for IaaS, PaaS, SaaS, Containers, and Microservices Architecture and Design.
- Knowledge of BI Reporting and Dashboarding tools and of security features in Cloud environments.

Additional Company Details:
While technology is the heart of Quantiphi's business, the company attributes its success to its global and diverse culture built on transparency, diversity, integrity, learning, and growth. Working at Quantiphi provides you with the opportunity to be part of a culture that encourages innovation, excellence, and personal growth, fostering a work environment where you can thrive both professionally and personally. Joining Quantiphi means being part of a dynamic team of tech enthusiasts dedicated to translating data into tangible business value for clients. Flexible remote working options are available to promote productivity and work-life balance.
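A minimal PySpark sketch of the kind of batch pipeline step described above; the bucket paths and column names are hypothetical:

```python
# Minimal PySpark batch step: read raw orders, derive a date, aggregate
# daily revenue, and write a partitioned curated dataset. Paths and column
# names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-revenue").getOrCreate()

orders = spark.read.parquet("gs://example-raw-zone/orders/")

daily_revenue = (
    orders
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"))
)

(daily_revenue.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("gs://example-curated-zone/daily_revenue/"))
```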
Posted 4 days ago
2.0 - 6.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Consultant, you will work closely with internal and external stakeholders and deliver high-quality analytics solutions to the real-world business problems of Pharma commercial organizations. You will bring deep Pharma/Healthcare domain expertise and use cloud data tools to help solve complex problems.

Key Responsibilities:
- Collaborate with internal teams and client stakeholders to deliver Business Intelligence solutions that support key decision-making for the Commercial function of Pharma organizations.
- Leverage deep domain knowledge of pharmaceutical sales, claims, and secondary data to structure and optimize BI reporting frameworks.
- Develop, maintain, and optimize interactive dashboards and visualizations using Tableau (primary), along with other BI tools like Power BI or Qlik, to enable data-driven insights.
- Translate business requirements into effective data visualizations and actionable reporting solutions tailored to end-user needs.
- Write complex SQL queries and work with large datasets housed in data lakes or data warehouses to extract, transform, and present data efficiently (a small windowed-SQL sketch follows this posting).
- Conduct data validation and QA checks, and troubleshoot stakeholder-reported issues by performing root cause analysis and implementing solutions.
- Collaborate with data engineering teams to define data models and KPIs and to automate data pipelines feeding BI tools.
- Manage ad-hoc and recurring reporting needs, ensuring accuracy, timeliness, and consistency of data outputs.
- Drive process improvements in dashboard development, data governance, and reporting workflows.
- Document dashboard specifications and data definitions, and maintain data dictionaries.
- Stay up to date with industry trends in BI tools, visualization best practices, and emerging data sources in the healthcare and pharma space.
- Prioritize and manage multiple BI project requests in a fast-paced, dynamic environment.

Qualifications:
- 2-4 years of experience in BI development, reporting, or data visualization, preferably in the pharmaceutical or life sciences domain.
- Strong hands-on experience building dashboards using Tableau (preferred), Power BI, or Qlik.
- Advanced SQL skills for querying and transforming data across complex data models.
- Familiarity with pharma data such as sales, claims, and secondary market data is a strong plus.
- Experience in data profiling, cleansing, and standardization techniques.
- Ability to translate business questions into effective visual analytics.
- Strong communication skills to interact with stakeholders and present data insights clearly.
- Self-driven, detail-oriented, and comfortable working with minimal supervision in a team-oriented environment.
- Exposure to data warehousing concepts and cloud data platforms (e.g., Snowflake, Redshift, or BigQuery) is an advantage.

Education:
- Bachelor's or Master's degree (computer science, engineering, or other technical disciplines).
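To illustrate the kind of "complex SQL" such reporting involves, here is a self-contained sketch using Python's built-in sqlite3 module with a synthetic sales table (all data invented for the example); the same window-function pattern carries over to warehouse engines like Snowflake or Redshift:

```python
# Self-contained sketch: rank products by sales within each region using a
# window function. Data is synthetic; the SQL pattern mirrors warehouse SQL.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, product TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('North', 'DrugA', 120.0), ('North', 'DrugB', 90.0),
        ('South', 'DrugA', 40.0),  ('South', 'DrugB', 75.0);
""")

query = """
    SELECT region, product, total,
           RANK() OVER (PARTITION BY region ORDER BY total DESC) AS rnk
    FROM (
        SELECT region, product, SUM(amount) AS total
        FROM sales
        GROUP BY region, product
    )
"""
for row in conn.execute(query):
    print(row)  # e.g. ('North', 'DrugA', 120.0, 1)
```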
Posted 4 days ago
4.0 - 10.0 years
0 Lacs
Ghaziabad, Uttar Pradesh
On-site
You will be joining our Corporate IT team as a Data Analytics & Digital Solutions Lead, where your primary role will involve driving the implementation and adoption of Data Lakes and Data Analytics platforms and Digital Transformation initiatives. Your responsibilities will include:
- Leading the design, implementation, and management of Data Lake and Data Analytics platforms to support business decision-making.
- Managing and delivering multiple Data & Analytics implementation projects, ensuring high-quality and timely outcomes.
- Reviewing effort estimations and providing guidance for SAP and in-house software application development projects.
- Collaborating with cross-functional business stakeholders to translate requirements into effective data and analytics solutions.
- Overseeing application development initiatives with exposure to SAP modules and integration with data platforms, and ensuring proper documentation, governance, and compliance for data management.
- Developing impactful presentations and dashboards to communicate insights clearly to senior leadership.
- Providing guidance and mentorship to junior team members on data, analytics, and project delivery best practices.
- Staying current with emerging technologies in Data, Analytics, and Digital Transformation to recommend innovative solutions.

As for the qualifications required for this role:
- Bachelor of Technology (B.Tech) degree in Computer Science, Information Technology, or a related field.
- 8-10 years of overall IT & Digital experience, with 4-5 years in Data & Analytics platform implementation.
- Hands-on expertise with Data Lakes, Data Analytics platforms, and data integration technologies.
- Strong working knowledge of SAP modules, effort estimation, and application development.

Additionally, it is preferred if you have:
- A proven track record of leading Data & Analytics implementations in large organizations.
- Advanced knowledge of modern data architectures, cloud platforms, and visualization tools.
- Experience in FMCG or related industries.

Your soft skills and competencies should include:
- Strong leadership and team management abilities.
- Excellent communication and presentation skills, with proficiency in MS PowerPoint.
- Ability to collaborate effectively with business stakeholders and cross-functional teams.
- A strong analytical and problem-solving mindset with attention to detail.
- Adaptability to fast-paced, dynamic environments.

We are an Equal Opportunity Employer and believe in the importance of a diverse workforce to cater to the business environment we operate in.
Posted 5 days ago
10.0 - 15.0 years
40 - 80 Lacs
Hyderabad
Work from Office
We are looking for an experienced Vice President – Big Data to lead our data engineering and analytics initiatives. The ideal candidate will have a strong background in building and scaling big data platforms and in managing large teams.
Posted 5 days ago
12.0 - 18.0 years
0 Lacs
Pune, Maharashtra, India
On-site
At Capgemini Engineering, the world leader in engineering services, we bring together a global team of engineers, scientists, and architects to help the world's most innovative companies unleash their potential. From autonomous cars to life-saving robots, our digital and software technology experts think outside the box as they provide unique R&D and engineering services across all industries. Join us for a career full of opportunities. Where you can make a difference. Where no two days are the same.

Job Description
We are seeking a seasoned Delivery Manager and Solution Architect to lead the design, delivery, and optimization of Manufacturing Execution Systems (MES) and Industry 4.0 solutions, with a strong emphasis on ISA-95 standards, Pharma 4.0 principles, and industrial control systems. This role requires a strategic thinker with deep technical expertise, operational insight, and leadership capabilities to drive digital transformation in regulated manufacturing environments.

Key Responsibilities:
- Lead end-to-end delivery of MES and Industry 4.0 projects across global pharmaceutical and industrial clients.
- Architect scalable, compliant, and future-ready MES solutions aligned with ISA-95 and GAMP standards.
- Define and implement Pharma 4.0 strategies, integrating IT/OT systems, automation layers, and cloud platforms.
- Collaborate with cross-functional teams including automation, IT, quality, and operations to ensure seamless solution delivery.
- Oversee project governance, risk management, and stakeholder communication.
- Drive performance assessment and continuous improvement using operational KPIs and management indicators.
- Ensure compliance with regulatory requirements (e.g., 21 CFR Part 11, GxP) and cybersecurity standards.

Primary Skills (Must-Have):
- Deep knowledge of ISA-95 architecture and its application in MES and industrial automation.
- Proven experience in MES platforms (e.g., Werum PAS-X, Siemens Opcenter, Rockwell PharmaSuite, Apriso).
- Strong understanding of Pharma 4.0 frameworks and digital maturity models.
- Expertise in industrial control systems, PLC/SCADA integration, and equipment connectivity.
- Experience in performance management, OEE, and real-time production monitoring (a small OEE calculation follows this posting).
- Familiarity with mechatronics, industrial equipment, and shop floor operations.

Secondary Skills (Good-to-Have):
- Exposure to cloud-based MES, IoT platforms, and edge computing.
- Knowledge of data lakes, data historians, and analytics platforms.
- Experience with AI/ML for predictive maintenance and quality analytics.
- Understanding of IT/OT convergence, cybersecurity, and network architecture.

Qualifications:
- Bachelor's or Master's degree in Engineering, Mechatronics, Computer Science, or related fields.
- 12-18 years of experience in MES, industrial automation, or digital manufacturing.
- Certifications in ISA, GAMP, PMP, or ITIL are a plus.

Soft Skills:
- Strong leadership and team management capabilities.
- Excellent communication, stakeholder engagement, and presentation skills.
- Strategic thinking with a focus on innovation and operational excellence.
- Ability to manage complex, multi-site, and multi-vendor project environments.

Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries.
With its strong heritage of over 55 years, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, generative AI, cloud, and data, combined with its deep industry expertise and partner ecosystem.
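For reference, OEE (Overall Equipment Effectiveness), mentioned in the must-have skills, is conventionally the product of availability, performance, and quality. A tiny worked example in Python, with all figures invented:

```python
# OEE = availability x performance x quality (standard convention).
# All production figures below are invented for illustration.
planned_time_min = 480          # one shift
downtime_min = 45
ideal_cycle_time_min = 0.5      # per unit
units_produced = 700
good_units = 665

availability = (planned_time_min - downtime_min) / planned_time_min
performance = (ideal_cycle_time_min * units_produced) / (planned_time_min - downtime_min)
quality = good_units / units_produced

oee = availability * performance * quality
print(f"Availability={availability:.2%} Performance={performance:.2%} "
      f"Quality={quality:.2%} OEE={oee:.2%}")  # OEE comes out around 69%
```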
Posted 6 days ago
8.0 - 12.0 years
0 Lacs
Karnataka
On-site
You are a highly skilled Snowflake Developer with extensive experience in designing, implementing, and managing Snowflake-based data solutions. Your role will involve developing data architectures and ensuring the effective use of Snowflake to drive business insights and innovation.

Your key responsibilities will include designing and implementing scalable, efficient, and secure Snowflake solutions to meet business requirements. You will also develop data architecture frameworks, standards, and principles, including modeling, metadata, security, and reference data. In addition, you will implement Snowflake-based data warehouses, data lakes, and data integration solutions, and manage data ingestion, transformation, and loading processes to ensure data quality and performance (a hedged loading sketch follows this posting). Collaboration with business stakeholders and IT teams to develop data strategies and ensure alignment with business goals will be crucial. Furthermore, you will drive continuous improvement by leveraging the latest Snowflake features and industry trends.

To qualify for this role, you should have a Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field, along with 8+ years of experience in data architecture, data engineering, or a related field. You must possess extensive experience with Snowflake, including designing and implementing Snowflake-based solutions. Experience working in Airflow is also required, along with a proven track record of contributing to data projects and working in complex environments. Familiarity with cloud platforms (e.g., AWS, GCP) and their data services is preferred. A Snowflake certification (e.g., SnowPro Core, SnowPro Advanced) would be a plus.
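For flavor, a minimal sketch of a Snowflake load step using the snowflake-connector-python package; the account, credentials, stage, and table names are all placeholders:

```python
# Minimal Snowflake ingestion sketch: connect and COPY staged Parquet files
# into a table. Account, credentials, and object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.eu-west-1",   # placeholder account locator
    user="LOADER_USER",
    password="***",                # use a secrets manager in practice
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)
try:
    cur = conn.cursor()
    cur.execute("""
        COPY INTO raw_orders
        FROM @landing_stage/orders/
        FILE_FORMAT = (TYPE = PARQUET)
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """)
    print(cur.fetchall())  # per-file load results
finally:
    conn.close()
```

In production this step would typically be wrapped in an Airflow task, which matches the Airflow exposure the listing asks for.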
Posted 6 days ago
14.0 - 18.0 years
0 Lacs
Karnataka
On-site
We are looking for an Assistant Vice President (AVP) Databricks Squad Delivery Lead with at least 14 years of experience to join the team in Bangalore/Hyderabad/NCR/Kolkata/Mumbai/Pune. As the Databricks Squad Delivery Lead, you will be responsible for overseeing project delivery, team leadership, architecture reviews, and client engagement. Your role will involve optimizing Databricks implementations across cloud platforms such as AWS, Azure, and GCP, while leading cross-functional teams.

Your key responsibilities will include leading and managing end-to-end delivery of Databricks-based solutions, serving as a subject matter expert (SME) for Databricks architecture, collaborating with architects and engineers to design scalable data pipelines and analytics platforms, overseeing Databricks workspace setup, performance tuning, and cost optimization, acting as the primary point of contact for client stakeholders, driving innovation within the team by implementing best practices, tools, and technologies, and ensuring alignment between business goals and technical solutions.

The ideal candidate must have a Bachelor's degree in Computer Science, Engineering, or equivalent (Master's or MBA preferred), hands-on experience delivering data engineering/analytics projects using Databricks, experience managing cloud-based data pipelines on AWS, Azure, or GCP, strong leadership skills, and excellent client-facing communication. Preferred skills include proficiency with Spark, Delta Lake, MLflow, and distributed computing; expertise in data engineering concepts (ETL, data lakes, data warehousing); certifications in Databricks or cloud platforms (AWS/Azure/GCP); and Agile/Scrum or PMP certification as an added advantage.
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Senior Tech Lead at Barclays, you will be responsible for designing, developing, and enhancing software solutions using various engineering methodologies to deliver business, platform, and technology capabilities for both customers and colleagues. Your role will involve working with ETL/API solutions utilizing Ab Initio with a massively parallel processing architecture, along with hands-on experience in AWS, Glue, PySpark, and real-time processing (a hedged Glue sketch follows this posting). You should possess a strong understanding of ETL design patterns and data warehouse architecture, and the data analytics skills to develop and maintain complex datasets using SQL tools. Additionally, you will be tasked with building and maintaining data architecture pipelines, implementing data warehouses and data lakes, and developing processing and analysis algorithms to handle data complexity and volumes effectively.

Highly valued skills for this role include risk mitigation, stakeholder management, exceptional communication, exposure to AWS, and the application of advanced analytical techniques such as machine learning and AI. Your success in this role will be assessed on key critical skills: risk and controls, change and transformation, business acumen, strategic thinking, digital and technology expertise, and job-specific technical skills. This position is based in Pune.

Your key accountabilities will include:
- Developing and delivering high-quality software solutions by utilizing industry-aligned programming languages, frameworks, and tools to ensure scalability, maintainability, and performance optimization.
- Collaborating cross-functionally with product managers, designers, and engineers to define software requirements, devise solution strategies, and align with business objectives.
- Participating in code reviews, promoting code quality, and fostering a culture of knowledge sharing.
- Staying updated on industry technology trends, contributing to technology communities, and adhering to secure coding practices and effective unit testing.

Assistant Vice President expectations:
- Advising and influencing decision-making, contributing to policy development, and ensuring operational effectiveness.
- Leading a team to deliver impactful work that affects the entire business function, setting objectives, coaching employees, and fostering a culture of technical excellence.
- Demonstrating leadership behaviours that create an environment for colleagues to thrive and deliver to a consistently excellent standard.

For individual contributors:
- Leading collaborative assignments, guiding team members, identifying the need for specialized areas, and driving projects to meet required outcomes.
- Consulting on complex issues, mitigating risks, and developing new policies and procedures to support control and governance.
- Engaging in complex data analysis, solving problems creatively, and effectively communicating complex information to stakeholders.

All colleagues at Barclays are expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, as well as the Barclays Mindset to Empower, Challenge, and Drive.
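As a small sketch of the AWS Glue work mentioned above, here is how a scheduled ETL job might be triggered and monitored with boto3; the job name and region are hypothetical:

```python
# Minimal boto3 sketch: start an AWS Glue job run and poll its state.
# The job name and region are hypothetical.
import time
import boto3

glue = boto3.client("glue", region_name="eu-west-2")

run_id = glue.start_job_run(JobName="nightly-orders-etl")["JobRunId"]

while True:
    run = glue.get_job_run(JobName="nightly-orders-etl", RunId=run_id)
    state = run["JobRun"]["JobRunState"]
    print("Glue job state:", state)
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        break
    time.sleep(30)  # poll every 30 seconds until the run finishes
```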
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
Karnataka
On-site
You are an experienced Data Architect who will be responsible for leading the transformation of enterprise data solutions, particularly focused on migrating Alteryx workflows into Azure Databricks. Your expertise in the Microsoft Azure ecosystem, including Azure Data Factory, Databricks, Synapse Analytics, and Microsoft Fabric, together with a strong background in data architecture, governance, and distributed computing, will be crucial for this role. Your strategic thinking and hands-on architectural leadership will ensure the development of scalable, secure, and high-performance data solutions.

Your key responsibilities will include defining the migration strategy for transforming Alteryx workflows into scalable, cloud-native data solutions on Azure Databricks (a hedged sketch of such a translation follows this posting). You will architect end-to-end data frameworks leveraging Databricks, Delta Lake, Azure Data Lake, and Synapse, while establishing best practices, standards, and governance frameworks for pipeline design, orchestration, and data lifecycle management. Collaborating with business stakeholders, guiding engineering teams, overseeing data quality, lineage, and security compliance, and driving CI/CD adoption for Azure Databricks will also be part of your role. Furthermore, you will provide architectural leadership, design reviews, and mentorship to engineering and analytics teams. Optimizing solutions for performance, scalability, and cost-efficiency within Azure, participating in enterprise architecture forums, and influencing data strategy across the organization are also expected of you.

To be successful in this role, you should have at least 10 years of experience in data architecture, engineering, or solution design. Proven expertise in Alteryx workflows and their modernization into Azure Databricks, deep knowledge of the Microsoft Azure data ecosystem, a strong background in data governance, lineage, security, and compliance frameworks, and proficiency in Python, SQL, and Apache Spark are essential. Excellent leadership, communication, and stakeholder management skills are also required.

Preferred qualifications include Microsoft Azure certifications, experience leading large-scale migration programs or modernization initiatives, familiarity with enterprise architecture frameworks, exposure to machine learning enablement on Azure Databricks, and an understanding of Agile delivery and working in multi-disciplinary teams.
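As a loose sketch of what an Alteryx-to-Databricks translation can look like, here a typical filter/join/summarize workflow is re-expressed as PySpark writing a Delta table; all table names are hypothetical:

```python
# Hypothetical re-expression of an Alteryx filter -> join -> summarize
# workflow as PySpark on Databricks, writing a Delta table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided on Databricks clusters

orders = spark.read.table("raw.orders")          # was: Input Data tool
customers = spark.read.table("raw.customers")

summary = (
    orders.filter(F.col("status") == "SHIPPED")  # was: Filter tool
    .join(customers, "customer_id")              # was: Join tool
    .groupBy("segment")                          # was: Summarize tool
    .agg(F.sum("amount").alias("revenue"))
)

summary.write.format("delta").mode("overwrite").saveAsTable("curated.revenue_by_segment")
```

Mapping each Alteryx tool to a DataFrame operation like this is one common, incremental migration pattern; governance and lineage then come from the platform rather than the workflow file.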
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
Punjab
On-site
This is a unique opportunity to be involved in delivering leading-edge business analytics using the latest cutting-edge BI tools, such as cloud-based databases, self-service analytics, and leading visualization tools, enabling the company's aim to become a fully digital organization.

Responsibilities:
- Development and creation of high-quality analysis and reporting solutions.
- Development of ETL routines using SSIS and ADF into Azure SQL Data Warehouse.
- Integration of global applications such as D365, Salesforce, and Workday into the cloud data warehouse.
- Experience using D365 ERP systems and creating BI reports.
- Development of SQL procedures and SQL views to support data loads as well as data visualization.
- Development of data mapping tables and routines to join data from multiple different data sources.
- Performance tuning SQL queries to speed up data loading time and query execution.
- Liaising with business stakeholders, gathering requirements, and delivering appropriate technical solutions to meet the business needs.

Requirements:
- Strong communication skills and ability to turn business requirements into technical solutions.
- Experience in developing data lakes and data warehouses using Microsoft Azure.
- Demonstrable experience designing high-quality dashboards using Tableau and Power BI.
- Strong database design skills, including an understanding of both normalized-form and dimensional-form databases.
- In-depth knowledge and experience of data warehousing strategies and techniques, e.g., Kimball data warehousing.
- Experience in Azure Analysis Services and DAX (Data Analysis Expressions).
- Experience in cloud-based data integration tools like Azure Data Factory or SnapLogic.
- Experience in Azure DevOps is a plus.
- Familiarity with agile development techniques and objectives.
- Excellent communication skills.
- Over 5 years of experience in Business Intelligence Analyst or Developer roles.
- Over 4 years of experience using Microsoft Azure for data warehousing.
- Experience in designing and performance-tuning data warehouses and data lakes.
- 4+ years of extensive experience developing data models and dashboards using Tableau/Power BI within an IT department.
- Self-starter who can manage own direction and projects.
- Being delivery-focused with a can-do attitude in a sometimes-challenging environment is essential.
- Experience using Tableau to visualize data held in SQL Server.
- Experience working with finance data is highly desirable.

Benefits: Flat job hierarchy, flexible work hours, casual dress code, disciplined workplace environment, team cohesion, task autonomy, smooth reporting, extensive training, POSH Act compliance for inclusivity, flexible leave policy, monthly paid leaves, bonus and incentives, team appreciation, individual awards, performance rewards, and salary appraisals.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
Join our team as a Data Engineer, where you will develop and deliver comprehensive data solutions to meet business requirements. As a member of our dynamic team, your responsibilities will include creating and optimizing data pipelines and data models for data lakes and data warehouses serving specific microservices. The role encompasses responsibility for data ingress into the data lake/DWH, ensuring that content, format, and integrity are preserved throughout the data lifecycle.

You will develop, test, and enhance data solution components and design detailed data solution interfaces and integration specifications. You will propose implementation strategies based on designed solutions, estimating effort requirements and guiding data pipeline creation. You will perform configuration, code walkthroughs, peer reviews, defect tracking, and unit testing of data solution components; verify data solutions after deployment and provide necessary maintenance and support; carry out operational tasks including data loading, batch job scheduling, and monitoring; analyze incidents to identify necessary corrections and areas for improvement; ensure solution data quality, conduct corrective actions, and devise improvement plans; support software deployment and integration into the target environment; and troubleshoot issues during system and integration testing, UAT, and production.

The skills you bring include expertise in Data Lakes, Ericsson Operations Engine - Mode Of Operations, Software Deployment, User Acceptance Testing (UAT), Business Data Requirement, Statistical Graphics, Design Specifications, Data Quality, Data Solution Roadmap Support, Data Maintenance, Data Pipeline, Data Modeling, Software Testing, Requirements Specifications, and ITIL.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
As a Data Engineering Manager, you will be responsible for leading and overseeing a team of data engineers. Your role will involve defining the overall data architecture strategy, building scalable data pipelines, ensuring data quality and governance, collaborating with cross-functional teams to drive data-driven decision making, and managing the technical roadmap for data engineering initiatives within the organization. To excel in this role, you will need extensive experience in data engineering technologies, strong leadership skills, and a deep understanding of business needs to translate them into technical solutions.

Your key responsibilities will include:

Strategic Leadership:
- Developing and executing a comprehensive data engineering strategy that aligns with business objectives, encompassing data architecture, data pipelines, and data governance policies.
- Defining the long-term vision for data infrastructure, identifying opportunities for modernization and innovation.
- Collaborating with senior stakeholders across departments to understand data needs and translate them into actionable data engineering plans.

Team Management:
- Building, leading, and mentoring a high-performing team of data engineers, including recruiting, performance management, and career development.
- Fostering a culture of collaboration, innovation, and continuous improvement within the data engineering team.
- Establishing clear technical standards, best practices, and coding guidelines for data engineering projects.

Technical Execution:
- Designing and implementing scalable data pipelines for data ingestion, transformation, and loading (ETL/ELT) across various data sources.
- Architecting and managing data warehouses, data lakes, and other data storage solutions on cloud platforms such as AWS, Azure, and GCP.
- Overseeing the development and maintenance of data quality monitoring systems to ensure data accuracy and reliability.

In this role, you will play a crucial part in shaping the organization's data engineering landscape, driving innovation, and enabling data-driven decision making. Your leadership and technical expertise will be instrumental to the success of data engineering initiatives and the overall achievement of business objectives.
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
karnataka
On-site
Genpact is a global professional services and solutions firm with over 125,000 employees in 30+ countries, driven by curiosity, agility, and the desire to create lasting value for clients. We serve leading enterprises, including the Fortune Global 500, with deep business knowledge, digital operations services, and expertise in data, technology, and AI.

We are inviting applications for the position of Vice President, Databricks Squad Delivery Lead. The successful candidate will oversee the end-to-end delivery of Databricks-based solutions for clients, ensuring successful implementation, optimization, and scaling of big data and analytics solutions. This role involves driving the adoption of Databricks as the preferred platform for data engineering and analytics, while managing a team of data engineers and developers.

Responsibilities:
- Lead and manage Databricks-based project delivery, ensuring solutions meet client requirements, best practices, and industry standards.
- Act as the subject matter expert on Databricks, providing guidance on architecture, implementation, and optimization.
- Collaborate with architects and engineers to design optimal solutions for data processing, analytics, and machine learning workloads.
- Serve as the primary point of contact for clients, ensuring alignment between business requirements and technical delivery.
- Maintain effective communication with stakeholders, providing regular updates on project status, risks, and achievements.
- Oversee the setup, deployment, and optimization of Databricks workspaces, clusters, and pipelines.
- Ensure Databricks solutions are cost- and performance-optimized, using best practices for data storage, processing, and querying (an illustrative maintenance example follows below).
- Continuously evaluate the effectiveness of the Databricks platform and processes, suggesting improvements to enhance delivery efficiency and effectiveness.
- Drive innovation within the team, introducing new tools, technologies, and best practices to improve delivery quality.

Qualifications we seek in you:

Minimum Qualifications/Skills:
- Bachelor's degree in Computer Science, Engineering, or a related field (Master's or MBA preferred).
- Relevant years of experience in IT services with a focus on Databricks and cloud-based data engineering.

Preferred Qualifications/Skills:
- Proven experience in leading end-to-end delivery of data engineering or analytics solutions on Databricks.
- Strong experience in cloud technologies (AWS, Azure, GCP), data pipelines, and big data tools.
- Hands-on experience with Databricks, Spark, Delta Lake, MLflow, and related technologies.
- Expertise in data engineering concepts, including ETL, data lakes, data warehousing, and distributed computing.

Preferred Certifications:
- Databricks Certified Associate or Professional.
- Cloud certifications (AWS Certified Solutions Architect, Azure Data Engineer, or equivalent).
- Certifications in data engineering, big data technologies, or project management (e.g., PMP, Scrum Master).

Job Details:
- Job Title: Vice President
- Primary Location: India-Bangalore
- Schedule: Full-time
- Education Level: Bachelor's/Graduation/Equivalent
- Job Posting Date: Jan 30, 2025, 3:09:32 AM
- Unposting Date: Mar 1, 2025, 12:29:00 PM
- Master Skills List: Digital
- Job Category: Full Time
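As a hedged example of the cost and performance optimization practices named above: a delivery lead might standardize routine Delta Lake maintenance like the following. The table name and Z-order column are hypothetical, and the commands assume a Databricks (or Delta Lake 2.x+) runtime where a `spark` session is available.

```python
# Hedged illustration: routine Delta Lake maintenance to keep Databricks
# costs and query latency down. Table name and Z-order column are invented;
# `spark` is the SparkSession that Databricks notebooks and jobs provide.
spark.sql("OPTIMIZE curated.orders_enriched ZORDER BY (customer_id)")  # compact small files, co-locate hot keys
spark.sql("VACUUM curated.orders_enriched RETAIN 168 HOURS")           # remove stale files beyond a 7-day window
```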
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
karnataka
On-site
We are seeking a Senior Data Architect with over 7 years of experience in the field, specifically in data architecture roles. As a Senior Data Architect, your responsibilities will involve designing and implementing scalable, secure, and cost-effective data architectures on Google Cloud Platform (GCP). You will play a key role in leading the design and development of data pipelines using tools such as BigQuery, Dataflow, and Cloud Storage. Additionally, you will architect and implement data lakes, data warehouses, and real-time data processing solutions on GCP. It will be your duty to ensure that the data architecture is aligned with business objectives, governance, and compliance requirements, and to collaborate with stakeholders to define the data strategy and roadmap.

You will design and deploy BigQuery solutions for optimized performance and cost efficiency, and build and maintain ETL/ELT pipelines for large-scale data processing (an illustrative example follows below). You will use Cloud Pub/Sub, Dataflow, and Cloud Functions for real-time data integration, and implement best practices for data security, privacy, and compliance in cloud environments. Integrating machine learning workflows with data pipelines and analytics tools will also be within your scope of work.

Your expertise will be crucial in defining data governance frameworks and managing data lineage. You will lead data modeling efforts to ensure consistency, accuracy, and performance across systems; optimize cloud infrastructure for scalability, performance, and reliability; mentor junior team members; and ensure adherence to architectural standards. You will collaborate with DevOps teams on the implementation of Infrastructure as Code (Terraform, Cloud Deployment Manager), ensure high availability and disaster recovery solutions are integrated into data systems, conduct technical reviews, audits, and performance tuning for data solutions, and design multi-region and multi-cloud data architecture solutions. Staying current with emerging technologies and trends in data engineering and GCP will be crucial to driving innovation in data architecture, including recommending new tools and services on GCP.

Preferred qualifications include a Google Cloud certification. Primary skills: 7+ years of data architecture experience; expertise in BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, and related GCP services; strong proficiency in SQL, Python, or other data processing languages; experience in cloud security, data governance, and compliance frameworks; and strong problem-solving skills with the ability to architect solutions for complex data environments. Leadership experience and excellent communication and collaboration skills are also highly valued.

Role: Senior Data Architect
Location: Trivandrum/Bangalore
Close Date: 14-03-2025
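By way of a hedged sketch of the ETL/ELT pipeline work described above: a batch load from Cloud Storage into a partitioned BigQuery table, using the google-cloud-bigquery client, might look like this. The project, dataset, bucket, and partition field are assumptions invented for the example.

```python
# Hedged illustration: loading a partitioned BigQuery table from Cloud Storage.
# Project, dataset, bucket, and partition field names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    time_partitioning=bigquery.TimePartitioning(field="event_date"),
)

load_job = client.load_table_from_uri(
    "gs://my-raw-bucket/events/dt=2025-01-01/*.parquet",
    "my-analytics-project.analytics.events",
    job_config=job_config,
)
load_job.result()  # block until the load job completes
```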
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
kolkata, west bengal
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
We're looking for a senior expert in data analytics to create and manage large BI and analytics solutions using visualization tools such as OBIEE/OAC that turn data into knowledge. In this role, you should have a background in data and business analysis. You should be analytical and an excellent communicator. Business acumen and problem-solving aptitude would be a plus.

Your key responsibilities
- Work as a team member and lead, contributing to the various technical streams of OBIEE/OAC implementation projects.
- Provide product- and design-level technical best practices.
- Interface and communicate with the onsite coordinators.
- Complete assigned tasks on time and report status regularly to the lead.

Skills and attributes for success
- Use an issue-based approach to deliver growth, market, and portfolio strategy engagements for corporates.
- Strong communication, presentation, and team-building skills, with experience producing high-quality reports, papers, and presentations.
- Exposure to BI and other visualization tools in the market.
- Building a quality culture.
- Fostering teamwork.
- Participating in organization-wide people initiatives.

To qualify for the role, you must have
- BE/BTech/MCA/MBA with adequate industry experience.
- Around 3 to 7 years of experience in OBIEE/OAC.
- Experience working with OBIEE and OAC end-to-end implementations.
- Understanding of ETL/ELT processes using tools like Informatica/ODI/SSIS.
- Knowledge of reporting, dashboards, and RPD logical modeling.
- Experience with BI Publisher.
- Experience with Agents.
- Experience in security implementation in OAC/OBIEE.
- Ability to manage self-service data preparation, data sync, and data flows, and to work with curated data sets.
- Ability to manage connections to multiple data sources - cloud and non-cloud - using the various data connectors available with OAC.
- Experience creating pixel-perfect reports and managing catalog content, dashboards, prompts, and calculations.
- Ability to create data sets, map layers, multiple data visualizations, and stories in OAC.
- Good understanding of various data models, e.g., snowflake schemas, data marts, star schemas, data lakes, etc.
- Excellent written and verbal communication.
- Cloud experience is an added advantage.
- Experience migrating on-premise OBIEE to Oracle Analytics in the cloud.
- Knowledge of and working experience with Oracle Autonomous Database.
- Strong knowledge of DWH concepts.
- Strong data modeling skills.
- Familiarity with Agile and Waterfall SDLC processes.
- Strong SQL/PLSQL with analytical skills.

Ideally, you'll also have
- Experience in the Insurance and Banking domains.
- A strong hold on project delivery and team management.
- Excellent written and verbal communication skills.
Posted 1 week ago
15.0 - 19.0 years
0 Lacs
pune, maharashtra
On-site
If you are seeking further opportunities to advance your career, you can take the next step in realizing your potential by joining HSBC. HSBC is a global banking and financial services organization operating in 62 countries and territories. The organization's goal is to be present where growth occurs, supporting businesses to thrive, economies to prosper, and individuals to achieve their aspirations.

Currently, HSBC is looking for an experienced professional to join the team in the position of Data Technology Lead, focusing on Data Privacy. The role is open in Pune or Hyderabad. As a Data Technology Lead, you will be responsible for driving the strategy, engineering, and governance of Data Privacy technology within the CTO Data Technology function. Your role is crucial in ensuring the bank's compliance with complex global data privacy regulations and its commitment to customer trust through scalable, automated, and resilient technology solutions.

Your main responsibilities will include designing and executing enterprise-wide capabilities for classifying, protecting, governing, and monitoring personal and sensitive data throughout its lifecycle, from data discovery to secure deletion. This will involve integrating solutions across data platforms, operational systems, and third-party systems. You will lead cross-functional teams and collaborate closely with Group Privacy, Legal, Risk, Cybersecurity, and business-aligned CTOs to implement privacy-by-design practices across platforms and establish robust data protection standards. This leadership role is highly impactful, requiring expertise in privacy technologies, platform engineering, control automation, and global compliance.

Key Responsibilities:
- Define and lead the enterprise strategy for Data Privacy Technology across different data environments.
- Design and implement technology capabilities to support privacy compliance frameworks such as GDPR.
- Lead the development and integration of solutions for data classification, consent management, access controls, data subject rights fulfillment, data retention, and disposal (an illustrative pseudonymization sketch follows below).
- Govern and oversee the privacy tooling landscape, ensuring robust metadata management and control enforcement.
- Collaborate with Group CPO, Legal, and Compliance to translate regulatory mandates into implementable technology solutions.
- Embed privacy-by-design principles into data pipelines, software development lifecycles, and DevSecOps practices.
- Drive enterprise-wide adoption and monitor privacy controls' effectiveness using metrics, dashboards, and automated audits.
- Lead engineering teams to deliver scalable services with resilience, performance, and observability.
- Participate in regulatory engagements, internal audits, and risk forums related to data privacy controls.
- Cultivate a high-performing team culture and advocate for privacy as a core design principle.

Requirements:
- 15+ years of relevant experience in enterprise data or technology roles, including senior leadership in data privacy, security, or compliance engineering.
- Deep expertise in data privacy technologies, both product-based and built in-house.
- Strong knowledge of global privacy regulations and frameworks, such as GDPR.
- Technical proficiency in data discovery, classification, masking, encryption, and access control enforcement.
- Understanding of metadata-driven architectures and control automation.
- Experience with hybrid data estates, including data lakes and multi-cloud environments.
- Proven track record of partnering with legal, compliance, and cybersecurity teams to implement privacy programs aligned with business needs.
- Ability to lead multi-regional engineering teams, make architectural decisions, and deliver at scale.
- Experience with platform monitoring, policy enforcement, and control assurance frameworks.

Join HSBC to achieve more in your career. For more information and to explore opportunities, visit www.hsbc.com/careers. Please note that personal data provided by applicants will be handled in accordance with the Bank's Privacy Statement, available on the website.
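Purely as a hedged illustration of the masking and pseudonymization controls described above: a privacy-by-design pipeline step might replace direct identifiers with salted digests before data lands in the curated zone. Column names and the salting approach are invented; a real deployment would source the salt or key from a secrets manager.

```python
# Hedged illustration: pseudonymizing direct identifiers in a PySpark pipeline
# step. Column names and the salt handling are hypothetical; real deployments
# would fetch the salt/key from a secrets manager, not a literal.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("privacy-by-design-sketch").getOrCreate()

customers = spark.read.table("raw.customers")

pseudonymized = (
    customers
    .withColumn("email_hash", F.sha2(F.concat(F.col("email"), F.lit("<per-env-salt>")), 256))
    .withColumn("phone_hash", F.sha2(F.concat(F.col("phone"), F.lit("<per-env-salt>")), 256))
    .drop("email", "phone")  # direct identifiers never reach the curated zone
)

pseudonymized.write.format("delta").mode("overwrite").saveAsTable("curated.customers_pseudonymized")
```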
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
You will be joining a team at PwC that specializes in data and analytics engineering, using advanced technologies and techniques to create strong data solutions for clients. Your role will involve transforming raw data into actionable insights, supporting informed decision-making and fostering business growth. In the field of intelligent automation, your focus will be on process mining, designing automation solutions, and implementing process automation, robotic automation, and digital workflow solutions to enhance operational efficiency and reduce costs for clients.

With a requirement of at least 8 years of hands-on experience, your responsibilities will include leading and overseeing a team of software engineers working on advanced software solutions for GenAI projects. You will collaborate with senior leadership and cross-functional teams to gather business requirements, identify areas for technological enhancement, and ensure alignment with organizational objectives. Designing event-driven architectures for real-time data processing, utilizing containerization technologies like Kubernetes, managing data lakes effectively, and promoting Python as the primary programming language are key aspects of your role.

Moreover, you will facilitate collaboration between software engineers, data scientists, data engineers, and DevOps teams to ensure seamless integration and deployment of GenAI models. Keeping abreast of GenAI technologies, translating complex business needs into technical solutions, documenting software engineering processes, and promoting a culture of excellence will be crucial tasks. Continuous professional development of the team, through acquiring new solution architecture certifications and adhering to industry best practices, will also be part of your responsibilities. The ideal candidate will possess a degree in BE/B.Tech/MCA/M.Sc/M.E/M.Tech/MBA along with a strong background in software engineering and solution architecture.
Posted 1 week ago
4.0 - 6.0 years
0 Lacs
mumbai, maharashtra, india
Remote
About the company: Netenrich boosts the effectiveness of organizations' security and digital operations so they can avoid disruption and manage risk. Resolution Intelligence Cloud™ is our native-cloud data analytics platform for enterprises and service providers that need highly scalable, multitenant security operations and/or digital operations management. Resolution Intelligence Cloud transforms security and operations data into intelligence that organizations can act on before critical issues occur. More than 3,000 customers and managed service providers rely on Netenrich to deliver secure operations at scale.

Job Title: Implementation Engineer
Years of Experience: Relevant 4+ years
Work Location: Mumbai (Remote)

Job Summary: We are seeking a skilled and experienced Cybersecurity Implementation Engineer with expertise in custom parser development, Yara rule creation, playbook implementation, and data ingestion techniques. This role presents an exciting opportunity to contribute to the design and implementation of cutting-edge cybersecurity solutions while collaborating with a talented team of professionals.

Responsibilities:
- Develop custom parsers to extract and normalize data from diverse sources, including logs, network traffic, and endpoint data.
- Design, develop, and maintain Yara rules for threat detection and malware analysis, ensuring high accuracy and effectiveness (an illustrative example follows after this posting).
- Create and implement playbook automation to streamline incident response processes and improve operational efficiency.
- Design and implement data ingestion pipelines to collect, process, and analyze large volumes of security data from various sources.
- Collaborate with cross-functional teams to understand customer requirements and customize cybersecurity solutions to meet their needs.
- Conduct research and analysis to identify emerging threats and vulnerabilities, and develop proactive detection mechanisms.
- Participate in security incident response activities, providing technical expertise and support as needed.
- Stay abreast of the latest cybersecurity trends, technologies, and best practices, and share knowledge with the team.
- Work closely with customers to understand their security challenges and requirements, and provide expert guidance and support.

Qualifications:
- Bachelor's degree in Computer Science, Information Security, or a related field.
- 4 years of experience in cybersecurity, with a focus on implementation.
- Strong expertise in developing custom parsers for log and data normalization.
- Proficiency in creating and maintaining Yara rules for threat detection and malware analysis.
- Experience designing and implementing playbook automation using tools such as Demisto, Phantom, or similar platforms.
- Solid understanding of data ingestion techniques and technologies, including log management systems and data lakes.
- Hands-on experience with SIEM (Security Information and Event Management) solutions such as Splunk, ELK, or QRadar.
- Excellent analytical and problem-solving skills, with the ability to troubleshoot complex technical issues.
- Strong communication and interpersonal skills, with the ability to collaborate effectively with internal teams and customers.
- Relevant cybersecurity certifications (e.g., CISSP, CEH, GIAC) are a plus.

If you are a passionate and driven cybersecurity professional with expertise in custom parser development, Yara rule creation, playbook implementation, and data ingestion techniques, we want to hear from you.
Join us in our mission to protect our organization and our customers from cyber threats. If your profile matches the above requirements, kindly share your updated resume at [HIDDEN TEXT].
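As a hedged illustration of the Yara rule work this posting describes: a minimal rule can be compiled and run with the yara-python package as sketched below. The rule, its strings, and the scanned file path are invented examples, not production detections.

```python
# Hedged illustration: compiling and running a minimal YARA rule with the
# yara-python package. The rule, strings, and file path are invented examples.
import yara

RULE_SOURCE = r"""
rule Suspicious_PowerShell_Downloader
{
    meta:
        description = "Example: flags PowerShell download-and-execute one-liners"
    strings:
        $ps = "powershell" nocase
        $dl = "DownloadString" nocase
        $iex = "IEX" nocase
    condition:
        $ps and ($dl or $iex)
}
"""

rules = yara.compile(source=RULE_SOURCE)
matches = rules.match(filepath="sample.ps1")  # or rules.match(data=some_bytes)
for m in matches:
    print(m.rule, m.tags)
```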
Posted 1 week ago