
211 Data Lakes Jobs - Page 6

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

10.0 - 12.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Key Job Responsibilities: VOC - VI (Vulnerability Intelligence), ASM (Attack Surface Management) & VM (Vulnerability Management) Expert.

Environment / Context: Saint-Gobain, world leader in the habitat and construction market, is one of the top 100 global industrial groups. Saint-Gobain is present in 68 countries with 171,000 employees. They design, manufacture and distribute materials and solutions which are key ingredients in the wellbeing of each of us and the future of all. They can be found everywhere in our living places and our daily life: in buildings, transportation, infrastructure and in many industrial applications. They provide comfort, performance and safety while addressing the challenges of sustainable construction, resource efficiency and climate change. The Saint-Gobain GDI Groupe (250 persons at the head office, including 120 internal) is responsible for defining, setting up and managing the Group's Information Systems (IS) and Telecom policy with its 1,000 subsidiaries in 6,500 sites worldwide. The GDI Groupe also manages the common resources (infrastructures, telecoms, digital platforms, cross-functional applications). INDEC, the IT Development Centre of Saint-Gobain, is an entity with a vision to leverage India's technical skills in the Information Technology domain to provide timely, high-quality and cost-effective IT solutions to Saint-Gobain businesses globally. Within the Cybersecurity Department, the Cybersecurity Vulnerability Operations Center's mission is to identify, assess and confirm vulnerabilities and threats that can affect the Group. The CyberVOC teams are based out of Paris and Mumbai and consist of skilled persons working in different Service Lines.

Mission: We are seeking a highly experienced cybersecurity professional to serve as a VOC Expert supporting the Vulnerability Intelligence (VI), Attack Surface Management (ASM), and Vulnerability Management (VM) teams.
This role is pivotal in shaping the strategy, defining technical approaches, and supporting day-to-day operations, particularly complex escalations and automation efforts. The ideal candidate will combine technical mastery in offensive security with practical experience in vulnerability lifecycle management and external attack surface discovery. The expert will act as a senior advisor and technical authority for the analyst teams, while also contributing to the design, scripting, and documentation of scalable security processes.

The VOC Expert is responsible for:

Vulnerability Intelligence (VI):
- Drive the qualification and risk analysis of newly disclosed vulnerabilities.
- Perform exploit PoC validation when needed to assess practical risk.
- Maintain and enhance the central VI database, enriched with EPSS, CVSS, QVS, SG-specific scoring models, and EUVD.
- Define and automate workflows for vulnerability qualification, exposure analysis, and prioritization, and for ingestion of qualified vulnerability data into the enterprise Data Lake.
- Collaborate on documentation of VI methodology and threat intelligence integration.
- Support proactive communication of high/critical vulnerabilities to asset and application owners.

Attack Surface Management (ASM):
- Operate and enhance external asset discovery and continuous monitoring using ASM tools.
- Integrate asset coverage data from the CMDB and other internal datasets.
- Design and implement scripts for WHOIS/ASN/banner correlation, data enrichment, and alert filtering.
- Deploy and maintain custom scanning capabilities (e.g., Nuclei integrations).
- Provide expert input on threat modeling based on exposed assets and external footprint.

BlackBox Pentesting:
- Maintain the service delivery of the BlackBox Pentesting platform.
- Automate the export of pentest data and integrate it into the Data Lake and Power BI dashboards.
- Define and document onboarding workflows for new applications.
- Actively guide analysts in prioritizing pentest requests and validating results.
Vulnerability Management (VM):
- Vulnerability review, recategorization, and false positive identification.
- Proactive vulnerability testing and replay.
- Pre-analyze and consolidate vulnerability data from various scanning tools.
- Prepare concise syntheses of available vulnerabilities.
- Offer guidance to the SO and CISO on vulnerabilities.
- Collaborate with key stakeholders to develop strategies for vulnerability management.
- Assist in defining vulnerability management KPIs and strategic goals.
- Prepare concise, actionable summaries for high-risk vulnerabilities and trends.

Automate testing actions:
- Develop scripts and tooling to automate repetitive and complex tasks across VI, ASM and VM.
- Implement data pipelines to sync outputs from ASM/VI tools to dashboards and reporting engines.
- Design streamlined workflows for the vulnerability lifecycle, from detection to closure.
- Collaborate with both offensive and defensive teams to support App managers and Asset managers in remediating vulnerabilities and issues.

Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Security, EXTC or a related field; relevant certifications (e.g., CISSP, CCSP, CompTIA Security+) are a plus.
- Proven experience (10+ years) in the cybersecurity field, with a focus on offensive security, vulnerability intelligence and attack surface analysis.
- Proven experience in penetration testing (web application, infrastructure, etc.).
- Proven expertise in CVE analysis, exploit development/validation, external asset discovery and mapping, and threat modeling and prioritization.
- Advanced knowledge of tooling such as ASM platforms, Nuclei, Shodan, open-source CTI, vulnerability scanners (Qualys, Tenable, etc.), and pentester tools (Burp, SQLmap, Responder, IDA and the Kali environment).
- Experience in investigating newly published vulnerabilities and assessing their risk and severity.
- Strong scripting skills (e.g., Python, Bash, PowerShell, C#) for automation and customization.
- Experience with pentester tools (Burp, SQLmap and the Kali environment).
- Strong technical skills with an interest in open-source intelligence investigations.
- Experience building dashboards in Power BI or similar tools.
- Familiarity with data lakes, API integrations, and ETL processes.
- Knowledge of the NIST CVE database, OWASP Top 10, and Microsoft security bulletins.
- Excellent writing skills in English and the ability to communicate complicated technical challenges in business language to a range of stakeholders.

Personal Skills:
- A systematic, disciplined, and analytical approach to problem solving, with thorough leadership skills and experience.
- Excellent ability to think critically under pressure.
- Strong communication skills to convey technical concepts clearly to both technical and non-technical stakeholders.
- Willingness to stay updated with evolving cyber threats, technologies, and industry trends.
- Capacity to work collaboratively with cross-functional teams, developers, and management to implement robust security measures.

Additional Information: The position is based in Mumbai (India).

Posted 1 month ago

Apply

0.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow: people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level.

Job Description

Job Summary: This position provides strategic, analytical, and technical support for data and business intelligence activities. It leverages data to gain key insight into business opportunities and effectively presents these insights to business stakeholders. The position participates in the creation, distribution, and delivery of analytics reports, tables, graphs, and communication materials that effectively summarize findings and support recommendations.

Primary Skills (must have):
- Strong hands-on experience with data warehouses and building data lakes
- Semantic model development (dimensional, tabular): SSAS, AAS, LookML
- Strong dashboarding skills: Power BI (preferred) / Tableau
- Hands-on experience in SQL, DAX, Python, R
- Hands-on experience in Google Cloud Platform and DevOps (CI/CD)
- Strong analytical skills and attention to detail
- Proven ability to quickly learn new applications, processes, and procedures
- Able and willing to collaborate in a team environment and exercise independent judgement
- Excellent verbal and written communication skills
- Ability to form good partner relationships across functions

Secondary Skills:
- Google Cloud Platform (preferred), Azure
- Agile experience (Scrum)
- Experience in C#, .NET is preferred

Responsibilities: Designs, develops, and maintains reports and analytical tools and performs ongoing data quality monitoring and refinement.
Identifies and analyzes errors and inconsistencies in the data and provides timely resolutions. Translates data results into written reports, tables, graphs, and charts to convey information to management and clients. Creates ad hoc reports and views on a frequent basis to assist management in understanding, researching, and analyzing issues. Uses data mining to extract information from data sets and identify correlations and patterns. Organizes and transforms information into comprehensible structures. Uses data to predict trends in the customer base and consumer populations and performs statistical analysis of data. Identifies and recommends new ways to support budgets by streamlining business processes.

Preferences: Bachelor's degree (or internationally comparable degree) in Business/Economics, Computer Science, Engineering, Marketing, MIS, Mathematics, or a related discipline. Experience with data warehousing, data science software, or similar analytics/business intelligence systems.

Employee Type: Permanent

UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.

Posted 1 month ago

Apply

4.0 - 6.0 years

4 - 6 Lacs

Bengaluru, Karnataka, India

On-site

About this role: Wells Fargo is seeking a Senior Data Science Consultant. As Senior Data Science Consultant, you will work on projects with opportunities to improve the customer experience using advanced analytics and data solution engineering. The data science team supports automated control and process optimization/streamlining by developing advanced analytical solutions aimed at minimizing compliance and operational risk across multiple lines of business across the bank. More specifically, you will support data exploration, population design, and automated data-driven reviews using advanced automation techniques in SAS/Python, text mining, and AI/ML. The selected candidate is expected to design analytical solutions, generate meaningful business insights, and communicate highly complex concepts to business stakeholders in layman's terms.

In this role, you will:
- Work as a technical expert in delivering high-quality analytical solutions and provide effective business insights
- Research, design and develop end-to-end advanced analytical solutions using data solution engineering and ETL design, applying text mining and NLP
- Streamline the ETL/data flow structure feeding different analytical solutions through automation
- Clearly understand and articulate business requirements by leveraging domain understanding of the line of business and product/function, and deliver results underlining the business problem and appropriate business decision levers
- Identify and leverage the appropriate analytical approach from a wide toolkit to make data-driven recommendations

Required Qualifications:
- 4+ years of data science experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education
- Master's degree or higher in a quantitative discipline such as mathematics, statistics, engineering, physics, economics, or computer science

Desired Qualifications:
- 4+ years of hands-on work experience in advanced analytics/
data science, with a minimum of 2 years' mandatory experience in Risk and Control/Compliance in the banking domain
- Engineering graduate/post-graduate in Maths/Stats/Economics/Computer Science
- Strong expertise in Python, SAS/SQL, and text mining/NLP
- Must have exposure to unstructured data such as contact center technology data (IVR, telephony, text, chat, etc.) along with transactional data
- Exposure to SAS Viya and data lakes/Azure/big data platforms would be a plus
- Sound knowledge of project documentation frameworks
- Must have consultative skills, with the ability to rationalize business needs and solution design from business requirements
- Strong written and verbal communication, presentation and interpersonal skills
- Ability to perform analysis, build hypotheses, draw conclusions and communicate clear, actionable recommendations to business leaders and partners
- Ability to interact with integrity and a high level of professionalism with all levels of team members and management

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As a Cyber Security Engineer, you will collaborate closely with the Engineering Organization, IT, Information Security, Software Engineers, and our DevOps departments. Your team will ensure our back-end and front-end services, cloud infrastructure, DevOps pipelines, data pipelines, software and embedded platforms are secured in the most efficient manner. You will work to develop new systems and procedures to counteract threat vectors that arise within our cloud and embedded environments. The ideal candidate will be a meticulous problem solver who can work under pressure when required and will remain current with the latest attack trends and technologies.

Other duties include:
- Cloud Security Posture Management: Participate in the planning, development, implementation and management of security measures across various cloud platforms to ensure robust security.
- Threat Detection and Analysis: Utilize advanced security tools like Wiz, BurpSuite, Sumologic, and Sonarqube to identify, analyze, validate, and stop vulnerabilities from entering the environment. Perform regular penetration testing and vulnerability assessments.
- Data Analysis and Security Monitoring: Conduct comprehensive analysis of security data from microservice architectures, content distribution networks, data lakes, serverless functions, and databases. Use SIEM tools to correlate security events and identify anomalies.
- Incident Response and Management: Participate in incident response efforts, perform root cause analysis, and implement or suggest corrective actions to mitigate security breaches. Develop and maintain incident response playbooks.
- Supply Chain Security: Assess and mitigate security risks associated with the supply chain, such as open-source libraries, ensuring end-to-end security.
- Software Security Flaws Mitigation: Identify and address software security flaws and misconfigurations to enhance the overall security posture. Perform code reviews and static/dynamic analysis.
Languages include, but are not limited to, Python, C++, C#, JS, and HCL.
- Security Solutions Development: Develop and implement custom security solutions, minimizing reliance on paid services. Create security automation scripts and integrate security tools into CI/CD pipelines.
- Automating Security Test Functions: Develop and implement automated security testing functions to ensure continuous security validation.

What we offer:

Culture of caring. At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first. From day one, you'll experience an inclusive culture of acceptance and belonging, where you'll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders.

Learning and development. We are committed to your continuous learning and development. You'll learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. With our Career Navigator tool as just one example, GlobalLogic offers a rich array of programs, training curricula, and hands-on opportunities to grow personally and professionally.

Interesting & meaningful work. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you'll have the chance to work on projects that matter. Each is a unique opportunity to engage your curiosity and creative problem-solving skills as you help clients reimagine what's possible and bring new solutions to market. In the process, you'll have the privilege of working on some of the most cutting-edge and impactful solutions shaping the world today.

Balance and flexibility. We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life.
Your life extends beyond the office, and we always do our best to help you integrate and balance the best of work and life, having fun along the way!

High-trust organization. We are a high-trust organization where integrity is key. By joining GlobalLogic, you're placing your trust in a safe, reliable, and ethical global company. Integrity and trust are a cornerstone of our value proposition to our employees and clients. You will find truthfulness, candor, and integrity in everything we do.

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

Karnataka

On-site

Do you want to help solve the world's most pressing challenges, such as feeding the world's growing population and slowing climate change? AGCO is looking for individuals to join them in making a difference. Currently, AGCO is seeking a Senior Manager, AI & Data Systems Architecture to lead the design, creation, and evolution of system architectures for AI, analytics, and data systems within the organization.

As the Senior Manager, AI & Data Systems Architecture, you will collaborate with cross-functional teams, including data engineers, data scientists, and other IT professionals, to create architectures that support cutting-edge AI and data initiatives. Your responsibilities will include leading the end-to-end architecture for AI and data systems, designing and implementing data infrastructure and AI platforms, championing cloud adoption strategies, and driving the continuous improvement and evolution of data and AI architectures to meet emerging business needs and industry trends.

To qualify for this role, you should have a minimum of 10 years of experience in data architecture, AI systems, or cloud infrastructure, with at least 3-5 years in a leadership role. You should also possess deep hands-on experience with cloud platforms like AWS, Google Cloud Platform (GCP), and Databricks, as well as familiarity with CRM systems like Salesforce and AI systems within those solutions. Additionally, you should have expertise in data architecture, including data lakes, data warehouses, real-time data streaming, and batch processing frameworks. The ideal candidate will have strong leadership and communication skills, with the ability to drive architecture initiatives in a collaborative and fast-paced environment. A Bachelor's degree in Computer Science, Data Science, or a related field is required, while a Master's degree or relevant certifications such as AWS Certified Solutions Architect are preferred.
AGCO offers a positive workplace culture that values inclusion and diversity, providing benefits such as health care and wellness plans, flexible work options, and opportunities for personal development and growth. If you are passionate about leveraging innovative technologies to make a positive impact and contribute to the future of agriculture, apply now to join AGCO in their mission.

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

Karnataka

On-site

At PwC, our team focused on data and analytics applies data to drive insights and guide strategic business decisions. Utilizing advanced analytics techniques, we assist clients in optimizing operations and achieving their goals. As a member of our data analysis team, you will specialize in leveraging sophisticated analytical methods to extract valuable insights from extensive datasets, enabling data-driven decision-making. Your role will involve utilizing skills in data manipulation, visualization, and statistical modeling to support clients in resolving intricate business challenges. We are seeking a visionary Generative AI Architect at the Manager level to join PwC US - Acceleration Center. In this leadership position, you will be responsible for designing and implementing cutting-edge Generative AI solutions using technologies such as Azure OpenAI Service, GPT models, and multi-agent frameworks. Your role will involve driving innovation through scalable cloud architectures, optimizing AI infrastructure, and leading cross-functional teams in deploying transformative AI solutions. The ideal candidate will possess deep expertise in Generative AI technologies, data engineering, Agentic AI, and cloud platforms like Microsoft Azure, with a strong emphasis on operational excellence and ethical AI practices. Responsibilities: - **Architecture Design:** Design and implement scalable, secure, and high-performance architectures for Generative AI applications. Integrate Generative AI models into existing platforms and lead the development of AI agents capable of orchestrating multi-step tasks. - **Model Development And Deployment:** Fine-tune pre-trained generative models, develop data collection and preparation strategies, and deploy appropriate Generative AI frameworks. - **Innovation And Strategy:** Stay updated on the latest Generative AI advancements, recommend innovative applications, and define and execute AI strategy roadmaps. 
- **Collaboration And Leadership:** Collaborate with cross-functional teams, mentor team members, and lead a team of data scientists, GenAI engineers, DevOps engineers, and software developers. - **Performance Optimization:** Monitor and optimize the performance of AI models, agents, and systems to ensure robustness and accuracy, as well as optimize computational costs and infrastructure utilization. - **Ethical And Responsible AI:** Ensure compliance with ethical AI practices, data privacy regulations, and governance frameworks, and implement safeguards against bias and misuse. Requirements: - Bachelor's or Master's degree in Computer Science, Data Science, or a related field. - 8+ years of relevant technical/technology experience, with expertise in GenAI projects. - Advanced programming skills in Python and fluency in data processing frameworks like Apache Spark. - Experience with GenAI foundational models and open-source models. - Proficiency in system design for Agentic architecture and real-time data processing systems. - Familiarity with cloud computing platforms and containerization technologies. - Strong leadership, problem-solving, and analytical abilities. - Excellent communication and collaboration skills. Nice To Have Skills: - Experience with technologies like Datadog and Splunk. - Familiarity with emerging Model Context Protocols and dynamic tool integration. - Relevant solution architecture certificates and continuous professional development in data engineering and GenAI. Professional And Educational Background: BE / B.Tech / MCA / M.Sc / M.E / M.Tech / MBA / Any Degree.

Posted 1 month ago

Apply

12.0 - 22.0 years

25 - 32 Lacs

Chennai, Bengaluru

Work from Office

Technical Manager, Data Engineering
Location: Chennai/Bangalore
Experience: 15+ Years
Employment Type: Full Time

Role Description: We are looking for a seasoned Technical Manager to lead our Data Engineering function. This role demands a deep understanding of data architecture, pipeline development, and data infrastructure. The ideal candidate will be a thought leader in the data engineering space, capable of guiding and mentoring a team, collaborating effectively with various business units, and driving the adoption of cutting-edge tools and technologies to build robust, scalable, and efficient data solutions.

Responsibilities:
- Define and champion the strategic direction for data engineering, staying abreast of industry trends and emerging technologies.
- Lead, mentor, and develop a high-performing team of data engineers, fostering a culture of technical excellence, innovation, and continuous learning.
- Design, implement, and maintain scalable, reliable, and secure data pipelines and infrastructure. Ensure data quality, integrity, and accessibility.
- Oversee the end-to-end delivery of data engineering projects, ensuring timely completion, adherence to best practices, and alignment with business objectives.
- Partner closely with pre-sales, sales, marketing, Business Intelligence, Data Science, and other departments to understand data needs, propose solutions, and support resource deployment for active data projects.
- Evaluate, recommend, and implement new data engineering tools, platforms, and methodologies to enhance capabilities and efficiency.
- Identify and address performance bottlenecks in data systems, ensuring optimal data processing and storage.

Tools & Technologies:
- Cloud Platforms: AWS (S3, Glue, EMR, Redshift, Athena, Lambda, Kinesis), Azure (Data Lake Storage, Data Factory, Databricks, Synapse Analytics), Google Cloud Platform (Cloud Storage, Dataflow, Dataproc, BigQuery)
- Big Data Frameworks: Apache Spark, Apache Flink, Apache Kafka, HDFS
- Data Warehousing/Lakes: Snowflake, Databricks Lakehouse, Google BigQuery, Amazon Redshift, Azure Synapse Analytics
- ETL/ELT Tools: Apache Airflow, Talend, Informatica, dbt, Fivetran, Stitch
- Data Modeling: Star Schema, Snowflake Schema, Data Vault
- Databases: PostgreSQL, MySQL, MongoDB, Cassandra, DynamoDB
- Programming Languages: Python (Pandas, PySpark), Scala, Java
- Containerization/Orchestration: Docker, Kubernetes
- Version Control: Git

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

This is a full-time Data Engineer position with D Square Consulting Services Pvt Ltd, based Pan-India with a hybrid work model. You should have at least 5 years of experience and be able to join immediately. As a Data Engineer, you will be responsible for designing, building, and scaling data pipelines and backend services supporting analytics and business intelligence platforms. A strong technical foundation, Python expertise, API development experience, and familiarity with containerized, CI/CD-driven workflows are essential for this role.

Key responsibilities:
- Design, implement, and optimize data pipelines and ETL workflows using Python tools
- Build RESTful and/or GraphQL APIs
- Collaborate with cross-functional teams
- Containerize data services with Docker and manage deployments with Kubernetes
- Develop CI/CD pipelines using GitHub Actions
- Ensure code quality and optimize data access and transformation

Required skills and qualifications:
- Bachelor's or Master's degree in Computer Science or a related field
- 5+ years of hands-on experience in data engineering or backend development
- Expert-level Python skills
- Experience building APIs with frameworks like FastAPI, Graphene, or Strawberry
- Proficiency in Docker, Kubernetes, SQL, and data modeling
- Good communication skills
- Familiarity with data orchestration tools
- Experience with streaming data platforms like Kafka or Spark
- Knowledge of data governance, security, and observability best practices
- Exposure to cloud platforms like AWS, GCP, or Azure

If you are proactive, self-driven, and possess the required technical skills, this Data Engineer position is an exciting opportunity to contribute to the development of cutting-edge data solutions at D Square Consulting Services Pvt Ltd.

Posted 1 month ago

Apply

3.0 - 14.0 years

0 Lacs

Hyderabad, Telangana

On-site

As an Enterprise Architect & AI Expert, your role will involve defining and maintaining the enterprise architecture framework, standards, and governance. You will align IT strategy with business goals to ensure architectural integrity across systems and platforms. Leading the development of roadmaps for cloud, data, application, and infrastructure architectures will be a key responsibility. It will also be crucial to evaluate and select technologies, platforms, and tools that align with enterprise goals.

You will be responsible for designing and implementing AI/ML solutions to solve complex business problems. Leading AI initiatives such as NLP, computer vision, predictive analytics, and generative AI will be part of your duties. Collaborating with data scientists, engineers, and business stakeholders to deploy AI models at scale will be essential. Ensuring ethical AI practices, data governance, and compliance with regulatory standards will also be critical.

In terms of leadership and collaboration, you will act as a strategic advisor to senior leadership on technology trends and innovation. Mentoring cross-functional teams, promoting architectural best practices, and facilitating enterprise-wide workshops and architecture review boards will be part of your role.

To qualify for this position, you should have a Bachelor's or Master's degree in Computer Science, Engineering, or a related field. You should possess 14+ years of experience in enterprise architecture, with at least 3 years in AI/ML domains. Proven experience with cloud platforms such as AWS, Azure, and GCP, microservices, and API management is required. Strong knowledge of AI/ML frameworks like TensorFlow, PyTorch, and Scikit-learn, and of MLOps practices, is essential. Familiarity with data architecture, data lakes, and real-time analytics platforms is also expected. Excellent communication, leadership, and stakeholder management skills are necessary for this role.
Mandatory skills for this position include experience in designing GenAI and RAG architectures, and familiarity with AWS, Vector DB - Milvus (preferred), OpenAI or Claude, LangChain, and LlamaIndex.

Thank you for considering this opportunity. Siva

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

You are a skilled and motivated AI Developer with 3+ years of hands-on experience in building, deploying, and optimizing AI/ML models. Strong proficiency in Python, Scikit-learn, and machine learning algorithms is required, and practical experience with Azure AI services, Azure AI Foundry, Copilot Studio, and Dataverse is mandatory. You will be responsible for designing intelligent solutions using modern deep learning and neural network architectures, integrated into scalable cloud-based environments.

Your key responsibilities will include utilizing Azure AI Foundry and Copilot Studio to build AI-driven solutions that can be embedded within enterprise workflows. You will design, develop, and implement AI/ML models using Python, Scikit-learn, and modern deep learning frameworks. Additionally, you will build and optimize predictive models using structured and unstructured data from data lakes and other enterprise sources. Collaborating with data engineers to process and transform data pipelines across Azure-based environments, you will develop and integrate applications with Microsoft Dataverse for intelligent business process automation. Applying best practices in data structures and algorithm design, you will ensure high performance and scalability of AI applications. Your role will involve training, testing, and deploying machine learning, deep learning, and neural network models in production environments. Furthermore, you will be responsible for model governance, performance monitoring, and continuous learning using Azure MLOps pipelines. Collaborating cross-functionally with data scientists, product teams, and cloud architects, you will drive AI innovation within the organization.

As a qualified candidate, you hold a Bachelor's or Master's degree in Computer Science, Data Science, AI/ML, or a related field.
With 3+ years of hands-on experience in AI/ML development, you possess practical experience with Copilot Studio and Microsoft Dataverse integrations. Expertise in Microsoft Azure is essential, particularly with services such as Azure Machine Learning, Azure Data Lake, and Azure AI Foundry. Proficiency in Python and machine learning libraries like Scikit-learn, Pandas, and NumPy is required. A solid understanding of data structures, algorithms, and object-oriented programming is essential, along with experience in data lakes, data pipelines, and large-scale data processing. Your deep understanding of neural networks, deep learning frameworks (e.g., TensorFlow, PyTorch), and model tuning will be valuable in this role. Familiarity with MLOps practices and lifecycle management on cloud platforms is beneficial. Strong problem-solving abilities, communication skills, and team collaboration are important attributes for this position. Preferred qualifications include Azure AI or Data Engineering certification, experience in deploying AI-powered applications in enterprise or SaaS environments, knowledge of generative AI or large language models (LLMs), and exposure to REST APIs, CI/CD pipelines, and version control systems like Git.,
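The core loop this posting describes — train a model on structured data, test it, then use it for prediction — can be sketched in miniature. The example below fits a one-dimensional linear model by ordinary least squares in plain Python; it is an illustration of the train/predict workflow only, and in the actual role this would be done with Scikit-learn estimators (e.g. `LinearRegression`) rather than hand-rolled code. All data values here are invented.

```python
# Minimal train/predict sketch: ordinary least squares for y = slope*x + intercept.
# Illustrative only -- in practice this is a Scikit-learn estimator.

def fit_ols(xs, ys):
    """Return (slope, intercept) minimising squared error on the training data."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return slope, mean_y - slope * mean_x

def predict(model, xs):
    slope, intercept = model
    return [slope * x + intercept for x in xs]

# "Train" on a small sample, then score held-out points.
train_x, train_y = [1, 2, 3, 4], [2.1, 3.9, 6.2, 7.8]
model = fit_ols(train_x, train_y)
preds = predict(model, [5, 6])
```

The same shape — fit on training data, predict on unseen data — carries over directly to the deep learning and Azure MLOps workflows the posting mentions; only the model and infrastructure change.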

Posted 1 month ago

Apply

14.0 - 18.0 years

0 Lacs

Karnataka

On-site

The AVP Databricks Squad Delivery Lead position is open for candidates with 14+ years of experience in Bangalore/Hyderabad/NCR/Kolkata/Mumbai/Pune. As the Databricks Squad Delivery Lead, you will be responsible for overseeing project delivery, team leadership, architecture reviews, and client engagement. Your role will involve optimizing Databricks implementations across cloud platforms like AWS, Azure, and GCP, while leading cross-functional teams. You will lead and manage end-to-end delivery of Databricks-based solutions, serving as a subject matter expert in Databricks architecture, implementation, and optimization. Collaboration with architects and engineers to design scalable data pipelines and analytics platforms will be a key aspect of your responsibilities. Additionally, you will oversee Databricks workspace setup, performance tuning, and cost optimization, while acting as the primary point of contact for client stakeholders. Driving innovation through the implementation of best practices, tools, and technologies, and ensuring alignment between business goals and technical solutions will also be part of your duties. The ideal candidate for this role must possess a Bachelor's degree in Computer Science, Engineering, or equivalent (Master's or MBA preferred) along with hands-on experience in delivering data engineering/analytics projects using Databricks. Experience in managing cloud-based data pipelines on AWS, Azure, or GCP, strong leadership skills, and effective client-facing communication are essential requirements. Preferred skills include proficiency with Spark, Delta Lake, MLflow, and distributed computing, expertise in data engineering concepts such as ETL, data lakes, and data warehousing, and certifications in Databricks or cloud platforms (AWS/Azure/GCP) as a plus. An Agile/Scrum or PMP certification will be considered an added advantage for this role.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Kolkata, West Bengal

On-site

Genpact is a global professional services and solutions firm dedicated to delivering outcomes that shape the future. With over 125,000 employees across 30+ countries, we are deeply motivated by our curiosity, agility, and the desire to create enduring value for our clients. We are driven by our purpose - the relentless pursuit of a world that works better for people. We cater to and transform leading enterprises, including the Fortune Global 500, leveraging our profound business and industry knowledge, digital operations services, and expertise in data, technology, and AI. We are currently seeking applications for the position of Assistant Vice President, Databricks Squad Delivery Lead. As the Databricks Delivery Lead, you will be responsible for overseeing the complete delivery of Databricks-based solutions for our clients. Your role will involve ensuring the successful implementation, optimization, and scaling of big data and analytics solutions. You will play a crucial role in promoting the adoption of Databricks as the preferred platform for data engineering and analytics, while effectively managing a diverse team of data engineers and developers.

Your key responsibilities will include:
- Leading and managing Databricks-based project delivery, ensuring that all solutions adhere to client requirements, best practices, and industry standards.
- Serving as the subject matter expert (SME) on Databricks, offering guidance to teams on architecture, implementation, and optimization.
- Collaborating with architects and engineers to design optimal solutions for data processing, analytics, and machine learning workloads.
- Acting as the primary point of contact for clients, ensuring alignment between business requirements and technical delivery.
- Maintaining effective communication with stakeholders, providing regular updates on project status, risks, and achievements.
- Overseeing the setup, deployment, and optimization of Databricks workspaces, clusters, and pipelines.
- Ensuring that Databricks solutions are optimized for cost and performance, utilizing best practices for data storage, processing, and querying.
- Continuously evaluating the effectiveness of the Databricks platform and processes, and proposing improvements or new features to enhance delivery efficiency and effectiveness.
- Driving innovation within the team by introducing new tools, technologies, and best practices to improve delivery quality.

Qualifications we are looking for:

Minimum Qualifications / Skills:
- Bachelor's degree in Computer Science, Engineering, or a related field (Master's or MBA preferred).
- Relevant years of experience in IT services with a specific focus on Databricks and cloud-based data engineering.

Preferred Qualifications / Skills:
- Demonstrated experience in leading end-to-end delivery of data engineering or analytics solutions on Databricks.
- Strong expertise in cloud technologies (AWS, Azure, GCP), data pipelines, and big data tools.
- Hands-on experience with Databricks, Spark, Delta Lake, MLflow, and related technologies.
- Proficiency in data engineering concepts, including ETL, data lakes, data warehousing, and distributed computing.

Preferred Certifications:
- Databricks Certified Associate or Professional.
- Cloud certifications (AWS Certified Solutions Architect, Azure Data Engineer, or equivalent).
- Certifications in data engineering, big data technologies, or project management (e.g., PMP, Scrum Master).

If you are passionate about driving innovation, leading a high-performing team, and shaping the future of data engineering and analytics, we welcome you to apply for this exciting opportunity of Assistant Vice President, Databricks Squad Delivery Lead at Genpact.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Lead Software Engineer at JPMorgan Chase within the Consumer & Community Banking Team, you play a crucial role in an agile team dedicated to enhancing, building, and delivering trusted market-leading technology products with a focus on security, stability, and scalability. Your responsibilities include executing creative software solutions, designing, developing, and troubleshooting technical issues with an innovative mindset. You are expected to develop high-quality, secure production code, review and debug code by team members, and identify opportunities to automate remediation processes for enhanced operational stability. In this role, you will lead evaluation sessions with external vendors, startups, and internal teams to assess architectural designs and technical applicability within existing systems. Additionally, you will drive awareness and adoption of new technologies within Software Engineering communities, contributing to a diverse, inclusive, and respectful team culture. To excel in this position, you should possess formal training or certification in software engineering concepts along with at least 5 years of practical experience. Strong proficiency in database systems, including SQL & NoSQL, and programming languages like Python, Java, or Scala is essential. Experience in data architecture, data modeling, data warehousing, and data lakes, as well as implementing complex ETL transformations on big data platforms, will be beneficial. Proficiency in the Software Development Life Cycle and agile methodologies such as CI/CD, Application Resiliency, and Security is required. An ideal candidate will have hands-on experience with software applications and technical processes within a specific discipline (e.g., cloud, artificial intelligence, machine learning) and a background in the financial services industry. Practical experience in cloud-native technologies is highly desirable. 
Additional qualifications such as Java and data programming experience are considered a plus for this role.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a PySpark Data Engineer, you will play a crucial role in developing robust data processing and transformation solutions within our data platform. Your responsibilities will include designing, implementing, and maintaining PySpark-based applications to handle complex data processing tasks, ensuring data quality, and integrating with diverse data sources. To excel in this role, you should possess strong PySpark development skills, experience with big data technologies, and the ability to thrive in a fast-paced, data-driven environment. Your primary responsibilities will involve designing, developing, and testing PySpark-based applications to process, transform, and analyze large-scale datasets from various sources such as relational databases, NoSQL databases, batch files, and real-time data streams. You will need to implement efficient data transformation and aggregation techniques using PySpark and relevant big data frameworks, as well as develop robust error handling and exception management mechanisms to maintain data integrity and system resilience within Spark jobs. Additionally, optimizing PySpark jobs for performance through techniques like partitioning, caching, and tuning of Spark configurations will be essential. Collaboration will be key in this role, as you will work closely with data analysts, data scientists, and data architects to understand data processing requirements and deliver high-quality data solutions. By analyzing and interpreting data structures, formats, and relationships, you will implement effective data transformations using PySpark and work with distributed datasets in Spark to ensure optimal performance for large-scale data processing and analytics. In terms of data integration and ETL processes, you will design and implement ETL (Extract, Transform, Load) processes to ingest and integrate data from various sources, ensuring consistency, accuracy, and performance. 
Integration of PySpark applications with data sources such as SQL databases, NoSQL databases, data lakes, and streaming platforms will also be a part of your responsibilities. To excel in this role, you should possess a Bachelor's degree in Computer Science, Information Technology, or a related field, along with 5+ years of hands-on experience in big data development, preferably with exposure to data-intensive applications. A strong understanding of data processing principles, techniques, and best practices in a big data environment is essential, as well as proficiency in PySpark, Apache Spark, and related big data technologies for data processing, analysis, and integration. Experience with ETL development and data pipeline orchestration tools such as Apache Airflow and Luigi will be advantageous. Strong analytical and problem-solving skills, along with excellent communication and collaboration abilities, will also be critical for success in this role.
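Two of the optimisation techniques named above — partitioning and caching — can be sketched conceptually in plain Python. This is a toy illustration of the ideas, not PySpark itself: in real work these correspond to `DataFrame.repartition()` and `DataFrame.cache()` on a Spark cluster, and all data and names below are invented.

```python
# Conceptual sketch of two Spark optimisation ideas -- partitioning and
# caching -- in plain Python. Real code would use the pyspark DataFrame API.

from collections import defaultdict
from functools import lru_cache

def hash_partition(records, key, num_partitions):
    """Assign each record to a bucket by hashing its key, so records that
    share a key land in the same partition (the idea behind repartitioning
    before a join or aggregation)."""
    parts = defaultdict(list)
    for rec in records:
        parts[hash(rec[key]) % num_partitions].append(rec)
    return parts

@lru_cache(maxsize=None)
def expensive_lookup(region):
    # Stand-in for a costly transformation whose result is reused --
    # analogous to caching an intermediate DataFrame.
    return region.upper()

rows = [{"region": "south", "amount": 10},
        {"region": "north", "amount": 20},
        {"region": "south", "amount": 5}]
parts = hash_partition(rows, "region", 4)
labels = [expensive_lookup(r["region"]) for r in rows]
```

The payoff in Spark is the same as here: co-locating records by key avoids shuffling them later, and caching means the second "south" lookup is served from memory rather than recomputed.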

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

We are looking for a highly motivated and experienced Data and Analytics Senior Architect to lead our Master Data Management (MDM) and Data Analytics team. As the Data and Analytics Architect Lead, you will be responsible for defining and implementing the overall data architecture strategy to ensure alignment with business goals and support data-driven decision-making. Your role will involve designing scalable, secure, and efficient data systems, including databases, data lakes, and data warehouses. You will evaluate and recommend tools and technologies for data integration, processing, storage, and analytics while staying updated on industry trends. You will lead a high-performing team, fostering a collaborative and innovative culture, and ensuring data integrity, consistency, and availability across the organization. You will manage the existing MDM solution and data platform based on Microsoft Data Lake Gen 2, Snowflake as the DWH, and Power BI, managing data from core applications. Additionally, you will drive further development to handle additional data and capabilities to support our AI journey. The ideal candidate will possess strong leadership skills, a deep understanding of data management and technology principles, and the ability to collaborate effectively across different departments and functions.

**Principal Duties and Responsibilities:**

**Team Leadership:**
- Lead, mentor, and develop a high-performing team of data analysts and MDM specialists.
- Foster a collaborative and innovative team culture that encourages continuous improvement and efficiency.
- Provide technical leadership and guidance to the development teams and oversee the implementation of IT solutions.

**Architect:**
- Define the overall data architecture strategy, aligning it with business goals and ensuring it supports data-driven decision-making.
- Identify, evaluate, and establish shared enabling technical capabilities for the division in collaboration with IT to ensure consistency, quality, and business value.
- Design and oversee the implementation of data systems, including databases, data lakes, and data warehouses, ensuring they are scalable, secure, efficient, and cost-effective.
- Evaluate and recommend tools and technologies for data integration, processing, storage, and analytics, staying updated on industry trends.

**Strategic Planning:**
- Develop and implement the MDM and analytics strategy aligned with the overall team and organizational goals.
- Work with the Enterprise Architect to align on the overall strategy and application landscape to ensure MDM and data analytics fit into the ecosystem.
- Identify opportunities to enhance data quality, governance, and analytics capabilities.

**Project Management:**
- Oversee project planning, execution, and delivery to ensure timely and successful completion of initiatives.
- Monitor project progress and cost, identify risks, and implement mitigation strategies.

**Stakeholder Engagement:**
- Collaborate with cross-functional teams to understand data needs and deliver solutions that support business objectives.
- Serve as a key point of contact for data-related inquiries and support requests.
- Develop business cases and proposals for IT investments and present them to senior management and stakeholders.

**Data/Information Governance:**
- Establish and enforce data/information governance policies and standards to ensure compliance and data integrity.
- Champion best practices in data management and analytics across the organization.

**Reporting and Analysis:**
- Utilize data analytics to derive insights and support decision-making processes.
- Document and present findings and recommendations to senior management.

**Knowledge, Skills and Abilities Required:**
- Bachelor's degree in Computer Science, Data Science, Information Management, or a related field; Master's degree preferred.
- 10+ years of experience in data management, analytics, or a related field, with at least 2 years in a leadership role.
- Strong knowledge of master data management concepts, data governance, data technology, and analytics tools.
- Proficiency in data modeling, ETL processes, database management, big data technologies, and data integration techniques.
- Excellent project management skills with a proven track record of delivering complex projects on time and within budget.
- Strong analytical, problem-solving, and decision-making abilities.
- Exceptional communication and interpersonal skills.
- Team player, result-oriented, structured, with attention to detail and a strong work ethic.

**Special Competencies Required:**
- Proven leader with excellent structural skills, good at documenting and presenting.
- Strong executional skills to make things happen, not just generate ideas.
- Experience in working with analytics tools and data ingestion platforms.
- Experience in working with MDM solutions, preferably TIBCO EBX.
- Experience in working with Jira/Confluence.

**Additional Information:**
- Office, remote, or hybrid working.
- Ability to function within variable time zones.
- International travel may be required.

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

Kolkata, West Bengal

On-site

You are a highly skilled and strategic Data Architect with deep expertise in the Azure Data ecosystem. Your role will involve defining and driving the overall Azure-based data architecture strategy aligned with enterprise goals. You will architect and implement scalable data pipelines, data lakes, and data warehouses using Azure Data Lake, ADF, and Azure SQL/Synapse. Providing technical leadership on Azure Databricks for large-scale data processing and advanced analytics use cases is a crucial aspect of your responsibilities. Integrating AI/ML models into data pipelines and supporting the end-to-end ML lifecycle including training, deployment, and monitoring will be part of your day-to-day tasks. Collaboration with cross-functional teams such as data scientists, DevOps engineers, and business analysts is essential. You will evaluate and recommend tools, platforms, and design patterns for data and ML infrastructure while mentoring data engineers and junior architects on best practices and architectural standards. Your role will require a strong background in data modeling, ETL/ELT frameworks, and data warehousing concepts. Proficiency in SQL, Python, and PySpark and a solid understanding of AI/ML workflows and tools are necessary. Exposure to Azure DevOps and excellent communication and stakeholder management skills are also key requirements. As a Data Architect at Lexmark, you will play a vital role in designing and overseeing robust, scalable, and secure data architectures to support advanced analytics and machine learning workloads. If you are an innovator looking to make your mark with a global technology leader, apply now to join our team in Kolkata, India.

Posted 1 month ago

Apply

14.0 - 18.0 years

0 Lacs

Pune, Maharashtra

On-site

We are hiring for the position of AVP - Databricks with a minimum of 14 years of experience. The role is based in Bangalore/Hyderabad/NCR/Kolkata/Mumbai/Pune. As an AVP - Databricks, your responsibilities will include leading and managing Databricks-based project delivery to ensure solutions are designed, developed, and implemented according to client requirements and industry standards. You will act as the subject matter expert on Databricks, providing guidance on architecture, implementation, and optimization to teams. Collaboration with architects and engineers to design optimal solutions for data processing, analytics, and machine learning workloads is also a key aspect of the role. You will serve as the primary point of contact for clients to ensure alignment between business requirements and technical delivery. The qualifications we seek in you include a Bachelor's degree in Computer Science, Engineering, or a related field (Master's or MBA preferred). You should have relevant years of experience in IT services with a specific focus on Databricks and cloud-based data engineering. Preferred qualifications/skills for this role include proven experience in leading end-to-end delivery, solution design, and architecture of data engineering or analytics solutions on Databricks. Strong experience in cloud technologies such as AWS, Azure, and GCP, data pipelines, and big data tools is desirable. Hands-on experience with Databricks, Spark, Delta Lake, MLflow, and related technologies is a plus. Expertise in data engineering concepts including ETL, data lakes, data warehousing, and distributed computing will be beneficial for this role.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Thane, Maharashtra

On-site

As the BI / BW Lead at DMart, you will lead and manage a dedicated SAP BW team to ensure the timely delivery of reports, dashboards, and analytics solutions. Your role will involve managing the team effectively, overseeing all SAP BW operational support tasks and development projects with a focus on high quality and efficiency. You will be responsible for maintaining the stability and performance of the SAP BW environment, managing daily support activities, and ensuring seamless data flow and reporting across the organization. Acting as the bridge between business stakeholders and your technical team, you will play a crucial role in enhancing DMart's data ecosystem. You should possess a Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field. While SAP BW certifications are preferred, they are not mandatory.

Key Responsibilities:
- Lead and manage the SAP BW & BOBJ team, ensuring efficient workload distribution and timely task completion.
- Oversee the daily operational support of the SAP BW & BOBJ environment to maintain stability and performance.
- Provide direction and guidance to the team for issue resolution, data loads, and reporting accuracy.
- Serve as the primary point of contact for business users and internal teams regarding SAP BW support and enhancements.
- Ensure the team follows best practices in monitoring, error handling, and performance optimization.
- Drive continuous improvement of support processes, tools, and methodologies.
- Proactively identify risks and bottlenecks in data flows and take corrective actions.
- Ensure timely delivery of data extracts, reports, and dashboards for critical business decisions.
- Provide leadership in system upgrades, patching, and data model improvements.
- Facilitate knowledge sharing and skill development within the SAP BW team.
- Maintain high standards of data integrity and security in the BW environment.

Professional Skills:
- Strong functional and technical understanding of SAP BW / BW on HANA & BOBJ.
- At least 5 years of working experience with SAP Analytics.
- Solid knowledge of ETL processes and data extraction.
- Experience with cloud data platforms such as Snowflake, BigQuery, and Databricks, and dashboard tools such as Power BI and Tableau, is advantageous.
- Experience in Retail, CPG, or SCM is a plus.
- Experience in managing SAP BW support activities and coordinating issue resolution.
- Strong stakeholder management skills with the ability to translate business needs into technical actions.
- Excellent problem-solving and decision-making abilities under pressure.

Posted 1 month ago

Apply

7.0 - 14.0 years

2 - 6 Lacs

Remote, India

On-site

Job Description

Role and Responsibilities:
- Handle day-to-day administration tasks such as configuring clusters and workspaces; monitor platform health, troubleshoot issues, and perform routine maintenance and upgrades.
- Evaluate new features and enhancements introduced by Databricks from a security, compliance, and manageability perspective.
- Implement and maintain security controls to protect the Databricks platform and the data within it.
- Collaborate with the security team to ensure compliance with data privacy and regulatory requirements.
- Develop and enforce governance policies and practices, including access management, data retention, and data classification.
- Optimize the platform's performance by monitoring resource utilization, identifying and resolving bottlenecks, and fine-tuning configurations for optimal performance.
- Collaborate with infrastructure and engineering teams to ensure that the platform meets scalability and availability requirements.
- Work closely with data analysts, data scientists, and other users to understand their requirements and provide technical support.
- Automate platform deployment, configuration, and monitoring processes using scripting languages and automation tools.
- Collaborate with the DevOps team to integrate the Databricks platform into the overall infrastructure and CI/CD pipelines.

What we look for:
- 7+ years of experience with big data technologies such as Apache Spark, cloud-native data lakes, and data mesh platforms, in a technical architecture or consulting role.
- Strong experience in administering and managing Databricks or other big data platforms on AWS.
- Python programming skills in technical areas that support the deployment and integration of Databricks-based solutions.
- Understanding of the latest services offered by Databricks, the ability to evaluate those services, and an understanding of how they fit into the platform.
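In practice, much of the governance and automation work described here reduces to validating platform configurations against policy rules in scripts. The sketch below shows the shape of such a check; the policy keys, limits, and cluster fields are invented for illustration and are not real Databricks cluster-policy attributes.

```python
# Illustrative governance check: validate a cluster config dict against
# policy rules. All keys and limits here are hypothetical, not real
# Databricks cluster policy fields.

POLICY = {
    "max_workers": 10,                        # cap cluster size to control cost
    "require_tags": {"owner", "cost_center"}, # tags needed for chargeback
    "forbid_public_access": True,             # security control
}

def violations(cluster):
    """Return a list of human-readable policy violations for one cluster."""
    found = []
    if cluster.get("num_workers", 0) > POLICY["max_workers"]:
        found.append("too many workers")
    missing = POLICY["require_tags"] - set(cluster.get("tags", {}))
    if missing:
        found.append("missing tags: " + ", ".join(sorted(missing)))
    if POLICY["forbid_public_access"] and cluster.get("public_access"):
        found.append("public access enabled")
    return found

good = {"num_workers": 4, "tags": {"owner": "a", "cost_center": "42"}}
bad = {"num_workers": 32, "tags": {"owner": "a"}, "public_access": True}
```

A script like this would typically run on a schedule against configurations pulled from the platform's API, feeding violations into the CI/CD or alerting pipeline mentioned above.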

Posted 2 months ago

Apply

6.0 - 10.0 years

0 Lacs

Karnataka

On-site

At PwC, our team in managed services specializes in providing outsourced solutions and supporting clients across various functions. We help organizations enhance their operations, reduce costs, and boost efficiency by managing key processes and functions on their behalf. Our expertise lies in project management, technology, and process optimization, allowing us to deliver high-quality services to our clients. In managed service management and strategy at PwC, the focus is on transitioning and running services, managing delivery teams, programs, commercials, performance, and delivery risk. Your role will involve continuous improvement and optimization of managed services processes, tools, and services. As a Managed Services - Data Engineer Senior Associate at PwC, you will be part of a team of problem solvers dedicated to addressing complex business issues from strategy to execution using Data, Analytics & Insights Skills. Your responsibilities will include using feedback and reflection to enhance self-awareness and personal strengths, acting as a subject matter expert in your chosen domain, mentoring junior resources, and conducting knowledge sharing sessions. You will be required to demonstrate critical thinking, ensure quality of deliverables, adhere to SLAs, and participate in incident, change, and problem management. Additionally, you will be expected to review your work and that of others for quality, accuracy, and relevance, as well as demonstrate leadership capabilities by working directly with clients and leading engagements. The primary skills required for this role include ETL/ELT, SQL, SSIS, SSMS, Informatica, and Python, with secondary skills in Azure/AWS/GCP, Power BI, Advanced Excel, and Excel Macro. 
As a Data Ingestion Senior Associate, you should have extensive experience in developing scalable, repeatable, and secure data structures and pipelines, designing and implementing ETL processes, monitoring and troubleshooting data pipelines, implementing data security measures, and creating visually impactful dashboards for data reporting. You should also have expertise in writing and analyzing complex SQL queries, be proficient in Excel, and possess strong communication, problem-solving, quantitative, and analytical abilities. In our Managed Services platform, we focus on leveraging technology and human expertise to deliver simple yet powerful solutions to our clients. Our team of skilled professionals, combined with advanced technology and processes, enables us to provide effective outcomes and add greater value to our clients' enterprises. We aim to empower our clients to focus on their business priorities by providing flexible access to world-class business and technology capabilities that align with today's dynamic business environment. If you are a candidate who thrives in a high-paced work environment, capable of handling critical Application Evolution Service offerings, engagement support, and strategic advisory work, then we are looking for you to join our team in the Data, Analytics & Insights Managed Service at PwC. Your role will involve working on a mix of help desk support, enhancement and optimization projects, as well as strategic roadmap initiatives, while also contributing to customer engagements from both a technical and relationship perspective.
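The ETL and SQL skills this role calls for can be sketched end-to-end with the standard library alone. In the sketch below, an in-memory SQLite database stands in for whatever warehouse a real engagement would use, and the data and table names are invented for illustration.

```python
# Minimal extract-transform-load sketch using SQLite as a stand-in warehouse.
import sqlite3

# Extract: raw records as they might arrive from a source file or feed.
raw = [("Widget ", "10"), ("gadget", "25"), ("widget", "15")]

def transform(rows):
    """Normalise product names and cast string amounts to integers."""
    return [(name.strip().lower(), int(amount)) for name, amount in rows]

# Load: write the cleaned rows into the warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (product TEXT, amount INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)", transform(raw))

# Analysis query of the kind the role would write against the loaded data.
totals = dict(conn.execute(
    "SELECT product, SUM(amount) FROM sales GROUP BY product"))
```

The production versions of these steps (SSIS packages, Informatica mappings, orchestrated pipelines) differ in scale and tooling, but follow the same extract-transform-load structure.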

Posted 2 months ago

Apply

8.0 - 13.0 years

18 - 33 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Data Modular: JD details

Work Location: Bangalore/Chennai/Hyderabad/Gurgaon/Pune
Mode of work: Hybrid
Experience level: 7+ years
Looking only for immediate joiners.

- Experience with RDBMS and NoSQL (columnar databases such as Apache Parquet, Apache Kylin, or similar) - query optimization, performance tuning, caching, and filtering strategies.
- Experience with data lakes and techniques for faster data retrieval.
- Dynamic data modelling - the ability to update data models based on the underlying data.
- Caching and filtering techniques on data.
- Experience with Apache Spark or similar big data technologies.
- Knowledge of AWS and IaC implementation.
- SQL transpilers and predicate pushing.
- GraphQL as the top layer - good to have.
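One technique this posting names, predicate pushing, can be shown in miniature: apply the filter while scanning the columnar store, so non-matching rows are never assembled at all. The toy column layout and names below are invented for illustration; real systems push predicates into Parquet readers or the query engine itself.

```python
# Toy columnar store illustrating predicate pushdown: the predicate is
# evaluated against a single column during the scan, and full rows are
# materialised only for matches.

columns = {
    "city":  ["pune", "delhi", "pune", "goa"],
    "sales": [120, 80, 200, 50],
}

def scan_with_pushdown(columns, column, predicate):
    """Yield assembled rows only where `column` satisfies `predicate`."""
    keys = list(columns)
    for i, value in enumerate(columns[column]):
        if predicate(value):                       # decided before row assembly
            yield {k: columns[k][i] for k in keys}

rows = list(scan_with_pushdown(columns, "city", lambda c: c == "pune"))
```

Pushing the filter down this way is what makes columnar formats fast for selective queries: only one column is read to decide, and only matching rows touch the remaining columns.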

Posted 2 months ago

Apply

8.0 - 12.0 years

0 Lacs

Karnataka

On-site

A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. As part of our Analytics and Insights Consumption team, you'll analyze data to drive useful insights for clients to address core business issues or to drive strategic outcomes. You'll use visualization, statistical and analytics models, AI/ML techniques, ModelOps, and other techniques to develop these insights. Candidates with 8+ years of hands-on experience are invited to join our team as we embark on a journey to drive innovation and change through data-driven solutions.

Responsibilities:
- Lead and manage a team of software engineers in developing, implementing, and maintaining advanced software solutions for GenAI projects.
- Engage with senior leadership and cross-functional teams to gather business requirements, identify opportunities for technological enhancements, and ensure alignment with organizational goals.
- Design and implement sophisticated event-driven architectures to support real-time data processing and analysis.
- Oversee the use of containerization technologies such as Kubernetes to promote efficient deployment and scalability of software applications.
- Supervise the development and management of extensive data lakes, ensuring effective storage and handling of large volumes of structured and unstructured data.
- Champion the use of Python as the primary programming language, setting high standards for software development within the team.
- Facilitate close collaboration between software engineers, data scientists, data engineers, and DevOps teams to ensure seamless integration and deployment of GenAI models.
- Maintain a cutting-edge knowledge base in GenAI technologies to drive innovation and continually enhance software engineering processes.
- Translate complex business needs into robust technical solutions, contributing to strategic decision-making processes.
- Establish and document software engineering processes, methodologies, and best practices, promoting a culture of excellence.
- Ensure continuous professional development of the team by maintaining and acquiring new solution architecture certificates and adhering to industry best practices.
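The event-driven architectures mentioned in these responsibilities can be sketched as a minimal in-process publish/subscribe bus. This is a toy: a production system would use Kafka, an Azure/AWS event service, or similar, and the topic and payload names below are invented.

```python
# Toy in-process event bus sketching the publish/subscribe pattern behind
# event-driven architectures: producers publish to a topic, and every
# handler subscribed to that topic receives the payload.

from collections import defaultdict

class EventBus:
    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, topic, handler):
        """Register a callable to receive payloads published to `topic`."""
        self._handlers[topic].append(handler)

    def publish(self, topic, payload):
        # Deliver the payload to every handler registered for the topic.
        for handler in self._handlers[topic]:
            handler(payload)

bus = EventBus()
seen = []
bus.subscribe("doc.ingested", seen.append)
bus.subscribe("doc.ingested", lambda p: seen.append(p.upper()))
bus.publish("doc.ingested", "report.pdf")
```

The decoupling is the point: the publisher needs no knowledge of who consumes the event, which is what lets real-time processing stages be added or scaled independently.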

Posted 2 months ago

Apply

14.0 - 18.0 years

0 Lacs

Karnataka

On-site

As the AVP Databricks Squad Delivery Lead, you will play a crucial role in overseeing project delivery, team leadership, architecture reviews, and client engagement. Your primary responsibility will be to optimize Databricks implementations across cloud platforms such as AWS, Azure, and GCP while leading cross-functional teams.

You will lead and manage the end-to-end delivery of Databricks-based solutions, serving as a subject matter expert (SME) in Databricks architecture, implementation, and optimization. Collaborating with architects and engineers, you will design scalable data pipelines and analytics platforms, and oversee Databricks workspace setup, performance tuning, and cost optimization. Acting as the primary point of contact for client stakeholders, you will ensure effective communication and alignment between business goals and technical solutions. You will also drive innovation within the team, implementing best practices, tools, and technologies to enhance project delivery.

The ideal candidate should possess a Bachelor's degree in Computer Science, Engineering, or equivalent (Master's or MBA preferred). Hands-on experience delivering data engineering/analytics projects using Databricks and managing cloud-based data pipelines on AWS, Azure, or GCP is a must, along with strong leadership skills and excellent client-facing communication.

Preferred skills include proficiency with Spark, Delta Lake, MLflow, and distributed computing, as well as expertise in data engineering concepts such as ETL, data lakes, and data warehousing. Certifications in Databricks or cloud platforms (AWS/Azure/GCP) and Agile/Scrum or PMP certification are considered advantageous.
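The ETL and data-pipeline skills this listing asks for can be sketched with a minimal extract-transform-load pass in plain Python (standard library only). In a Databricks setting the same shape would typically be expressed with PySpark DataFrames writing to Delta tables; the column names below are hypothetical.

```python
import csv
import io

# Hypothetical raw sales extract, as it might land in a data lake's raw zone.
RAW = """order_id,region,amount
1,south,120.50
2,north,80.00
3,south,45.25
"""

def extract(text: str) -> list[dict]:
    """Read raw CSV rows into dictionaries (the 'bronze' step)."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows: list[dict]) -> dict[str, float]:
    """Clean keys and aggregate revenue per region (a 'silver/gold' step)."""
    totals: dict[str, float] = {}
    for row in rows:
        region = row["region"].strip().lower()
        totals[region] = totals.get(region, 0.0) + float(row["amount"])
    return totals

def load(totals: dict[str, float]) -> list[tuple[str, float]]:
    """'Load' into a sorted report; a real pipeline would write a table."""
    return sorted(totals.items())

report = load(transform(extract(RAW)))
```

The three-stage split mirrors how Databricks jobs are usually structured, which keeps each stage independently testable.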

Posted 2 months ago

Apply

7.0 - 11.0 years

0 Lacs

Haryana

On-site

As an Assistant Manager - MIS Reporting at Axis Max Life Insurance, you will play a crucial role in driving the business intelligence team towards a data-driven culture and leading the transformation towards automation and real-time insights. You will be responsible for the accurate and timely delivery of reports and dashboards while coaching and mentoring a team of professionals to enhance their skills and capabilities.

Your key responsibilities will include handling distribution reporting requirements, supporting CXO reports and dashboards, driving data democratization, collaborating on the design of data products, and partnering with the data team to build the necessary data infrastructure. You will lead a team of 10+ professionals, including partners, and work closely with distribution leaders to understand the key metrics and information needs around which business intelligence products are developed. Additionally, you will define the vision and roadmap for the business intelligence team, championing a data culture within Max Life and accelerating the journey towards becoming a data-driven organization.

To excel in this role, you should possess a Master's degree in a quantitative field, along with at least 7-8 years of relevant experience working with business reporting teams in the financial services sector. Proficiency in tools such as Python and Power BI is essential, as is demonstrated experience working with senior leadership, standardizing and automating business reporting, and technical proficiency in the BI tech stack. Strong interpersonal skills, excellent verbal and written communication abilities, and a deep understanding of data architecture, data warehousing, and data lakes are also required for this position.

If you are passionate about leading change, driving efficiency, and rationalizing information overload, we are looking for you to join our team at Axis Max Life Insurance.
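Standardizing and automating recurring reporting, as described above, often begins with a small script that turns raw metrics into summary rows. Below is a stdlib-only sketch with hypothetical channel names and figures; in practice the output would feed a Power BI dataset or dashboard rather than stay in memory.

```python
from statistics import mean

# Hypothetical daily policy-sales figures per distribution channel.
daily_sales = {
    "bancassurance": [14, 18, 11, 20],
    "agency": [9, 7, 12, 10],
}

def summarize(channel_data: dict[str, list[int]]) -> list[dict]:
    """Build per-channel summary rows of the kind a CXO dashboard might show."""
    rows = []
    for channel, values in sorted(channel_data.items()):
        rows.append({
            "channel": channel,
            "total": sum(values),
            "daily_avg": round(mean(values), 2),
        })
    return rows

summary = summarize(daily_sales)
```

Scheduling a script like this (and pointing a BI tool at its output) is usually the first step in moving a manual MIS report to real-time, automated delivery.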

Posted 2 months ago

Apply

14.0 - 18.0 years

0 Lacs

Karnataka

On-site

We are hiring for the role of AVP - Databricks, which requires a minimum of 14+ years of experience. The job location can be Bangalore, Hyderabad, NCR, Kolkata, Mumbai, or Pune.

As an AVP - Databricks, your responsibilities will include leading and managing Databricks-based project delivery to ensure that all solutions meet client requirements, best practices, and industry standards. You will serve as a subject matter expert (SME) on Databricks, providing guidance to teams on architecture, implementation, and optimization. You will collaborate with architects and engineers to design optimal solutions for data processing, analytics, and machine learning workloads, and act as the primary point of contact for clients, ensuring alignment between business requirements and technical delivery.

We are looking for a candidate with a Bachelor's degree in Computer Science, Engineering, or a related field (Master's or MBA preferred) and relevant years of experience in IT services, specifically in Databricks and cloud-based data engineering. Proven experience leading end-to-end delivery and solution architecting of data engineering or analytics solutions on Databricks is a plus. Strong expertise in cloud technologies such as AWS, Azure, and GCP, data pipelines, and big data tools is desired, and hands-on experience with Databricks, Spark, Delta Lake, MLflow, and related technologies is a requirement. An in-depth understanding of data engineering concepts, including ETL, data lakes, data warehousing, and distributed computing, will be beneficial for this role.
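One data engineering concept named above — keeping a lake table current as new records arrive — is what Delta Lake's MERGE (upsert) operation handles. The following stdlib-only sketch shows just the upsert logic on in-memory dictionaries, with hypothetical record fields; Delta Lake applies the same match-then-update-or-insert rule transactionally at scale.

```python
# Hypothetical customer records keyed by id: an existing "table" and a new batch.
table = {
    "c1": {"id": "c1", "city": "Mumbai"},
    "c2": {"id": "c2", "city": "Pune"},
}
batch = [
    {"id": "c2", "city": "Bengaluru"},  # matches c2 -> update
    {"id": "c3", "city": "Hyderabad"},  # no match -> insert
]

def upsert(table: dict[str, dict], batch: list[dict]) -> dict[str, dict]:
    """Merge a batch into the table: update rows on key match, insert otherwise."""
    merged = dict(table)  # leave the original table untouched
    for record in batch:
        merged[record["id"]] = record
    return merged

result = upsert(table, batch)
```

In Spark SQL the equivalent would be a `MERGE INTO target USING batch ON target.id = batch.id` statement with matched/not-matched clauses.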

Posted 2 months ago

Apply