10.0 - 15.0 years
20 - 35 Lacs
Noida, Bengaluru
Work from Office
Description: We are looking for a Python Developer with working knowledge of ETL workflows. Experience in data extraction using APIs and writing queries in PostgreSQL is mandatory. Requirements and Responsibilities: Strong experience in Python programming and problem solving. Good command of data structures and their implementation. Good knowledge of databases, i.e., relational databases and SQL. Proficiency in requirements analysis and implementation. A degree in Computer Science. Good communication, prioritization, and organization skills. Keenness to learn and upskill. What We Offer: Exciting Projects: We focus on industries like high-tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them. Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment — or even abroad in one of our global centers or client facilities! Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays. Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toastmasters), stress management programs, professional certifications, and technical and soft-skill trainings. Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses. Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, offer subsidized food, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks, and a GL Club where you can enjoy coffee or tea with your colleagues over a game of table tennis, and we offer discounts at popular stores and restaurants!
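As a rough illustration of the core skill combination this posting asks for (API-based extraction feeding PostgreSQL), here is a minimal sketch; the endpoint, table, and credentials are hypothetical, not from the posting:

```python
import requests
import psycopg2

# Pull records from a hypothetical REST endpoint.
resp = requests.get(
    "https://api.example.com/v1/orders",
    params={"since": "2024-01-01"},
    timeout=30,
)
resp.raise_for_status()
orders = resp.json()

# Load into a hypothetical staging table; the connection context
# manager commits the transaction on success.
conn = psycopg2.connect(
    host="db.example.com", dbname="analytics", user="etl_user", password="..."
)
with conn, conn.cursor() as cur:
    for o in orders:
        cur.execute(
            "INSERT INTO staging.orders (order_id, amount, created_at) "
            "VALUES (%s, %s, %s) ON CONFLICT (order_id) DO NOTHING",
            (o["id"], o["amount"], o["created_at"]),
        )
conn.close()
```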
Posted 2 weeks ago
2.0 - 5.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Business Advisors shape the vision and strategy with the client, understand the needs of the users/stakeholders, carry out an elicitation of processes, data and capabilities and derive the target processes and the business requirements for the current and future solution. Job Description - Grade Specific Defines the methods and the business analysis framework for the business analysis work to be carried out in their project/program together with the client. Additionally performs requirements elicitation and modelling. Performs leadership activities within the project and beyond. Skills (competencies) Abstract Thinking Active Listening Agile (Software Development Framework) Analytical Thinking Backlog Grooming Business Architecture Modeling Business Process Modeling (e.g. BPMN) Change Management Coaching Collaboration Commercial Acumen Conceptual Data Modeling Conflict Management Confluence Critical Thinking CxO Conversations Data Analysis Data Requirements Management Decision-Making Emotional Intelligence Enterprise Architecture Modelling Facilitation Functional IT Architecture Modelling Giving Feedback Google Cloud Platform (GCP) (Cloud Platform) Influencing Innovation Jira Mediation Mentoring Microsoft Office Motivation Negotiation Networking Power BI Presentation skills Prioritization Problem Solving Project Governance Project Management Project Planning Qlik Relationship-Building Requirements Gathering Risk Management Scope Management SQL Stakeholder Management Story Mapping Storytelling Strategic Management Strategic Thinking SWOT Analysis Systems Requirement Analysis (or Management) Tableau Trusted Advisor UI-Design / Wireframing UML User Journey User Research Verbal Communication Written Communication
Posted 2 weeks ago
4.0 - 8.0 years
10 - 14 Lacs
Gurugram
Work from Office
Overview We are seeking a self-driven Senior Tableau Engineer with deep expertise in data modeling, visualization design, and BI-tool migrations. You'll own end-to-end dashboard development, translate complex healthcare and enterprise data into actionable insights, and lead migrations from legacy BI platforms (e.g., MicroStrategy, BusinessObjects) to Tableau. Job Location - Delhi NCR / Bangalore / Pune Key Responsibilities Data Modeling & Architecture: Design and maintain logical and physical data models optimized for Tableau performance. Collaborate with data engineers to define star/snowflake schemas, data marts, and semantic layers. Ensure data integrity, governance, and lineage across multiple source systems. Visualization Development: Develop high-impact, interactive Tableau dashboards and visualizations for executive-level stakeholders. Apply design best practices: color theory, UX principles, and accessibility standards. Optimize workbooks for performance (efficient calculations, extracts, and queries). BI Migration & Modernization: Lead migration projects from MicroStrategy, BusinessObjects, or other BI tools to Tableau. Reproduce and enhance legacy reports in Tableau, ensuring feature parity and improved UX. Validate data accuracy post-migration through sampling, reconciliation, and automated testing. Automation & Deployment: Automate data extract refreshes, alerting, and workbook publishing via Tableau Server/Online. Implement CI/CD processes for Tableau content using Git, Tableau APIs, and automated testing frameworks. Establish standardized naming conventions, folder structures, and content lifecycle policies. Collaboration & Mentorship: Partner with analytics translators, data engineers, and business owners to gather requirements and iterate on solutions. Mentor junior BI developers on Tableau best practices, performance tuning, and dashboard design. Evangelize self-service BI adoption: train users, develop documentation, and host office hours. Governance & Quality: Define and enforce Tableau governance: security, permissions, version control, and change management. Implement data quality checks and monitoring for dashboards (row counts, anomalies, thresholds). Track and report key metrics on dashboard usage, performance, and user satisfaction.
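For flavor, one common way the extract-refresh automation this role describes is scripted is via Tableau's official Python client (tableauserverclient). A minimal sketch, assuming a Personal Access Token and a datasource named claims_extract (server URL, token, site, and datasource name are all hypothetical):

```python
import tableauserverclient as TSC

auth = TSC.PersonalAccessTokenAuth("refresh-bot", "token-secret", site_id="analytics")
server = TSC.Server("https://tableau.example.com", use_server_version=True)

with server.auth.sign_in(auth):
    # Fetch the first page of datasources and pick the target by name.
    datasources, _ = server.datasources.get()
    target = next(ds for ds in datasources if ds.name == "claims_extract")
    job = server.datasources.refresh(target)  # queues an async extract refresh
    print(f"Refresh queued as job {job.id}")
```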
Posted 2 weeks ago
7.0 - 10.0 years
18 - 30 Lacs
Hyderabad
Hybrid
We are hiring for multiple roles for our client in Hyderabad: Java Developer, .NET Developer, QA Tester, and ETL Lead Developer.
Posted 2 weeks ago
4.0 - 6.0 years
6 - 10 Lacs
Tamil Nadu
Work from Office
Introduction to the Role: Are you passionate about unlocking the power of data to drive innovation and transform business outcomes? Join our cutting-edge Data Engineering team and be a key player in delivering scalable, secure, and high-performing data solutions across the enterprise. As a Data Engineer, you will play a central role in designing and developing modern data pipelines and platforms that support data-driven decision-making and AI-powered products. With a focus on Python, SQL, AWS, PySpark, and Databricks, you'll enable the transformation of raw data into valuable insights by applying engineering best practices in a cloud-first environment. We are looking for a highly motivated professional who can work across teams to build and manage robust, efficient, and secure data ecosystems that support both analytical and operational workloads. Accountabilities: Design, build, and optimize scalable data pipelines using PySpark, Databricks, and SQL on AWS cloud platforms. Collaborate with data analysts, data scientists, and business users to understand data requirements and ensure reliable, high-quality data delivery. Implement batch and streaming data ingestion frameworks from a variety of sources (structured, semi-structured, and unstructured data). Develop reusable, parameterized ETL/ELT components and data ingestion frameworks. Perform data transformation, cleansing, validation, and enrichment using Python and PySpark. Build and maintain data models, data marts, and logical/physical data structures that support BI, analytics, and AI initiatives. Apply best practices in software engineering, version control (Git), code reviews, and agile development processes. Ensure data pipelines are well-tested, monitored, and robust with proper logging and alerting mechanisms. Optimize performance of distributed data processing workflows and large datasets. Leverage AWS services (such as S3, Glue, Lambda, EMR, Redshift, Athena) for data orchestration and lakehouse architecture design. Participate in data governance practices and ensure compliance with data privacy, security, and quality standards. Contribute to documentation of processes, workflows, metadata, and lineage using tools such as Data Catalogs or Collibra (if applicable). Drive continuous improvement in engineering practices, tools, and automation to increase productivity and delivery quality. Essential Skills / Experience: 4 to 6 years of professional experience in Data Engineering or a related field. Strong programming experience with Python, including data wrangling, pipeline automation, and scripting. Deep expertise in writing complex and optimized SQL queries on large-scale datasets. Solid hands-on experience with PySpark and distributed data processing frameworks. Expertise working with Databricks for developing and orchestrating data pipelines. Experience with AWS cloud services such as S3, Glue, EMR, Athena, Redshift, and Lambda. Practical understanding of ETL/ELT development patterns and data modeling principles (star/snowflake schemas). Experience with job orchestration tools like Airflow, Databricks Jobs, or AWS Step Functions. Understanding of data lake, lakehouse, and data warehouse architectures. Familiarity with DevOps and CI/CD tools for code deployment (e.g., Git, Jenkins, GitHub Actions). Strong troubleshooting and performance optimization skills in large-scale data processing environments.
Excellent communication and collaboration skills, with the ability to work in cross-functional agile teams. Desirable Skills / Experience: AWS or Databricks certifications (e.g., AWS Certified Data Analytics, Databricks Data Engineer Associate/Professional). Exposure to data observability, monitoring, and alerting frameworks (e.g., Monte Carlo, Datadog, CloudWatch). Experience working in healthcare, life sciences, finance, or another regulated industry. Familiarity with data governance and compliance standards (GDPR, HIPAA, etc.). Knowledge of modern data architectures (Data Mesh, Data Fabric). Exposure to streaming data tools like Kafka, Kinesis, or Spark Structured Streaming. Experience with data visualization tools such as Power BI, Tableau, or QuickSight.
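As a sketch of the kind of PySpark pipeline work the accountabilities above describe (cleanse, validate, enrich, write to a curated layer), under the assumption of an S3-based lake; bucket paths and column names are hypothetical:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Ingest raw JSON from a hypothetical S3 landing zone.
raw = spark.read.json("s3://raw-bucket/orders/")

# Cleanse, validate, and enrich before writing to the curated layer.
clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount") > 0)
)

# Partitioned Parquet output for downstream BI and analytics workloads.
(clean.write.mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3://curated-bucket/orders/"))
```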
Posted 2 weeks ago
5.0 - 10.0 years
8 - 12 Lacs
Gurugram
Work from Office
Position Summary: We are seeking a highly motivated and experienced Business Analyst (BA) to act as a critical liaison between our clients and the Rackspace technical delivery team. The BA will be responsible for eliciting, analyzing, validating, and documenting business requirements related to data ingestion, processing, storage, reporting, and analytics. This role requires a strong understanding of business analysis principles, data concepts, and the ability to quickly grasp the nuances of airline operations (both passenger and cargo) and their supporting systems. Key Responsibilities: Requirement Elicitation & Analysis: Collaborate closely with client stakeholders across various departments to understand their business processes, pain points, and data needs. Conduct workshops, interviews, and document analysis to elicit detailed functional and non-functional requirements for the data platform. Analyze data originating from diverse source systems. Translate business needs into clear, concise, and actionable requirements documentation (e.g., user stories, use cases, business process models, data mapping specifications). Data Focus: Analyze source system data structures and data relationships relevant to business requirements. Define business rules for data transformation, data quality, and data validation. Develop detailed source-to-target data mapping specifications in collaboration with data architects and engineers. Define requirements for reporting, dashboards, and analytical use cases, identifying key metrics and KPIs. Contribute to the definition of data governance policies and procedures from a business perspective. Stakeholder Management & Communication: Serve as the primary bridge between the airline client's business users and the Rackspace technical team (Data Engineers, Data Architects). Clearly articulate business requirements and context to the technical team and translate technical considerations back to the business stakeholders. Facilitate effective communication and collaboration sessions. Documentation & Support: Create and maintain comprehensive requirements documentation throughout the project. Develop process flow diagrams (As-Is and To-Be) to visualize data flows. Assist in the creation of test cases and scenarios. Support User Acceptance Testing (UAT) by clarifying requirements and validating results against business needs. Support project management activities, including scope management and change request analysis. Required Qualifications: Bachelor's degree in Business Administration, Information Systems, Computer Science, or a related field. 5+ years of experience as a Business Analyst, with a proven track record on data-centric projects (e.g., Data Warehousing, Business Intelligence, Data Analytics, Data Migration, Data Platform implementation). Strong analytical and problem-solving skills with the ability to understand complex business processes and data landscapes. Excellent requirements elicitation techniques (interviews, workshops, surveys, document analysis). Proficiency in creating standard BA artifacts (BRDs, User Stories, Use Cases, Process Flows, Data Mapping). Exceptional communication (written and verbal), presentation, and interpersonal skills. Experience working directly with business stakeholders at various levels. Ability to manage ambiguity and work effectively in a fast-paced, client-facing environment. Understanding of data modelling principles.
Preferred Qualifications: Experience working within the healthcare industry (knowledge of clinical workflows, EHR/EMR systems, medical billing, patient data privacy, care coordination, or public health analytics is a significant plus). Specific experience analyzing data from or integrating with systems like Epic, Cerner, Meditech, Allscripts, or other healthcare-specific platforms. Proficiency in SQL for data analysis and querying. Familiarity with Agile/Scrum methodologies. Experience with BI and data visualization tools (e.g., Tableau, Power BI, Qlik). CBAP or similar Business Analysis certification.
Posted 2 weeks ago
3.0 - 7.0 years
11 - 15 Lacs
Gurugram
Work from Office
Overview We are seeking an experienced Data Modeller with expertise in designing and implementing data models for modern data platforms. This role requires deep knowledge of data modeling techniques, healthcare data structures, and experience with Databricks Lakehouse architecture. The ideal candidate will have a proven track record of translating complex business requirements into efficient, scalable data models that support analytics and reporting needs. About the Role As a Data Modeller, you will be responsible for designing and implementing data models for our Databricks-based Modern Data Platform. You will work closely with business stakeholders, data architects, and data engineers to create logical and physical data models that support the migration from legacy systems to the Databricks Lakehouse architecture, ensuring data integrity, performance, and compliance with healthcare industry standards. Key Responsibilities Design and implement logical and physical data models for Databricks Lakehouse implementations Translate business requirements into efficient, scalable data models Create and maintain data dictionaries, entity relationship diagrams, and model documentation Develop dimensional models, data vault models, and other modeling approaches as appropriate Support the migration of data models from legacy systems to the Databricks platform Collaborate with data architects to ensure alignment with overall data architecture Work with data engineers to implement and optimize data models Ensure data models comply with healthcare industry regulations and standards Implement data modeling best practices and standards Provide guidance on data modeling approaches and techniques Participate in data governance initiatives and data quality assessments Stay current with evolving data modeling techniques and industry trends Qualifications Extensive experience in data modeling for analytics and reporting systems Strong knowledge of dimensional modeling, data vault, and other modeling methodologies Experience with the Databricks platform and Delta Lake architecture Expertise in healthcare data modeling and industry standards Experience migrating data models from legacy systems to modern platforms Strong SQL skills and experience with data definition languages Understanding of data governance principles and practices Experience with data modeling tools and technologies Knowledge of performance optimization techniques for data models Bachelor's degree in Computer Science, Information Systems, or related field; advanced degree preferred Professional certifications in data modeling or related areas Technical Skills Data modeling methodologies (dimensional, data vault, etc.) Databricks platform and Delta Lake SQL and data definition languages Data modeling tools (erwin, ER/Studio, etc.) Data warehousing concepts and principles ETL/ELT processes and data integration Performance tuning for data models Metadata management and data cataloging Cloud platforms (AWS, Azure, GCP) Big data technologies and distributed computing Healthcare Industry Knowledge Healthcare data structures and relationships Healthcare terminology and coding systems (ICD, CPT, SNOMED, etc.) Healthcare data standards (HL7, FHIR, etc.) Healthcare analytics use cases and requirements Optionally, healthcare regulatory requirements (HIPAA, HITECH, etc.)
Clinical and operational data modeling challenges Population health and value-based care data needs Personal Attributes Strong analytical and problem-solving skills Excellent attention to detail and data quality focus Ability to translate complex business requirements into technical solutions Effective communication skills with both technical and non-technical stakeholders Collaborative approach to working with cross-functional teams Self-motivated with ability to work independently Continuous learner who stays current with industry trends What We Offer Opportunity to design data models for cutting-edge healthcare analytics Collaborative and innovative work environment Competitive compensation package Professional development opportunities Work with leading technologies in the data space This position requires a unique combination of data modeling expertise, technical knowledge, and healthcare industry understanding. The ideal candidate will have demonstrated success in designing efficient, scalable data models and a passion for creating data structures that enable powerful analytics and insights.
Posted 2 weeks ago
10.0 - 17.0 years
20 - 35 Lacs
Gurugram
Hybrid
Who We Are: As the world's leading sustainability consulting firm, ERM is uniquely positioned to contribute to the environment and society through the expertise and energy of our employees worldwide. Sustainability is what we do, and is at the heart of both our service offerings and how we operate our business. For our people, our vision means attracting, inspiring, developing and rewarding our people to work with the best clients and on the biggest challenges, thus creating valuable careers. We achieve our vision in a sustainable manner by maintaining and living our ERM values that include Accountability, Caring for our People, Client Focus, Collaboration, Empowerment, and Transparency. ERM does not accept recruiting agency resumes. Please do not forward resumes to our jobs alias, ERM employees or any other company location. ERM is not responsible for any fees related to unsolicited resumes. ERM is proud to be an Equal Employment Opportunity employer. We do not discriminate based upon race, religion, color, national origin, gender, sexual orientation, gender identity, age, marital status or disability status. Job Description An exciting opportunity has emerged for a seasoned Data Architect to become a vital member of our ERM Technology team. You will report to the Lead Enterprise Architect and join a dynamic team focused on delivering corporate and technology strategic initiatives. The role demands high-level analytical, problem-solving, and communication skills, along with a strong commitment to customer service. As the Data Architect for ERM, you will work closely with both business and technology stakeholders, utilizing your expertise in business intelligence, analytics, data engineering, data management, and data integration to significantly advance our data strategy and ecosystem. Key responsibilities include: Empowered to define the data and information management architecture for ERM. Collaborate with product owners, engineers, data scientists, and business stakeholders to understand data needs across the full product lifecycle. Ensure a shared understanding of our data, including its quality, ownership, and lineage throughout its lifecycle, from initial capture via client interaction to final consumption by internal and external processes and stakeholders. Ensure that our data landscape effectively meets corporate and regulatory reporting requirements. Establish clear ownership and governance for comprehensive data domain models, encompassing both data in motion and data at rest. Provide expert guidance on solution architecture, engineering principles, and the implementation of data applications utilizing both existing and cutting-edge technology platforms. Build a robust data community by collaborating with architects and engineers, leveraging this community to implement solutions that enhance client and business outcomes through data. The successful candidate will have: Proven experience as an enterprise data architect. Experience in end-to-end implementation of data-intensive analytics-based projects encompassing data acquisition, ingestion, integration, transformation and consumption. Proven experience in the design, development, and implementation of data engineering technologies. Strong knowledge of data management and governance principles. A strong understanding of Azure and AWS service landscapes, particularly data services. Proven experience with various data modelling techniques. Understanding of big data architectures and emerging trends in technology.
A solid familiarity with Agile methodologies, test-driven development, source control management, and automated testing. Thank you for your interest in ERM.
Posted 2 weeks ago
10.0 - 15.0 years
11 - 15 Lacs
Gurugram
Work from Office
Position Summary: We are seeking a Senior BI Platform Engineer with 10+ years of experience and deep expertise in Tableau, Power BI, Alteryx, and MicroStrategy (MSTR). The ideal candidate will serve as a technical lead and administrator for our BI platforms, ensuring reliable performance, advanced user support (L3), and stakeholder engagement. This role also includes implementing and maintaining CI/CD pipelines for BI assets to ensure scalable, automated, and governed deployment processes. Key Responsibilities: Serve as platform administrator for Tableau, Power BI, Alteryx, and MSTR, managing permissions, data sources, server performance, and upgrades. Provide Level 3 (L3) support for BI platforms, resolving complex technical issues, performing root cause analysis, and platform-level troubleshooting. Design, implement, and maintain CI/CD pipelines for BI dashboards, dataflows, and platform configurations to support agile development and deployment. Collaborate with cross-functional teams to gather requirements and ensure proper implementation of dashboards and analytics solutions. Monitor and optimize BI platform performance, usage, and adoption. Work closely with data engineering teams to ensure data quality and availability for reporting needs. Create and maintain documentation for governance, support processes, and best practices. Train and mentor users and junior team members on BI tools and reporting standards. Act as a liaison between business stakeholders and technical teams, ensuring alignment and timely resolution of issues. Manage all BI upgrades. Manage Power BI gateway, Tableau Bridge, Alteryx Server, and other BI platform capacity optimally. Manage and enable new features in each of the BI platforms. Manage licenses for each platform: automate assignments, off-board users from licenses, and manage licensing optimally. Manage RBAC for all the BI platforms. Required Qualifications: 10+ years of experience in a BI support or engineering role. Advanced experience with Tableau, Power BI, Alteryx, and MSTR, including administrative functions, troubleshooting, and user support. Proven experience providing L3 support and managing CI/CD pipelines for BI platforms. Strong knowledge of BI architecture, data visualization best practices, and data modelling concepts. Excellent problem-solving and communication skills, with the ability to interact confidently with senior business leaders. Experience with SQL, data warehouses, and cloud platforms (e.g. Azure, Snowflake) is preferred. Bachelor's degree in Computer Science, Information Systems, or a related field. Preferred Qualifications: Experience with Tableau Server/Cloud, Power BI Service, and MSTR administration. Familiarity with enterprise data governance and access control policies. Certifications in Tableau, Power BI, Alteryx, or MSTR are a plus.
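One common building block for the CI/CD pipelines this role mentions is scripted publishing of Tableau content from a CI job via the tableauserverclient library. A minimal sketch, with a hypothetical server, token, project, and workbook file:

```python
import tableauserverclient as TSC

auth = TSC.PersonalAccessTokenAuth("ci-bot", "token-secret", site_id="bi")
server = TSC.Server("https://tableau.example.com", use_server_version=True)

with server.auth.sign_in(auth):
    # Find the target project, then overwrite-publish the workbook built by CI.
    project = next(p for p in TSC.Pager(server.projects) if p.name == "Production")
    wb = TSC.WorkbookItem(project_id=project.id)
    server.workbooks.publish(
        wb, "dist/sales_dashboard.twbx", mode=TSC.Server.PublishMode.Overwrite
    )
```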
Posted 2 weeks ago
3.0 - 8.0 years
22 - 32 Lacs
Pune, Bengaluru
Hybrid
Role & responsibilities 3+ years of relevant consulting-industry experience. Deep understanding of data management best practices and data analytics. Ability to translate functional/non-functional requirements into systems requirements. Logical thinking and problem-solving skills, along with an ability to collaborate. Coding skills in Python. Proven ability in writing SQL queries to retrieve data for data analysis. Proficiency in Microsoft Office products, with advanced Excel skills required. Undertake primary ownership in driving self and team members' efforts across all phases of a project lifecycle. Oversee and develop Data Warehouse projects to ensure methodological soundness; deliver quality client deliverables within the expected timeframe. Assist in designing data models that reflect business needs and support analytical objectives. Partner with the Project lead/Program lead in delivering projects and assist in project management responsibilities such as project plans and risk mitigation. Develop and maintain comprehensive documentation for data warehouse architecture, ETL processes, and system configurations to ensure clarity and compliance. Participate in design reviews and code reviews to ensure adherence to quality standards and best practices in data warehousing. Foster collaboration with onshore and offshore team members to ensure seamless communication and effective task execution. Lead task planning and task distribution across team members, ensure timely completion with high quality, and report accurate status to the project lead. Mentor/coach the junior members in the team. Oversee direct and indirect client communications based on assigned project responsibilities. Foster a culture of continuous improvement and innovation while demonstrating the ability to learn new technologies, business domains, and project management processes. Analyze problem statements and client requirements to design and implement complex solutions using programming languages and ETL platforms. Exhibit flexibility in undertaking new and challenging problems and demonstrate excellent task management. Provide project leadership for team members regarding process improvements, planned approaches for client requests, or transition of new deliverables. Mandates: Pharma domain experience with Databricks, Python/PySpark, ETL, Snowflake, and SQL.
Posted 2 weeks ago
10.0 - 15.0 years
12 - 16 Lacs
Gurugram
Remote
Job Summary The Data Engineer is responsible for managing and operating Tableau, Tableau Bridge server, Databricks, dbt, SSRS, SSIS, AWS DWS, AWS AppFlow, and Power BI. The engineer will work closely with the customer and team to manage and operate the cloud data platform. Job Description Provides Level 3 operational coverage: Troubleshooting incidents/problems, including collecting logs, cross-checking against known issues, and investigating common root causes (for example, failed batches, infra-related items such as connectivity to source, network issues, etc.). Knowledge Management: Create/update runbooks as needed / Entitlements. Governance: Watch all configuration changes to batches and infrastructure (cloud platform), mapping them to proper documentation and aligning resources. Communication: Lead and act as a POC for the customer from off-site, handling communication, escalation, isolating issues, and coordinating with off-site resources while level-setting expectations across stakeholders. Change Management: Align resources for on-demand changes and coordinate with stakeholders as required. Request Management: Handle user requests; if the request is not runbook-based, create a new KB or update the runbook accordingly. Incident Management and Problem Management: Root cause analysis, coming up with preventive measures and recommendations such as enhancing monitoring or systematic changes as needed. KNOWLEDGE/SKILLS/ABILITY Good hands-on experience with Tableau, Tableau Bridge server, Databricks, dbt, SSRS, SSIS, AWS DWS, AWS AppFlow, and Power BI. Ability to read and write SQL and stored procedures. Good hands-on experience in configuring, managing, and troubleshooting, along with general analytical and problem-solving skills. Excellent written and verbal communication skills. Ability to communicate technical info and ideas so others will understand. Ability to successfully work and promote inclusiveness in small groups. JOB COMPLEXITY: This role requires extensive problem-solving skills and the ability to research an issue, determine the root cause, and implement the resolution; research of various sources such as Databricks/AWS/Tableau documentation may be required to identify and resolve issues. Must have the ability to prioritize issues and multi-task. EXPERIENCE/EDUCATION: Requires a Bachelor's degree in computer science or another related field plus 10+ years of hands-on experience in configuring and managing AWS/Tableau and Databricks solutions. Experience with Databricks and Tableau environments is desired.
Posted 2 weeks ago
6.0 - 10.0 years
13 - 17 Lacs
Chennai
Work from Office
Capgemini Invent Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science and creative design, to help CxOs envision and build what's next for their businesses. Your role You act as a contact person for our customers and advise them on data-driven projects. You are responsible for architecture topics and solution scenarios in the areas of Cloud Data Analytics Platform, Data Engineering, Analytics and Reporting. Experience in Cloud and Big Data architecture. Responsibility for designing viable architectures based on Microsoft Azure, AWS, Snowflake, Google (or similar) and implementing analytics. Experience in DevOps, Infrastructure as Code, DataOps, MLOps. Experience in business development (as well as your support in the proposal process). Data warehousing, data modelling and data integration for enterprise data environments. Experience in the design of large-scale ETL solutions integrating multiple/heterogeneous systems. Experience in data analysis, modelling (logical and physical data models) and design specific to a data warehouse/Business Intelligence environment (normalized and multi-dimensional modelling). Experience with ETL tools, primarily Talend and/or any other data integration tools (open source/proprietary), extensive experience with SQL and SQL scripting (PL/SQL & SQL query tuning and optimization) for relational databases such as PostgreSQL, Oracle, Microsoft SQL Server and MySQL, and with NoSQL like MongoDB and/or document-based databases. Must be detail-oriented, highly motivated and able to work independently with minimal direction. Excellent written, oral and interpersonal communication skills with the ability to communicate design solutions to both technical and non-technical audiences. Ideally, experience in agile methods such as SAFe, Scrum, etc. Ideally, experience with programming languages like Python, JavaScript, Java/Scala, etc. Your Profile Provides data services for enterprise information strategy solutions - works with business solutions leaders and teams to collect and translate information requirements into data to develop data-centric solutions. Design and develop modern enterprise data-centric solutions (e.g. DWH, Data Lake, Data Lakehouse). Responsible for designing data governance solutions. What you will love about working here We recognize the significance of flexible work arrangements to provide support. Be it remote work or flexible work hours, you will get an environment to maintain a healthy work-life balance. At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as Generative AI. About Capgemini Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs.
It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.
Posted 2 weeks ago
2.0 - 5.0 years
4 - 8 Lacs
Chennai
Work from Office
About The Role Business Advisors shape the vision with the client, understand the needs of the users/stakeholders, carry out an elicitation of processes, data and capabilities and derive the target processes and the business requirements for the current and future solution. - Grade Specific Conducts appropriate meetings/workshops to elicit/understand and document the business requirements using their domain expertise. In addition, may also produce process and data models of the current and/or future state. Skills (competencies) Abstract Thinking Active Listening Agile (Software Development Framework) Analytical Thinking Backlog Grooming Business Architecture Modeling Business Process Modeling (e.g. BPMN) Change Management Coaching Collaboration Commercial Acumen Conceptual Data Modeling Conflict Management Confluence Critical Thinking CxO Conversations Data Analysis Data Requirements Management Decision-Making Emotional Intelligence Enterprise Architecture Modelling Facilitation Functional IT Architecture Modelling Giving Feedback Google Cloud Platform (GCP) (Cloud Platform) Influencing Innovation Jira Mediation Mentoring Microsoft Office Motivation Negotiation Networking Power BI Presentation skills Prioritization Problem Solving Project Governance Project Management Project Planning Qlik Relationship-Building Requirements Gathering Risk Management Scope Management SQL Stakeholder Management Story Mapping Storytelling Strategic Management Strategic Thinking SWOT Analysis Systems Requirement Analysis (or Management) Tableau Trusted Advisor UI-Design / Wireframing UML User Journey User Research Verbal Communication Written Communication
Posted 2 weeks ago
4.0 - 9.0 years
3 - 7 Lacs
Chennai
Work from Office
About The Role Interpret business requirements and translate them into technical specifications. Design, develop, and maintain Qlik Sense dashboards, reports, and data visualizations. Perform data extraction, transformation, and loading (ETL) from various sources. Create and manage QVD files and implement data modeling best practices. Ensure data accuracy and consistency through validation and testing. Optimize Qlik Sense applications for performance and scalability. Collaborate with business analysts, data engineers, and stakeholders. Provide technical support and troubleshoot issues in Qlik Sense applications. Document development processes, data models, and user guides. 4+ years of experience in Qlik Sense development and dashboarding. Strong knowledge of data modeling, set analysis, and scripting in Qlik. Proficiency in SQL and experience with RDBMS like MS SQL Server or Oracle. Familiarity with Qlik Sense integration with web technologies and APIs. Understanding of BI concepts and data warehousing principles. Excellent problem-solving and communication skills. Qlik Sense certification is a plus. - Grade Specific Focus on Industrial Operations Engineering. Develops competency in own area of expertise. Shares expertise and provides guidance and support to others. Interprets clients' needs. Completes own role independently or with minimum supervision. Identifies problems and relevant issues in straightforward situations and generates solutions. Contributes to teamwork and interacts with customers.
Posted 2 weeks ago
0.0 - 1.0 years
15 - 16 Lacs
Thane
Work from Office
The main purpose of this role is to maintain good data health and analytics practices in the company. This role would help the company adopt data-driven decision-making in all its functions. Key Performance Indicators: Standardizing data practices for the company, creating dashboards and providing insights. Key Responsibilities: Gathering unstructured data from different departments in the company, collating it and maintaining a data warehouse. Building comprehensive reports and guiding different departments such as sales, marketing, supply chain and finance by identifying trends, formulating and testing hypotheses from data and providing actionable insights. End-to-end problem solving for the business, right from identifying gaps/opportunities to proposing innovative changes. Always supporting key functions by responding to ad-hoc data/dashboarding requests. Potential future responsibilities: using sophisticated statistical techniques to solve business problems (predictive modelling, optimization algorithms, etc.). Experience & Qualification Preferred: 0-1 years of experience working on data manipulation (R, Python or similar) and data visualisation (Power BI, Looker, etc.). Mandatory: MS Excel, SQL. Strong problem solving and analytical thinking.
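To make the data-manipulation expectation concrete, a small pandas sketch of collating a departmental export into a dashboard-ready summary; file and column names are hypothetical:

```python
import pandas as pd

# Collate a hypothetical departmental export into a dashboard feed.
sales = pd.read_excel("sales_raw.xlsx")
sales["month"] = pd.to_datetime(sales["order_date"]).dt.to_period("M")

# Aggregate revenue by region and month, then pivot for a dashboard view.
summary = (
    sales.groupby(["region", "month"], as_index=False)["revenue"].sum()
         .pivot(index="month", columns="region", values="revenue")
)
summary.to_csv("dashboard_feed.csv")
```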
Posted 2 weeks ago
4.0 - 9.0 years
11 - 15 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
We're Hiring: Cloud Data Architect (Azure Databricks). Key Responsibilities: Design and implement scalable, efficient cloud data models using Azure Data Lake and Azure Databricks. Ensure data quality, consistency, and integrity across all data models and platforms. Define and enforce development standards and best practices. Architect and model Business Intelligence (BI) and Analytics solutions to support data-driven decision-making. Collaborate with stakeholders to gather business requirements and translate them into technical specifications. Develop and maintain data models, data integration pipelines, and data warehousing solutions. Build and manage ETL (Extract, Transform, Load) processes to ensure timely and accurate data availability. Required Qualifications: Proven experience as a Data Scientist, Data Architect, Data Analyst, or similar role. Strong understanding of data warehouse architecture and principles. Proficiency in SQL and experience with database systems such as Oracle, SQL Server, or PostgreSQL. Hands-on experience with Databricks for data engineering and SQL warehousing is a must. Familiarity with data visualization tools like Power BI or Qlik. Experience with data warehousing platforms such as Snowflake or Amazon Redshift. Strong analytical and problem-solving skills. Excellent communication and collaboration abilities. Bachelor's degree in Computer Science, Engineering, or a related field; Master's degree preferred. Demonstrated expertise in BI and Analytics architecture, ETL design, and data integration workflows.
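A minimal sketch of the Databricks/Delta flow this role centers on, assuming a Databricks runtime (where the spark session is predefined) and a hypothetical Azure Data Lake path and target schema:

```python
from pyspark.sql import functions as F

# Assumes a Databricks runtime: `spark` is provided and Delta Lake is available.
bronze = spark.read.json("abfss://raw@lakestore.dfs.core.windows.net/claims/")

# Basic quality pass before persisting to the modeled layer.
silver = (
    bronze.dropDuplicates(["claim_id"])
          .withColumn("ingest_date", F.current_date())
)

# Write as a Delta table; assumes the `silver` schema already exists.
silver.write.format("delta").mode("overwrite").saveAsTable("silver.claims")
```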
Posted 2 weeks ago
1.0 - 5.0 years
11 - 12 Lacs
Hyderabad
Work from Office
Responsibilities Participating in the development of OnCall Analytics Solutions using SAFe agile methodologies. Design and implementation of ETL changes as per customer needs. Supporting the design and implementation of functionalities in the OnCall Analytics Portal, Configuration Manager and Link Analysis web applications. Design and implementation of continuous integration and deployment. Deliver the features (user stories) with utmost quality. Creating cloud offerings of the BI products. Provide ad-hoc analytics support and activities in a collaborative work environment. Should be able to deliver code supporting different versions of MSSQL and environments. Building and deploying custom components on ETL supporting different versions. Education / Qualifications B.E., B.Tech, M.Tech (CSE). Advanced analytical and technical skills; MSSQL and SSIS mandatory. Extensive hands-on experience building ETL from scratch. Knowledge of SSAS and Power BI is an additional advantage. Query optimization techniques and performance-oriented design planning. Efficient structuring of data warehouses. Knowledgeable in cloud, preferably Azure. Familiarity with build and CI tools. Strong communication skills. Understanding of cloud concepts.
Posted 2 weeks ago
4.0 - 7.0 years
25 - 30 Lacs
Ahmedabad
Work from Office
ManekTech is looking for a Data Engineer to join our dynamic team and embark on a rewarding career journey. Liaising with coworkers and clients to elucidate the requirements for each task. Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed. Reformulating existing frameworks to optimize their functioning. Testing such structures to ensure that they are fit for use. Preparing raw data for manipulation by data scientists. Detecting and correcting errors in your work. Ensuring that your work remains backed up and readily accessible to relevant coworkers. Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs.
Posted 2 weeks ago
8.0 - 10.0 years
6 - 10 Lacs
Noida
Work from Office
Profile - Power BI Developer Exp - 8-10 years Location - Noida JD - Skills and Responsibilities: 1. Proficiency in Power BI: You should be able to create, publish, and maintain Power BI reports and dashboards. 2. DAX and Power Query: You'll likely be writing DAX formulas and using Power Query for data transformation. 3. Data Modeling: Experience in data modeling and integrating reporting components from multiple data sources is expected. 4. Data Analysis: You should be able to gather, analyze, and present data effectively. 5. SQL: Familiarity with SQL and data warehousing concepts is highly beneficial. 6. ETL Processes: Understanding ETL processes is also valuable. 7. Project Management: Experienced professionals may be involved in project management aspects. 8. Emerging Trends: Staying updated with emerging trends like AI/ML integration, cloud-based BI, and data visualization techniques is crucial. 9. Data Governance and Security: Understanding data governance and security practices is increasingly important. Key Skills - Changes, enhancement and new dashboard building in QlikView, Qlik Sense and Power BI applications
Posted 2 weeks ago
3.0 - 5.0 years
12 - 14 Lacs
Bengaluru
Work from Office
Senior Ab Initio Developer Job Description: Responsibilities: Design, develop, and maintain efficient ETL processes to extract, transform, and load data from various sources into a data warehouse using Ab Initio. Perform data analysis and develop data models to support business requirements. Troubleshoot and debug ETL processes, identifying and resolving issues in a timely manner. Optimize performance of ETL jobs to meet SLAs and performance requirements. Work closely with cross-functional teams to understand data needs and develop solutions accordingly. Collaborate with onshore clients to gather requirements. Develop and maintain documentation for data processes and data models. Flexible to work weekend shifts and night shifts as required. Technical Skills: 3-5 years of hands-on experience in Ab Initio. Proficient in SQL, Unix/Linux and shell scripting. Excellent analytical and problem-solving skills. Experienced in working with onshore clients and distributed teams. Can work independently as well as collaboratively in a team environment. Strong communication skills. Familiar with cloud platforms like Azure. Experience with other data integration tools and technologies is a plus. Location: Pune Brand: Merkle Time Type: Full time Contract Type: Permanent
Posted 2 weeks ago
1.0 - 2.0 years
8 - 9 Lacs
Pune
Work from Office
Role purpose The Senior Integration Developer will be responsible for low-level design, development and implementation support for application middleware integration as part of Syngenta's Integration Center of Excellence. As an integral member of the Enterprise Integration Team, the Integration Developer will: Analyze business requirements and convert them to integration requirements. Create and take complete ownership of low-level integration design for integration requirements (projects, changes, etc.). Provide test support and work with source teams to resolve integration defects. Be a focal point for all middleware technical issues on the project. Identify issues in interfaces (performance, system errors) and work on technical POCs in SAP IS independently. Adhere to Syngenta integration design and coding standards. Exposure to Apigee or any API management tool is nice to have. Accountabilities Ensure requirements and design are fit for purpose and properly documented. Adhere to coding standards and best practices and do peer code and low-level design reviews. Work with the development team to identify the root cause of major integration defects and implement POCs/CSIs for any technical gaps. Defend integration estimates for in-scope work. Critical success factors & key challenges Reacts quickly and adapts to changes in priorities, circumstances, and direction. Effectively multi-tasks and manages multiple projects concurrently in a time-sensitive work environment. Thinks end to end, focused on ensuring a quality business outcome and avoiding issues that introduce business impact. Knowledge, experience & capabilities Critical Knowledge: Bachelor's Degree in Computer Science, Mathematics, Engineering or a related field; equivalent work experience may be considered. Good knowledge of enterprise-level application integration and middleware concepts. Well versed with enterprise database and data warehouse concepts. Critical Experience: Overall 3+ years of experience in the integration space. Strong experience in integration builds using SAP CPI or SAP IS. At least 3+ years of experience developing Enterprise Service Bus solutions using SAP PO / SAP CPI / SAP IS. Understanding of APIs. Exposure to Apigee or any other API management tool is nice to have. Decent understanding of integration design activities: how to design a flow, field mapping, etc. Experience in a DevOps or Agile work environment is a plus. Critical Capabilities: To create the optimum technical solution in the context of the customer's environment, requirements, and financial resources. Ability and willingness to learn and scale up on any new integration product. Strong probing, analysis and problem-solving skills. Excellent verbal and written English communication skills, including the ability to communicate clearly to customers over the phone. Team player who is willing to go above and beyond to help others. Positive, professional attitude and ability to establish and maintain effective working relationships in cross-functional and team environments. Thinking end to end, focused on business outcomes.
Posted 2 weeks ago
2.0 - 7.0 years
10 - 20 Lacs
Bengaluru
Work from Office
Job Title: Data Engineer Dear Candidates, Greetings from ExxonMobil! Please copy and paste the below link into your browser to apply for the position on the company website. Link to apply: https://jobs.exxonmobil.com/job-invite/80614/ Please find the JD below. What role you will play in our team Design, build, and maintain data systems, architectures, and pipelines to extract insights and drive business decisions. Collaborate with stakeholders to ensure data quality, integrity, and availability. What you will do Support in developing and owning ETL pipelines within cloud data platforms. Data extraction and transformation pipeline automation using Python/Airflow/Azure Data Factory/Qlik/Fivetran. Delivery of a task monitoring and notification system for data pipeline status. Supporting data cleansing, enrichment, and curation activities to enable ongoing business use cases. Developing and delivering data pipelines through a CI/CD delivery methodology. Developing monitoring around pipelines to ensure uptime of data flows. Optimization and refinement of current queries against Snowflake. Working with Snowflake, MSSQL, Postgres, Oracle, Azure SQL, and other relational databases. Work with different cloud databases such as Azure SQL, Azure PostgreSQL, etc. Working with Change-Data-Capture ETL software, such as Qlik and Fivetran, to populate Snowflake. Identification and remediation of failed and long-running queries. Development of large aggregate queries across a multitude of schemas. About You Skills and Qualifications (these also serve as the preferred qualifications/experience): Experience with data processing/analytics and ETL data transformation. Proficient in ingesting data to/from Snowflake and Azure storage accounts. Proficiency in at least one of the following languages: Python, C#, C++, F#, Java. Proficiency in SQL and NoSQL databases. Knowledge of SQL query development and optimization. Demonstrated experience with Snowflake, Qlik Replicate, Fivetran, and Azure Data Explorer. Azure cloud experience (current/future) with ADX, ADF, and Databricks. Expertise with Airflow, Qlik, Fivetran, and Azure Data Factory. Management of Snowflake through dbt scripting. Solid understanding of data strategies, including data management, data curation, and data governance. Ability to quickly build relationships and credibility with business customers and agile teams. A passion for learning about and experimenting with new technologies. Confidence in creating and delivering technical presentations and training. Excellent organization and planning skills. Thanks & Regards, Anita
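As a rough sketch of Python-driven ingestion into Snowflake of the kind this posting describes, using the official snowflake-connector-python; the account, warehouse, table, and file are hypothetical:

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345", user="etl_user", password="...",
    warehouse="LOAD_WH", database="ANALYTICS", schema="STAGING",
)
cur = conn.cursor()

# Stage a local file on the table's internal stage, then bulk-load it.
cur.execute("PUT file:///tmp/orders.csv @%ORDERS")
cur.execute(
    "COPY INTO ORDERS FROM @%ORDERS FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
)

cur.close()
conn.close()
```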
Posted 2 weeks ago
8.0 - 13.0 years
13 - 18 Lacs
Hyderabad
Work from Office
About the Role: Grade Level (for internal use): 11 The Role: Lead Software Engineer The Team: The Market Intelligence Industry Data Solutions business line provides data technology and services supporting acquisition, ingestion, content management, mastering, and distribution to power our Financial Institution Group business and customer needs. We focus on platform scalability to support business operations by following a common data lifecycle that accelerates business value. Our team provides essential intelligence for the Financial Services, Real Estate, and Insurance industries. The Impact: The FIG Data Engineering team will be responsible for implementing and maintaining services and tools to support existing feed systems. This enables users to consume FIG datasets and makes FIG data available for broader consumption and processing within the company. What's in it for you: Opportunity to work with global stakeholders and engage with the latest tools and technologies. Responsibilities: Build new data acquisition and transformation pipelines using advanced data processing and cloud technologies. Collaborate with the broader technology team, including information architecture and data integration teams, to align pipelines with strategic initiatives. What We're Looking For: Bachelor's degree in computer science or a related field, with at least 8+ years of professional software development experience. Must have: programming languages commonly used for data processing, data orchestration and workflow management systems, distributed data processing frameworks, relational database management systems, and big data processing frameworks. Experience with large-scale data processing platforms. Deep understanding of RESTful services, good API design, and object-oriented programming principles. Proficiency in object-oriented or functional scripting languages. Good working knowledge of relational and NoSQL databases. Experience in maintaining and developing software in production environments utilizing cloud-based tools. Strong collaboration and teamwork skills, along with excellent written and verbal communication abilities. Self-starter and motivated individual with the ability to thrive in a fast-paced software development environment. Agile experience is highly desirable. Experience with data warehousing and analytics platforms will be a significant advantage. About S&P Global Market Intelligence At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence. What's In It For You Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets.
Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.
Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global.
Health & Wellness: Health care coverage designed for the mind and body.
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country, visit https://spgbenefits.com/benefit-summaries
Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, pre-employment training, or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.
Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.
US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf
20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.2 - Middle Professional Tier II (EEO Job Group), SWP Priority Ratings - (Strategic Workforce Planning)
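As referenced in the Responsibilities above, here is a minimal Python sketch of the kind of acquisition-and-transformation pipeline step the posting describes. It is purely illustrative and not part of the posting: the endpoint URL, the field names, and the SQLite stand-in for a production warehouse are all hypothetical.

# Illustrative only: a minimal acquire-transform-load step. The endpoint,
# field names, and local SQLite target are hypothetical placeholders.
import json
import sqlite3  # stand-in for a production warehouse connection
import urllib.request

API_URL = "https://api.example.com/v1/institutions"  # hypothetical feed endpoint

def acquire(url: str) -> list[dict]:
    """Pull raw records from a REST feed."""
    with urllib.request.urlopen(url, timeout=30) as resp:
        return json.load(resp)

def transform(records: list[dict]) -> list[tuple]:
    """Keep only well-formed records and normalise the fields we load."""
    rows = []
    for rec in records:
        if rec.get("id") and rec.get("name"):
            rows.append((rec["id"], rec["name"].strip(), rec.get("country", "")))
    return rows

def load(rows: list[tuple], db_path: str = "feed.db") -> None:
    """Idempotently upsert the transformed rows into a local table."""
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS institutions"
            " (id TEXT PRIMARY KEY, name TEXT, country TEXT)"
        )
        conn.executemany(
            "INSERT OR REPLACE INTO institutions VALUES (?, ?, ?)", rows
        )

if __name__ == "__main__":
    load(transform(acquire(API_URL)))

In practice each stage would run as a task in one of the data orchestration and workflow management systems the posting lists, with retries and data-quality checks wrapped around it.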
Posted 2 weeks ago
6.0 - 10.0 years
15 - 25 Lacs
Hyderabad, Chennai
Hybrid
Key Responsibilities:
Design, develop, and maintain scalable data pipelines and ETL/ELT processes using Snowflake.
Write complex SQL queries for data extraction, transformation, and analysis (a short illustrative sketch follows this posting).
Optimize performance of Snowflake data models and queries.
Implement data warehouse solutions and integrate data from multiple sources.
Create and manage Snowflake objects such as tables, views, schemas, stages, and file formats.
Monitor and manage Snowflake compute resources and storage usage.
Collaborate with data analysts, engineers, and business teams to understand data requirements.
Ensure data quality, integrity, and security across all layers.
Participate in code reviews and follow Snowflake best practices.
Required Skills:
7+ years of experience as a Data Engineer or Snowflake Developer.
Strong hands-on experience with the Snowflake cloud data platform.
Hands-on experience with Matillion ETL.
Expert-level knowledge of SQL (joins, subqueries, CTEs, window functions, performance tuning).
Proficiency in data modeling and warehousing concepts (star/snowflake schema, normalization, etc.).
Experience with ETL tools (e.g., Informatica, Talend, Matillion, or custom scripts).
Experience with cloud platforms such as AWS, Azure, or GCP.
Familiarity with version control tools (e.g., Git).
Good understanding of data governance and data security best practices.
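Purely as an illustration of the SQL skills this posting lists (CTEs and window functions), and not part of the posting itself, here is a minimal Python sketch using the snowflake-connector-python package. The account, credentials, database, and table names are hypothetical.

# Illustrative sketch only: a CTE plus a window function, run through
# snowflake-connector-python. All identifiers are hypothetical.
import snowflake.connector

LATEST_PER_CUSTOMER = """
WITH ranked AS (
    SELECT
        customer_id,
        order_id,
        order_ts,
        ROW_NUMBER() OVER (
            PARTITION BY customer_id ORDER BY order_ts DESC
        ) AS rn
    FROM analytics.public.orders
)
SELECT customer_id, order_id, order_ts
FROM ranked
WHERE rn = 1;
"""

conn = snowflake.connector.connect(
    account="my_account",   # hypothetical
    user="etl_user",        # hypothetical
    password="***",         # use a secrets manager in practice
    warehouse="COMPUTE_WH",
    database="ANALYTICS",
)
try:
    cur = conn.cursor()
    for row in cur.execute(LATEST_PER_CUSTOMER):
        print(row)
finally:
    conn.close()

ROW_NUMBER() over a partition is a common latest-record/deduplication pattern; pushing it into the query keeps the heavy lifting on Snowflake's compute rather than in client code.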
Posted 2 weeks ago
9.0 - 13.0 years
10 - 14 Lacs
Pune
Work from Office
Reporting to the General Manager - Data Science, the Machine Learning Engineering Manager will be a key part of the Data Science Management Team, leading the productionising of advanced analytics and Machine Learning initiatives. In this exciting new role, you will be expected to collaborate with technical and data teams, building out platform capability and processes to serve our Data Science and analytics community.
Roles and Responsibilities
What will you do in the role? With strong experience in managing technical teams and Machine Learning development lifecycles, you will be responsible for the day-to-day management of the Machine Learning Engineering team, ensuring they are delivering high-quality solutions. You will also:
Coach and develop the MLE team to leverage cutting-edge Data Science, Machine Learning & AI technology.
Maintain, evolve and develop our platforms to ensure that we have robust, scalable environments for the Data Scientists.
Provide technical guidance, support and mentorship to team members, helping them grow in their roles.
Stay current with industry trends and advancements in AI and ML to support the company's data strategy.
Establish and enforce best practice for ML & AI model deployment and monitoring (a minimal training-and-persistence sketch follows this posting).
What are the key skills / experience you'll already have? You will be highly numerate with a strong technical background and a proven ability to maintain hands-on technical contribution while managing a team. You will have:
Experience of training, evaluating, deploying and maintaining Machine Learning models.
Sound understanding of data warehousing and ETL tools.
Strong technical skills in the following key tools & technologies:
Python and PySpark for data processing
Familiarity with Snowflake, RDBMS or other databases
Experience of working with Cloud infrastructure
Experience of building infrastructure as code using technologies such as Terraform
Exposure to ML frameworks such as scikit-learn/TensorFlow/PyTorch
Strong drive to master new tools, platforms, and technologies.
Methodical approach with good attention to detail.
Effective communication skills: ability to work with international teams and across cultures.
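As a hedged illustration of the train-evaluate-persist lifecycle this posting describes, and not part of the posting itself, here is a minimal scikit-learn sketch; the bundled dataset and model choice are placeholders.

# Illustrative sketch of a train/evaluate/persist cycle of the kind the
# posting describes. Dataset and model choice are placeholders.
import joblib
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# A pipeline keeps preprocessing and model together, which simplifies deployment.
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)

# Evaluate before promoting: a held-out score is the minimum deployment gate.
print("holdout accuracy:", accuracy_score(y_test, model.predict(X_test)))

# Persist the fitted artifact so a serving process can load it unchanged.
joblib.dump(model, "model.joblib")

Persisting the whole fitted pipeline, rather than the bare model, is one simple way to enforce the deployment-and-monitoring best practice the role is asked to establish: the serving side loads a single artifact and cannot drift from the training-time preprocessing.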
Posted 2 weeks ago