7.0 - 12.0 years
45 - 50 Lacs
Bengaluru
Work from Office
The Risk and Identity Solutions (RaIS) team provides risk management services for banks, merchants, and other payment networks. Machine learning and AI models are the heart of the real-time insights used by our clients to manage risk. Created by the Visa Predictive Models (VPM) team, continual improvement and efficient deployment of these models is essential for our future success. To support our rapidly growing suite of predictive models, we are looking for engineers who are passionate about managing large volumes of data, creating efficient, automated processes, and standardizing ML/AI tools.

Job Description: This is a great opportunity to work with a new Data Engineering and MLOps team to scale and structure the large-scale data engineering and ML/AI that drives significant revenue for Visa. As a member of the Risk and Identity Solutions modeling organization (VPM), your role will involve developing and implementing practices that allow deployment of machine learning models in large data science projects. You must be a hands-on expert able to navigate both data engineering and data science disciplines to build effective engineering solutions that support ML/AI models. You will partner closely with global stakeholders in RaIS Product, VPM Data Science, and Visa Research to help create and prioritize our strategic roadmap. You will then leverage your expert technical knowledge of data engineering, tools, and data architecture in the design and creation of the solutions on our roadmap. The position is based at Visa's offices in Bangalore, India.

Qualifications: 7+ years of work experience with a bachelor's degree, or 6+ years of work experience with a master's or advanced degree, in an analytical field such as computer science, statistics, finance, economics, or a relevant area. Working knowledge of the Hadoop ecosystem and associated technologies (e.g., Apache Spark, Python, Pandas).

Technical skills: Strong experience in creating large-scale data engineering pipelines, data-based decision-making, and quantitative analysis. Experience with SQL for extracting, aggregating, and processing big data pipelines using Hadoop, EMR, and NoSQL databases. Experience with complex, high-volume, multi-dimensional data, as well as machine learning models based on unstructured, structured, and streaming datasets.

Preferred skills: ETL processes: the role involves developing and executing large-scale ETL processes to support data quality, reporting, data marts, and predictive modeling. Spark pipelines: the role requires building and maintaining efficient and robust Spark pipelines to create and access data sets and feature stores for ML models, with experience writing and optimizing Spark and Hive code to process large data sets in big-data environments. Strong development experience in more than one of the following: Golang, Java, Python, Rust. Knowledge of the standard big data and real-time stack, such as Hadoop, Spark, Kafka, Redis, Flink, and similar technologies. Hands-on experience building and maintaining data pipelines and feature engineering pipelines, and comfort with core ML concepts. Hands-on experience engineering, testing, validating, and productizing AI/ML models for high-performance use cases. Exposure to model serving engines such as TensorFlow Serving or Triton. Exposure to model development frameworks like MLflow. Proficiency in managing and operating AWS services including EC2, S3, and SageMaker, and in setting up and managing distributed data and computing environments using AWS services. Knowledge of DR/HA topologies and reliability engineering, with hands-on experience implementing them. Knowledge of using and maintaining DevOps tools and implementing automation for production. Experience working with containerized and virtualized environments (Docker, K8s). Experience with Unix/shell or Python scripting, and exposure to scheduling tools like Airflow and Control-M. Experience creating and supporting production software/systems, with a proven track record of identifying and resolving performance bottlenecks in production systems. Exposure to deploying large-scale ML/AI models built by data science teams; experience with model development is a strong plus. Exposure to other public cloud equivalents and ecosystems is a plus. Strong experience with visualization tools like Tableau or Power BI is a plus.
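As a concrete illustration of the Spark feature-pipeline work this posting describes, here is a minimal batch sketch in PySpark. It is not Visa's actual code: the S3 paths, column names, and aggregation are invented for the example.

```python
# Minimal PySpark feature-pipeline sketch; paths and columns are hypothetical
# placeholders, not an actual production schema.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("risk-feature-pipeline").getOrCreate()

# Read raw transactions (batch mode; a real pipeline might read from Hive or S3).
txns = spark.read.parquet("s3://example-bucket/raw/transactions/")

# Aggregate per-card features for downstream risk models.
features = (
    txns.groupBy("card_id")
    .agg(
        F.count("*").alias("txn_count_30d"),
        F.sum("amount").alias("total_spend_30d"),
        F.countDistinct("merchant_id").alias("distinct_merchants_30d"),
    )
)

# Write to a feature-store location consumed by the ML models.
features.write.mode("overwrite").parquet("s3://example-bucket/features/card_risk/")
```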
Posted 1 month ago
4.0 - 8.0 years
11 - 15 Lacs
Pune
Work from Office
Join us as a Data Records Governance Lead at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences. You may be assessed on the key critical skills relevant for success in the role, such as experience with Data and Records Management Governance, Data Lineage, and Data Controls, as well as job-specific skillsets.

To be successful as a Data Records Governance Lead, you should have experience with: Basic/Essential Qualifications: Strategic vision and leadership. Data governance and quality management. Knowledge that includes data architecture, integration, analytics, Artificial Intelligence, or cloud computing. Desirable skillsets/good to have: Data modelling. Knowledge of data architecture or experience working with Data Architects. Data sourcing and provisioning. Data analytics. Data privacy and security. This role will be based out of Pune.

Purpose of the role: To develop, implement, and maintain effective governance frameworks for all data and records across the bank's global operations.

Accountabilities: Development and maintenance of a comprehensive data and records governance framework aligned with regulatory requirements and industry standards. Monitoring data quality and records metrics and compliance with standards across the organization. Identification and addressing of data and records management risks and gaps. Development and implementation of a records management programme that ensures the proper identification, classification, storage, retention, retrieval, and disposal of records. Development and implementation of a data governance strategy that aligns with the bank's overall data management strategy and business objectives. Provision of Group-wide guidance and training on Data and Records Management standard requirements.

Assistant Vice President Expectations: To advise and influence decision making, contribute to policy development, and take responsibility for operational effectiveness. Collaborate closely with other functions/business divisions. Lead a team performing complex tasks, using well-developed professional knowledge and skills to deliver work that impacts the whole business function. Set objectives and coach employees in pursuit of those objectives, appraise performance relative to objectives, and determine reward outcomes. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others. OR, for an individual contributor, they will lead collaborative assignments and guide team members through structured assignments, identifying the need for the inclusion of other areas of specialisation to complete assignments. They will identify new directions for assignments and/or projects, identifying a combination of cross-functional methodologies or practices to meet required outcomes. Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues. Identify ways to mitigate risk and develop new policies/procedures in support of the control and governance agenda. Take ownership for managing risk and strengthening controls in relation to the work done. Perform work that is closely related to that of other areas, which requires understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function. Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and the business strategy. Engage in complex analysis of data from multiple sources of information, internal and external sources such as procedures and practices (in other areas, teams, companies, etc.) to solve problems creatively and effectively. Communicate complex information; complex information could include sensitive information or information that is difficult to communicate because of its content or its audience. Influence or convince stakeholders to achieve outcomes.
Posted 1 month ago
3.0 - 6.0 years
7 - 11 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
To design, build, and optimize scalable data pipelines and solutions using Azure Databricks and related technologies, enabling Zodiac Maritime to make faster, data-driven decisions as part of its data transformation journey. Proficiency in data integration techniques, ETL processes, and data pipeline architectures. Well-versed in data quality rules, principles, and implementation.

Key Result Areas and Activities: Data Pipeline Development: Design and implement robust batch and streaming data pipelines using Azure Databricks and Spark. Data Architecture Implementation: Apply Medallion Architecture to structure data layers (raw, enriched, curated). Data Quality & Governance: Ensure data accuracy, consistency, and governance using tools like Azure Purview and Unity Catalog. Performance Optimization: Optimize Spark jobs, Delta Lake tables, and SQL queries for efficiency and cost-effectiveness. Collaboration & Delivery: Work closely with analysts, architects, and business teams to deliver end-to-end data solutions.

Technical Experience: Must Have: Hands-on experience with Azure Databricks, Delta Lake, and Data Factory. Proficiency in Python, PySpark, and SQL with strong query optimization skills. Deep understanding of Lakehouse architecture and Medallion design patterns. Experience building scalable ETL/ELT pipelines and data transformations. Familiarity with Git, CI/CD pipelines, and Agile methodologies. Good To Have: Knowledge of data quality frameworks and monitoring practices. Experience with Power BI or other data visualization tools. Understanding of IoT data pipelines and streaming technologies like Kafka/Event Hubs. Awareness of emerging technologies such as Knowledge Graphs.

Qualifications: Education: Likely a degree in Computer Science, Data Engineering, Information Systems, or a related field. Experience: Proven hands-on experience with the Azure data stack (Databricks, Data Factory, Delta Lake). Experience in building scalable ETL/ELT pipelines. Familiarity with data governance and DevOps practices.

Qualities: Strong problem-solving and analytical skills. Attention to detail and commitment to data quality. Collaborative mindset and effective communication. Proactive and self-driven. Passion for learning and staying updated with emerging data technologies.
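For readers unfamiliar with the Medallion pattern named above, the sketch below moves data through bronze, silver, and gold Delta layers on Databricks. The mount paths and columns are hypothetical, not Zodiac Maritime's actual layout.

```python
# Minimal Medallion-architecture sketch on Databricks / Delta Lake; paths and
# column names are illustrative only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: land raw vessel telemetry as-is.
raw = spark.read.json("/mnt/landing/vessel_telemetry/")
raw.write.format("delta").mode("append").save("/mnt/bronze/vessel_telemetry")

# Silver: enforce types and basic data-quality rules.
silver = (
    spark.read.format("delta").load("/mnt/bronze/vessel_telemetry")
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .filter(F.col("vessel_id").isNotNull())
    .dropDuplicates(["vessel_id", "event_ts"])
)
silver.write.format("delta").mode("overwrite").save("/mnt/silver/vessel_telemetry")

# Gold: curated aggregate for reporting.
gold = silver.groupBy("vessel_id").agg(F.avg("speed_knots").alias("avg_speed_knots"))
gold.write.format("delta").mode("overwrite").save("/mnt/gold/vessel_speed_summary")
```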
Posted 1 month ago
1.0 - 6.0 years
5 - 6 Lacs
Nagercoil
Work from Office
Job Summary: We are seeking a skilled Data Migration Specialist to support critical data transition initiatives, particularly involving Salesforce and Microsoft SQL Server. This role will be responsible for the end-to-end migration of data between systems, including data extraction, transformation, cleansing, loading, and validation. The ideal candidate will have a strong foundation in relational databases, a deep understanding of the Salesforce data model, and proven experience handling large-volume data loads.

Required Skills and Qualifications: 1+ years of experience in data migration, ETL, or database development roles. Strong hands-on experience with Microsoft SQL Server and T-SQL (complex queries, joins, indexing, and profiling). Proven experience using Salesforce Data Loader for bulk data operations. Solid understanding of Salesforce CRM architecture, including object relationships and schema design. Strong background in data transformation and cleansing techniques.

Nice to Have: Experience with large-scale data migration projects involving CRM or ERP systems. Exposure to ETL tools such as Talend, Informatica, MuleSoft, or custom scripts. Salesforce certifications (e.g., Administrator, Data Architecture & Management Designer) are a plus. Knowledge of Apex, Salesforce Flows, or other declarative tools is a bonus.

Key Responsibilities: Execute end-to-end data migration activities, including data extraction, transformation, and loading (ETL). Develop and optimize complex SQL queries, joins, and stored procedures for data profiling, analysis, and validation. Utilize Salesforce Data Loader and/or the Data Loader CLI to manage high-volume data imports and exports. Understand and work with the Salesforce data model, including standard/custom objects and relationships (Lookup, Master-Detail). Perform data cleansing, de-duplication, and transformation to ensure quality and consistency. Troubleshoot and resolve data-related issues, load failures, and anomalies. Collaborate with cross-functional teams to gather data mapping requirements and ensure accurate system integration. Ensure data integrity and adherence to compliance standards, and document migration processes and mappings. Independently analyze, troubleshoot, and resolve data-related issues effectively. Follow best practices for data security, performance tuning, and migration efficiency.
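One common shape for such a migration step is sketched below: extract from SQL Server with pyodbc, cleanse in pandas, and emit a CSV for Salesforce Data Loader. The connection string, table, and columns are placeholders, not a real environment.

```python
# Hedged sketch: extract contacts from SQL Server, cleanse, and write a CSV
# for Salesforce Data Loader. Server, database, table, and columns are
# hypothetical placeholders.
import pandas as pd
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=legacy-db.example.com;DATABASE=CRM;Trusted_Connection=yes;"
)

df = pd.read_sql("SELECT ContactId, FirstName, LastName, Email FROM dbo.Contacts", conn)

# Basic cleansing: trim whitespace, normalize email case, drop duplicates.
df["Email"] = df["Email"].str.strip().str.lower()
df = df.dropna(subset=["Email"]).drop_duplicates(subset=["Email"])

# Data Loader consumes CSV; headers should match the Salesforce field mapping.
df.to_csv("contacts_for_dataloader.csv", index=False)
```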
Posted 1 month ago
9.0 - 12.0 years
1 - 2 Lacs
Hyderabad
Remote
Job Title: Data Architect. Location: Remote. Employment Type: Full-Time. Reports to: Lead Data Strategist.

About Client / Project: Client is a specialist data strategy and AI consultancy that empowers businesses to unlock tangible value from their data assets. We specialize in developing comprehensive data strategies tailored to address core business and operational challenges. By combining strategic advisory with hands-on implementation, we ensure data becomes a true driver of business growth, operational efficiency, and competitive advantage for our clients. As a solutions-focused and forward-thinking consultancy, we help organizations transform their data capabilities using modern technology, reduce costs, and accelerate business growth by aligning every initiative directly with our clients' core business objectives.

Role Overview: We are seeking a highly experienced Data Architect to lead the design and implementation of scalable data architectures for global clients across industries. You will define enterprise-grade data platforms leveraging cloud-native technologies and modern data frameworks.

Key Responsibilities: Design and implement cloud-based data architectures (GCP, AWS, Azure, Snowflake, Redshift, Databricks, or Hadoop). Develop conceptual, logical, and physical data models. Define data flows, ETL/ELT pipelines, and ingestion strategies. Design and maintain data catalogs, metadata, and domain structures. Establish data architecture standards, reference models, and blueprints. Oversee data lineage, traceability, and audit readiness. Guide integration of AI/ML pipelines and analytics solutions. Ensure data privacy, protection, and compliance (e.g., GDPR, HIPAA). Collaborate closely with Engineers, Analysts, and Strategists.

Required Skills & Qualifications: 8+ years of experience in data architecture or enterprise data platform roles. Deep experience with at least two major cloud platforms (AWS, Azure, GCP). Proven hands-on work with modern data platforms: Snowflake, Databricks, Redshift, Hadoop. Strong understanding of data warehousing, data lakes, and lakehouse architecture. Advanced proficiency in SQL, Python, Spark, and/or Scala. Experience with data cataloging and metadata tools (e.g., Informatica, Collibra, Alation). Knowledge of data governance frameworks and regulatory compliance. Strong documentation, stakeholder communication, and architectural planning skills. Bachelor's degree in Computer Science, Engineering, or a related field (Master's preferred).
Posted 1 month ago
8.0 - 12.0 years
18 - 27 Lacs
Hyderabad, Bengaluru, Delhi / NCR
Work from Office
Job Description: Design, implement, and maintain data pipelines and data integration solutions using Azure Synapse. Develop and optimize data models and data storage solutions on Azure. Collaborate with data scientists and analysts to implement data processing and data transformation tasks. Ensure data quality and integrity through data validation and cleansing methodologies. Monitor and troubleshoot data pipelines to identify and resolve performance issues. Collaborate with cross-functional teams to understand and prioritize data requirements. Stay up-to-date with the latest trends and technologies in data engineering and Azure services.

Skills & Qualifications: Bachelor's degree in IT, computer science, computer engineering, or similar. 8+ years of experience in Data Engineering. Microsoft Azure Synapse Analytics experience is essential (Azure Data Factory, Dedicated SQL Pool, Lake Database, Azure Storage). Hands-on experience in Spark notebooks (Python or Scala) is mandatory. End-to-end Data Warehouse experience: ingestion, ETL, big data pipelines, data architecture, message queuing, BI/reporting, and data security. Advanced SQL/relational database knowledge and query authoring. Demonstrated experience in designing and delivering data platforms for Business Intelligence and Data Warehousing. Strong skills in handling and analysing complex, high-volume data with excellent attention to detail. Knowledge of data modelling and data warehousing concepts, such as Data Vault or 3NF. Experience with Data Governance (quality, lineage, data dictionary, and security). Familiarity with Agile methodology and an Agile working environment. Ability to work independently with POs, BAs, and Architects.
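As a rough sketch of the Spark-notebook work described above, a Synapse notebook cell might validate a raw extract before publishing it to a lake database table. The storage account, container, and table names below are invented for illustration.

```python
# Illustrative Synapse Spark-notebook cell: validate and load a raw extract
# into a lake database table. Names are made up for the example; assumes the
# "curated" database already exists.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

orders = spark.read.parquet(
    "abfss://raw@examplestorage.dfs.core.windows.net/sales/orders/"
)

# Simple validation: reject rows missing keys or with negative amounts.
valid = orders.filter(F.col("order_id").isNotNull() & (F.col("amount") >= 0))
rejected = orders.subtract(valid)

print(f"loaded={orders.count()} valid={valid.count()} rejected={rejected.count()}")

# Persist curated data for downstream BI / dedicated SQL pool ingestion.
valid.write.mode("overwrite").saveAsTable("curated.sales_orders")
```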
Posted 1 month ago
6.0 - 8.0 years
30 - 32 Lacs
Bengaluru
Work from Office
We are seeking an experienced ER Modeling Expert / Data Modeler to design, develop, and maintain conceptual, logical, and physical data models for enterprise applications and data warehouses. The ideal candidate should have a deep understanding of relational databases, normalization, data governance, and schema design while ensuring data integrity and scalability. Key Responsibilities: Design and develop Entity-Relationship (ER) models for databases, data warehouses, and data lakes. Create conceptual, logical, and physical data models using tools like Erwin, Visio, Lucidchart, or PowerDesigner. Define primary keys, foreign keys, relationships, cardinality, and constraints for optimal data integrity. Work closely with DBAs, data architects, and software developers to implement data models. Optimize database performance, indexing, and query tuning for relational databases. Define and enforce data governance, data quality, and master data management (MDM) standards. Develop and maintain metadata repositories, data dictionaries, and schema documentation. Ensure compliance with data security and privacy regulations (GDPR, HIPAA, etc.). Support ETL/ELT pipeline design to ensure smooth data flow between systems. Work with big data platforms (Snowflake, Databricks, Redshift, BigQuery, or Synapse) to support modern data architectures. Required Skills & Qualifications: 6+ years of experience in data modeling, database design, and ER modeling. Strong expertise in relational databases (SQL Server, Oracle, PostgreSQL, MySQL, etc.). Hands-on experience with data modeling tools (Erwin, PowerDesigner, DB Designer, Visio, or Lucidchart). Proficiency in SQL, indexing strategies, query performance tuning, and stored procedures. Deep understanding of normalization, denormalization, star schema, and snowflake schema. Experience with data governance, data quality, and metadata management. Strong knowledge of ETL processes, data pipelines, and data warehousing concepts. Familiarity with NoSQL databases (MongoDB, Cassandra, DynamoDB) and their modeling approaches. Ability to collaborate with cross-functional teams including data engineers, architects, and business analysts. Strong documentation and communication skills. Preferred Qualifications: Certifications in Data Management, Data Architecture, or Cloud Databases. Experience with cloud-based databases (AWS RDS, Azure SQL, Google Cloud Spanner, Snowflake). Knowledge of Graph Databases (Neo4j, Amazon Neptune) and hierarchical modeling.
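To ground the ER terminology above (primary keys, foreign keys, cardinality, indexing), here is a toy one-to-many Customer-to-Order schema, run against an in-memory SQLite database purely for illustration; the entities are hypothetical.

```python
# Toy ER-model sketch: one-to-many Customer -> Order with PK/FK constraints,
# expressed as DDL and executed against in-memory SQLite for illustration.
import sqlite3

ddl = """
CREATE TABLE customer (
    customer_id INTEGER PRIMARY KEY,
    email       TEXT NOT NULL UNIQUE
);
CREATE TABLE orders (
    order_id     INTEGER PRIMARY KEY,
    customer_id  INTEGER NOT NULL REFERENCES customer(customer_id),
    order_date   TEXT NOT NULL,
    total_amount REAL NOT NULL CHECK (total_amount >= 0)
);
-- Index the foreign key: typical tuning for join-heavy workloads.
CREATE INDEX idx_orders_customer ON orders(customer_id);
"""

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when enabled
conn.executescript(ddl)
print("schema created")
```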
Posted 1 month ago
4.0 - 8.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Do you want to help solve the world's most pressing challenges? Feeding the world's growing population and slowing climate change are two of the world's greatest challenges, and AGCO is a part of the solution! Join us to make your contribution. AGCO is looking to hire candidates for the position of Senior Manager, AI & Data Systems Architecture. We are seeking an experienced and innovative Senior Manager, AI & Data Systems Architecture to lead the design, creation, and evolution of system architectures for AI, analytics, and data systems within our organization. The ideal candidate will have extensive experience delivering scalable, high-performance data and AI architectures across cloud platforms such as AWS, Google Cloud Platform, and Databricks, with a proven ability to align technology solutions with business goals. This individual will collaborate with cross-functional teams, including data engineers, data scientists, and other IT professionals, to create architectures that support cutting-edge AI and data initiatives, driving efficiency, scalability, and innovation.

Your Impact: Architecture Leadership: Lead the end-to-end architecture for AI and data systems, ensuring cost-effective scalability, performance, and security across cloud and on-premises environments; the goal is to build and support a modern data stack. AI & Data Systems: Design, implement, and manage data infrastructure and AI platforms, including but not limited to AWS, Azure, Google Cloud Platform, Databricks, and other key data tools; lead the data model approach for all data products and solutions. Cloud Expertise: Champion cloud adoption strategies, optimizing data pipelines, analytics workloads, AI/ML model deployment, endpoint creation, and app integration. System Evolution: Drive the continuous improvement and evolution of data and AI architectures to meet emerging business needs, technological advancements, and industry trends. Collaboration & Leadership: Work closely with delivery teams, data engineers, data scientists, software engineers, and IT operations to implement comprehensive data architectures that support AI and analytics initiatives focused on continuous improvement. Strategic Vision: Partner with business and technology stakeholders to understand long-term goals, translating them into architectural frameworks and roadmaps that drive business value. Governance & Best Practices: Ensure best practices in data governance, security, and compliance, overseeing the implementation of standards across AI and data systems. Performance Optimization: Identify opportunities to optimize performance, cost-efficiency, and operational effectiveness of AI and data systems, including ETL/ELT and data pipeline creation and evolution, and optimization of AI resource models.

Functional Knowledge: Experience: 10+ years of experience in data architecture, AI systems, or cloud infrastructure, with at least 3-5 years in a leadership role; proven experience driving solutions from ideation to delivery and support. Cloud Expertise: Deep hands-on experience with cloud platforms like AWS, Google Cloud Platform (GCP), and Databricks; familiarity with other data and AI platforms is a plus. CRM Expertise: Hands-on experience with key CRM systems like Salesforce and the AI systems inside those solutions (e.g., Einstein). AI & Analytics Systems: Proven experience designing architectures for AI, machine learning, analytics, and large-scale data processing systems. Technical Knowledge: Expertise in data architecture, including data lakes, data warehouses, real-time data streaming, and batch processing frameworks. Cross-Platform Knowledge: Solid understanding of containerization (Docker, Kubernetes), infrastructure as code (Terraform, CloudFormation), and big data ecosystems (Spark, Hadoop). Experience applying Agile methodologies, including Scrum, Kanban, or SAFe. Experience in top reporting solutions, preferably Tableau, which is one of our cornerstone reporting solutions. Leadership: Strong leadership and communication skills, with the ability to drive architecture initiatives in a collaborative and fast-paced environment; excellent problem-solving skills and a proactive mindset. Education: Bachelor's degree in Computer Science, Data Science, or a related field; a Master's degree or relevant certifications (e.g., AWS Certified Solutions Architect) is preferred.

Business Expertise: Experience in industries such as manufacturing, agriculture, or supply chain, particularly in AI and data use cases. Familiarity with regulatory requirements related to data governance and security. Experience with emerging technologies like edge computing, IoT, and AI/ML automation tools.

Your Experience and Qualifications: Excellent communication and interpersonal skills, capable of interacting with multiple levels of IT and business management/leadership. Hands-on experience with SAP HANA, SAP Data Services, or similar data storage, warehousing, and/or ETL solutions. 10+ years of progressive IT experience. Experience creating data models, querying data, and business and technical process mapping. Successfully influences diverse groups and teams in a complex, ambiguous, and rapidly changing environment to deliver value-added solutions. Builds effective working relationships with the business to ensure business requirements are accurately captured, agreed, and accepted. Adaptable to new technologies/practices and acts as a change agent within teams.

Your Benefits: GLOBAL DIVERSITY - Diversity means many things to us: different brands, cultures, nationalities, genders, generations, even variety in our roles. You make us unique! ENTERPRISING SPIRIT - Every role adds value. We're committed to helping you develop and grow to realize your potential. POSITIVE IMPACT - Make it personal and help us feed the world. INNOVATIVE TECHNOLOGIES - You can combine your love for technology with manufacturing excellence and work alongside teams of people worldwide who share your enthusiasm. MAKE THE MOST OF YOU - Benefits include health care and wellness plans and flexible and virtual work options.

Your Workplace: AGCO is Great Place to Work Certified and has been recognized for delivering an exceptional employee experience and a positive workplace culture. We value inclusion and recognize the innovation a diverse workforce delivers to our farmers. Through our recruitment efforts, we are committed to building a team that includes a variety of experiences, backgrounds, cultures, and perspectives. Join us as we bring agriculture into the future and apply now! Please note that this job posting is not designed to cover or contain a comprehensive listing of all required activities, duties, responsibilities, or benefits, and may change at any time with or without notice. AGCO is proud to be an Equal Opportunity Employer.
Posted 1 month ago
1.0 - 3.0 years
16 - 19 Lacs
Bengaluru
Work from Office
About The Position: Chevron invites applications for the role of Cloud Engineer - Data Hosting within our team in India. This position supports Chevron's data hosting environment by delivering modern digital data hosting capabilities in a cost-competitive, reliable, and secure manner. This position will provide broad exposure to the application of technology to enable business, with many opportunities for growth and professional development for the candidate.

Key Responsibilities: Design, implement, and manage scalable and secure data hosting solutions on Azure. Develop and maintain data architectures, including data models, data warehouses, and data lakes. Refine data storage and extraction procedures to enhance performance and cost-effectiveness. Uphold stringent data security measures and ensure adherence to relevant industry standards and regulatory requirements. Collaborate with data scientists, analysts, and other stakeholders to understand and address their data needs. Monitor and troubleshoot data hosting environments to ensure high availability and reliability. Streamline data workflows and operations through the automation capabilities of Azure Data Factory and comparable technologies. Design, develop, and deploy modular cloud-based systems. Develop and maintain cloud solutions in accordance with best practices.

Required Qualifications: Must have a bachelor's degree in computer science, engineering, or a related discipline. 0-5 years' experience, including at least 2 years of experience in data hosting for both on-premises and Azure environments. Microsoft AZ-900 certification. Proficient in utilizing Azure data services, including Azure SQL Database, Azure Data Lake Storage, and Azure Data Factory. In-depth understanding of cloud infrastructure, encompassing virtual networks, storage solutions, and compute resources within Azure. Extensive hands-on experience with Azure services such as Azure SQL Database, Azure Blob Storage, Azure Data Lake, and Azure Synapse Analytics. Well-versed in on-premises storage systems from vendors like NetApp, Dell, and others. Skilled proficiency in scripting languages like Ansible, PowerShell, Python, and Azure CLI for automation and management tasks. Comprehensive knowledge of Azure security best practices, including identity and access management, encryption, and compliance standards.

Preferred Qualifications: Demonstrated proficiency in architecting, deploying, and managing secure and scalable data hosting solutions on the Azure platform. Extensive experience in developing and maintaining robust data architectures, including data models, data warehouses, and data lakes, utilizing Azure services. Expertise in optimizing data storage and retrieval processes for superior performance and cost efficiency within Azure environments. In-depth knowledge of data security protocols and compliance with industry standards and regulations, with a focus on Azure cloud compliance. Proven ability to collaborate effectively with data scientists, analysts, and other stakeholders to address their data needs using Azure's capabilities. Strong track record of monitoring and troubleshooting Azure data hosting environments to ensure high availability and system reliability. Skilled in automating data workflows and processes using Azure Data Factory and other Azure-based automation tools. Experience in designing, developing, and deploying modular, cloud-based systems, with a particular emphasis on Azure solutions. Commitment to maintaining cloud solutions in alignment with Azure best practices and continuously integrating Azure's latest updates and features. Possession of Azure certifications, such as the Azure Data Engineer Associate or Azure Database Administrator Associate, with a preference for candidates holding the Azure Solutions Architect Expert certification or equivalent advanced credentials.

Chevron ENGINE supports global operations, supporting business requirements across the world. Accordingly, the work hours for employees will be aligned to support business requirements. The standard work week will be Monday to Friday, with working hours of 8:00am to 5:00pm or 1:30pm to 10:30pm. Chevron participates in E-Verify in certain locations as required by law.
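As a small, hedged example of the Azure scripting such a role involves, the snippet below uploads a nightly extract to Blob Storage using the azure-storage-blob and azure-identity packages; the account URL, container, and file names are placeholders.

```python
# Hedged sketch of a routine Azure automation task: upload a nightly extract
# to Blob Storage. Account URL, container, and blob path are hypothetical.
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient(
    account_url="https://examplestorage.blob.core.windows.net",
    credential=DefaultAzureCredential(),
)

container = service.get_container_client("raw-extracts")
with open("daily_extract.parquet", "rb") as fh:
    # Overwrite keeps the job idempotent if the scheduler retries it.
    container.upload_blob("hosting/daily_extract.parquet", fh, overwrite=True)

print("upload complete")
```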
Posted 1 month ago
2.0 - 6.0 years
10 - 15 Lacs
Bengaluru
Work from Office
Job Title: Data Governance & Management Associate. Location: Bangalore, India.

Role Description: The Compliance and Anti-Financial Crime (CAFC) Data Office is responsible for Data Governance and Management across key functions including AFC, Compliance, and Legal. The team supports these functions in establishing and improving data governance to achieve critical business outcomes such as effective control operation, regulatory compliance, and operational efficiency. The CAFC Data Governance and Management team implements Deutsche Bank's Enterprise Data Management Framework, focusing on controls, culture, and capabilities, to drive improved data quality, reduce audit and regulatory findings, and strengthen controls. As a member of the Divisional Data Office, the role holder will support both Run-the-Bank and Change-the-Bank initiatives, with a particular focus on Financial Crime Risk Assessment (FCRA) data collation, processing, testing, and automation.

Your key responsibilities: Document and maintain existing and new processes; respond to internal and external audit queries and communicate updates clearly to both technical and non-technical audiences. Independently manage the FCRA data collection process, including data collection template generation, quality checks, and stakeholder escalation. Execute data cleansing and transformation tasks to prepare data for analysis. Perform variance analysis and develop a deep understanding of the underlying data sources used in Financial Crime Risk Assessment. Document data quality findings and recommendations for improvement, feeding into the technology requirements. Work with Data Architecture & developers to design and build FCRA risk data metrics. Investigate and analyse data issues related to quality, lineage, controls, and authoritative source identification. Ensure new data sources align with Deutsche Bank's Data Governance standards: maintain metadata in Collibra, visualize data lineage in Solidatus, and ensure certification and control coverage. Automate manual data processes using tools such as Python, SQL, Power Query, and MS Excel to improve efficiency and reduce operational risk. Translate complex technical issues into simple, actionable insights for business stakeholders, demonstrating strong communication and stakeholder management skills.

Your skills and experience: 6+ years of experience in data management within financial services, with a strong understanding of data risks and controls. Familiarity with industry-standard frameworks such as DCAM or DAMA (certification preferred). Hands-on experience with data cataloguing using Collibra, data lineage documentation using Solidatus, and data control assessment and monitoring. Proficiency in Python, SQL, and Power Query/Excel for data analysis and automation. Strong communication skills with the ability to explain technical concepts to non-technical stakeholders. Proven ability to work independently and collaboratively across global teams.
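To illustrate the kind of data-quality and variance checks the responsibilities describe, here is a small pandas sketch. The file names, columns, and thresholds are hypothetical, not Deutsche Bank's actual submission format.

```python
# Illustrative data-quality check for a risk-assessment data collection:
# completeness and period-over-period variance. All names are placeholders.
import pandas as pd

current = pd.read_csv("fcra_submission_q2.csv")
prior = pd.read_csv("fcra_submission_q1.csv")

# Completeness: report columns whose null rate exceeds a tolerance.
null_rates = current.isna().mean()
print("columns over 5% nulls:\n", null_rates[null_rates > 0.05])

# Variance analysis: flag metrics moving more than 20% quarter over quarter.
merged = current.merge(prior, on="business_unit", suffixes=("_q2", "_q1"))
merged["variance"] = (merged["exposure_q2"] - merged["exposure_q1"]) / merged["exposure_q1"]
print(merged.loc[merged["variance"].abs() > 0.20, ["business_unit", "variance"]])
```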
Posted 1 month ago
8.0 - 13.0 years
2 - 30 Lacs
Pune
Work from Office
Step into the role of a Senior Data Engineer. At Barclays, innovation isn't encouraged, it's expected. As a Senior Data Engineer you will build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses, and data lakes, to ensure that all data is accurate, accessible, and secure.

To be a successful Senior Data Engineer, you should have experience with: Hands-on experience working with large-scale data platforms and developing cloud solutions on the AWS data platform, with a proven track record of driving business success. Strong understanding of AWS and distributed computing paradigms, with the ability to design and develop data ingestion programs to process large data sets in batch mode using Glue, Lambda, S3, Redshift, Snowflake, and Databricks. Ability to develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming, and related technologies. Hands-on programming experience in Python and PySpark. Understanding of DevOps pipelines using Jenkins and GitLab; strong grounding in data modelling and data architecture concepts; well-versed in project management tools and Agile methodology. Sound knowledge of data governance principles and tools (Alation/Glue Data Quality, mesh); capable of suggesting solution architecture for diverse technology applications.

Additional relevant skills given below are highly valued: Experience working in the financial services industry and in various settlements and subledger functions like PNS, Stock Record and Settlements, and PNL. Knowledge of BPS, IMPACT, and Gloss products from Broadridge, and of creating ML models using Python, Spark, and Java.

You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking and digital and technology, as well as job-specific technical skills. This role is based in Pune.

Purpose of the role: To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses, and data lakes, to ensure that all data is accurate, accessible, and secure.

Accountabilities: Build and maintain data architecture pipelines that enable the transfer and processing of durable, complete, and consistent data. Design and implement data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures. Develop processing and analysis algorithms fit for the intended data complexity and volumes. Collaborate with data scientists to build and deploy machine learning models.

Vice President Expectations: To contribute to or set strategy, drive requirements, and make recommendations for change. Plan resources, budgets, and policies; manage and maintain policies/processes; deliver continuous improvements and escalate breaches of policies/procedures. If managing a team, they define jobs and responsibilities, plan for the department's future needs and operations, counsel employees on performance, and contribute to employee pay decisions/changes. They may also lead a number of specialists to influence the operations of a department, in alignment with strategic as well as tactical priorities, while balancing short- and long-term goals and ensuring that budgets and schedules meet corporate requirements. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others. OR, for an individual contributor, they will be a subject matter expert within their own discipline and will guide technical direction. They will lead collaborative, multi-year assignments and guide team members through structured assignments, identifying the need for the inclusion of other areas of specialisation to complete assignments. They will train, guide, and coach less experienced specialists and provide information affecting long-term profits, organisational risks, and strategic decisions. Advise key stakeholders, including functional leadership teams and senior management, on functional and cross-functional areas of impact and alignment. Manage and mitigate risks through assessment, in support of the control and governance agenda. Demonstrate leadership and accountability for managing risk and strengthening controls in relation to the work your team does. Demonstrate comprehensive understanding of the organisation's functions to contribute to achieving the goals of the business. Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and the business strategies. Create solutions based on sophisticated analytical thought, comparing and selecting complex alternatives. In-depth analysis with interpretative thinking will be required to define problems and develop innovative solutions. Adopt and include the outcomes of extensive research in problem-solving processes. Seek out, build, and maintain trusting relationships and partnerships with internal and external stakeholders in order to accomplish key business objectives, using influencing and negotiating skills to achieve outcomes.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship, our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset, to Empower, Challenge and Drive, the operating manual for how we behave.
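As an illustration of the real-time ingestion skills listed above, here is a minimal Spark Structured Streaming job that reads from Kafka and lands data on S3. The broker, topic, and paths are placeholders, not Barclays systems.

```python
# Minimal Spark Structured Streaming sketch: Kafka -> S3. All endpoints and
# paths are hypothetical; assumes the spark-sql-kafka package is available.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("settlements-stream").getOrCreate()

stream = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker.example.com:9092")
    .option("subscribe", "settlement-events")
    .load()
)

# Kafka delivers bytes; cast the value payload before parsing downstream.
events = stream.select(F.col("value").cast("string").alias("payload"))

query = (
    events.writeStream.format("parquet")
    .option("path", "s3://example-bucket/streams/settlements/")
    .option("checkpointLocation", "s3://example-bucket/checkpoints/settlements/")
    .start()
)
query.awaitTermination()
```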
Posted 1 month ago
3.0 - 7.0 years
9 - 14 Lacs
Bengaluru
Work from Office
Job Summary: We are seeking a skilled Escalation Engineer with expertise in NetApp ONTAP, data center operations, and storage concepts. The ideal candidate will possess a robust technical background in data storage, coupled with extensive experience in providing technical support and leading teams in resolving complex issues. This role requires a deep understanding of product sustainability and engineering cycles, and a commitment to delivering exceptional customer service.

Job Requirements: Serve as a subject matter expert in NetApp ONTAP and related storage technologies. Lead and coordinate resolution efforts for escalated technical issues, collaborating closely with cross-functional teams. Provide advanced troubleshooting and problem-solving expertise to address complex customer issues. Conduct in-depth analysis of customer environments to identify root causes and develop effective solutions. Actively participate in product sustainability initiatives, including product lifecycle management and engineering cycles. Mentor and guide junior team members, fostering a culture of continuous learning and development. Communicate effectively with customers, internal stakeholders, and management, both verbally and in writing. Document technical solutions, best practices, and knowledge base articles to enhance team efficiency and customer satisfaction.

Education & Requirements: Bachelor's degree in Computer Science, Information Technology, or a related field. Extensive experience (10+ years) in technical support as a Senior Engineer/Principal Engineer handling escalations, preferably in a storage or data center environment. In-depth knowledge of NetApp ONTAP and storage concepts such as SAN, NAS, RAID, and replication. Strong understanding of data center architectures, virtualization technologies, and cloud platforms. Proven track record of leading teams in resolving technical escalations and driving issue resolution. Excellent collaboration skills with the ability to work effectively in a cross-functional team environment. Exceptional verbal and written communication skills, with the ability to articulate complex technical concepts to both technical and non-technical audiences. Demonstrated ability to prioritize and manage multiple tasks in a fast-paced environment. Relevant certifications such as NetApp Certified Implementation Engineer (NCIE) or equivalent are a plus.

At NetApp, we embrace a hybrid working environment designed to strengthen connection, collaboration, and culture for all employees. This means that most roles will have some level of in-office and/or in-person expectations, which will be shared during the recruitment process.

Equal Opportunity Employer: NetApp is firmly committed to Equal Employment Opportunity (EEO) and to compliance with all laws that prohibit employment discrimination based on age, race, color, gender, sexual orientation, gender identity, national origin, religion, disability or genetic information, pregnancy, and any protected classification.

Why NetApp? We are all about helping customers turn challenges into business opportunity. It starts with bringing new thinking to age-old problems, like how to use data most effectively to run better, but also to innovate. We tailor our approach to the customer's unique needs with a combination of fresh thinking and proven approaches. We enable a healthy work-life balance. Our volunteer time off program is best in class, offering employees 40 hours of paid time off each year to volunteer with their favourite organizations. We provide comprehensive benefits, including health care, life and accident plans, emotional support resources for you and your family, legal services, and financial savings programs to help you plan for your future. We support professional and personal growth through educational assistance and provide access to various discounts and perks to enhance your overall quality of life. If you want to help us build knowledge and solve big problems, let's talk.
Posted 1 month ago
3.0 - 8.0 years
11 - 30 Lacs
Gurugram
Work from Office
Hi Jobseeker, we are hiring a MuleSoft Architect for our MNC client. Location - Gurgaon. Interview Mode - Virtual. Experience - 7 yrs to 15 yrs. Notice Period - only immediate to 30 days. The JD is below.

Key responsibilities of this role include: Support the design and evolution of scalable and reusable systems architecture across products and services. Collaborate with engineering, DevOps, product, compliance, and security teams to ensure solutions are robust and compliant. Translate business requirements into architectural designs that scale and adapt to multi-cloud environments and complex data landscapes. Act as a technical authority and guide for engineering teams, enforcing best practices in integration, data architecture, and performance. Provide technical direction on MuleSoft integrations and ensure robust API strategy and lifecycle management. Ensure architectural alignment with regulatory requirements. Document architectural standards, patterns, and decision-making rationales. Evaluate emerging technologies and their fit with the business and technical ecosystem.

The Requirements - Critical: 3+ years of experience as a Solutions Architect in a startup or scale-up environment. Proven experience in designing systems to support massive scale, high availability, and modularity. Strong backend engineering foundation with fluency in distributed systems, APIs, and service-oriented architectures using asynchronous patterns. Hands-on experience with MuleSoft, including API-led connectivity, integration patterns, and the Anypoint Platform. Deep familiarity with multi-cloud infrastructures (min. AWS & Azure) and cloud-native architectural principles. Deep familiarity with designing complex database integrations (specifically MongoDB). Demonstrated success in environments requiring strict regulatory compliance. Ability to manage the architectural landscape solo, making informed decisions and justifying trade-offs effectively. Experience with complex database integrations, covering both relational and non-relational databases at large scale.

Preferred: Practical experience with DevSecOps principles and CI/CD pipelines. Familiarity with containerisation (Kubernetes) and microservices patterns. Strong stakeholder communication and documentation skills. Experience mentoring or guiding development teams on architecture patterns and decisions. Comfort working in agile, cross-functional teams. Strong problem-solving skills with a pragmatic approach to design and implementation.

Role-Specific Tools & Technologies: Core Tools: MuleSoft, REST APIs, MongoDB. Cloud & Infrastructure: AWS, Azure, Terraform, Kubernetes.

Success in this role looks like: Delivery of robust, scalable architecture that supports rapid product ideation and delivery, without compromising performance or compliance. High developer velocity due to reusable and well-documented architectural patterns. Seamless integration between systems and data sources across a multi-cloud environment. Recognition as an architectural authority within the organisation, driving strategic technical decisions confidently.

Interested candidates please share your resume to
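MuleSoft flows are built on the Anypoint Platform rather than in Python, so the sketch below is only a stand-in illustration of the integration pattern the role covers: consume a REST API and idempotently upsert the records into MongoDB. The URL, database, and field names are hypothetical.

```python
# Stand-in illustration only: MuleSoft flows live on the Anypoint Platform,
# not in Python. This shows the equivalent integration pattern: pull from a
# REST API and upsert into MongoDB. All names are placeholders.
import requests
from pymongo import MongoClient

resp = requests.get("https://api.example.com/v1/customers", timeout=30)
resp.raise_for_status()
customers = resp.json()

client = MongoClient("mongodb://localhost:27017")
coll = client["integration"]["customers"]

for c in customers:
    # Idempotent upsert keyed on the source system's customer id.
    coll.replace_one({"_id": c["id"]}, {"_id": c["id"], **c}, upsert=True)

print(f"synced {len(customers)} customers")
```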
Posted 1 month ago
10.0 - 20.0 years
40 - 50 Lacs
Bengaluru
Work from Office
10+ years of database experience (MS SQL, HBase, Cassandra, MongoDB, etc.). Knowledge of Ceph is mandatory. Experience as a software developer or data architect, or in a data management role. Experience working with, and expert-level knowledge of, at least two database technologies such as MySQL, Postgres, MS SQL Server, Vertica, Snowflake, DynamoDB, MongoDB, DocumentDB, MapR, or Cassandra. Understanding of various relational and non-relational database technologies, along with their benefits, downsides, and best use cases. Willingness and commitment to learn other database, automation, and cloud technologies. Proficiency in automation. Experience working with databases in public clouds, preferably AWS. Strong analytical skills. Ability to perform system monitoring and address various issues in the system. Ability to do performance tuning and database development. Creating and maintaining high-performance, stable data architectures for the Onto Innovation product suite. Research new database methods and technologies to fully utilize platform features. Work with customers, development squads, and product management to identify and document use case scenarios. Lead on all data solution aspects, including setting data standards and providing your deep technical expertise to development teams on best practices, systems, and architectures. Design data architectures and solutions that are highly available and meet disaster recovery requirements. Design effective data store solutions that account for effective capacity planning, data tiering, and data life-cycle management. Design automated deployment procedures for on-premises and cloud. Work with QA to create detailed test plans and the needed tooling to validate targeted performance at production data volumes for new and existing database systems. Participate in code reviews and help create engineering and cross-functional practices. Work with cross-functional teams to ensure end-to-end technical and business viability documentation. Work as a member of the cross-platform Database Services team. Lead data tier architecture and design processes; participate in system and application architecture. Help with performance tuning and with database development of performance-critical code. Architect and implement HA/DR/Backup/Maintenance strategies. Collaborate with development and business teams. Be responsible for meeting various SLAs for the multiple database platforms used. Troubleshoot and address various issues in the systems. Learn other database platforms and technologies.
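As one small example of the system-monitoring duties listed, the snippet below lists the longest-running active queries on a Postgres instance via pg_stat_activity; the DSN is a placeholder and the psycopg2 package is assumed.

```python
# Illustrative monitoring snippet: report the longest-running active queries
# on a Postgres instance. Connection details are hypothetical.
import psycopg2

conn = psycopg2.connect("dbname=appdb user=monitor host=db.example.com")
with conn.cursor() as cur:
    cur.execute(
        """
        SELECT pid, now() - query_start AS runtime, left(query, 80)
        FROM pg_stat_activity
        WHERE state = 'active' AND query_start IS NOT NULL
        ORDER BY runtime DESC
        LIMIT 5
        """
    )
    for pid, runtime, query in cur.fetchall():
        print(pid, runtime, query)
```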
Posted 1 month ago
15.0 - 20.0 years
4 - 8 Lacs
Navi Mumbai
Work from Office
Project Role: Security Delivery Practitioner. Project Role Description: Assist in defining requirements, designing and building security components, and testing efforts. Must have skills: Informatica PowerCenter. Good to have skills: Python (Programming Language). Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education.

Summary: As a Security Delivery Practitioner, you will assist in defining requirements, designing and building security components, and testing efforts. Your typical day will involve collaborating with various teams to ensure that security measures are effectively integrated into the project lifecycle. You will engage in discussions to understand security needs, contribute to the design of security frameworks, and participate in testing to validate the effectiveness of security solutions. Your role will be pivotal in ensuring that security considerations are embedded in all aspects of project delivery, fostering a culture of security awareness and compliance within the organization.

Roles & Responsibilities: Expected to be an SME. Collaborate and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute on key decisions. Provide solutions to problems for their immediate team and across multiple teams. Facilitate training sessions to enhance team knowledge on security practices. Monitor and evaluate the effectiveness of security measures implemented across projects.

Professional & Technical Skills: Must To Have Skills: Proficiency in Informatica PowerCenter. Good To Have Skills: Experience with Python (Programming Language). Strong understanding of data integration and ETL processes. Experience with data quality and governance frameworks. Familiarity with security compliance standards and best practices.

Additional Information: The candidate should have a minimum of 5 years of experience in Informatica PowerCenter. This position is based in Mumbai. A 15 years full time education is required.

Qualification: 15 years full time education
Posted 1 month ago
10.0 - 15.0 years
7 - 11 Lacs
Noida
Work from Office
R1 RCM India is proud to be recognized amongst India's Top 50 Best Companies to Work For™ 2023 by the Great Place To Work Institute. We are committed to transforming the healthcare industry with our innovative revenue cycle management services. Our goal is to make healthcare simpler and enable efficiency for healthcare systems, hospitals, and physician practices. With over 30,000 employees globally, we are about 14,000 strong in India, with offices in Delhi NCR, Hyderabad, Bangalore, and Chennai. Our inclusive culture ensures that every employee feels valued, respected, and appreciated, with a robust set of employee benefits and engagement activities.

Description: We are seeking a highly skilled and motivated Data Cloud Architect to join our Product and Technology team. As a Data Cloud Architect, you will play a key role in designing and implementing our cloud-based data architecture, ensuring scalability, reliability, and optimal performance for our data-intensive applications. Your expertise in cloud technologies, data architecture, and data engineering will drive the success of our data initiatives.

Responsibilities: Collaborate with cross-functional teams, including data engineers, data leads, product owners, and stakeholders, to understand business requirements and data needs. Design and implement end-to-end data solutions on cloud platforms, ensuring high availability, scalability, and security. Architect delta lakes, data lakes, data warehouses, and streaming data solutions in the cloud. Evaluate and select appropriate cloud services and technologies to support data storage, processing, and analytics. Develop and maintain cloud-based data architecture patterns and best practices. Design and optimize data pipelines, ETL processes, and data integration workflows. Implement data security and privacy measures in compliance with industry standards. Collaborate with DevOps teams to deploy and manage data-related infrastructure on the cloud. Stay up-to-date with emerging cloud technologies and trends to ensure the organization remains at the forefront of data capabilities. Provide technical leadership and mentorship to data engineering teams.

Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience). 10 years of experience as a Data Architect, Cloud Architect, or in a similar role. Expertise in cloud platforms such as Azure. Strong understanding of data architecture concepts and best practices. Proficiency in data modeling, ETL processes, and data integration techniques. Experience with big data technologies and frameworks (e.g., Hadoop, Spark). Knowledge of containerization technologies (e.g., Docker, Kubernetes). Familiarity with data warehousing solutions (e.g., Redshift, Snowflake). Strong knowledge of security practices for data in the cloud. Excellent problem-solving and troubleshooting skills. Effective communication and collaboration skills. Ability to lead and mentor technical teams.

Additional Preferred Qualifications: Bachelor's or Master's degree in Data Science, Computer Science, or a related field. Relevant cloud certifications (e.g., Azure Solutions Architect) and data-related certifications. Experience with real-time data streaming technologies (e.g., Apache Kafka). Knowledge of machine learning and AI concepts in relation to cloud-based data solutions.

Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions. Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration, and the freedom to explore professional interests. Our associates are given valuable opportunities to contribute, to innovate, and to create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care. We believe in giving back to the community and offer a competitive benefits package. To learn more, visit r1rcm.com.
Posted 1 month ago
15.0 - 20.0 years
17 - 22 Lacs
Bengaluru
Work from Office
Project Role: Data Platform Architect
Project Role Description: Architects the data platform blueprint and implements the design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Platform Architect, you will be responsible for architecting the data platform blueprint and implementing the design, which includes the various components of the data platform. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also addressing any challenges that arise during implementation. You will engage with stakeholders to gather requirements and provide insights that shape the overall architecture of the data platform, ensuring it meets the organization's needs effectively.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with architectural standards.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling and ETL processes.
- Experience with cloud-based data solutions and architectures.
- Familiarity with data governance and compliance standards.
- Ability to design and implement scalable data solutions.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Snowflake Data Warehouse.
- This position is based in Bengaluru.
- 15 years of full-time education is required.
- Must-have skills also include AWS and Python.
Qualification: 15 years full time education
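For context on the "AWS and Python" must-have alongside Snowflake, here is a minimal sketch of querying Snowflake from Python with the official snowflake-connector-python package. The account, user, warehouse, and table names are hypothetical placeholders, not details from the listing.

```python
# Minimal sketch: pushing an aggregation down to Snowflake from Python.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_org-my_account",  # hypothetical account identifier
    user="ANALYTICS_SVC",         # hypothetical service user
    password="********",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # Aggregate in the warehouse rather than client-side; the
    # "orders" table is a hypothetical example.
    cur.execute("SELECT region, COUNT(*) FROM orders GROUP BY region")
    for region, n in cur.fetchall():
        print(region, n)
finally:
    conn.close()
```

The same pattern extends naturally to running on AWS compute (e.g., a container or Lambda) that connects to Snowflake over the connector.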
Posted 1 month ago
15.0 - 20.0 years
17 - 22 Lacs
Mumbai
Work from Office
Project Role: Data Platform Architect
Project Role Description: Architects the data platform blueprint and implements the design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Platform Architect, you will be responsible for architecting the data platform blueprint and implementing the design, which includes the various components of the data platform. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also addressing any challenges that arise during implementation. You will engage with stakeholders to gather requirements and provide insights that shape the overall architecture of the data platform, ensuring it meets the organization's needs effectively.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with architectural standards.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling and ETL processes.
- Experience with cloud-based data solutions and architectures.
- Familiarity with data governance and compliance standards.
- Ability to design and implement scalable data solutions.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Snowflake Data Warehouse.
- This position is based in Mumbai.
- 15 years of full-time education is required.
- Must-have skills also include AWS and Python.
Qualification: 15 years full time education
Posted 1 month ago
15.0 - 20.0 years
17 - 22 Lacs
Pune
Work from Office
Project Role: Data Platform Architect
Project Role Description: Architects the data platform blueprint and implements the design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Platform Architect, you will be responsible for architecting the data platform blueprint and implementing the design, which includes the various components of the data platform. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also addressing any challenges that arise during implementation. You will engage with stakeholders to gather requirements and provide insights that shape the overall architecture of the data platform, ensuring it meets the organization's needs effectively.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with architectural standards.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling and ETL processes.
- Experience with cloud-based data solutions and architectures.
- Familiarity with data governance and compliance standards.
- Ability to design and implement scalable data solutions.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Snowflake Data Warehouse.
- This position is based in Pune.
- 15 years of full-time education is required.
- Must-have skills also include AWS and Python.
Qualification: 15 years full time education
Posted 1 month ago
15.0 - 20.0 years
17 - 22 Lacs
Navi Mumbai
Work from Office
Project Role: Data Platform Architect
Project Role Description: Architects the data platform blueprint and implements the design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Microsoft Azure Data Services
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Platform Architect, you will be responsible for architecting the data platform blueprint and implementing the design, which includes the various data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure seamless integration between systems and data models, while also addressing any challenges that arise during implementation. You will engage with stakeholders to align the data architecture with business objectives, ensuring that the data platform meets the organization's needs effectively.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with architectural standards.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Microsoft Azure Data Services.
- Good To Have Skills: Experience with data governance frameworks.
- Strong understanding of data modeling techniques.
- Familiarity with cloud-based data storage solutions.
- Experience in implementing data integration strategies.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Microsoft Azure Data Services.
- This position is based at our Mumbai office.
- 15 years of full-time education is required.
Qualification: 15 years full time education
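As a hedged illustration of the Azure data-services skill set this listing names, here is a minimal sketch of landing a file in Azure Blob Storage with the azure-storage-blob SDK, a common first step in an Azure ingestion path. The connection string, container name, and file paths are hypothetical placeholders.

```python
# Minimal sketch: uploading a raw extract to Azure Blob Storage.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string(
    "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;"  # placeholder
)
container = service.get_container_client("raw-zone")  # hypothetical container

with open("daily_extract.csv", "rb") as data:
    # Overwrite keeps re-runs of the ingestion job idempotent.
    container.upload_blob(
        name="sales/2024/daily_extract.csv", data=data, overwrite=True
    )
```

Downstream Azure services (e.g., Synapse or Data Factory pipelines) would then pick the file up from the raw zone.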
Posted 1 month ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Snowflake Data Warehouse, Manual Testing
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives.
Key Responsibilities:
a. Overall 12+ years of data experience, including 5+ years on Snowflake and 3+ years on DBT (Core and Cloud).
b. Played a key role in DBT-related discussions with teams and clients to understand business problems and solutioning requirements.
c. As a DBT SME, liaise with clients on business/technology/data transformation programs; orchestrate the implementation of planned initiatives and the realization of business outcomes.
d. Spearhead the team to translate business goals and challenges into practical data transformation and technology roadmaps and data architecture designs.
e. Strong experience in designing, architecting, and managing (admin) Snowflake solutions and deploying data analytics solutions in Snowflake.
f. Strong inclination for practice building, including spearheading thought-leadership discussions and managing team activities.
Technical Experience:
a. Strong experience working as a Snowflake-on-cloud DBT Data Architect with thorough knowledge of the different services.
b. Ability to architect solutions from on-prem to cloud and create end-to-end data pipelines using DBT.
c. Excellent process knowledge in one or more of the following areas: Finance, Healthcare, Customer Experience.
d. Experience in working on client proposals (RFPs), estimation, and POCs/POVs on new Snowflake features.
e. DBT (Core and Cloud) end-to-end migration experience, including refactoring SQL for modularity, DBT modeling (.sql or .py files), and DBT job scheduling on at least 2 projects.
f. Knowledge of the Jinja template language (macros) would be an added advantage.
g. Knowledge of special features such as DBT documentation, semantic layer creation, webhooks, etc.
h. DBT and cloud certification is important.
i. Develop, fine-tune, and integrate LLM models (OpenAI, Anthropic, Mistral, etc.) into enterprise workflows via Cortex AI.
j. Deploy AI agents capable of reasoning, tool use, chaining, and task orchestration for knowledge retrieval and decision support.
k. Guide the creation and management of GenAI assets such as prompts, embeddings, semantic indexes, agents, and custom bots.
l. Collaborate with data engineers, ML engineers, and the leadership team to translate business use cases into GenAI-driven solutions.
m. Provide mentorship and technical leadership to a small team of engineers working on GenAI initiatives.
n. Stay current with advancements in Snowflake, LLMs, and generative AI frameworks to continuously enhance solution capabilities.
o. Should have a good understanding of SQL and Python; the architectural concepts of Snowflake should also be clear.
Professional Attributes:
a. Client management, stakeholder management, collaboration, interpersonal, and relationship-building skills.
b. Ability to create innovative solutions for key business challenges.
c. Eagerness to learn and develop oneself on an ongoing basis.
d. Structured communication: written, verbal, and presentational.
Educational Qualification:
a. MBA (Technology/Data-related specializations), MCA, or an advanced degree in STEM.
Qualification: 15 years full time education
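The "(.sql or .py files)" item in the listing above refers to dbt's two model formats. Below is a minimal sketch of a dbt Python model, assuming a Snowflake/Snowpark backend; the upstream model names stg_orders and stg_customers are hypothetical examples.

```python
# Minimal sketch of a dbt Python model (e.g., models/orders_enriched.py),
# assuming dbt on Snowflake, where dbt.ref() returns a Snowpark DataFrame.
def model(dbt, session):
    # In-model configuration, mirroring a .sql model's {{ config(...) }}.
    dbt.config(materialized="table")

    orders = dbt.ref("stg_orders")        # hypothetical upstream model
    customers = dbt.ref("stg_customers")  # hypothetical upstream model

    # Join staged tables; dbt persists the returned DataFrame as
    # this model's relation in the warehouse.
    return orders.join(
        customers, orders["customer_id"] == customers["id"]
    )
```

Scheduled via dbt Cloud jobs (or dbt Core with an orchestrator), this model runs alongside .sql models in the same DAG.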
Posted 1 month ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Snowflake Data Warehouse, Functional Testing
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives.
Key Responsibilities:
a. Overall 12+ years of data experience, including 5+ years on Snowflake and 3+ years on DBT (Core and Cloud).
b. Played a key role in DBT-related discussions with teams and clients to understand business problems and solutioning requirements.
c. As a DBT SME, liaise with clients on business/technology/data transformation programs; orchestrate the implementation of planned initiatives and the realization of business outcomes.
d. Spearhead the team to translate business goals and challenges into practical data transformation and technology roadmaps and data architecture designs.
e. Strong experience in designing, architecting, and managing (admin) Snowflake solutions and deploying data analytics solutions in Snowflake.
f. Strong inclination for practice building, including spearheading thought-leadership discussions and managing team activities.
Technical Experience:
a. Strong experience working as a Snowflake-on-cloud DBT Data Architect with thorough knowledge of the different services.
b. Ability to architect solutions from on-prem to cloud and create end-to-end data pipelines using DBT.
c. Excellent process knowledge in one or more of the following areas: Finance, Healthcare, Customer Experience.
d. Experience in working on client proposals (RFPs), estimation, and POCs/POVs on new Snowflake features.
e. DBT (Core and Cloud) end-to-end migration experience, including refactoring SQL for modularity, DBT modeling (.sql or .py files), and DBT job scheduling on at least 2 projects.
f. Knowledge of the Jinja template language (macros) would be an added advantage.
g. Knowledge of special features such as DBT documentation, semantic layer creation, webhooks, etc.
h. DBT and cloud certification is important.
i. Develop, fine-tune, and integrate LLM models (OpenAI, Anthropic, Mistral, etc.) into enterprise workflows via Cortex AI.
j. Deploy AI agents capable of reasoning, tool use, chaining, and task orchestration for knowledge retrieval and decision support.
k. Guide the creation and management of GenAI assets such as prompts, embeddings, semantic indexes, agents, and custom bots.
l. Collaborate with data engineers, ML engineers, and the leadership team to translate business use cases into GenAI-driven solutions.
m. Provide mentorship and technical leadership to a small team of engineers working on GenAI initiatives.
n. Stay current with advancements in Snowflake, LLMs, and generative AI frameworks to continuously enhance solution capabilities.
o. Should have a good understanding of SQL and Python; the architectural concepts of Snowflake should also be clear.
Professional Attributes:
a. Client management, stakeholder management, collaboration, interpersonal, and relationship-building skills.
b. Ability to create innovative solutions for key business challenges.
c. Eagerness to learn and develop oneself on an ongoing basis.
d. Structured communication: written, verbal, and presentational.
Educational Qualification:
a. MBA (Technology/Data-related specializations), MCA, or an advanced degree in STEM.
Qualification: 15 years full time education
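For the "integrate LLM models via Cortex AI" item in the listing above, here is a minimal sketch of calling a Snowflake Cortex LLM function from Snowpark. The connection parameters and prompt are hypothetical placeholders, and this shows one common integration path rather than any team's actual workflow.

```python
# Minimal sketch: invoking a Cortex-hosted LLM from Snowpark so the
# data never leaves Snowflake.
from snowflake.snowpark import Session

session = Session.builder.configs({
    "account": "my_org-my_account",  # hypothetical account identifier
    "user": "GENAI_SVC",             # hypothetical service user
    "password": "********",
    "warehouse": "GENAI_WH",
}).create()

# SNOWFLAKE.CORTEX.COMPLETE(model, prompt) runs a hosted model via SQL.
row = session.sql(
    "SELECT SNOWFLAKE.CORTEX.COMPLETE('mistral-large', "
    "'Summarize the key risks in this claim note: ...') AS answer"
).collect()[0]
print(row["ANSWER"])
```

In practice the prompt text would come from a table column, letting the same call score or summarize rows in bulk.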
Posted 1 month ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Snowflake Data Warehouse
Good to have skills: Data Engineering
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives.
Key Responsibilities:
a. Overall 12+ years of data experience, including 5+ years on Snowflake and 3+ years on DBT (Core and Cloud).
b. Played a key role in DBT-related discussions with teams and clients to understand business problems and solutioning requirements.
c. As a DBT SME, liaise with clients on business/technology/data transformation programs; orchestrate the implementation of planned initiatives and the realization of business outcomes.
d. Spearhead the team to translate business goals and challenges into practical data transformation and technology roadmaps and data architecture designs.
e. Strong experience in designing, architecting, and managing (admin) Snowflake solutions and deploying data analytics solutions in Snowflake.
f. Strong inclination for practice building, including spearheading thought-leadership discussions and managing team activities.
Technical Experience:
a. Strong experience working as a Snowflake-on-cloud DBT Data Architect with thorough knowledge of the different services.
b. Ability to architect solutions from on-prem to cloud and create end-to-end data pipelines using DBT.
c. Excellent process knowledge in one or more of the following areas: Finance, Healthcare, Customer Experience.
d. Experience in working on client proposals (RFPs), estimation, and POCs/POVs on new Snowflake features.
e. DBT (Core and Cloud) end-to-end migration experience, including refactoring SQL for modularity, DBT modeling (.sql or .py files), and DBT job scheduling on at least 2 projects.
f. Knowledge of the Jinja template language (macros) would be an added advantage.
g. Knowledge of special features such as DBT documentation, semantic layer creation, webhooks, etc.
h. DBT and cloud certification is important.
i. Develop, fine-tune, and integrate LLM models (OpenAI, Anthropic, Mistral, etc.) into enterprise workflows via Cortex AI.
j. Deploy AI agents capable of reasoning, tool use, chaining, and task orchestration for knowledge retrieval and decision support.
k. Guide the creation and management of GenAI assets such as prompts, embeddings, semantic indexes, agents, and custom bots.
l. Collaborate with data engineers, ML engineers, and the leadership team to translate business use cases into GenAI-driven solutions.
m. Provide mentorship and technical leadership to a small team of engineers working on GenAI initiatives.
n. Stay current with advancements in Snowflake, LLMs, and generative AI frameworks to continuously enhance solution capabilities.
o. Should have a good understanding of SQL and Python; the architectural concepts of Snowflake should also be clear.
Professional Attributes:
a. Client management, stakeholder management, collaboration, interpersonal, and relationship-building skills.
b. Ability to create innovative solutions for key business challenges.
c. Eagerness to learn and develop oneself on an ongoing basis.
Educational Qualification:
a. MBA (Technology/Data-related specializations), MCA, or an advanced degree in STEM.
Qualification: 15 years full time education
Posted 1 month ago
3.0 - 8.0 years
5 - 10 Lacs
Chennai
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. You will collaborate with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Your typical day will involve working on the data platform blueprint and design, collaborating with architects, and ensuring seamless integration between systems and data models.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions for work-related problems.
- Assist with the data platform blueprint and design.
- Collaborate with Integration Architects and Data Architects.
- Ensure cohesive integration between systems and data models.
- Implement data platform components.
- Troubleshoot and resolve data platform issues.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data platform blueprint and design.
- Experience with data integration and data modeling.
- Hands-on experience with data platform components.
- Knowledge of data platform security and governance.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Chennai office.
- 15 years of full-time education is required.
Qualification: 15 years full time education
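As an illustration of the day-to-day Databricks work this listing describes, here is a minimal PySpark sketch that aggregates raw events and writes a Delta table. The input path, column names, and output table are hypothetical placeholders.

```python
# Minimal sketch: a small batch transformation as it might run in a
# Databricks job. On Databricks the session already exists; getOrCreate()
# simply returns it.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

raw = spark.read.format("json").load("/mnt/raw/events/")  # hypothetical path

daily = (
    raw.withColumn("event_date", F.to_date("event_ts"))
       .groupBy("event_date", "event_type")
       .agg(F.count("*").alias("events"))
)

# Persist as a Delta table, the default storage layer on Databricks.
daily.write.format("delta").mode("overwrite").saveAsTable("analytics.daily_events")
```

The same skeleton scales from a notebook experiment to a scheduled job without code changes.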
Posted 1 month ago
3.0 - 8.0 years
5 - 10 Lacs
Pune
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the data architecture. You will be responsible for analyzing requirements and translating them into effective data solutions, ensuring that the data platform meets the needs of various stakeholders. Additionally, you will participate in team meetings to share insights and contribute to the overall strategy of the data platform.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions for work-related problems.
- Engage in continuous learning to stay updated with the latest trends and technologies in data platforms.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration techniques and best practices.
- Experience with data modeling and database design.
- Familiarity with cloud-based data solutions and architectures.
- Knowledge of data governance and data quality principles.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- 15 years of full-time education is required.
Qualification: 15 years full time education
Posted 1 month ago