
29 Data Lakes Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the source job portal.

3.0 - 8.0 years

6 - 12 Lacs

Hyderabad

Work from Office

Source: Naukri

Role Description: The R&D Data Catalyst Team is responsible for building Data Searching, Cohort Building, and Knowledge Management tools that give the Amgen scientific community visibility into Amgen's wealth of human datasets, projects and study histories, and knowledge of various scientific findings. These solutions are pivotal tools in Amgen's goal to accelerate the speed of discovery and the speed to market of advanced precision medications. The Sr. Data Engineer will be responsible for the end-to-end development of an enterprise analytics and data mastering solution leveraging Databricks and Power BI. This role requires expertise in both data architecture and analytics, with the ability to create scalable, reliable, and high-performing enterprise solutions that support research cohort-building and advanced research pipelines. The ideal candidate will have experience creating and surfacing large unified repositories of human data, built from integrations across multiple repositories and solutions, and be exceptionally skilled in data analysis and profiling. You will collaborate closely with stakeholders, product team members, and related IT teams to design and implement data models, integrate data from various sources, and ensure best practices for data governance and security. A strong background in data warehousing, ETL, Databricks, Power BI, and enterprise data mastering is essential.

Roles & Responsibilities: Design and build scalable enterprise analytics solutions using Databricks, Power BI, and other modern data tools. Leverage data virtualization, ETL, and semantic layers to balance the need for unification, performance, and data transformation against the goal of reducing data proliferation. Break down features into work that aligns with the architectural runway. Participate hands-on in pilots and proofs-of-concept for new patterns. Create robust documentation covering data analysis and profiling as well as proposed designs and data logic. Develop advanced SQL queries to profile and unify data (a minimal sketch follows this listing). Develop data processing code in SQL, along with semantic views to prepare data for reporting. Develop Power BI models and reporting packages. Design robust data models and processing layers that support both analytical processing and operational reporting needs. Design and develop solutions based on best practices for data governance, security, and compliance within Databricks and Power BI environments. Ensure the integration of data systems with other enterprise applications, creating seamless data flows across platforms. Develop and maintain Power BI solutions, ensuring data models and reports are optimized for performance and scalability. Collaborate with stakeholders to define data requirements, functional specifications, and project goals. Continuously evaluate and adopt new technologies and methodologies to enhance the architecture and performance of data solutions.

Basic Qualifications and Experience: Master's degree with 4 to 6 years of experience as a Product Owner / Platform Owner / Service Owner, OR Bachelor's degree with 8 to 10 years of experience as a Product Owner / Platform Owner / Service Owner.

Functional Skills, Must-Have: Minimum of 3 years of hands-on experience with BI solutions (preferably Power BI or Business Objects), including report development, dashboard creation, and optimization. Minimum of 6 years of hands-on experience building change-data-capture (CDC) ETL pipelines, designing and building data warehouses, and enterprise-level data management. Hands-on experience with Databricks, including data engineering, optimization, and analytics workloads. Deep understanding of Power BI, including model design, DAX, and Power Query. Proven experience designing and implementing data mastering solutions and data governance frameworks. Expertise in cloud platforms (AWS), data lakes, and data warehouses. Strong knowledge of ETL processes, data pipelines, and integration technologies. Strong communication and collaboration skills to work with cross-functional teams and senior leadership. Ability to assess business needs and design solutions that align with organizational goals. Exceptional hands-on capabilities in data profiling, data transformation, and data mastering. Success in mentoring and training team members.

Good-to-Have Skills: Experience in developing differentiated and deliverable solutions. Experience with human data, ideally human healthcare data. Familiarity with laboratory testing, patient data from clinical care, HL7, FHIR, and/or clinical trial data management.

Professional Certifications (preferred or mandatory as noted): ITIL Foundation or other relevant certifications (preferred). SAFe Agile Practitioner (6.0). Microsoft Certified: Data Analyst Associate (Power BI) or related certification. Databricks Certified Professional or similar certification.

Soft Skills: Excellent analytical and troubleshooting skills. Deep intellectual curiosity. Highest degree of initiative and self-motivation. Strong verbal and written communication skills, including presenting complex technical/business topics to varied audiences. Confident technical leadership. Ability to work effectively with global, virtual teams, including leveraging tools and artifacts to ensure clear and efficient collaboration across time zones. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Strong problem-solving and analytical skills; ability to learn quickly and to retain and synthesize complex information from diverse sources.
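To make the "profile and unify" responsibility concrete, here is a minimal PySpark sketch of the kind of work this listing describes, not Amgen's actual pipeline: the table and column names (clinical.subjects, omics.participants, subject_id, and so on) are hypothetical assumptions.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("profile-and-unify").getOrCreate()

# Profiling: row count, distinct keys, and a null rate for one source table.
spark.sql("""
    SELECT 'clinical.subjects'        AS source_table,
           COUNT(*)                   AS row_count,
           COUNT(DISTINCT subject_id) AS distinct_subjects,
           AVG(CASE WHEN birth_year IS NULL THEN 1 ELSE 0 END) AS birth_year_null_rate
    FROM clinical.subjects
""").show()

# Unification: a semantic view that harmonizes two repositories for reporting.
spark.sql("""
    CREATE OR REPLACE VIEW unified.subjects AS
    SELECT subject_id, birth_year, 'study_a' AS source FROM clinical.subjects
    UNION ALL
    SELECT participant_id AS subject_id, year_of_birth AS birth_year,
           'study_b' AS source
    FROM omics.participants
""")
```

The pattern scales: profile each source first, then resolve naming and type differences in the view layer rather than copying data, which matches the "reduce data proliferation" goal the posting mentions.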

Posted 1 day ago


5.0 - 10.0 years

6 - 10 Lacs

Delhi, India

On-site

Source: Foundit

Responsibilities:

Data Analytics & Insight Generation (30%): Analyze marketing, digital, and campaign data to uncover patterns and deliver actionable insights. Support performance measurement, experimentation, and strategic decision-making across the marketing funnel. Translate business questions into structured analyses and data-driven narratives.

Data Infrastructure & Engineering (30%): Design and maintain scalable data pipelines and workflows using SQL, Python, and Databricks. Build and evolve a marketing data lake, integrating APIs and data from multiple platforms and tools (see the ingestion sketch after this listing). Work across cloud environments (Azure, AWS) to support analytics-ready data at scale.

Project & Delivery Ownership (25%): Serve as project lead or scrum owner across analytics initiatives, planning sprints, managing delivery, and driving alignment. Use tools like JIRA to manage work in an agile environment and ensure timely execution. Collaborate with cross-functional teams to align priorities and execute on roadmap initiatives.

Visualization & Platform Enablement (15%): Build high-impact dashboards and data products using Tableau, with a focus on usability, scalability, and performance. Enable stakeholder self-service through clean data architecture and visualization best practices. Experiment with emerging tools and capabilities, including GenAI for assisted analytics.

Experience: 5+ years of experience in data analytics, digital analytics, or data engineering, ideally in a marketing or commercial context. Hands-on experience with SQL, Python, and tools such as Databricks, Azure, or AWS. Proven track record of building and managing data lakes, ETL pipelines, and API integrations. Strong proficiency in Tableau; experience with Tableau Prep is a plus. Familiarity with Google Analytics (GA4), GTM, and social media analytics platforms. Experience working in agile teams, with comfort using JIRA for sprint planning and delivery. Exposure to predictive analytics, modeling, and GenAI applications is a plus. Strong communication and storytelling skills, able to lead high-stakes meetings and deliver clear insights to senior stakeholders. Excellent organizational and project management skills; confident in managing competing priorities. High attention to detail, an ownership mindset, and a collaborative, delivery-focused approach.
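As a concrete illustration of the "build and evolve a marketing data lake, integrating APIs" line above, here is a minimal sketch. Everything vendor-specific is an assumption: the endpoint URL, response shape, and lake path are hypothetical, not any particular platform's API.

```python
from datetime import date
import requests
import pandas as pd

API_URL = "https://api.example-ads.com/v1/campaigns"  # hypothetical endpoint

# Pull one day's campaign metrics from the (assumed) marketing API.
resp = requests.get(API_URL, headers={"Authorization": "Bearer <token>"}, timeout=30)
resp.raise_for_status()
df = pd.DataFrame(resp.json()["data"])  # assumed response shape: {"data": [...]}

# Land it in a date-partitioned parquet layer so downstream Databricks and
# Tableau jobs can read incrementally from the lake.
df.to_parquet(f"/lake/marketing/campaigns/dt={date.today().isoformat()}.parquet")
```

In practice a scheduler (ADF or Databricks Workflows) would run one such job per source, with credentials pulled from a secrets vault rather than inlined.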

Posted 1 day ago


8.0 - 10.0 years

8 - 10 Lacs

Hyderabad / Secunderabad, Telangana, India

On-site

Source: Foundit

Amgen's Clinical Computation Platform Product Team manages a core set of clinical computation solutions that support global clinical development. This team is responsible for building and maintaining systems for clinical data storage, data auditing and security management, and analysis and reporting capabilities. These capabilities are pivotal to Amgen's goal to serve patients. The Principal IS Architect will define the architecture vision, create roadmaps, and support the design and implementation of advanced computational platforms for clinical development, ensuring that IT strategies align with business goals. The Principal IS Architect will work closely with partners across departments, including CfDA, GDO, CfOR, CfTI, CPMS, and IT teams, to design and implement scalable, reliable, and high-performing solutions.

Key Responsibilities: Develop and maintain the enterprise architecture vision and strategy, ensuring alignment with business objectives. Create and maintain architectural roadmaps that guide the evolution of IT systems and capabilities. Establish and enforce architectural standards, policies, and governance frameworks. Evaluate emerging technologies and assess their potential impact on the enterprise/domain/solution architecture. Identify and mitigate architectural risks, ensuring that IT systems are scalable, secure, and resilient. Maintain comprehensive documentation of the architecture, including principles, standards, and models. Drive continuous improvement in the architecture by finding opportunities for innovation and efficiency. Work with partners to gather and analyze requirements, ensuring that solutions meet both business and technical needs. Evaluate and recommend technologies and tools that best fit the solution requirements. Ensure seamless integration between systems and platforms, both within the organization and with external partners. Design systems that can scale to meet growing business needs and performance demands. Develop and maintain logical, physical, and conceptual data models to support business needs. Establish and enforce data standards, governance policies, and best practices. Design and manage metadata structures to enhance information retrieval and usability.

Basic Qualifications: Master's degree with 8 to 10 years of experience in Computer Science, IT, or a related field, OR Bachelor's degree with 10 to 14 years of experience, OR Diploma with 14 to 18 years of experience. Proficiency in designing scalable, secure, and cost-effective solutions. Expertise in cloud platforms (AWS, Azure, GCP), data lakes, and data warehouses. Experience in evaluating and selecting technology vendors. Ability to create and demonstrate proof-of-concept solutions to validate technical feasibility. Strong knowledge of the Clinical Research and Development domain. Experience working in Agile methodology, including Product Teams and Product Development models.

Preferred Qualifications: Strong solution design and problem-solving skills. Experience in developing differentiated and deliverable solutions. Ability to analyze client requirements and translate them into solutions. Experience with machine learning and artificial intelligence applications in clinical research. Strong programming skills in languages such as Python, R, or Java. Experience with DevOps, Continuous Integration, and Continuous Delivery methodology.

Soft Skills: Excellent critical-thinking and problem-solving skills. Good communication and collaboration skills. Demonstrated ability to function in a team setting. Strong presentation skills.

Posted 5 days ago


1.0 - 3.0 years

3 - 5 Lacs

Hyderabad

Work from Office

Source: Naukri

What you will do: The R&D Precision Medicine team is responsible for Data Standardization, Data Searching, Cohort Building, and Knowledge Management tools that provide the Amgen scientific community with access to Amgen's wealth of human datasets, projects and study histories, and knowledge of various scientific findings. These data include clinical data, omics, and images. These solutions are pivotal tools in Amgen's goal to accelerate the speed of discovery and the speed to market of advanced precision medications. The Data Engineer will be responsible for full-stack development of enterprise analytics and data mastering solutions leveraging Databricks and Power BI. This role requires expertise in both data architecture and analytics, with the ability to create scalable, reliable, and high-performing enterprise solutions that support research cohort-building and advanced AI pipelines. The ideal candidate will have experience creating and surfacing large unified repositories of human data, built from integrations across multiple repositories and solutions, and be exceptionally skilled in data analysis and profiling. You will collaborate closely with partners, product team members, and related IT teams to design and implement data models, integrate data from various sources, and ensure best practices for data governance and security. A solid background in data warehousing, ETL, Databricks, Power BI, and enterprise data mastering is essential.

Roles & Responsibilities: Design and build scalable enterprise analytics solutions using Databricks, Power BI, and other modern data management tools. Leverage data virtualization, ETL, and semantic layers to balance the need for unification, performance, and data transformation against the goal of reducing data proliferation. Break down features into work that aligns with the architectural runway. Participate hands-on in pilots and proofs-of-concept for new patterns. Create robust documentation covering data analysis and profiling as well as proposed designs and data logic. Develop advanced SQL queries to profile and unify data. Develop data processing code in SQL, along with semantic views to prepare data for reporting. Develop Power BI models and reporting packages. Design robust data models and processing layers that support both analytical processing and operational reporting needs. Design and develop solutions based on best practices for data governance, security, and compliance within Databricks and Power BI environments. Ensure the integration of data systems with other enterprise applications, creating seamless data flows across platforms. Develop and maintain Power BI solutions, ensuring data models and reports are optimized for performance and scalability. Collaborate with partners to define data requirements, functional specifications, and project goals. Continuously evaluate and adopt new technologies and methodologies to enhance the architecture and performance of data solutions.

What we expect of you: We are all different, yet we all use our unique contributions to serve patients. The professional we seek is someone with these qualifications. Basic Qualifications: Master's degree with 1 to 3 years of experience in Data Engineering OR Bachelor's degree with 1 to 3 years of experience in Data Engineering.

Must-Have Skills: Minimum of 1 year of hands-on experience with BI solutions (preferably Power BI or Business Objects), including report development, dashboard creation, and optimization. Minimum of 1 year of hands-on experience building change-data-capture (CDC) ETL pipelines (a MERGE-based sketch follows this listing), data warehouse design and build, and enterprise-level data management. Hands-on experience with Databricks, including data engineering, optimization, and analytics workloads. Experience using cloud platforms (AWS), data lakes, and data warehouses. Working knowledge of ETL processes, data pipelines, and integration technologies. Good communication and collaboration skills to work with cross-functional teams and senior leadership. Ability to assess business needs and design solutions that align with organizational goals. Exceptional hands-on capabilities with data profiling and data analysis.

Good-to-Have Skills: Experience with human data, ideally human healthcare data. Familiarity with laboratory testing, patient data from clinical care, HL7, FHIR, and/or clinical trial data management.

Professional Certifications: ITIL Foundation or other relevant certifications (preferred). SAFe Agile Practitioner (6.0). Microsoft Certified: Data Analyst Associate (Power BI) or related certification. Databricks Certified Professional or similar certification.

Soft Skills: Excellent analytical and troubleshooting skills. Deep intellectual curiosity. Highest degree of initiative and self-motivation. Strong verbal and written communication skills, including presenting complex technical/business topics to varied audiences. Confident technical leadership. Ability to work effectively with global, virtual teams, including use of tools and artifacts to ensure clear and efficient collaboration across time zones. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Strong problem-solving and analytical skills; ability to learn quickly and to retain and synthesize complex information from diverse sources.
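Since the must-have skills center on change-data-capture ETL, here is a hedged sketch of one common CDC pattern on Databricks: applying a batch of change records to a Delta table with MERGE. The table names, landing path, and the op-flag convention are invented for illustration.

```python
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

# Assumed CDC feed: JSON change records with an 'op' flag (I/U/D).
changes = spark.read.format("json").load("/landing/cdc/patients/")

target = DeltaTable.forName(spark, "silver.patients")  # hypothetical table
(target.alias("t")
 .merge(changes.alias("c"), "t.patient_id = c.patient_id")
 .whenMatchedDelete(condition="c.op = 'D'")               # apply deletes
 .whenMatchedUpdateAll(condition="c.op = 'U'")            # apply updates
 .whenNotMatchedInsertAll(condition="c.op IN ('I','U')")  # apply inserts
 .execute())
```

Real pipelines also deduplicate the change batch (keeping the latest record per key) before the merge; that step is omitted here for brevity.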

Posted 6 days ago


1.0 - 3.0 years

3 - 5 Lacs

Hyderabad

Work from Office

Source: Naukri

What you will do: In this vital role you will support an ambitious program to evolve how Amgen does forecasting, moving from batch processes (e.g., sales forecasting to COGS forecasting, clinical study forecasting) to a more continuous process (a toy refit-on-arrival illustration follows this listing). The hardworking professional we seek is curious by nature, organizationally and data savvy, with a strong record of finance transformation, partner management, and accomplishments in Finance, Accounting, or Procurement. This role will help redesign existing processes to incorporate Artificial Intelligence and Machine Learning capabilities, significantly reducing the time and resources needed to build forecasts. As the Next Gen Forecasting Senior Associate at Amgen India, you will drive innovation and continuous improvement in Finance's planning, reporting, and data processes, with a focus on maximizing current technologies and adopting new technologies where relevant. This individual will collaborate with cross-functional teams and support business objectives. This role reports directly to the Next Gen Forecasting Manager in Hyderabad, India.

Roles & Responsibilities: Priorities can often change in a fast-paced technology environment like Amgen's, so this role includes, but is not limited to, the following: Support implementation of real-time / continuous forecasting capabilities. Establish baseline analyses, defining current and future state using traditional approaches and emerging digital technologies. Identify which areas would benefit most from automation / AI / ML. Identify additional process / governance changes needed to move from batch to continuous forecasting. Closely partner with Business, Accounting, FP&A, Technology, and other impacted functions to define and implement proposed changes. Partner with the Amgen Technology function to support both existing and new finance platforms. Partner with local and global teams on use cases for Artificial Intelligence (AI), Machine Learning (ML), and Robotic Process Automation (RPA). Collaborate with cross-functional teams and Centers of Excellence globally to drive operational efficiency. Contribute to a learning environment and enhance learning methodologies of technical tools where applicable. Serve as the local financial systems and financial data subject matter expert, supporting the local team with questions. Support global finance teams and business partners with centrally delivered financial reporting via Tableau and other tools. Support local adoption of Anaplan for operating expense planning / tracking.

What we expect of you: We are all different, yet we all use our unique contributions to serve patients. Basic Qualifications: Master's degree and 1 to 3 years of finance experience, OR Bachelor's degree and 3 to 5 years of finance experience, OR Diploma and 7 to 9 years of finance experience. Consistent record of launching new finance capabilities. Proficiency in data analytics and business intelligence tools. Experience with finance reporting and planning system technologies. Experience with technical support of financial platforms. Knowledge of financial management and accounting principles. Experience with ERP systems. A resourceful individual who can connect the dots across a matrixed organization.

Preferred Qualifications: Experience in the pharmaceutical and/or biotechnology industry. Experience in financial planning, analysis, and reporting. Experience with global finance operations. Knowledge of advanced financial modeling techniques. Business performance management. Finance transformation experience involving recent technology advancements. Prior multinational capability center experience. Experience with Oracle Hyperion/EPM, S4/SAP, Anaplan, Tableau/Power BI, Databricks, Alteryx, data lakes, and data structures.

Soft Skills: Excellent project management abilities. Strong communication and interpersonal skills. High level of integrity and ethical standards. Problem-solving and critical thinking capabilities. Ability to influence and motivate change. Adaptability to a dynamic and fast-paced environment. Strong organizational and time management skills.
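To ground the batch-to-continuous idea above, here is a toy sketch: instead of rebuilding a forecast on a fixed cycle, refit whenever a new actual arrives. The linear trend model and the monthly figures are invented for illustration and are not Amgen's method.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def refresh_forecast(history, horizon=3):
    """Refit a simple trend model on all history and project `horizon` ahead."""
    X = np.arange(len(history)).reshape(-1, 1)
    model = LinearRegression().fit(X, np.array(history))
    future = np.arange(len(history), len(history) + horizon).reshape(-1, 1)
    return model.predict(future)

actuals = [102.0, 108.5, 115.2, 121.9]  # hypothetical monthly sales
print(refresh_forecast(actuals))         # rerun on every new actual, not quarterly

actuals.append(127.3)                    # a new month lands...
print(refresh_forecast(actuals))         # ...and the forecast refreshes immediately
```

The governance and partnering work the posting describes is what makes such a refit-on-arrival loop safe to run continuously; the model itself is the easy part.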

Posted 1 week ago


0.0 years

0 Lacs

Hyderabad / Secunderabad, Telangana, India

On-site

Source: Foundit

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose, the relentless pursuit of a world that works better for people, we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Senior Manager, Sourcing and Procurement.

Responsibilities: Project management, including developing project plans, processes, and timelines. Design, manage, and run customized reports for category managers, directors, and executives. Liaise with IT to ensure necessary data requirements and build knowledge of how data flows, where it is stored, and how to access it in our enterprise system. Assist in developing commercial evaluation criteria. Support supplier segmentation. Identify and implement standardization within processes. Develop and own critical Master Data Management inputs, including, but not limited to, BU tier mapping (DRM), sourcing taxonomy, and region/country/city/site mappings. Maintain rhythms with key stakeholders/customers as a means of checking in on data quality issues and pending action items. Partner with and influence external stakeholders who control the input data process; this includes building strong relationships with IT in GE businesses and at the corporate level. Conduct executive presentations for project updates as well as extended training on spend analytics tools, often to an audience of 100+ participants. Lead periodic perception alignments with sourcing leaders and with business and category teams alike, to ensure data accuracy exceeds 90% best-in-class benchmarks (Tier 1-3 levels) through validation of rule-based supplier classification (a minimal classification sketch follows this listing). Partner with the Data Science team to ensure implementation of MDM standards across platforms (Spend, Savings, Cash, Contracts, Preferred Supplier, and others to be implemented). Own and manage reporting and analytics tools built on top of the Finance Data Lake, including wing-to-wing management of enhancement implementation: Executive Dashboard (summary-level data visualizations on Tableau); Data Extraction Tool (Finance Data Store); Change of Classification Workflow (interactive workflow tool for user input on spend categorization, supplier normalization, and spend exclusion). Lead the project to integrate the Tableau spend dashboard and the raw data extraction tool (passing filters between the two tools). Remain agile to deliver on other ad hoc fire-drill data requests. Manage the project of developing new tools and visualizations aimed at proactive analytics and forecasting models.

Qualifications, Minimum: Excellent interpersonal communication skills. Excellent project management and execution. Analytical and problem-solving skills. Exposure to spend analytics and opportunity assessment. Good understanding of the Sourcing & Procurement domain. Good understanding of the technology landscape, including Data Science / AI / ML / Data Engineering concepts. Familiarity with big data and analytics architecture (e.g., data lakes). Executive presentation skills. People management experience.

Preferred: Good competency in any indirect sourcing category. Knowledge of the systems landscape in the S2P space. Effective at narrating the story from the data using compelling presentations. Communicates clearly and concisely, both orally and in writing. Establishes and maintains effective working relationships with those contacted in the course of work. Experience managing data for multiple Oracle, SAP, and other ERP platforms.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Get to know us at www.genpact.com and on Facebook, LinkedIn, and YouTube. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
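The "rule-based supplier classification" responsibility above can be pictured with a small sketch. The keyword rules and category names are invented; a real MDM implementation would keep rules in a governed store and track match provenance.

```python
import pandas as pd

RULES = [  # (keyword in supplier name, sourcing category) - illustrative only
    ("freight", "Logistics"),
    ("staffing", "Contingent Labor"),
    ("software", "IT - Software"),
]

def classify(supplier: str) -> str:
    name = supplier.lower()
    for keyword, category in RULES:
        if keyword in name:
            return category
    return "Unclassified"  # routed to manual review / perception alignment

spend = pd.DataFrame({"supplier": ["Acme Freight Co", "ZenSoftware Ltd", "Misc Traders"]})
spend["category"] = spend["supplier"].map(classify)
print(spend)
```

Measuring the share of spend left "Unclassified" against the 90% accuracy benchmark mentioned above is then a one-line aggregation.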

Posted 1 week ago


10.0 - 17.0 years

12 - 19 Lacs

Chennai, Bengaluru

Work from Office

Source: Naukri

Job Purpose: We are seeking an experienced ADF Technical Architect with 10 to 17 years of proven expertise in data lakes, lakehouse architectures, Synapse Analytics, Databricks, T-SQL, SQL Server, Synapse DB, and data warehousing. Prior experience as a technical architect, technical lead, Sr. Data Engineer, or similar is required, along with strong communication skills.

Key Responsibilities: Participate in data strategy and roadmap exercises, data architecture definition, business intelligence/data warehouse solution and platform selection, design and blueprinting, and implementation. Lead other team members and provide technical leadership in all phases of a project, from discovery and planning through implementation and delivery. Contribute to RFPs and RFQs. Work through all stages of a data solution life cycle: analyze and profile data; create conceptual, logical, and physical data model designs; architect and design ETL, reporting, and analytics solutions. Lead source-to-target mapping, define interface processes and standards, and implement those standards. Perform root cause analysis and develop data remediation solutions. Develop and implement proactive monitoring and alerting mechanisms for data issues. Collaborate with other workstream leads to keep overall development in sync. Identify risks and opportunities arising from potential logic and data issues within the data environment. Guide, influence, and mentor junior members of the team. Collaborate effectively with the onsite-offshore team and ensure day-to-day deliverables are met.

Qualifications & Key Skills Required: Bachelor's degree and 10+ years of experience in the data and analytics area. Demonstrated knowledge of modern data solutions such as Azure Data Fabric, Synapse Analytics, lakehouses, and data lakes. Strong source-to-target mapping experience and ETL principles/knowledge. Prior experience as a technical architect, technical lead, Sr. Data Engineer, or similar is required. Excellent verbal and written communication skills. Strong quantitative and analytical skills with accuracy and attention to detail. Ability to work well independently with minimal supervision and to manage multiple priorities. Proven experience with Azure, AWS, GCP, OCI, and other modern technology platforms is required.

Posted 1 week ago


7.0 - 9.0 years

22 - 35 Lacs

New Delhi, Gurugram, Greater Noida

Work from Office

Source: Naukri

Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 7 to 9 years of data engineering experience with strong hands-on delivery using ADF, SQL, Python, Databricks, and Spark. Experience designing data pipelines, warehouse models, and processing frameworks using Snowflake or Azure Synapse. Proficient with CI/CD tools (Azure DevOps, GitHub) and observability practices. Solid grasp of data governance, metadata tagging, and role-based access control. Proven ability to mentor and grow engineers in a matrixed or global environment. Strong verbal and written communication skills, with the ability to operate cross-functionally. Certifications in Azure, Databricks, or Snowflake are a plus.

Preferred Skills: Strong knowledge of data engineering concepts (data pipeline creation, data warehousing, data marts/cubes, data reconciliation and audit, data management). Working knowledge of DevOps processes (CI/CD), Git/Jenkins version control tools, Master Data Management (MDM), and data quality tools. Strong experience in ETL/ELT development, QA, and operations/support processes (RCA of production issues, code/data fix strategy, monitoring and maintenance). Hands-on experience with databases and tools such as Azure SQL DB, Snowflake, MySQL, Cosmos DB, Blob Storage, and Python/Unix shell scripting. ADF, Databricks, and Azure certifications are a plus.

Technologies We Use: Databricks, Azure SQL DW/Synapse, Snowflake, Azure Tabular, Azure Data Factory, Azure Functions, Azure Containers, Docker, DevOps, Python, PySpark, scripting (PowerShell, Bash), Git, Terraform, Power BI.

Responsibilities: Design, develop, and maintain scalable pipelines across ADF, Databricks, Snowflake, and related platforms. Lead the technical execution of non-domain-specific initiatives (e.g., reusable dimensions, TLOG standardization, enablement pipelines). Architect data models and reusable layers consumed by multiple downstream pods. Guide platform-wide patterns like parameterization, CI/CD pipelines, pipeline recovery, and auditability frameworks (a minimal audit-wrapper sketch follows this listing). Mentor and coach team members. Partner with product and platform leaders to ensure engineering consistency and delivery excellence. Act as the L3 escalation point for operational data issues impacting foundational pipelines. Own engineering best practices, sprint planning, and quality across the Enablement pod. Contribute to platform discussions and architectural decisions across regions.
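As a sketch of the auditability framework pattern mentioned in the responsibilities, here is one minimal approach: wrap each pipeline step so its outcome and duration are recorded. The names are illustrative; a production framework would write to an audit table and integrate with ADF/Databricks run metadata rather than printing.

```python
import time
from functools import wraps

def audited(step_name):
    """Decorator that logs success/failure and duration for a pipeline step."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.time()
            try:
                result = fn(*args, **kwargs)
                print(f"AUDIT ok   step={step_name} secs={time.time() - start:.1f}")
                return result
            except Exception as exc:
                print(f"AUDIT fail step={step_name} error={exc!r}")
                raise  # let the orchestrator's recovery logic take over
        return wrapper
    return decorator

@audited("standardize_tlog")  # hypothetical step name
def standardize_tlog():
    pass  # transformation logic would go here

standardize_tlog()
```

A shared wrapper like this gives every pod a consistent audit trail without each team re-inventing logging.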

Posted 2 weeks ago


6.0 - 10.0 years

8 - 12 Lacs

Mumbai

Work from Office

Source: Naukri

#JobOpening Data Engineer (Contract | 6 Months) Location: Hyderabad | Chennai | Remote flexibility possible Type: Contract | Duration: 6 Months

We are seeking an experienced Data Engineer to join our team for a 6-month contract assignment. The ideal candidate will work on data warehouse development, ETL pipelines, and analytics enablement using Snowflake, Azure Data Factory (ADF), dbt, and other tools. This role requires strong hands-on experience with data integration platforms, documentation, and pipeline optimization, especially in cloud environments such as Azure and AWS.

#KeyResponsibilities Build and maintain ETL pipelines using Fivetran, dbt, and Azure Data Factory. Monitor and support production ETL jobs. Develop and maintain data lineage documentation for all systems. Design data mapping and documentation to aid QA/UAT testing. Evaluate and recommend modern data integration tools. Optimize shared data workflows and batch schedules. Collaborate with Data Quality Analysts to ensure accuracy and integrity of data flows. Participate in performance tuning and improvement recommendations. Support BI/MDM initiatives, including Data Vault and data lakes.

#RequiredSkills 7+ years of experience in data engineering roles. Strong command of SQL, with 5+ years of hands-on development. Deep experience with Snowflake, Azure Data Factory, and dbt. Strong background with ETL tools (Informatica, Talend, ADF, dbt, etc.). Bachelor's in CS, Engineering, Math, or a related field. Experience in the healthcare domain (working with PHI/PII data). Familiarity with scripting/programming (Python, Perl, Java, Linux-based environments). Excellent communication and documentation skills. Experience with BI tools like Power BI, Cognos, etc. Organized self-starter with strong time-management and critical thinking abilities.

#NiceToHave Experience with Data Lakes and Data Vaults. QA & UAT alignment with clear development documentation. Multi-cloud experience (especially Azure, AWS).

#ContractDetails Role: Data Engineer. Contract Duration: 6 Months. Location options: Hyderabad / Chennai (remote flexibility available).

Posted 2 weeks ago


10.0 - 14.0 years

10 - 16 Lacs

Pune

Work from Office

Source: Naukri

Role Overview: The Senior Tech Lead - GCP Data Engineering leads the design, development, and optimization of advanced data solutions. The jobholder has extensive experience with GCP services, data architecture, and team leadership, with a proven ability to deliver scalable and secure data systems.

Responsibilities: Lead the design and implementation of GCP-based data architectures and pipelines. Architect and optimize data solutions using GCP services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage. Provide technical leadership and mentorship to a team of data engineers. Collaborate with stakeholders to define project requirements and ensure alignment with business goals. Ensure best practices in data security, governance, and compliance. Troubleshoot and resolve complex technical issues in GCP data environments. Stay updated on the latest GCP technologies and industry trends.

Key Technical Skills & Responsibilities: Overall 10+ years of experience with GCP and data warehousing concepts, including coding, reviewing, testing, and debugging. Experience as an architect on GCP implementation or migration data projects. Must have an understanding of data lakes and data lake architectures, along with best practices for storing data and for loading and retrieving data from data lakes. Experience developing and maintaining pipelines on the GCP platform, with an understanding of best practices for bringing on-premises data to the cloud: file loading, compression, parallelization of loads, optimization, etc. Working knowledge of and/or experience with Google Data Studio, Looker, and other visualization tools. Working knowledge of Hadoop and Python/Java would be an added advantage. Experience in designing and planning BI solutions, debugging, monitoring and troubleshooting BI solutions, creating and deploying reports, and writing relational and multidimensional database queries. Any experience in a NoSQL environment is a plus. Must be good with Python and PySpark for data pipeline building. Must have experience working with streaming data sources and Kafka (a minimal streaming sketch follows this listing). GCP services: Cloud Storage, BigQuery, Bigtable, Cloud Spanner, Cloud SQL, Datastore/Firestore, Dataflow, Dataproc, Data Fusion, Dataprep, Pub/Sub, Data Studio, Looker, Data Catalog, Cloud Composer, Cloud Scheduler, Cloud Functions.

Eligibility Criteria: Bachelor's degree in Computer Science, Data Engineering, or a related field. Extensive experience with GCP data services and tools. GCP certification (e.g., Professional Data Engineer, Professional Cloud Architect). Experience with machine learning and AI integration in GCP environments. Strong understanding of data modeling, ETL/ELT processes, and cloud integration. Proven leadership experience in managing technical teams. Excellent problem-solving and communication skills.
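For the Kafka requirement above, here is a hedged sketch of PySpark Structured Streaming landing a topic in Cloud Storage. The broker address, topic, and bucket are placeholders, and running on Dataproc is assumed so the gs:// connector is available.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("kafka-to-gcs").getOrCreate()

# Read the (assumed) 'orders' topic as a stream of string payloads.
stream = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")  # placeholder
          .option("subscribe", "orders")
          .load()
          .selectExpr("CAST(value AS STRING) AS payload"))

# Land raw events in the lake; the checkpoint lets the job restart
# without re-emitting already-committed files.
(stream.writeStream
 .format("parquet")
 .option("path", "gs://example-lake/raw/orders/")
 .option("checkpointLocation", "gs://example-lake/_chk/orders/")
 .start()
 .awaitTermination())
```

From the raw zone, a batch job or a BigQuery external table can pick the data up for warehousing.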

Posted 2 weeks ago


7.0 - 11.0 years

9 - 13 Lacs

Mumbai

Work from Office

Source: Naukri

Type: Contract | Duration: 6 Months

We are seeking an experienced Data Engineer to join our team for a 6-month contract assignment. The ideal candidate will work on data warehouse development, ETL pipelines, and analytics enablement using Snowflake, Azure Data Factory (ADF), dbt, and other tools. This role requires strong hands-on experience with data integration platforms, documentation, and pipeline optimization, especially in cloud environments such as Azure and AWS.

Key Responsibilities: Build and maintain ETL pipelines using Fivetran, dbt, and Azure Data Factory. Monitor and support production ETL jobs. Develop and maintain data lineage documentation for all systems. Design data mapping and documentation to aid QA/UAT testing. Evaluate and recommend modern data integration tools. Optimize shared data workflows and batch schedules. Collaborate with Data Quality Analysts to ensure accuracy and integrity of data flows. Participate in performance tuning and improvement recommendations. Support BI/MDM initiatives, including Data Vault and data lakes.

Required Skills: 7+ years of experience in data engineering roles. Strong command of SQL, with 5+ years of hands-on development. Deep experience with Snowflake, Azure Data Factory, and dbt. Strong background with ETL tools (Informatica, Talend, ADF, dbt, etc.). Bachelor's in CS, Engineering, Math, or a related field. Experience in the healthcare domain (working with PHI/PII data). Familiarity with scripting/programming (Python, Perl, Java, Linux-based environments). Excellent communication and documentation skills. Experience with BI tools like Power BI, Cognos, etc. Organized self-starter with strong time-management and critical thinking abilities.

Nice to Have: Experience with Data Lakes and Data Vaults. QA & UAT alignment with clear development documentation. Multi-cloud experience (especially Azure, AWS).

Posted 3 weeks ago


8 - 13 years

12 - 22 Lacs

Hyderabad, Bengaluru, Mumbai (All Areas)

Work from Office

Source: Naukri

Greetings of the day! We have an urgent on-rolls opening for the position of "Snowflake Architect" at one of our reputed clients, working from home.

Name of the Company: Confidential. Rolls: Onrolls. Mode of Employment: FTE / Sub-Con / Contract. Job Location: Remote. Job Work Timings: Night shift, 06.00 pm to 03.00 am IST. Nature of Work: Work from Home. Working Days: 5 days weekly. Educational Qualification: Bachelor's degree in computer science, BCA, engineering, or a related field. Salary: Maximum CTC would be 23 LPA (salary and benefits package will be commensurate with experience and qualifications; PF and medical insurance cover available). Languages Known: English, Hindi, and the local language. Experience: 9+ years of relevant experience in the same domain.

Job Summary: We are seeking a highly skilled and experienced Snowflake Architect to lead the design, development, and implementation of scalable, secure, and high-performance data warehousing solutions on the Snowflake platform. The ideal candidate will possess deep expertise in data modelling, cloud architecture, and modern ELT frameworks. You will be responsible for architecting robust data pipelines, optimizing query performance, and ensuring enterprise-grade data governance and security. In this role, you will collaborate with data engineers, analysts, and business stakeholders to deliver efficient data solutions that drive informed decision-making across the organization.

Key Responsibilities: Manage and maintain the Snowflake platform to ensure optimal performance and reliability. Collaborate with data engineers and analysts to design and implement data pipelines. Develop and optimize SQL queries for efficient data retrieval and manipulation. Create custom scripts and functions using JavaScript and Python to automate platform tasks (a connector-based sketch follows this listing). Troubleshoot platform issues and provide timely resolutions. Implement security best practices to protect data within the Snowflake platform. Stay updated on the latest Snowflake features and best practices to continuously improve platform performance.

Required Qualifications: Bachelor's degree in computer science, engineering, or a related field. Minimum of nine years of experience managing any database platform. Proficiency in SQL for data querying and manipulation. Strong programming skills in JavaScript and Python. Experience in optimizing and tuning Snowflake for performance.

Preferred Skills: Technical expertise; cloud and integration; performance and optimization; security and governance; soft skills.

Candidates should be willing to join within 07-10 days or be immediate joiners. Interested candidates, please share your updated resume with us at executivehr@monalisammllp.com; you can also call or WhatsApp us at 9029895581. Please include: current/last net in hand (salary will be offered based on the interview / technical evaluation process); notice period and LWD (was/will be); reason for changing the job; total years of experience in the specific field; the location you are from; and whether you hold any offer from any other association.

Regards, Monalisa Group of Services, HR Department. 9029895581 (call / WhatsApp), executivehr@monalisammllp.com
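To illustrate the Python platform automation the responsibilities mention, here is a hedged sketch using the Snowflake Python connector. The account, user, and warehouse names are placeholders, and the resize rule is invented for illustration, not a recommended policy.

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345",        # placeholder account locator
    user="SVC_PLATFORM_OPS",  # placeholder service user
    password="***",           # use key-pair auth / a secrets manager in practice
    warehouse="REPORTING_WH",
)
cur = conn.cursor()

# Check how many queries are currently queued on the warehouse.
cur.execute("""
    SELECT COUNT(*)
    FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY())
    WHERE warehouse_name = 'REPORTING_WH'
      AND execution_status = 'QUEUED'
""")
queued = cur.fetchone()[0]

# Toy policy: scale up when the queue backs up.
if queued > 10:
    cur.execute("ALTER WAREHOUSE REPORTING_WH SET WAREHOUSE_SIZE = 'LARGE'")
conn.close()
```

The same connector pattern covers most routine platform tasks: health checks, grants audits, and scheduled maintenance statements.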

Posted 1 month ago


5 - 8 years

16 - 18 Lacs

Pune

Work from Office

Source: Naukri

Oracle PL/SQL Developer: The Associate shall perform the role of PL/SQL Developer and shall be responsible for the following: Hands-on coding in SQL and PL/SQL. Create and implement database architecture for new applications and enhancements to existing applications. Hands-on experience in data modeling, SSAS, cubes, and query optimization. Create and implement strategies for partitioning, archiving, and maturity models for applications. Review queries created by other developers for adherence to standards and for performance issues. Key skills: PL/SQL, T-SQL, SQL query optimization, data models, data lakes. Interact with database and application analysts and business users for estimations. Perform impact analysis of existing applications and suggest the best ways of incorporating new requirements. Proactively engage in the remediation of software issues related to code quality, security, and/or patterns/frameworks.

Posted 1 month ago


10 - 14 years

25 - 30 Lacs

Mumbai

Work from Office

Source: Naukri

Overview of the Role: We have an outstanding opportunity for an expert AI and Data Cloud Solutions Engineer to work with our trailblazing customers in crafting ground-breaking customer engagement roadmaps demonstrating the Salesforce applications and platform across the machine learning and LLM/GPT domains in India! The successful applicant will have a track record of driving business outcomes through technology solutions, with experience engaging at the C-level with business and technology groups.

Responsibilities: Primary pre-sales technical authority for all aspects of AI usage within the Salesforce product portfolio, covering existing Einstein ML-based capabilities and new (2023) generative AI. The majority of time (60%+) will be customer/external facing. Evangelisation of Salesforce AI capabilities. Assessing customer requirements and use cases and aligning them to these capabilities. Solution proposals, working with Architects and the wider Solution Engineer (SE) teams. Building reference models, ideas, and approaches for the inclusion of GPT-based products within wider Salesforce solution architectures, especially involving Data Cloud. Alignment with customer security and privacy teams on the trust capabilities and values of our solution(s). Presenting at multiple customer events, from single-account sessions through to major strategic events (World Tour, Dreamforce). Representing Salesforce at other events (subject to PM approval). Sales and SE organisation education and enablement, e.g. roadmap, across all roles and all product areas. Bridge and primary contact point to product management. Provide thought leadership on how large enterprise organisations can drive customer success through digital transformation. Ability to uncover the challenges and issues a business is facing by running successful and targeted discovery sessions and workshops. Be an innovator who can build new solutions using out-of-the-box thinking. Demonstrate the business value of our AI solutions using solution presentations, demonstrations, and prototypes. Build roadmaps that clearly articulate how partners can implement and accept solutions to move from current to future state. Deliver functional and technical responses to RFPs/RFIs. Work as an excellent teammate by chipping in, learning, and sharing new knowledge. Demonstrate a conceptual knowledge of how to integrate cloud applications with existing business applications and technology. Lead multiple customer engagements concurrently. Be self-motivated, flexible, and take initiative.

Required Qualifications: Experience will be evaluated based on the core proficiencies of the role. 4+ years working directly in the commercial technology space with AI products and solutions. Data knowledge: data science, data lakes and warehouses, ETL, ELT, and data quality. AI knowledge: application of algorithms and models to solve business problems (ML, LLMs, GPT). 10+ years working in a sales, pre-sales, consulting, or related function in a commercial software company. Strong focus and experience in pre-sales or implementation is required. Experience in demonstrating customer engagement solutions, understanding and driving use cases and customer journeys, and the ability to draw a "day in the life of" across different LOBs. Business analysis / business case / return-on-investment construction. Demonstrable experience in presenting and communicating complex concepts to large audiences. A broad understanding of, and the ability to articulate, the benefits of CRM, Sales, Service, and Marketing Cloud offerings. Strong verbal and written communication skills with a focus on needs analysis, positioning, business justification, and closing techniques. A continuous-learning demeanor with a demonstrated history of self-enablement and advancement in both technology and behavioural areas.

Preferred Qualifications: Expertise in an AI-related subject (ML, deep learning, NLP, etc.). Familiarity with technologies such as OpenAI, Google Vertex AI, Amazon SageMaker, Snowflake, Databricks, etc.

Posted 2 months ago


1 - 6 years

3 - 8 Lacs

Hyderabad

Work from Office

Source: Naukri

The R&D Data Catalyst Team is responsible for building Data Searching, Cohort Building, and Knowledge Management tools that give the Amgen scientific community visibility into Amgen's wealth of human datasets, projects and study histories, and knowledge of various scientific findings. These solutions are pivotal tools in Amgen's goal to accelerate the speed of discovery and the speed to market of advanced precision medications. The Data Engineer will be responsible for the end-to-end development of an enterprise analytics and data mastering solution leveraging Databricks and Power BI. This role requires expertise in both data architecture and analytics, with the ability to create scalable, reliable, and high-performing enterprise solutions that support research cohort-building and advanced research pipelines. The ideal candidate will have experience creating and surfacing large unified repositories of human data, built from integrations across multiple repositories and solutions, and be exceptionally skilled in data analysis and profiling. You will collaborate closely with stakeholders, product team members, and related IT teams to design and implement data models, integrate data from various sources, and ensure best practices for data governance and security. A strong background in data warehousing, ETL, Databricks, Power BI, and enterprise data mastering is essential.

Roles & Responsibilities: Design and build scalable enterprise analytics solutions using Databricks, Power BI, and other modern data tools. Leverage data virtualization, ETL, and semantic layers to balance the need for unification, performance, and data transformation against the goal of reducing data proliferation. Break down features into work that aligns with the architectural runway. Participate hands-on in pilots and proofs-of-concept for new patterns. Create robust documentation covering data analysis and profiling as well as proposed designs and data logic. Develop advanced SQL queries to profile and unify data. Develop data processing code in SQL, along with semantic views to prepare data for reporting. Develop Power BI models and reporting packages. Design robust data models and processing layers that support both analytical processing and operational reporting needs. Design and develop solutions based on best practices for data governance, security, and compliance within Databricks and Power BI environments. Ensure the integration of data systems with other enterprise applications, creating seamless data flows across platforms. Develop and maintain Power BI solutions, ensuring data models and reports are optimized for performance and scalability. Collaborate with stakeholders to define data requirements, functional specifications, and project goals. Continuously evaluate and adopt new technologies and methodologies to enhance the architecture and performance of data solutions.

Basic Qualifications and Experience: Master's degree with 1 to 3 years of experience in Data Engineering, OR Bachelor's degree with 4 to 5 years of experience in Data Engineering, OR Diploma with 7 to 9 years of experience in Data Engineering.

Functional Skills, Must-Have: Minimum of 3 years of hands-on experience with BI solutions (preferably Power BI or Business Objects), including report development, dashboard creation, and optimization. Minimum of 3 years of hands-on experience building change-data-capture (CDC) ETL pipelines, designing and building data warehouses, and enterprise-level data management. Hands-on experience with Databricks, including data engineering, optimization, and analytics workloads. Deep understanding of Power BI, including model design, DAX, and Power Query. Proven experience designing and implementing data mastering solutions and data governance frameworks. Expertise in cloud platforms (AWS), data lakes, and data warehouses. Strong knowledge of ETL processes, data pipelines, and integration technologies. Strong communication and collaboration skills to work with cross-functional teams and senior leadership. Ability to assess business needs and design solutions that align with organizational goals. Exceptional hands-on capabilities in data profiling, data transformation, and data mastering. Success in mentoring and training team members.

Good-to-Have Skills: Experience in developing differentiated and deliverable solutions. Experience with human data, ideally human healthcare data. Familiarity with laboratory testing, patient data from clinical care, HL7, FHIR, and/or clinical trial data management.

Professional Certifications (preferred or mandatory as noted): ITIL Foundation or other relevant certifications (preferred). SAFe Agile Practitioner (6.0). Microsoft Certified: Data Analyst Associate (Power BI) or related certification. Databricks Certified Professional or similar certification.

Soft Skills: Excellent analytical and troubleshooting skills. Deep intellectual curiosity. Highest degree of initiative and self-motivation. Strong verbal and written communication skills, including presenting complex technical/business topics to varied audiences. Confident technical leadership. Ability to work effectively with global, virtual teams, including leveraging tools and artifacts to ensure clear and efficient collaboration across time zones. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Strong problem-solving and analytical skills; ability to learn quickly and to retain and synthesize complex information from diverse sources.

Posted 2 months ago


4 - 6 years

7 - 17 Lacs

Bengaluru

Work from Office

Source: Naukri

About this role: Wells Fargo is seeking a Senior Data Science Consultant. As Senior Data Science Consultant, you will be responsible for working on projects with opportunities to improve the customer experience using advanced analytics and data solution engineering. The data science team supports automated control and process optimization/streamlining by developing advanced analytical solutions targeted at minimizing compliance and operational risk across multiple lines of business across the bank. More specifically, you will support data exploration and population design, and automate data-driven reviews using advanced automation techniques in SAS/Python, text mining, and AI/ML (a toy text-classification sketch follows this listing). The selected candidate is expected to design analytical solutions, generate meaningful business insight, and communicate highly complex concepts to business stakeholders in layman's terms.

In this role, you will: Work as a technical expert in delivering high-quality analytical solutions and provide effective business insights. Research, design, and develop end-to-end advanced analytical solutions using data solution engineering, ETL design, and applied text mining and NLP. Streamline the ETL/data flow structure feeding different analytical solutions through automation. Clearly understand and articulate business requirements by leveraging domain understanding of the line of business and product/function, and deliver results underlining the business problem and appropriate business decision levers. Identify and leverage the appropriate analytical approach from a wide toolkit to make data-driven recommendations.

Required Qualifications: 4+ years of data science experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education. Master's degree or higher in a quantitative discipline such as mathematics, statistics, engineering, physics, economics, or computer science.

Desired Qualifications: 4+ years of hands-on work experience in advanced analytics / data science, with a minimum of 2 years' mandatory experience in Risk and Control/Compliance in the banking domain. Engineering graduate / post-graduate in Maths/Stats/Economics/Computer Science. Strong expertise in Python, SAS/SQL, and text mining/NLP. Must have exposure to unstructured data such as contact center technology data (IVR, telephony, text, chat, etc.) along with transactional data. Exposure to SAS Viya and data lakes / Azure / big data platforms would be a plus. Sound knowledge of project documentation frameworks. Must have consultative skills, with the ability to rationalize business needs and solution design from business requirements. Strong written and verbal communication, presentation, and interpersonal skills. Ability to perform analysis, build hypotheses, draw conclusions, and communicate clear, actionable recommendations to business leaders and partners. Ability to interact with integrity and a high level of professionalism with all levels of team members and management.
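To make the text-mining element concrete, here is a toy sketch of classifying contact-center notes for risk review. The notes, labels, and model choice are invented for illustration and do not reflect Wells Fargo's methods.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

notes = ["customer disputes fee charged twice",
         "routine balance inquiry resolved",
         "complaint about misleading product terms",
         "password reset completed"]
labels = [1, 0, 1, 0]  # 1 = potential compliance risk (invented labels)

# TF-IDF features feeding a linear classifier: a common first baseline
# for flagging records that need manual review.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(notes, labels)

print(model.predict(["fee dispute escalated by customer"]))  # expect a risk flag
```

A production version would train on far more data, calibrate thresholds against review capacity, and document the population design, which is exactly the consulting work the role describes.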

Posted 2 months ago

Apply

8 - 10 years

14 - 19 Lacs

Bengaluru

Work from Office

Naukri logo

Job Summary: As a member of Solutions Integration Engineering, you work cross-functionally to define and create engineered solutions/products that accelerate field adoption, and you work closely with ISVs and the startup ecosystem in the virtualization, cloud, AI/ML, and Gen AI domains to build solutions that matter for customers. You will work closely with the product owner and product lead on the company's current and future strategies in these domains.
Job Requirements:
Lead the delivery of features, participating in the full software development lifecycle.
Deliver reliable, innovative solutions and products.
Participate in product design, development, verification, troubleshooting, and delivery of a system or major subsystems, including authoring project specifications.
Write unit and automated integration tests and project documentation.
Technical Skills:
Understanding of the software development lifecycle.
Strong proficiency in full stack development: MERN stack, Python, the container ecosystem, cloud, and modern ML frameworks.
Knowledge of data storage and virtualization, including hypervisors such as VMware ESX and Linux KVM, and of artificial intelligence concepts including server/storage architecture, batch/stream processing, data warehousing, data lakes, distributed filesystems, OLTP/OLAP databases and data pipelining tools, model training, and inferencing, as well as RAG workflows (a toy retrieval sketch follows this listing).
Knowledge of Unix-based operating system kernels and development environments, e.g. Linux or FreeBSD.
A strong understanding of basic to complex concepts related to computer architecture, data structures, and new programming paradigms.
Education:
A minimum of 8+ years of software development experience.
A Bachelor of Science degree in Electrical Engineering or Computer Science, a Master's degree, a PhD, or equivalent experience is required.
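
The retrieval step at the heart of a RAG workflow reduces to nearest-neighbor search over document vectors. The sketch below is a deliberately simplified stand-in: a bag-of-words vector plays the role of a real embedding model, and the vocabulary and documents are invented for illustration:

```python
# Toy retrieval step of a RAG workflow. A real system would use a trained
# embedding model and a vector store; unit-norm word-count vectors stand in.
import numpy as np

VOCAB = ["snapshot", "replication", "kubernetes", "inference", "storage", "gpu"]

def embed(text: str) -> np.ndarray:
    words = text.lower().split()
    vec = np.array([words.count(w) for w in VOCAB], dtype=float)
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

docs = [
    "snapshot and replication policies for storage arrays",
    "kubernetes gpu scheduling for inference workloads",
]
doc_vecs = np.stack([embed(d) for d in docs])

query = embed("how do i schedule gpu inference on kubernetes")
scores = doc_vecs @ query            # cosine similarity (vectors are unit-norm)
print(docs[int(np.argmax(scores))])  # retrieves the kubernetes document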

Posted 2 months ago

Apply

10 - 15 years

35 - 40 Lacs

Hyderabad

Work from Office

Naukri logo

An EDAP Solution Architect plays a crucial role in designing and implementing cloud solutions based on Microsoft Azure. The role blends technical expertise in cloud infrastructure, architectural design, and the leadership to guide teams and stakeholders through Azure deployments. Here is a detailed job description covering both technical and non-technical skills.
Responsibilities:
Design, develop, and implement Azure-based cloud solutions tailored to business requirements.
Participate in architectural discussions and design solutions for complex cloud applications, ensuring optimal performance, scalability, and security.
Oversee the migration of legacy systems to Azure, ensuring minimal downtime and seamless integration.
Create architecture blueprints and documentation for stakeholders and development teams.
Collaborate with development, operations, and security teams to ensure solutions meet non-functional requirements such as security, scalability, and performance.
Ensure governance by implementing best practices and compliance policies, especially in industries with stringent regulatory requirements.
Provide expert-level guidance on the adoption of Azure services such as IaaS, PaaS, serverless computing, containers, and microservices architecture.
Engage with clients and internal stakeholders to identify business challenges and develop appropriate cloud solutions.
Conduct performance tuning and optimization of Azure resources.
Continuously evaluate new Azure services and stay updated with platform enhancements.
Participate in R&D exploring emerging cloud technologies to enhance architectural practices.
Technical Skills:
Expertise in Azure IaaS (Virtual Machines, Storage, Networking).
Proficiency in PaaS (App Services, Azure SQL Database, Azure Kubernetes Service (AKS), and Azure Functions).
Experience with serverless architecture and microservices design.
Strong understanding of Azure DevOps, CI/CD pipelines, and Infrastructure as Code (IaC) using tools like ARM templates and Terraform.
Strong experience in designing cloud-native applications and implementing multi-tier, distributed cloud applications.
Familiarity with cloud design patterns (event-driven, microservices, etc.).
Knowledge of API management, load balancing, and traffic distribution in Azure.
Understanding of Azure security best practices, including identity management (Azure AD), encryption, security policies, and role-based access control (RBAC).
Experience in designing secure cloud architectures for highly sensitive data environments.
Knowledge of Azure Virtual Network, VPNs, ExpressRoute, DNS, Azure Firewall, and Application Gateway.
Proficiency in designing network architectures that support high availability, disaster recovery, and hybrid cloud models.
Experience with Azure SQL, Cosmos DB, data lakes, Azure Databricks, and Azure Storage.
Knowledge of data replication, disaster recovery, and backup solutions in the cloud.
Understanding of real-time data processing using Azure services like Event Hubs, Azure Data Factory, and Stream Analytics.
Hands-on experience with PowerShell, Azure CLI, and automation frameworks; ability to script and automate cloud infrastructure tasks (a minimal automation sketch follows this listing).
Expertise in using Azure Monitor, Application Insights, and Log Analytics for performance monitoring and troubleshooting.
Knowledge of APM (Application Performance Management) tools and practices.
Proficiency in Docker, Kubernetes (AKS), and container orchestration in Azure.
Experience with deploying containerized applications using Azure Container Registry (ACR).
Familiarity with other cloud platforms (AWS, GCP) is a plus.
Non-Technical Skills:
Fluency in English and Chinese for effective communication with team members and business teams.
Strong verbal and written communication skills, with the ability to articulate technical concepts to non-technical stakeholders.
Ability to create and present detailed architectural designs, reports, and recommendations to clients and management.
Proven experience leading cloud migration and implementation projects, guiding cross-functional teams in adopting cloud services.
Ability to mentor and provide technical guidance to engineers and developers.
Collaborative mindset, working closely with DevOps, security, and infrastructure teams.
Ability to understand business requirements and translate them into technical solutions.
Forward-thinking, with a focus on scalability, cost-effectiveness, and long-term sustainability of cloud solutions.
Strong analytical skills, with the ability to troubleshoot complex architectural and platform-related issues.
Ability to innovate and think creatively when faced with challenges in cloud solution design.
Ability to engage with senior management, IT leaders, and business stakeholders to align technical solutions with organizational goals.
Adept at gathering requirements, managing expectations, and balancing business priorities with technical feasibility.
Qualifications:
Bachelor's degree in Computer Science, Engineering, or a related field; Master's degree is a plus.
10+ years of experience with data and analytics platforms.
7+ years of experience designing and deploying cloud solutions, with at least 5 years on the Azure platform.
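
As a hedged illustration of the scripting-and-automation requirement, the following minimal Python sketch shells out to the Azure CLI to flag VMs missing an "owner" tag. The tag name is an assumption for illustration, and a real implementation might use the Azure SDK instead:

```python
# Hypothetical automation sketch: list VMs missing a required "owner" tag
# via the Azure CLI. Assumes `az login` has already been run.
import json
import subprocess

result = subprocess.run(
    ["az", "vm", "list", "--output", "json"],
    capture_output=True, text=True, check=True,
)
vms = json.loads(result.stdout)

untagged = [vm["name"] for vm in vms if not (vm.get("tags") or {}).get("owner")]
print("VMs missing an 'owner' tag:", untagged)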

Posted 2 months ago

Apply

3 - 6 years

8 - 12 Lacs

Kolhapur

Work from Office

Naukri logo

Job Description:
- Ensure security standards are followed for all structured and unstructured data platforms (e.g., Azure/AWS Blob Storage, data lakes, data warehouses, etc.)
- Ensure security standards are followed and implemented for all data pipeline, Data Science, and BI projects conducted by the team
- Identify and drive implementation of database protection tools to detect and prevent unauthorized access to Worley's data platforms.
- Design, develop, test, customize, and troubleshoot database security systems and solutions, such as Database Activity Monitoring, Data Obfuscation, and data segregation/segmentation (a data-obfuscation sketch follows this listing).
- Outline access, encryption, and logging requirements in data stores and work with data solutions and data delivery teams to implement them.
- Build systemic, technical controls for data discovery, classification, and tagging of sensitive information in structured and unstructured data stores.
- Provide security expertise and consulting to data solution and delivery teams.
- Work alongside the Worley Security team to help remediate security events/incidents.
- Collaborate with the Worley Security team to ensure successful completion of our roadmaps and initiatives.
- Integrate security testing and controls into different phases of Data Delivery development lifecycles.
Requirements:
- Experience working in cloud data platforms
- Experience in Information Security
- Experience in database administration and database management
- Experience in cloud technology built on Azure/AWS and/or Snowflake
- Knowledge of data architecture and database technologies
- Experience with data science and machine learning anomaly detection
- Experience working with vendors and developing security requirements and recommendations based on evaluation of technology.
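
One common data-obfuscation technique the listing alludes to is deterministic pseudonymization: replace an identifier with a keyed hash so the raw value is hidden but records can still be joined. A minimal sketch, with a hypothetical key and field names:

```python
# Minimal data-obfuscation sketch: deterministic pseudonymization of an
# identifier column with a keyed hash (HMAC-SHA256).
import hashlib
import hmac

SECRET_KEY = b"rotate-me"  # in practice, pulled from a secrets manager

def pseudonymize(value: str) -> str:
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

records = [{"employee_id": "E1001", "dept": "ENG"},
           {"employee_id": "E1002", "dept": "FIN"}]

masked = [{**r, "employee_id": pseudonymize(r["employee_id"])} for r in records]
print(masked)  # the same input always maps to the same token, enabling joins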

Posted 2 months ago

Apply

3 - 6 years

8 - 12 Lacs

Agra

Work from Office

Naukri logo

Job Description:
- Ensure security standards are followed for all structured and unstructured data platforms (e.g., Azure/AWS Blob Storage, data lakes, data warehouses, etc.)
- Ensure security standards are followed and implemented for all data pipeline, Data Science, and BI projects conducted by the team
- Identify and drive implementation of database protection tools to detect and prevent unauthorized access to Worley's data platforms.
- Design, develop, test, customize, and troubleshoot database security systems and solutions, such as Database Activity Monitoring, Data Obfuscation, and data segregation/segmentation.
- Outline access, encryption, and logging requirements in data stores and work with data solutions and data delivery teams to implement them.
- Build systemic, technical controls for data discovery, classification, and tagging of sensitive information in structured and unstructured data stores.
- Provide security expertise and consulting to data solution and delivery teams.
- Work alongside the Worley Security team to help remediate security events/incidents.
- Collaborate with the Worley Security team to ensure successful completion of our roadmaps and initiatives.
- Integrate security testing and controls into different phases of Data Delivery development lifecycles.
Requirements:
- Experience working in cloud data platforms
- Experience in Information Security
- Experience in database administration and database management
- Experience in cloud technology built on Azure/AWS and/or Snowflake
- Knowledge of data architecture and database technologies
- Experience with data science and machine learning anomaly detection
- Experience working with vendors and developing security requirements and recommendations based on evaluation of technology.

Posted 2 months ago

Apply

4 - 5 years

9 - 12 Lacs

Ernakulam, Kochi

Work from Office

Naukri logo

TNP is looking for an extraordinary Data Engineer who loves to push boundaries to solve complex business problems using creative solutions. As a Data Engineer, you will work in the Technology team that helps deliver our Data Engineering offerings on a large scale to clients worldwide.
Role Responsibilities:
Design, develop, and maintain scalable data pipelines and architectures for batch and real-time processing.
Build and optimize data integration workflows, ETL/ELT processes, and data transformation pipelines (a minimal ETL sketch follows this listing).
Implement data modeling, schema design, and data governance strategies to ensure data quality and consistency.
Work with relational and NoSQL databases, data lakes, and distributed systems to manage and store structured and unstructured data.
Develop, test, and deploy custom data solutions using programming languages such as Python and SQL.
Collaborate with cross-functional teams to identify data requirements and deliver solutions that meet business needs.
Monitor data pipelines for performance, reliability, and scalability, and troubleshoot issues as they arise.
Ensure data security and compliance with company policies and industry standards.
Document processes, tools, and systems for knowledge sharing and scalability.
Must-Have Skills:
Expertise in SQL and relational database systems (e.g., PostgreSQL, MySQL, Oracle).
Proficiency in programming languages like Python, Java, or Scala.
Hands-on experience with ETL tools.
Experience with Big Data frameworks such as Apache Spark, Hadoop, or Kafka.
Knowledge of cloud platforms (AWS, Azure, GCP) and tools like Redshift, Snowflake, or BigQuery.
Proficiency in working with data lakes, data warehouses, and real-time streaming architectures.
Familiarity with version control systems (e.g., Git) and CI/CD pipelines.
Strong problem-solving, analytical, and communication skills.
Good to Have:
Experience with data visualization tools (e.g., Tableau, Power BI)
Knowledge of machine learning pipelines and collaboration with Data Scientists.
Exposure to containerization technologies like Docker and orchestration tools like Kubernetes.
Understanding of DevOps practices and Infrastructure as Code (IaC) tools such as Terraform.
Certifications in cloud platforms (AWS, Azure, GCP) or data engineering tools.
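
The ETL/ELT work above follows a consistent extract-transform-load shape. A tiny end-to-end sketch using only the standard library, with invented source data and schema:

```python
# Tiny ETL sketch (extract -> transform -> load); data and schema are made up.
import sqlite3

raw_orders = [("A-1", "  Widget ", "19.99"), ("A-2", "GADGET", "5.00")]  # extract

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id TEXT PRIMARY KEY, product TEXT, amount REAL)")

cleaned = [(oid, name.strip().lower(), float(amount))   # transform: standardize
           for oid, name, amount in raw_orders]

conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", cleaned)  # load
conn.commit()

for row in conn.execute("SELECT * FROM orders"):
    print(row)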

Posted 2 months ago

Apply

10 - 12 years

35 - 37 Lacs

Pune

Work from Office

Naukri logo

Company Description: Extentia, a Merkle Company, is a global technology and services firm that helps clients transform and realize their digital strategies. With a unique Experience Centric Transformation approach, Extentia's ground-breaking solutions are in the space of mobile, cloud, and design. The team is differentiated by an emphasis on excellent design skills that they bring to every project. Focused on enterprise mobility, cloud computing, and user experiences, Extentia strives to accomplish and surpass its customers' business goals. The company's inclusive work environment and culture inspire team members to be innovative and creative and to provide clients with an exceptional partnership experience.
Job Details:
Position: Data Architect
Experience: 10-12 years
Work Mode: Onsite
Location: Pune
Notice Period: Immediate
Budget: 38 LPA
Job Responsibilities:
The ideal profile has a strong foundation in data concepts, design, and strategy, with the ability to work across diverse technologies in a technology-agnostic manner.
Transactional Database Architecture:
Design and implement high-performance, reliable, and scalable transactional database architectures.
Collaborate with cross-functional teams to understand transactional data requirements and create solutions that ensure data consistency, integrity, and availability.
Optimize database designs and recommend best practices and technology stacks.
Oversee the management of entire transactional databases, including modernization and de-duplication initiatives.
Data Lake Architecture:
Design and implement data lakes that consolidate data from disparate sources into a unified, scalable storage solution.
Architect and deploy cloud-based or on-premises data lake infrastructure.
Ensure self-service capabilities across the data engineering space for the business.
Work closely with Data Engineers, Product Owners, and Business teams.
Data Integration & Governance:
Understand ingestion and orchestration strategies.
Implement data sharing and data exchange, and assess data sensitivity and criticality to recommend appropriate designs.
Basic understanding of data governance practices.
Innovation:
Evaluate and implement new technologies, tools, and frameworks to improve data accessibility, performance, and scalability.
Stay up to date with industry trends and best practices to continuously innovate and enhance the data architecture strategy.

Posted 2 months ago

Apply

8 - 10 years

40 - 55 Lacs

Noida

Work from Office

Naukri logo

We are seeking a talented Lead Data Engineer to join our team and play a pivotal role in transforming raw data into valuable insights. As a Lead Data Engineer, you will design, develop, and maintain robust data pipelines and infrastructure to support our organization's analytics and decision-making processes, with team-handling responsibilities.
Responsibilities:
Data Pipeline Development: Build and maintain scalable data pipelines to extract, transform, and load (ETL) data from various sources (e.g., databases, APIs, files) into data warehouses or data lakes.
Data Infrastructure: Design, implement, and manage data infrastructure components, including data warehouses, data lakes, and data marts.
Data Quality: Ensure data quality by implementing data validation, cleansing, and standardization processes (a small validation sketch follows this listing).
Performance Optimization: Optimize data pipelines and infrastructure for performance and efficiency.
Collaboration: Collaborate with data analysts, scientists, and business stakeholders to understand their data needs and translate them into technical requirements.
Tool and Technology Selection: Evaluate and select appropriate data engineering tools and technologies (e.g., SQL, Python, Spark, Hadoop, cloud platforms).
Documentation: Create and maintain clear and comprehensive documentation for data pipelines, infrastructure, and processes.
(Immediate Joiners)
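
Data-quality gates like the ones described usually combine key checks, de-duplication, and format validation before a load. A minimal sketch with hypothetical column names and rules:

```python
# Minimal data-quality sketch: validate and cleanse a batch before loading.
import pandas as pd

df = pd.DataFrame({
    "customer_id": [1, 2, 2, None],
    "email": ["a@x.com", "bad-email", "b@x.com", "c@x.com"],
})

df = df.dropna(subset=["customer_id"])                 # reject rows missing keys
df = df.drop_duplicates(subset=["customer_id"])        # de-duplicate on key
valid_email = df["email"].str.contains("@", na=False)  # crude format check
rejected = df[~valid_email]                            # route to quarantine
df = df[valid_email]

print(f"loaded={len(df)} rejected={len(rejected)}")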

Posted 2 months ago

Apply

7 - 11 years

7 - 17 Lacs

Bengaluru

Work from Office

Naukri logo

About the role: This open position is part of the CEDA Advanced Analytics Team under Enterprise Customer Excellence Data and Analytics. As Lead Data Science Consultant, you will be responsible for leading projects with opportunities to improve the customer experience using advanced analytics and data solution engineering. The data science team supports automated controls and process optimization/streamlining by developing advanced analytical solutions that minimize compliance and operational risk across multiple lines of business across the bank. More specifically, you will support data exploration and population design, and automate data-driven reviews using advanced automation techniques in SAS/Python, text mining, and AI/ML. The selected candidate is expected to design analytical solutions, generate meaningful business insights, and communicate highly complex concepts to business stakeholders in layman's terms.
Responsibilities:
Work as the technical lead in developing high-quality analytical solutions and provide effective business insights
Lead business problem discovery; research, design, and develop end-to-end advanced analytical solutions using data solution engineering, ETL design, text mining, and NLP
Streamline the ETL/data flow structure feeding different analytical solutions through automation
Clearly understand and articulate business requirements by leveraging domain understanding of the line of business and product/function, and deliver results that address the business problem and the appropriate business decision levers
Identify and leverage the appropriate analytical approach from a wide toolkit to make data-driven recommendations
Required Qualifications:
Minimum 7 years of hands-on work experience in advanced analytics/data science, including a mandatory minimum of 2 years' experience in Risk and Control/Compliance in the banking domain
Prior experience in Quality Assurance and Monitoring is a plus
Engineering graduate/post-graduate in Maths/Stats/Economics/Computer Science
Strong expertise in Python, SAS/SQL, and text mining/AI-ML techniques/NLP; strong expertise in NLP and text summarization is a plus (a toy summarization sketch follows this listing)
Must have exposure to unstructured data such as contact center technology data (IVR, telephony, text, chat, etc.) along with transactional data
Exposure to SAS Viya and data lakes/Azure/big data platforms would be a plus
Sound knowledge of project documentation frameworks
Must have consultative skills: the ability to rationalize business needs and solution design from business requirements
Strong written and verbal communication, presentation, and interpersonal skills; ability to perform analysis, build hypotheses, draw conclusions, and communicate clear, actionable recommendations to business leaders and partners
Ability to interact with integrity and a high level of professionalism with all levels of team members and management
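
Extractive summarization, the simplest form of the text-summarization skill mentioned above, can be sketched in a few lines: score each sentence by the average corpus frequency of its words and keep the top scorer. The example text is invented:

```python
# Toy extractive summarizer: keep the sentence with the highest
# average word frequency. A stand-in for real NLP summarization work.
from collections import Counter
import re

text = ("Customer disputed an overdraft fee. The agent reversed the fee. "
        "Customer asked about overdraft protection and the fee schedule.")

sentences = re.split(r"(?<=[.!?])\s+", text)
freq = Counter(re.findall(r"[a-z]+", text.lower()))

def score(sentence: str) -> float:
    tokens = re.findall(r"[a-z]+", sentence.lower())
    return sum(freq[t] for t in tokens) / max(len(tokens), 1)

print(max(sentences, key=score))  # highest average-frequency sentence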

Posted 2 months ago

Apply

5 - 8 years

4 - 8 Lacs

Maharashtra

Work from Office

Naukri logo

Design, develop, and optimize data integration workflows using Apache NiFi.
Work on mission-critical projects, ensuring high availability, reliability, and performance of data pipelines.
Integrate NiFi with cloud platforms (e.g., AWS, Azure, GCP) for scalable data processing and storage.
Develop custom NiFi processors and extensions using Java.
Implement real-time data streaming solutions using Apache Kafka (a minimal producer sketch follows this listing).
Work with MongoDB for NoSQL data storage and retrieval.
Use GoldenGate for real-time data replication and integration.
Troubleshoot and resolve complex issues related to NiFi workflows and data pipelines.
Collaborate with cross-functional teams to deliver robust, production-ready solutions.
Follow best practices in coding, testing, and deployment to ensure high-quality deliverables.
Mentor junior team members and provide technical leadership.
Mandatory Skills and Qualifications:
5+ years of hands-on experience in Apache NiFi for data integration and workflow automation.
Senior-level Java programming knowledge, including experience developing custom NiFi processors and extensions.
Strong knowledge of cloud platforms (e.g., AWS, Azure, GCP) and their data services (e.g., S3, EC2, Lambda, Azure Data Lake, etc.).
Proficiency in Linux environments, including shell scripting and system administration.
Experience with Apache Kafka for real-time data streaming and event-driven architectures.
Hands-on experience with MongoDB for NoSQL data management.
Familiarity with GoldenGate for real-time data replication and integration.
Experience in performance tuning and optimization of NiFi workflows.
Solid understanding of data engineering concepts, including ETL/ELT, data lakes, and data warehouses.
Ability to work independently and deliver results in a fast-paced, high-pressure environment.
Excellent problem-solving, debugging, and analytical skills.
Good-to-Have Skills:
Experience with containerization tools like Docker and Kubernetes.
Knowledge of DevOps practices and CI/CD pipelines.
Familiarity with big data technologies like Hadoop, Spark, or Kafka.
Understanding of security best practices for data pipelines and cloud environments.
Interview Focus Areas:
Hands-on NiFi development: practical assessment of NiFi workflow design and optimization.
Java programming: senior-level coding skills, including custom NiFi processor development.
Cloud integration: understanding of how NiFi integrates with cloud platforms for data processing and storage.
Kafka and MongoDB: expertise in real-time data streaming and NoSQL data management.
GoldenGate: knowledge of real-time data replication and integration.
Linux proficiency: ability to work in Linux environments and troubleshoot system-level issues.
Problem solving: analytical skills to resolve complex data integration challenges.
Shift Requirements:
Flexible shift hours with the shift ending by midday US time.
Willingness to adapt to dynamic project needs and timelines.
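
Custom NiFi processors themselves are written in Java against the NiFi API; as a lighter illustration of the Kafka streaming requirement, here is a minimal Python producer using the kafka-python client. The broker address, topic name, and event shape are assumptions, and the script needs a running broker:

```python
# Minimal Kafka producer sketch for real-time event streaming.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",  # hypothetical broker
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

event = {"order_id": "A-1", "status": "shipped"}  # hypothetical event
producer.send("order-events", value=event)        # async send to the topic
producer.flush()                                  # block until delivered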

Posted 3 months ago

Apply