
1019 Data Lake Jobs - Page 31

Set up a Job Alert
JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

4.0 - 9.0 years

5 - 9 Lacs

Bengaluru

Work from Office

We are seeking a highly skilled Snowflake Developer to join our team in Bangalore. The ideal candidate will have extensive experience in designing, implementing, and managing Snowflake-based data solutions. This role involves developing data architectures and ensuring the effective use of Snowflake to drive business insights and innovation. Key Responsibilities: Design and implement scalable, efficient, and secure Snowflake solutions to meet business requirements. Develop data architecture frameworks, standards, and principles, including modeling, metadata, security, and reference data. Implement Snowflake-based data warehouses, data lakes, and data integration solutions. Manage data ingesti...

Posted 3 months ago

Apply

6.0 - 11.0 years

6 - 10 Lacs

Bengaluru

Work from Office

We are seeking a highly skilled Snowflake Developer to join our team in Bangalore. The ideal candidate will have extensive experience in designing, implementing, and managing Snowflake-based data solutions. This role involves developing data architectures and ensuring the effective use of Snowflake to drive business insights and innovation. Key Responsibilities: Design and implement scalable, efficient, and secure Snowflake solutions to meet business requirements. Develop data architecture frameworks, standards, and principles, including modeling, metadata, security, and reference data. Implement Snowflake-based data warehouses, data lakes, and data integration solutions. Manage data ingesti...

Posted 3 months ago

Apply

12.0 - 17.0 years

37 - 55 Lacs

Bengaluru

Work from Office

Career Area: Technology, Digital and Data. Your Work Shapes the World at Caterpillar Inc. When you join Caterpillar, you're joining a global team who cares not just about the work we do - but also about each other. We are the makers, problem solvers, and future world builders who are creating stronger, more sustainable communities. We don't just talk about progress and innovation here - we make it happen, with our customers, where we work and live. Together, we are building a better world, so we can all enjoy living in it. Role Definition: Participates in defining functional design, development and systems architecture for Caterpillar's state-of-the-art digital platform aligning to common de...

Posted 3 months ago

Apply

10.0 - 12.0 years

11 - 15 Lacs

Hyderabad

Work from Office

Job Information: Job Opening ID ZR_2063_JOB; Date Opened 17/11/2023; Industry: Technology; Job Type; Work Experience: 10-12 years; Job Title: Azure Data Architect; City: Hyderabad; Province: Telangana; Country: India; Postal Code: 500003; Number of Positions: 4; Location: Coimbatore & Hyderabad. Key: Azure + SQL + ADF + Databricks + design + Architecture (mandate). Total experience in the data management area of 10+ years with Azure cloud data platform experience. Architect with Azure stack (ADLS, AALS, Azure Databricks, Azure Streaming Analytics, Azure Data Factory, Cosmos DB & Azure Synapse) & mandatory expertise in Azure Streaming Analytics, Databricks, Azure Synapse, Azure Cosmos DB. Must have worked experience in lar...

Posted 3 months ago

Apply

8.0 - 12.0 years

3 - 6 Lacs

Bengaluru

Work from Office

Job Information: Job Opening ID ZR_2385_JOB; Date Opened 23/10/2024; Industry: IT Services; Job Type; Work Experience: 8-12 years; Job Title: Data Modeller; City: Bangalore South; Province: Karnataka; Country: India; Postal Code: 560066; Number of Positions: 1; Locations: Pune/Bangalore/Hyderabad/Indore; Contract duration: 6 months. Responsibilities: Be responsible for the development of the conceptual, logical, and physical data models, the implementation of RDBMS, operational data store (ODS), data marts, and data lakes on target platforms. Implement business and IT data requirements through new data strategies and designs across all data platforms (relational & dimensional - MUST; NoSQL - optional) and data t...

Posted 3 months ago

Apply

5.0 - 7.0 years

15 - 25 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

About the Role: We are seeking a skilled and experienced Data Engineer to join our remote team. The ideal candidate will have 5-7 years of professional experience working with Python, PySpark, SQL, and Spark SQL, and will play a key role in building scalable data pipelines, optimizing data workflows, and supporting data-driven decision-making across the organization. Key Responsibilities: Design, build, and maintain scalable and efficient data pipelines using PySpark and SQL. Develop and optimize Spark jobs for large-scale data processing. Collaborate with data scientists, analysts, and other engineers to ensure data quality and accessibility. Implement data integration from multiple sources...
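
For context, a minimal sketch of the kind of PySpark pipeline work this role describes; all paths, table names, and columns below are hypothetical, not taken from the posting:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_events_pipeline").getOrCreate()

# Read raw event data (path and schema are illustrative only)
raw = spark.read.parquet("s3://example-bucket/raw/events/")

# Basic cleaning: de-duplicate, drop rows without a timestamp, derive a date column
cleaned = (
    raw.dropDuplicates(["event_id"])
       .filter(F.col("event_ts").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
)

# Aggregate to a reporting-friendly daily table
daily_counts = cleaned.groupBy("event_date", "event_type").agg(
    F.count("*").alias("event_count")
)

# Write partitioned output for downstream analytics
daily_counts.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/daily_event_counts/"
)
```

In a production setting, a job like this would typically be parameterized by run date and triggered by an orchestrator rather than run by hand.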

Posted 3 months ago

Apply

7.0 - 12.0 years

20 - 35 Lacs

Pune, Bengaluru

Hybrid

Hi all, we have a senior position for a Databricks expert. Job location: Pune and Bangalore (hybrid). Perks: pick-up and drop provided. Role & responsibilities (note: overall experience should be 7+ years, immediate joiners only): Data engineering - data pipeline development using Azure Databricks (5+ years); optimizing data processing performance, efficient resource utilization, and execution time; workflow orchestration (5+ years); using Databricks features like Databricks SQL, Delta Lake, and Workflows to orchestrate and manage complex data workflows (5+ years); data modelling (5+ years). Nice to have: knowledge of PySpark, good knowledge of data warehousing.
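
A hedged sketch of the bronze-to-silver Delta Lake step this posting alludes to, assuming a Databricks or delta-spark enabled environment; paths and column names are illustrative only:

```python
from pyspark.sql import SparkSession, functions as F

# Assumes a Databricks (or delta-spark enabled) session where the "delta" format is available
spark = SparkSession.builder.appName("delta_ingest_example").getOrCreate()

# Bronze step: land raw JSON files into a Delta table (paths are illustrative)
bronze = spark.read.json("/mnt/raw/orders/")
bronze.write.format("delta").mode("append").save("/mnt/bronze/orders")

# Silver step: clean the bronze data before exposing it to downstream workflows
silver = (
    spark.read.format("delta").load("/mnt/bronze/orders")
         .dropDuplicates(["order_id"])
         .withColumn("order_date", F.to_date("order_ts"))
)
silver.write.format("delta").mode("overwrite").save("/mnt/silver/orders")
```

Steps like these would typically be chained and scheduled with Databricks Workflows rather than run ad hoc.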

Posted 3 months ago

Apply

10.0 - 18.0 years

22 - 27 Lacs

Hyderabad

Remote

Role: Data Architect / Data Modeler - ETL, Snowflake, DBT. Location: Remote. Duration: 14+ Months. Timings: 5:30pm IST to 1:30am IST. Note: Looking for immediate joiners. Job Summary: We are seeking a seasoned Data Architect / Modeler with deep expertise in Snowflake, DBT, and modern data architectures including Data Lake, Lakehouse, and Databricks platforms. The ideal candidate will be responsible for designing scalable, performant, and reliable data models and architectures that support analytics, reporting, and machine learning needs across the organization. Key Responsibilities: Architect and design data solutions using Snowflake, Databricks, and cloud-native lakehouse principles. Lead...

Posted 3 months ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Bengaluru

Work from Office

Skill required: Delivery - Marketing Analytics and Reporting. Designation: I&F Decision Sci Practitioner Sr Analyst. Qualifications: Any Graduation. Years of Experience: 5 to 8 years. What would you do? Data & AI: Analytical processes and technologies applied to marketing-related data to help businesses understand and deliver relevant experiences for their audiences, understand their competition, measure and optimize marketing campaigns, and optimize their return on investment. What are we looking for? Data Analytics - with a specialization in the marketing domain. Domain-specific skills: familiarity with ad tech and B2B sales. Technical skills: proficiency in SQL and Python. Experience in efficientl...

Posted 3 months ago

Apply

3.0 - 8.0 years

20 - 32 Lacs

Bengaluru

Work from Office

Translate ideas & designs into running code. Automate business processes using Office 365 Power Automate, Power Apps, Power BI. Perform software design, debugging, testing, deployment. Implement custom solutions leveraging Canvas Apps, Model-Driven Apps, Office 365. Required candidate profile: production-level app development experience using Power Apps, Power Automate, Power BI; experience in C#, JavaScript, jQuery, Bootstrap, HTML; experience in SAP HANA, ETL processes, data modeling, data cleaning, data pre-processing.

Posted 3 months ago

Apply

3.0 - 6.0 years

5 - 15 Lacs

Kochi, Thiruvananthapuram

Hybrid

Hiring for Azure Data Engineer in Kochi. Experience: 3 to 6 years. Location: Kochi. JD: Overall 3+ years of IT experience with 2+ years of relevant experience in Azure Data Factory (ADF) and good hands-on exposure to the latest ADF version. Hands-on experience with Azure Functions & Azure Synapse (formerly SQL Data Warehouse). Should have project experience in Azure Data Lake / Blob (for storage). Should have a basic understanding of Batch Account configuration and the various control options. Sound knowledge of Databricks & Logic Apps. Should be able to coordinate independently with business stakeholders, understand the business requirements, and implement the requirements using ADF. Interested candidat...

Posted 3 months ago

Apply

6.0 - 9.0 years

12 - 16 Lacs

Bengaluru

Work from Office

Overview: We are looking for an experienced GCP BigQuery Lead to architect, develop, and optimize data solutions on Google Cloud Platform, with a strong focus on BigQuery. The role involves leading warehouse setup initiatives, collaborating with stakeholders, and ensuring scalable, secure, and high-performance data infrastructure. Responsibilities: Lead the design and implementation of data pipelines using BigQuery, Datorama, Dataflow, and other GCP services. Architect and optimize data models and schemas to support analytics and reporting use cases. Implement best practices for performance tuning, partitioning, and cost optimization in BigQuery. Collaborate with business stakeholders to transl...
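
As a rough illustration of the BigQuery partitioning and cost-optimization practices mentioned above, using the google-cloud-bigquery Python client; the project, dataset, table, and columns are hypothetical:

```python
from google.cloud import bigquery

client = bigquery.Client()  # assumes GCP credentials are already configured

# Create a date-partitioned, clustered table so queries filtering on event_date
# scan only the relevant partitions (names are hypothetical)
table = bigquery.Table(
    "my-project.analytics.events",
    schema=[
        bigquery.SchemaField("event_id", "STRING"),
        bigquery.SchemaField("event_date", "DATE"),
        bigquery.SchemaField("event_type", "STRING"),
    ],
)
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="event_date"
)
table.clustering_fields = ["event_type"]
client.create_table(table, exists_ok=True)

# Cost control: always filter on the partition column to limit bytes scanned
query = """
    SELECT event_type, COUNT(*) AS events
    FROM `my-project.analytics.events`
    WHERE event_date BETWEEN @start AND @end
    GROUP BY event_type
"""
job = client.query(
    query,
    job_config=bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("start", "DATE", "2024-01-01"),
            bigquery.ScalarQueryParameter("end", "DATE", "2024-01-31"),
        ]
    ),
)
for row in job.result():
    print(row.event_type, row.events)
```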

Posted 3 months ago

Apply

18.0 - 23.0 years

15 - 19 Lacs

Hyderabad

Work from Office

About the Role We are seeking a highly skilled and experienced Data Architect to join our team. The ideal candidate will have at least 18 years of experience in Data engineering and Analytics and a proven track record of designing and implementing complex data solutions. As a senior principal data architect, you will be expected to design, create, deploy, and manage Blackbaud’s data architecture. This role has considerable technical influence within the Data Platform, Data Engineering teams, and the Data Intelligence Center of Excellence at Blackbaud. This individual acts as an evangelist for proper data strategy with other teams at Blackbaud and assists with the technical direction, specifi...

Posted 3 months ago

Apply

4.0 - 9.0 years

5 - 15 Lacs

Hyderabad, Chennai

Work from Office

Key skills: Python, SQL, PySpark, Databricks, AWS (mandatory). Added advantage: Life sciences/Pharma. Roles and Responsibilities: 1. Data Pipeline Development: Design, build, and maintain scalable data pipelines for ingesting, processing, and transforming large datasets from diverse sources into usable formats. 2. Data Integration and Transformation: Integrate data from multiple sources, ensuring data is accurately transformed and stored in optimal formats (e.g., Delta Lake, Redshift, S3). 3. Performance Optimization: Optimize data processing and storage systems for cost efficiency and high performance, including managing compute resources and cluster configurations. 4. Automation and Workflow Manag...

Posted 3 months ago

Apply

4.0 - 9.0 years

0 - 1 Lacs

Ahmedabad

Work from Office

Skills & Tools: Platforms: Oracle Primavera P6/EPPM, Microsoft Project Online, Planisware, Clarity PPM. Integration Tools: APIs (REST/SOAP), ETL tools (Informatica, Talend), Azure Data Factory. IAM/Security: Azure AD, Okta, SAML/OAuth, RBAC, SIEM tools. Data Technologies: Data Lakes (e.g., AWS S3, Azure Data Lake), SQL, Power BI/Tableau. Languages: Python, SQL, PowerShell, JavaScript (for scripting and integrations). Role & responsibilities: Technical Consultant - EPPM Platform, Cybersecurity, and Data Integrations. Role Overview: As a Technical Consultant, you will be responsible for end-to-end setup and configuration of the Enterprise Project Portfolio Management (EPPM) platform, ensuring secure, e...

Posted 3 months ago

Apply

6.0 - 11.0 years

25 - 30 Lacs

Bengaluru

Hybrid

Mandatory skills: Data Engineer, AWS Athena, AWS Glue, Redshift, Data Lake, Lakehouse, Python, SQL Server. Must-have experience: 6+ years of hands-on data engineering experience; expertise with AWS services: S3, Redshift, EMR, Glue, Kinesis, DynamoDB; building batch and real-time data pipelines; Python and SQL coding for data processing and analysis; data modeling experience using cloud-based data platforms like Redshift, Snowflake, Databricks; design and develop ETL frameworks. Nice-to-have experience: ETL development using tools like Informatica, Talend, Fivetran; creating reusable data sources and dashboards for self-service analytics; experience using Databricks for Spark workloads or Snowflake. Working...
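
A small boto3 sketch of querying a Glue-catalogued data-lake table through Athena, the kind of task this role lists; the region, database, table, and bucket names are hypothetical:

```python
import time
import boto3

athena = boto3.client("athena", region_name="us-east-1")  # region is illustrative

# Run an Athena query over a data-lake table registered in the Glue Data Catalog,
# writing results to an S3 staging location (all names are hypothetical)
response = athena.start_query_execution(
    QueryString=(
        "SELECT order_date, SUM(amount) AS revenue "
        "FROM orders GROUP BY order_date ORDER BY order_date"
    ),
    QueryExecutionContext={"Database": "sales_lake"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = response["QueryExecutionId"]

# Poll until the query finishes
while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

# Print the result rows (the first row returned is the column header)
if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    for row in rows[1:]:
        print([col.get("VarCharValue") for col in row["Data"]])
```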

Posted 3 months ago

Apply

9.0 - 14.0 years

55 - 60 Lacs

Bengaluru

Hybrid

Dodge Position Title: Technology Lead STG Labs Position Title: Location: Bangalore, India About Dodge Dodge Construction Network exists to deliver the comprehensive data and connections the construction industry needs to build thriving communities. Our legacy is deeply rooted in empowering our customers with transformative insights, igniting their journey towards unparalleled business expansion and success. We serve decision-makers who seek reliable growth and who value relationships built on trust and quality. By combining our proprietary data with cutting-edge software, we deliver to our customers the essential intelligence needed to excel within their respective landscapes. We propel the ...

Posted 3 months ago

Apply

10.0 - 17.0 years

9 - 19 Lacs

Bengaluru

Remote

Azure Data Engineer. Skills required: Azure Data Engineer, Big Data, Hadoop. Develop and maintain data pipelines using Azure services like Data Factory, PySpark, Synapse, Databricks, Adobe, Spark, Scala, etc.

Posted 3 months ago

Apply

7.0 - 9.0 years

1 - 6 Lacs

Bengaluru

Work from Office

Designation: Data Engineer. Job Location: Bangalore. About Digit Insurance: Digit's mission is to 'Make Insurance, Simple'. We are backed by Fairfax - one of the largest global investment firms. We have also been ranked as a 'LinkedIn Top 5 Startup' of 2018 and 2019 and are the fastest-growing insurance company. We have also been certified as a Great Place to Work! Digit has entered the Unicorn club with a valuation of $1.9 billion, and while most companies take about a decade to get here, we have achieved this in just 3 years. And we truly believe that this has happened as a result of the mission of the company, i.e. to make insurance simple, along with the sheer hard work & endeavors of our employ...

Posted 3 months ago

Apply

3.0 - 6.0 years

7 - 9 Lacs

Jaipur, Bengaluru

Work from Office

We are seeking a skilled Data Engineer to join our team. The ideal candidate will have strong experience with Databricks, Python, SQL, Spark, PySpark, DBT, and AWS, and a solid understanding of big data ecosystems, data lake architecture, and data modeling.

Posted 3 months ago

Apply

7.0 - 12.0 years

3 - 7 Lacs

Gurugram

Work from Office

AHEAD builds platforms for digital business. By weaving together advances in cloud infrastructure, automation and analytics, and software delivery, we help enterprises deliver on the promise of digital transformation. At AHEAD, we prioritize creating a culture of belonging, where all perspectives and voices are represented, valued, respected, and heard. We create spaces to empower everyone to speak up, make change, and drive the culture at AHEAD. We are an equal opportunity employer, and do not discriminate based on an individual's race, national origin, color, gender, gender identity, gender expression, sexual orientation, religion, age, disability, marital status, or any other protected characteri...

Posted 3 months ago

Apply

5.0 - 10.0 years

20 - 30 Lacs

Hyderabad

Remote

Hiring for TOP MNC for Data Modeler position (long-term contract - 2+ years). The Data Modeler designs and implements data models for Microsoft Fabric and Power BI, supporting the migration from Oracle/Informatica. This offshore role ensures optimized data structures for performance and reporting needs. The successful candidate will bring expertise in data modeling and a collaborative approach. Responsibilities: Develop conceptual, logical, and physical data models for Microsoft Fabric and Power BI solutions. Implement data models for relational, dimensional, and data lake environments on target platforms. Collaborate with the Offshore Data Engineer and Onsite Data Modernization Architect to en...

Posted 3 months ago

Apply

3.0 - 5.0 years

15 - 27 Lacs

Bengaluru

Work from Office

Job Summary: The NetApp Keystone team is responsible for cutting-edge technologies that enable NetApp's pay-as-you-go offering. Keystone helps customers manage data on-prem or in the cloud, with invoices charged on a subscription basis. As an engineer in NetApp's Keystone organization, you will be executing our most challenging and complex projects. You will be responsible for decomposing complex product requirements into simple solutions, understanding system interdependencies and limitations, and engineering best practices. Job Requirements: • Strong knowledge of the Python programming language, paradigms, constructs, and idioms • Bachelor's/Master's degree in computer science, in...

Posted 3 months ago

Apply

5.0 - 8.0 years

25 - 35 Lacs

Gurugram, Bengaluru

Hybrid

Role & responsibilities: Work with data product managers, analysts, and data scientists to architect, build, and maintain data processing pipelines in SQL or Python. Build and maintain a data warehouse / data lakehouse for analytics, reporting, and ML predictions. Implement DataOps and related DevOps practices focused on creating ETL pipelines for data analytics/reporting, and ELT pipelines for model training. Support, optimise, and transition our current processes to ensure well-architected implementations and best practices. Work in an agile environment within a collaborative agile product team using Kanban. Collaborate across departments and work closely with data science teams and with business (eco...

Posted 3 months ago

Apply

5.0 - 7.0 years

15 - 25 Lacs

Chennai

Work from Office

Job Summary: We are seeking a skilled Big Data Tester & Developer to design, develop, and validate data pipelines and applications on large-scale data platforms. You will work on data ingestion, transformation, and testing workflows using tools from the Hadoop ecosystem and modern data engineering stacks. Experience - 6-12 years Key Responsibilities: • Develop and test Big Data pipelines using Spark, Hive, Hadoop, and Kafka • Write and optimize PySpark/Scala code for data processing • Design test cases for data validation, quality, and integrity • Automate testing using Python/Java and tools like Apache Nifi, Airflow, or DBT • Collaborate with data engineers, analysts, and QA teams Key Skill...
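
A minimal PySpark sketch of the data-validation checks this tester/developer role describes; the table path and rules are hypothetical:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("pipeline_validation_example").getOrCreate()

# Load the transformed table to validate (path is illustrative)
df = spark.read.parquet("/data/curated/orders/")

# Simple quality and integrity rules; each check counts offending rows
checks = {
    "null_order_ids": df.filter(F.col("order_id").isNull()).count(),
    "duplicate_order_ids": df.count() - df.dropDuplicates(["order_id"]).count(),
    "negative_amounts": df.filter(F.col("amount") < 0).count(),
}

failures = {name: count for name, count in checks.items() if count > 0}
if failures:
    raise AssertionError(f"Data validation failed: {failures}")
print("All validation checks passed")
```

Checks like these are usually wrapped in pytest cases or an Airflow/DBT task so a failing rule blocks the pipeline before data is published.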

Posted 3 months ago

Apply
