1693 Data Transformation Jobs - Page 10

Set up a Job Alert
JobPe aggregates results for easy application access, but you apply on the job portal directly.

3.0 - 5.0 years

5 - 7 Lacs

Telangana

Work from Office

Skill Name: Graph Database Ontologist. Experience: 4-8 yrs. Job Location: Any Tech Mahindra location, pan India. As an Anzo Graph DB / Graph Database Ontologist, you will play a crucial role in creating and maintaining ontologies that enable data integration, search, and analytics across various data sources. Your expertise will help us unlock insights and drive business decisions. Responsibilities: - Design, develop, and maintain ontologies using Anzo, ensuring data consistency and integrity - Collaborate with data architects, engineers, and stakeholders to identify and prioritize ontology requirements - Develop and implement data governance policies and standards for ontology management - Perform d...

Posted 3 weeks ago

3.0 - 8.0 years

5 - 10 Lacs

Karnataka

Work from Office

Description: Requisition ID BXTRJP00025222 - Sr Data Engineer. Essential Duties and Responsibilities: This section contains a list of five to eight primary responsibilities of this role that account for 5% or more of the work; the incumbent will also perform other duties as assigned. Development of new ETL/data transformation jobs using PySpark or Python in AWS. Enhancement of and support for existing ETL/data transformation jobs. Can explain technical solutions and resolutions to internal customers and communicate feedback to the ETL team. Perform technical code reviews for peers moving code into production. Perform and review integration testing before production migrations. Provide high level of ...

Posted 3 weeks ago

7.0 - 12.0 years

7 - 11 Lacs

Noida

Work from Office

Primary Responsibility: We are seeking a highly skilled and self-motivated Data Migration Analyst with 7+ years of experience in Advanced SQL and Python, and exposure to PeopleSoft ERP data structures, to join our team. This role is focused on supporting the migration of business data from legacy billing systems into the PeopleSoft ERP platform. The ideal candidate will have deep expertise in Advanced SQL and PeopleSoft ERP data structures, along with a strong understanding of data transformation, modeling, and automation practices. The successful candidate will be responsible for designing and developing scalable ETL processes and ensuring operational excellence across performance, security, ...

Posted 3 weeks ago

5.0 - 10.0 years

7 - 12 Lacs

Karnataka

Work from Office

Description: Summary: In this role, you will get hands-on experience across data engineering functions including schema design, data movement, data transformation, encryption, and monitoring: all the activities needed to build, sustain, and govern big data pipelines. Responsibilities: Own development of a large-scale data platform including an operational data store, a real-time metrics store and attribution platform, and data warehouses and data marts for advertising planning, operation, reporting and optimization. Collaborate with the wider team and maintain system documentation. Maintain next-gen cloud-based big data infrastructure for batch and streaming data applications, and continuously improve performance, scalab...

Posted 3 weeks ago

2.0 - 7.0 years

4 - 9 Lacs

Karnataka

Work from Office

Experienced data modelers with SQL, ETL, and some development background to help define new data schemas and data ingestion for Adobe Experience Platform customers. Interface directly with enterprise customers and collaborate with internal teams. 10+ years of strong experience with data transformation & ETL on large data sets; 5+ years of Data Modelling experience (i.e., Relational, Dimensional, Columnar, Big Data); 5+ years of complex SQL or NoSQL experience; experience in advanced Data Warehouse concepts.
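
The dimensional-modelling experience this listing asks for (relational, dimensional, columnar) centres on star schemas. A toy illustration of what such a schema and a typical query look like, using stdlib sqlite3; the table names are invented for the sketch, not taken from any listing:

```python
# Hypothetical star schema: one fact table surrounded by dimension tables,
# built with stdlib sqlite3 so the example is self-contained.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, iso_date TEXT);
    CREATE TABLE fact_sales (
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        date_key INTEGER REFERENCES dim_date(date_key),
        amount REAL
    );
""")
con.execute("INSERT INTO dim_customer VALUES (1, 'Acme')")
con.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01')")
con.execute("INSERT INTO fact_sales VALUES (1, 20240101, 99.0)")

# Typical star-schema query: join the fact to a dimension, then aggregate.
row = con.execute("""
    SELECT c.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_customer c USING (customer_key)
    GROUP BY c.name
""").fetchone()
print(row)  # ('Acme', 99.0)
```

The design point is that facts hold narrow numeric measures keyed to wide, descriptive dimensions, which keeps analytical joins cheap and predictable.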

Posted 3 weeks ago

10.0 years

9 Lacs

Kochi, Thiruvananthapuram

Hybrid

Senior Data Architect (Snowflake | SAP | Cloud): Snowflake, SAP BW/ECC, and cloud platforms (AWS, Azure, GCP); 10+ years of experience in Data Architecture, ETL/ELT, and DataOps. Location: Ernakulam/Trivandrum (hybrid). Contract: 3 months.

Posted 3 weeks ago

2.0 - 6.0 years

0 Lacs

Gandhinagar, Gujarat

On-site

As an experienced professional with a minimum of 2 years in a relevant field, your role will involve various tasks related to project support, documentation, testing, training, client interaction, learning and development, and implementation support. Your key responsibilities will include: - Assisting in the implementation of Odoo modules and configurations based on client requirements - Supporting senior consultants in gathering and analyzing client business requirements - Helping with the setup and configuration of Odoo modules such as Sales, Purchase, Inventory, and Accounting - Preparing and maintaining documentation for system configurations, user guides, and project reports - Assisting...

Posted 3 weeks ago

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

As a data engineer at Verizon, you will be part of a world-class team dedicated to driving the telecom business to its full potential. You will play a crucial role in building data products for telecom wireless and wireline business, including consumer analytics, telecom network performance, and service assurance analytics. Working with cutting-edge technologies like digital twin, you will collaborate with business product owners, data scientists, and system architects to develop strategic data solutions from various sources. Your responsibilities will include data ingestion, preparation, transformation, developing data streaming applications, ETL/ELT development, and contributing to DevOps ...

Posted 3 weeks ago

4.0 - 8.0 years

0 Lacs

Haryana

On-site

As an experienced and highly motivated Oracle Fusion Technical Consultant, your role will involve delivering robust Oracle Fusion implementations, enhancements, and integrations, with a focus on Oracle Integration Cloud (OIC), Oracle Fusion Technical components, and BI Publisher. You should have a strong technical background and hands-on experience in designing, developing, and supporting Oracle Fusion applications across multiple modules. Key Responsibilities: - Design, develop, test, and deploy Oracle Fusion Technical components such as BI Publisher reports, OTBI, and FBDI-based integrations. - Use Oracle Integration Cloud (OIC) to create complex integrations between Oracle Fu...

Posted 3 weeks ago

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a PySpark Developer at Capgemini Invent, you will play a crucial role in designing, developing, and implementing data solutions using PySpark for large-scale data processing and analytics. You will be working with Azure Databricks (ADB) and Azure Data Factory (ADF) to create efficient data pipelines and conduct financial risk assessments. Here is what you can expect in this role: - Design, develop, and deploy PySpark applications and workflows on Azure Databricks for data transformation, cleansing, and aggregation. - Implement data pipelines using Azure Data Factory (ADF) to orchestrate ETL/ELT processes across heterogeneous data sources. - Conduct regular financial risk assessments to id...

Posted 3 weeks ago

5.0 - 10.0 years

4 - 8 Lacs

Pune, Chennai, Bengaluru

Work from Office

Job Title: PySpark Data Engineer Summary: We are seeking a skilled PySpark Data Engineer to join our team and drive the development of robust data processing and transformation solutions within our data platform. You will be responsible for designing, implementing, and maintaining PySpark-based applications to handle complex data processing tasks, ensure data quality, and integrate with diverse data sources. The ideal candidate possesses strong PySpark development skills, experience with big data technologies, and the ability to work in a fast-paced, data-driven environment. Key Responsibilities: Data Engineering Development: Design, develop, and test PySpark-based applications to process, t...

Posted 3 weeks ago

6.0 - 11.0 years

7 - 12 Lacs

Kochi

Work from Office

6+ years of experience as a Data Engineer designing and implementing scalable data solutions. Deep expertise in cloud data warehousing, ETL/ELT processes, data modeling, and business intelligence. Support design and implementation of end-to-end data solutions leveraging AWS Redshift, Apache Airflow, dbt, and other modern data tools including Databricks. Experience with the Azure data stack can also be considered. Develop data models and implement data pipelines to ingest, transform, and load data from various sources into the data warehouse. Create and maintain Apache Airflow DAGs to orchestrate complex data workflows and ETL processes. Implement data transformations and modeling using dbt to ensure data q...
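
The Airflow-plus-dbt workflow this listing describes comes down to running dependent tasks in topological order. A minimal stdlib sketch: the task names are invented, and Python's `graphlib` stands in for the Airflow scheduler so the example stays self-contained:

```python
# Hypothetical pipeline DAG: each task lists its upstream dependencies,
# mirroring the "ingest, transform, load" flow in the listing.
from graphlib import TopologicalSorter

dag = {
    "extract_sources": set(),
    "load_warehouse": {"extract_sources"},
    "dbt_transform": {"load_warehouse"},
    "publish_marts": {"dbt_transform"},
}

# static_order() yields tasks in a valid execution order; Airflow's
# scheduler performs the same resolution (plus retries and scheduling).
order = list(TopologicalSorter(dag).static_order())
print(order)  # extraction first, mart publishing last
```

In a real deployment each key would be an Airflow operator (e.g. a dbt run wrapped in a task), but the dependency-resolution idea is the same.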

Posted 3 weeks ago

3.0 - 8.0 years

9 - 13 Lacs

Pune, Chennai, Bengaluru

Work from Office

Job Summary: We are seeking a Data Engineer to help build and integrate a Generative AI-powered conversational assistant into our website and mobile app. This role is crucial in handling data pipelines, model training, and infrastructure setup to deliver a seamless, privacy-compliant experience for users seeking personalized health insights. The Data Engineer will work closely with our AI and software development teams to design scalable data solutions within Google Cloud Platform (GCP) to support this next-generation AI service. Key Responsibilities: Data Integration & Pipeline Development: Design and implement data pipelines to support training and fine-tuning of knowledge base and user dat...

Posted 3 weeks ago

5.0 - 10.0 years

6 - 9 Lacs

Pune, Chennai, Bengaluru

Work from Office

Develop and maintain data pipelines, ELT processes, and workflow orchestration using Apache Airflow, Python and PySpark to ensure the efficient and reliable delivery of data. Design and implement custom connectors to facilitate the ingestion of diverse data sources into our platform, including structured and unstructured data from various document formats. Collaborate closely with cross-functional teams to gather requirements, understand data needs, and translate them into technical solutions. Implement DataOps principles and best practices to ensure robust data operations and efficient data delivery. Design and implement data CI/CD pipelines to enable automated and efficient data integrati...

Posted 3 weeks ago

2.0 - 5.0 years

8 - 18 Lacs

Bengaluru

Remote

Location: Remote. Required Experience: 2 to 5 years. Shift timing: 5 PM to 2 AM. Job responsibilities: Set up and manage data flows from various databases, ensuring efficient extraction, transformation, and loading (ETL). Write, optimize, and maintain complex SQL queries for data extraction and analysis. Develop, maintain, and optimize data pipelines using Python, DBT, and Airflow to ensure smooth data processing. Perform data modelling to structure raw data for optimal performance and insights. Automate data visualizations and build interactive dashboards for both web and mobile apps using Power BI. Work on the development and setup of Power BI, ensuring end-to-end data integration, rather th...

Posted 3 weeks ago

6.0 - 9.0 years

15 - 25 Lacs

Kolkata

Work from Office

Skill: AWS Glue. Experience: 6 to 9 years. Location: Kolkata. Job description: Technical Skills: AWS Glue: 3+ years of hands-on experience in AWS Glue ETL development. Python/PySpark: Strong programming skills in Python and PySpark for data transformation. AWS Services: Proficiency in S3, Redshift, Athena, Lambda, and EMR. Data Formats: Experience with Parquet, Avro, JSON, CSV, and ORC file formats. SQL: Advanced SQL skills for data querying and transformation. ETL Concepts: Deep understanding of ETL/ELT design patterns and best practices. Data Modeling: Knowledge of dimensional modeling, star/snowflake schemas. Version Control: Experience with Git/Bitbucket for code management. Preferred Skills: Exper...
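
The Glue-style transform this listing describes (parse semi-structured input, cast and validate fields, emit a tabular format) can be sketched with the stdlib alone. The field names are invented, and a real Glue job would use PySpark DataFrames and Parquet rather than in-memory CSV:

```python
# Hypothetical ETL step: parse JSON records, cast fields, drop records
# that fail the cast, and emit CSV. Stdlib-only stand-in for a Glue job.
import csv
import io
import json

raw = '[{"id": "1", "amount": "19.5"}, {"id": "2", "amount": "bad"}]'

rows = []
for rec in json.loads(raw):
    try:
        rows.append({"id": int(rec["id"]), "amount": float(rec["amount"])})
    except ValueError:
        continue  # quarantine malformed records rather than failing the job

out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=["id", "amount"])
writer.writeheader()
writer.writerows(rows)
print(out.getvalue())
```

The per-record quarantine is the part worth noting: production ETL jobs typically route bad rows to a dead-letter location instead of aborting the whole batch.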

Posted 3 weeks ago

2.0 - 4.0 years

4 - 7 Lacs

Andhra Pradesh

Work from Office

Candidates should have experience in leading a team, setting goals and objectives to achieve desired outcomes. Design and develop SSIS ETL solutions to acquire and prepare data for numerous downstream systems. Design and develop SQL Server stored procedures, functions, views, and triggers to be used during the ETL process. Build data transformations with SSIS, including importing data from files and moving data from one database platform to another. Debug and tune SSIS processes to ensure accurate and efficient movement of data. Test and prepare ETL processes for deployment to production and non-production environments. Ability to work independently as well as in a team-oriented, fast-paced enviro...
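
The SSIS pattern above (stage raw data, then run a set-based SQL transform into the target while filtering bad rows) can be sketched with stdlib sqlite3; the table and column names are hypothetical, and a real implementation would use SQL Server stored procedures:

```python
# Hypothetical staging-to-target ETL in stdlib sqlite3: load raw text rows,
# then a single set-based INSERT...SELECT casts and filters them.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE staging (id TEXT, amount TEXT)")
con.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, amount REAL)")
con.executemany("INSERT INTO staging VALUES (?, ?)",
                [("1", "10.0"), ("2", "x"), ("3", "7.5")])

# Transform step: cast the text columns and skip rows whose amount
# does not start with a digit (a crude numeric check for the sketch).
con.execute("""
    INSERT INTO target
    SELECT CAST(id AS INTEGER), CAST(amount AS REAL)
    FROM staging
    WHERE amount GLOB '[0-9]*'
""")
print(con.execute("SELECT COUNT(*) FROM target").fetchone()[0])  # 2
```

Doing the transform as one set-based statement, rather than row by row in application code, is the same efficiency argument that motivates SSIS data-flow tasks and stored procedures.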

Posted 3 weeks ago

3.0 - 7.0 years

6 - 11 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Job Summary: We are seeking a Data Engineer to join our team and work on a Generative AI initiative. The ideal candidate will have a deep understanding of data modeling and data schemas, and will have developed ETLs from various sources, ensuring high data availability, fault tolerance, security and governance. Responsibilities: Collaborate with stakeholders to understand data requirements and design scalable and efficient data models, schemas, and architectures on the Azure platform. Develop and implement data integration and ETL (Extract, Transform, Load) processes to ingest, transform, and load data from various sources into Azure data storage systems. Build and maintain scalable and reliable data...

Posted 3 weeks ago

2.0 - 6.0 years

5 - 9 Lacs

Uttar Pradesh

Work from Office

Skill: Big Data. Need to Have: Python, PySpark, Trino, Hive. Good to Have: Snowflake, SQL, Airflow, OpenShift, Kubernetes. Location: Hyderabad, Pune, Bangalore. Job Description: Develop and maintain data pipelines, ELT processes, and workflow orchestration using Apache Airflow, Python and PySpark to ensure the efficient and reliable delivery of data. Design and implement custom connectors to facilitate the ingestion of diverse data sources into our platform, including structured and unstructured data from various document formats. Collaborate closely with cross-functional teams to gather requirements, understand data needs, and translate them into technical solutions. Implement DataOps pr...

Posted 3 weeks ago

2.0 - 5.0 years

3 - 7 Lacs

Lakshadweep, Chandigarh

Work from Office

Data Engineer. Skills Required: Strong proficiency in ADF, Snowflake and SQL (all 3 are mandatory). Experience Required: Minimum 5 years of relevant experience. Location: Available across all UST locations. Notice Period: Immediate joiners (candidates available to join by 31st January 2025). SO - 48778470. Roles Open: 5 positions available. Budget: 25 LPA to 28 LPA. We are looking for profiles that meet the above requirements. Kindly share profiles of suitable candidates at your earliest convenience. For any urgent queries or additional support, feel free to reach out to Vidyalakshmi Murali (UST, IN) - 9605562549 directly. JD FYR: We are seeking a highly skilled Data Engineer to join our team. The idea...

Posted 3 weeks ago

8.0 years

20 - 21 Lacs

Bengaluru

Work from Office

Description: As a Software Engineer at Aurigo’s Research & Development Centre in Bangalore, you will work with a skilled team of Technical Architects and Business Analysts, and work closely with developers and quality engineers to build solutions. Your technical competency, solution mindset, development skills, and project planning and execution skills are critical to professional growth and project success. If terminology like Azure DevOps, Critical Path, Scope of Work, Technical Interface Design, Integration touchpoints, Data Migration, Scrum Meetings, Agile delivery process, customer presentations, solution analysis and design sessions, etc., feels like home to you, this is the right role for you. ...

Posted 3 weeks ago

6.0 - 10.0 years

8 - 12 Lacs

Uttar Pradesh

Work from Office

Hyderabad preferred (but open to other locations). At least 6+ years of experience in designing, developing, and implementing Informatica PowerCenter and SQL Server solutions. Expertise in ingesting, transforming, and exporting JSON and XML data files with complex nested hierarchies. Expertise in working with SQL databases and writing complex SQL queries, stored procedures, and performance tuning. Familiarity with both batch and real-time data ingestion into SQL Server using Informatica PowerCenter mappings, creating balancing processes, and writing to audit tables. Strong understanding of data integration, data quality, and data transformation concepts. Ability to troubleshoot an...

Posted 3 weeks ago

3.0 - 8.0 years

5 - 10 Lacs

Lakshadweep, Chandigarh, Dadra & Nagar Haveli

Work from Office

SAP: 7+ yrs, all PSL locations, rate below 22 LPA. JD for SAP resources: Resource Skills: SAP Data Services (BODS), SAP Landscape Transformation (SLT), Data Replication, ETL. Other Skills: SQL Scripting, Data Validation, Performance Optimization, Data Transformation. Minimum 3 years of experience in data extraction from SAP ERP systems using SAP SLT and SAP BODS and loading into a data lake. ETL experience for complex data transformations. Strong SQL skills. Good communication skills. Location: Lakshadweep, Chandigarh, Dadra & Nagar Haveli, Daman, New Delhi, Diu, Goa, Haveli, Puducherry, Sikkim

Posted 3 weeks ago

10.0 - 14.0 years

35 - 40 Lacs

Pune

Work from Office

Experienced data modelers with SQL, ETL, and some development background to help define new data schemas and data ingestion for Adobe Experience Platform customers. Interface directly with enterprise customers and collaborate with internal teams. 10+ years of strong experience with data transformation & ETL on large data sets; 5+ years of Data Modelling experience (i.e., Relational, Dimensional, Columnar, Big Data); 5+ years of complex SQL or NoSQL experience; experience in advanced Data Warehouse concepts.

Posted 3 weeks ago

6.0 - 9.0 years

8 - 11 Lacs

Pune

Work from Office

JD for Unity Catalog and Databricks Engineer role (C2H). Experience: 6 to 9 yrs overall, with experience in PySpark and Databricks. Location: PAN India (except Hyderabad and Chennai as preferred locations). NP: Immediate/15 days. Hybrid: 2 or 3 days WFO. Unity Catalog Adoption Data Engineer, 3-6 years, PySpark, Databricks, Unity Catalog. JD: Design, develop, and maintain efficient and scalable data pipelines using PySpark and Databricks. Utilize Databricks to orchestrate and automate data workflows. Implement and manage data storage solutions using Unity Catalog for effective data governance. Collaborate with data scientists, analysts, and business stakeholders to ensure smooth data flow and integration. Write opt...

Posted 3 weeks ago


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies