6.0 - 10.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Role Overview: As an ETL Pentaho Developer with a minimum of 6 years of experience, you will apply strong data analysis skills and deep expertise in the Pentaho BI Suite. You will create and manage ETL pipelines, perform data masking/protection, and work within RDBMS systems. Your analytical mindset and problem-solving skills will be key to managing multiple projects in a fast-paced environment. Key Responsibilities: - Develop and implement strategies for false positive reduction - Act as a liaison between business stakeholders and technical teams - Provide expert guidance on sanctions compliance and regulatory requirements - Communicate complex technic...
Posted 3 days ago
3.0 - 7.0 years
4 - 8 Lacs
Bengaluru, Karnataka, India
On-site
We are actively seeking a highly skilled Snowflake Fivetran Developer to join our client's team through Acme Services. This pivotal role is crucial for designing, building, and maintaining robust data platforms in the cloud. The ideal candidate will possess strong Cloud Data Warehouse experience, preferably with AWS, and be proficient in Snowflake, Fivetran (as a mandatory ETL tool), and DBT (Data Build Tool). Experience in building comprehensive data platforms and the ability to work effectively with sales and marketing use cases are highly preferred. Key Responsibilities Cloud Data Warehousing: Design, implement, and manage scalable data solutions within a Cloud Data Warehouse enviro...
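For candidates unfamiliar with the stack, a minimal, illustrative sketch of querying Snowflake from Python follows; the account, credentials, and the Fivetran-synced table name are assumptions, not part of the posting:

```python
import snowflake.connector  # requires the snowflake-connector-python package

# Placeholder connection details; in practice these come from a secrets manager.
conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="********",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="MARKETING",
)

cur = conn.cursor()
# Hypothetical table landed by a Fivetran Salesforce connector.
cur.execute("""
    SELECT campaign_id, COUNT(DISTINCT lead_id) AS leads
    FROM raw.fivetran_salesforce_leads
    GROUP BY campaign_id
""")
for campaign_id, leads in cur.fetchall():
    print(campaign_id, leads)

cur.close()
conn.close()
```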
Posted 1 week ago
5.0 - 9.0 years
30 - 37 Lacs
Hyderabad
Hybrid
We are seeking a Data Engineer with strong expertise in data engineering, including data modeling, ETL pipeline development, data governance, and Data Build Tool (DBT), along with proficiency in Python, PySpark, and AWS services such as Glue, Redshift, Lambda, and DynamoDB. Candidates should also have in-depth knowledge of MySQL, including core and advanced concepts. JD: Proficiency with Python and SQL for data processing (Spark is a bonus) Working experience with AWS storage and database/data warehouse services - S3, Redshift, RDS, DynamoDB etc. Hands-on experience with AWS compute services such as EC2, ECR, Lambda (layers/triggers) etc. Ensure proper logging, error handling, and performance m...
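As a rough sketch of the logging and error-handling expectation in a Lambda function (the SQS-style event shape and field names are assumptions):

```python
import json
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)

def lambda_handler(event, context):
    """Process a batch of records, logging failures instead of crashing the batch."""
    processed, failed = 0, 0
    for record in event.get("Records", []):
        try:
            payload = json.loads(record["body"])  # assumes an SQS-style event
            # ... transform and load the payload here ...
            processed += 1
        except (KeyError, json.JSONDecodeError) as exc:
            failed += 1
            logger.error("Skipping bad record: %s", exc)
    logger.info("Processed %d records, %d failures", processed, failed)
    return {"processed": processed, "failed": failed}
```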
Posted 3 weeks ago
12.0 - 16.0 years
19 - 25 Lacs
Bengaluru
Work from Office
Role: Data Engineering Lead Location: Bangalore (Work from Office) Experience: 12+ Years Notice Period: Immediate to 30 Days Job Description: Design, develop, and optimize scalable data architectures using Databricks and Apache Spark. Build and maintain ETL/ELT pipelines for large-scale data processing. Ensure data governance, security, and compliance across the Databricks environment. Collaborate with analytics, marketing, and business teams to deliver data-driven solutions. Integrate Databricks with AWS/Azure/GCP, and CRM tools like Salesforce or Veeva. Lead and mentor a team of data engineers, ensuring delivery excellence. Implement Medallion Architecture and Delta Lake for efficient...
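A minimal sketch of the bronze-to-silver step in a Medallion layout with Delta Lake; the paths, column names, and raw source are assumptions, and the delta-spark package is required:

```python
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder.appName("medallion-sketch")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Bronze: land raw events as-is (hypothetical source path).
bronze = spark.read.json("s3://raw-bucket/events/")
bronze.write.format("delta").mode("append").save("/lake/bronze/events")

# Silver: deduplicate and add a partition-friendly date column.
silver = (
    spark.read.format("delta").load("/lake/bronze/events")
    .dropDuplicates(["event_id"])
    .withColumn("event_date", F.to_date("event_ts"))
)
silver.write.format("delta").mode("overwrite").save("/lake/silver/events")
```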
Posted 3 weeks ago
0.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Role Overview We are seeking a proficient ETL Developer with a strong emphasis on Software Development Engineer in Test (SDET) responsibilities. The ideal candidate will possess hands-on experience in developing, testing, and maintaining ETL pipelines, with a deep understanding of data integration, transformation, and validation processes. This role requires expertise in Google Cloud Platform (GCP) services, advanced SQL, and robust testing methodologies. #Mandatory skills - ETL pipelines, BigQuery, strong SQL, Airflow and DAGs #Details Experience in Data Engineering - Development and Testing Strong knowledge in database concepts, ETL/ELT, star schema, data modelling Strong experience in dat...
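As one possible shape of the Airflow-and-DAGs requirement, a daily DAG that runs a BigQuery load; the DAG id, project, dataset, and table names are placeholders:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_orders_load",        # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load_orders = BigQueryInsertJobOperator(
        task_id="load_orders",
        configuration={
            "query": {
                # Hypothetical staging and warehouse tables.
                "query": "SELECT * FROM staging.orders WHERE load_date = '{{ ds }}'",
                "destinationTable": {
                    "projectId": "my-project",
                    "datasetId": "warehouse",
                    "tableId": "orders",
                },
                "writeDisposition": "WRITE_APPEND",
                "useLegacySql": False,
            }
        },
    )
```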
Posted 4 weeks ago
8.0 - 13.0 years
0 Lacs
Karnataka
On-site
Role Overview: You will be responsible for designing, building, and maintaining robust data models and pipelines using Azure Synapse and MS SQL to support analytics, reporting, and performance needs. Your role will involve implementing scalable ingestion frameworks for structured and unstructured data into the data warehouse (DWH), ensuring data integrity and consistency. Additionally, you will use Python for data processing, automation, and analytics, following best practices for maintainability and performance. Managing tasks and progress using Jira will be crucial to ensure timely delivery of high-quality, production-ready data solutions aligned with business goals. Collaboration with sta...
Posted 1 month ago
2.0 - 6.0 years
0 Lacs
Karnataka
On-site
lululemon is an innovative performance apparel company dedicated to supporting individuals in their yoga, running, training, and other athletic pursuits. With a focus on technical fabrics and functional design, we aim to create transformational products and experiences that encourage movement, growth, connection, and overall well-being. Our success is attributed to our groundbreaking products, our commitment to our team members, and the strong connections we establish within every community we engage with. As a company, we are dedicated to fostering positive change and building a healthier, thriving future. Central to this mission is the establishment of an equitable, inclusive, and growth-o...
Posted 2 months ago
4.0 - 6.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Profile Description We're seeking someone to join our team as an Associate who will work as a hands-on Database & PySpark Developer within the AIDT Data & Services RWMS Squad. This position is focused on delivering tactical and strategic solutions to support the Squad for analysis, processing, and reporting. WM_Technology Wealth Management Technology is responsible for the design, development, delivery, and support of the technical solutions behind the products and services used by the Morgan Stanley Wealth Management Business. Practice areas include: Analytics, Intelligence, & Data Technology (AIDT), Client Platforms, Core Technology Services (CTS), Financial Advisor Platforms, Global Bankin...
Posted 2 months ago
6.0 - 10.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You should have a minimum of 6 years of experience as an ETL Pentaho Developer. Candidates with exposure to the latest versions, such as 7/8, are preferred. You should have excellent data analysis skills and substantial experience with the Pentaho BI Suite, including Pentaho Data Integration Designer / Kettle, Pentaho Report Designer, Pentaho Design Studio, Pentaho Enterprise Console, Pentaho BI Server, Pentaho Metadata, Pentaho Analysis View, Pentaho Analyser & Mondrian. You should also have experience in performing Data Masking/Protection using Pentaho Data Integration. Your responsibilities will include creating ETL pipelines, which involves extraction, transformation, mer...
Posted 2 months ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You will be responsible for performing comprehensive testing of ETL pipelines to ensure data accuracy and completeness across different systems. This includes validating Data Warehouse objects such as fact and dimension tables, designing and executing test cases and test plans for data extraction, transformation, and loading processes, and conducting regression testing to confirm that enhancements do not break existing data flows. You will also write complex SQL queries for data verification and backend testing, and test data processing workflows in Azure Data Factory and Databricks environments. Collaboration with developers, data engineers, and business analysts to u...
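A small, self-contained sketch of the kind of verification query such testing implies, using an in-memory SQLite stand-in; real tests would run against the warehouse, and the table names are assumptions:

```python
import sqlite3

# In-memory stand-in for staging and target tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_sales (order_id INTEGER, amount REAL);
    CREATE TABLE fact_sales (order_id INTEGER, amount REAL);
    INSERT INTO stg_sales VALUES (1, 100.0), (2, 250.5);
    INSERT INTO fact_sales VALUES (1, 100.0), (2, 250.5);
""")

def reconcile(source: str, target: str) -> None:
    """Compare row counts and amount totals between source and target tables."""
    src = conn.execute(f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {source}").fetchone()
    tgt = conn.execute(f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {target}").fetchone()
    assert src == tgt, f"Mismatch: {source}={src} vs {target}={tgt}"

reconcile("stg_sales", "fact_sales")
print("Reconciliation passed")
```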
Posted 3 months ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
The ideal candidate for this position should have advanced proficiency in Python, with a solid understanding of inheritance and classes. Additionally, the candidate should be well-versed in EMR, Athena, Redshift, AWS Glue, IAM roles, CloudFormation (CFT is optional), Apache Airflow, Git, SQL, PySpark, Open Metadata, and Data Lakehouse. Experience with metadata management is highly desirable, particularly with AWS services such as S3. The candidate should possess the following key skills: - Creation of ETL pipelines - Deploying code in EMR - Querying in Athena - Creating Airflow DAGs for scheduling ETL pipelines - Knowledge of AWS Lambda and ability to create Lambda functions This role is fo...
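An illustrative sketch of querying Athena from Python with boto3; the database name, results bucket, and region are assumptions:

```python
import time

import boto3

athena = boto3.client("athena", region_name="us-east-1")

# Hypothetical database and results bucket.
response = athena.start_query_execution(
    QueryString="SELECT event_date, COUNT(*) AS events FROM analytics.events GROUP BY event_date",
    QueryExecutionContext={"Database": "analytics"},
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
)
query_id = response["QueryExecutionId"]

# Poll until the query finishes, then fetch the results.
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    print(rows)
```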
Posted 3 months ago
9.0 - 14.0 years
9 - 14 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Designing and developing custom SharePoint solutions based on business requirements. Creating and configuring SharePoint sites, lists, libraries, workflows, and web parts. Developing custom forms and templates using InfoPath and SharePoint Designer. Providing technical support for SharePoint users. Developing and implementing SharePoint security and access controls. Monitoring SharePoint performance and troubleshooting issues. Collaborating with other developers and stakeholders to ensure solutions meet business needs. Excellent understanding of organization structures and business operations, with the ability to design a SharePoint implementation accordingly. Experience on SharePoint Online Expe...
Posted 4 months ago
5.0 - 10.0 years
27 - 32 Lacs
Gurugram, Bengaluru
Work from Office
Job Title - Technical Project Manager Location - Gurgaon / Bangalore Nature of Job - Permanent Department - Data Analytics What you will be doing Demonstrated client servicing and business analytics skills with at least 5-9 years of experience as a data engineer, BI developer, data analyst, technical project manager, program manager, etc. Technical project management - drive BRD, project scope, resource allocation, team coordination, stakeholder communication, UAT, prod fixes, change requests, project governance Sound knowledge of the banking industry (payments, retail operations, fraud, etc.) Strong ETL experience or experienced Teradata developer Managing a team of business analysts, BI developers, ET...
Posted 4 months ago
5.0 - 10.0 years
14 - 16 Lacs
Chennai
Work from Office
Role - Python Developer + SQL Experience - 5+ Years Location - Chennai (DLF) Hybrid work from office Shift timings - 6 pm - 3 am Must-Have Skills: Strong Python programming skills Very strong SQL skills Experience in ETL/ELT development Knowledge of relational databases Ability to work independently and communicate effectively Good-to-Have: Knowledge of data quality frameworks and testing strategies Experience with version control systems like Git Familiarity with CI/CD pipelines for data engineering
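A tiny sketch of the Python-plus-SQL pattern this role describes: an idempotent load using an upsert, with SQLite standing in for the relational database (the table and columns are made up; ON CONFLICT requires SQLite 3.24+):

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for the target relational database
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, city TEXT)")

# Extracted rows; the repeated id=1 shows the upsert overwriting rather than duplicating.
extracted = [(1, "Asha", "Chennai"), (2, "Ravi", "Bengaluru"), (1, "Asha", "Chennai (DLF)")]

conn.executemany(
    """INSERT INTO customers (id, name, city) VALUES (?, ?, ?)
       ON CONFLICT(id) DO UPDATE SET name = excluded.name, city = excluded.city""",
    extracted,
)
conn.commit()
print(conn.execute("SELECT * FROM customers ORDER BY id").fetchall())
```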
Posted Date not available
10.0 - 15.0 years
15 - 25 Lacs
Pune
Work from Office
Designing and implementing scalable and resilient data architectures for both batch and streaming data processing. Developing data models and database structures. Ensuring data security, integrity, and compliance with relevant regulations. Required Candidate profile Experience in Hadoop, Spark, NoSQL databases. Cloud exposure: AWS, Azure, GCP. Coding skills in Python, Java, Scala, and Spark frameworks. Hands-on with ETL tools and building data pipelines.
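As a sketch of the batch-plus-streaming processing such an architecture role covers, a Spark Structured Streaming aggregation; the schema, paths, and window size are assumptions:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("streaming-sketch").getOrCreate()

# Streaming read from a landing directory (Kafka would be a common production source).
events = (
    spark.readStream
    .schema("user_id STRING, amount DOUBLE, event_ts TIMESTAMP")
    .json("/data/incoming/")            # hypothetical landing path
)

# Hourly totals per user, with a watermark so late data is bounded.
hourly = (
    events.withWatermark("event_ts", "1 hour")
    .groupBy(F.window("event_ts", "1 hour"), "user_id")
    .agg(F.sum("amount").alias("total_amount"))
)

query = (
    hourly.writeStream
    .outputMode("append")
    .format("parquet")
    .option("path", "/data/curated/hourly_totals")
    .option("checkpointLocation", "/data/checkpoints/hourly_totals")
    .start()
)
query.awaitTermination()
```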
Posted Date not available