
1016 ETL Process Jobs - Page 13

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 8.0 years

4 - 8 Lacs

Bengaluru

Work from Office

About The Role
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Informatica Data Quality
Good-to-have skills: NA
Minimum experience: 3 year(s)
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with team members to enhance data workflows and contribute to the overall efficiency of data management practices within the organization.

Roles & Responsibilities:
- Perform independently and grow into an SME.
- Participate actively and contribute in team discussions.
- Contribute solutions to work-related problems.
- Assist in the design and implementation of data architecture to support data initiatives.
- Monitor and optimize data pipelines for performance and reliability.

Professional & Technical Skills:
- Must-have: proficiency in Informatica Data Quality.
- Strong understanding of data integration techniques and ETL processes.
- Experience with data quality assessment and improvement methodologies.
- Familiarity with data governance principles and best practices.
- Ability to work with large datasets and perform data cleansing.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Informatica Data Quality.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 3 weeks ago

Apply

4.0 - 9.0 years

20 - 35 Lacs

Chennai

Hybrid

Hi there! I am hiring for a client of mine seeking a Data Engineer to join their team in Chennai. An overview of the position is below.

Role & responsibilities:
- Design, develop, test, and support ETL/ELT solutions automating data-loading processes in line with architectural standards and best practices.
- Follow the migration process to move ETL objects from development to QA, stage, and production environments.

Preferred candidate profile:
- At least 5 years of working experience with SAP Data Services (BODS) in ETL projects and support work.
- Experience with SAP Data Services (BODS) version 4.2/4.3.
- Strong knowledge of relational data design and Structured Query Language (SQL).
- Strong knowledge of Python for scripting, data transformation, automation, and integration with external APIs and services.
- Experience with Snowflake, SAP HANA, PostgreSQL, MS SQL, and Oracle.
- Experience writing technical specifications and testing documents.
- Experience with the Data Services Management Console for monitoring, executing, and scheduling jobs.
- Experience with file transfer protocols.
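The core of the role above is the extract-transform-load cycle. As a rough, hedged sketch of that pattern (the schema and data below are invented for illustration; SAP Data Services itself is a graphical tool, so this is only the underlying idea expressed in Python with SQLite standing in for the databases listed):

```python
import sqlite3

# Invented source table; a real BODS job would read from SAP HANA, Oracle, etc.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_orders (id INTEGER, amount REAL, region TEXT)")
conn.executemany("INSERT INTO src_orders VALUES (?, ?, ?)",
                 [(1, 120.0, "south"), (2, -5.0, "north"), (3, 80.0, "south")])

# Extract: pull raw rows from the source system.
rows = conn.execute("SELECT id, amount, region FROM src_orders").fetchall()

# Transform: drop invalid amounts and normalise region codes.
clean = [(i, a, r.upper()) for (i, a, r) in rows if a > 0]

# Load: write the cleaned rows into the target table.
conn.execute("CREATE TABLE tgt_orders (id INTEGER, amount REAL, region TEXT)")
conn.executemany("INSERT INTO tgt_orders VALUES (?, ?, ?)", clean)
loaded = conn.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
print(loaded)  # 2
```

In a production pipeline, each of these three phases would additionally be logged and monitored (the Management Console duties mentioned above), and the reject row would be routed to an error table rather than silently dropped.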

Posted 3 weeks ago

Apply

7.0 - 10.0 years

20 - 35 Lacs

Chennai

Work from Office

Experience: 7.00+ years. Salary: confidential (based on experience). Shift: (GMT+05:30) Asia/Kolkata (IST). Opportunity type: Hybrid (Chennai). Placement type: full-time permanent position. (Note: this is a requirement for one of Uplers' clients - Forbes Advisor.)

Must-have skills: Agile, Program Management, data infrastructure.

Forbes Advisor is looking for: Program Manager - Data

Job Description: We're hiring a Program Manager to orchestrate complex, cross-functional data initiatives, from revenue-pipeline automation to analytics product launches. You'll be the connective tissue between Data Engineering, Analytics, RevOps, Product, and external partners, ensuring programs land on time, on scope, and with measurable impact. If you excel at turning vision into executable roadmaps, mitigating risk before it bites, and communicating clearly across technical and business audiences, we'd love to meet you.

Key Responsibilities:
- Own program delivery for multi-team data products (e.g., revenue-data pipelines, attribution models, partner-facing reporting APIs).
- Build and maintain integrated roadmaps, aligning sprint plans, funding, and resource commitments.
- Drive agile ceremonies (backlog grooming, sprint planning, retrospectives) and track velocity, burn-down, and cycle-time metrics.
- Create transparent status reporting (risks, dependencies, OKRs) tailored for audiences from engineers to C-suite stakeholders.
- Proactively remove blockers by coordinating with Platform, IT, Legal/Compliance, and external vendors.
- Champion process optimisation: intake, prioritisation, change management, and post-mortems.
- Partner with RevOps and Media teams to ensure program outputs translate into revenue growth and faster decision making.
- Facilitate launch readiness (QA checklists, enablement materials, go-live runbooks) so new data products land smoothly.
- Foster a culture of documentation, psychological safety, and continuous improvement within the data organisation.

Experience required:
- 7+ years of program- or project-management experience in data, analytics, SaaS, or high-growth tech.
- Proven success delivering complex, multi-stakeholder initiatives on aggressive timelines.
- Expertise with agile frameworks (Scrum/Kanban) and modern collaboration tools (Jira, Asana, Notion/Confluence, Slack).
- Strong understanding of data and cloud concepts (pipelines, ETL/ELT, BigQuery, dbt, Airflow/Composer).
- Excellent written and verbal communication; able to translate between technical teams and business leaders.
- Risk-management mindset: identify, quantify, and drive mitigation before issues escalate.
- Experience coordinating across time zones and cultures in a remote-first environment.

Nice to have:
- Formal certification (PMP, PMI-ACP, CSM, SAFe, or equivalent).
- Familiarity with GCP services, Looker/Tableau, or marketing-data stacks (Google Ads, Meta, GA4).
- Exposure to revenue operations, performance marketing, or subscription/affiliate business models.
- Background in change-management or process-improvement methodologies (Lean, Six Sigma).

Perks:
- Monthly long weekends: every third Friday off.
- Fitness and commute reimbursement.
- Remote-first culture with flexible hours and a high-trust environment.
- Opportunity to shape a world-class data platform inside a trusted global brand.
- Collaboration with talented engineers, analysts, and product leaders who value innovation and impact.

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload an updated resume.
Step 3: Increase your chances of being shortlisted and meeting the client for the interview!

About Forbes Advisor: Forbes Advisor is a high-growth digital media and technology company that empowers consumers to make confident decisions about money, health, careers, and everyday life. Our global data organisation builds modern, AI-augmented pipelines that turn information into revenue-driving insight.

Posted 3 weeks ago

Apply

2.0 - 6.0 years

1 - 4 Lacs

Mysuru

Work from Office

Job Overview: We are seeking an experienced and highly skilled Senior Data Engineer to join our team. This role combines software development and data engineering expertise. The ideal candidate will have advanced knowledge of Python and SQL, a solid understanding of API creation (specifically REST APIs and FastAPI), and experience building reusable and configurable frameworks.

Key Responsibilities:
- Develop APIs & microservices: design, build, and maintain scalable, high-performance REST APIs using FastAPI and other frameworks.
- Data engineering: work on data pipelines, ETL processes, and data processing for robust data solutions.
- System architecture: collaborate on the design and implementation of configurable, reusable frameworks that streamline processes.
- Cross-functional collaboration: work closely with software engineers, data scientists, and DevOps teams to build end-to-end solutions that serve both application and data needs.
- Slack app development: design and implement Slack integrations and custom apps as required for team productivity and automation.
- Code quality: uphold high coding standards through rigorous testing, code reviews, and maintainable code.
- SQL expertise: write efficient, optimized SQL queries for data storage, retrieval, and analysis.
- Microservices architecture: build and manage microservices that are modular, scalable, and decoupled.

Required Skills & Experience:
- Programming languages: expert in Python, with solid experience building APIs and microservices.
- Web frameworks & APIs: strong hands-on experience designing RESTful APIs with FastAPI (Flask optional).
- Data engineering expertise: strong knowledge of SQL, relational databases, and ETL processes; experience with cloud-based data solutions is a plus.
- API & microservices architecture: proven ability to design, develop, and deploy APIs and microservices architectures.
- Slack app development: experience integrating Slack apps or creating custom Slack workflows.
- Reusable framework development: ability to design modular, configurable frameworks that can be reused across teams and systems.
- Problem-solving: ability to break down complex problems and deliver practical solutions.
- Software development: strong engineering fundamentals, including version control, debugging, and deployment best practices.

Why Join Us?
- Growth opportunities: you'll work with cutting-edge technologies and continuously improve your technical skills.
- Collaborative culture: a dynamic, inclusive team where your ideas and contributions are valued.
- Competitive compensation: a competitive salary, comprehensive benefits, and a flexible work environment.
- Innovative projects: be part of projects with real-world impact that help shape the future of data and software development.

If you're passionate about working across data and software engineering and enjoy building scalable, efficient systems, apply today and help us innovate!
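The "reusable and configurable framework" requirement in this posting is abstract; a minimal sketch of what such a thing can look like in Python (the `Pipeline` class, step names, and config keys below are invented for illustration, not taken from any specific codebase):

```python
from typing import Any, Callable

class Pipeline:
    """A tiny configurable pipeline: steps are plain functions, and
    behaviour is driven by a config dict rather than hard-coded values."""

    def __init__(self, config: dict):
        self.config = config
        self.steps: list[Callable[[Any], Any]] = []

    def step(self, fn):
        # Register a transformation step via decorator; returns fn unchanged.
        self.steps.append(fn)
        return fn

    def run(self, data):
        # Thread the data through every registered step in order.
        for fn in self.steps:
            data = fn(data)
        return data

pipe = Pipeline({"multiplier": 3})

@pipe.step
def scale(xs):
    return [x * pipe.config["multiplier"] for x in xs]

@pipe.step
def keep_even(xs):
    return [x for x in xs if x % 2 == 0]

print(pipe.run([1, 2, 3, 4]))  # [6, 12]
```

Because the steps are decoupled from the runner, the same framework can be reused by another team with a different config and a different set of step functions, which is presumably the point of the requirement.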

Posted 3 weeks ago

Apply

2.0 - 5.0 years

3 - 6 Lacs

Mumbai

Work from Office

Sr. Python Developer
Experience: 5+ years
Location: Bangalore/Hyderabad

Job Overview: We are seeking an experienced and highly skilled Senior Data Engineer to join our team. This role combines software development and data engineering expertise. The ideal candidate will have advanced knowledge of Python and SQL, a solid understanding of API creation (specifically REST APIs and FastAPI), and experience building reusable and configurable frameworks.

Key Responsibilities:
- Develop APIs & microservices: design, build, and maintain scalable, high-performance REST APIs using FastAPI and other frameworks.
- Data engineering: work on data pipelines, ETL processes, and data processing for robust data solutions.
- System architecture: collaborate on the design and implementation of configurable, reusable frameworks that streamline processes.
- Cross-functional collaboration: work closely with software engineers, data scientists, and DevOps teams to build end-to-end solutions that serve both application and data needs.
- Slack app development: design and implement Slack integrations and custom apps as required for team productivity and automation.
- Code quality: uphold high coding standards through rigorous testing, code reviews, and maintainable code.
- SQL expertise: write efficient, optimized SQL queries for data storage, retrieval, and analysis.
- Microservices architecture: build and manage microservices that are modular, scalable, and decoupled.

Required Skills & Experience:
- Programming languages: expert in Python, with solid experience building APIs and microservices.
- Web frameworks & APIs: strong hands-on experience designing RESTful APIs with FastAPI (Flask optional).
- Data engineering expertise: strong knowledge of SQL, relational databases, and ETL processes; experience with cloud-based data solutions is a plus.
- API & microservices architecture: proven ability to design, develop, and deploy APIs and microservices architectures.
- Slack app development: experience integrating Slack apps or creating custom Slack workflows.
- Reusable framework development: ability to design modular, configurable frameworks that can be reused across teams and systems.
- Problem-solving: ability to break down complex problems and deliver practical solutions.
- Software development: strong engineering fundamentals, including version control, debugging, and deployment best practices.

Why Join Us?
- Growth opportunities: you'll work with cutting-edge technologies and continuously improve your technical skills.
- Collaborative culture: a dynamic, inclusive team where your ideas and contributions are valued.
- Competitive compensation: a competitive salary, comprehensive benefits, and a flexible work environment.
- Innovative projects: be part of projects with real-world impact that help shape the future of data and software development.

If you're passionate about working across data and software engineering and enjoy building scalable, efficient systems, apply today and help us innovate!

Posted 3 weeks ago

Apply

5.0 - 10.0 years

9 - 13 Lacs

Mumbai

Work from Office

Job Title: Business Management Analyst
Corporate Title: Analyst
Location: Mumbai, India

Role Description: As a Business Analyst you are expected to design and deliver critical senior-management dashboards and analytics using tools such as Excel and SQL. These management packs should enable management to make timely decisions for their respective businesses and create a sound foundation for analytics. You will collaborate closely with senior business managers, data engineers, and stakeholders from other teams to understand requirements and translate them into visually pleasing dashboards and reports. You will play a crucial role in analyzing business data and generating valuable insights for other strategic ad hoc exercises.

What we'll offer you:
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities:
- Collaborate with business users and managers to gather requirements and understand business needs in order to design optimal solutions.
- Perform ad hoc data analysis as per business needs to generate reports, visualizations, and presentations that support strategic decision making.
- Source information from multiple systems and build a robust data pipeline model; work on large, complex data sets to produce useful insights.
- Perform audit checks ensuring integrity and accuracy across all spectrums before implementing findings.
- Ensure timely refreshes so dashboards and reports show the most up-to-date information.
- Identify opportunities for process improvements and optimization based on data insights.
- Communicate project status updates and recommendations.

Your skills and experience:
- Bachelor's degree in computer science, IT, business administration, or a related field.
- Minimum of 5 years of experience in visual reporting development, including hands-on development of analytics dashboards and work with complex data sets.
- Excellent Microsoft Office skills, including advanced Excel.
- Comprehensive understanding of data visualization best practices.
- Experience with data analysis, modeling, and ETL processes is advantageous.
- Excellent knowledge of database concepts and extensive hands-on experience with SQL.
- Strong analytical, quantitative, problem-solving, and organizational skills.
- Attention to detail and the ability to coordinate multiple tasks, set priorities, and meet deadlines.
- Excellent communication and writing skills.
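Much of the dashboard work described above reduces to aggregating business data with SQL before it ever reaches Excel. A minimal, hedged sketch of a typical management-pack query (the table, desk names, and figures are invented, and SQLite stands in for whatever database the team actually uses):

```python
import sqlite3

# Invented revenue data for two desks over two months.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE revenue (desk TEXT, month TEXT, amount REAL)")
conn.executemany("INSERT INTO revenue VALUES (?, ?, ?)", [
    ("FX", "2024-01", 100.0), ("FX", "2024-02", 150.0),
    ("Rates", "2024-01", 80.0), ("Rates", "2024-02", 70.0),
])

# A dashboard-style aggregation: total revenue per desk, largest first.
summary = conn.execute("""
    SELECT desk, SUM(amount) AS total
    FROM revenue
    GROUP BY desk
    ORDER BY total DESC
""").fetchall()
print(summary)  # [('FX', 250.0), ('Rates', 150.0)]
```

The audit-check responsibility above would then be a second query comparing these totals against the source system before the dashboard refresh is published.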

Posted 3 weeks ago

Apply

5.0 - 9.0 years

7 - 11 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

- 5-10 years of experience in database development or a related field.
- Proven experience with database design, development, and management.
- Experience working with large-scale databases and complex data environments.
- Experience with data modelling and database design; knowledge of database performance tuning and optimization.
- Architect, develop, and maintain tables, views, procedures, functions, and packages in the database. (MUST HAVE)
- Write complex relational database queries using SQL (AWS RDS for PostgreSQL) and Oracle PL/SQL. (MUST HAVE)
- Familiarity with ETL processes and tools (AWS Batch, AWS Glue, etc.). (MUST HAVE)
- Familiarity with CI/CD pipelines, Jenkins deployment, and Git repositories. (MUST HAVE)
- Perform performance tuning; proactively monitor database systems to ensure secure service with minimum downtime, and improve database maintenance, including rollouts, patching, and upgrades.
- Experience with Aurora's scaling and replication capabilities. (MUST HAVE)
- Proficiency with AWS CloudWatch for monitoring database performance and setting up alerts; experience with performance tuning and optimization in AWS environments. (MUST HAVE)
- Experience using Confluence for documentation and collaboration; proficiency with SmartDraw for creating database diagrams, flowcharts, and other visual representations of data models and processes. (MUST HAVE)
- Proficiency with libraries such as Pandas and NumPy for data manipulation, analysis, and transformation; experience with libraries like SQLAlchemy and PyODBC for connecting to and interacting with various databases. (MUST HAVE)
- Python programming. (MUST HAVE)
- Agile/Scrum; communication (spoken English, clarity of thought).
- Big data, data mining, machine learning, and natural language processing.
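The performance-tuning requirements above centre on letting the query planner use an index instead of scanning the table. A small, hedged illustration of the idea (SQLite stands in for PostgreSQL/Aurora here, and the `events` table is invented; the same principle applies to `EXPLAIN` in PostgreSQL):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INTEGER, kind TEXT)")
# 1000 synthetic rows, 50 distinct user_ids => 20 rows per user.
conn.executemany("INSERT INTO events (user_id, kind) VALUES (?, ?)",
                 [(i % 50, "click") for i in range(1000)])

# Without this index the WHERE filter scans all 1000 rows;
# with it the planner can seek directly to the matching rows.
conn.execute("CREATE INDEX idx_events_user ON events (user_id)")

plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT COUNT(*) FROM events WHERE user_id = ?", (7,)
).fetchone()
count = conn.execute(
    "SELECT COUNT(*) FROM events WHERE user_id = ?", (7,)
).fetchone()[0]
print(count)  # 20
print(plan[3])  # plan text names idx_events_user, i.e. an index seek
```

Checking the plan before and after adding an index, as done here, is the same workflow a PostgreSQL developer would follow with `EXPLAIN (ANALYZE)` and CloudWatch metrics.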

Posted 3 weeks ago

Apply

8.0 - 12.0 years

30 - 35 Lacs

Chennai

Work from Office

- Minimum of 4+ years of hands-on experience in Ab Initio development.
- Develop and optimize ETL workflows and processes for data extraction, transformation, and loading.
- Proficiency in the different Ab Initio suite components.
- Strong understanding of ETL concepts, data warehousing principles, and relational databases.
- Experience designing and implementing ETL processes for large-scale data sets.
- Solid knowledge of SQL and scripting languages for data manipulation and analysis.
- Excellent problem-solving skills and the ability to work independently or as part of a team.
- Strong communication skills and the ability to interact effectively with stakeholders at various levels.
- Perform unit testing, debugging, and troubleshooting of Ab Initio graphs and applications to ensure data accuracy and integrity.

Posted 3 weeks ago

Apply

6.0 - 10.0 years

6 - 10 Lacs

Greater Noida

Work from Office

SQL Developer:
- Design and implement relational database structures optimized for performance and scalability.
- Develop and maintain complex SQL queries, stored procedures, triggers, and functions.
- Optimize database performance through indexing, query tuning, and regular maintenance.
- Ensure data integrity, consistency, and security across multiple environments.
- Collaborate with cross-functional teams to integrate SQL databases with applications and reporting tools.
- Develop and manage ETL (Extract, Transform, Load) processes for data ingestion and transformation.
- Monitor and troubleshoot database performance issues.
- Automate routine database tasks using scripts and tools.
- Document database architecture, processes, and procedures for future reference.
- Stay current with the latest SQL best practices and database technologies.

Core skills:
- Data retrieval: SQL Developers must be able to query large, complex databases to extract relevant data for analysis or reporting.
- Data transformation: they often clean, join, and reshape data using SQL to prepare it for downstream processes such as analytics or machine learning.
- Performance optimization: writing queries that run efficiently is key, especially when dealing with big data or real-time systems.
- Understanding of database schemas: knowing how tables relate and how to navigate normalized or denormalized structures is essential.

QE:
- Design, develop, and execute test plans and test cases for data pipelines, ETL processes, and data platforms.
- Validate data quality, integrity, and consistency across various data sources and destinations.
- Automate data validation and testing using tools such as PyTest, Great Expectations, or custom Python/SQL scripts.
- Collaborate with data engineers, analysts, and product managers to understand data requirements and ensure test coverage.
- Monitor data pipelines and proactively identify data quality issues or anomalies.
- Contribute to the development of data quality frameworks and best practices.
- Participate in code reviews and provide feedback on data quality and testability.
- Strong SQL skills and experience with large-scale data sets.
- Proficiency in Python or another scripting language for test automation.
- Experience with data testing tools.
- Familiarity with cloud platforms and data warehousing solutions.
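The QE duties above amount to asserting properties of a dataset. A framework-free, hedged sketch of the kinds of checks that tools like Great Expectations or PyTest suites automate (the rows and check names below are invented sample data, not any particular tool's API):

```python
# Lightweight data-quality checks in plain Python; a real pipeline would
# typically express these as PyTest cases or a Great Expectations suite.
rows = [
    {"order_id": 1, "email": "a@x.com", "amount": 10.0},
    {"order_id": 2, "email": "b@x.com", "amount": 25.5},
    {"order_id": 3, "email": "c@x.com", "amount": 5.0},
]

def check_not_null(rows, col):
    # Every row must have a non-empty value in this column.
    return all(r.get(col) not in (None, "") for r in rows)

def check_unique(rows, col):
    # No duplicate values allowed in this column.
    vals = [r[col] for r in rows]
    return len(vals) == len(set(vals))

def check_range(rows, col, lo, hi):
    # All values must fall inside [lo, hi].
    return all(lo <= r[col] <= hi for r in rows)

report = {
    "email_not_null": check_not_null(rows, "email"),
    "order_id_unique": check_unique(rows, "order_id"),
    "amount_in_range": check_range(rows, "amount", 0, 1000),
}
print(report)  # every check passes on this sample
```

Running such checks on every pipeline run, and alerting when any value flips to False, is the "proactively identify data quality issues" responsibility in practice.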

Posted 3 weeks ago

Apply

8.0 - 13.0 years

15 - 30 Lacs

Gurugram

Remote

Job Title: Architect - Power Apps, Power BI, API Integration, and ETL Pipeline Design
Experience: 8-10 years
Job Type: Full-time

Job Summary: We are seeking a highly skilled and experienced Architect to design, develop, and implement robust enterprise solutions using Power Apps, Power BI, API integration, and ETL pipeline architectures. The ideal candidate will play a pivotal role in leading solution design, technical architecture, and end-to-end integration strategies across platforms.

Key Responsibilities:
- Architect and design scalable low-code/no-code applications using Power Apps.
- Lead Power BI data modeling, dashboard/report creation, and DAX/Power Query development.
- Design and implement ETL pipelines for efficient data ingestion, transformation, and integration from heterogeneous sources.
- Develop and manage API integrations with internal and third-party systems using REST/SOAP services.
- Collaborate with business and technical teams to gather requirements and convert them into functional technical solutions.
- Ensure the performance, security, and compliance of all integrated systems and data flows.
- Provide technical leadership, mentorship, and code review for development teams.
- Create architecture diagrams, design documents, and best-practice guidelines.

Required Skills:
- Deep expertise in the Microsoft Power Platform: Power Apps (canvas and model-driven) and Power Automate.
- Strong knowledge of Power BI architecture, report development, and embedding/integration strategies.
- Proven experience with API development and integration (REST, JSON, OAuth, etc.).
- Expertise in ETL tool/pipeline design, using tools like Azure Data Factory, SSIS, or custom data-flow frameworks.
- Solid understanding of data modeling, SQL, and relational/dimensional databases.
- Experience with Azure services (Functions, Logic Apps, SQL, Data Lake) is a plus.
- Ability to lead architecture discussions and interact with senior stakeholders.

Preferred Qualifications:
- Microsoft certifications related to the Power Platform or Azure (e.g., PL-600, PL-400, DP-203).
- Experience with agile methodologies and DevOps for deployment and CI/CD.
- Excellent communication, leadership, and stakeholder-management skills.

Education: Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field.

Benefits:
- Competitive salary and performance-based incentives
- Learning & development opportunities
- Flexible work environment
- Health insurance and other standard benefits

Posted 4 weeks ago

Apply

5.0 - 7.0 years

7 - 9 Lacs

Bengaluru

Work from Office

Sr. Python Developer
Experience: 5+ years
Location: Bangalore/Hyderabad

Job Overview: We are seeking an experienced and highly skilled Senior Data Engineer to join our team. This role combines software development and data engineering expertise. The ideal candidate will have advanced knowledge of Python and SQL, a solid understanding of API creation (specifically REST APIs and FastAPI), and experience building reusable and configurable frameworks.

Key Responsibilities:
- Develop APIs & microservices: design, build, and maintain scalable, high-performance REST APIs using FastAPI and other frameworks.
- Data engineering: work on data pipelines, ETL processes, and data processing for robust data solutions.
- System architecture: collaborate on the design and implementation of configurable, reusable frameworks that streamline processes.
- Cross-functional collaboration: work closely with software engineers, data scientists, and DevOps teams to build end-to-end solutions that serve both application and data needs.
- Slack app development: design and implement Slack integrations and custom apps as required for team productivity and automation.
- Code quality: uphold high coding standards through rigorous testing, code reviews, and maintainable code.
- SQL expertise: write efficient, optimized SQL queries for data storage, retrieval, and analysis.
- Microservices architecture: build and manage microservices that are modular, scalable, and decoupled.

Required Skills & Experience:
- Programming languages: expert in Python, with solid experience building APIs and microservices.
- Web frameworks & APIs: strong hands-on experience designing RESTful APIs with FastAPI (Flask optional).
- Data engineering expertise: strong knowledge of SQL, relational databases, and ETL processes; experience with cloud-based data solutions is a plus.
- API & microservices architecture: proven ability to design, develop, and deploy APIs and microservices architectures.
- Slack app development: experience integrating Slack apps or creating custom Slack workflows.
- Reusable framework development: ability to design modular, configurable frameworks that can be reused across teams and systems.
- Problem-solving: ability to break down complex problems and deliver practical solutions.
- Software development: strong engineering fundamentals, including version control, debugging, and deployment best practices.

Why Join Us?
- Growth opportunities: you'll work with cutting-edge technologies and continuously improve your technical skills.
- Collaborative culture: a dynamic, inclusive team where your ideas and contributions are valued.
- Competitive compensation: a competitive salary, comprehensive benefits, and a flexible work environment.
- Innovative projects: be part of projects with real-world impact that help shape the future of data and software development.

If you're passionate about working across data and software engineering and enjoy building scalable, efficient systems, apply today and help us innovate!

Posted 4 weeks ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Bengaluru

Work from Office

Lead Python Developer
Experience: 7+ years
Location: Bangalore/Hyderabad

Job Overview: We are seeking an experienced and highly skilled Senior Data Engineer to join our team. This role combines software development and data engineering expertise. The ideal candidate will have advanced knowledge of Python and SQL, a solid understanding of API creation (specifically REST APIs and FastAPI), and experience building reusable and configurable frameworks.

Key Responsibilities:
- Develop APIs & microservices: design, build, and maintain scalable, high-performance REST APIs using FastAPI and other frameworks.
- Data engineering: work on data pipelines, ETL processes, and data processing for robust data solutions.
- System architecture: collaborate on the design and implementation of configurable, reusable frameworks that streamline processes.
- Cross-functional collaboration: work closely with software engineers, data scientists, and DevOps teams to build end-to-end solutions that serve both application and data needs.
- Slack app development: design and implement Slack integrations and custom apps as required for team productivity and automation.
- Code quality: uphold high coding standards through rigorous testing, code reviews, and maintainable code.
- SQL expertise: write efficient, optimized SQL queries for data storage, retrieval, and analysis.
- Microservices architecture: build and manage microservices that are modular, scalable, and decoupled.

Required Skills & Experience:
- Programming languages: expert in Python, with solid experience building APIs and microservices.
- Web frameworks & APIs: strong hands-on experience designing RESTful APIs with FastAPI (Flask optional).
- Data engineering expertise: strong knowledge of SQL, relational databases, and ETL processes; experience with cloud-based data solutions is a plus.
- API & microservices architecture: proven ability to design, develop, and deploy APIs and microservices architectures.
- Slack app development: experience integrating Slack apps or creating custom Slack workflows.
- Reusable framework development: ability to design modular, configurable frameworks that can be reused across teams and systems.
- Problem-solving: ability to break down complex problems and deliver practical solutions.
- Software development: strong engineering fundamentals, including version control, debugging, and deployment best practices.

Why Join Us?
- Growth opportunities: you'll work with cutting-edge technologies and continuously improve your technical skills.
- Collaborative culture: a dynamic, inclusive team where your ideas and contributions are valued.
- Competitive compensation: a competitive salary, comprehensive benefits, and a flexible work environment.
- Innovative projects: be part of projects with real-world impact that help shape the future of data and software development.

If you're passionate about working across data and software engineering and enjoy building scalable, efficient systems, apply today and help us innovate!

Posted 4 weeks ago

Apply

5.0 - 7.0 years

7 - 9 Lacs

Bengaluru

Work from Office

Job Overview: We are seeking an experienced and highly skilled Senior Data Engineer to join our team. This role requires a combination of software development and data engineering expertise. The ideal candidate will have advanced knowledge of Python and SQL, a solid understanding of API creation (specifically REST APIs and FastAPI), and experience in building reusable and configurable frameworks. Key Responsibilities: Develop APIs & Microservices: Design, build, and maintain scalable, high-performance REST APIs using FastAPI and other frameworks. Data Engineering: Work on data pipelines, ETL processes, and data processing for robust data solutions. System Architecture: Collaborate on the design and implementation of configurable and reusable frameworks to streamline processes. Collaborate with Cross-Functional Teams: Work closely with software engineers, data scientists, and DevOps teams to build end-to-end solutions that cater to both application and data needs. Slack App Development: Design and implement Slack integrations and custom apps as required for team productivity and automation. Code Quality: Ensure high-quality coding standards through rigorous testing, code reviews, and writing maintainable code. SQL Expertise: Write efficient and optimized SQL queries for data storage, retrieval, and analysis. Microservices Architecture: Build and manage microservices that are modular, scalable, and decoupled. Required Skills & Experience: Programming Languages: Expert in Python, with solid experience building APIs and microservices. Web Frameworks & APIs: Strong hands-on experience with FastAPI and Flask (optional), designing RESTful APIs. Data Engineering Expertise: Strong knowledge of SQL, relational databases, and ETL processes. Experience with cloud-based data solutions is a plus. API & Microservices Architecture: Proven ability to design, develop, and deploy APIs and microservices architectures. 
Slack App Development: Experience with integrating Slack apps or creating custom Slack workflows. Reusable Framework Development: Ability to design modular and configurable frameworks that can be reused across various teams and systems. Excellent Problem-Solving Skills: Ability to break down complex problems and deliver practical solutions. Software Development Experience: Strong software engineering fundamentals, including version control, debugging, and deployment best practices. Why Join Us? Growth Opportunities: You'll work with cutting-edge technologies and continuously improve your technical skills. Collaborative Culture: A dynamic and inclusive team where your ideas and contributions are valued. Competitive Compensation: We offer a competitive salary, comprehensive benefits, and a flexible work environment. Innovative Projects: Be a part of projects that have a real-world impact and help shape the future of data and software development. If you're passionate about working on both data and software engineering, and enjoy building scalable and efficient systems, apply today and help us innovate!
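The "reusable and configurable frameworks" this role calls for can take many shapes; one common one is a pipeline object whose steps are supplied as configuration rather than hard-coded. A minimal Python sketch (the `Pipeline` class and its steps are illustrative, not taken from the posting):

```python
from dataclasses import dataclass, field
from typing import Any, Callable


@dataclass
class Pipeline:
    """A reusable pipeline: steps are plain callables, configured once and reused."""
    steps: list = field(default_factory=list)

    def add(self, step: Callable[[Any], Any]) -> "Pipeline":
        self.steps.append(step)
        return self  # return self to allow chained configuration

    def run(self, record: Any) -> Any:
        for step in self.steps:
            record = step(record)
        return record


# Configure once, then reuse for every record.
clean = Pipeline().add(str.strip).add(str.lower)
result = clean.run("  Hello ")
# result == "hello"
```

Because steps are plain callables, the same `Pipeline` can be reconfigured per team or per dataset without changing framework code, which is the essence of a configurable framework.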


Posted 4 weeks ago

Apply

15.0 - 20.0 years

20 - 30 Lacs

Noida, Gurugram

Hybrid

Design architectures using Microsoft SQL Server and MongoDB. Develop ETL pipelines and data lakes. Integrate reporting tools such as Power BI, Qlik, and Crystal Reports into the data strategy. Implement AWS cloud services (PaaS, SaaS, IaaS), SQL and NoSQL databases, and data integration.

Posted 4 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Hyderabad

Work from Office

We are looking for a skilled Senior Power BI Consultant to join our team at Apps Associates (I) Pvt. Ltd, with 4-8 years of experience in the IT Services & Consulting industry. Roles and Responsibilities: Design and develop interactive dashboards using Power BI to provide data-driven insights. Collaborate with stakeholders to understand business requirements and develop solutions. Develop and maintain complex reports, visualizations, and analytics models. Troubleshoot and resolve issues related to Power BI performance, data quality, and security. Work closely with cross-functional teams to integrate Power BI with other systems and tools. Stay up-to-date with the latest trends and technologies in Business Intelligence and Data Analytics. Job Requirements: Strong understanding of data modeling, ETL processes, and database concepts. Proficiency in developing complex queries and writing efficient SQL code. Experience working with large datasets and creating scalable data architectures. Excellent communication and interpersonal skills to work effectively with stakeholders. Ability to analyze complex business problems and develop innovative solutions. Strong problem-solving skills with attention to detail and ability to meet deadlines.
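The "efficient SQL" a BI dashboard sits on is typically aggregate queries over fact tables. As a rough illustration, here is a small grouped aggregate using SQLite as a stand-in for a production warehouse (the schema and data are invented):

```python
import sqlite3

# In-memory database standing in for a reporting source (schema is hypothetical).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    INSERT INTO sales VALUES ('North', 100), ('North', 150), ('South', 80);
""")

# An aggregate of the kind a Power BI visual would be backed by.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total FROM sales GROUP BY region ORDER BY region"
).fetchall()
# rows == [('North', 250.0), ('South', 80.0)]
```

Pushing the `GROUP BY` into the database rather than aggregating in the reporting layer is what keeps dashboards responsive over large datasets.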

Posted 4 weeks ago

Apply

4.0 - 6.0 years

1 - 2 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

Responsibilities: Design and implement scalable data pipelines to ingest, process, and analyze large volumes of structured and unstructured data from various sources. Develop and optimize data storage solutions, including data warehouses, data lakes, and NoSQL databases, to support efficient data retrieval and analysis. Implement data processing frameworks and tools such as Apache Hadoop, Spark, Kafka, and Flink to enable real-time and batch data processing. Collaborate with data scientists and analysts to understand data requirements and develop solutions that enable advanced analytics, machine learning, and reporting. Ensure data quality, integrity, and security by implementing best practices for data governance, metadata management, and data lineage. Monitor and troubleshoot data pipelines and infrastructure to ensure reliability, performance, and scalability. Develop and maintain ETL (Extract, Transform, Load) processes to integrate data from various sources and transform it into usable formats. Stay current with emerging technologies and trends in big data and cloud computing, and evaluate their applicability to enhance our data engineering capabilities. Document data architectures, pipelines, and processes to ensure clear communication and knowledge sharing across the team. Strong programming skills in Java, Python, or Scala. Strong understanding of data modeling, data warehousing, and ETL processes. Minimum 4 to maximum 6 years of relevant experience. Strong understanding of Big Data technologies and their architectures, including Hadoop, Spark, and NoSQL databases. Locations: Mumbai, Delhi NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
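Engines like Spark and Flink build real-time processing around windowed aggregations over event streams. The core idea can be sketched at toy scale in plain Python; the 5-second tumbling window and the event data below are invented for illustration:

```python
from collections import defaultdict


def tumbling_window_counts(events, window_secs):
    """Group (timestamp, key) events into fixed-size windows -- a toy version
    of the windowed aggregation streaming engines run at scale."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_secs) * window_secs  # window the event falls in
        counts[(window_start, key)] += 1
    return dict(counts)


events = [(0, "click"), (3, "click"), (7, "view"), (12, "click")]
result = tumbling_window_counts(events, 5)
# result == {(0, 'click'): 2, (5, 'view'): 1, (10, 'click'): 1}
```

A real streaming job adds the hard parts this sketch omits: out-of-order events, watermarks, and distributed state.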

Posted 4 weeks ago

Apply

5.0 - 10.0 years

16 - 27 Lacs

Pune

Work from Office

Experience in JSON and 5 years in PostgreSQL; ETL fundamentals; work with DBAs to monitor data architecture and data quality; integrate and maintain database code in queries, views, scripts, and processes.
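JSON work alongside PostgreSQL often means flattening nested documents into relational rows before loading. A small illustrative Python sketch (the order schema and field names are hypothetical):

```python
import json


def flatten_order(raw: str) -> list:
    """Turn a nested JSON order document into flat (order_id, sku, qty) rows,
    ready for a relational load -- the JSON-to-table step common in ETL work."""
    doc = json.loads(raw)
    return [(doc["order_id"], item["sku"], item["qty"]) for item in doc["items"]]


raw = '{"order_id": 7, "items": [{"sku": "A1", "qty": 2}, {"sku": "B2", "qty": 1}]}'
rows = flatten_order(raw)
# rows == [(7, 'A1', 2), (7, 'B2', 1)]
```

In PostgreSQL itself, the same flattening could instead be done in SQL with `jsonb` operators and `jsonb_array_elements`; which layer does it is a design choice.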

Posted 4 weeks ago

Apply

12.0 - 14.0 years

25 - 30 Lacs

Chennai

Work from Office

The Solution Architect Data Engineer will design, implement, and manage data solutions for the insurance business, leveraging expertise in Cognos, DB2, Azure Databricks, ETL processes, and SQL. The role involves working with cross-functional teams to design scalable data architectures and enable advanced analytics and reporting, supporting the company's finance, underwriting, claims, and customer service operations. Key Responsibilities: Data Architecture & Design: Design and implement robust, scalable data architectures and solutions in the insurance domain using Azure Databricks, DB2, and other data platforms. Data Integration & ETL Processes: Lead the development and optimization of ETL pipelines to extract, transform, and load data from multiple sources, ensuring data integrity and performance. Cognos Reporting: Oversee the design and maintenance of Cognos reporting systems, developing custom reports and dashboards to support business users in finance, claims, underwriting, and operations. Data Engineering: Design, build, and maintain data models, data pipelines, and databases to enable business intelligence and advanced analytics across the organization. Cloud Infrastructure: Develop and manage data solutions on Azure, including Databricks for data processing, ensuring seamless integration with existing systems (e.g., DB2, legacy platforms). SQL Development: Write and optimize complex SQL queries for data extraction, manipulation, and reporting purposes, with a focus on performance and scalability. Data Governance & Quality: Ensure data quality, consistency, and governance across all data solutions, implementing best practices and adhering to industry standards (e.g., GDPR, insurance regulations). Collaboration: Work closely with business stakeholders, data scientists, and analysts to understand business needs and translate them into technical solutions that drive actionable insights. 
Solution Architecture: Provide architectural leadership in designing data platforms, ensuring that solutions meet business requirements, are cost-effective, and can scale for future growth. Performance Optimization: Continuously monitor and tune the performance of databases, ETL processes, and reporting tools to meet service level agreements (SLAs). Documentation: Create and maintain comprehensive technical documentation including architecture diagrams, ETL process flows, and data dictionaries. Required Qualifications: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. Proven experience as a Solution Architect or Data Engineer in the insurance industry, with a strong focus on data solutions. Hands-on experience with Cognos (for reporting and dashboarding) and DB2 (for database management). Proficiency in Azure Databricks for data processing, machine learning, and real-time analytics. Extensive experience in ETL development, data integration, and data transformation processes. Strong knowledge of Python and SQL (advanced query writing, optimization, and troubleshooting). Experience with cloud platforms (Azure preferred) and hybrid data environments (on-premises and cloud). Familiarity with data governance and regulatory requirements in the insurance industry (e.g., Solvency II, IFRS 17). Strong problem-solving skills, with the ability to troubleshoot and resolve complex technical issues related to data architecture and performance. Excellent verbal and written communication skills, with the ability to work effectively with both technical and non-technical stakeholders. Preferred Qualifications: Experience with other cloud-based data platforms (e.g., Azure Data Lake, Azure Synapse, AWS Redshift). Knowledge of machine learning workflows, leveraging Databricks for model training and deployment. Familiarity with insurance-specific data models and their use in finance, claims, and underwriting operations.
Certifications in Azure Databricks, Microsoft Azure, DB2, or related technologies. Knowledge of additional reporting tools (e.g., Power BI, Tableau) is a plus. Key Competencies: Technical Leadership: Ability to guide and mentor development teams in implementing best practices for data architecture and engineering. Analytical Skills: Strong analytical and problem-solving skills, with a focus on optimizing data systems for performance and scalability. Collaborative Mindset: Ability to work effectively in a cross-functional team, communicating complex technical solutions in simple terms to business stakeholders. Attention to Detail: Meticulous attention to detail, ensuring high-quality data output and system performance.
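Data governance and quality requirements like those above usually reduce to executable rules run against each batch. A minimal sketch of such checks in plain Python (the rule set, `check_quality` helper, and sample rows are invented for illustration):

```python
def check_quality(rows, required, unique_key):
    """Return a list of (row_index, message) data-quality violations:
    missing required fields and duplicate keys -- the kind of rule a
    governance layer enforces before data reaches reporting."""
    errors, seen = [], set()
    for i, row in enumerate(rows):
        for col in required:
            if row.get(col) in (None, ""):
                errors.append((i, f"missing {col}"))
        key = row.get(unique_key)
        if key in seen:
            errors.append((i, f"duplicate {unique_key}={key}"))
        seen.add(key)
    return errors


rows = [{"id": 1, "name": "a"}, {"id": 1, "name": ""}]
issues = check_quality(rows, required=["name"], unique_key="id")
# issues == [(1, 'missing name'), (1, 'duplicate id=1')]
```

In a pipeline, a non-empty result would typically fail the load or route the offending rows to a quarantine table for review.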

Posted 4 weeks ago

Apply

4.0 - 7.0 years

15 - 17 Lacs

Hyderabad, Bengaluru

Work from Office

Design, develop, and implement data solutions using AWS Data Stack components such as Glue and Redshift. Write and optimize advanced SQL queries for data extraction, transformation, and analysis. Develop data processing workflows and ETL processes using Python and PySpark.
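ETL workflows of the Glue/PySpark kind follow an extract-transform-load structure. A stdlib-only Python sketch of that structure (the CSV-like input and field names are invented; a real Glue job would operate on PySpark DataFrames instead of generators):

```python
def extract(lines):
    # Extract: read raw records, dropping blanks.
    yield from (line.strip() for line in lines if line.strip())


def transform(records):
    # Transform: parse each record into a typed dict.
    for rec in records:
        name, amount = rec.split(",")
        yield {"name": name, "amount": float(amount)}


def load(records, sink):
    # Load: write transformed records to the target (a list here).
    for rec in records:
        sink.append(rec)
    return sink


sink = load(transform(extract(["alice,10", "", "bob,5"])), [])
# sink == [{'name': 'alice', 'amount': 10.0}, {'name': 'bob', 'amount': 5.0}]
```

Chaining generators keeps each stage streaming and independently testable, mirroring how distributed engines chain lazy transformations before an action materializes the result.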

Posted 4 weeks ago

Apply

2.0 - 4.0 years

3 - 4 Lacs

Chennai

Work from Office

Looking for a Database-Focused Full Stack Developer with expertise in Node.js (Express.js) and PostgreSQL (database design, optimization, and migrations), experience in asynchronous messaging systems like RabbitMQ, CI/CD pipelines, Git workflows, and DevOps collaboration. Required Candidate profile: Node.js, Express.js, PostgreSQL, ETL, RabbitMQ, queue management, event-driven architecture, CI/CD pipeline configuration and maintenance, Git version control, RESTful APIs, webhooks, third-party integrations.
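The asynchronous messaging pattern mentioned here (RabbitMQ-style queues dispatching typed events to handlers) can be sketched in a single process with Python's `queue` module; the event names and handlers are illustrative, and a real system would use a broker client across separate services:

```python
import queue


def publish(q, event):
    # Producer side: enqueue an event instead of calling the consumer directly.
    q.put(event)


def drain(q, handlers):
    """Dispatch queued events to handlers by event type -- a single-process
    sketch of the decoupling a message broker provides across services."""
    handled = []
    while not q.empty():
        event = q.get()
        handled.append(handlers[event["type"]](event))
    return handled


q = queue.Queue()
publish(q, {"type": "user.created", "id": 1})
publish(q, {"type": "user.deleted", "id": 2})
result = drain(q, {
    "user.created": lambda e: f"welcome {e['id']}",
    "user.deleted": lambda e: f"cleanup {e['id']}",
})
# result == ['welcome 1', 'cleanup 2']
```

The producer never knows which handler runs: that indirection is what lets services be deployed, scaled, and retried independently in an event-driven architecture.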

Posted 4 weeks ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Gurugram

Work from Office

Project Role : Software Development Engineer Project Role Description : Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work. Must have skills : SAP BusinessObjects Data Services Good to have skills : NA Educational Qualification : 15 years of full time education Summary : As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code for multiple clients. Your day will involve collaborating with team members to perform maintenance and enhancements, ensuring that the applications meet the evolving needs of users while adhering to best practices in software development. Roles & Responsibilities:- Expected to be an SME.- Collaborate and manage the team to perform.- Responsible for team decisions.- Engage with multiple teams and contribute to key decisions.- Provide solutions to problems for their immediate team and across multiple teams.- Mentor junior team members to enhance their skills and knowledge.- Continuously evaluate and improve development processes to increase efficiency. Professional & Technical Skills: - Must To Have Skills: Proficiency in SAP BusinessObjects Data Services.- Strong understanding of data integration and ETL processes.- Experience with data quality management and data profiling.- Familiarity with database technologies and SQL.- Ability to troubleshoot and resolve technical issues effectively. Additional Information:- The candidate should have minimum 5 years of experience in SAP BusinessObjects Data Services.- This position is based at our Gurugram office.- A 15 years of full time education is required. Qualification 15 years of full time education

Posted 4 weeks ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role : Software Development Engineer Project Role Description : Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work. Must have skills : SAP BusinessObjects Data Services Good to have skills : NA Educational Qualification : 15 years of full time education Summary : As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code for multiple clients. Your day will involve collaborating with team members to ensure the successful implementation of enhancements and maintenance tasks, while also focusing on the development of new features to meet client needs. You will be responsible for troubleshooting issues and providing solutions, ensuring that the applications function optimally and meet the required specifications. Your role will require a proactive approach to problem-solving and a commitment to delivering high-quality results in a timely manner. Roles & Responsibilities:- Expected to perform independently and become an SME.- Required active participation/contribution in team discussions.- Contribute to providing solutions to work-related problems.- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.- Conduct thorough testing and debugging of application components to ensure functionality and performance. Professional & Technical Skills: - Must To Have Skills: Proficiency in SAP BusinessObjects Data Services.- Strong understanding of data integration and ETL processes.- Experience with data modeling and database design.- Familiarity with SQL and database management systems.- Ability to troubleshoot and resolve technical issues efficiently. 
Additional Information:- The candidate should have minimum 3 years of experience in SAP BusinessObjects Data Services.- This position is based at our Bengaluru office.- A 15 years of full time education is required. Qualification 15 years of full time education

Posted 4 weeks ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Pune

Work from Office

Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Databricks Unified Data Analytics Platform Good to have skills : NA Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to deliver high-quality applications that meet user expectations and business goals. Roles & Responsibilities:- Expected to perform independently and become an SME.- Required active participation/contribution in team discussions.- Contribute to providing solutions to work-related problems.- Assist in the documentation of application specifications and user guides.- Engage in continuous learning to stay updated with the latest technologies and best practices. Professional & Technical Skills: - Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform.- Strong understanding of data integration and ETL processes.- Experience with cloud computing platforms and services.- Familiarity with programming languages such as Python or Scala.- Ability to work with data visualization tools to present insights effectively. Additional Information:- The candidate should have minimum 3 years of experience in Databricks Unified Data Analytics Platform.- This position is based at our Pune office.- A 15 years full time education is required. Qualification 15 years full time education

Posted 4 weeks ago

Apply