
32 ETL Scripts Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2.0 - 4.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Job Summary: We are seeking a skilled Associate Informatica Developer with 2-4 years of experience in designing, developing, and maintaining ETL processes using Informatica PowerCenter. The ideal candidate should have strong SQL knowledge, solid data warehousing concepts, and hands-on experience in data integration, transformation, and loading.

About The Role
In this role as Software Engineer, you will:
- Analyze business and functional requirements to design and implement scalable data integration solutions
- Understand and interpret High-Level Design (HLD) documents and convert them into detailed Low-Level Designs (LLD)
- Develop robust, reusable, and optimized Informatica mappings, sessions, and workflows
- Apply mapping optimization and performance tuning techniques to ensure efficient ETL processes
- Conduct peer code reviews and suggest improvements for reliability and performance
- Prepare and execute comprehensive unit test cases and support system/integration testing
- Maintain detailed technical documentation, including LLDs, data flow diagrams, and test cases
- Build data pipelines and transformation logic in Snowflake, ensuring performance and scalability (a hedged loading sketch follows this posting)
- Develop and manage Unix shell scripts for automation, scheduling, and monitoring of ETL jobs
- Collaborate with cross-functional teams to support UAT, deployments, and production issues

About You
You are a fit for this position if your background includes:
- 2-4 years of strong hands-on experience with Informatica PowerCenter
- Proficiency in developing and optimizing ETL mappings, workflows, and sessions
- Solid experience with performance tuning techniques and best practices in ETL processes
- Hands-on experience with Snowflake for data loading, SQL transformations, and optimization
- Strong skills in Unix/Linux scripting for job automation
- Experience in converting HLDs into LLDs and defining unit test cases
- Knowledge of data warehousing concepts, data modelling, and data quality frameworks

Good to Have
- Knowledge of the Salesforce data model and integration (via Informatica or API-based solutions)
- Exposure to AWS cloud services such as S3, Glue, Redshift, and Lambda
- Familiarity with relational databases such as SQL Server and PostgreSQL
- Experience with job schedulers like Control-M, ESP, or equivalent
- Agile methodology experience and tools such as JIRA, Confluence, and Git
- Knowledge of DBT (Data Build Tool) for data transformation and orchestration
- Experience with Python scripting for data manipulation, automation, or integration tasks

What's in it For You
- Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office, depending on the role) for our office-based roles, while delivering a seamless experience that is digitally and physically connected.
- Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
- Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
- Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
- Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
- Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
- Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward.

As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
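Since this role pairs Informatica with Snowflake data loading, here is a minimal, hedged Python sketch of a bulk load using the snowflake-connector-python package. The account, credentials, file path, and table name are hypothetical placeholders, not details from the posting.

```python
# Minimal sketch: stage and bulk-load a CSV into a Snowflake table.
# Assumes snowflake-connector-python is installed; all identifiers
# below (account, ETL_WH, MY_DB, ORDERS, file path) are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # hypothetical account identifier
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="MY_DB",
    schema="RAW",
)
try:
    cur = conn.cursor()
    # PUT uploads the local file into the table's internal stage (@%TABLE).
    cur.execute("PUT file:///data/orders.csv @%ORDERS AUTO_COMPRESS=TRUE")
    # COPY INTO performs the actual bulk load from the stage.
    cur.execute(
        "COPY INTO ORDERS FROM @%ORDERS "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1) "
        "ON_ERROR = 'ABORT_STATEMENT'"
    )
    print(cur.fetchall())  # load summary: rows parsed/loaded per file
finally:
    conn.close()
```

In practice the same PUT/COPY pair is what a Unix wrapper script would invoke on a schedule, which is where the posting's shell-scripting requirement fits in.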

Posted 4 days ago

Apply

2.0 - 4.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must have skills: Data Engineering
Good to have skills: Oracle Procedural Language Extensions to SQL (PL/SQL), Google BigQuery, Google Cloud Platform Architecture
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. A typical day involves creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and optimize data workflows, ensuring that the data infrastructure supports the organization's analytical needs effectively.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Mentor junior team members to enhance their skills and knowledge in data engineering
- Continuously evaluate and improve data processes to enhance efficiency and effectiveness

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Data Engineering
- Good To Have Skills: Experience with Oracle Procedural Language Extensions to SQL (PL/SQL), Google BigQuery, and Google Cloud Platform Architecture (see the BigQuery sketch after this posting)
- Strong understanding of data modeling and database design principles
- Experience with data warehousing solutions and data lake architectures
- Familiarity with data integration tools and ETL frameworks

Additional Information:
- The candidate should have a minimum of 5 years of experience in Data Engineering
- This position is based at our Bengaluru office
- A 15 years full time education is required

Qualification: 15 years full time education
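As flagged above, here is a minimal, hedged sketch of running a transformation query with the google-cloud-bigquery client. The project, dataset, and table names are invented for illustration; they are not from the posting.

```python
# Minimal sketch: run an aggregation in BigQuery and write the result
# to a destination table. Assumes google-cloud-bigquery is installed
# and application-default credentials are configured; all names are
# placeholders.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.QueryJobConfig(
    destination="my_project.analytics.daily_order_totals",  # hypothetical
    write_disposition="WRITE_TRUNCATE",
)
sql = """
    SELECT order_date, SUM(amount) AS total_amount
    FROM `my_project.raw.orders`
    GROUP BY order_date
"""
query_job = client.query(sql, job_config=job_config)
query_job.result()  # block until the job finishes
print(f"Wrote results to {job_config.destination}")
```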

Posted 5 days ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Nagar, Chennai

Work from Office

What you will do: As a Data Engineer at ACV Auctions, you HAVE FUN!! You will design, develop, write, and modify code. You will be responsible for the development of ETLs, application architecture, and optimizing databases and SQL queries. You will work alongside other data engineers and data scientists in the design and development of solutions to ACV's most complex software problems. It is expected that you will be able to operate in a high-performing team, that you can balance high-quality delivery with customer focus, and that you will have a record of delivering and guiding team members in a fast-paced environment.

- Design, develop, and maintain scalable ETL pipelines using Python and SQL to ingest, process, and transform data from diverse sources.
- Write clean, efficient, and well-documented code in Python and SQL.
- Utilize Git for version control and collaborate effectively with other engineers.
- Implement and manage data orchestration workflows using industry-standard orchestration tools (e.g., Apache Airflow, Prefect); a hedged DAG sketch follows this posting.
- Apply a strong understanding of major data structures (arrays, dictionaries, strings, trees, nodes, graphs, linked lists) to optimize data processing and storage.
- Support multi-cloud application development.
- Contribute, influence, and set standards for all technical aspects of a product or service, including but not limited to testing, debugging, performance, and languages.
- Support development stages for application development and data science teams, with an emphasis on MySQL and Postgres database development.
- Influence company-wide engineering standards for tooling, languages, and build systems.
- Leverage monitoring tools to ensure high performance and availability; work with operations and engineering to improve as required.
- Ensure that data development meets company standards for readability, reliability, and performance.
- Collaborate with internal teams on transactional and analytical schema design.
- Conduct code reviews, develop high-quality documentation, and build robust test suites.
- Respond to and troubleshoot highly complex problems quickly, efficiently, and effectively.
- Participate in engineering innovations, including discovery of new technologies, implementation strategies, and architectural improvements.
- Participate in on-call rotation.

What you will need:
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience)
- Ability to read, write, speak, and understand English
- 3+ years of experience programming in Python
- 3+ years of experience with ETL workflow implementation (Airflow, Python)
- 3+ years of work with continuous integration and build tools
- 2+ years of experience with cloud platforms, preferably AWS or GCP
- Knowledge of database architecture, infrastructure, performance tuning, and optimization techniques
- Knowledge of day-to-day tools and how they work, including deployments, k8s, monitoring systems, and testing tools
- Proficiency with version control systems, including trunk-based development, multiple release planning, cherry-picking, and rebasing
- Proficiency with relational databases (RDB) and SQL, and the ability to contribute to table definitions
- Self-sufficient debugger who can identify and solve complex problems in code
- Deep understanding of major data structures (arrays, dictionaries, strings)
- Experience with Domain-Driven Design
- Experience with containers and Kubernetes
- Experience with database monitoring and diagnostic tools, preferably Datadog
- Hands-on skills and the ability to drill deep into complex system design and implementation
- Proficiency in SQL query writing and optimization
- Familiarity with database security principles and best practices
- Familiarity with in-memory data processing
- Knowledge of data warehousing concepts and technologies, including dimensional modeling and ETL frameworks
- Strong communication and collaboration skills, with the ability to work effectively in a fast-paced global team environment
- Experience working with: SQL data-layer development and OLTP schema design; using and integrating with cloud services, specifically AWS RDS, Aurora, S3, and GCP; GitHub, Jenkins, Python

Nice to Have Qualifications:
- Experience with Airflow, Docker, Visual Studio, PyCharm, Redis, Kubernetes, Fivetran, Spark, Dataflow, Dataproc, EMR
- Hands-on experience with Kafka or other event streaming technologies
- Hands-on experience with micro-service architecture

ACV Auctions in Chennai, India is looking for talented individuals to join our team. As we expand our platform, we're offering a wide range of exciting opportunities across various roles.
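As flagged in the responsibilities above, a minimal Airflow DAG sketch of the extract-transform-load orchestration this role describes. The DAG id, task names, and function bodies are hypothetical placeholders, not ACV code.

```python
# Minimal Airflow 2.x DAG sketch: daily extract -> transform -> load.
# Uses the `schedule` parameter (Airflow 2.4+); bodies are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Stand-in for pulling rows from a source database or API.
    return [{"id": 1, "amount": 42.0}]


def transform(**context):
    rows = context["ti"].xcom_pull(task_ids="extract")
    return [r for r in rows if r["amount"] > 0]  # simple filter step


def load(**context):
    rows = context["ti"].xcom_pull(task_ids="transform")
    print(f"Loading {len(rows)} rows")  # stand-in for a warehouse write


with DAG(
    dag_id="orders_etl",          # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3  # dependency chain: extract before transform before load
```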

Posted 1 week ago

Apply

8.0 - 13.0 years

7 - 12 Lacs

Hyderabad

Work from Office

Roles and Responsibilities
- Develop, update, and maintain new and existing applications, ensuring they meet specified requirements, scale efficiently, and maintain high performance.
- Analyze and interpret project requirements to independently design effective solutions while keeping the broader product architecture in mind.
- Design, develop, and deploy APIs and web services with a focus on reusable, testable, and efficient code (a hedged FastAPI sketch follows this posting).
- Implement low-latency, scalable applications with optimized performance.
- Create Dockerfiles for containerization and deploy applications within a Kubernetes environment.
- Adapt quickly to a dynamic, start-up style environment, demonstrating strong problem-solving skills and a resourceful approach to driving results.

Required Skills and Experience
- Proficiency in Python: 8+ years of hands-on experience with Python, particularly with FastAPI / Flask. Familiarity with other web frameworks like Django and web2py is beneficial.
- Web Development and API Design: Deep understanding of RESTful API design, as well as a working knowledge of HTTP, JSON, and other web protocols.
- Database Expertise: Experience with RDBMS databases (e.g., PostgreSQL, MySQL) and document-based databases (e.g., MongoDB). Skilled in database design, indexing, and optimizing queries.
- Design Patterns and Best Practices: Knowledge of fundamental design principles, including object-oriented programming (OOP) and design patterns, especially as they apply to Python.
- Containerization and Orchestration: Strong experience with Docker for containerization, and Kubernetes for deploying and managing containerized applications.
- Scalable Architecture Knowledge: Understanding of multi-process architecture, the threading limitations of Python, and core principles behind building scalable and maintainable applications.
- Unit Testing and Quality Assurance: Familiarity with testing frameworks such as pytest or unittest for building unit tests and ensuring code quality, as well as a TDD (Test-Driven Development) approach.
- Version Control: Proficiency with Git for source code management and collaborative development.

Preferred Skills:
- ETL Processes and Data Pipelines: Hands-on experience in building data pipelines and workflows, using tools such as Apache Airflow or other ETL frameworks.
- Cloud Services: Experience working with cloud environments, especially AWS, including knowledge of services like S3, EC2, and Lambda.
- Microservices Architecture: Familiarity with microservices design patterns and best practices, as well as deployment in containerized environments.
- Continuous Integration/Continuous Deployment (CI/CD): Knowledge of CI/CD tools such as Jenkins, GitLab CI, or GitHub Actions.

Why you'll love working with us:
- Opportunity to work on technical challenges with global impact.
- Vast opportunities for self-development, including online university access and sponsored certifications.
- Sponsored Tech Talks & Hackathons to foster innovation and learning.
- Generous benefits package including health insurance, retirement benefits, flexible work hours, and more.
- Supportive work environment with forums to explore passions beyond work.

This role presents an exciting opportunity for a motivated individual to contribute to the development of cutting-edge solutions while advancing their career in a dynamic and collaborative environment.
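As flagged above, a minimal FastAPI sketch of the kind of typed REST endpoint this role builds. The route and model names are hypothetical placeholders.

```python
# Minimal FastAPI sketch: a health endpoint and a typed POST route.
# Run with: uvicorn main:app --reload   (assumes fastapi + uvicorn installed)
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class Item(BaseModel):
    # Pydantic validates the request body against these typed fields.
    name: str
    price: float


@app.get("/health")
def health() -> dict:
    return {"status": "ok"}


@app.post("/items")
def create_item(item: Item) -> dict:
    # In a real service this would persist to PostgreSQL/MongoDB.
    return {"created": item.name, "price": item.price}
```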

Posted 1 week ago

Apply

4.0 - 6.0 years

7 - 12 Lacs

Hyderabad

Work from Office

Role Description: As a Senior Software Engineer - ETL - Python at Incedo, you will be responsible for designing and developing ETL workflows to extract, transform, and load data from various sources to target systems. You will work with data analysts and architects to understand business requirements and translate them into technical solutions. You will be skilled in ETL tools such as Informatica or Talend and have experience in programming languages such as SQL or Python. You will be responsible for writing efficient and reliable code that is easy to maintain and troubleshoot.

Roles & Responsibilities:
- Develop, maintain, and enhance software applications for Extract, Transform, and Load (ETL) processes
- Design and implement ETL solutions that are scalable, reliable, and maintainable
- Develop and maintain ETL code, scripts, and jobs, ensuring they are efficient, accurate, and meet business requirements
- Troubleshoot and debug ETL code, identifying and resolving issues in a timely manner
- Collaborate with cross-functional teams, including data analysts, business analysts, and project managers, to understand requirements and deliver solutions that meet business needs
- Design and implement data integration processes between various systems and data sources
- Optimize ETL processes to improve performance, scalability, and reliability
- Create and maintain technical documentation, including design documents, coding standards, and best practices

Technical Skills:
- Proficiency in programming languages such as Python for writing ETL scripts
- Knowledge of data transformation techniques such as filtering, aggregation, and joining (illustrated in the sketch after this posting)
- Familiarity with ETL frameworks such as Apache NiFi, Talend, or Informatica
- Understanding of data profiling, data quality, and data validation techniques
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders in a clear and concise manner
- Alignment with the company's long-term vision
- Ability to provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team

Qualifications
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
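To illustrate the filtering, aggregation, and joining techniques named above, a minimal, hedged pandas sketch. The file and column names are hypothetical placeholders.

```python
# Minimal pandas sketch of three core ETL transformations:
# filter, join, and aggregate. File/column names are placeholders.
import pandas as pd

orders = pd.read_csv("orders.csv")        # e.g., order_id, customer_id, status, amount
customers = pd.read_csv("customers.csv")  # e.g., customer_id, region

# Filter: keep only completed orders.
completed = orders[orders["status"] == "completed"]

# Join: enrich orders with the customer's region.
enriched = completed.merge(customers, on="customer_id", how="left")

# Aggregate: total amount per region.
totals = enriched.groupby("region", as_index=False)["amount"].sum()

totals.to_csv("region_totals.csv", index=False)  # load step
```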

Posted 1 week ago

Apply

4.0 - 6.0 years

6 - 10 Lacs

Chennai

Work from Office

As a Senior Software Engineer - ETL - Python at Incedo, you will be responsible for designing and developing ETL workflows to extract, transform, and load data from various sources to target systems. You will work with data analysts and architects to understand business requirements and translate them into technical solutions. You will be skilled in ETL tools such as Informatica or Talend and have experience in programming languages such as SQL or Python. You will be responsible for writing efficient and reliable code that is easy to maintain and troubleshoot.

Roles & Responsibilities:
- Develop, maintain, and enhance software applications for Extract, Transform, and Load (ETL) processes
- Design and implement ETL solutions that are scalable, reliable, and maintainable
- Develop and maintain ETL code, scripts, and jobs, ensuring they are efficient, accurate, and meet business requirements
- Troubleshoot and debug ETL code, identifying and resolving issues in a timely manner
- Collaborate with cross-functional teams, including data analysts, business analysts, and project managers, to understand requirements and deliver solutions that meet business needs
- Design and implement data integration processes between various systems and data sources
- Optimize ETL processes to improve performance, scalability, and reliability
- Create and maintain technical documentation, including design documents, coding standards, and best practices

Skills Requirements:
- Proficiency in programming languages such as Python for writing ETL scripts
- Knowledge of data transformation techniques such as filtering, aggregation, and joining
- Familiarity with ETL frameworks such as Apache NiFi, Talend, or Informatica
- Understanding of data profiling, data quality, and data validation techniques
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders in a clear and concise manner
- Alignment with the company's long-term vision
- Ability to provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team

Qualifications
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred

Posted 1 week ago

Apply

4.0 - 6.0 years

6 - 10 Lacs

Gurugram

Work from Office

Role Description: As a Senior Software Engineer - ETL - Python at Incedo, you will be responsible for designing and developing ETL workflows to extract, transform, and load data from various sources to target systems. You will work with data analysts and architects to understand business requirements and translate them into technical solutions. You will be skilled in ETL tools such as Informatica or Talend and have experience in programming languages such as SQL or Python. You will be responsible for writing efficient and reliable code that is easy to maintain and troubleshoot.

Roles & Responsibilities:
- Develop, maintain, and enhance software applications for Extract, Transform, and Load (ETL) processes
- Design and implement ETL solutions that are scalable, reliable, and maintainable
- Develop and maintain ETL code, scripts, and jobs, ensuring they are efficient, accurate, and meet business requirements
- Troubleshoot and debug ETL code, identifying and resolving issues in a timely manner
- Collaborate with cross-functional teams, including data analysts, business analysts, and project managers, to understand requirements and deliver solutions that meet business needs
- Design and implement data integration processes between various systems and data sources
- Optimize ETL processes to improve performance, scalability, and reliability
- Create and maintain technical documentation, including design documents, coding standards, and best practices

Skills Requirements:
- Proficiency in programming languages such as Python for writing ETL scripts
- Knowledge of data transformation techniques such as filtering, aggregation, and joining
- Familiarity with ETL frameworks such as Apache NiFi, Talend, or Informatica
- Understanding of data profiling, data quality, and data validation techniques
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders in a clear and concise manner
- Alignment with the company's long-term vision
- Ability to provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team

Qualifications
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred

Posted 1 week ago

Apply

4.0 - 6.0 years

7 - 12 Lacs

Hyderabad

Work from Office

As a Senior Software Engineer - ETL - Python at Incedo, you will be responsible for designing and developing ETL workflows to extract, transform, and load data from various sources to target systems. You will work with data analysts and architects to understand business requirements and translate them into technical solutions. You will be skilled in ETL tools such as Informatica or Talend and have experience in programming languages such as SQL or Python. You will be responsible for writing efficient and reliable code that is easy to maintain and troubleshoot.

Roles & Responsibilities:
- Develop, maintain, and enhance software applications for Extract, Transform, and Load (ETL) processes
- Design and implement ETL solutions that are scalable, reliable, and maintainable
- Develop and maintain ETL code, scripts, and jobs, ensuring they are efficient, accurate, and meet business requirements
- Troubleshoot and debug ETL code, identifying and resolving issues in a timely manner
- Collaborate with cross-functional teams, including data analysts, business analysts, and project managers, to understand requirements and deliver solutions that meet business needs
- Design and implement data integration processes between various systems and data sources
- Optimize ETL processes to improve performance, scalability, and reliability
- Create and maintain technical documentation, including design documents, coding standards, and best practices

Skills Requirements:
- Proficiency in programming languages such as Python for writing ETL scripts
- Knowledge of data transformation techniques such as filtering, aggregation, and joining
- Familiarity with ETL frameworks such as Apache NiFi, Talend, or Informatica
- Understanding of data profiling, data quality, and data validation techniques
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders in a clear and concise manner
- Alignment with the company's long-term vision
- Ability to provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team

Qualifications
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred

Posted 2 weeks ago

Apply

4.0 - 6.0 years

6 - 10 Lacs

Chennai

Work from Office

As a Senior Software Engineer - ETL - Python at Incedo, you will be responsible for designing and developing ETL workflows to extract, transform, and load data from various sources to target systems. You will work with data analysts and architects to understand business requirements and translate them into technical solutions. You will be skilled in ETL tools such as Informatica or Talend and have experience in programming languages such as SQL or Python. You will be responsible for writing efficient and reliable code that is easy to maintain and troubleshoot.

Roles & Responsibilities:
- Develop, maintain, and enhance software applications for Extract, Transform, and Load (ETL) processes
- Design and implement ETL solutions that are scalable, reliable, and maintainable
- Develop and maintain ETL code, scripts, and jobs, ensuring they are efficient, accurate, and meet business requirements
- Troubleshoot and debug ETL code, identifying and resolving issues in a timely manner
- Collaborate with cross-functional teams, including data analysts, business analysts, and project managers, to understand requirements and deliver solutions that meet business needs
- Design and implement data integration processes between various systems and data sources
- Optimize ETL processes to improve performance, scalability, and reliability
- Create and maintain technical documentation, including design documents, coding standards, and best practices

Skills Requirements:
- Proficiency in programming languages such as Python for writing ETL scripts
- Knowledge of data transformation techniques such as filtering, aggregation, and joining
- Familiarity with ETL frameworks such as Apache NiFi, Talend, or Informatica
- Understanding of data profiling, data quality, and data validation techniques
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders in a clear and concise manner
- Alignment with the company's long-term vision
- Ability to provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team

Qualifications
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred

Posted 2 weeks ago

Apply

3.0 - 7.0 years

18 - 22 Lacs

Bengaluru

Work from Office

Job Title: Industry & Function AI Data Engineer + S&C GN
Management Level: 09 - Consultant
Location: Primary - Bengaluru, Secondary - Gurugram
Must-Have Skills: Data engineering expertise; cloud platforms (AWS, Azure, GCP); proficiency in Python, SQL, PySpark, and ETL frameworks
Good-to-Have Skills: LLM architecture; containerization tools (Docker, Kubernetes); real-time data processing tools (Kafka, Flink); certifications such as AWS Certified Data Analytics - Specialty, Google Professional Data Engineer, Snowflake, DBT, etc.

Job Summary: As a Data Engineer, you will play a critical role in designing, implementing, and optimizing data infrastructure to power analytics, machine learning, and enterprise decision-making. Your work will ensure high-quality, reliable data is accessible for actionable insights. This involves leveraging technical expertise, collaborating with stakeholders, and staying updated with the latest tools and technologies to deliver scalable and efficient data solutions.

Roles & Responsibilities:
- Build and Maintain Data Infrastructure: Design, implement, and optimize scalable data pipelines and systems for seamless ingestion, transformation, and storage of data (see the PySpark sketch after this posting)
- Collaborate with Stakeholders: Work closely with business teams, data analysts, and data scientists to understand data requirements and deliver actionable solutions
- Leverage Tools and Technologies: Utilize Python, SQL, PySpark, and ETL frameworks to manage large datasets efficiently
- Cloud Integration: Develop secure, scalable, and cost-efficient solutions using cloud platforms such as Azure, AWS, and GCP
- Ensure Data Quality: Focus on data reliability, consistency, and quality using automation and monitoring techniques
- Document and Share Best Practices: Create detailed documentation, share best practices, and mentor team members to promote a strong data culture
- Continuous Learning: Stay updated with the latest tools and technologies in data engineering through professional development opportunities

Professional & Technical Skills:
- Strong proficiency in programming languages such as Python, SQL, and PySpark
- Experience with cloud platforms (AWS, Azure, GCP) and their data services
- Familiarity with ETL frameworks and data pipeline design
- Strong knowledge of traditional statistical methods and basic machine learning techniques
- Knowledge of containerization tools (Docker, Kubernetes)
- Knowledge of LLM, RAG, and agentic AI architectures
- Certification in data science or related fields (e.g., AWS Certified Data Analytics - Specialty, Google Professional Data Engineer)

Additional Information: The ideal candidate has a robust educational background in data engineering or a related field and a proven track record of building scalable, high-quality data solutions in the Consumer Goods sector. This position offers opportunities to design and implement cutting-edge data systems that drive business transformation, collaborate with global teams to solve complex data challenges and deliver measurable business outcomes, and enhance your expertise by working on innovative projects utilizing the latest technologies in cloud, data engineering, and AI.

About Our Company | Accenture

Qualification
Experience: Minimum 3-7 years in data engineering or related fields, with a focus on the Consumer Goods industry
Educational Qualification: Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field
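As flagged above, a minimal, hedged PySpark sketch of an ingest-transform-store pipeline of the kind this role builds. Paths and column names are hypothetical, and the sketch assumes the order_date column parses as a timestamp.

```python
# Minimal PySpark sketch of an ingest -> transform -> store pipeline.
# Paths and column names are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl_sketch").getOrCreate()

# Ingest: read raw CSV data (assumes order_date infers as a timestamp).
raw = spark.read.csv("/data/raw/sales.csv", header=True, inferSchema=True)

# Transform: filter bad rows, derive a month column, aggregate.
clean = (
    raw.filter(F.col("amount") > 0)
       .withColumn("order_month", F.date_trunc("month", F.col("order_date")))
)
monthly = clean.groupBy("order_month").agg(F.sum("amount").alias("revenue"))

# Store: write Parquet for downstream analytics.
monthly.write.mode("overwrite").parquet("/data/curated/monthly_revenue")

spark.stop()
```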

Posted 2 weeks ago

Apply

7.0 - 12.0 years

15 - 30 Lacs

Gurugram

Hybrid

Must-Have Skills:
- Good communication skills: ability to interact with customers independently, with good articulation and confidence
- Digital marketing experience, with Martech platform implementations (Adobe/SFDC) covering solution, design, and implementation, specifically with AEP and CJO
- Database skills: good expertise in at least one of SQL / Oracle / Teradata, etc.; data model design skills for marketing databases; capability to query, work on aggregates, use filter conditions, create joins, define keys, set indexes, etc.
- Logical programming knowledge in JS / Python / PHP, etc.; should be comfortable with HTML/CSS
- Ability to understand complex data models (preferably XDM) and the objectives of marketing campaigns
- Familiarity with establishing rules and conditional logic to customize journeys based on customer attributes and behaviours
- Advocacy for improvements; strong ability to partner and work effectively across the organization with line-of-business and technology colleagues
- Strong implementation coordination and communication skills

Good-to-Have Skills:
- Preparing reports; pulling data from different platforms like AEP into AJO
- Owning project deliverables end to end, working independently
- Prior experience with CDP / Adobe Campaign
- Data engineering skills; experience in debugging and optimizing SQLs and ETL scripts

Responsibilities:
- Provide advisory, consulting, and technical implementation services to customers on Adobe Experience Platform and Adobe Journey Optimizer
- Assess customer requirements, pain points, and goals, and make recommendations on solution architecture, design, and roadmap
- Configure Adobe Experience Platform services such as Identity Service, Data Storage Layer, Profile dataset, and Real-time Customer Profile
- Implement data ingestion from various sources like CRM, MPNs, website data, and mobile apps using APIs, Experience Data Model (XDM) schemas, and mappers
- Create customer, visitor, and product audiences using segments and AI/ML-powered segments
- Configure destinations to activate audiences in channels like email, Journey Optimizer, Ad Cloud, and CRM systems
- Implement Journey Optimizer features such as journey orchestration, triggers, automations, actions, messages, and offers
- Develop custom applications, workflows, and APIs to extend the platform per customer needs
- Troubleshoot technical issues, debug, and optimize performance of customer implementations
- Provide post-go-live support, enhancements, maintenance, and upgrades for customer solutions
- Conduct product training and knowledge transfer, and share best practices
- Continuously track product updates and improve solution recommendations for evolving customer needs

Posted 2 weeks ago

Apply

2.0 years

11 - 17 Lacs

Pune

Work from Office

The Role
A Technical Data Analyst is responsible for performing data migration, data conversion, and data validation projects for Addepar clients using existing tools and established processes. The ideal candidate will have a good understanding of financial portfolio data, a foundational level of Python programming skill, exceptional communication skills, and the ability to deliver results in alignment with project deadlines while meeting high quality standards.

What You'll Do
- Convert, migrate, and validate data from external or internal sources using existing tooling with defined processes and workflows
- Complete data projects on time, meeting project deadlines while adhering to high quality standards
- Coordinate across project teams, communicating regular status updates for assigned data projects while effectively setting expectations
- Run Python ETL scripts and, at times, modify, fix, or debug them as needed (a hedged example follows this posting)
- Raise key issues to project team members and senior leadership as necessary
- Prioritize and context-switch effectively to complete simultaneous projects, seeing each through to the finish line
- Adhere to project management standard processes
- Identify and drive opportunities to improve current processes, workflows, and tools to increase efficiency and automation

Who You Are
- Minimum 2+ years of experience working in technology and finance
- Experience working with colleagues spread across multiple global locations
- Must have domain experience in wealth/portfolio/investment management
- Proficient in the Python programming language and well versed in ETL concepts
- Understands financial markets and has experience with financial products and portfolio data
- Excellent written and oral communication skills, with the ability to convey complex information in an understandable manner
- Solution-oriented, with a passion for problem solving
- Highly organized, with close attention to detail and a drive to make processes more efficient
- Positive attitude, good work ethic, proactive, and a highly contributing teammate
- Independent, adaptable, and able to work with minimal supervision
- Proven ability to manage expectations and provide regular updates to the project team

P.S. This hybrid role requires working from the Pune office 3 days a week in a UK shift, i.e., 2:30 PM to 11:30 PM IST.
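As flagged above, a minimal, hedged sketch of the run-and-validate style of Python ETL script the role describes. The file layout, field names, and reconciliation rule are invented for illustration, not Addepar's actual tooling.

```python
# Minimal sketch: read a client CSV of positions and validate totals.
# File layout and the validation rule are illustrative assumptions.
import csv


def load_positions(path: str) -> list[dict]:
    with open(path, newline="") as f:
        return list(csv.DictReader(f))


def validate(rows: list[dict]) -> list[str]:
    errors = []
    for i, row in enumerate(rows, start=1):
        try:
            qty = float(row["quantity"])
            price = float(row["price"])
            value = float(row["market_value"])
        except (KeyError, ValueError) as exc:
            errors.append(f"row {i}: unparseable field ({exc})")
            continue
        # Reconciliation rule: market value should equal qty * price.
        if abs(qty * price - value) > 0.01:
            errors.append(f"row {i}: market_value mismatch")
    return errors


if __name__ == "__main__":
    rows = load_positions("positions.csv")
    for problem in validate(rows):
        print(problem)
```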

Posted 3 weeks ago

Apply

8.0 - 13.0 years

5 - 10 Lacs

Mumbai

Work from Office

To help the project by owning the critical data transformation and integration processes, enabling faster and simpler development and maintenance. The profile is someone with excellent Ab Initio experience who can quickly adapt and deliver.

The Developer will have the following responsibilities:
- Analyse and estimate the requirements
- Write detailed technical analysis with impacts (technical/functional)
- Design and develop high quality code
- Unit test and provide support during implementation
- Bug fixing and performance optimization

Contributing Responsibilities
- Support the Service Delivery team on production issues

Technical & Behavioral Competencies
- Solid experience as an RDBMS developer, with stored procedures, query performance tuning, and ETL
- Design an ETL framework for audit and data reconciliation to manage batch and real-time interfaces (a hedged sketch follows this posting)
- Develop ETL jobs for automation and monitoring, and take responsibility for job performance optimization through ETL development tools or custom-developed procedures
- Set up best practices in the App Dev team, and share best practices among teams and across the broader MBIM team

Skills Referential
Behavioural Skills:
- Ability to collaborate / teamwork
- Client focused
- Critical thinking
- Decision making

Transversal Skills:
- Analytical ability
- Ability to develop and adapt a process
- Ability to develop and leverage networks
- Ability to develop others and improve their skills

Education Level: Master's degree or equivalent

Other/Specific Qualifications
- Knowledge of Git
- Knowledge of DevOps (CI/CD)
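The audit and reconciliation framework mentioned above can be illustrated with a hedged Python sketch that compares row counts and amount totals between a staging table and a warehouse table. It uses sqlite3 so the sketch is self-contained; a real framework would point at the production RDBMS, and the table names here are hypothetical.

```python
# Minimal reconciliation sketch: compare row counts and amount totals
# between a source and a target table after a batch load.
import sqlite3


def recon(conn: sqlite3.Connection, source: str, target: str) -> dict:
    cur = conn.cursor()
    checks = {}
    for metric, sql in [
        ("row_count", "SELECT COUNT(*) FROM {t}"),
        ("amount_sum", "SELECT COALESCE(SUM(amount), 0) FROM {t}"),
    ]:
        src = cur.execute(sql.format(t=source)).fetchone()[0]
        tgt = cur.execute(sql.format(t=target)).fetchone()[0]
        checks[metric] = {"source": src, "target": tgt, "match": src == tgt}
    return checks


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript(
        """
        CREATE TABLE staging_orders (id INTEGER, amount REAL);
        CREATE TABLE dw_orders (id INTEGER, amount REAL);
        INSERT INTO staging_orders VALUES (1, 10.0), (2, 20.0);
        INSERT INTO dw_orders VALUES (1, 10.0), (2, 20.0);
        """
    )
    print(recon(conn, "staging_orders", "dw_orders"))
```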

Posted 3 weeks ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Bengaluru

Remote

Skillset: PostgreSQL, Amazon Redshift, MongoDB, Apache Cassandra, AWS, ETL, Shell Scripting, Automation, Microsoft Azure

We are looking for futuristic, motivated go-getters with the following skills for an exciting role.

Job Description:
- Monitor and maintain the performance, reliability, and availability of multiple database systems.
- Optimize complex SQL queries, stored procedures, and ETL scripts for better performance and scalability.
- Troubleshoot and resolve issues related to database performance, integrity, backups, and replication.
- Design, implement, and manage scalable data pipelines across structured and unstructured sources.
- Develop automation scripts for routine maintenance tasks using Python, Bash, or similar tools (a hedged sketch follows this posting).
- Perform regular database health checks, set up alerting mechanisms, and respond to incidents proactively.
- Analyze performance bottlenecks and resolve slow-query issues and deadlocks.
- Work in DevOps/Agile environments, integrating with CI/CD pipelines for database operations.
- Collaborate with engineering, analytics, and infrastructure teams to integrate database solutions with applications and BI tools.
- Research and implement emerging technologies and best practices in database administration.
- Participate in capacity planning, security audits, and software upgrades for data infrastructure.
- Maintain comprehensive documentation related to database schemas, metadata, standards, and procedures.
- Ensure compliance with data privacy regulations and implement robust disaster recovery and backup strategies.

Desired skills:
- Database Systems: Hands-on experience with SQL-based databases (PostgreSQL, MySQL), Amazon Redshift, MongoDB, and Apache Cassandra.
- Scripting & Automation: Proficiency in scripting using Python, Shell, or similar to automate database operations.
- Cloud Platforms: Working knowledge of AWS (RDS, Redshift, EC2, S3, IAM, Lambda) and Azure SQL / Azure Cosmos DB.
- Big Data & Distributed Systems: Familiarity with Apache Spark for distributed data processing.
- Performance Tuning: Deep experience in performance analysis, indexing strategies, and query optimization.
- Security & Compliance: Experience with database encryption, auditing, access control, and GDPR/PII policies.
- Familiarity with Linux and Windows server administration is a plus.

Education & Experience: BE, B.Tech, MCA, or M.Tech from Tier 2/3 colleges, and science graduates; 5-8 years of work experience.
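As flagged above, a minimal, hedged sketch of the health-check automation this role describes, flagging long-running PostgreSQL queries via the pg_stat_activity view. The connection settings and threshold are placeholders.

```python
# Minimal PostgreSQL health-check sketch: flag long-running queries.
# Assumes psycopg2 is installed; DSN values are placeholders.
import psycopg2

LONG_RUNNING_SECONDS = 300  # illustrative alerting threshold

conn = psycopg2.connect(
    host="localhost", dbname="appdb", user="monitor", password="***"
)
try:
    with conn.cursor() as cur:
        cur.execute(
            """
            SELECT pid, state, now() - query_start AS runtime, query
            FROM pg_stat_activity
            WHERE state <> 'idle'
              AND now() - query_start > make_interval(secs => %s)
            ORDER BY runtime DESC
            """,
            (LONG_RUNNING_SECONDS,),
        )
        for pid, state, runtime, query in cur.fetchall():
            # In production this would feed an alerting mechanism.
            print(f"pid={pid} state={state} runtime={runtime} :: {query[:80]}")
finally:
    conn.close()
```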

Posted 3 weeks ago

Apply

4.0 - 6.0 years

16 - 19 Lacs

Pune

Work from Office

Immediate/Early Joiners | Pune On-site | Hiring Data Engineer (4+ yrs) with strong SQL & mandatory Shell scripting, ETL pipeline design, large-scale data processing, data accuracy, and integration skills.

Posted 4 weeks ago

Apply

5.0 - 9.0 years

5 - 9 Lacs

Mumbai

Work from Office

Capgemini Invent
Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science and creative design, to help CxOs envision and build what's next for their businesses.

Your Role
- Use design thinking and a consultative approach to conceive cutting-edge technology solutions for business problems, mining core insights as a service model
- Contribute to a global business and technology consulting team responsible for developing and delivering innovative offerings and solutions
- Contribute in the capacity of a business analyst/data business analyst to data projects
- Participate in business requirements / functional specification definition, scope management, data analysis and design, in collaboration with both business stakeholders and IT teams
- Document detailed business requirements, and develop solution design and specifications
- Support and coordinate system implementations through the project lifecycle (initiation, planning, analysis/design, development, implementation, testing, rollout, and hand-over), working with other teams on a local and global basis
- Elicit requirements using interviews, requirements workshops, surveys, business process analysis, use cases, scenarios, and workflow analysis
- Work closely with the solutions architecture team to define the target detailed solution to deliver the business requirements
- Engage with project activities across the information lifecycle, often related to paradigms like building and managing business data lakes and ingesting data streams to prepare data, developing machine learning and predictive models to analyse data, visualizing data, and empowering information consumers with agile data models that enable self-service BI

Your Profile
- Proven working experience as a Data Business Analyst or Business Analyst, with overall experience of a minimum of 5 to 9+ years; a business analyst with a data background is preferred
- Preferably, domain knowledge in CPRD / FS / MALS / Utilities / TMT
- Strong knowledge of database languages such as SQL and ETL frameworks; spreadsheet tools such as Microsoft Excel; and data visualization software such as Power BI, Tableau, Qlik
- Independently able to work with the Product Management team and prepare functional analysis and user stories
- Experience in technical writing to create Business Requirement Documents (BRD), Functional Specification Documents (FSD), non-functional requirement documents, user manuals, and use-case specifications
- Comprehensive, solid experience of SCRUM as well as SDLC methodologies

What you will love about working here
We recognize the significance of flexible work arrangements to provide support. Be it remote work or flexible work hours, you will get an environment to maintain a healthy work-life balance. At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as Generative AI.

About Capgemini
Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong heritage of over 55 years, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.

Posted 1 month ago

Apply

5.0 - 9.0 years

7 - 11 Lacs

Gurugram

Work from Office

Capgemini Invent
Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science and creative design, to help CxOs envision and build what's next for their businesses.

Your Role
- Use design thinking and a consultative approach to conceive cutting-edge technology solutions for business problems, mining core insights as a service model
- Engage with project activities across the information lifecycle, often related to paradigms like building and managing business data lakes and ingesting data streams to prepare data, developing machine learning and predictive models to analyse data, visualizing data, empowering information consumers with agile data models that enable self-service BI, and specializing in business models and architectures across various industry verticals
- Participate in business requirements / functional specification definition, scope management, data analysis and design, in collaboration with both business stakeholders and IT teams; document detailed business requirements, and develop solution design and specifications
- Support and coordinate system implementations through the project lifecycle, working with other teams on a local and global basis
- Work closely with the solutions architecture team to define the target detailed solution to deliver the business requirements

Your Profile
- Proven working experience as a Data Business Analyst or Business Analyst, with overall experience of a minimum of 5 to 9+ years; a business analyst with a data background is preferred
- Preferably, domain knowledge in CPRD / FS / MALS / Utilities / TMT
- Strong knowledge of database languages such as SQL and ETL frameworks; spreadsheet tools such as Microsoft Excel; and data visualization software such as Power BI, Tableau, Qlik
- Independently able to work with the Product Management team and prepare functional analysis and user stories
- Experience in technical writing to create BRDs, FSDs, non-functional requirement documents, user manuals, and use-case specifications
- Comprehensive, solid experience of SCRUM as well as SDLC methodologies
- Experience with JIRA/Confluence and strong knowledge of the Scrum framework, as well as other Agile frameworks such as Kanban, Crystal, XP, etc.
- Strong stakeholder management skills
- A flair for storytelling, with the ability to present interesting insights from the data
- Good exposure to database management systems; hands-on SQL and good knowledge of NoSQL-based databases; working knowledge of R/Python is good to have

What you will love about working here
We recognize the significance of flexible work arrangements to provide support. Be it remote work or flexible work hours, you will get an environment to maintain a healthy work-life balance. At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as Generative AI.

Posted 1 month ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Gurugram

Work from Office

About the Role: Grade Level (for internal use): 09 S&P Global Mobility The Role: ETL Developer The Team The ETL team forms an integral part of Global Data Operations (GDO) and caters to the North America & EMEA automotive business line. Core responsibilities include translating business requirements into technical design and ETL jobs along with unit testing, integration testing, regression testing, deployments & production operations. The team has an energetic and dynamic group of individuals, always looking to work through a challenge. Ownership, raising the bar and innovation is what the team runs on! The Impact The ETL team, being part of GDO, caters to the automotive business line and helps stakeholders with an optimum solution for their data needs. The role requires close coordination with global teams such as other development teams, research analysts, quality assurance analysts, architects etc. The role is vital for the automotive business as it involves providing highly efficient data solutions with high accuracy to various stakeholders. The role forms a bridge between the business and technical stakeholders. Whats in it for you Constant learning, working in a dynamic and challenging environment! Total Rewards. Monetary, beneficial, and developmental rewards! Work Life Balance. You can't do a good job if your job is all you do! Diversity & Inclusion. HeForShe! Internal Mobility. Grow with us! Responsibilities Using prior experience with file loading, cleansing and standardization, should be able to translate business requirements into ETL design and efficient ETL solutions using Informatica Powercenter (mandatory) and Talend Enterprise (preferred). Knowledge of tibco would be a preferred skill as well. Understand relational database technologies and data warehousing concepts and processes. Using prior experiences with High Volume data processing, be able to deal with complex technical issues Works closely with all levels of management and employees across the Automotive business line. Participates as part of cross-functional teams responsible for investigating issues, proposing solutions and implementing corrective actions. Good communication skills required for interface with various stakeholder groups; detail oriented with analytical skills What Were Looking For The ETL development team within the Mobility domain is looking for a Software Engineer to work on design, development & operations efforts in the ETL (Informatica) domain. Primary Skills and qualifications required: Experience with Informatica and/or Talend ETL tools Bachelors degree in Computer Science, with at least 3+ years of development and maintenance of ETL systems on Informatica PowerCenter and 1+ year of SQL experience. 3+ years of Informatica Design and Architecture experience and 1+ years of Optimization and Performance tuning of ETL code on Informatica 1+ years of python development experience and SQL, XML experience Working knowledge or greater of Cloud Based Technologies, Development, Operations a plus. About S&P Global Mobility At S&P Global Mobility, we provide invaluable insights derived from unmatched automotive data, enabling our customers to anticipate change and make decisions with conviction. Our expertise helps them to optimize their businesses, reach the right consumers, and shape the future of mobility. We open the door to automotive innovation, revealing the buying patterns of today and helping customers plan for the emerging technologies of tomorrow. 
For more information, visit www.spglobal.com/mobility . Whats In It For You Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technologythe right combination can unlock possibility and change the world.Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you cantake care of business. We care about our people. Thats why we provide everything youand your careerneed to thrive at S&P Global. Health & WellnessHealth care coverage designed for the mind and body. Continuous LearningAccess a wealth of resources to grow your career and learn valuable new skills. Invest in Your FutureSecure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly PerksIts not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the BasicsFrom retail discounts to referral incentive awardssmall perks can make a big difference. For more information on benefits by country visithttps://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected andengaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Recruitment Fraud Alert If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, pre-employment training or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here. ----------------------------------------------------------- Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. 
If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

Posted 1 month ago

Apply

8.0 - 13.0 years

5 - 10 Lacs

Hyderabad

Work from Office

Senior Developer with special emphasis on, and 8 to 10 years of experience in, PySpark and Python, along with ETL tools (Talend, Ab Initio, Informatica, or similar). Should also have good exposure to ETL tools in order to understand existing flows, rewrite them in Python and PySpark, and execute the test plans. 8-10 years of experience designing and developing PySpark applications and ETL jobs using ETL tools. 5+ years of sound PySpark knowledge for implementing ETL logic. Strong understanding of frontend technologies such as HTML, CSS, React, and JavaScript. Proficiency in data modeling and design, including PL/SQL development. Creating test plans to understand current ETL flows and rewriting them in PySpark. Providing ongoing support and maintenance for ETL applications, including troubleshooting and resolving issues. Expertise in practices like Agile, peer reviews, and continuous integration.
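A minimal PySpark sketch of the kind of rewrite this role describes: a legacy filter-and-aggregate ETL flow re-expressed as a DataFrame job. The paths, column names, and app name are hypothetical, and real flows would be considerably larger:

```python
# Minimal sketch: a legacy filter-and-aggregate ETL flow re-expressed in
# PySpark. Input path, columns, and output path are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("legacy_etl_rewrite").getOrCreate()

orders = spark.read.option("header", True).csv("/data/in/orders.csv")

daily_totals = (
    orders
    .filter(F.col("status") == "COMPLETE")            # cleanse: drop open orders
    .withColumn("amount", F.col("amount").cast("double"))
    .groupBy("order_date")                            # transform: daily rollup
    .agg(F.sum("amount").alias("total_amount"))
)

daily_totals.write.mode("overwrite").parquet("/data/out/daily_totals")
```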

Posted 1 month ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Hyderabad

Work from Office

Senior Developer with special emphasis on, and 6 to 8 years of experience in, PySpark, Python, and SQL, along with ETL tools (Talend, Ab Initio, Informatica, or similar). Should also have good exposure to ETL tools in order to understand existing flows, rewrite them in Python and PySpark, and execute the test plans. 3+ years of sound PySpark knowledge for implementing ETL logic. Proficiency in data modeling and design, including PL/SQL development. Creating test plans to understand current ETL flows and rewriting them in PySpark. Providing ongoing support and maintenance for ETL applications, including troubleshooting and resolving issues. Expertise in practices like Agile, peer reviews, and continuous integration.
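The test plans mentioned above typically reconcile the legacy ETL output against its PySpark rewrite. A hedged sketch of one such check, with hypothetical output paths:

```python
# Sketch of a reconciliation check between a legacy ETL output and its
# PySpark rewrite: compare row counts and flag rows present on one side
# only. Both paths are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("etl_reconciliation").getOrCreate()

legacy = spark.read.parquet("/data/out/legacy/daily_totals")
rewrite = spark.read.parquet("/data/out/pyspark/daily_totals")

assert legacy.count() == rewrite.count(), "row counts diverge"

# exceptAll keeps duplicates, so it also catches cardinality drift.
only_in_legacy = legacy.exceptAll(rewrite)
only_in_rewrite = rewrite.exceptAll(legacy)
assert only_in_legacy.count() == 0 and only_in_rewrite.count() == 0, \
    "data mismatch between legacy and PySpark outputs"
```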

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 17 Lacs

Chennai, Bengaluru

Work from Office

5+ years of experience as an ETL Developer, with hands-on expertise in Oracle Data Integrator (ODI). Proficiency in Oracle Database and MySQL, with strong skills in SQL and PL/SQL. Experience in data integration, transformation, and loading from heterogeneous data sources.
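ODI mappings are developed in ODI Studio, but the incremental-load MERGE at the heart of such loads can be sketched in Python with the python-oracledb driver. The DSN, credentials, and table names below are hypothetical placeholders:

```python
# Sketch of the incremental-load MERGE typical of ODI mappings, issued
# here via the python-oracledb driver. DSN, credentials, and table names
# are hypothetical.
import oracledb

MERGE_SQL = """
MERGE INTO dw.customer_dim tgt
USING stg.customer_src src
   ON (tgt.customer_id = src.customer_id)
 WHEN MATCHED THEN UPDATE SET tgt.email = src.email, tgt.city = src.city
 WHEN NOT MATCHED THEN INSERT (customer_id, email, city)
      VALUES (src.customer_id, src.email, src.city)
"""

with oracledb.connect(user="etl_user", password="***",
                      dsn="dbhost/ORCLPDB1") as conn:
    with conn.cursor() as cur:
        cur.execute(MERGE_SQL)   # upsert staged rows into the dimension
    conn.commit()
```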

Posted 1 month ago

Apply

5.0 - 9.0 years

13 - 17 Lacs

Pune

Work from Office

Diacto is looking for a highly capable Data Architect with 5 to 9 years of experience to lead cloud data platform initiatives with a primary focus on Snowflake and Azure Data Hub. This individual will play a key role in defining the data architecture strategy, implementing robust data pipelines, and enabling enterprise-grade analytics solutions. This is an on-site role based in our Baner, Pune office.

Qualifications:
- B.E./B.Tech in Computer Science, IT, or a related discipline
- MCS/MCA or equivalent preferred

Key Responsibilities:
- Design and implement enterprise-level data architecture with a strong focus on Snowflake and Azure Data Hub
- Define standards and best practices for data ingestion, transformation, and storage
- Collaborate with cross-functional teams to develop scalable, secure, and high-performance data pipelines (a sketch of a typical Snowflake ingestion step follows this listing)
- Lead Snowflake environment setup, configuration, performance tuning, and optimization
- Integrate Azure Data Services with Snowflake to support diverse business use cases
- Implement governance, metadata management, and security policies
- Mentor junior developers and data engineers on cloud data technologies and best practices

Experience and Skills Required:
- 5-9 years of overall experience in data architecture or data engineering roles
- Strong hands-on expertise in Snowflake, including design, development, and performance tuning
- Solid experience with Azure Data Hub and Azure Data Services (Data Lake, Synapse, etc.)
- Understanding of cloud data integration techniques and ELT/ETL frameworks
- Familiarity with data orchestration tools such as DBT, Airflow, or Azure Data Factory
- Proven ability to handle structured, semi-structured, and unstructured data
- Strong analytical, problem-solving, and communication skills

Nice to Have:
- Certifications in Snowflake and/or Microsoft Azure
- Experience with CI/CD tools like GitHub for code versioning and deployment
- Familiarity with real-time or near-real-time data ingestion

Why Join Diacto Technologies?
- Work with a cutting-edge tech stack and cloud-native architectures
- Be part of a data-driven culture with opportunities for continuous learning
- Collaborate with industry experts and build transformative data solutions
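A hedged sketch of the Snowflake side of such a pipeline, loading staged files with the snowflake-connector-python package. The account, warehouse, stage, and table names are hypothetical placeholders, not Diacto's environment:

```python
# Sketch of a Snowflake ingestion step: load staged files into a target
# table via COPY INTO. Account, warehouse, stage, and table names are
# hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="myorg-myaccount",
    user="ETL_SVC",
    password="***",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)
try:
    cur = conn.cursor()
    # COPY INTO pulls every new file from the external stage; Snowflake
    # tracks load history, so reruns skip already-loaded files.
    cur.execute("""
        COPY INTO RAW.EVENTS
        FROM @AZURE_EVENTS_STAGE
        FILE_FORMAT = (TYPE = 'PARQUET')
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """)
finally:
    conn.close()
```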

Posted 1 month ago

Apply

3.0 - 7.0 years

17 - 20 Lacs

Bengaluru

Work from Office

Job Title: Industry & Function AI Data Engineer + S&C GN
Management Level: 09 - Consultant
Location: Primary - Bengaluru; Secondary - Gurugram
Must-Have Skills: Data engineering expertise; cloud platforms (AWS, Azure, GCP); proficiency in Python, SQL, PySpark, and ETL frameworks
Good-to-Have Skills: LLM architecture; containerization tools (Docker, Kubernetes); real-time data processing tools (Kafka, Flink); certifications such as AWS Certified Data Analytics - Specialty, Google Professional Data Engineer, Snowflake, DBT, etc.

Job Summary: As a Data Engineer, you will play a critical role in designing, implementing, and optimizing data infrastructure to power analytics, machine learning, and enterprise decision-making. Your work will ensure high-quality, reliable data is accessible for actionable insights. This involves leveraging technical expertise, collaborating with stakeholders, and staying updated with the latest tools and technologies to deliver scalable and efficient data solutions.

Roles & Responsibilities:
- Build and Maintain Data Infrastructure: Design, implement, and optimize scalable data pipelines and systems for seamless ingestion, transformation, and storage of data
- Collaborate with Stakeholders: Work closely with business teams, data analysts, and data scientists to understand data requirements and deliver actionable solutions
- Leverage Tools and Technologies: Utilize Python, SQL, PySpark, and ETL frameworks to manage large datasets efficiently
- Cloud Integration: Develop secure, scalable, and cost-efficient solutions using cloud platforms such as Azure, AWS, and GCP
- Ensure Data Quality: Focus on data reliability, consistency, and quality using automation and monitoring techniques (a sketch of one such automated check follows this listing)
- Document and Share Best Practices: Create detailed documentation, share best practices, and mentor team members to promote a strong data culture
- Continuous Learning: Stay updated with the latest tools and technologies in data engineering through professional development opportunities

Professional & Technical Skills:
- Strong proficiency in programming languages such as Python, SQL, and PySpark
- Experience with cloud platforms (AWS, Azure, GCP) and their data services
- Familiarity with ETL frameworks and data pipeline design
- Strong knowledge of traditional statistical methods and basic machine learning techniques
- Knowledge of containerization tools (Docker, Kubernetes)
- Knowledge of LLM, RAG, and agentic AI architectures
- Certification in data science or related fields (e.g., AWS Certified Data Analytics - Specialty, Google Professional Data Engineer)

Additional Information: The ideal candidate has a robust educational background in data engineering or a related field and a proven track record of building scalable, high-quality data solutions in the Consumer Goods sector. This position offers opportunities to design and implement cutting-edge data systems that drive business transformation, to collaborate with global teams to solve complex data challenges and deliver measurable business outcomes, and to enhance your expertise by working on innovative projects utilizing the latest technologies in cloud, data engineering, and AI.

About Our Company | Accenture

Qualification:
Experience: Minimum 3-7 years in data engineering or related fields, with a focus on the Consumer Goods industry
Educational Qualification: Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field
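A minimal sketch of the automated data-quality gate this role describes, in PySpark. The lake path and column names are hypothetical:

```python
# Sketch of an automated data-quality gate: null-rate and duplicate checks
# before a dataset is published. Path and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq_gate").getOrCreate()
df = spark.read.parquet("/lake/curated/transactions")

total = df.count()
null_ids = df.filter(F.col("transaction_id").isNull()).count()
dupes = total - df.dropDuplicates(["transaction_id"]).count()

# Fail the pipeline rather than publish bad data downstream.
assert null_ids == 0, f"{null_ids} rows missing transaction_id"
assert dupes == 0, f"{dupes} duplicate transaction_ids"
print(f"DQ gate passed: {total} rows validated")
```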

Posted 1 month ago

Apply

3.0 - 6.0 years

6 - 10 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

About KPI Partners: KPI Partners is a leading provider of data analytics solutions, dedicated to helping organizations transform data into actionable insights. Our innovative approach combines advanced technology with expert consulting, allowing businesses to leverage their data for improved performance and decision-making.

Job Description: We are seeking a skilled and motivated Data Engineer with experience in Databricks to join our dynamic team. The ideal candidate will be responsible for designing, building, and maintaining scalable data pipelines and data processing solutions that support our analytics initiatives. You will collaborate closely with data scientists, analysts, and other engineers to ensure the consistent flow of high-quality data across our platforms.

Key Skills: Python, PySpark, Databricks, ETL, Cloud (AWS, Azure, or GCP)

Key Responsibilities:
- Develop, construct, test, and maintain data architectures (e.g., large-scale data processing systems) in Databricks (a sketch of a typical pipeline step follows this listing)
- Design and implement ETL (Extract, Transform, Load) processes to move and transform data from various sources to target systems
- Collaborate with data scientists and analysts to understand data requirements and design appropriate data models and structures
- Optimize data storage and retrieval for performance and efficiency
- Monitor and troubleshoot data pipelines to ensure reliability and performance
- Engage in data quality assessments, validation, and troubleshooting of data issues
- Stay current with emerging technologies and best practices in data engineering and analytics

Qualifications:
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field
- Proven experience as a Data Engineer or in a similar role, with hands-on experience in Databricks
- Strong proficiency in SQL and programming languages such as Python or Scala
- Experience with cloud platforms (AWS, Azure, or GCP) and related technologies
- Familiarity with data warehousing concepts and data modeling techniques
- Knowledge of data integration tools and ETL frameworks
- Strong analytical and problem-solving skills
- Excellent communication and teamwork abilities

Why Join KPI Partners?
- Be part of a forward-thinking team that values innovation and collaboration
- Opportunity to work on exciting projects across diverse industries
- Continuous learning and professional development opportunities
- Competitive salary and benefits package
- Flexible work environment with hybrid work options

If you are passionate about data engineering and excited about using Databricks to drive impactful insights, we would love to hear from you! KPI Partners is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.
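A hedged sketch of a typical Databricks pipeline step of the kind this listing describes: raw JSON in, a light transformation, and a Delta table out. The mount paths, columns, and table name are hypothetical:

```python
# Sketch of a Databricks-style pipeline step: read raw JSON, transform,
# and write a Delta table. Paths and columns are hypothetical; on
# Databricks the SparkSession is already provided as `spark`.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze_to_silver").getOrCreate()

raw = spark.read.json("/mnt/raw/clicks/")

silver = (
    raw
    .withColumn("event_ts", F.to_timestamp("event_time"))  # parse timestamps
    .filter(F.col("user_id").isNotNull())                  # drop unattributable rows
)

# Delta format gives ACID writes and time travel on Databricks.
silver.write.format("delta").mode("append").saveAsTable("silver.clicks")
```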

Posted 1 month ago

Apply

5.0 - 9.0 years

4 - 8 Lacs

Pune

Work from Office

Capgemini Invent: Capgemini Invent is the digital innovation, consulting, and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science, and creative design to help CxOs envision and build what's next for their businesses.

Your Role:
- Use design thinking and a consultative approach to conceive cutting-edge technology solutions for business problems, mining core insights as a service model
- Engage with project activities across the information lifecycle, often related to paradigms like building and managing business data lakes and ingesting data streams to prepare data, developing machine learning and predictive models to analyse data, visualizing data, empowering information consumers with agile data models that enable self-service BI, and specializing in business models and architectures across various industry verticals
- Participate in business requirements / functional specification definition, scope management, and data analysis and design, in collaboration with both business stakeholders and IT teams; document detailed business requirements and develop solution designs and specifications
- Support and coordinate system implementations through the project lifecycle, working with other teams on a local and global basis
- Work closely with the solutions architecture team to define the target detailed solution that delivers the business requirements

Your Profile:
- Proven working experience as a Data Business Analyst or Business Analyst, with a minimum of 5 to 9+ years of overall experience; a business analyst with a data background is preferred
- Preferably, domain knowledge in CPRD / FS / MALS / Utilities / TMT
- Strong knowledge of database languages such as SQL and ETL frameworks; spreadsheet tools such as Microsoft Excel; and data visualization software such as Power BI, Tableau, or Qlik
- Able to work independently with the Product Management team and prepare functional analyses and user stories
- Technical writing skills to create BRDs, FSDs, non-functional requirement documents, user manuals, and use-case specifications
- Comprehensive, solid experience of SCRUM as well as SDLC methodologies
- Experience with JIRA/Confluence and strong knowledge of the Scrum framework as well as other Agile frameworks such as Kanban, Crystal, XP, etc.
- Strong stakeholder management skills
- A flair for storytelling and the ability to present interesting insights from the data
- Good exposure to database management systems, hands-on SQL skills, good knowledge of NoSQL-based databases, and, ideally, working knowledge of R/Python

What you will love about working here: We recognize the significance of flexible work arrangements to provide support. Be it remote work or flexible work hours, you will get an environment to maintain a healthy work-life balance. At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as generative AI.

About Capgemini: Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions, leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud, and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.

Posted 1 month ago

Apply