Jobs
Interviews

270 Pentaho Jobs

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4.0 years

8 - 8 Lacs

Hyderābād

On-site

Hyderabad, India | Technology | In-Office | 11047
Job Description
Job Purpose: The Property Data Engineer is responsible for developing and maintaining data conversion programs that transform raw property assessment data into standardized formats based on specifications provided by Property Data Analysts and Senior Analysts. This role requires not only advanced programming and ETL skills but also a deep understanding of the structure, nuances, and business context of assessment data. Even with clear and well-documented conversion instructions, engineers without prior exposure to this domain often face significant challenges in interpreting and transforming the data accurately. The Data Engineer plays a critical role in ensuring the accuracy, efficiency and scalability of the data processing pipelines that support Assessor Operations.
Responsibilities: Depending on the specific team and role, the Property Data Engineer may be responsible for some or all of the following tasks: Develop and maintain data conversion programs using C#, Python, JavaScript, and SQL. Implement ETL workflows using tools such as Pentaho Kettle, SSIS, and internal applications. Collaborate with Analysts and Senior Analysts to interpret conversion instructions and translate them into executable code. Troubleshoot and resolve issues identified during quality control reviews. Recommend and implement automation strategies to improve data processing efficiency. Perform quality checks on converted data and ensure alignment with business rules and standards. Contribute to the development of internal tools and utilities to support data transformation tasks. Maintain documentation for code, workflows, and processes to support team knowledge sharing.
Programming (Skill Level: Advanced to Expert): Create and maintain conversion programs in SQL and in Visual Studio using C#, Python or JavaScript. Use JavaScript within Pentaho Kettle workflows and SSIS for data transformation. Build and enhance in-house tools to support custom data processing needs. Ensure code is modular, maintainable, and aligned with internal development standards. Ensure code quality through peer reviews, testing and adherence to development standards.
ETL Execution (Skill Level: Advanced to Expert): Execute and troubleshoot ETL processes using tools like Kettle, SSIS, and proprietary tools. Input parameters, execute jobs, and perform quality checks on output files. Troubleshoot ETL failures and optimize performance. Recommend and implement automation strategies to improve data processing efficiency and accuracy.
Data File Manipulation (Skill Level: Advanced to Expert): Work with a wide variety of file formats (CSV, Excel, TXT, XML, etc.) to prepare data for conversion. Apply advanced techniques to clean, merge, and structure data. Develop scripts and tools to automate repetitive data preparation tasks. Ensure data is optimized for downstream ETL and analytical workflows.
Data Analysis (Skill Level: Supportive – Applied): Leverage prior experience in data analysis to independently review and interpret source data when developing or refining conversion programs. Analyze data structures, field patterns, and anomalies to improve the accuracy and efficiency of conversion logic. Use SQL queries, Excel tools, and internal utilities to validate assumptions and enhance the clarity of analyst-provided instructions. Collaborate with Analysts and Senior Analysts to clarify ambiguous requirements and suggest improvements based on technical feasibility and data behavior. Conduct targeted research using public data sources (e.g., assessor websites) to resolve data inconsistencies or fill in missing context during development.
Quality Control (Skill Level: Engineer-Level): Perform initial quality control on converted data outputs before formal review by Associates, Analysts, or Senior Analysts. Validate that the program output aligns with conversion instructions and meets formatting and structural expectations. Use standard scripts, ad-hoc SQL queries, and internal tools to identify and correct discrepancies in the data. Address issues identified during downstream QC reviews by updating conversion logic or collaborating with analysts to refine requirements. Ensure that all deliverables meet internal quality standards prior to release or further review.
Knowledge and Experience: Minimum Education: Bachelor’s degree in Computer Science, Information Systems, Software Engineering, Data Engineering, or a related technical field; or equivalent practical experience in software development or data engineering. Preferred Education: Bachelor’s degree (as above) plus additional coursework or certifications in Data Engineering, ETL Development, Cloud Data Platforms (e.g., AWS, Azure, GCP), SQL and Database Management, or Programming (C#, Python, JavaScript). 4+ years of experience in software development, data engineering, or ETL pipeline development. Expert-level proficiency in programming languages such as SQL, C# (Visual Studio), Python, and JavaScript. Experience with ETL tools such as Pentaho Kettle, SSIS, or similar platforms. Strong understanding of data structures, file formats (CSV, Excel, TXT, XML), and data transformation techniques. Familiarity with relational databases and SQL for data querying and validation. Ability to read and interpret technical documentation and conversion instructions. Strong problem-solving skills and attention to detail. Ability to work independently and collaboratively in a fast-paced environment. Familiarity with property assessment, GIS, tax or public property records data.
Preferred Skills: Experience developing and maintaining data conversion programs in Visual Studio. Experience with property assessment, GIS, tax or public records data. Experience building internal tools or utilities to support data transformation workflows. Knowledge of version control systems (e.g., Git), issue tracking tools (e.g., Jira) and agile development practices. Exposure to cloud-based data platforms or services (e.g., Azure Data Factory, AWS Glue). Ability to troubleshoot and optimize ETL performance and data quality. Strong written and verbal communication skills for cross-functional collaboration.
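The posting above repeatedly refers to conversion programs that remap raw assessor extracts into a standardized layout according to analyst-written instructions. Purely as an illustration, a minimal Python sketch of such a field-mapping step might look like the following; the column names, mapping, cleansing rule, and file paths are hypothetical and not taken from the posting.

```python
import csv

# Hypothetical mapping from raw assessor column names to a standardized layout.
# In practice this would come from analyst-provided conversion instructions.
FIELD_MAP = {
    "PARCEL_NO": "parcel_id",
    "SITUS_ADDR": "property_address",
    "ASSD_VAL": "assessed_value",
}

def convert(raw_path: str, out_path: str) -> int:
    """Read a raw CSV extract, remap and clean fields, write the standardized file."""
    rows_written = 0
    with open(raw_path, newline="", encoding="utf-8") as src, \
         open(out_path, "w", newline="", encoding="utf-8") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=list(FIELD_MAP.values()))
        writer.writeheader()
        for raw in reader:
            record = {std: (raw.get(src_col) or "").strip()
                      for src_col, std in FIELD_MAP.items()}
            # Simple quality rule: assessed value must be numeric, otherwise blank it.
            if not record["assessed_value"].replace(".", "", 1).isdigit():
                record["assessed_value"] = ""
            writer.writerow(record)
            rows_written += 1
    return rows_written

if __name__ == "__main__":
    print(convert("raw_assessor_extract.csv", "standardized_output.csv"))
```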

Posted 1 day ago

Apply

5.0 - 10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Position Overview
Job Title: Spark/Python/Pentaho Developer. Location: Pune, India.
Role Description: Spark/Python/Pentaho Developer working on a Data Integration project, mostly batch oriented, using Python/PySpark/Pentaho.
What We’ll Offer You: As part of our flexible scheme, here are just some of the benefits that you’ll enjoy: best in class leave policy; gender neutral parental leave; 100% reimbursement under the childcare assistance benefit (gender neutral); sponsorship for industry-relevant certifications and education; Employee Assistance Program for you and your family members; comprehensive hospitalization insurance for you and your dependents; accident and term life insurance; complimentary health screening for those aged 35 and above.
Your Key Responsibilities: Hands-on Spark/Python/Pentaho programming. Participating in agile development projects for batch data ingestion. Learning fast in order to understand the current data landscape and the existing Python/Spark/Pentaho programs and make enhancements. Stakeholder communication. Contribute to all stages of the software development lifecycle. Analyze user requirements to define business objectives. Define application objectives and functionality. Develop and test software. Identify and resolve any technical issues arising. Create detailed design documentation. Conduct software analysis, programming, testing, and debugging. Software upgrades and maintenance. Migration of out-of-support application software.
Your Skills And Experience: Minimum 5-10 years of experience. Spark. Python programming. Pentaho. Good at writing Hive HQLs / SQLs. Oracle database. Java/Scala experience is a plus. Expertise in unit testing. Know-how with cloud-based infrastructure.
How We’ll Support You: Training and development to help you excel in your career. Coaching and support from experts in your team. A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs.
About Us And Our Teams: Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 2 days ago

Apply

6.0 - 10.0 years

0 - 0 Lacs

hyderabad, telangana

On-site

You will be joining QTek Digital, a leading data solutions provider known for its expertise in custom data management, data warehouse, and data science solutions. Our team of dedicated data professionals, including data scientists, data analysts, and data engineers, collaborates to address present-day challenges and pave the way for future innovations. At QTek Digital, we value our employees and focus on fostering engagement, empowerment, and continuous growth opportunities. As a BI ETL Engineer at QTek Digital, you will be taking on a full-time remote position. Your primary responsibilities will revolve around tasks such as data modeling, applying analytical skills, implementing data warehouse solutions, and managing Extract, Transform, Load (ETL) processes. This role demands strong problem-solving capabilities and the capacity to work autonomously. To excel in this role, you should ideally possess: - 6-9 years of hands-on experience in ETL and ELT pipeline development using tools like Pentaho, SSIS, FiveTran, Airbyte, or similar platforms. - 6-8 years of practical experience in SQL and other data manipulation languages. - Proficiency in Data Modeling, Dashboard creation, and Analytics. - Sound knowledge of data warehousing principles, particularly Kimball design. - Bonus points for familiarity with Pentaho and Airbyte administration. - Demonstrated expertise in Data Modeling, Dashboard design, Analytics, Data Warehousing, and ETL procedures. - Strong troubleshooting and problem-solving skills. - Effective communication and collaboration abilities. - Capability to operate both independently and as part of a team. - A Bachelor's degree in Computer Science, Information Systems, or a related field. This position is based in our Hyderabad office, offering an attractive compensation package ranging from INR 5-19 Lakhs, depending on various factors such as your skills and prior experience. Join us at QTek Digital and be part of a dynamic team dedicated to shaping the future of data solutions.,

Posted 2 days ago

Apply

10.0 years

0 Lacs

Greater Kolkata Area

Remote

Java Back End Engineer with AWS Location : Remote Experience : 10+ Years Employment Type : Full-Time Job Overview We are looking for a highly skilled Java Back End Engineer with strong AWS cloud experience to design and implement scalable backend systems and APIs. You will work closely with cross-functional teams to develop robust microservices, optimize database performance, and contribute across the tech stack, including infrastructure automation. Core Responsibilities Design, develop, and deploy scalable microservices using Java, J2EE, Spring, and Spring Boot. Build and maintain secure, high-performance APIs and backend services on AWS or GCP. Use JUnit and Mockito to ensure test-driven development and maintain code quality. Develop and manage ETL workflows using tools like Pentaho, Talend, or Apache NiFi. Create High-Level Design (HLD) and architecture documentation for system components. Collaborate with cross-functional teams (DevOps, Frontend, QA) as a full-stack contributor when needed. Tune SQL queries and manage performance on MySQL and Amazon Redshift. Troubleshoot and optimize microservices for performance and scalability. Use Git for source control and participate in code reviews and architectural discussions. Automate infrastructure provisioning and CI/CD processes using Terraform, Bash, and pipelines. Primary Skills Languages & Frameworks : Java (v8/17/21), Spring Boot, J2EE, Servlets, JSP, JDBC, Struts Architecture : Microservices, REST APIs Cloud Platforms : AWS (EC2, S3, Lambda, RDS, CloudFormation, SQS, SNS) or GCP Databases : MySQL, Redshift Secondary Skills (Good To Have) Infrastructure as Code (IaC) : Terraform Additional Languages : Python, Node.js Frontend Frameworks : React, Angular, JavaScript ETL Tools : Pentaho, Talend, Apache NiFi (or equivalent) CI/CD & Containers : Jenkins, GitHub Actions, Docker, Kubernetes Monitoring/Logging : AWS CloudWatch, DataDog Scripting : Bash, Shell scripting Nice To Have Familiarity with agile software development practices Experience in a cross-functional engineering environment Exposure to DevOps culture and tools (ref:hirist.tech)

Posted 2 days ago

Apply

3.0 - 6.0 years

0 Lacs

Mumbai, Maharashtra, India

Remote

About This Role At BlackRock, we are looking for a Data Engineer who enjoys building and supporting high impact data pipelines to solve complex challenges while working closely with your colleagues throughout the business. We recognize that strength comes from diversity, and will embrace your outstanding skills, curiosity, drive, and passion while giving you the opportunity to grow technically while learning from hands-on leaders in technology and finance. With over USD $11 trillion of assets we have an outstanding responsibility: our technology empowers millions of investors to save for retirement, pay for college, buy a home and improve their financial wellbeing. Being a financial technologist at BlackRock means you get the best of both worlds: working for one of the most successful financial companies and also working in a software development team responsible for next generation technology and solutions. We are seeking a high-reaching individual to help implement financial data engineering projects, initially focusing on our Index Fixed Income Group for the BGM DnA ("Data and Analytics") team in India. We are a community of highly qualified Data Engineers, Content & DevOps Specialists who have a passion for working on data solutions that help drive the agenda for our business partners Our team is based in San Francisco, London & Hungary, and we will complete the global circle with a new engineering team in Mumbai. About BlackRock Global Markets BlackRock Global Markets (“BGM”) functions are at the core of BlackRock’s markets and investments platform, including ETF and Index Investments (“Engine”), Global Trading, Securities Lending, Fixed Income, Liquidity and Financing. BGM is passionate about advancing the investment processes and platform architecture in these areas and on ensuring we engage with other market participants in a collaborative, strategic way. You should be Someone who is passionate about solving sophisticated business problems through data! Capable of the design, implementation, and optimization of data pipelines, ETL processes, and data storage solutions Able to work closely with multi-functional teams (e.g., Data Science, Product, Analytics, and Citizen Developer teams) to ensure the data infrastructure meets business needs. Enthusiastic about establishing and maintaining standard methodologies for data engineering, focusing on data quality, security, and scalability. Key Requirements 3-6 years Data Engineering experience preferably in the financial sector Familiarity with any aspect of Fixed Income Index and Market Data including ICE, Bloomberg, JP Morgan, FTSE/Russell, and IBOXX. Liquidity, Venue, and Direct Broker Dealer Market Maker Axe Data. Pricing Data from sources like S&P Global Live Bond Pricing or Bloombergs IBVAL. Understand Portfolio Management Fundamentals: Asset Management and FI Trading. A passion for Financial and Capital Markets. Proven experience working in an agile development team. Strong problem solving skills. Strong SQL and Python skills with a proven track record optimizing SQL queries. Curiosity of financial markets. Good To Have Bachelor’s degree in Computer Science, Engineering, Finance, Economics, or a related field. A Master’s degree or equivalent experience is a plus. Knowledge of Linux and scripting languages such as Bash Experience with MySQL, PostgreSQL, Greenplum, Snowflake or similar databases. Strong experience with ETL/ELT tools like DBT, Pentaho, Informatica or similar technologies. 
Experience with DevOps and tools like Azure DevOps Our Benefits To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about. Our hybrid work model BlackRock’s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock. About BlackRock At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children’s educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. This mission would not be possible without our smartest investment – the one we make in our employees. It’s why we’re dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive. For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law.

Posted 2 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Title: CRD (Charles River Development) Java Developer. Location: Bangalore / Hyderabad. Notice Period: Immediate to 30 days.
Job Summary: We are looking for an experienced Java Developer with hands-on expertise in the Charles River Development (CRD) Investment Management System. The ideal candidate should have a strong background in asset management systems, Java development, and CRD API integration.
Key Responsibilities: Design, develop, and support CRD IMS interfaces and custom solutions. Utilize the CRD API to build and maintain Java-based plugins. Work with Oracle and MS SQL databases to create complex queries, triggers, and stored procedures. Configure and manage CRD imports/exports (OOTB & custom). Handle ETL tasks using the Pentaho Kettle IDE. Use Azure DevOps (or similar tools) for CI/CD pipeline management and Git integration. Collaborate with business and IT teams to interpret requirements and deliver robust solutions. Perform defect analysis, debugging, and system enhancements.
Requirements: 5+ years of experience in Java development with CRD IMS. Proficiency in Oracle Java, CRD APIs, and plugin development. Strong SQL knowledge; experience with Oracle/MS SQL Server. Hands-on with Azure DevOps, Git, and batch scheduling tools. Experience in the financial services or asset management domain is a must. Excellent communication, problem-solving, and analytical skills.

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

haryana

On-site

The role of Lead, Software Engineer at Mastercard involves playing a crucial part in the Data Unification process across different data assets to create a unified view of data from multiple sources. This position will focus on driving insights from available data sets and supporting the development of new data-driven cyber products, services, and actionable insights. The Lead, Software Engineer will collaborate with various teams such as Product Manager, Data Science, Platform Strategy, and Technology to understand data needs and requirements for delivering data solutions that bring business value. Key responsibilities of the Lead, Software Engineer include performing data ingestion, aggregation, and processing to derive relevant insights, manipulating and analyzing complex data from various sources, identifying innovative ideas and delivering proof of concepts, prototypes, and proposing new products and enhancements. Moreover, integrating and unifying new data assets to enhance customer value, analyzing transaction and product data to generate actionable recommendations for business growth, and collecting feedback from clients, development, product, and sales teams for new solutions are also part of the role. The ideal candidate for this position should have a good understanding of streaming technologies like Kafka and Spark Streaming, proficiency in programming languages such as Java, Scala, or Python, experience with Enterprise Business Intelligence Platform/Data platform, strong SQL and higher-level programming skills, knowledge of data mining and machine learning algorithms, and familiarity with data integration tools like ETL/ELT tools including Apache NiFi, Azure Data Factory, Pentaho, and Talend. Additionally, they should possess the ability to work in a fast-paced, deadline-driven environment, collaborate effectively with cross-functional teams, and articulate solution requirements for different groups within the organization. It is essential for all employees working at or on behalf of Mastercard to adhere to the organization's security policies and practices, ensure the confidentiality and integrity of accessed information, report any suspected information security violations or breaches, and complete all mandatory security trainings in accordance with Mastercard's guidelines. The Lead, Software Engineer role at Mastercard offers an exciting opportunity to contribute to the development of innovative data-driven solutions that drive business growth and enhance customer value proposition.,

Posted 3 days ago

Apply

3.0 - 7.0 years

8 - 13 Lacs

Hyderabad

Work from Office

Job Description: Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Senior Software Engineer.
In this role, you will: Provide expert technical guidance and solutions to the POD for complex business problems. Design, develop, and implement technical solutions, ensuring they meet business requirements and are scalable and maintainable. Troubleshoot and resolve escalated technical issues promptly. Provide risk assessments for new functionality and enhancements. As an ITSO (IT Service Owner), complete BOW tasks within the timelines and ensure that your application services are vulnerability, ICE, resiliency, and contingency testing compliant. As an ITSO, ensure that applications have an effective escalation and support framework in place for all IT production incidents, one that meets the agreed operational and service level agreements of the business. Be accountable for leading the POD. Sound knowledge of corporate finance, exhibiting knowledge of interest rate risk in the banking book. Experience with Agile delivery methodologies (JIRA, Scrum, FDD, SAFe). Experience with DevOps tools (Jenkins, Ansible, Git).
Requirements: To be successful in this role, you should meet the following requirements: Graduation in technology (B.E., B.Tech and above) with 5+ years of IT experience. Strong knowledge of the Pentaho ETL tool, with MapReduce build knowledge. Writing complex SQL queries. Good knowledge of shell scripting, Python, Java. Exposure to Hadoop and Big Data is a plus. Infrastructure as Code & CI/CD: Git, Ansible, Jenkins. Experience working in an Agile/DevOps environment: monitoring, alerting, incident tracking, reporting, etc. Good understanding of Google Cloud and exposure to the latest tools/technologies will be an added advantage.
You'll achieve more when you join HSBC. hsbc.com/careers HSBC is committed to building a culture where all employees are valued and respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by HSBC Software Development India

Posted 3 days ago

Apply

5.0 - 8.0 years

5 - 9 Lacs

Chennai

Work from Office

Design, develop, and maintain ETL processes using Pentaho Data Integration (Kettle). Extract data from various sources including databases, flat files, APIs, and cloud platforms. Transform and cleanse data to meet business and technical requirements. Load data into data warehouses, data lakes, or other target systems. Monitor and optimize ETL performance and troubleshoot issues. Collaborate with data architects, analysts, and business stakeholders to understand data requirements. Ensure data quality, integrity, and security throughout the ETL lifecycle. Document ETL processes, data flows, and technical specifications. Grade Specific: Focus on Industrial Operations Engineering. Develops competency in own area of expertise. Shares expertise and provides guidance and support to others. Interprets clients' needs. Completes own role independently or with minimum supervision. Identifies problems and relevant issues in straightforward situations and generates solutions. Contributes to teamwork and interacts with customers.
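To make the listed responsibilities concrete, here is a minimal, hypothetical sketch of the extract-transform-load cycle in plain Python; Pentaho Data Integration (Kettle) would normally express these steps as graphical transformations rather than code, and the file name, cleansing rules, and SQLite target below are assumptions used only for illustration.

```python
import csv
import sqlite3

def extract(path: str) -> list[dict]:
    # Extract: read raw rows from a flat-file source (hypothetical sales.csv).
    with open(path, newline="", encoding="utf-8") as fh:
        return list(csv.DictReader(fh))

def transform(rows: list[dict]) -> list[tuple]:
    # Transform: cleanse and reshape rows to match the target schema.
    cleaned = []
    for row in rows:
        try:
            rec = (row["order_id"].strip(),
                   row["customer"].strip().title(),
                   float(row["amount"]))
        except (KeyError, ValueError, AttributeError):
            continue  # drop rows that fail the basic quality rule
        cleaned.append(rec)
    return cleaned

def load(rows: list[tuple], db_path: str = "warehouse.db") -> None:
    # Load: write the cleansed rows into a target table (SQLite as a stand-in warehouse).
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS orders (order_id TEXT, customer TEXT, amount REAL)")
    con.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    con.commit()
    con.close()

if __name__ == "__main__":
    load(transform(extract("sales.csv")))
```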

Posted 3 days ago

Apply

0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

AWS/Azure/GCP, Linux, shell scripting, IaC, Docker, Kubernetes, Jenkins, GitHub.
A day in the life of an Infosys Equinox employee: As part of the Infosys Equinox delivery team, your primary role would be to ensure effective design, development, validation and support activities, to assure that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications to understand the client requirements in a detailed manner and translate the same into system requirements.
A clear understanding of HTTP / network protocol concepts, designs & operations: TCP dump, cookies, sessions, headers, client-server architecture. Core strength in Linux and Azure infrastructure provisioning, including VNet, Subnet, Gateway, VM, security groups, MySQL, Blob Storage, Azure Cache, AKS Cluster, etc. Expertise in automating infrastructure as code using Terraform, Packer, Ansible, shell scripting and Azure DevOps. Expertise with patch management and APM tools like AppDynamics and Instana for monitoring and alerting. Knowledge of technologies including Apache Solr, MySQL, Mongo, Zookeeper, RabbitMQ, Pentaho, etc. Knowledge of cloud platforms including AWS and GCP is an added advantage. Ability to identify and automate recurring tasks for better productivity. Ability to understand and implement industry-standard security solutions. Experience in implementing auto scaling, DR, HA and multi-region deployments with best practices is an added advantage. Ability to work under pressure, managing expectations from various key stakeholders.
You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. You would be a key contributor to building efficient programs. Knowledge of design principles and fundamentals of architecture. Understanding of performance engineering. Knowledge of quality processes and estimation techniques. Basic understanding of the project domain. Ability to translate functional / non-functional requirements to system requirements. Ability to design and code complex programs. Ability to write test cases and scenarios based on the specifications. Good understanding of SDLC and agile methodologies. Awareness of latest technologies and trends. Logical thinking and problem-solving skills along with an ability to collaborate.

Posted 3 days ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Our Purpose: Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.
Title And Summary: Senior Analyst, Big Data Analytics & Engineering
Overview: Job Title: Sr. Analyst, Data Engineering, Value Quantification Team (Based in Pune, India)
About Mastercard: Mastercard is a global technology leader in the payments industry, committed to powering an inclusive, digital economy that benefits everyone, everywhere. By leveraging secure data, cutting-edge technology, and innovative solutions, we empower individuals, financial institutions, governments, and businesses to achieve their potential. Our culture is driven by our Decency Quotient (DQ), ensuring inclusivity, respect, and integrity guide everything we do. Operating across 210+ countries and territories, Mastercard is dedicated to building a sustainable world with priceless opportunities for all.
Position Overview: This is a techno-functional position that combines strong technical skills with a deep understanding of business needs and requirements, with 5-7 years of experience. The role focuses on developing and maintaining advanced data engineering solutions for pre-sales value quantification within the Services business unit. As a Sr. Analyst, you will be responsible for creating and optimizing data pipelines, managing large datasets, and ensuring the integrity and accessibility of data to support Mastercard’s internal teams in quantifying the value of services, enhancing customer engagement, and driving business outcomes. The role requires close collaboration across teams to ensure data solutions meet business needs and deliver measurable impact.
Role Responsibilities: Data Engineering & Pipeline Development: Develop and maintain robust data pipelines to support the value quantification process. Utilize tools such as Apache NiFi, Azure Data Factory, Pentaho, Talend, SSIS, and Alteryx to ensure efficient data integration and transformation. Data Management and Analysis: Manage and analyze large datasets using SQL, Hadoop, and other database management systems. Perform data extraction, transformation, and loading (ETL) to support value quantification efforts. Advanced Analytics Integration: Use advanced analytics techniques, including machine learning algorithms, to enhance data processing and generate actionable insights. Leverage programming languages such as Python (Pandas, NumPy, PySpark) and Impala for data analysis and model development. Business Intelligence and Reporting: Utilize business intelligence platforms such as Tableau and Power BI to create insightful dashboards and reports that communicate the value of services. Generate actionable insights from data to inform strategic decisions and provide clear, data-backed recommendations. Cross-Functional Collaboration & Stakeholder Engagement: Collaborate with Sales, Marketing, Consulting, Product, and other internal teams to understand business needs and ensure successful data solution development and deployment.
Communicate insights and data value through compelling presentations and dashboards to senior leadership and internal teams, ensuring tool adoption and usage. All About You Data Engineering Expertise: Proficiency in data engineering tools and techniques to develop and maintain data pipelines. Experience with data integration tools such as Apache NiFi, Azure Data Factory, Pentaho, Talend, SSIS, and Alteryx. Advanced SQL Skills: Strong skills in SQL for querying and managing large datasets. Experience with database management systems and data warehousing solutions. Programming Proficiency: Knowledge of programming languages such as Python (Pandas, NumPy, PySpark) and Impala for data analysis and model development. Business Intelligence and Reporting: Experience in creating insightful dashboards and reports using business intelligence platforms such as Tableau and Power BI. Statistical Analysis: Ability to perform statistical analysis to identify trends, correlations, and insights that support strategic decision-making. Cross-Functional Collaboration: Strong collaboration skills to work effectively with Sales, Marketing, Consulting, Product, and other internal teams to understand business needs and ensure successful data solution development and deployment. Communication and Presentation: Excellent communication skills to convey insights and data value through compelling presentations and dashboards to senior leadership and internal teams. Execution Focus: A results-driven mindset with the ability to balance strategic vision with tactical execution, ensuring that data solutions are delivered on time and create measurable business value. Education Bachelor’s degree in Data Science, Computer Science, Business Analytics, Economics, Finance, or a related field. Advanced degrees or certifications in analytics, data science, AI/ML, or an MBA are preferred. Why Us? At Mastercard, you’ll have the opportunity to shape the future of internal operations by leading the development of data engineering solutions that empower teams across the organization. Join us to make a meaningful impact, drive business outcomes, and help Mastercard’s internal teams create better customer engagement strategies through innovative value-based ROI narratives. Location: Gurgaon/Pune, India Employment Type: Full-Time Corporate Security Responsibility All activities involving access to Mastercard assets, information, and networks comes with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must: Abide by Mastercard’s security policies and practices; Ensure the confidentiality and integrity of the information being accessed; Report any suspected information security violation or breach, and Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.

Posted 4 days ago

Apply

3.0 - 10.0 years

20 - 25 Lacs

Pune

Work from Office

Job Description: Job Title: Pentaho Developer. Corporate Title: Associate. Location: Pune, India.
Role Description: Developer with at least 10 years of experience. Experience in developing and deploying Pentaho-based applications. You will work on a Data Integration project, mostly batch oriented, using Pentaho 9.3, Oracle, Hadoop, Python, PySpark and similar technologies. Primary Skills: Pentaho, SQL, Oracle, Python, PySpark, shell scripting. Experience: Minimum of 10 years of hands-on experience in development projects.
What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy: best in class leave policy; gender neutral parental leave; 100% reimbursement under the childcare assistance benefit (gender neutral); sponsorship for industry-relevant certifications and education; Employee Assistance Program for you and your family members; comprehensive hospitalization insurance for you and your dependents; accident and term life insurance; complimentary health screening for those aged 35 and above.
Your key responsibilities: Hands-on developer in a Pentaho Jobs/Transformations development role. Participating in agile development projects for batch data ingestion. Learning fast in order to understand the current data landscape and the existing Spark and Hive HQL programs and make enhancements. Bug fixing and enhancements for the existing applications. Software upgrades and maintenance, including mandatory client connectivity upgrades. Closure of audit and operating control issues. Migration of out-of-support application software. Migration of applications to cloud. Migration of applications from one technology to another.
Your skills and experience: Pentaho, SQL, Oracle, Python, PySpark, shell scripting. Basic knowledge of Unix shell scripting is a must. Good at writing database SQLs to process the batch jobs; analytical SQL. Pentaho Big Data components. Hadoop (Hive) and Spark experience/knowledge. Know-how with cloud-based infrastructure. Expertise in unit testing.
How we'll support you: Training and development to help you excel in your career. Coaching and support from experts in your team. A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs. https://www.db.com/company/company.htm

Posted 4 days ago

Apply

6.0 - 11.0 years

3 - 7 Lacs

New Delhi, Chennai, Bengaluru

Work from Office

Must have 2+ years of working experience in designing ETL flows with Pentaho Data Integration. Develop ETL jobs based on the requirements given. Self-sufficient in understanding the existing ETL jobs / reports. Analyse data from various sources, including databases and flat files. Discuss with the customers/end users and gather requirements. Build and customize ETL/BI reports. Identify problematic areas in ETL loads and fix them through performance improvements. Must have 3+ years of experience in developing and tuning complex SQL queries using Oracle, PostgreSQL or other leading DBMSs. Write SQL scripts to validate the data. Support daily loads and fix issues within SLA. Hands-on data migration. Able to set up reporting for a new client. Able to write Linux scripts. Provide production support. Create source-to-target (mapping) documents.
Required Technical and Professional Expertise: Hands-on experience with ETL tools like Pentaho. Hands-on experience with any one reporting tool such as Pentaho BI, Microsoft Power BI, Oracle BI, Tableau, etc. Experience in a data warehousing role with a solid understanding of data warehousing approaches and best practices. Strong hands-on experience writing SQL scripts to analyse and validate the data. Should have expert knowledge in writing SQL commands, queries, and stored procedures. Strong knowledge of DB and DW concepts. Functional knowledge/understanding of finance, reconciliation, customer service, pricing modules, etc. Excellent SQL, PL/SQL, XML, JSON and database skills. Good to have some experience with Python or JavaScript. Good to have some knowledge of Kubernetes and Ansible. Good to have some knowledge of Linux scripting.
Education: UG: Any Graduate.
Key Skills: ETL, Pentaho, PL/SQL, ETL Development, ETL Tool, SQL
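As a hedged illustration of the data-validation scripting this listing asks for, the Python sketch below runs basic reconciliation checks (row count, null keys, duplicate keys) against a staging table; the table name, column names, and the SQLite connection used here are assumed stand-ins rather than details from the posting.

```python
import sqlite3

# Hypothetical validation checks a load-support engineer might run after a daily batch.
CHECKS = {
    "row_count": "SELECT COUNT(*) FROM stg_orders",
    "null_keys": "SELECT COUNT(*) FROM stg_orders WHERE order_id IS NULL",
    "duplicate_keys": (
        "SELECT COUNT(*) FROM ("
        " SELECT order_id FROM stg_orders GROUP BY order_id HAVING COUNT(*) > 1"
        ") AS dup"
    ),
}

def validate(conn: sqlite3.Connection, expected_rows: int) -> bool:
    """Return True if the staged data passes all checks."""
    results = {name: conn.execute(sql).fetchone()[0] for name, sql in CHECKS.items()}
    for name, value in results.items():
        print(f"{name}: {value}")
    return (
        results["row_count"] == expected_rows
        and results["null_keys"] == 0
        and results["duplicate_keys"] == 0
    )

if __name__ == "__main__":
    with sqlite3.connect("staging.db") as conn:
        conn.execute("CREATE TABLE IF NOT EXISTS stg_orders (order_id TEXT, amount REAL)")
        print("PASS" if validate(conn, expected_rows=0) else "FAIL")
```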

Posted 5 days ago

Apply

3.0 - 6.0 years

3 - 6 Lacs

Gurgaon, Haryana, India

On-site

The Role: We are seeking a Lead, Software Engineer who will: Perform data ingestion, aggregation and processing to drive and enable relevant insights from available data sets. Partner with various teams (e.g., Product Manager, Data Science, Platform Strategy, Technology) on data needs/requirements in order to deliver data solutions that generate business value. Manipulate and analyze complex, high-volume, high-dimensionality data from varying sources using a variety of tools and data analysis techniques. Identify innovative ideas, deliver proofs of concept and prototypes against existing and future needs, and propose new products, services and enhancements. Integrate and unify new data assets that increase the value proposition for our customers and enhance our existing solutions and services. Analyse large volumes of transaction and product data to generate insights and actionable recommendations to drive business growth. Collect and synthesize feedback from clients, development, product and sales teams for new solutions or product enhancements. Apply knowledge of metrics, measurements, and benchmarking to complex and demanding solutions.
All About You: Good understanding of streaming technologies like Kafka and Spark Streaming. Proficiency in one programming language, preferably Java, Scala or Python. Experience with an Enterprise Business Intelligence Platform/Data platform. Strong SQL and higher-level programming skills, with solid knowledge of data mining, machine learning algorithms and tools. Experience with data integration (ETL/ELT) tools (e.g., Apache NiFi, Azure Data Factory, Pentaho, Talend). Exposure to collecting and/or working with data, including standardizing, summarizing, offering initial observations and highlighting inconsistencies. Strong understanding of the application of analytical methods and data visualization to support business decisions. Ability to understand complex operational systems and analytics/business intelligence tools for the delivery of information products and analytical offerings to a large, global user base. Able to work in a fast-paced, deadline-driven environment as part of a team and as an individual contributor. Ability to easily move between business, analytical, and technical teams and articulate solution requirements for each group.

Posted 5 days ago

Apply

2.0 years

5 Lacs

Delhi

On-site

Profile: Database Developer. Experience: 2+ years. Location: Shastri Park. Notice Period: Immediate joiner to 15 days. Education: B.Tech / B.E. / MCA / M.Sc / MS. Salary: 30% to 40% hike (rest depends on the interview). ONLY CANDIDATES FROM DELHI WILL BE PREFERRED.
ROLES & RESPONSIBILITIES: Any work experience with PENTAHO will be an added advantage. SQL Developers are responsible for all aspects of designing, creating and maintaining the database. Building databases and validating their stability and efficiency. Creating programs, views, functions and stored procedures. Writing optimized SQL queries for integration with other applications. Developing scripts, procedures and triggers for application development. Maintaining data quality and backups and overseeing database security. Strong proficiency in SQL, PL/SQL, T-SQL and database management systems (MySQL, PostgreSQL, SQL Server, Oracle, etc.). Develop and maintain ETL (Extract, Transform, Load) processes for data migration.
Job Types: Full-time, Contractual / Temporary. Pay: From ₹500,000.00 per year. Work Location: In person

Posted 5 days ago

Apply

5.0 - 10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

It's fun to work in a company where people truly BELIEVE in what they are doing! We're committed to bringing passion and customer focus to the business. Location - Open Position: Data Engineer (GCP) – Technology If you are an extraordinary developer and who loves to push the boundaries to solve complex business problems using creative solutions, then we wish to talk with you. As an Analytics Technology Engineer, you will work on the Technology team that helps deliver our Data Engineering offerings at large scale to our Fortune clients worldwide. The role is responsible for innovating, building and maintaining technology services. Responsibilities: Be an integral part of large scale client business development and delivery engagements Develop the software and systems needed for end-to-end execution on large projects Work across all phases of SDLC, and use Software Engineering principles to build scaled solutions Build the knowledge base required to deliver increasingly complex technology projects Qualifications & Experience: A bachelor’s degree in Computer Science or related field with 5 to 10 years of technology experience Desired Technical Skills: Data Engineering and Analytics on Google Cloud Platform: Basic Cloud Computing Concepts Bigquery, Google Cloud Storage, Cloud SQL, PubSub, Dataflow, Cloud Composer, GCP Data Transfer, gcloud CLI Python, Google Cloud Python SDK, SQL Experience in working with Any NoSQL/Columnar / MPP Database Experience in working with Any ETL Tool (Informatica/DataStage/Talend/Pentaho etc.) Strong Knowledge of database concepts, Data Modeling in RDBMS Vs NoSQL, OLTP Vs OLAP, MPP Architecture Other Desired Skills: Excellent communication and co-ordination skills Problem understanding, articulation and solutioning Quick learner & adaptable with regards to new technologies Ability to research & solve technical issues Responsibilities: Developing Data Pipelines (Batch/Streaming) Developing Complex data transformations ETL Orchestration Data Migration Develop and Maintain Datawarehouse / Data Lakes Good To Have: Experience in working with Apache Spark / Kafka Machine Learning concepts Google Cloud Professional Data Engineer Certification If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us! Not the right fit? Let us know you're interested in a future opportunity by clicking Introduce Yourself in the top-right corner of the page or create an account to set up email alerts as new job postings become available that meet your interest!

Posted 6 days ago

Apply

0.0 years

0 Lacs

Delhi, Delhi

On-site

Profile: Database Developer. Experience: 2+ years. Location: Shastri Park. Notice Period: Immediate joiner to 15 days. Education: B.Tech / B.E. / MCA / M.Sc / MS. Salary: 30% to 40% hike (rest depends on the interview). ONLY CANDIDATES FROM DELHI WILL BE PREFERRED.
ROLES & RESPONSIBILITIES: Any work experience with PENTAHO will be an added advantage. SQL Developers are responsible for all aspects of designing, creating and maintaining the database. Building databases and validating their stability and efficiency. Creating programs, views, functions and stored procedures. Writing optimized SQL queries for integration with other applications. Developing scripts, procedures and triggers for application development. Maintaining data quality and backups and overseeing database security. Strong proficiency in SQL, PL/SQL, T-SQL and database management systems (MySQL, PostgreSQL, SQL Server, Oracle, etc.). Develop and maintain ETL (Extract, Transform, Load) processes for data migration.
Job Types: Full-time, Contractual / Temporary. Pay: From ₹500,000.00 per year. Work Location: In person

Posted 6 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

IDT (www.idt.net) is a communications and financial services company founded in 1990 and headquartered in New Jersey, US. Today it is an industry leader in prepaid communication and payment services and one of the world’s largest international voice carriers. We are listed on the NYSE, employ over 1500 people across 20+ countries, and have revenues in excess of $1.5 billion. We are looking for a Mid-level Business Intelligence Engineer to join our global team. If you are highly intelligent, motivated, ambitious, ready to learn and make a direct impact, this is your opportunity! The individual in this role will perform data analysis, ELT/ETL design and support functions to deliver on strategic initiatives to meet organizational goals across many lines of business. The interview process will be conducted in English Responsibilities: Develop, document, and test ELT/ETL solutions using industry standard tools (Snowflake, Denodo Data Virtualization, Looker) Recommend process improvements to increase efficiency and reliability in ELT/ETL development Extract data from multiple sources, integrate disparate data into a common data model, and integrate data into a target database, application, or file using efficient ELT/ ETL processes Collaborate with Quality Assurance resources to debug ELT/ETL development and ensure the timely delivery of products Should be willing to explore and learn new technologies and concepts to provide the right kind of solution Target and result oriented with strong end user focus Effective oral and written communication skills with BI team and user community Requirements: 5+ years of experience in ETL/ELT design and development, integrating data from heterogeneous OLTP systems and API solutions, and building scalable data warehouse solutions to support business intelligence and analytics Demonstrated experience in utilizing python for data engineering tasks, including transformation, advanced data manipulation, and large-scale data processing Experience in data analysis, root cause analysis and proven problem solving and analytical thinking capabilities Experience designing complex data pipelines extracting data from RDBMS, JSON, API and Flat file sources Demonstrated expertise in SQL and PLSQL programming, with advanced mastery in Business Intelligence and data warehouse methodologies, along with hands-on experience in one or more relational database systems and cloud-based database services such as Oracle, MySQL, Amazon RDS, Snowflake, Amazon Redshift, etc Proven ability to analyze and optimize poorly performing queries and ETL/ELT mappings, providing actionable recommendations for performance tuning Understanding of software engineering principles and skills working on Unix/Linux/Windows Operating systems, and experience with Agile methodologies Proficiency in version control systems, with experience in managing code repositories, branching, merging, and collaborating within a distributed development environment Excellent English communication skills Interest in business operations and comprehensive understanding of how robust BI systems drive corporate profitability by enabling data-driven decision-making and strategic insights. Pluses: Experience in developing ETL/ELT processes within Snowflake and implementing complex data transformations using built-in functions and SQL capabilities Experience using Pentaho Data Integration (Kettle) / Ab Initio ETL tools for designing, developing, and optimizing data integration workflows. 
Experience designing and implementing cloud-based ETL solutions using Azure Data Factory, DBT, AWS Glue, Lambda and open-source tools Experience with reporting/visualization tools (e.g., Looker) and job scheduler software Experience in Telecom, eCommerce, International Mobile Top-up Education: BS/MS in computer science, Information Systems or a related technical field or equivalent industry expertise Preferred Certification: AWS Solution Architect, AWS Cloud Data Engineer, Snowflake SnowPro Core Please attach CV in English. The interview process will be conducted in English. Only accepting applicants from INDIA

Posted 1 week ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Position Overview Job Title: Data Automation Engineer, AS Location: Bangalore, India Role Description KYC Operations play an integral part in the firm’s first line of defense against financial crime, reducing the risk of working with new clients (primarily Know Your Customer (KYC) risk), whilst ensuring client relationships are on-boarded and maintained efficiently. KYC Operations provide a golden source of quality reference data for CIB, underpinning the firm’s key Regulatory, Control & Governance standards. Within KYC Operations there is a dedicated global group KYC Transformation that drives end-to-end-delivery. Our teams’ partners with stakeholders in and outside of KYC Ops to ensure our processes are fit for purpose, follow a uniform standard and continuously improving our processes thereby adding a quantifiable value to support colleagues and clients in a flexible, fast and focused manner. As a Data Automation Engineer, you will build fast solutions to help Operations and other parts of the bank deliver their highest value, removing repetitive tasks, building strategic data pipelines, ensuring automation is robust and stable using solutions incl. Python, VBA, MS Power platforms (Power Automate, Power Apps, Power BI), SQL and Share Points. Our approach is to ensure the solution can be merged into strategic tooling and fits the technology design process standards. We are looking for an enthusiastic and motivated person with excellent communication skills to join our team. You will love working with us and see the value in helping people by delivering effective solutions that make a positive impact on your colleagues’ workload. You will be curious and able to quickly absorb organizational complexity, regulatory requirements, and business logic, translating that structure into your work. This role will offer a fantastic opportunity to join one of the most prestigious financial organisations operating all over the globe, and you will gain amazing experience. What We’ll Offer You As part of our flexible scheme, here are just some of the benefits that you’ll enjoy, Best in class leave policy. Gender neutral parental leaves 100% reimbursement under childcare assistance benefit (gender neutral) Sponsorship for Industry relevant certifications and education Employee Assistance Program for you and your family members Comprehensive Hospitalization Insurance for you and your dependents Accident and Term life Insurance Complementary Health screening for 35 yrs. and above Your Key Responsibilities Work with stakeholders to identify opportunities to drive business solutions and improvements Automate manual effort, providing tactical solutions to improve speed and value. Work in an agile way to deliver proof of concept and fast solutions using the appropriate technologies appropriate to the problem statements and requirements Enhance personal and team network to ensure cooperation yields efficiencies, for example sharing of solutions to a wider team, re-using existing solutions, enhancing solutions to have a wider and more beneficial business impact Your Skills And Experience Analyse, design, develop, test, deploy and support Digital services software solutions Exposure to ETL technologies and methods Expertise in coding/ programming in Python, VBA, and SQL skills to extract data sets efficiently Experience in developing business solutions in any of MS power Apps, MS Power Automate or RPA Excellent spatial reasoning and ability to see view process and data in two or three-dimensions. 
Process Mapping, Process Re-engineering & Data orientated with experience in enterprise process modelling for current and future state. The ability to generate innovative ideas and deliver effectively, highlighting blockers if needed. Exposure to workflow solutions, Alteryx, Pentaho, Celonis, linux and database tuning are desirable Documenting solutions (i.e., Creation and upkeep of artefacts - Requirement Docs, SDDs, Test Scripts, JIRA tickets, KSD - post go live) Provide L1 support to the existing RPA solution, resolve the issues with minimum TAT to ensure business resiliency Competencies: Work alongside Solutions Architects, Business Analysts and BOT controlling to contribute with solution designs Highly organized with a keen eye for detail and proven record operating in a fast- paced environment Ability to work independently and as part of the team with an enterprising spirit and a passion to learn and apply new technologies. Excellent communication skills with ability to converse clearly with stakeholders from all cultures Ability to work well in a global and virtual team, under pressure and multi-task Behavior Skills Excellent communication skills with ability to converse clearly with stakeholders from all cultures Ability to work well in a global and virtual team, under pressure and multi-task Desire to work in a fast paced, challenging environment Self-motivated, independently, fast thinking, dynamic with exposure to internal and external stakeholders How We’ll Support You Training and development to help you excel in your career. Coaching and support from experts in your team. A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs. About Us And Our Teams Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 1 week ago

Apply

0 years

7 - 9 Lacs

India

Remote

Fullstack (Java + Angular)
Experience required: 5 yrs
Immediate joiners
Work Location: Ahmedabad or Pune

Detailed JD:
5+ years of experience with a minimum of a bachelor's degree in Computer Science.
Technical Skillset:
Java 8+, JavaScript, TypeScript
Spring Boot, Spring MVC, Spring Web Services, Spring Data, Hibernate, Jasper Reports
Angular 8+, React 16+
Angular Material, Bootstrap 4, HTML5, CSS3, SCSS
Oracle SQL, PL/SQL development
Pentaho Kettle
Basic Linux scripting and troubleshooting
Git
Design patterns

Guidelines:
1. Passport is mandatory - if the candidate does not hold a passport, please request the candidate to apply for a Tatkal passport.
2. Need "immediate" joiners only.
3. Attach the assignment to the email - profiles need to be submitted along with the assignment.

Job Types: Contractual / Temporary, Freelance
Contract length: 6 months
Pay: ₹65,000.00 - ₹80,000.00 per month
Benefits: Work from home
Work Location: In person

Posted 1 week ago

Apply

5.0 - 7.0 years

4 - 9 Lacs

Chennai

Hybrid

Role & responsibilities
Experience in ETL development with a focus on Pentaho Data Integration.
Strong SQL skills and experience with relational databases (e.g., MySQL, PostgreSQL, Oracle, SQL Server).
Experience with data modeling and data warehousing concepts.
Familiarity with scripting languages (e.g., Shell, Python) is a plus.
Experience with version control systems like Git.
Strong problem-solving and analytical skills.
Excellent communication and teamwork abilities.
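For context, Pentaho Data Integration transformations are commonly triggered from scripts via its command-line runner (Pan). The sketch below is a hypothetical Python wrapper; the install path, .ktr file and parameter name are assumptions and would differ per environment.

```python
# Illustrative sketch only: the PDI install path, transformation file and parameter name are assumptions.
import subprocess
from datetime import date

PAN = "/opt/pentaho/data-integration/pan.sh"                 # hypothetical PDI install location
TRANSFORMATION = "/etl/transformations/load_customers.ktr"   # hypothetical transformation

def run_transformation(run_date: date) -> None:
    """Launch a Pentaho transformation via Pan and fail loudly on a non-zero exit code."""
    cmd = [
        PAN,
        f"-file={TRANSFORMATION}",
        f"-param:RUN_DATE={run_date.isoformat()}",
        "-level=Basic",
    ]
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode != 0:
        raise RuntimeError(f"Pan exited with {result.returncode}:\n{result.stderr}")

if __name__ == "__main__":
    run_transformation(date.today())
```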

Posted 1 week ago

Apply

4.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Role: ETL Pentaho Developer with Snowflake
Experience: 4-7 yrs
Job Location: Chennai
Mandatory Skills: ETL Developer-SF, ETL, SQL, Pentaho

JD:
Overall 4 to 7 years of strong hands-on experience in ETL development.
Should have at least 2 years of experience with Pentaho Data Integration and Snowflake Database.
Extensive expertise in databases, data acquisition, ETL strategies, and tools and technologies using Pentaho DI.
Should have experience extracting data from various file formats such as XML, JSON and flat file sources.
Strong programming and relational database skills, with expertise in advanced SQL and PL/SQL, Oracle 10g & 11g, MySQL databases, etc.
Strong hands-on experience in tuning SQL queries/ETL mappings.
Exposure to standard support ticket management tools.
Mastery of Business Intelligence and data warehousing concepts and methodologies.
Extensive experience in data analysis and root cause analysis, with proven problem-solving and analytical thinking capabilities.
Understanding of software engineering principles and skills working on Unix/Linux/Windows operating systems, version control and office software.
Experience with scripting languages like Python, Shell, JavaScript and HTML5 is a plus.
Broad understanding of multiple existing and next-generation Business Intelligence tools and technologies.
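As a small, hedged illustration of the Pentaho-plus-Snowflake combination this role describes, the sketch below uses the snowflake-connector-python package to compare staging and target row counts after a load. The connection parameters, warehouse, schema and table names are all assumptions, not details from the listing.

```python
# Illustrative sketch only: connection details, warehouse, schema and table names are hypothetical.
import os
import snowflake.connector  # pip install snowflake-connector-python

def staged_vs_target_counts() -> tuple:
    """Compare row counts between a staging table and its target after a Pentaho load."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ETL_WH",        # assumed warehouse name
        database="ANALYTICS",      # assumed database
        schema="STAGING",          # assumed schema
    )
    try:
        cur = conn.cursor()
        cur.execute("SELECT COUNT(*) FROM STG_ORDERS")              # hypothetical staging table
        staged = cur.fetchone()[0]
        cur.execute("SELECT COUNT(*) FROM ANALYTICS.CORE.ORDERS")   # hypothetical target table
        loaded = cur.fetchone()[0]
        return staged, loaded
    finally:
        conn.close()

if __name__ == "__main__":
    staged, loaded = staged_vs_target_counts()
    print(f"staged={staged}, loaded={loaded}, delta={staged - loaded}")
```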

Posted 1 week ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

Remote

This is a full-time work-from-home opportunity for a star BI data engineer from India.

IDT (www.idt.net) is a communications and financial services company founded in 1990 and headquartered in New Jersey, US. Today it is an industry leader in prepaid communication and payment services and one of the world's largest international voice carriers. We are listed on the NYSE, employ over 1500 people across 20+ countries, and have revenues in excess of $1.5 billion.

We are looking for a Mid-level Business Intelligence Engineer to join our global team. If you are highly intelligent, motivated, ambitious, ready to learn and make a direct impact, this is your opportunity! The individual in this role will perform data analysis, ELT/ETL design and support functions to deliver on strategic initiatives to meet organizational goals across many lines of business. The interview process will be conducted in English.

Responsibilities:
Develop, document, and test ELT/ETL solutions using industry standard tools (Snowflake, Denodo Data Virtualization, Looker)
Recommend process improvements to increase efficiency and reliability in ELT/ETL development
Extract data from multiple sources, integrate disparate data into a common data model, and integrate data into a target database, application, or file using efficient ELT/ETL processes
Collaborate with Quality Assurance resources to debug ELT/ETL development and ensure the timely delivery of products
Should be willing to explore and learn new technologies and concepts to provide the right kind of solution
Target- and result-oriented with a strong end-user focus
Effective oral and written communication skills with the BI team and user community

Requirements:
5+ years of experience in ETL/ELT design and development, integrating data from heterogeneous OLTP systems and API solutions, and building scalable data warehouse solutions to support business intelligence and analytics
Demonstrated experience in utilizing Python for data engineering tasks, including transformation, advanced data manipulation, and large-scale data processing
Experience in data analysis and root cause analysis, with proven problem-solving and analytical thinking capabilities
Experience designing complex data pipelines extracting data from RDBMS, JSON, API and flat file sources
Demonstrated expertise in SQL and PL/SQL programming, with advanced mastery of Business Intelligence and data warehouse methodologies, along with hands-on experience in one or more relational database systems and cloud-based database services such as Oracle, MySQL, Amazon RDS, Snowflake, Amazon Redshift, etc.
Proven ability to analyze and optimize poorly performing queries and ETL/ELT mappings, providing actionable recommendations for performance tuning
Understanding of software engineering principles and skills working on Unix/Linux/Windows operating systems, and experience with Agile methodologies
Proficiency in version control systems, with experience in managing code repositories, branching, merging, and collaborating within a distributed development environment
Excellent English communication skills
Interest in business operations and a comprehensive understanding of how robust BI systems drive corporate profitability by enabling data-driven decision-making and strategic insights
Pluses:
Experience in developing ETL/ELT processes within Snowflake and implementing complex data transformations using built-in functions and SQL capabilities
Experience using Pentaho Data Integration (Kettle) / Ab Initio ETL tools for designing, developing, and optimizing data integration workflows
Experience designing and implementing cloud-based ETL solutions using Azure Data Factory, DBT, AWS Glue, Lambda and open-source tools
Experience with reporting/visualization tools (e.g., Looker) and job scheduler software
Experience in Telecom, eCommerce, International Mobile Top-up

Education: BS/MS in Computer Science, Information Systems or a related technical field, or equivalent industry expertise
Preferred Certifications: AWS Solution Architect, AWS Cloud Data Engineer, Snowflake SnowPro Core

Please attach your CV in English. The interview process will be conducted in English. Only accepting applicants from INDIA.
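Since the listing emphasises Python-based integration of flat-file and JSON sources into a common data model, here is a minimal, hypothetical sketch of that kind of conformance step using pandas. The file names, columns and join key are assumptions for illustration only.

```python
# Illustrative sketch only: file names, columns and the join key are assumptions.
import pandas as pd

def build_common_model(orders_csv: str, customers_json: str) -> pd.DataFrame:
    """Combine a flat-file extract and a JSON extract into one conformed dataset."""
    orders = pd.read_csv(orders_csv, parse_dates=["order_date"])  # hypothetical columns
    customers = pd.read_json(customers_json)                      # hypothetical structure
    # Standardize column names before integrating the two sources.
    orders.columns = [c.strip().lower() for c in orders.columns]
    customers.columns = [c.strip().lower() for c in customers.columns]
    combined = orders.merge(customers, on="customer_id", how="left")
    return combined

if __name__ == "__main__":
    model = build_common_model("orders.csv", "customers.json")
    model.to_csv("conformed_orders.csv", index=False)  # staged for bulk load into the warehouse
```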

Posted 1 week ago

Apply

0 years

4 - 8 Lacs

Hyderābād

On-site

ABOUT FLUTTER ENTERTAINMENT
Flutter Entertainment is the world's largest sports betting and iGaming operator, with 13.9 million Average Monthly Players worldwide and annual revenue of $14Bn in 2024. We have a portfolio of iconic brands, including Paddy Power, Betfair, FanDuel, PokerStars, Junglee Games and Sportsbet. Flutter Entertainment is listed on both the New York Stock Exchange (NYSE) and the London Stock Exchange (LSE). In 2024, we were recognized in TIME's 100 Most Influential Companies under the 'Pioneers' category - a testament to our innovation and impact.
Our ambition is to transform global gaming and betting to deliver long-term growth and a positive, sustainable future for our sector. Together, we are Changing the Game. Working at Flutter is a chance to work with a growing portfolio of brands across a range of opportunities. We will support you every step of the way to help you grow. Just like our brands, we ensure our people have everything they need to succeed.

FLUTTER ENTERTAINMENT INDIA
Our Hyderabad office, located in one of India's premier technology parks, is the Global Capability Center for Flutter Entertainment. A center of expertise and innovation, this hub is now home to 900+ talented colleagues working across Customer Service Operations, Data and Technology, Finance Operations, HR Operations, Procurement Operations, and other key enabling functions. We are committed to crafting impactful solutions for all our brands and divisions to power Flutter's incredible growth and global impact. With the scale of a leader and the mindset of a challenger, we're dedicated to creating a brighter future for our customers, colleagues, and communities.

ROLE PURPOSE:
At Flutter, we are embarking on an ambitious global finance transformation programme throughout 2025, 2026 and 2027. The Technology Enablement and Automation Manager will be responsible for delivering elements of the ICFR pillar of the global finance transformation programme, and will report, directly or indirectly, to the Head of Technology Enablement and Automation Transformation.
Flutter consists of two commercial divisions (FanDuel and International) and our central Flutter Functions: COO, Finance & Legal. Here in Flutter Functions we work with colleagues across all our divisions and regions to deliver something we call the Flutter Edge. It's our competitive advantage, our 'secret sauce', which plays a key part in our ongoing success and powers our brands and divisions through Product, Tech, Expertise and Scale. In Flutter Finance we pride ourselves on providing global expertise to ensure Flutter is financially healthy, utilizing our Flutter Edge to turbo-charge our capabilities.

KEY RESPONSIBILITIES
Design, develop, launch and maintain custom technical solutions, including workflow automations, reporting pipelines/dashboards and cloud systems integrations, focused on improving and enabling Flutter's Internal Controls over Financial Reporting (ICFR) annual cycle
Bring your technical know-how to continuously improve Finance and IT processes and controls (for example, balance sheet reconciliations, GRC tool enablement and analytical reviews)
Prepare and maintain high-quality documentation related to your automation and reporting deliverables
Contribute to robust technical delivery processes for the ICFR Transformation Technology Enablement & Automation team
Collaborate closely with Internal Controls Transformation and Internal Controls Assurance teams, and with colleagues across Finance and IT (Group and Divisional teams), to ensure seamless delivery of the technical solutions, automations and reporting that you own
Contribute to regular status reporting to senior leaders, highlighting potential challenges and opportunities for improvement

TO EXCEL IN THIS ROLE, YOU WILL NEED TO HAVE
Passion for technical solution delivery, and for learning new technologies
Strong technology architecture, design, development, deployment and maintenance skills
Demonstrable coding experience launching workflow automations and reporting solutions using SQL and Python (or equivalent programming languages) with measurable business impact
Proficiency with databases, data pipelines, data cleansing and data visualization/business intelligence (including ETL), using tools such as KNIME, Pentaho, Alteryx, Power Automate, Databricks, Tableau or Power BI (or equivalent)
Hands-on technical experience and confidence in implementing at least one of:
System integrations, ideally across both on-premises and cloud-based applications (including Application Integration Patterns and microservices orchestration)
Robotic process automation, such as Alteryx, UiPath, Blue Prism (or equivalent)
Low-code application development, such as Retool (or equivalent)
Business process orchestration / business process management, such as Appian, Pega, Signavio, Camunda (or equivalent)
Business process mining and continuous controls monitoring, such as Celonis, Soroco or Anecdotes (or equivalent)
Ability to operate in a fast-paced environment and successfully deliver technical change
Strong communication skills, clearly articulating technical challenges and potential solutions

It will be advantageous, but not essential, to have one or more of:
Experience improving processes focused on reducing risk (e.g. ICFR / internal controls / audit / risk and compliance)
Experience of betting, gaming or online entertainment businesses
Experience bringing Artificial Intelligence (AI) solutions to improve enterprise business processes
Knowledge of Oracle ERP (e.g. Oracle Fusion and Oracle Governance, Risk and Compliance modules)
Knowledge of Governance, Risk and Compliance systems

BENEFITS WE OFFER
Access to Learnerbly, Udemy, and a Self-Development Fund for upskilling
Career growth through Internal Mobility Programs
Comprehensive Health Insurance for you and your dependents
Well-Being Fund and 24/7 Assistance Program for holistic wellness
Hybrid Model: 2 office days/week with flexible leave policies, including maternity, paternity, and sabbaticals
Free Meals, Cab Allowance, and a Home Office Setup Allowance
Employer PF Contribution, gratuity, Personal Accident & Life Insurance
Sharesave Plan to purchase discounted company shares
Volunteering Leave and Team Events to build connections
Recognition through the Kudos Platform and Referral Rewards

WHY CHOOSE US:
Flutter is an equal-opportunity employer and values the unique perspectives and experiences that everyone brings. Our message to colleagues and stakeholders is clear: everyone is welcome, and every voice matters. We have ambitious growth plans and goals for the future. Here's an opportunity for you to play a pivotal role in shaping the future of Flutter Entertainment India.
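The listing cites balance sheet reconciliations as a typical control to automate. As a hedged, minimal sketch of that idea, the Python example below compares general-ledger balances with a sub-ledger extract and flags differences over a threshold; the file names, column names and tolerance are assumptions, not details from the role.

```python
# Illustrative sketch only: file names, column names and the tolerance are assumptions.
import pandas as pd

TOLERANCE = 1.00  # assumed materiality threshold for flagging differences

def reconcile(gl_csv: str, subledger_csv: str) -> pd.DataFrame:
    """Flag accounts where the general-ledger balance diverges from the sub-ledger balance."""
    gl = pd.read_csv(gl_csv)          # expects columns: account_id, gl_balance
    sub = pd.read_csv(subledger_csv)  # expects columns: account_id, subledger_balance
    merged = gl.merge(sub, on="account_id", how="outer").fillna(0.0)
    merged["difference"] = merged["gl_balance"] - merged["subledger_balance"]
    exceptions = merged[merged["difference"].abs() > TOLERANCE]
    return exceptions.sort_values("difference", ascending=False)

if __name__ == "__main__":
    exceptions = reconcile("gl_balances.csv", "subledger_balances.csv")
    exceptions.to_csv("reconciliation_exceptions.csv", index=False)
    print(f"{len(exceptions)} account(s) breach the tolerance")
```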

Posted 1 week ago

Apply

0 years

0 - 0 Lacs

India

Remote

Fullstack (Java + Angular)
Experience required: 5 yrs
Immediate joiners
Work Location: Ahmedabad or Pune
1st location: 6th Floor, Unit No. 601-609, Spinel, Opp. Kargil Petrol Pump, Sarkhej-Gandhinagar Highway, Ahmedabad, Gujarat - 380060
2nd location: Tower B1, Level 5, Office No. 4, Symphony IT Park, Nanded City, Pune - 411068, Maharashtra, India

Detailed JD:
5+ years of experience with a minimum of a bachelor's degree in Computer Science.
Technical Skillset:
Java 8+, JavaScript, TypeScript
Spring Boot, Spring MVC, Spring Web Services, Spring Data, Hibernate, Jasper Reports
Angular 8+, React 16+
Angular Material, Bootstrap 4, HTML5, CSS3, SCSS
Oracle SQL, PL/SQL development
Pentaho Kettle
Basic Linux scripting and troubleshooting
Git
Design patterns

Guidelines:
1. Passport is mandatory - if the candidate does not hold a passport, please request the candidate to apply for a Tatkal passport.
2. Need "immediate" joiners only.
3. Attach the assignment to the email - profiles need to be submitted along with the assignment.

Job Types: Contractual / Temporary, Freelance
Contract length: 6 months
Pay: ₹60,000.00 - ₹80,000.00 per year
Benefits: Work from home
Work Location: In person

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

