
273 Pentaho Jobs - Page 5

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

2.0 - 5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

hackajob is collaborating with Comcast to connect them with exceptional tech professionals for this role. Comcast brings together the best in media and technology. We drive innovation to create the world's best entertainment and online experiences. As a Fortune 50 leader, we set the pace in a variety of innovative and fascinating businesses and create career opportunities across a wide range of locations and disciplines. We are at the forefront of change and move at an amazing pace, thanks to our remarkable people, who bring cutting-edge products and services to life for millions of customers every day. If you share in our passion for teamwork, our vision to revolutionize industries and our goal to lead the future in media and technology, we want you to fast-forward your career at Comcast.

Job Summary
Responsible for planning and designing new software and web applications. Edits new and existing applications. Implements, tests, and debugs defined software components. Documents all development activity. Works with moderate guidance in own area of knowledge.

Key Skills
- Advanced SQL (MySQL, Presto, Oracle, etc.)
- Data modeling (normalization and denormalization)
- ETL tools (Talend, Pentaho, Informatica, and creation of custom ETL scripts)
- Big data technologies (Hadoop, Spark, Hive, Kafka, etc.)
- Data warehousing (AWS, BigQuery, etc.)
- Reporting (Tableau, Power BI)

Core Responsibilities
This data-focused role is expected to leverage these skills to design and implement robust data solutions, play a key role in mentoring junior team members, and ensure the quality and efficiency of data processes. Skills in data visualization tools like Tableau and Power BI are expected; knowledge of data quality principles is good to have.
- Analyzes and determines integration needs.
- Evaluates and plans software designs, test results, and technical manuals.
- Reviews literature, patents, and current practices relevant to the solution of assigned projects.
- Programs new software and web applications; supports new applications under development and the customization of current applications.
- Edits and reviews technical requirements documentation.
- Works with the Quality Assurance team to determine whether applications fit specification and technical requirements.
- Displays knowledge of engineering methodologies, concepts, and skills, and their application in the area of the specified engineering specialty.
- Displays knowledge of, and ability to apply, process design and redesign skills.
- Displays in-depth knowledge of, and ability to apply, project management skills.
- Consistent exercise of independent judgment and discretion in matters of significance.
- Regular, consistent, and punctual attendance. Must be able to work nights and weekends, variable schedule(s), and overtime as necessary.
- Other duties and responsibilities as assigned.

Employees at all levels are expected to:
- Understand our Operating Principles; make them the guidelines for how you do your job.
- Own the customer experience - think and act in ways that put our customers first, give them seamless digital options at every touchpoint, and make them promoters of our products and services.
- Know your stuff - be enthusiastic learners, users and advocates of our game-changing technology, products and services, especially our digital tools and experiences.
- Win as a team - make big things happen by working together and being open to new ideas.
- Be an active part of the Net Promoter System - a way of working that brings more employee and customer feedback into the company - by joining huddles, making call backs and helping us elevate opportunities to do better for our customers.
- Drive results and growth.
- Respect and promote inclusion & diversity.
- Do what's right for each other, our customers, investors and our communities.

Disclaimer
This information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities and qualifications. Comcast is proud to be an equal opportunity workplace. We will consider all qualified applicants for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, veteran status, genetic information, or any other basis protected by applicable law.

Base pay is one part of the Total Rewards that Comcast provides to compensate and recognize employees for their work. Most sales positions are eligible for a Commission under the terms of an applicable plan, while most non-sales positions are eligible for a Bonus. Additionally, Comcast provides best-in-class Benefits to eligible employees. We believe that benefits should connect you to the support you need when it matters most, and should help you care for those who matter most. That's why we provide an array of options, expert guidance and always-on tools that are personalized to meet the needs of your reality - to help support you physically, financially and emotionally through the big milestones and in your everyday life.

Education
Bachelor's Degree. While possessing the stated degree is preferred, Comcast also may consider applicants who hold some combination of coursework and experience, or who have extensive related professional experience.

Relevant Work Experience
2-5 years

Posted 2 weeks ago


5.0 years

0 Lacs

Gandhinagar, Gujarat, India

On-site

Experience: 5+ years
Notice Period: Immediate joiners only

Work Locations:
- Gandhinagar: 3A, Ground Floor, IT Tower 3, Infocity, Gandhinagar, Gujarat 382007
- Ahmedabad: 6th Floor, Unit No. 601-609, Spinel, Opp. Kargil Petrol Pump, Sarkhej-Gandhinagar Highway, Ahmedabad, Gujarat 380060
- Pune: Tower B1, Level 5, Office No. 4, Symphony IT Park, Nanded City, Pune 411068, Maharashtra, India

Key Responsibilities:
- Develop, enhance, and maintain scalable web applications using Java and Angular.
- Work on Spring Boot, Spring MVC, Spring Web Services, and Hibernate for backend development.
- Build responsive, user-friendly UIs using Angular 8+, Angular Material, Bootstrap 4, HTML5, CSS3, and SCSS.
- Perform database design, development, and optimization using Oracle SQL and PL/SQL.
- Integrate Jasper Reports and Pentaho Kettle for reporting and ETL processes.
- Ensure code quality, adherence to design patterns, and version control with Git.
- Troubleshoot and perform basic Linux scripting for deployments.
- Collaborate with cross-functional teams for requirement gathering and solution delivery.

Required Technical Skillset:
- Languages & Frameworks: Java 8+, JavaScript, TypeScript, Spring Boot, Spring MVC, Hibernate
- Frontend: Angular 8+, React 16+, Angular Material, Bootstrap 4, HTML5, CSS3, SCSS
- Database: Oracle SQL, PL/SQL
- ETL & Reporting: Pentaho Kettle, Jasper Reports
- Other: Git, basic Linux scripting, good understanding of design patterns

Educational Qualification: Bachelor's degree in Computer Science or equivalent

Additional Guidelines:
- A passport is mandatory; candidates without a passport must be willing to apply through the Tatkal Passport scheme.
- Only immediate joiners will be considered.

Posted 2 weeks ago


5.0 - 9.0 years

0 Lacs

Panchkula, Haryana

On-site

As a highly skilled Data Architect & ETL Engineer, you will play a crucial role in designing, implementing, and managing scalable data architectures and ETL pipelines. Your expertise in data, SQL, and ETL transformations will be essential in enabling high-quality business intelligence and analytics solutions.

Your responsibilities include:
- Designing and implementing scalable data architectures to support analytics, data integration, and reporting.
- Developing and maintaining ETL pipelines using tools like Pentaho, optimizing SQL queries, and building efficient data models for reporting and data warehousing.
- Collaborating with business analysts, data scientists, and application teams to ensure efficient data flow.
- Implementing data governance, quality control, and security measures.
- Developing interactive reports and dashboards using tools such as Tableau, QuickSight, or Power BI.
- Monitoring and troubleshooting data pipelines to ensure high availability and reliability.
- Documenting data flows, ETL processes, and architectural designs.

To excel in this role, you should have:
- A Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 5+ years of experience in data engineering, ETL development, or data architecture.
- Strong SQL skills and experience with relational databases.
- Hands-on experience with ETL transformation tools like Pentaho.
- Knowledge of data modeling, data warehousing, and BI best practices, and an understanding of data governance, metadata management, and data security.
- Strong problem-solving and communication skills.

Preferred skills include experience with cloud platforms (AWS, Azure, or GCP) and cloud-based data services, proficiency in Python or Java for data transformation and automation, and knowledge of CI/CD pipelines for data workflows. Join us for the opportunity to work with an amazing team on high-impact projects in the Fintech industry, where you will have the chance to learn and grow professionally.
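The pipeline-monitoring responsibility above usually starts with simple reconciliation checks after each load. Below is a minimal, illustrative sketch in Python; the table names, columns, and the specific checks are assumptions for demonstration, not part of this posting:

```python
import sqlite3

# Set up an in-memory database standing in for source and target systems.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_orders (id INTEGER, amount REAL);
CREATE TABLE tgt_orders (id INTEGER, amount REAL);
INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
INSERT INTO tgt_orders SELECT * FROM src_orders;  -- simulate the ETL load
""")

def reconcile(conn, src, tgt):
    """Compare row counts and amount totals between two tables."""
    s = conn.execute(f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {src}").fetchone()
    t = conn.execute(f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {tgt}").fetchone()
    return {"rows_match": s[0] == t[0], "totals_match": s[1] == t[1]}

result = reconcile(conn, "src_orders", "tgt_orders")
print(result)  # {'rows_match': True, 'totals_match': True}
```

A real deployment would run such checks against the production warehouse and alert on any mismatch rather than print.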

Posted 2 weeks ago


3.0 - 5.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

Remote

Title: L2 Application Support Engineer – AML Platform (Jocata)
Job Location: Remote
Experience: 3-5 years; immediate joiners preferred (within 0-15 days)
Work Schedule: 24x7 rotational shifts, weekend support, and on-call duties for critical escalations or batch failures.

Key Responsibilities

Technical Support & Maintenance
- Monitor and manage day-to-day operations of the AML system (Jocata GRID), including process schedulers, ETL jobs, alerts, rule engines, and dashboards.
- Provide Level 2 troubleshooting and perform root cause analysis (RCA) for issues escalated by L1 or end users.
- Manage incidents using ITSM tools (e.g., ServiceNow, Remedy), ensuring resolution within SLA and proper documentation.
- Coordinate with L3 vendor support (Jocata) for unresolved issues, patch deployments, and hotfixes.

Configuration & Rules Management
- Perform changes to alert thresholds, typologies, and rules per compliance team requirements.
- Assist in testing and deployment of AML rule configurations, scenario tuning, and performance impact analysis.
- Maintain version control of rule sets and workflow configurations using proper DevOps or change control protocols.

Data & Security Compliance
- Ensure the system adheres to internal data protection and external regulatory standards (FATF, RBI, FIU-IND).
- Monitor data feeds (KYC, transactions, customer profiles) from core banking, payments, and credit systems.
- Validate ETL pipelines and data reconciliation processes between Jocata and source systems (e.g., Finacle, UPI, NACH).

Monitoring, Reporting & Audit Support
- Generate or validate AML reports, STRs (Suspicious Transaction Reports), and CTRs (Cash Transaction Reports) for submission to regulators.
- Work with compliance teams during internal/external audits to retrieve logs, evidence, and reports.
- Maintain and improve health-check scripts, monitoring dashboards, and alerting systems (e.g., Grafana, ELK, SQL Monitor).

Technical Skills
- Operating Systems: Linux/Unix shell scripting, Windows Server admin basics
- Database: Oracle/SQL Server/PostgreSQL (complex queries, joins, views)
- ETL Tools: Pentaho, Talend, or custom ETL scripts
- Monitoring: ELK Stack, Prometheus, Nagios, or in-house tools
- Scripting: Shell, Python, or PowerShell (for automation/log parsing)
- ITSM Tools: ServiceNow, Jira Service Desk
- Regulatory Understanding: AML typologies, STR/CTR norms, RBI/FIU-IND standards

Soft Skills & Functional Knowledge
- Understanding of banking operations and AML workflow processes
- Ability to document standard operating procedures and knowledge base articles

Certifications
- Certified Anti-Money Laundering Specialist (CAMS) – optional
- ITIL Foundation (for incident and change management processes)
- Jocata GRID hands-on experience or certification (if applicable)
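The log-parsing and health-check skills listed above can be sketched in a few lines of Python. The log format and job name below are invented for illustration; real Jocata or Pentaho batch logs will differ:

```python
import re

# Hypothetical excerpt from a nightly ETL batch log (illustrative only).
LOG = """\
2024-01-15 02:00:01 INFO  Job txn_load started
2024-01-15 02:14:32 ERROR Step 'kyc_merge' failed: unique constraint violated
2024-01-15 02:14:33 INFO  Job txn_load finished with errors
"""

def summarize_log(text: str) -> dict:
    """Count log lines per severity level so failures stand out quickly."""
    counts: dict = {}
    for line in text.splitlines():
        m = re.match(r"\S+ \S+ (\w+)", line)  # date, time, then level
        if m:
            level = m.group(1)
            counts[level] = counts.get(level, 0) + 1
    return counts

summary = summarize_log(LOG)
print(summary)  # {'INFO': 2, 'ERROR': 1}
if summary.get("ERROR"):
    print("Escalate: batch job reported errors")
```

An L2 workflow would typically wire such a summary into a dashboard or alerting tool rather than print it.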

Posted 2 weeks ago


6.0 years

0 Lacs

Gurgaon

On-site

Our world is transforming, and PTC is leading the way. Our software brings the physical and digital worlds together, enabling companies to improve operations, create better products, and empower people in all aspects of their business. Our people make all the difference in our success. Today, we are a global team of nearly 7,000, and our main objective is to create opportunities for our team members to explore, learn, and grow – all while seeing their ideas come to life and celebrating the differences that make us who we are and the work we do possible.

About PTC: PTC (NASDAQ: PTC) enables global manufacturers to achieve significant digital transformation through our market-leading software solutions. We empower customers to innovate faster, improve operations, and drive business growth—whether on-premises, in the cloud, or through our SaaS platform. At PTC, we don't just imagine a better world—we enable it.

Role Overview: As a Senior Technical Support Specialist, you will serve as a key technical advisor and escalation point within the Servigistics Support organization. You will bring your rich industry experience to drive strategic customer success, mentor junior team members, and lead complex troubleshooting efforts. You will work cross-functionally with engineering, product management, and customer teams to ensure seamless and proactive technical support delivery.

Key Responsibilities:
- Serve as the primary technical contact for high-priority and complex customer escalations.
- Lead resolution of mission-critical issues involving product functionality, performance, and deployment.
- Partner with global cross-functional teams to ensure holistic and timely resolution of customer challenges.
- Proactively identify and drive improvements in support processes and product usability.
- Contribute to and review KCS-aligned knowledge articles and promote customer self-service strategies.
- Collaborate with product and engineering teams to influence the product roadmap based on customer feedback and insights.
- Mentor and guide junior technical support engineers; provide coaching and best practices.
- Represent support in customer meetings, escalations, and business reviews.
- Maintain high SLA compliance for enterprise customers with complex environments.
- Be available to work 24x7 on a rotational basis, with willingness to support weekend shifts when scheduled, ensuring readiness for global support needs.

Required Skills & Competencies:
- Strong experience in diagnosing and resolving enterprise-grade application issues across multiple layers (web, application, and database).
- Deep expertise in SQL (Oracle and SQL Server), with the ability to write and optimize complex queries.
- Hands-on experience with ETL tools (Informatica, IICS, Kettle/Pentaho) and resolving batch job failures.
- Solid understanding of open-source web technologies such as Apache Tomcat and Apache Web Server.
- Experience in performance tuning, server configuration, log analysis, and application scalability.
- Knowledge of Java-based enterprise applications and the implementation or support lifecycle.
- Familiarity with enterprise IT environments (networks, load balancing, security protocols, integrations).
- Proven ability to work independently under pressure while managing multiple complex issues.

Preferred Qualifications:
- Experience with UNIX/Linux environments and command-line utilities.
- Knowledge of cloud platforms such as AWS, including services like S3.
- Exposure to machine learning concepts and their integration within enterprise systems.
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 6+ years of relevant technical support, implementation, or consulting experience in enterprise software.
- Excellent written and verbal communication skills; able to interact confidently with senior stakeholders.

Why Join PTC?
Work with innovative products and talented global teams in a collaborative and inclusive culture where your voice matters. Extensive benefits include: best-in-class insurance; employee stock purchase plan and RSUs; generous PTO and paid parental leave; flexible work hours and no probation clause; career growth opportunities and higher education support.

Life at PTC is about more than working with today's most cutting-edge technologies to transform the physical world. It's about showing up as you are and working alongside some of today's most talented industry leaders to transform the world around you. If you share our passion for problem-solving through innovation, you'll likely become just as passionate about the PTC experience as we are. Are you ready to explore your next career move with us? We respect the privacy rights of individuals and are committed to handling Personal Information responsibly and in accordance with all applicable privacy and data protection laws. Review our Privacy Policy here.

Posted 2 weeks ago


5.0 - 7.0 years

25 - 40 Lacs

Gurugram

Work from Office

Our world is transforming, and PTC is leading the way. Our software brings the physical and digital worlds together, enabling companies to improve operations, create better products, and empower people in all aspects of their business. Our people make all the difference in our success. Today, we are a global team of nearly 7,000, and our main objective is to create opportunities for our team members to explore, learn, and grow – all while seeing their ideas come to life and celebrating the differences that make us who we are and the work we do possible.

About PTC: PTC (NASDAQ: PTC) enables global manufacturers to achieve significant digital transformation through our market-leading software solutions. We empower customers to innovate faster, improve operations, and drive business growth—whether on-premises, in the cloud, or through our SaaS platform. At PTC, we don't just imagine a better world—we enable it.

Role Overview: As a Senior Technical Support Specialist, you will serve as a key technical advisor and escalation point within the Servigistics Support organization. You will bring your rich industry experience to drive strategic customer success, mentor junior team members, and lead complex troubleshooting efforts. You will work cross-functionally with engineering, product management, and customer teams to ensure seamless and proactive technical support delivery.

Key Responsibilities:
- Serve as the primary technical contact for high-priority and complex customer escalations.
- Lead resolution of mission-critical issues involving product functionality, performance, and deployment.
- Partner with global cross-functional teams to ensure holistic and timely resolution of customer challenges.
- Proactively identify and drive improvements in support processes and product usability.
- Contribute to and review KCS-aligned knowledge articles and promote customer self-service strategies.
- Collaborate with product and engineering teams to influence the product roadmap based on customer feedback and insights.
- Mentor and guide junior technical support engineers; provide coaching and best practices.
- Represent support in customer meetings, escalations, and business reviews.
- Maintain high SLA compliance for enterprise customers with complex environments.
- Be available to work 24x7 on a rotational basis, with willingness to support weekend shifts when scheduled, ensuring readiness for global support needs.

Required Skills & Competencies:
- Strong experience in diagnosing and resolving enterprise-grade application issues across multiple layers (web, application, and database).
- Deep expertise in SQL (Oracle and SQL Server), with the ability to write and optimize complex queries.
- Hands-on experience with ETL tools (Informatica, IICS, Kettle/Pentaho) and resolving batch job failures.
- Solid understanding of open-source web technologies such as Apache Tomcat and Apache Web Server.
- Experience in performance tuning, server configuration, log analysis, and application scalability.
- Knowledge of Java-based enterprise applications and the implementation or support lifecycle.
- Familiarity with enterprise IT environments (networks, load balancing, security protocols, integrations).
- Proven ability to work independently under pressure while managing multiple complex issues.

Preferred Qualifications:
- Experience with UNIX/Linux environments and command-line utilities.
- Knowledge of cloud platforms such as AWS, including services like S3.
- Exposure to machine learning concepts and their integration within enterprise systems.
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 6+ years of relevant technical support, implementation, or consulting experience in enterprise software.
- Excellent written and verbal communication skills; able to interact confidently with senior stakeholders.

Why Join PTC?
Work with innovative products and talented global teams in a collaborative and inclusive culture where your voice matters. Extensive benefits include: best-in-class insurance; employee stock purchase plan and RSUs; generous PTO and paid parental leave; flexible work hours and no probation clause; career growth opportunities and higher education support.

Life at PTC is about more than working with today's most cutting-edge technologies to transform the physical world. It's about showing up as you are and working alongside some of today's most talented industry leaders to transform the world around you. If you share our passion for problem-solving through innovation, you'll likely become just as passionate about the PTC experience as we are. Are you ready to explore your next career move with us? We respect the privacy rights of individuals and are committed to handling Personal Information responsibly and in accordance with all applicable privacy and data protection laws. Review our Privacy Policy here.

Posted 2 weeks ago


3.0 - 9.0 years

5 - 11 Lacs

Bengaluru

Work from Office

Primary skills: Data Analysis, Database Design, Data Warehouse Design, Architecture Roadmap, Data Engineering, SQL Server, Snowflake, Azure Data Factory.

Should have strong communication and stakeholder-interaction skills, with experience handling large data engineering and analytics projects. Should be able to understand customer requirements, design solutions, and present them to both technical and non-technical audiences. Should be able to work with stakeholders to define the architecture roadmap for the project. Should be able to manage and lead a team while also contributing individually toward design, PoCs, coordination, etc.

Secondary skills: MySQL, SSIS, Pentaho, and Power BI.

Posted 2 weeks ago


3.0 - 6.0 years

8 - 13 Lacs

Hyderabad

Work from Office

Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Senior Software Engineer.

In this role, you will:
- Provide expert technical guidance and solutions to the POD for complex business problems.
- Design, develop, and implement technical solutions, ensuring they meet business requirements and are scalable and maintainable.
- Troubleshoot and resolve escalated technical issues promptly.
- Provide risk assessments for new functionality and enhancements.
- As an ITSO (IT Service Owner), complete BOW tasks within the timelines and ensure that your application services are vulnerability, ICE, resiliency, and contingency testing compliant.
- As an ITSO, ensure that applications have an effective escalation and support framework in place for all IT production incidents, one that meets the agreed operational and service level agreements of the business.
- Be accountable for leading the POD.
- Bring sound knowledge of corporate finance, including interest rate risk in the banking book.
- Bring experience with Agile delivery methodologies (JIRA, Scrum, FDD, SAFe) and DevOps tools (Jenkins, Ansible, Git).

Requirements:
- Graduation in technology (B.E., B.Tech, or above) with 5+ years of IT experience.
- Strong knowledge of the Pentaho ETL tool, including MapReduce build knowledge.
- Experience writing complex SQL queries.
- Good knowledge of shell scripting, Python, and Java.
- Exposure to Hadoop and Big Data is a plus.
- Infrastructure as Code and CI/CD: Git, Ansible, Jenkins.
- Experience working in Agile/DevOps environments: monitoring, alerting, incident tracking, reporting, etc.
- A good understanding of Google Cloud and exposure to the latest tools/technologies is an add-on.

Posted 2 weeks ago


0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

About the Company
iLink Digital is a Global Software Solution Provider and Systems Integrator that delivers next-generation technology solutions to help clients solve complex business challenges, improve organizational effectiveness, increase business productivity, realize sustainable enterprise value, and transform their business inside out. iLink integrates software systems and develops custom applications, components, and frameworks on the latest platforms for IT departments, commercial accounts, application service providers (ASPs), and independent software vendors (ISVs). iLink solutions are used in a broad range of industries and functions, including healthcare, telecom, government, oil and gas, education, and life sciences. iLink's expertise includes cloud computing & application modernization, data management & analytics, enterprise mobility, portal, collaboration & social employee engagement, embedded systems, and user experience design. What makes iLink's offerings unique is the fact that we use pre-created frameworks designed to accelerate software development and implementation of business processes for our clients. iLink has over 60 frameworks (solution accelerators), both industry-specific and horizontal, that can be easily customized and enhanced to meet current business challenges.

Responsibilities
- ETL development to transform data.
- Coordinate with the engineering and development teams to gather their engineering- and data-related requirements.
- Gather data from various data sources and work with the engineering team on any gaps.
- Meet with stakeholders for demos and data-related discussions.

Required Skill Set
- Expertise in Pentaho ETL transformations.
- Expertise in SQL and Oracle.
- Good knowledge of APIs.
- Experience with other Halliburton products is a strong plus.
- Experience in Pentaho Reporting is a plus.
- Good troubleshooting skills.

Benefits
- Competitive salaries
- Medical insurance
- Employee referral bonuses
- Performance-based bonuses
- Flexible work options & fun culture
- Robust learning & development programs
- In-house technology training

Posted 2 weeks ago


5.0 years

7 - 9 Lacs

Hyderabad

On-site

Job description
Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Senior Software Engineer.

In this role, you will:
- Provide expert technical guidance and solutions to the POD for complex business problems.
- Design, develop, and implement technical solutions, ensuring they meet business requirements and are scalable and maintainable.
- Troubleshoot and resolve escalated technical issues promptly.
- Provide risk assessments for new functionality and enhancements.
- As an ITSO (IT Service Owner), complete BOW tasks within the timelines and ensure that your application services are vulnerability, ICE, resiliency, and contingency testing compliant.
- As an ITSO, ensure that applications have an effective escalation and support framework in place for all IT production incidents, one that meets the agreed operational and service level agreements of the business.
- Be accountable for leading the POD.
- Bring sound knowledge of corporate finance, including interest rate risk in the banking book.
- Bring experience with Agile delivery methodologies (JIRA, Scrum, FDD, SAFe) and DevOps tools (Jenkins, Ansible, Git).

Requirements
To be successful in this role, you should meet the following requirements:
- Graduation in technology (B.E., B.Tech, or above) with 5+ years of IT experience.
- Strong knowledge of the Pentaho ETL tool, including MapReduce build knowledge.
- Experience writing complex SQL queries.
- Good knowledge of shell scripting, Python, and Java.
- Exposure to Hadoop and Big Data is a plus.
- Infrastructure as Code and CI/CD – Git, Ansible, Jenkins.
- Experience working in Agile/DevOps environments: monitoring, alerting, incident tracking, reporting, etc.
- A good understanding of Google Cloud and exposure to the latest tools/technologies is an add-on.

You'll achieve more when you join HSBC. www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued and respected and their opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working, and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by HSBC Software Development India

Posted 2 weeks ago


5.0 - 9.0 years

0 Lacs

Panchkula, Haryana

On-site

You are a highly skilled Data Architect & ETL Engineer with a strong understanding of data, SQL expertise, and experience in ETL transformations and analytics tools. Your role involves designing, implementing, and managing scalable data architectures and ETL pipelines to enable high-quality business intelligence and analytics solutions.

Your responsibilities include:
- Designing and implementing scalable data architectures to support analytics, data integration, and reporting.
- Developing and maintaining ETL pipelines using Pentaho or other ETL transformation tools.
- Collaborating with business analysts, data scientists, and application teams to ensure efficient data flow.
- Optimizing SQL queries and building efficient data models for reporting and data warehousing, such as star/snowflake schemas.
- Implementing data governance, quality control, and security measures.
- Developing interactive reports and dashboards using Tableau, QuickSight, or Power BI.
- Monitoring and troubleshooting data pipelines to ensure high availability and reliability.
- Documenting data flows, ETL processes, and architectural designs.

You should possess:
- A Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 5+ years of experience in data engineering, ETL development, or data architecture.
- Strong SQL skills and experience with relational databases (PostgreSQL, MySQL, SQL Server, Oracle).
- Hands-on experience with Pentaho or similar ETL transformation tools.
- Experience with Tableau, QuickSight, or Power BI for data visualization and reporting.
- Knowledge of data modeling, data warehousing, and BI best practices, along with an understanding of data governance, metadata management, and data security.
- Strong problem-solving and communication skills.

Preferred skills include experience with cloud platforms (AWS, Azure, or GCP) and cloud-based data services, as well as experience in Python or Java for data transformation and automation. Knowledge of CI/CD pipelines for data workflows is a plus. Joining the team offers the opportunity to work on high-impact projects in the Fintech industry, providing valuable learning experiences.

Posted 2 weeks ago

Apply

3.0 - 6.0 years

5 - 9 Lacs

Pune

Work from Office

3100 Senior Software Developer - Intech

Qualification: 5+ years of experience with a minimum bachelor's degree in Computer Science.

Technical Skillset:
- Java 8+, JavaScript, TypeScript
- Spring Boot, Spring MVC, Spring Web Services, Spring Data, Hibernate, JasperReports
- Angular 8+, React 16+
- Angular Material, Bootstrap 4, HTML5, CSS3, SCSS
- Oracle SQL, PL/SQL development
- Pentaho Kettle
- Basic Linux scripting and troubleshooting
- Git
- Design patterns

Posted 2 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Fullstack (Java + Angular)

Experience required: 5+ years, with a minimum bachelor's degree in Computer Science. Positions: 3. Budget: 100K/month. Immediate joiners only. Work location: Ahmedabad or Pune.

Technical Skillset:
- Java 8+, JavaScript, TypeScript
- Spring Boot, Spring MVC, Spring Web Services, Spring Data, Hibernate, JasperReports
- Angular 8+, React 16+
- Angular Material, Bootstrap 4, HTML5, CSS3, SCSS
- Oracle SQL, PL/SQL development
- Pentaho Kettle
- Basic Linux scripting and troubleshooting
- Git
- Design patterns

Guidelines:
1. Passport is mandatory. If the candidate does not hold a passport, please request the candidate to apply for a Tatkal passport.
2. Need "immediate" joiners only.

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

Key Responsibilities: A day in the life of an Infosys Equinox employee. As part of the Infosys Equinox delivery team, your primary role is to ensure effective design, development, validation, and support activities so that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications, understand the client requirements in detail, and translate them into system requirements. Expected skills include: a clear understanding of HTTP and network protocol concepts, designs, and operations (TCP dump, cookies, sessions, headers, client-server architecture); core strength in Linux and Azure infrastructure provisioning, including VNet, Subnet, Gateway, VM, security groups, MySQL, Blob Storage, Azure Cache, AKS clusters, etc.; expertise in automating infrastructure as code using Terraform, Packer, Ansible, shell scripting, and Azure DevOps; expertise with patch management and APM tools such as AppDynamics and Instana for monitoring and alerting; and knowledge of technologies including Apache Solr, MySQL, Mongo, ZooKeeper, RabbitMQ, Pentaho, etc. Knowledge of cloud platforms including AWS and GCP is an added advantage, as is experience implementing auto-scaling, DR, HA, and multi-region deployments with best practices. You should be able to identify and automate recurring tasks for better productivity, understand and implement industry-standard security solutions, and work under pressure while managing expectations from various key stakeholders. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers, and you will be a key contributor to building efficient programs.

Technical Requirements: Ability to grasp cloud platforms (AWS, Azure, GCP), Kubernetes, and containerization for scalable deployments. Basic knowledge of performance testing tools like JMeter, LoadRunner, or similar. Good expertise in any of the programming languages like Java, Python, C, or C++. Ability to analyze system metrics using profiling and monitoring tools like Instana, Dynatrace, Prometheus, and Grafana.

Additional Responsibilities: Ability to identify bottlenecks and debug hotspots to optimize performance. Continuous learning of the latest trends in performance engineering frameworks and methodologies.

Preferred Skills: Technology->Analytics - Packages->Python - Big Data, Technology->Infra_ToolAdministration-Others->Loadrunner, Technology->Java->Java - ALL, Technology->Performance Testing->Performance Engineering->Apache Jmeter, Technology->Performance Testing->Performance Testing - ALL
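The bottleneck analysis this role describes relies on commercial APM tools (Instana, Dynatrace), but the underlying workflow, time a suspect code path, then attribute the cost to individual functions, can be sketched with Python's standard library alone. The functions and numbers below are purely illustrative, not part of any listing:

```python
import cProfile
import io
import pstats
import timeit

def slow_concat(n):
    # Quadratic string building -- a classic hotspot.
    s = ""
    for i in range(n):
        s += str(i)
    return s

def fast_concat(n):
    # Linear alternative using str.join.
    return "".join(str(i) for i in range(n))

# timeit gives a quick wall-clock comparison of the two variants.
t_slow = timeit.timeit(lambda: slow_concat(2000), number=50)
t_fast = timeit.timeit(lambda: fast_concat(2000), number=50)

# cProfile attributes time to individual functions -- the same idea
# APM tools apply to whole services.
buf = io.StringIO()
profiler = cProfile.Profile()
profiler.enable()
slow_concat(2000)
profiler.disable()
pstats.Stats(profiler, stream=buf).sort_stats("cumulative").print_stats(5)

print(f"slow={t_slow:.4f}s fast={t_fast:.4f}s")
```

The same loop, measure, compare, attribute cycle scales up to JMeter load tests and Grafana dashboards; only the tooling changes.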

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Panchkula, Haryana, India

On-site

Job Summary We are seeking a highly skilled Data Architect & ETL Engineer with a strong understanding of data, SQL expertise, and experience in ETL transformations and analytics tools. The ideal candidate will design, implement, and manage scalable data architectures and ETL pipelines that enable high-quality business intelligence and analytics solutions. Key Responsibilities Design and implement scalable data architectures to support analytics, data integration, and reporting. Develop and maintain ETL pipelines using Pentaho or other ETL transformation tools. Ensure efficient data flow by collaborating with business analysts, data scientists, and application teams. Optimize SQL queries and build efficient data models for reporting and data warehousing (e.g., star/snowflake schema). Implement data governance, quality control, and security measures. Develop interactive reports and dashboards using Tableau, QuickSight, or Power BI. Monitor and troubleshoot data pipelines to ensure high availability and reliability. Document data flows, ETL processes, and architectural designs. Required Skills and Qualifications Bachelor's or Master’s degree in Computer Science, Information Systems, or a related field. 5+ years of experience in data engineering, ETL development, or data architecture. Strong SQL skills and experience with relational databases (PostgreSQL, MySQL, SQL Server, Oracle). Hands-on experience with Pentaho or similar ETL transformation tools. Experience with Tableau, QuickSight, or Power BI for data visualization and reporting. Knowledge of data modeling, data warehousing, and BI best practices. Understanding of data governance, metadata management, and data security. Strong problem-solving and communication skills. Preferred Skills Experience with cloud platforms (AWS, Azure, or GCP) and cloud-based data services. Experience in Python or Java for data transformation and automation. Knowledge of CI/CD pipelines for data workflows. Why Join Us? 
An amazing team, high-impact projects, and extensive learning opportunities in the Fintech domain.
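The star schema this role mentions puts measures in a central fact table and descriptive attributes in surrounding dimension tables. A minimal sketch using Python's built-in sqlite3 as a stand-in warehouse (all table names, columns, and data are hypothetical):

```python
import sqlite3

# In-memory database standing in for a data warehouse.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimensions hold descriptive attributes; the fact table holds
# foreign keys into the dimensions plus the numeric measures.
cur.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, day TEXT, month TEXT);
CREATE TABLE fact_sales  (
    sale_id    INTEGER PRIMARY KEY,
    product_id INTEGER REFERENCES dim_product(product_id),
    date_id    INTEGER REFERENCES dim_date(date_id),
    amount     REAL
);
""")

cur.executemany("INSERT INTO dim_product VALUES (?,?,?)",
                [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
cur.executemany("INSERT INTO dim_date VALUES (?,?,?)",
                [(10, "2024-01-01", "2024-01"), (11, "2024-01-02", "2024-01")])
cur.executemany("INSERT INTO fact_sales VALUES (?,?,?,?)",
                [(100, 1, 10, 25.0), (101, 2, 10, 40.0), (102, 1, 11, 35.0)])

# A typical reporting query: join the fact to its dimensions and aggregate.
rows = cur.execute("""
    SELECT p.category, d.month, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    JOIN dim_date    d ON d.date_id    = f.date_id
    GROUP BY p.category, d.month
""").fetchall()
print(rows)  # [('Hardware', '2024-01', 100.0)]
```

A snowflake schema differs only in that dimensions are further normalized (e.g., `dim_product` referencing a separate `dim_category` table), trading wider joins for less redundancy.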

Posted 3 weeks ago

Apply

12.0 - 14.0 years

30 - 35 Lacs

Pune

Work from Office

You will work within the Data Engineering team and with a POD of Hadoop Data Engineers aligned to priorities from the Product Owner. You are expected to support existing applications as well as design and build new ones. You will be part of an Agile team, take on complex problems, and design and code. You are expected to participate in technical innovation within your product area. You will support issue resolution and improve processing performance; ensure the use of SQL, Hive, Pentaho, and Control-M reduces lead time to delivery and aligns with the overall Group strategic direction so that cross-functional development is usable; take ownership of providing solutions and tools that iteratively increase engineering efficiencies; work with the Product Owner and Solution & Platform Architects to identify required changes and create and agree the necessary stories; work with the Agile Lead(s) to ensure efficient flow of the backlog of change activities; and support the solution architect to ensure that solutions and services are supported by the right architectures and systems.

Requirements: 12-14 years of overall hands-on IT experience, with mandatory exposure of 5-7 years in Big Data, Hadoop, Pentaho ETL, and data pipelines. Good knowledge of industry best practices for ETL design, principles, and concepts. Experience in Python or another mainstream programming language. Technical skills: ETL/Pentaho, Hive/SQL, Unix shell scripting, Control-M (or similar). Ability to work independently on specialized assignments within the context of project deliverables. Excellent verbal and written communication skills with the ability to effectively advocate technical solutions. Experience in Agile ways of working. A demonstrable track record of dealing well with ambiguity, prioritizing needs, and delivering results in a dynamic environment. Flexibility to work in shifts and provide on-call support, owning the smooth operation of applications and systems in a production environment.

Nice to have: Experience with Big Data cluster monitoring tools such as Ambari, Cloudera Manager, or Accel Pulse Data, and deriving insights for performance tuning or monitoring.

Posted 3 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Karnataka

On-site

The ideal candidate will lead and drive development in the BI domain using the Tableau ecosystem, with deep technical and BI-ecosystem knowledge. You will be responsible for dashboard design, development, and delivery of BI services using the Tableau ecosystem. Your key functions and responsibilities will include communicating with the Project Manager to understand requirements; designing, developing, and deploying dashboards; ensuring timely delivery while maintaining quality; staying updated with current technologies; and proactively working with the management team to identify and resolve issues. Additionally, you will contribute to dashboard design, R&D, and project delivery using Tableau. As a leader, you will set the standard and expectations through your conduct, work ethic, integrity, and character.

Your academic background should ideally include a Bachelor's degree in Computer Science; a Master's degree is an added advantage. You should have 2-5 years of experience in DWBI development projects, with at least 2 years of experience in BI and visualization technologies, specifically Tableau and QlikView, and at least 2 years of experience covering the Tableau implementation lifecycle, including hands-on development/programming, managing security, data modeling, and data blending.

Your technology and skills should include hands-on expertise in Tableau administration and maintenance; strong working knowledge of and development experience with Tableau Server and Desktop; proficiency in SQL, PL/SQL, and data modeling; knowledge of databases like Microsoft SQL Server and Oracle; exposure to alternative visualization technologies such as QlikView, Spotfire, and Pentaho; good communication and analytical skills; excellent creative and conceptual thinking abilities; superior organizational skills; attention to detail and quality; and strong verbal and written communication skills.

This position is located in Bangalore.

Posted 3 weeks ago

Apply

12.0 - 14.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Description

Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Senior Consultant Specialist.

In this role, you will work within the Data Engineering team and with a POD of Hadoop Data Engineers aligned to priorities from the Product Owner. You are expected to support existing applications as well as design and build new ones. You will be part of an Agile team, take on complex problems, and design and code. You are expected to participate in technical innovation within your product area. You will support issue resolution and improve processing performance; ensure the use of SQL, Hive, Pentaho, and Control-M reduces lead time to delivery and aligns with the overall Group strategic direction so that cross-functional development is usable; take ownership of providing solutions and tools that iteratively increase engineering efficiencies; work with the Product Owner and Solution & Platform Architects to identify required changes and create and agree the necessary stories; work with the Agile Lead(s) to ensure efficient flow of the backlog of change activities; and support the solution architect to ensure that solutions and services are supported by the right architectures and systems.

Requirements

To be successful in this role, you should meet the following requirements: 12-14 years of overall hands-on IT experience, with mandatory exposure of 5-7 years in Big Data, Hadoop, Pentaho ETL, and data pipelines. Good knowledge of industry best practices for ETL design, principles, and concepts. Experience in Python or another mainstream programming language. Technical skills: ETL/Pentaho, Hive/SQL, Unix shell scripting, Control-M (or similar). Ability to work independently on specialized assignments within the context of project deliverables. Excellent verbal and written communication skills with the ability to effectively advocate technical solutions. Experience in Agile ways of working. A demonstrable track record of dealing well with ambiguity, prioritizing needs, and delivering results in a dynamic environment. Flexibility to work in shifts and provide on-call support, owning the smooth operation of applications and systems in a production environment.

Nice to have: Experience with Big Data cluster monitoring tools such as Ambari, Cloudera Manager, or Accel Pulse Data, and deriving insights for performance tuning or monitoring.

You’ll achieve more when you join HSBC. www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.
Issued by – HSBC Software Development India

Posted 3 weeks ago

Apply

0.0 - 3.0 years

4 - 5 Lacs

Chennai

Work from Office

Intern: Year 3/4 College/University

Duties of Position: (Include specific duties and responsibilities)
1) Monitoring across production instances - on-prem & cloud applications
2) Optimize workload in Concurrent Manager - Oracle
3) Monitor ODI loads & executions
4) Oracle performance analysis
5) Schedule and maintain interface programs
6) Review systems' health check status
7) Monitor enterprise applications - Pentaho jobs, Postgres jobs, SSIS jobs, and applications hosted on Google Cloud

Qualifications Required: (Education, experience, skills, etc. Please be specific)
Basic knowledge of Oracle Applications 11i ERP system administration. Basic knowledge of Oracle SQL/PLSQL, Postgres, Java, and Google scripting. Ready to work in rotational shifts (24/7 operations). The employee is expected to stay in the organization for a longer duration. Educational qualification: BE/B.Tech/Bachelor's in Computer Science.

Posted 3 weeks ago

Apply

10.0 - 15.0 years

6 - 10 Lacs

Pune

Work from Office

Job Title: Pentaho Developer
Corporate Title: Associate
Location: Pune, India

Role Description
Developer with at least 10 years of experience. Experience in developing and deploying Pentaho-based applications. You will work on a data integration project, mostly batch-oriented, using Pentaho 9.3, Oracle, Hadoop, Python, PySpark, and similar technologies. Primary skills: Pentaho, SQL, Oracle, Python, PySpark, shell scripting. Experience: minimum of 10 years of hands-on experience in development projects.

What we'll offer you
100% reimbursement under childcare assistance benefit (gender neutral). Sponsorship for industry-relevant certifications and education. Accident and term life insurance.

Your key responsibilities
Hands-on developer in a Pentaho jobs/transformations development role. Participating in Agile development projects for batch data ingestion. Fast learner, in order to understand the current data landscape and the existing Spark and Hive HQL programs and make enhancements. Bug fixing and enhancements for the existing applications. Software upgrades and maintenance, including mandatory client connectivity upgrades. Closure of audit and operating control issues. Migration of out-of-support application software. Migration of applications to the cloud. Migration of applications from one technology to another.

Your skills and experience
Pentaho, SQL, Oracle, Python, PySpark, shell scripting. Basic knowledge of Unix shell scripting is a must. Good at writing database SQL to process batch jobs. Analytical SQL. Pentaho Big Data components. Hadoop (Hive and Spark) experience/knowledge. Know-how with cloud-based infrastructure. Expertise in unit testing.

About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm We at DWS are committed to creating a diverse and inclusive workplace, one that embraces dialogue and diverse views, and treats everyone fairly to drive a high-performance culture. The value we create for our clients and investors is based on our ability to bring together various perspectives from all over the world and from different backgrounds. It is our experience that teams perform better and deliver improved outcomes when they are able to incorporate a wide range of perspectives. We call this #ConnectingTheDots.

Posted 3 weeks ago

Apply

3.0 - 7.0 years

9 - 14 Lacs

Pune

Work from Office

Job Title: Senior Engineer - Data SQL Engineer
Corporate Title: AVP
Location: Pune, India

Role Description
As a SQL Engineer, you will be responsible for the design, development, and optimization of complex database systems. You will write efficient SQL queries and stored procedures, and you should possess expertise in data modeling, performance optimization, and working with large-scale relational databases.

What we'll offer you
100% reimbursement under childcare assistance benefit (gender neutral). Sponsorship for industry-relevant certifications and education. Accident and term life insurance.

Your key responsibilities
Design, develop, and optimize complex SQL queries, stored procedures, views, and functions. Work with large datasets to perform data extraction, transformation, and loading (ETL). Develop and maintain scalable database schemas and models. Troubleshoot and resolve database-related issues, including performance bottlenecks and data quality concerns. Maintain data security and compliance with data governance policy.

Your skills and experience
10+ years of hands-on experience with SQL in relational databases: SQL Server, Oracle, MySQL, PostgreSQL. Strong working experience with PL/SQL and T-SQL. Strong understanding of data modeling, normalization, and relational database design.

Desirable skills that will help you excel
Ability to write highly performant, resilient queries in Oracle, PostgreSQL, or MSSQL. Working knowledge of database modeling techniques like star schema, fact-dimension models, and Data Vault. Awareness of database tuning methods like AWR reports, indexing, partitioning of data sets, defining tablespace sizes, user roles, etc. Hands-on experience with ETL tools: Pentaho, Informatica, or StreamSets. Good experience in performance tuning, query optimization, and indexing. Hands-on experience with object storage and scheduling tools. Experience with cloud-based data services like data lakes, data pipelines, and machine learning platforms. Experience in GCP, cloud database migration experience, and hands-on experience with Postgres.

About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm We at DWS are committed to creating a diverse and inclusive workplace, one that embraces dialogue and diverse views, and treats everyone fairly to drive a high-performance culture. The value we create for our clients and investors is based on our ability to bring together various perspectives from all over the world and from different backgrounds. It is our experience that teams perform better and deliver improved outcomes when they are able to incorporate a wide range of perspectives. We call this #ConnectingTheDots.
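The query optimization and indexing this role asks for boils down to checking what access path the engine chooses before and after an index exists. A minimal, self-contained sketch using Python's sqlite3 (the table, index name, and data are invented for illustration; production engines like Oracle or PostgreSQL expose the same idea through their own EXPLAIN facilities):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
cur.executemany("INSERT INTO orders VALUES (?,?,?)",
                [(i, i % 100, float(i)) for i in range(1000)])

def plan(sql):
    # EXPLAIN QUERY PLAN reports how SQLite will execute the statement;
    # the last column of each row is the human-readable detail.
    return " ".join(row[-1] for row in cur.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT SUM(total) FROM orders WHERE customer_id = 42"
before = plan(query)   # full table scan: every row is examined
cur.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
after = plan(query)    # index lookup: only matching rows are touched

print(before)
print(after)
```

On a thousand rows the difference is invisible; on a billion-row fact table, the same scan-versus-seek distinction is the core of the tuning work (AWR reports and partitioning mentioned above serve the same diagnosis at larger scale).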

Posted 3 weeks ago

Apply

1.0 - 3.0 years

0 Lacs

Gurugram, Haryana, India

On-site

**Position:** ETL Developer (Client Requirement)

**Experience:** 1-3 years
**Location:** Gurgaon
**Employment Type:** Full time
**Budget:** up to 35,000-40,000

We are looking for a passionate and detail-oriented **ETL Developer** with 1 to 3 years of experience in building, testing, and maintaining ETL processes. The ideal candidate should have a strong understanding of data warehousing concepts, ETL tools, and database technologies.

### **Key Responsibilities:**

✅ Design, develop, and maintain ETL workflows and processes using [specify tools e.g., Informatica / Talend / SSIS / Pentaho / custom ETL frameworks].
✅ Understand data requirements and translate them into technical specifications and ETL designs.
✅ Optimize and troubleshoot ETL processes for performance and scalability.
✅ Ensure data quality, integrity, and security across all ETL jobs.
✅ Perform data analysis and validation for business reporting.
✅ Collaborate with Data Engineers, DBAs, and Business Analysts to ensure smooth data operations.

### **Required Skills:**

* 1-3 years of hands-on experience with ETL tools (e.g., **Informatica, Talend, SSIS, Pentaho**, or equivalent).
* Proficiency in SQL and experience working with **RDBMS** (e.g., **SQL Server, Oracle, MySQL, PostgreSQL**).
* Good understanding of **data warehousing concepts** and **data modeling**.
* Experience in handling **large datasets** and performance tuning of ETL jobs.
* Ability to work in Agile environments and participate in code reviews.

### **Preferred Skills (Good to Have):**

* Experience with **cloud ETL solutions** (AWS Glue, Azure Data Factory, GCP Dataflow).
* Exposure to **big data ecosystems** (Hadoop, Spark).
* Basic knowledge of **Python / Shell scripting** for automation.
* Familiarity with **version control (Git)** and **CI/CD pipelines**.

🎓 Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
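The extract-transform-load cycle these responsibilities describe can be sketched in a few lines of plain Python; tools like Informatica or Pentaho wrap the same three stages in a visual designer. The source data, column names, and quality rules below are invented for illustration:

```python
import csv
import io
import sqlite3

# --- Extract: read raw rows from a CSV source (an inline string here). ---
raw = """id,email,amount
1,a@example.com,10.5
2,B@EXAMPLE.COM,20.0
2,b@example.com,20.0
3,c@example.com,not_a_number
"""
rows = list(csv.DictReader(io.StringIO(raw)))

# --- Transform: normalise emails, drop bad amounts, de-duplicate on id. ---
clean, seen = [], set()
for r in rows:
    try:
        amount = float(r["amount"])
    except ValueError:
        continue                 # reject rows failing the data-quality check
    key = r["id"]
    if key in seen:
        continue                 # keep the first occurrence only
    seen.add(key)
    clean.append((int(key), r["email"].lower(), amount))

# --- Load: write the validated rows to the target database. ---
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payments (id INTEGER PRIMARY KEY, email TEXT, amount REAL)")
conn.executemany("INSERT INTO payments VALUES (?,?,?)", clean)
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM payments").fetchone())  # (2, 30.5)
```

Production pipelines add the concerns the listing names on top of this skeleton: scheduling, incremental loads, rejected-row auditing, and performance tuning for large datasets.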

Posted 3 weeks ago

Apply

0.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

Key Responsibilities: A day in the life of an Infosys Equinox employee. As part of the Infosys Equinox delivery team, your primary role is to ensure effective design, development, validation, and support activities so that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications, understand the client requirements in detail, and translate them into system requirements. Expected skills include: a clear understanding of HTTP and network protocol concepts, designs, and operations (TCP dump, cookies, sessions, headers, client-server architecture); core strength in Linux and Azure infrastructure provisioning, including VNet, Subnet, Gateway, VM, security groups, MySQL, Blob Storage, Azure Cache, AKS clusters, etc.; expertise in automating infrastructure as code using Terraform, Packer, Ansible, shell scripting, and Azure DevOps; expertise with patch management and APM tools such as AppDynamics and Instana for monitoring and alerting; and knowledge of technologies including Apache Solr, MySQL, Mongo, ZooKeeper, RabbitMQ, Pentaho, etc. Knowledge of cloud platforms including AWS and GCP is an added advantage, as is experience implementing auto-scaling, DR, HA, and multi-region deployments with best practices. You should be able to identify and automate recurring tasks for better productivity, understand and implement industry-standard security solutions, and work under pressure while managing expectations from various key stakeholders. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers, and you will be a key contributor to building efficient programs.

Technical Requirements: AWS, Azure, GCP, Linux, shell scripting, IaC, Docker, Kubernetes, Jenkins, GitHub.

Additional Responsibilities: Knowledge of design principles and fundamentals of architecture. Understanding of performance engineering. Knowledge of quality processes and estimation techniques. Basic understanding of the project domain. Ability to translate functional and non-functional requirements into system requirements. Ability to design and code complex programs. Ability to write test cases and scenarios based on the specifications. Good understanding of SDLC and Agile methodologies. Awareness of the latest technologies and trends. Logical thinking and problem-solving skills, along with an ability to collaborate.

Preferred Skills: Technology->Cloud Platform->AWS Database, Technology->Cloud Platform->Azure Devops, Technology->Cloud Platform->GCP Database, Technology->Container Platform->Docker, Technology->Open System->Linux, Technology->Open System->Shell scripting

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Responsibilities
- Java 8+, JavaScript, TypeScript
- Spring Boot, Spring MVC, Spring Web Services, Spring Data, Hibernate, JasperReports
- Angular 8+, React 16+
- Angular Material, Bootstrap 4, HTML5, CSS3, SCSS
- Oracle SQL, PL/SQL development
- Pentaho Kettle
- Basic Linux scripting and troubleshooting
- Git
- Design patterns

Qualifications
Experience: 5-8 years in SSD. Notice period: immediate joiners. Passport is mandatory. 5+ years of minimum experience. Degree in Computer Science or a relevant field. Excellent communication skills.

Important Notes
Must be ready to complete a mandatory assignment/assessment before the interview stage. Good communication is a must - this is a client-facing role. Immediate or 0-15 days notice period only. Please do not apply if you do not meet the above criteria.

Posted 3 weeks ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Requisition ID: 99962-0

Intern: Year 3/4 College/University

Duties of Position: (Include specific duties and responsibilities)
Monitoring across production instances - on-prem & cloud applications. Optimize workload in Concurrent Manager - Oracle. Monitor ODI loads & executions. Oracle performance analysis. Schedule and maintain interface programs. Review systems' health check status. Monitor enterprise applications - Pentaho jobs, Postgres jobs, SSIS jobs, and applications hosted on Google Cloud.

Qualifications Required: (Education, experience, skills, etc. Please be specific)
Basic knowledge of Oracle Applications 11i ERP system administration. Basic knowledge of Oracle SQL/PLSQL, Postgres, Java, and Google scripting. Ready to work in rotational shifts (24/7 operations). The employee is expected to stay in the organization for a longer duration. Educational qualification: BE/B.Tech/Bachelor's in Computer Science.

Posted 3 weeks ago

Apply