0 years
6 - 8 Lacs
Hyderābād
On-site
Ready to shape the future of work? At Genpact, we don’t just adapt to change—we drive it. AI and digital innovation are redefining industries, and we’re leading the charge. Genpact’s AI Gigafactory, our industry-first accelerator, is an example of how we’re scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies’ most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that’s shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Senior Principal Consultant - Databricks Architect! In this role, the Databricks Architect is responsible for providing technical direction and leading a group of one or more developers toward a goal.

Responsibilities
- Architect and design solutions to meet functional and non-functional requirements.
- Create and review architecture and solution design artifacts.
- Evangelize re-use through the implementation of shared assets.
- Enforce adherence to architectural standards/principles, global product-specific guidelines, usability design standards, etc.
- Proactively guide engineering methodologies, standards, and leading practices.
- Guide engineering staff and review as-built configurations during the construction phase.
- Provide insight and direction on the roles and responsibilities required for solution operations.
- Identify, communicate, and mitigate risks, assumptions, issues, and decisions throughout the full lifecycle.
- Consider the art of the possible, compare architectural options based on feasibility and impact, and propose actionable plans.
- Demonstrate strong analytical and technical problem-solving skills, with the ability to analyze and operate at various levels of abstraction and to balance what is strategically right with what is practically realistic.
- Grow the Data Engineering business by helping customers identify opportunities to deliver improved business outcomes, then designing and driving the implementation of those solutions.
- Grow and retain the Data Engineering team with the skills and experience needed to deliver high-quality services to our customers.
- Support and develop our people, including learning & development, certification, and career development plans.
- Provide technical governance and oversight for solution design and implementation.
- Maintain the technical foresight to understand new technologies and advancements.
- Lead the team in defining best practices and repeatable methodologies in Cloud Data Engineering, including Data Storage, ETL, Data Integration & Migration, Data Warehousing, and Data Governance.
- Bring technical experience with Azure, AWS, and GCP Cloud Data Engineering services and solutions.
- Contribute to sales and pre-sales activities, including proposals, pursuits, demonstrations, and proof-of-concept initiatives.
- Evangelize the Data Engineering service offerings to both internal and external stakeholders.
- Develop whitepapers, blogs, webinars, and other thought-leadership material.
- Develop go-to-market and service-offering definitions for Data Engineering.
- Work with Learning & Development teams to establish appropriate learning and certification paths for the domain.
- Expand the business within existing accounts and help clients by building and sustaining strategic executive relationships, doubling up as their trusted business technology advisor.
- Position differentiated and custom solutions to clients based on market trends, the specific needs of the clients, and the supporting business cases.
- Build new data capabilities, solutions, assets, accelerators, and team competencies.
- Manage multiple opportunities through the entire business cycle simultaneously, working with cross-functional teams as necessary.

Qualifications we seek in you!
Minimum qualifications
- Excellent technical architecture skills, enabling the creation of future-proof, complex global solutions.
- Excellent interpersonal communication and organizational skills, required to operate as a leading member of global, distributed teams that deliver quality services and solutions.
- Ability to rapidly gain knowledge of the organizational structure of the firm to facilitate work with groups outside of the immediate technical team.
- Knowledge of and experience with the IT methodologies and life cycles that will be used; familiarity with solution implementation/management, service/operations management, etc.
- Leadership skills that inspire and persuade others.
- Close awareness of new and emerging technologies and their potential application in service offerings and products.
- Bachelor’s degree (CS, CE, CIS, IS, MIS, or an engineering discipline) or equivalent work experience.
- Experience in a solution architecture role using service and hosting solutions such as private/public cloud IaaS, PaaS, and SaaS platforms.
- Experience architecting and designing technical solutions for cloud-centric solutions based on industry standards using IaaS, PaaS, and SaaS capabilities.
- Strong hands-on experience with cloud services such as ADF/Lambda, ADLS/S3, security, monitoring, and governance.
- Experience designing platforms on Databricks, with hands-on experience designing and building Databricks-based solutions on any cloud platform.
- Hands-on experience designing and building solutions powered by DBT models and integrating them with Databricks.
- Strong end-to-end solution design skills on a cloud platform, with good knowledge of Data Engineering concepts and the related cloud services.
- Good experience with Python and Spark, and in setting up development best practices.
- Intermediate-level knowledge of data modelling; knowledge of Docker and Kubernetes is a plus.
- Experience with claims-based authentication (SAML/OAuth/OIDC), MFA, RBAC, SSO, etc.
- Knowledge of cloud security controls, including tenant isolation, encryption at rest, encryption in transit, key management, vulnerability assessments, application firewalls, SIEM, etc.
- Experience building and supporting mission-critical technology components with DR capabilities.
- Experience with multi-tier system and service design and development for large enterprises.
- Extensive, real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures with a focus on cloud technologies.
- Exposure to infrastructure and application security technologies and approaches.
- Familiarity with requirements-gathering techniques.

Preferred qualifications
- Must have designed the end-to-end architecture of a unified data platform covering every aspect of the data lifecycle: ingestion, transformation, serving, and consumption.
- Must have excellent coding skills in Python or Scala, preferably Python.
- Must have experience in the Data Engineering domain and have designed and implemented at least 2-3 projects end-to-end in Databricks.
- Must have experience with the Databricks components below:
  - Delta Lake
  - dbConnect
  - Databricks API 2.0
  - SQL endpoints (Photon engine)
  - Unity Catalog
  - Databricks workflows orchestration
  - Security management
  - Platform governance
  - Data security
- Must have knowledge of new features available in Databricks and their implications, along with the various possible use cases.
- Must have applied appropriate architectural principles to design the solution best suited to each problem.
- Must be well versed with the Databricks Lakehouse concept and its implementation in enterprise environments.
- Must have a strong understanding of data warehousing and the various governance and security standards around Databricks.
- Must have knowledge of cluster optimization and Databricks integration with various cloud services.
- Must have a good understanding of how to create complex data pipelines, and be strong in SQL and Spark SQL.
- Must have strong performance-optimization skills to improve efficiency and reduce cost.
- Must have designed both batch and streaming data pipelines.
- Must have extensive knowledge of the Spark and Hive data processing frameworks.
- Must have worked on a cloud platform (Azure, AWS, GCP) and its most common services: ADLS/S3, ADF/Lambda, Cosmos DB/DynamoDB, ASB/SQS, and cloud databases.
- Must be strong in writing unit tests and integration tests.
- Must have strong communication skills and have worked with cross-platform teams.
- Must have a great attitude toward learning new skills and upskilling existing ones.
- Responsible for setting best practices around Databricks CI/CD.
- Must understand composable architecture to take full advantage of Databricks capabilities.
- Good to have: REST API knowledge, an understanding of cost distribution, experience on a migration project building a unified data platform, and knowledge of DBT.
- Experience with DevSecOps, including Docker and Kubernetes.
- Full-lifecycle software development methodologies, patterns, frameworks, libraries, and tools.
- Knowledge of programming and scripting languages such as JavaScript, PowerShell, Bash, SQL, Java, Python, etc.
- Experience with data ingestion technologies such as Azure Data Factory, SSIS, Pentaho, and Alteryx.
- Experience with visualization tools such as Tableau and Power BI.
- Experience with machine learning tools such as MLflow, Databricks AI/ML, Azure ML, AWS SageMaker, etc.
- Experience distilling complex technical challenges into actionable decisions for stakeholders and guiding project teams by building consensus and mediating compromises when necessary.
- Experience coordinating the intersection of complex system dependencies and interactions.
- Experience in solution delivery using common methodologies, especially SAFe Agile but also Waterfall, Iterative, etc.
- Demonstrated knowledge of relevant industry trends and standards.

Why join Genpact?
- Be a transformation leader – work at the cutting edge of AI, automation, and digital innovation.
- Make an impact – drive change for global enterprises and solve business challenges that matter.
- Accelerate your career – get hands-on experience, mentorship, and continuous learning opportunities.
- Work with the best – join 140,000+ bold thinkers and problem-solvers who push boundaries every day.
- Thrive in a values-driven culture – our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let’s build tomorrow together. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Job: Senior Principal Consultant | Primary Location: India-Hyderabad | Schedule: Full-time | Education Level: Bachelor's / Graduation / Equivalent | Job Posting: Jul 1, 2025, 6:40:20 AM | Unposting Date: Ongoing | Master Skills List: Digital | Job Category: Full Time
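For context on the Databricks work this role describes, here is a minimal PySpark sketch of a batch transformation written to a Delta table, the kind of bronze-to-silver pipeline a Databricks Architect would review; table and column names are hypothetical, and on Databricks the runtime supplies the `spark` session:

```python
from pyspark.sql import SparkSession, functions as F

# On Databricks the runtime provides `spark`; built here so the sketch is self-contained.
spark = SparkSession.builder.appName("orders-daily-batch").getOrCreate()

# Hypothetical bronze table of raw orders ingested by an upstream job.
orders = spark.read.table("bronze.orders")

# Typical silver-layer cleanup: dedupe, type casts, and derived columns.
silver = (
    orders
    .dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("order_date", F.to_date("order_ts"))
    .withColumn("net_amount", F.col("gross_amount") - F.col("discount"))
    .filter(F.col("net_amount") >= 0)
)

# Delta Lake write; the partitioning column is a design choice, not a requirement.
(
    silver.write.format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .saveAsTable("silver.orders_clean")
)
```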
Posted 1 month ago
0 years
0 Lacs
Jaipur, Rajasthan, India
On-site
Position Overview
Job Title: Analytics Senior Analyst
Location: Jaipur, India
Corporate Title: AVP

Role Description
You will be joining the Data & Analytics team as part of the Global Procurement division. The team’s purpose is to:
- Deliver trusted third-party data and insights to unlock commercial value and identify risk
- Develop and execute the Global Procurement Data Strategy
- Deliver the golden source of Global Procurement data, analysis, and insights via dbPi, our Tableau self-service platform, leveraging automation and scalability on Google Cloud
- Provide data and analytical support to Global Procurement’s prioritised change initiatives

The team leverages several tools and innovative techniques to create value-added insights for stakeholders across end-to-end procurement processes, including, but not limited to, third-party risk, contracting, spend, and performance management.

What We’ll Offer You
As part of our flexible scheme, here are just some of the benefits that you’ll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complementary health screening for those 35 and above

Your Key Responsibilities
- You develop a sound understanding of the various tools and the entire suite of analytical offerings on the standard procurement insights platform, dbPi.
- You support our stakeholders by understanding their requirements, challenging appropriately where needed in order to scope the problem, conceptualizing the optimum approach, and developing solutions using appropriate tools and visualisation techniques.
- You are comfortable leading small project teams in delivering the analytics change book of work, keeping internal and external stakeholders updated on project progress while driving forward the key change topics.
- For requests which are more complex in nature, you connect the dots and arrive at a solution by establishing linkages across different systems and processes.
- You take end-to-end responsibility for any change request to an existing analytical product or dashboard, from understanding the requirement through development, testing, and QA, to final delivery to stakeholders’ satisfaction.
- You are expected to deliver automation and clean-data initiatives, such as deploying a rules engine, enabling data quality checks through Google Cloud, and bringing procurement data sources into GCP.
- You act as a thought partner to the Chief Information Office’s deployment of Google Cloud Platform to migrate the data infrastructure layer (ETL processes) currently managed by the Analytics team.
- You work in close collaboration with cross-functional teams, including developers, system administrators, and business stakeholders.

Your Skills And Experience
We are looking for talents with a degree (or equivalent) in Engineering, Mathematics, Statistics, or the Sciences from an accredited college or university to develop analytical solutions for our stakeholders to support strategic decision making. Any professional certification in Advanced Analytics, Data Visualisation, or a Data Science-related domain is a plus. You have a natural curiosity for numbers and strong quantitative and logical thinking skills, and you ensure results are of high data quality and accuracy.
- You have working experience on Google Cloud and have worked with cross-functional teams to enable data source and process migration to GCP; you have working experience with SQL.
- You are adaptable to emerging technologies, such as leveraging machine learning and AI to drive innovation.
- Procurement experience (useful, though not essential) across vendor management, sourcing, risk, contracts, and purchasing, preferably within a global and complex environment.
- You have the aptitude to understand stakeholders’ requirements, identify relevant data sources, integrate data, perform analysis, and interpret the results by identifying trends and patterns.
- You enjoy the problem-solving process, think outside the box, and break a problem down into its constituent parts with a view to developing end-to-end solutions.
- You display enthusiasm for working in the data analytics domain and strive for continuous learning and improvement of your technical and soft skills.
- You demonstrate working knowledge of analytical tools such as Tableau, databases, Alteryx, Pentaho, Looker, and BigQuery in order to work with large datasets and derive insights for decision making.
- You enjoy working in a team, and your English language skills are convincing, making it easy for you to work in an international environment and with global, virtual teams.

How We’ll Support You
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

About Us And Our Teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative, and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
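As a flavour of the GCP-based data-quality checks this role mentions, here is a minimal sketch using the google-cloud-bigquery client; the project, dataset, and column names are hypothetical, and credentials are assumed to come from the environment:

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

# Hypothetical procurement spend table; the query flags rows failing basic rules.
QUALITY_SQL = """
SELECT
  COUNTIF(supplier_id IS NULL) AS missing_supplier,
  COUNTIF(spend_amount < 0)    AS negative_spend,
  COUNT(*)                     AS total_rows
FROM `my-project.procurement.spend`
WHERE load_date = CURRENT_DATE()
"""

row = list(client.query(QUALITY_SQL).result())[0]
if row.missing_supplier or row.negative_spend:
    raise ValueError(
        f"Data quality failure: {row.missing_supplier} missing suppliers, "
        f"{row.negative_spend} negative spend rows out of {row.total_rows}"
    )
```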
Posted 1 month ago
0.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Job Title: Data Automation Engineer, NCT
Location: Bangalore, India

Role Description
KYC Operations play an integral part in the firm’s first line of defense against financial crime, reducing the risk of working with new clients (primarily Know Your Customer (KYC) risk) whilst ensuring client relationships are on-boarded and maintained efficiently. KYC Operations provide a golden source of quality reference data for CIB, underpinning the firm’s key Regulatory, Control & Governance standards. Within KYC Operations there is a dedicated global group, KYC Transformation, that drives end-to-end delivery. Our team partners with stakeholders in and outside of KYC Ops to ensure our processes are fit for purpose, follow a uniform standard, and continuously improve, thereby adding quantifiable value in supporting colleagues and clients in a flexible, fast, and focused manner.

As a Data Automation Engineer, you will build fast solutions to help Operations and other parts of the bank deliver their highest value: removing repetitive tasks, building strategic data pipelines, and ensuring automation is robust and stable using solutions including Python, VBA, MS Power Platform (Power Automate, Power Apps, Power BI), SQL, and SharePoint. Our approach is to ensure the solution can be merged into strategic tooling and fits the technology design process standards. We are looking for an enthusiastic and motivated person with excellent communication skills to join our team. You will love working with us and see the value in helping people by delivering effective solutions that make a positive impact on your colleagues’ workload. You will be curious and able to quickly absorb organizational complexity, regulatory requirements, and business logic, translating that structure into your work. This role offers a fantastic opportunity to join one of the most prestigious financial organisations operating all over the globe, and you will gain amazing experience.

What we’ll offer you
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities
- Work with stakeholders to identify opportunities to drive business solutions and improvements
- Automate manual effort, providing tactical solutions to improve speed and value
- Work in an agile way to deliver proofs of concept and fast solutions, using technologies appropriate to the problem statements and requirements
- Enhance your personal and team network to ensure cooperation yields efficiencies, for example by sharing solutions with a wider team, re-using existing solutions, and enhancing solutions to have a wider and more beneficial business impact

Your skills and experience
- Analyse, design, develop, test, deploy, and support digital services software solutions
- Exposure to ETL technologies and methods
- Expertise in coding/programming in Python, VBA, and SQL to extract data sets efficiently
- Experience developing business solutions in any of MS Power Apps, MS Power Automate, or RPA
- Excellent spatial reasoning and the ability to view processes and data in two or three dimensions
- Process mapping and process re-engineering skills, data-oriented, with experience in enterprise process modelling for current and future state
- The ability to generate innovative ideas and deliver effectively, highlighting blockers if needed
- Exposure to workflow solutions, Alteryx, Pentaho, Celonis, Linux, and database tuning is desirable
- Documenting solutions (i.e., creation and upkeep of artefacts: requirement docs, SDDs, test scripts, JIRA tickets, KSDs post go-live)
- Provide L1 support for the existing RPA solution, resolving issues with minimum TAT to ensure business resiliency

Competencies
- Work alongside Solutions Architects, Business Analysts, and BOT controlling to contribute to solution designs
- Highly organized with a keen eye for detail and a proven record of operating in a fast-paced environment
- Ability to work independently and as part of the team, with an enterprising spirit and a passion for learning and applying new technologies
- Excellent communication skills, with the ability to converse clearly with stakeholders from all cultures
- Ability to work well in a global and virtual team, under pressure, and to multi-task

Behavioural skills
- Desire to work in a fast-paced, challenging environment
- Self-motivated, independent, fast-thinking, and dynamic, with exposure to internal and external stakeholders

About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative, and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
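As a small illustration of the "automate manual effort" theme of this role, here is a sketch in Python (one of the listed tools) that consolidates daily Excel extracts that would otherwise be merged by hand; file names and the `case_id` column are hypothetical, and pandas with openpyxl is assumed to be available:

```python
from pathlib import Path
import pandas as pd

# Hypothetical folder of daily case extracts previously merged manually.
EXTRACT_DIR = Path("daily_extracts")

frames = []
for f in sorted(EXTRACT_DIR.glob("cases_*.xlsx")):
    df = pd.read_excel(f)          # requires openpyxl for .xlsx files
    df["source_file"] = f.name     # keep lineage for audit purposes
    frames.append(df)

# One consolidated, de-duplicated file replaces the repetitive copy-paste step.
merged = pd.concat(frames, ignore_index=True).drop_duplicates(subset="case_id")
merged.to_excel("consolidated_cases.xlsx", index=False)
```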
Posted 1 month ago
4.0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
Logitech is the Sweet Spot for people who want their actions to have a positive global impact while having the flexibility to do it in their own way.

The Role:
We are looking for a candidate to join our team who will be involved in the ongoing development of our Enterprise Data Warehouse (EDW) and in supporting our POS (Point of Sale) Channel Data Management team. This role includes participating in the loading and extraction of data, including POS data, to and from the warehouse. The ideal candidate will be involved in all stages of the project lifecycle, from initial planning through to deployment in production. A key focus of the role will be data analysis, data modeling, and ensuring these aspects are successfully implemented in the production environment.

Your Contribution:
Be Yourself. Be Open. Stay Hungry and Humble. Collaborate. Challenge. Decide and just Do. These are the behaviors you’ll need for success at Logitech. In this role you will:
- Design, develop, document, and test ETL solutions using industry-standard tools.
- Design physical and reporting data models for seamless cross-functional and cross-system data reporting.
- Enhance point-of-sale datasets with additional data points to provide stakeholders with useful insights.
- Ensure data integrity by rigorously validating and reconciling data obtained from third-party providers.
- Collaborate with data providers and internal teams to address customer data discrepancies and enhance data quality.
- Work closely across our D&I teams to deliver datasets optimized for consumption in reporting and visualization tools like Tableau.
- Collaborate with channel data and cross-functional teams to define requirements for POS and MDM data flows.
- Support customer MDM and POS ad hoc requests and data clarifications from the Channel Data and Finance teams.
- Collaborate with the BIOPS team to support quarter-end user activities and ensure compliance with SOX regulations.
- Be willing to explore and learn new technologies and concepts to provide the right kind of solution.

Key Qualifications:
For consideration, you must bring the following minimum skills and behaviors to our team:
- A total of 4 to 7 years of experience in ETL design, development, and populating data warehouses, including experience with heterogeneous OLTP sources such as Oracle R12 ERP systems and other cloud technologies.
- At least 3 years of hands-on experience with Pentaho Data Integration or similar ETL tools.
- Practical experience working with cloud-based data warehouses such as Snowflake and Redshift.
- Significant hands-on experience with Snowflake utilities, including SnowSQL, Snowpipe, Python, Tasks, Streams, Time Travel, the Optimizer, Metadata Manager, data sharing, and stored procedures.
- Comprehensive expertise in databases, data acquisition, ETL strategies, and the tools and technologies within Pentaho DI and Snowflake.
- Demonstrated experience designing complex ETL processes for extracting data from various sources, including XML files, JSON, RDBMSs, and flat files.
- Exposure to standard support ticket management tools.
- A strong understanding of Business Intelligence and data warehousing concepts and methodologies.
- Extensive experience in data analysis and root cause analysis, along with proven problem-solving and analytical thinking capabilities.
- A solid understanding of software engineering principles and proficiency in working with Unix/Linux/Windows operating systems, version control, and office software.
- A deep understanding of data warehousing principles and cloud architecture, including SQL optimization techniques for building efficient and scalable data systems.
- Familiarity with Snowflake’s unique features, such as its multi-cluster architecture and shareable data capabilities.
- Excellent skills in writing and optimizing SQL queries to ensure high performance and data accuracy across all systems.
- The ability to troubleshoot and resolve data quality issues promptly, maintaining data integrity and reliability.
- Strong communication skills for effective collaboration with both technical and non-technical teams, ensuring a clear understanding of data engineering requirements.

In addition, preferable skills and behaviors include:
- Exposure to an Oracle ERP environment
- A basic understanding of reporting tools like OBIEE and Tableau

Education: BS/BTech/MS in Computer Science, Information Systems, or a related technical field, or equivalent industry expertise.

Logitech is the sweet spot for people who are passionate about products, making a mark, and having fun doing it. As a company, we’re small and flexible enough for every person to take initiative and make things happen. But we’re big enough in our portfolio, and reach, for those actions to have a global impact. That’s a pretty sweet spot to be in and we’re always striving to keep it that way.

Across Logitech we empower collaboration and foster play. We help teams collaborate and learn from anywhere, without compromising on productivity or continuity, so it should be no surprise that most of our jobs are open to work from home from most locations. Our hybrid work model allows some employees to work remotely while others work on-premises. Within this structure, you may have teams or departments split between working remotely and working in-house.

Logitech is an amazing place to work because it is full of authentic people who are inclusive by nature as well as by design. Being a global company, we value our diversity and celebrate all our differences. Don’t meet every single requirement? Not a problem. If you feel you are the right candidate for the opportunity, we strongly recommend that you apply. We want to meet you!

We offer comprehensive and competitive benefits packages and working environments that are designed to be flexible and help you to care for yourself and your loved ones, now and in the future. We believe that good health means more than getting medical care when you need it. Logitech supports a culture that encourages individuals to achieve good physical, financial, emotional, intellectual, and social wellbeing so we all can create, achieve, and enjoy more and support our families. We can’t wait to tell you more about them, but there are too many to list here, and they vary based on location.

All qualified applicants will receive consideration for employment without regard to race, sex, age, color, religion, sexual orientation, gender identity, national origin, protected veteran status, or on the basis of disability. If you require an accommodation to complete any part of the application process, or are limited in the ability or unable to access or use this online application process and need an alternative method for applying, you may contact us toll-free at +1-510-713-4866 for assistance and we will get back to you as soon as possible.
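To make the Snowflake utilities named above concrete, here is a minimal sketch using the snowflake-connector-python package showing a Stream for incremental loads and a Time Travel query; connection parameters, table names, and columns are all hypothetical (real credentials would come from a vault, not source code):

```python
import snowflake.connector

# Placeholder connection parameters for illustration only.
conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="...",
    warehouse="ETL_WH", database="EDW", schema="POS",
)
cur = conn.cursor()

# A stream captures row-level changes on the POS sales table for incremental loads.
cur.execute("CREATE STREAM IF NOT EXISTS sales_stream ON TABLE pos_sales")

# Consuming the stream in a DML statement loads only the changed rows
# and advances the stream's offset.
cur.execute("""
    INSERT INTO pos_sales_reporting (sale_id, store_id, amount)
    SELECT sale_id, store_id, amount
    FROM sales_stream
    WHERE METADATA$ACTION = 'INSERT'
""")

# Time Travel: compare against the table as it looked an hour ago.
cur.execute("SELECT COUNT(*) FROM pos_sales AT(OFFSET => -3600)")
print(cur.fetchone())
```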
Posted 1 month ago
4.0 years
8 - 9 Lacs
Hyderābād
On-site
Responsibilities:
- Build and maintain data aggregation processes and dashboards, creating interactive and compelling visual, numeric, and written summaries.
- Ensure data governance policies are followed.
- Train and coach business users on BI tools; promote data literacy and a self-serve culture across the enterprise.
- Analyze business intelligence data to inform business and product decisions.
- Blend historical data from available industry reports, public information, field reports, or purchased sources as input to analyses.
- Identify and analyze industry/geographic trends and competitor market strategies, and monitor current/potential customer trends.
- Partner with other areas of the business (e.g., Marketing, Logistics, Customer Service) to model the outcome of implementing potential business strategies.

Qualifications:
- Bachelor's degree preferred; Associate's degree required. A Bachelor's degree plus 4-6 years of analytics, consulting, project management, or equivalent experience is preferred.
- Applies advanced knowledge of the job area, typically obtained through advanced education and work experience.
- Manages projects/processes, working independently with limited supervision, and coaches and reviews the work of lower-level professionals. Problems faced are difficult and sometimes complex.
- Typically uses dedicated business intelligence applications and/or cloud services (e.g., Domo, Looker, Power BI, Qlik, Tableau, Sisense), but may also leverage the reporting capabilities of existing ERP, CRM, and/or database software vendors (e.g., Oracle BI, Salesforce, SAP BusinessObjects, Pentaho, SSRS).

Additional Information: Power BI, SQL, Python, SAS programming (good to have)
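As a sketch of the "data aggregation processes and dashboards" work this posting leads with, here is a minimal pandas example that turns a raw extract into a dashboard-ready summary; the file name and columns are hypothetical:

```python
import pandas as pd

# Hypothetical sales extract; in practice this would come from the ERP or CRM.
sales = pd.read_csv("sales_extract.csv", parse_dates=["order_date"])

# Dashboard-ready aggregation: monthly revenue and order counts per region.
summary = (
    sales.assign(month=sales["order_date"].dt.to_period("M").astype(str))
    .groupby(["region", "month"], as_index=False)
    .agg(revenue=("amount", "sum"), orders=("order_id", "nunique"))
)

# BI tools such as Power BI or Tableau can consume this as a clean extract.
summary.to_csv("monthly_region_summary.csv", index=False)
```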
Posted 1 month ago
4.0 years
6 - 9 Lacs
Chennai
Remote
Logitech is the Sweet Spot for people who want their actions to have a positive global impact while having the flexibility to do it in their own way.

Job Description
The Role:
We are looking for a candidate to join our team who will be involved in the ongoing development of our Enterprise Data Warehouse (EDW) and in supporting our POS (Point of Sale) Channel Data Management team. This role includes participating in the loading and extraction of data, including POS data, to and from the warehouse. The ideal candidate will be involved in all stages of the project lifecycle, from initial planning through to deployment in production. A key focus of the role will be data analysis, data modeling, and ensuring these aspects are successfully implemented in the production environment.

Your Contribution:
Be Yourself. Be Open. Stay Hungry and Humble. Collaborate. Challenge. Decide and just Do. These are the behaviors you’ll need for success at Logitech. In this role you will:
- Design, develop, document, and test ETL solutions using industry-standard tools.
- Design physical and reporting data models for seamless cross-functional and cross-system data reporting.
- Enhance point-of-sale datasets with additional data points to provide stakeholders with useful insights.
- Ensure data integrity by rigorously validating and reconciling data obtained from third-party providers.
- Collaborate with data providers and internal teams to address customer data discrepancies and enhance data quality.
- Work closely across our D&I teams to deliver datasets optimized for consumption in reporting and visualization tools like Tableau.
- Collaborate with channel data and cross-functional teams to define requirements for POS and MDM data flows.
- Support customer MDM and POS ad hoc requests and data clarifications from the Channel Data and Finance teams.
- Collaborate with the BIOPS team to support quarter-end user activities and ensure compliance with SOX regulations.
- Be willing to explore and learn new technologies and concepts to provide the right kind of solution.

Key Qualifications:
For consideration, you must bring the following minimum skills and behaviors to our team:
- A total of 4 to 7 years of experience in ETL design, development, and populating data warehouses, including experience with heterogeneous OLTP sources such as Oracle R12 ERP systems and other cloud technologies.
- At least 3 years of hands-on experience with Pentaho Data Integration or similar ETL tools.
- Practical experience working with cloud-based data warehouses such as Snowflake and Redshift.
- Significant hands-on experience with Snowflake utilities, including SnowSQL, Snowpipe, Python, Tasks, Streams, Time Travel, the Optimizer, Metadata Manager, data sharing, and stored procedures.
- Comprehensive expertise in databases, data acquisition, ETL strategies, and the tools and technologies within Pentaho DI and Snowflake.
- Demonstrated experience designing complex ETL processes for extracting data from various sources, including XML files, JSON, RDBMSs, and flat files.
- Exposure to standard support ticket management tools.
- A strong understanding of Business Intelligence and data warehousing concepts and methodologies.
- Extensive experience in data analysis and root cause analysis, along with proven problem-solving and analytical thinking capabilities.
- A solid understanding of software engineering principles and proficiency in working with Unix/Linux/Windows operating systems, version control, and office software.
- A deep understanding of data warehousing principles and cloud architecture, including SQL optimization techniques for building efficient and scalable data systems.
- Familiarity with Snowflake’s unique features, such as its multi-cluster architecture and shareable data capabilities.
- Excellent skills in writing and optimizing SQL queries to ensure high performance and data accuracy across all systems.
- The ability to troubleshoot and resolve data quality issues promptly, maintaining data integrity and reliability.
- Strong communication skills for effective collaboration with both technical and non-technical teams, ensuring a clear understanding of data engineering requirements.

In addition, preferable skills and behaviors include:
- Exposure to an Oracle ERP environment
- A basic understanding of reporting tools like OBIEE and Tableau

Education: BS/BTech/MS in Computer Science, Information Systems, or a related technical field, or equivalent industry expertise.

Logitech is the sweet spot for people who are passionate about products, making a mark, and having fun doing it. As a company, we’re small and flexible enough for every person to take initiative and make things happen. But we’re big enough in our portfolio, and reach, for those actions to have a global impact. That’s a pretty sweet spot to be in and we’re always striving to keep it that way.

Across Logitech we empower collaboration and foster play. We help teams collaborate and learn from anywhere, without compromising on productivity or continuity, so it should be no surprise that most of our jobs are open to work from home from most locations. Our hybrid work model allows some employees to work remotely while others work on-premises. Within this structure, you may have teams or departments split between working remotely and working in-house.

Logitech is an amazing place to work because it is full of authentic people who are inclusive by nature as well as by design. Being a global company, we value our diversity and celebrate all our differences. Don’t meet every single requirement? Not a problem. If you feel you are the right candidate for the opportunity, we strongly recommend that you apply. We want to meet you!

We offer comprehensive and competitive benefits packages and working environments that are designed to be flexible and help you to care for yourself and your loved ones, now and in the future. We believe that good health means more than getting medical care when you need it. Logitech supports a culture that encourages individuals to achieve good physical, financial, emotional, intellectual, and social wellbeing so we all can create, achieve, and enjoy more and support our families. We can’t wait to tell you more about them, but there are too many to list here, and they vary based on location.

All qualified applicants will receive consideration for employment without regard to race, sex, age, color, religion, sexual orientation, gender identity, national origin, protected veteran status, or on the basis of disability. If you require an accommodation to complete any part of the application process, or are limited in the ability or unable to access or use this online application process and need an alternative method for applying, you may contact us toll-free at +1-510-713-4866 for assistance and we will get back to you as soon as possible.
Posted 1 month ago
2.0 - 4.0 years
5 - 7 Lacs
Chennai
On-site
Skill Set: Data Engineering
Experience: 2 to 4 years
Location: Chennai
Employment Type: FTE (work from office)
Notice Period: Immediate to 15 days

About the Role:
As an ETL developer, you will have hands-on involvement in data integration and in the development and support of ETL processes using tools like Pentaho, Informatica, or Talend. This operational role is responsible for providing data analysis and management support. The individual in this position may seek appropriate guidance and advice to ensure the delivery of high-quality outcomes.

Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field.

Job Specifications:
1. Proven experience with ETL tools such as Pentaho, Informatica, or Talend.
2. Strong SQL skills, including writing and debugging complex queries.
3. Basic knowledge of Unix/Linux shell scripting.
4. Experience with database modelling for relational databases.
5. Familiarity with Scrum methodology and Agile practices.
6. Experience with Git for version control and deployment.
7. Python programming skills are a plus.
8. Strong problem-solving skills and attention to detail.

Soft Skills:
1. Excellent communication and teamwork abilities.
2. Ability to work independently and manage multiple tasks.
3. Strong analytical and critical-thinking skills.

Job Type: Full-time
Pay: ₹500,000.00 - ₹700,000.00 per year
Schedule: Day shift
Work Location: In person
Application Deadline: 07/07/2025
Expected Start Date: 14/07/2025
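The "complex queries" in item 2 above typically means joins plus window functions. Here is a self-contained illustration using Python's built-in sqlite3 module (window functions need SQLite 3.25+, bundled with modern Python); the schema and data are made up:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders(id INTEGER, customer_id INTEGER, amount REAL, ts TEXT);
    INSERT INTO orders VALUES
        (1, 10, 120.0, '2025-01-05'), (2, 10,  80.0, '2025-01-20'),
        (3, 11, 300.0, '2025-01-07'), (4, 11,  50.0, '2025-02-02');
""")

# Latest order per customer via a window function: a common "complex query" pattern.
SQL = """
SELECT customer_id, id AS latest_order, amount
FROM (
    SELECT *,
           ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY ts DESC) AS rn
    FROM orders
) AS ranked
WHERE rn = 1
"""
for row in conn.execute(SQL):
    print(row)  # (10, 2, 80.0) and (11, 4, 50.0)
```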
Posted 1 month ago
5.0 - 31.0 years
6 - 17 Lacs
Hinjewadi, Pune
On-site
Designation: Senior Java Fullstack Developer
Total Vacancies: 4
Location: Ahmedabad, Gandhinagar, Pune
Experience: Minimum 5 years – no exceptions
Interview Rounds: 3
Target Joining Date: 11th July 2025
Nature of Work: Work from office only (5 days a week)
Qualification: 5+ years of experience with a minimum bachelor's degree in Computer Science.

Technical Skillset
- Java 8+, JavaScript, TypeScript
- Spring Boot, Spring MVC, Spring Web Services, Spring Data, Hibernate, Jasper Reports
- Angular 8+, React 16+
- Angular Material, Bootstrap 4, HTML5, CSS3, SCSS
- Oracle SQL, PL/SQL development
- Pentaho Kettle
- Basic Linux scripting and troubleshooting
- Git
- Design patterns

Job Type: Full-time
Work Location: In person
Expected Start Date: 11/07/2025
Please send your CV to Services@clastechsolutions.com
Posted 1 month ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Responsibilities
- Build and maintain data aggregation processes and dashboards, creating interactive and compelling visual, numeric, and written summaries.
- Ensure data governance policies are followed.
- Train and coach business users on BI tools; promote data literacy and a self-serve culture across the enterprise.
- Analyze business intelligence data to inform business and product decisions.
- Blend historical data from available industry reports, public information, field reports, or purchased sources as input to analyses.
- Identify and analyze industry/geographic trends and competitor market strategies, and monitor current/potential customer trends.
- Partner with other areas of the business (e.g., Marketing, Logistics, Customer Service) to model the outcome of implementing potential business strategies.

Qualifications
- Bachelor's degree preferred; Associate's degree required. A Bachelor's degree plus 4-6 years of analytics, consulting, project management, or equivalent experience is preferred.
- Applies advanced knowledge of the job area, typically obtained through advanced education and work experience.
- Manages projects/processes, working independently with limited supervision, and coaches and reviews the work of lower-level professionals. Problems faced are difficult and sometimes complex.
- Typically uses dedicated business intelligence applications and/or cloud services (e.g., Domo, Looker, Power BI, Qlik, Tableau, Sisense), but may also leverage the reporting capabilities of existing ERP, CRM, and/or database software vendors (e.g., Oracle BI, Salesforce, SAP BusinessObjects, Pentaho, SSRS).

Additional Information
Power BI, SQL, Python, SAS programming (good to have)
Posted 1 month ago
5.0 - 10.0 years
7 - 11 Lacs
Hyderabad
Work from Office
Requirements
- 5 to 10 years of BI experience
- Medium to expert level in SQL (advanced/complex queries)
- Knowledge of ETL concepts and experience with an ETL/data integration tool such as Informatica, ODI, Pentaho, or other similar tools
- Familiarity with one or more reporting tools such as MicroStrategy, Power BI, Tableau, Jaspersoft, or other similar tools
- Knowledge of Python and cloud infrastructure
- Exposure to AWS data integration technologies such as Airflow and Glue is preferred but not mandatory
- Programming experience in Java/Scala is preferred but not mandatory
- Proven ability to take initiative and be innovative
- Analytical mind with a problem-solving aptitude

Education
Must have: B.Tech / M.Tech / MCA
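Since the requirements mention Airflow for data integration, here is a minimal DAG sketch in Python (Airflow 2.4+ for the `schedule` argument); the task names and logic are hypothetical placeholders for real extract and load steps:

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull from source")          # placeholder for an ODI/Pentaho-style extract

def load():
    print("load into the warehouse")   # placeholder for the warehouse load step

with DAG(
    dag_id="nightly_bi_load",
    start_date=datetime(2025, 1, 1),
    schedule="0 2 * * *",   # run at 2 AM daily
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2  # load runs only after extract succeeds
```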
Posted 1 month ago
5.0 years
0 Lacs
Coimbatore, Tamil Nadu, India
On-site
Skills: AWS/Azure/GCP, Linux, shell scripting, IaC, Docker, Kubernetes, MongoDB, MySQL, Solr, Jenkins, GitHub, automation, TCP/HTTP network protocols

A day in the life of an Infosys Equinox employee:
As part of the Infosys Equinox delivery team, your primary role would be to ensure effective design, development, validation, and support activities, to assure that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications to understand client requirements in a detailed manner and translate them into system requirements. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers.

- Ensure high availability of the infrastructure, administration, and overall support.
- Strong analytical and troubleshooting/problem-solving skills: root cause identification and proactive service improvement, staying up to date on technologies and best practices.
- Team and task management with tools like JIRA, adhering to SLAs.
- A clear understanding of HTTP/network protocol concepts, designs, and operations: tcpdump, cookies, sessions, headers, client-server architecture.
- More than 5 years of working experience with the AWS/Azure/GCP cloud platforms.
- Core strength in Linux and Azure infrastructure provisioning, including VNet, Subnet, Gateway, VMs, security groups, MySQL, Blob Storage, Azure Cache, AKS clusters, etc.
- Expertise in automating infrastructure as code using Terraform, Packer, Ansible, shell scripting, and Azure DevOps.
- Expertise with patch management and APM tools like AppDynamics and Instana for monitoring and alerting.
- Knowledge of technologies including Apache Solr, MySQL, MongoDB, ZooKeeper, RabbitMQ, Pentaho, etc.
- Knowledge of cloud platforms including AWS and GCP is an added advantage.
- Ability to identify and automate recurring tasks for better productivity.
- Ability to understand and implement industry-standard security solutions.
- Experience implementing auto-scaling, DR, HA, and multi-region setups with best practices is an added advantage.
- Ability to work under pressure, managing expectations from various key stakeholders.

Additionally:
- Knowledge of more than one technology
- Basics of architecture and design fundamentals
- Knowledge of testing tools
- Knowledge of agile methodologies
- Understanding of project lifecycle activities on development and maintenance projects
- Understanding of one or more estimation methodologies; knowledge of quality processes
- Basics of the business domain, to understand the business requirements
- Analytical abilities, strong technical skills, good communication skills
- Good understanding of the technology and domain
- Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods
- Awareness of the latest technologies and trends
- Excellent problem-solving, analytical, and debugging skills
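The role stresses TCP/HTTP protocol understanding and automating recurring checks. Here is a small stdlib-only Python sketch of the kind of reachability probe an infrastructure engineer might automate; the hosts and URLs are hypothetical:

```python
import socket
import urllib.request

def tcp_check(host: str, port: int, timeout: float = 3.0) -> bool:
    """Raw TCP-level reachability, independent of HTTP."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def http_check(url: str, timeout: float = 3.0) -> int:
    """Return the HTTP status code for a GET on the given URL."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return resp.status

# Hypothetical service endpoints monitored by the platform team.
print("mysql reachable:", tcp_check("db.internal", 3306))
print("storefront status:", http_check("https://shop.internal/health"))
```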
Posted 1 month ago
5.0 years
3 - 6 Lacs
Mumbai
On-site
1) Mid-level experience building web applications in AngularJS (5-6 years of experience)
2) Experience designing and building web components
3) Exposure to Pentaho PDI plugin development
4) Experience designing with and using CSS/style sheets

Job Type: Full-time
Pay: ₹358,504.91 - ₹2,558,282.24 per year
Benefits:
- Health insurance
- Life insurance
- Provident Fund
Schedule: Day shift, Monday to Friday
Experience: Angular: 5 years (preferred)
Work Location: In person
Expected Start Date: 01/07/2025
Posted 1 month ago
0 years
3 - 4 Lacs
Mumbai
On-site
Operations Team Member - Support Services - Treasury Operations

Job Role:
- Review the retail and wholesale portfolio every month with respect to delinquency/losses/fraud for the period underwritten.
- Portfolio management based on accurate evaluation of portfolio performance, market developments, and new product management imperatives.
- Design and implement risk assessment models for the retail and wholesale portfolios.
- Prepare a monthly qualitative and quantitative risk assessment report and present it to senior management.
- Prepare a loss-forecasting model and review it on a quarterly basis.
- Perform quarterly analysis of loans written off (LWO) in the past quarter and present the report to senior management, highlighting the learnings and recommendations for changes in policy.
- Regulatory compliance and interactions with various teams.
- Aid the conduct of audits (statutory/internal/operational risk, etc.) and help close points to satisfaction without any adverse comments.
- Support new product launches and identify the risks involved in them.
- Support the business by rolling out regular test programs and converting them into policy based on defined success criteria.

Job Requirements:
Technical Knowledge: The candidate should have expertise in SAP BO, SQL, MS Access, MS Excel, PowerPoint, and VBA macros, with expert presentation skills for various management committees. The candidate will coordinate and work on the automation of various MIS, management dashboards, and tools for portfolio monitoring; should be able to coordinate with the IT team to get processes automated; and should be able to conduct UAT/testing and report the changes required in the available systems. Expertise in analytics tools like R, SAS, KNIME, and Pentaho is expected, along with expertise in SQL and Microsoft Office programs, viz. Excel, Access, and PowerPoint.

Market Research: Check and keep track of industry trends and the key factors impacting the industry, and highlight the best practices followed by the market. The candidate should be well versed with the risks involved in the secured and unsecured retail lending businesses, and should understand the internal and external factors affecting the business.

Communication Skills: The candidate should be a good communicator, able to make their point understood by others, and confident communicating with different audiences, from the board of directors to individual employees. Fluency in English is required.

The candidate should have 4-5 years of work experience in MIS automation and expert skills in SAP BO, SQL, MS Access, MS Excel, and VBA macros; a good eye for detail; comfort and confidence with calculations and numbers; and a problem-solving approach that brings creative ideas.

Skills: MIS automation, analytical skills, presentation skills, market research, business understanding, problem solving, eye for detail, communication skills, technical acumen, numeracy.
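For a sense of the portfolio-review MIS work described above, here is a minimal pandas sketch that buckets a loan portfolio by days past due (DPD), the standard structure of a delinquency report; the file name and columns are hypothetical:

```python
import pandas as pd

# Hypothetical loan-level extract with days-past-due (DPD) per account.
loans = pd.read_csv("portfolio_extract.csv")

# Standard delinquency buckets used in portfolio-review MIS packs.
buckets = pd.cut(
    loans["dpd"],
    bins=[-1, 0, 30, 60, 90, 10_000],
    labels=["current", "1-30", "31-60", "61-90", "90+"],
)

# One row per product, one column per delinquency bucket, summing outstanding balance.
mis = (
    loans.assign(bucket=buckets)
    .groupby(["product", "bucket"], observed=True)["outstanding"]
    .sum()
    .unstack(fill_value=0)
)
print(mis)
```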
Posted 1 month ago
7.0 years
4 - 5 Lacs
Chennai
On-site
Job Description: 7+ years total IT experience and 3+ years of Pentaho (PDI) tool experience. Business intelligence expertise and technical skills to successfully deliver Data Integration and Business Intelligence solutions based on the Pentaho Data Integration (PDI) tool. Expertise in developing jobs and transformations for different business solutions in the Pentaho Data Integration tool. Analysing, designing, testing and coding of data integration/data migration logic according to user requirements. Experience in administration, configuration and maintenance of multiple Pentaho environments. Strong skills in Oracle SQL and PL/SQL; candidate must be proficient in SQL programming to write complex queries including joins, triggers, views, stored procedures, etc. Exposure to performance tuning approaches/techniques. Hands-on experience in Windows batch scripting and UNIX shell scripting; Oracle SQL, MySQL and PostgreSQL. Skills: ETL, Linux, UNIX shell scripting, Pentaho Data Integration (PDI). About Virtusa: Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth, one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
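For context, a minimal sketch of how a PDI job might be driven from a script, since the posting pairs PDI with shell scripting. kitchen.sh is PDI's standard command-line runner for .kjb jobs (pan.sh runs .ktr transformations); the install path, job file and parameter name below are hypothetical:

```python
"""Sketch: driving a Pentaho PDI job from Python via kitchen.sh.

Assumes a Linux install of PDI; exit code 0 from kitchen.sh indicates
success, so we propagate it to the calling scheduler.
"""
import subprocess

PDI_HOME = "/opt/pentaho/data-integration"   # hypothetical install path
JOB_FILE = "/etl/jobs/daily_load.kjb"        # hypothetical job file

def run_pdi_job(job_file: str, params: dict) -> int:
    cmd = [f"{PDI_HOME}/kitchen.sh", f"-file={job_file}", "-level=Basic"]
    # Named job parameters are passed as -param:NAME=VALUE.
    cmd += [f"-param:{key}={value}" for key, value in params.items()]
    result = subprocess.run(cmd, capture_output=True, text=True)
    print(result.stdout)  # PDI writes its log to stdout at this level
    return result.returncode

if __name__ == "__main__":
    rc = run_pdi_job(JOB_FILE, {"LOAD_DATE": "2025-01-31"})
    raise SystemExit(rc)  # surface PDI's exit code to the scheduler
```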
Posted 1 month ago
0 years
4 - 5 Lacs
Chennai
On-site
Design, develop, and maintain ETL processes using Informatica or Pentaho. Write complex SQL queries to extract, transform, and load data from various sources. Develop and maintain shell scripts to automate ETL processes. Collaborate with data analysts, data scientists, and other stakeholders to understand data requirements and deliver solutions. Monitor and troubleshoot ETL processes to ensure data accuracy and integrity. Optimize ETL processes for performance and scalability. Document ETL processes and maintain comprehensive technical documentation. Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Proven experience as an ETL Developer or in a similar role. Strong proficiency in SQL and experience with relational databases. Hands-on experience with Informatica or Pentaho. Proficiency in shell scripting (e.g., Bash, KornShell). Familiarity with data warehousing concepts and methodologies. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills. Preferred Qualifications: Experience with cloud-based ETL tools and platforms. Knowledge of other programming languages (e.g., Python, Java). Experience with big data technologies (e.g., Hadoop, Spark). About Virtusa: Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth, one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
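To make the extract-transform-load cycle described above concrete, a self-contained sketch in plain Python, standing in for an Informatica/Pentaho mapping; the source data and target table are invented for illustration:

```python
"""Minimal extract-transform-load sketch (hypothetical data and table)."""
import csv
import io
import sqlite3

# Extract: a CSV source, inlined here so the example is self-contained.
source = io.StringIO(
    "order_id,amount,currency\n"
    "1,100.50,inr\n"
    "2,,inr\n"        # incomplete record, will be dropped
    "3,75.00,usd\n"
)
rows = list(csv.DictReader(source))

# Transform: drop incomplete records, cast types, normalise currency codes.
clean = [
    (int(r["order_id"]), float(r["amount"]), r["currency"].upper())
    for r in rows
    if r["amount"]
]

# Load: into a target table (SQLite here for portability).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL, currency TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", clean)
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone())
```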
Posted 1 month ago
7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description: 7+ years total IT experience and 3+ years of Pentaho (PDI) tool experience. Business intelligence expertise and technical skills to successfully deliver Data Integration and Business Intelligence solutions based on the Pentaho Data Integration (PDI) tool. Expertise in developing jobs and transformations for different business solutions in the Pentaho Data Integration tool. Analysing, designing, testing and coding of data integration/data migration logic according to user requirements. Experience in administration, configuration and maintenance of multiple Pentaho environments. Strong skills in Oracle SQL and PL/SQL; candidate must be proficient in SQL programming to write complex queries including joins, triggers, views, stored procedures, etc. Exposure to performance tuning approaches/techniques. Hands-on experience in Windows batch scripting and UNIX shell scripting; Oracle SQL, MySQL and PostgreSQL. Skills: ETL, Linux, UNIX shell scripting, Pentaho Data Integration (PDI).
Posted 1 month ago
1.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
About BNP Paribas India Solutions: Established in 2005, BNP Paribas India Solutions is a wholly owned subsidiary of BNP Paribas SA, the European Union's leading bank with an international reach. With delivery centers located in Bengaluru, Chennai and Mumbai, we are a 24x7 global delivery center. India Solutions services three business lines: Corporate and Institutional Banking, Investment Solutions and Retail Banking for BNP Paribas across the Group. Driving innovation and growth, we are harnessing the potential of over 10,000 employees to provide support and develop best-in-class solutions. About BNP Paribas Group: BNP Paribas is the European Union's leading bank and a key player in international banking. It operates in 65 countries and has nearly 185,000 employees, including more than 145,000 in Europe. The Group has key positions in its three main fields of activity: Commercial, Personal Banking & Services for the Group's commercial & personal banking and several specialized businesses including BNP Paribas Personal Finance and Arval; Investment & Protection Services for savings, investment, and protection solutions; and Corporate & Institutional Banking, focused on corporate and institutional clients. Based on its strong diversified and integrated model, the Group helps all its clients (individuals, community associations, entrepreneurs, SMEs, corporates and institutional clients) to realize their projects through solutions spanning financing, investment, savings and protection insurance. In Europe, BNP Paribas has four domestic markets: Belgium, France, Italy, and Luxembourg. The Group is rolling out its integrated commercial & personal banking model across several Mediterranean countries, Turkey, and Eastern Europe. As a key player in international banking, the Group has leading platforms and business lines in Europe, a strong presence in the Americas, as well as a solid and fast-growing business in Asia-Pacific. BNP Paribas has implemented a Corporate Social Responsibility approach in all its activities, enabling it to contribute to the construction of a sustainable future while ensuring the Group's performance and stability. Commitment to Diversity and Inclusion: At BNP Paribas, we passionately embrace diversity and are committed to fostering an inclusive workplace where all employees are valued, respected and can bring their authentic selves to work. We prohibit discrimination and harassment of any kind, and our policies promote equal employment opportunity for all employees and applicants, irrespective of, but not limited to, their gender, gender identity, sex, sexual orientation, ethnicity, race, colour, national origin, age, religion, social status, mental or physical disabilities, veteran status, etc. As a global bank, we truly believe that the inclusion and diversity of our teams is key to our success in serving our clients and the communities we operate in. About Business Line/Function: A new role is open within the Services Division, Java/Web development team in India, for dedicated work on the WITT Program, an IT transformation program focused on the Investment Services stream, specifically dealing with front-office portfolio management systems and client channel requirements.
Job Title: Senior Software Engineer
Date: 19/7/2025
Department: WM IT HUB – Services Division
Location: India
Business Line / Function: Web-Java Developer
Reports To (Direct): Web/Java Technical Leader in Services Division; (Functional): IT Lead – WITT Investment Services Project
Grade (if applicable):
Number Of Direct Reports:
Directorship / Registration: NA
Position Purpose: The opening is for a Java developer who can design and deliver new solutions for upcoming changes to existing as well as new assets/applications within the Services division. This role is technology oriented, which includes defining development methodologies, hands-on development for new projects/evolutions, production L3 (developer-level) support, providing technical solutions, and performing POCs for relevant new frameworks available in the market. Knowledge and experience in the Wealth Management (Private Banking) domain is good to have (portfolio analysis, order management and advisory functions). Knowledge and experience in the Temenos Wealth Front Office Suite is a big advantage and preferred.
Responsibilities
Direct Responsibilities: Analyze, discuss and document solution architecture and design with leads and architects. Do backend development primarily and frontend development as a secondary skill. Perform quality unit and integration tests (test- and business-driven development preferred). Engage with the external vendor's development and delivery team and collaborate effectively on delivery and development of the product. Assist the test team in performance and endurance testing. Write technical specifications. Proactively propose technical solutions and improvements, based on pros and cons after analysis of the various options. Prepare technical deliveries and support implementation by integration teams, following the processes and tools of the bank. Contribute to key and strategic projects. Perform quality reporting and controls in line with BNP Paribas standards and security. Follow Scrum agile ceremonies with discipline and responsibility. Maintain proper BNPP governance, code and conduct at all times.
Contributing Responsibilities
Technical & Behavioral Competencies
Functional knowledge (minimum proficiency level):
- General knowledge about private banking, asset management or corporate banking: Medium (at least 1 year's experience)
- Regulatory framework, compliance projects: not mandatory, but a great advantage
Technical skills (minimum proficiency level):
Database: Medium (at least 5 years' experience)
- Oracle SQL: must have
- Oracle PL/SQL: beginner; good to have
- PostgreSQL: must have
Scripting
- Shell scripting (Bash, KSH): beginner; good to have
Cloud
- IBM Cloud: very good to have
- Any other cloud (AWS, Google, Alibaba, Azure, etc.): good to have
Middleware: Medium (at least 5 years' experience)
- Apache Tomcat: must have
- IBM WebSphere Application Server: beginner; good to have
- Apache HTTPD: medium
- IBM HTTP Server: beginner; good to have
- Docker, Kubernetes (container-managed solutions): must have
- Apache Kafka (distributed event streaming platform): very good to have
Backend development: Advanced (at least 8 years' experience)
- Java 11-21 (collections, concurrency, streams, functional programming/lambdas, file I/O, network I/O, JCache): must have
- ORM frameworks (MyBatis and Spring Data JPA): must have
- Build tools (Maven): must have
- Spring Framework (Spring Core, Spring Boot, Spring MVC, Spring Security, Spring Integration, Spring AOP, Spring Batch, Spring Transaction Management, Spring Web): must have
- REST APIs (Spring REST controllers and RestTemplate, Feign HTTP client, Apache HttpClient, Swagger specifications, standard HTTP error management, standard HTTP methods): must have
- Continuous integration and deployment (Jenkins, Serena, Ansible): good to have
- Testing utilities (Mockito, EasyMock, WireMock, JMeter, Postman): must have
- Monitoring (Spring Actuator, JMX, JConsole, JProfiler): good to have
- XML and JSON tools (JAXB, Jackson, XStream, EclipseLink MOXy): good to have
- Reporting (Eclipse BIRT, Hitachi Pentaho): good to have
- Tuning and performance (clustering, memory management, application high availability and load balancing): good to have
- Network protocols (JMS, HTTP/HTTPS, SOAP, JDBC): must have
- Security (digital signatures, SSL certificates, public/private keys, TLS): good to have
- Logging (Log4j, Logback, SLF4J): must have
Frontend development: Medium (at least 3 years' experience)
- HTML5: good to have
- CSS3: good to have
- JavaScript: must have
- jQuery: must have
- TypeScript: must have
- JSP: must have
- AJAX: must have
- Angular 2+: must have
- Unit testing (Karma and Jasmine): good to have
- Bootstrap: good to have
Temenos Triple’A Plus product
- Triple’A TSL/API framework: beginner; very good to have
- Triple’A scripting and system set-up: beginner; very good to have
Other skills (minimum proficiency level):
- Communication skills: excellent
- Team player: excellent
- Analytical skills: excellent
Specific Qualifications (if required). Minimum qualifications and experience: Minimum 1 year of experience in the banking and finance IT industry (very good to have). 8+ years of experience in the IT software development industry (mandatory). Other value-added competencies: Working in Agile/Scrum methodology (good to have). Temenos Triple’A Plus product experience is a strong advantage. Strong ability to speak and listen effectively and fluently in English; knowledge of the French language is good to have. Ability to work efficiently and effectively in a multicultural environment with teams located across Paris, Lisbon, Geneva, Chennai, Singapore, Hong Kong, Bengaluru and Mumbai.
Skills Referential
Behavioural skills: Ability to collaborate/teamwork; communication skills (oral and written); attention to detail/rigour; adaptability.
Transversal skills: Ability to understand, explain and support change; ability to anticipate business/strategic evolution; ability to develop and leverage networks; ability to set up relevant performance indicators.
Education level: Bachelor's degree or equivalent.
Experience level: At least 7 years.
Other/specific qualifications (if required): Written and oral skills in the French language; solution architecture and enterprise architecture knowledge and certifications; IBM Cloud knowledge and certification.
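As an aside, the "standard HTTP error management" called for under REST APIs boils down to handling status codes and transient failures consistently. A minimal sketch using Python's requests library (the endpoint URL is hypothetical; the role's actual stack is Spring RestTemplate/Feign on Java):

```python
"""Sketch: standard HTTP error management with retries on transient 5xx."""
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
# Retry idempotent calls on transient gateway errors, with backoff.
retry = Retry(total=3, backoff_factor=0.5, status_forcelist=[502, 503, 504])
session.mount("https://", HTTPAdapter(max_retries=retry))

try:
    # Hypothetical endpoint, used only to illustrate the pattern.
    resp = session.get("https://api.example.com/portfolios/123", timeout=5)
    resp.raise_for_status()  # turn 4xx/5xx responses into exceptions
    print(resp.json())
except requests.HTTPError as err:
    print(f"Request failed with status {err.response.status_code}")
except requests.RequestException as err:
    print(f"Transport-level failure: {err}")
```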
Posted 1 month ago
7.0 - 12.0 years
30 - 35 Lacs
Pune
Work from Office
Role Description: The Engineer is responsible for managing or performing work across multiple areas of the bank's overall IT platform/infrastructure, including analysis, development, and administration. It may also involve taking functional oversight of engineering delivery for specific departments. Work includes: Planning and developing entire engineering solutions to accomplish business goals. Building reliability and resiliency into solutions with appropriate testing and reviewing throughout the delivery lifecycle. Ensuring maintainability and reusability of engineering solutions. Ensuring solutions are well architected and can be integrated successfully into the end-to-end business process flow. Reviewing engineering plans and quality to drive re-use and improve engineering capability. Participating in industry forums to drive adoption of innovative technologies, tools and solutions in the Bank. Your key responsibilities (Your Role - What You'll Do): As a SQL Engineer, you would be responsible for the design, development and optimization of complex database systems. You would be writing efficient SQL queries and stored procedures, and you possess expertise in data modeling, performance optimization and working with large-scale relational databases. Key Responsibilities: Design, develop and optimize complex SQL queries, stored procedures, views and functions. Work with large datasets to perform data extraction, transformation and loading (ETL). Develop and maintain scalable database schemas and models. Troubleshoot and resolve database-related issues, including performance bottlenecks and data quality concerns. Maintain data security and compliance with data governance policy. Your skills and experience (Skills You'll Need): Must have: 8+ years of hands-on experience with SQL in relational databases (SQL Server, Oracle, MySQL, PostgreSQL). Strong working experience with PL/SQL and T-SQL. Strong understanding of data modelling, normalization and relational DB design. Desirable skills that will help you excel: Ability to write high-performance, highly resilient queries in Oracle/PostgreSQL/MSSQL. Working knowledge of database modelling techniques like star schema, fact-dimension models and Data Vault. Awareness of database tuning methods like AWR reports, indexing, partitioning of data sets, defining tablespace sizes and user roles, etc. Hands-on experience with ETL tools (Pentaho/Informatica/StreamSets). Good experience in performance tuning, query optimization and indexing. Hands-on experience with object storage and scheduling tools. Experience with cloud-based data services like data lakes, data pipelines, and machine learning platforms. Educational Qualifications: Bachelor's degree in Computer Science/Engineering or a relevant technology or science discipline. Technology certifications from any industry-leading cloud providers.
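For illustration, a small sketch of the index-driven query tuning this role centres on, using SQLite for portability (the role's actual targets are Oracle, MSSQL and PostgreSQL; the table and column names are hypothetical):

```python
"""Sketch: checking a query plan before and after adding an index."""
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (id INTEGER PRIMARY KEY, book TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO trades (book, amount) VALUES (?, ?)",
    [(f"BOOK{i % 50}", i * 1.5) for i in range(10_000)],
)

query = "SELECT SUM(amount) FROM trades WHERE book = ?"

# Before indexing: the plan shows a full table scan.
print(conn.execute(f"EXPLAIN QUERY PLAN {query}", ("BOOK7",)).fetchall())

# Add an index on the filter column, then re-check the plan: it now
# reports an index search instead of a scan.
conn.execute("CREATE INDEX idx_trades_book ON trades (book)")
print(conn.execute(f"EXPLAIN QUERY PLAN {query}", ("BOOK7",)).fetchall())
```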
Posted 1 month ago
5.0 years
0 Lacs
Delhi
On-site
The Role Context: This is an exciting opportunity to join a dynamic and growing organization, working at the forefront of technology trends and developments in the social impact sector. Wadhwani Center for Government Digital Transformation (WGDT) works with government ministries and state departments in India with a mission of "Enabling digital transformation to enhance the impact of government policy, initiatives and programs". We are seeking a highly motivated and detail-oriented individual to join our team as a Data Engineer with experience in designing, constructing, and maintaining the architecture and infrastructure necessary for data generation, storage and processing, contributing to the successful implementation of digital government policies and programs. You will play a key role in developing robust, scalable, and efficient systems to manage large volumes of data, making it accessible for analysis and decision-making, and driving innovation and optimized operations across various government ministries and state departments in India. Key Responsibilities: a. Data Architecture Design: Design, develop, and maintain scalable data pipelines and infrastructure for ingesting, processing, storing, and analyzing large volumes of data efficiently. This involves understanding business requirements and translating them into technical solutions. b. Data Integration: Integrate data from various sources such as databases, APIs, streaming platforms, and third-party systems. Ensure the data is collected reliably and efficiently, maintaining data quality and integrity throughout the process as per the ministries'/government data standards. c. Data Modeling: Design and implement data models to organize and structure data for efficient storage and retrieval, using techniques such as dimensional modeling, normalization, and denormalization depending on the specific requirements of the project. d. Data Pipeline Development/ETL (Extract, Transform, Load): Develop data pipeline/ETL processes to extract data from source systems, transform it into the desired format, and load it into the target data systems. This involves writing scripts, using ETL tools, or building data pipelines to automate the process and ensure data accuracy and consistency. e. Data Quality and Governance: Implement data quality checks and data governance policies to ensure data accuracy, consistency, and compliance with regulations (a minimal illustration of such checks appears after this posting). Design and track data lineage, data stewardship, metadata management, business glossaries, etc. f. Data Lakes and Warehousing: Design and maintain data lakes and data warehouses to store and manage structured data from relational databases, semi-structured data like JSON or XML, and unstructured data such as text documents, images, and videos at any scale. Integrate with big data processing frameworks such as Apache Hadoop, Apache Spark, and Apache Flink, as well as with machine learning and data visualization tools. g. Data Security: Implement security practices, technologies, and policies designed to protect data from unauthorized access, alteration, or destruction throughout its lifecycle. This includes data access, encryption, data masking and anonymization, data loss prevention, and compliance with regulatory requirements such as DPDP, GDPR, etc. h. Database Management: Administer and optimize databases, both relational and NoSQL, to manage large volumes of data effectively. i. Data Migration: Plan and execute data migration projects to transfer data between systems while ensuring data consistency and minimal downtime. j. Performance Optimization: Optimize data pipelines and queries for performance and scalability. Identify and resolve bottlenecks, tune database configurations, and implement caching and indexing strategies to improve data processing speed and efficiency. k. Collaboration: Collaborate with data scientists, analysts, and other stakeholders to understand their data requirements and provide them with access to the necessary data resources. Work closely with IT operations teams to deploy and maintain data infrastructure in production environments. l. Documentation and Reporting: Document work including data models, data pipelines/ETL processes, and system configurations. Create documentation and provide training to other team members to ensure the sustainability and maintainability of data systems. m. Continuous Learning: Stay updated with the latest technologies and trends in data engineering and related fields. Participate in training programs, attend conferences, and engage with the data engineering community to enhance skills and knowledge. Desired Skills/Competencies: Education: A Bachelor's or Master's degree in Computer Science, Software Engineering, Data Science, or equivalent, with at least 5 years of experience. Database Management: Strong expertise in working with databases, such as SQL databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra). Big Data Technologies: Familiarity with big data technologies, such as Apache Hadoop, Spark, and related ecosystem components, for processing and analyzing large-scale datasets. ETL Tools: Experience with ETL tools (e.g., Apache NiFi, Talend, Apache Airflow, Talend Open Studio, Pentaho, InfoSphere) for designing and orchestrating data workflows. Data Modeling and Warehousing: Knowledge of data modeling techniques and experience with data warehousing solutions (e.g., Amazon Redshift, Google BigQuery, Snowflake). Data Governance and Security: Understanding of data governance principles and best practices for ensuring data quality and security. Cloud Computing: Experience with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services for scalable and cost-effective data storage and processing. Streaming Data Processing: Familiarity with real-time data processing frameworks (e.g., Apache Kafka, Apache Flink) for handling streaming data. KPIs: Data Pipeline Efficiency: Measure the efficiency of data pipelines in terms of data processing time, throughput, and resource utilization. KPIs could include average time to process data, data ingestion rates, and pipeline latency. Data Quality Metrics: Track data quality metrics such as completeness, accuracy, consistency, and timeliness of data. KPIs could include data error rates, missing values, data duplication rates, and data validation failures. System Uptime and Availability: Monitor the uptime and availability of data infrastructure, including databases, data warehouses, and data processing systems. KPIs could include system uptime percentage, mean time between failures (MTBF), and mean time to repair (MTTR). Data Storage Efficiency: Measure the efficiency of data storage systems in terms of storage utilization, data compression rates, and data retention policies. KPIs could include storage utilization rates, data compression ratios, and data storage costs per unit.
Data Security and Compliance: Track adherence to data security policies and regulatory compliance requirements such as DPDP, GDPR, HIPAA, or PCI DSS. KPIs could include security incident rates, data access permissions, and compliance audit findings. Data Processing Performance: Monitor the performance of data processing tasks such as ETL (Extract, Transform, Load) processes, data transformations, and data aggregations. KPIs could include data processing time, CPU usage, and memory consumption. Scalability and Performance Tuning: Measure the scalability and performance of data systems under varying workloads and data volumes. KPIs could include scalability benchmarks, system response times under load, and performance improvements achieved through tuning. Resource Utilization and Cost Optimization: Track resource utilization and costs associated with data infrastructure, including compute resources, storage, and network bandwidth. KPIs could include cost per data unit processed, cost per query, and cost savings achieved through optimization. Incident Response and Resolution: Monitor the response time and resolution time for data-related incidents and issues. KPIs could include incident response time, time to diagnose and resolve issues, and customer satisfaction ratings for support services. Documentation and Knowledge Sharing: Measure the quality and completeness of documentation for data infrastructure, data pipelines, and data processes. KPIs could include documentation coverage, documentation update frequency, and knowledge-sharing activities such as internal training sessions or knowledge base contributions. Years of experience of the current role holder: New position. Ideal years of experience: 3 – 5 years. Career progression for this role: CTO, WGDT (Head of Incubation Centre). Our Culture: WF is a global not-for-profit and works like a start-up, at a fast-moving, dynamic pace where change is the only constant and flexibility is the key to success. Three mantras that we practice across job roles, levels, functions, programs and initiatives are Quality, Speed, Scale, in that order. We are an ambitious and inclusive organization, where everyone is encouraged to contribute and ideate. We are intensely and insanely focused on driving excellence in everything we do. We want individuals with the drive for excellence, and the passion to do whatever it takes to deliver world-class outcomes to our beneficiaries. We set our own standards, often more rigorous than what our beneficiaries demand, and we want individuals who love it this way. We have a creative and highly energetic environment, one in which we look to each other to innovate new solutions not only for our beneficiaries but for ourselves too. Individuals open to collaborating with a borderless mentality, often going beyond the hierarchy and siloed definitions of functional KRAs, will thrive in our environment. This is a workplace where expertise is shared with colleagues around the globe. Individuals uncomfortable with change, constant innovation, and short learning cycles, and those looking for stability and orderly working days, may not find WF to be the right place for them. Finally, we want individuals who want to do greater good for society, leveraging their area of expertise, skills and experience. The foundation is an equal opportunity firm with no bias towards gender, race, colour, ethnicity, country, language, age or any other dimension that comes in the way of progress. Join us and be a part of us! Education: Bachelor's in Technology / Master's in Technology.
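A minimal sketch of the data-quality checks and KPI counters described in this posting; the records, field names and metrics below are hypothetical:

```python
"""Sketch: simple completeness and duplication checks for a data feed."""
records = [
    {"id": 1, "scheme": "PMAY", "amount": 1200.0},
    {"id": 2, "scheme": None,   "amount": 540.0},   # missing value
    {"id": 1, "scheme": "PMAY", "amount": 1200.0},  # duplicate key
]

def quality_report(rows, key="id", required=("id", "scheme", "amount")):
    """Return completeness and duplication KPIs for a batch of rows."""
    total = len(rows)
    missing = sum(1 for r in rows for field in required if r.get(field) is None)
    seen, dupes = set(), 0
    for r in rows:
        dupes += r[key] in seen  # True counts as 1
        seen.add(r[key])
    return {
        "records": total,
        "missing_value_rate": missing / (total * len(required)),
        "duplicate_rate": dupes / total,
    }

print(quality_report(records))
```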
Posted 1 month ago
8.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Location: Mumbai. About Us: StayVista is India's largest villa hospitality brand and has redefined group getaways. Our handpicked luxury villas are present in every famous holiday destination across the country. We curate unique experiences paired with top-notch hospitality, creating unforgettable stays. Here, you will be a part of our passionate team, dedicated to crafting exceptional getaways and curating one-of-a-kind homes. We are a close-knit tribe, united by a shared love for travel and on a mission to become the most loved hospitality brand in India. Why Work With Us? At StayVista, you're part of a community where your ideas and growth matter. We're a fast-growing team that values continuous improvement. With our skill upgrade programs, you'll keep learning and evolving, just like we do. And hey, when you're ready for a break, our villa discounts make it easy to enjoy the luxury you help create. Your Role: As an Associate General Manager – Business Intelligence, you will lead data-driven decision-making by transforming complex datasets into strategic insights. You will optimize data pipelines, automate workflows, and integrate AI-powered solutions to enhance efficiency. Your expertise in database management, statistical analysis, and visualization will support business growth, while collaboration with leadership and cross-functional teams will drive impactful analytics strategies. About You: 8+ years of experience in Business Intelligence, Revenue Management, or Data Analytics, with a strong ability to turn data into actionable insights. Bachelor's or Master's degree in Business Analytics, Data Science, Computer Science, or a related field. Skilled in designing, developing, and implementing end-to-end BI solutions to improve decision-making. Proficient in ETL processes using SQL, Python, and R, ensuring accurate and efficient data handling. Experienced in Google Looker Studio, Apache Superset, Power BI, and Tableau to create clear, real-time dashboards and reports (a small example of the kind of aggregation that feeds such dashboards appears after this posting). Develop, document and support ETL mappings, database structures and BI reports. Develop ETL using tools such as Pentaho/Talend or as per project requirements. Participate in the UAT process and ensure quick resolution of any UAT or data issue. Manage different environments and be responsible for proper deployment of reports/ETLs in all client environments. Interact with Business and Product teams to understand and finalize the functional requirements. Responsible for timely deliverables and quality. Skilled at analyzing industry trends and competitor data to develop effective pricing and revenue strategies. Demonstrated understanding of data warehouse concepts, ETL concepts, ETL loading strategy, data archiving, data reconciliation, ETL error handling, error-logging mechanisms, standards and best practices. Cross-functional Collaboration: Partner with Product, Marketing, Finance, and Operations to translate business requirements into analytical solutions. Key Metrics, what you will drive and achieve: Data-Driven Decision Making & Business Impact; Revenue Growth & Cost Optimization; Cross-Functional Collaboration & Leadership Impact; BI & Analytics Efficiency and AI Automation Integration. Our Core Values: Are you a CURATER? Curious: Here, your curiosity fuels innovation. User-Centric: You'll anticipate the needs of all our stakeholders and exceed expectations. Resourceful: You'll creatively optimise our resources with solutions that elevate experiences in unexpected ways.
Aspire: Keep learning, keep growing, because we're all about continuous improvement. Trust: Trust is our foundation. You'll work in a transparent, reliable, and fair environment. Enjoy: We believe in having fun while building something extraordinary. Business Acumen: You know our services, business drivers, and industry trends inside out. You anticipate challenges in your area, weigh the impact of decisions, and track competitors to stay ahead, viewing risk as a chance to excel. Change Management: You embrace change and actively look for opportunities to improve efficiency. You navigate ambiguity well, promote innovation within the team, and take ownership of implementing fresh ideas. Leadership: You provide direction, delegate effectively, and empower your team to take ownership. You foster passion and pride in achieving goals, holding yourself accountable for the team's successes and failures. Customer Centricity: You know your customers' business and proactively find solutions to resolve their challenges. By building rapport and anticipating issues, you ensure smooth, win-win interactions while keeping stakeholders in the loop. Teamwork: You actively seek input from others, work across departments, and leverage team diversity to drive success. By fostering an open environment, you encourage constructive criticism and share knowledge to achieve team goals. Result Orientation: You set clear goals for yourself and your team, overcoming obstacles with a positive, solution-focused mindset. You take ownership of outcomes and make informed decisions based on cost-benefit analysis. Planning and Organizing: You analyze information systematically, prioritize tasks, and delegate effectively. You optimize processes to drive efficiency and ensure compliance with organizational standards. Communication: You communicate with confidence and professionalism, balancing talking and listening to foster open discussions. You identify key players and use the right channels to ensure clarity and gain support. StayVista is proud to be an equal opportunity employer. We do not discriminate in hiring or any employment decisions based on race, colour, religion, caste, creed, nationality, age, sex, including pregnancy, childbirth, or related medical conditions, marital status, ancestry, physical or mental disability, genetic information, veteran status, gender identity or expression, sexual orientation, or any other characteristic protected under applicable laws.
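For flavour, a tiny sketch of the kind of aggregation that would feed a Looker Studio/Power BI dashboard like those mentioned above; the bookings data and field names are invented:

```python
"""Sketch: revenue-by-destination rollup feeding a BI dashboard table."""
from collections import defaultdict

# Hypothetical bookings extract; in practice this would come from the
# warehouse via an ETL job rather than an inline list.
bookings = [
    {"villa": "Alibaug A",  "destination": "Alibaug",  "revenue": 85000},
    {"villa": "Lonavala B", "destination": "Lonavala", "revenue": 62000},
    {"villa": "Alibaug C",  "destination": "Alibaug",  "revenue": 91000},
]

revenue_by_destination = defaultdict(float)
for booking in bookings:
    revenue_by_destination[booking["destination"]] += booking["revenue"]

# Emit rows in the shape a BI tool would read from a reporting table.
for destination, revenue in sorted(revenue_by_destination.items()):
    print(f"{destination}: {revenue:,.0f}")
```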
Posted 1 month ago
6.0 - 11.0 years
5 - 9 Lacs
Pune
Work from Office
* Minimum 6 years of experience as an ETL Pentaho developer. * Candidates with exposure to the latest versions (7/8) are preferred. * Excellent data analysis skills. * Good experience in the Pentaho BI Suite (Pentaho Data Integration Designer / Kettle, Pentaho Report Designer, Pentaho Design Studio, Pentaho Enterprise Console, Pentaho BI Server, Pentaho Metadata, Pentaho Analysis View, Pentaho Analyzer & Mondrian). * Experience in performing data masking/protection using Pentaho Data Integration. * Experience in creating ETL pipelines, including extraction, transformation, merging, filtering, joining, cleansing, scheduling, monitoring and troubleshooting using Pentaho. * Comfortable working within RDBMS systems, e.g. PostgreSQL, Oracle. * Analytical with good problem-solving skills. * Excellent communication skills. Responsibilities: * Lead all payment screening initiatives as the technical lead * Develop and implement strategies for false positive reduction * Act as a liaison between business stakeholders and technical teams * Oversee tuning operations for payment screening systems * Manage multiple concurrent projects in a fast-paced environment * Provide expert guidance on sanctions compliance and regulatory requirements * Drive innovation in payment screening methodologies * Communicate complex technical concepts to diverse audiences. Preferred skills: * Knowledge of AI/ML applications in payment screening * Experience with regulatory compliance in multiple jurisdictions * Background in financial services technology implementation * Product ownership certification or equivalent experience
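Since the posting highlights data masking with PDI, here is a minimal sketch of the underlying idea: deterministic hashing of PII so records stay joinable but unreadable. In practice this would be a PDI transformation step; Python is used only for illustration, and the salt shown is hypothetical:

```python
"""Sketch: deterministic masking of PII columns before loading."""
import hashlib

SALT = b"rotate-me-per-environment"  # hypothetical; keep in a secrets vault

def mask(value: str) -> str:
    """Deterministic, irreversible mask: same input yields same token,
    so masked records can still be joined on the masked column."""
    return hashlib.sha256(SALT + value.encode("utf-8")).hexdigest()[:16]

customers = [{"name": "Asha Rao", "pan": "ABCDE1234F"}]
masked = [{"name": mask(c["name"]), "pan": mask(c["pan"])} for c in customers]
print(masked)
```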
Posted 1 month ago
8.0 - 13.0 years
37 - 40 Lacs
Pune
Work from Office
Job Title: Senior Data Engineer. Corporate Title: AVP. Location: Pune, India. Role Description: Technology Management is responsible for improving the technological aspects of operations to reduce infrastructure costs, improve functional performance and help deliver divisional business goals. To achieve this, the organization needs to be engineering focused. We are looking for technologists who demonstrate a passion to build the right thing in the right way. We are looking for an experienced SQL developer to help build our data integration layer utilizing the latest tools and technologies. In this critical role you will become part of a motivated and talented team operating within a creative environment. You should have a passion for writing and designing complex data models, stored procedures, and tuning queries that push the boundaries of what is possible and exists within the bank today. What we'll offer you: 100% reimbursement under the childcare assistance benefit (gender neutral). Sponsorship for industry-relevant certifications and education. Accident and term life insurance. Your key responsibilities: Part of a global team, forging strong relationships with geographically diverse teams/colleagues and businesses to formulate and execute technology strategy. Production of code-based assets within the context of agile delivery (helping define and meet epics, stories, acceptance criteria). Responsible for the design, development and QA of those assets and outputs. Perform review of component integration testing, unit testing and code review. Write high-performance, highly resilient queries in Oracle PL/SQL and Microsoft SQL Server T-SQL. Experience working with agile/continuous-integration/test technologies such as git/Stash, Jenkins, Artifactory. Work in a fast-paced, high-energy team environment. Developing scalable applications using ETL technology like StreamSets, Pentaho, Informatica, etc. Design and develop dashboards for business and reporting using the preferred BI tools, e.g. Power BI or QlikView. Thorough understanding of relational databases and knowledge of different data models. Well versed with SQL and able to understand and debug database objects like stored procedures, functions, etc. Managing a data modelling tool like PowerDesigner, MySQL Workbench, etc. Agile (Scrum) based delivery practices, test-driven development, test automation, continuous delivery. Passion for learning new technologies. Your skills and experience: Education/Certification: Bachelor's degree from an accredited college or university with a concentration in Science, Engineering or an IT-related discipline (or equivalent). Fluent English (written/verbal). Excellent communication and influencing skills. Ability to work in a fast-paced environment. Passion for sharing knowledge and best practice. Ability to work in virtual teams and in matrixed organizations. How we'll support you: About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm
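A small sketch of what "highly resilient queries" can mean at the application layer: retrying transient failures with exponential backoff. SQLite stands in for the real Oracle/MSSQL driver so the example runs anywhere; a production driver (e.g. cx_Oracle or pyodbc) would supply its own transient-error types:

```python
"""Sketch: retry wrapper for transient database failures."""
import time
import sqlite3  # stand-in driver so the example is self-contained

def run_with_retry(query, params=(), retries=3, base_delay=0.2):
    """Run a query, retrying transient errors with exponential backoff."""
    last_err = None
    for attempt in range(retries):
        try:
            with sqlite3.connect(":memory:") as conn:
                conn.execute("CREATE TABLE t (x INTEGER)")  # demo fixture
                return conn.execute(query, params).fetchall()
        except sqlite3.OperationalError as err:  # this driver's transient class
            last_err = err
            time.sleep(base_delay * (2 ** attempt))  # exponential backoff
    raise last_err

print(run_with_retry("SELECT COUNT(*) FROM t"))
```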
Posted 1 month ago
5.0 - 10.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Experience: 8+ years of Data Engineering experience. 3+ years of experience with cloud platform services (preferably GCP). 2+ years of hands-on experience with Pentaho. Hands-on experience in building and optimizing data pipelines and data sets. Hands-on experience with data extraction and transformation tasks while taking care of data security, error handling and pipeline performance. Hands-on experience with relational SQL (Oracle, SQL Server or MySQL) and NoSQL databases. Advanced SQL experience: creating and debugging stored procedures, functions, triggers and object types in PL/SQL statements. Hands-on experience with programming languages: Java (mandatory), Go, Python. Hands-on experience in unit testing data pipelines. Experience in using Pentaho Data Integration (Kettle/Spoon) and debugging issues. Experience supporting and working with cross-functional teams in a dynamic environment. Technical Skills: Programming & Languages: Java. Database Tech: Oracle, Spanner, BigQuery, Cloud Storage. Operating Systems: Linux. Good knowledge and understanding of cloud-based ETL frameworks and tools. Good understanding and working knowledge of batch and streaming data processing. Good understanding of Data Warehousing architecture. Knowledge of open table and file formats (e.g. Delta, Hudi, Iceberg, Avro, Parquet, JSON, CSV). Strong analytic skills related to working with unstructured datasets. Excellent numerical and analytical skills. Responsibilities: Design and develop various standard/reusable ETL jobs and pipelines. Work with the team in extracting the data from different data sources like Oracle, cloud storage and flat files. Work with database objects including tables, views, indexes, schemas, stored procedures, functions, and triggers. Work with the team to troubleshoot and resolve issues in job logic as well as performance. Write ETL validations based on design specifications for unit testing. Work with the BAs and the DBAs for requirements gathering, analysis, testing, metrics and project coordination.
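Because the posting calls out unit testing of data pipelines, a minimal sketch of testing a single transformation step in isolation; the transform itself is a hypothetical stand-in for a PDI or Java step:

```python
"""Sketch: unit-testing one pipeline transformation in isolation."""
import unittest

def normalise_amount(row: dict) -> dict:
    """Hypothetical transform: cast amount to float, default nulls to 0."""
    out = dict(row)
    out["amount"] = float(row.get("amount") or 0)
    return out

class NormaliseAmountTest(unittest.TestCase):
    def test_casts_string_amounts(self):
        self.assertEqual(normalise_amount({"amount": "12.5"})["amount"], 12.5)

    def test_defaults_missing_amount_to_zero(self):
        self.assertEqual(normalise_amount({})["amount"], 0.0)

if __name__ == "__main__":
    unittest.main()
```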
Posted 1 month ago
6.0 - 11.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Immediate job opening for #SFI Vlocity_C2H_Pan India. Skill: Pentaho Developer. Must-have skillset: Pentaho ETL tool; expert in Unix shell scripting, Python and PL/SQL scripting. Added advantage: Hive/Hadoop knowledge; experience on migration projects; candidates with Java knowledge. Tooling: should have exposure to Control-M, GitHub, JIRA, Confluence, CI/CD pipelines, Jenkins, etc.
Posted 1 month ago