
272 Pentaho Jobs - Page 6

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 5.0 years

4 - 8 Lacs

Coimbatore

Work from Office

Educational Qualification: Bachelor of Engineering. Service Line: Equinox.

Responsibilities - A day in the life of an Infosys Equinox employee: As part of the Infosys Equinox delivery team, your primary role would be to ensure effective design, development, validation and support activities, so that our clients are satisfied with the high levels of service in the technology domain. You will gather requirements and specifications to understand client needs in detail and translate them into system requirements. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers.

Required skills:
- Clear understanding of HTTP / network protocol concepts, design and operations: TCP dump, cookies, sessions, headers, client-server architecture.
- Core strength in Linux and Azure infrastructure provisioning, including VNet, Subnet, Gateway, VM, security groups, MySQL, Blob Storage, Azure Cache, AKS clusters, etc.
- Expertise in automating Infrastructure as Code using Terraform, Packer, Ansible, shell scripting and Azure DevOps.
- Expertise in patch management and APM tools such as AppDynamics and Instana for monitoring and alerting.
- Knowledge of technologies including Apache Solr, MySQL, MongoDB, ZooKeeper, RabbitMQ, Pentaho, etc.
- Knowledge of cloud platforms such as AWS and GCP is an added advantage.
- Ability to identify and automate recurring tasks for better productivity.
- Ability to understand and implement industry-standard security solutions.
- Experience implementing auto scaling, DR, HA and multi-region deployments with best practices is an added advantage.
- Ability to work under pressure, managing expectations from various key stakeholders.
You would be a key contributor to building efficient programs and systems.

Additional Responsibilities: Knowledge of design principles and fundamentals of architecture; understanding of performance engineering; knowledge of quality processes and estimation techniques; basic understanding of the project domain; ability to translate functional/non-functional requirements into system requirements; ability to design and code complex programs; ability to write test cases and scenarios based on the specifications; good understanding of SDLC and agile methodologies; awareness of the latest technologies and trends; logical thinking and problem-solving skills along with an ability to collaborate.

Technical and Professional Skills: AWS/Azure/GCP, Linux, shell scripting, IaC, Docker, Kubernetes, Jenkins, GitHub.

Preferred Skills: Technology-Cloud Platform-AWS, Database Technology-Open System-Shell scripting, Technology-Open System-Linux, Technology-Container Platform-Docker, Technology-Cloud Platform-Azure DevOps, Technology-Cloud Platform-GCP Database
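The listing's emphasis on automating recurring operational tasks can be illustrated with a minimal Python sketch; the function name and threshold here are illustrative assumptions, not from the posting:

```python
import shutil

def check_disk_usage(path="/", threshold_pct=80.0):
    """Report filesystem usage for a mount point and flag it when it crosses a threshold."""
    usage = shutil.disk_usage(path)           # named tuple: (total, used, free) in bytes
    used_pct = usage.used / usage.total * 100
    return round(used_pct, 1), used_pct >= threshold_pct

# Example: alert when the root filesystem is over 80% full.
pct, alert = check_disk_usage("/")
print(f"root usage: {pct}% alert={alert}")
```

In practice a check like this would be scheduled (cron, systemd timer) and wired to the alerting stack the ad mentions.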

Posted 3 weeks ago

Apply

5.0 - 10.0 years

5 - 15 Lacs

Hyderabad

Work from Office

We are looking for a highly skilled Senior Database Developer who can work independently under limited supervision and apply their expertise in database design, development, and maintenance. This role requires a strong background in SQL, relational databases, and data modeling, with a focus on optimizing performance and supporting business intelligence capabilities.

Responsibilities:
- Provide strategic direction and guidance for enterprise data architecture that supports business intelligence capabilities.
- Design and develop conceptual, logical, and physical data models, ensuring optimal performance, scalability, and maintainability.
- Use profiling tools to identify slow or resource-intensive SQL queries and develop solutions to improve performance.
- Focus on performance tuning, especially for complex queries, stored procedures, and indexing strategies.
- Design and implement new features with a focus on scalability and maintainability.
- Document and define data modeling requirements, ensuring that the application's database design aligns with technical and functional specifications.
- Ensure significant database design decisions are communicated and validated, adhering to best practices.
- Ensure the long-term reliability, scalability, and maintainability of systems.
- Collaborate with cross-functional teams to gather requirements and implement solutions.
- Assist in the adoption and application of industry best practices and guidelines.

Qualifications:
- Educational background: Bachelor's degree or higher in Information Systems, Computer Science, or a related field (or equivalent experience).
- Experience: 5+ years of experience as a SQL Server Database Developer or Database Administrator (DBA).
- Technical skills: Strong expertise in SQL and experience writing complex SQL queries. Hands-on experience with SQL-XML programming. Extensive experience with Microsoft SQL Server and database architectures. Familiarity with performance tuning of SQL queries, stored procedures, and indexing strategies. Knowledge of data profiling tools and performance optimization (CPU/memory/I/O concerns). Experience with data modeling and database design. Knowledge of ETL tools like Pentaho is a plus. Programming skills in Java and a willingness to explore new languages or transition into a full-stack engineer role. Experience with Agile methodologies (preferably Scrum) and quick delivery through release management.
- Soft skills: Strong attention to detail and a results-oriented approach. Passionate, intelligent, and a critical thinker with excellent problem-solving skills. Ability to thrive in a fast-paced environment with multiple ongoing projects. Excellent written and verbal communication skills. Collaborative mindset, with the ability to work with all levels of management and stakeholders.
- Desired traits: Self-motivated, technical, results-oriented, and quality-focused. Strong data warehouse and architecture skills. Excellent problem-solving abilities; proactive with a focus on delivering business value. A team player who is detail-oriented, respectful, and thoughtful.
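The performance-tuning and indexing work described above can be sketched with SQLite's EXPLAIN QUERY PLAN; the table and index names are hypothetical, but the same principle (index turns a full scan into a seek) applies to SQL Server:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INT, total REAL)")
con.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                [(i % 100, i * 1.5) for i in range(1000)])

# Without an index, the lookup scans the whole table.
plan_before = con.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42").fetchall()

con.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")

# With the index, the same query becomes an index search.
plan_after = con.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42").fetchall()
```

The `detail` column of the plan changes from a table SCAN to a SEARCH using the new index, which is exactly what profiling tools surface on larger systems.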

Posted 3 weeks ago

Apply

0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Job Description: Are You Ready to Make It Happen at Mondelēz International? Join our Mission to Lead the Future of Snacking. Make It With Pride.

You will provide technical contributions to the data science process. In this role, you are the internally recognized expert in data, building infrastructure and data pipelines/retrieval mechanisms to support our data needs.

How You Will Contribute - You will:
- Operationalize and automate activities for efficiency and timely production of data visuals
- Assist in providing accessibility, retrievability, security and protection of data in an ethical manner
- Search for ways to get new data sources and assess their accuracy
- Build and maintain the transports/data pipelines and retrieve applicable data sets for specific use cases
- Understand data and metadata to support consistency of information retrieval, combination, analysis, pattern recognition and interpretation
- Validate information from multiple sources
- Assess issues that might prevent the organization from making maximum use of its information assets

What You Will Bring - A desire to drive your future and accelerate your career, and the following experience and knowledge:
- Extensive experience in data engineering in a large, complex business with multiple systems such as SAP, internal and external data, etc., and experience setting up, testing and maintaining new systems
- Experience with a wide variety of languages and tools (e.g. scripting languages) to retrieve, merge and combine data
- Ability to simplify complex problems and communicate to a broad audience

In This Role: As a Senior Data Engineer, you will have the opportunity to design and build scalable, secure, and cost-effective cloud-based data solutions. You will develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes, ensuring data quality and validation processes to maintain data accuracy and integrity.
You will ensure efficient data storage and retrieval for optimal performance, and collaborate closely with data teams, product owners, and other stakeholders to stay updated with the latest cloud technologies and best practices.

Role & Responsibilities:
- Design and Build: Develop and implement scalable, secure, and cost-effective cloud-based data solutions.
- Manage Data Pipelines: Develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes.
- Ensure Data Quality: Implement data quality and validation processes to ensure data accuracy and integrity.
- Optimize Data Storage: Ensure efficient data storage and retrieval for optimal performance.
- Collaborate and Innovate: Work closely with data teams and product owners, and stay updated with the latest cloud technologies and best practices.

Technical Requirements:
- Programming: Python, PySpark, Go/Java
- Database: SQL, PL/SQL
- ETL & Integration: DBT, Databricks + DLT, AecorSoft, Talend, Informatica/Pentaho/Ab Initio, Fivetran
- Data Warehousing: SCD, schema types, data marts
- Visualization: Databricks Notebook, Power BI (optional), Tableau (optional), Looker
- GCP Cloud Services: BigQuery, GCS, Cloud Functions, Pub/Sub, Dataflow, Dataproc, Dataplex
- AWS Cloud Services: S3, Redshift, Lambda, Glue, CloudWatch, EMR, SNS, Kinesis
- Azure Cloud Services: Azure Data Lake Gen2, Azure Databricks, Azure Synapse Analytics, Azure Data Factory, Azure Stream Analytics
- Supporting Technologies: Graph databases/Neo4j, Erwin, Collibra, Ataccama DQ, Kafka, Airflow

Soft Skills:
- Problem-Solving: The ability to identify and solve complex data-related challenges.
- Communication: Effective communication skills to collaborate with product owners, analysts, and stakeholders.
- Analytical Thinking: The capacity to analyze data and draw meaningful insights.
- Attention to Detail: Meticulousness in data preparation and pipeline development.
Adaptability: The ability to stay updated with emerging technologies and trends in the data engineering field.

Within-country relocation support is available, and for candidates voluntarily moving internationally some minimal support is offered through our Volunteer International Transfer Policy.

Business Unit Summary: At Mondelēz International, our purpose is to empower people to snack right by offering the right snack, for the right moment, made the right way. That means delivering a broad range of delicious, high-quality snacks that nourish life's moments, made with sustainable ingredients and packaging that consumers can feel good about. We have a rich portfolio of strong brands globally and locally, including many household names such as Oreo, belVita and LU biscuits; Cadbury Dairy Milk, Milka and Toblerone chocolate; Sour Patch Kids candy and Trident gum. We are proud to hold the top position globally in biscuits, chocolate and candy, and the second top position in gum. Our 80,000 makers and bakers are located in more than 80 countries and we sell our products in over 150 countries around the world. Our people are energized for growth and critical to us living our purpose and values. We are a diverse community that can make things happen, and happen fast. Mondelēz International is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation or preference, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law.

Job Type: Regular
Data Science
Analytics & Data Science
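The data-quality and validation step named in the pipeline responsibilities might look like this minimal, library-free sketch; the field names and rules are illustrative assumptions:

```python
def validate_rows(rows, required=("id", "amount")):
    """Split incoming records into clean and rejected sets using simple data-quality
    rules: required fields must be present and non-null, and amounts non-negative."""
    clean, rejected = [], []
    for row in rows:
        has_required = all(row.get(field) is not None for field in required)
        if has_required and row["amount"] >= 0:
            clean.append(row)
        else:
            rejected.append(row)
    return clean, rejected

rows = [
    {"id": 1, "amount": 25.0},
    {"id": 2, "amount": -5.0},    # negative amount: rejected
    {"id": None, "amount": 10.0}, # missing id: rejected
]
clean, rejected = validate_rows(rows)
```

Rejected rows would typically be routed to a quarantine table for inspection rather than silently dropped.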

Posted 3 weeks ago

Apply

6.0 years

0 Lacs

Chennai

On-site

Minimum 6 years of experience as an ETL Pentaho Developer. Excellent data analysis skills. Good experience with the Pentaho BI Suite (Pentaho Data Integration Designer / Kettle, Pentaho Report Designer, Pentaho Design Studio, Pentaho Enterprise Console, Pentaho BI Server, Pentaho Metadata, Pentaho Analysis View, Pentaho Analyzer & Mondrian). Experience in performing data masking/protection using Pentaho Data Integration. Experience in creating ETL pipelines, including extraction, transformation, merging, filtering, joining, cleansing, scheduling, monitoring and troubleshooting using Pentaho. Comfortable working within RDBMS systems, e.g. PostgreSQL, Oracle. Analytical with good problem-solving skills. Excellent communication skills.

About Virtusa: Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
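Data masking of the kind done in Pentaho Data Integration can be sketched in plain Python; the salted-hash approach and function name here are illustrative assumptions, not Pentaho's implementation:

```python
import hashlib

def mask_email(email: str, salt: str = "demo-salt") -> str:
    """Replace the local part of an email with a salted hash, keeping the
    domain intact so aggregate analytics by domain still work."""
    local, _, domain = email.partition("@")
    digest = hashlib.sha256((salt + local).encode()).hexdigest()[:12]
    return f"{digest}@{domain}"

masked = mask_email("alice@example.com")
```

The same input always yields the same mask (deterministic pseudonymization), which preserves join keys across tables while hiding the original value.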

Posted 3 weeks ago

Apply

7.0 - 12.0 years

22 - 37 Lacs

Bengaluru, Mumbai (All Areas)

Hybrid

Hiring: Data Engineering Senior Software Engineer / Tech Lead / Senior Tech Lead
Location: Mumbai & Bengaluru - Hybrid (3 days from office) | Shift: 2 PM to 11 PM IST
Experience: 5 to 12+ years (based on role & grade)

Open Grades/Roles:
Senior Software Engineer: 5-8 years
Tech Lead: 7-10 years
Senior Tech Lead: 10-12+ years

Job Description - Data Engineering Team

Core Responsibilities (common to all levels):
- Design, build and optimize ETL/ELT pipelines using tools like Pentaho, Talend, or similar
- Work on traditional databases (PostgreSQL, MSSQL, Oracle) and MPP/modern systems (Vertica, Redshift, BigQuery, MongoDB)
- Collaborate cross-functionally with BI, Finance, Sales, and Marketing teams to define data needs
- Participate in data modeling (ER/DW/star schema), data quality checks, and data integration
- Implement solutions involving messaging systems (Kafka), REST APIs, and scheduler tools (Airflow, Autosys, Control-M)
- Ensure code versioning and documentation standards are followed (Git/Bitbucket)

Additional responsibilities by grade:
Senior Software Engineer (5-8 yrs): Focus on hands-on development of ETL pipelines, data models, and data inventory. Assist in architecture discussions and POCs. Good to have: Tableau/Cognos, Python/Perl scripting, GCP exposure.
Tech Lead (7-10 yrs): Lead mid-sized data projects and small teams. Decide on ETL strategy (push down/push up) and performance tuning. Strong working knowledge of orchestration tools, resource management, and agile delivery.
Senior Tech Lead (10-12+ yrs): Drive data architecture, infrastructure decisions, and internal framework enhancements. Oversee large-scale data ingestion, profiling, and reconciliation across systems. Mentor junior leads and own stakeholder delivery end-to-end. Advantageous: experience with AdTech/marketing data and the Hadoop ecosystem (Hive, Spark, Sqoop).

Must-Have Skills (all levels):
ETL Tools: Pentaho / Talend / SSIS / Informatica
Databases: PostgreSQL, Oracle, MSSQL, Vertica / Redshift / BigQuery
Orchestration: Airflow / Autosys / Control-M / JAMS
Modeling: Dimensional Modeling, ER Diagrams
Scripting: Python or Perl (preferred)
Agile environment, Git-based version control
Strong communication and documentation
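The star-schema modeling mentioned in the core responsibilities can be illustrated with an in-memory SQLite example; table and column names are hypothetical:

```python
import sqlite3

con = sqlite3.connect(":memory:")
# Dimension table: one row per product, keyed by a surrogate key.
con.execute("CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT)")
# Fact table: one row per sale, referencing the dimension by its key.
con.execute("CREATE TABLE fact_sales (product_key INT, qty INT, revenue REAL)")
con.executemany("INSERT INTO dim_product VALUES (?, ?)", [(1, "Widget"), (2, "Gadget")])
con.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 3, 30.0), (1, 2, 20.0), (2, 1, 15.0)])

# Typical star-schema query: aggregate the fact table grouped by a dimension attribute.
rows = con.execute("""
    SELECT p.name, SUM(f.revenue)
    FROM fact_sales f JOIN dim_product p USING (product_key)
    GROUP BY p.name ORDER BY p.name
""").fetchall()
```

The same shape (one central fact table joined to several small dimensions) is what BI tools and MPP warehouses are optimized for.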

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Job Title: Database Engineer X 8 Positions
Location: Hyderabad, India
Salary: Market Rate/Negotiable

About us: Creditsafe is the most used business data provider in the world, reducing risk and maximizing opportunities for our 110,000 business customers. Our journey began in Oslo, Norway in 1997, where we had a dream of using the then revolutionary internet to deliver instant-access company credit reports to small and medium-sized businesses. Creditsafe realized this dream and changed the market for the better for businesses of all sizes. From there, we opened 15 more offices throughout Europe, the USA and Asia. We provide data on more than 300 million companies and provide customer notifications for billions of changes annually. We are a high-growth company offering the freedom and flexibility of a start-up type culture due to the continuous innovation and new product development performed, coupled with the stability of being a profitable and growing company! With such a large customer base and breadth of data and analytics technology, you will have real opportunities to help companies survive and thrive in challenging times by reducing business risk and choosing trustworthy customers and suppliers.

Summary: This is your opportunity to develop your career with an exciting, fast-paced and rapidly expanding business, one of the leading providers of business intelligence worldwide. As a Database Engineer with excellent database development skills, you will be responsible for developing and maintaining the databases and scripts that power the company's products and websites, which handle large data sets and more than 20 million hits per day. You will work with your team to deliver work on time, in line with the business requirements, and to a high level of quality.
Primary Responsibilities:
· 5+ years' solid commercial experience of Oracle development under a 10g or 11g environment.
· Advanced PL/SQL knowledge required.
· ETL skills; Pentaho would be beneficial.
· Any wider DB experience would be desirable, e.g. Redshift, Aurora DB, DynamoDB, MariaDB, MongoDB, etc.
· Cloud/AWS experience and an interest in learning new technologies.
· Experience in tuning Oracle queries in large databases.
· Good experience in loading and extracting large data sets.
· Experience of working with an Oracle database under a bespoke web development environment.
· Analytical and critical thinking skills; agile problem-solving abilities.
· Detail-oriented, self-motivated, able to work independently with little or no supervision, and committed to the highest standards of quality for the entire release process.
· Excellent written and verbal communication skills.
· Attention to detail.
· Ability to work in a fast-paced / changing environment.
· Ability to thrive in a deadline-driven, stressful project environment.
· 3+ years of software development experience.

Qualifications and Experience:
· Degree in Computer Science or similar.
· Experience with loading data through SSIS.
· Experience working on financial and business intelligence projects or in big data environments.
· A desire to learn new skills and branch into development using a wide range of alternative technologies.

Skills, Knowledge and Abilities:
· Write code for new development requirements as well as provide bug fixing, support and maintenance of existing code.
· Test your code to ensure it functions as per the business requirements, considering the impact of your code on other areas of the solution.
· Provide expert advice on performance tuning within Oracle.
· Perform large-scale imports and extracts of data.
· Assist the business in the collection and documentation of users' requirements where needed; provide estimates and work plans.
· Create and maintain technical documentation.
· Follow all company procedures/standards/processes.
· Contribute to architectural design and development, making technically sound development recommendations.
· Provide support to other staff in the department and act as a mentor to less experienced staff, including through code reviews.
· Work as a team player in an agile environment.
· Build release scripts and plans to facilitate the deployment of your code to testing and production environments.
· Take ownership of any issues that occur within your area to ensure an appropriate solution is found.
· Assess opportunities for application and process improvement and share with team members and/or affected parties.

Company Benefits: Competitive Salary, Work from Home, Pension, Medical Insurance, Cab facility for Women, Dedicated Gaming Area
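The large-scale loading this role calls for is usually done in batches; here is a hedged sketch using an in-memory SQLite stand-in (in practice this would target Oracle with PL/SQL bulk operations or a dedicated loader):

```python
import sqlite3
from itertools import islice

def load_in_batches(con, rows, batch_size=500):
    """Insert rows in fixed-size batches, committing per batch so a large load
    never holds one huge transaction."""
    iterator = iter(rows)
    total = 0
    while True:
        batch = list(islice(iterator, batch_size))
        if not batch:
            break
        with con:  # connection context manager commits (or rolls back) each batch
            con.executemany("INSERT INTO staging (id, val) VALUES (?, ?)", batch)
        total += len(batch)
    return total

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE staging (id INTEGER, val TEXT)")
loaded = load_in_batches(con, ((i, f"row-{i}") for i in range(1234)), batch_size=500)
```

Feeding a generator keeps memory flat regardless of source size, and per-batch commits limit redo/undo pressure, the same concern Oracle DBAs tune for.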

Posted 3 weeks ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Position Overview
Job Title: Senior Engineer - Data SQL Engineer
Corporate Title: AVP
Location: Pune, India

Role Description: As a SQL Engineer, you would be responsible for the design, development and optimization of complex database systems. You would be writing efficient SQL queries and stored procedures, and possess expertise in data modeling, performance optimization and working with large-scale relational databases.

What We'll Offer You: As part of our flexible scheme, here are just some of the benefits that you'll enjoy: Best-in-class leave policy. Gender-neutral parental leave. 100% reimbursement under the childcare assistance benefit (gender neutral). Sponsorship for industry-relevant certifications and education. Employee Assistance Program for you and your family members. Comprehensive hospitalization insurance for you and your dependents. Accident and term life insurance. Complimentary health screening for those 35 yrs. and above.

Your Key Responsibilities: Design, develop and optimize complex SQL queries, stored procedures, views and functions. Work with large datasets to perform data extraction, transformation and loading (ETL). Develop and maintain scalable database schemas and models. Troubleshoot and resolve database-related issues, including performance bottlenecks and data quality concerns. Maintain data security and compliance with data governance policy.

Your Skills and Experience: 10+ years of hands-on experience with SQL in relational databases - SQL Server, Oracle, MySQL, PostgreSQL. Strong working experience with PL/SQL and T-SQL. Strong understanding of data modeling, normalization and relational DB design.

Desirable Skills That Will Help You Excel: Ability to write highly performant, heavily resilient queries in Oracle / PostgreSQL / MSSQL. Working knowledge of database modeling techniques like star schema, fact-dimension models and Data Vault.
Awareness of database tuning methods like AWR reports, indexing, partitioning of data sets, defining tablespace sizes and user roles, etc. Hands-on experience with ETL tools - Pentaho/Informatica/StreamSets. Good experience in performance tuning, query optimization and indexing. Hands-on experience with object storage and scheduling tools. Experience with cloud-based data services like data lakes, data pipelines, and machine learning platforms. Experience in GCP and cloud database migration; hands-on with Postgres.

How We'll Support You: Training and development to help you excel in your career. Coaching and support from experts in your team. A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs.

About Us and Our Teams: Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
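One recurring pattern behind the ETL and stored-procedure work described above is the upsert (merge): insert new rows, update existing ones. A minimal SQLite illustration with hypothetical table names (Oracle and SQL Server express the same idea with MERGE):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")

def upsert_balance(con, acct_id, balance):
    # INSERT ... ON CONFLICT emulates a MERGE: new keys are inserted,
    # existing keys have their balance updated instead of duplicating rows.
    con.execute("""
        INSERT INTO accounts (id, balance) VALUES (?, ?)
        ON CONFLICT(id) DO UPDATE SET balance = excluded.balance
    """, (acct_id, balance))

upsert_balance(con, 1, 100.0)
upsert_balance(con, 1, 250.0)   # second call updates, not duplicates
```

Upserts keep loads idempotent, so a re-run after a partial failure does not create duplicate rows.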

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

Skills: AWS/Azure/GCP, Linux, shell scripting, IaC, Docker, Kubernetes, MongoDB, MySQL, Solr, Jenkins, GitHub, automation, TCP/HTTP network protocols.

A day in the life of an Infosys Equinox employee: As part of the Infosys Equinox delivery team, your primary role would be to ensure effective design, development, validation and support activities, so that our clients are satisfied with the high levels of service in the technology domain. You will gather requirements and specifications to understand client needs in detail and translate them into system requirements. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. Ensure high availability of the infrastructure, administration, and overall support.

Required skills:
- Strong analytical and troubleshooting/problem-solving skills: root cause identification and proactive service improvement, staying up to date on technologies and best practices.
- Team and task management with tools like JIRA, adhering to SLAs.
- Clear understanding of HTTP / network protocol concepts, design and operations: TCP dump, cookies, sessions, headers, client-server architecture.
- More than 5 years of working experience with the AWS/Azure/GCP cloud platforms.
- Core strength in Linux and Azure infrastructure provisioning, including VNet, Subnet, Gateway, VM, security groups, MySQL, Blob Storage, Azure Cache, AKS clusters, etc.
- Expertise in automating Infrastructure as Code using Terraform, Packer, Ansible, shell scripting and Azure DevOps.
- Expertise in patch management and APM tools such as AppDynamics and Instana for monitoring and alerting.
- Knowledge of technologies including Apache Solr, MySQL, MongoDB, ZooKeeper, RabbitMQ, Pentaho, etc.
- Knowledge of cloud platforms such as AWS and GCP is an added advantage.
- Ability to identify and automate recurring tasks for better productivity.
- Ability to understand and implement industry-standard security solutions.
- Experience implementing auto scaling, DR, HA and multi-region deployments with best practices is an added advantage.
- Ability to work under pressure, managing expectations from various key stakeholders.

Additional requirements: Knowledge of more than one technology. Basics of architecture and design fundamentals. Knowledge of testing tools. Knowledge of agile methodologies. Understanding of project life cycle activities on development and maintenance projects. Understanding of one or more estimation methodologies; knowledge of quality processes. Basics of the business domain to understand the business requirements. Analytical abilities, strong technical skills, good communication skills. Good understanding of the technology and domain. Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles and modelling methods. Awareness of the latest technologies and trends. Excellent problem-solving, analytical and debugging skills.
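The HTTP header and session concepts the listing asks for can be made concrete with a tiny parser sketch; this is purely illustrative (production code would use an HTTP library, not hand-rolled parsing):

```python
def parse_headers(raw: str) -> dict:
    """Parse a raw HTTP header block into a dict with lower-cased names,
    since HTTP header names are case-insensitive."""
    headers = {}
    for line in raw.strip().splitlines():
        if not line.strip():
            continue
        # Split on the first colon only; values may themselves contain colons.
        name, _, value = line.partition(":")
        headers[name.strip().lower()] = value.strip()
    return headers

raw = "Content-Type: text/html\r\nSet-Cookie: sid=abc123; Path=/\r\nCache-Control: no-store"
headers = parse_headers(raw)
```

Reading header blocks like this is essentially what one does manually when inspecting a tcpdump capture of client-server traffic.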

Posted 3 weeks ago

Apply

7.0 - 10.0 years

25 - 30 Lacs

Chennai

Work from Office

Responsible for planning and designing new software and web applications. Analyzes, tests and assists with the integration of new applications. Oversees the documentation of all development activity. Trains non-technical personnel. Assists with tracking performance metrics. Integrates knowledge of business and functional priorities. Acts as a key contributor in a complex and crucial environment. May lead teams or projects and shares expertise.

Job Description

Key Skills:
- Advanced SQL (MySQL, Presto, Oracle, etc.)
- Data Modeling (normalization and denormalization)
- ETL Tools (Talend, Pentaho, Informatica and creation of custom ETL scripts)
- Big Data Technologies (Hadoop, Spark, Hive, Kafka, etc.)
- Data Warehousing (AWS, BigQuery, etc.)
- Reporting (Tableau, Power BI)

Core Responsibilities: This data-focused role would be expected to leverage these skills to design and implement robust data solutions, and to play a key role in mentoring junior team members and ensuring the quality and efficiency of data processes. Skills in data visualization tools like Tableau and Power BI; knowledge of data quality principles is good to have. Collaborates with project stakeholders to identify product and technical requirements. Conducts analysis to determine integration needs. Designs new software and web applications, supports applications under development and customizes current applications. Develops software update processes for existing applications. Assists in the roll-out of software releases. Trains junior Software Development Engineers on internally developed software applications. Oversees the researching, writing and editing of documentation and technical requirements, including evaluation plans, test results, technical manuals and formal recommendations and reports. Keeps current with technological developments within the industry. Monitors and evaluates competitive applications and products. Reviews literature, patents and current practices relevant to the solution of assigned projects.
Provides technical leadership throughout the design process and guidance with regard to practices, procedures and techniques. Serves as a guide and mentor for junior-level Software Development Engineers. Assists in tracking and evaluating performance metrics. Ensures the team delivers software on time, to specification and within budget. Works with the Quality Assurance team to determine if applications fit specification and technical requirements. Displays expertise in knowledge of engineering methodologies, concepts and skills and their application in the area of the specified engineering specialty. Displays expertise in process design and redesign skills. Presents and defends architectural, design and technical choices to internal audiences. Consistent exercise of independent judgment and discretion in matters of significance. Regular, consistent and punctual attendance. Must be able to work nights and weekends, variable schedule(s) and overtime as necessary. Other duties and responsibilities as assigned.

Employees at all levels are expected to:
- Understand our Operating Principles; make them the guidelines for how you do your job.
- Own the customer experience: think and act in ways that put our customers first, give them seamless digital options at every touchpoint, and make them promoters of our products and services.
- Know your stuff: be enthusiastic learners, users and advocates of our game-changing technology, products and services, especially our digital tools and experiences.
- Win as a team: make big things happen by working together and being open to new ideas.
- Be an active part of the Net Promoter System (a way of working that brings more employee and customer feedback into the company) by joining huddles, making call-backs and helping us elevate opportunities to do better for our customers.
- Drive results and growth.
- Respect and promote inclusion & diversity.
- Do what's right for each other, our customers, investors and our communities.
Disclaimer: This information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities and qualifications.

We believe that benefits should connect you to the support you need when it matters most, and should help you care for those who matter most. That's why we provide an array of options, expert guidance and always-on tools that are personalized to meet the needs of your reality, to help support you physically, financially and emotionally through the big milestones and in your everyday life. Please visit the benefits summary on our careers site for more details.

Education: Bachelor's degree. While possessing the stated degree is preferred, Comcast also may consider applicants who hold some combination of coursework and experience, or who have extensive related professional experience.
Certifications: (if applicable)
Relevant Work Experience: 7-10 Years

Comcast is proud to be an equal opportunity workplace. We will consider all qualified applicants for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, veteran status, genetic information, or any other basis protected by applicable law.
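The normalization/denormalization trade-off named under Key Skills can be shown in a few lines; the data and field names are invented for illustration:

```python
# Normalized form: customer names live in one place, orders reference them by id.
customers = {1: "Acme", 2: "Globex"}
orders = [
    {"order_id": 10, "customer_id": 1, "total": 99.0},
    {"order_id": 11, "customer_id": 2, "total": 45.0},
]

# Denormalization: copy the customer name into each order row so read queries
# avoid a join, trading extra storage and update cost for faster reads.
denormalized = [dict(order, customer_name=customers[order["customer_id"]])
                for order in orders]
```

Warehouses routinely denormalize this way for reporting speed, while the system of record stays normalized so a name change happens in exactly one place.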

Posted 3 weeks ago

Apply

3.0 - 5.0 years

8 - 10 Lacs

Pune

On-site

Role Summary: Tableau Developer Lead and drive development in the BI domain using the Tableau eco-system with deep technical and BI ecosystem knowledge. The resource will be responsible for the dashboard design, development, and delivery of BI services using the Tableau eco-system. Key functions & responsibilities: Communication and interaction with the Project Manager to understand the requirements Dashboard design, development and deployment using the Tableau eco-system Ensure delivery within the given time frame while maintaining quality Stay up to date with current tech and bring relevant ideas to the table Proactively work with the Management team to identify and resolve issues Perform other related duties as assigned or advised He/she should be a leader that sets the standard and expectations through example in his/her conduct, work ethic, integrity and character Contribute to dashboard design, R&D and project delivery using Tableau Candidate's Profile Academics: Bachelor's degree, preferably in Computer Science; a Master's degree would be an added advantage. Experience: Overall 3 - 5 years of experience in DWBI development projects, having worked on BI and visualization technologies (Tableau, QlikView) for at least 2 years. At least 2 years of experience covering the Tableau implementation lifecycle, including hands-on development/programming, managing security, data modelling, data blending, etc. Technology & Skills: Hands-on expertise in Tableau administration and maintenance Strong working knowledge and development experience with Tableau Server and Desktop Strong knowledge of SQL, PL/SQL and data modelling Knowledge of databases like Microsoft SQL Server, Oracle, etc. Exposure to alternate visualization technologies like QlikView, Spotfire, Pentaho etc. 
Good communication and analytical skills with excellent creative and conceptual thinking abilities Superior organizational skills and attention to detail/level of quality; strong communication skills, both verbal and written Location: Pune Job Types: Full-time, Permanent Pay: ₹800,000.00 - ₹1,000,000.00 per year Benefits: Health insurance Provident Fund Schedule: Day shift Application Question(s): What is your current Annual CTC in INR Lacs? What is your notice period in number of days? Will you be able to relocate to other cities if needed? Experience: Tableau: 3 years (Required) Work Location: In person
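BI developers in roles like this spend much of their time writing the aggregate SQL that feeds dashboard worksheets. Purely as an illustration (the table and column names are invented, and sqlite3 stands in for the Oracle/SQL Server databases the posting names), a dashboard-style aggregation looks like:

```python
import sqlite3

# Hypothetical sales table standing in for the kind of source a
# Tableau dashboard extract might query; all names are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("North", 120.0), ("North", 80.0), ("South", 50.0)])

# The GROUP BY aggregation a dashboard bar chart would visualize.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total FROM sales "
    "GROUP BY region ORDER BY total DESC"
).fetchall()
print(rows)  # [('North', 200.0), ('South', 50.0)]
```

In Tableau terms, `region` would land on a dimension shelf and `SUM(amount)` on a measure shelf; pushing the aggregation into the database like this is what the "Strong knowledge of SQL" requirement is getting at.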

Posted 3 weeks ago

Apply

3.0 - 5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job description: Job Description Role Purpose The purpose of the role is to provide effective technical support to the process and actively resolve client issues directly or through timely escalation to meet process SLAs. Do Support the process by managing transactions as per required quality standards Field all incoming help requests from clients via telephone and/or email in a courteous manner Document all pertinent end user identification information, including name, department, contact information and nature of the problem or issue Update own availability in the RAVE system to ensure productivity of the process Record, track, and document all queries received, problem-solving steps taken and total successful and unsuccessful resolutions Follow standard processes and procedures to resolve all client queries Resolve client queries as per the SLAs defined in the contract Access and maintain internal knowledge bases, resources and frequently asked questions to aid in and provide effective problem resolution to clients Identify and learn appropriate product details to facilitate better client interaction and troubleshooting Document and analyze call logs to spot the most frequent trends and prevent future problems Maintain and update self-help documents for customers to speed up resolution time Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution Ensure all product information and disclosures are given to clients before and after the call/email requests Avoid legal challenges by complying with service agreements Deliver excellent customer service through effective diagnosis and troubleshooting of client queries Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions Assist clients with navigating around product menus and facilitate better understanding of product features Troubleshoot all client queries in a user-friendly, courteous and 
professional manner Maintain logs and records of all customer queries as per the standard procedures and guidelines Accurately process and record all incoming calls and emails using the designated tracking software Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business Organize ideas and effectively communicate oral messages appropriate to listeners and situations Follow up and make scheduled call-backs to customers to record feedback and ensure compliance to contract/SLAs Build capability to ensure operational excellence and maintain superior customer service levels of the existing account/client Undertake product trainings to stay current with product features, changes and updates Enroll in product-specific and any other trainings per client requirements/recommendations Partner with team leaders to brainstorm and identify training themes and learning issues to better serve the client Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Deliver (Performance Parameter - Measure):
1. Process - No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback
2. Self-Management - Productivity, efficiency, absenteeism, training hours, number of technical trainings completed

Mandatory Skills: Pentaho DI (Kettle). Experience: 3-5 years. Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. 
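Pentaho DI (Kettle), the mandatory skill here, models an ETL job as a transformation that streams rows through a chain of steps (input, filter, calculator, output), built graphically in Spoon rather than coded. Purely to illustrate that row-pipeline idea (the step names in the comments are the rough Kettle analogues; the code itself is plain Python, not Kettle):

```python
# Sketch of the Kettle row-pipeline idea: each step consumes a stream
# of rows and emits a stream of rows. Field names are invented.
def read_rows():                        # "Table input" step
    yield {"id": 1, "qty": 3, "price": 10.0}
    yield {"id": 2, "qty": 0, "price": 5.0}

def filter_rows(rows):                  # "Filter rows" step
    return (r for r in rows if r["qty"] > 0)

def add_total(rows):                    # "Calculator" step
    for r in rows:
        yield {**r, "total": r["qty"] * r["price"]}

# "Table output" step: materialize the stream at the end.
output = list(add_total(filter_rows(read_rows())))
print(output)  # [{'id': 1, 'qty': 3, 'price': 10.0, 'total': 30.0}]
```

The generator chaining mirrors how Kettle moves rows between steps without buffering the whole dataset, which is why its transformations scale to large extracts.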
Applications from people with disabilities are explicitly welcome.

Posted 3 weeks ago

Apply

0.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Ready to shape the future of work? At Genpact, we don't just adapt to change - we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions - we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Inviting applications for the role of Senior Principal Consultant - Databricks Architect! In this role, the Databricks Architect is responsible for providing technical direction and leading a group of one or more developers to address a goal. Responsibilities . Architect and design solutions to meet functional and non-functional requirements. . Create and review architecture and solution design artifacts. . Evangelize re-use through the implementation of shared assets. . Enforce adherence to architectural standards/principles, global product-specific guidelines, usability design standards, etc. . Proactively guide engineering methodologies, standards, and leading practices. . Guidance of engineering staff and reviews of as-built configurations during the construction phase. . 
Provide insight and direction on roles and responsibilities required for solution operations. . Identify, communicate and mitigate Risks, Assumptions, Issues, and Decisions throughout the full lifecycle. . Considers the art of the possible, compares various architectural options based on feasibility and impact, and proposes actionable plans. . Demonstrate strong analytical and technical problem-solving skills. . Ability to analyze and operate at various levels of abstraction. . Ability to balance what is strategically right with what is practically realistic. . Growing the Data Engineering business by helping customers identify opportunities to deliver improved business outcomes, designing and driving the implementation of those solutions. . Growing and retaining the Data Engineering team with appropriate skills and experience to deliver high quality services to our customers. . Supporting and developing our people, including learning & development, certification and career development plans . Providing technical governance and oversight for solution design and implementation . Should have technical foresight to understand new technology and advancements. . Leading the team in the definition of best practices and repeatable methodologies in Cloud Data Engineering, including Data Storage, ETL, Data Integration & Migration, Data Warehousing and Data Governance . Should have technical experience in Azure, AWS and GCP Cloud Data Engineering services and solutions. . Contributing to Sales & Pre-sales activities including proposals, pursuits, demonstrations, and proof of concept initiatives . Evangelizing the Data Engineering service offerings to both internal and external stakeholders . Development of whitepapers, blogs, webinars and other thought leadership material . Development of Go-to-Market and Service Offering definitions for Data Engineering . Working with Learning & Development teams to establish appropriate learning and certification paths for their domain. . 
Expand the business within existing accounts and help clients, by building and sustaining strategic executive relationships, doubling up as their trusted business technology advisor. . Position differentiated and custom solutions to clients, based on the market trends, specific needs of the clients and the supporting business cases. . Build new Data capabilities, solutions, assets, accelerators, and team competencies. . Manage multiple opportunities through the entire business cycle simultaneously, working with cross-functional teams as necessary. Qualifications we seek in you! Minimum qualifications . Excellent technical architecture skills, enabling the creation of future-proof, complex global solutions. . Excellent interpersonal communication and organizational skills are required to operate as a leading member of global, distributed teams that deliver quality services and solutions. . Ability to rapidly gain knowledge of the organizational structure of the firm to facilitate work with groups outside of the immediate technical team. . Knowledge and experience in IT methodologies and life cycles that will be used. . Familiar with solution implementation/management, service/operations management, etc. . Leadership skills to inspire and persuade others. . Maintains close awareness of new and emerging technologies and their potential application for service offerings and products. . Bachelor's Degree or equivalency (CS, CE, CIS, IS, MIS, or engineering discipline) or equivalent work experience . Experience in a solution architecture role using service and hosting solutions such as private/public cloud IaaS, PaaS, and SaaS platforms. . Experience in architecting and designing technical solutions for cloud-centric solutions based on industry standards using IaaS, PaaS, and SaaS capabilities. . Must have strong hands-on experience with various cloud services like ADF/Lambda, ADLS/S3, Security, Monitoring, Governance . 
Must have experience designing platforms on Databricks. . Hands-on experience designing and building Databricks-based solutions on any cloud platform. . Hands-on experience designing and building solutions powered by DBT models and integrating them with Databricks. . Must be very good at designing end-to-end solutions on a cloud platform. . Must have good knowledge of data engineering concepts and related cloud services. . Must have good experience in Python and Spark. . Must have good experience in setting up development best practices. . Intermediate-level knowledge of data modelling is required. . Good to have knowledge of Docker and Kubernetes. . Experience with claims-based authentication (SAML/OAuth/OIDC), MFA, RBAC, SSO etc. . Knowledge of cloud security controls including tenant isolation, encryption at rest, encryption in transit, key management, vulnerability assessments, application firewalls, SIEM, etc. . Experience building and supporting mission-critical technology components with DR capabilities. . Experience with multi-tier system and service design and development for large enterprises . Extensive, real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures with a focus on cloud technologies. . Exposure to infrastructure and application security technologies and approaches . Familiarity with requirements gathering techniques. Preferred qualifications . Must have designed the E2E architecture of a unified data platform covering all aspects of the data lifecycle: ingestion, transformation, serving and consumption. . Must have excellent coding skills in either Python or Scala, preferably Python. . Must have experience in the Data Engineering domain. . Must have designed and implemented at least 2-3 projects end-to-end in Databricks. . 
Must have experience with Databricks and its components: Delta Lake, dbConnect, DB API 2.0, SQL Endpoint (Photon engine), Unity Catalog, Databricks Workflows orchestration, security management, platform governance and data security. . Must have knowledge of new features available in Databricks and their implications, along with possible use-cases. . Must have followed various architectural principles to select the design best suited to the problem. . Must be well versed with the Databricks Lakehouse concept and its implementation in enterprise environments. . Must have a strong understanding of data warehousing and the governance and security standards around Databricks. . Must have knowledge of cluster optimization and its integration with various cloud services. . Must have a good understanding of creating complex data pipelines. . Must be strong in SQL and Spark SQL. . Must have strong performance optimization skills to improve efficiency and reduce cost. . Must have worked on designing both batch and streaming data pipelines. . Must have extensive knowledge of the Spark and Hive data processing frameworks. . Must have worked on any cloud (Azure, AWS, GCP) and most common services like ADLS/S3, ADF/Lambda, CosmosDB/DynamoDB, ASB/SQS, cloud databases. . Must be strong in writing unit and integration tests. . Must have strong communication skills and have worked with cross-platform teams. . Must have a great attitude towards learning new skills and upskilling existing ones. . Responsible for setting best practices around Databricks CI/CD. . Must understand composable architecture to take full advantage of Databricks capabilities. . Good to have REST API knowledge. . Good to have an understanding of cost distribution. . Good to have worked on a migration project to build a unified data platform. . Good to have knowledge of DBT. . Experience around DevSecOps including Docker and Kubernetes. . 
Software development full lifecycle methodologies, patterns, frameworks, libraries, and tools . Knowledge of programming and scripting languages such as JavaScript, PowerShell, Bash, SQL, Java, Python, etc. . Experience with data ingestion technologies such as Azure Data Factory, SSIS, Pentaho, Alteryx . Experience with visualization tools such as Tableau, Power BI . Experience with machine learning tools such as MLflow, Databricks AI/ML, Azure ML, AWS SageMaker, etc. . Experience in distilling complex technical challenges to actionable decisions for stakeholders and guiding project teams by building consensus and mediating compromises when necessary. . Experience coordinating the intersection of complex system dependencies and interactions . Experience in solution delivery using common methodologies, especially SAFe Agile but also Waterfall, Iterative, etc. Demonstrated knowledge of relevant industry trends and standards Why join Genpact . Be a transformation leader - Work at the cutting edge of AI, automation, and digital innovation . Make an impact - Drive change for global enterprises and solve business challenges that matter . Accelerate your career - Get hands-on experience, mentorship, and continuous learning opportunities . Work with the best - Join 140,000+ bold thinkers and problem-solvers who push boundaries every day . Thrive in a values-driven culture - Our courage, curiosity, and incisiveness - built on a foundation of integrity and inclusion - allow your ideas to fuel progress Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together. 
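The Lakehouse design this role centres on is usually described via the medallion (bronze/silver/gold) layering. A toy sketch of that layering in plain Python, with dicts standing in for Delta tables (on Databricks each stage would be a Spark DataFrame written to a Delta table, typically as a Workflows task):

```python
# Conceptual medallion layering: raw -> cleaned -> aggregated.
# All records and field names are invented for illustration.
bronze = [  # bronze: raw ingested records, kept as-received
    {"user": "a", "amount": "10"},
    {"user": "a", "amount": "5"},
    {"user": None, "amount": "7"},   # malformed record
]

# silver: cleaned and typed (drop malformed rows, cast fields)
silver = [{"user": r["user"], "amount": float(r["amount"])}
          for r in bronze if r["user"] is not None]

# gold: business-level aggregate served to BI consumers
gold = {}
for r in silver:
    gold[r["user"]] = gold.get(r["user"], 0.0) + r["amount"]

print(gold)  # {'a': 15.0}
```

Keeping the raw bronze layer immutable is what lets a Lakehouse replay downstream layers after a logic fix, which is also why the posting pairs this with batch/streaming pipeline design and CI/CD.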
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Hiring Now : Full Stack Developer (Java + Angular) Locations : Gandhinagar | Ahmedabad | Pune (Onsite) Join : Immediate Only Required Skills (5+ Years) : Backend Java 8+, Spring Boot, Spring MVC, Spring Webservices Hibernate, Jasper Reports, Oracle SQL, PL/SQL Pentaho Kettle (ETL), Linux scripting basics Frontend Angular 8+ / React 16+, TypeScript, JavaScript Angular Material, Bootstrap 4, HTML5, CSS3, SCSS Other Git, Design Patterns Bachelor's degree in Computer Science or related field Mandatory Requirements Passport (or must apply via Tatkal) Immediate joiners only Full Stack Developer (Java + Angular) - 3 Month Contract Are you a seasoned Full Stack Developer with a passion for Java and Angular? We're looking for an immediate joiner to boost our team on a 3-month contract. This is a fantastic opportunity to make an immediate impact in a fast-paced environment. This is an onsite position at one of our tech hubs in Gandhinagar, Ahmedabad, or Pune. What You'll Bring 5+ years of hands-on experience in full-stack development. A Bachelor's degree in Computer Science or a related technical field. Mandatory : You'll need a valid Passport (or be ready to apply via Tatkal for urgent processing). Mandatory : Ability to join immediately. Key Responsibility Areas (KRAs) Develop and implement robust and scalable backend services using Java 8+, Spring Boot, Spring MVC, and Spring Webservices. Build and maintain responsive and intuitive user interfaces using Angular 8+/React 16+, TypeScript, and modern front-end technologies (Angular Material, Bootstrap 4, HTML5, CSS3, SCSS). Design and optimize database schemas and write efficient SQL/PL/SQL queries for Oracle databases, ensuring data integrity and performance. Integrate and manage data pipelines using tools like Pentaho Kettle (ETL) and develop basic Linux scripts for automation. 
Collaborate effectively with cross-functional teams, including product managers, designers, and other developers, to define, design, and ship new features. Ensure code quality through rigorous unit testing, integration testing, and participation in code reviews. Utilize Git for version control and adhere to best practices in branching, merging, and pull requests. Contribute to technical design discussions and apply design patterns to build maintainable and extensible software solutions. Troubleshoot, debug, and resolve issues across the full stack, from front-end UI glitches to backend service errors. Prepare and maintain comprehensive technical documentation for code, configurations, and processes (ref:hirist.tech)

Posted 4 weeks ago

Apply

2.0 - 5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Comcast brings together the best in media and technology. We drive innovation to create the world's best entertainment and online experiences. As a Fortune 50 leader, we set the pace in a variety of innovative and fascinating businesses and create career opportunities across a wide range of locations and disciplines. We are at the forefront of change and move at an amazing pace, thanks to our remarkable people, who bring cutting-edge products and services to life for millions of customers every day. If you share in our passion for teamwork, our vision to revolutionize industries and our goal to lead the future in media and technology, we want you to fast-forward your career at Comcast. Job Summary Responsible for planning and designing new software and web applications. Edits new and existing applications. Implements, tests and debugs defined software components. Documents all development activity. Works with moderate guidance in own area of knowledge. Job Description Key Skills: Advanced SQL (MySQL, Presto, Oracle etc.) Data Modeling (Normalization and Denormalization) ETL Tools (Talend, Pentaho, Informatica and creation of custom ETL scripts) Big Data Technologies (Hadoop, Spark, Hive, Kafka etc.) Data Warehousing (AWS, BigQuery etc.) Reporting (Tableau, Power BI) Core Responsibilities This data-focused role is expected to leverage these skills to design and implement robust data solutions, and to play a key role in mentoring junior team members and ensuring the quality and efficiency of data processes. Skills in data visualization tools like Tableau and Power BI. Good to have: Data Quality principles Analyzes and determines integration needs. Evaluates and plans software designs, test results and technical manuals. Reviews literature, patents and current practices relevant to the solution of assigned projects. Programs new software and web applications and supports new applications under development and the customization of current applications. 
Edits and reviews technical requirements documentation. Works with the Quality Assurance team to determine if applications fit specification and technical requirements. Displays knowledge of engineering methodologies, concepts, skills and their application in the area of specified engineering specialty. Displays knowledge of, and ability to apply, process design and redesign skills. Displays in-depth knowledge of, and ability to apply, project management skills. Consistent exercise of independent judgment and discretion in matters of significance. Regular, consistent and punctual attendance. Must be able to work nights and weekends, variable schedule(s) and overtime as necessary. Other duties and responsibilities as assigned. Employees At All Levels Are Expected To Understand our Operating Principles; make them the guidelines for how you do your job. Own the customer experience - think and act in ways that put our customers first, give them seamless digital options at every touchpoint, and make them promoters of our products and services. Know your stuff - be enthusiastic learners, users and advocates of our game-changing technology, products and services, especially our digital tools and experiences. Win as a team - make big things happen by working together and being open to new ideas. Be an active part of the Net Promoter System - a way of working that brings more employee and customer feedback into the company - by joining huddles, making call backs and helping us elevate opportunities to do better for our customers. Drive results and growth. Respect and promote inclusion & diversity. Do what's right for each other, our customers, investors and our communities. Disclaimer This information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities and qualifications. 
Comcast is proud to be an equal opportunity workplace. We will consider all qualified applicants for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, veteran status, genetic information, or any other basis protected by applicable law. Base pay is one part of the Total Rewards that Comcast provides to compensate and recognize employees for their work. Most sales positions are eligible for a Commission under the terms of an applicable plan, while most non-sales positions are eligible for a Bonus. Additionally, Comcast provides best-in-class Benefits to eligible employees. We believe that benefits should connect you to the support you need when it matters most, and should help you care for those who matter most. That’s why we provide an array of options, expert guidance and always-on tools, that are personalized to meet the needs of your reality – to help support you physically, financially and emotionally through the big milestones and in your everyday life. Please visit the compensation and benefits summary on our careers site for more details. Education Bachelor's Degree While possessing the stated degree is preferred, Comcast also may consider applicants who hold some combination of coursework and experience, or who have extensive related professional experience. Relevant Work Experience 2-5 Years
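The normalization/denormalization trade-off listed under Key Skills in the posting above can be shown in miniature. Using sqlite3 with invented tables (the denormalized wide table is the warehouse-style layout the role would build via ETL):

```python
import sqlite3

# Normalized model: customers and orders in separate tables,
# joined at query time. Table and column names are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
INSERT INTO customers VALUES (1, 'Asha');
INSERT INTO orders VALUES (10, 1, 99.0);
""")
joined = conn.execute("""
    SELECT c.name, o.total FROM orders o
    JOIN customers c ON c.id = o.customer_id
""").fetchall()

# Denormalized model: the same fact pre-joined into one wide table,
# trading redundancy for cheaper reads (common in data warehousing).
conn.execute("CREATE TABLE orders_wide (order_id INTEGER, name TEXT, total REAL)")
conn.execute("INSERT INTO orders_wide SELECT o.id, c.name, o.total "
             "FROM orders o JOIN customers c ON c.id = o.customer_id")
wide = conn.execute("SELECT name, total FROM orders_wide").fetchall()

print(joined)  # [('Asha', 99.0)]
```

The normalized form keeps each fact in one place (good for transactional writes); the wide form avoids the join at read time (good for reporting), which is why ETL jobs like the ones this role describes routinely flatten normalized sources into denormalized warehouse tables.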

Posted 4 weeks ago

Apply

3.0 - 8.0 years

12 - 22 Lacs

Hyderabad

Work from Office

Advantum Health Pvt. Ltd, a US healthcare MNC, is looking for a DevOps Analyst. Advantum Health Private Limited is a leading RCM and Medical Coding company, operating since 2013. Our Head Office is located in Hyderabad, with branch operations in Chennai and Noida. We are proud to be a Great Place to Work certified organization and a recipient of the Telangana Best Employer Award. Our office spans 35,000 sq. ft. in Cyber Gateway, Hitech City, Hyderabad Job Title: DevOps Analyst Location: Hitech City, Hyderabad, India Work from office Ph: 9177078628, 7382307530, 9059683624 Address: Advantum Health Private Limited, Cyber gateway, Block C, 4th floor Hitech City, Hyderabad. Location: https://www.google.com/maps/place/Advantum+Health+India/@17.4469674,78.3747158,289m/data=!3m2!1e3!5s0x3bcb93e01f1bbe71:0x694a7f60f2062a1!4m6!3m5!1s0x3bcb930059ea66d1:0x5f2dcd85862cf8be!8m2!3d17.4467126!4d78.3767566!16s%2Fg%2F11whflplxg?entry=ttu&g_ep=EgoyMDI1MDMxNi4wIKXMDSoASAFQAw%3D%3D Job Summary: We are seeking a skilled and motivated Software Engineer to join our dynamic team in Hyderabad. The ideal candidate, with 3 to 6 years of development experience, will design, develop, test, and maintain high-quality software solutions while collaborating with cross-functional teams. The role involves tackling complex technical challenges and contributing to innovative projects. Key Responsibilities: Design, develop, and maintain scalable software applications using modern programming languages and frameworks. Collaborate with stakeholders to understand requirements and translate them into technical specifications. Write clean, efficient, and well-documented code following best practices. Participate in code reviews, debugging, and optimization processes. Work with cross-functional teams, including product managers, QA engineers, and DevOps, to ensure seamless delivery of software solutions. Stay updated with emerging technologies and trends to suggest innovative solutions. 
Ensure adherence to project timelines and deliverables while maintaining quality standards. Will have major development responsibilities with RCM software Required Skills and Qualifications: Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field. Proficiency in programming languages such as C#, Python, or similar. Experience with web development frameworks (e.g., Pentaho, Angular) Intermediate knowledge of databases (SQL), version control systems (e.g., Git), and cloud platforms (Azure, or similar). Familiarity with Agile development methodologies. Excellent problem-solving and analytical skills. Strong communication and teamwork abilities. Follow us on LinkedIn, Facebook, Instagram, YouTube and Threads for all updates: Advantum Health LinkedIn Page: https://www.linkedin.com/showcase/advantum-health-india/ Advantum Health Facebook Page: https://www.facebook.com/profile.php?id=61564435551477 Advantum Health Instagram Page: https://www.instagram.com/reel/DCXISlIO2os/?igsh=dHd3czVtc3Fyb2hk Advantum Health India YouTube link: https://youtube.com/@advantumhealthindia-rcmandcodi?si=265M1T2IF0gF-oF1 Advantum Health Threads link: https://www.threads.net/@advantum.health.india HR Dept, Advantum Health Pvt Ltd Cybergateway, Block C, Hitech City, Hyderabad Ph: 9177078628, 7382307530, 9059683624

Posted 4 weeks ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

We’re looking for a skilled Fullstack Developer with a strong foundation in Java, Angular, and modern backend/frontend frameworks. What We're Looking For: ✅ Bachelor's degree in Computer Science ✅ 5+ years of hands-on development experience Must-Have Technical Skills: • Java 8+, JavaScript, TypeScript • Spring Boot, Spring MVC, Webservices, Hibernate, JasperReports • Angular 8+, React 16+ • Angular Material, Bootstrap 4, HTML5, CSS3, SCSS • Oracle SQL, PL/SQL • Pentaho Kettle • Basic Linux scripting and troubleshooting • Git & design patterns understanding

Posted 4 weeks ago

Apply

2.0 - 5.0 years

0 Lacs

Chennai

On-site

Comcast brings together the best in media and technology. We drive innovation to create the world's best entertainment and online experiences. As a Fortune 50 leader, we set the pace in a variety of innovative and fascinating businesses and create career opportunities across a wide range of locations and disciplines. We are at the forefront of change and move at an amazing pace, thanks to our remarkable people, who bring cutting-edge products and services to life for millions of customers every day. If you share in our passion for teamwork, our vision to revolutionize industries and our goal to lead the future in media and technology, we want you to fast-forward your career at Comcast. Job Summary Responsible for planning and designing new software and web applications. Edits new and existing applications. Implements, tests and debugs defined software components. Documents all development activity. Works with moderate guidance in own area of knowledge. Job Description Key Skills: Advanced SQL (MySQL, Presto, Oracle etc.) Data Modeling (Normalization and Denormalization) ETL Tools (Talend, Pentaho, Informatica and creation of custom ETL scripts) Big Data Technologies (Hadoop, Spark, Hive, Kafka etc.) Data Warehousing (AWS, BigQuery etc.) Reporting (Tableau, Power BI) Core Responsibilities: This data-focused role is expected to leverage these skills to design and implement robust data solutions, and to play a key role in mentoring junior team members and ensuring the quality and efficiency of data processes. Skills in data visualization tools like Tableau and Power BI. Good to have: Data Quality principles Analyzes and determines integration needs. Evaluates and plans software designs, test results and technical manuals. Reviews literature, patents and current practices relevant to the solution of assigned projects. Programs new software and web applications and supports new applications under development and the customization of current applications. 
Edits and reviews technical requirements documentation. Works with the Quality Assurance team to determine whether applications meet specifications and technical requirements. Displays knowledge of engineering methodologies, concepts, skills and their application in the area of the specified engineering specialty. Displays knowledge of, and the ability to apply, process design and redesign skills. Displays in-depth knowledge of, and the ability to apply, project management skills. Consistent exercise of independent judgment and discretion in matters of significance. Regular, consistent and punctual attendance. Must be able to work nights and weekends, variable schedule(s) and overtime as necessary. Other duties and responsibilities as assigned.

Employees at all levels are expected to:
Understand our Operating Principles; make them the guidelines for how you do your job.
Own the customer experience - think and act in ways that put our customers first, give them seamless digital options at every touchpoint, and make them promoters of our products and services.
Know your stuff - be enthusiastic learners, users and advocates of our game-changing technology, products and services, especially our digital tools and experiences.
Win as a team - make big things happen by working together and being open to new ideas.
Be an active part of the Net Promoter System - a way of working that brings more employee and customer feedback into the company - by joining huddles, making call backs and helping us elevate opportunities to do better for our customers.
Drive results and growth.
Respect and promote inclusion & diversity.
Do what's right for each other, our customers, investors and our communities.

Disclaimer: This information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities and qualifications.
Comcast is proud to be an equal opportunity workplace. We will consider all qualified applicants for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, veteran status, genetic information, or any other basis protected by applicable law. Base pay is one part of the Total Rewards that Comcast provides to compensate and recognize employees for their work. Most sales positions are eligible for a Commission under the terms of an applicable plan, while most non-sales positions are eligible for a Bonus. Additionally, Comcast provides best-in-class Benefits to eligible employees. We believe that benefits should connect you to the support you need when it matters most, and should help you care for those who matter most. That’s why we provide an array of options, expert guidance and always-on tools that are personalized to meet the needs of your reality – to help support you physically, financially and emotionally through the big milestones and in your everyday life. Please visit the compensation and benefits summary on our careers site for more details.

Education: Bachelor's Degree. While possessing the stated degree is preferred, Comcast also may consider applicants who hold some combination of coursework and experience, or who have extensive related professional experience.

Relevant Work Experience: 2-5 Years
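The key skills listed for this role pair advanced SQL with data modeling (normalization and denormalization) and custom ETL scripts. A minimal sketch of what a denormalization step looks like, using Python's built-in sqlite3; the schema and table names are invented for illustration, not taken from the posting:

```python
import sqlite3

# Normalized schema: orders reference customers by key. A denormalized
# reporting table duplicates customer attributes for faster reads.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, region TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY,
                         customer_id INTEGER REFERENCES customers(id),
                         amount REAL);
""")
cur.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                [(1, "Acme", "South"), (2, "Globex", "North")])
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, 1, 120.0), (2, 1, 80.0), (3, 2, 50.0)])

# Denormalize into a reporting table - a typical custom-ETL step.
cur.execute("""
    CREATE TABLE order_report AS
    SELECT o.id AS order_id, c.name, c.region, o.amount
    FROM orders o JOIN customers c ON c.id = o.customer_id
""")
rows = cur.execute(
    "SELECT region, SUM(amount) FROM order_report GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('North', 50.0), ('South', 200.0)]
```

The trade-off shown is the usual one: the normalized tables stay consistent on write, while the denormalized report table answers aggregate queries without a join.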

Posted 4 weeks ago

Apply

5.0 years

6 - 9 Lacs

Ahmedabad

On-site

Fullstack Developer (Java + Angular)
Location: Ahmedabad, Gandhinagar, Pune
Experience: 5-8 Years
Immediate / 15 days joiners
Detailed JD: 5+ years of experience with a minimum bachelor's degree in Computer Science.
Technical Skillset
o Java 8+, JavaScript, TypeScript
o Spring Boot, Spring MVC, Spring Web Services, Spring Data, Hibernate, JasperReports
o Angular 8+, React 16+
o Angular Material, Bootstrap 4, HTML5, CSS3, SCSS
o Oracle SQL, PL/SQL development
o Pentaho Kettle
o Basic Linux scripting and troubleshooting
o Git
Job Type: Contractual / Temporary
Contract length: 6 months
Pay: ₹50,000.00 - ₹80,000.00 per month
Benefits: Provident Fund
Schedule: Day shift
Work Location: In person

Posted 4 weeks ago

Apply

7.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Job Purpose: As a senior technical consultant, you will lead complex implementation, upgrade, integration, and customisation projects. You will convert customers’ business needs into technical solutions and be the go-to person for mentoring junior consultants.

Main Duties & Responsibilities:
Make a significant contribution to the successful management of customer relationships and to the identification of new opportunities within the existing customer base, thereby maximising revenue.
Perform QA on the project plan and SOW and suggest appropriate changes.
Proactively identify and communicate project risks to the Project Manager and solve them appropriately.
Act as a trusted advisor to customers on their projects.
Assist Project Managers in post-go-live project assessment.
Understand the business use of configurations vs customisations.
Participate in the implementation, upgrade, configuration, customisation and deployment of solutions.
Integrate BMC Helix with external systems using REST APIs/SOAP web services, Pentaho Spoon, etc.
Ensure that implementations align with best practices and industry standards.
Collaborate with cross-functional teams, provide status updates on project deliverables, and ensure seamless integration with downstream systems.
Stay up to date with product roadmaps, releases, features and updates.
Mentor junior consultants, share knowledge and insights with the team and contribute to continuous learning.

Experience: 7+ years of experience with BMC Helix ITSM, Digital Workplace, Smart IT, etc.

Skills:
· Learning Agility: Master secondary skills and transform them into primary skills. Learn peripheral technologies helpful in project delivery.
· Problem Solving: Identify project risks proactively and coordinate with the Architect/PM to find solutions.
· Technical Adaptability: Assist in technical QA, design technical solutions, and adapt to technology changes.
· Team Collaboration: Mentor and influence collaboration across teams.
Coach and mentor junior team members.
· Communication Skills: Communicate with customers and stakeholders clearly and concisely.
· Ownership and Accountability: Take ownership of and accountability for the complete solution.
· Time and Task Management: Manage your own and your team's time effectively.

Qualifications: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. Relevant product certifications and ITIL certification are good to have.
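The duties above include integrating BMC Helix with external systems over REST. A hedged sketch of the request-building side in Python: the host is a placeholder, and the endpoint paths follow the commonly documented Remedy/Helix REST convention, so they should be confirmed against BMC's API documentation before use.

```python
import json
import urllib.request

BASE_URL = "https://helix.example.com"  # hypothetical host, replace with yours

def build_login_request(username: str, password: str) -> urllib.request.Request:
    """POST /api/jwt/login returns a JWT token used in later calls.
    Path is the documented Remedy REST convention; verify for your server."""
    data = f"username={username}&password={password}".encode()
    return urllib.request.Request(
        f"{BASE_URL}/api/jwt/login", data=data,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
        method="POST",
    )

def build_incident_payload(summary: str, urgency: str = "3-Medium") -> str:
    """JSON body shape used by form-entry style endpoints; field names here
    are illustrative and depend on the target form."""
    return json.dumps({"values": {"Description": summary, "Urgency": urgency}})

req = build_login_request("svc_user", "secret")
payload = build_incident_payload("Disk space alert on app server")
print(req.get_method(), payload)
```

Keeping request construction separate from transport, as above, also makes the integration logic unit-testable without a live Helix instance.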

Posted 4 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Itanagar, Arunachal Pradesh, India

On-site

Position: Senior Technical Consultant
Role: Pentaho
Experience: 6 to 10 Years
Start: Immediate
Shift Timing: 1 PM to 10 PM

Job Description
This customer-facing role requires a highly professional and experienced technical architect to design, develop, configure and support integrations with insight software's Agility PIM (Product Information Management) software, primarily, but not exclusively, using Hitachi's Pentaho ETL tool.
Troubleshoot and resolve integration issues, providing actionable solutions and anticipating future challenges, primarily (but not exclusively) using the Pentaho ETL tool.
Work with the client and Agility PIM SME to scope, design and implement novel custom technical solutions that meet client goals, primarily (but not exclusively) using the Pentaho ETL tool.
Work collaboratively with other teams to ensure a cohesive and successful client onboarding experience.
Complete a comprehensive 2-week training program for Agility PIM, including its Steps for Hitachi's Pentaho ETL.
Gain expertise in the Agility PIM tool by shadowing other consultants on project work during the first 2 months.
At 3 months, successfully troubleshoot issues with existing integrations created with Hitachi's Pentaho ETL tool.
At 6 months, write basic transformations using the Agility PIM Steps for Hitachi's Pentaho ETL tool.
At 12 months, independently write complex integration jobs composed of multiple transformations. (ref:hirist.tech)
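Pentaho transformations like those described above are assembled graphically in Spoon rather than written as code, but the logic of a typical step pairing — reading a CSV input and enriching each row via a stream lookup — can be sketched in Python. The data and field names are invented for illustration, not Agility PIM's actual model:

```python
import csv
import io

# Source rows, as a Pentaho "CSV file input" step might read them, and a
# lookup stream keyed on sku, as a "Stream lookup" step would use it.
products_csv = "sku,name\nA1,Widget\nB2,Gadget\n"
prices = {"A1": 9.99, "B2": 4.50}

def transform(src: str, lookup: dict) -> list:
    """Enrich each input row with a looked-up field; None when no match,
    mirroring a lookup step's default-value behaviour."""
    out = []
    for row in csv.DictReader(io.StringIO(src)):
        row["price"] = lookup.get(row["sku"])
        out.append(row)
    return out

rows = transform(products_csv, prices)
print(rows[0])  # {'sku': 'A1', 'name': 'Widget', 'price': 9.99}
```

In Spoon the same flow is two hops on the canvas; thinking of each step as a row-stream function like this one helps when troubleshooting existing transformations.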

Posted 4 weeks ago

Apply

6.0 - 9.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

About Company: Our client is a global technology company headquartered in Santa Clara, California. It focuses on helping organisations harness the power of data to drive digital transformation, enhance operational efficiency, and achieve sustainability. It combines over 100 years of experience in operational technology (OT) and more than 60 years in IT to unlock the power of data from your business, your people and your machines. We help enterprises store, enrich, activate and monetise their data to improve their customers’ experiences, develop new revenue streams and lower their business costs. Over 80% of the Fortune 100 trust our client for data solutions. The company’s consolidated revenues for fiscal 2024 (ended March 31, 2024) were approximately $57.5 billion USD, and the company has approximately 296,000 employees worldwide. It delivers digital solutions utilising Lumada in five sectors, including Mobility, Smart Life, Industry, Energy and IT, to increase our customers’ social, environmental and economic value.

Job Title: Data Engineer
Location: Hyderabad, Bangalore, Pune (Remote)
Client: Hitachi
Experience: 6-9 yrs
Job Type: Contract to hire
Notice Period: Immediate joiners only

Experience
We are seeking a highly skilled and motivated Data Engineer with 6 to 9 years of hands-on experience in Data Engineering or a related role. The ideal candidate will have expertise in modern data engineering practices, a deep understanding of AWS cloud services, and the ability to build robust data pipelines and architectures.

Key Responsibilities
Develop and maintain scalable data pipelines and workflows using automation and orchestration tools such as Airflow.
Build and optimize data architectures and models to support analytics and reporting needs.
Work extensively with AWS services such as Lambda, Glue, Athena, S3, Redshift, and EC2 for data processing and storage.
Ensure data integrity, quality, and security by implementing robust ETL processes and monitoring solutions.
Debug and troubleshoot data pipeline issues with strong analytical and problem-solving skills.
Implement modern data practices, including data lakes and real-time streaming processing capabilities.
Collaborate with cross-functional teams and adapt to rapidly changing technological landscapes.
Leverage tools like Git and CI/CD pipelines for version control and deployment automation.

Required Qualifications
6-9 years of experience in Data Engineering or related fields.
Strong expertise in AWS cloud services (AWS Lambda, Glue, Athena, S3, etc.).
Proficiency in Python and SQL.
Solid understanding of data architecture and modeling concepts.
Experience with ETL tools (e.g., Pentaho, SSIS, Informatica, HVR).
Knowledge of database, data warehouse, and big data technologies.
Experience with monitoring and logging solutions.

Preferred Skills
Knowledge of AI/ML and large language models (LLMs).
Experience with REST APIs and Salesforce APIs.

Technologies
AWS Lambda, AWS Glue, Athena, S3, Redshift, EC2, Airflow, Spark, Linux
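The responsibilities above center on S3, Glue and Athena. One small but concrete piece of that work is laying data out in Hive-style partitions so Athena can prune scans to the partitions a query actually needs. A sketch of a partition-path builder; the bucket and table names are invented:

```python
from datetime import date

def partition_key(bucket: str, table: str, d: date) -> str:
    """Hive-style partition path (year=/month=/day=), the layout Glue
    crawlers and Athena recognise for partition pruning."""
    return (f"s3://{bucket}/{table}/"
            f"year={d.year:04d}/month={d.month:02d}/day={d.day:02d}/")

key = partition_key("analytics-lake", "pos_sales", date(2024, 3, 7))
print(key)  # s3://analytics-lake/pos_sales/year=2024/month=03/day=07/
```

Zero-padding the month and day keeps partition values lexicographically sortable, which matters for range predicates over string-typed partition columns.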

Posted 1 month ago

Apply

0 years

6 - 8 Lacs

Hyderābād

On-site

Ready to shape the future of work? At Genpact, we don’t just adapt to change—we drive it. AI and digital innovation are redefining industries, and we’re leading the charge. Genpact’s AI Gigafactory, our industry-first accelerator, is an example of how we’re scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies’ most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that’s shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Inviting applications for the role of Senior Principal Consultant - Databricks Architect! In this role, the Databricks Architect is responsible for providing technical direction and leading a group of one or more developers to address a goal. Responsibilities: Architect and design solutions to meet functional and non-functional requirements. Create and review architecture and solution design artifacts. Evangelize re-use through the implementation of shared assets. Enforce adherence to architectural standards/principles, global product-specific guidelines, usability design standards, etc. Proactively guide engineering methodologies, standards, and leading practices. Provide guidance to engineering staff and review as-built configurations during the construction phase.
Provide insight and direction on roles and responsibilities required for solution operations. Identify, communicate and mitigate Risks, Assumptions, Issues, and Decisions throughout the full lifecycle. Considers the art of the possible, compares various architectural options based on feasibility and impact, and proposes actionable plans. Demonstrate strong analytical and technical problem-solving skills. Ability to analyze and operate at various levels of abstraction. Ability to balance what is strategically right with what is practically realistic. Growing the Data Engineering business by helping customers identify opportunities to deliver improved business outcomes, designing and driving the implementation of those solutions. Growing and retaining the Data Engineering team with appropriate skills and experience to deliver high-quality services to our customers. Supporting and developing our people, including learning & development, certification and career development plans. Providing technical governance and oversight for solution design and implementation. Should have the technical foresight to understand new technology and advancements. Leading the team in the definition of best practices and repeatable methodologies in Cloud Data Engineering, including Data Storage, ETL, Data Integration & Migration, Data Warehousing and Data Governance. Should have technical experience in Azure, AWS and GCP Cloud Data Engineering services and solutions. Contributing to sales and pre-sales activities including proposals, pursuits, demonstrations, and proof-of-concept initiatives. Evangelizing the Data Engineering service offerings to both internal and external stakeholders. Development of whitepapers, blogs, webinars and other thought leadership material. Development of Go-to-Market and Service Offering definitions for Data Engineering. Working with Learning & Development teams to establish appropriate learning and certification paths for the domain.
Expand the business within existing accounts and help clients by building and sustaining strategic executive relationships, doubling up as their trusted business technology advisor. Position differentiated and custom solutions to clients, based on market trends, the specific needs of the clients and the supporting business cases. Build new Data capabilities, solutions, assets, accelerators, and team competencies. Manage multiple opportunities through the entire business cycle simultaneously, working with cross-functional teams as necessary. Qualifications we seek in you! Minimum qualifications: Excellent technical architecture skills, enabling the creation of future-proof, complex global solutions. Excellent interpersonal communication and organizational skills are required to operate as a leading member of global, distributed teams that deliver quality services and solutions. Ability to rapidly gain knowledge of the organizational structure of the firm to facilitate work with groups outside of the immediate technical team. Knowledge and experience in the IT methodologies and life cycles that will be used. Familiar with solution implementation/management, service/operations management, etc. Leadership skills to inspire others and persuade. Maintains close awareness of new and emerging technologies and their potential application for service offerings and products. Bachelor’s Degree or equivalency (CS, CE, CIS, IS, MIS, or engineering discipline) or equivalent work experience. Experience in a solution architecture role using service and hosting solutions such as private/public cloud IaaS, PaaS, and SaaS platforms. Experience in architecting and designing technical solutions for cloud-centric solutions based on industry standards using IaaS, PaaS, and SaaS capabilities. Must have strong hands-on experience with various cloud services such as ADF/Lambda, ADLS/S3, Security, Monitoring and Governance. Must have experience designing platforms on Databricks.
Hands-on experience designing and building Databricks-based solutions on any cloud platform. Hands-on experience designing and building solutions powered by DBT models and integrating them with Databricks. Must be very good at designing end-to-end solutions on a cloud platform. Must have good knowledge of data engineering concepts and related cloud services. Must have good experience in Python and Spark. Must have good experience in setting up development best practices. Intermediate-level knowledge of data modelling is required. Good to have knowledge of Docker and Kubernetes. Experience with claims-based authentication (SAML/OAuth/OIDC), MFA, RBAC, SSO, etc. Knowledge of cloud security controls including tenant isolation, encryption at rest, encryption in transit, key management, vulnerability assessments, application firewalls, SIEM, etc. Experience building and supporting mission-critical technology components with DR capabilities. Experience with multi-tier system and service design and development for large enterprises. Extensive, real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures with a focus on cloud technologies. Exposure to infrastructure and application security technologies and approaches. Familiarity with requirements-gathering techniques. Preferred qualifications: Must have designed the end-to-end architecture of a unified data platform covering all aspects of the data lifecycle, from data ingestion through transformation, serving and consumption. Must have excellent coding skills in either Python or Scala, preferably Python. Must have experience in the Data Engineering domain, and must have designed and implemented at least 2-3 projects end-to-end in Databricks.
Must have experience on Databricks, which consists of various components as below:
o Delta Lake
o dbConnect
o DB API 2.0
o SQL Endpoint – Photon engine
o Unity Catalog
o Databricks workflows orchestration
o Security management
o Platform governance
o Data security
Must have knowledge of new features available in Databricks and their implications, along with various possible use cases. Must have applied various architectural principles to design the solution best suited to each problem. Must be well versed with the Databricks Lakehouse concept and its implementation in enterprise environments. Must have a strong understanding of data warehousing and the various governance and security standards around Databricks. Must have knowledge of cluster optimization and its integration with various cloud services. Must have a good understanding of how to create complex data pipelines. Must be strong in SQL and Spark SQL. Must have strong performance-optimization skills to improve efficiency and reduce cost. Must have worked on designing both batch and streaming data pipelines. Must have extensive knowledge of the Spark and Hive data processing frameworks. Must have worked on any cloud (Azure, AWS, GCP) and the most common services like ADLS/S3, ADF/Lambda, CosmosDB/DynamoDB, ASB/SQS and cloud databases. Must be strong in writing unit test cases and integration tests. Must have strong communication skills and have worked with cross-platform teams. Must have a great attitude towards learning new skills and upskilling existing skills. Responsible for setting best practices around Databricks CI/CD. Must understand composable architecture to take full advantage of Databricks capabilities. Good to have REST API knowledge. Good to have an understanding of cost distribution. Good to have worked on a migration project to build a unified data platform. Good to have knowledge of DBT. Experience around DevSecOps, including Docker and Kubernetes.
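Delta Lake's MERGE INTO (upsert) comes up constantly in the Databricks work described above. Outside a Databricks runtime its semantics can be sketched in plain Python; this is an illustrative in-memory model of the merge behaviour, not PySpark or Delta code:

```python
def merge_upsert(target: dict, updates: list, key: str = "id") -> dict:
    """WHEN MATCHED THEN UPDATE, WHEN NOT MATCHED THEN INSERT - the core
    semantics of a MERGE, modelled over a dict keyed by the merge key."""
    merged = dict(target)  # Delta writes a new table version; mirror that
    for row in updates:
        merged[row[key]] = row  # update if key exists, insert otherwise
    return merged

# Current "table" state and an incoming batch of changes.
target = {1: {"id": 1, "qty": 5}, 2: {"id": 2, "qty": 7}}
updates = [{"id": 2, "qty": 9}, {"id": 3, "qty": 1}]
result = merge_upsert(target, updates)
print(result)
# {1: {'id': 1, 'qty': 5}, 2: {'id': 2, 'qty': 9}, 3: {'id': 3, 'qty': 1}}
```

Copying the target before mutating also echoes Delta's versioned-write model: the previous table version remains readable while the merge produces the next one.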
Software development full-lifecycle methodologies, patterns, frameworks, libraries, and tools. Knowledge of programming and scripting languages such as JavaScript, PowerShell, Bash, SQL, Java, Python, etc. Experience with data ingestion technologies such as Azure Data Factory, SSIS, Pentaho, Alteryx. Experience with visualization tools such as Tableau, Power BI. Experience with machine learning tools such as MLflow, Databricks AI/ML, Azure ML, AWS SageMaker, etc. Experience in distilling complex technical challenges into actionable decisions for stakeholders and guiding project teams by building consensus and mediating compromises when necessary. Experience coordinating the intersection of complex system dependencies and interactions. Experience in solution delivery using common methodologies, especially SAFe Agile but also Waterfall, Iterative, etc. Demonstrated knowledge of relevant industry trends and standards. Why join Genpact? Be a transformation leader – Work at the cutting edge of AI, automation, and digital innovation. Make an impact – Drive change for global enterprises and solve business challenges that matter. Accelerate your career – Get hands-on experience, mentorship, and continuous learning opportunities. Work with the best – Join 140,000+ bold thinkers and problem-solvers who push boundaries every day. Thrive in a values-driven culture – Our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress. Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let’s build tomorrow together.
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training. Job Senior Principal Consultant Primary Location India-Hyderabad Schedule Full-time Education Level Bachelor's / Graduation / Equivalent Job Posting Jul 1, 2025, 6:40:20 AM Unposting Date Ongoing Master Skills List Digital Job Category Full Time

Posted 1 month ago

Apply

0 years

0 Lacs

Jaipur, Rajasthan, India

On-site

Position Overview
Job Title: Analytics Senior Analyst
Location: Jaipur, India
Corporate Title: AVP

Role Description
You will be joining the Data & Analytics team as part of the Global Procurement division. The team’s purpose is to:
Deliver trusted third-party data and insights to unlock commercial value and identify risk
Develop and execute the Global Procurement Data Strategy
Deliver the golden source of Global Procurement data, analysis and insights via dbPi, our Tableau self-service platform, leveraging automation and scalability on Google Cloud
Provide data and analytical support to Global Procurement prioritised change initiatives
The team leverages several tools and innovative techniques to create value-added insights for stakeholders across end-to-end Procurement processes including, but not limited to, third-party risk, contracting, spend and performance management.

What We’ll Offer You
As part of our flexible scheme, here are just some of the benefits that you’ll enjoy:
Best-in-class leave policy
Gender-neutral parental leaves
100% reimbursement under childcare assistance benefit (gender neutral)
Sponsorship for industry-relevant certifications and education
Employee Assistance Program for you and your family members
Comprehensive hospitalization insurance for you and your dependents
Accident and term life insurance
Complimentary health screening for those 35 yrs. and above

Your Key Responsibilities
You develop a sound understanding of the various tools and the entire suite of analytical offerings on the standard procurement insights platform, dbPi. You support our stakeholders by understanding their requirements, challenging appropriately where needed in order to scope the problem, conceptualising the optimum approach, and developing solutions using appropriate tools and visualisation techniques.
You are comfortable leading small project teams in delivering the analytics change book of work, keeping internal and external stakeholders updated on project progress while driving forward the key change topics. For requests which are more complex in nature, you connect the dots and come up with a solution by establishing linkages across different systems and processes. You take end-to-end responsibility for any change request in an existing analytical product or dashboard, from understanding the requirement through development, testing and QA, finally delivering it to stakeholders to their satisfaction. You are expected to deliver automation and Clean Data initiatives, such as deployment of a rules engine, data quality checks enabled through Google Cloud, and bringing Procurement data sources into GCP. You act as a thought partner to the Chief Information Office’s deployment of Google Cloud Platform to migrate the data infrastructure layer (ETL processes) currently managed by the Analytics team. You should be able to work in close collaboration with cross-functional teams, including developers, system administrators, and business stakeholders.

Your Skills And Experience
We are looking for talents with a degree (or equivalent) in Engineering, Mathematics, Statistics or Sciences from an accredited college or university (or equivalent) to develop analytical solutions for our stakeholders to support strategic decision making. Any professional certification in Advanced Analytics, Data Visualisation or a Data Science related domain is a plus. You have a natural curiosity for numbers and have strong quantitative and logical thinking skills. You ensure results are of high data quality and accuracy. You have working experience on Google Cloud and have worked with cross-functional teams to enable data source and process migration to GCP, and you have working experience with SQL. You are adaptable to emerging technologies, such as leveraging Machine Learning and AI to drive innovation.
Procurement experience (useful, though not essential) across vendor management, sourcing, risk, contracts and purchasing, preferably within a global and complex environment. You have the aptitude to understand stakeholders’ requirements, identify relevant data sources, integrate data, perform analysis and interpret the results by identifying trends and patterns. You enjoy the problem-solving process, think out of the box and break down a problem into its constituent parts with a view to developing end-to-end solutions. You display enthusiasm to work in the data analytics domain and strive for continuous learning and improvement of your technical and soft skills. You demonstrate working knowledge of different analytical tools like Tableau, databases, Alteryx, Pentaho, Looker and BigQuery in order to work with large datasets and derive insights for decision making. You enjoy working in a team, and your language skills in English are convincing, making it easy for you to work in an international environment and with global, virtual teams.

How We’ll Support You
Training and development to help you excel in your career
Coaching and support from experts in your team
A culture of continuous learning to aid progression
A range of flexible benefits that you can tailor to suit your needs

About Us And Our Teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 1 month ago

Apply

0.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Job Title: Data Automation Engineer, NCT
Location: Bangalore, India

Role Description
KYC Operations play an integral part in the firm's first line of defense against financial crime, reducing the risk of working with new clients (primarily Know Your Customer (KYC) risk), whilst ensuring client relationships are on-boarded and maintained efficiently. KYC Operations provide a golden source of quality reference data for CIB, underpinning the firm's key Regulatory, Control & Governance standards. Within KYC Operations there is a dedicated global group, KYC Transformation, that drives end-to-end delivery. Our team partners with stakeholders in and outside of KYC Ops to ensure our processes are fit for purpose, follow a uniform standard and continuously improve, thereby adding quantifiable value to support colleagues and clients in a flexible, fast and focused manner. As a Data Automation Engineer, you will build fast solutions to help Operations and other parts of the bank deliver their highest value, removing repetitive tasks, building strategic data pipelines and ensuring automation is robust and stable using solutions including Python, VBA, MS Power Platform (Power Automate, Power Apps, Power BI), SQL and SharePoint. Our approach is to ensure the solution can be merged into strategic tooling and fits the technology design process standards. We are looking for an enthusiastic and motivated person with excellent communication skills to join our team. You will love working with us and see the value in helping people by delivering effective solutions that make a positive impact on your colleagues' workload. You will be curious and able to quickly absorb organizational complexity, regulatory requirements and business logic, translating that structure into your work. This role offers a fantastic opportunity to join one of the most prestigious financial organisations operating all over the globe, and you will gain amazing experience.
What we'll offer you: 100% reimbursement under childcare assistance benefit (gender neutral), sponsorship for industry-relevant certifications and education, and accident and term life insurance.

Your key responsibilities
Work with stakeholders to identify opportunities to drive business solutions and improvements.
Automate manual effort, providing tactical solutions to improve speed and value.
Work in an agile way to deliver proofs of concept and fast solutions using technologies appropriate to the problem statements and requirements.
Enhance personal and team networks to ensure cooperation yields efficiencies, for example by sharing solutions with a wider team, re-using existing solutions, and enhancing solutions to have a wider and more beneficial business impact.

Your skills and experience
Analyse, design, develop, test, deploy and support Digital Services software solutions.
Exposure to ETL technologies and methods.
Expertise in coding/programming in Python, VBA and SQL to extract data sets efficiently.
Experience in developing business solutions in any of MS Power Apps, MS Power Automate or RPA.
Excellent spatial reasoning and the ability to view processes and data in two or three dimensions.
Process mapping, process re-engineering and data orientation, with experience in enterprise process modelling for current and future states.
The ability to generate innovative ideas and deliver effectively, highlighting blockers if needed.
Exposure to workflow solutions, Alteryx, Pentaho, Celonis, Linux and database tuning is desirable.
Documenting solutions (i.e., creation and upkeep of artefacts: requirement docs, SDDs, test scripts, JIRA tickets, and post-go-live KSDs).
Provide L1 support for the existing RPA solution, resolving issues with minimum TAT to ensure business resiliency.

Competencies:
Work alongside Solution Architects, Business Analysts and BOT Controlling to contribute to solution designs.
Highly organized, with a keen eye for detail and a proven record of operating in a fast-paced environment.
Ability to work independently and as part of the team, with an enterprising spirit and a passion for learning and applying new technologies.
Excellent communication skills, with the ability to converse clearly with stakeholders from all cultures.
Ability to work well in a global and virtual team, under pressure, and to multi-task.

Behavioural skills
Desire to work in a fast-paced, challenging environment.
Self-motivated, independent, fast-thinking and dynamic, with exposure to internal and external stakeholders.

How we'll support you

About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
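The responsibilities above centre on replacing repetitive manual data work with Python and SQL. As a purely illustrative sketch (the table and column names are hypothetical, and sqlite3 stands in for the bank's actual data sources), a small extraction step that turns a copy-paste chore into one query might look like:

```python
import sqlite3

# Hypothetical stand-in for a KYC reference-data source.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE client_reviews (client_id TEXT, region TEXT, review_status TEXT)"
)
conn.executemany(
    "INSERT INTO client_reviews VALUES (?, ?, ?)",
    [("C001", "EMEA", "overdue"),
     ("C002", "APAC", "complete"),
     ("C003", "EMEA", "overdue")],
)

# Extract only the records needing manual follow-up, in one pass.
overdue = conn.execute(
    "SELECT client_id, region FROM client_reviews "
    "WHERE review_status = 'overdue' ORDER BY client_id"
).fetchall()

for client_id, region in overdue:
    print(f"{client_id}\t{region}")
```

In practice the same pattern would sit behind a scheduled job or a Power Automate flow, with the query output feeding a report rather than stdout.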

Posted 1 month ago

Apply

4.0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

Logitech is the Sweet Spot for people who want their actions to have a positive global impact while having the flexibility to do it in their own way.

The Role:
We are looking for a candidate to join our team who will be involved in the ongoing development of our Enterprise Data Warehouse (EDW) and in supporting our POS (Point of Sale) Channel Data Management team. This role includes participating in the loading and extraction of data, including POS data, to and from the warehouse. The ideal candidate will be involved in all stages of the project lifecycle, from initial planning through to deployment in production. A key focus of the role will be data analysis, data modeling, and ensuring these aspects are successfully implemented in the production environment.

Your Contribution:
Be Yourself. Be Open. Stay Hungry and Humble. Collaborate. Challenge. Decide and just Do. These are the behaviors you’ll need for success at Logitech. In this role you will:
Design, develop, document, and test ETL solutions using industry-standard tools.
Design physical and reporting data models for seamless cross-functional and cross-system data reporting.
Enhance point-of-sale datasets with additional data points to provide stakeholders with useful insights.
Ensure data integrity by rigorously validating and reconciling data obtained from third-party providers.
Collaborate with data providers and internal teams to address customer data discrepancies and enhance data quality.
Work closely with our D&I teams to deliver datasets optimized for consumption in reporting and visualization tools like Tableau.
Collaborate with channel data and cross-functional teams to define requirements for POS and MDM data flows.
Support customer MDM & POS ad hoc requests and data clarifications from the Channel Data Team and the Finance Team.
Collaborate with the BIOPS team to support quarter-end user activities and ensure compliance with SOX regulations.
Be willing to explore and learn new technologies and concepts to provide the right kind of solution.

Key Qualifications:
For consideration, you must bring the following minimum skills and behaviors to our team:
A total of 4 to 7 years of experience in ETL design, development, and populating data warehouses, including experience with heterogeneous OLTP sources such as Oracle R12 ERP systems and other cloud technologies.
At least 3 years of hands-on experience with Pentaho Data Integration or similar ETL tools.
Practical experience working with cloud-based data warehouses such as Snowflake and Redshift.
Significant hands-on experience with Snowflake utilities, including SnowSQL, SnowPipe, Python, Tasks, Streams, Time Travel, Optimizer, Metadata Manager, data sharing, and stored procedures.
Comprehensive expertise in databases, data acquisition, ETL strategies, and the tools and technologies within Pentaho DI and Snowflake.
Demonstrated experience in designing complex ETL processes for extracting data from various sources, including XML files, JSON, RDBMSs, and flat files.
Exposure to standard support-ticket management tools.
A strong understanding of Business Intelligence and data warehousing concepts and methodologies.
Extensive experience in data analysis and root cause analysis, along with proven problem-solving and analytical-thinking capabilities.
A solid understanding of software engineering principles and proficiency in working with Unix/Linux/Windows operating systems, version control, and office software.
A deep understanding of data warehousing principles and cloud architecture, including SQL optimization techniques for building efficient and scalable data systems.
Familiarity with Snowflake’s unique features, such as its multi-cluster architecture and data-sharing capabilities.
Excellent skills in writing and optimizing SQL queries to ensure high performance and data accuracy across all systems.
The ability to troubleshoot and resolve data quality issues promptly, maintaining data integrity and reliability.
Strong communication skills, essential for effective collaboration with both technical and non-technical teams to ensure a clear understanding of data engineering requirements.

In addition, preferable skills and behaviors include:
Exposure to an Oracle ERP environment.
A basic understanding of reporting tools like OBIEE and Tableau.

Education:
BS/BTech/MS in Computer Science, Information Systems, or a related technical field, or equivalent industry expertise.

Logitech is the sweet spot for people who are passionate about products, making a mark, and having fun doing it. As a company, we’re small and flexible enough for every person to take initiative and make things happen. But we’re big enough in our portfolio, and reach, for those actions to have a global impact. That’s a pretty sweet spot to be in and we’re always striving to keep it that way.

Across Logitech we empower collaboration and foster play. We help teams collaborate and learn from anywhere, without compromising on productivity or continuity, so it should be no surprise that most of our jobs are open to work from home from most locations. Our hybrid work model allows some employees to work remotely while others work on-premises; within this structure, you may have teams or departments split between working remotely and working in-house. Logitech is an amazing place to work because it is full of authentic people who are inclusive by nature as well as by design. Being a global company, we value our diversity and celebrate all our differences.

Don’t meet every single requirement? Not a problem. If you feel you are the right candidate for the opportunity, we strongly recommend that you apply. We want to meet you! We offer comprehensive and competitive benefits packages and working environments that are designed to be flexible and to help you care for yourself and your loved ones, now and in the future.
We believe that good health means more than getting medical care when you need it. Logitech supports a culture that encourages individuals to achieve good physical, financial, emotional, intellectual and social wellbeing, so we all can create, achieve and enjoy more and support our families. There are too many benefits to list here, and they vary based on location, but we can’t wait to tell you more about them.

All qualified applicants will receive consideration for employment without regard to race, sex, age, color, religion, sexual orientation, gender identity, national origin, protected veteran status, or on the basis of disability. If you require an accommodation to complete any part of the application process, or are limited in your ability or unable to access or use this online application process and need an alternative method for applying, you may contact us toll-free at +1-510-713-4866 for assistance and we will get back to you as soon as possible.
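The qualifications above emphasise designing ETL that pulls from heterogeneous sources such as JSON and flat files and reconciles what was loaded against the source. As an illustrative sketch only (the file layout and field names are hypothetical, and sqlite3 stands in for a warehouse such as Snowflake), a minimal extract-transform-load-reconcile step might look like:

```python
import json
import sqlite3

# Extract: a hypothetical POS feed, as it might arrive from a channel partner.
raw = json.loads("""
[
  {"sku": "K120", "units": "5", "partner": "acme"},
  {"sku": "M185", "units": "3", "partner": "acme"}
]
""")

# Transform: cast types and drop malformed rows (basic data validation).
rows = [(r["sku"], int(r["units"]), r["partner"])
        for r in raw if r.get("sku") and int(r["units"]) >= 0]

# Load into a stand-in warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pos_sales (sku TEXT, units INTEGER, partner TEXT)")
conn.executemany("INSERT INTO pos_sales VALUES (?, ?, ?)", rows)

# Reconcile: total units loaded should match the source extract.
total = conn.execute("SELECT SUM(units) FROM pos_sales").fetchone()[0]
print(total)  # 8
```

A production pipeline built with Pentaho DI or SnowPipe would replace each stage with the tool's own steps, but the validate-then-reconcile shape stays the same.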

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies