6.0 years
0 Lacs
Vadodara, Gujarat, India
On-site
We are looking for a highly capable Senior Full Stack Engineer to be a core contributor in developing our suite of product offerings. If you love working on complex problems and writing clean code, you will love this role. Our goal is to solve a messy problem elegantly and cost-effectively. Our job is to collect, categorize, and analyze semi-structured data from different sources (20 million+ products from 500+ websites into our catalog of 500 million+ products). We help our customers discover new patterns in their data that can be leveraged so that they can become more competitive and increase their revenue.

Essential Functions:
- Think like our customers – you will work with product and engineering leaders to define intuitive solutions
- Design customer-facing UI and back-end services for various business processes
- Develop high-performance applications by writing testable, reusable, and efficient code
- Implement effective security protocols, data protection measures, and storage solutions
- Improve the quality of our solutions – you will hold yourself and your team members accountable to writing high-quality, well-designed, maintainable software
- Own your work – you will take responsibility to shepherd your projects from idea through delivery into production
- Bring new ideas to the table – some of our best innovations originate within the team
- Guide and mentor others on the team

Technologies We Use:
- Languages: NodeJS/NestJS/TypeScript, SQL, React/Redux, GraphQL
- Infrastructure: AWS, Docker, Kubernetes, Terraform, GitHub Actions, ArgoCD
- Databases: Postgres, MongoDB, Redis, Elasticsearch, Trino, Iceberg
- Streaming and Queuing: Kafka, NATS, Keda

Qualifications:
- 6+ years of professional software engineering/development experience
- Proficiency with architecting and delivering solutions within a distributed software platform
- Full stack engineering experience, including front-end frameworks (React/TypeScript, Redux) and back-end technologies such as NodeJS/NestJS/TypeScript and GraphQL
- Proven ability to learn quickly, make pragmatic decisions, and adapt to changing business needs
- Proven ability to work effectively, prioritizing and organizing your work in a highly dynamic environment
- Proven track record of working in highly distributed event-driven systems
- Strong proficiency working with RDBMS/NoSQL/Big Data solutions (Postgres, MongoDB, Trino, etc.)
- Solid understanding of Data Pipeline and Workflow Automation – orchestration tools, scheduling, and monitoring
- Solid understanding of ETL/ELT and OLTP/OLAP concepts
- Solid understanding of Data Lakes, Data Warehouses, and modeling practices (Data Vault, etc.), and experience leveraging data lake solutions (e.g. AWS Glue, DBT, Trino, Iceberg)
- Ability to clean, transform, and aggregate data using SQL or scripting languages
- Ability to design and estimate tasks, and coordinate work with other team members during iteration planning
- Solid understanding of AWS, Linux, and infrastructure concepts
- Track record of lifting and challenging teammates to higher levels of achievement
- Experience measuring, driving, and improving the software engineering process
- Good testing habits and a strong eye for quality
- Outstanding organizational, communication, and relationship-building skills conducive to driving consensus; able to work well in a cross-functional environment
- Experience working in an agile team environment
- Ownership – feel a sense of personal accountability/responsibility to drive execution from start to finish
- Drive adoption of Wiser's Product Delivery organization principles across the department

Bonus Points:
- Experience with CQRS
- Experience with Domain-Driven Design
- Experience with C4 modeling
- Experience working within a retail or ecommerce environment
- Experience with AI coding agents (Windsurf, Cursor, Claude, ChatGPT, etc.) – prompt engineering

Why Join Wiser Solutions?
- Work on an industry-leading product trusted by top retailers and brands
- Be at the forefront of pricing intelligence and data-driven decision-making
- A collaborative, fast-paced environment where your impact is tangible
- Competitive compensation, benefits, and career growth opportunities

Additional Information
EEO STATEMENT: Wiser Solutions, Inc. is an Equal Opportunity Employer and prohibits discrimination, harassment, and retaliation of any kind. Wiser Solutions, Inc. is committed to the principle of equal employment opportunity for all employees and applicants, providing a work environment free of discrimination, harassment, and retaliation. All employment decisions at Wiser Solutions, Inc. are based on business needs, job requirements, and individual qualifications, without regard to race, color, religion, sex, national origin, family or parental status, disability, genetics, age, sexual orientation, veteran status, or any other status protected by state, federal, or local law. Wiser Solutions, Inc. will not tolerate discrimination, harassment, or retaliation based on any of these characteristics.
Posted 1 week ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
We are seeking a highly capable Data Platform Engineer to build and maintain a secure, scalable, and air-gapped-compatible data pipeline that supports multi-tenant ingestion, transformation, warehousing, and dashboarding. You’ll work across the stack: from ingesting diverse data sources (files, APIs, DBs), transforming them via SQL or Python tools, storing them in an OLAP-optimized warehouse, and surfacing insights through customizable BI dashboards.

Key Responsibilities:
1. Data Ingestion (ETL Engine):
- Design and maintain data pipelines to ingest from: file systems (CSV, Excel, PDF, binary formats); databases, using JDBC connectors (PostgreSQL, MySQL, etc.); and APIs (REST, XML, GraphQL endpoints)
- Implement and optimize: Airflow for scheduling and orchestration (see the DAG sketch at the end of this listing); Apache NiFi for drag-and-drop pipeline development; Kafka / Redis Streams for real-time or event-based ingestion
- Develop custom Python connectors for air-gapped environments
- Handle binary data using PyPDF2, protobuf, OpenCV, Tesseract, etc.
- Ensure secure storage of raw data in MinIO, GlusterFS, or other vaults
2. Transformation Layer:
- Implement SQL/code-based transformation using: dbt-core for modular SQL pipelines; Dask or Pandas for mid-size data processing; Apache Spark for large-scale, distributed ETL
- Integrate Great Expectations or other frameworks for data quality validation (optional in on-prem)
- Optimize data pipelines for latency, memory, and parallelism
3. Data Warehouse (On-Prem):
- Deploy and manage on-prem OLAP/RDBMS options, including ClickHouse for real-time analytics, Apache Druid for event-driven dashboards, and PostgreSQL, Greenplum, and DuckDB for varied OLAP/OLTP use cases
- Architect multi-schema / multi-tenant isolation strategies
- Maintain warehouse performance and data consistency across layers
4. BI Dashboards:
- Develop and configure per-tenant dashboards using Metabase (preferred for RBAC + multi-tenant), Apache Superset or Redash for custom exploration, and Grafana for technical metrics
- Embed dashboards into customer portals
- Configure PDF/email-based scheduled reporting
- Work with stakeholders to define marketing, operations, and executive KPIs

Required Skills & Qualifications:
- 5+ years of hands-on experience with ETL tools, data transformation, and BI platforms
- Advanced Python skills for custom ingestion and transformation logic
- Strong understanding of SQL, data modeling, and query optimization
- Experience with Apache NiFi, Airflow, Kafka, or Redis Streams
- Familiarity with at least two of: ClickHouse, Druid, PostgreSQL, Greenplum, DuckDB
- Experience building multi-tenant data platforms
- Comfort working in air-gapped / on-prem environments
- Strong understanding of security, RBAC, and data governance practices

Nice-to-Have Skills:
- Experience in regulated industries (BFSI, Telecom, government)
- Knowledge of containerization (Docker/Podman) and orchestration (K8s/OpenShift)
- Exposure to data quality and validation frameworks (e.g., Great Expectations)
- Experience with embedding BI tools in web apps (React, Django, etc.)

What We Offer:
- Opportunity to build a cutting-edge, open-source-first data platform for real-time insights
- Collaborative team environment focused on secure and scalable data systems
- Competitive salary and growth opportunities
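For readers unfamiliar with the orchestration pattern this posting describes, here is a minimal sketch of a daily ingestion DAG. It assumes Airflow 2.x with pandas and pyarrow installed; every path, schedule, and table name is a hypothetical placeholder, not part of the posting.

```python
# Minimal Airflow DAG sketch: land a tenant's CSV drop, stage it, and hand
# it to a warehouse loader. Paths and names are illustrative placeholders.
from datetime import datetime

import pandas as pd
from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
def tenant_ingest():
    @task
    def extract(path: str = "/data/landing/tenant_a/orders.csv") -> str:
        df = pd.read_csv(path)                            # raw file drop
        staged = "/data/staging/tenant_a/orders.parquet"
        df.to_parquet(staged, index=False)                # normalized staging copy
        return staged

    @task
    def load(staged: str) -> None:
        # In this stack the real loader would target ClickHouse, Druid,
        # or PostgreSQL; here we only report what would be loaded.
        df = pd.read_parquet(staged)
        print(f"would load {len(df)} rows into the warehouse")

    load(extract())


tenant_ingest()
```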
Posted 1 week ago
4.0 - 10.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
We are seeking a PostgreSQL Database Developer with a minimum of 4 years of experience in database management. The ideal candidate should be passionate about technology, dedicated to continuous learning, and committed to providing exceptional customer experiences through client interactions.

Qualifications:
- Must have a degree in BE/B.Tech/MCA/MS-IT/CS/B.Sc/BCA or related fields.
- Expertise and hands-on experience in PostgreSQL, PL/SQL, Oracle, query optimization, performance tuning, and GCP Cloud.

Job Description: The responsibilities of the PostgreSQL Database Developer include:
- Proficiency in PL/SQL and PostgreSQL programming, with the ability to write complex SQL queries and stored procedures.
- Experience in migrating database structure and data from Oracle to PostgreSQL, preferably on GCP AlloyDB or Cloud SQL.
- Familiarity with Cloud SQL/AlloyDB and tuning them for better performance.
- Working knowledge of BigQuery, Firestore, Memorystore, Spanner, and bare-metal setup for PostgreSQL.
- Expertise in tuning AlloyDB/Cloud SQL databases for optimal performance.
- Experience with GCP Database Migration Service, MongoDB, Cloud Dataflow, disaster recovery, job scheduling, logging techniques, and OLTP/OLAP.
- Desirable: GCP Database Engineer Certification.

Roles & Responsibilities:
- Develop, test, and maintain data architectures.
- Migrate enterprise Oracle databases from on-prem to GCP cloud.
- Tune autovacuum in PostgreSQL (see the sketch below).
- Performance-tune PostgreSQL stored procedures and queries.
- Convert Oracle stored procedures and queries to PostgreSQL equivalents.
- Create a hybrid data store with data warehouse, NoSQL GCP solutions, and PostgreSQL.
- Migrate Oracle table data to AlloyDB.
- Lead the database team.

Mandatory Skills: PostgreSQL, PL/SQL, BigQuery, GCP Cloud, tuning, and optimization.

To apply, please share your resume at sonali.mangore@impetus.com with details of your current CTC, expected CTC, notice period, and Last Working Day (LWD).
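As context for the autovacuum-tuning duties above: a small diagnostic sketch, assuming psycopg2 and a placeholder DSN (none of these identifiers come from the posting).

```python
# Sketch: find tables whose dead-tuple counts suggest autovacuum needs tuning.
import psycopg2

conn = psycopg2.connect("dbname=appdb user=postgres host=localhost")  # placeholder DSN
with conn, conn.cursor() as cur:
    # Tables with many dead tuples and stale autovacuum runs are the first
    # candidates for per-table tuning, e.g.:
    #   ALTER TABLE orders SET (autovacuum_vacuum_scale_factor = 0.05);
    cur.execute("""
        SELECT relname, n_dead_tup, last_autovacuum
        FROM pg_stat_user_tables
        ORDER BY n_dead_tup DESC
        LIMIT 10;
    """)
    for relname, n_dead, last_run in cur.fetchall():
        print(f"{relname}: {n_dead} dead tuples, last autovacuum {last_run}")
conn.close()
```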
Posted 1 week ago
3.0 years
0 Lacs
Greater Chennai Area
On-site
Responsibilities:
- Participate in requirements definition, analysis, and the design of logical and physical data models for Dimensional Data Models, NoSQL, or Graph Data Models.
- Lead data discovery discussions with Business in JAD sessions and map the business requirements to logical and physical data modeling solutions.
- Conduct data model reviews with project team members.
- Capture technical metadata through data modeling tools.
- Ensure database designs efficiently support BI and end-user requirements.
- Drive continual improvement and enhancement of existing systems.
- Collaborate with ETL/Data Engineering teams to create data process pipelines for data ingestion and transformation.
- Collaborate with Data Architects for data model management, documentation, and version control.
- Maintain expertise and proficiency in the various application areas.
- Maintain current knowledge of industry trends and standards.

Required Skills:
- Strong data analysis and data profiling skills.
- Strong conceptual, logical, and physical data modeling for VLDB Data Warehouse and Graph DB.
- Hands-on experience with modeling tools such as ERWIN or another industry-standard tool.
- Fluent in both normalized and dimensional model disciplines and techniques.
- Minimum of 3 years' experience in Oracle Database.
- Hands-on experience with Oracle SQL, PL/SQL, or Cypher.
- Exposure to Databricks Spark, Delta technologies, Informatica ETL, or other industry-leading tools.
- Good knowledge of or experience with AWS Redshift and Graph DB design and management.
- Working knowledge of AWS Cloud technologies, mainly the VPC, EC2, S3, DMS, and Glue services.
- Bachelor's degree in Software Engineering, Computer Science, or Information Systems (or equivalent experience).
- Excellent verbal and written communication skills, including the ability to describe complex technical concepts in relatable terms.
- Ability to manage and prioritize multiple workstreams with confidence in making decisions about prioritization.
- Data-driven mentality; self-motivated, responsible, conscientious, and detail-oriented.
- Ability to learn and maintain knowledge of multiple application areas.
- Understanding of industry best practices pertaining to Quality Assurance concepts.

Education and Experience:
- Bachelor's degree in Computer Science, Engineering, or relevant fields with 3+ years of experience as a Data and Solution Architect supporting Enterprise Data and Integration Applications, or a similar role for large-scale enterprise solutions.
- 3+ years of experience in Big Data infrastructure and tuning in a Lakehouse data ecosystem, including Data Lake, Data Warehouses, and Graph DB.
- AWS Solutions Architect Professional level certification.
- Extensive experience in data analysis on critical enterprise systems like SAP, E1, Mainframe ERP, SFDC, Adobe Platform, and eCommerce systems.

Skill Set Required: GCP, Data Modelling (OLTP, OLAP), indexing, DBSchema, CloudSQL, BigQuery
- Hands-on data modelling for OLTP and OLAP systems.
- In-depth knowledge of conceptual, logical, and physical data modelling.
- Strong understanding of indexing, partitioning, and data sharding, with practical experience of having done the same.
- Strong understanding of variables impacting database performance for near-real-time reporting and application interaction.
- Should have working experience with at least one data modelling tool, preferably DBSchema.
- People with functional knowledge of the mutual fund industry will be a plus.
- Good understanding of GCP databases like AlloyDB, CloudSQL, and BigQuery.
Posted 1 week ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Company Description
Acronotics Limited is a specialist consulting and services firm focused on modern-age automation technologies such as Robotic Process Automation (RPA) and Artificial Intelligence (AI). We apply human intelligence to build cutting-edge robotic automation & AI solutions for our clients. Our team consists of world-class automation & AI engineers, designers, consultants, and thought leaders. Our mission is to help clients design, develop, implement, and run game-changing, transformative robotic automation and AI solutions.

Role Description
We are looking for a detail-oriented and strategic Business Analyst with experience working on IT projects dealing with corporate finance data. The ideal candidate will work at the intersection of business, data, and technology, acting as a bridge between the Finance team and the technical team. They will be instrumental in gathering requirements, validating outputs, ensuring alignment with business objectives, and supporting iterative delivery across phases.

Key Responsibilities:
- Collaborate with stakeholders from the Finance team to capture business requirements and translate them into actionable technical inputs.
- Analyze financial reports (e.g., Key Figures, Financial Statements, Capex Reports) and map them to underlying data structures (Power BI, OLAP cubes).
- Facilitate UAT (User Acceptance Testing) and maintain traceability between requirements, test cases, and outcomes.
- Participate in Agile ceremonies and sprint reviews, ensuring business alignment and timely sign-offs.
- Work with data owners to document dataset definitions, KPIs, cube hierarchies, and metadata needed for model context.

Requirements:
- 5+ years of experience as a Business Analyst in an IT services company working on corporate finance projects.
- Proficient in creating BRDs, user stories, workflows, and test cases.
- Strong understanding of financial reports, financial KPIs, variance analysis, forecasting, and budgeting processes.
- Experience working with BI tools such as Power BI and OLAP cubes (SSAS).
- Familiarity with Excel- and PowerPoint-based financial commentaries and how they are used in decision-making.
- Exposure to AI, ML, or LLM-based platforms (e.g., Azure OpenAI, Copilot interfaces) is a plus.
- Comfortable working with semi-structured and structured data sources.
- Excellent communication skills – able to distill complex technical outputs into business-friendly narratives.

Good to have:
- Experience working on AI implementation projects.
- Experience with tools like JIRA, Confluence, or Azure DevOps.
- Certification in Business Analysis (CBAP/CCBA) or Agile (Scrum Product Owner/BA).
- Ability to contribute to prompt engineering and RAG (Retrieval-Augmented Generation) context definition to improve AI performance.

Location: Bangalore
Posted 1 week ago
8.0 years
0 Lacs
Kochi, Kerala, India
On-site
Role Description
Role Proficiency: Act creatively to develop applications by selecting appropriate technical options, optimizing application development, maintenance, and performance by employing design patterns and reusing proven solutions. Account for others' developmental activities; assist the Project Manager in day-to-day project execution.

Outcomes:
- Interpret the application feature and component designs and develop them in accordance with specifications.
- Code, debug, test, document, and communicate product/component/feature development stages.
- Validate results with user representatives; integrate and commission the overall solution.
- Select and create appropriate technical options for development, such as reusing, improving, or reconfiguring existing components, while creating own solutions for new contexts.
- Optimize efficiency, cost, and quality.
- Influence and improve customer satisfaction.
- Influence and improve employee engagement within the project teams.
- Set FAST goals for self/team; provide feedback on FAST goals of team members.

Measures of Outcomes:
- Adherence to engineering process and standards (coding standards)
- Adherence to project schedule/timelines
- Number of technical issues uncovered during the execution of the project
- Number of defects in the code
- Number of defects post delivery
- Number of non-compliance issues
- Percent of voluntary attrition
- On-time completion of mandatory compliance trainings

Outputs Expected:
- Code: Code as per the design; define coding standards, templates, and checklists; review code for team and peers.
- Documentation: Create/review templates, checklists, guidelines, and standards for design/process/development; create/review deliverable documents, design documentation, requirements, test cases, and results.
- Configure: Define and govern the configuration management plan; ensure compliance from the team.
- Test: Review/create unit test cases, scenarios, and execution; review the test plan created by the testing team; provide clarifications to the testing team.
- Domain Relevance: Advise software developers on design and development of features and components with a deeper understanding of the business problem being addressed for the client; learn more about the customer domain and identify opportunities to provide value addition to customers; complete relevant domain certifications.
- Manage Project: Support the Project Manager with inputs for the projects; manage delivery of modules; manage complex user stories.
- Manage Defects: Perform defect RCA and mitigation; identify defect trends and take proactive measures to improve quality.
- Estimate: Create and provide input for effort and size estimation and plan resources for projects.
- Manage Knowledge: Consume and contribute to project-related documents, SharePoint libraries, and client universities; review the reusable documents created by the team.
- Release: Execute and monitor the release process.
- Design: Contribute to the creation of design (HLD, LLD, SAD)/architecture for applications, features, business components, and data models.
- Interface with Customer: Clarify requirements and provide guidance to the development team; present design options to customers; conduct product demos; work closely with customer architects to finalize design.
- Manage Team: Set FAST goals and provide feedback; understand aspirations of the team members and provide guidance and opportunities; ensure team members are upskilled and engaged in the project; proactively identify attrition risks and work with BSE on retention measures.
- Certifications: Obtain relevant domain and technology certifications.

Skill Examples:
- Explain and communicate the design/development to the customer.
- Perform and evaluate test results against product specifications.
- Break down complex problems into logical components.
- Develop user interfaces and business software components; use data models.
- Estimate time and effort resources required for developing/debugging features/components.
- Perform and evaluate tests in the customer or target environments.
- Make quick decisions on technical/project-related challenges.
- Manage a team, mentor, and handle people-related issues in the team; maintain high motivation levels and positive dynamics within the team.
- Interface with other teams, designers, and other parallel practices.
- Set goals for self and team; provide feedback for team members.
- Create and articulate impactful technical presentations; follow a high level of business etiquette in emails and other business communication.
- Drive conference calls with customers and answer customer questions; proactively ask for and offer help.
- Ability to work under pressure, determine dependencies and risks, facilitate planning, and handle multiple tasks.
- Build confidence with customers by meeting deliverables on time with a quality product.

Knowledge Examples:
- Appropriate software programs/modules
- Functional and technical designing
- Programming languages – proficient in multiple skill clusters
- DBMS, operating systems, and software platforms
- Software Development Life Cycle; Agile – Scrum or Kanban methods
- Integrated development environments (IDE); rapid application development (RAD)
- Modelling technology and languages; interface definition languages (IDL)
- Broad knowledge of the customer domain and deep knowledge of the sub-domain where the problem is solved

Additional Comments:
- Over 8+ years of experience in developing BI applications utilizing SQL Server, SF (Salesforce), GCP, PostgreSQL, the BI stack, Power BI, and Tableau.
- Practical understanding of data modelling (dimensional and relational) concepts like Star Schema modelling, Snowflake Schema modelling, and fact and dimension tables.
- Ability to translate business requirements into workable functional and non-functional requirements.
- Capable of taking ownership and communicating with C-suite executives and stakeholders.
- Extensive database programming experience in writing T-SQL, user-defined functions, triggers, views, temporary tables, constraints, and indexes using various DDL and DML commands.
- Experienced in creating SSAS-based OLAP cubes and writing complex DAX; ability to work with external tools like Tabular Editor and DAX Studio.
- Understand, customize, and optimize complex stored procedures and queries for implementing business logic and processes in the backend, for data extraction.
- Hands-on experience with incremental refresh, RLS, parameterization, dataflows, and gateways.
- Experience in the design and development of Business Intelligence solutions using SSRS and Power BI.
- Experience in optimization of Power BI reports implementing Mixed and DirectQuery modes.

Skills: Power BI, Power Tools, Data Analysis
Posted 1 week ago
4.0 - 7.0 years
9 - 13 Lacs
Hyderabad
Work from Office
- Power BI modelling
- Performance tuning and optimization
- Power BI Embedded APIs; concept of embed tokens (see the sketch below)
- Custom UI integration
- Row-Level Security concepts
- Hands-on knowledge of incremental refresh
- Power Query (M) scripting
- Enhanced XMLA scripting
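To make the "embedded APIs / embed tokens" item concrete: a rough sketch of requesting a Power BI embed token from Python, assuming the msal and requests packages; all tenant, workspace, and report IDs are placeholders, not values from the posting.

```python
# Sketch: acquire an AAD token, then ask Power BI for a report embed token.
import msal
import requests

TENANT_ID = "your-tenant-id"          # placeholders throughout
CLIENT_ID = "your-app-client-id"
CLIENT_SECRET = "your-app-secret"
GROUP_ID = "workspace-guid"
REPORT_ID = "report-guid"

app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
aad = app.acquire_token_for_client(
    scopes=["https://analysis.windows.net/powerbi/api/.default"]
)

resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
    f"/reports/{REPORT_ID}/GenerateToken",
    headers={"Authorization": f"Bearer {aad['access_token']}"},
    json={"accessLevel": "View"},  # RLS roles can also be passed via identities
)
resp.raise_for_status()
print(resp.json()["token"])  # hand this to the embedding front end
```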
Posted 1 week ago
7.0 - 12.0 years
10 - 15 Lacs
Hyderabad
Work from Office
Key Responsibilities:
Data Engineering & Architecture:
- Design, develop, and maintain high-performance data pipelines for structured and unstructured data using Azure Databricks and Apache Spark.
- Build and manage scalable data ingestion frameworks for batch and real-time data processing.
- Implement and optimize data lake architecture in Azure Data Lake to support analytics and reporting workloads.
- Develop and optimize data models and queries in Azure Synapse Analytics to power BI and analytics use cases.
Cloud-Based Data Solutions:
- Architect and implement modern data lakehouses combining the best of data lakes and data warehouses.
- Leverage Azure services like Data Factory, Event Hub, and Blob Storage for end-to-end data workflows.
- Ensure security, compliance, and governance of data through Azure Role-Based Access Control (RBAC) and Data Lake ACLs.
ETL/ELT Development:
- Develop robust ETL/ELT pipelines using Azure Data Factory, Databricks notebooks, and PySpark.
- Perform data transformations, cleansing, and validation to prepare datasets for analysis.
- Manage and monitor job orchestration, ensuring pipelines run efficiently and reliably.
Performance Optimization:
- Optimize Spark jobs and SQL queries for large-scale data processing.
- Implement partitioning, caching, and indexing strategies to improve performance and scalability of big data workloads (see the PySpark sketch below).
- Conduct capacity planning and recommend infrastructure optimizations for cost-effectiveness.
Collaboration & Stakeholder Management:
- Work closely with business analysts, data scientists, and product teams to understand data requirements and deliver solutions.
- Participate in cross-functional design sessions to translate business needs into technical specifications.
- Provide thought leadership on best practices in data engineering and cloud computing.
Documentation & Knowledge Sharing:
- Create detailed documentation for data workflows, pipelines, and architectural decisions.
- Mentor junior team members and promote a culture of learning and innovation.

Required Qualifications:
Experience:
- 7+ years of experience in data engineering, big data, or cloud-based data solutions.
- Proven expertise with Azure Databricks, Azure Data Lake, and Azure Synapse Analytics.
Technical Skills:
- Strong hands-on experience with Apache Spark and distributed data processing frameworks.
- Advanced proficiency in Python and SQL for data manipulation and pipeline development.
- Deep understanding of data modeling for OLAP, OLTP, and dimensional data models.
- Experience with ETL/ELT tools like Azure Data Factory or Informatica.
- Familiarity with Azure DevOps for CI/CD pipelines and version control.
Big Data Ecosystem:
- Familiarity with Delta Lake for managing big data in Azure.
- Experience with streaming data frameworks like Kafka, Event Hub, or Spark Streaming.
Cloud Expertise:
- Strong understanding of Azure cloud architecture, including storage, compute, and networking.
- Knowledge of Azure security best practices, such as encryption and key management.

Preferred Skills (Nice to Have):
- Experience with machine learning pipelines and frameworks like MLflow or Azure Machine Learning.
- Knowledge of data visualization tools such as Power BI for creating dashboards and reports.
- Familiarity with Terraform or ARM templates for infrastructure as code (IaC).
- Exposure to NoSQL databases like Cosmos DB or MongoDB.
- Experience with data governance.

Job ID: R-72547 | Date posted: 07/17/2025
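As a rough illustration of the pipeline work described above, here is a compact PySpark sketch in the Databricks style; the storage paths and column names are invented for the example, not taken from the posting.

```python
# Sketch: cleanse raw JSON events and write a date-partitioned Delta table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events-etl").getOrCreate()

raw = spark.read.json("abfss://landing@youraccount.dfs.core.windows.net/events/")

cleansed = (
    raw.dropDuplicates(["event_id"])                  # idempotent re-runs
       .filter(F.col("event_id").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
)

# Partitioning by date keeps downstream Synapse/BI queries selective.
(cleansed.write.format("delta")
         .mode("append")
         .partitionBy("event_date")
         .save("abfss://curated@youraccount.dfs.core.windows.net/events/"))
```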
Posted 1 week ago
10.0 - 15.0 years
7 - 11 Lacs
Kochi
Work from Office
Join a global organization with 82,000+ employees around the world, as a Lead Data Warehouse Developer based in IQVIA Kochi/Bangalore. You will be part of IQVIA's world-class technology team and will be involved in the design, development, and enhancement of software programs, cloud applications, and proprietary products.

Technical Skills:
- Very good understanding of Data Warehousing design methodology, Data Warehouse concepts, and database design/architecture, including both OLAP and OLTP data and the complete lifecycle of a project from design to development and implementation.
- Extensive experience in analysis, design, development, implementation, deployment, and maintenance of Business Intelligence applications.
- Expertise in identification of business and data requirements and converting them into conceptual and logical data models.
- Excellent data-analysis skills and ability to translate specifications into a working report.
- Experienced in training and supporting end-user reporting needs.
- Exposure to dimensional data modeling: star schema modeling and snowflake modeling.
- Excellent communication, decision-making, problem-solving, analytical, and interpersonal skills, with result-oriented dedication towards goals.
- Capable of working as a team member or individually with minimum supervision.
- Flexible and versatile to adapt to any new environment, with a strong desire to keep pace with the latest technologies.

Key Responsibilities:
- Minimum 10+ years of experience in the IT industry; should have worked on a large-scale client implementation project, including 4-6 years managing the daily activities of the team responsible for the design, implementation, maintenance, and support of data warehouse systems and related data marts.
- Oversee data design and the creation of database architecture and data repositories.
- Drive change to implement an efficient and effective data-warehousing strategy; ensure that projects are accurately estimated and delivered to schedule.
- Work closely with the business and developers on issues related to design and requirements.
- Actively contribute to the process of continual improvement with regard to self, team, and systems; ensure that development standards, policies, and procedures are adhered to.
- Work with analysts from the business, your development team, and other areas to deliver data-centric projects.
- Be responsible for the maintenance and improvement of existing data batch processes and the implementation of new ETL and BI systems.
- Coordinate internal resources and third parties/vendors for the flawless execution of projects; ensure that all projects are delivered on time, within scope, and within budget.
- Develop project scopes and objectives, involving all relevant stakeholders and ensuring technical feasibility.
- Hands-on expertise in ETL, DWH, and BI development.

Competencies:
- Strong adaptability and problem-solving skills in an agile work setting.
- Extensive hands-on experience with key Azure services and SQL scripting.
- Ability to develop and implement effectively.
- Proven analytical and troubleshooting skills.
- Excellent written and oral communication.

Qualifications:
- Graduate: B-Tech/BE/MCA
- Certifications (good to have): DWH, SQL, BI
Posted 1 week ago
0 years
5 - 9 Lacs
Bengaluru
On-site
Ready to build the future with AI? At Genpact, we don’t just keep up with technology—we set the pace. AI and digital innovation are redefining industries, and we’re leading the charge. Genpact’s AI Gigafactory, our industry-first accelerator, is an example of how we’re scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies’ most complex challenges. If you thrive in a fast-moving, innovation-driven environment, love building and deploying cutting-edge AI solutions, and want to push the boundaries of what’s possible, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Assistant Manager/Business Analyst – BI Developer

Responsibilities:
- Lead and improve existing report delivery.
- Daily control activities and query management using ServiceNow; drive improvement and support new development.
- Build visual reports, dashboards, and key-metric scorecards using Qlik or other reporting tools; develop and execute database queries and analyses.
- Import and transform data for reporting purposes and build tools to store data, e.g. OLAP cubes (see the ETL sketch below).
- Conduct unit testing and troubleshooting; develop and update technical documentation. Qlik experience required, plus other data visualization tools such as Power BI and Tableau, and query management experience with ServiceNow.

Qualifications we seek in you:
Minimum Qualification:
- Graduate in mathematics/statistics, computer science, or engineering; MCA or postgraduate preferred.
Preferred Skill Set:
- Experienced in report maintenance, query management, development, and deployment using Qlik or any other reporting tools.
- Efficient in front-end development and familiar with visualization best practices.
- Experience in database design and SQL skills; experienced in RDBMS such as MS SQL Server, Oracle, etc.
- Excellent social skills.
- Experienced in data integration through extracting, transforming, and loading (ETL) data from various sources, with a strong analytical and logical attitude.

Why join Genpact?
- Lead AI-first transformation – build and scale AI solutions that redefine industries
- Make an impact – drive change for global enterprises and solve business challenges that matter
- Accelerate your career – gain hands-on experience, world-class training, mentorship, and AI certifications to advance your skills
- Grow with the best – learn from top engineers, data scientists, and AI experts in a dynamic, fast-moving workplace
- Committed to ethical AI – work in an environment where governance, transparency, and security are at the core of everything we build
- Thrive in a values-driven culture – our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress

Come join the 140,000+ coders, tech shapers, and growth makers at Genpact and take your career in the only direction that matters: Up. Let’s build tomorrow together.
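The ETL sketch referenced above — a minimal pandas/SQLAlchemy example of extracting from an RDBMS, aggregating to report grain, and landing a reporting table. The connection string and table names are placeholders, not details from the posting.

```python
# Sketch: extract -> aggregate -> load a reporting table for a dashboard.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("mssql+pyodbc://user:pass@your_dsn")  # placeholder DSN

orders = pd.read_sql("SELECT region, amount, order_date FROM dbo.orders", engine)

daily = (
    orders.assign(order_date=pd.to_datetime(orders["order_date"]))
          .groupby(["region", "order_date"], as_index=False)["amount"]
          .sum()
)

# Replace-load the aggregate the dashboard reads.
daily.to_sql("rpt_daily_sales", engine, schema="dbo",
             if_exists="replace", index=False)
```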
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Job: Business Analyst | Primary Location: India-Bangalore | Schedule: Full-time | Education Level: Bachelor's / Graduation / Equivalent | Job Posting: Jul 21, 2025, 3:04:51 AM | Unposting Date: Ongoing | Master Skills List: Operations | Job Category: Full Time
Posted 1 week ago
5.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Introduction
A career in IBM Consulting embraces long-term relationships and close collaboration with clients across the globe. In this role, you will work for IBM BPO, part of Consulting, which accelerates digital transformation using agile methodologies, process mining, and AI-powered workflows. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including IBM Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be supported by mentors and coaches who will encourage you to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and learning opportunities in an environment that embraces your unique skills and experience.

Your Role and Responsibilities:
- Develop, test, and support future-ready data solutions for customers across industry verticals.
- Develop, test, and support end-to-end batch and near real-time data flows/pipelines.
- Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, and information management and associated technologies.
- Communicate risks and ensure understanding of these risks.

Preferred Education: Master's Degree

Required Technical and Professional Expertise:
- Graduate with a minimum of 5+ years of related experience required.
- Experience in modelling and business system designs.
- Good hands-on experience with DataStage and cloud-based ETL services.
- Great expertise in writing T-SQL code.
- Well-versed with data warehouse schemas and OLAP techniques.

Preferred Technical and Professional Experience:
- Ability to manage and make decisions about competing priorities and resources; ability to delegate where appropriate.
- Must be a strong team player/leader.
- Ability to lead data transformation projects with multiple junior data engineers.
- Strong oral, written, and interpersonal skills for interacting throughout all levels of the organization.
- Ability to communicate complex business problems and technical solutions.
Posted 1 week ago
0 years
5 - 14 Lacs
Chennai
On-site
Job Title: Data Modeller
Location: Chennai (On-site only)
Employment Type: Full-Time

Job Description: We are looking for an experienced Data Modeller to join our team and drive data architecture for high-performance OLTP and OLAP systems. The ideal candidate will have deep expertise in data modeling methodologies and performance optimization, with hands-on experience across various GCP data platforms.

Key Responsibilities:
- Design and implement data models for both OLTP (transactional) and OLAP (analytical) systems.
- Develop and maintain conceptual, logical, and physical data models.
- Apply advanced techniques such as indexing, data partitioning, and sharding to optimize performance.
- Identify and optimize variables affecting database performance for real-time reporting and high-throughput applications.
- Collaborate with stakeholders to translate business requirements into scalable and efficient data structures.
- Work closely with application developers and data engineers to ensure models are aligned with system architecture.

Required Skills & Experience:
- Proven hands-on experience in data modeling for complex systems.
- Strong understanding of indexing, partitioning, and sharding strategies, with practical implementation exposure (a sketch follows below).
- Expertise in performance tuning and designing for high concurrency and real-time access.
- Proficiency in at least one data modeling tool, preferably Erwin or DBSchema.
- Solid knowledge of GCP databases, including AlloyDB (PostgreSQL-compatible), Cloud SQL (MySQL/PostgreSQL), and BigQuery (OLAP).

Preferred Qualifications:
- Functional knowledge of the mutual fund or financial services industry is a strong advantage.
- Experience in cloud-native architecture and data governance practices is desirable.

Additional Details:
- Location: This is a Chennai-based role. Office presence is mandatory.
- Work Type: Full-time, individual contributor role within the data engineering or architecture team.

Job Types: Full-time, Permanent
Pay: ₹514,912.86 - ₹1,497,845.15 per year
Benefits: Health insurance; Provident Fund
Work Location: In person
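The partitioning/clustering sketch referenced above — one way such a model might land in BigQuery, using the google-cloud-bigquery client. The project, dataset, and column names are hypothetical.

```python
# Sketch: a partitioned, clustered BigQuery table for transaction analytics.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project

ddl = """
CREATE TABLE IF NOT EXISTS analytics.fund_transactions (
    txn_id  STRING,
    fund_id STRING,
    txn_ts  TIMESTAMP,
    amount  NUMERIC
)
PARTITION BY DATE(txn_ts)  -- prunes scans for time-bounded reports
CLUSTER BY fund_id         -- co-locates rows for per-fund queries
"""
client.query(ddl).result()  # blocks until the DDL job completes
```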
Posted 1 week ago
6.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Saviynt is an identity authority platform built to power and protect the world at work. In a world of digital transformation, where organizations are faced with increasing cyber risk but cannot afford defensive measures to slow down progress, Saviynt’s Enterprise Identity Cloud gives customers unparalleled visibility, control, and intelligence to better defend against threats while empowering users with right-time, right-level access to the digital technologies and tools they need to do their best work.

The Business Analyst serves a crucial role on Saviynt’s Analytics team. They ensure that details behind requests are captured, realistic, and understood by those charged with delivering on the requests. They act as a conduit for consistent communication between stakeholders and the delivery team. They coordinate projects in a way that minimizes risk and consistently deliver results on time. Above all else, they ensure that content is delivered with a high degree of quality that leaders can trust.

WHAT YOU WILL BE DOING
BI Solution Development:
- Design, develop, and deploy business intelligence solutions (e.g., dashboards, reports, data visualizations) using BI tools like Tableau, and design, develop, and deploy reporting solutions using native application report builders.
- Develop complex queries, reports, stored procedures, and data models to support business intelligence requirements.
- Leverage a wide range of BI technologies and platforms to build interactive dashboards, visualizations, and automated reports that provide meaningful insights to business users.
- Implement and integrate data from various sources into BI platforms, ensuring consistency and accuracy in reporting.
Business & Stakeholder Engagement:
- Work closely with business stakeholders to understand business needs and translate them into technical specifications for BI development.
- Gather and document business requirements, define KPIs, and ensure alignment of BI solutions with business goals.
- Provide ongoing support to business teams by responding to queries, troubleshooting, and delivering new features or updates to existing BI solutions.
- Serve as the main point of contact for business users regarding BI tool functionality, data accessibility, and reporting needs.
Leadership & Mentorship:
- Provide guidance on BI best practices, query optimization, report development, and troubleshooting.
- Foster a culture of continuous improvement within the BI team, sharing knowledge and promoting industry best practices for data analysis and visualization.
- Collaborate with senior leadership to define BI strategies and contribute to the roadmap for future BI development.
BI Tool Administration & Configuration:
- Oversee the administration and configuration of BI tools to ensure optimal performance, security, and scalability.
- Monitor BI tool performance and make recommendations for improvements or adjustments as needed.
- Manage user permissions and security roles within BI platforms to ensure data governance and compliance.
- Troubleshoot data integration issues and work with the data engineering team to resolve any discrepancies or data quality issues.
Documentation & Reporting:
- Develop and maintain documentation for BI solutions, including data models, data flow diagrams, report specifications, and user guides.
- Document all development processes, queries, and configurations to ensure consistency and maintainability.
- Create clear, actionable documentation for users, guiding them through how to access, interpret, and utilize BI solutions effectively.
Testing & Quality Assurance:
- Perform functional, regression, and performance testing on BI reports, dashboards, and data models to ensure data accuracy, reliability, and optimal performance.
- Work with business stakeholders to perform user acceptance testing (UAT) on BI solutions before full deployment.
- Continuously monitor and improve the quality of BI solutions through ongoing testing and feedback.

WHAT YOU BRING:
- Bachelor's degree in Computer Science, Information Systems, Business Analytics, Data Science, or a related field (or equivalent experience).
- 6+ years of experience in BI development or a related field, with a strong background in developing data-driven solutions using BI tools.
- Proven experience in the development, deployment, and maintenance of BI reports, dashboards, and data visualizations.
- Experience with database systems (SQL, OLAP, or data warehousing) and data modeling concepts.

Technical Skills:
- Advanced proficiency in BI tools such as Tableau and Power BI; experience with multiple tools is a plus (Tableau is a must).
- Strong experience with SQL and Snowflake for DBMS operations, writing optimized SQL queries, stored procedures, and functions (a sketch follows below).
- Knowledge of data warehousing, cloud platforms (e.g. Azure, AWS), and big data technologies is advantageous.
- Experience with advanced reporting techniques, including Tableau calculations or scripting for complex visualizations.
- Knowledge of at least one ERP tool (e.g. Salesforce).
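The Snowflake sketch referenced above — pre-aggregating in the warehouse from Python so a Tableau extract stays small. The account, credentials, and table names are placeholders.

```python
# Sketch: run a monthly aggregate in Snowflake via the Python connector.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account", user="user", password="password",  # placeholders
    warehouse="ANALYTICS_WH", database="SALES", schema="PUBLIC",
)
try:
    cur = conn.cursor()
    cur.execute("""
        SELECT region, DATE_TRUNC('month', sold_at) AS sales_month, SUM(amount)
        FROM orders
        GROUP BY 1, 2
        ORDER BY 2
    """)
    for row in cur.fetchall():
        print(row)
finally:
    conn.close()
```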
Posted 1 week ago
10.0 years
0 Lacs
Kochi, Kerala, India
On-site
Role Description: Data Architect | Power BI
Preferred: Trivandrum / Cochin; Other Locations: Chennai, Bangalore

Additional Comments:
- Over 10+ years of experience in developing BI applications using SQL Server, Salesforce (SF), GCP, PostgreSQL, the BI stack, Power BI, and Tableau.
- Strong practical understanding of data modeling concepts—both Dimensional (Star Schema, Snowflake Schema) and Relational, including Fact and Dimension tables.
- Skilled in translating business requirements into detailed functional and non-functional specifications.
- Capable of taking ownership of deliverables and effectively communicating with C-suite executives and stakeholders.
- Extensive database programming experience in T-SQL, including User Defined Functions, Triggers, Views, Temporary Tables, Constraints, and Indexes, using various DDL and DML operations.
- Proficient in building SSAS-based OLAP Cubes and writing complex DAX expressions.
- Hands-on expertise with external tools such as Tabular Editor and DAX Studio.
- Strong ability to customize and optimize stored procedures and backend queries for complex data extraction and business logic.
- Practical experience with Incremental Refresh, Row-Level Security (RLS), Parameterization, Dataflows, and Gateways.
- Proven experience in designing and developing BI solutions using SSRS and Power BI.
- Skilled in optimizing Power BI reports using Mixed Mode and DirectQuery.

Skills: Power BI, Power Tools, Data Analysis, Microsoft Fabric
Posted 1 week ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
HighRadius is looking for a Cloud & Data Engineer to own our end-to-end data platform: building fault-tolerant ETL pipelines, administering and tuning our OLTP/OLAP databases, and automating all infrastructure as code. You’ll ensure data flows smoothly from ingestion through transformation to runtime analytics.

Responsibilities:
- Design, implement, and operate ETL workflows (Airflow, AWS Glue).
- Administer PostgreSQL, MySQL, or SQL Server: performance tuning, backups, restores.
- Manage schema migrations via Flyway/Liquibase and version control.
- Provision and maintain data infrastructure with Terraform & Ansible.
- Monitor job health & database metrics; troubleshoot failures and slow queries (see the profiling sketch below).
- Automate scaling, snapshots, and cost controls for RDS clusters.
- Secure the data environment with IAM, network policies, and encryption.
- Participate in 24×7 on-call rotations; author runbooks and postmortems.
- Collaborate on data modeling, indexing strategies, and query optimization.
- Document all data flows, tuning guides, and runbooks in Confluence.

Requirements:
- B.Tech/BE in CS, Information Systems, or equivalent.
- 4+ years building and operating ETL pipelines.
- 3+ years as a DBA in PostgreSQL/MySQL or SQL Server.
- Hands-on with Airflow, Informatica, or AWS Glue; strong Python/Java/Scala skills.
- Proven ability to profile and tune complex SQL queries.
- Terraform/Ansible experience for infra automation.
- Familiarity with monitoring tools (Prometheus, Grafana, CloudWatch).
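The profiling sketch referenced above — examining a slow query's plan with EXPLAIN (ANALYZE, BUFFERS) via psycopg2. The DSN and query are placeholders.

```python
# Sketch: capture an execution plan to spot seq scans and heavy buffer reads.
import psycopg2

conn = psycopg2.connect("dbname=appdb user=postgres host=localhost")  # placeholder
with conn.cursor() as cur:
    cur.execute("""
        EXPLAIN (ANALYZE, BUFFERS)
        SELECT customer_id, SUM(total)
        FROM invoices
        WHERE created_at >= now() - interval '30 days'
        GROUP BY customer_id;
    """)
    for (plan_line,) in cur.fetchall():
        print(plan_line)
conn.close()
```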
Posted 1 week ago
8.0 years
4 - 9 Lacs
Bengaluru
On-site
Data Science is all about breaking new ground to enable businesses to answer their most urgent questions. Pioneering massively parallel data-intensive analytic processing, our mission is to develop a whole new approach to generating meaning and value from petabyte-scale data sets and shape brand new methodologies, tools, statistical methods, and models. What’s more, we are in collaboration with leading academics, industry experts and highly skilled engineers to equip our customers to generate sophisticated new insights from the biggest of big data. Join us as a Senior Advisor on our Data Science team in Bangalore to do the best work of your career and make a profound social impact.

What you’ll achieve
You will:
- Develop Gen AI-based solutions to tackle real-world challenges using extensive datasets of text, images, and more
- Design and manage experiments; research new algorithms and optimization methods
- Build and maintain data pipelines and platforms to operationalize Machine Learning models at scale
- Demonstrate a passion for blending software development with Gen AI and ML

Take the first step towards your dream career. Every Dell Technologies team member brings something unique to the table. Here’s what we are looking for with this role:

Essential Requirements:
- Design, implement, test, and maintain ML solutions within Dell's services organization
- Engage in design discussions, code reviews, and interact with various stakeholders
- Collaborate across functions to influence business solutions with technical expertise
- Thrive in a startup-like environment, focusing on high-priority tasks

Desired Requirements:
- Proficiency in Data Science platforms (Domino Data Lab, Microsoft Azure, AWS, Google Cloud)
- Deep knowledge in ML, data mining, statistics, NLP, or related fields
- Experience in object-oriented programming (C#, Java) and familiarity with Python, Spark, TensorFlow, XGBoost
- Experience in productionizing ML models and scaling them for low-latency environments
- Proficient in Data Mining, ETL, SQL OLAP, Teradata, Hadoop
- 8+ years of related experience with a bachelor’s degree; or 6+ years with a Master’s; or 3+ years with a PhD; or equivalent experience

Who we are
We believe that each of us has the power to make an impact. That’s why we put our team members at the center of everything we do. If you’re looking for an opportunity to grow your career with some of the best minds and most advanced tech in the industry, we’re looking for you. Dell Technologies is a unique family of businesses that helps individuals and organizations transform how they work, live and play. Join us to build a future that works for everyone because Progress Takes All of Us.

Application closing date: 20th Aug 2025
Dell Technologies is committed to the principle of equal employment opportunity for all employees and to providing employees with a work environment free of discrimination and harassment. Job ID: R274037
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Solution Architect, you will collaborate with stakeholders to define the long-term Business Intelligence (BI) strategy, selecting appropriate tools and technologies. You will be responsible for designing and developing data models, ETL packages, OLAP cubes, and reports using Power BI, while ensuring adherence to best practices. Your role will also involve documenting source-to-target mappings, data dictionaries, and database design. Additionally, you will work with APIs and tools like Postman to build and test APIs, seamlessly integrating them into the overall architecture. In the realm of Data Modeling and Governance, you will develop and maintain reference architectures for reports based on different data sources such as Data Warehouse and SharePoint. It will be crucial to uphold data governance standards, ensuring proper data refreshes and publishing of common dataset hierarchies. Implementing security measures to safeguard sensitive data will be an integral part of your responsibilities. As a technical leader, you will conduct technical reviews of reports and dashboards before their production deployment. Identifying areas for improvement to optimize data flows will be essential. Providing hands-on guidance to users for optimizing their data models and reports, ensuring efficient performance and usability, is a key aspect of your role. Managing Power BI Premium capacity will be another significant responsibility, including allocation, monitoring, and optimization. Effectively allocating workspaces based on business needs and ensuring efficient resource utilization within the Premium capacity will be part of your duties. You will also be involved in user support and community engagement by addressing user queries related to Power BI and other BI technologies within the organization. Proactively troubleshooting and resolving data model issues for users when necessary will be imperative. Organizing monthly community calls to support, train, and guide citizen developers, thereby fostering a strong BI community, is an essential part of your role. Join our global, inclusive, and diverse team with a shared purpose of improving the quality of life through innovative motion systems. We value diversity, knowledge, skills, creativity, and talents that each employee brings, fostering an inclusive and equitable workplace where all employees are respected and valued. Our commitment is to inspire employees to grow, take ownership, and find fulfillment and meaning in their work.,
Posted 1 week ago
8.0 - 15.0 years
0 Lacs
Karnataka
On-site
You will be responsible for owning the post-sale relationship with assigned accounts, ensuring high levels of satisfaction and retention. As the primary point of contact and trusted advisor to customers, you will need to understand customer goals and align product usage to deliver value. Your role will involve identifying upsell/cross-sell opportunities and closing them, as well as collaborating with product and support teams to address customer needs and advocate for customer priorities. Regular business reviews and performance check-ins will be conducted to track key metrics such as account growth, retention, adoption, usage, and engagement to manage account health.

To be successful in this position, you should have experience in the banking and financial industry, particularly with large banks. You are expected to have at least 5 years of experience in account management, customer success, or a client-facing role in a product/SaaS company. Strong communication, relationship-building, and problem-solving skills are essential, along with the ability to work with cross-functional teams and manage multiple accounts. Additionally, you should be able to understand technical products and effectively translate customer needs to internal teams. Exposure to B2B SaaS environments and the ability to analyze customer data to derive insights would be considered nice-to-have qualifications. Familiarity with technologies such as big-data OLAP, data engineering, data warehousing, and ETL is also a plus.
Posted 1 week ago
3.0 - 7.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Job Title: FLEXCUBE Reports Developer with Qlik Sense

Position Overview: We are seeking a skilled FLEXCUBE Reports Developer with expertise in Qlik Sense to join our team. The ideal candidate will be responsible for designing, developing, and maintaining reports and dashboards that provide valuable insights from FLEXCUBE core banking data. Mastery of the FLEXCUBE 14.7 backend tables and data model is essential.

Key Responsibilities:
Report Development: Design and create interactive reports and dashboards using Qlik Sense to visualize FLEXCUBE data for business users.
Data Modelling: Develop data models and relationships within Qlik Sense to ensure accurate representation of FLEXCUBE data; knowledge of the FLEXCUBE 14.7 backend tables and data model is a must.
Customization: Customize reports to meet specific business requirements and ensure they align with industry best practices.
Performance Optimization: Optimize report performance for efficient data retrieval and rendering.
Data Integration: Integrate data from various sources into Qlik Sense reports, including FLEXCUBE and other data repositories.
Data Security: Implement data security and access controls within Qlik Sense to protect sensitive information.
User Training: Provide training and support to end-users to enable them to effectively utilize Qlik Sense reports.
Documentation: Maintain documentation for reports, data models, and best practices.

Qualifications:
Bachelor's degree in Computer Science, Information Technology, or a related field.
3 to 7 years of proven experience in developing reports and dashboards using Qlik Sense.
Familiarity with FLEXCUBE core banking systems.
Familiarity with OLAP cubes, data marts, and data warehouses.
Proficiency in data modelling and data visualization concepts.
Strong SQL skills for data extraction and transformation.
Excellent problem-solving and analytical skills.
Strong communication and collaboration abilities.
Banking or financial industry experience is beneficial.
Qlik Sense certifications are a plus.

Additional Information: This role offers an opportunity to work with cutting-edge reporting and analytics tools in the banking sector. The candidate should be prepared to work closely with business stakeholders and contribute to data-driven decision-making. Candidates with a strong background in FLEXCUBE reports development and Qlik Sense are encouraged to apply. We are committed to providing a collaborative and growth-oriented work environment.

Career Level - IC2
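For flavour, a minimal sketch of the SQL-based extraction work such a role involves, written against a generic Oracle connection since FLEXCUBE commonly runs on Oracle; the table and column names here are invented placeholders, not the real FLEXCUBE 14.7 data model:

```python
import oracledb  # any DB-API 2.0 driver follows the same pattern

# Placeholder credentials and a hypothetical table name; consult the actual
# FLEXCUBE 14.7 data model for real table/column names.
conn = oracledb.connect(
    user="qlik_reader", password="***", dsn="flexcube-db:1521/FCUBS"
)

QUERY = """
    SELECT account_no, branch_code, currency, balance
    FROM   core_accounts            -- hypothetical backend table
    WHERE  last_updated >= :since   -- incremental extract window
"""

with conn.cursor() as cur:
    cur.execute(QUERY, since="2024-01-01")
    for row in cur.fetchall():
        print(row)  # in practice, staged into an extract layer for Qlik Sense
conn.close()
```

An incremental filter like the one above is the usual way to keep Qlik Sense reloads fast against large core banking tables.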
Posted 2 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Job Description: About CloudNuro.ai

CloudNuro.ai is a next-generation SaaS company focused on bringing artificial intelligence and machine learning to SaaS operations and management: optimizing SaaS spend, improving security and compliance, developing SaaS insight, and maximizing ROI on SaaS investment. CloudNuro offers a central point from which, beyond discovering the growing stack of SaaS applications, software administrators can manage application access for users/teams, licensing and spending, IT workflows, integration, data and access security, SaaS-related process automation, and policy compliance. Our customers are Fortune 500 corporations and public sector organizations, making your contributions vital to improving IT security and governance. For more information, please visit www.cloudnuro.ai and follow us on LinkedIn.

Job Overview

We are on the cusp of a major transformation in our data capabilities, and we are looking for a hands-on Data Platform Engineer to help lead the way. Working alongside a Staff Data Architect and Engineer, you will play a key role in designing and setting up the next generation of our lakehouse and analytics platform, a foundation that will directly drive the next phase of our company's growth and innovation. This is a unique opportunity to build an entire modern data stack from the ground up, combining cutting-edge open-source tools and cloud-native infrastructure.

Key Responsibilities
Hands-on Architecture and Development: Define, design, prototype, and deliver a robust lakehouse and analytics platform leveraging best-in-class technologies.
Build the Core Platform: Collaborate with the Staff Data Architect to set up core infrastructure spanning ingestion (ETL/ELT), warehouse, transformation, orchestration, and BI/analytics layers.
Leverage the Modern Data Stack: Work across an environment of GCP-based compute instances and containers (Airbyte, ClickHouse, dbt, metriql, Prefect, Metabase, Next.js frontend), bringing cutting-edge engineering principles to life.
Optimize for Scale and Security: Implement warehouse best practices, optimize performance for large-scale analytics, and enforce role- and attribute-based access controls.
Data Modeling Excellence: Design flexible and scalable models to support real-time analytics, complex business queries, and long-term historical analysis.
Innovation and Automation: Drive automation wherever possible in data ingestion, validation, transformation, and reporting processes.
Cross-functional Collaboration: Work closely with Product, Engineering, and Leadership teams to ensure alignment with broader company objectives and business strategies.
Mentor and Uplift: Provide technical mentorship to junior engineers and set high standards for operational excellence across the data organization.
Drive Success Metrics: Align milestone progress with leadership goals around platform adoption, scalability, reliability, and impact on business KPIs.

Skills Required
5+ years of experience designing, building, and scaling modern data platforms.
Proven hands-on expertise with ETL/ELT pipelines, cloud-based warehouses (SQL, NoSQL, MPP, OLAP/OLTP), and advanced database optimization.
Strong SQL and Python skills for data engineering and automation.
Familiarity with setting up containerized data services (Docker/Kubernetes environments).
Deep understanding of cloud platforms, at least one of GCP, AWS, or Azure.
Exposure to analytics workflows and BI environments.
Strong experience working with Git, Jira, Confluence, Jenkins, and modern DevOps pipelines.

What We'd Love to See
Direct experience standing up a lakehouse architecture in a startup or fast-moving environment.
Hands-on experience with tools such as Airbyte, dbt, Prefect, Metabase, metriql, or equivalents.
Experience developing customer-facing analytics platforms or embedded analytics solutions.
Solid foundation in time-series data management, trend analysis, and anomaly detection.
Ability to thrive in a high-ownership, highly autonomous role with startup energy.

Why us?
Build the Foundation: You'll help architect a core platform that will power the company's data products and operational excellence for years to come.
Be Early: As an early member of the team, you'll have outsized influence on technology choices, design decisions, and operational frameworks.
Own and Accelerate: Take on massive ownership, ship at startup speed, and see your work directly drive business outcomes.
Culture of Excellence: Join a sharp, no-nonsense team focused on integrity, innovation, and rapid personal and professional growth.

Compensation: We offer a competitive package commensurate with your experience and the value you bring.

Role: Data Platform Engineer
Experience: 5+ years
Location: Bengaluru, Karnataka, India (Hybrid)
Department: Engineering
Salary: Negotiable (Not Disclosed)

Our Values
Dedicated to our values, we uphold integrity, accountability, and inclusivity in all that we do. Together, we shape a better world.
Earn Trust: The singular focus of our business is to earn the trust of our people, partners, customers, and investors.
Invent and Simplify: Innovation is our passion, and we continuously work to simplify and make it easy for our people, our customers, and our customers' customers.
Customer Delight: Customer success is our obsession; we start with the customer's mindset, work backwards, and strive hard to earn and keep customer trust.
Deliver Results: A measurable result is the measure of our business; we guarantee to deliver results with the right quality and in a timely fashion.
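As a hedged sketch of the ingestion-then-transformation orchestration described above, assuming Prefect 2.x and a configured dbt project; the task bodies and flow name are illustrative, not CloudNuro's actual pipeline:

```python
import subprocess
from prefect import flow, task

@task(retries=2)
def ingest() -> None:
    """Placeholder for triggering an ingestion sync (e.g., via Airbyte's API)."""
    print("sync raw sources into the lakehouse")

@task
def transform() -> None:
    """Run dbt models against the warehouse; assumes a configured dbt project."""
    subprocess.run(["dbt", "build"], check=True)

@flow(name="lakehouse-refresh")
def lakehouse_refresh() -> None:
    ingest()
    transform()

if __name__ == "__main__":
    lakehouse_refresh()
```

Keeping ingestion and transformation as separate retryable tasks is what lets an orchestrator like Prefect resume a failed refresh without re-syncing raw sources.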
Posted 2 weeks ago
10.0 years
5 - 7 Lacs
Cochin
On-site
Job Overview: Join a global organization with 82,000+ employees around the world as a Lead Data Warehouse Developer based in IQVIA Kochi/Bangalore. You will be part of IQVIA's world-class technology team and will be involved in the design, development, and enhancement of software programs, cloud applications, and proprietary products.

Technical Skills:
Very good understanding of data warehousing design methodology, data warehouse concepts, and database design/architecture covering both OLAP and OLTP data, across the complete project lifecycle from design through development and implementation.
Extensive experience in the analysis, design, development, implementation, deployment, and maintenance of Business Intelligence applications.
Expertise in identifying business and data requirements and converting them into conceptual and logical data models.
Excellent data-analysis skills and the ability to translate specifications into a working report.
Experienced in training and supporting end-user reporting needs.
Exposure to dimensional data modeling: Star Schema and Snowflake modeling.
Excellent communication, decision-making, problem-solving, analytical, and interpersonal skills, with a result-oriented dedication towards goals.
Capable of working as a team member or individually with minimum supervision.
Flexible and versatile to adapt to any new environment, with a strong desire to keep pace with the latest technologies.

Key Responsibilities:
10+ years of experience in the IT industry, with work on a large-scale client implementation project, including 4-6 years managing the daily activities of the team responsible for the design, implementation, maintenance, and support of data warehouse systems and related data marts.
Oversee data design and the creation of database architecture and data repositories.
Drive change to implement an efficient and effective data-warehousing strategy.
Ensure that projects are accurately estimated and delivered to schedule.
Work closely with the business and developers on issues related to design and requirements.
Actively contribute to the process of continual improvement with regard to self, team, and systems.
Ensure that development standards, policies, and procedures are adhered to.
Work with analysts from the business, your development team, and other areas to deliver data-centric projects.
Be responsible for the maintenance and improvement of existing data batch processes and the implementation of new ETL and BI systems.
Coordinate internal resources and third parties/vendors for the flawless execution of projects.
Ensure that all projects are delivered on time, within scope, and within budget.
Develop project scopes and objectives, involving all relevant stakeholders and ensuring technical feasibility.
Hands-on expertise in ETL, DWH, and BI development.

Competencies:
Strong adaptability and problem-solving skills in an agile work setting.
Extensive hands-on experience with key Azure services and SQL scripting.
Ability to develop and implement effectively.
Proven analytical and troubleshooting skills.
Excellent written and oral communication.

Qualifications:
Graduate: B.Tech/BE/MCA
Certifications (good to have): DWH, SQL, BI

IQVIA is a leading global provider of clinical research services, commercial insights and healthcare intelligence to the life sciences and healthcare industries. We create intelligent connections to accelerate the development and commercialization of innovative medical treatments to help improve patient outcomes and population health worldwide.
Learn more at https://jobs.iqvia.com
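As a small, hedged illustration of the Star Schema modeling mentioned above, the sketch below builds a toy dimension and fact table and runs a typical rollup; SQLite stands in for the real warehouse, and all table and column names are invented:

```python
import sqlite3  # stand-in for the warehouse; real DDL would target Oracle/SQL Server/etc.

# A toy star schema: one dimension plus a fact table keyed to it.
DDL = """
CREATE TABLE dim_customer (
    customer_key  INTEGER PRIMARY KEY,
    customer_name TEXT,
    region        TEXT
);
CREATE TABLE fact_sales (
    sale_id      INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    sale_date    TEXT,
    amount       REAL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)

# A typical OLAP-style rollup: aggregate facts by a dimension attribute.
ROLLUP = """
SELECT d.region, SUM(f.amount) AS total_sales
FROM fact_sales f JOIN dim_customer d USING (customer_key)
GROUP BY d.region
"""
print(conn.execute(ROLLUP).fetchall())
```

A Snowflake variant would simply normalize the dimension further (e.g., splitting region into its own table); the fact table and rollup pattern stay the same.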
Posted 2 weeks ago
10.0 - 12.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Essential Services: Role & Location Fungibility

At ICICI Bank, we believe in serving our customers beyond our role definition, product boundaries, and domain limitations through our philosophy of customer 360-degree. In essence, this captures our belief in serving the entire banking needs of our customers as One Bank, One Team. To achieve this, employees at ICICI Bank are expected to be role- and location-fungible, with the understanding that banking is an essential service. The role description gives you an overview of the responsibilities; it is only directional and guiding in nature.

About the Role: As a Data Warehouse Architect, you will be responsible for managing and enhancing a data warehouse that handles large volumes of customer life-cycle data flowing in from various applications, within the guardrails of risk and compliance. You will manage the day-to-day operations of the data warehouse (Vertica). In this role, you will lead a team of data warehouse engineers to develop data models, design ETL data pipelines, manage issues and upgrades, fine-tune performance, and migrate, govern, and maintain the security framework of the data warehouse. This role enables the Bank to maintain huge data sets in a structured manner that is amenable to data intelligence. The data warehouse supports numerous information systems used by various business groups to derive insights. As a natural progression, the data warehouse will be gradually migrated to a Data Lake, enabling better analytical advantage. The role holder will also be responsible for guiding the team towards this migration.

Key Responsibilities:
Data Pipeline Design: Design and develop ETL data pipelines that help in organising large volumes of data, using data warehousing technologies to ensure the data warehouse is efficient, scalable, and secure.
Issue Management: Ensure that the data warehouse is running smoothly; monitor system performance, diagnose and troubleshoot issues, and make necessary changes to optimize system performance.
Collaboration: Collaborate with cross-functional teams to implement upgrades, migrations, and continuous improvements.
Data Integration and Processing: Process, clean, and integrate large data sets from various sources to ensure that the data is accurate, complete, and consistent.
Data Modelling: Design and implement data modelling solutions to ensure that the organization's data is properly structured and organized for analysis.

Key Qualifications & Skills:
Education Qualification: B.E./B.Tech. in Computer Science, Information Technology, or an equivalent domain, with 10 to 12 years of experience and at least 5 years of relevant work experience in data warehousing/mining/BI/MIS.
Experience in Data Warehousing: Knowledge of ETL and data technologies such as OLTP and OLAP (Oracle/MSSQL); data modelling, data analysis, and visualization experience (analytical tools such as Power BI, SAS, QlikView, Tableau, etc.).
Good to have: exposure to Azure cloud data platform services such as Cosmos DB, Azure Data Lake, Azure Synapse, and Azure Data Factory.
Certification: Azure-certified DP-900, PL-300, DP-203, or other data platform/data analyst certifications.

About the Business Group: The Technology Group at ICICI Bank is at the forefront of our operations and offerings, focused on leveraging state-of-the-art technology to provide customer-centric solutions.
This group plays a pivotal role in our vision of the transition from Bank to Bank Tech. Further, the group offers round-the-clock support to our entire banking ecosystem. In our persistent efforts to offer products and solutions that truly benefit customers, we believe unlocking the potential of technology in every single engagement would go a long way in creating customer delight. In this endeavour, we also tirelessly ensure all our processes, systems, and infrastructure are very well within the guardrails of data security, privacy, and relevant regulations.
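For illustration only, a minimal sketch of the kind of bulk-load step such a Vertica pipeline performs, assuming the open-source vertica-python client; the connection details, staging table, and input file are placeholders:

```python
import vertica_python

# Placeholder connection details for the warehouse described above.
conn_info = {
    "host": "vertica.internal",
    "port": 5433,
    "user": "etl_user",
    "password": "***",
    "database": "edw",
}

with vertica_python.connect(**conn_info) as conn:
    cur = conn.cursor()
    # Bulk-load a staged extract; COPY is Vertica's high-throughput ingest path.
    with open("customer_events.csv", "rb") as fh:
        cur.copy(
            "COPY staging.customer_events FROM STDIN DELIMITER ',' ABORT ON ERROR",
            fh,
        )
    conn.commit()
```

Loading into a staging schema first, then merging into governed target tables, keeps the risk-and-compliance guardrails described above enforceable in one place.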
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
Maharashtra
On-site
As a Solutions Architect with over 7 years of experience, you will have the opportunity to leverage your expertise in cloud data solutions to architect scalable and modern solutions on AWS. In this role at Quantiphi, you will be a key member of our high-impact engineering teams, working closely with clients to solve complex data challenges and design cutting-edge data analytics solutions. Your responsibilities will include acting as a trusted advisor to clients, leading discovery/design workshops with global customers, and collaborating with AWS subject matter experts to develop compelling proposals and Statements of Work (SOWs). You will also represent Quantiphi in various forums such as tech talks, webinars, and client presentations, providing strategic insights and solutioning support during pre-sales activities.

To excel in this role, you should have a strong background in AWS data services, including DMS, SCT, Redshift, Glue, Lambda, EMR, and Kinesis. Your experience in data migration and modernization, particularly from Oracle, Teradata, and Netezza to AWS, will be crucial. Hands-on experience with ETL tools such as SSIS, Informatica, and Talend, as well as a solid understanding of OLTP/OLAP, Star and Snowflake schemas, and data modeling methodologies, are essential for success in this position. Additionally, familiarity with backend development using Python, APIs, and stream-processing technologies like Kafka, along with knowledge of distributed computing concepts including Hadoop and MapReduce, will be beneficial. A DevOps mindset, with experience in CI/CD practices and Infrastructure as Code, is also desired.

Joining Quantiphi as a Solutions Architect is more than just a job; it is an opportunity to shape digital transformation journeys and influence business strategies across various industries. If you are a cloud data enthusiast looking to make a significant impact in the field of data analytics, this role is perfect for you.
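As a hedged sketch of how the AWS services above are typically driven programmatically, assuming boto3 and resources already provisioned in the account; the task ARN and Glue job name are placeholders, not a prescribed Quantiphi workflow:

```python
import boto3

# Hypothetical identifiers for resources that would already exist in the account.
TASK_ARN = "arn:aws:dms:us-east-1:123456789012:task:EXAMPLE"
GLUE_JOB = "curated-zone-build"

dms = boto3.client("dms")
glue = boto3.client("glue")

# Start the DMS replication task moving data out of the legacy warehouse.
dms.start_replication_task(
    ReplicationTaskArn=TASK_ARN,
    StartReplicationTaskType="start-replication",
)

# Trigger the downstream Glue ETL job that shapes the landed data.
run = glue.start_job_run(JobName=GLUE_JOB)
print("Glue run id:", run["JobRunId"])
```

In a real migration, steps like these would usually be wired into Step Functions or a CI/CD pipeline rather than run ad hoc, in keeping with the DevOps mindset the role calls for.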
Posted 2 weeks ago