13 - 18 years
45 - 50 Lacs
Bengaluru
Work from Office
Assist in managing a team responsible for packaged software application development. Contribute to the timely delivery of software solutions. Participate in code reviews and maintain coding standards. Coordinate communication within the team and resolve project challenges. Additionally, knowledge of Annuity Insurance is good to have. Deliver assigned tasks within the delivery cycle of an application development project; tasks may include installing new system applications, updating applications, performing configuration and testing activities, and application programming for assigned modules within a larger program. You will work under supervision from the Technical Lead/Project Manager or a Senior Developer to accomplish assigned tasks, while also contributing to the design of specific deliverables and assisting in the development of technical solutions. Prepare software technical documentation based on functional documentation and specifications, taking into account any specified functional and technical requirements. You will be part of a fast-growing and exciting division whose culture is entrepreneurial, professional, and rooted in teamwork and innovation. You will participate as part of a team and maintain good relationships with team members and customers. You are expected to work within an international environment, using a broad set of technologies and frameworks. Skills: Must have at least 13 years of total industry experience. Must have experience managing and leading project teams on enterprise-level projects. Hands-on experience as a Magic XPA developer, preferably on the Transcend Life and Annuity product. Hands-on experience in database design and development (Oracle PL/SQL). Hands-on experience with Unix shell scripting. Liaise with business stakeholders directly (where required) to understand and refine requirements and provide guidance. Develop new annuity products, enhance existing products with new rider features, provide production support for the annuity policy administration system, and work in a Kanban environment, leading a product by understanding business requirements and refining them for further development. Work with the business team to understand the problem and craft a solution that truly fits client needs. A team player with excellent interpersonal and problem-solving skills, logical thinking, and analytical abilities. Excellent communication skills, with the ability to communicate clearly with client stakeholders.
Posted 2 months ago
13 - 15 years
45 - 50 Lacs
Hyderabad
Work from Office
Responsibilities: Assist in managing a team responsible for packaged software application development. Contribute to the timely delivery of software solutions. Participate in code reviews and maintain coding standards. Coordinate communication within the team and resolve project challenges. Additionally, knowledge of Annuity Insurance is good to have. Deliver assigned tasks within the delivery cycle of an application development project; tasks may include installing new system applications, updating applications, performing configuration and testing activities, and application programming for assigned modules within a larger program. You will work under supervision from the Technical Lead/Project Manager or a Senior Developer to accomplish assigned tasks, while also contributing to the design of specific deliverables and assisting in the development of technical solutions. Prepare software technical documentation based on functional documentation and specifications, taking into account any specified functional and technical requirements. You will be part of a fast-growing and exciting division whose culture is entrepreneurial, professional, and rooted in teamwork and innovation. You will participate as part of a team and maintain good relationships with team members and customers. You are expected to work within an international environment, using a broad set of technologies and frameworks. Skills: Must have at least 13 years of total industry experience. Must have experience managing and leading project teams on enterprise-level projects. Hands-on experience as a Magic XPA developer, preferably on the Transcend Life and Annuity product. Hands-on experience in database design and development (Oracle PL/SQL). Hands-on experience with Unix shell scripting. Liaise with business stakeholders directly (where required) to understand and refine requirements and provide guidance. Develop new annuity products, enhance existing products with new rider features, provide production support for the annuity policy administration system, and work in a Kanban environment, leading a product by understanding business requirements and refining them for further development. Work with the business team to understand the problem and craft a solution that truly fits client needs. A team player with excellent interpersonal and problem-solving skills, logical thinking, and analytical abilities. Excellent communication skills, with the ability to communicate clearly with client stakeholders.
Posted 2 months ago
13 - 16 years
45 - 50 Lacs
Chennai
Work from Office
Responsibilities: Assist in managing a team responsible for packaged software application development. Contribute to the timely delivery of software solutions. Participate in code reviews and maintain coding standards. Coordinate communication within the team and resolve project challenges. Additionally, knowledge of Annuity Insurance is good to have. Deliver assigned tasks within the delivery cycle of an application development project; tasks may include installing new system applications, updating applications, performing configuration and testing activities, and application programming for assigned modules within a larger program. You will work under supervision from the Technical Lead/Project Manager or a Senior Developer to accomplish assigned tasks, while also contributing to the design of specific deliverables and assisting in the development of technical solutions. Prepare software technical documentation based on functional documentation and specifications, taking into account any specified functional and technical requirements. You will be part of a fast-growing and exciting division whose culture is entrepreneurial, professional, and rooted in teamwork and innovation. You will participate as part of a team and maintain good relationships with team members and customers. You are expected to work within an international environment, using a broad set of technologies and frameworks. Skills: Must have at least 13 years of total industry experience. Must have experience managing and leading project teams on enterprise-level projects. Hands-on experience as a Magic XPA developer, preferably on the Transcend Life and Annuity product. Hands-on experience in database design and development (Oracle PL/SQL). Hands-on experience with Unix shell scripting. Liaise with business stakeholders directly (where required) to understand and refine requirements and provide guidance. Develop new annuity products, enhance existing products with new rider features, provide production support for the annuity policy administration system, and work in a Kanban environment, leading a product by understanding business requirements and refining them for further development. Work with the business team to understand the problem and craft a solution that truly fits client needs. A team player with excellent interpersonal and problem-solving skills, logical thinking, and analytical abilities. Excellent communication skills, with the ability to communicate clearly with client stakeholders.
Posted 2 months ago
6 - 8 years
9 - 13 Lacs
Bengaluru
Work from Office
Job Title: Senior DBA Location: Bangalore, India Job Type: Full-Time Company Overview: We are a leading provider of enterprise-level software solutions seeking an experienced Senior DBA to manage our database operations and ensure the integrity, performance, and security of our database systems. The ideal candidate will combine strong database expertise with problem-solving skills to maintain excellence in our database infrastructure. Position Summary: As a Senior DBA, you will be responsible for the implementation, configuration, maintenance, and performance of critical database systems. Your primary focus will be on SQL Server and other RDBMS platforms, ensuring high availability and consistent performance of our applications. Key Responsibilities: Database Management: o Manage and maintain RDBMS including SQL Server. o Perform database maintenance, administration, backup, and recovery. o Conduct capacity planning and ensure database performance optimization. Reporting and Visualization: o Design and develop reports using SSRS and other visualization tools. o Utilize Power BI for advanced data visualization (optional). Optimization: o Perform query tuning and database performance optimization. Data Modeling and Architecture: o Develop logical and physical data models. o Utilize data modeling tools and understand data architecture fundamentals. Data Engineering: o Implement ETL processes and understand data warehousing fundamentals. o Work with OLAP/OLTP systems and cloud databases (AWS, Azure). NoSQL Databases: o Understand NoSQL database fundamentals and work with databases like Azure Cosmos DB, MongoDB, Cassandra, and Amazon DynamoDB (optional). In-Memory Databases: o Manage in-memory databases like Redis and Memcached (optional). Programming: o Develop and maintain applications using Java/C# and .NET/J2EE. Agile Methodology: o Apply Agile principles in database management and development. Required Technical Competencies: Strong command of RDBMS fundamentals including DDL, DML, DQL. Advanced knowledge of database normalization and referential integrity. Proven expertise in SQL Server administration and maintenance. Experience with backup strategies and disaster recovery planning. Demonstrated ability in capacity planning and resource optimization. Experience with enterprise reporting solutions (SSRS). Knowledge of business intelligence tools (Power BI). Understanding of data warehouse architectures. Proven track record in database performance optimization. Experience in logical and physical data modeling. Expertise in query optimization and tuning. Working knowledge of cloud platforms (AWS, Azure). Experience with cloud databases (RDS, Azure SQL DB). Understanding of NoSQL database concepts. Basic programming knowledge (.NET/Java). Required Soft Skills: Strong teamwork, communication, and articulation skills. Flexibility, adaptability, and problem-solving abilities. Analytical and critical thinking skills. Ability to plan, track, and manage tasks effectively. Strong work ethic and accountability. Client-focused approach. Calm under pressure. Strong planning and organizational skills. Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Minimum 2 years of DBA experience. Proven track record of managing database operations. Work Environment: Location: Bangalore, India. Flexible work arrangement available - twice a week from the office. On-call rotation for critical issues. Additional Information: Position reports to Nir Koren.
Responsible for enterprise-wide database operations. Opportunity for career growth and advancement.
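To give a concrete flavour of the routine maintenance work this role describes, here is a minimal illustrative sketch (not part of the posting) of an index-fragmentation check against SQL Server; it assumes the pyodbc driver is installed, and the connection string and the 30% threshold are placeholders:

```python
# Minimal sketch: flag fragmented indexes on a SQL Server instance.
# The connection string and threshold are illustrative placeholders.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=master;Trusted_Connection=yes;"
)

FRAGMENTATION_SQL = """
SELECT OBJECT_NAME(s.object_id) AS table_name,
       i.name                   AS index_name,
       s.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS s
JOIN sys.indexes AS i
  ON i.object_id = s.object_id AND i.index_id = s.index_id
WHERE s.avg_fragmentation_in_percent > 30 AND i.name IS NOT NULL
ORDER BY s.avg_fragmentation_in_percent DESC;
"""

with pyodbc.connect(CONN_STR) as conn:
    for table, index, frag in conn.cursor().execute(FRAGMENTATION_SQL):
        # Indexes above ~30% fragmentation are typical REBUILD candidates.
        print(f"{table}.{index}: {frag:.1f}% fragmented")
```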
Posted 2 months ago
5 - 8 years
6 - 10 Lacs
Bengaluru
Work from Office
KEY ACCOUNTABILITIES The Azure Data Support Engineer focuses on data-related tasks in Azure: manage, monitor, and ensure the security and privacy of data to satisfy business needs. Monitor real-time and batch processes to ensure data accuracy. Monitor Azure pipelines and troubleshoot where required. Enhance existing pipelines and Databricks notebooks as and when required. Be involved in development stages of new pipelines as and when required. Troubleshoot pipelines and real-time replication jobs, ensuring minimum data lag. Available to work on a shift basis to cover monitoring during weekends (one weekend out of three). Act as an ambassador for DP World at all times when working, promoting and demonstrating positive behaviours in harmony with DP World's Principles, values and culture; ensuring the highest level of safety is applied in all activities; understanding and following DP World's Code of Conduct and Ethics policies. Perform other related duties as assigned. JOB CONTEXT Responsible for monitoring and enhancing existing data pipelines using the Microsoft stack. Responsible for enhancement of existing data platforms. Experience with cloud platforms such as Azure, AWS, Google Cloud, etc. Experience with Azure Data Lake, Azure Data Lake Analytics, Azure SQL Database, Azure Databricks and Azure SQL Data Warehouse. Good understanding of Spark architecture including Spark Core, Spark SQL, DataFrames, Spark Streaming, driver node, worker nodes, stages, executors and tasks. Good understanding of Big Data Hadoop and YARN architecture along with various Hadoop daemons such as Job Tracker, Task Tracker, Name Node, Data Node, Resource/Cluster Manager, and Kafka (distributed stream processing). Experience in database design and development with Business Intelligence using SQL Server 2014/2016, Integration Services (SSIS), DTS packages, SQL Server Analysis Services (SSAS), DAX, OLAP cubes, star schema and snowflake schema. Monitoring of pipelines in ADF and experience with Azure SQL, Blob Storage, Azure SQL Data Warehouse. Experience in a support environment working with real-time data replication will be a plus. QUALIFICATIONS, EXPERIENCE AND SKILLS Qualification: Bachelor's/Master's in Computer Science/IT or equivalent. Azure certifications will be an added advantage (AZ-900 and/or AZ-204, AZ-303, AZ-304 or AZ-400, DP-200/DP-201). ITIL certification a plus. Experience: 5 - 8 years. Must Have Skills: Azure Data Lake, Data Factory, Azure Databricks, Azure SQL Database, Azure SQL Data Warehouse. Hadoop ecosystem. Azure analytics services. Programming: Python, R, Spark SQL. Good to Have Skills: MSBI (SSIS, SSAS, SSRS), Oracle, SQL, PL/SQL. Data Visualization, Power BI. Data Migration.
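As an illustration of the kind of data-lag monitoring this role describes, here is a minimal PySpark sketch; the ADLS path, the ingested_at column, and the 30-minute threshold are placeholder assumptions, and timestamps are assumed to be stored in UTC:

```python
# Minimal sketch: check how far behind the latest ingested batch is.
# Path, column name, and threshold are illustrative placeholders.
from datetime import datetime
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("lag-monitor").getOrCreate()

events = spark.read.parquet("abfss://landing@account.dfs.core.windows.net/events/")
latest = events.agg(F.max("ingested_at").alias("latest")).collect()[0]["latest"]

# Spark returns a naive datetime; we assume the column is stored in UTC.
lag_minutes = (datetime.utcnow() - latest).total_seconds() / 60
if lag_minutes > 30:
    print(f"ALERT: ingestion lag is {lag_minutes:.0f} minutes")
else:
    print(f"OK: ingestion lag is {lag_minutes:.0f} minutes")
```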
Posted 2 months ago
5 - 9 years
12 - 17 Lacs
Bengaluru
Work from Office
Develop and maintain reporting applications using Cognos Business Intelligence components, including Report Studio, Framework Manager, Query Studio, and Cognos Connection. Design and implement complex reports in the Cognos 10 and 11 BI Suite with advanced features like drill-through, master-detail relationships, conditional formatting, report bursting, and macros. Enhance existing Framework packages by integrating new namespaces, query subjects, query items, and data tables based on evolving business requirements. Perform metadata modeling tasks such as creating projects, managing metadata, preparing business views, and publishing packages to the Cognos portal. Administer Cognos environments, including creating data source connections, deploying reports from development to production environments, and managing user permissions for packages and folders. Conduct OLTP/OLAP system analysis and maintain database schemas, such as Star and Snowflake schemas, to ensure efficient data management. Optimize report performance, write and debug complex SQL queries, and troubleshoot issues to maintain seamless reporting operations. Leverage domain expertise in banking and finance to align reporting solutions with industry-specific needs. Qualifications: Proficiency in IBM Cognos BI Suite (versions 10 and 11), including Report Studio, Framework Manager, Query Studio, and Event Studio. Strong experience in metadata modeling and Cognos administration. Hands-on expertise in OLTP/OLAP systems and database schema design (Star and Snowflake schemas). Proficiency in SQL with experience in writing and troubleshooting complex queries. Knowledge of banking and finance processes and reporting needs is a significant advantage. A minimum of 5 years of relevant experience in Cognos BI development and administration. Excellent problem-solving abilities, attention to detail, strong communication skills, and the ability to work collaboratively with cross-functional teams. Location: On-site, Jeddah (Middle East)
Posted 2 months ago
3 - 8 years
5 - 10 Lacs
Bengaluru
Work from Office
About The Role General Skills: Good interpersonal skills and the ability to manage multiple tasks with enthusiasm. Interact with clients to understand the requirements. 4 to 8 years of total IT experience, with min 3+ years in Power BI. Technical Skills: Understand business requirements in an MSBI context and design data models to transform raw data into meaningful insights. Awareness of star and snowflake schemas in DWH and DWH concepts. Should be familiar and experienced in T-SQL. Have good knowledge and experience in SSIS (ETL). Creation of dashboards & visual interactive reports using Power BI. Extensive experience in both Power BI Service & Power BI on-premises environments. Create relationships between data sources and develop data models accordingly. Experience in implementing Tabular models and row-level data security. Experience in writing and optimizing DAX queries. Experience in Power Automate Flow. Performance tuning and optimization of Power BI reports. Good understanding of data warehouse concepts. Knowledge of Microsoft Azure analytics is a plus. Good to have Azure BI skills (ADF, ADB, Synapse). Good UI/UX experience/knowledge. Knowledge in Tabular Models. Knowledge in Paginated Reports. General Skills: Good interpersonal skills and the ability to manage multiple tasks with enthusiasm. Interact with clients to understand the requirements. Up-to-date knowledge of the best practices and advancements in Power BI. Should have an analytical and problem-solving mindset and approach. 6 to 10 years of total IT experience, with min 3+ years in Power BI. Technical Skills: Understand business requirements in a BI context and design data models to transform raw data into meaningful insights. Good knowledge of all variants of Power BI (Power BI Embedded, Power BI Service, Power BI Report Server). Strong SQL skills and SQL performance tuning. Provide expertise in data modeling and database design and provide recommendations. Make suggestions on best practices in implementing data models, ETL packages, OLAP cubes, and reports. Experience working with direct query and import mode. Expertise in implementing static & dynamic row-level security. Knowledge to integrate Power BI reports in external web applications. Should have experience setting up data gateways & data preparation. Creation of dashboards & visual interactive reports using Power BI. Experience working with third-party custom visuals like Zebra BI etc. Create relationships between data sources and develop data models accordingly. Have good knowledge of the various DAX functions and the ability to write complex DAX queries. Awareness of star and snowflake schemas in DWH and DWH concepts. Knowledge in Tabular Models. Be familiar with creating T-SQL objects, scripts, views, and stored procedures.
Posted 2 months ago
10 - 13 years
20 - 35 Lacs
Gurgaon
Work from Office
Acuity Knowledge Partners Acuity Knowledge Partners (Acuity) is a leading provider of bespoke research, analytics and technology solutions to the financial services sector, including asset managers, corporate and investment banks, private equity and venture capital firms, hedge funds and consulting firms. Its global network of over 6,000 analysts and industry experts, combined with proprietary technology, supports more than 600 financial institutions and consulting companies to operate more efficiently and unlock their human capital, driving revenue higher and transforming operations. Acuity is headquartered in London and operates from 10 locations worldwide. The company fosters a diverse, equitable and inclusive work environment, nurturing talent, regardless of race, gender, ethnicity or sexual orientation. Acuity was established as a separate business from Moody's Corporation in 2019, following its acquisition by Equistone Partners Europe (Equistone). In January 2023, funds advised by global private equity firm Permira acquired a majority stake in the business from Equistone, which remains invested as a minority shareholder. For more information, visit www.acuitykp.com. Position Title: Director. Experience Level: 10+ yrs. Department: IT. Location: Gurgaon. Job Summary: We are looking for a Senior Reporting & Analytics Lead to design, implement, and manage the enterprise reporting and analytics strategy for a global ERP & CRM rollout. This role requires deep expertise in Oracle OTBI, Fusion Reporting, Power BI, and enterprise data visualisation. The ideal candidate will lead the reporting architecture, data modelling, dashboard development, and analytics capabilities to support business decision-making across multiple regions and business units. Key Responsibilities: Strategy & Architecture: • Define and execute the enterprise-wide reporting and analytics strategy aligned with the global ERP & CRM rollout. • Lead the design, development, and governance of Oracle OTBI, Fusion Reports, and Power BI dashboards. • Develop and optimise the reporting data model and architecture, ensuring scalability, security, and performance. • Collaborate with data engineers, business analysts, and functional leads to translate business needs into actionable reports and analytics solutions. • Ensure data governance, integrity, and compliance with organisational policies and regulatory standards. Development & Execution: • Lead hands-on development of reports, dashboards, and KPIs using Oracle OTBI, Fusion BI Publisher, and Power BI. • Design and develop data models, ETL pipelines, and data transformations to support reporting and analytics. • Implement real-time, interactive dashboards and self-service analytics capabilities for business users. • Optimise data extraction, transformation, and visualisation processes for performance and scalability. • Establish best practices for data storytelling, visualisation, and dashboard usability. • Develop and implement automated report distribution and scheduling for key stakeholders. Stakeholder Engagement & Governance: • Work closely with finance, operations, HR, sales, and supply chain teams to define reporting requirements and priorities. • Act as the primary point of contact for all reporting and analytics initiatives, ensuring alignment with business goals. • Provide regular reporting, insights, and recommendations to senior leadership and program stakeholders. • Establish reporting governance policies, ensuring standardised definitions and consistency across business units.
• Conduct training and enablement sessions for business users on self-service reporting tools. Technical & Tool Expertise: • Expert-level proficiency in Power BI (DAX, Power Query, data modeling, paginated reports). • Strong hands-on experience with Oracle OTBI and Fusion BI Publisher for ERP & CRM reporting. • Proficiency in SQL, PL/SQL, and database optimisation techniques. • Experience with ETL tools (Oracle Data Integrator, Informatica, Azure Data Factory, etc.). • Knowledge of data warehousing, OLAP, and cloud-based reporting solutions (Oracle Cloud, Azure, AWS). • Experience integrating ERP & CRM data with external reporting tools. • Familiarity with AI/ML-driven analytics, predictive modelling, and advanced data visualisation techniques. Key Competencies: • Essential Skills & Experience: • 10+ years of experience in enterprise reporting, analytics, and BI development. • Proven track record of leading large-scale ERP & CRM reporting initiatives. • Expertise in Oracle OTBI, Fusion Reporting, and Power BI development. • Strong understanding of ERP & CRM data structures, business processes, and reporting needs. • Hands-on experience in data modelling, ETL, and database optimisation. • Ability to translate business requirements into technical reporting solutions. • Excellent communication and stakeholder management skills. Preferred Qualifications: • Oracle Cloud, Power BI, or Azure Data certifications. • Experience with Salesforce or other CRM analytics. • Knowledge of AI-driven analytics, machine learning, or advanced forecasting. • Exposure to DevOps and CI/CD practices in BI development.
Posted 2 months ago
4 - 9 years
5 - 9 Lacs
Bengaluru
Work from Office
System integration of heterogeneous data sources and working on technologies used in the design, development, testing, deployment, and operations of DW/BI solutions. Create and maintain documentation, architecture designs and data flow diagrams. Help to deliver scalable solutions on the MSBI platforms and Hadoop. Implement source code versioning, standard methodology tools and processes for ensuring data quality. Collaborate with business professionals, application developers and technical staff working in an agile process environment. Assist in activities such as source system analysis, creating and documenting high-level business model design, UAT, project management, etc. Skills: What you need to succeed: 4+ years of relevant work experience in SSIS, SSAS, DW, Data Analysis and Business Intelligence. Must have expert knowledge of data warehousing tools (SSIS, SSAS, DB). Must have expert knowledge of T-SQL, stored procedures, and database performance tuning. Strong in Data Warehousing, Business Intelligence and Dimensional Modelling concepts, with experience in designing, developing and maintaining ETL, database, OLAP schema and public objects (attributes, facts, metrics, etc.). Good to have experience in developing reports and dashboards using BI reporting tools like Tableau, SSRS, Power BI, etc. Fast learner, analytical, with the skill to understand multiple businesses and their performance indicators. Bachelor's degree in Computer Science or equivalent. Superb communication and presentation skills. QUALIFICATIONS Must Have Skills: SSIS, SSAS, TSQL, DWH, ETL. Good To Have Skills: TABLEAU, POWER BI. Minimum Education Level: Bachelors or Equivalent.
Posted 2 months ago
3 - 5 years
13 - 15 Lacs
Gurgaon
Work from Office
Data is at the heart of our global financial network. In fact, the ability to consume, store, analyze and gain insight from data has become a key component of our competitive advantage. Our goal is to build and maintain a leading-edge data platform that provides highly available, consistent data of the highest quality for all users of the platform, including our customers, operations teams and data scientists. We focus on evolving our platform to deliver exponential scale to NCR Atleos, powering our future growth. As a Data Engineer at NCR Atleos, you will experience working at one of the largest and most recognized financial companies in the world, while being part of a software development team responsible for next-generation technologies and solutions. You will partner with data and analytics experts to deliver high-quality analytical and derived data to our consumers. We are looking for Data Engineers who like to innovate and seek out complex problems. We recognize that strength comes from diversity and will embrace your unique skills, curiosity, drive, and passion while giving you the opportunity to grow technically and as an individual. Design is an iterative process, whether for UX, services or infrastructure. Our goal is to drive modernization and improvement of application capabilities. Job Responsibilities: As a Data Engineer, you will be joining our Data Engineering Modernization team, transforming our global financial network and improving the data products and services we provide to our internal customers. This team will leverage cutting-edge data engineering modernization techniques to develop scalable solutions for managing data and building data products. In this role, you are expected to be involved from the inception of projects to understand requirements, and to architect, develop, deploy, and maintain data. You will work in a multi-disciplinary, agile squad, which involves partnering with program and product managers to expand the product offering based on business demands. Focus on speed to market and getting data products and services into the hands of our stakeholders, and passion to transform the financial industry, are key to the success of this role. Maintain a positive and collaborative working relationship with teams within the NCR Atleos technology organization, as well as with the wider business. Creative and inventive problem-solving skills for reduced turnaround times are required, valued, and will be a major part of the job. The ideal candidate would have: BA/BS in Computer Science or equivalent practical experience. Experience applying machine learning and AI techniques to modernizing data and reporting use cases. Overall 3+ years of experience on Data Analytics or Data Warehousing projects. At least 2+ years of cloud experience on AWS/Azure/GCP, preferably Azure (Microsoft Azure, ADF, Synapse). Programming in Python and PySpark, with experience using pandas, ML libraries, etc. Data streaming with Flink/Spark structured streaming. Open-source orchestration frameworks like DBT, ADF, Airflow. Open-source data ingestion frameworks like Airbyte, Debezium. Experience migrating from traditional on-prem OLTP/OLAP databases to cloud-native DBaaS and/or NoSQL databases like Cassandra, Neo4j, MongoDB, etc. Deep expertise operating in a cloud environment, and with cloud-native databases like Cosmos DB, Couchbase, etc. Proficiency in various data modelling techniques, such as ER, hierarchical, relational, or NoSQL modelling. Excellent design, development, and tuning experience with SQL (OLTP and OLAP) and NoSQL databases.
Experience with modern database DevOps tools like Liquibase, Redgate Flyway or DBmaestro. Deep understanding of data security and compliance, and related architecture. Deep understanding and strong administrative experience with distributed data processing frameworks such as Hadoop, Spark, and others. Experience with programming languages like Python, Java, Scala, and machine learning libraries. Experience with DevOps tools like Git, Maven, Jenkins, GitHub Actions, Azure DevOps. Experience with Agile development concepts and related tools. Ability to tune and troubleshoot performance issues across the codebase and database queries. Excellent problem-solving skills, with the ability to think critically and creatively to develop innovative data solutions. Excellent written and verbal communication skills, with the ability to effectively convey complex technical concepts to a diverse audience. Passion for learning with a proactive mindset, and the ability to work independently and collaboratively in a fast-paced, dynamic environment. Additional Skills: Leverage machine learning and AI techniques in operationalizing data pipelines and building data products. Provide data services using APIs. Containerize data products and services using Kubernetes and/or Docker.
Posted 2 months ago
8 - 10 years
25 - 27 Lacs
Chennai
Work from Office
Role As the Senior Lead for AI and Data Warehouse at Pando, you will be responsible for building and scaling the data and AI services team. You will drive the design and implementation of highly scalable, modular, and reusable data pipelines, leveraging big data technologies and low-code implementations. This is a senior leadership position where you will work closely with cross-functional teams to deliver solutions that power advanced analytics, dashboards, and AI-based insights. Key Responsibilities -Lead the development of scalable, high-performance data pipelines using PySpark or Big Data ETL pipeline technologies. -Drive data modeling efforts for analytics, dashboards, and knowledge graphs. -Oversee the implementation of parquet-based data lakes. -Work on OLAP databases, ensuring optimal data structure for reporting and querying. -Architect and optimize large-scale enterprise big data implementations with a focus on modular and reusable low-code libraries. -Collaborate with stakeholders to design and deliver AI and DWH solutions that align with business needs. -Mentor and lead a team of engineers, building out the data and AI services organization. Requirements -8-10 years of experience in big data and AI technologies, with expertise in PySpark or similar Big Data ETL pipeline technologies. -Strong proficiency in SQL and OLAP database technologies. -Firsthand experience with data modeling for analytics, dashboards, and knowledge graphs. -Proven experience with parquet-based data lake implementations. -Expertise in building highly scalable, high-volume data pipelines. -Experience with modular, reusable, low-code-based implementations. -Involvement in large-scale enterprise big data implementations. -Initiative-taker with strong motivation and the ability to lead a growing team. Preferred -Experience leading a team or building out a new department. -Experience with cloud-based data platforms and AI services. -Familiarity with supply chain technology or fulfilment platforms is a plus. Join us at Pando and lead the transformation of our AI and data services, delivering innovative solutions for global enterprises!
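For illustration only, here is a minimal PySpark sketch of the kind of modular, parquet-based pipeline step this role covers; the paths, column names, and aggregation are hypothetical, not details from the posting:

```python
# Minimal sketch of a modular PySpark step: read raw events, aggregate,
# and write a partitioned parquet layer. All names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("shipments-daily-agg").getOrCreate()

raw = spark.read.parquet("s3://datalake/raw/shipments/")

daily = (
    raw.withColumn("ship_date", F.to_date("shipped_at"))
       .groupBy("ship_date", "carrier")
       .agg(F.count("*").alias("shipments"),
            F.sum("freight_cost").alias("total_cost"))
)

# Partitioning by date keeps downstream OLAP queries pruned and fast.
daily.write.mode("overwrite").partitionBy("ship_date") \
     .parquet("s3://datalake/curated/shipments_daily/")
```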
Posted 2 months ago
2 - 3 years
18 - 20 Lacs
Mumbai
Work from Office
Job Title: Product Engineer - Big Data Location: Mumbai Experience: 3 - 8 Yrs Job Summary: As a Product Engineer - Big Data, you will be responsible for designing, building, and optimizing large-scale data processing pipelines using cutting-edge Big Data technologies. Collaborating with cross-functional teams including data scientists, analysts, and product managers, you will ensure data is easily accessible, secure, and reliable. Your role will focus on delivering high-quality, scalable solutions for data storage, ingestion, and analysis, while driving continuous improvements throughout the data lifecycle. Key Responsibilities: ETL Pipeline Development & Optimization: Design and implement complex end-to-end ETL pipelines to handle large-scale data ingestion and processing. Utilize AWS services like EMR, Glue, S3, MSK (Managed Streaming for Kafka), DMS (Database Migration Service), Athena, and EC2 to streamline data workflows, ensuring high availability and reliability. Big Data Processing: Develop and optimize real-time and batch data processing systems using Apache Flink, PySpark, and Apache Kafka. Focus on fault tolerance, scalability, and performance. Work with Apache Hudi for managing datasets and enabling incremental data processing. Data Modeling & Warehousing: Design and implement data warehouse solutions that support both analytical and operational use cases. Model complex datasets into optimized structures for high performance, easy access, and query efficiency for internal stakeholders. Cloud Infrastructure Development: Build scalable cloud-based data infrastructure leveraging AWS tools. Ensure data pipelines are resilient and adaptable to changes in data volume and variety, while optimizing costs and maximizing efficiency using Managed Apache Airflow for orchestration and EC2 for compute resources. Data Analysis & Insights: Collaborate with business teams and data scientists to understand data needs and deliver high-quality datasets. Conduct in-depth analysis to derive insights from the data, identifying key trends, patterns, and anomalies to drive business decisions. Present findings in a clear, actionable format. Real-time & Batch Data Integration: Enable seamless integration of real-time streaming and batch data from systems like AWS MSK. Ensure consistency in data ingestion and processing across various formats and sources, providing a unified view of the data ecosystem. CI/CD & Automation: Use Jenkins to establish and maintain continuous integration and delivery pipelines. Implement automated testing and deployment workflows, ensuring smooth integration of new features and updates into production environments. Data Security & Compliance: Collaborate with security teams to ensure data pipelines comply with organizational and regulatory standards such as GDPR, HIPAA, or other relevant frameworks. Implement data governance practices to ensure integrity, security, and traceability throughout the data lifecycle. Collaboration & Cross-Functional Work: Partner with engineers, data scientists, product managers, and business stakeholders to understand data requirements and deliver scalable solutions. Participate in agile teams, sprint planning, and architectural discussions. Troubleshooting & Performance Tuning: Identify and resolve performance bottlenecks in data pipelines. Ensure optimal performance through proactive monitoring, tuning, and applying best practices for data ingestion and storage.
Skills & Qualifications: Must-Have Skills: AWS Expertise: Hands-on experience with core AWS services related to Big Data, including EMR, Managed Apache Airflow, Glue, S3, DMS, MSK, Athena, and EC2. Strong understanding of cloud-native data architecture. Big Data Technologies: Proficiency in PySpark and SQL for data transformations and analysis. Experience with large-scale data processing frameworks like Apache Flink and Apache Kafka. Data Frameworks: Strong knowledge of Apache Hudi for data lake operations, including CDC (Change Data Capture) and incremental data processing. Database Modeling & Data Warehousing: Expertise in designing scalable data models for both OLAP and OLTP systems. In-depth understanding of data warehousing best practices. ETL Pipeline Development: Proven experience in building robust, scalable ETL pipelines for processing real-time and batch data across platforms. Data Analysis & Insights: Strong problem-solving skills with a data-driven approach to decision-making. Ability to conduct complex data analysis to extract actionable business insights. CI/CD & Automation: Basic to intermediate knowledge of CI/CD pipelines using Jenkins or similar tools to automate deployment and monitoring of data pipelines.
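By way of example, a minimal Structured Streaming sketch of the Kafka-to-data-lake ingestion pattern this posting describes; the broker address, topic, schema, and paths are placeholder assumptions:

```python
# Minimal sketch: consume a Kafka topic (e.g. via MSK) and land
# micro-batches in a raw parquet zone. All names are placeholders.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StringType, DoubleType

spark = SparkSession.builder.appName("orders-stream").getOrCreate()

schema = (StructType()
          .add("order_id", StringType())
          .add("amount", DoubleType()))

stream = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "msk-broker:9092")
          .option("subscribe", "orders")
          .load()
          .select(F.from_json(F.col("value").cast("string"), schema).alias("o"))
          .select("o.*"))

query = (stream.writeStream
         .format("parquet")
         .option("path", "s3://lake/raw/orders/")
         .option("checkpointLocation", "s3://lake/chk/orders/")
         .trigger(processingTime="1 minute")
         .start())
query.awaitTermination()
```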
Posted 2 months ago
4 - 8 years
8 - 12 Lacs
Bengaluru
Work from Office
Requirements: Proficient in Azure/GCP/AWS cloud services (SaaS, PaaS, CaaS, and IaaS). Designing custom applications using microservices, event-driven architecture, single-page applications, and micro frontends. Strong experience with front-end frameworks such as React or Angular. Exposure to network security, infrastructure security and application security, data privacy and compliance requirements. Proficient in transactional and analytical database systems, including OLTP (Online Transaction Processing) for real-time data management and OLAP (Online Analytical Processing) for complex data analysis. Skilled in NoSQL/document databases for flexible data storage and adept at leveraging data analytics for insightful decision-making. Proven ability in setting up and managing CI/CD pipelines for seamless software delivery and infrastructure as code (IaC).
Posted 2 months ago
10 - 17 years
20 - 27 Lacs
Bengaluru
Work from Office
Develop and maintain reporting applications using Cognos Business Intelligence components, including Report Studio, Framework Manager, Query Studio, and Cognos Connection. Design and implement complex reports in the Cognos 10 and 11 BI Suite with advanced features like drill-through, master-detail relationships, conditional formatting, report bursting, and macros. Enhance existing Framework packages by integrating new namespaces, query subjects, query items, and data tables based on evolving business requirements. Perform metadata modeling tasks such as creating projects, managing metadata, preparing business views, and publishing packages to the Cognos portal. Administer Cognos environments, including creating data source connections, deploying reports from development to production environments, and managing user permissions for packages and folders. Conduct OLTP/OLAP system analysis and maintain database schemas, such as Star and Snowflake schemas, to ensure efficient data management. Optimize report performance, write and debug complex SQL queries, and troubleshoot issues to maintain seamless reporting operations. Leverage domain expertise in banking and finance to align reporting solutions with industry-specific needs. Qualifications: Proficiency in IBM Cognos BI Suite (versions 10 and 11), including Report Studio, Framework Manager, Query Studio, and Event Studio. Strong experience in metadata modeling and Cognos administration. Hands-on expertise in OLTP/OLAP systems and database schema design (Star and Snowflake schemas). Proficiency in SQL with experience in writing and troubleshooting complex queries. Knowledge of banking and finance processes and reporting needs is a significant advantage. A minimum of 5 years of relevant experience in Cognos BI development and administration. Excellent problem-solving abilities, attention to detail, strong communication skills, and the ability to work collaboratively with cross-functional teams. Location: On-site, Jeddah (Middle East)
Posted 2 months ago
5 years
0 Lacs
Mumbai, Maharashtra
Work from Office
You are a strategic thinker passionate about driving solutions in data visualization. You have found the right team. As a Data Visualization Associate within our Databricks team, you will be responsible for designing, developing, and optimizing data models to support data integration, transformation, and analytics. We value your expertise in handling data from various sources and your commitment to ensuring scalable, efficient, and high-quality data solutions. Job Responsibilities: Design and implement data models (conceptual, logical, and physical) to support business requirements. Hands-on Erwin tool experience is an added advantage. Work with structured and unstructured data from multiple sources and integrate them into Databricks. Develop ETL/ELT pipelines to extract, transform, and load data efficiently. Optimize data storage, processing, and performance in Databricks. Collaborate with data engineers, analysts, and business stakeholders to understand data needs. Ensure data governance, quality, and compliance with industry standards. Create and maintain documentation for data models, pipelines, and architectures. Troubleshoot and optimize queries and workflows for performance improvement. Create/modify queries at the consumption level for end users. Required qualifications, capabilities, and skills: 5+ years of experience in data modeling/data engineering. Strong expertise in Databricks, Delta Lake, Apache Spark, and advanced queries. Experience with SQL and Python for data manipulation. Knowledge of ETL/ELT processes and data pipeline development. Hands-on experience with data warehousing, relational databases, and NoSQL. Familiarity with data governance, security, and compliance best practices. Strong problem-solving skills and the ability to work in an agile environment. Preferred qualifications, capabilities and skills: Experience working with large-scale data systems and streaming data. Knowledge of business intelligence (BI) tools and reporting frameworks. Experience in the finance domain (P&A, Markets, etc.) is preferable. Experience with cloud platforms (AWS, Azure, or GCP) is a plus. Experience with OLAP tools (TM1, Essbase, Atoti, etc.) is a plus.
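As an illustrative sketch of the Delta Lake upsert pattern such a role typically involves (table paths and the customer_id key are hypothetical; assumes a Databricks runtime or the delta-spark package):

```python
# Minimal sketch of a Delta Lake upsert (MERGE) from a staging layer
# into a curated dimension table. All paths/columns are placeholders.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dim-upsert").getOrCreate()

updates = spark.read.parquet("/mnt/staging/customers/")
target = DeltaTable.forPath(spark, "/mnt/curated/dim_customer/")

(target.alias("t")
 .merge(updates.alias("s"), "t.customer_id = s.customer_id")
 .whenMatchedUpdateAll()      # refresh changed attributes
 .whenNotMatchedInsertAll()   # add brand-new customers
 .execute())
```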
Posted 3 months ago
0.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Job Information: Date Opened: 03/20/2025. Job Type: Permanent. RSD NO: 10503. Industry: IT Services. City: Chennai. State/Province: Tamil Nadu. Country: India. Zip/Postal Code: 600086. Job Description: We are looking for an Automation Quality Engineer who is self-motivated, creative and proactive, to work successfully in a fast-paced environment including multiple platforms and architectures, diverse technologies and cloud environments. The individual will work closely with various stakeholders throughout the SDLC, executing automated test iterations, tracking & reporting test results, troubleshooting and coordinating the bug fixes. The individual should have a strong understanding of agile processes and the related Quality lifecycle and automation methodology. Responsibilities: • Design, develop, execute and analyze automation test scripts & test results. • Estimate tests accurately and coordinate with team members for work activities. • Apply design and build automated testing capabilities under the BDD umbrella. • Record test results, report and verify software bug fixes to accept automation criteria. • Coordinate with program and development management teams in the product development lifecycle to conform to end-user product and quality requirements and the shipment schedule. Required Skills: • Expert in writing automated scripts using scripting languages like Python, Java, JavaScript. • Expert in designing and building automated testing frameworks using BDD tools like Cucumber, Selenium, TestCafe. • Expertise in performance testing using K6 and similar tools. • Hands-on experience in building and implementing CI/CD automation strategies for testing using tools like Jenkins, Bamboo, Azure DevOps. • Experienced in designing automated test scripts testing operational/relational/OLAP databases and data warehouses. • Expert in testing APIs and microservices for both the UI layer and the data layer for data extraction, preparation and consumption modules. • Exposure and familiarity with SQL databases. • Strong analytical skills to understand complex business logic and calculations. • Experience in writing clear, concise and comprehensive test plans and test cases. • Good communication skills. • Exposure to the financial domain would be preferred. • Azure/AWS certification would be preferred. At Indium, diversity, equity, and inclusion (DEI) are the cornerstones of our values. We champion DEI through a dedicated council, expert sessions, and tailored training programs, ensuring an inclusive workplace for all. Our initiatives, including the WE@IN women empowerment program and our DEI calendar, foster a culture of respect and belonging. Recognized with the Human Capital Award, we are committed to creating an environment where every individual thrives. Join us in building a workplace that values diversity and drives innovation.
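For a concrete picture of the kind of automated UI check described, here is a minimal pytest/Selenium sketch; the URL and CSS selector are placeholders, and it requires the selenium package plus a Chrome driver on the path:

```python
# Minimal sketch: drive a page headlessly and assert on the result.
# The target URL and locator are illustrative placeholders.
import pytest
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.chrome.options import Options

@pytest.fixture
def driver():
    opts = Options()
    opts.add_argument("--headless=new")
    drv = webdriver.Chrome(options=opts)
    yield drv
    drv.quit()

def test_search_returns_results(driver):
    driver.get("https://example.com/search?q=widgets")
    results = driver.find_elements(By.CSS_SELECTOR, ".result-item")
    assert results, "expected at least one search result"
```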
Posted 3 months ago
1 - 3 years
8 - 12 Lacs
Bengaluru
Work from Office
We are seeking a talented Data/Software Engineer II with expertise in big data processing and ETL pipeline management, plus a solid background in software engineering, including building scalable, performant web systems with a clear focus on reusable modules. Key Responsibilities: - Design, develop, and maintain ETL pipelines to process large-scale datasets efficiently and reliably. - Build and optimize Spark-based data pipelines to perform transformations and aggregate data for analytics and machine learning applications. - Implement AWS Glue jobs to support data ingestion, transformation, and integration across various data sources. - Leverage Apache Iceberg for efficient data storage, management, and querying, with a focus on performance and scalability. - Utilize Airflow to orchestrate complex workflows and ensure the timely and efficient execution of data processing jobs (a minimal DAG sketch follows below). - Implement Change Data Capture (CDC) processes to capture real-time changes from source systems and integrate them into downstream data systems. - Build scalable and efficient ETL solutions that maintain high data quality and data governance standards. - Develop, test, and deploy web services for data access APIs, integrating data pipelines with other applications. - Ability to translate fuzzy business problems into technical problems, come up with design, estimates, and planning, and execute and deliver the solution independently. - Use MySQL, MongoDB, and other database technologies to store and retrieve data as needed for ETL processes and web services. Required Skills and Qualifications: - 2-4 years of experience as a data/software engineer or in a related role, preferably in a fast-paced and data-intensive environment. - Strong experience with Spark for batch and real-time data processing, including writing and optimizing Spark jobs. - Strong problem-solving skills and a proactive approach to tackling complex data challenges. - Knowledge of Apache Kafka or similar streaming technologies for real-time data processing. - Understanding of distributed systems and cloud-based architectures. - Strong expertise in coding, data structures, algorithms, low-level class/DB design, high-level system design, and architecting for high scale using distributed systems. - Excellent communication and collaboration skills to work effectively within cross-functional teams. - Experience with CDC techniques and tools to capture data changes in real time. - Experience with SQL and OLAP data stores.
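To illustrate the Airflow orchestration pattern mentioned above, here is a minimal DAG sketch; the task bodies, schedule, and DAG id are placeholder assumptions, written for a recent Airflow 2.x:

```python
# Minimal sketch: chain an ingest step and a transform step hourly.
# Callables are stubs; real tasks would trigger Glue/Spark jobs.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest():
    print("pull CDC batch from source")

def transform():
    print("run Spark transform on the new batch")

with DAG(
    dag_id="cdc_etl_example",
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="ingest", python_callable=ingest)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t1 >> t2  # transform runs only after ingest succeeds
```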
Posted 3 months ago
3 - 6 years
12 - 14 Lacs
Indore, Noida
Work from Office
We are looking for a GCP Data Engineer and SQL Programmer with good working experience in PostgreSQL and PL/SQL programming and the following technical skills: PL/SQL and PostgreSQL programming - ability to write complex SQL queries and stored procedures. Migration - working experience in migrating database structure and data from Oracle to PostgreSQL, preferably on GCP AlloyDB or Cloud SQL. Working experience on Cloud SQL/AlloyDB. Working experience tuning autovacuum in PostgreSQL. Working experience tuning AlloyDB/PostgreSQL for better performance. Working experience on BigQuery, Firestore, Memorystore, Spanner and bare-metal setup for PostgreSQL. Ability to tune the AlloyDB/Cloud SQL database for better performance. Experience with the GCP Database Migration Service. Working experience on MongoDB. Working experience on Cloud Dataflow. Working experience on database disaster recovery. Working experience on database job scheduling. Working experience on database logging techniques. Knowledge of OLTP and OLAP. Desirable: GCP Database Engineer Certification. Other Skills: Out-of-the-box thinking. Problem-solving skills. Ability to make tech choices (build vs. buy). Performance management (profiling, benchmarking, testing, fixing). Enterprise architecture. Project management/delivery capability/quality mindset. Scope management. Planning (phasing, critical path, risk identification). Schedule management/estimations. Leadership skills. Other soft skills: learning ability, innovative/initiative. Responsibilities: Develop, construct, test, and maintain data architectures. Migrate enterprise Oracle databases from on-prem to the GCP cloud. Tune autovacuum in PostgreSQL. Tune AlloyDB/PostgreSQL for better performance. Performance tuning of PostgreSQL stored procedure code and queries. Convert Oracle stored procedures and queries to PostgreSQL stored procedures and queries. Create a hybrid data store with data warehouse and NoSQL GCP solutions along with PostgreSQL. Migrate Oracle table data from Oracle to AlloyDB. Lead the database team.
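As a small illustration of the per-table autovacuum tuning the listing mentions, here is a sketch using psycopg2 against a PostgreSQL/AlloyDB instance; the connection details, table name, and thresholds are placeholders:

```python
# Minimal sketch: lower per-table autovacuum thresholds so a hot table
# is vacuumed after ~2% dead tuples instead of the 20% default.
# Connection details and the table name are illustrative placeholders.
import psycopg2

conn = psycopg2.connect("host=10.0.0.5 dbname=app user=dba password=changeme")
conn.autocommit = True

with conn.cursor() as cur:
    cur.execute("""
        ALTER TABLE orders SET (
            autovacuum_vacuum_scale_factor = 0.02,
            autovacuum_analyze_scale_factor = 0.01
        );
    """)
conn.close()
```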
Posted 3 months ago
4 - 6 years
1 - 5 Lacs
Pune
Work from Office
4 - 6 years of experience as a Python Developer with a strong understanding of Python programming concepts and best practices. Bachelor's Degree/B.Tech/B.E. in Computer Science or a related discipline. Design, develop, and maintain robust and scalable Python-based applications, tools, and frameworks that integrate machine learning models and algorithms. Demonstrated expertise in developing machine learning solutions, including feature selection, model training, and evaluation. Proficiency in data manipulation libraries (e.g., Pandas, NumPy) and machine learning frameworks (e.g., Scikit-learn, TensorFlow, PyTorch, Keras). Experience with web frameworks like Django or Flask. Contribute to the architecture and design of data-driven solutions, ensuring they meet both functional and non-functional requirements. Experience with databases such as MS SQL Server, PostgreSQL or MySQL. Solid knowledge of OLTP and OLAP concepts. Experience with CI/CD tooling (at least Git and Jenkins). Experience with the Agile/Scrum/Kanban way of working. Self-motivated and hard-working. Knowledge of performance testing frameworks, including Mocha and Jest. Knowledge of RESTful APIs. Understanding of AWS and Azure cloud services. Experience with chatbot and NLU/NLP based applications is required.
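For illustration, a minimal scikit-learn sketch of the feature-selection, training, and evaluation loop the listing asks for; the bundled dataset and model choices are arbitrary examples, not requirements from the posting:

```python
# Minimal sketch: feature selection + training + evaluation in one
# scikit-learn pipeline, on a bundled dataset so it runs as-is.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42)

model = Pipeline([
    ("select", SelectKBest(f_classif, k=10)),   # simple feature selection
    ("clf", LogisticRegression(max_iter=5000)),
])
model.fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, model.predict(X_te)))
```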
Posted 3 months ago
10 - 12 years
16 - 21 Lacs
Chennai, Mumbai, Bengaluru
Work from Office
ROLE SUMMARY Pfizer's purpose, breakthroughs that change patients' lives, is rooted in being a science-driven and patient-focused company. Digital, AI, data, and analytics are central to driving innovation at Pfizer. The Senior Manager, Product Management & Delivery will be on a team responsible for maintaining the overall vision for the different data products supporting our Enabling Functions businesses. This Senior Manager, Product Management & Delivery position will work closely with our business leads and client partners to determine opportunities and set roadmaps that drive measurable innovation and impact. This position will work closely with the Artificial Intelligence, Data & Analytics (AIDA) EF Product & Delivery Lead on developing standard portfolio processes, aligning roadmap priorities with Solution Delivery Managers, and guiding Product Analysts. ROLE RESPONSIBILITIES Stakeholder Engagement: Collaborate with prospective users and clients to understand and anticipate their needs and translate them into product requirements. Regularly communicate with stakeholders to gather feedback and ensure their needs are being met. Effectively communicate with AIDA EF leadership when required to manage stakeholder expectations. Monitor and evaluate product progress at each stage of the process, updating Objectives & Key Results (OKRs) in the Agile Measurement Framework. Serve as a single point of contact and accountability to the different businesses you serve. Shape thinking on how technology solutions support the goals of the business. When required, identify, manage, and bring to resolution all project risks/issues. Escalate major project risks to scope, budget, or timeline to appropriate escalation points in a timely manner. Vision and Strategy: Develop and communicate the product vision and strategy to stakeholders and the team. Ensure alignment with business goals and customer needs. Create and maintain a product roadmap based on this vision. Product Development & Continuous Improvement: Drive the execution of all processes in the product lifecycle, including product discovery, planning, requirements and roadmap development, and product launch. Create product strategy documents that describe business cases, high-level use cases, technical requirements, and value propositions. Identify areas for improvement and implement changes to enhance the product and the development process. Establish best practices and ways of working that measure team efficiency and manage capacity. Actively monitor and evaluate emerging technologies with an eye on how they could be leveraged to improve your digital products and provide additional benefit to the customer. Interpersonal and Administration: Strong interpersonal, communication, influencing, analytical and problem-solving skills. Ability to compose and present material to communicate difficult or complex concepts and gain consensus. Partner with the other Digital organizations to align on broader strategies and best practices and ensure consistent use. Manage the product budget and expenses, ensuring that the product is financially viable and deploying resources appropriately to meet customer needs. Build, grow and support Solution Delivery Managers and Product Analysts through coaching, mentoring and giving feedback. Basic Qualifications: BSc degree is required. 6+ years of experience. Proven success as a Data Product Manager/Owner. 5+ years of management experience leading project teams and technology delivery functions.
Experience in a product operating model with a strong understanding of Agile methodologies, excellent communication and leadership skills, the ability to manage multiple priorities, and strong problem-solving skills. Demonstrated experience owning and managing a significant organizational or project budget. Demonstrated effective collaboration with teams across an organization to drive processes and operations. Demonstrated ability to quickly gain credibility with senior management and effectively influence and collaborate with others. Nice to have: An advanced degree. Knowledge of the pharmaceutical industry. Working knowledge of finance, quality assurance, legal and compliance business functions. Demonstrated experience in scoping, defining and managing projects, including determining project viability, developing rigorous project estimates, developing and establishing a project plan and budget, defining technology solutions, developing RFPs, evaluating vendor responses, negotiating SOWs and identifying resource requirements. Knowledge of visualization tools such as report builders, ad-hoc queries, OLAP, dashboards (Tableau, Spotfire, Power BI, etc.). Physical data architecture options such as enterprise data warehouses, data federation, hub & spoke architectures, and independent data marts. Data integration tools and techniques such as ETL (Extract Transform Load) and CDC (Change Data Capture). Relevant related areas such as data quality, master data management, metadata management, collaboration and business process management. Please apply by sending your CV and a motivational letter in English. Work Location Assignment: Hybrid. Purpose: Breakthroughs that change patients' lives... At Pfizer we are a patient-centric company, guided by our four values: courage, joy, equity and excellence. Our breakthrough culture lends itself to our dedication to transforming millions of lives. Digital Transformation Strategy: One bold way we are achieving our purpose is through our company-wide digital transformation strategy. We are leading the way in adopting new data, modelling and automated solutions to further digitize and accelerate drug discovery and development, with the aim of enhancing health outcomes and the patient experience. Flexibility: We aim to create a trusting, flexible workplace culture which encourages employees to achieve work-life harmony, attracts talent and enables everyone to be their best working self. Let's start the conversation! Equal Employment Opportunity: We believe that a diverse and inclusive workforce is crucial to building a successful business. As an employer, Pfizer is committed to celebrating this, in all its forms - allowing us to be as diverse as the patients and communities we serve. Together, we continue to build a culture that encourages, supports and empowers our employees. Disability Inclusion: Our mission is unleashing the power of all our people, and we are proud to be a disability-inclusive employer, ensuring equal employment opportunities for all candidates. We encourage you to put your best self forward with the knowledge and trust that we will make any reasonable adjustments to support your application and future career. Your journey with Pfizer starts here! Information & Business Tech
Posted 3 months ago
7 - 9 years
25 - 95 Lacs
Bengaluru
Work from Office
Be responsible for the design, development, and maintenance of the Real-time Reporting platform at Zendesk. Work collaboratively with a small, focused, and self-organising team to deliver high-impact outcomes for Zendesk customers. Mentor and guide engineers, fostering a culture of continuous learning and technical excellence. Translate business needs into technical requirements by engaging effectively with business owners and stakeholders. Lead technical initiatives, ensuring the delivery of scalable and robust solutions.
What you bring to the role
7+ years of experience working on high-scale applications and data processing pipelines. Proven experience in both software engineering and data engineering, with a focus on delivering large-scale, distributed, high-quality applications. Hands-on experience with real-time data pipelines, stream processing, and OLAP databases (e.g. ClickHouse, Snowflake, Kafka, ElasticSearch); a minimal ingestion sketch follows this listing. Demonstrated ability to lead engineering projects and mentor junior engineers. Ability to design scalable and robust data architectures for real-time analytics. Strong communication skills, both written and verbal, and the ability to collaborate with teams across multiple time zones globally. Strong proficiency in Java is preferred, along with hands-on experience with technologies like AWS, ClickHouse, Kafka, Docker, and Kubernetes.
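To ground the stream-processing requirement, here is a minimal sketch of a Java Kafka consumer of the kind a real-time reporting pipeline might start from. The broker address, the `ticket-events` topic name, and printing records instead of writing batches to an OLAP store are illustrative assumptions, not Zendesk's actual implementation.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class ReportingEventConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker
        props.put("group.id", "realtime-reporting");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("ticket-events")); // hypothetical topic
            while (true) {
                // Poll a micro-batch of events; a real pipeline would buffer these
                // and bulk-insert them into an OLAP store such as ClickHouse.
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("offset=%d key=%s value=%s%n",
                            record.offset(), record.key(), record.value());
                }
            }
        }
    }
}
```

In practice the interesting work lives in the batching, deduplication, and schema mapping between the event stream and the OLAP tables; the consumer loop itself stays this simple.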
Posted 3 months ago
7 - 9 years
25 - 95 Lacs
Bengaluru
Work from Office
Lead a team owning and building critical services for the Real-time Reporting and Analytics vision at Zendesk. Work with leaders across Engineering and Product Management to define how the engineering teams align with our long-term roadmap. Define, implement, and deliver long- and short-term business and technology initiatives collaboratively to optimize development and security, increase team efficiency, and support business objectives. Own the delivery of services, products, and projects across multiple teams with global dependencies. Provide mentorship, guidance, and technical direction to the team for their growth, while improving software delivery and ensuring team happiness.
What you bring to the role
7+ years in software engineering, with 3+ years of experience in a technical leadership or management role. Good understanding of application development, microservices, and event-driven architectures. Proven experience with large-scale cloud architecture, preferably in a SaaS environment. Hands-on experience with real-time data pipelines, stream processing, and OLAP databases (e.g. ClickHouse, Snowflake, Kafka, ElasticSearch). Proficiency in programming languages such as Java, Ruby, or Python, with a focus on system optimization and performance. Strong communication and collaboration skills, with the ability to work cross-functionally and influence stakeholders globally. Proven prioritization skills, to focus your time and energy on areas of highest impact while balancing short-term and long-term growth. Dedication to continual self-development and to fostering a learning environment within the team.
Posted 3 months ago
3 - 5 years
5 - 7 Lacs
Hyderabad
Work from Office
We are seeking a Business Intelligence Developer to collaborate closely with our business leaders, product teams, and the data analytics team to create cross-discipline reports, dashboards, and business intelligence solutions. The Business Intelligence Developer will play a key role in managing business needs, understanding integrations with existing platforms, and translating them into insightful data visualizations and analytical solutions.
Minimum Education Required: Bachelor's degree in computer science or a closely related field
Minimum Experience: Minimum 3+ years working with business and product stakeholders and technical teams to provide insightful data analytics, reports, and dashboards. Minimum 3+ years of experience with business intelligence and data visualization using Tableau. Advanced knowledge of and experience with writing and optimizing SQL queries and working with relational databases such as Microsoft SQL Server/SSRS, MySQL, or Oracle (a small aggregation-query sketch follows this listing). Experience with OLAP and data lakes such as Databricks Delta Lake is a plus. Experience delivering key business metrics reporting/dashboards for Sales, Marketing, and Finance teams is preferred. Experience working in an agile environment such as Scrum.
Knowledge, Skills, and Abilities: Strong mathematical background and knowledge in at least one of the following fields: statistics, data mining, predictive modeling, operations research, econometrics, and/or information retrieval. Strong analytical skills, with a problem-solving aptitude and the ability to draw conclusions using various resources. Collaborate with stakeholders across various departments to develop data analytics, reporting, and dashboards that meet business needs and enable data-driven decision making. Take a proactive approach to defining new metrics, reports, and dashboards to improve visibility into the business in a high-growth environment. Develop analytics to understand product sales, marketing impact, and application usage metrics and KPIs for current and future products. Work closely with the Data Engineering team to define, drive, and store the key data points the team should leverage to enable analytical capabilities and implement reports. Own the design, development, and maintenance of ongoing performance metrics, reports, analyses, and dashboards that drive key business decisions. Must be effective at summarizing and communicating complex information, both in writing and verbally, and able to provide insights and recommendations to audiences with varying levels of technical understanding. Exceptional communication skills, with the ability to deftly negotiate, prioritize, influence decision-makers, and build consensus with teams. Meticulous attention to detail and a strong work ethic.
Job Responsibilities: Work with QA engineers to ensure the quality and reliability of all reports, extracts, and dashboards through continuous improvement. Collaborate with technical architects, product leads, the QA team, and the customer care team to drive new enhancements or fix bugs promptly. Document all project/work assignments with the required technical documentation to ensure continuity of the process.
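A typical dashboard feed behind a listing like this is an aggregate SQL query executed over JDBC. The sketch below is a minimal, assumed example: the `analytics` database, `sales` table, column names, and credentials are invented, and the `DATE_FORMAT` call assumes MySQL syntax.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class MonthlySalesReport {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:mysql://localhost:3306/analytics"; // placeholder connection
        // Aggregate a fact table by month and region, the shape most
        // revenue dashboards are built on.
        String sql = "SELECT DATE_FORMAT(order_date, '%Y-%m') AS month, region, "
                   + "SUM(amount) AS revenue, COUNT(*) AS orders "
                   + "FROM sales WHERE order_date >= ? "
                   + "GROUP BY month, region ORDER BY month, region";

        try (Connection conn = DriverManager.getConnection(url, "report_user", "secret");
             PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setString(1, "2024-01-01");
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.printf("%s %s revenue=%.2f orders=%d%n",
                            rs.getString("month"), rs.getString("region"),
                            rs.getDouble("revenue"), rs.getLong("orders"));
                }
            }
        }
    }
}
```

A Tableau or Power BI dashboard would normally run the same aggregation as its data source query; optimizing it (indexes on `order_date` and `region`, pre-aggregated summary tables) is where the "optimizing SQL queries" requirement earns its keep.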
Posted 3 months ago
12 - 15 years
50 - 55 Lacs
Bengaluru
Work from Office
As a Senior Manager on the Walmart Commerce Technologies Retail Media team, you will have the opportunity to work on commercializing mature enterprise systems and enabling them for retailers. You will build a new SaaS product and ensure engineering excellence. As an experienced engineer, you will have the chance to influence the technical strategy for this new product. This is a great opportunity for career development.
What you'll do: Manage a high-performing team of 5-10 engineers and engineering leads who work across multiple technology stacks, including Java and NodeJS. Drive design, development, implementation, and documentation. Establish best engineering and operational excellence practices based on product, engineering, and scrum metrics. Design, guide, and vet system designs to ensure scalable and robust architecture. Interact with Walmart engineering teams across geographies to leverage expertise and contribute to the tech community. Engage with Product and Business stakeholders to drive the agenda, set priorities, and deliver scalable and resilient products. Work closely with the Architects and cross-functional teams, following established practices to deliver solutions meeting QCD (Quality, Cost & Delivery) within the established architectural guidelines. Work with senior leadership to chart out the future roadmap of the products. Participate in hiring, mentoring, and building high-performing agile teams. Participate in organizational events like hackathons and demo days, and be a catalyst for their success. Interact closely on requirements with business owners and technical teams, both within India and across the globe. Set team priorities that support the broader organization's goals, prioritizing feature development aligned with the team's strategic objectives. Establish clear expectations with individuals based on their level and role, meeting regularly to discuss performance and provide feedback and coaching. Develop the mid-term technical vision and roadmap, evolving it to meet future requirements and infrastructure needs, and lead a team of engineers and tech leads to transform that vision into reality.
What you'll bring: A Bachelor's/Master's degree in Computer Science, engineering, or a related field, with a minimum of 12+ years of experience in software development and at least 3+ years of experience managing engineering teams. Prior experience managing high-performing agile technology teams. Hands-on experience building Java- or NodeJS-based backend systems is a must; experience working on cloud-based solutions is desirable. A good understanding of CS fundamentals, microservices, data structures, algorithms, and problem solving. Exposure to CI/CD development environments and tools including, but not limited to, Git, Maven, and Jenkins. Strength in writing modular and testable code and test cases (unit, functional, and integration) using frameworks like JUnit, Mockito, and MockMvc; a small unit-test sketch follows this listing. Experience with microservices architecture and a good understanding of distributed concepts, common design principles, design patterns, and cloud-native development concepts. Hands-on experience with Spring Boot, concurrency, garbage collection, RESTful services, data caching services, and ORM tools. Experience working with relational databases and writing complex OLAP, OLTP, and SQL queries. Experience working with NoSQL databases like Cosmos DB. Experience working with caching technologies like Redis, Memcached, or other related systems.
Good knowledge of pub-sub systems like Kafka. Experience using monitoring and alerting tools like Prometheus, Splunk, and other related systems, and excellence in debugging and troubleshooting issues. Exposure to containerization tools like Docker, Helm, and Kubernetes. Knowledge of public cloud platforms like Azure and GCP will be an added advantage.
Benefits: Beyond our great compensation package, you can receive incentive awards for your performance. Other great perks include a host of best-in-class benefits: maternity and parental leave, PTO, health benefits, and much more.
Minimum Qualifications... Option 1: Bachelor's degree in computer science, computer engineering, computer information systems, software engineering, or a related area, and 5 years of experience in software engineering or a related area. Option 2: 7 years of experience in software engineering or a related area, plus 2 years of supervisory experience.
Preferred Qualifications... Master's degree in computer science, computer engineering, computer information systems, software engineering, or a related area, and 3 years of experience in software engineering or a related area.
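As an illustration of the modular, testable style this listing asks for, here is a minimal JUnit 5 + Mockito sketch. The `PriceRepository` and `PriceService` types and the tax calculation are invented for the example, not part of any Walmart codebase.

```java
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

import org.junit.jupiter.api.Test;

class PriceServiceTest {

    /** Hypothetical collaborator that would normally hit a database. */
    interface PriceRepository {
        double basePrice(String sku);
    }

    /** Unit under test: applies a tax rate on top of the stored base price. */
    static class PriceService {
        private final PriceRepository repo;

        PriceService(PriceRepository repo) {
            this.repo = repo;
        }

        double priceWithTax(String sku, double taxRate) {
            return repo.basePrice(sku) * (1 + taxRate);
        }
    }

    @Test
    void appliesTaxOnTopOfBasePrice() {
        // Mock the repository so the test runs without a real database.
        PriceRepository repo = mock(PriceRepository.class);
        when(repo.basePrice("SKU-1")).thenReturn(100.0);

        PriceService service = new PriceService(repo);

        assertEquals(118.0, service.priceWithTax("SKU-1", 0.18), 1e-9);
        verify(repo).basePrice("SKU-1"); // the collaborator was consulted
    }
}
```

The same pattern scales up: constructor-injected collaborators keep classes mockable, which is what makes the "modular and testable code" requirement practical rather than aspirational.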
Posted 3 months ago
5 - 7 years
45 - 50 Lacs
Hyderabad
Work from Office
We are seeking a strong and passionate data engineer with experience in large-scale system implementation, with a focus on complex data pipelines. The candidate must be able to design and drive large projects from inception to production. The right person will work with stakeholders, analysts, and scientists to gather requirements and translate them into a data engineering roadmap, and must be a great communicator, a team player, and a technical powerhouse.
---- What You Will Do ----
Collaborate with engineering/product/analyst teams across tech sites to collectively accomplish OKRs and take Uber forward. Enrich data layers to effectively deal with the next generation of products that result from Uber's big bold bets. Design and build data pipelines to schedule and orchestrate a variety of tasks, such as extracting, cleansing, transforming, enriching, and loading data as per business needs; a minimal Spark sketch of such a transformation follows this listing.
---- What You Will Need ----
Strong SQL skills. Strong grounding in data warehousing and data modelling concepts. Hands-on experience with the Hadoop tech stack: HDFS, Hive, Oozie, Airflow, MapReduce, Spark. Programming languages such as Python, Java, Scala, etc. Experience building ETL data pipelines. Performance troubleshooting and tuning. Experience with DW (Data Warehouse), BI (Business Intelligence), or OLAP tools like Anaplan, TM1, Hyperion, etc. Experience building high-quality, end-to-end data solutions in an agile environment, from requirements to production. Self-motivation and a passion for bringing efficiency into the system through optimizations. The ability to raise the bar for other engineers by proposing and driving innovative ideas. Experience mentoring junior team members technically and, if need be, acting as a leader to drive big efforts in collaboration with other engineers or team members.
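For a concrete picture of the extract-transform-load work described above, here is a minimal Spark batch job in Java. The HDFS paths, column names, and the completed-trips aggregation are assumptions made up for the sketch; the job is meant to be launched with spark-submit, which supplies the master URL.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

import static org.apache.spark.sql.functions.col;
import static org.apache.spark.sql.functions.count;
import static org.apache.spark.sql.functions.sum;
import static org.apache.spark.sql.functions.to_date;

public class DailyTripAggregation {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("daily-trip-aggregation")
                .getOrCreate();

        // Extract: read the raw fact data (hypothetical path and schema).
        Dataset<Row> trips = spark.read().parquet("hdfs:///warehouse/raw/trips");

        // Transform: keep completed trips and roll them up per city per day.
        Dataset<Row> daily = trips
                .filter(col("status").equalTo("completed"))
                .groupBy(col("city"), to_date(col("ended_at")).as("trip_date"))
                .agg(count("*").as("trips"), sum("fare").as("gross_fare"));

        // Load: write a partitioned summary table for downstream BI queries.
        daily.write()
                .mode("overwrite")
                .partitionBy("trip_date")
                .parquet("hdfs:///warehouse/agg/daily_trips");

        spark.stop();
    }
}
```

A scheduler such as Airflow or Oozie would run this job daily and handle retries and dependencies; the job itself stays a pure, idempotent transformation, which is what makes backfills safe.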
Posted 3 months ago
With the increasing demand for data analysis and business intelligence, OLAP (Online Analytical Processing) jobs have become popular in India. OLAP professionals are responsible for designing, building, and maintaining OLAP databases to support data analysis and reporting activities for organizations. If you are looking to pursue a career in OLAP in India, here is a comprehensive guide to help you navigate the job market.
Cities such as Bengaluru and Hyderabad, which account for most of the listings above, are known for having a high concentration of IT companies and organizations that require OLAP professionals.
The average salary range for OLAP professionals in India varies based on experience levels. Entry-level professionals can expect to earn around INR 4-6 lakhs per annum, while experienced professionals with 5+ years of experience can earn upwards of INR 12 lakhs per annum.
Career progression in OLAP typically follows a trajectory from Junior Developer to Senior Developer, and then to a Tech Lead role. As professionals gain experience and expertise in OLAP technologies, they may also explore roles such as Data Analyst, Business Intelligence Developer, or Database Administrator.
In addition to OLAP expertise, professionals in this field are often expected to have knowledge of SQL, data modeling, ETL (Extract, Transform, Load) processes, data warehousing concepts, and data visualization tools such as Tableau or Power BI.
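Since OLAP itself is the common thread of these skills, it helps to see what an OLAP-style query looks like. The sketch below shows a standard SQL ROLLUP, which produces subtotals per (fiscal_year, region), per fiscal_year, and a grand total in a single pass; the table and column names are invented, and MySQL spells the clause as GROUP BY ... WITH ROLLUP instead.

```java
public class OlapRollupExample {
    // GROUP BY ROLLUP walks up the dimension hierarchy: detail rows,
    // per-year subtotals (region IS NULL), then a grand total (both NULL).
    static final String ROLLUP_QUERY = """
        SELECT fiscal_year, region, SUM(sales_amount) AS total_sales
        FROM sales_fact
        GROUP BY ROLLUP (fiscal_year, region)
        ORDER BY fiscal_year, region
        """;

    public static void main(String[] args) {
        // Execute via JDBC against Oracle, SQL Server, or PostgreSQL,
        // all of which support the ROLLUP grouping operation.
        System.out.println(ROLLUP_QUERY);
    }
}
```

Being able to explain what the NULL rows in a ROLLUP result mean is exactly the kind of question an OLAP interview tends to open with.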
As you prepare for OLAP job interviews in India, make sure to hone your technical skills, brush up on industry trends, and showcase your problem-solving abilities. With the right preparation and confidence, you can successfully land a rewarding career in OLAP in India. Good luck!