Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
0.0 - 3.0 years
0 Lacs
Hyderabad, Telangana
On-site
Job Title: QlikSense Developer
Location: Hyderabad / Pune (Immediate Joiners Only)

Job Responsibilities:
- Design, build, and maintain QlikSense visualization dashboards
- Troubleshoot and optimize QlikSense dashboards
- Administer the QlikSense environment to ensure smooth operation

What We Seek:
- Excellent communication and interpersonal skills
- Strong problem-solving abilities
- 3-4+ years of hands-on experience in QlikSense development
- Proven experience with SQL
- Familiarity with Big Data technologies (e.g., HBase, Hadoop, Hive, Cassandra) and NoSQL databases
- Knowledge of OLAP technologies such as TM1, SSAS, or Hyperion Essbase
- Up to date with the latest BI tools and technologies
- Good understanding of Agile methodologies and DevOps principles

Nice to Have:
- Experience with ETL technologies such as Pentaho and CTRL-M
- Experience with scripting languages (e.g., PowerShell, JavaScript)
- Exposure to financial services, preferably Capital Markets

Job Type: Full-time
Pay: ₹1,000,000.00 - ₹1,500,000.00 per year
Benefits: Provident Fund
Schedule: Day shift, Monday to Friday

Application Question(s):
- Our requirement is for immediate joiners or candidates who can join within 15 days. Are you available to join within this timeline?
- Our budget for this role is up to ₹15 LPA. Are you comfortable with this compensation range?

Experience:
- QlikSense Developer: 4 years (Required)
- SQL Proficiency: 3 years (Required)

Location: Hyderabad, Telangana (Required)
Work Location: In person
Posted 4 weeks ago
0 years
0 Lacs
Vadodara, Gujarat, India
On-site
Job Title: Analytics Head / Data Model Architect

Job Summary:
We are seeking a seasoned Analytics Head / Data Model Architect to lead our data strategy, design scalable data models, and drive analytical innovation across the organization. This role combines leadership in data science and business analytics with deep technical expertise in data architecture and modeling. The ideal candidate will be a strategic thinker, technical expert, and effective communicator capable of aligning data initiatives with business objectives.

Key Responsibilities:

Leadership & Strategy
- Lead and manage a team of data scientists, analysts, and data architects.
- Define and drive the enterprise analytics strategy aligned with business goals.
- Collaborate with executive leadership to identify data-driven growth opportunities.

Data Architecture & Modeling
- Design and implement robust, scalable, and high-performance data models (OLAP/OLTP, dimensional, relational, NoSQL).
- Develop enterprise data architecture standards, policies, and best practices.
- Oversee data governance, data quality, and metadata management initiatives.

Advanced Analytics & Insights
- Build advanced analytics solutions including predictive modeling, statistical analysis, and machine learning frameworks.
- Translate complex data into actionable insights for stakeholders across departments.
- Evaluate and implement modern BI tools and platforms (e.g., Power BI, Tableau, Looker).

Collaboration & Integration
- Partner with data engineering teams to ensure efficient ETL/ELT pipelines and data lake/warehouse infrastructure.
- Work closely with business units to understand needs and deliver customized data solutions.
- Support data privacy, security, and compliance initiatives (e.g., GDPR, HIPAA, SOC 2).

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, Statistics, or a related field. PhD is a plus.
- 10+ years of experience in analytics, data architecture, or related roles.
- Strong knowledge of data modeling techniques (3NF, Star Schema, Snowflake, Data Vault, etc.).
- Expertise in SQL, Python, R, and at least one cloud platform (AWS, Azure, GCP).
- Experience with modern data warehousing tools (Snowflake, BigQuery, Redshift) and orchestration (Airflow, dbt).
- Proven leadership and team-building skills.

Preferred Skills:
- Experience with AI/ML model deployment in production.
- Familiarity with data mesh, data fabric, or modern data stack concepts.
- Knowledge of industry-specific data standards (e.g., HL7 for healthcare, ACORD for insurance).
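The star-schema modeling this posting asks for can be illustrated with a minimal sketch: a fact table whose rows reference dimension tables by surrogate key, aggregated by a dimension attribute at query time. All table names, keys, and values below are invented for illustration; a real warehouse would do this in SQL over actual tables.

```python
# Minimal star-schema sketch: one fact table plus two dimension tables,
# joined and aggregated in plain Python (all names are illustrative).
dim_product = {1: {"name": "Widget", "category": "Hardware"},
               2: {"name": "Gadget", "category": "Hardware"},
               3: {"name": "Service", "category": "Support"}}
dim_date = {20240101: {"year": 2024, "quarter": "Q1"},
            20240401: {"year": 2024, "quarter": "Q2"}}

# Fact rows reference dimensions by surrogate key, as in a warehouse.
fact_sales = [
    {"product_id": 1, "date_id": 20240101, "amount": 100.0},
    {"product_id": 2, "date_id": 20240101, "amount": 250.0},
    {"product_id": 3, "date_id": 20240401, "amount": 75.0},
]

def sales_by(dim_attr):
    """Aggregate fact amounts by a product-dimension attribute."""
    totals = {}
    for row in fact_sales:
        key = dim_product[row["product_id"]][dim_attr]
        totals[key] = totals.get(key, 0.0) + row["amount"]
    return totals

print(sales_by("category"))  # totals grouped by the category attribute
```

The same join-then-aggregate shape is what a BI tool generates against a star schema; a Snowflake schema simply normalizes the dimension tables further.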
Posted 4 weeks ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Greetings from Tata Consultancy Services!

Join the Walk-in Drive on 24th May 2025 and pave your path to value with the TCS AI Cloud Team.

We are hiring for the below skills.

Experience: 4 to 12 years

Azure Data Engineer
Required: Implementation and operations of OLTP, OLAP, and DW technologies such as Azure SQL and Azure SQL DW
Posted 1 month ago
2 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY Consulting - Data Analytics Testing - Staff

The opportunity
As a Data Analytics Test Engineer, you will be responsible for testing Business Intelligence & Data Warehousing solutions on both on-premise and cloud platforms, and for ensuring the quality of deliverables. You will work closely with the Test Lead for the projects under test. Testing proficiency in ETL, data warehousing, and Business Intelligence is required for this position. Experience in testing Big Data/unstructured data using the Hadoop/Spark framework, cloud platform knowledge in AWS or Azure, and knowledge of predictive analytics, Machine Learning and Artificial Intelligence are added advantages.

Skills And Attributes For Success
- Delivery of testing needs for BI & DWH projects.
- Ability to communicate effectively with team members across geographies.
- Perform unstructured data / big data testing on both on-premise and cloud platforms.
- Thorough understanding of requirements and ability to provide feedback on them.
- Develop a Test Strategy for BI & DWH projects covering ETL testing, reports testing (front-end and back-end), Integration Testing and UAT as needed.
- Provide inputs for test planning aligned with the Test Strategy.
- Perform test case design and identify opportunities for test automation.
- Develop test cases, both manual and automation scripts, as required.
- Ensure test readiness (test environment, test data, tool licenses, etc.).
- Perform test execution and report progress.
- Report defects and liaise with development and other relevant teams for defect resolution.
- Prepare the Test Report and provide inputs to the Test Lead for test sign-off/closure.
- Provide support in project meetings/calls with the client for status reporting.
- Provide inputs on test metrics to the Test Lead; support analysis of metric trends and implement improvement actions as necessary.
- Handle changes and conduct Regression Testing.
- Generate Test Summary Reports.
- Coordinate test team members and the development team.
- Interact with client-side stakeholders to solve issues and update status.
- Actively take part in providing Analytics and Advanced Analytics testing trainings in the company.

To qualify for the role, you must have
- BE/BTech/MCA/M.Sc.
- Overall 2 to 9 years of experience in testing Data Warehousing / Business Intelligence solutions, with a minimum of 2 years of experience in testing BI & DWH technologies and analytics applications.
- Experience in Big Data testing with the Hadoop/Spark framework and exposure to predictive analytics testing.
- Very good understanding of business intelligence concepts, architecture and building blocks across ETL processing, data warehouses, dashboards and analytics.
- Experience in cloud (AWS/Azure) infrastructure testing is desirable.
- Knowledge of Python data processing is desirable.
- Testing experience in more than one of these areas: Data Quality, ETL, OLAP, Reports.
- Good working experience with SQL Server or Oracle databases and proficiency with SQL scripting.
- Experience in back-end testing of enterprise applications/systems built on different platforms, including Microsoft .NET and SharePoint technologies.
- Experience in ETL testing using commercial ETL tools is desirable.
- Knowledge/experience in SSRS (SQL Server Reporting Services), Spotfire and SSIS is desirable.
- Experience/knowledge in data transformation projects, database design concepts and white-box testing is desirable.
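One core ETL-testing activity the role describes, source-to-target reconciliation, can be sketched in a few lines of Python. This is a simplified illustration with invented sample rows; a real project would run such checks against actual source and target databases.

```python
# Sketch of a source-to-target ETL reconciliation check: compare row
# counts and a column checksum between "source" and "target" extracts.
def reconcile(source_rows, target_rows, key):
    issues = []
    if len(source_rows) != len(target_rows):
        issues.append(f"row count mismatch: {len(source_rows)} vs {len(target_rows)}")
    src_sum = sum(r[key] for r in source_rows)
    tgt_sum = sum(r[key] for r in target_rows)
    if src_sum != tgt_sum:
        issues.append(f"checksum mismatch on '{key}': {src_sum} vs {tgt_sum}")
    return issues

# Toy extracts: the target has dropped 5 units of amount in transit.
source = [{"id": 1, "amount": 10}, {"id": 2, "amount": 20}]
target = [{"id": 1, "amount": 10}, {"id": 2, "amount": 25}]
print(reconcile(source, target, "amount"))  # flags the checksum mismatch
```

Count and checksum comparisons like this are the back-end half of "front end and backend testing": the report layer is then validated against the same reconciled target.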
Ideally, you'll also have
- Ability to contribute as an individual contributor and, when required, lead a small team.
- Ability to create a Test Strategy and Test Plan for testing BI & DWH applications/solutions that are moderate to complex / high-risk systems.
- Ability to design test cases and test data, and perform test execution and reporting.
- Ability to perform test management for small projects as and when required.
- Participation in defect triaging, tracking defects to resolution/conclusion.
- Experience/exposure to test automation; scripting experience in Perl and shell is desirable.
- Experience with test management and defect management tools, preferably HP ALM.
- Good communication skills (both written and verbal).
- Good understanding of the SDLC, and the test process in particular.
- Good analytical, problem-solving and troubleshooting skills.
- Good understanding of the project life cycle and test life cycle.
- Exposure to CMMi and process improvement frameworks is a plus.
- Excellent communication skills, with the ability to articulate concisely and clearly.
- Readiness to take on an individual contributor as well as a team leader role.

What Working At EY Offers
At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange.
Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that's right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 month ago
0 years
0 Lacs
India
Remote
Job Title: Founding Fullstack Developer

About Oculon AI
At Oculon AI, we're revolutionizing organizational planning with our next-gen data modeling and business intelligence web application. Our mission is to centralize and simplify planning processes that are currently scattered across outdated tools. Our Engineering team is at the forefront of this transformation, leveraging cutting-edge technologies in Data Analytics and AI to build a robust system. Our platform efficiently processes large multi-dimensional data, predicts metrics and scenarios using state-of-the-art forecasting ML models, and delivers automated insights through a clean and intuitive UI. We are building advanced AI Agents to enhance user experience and productivity across planning applications. These AI Agents answer questions, automate model and dashboard building, suggest actions, and execute scheduled tasks, providing a seamless, intelligent planning experience.

Oculon AI is redefining the world of Financial Planning & Analysis (FP&A) in an established software category with a multi-billion-dollar TAM but no clear winner. We're here to change that. Join us at Oculon AI and be part of a team that's setting new standards in creating the go-to platform for business planning.

Job Description
We are seeking an experienced, visionary, and hands-on Senior Fullstack Developer to be a cornerstone of our engineering team. In this pivotal role, you will take ownership of significant parts of our platform, architecting and developing solutions across the entire stack, from intuitive user interfaces dealing with large datasets to robust backend systems and innovative AI integrations. You will collaborate closely with the founding team, product, and design to shape our technology roadmap, implement high-performance, scalable solutions, and drive best practices. Your extensive experience will be crucial in making critical technical decisions and defining the user and developer experience of our product.
If you are passionate about building transformative products from the ground up and leveraging cutting-edge technology, this is the role for you.

Compensation: For the first 5 months there will be a payout of Rs. 50,000 per month, and it will be a full-time role. After that, we will raise the compensation to market value, and you will be eligible for a much higher equity payout for your contribution as a founding employee at Oculon.
Location: Remote
Employment Type: Full-Time
Application Link: https://tally.so/r/3j79r4

Responsibilities
- Architect and Develop: Lead the architecture, development, and maintenance of components of our full-stack web application, including complex frontend components for data modeling (interactive spreadsheets/data-grids, dashboards), sophisticated AI-powered features (chatbot UIs, automated insights), and scalable backend services.
- Full-Stack Implementation: Design, build, and maintain efficient, reusable, and reliable code across frontend (React.js), backend (Python), and database layers.
- AI Integration: Collaborate on the design and integration of Generative AI and LLM-powered features, ensuring seamless interaction between AI models and the user-facing application.
- Data Systems Design: Contribute to the design and optimization of backend systems for handling large multi-dimensional and time-series data, ensuring efficient querying and data processing.
- Mentor and Guide: Provide technical leadership and mentorship to future team members, fostering a culture of code quality, innovation, and continuous improvement.
- Optimize Performance: Drive performance optimization initiatives across the stack, ensuring the application meets high standards of speed, responsiveness, scalability, and data handling capacity.
- Technical Strategy: Play a key role in defining the overall technical strategy of the product, making critical decisions on technologies, frameworks, and architectural patterns for both frontend and backend systems.
- Cross-functional Collaboration: Work closely with product managers, designers, and other engineers (as the team grows) to define, design, and ship new features and improvements.
- Code Reviews and Quality Assurance: Champion and implement coding standards, lead code reviews, and ensure high-quality, maintainable code across the entire codebase.
- Stay Ahead of Trends: Continuously evaluate and propose new technologies, tools, and methodologies to keep our tech stack current, efficient, and cutting-edge.

Required Skills and Qualifications
- Experience: Minimum 4+ years of professional experience in full-stack development, with a proven track record of architecting and delivering complex, scalable web applications.
- Frontend Expertise: Advanced proficiency in React.js (including hooks, Context API, performance optimization) and a deep understanding of HTML5, CSS3, and modern JavaScript (ES6+).
- Backend Proficiency: Strong experience with backend development using Python, including API design (RESTful, GraphQL).
- Database Knowledge: Solid understanding of database technologies (SQL and/or NoSQL) and experience designing database schemas and writing efficient queries.
- Architectural Skills: Proven ability to design and implement scalable and maintainable full-stack architectures for data-intensive applications.
- State Management: Expert-level knowledge of frontend state management solutions (e.g., Redux, Zustand, Recoil) for complex applications.
- Testing and Quality Assurance: Strong background in full-stack testing methodologies, including unit, integration, and end-to-end testing (e.g., Jest, React Testing Library, Cypress, PyTest).
- Build and Deployment: Expertise in modern build tools (Webpack, Vite) and CI/CD pipelines.
- Version Control: Advanced Git skills, including branching strategies and workflow management.

Preferred Skills and Qualifications (in decreasing order of preference):
a. Advanced Frontend Architecture for Large Data & WebSockets: Demonstrable experience in frontend architecture design, with a strong focus on performance optimizations for handling, rendering, and interacting with very large datasets in the browser. Significant experience designing and consuming WebSocket APIs for real-time, bi-directional communication and data streaming.
b. Data Grid & Charting Libraries: Extensive hands-on experience with advanced data grid libraries like AG Grid, including customization, performance tuning for large datasets, and integrating complex features. Proficiency with charting and data visualization libraries such as Recharts, D3.js, or similar, for creating interactive and insightful dashboards.
c. AI Engineering & GenAI/LLM Expertise: Proven experience as an AI Engineer or in a similar role involving practical application of Generative AI and Large Language Models (LLMs). Deep understanding and hands-on experience with LLM engineering techniques, including Prompt Engineering, Tool Calling/Function Calling, ensuring Structured Outputs, Retrieval Augmented Generation (RAG), Multi-Candidate Prompts (MCPs), and familiarity with fine-tuning concepts.
d. Backend/Data Engineering for Complex Data Systems: Hands-on experience as a Backend or Data Engineer designing, building, and optimizing systems for querying, processing, and modeling large-scale multi-dimensional and time-series data. Strong familiarity with data warehousing concepts (e.g., Star schemas, Snowflake schemas), column-store databases (e.g., ClickHouse, Druid, or cloud equivalents), and calculation engines (e.g., spreadsheet formula engines, OLAP cube concepts).

Additional Valued Skills:
- Knowledge of WebGL or Canvas for high-performance rendering of large datasets.
- Experience with micro-frontend or micro-service architectures.
- Familiarity with server-side rendering (SSR) or static site generation (SSG) techniques (e.g., Next.js).
- In-depth knowledge of web performance optimization techniques and metrics (Core Web Vitals).

Benefits
- Flexible Hours: Flexible working hours and remote work options.
- Growth Opportunities: Unparalleled opportunities for professional growth, skill development, and career advancement as a foundational member of the team.
- Innovative Environment: A collaborative, stimulating, and innovative work environment where your ideas directly shape the product.

Why Join Us?
- Foundational Impact: As a founding developer, you will have a monumental impact on the technological direction, culture, and ultimate success of Oculon AI. Your contributions will be visible and critical.
- Greenfield Opportunity: Build from the ground up with modern technologies, without the constraints of legacy code.
- Solve Hard Problems: Tackle challenging and intellectually stimulating problems in data visualization, AI, and large-scale data management that have a real-world impact.
- Innovation at Speed: Work on exciting projects that push the boundaries and explore what's possible using the latest web development, data, and AI tools in a fast-paced environment.
- Culture of Excellence: Be part of a highly collaborative, inclusive, and high-performing team that values deep technical expertise, innovation, and user-centricity.

If you are a highly motivated, innovative, results-oriented, and versatile full-stack developer looking for a unique opportunity to build a category-defining product from its earliest days, we want to hear from you!

Application Link: https://tally.so/r/3j79r4
Apply now at the above link to join our team and be part of our exciting journey!
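The Retrieval Augmented Generation (RAG) experience this posting lists can be illustrated with a toy sketch of the retrieval step: rank documents by cosine similarity of embedding vectors, then splice the best match into a prompt. The vectors and document names here are hand-made stand-ins; a real system would obtain embeddings from an embedding model and send the prompt to an LLM.

```python
# Toy sketch of RAG retrieval: cosine-similarity ranking over pretend
# embedding vectors, with the best document spliced into a prompt.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Hypothetical planning documents with hand-made 3-dim "embeddings".
docs = {
    "revenue plan": [0.9, 0.1, 0.0],
    "headcount model": [0.1, 0.9, 0.2],
    "cash forecast": [0.8, 0.2, 0.1],
}

def retrieve(query_vec, k=1):
    """Return the k documents most similar to the query vector."""
    ranked = sorted(docs, key=lambda d: cosine(query_vec, docs[d]), reverse=True)
    return ranked[:k]

best = retrieve([1.0, 0.0, 0.0])[0]
prompt = f"Answer using this context: {best}"
print(prompt)
```

The generation step then sends `prompt` to the model; structured outputs and tool calling constrain what comes back, but the retrieval ranking above is the part that grounds the answer in the user's data.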
Posted 1 month ago
8 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY Strategy and Transactions (SaT) - DnA Assistant Director

EY's Data n' Analytics team is a multi-disciplinary technology team delivering client projects and solutions across Data Management, Visualization, Business Analytics and Automation. The assignments cover a wide range of countries and industry sectors.

The opportunity
We're looking for an Assistant Director - Data Engineering. The main objective of the role is to support cloud and on-prem platform analytics and data engineering projects initiated across engagement teams. The role will primarily involve conceptualizing, designing, developing, deploying and maintaining complex technology solutions which help EY solve business problems for its clients. This role works closely with technical architects, product and business subject matter experts (SMEs), back-end developers and other solution architects, and is also onshore-facing. This role will be instrumental in designing, developing, and evolving modern data warehousing solutions and data integration build-outs using cutting-edge tools and platforms for both on-prem and cloud architectures. In this role you will produce design specifications and documentation, develop data migration mappings and transformations for a modern data warehouse / data mart setup, and define robust ETL processing to collect and scrub both structured and unstructured data, providing self-serve (OLAP) capabilities in order to create impactful decision analytics reporting.
Your Key Responsibilities
- Evaluating and selecting data warehousing tools for business intelligence, data population, data management, metadata management and warehouse administration for both on-prem and cloud-based engagements
- Strong working knowledge across the technology stack including ETL, ELT, data analysis, metadata, data quality, audit and design
- Design, develop, and test in ETL tool environments (GUI/canvas-driven tools to create workflows)
- Experience in design documentation (data mapping, technical specifications, production support, data dictionaries, test cases, etc.)
- Provide technical leadership to a team of data warehouse and business intelligence developers
- Coordinate with other technology users to design and implement matters of data governance, data harvesting, cloud implementation strategy, privacy, and security
- Adhere to ETL/Data Warehouse development best practices
- Responsible for data orchestration, ingestion, ETL and reporting architecture for both on-prem and cloud (MS Azure/AWS/GCP)
- Assist the team with performance tuning for ETL and database processes

Skills And Attributes For Success
- 12-14 years of total experience, with 8+ years in the Data Warehousing / Business Intelligence field
- Solid hands-on 8+ years of professional experience with the creation and implementation of data warehouses on client engagements, and with creating enhancements to a data warehouse
- Strong knowledge of data architecture for staging and reporting schemas, data models and cutover strategies using industry-standard tools and technologies
- Architecture design and implementation experience with medium to complex on-prem to cloud migrations on any of the major cloud platforms (preferably AWS/Azure/GCP)
- 5+ years' experience in Azure database offerings (relational, NoSQL, data warehouse)
- 5+ years' hands-on experience in various Azure services preferred: Azure Data Factory, Kafka, Azure Data Explorer, Storage, Azure Data Lake, Azure Synapse Analytics, Azure Analysis Services and Databricks
- Minimum of 8 years of hands-on database design, modeling and integration experience with relational data sources, such as SQL Server databases, Oracle/MySQL, Azure SQL and Azure Synapse
- Knowledge and direct experience using business intelligence reporting tools (Power BI, Alteryx, OBIEE, Business Objects, Cognos, Tableau, MicroStrategy, SSAS Cubes, etc.)
- Strong creative instincts related to data analysis and visualization, and a keen curiosity to learn the business methodology, data model and user personas
- Strong understanding of BI and DWH best practices, analysis, visualization, and the latest trends
- Experience with the software development lifecycle (SDLC) and principles of product development such as installation, upgrade and namespace management
- Willingness to mentor team members
- Solid analytical, technical and problem-solving skills
- Excellent written and verbal communication skills
- Strong project and people management skills with experience in serving global clients

To qualify for the role, you must have
- Master's degree in Computer Science, Business Administration or equivalent work experience
- A fact-driven and analytically minded approach with excellent attention to detail
- Hands-on experience with data engineering tasks such as building analytical data records, and experience manipulating and analysing large volumes of data
- Relevant work experience of a minimum of 12 to 14 years in a Big 4 or technology/consulting setup
- Ability to help incubate new finance analytics products by executing pilot and proof-of-concept projects to establish capabilities and credibility with users and clients.
This may entail working either as an independent SME or as part of a larger team.

Ideally, you'll also have
- Ability to think strategically/end-to-end with a result-oriented mindset
- Ability to build rapport within the firm and win the trust of clients
- Willingness to travel extensively and to work on client sites / practice office locations
- Strong experience in SQL Server and MS Excel, plus at least one other SQL dialect, e.g. MS Access, PostgreSQL, Oracle PL/SQL or MySQL
- Strong skills in data structures and algorithms
- Experience of interfacing with databases such as Azure databases, SQL Server, Oracle, Teradata, etc.
- Preferred exposure to JSON, Cloud Foundry, Pivotal, MatLab, Spark, Greenplum, Cassandra, Amazon Web Services, Microsoft Azure, Google Cloud, Informatica, Angular JS, Python, etc.
- Experience with Snowflake

What We Look For
- A team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment
- An opportunity to be part of a market-leading, multi-disciplinary team of 1400+ professionals, in the only integrated global transaction business worldwide
- Opportunities to work with EY SaT practices globally with leading businesses across a range of industries

What Working At EY Offers
At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange.
Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that's right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 month ago
3 - 7 years
3 - 7 Lacs
Bengaluru
Work from Office
Hyperion Essbase Developer
Full-time
Department: Enterprise Applications

Company Description
Version 1 has celebrated over 26 years in Technology Services and continues to be trusted by global brands to deliver solutions that drive customer success. Version 1 has several strategic technology partners including Microsoft, AWS, Oracle, Red Hat, OutSystems and Snowflake. We're also an award-winning employer, reflecting how employees are at the heart of Version 1. We've been awarded: Innovation Partner of the Year Winner 2023 Oracle EMEA Partner Awards, Global Microsoft Modernising Applications Partner of the Year Award 2023, AWS Collaboration Partner of the Year - EMEA 2023, and Best Workplaces for Women by Great Place To Work in UK and Ireland 2023.

As a consultancy and service provider, Version 1 is a digital-first environment, and we do things differently. We're focused on our core values; using these, we've seen significant growth across our practices, and our Digital, Data and Cloud team is preparing for the next phase of expansion. This creates new opportunities for driven and skilled individuals to join one of the fastest-growing consultancies globally.

About The Role
A Hyperion Essbase Developer will be responsible for designing, developing, and maintaining Oracle Hyperion Planning and Essbase applications. This role requires expertise in multidimensional databases, OLAP technologies, and financial data modeling.

Technical Responsibilities

Essbase Development:
- Design and develop BSO (Block Storage Option) and ASO (Aggregate Storage Option) cubes.
- Implement calculation scripts, business rules, and member formulas.
- Optimize cube performance using indexing, partitioning, and aggregation techniques.

Hyperion Planning:
- Configure and maintain Hyperion Planning applications.
- Develop data forms, task lists, and workflow processes.

Data Integration & Automation:
- Implement ETL processes using FDMEE (Financial Data Quality Management Enterprise Edition).
- Develop SQL scripts for data extraction and transformation.
- Automate data loads, metadata updates, and security provisioning.

Security & Performance Optimization:
- Manage user roles, access permissions, and authentication.
- Optimize query performance using Essbase tuning techniques.
- Monitor system health and troubleshoot performance issues.

Qualifications
- Oracle Hyperion Planning & Essbase
- Essbase Calculation Scripts & Business Rules
- SQL & PL/SQL
- FDMEE & Data Integration
- EPM Automate
- Smart View & Financial Reporting
- Metadata Management & Security Configuration

Additional Information
At Version 1, we believe in providing our employees with a comprehensive benefits package that prioritises their well-being, professional growth, and financial stability. One of our standout advantages is the ability to work on a hybrid schedule along with business travel, allowing our employees to strike a balance between work and life. We also offer a range of tech-related benefits, including an innovative Tech Scheme to help keep our team members up to date with the latest technology. We prioritise the health and safety of our employees, providing private medical and life insurance coverage, as well as free eye tests and contributions towards glasses. Our team members can also stay ahead of the curve with incentivized certifications and accreditations, including AWS, Microsoft, Oracle, and Red Hat. Our employee-designed Profit Share scheme divides a portion of our company's profits each quarter amongst employees. We are dedicated to helping our employees reach their full potential, offering Pathways Career Development Quarterly, a programme designed to support professional growth.
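The cube aggregation at the heart of this role can be illustrated, very loosely, in plain Python: rolling leaf-level measures up a parent/child dimension hierarchy, the way an Essbase outline rolls members up to their parents. This is a conceptual sketch with invented member names, not Essbase calc-script syntax.

```python
# Conceptual sketch of an OLAP roll-up: aggregate leaf-level values up
# a parent/child dimension hierarchy (member names are illustrative).
parent_of = {"Jan": "Q1", "Feb": "Q1", "Mar": "Q1", "Q1": "Year"}
leaf_values = {"Jan": 100, "Feb": 120, "Mar": 90}

def rollup(parent_of, leaf_values):
    """Propagate each leaf value to every ancestor in the hierarchy."""
    totals = dict(leaf_values)
    for member, value in leaf_values.items():
        node = parent_of.get(member)
        while node is not None:          # walk up to every ancestor
            totals[node] = totals.get(node, 0) + value
            node = parent_of.get(node)
    return totals

print(rollup(parent_of, leaf_values))  # Q1 and Year each total 310
```

In BSO cubes this kind of roll-up is what a `CALC ALL`-style calculation performs across stored members, while ASO cubes aggregate on the fly; either way, the hierarchy walk above is the underlying idea.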
Posted 1 month ago
9 - 11 years
37 - 40 Lacs
Ahmedabad, Bengaluru, Mumbai (All Areas)
Work from Office
Dear Candidate,

We are hiring a Data Engineer to build scalable data pipelines and infrastructure to power analytics and machine learning. Ideal for those passionate about data integrity, automation, and performance.

Key Responsibilities:
Design ETL/ELT pipelines using tools like Airflow or dbt
Build data lakes and warehouses (BigQuery, Redshift, Snowflake)
Automate data quality checks and monitoring
Collaborate with analysts, data scientists, and backend teams
Optimize data flows for performance and cost

Required Skills & Qualifications:
Proficiency in SQL, Python, and distributed systems (e.g., Spark)
Experience with cloud data platforms (AWS, GCP, or Azure)
Strong understanding of data modeling and warehousing principles
Bonus: Experience with Kafka, Parquet/Avro, or real-time streaming

Soft Skills:
Strong troubleshooting and problem-solving skills.
Ability to work independently and in a team.
Excellent communication and documentation skills.

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager
Integra Technologies
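The "automate data quality checks" responsibility above can be sketched as a minimal, dependency-free Python routine. This is an illustration only; the column names and thresholds are hypothetical, not part of the posting:

```python
# Illustrative data-quality check: null-rate and duplicate-key validation.
# Column names ("order_id", "amount") and the 1% threshold are hypothetical.
import csv
import io

def check_quality(rows, required=("order_id", "amount"), max_null_rate=0.01):
    """Return a dict describing failed checks for a list of row dicts."""
    failures = {}
    if not rows:
        return {"empty_dataset": True}
    # Null-rate check for each required column
    for col in required:
        nulls = sum(1 for r in rows if not r.get(col))
        if nulls / len(rows) > max_null_rate:
            failures[f"null_rate:{col}"] = nulls / len(rows)
    # Duplicate-key check on the first required column
    keys = [r[required[0]] for r in rows if r.get(required[0])]
    if len(keys) != len(set(keys)):
        failures["duplicate_keys"] = len(keys) - len(set(keys))
    return failures

data = list(csv.DictReader(io.StringIO(
    "order_id,amount\n1,10\n2,\n2,30\n"
)))
print(check_quality(data))
```

In a real pipeline a check like this would typically run as a downstream Airflow task or dbt test after each load, failing the run before bad data reaches consumers.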
Posted 1 month ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Summary
As a member of Solutions Integration Engineering, you work cross-functionally to define and create engineered solutions/products which accelerate field adoption. We work closely with ISVs and with the startup ecosystem in the Virtualization, Cloud, AI/ML and Gen AI domains to build solutions that matter for customers, and you will work closely with the product owner and product lead on the company's current and future strategies related to these domains.

Job Requirements
Deliver features, including participating in the full software development lifecycle. Deliver reliable, innovative solutions and products. Participate in product design, development, verification, troubleshooting, and delivery of a system or major subsystems, including authoring project specifications. Work closely with cross-functional teams, including business stakeholders, to innovate and unlock new use-cases for our customers. Write unit and automated integration tests and project documentation.

Technical Skills
Understanding of the software development lifecycle. Proficiency in full stack development in the MERN Stack, Python, the container ecosystem, cloud, and modern ML frameworks. Knowledge of data storage and virtualization, including hypervisors such as VMware ESX and Linux KVM, and artificial intelligence concepts including server/storage architecture, batch/stream processing, data warehousing, data lakes, distributed filesystems, OLTP/OLAP databases and data pipelining tools, model training, inferencing, and RAG workflows. Unix-based operating system kernels and development environments like Linux or FreeBSD. A strong understanding of basic to complex concepts related to computer architecture, data structures, and new programming paradigms.

Education
Minimum 5 years of experience; must be hands-on with coding. B.E/B.Tech or M.S in Computer Science or related technical field.
At NetApp, we embrace a hybrid working environment designed to strengthen connection, collaboration, and culture for all employees. This means that most roles will have some level of in-office and/or in-person expectations, which will be shared during the recruitment process. Equal Opportunity Employer NetApp is firmly committed to Equal Employment Opportunity (EEO) and to compliance with all laws that prohibit employment discrimination based on age, race, color, gender, sexual orientation, gender identity, national origin, religion, disability or genetic information, pregnancy, and any protected classification. Why NetApp? We are all about helping customers turn challenges into business opportunity. It starts with bringing new thinking to age-old problems, like how to use data most effectively to run better - but also to innovate. We tailor our approach to the customer's unique needs with a combination of fresh thinking and proven approaches. We enable a healthy work-life balance. Our volunteer time off program is best in class, offering employees 40 hours of paid time off each year to volunteer with their favourite organizations. We provide comprehensive benefits, including health care, life and accident plans, emotional support resources for you and your family, legal services, and financial savings programs to help you plan for your future. We support professional and personal growth through educational assistance and provide access to various discounts and perks to enhance your overall quality of life. If you want to help us build knowledge and solve big problems, let's talk.
Posted 1 month ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Let’s be unstoppable together! At Circana, we are fueled by our passion for continuous learning and growth, we seek and share feedback freely, and we celebrate victories both big and small in an environment that is flexible and accommodating to our work and personal lives. We have a global commitment to diversity, equity, and inclusion as we believe in the undeniable strength that diversity brings to our business, employees, clients, and communities (with us you can always bring your full self to work). Join our inclusive, committed team to be a challenger, own outcomes, and stay curious together. Learn more at www.circana.com. What will you be doing? Circana is looking for a software development manager to lead a team of 60+ software developers. The person will be responsible for a portfolio of products for Circana's growth engine, the Media division. Some of our measurement products are developed in partnership with marketing giants like Google, Meta and Amazon, to name a few, and you may be interacting with these companies. Internally, you will be working with product management in the US and EU and Data Science teams in India, Greece and the US. We are looking for a leader who can drive excellence in execution and drive the team to deliver the best. The person should have a computer science and statistics background and 15+ years of working experience in software development in marketing and analytics. Our products are developed on Circana's Liquid Data platform, which has one of the fastest in-memory aggregation engines in the marketplace, a best-in-class ROLAP engine for complex analytics calculations, and an excellent visualization layer. Circana is a big data company with hundreds of petabytes of data under management on-premise and in the cloud. Our data layer is in Hadoop/Spark.
Job Responsibilities
Leading the team in requirement gathering and clarification, technical expertise, project delivery and release management
Deep dives into product development; design and architecture changes and suggestions; code review and approval for architecture/design changes
Understanding the platform and product; analyzing tools and technologies for betterment of the existing solution/product
Working closely with the team to make sure any technical or functional clarifications are addressed on a day-to-day basis
Working with onsite for the big picture on the product/solution and aligning the team with the overall initiative
Continuous R&D on new technologies, weighing the pros and cons for product/solution evolution
Planning delivery and release as per client commitment
Upskilling the team and tracking progress
Reporting project status and progress to senior management
Reviewing and enforcing coding standards; pushing for and adopting process improvements

Requirements
Minimum BE/B.Tech in Computer Science with Statistics
15-20 years of experience
Minimum 10 years of software development management experience
Knowledge of web development
Knowledge of cloud technologies
Knowledge of Hadoop/Spark/Data Warehousing
Knowledge of Relational Databases
Knowledge of OLAP technologies

Circana Behaviors
As well as the technical skills, experience and attributes that are required for the role, our shared behaviors sit at the core of our organization.
Therefore, we always look for people who can continuously champion these behaviors throughout the business within their day-to-day role:
Stay Curious: Being hungry to learn and grow, always asking the big questions
Seek Clarity: Embracing complexity to create clarity and inspire action
Own the Outcome: Being accountable for decisions and taking ownership of our choices
Center on the Client: Relentlessly adding value for our customers
Be a Challenger: Never complacent, always striving for continuous improvement
Champion Inclusivity: Fostering trust in relationships, engaging with empathy, respect and integrity
Commit to each other: Contributing to making Circana a great place to work for everyone
Location: This position can be located in Bangalore / Pune
Posted 1 month ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Data Engineering Lead: a strategic professional who stays abreast of developments within their own field and contributes to directional strategy by considering its application in their own job and the business. A recognized technical authority for an area within the business. Requires basic commercial awareness. There are typically multiple people within the business that provide the same level of subject matter expertise. Developed communication and diplomacy skills are required in order to guide, influence and convince others, in particular colleagues in other areas and occasional external customers. Significant impact on the area through complex deliverables. Provides advice and counsel related to the technology or operations of the business. Work impacts an entire area, which eventually affects the overall performance and effectiveness of the sub-function/job family. Responsibilities: Strategic Leadership: Define and execute the data engineering roadmap for Global Wealth Data, aligning with overall business objectives and technology strategy. This includes understanding the data needs of portfolio managers, investment advisors, and other stakeholders in the wealth management ecosystem. Team Management: Lead, mentor, and develop a high-performing, globally distributed team of data engineers, fostering a culture of collaboration, innovation, and continuous improvement. Architecture and Design: Oversee the design and implementation of robust and scalable data pipelines, data warehouses, and data lakes, ensuring data quality, integrity, and availability for global wealth data. This includes designing solutions for handling large volumes of structured and unstructured data from various sources. Technology Selection and Implementation: Evaluate and select appropriate technologies and tools for data engineering, staying abreast of industry best practices and emerging trends specific to wealth management data.
Performance Optimization: Continuously monitor and optimize data pipelines and infrastructure for performance, scalability, and cost-effectiveness, ensuring optimal access to global wealth data. Collaboration: Partner with business stakeholders, data scientists, portfolio managers, and other technology teams to understand data needs and deliver effective solutions that support investment strategies and client reporting. Data Governance: Implement and enforce data governance policies and procedures to ensure data quality, security, and compliance with relevant regulations, particularly around sensitive financial data. Qualifications: 10-15 years of hands-on experience in Hadoop, Scala, Java, Spark, Hive, Kafka, Impala, Unix scripting and other big data frameworks. 4+ years of experience with relational SQL and NoSQL databases: Oracle, MongoDB, HBase Strong proficiency in Python and Spark Java with knowledge of core Spark concepts (RDDs, DataFrames, Spark Streaming, etc.) and Scala and SQL Data Integration, Migration & Large-Scale ETL experience (common ETL platforms such as PySpark/DataStage/Ab Initio etc.) - ETL design & build, handling, reconciliation and normalization Data Modeling experience (OLAP, OLTP, Logical/Physical Modeling, Normalization, knowledge of performance tuning) Experienced in working with large and multiple datasets and data warehouses Experience building and optimizing ‘big data’ data pipelines, architectures, and datasets.
Strong analytic skills and experience working with unstructured datasets Ability to effectively use complex analytical, interpretive, and problem-solving techniques Experience with Confluent Kafka, Redhat JBPM, CI/CD build pipelines and toolchain – Git, BitBucket, Jira Experience with external cloud platform such as OpenShift, AWS & GCP Experience with container technologies (Docker, Pivotal Cloud Foundry) and supporting frameworks (Kubernetes, OpenShift, Mesos) Experienced in integrating search solution with middleware & distributed messaging - Kafka Highly effective interpersonal and communication skills with tech/non-tech stakeholders. Experienced in software development life cycle and good problem-solving skills. Excellent problem-solving skills and strong mathematical and analytical mindset Ability to work in a fast-paced financial environment Education: Bachelor’s/University degree or equivalent experience in computer science, engineering, or similar domain This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required. ------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Data Analytics ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
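The dimensional-modeling qualification listed above (OLAP/OLTP, star schemas, normalization) can be illustrated with a minimal sketch in SQLite. The fact/dimension tables and column names below are hypothetical, purely for illustration, not an actual wealth-data schema:

```python
# Minimal star-schema sketch: one dimension table, one fact table, and an
# OLAP-style rollup. All table and column names are hypothetical.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_client (client_id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE fact_holdings (
        client_id INTEGER REFERENCES dim_client(client_id),
        asset_class TEXT,
        market_value REAL
    );
    INSERT INTO dim_client VALUES (1, 'EMEA'), (2, 'APAC');
    INSERT INTO fact_holdings VALUES
        (1, 'Equity', 100.0), (1, 'Bond', 50.0), (2, 'Equity', 70.0);
""")
# OLAP-style rollup: aggregate the fact table by a dimension attribute.
rollup = con.execute("""
    SELECT d.region, SUM(f.market_value) AS total_value
    FROM fact_holdings f JOIN dim_client d USING (client_id)
    GROUP BY d.region ORDER BY d.region
""").fetchall()
print(rollup)  # [('APAC', 70.0), ('EMEA', 150.0)]
```

The same fact-join-dimension rollup pattern carries over to Hive or Spark SQL on a Hadoop data lake; only the engine and data volumes change.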
Posted 1 month ago
0 years
0 Lacs
Lucknow, Uttar Pradesh, India
On-site
Introduction IBM Cognos Analytics is a comprehensive Business Intelligence platform that transforms raw data into actionable insights through advanced reporting, AI-powered analytics, and interactive visualizations. Designed to cater to organizations of all sizes, it offers high-quality, scalable reporting capabilities, enabling users to create and share customized reports efficiently. The platform's intuitive interface allows for seamless exploration of data, uncovering hidden trends and facilitating informed decision-making without the need for advanced technical skills. With robust governance and security features, IBM Cognos Analytics ensures data integrity and confidentiality, making it a trusted solution for businesses aiming to harness the full potential of their data. Your Role And Responsibilities Work alongside our multidisciplinary team of developers and designers to create the next generation of enterprise software Support the entire application lifecycle (concept, design, develop, test, release and support) Special interest in SQL/Query and/or MDX/XMLA/OLAP data sources Responsible for end-to-end product development of Java-based services Work with other developers to implement best practices, introduce new tools, and improve processes Stay up to date with new technology trends Preferred Education Master's Degree Required Technical And Professional Expertise 15+ years of software development in a professional environment Proficiency in Java Experience integrating services with relational databases and/or OLAP data sources Knowledge and experience in relational database, OLAP and/or query planning Strong working knowledge of SQL and/or MDX/XMLA Knowledge and experience creating applications on cloud platforms (Kubernetes, RedHat OCP) Exposure to agile development, continuous integration, continuous development environment (CI/CD) with tools such as: GitHub, JIRA, Jenkins, etc. 
Other Tools: ssh clients, docker Excellent interpersonal and communication skills with ability to effectively articulate technical challenges and devise solutions Ability to work independently in a large matrix environment Preferred Technical And Professional Experience The position requires a back-end developer with strong Java skills Experienced integrating Business Intelligence tools with relational data sources Experienced integrating Business Intelligence tools with OLAP technologies such as SAP/BW, SAP/BW4HANA Experienced defining relational or OLAP test assets - test suites, automated tests - to ensure high code coverage and tight integration with Business Intelligence tools Full lifecycle of SAP/BW and BW4HANA assets - Cube upgrade, server, and server support: administering, maintaining, and upgrading using current SAP tooling
Posted 1 month ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Description and Requirements Position Summary - We are looking for someone who, with guidance and direction, can develop front-end applications using Java; someone with the acumen and interest to learn and work on new technologies in process automation; someone with good problem-solving skills who understands the importance of delivery timelines and can operate independently. Job Responsibilities - Involvement in solution planning Convert business specifications to technical specifications. Write clean code and review code of the project team members (as applicable) Adhere to the Agile delivery model. Able to solve L3 application-related issues. Should be able to scale up on new technologies. Should be able to document project artifacts. Technical Skills - Database and data warehouse skills, Object Oriented Programming, Design Patterns, and development knowledge. Azure Cloud experience with cloud-native development as well as migration of existing applications. Hands-on development and implementation experience in Azure Data Factory, Azure Databricks, Azure App Services and Azure Service Bus. Agile development and DevSecOps understanding of the end-to-end development life cycle is required. Experience in cutting-edge OLAP cube technologies like Kyligence would be a plus Preferably worked in the financial domain About MetLife Recognized on Fortune magazine's list of the 2024 "World's Most Admired Companies" and Fortune World’s 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world’s leading financial services companies; providing insurance, annuities, employee benefits and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple - to help our colleagues, customers, communities, and the world at large create a more confident future.
United by purpose and guided by empathy, we’re inspired to transform the next century in financial services. At MetLife, it’s #AllTogetherPossible. Join us!
Posted 1 month ago
8 - 12 years
35 - 60 Lacs
Bengaluru
Work from Office
Job Summary NetApp is a cloud-led, data-centric software company that helps organizations put data to work in applications that elevate their business. We help organizations unlock the best of cloud technology. As a member of Solutions Integration Engineering you work cross-functionally to define and create engineered solutions/products which would accelerate field adoption. We work closely with ISVs and with the startup ecosystem in the Virtualization, Cloud, and AI/ML domains to build solutions that matter for the customers. You will work closely with the product owner and product lead on the company's current and future strategies related to said domains. Job Requirements • Lead the delivery of features, including participating in the full software development lifecycle. • Deliver reliable, innovative solutions and products • Participate in product design, development, verification, troubleshooting, and delivery of a system or major subsystems, including authoring project specifications. • Work closely with cross-functional teams including business stakeholders to innovate and unlock new use-cases for our customers • Write unit and automated integration tests and project documentation • Mentor the juniors in the team Technical Skills • Understanding of the software development lifecycle • Proficiency in full stack development ~ Python, Container Ecosystem, Cloud and Modern ML frameworks • Knowledge of data storage and artificial intelligence concepts including server/storage architecture, batch/stream processing, data warehousing, data lakes, distributed filesystems, OLTP/OLAP databases and data pipelining tools, model training, inferencing as well as RAG workflows. • Exposure to data pipelines, integrations and Unix-based operating system kernels and development environments, e.g. Linux or FreeBSD.
• A strong understanding of basic to complex concepts related to computer architecture, data structures, and new programming paradigms • Demonstrated creative and systematic approach to problem solving. • Possess excellent written and verbal communication skills. Education • Minimum 8 years of experience and must be hands-on with coding. • B.E/B.Tech or M.S in Computer Science or related technical field.
Posted 1 month ago
6 - 11 years
40 - 45 Lacs
Hyderabad
Work from Office
At Apple, new ideas quickly transform into groundbreaking products, services, and customer experiences. Bring passion and dedication to your work, and there's no telling what can be accomplished. As part of the Supply Chain Innovation team, you will play a pivotal role in building end-to-end, best-in-class software solutions for Apple's Supply Chain needs, ranging from Supply Planning and Demand Planning to Product Distribution and beyond. You will collaborate with various internal partners to define and implement solutions that optimize Apple's internal business processes. Description We are seeking an individual who can address challenges and find creative solutions. The ideal candidate should excel in collaborative environments and produce high-quality software under tight deadlines. Must be a self-starter, highly motivated, and able to work independently, while collaborating effectively with multi-functional teams across the globe (US, Singapore, India, and Europe). This role will have a direct impact on Apple's business, requiring interaction with various internal teams to deliver cutting-edge products in a dynamic, ever-evolving environment.
Key Responsibilities:
Design, develop, and optimize highly scalable, distributed systems, leveraging cloud-native technologies and microservices architecture to build robust and resilient solutions.
Lead proof-of-concept projects and pilot implementations to showcase new ideas.
Strive for excellence by continuously seeking ways to enhance system reliability, performance, and security.
Contribute to design and code reviews, and assist in debugging and resolving issues.
Develop system components and take full responsibility for the timely delivery and quality of the work.
Collaborate with product owners, developers, QA, support teams, and end users.
Mentor and guide a team of engineers, fostering a culture of innovation and excellence.
Tackle complex technical challenges, drive innovation, and stay up-to-date with emerging technologies.
Collaborate with contractors to ensure successful project execution.
Ability to multitask and work independently with minimal supervision.
Occasionally, will need to handle application production (warranty) support.
Excellent verbal and written communication skills.
6+ years of relevant experience in enterprise-level application development using advanced Oracle database technologies Hands-on experience with large-volume databases for both OLTP and OLAP environments Experience with databases such as SingleStore, Oracle, Snowflake, NoSQL, Graph, etc. Strong expertise in ETL, performance optimization, and maintenance of various solutions based on Oracle databases Strong ability to research, design, and develop complex technical solutions involving multiple technologies Bachelor's/Master's degree in Computer Science or equivalent Preferred Qualifications Familiarity with Agile project management methodologies Experience with Python, Pandas DataFrames, SQLAlchemy, NumPy, etc. Experience in consuming and exposing web services (e.g., SOAP, REST) Knowledge of UNIX/Linux platforms and scripting experience with Shell, XML, JSON AI/ML-related experience A strong understanding of LLMs, prompt engineering and RAG Experience in developing applications for the Supply Chain business domain
Posted 1 month ago
3 - 7 years
5 - 9 Lacs
Mumbai
Work from Office
At Board, we power financial and operational planning solutions for the world's best brands. Thousands of enterprises use our technology to optimize resources, drive growth, and ensure profitability. With advanced analytics and forecasting, plus AI-driven insights, customers transform complex, real-time data into actionable intelligence. What's been key to our success? Our people - we value everyone's unique perspective and the energy they bring to the organization. We collaborate openly across teams and borders. We embrace a growth mindset to get results. And we celebrate shared success as goals and milestones are achieved. Ready to join a team where innovation meets collaboration? If you're driven by bold ideas and a customer-centric mentality, your next adventure starts here! We are currently looking for a Senior Premium Support Specialist to join our team on an early morning Indian shift (5:30 AM to 2:30 PM IST). In this role, you will be accountable for providing assistance on a range of Planning Solutions developed for some of Board's key accounts. The Premium Support team plays a pivotal role in Board's Customer Success strategy by providing industry-leading post-implementation support. Through regular service review meetings, our Support Specialists are expected to maintain a strong grasp of our customers' ever-changing business and functional requirements whilst helping them understand how Board can be used to achieve their goals. Key Responsibilities and Objectives: Provide qualified functional and technical assistance for existing customer Board planning and reporting solutions. Participate in extensive knowledge transfer processes between delivery and maintenance teams. Be able to articulate, in deep technical detail, how Board functionality can be used to meet customer requirements and find a solution to business problems. Identify areas for improvement in existing applications. Work closely with the Board Product team by relaying customer and market feedback.
Assist Senior Specialists in meetings to provide insights into new features and functionality introduced in the Board Platform. Provide reactive support for existing customers if/when questions or issues in their existing application arise. Requirements: Educational background in Business, Finance, Accounting, Computer Science, Management Information Systems (MIS), Mathematics or any relevant technical field. Experience with systems like Anaplan, TM1, Oracle, O9, JDA/Blue Yonder, SAP is preferable. Previous Support or Consulting experience within Supply Chain, FP&A or Retail planning Good understanding of financial processes (Financial Consolidation and Lease Reporting, for example) is beneficial. Exposure to multi-dimensional or OLAP technology preferred. Knowledge of SQL advantageous. Great de-escalation skills and capacity to work in very tight time frames. Strong troubleshooting, root-cause analysis and reverse engineering capacity. Ability to grasp elaborate business requirements and translate those into solutions within the Board platform. Excellent written and verbal communication skills. Board International is an equal opportunity employer and is committed to a diverse and inclusive workforce. Your personal data will be stored for as long as it is necessary to process the job applications that you submitted and for the provision of the service that you requested. Your personal data may also be processed for the fulfillment of the obligations provided for by law. Your data will in any case be deleted without unjustified delay once the aforementioned legal obligations have been fulfilled. Your personal data are collected and used by Board International SA and/or its subsidiaries that are located in the EU or outside on the basis of the appropriate safeguards provided by the European Regulation 2016/679. At any time you may request to access, correct and/or delete your personal data used by Board International SA or by its subsidiaries for recruiting purposes.
For further questions, please refer to our Privacy Policy at https://www.board.com/en/privacy-policy
Posted 1 month ago
2 - 5 years
4 - 8 Lacs
Gurugram
Work from Office
ESSENTIAL FUNCTIONS 3 to 8 years of hands-on experience in design, development, and enhancement of OAS/OBIEE 11g reports, dashboards and their best practices. Strong experience in Oracle SQL. Good to have experience in design, development, and enhancement of BI Publisher 11g reports. Understanding of OBIEE/OAS security implementations and permissions framework Experience with OBIEE/OAS deployment, configurations and general administrative activities Providing the first line of technical support for business-critical OBIEE/OBIA applications and reports. Should have strong debugging skills to identify issues. Should have strong knowledge of data warehouse concepts like star schema, snowflake schema, dimension and fact tables. Should be able to perform unit testing and integration testing. Should be able to work on the RPD in a multi-user development (MUD) environment. Good knowledge of performance tuning of dashboards and reports. Strong organizational skills, ability to accomplish multiple tasks within the agreed-upon timeframes through effective prioritization of duties and functions in a fast-paced environment. EDUCATION AND EXPERIENCE: BE/BTech/MCA with 3 to 8 years of relevant experience. Preferably from a services organization background with prior experience in OBIEE/OAS environments.
Posted 1 month ago
8 - 11 years
18 - 30 Lacs
Pune
Hybrid
What’s the role all about? As a Specialist BI Developer, you’ll be a key contributor to developing reports in a multi-region, multi-tenant SaaS product. You’ll collaborate with the core R&D team to build high-performance reports to serve the use cases of several applications in the suite. How will you make an impact? Take ownership of the software development lifecycle, including design, development, unit testing, and deployment, working closely with QA teams. Ensure that architectural concepts are consistently implemented across the product. Act as a product expert within R&D, understanding the product’s requirements and its market positioning. Work closely with cross-functional teams (Product Managers, Sales, Customer Support, and Services) to ensure successful product delivery. Translate business needs into technical specifications. Analyze requirements and work with the Product team to freeze requirements in accordance with reporting application capabilities. Manage the project in JIRA. Conduct Agile ceremonies in the absence of the Scrum Master. Conduct design and code reviews. Have you got what it takes? Bachelor/Master of Engineering degree in Computer Science, Electronic Engineering or equivalent from a reputed institute. 8-11 years of BI report development experience Expertise in SQL and any cloud-based databases; expertise in Snowflake is an advantage Expertise in BI tools like Tableau, Power BI, MicroStrategy, etc. Experience working in an enterprise data warehouse/data lake system Strong knowledge of analytical databases and schemas Expertise in optimizing the data extraction process and queries Experience in managing projects in JIRA and conducting Agile ceremonies Experience working with data modelers and governance teams Experience in database management systems, online analytical processing (OLAP) and ETL (extract, transform, load) frameworks Experience working in functional testing, performance testing, etc.
Experience generating performance test scripts (JMeter, Gatling, etc.).
Experience automating the testing process for E2E and regression cases.
Experience with Java / web services is an added advantage.
Experience with public cloud infrastructure and technologies such as AWS, Azure, or GCP.
What’s in it for you?
Join an ever-growing, market-disrupting, global company where the teams – comprised of the best of the best – work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NICE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NICEr!
Enjoy NICE-FLEX!
At NICE, we work according to the NICE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work, each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.
Requisition ID: 6620
Reporting into: Tech Manager
Role Type: Individual Contributor
Posted 1 month ago
8 - 13 years
15 - 30 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Warm greetings from SP Staffing Services Private Limited!
We have an urgent opening with our CMMI Level 5 client for the position below. Please send your updated profile if you are interested.
Relevant Experience: 8 - 15 Yrs
Location: Pan India
Job Description: Minimum two years of experience in Boomi data modeling
Interested candidates can share their resume to sankarspstaffings@gmail.com with the details below inline.
Overall Exp:
Relevant Exp:
Current CTC:
Expected CTC:
Notice Period:
Posted 1 month ago
0.0 - 20.0 years
0 Lacs
Bengaluru, Karnataka
On-site
- 3+ years of data engineering experience
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with SQL
Over the past 20 years Amazon has earned the trust of over 300 million customers worldwide by providing unprecedented convenience, selection and value on Amazon.com. By deploying Amazon Pay’s products and services, merchants make it easy for these millions of customers to safely purchase from their third-party sites using the information already stored in their Amazon account.
In this role, you will lead data engineering efforts to drive automation for the Amazon Pay organization. You will be part of the data engineering team that will envision, build, and deliver high-performance, fault-tolerant data pipelines. As a Data Engineer, you will be working with cross-functional partners from Science, Product, SDEs, Operations, and leadership to translate raw data into actionable insights for stakeholders, empowering them to make data-driven decisions.
Key job responsibilities
· Design, implement, and support a platform providing ad-hoc access to large data sets
· Interface with other technology teams to extract, transform, and load data from a wide variety of data sources
· Implement data structures using best practices in data modeling, ETL/ELT processes, and SQL, Redshift, and OLAP technologies
· Model data and metadata for ad-hoc and pre-built reporting
· Interface with business customers, gathering requirements and delivering complete reporting solutions
· Build robust and scalable data integration (ETL) pipelines using SQL, Python, and Spark
· Build and deliver high-quality data sets to support business analysts, data scientists, and customer reporting needs
· Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers
Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions
Experience with non-relational databases / data stores (object storage, document or key-value stores, graph databases, column-family databases)
Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
Posted 1 month ago
0 years
0 - 0 Lacs
Chennai, Tamil Nadu
Work from Office
Create COBOL, DB2, Informatica, Linux, Teradata, and Oracle code artifacts
Meet with various IT groups (other departments and computer operations staff) to address issues and concerns
Interact closely with the Business Analysis, ETL, and BI Reporting teams to ensure proper use of the data architecture
Analyze requirements to create technical designs, data models, and migration strategies
Design, build, and maintain physical databases, dimensional data models, OLAP cubes, ETL layer designs, and data integration strategies
Evaluate and influence the selection of data warehouse and business intelligence software
Collaborate with technology stakeholders to define and implement actionable metrics, KPIs, and data visualizations
Lead the technical design and implementation of dashboards and reporting capabilities
Implement data quality, data integrity, and data standardization efforts across products and databases, enabling key business processes and applications
Recommend improvements to existing ETL and data integration processes to enable performance and overall scalability
Job Types: Full-time, Permanent, Fresher
Pay: ₹18,455.00 - ₹28,755.00 per month
Benefits: Provident Fund
Schedule: Day shift / Morning shift / Rotational shift
Supplemental Pay: Yearly bonus
Work Location: In person
Posted 1 month ago
2 - 6 years
4 - 8 Lacs
Chennai
Work from Office
Strong proficiency with SQL and its variations among popular databases
Experience with modern relational databases
Skilled at optimizing large, complicated SQL statements
Knowledge of best practices when dealing with relational databases
Capable of configuring popular database engines and orchestrating clusters as necessary
Proven experience as a BI Developer or Data Scientist; industry experience preferred
Background in data warehouse design (e.g., dimensional modeling) and data mining
In-depth understanding of database management systems, online analytical processing (OLAP), and ETL (extract, transform, load) frameworks
Familiarity with BI technologies (e.g., Microsoft Power BI, Oracle BI)
Knowledge of SQL queries, SQL Server Reporting Services (SSRS), and SQL Server Integration Services (SSIS)
Proven ability to take initiative and be innovative
Analytical mind with a problem-solving aptitude
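The OLAP aggregation idea behind roles like this one — summarizing a measure along chosen dimensions of a fact set — can be sketched in plain Python. This is a hypothetical illustration (the `rollup` helper and the sample data are invented for this example), not any particular BI product's API:

```python
from collections import defaultdict

# Toy fact rows: (region, product, amount). The last element is the measure.
facts = [
    ("North", "Books", 100),
    ("North", "Games", 50),
    ("South", "Books", 70),
]

def rollup(facts, dim_indices):
    """Aggregate the trailing measure along the chosen dimension columns."""
    totals = defaultdict(int)
    for row in facts:
        group = tuple(row[i] for i in dim_indices)
        totals[group] += row[-1]
    return dict(totals)

# Slice by region only, then drill down to (region, product).
print(rollup(facts, [0]))     # {('North',): 150, ('South',): 70}
print(rollup(facts, [0, 1]))  # one total per (region, product) pair
```

Real OLAP engines precompute or index these aggregates across many dimensions, but the grouping logic is the same.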
Posted 1 month ago
7 - 11 years
15 - 19 Lacs
Hyderabad
Work from Office
ABOUT AMGEN
Amgen harnesses the best of biology and technology to fight the world’s toughest diseases, and make people’s lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting-edge of innovation, using technology and human genetic data to push beyond what’s known today.
What you will do
Role Description: We are seeking a Data Solutions Architect to design, implement, and optimize scalable and high-performance data solutions that support enterprise analytics, AI-driven insights, and digital transformation initiatives. This role will focus on data strategy, architecture, governance, security, and operational efficiency, ensuring seamless data integration across modern cloud platforms. The ideal candidate will work closely with engineering teams, business stakeholders, and leadership to establish a future-ready data ecosystem, balancing performance, cost-efficiency, security, and usability. This position requires expertise in modern cloud-based data architectures, data engineering best practices, and Scaled Agile methodologies.
Roles & Responsibilities:
Design and implement scalable, modular, and future-proof data architectures that support enterprise data lakes, data warehouses, and real-time analytics.
Develop enterprise-wide data frameworks that enable governed, secure, and accessible data across various business domains.
Define data modeling strategies to support structured and unstructured data, ensuring efficiency, consistency, and usability across analytical platforms.
Lead the development of high-performance data pipelines for batch and real-time data processing, integrating APIs, streaming sources, transactional systems, and external data platforms.
Optimize query performance, indexing, caching, and storage strategies to enhance scalability, cost efficiency, and analytical capabilities.
Establish data interoperability frameworks that enable seamless integration across multiple data sources and platforms.
Drive data governance strategies, ensuring security, compliance, access controls, and lineage tracking are embedded into enterprise data solutions.
Implement DataOps best practices, including CI/CD for data pipelines, automated monitoring, and proactive issue resolution, to improve operational efficiency.
Lead Scaled Agile (SAFe) practices, facilitating Program Increment (PI) Planning, Sprint Planning, and Agile ceremonies, ensuring iterative delivery of enterprise data capabilities.
Collaborate with business stakeholders, product teams, and technology leaders to align data architecture strategies with organizational goals.
Act as a trusted advisor on emerging data technologies and trends, ensuring that the enterprise adopts cutting-edge data solutions that provide competitive advantage and long-term scalability.
What we expect of you
Must-Have Skills:
Experience in data architecture, enterprise data management, and cloud-based analytics solutions.
Expertise in Databricks, cloud-native data platforms, and distributed computing frameworks.
Strong proficiency in modern data modeling techniques, including dimensional modeling, NoSQL, and data virtualization.
Experience designing high-performance ETL/ELT pipelines and real-time data processing solutions.
Deep understanding of data governance, security, metadata management, and access control frameworks.
Hands-on experience with CI/CD for data solutions, DataOps automation, and infrastructure as code (IaC).
Proven ability to collaborate with cross-functional teams, including business executives, data engineers, and analytics teams, to drive successful data initiatives.
Strong problem-solving, strategic thinking, and technical leadership skills.
Experience with SQL/NoSQL databases and vector databases for large language models
Experience with data modeling and performance tuning for both OLAP and OLTP databases
Experience with Apache Spark
Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps
Good-to-Have Skills:
Deep expertise in the Biotech & Pharma industries
Experience with Data Mesh architectures and federated data governance models
Certification in cloud data platforms or enterprise architecture frameworks
Knowledge of AI/ML pipeline integration within enterprise data architectures
Familiarity with BI & analytics platforms for enabling self-service analytics and enterprise reporting
Education and Professional Certifications
Doctorate degree with 6-8+ years of experience in Computer Science, IT, or a related field OR Master’s degree with 8-10+ years of experience in Computer Science, IT, or a related field OR Bachelor’s degree with 10-12+ years of experience in Computer Science, IT, or a related field
AWS Certified Data Engineer preferred
Databricks Certification preferred
Soft Skills:
Excellent analytical and troubleshooting skills
Strong verbal and written communication skills
Ability to work effectively with global, virtual teams
High degree of initiative and self-motivation
Ability to manage multiple priorities successfully
Team-oriented, with a focus on achieving team goals
Ability to learn quickly; organized and detail-oriented
Strong presentation and public speaking skills
What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way.
EQUAL OPPORTUNITY STATEMENT Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
Posted 1 month ago
7 - 11 years
10 - 14 Lacs
Hyderabad
Work from Office
What you will do
Let’s do this. Let’s change the world. In this vital role you will drive the development and implementation of our data strategy. The ideal candidate possesses a strong blend of technical expertise and data-driven problem-solving skills. As a Senior Data Engineer, you will play a crucial role in designing, building, and optimizing our data pipelines and platforms while mentoring junior engineers.
Roles & Responsibilities:
Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions.
Take ownership of data pipeline projects from inception to deployment, managing scope, timelines, and risks.
Ensure data quality and integrity through rigorous testing and monitoring.
Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions.
Work closely with data analysts, data scientists, and business collaborators to understand data requirements.
Identify and resolve complex data-related challenges.
Adhere to data engineering best practices and standards.
Experience developing in an Agile environment, with comfort in Agile terminology and ceremonies.
Familiarity with code versioning using Git, Jenkins, and code migration tools. Exposure to Jira or Rally.
Identify and implement opportunities for automation and CI/CD.
Stay up to date with the latest data technologies and trends.
What we expect of you
We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications:
Doctorate degree and 2 years of Computer Science, IT, or related field experience OR Master’s degree and 8 to 10 years of Computer Science, IT, or related field experience OR Bachelor’s degree and 10 to 14 years of Computer Science, IT, or related field experience OR Diploma and 14 to 18 years of Computer Science, IT, or related field experience
Preferred Qualifications:
Functional Skills:
Must-Have Skills:
Demonstrated hands-on experience with cloud platforms (AWS, Azure, GCP) and the ability to architect cost-effective and scalable data solutions.
Proficiency in Python, PySpark, and SQL.
Hands-on experience with big data ETL performance tuning.
Strong development knowledge of Databricks.
Strong analytical and problem-solving skills to address complex data challenges.
Good-to-Have Skills:
Experience with data modeling and performance tuning for both OLAP and OLTP databases
Experience working with Apache Spark and Apache Airflow
Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps
Experience with SQL/NoSQL databases and vector databases for large language models
Experience with prompt engineering and model fine-tuning
Experience with DevOps/MLOps CI/CD build and deployment pipelines
Professional Certifications:
AWS Certified Data Engineer (preferred)
Databricks Certification (preferred)
Any SAFe Agile certification (preferred)
Soft Skills:
Initiative to explore alternate technologies and approaches to solving problems.
Skilled at breaking down problems, documenting problem statements, and estimating effort.
Effective communication and interpersonal skills to collaborate with multi-functional teams.
Excellent analytical and problem-solving skills.
Strong verbal and written communication skills
Ability to work effectively with global, virtual teams
High degree of initiative and self-motivation
Ability to manage multiple priorities successfully
Team-oriented, with a focus on achieving team goals
Strong presentation and public speaking skills
What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
Apply now for a career that defies imagination. Objects in your future are closer than they appear. Join us. careers.amgen.com
Posted 1 month ago
- 2 years
3 - 5 Lacs
Hyderabad
Work from Office
ABOUT AMGEN
Amgen harnesses the best of biology and technology to fight the world’s toughest diseases, and make people’s lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting-edge of innovation, using technology and human genetic data to push beyond what’s known today.
What we expect of you
Role Description: We are looking for an Associate Data Engineer with deep expertise in writing data pipelines to build scalable, high-performance data solutions. The ideal candidate will be responsible for developing, optimizing, and maintaining complex data pipelines, integration frameworks, and metadata-driven architectures that enable seamless access and analytics. This role requires a deep understanding of big data processing, distributed computing, data modeling, and governance frameworks to support self-service analytics, AI-driven insights, and enterprise-wide data management.
Roles & Responsibilities:
Own the development of complex ETL/ELT data pipelines to process large-scale datasets
Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
Ensure data integrity, accuracy, and consistency through rigorous quality checks and monitoring
Explore and implement new tools and technologies to enhance the ETL platform and pipeline performance
Proactively identify and implement opportunities to automate tasks and develop reusable frameworks
Eagerness to understand the biotech/pharma domain and build highly efficient data pipelines to migrate and deploy complex data across systems
Work in an Agile and Scaled Agile (SAFe) environment, collaborating with cross-functional teams, product owners, and Scrum Masters to deliver incremental value
Use JIRA, Confluence, and Agile DevOps tools to manage sprints, backlogs, and user stories
Support continuous improvement, test automation, and DevOps practices in the data engineering lifecycle
Collaborate and communicate effectively with product teams and other cross-functional teams to understand business requirements and translate them into technical solutions
What we expect of you
Must-Have Skills:
Experience in data engineering with a focus on Databricks, AWS, Python, SQL, and Scaled Agile methodologies
Proficiency in and strong understanding of data processing and transformation with big data frameworks (Databricks, Apache Spark, Delta Lake, and distributed computing concepts)
Strong understanding of AWS services and the ability to demonstrate the same
Ability to quickly learn, adapt, and apply new technologies
Strong problem-solving and analytical skills
Excellent communication and teamwork skills
Experience with Scaled Agile Framework (SAFe), Agile delivery, and DevOps practices
Good-to-Have Skills:
Data engineering experience in the biotechnology or pharma industry
Exposure to APIs and full-stack development
Experience with SQL/NoSQL databases and vector databases for large language models
Experience with data modeling and performance tuning for both OLAP and OLTP databases
Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps
Education and Professional Certifications
Bachelor’s degree and 2 to 5+ years of Computer Science, IT, or related field experience OR Master’s degree and 1 to 4+ years of Computer Science, IT, or related field experience
AWS Certified Data Engineer preferred
Databricks Certificate preferred
Scaled Agile SAFe certification preferred
Soft Skills:
Excellent analytical and troubleshooting skills
Strong verbal and written communication skills
Ability to work effectively with global, virtual teams
High degree of initiative and self-motivation
Ability to manage multiple priorities successfully
Team-oriented, with a focus on achieving team goals
Ability to learn quickly; organized and detail-oriented
Strong presentation and public speaking skills
What you can expect of us
EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
Apply now and make a lasting impact with the Amgen team. careers.amgen.com
Posted 1 month ago
With the increasing demand for data analysis and business intelligence, OLAP (Online Analytical Processing) jobs have become popular in India. OLAP professionals are responsible for designing, building, and maintaining OLAP databases to support data analysis and reporting activities for organizations. If you are looking to pursue a career in OLAP in India, here is a comprehensive guide to help you navigate the job market.
These cities are known for having a high concentration of IT companies and organizations that require OLAP professionals.
The average salary range for OLAP professionals in India varies based on experience levels. Entry-level professionals can expect to earn around INR 4-6 lakhs per annum, while experienced professionals with 5+ years of experience can earn upwards of INR 12 lakhs per annum.
Career progression in OLAP typically follows a trajectory from Junior Developer to Senior Developer, and then to a Tech Lead role. As professionals gain experience and expertise in OLAP technologies, they may also explore roles such as Data Analyst, Business Intelligence Developer, or Database Administrator.
In addition to OLAP expertise, professionals in this field are often expected to have knowledge of SQL, data modeling, ETL (Extract, Transform, Load) processes, data warehousing concepts, and data visualization tools such as Tableau or Power BI.
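The ETL (Extract, Transform, Load) process mentioned above follows a simple pattern regardless of tooling: pull raw records from a source, clean and enrich them, and write them to a target store. Here is a minimal, hypothetical sketch in Python using only the standard library; the field names and tier rule are invented for illustration:

```python
import csv
import io
import sqlite3

# Extract: read raw rows from a CSV source (here an in-memory string).
raw = "name,revenue\nacme,1200\nglobex,800\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: normalize names, cast types, and derive a field.
cleaned = [
    {"name": r["name"].title(),
     "revenue": int(r["revenue"]),
     "tier": "high" if int(r["revenue"]) >= 1000 else "low"}
    for r in rows
]

# Load: write the cleaned records into a target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE revenue (name TEXT, revenue INTEGER, tier TEXT)")
conn.executemany("INSERT INTO revenue VALUES (:name, :revenue, :tier)", cleaned)

loaded = conn.execute("SELECT name, tier FROM revenue ORDER BY name").fetchall()
print(loaded)  # [('Acme', 'high'), ('Globex', 'low')]
```

Production ETL tools add scheduling, error handling, and scale on top of these same three stages.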
As you prepare for OLAP job interviews in India, make sure to hone your technical skills, brush up on industry trends, and showcase your problem-solving abilities. With the right preparation and confidence, you can successfully land a rewarding career in OLAP in India. Good luck!