Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
5 - 10 years
3 - 8 Lacs
Hyderabad, Visakhapatnam, Vijayawada
Work from Office
We are looking for trainers with 5+ years of experience as MS SQL Server Administration trainers to join our team and provide high-quality training to students. Send your CV to ahamedrathi@gmail.com or call us at 9697818888 (HR Rathi).
Required candidate profile: The ideal candidate should have strong hands-on experience with MS SQL Server, as well as a passion for teaching and helping students develop their skills in database management and SQL programming.
Posted 3 months ago
2 - 3 years
0 - 0 Lacs
Mumbai
Work from Office
Job Title: Product Engineer - Big Data
Location: Mumbai
Experience: 3 - 8 Yrs

Job Summary: As a Product Engineer - Big Data, you will be responsible for designing, building, and optimizing large-scale data processing pipelines using cutting-edge Big Data technologies. Collaborating with cross-functional teams, including data scientists, analysts, and product managers, you will ensure data is easily accessible, secure, and reliable. Your role will focus on delivering high-quality, scalable solutions for data storage, ingestion, and analysis, while driving continuous improvements throughout the data lifecycle.

Key Responsibilities:
- ETL Pipeline Development & Optimization: Design and implement complex end-to-end ETL pipelines to handle large-scale data ingestion and processing. Utilize AWS services such as EMR, Glue, S3, MSK (Managed Streaming for Kafka), DMS (Database Migration Service), Athena, and EC2 to streamline data workflows, ensuring high availability and reliability.
- Big Data Processing: Develop and optimize real-time and batch data processing systems using Apache Flink, PySpark, and Apache Kafka. Focus on fault tolerance, scalability, and performance. Work with Apache Hudi for managing datasets and enabling incremental data processing.
- Data Modeling & Warehousing: Design and implement data warehouse solutions that support both analytical and operational use cases. Model complex datasets into optimized structures for high performance, easy access, and query efficiency for internal stakeholders.
- Cloud Infrastructure Development: Build scalable cloud-based data infrastructure leveraging AWS tools. Ensure data pipelines are resilient and adaptable to changes in data volume and variety, while optimizing costs and maximizing efficiency, using Managed Apache Airflow for orchestration and EC2 for compute resources.
- Data Analysis & Insights: Collaborate with business teams and data scientists to understand data needs and deliver high-quality datasets. Conduct in-depth analysis to derive insights from the data, identifying key trends, patterns, and anomalies to drive business decisions. Present findings in a clear, actionable format.
- Real-time & Batch Data Integration: Enable seamless integration of real-time streaming and batch data from systems like AWS MSK. Ensure consistency in data ingestion and processing across various formats and sources, providing a unified view of the data ecosystem.
- CI/CD & Automation: Use Jenkins to establish and maintain continuous integration and delivery pipelines. Implement automated testing and deployment workflows, ensuring smooth integration of new features and updates into production environments.
- Data Security & Compliance: Collaborate with security teams to ensure data pipelines comply with organizational and regulatory standards such as GDPR, HIPAA, or other relevant frameworks. Implement data governance practices to ensure integrity, security, and traceability throughout the data lifecycle.
- Collaboration & Cross-Functional Work: Partner with engineers, data scientists, product managers, and business stakeholders to understand data requirements and deliver scalable solutions. Participate in agile teams, sprint planning, and architectural discussions.
- Troubleshooting & Performance Tuning: Identify and resolve performance bottlenecks in data pipelines. Ensure optimal performance through proactive monitoring, tuning, and applying best practices for data ingestion and storage.

Skills & Qualifications (Must-Have):
- AWS Expertise: Hands-on experience with core AWS services related to Big Data, including EMR, Managed Apache Airflow, Glue, S3, DMS, MSK, Athena, and EC2. Strong understanding of cloud-native data architecture.
- Big Data Technologies: Proficiency in PySpark and SQL for data transformations and analysis. Experience with large-scale data processing frameworks like Apache Flink and Apache Kafka.
- Data Frameworks: Strong knowledge of Apache Hudi for data lake operations, including CDC (Change Data Capture) and incremental data processing.
- Database Modeling & Data Warehousing: Expertise in designing scalable data models for both OLAP and OLTP systems. In-depth understanding of data warehousing best practices.
- ETL Pipeline Development: Proven experience in building robust, scalable ETL pipelines for processing real-time and batch data across platforms.
- Data Analysis & Insights: Strong problem-solving skills with a data-driven approach to decision-making. Ability to conduct complex data analysis to extract actionable business insights.
- CI/CD & Automation: Basic to intermediate knowledge of CI/CD pipelines using Jenkins or similar tools to automate deployment and monitoring of data pipelines.

Required skills: Big Data, ETL, AWS
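The posting above repeatedly mentions CDC and Apache Hudi-style incremental processing. As a hedged, tool-agnostic illustration of the core idea (this is not Hudi's actual API, and the field names `id` and `event_ts` are purely illustrative), an incremental upsert merge can be sketched in plain Python:

```python
from typing import Dict, List

def upsert_batch(table: Dict[str, dict], changes: List[dict]) -> Dict[str, dict]:
    """Merge a batch of change records into a keyed table, keeping the
    latest version of each record (last-writer-wins on event_ts)."""
    for rec in changes:
        key = rec["id"]
        current = table.get(key)
        # Apply the change only if it is at least as new as what we hold.
        if current is None or rec["event_ts"] >= current["event_ts"]:
            table[key] = rec
    return table

base = {"1": {"id": "1", "amount": 10, "event_ts": 1}}
incr = [
    {"id": "1", "amount": 25, "event_ts": 2},  # update to an existing row
    {"id": "2", "amount": 7,  "event_ts": 2},  # brand-new row
]
merged = upsert_batch(base, incr)
```

Engines like Hudi add indexing, file management, and concurrency control on top, but the merge-by-key-and-timestamp logic is the essence of incremental data processing.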
Posted 3 months ago
5 - 6 years
11 - 15 Lacs
Chennai, Pune, Delhi
Work from Office
As a member of this team, you'll collaborate with talented engineers to design, build, and optimize responsive and scalable cloud-based services that form the backbone of our technology. You'll engage with customers and partners to design solutions that can reliably handle the velocity and scale of modern data ecosystems. This role is ideal for individuals with an outcome-driven mindset, who are impact-driven, thrive in dynamic environments, and excel at creative problem-solving.

What You'll Do:
- Build Scalable Systems: Design and develop a high-performance data onboarding platform capable of handling petabytes of data in real time.
- Integrate Diverse Datastores: Build robust integrations with a variety of data sources (e.g., MySQL, Mongo, Iceberg), data stores (e.g., Snowflake, Redshift, ClickHouse), and object storages (e.g., S3, GCS).
- Leverage ClickHouse: Create solutions that enable users to fully harness ClickHouse's exceptional performance and throughput.
- Collaborate Across Teams: Work closely with internal teams to ensure the platform aligns with customer needs and business objectives.
- Drive Innovation: Lead and influence technical discussions, continuously identifying and implementing improvements.

About You:
- Experience: 5+ years of industry experience building high-scale, data-intensive software solutions.
- Expertise: Proficient in Golang (preferred) or Java, with deep experience in distributed systems and microservices architecture.
- Data Engineering Skills: Strong background in designing and implementing robust ETL pipelines and an understanding of data replication methodologies such as CDC.
- Cloud-Native Proficiency: Solid experience with cloud-native architecture and infrastructure, with hands-on knowledge of at least one major CSP.
- Kubernetes: Practical experience with Kubernetes (K8s), including debugging and managing distributed systems at scale.
- Problem Solver: Exceptional production debugging skills, with the ability to navigate and solve complex technical issues in fast-paced environments.
- High Autonomy: Thrives in a high-velocity setting with significant ownership and autonomy.
- Mindset: A founder's mindset with a focus on impact, innovation, and delivering measurable results.
- Collaboration: Excellent communication skills and a track record of working effectively across teams.
Posted 3 months ago
6 - 10 years
4 - 8 Lacs
Hyderabad
Work from Office
Total years of experience: 6 to 10 years
Relevant years of experience: 7 to 8 years (total and relevant cannot be the same, which results in sourcing irrelevant talent)

Detailed JD (Roles and Responsibilities):
- 12+ years of overall experience; 6+ years of relevant experience.
- S/4HANA and BW/4HANA project experience in the relevant domain; proficient in SAP MDM, MDG hierarchy, and master data extraction.
- Working experience or knowledge in extracting Customer Hierarchy and master data from SAP MDM to SAP S/4 and BW/4 via Informatica.
- Working experience or knowledge in extracting Customer Hierarchy and master data attributes from MDG to SAP BW/4 via Azure DB.
- Working experience or knowledge in creating interfaces to push Customer Hierarchy and master data attributes from SAP BW/4 to third-party downstream systems.
- Must possess excellent analytical, problem-solving, verbal, and written communication skills.
- Ability to work on multiple tasks concurrently and quickly adapt to new skills.
- Experience leading a team and coordinating with different stakeholders.

Mandatory skills:
- Experience in SAP BW/4HANA or SAP BWoH.
- Experience creating virtual dataflows using SAP ODP-CDS data sources, Open ODS Views, and Composite Providers.
- Expertise in extraction, modelling, reporting, data loads, and transporting BW objects, especially in extracting Customer Hierarchy and master data from Informatica to BW/4 via S/4, and extracting Customer Hierarchy data from MDG to BW/4 via Azure DB.
- Experience in at least one S4 and BW4 implementation hierarchy and master data migration project.
- Ability to create complex CDS-based solutions using the standard VDM, usage of Customer hierarchies, OData services, and data extraction.
- Reporting security and authorization concepts in S/4 and BW/4 analysis.
- Must have SAP operational reporting skills (S/4 CDS, AFO).
- VDM modelling transitioning from BW, and reporting using AFO.
- Experience with OLAP interfaces, OData, CDS Views, ODBC connect, and AMDPs is required.
- Proficient in table mapping for SAP-enabled business processes.
- Strong SQL/ABAP skills and SAP BW reporting.

Desired skills: same as the mandatory skills listed above.

Domain: BW Lead
Posted 3 months ago
3 - 6 years
22 - 27 Lacs
Noida
Work from Office
We at Innovaccer are looking for a Software Development Engineer-II (Backend) to build the most amazing product experience. You'll get to work with other engineers to build delightful feature experiences that understand and solve our customers' pain points.

A Day in the Life:
- Build efficient and reusable applications and abstractions.
- Identify and communicate back-end best practices.
- Participate in the project life cycle from pitch/prototyping through definition and design to build, integration, QA, and delivery.
- Analyze and improve the performance, scalability, stability, and security of the product.
- Improve engineering standards, tooling, and processes.

What You Need:
- 3+ years of experience with a start-up mentality and high willingness to learn.
- Expert in Python and experience with any web framework (Django, FastAPI, Flask, etc.).
- Aggressive problem diagnosis and creative problem-solving skills.
- Expert in Kubernetes and containerization.
- Experience in RDBMS and NoSQL databases such as Postgres and MongoDB (any OLAP database is good to have).
- Experience with cloud service providers such as AWS or Azure.
- Experience with Kafka, RabbitMQ, or other queuing services is good to have.
- Working experience in Big Data / distributed systems and async programming.
- Bachelor's degree in Computer Science/Software Engineering.

Preferred Skills:
- Expert in Python and any web framework.
- Experience working with Kubernetes and any cloud provider(s).
- Any SQL or NoSQL database.
- Working experience in distributed systems.

Here's What We Offer:
- Generous Leave Benefits: Enjoy generous leave benefits of up to 40 days.
- Parental Leave: Experience one of the industry's best parental leave policies to spend time with your new addition.
- Sabbatical Leave Policy: Want to focus on skill development, pursue an academic career, or just take a break? We've got you covered.
- Health Insurance: We offer health benefits and insurance to you and your family for medically related expenses related to illness, disease, or injury.
- Pet-Friendly Office*: Spend more time with your treasured friends, even when you're away from home. Bring your furry friends with you to the office and let your colleagues become their friends, too. (*Noida office only)
- Creche Facility for Children*: Say goodbye to worries and hello to a convenient and reliable creche facility that puts your child's well-being first. (*India offices)
Posted 3 months ago
10 - 15 years
37 - 45 Lacs
Bengaluru
Work from Office
- Experience: Minimum of 10+ years in database development and management roles.
- SQL Mastery: Advanced expertise in crafting and optimizing complex SQL queries and scripts.
- AWS Redshift: Proven experience in managing, tuning, and optimizing large-scale Redshift clusters.
- PostgreSQL: Deep understanding of PostgreSQL, including query planning, indexing strategies, and advanced tuning techniques.
- Data Pipelines: Extensive experience in ETL development and integrating data from multiple sources into cloud environments.
- Cloud Proficiency: Strong experience with AWS services like ECS, S3, KMS, Lambda, Glue, and IAM.
- Data Modeling: Comprehensive knowledge of data modeling techniques for both OLAP and OLTP systems.
- Scripting: Proficiency in Python, C#, or other scripting languages for automation and data manipulation.

Preferred Qualifications:
- Leadership: Prior experience leading database or data engineering teams.
- Data Visualization: Familiarity with reporting and visualization tools like Tableau, Power BI, or Looker.
- DevOps: Knowledge of CI/CD pipelines, infrastructure as code (e.g., Terraform), and version control (Git).
- Certifications: Any relevant certifications (e.g., AWS Certified Solutions Architect, AWS Certified Database - Specialty, PostgreSQL Certified Professional) will be a plus.
- Azure Databricks: Familiarity with Azure Databricks for data engineering and analytics workflows will be a significant advantage.

Soft Skills:
- Strong problem-solving and analytical capabilities.
- Exceptional communication skills for collaboration with technical and non-technical stakeholders.
- A results-driven mindset with the ability to work independently or lead within a team.

Qualification: Bachelor's or master's degree in Computer Science, Information Systems, Engineering, or equivalent. 10+ years of experience.
Posted 3 months ago
5 - 7 years
25 - 30 Lacs
Bengaluru
Work from Office
The Snowflake Solution Architect takes ownership of collaborating with data architects, analysts, and stakeholders to design optimal and scalable data solutions leveraging the Snowflake platform. This position aims to enhance team effectiveness through high-quality and timely contributions while primarily adhering to standardized procedures and practices to achieve objectives and meet deadlines, exercising discretion in problem-solving. This role will be based in Bangalore, India, reporting to the Head of SAC Snowflake Engineering.

Responsibilities:
- Design, develop, and maintain sophisticated data pipelines and ETL processes within Snowflake.
- Craft efficient and optimized SQL queries for seamless data extraction, transformation, and loading.
- Leverage Python for advanced data processing, automation tasks, and integration with various systems.
- Implement and manage data modelling techniques, including OLTP, OLAP, and Data Vault 2.0 methodologies.
- Oversee and optimize CI/CD pipelines using Azure DevOps to ensure smooth deployment of data solutions.
- Uphold data quality, integrity, and compliance throughout the data lifecycle.
- Troubleshoot, optimize, and enhance existing data processes and queries to boost performance.
- Document data models, processes, and workflows clearly for future reference and knowledge sharing.
- Employ advanced performance tuning techniques in Snowflake to optimize query performance and minimize data processing time.
- Develop and maintain DBT models, macros, and tests for efficient data transformation management in Snowflake.
- Manage version control using Git repositories, facilitating seamless code management and collaboration.
- Design, implement, and maintain automated CI/CD pipelines using Azure DevOps for Snowflake and DBT deployment processes.

Who You Are:
- Hold a Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- A minimum of 5-7 years of proven experience as a Snowflake developer/architect or in a similar data engineering role.
- Extensive hands-on experience with SQL and Python, showcasing proficiency in data manipulation and analysis.
- Significant industry experience working with DBT (Data Build Tool) for data transformation.
- Strong familiarity with CI/CD pipelines, preferably in Azure DevOps.
- Deep understanding of data modelling techniques (OLTP, OLAP, DBT, Data Vault 2.0) and best practices.
- Experience with large datasets and performance tuning in Snowflake.
- Knowledge of data governance, data security best practices, and compliance standards.
- Familiarity with additional data technologies (e.g., AWS, Azure, GCP, Fivetran) is a plus.
- Experience leading projects or mentoring junior developers is advantageous.
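The DBT models mentioned above are typically "incremental": on each run, only rows newer than the target's high-watermark are transformed and appended. A minimal sketch of that pattern in plain Python (the field name `updated_at` and the in-memory lists stand in for real source and target tables; dbt expresses the same idea with `is_incremental()` in SQL):

```python
def incremental_load(source_rows, target_rows, ts_field="updated_at"):
    """Append only the source rows newer than the target's high-watermark,
    mimicking what an incremental transformation model does on each run."""
    watermark = max((r[ts_field] for r in target_rows), default=None)
    new_rows = [r for r in source_rows
                if watermark is None or r[ts_field] > watermark]
    return target_rows + new_rows

target = [{"id": 1, "updated_at": "2024-01-01"}]
source = [
    {"id": 1, "updated_at": "2024-01-01"},  # already loaded, skipped
    {"id": 2, "updated_at": "2024-02-01"},  # new, appended
]
loaded = incremental_load(source, target)
```

The payoff is that each run touches only the delta rather than rebuilding the whole table, which is what makes large Snowflake transformations affordable.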
Posted 3 months ago
5 - 10 years
1 - 1 Lacs
Bengaluru
Remote
Greetings!!!
Role: Data Modeler with GCP
Location: Chennai (work from office)
Duration: 6-month contract
Experience: 5+ years, immediate joiner

1. Experience of 5+ years.
2. Hands-on data modelling for OLTP and OLAP systems.
3. In-depth knowledge of conceptual, logical, and physical data modelling.
4. Strong understanding of indexing, partitioning, and data sharding, with practical experience of having done the same.
5. Strong understanding of the variables impacting database performance for near-real-time reporting and application interaction.
6. Working experience with at least one data modelling tool, preferably DbSchema.
7. Functional knowledge of the mutual fund industry will be a plus.
8. Good understanding of GCP databases like AlloyDB, Cloud SQL, and BigQuery.

If you are interested, please share your resume with prachi@iitjobs.com.
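Point 4 above asks for practical sharding experience. The core mechanism is routing each record to a shard by a stable hash of its key; a minimal sketch in plain Python (illustrative only, not tied to any GCP product, and the key `customer-42` is a made-up example):

```python
import hashlib

def shard_for(key: str, num_shards: int) -> int:
    """Route a record to a shard via a stable hash of its key.
    hashlib gives the same digest across processes and machine restarts,
    unlike Python's built-in hash(), which is salted per process."""
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return int(digest, 16) % num_shards

shard = shard_for("customer-42", 4)   # always the same shard for this key
```

Note the trade-off this simple modulo scheme carries: changing `num_shards` remaps most keys, which is why production systems often prefer consistent hashing or range partitioning.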
Posted 3 months ago
1 - 5 years
5 - 9 Lacs
Hyderabad
Work from Office
Skills Required: Power BI, DAX, ETL (Extract, Transform, Load), SQL Server Reporting Services (SSRS), SQL Server Integration Services (SSIS)

Job Opportunity: Power BI Developer
Job Type: Full-Time
Location: Hyderabad
Experience: 5+ Years
Notice Period: Immediate to 30 days (candidates currently serving notice)
Work Model: Work from Office

Summary: We are seeking a talented Power BI developer to join our Business Intelligence team. A Power BI developer is responsible for designing, developing, and maintaining business intelligence solutions using Microsoft's Power BI platform. Their primary role is to turn data into meaningful insights that can be used to make informed business decisions.

Responsibilities:
- Design, develop, and maintain Power BI dashboards and reports.
- Transform raw data into meaningful metrics and visualizations.
- Implement row-level security on data and understand application security layer models in Power BI.
- Integrate multiple data sources, including databases, Excel spreadsheets, and cloud-based and on-premises sources.
- Use Power Query to manipulate data from various sources.
- Optimize dashboards and reports for performance.
- Ensure data refresh processes are efficient and timely.
- Collaborate with stakeholders to understand and refine report requirements.
- Present complex information using data visualization techniques.
- Provide training and support to business users on Power BI.
- Ensure data accuracy and integrity.
- Troubleshoot issues with reports and data models.
- Keep up to date with the latest Power BI updates and features.
- Document technical specifications, data models, and processes.
- Maintain a library of model documents, templates, and other reusable knowledge assets.

Requirements:
- 4+ relevant years of experience.
- Proven experience with Power BI or other BI tools.
- Strong understanding of database management systems, online analytical processing (OLAP), and the ETL (Extract, Transform, Load) framework.
- Proficiency in DAX (Data Analysis Expressions).
- Familiarity with SQL Server Reporting Services (SSRS) and SQL Server Integration Services (SSIS) is a plus.
- Strong analytical skills with the ability to collect, organize, and analyze significant amounts of information.
- Excellent communication and presentation skills.
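The responsibilities above mention row-level security. Outside the BI tool, the underlying idea is simply filtering rows by the viewer's entitlement before they see the data; a hedged, tool-agnostic sketch in Python (in Power BI itself this would be a DAX filter on an RLS role, and the `region` field here is purely illustrative):

```python
def apply_row_level_security(rows, user_regions):
    """Return only the rows whose region the user is entitled to see,
    the same idea a row-level-security role expresses declaratively."""
    allowed = set(user_regions)
    return [r for r in rows if r["region"] in allowed]

sales = [
    {"region": "APAC", "amount": 100},
    {"region": "EMEA", "amount": 200},
]
# A user entitled only to APAC sees one of the two rows.
visible = apply_row_level_security(sales, ["APAC"])
```

The key property is that the filter runs on every query for every user, so the same report definition safely serves audiences with different entitlements.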
Posted 3 months ago
4 - 8 years
6 - 10 Lacs
Bengaluru
Work from Office
The Database Test and Tools Development for Linux/Unix OS platforms team is looking for bright and talented engineers to work on the Linux on zSeries platform. It is an opportunity to demonstrate your skills as a Test Development Engineer. The team has the unique opportunity to make significant contributions to Oracle database technology stack testing across different vendor platforms like zLinux and LoP.

Detailed Description and Job Requirements:
The team works on upcoming releases of the Oracle Database: XML/XDB, Real Application Clusters, Flashback, Oracle Storage Appliance, Automatic Storage Management, data access, data warehouse, transaction management, optimization, parallel query, ETL, OLAP, replication/streams, advanced queuing/messaging, Oracle Text, backup/recovery, high availability, and more functional areas. The team has good opportunities to learn, and to identify and work on initiatives to improve productivity, quality, testing infrastructure, and tools for automation.

We are looking for engineers who meet the requirements below:
- B.E/B.Tech in CS or equivalent, with a consistently good academic record and 4+ years of experience.
- Strong in Oracle SQL, PL/SQL, and database concepts.
- Experience with the UNIX operating system; good grasp of UNIX concepts, commands, and services.
- Knowledge of C/C++ or Java.
- Experience with shell scripting, Perl, and Python; proficiency in any one or two.
- Good communication skills.
- Good debugging skills.
Posted 3 months ago
12 - 16 years
20 - 35 Lacs
Pune
Work from Office
Senior Data Modeller
- More than 12-17 years of experience.
- Banking DWH project and Teradata experience.
- Teradata FSDM banking data model implementation experience (if not, at least IBM BDW, which is similar; FSDM will be a big plus).

Required Candidate profile:
- Current expectation is onsite.
- English is required; Arabic is preferred and will be a plus.
Posted 3 months ago
11 - 20 years
35 - 50 Lacs
Pune, Delhi NCR, Hyderabad
Hybrid
Exp - 11 to 20 Years
Role - Data Modeling Architect / Senior Architect / Principal Architect
Position - Permanent
Locations - Hyderabad, Bangalore, Chennai, Pune, Delhi NCR

JD:
- 10+ years of experience in Data Warehousing & Data Modeling.
- Decent SQL knowledge; able to suggest modeling approaches and solutioning for a given problem.
- Significant experience in one or more RDBMS (Oracle, DB2, and SQL Server).
- Real-time experience working with OLAP & OLTP database models (dimensional models).
- Comprehensive understanding of Star schema, Snowflake schema, and Data Vault modelling; also of any ETL tool, data governance, and data quality.
- An eye for analyzing data and comfortable following agile methodology.
- Adept understanding of any of the cloud services is preferred (Azure, AWS & GCP).
- Enthusiasm to coach team members, collaborate with various stakeholders across the organization, and take complete ownership of deliverables.
- Experience contributing to proposals and RFPs.
- Good experience in stakeholder management.
- Decent communication skills and experience leading a team.
Posted 3 months ago
5 - 10 years
20 - 35 Lacs
Pune, Delhi NCR, Bengaluru
Hybrid
Exp - 5 to 10 Yrs
Role - Data Modeller
Position - Permanent
Job Locations - Delhi, Pune, Chennai, Hyderabad, Bangalore

Experience & Skills:
- 5+ years of experience with strong Data Modeling and Data Warehousing skills.
- Able to suggest modeling approaches for a given problem.
- Significant experience in one or more RDBMS (Oracle, DB2, and SQL Server).
- Real-time experience working with OLAP & OLTP database models (dimensional models).
- Comprehensive understanding of Star schema, Snowflake schema, and Data Vault modelling; also of any ETL tool, data governance, and data quality.
- An eye for analyzing data and comfortable following agile methodology.
- Adept understanding of any of the cloud services is preferred (Azure, AWS & GCP).
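The star schema asked for above is just a fact table of measures joined to dimension tables of attributes. A minimal sketch using Python's built-in sqlite3 (SQLite stands in for Oracle/DB2/SQL Server here; the `fact_sales`/`dim_product` tables are made-up examples):

```python
import sqlite3

# In-memory star schema: one fact table keyed to one dimension.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales  (product_id INTEGER, amount REAL);
    INSERT INTO dim_product VALUES (1, 'Books'), (2, 'Games');
    INSERT INTO fact_sales  VALUES (1, 10.0), (1, 5.0), (2, 7.5);
""")
# A typical OLAP rollup: aggregate facts grouped by a dimension attribute.
rows = conn.execute("""
    SELECT d.category, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d USING (product_id)
    GROUP BY d.category ORDER BY d.category
""").fetchall()
```

A snowflake schema normalizes the dimensions further (e.g., category into its own table), trading one extra join for less redundancy.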
Posted 3 months ago
5 - 9 years
7 - 11 Lacs
Bengaluru
Work from Office
What You Will Do:
- Be part of the data team to design and build a BI and analytics solution.
- Implement batch and near-real-time data movement design patterns and define best practices in data engineering.
- Design and develop optimal cloud data solutions (lakes, warehouses, marts, analytics) by collaborating with diverse IT teams, including business analysts, project managers, architects, and developers.
- Work closely with a team of data engineers and data analysts to procure, blend, and analyze data for quality and distribution, ensuring key elements are harmonized and modeled for effective analytics, while operating in a fluid, rapidly changing data environment.
- Build data pipelines from a wide variety of sources.
- Demonstrate strong conceptual, analytical, and problem-solving skills and the ability to articulate ideas and technical solutions effectively to external IT partners as well as internal data team members.
- Work with cross-functional teams, on-shore/off-shore, development/QA teams, and vendors in a matrixed environment for data delivery.
- Backtrack and troubleshoot failures and provide fixes as needed.
- Update and maintain key data cloud solution deliverables and diagrams.
- Ensure conformance and compliance using Georgia-Pacific data architecture guidelines and enterprise data strategic vision.
- May participate in a 24x7 on-call rotation once development is complete.

Who You Are (Basic Qualifications):
- Bachelor's degree in Computer Science, Engineering, or a related IT area, with at least 5+ years of experience in software development.
- Primary skill set: data engineering, Python (especially strong in object-oriented programming concepts), AWS (Glue, Lambda, EventBridge, Step Functions, and serverless architecture), columnar DBs (Redshift or Snowflake), Matillion (or any ETL tool).
- Secondary skill set: working with APIs, Spark, Git/CICD, SQL, Step Functions.
- At least 2+ years of hands-on experience in designing, implementing, and managing large-scale ETL solutions.
- At least 3 years of hands-on experience in business intelligence, data modelling, data engineering, ETL, and multi-dimensional data warehouses and cubes, with expertise in relevant languages and frameworks like SQL and Python.
- Hands-on experience designing and fine-tuning queries in Redshift.
- Strong knowledge of data engineering, data warehousing, OLAP, and database concepts.
- Understanding of common DevSecOps/DataOps and CI/CD processes, methodologies, and technologies like GitLab and Terraform.
- Able to analyze large, complex data sets to resolve data quality issues.

What Will Put You Ahead:
- AWS certifications like Solutions Architect (SAA/SAP) or Data Analytics Specialty (DAS).
- Hands-on experience with AWS data technologies and at least one full life cycle project building a data solution in AWS.
- Exposure to visualization tools such as Tableau or Power BI.
- Experience with OLAP technologies and data virtualization (using Denodo).
- Knowledge of accounting and finance.
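The qualifications above emphasize Python object-oriented programming applied to ETL. A minimal, hedged sketch of that combination (no AWS services involved; the class and field names are invented for illustration), separating the extract, transform, and load stages into swappable objects:

```python
class CsvExtractor:
    """Extract: parse raw CSV-style lines into dicts."""
    def extract(self, lines):
        header = lines[0].split(",")
        return [dict(zip(header, row.split(","))) for row in lines[1:]]

class AmountTransformer:
    """Transform: cast amounts to float and drop rows that fail validation."""
    def transform(self, records):
        out = []
        for rec in records:
            try:
                rec["amount"] = float(rec["amount"])
                out.append(rec)
            except ValueError:
                pass  # a real pipeline would quarantine bad rows instead
        return out

class ListLoader:
    """Load: append to an in-memory target (a warehouse table in practice)."""
    def __init__(self):
        self.target = []
    def load(self, records):
        self.target.extend(records)

def run_pipeline(lines):
    loader = ListLoader()
    loader.load(AmountTransformer().transform(CsvExtractor().extract(lines)))
    return loader.target

result = run_pipeline(["id,amount", "1,10.5", "2,oops", "3,2.0"])
```

Because each stage is its own class, swapping the CSV extractor for an API reader or the list loader for a Redshift writer changes one object, not the pipeline.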
Posted 3 months ago
8 - 13 years
30 - 35 Lacs
Bengaluru
Work from Office
Join our Team

About this opportunity: Join the Ericsson team as an IT Data Engineer and contribute to our digital journey. In this role, you'll design, build, test, and maintain data and analytics solutions, utilizing the latest technologies and platforms, to ensure the availability and accessibility of data across multiple consumer channels. We place emphasis on actualizing solutions per Ericsson's standards and architectural designs; you'll work with both small and big data, create efficient, scalable, and flexible data models and flows, and provide specific analytical insights to meet business requirements.

What you bring:
- Solid experience in SAP HANA and SAP BODS development.
- Experience creating SAP HANA calculation views, stored procedures, and BODS ETL to connect with different SAP and third-party sources.
- Good knowledge of data warehousing concepts (OLAP vs OLTP).
- Strong knowledge of SAP HANA SQL and stored procedures.
- Understanding of data integration across SAP ECC, SAP HANA, and BODS.
- Good knowledge of Snowflake.
- Knowledge of optimization and best practices for data modelling and reporting.

Experience: 5-8 years

Why join Ericsson? At Ericsson, you'll have an outstanding opportunity: the chance to use your skills and imagination to push the boundaries of what's possible, and to build never-seen-before solutions to some of the world's toughest problems. You'll be challenged, but you won't be alone. You'll be joining a team of diverse innovators, all driven to go beyond the status quo to craft what comes next.

What happens once you apply? Click here to find out all you need to know about what our typical hiring process looks like.

Encouraging a diverse and inclusive organization is core to our values at Ericsson; that's why we nurture it in everything we do. We truly believe that by collaborating with people with different experiences we drive innovation, which is essential for our future growth. We encourage people from all backgrounds to apply and realize their full potential as part of our Ericsson team. Ericsson is proud to be an Equal Opportunity and Affirmative Action employer.
Posted 3 months ago
5 - 10 years
22 - 27 Lacs
Pune
Work from Office
Responsibilities As a Functional Consultant, responsible for performing detailed analysis of the Banks Data Warehouse requirements against the business systems, the data and processes for various change request and translate to the IT changes needed. Also analyze the data flows and reporting processes, create, update and maintain functional specification documentation, document issues, support in identify root causes and find solutions. Support development of technical designs and user documentation Skills Must have At least 5-10 years of relevant experience Banking /Financial Services, Regulatory reporting and Enterprise Datawarehouse experience is required. Knowledge of AnaCredit and or FinRep / CoRep would be a big plus. Technical Requirements: Able to create and execute complex SQL queries. Strong understanding of Database concepts A good understanding of data modelling. A background in development. Experience with MS SQL Server and SSIS. Experience with MS Visual Studio Experience with data warehousing, datamarts and OLAP Strong experience working with interfaces and have a good understanding of ETL techniques and reporting technologies. Required Experience: Experienced in translating Business requirements to Technical/Functional Requirements (Creating,Updating and Maintaining functional specification document) Understanding of the data governance life cycle. Experienced in analyzing the data flows and reporting processes Experienced in documenting issues, support in identify root causes and find solutions. The ability to understand financial products and how they are reported on by Risk, Product Control and Finance departments within an investment bank. Experience with Jira. Have worked in a fast paced BAU environment and appreciate deadlines the regulatory reporting teams are working towards. Pay great attention to detail and be client driven with a focus on delivery and milestones. 
Have exceptional communication skills, being able to converse with a wide variety of stakeholders, including department heads. Experience with an agile/Scrum way of working. Flexibility is a must. Self-starter who is proactive and driven. Detailed and structured way of working. Strong communication skills.
Nice to have
Experience in OneSumX; analytics tools such as Power BI, Excel, SSRS, SSAS; WinSCP. Training in data warehousing, BI, data modelling, Microsoft SQL Server, SSIS, and Scrum.
Other Languages: English: C2 Proficient
Seniority: Senior
Pune, India | Req. VR-111959 | Functional/System Analysis | BCM Industry | 25/02/2025
Posted 3 months ago
5 - 6 years
7 - 11 Lacs
Hyderabad
Work from Office
The essential functions include, but are not limited to, the following:
Business Requirements Analysis & Stakeholder Engagement
• Work directly with business stakeholders to gather reporting needs and translate them into effective technical solutions.
• Conduct requirement analysis, prototyping, and validation to ensure accuracy and usability.
• Provide user training and ongoing support to ensure adoption and success.
Develop and Maintain Power BI Reports & Dashboards
• Design, build, and optimize dashboards, reports, and analytics using Power BI.
• Integrate data from SQL Server, ERPs, and other business systems.
• Ensure reports are intuitive, visually engaging, and easily understood by business users.
Support Data Infrastructure: Implementation of Data Lake & Databricks
• Assist in the design and implementation of tools like Data Lake and Databricks for enterprise-wide data management.
• Collaborate with internal and 3rd-party data engineers and other technical resources to ensure seamless data integration and transformation.
• Ensure data governance, security, and best practices are followed.
Data Modeling & ETL Development
• Design and develop data models to support analytics and reporting needs.
• Create and optimize ETL processes for data ingestion, transformation, and reporting.
• Ensure data accuracy, consistency, and performance optimization.
Collaboration & Continuous Improvement
• Partner with cross-functional teams to align data solutions with business objectives.
• Stay updated with emerging BI and data technologies to drive continuous improvement.
• Advocate for best practices in data visualization, analytics, and user experience.
MINIMUM QUALIFICATIONS (KNOWLEDGE, SKILLS, AND ABILITIES)
Technical:
• Power BI expertise: advanced skills in DAX, Power Query (M), and report design.
• SQL Server database knowledge: ability to query, manipulate, and model data.
• Data integration: experience integrating data from ERPs, CRMs, and other business applications.
• Cloud data services: familiarity with Data Lake, Databricks, and related toolsets on Azure and/or AWS.
• ETL and data transformation: experience with tools like Azure Data Factory, SSIS, or similar.
• Data modeling: strong understanding of star schema, snowflake schema, and OLAP concepts.
• Working knowledge of Python or Spark for working with Databricks and large-scale data processing.
• Technical documentation skills: ability to document technical processes, designs, and user instructions clearly and effectively.
Non-technical:
• Excellent verbal and written communication: ability to communicate complex technical solutions clearly to both technical and non-technical stakeholders.
• Team player and collaboration: ability to work effectively with a geographically dispersed team, including remote collaboration with the systems teams based in Canada and the US.
• Strong business process understanding: ability to quickly understand business processes, work with business stakeholders to gather requirements, and translate those into technical solutions.
• User experience focus: understanding of the importance of creating user-friendly solutions that ensure a seamless experience for end users.
• Problem solving and adaptability: ability to think critically and creatively to solve problems and adapt to changing business needs.
• Independent and self-driven: ability to function productively with minimal supervision, allied with sound judgement on when to ask for help or direction.
• Excellent business acumen: solid knowledge and experience of how businesses function, how stakeholders interact, and business strategy and objectives.
The ideal candidate will possess the following:
Education: Bachelor's degree in Computer Science, Information Systems, Data Analytics, or a related field.
Certifications:
• Microsoft Certified: Power BI Data Analyst Associate
• Additional related certifications in Azure and/or AWS are highly desirable.
Experience:
• 5+ years of experience in business intelligence, data analytics, or data engineering.
• Experience working with ERP systems such as Acumatica, Yardi, or Sage 300.
• Proven track record of engaging with business users to deliver actionable insights.
• Experience working in a remote, cross-functional, international team is a plus.
Posted 3 months ago
5 - 7 years
8 - 10 Lacs
Hyderabad
Work from Office
Amazon cloud database technologies. Strong grasp of SQL and query languages. Experience with Linux operating systems.
Preferred: Experience with 2 or more of the following: MySQL, DB2, PostgreSQL, MongoDB. Excellent troubleshooting and analytical skills with a focus on preventative solutions. Ability to work independently, manage priorities, and meet deadlines in a fast-paced environment. Self-directed, operating with urgency, focus, and discipline. Positive attitude and team-driven. Proven expertise managing complex data systems. Experience supporting highly transactional OLTP databases.
Experience with some of the following technologies and concepts is not required but would be a plus: MS SQL Server, Microsoft operating systems, AWS CLI, AWS Backup, OLTP, DSS, OLAP, data archiving, Linux shell scripting, Agile methodologies, cloud automation.
Education and Experience: Bachelor's degree in a technical discipline or equivalent experience. At least 5 years of experience performing database administration functions in AWS.
Posted 3 months ago
2 - 7 years
4 - 9 Lacs
Bengaluru
Work from Office
Design, develop, test, and support dynamic, interactive, and responsive data visualizations using Power BI and DAX. Develop and execute SQL scripts for data modelling. Collaborate with data analysts and business stakeholders to identify business requirements, design data visualizations, and develop reports that meet their needs. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, information management, and associated technologies. Communicate risks and ensure understanding of these risks.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise
Minimum of 2+ years of related experience required. Experience in production-grade Power BI dashboards with optimized data modeling. Good hands-on experience with complex SQL and DAX-based data modeling. Experience with data visualization best practices, including storytelling, color theory, and human-centered design. Ability to clearly communicate complex business problems and technical solutions. Well versed in data warehouse schemas and OLAP techniques.
Preferred technical and professional experience
Ability to manage and make decisions about competing priorities and resources. Must be a strong team player/leader. Strong oral, written, and interpersonal skills for interacting with all levels of the organization.
Posted 3 months ago
2 - 7 years
4 - 9 Lacs
Gurgaon
Work from Office
Design, develop, test, and support dynamic, interactive, and responsive data visualizations using Power BI and DAX. Develop and execute SQL scripts for data modelling. Collaborate with data analysts and business stakeholders to identify business requirements, design data visualizations, and develop reports that meet their needs. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, information management, and associated technologies. Communicate risks and ensure understanding of these risks.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise
Minimum of 2+ years of related experience required. Experience in production-grade Power BI dashboards with optimized data modeling. Good hands-on experience with complex SQL and DAX-based data modeling. Experience with data visualization best practices, including storytelling, color theory, and human-centered design. Ability to clearly communicate complex business problems and technical solutions. Well versed in data warehouse schemas and OLAP techniques.
Preferred technical and professional experience
Ability to manage and make decisions about competing priorities and resources. Must be a strong team player/leader. Strong oral, written, and interpersonal skills for interacting with all levels of the organization.
Posted 3 months ago
10 - 14 years
30 - 37 Lacs
Chennai, Bengaluru, Hyderabad
Hybrid
Curious about the role? What would your typical day look like? As an Architect, you will work to solve some of the most complex and captivating data management problems, enabling clients to operate as a data-driven organization. You will seamlessly switch between the roles of Individual Contributor, team member, and Data Modeling Architect as demanded by each project to define, design, and deliver actionable insights. On a typical day, you might:
• Engage clients and understand the business requirements to translate those into data models.
• Analyze customer problems, propose solutions from a data-structure perspective, and estimate and deliver the proposed solutions.
• Create and maintain a Logical Data Model (LDM) and Physical Data Model (PDM) by applying best practices to provide business insights.
• Use a data modelling tool to create appropriate data models.
• Create and maintain the source-to-target data mapping document, which includes documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
• Gather and publish data dictionaries.
• Ideate, design, and guide the teams in building automations and accelerators.
• Help maintain data models, as well as capture data models from existing databases and record descriptive information.
• Contribute to building data warehouses and data marts (on the cloud) while performing data profiling and quality analysis.
• Use version control to maintain versions of data models.
• Collaborate with Data Engineers to design and develop data extraction and integration code modules.
• Partner with data engineers and testing practitioners to strategize ingestion logic, consumption patterns, and testing.
• Ideate to design and develop the next-gen data platform by collaborating with cross-functional stakeholders.
• Work with the client to define, establish, and implement the right modelling approach for the requirement.
• Help define standards and best practices.
• Monitor project progress to keep the leadership teams informed of milestones, impediments, etc.
• Coach team members and review code artifacts.
• Contribute to proposals and RFPs.
Job Requirement
What do we expect?
• 10+ years of experience in the data space.
• Decent SQL knowledge.
• Able to suggest modeling approaches for a given problem.
• Significant experience in one or more RDBMSs (Oracle, DB2, SQL Server).
• Hands-on experience working with OLAP and OLTP database models (dimensional models).
• Comprehensive understanding of star schema, snowflake schema, and Data Vault modelling, as well as of any ETL tool, data governance, and data quality.
• An eye for analyzing data and comfort with following agile methodology.
• A good understanding of any of the cloud services (Azure, AWS, GCP) is preferred.
• Enthusiasm for coaching team members, collaborating with various stakeholders across the organization, and taking complete ownership of deliverables.
• Experience contributing to proposals and RFPs.
• Good experience in stakeholder management.
• Good communication skills and experience leading a team.
You are important to us, let's stay connected! Every individual comes with a different set of skills and qualities, so even if you don't tick all the boxes for the role today, we urge you to apply, as there might be a suitable/unique role for you tomorrow. We are an equal-opportunity employer. Our diverse and inclusive culture and values guide us to listen, trust, respect, and encourage people to grow the way they desire. Note: The designation will be commensurate with expertise and experience. Compensation packages are among the best in the industry.
Posted 3 months ago
6 - 10 years
22 - 30 Lacs
Chennai, Bengaluru, Hyderabad
Hybrid
Curious about the role? What would your typical day look like? As a Senior Data Engineer, you will work to solve some of the organizational data management problems, enabling the organization to operate as data-driven. You will seamlessly switch between the roles of Individual Contributor, team member, and Data Modeling Lead as demanded by each project to define, design, and deliver actionable insights. On a typical day, you might:
• Engage clients and understand the business requirements to translate those into data models.
• Create and maintain a Logical Data Model (LDM) and Physical Data Model (PDM) by applying best practices to provide business insights.
• Contribute to data modeling accelerators.
• Create and maintain the source-to-target data mapping document, which includes documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
• Gather and publish data dictionaries.
• Help maintain data models, as well as capture data models from existing databases and record descriptive information.
• Use a data modelling tool to create appropriate data models.
• Contribute to building data warehouses and data marts (on the cloud) while performing data profiling and quality analysis.
• Use version control to maintain versions of data models.
• Collaborate with Data Engineers to design and develop data extraction and integration code modules.
• Partner with the data engineers to strategize ingestion logic and consumption patterns.
Job Requirement
Expertise and Qualifications: What do we expect?
• 6+ years of experience in the data space.
• Decent SQL skills.
• Significant experience in one or more RDBMSs (Oracle, DB2, SQL Server).
• Hands-on experience working with OLAP and OLTP database models (dimensional models).
• Good understanding of star schema, snowflake schema, and Data Vault modelling, as well as of any ETL tool, data governance, and data quality.
• An eye for analyzing data and comfort with following agile methodology.
• A good understanding of any of the cloud services (Azure, AWS, GCP) is preferred.
You are important to us, let's stay connected! Every individual comes with a different set of skills and qualities, so even if you don't tick all the boxes for the role today, we urge you to apply, as there might be a suitable/unique role for you tomorrow. We are an equal-opportunity employer. Our diverse and inclusive culture and values guide us to listen, trust, respect, and encourage people to grow the way they desire. Note: The designation will be commensurate with expertise and experience. Compensation packages are among the best in the industry.
Posted 3 months ago
8 - 13 years
40 - 80 Lacs
Bengaluru
Work from Office
Join our Team
About this opportunity: Join the Ericsson team as an IT Data Engineer and contribute to our digital journey. In this role, you'll design, build, test, and maintain data and analytics solutions, utilizing the latest technologies and platforms, to ensure the availability and accessibility of data across multiple consumer channels. We place emphasis on actualizing solutions per Ericsson's standards and architectural designs; you'll work with both small and big data, create efficient, scalable, and flexible data models and flows, and provide specific analytical insights to meet business requirements.
What you bring: Solid experience in SAP HANA and SAP BODS development. Experience creating SAP HANA calculation views and stored procedures, and building BODS ETL jobs to connect with different SAP and third-party sources. Good knowledge of data warehousing concepts (OLAP vs. OLTP). Strong knowledge of SAP HANA SQL and stored procedures. Understanding of data integration across SAP ECC, SAP HANA, and BODS. Good knowledge of Snowflake. Knowledge of optimization and best practices for data modelling and reporting.
Experience: 5-8 years
Why join Ericsson? At Ericsson, you'll have an outstanding opportunity: the chance to use your skills and imagination to push the boundaries of what's possible, and to build never-seen-before solutions to some of the world's toughest problems. You'll be challenged, but you won't be alone. You'll be joining a team of diverse innovators, all driven to go beyond the status quo to craft what comes next.
What happens once you apply? Click Here to find all you need to know about what our typical hiring process looks like. Encouraging a diverse and inclusive organization is core to our values at Ericsson; that's why we nurture it in everything we do. We truly believe that by collaborating with people with different experiences we drive innovation, which is essential for our future growth.
Posted 3 months ago
3 - 5 years
11 - 12 Lacs
Bengaluru
Work from Office
Role: Data Engineer. Experience: 4 to 6 years. Location: Bangalore. Mandatory skills: Python, PySpark, AWS. Proven experience with cloud platforms (e.g., AWS). Strong proficiency in Python, PySpark, and R, and familiarity with additional programming languages such as C++, Rust, or Java. Expertise in designing ETL architectures for batch and streaming processes, database technologies (OLTP/OLAP), and SQL. Experience with Apache Spark and multi-cloud platforms (AWS, GCP, Azure). Knowledge of data governance and GxP data contexts; familiarity with the Pharma value chain is a plus.
Posted 3 months ago
3 - 5 years
11 - 12 Lacs
Bengaluru
Work from Office
Experience: 3 to 5 years. Location: Bangalore. Mandatory skills: Python, PySpark, AWS. Good to have: Palantir. Proven experience with cloud platforms (e.g., Palantir, AWS) and data security best practices. Strong proficiency in Python, PySpark, and R, and familiarity with additional programming languages such as C++, Rust, or Java. Expertise in designing ETL architectures for batch and streaming processes, database technologies (OLTP/OLAP), and SQL. Experience with Apache Spark and multi-cloud platforms (AWS, GCP, Azure). Knowledge of data governance and GxP data contexts; familiarity with the Pharma value chain is a plus.
Posted 3 months ago
With the increasing demand for data analysis and business intelligence, OLAP (Online Analytical Processing) jobs have become popular in India. OLAP professionals are responsible for designing, building, and maintaining OLAP databases to support data analysis and reporting activities for organizations. If you are looking to pursue a career in OLAP in India, here is a comprehensive guide to help you navigate the job market.
These cities are known for having a high concentration of IT companies and organizations that require OLAP professionals.
The average salary range for OLAP professionals in India varies based on experience levels. Entry-level professionals can expect to earn around INR 4-6 lakhs per annum, while experienced professionals with 5+ years of experience can earn upwards of INR 12 lakhs per annum.
Career progression in OLAP typically follows a trajectory from Junior Developer to Senior Developer, and then to a Tech Lead role. As professionals gain experience and expertise in OLAP technologies, they may also explore roles such as Data Analyst, Business Intelligence Developer, or Database Administrator.
In addition to OLAP expertise, professionals in this field are often expected to have knowledge of SQL, data modeling, ETL (Extract, Transform, Load) processes, data warehousing concepts, and data visualization tools such as Tableau or Power BI.
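To make the expected skills concrete, here is a minimal, hedged sketch of the kind of star-schema query OLAP roles revolve around: one fact table joined to a dimension table and rolled up along two dimensions. It uses Python's built-in sqlite3 module for portability; all table names, column names, and data are hypothetical examples, not tied to any specific employer's stack.

```python
import sqlite3

# In-memory database so the sketch is fully self-contained.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# The core of a star schema: a fact table referencing a dimension table.
cur.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales (product_id INTEGER, region TEXT, amount REAL);
INSERT INTO dim_product VALUES (1, 'Electronics'), (2, 'Furniture');
INSERT INTO fact_sales VALUES (1, 'North', 100.0), (1, 'South', 150.0),
                              (2, 'North', 80.0);
""")

# A typical OLAP-style roll-up: total sales per category per region.
cur.execute("""
SELECT p.category, f.region, SUM(f.amount) AS total
FROM fact_sales AS f
JOIN dim_product AS p ON p.product_id = f.product_id
GROUP BY p.category, f.region
ORDER BY p.category, f.region
""")
rows = cur.fetchall()
for category, region, total in rows:
    print(category, region, total)
conn.close()
```

In a production OLAP environment the same pattern would run against a dedicated warehouse (e.g., SQL Server, Snowflake) with far larger dimensions, but the join-then-aggregate shape of the query is the same.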
As you prepare for OLAP job interviews in India, make sure to hone your technical skills, brush up on industry trends, and showcase your problem-solving abilities. With the right preparation and confidence, you can successfully land a rewarding career in OLAP in India. Good luck!