
106 OLTP Jobs - Page 4

JobPe aggregates listings for easy access; you apply directly on the original job portal.

10 - 18 years

30 - 45 Lacs

Pune, Delhi NCR, India

Hybrid

Source: Naukri

Exp: 10 to 18 years
Role: Data Modeling Architect / Senior Architect / Principal Architect
Position: Permanent
Locations: Hyderabad (Hybrid), Bangalore (Hybrid), Chennai (Hybrid), Pune (remote until the office opens), Delhi NCR (remote until the office opens)
JD:
- 10+ years of experience in Data Warehousing and Data Modeling: dimensional, relational, physical, and logical.
- Decent SQL knowledge; able to suggest modeling approaches and solutions for a given problem.
- Significant experience in one or more RDBMS (Oracle, DB2, SQL Server).
- Hands-on experience with OLAP and OLTP database models (dimensional models).
- Comprehensive understanding of Star schema, Snowflake schema, and Data Vault modeling, plus any ETL tool, Data Governance, and Data Quality.
- An eye for analyzing data and comfort with agile methodology.
- Working understanding of any of the cloud services (Azure, AWS, GCP) is preferred.
- Keen to coach team members, collaborate with various stakeholders across the organization, and take complete ownership of deliverables.
- Good experience in stakeholder management.
- Decent communication skills and experience leading a team.
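Several listings on this page hinge on the OLTP-versus-OLAP distinction and on star-schema design. The sketch below is purely illustrative: every table and column name is invented, and it uses only Python's built-in sqlite3 to build a minimal star schema (one fact table, two dimensions) and run the kind of roll-up query a dimensional model is optimized for.

```python
import sqlite3

# In-memory database; a minimal, hypothetical star schema for illustration.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension tables hold descriptive attributes.
cur.execute("CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER)")
cur.execute("CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT)")

# The fact table holds measures plus foreign keys to each dimension.
cur.execute("""
    CREATE TABLE fact_sales (
        date_key    INTEGER REFERENCES dim_date(date_key),
        product_key INTEGER REFERENCES dim_product(product_key),
        amount      REAL
    )
""")

cur.executemany("INSERT INTO dim_date VALUES (?, ?, ?)", [(1, 2024, 1), (2, 2024, 2)])
cur.executemany("INSERT INTO dim_product VALUES (?, ?)", [(10, "ATM"), (20, "POS")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 10, 500.0), (1, 20, 120.0), (2, 10, 300.0)])

# A typical OLAP roll-up: aggregate measures grouped by dimension attributes.
for row in cur.execute("""
    SELECT d.year, d.month, p.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.year, d.month, p.category
"""):
    print(row)
```

An OLTP model for the same domain would instead normalize for single-row inserts and updates; the dimensional form above trades that for fast aggregation.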

Posted 2 months ago

Apply

6 - 10 years

15 - 30 Lacs

Chennai, Hyderabad, Kolkata

Work from Office

Source: Naukri

About the client: Hiring for one of our multinational corporations!
Job Title: Snowflake Developer
Qualification: Graduate
Relevant Experience: 6 to 8 years
Must-Have Skills: Snowflake, Python, SQL
Roles and Responsibilities:
- Design, develop, and optimize Snowflake-based data solutions
- Write and maintain Python scripts for data processing and automation
- Work with cross-functional teams to implement scalable data pipelines
- Ensure data security and performance tuning in Snowflake
- Debug and troubleshoot database and data processing issues
Location: Kolkata, Hyderabad, Chennai, Mumbai
Notice Period: Up to 60 days
Mode of Work: On-site
Thanks & Regards,
Nushiba Taniya M
Black and White Business Solutions Pvt. Ltd., Bangalore, Karnataka, India
Direct Number: 08067432408 | Nushiba@blackwhite.in | www.blackwhite.in
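As a rough illustration of the Python-plus-Snowflake combination this role asks for, the sketch below uses the snowflake-connector-python package; the account, credentials, stage, and table names are placeholders, not details from the listing.

```python
import snowflake.connector  # pip install snowflake-connector-python

# All connection parameters here are hypothetical placeholders.
conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="my_password",
    warehouse="COMPUTE_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    # Example automation task: load staged files, then verify row counts.
    cur.execute("COPY INTO raw_events FROM @events_stage FILE_FORMAT = (TYPE = 'JSON')")
    cur.execute("SELECT COUNT(*) FROM raw_events")
    print("rows loaded:", cur.fetchone()[0])
finally:
    conn.close()
```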

Posted 2 months ago

Apply

3 - 5 years

13 - 15 Lacs

Gurgaon

Work from Office

Source: Naukri

Data is at the heart of our global financial network. The ability to consume, store, analyze, and gain insight from data has become a key component of our competitive advantage. Our goal is to build and maintain a leading-edge data platform that provides highly available, consistent data of the highest quality for all users of the platform, including our customers, operations teams, and data scientists. We focus on evolving our platform to deliver exponential scale to NCR Atleos, powering our future growth.

Data Engineers at NCR Atleos experience working at one of the largest and most recognized financial companies in the world, while being part of a software development team responsible for next-generation technologies and solutions. They partner with data and analytics experts to deliver high-quality analytical and derived data to our consumers. We are looking for Data Engineers who like to innovate and seek out complex problems. We recognize that strength comes from diversity and will embrace your unique skills, curiosity, drive, and passion while giving you the opportunity to grow technically and as an individual. Design is an iterative process, whether for UX, services, or infrastructure; our goal is to modernize and improve application capabilities.

Job Responsibilities: As a Data Engineer, you will join our Data Engineering Modernization team, transforming our global financial network and improving the data products and services we provide to our internal customers. This team leverages cutting-edge data engineering techniques to develop scalable solutions for managing data and building data products. In this role, you are expected to:
- Be involved from the inception of projects to understand requirements, and to architect, develop, deploy, and maintain data solutions.
- Work in a multi-disciplinary, agile squad, partnering with program and product managers to expand the product offering based on business demands. A focus on speed to market, getting data products and services into the hands of our stakeholders, and a passion to transform the financial industry are key to success in this role.
- Maintain a positive and collaborative working relationship with teams within the NCR Atleos technology organization as well as the wider business.
- Apply creative and inventive problem-solving to reduce turnaround times; this is required, valued, and a major part of the job.

The ideal candidate would have:
- BA/BS in Computer Science or equivalent practical experience.
- Experience applying machine learning and AI techniques to modernize data and reporting use cases.
- 3+ years of overall experience on Data Analytics or Data Warehousing projects.
- 2+ years of cloud experience on AWS/Azure/GCP (Azure preferred): Microsoft Azure, ADF, Synapse.
- Programming in Python and PySpark, with experience using pandas, ML libraries, etc.
- Data streaming with Flink or Spark Structured Streaming.
- Open-source orchestration frameworks such as DBT, ADF, and Airflow.
- Open-source data ingestion frameworks such as Airbyte and Debezium.
- Experience migrating from traditional on-prem OLTP/OLAP databases to cloud-native DBaaS and/or NoSQL databases such as Cassandra, Neo4j, and MongoDB.
- Deep expertise operating in a cloud environment and with cloud-native databases such as Cosmos DB and Couchbase.
- Proficiency in various data modeling techniques, such as ER, hierarchical, relational, or NoSQL modeling.
- Excellent design, development, and tuning experience with SQL (OLTP and OLAP) and NoSQL databases.
- Experience with modern database DevOps tools such as Liquibase, Redgate Flyway, or DBmaestro.
- Deep understanding of data security and compliance, and the related architecture.
- Deep understanding of, and strong administrative experience with, distributed data processing frameworks such as Hadoop and Spark.
- Experience with programming languages such as Python, Java, and Scala, and with machine learning libraries.
- Experience with DevOps tools such as Git, Maven, Jenkins, GitHub Actions, and Azure DevOps.
- Experience with Agile development concepts and related tools.
- Ability to tune and troubleshoot performance issues across the codebase and database queries.
- Excellent problem-solving skills, with the ability to think critically and creatively to develop innovative data solutions.
- Excellent written and verbal communication skills, with the ability to effectively convey complex technical concepts to a diverse audience.
- Passion for learning and a proactive mindset, with the ability to work independently and collaboratively in a fast-paced, dynamic environment.

Additional Skills:
- Leverage machine learning and AI techniques to operationalize data pipelines and build data products.
- Provide data services using APIs.
- Containerize data products and services using Kubernetes and/or Docker.
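As a sketch of the data-streaming skill set mentioned above, the snippet below uses Spark Structured Streaming to read from a Kafka topic and print windowed counts; the broker address and topic name are hypothetical, and the Kafka source requires the spark-sql-kafka package on the classpath.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (SparkSession.builder
         .appName("streaming-sketch")
         .getOrCreate())

# Hypothetical broker and topic names for illustration.
events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "transactions")
          .load())

# Kafka delivers bytes; cast the payload and aggregate per 1-minute window.
counts = (events
          .selectExpr("CAST(value AS STRING) AS value", "timestamp")
          .groupBy(F.window("timestamp", "1 minute"))
          .count())

query = (counts.writeStream
         .outputMode("complete")
         .format("console")
         .start())
query.awaitTermination()
```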

Posted 2 months ago

Apply

4 - 8 years

8 - 12 Lacs

Bengaluru

Work from Office

Source: Naukri

Requirements:
- Proficient in Azure / GCP / AWS cloud services (SaaS, PaaS, CaaS, and IaaS)
- Experience designing custom applications using microservices, event-driven architecture, single-page applications, and micro frontends
- Strong experience with front-end frameworks such as React or Angular
- Exposure to network security, infrastructure security and application security, data privacy, and compliance requirements
- Proficient in transactional and analytical database systems, including OLTP (Online Transaction Processing) for real-time data management and OLAP (Online Analytical Processing) for complex data analysis
- Skilled in NoSQL/document databases for flexible data storage and adept at leveraging data analytics for insightful decision-making
- Proven ability in setting up and managing CI/CD pipelines for seamless software delivery and infrastructure as code (IaC)

Posted 2 months ago

Apply

10 - 17 years

20 - 27 Lacs

Bengaluru

Work from Office

Source: Naukri

Responsibilities:
- Develop and maintain reporting applications using Cognos Business Intelligence components, including Report Studio, Framework Manager, Query Studio, and Cognos Connection.
- Design and implement complex reports in the Cognos 10 and 11 BI Suite with advanced features like drill-through, master-detail relationships, conditional formatting, report bursting, and macros.
- Enhance existing Framework packages by integrating new namespaces, query subjects, query items, and data tables based on evolving business requirements.
- Perform metadata modeling tasks such as creating projects, managing metadata, preparing business views, and publishing packages to the Cognos portal.
- Administer Cognos environments, including creating data source connections, deploying reports from development to production environments, and managing user permissions for packages and folders.
- Conduct OLTP/OLAP system analysis and maintain database schemas, such as Star and Snowflake schemas, to ensure efficient data management.
- Optimize report performance, write and debug complex SQL queries, and troubleshoot issues to maintain seamless reporting operations.
- Leverage domain expertise in banking and finance to align reporting solutions with industry-specific needs.
Qualifications:
- Proficiency in IBM Cognos BI Suite (versions 10 and 11), including Report Studio, Framework Manager, Query Studio, and Event Studio.
- Strong experience in metadata modeling and Cognos administration.
- Hands-on expertise in OLTP/OLAP systems and database schema design (Star and Snowflake schemas).
- Proficiency in SQL, with experience writing and troubleshooting complex queries.
- Knowledge of banking and finance processes and reporting needs is a significant advantage.
- A minimum of 5 years of relevant experience in Cognos BI development and administration.
- Excellent problem-solving abilities, attention to detail, strong communication skills, and the ability to work collaboratively with cross-functional teams.
Location: On-site, Jeddah (Middle East)

Posted 2 months ago

Apply

4 - 6 years

1 - 5 Lacs

Pune

Work from Office

Source: Naukri

- 4-6 years of experience as a Python Developer, with a strong understanding of Python programming concepts and best practices
- Bachelor's Degree/B.Tech/B.E in Computer Science or a related discipline
- Design, develop, and maintain robust and scalable Python-based applications, tools, and frameworks that integrate machine learning models and algorithms
- Demonstrated expertise in developing machine learning solutions, including feature selection, model training, and evaluation
- Proficiency in data manipulation libraries (e.g., Pandas, NumPy) and machine learning frameworks (e.g., Scikit-learn, TensorFlow, PyTorch, Keras)
- Experience with web frameworks like Django or Flask
- Contribute to the architecture and design of data-driven solutions, ensuring they meet both functional and non-functional requirements
- Experience with databases such as MS SQL Server, PostgreSQL, or MySQL; solid knowledge of OLTP and OLAP concepts
- Experience with CI/CD tooling (at least Git and Jenkins)
- Experience with the Agile/Scrum/Kanban way of working
- Self-motivated and hard-working
- Knowledge of performance testing frameworks, including Mocha and Jest
- Knowledge of RESTful APIs
- Understanding of AWS and Azure cloud services
- Experience with chatbot and NLU/NLP-based applications is required
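For the feature-selection, model-training, and evaluation loop this listing describes, a minimal scikit-learn sketch might look like the following; the dataset is synthetic and every name is illustrative.

```python
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# Synthetic data standing in for a real feature table.
rng = np.random.default_rng(0)
X = pd.DataFrame(rng.normal(size=(500, 10)), columns=[f"f{i}" for i in range(10)])
y = (X["f0"] + X["f1"] > 0).astype(int)  # label depends on two of the features

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Feature selection, scaling, and the model chained in one pipeline.
model = make_pipeline(
    SelectKBest(f_classif, k=5),
    StandardScaler(),
    LogisticRegression(),
)
model.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```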

Posted 3 months ago

Apply

5 - 10 years

3 - 8 Lacs

Hyderabad, Visakhapatnam, Vijayawada

Work from Office

Source: Naukri

We are looking for Trainers with 5+ years of experience as MS SQL Server Administrators to join our team and provide high-quality training to students. Send your CV to ahamedrathi@gmail.com or call us at 9697818888 (HR Rathi). Required candidate profile: The ideal candidate should have strong hands-on experience with MS SQL Server, as well as a passion for teaching and helping students develop their skills in database management and SQL programming.

Posted 3 months ago

Apply

6 - 10 years

15 - 30 Lacs

Pune, Bengaluru, Gurgaon

Hybrid

Source: Naukri

Role: Data Modeler
Location: Bangalore, Pune, Gurugram
Work Mode: Hybrid
Job Description:
- 6+ years of experience, with at least 3 years as a data modeler, Data Vault modeler, data architect, or in similar roles
- Analyze and translate business needs into long-term, optimized data models
- Design, develop, and maintain comprehensive conceptual, logical, and physical data models
- Develop best practices for data standards and data architecture
- Proficiency in data modeling tools such as ERwin, ER/Studio, or similar
- Strong understanding of data warehousing and Data Vault concepts
- Proficient in data analysis and profiling on database systems (e.g., SQL, Databricks)
- Good knowledge of data governance and data quality best practices
- Experience with cloud platforms (such as AWS, Azure, Google Cloud) and NoSQL databases (like MongoDB) will be an added advantage
- Excellent communication and interpersonal skills
- No constraint on working from the ODC if the client asks; flexible on work timings based on engagement needs
Mandatory: OLTP/OLAP concepts, relational and dimensional modeling, Star/Snowflake schema knowledge, Data Vault, strong knowledge of writing SQL queries, and entity-relationship concepts. Must have experience with an RDBMS such as Oracle or MS SQL and with conceptual, logical, and physical data modeling. Good at analyzing data.

Posted 3 months ago

Apply

2 - 3 years

0 - 0 Lacs

Mumbai

Work from Office

Source: Naukri

Job Title: Product Engineer - Big Data
Location: Mumbai
Experience: 3-8 years

Job Summary: As a Product Engineer - Big Data, you will be responsible for designing, building, and optimizing large-scale data processing pipelines using cutting-edge Big Data technologies. Collaborating with cross-functional teams (including data scientists, analysts, and product managers), you will ensure data is easily accessible, secure, and reliable. Your role will focus on delivering high-quality, scalable solutions for data storage, ingestion, and analysis, while driving continuous improvements throughout the data lifecycle.

Key Responsibilities:
- ETL Pipeline Development & Optimization: Design and implement complex end-to-end ETL pipelines to handle large-scale data ingestion and processing. Utilize AWS services like EMR, Glue, S3, MSK (Managed Streaming for Kafka), DMS (Database Migration Service), Athena, and EC2 to streamline data workflows, ensuring high availability and reliability.
- Big Data Processing: Develop and optimize real-time and batch data processing systems using Apache Flink, PySpark, and Apache Kafka. Focus on fault tolerance, scalability, and performance. Work with Apache Hudi for managing datasets and enabling incremental data processing.
- Data Modeling & Warehousing: Design and implement data warehouse solutions that support both analytical and operational use cases. Model complex datasets into optimized structures for high performance, easy access, and query efficiency for internal stakeholders.
- Cloud Infrastructure Development: Build scalable cloud-based data infrastructure leveraging AWS tools. Ensure data pipelines are resilient and adaptable to changes in data volume and variety, while optimizing costs and maximizing efficiency, using Managed Apache Airflow for orchestration and EC2 for compute resources.
- Data Analysis & Insights: Collaborate with business teams and data scientists to understand data needs and deliver high-quality datasets. Conduct in-depth analysis to derive insights from the data, identifying key trends, patterns, and anomalies to drive business decisions. Present findings in a clear, actionable format.
- Real-time & Batch Data Integration: Enable seamless integration of real-time streaming and batch data from systems like AWS MSK. Ensure consistency in data ingestion and processing across various formats and sources, providing a unified view of the data ecosystem.
- CI/CD & Automation: Use Jenkins to establish and maintain continuous integration and delivery pipelines. Implement automated testing and deployment workflows, ensuring smooth integration of new features and updates into production environments.
- Data Security & Compliance: Collaborate with security teams to ensure data pipelines comply with organizational and regulatory standards such as GDPR or HIPAA. Implement data governance practices to ensure integrity, security, and traceability throughout the data lifecycle.
- Collaboration & Cross-Functional Work: Partner with engineers, data scientists, product managers, and business stakeholders to understand data requirements and deliver scalable solutions. Participate in agile teams, sprint planning, and architectural discussions.
- Troubleshooting & Performance Tuning: Identify and resolve performance bottlenecks in data pipelines. Ensure optimal performance through proactive monitoring, tuning, and applying best practices for data ingestion and storage.

Skills & Qualifications (must-have):
- AWS Expertise: Hands-on experience with core AWS services related to Big Data, including EMR, Managed Apache Airflow, Glue, S3, DMS, MSK, Athena, and EC2. Strong understanding of cloud-native data architecture.
- Big Data Technologies: Proficiency in PySpark and SQL for data transformations and analysis. Experience with large-scale data processing frameworks like Apache Flink and Apache Kafka.
- Data Frameworks: Strong knowledge of Apache Hudi for data lake operations, including CDC (Change Data Capture) and incremental data processing.
- Database Modeling & Data Warehousing: Expertise in designing scalable data models for both OLAP and OLTP systems. In-depth understanding of data warehousing best practices.
- ETL Pipeline Development: Proven experience in building robust, scalable ETL pipelines for processing real-time and batch data across platforms.
- Data Analysis & Insights: Strong problem-solving skills with a data-driven approach to decision-making. Ability to conduct complex data analysis to extract actionable business insights.
- CI/CD & Automation: Basic to intermediate knowledge of CI/CD pipelines using Jenkins or similar tools to automate deployment and monitoring of data pipelines.
Required Skills: Big Data, ETL, AWS
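To give a flavor of the PySpark ETL work described above, here is a minimal batch sketch that reads raw JSON from S3, deduplicates it, and writes partitioned Parquet; the bucket names, paths, and columns are hypothetical, and running against S3 also requires the hadoop-aws package.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Hypothetical S3 locations for illustration.
raw = spark.read.json("s3a://example-raw-bucket/events/2024/")

cleaned = (raw
           .filter(F.col("event_type").isNotNull())      # drop malformed rows
           .withColumn("event_date", F.to_date("event_ts"))
           .dropDuplicates(["event_id"]))                # keep re-runs idempotent

(cleaned.write
 .mode("overwrite")
 .partitionBy("event_date")                              # partition for query pruning
 .parquet("s3a://example-curated-bucket/events/"))
```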

Posted 3 months ago

Apply

2 - 7 years

0 - 0 Lacs

Chennai

Work from Office

Source: Naukri

The data engineering team's mission is to enhance vehicle decoding accuracy and provide high availability and high resiliency as a core service to our ACV applications. Additionally, the team is responsible for database-to-database ETLs using different ingestion techniques. We are responsible for a range of critical tasks aimed at ensuring the smooth, efficient functioning and high availability of ACV's data platforms. We are a crucial bridge between the Infrastructure Operations, Data Infrastructure, Analytics, and Development teams, providing valuable feedback and insights to continuously improve platform reliability, functionality, and overall performance. As we expand our platform, we're offering a wide range of exciting opportunities across various roles. At ACV, we put people first and believe in the principles of trust and transparency. If you are looking for an opportunity to work with the best minds in the industry and solve unique business and technology problems, look no further! Join us in shaping the future of the automotive marketplace!

Who we are looking for: We are seeking a talented data professional to join our Data Engineering team as a Data Engineer. This role requires strong focus and experience in software development, multi-cloud technologies, and in-memory data stores, plus a strong desire to learn complex systems and new technologies. It requires a sound foundation in database and infrastructure architecture, deep technical knowledge, software development skills, excellent communication skills, and an action-based philosophy for solving hard software engineering problems.

What you will do: As part of the Data Engineering team you will be responsible for Python development for APIs and ETLs, application architecture, optimizing SQL queries, collaboration with teams on database and development support, and designing and developing scalable data services. As a Data Engineer at ACV Auctions you will design, develop, write, and modify code. You will work alongside other data engineers and data scientists on the design and development of solutions to ACV's most complex software problems. You are expected to operate in a high-performing team, balance high-quality delivery with customer focus, and have a record of delivering and guiding team members in a fast-paced environment.
- Actively and consistently support all efforts to simplify and enhance the customer experience.
- Design, develop, and maintain code and support for our web-based applications and ETLs using Python and FastAPI.
- Support multi-cloud application development.
- Design and build complex systems that can scale rapidly with little maintenance.
- Design and implement effective service/product interfaces.
- Contribute to, influence, and set standards for all technical aspects of a product or service, including but not limited to testing, debugging, performance, and languages.
- Support development stages for application development and data science teams, with an emphasis on Postgres database development.
- Influence team designs and the direction of the applications we own.
- Actively seek new or additional technologies to improve the data layer of our applications.
- Influence company-wide engineering standards for tooling, languages, and build systems.
- Leverage monitoring tools to ensure high performance and availability; work with operations and engineering to improve as required.
- Ensure that data development meets company standards for readability, reliability, and performance.
- Collaborate with internal teams on transactional and analytical schema design.
- Conduct code reviews, develop high-quality documentation, and build robust test suites.
- Respond to and troubleshoot highly complex problems quickly, efficiently, and effectively; this may include being part of the emergency after-hours on-call rotation.
- Perform additional duties as assigned.

What you will need:
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience).
- Ability to read, write, speak, and understand English.
- 1+ years of experience programming, building, and supporting SaaS web applications.
- 1+ years of experience programming in Python with FastAPI.
- 1+ years of experience with ETL workflow implementation (Airflow, Python).
- Knowledge of database architecture, infrastructure, performance tuning, and optimization techniques.
- Familiarity with database security principles and best practices.
- Strong communication and collaboration skills, with the ability to work effectively in a fast-paced global team environment.
- Some experience working with: cloud platforms, preferably AWS or GCP; SQL query writing and optimization; SQL data-layer development and OLTP schema design; cloud services, specifically AWS RDS, Aurora, S3, and GCP; GitHub, Jenkins, and Python.
#LI-NX1
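As a taste of the Python/FastAPI API work this role centers on, here is a minimal, self-contained sketch; the endpoint, model, and data are invented for illustration (a real service would back the lookup with Postgres).

```python
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

# A stand-in for a real Postgres-backed lookup table.
VEHICLES = {"1HGCM82633A004352": {"make": "Honda", "model": "Accord", "year": 2003}}

class Vehicle(BaseModel):
    make: str
    model: str
    year: int

@app.get("/vehicles/{vin}", response_model=Vehicle)
def decode_vin(vin: str) -> Vehicle:
    """Return decoded vehicle attributes for a VIN, or 404 if unknown."""
    record = VEHICLES.get(vin)
    if record is None:
        raise HTTPException(status_code=404, detail="VIN not found")
    return Vehicle(**record)

# Run with: uvicorn app:app --reload  (assuming this file is app.py)
```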

Posted 3 months ago

Apply

10 - 15 years

37 - 45 Lacs

Bengaluru

Work from Office

Source: Naukri

- Experience: Minimum of 10 years in database development and management roles.
- SQL Mastery: Advanced expertise in crafting and optimizing complex SQL queries and scripts.
- AWS Redshift: Proven experience in managing, tuning, and optimizing large-scale Redshift clusters.
- PostgreSQL: Deep understanding of PostgreSQL, including query planning, indexing strategies, and advanced tuning techniques.
- Data Pipelines: Extensive experience in ETL development and integrating data from multiple sources into cloud environments.
- Cloud Proficiency: Strong experience with AWS services like ECS, S3, KMS, Lambda, Glue, and IAM.
- Data Modeling: Comprehensive knowledge of data modeling techniques for both OLAP and OLTP systems.
- Scripting: Proficiency in Python, C#, or other scripting languages for automation and data manipulation.
Preferred Qualifications:
- Leadership: Prior experience leading database or data engineering teams.
- Data Visualization: Familiarity with reporting and visualization tools like Tableau, Power BI, or Looker.
- DevOps: Knowledge of CI/CD pipelines, infrastructure as code (e.g., Terraform), and version control (Git).
- Certifications: Any relevant certifications (e.g., AWS Certified Solutions Architect, AWS Certified Database - Specialty, PostgreSQL Certified Professional) will be a plus.
- Azure Databricks: Familiarity with Azure Databricks for data engineering and analytics workflows will be a significant advantage.
Soft Skills:
- Strong problem-solving and analytical capabilities.
- Exceptional communication skills for collaboration with technical and non-technical stakeholders.
- A results-driven mindset with the ability to work independently or lead within a team.
Qualification: Bachelor's or master's degree in Computer Science, Information Systems, Engineering, or equivalent, with 10+ years of experience.
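For the PostgreSQL query-planning and indexing work this role emphasizes, a small sketch using psycopg2 might look like the following; the connection string, table, and index names are placeholders.

```python
import psycopg2  # pip install psycopg2-binary

# Hypothetical connection string.
conn = psycopg2.connect("dbname=analytics user=admin host=localhost")
conn.autocommit = True

with conn.cursor() as cur:
    # An index targeted at a common filter column (illustrative).
    cur.execute("CREATE INDEX IF NOT EXISTS idx_orders_created ON orders (created_at)")

    # Inspect the plan to confirm the index is actually used.
    cur.execute(
        "EXPLAIN ANALYZE "
        "SELECT count(*) FROM orders "
        "WHERE created_at >= now() - interval '1 day'"
    )
    for (line,) in cur.fetchall():
        print(line)

conn.close()
```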

Posted 3 months ago

Apply

5 - 7 years

25 - 30 Lacs

Bengaluru

Work from Office

Source: Naukri

The Snowflake Solution Architect takes ownership of collaborating with data architects, analysts, and stakeholders to design optimal and scalable data solutions on the Snowflake platform. This position aims to enhance team effectiveness through high-quality and timely contributions, primarily adhering to standardized procedures and practices to achieve objectives and meet deadlines while exercising discretion in problem-solving. The role is based in Bangalore, India, reporting to the Head of SAC Snowflake Engineering.
- Design, develop, and maintain sophisticated data pipelines and ETL processes within Snowflake
- Craft efficient and optimized SQL queries for seamless data extraction, transformation, and loading
- Leverage Python for advanced data processing, automation tasks, and integration with various systems
- Implement and manage data modeling techniques, including OLTP, OLAP, and Data Vault 2.0 methodologies
- Oversee and optimize CI/CD pipelines using Azure DevOps to ensure smooth deployment of data solutions
- Uphold data quality, integrity, and compliance throughout the data lifecycle
- Troubleshoot, optimize, and enhance existing data processes and queries to boost performance
- Document data models, processes, and workflows clearly for future reference and knowledge sharing
- Employ advanced performance tuning techniques in Snowflake to optimize query performance and minimize data processing time
- Develop and maintain DBT models, macros, and tests for efficient data transformation management in Snowflake
- Manage version control using Git repositories, facilitating seamless code management and collaboration
- Design, implement, and maintain automated CI/CD pipelines using Azure DevOps for Snowflake and DBT deployment processes
Who you are:
- Hold a Bachelor's or Master's degree in Computer Science, Information Technology, or a related field
- A minimum of 5-7 years of proven experience as a Snowflake developer/architect or in a similar data engineering role
- Extensive hands-on experience with SQL and Python, showcasing proficiency in data manipulation and analysis
- Significant industry experience working with DBT (Data Build Tool) for data transformation
- Strong familiarity with CI/CD pipelines, preferably in Azure DevOps
- Deep understanding of data modeling techniques (OLTP, OLAP, DBT, Data Vault 2.0) and best practices
- Experience with large datasets and performance tuning in Snowflake
- Knowledge of data governance, data security best practices, and compliance standards
- Familiarity with additional data technologies (e.g., AWS, Azure, GCP, Fivetran) is a plus
- Experience leading projects or mentoring junior developers is advantageous
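To illustrate how the DBT and CI/CD pieces above can fit together, here is a hedged sketch that drives the dbt CLI from Python, the kind of step an Azure DevOps pipeline might run; it assumes dbt is installed and a dbt project exists in the working directory, and the model name is made up.

```python
import subprocess
import sys

def run_dbt(args: list[str]) -> None:
    """Run a dbt CLI command and fail loudly, as a CI step would."""
    result = subprocess.run(["dbt", *args], capture_output=True, text=True)
    print(result.stdout)
    if result.returncode != 0:
        print(result.stderr, file=sys.stderr)
        raise SystemExit(f"dbt {' '.join(args)} failed")

# Build one model (the leading '+' selects its upstream dependencies too),
# then run that model's tests before promoting the deployment.
run_dbt(["run", "--select", "+dim_customers"])
run_dbt(["test", "--select", "dim_customers"])
```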

Posted 3 months ago

Apply

5 - 10 years

1 - 1 Lacs

Bengaluru

Remote

Source: Naukri

Greetings!
Role: Data Modeler with GCP
Location: Chennai (work from office)
Duration: 6-month contract
Experience: 5+ years; immediate joiner
1. Experience of 5+ years.
2. Hands-on data modeling for OLTP and OLAP systems.
3. In-depth knowledge of conceptual, logical, and physical data modeling.
4. Strong understanding of indexing, partitioning, and data sharding, with practical experience of having done the same.
5. Strong understanding of the variables impacting database performance for near-real-time reporting and application interaction.
6. Working experience with at least one data modeling tool, preferably DbSchema.
7. Functional knowledge of the mutual fund industry will be a plus.
8. Good understanding of GCP databases such as AlloyDB, Cloud SQL, and BigQuery.
If you are interested, please share your resume with prachi@iitjobs.com.
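As an illustration of this listing's partitioning concerns applied to BigQuery, the sketch below uses the google-cloud-bigquery client to create a date-partitioned, clustered table; the project, dataset, and table names are placeholders, and credentials are assumed to be configured in the environment.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

# In BigQuery, partitioning and clustering play the role indexes play in
# OLTP engines: they let the engine prune data instead of scanning it all.
ddl = """
CREATE TABLE IF NOT EXISTS `example-project.funds.nav_history`
(
  fund_id  STRING,
  nav_date DATE,
  nav      NUMERIC
)
PARTITION BY nav_date
CLUSTER BY fund_id
"""
client.query(ddl).result()  # .result() blocks until the job finishes

# A query filtered on the partition column scans only matching partitions.
job = client.query("""
    SELECT fund_id, AVG(nav) AS avg_nav
    FROM `example-project.funds.nav_history`
    WHERE nav_date BETWEEN '2024-01-01' AND '2024-03-31'
    GROUP BY fund_id
""")
for row in job.result():
    print(row.fund_id, row.avg_nav)
```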

Posted 3 months ago

Apply

5 - 8 years

7 - 10 Lacs

Bengaluru

Work from Office

Source: Naukri

SAP Senior Data Engineer

Company Description: We are a consulting company with a bunch of technology-interested and happy people! We love technology, we love design, and we love quality. Our diversity makes us unique and creates an inclusive and welcoming workplace where each individual is highly valued. With us, each individual is themselves and respects others for who they are, and we believe that when a fantastic mix of people gather and share their knowledge, experiences, and ideas, we can help our customers on a completely different level. We are looking for you who want to grow with us! With us, you have great opportunities to take real steps in your career and the opportunity to take on great responsibility.

Job Description: We are now looking for a Senior Data Engineer who is passionate about building analytical solutions on SAP data. You will help the company derive the most from our data assets, with ultimate accountability for the business and customer value it delivers. As a Senior Data Engineer in the SAP Analytics Platform product, you will be working with one of the larger and more volume-intense SAP BW installations in the world. You will interact with a high number of stakeholders to ensure the best utilization of our SAP BW investment. As a Senior Data Engineer, you are good at coaching others, for example by leading programming sessions, presenting and aligning solutions to complex problems, and bringing a wide perspective on engineering practices to build scalable and robust platforms. You will deploy different analytical features on SAP BW and on other SAP OLTP systems as well, and you will also use several products from Microsoft, Google, and other vendors to fulfill the analytical needs of your stakeholders. As a Senior Data Engineer, you have great leadership skills and believe in a non-hierarchical culture of collaboration, transparency, safety, and trust, working closely with Product Area leadership and other engineers to ensure overall alignment on key results. We believe you have a strong agile and DevOps mindset with Scrum Master experience so that you can drive Agile development practices. This dual role offers a unique opportunity to combine SAP BW skills with a passion for Agile methodologies and team collaboration.

Qualifications:
- A data engineer with a minimum of 10 years of experience in SAP BW development
- Experience in at least 3 full-lifecycle implementations on SAP BW or BW/4HANA with HANA modeling
- Previous extensive experience with SAP BW retail scenarios, including experience with the following data sources:
  - General Ledger, Billing (SAP ECC)
  - Inventory Management (SAP ECC)
  - POS data (SAP CAR)
  - Warehouse transactions (SAP EWM)
- Experience with Datasphere
- Experience with frontend analytics using SAP Analytics Cloud and Power BI
- Experience with CDS View-based data modeling, data integration, and provisioning to and from SAP BW
- Experience in data volume management, preferably project experience with large data volumes
- Experience in Integrated Planning/BPC
- A great team player in an Agile team setup, willing to take on different kinds of tasks to meet sprint commitments and reach our goal together
- Facilitating the events required by Scrum and ensuring that the team is actively driving development
- Helping everyone in the team, or interacting with the team, to understand Scrum theory, practices, rules, and values (teaches, coaches, facilitates)

Start: Immediate to 15 days
Location: Bangalore
Form of employment: Full-time until further notice; we apply a 6-month probationary employment.
We interview candidates on an ongoing basis, so do not wait to submit your application.

Posted 3 months ago

Apply

10 - 15 years

25 - 30 Lacs

Noida

Work from Office

Source: Naukri

An extraordinarily talented group of individuals work together every day to drive TNS success, from both professional and personal perspectives. Come join the excellence!

Overview: The Engineering Lead is responsible for guiding a team in the design, development, and implementation of innovative engineering solutions. This role requires a strong technical background, leadership capabilities, and excellent communication skills to align engineering efforts with business objectives.

Responsibilities:
- Must have strong Java platform and computer science fundamentals
- Must have strong fundamentals of database PL/SQL
- A strong technical mobile-software background with good analytical, problem-solving, and communication skills is essential, as is the ability to work collaboratively in a team environment
- Design and document software components that meet organization and industry standards
- Code high-quality software components in accordance with organizational standards, technical requirements, and detailed designs
- Must have strong fundamentals of OLTP-style applications and related concepts
- Must have strong fundamentals and proficiency in the core technologies we use for web development: Java 8, HTML, CSS, JavaScript, JSP, Spring, and Hibernate
- Be able to work independently as part of an Agile scrum team
- Be able to research technical solutions and recommend options with appropriate reasoning
- Should be ready to take ownership and guide the team
- Lead the architectural design of systems and software, ensuring scalability, reliability, and maintainability

Qualifications:
- Provide technical direction and mentorship to the engineering team, ensuring adherence to best practices and industry standards
- Deep knowledge of the software development lifecycle, including scoping, planning, conception, design, implementation, deployment, and maintenance
- Good problem-solving skills and the ability to meet deadlines are a must
- Must have a BS degree in Computer Science, Computer Engineering, Electrical Engineering, or equivalent
- 10-15 years of experience

If you are passionate about technology and love personal growth and opportunity, come see what TNS is all about!

Posted 3 months ago

Apply

12 - 16 years

20 - 35 Lacs

Pune

Work from Office

Source: Naukri

Senior Data Modeller
- 12-17 years of experience
- Banking DWH project and Teradata experience
- Teradata FSDM banking data model implementation experience (if not, then at least IBM BDW, which is similar; FSDM will be a big plus)
Required candidate profile: The current expectation is onsite. English is required; Arabic is preferred and will be a plus.

Posted 3 months ago

Apply

11 - 20 years

35 - 50 Lacs

Pune, Delhi NCR, Hyderabad

Hybrid

Source: Naukri

Exp: 11 to 20 years
Role: Data Modeling Architect / Senior Architect / Principal Architect
Position: Permanent
Locations: Hyderabad, Bangalore, Chennai, Pune, Delhi NCR
JD:
- 10+ years of experience in Data Warehousing and Data Modeling.
- Decent SQL knowledge; able to suggest modeling approaches and solutions for a given problem.
- Significant experience in one or more RDBMS (Oracle, DB2, SQL Server).
- Hands-on experience with OLAP and OLTP database models (dimensional models).
- Comprehensive understanding of Star schema, Snowflake schema, and Data Vault modeling, plus any ETL tool, Data Governance, and Data Quality.
- An eye for analyzing data and comfort with agile methodology.
- Working understanding of any of the cloud services (Azure, AWS, GCP) is preferred.
- Keen to coach team members, collaborate with various stakeholders across the organization, and take complete ownership of deliverables.
- Experience contributing to proposals and RFPs.
- Good experience in stakeholder management.
- Decent communication skills and experience leading a team.

Posted 3 months ago

Apply

5 - 10 years

20 - 35 Lacs

Pune, Delhi NCR, Bengaluru

Hybrid

Source: Naukri

Exp: 5 to 10 years
Role: Data Modeller
Position: Permanent
Job Locations: Delhi, Pune, Chennai, Hyderabad, Bangalore
Experience & Skills:
- 5+ years of experience with strong Data Modeling and Data Warehousing skills.
- Able to suggest modeling approaches for a given problem.
- Significant experience in one or more RDBMS (Oracle, DB2, SQL Server).
- Hands-on experience with OLAP and OLTP database models (dimensional models).
- Comprehensive understanding of Star schema, Snowflake schema, and Data Vault modeling, plus any ETL tool, Data Governance, and Data Quality.
- An eye for analyzing data and comfort with agile methodology.
- Working understanding of any of the cloud services (Azure, AWS, GCP) is preferred.

Posted 3 months ago

Apply

8 - 13 years

30 - 35 Lacs

Bengaluru

Work from Office

Source: Naukri

Join our Team

About this opportunity: Join the Ericsson team as an IT Data Engineer and contribute to our digital journey. In this role, you'll design, build, test, and maintain data and analytics solutions, utilizing the latest technologies and platforms, to ensure the availability and accessibility of data across multiple consumer channels. We place emphasis on actualizing solutions per Ericsson's standards and architectural designs; you'll work with both small and big data, create efficient, scalable, and flexible data models and flows, and provide specific analytical insights to meet business requirements.

What you bring:
- Solid experience in SAP HANA and SAP BODS development
- Experience creating SAP HANA calculation views, stored procedures, and BODS ETL jobs that connect to different SAP and third-party sources
- Good knowledge of data warehousing concepts (OLAP vs. OLTP)
- Strong knowledge of SAP HANA SQL and stored procedures
- Understanding of data integration across SAP ECC, SAP HANA, and BODS
- Good knowledge of Snowflake
- Knowledge of optimization and best practices for data modeling and reporting
Experience: 5-8 years

Why join Ericsson? At Ericsson, you'll have an outstanding opportunity: the chance to use your skills and imagination to push the boundaries of what's possible and to build never-before-seen solutions to some of the world's toughest problems. You'll be challenged, but you won't be alone. You'll be joining a team of diverse innovators, all driven to go beyond the status quo to craft what comes next.

What happens once you apply? Click Here to find all you need to know about what our typical hiring process looks like.

Encouraging a diverse and inclusive organization is core to our values at Ericsson, which is why we nurture it in everything we do. We truly believe that by collaborating with people with different experiences we drive innovation, which is essential for our future growth. We encourage people from all backgrounds to apply and realize their full potential as part of our Ericsson team. Ericsson is proud to be an Equal Opportunity and Affirmative Action employer.

Posted 3 months ago

Apply

5 - 7 years

8 - 10 Lacs

Hyderabad

Work from Office

Source: Naukri

- Experience with Amazon cloud database technologies
- Strong grasp of SQL and query languages
- Experience with Linux operating systems
- Excellent troubleshooting and analytical skills with a focus on preventative solutions
- Preferred: experience with two or more of the following: MySQL, DB2, PostgreSQL, MongoDB
- Ability to work independently, manage priorities, and meet deadlines in a fast-paced environment
- Self-directed, operating with urgency, focus, and discipline
- Positive attitude and team-driven
- Proven expertise managing complex data systems
- Experience supporting highly transactional OLTP databases
- Experience with some of the following technologies and concepts is not required but would be a plus: MS SQL Server, Microsoft operating systems, AWS CLI, AWS Backup, OLTP, DSS, OLAP, data archiving, Linux shell scripting, Agile methodologies, cloud automation
Education and Experience: Bachelor's degree in a technical discipline or equivalent experience, with at least 5 years of experience performing database administration functions in AWS.
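For the AWS database-administration side of this listing, a small boto3 sketch might look like the following; the region and instance identifiers are placeholders, and credentials are assumed to come from the environment or an instance profile.

```python
import boto3  # pip install boto3

# Hypothetical region; credentials come from the environment/instance profile.
rds = boto3.client("rds", region_name="ap-south-1")

# Inventory instances and flag any without automated backups enabled.
for db in rds.describe_db_instances()["DBInstances"]:
    name = db["DBInstanceIdentifier"]
    retention = db["BackupRetentionPeriod"]  # 0 means automated backups are off
    status = "OK" if retention > 0 else "NO BACKUPS"
    print(f"{name}: engine={db['Engine']} retention={retention}d [{status}]")

# Take a manual snapshot before risky maintenance (identifiers are illustrative).
rds.create_db_snapshot(
    DBInstanceIdentifier="prod-oltp-db",
    DBSnapshotIdentifier="prod-oltp-db-pre-maintenance",
)
```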

Posted 3 months ago

Apply

10 - 14 years

30 - 37 Lacs

Chennai, Bengaluru, Hyderabad

Hybrid

Source: Naukri

Curious about the role? What would your typical day look like?

As an Architect, you will work to solve some of the most complex and captivating data management problems that enable clients to become data-driven organizations, seamlessly switching between the roles of Individual Contributor, team member, and Data Modeling Architect as demanded by each project to define, design, and deliver actionable insights. On a typical day, you might:
• Engage the clients and understand the business requirements to translate them into data models.
• Analyze customer problems, propose solutions from a data-structure perspective, and estimate and deliver the proposed solutions.
• Create and maintain a Logical Data Model (LDM) and Physical Data Model (PDM) by applying best practices to provide business insights.
• Use a data modeling tool to create appropriate data models.
• Create and maintain the source-to-target data mapping document, which includes documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
• Gather and publish data dictionaries.
• Ideate, design, and guide the teams in building automations and accelerators.
• Maintain data models, as well as capture data models from existing databases and record descriptive information.
• Contribute to building data warehouses and data marts (on Cloud) while performing data profiling and quality analysis.
• Use version control to maintain versions of data models.
• Collaborate with Data Engineers to design and develop data extraction and integration code modules.
• Partner with data engineers and testing practitioners to strategize ingestion logic, consumption patterns, and testing.
• Ideate on designing and developing the next-gen data platform by collaborating with cross-functional stakeholders.
• Work with the client to define, establish, and implement the right modeling approach as per the requirements.
• Help define the standards and best practices.
• Monitor project progress to keep the leadership teams informed of milestones, impediments, etc.
• Coach team members and review code artifacts.
• Contribute to proposals and RFPs.

Job Requirements - What do we expect?
• 10+ years of experience in the data space.
• Decent SQL knowledge.
• Able to suggest modeling approaches for a given problem.
• Significant experience in one or more RDBMS (Oracle, DB2, and SQL Server).
• Hands-on experience with OLAP and OLTP database models (dimensional models).
• Comprehensive understanding of Star schema, Snowflake schema, and Data Vault modeling, plus any ETL tool, Data Governance, and Data Quality.
• An eye for analyzing data and comfort with agile methodology.
• Working understanding of any of the cloud services is preferred (Azure, AWS, GCP).
• Keen to coach team members, collaborate with various stakeholders across the organization, and take complete ownership of deliverables.
• Experience contributing to proposals and RFPs.
• Good experience in stakeholder management.
• Decent communication skills and experience leading a team.

You are important to us, let's stay connected! Every individual comes with a different set of skills and qualities, so even if you don't tick all the boxes for the role today, we urge you to apply, as there might be a suitable or unique role for you tomorrow. We are an equal-opportunity employer. Our diverse and inclusive culture and values guide us to listen, trust, respect, and encourage people to grow the way they desire.

Note: The designation will be commensurate with expertise and experience. Compensation packages are among the best in the industry.

Posted 3 months ago

Apply

6 - 10 years

22 - 30 Lacs

Chennai, Bengaluru, Hyderabad

Hybrid

Source: Naukri

Curious about the role? What would your typical day look like?

As a Senior Data Engineer, you will work to solve some of the organizational data management problems that enable clients to become data-driven organizations, seamlessly switching between the roles of Individual Contributor, team member, and Data Modeling Lead as demanded by each project to define, design, and deliver actionable insights. On a typical day, you might:
• Engage the clients and understand the business requirements to translate them into data models.
• Create and maintain a Logical Data Model (LDM) and Physical Data Model (PDM) by applying best practices to provide business insights.
• Contribute to Data Modeling accelerators.
• Create and maintain the source-to-target data mapping document, which includes documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
• Gather and publish data dictionaries.
• Maintain data models, as well as capture data models from existing databases and record descriptive information.
• Use a data modeling tool to create appropriate data models.
• Contribute to building data warehouses and data marts (on Cloud) while performing data profiling and quality analysis.
• Use version control to maintain versions of data models.
• Collaborate with Data Engineers to design and develop data extraction and integration code modules.
• Partner with data engineers to strategize ingestion logic and consumption patterns.

Job Requirements - What do we expect?
• 6+ years of experience in the data space.
• Decent SQL skills.
• Significant experience in one or more RDBMS (Oracle, DB2, and SQL Server).
• Hands-on experience with OLAP and OLTP database models (dimensional models).
• Good understanding of Star schema, Snowflake schema, and Data Vault modeling, plus any ETL tool, Data Governance, and Data Quality.
• An eye for analyzing data and comfort with agile methodology.
• Working understanding of any of the cloud services is preferred (Azure, AWS, GCP).

You are important to us, let's stay connected! Every individual comes with a different set of skills and qualities, so even if you don't tick all the boxes for the role today, we urge you to apply, as there might be a suitable or unique role for you tomorrow. We are an equal-opportunity employer. Our diverse and inclusive culture and values guide us to listen, trust, respect, and encourage people to grow the way they desire.

Note: The designation will be commensurate with expertise and experience. Compensation packages are among the best in the industry.

Posted 3 months ago

Apply

8 - 13 years

40 - 80 Lacs

Bengaluru

Work from Office

Source: Naukri

Join our Team

About this opportunity: Join the Ericsson team as an IT Data Engineer and contribute to our digital journey. In this role, you'll design, build, test, and maintain data and analytics solutions, utilizing the latest technologies and platforms, to ensure the availability and accessibility of data across multiple consumer channels. We place emphasis on actualizing solutions per Ericsson's standards and architectural designs; you'll work with both small and big data, create efficient, scalable, and flexible data models and flows, and provide specific analytical insights to meet business requirements.

What you bring:
- Solid experience in SAP HANA and SAP BODS development
- Experience creating SAP HANA calculation views, stored procedures, and BODS ETL jobs that connect to different SAP and third-party sources
- Good knowledge of data warehousing concepts (OLAP vs. OLTP)
- Strong knowledge of SAP HANA SQL and stored procedures
- Understanding of data integration across SAP ECC, SAP HANA, and BODS
- Good knowledge of Snowflake
- Knowledge of optimization and best practices for data modeling and reporting
Experience: 5-8 years

Why join Ericsson? At Ericsson, you'll have an outstanding opportunity: the chance to use your skills and imagination to push the boundaries of what's possible and to build never-before-seen solutions to some of the world's toughest problems. You'll be challenged, but you won't be alone. You'll be joining a team of diverse innovators, all driven to go beyond the status quo to craft what comes next.

What happens once you apply? Click Here to find all you need to know about what our typical hiring process looks like.

Encouraging a diverse and inclusive organization is core to our values at Ericsson, which is why we nurture it in everything we do. We truly believe that by collaborating with people with different experiences we drive innovation, which is essential for our future growth. We encourage people from all backgrounds to apply and realize their full potential as part of our Ericsson team. Ericsson is proud to be an Equal Opportunity and Affirmative Action employer.

Posted 3 months ago

Apply

3 - 5 years

11 - 12 Lacs

Bengaluru

Work from Office

Source: Naukri

Role: Data Engineer
Experience: 4 to 6 years
Location: Bangalore
Mandatory skills: Python, PySpark, AWS
- Proven experience with cloud platforms (e.g., AWS)
- Strong proficiency in Python, PySpark, and R, with familiarity with additional programming languages such as C++, Rust, or Java
- Expertise in designing ETL architectures for batch and streaming processes, database technologies (OLTP/OLAP), and SQL
- Experience with Apache Spark and multi-cloud platforms (AWS, GCP, Azure)
- Knowledge of data governance and GxP data contexts; familiarity with the Pharma value chain is a plus

Posted 3 months ago

Apply

3 - 5 years

11 - 12 Lacs

Bengaluru

Work from Office

Source: Naukri

Experience: 3 to 5 years
Location: Bangalore
Mandatory skills: Python, PySpark, AWS
Good to have: Palantir
- Proven experience with cloud platforms (e.g., Palantir, AWS) and data security best practices
- Strong proficiency in Python, PySpark, and R, with familiarity with additional programming languages such as C++, Rust, or Java
- Expertise in designing ETL architectures for batch and streaming processes, database technologies (OLTP/OLAP), and SQL
- Experience with Apache Spark and multi-cloud platforms (AWS, GCP, Azure)
- Knowledge of data governance and GxP data contexts; familiarity with the Pharma value chain is a plus

Posted 3 months ago

Apply