105 OLTP Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

7.0 - 12.0 years

25 - 35 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Role: Data Modeler / Senior Data Modeler
Experience: 5 to 12 years
Locations: Hyderabad, Pune, Bengaluru
Position: Permanent
Must-have skills:
- Strong SQL
- Strong data warehousing skills
- ER/relational/dimensional data modeling
- Data Vault modeling
- OLAP, OLTP
- Schemas and data marts
Good-to-have skills:
- Data Vault
- ERwin / ER Studio
- Cloud platforms (AWS or Azure)

Posted 1 week ago

Apply

1.0 - 3.0 years

3 - 6 Lacs

Chennai

Work from Office

Skill set required: GCP, data modelling (OLTP, OLAP), indexing, DBSchema, CloudSQL, BigQuery

Data Modeller:
- Hands-on data modelling for OLTP and OLAP systems.
- In-depth knowledge of conceptual, logical, and physical data modelling.
- Strong understanding of indexing, partitioning, and data sharding, with practical experience applying them.
- Strong understanding of the variables that impact database performance for near-real-time reporting and application interaction.
- Working experience with at least one data modelling tool, preferably DBSchema.
- Functional knowledge of the mutual fund industry is a plus.
- Good understanding of GCP databases such as AlloyDB, CloudSQL, and BigQuery.
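
The indexing requirement above can be sketched concretely. This is an illustrative, standalone example using Python's built-in sqlite3 (not a GCP service): the table and column names are invented, and EXPLAIN QUERY PLAN shows how adding a secondary index on the filter column turns a full table scan into an index seek.

```python
import sqlite3

# Hypothetical OLTP-style table; names are invented for the example.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE orders (
        order_id   INTEGER PRIMARY KEY,
        account_id INTEGER NOT NULL,
        amount     REAL NOT NULL,
        created_at TEXT NOT NULL
    )
""")
conn.executemany(
    "INSERT INTO orders (account_id, amount, created_at) VALUES (?, ?, ?)",
    [(i % 100, i * 1.5, "2025-01-01") for i in range(1000)],
)

query = "SELECT SUM(amount) FROM orders WHERE account_id = ?"

# Without an index the planner scans the whole table.
plan = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()
print(plan[0][-1])   # e.g. "SCAN orders" (wording varies by SQLite version)

# A secondary index on the filter column switches the plan to an index search.
conn.execute("CREATE INDEX idx_orders_account ON orders (account_id)")
plan = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()
print(plan[0][-1])   # e.g. "SEARCH orders USING INDEX idx_orders_account ..."
```

The same principle (index the columns your hot queries filter on) carries over to CloudSQL or AlloyDB, though partitioning and sharding are configured differently there.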

Posted 1 week ago

Apply

8.0 - 13.0 years

20 - 35 Lacs

Pune, Gurugram, Jaipur

Work from Office

Role & responsibilities:
- Strong expertise in Erwin data modelling.
- Good knowledge of data quality and data catalogs.
- Good understanding of data lineage.
- Good understanding of data modelling in both OLTP and OLAP systems.
- Experience with a data warehouse or big data architecture.
- Strong ANSI SQL skills.
- Good understanding of data visualization.
- Comfortable and experienced in data analysis.
- Good knowledge of data cataloging tools and data access policies.

Posted 1 week ago

Apply

8.0 - 13.0 years

20 - 35 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Role & responsibilities:
- Strong expertise in Erwin data modelling.
- Good knowledge of data quality and data catalogs.
- Good understanding of data lineage.
- Good understanding of data modelling in both OLTP and OLAP systems.
- Experience with a data warehouse or big data architecture.
- Strong ANSI SQL skills.
- Good understanding of data visualization.
- Comfortable and experienced in data analysis.
- Good knowledge of data cataloging tools and data access policies.

Posted 1 week ago

Apply

8.0 - 13.0 years

20 - 35 Lacs

Pune, Gurugram, Jaipur

Work from Office

Role & responsibilities:
- Strong expertise in Erwin data modelling.
- Good knowledge of data quality and data catalogs.
- Good understanding of data lineage.
- Good understanding of data modelling in both OLTP and OLAP systems.
- Experience with a data warehouse or big data architecture.
- Strong ANSI SQL skills.
- Good understanding of data visualization.
- Comfortable and experienced in data analysis.
- Good knowledge of data cataloging tools and data access policies.

Posted 1 week ago

Apply

8.0 - 13.0 years

25 - 35 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Role & responsibilities:
- Strong expertise in Erwin data modelling.
- Good knowledge of data quality and data catalogs.
- Good understanding of data lineage.
- Good understanding of data modelling in both OLTP and OLAP systems.
- Experience with a data warehouse or big data architecture.
- Strong ANSI SQL skills.
- Good understanding of data visualization.
- Comfortable and experienced in data analysis.
- Good knowledge of data cataloging tools and data access policies.

Posted 1 week ago

Apply

7.0 - 12.0 years

20 - 35 Lacs

Pune, Gurugram, Jaipur

Work from Office

Role & responsibilities:
- Strong expertise in Erwin data modelling.
- Good knowledge of data quality and data catalogs.
- Good understanding of data lineage.
- Good understanding of data modelling in both OLTP and OLAP systems.
- Experience with a data warehouse or big data architecture.
- Strong ANSI SQL skills.
- Good understanding of data visualization.
- Comfortable and experienced in data analysis.
- Good knowledge of data cataloging tools and data access policies.

Posted 1 week ago

Apply

7.0 - 12.0 years

20 - 35 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Role & responsibilities:
- Strong expertise in Erwin data modelling.
- Good knowledge of data quality and data catalogs.
- Good understanding of data lineage.
- Good understanding of data modelling in both OLTP and OLAP systems.
- Experience with a data warehouse or big data architecture.
- Strong ANSI SQL skills.
- Good understanding of data visualization.
- Comfortable and experienced in data analysis.
- Good knowledge of data cataloging tools and data access policies.

Posted 1 week ago

Apply

6.0 - 10.0 years

3 - 6 Lacs

Chennai

Work from Office

Job Information:
- Job Opening ID: ZR_2412_JOB
- Date Opened: 04/02/2025
- Industry: IT Services
- Work Experience: 6-10 years
- Job Title: Data Modeller
- City: Chennai
- Province: Tamil Nadu
- Country: India
- Postal Code: 600001
- Number of Positions: 1

Skill set required: GCP, data modelling (OLTP, OLAP), indexing, DBSchema, CloudSQL, BigQuery

Data Modeller:
- Hands-on data modelling for OLTP and OLAP systems.
- In-depth knowledge of conceptual, logical, and physical data modelling.
- Strong understanding of indexing, partitioning, and data sharding, with practical experience applying them.
- Strong understanding of the variables that impact database performance for near-real-time reporting and application interaction.
- Working experience with at least one data modelling tool, preferably DBSchema.
- Functional knowledge of the mutual fund industry is a plus.
- Good understanding of GCP databases such as AlloyDB, CloudSQL, and BigQuery.

Posted 1 week ago

Apply

8.0 - 13.0 years

30 - 35 Lacs

Bengaluru

Work from Office

Data Engineer
Location: Bangalore (onsite)
Experience: 8 - 15 years
Type: Full-time

Role Overview:
We are seeking an experienced Data Engineer to build and maintain scalable, high-performance data pipelines and infrastructure for our next-generation data platform. The platform ingests and processes real-time and historical data from diverse industrial sources such as airport systems, sensors, cameras, and APIs. You will work closely with AI/ML engineers, data scientists, and DevOps to enable reliable analytics, forecasting, and anomaly-detection use cases.

Key Responsibilities:
- Design and implement real-time (Kafka, Spark/Flink) and batch (Airflow, Spark) pipelines for high-throughput data ingestion, processing, and transformation.
- Develop data models and manage data lakes and warehouses (Delta Lake, Iceberg, etc.) to support both analytical and ML workloads.
- Integrate data from diverse sources: IoT sensors, databases (SQL/NoSQL), REST APIs, and flat files.
- Ensure pipeline scalability, observability, and data quality through monitoring, alerting, validation, and lineage tracking.
- Collaborate with AI/ML teams to provision clean, ML-ready datasets for training and inference.
- Deploy, optimize, and manage pipelines and data infrastructure across on-premise and hybrid environments.
- Participate in architectural decisions to ensure resilient, cost-effective, and secure data flows.
- Contribute to infrastructure-as-code and automation for data deployment using Terraform, Ansible, or similar tools.

Qualifications & Required Skills:
- Bachelor's or Master's in Computer Science, Engineering, or a related field.
- 6+ years in data engineering roles, with at least 2 years handling real-time or streaming pipelines.
- Strong programming skills in Python/Java and SQL.
- Experience with Apache Kafka, Apache Spark, or Apache Flink for real-time and batch processing.
- Hands-on with Airflow, dbt, or other orchestration tools.
- Familiarity with data modeling (OLAP/OLTP), schema evolution, and format handling (Parquet, Avro, ORC).
- Experience with hybrid/on-prem and cloud platform (AWS/GCP/Azure) deployments.
- Proficiency with data lakes/warehouses such as Snowflake, BigQuery, Redshift, or Delta Lake.
- Knowledge of DevOps practices: Docker/Kubernetes, Terraform or Ansible.
- Exposure to data observability, data cataloging, and quality tools (e.g., Great Expectations, OpenMetadata).

Good to Have:
- Experience with time-series databases (e.g., InfluxDB, TimescaleDB) and sensor data.
- Prior experience in domains such as aviation, manufacturing, or logistics.
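
The data-quality validation mentioned above can be sketched in plain Python. This is a hypothetical, Great Expectations-style illustration (all field names and thresholds are invented): each rule reports the indices of failing records, and a batch loads only when no rule fails.

```python
from datetime import datetime

def _parses(ts):
    """Return True if ts is a parseable ISO-8601 timestamp."""
    try:
        datetime.fromisoformat(ts or "")
        return True
    except ValueError:
        return False

# Invented validation rules for hypothetical sensor records.
RULES = {
    "sensor_id_present": lambda r: bool(r.get("sensor_id")),
    "reading_in_range":  lambda r: isinstance(r.get("value"), (int, float))
                                   and -50.0 <= r["value"] <= 150.0,
    "timestamp_parses":  lambda r: _parses(r.get("ts")),
}

def validate_batch(records):
    """Return {rule_name: [indices of failing records]}; empty dict means clean."""
    failures = {}
    for name, rule in RULES.items():
        bad = [i for i, r in enumerate(records) if not rule(r)]
        if bad:
            failures[name] = bad
    return failures

batch = [
    {"sensor_id": "s-1", "value": 21.5, "ts": "2025-04-02T10:00:00"},
    {"sensor_id": "",    "value": 999,  "ts": "not-a-time"},
]
print(validate_batch(batch))  # the second record fails all three rules
```

In a real pipeline the same checks would typically run as a validation task (e.g. in Airflow) that quarantines failing batches instead of loading them.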

Posted 1 week ago

Apply

4.0 - 9.0 years

5 - 10 Lacs

Chennai, Bengaluru

Work from Office

Job Purpose:
We are seeking an experienced Azure Data Engineer with 4 to 13 years of proven expertise in data lakes, lakehouse, Synapse Analytics, Databricks, T-SQL, SQL Server, Synapse DB, and data warehousing, with working experience in ETL, data catalogs, metadata, DWH, MPP systems, OLTP, and OLAP systems, plus strong communication skills.

Key Responsibilities:
- Create data lakes from scratch, configure existing systems, and provide user support.
- Understand different datasets and storage elements to bring in data.
- Good knowledge of and work experience with ADF and Synapse data pipelines.
- Good knowledge of Python, PySpark, and Spark SQL.
- Implement data security at the DB and data-movement layers.
- Experience with CI/CD data pipelines.
- Work with internal teams to design, develop, and maintain software.

Qualifications & Key Skills Required:
- Expertise in data lakes, lakehouse, Synapse Analytics, Databricks, T-SQL, SQL Server, Synapse DB, and data warehousing.
- Hands-on experience in ETL/ELT and handling large volumes of data and files.
- Working knowledge of JSON, Parquet, CSV, Excel, structured, unstructured, and other data sets.
- Exposure to a source control management system such as TFS, Git, or SVN.
- Understanding of non-functional requirements.
- Proficiency in data catalogs, metadata, DWH, MPP systems, OLTP, and OLAP systems.
- Experience with Azure Data Fabric, MS Purview, and MDM tools is an added advantage.
- A good team player and excellent communicator.

Posted 1 week ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Hyderabad

Work from Office

The Senior Developer, SQL will analyze and develop SQL for existing data structures and new applications.

Responsibilities:
- Analyze and develop queries, stored procedures, and functions for use with existing and new data sets.
- Code development in modern RDBMS systems as well as unstructured data stores.
- Get familiar with existing business processes and data structures.
- Learn code standards, development processes, and tools.
- Gather and understand requirements and develop business modules using various data tools.
- Write complex queries to conduct research and implement complex business rules.
- Develop sequences of processes to populate data into new database structures.
- Create the table structures required for data loading and ingestion processes.
- Performance optimization and troubleshooting of new code.
- Develop PowerShell scripts to schedule business jobs.
- Document technical processes and approach.
- Collaborate with team members and participate in design discussions.

Knowledge and Experience:
- 10+ years of experience and a bachelor's degree in a technology field.
- Strong experience with data warehouses and T-SQL development supporting OLTP applications (complex queries, stored procedures, scripts, etc.).
- Experience with Postgres databases.
- Experience working with unstructured data and related tools.
- Good communication and problem-solving skills; able to understand requirements and communicate problems and solutions clearly.
- Experience working in agile development teams and using SDLC tools such as Azure DevOps and Jira.
- Experience automating delivery of database changes using DevOps tools such as Azure DevOps and Jenkins.
- Experience with ETL processes using SSIS; SSRS experience is desired.
- PowerShell scripting experience is desired.
- Familiarity with data modeling concepts.
- Basic knowledge of the mortgage domain.
- Strong verbal and written communication skills.

Schedule: Shift timings 3 PM - 12:30 AM.
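
The "complex queries implementing business rules" requirement can be illustrated with a small, self-contained sketch. The schema is hypothetical (a toy loan/payments model, not from the posting), run here against Python's built-in sqlite3: a CTE aggregates payments per loan, and the outer query applies a delinquency rule.

```python
import sqlite3

# Hypothetical loan schema, invented for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE loans (loan_id INTEGER PRIMARY KEY, amount_due REAL);
    CREATE TABLE payments (loan_id INTEGER, paid REAL);
    INSERT INTO loans VALUES (1, 1000.0), (2, 500.0);
    INSERT INTO payments VALUES (1, 400.0), (1, 600.0), (2, 100.0);
""")

# CTE rolls up payments per loan; CASE encodes the business rule.
rows = conn.execute("""
    WITH totals AS (
        SELECT loan_id, SUM(paid) AS total_paid
        FROM payments
        GROUP BY loan_id
    )
    SELECT l.loan_id,
           CASE WHEN COALESCE(t.total_paid, 0) < l.amount_due
                THEN 'DELINQUENT' ELSE 'CURRENT' END AS status
    FROM loans l
    LEFT JOIN totals t ON t.loan_id = l.loan_id
    ORDER BY l.loan_id
""").fetchall()
print(rows)   # [(1, 'CURRENT'), (2, 'DELINQUENT')]
```

The same shape (CTE for the rollup, CASE for the rule, LEFT JOIN so rows with no activity still appear) translates directly to T-SQL or Postgres.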

Posted 1 week ago

Apply

2.0 - 6.0 years

3 - 7 Lacs

Hyderabad

Work from Office

Job Description

Job Purpose:
The Senior Developer, SQL will analyze and develop SQL for existing data structures and new applications.

Responsibilities:
- Analyze and develop queries, stored procedures, and functions for use with existing and new data sets.
- Code development in modern RDBMS systems as well as unstructured data stores.
- Get familiar with existing business processes and data structures.
- Learn code standards, development processes, and tools.
- Gather and understand requirements and develop business modules using various data tools.
- Write complex queries to conduct research and implement complex business rules.
- Develop sequences of processes to populate data into new database structures.
- Create the table structures required for data loading and ingestion processes.
- Performance optimization and troubleshooting of new code.
- Develop PowerShell scripts to schedule business jobs.
- Document technical processes and approach.
- Collaborate with team members and participate in design discussions.

Knowledge and Experience:
- 10+ years of experience and a bachelor's degree in a technology field.
- Strong experience with data warehouses and T-SQL development supporting OLTP applications (complex queries, stored procedures, scripts, etc.).
- Experience with Postgres databases.
- Experience working with unstructured data and related tools.
- Good communication and problem-solving skills; able to understand requirements and communicate problems and solutions clearly.
- Experience working in agile development teams and using SDLC tools such as Azure DevOps and Jira.
- Experience automating delivery of database changes using DevOps tools such as Azure DevOps and Jenkins.
- Experience with ETL processes using SSIS; SSRS experience is desired.
- PowerShell scripting experience is desired.
- Familiarity with data modeling concepts.
- Basic knowledge of the mortgage domain.
- Strong verbal and written communication skills.

Schedule: Shift timings 3 PM - 12:30 AM.

Posted 1 week ago

Apply

4.0 - 14.0 years

22 - 25 Lacs

Pune

Work from Office

Join us as a Senior Developer at Barclays, where you'll take part in the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences. As part of the team, you will deliver the technology stack, using strong analytical and problem-solving skills to understand business requirements and deliver quality solutions. You'll be working on complex technical problems that require detailed analysis, in conjunction with fellow engineers, business analysts, and business stakeholders.

To be successful as a Senior Developer you should have experience with:
- Minimum qualification: B.E./B.Tech or equivalent.
- 10+ years of experience in database development in the banking domain.
- Expertise in Oracle and SQL Server; proficiency in SQL.
- Experience with data migration tools.
- Optimizing OLTP/OLAP and operational DB systems for Valpre and CLM.
- Leading database design for all customer lifecycle management services.
- Implementing migration strategies.
- Implementing data masking/encryption for PII under PCI DSS and GDPR.
- Mentoring junior developers in DB best practices.
- Establishing CI/CD pipelines with audit trails.
- Collaborating with Data Architects, Data Analysts, and Data Modelers.

Some other highly valued skills include effective and efficient stakeholder management, good communication skills, good working knowledge and hands-on experience of workflow applications and business rules engines, and good knowledge of the banking domain. You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based in Pune.

Purpose of the role: To design, develop, and improve software, utilising various engineering methodologies, that provides business, platform, and technology capabilities for our customers and colleagues.

Accountabilities:
- Development and delivery of high-quality software solutions using industry-aligned programming languages, frameworks, and tools, ensuring that code is scalable, maintainable, and optimized for performance.
- Cross-functional collaboration with product managers, designers, and other engineers to define software requirements, devise solution strategies, and ensure seamless integration and alignment with business objectives.
- Collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing.
- Staying informed of industry technology trends and innovations, and actively contributing to the organisation's technology communities to foster a culture of technical excellence and growth.
- Adherence to secure coding practices to mitigate vulnerabilities, protect sensitive data, and ensure secure software solutions.
- Implementation of effective unit testing practices to ensure proper code design, readability, and reliability.

Assistant Vice President expectations: To advise and influence decision making, contribute to policy development, and take responsibility for operational effectiveness. Collaborate closely with other functions and business divisions. Lead a team performing complex tasks, using well-developed professional knowledge and skills to deliver work that impacts the whole business function. Set objectives and coach employees in pursuit of those objectives, appraise performance relative to objectives, and determine reward outcomes. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others. Alternatively, an individual contributor will lead collaborative assignments and guide team members through structured assignments, identifying the need to include other areas of specialisation to complete them. They will identify new directions for assignments and/or projects, identifying a combination of cross-functional methodologies or practices to meet required outcomes. Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues. Identify ways to mitigate risk and develop new policies/procedures in support of the control and governance agenda. Take ownership for managing risk and strengthening controls in relation to the work done. Perform work that is closely related to that of other areas, which requires understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation's sub-function. Collaborate with other areas of work, particularly business-aligned support areas, to keep up to speed with business activity and the business strategy. Engage in complex analysis of data from multiple internal and external sources (such as procedures and practices in other areas, teams, and companies) to solve problems creatively and effectively. Communicate complex information, which could include sensitive information or information that is difficult to communicate because of its content or audience. Influence or convince stakeholders to achieve outcomes.

Posted 1 week ago

Apply

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office

Dear Candidate,

We are hiring a Data Warehouse Architect to design scalable, high-performance data warehouse solutions for analytics and reporting. Perfect for engineers experienced with large-scale data systems.

Key Responsibilities:
- Design and maintain enterprise data warehouse architecture.
- Optimize ETL/ELT pipelines and data modeling (star/snowflake schemas).
- Ensure data quality, security, and performance.
- Work with BI teams to support analytics and reporting needs.

Required Skills & Qualifications:
- Proficiency with SQL and data warehousing tools (Snowflake, Redshift, BigQuery, etc.).
- Experience with ETL frameworks (Informatica, Apache NiFi, dbt, etc.).
- Strong understanding of dimensional modeling and OLAP.
- Bonus: knowledge of cloud data platforms and orchestration tools (Airflow).

Note: If interested, please share your updated resume and a preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager, Integra Technologies
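
The star-schema modeling mentioned above can be sketched minimally. This is an illustrative toy schema (all table and column names invented, run against Python's built-in sqlite3 rather than a warehouse engine): one fact table keyed to two dimension tables, queried with the usual dimension-join rollup.

```python
import sqlite3

# Hypothetical retail star schema: fact_sales at the center,
# dim_date and dim_product as dimensions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date (
        date_key INTEGER PRIMARY KEY,
        year     INTEGER,
        month    INTEGER
    );
    CREATE TABLE dim_product (
        product_key INTEGER PRIMARY KEY,
        category    TEXT
    );
    CREATE TABLE fact_sales (
        date_key    INTEGER REFERENCES dim_date(date_key),
        product_key INTEGER REFERENCES dim_product(product_key),
        revenue     REAL
    );
    INSERT INTO dim_date VALUES (20250101, 2025, 1), (20250201, 2025, 2);
    INSERT INTO dim_product VALUES (1, 'hardware'), (2, 'software');
    INSERT INTO fact_sales VALUES
        (20250101, 1, 100.0), (20250101, 2, 50.0), (20250201, 2, 75.0);
""")

# The canonical star-schema query: join facts to dimensions, then roll up.
rows = conn.execute("""
    SELECT d.month, p.category, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.month, p.category
    ORDER BY d.month, p.category
""").fetchall()
print(rows)
# [(1, 'hardware', 100.0), (1, 'software', 50.0), (2, 'software', 75.0)]
```

A snowflake schema would further normalize the dimensions (e.g. category into its own table); the fact table and rollup query stay the same shape.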

Posted 1 week ago

Apply

5.0 - 10.0 years

35 - 40 Lacs

Chennai

Work from Office

Job Summary:
We are seeking a seasoned Data Modeller with deep expertise in designing, implementing, and optimizing data models for both OLTP (Online Transaction Processing) and OLAP (Online Analytical Processing) systems. The ideal candidate will have hands-on experience with Google Cloud Platform (GCP) database services such as AlloyDB, CloudSQL, and BigQuery, and will leverage strong data modeling and performance tuning skills to deliver scalable, high-performing enterprise data solutions.

Key Responsibilities:
- Design and develop conceptual, logical, and physical data models for OLTP and OLAP systems, ensuring alignment with business goals and technology roadmaps.
- Architect end-to-end enterprise data warehouse and operational data store solutions with best practices in dimensional and normalized modeling (Inmon, Kimball, 3NF).
- Implement database optimization strategies, including indexing, partitioning, and data sharding, to enhance performance, scalability, and availability.
- Lead cloud migration and modernization initiatives, focusing on GCP services such as AlloyDB, CloudSQL, and BigQuery to support hybrid and cloud-native data architectures.
- Collaborate closely with data engineers, BI developers, and business stakeholders to translate complex business requirements into robust data models and schemas.
- Utilize advanced data modeling tools such as DBSchema, ERwin, or Visio to document and manage database schemas and metadata.
- Establish and enforce data governance, data quality, and metadata management frameworks to ensure data accuracy and compliance.
- Stay abreast of emerging trends and technologies in data architecture, cloud databases, and analytics to continuously improve data platform capabilities.

Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field; Master's preferred.
- 10+ years of experience in data modeling and architecture for OLTP and OLAP environments.
- Strong proficiency in conceptual, logical, and physical data modeling techniques and methodologies (3NF, Inmon, Kimball).
- Extensive hands-on experience with data modeling tools such as DBSchema, ERwin, or equivalent.
- Expertise in indexing, partitioning, query optimization, and data sharding for high-volume transactional and analytical databases.
- Proven experience designing and optimizing cloud databases on Google Cloud Platform (AlloyDB, CloudSQL, BigQuery).
- Strong SQL skills with proficiency in PL/SQL, T-SQL, and performance tuning.
- Familiarity with ETL frameworks and BI tools such as SSIS, Power BI, Tableau, and Azure Data Factory is a plus.
- Excellent problem-solving skills with keen attention to detail and data quality.
- Strong communication, collaboration, and stakeholder management skills.
- Experience with or knowledge of financial services or mutual fund industry data models is advantageous.
- Familiarity with Agile/Scrum methodologies.
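
The data-sharding strategy named above can be sketched in a few lines. This is a hypothetical illustration (shard count and key names invented): a stable hash of the shard key routes each row to one of N shards, so all rows for one key land on the same shard while load spreads roughly evenly.

```python
import hashlib

NUM_SHARDS = 4  # invented for the example

def shard_for(key: str) -> int:
    """Map a shard key to a shard number using a stable hash.

    hashlib (rather than Python's built-in hash()) keeps the mapping
    stable across processes and runs.
    """
    digest = hashlib.sha256(key.encode("utf-8")).digest()
    return int.from_bytes(digest[:8], "big") % NUM_SHARDS

# All rows for one customer route to the same shard...
assert shard_for("customer-42") == shard_for("customer-42")

# ...and a stream of distinct keys spreads across the shards.
counts = [0] * NUM_SHARDS
for i in range(10_000):
    counts[shard_for(f"customer-{i}")] += 1
print(counts)  # four roughly equal buckets
```

Note that this simple modulo scheme reshuffles most keys when NUM_SHARDS changes; production systems often use consistent hashing or range-based sharding to make resharding cheaper.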

Posted 1 week ago

Apply

8.0 - 13.0 years

25 - 40 Lacs

Mumbai

Work from Office

Essential Services: Role & Location Fungibility
Employees at ICICI Bank are expected to be role- and location-fungible, with the understanding that banking is an essential service. The role descriptions give you an overview of the responsibilities; they are only directional and guiding in nature.

About the Role:
As a Data Warehouse Architect, you will be responsible for managing and enhancing a data warehouse that handles large volumes of customer-lifecycle data flowing in from various applications, within the guardrails of risk and compliance. You will manage the day-to-day operations of the data warehouse (Vertica). In this role, you will manage a team of data warehouse engineers covering data modelling, ETL pipeline design, issue management, upgrades, performance fine-tuning, migration, and the governance and security framework of the data warehouse. This role enables the Bank to maintain huge data sets in a structured manner amenable to data intelligence. The data warehouse supports numerous information systems used by various business groups to derive insights. As a natural progression, the data warehouse will gradually be migrated to a data lake, enabling better analytical advantage; the role holder will also be responsible for guiding the team through this migration.

Key Responsibilities:
- Data pipeline design: Design and develop ETL data pipelines that help organise large volumes of data, using data warehousing technologies to ensure the warehouse is efficient, scalable, and secure.
- Issue management: Ensure the data warehouse runs smoothly; monitor system performance, diagnose and troubleshoot issues, and make the changes needed to optimize performance.
- Collaboration: Collaborate with cross-functional teams to implement upgrades, migrations, and continuous improvements.
- Data integration and processing: Process, clean, and integrate large data sets from various sources to ensure the data is accurate, complete, and consistent.
- Data modelling: Design and implement data modelling solutions to ensure the organization's data is properly structured and organized for analysis.

Key Qualifications & Skills:
- Education: B.E./B.Tech in Computer Science, Information Technology, or an equivalent domain, with 10 to 12 years of experience and at least 5 years of relevant work experience in data warehousing, mining, BI, or MIS.
- Experience in data warehousing: Knowledge of ETL and data technologies, and the ability to outline a future vision for OLTP and OLAP (Oracle / MS SQL).
- Data modelling, data analysis, and visualization experience (analytical tools such as Power BI, SAS, QlikView, Tableau, etc.).
- Good to have: exposure to Azure cloud data platform services such as Cosmos DB, Azure Data Lake, Azure Synapse, and Azure Data Factory.
- Synergize with the team: Regular interaction with business, product, and functional teams to create mobility solutions.
- Certification: Azure DP-900, PL-300, DP-203, or other data platform / data analyst certifications.

Posted 2 weeks ago

Apply

7.0 - 12.0 years

16 - 25 Lacs

Bengaluru

Remote

MINIMUM REQUIRED EDUCATION AND EXPERIENCE:
- Bachelor's degree in Information Technology, Computer Science, or any IT-related field.
- 5+ years of in-depth experience in an SQL Server development environment.
- Extensive hands-on experience in SQL Server development and ETL with custom coding.
- In-depth understanding of SQL Server database management architecture, transactional processing (OLTP), and ETL (extract, transform, load) methodologies.
- Working experience with transactional database systems and large-volume data processing on the SQL Server platform.
- In-depth knowledge of T-SQL (views, functions, and stored procedures) in SSMS and of SQL Server Integration Services (SSIS) development in Visual Studio.
- Knowledge of and experience building custom SSIS packages is a plus.
- Experience in pharmaceutical clinical trial, financial, or IT development roles is a plus.
- Strong Microsoft Office skills, including advanced Excel skills.
- Ability to work in an Agile development environment.
- Proven ability to take initiative and be innovative.
- Strong problem-solving and analytical skills and attention to detail.
- Positive attitude and ability to work in a team environment.
- Excellent verbal and written communication skills.

Posted 2 weeks ago

Apply

10.0 - 15.0 years

35 - 45 Lacs

Mumbai

Work from Office

Helping careers take flight. Reshaping an industry. Enable your career to be Made on Duck Creek. WHO WE ARE Authenticity, purpose, and transparency are core to Duck Creek, and we believe insurance should be there for individuals and businesses when, where, and how they need it most. Our market-leading solutions are available on a standalone basis or as a full suite, and all are available via Duck Creek OnDemand. With more than 1,000 successful implementations to date, Duck Creek removes the IT burden for insurers so they can focus on the business of insurance. If working in a fast-paced, rapidly evolving company that is transforming one of the world s oldest and largest industries sounds exciting, let us know. We are excited you are considering Duck Creek as a future employer and hope you decide to join The Flock ! To learn more about us, visit www.duckcreek.com and follow us on our social channels for the latest information - LinkedIn and Twitter . Title : Software Engineer II WHAT YOU LL DO Designs, codes, and/or configures solutions for moderate complexity Agile stories, with little guidance from senior software engineers. Debugs and resolves moderate complexity software bugs or issues, working independently, and finds the real root cause and provide a fix without collateral damage. Writes automated unit and integration-level tests under own direction. Creates a conceptual design/architecture for small scale software solutions with guidance from an architect or more senior software engineer. Provides guidance and mentoring to more junior software engineers. Follows development standards and effectively demonstrates technical solutions to other software engineers in code reviews. Assists in making source code management decisions for one or more teams. Performs complex source code management tasks independently. Performs other related duties and activities as required. 
WHAT YOU VE DONE We re in search of dynamic individuals responsible to design, code, and/or configure solutions for moderate complexity Agile stories, and to create conceptual design/architecture for small scale software solutions. QUALIFICATIONS/REQUIREMENTS Education and Work Experience: Bachelor s degree, or higher education level, or its foreign equivalent, in Computer Science, Computer Information Sciences, and/or related field. Total Work Experience: 10+ years (software development), 6 years minimum Product Development Experience: 4 years minimum, 6 years preferred Specialized Knowledge, Skills, and/or Abilities: Expert in Object-oriented design, Java or .NET development, Relational OLTP queries and Relational database design Expert of XML/XSLT document design, JavaScript development, HTML5 & CSS Excels in the ability to manage deadlines, communicate in a team, and operate independently with guidance Expert in how to estimate, analyze, and the Software Product Development Lifecycle with Agile methodology Excels in Insurance domain knowledge Other Requirements: Travel: 0-10% Work Authorization: Legally authorized to work in the country of the job location. Physical: Exerting up to 10 pounds of force occasionally and/or negligible amount of force frequently or constantly to lift, carry, push, pull or otherwise move objects, including the human body. Sedentary work involves sitting most of the time. Jobs are sedentary if walking and standing are required only occasionally and all other sedentary criteria are met WHAT WE STAND FOR Our global company celebrates & leverages the differences each employee brings to the table. Our success is a direct result of an inclusive culture where opportunities to learn from one another occur regardless of title, seniority, or background. This collaborative and team-oriented approach is at the core of how we operate and continuously improve our products, services, and systems. 
As such, Duck Creek is committed to providing equal opportunity to all employees and applicants - to recruit, hire, train, and reward employees for their individual abilities, achievements, and experience without regard to race, color, gender, religion, sexual orientation, age, national origin, disability, marital, military, or any other protected status. To learn more about our inclusive company culture, values, DE&I initiatives, and people, please visit: https://www.duckcreek.com/life-at-duck-creek/ Please let us know if you encounter accessibility barriers with our web content by sending an email to accessibility@duckcreek.com. Duck Creek Technologies does not accept, nor will we pay a fee for, any hires resulting from unsolicited headhunter or agency resumes.

Posted 2 weeks ago

Apply

12.0 - 20.0 years

22 - 37 Lacs

Bengaluru

Hybrid

Naukri logo

12+ yrs of experience in Data Architecture. Strong in Azure Data Services & Databricks, including Delta Lake & Unity Catalog. Experience in Azure Synapse, Purview, ADF, DBT, Apache Spark, DWH, Data Lakes, NoSQL, OLTP. NP: Immediate. sachin@assertivebs.com

Posted 2 weeks ago

Apply

4.0 - 8.0 years

6 - 12 Lacs

Hyderabad, Kakinada

Work from Office

Naukri logo

Job Description for SQL Server Developer:
We are looking for a Senior MS SQL developer who will be responsible for designing databases and ensuring their stability, reliability, and performance. You will also work with other developers to optimize in-application SQL statements as necessary and establish best practices. You will help solve all database usage issues and come up with ideas and advice that can help avoid such problems in the future.

Roles and responsibilities:
- Design, develop, and maintain complex SQL queries, stored procedures, and functions.
- Perform database optimization and performance tuning to ensure optimal system efficiency.
- Apply data modelling techniques to ensure development and implementation support efforts meet integration and performance expectations.
- Independently analyze, solve, and correct issues in real time, providing end-to-end problem resolution.
- Refine and automate regular processes, track issues, and document changes.
- Assist developers with complex query tuning and schema refinement; collaborate with the development team to integrate database components into applications.
- Provide 24x7 support for critical production systems.
- Create and maintain documentation for database processes and procedures.

Education: Bachelor's degree in computer science, information systems, or a related field.

Required Skills and Experience:
- 5+ years of MS SQL Server experience required.
- Experience with Performance Tuning and Optimization (PTO), using native monitoring and troubleshooting tools.
- Experience working with Windows Server, including Active Directory.
- Familiarity with Azure is a plus.
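The query-tuning work this role describes can be illustrated in miniature. The sketch below uses Python's stdlib sqlite3 rather than MS SQL Server, and the `orders` schema is invented for the example; the underlying principle — checking the query plan to confirm an index replaces a full table scan — carries over to any relational engine.

```python
import sqlite3

# Hypothetical schema, used only to demonstrate index-driven tuning.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

def plan(sql):
    # EXPLAIN QUERY PLAN rows are (id, parent, notused, detail); keep the detail text.
    return [row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql)]

query = "SELECT total FROM orders WHERE customer_id = 42"
before = plan(query)  # without an index: a full table scan
conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
after = plan(query)   # with the index: an index search

print(before)
print(after)
```

In MS SQL Server the equivalent workflow would use the actual execution plan and DMVs, but the habit is the same: measure the plan before and after a schema change rather than guessing.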

Posted 2 weeks ago

Apply

5.0 - 8.0 years

27 - 42 Lacs

Bengaluru

Work from Office

Naukri logo

Job Summary
NetApp is a cloud-led, data-centric software company that helps organizations put data to work in applications that elevate their business. We help organizations unlock the best of cloud technology. As a member of Solutions Integration Engineering, you work cross-functionally to define and create engineered solutions/products that accelerate field adoption. We work closely with ISVs and with the startup ecosystem in the Virtualization, Cloud, and AI/ML domains to build solutions that matter for the customers. You will work closely with the product owner and product lead on the company's current and future strategies related to said domains.

Job Requirements
- Deliver features, including participating in the full software development lifecycle.
- Deliver reliable, innovative solutions and products.
- Participate in product design, development, verification, troubleshooting, and delivery of a system or major subsystems, including authoring project specifications.
- Work closely with cross-functional teams, including business stakeholders, to innovate and unlock new use-cases for our customers.
- Write unit and automated integration tests and project documentation.

Technical Skills:
- Understanding of the software development lifecycle.
- Proficiency in full-stack development: Python, the container ecosystem, cloud, and modern ML frameworks.
- Knowledge of data storage and artificial intelligence concepts, including server/storage architecture, batch/stream processing, data warehousing, data lakes, distributed filesystems, OLTP/OLAP databases, data pipelining tools, model inferencing, and RAG workflows.
- Exposure to data pipelines, integrations, and Unix-based operating system kernels and development environments, e.g. Linux or FreeBSD.
- A strong understanding of basic to complex concepts related to computer architecture, data structures, and new programming paradigms.
- Demonstrated creative and systematic approach to problem solving.
- Excellent written and verbal communication skills.

Education
- Minimum 5 years of experience; must be hands-on with coding.
- B.E/B.Tech or M.S in Computer Science or a related technical field.

Posted 2 weeks ago

Apply

8.0 - 12.0 years

30 - 35 Lacs

Bengaluru

Work from Office

Naukri logo

Good-to-have skills: cloud, SQL, data analysis. Location: Pune - Kharadi - WFO - 3 days/week.

Job Description:
We are seeking a highly skilled and experienced Python Lead to join our team. The ideal candidate will have strong expertise in Python coding and development, along with good-to-have skills in cloud technologies, SQL, and data analysis.

Key Responsibilities:
- Lead the development of high-quality, scalable, and robust Python applications.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Ensure the performance, quality, and responsiveness of applications.
- Develop RESTful applications using frameworks like Flask, Django, or FastAPI.
- Utilize Databricks, PySpark SQL, and strong data analysis skills to drive data solutions.
- Implement and manage modern data solutions using Azure Data Factory, Data Lake, and Databricks.

Mandatory Skills:
- Proven experience with cloud platforms (e.g., AWS).
- Strong proficiency in Python, PySpark, and R, and familiarity with additional programming languages such as C++, Rust, or Java.
- Expertise in designing ETL architectures for batch and streaming processes, database technologies (OLTP/OLAP), and SQL.
- Experience with Apache Spark and multi-cloud platforms (AWS, GCP, Azure).
- Knowledge of data governance and GxP data contexts; familiarity with the Pharma value chain is a plus.

Good to Have Skills:
- Experience with modern data solutions via Azure.
- Knowledge of principles summarized in the Microsoft Cloud Adoption Framework.
- Additional expertise in SQL and data analysis.

Educational Qualifications: Bachelor's/Master's degree or equivalent with a focus on software engineering.

If you are a passionate Python developer with a knack for cloud technologies and data analysis, we would love to hear from you. Join us in driving innovation and building cutting-edge solutions!
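The batch-ETL expertise this listing asks for boils down to a three-stage shape. The pure-Python sketch below is illustrative only (the function names, sample rows, and in-memory "warehouse" are all stand-ins for real sources and sinks, not anything from the listing): extract from an OLTP-style source, transform by type-casting and deduplicating, then load into an OLAP-style target.

```python
def extract():
    # Stand-in for reading rows from an OLTP source or landing zone.
    return [
        {"id": 1, "amount": "10.5"},
        {"id": 2, "amount": "3.0"},
        {"id": 2, "amount": "3.0"},  # duplicate row, as often seen in raw feeds
    ]

def transform(rows):
    # Typical cleansing steps: cast string amounts to floats, dedupe on the key.
    seen, out = set(), []
    for row in rows:
        if row["id"] in seen:
            continue
        seen.add(row["id"])
        out.append({"id": row["id"], "amount": float(row["amount"])})
    return out

def load(rows, sink):
    # Stand-in for writing to a warehouse table; returns the row count loaded.
    sink.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded, warehouse)
```

In a production pipeline each stage would be a PySpark job or an ADF/Databricks activity, but the contract between stages — raw rows in, cleansed typed rows out — is the part worth designing first.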

Posted 2 weeks ago

Apply

5.0 - 10.0 years

30 - 35 Lacs

Hyderabad

Work from Office

Naukri logo

Job Title: Data Modeler
Experience: 5+ Years
Location: Hyderabad (WFO)

Roles and Responsibilities:
- Experience in data modelling: designing, implementing, and maintaining data models to support data quality, performance, and scalability.
- Proven experience as a Data Modeler, having worked with data analysts, data architects, and business stakeholders to ensure data models are aligned with business requirements.
- Expertise in Azure, Databricks, data warehousing, and ERWIN, plus a supply-chain background, is required.
- Strong knowledge of data modelling principles and techniques (e.g., ERD, UML).
- Proficiency with data modelling tools (e.g., ER/Studio, Erwin, IBM Data Architect).
- Experience with relational databases (e.g., SQL Server, Oracle, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra).
- Solid understanding of data warehousing, ETL processes, and data integration.
- Able to create and maintain Source-to-Target Mapping (STTM) documents, Bus Matrix documents, etc.
- Real-world experience in OLTP & OLAP database modelling.

Additional:
- Familiarity with big data technologies (e.g., Hadoop, Spark) is a plus.
- Excellent analytical, problem-solving, and communication skills.
- Ability to work effectively in a collaborative, fast-paced environment.
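The OLTP-versus-OLAP modelling this role calls out can be contrasted in a minimal sketch. The schemas below are hypothetical, and sqlite3 stands in for the real platforms: the OLTP side is normalized for transactional writes, while the OLAP side is a star schema — a fact table keyed to denormalized dimension tables — optimized for analytical reads.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# OLTP: normalized, write-optimized — orders reference customers by key.
conn.executescript("""
CREATE TABLE customers (customer_id INTEGER PRIMARY KEY, name TEXT, region TEXT);
CREATE TABLE orders (
    order_id INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES customers,
    amount REAL,
    order_date TEXT
);
""")

# OLAP: star schema, read-optimized — one fact table, denormalized dimensions.
conn.executescript("""
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT, region TEXT);
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, full_date TEXT, year INTEGER, month INTEGER);
CREATE TABLE fact_sales (
    customer_key INTEGER REFERENCES dim_customer,
    date_key INTEGER REFERENCES dim_date,
    amount REAL
);
""")

tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name"
)]
print(tables)
```

An STTM document for this pair would then map each OLTP column (e.g. `orders.amount`) to its OLAP target (`fact_sales.amount`) along with the transformation rule applied in between.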

Posted 3 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Noida

Work from Office

Naukri logo

Hello! You've landed on this page, which means you're interested in working with us. Let's take a sneak peek at what it's like to work at Innovaccer.

Engineering at Innovaccer
With every line of code, we accelerate our customers' success, turning complex challenges into innovative solutions. Collaboratively, we transform each data point we gather into valuable insights for our customers. Join us and be part of a team that's turning dreams of better healthcare into reality, one line of code at a time. Together, we're shaping the future and making a meaningful impact on the world.

About the Role
The technology that once promised to simplify patient care has brought more issues than anyone ever anticipated. At Innovaccer, we defeat this beast by making full use of all the data Healthcare has worked so hard to collect, and replacing long-standing problems with ideal solutions. Data is our bread and butter for innovation. We are looking for a Software Development Engineer - II within the analytics team who can help us build the next generation of dashboards, reports, and other analytics for our customers in the provider/payer market.

A Day in the Life
- Begin the day by reviewing overnight alerts and health dashboards for MongoDB, Elasticsearch, and Redis clusters, ensuring system uptime and performance SLAs are met.
- Attend a morning sync with the platform engineering and AI enablement teams to align on current incidents, infrastructure changes, and in-flight automation projects.
- Develop or refine APIs that expose key database metrics and status information to internal AI agents, enabling proactive issue detection and self-healing workflows.
- Work on infrastructure-as-code scripts (e.g., Terraform, Helm, Ansible) to provision or update HA database environments across Kubernetes and cloud-native platforms.
- Collaborate with AI/ML engineers to build lightweight agents that can auto-scale resources, detect slow queries, or predict cache evictions using logs and telemetry from Redis or Elasticsearch.
- Troubleshoot real-time issues like replication lag in MongoDB or index bloat in Elasticsearch, using monitoring tools and custom-built internal dashboards.
- Document operational runbooks and reliability patterns, while contributing to a shared knowledge base for SREs, developers, and future AI agents to use.
- Wrap up by running a simulation or test scenario for a self-recovery agent (e.g., an automated failover handler), validating that it performs as expected under load or failure conditions.

What You Need
Functional:
- Reliability Mindset: Passion for building resilient, self-healing database systems with minimal manual intervention.
- Problem Solving: Strong analytical skills to diagnose and resolve issues across distributed data systems in real time.
- Collaboration: Proven ability to work cross-functionally with engineering, SRE, and AI/ML teams to deliver scalable infrastructure solutions.
- Documentation & Runbooks: Experience creating operational playbooks, troubleshooting guides, and automation documentation.
- Incident Response: Familiarity with on-call rotations, root cause analysis (RCA), and post-incident reviews in a high-availability environment.
- Process Improvement: Ability to identify and improve inefficiencies in database and cache workflows through tooling or automation.

Technical Skill Sets:
- Strong database knowledge and work experience dealing with multi-terabyte, highly concurrent workloads.
- 5+ years of experience with MPP and/or columnar database platforms like Redshift, Azure, Snowflake, etc.
- 4 years of experience with traditional OLTP database platforms like Postgres, MS SQL Server, etc.
- Database Expertise: Hands-on experience managing, tuning, and scaling MongoDB, Redis, and Elasticsearch in production environments.
- API Development: Proficiency in developing and consuming RESTful APIs, especially for telemetry, alerting, or agent control functions.
- Automation & IaC: Working knowledge of tools like Terraform, Ansible, and Helm, and scripting in Python, Bash, or Go.
- Monitoring & Observability: Familiarity with tools like Prometheus, Grafana, the ELK stack, Datadog, or OpenTelemetry for database observability.
- AI/Agent Integration: Exposure to building or supporting AI-powered automation agents for predictive alerting, anomaly detection, or auto-scaling behaviors.
- Security & Compliance: Understanding of secure key management, database encryption, audit logging, and role-based access control (RBAC).
- Cloud & Kubernetes: Experience deploying and operating database workloads in AWS, Azure, or GCP with Kubernetes orchestration.

Here's What We Offer
- Generous Leave Benefits: Enjoy generous leave benefits of up to 40 days.
- Parental Leave: Experience one of the industry's best parental leave policies to spend time with your new addition.
- Sabbatical Leave Policy: Want to focus on skill development, pursue an academic career, or just take a break? We've got you covered.
- Health Insurance: We offer health benefits and insurance to you and your family for medically related expenses related to illness, disease, or injury.
- Pet-Friendly Office*: Spend more time with your treasured friends, even when you're away from home. Bring your furry friends with you to the office and let your colleagues become their friends, too. (*Noida office only)
- Creche Facility for Children*: Say goodbye to worries and hello to a convenient and reliable creche facility that puts your child's well-being first. (*India offices)

Where and how we work
Our Noida office is situated in a posh techspace, equipped with various amenities to support our work environment. Here, we follow a five-day work schedule, allowing us to efficiently carry out our tasks and collaborate effectively within our team. Innovaccer is an equal opportunity employer. We celebrate diversity, and we are committed to fostering an inclusive and diverse workplace where all employees, regardless of race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, marital status, or veteran status, feel valued and empowered.

Disclaimer: Innovaccer does not charge fees or require payment from individuals or agencies for securing employment with us. We do not guarantee job spots or engage in any financial transactions related to employment. If you encounter any posts or requests asking for payment, please report them. Additionally, please exercise caution and verify the authenticity of any requests before disclosing personal and confidential information, including bank account details.

About Innovaccer
Innovaccer Inc. is the data platform that accelerates innovation. The Innovaccer platform unifies patient data across systems and care settings and empowers healthcare organizations with scalable, modern applications that improve clinical, financial, operational, and experiential outcomes. Innovaccer's EHR-agnostic solutions have been deployed across more than 1,600 hospitals and clinics in the US, enabling care delivery transformation for more than 96,000 clinicians, and helping providers work collaboratively with payers and life sciences companies. Innovaccer has helped its customers unify health records for more than 54 million people and generate over $1.5 billion in cumulative cost savings. The Innovaccer platform is the #1 rated Best-in-KLAS data and analytics platform by KLAS, and the #1 rated population health technology platform by Black Book. For more information, please visit innovaccer.com. Check us out on YouTube, Glassdoor, LinkedIn, and innovaccer.com.
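The self-healing checks this role describes — detecting replication lag and triggering a remediation step — can be sketched very simply. Everything below is hypothetical (the threshold, replica names, and action strings are invented for illustration, not Innovaccer's actual stack); in production the lag figures would come from MongoDB's `replSetGetStatus` output rather than a hard-coded dict.

```python
# Simplified sketch of a replication-lag check feeding a self-healing workflow.
LAG_THRESHOLD_SECONDS = 10.0  # hypothetical SLO; real thresholds are tuned per cluster

def check_replication(replica_lags):
    """Given {replica_name: lag_seconds}, return remediation actions for laggy replicas."""
    actions = []
    for replica, lag in sorted(replica_lags.items()):
        if lag > LAG_THRESHOLD_SECONDS:
            # In a real agent this would page on-call and/or kick off a runbook step.
            actions.append(f"alert+resync:{replica}")
    return actions

# Sample telemetry snapshot; replica-b is well past the threshold.
lags = {"replica-a": 0.4, "replica-b": 42.0, "replica-c": 1.2}
print(check_replication(lags))  # → ['alert+resync:replica-b']
```

The value of keeping the decision logic this small and pure is that the "simulation or test scenario" step from the day-in-the-life above becomes a plain unit test: feed in synthetic lag snapshots and assert on the proposed actions, with no live cluster required.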

Posted 3 weeks ago

Apply