
123 OLTP Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

10.0 - 14.0 years

0 Lacs

Dehradun, Uttarakhand

On-site

As a Data Modeler, your primary responsibility will be to design and develop conceptual, logical, and physical data models supporting enterprise data initiatives. You will work with modern storage formats like Parquet and ORC, and build and optimize data models within Databricks Unity Catalog. Collaborating with data engineers, architects, analysts, and stakeholders, you will ensure alignment with ingestion pipelines and business goals. Translating business and reporting requirements into robust data architecture, you will follow best practices in data warehousing and Lakehouse design. Your role will involve maintaining metadata artifacts, enforcing data governance, quality, and security protocols, and continuously improving modeling processes.

You should have over 10 years of hands-on experience in data modeling within Big Data environments. Your expertise should include OLTP, OLAP, dimensional modeling, and enterprise data warehouse practices. Proficiency in modeling methodologies like Kimball, Inmon, and Data Vault is essential. Hands-on experience with modeling tools such as ER/Studio, ERwin, PowerDesigner, SQLDBM, dbt, or Lucidchart is preferred. Experience in Databricks with Unity Catalog and Delta Lake is required, along with a strong command of SQL and Apache Spark for querying and transformation. Familiarity with the Azure Data Platform, including Azure Data Factory, Azure Data Lake Storage, Azure Synapse Analytics, and Azure SQL Database, is beneficial. Exposure to Azure Purview or similar data cataloging tools is a plus.

Strong communication and documentation skills are necessary for this role, as well as the ability to work in cross-functional agile environments. A Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field is required. Certifications such as Microsoft DP-203: Data Engineering on Microsoft Azure are a plus. Experience working in agile/scrum environments and exposure to enterprise data security and regulatory compliance frameworks like GDPR and HIPAA are advantageous.

Posted 20 hours ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You should have strong development knowledge in database design and development, with 6 to 10 years of experience in Postgres. Hands-on experience in writing complex PGSQL procedures and functions that prevent blocking and deadlocks is mandatory, as is conducting SQL object code reviews and performance tuning. Proficiency in Microsoft SQL and MySQL databases is an advantage. A strong understanding of RDBMS and NoSQL concepts, combined with logical thinking and problem-solving skills, is highly valued. Expertise in transactional (OLTP) databases and ACID properties, with the ability to manage large-scale application databases, is mandatory.

Your key responsibilities will include designing, implementing, and maintaining scalable and efficient PostgreSQL databases. You will develop database schemas, tables, views, and relationships based on application requirements, and write complex SQL queries, stored procedures, and functions to support various applications. Identifying and resolving blocking and deadlock issues in database operations, conducting thorough code reviews of SQL objects, and optimizing performance are essential tasks. You will also provide insights into integration and data migration between different database systems, design strategies to handle large-scale application databases effectively, consult with application developers to offer expert guidance on SQL and PGSQL best practices, and maintain clear, comprehensive documentation for database structures, procedures, and configurations. Strong communication skills are crucial for effectively conveying technical information to both technical and non-technical stakeholders.
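
The deadlock-prevention requirement above mostly comes down to consistent lock ordering. As a rough illustration (not part of the posting, with hypothetical table and column names), here is a minimal Python/psycopg2 sketch of the pattern a PGSQL procedure would follow: lock the rows every transaction touches in one agreed order before updating them.

```python
# Illustrative sketch only: the accounts table, its columns, and the
# connection are all assumed; real logic would live in a PL/pgSQL procedure.
import psycopg2

def transfer(conn, account_a: int, account_b: int, amount) -> None:
    """Move funds between two accounts without risking a deadlock."""
    first, second = sorted((account_a, account_b))
    with conn:  # psycopg2: commit on success, rollback on error
        with conn.cursor() as cur:
            # Lock both rows in ascending primary-key order. Every writer
            # using the same ordering prevents circular waits (deadlocks).
            cur.execute(
                "SELECT id FROM accounts WHERE id IN (%s, %s) "
                "ORDER BY id FOR UPDATE",
                (first, second),
            )
            cur.execute(
                "UPDATE accounts SET balance = balance - %s WHERE id = %s",
                (amount, account_a),
            )
            cur.execute(
                "UPDATE accounts SET balance = balance + %s WHERE id = %s",
                (amount, account_b),
            )
```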

Posted 1 day ago

Apply

10.0 - 14.0 years

0 Lacs

Dehradun, Uttarakhand

On-site

You should have familiarity with modern storage formats like Parquet and ORC. Your responsibilities will include designing and developing conceptual, logical, and physical data models to support enterprise data initiatives. You will build, maintain, and optimize data models within Databricks Unity Catalog, developing efficient data structures using Delta Lake to optimize performance, scalability, and reusability. Collaboration with data engineers, architects, analysts, and stakeholders is essential to ensure data model alignment with ingestion pipelines and business goals. You will translate business and reporting requirements into a robust data architecture using best practices in data warehousing and Lakehouse design. Additionally, maintaining comprehensive metadata artifacts such as data dictionaries, data lineage, and modeling documentation is crucial. Enforcing and supporting data governance, data quality, and security protocols across data ecosystems will be part of your role, and you will continuously evaluate and improve modeling processes.

The ideal candidate will have 10+ years of hands-on experience in data modeling in Big Data environments. Expertise in OLTP, OLAP, dimensional modeling, and enterprise data warehouse practices is required, along with proficiency in modeling methodologies including Kimball, Inmon, and Data Vault. Hands-on experience with modeling tools like ER/Studio, ERwin, PowerDesigner, SQLDBM, dbt, or Lucidchart is preferred. Proven experience in Databricks with Unity Catalog and Delta Lake is necessary, along with a strong command of SQL and Apache Spark for querying and transformation. Experience with the Azure Data Platform, including Azure Data Factory, Azure Data Lake Storage, Azure Synapse Analytics, and Azure SQL Database, is beneficial, and exposure to Azure Purview or similar data cataloging tools is a plus.

Strong communication and documentation skills are required, with the ability to work in cross-functional agile environments. Qualifications for this role include a Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field. Certifications such as Microsoft DP-203: Data Engineering on Microsoft Azure are desirable, and experience working in agile/scrum environments plus exposure to enterprise data security and regulatory compliance frameworks (e.g., GDPR, HIPAA) are also advantageous.
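
To make the modeling work concrete, here is a small, hedged PySpark sketch of the star-schema/Delta Lake pattern the posting describes. It assumes a Databricks runtime with Unity Catalog and Delta Lake available; the catalog, schema, table, and column names are invented.

```python
# Build a partitioned fact table in Delta Lake (hypothetical names throughout).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

orders = spark.table("raw.sales.orders")  # assumed ingestion output

fact_orders = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .select("order_id", "customer_sk", "product_sk", "order_date", "amount")
)

(fact_orders.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("order_date")  # physical design choice: date-based pruning
    .saveAsTable("analytics.sales.fact_orders"))

# Cluster a frequently filtered non-partition column for selective queries.
spark.sql("OPTIMIZE analytics.sales.fact_orders ZORDER BY (customer_sk)")
```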

Posted 1 day ago

Apply

8.0 - 12.0 years

0 Lacs

Pune, Maharashtra

On-site

As an experienced Data Migration and Integration engineer, you will be part of the STS group in capital markets, focusing on migrating client data from legacy/third-party systems into FIS products. You will work alongside specialists in Adeptia, PL/SQL, and SSIS, with solid domain knowledge of Lending, Risk, and Treasury products. In this role, you will be responsible for completing data migration/conversion/integration projects within the specified time frame and to the highest quality, with clear and timely communication, problem escalation, and resolution efforts to ensure project success.

To excel in this position, you should hold a Bachelor's degree in Computer Science or a related field (B.Sc./B.C.A./B.Tech./B.E./M.C.A.) with a minimum of 8-10 years of overall experience, primarily in ETL. Your expertise should include 5-6 years of experience with ETL tools like SSIS, Talend, and Adeptia, as well as proficiency in PL/SQL or T-SQL programming for Oracle/SQL Server databases. You should also possess strong knowledge of RDBMS concepts, OLTP system architecture, and analytical programs like Power BI and Crystal Reports/SSRS. Experience with source-code control mechanisms (Git/Bitbucket), XML and JSON structures, Jenkins, job scheduling, SOAP, and REST, along with problem-solving skills, is also essential for this role. Strong written and verbal communication, interpersonal skills, and the ability to work independently in high-pressure situations are key attributes. Previous experience in the banking or financial industry is preferred, along with mentorship skills and hands-on experience in languages like Python, Java, or C#.

At FIS, you will have the opportunity to learn, grow, and have a significant impact on your career. We offer extensive health benefits, career mobility options, award-winning learning programs, a flexible home-office work model, and the chance to collaborate with global teams and clients. FIS is dedicated to safeguarding the privacy and security of personal information processed for client services. Our recruitment model primarily involves direct sourcing, and we do not accept resumes from agencies not on our preferred supplier list. If you are ready to advance the world of fintech and meet the criteria outlined above, we invite you to join us at FIS.
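
As a hedged sketch of the kind of conversion step this role performs (real projects here run through SSIS, Talend, or Adeptia; the connection strings, table names, and transformation are placeholders), a chunked extract-transform-load in Python might look like this:

```python
# Minimal legacy-to-target migration sketch using pandas and SQLAlchemy.
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical endpoints: an Oracle legacy system and a SQL Server target.
src = create_engine("oracle+cx_oracle://user:pwd@legacy-host:1521/?service_name=LEGACY")
dst = create_engine(
    "mssql+pyodbc://user:pwd@fis-host/LoansDB?driver=ODBC+Driver+17+for+SQL+Server"
)

# Chunked extraction keeps memory bounded on large legacy tables.
for chunk in pd.read_sql("SELECT * FROM legacy_loans", src, chunksize=50_000):
    # Trivial example transform: normalise a status code before loading.
    chunk["status"] = chunk["status"].str.strip().str.upper()
    chunk.to_sql("stg_loans", dst, if_exists="append", index=False)
```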

Posted 2 days ago

Apply

8.0 - 13.0 years

13 - 17 Lacs

Bengaluru

Work from Office

About Netskope

Since 2012, we have built the market-leading cloud security company and an award-winning culture powered by hundreds of employees spread across offices in Santa Clara, St. Louis, Bangalore, London, Paris, Melbourne, Taipei, and Tokyo. Our core values are openness, honesty, and transparency, and we purposely developed our open desk layouts and large meeting spaces to support and promote partnerships, collaboration, and teamwork. From catered lunches and office celebrations to employee recognition events and social professional groups such as the Awesome Women of Netskope (AWON), we strive to keep work fun, supportive, and interactive. Visit us at Netskope Careers. Please follow us on LinkedIn and Twitter @Netskope.

As a Staff Engineer on the Data Engineering Team, you'll be working on some of the hardest problems in the field of data, cloud, and security, with a mission to achieve the highest standards of customer success. You will build blocks of technology that will define Netskope's future, leveraging open-source technologies around OLAP, OLTP, streaming, Big Data, and ML models. You will help design and build an end-to-end system to manage the data and infrastructure used to improve security insights for our global customer base. You will be part of a growing team of renowned industry experts in the exciting space of data and cloud analytics. Your contributions will have a major impact on our global customer base and across the industry through our market-leading products, and you will solve complex, interesting challenges while improving the depth and breadth of your technical and business skills.

What you will be doing:
- Conceiving and building services used by Netskope products to validate, transform, load, and perform analytics on large amounts of data using distributed systems with cloud scale and reliability.
- Helping other teams architect their applications using services from the Data team while using best practices and sound designs.
- Evaluating many open-source technologies to find the best fit for our needs, and contributing to some of them.
- Working with the Application Development and Product Management teams to scale their underlying services.
- Providing easy-to-use analytics of usage patterns, anticipating capacity issues, and helping with long-term planning.
- Learning about and designing large-scale, reliable enterprise services.
- Working with great people in a fun, collaborative environment.
- Creating scalable data mining and data analytics frameworks using cutting-edge tools and techniques.

Required skills and experience:
- 8+ years of industry experience building highly scalable distributed data systems.
- Programming experience in Python, Java, or Golang.
- Excellent data structure and algorithm skills.
- Proven good development practices like automated testing and measuring code coverage.
- Proven experience developing complex data platforms and solutions using technologies like Kafka, Kubernetes, MySQL, Hadoop, BigQuery, and other open-source databases.
- Experience designing and implementing large, fault-tolerant, distributed systems around columnar data stores.
- Excellent written and verbal communication skills.
- Bonus points for contributions to the open-source community.

Education: BSCS or equivalent required; MSCS or equivalent strongly preferred. #LI-SK3
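
The validate-transform-load loop described above can be pictured with a short, generic Python sketch. This is not Netskope code: the topics, brokers, and event schema are invented, and kafka-python stands in for whatever streaming stack is actually used.

```python
# Consume raw events, validate them, and route clean vs. rejected records.
import json
from kafka import KafkaConsumer, KafkaProducer

consumer = KafkaConsumer(
    "raw-events",
    bootstrap_servers="broker:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
producer = KafkaProducer(
    bootstrap_servers="broker:9092",
    value_serializer=lambda d: json.dumps(d).encode("utf-8"),
)

REQUIRED_FIELDS = {"tenant_id", "event_type", "ts"}

for msg in consumer:
    event = msg.value
    if not REQUIRED_FIELDS.issubset(event):   # validate
        producer.send("dead-letter", event)   # quarantine bad records
        continue
    event["event_type"] = event["event_type"].lower()  # transform
    producer.send("clean-events", event)      # load for downstream analytics
```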

Posted 3 days ago

Apply

8.0 - 13.0 years

25 - 40 Lacs

Mumbai, Hyderabad

Work from Office

Essential Services: Role & Location Fungibility

At ICICI Bank, we believe in serving our customers beyond our role definition, product boundaries, and domain limitations through our philosophy of customer 360-degree. In essence, this captures our belief in serving the entire banking needs of our customers as One Bank, One Team. To achieve this, employees at ICICI Bank are expected to be role- and location-fungible, with the understanding that banking is an essential service. The role descriptions give you an overview of the responsibilities; they are only directional and guiding in nature.

About the Role: As a Data Warehouse Architect, you will be responsible for managing and enhancing a data warehouse that manages large volumes of customer-lifecycle data flowing in from various applications, within the guardrails of risk and compliance. You will manage the day-to-day operations of the data warehouse (Vertica). In this role, you will manage a team of data warehouse engineers covering data modelling, ETL data pipeline design, issue management, upgrades, performance fine-tuning, migration, and the governance and security framework of the data warehouse. This role enables the Bank to maintain huge data sets in a structured manner that is amenable to data intelligence. The data warehouse supports numerous information systems used by various business groups to derive insights. As a natural progression, the data warehouses will gradually be migrated to a Data Lake, enabling better analytical advantage; the role holder will also be responsible for guiding the team through this migration.

Key Responsibilities:
- Data Pipeline Design: Design and develop ETL data pipelines that help organise large volumes of data, using data warehousing technologies to ensure the data warehouse is efficient, scalable, and secure.
- Issue Management: Ensure the data warehouse runs smoothly; monitor system performance, diagnose and troubleshoot issues, and make necessary changes to optimize system performance.
- Collaboration: Collaborate with cross-functional teams to implement upgrades, migrations, and continuous improvements.
- Data Integration and Processing: Process, clean, and integrate large data sets from various sources to ensure the data is accurate, complete, and consistent.
- Data Modelling: Design and implement data modelling solutions to ensure the organization's data is properly structured and organized for analysis.

Key Qualifications & Skills:
- Education: B.E./B.Tech. in Computer Science, Information Technology, or an equivalent domain, with 10 to 12 years of experience and at least 5 years of relevant work experience in data warehousing/mining/BI/MIS.
- Experience in Data Warehousing: Knowledge of ETL and data technologies and the ability to outline a future vision in OLTP and OLAP (Oracle/MS SQL); data modelling, data analysis, and visualization experience (analytical tools such as Power BI, SAS, QlikView, Tableau, etc.). Exposure to Azure Cloud Data Platform services like Cosmos DB, Azure Data Lake, Azure Synapse, and Azure Data Factory is good to have.
- Synergize with the Team: Regular interaction with business/product/functional teams to create mobility solutions.
- Certification: Azure DP-900, PL-300, DP-203, or other data platform/data analyst certifications.

About the Business Group: The Technology Group at ICICI Bank is at the forefront of our operations and offerings, which are focused on leveraging state-of-the-art technology to provide customer-centric solutions. This group plays a pivotal role in our vision of the transition from Bank to Bank Tech. Further, the group offers round-the-clock support to our entire banking ecosystem. In our persistent efforts to provide products and solutions that genuinely touch customers, unlocking the potential of technology in every single engagement goes a long way in creating customer delight. In this endeavor, we also tirelessly ensure all our processes, systems, and infrastructure are well within the guardrails of data security, privacy, and relevant regulations.
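
A staged, validated load is the bread-and-butter pattern behind the pipeline-design responsibility above. The following is a hedged Python sketch against Vertica; the schema, tables, and business rules are invented, and real pipelines would land data via COPY and run under a scheduler.

```python
# Stage -> validate -> publish load pattern using the vertica-python driver.
import vertica_python

conn_info = {"host": "dwh-host", "port": 5433, "user": "etl",
             "password": "***", "database": "edw"}

with vertica_python.connect(**conn_info) as conn:
    cur = conn.cursor()
    # Quarantine rows that break basic business rules (hypothetical rules).
    cur.execute("""
        INSERT INTO edw.rejected_txn
        SELECT * FROM stg.customer_txn
        WHERE amount < 0 OR txn_date IS NULL
    """)
    # Publish only clean rows to the governed target table.
    cur.execute("""
        INSERT INTO edw.customer_txn
        SELECT * FROM stg.customer_txn
        WHERE amount >= 0 AND txn_date IS NOT NULL
    """)
    conn.commit()
```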

Posted 3 days ago

Apply

8.0 - 12.0 years

0 Lacs

Karnataka

On-site

The Senior Lead Analyst - Implementation Conversion position at FIS is an exciting opportunity for individuals looking to make a significant impact in the fintech industry. In this role, you will be part of the Data Migration team, focusing on migrating client data from legacy or third-party systems into FIS products. Your responsibilities will include completing data migration, conversion, and integration tasks within designated timelines while maintaining high quality standards.

To be successful in this role, you should have a Bachelor's degree in Computer Science or a related field, with a minimum of 8-10 years of overall experience and significant expertise in ETL. Proficiency in ETL tools such as SSIS, Talend, and Adeptia, along with strong programming skills in PL/SQL or T-SQL for Oracle or SQL Server databases, is essential. Additionally, you should have a solid understanding of RDBMS concepts and OLTP system architecture, and experience with analytical programs like Power BI and Crystal Reports/SSRS.

It is crucial for the Implementation Conversion Analyst to possess excellent problem-solving abilities, effective communication skills, and the capability to work independently and under pressure. Experience in the banking or financial industry is preferred, and familiarity with web services, XML, JSON structures, and job scheduling is advantageous. Mentoring skills and proficiency in languages like Python, Java, or C# are considered a plus.

At FIS, you will have the opportunity to learn, grow, and contribute to your career development. We offer a comprehensive Health Benefits Program, career mobility options, award-winning learning opportunities, and an adaptable work model. Join our global team and collaborate with clients around the world to drive innovation in the fintech sector. FIS is dedicated to safeguarding the privacy and security of all personal information processed to deliver services to our clients; for detailed information on our data protection practices, please refer to our Online Privacy Notice. Our recruitment process at FIS is primarily based on direct sourcing, with limited engagement of recruitment agencies; we do not accept resumes from agencies not on our preferred supplier list and do not cover any related fees for submissions. Join us at FIS and be part of a dynamic team shaping the future of fintech.

Posted 3 days ago

Apply

4.0 - 8.0 years

0 Lacs

Nagpur, Maharashtra

On-site

As an ETL Developer with 4 to 8 years of experience, you will be responsible for hands-on ETL development using the Talend tool. You should be highly proficient in writing complex yet efficient SQL queries. Your role will involve extensive work on PL/SQL packages, procedures, functions, triggers, views, materialized views, external tables, partitions, and exception handling for retrieving, manipulating, checking, and migrating complex data sets in Oracle.

In this position, it is essential to have experience with data modeling and warehousing concepts such as star schema, OLAP, OLTP, snowflake schema, fact tables for measurements, and dimension tables. Additionally, familiarity with UNIX scripting, Python/Spark, and Big Data concepts will be beneficial.

If you are a detail-oriented individual with strong expertise in ETL development, SQL, PL/SQL, data modeling, and warehousing concepts, this role offers an exciting opportunity to work with cutting-edge technologies and contribute to the success of the organization.
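
The fact/dimension vocabulary above can be illustrated with a toy surrogate-key lookup, the step where a fact feed's natural keys are resolved against a dimension before loading. This is a generic pandas sketch with made-up data, not code from the posting.

```python
# Resolve natural keys to surrogate keys; route unmatched rows to rejects.
import pandas as pd

dim_product = pd.DataFrame(
    {"product_sk": [1, 2], "product_code": ["P-100", "P-200"]}
)
fact_feed = pd.DataFrame(
    {"product_code": ["P-100", "P-200", "P-999"], "qty": [3, 5, 1]}
)

fact = fact_feed.merge(dim_product, on="product_code", how="left")

# Unmatched rows mirror PL/SQL exception handling in a database-side load.
rejects = fact[fact["product_sk"].isna()]
fact_ok = fact.dropna(subset=["product_sk"])

print(fact_ok[["product_sk", "qty"]])
print(rejects)
```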

Posted 3 days ago

Apply

4.0 - 9.0 years

6 - 10 Lacs

Hyderabad

Work from Office

Company Overview: Syren Cloud Inc. is a leading data engineering company that specializes in solving complex challenges in the supply chain management industry. We have a team of over 350 employees and robust revenue of $25M+. Our mission is to empower organizations with cutting-edge software engineering solutions that optimize operations, harness supply chain intelligence, and drive sustainable growth. We value both growth and employee well-being, striving to maintain a positive work environment while providing opportunities for professional development.

Role Summary: As a Business Intelligence Engineer, you'll create and deliver data analytics solutions using tools like Power BI. You'll collaborate with business partners to define requirements, design data models, and build dashboards that provide actionable insights. Your work will help automate the digital supply chain for a global audience.

Key Responsibilities
- Develop dashboards and reports using Power BI.
- Design data models to transform raw data into insights.
- Work with the MS SQL Server BI stack (SSRS, T-SQL, Power Query, MDX, DAX).
- Collaborate with end users to gather and translate requirements.
- Enhance existing BI systems and ensure data security.

Qualifications
- 4+ years of Power BI experience.
- Proficiency in Power BI, SQL Server, T-SQL, Power Query, MDX, and DAX.
- Strong analytical and communication skills.
- Knowledge of database management systems and OLAP/OLTP.
- Comfortable working under deadlines in agile environments.

Posted 1 week ago

Apply

5.0 - 8.0 years

10 - 20 Lacs

Hyderabad, Pune, Chennai

Hybrid

Role & responsibilities: We are looking for a skilled Data Modeller with 5 to 8 years of hands-on experience in designing and maintaining robust data models for enterprise data solutions. The ideal candidate has a strong foundation in dimensional, relational, and semantic data modelling and is ready to expand into data engineering technologies and practices. This is a unique opportunity to influence enterprise-wide data architecture while growing your career in modern data engineering.

Required Skills & Experience:
- 5 to 8 years of experience in data modelling with tools such as Erwin, ER/Studio, dbt, PowerDesigner, or equivalent.
- Strong understanding of relational databases, star/snowflake schemas, normalization, and denormalization.
- Experience working with SQL, stored procedures, and performance tuning of data queries.
- Exposure to data warehousing concepts and BI tools (e.g., Tableau, Power BI, Looker).
- Familiarity with data governance, metadata management, and data cataloging tools.
- Excellent communication and documentation skills.

Posted 1 week ago

Apply

10.0 - 15.0 years

13 - 17 Lacs

Bengaluru

Work from Office

About Netskope

Since 2012, we have built the market-leading cloud security company and an award-winning culture powered by hundreds of employees spread across offices in Santa Clara, St. Louis, Bangalore, London, Paris, Melbourne, Taipei, and Tokyo. Our core values are openness, honesty, and transparency, and we purposely developed our open desk layouts and large meeting spaces to support and promote partnerships, collaboration, and teamwork. From catered lunches and office celebrations to employee recognition events and social professional groups such as the Awesome Women of Netskope (AWON), we strive to keep work fun, supportive, and interactive. Visit us at Netskope Careers. Please follow us on LinkedIn and Twitter @Netskope.

About the role: As a Sr. Staff Engineer on the Data Engineering Team, you'll be working on some of the hardest problems in the field of data, cloud, and security, with a mission to achieve the highest standards of customer success. You will build blocks of technology that will define Netskope's future, leveraging open-source technologies around OLAP, OLTP, streaming, Big Data, and ML models. You will help design and build an end-to-end system to manage the data and infrastructure used to improve security insights for our global customer base. You will be part of a growing team of renowned industry experts in the exciting space of data and cloud analytics. Your contributions will have a major impact on our global customer base and across the industry through our market-leading products, and you will solve complex, interesting challenges while improving the depth and breadth of your technical and business skills.

What you will be doing:
- Conceiving and building services used by Netskope products to validate, transform, load, and perform analytics on large amounts of data using distributed systems with cloud scale and reliability.
- Helping other teams architect their applications using services from the Data team while using best practices and sound designs.
- Evaluating many open-source technologies to find the best fit for our needs, and contributing to some of them.
- Working with the Application Development and Product Management teams to scale their underlying services.
- Providing easy-to-use analytics of usage patterns, anticipating capacity issues, and helping with long-term planning.
- Learning about and designing large-scale, reliable enterprise services.
- Working with great people in a fun, collaborative environment.
- Creating scalable data mining and data analytics frameworks using cutting-edge tools and techniques.

Required skills and experience:
- 10+ years of industry experience building highly scalable distributed data systems.
- Programming experience in Scala, Python, Java, or Golang.
- Excellent data structure and algorithm skills.
- Proven good development practices like automated testing and measuring code coverage.
- Proven experience developing complex data platforms and solutions using technologies like Spark, Kafka, Kubernetes, Iceberg, Trino, BigQuery, and other open-source databases.
- Experience designing and implementing large, fault-tolerant, distributed systems around columnar data stores.
- Excellent written and verbal communication skills.
- Bonus points for contributions to the open-source community.

Education: B.Tech or equivalent required; Master's or equivalent strongly preferred. #LI-SK3

Posted 1 week ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Chennai, Bengaluru

Work from Office

Application Support - Unix, SQL, Shell Scripting / IST Switch - Rotational Shifts

About the team: The MPS EPI Production Support team supports Payments systems residing in the data center, ensuring the OLTP systems (IST Switch and Fraud Navigator) and batch systems (Clearing, MAS, Fraud Navigator, and Data Navigator) work without interruption, with quick failovers and monthly and compliance releases.

What you will be doing: You will provide technical support activities for a software production processing environment: installing, maintaining, and supporting application/source code and/or its components and subsystems, including third-party software; detecting, diagnosing, and reporting related problems; analyzing and resolving incidents, problems, or known errors related to failures in application and supporting software components; and providing technical assistance to programming staff in the analysis of application software amendments, performance, and resource consumption.

What you bring: 3 to 7 years of experience in Switch, Unix, SQL, and shell scripting. You should have hands-on experience automating production support, knowledge of IST Switch, good experience in handling incidents, and excellent communication skills.

Added bonus if you have: Knowledge of FIS products and services; knowledge of the business goals, objectives, and business operations for the appropriate FIS organization; knowledge of the financial services industry.

What we offer you: A range of benefits designed to help support your lifestyle and wellbeing; a multi-faceted job with a broad spectrum of responsibilities; a modern international work environment and a dedicated and innovative team; and a broad range of professional education and personal development possibilities. FIS is your final career step.

Posted 1 week ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office

ECMS Req #: 533599
Number of Openings: 1
Duration of Hiring: 6 months
Years of experience: Total 6-8 years; Relevant 4-5 years

Detailed job description - Skill Set:

Required Qualifications:
- 4+ years of experience in data engineering or warehousing with a focus on Amazon Redshift.
- Strong proficiency in SQL, with the ability to write and optimize complex queries for large datasets.
- Solid understanding of dimensional modeling, star schema, and OLAP vs. OLTP data structures.
- Experience in designing analytical data marts and transforming raw/transactional data into structured analytical formats.
- Hands-on experience with ETL tools (e.g., AWS Glue).
- Familiarity with Amazon Redshift Spectrum, RA3 nodes, and data distribution/sort key best practices.
- Comfortable working in cloud-native environments, particularly AWS (S3, Lambda, CloudWatch, IAM, etc.).

Preferred Qualifications:
- Exposure to data lake integration, external tables, and Redshift UNLOAD/COPY operations.
- Experience with BI tools (e.g., Tableau, QuickSight) to validate and test data integration.
- Familiarity with Python or PySpark for data transformation scripting.
- Understanding of CI/CD for data pipelines and version control using Git.
- Knowledge of data security, encryption, and compliance in a cloud environment.

Mandatory Skills (only 2 or 3): Amazon Redshift, SQL
Vendor Billing range in local currency (per day): INR 8500/day
Work Location: Any Infosys DC
WFO/WFH/Hybrid: Hybrid
Joining time (notice period): As early as possible
Working in shifts outside standard daylight hours: No
BG check before or after onboarding: Before - Final BG report
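
The distribution-key and sort-key practices named in the req are table-design decisions. As a hedged illustration (cluster endpoint, schema, and table are placeholders; psycopg2 works here because Redshift speaks the PostgreSQL wire protocol):

```python
# Create a Redshift fact table with explicit distribution and sort keys.
import psycopg2

DDL = """
CREATE TABLE IF NOT EXISTS mart.fact_sales (
    sale_id     BIGINT,
    customer_id BIGINT,
    sale_date   DATE,
    amount      DECIMAL(12, 2)
)
DISTSTYLE KEY
DISTKEY (customer_id)          -- co-locate rows commonly joined on customer
COMPOUND SORTKEY (sale_date);  -- range-restricted scans for date filters
"""

with psycopg2.connect(
    "host=cluster.example.redshift.amazonaws.com port=5439 "
    "dbname=dev user=etl password=***"
) as conn:
    with conn.cursor() as cur:
        cur.execute(DDL)
```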

Posted 1 week ago

Apply

7.0 - 8.0 years

15 - 30 Lacs

Pune, Gurugram, Bengaluru

Work from Office

Skills needed: 7+ years of proven experience in writing and optimizing complex SQL queries, stored procedures, functions, triggers, and views for both OLTP (Online Transaction Processing) and OLAP (Online Analytical Processing) environments. You will create and provide optimized SQL views and data sets for business intelligence tools (e.g., Power BI, Tableau, Qlik Sense) and reporting applications, and work closely with BI developers and data analysts to understand their data needs and provide efficient data access solutions. Experience with cloud-based data warehousing platforms (e.g., Snowflake, Databricks, Azure Synapse Analytics) is required, along with strong proficiency in SQL and a deep understanding of relational database concepts. You should have extensive experience with at least one major RDBMS (e.g., Microsoft SQL Server, MySQL, PostgreSQL, Oracle), a solid understanding of database design principles, data modelling, and normalization, and experience with query troubleshooting, performance tuning, and query optimization techniques.

Nice to have: Knowledge of the commercial life sciences and bio-pharma industry is highly desirable; comfort with commercial datasets (sales data from Iqvia, Symphony, Komodo, etc., and CRM data from Veeva, OCE, etc.); and knowledge of scripting languages (e.g., Python, PowerShell) for data manipulation and automation within a data pipeline.
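
Two artifacts sit at the center of this role: an OLTP-side index that keeps transactional queries fast, and an OLAP-facing view handed to BI tools. The SQL below is a generic, hedged sketch (object names invented; INCLUDE and DATE_TRUNC syntax varies by RDBMS, so adjust for your engine).

```python
# Illustrative SQL kept as Python strings; execute via any DB-API connection.

OLTP_COVERING_INDEX = """
CREATE INDEX ix_orders_customer_date
    ON orders (customer_id, order_date)
    INCLUDE (total_amount);  -- covering: lookups never touch the base rows
"""

BI_REPORTING_VIEW = """
CREATE VIEW rpt.monthly_sales AS
SELECT customer_id,
       DATE_TRUNC('month', order_date) AS sales_month,
       SUM(total_amount)               AS total_sales
FROM   orders
GROUP  BY customer_id, DATE_TRUNC('month', order_date);
"""
```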

Posted 1 week ago

Apply

10.0 - 15.0 years

12 - 24 Lacs

Navi Mumbai

Work from Office

Responsibilities:
- Design, develop, test, and maintain Oracle applications using PL/SQL, SQL, and OLTP principles.
- Collaborate with cross-functional teams on project delivery following Agile methodology.

Benefits: Leave encashment, maternity leaves, paternity leaves.

Posted 1 week ago

Apply

14.0 - 18.0 years

50 - 90 Lacs

Bengaluru

Work from Office

About Netskope

Today, there's more data and users outside the enterprise than inside, causing the network perimeter as we know it to dissolve. We realized a new perimeter was needed, one that is built in the cloud and follows and protects data wherever it goes, so we started Netskope to redefine Cloud, Network and Data Security. Since 2012, we have built the market-leading cloud security company and an award-winning culture powered by hundreds of employees spread across offices in Santa Clara, St. Louis, Bangalore, London, Paris, Melbourne, Taipei, and Tokyo. Our core values are openness, honesty, and transparency, and we purposely developed our open desk layouts and large meeting spaces to support and promote partnerships, collaboration, and teamwork. From catered lunches and office celebrations to employee recognition events and social professional groups such as the Awesome Women of Netskope (AWON), we strive to keep work fun, supportive, and interactive. Visit us at Netskope Careers. Please follow us on LinkedIn and Twitter @Netskope.

About the team: DSPM is designed to provide comprehensive data visibility and contextual security for the modern AI-driven enterprise. Our platform automatically discovers, classifies, and secures sensitive data across diverse environments including AWS, Azure, Google Cloud, Oracle Cloud, and on-premise infrastructure. DSPM is critical in empowering CISOs and security teams to enforce secure AI practices, reduce compliance risk, and maintain continuous data protection at scale.

What's in it for you: We are a distributed team of passionate engineers dedicated to continuous learning and building impactful products. Our core product is built from the ground up in Scala, leveraging the Lightbend stack (Play/Akka). As a key member of the DSPM team, you will contribute to developing innovative and scalable systems designed to protect the exponentially increasing volume of enterprise data. Our platform continuously maps sensitive data across all major cloud providers and on-prem environments, automatically detecting and classifying sensitive and regulated data types such as PII, PHI, financial, and healthcare information. It flags data at risk of exposure, exfiltration, or misuse and helps prioritize issues based on sensitivity, exposure, and business impact.

What you will be doing:
- Drive the enhancement of Data Security Posture Management (DSPM) capabilities by enabling the detection of sensitive or risky data utilized in (but not limited to) training private LLMs or accessed by public LLMs.
- Improve the DSPM platform to extend product support to all major cloud infrastructures, on-prem deployments, and any new upcoming technologies.
- Provide technical leadership in all phases of a project, from discovery and planning through implementation and delivery.
- Contribute to product development: understand customer requirements and work with the product team to design, plan, and implement features.
- Support customers by investigating and fixing production issues.
- Help us improve our processes and make the right tradeoffs between agility, cost, and reliability.
- Collaborate with teams across geographies.

Required skills and experience:
- 12+ years of software development experience with enterprise-grade software.
- Must have experience in building scalable, high-performance cloud services.
- Expert coding skills in Scala or Java.
- Development on cloud platforms including AWS.
- Deep knowledge of databases and data warehouses (OLTP, OLAP).
- Analytical and troubleshooting skills.
- Experience working with Docker and Kubernetes.
- Able to multitask and wear many hats in a fast-paced environment: this week you might lead the design of a new feature; next week you are fixing a critical bug or improving our CI infrastructure.
- Cybersecurity experience and adversarial thinking is a plus.
- Expertise in building REST APIs.
- Experience leveraging cloud-based AI services (e.g., AWS Bedrock, SageMaker), vector databases, and Retrieval Augmented Generation (RAG) architectures is a plus.

Education: BSCS or equivalent required; MSCS or equivalent strongly preferred. #LI-JB3

Netskope is committed to implementing equal employment opportunities for all employees and applicants for employment. Netskope does not discriminate in employment opportunities or practices based on religion, race, color, sex, marital or veteran status, age, national origin, ancestry, physical or mental disability, medical condition, sexual orientation, gender identity/expression, genetic information, pregnancy (including childbirth, lactation, and related medical conditions), or any other characteristic protected by the laws or regulations of any jurisdiction in which we operate. Netskope respects your privacy and is committed to protecting the personal information you share with us; please refer to Netskope's Privacy Policy for more details.

Posted 1 week ago

Apply

4.0 - 6.0 years

5 - 9 Lacs

Hyderabad

Work from Office

This is a technical role responsible for managing complex database environments, including SQL Server and MySQL databases. The role includes planning, implementation, performance tuning, and maintenance of enterprise relational database platforms with a focus on reliability, security, and automation. The ideal candidate will have a consistent track record in database infrastructure operations and a passion for fostering innovation and excellence in the biotechnology industry. Collaboration with multi-functional and global teams is required to ensure seamless integration and operational excellence. The candidate should also have a solid background in database service delivery and operations, coupled with leadership and transformation experience: this role demands the ability to drive and deliver against key organizational critical initiatives, foster a collaborative environment, and deliver high-quality results in a matrixed organizational structure. Please note, this is an on-site role based in Hyderabad.

Responsibilities:
- Database administration across all database lifecycle stages, including installation, upgrade, optimization, and decommission of SQL Server databases.
- Administer security access controls; recover databases during disaster recovery as needed; develop and update documentation; automate routine operational work; and implement process improvements.
- Plan the implementation and configuration of database software and related services to support specific database business requirements (OLTP, decision support, standby DB, replication) while following database security requirements, reliability, performance, and standard processes.
- Provide database administration support for development, test, and production environments.
- Investigate and resolve technical database issues; participate in a 24x7 on-call support rotation and assist/lead root-cause analysis reviews as needed.
- Provide technical leadership for less experienced personnel, including training on installation and upgrades of RDBMS software, backup/recovery strategies, and high-availability configurations.
- Develop and document standards, procedures, and work instructions that increase operational productivity.
- Perform necessary security patch implementations to ensure ongoing database security.
- Understanding of SAN storage and knowledge of supporting and provisioning databases in the AWS and Azure public clouds.

What we expect of you: We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Master's degree and 4 to 6 years of Information Systems and Database Administration experience, OR Bachelor's degree and 6 to 8 years, OR Diploma and 10 to 12 years of Information Systems and Database Administration experience.
- Experience administering and monitoring SQL Server databases and systems.
- Demonstrable experience automating database provisioning, patching, and administration.
- Demonstrable experience with MSSQL Always On Availability Groups (AAG).
- Experience with DB tools to review performance, monitor, and resolve issues.
- Understanding of ITIL frameworks and standard processes.
- Understanding of operating system tools for performance analysis and troubleshooting.
- Excellent data-driven problem solving and analytical skills.
- Demonstrable experience as part of a high-performance team.

Preferred Qualifications:
- Experience working on regulated systems (preferably in the pharmaceutical sector).
- Superb communication skills.
- Organisational change expertise.
- Skill in persuasion and negotiation.
- Experience using Ansible for automation.
- Experience supporting MySQL databases.

Soft Skills: Partner communication and expectation management; crisis management capabilities.

Shift Information: This position is required to be onsite and to participate in 24/5 and weekend on-call rotations, and may require working a later shift. Candidates must be willing and able to work off hours, as required by business needs.

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed, and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease.

What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
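
Monitoring Always On Availability Group health is the sort of routine check the posting expects to be automated. Here is a hedged Python/pyodbc sketch: the server, connection string, and alerting action are placeholders, while the DMVs queried are standard SQL Server views.

```python
# Flag any availability replica that is not reporting HEALTHY.
import pyodbc

QUERY = """
SELECT ar.replica_server_name,
       rs.role_desc,
       rs.synchronization_health_desc
FROM   sys.dm_hadr_availability_replica_states AS rs
JOIN   sys.availability_replicas AS ar
       ON ar.replica_id = rs.replica_id;
"""

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=ag-listener;DATABASE=master;Trusted_Connection=yes;"
)
for server, role, health in conn.execute(QUERY):
    if health != "HEALTHY":
        print(f"ALERT: {server} ({role}) is {health}")  # page the DBA here
```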

Posted 1 week ago

Apply

4.0 - 10.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

We are seeking a PostgreSQL Database Developer with a minimum of 4 years of experience in database management. The ideal candidate should be passionate about technology, dedicated to continuous learning, and committed to providing exceptional customer experiences through client interactions.

Qualifications:
- A degree in BE/B.Tech/MCA/MS-IT/CS/B.Sc/BCA or a related field.
- Expertise and hands-on experience in PostgreSQL, PL/SQL, Oracle, query optimization, performance tuning, and GCP Cloud.

Job Description: The responsibilities of the PostgreSQL Database Developer include:
- Proficiency in PL/SQL and PostgreSQL programming, with the ability to write complex SQL queries and stored procedures.
- Experience migrating database structures and data from Oracle to PostgreSQL, preferably on GCP AlloyDB or Cloud SQL.
- Familiarity with Cloud SQL/AlloyDB and tuning them for better performance.
- Working knowledge of BigQuery, Firestore, Memorystore, Spanner, and bare-metal setups for PostgreSQL.
- Expertise in tuning AlloyDB/Cloud SQL databases for optimal performance.
- Experience with GCP Database Migration Service, MongoDB, Cloud Dataflow, disaster recovery, job scheduling, logging techniques, and OLTP/OLAP.
- Desirable: GCP Database Engineer certification.

Roles & Responsibilities:
- Develop, test, and maintain data architectures.
- Migrate enterprise Oracle databases from on-prem to GCP cloud.
- Tune autovacuum in PostgreSQL.
- Performance-tune PostgreSQL stored procedures and queries.
- Convert Oracle stored procedures and queries to PostgreSQL equivalents.
- Create a hybrid data store combining a data warehouse, NoSQL GCP solutions, and PostgreSQL.
- Migrate Oracle table data to AlloyDB.
- Lead the database team.

Mandatory Skills: PostgreSQL, PL/SQL, BigQuery, GCP Cloud, tuning, and optimization.

To apply, please share your resume at sonali.mangore@impetus.com with details of your current CTC, expected CTC, notice period, and last working day (LWD).
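
Since the responsibilities call out autovacuum tuning explicitly, here is a minimal, hedged sketch of the per-table approach. The table name and thresholds are illustrative rather than recommendations; the storage parameters and statistics view are standard PostgreSQL.

```python
# Tune autovacuum for one high-churn OLTP table, then check dead-tuple stats.
import psycopg2

with psycopg2.connect("dbname=app user=dba") as conn:
    with conn.cursor() as cur:
        # Vacuum this table far more aggressively than the 20% default.
        cur.execute("""
            ALTER TABLE orders SET (
                autovacuum_vacuum_scale_factor  = 0.01,
                autovacuum_analyze_scale_factor = 0.02
            )
        """)
        # Verify dead-tuple buildup and when autovacuum last ran.
        cur.execute("""
            SELECT relname, n_dead_tup, last_autovacuum
            FROM pg_stat_user_tables
            ORDER BY n_dead_tup DESC
            LIMIT 5
        """)
        for row in cur.fetchall():
            print(row)
```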

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

Wipro Limited is a leading technology services and consulting company dedicated to developing innovative solutions that cater to the most complex digital transformation needs of clients. Our comprehensive range of consulting, design, engineering, and operational capabilities enables us to assist clients in achieving their most ambitious goals and establishing sustainable, future-ready businesses. With a global presence of over 230,000 employees and business partners spanning 65 countries, we remain committed to supporting our customers, colleagues, and communities in navigating an ever-evolving world.

We are currently seeking an individual with hands-on experience in data modeling for both OLTP and OLAP systems. The ideal candidate should possess a deep understanding of conceptual, logical, and physical data modeling, coupled with a robust grasp of indexing, partitioning, and data sharding, supported by practical experience. Experience in identifying and mitigating factors impacting database performance for near-real-time reporting and application interaction is essential. Proficiency in at least one data modeling tool, preferably DB Schema, is required. Additionally, functional knowledge of the mutual fund industry would be beneficial. Familiarity with GCP databases such as AlloyDB, Cloud SQL, and BigQuery is preferred. The role demands willingness to work from our Chennai office, with a mandatory on-site presence at the customer site five days per week. Cloud-PaaS-GCP-Google Cloud Platform is a mandatory skill set for this position, and the successful candidate should have 5-8 years of relevant experience.

We are looking for individuals prepared to contribute to the reimagining of Wipro as a modern digital transformation partner, and who are inspired by reinvention - of themselves, their careers, and their skills. At Wipro, we encourage continuous evolution, reflecting our commitment to adapt to the changing world around us. Join us in a business driven by purpose, where you have the freedom to shape your own reinvention. Realize your ambitions at Wipro. We welcome applications from individuals with disabilities. For more information, please visit www.wipro.com.
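
The indexing-and-partitioning expertise described above can be sketched with PostgreSQL declarative range partitioning, which also applies to AlloyDB and Cloud SQL for PostgreSQL. The table, date ranges, and index below are hypothetical.

```python
# Range-partitioned transactional table; partition pruning keeps
# near-real-time reporting queries fast. Execute via any PostgreSQL client.
PARTITION_DDL = """
CREATE TABLE trades (
    trade_id   BIGINT,
    trade_date DATE NOT NULL,
    amount     NUMERIC(14, 2)
) PARTITION BY RANGE (trade_date);

CREATE TABLE trades_2024q1 PARTITION OF trades
    FOR VALUES FROM ('2024-01-01') TO ('2024-04-01');

CREATE TABLE trades_2024q2 PARTITION OF trades
    FOR VALUES FROM ('2024-04-01') TO ('2024-07-01');

-- Partitioned index: each partition gets a matching local index.
CREATE INDEX ON trades (trade_date, trade_id);
"""
```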

Posted 1 week ago

Apply

7.0 - 11.0 years

0 Lacs

Maharashtra

On-site

As a Solutions Architect with over 7 years of experience, you will have the opportunity to leverage your expertise in cloud data solutions to architect scalable and modern solutions on AWS. In this role at Quantiphi, you will be a key member of our high-impact engineering teams, working closely with clients to solve complex data challenges and design cutting-edge data analytics solutions. Your responsibilities will include acting as a trusted advisor to clients, leading discovery/design workshops with global customers, and collaborating with AWS subject matter experts to develop compelling proposals and Statements of Work (SOWs). You will also represent Quantiphi in various forums such as tech talks, webinars, and client presentations, providing strategic insights and solutioning support during pre-sales activities.

To excel in this role, you should have a strong background in AWS data services, including DMS, SCT, Redshift, Glue, Lambda, EMR, and Kinesis. Your experience in data migration and modernization, particularly from Oracle, Teradata, and Netezza to AWS, will be crucial. Hands-on experience with ETL tools such as SSIS, Informatica, and Talend, as well as a solid understanding of OLTP/OLAP, star and snowflake schemas, and data modeling methodologies, is essential for success in this position. Additionally, familiarity with backend development using Python, APIs, and stream-processing technologies like Kafka, along with knowledge of distributed computing concepts including Hadoop and MapReduce, will be beneficial. A DevOps mindset, with experience in CI/CD practices and Infrastructure as Code, is also desired.

Joining Quantiphi as a Solutions Architect is more than just a job: it's an opportunity to shape digital transformation journeys and influence business strategies across various industries. If you are a cloud data enthusiast looking to make a significant impact in the field of data analytics, this role is perfect for you.

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You should have a minimum of 8 years of experience as a Power BI Developer, with 7 to 12 years of total experience. Your role will involve hands-on experience in handling teams and clients. You should possess expert knowledge of advanced calculations in MS Power BI Desktop, including DAX functions such as aggregate, date, logical, string, and table functions. Prior experience in connecting Power BI to both on-premise and cloud computing platforms is required.

A deep understanding of, and the ability to utilize and explain, various aspects of relational database design, multidimensional database design, OLTP, OLAP, KPIs, scorecards, and dashboards is essential for this role. You should have a very good understanding of data modeling techniques for analytical data, including facts, dimensions, and measures. Experience in data warehouse design, specifically dimensional modeling, and data mining will be beneficial for this position. Additionally, hands-on experience in SSIS, SSRS, and SSAS will be considered a plus.

Posted 1 week ago

Apply

5.0 - 10.0 years

10 - 15 Lacs

Bengaluru

Remote

The client requires 5+ years of experience in data modeling, including a minimum of 3 years of proficiency with ER/Studio. The ideal candidate will possess a deep understanding of data architecture, database design, and conceptual, logical, and physical data models.

Posted 2 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You are a skilled Data Modeler with expertise in using DBSchema within GCP environments. In this role, you will be responsible for creating and optimizing data models for both OLTP and OLAP systems, ensuring they are well designed for performance and maintainability. Your key responsibilities will include developing conceptual, logical, and physical models using DBSchema, aligning schema design with application requirements, and optimizing models in BigQuery, CloudSQL, and AlloyDB. Additionally, you will be involved in supporting schema documentation, reverse engineering, and visualization tasks.

Must-have skills for this role include proficiency in the DBSchema modeling tool, strong experience with GCP databases such as BigQuery, CloudSQL, and AlloyDB, and knowledge of OLTP and OLAP system structures and performance tuning. Expertise in SQL and schema evolution/versioning best practices is essential. Preferred skills include experience integrating DBSchema with CI/CD pipelines and knowledge of real-time ingestion pipelines and federated schema design.

As a Data Modeler, you should be detail-oriented, organized, and communicative, and comfortable presenting schema designs to cross-functional teams. By joining this role, you will have the opportunity to work with industry-leading tools in modern GCP environments, enhance modeling workflows, and contribute to enterprise data architecture with visibility and impact.

Posted 2 weeks ago

Apply

10.0 - 15.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Staff Software Engineer specializing in Java at Walmart Global Tech in Chennai, you will play a crucial role in guiding the team in making architectural decisions and following best practices for building scalable applications. Your responsibilities will include driving the design, development, implementation, and documentation of cutting-edge solutions that impact associates of Walmart globally. You will collaborate with engineering teams across different locations, engage with Product Management and Business to drive product agendas, and work closely with architects to ensure solutions meet Quality, Cost, and Delivery standards.

With a Bachelor's/Master's degree in Computer Science or a related field and a minimum of 10 years of experience in software design, development, and automated deployments, you will bring valuable expertise to the team. Your prior experience in delivering highly scalable Java applications, strong system design skills, and proficiency in CS fundamentals, microservices, data structures, and algorithms will be essential for success in this role. You should have hands-on experience with CI/CD development environments and tools like Git, Maven, and Jenkins, as well as expertise in writing modular and testable code using frameworks such as JUnit and Mockito. Your experience in building Java-based backend systems, working with cloud-based solutions, and familiarity with technologies like Spring Boot, Kafka, and Spark will be crucial. Additionally, you should be well versed in microservices architecture, distributed concepts, design patterns, and cloud-native development. Experience with relational and NoSQL databases, caching technologies, event-based systems like Kafka, monitoring tools like Prometheus and Splunk, and containerization tools like Docker and Kubernetes will be highly valuable.

At Walmart Global Tech, you will have the opportunity to work in an innovative environment where your contributions can impact millions of people. The company values diversity, inclusion, and belonging, and offers a flexible, hybrid work environment along with competitive compensation, benefits, and opportunities for personal and professional growth. As an Equal Opportunity Employer, Walmart fosters a workplace culture where every individual is respected and valued, contributing to a welcoming and inclusive environment for all associates, customers, and suppliers.

Posted 2 weeks ago

Apply

9.0 - 14.0 years

25 - 40 Lacs

Chennai

Work from Office

Role & responsibilities: We are seeking a Data Modeller with over 12 years of progressive experience in information technology, including a minimum of 4 years on data migration projects to the cloud (refactor, replatform, etc.) and 2 years of exposure to GCP.

Preferred candidate profile:
- In-depth knowledge of Data Warehousing/Lakehouse architectures, Master Data Management, Data Quality Management, Data Integration, and Data Warehouse architecture.
- Work with the business intelligence team to gather requirements for the database design and model.
- Understand the current on-premise DB model and refactor it for Google Cloud for better performance.
- Knowledge of ER modeling, big data, enterprise data, and physical data models; design and implement data structures to support business processes and analytics, ensuring efficient data storage, retrieval, and management.
- Create a logical data model and validate it to ensure it meets the demands of the business application and its users.
- Experience developing physical models for SQL, NoSQL, key-value, and document databases such as Oracle, BigQuery, Spanner, PostgreSQL, Firestore, MongoDB, etc.
- Understand the data needs of the company or client.
- Collaborate with the development team to design and build the database model for both application and data warehousing development.
- Classify the business needs and build both microservices and reporting database models.
- Strong hands-on experience in SQL and database procedures.
- Work with the development team to develop and implement a phase-wise migration plan covering co-existence of on-prem and cloud DBs, and help determine and manage data cleaning requirements.

Posted 2 weeks ago

Apply