563 Query Optimization Jobs - Page 12

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

12.0 - 17.0 years

6 - 10 Lacs

Mumbai

Work from Office

Role Overview: We are looking for an experienced Denodo SME to design, implement, and optimize data virtualization solutions using Denodo as the enterprise semantic and access layer over a Cloudera-based data lakehouse. The ideal candidate will lead the integration of structured and semi-structured data across systems, enabling unified access for analytics, BI, and operational use cases.

Key Responsibilities: Design and deploy the Denodo Platform for data virtualization over Cloudera, RDBMS, APIs, and external data sources. Define logical data models, derived views, and metadata mappings across layers (integration, business, presentation). Connect to Cloudera Hive, Impala, Apache Iceberg, Oracle, and other on-prem/cloud sources. Publish REST/SOAP APIs and JDBC/ODBC endpoints for downstream analytics and applications. Tune virtual views, caching strategies, and federation techniques to meet performance SLAs for high-volume data access. Implement Denodo smart query acceleration, usage monitoring, and access governance. Configure role-based access control (RBAC) and row/column-level security, and integrate with enterprise identity providers (LDAP, Kerberos, SSO). Work with data governance teams to align Denodo with enterprise metadata catalogs (e.g., Apache Atlas, Talend).

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: 8–12 years in data engineering, with 4+ years of hands-on experience in the Denodo Platform. Strong experience integrating RDBMS (Oracle, SQL Server), Cloudera CDP (Hive, Iceberg), and REST/SOAP APIs. Denodo Admin Tool, VQL, Scheduler, Data Catalog; SQL, shell scripting, basic Python (preferred). Deep understanding of query optimization, caching, memory management, and federation principles. Experience implementing data security, masking, and user access control in Denodo.
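The caching responsibilities described above come down to trading data freshness against federated-query cost. A minimal conceptual sketch in Python (Denodo's cache engine is configured declaratively in the platform itself; the class, names, and TTL below are illustrative assumptions, not Denodo APIs):

```python
import time

class TTLCache:
    """Minimal time-to-live cache. Illustrates the trade-off behind
    virtual-view caching: stale-tolerant reads skip the expensive
    federated query until the cached entry expires."""

    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self._store = {}

    def get_or_compute(self, key, compute):
        entry = self._store.get(key)
        now = time.monotonic()
        if entry is not None and now - entry[1] < self.ttl:
            return entry[0]                  # cache hit: no source query
        value = compute()                    # cache miss: query the sources
        self._store[key] = (value, now)
        return value

calls = []
cache = TTLCache(ttl_seconds=60)
fetch = lambda: calls.append(1) or "rows"    # stand-in for a federated query
cache.get_or_compute("sales_by_region", fetch)
cache.get_or_compute("sales_by_region", fetch)
print(len(calls))  # 1 -- the second read was served from cache
```

The same idea generalizes: a longer TTL lowers load on the underlying Cloudera/RDBMS sources at the price of serving older data.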

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Hyderabad

Work from Office

Manage and optimize Azure Cosmos DB, ensuring efficient partitioning, indexing, and performance tuning. Maintain .NET Core applications, ensuring seamless database connectivity and high performance. Monitor and troubleshoot Azure database infrastructure including Cosmos DB, Redis Cache, and Azure SQL. Implement backup, disaster recovery, and high availability strategies across multiple regions. Automate database operations, provisioning, and monitoring using Azure DevOps (CI/CD) and IaC (Terraform, Bicep, ARM). Work with APIM, App Services, Function Apps, and Logic Apps for cloud-native database solutions. Optimize Azure Storage Containers, Cognitive Search, and Form Recognizer for data processing and retrieval. Ensure database security, authentication (OAuth, JWT), and compliance with PMI standards. Strong expertise in query optimization, performance troubleshooting, and RU cost management in Cosmos DB. Hands-on experience with Azure Monitor, Log Analytics, and Application Insights for proactive monitoring and performance insights
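The RU cost management called out above usually starts with choosing a partition key that spreads load evenly. A pure-Python sketch of the skew check (Cosmos DB performs its own internal hashing; the MD5 hash and partition count here are illustrative assumptions):

```python
from collections import Counter
import hashlib

def partition_spread(keys, physical_partitions=10):
    """Estimate how evenly a candidate partition key spreads documents:
    hash each logical key onto simulated physical partitions and report
    max-partition load relative to a perfectly even spread (1.0)."""
    counts = Counter(
        int(hashlib.md5(str(k).encode()).hexdigest(), 16) % physical_partitions
        for k in keys
    )
    return max(counts.values()) / (len(keys) / physical_partitions)

# A high-cardinality key (user id) spreads well; a low-cardinality key
# with one dominant value concentrates RU consumption on a hot partition.
user_ids = range(10_000)
skewed = ["IN"] * 9_000 + ["US"] * 1_000
print(partition_spread(user_ids))   # close to 1.0 (balanced)
print(partition_spread(skewed))     # far above 1.0 (hot partition)
```

A hot partition throttles at that partition's share of provisioned RU/s even when the container as a whole has headroom, which is why this check matters before data lands.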

Posted 1 month ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Hyderabad

Work from Office

Manage and optimize Azure Cosmos DB, ensuring efficient partitioning, indexing, and performance tuning. Maintain .NET Core applications, ensuring seamless database connectivity and high performance. Monitor and troubleshoot Azure database infrastructure including Cosmos DB, Redis Cache, and Azure SQL. Implement backup, disaster recovery, and high availability strategies across multiple regions. Automate database operations, provisioning, and monitoring using Azure DevOps (CI/CD) and IaC (Terraform, Bicep, ARM). Work with APIM, App Services, Function Apps, and Logic Apps for cloud-native database solutions. Optimize Azure Storage Containers, Cognitive Search, and Form Recognizer for data processing and retrieval. Ensure database security, authentication (OAuth, JWT), and compliance with PMI standards. Strong expertise in query optimization, performance troubleshooting, and RU cost management in Cosmos DB. Hands-on experience with Azure Monitor, Log Analytics, and Application Insights for proactive monitoring and performance insights.

Posted 1 month ago

Apply

12.0 - 17.0 years

14 - 19 Lacs

Gurugram

Work from Office

Urgent Opening for Sr. Database Administrator. Posted On 04th Jul 2019 12:14 PM. Location: Gurgaon. Role / Position: Senior Database Administrator. Experience (required): 15 plus years. Description: Designation: Senior Database Administrator. Location: Gurgaon. Primary responsibilities of this role include owning, tracking and resolving database-related incidents and requests, and participating in the design of database architecture for current and future products. The SQL Server DBA will be responsible for the implementation, configuration, maintenance, and performance of critical SQL Server RDBMS systems, to ensure the availability and consistent performance of our applications. This is a hands-on position requiring solid technical skills, as well as excellent interpersonal and communication skills. The successful candidate will be responsible for the development and sustainment of the SQL Server, ensuring its operational readiness (security, health and performance), executing data loads, and performing data modelling in support of multiple development teams. The OLTP databases and data warehouse support an enterprise application suite of program management tools. Must be capable of working independently and collaboratively. Troubleshooting and resolving database integrity, performance, blocking and HA issues etc. Knowledge of SQL Server tools (Profiler, DTA, SSMS, PerfMon, DMVs etc.).
Responsibilities: Responding to database-related alerts and escalations, and working with research/development teams to implement strategic solutions. Conduct SQL Server lunch-and-learn sessions for application developers to share domain and technical expertise. Capable of multi-tasking and working with a variety of people. Provide data to business users for analysis. Create complex analytic queries on large data sets. Independently analyse, solve, and correct issues in real time, providing end-to-end problem resolution. Hands-on support of large databases, including, but not limited to, monitoring, re-indexing, general database maintenance, extract-transformation-load, backup/recovery, documentation, and configuration. Qualification: 12+ years of experience as a Microsoft SQL Server database administrator, with development using T-SQL and SSIS under MS SQL Server 2005/2008/2012/2014. Experience in upgrading databases from 2005/2008 to 2012/2014. Strong knowledge of high-availability architecture including clustering and AlwaysOn, and hands-on knowledge of the MS SQL Business Intelligence products: Analysis Services, Reporting Services and Integration Services. Experience designing logical and physical databases for OLTP and data warehouses. Experience in performance tuning and query optimization, using Performance Monitor, SQL Profiler and other related monitoring and troubleshooting tools. Ability to identify and troubleshoot SQL Server related CPU, memory, I/O, disk space and other resource contention. SQL development: ability to write and troubleshoot SQL code and design (stored procedures, functions, tables, views, triggers, indexes, constraints). SQL development: experience in creating database architecture with associated schema as part of a software design process. Familiarity with Windows Server, security delegation, SPNs, and storage components. Ability to network with documentation and testing teams to accomplish creation of processes and procedures.
A highly self-motivated individual with the ability to work effectively in a collaborative, team-oriented IT environment. Must have 2-3 years of experience as a .NET developer with a strong understanding of database structures, theories, principles, and practices. Send resumes to girish.expertiz@gmail.com

Posted 1 month ago

Apply

6.0 - 9.0 years

27 - 42 Lacs

Pune

Work from Office

About the role: As a Big Data Engineer, you will make an impact by identifying and closing consulting services in the major UK banks. You will be a valued member of the BFSI team and work collaboratively with your manager, primary team and other stakeholders in the unit. In this role, you will: Collaborate with cross-functional teams to improve data ingestion, transformation, and validation workflows. Work closely with Data Engineers, Architects, and Analysts to understand data reconciliation requirements. Develop and implement PySpark programs to process large datasets in big data platforms. Analyze and comprehend existing data ingestion and reconciliation frameworks. Perform complex transformations including reconciliation and advanced data manipulations. Fine-tune Spark jobs for performance optimization, ensuring efficient data processing at scale. Work model: We believe hybrid work is the way forward as we strive to provide flexibility wherever possible. Based on this role's business requirements, this is a hybrid position requiring 3 days a week in a client or Cognizant office in the Pune/Hyderabad location. Regardless of your working arrangement, we are here to support a healthy work-life balance through our various wellbeing programs.
What you must have to be considered: Design and implement data pipelines, ETL processes, and data storage solutions that support data-intensive applications. Extensive hands-on experience with Python and PySpark. Good at data warehousing concepts and well versed in structured and semi-structured (JSON, XML, Avro, Parquet) data processing using Spark/PySpark data pipelines. Experience working with large-scale distributed data processing, and a solid understanding of big data architecture and distributed computing frameworks. Proficiency in Python and the Spark DataFrame API, and strong experience in complex data transformations using PySpark. These will help you stand out: Able to leverage Python libraries such as cryptography or pycryptodome along with PySpark's user-defined functions (UDFs) to encrypt and decrypt data within your Spark workflows. Should have worked on data risk metrics in PySpark, and excellent at data partitioning, Z-value generation, query optimization, spatial data processing and optimization. Experience with CI/CD for data pipelines. Must have working experience in any of the cloud environments AWS/Azure/GCP. Proven experience in an Agile/Scrum team environment. Experience in development of loosely coupled API-based systems. We're excited to meet people who share our mission and can make an impact in a variety of ways. Don't hesitate to apply, even if you only meet the minimum requirements listed. Think about your transferable experiences and unique skills that make you stand out as someone who can bring new and exciting things to this role.
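The UDF-based encryption pattern mentioned above can be sketched without a Spark cluster. The XOR keystream below is a toy stand-in for cryptography/pycryptodome (do not use it to protect real data), and the commented-out Spark registration shows the assumed wiring:

```python
import base64
import hashlib

def _keystream(key: bytes, n: int) -> bytes:
    # Derive a repeatable keystream from the key (illustrative only;
    # a real pipeline would use cryptography.fernet or pycryptodome).
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(plaintext: str, key: bytes) -> str:
    data = plaintext.encode("utf-8")
    ks = _keystream(key, len(data))
    return base64.b64encode(bytes(a ^ b for a, b in zip(data, ks))).decode()

def decrypt(token: str, key: bytes) -> str:
    data = base64.b64decode(token)
    ks = _keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks)).decode("utf-8")

# With Spark available, the same plain functions become column-level UDFs
# (names below are hypothetical):
#   from pyspark.sql import functions as F, types as T
#   encrypt_udf = F.udf(lambda s: encrypt(s, KEY), T.StringType())
#   df = df.withColumn("ssn_enc", encrypt_udf(F.col("ssn")))

print(decrypt(encrypt("secret-123", b"key"), b"key"))  # secret-123
```

Keeping the crypto logic in ordinary functions and wrapping them in UDFs only at the edge keeps the logic unit-testable outside Spark.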

Posted 1 month ago

Apply

6.0 - 11.0 years

5 - 9 Lacs

Hyderabad, Bengaluru

Work from Office

Skill: Snowflake Developer with dbt (Data Build Tool), ADF, and Python. Job Description: We are looking for a Data Engineer with experience in data warehouse projects, strong expertise in Snowflake, and hands-on knowledge of Azure Data Factory (ADF) and dbt (Data Build Tool). Proficiency in Python scripting will be an added advantage. Key Responsibilities: Design, develop, and optimize data pipelines and ETL processes for data warehousing projects. Work extensively with Snowflake, ensuring efficient data modeling and query optimization. Develop and manage data workflows using Azure Data Factory (ADF) for seamless data integration. Implement data transformations, testing, and documentation using dbt. Collaborate with cross-functional teams to ensure data accuracy, consistency, and security. Troubleshoot data-related issues. (Optional) Utilize Python for scripting, automation, and data processing tasks. Required Skills & Qualifications: Experience in data warehousing with a strong understanding of best practices. Hands-on experience with Snowflake (data modeling, query optimization). Proficiency in Azure Data Factory (ADF) for data pipeline development. Strong working knowledge of dbt (Data Build Tool) for data transformations. (Optional) Experience in Python scripting for automation and data manipulation. Good understanding of SQL and query optimization techniques. Experience in cloud-based data solutions (Azure). Strong problem-solving skills and ability to work in a fast-paced environment. Experience with CI/CD pipelines for data engineering.

Posted 1 month ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Hyderabad

Work from Office

Administer PostgreSQL databases, ensuring optimal performance, security, and backup processes. Work on data migration, query optimization, and troubleshooting issues to ensure database reliability.

Posted 1 month ago

Apply

4.0 - 5.0 years

6 - 7 Lacs

Chennai

Work from Office

Manage and maintain MySQL databases, ensuring optimal performance, security, and data integrity. You will handle database backups, troubleshooting, performance tuning, and ensure high availability. Strong experience with MySQL database administration, SQL, and optimization is required.

Posted 1 month ago

Apply

4.0 - 5.0 years

6 - 7 Lacs

Bengaluru

Work from Office

Manage and maintain Microsoft SQL Server databases, ensuring performance, availability, and security. You will perform backups, optimize queries, and troubleshoot database issues. Expertise in SQL Server, T-SQL, and database administration practices is required.

Posted 1 month ago

Apply

4.0 - 5.0 years

6 - 7 Lacs

Bengaluru

Work from Office

Design, develop, and maintain PL/SQL code for efficient data management and integration with Oracle databases. You will write complex SQL queries, optimize database performance, and ensure data integrity. Experience in PL/SQL development, Oracle databases, and query optimization is essential.

Posted 1 month ago

Apply

3.0 - 8.0 years

6 - 9 Lacs

Navi Mumbai

Work from Office

3-5 years of experience in MySQL database development and administration. Strong knowledge of relational database concepts, design, and indexing. Expertise in SQL query optimization and performance tuning. Deep understanding of database security best practices and encryption methods. Experience with ETL processes and tools. Ability to write complex SQL queries, stored procedures, and functions. Strong problem-solving skills, attention to detail, and the ability to work independently or within a team. Excellent communication skills for cross-functional collaboration. Bachelor's degree in Computer Science, Information Technology, or a related field. Certification in MySQL administration and knowledge of Oracle or PostgreSQL is a plus! If you're looking to take your career to the next level, apply now and become part of a dynamic and innovative team!
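The interplay of indexing and query optimization called for above can be demonstrated with the stdlib sqlite3 module (MySQL's EXPLAIN output differs in format, but the principle of a full scan turning into an index search is the same; table and index names are made up for the demo):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

def plan(sql: str) -> str:
    # EXPLAIN QUERY PLAN rows are (id, parent, notused, detail).
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT total FROM orders WHERE customer_id = 42"
before = plan(query)   # without an index: full table scan
conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
after = plan(query)    # with the index: direct search

print(before)  # e.g. "SCAN orders"
print(after)   # e.g. "SEARCH orders USING INDEX idx_orders_customer ..."
```

The same reasoning applies in MySQL via `EXPLAIN SELECT ...`: a `type: ALL` scan becoming `type: ref` after adding an index on the filter column.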

Posted 1 month ago

Apply

8.0 - 10.0 years

7 - 17 Lacs

Bengaluru

Work from Office

Employee type: CTH. Primary Skills: Expert in BI tools like MicroStrategy and Tableau with a minimum of 8 years of experience. Secondary Skills: Able to lead a team of BI experts and provide technical solutions to BI tool reporting requirements. Should have over 6+ years of IT experience in analysis, design, development, implementation, reporting, testing and visualization using Business Intelligence tools. Extensive knowledge in creating data visualizations using BI tool desktops and regularly publishing and presenting dashboards. Familiar with SQL concepts: SQL tables, views, indexes, joins, writing stored procedures, query optimization, performance tuning, etc. Able to generate reports using BI tools to analyse data from multiple data sources like Oracle, SQL Server, Excel, Hive and Teradata. Good knowledge of various BI tool functionalities like Tableau Extracts, Parameters, Trends, Hierarchies, Sets, Groups, Data Blending and joins. Able to develop various customized charts. Work extensively with Actions, Calculations, Parameters, Background Images, Maps, Trend Lines, LOD functions and Table Calculations. Able to generate Context filters, Extract filters and Data Source filters while handling huge volumes of data. Hands-on building Groups, Hierarchies and Sets, and creating detail-level summary reports and dashboards using KPIs. Building and publishing customized interactive reports and dashboards, and report scheduling using Tableau Server. Handling user onboarding activities in BI tools and various servers. Able to manage user groups and access-related activities in BI tools and Tableau Server. Ability to handle data security. 2-3+ years of relevant experience in SharePoint with Office 365 / SharePoint Online, and hands-on experience in coding using Angular, HTML, JavaScript, CSS, React JS, SPFx, and RES

Posted 1 month ago

Apply

6.0 - 10.0 years

11 - 14 Lacs

Bengaluru

Work from Office

Notice Period: Immediate joiners or within 15 days. Employee type: Contract. Should have over 6+ years of IT experience in analysis, design, development, implementation, reporting, testing and visualization using Business Intelligence tools. Extensive knowledge in creating data visualizations using BI tool desktops and regularly publishing and presenting dashboards. Familiar with SQL concepts: SQL tables, views, indexes, joins, writing stored procedures, query optimization, performance tuning, etc. Able to generate reports using BI tools to analyse data from multiple data sources like Oracle, SQL Server, Excel, Hive and Teradata. Good knowledge of various BI tool functionalities like Tableau Extracts, Parameters, Trends, Hierarchies, Sets, Groups, Data Blending and joins. Able to develop various customized charts. Work extensively with Actions, Calculations, Parameters, Background Images, Maps, Trend Lines, LOD functions and Table Calculations. Able to generate Context filters, Extract filters and Data Source filters while handling huge volumes of data. Hands-on building Groups, Hierarchies and Sets, and creating detail-level summary reports and dashboards using KPIs. Building and publishing customized interactive reports and dashboards, and report scheduling using Tableau Server. Handling user onboarding activities in BI tools and various servers. Able to manage user groups and access-related activities in BI tools and Tableau Server. Ability to handle data security. Good knowledge of the banking domain. Experience and expertise in developing test harnesses in J2SE. Experience with continuous integration tools (Jenkins / Hudson). Experience / knowledge of version control systems like Git, CVS, Subversion, etc. Experience in writing SQL / PL-SQL. Good knowledge of shell scripting and Unix commands.

Posted 1 month ago

Apply

5.0 - 10.0 years

5 - 8 Lacs

Chennai

Work from Office

We are looking immediately for Snowflake Data Warehouse Engineers (Contract, Chennai). Experience: 5+ years. Location: Chennai. Notice period: Immediate. Type: Contract. Description: We need experienced, collaborative Snowflake Data Warehouse Engineers with 5+ years of experience in developing Snowflake data models, data ingestion, views, stored procedures, and complex queries. Good experience in SQL. Experience in Informatica PowerCenter / IICS ETL tools. Test and clearly document implementations so others can easily understand the requirements, implementation, and test conditions. Provide production support for data warehouse issues such as data load problems and transformation/translation problems. Ability to facilitate and coordinate discussion and to manage expectations of multiple stakeholders. Candidate must have good communication and facilitation skills. Work in an onsite-offshore model involving daily interactions with onshore teams to ensure on-time quality deliverables.

Posted 1 month ago

Apply

3.0 - 5.0 years

1 - 3 Lacs

Chennai

Work from Office

Roles and Responsibilities: Design, develop, test, and deploy complex PL/SQL programs using Oracle SQL. Develop stored procedures, functions, and packages to meet business requirements. Optimize database queries for performance tuning and efficiency. Strong experience in developing applications using Oracle APEX. Preferred: immediate joiner / 15 days notice period.

Posted 1 month ago

Apply

7.0 - 12.0 years

13 - 17 Lacs

Hyderabad

Work from Office

Project description: Luxoft, a DXC Technology Company, is an established company focusing on consulting and implementation of complex projects in the financial industry. At the interface between technology and business, we convince with our know-how, well-founded methodology and pleasure in success. As a reliable partner to our renowned customers, we support them in planning, designing and implementing the desired innovations. Together with the customer, we deliver top performance! For one of our clients in the insurance segment, we are searching for a Senior Mainframe Developer. Responsibilities / Essential Job Functions: Design and development; mainframe batch support. Resource should be comfortable with on-call support and weekend release support. Incident & outage management. Develop code for reporting, defects & enhancements. Hands-on experience with JCL, COBOL, CICS, DB2, VSAM and mainframe utilities is mandatory. Drive the Kanban, daily standup and coordination meetings with respective stakeholders. Analytical, problem-solving, creative thinking and design skills. Must-have skills: Strong mainframe skills with a minimum of 7+ years of experience. Experience in designing, developing, testing, debugging, and maintaining mainframe applications using the COBOL programming language. Utilize VSAM (Virtual Storage Access Method) for efficient data access and management. Interact with databases using DB2, including SQL query optimization and performance tuning. Utilize tools such as File-AID for data manipulation, browsing, and editing. Write and execute SQL queries using SPUFI for data retrieval and manipulation. Collaborate with cross-functional teams to analyze requirements, design solutions, and implement changes. Nice to have: Insurance domain experience. Other languages: English (C1 Advanced). Seniority: Senior.

Posted 1 month ago

Apply

6.0 - 9.0 years

8 - 11 Lacs

Hyderabad

Hybrid

We are seeking a skilled Database Specialist with strong expertise in Time-Series Databases, specifically Loki for logs, InfluxDB, and Splunk for metrics. The ideal candidate will have a solid background in query languages, Grafana, Alert Manager, and Prometheus. This role involves managing and optimizing time-series databases, ensuring efficient data storage, retrieval, and visualization. Key Responsibilities: Design, implement, and maintain time-series databases using Loki, InfluxDB, and Splunk to store and manage high-velocity time-series data. Develop efficient data ingestion pipelines for time-series data from various sources (e.g., IoT devices, application logs, metrics). Optimize database performance for high write and read throughput, ensuring low latency and high availability. Implement and manage retention policies, downsampling, and data compression strategies to optimize storage and query performance. Collaborate with DevOps and infrastructure teams to deploy and scale time-series databases in cloud or on-premise environments. Build and maintain dashboards and visualization tools (e.g., Grafana) for monitoring and analyzing time-series data. Troubleshoot and resolve issues related to data ingestion, storage, and query performance. Work with development teams to integrate time-series databases into applications and services. Ensure data security, backup, and disaster recovery mechanisms are in place for time-series databases. Stay updated with the latest advancements in time-series database technologies and recommend improvements to existing systems. Key Skills: Strong expertise in Time-Series Databases with Loki (for logs), InfluxDB, and Splunk (for metrics).
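The downsampling strategy mentioned above can be illustrated in plain Python (InfluxDB does this natively through retention policies and continuous queries; the bucket-averaging function below is only a conceptual sketch):

```python
from collections import defaultdict

def downsample(points, bucket_seconds):
    """Average raw samples into fixed-width time buckets.

    points: iterable of (unix_ts, value) pairs.
    Returns a sorted list of (bucket_start_ts, mean_value), trading
    per-sample detail for cheaper storage and faster range queries.
    """
    buckets = defaultdict(list)
    for ts, value in points:
        buckets[ts - ts % bucket_seconds].append(value)
    return sorted((start, sum(vs) / len(vs)) for start, vs in buckets.items())

# Four raw samples at 30 s resolution collapse into two 60 s buckets.
raw = [(0, 10.0), (30, 20.0), (60, 40.0), (90, 20.0)]
print(downsample(raw, 60))  # [(0, 15.0), (60, 30.0)]
```

A typical retention policy keeps raw data for days and only the downsampled series for months, which is exactly the storage/query trade-off the role describes.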

Posted 1 month ago

Apply

3.0 - 6.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Resource with 3-6 years of experience in SAP Data migration. (B2/B3 Band preferable). ADM (Syniti Tool) knowledge is preferred, Knowledge on other ETL tools is an advantage. Knowledge of SQL & PLSQL queries is mandatory. Good Knowledge of Data migration concept from the requirement to the end of the load in SAP. Sound knowledge in Data analysis, Profiling, Cleaning, Visualization, transformation, Conversion, and implementation. Good functional understanding - SAP Material Master and Finance. Having sound knowledge of the SAP LSMW Program and ABAP. Should be able to debug ABAP programs. Sound knowledge of Loading Methods in SAP and Interface.

Posted 1 month ago

Apply

6.0 - 10.0 years

7 - 16 Lacs

Bengaluru

Hybrid

We are looking for a talented and experienced PL/SQL Developer to join our team. The ideal candidate should have a strong foundation in Oracle PL/SQL programming, excellent problem-solving skills, and the ability to manage and support applications through their full lifecycle. Key Responsibilities: Design, code, test, and debug Oracle PL/SQL programs including packages, procedures, triggers, and functions. Optimize and tune complex SQL queries and PL/SQL blocks for performance. Perform code reviews, root cause analysis, and issue resolution in production environments. Participate in requirement analysis, technical documentation, and system design. Collaborate with cross-functional teams for development and support. Maintain and enhance existing PL/SQL applications and scripts. Provide L2/L3 application support, managing incidents and changes. Work within an Agile/Scrum framework and contribute to sprint goals. Support deployment, risk mitigation, and project flow coordination. Required Qualifications: Bachelor's/Master's degree in Computer Science, IT, or a relevant discipline. 6–10 years of hands-on experience with PL/SQL and Oracle databases. Strong in SQL tuning, debugging, and exception handling. Exposure to Unix/Linux environments and shell scripting is a plus. Experience in SDLC, Agile methodologies, and version control systems. Good communication, documentation, and client interaction skills.

Posted 1 month ago

Apply

12.0 - 14.0 years

13 - 14 Lacs

Ahmedabad

Work from Office

Key Responsibilities: Client Engagement Team Management Project Planning and Execution Solution Development Technical Leadership Advanced Database and System Knowledge Documentation and Reporting Requirements Proven experience as a Project Manager, with a strong technical background. Hands-on expertise with Shopify Plus, including APIs, customizations, and integrations. Proficiency in no-code/low-code tools, such as: n8n, Make.com, Glide, Bubble, Airtable, Zapier, Retool, and Appgyver. Strong knowledge of programming languages (e.g., JavaScript, Python, or similar) for troubleshooting and development. Advanced database expertise, including: Designing and managing complex databases. Query optimization and handling large-scale datasets. Familiarity with task management tools (e.g., Trello, Asana, Jira) and agile methodologies.

Posted 1 month ago

Apply

5.0 - 10.0 years

25 - 30 Lacs

Bengaluru

Work from Office

Req ID: 324434. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Systems Integration Advisor to join our team in Bengaluru, Karnataka (IN-KA), India (IN). Job Title: API Developer (Java | Spring Boot | REST & GraphQL). About the role: We are looking for a passionate and experienced API Developer to join our backend engineering team. In this role, you will play a critical part in designing and implementing robust APIs and microservices. You will work with a cutting-edge tech stack including Spring Boot, Java 8+ (Java 17 preferred), GraphQL, Docker and Kubernetes. You'll be responsible for building clean, testable and efficient code, contributing to architecture decisions and supporting full lifecycle development from concept to deployment. Required Skills: Java 8+ (Java 17 preferred). Strong backend development experience using Spring Boot, specifically building APIs that integrate with relational databases. Proficient in building and consuming RESTful and GraphQL APIs. Experience with JUnit, SonarQube and test-driven development. Knowledge of Docker and Kubernetes for microservice architecture. Experience with Rancher for Kubernetes and Jenkins for CI/CD is preferred. Hands-on expertise in PostgreSQL/Oracle, including schema design, indexing, query optimization and migration. Familiarity with version control (Git). Strong problem-solving skills and the ability to work collaboratively in Agile teams. Nice to Have: Experience with Apache Kafka. Understanding of API security protocols (OAuth2, JWT, etc.).

Posted 1 month ago

Apply

8.0 - 10.0 years

25 - 30 Lacs

Bengaluru

Work from Office

Req ID: 324436. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Software Development Advisor to join our team in Bengaluru, Karnataka (IN-KA), India (IN). About the role: We are looking for a passionate and experienced API Developer to join our backend engineering team. In this role, you will play a critical part in designing and implementing robust APIs and microservices. You will work with a cutting-edge tech stack including Spring Boot, Java 8+ (Java 17 preferred), GraphQL, Docker and Kubernetes. You'll be responsible for building clean, testable and efficient code, contributing to architecture decisions and supporting full lifecycle development from concept to deployment. Required Skills: Java 8+ (Java 17 preferred). Strong backend development experience using Spring Boot, specifically building APIs that integrate with relational databases. Proficient in building and consuming RESTful and GraphQL APIs. Experience with JUnit, SonarQube and test-driven development. Knowledge of Docker and Kubernetes for microservice architecture. Experience with Rancher for Kubernetes and Jenkins for CI/CD is preferred. Hands-on expertise in PostgreSQL/Oracle, including schema design, indexing, query optimization and migration. Familiarity with version control (Git). Strong problem-solving skills and the ability to work collaboratively in Agile teams. Nice to Have: Experience with Apache Kafka. Understanding of API security protocols (OAuth2, JWT, etc.).

Posted 1 month ago

Apply

7.0 - 9.0 years

16 - 16 Lacs

Hyderabad

Work from Office

Responsibilities: * Design, develop, test & maintain PL/SQL applications using Oracle 11g/12c. * Optimize database performance through query optimization & indexing.

Posted 1 month ago

Apply

7.0 - 11.0 years

12 - 22 Lacs

Bengaluru

Work from Office

Role: Lead SQL Server Developer. Location: Bangalore (Ashok Nagar). Experience: 8-10 years of prior experience, including 3+ years of team lead experience. Education: Bachelor's/Master's Degree in Technology. Salary: Negotiable. Job Type: Full Time (On Role). Mode of Work: Work from Office. Job Description - What you will be doing: Working on Microsoft SQL Server 2012 and above server-side development. Designing schemas that are usable across multiple customers. Designing and developing T-SQL (Transact-SQL) stored procedures, functions and triggers, and SSIS packages. Developing underlying data models and databases. Developing, managing and maintaining the data dictionary and/or metadata. Performance tuning and query optimization. Ensuring compliance with standards and conventions in developing programs. Analysing and resolving complex issues without oversight from other people. Performing quality checks on reports and exports to ensure exceptional quality. Working in a Scrum environment. Preferred Skills: Knowledge of tools like Qlik / Qlik Sense and/or other data visualization tools. Understanding of .NET code/jQuery experience is a plus. Knowledge of the Microsoft reporting tool (SSRS). Experience in database administration activities. Interested candidates kindly share your CV and the below details to usha.sundar@adecco.com: 1) Present CTC (Fixed + VP) 2) Expected CTC 3) No. of years experience 4) Notice Period 5) Offer in hand 6) Reason for change 7) Present location

Posted 1 month ago

Apply

5.0 - 7.0 years

20 - 25 Lacs

Noida

Work from Office

5-7 years of experience in data analytics, business intelligence, or a related field. Proven expertise with Firebase Analytics and GA4, including custom event setup and user journey tracking. Advanced proficiency in BigQuery: SQL scripting, query optimization, partitioning, and clustering. Hands-on experience with Looker or Looker Studio for dashboard development and data modeling. Familiarity with other GCP services such as Cloud Storage, Cloud Functions, Pub/Sub, and Dataflow is a strong plus. Solid understanding of data privacy and governance frameworks (GDPR, CCPA, etc.). Strong analytical thinking and problem-solving abilities with attention to detail. Excellent communication skills and the ability to work effectively in cross-functional teams. Preferred Qualifications: Google Cloud certifications (e.g., Professional Data Engineer, Looker Business Analyst). Experience with A/B testing frameworks and experimentation platforms. Background in product analytics or digital marketing analytics.
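The value of the BigQuery partitioning mentioned above is easiest to see as a bytes-scanned cost model: on-demand queries are billed by bytes processed, and partition pruning limits the scan to the partitions the filter selects. A toy model (the table sizes below are made-up assumptions):

```python
# Toy cost model for a daily-partitioned events table.
DAYS = 365
BYTES_PER_DAY = 2 * 1024**3          # assume ~2 GiB per daily partition

def bytes_scanned(partitioned: bool, days_queried: int) -> int:
    # An unpartitioned table scans every row regardless of the date
    # filter; a partitioned table scans only the matching partitions.
    return BYTES_PER_DAY * (days_queried if partitioned else DAYS)

full = bytes_scanned(False, 7)       # last-7-days query, no partitioning
pruned = bytes_scanned(True, 7)      # same query with partition pruning
print(full // pruned)  # 52 -- about 52x fewer bytes billed
```

Clustering refines this further by sorting rows within each partition so block-level metadata can skip non-matching blocks, though its savings depend on data layout rather than a simple formula like the one above.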

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies