
859 Stored Procedures Jobs - Page 18

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 8.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Naukri logo

Location Bangalore: D365 F&O Functional

Posted 2 weeks ago

Apply

2.0 - 5.0 years

2 - 6 Lacs

Bengaluru

Work from Office


Location Bangalore: .NET Core

Posted 2 weeks ago

Apply

2.0 - 6.0 years

3 - 6 Lacs

Bengaluru

Work from Office


Location Bangalore: MuleSoft

Posted 2 weeks ago

Apply

2.0 - 5.0 years

2 - 6 Lacs

Bengaluru

Work from Office


Location Bangalore: SpecFlow experience is mandatory

Posted 2 weeks ago

Apply

1.0 - 5.0 years

3 - 6 Lacs

Hyderabad, Chennai

Work from Office


Location Chennai/Hyderabad, Pan India. Primary skill set: Python, AWS, Life Science/Bioinformatics. Interview levels: 1 internal round, 2 customer rounds.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

7 - 17 Lacs

Noida, Pune, Bengaluru

Hybrid


Encora is scouting for Oracle PL/SQL Developers for its Noida, Bangalore, and Pune locations.

Job Summary: We are seeking a skilled and motivated Oracle PL/SQL Developer with experience in Oracle Forms to join our development team. The ideal candidate should have 3-5 years of hands-on experience working with Oracle PL/SQL and Oracle Forms in a fast-paced, enterprise environment. You will be responsible for designing, developing, and maintaining Oracle-based applications to support business operations. We are continuously evolving, always aim to use the most cutting-edge technology in our development, and would love to learn from you too.

We are on the hunt for an Oracle PL/SQL Developer with proven experience in the areas below:
- Strong hands-on experience in Oracle PL/SQL (stored procedures, functions, triggers, cursors)
- 3-5 years of experience with Oracle Forms (preferably 10g/11g/12c)
- Good understanding of Oracle database architecture and relational data models
- Experience in debugging, performance tuning, and unit testing of Oracle applications
- Familiarity with Oracle Reports is a plus
- Exposure to version control tools (such as Git or SVN)
- Strong analytical and problem-solving skills
- Ability to work independently and as part of a team

Note: For this role, strong hands-on experience with Oracle Forms is required.
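The stored-procedure skills this role lists (triggers, cursors, row-by-row processing) can be sketched in a few lines. This is a hedged illustration only: Python's built-in sqlite3 stands in for Oracle, the table and trigger names are invented, and real PL/SQL syntax differs.

```python
import sqlite3

# In-memory SQLite stands in for Oracle here; the stored-logic ideas
# (triggers, cursor-style loops) carry over even though syntax differs.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders    (id INTEGER PRIMARY KEY, amount REAL);
CREATE TABLE audit_log (order_id INTEGER, note TEXT);

-- Trigger: log every insert, as an Oracle AFTER INSERT trigger would.
CREATE TRIGGER trg_order_audit AFTER INSERT ON orders
BEGIN
    INSERT INTO audit_log VALUES (NEW.id, 'order created');
END;
""")

conn.executemany("INSERT INTO orders(amount) VALUES (?)", [(10.0,), (25.5,)])

# Row-by-row processing, analogous to a PL/SQL cursor FOR loop.
total = 0.0
for (amount,) in conn.execute("SELECT amount FROM orders"):
    total += amount

audit_rows = conn.execute("SELECT COUNT(*) FROM audit_log").fetchone()[0]
```

The trigger fires once per inserted row, so `audit_log` ends up with one entry per order without any application code.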

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Ahmedabad

Work from Office


Synoptek
We think globally, act locally. As a Managed Services Provider, Synoptek provides world-class strategic IT leadership and hyper-efficient IT operational support, enabling our global client base to grow and transform their businesses. We are excited to have experienced continuous growth and, in keeping with that momentum, are seeking to add a Software Engineer to our talented team. Synoptek is a company with a heart and soul, dedicated to the ongoing success and growth of our employees and the continued business success of the customers we support. Want to see what we're made of? Head to Synoptek.com.

Responsibilities
.NET Lead: .NET Core, Angular, Microservices, Azure. A Software Engineer will be responsible for maintaining and developing web applications using a variety of web technologies such as .NET Core, Angular, Azure, Microservices, MSSQL, IIS, JavaScript, jQuery, and CSS.
- Develop and maintain dynamic, responsive UI components.
- Manage state with Redux, Context API, or other state management tools.
- Integrate RESTful APIs with the React frontend and handle data efficiently.
- Build and maintain .NET MVC APIs, implementing business logic and security (JWT/OAuth).
- Optimize backend performance using caching, async processing, and error handling.
- Design and manage SQL Server databases, writing efficient queries and stored procedures.
- Ensure database security with role-based access control and encryption.
- Debug, test, and optimize both frontend and backend for seamless application performance.
- Contribute to continuous improvement initiatives within the development team.

Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience in software development using .NET Core, Angular, Microservices, and Azure.
- Hands-on experience with Docker for containerization.
- Excellent problem-solving skills and attention to detail.
- Strong communication and interpersonal skills.
- Ability to work effectively in a team environment and manage multiple tasks simultaneously.

Skills/Attributes
Synoptek core DNA behaviors:
- Clarity: Possesses excellent communication skills and makes a concentrated effort to speak the customer's language. Ability to field questions with concise, well-constructed responses.
- OwnIT: Shows integrity, innovation, and accountability in completing daily assignments.
- Results: Solutions-focused and driven to resolve conflict quickly and precisely. Proactively looks for opportunities to contribute to the company's business goals.
- Growth: Willing to learn and ask questions. Constantly looking for new ways to improve. Ability to adapt and grow in a fast-paced environment.
- Team: Embraces both customers and colleagues as team members. Ability to be flexible, respectful, engaged, and collaborative.
- Ability to create, articulate, and position an innovative and compelling value proposition so that customer executives clearly realize the benefits and transformation value of migrating to AWS.
- Ability to define and execute technical migration strategies by working with highly technical teams.
- Ability to work with large-scale network, compute, and storage architectures to help remove roadblocks in customer migration projects.
- Ability to engage with development, infrastructure, security, and IT operations teams at the customer and identify repeatable patterns and architectures for cloud migration.
- Ability to drive the co-development of new go-to-market approaches and opportunities.
- Ability to identify platform improvements, influencing future iterations of the AWS platform.

Working Conditions
We live by the motto "work hard, play hard" and strive to support our employees in both their professional and personal goals. We believe that by hiring the right people, leading process improvement, and leveraging technology, we achieve superior results. Work is performed primarily in an office or remote environment.
May be subject to time constraints and tight deadlines. May require occasional travel.

EEO Statement
We are proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, age, veteran status, sexual orientation, gender identity, marital status, pregnancy, genetic information, or any other characteristic protected by law and will not be discriminated against on the basis of disability. It is our intention that all qualified applicants are given equal opportunity and that employment decisions be based on job-related factors.
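The JWT security requirement above comes down to signing a token so the server can later verify it was not tampered with. A minimal, hedged sketch of the HS256 signing idea using only the standard library follows; the secret and claims are made up, and production code should use a vetted library such as PyJWT rather than this hand-rolled version.

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    # JWTs use unpadded URL-safe base64.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_token(payload: dict, secret: bytes) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    sig = b64url(hmac.new(secret, f"{header}.{body}".encode(),
                          hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def verify_token(token: str, secret: bytes) -> bool:
    header, body, sig = token.split(".")
    expected = b64url(hmac.new(secret, f"{header}.{body}".encode(),
                               hashlib.sha256).digest())
    return hmac.compare_digest(sig, expected)  # constant-time comparison

token = sign_token({"sub": "user-42"}, b"demo-secret")   # invented claim/secret
ok = verify_token(token, b"demo-secret")
bad = verify_token(token, b"wrong-secret")
```

Verification with the wrong secret fails, which is the property that lets a stateless API trust the claims inside the token.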

Posted 2 weeks ago

Apply

5.0 - 8.0 years

6 - 10 Lacs

Pune

Work from Office


Job Location: Working full time from the Strategy Pune office.

We are seeking a highly skilled and experienced Senior ETL Developer to join our dynamic team. This role is crucial in ensuring the integrity, usability, and performance of our data solutions. The ideal candidate will have extensive experience with ETL processes, database design, and Informatica PowerCenter/IICS.

Key Responsibilities:
- ETL Development and Maintenance: Engage with stakeholders to understand business objectives and design effective ETL processes aligned with organizational goals. Maintain existing ETL processes, ensuring data accuracy and adequate process performance.
- Data Warehouse Design & Development: Develop and maintain essential database objects, including tables, views, and stored procedures, to support data analysis and reporting functions. Proficiently use SQL queries to retrieve and manipulate data as required.
- Data Quality and Analysis: Analyze datasets to identify gaps, inconsistencies, and other quality issues, and devise strategic solutions to enhance data quality. Implement data quality improvement strategies to ensure the accuracy and reliability of data.
- Performance Optimization: Investigate and resolve database and query performance issues to ensure optimal system functionality. Continuously monitor system performance and make recommendations for improvements.
- Business Collaboration: Collaborate with business users to gather comprehensive data and reporting requirements. Facilitate user-acceptance testing in conjunction with the business, resolving any issues that arise.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Minimum of 5 years of hands-on experience with Informatica PowerCenter and Informatica Intelligent Cloud Services (IICS).
- Proven expertise in designing, implementing, and managing ETL processes and data warehouses.
- Proficiency with SQL and experience in optimizing queries for performance.
- Strong analytical skills with the ability to diagnose data issues and recommend comprehensive solutions.
- Excellent communication and interpersonal skills to effectively collaborate with cross-functional teams.
- Detail-oriented with strong problem-solving capabilities.
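The data quality and analysis duties described above amount to profiling incoming rows for gaps and inconsistencies before load. A toy, hedged Python sketch (invented data and rule names, not any Informatica API) of such a profiling pass:

```python
# Toy data-quality profiling pass: flag blank/null fields and duplicate
# keys before rows are loaded into a warehouse. Data and rule names are
# invented for illustration.
rows = [
    {"id": 1, "email": "a@x.com"},
    {"id": 2, "email": ""},          # blank value -> quality issue
    {"id": 1, "email": "c@x.com"},   # duplicate key -> quality issue
]

def profile(rows, key="id"):
    issues, seen = [], set()
    for i, row in enumerate(rows):
        if any(v in (None, "") for v in row.values()):
            issues.append((i, "null_or_blank"))
        if row[key] in seen:
            issues.append((i, "duplicate_key"))
        seen.add(row[key])
    return issues

issues = profile(rows)
```

Real pipelines would route flagged rows to a reject table or quarantine zone rather than silently dropping them.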

Posted 2 weeks ago

Apply

3.0 - 8.0 years

8 - 12 Lacs

Pune

Work from Office


Key Result Areas and Activities:
- Design, develop, and deploy ETL/ELT solutions on premises or in the cloud.
- Transform data with stored procedures.
- Develop reports (MicroStrategy/Power BI).
- Create and maintain comprehensive documentation for data pipelines, configurations, and processes.
- Ensure data quality and integrity through effective data management practices.
- Monitor and optimize data pipeline performance.
- Troubleshoot and resolve data-related issues.

Technical Experience:
Must Have:
- Good experience in Azure Synapse.
- Good experience in ADF.
- Good experience in Snowflake and stored procedures.
- Experience with ETL/ELT processes, data warehousing, and data modelling.
- Experience with data quality frameworks, monitoring tools, and job scheduling.
- Knowledge of data formats such as JSON, XML, CSV, and Parquet.
- Fluent English (strong written, verbal, and presentation skills).
- Agile methodology and tools like JIRA.
- Good communication and formal skills.

Good to Have:
- Good experience in MicroStrategy and Power BI.
- Experience in scripting languages such as Python, Java, or shell scripting.
- Familiarity with Azure cloud platforms and cloud data services.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 3+ years of experience in Azure Synapse.

Qualities:
- Experience with or knowledge of Agile software development methodologies.
- Can influence and implement change; demonstrates confidence, strength of conviction, and sound decisions.
- Believes in dealing with a problem head-on; approaches it in a logical and systematic manner; is persistent and patient; can independently tackle the problem; is not over-critical of the factors that led to the problem and is practical about it; follows up with developers on related issues.
- Able to consult, write, and present persuasively.
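As a small illustration of the data-format knowledge the listing asks for, here is a standard-library-only round trip between CSV and JSON; the records are invented, and Parquet support would need a third-party library such as pyarrow, so it is omitted here.

```python
import csv
import io
import json

# Invented sample records, round-tripped between CSV and JSON.
records = [{"sku": "A1", "qty": "3"}, {"sku": "B2", "qty": "7"}]

# Write the records out as CSV text.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["sku", "qty"])
writer.writeheader()
writer.writerows(records)
csv_text = buf.getvalue()

# Read the CSV back and serialize the same rows as JSON.
parsed = list(csv.DictReader(io.StringIO(csv_text)))
json_text = json.dumps(parsed)
```

Note that CSV carries no type information: every field comes back as a string, which is one reason typed formats like Parquet are preferred for warehouse loads.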

Posted 2 weeks ago

Apply

4.0 - 7.0 years

22 - 25 Lacs

Mumbai

Work from Office


Description: Position at WebMD

About the Company: All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or veteran status. For company details, visit our websites: www.webmd.com / www.internetbrands.com

Education: B.E. in Computer Science/IT or a related engineering discipline
Experience: 4-7 years
Shift Timings: 2:00 PM to 11:00 PM

About the Role
We are looking for a skilled Senior Software Engineer with a strong background in Python and generative AI. You will play a key role in building agentic AI systems, designing prompt-driven workflows, integrating tools with LLMs, and mentoring the team on the latest advancements in AI technologies.

Roles and Responsibilities
Design & Development
- Architect and build AI-powered tools using LLMs (e.g., OpenAI, Claude, Mistral).
- Integrate and customize AI agents using frameworks like LangChain, CrewAI, or custom solutions.
- Collaborate with product and research teams to translate requirements into prompt workflows and agent behaviors.
- Design and implement scalable backends using Python and FastAPI/Django/Flask.
- Build and maintain vector search capabilities using tools like FAISS, LanceDB, and Pinecone.

Prompt Engineering & Agent Systems
- Design advanced, modular prompt chains that incorporate memory, dynamic context injection, and multi-step reasoning across agent workflows.
- Experiment with and optimize agentic architectures for real-world tasks (retrieval, generation, summarization, etc.).

Front-End Integration
- Hands-on experience with at least one modern front-end framework such as React or Vue.js for building responsive and interactive user interfaces.

Testing & DevOps
- Write unit and integration tests to ensure reliability.
- Participate in code reviews and CI/CD pipelines, and deploy models/services using best practices.

Mentorship & Collaboration
- Upskill peers in GenAI, prompt engineering, and best practices.
- Contribute to knowledge sharing and architectural reviews across teams.
- Stay updated with evolving AI ecosystems, tools, and research.

Requirements
Core Technical Skills
- Strong Python skills (OOP, security, performance).
- Experience with frameworks like FastAPI, Flask, or Django.
- Proficiency with LLMs and OpenAI/Anthropic APIs, LangChain, Hugging Face Transformers, or similar.

GenAI & Agentic AI
- 3+ years of experience working with GenAI and LLMs.
- 2+ years of experience in prompt engineering and AI agent architectures.
- Experience with vector databases (FAISS, Pinecone, LanceDB, Weaviate).
- Familiarity with agent frameworks like CrewAI, Autogen, Cursor, or Windsurf.
- Strong understanding of the Cursor Rule (intent, context, action) and real-world experience applying it in agent workflows.
- Solid understanding of how to construct and pass dynamic context to LLMs, including techniques like retrieval-augmented generation (RAG), session memory, and prompt templating.
- Experience fine-tuning or customizing LLMs is a plus.

Frontend
- Hands-on experience with at least one modern front-end framework such as React or Vue.js for building responsive and interactive user interfaces.

RDBMS
- Experience with relational database management systems such as SQL Server, MySQL, or PostgreSQL, and proficiency in writing SQL queries and stored procedures.

Testing & DevOps
- Experience with Git, CI/CD workflows, and test automation.
- Familiarity with DevOps tools, monitoring, and logging in production environments.

Soft Skills
- Excellent communication and collaboration skills.
- Self-driven and eager to mentor and share knowledge.
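The RAG and prompt-templating techniques named in the requirements reduce to: embed the documents, retrieve the nearest one to the query, and inject it into a prompt template. A toy, hedged sketch with hand-made vectors standing in for a real embedding model and vector store (FAISS, Pinecone, etc.):

```python
import math

# Toy retrieval-augmented generation: hand-made 3-dimensional "embeddings"
# stand in for a real embedding model; a dict stands in for a vector store.
docs = {
    "refund policy": [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.8, 0.3],
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Pretend-embedding of the user's question (close to "refund policy").
query_vec = [0.85, 0.15, 0.05]

# Retrieve the document most similar to the query...
best = max(docs, key=lambda name: cosine(docs[name], query_vec))

# ...and inject it into the prompt template sent to the LLM.
prompt = f"Answer using this context:\n{best}\n\nQuestion: How do refunds work?"
```

The production version swaps the dict for an approximate-nearest-neighbor index and the hand-made vectors for model embeddings, but the retrieve-then-template flow is the same.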

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Pune

Work from Office


Design, develop, and maintain data pipelines and ETL/ELT processes using PySpark, Databricks, BigQuery, Airflow, and Composer. Optimize performance for large datasets through techniques such as partitioning, indexing, and Spark optimization. Collaborate with cross-functional teams to resolve technical issues and gather requirements.

Your Key Responsibilities
- Ensure data quality and integrity through data validation and cleansing processes.
- Analyze existing SQL queries, functions, and stored procedures for performance improvements.
- Develop database routines such as procedures, functions, and views/materialized views.
- Participate in data migration projects; understand technologies like Delta Lake, data warehouses, and BigQuery.
- Debug and solve complex problems in data pipelines and processes.

Your skills and experience that will help you excel
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Strong understanding of distributed data processing platforms like Databricks and BigQuery.
- Proficiency in Python, PySpark, and SQL.
- Experience with performance optimization for large datasets.
- Strong debugging and problem-solving skills.
- Fundamental knowledge of cloud services, preferably Azure or GCP.
- Excellent communication and teamwork skills.

Nice to Have:
- Experience in data migration projects.
- Understanding of technologies like Delta Lake and data warehouses.

About MSCI
What we offer you:
- Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing.
- Flexible working arrangements, advanced technology, and collaborative workspaces.
- A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results.
- A global network of talented colleagues who inspire, support, and share their expertise to innovate and deliver for our clients.
- A Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro, and tailored learning opportunities for ongoing skills development.
- Multi-directional career paths that offer professional growth and development through new challenges, internal mobility, and expanded roles.
- An environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum.
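Partitioning, one of the optimization techniques the role names, means co-locating rows by key so each partition can be processed independently. A toy hash-partitioning sketch (plain Python, not Spark itself, with invented sample rows) illustrates the idea:

```python
from collections import defaultdict

# Toy illustration of hash partitioning, the idea behind Spark/BigQuery
# partition tuning: rows with the same key always land in the same
# partition, so per-key work never crosses partition boundaries.
rows = [("IN", 5), ("US", 3), ("IN", 7), ("DE", 1)]  # invented sample data
num_partitions = 2

partitions = defaultdict(list)
for key, value in rows:
    partitions[hash(key) % num_partitions].append((key, value))

# Check: every row sharing a key maps to exactly one partition id.
per_key_partitions = {
    key: {hash(k) % num_partitions for k, _ in rows if k == key}
    for key, _ in rows
}
```

Engines like Spark apply the same principle with a configurable partitioner, which is why skewed keys (one key with most of the rows) create stragglers.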

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Hyderabad

Work from Office


We are looking for a strong Data Engineer with data analysis/profiling skills for our Enterprise Data Organization to develop and manage data pipelines (data ingestion, transformation, storage, etc.) for an Azure/Snowflake cloud-based data analytics platform. The candidate will possess strong technical, analytical, programming, and critical thinking skills, and should have good experience with data transformation, data modeling, master data management, and metadata management. Excellent communication skills are key to this role, as the candidate will be working closely with senior leadership and the product team and may oversee a team of engineers; leadership skills will be a plus.

Essential Functions
- Apply data analysis and profiling skills to develop and manage data pipelines (data ingestion, transformation, storage, etc.) for an Azure/Snowflake cloud-based data analytics platform.

Qualifications
- Advanced SQL: queries, scripts, stored procedures, materialized views, and views.
- Focus on ELT: load data into the database and perform transformations in-database.
- Ability to use analytical SQL functions.
- Snowflake experience.
- Cloud data warehouse experience (Snowflake, Azure DW, or Redshift): data modeling, analysis, programming.
- Experience with DevOps models utilizing a CI/CD tool.
- Hands-on work in the Azure cloud platform (ADLS, Blob).
- Airflow would be a plus.
- Good interpersonal skills; comfort and competence in dealing with different teams within the organization; ability to interface with multiple constituent groups and build sustainable relationships.
- Strong and effective communication skills (verbal and written).
- Strong analytical and problem-solving skills.
- Experience working in a matrix organization.
- Ability to prioritize and deliver; results-oriented, flexible, adaptable.
- Works well independently and can lead a team.
- Versatile, creative temperament; ability to think out of the box while defining sound and practical solutions; ability to master new skills.
- Familiar with Agile practices and methodologies.
- Professional data engineering experience focused on batch and real-time data pipelines using Spark, Python, and SQL.
- Data warehouse experience (data modeling, programming).
- Experience working with Snowflake.
- Experience working in a cloud environment, preferably Microsoft Azure.

We Offer
- Opportunity to work on bleeding-edge projects.
- Work with a highly motivated and dedicated team.
- Competitive salary and flexible schedule.
- Benefits package: medical insurance, sports.
- Corporate social events and professional development opportunities.
- Well-equipped office.
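The ELT pattern this role emphasizes means loading raw data first and transforming it inside the database with analytical SQL functions. A hedged sketch with SQLite standing in for Snowflake (table names invented); the window function below is an example of the "analytical SQL functions" the listing mentions:

```python
import sqlite3

# ELT sketch: load raw rows first, then transform *inside* the database.
# SQLite stands in for Snowflake; the SQL window-function pattern is the
# same idea at toy scale.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_sales (region TEXT, amount REAL)")

# Extract + Load: raw data lands untransformed.
conn.executemany("INSERT INTO raw_sales VALUES (?, ?)",
                 [("east", 10), ("east", 30), ("west", 20)])

# Transform in-database: running total per region via an analytical
# (window) function, instead of pulling rows out and looping in Python.
rows = conn.execute("""
    SELECT region, amount,
           SUM(amount) OVER (PARTITION BY region ORDER BY amount) AS running
    FROM raw_sales
    ORDER BY region, amount
""").fetchall()
```

Pushing the transformation into the engine is the point of ELT: the warehouse's optimizer and parallelism do the heavy lifting rather than application code.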

Posted 2 weeks ago

Apply

5.0 - 7.0 years

9 - 13 Lacs

Pune

Work from Office


Job Description
Job Location: Working full time from the Strategy Pune office.

We are seeking a highly skilled and experienced Senior ETL Developer to join our dynamic team. This role is crucial in ensuring the integrity, usability, and performance of our data solutions. The ideal candidate will have extensive experience with ETL processes, database design, and Informatica PowerCenter/IICS.

Key Responsibilities:
- ETL Development and Maintenance: Engage with stakeholders to understand business objectives and design effective ETL processes aligned with organizational goals. Maintain existing ETL processes, ensuring data accuracy and adequate process performance.
- Data Warehouse Design & Development: Develop and maintain essential database objects, including tables, views, and stored procedures, to support data analysis and reporting functions. Proficiently use SQL queries to retrieve and manipulate data as required.
- Data Quality and Analysis: Analyze datasets to identify gaps, inconsistencies, and other quality issues, and devise strategic solutions to enhance data quality. Implement data quality improvement strategies to ensure the accuracy and reliability of data.
- Performance Optimization: Investigate and resolve database and query performance issues to ensure optimal system functionality. Continuously monitor system performance and make recommendations for improvements.
- Business Collaboration: Collaborate with business users to gather comprehensive data and reporting requirements. Facilitate user-acceptance testing in conjunction with the business, resolving any issues that arise.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Minimum of 5 years of hands-on experience with Informatica PowerCenter and Informatica Intelligent Cloud Services (IICS).
- Proven expertise in designing, implementing, and managing ETL processes and data warehouses.
- Proficiency with SQL and experience in optimizing queries for performance.
- Strong analytical skills with the ability to diagnose data issues and recommend comprehensive solutions.
- Excellent communication and interpersonal skills to effectively collaborate with cross-functional teams.
- Detail-oriented with strong problem-solving capabilities.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Kochi

Work from Office


About the Role
We are seeking an experienced Senior MSSQL Developer with 3+ years of professional database development experience to lead our database initiatives. In this role, you will be responsible for designing, implementing, and optimizing enterprise-grade database solutions that support our Angular frontend and Spring Boot backend applications. You will serve as the subject matter expert for all database-related activities and mentor junior team members while establishing best practices for database development.

Key Performance Indicators
- Architect and implement 2-3 complex database solutions per quarter with 98% requirements fulfillment.
- Improve overall database performance by 35% within the first year.
- Reduce database-related production incidents by 40% year-over-year.
- Achieve 99.9% database availability through proper design and maintenance.
- Implement data governance standards resulting in 100% compliance with data regulations.
- Lead at least 5 knowledge transfer sessions per quarter on database best practices.

Responsibilities
- Design and architect sophisticated database solutions for enterprise applications.
- Lead database modeling efforts, ensuring scalability, performance, and data integrity.
- Develop and maintain complex stored procedures, functions, triggers, and views.
- Implement advanced performance tuning techniques for high-volume, mission-critical databases.
- Design and develop database security architecture, including encryption, access control, and audit trails.
- Create and maintain comprehensive database documentation, including data dictionaries and ERDs.
- Establish database coding standards, patterns, and best practices for the development team.
- Develop and implement database migration strategies with minimal downtime.
- Collaborate with application developers to optimize database integration with the Spring Boot backend.
- Mentor junior database developers and provide technical guidance.
- Analyze and troubleshoot complex database performance issues.
- Design and implement database monitoring solutions and preventative maintenance procedures.
- Evaluate and recommend database technologies and tools to improve development efficiency.

Qualifications
- Bachelor's degree in Computer Science, Information Systems, or a related field (or equivalent experience).
- 3+ years of professional MSSQL development experience with progressively increasing responsibilities.
- Proven track record of designing and implementing at least 5 enterprise-grade database solutions.
- Experience optimizing database systems processing 1M+ transactions daily.
- Demonstrated ability to reduce query execution time by 50%+ for complex operations.
- Experience leading database projects and mentoring junior developers.
- Strong understanding of database security principles and implementation.
- Excellent analytical and problem-solving skills.
- Effective communication skills for both technical and non-technical stakeholders.

Technical Skills
Required:
- Advanced Microsoft SQL Server expertise (2016 or later).
- Expert-level T-SQL programming, including complex stored procedures, CTEs, and dynamic SQL.
- Proficiency with database performance tuning, query optimization, and execution plan analysis.
- Experience with database high-availability solutions (AlwaysOn, clustering, mirroring).
- Advanced indexing strategies and statistics management.
- Database security implementation, including row-level security and data masking.
- Experience with SSIS, SSRS, and SSAS.
- Proficiency with database version control and deployment methodologies.

Good to Have:
- Working knowledge of Spring Boot and Java for backend integration.
- Familiarity with Angular or modern front-end frameworks.
- Experience with Hibernate or other ORM frameworks.
- Understanding of microservices architecture.
- Experience with cloud database solutions (Azure SQL).
- Knowledge of NoSQL database technologies.

Measurable Achievements Expected in the First Year
- Design and implement a comprehensive database monitoring solution, reducing incident response time by 60%.
- Architect at least 3 major database modules supporting new business initiatives.
- Implement query optimization strategies resulting in 40% overall performance improvement.
- Establish automated testing procedures for database changes with 95% code coverage.
- Develop and document company-wide database standards and best practices.
- Create a mentoring program for junior database developers.
- Implement database security enhancements achieving 100% compliance with industry regulations.
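One of the T-SQL skills called out above, common table expressions, can be illustrated portably. The recursive CTE below walks an invented org hierarchy; SQLite (via Python) stands in for SQL Server here, and the `WITH RECURSIVE` form maps to T-SQL's `WITH` syntax.

```python
import sqlite3

# Recursive CTE sketch: SQLite stands in for SQL Server (T-SQL writes the
# same query as WITH ... instead of WITH RECURSIVE ...). Toy org chart.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (id INTEGER, manager_id INTEGER, name TEXT)")
conn.executemany("INSERT INTO employees VALUES (?, ?, ?)",
                 [(1, None, "ceo"), (2, 1, "lead"), (3, 2, "dev")])

chain = conn.execute("""
    WITH RECURSIVE reports(id, name, depth) AS (
        -- Anchor member: the root of the hierarchy (no manager).
        SELECT id, name, 0 FROM employees WHERE manager_id IS NULL
        UNION ALL
        -- Recursive member: each employee joined to their manager's row.
        SELECT e.id, e.name, r.depth + 1
        FROM employees e JOIN reports r ON e.manager_id = r.id
    )
    SELECT name, depth FROM reports ORDER BY depth
""").fetchall()
```

The anchor/recursive-member structure is the same pattern used for bill-of-materials and org-chart queries in production T-SQL.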

Posted 2 weeks ago

Apply

5.0 - 12.0 years

11 - 15 Lacs

Thiruvananthapuram

Work from Office

Naukri logo

Full Stack Developer (Senior and Lead Level) Job Code: MACIN11215 Role: Full Stack Developer (Senior & Lead Level) Type of Commute: Remote Skill Set: .NET Core, RESTful API, CI/CD Pipelines, AWS Desired Industry Experience: 5-12 years .NET Core: Strong experience for backend development, especially with ASP.NET Web API. Angular (Latest): Strong expertise in the latest version of Angular for building dynamic web applications. SQL Server: In-depth knowledge of SQL Server including database design, querying, stored procedures, and performance optimization. ASP.NET Web API: Experience with ASP.NET Web API for building APIs to support frontend Angular applications. Hands-on experience with microservices would be an added advantage. TypeScript: Understanding of TypeScript for frontend development in Angular. HTML/CSS: Knowledgeable in integrating HTML and CSS for building responsive and visually appealing web applications. Version Control Systems: Experience with version control systems like Git for collaborative development. Ability to Lead: Guide a group of developers, conduct code reviews, and enforce best practices.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Hyderabad, Pune, Ahmedabad

Work from Office

Naukri logo

Get in touch with us to see what we can do for your company. Software Engineer, Backend (Java) Ahmedabad, Pune, Hyderabad About DataOrb DataOrb is revolutionizing how organizations understand and utilize their customer data. We enable businesses of all sizes from ambitious startups to Fortune 500 companies to unlock insights from their customer interactions across conversational, transactional, and structured datasets. Founded by veterans from Google, Amazon, Microsoft, and Samsung, we're driven by a shared mission to democratize customer intelligence and make AI accessible to everyone. The Opportunity We are seeking a highly skilled, experienced Java developer to join our expanding Engineering team. In this role, you will help develop and design technology solutions that are scalable, relevant, and critical to our company's success. You will focus on Java/Java EE development throughout all phases of the development lifecycle and must have a solid skill set, a desire to continue to grow as a developer, and a team-player mentality. 
Core Responsibilities Participate in the design and implementation of essential applications Demonstrate expertise and add valuable input throughout the development lifecycle Help design and implement scalable, lasting technology solutions Review current systems, suggesting updates as needed Gather requirements from internal and external stakeholders Test and debug new applications and updates Resolve reported issues and reply to queries in a timely manner Develop and utilize technical change documentation Strive to deploy all products and updates on time Help improve code quality by implementing recommended best practices Remain up to date on all current best practices, trends, and industry developments Maintain a high standard of work quality and encourage others to do the same Help junior team members grow and develop their skills Identify potential challenges and bottlenecks in order to address them proactively Required Qualifications Minimum 5 years of hands-on experience in backend development, building and maintaining large-scale, high-performance systems, ideally in enterprise or SaaS environments. Expertise in Java 17 (or latest LTS version), including strong understanding of object-oriented principles, functional programming features, and concurrency. Java SE 21 certification (OCP Java 21 Developer) is highly desirable. Deep hands-on experience with the Spring ecosystem: Spring Boot 3.x (latest), including advanced configurations, profiles, and actuator Spring Security 6.x, including OAuth2, JWT, and RBAC implementations Spring Data JPA 3.x with Hibernate ORM for data persistence Experience with microservices patterns like service discovery, API gateway, and configuration management. Strong problem-solving and debugging skills able to troubleshoot complex issues across microservices, logs, and distributed systems. 
Proven experience designing and developing microservices architectures using modern design patterns (e.g., API-first, domain-driven design, event-driven architecture). Cloud-native development experience, preferably with AWS (Lambda, S3, RDS, ECS) or Azure, including CI/CD pipeline setup, deployment, and monitoring. Proficiency in Git (latest version), including branching strategies, merge conflict resolution, and code review best practices (GitHub, GitLab, or Bitbucket). Solid understanding and hands-on experience with build tools: Maven 4.x (or 3.x LTS) Gradle 8.x (for modern projects) Advanced SQL skills (PostgreSQL, MySQL), including query optimization, stored procedures, and schema design. Basic proficiency in HTML5, CSS, and REST API design principles (OpenAPI 3.x, Swagger). Clear understanding of MVC architecture and RESTful service principles; hands-on experience developing and consuming REST APIs. Experience writing unit and integration tests using: JUnit 5 (Jupiter) Mockito 5.x Familiarity with Agile development practices (Scrum, Kanban) and Jira for task tracking and sprint planning. Bonus: Knowledge of containerization with Docker and orchestration with Kubernetes (K8s). Bonus: Exposure to GraphQL, gRPC, or Reactive Programming (Spring WebFlux, Reactor). 
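The JWT work named above (Spring Security with OAuth2/JWT) comes down to signing and verifying claims. A minimal HS256 sketch using only the standard library; in Spring Security the framework does this for you, and the secret below is a placeholder, not a real configuration value.

```python
import base64
import hashlib
import hmac
import json

SECRET = b"demo-secret"  # placeholder; real deployments load this from config

def b64url(data: bytes) -> str:
    """Base64url without padding, as JWT requires."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign(claims: dict) -> str:
    """Produce a header.payload.signature token over the given claims."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps(claims).encode())
    sig = hmac.new(SECRET, f"{header}.{payload}".encode(), hashlib.sha256).digest()
    return f"{header}.{payload}.{b64url(sig)}"

def verify(token: str) -> bool:
    """Recompute the signature and compare in constant time."""
    header, payload, sig = token.split(".")
    expected = hmac.new(SECRET, f"{header}.{payload}".encode(), hashlib.sha256).digest()
    return hmac.compare_digest(b64url(expected), sig)

token = sign({"sub": "user-1", "role": "admin"})
print(verify(token))  # True
```

A tampered payload or signature fails verification, which is the whole point of the RBAC claims a gateway or filter then enforces.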
Desired Experience Background in working on SaaS products Experience with AI/ML products Java Backend Engineer experience Educational Requirements Bachelor's or Master's degree in one of the following fields: Bachelor of Computer Science Bachelor of Engineering (Information Technology) Master of Computer Science Master of Engineering (Information Technology) OR equivalent professional experience as a Backend Engineer (typically 4+ additional years of hands-on experience beyond the base requirement) Technical Toolkit Java Spring Boot Microservices Architecture AWS Why Join DataOrb Mission: Be part of democratizing customer intelligence and making AI accessible Impact: Shape how organizations understand and serve their customers Team: Work with experienced leaders from top tech companies Growth: Rapid scaling environment with significant learning opportunities Culture: Autonomous, trust-based environment focused on outcomes Benefits: Flexible work arrangements Comprehensive health coverage Generous PTO policy Professional development support Competitive compensation package Our Values Customer Obsession: We practice what we preach Democratizing Technology: Making complex solutions accessible Innovation with Purpose: Solving real customer problems Trust and Autonomy: Freedom to create and deliver excellence

Posted 2 weeks ago

Apply

10.0 - 15.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Naukri logo

We are seeking a highly experienced and versatile Lead Data Engineer to join our engineering team. The ideal candidate will be proficient in the full software development lifecycle, with strong expertise in building microservices, Python, PySpark, and AWS cloud services. This role involves developing scalable enterprise applications within a Microservices Architecture, integrating databases, and ensuring high code quality through rigorous testing and engineering best practices. Key Responsibilities: Contribute across the entire technology stack: frontend, backend, and database layers. Design and develop enterprise-grade services using Python, PySpark, and AWS. Implement microservices-based architecture to build scalable and maintainable solutions. Apply software engineering principles to enhance the reliability, scalability, and maintainability of the codebase. Incorporate automated testing and ensure robust code coverage as a part of daily development. Collaborate closely with cross-functional teams in an Agile development environment. Design and implement ETL/ELT pipelines for automated data extraction, transformation, and loading Develop parameterized queries, job configurations, and data ingestion logic for diverse datasets Build and maintain data orchestration workflows using tools like Apache Airflow Partner with the Full Stack team to integrate backend APIs with ETL jobs and job status monitoring Support integration and load testing, and optimize job execution across environments Work on metadata ingestion, job audit logging, and traceability for compliance Participate in production deployment, job monitoring, and troubleshooting post go-live Document data models, pipelines, and configurations; support training and knowledge transfer Must-Have Skills: 10+ years of experience in data engineering and ETL development in complex enterprise environments Strong object-oriented programming knowledge Proficient in Python 3.6+. 
Strong debugging and performance tuning skills across the stack. Strong experience in Python and Django for backend API development Hands-on experience with AWS services (EC2, S3, Lambda, etc.) Experience integrating and managing SQL databases. Hands-on experience with Apache Airflow for workflow orchestration. Hands-on experience with Apache Spark for big data processing and analytics. Hands-on experience in building ETL pipelines with tools like Python, SQL, or similar Exposure to containerization and orchestration tools (e.g., Docker, Kubernetes). CI/CD pipeline experience with tools like Jenkins, GitLab CI/CD, or AWS CodePipeline. Expertise in SQL Server: complex queries, stored procedures, performance tuning Familiarity with RESTful API integration for triggering and monitoring jobs Proficiency with job parameterization, scheduling, and configuration-driven pipelines Exposure to data validation, quality checks, and error handling mechanisms Experience with Git-based version control, CI/CD, and deployment processes Knowledge of metadata management and data cataloging concepts Familiarity with unit tests (preferably with pytest) Proven track record of working in fast-paced, agile teams. Our Perks and Benefits: Our benefits and rewards program has been thoughtfully designed to recognize your skills and contributions, elevate your learning/upskilling experience and provide care and support for you and your loved ones. As an Apexon Associate, you get continuous skill-based development, opportunities for career advancement, and access to comprehensive health and well-being benefits and assistance. We also offer: o Group Health Insurance covering family of 4 o Term Insurance and Accident Insurance o Paid Holidays & Earned Leaves o Paid Parental Leave o Learning & Career Development o Employee Wellness Job Location : Bengaluru, India
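The "parameterized, configuration-driven" incremental pipelines described above usually hinge on a watermark: pull only rows newer than the last successful load. A minimal sketch under assumed names (the `events` table, its columns, and the hard-coded watermark are all hypothetical; a real job would read the watermark from a metadata store):

```python
import sqlite3

# Hypothetical source table standing in for an upstream system.
src = sqlite3.connect(":memory:")
src.executescript("""
CREATE TABLE events (id INTEGER PRIMARY KEY, payload TEXT, updated_at TEXT);
INSERT INTO events (payload, updated_at) VALUES
  ('a', '2024-01-01'), ('b', '2024-02-01'), ('c', '2024-03-01');
""")

def incremental_extract(conn, watermark):
    """Return rows newer than the watermark, plus the new watermark."""
    cur = conn.execute(
        "SELECT id, payload, updated_at FROM events "
        "WHERE updated_at > ? ORDER BY id",
        (watermark,),  # parameterized query, never string-formatted SQL
    )
    rows = cur.fetchall()
    new_watermark = max((r[2] for r in rows), default=watermark)
    return rows, new_watermark

rows, wm = incremental_extract(src, "2024-01-15")
print(len(rows), wm)  # 2 2024-03-01
```

An orchestrator such as Airflow would persist `wm` after a successful run and pass it back in on the next schedule, which is what makes the load incremental rather than full.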

Posted 2 weeks ago

Apply


2.0 - 3.0 years

2 - 3 Lacs

Ahmedabad

Work from Office

Naukri logo

We are looking for a skilled and motivated .NET Core Developer with 2 years of professional experience. The ideal candidate will have strong backend development skills and a solid understanding of building scalable, efficient web applications using .NET Core. Key Responsibilities: Develop, test, and maintain web applications using .NET Core and C# Design and consume RESTful APIs for seamless data integration Work with SQL Server and write complex queries and stored procedures Collaborate with UI/UX designers and front-end developers to ensure smooth integration Participate in code reviews, debugging, and performance optimization Write clean, maintainable code and follow best practices in software development Contribute to unit testing and continuous improvement of the codebase Maintain proper documentation of development work and processes Required Skills & Qualifications: Bachelor's degree in Computer Science, Engineering, or related field 2 years of hands-on experience in .NET Core and C# Strong understanding of Object-Oriented Programming (OOP), SOLID principles, and design patterns Proficient in SQL Server and database design Experience with Entity Framework Core or ADO.NET Familiarity with front-end basics: HTML, CSS, JavaScript, jQuery Experience with version control systems like Git or SVN Basic knowledge of RESTful API integration Nice to Have: Experience with front-end frameworks such as Angular or React Knowledge of cloud platforms like Azure or AWS Familiarity with CI/CD pipelines and DevOps tools

Posted 2 weeks ago

Apply

2.0 - 4.0 years

9 - 13 Lacs

Pune

Work from Office

Naukri logo

We are seeking a Data Engineer with advanced expertise in Databricks SQL, PySpark, Spark SQL, and workflow orchestration using Airflow. The successful candidate will lead critical projects, including migrating SQL Server Stored Procedures to Databricks Notebooks, designing incremental data pipelines, and orchestrating workflows in Azure Databricks. What will your job look like? Migrate SQL Server Stored Procedures to Databricks Notebooks, leveraging PySpark and Spark SQL for complex transformations. Design, build, and maintain incremental data load pipelines to handle dynamic updates from various sources, ensuring scalability and efficiency. Develop robust data ingestion pipelines to load data into the Databricks Bronze layer from relational databases, APIs, and file systems. Implement incremental data transformation workflows to update silver and gold layer datasets in near real-time, adhering to Delta Lake best practices. Integrate Airflow with Databricks to orchestrate end-to-end workflows, including dependency management, error handling, and scheduling. Understand business and technical requirements, translating them into scalable Databricks solutions. Optimize Spark jobs and queries for performance, scalability, and cost-efficiency in a distributed environment. Implement robust data quality checks, monitoring solutions, and governance frameworks within Databricks. Collaborate with team members on Databricks best practices, reusable solutions, and incremental loading strategies All you need is a Bachelor's degree in Computer Science, Information Systems, or a related discipline. 4+ years of hands-on experience with Databricks, including expertise in Databricks SQL, PySpark, and Spark SQL. Proven experience in incremental data loading techniques into Databricks, leveraging Delta Lake's features (e.g., time travel, MERGE INTO). Strong understanding of data warehousing concepts, including data partitioning and indexing for efficient querying. 
Proficiency in T-SQL and experience in migrating SQL Server Stored Procedures to Databricks. Solid knowledge of Azure Cloud Services, particularly Azure Databricks and Azure Data Lake Storage. Expertise in Airflow integration for workflow orchestration, including designing and managing DAGs. Familiarity with version control systems (e.g., Git) and CI/CD pipelines for data engineering workflows. Excellent analytical and problem-solving skills with a focus on detail-oriented development. Preferred Qualifications Advanced knowledge of Delta Lake optimizations, such as compaction, Z-ordering, and vacuuming. Experience with real-time streaming data pipelines using tools like Kafka or Azure Event Hubs. Familiarity with advanced Airflow features, such as SLA monitoring and external task dependencies. Certifications such as Databricks Certified Associate Developer for Apache Spark or equivalent. Experience in Agile development methodologies Why you will love this job: You will be able to use your specific insights to lead business change on a large scale and drive transformation within our organization. You will be a key member of a global, dynamic and highly collaborative team with various possibilities for personal and professional development. You will have the opportunity to work in a multinational environment for the global market leader in its field! We offer a wide range of stellar benefits including health, dental, vision, and life insurance as well as paid time off, sick time, and parental leave
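Delta Lake's MERGE INTO, named in the posting above, upserts changed rows into a target table: update on key match, insert otherwise. A sketch of the same semantics using SQLite's ON CONFLICT clause rather than Databricks itself; the `silver` table, keys, and values are hypothetical.

```python
import sqlite3

# Hypothetical silver-layer table with two existing rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE silver (key TEXT PRIMARY KEY, value INTEGER)")
conn.execute("INSERT INTO silver VALUES ('a', 1), ('b', 2)")

# Incoming batch: one changed row ('b') and one new row ('c').
updates = [("b", 20), ("c", 3)]
conn.executemany(
    """INSERT INTO silver (key, value) VALUES (?, ?)
       ON CONFLICT(key) DO UPDATE SET value = excluded.value""",
    updates,
)

print(sorted(conn.execute("SELECT * FROM silver").fetchall()))
```

In Databricks the equivalent is `MERGE INTO silver USING updates ON silver.key = updates.key WHEN MATCHED THEN UPDATE ... WHEN NOT MATCHED THEN INSERT ...`, applied to Delta tables instead of a local database.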

Posted 2 weeks ago

Apply

4.0 - 6.0 years

3 - 6 Lacs

Gurugram

Work from Office

Naukri logo

Job Information Job Opening ID ZR_2331_JOB Date Opened 27/07/2024 Industry Other Job Type Work Experience 4-6 years Job Title DotNet Backend Developer City Gurgaon Province Haryana Country India Postal Code 122001 Number of Positions 1 Proficiency in backend programming languages and frameworks such as C#, ASP.NET Core, MVC Understanding of HTTP methods (GET, POST, PUT, DELETE) and status codes. SQL Proficiency: Advanced SQL query writing Joins, subqueries, and Common Table Expressions (CTEs) Aggregate functions and window functions Data manipulation (DML) and data definition (DDL) statements Stored procedures, functions, triggers Database Design & Management: Schema design and normalization Indexing strategies for performance optimization Database partitioning Data integrity and consistency Database System: Experience with major relational databases such as PostgreSQL, Oracle Experience: 4 to 8 years
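The indexing strategies this posting lists can be checked empirically: most engines will show whether a query uses an index in its plan. A small sketch with SQLite's EXPLAIN QUERY PLAN (the `users` table is hypothetical; SQL Server's equivalent is the graphical or XML execution plan):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")

# Without an index: the plan is a full table scan.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM users WHERE email = ?", ("x",)
).fetchall()

# Add an index on the filtered column and re-check the plan.
conn.execute("CREATE INDEX idx_users_email ON users(email)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM users WHERE email = ?", ("x",)
).fetchall()

print(plan_before[-1][-1])  # a full table scan
print(plan_after[-1][-1])   # an index search
```

Comparing plans before and after adding an index is the basic loop behind the "indexing strategies for performance optimization" requirement.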

Posted 2 weeks ago

Apply

6.0 - 10.0 years

2 - 5 Lacs

Chennai

Work from Office

Naukri logo

Job Information Job Opening ID ZR_2198_JOB Date Opened 15/04/2024 Industry Technology Job Type Work Experience 6-10 years Job Title PLSQL Developer City Chennai Province Tamil Nadu Country India Postal Code 600004 Number of Positions 4 Proven experience as a PL/SQL Developer or similar role with at least 6 years of hands-on experience. Proficiency in writing complex SQL queries, PL/SQL stored procedures, functions, triggers, and packages. Experience in performance tuning and optimization techniques for Oracle databases. Strong understanding of database concepts and principles. Familiarity with version control systems such as Git. Excellent problem-solving and analytical skills. Ability to work independently and collaboratively in a team environment. Effective communication skills, both verbal and written. Certification in Oracle PL/SQL or related technologies is a plus.
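The trigger skills this posting asks for follow one pattern regardless of engine: a statement on one table fires a side effect, such as an audit-trail row. A sketch in SQLite (PL/SQL trigger syntax differs, but the AFTER UPDATE / OLD / NEW concepts carry over; table names here are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL);
CREATE TABLE audit_log (account_id INTEGER, old_balance REAL, new_balance REAL);

-- Record every balance change automatically, with no application code.
CREATE TRIGGER trg_balance_audit AFTER UPDATE OF balance ON accounts
BEGIN
  INSERT INTO audit_log VALUES (OLD.id, OLD.balance, NEW.balance);
END;

INSERT INTO accounts VALUES (1, 100.0);
UPDATE accounts SET balance = 150.0 WHERE id = 1;
""")

print(conn.execute("SELECT * FROM audit_log").fetchall())  # [(1, 100.0, 150.0)]
```

In Oracle the same trigger would use `CREATE OR REPLACE TRIGGER ... AFTER UPDATE OF balance ON accounts FOR EACH ROW` with `:OLD` and `:NEW` bind prefixes.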

Posted 2 weeks ago

Apply

7.0 - 9.0 years

4 - 8 Lacs

Chennai

Work from Office

Naukri logo

Job Information Job Opening ID ZR_1664_JOB Date Opened 19/12/2022 Industry Technology Job Type Work Experience 7-9 years Job Title Sr. Informatica IDQ Developer City Chennai Province Tamil Nadu Country India Postal Code 600001 Number of Positions 4 Informatica Data Quality (IDQ) Developer experience mandatory (version 9.6.1 or above, at least 2-5 years) Experience in developing IDQ projects using both the IDQ Developer and IDQ Analyst interfaces Good understanding of Data Warehousing concepts Experience with RDBMS like SQL Server, Oracle (SQL scripting, Stored Procedures), Unix scripting Proficient in understanding business requirements and producing technical requirements specifications Hands-on experience in Data Profiling, Scorecards and Data Standardization

Posted 2 weeks ago

Apply

5.0 - 8.0 years

2 - 5 Lacs

Kochi

Work from Office

Naukri logo

Job Information Job Opening ID ZR_2115_JOB Date Opened 05/03/2024 Industry Technology Job Type Work Experience 5-8 years Job Title SQL Developer City Kochi Province Kerala Country India Postal Code 682001 Number of Positions 4 Skillsets: SQL or PLSQL or ETL or DWH or PowerBI. Data warehouse experience is not mandatory, but the candidate should be really strong in SQL and PostgreSQL. F2F interview at Kochi; only Kerala candidates.

Posted 2 weeks ago

Apply

5.0 - 8.0 years

2 - 5 Lacs

Bengaluru

Work from Office

Naukri logo

Job Information Job Opening ID ZR_2059_JOB Date Opened 01/11/2023 Industry Technology Job Type Work Experience 5-8 years Job Title SQL Developer City Bangalore North Province Karnataka Country India Postal Code 560002 Number of Positions 3 Expertise in all aspects of SQL development: analysis to understand the business requirement, an optimized approach to developing code, and ensuring data quality in the outputs presented Advanced SQL to create and optimize stored procedures, CTEs, and functions, and to tune their performance Analytical approach to translating data into last-mile SQL objects for consumption in reports and dashboards 5+ years of experience in MS SQL 3+ years of experience in teams where SQL outputs were consumed via Power BI / Tableau / SSRS and similar tools Good communication skills to discuss and deliver requirements effectively with the client Good to have: some prior experience or high-level understanding of hedge funds, private debt, and private equity

Posted 2 weeks ago

Apply
cta

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies