
9663 PostgreSQL Jobs - Page 12

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

2.0 - 3.0 years

0 Lacs

Kolkata, West Bengal, India

On-site


Position Description: We are looking for a PHP Developer with at least 2-3 years of experience and in-depth knowledge of web technology. He/she must be able to work with our team and build out-of-the-box web applications according to our clients' requirements.

Skill Sets Required: JavaScript, HTML, CSS, MySQL, Oracle, MongoDB, PostgreSQL, jQuery, AngularJS, AJAX, XML, JSON, PHP frameworks (such as CakePHP, Laravel), CMS (such as Drupal, Joomla), e-commerce platforms (such as WooCommerce, Magento)

Looking For:
- Must be able to design and build modern web-based applications
- Must have knowledge of working with outside databases and APIs
- Has developed at least 5 web applications
- Experience with third-party libraries or payment gateway integration
- Must have experience handling the complete life cycle of web application development
- Good written and verbal communication skills
- Excellent analytical/troubleshooting and problem-solving skills
- Experience with e-commerce websites, CMS, shopping carts, frameworks, and object-oriented PHP
- Good knowledge of an HTML/PHP editor such as Dreamweaver
- Should be familiar with version control systems such as SVN and Git
- Willingness to work on LAMP (Linux, Apache, MySQL, PHP) and upcoming internet technologies
- Excellent algorithms and data-structure skills

Posted 1 day ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site


We are hiring Python Developers.
Experience: 5+ years
Location: Mumbai, Chennai, Pune, Hyderabad, Bangalore
Primary Skills: Python, Flask, Django

Job Description:
- Proficiency in Python programming and object-oriented design principles
- Extensive experience with the Django framework and building RESTful APIs
- Experience in distributed programming using Celery
- Some experience with Redis
- Proficiency in working with PostgreSQL databases
- Hands-on experience with AWS cloud services
- In-depth knowledge of the software development life cycle and Agile methodologies
- Proficiency with version control systems such as Git, and good communication skills
- Application of software engineering best practices, including SOLID principles, design patterns, and code refactoring techniques

"LTIMindtree is proud to be an equal opportunity employer. We are committed to equal employment opportunity regardless of race, ethnicity, nationality, gender, gender identity, gender expression, language, age, sexual orientation, religion, marital status, veteran status, socio-economic status, disability or any other characteristic protected by applicable law."

Posted 1 day ago

Apply

5.0 years

0 Lacs

Prayagraj, Uttar Pradesh, India

On-site


💼 Job Title: Full Stack Software Engineer (ReactJS / NodeJS / PostgreSQL)
📍 Location: On-site – Prayagraj
🧑‍💻 Experience: 3–5 years (3 years minimum)

About the Role
We are seeking a passionate Full Stack Developer to join our product-focused team in Prayagraj. You'll play a key role in building and scaling high-performance, modern web applications using ReactJS, NodeJS (ExpressJS), and PostgreSQL. If you're someone who thrives in a fast-paced environment and enjoys building end-to-end solutions, this is the role for you.

Key Responsibilities
- Collaborate with leads and stakeholders to understand requirements and translate them into technical designs and features.
- Guide junior developers and interns in their daily tasks and development practices.
- Build, test, and deploy scalable web applications using ReactJS, NodeJS, and PostgreSQL.
- Develop and maintain efficient CI/CD pipelines (preferably with GitHub Actions).
- Optimize application performance and scalability.
- Write clean, maintainable, and well-documented code.
- Debug and resolve issues across the full stack.
- Integrate RESTful APIs and third-party services.
- Participate actively in code reviews, agile ceremonies, and team discussions.

Required Skills & Qualifications
- 3+ years of full-stack development experience.
- Proficient in ReactJS (including hooks, lifecycle methods, Context API/Redux).
- Strong experience with NodeJS / ExpressJS.
- Solid knowledge of PostgreSQL: complex queries, performance tuning, schema design.
- Experience designing and consuming RESTful APIs.
- Proficient with Git and familiar with version control workflows.
- Hands-on experience building CI/CD pipelines, preferably with GitHub Actions.
- Familiarity with Postman and, optionally, Docker.
- Strong experience working with Linux environments, including server optimization.
- Self-driven and team-oriented, with excellent communication skills.
- Exposure to AWS services is a plus.

This role offers an opportunity to contribute meaningfully to a growing product team and help shape the future of our platform.

Posted 1 day ago

Apply

0 years

0 Lacs

India

Remote


This is a remote position. We are seeking a Software Engineer (Backend) to join our team.

Responsibilities:
- Design and implement scalable servers in Rust: build high-performance servers deployed on AWS that interface with the Solana network and Orca's web application.
- Develop data processing pipelines: create systems for processing blockchain data, focusing on reliability and latency.
- Advance application features: leverage these servers to develop key application features such as real-time monitoring, automated transaction execution, and token balance analytics.
- Manage database interactions: handle data storage and management using PostgreSQL, ensuring data integrity and efficient retrieval.

Requirements
- Proven track record: you've worked on performant services at scale, ideally in finance or adjacent industries. You have a strong ability to analyze complex software systems, reason about them with clarity, and solve problems efficiently.
- Educational background: Bachelor's or Master's degree in Computer Science or a related field from a recognized university.
- Balanced mindset: comfortable working independently in an async environment, but you also enjoy coming together to collaborate as a team. Passionate about certain technologies and patterns, but you recognize their trade-offs in different scenarios.
- Technical skills: experience with Rust is a plus but not required. Strong interest and proficiency in building high-performance backend systems.
- AI forwardness: we're looking for flexible thinkers who are open to leveraging AI tools to enhance developer productivity and workflows. You're curious about the evolving AI landscape and willing to adapt development practices as the tools mature.
- Understanding of DeFi: you have a good understanding of how AMMs work and of the Solana ecosystem. You are a user of DeFi apps on Solana or other chains.

Benefits
Work Location: Remote
5 days working

Posted 1 day ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


- Design, develop, and maintain scalable web applications using Java (Spring Boot, Hibernate, etc.)
- Build responsive and dynamic user interfaces using HTML, CSS, JavaScript, and frameworks like Angular or React
- Collaborate with cross-functional teams to define, design, and ship new features
- Write clean, maintainable, and efficient code
- Participate in code reviews, testing, and debugging
- Integrate RESTful APIs and third-party services
- Ensure application performance, quality, and responsiveness
- Stay up to date with emerging technologies and industry trends

Required Skills & Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field
- 8+ years of experience in Java development (Spring Boot, JPA, etc.)
- Proficiency in front-end technologies: HTML5, CSS3, JavaScript, and frameworks like Angular, React, or Vue.js
- Experience with RESTful APIs and microservices architecture
- Familiarity with databases such as MySQL, PostgreSQL, or MongoDB
- Knowledge of version control systems (e.g., Git)
- Strong problem-solving and communication skills

Preferred Qualifications
- Experience with DevOps tools (Docker, Jenkins, Kubernetes)
- Familiarity with cloud platforms (AWS, Azure, GCP)
- Understanding of Agile/Scrum methodologies
- Experience with unit testing and test-driven development (TDD)

Posted 1 day ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


- Proven experience as a Full Stack Developer, with a focus on Angular and React for the front end.
- Strong knowledge of HTML, CSS, JavaScript, TypeScript, and modern JavaScript frameworks.
- Proficiency in back-end technologies such as Node.js or .NET.
- Experience with RESTful API development and integration.
- Familiarity with SQL (MySQL, PostgreSQL) and/or NoSQL (MongoDB) databases.
- Experience with version control systems such as Git.
- Knowledge of cloud platforms (e.g., AWS, Azure) and containerization (e.g., Docker) is a plus.
- Strong understanding of software development principles and Agile methodologies.
- Excellent problem-solving skills and ability to work in a collaborative team environment.
- Strong communication skills, with the ability to articulate technical concepts to non-technical stakeholders.

Posted 1 day ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Microsoft is a company where passionate innovators come to collaborate, envision what can be, and take their careers further. This is a world of more possibilities, more innovation, more openness, and sky-is-the-limit thinking in a cloud-enabled world. Microsoft's Azure Data engineering team is leading the transformation of analytics in the world of data with products like databases, data integration, big data analytics, messaging & real-time analytics, and business intelligence. The products in our portfolio include Microsoft Fabric, Azure SQL DB, Azure Cosmos DB, Azure PostgreSQL, Azure Data Factory, Azure Synapse Analytics, Azure Service Bus, Azure Event Grid, and Power BI. Our mission is to build the data platform for the age of AI, powering a new class of data-first applications and driving a data culture.

Within Azure Data, the Microsoft Fabric platform team builds and maintains the operating system and provides customers a unified data stack to run an entire data estate. The platform provides a unified experience and unified governance, and enables a unified business model and a unified architecture. The Fabric Data Analytics, Insights, and Curation team is leading the way in understanding the Microsoft Fabric composite services and empowering our strategic business leaders. We work with very large and fast-arriving data and transform it into trustworthy insights. We build and manage the pipelines, transformations, platforms, models, and much more that empower the Fabric product. As an Engineer on our team, your core function will be Data Engineering, with opportunities in Analytics, Science, Software Engineering, DevOps, and Cloud Systems. You will be working alongside other Engineers, Scientists, Product, Architecture, and Visionaries bringing forth the next generation of data democratization products.

We do not just value differences or different perspectives. We seek them out and invite them in so we can tap into the collective power of everyone in the company. As a result, our customers are better served.

Responsibilities
- You will develop and maintain data pipelines, including solutions for data collection, management, transformation, and usage, ensuring accurate data ingestion and readiness for downstream analysis, visualization, and AI model training.
- You will review, design, and implement end-to-end software life cycles, encompassing design, development, CI/CD, service reliability, recoverability, and participation in agile development practices, including on-call rotation.
- You will review and write code to implement performance-monitoring protocols across data pipelines, building visualizations and aggregations to monitor pipeline health. You'll also implement solutions and self-healing processes that minimize points of failure across multiple product features.
- You will anticipate data governance needs, designing data modeling and handling procedures to ensure compliance with all applicable laws and policies.
- You will plan, implement, and enforce security and access control measures to protect sensitive resources and data.
- You will perform database administration tasks, including maintenance and performance monitoring.
- You will collaborate with Product Managers, Data and Applied Scientists, Software and Quality Engineers, and other stakeholders to understand data requirements and deliver phased solutions that meet test and quality programs' data needs and support AI model training and inference.
- You will become an SME on our team's products and provide input for the strategic vision.
- You will champion process, engineering, architecture, and product best practices in the team.
- You will work with other team Seniors and Principals to establish best practices in our organization.
- Embody our culture and values.

Qualifications
Required/Minimum Qualifications
- Bachelor's Degree in Computer Science, Math, Software Engineering, Computer Engineering, or a related field AND 4+ years' experience in business analytics, data science, software development, data modeling, or data engineering work; OR Master's Degree in Computer Science, Math, Software Engineering, Computer Engineering, or a related field AND 2+ years' experience in business analytics, data science, software development, or data engineering work; OR equivalent experience.
- 2+ years of experience in software or data engineering, with proven proficiency in C#, Java, or equivalent.
- 2+ years in one scripting language for data retrieval and manipulation (e.g., SQL or KQL).
- 2+ years of experience with ETL and data cloud computing technologies, including Azure Data Lake, Azure Data Factory, Azure Synapse, Azure Logic Apps, Azure Functions, Azure Data Explorer, and Power BI, or equivalent platforms.

Preferred/Additional Qualifications
- 1+ years of demonstrated experience implementing data governance practices, including data access, security, and privacy controls and monitoring to comply with regulatory standards.

Other Requirements
Ability to meet Microsoft, customer, and/or government security screening requirements is required for this role. These requirements include, but are not limited to, the following specialized security screening: Microsoft Cloud Background Check. This position will be required to pass the Microsoft Cloud background check upon hire/transfer and every two years thereafter.

Equal Opportunity Employer (EOP)
#azdat #azuredata #fabricdata #dataintegration #azure #synapse #databases #analytics #science

Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.

Posted 1 day ago

Apply

5.0 years

0 Lacs

India

Remote


We're hiring a Java Backend Developer on behalf of our client, a fast-growing technology company building scalable, mission-critical systems. This is a remote, part-time opportunity ideal for professionals experienced in building high-performance applications.

The selected candidate will:
- Design and develop high-volume, low-latency backend services.
- Work on mission-critical systems with a focus on reliability and scalability.
- Contribute to the entire development lifecycle in an agile setup.

Key Requirements
- Bachelor's degree in Computer Science or a related field.
- 2–5 years of hands-on Java development experience.
- Strong knowledge of Java EE and object-oriented design patterns.
- Database experience with MySQL, PostgreSQL, or Oracle.
- Familiarity with Git or other version control tools.
- Strong problem-solving and analytical skills.

Responsibilities
- Write efficient, maintainable, and testable code.
- Ensure compliance with design specifications.
- Prepare documentation and technical releases.
- Continuously explore improvements and modern technologies.

Job Details
⏰ Shift: 8 AM to 1 PM CST
📅 Schedule: Monday to Friday
🧑‍💻 Experience: 2–5 Years
💵 Salary: Based on skills & experience
🌐 Location: Remote (Work from Home)

Posted 1 day ago

Apply

6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Microsoft is a company where passionate innovators come to collaborate, envision what can be, and take their careers further. This is a world of more possibilities, more innovation, more openness, and sky-is-the-limit thinking in a cloud-enabled world. Microsoft's Azure Data engineering team is leading the transformation of analytics in the world of data with products like databases, data integration, big data analytics, messaging & real-time analytics, and business intelligence. The products in our portfolio include Microsoft Fabric, Azure SQL DB, Azure Cosmos DB, Azure PostgreSQL, Azure Data Factory, Azure Synapse Analytics, Azure Service Bus, Azure Event Grid, and Power BI. Our mission is to build the data platform for the age of AI, powering a new class of data-first applications and driving a data culture.

Within Azure Data, the data integration team builds data gravity on the Microsoft Cloud. Massive volumes of data are generated, not just from transactional systems of record, but also from the world around us. Our data integration products, Azure Data Factory and Power Query, make it easy for customers to bring in, clean, shape, and join data to extract intelligence.

We're the team that developed the Mashup Engine (M) and Power Query. We already ship monthly to millions of users across Excel, Power/Pro BI, Flow, and PowerApps; but in many ways we're just getting started. We're building new services, experiences, and engine capabilities that will broaden the reach of our technologies to several new areas: data "intelligence", large-scale data analytics, and automated data integration workflows. We plan to use example-based interaction, machine learning, and innovative visualization to make data access and transformation even more intuitive for non-technical users.

We do not just value differences or different perspectives. We seek them out and invite them in so we can tap into the collective power of everyone in the company. As a result, our customers are better served.

Responsibilities
- Engine layer: designing and implementing components for dataflow orchestration, distributed querying, query translation, connecting to external data sources, and script parsing/interpretation
- Service layer: designing and implementing infrastructure for a containerized, microservices-based, high-throughput architecture
- UI layer: designing and implementing performant, engaging web user interfaces for data visualization/exploration/transformation/connectivity and dataflow management
- Embody our culture and values

Qualifications
Required/Minimum Qualifications
- Bachelor's Degree in Computer Science or a related technical discipline AND 6+ years of technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python; OR equivalent experience.
- Experience in data integration, migrations, ELT, or ETL tooling is mandatory.

Other Requirements
Ability to meet Microsoft, customer, and/or government security screening requirements is required for this role. These requirements include, but are not limited to, the following specialized security screening: Microsoft Cloud Background Check. This position will be required to pass the Microsoft Cloud background check upon hire/transfer and every two years thereafter.

Preferred/Additional Qualifications
- BS degree in Computer Science
- Engine role: familiarity with data access technologies (e.g., ODBC, JDBC, OLEDB, ADO.Net, OData), query languages (e.g., T-SQL, Spark SQL, Hive, MDX, DAX), query generation/optimization, OLAP
- UI role: familiarity with JavaScript, TypeScript, CSS, React, Redux, webpack
- Service role: familiarity with microservice architectures, Docker, Service Fabric, Azure blobs/tables/databases, high-throughput services
- Full-stack role: a mix of the qualifications for the UX/service/backend roles

Equal Opportunity Employer (EOP)
#azdat #azuredata #microsoftfabric #dataintegration

Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.

Posted 1 day ago

Apply

4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Microsoft is a company where passionate innovators come to collaborate, envision what can be, and take their careers further. This is a world of more possibilities, more innovation, more openness, and sky-is-the-limit thinking in a cloud-enabled world. Microsoft's Azure Data engineering team is leading the transformation of analytics in the world of data with products like databases, data integration, big data analytics, messaging & real-time analytics, and business intelligence. The products in our portfolio include Microsoft Fabric, Azure SQL DB, Azure Cosmos DB, Azure PostgreSQL, Azure Data Factory, Azure Synapse Analytics, Azure Service Bus, Azure Event Grid, and Power BI. Our mission is to build the data platform for the age of AI, powering a new class of data-first applications and driving a data culture.

Within Azure Data, the Microsoft Fabric platform team builds and maintains the operating system and provides customers a unified data stack to run an entire data estate. The platform provides a unified experience and unified governance, and enables a unified business model and a unified architecture. The Fabric Data Analytics, Insights, and Curation team is leading the way in understanding the Microsoft Fabric composite services and empowering our strategic business leaders. We work with very large and fast-arriving data and transform it into trustworthy insights. We build and manage the pipelines, transformations, platforms, models, and much more that empower the Fabric product. As an Engineer on our team, your core function will be Data Engineering, with opportunities in Analytics, Science, Software Engineering, DevOps, and Cloud Systems. You will be working alongside other Engineers, Scientists, Product, Architecture, and Visionaries bringing forth the next generation of data democratization products.

We do not just value differences or different perspectives. We seek them out and invite them in so we can tap into the collective power of everyone in the company. As a result, our customers are better served.

Responsibilities
- You will develop and maintain data pipelines, including solutions for data collection, management, transformation, and usage, ensuring accurate data ingestion and readiness for downstream analysis, visualization, and AI model training.
- You will review, design, and implement end-to-end software life cycles, encompassing design, development, CI/CD, service reliability, recoverability, and participation in agile development practices, including on-call rotation.
- You will review and write code to implement performance-monitoring protocols across data pipelines, building visualizations and aggregations to monitor pipeline health. You'll also implement solutions and self-healing processes that minimize points of failure across multiple product features.
- You will anticipate data governance needs, designing data modeling and handling procedures to ensure compliance with all applicable laws and policies.
- You will plan, implement, and enforce security and access control measures to protect sensitive resources and data.
- You will perform database administration tasks, including maintenance and performance monitoring.
- You will collaborate with Product Managers, Data and Applied Scientists, Software and Quality Engineers, and other stakeholders to understand data requirements and deliver phased solutions that meet test and quality programs' data needs and support AI model training and inference.
- You will become an SME on our team's products and provide input for the strategic vision.
- You will champion process, engineering, architecture, and product best practices in the team.
- You will work with other team Seniors and Principals to establish best practices in our organization.
- Embody our culture and values.

Qualifications
Required/Minimum Qualifications
- Bachelor's Degree in Computer Science, Math, Software Engineering, Computer Engineering, or a related field AND 4+ years' experience in business analytics, data science, software development, data modeling, or data engineering work; OR Master's Degree in Computer Science, Math, Software Engineering, Computer Engineering, or a related field AND 2+ years' experience in business analytics, data science, software development, or data engineering work; OR equivalent experience.
- 2+ years of experience in software or data engineering, with proven proficiency in C#, Java, or equivalent.
- 2+ years in one scripting language for data retrieval and manipulation (e.g., SQL or KQL).
- 2+ years of experience with ETL and data cloud computing technologies, including Azure Data Lake, Azure Data Factory, Azure Synapse, Azure Logic Apps, Azure Functions, Azure Data Explorer, and Power BI, or equivalent platforms.

Preferred/Additional Qualifications
- 1+ years of demonstrated experience implementing data governance practices, including data access, security, and privacy controls and monitoring to comply with regulatory standards.

Other Requirements
Ability to meet Microsoft, customer, and/or government security screening requirements is required for this role. These requirements include, but are not limited to, the following specialized security screening: Microsoft Cloud Background Check. This position will be required to pass the Microsoft Cloud background check upon hire/transfer and every two years thereafter.

Equal Opportunity Employer (EOP)
#azdat #azuredata #fabricdata #dataintegration #azure #synapse #databases #analytics #science

Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.

Posted 1 day ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Microsoft is a company where passionate innovators come to collaborate, envision what can be and take their careers further. This is a world of more possibilities, more innovation, more openness, and the sky is the limit thinking in a cloud-enabled world. Microsoft’s Azure Data engineering team is leading the transformation of analytics in the world of data with products like databases, data integration, big data analytics, messaging & real-time analytics, and business intelligence. The products our portfolio include Microsoft Fabric, Azure SQL DB, Azure Cosmos DB, Azure PostgreSQL, Azure Data Factory, Azure Synapse Analytics, Azure Service Bus, Azure Event Grid, and Power BI. Our mission is to build the data platform for the age of AI, powering a new class of data-first applications and driving a data culture. Within Azure Data, the data integration team builds data gravity on the Microsoft Cloud. Massive volumes of data are generated – not just from transactional systems of record, but also from the world around us. Our data integration products – Azure Data Factory and Power Query make it easy for customers to bring in, clean, shape, and join data, to extract intelligence. Microsoft’s Azure Data engineering team is leading the transformation of analytics in the world of data with products like databases, data integration, big data analytics, messaging & real-time analytics, and business intelligence. The products our portfolio include Microsoft Fabric, Azure SQL DB, Azure Cosmos DB, Azure PostgreSQL, Azure Data Factory, Azure Synapse Analytics, Azure Service Bus, Azure Event Grid, and Power BI. Our mission is to build the data platform for the age of AI, powering a new class of data-first applications and driving a data culture. We’re the team that developed the Mashup Engine (M) and Power Query. We already ship monthly to millions of users across Excel, Power/Pro BI, Flow, and PowerApps; but in many ways we’re just getting started. 
We’re building new services, experiences, and engine capabilities that will broaden the reach of our technologies to several new areas – data “intelligence”, large-scale data analytics, and automated data integration workflows. We plan to use example-based interaction, machine learning, and innovative visualization to make data access and transformation even more intuitive for non-technical users. We do not just value differences or different perspectives. We seek them out and invite them in so we can tap into the collective power of everyone in the company. As a result, our customers are better served. Responsibilities Engine layer: designing and implementing components for dataflow orchestration, distributed querying, query translation, connecting to external data sources, and script parsing/interpretation Service layer: designing and implementing infrastructure for a containerized, micro services based, high throughput architecture UI layer: designing and implementing performant, engaging web user interfaces for data visualization/exploration/transformation/connectivity and dataflow management Embody our culture and values Qualifications Required/Minimum Qualifications Bachelor's Degree in Computer Science, or related technical discipline AND 6+ years technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python ○ OR equivalent experience. Experience in data integration or migrations or ELT or ETL tooling is mandatory Other Requirements Ability to meet Microsoft, customer and/or government security screening requirements are required for this role. These requirements include, but are not limited to the following specialized security screenings: Microsoft Cloud Background Check: This position will be required to pass the Microsoft Cloud background check upon hire/transfer and every two years thereafter. 
Preferred/Additional Qualifications BS degree in Computer Science Engine role: familiarity with data access technologies (e.g. ODBC, JDBC, OLEDB, ADO.Net, OData), query languages (e.g. T-SQL, Spark SQL, Hive, MDX, DAX), query generation/optimization, OLAP UI role: familiarity with JavaScript, TypeScript, CSS, React, Redux, webpack Service role: familiarity with micro-service architectures, Docker, Service Fabric, Azure blobs/tables/databases, high throughput services Full-stack role: a mix of the qualifications for the UX/service/backend roles Equal Opportunity Employer (EOP) #azdat #azuredata #microsoftfabric #dataintegration Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
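The engine-layer work above involves script parsing/interpretation. As a rough, dependency-free sketch of what that entails, here is a tiny recursive-descent evaluator for an invented formula grammar (numbers, names, +, *, parentheses) – far simpler than the real Mashup (M) language, and not Power Query's actual implementation:

```python
import re

# Toy recursive-descent evaluator for a tiny, invented formula grammar.
TOKEN = re.compile(r"\s*(?:(\d+\.?\d*)|([A-Za-z_]\w*)|(.))")

def tokenize(src):
    tokens = []
    for num, name, op in TOKEN.findall(src):
        if num:
            tokens.append(("num", float(num)))
        elif name:
            tokens.append(("name", name))
        elif op.strip():
            tokens.append(("op", op))
    return tokens

def evaluate(src, env=None):
    env, tokens, pos = env or {}, tokenize(src), 0

    def peek():
        return tokens[pos] if pos < len(tokens) else ("eof", None)

    def next_tok():
        nonlocal pos
        tok = peek()
        pos += 1
        return tok

    def atom():
        kind, value = next_tok()
        if kind == "num":
            return value
        if kind == "name":
            return env[value]            # unbound names raise KeyError
        if kind == "op" and value == "(":
            result = expr()
            next_tok()                   # consume ")" (no error checking: toy)
            return result
        raise SyntaxError(f"unexpected token {value!r}")

    def term():                          # '*' binds tighter than '+'
        result = atom()
        while peek() == ("op", "*"):
            next_tok()
            result *= atom()
        return result

    def expr():
        result = term()
        while peek() == ("op", "+"):
            next_tok()
            result += term()
        return result

    return expr()
```

Here evaluate("2 + 3 * 4") yields 14.0 and names resolve against a supplied environment; a real query engine separates parsing from evaluation and adds type checking, error recovery, and query translation on top.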

Posted 1 day ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Overview: TekWissen is a global workforce management provider throughout India and many other countries in the world. The below client is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place – one that benefits lives, communities and the planet Job Title: Senior Test Specialist Location: Chennai Duration: 12 Months Work Type: Onsite Position Description: Roles and Responsibilities: Ability to design and execute test methods and automation scripts for requirements verification purposes, at multiple integration / solution levels. Leads incident triage, performs root cause analysis and contributes to Preventive Actions design. Uses Test Management tools for test planning review, execution, monitoring review (proficient in key metrics interpretation) and reporting of Test Results. Uses defect management tools for defect creation and monitoring of resolution. Fully understands and applies the client's Standards and tools for pipeline development. Follows code artisanship best practices, quality and security standards, and contributes to their improvement. Understands and applies industry Software Quality Assurance standards (ASPICE, ISTQB, ASAM, INCOSE, etc.) Often provides professional advice on technical or procedural issues. Creates initial reports/analysis for review by team members; provides feedback on draft reports/analysis for further improvement and may deliver final report. Demonstrates professional communication style in team, cross-team and partner settings, and effectively identifies and adapts communications for different audiences; provides feedback to newer team members on communication style. Communicates effectively with peer-to-peer interactions and with 2 levels above (+2 rule). Demonstrates strong relationships within team through knowledge of problem domain.
Starts to identify patterns of interactions with others that may help or hinder team success. Direct contribution via own work as well as team level contributions through mentoring. Skills Required: Test Automation Experience Required: Bachelor's Degree in Computer Science, Engineering, or equivalent work experience 5+ years of professional experience on Software / Requirements Verification projects (Test Suite and Automation solutions for multiple Verification / Validation scopes & integration levels) 5+ years of experience with test design Experience Preferred: Advanced experience with Requirements Engineering and development types and strategies. Advanced experience with Functional Architecture or Software Architecture. Effectively uses software configuration management (source control, DevSecOps, CI/CD, etc.). Proven experience with: Java full stack development (Spring Boot, microservices, React) Persistence - Buckets, PostgreSQL, Bigtable Works effectively on an agile team following agile practices with internal SW development groups as well as Tier I&II (external suppliers) Cloud technologies experience (such as GCP, AWS, Azure). Experience with software operations (DevSecOps, SRE, observability, support/maintenance, etc.). Experience in secure coding practices and modern software development methodology, such as pair programming, test-first/test-driven development OR demonstrated delivery of singular focus programming. Proficient with automation tools such as Selenium, Cucumber, REST Assured. Education Required: Bachelor's Degree TekWissen® Group is an equal opportunity employer supporting workforce diversity.
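As a small illustration of the requirement-driven test automation this role describes, here is a stdlib unittest suite where each test traces back to a requirement ID. The component under test and the REQ number are invented for the sketch; in the role above the system under test would be exercised through its integration interface rather than as a local function:

```python
import unittest

# Stand-in for the system under test (invented for this sketch).
def clamp_speed(requested_kmh, limit_kmh=120):
    """REQ-001: commanded speed shall never exceed the configured limit."""
    if requested_kmh < 0:
        raise ValueError("speed must be non-negative")
    return min(requested_kmh, limit_kmh)

class TestReq001SpeedLimit(unittest.TestCase):
    """Each test maps to a requirement clause for verification reporting."""

    def test_within_limit_passes_through(self):
        self.assertEqual(clamp_speed(80), 80)

    def test_above_limit_is_clamped(self):
        self.assertEqual(clamp_speed(150), 120)

    def test_negative_input_is_rejected(self):
        with self.assertRaises(ValueError):
            clamp_speed(-5)

suite = unittest.TestLoader().loadTestsFromTestCase(TestReq001SpeedLimit)
outcome = unittest.TextTestRunner(verbosity=0).run(suite)
```

A test management tool would then aggregate results like these per requirement ID for the metrics and reporting duties listed above.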

Posted 1 day ago

Apply

3.0 - 8.0 years

10 - 16 Lacs

Bengaluru

Hybrid

Naukri logo

1. Core Development: Design, develop, and maintain scalable backend services using Node.js and TypeScript. Build RESTful APIs or GraphQL endpoints. Write clean, modular, and well-documented code following best practices.
2. Database Management: Design and optimize relational (PostgreSQL, MySQL) and/or NoSQL (MongoDB, Redis) database schemas. Write and optimize database queries. Manage migrations and data integrity.
3. API Integration & Microservices: Develop and integrate internal and third-party APIs. Implement and manage microservices architecture when required. Ensure secure data transmission and handle authentication/authorization (JWT, OAuth2).
4. Testing & Debugging: Write unit, integration, and end-to-end tests (Jest, Mocha, etc.). Conduct debugging, profiling, and performance tuning. Work with CI/CD pipelines for automated testing and deployments.
5. Security & Compliance: Implement backend security best practices (input validation, data sanitization, rate limiting). Ensure data protection and compliance (e.g., GDPR, HIPAA) if applicable.
6. Deployment: Collaborate on deployments to cloud platforms (AWS or GCP). Monitor system performance, uptime, and logs (e.g., Prometheus, Grafana, ELK Stack).
7. Collaboration & Communication: Work closely with frontend developers, designers, and product managers. Participate in Agile ceremonies like sprint planning, daily stand-ups, and retrospectives. Document APIs, architecture decisions, and backend logic.
8. Code Quality & Standards: Follow TypeScript typing standards for safety and maintainability. Perform code reviews and ensure adherence to team conventions and style guides.
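The authentication/authorization duties above mention JWT. A JWT is a signed header.payload.signature triple; the sketch below shows the HS256 signing scheme in dependency-free Python. A real service would use a vetted library (jsonwebtoken in Node.js, PyJWT in Python), and the secret and claims here are illustrative only:

```python
import base64, hashlib, hmac, json, time

SECRET = b"demo-secret"  # illustrative; load from secure config in practice

def _b64(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_token(payload: dict, secret: bytes = SECRET) -> str:
    header = _b64(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = _b64(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def verify_token(token: str, secret: bytes = SECRET):
    header, body, sig = token.split(".")
    signing_input = f"{header}.{body}".encode()
    expected = _b64(hmac.new(secret, signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        return None                      # tampered payload or wrong key
    padded = body + "=" * (-len(body) % 4)
    return json.loads(base64.urlsafe_b64decode(padded))

token = sign_token({"sub": "user-42", "iat": int(time.time())})
claims = verify_token(token)
```

Verification recomputes the HMAC over header.payload and compares in constant time; any change to the token invalidates it. Production code additionally checks expiry (exp) and issuer claims.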

Posted 1 day ago

Apply

2.0 - 4.0 years

7 - 14 Lacs

Pune

Work from Office

Naukri logo

• 2+ years of professional experience working with Python backend frameworks, specifically Django and FastAPI. • Experience with ORMs including Django ORM and SQLAlchemy.
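The ORMs named above map Python classes and attribute access to SQL. As a rough, dependency-free illustration of that core idea (a toy mapper over sqlite3, not Django's or SQLAlchemy's actual API):

```python
import sqlite3

# Toy sketch of what an ORM layer does underneath: translate Python calls
# into parameterized SQL. Table and data are invented for the example.
class UserTable:
    def __init__(self, conn):
        self.conn = conn
        conn.execute(
            "CREATE TABLE IF NOT EXISTS users "
            "(id INTEGER PRIMARY KEY, name TEXT, email TEXT)")

    def create(self, name, email):
        cur = self.conn.execute(
            "INSERT INTO users (name, email) VALUES (?, ?)", (name, email))
        return cur.lastrowid

    def filter(self, **where):
        # Column names come from Python kwargs (identifiers only);
        # values are always bound as parameters, never interpolated.
        clause = " AND ".join(f"{col} = ?" for col in where)
        rows = self.conn.execute(
            f"SELECT id, name, email FROM users WHERE {clause}",
            tuple(where.values()))
        return [dict(zip(("id", "name", "email"), r)) for r in rows]

users = UserTable(sqlite3.connect(":memory:"))
users.create("Asha", "asha@example.com")
users.create("Ravi", "ravi@example.com")
matches = users.filter(name="Asha")
```

Real ORMs add on top of this: identity maps, lazy relations, migrations, and query composition, which is why the frameworks above are preferred over hand-rolled mappers.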

Posted 1 day ago

Apply

2.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Linkedin logo

Designs, develops, modifies, adapts and implements solutions to information technology needs through new and existing applications, systems architecture, systems strategy, integration services and applications infrastructure to meet client requirements. Reviews system requirements and business processes; codes, tests, debugs and implements software solutions. Designs, codes, tests and/or customizes solutions to meet client requirements. May support systems infrastructure, desktop or network architects by preparing detailed specifications. Develops new technology product ideas or strategic product extensions for internal use or as commercial products. Establishes technology product specifications, and collaborates with various functions to ensure successful product development and implementation. Follows work instructions; codes, tests, debugs; implements and maintains software solutions, under direction of the manager. Develops, tests, debugs and implements operating system components, software tools and utilities, under direct supervision. Provides technical support in a defined project under direct supervision. Develops program logic for new applications or analyses and modifies logic in existing applications, often learning on the job from more senior colleagues as size of the challenge increases. Modifies existing internal software products to add new functions, adapt to new hardware, improve performance or enhance product usability under direct supervision. Implements and monitors basic system improvements to increase efficiency. Under supervision, supports applications to be compatible across multiple computing platforms and browsers. Education Required: Degree qualified in a discipline related to Computer Science, Information Systems, or equivalent work experience.
Experience Required: At least 2 years Special Qualifications: Knowledge of Java, JavaScript, SQL, Spring, Spring Boot, Hibernate, HTML, CSS, RESTful APIs, MySQL, PostgreSQL, Oracle, and tools such as Maven, Gradle, Git, and Jenkins. Good to have knowledge of AWS and OCI and at least one or more operating systems, such as Windows, Linux/Unix, etc. Come as You Are Nasdaq is an equal opportunity employer. We positively encourage applications from suitably qualified and eligible candidates regardless of age, color, disability, national origin, ancestry, race, religion, gender, sexual orientation, gender identity and/or expression, veteran status, genetic information, or any other status protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.

Posted 1 day ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

The Global Analyst will perform, manage, and coordinate activities associated with data analysis leveraging IQVIA Connected Intelligence™ Real-World Data (RWD) and clinical data assets for multiple Research & Development projects while closely working with our team of Therapeutic Analytic Leads. They will also ensure standardization in how IQVIA uses data, tools, and processes to inform quality decision making at the indication, program, and study level. Responsibilities: Utilizes the IQVIA Connected Intelligence data sets and resources to define and enhance clinical trial strategy, both pre- and post-award Communicates with internal stakeholders to align on requirements, capabilities, and delivery of data analytics Drives the methodology and implementation of data analytics deliverables, including country evaluation and ranking, competitive landscape assessment, historical recruitment analysis, and patient density analytics Generates patient insights using real-world data to support site targeting activities Coordinates the collection of site outreach data to support development of the country/site strategy Qualifications: Bachelor’s in Life Sciences, Information Technology, Computer Science, Statistics or related field Experience in data analytics, clinical research, or consulting in the pharmaceutical or healthcare industries Experience working with large volumes of electronic data, such as medical claims, sales, prescriptions, electronic medical records/electronic health records, or similar General knowledge of the pharmaceutical and healthcare market, as well as familiarity with drug development processes Experience working with global teams based across multiple geographies is preferred Experience leveraging business intelligence tools such as Power BI or Tableau is preferred Hands-on experience using object-oriented and/or scripting languages (Python, R, Spark, or PySpark) and/or relational databases (MS SQL Server, Oracle SQL, or PostgreSQL) would be a plus Skills: Strong attention to detail Effective presentation skills Proficiency using MS Excel and MS PowerPoint Logical approach to problem solving and task prioritization Excellent communication (verbal/written) and ability to interact with dynamic, global teams Ability to acquire new skills and evolve to new systems IQVIA is a leading global provider of clinical research services, commercial insights and healthcare intelligence to the life sciences and healthcare industries. We create intelligent connections to accelerate the development and commercialization of innovative medical treatments to help improve patient outcomes and population health worldwide. Learn more at https://jobs.iqvia.com

Posted 1 day ago

Apply

4.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Linkedin logo

Microsoft is a company where passionate innovators come to collaborate, envision what can be and take their careers further. This is a world of more possibilities, more innovation, more openness, and the sky is the limit thinking in a cloud-enabled world. Microsoft’s Azure Data engineering team is leading the transformation of analytics in the world of data with products like databases, data integration, big data analytics, messaging & real-time analytics, and business intelligence. The products in our portfolio include Microsoft Fabric, Azure SQL DB, Azure Cosmos DB, Azure PostgreSQL, Azure Data Factory, Azure Synapse Analytics, Azure Service Bus, Azure Event Grid, and Power BI. Our mission is to build the data platform for the age of AI, powering a new class of data-first applications and driving a data culture. Within Azure Data, the Microsoft Fabric platform team builds and maintains the operating system and provides customers a unified data stack to run an entire data estate. The platform provides a unified experience, unified governance, enables a unified business model and a unified architecture. The Fabric Data Analytics, Insights, and Curation team is leading the way at understanding the Microsoft Fabric composite services and empowering our strategic business leaders. We work with very large and fast-arriving data and transform it into trustworthy insights. We build and manage pipelines, transformation, platforms, models, and so much more that empowers the Fabric product. As an Engineer on our team your core function will be Data Engineering, with opportunities in Analytics, Science, Software Engineering, DevOps, and Cloud Systems. You will be working alongside other Engineers, Scientists, Product, Architecture, and Visionaries bringing forth the next generation of data democratization products. We do not just value differences or different perspectives. We seek them out and invite them in so we can tap into the collective power of everyone in the company.
As a result, our customers are better served. Responsibilities You will develop and maintain data pipelines, including solutions for data collection, management, transformation, and usage, ensuring accurate data ingestion and readiness for downstream analysis, visualization, and AI model training You will review, design, and implement end-to-end software life cycles, encompassing design, development, CI/CD, service reliability, recoverability, and participation in agile development practices, including on-call rotation You will review and write code to implement performance monitoring protocols across data pipelines, building visualizations and aggregations to monitor pipeline health. You’ll also implement solutions and self-healing processes that minimize points of failure across multiple product features You will anticipate data governance needs, designing data modeling and handling procedures to ensure compliance with all applicable laws and policies You will plan, implement, and enforce security and access control measures to protect sensitive resources and data You will perform database administration tasks, including maintenance, and performance monitoring. 
You will collaborate with Product Managers, Data and Applied Scientists, Software and Quality Engineers, and other stakeholders to understand data requirements and deliver phased solutions that meet test and quality programs data needs, and support AI model training and inference You will become an SME of our teams’ products and provide inputs for strategic vision You will champion process, engineering, architecture, and product best practices in the team You will work with other team Seniors and Principals to establish best practices in our organization Embody our culture and values Qualifications Required/Minimum Qualifications Bachelor's Degree in Computer Science, Math, Software Engineering, Computer Engineering, or related field AND 4+ years' experience in business analytics, data science, software development, data modeling or data engineering work OR Master's Degree in Computer Science, Math, Software Engineering, Computer Engineering, or related field AND 2+ years' experience in business analytics, data science, software development, or data engineering work OR equivalent experience 2+ years of experience in software or data engineering, with proven proficiency in C#, Java, or equivalent 2+ years in one scripting language for data retrieval and manipulation (e.g., SQL or KQL) 2+ years of experience with ETL and cloud data technologies, including Azure Data Lake, Azure Data Factory, Azure Synapse, Azure Logic Apps, Azure Functions, Azure Data Explorer, and Power BI or equivalent platforms Preferred/Additional Qualifications 1+ years of demonstrated experience implementing data governance practices, including data access, security and privacy controls and monitoring to comply with regulatory standards. Other Requirements Ability to meet Microsoft, customer and/or government security screening requirements is required for this role.
These requirements include, but are not limited to the following specialized security screenings: Microsoft Cloud Background Check: This position will be required to pass the Microsoft Cloud background check upon hire/transfer and every two years thereafter. Equal Opportunity Employer (EOP) #azdat #azuredata #fabricdata #dataintegration #azure #synapse #databases #analytics #science Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
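One building block behind the "self-healing processes" named in the responsibilities above is retrying transient failures with exponential backoff. The sketch below shows the bare pattern in plain Python; in Azure Data Factory this behavior is usually configured on the activity rather than hand-coded, and the stage names here are invented:

```python
import time

# Generic retry-with-backoff wrapper for a pipeline stage.
def run_with_retries(stage, *, attempts=3, base_delay=0.01):
    last_error = None
    for attempt in range(attempts):
        try:
            return stage()
        except Exception as exc:  # real code narrows this to transient errors
            last_error = exc
            time.sleep(base_delay * (2 ** attempt))  # exponential backoff
    raise RuntimeError(f"stage failed after {attempts} attempts") from last_error

calls = {"count": 0}

def flaky_ingest():
    """Simulated ingestion stage: fails twice, then succeeds."""
    calls["count"] += 1
    if calls["count"] < 3:
        raise ConnectionError("transient source outage")
    return ["row-1", "row-2"]

rows = run_with_retries(flaky_ingest)
```

Paired with the monitoring and alerting duties above, retries handle transient faults automatically while persistent failures still surface as alerts.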

Posted 1 day ago

Apply

0 years

0 Lacs

India

Remote

Linkedin logo

Hi there, I won't bore you with a long introduction you'll probably never read. The terms are simple: Role: Flutter Developer Intern Duration: 2 months Location: Remote Timings: Flexible Timings Stipend: Mentioned in Registration form Apply if: - You're looking for cool projects to work on - Zero to one projects is definitely your thing - You’re strong with fundamentals of Computer Science Engineering - You're looking to get your hands dirty with Generative AI & LLMs - You're willing to work remotely in a startup environment directly with the founding team. - You have an appetite for high-ownership large scale projects with real life impact - You don't stop at what you already know and are willing to learn throughout the course of the internship Do not apply if: - You've already skipped reading the above description - You're here just to "figure things out" - You have zero hosted projects in your resume/cover letter - You can't think on your feet and take critical decisions on your own. - You're a sitting duck and do the work assigned to you but don't come up with any innovative ideas around increasing productivity or building faster workflows. Tech Stack: Flutter, Android, Git/Github, PostgreSQL, Firebase + GenAI Interview Rounds: - Technical Interview - Direct Interview with Founder - Onboarding We don't do lengthy DSA rounds or technical rounds. If you can build good stuff, you're our type. Brownie points if you've ever done open source contributions or released a package at pub.dev

Posted 1 day ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

Remote

Linkedin logo

Date: Jun 19, 2025 Company: Zelestra Location Gurugram, India About Us Zelestra (formerly Solarpack) is a multinational company fully focused on multi-technology renewables with a vertically integrated business model focused on large-scale renewable projects in rapidly growing markets across Europe, North America, Latin America, Asia, and Africa. Headquartered in Spain, Zelestra has more than 1000 employees worldwide and is backed by EQT, one of the three largest funds in the world with $200B in assets. One solution doesn’t fit all, especially in energy. We’re on a journey alongside our clients, assisting them in achieving their decarbonization goals. We are committed to developing tailor-made solutions by analyzing power market challenges and co-creating structured products based on customer insights. One of the top 10 sellers of clean energy to corporates in the world, according to Bloomberg NEF, we are committed to tailored solutions to meet customer needs. At Zelestra we aim to be a solid and solvent company, capable of executing quality and valuable projects for society and the environment. Therefore, we maintain a firm commitment to contribute directly to the social development of the communities and markets in which we operate, not only through the creation of economic value, but also through the generation of quality employment and through the social projects we promote. To support our mission, we are looking for a highly skilled Web and Mobile App Developer with a strong background in renewable energy, particularly solar PV and wind operations. Mission We are seeking a highly skilled Web and Mobile App Developer with expertise in designing, developing, and maintaining responsive web and mobile applications. The ideal candidate should have experience with front-end and back-end technologies, cross-platform frameworks, and cloud-based services.
You will work closely with UI/UX designers, product managers, and backend AI/ML engineers to build seamless, high-performance applications. This is an individual contributor role where the candidate will need to collaborate with Product, Tech and Business team members. Responsibilities Web & Mobile App Development Design, develop, and maintain responsive, cross-platform applications using modern frameworks such as React, Angular, Flutter, React Native, Swift, and Kotlin. Implement adaptive UI/UX designs to ensure high-quality user experiences across devices. Ensure compatibility and performance on both iOS and Android platforms. Backend Integration & API Development Collaborate with backend teams to integrate RESTful APIs, GraphQL, and cloud services (AWS, Firebase, Azure). Implement secure authentication mechanisms (OAuth, JWT) and follow best practices for data security and protection. Performance Optimization & Debugging Optimize applications for speed, scalability, and responsiveness. Identify, debug, and resolve UI, connectivity, and performance issues across various devices and platforms. Collaboration & Deployment Work closely with designers, QA testers, and product teams to deliver innovative and functional digital solutions. Deploy applications to Google Play Store, Apple App Store, and web hosting platforms. Continuous Learning & Best Practices Stay updated with emerging technologies, frameworks, and development trends. Follow best coding practices, maintain clean documentation, and work within Agile/Scrum development methodologies. Job Requirements Bachelor's / Master's degree in Computer Science, Software Engineering, or related field. A minimum of 5 years of experience in web and mobile app development. Proficiency in full-stack development, including frontend frameworks (React.js, Angular, Vue.js), mobile platforms (React Native, Flutter, Swift, Kotlin), and backend technologies (Node.js, Python with Django/Flask, Firebase, .NET, PHP).
Strong experience with databases (PostgreSQL, MySQL, MongoDB, Firebase Firestore), API design and integration (RESTful APIs, GraphQL, WebSockets), DevOps practices (Docker, AWS, Azure, CI/CD pipelines), and version control systems (Git, GitHub, GitLab). Strong problem-solving and critical-thinking abilities, with excellent analytical skills. Proven ability to communicate and collaborate effectively across technical and business teams. Comfortable working in fast-paced, agile development environments. What We Offer Career opportunities and professional development in a growing multinational company with a team highly qualified. Flexible compensation. Full working day. Remote work 2 days a week. Zelestra celebrates the diversity of thought and experience that comes from a variety of backgrounds including, among others, gender, age, ethnicity... Our mission is to contribute to a fairer and more equitable society. JR2248 Let's co-build a carbon-free tomorrow! Visit us at zelestra.energy

Posted 1 day ago

Apply

11.0 - 21.0 years

50 - 100 Lacs

Bengaluru

Hybrid

Naukri logo

Our Engineering team is driving the future of cloud security, developing one of the world's largest, most resilient cloud-native data platforms. At Skyhigh Security, we're enabling enterprises to protect their data with deep intelligence and dynamic enforcement across hybrid and multi-cloud environments. As we continue to grow, we're looking for a Principal Data Engineer to help us scale our platform, integrate advanced AI/ML workflows, and lead the evolution of our secure data infrastructure. Responsibilities: As a Principal Data Engineer, you will be responsible for: Leading the design and implementation of high-scale, cloud-native data pipelines for real-time and batch workloads. Collaborating with product managers, architects, and backend teams to translate business needs into secure and scalable data solutions. Integrating big data frameworks (like Spark, Kafka, Flink) with cloud-native services (AWS/GCP/Azure) to support security analytics use cases. Driving CI/CD best practices, infrastructure automation, and performance tuning across distributed environments. Evaluating and piloting the use of AI/LLM technologies in data pipelines (e.g., anomaly detection, metadata enrichment, automation). Evaluating and integrating LLM-based automation and AI-enhanced observability into engineering workflows. Ensuring data security and privacy compliance. Mentoring engineers, ensuring high engineering standards, and promoting technical excellence across teams. What We’re Looking For (Minimum Qualifications) 10+ years of experience in big data architecture and engineering, including deep proficiency with the AWS cloud platform. Expertise in distributed systems and frameworks such as Apache Spark, Scala, Kafka, Flink and Elasticsearch, with experience building production-grade data pipelines. Strong programming skills in Java for building scalable data applications. Hands-on experience with ETL tools and orchestration systems.
Solid understanding of data modeling across both relational (PostgreSQL, MySQL) and NoSQL (HBase) databases and performance tuning. What Will Make You Stand Out (Preferred Qualifications) Experience integrating AI/ML or LLM frameworks (e.g., LangChain, LlamaIndex) into data workflows. Experience implementing CI/CD pipelines with Kubernetes, Docker, and Terraform. Knowledge of modern data warehousing (e.g., BigQuery, Snowflake) and data governance principles (GDPR, HIPAA). Strong ability to translate business goals into technical architecture and mentor teams through delivery. Familiarity with visualization tools (Tableau, Power BI) to communicate data insights, even if not a primary responsibility.
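As a toy illustration of the anomaly-detection use case mentioned in the responsibilities above, here is a rolling z-score detector in plain Python. Production versions would run as Spark or Flink jobs over Kafka topics; the window size, threshold, and sample stream below are arbitrary choices for the sketch:

```python
from collections import deque
from statistics import mean, pstdev

# Minimal rolling z-score anomaly detector over a stream of values.
class RollingAnomalyDetector:
    def __init__(self, window=20, threshold=3.0):
        self.values = deque(maxlen=window)
        self.threshold = threshold

    def observe(self, value):
        """Return True if value is anomalous versus the recent window."""
        anomalous = False
        if len(self.values) >= 5:        # need a few samples before judging
            mu, sigma = mean(self.values), pstdev(self.values)
            if sigma > 0 and abs(value - mu) / sigma > self.threshold:
                anomalous = True
        self.values.append(value)
        return anomalous

detector = RollingAnomalyDetector()
stream = [10, 11, 9, 10, 12, 10, 11, 500, 10]   # 500 is the injected spike
flags = [detector.observe(v) for v in stream]
```

Only the injected spike is flagged; steady values shift the baseline gradually instead. Real security-analytics pipelines layer richer models (seasonality, multivariate features) on the same sliding-window idea.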

Posted 1 day ago

Apply

3.0 - 8.0 years

4 - 9 Lacs

Ahmedabad

Work from Office

Naukri logo

Preferred candidate profile Strong knowledge of JavaScript/TypeScript , React.js/Angular Proficiency in Node.js / .NET / Python for backend development. Experience with SQL databases like PostgreSQL, MSSQL, MySQL . Experience with NoSQL databases like MongoDB will be an added advantage. Understanding of Git , Docker , and cloud platforms (AWS/Azure). Knowledge of Agile/Scrum methodologies. Perks Referral bonus. Performance Bonus (on top of CTC) Gratuity (on top of CTC) 5 Days working. No sandwich leave policy / No Bond Leave Encashment /Paid Parental leave. Employee-oriented HR policy. Equal Employment Opportunity Policy.

Posted 1 day ago

Apply

4.0 - 6.0 years

10 - 15 Lacs

Faridabad

Work from Office

Naukri logo

About Specscart Specscart is the fastest-growing eyewear business in the UK, revolutionising the industry by changing how people perceive eyeglasses. In just seven years, we have achieved an impressive year-on-year growth of over 100%. With three smart retail stores in the UK and a state-of-the-art glazing lab in Bury, Manchester, we are looking for a talented Sr. Full Stack Developer to join our team in India. What We're Looking For We are seeking a Sr. Full Stack Developer specialising in Node.js, React.js and Next.js. The ideal candidate will be responsible for designing and developing scalable web applications, ensuring seamless integration of front-end and back-end functionalities. This role is crucial in building robust, high-performance systems to support Specscart's growth. Roles and Responsibilities 1. Lead development of complex, scalable web applications with a strong focus on performance, reliability, and architectural integrity. 2. Architect and implement systems with a future-ready approach, especially for launching and managing multiple international storefronts. 3. Optimise systems and API integrations to ensure seamless functionality and fast load times. 4. Take ownership of full project lifecycles, from planning and prototyping to deployment and maintenance. 5. Mentor and guide junior developers, conducting code reviews, solving blockers, and ensuring best coding practices. 6. Collaborate cross-functionally with product, design, operations, and marketing teams to align development with business needs. 7. Ensure strong data security, performance, and scalability across all services. Must-Have Skills: 1. Brings 4-6 years of experience as a Full Stack Developer, with solid logical reasoning and technical expertise to build efficient and scalable applications. 2. Strong command over Node.js, Express.js, TypeScript/JavaScript, and database management (MongoDB, PostgreSQL, or MySQL). 3.
Experience with frontend technologies like React, Next.js, HTML, CSS, and Tailwind. 4. Clear understanding of RESTful APIs, server-side logic, and cloud infrastructure (AWS, GCP, or similar). 5. Proven experience in system design, codebase optimisation, and modular architecture. 6. Ability to troubleshoot deeply and solve complex bugs across backend and integration points. 7. Familiarity with CI/CD pipelines, Git workflows, and deployment processes. 8. Excellent communication skills and a track record of leading or mentoring teams. Nice to Have: 1. Experience with internationalisation (i18n), multi-store setups, or headless CMS integrations. 2. Exposure to e-commerce platforms and performance optimisation for high-traffic environments. 3. Experience in development team handling. Benefits Salary: Negotiable (no bar for the right candidate). Sundays off and 3rd Saturday off each month. Bonus plans. Paid holidays annually. Opportunities for career growth and the chance to work in the UK in the future. Why Join Specscart? Working at Specscart gives you the opportunity to thrive in your career and contribute to the success of the biggest challenger in the global eyewear market. With achievements like making it to the prestigious Forbes 30 Under 30 Europe list, The Sunday Times Seven Ones to Watch list, Wired Trailblazers and North West Business Insider 42 Under 42, the opportunities are endless for both the brand and every individual working towards its vision. Join us and become part of a dynamic and innovative organisation where personal and professional growth opportunities abound. How to Apply? Send your CV and portfolio to Shreya@specscart.co.uk with a cover letter explaining why you're the perfect fit for this role.

Posted 1 day ago

Apply

10.0 - 20.0 years

20 - 35 Lacs

Gurugram

Work from Office


Responsibilities:

  • Design and develop robust backend services in Java and GraphQL.
  • Ensure high performance, scalability, and integration with frontend systems.
  • Collaborate with product teams to translate requirements into backend functionality.
  • Troubleshoot and optimize backend performance issues.

Requirements:

  • Strong proficiency in Java, GraphQL, Spring Boot, and Hibernate.
  • Experience with relational databases (MySQL, PostgreSQL) and NoSQL databases (MongoDB, Cassandra).
  • Familiarity with microservices architecture, Docker, and Kubernetes.
  • Understanding of security principles, OAuth, and API Gateway implementation.
  • Passion for technology that positively impacts rural India.

Posted 1 day ago

Apply

7.0 - 12.0 years

15 - 20 Lacs

Noida

Work from Office


Job Title: Lead DevOps Engineer - Cloud, CI/CD, Security (Banking Domain)
Location: Noida
Experience: 7-12 years
Employment Type: Full-Time
Work Mode: Onsite

Job Summary:

We are seeking a technically proficient and independent DevOps Lead to drive automation for a large enterprise-scale application built using .NET Core, Angular, Oracle, and PostgreSQL. The role requires a strong background in Linux and AIX environments, proficiency in SVN and Git, and the ability to build and maintain automated, scalable, and secure CI/CD pipelines.

Key Responsibilities:

  • Lead and own all DevOps functions for development, testing, release, and operations.
  • Build and maintain CI/CD pipelines for multi-tier applications developed in .NET Core and Angular, with backend databases in Oracle and PostgreSQL.
  • Administer and support Linux (RedHat/CentOS) and AIX environments, including script automation.
  • Manage source code repositories using both SVN (Apache Subversion) and Git (GitHub, GitLab, Bitbucket).
  • Develop and manage Infrastructure as Code (IaC) using Terraform, Ansible, and shell scripts.
  • Containerise applications using Docker and orchestrate using Kubernetes or OpenShift.
  • Integrate and maintain application/database deployment workflows, including version control for Oracle and PostgreSQL schemas and scripts.
  • Implement robust monitoring and logging using Prometheus, Grafana, the ELK stack, or Splunk.
  • Ensure security, compliance, and rollback strategies for all deployments.
  • Collaborate with application architects, developers, testers, DBAs, and system administrators.
  • Drive DevOps maturity across environments by introducing automation and standardisation.

Required Skills and Experience:

  • 7+ years of experience, including at least 3 years in a DevOps leadership role.
  • Proven expertise in .NET Core, Angular, and DevOps automation practices.
  • Strong experience in supporting and automating Oracle and PostgreSQL database deployments.
  • Proficiency in using and managing both SVN and Git-based source control systems.
  • Hands-on with CI/CD tools such as Jenkins, GitLab CI, Azure DevOps, or similar.
  • Deep experience in Linux and AIX OS environments, including scripting in Shell/Bash.
  • Knowledge of Docker, Kubernetes, or OpenShift for container orchestration.
  • Experience with automation tools such as Ansible, Terraform, or Puppet.
  • Strong understanding of deployment security and rollback strategies.
  • Self-starter with excellent troubleshooting, communication, and collaboration skills.

Posted 1 day ago

Apply

0.0 - 4.0 years

1 - 4 Lacs

Chennai

Work from Office


As a Full Stack Software Engineer at Fractal Street, you will drive the technical vision and execution of our fintech platform's frontend and backend systems.

Posted 1 day ago

Apply

Exploring PostgreSQL Jobs in India

PostgreSQL, an open-source relational database management system, is widely used in various industries across India. Job opportunities for PostgreSQL professionals are on the rise as companies continue to adopt this powerful database technology.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

These cities are known for their thriving IT industries and have a high demand for PostgreSQL experts.

Average Salary Range

The salary range for PostgreSQL professionals in India varies based on experience:

  • Entry-level: ₹3-5 lakhs per annum
  • Mid-level: ₹6-10 lakhs per annum
  • Experienced: ₹12-20 lakhs per annum

Career Path

Typically, a career in PostgreSQL progresses as follows:

  1. Junior Developer
  2. Database Administrator
  3. Senior Developer
  4. Tech Lead

Advancing in this career path often requires gaining experience in managing large databases, optimizing performance, and implementing complex queries.

Related Skills

In addition to PostgreSQL expertise, professionals in this field are often expected to have knowledge of:

  • SQL
  • Database design
  • Data modeling
  • Query optimization
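
Database design and normalization are worth practising hands-on. As a minimal sketch (the customers/orders schema below is hypothetical, invented for illustration only), normalizing a design means moving repeated customer details out of the orders table into their own table, linked by a foreign key:

```sql
-- Hypothetical normalized schema: customer details live in one place
-- instead of being repeated on every order row.
CREATE TABLE customers (
    id    serial PRIMARY KEY,
    name  text   NOT NULL,
    email text   NOT NULL UNIQUE
);

CREATE TABLE orders (
    id          serial  PRIMARY KEY,
    customer_id integer NOT NULL REFERENCES customers (id),
    total       numeric(10, 2) NOT NULL,
    placed_at   timestamptz NOT NULL DEFAULT now()
);
```

The `REFERENCES` clause is PostgreSQL's foreign key constraint: it prevents an order from pointing at a customer row that does not exist, one of the data-integrity mechanisms interviewers commonly ask about.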

Interview Questions

  • What is the difference between SQL and PostgreSQL? (basic)
  • Explain the concept of normalization in databases. (medium)
  • How can you optimize a query in PostgreSQL? (medium)
  • Describe the process of setting up replication in PostgreSQL. (advanced)
  • What are the different indexing techniques available in PostgreSQL? (medium)
  • How does PostgreSQL handle concurrent access to data? (advanced)
  • Explain the role of triggers in PostgreSQL. (medium)
  • What is the purpose of the EXPLAIN statement in PostgreSQL? (medium)
  • Describe the role of vacuum in PostgreSQL. (medium)
  • How does PostgreSQL ensure data integrity? (advanced)
  • What is a foreign key constraint in PostgreSQL? (basic)
  • Explain the difference between CHAR and VARCHAR data types in PostgreSQL. (basic)
  • How can you monitor the performance of a PostgreSQL database? (medium)
  • What is the purpose of the pg_hba.conf file in PostgreSQL? (medium)
  • How would you handle a database crash in PostgreSQL? (advanced)
  • Describe the process of setting up high availability in PostgreSQL. (advanced)
  • What are the advantages of using stored procedures in PostgreSQL? (medium)
  • Explain the concept of tablespaces in PostgreSQL. (basic)
  • How does PostgreSQL handle transactions? (advanced)
  • What is the role of shared buffers in PostgreSQL? (medium)
  • Describe the process of upgrading PostgreSQL to a new version. (advanced)
  • How can you troubleshoot performance issues in PostgreSQL? (medium)
  • What is the purpose of the pg_stat_statements extension in PostgreSQL? (advanced)
  • Explain the concept of partial indexes in PostgreSQL. (medium)
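
Several of these questions (EXPLAIN, partial indexes, query optimization) can be explored directly in `psql`. The snippet below is an illustrative sketch only; the `tickets` table and column names are hypothetical, not taken from any listing above:

```sql
-- Hypothetical table for experimenting with partial indexes
CREATE TABLE tickets (
    id         serial PRIMARY KEY,
    status     text NOT NULL,
    created_at timestamptz NOT NULL DEFAULT now()
);

-- Partial index: indexes only the 'open' rows, which keeps the index
-- small when most rows are closed and most queries target open tickets.
CREATE INDEX idx_tickets_open
    ON tickets (created_at)
    WHERE status = 'open';

-- EXPLAIN shows the planner's chosen plan without running the query;
-- EXPLAIN ANALYZE would also execute it and report actual timings.
EXPLAIN
SELECT * FROM tickets
WHERE status = 'open'
  AND created_at > now() - interval '1 day';
```

Note that on a small or freshly created table the planner may still prefer a sequential scan; running `ANALYZE tickets;` after loading realistic data gives the planner statistics to work with.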

Closing Remark

As you explore job opportunities in PostgreSQL in India, remember to showcase your skills and expertise confidently during interviews. Prepare thoroughly and stay updated with the latest trends in database management to stand out in this competitive job market. Good luck with your job search!
