2.0 - 7.0 years
3 - 7 Lacs
Pune
Work from Office
Job summary:
We are looking for a Data Engineer with 2+ years of experience in SQL databases, AWS data pipelines, and Python programming. The ideal candidate should understand API structures, push/pull APIs, API testing, and the basics of mobile application functionality.

Position details:
Department: Software Engineering
Location: Pune

Roles and Responsibilities:
- Design, develop, and manage SQL databases, ensuring efficient data storage and retrieval.
- Work with AWS services such as EC2, RDS, Lambda, and AWS data pipelines for processing and managing data.
- Write basic Python scripts for data processing, automation, and integration.
- Understand API structures, including RESTful APIs, push/pull mechanisms, and how APIs interact with databases and applications.
- Work with mobile application teams to understand data flow and optimize API interactions.
- Monitor, debug, and optimize cloud-based data workflows and pipelines.

Requirements:
Work Experience: 2+ years

Technical Skills:
- SQL databases: strong understanding of relational databases, query optimization, and data management.
- AWS services: experience with EC2, RDS, Lambda, and AWS data pipeline setup and management.
- Python programming: basic scripting skills for automation and data handling.
- API knowledge: understanding of API structures, push/pull API mechanisms, API testing, and integration.
- Mobile application basics: knowledge of how mobile apps interact with back-end systems and APIs.
- Debugging and testing: ability to test, troubleshoot, and validate data flows, APIs, and system integrations.
- Problem-solving: analytical mindset to optimize workflows and data processes.

Preferred Qualifications:
- Experience with cloud-based data architectures.
- Exposure to serverless computing and event-driven architectures.
- Basic understanding of front-end and back-end application development.
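A minimal sketch of the kind of Python scripting this role describes, parsing a pull-API payload and loading it into SQL. All names here are hypothetical, and sqlite3 stands in for the production database; in a real pipeline the payload would come from an HTTP GET against the service's REST endpoint.

```python
import json
import sqlite3

def fetch_readings(raw_json: str) -> list:
    """Parse a (hypothetical) pull-API JSON payload into records."""
    return json.loads(raw_json)["readings"]

def load_readings(conn, readings) -> int:
    """Insert parsed records into SQL storage; return the total row count."""
    conn.execute("CREATE TABLE IF NOT EXISTS readings (device_id TEXT, value REAL)")
    conn.executemany(
        "INSERT INTO readings (device_id, value) VALUES (:device_id, :value)",
        readings,
    )
    return conn.execute("SELECT COUNT(*) FROM readings").fetchone()[0]

# Hypothetical payload as it might arrive from the API.
payload = '{"readings": [{"device_id": "d1", "value": 3.2}, {"device_id": "d2", "value": 1.5}]}'
conn = sqlite3.connect(":memory:")
print(load_readings(conn, fetch_readings(payload)))  # 2
```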
Posted 1 month ago
5.0 - 8.0 years
20 - 25 Lacs
Gurugram
Work from Office
Responsibilities:
- Low-level analysis, quality development, and unit testing.
- Communication with BA, QA, and other stakeholders to ensure bug-free end-to-end delivery.
- Take ownership of your code and services in the long run.

Mandate:
Education: B.E., B.Tech, MCA, or other equivalent engineering degree.
Experience: 5-8 years preferred; minimum 5 years of relevant experience in .NET or .NET Core server-side programming.

Required skills:
- Knowledge of OOPs, data structures, REST or gRPC, WCF, multithreading, Kafka, Azure, and the basics of DevOps.
- Strong understanding of SQL, Query Store, execution plans, and query optimization.
- Design patterns and SOLID principles.
- Good communication skills (verbal and written).

Good to have: exposure to financial markets, Docker, and Kubernetes.
Posted 1 month ago
7.0 - 10.0 years
8 - 16 Lacs
Bengaluru
Work from Office
We are seeking a detail-oriented and skilled Oracle SQL/PL/SQL Developer to join our team. The ideal candidate will have strong expertise in Oracle SQL and PL/SQL, with a proven ability to understand, debug, and optimize existing codebases. Experience with Linux and Oracle Warehouse Builder (OWB) is a strong plus. Familiarity with WebFOCUS is highly appreciated.

Key Responsibilities:
- Analyze, develop, and maintain complex PL/SQL code and Oracle SQL queries.
- Debug and optimize existing stored procedures, functions, and packages.
- Collaborate with team members to understand business requirements and translate them into technical solutions.
- Review and refactor legacy code for performance improvements and best practices.
- Conduct unit testing and assist in system and integration testing.
- Work in Linux-based environments to deploy and maintain database scripts and jobs.
- (If applicable) Work with Oracle Warehouse Builder (OWB) for ETL processes.
- (Optional) Contribute to reporting and data visualization using WebFOCUS tools.

Required Skills:
- Strong expertise in Oracle SQL and PL/SQL development.
- Solid understanding of database design and performance tuning techniques.
- Ability to read, understand, and improve existing codebases.
- Experience in debugging, error handling, and optimizing complex queries.
- Exposure to Linux environments (basic shell scripting is a plus).

Preferred / Nice to Have:
- Hands-on experience with Oracle Warehouse Builder (OWB).
- Familiarity with WebFOCUS for reporting and analytics.
- Understanding of ETL concepts and data warehouse principles.
Posted 1 month ago
2.0 - 4.0 years
5 - 9 Lacs
Noida
Work from Office
- Develop, maintain, and enhance applications using .NET Core and C# Web API.
- Implement state management solutions using Redux or Flux.
- Write clean, maintainable, and efficient code following best practices and design patterns.
- Design, optimize, and debug complex SQL queries and stored procedures.
- Ensure application performance and scalability through query optimization and database tuning.
- Develop and enhance RESTful APIs for seamless frontend-backend integration.
- Work with MVC and WCF to develop robust and scalable applications.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Conduct thorough testing and debugging of applications to ensure quality and security.
- Stay up to date with emerging technologies and industry trends.
Posted 1 month ago
1.0 - 4.0 years
3 - 7 Lacs
Hyderabad
Work from Office
Job Title: Backend Developer
Job Type: Full-time
Location: On-site, Hyderabad, Telangana, India

About us:
Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest-growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market.

Job Summary:
Join our customer's team as a Backend Developer and play a pivotal role in building high-impact backend solutions at the forefront of AI and data engineering. This is your chance to work in a collaborative, onsite environment where your technical expertise and communication skills will drive the success of next-generation AI/ML applications.

Key Responsibilities:
- Develop, test, and maintain scalable backend components and microservices using Python and PySpark.
- Build and optimize advanced data pipelines leveraging Databricks and distributed computing platforms.
- Design and administer efficient MySQL databases, focusing on data integrity, availability, and performance.
- Integrate machine learning models into production-grade backend systems powering innovative AI features.
- Collaborate with data scientists and engineering peers to deliver comprehensive, business-driven solutions.
- Monitor, troubleshoot, and enhance system performance using Redis for caching and scalability.
- Create clear technical documentation and communicate proactively with the team, emphasizing both written and verbal skills.

Required Skills and Qualifications:
- Proficient in Python for backend development with strong coding standards.
- Practical experience with Databricks and PySpark in live production environments.
- Advanced knowledge of MySQL database design, query optimization, and maintenance.
- Solid foundation in machine learning concepts and deploying ML models in backend systems.
- Experience utilizing Redis for effective caching and state management.
- Outstanding written and verbal communication abilities with strong attention to detail.
- Demonstrated success working collaboratively in a fast-paced onsite setting in Hyderabad.

Preferred Qualifications:
- Background in high-growth AI/ML or complex data engineering projects.
- Familiarity with additional backend technologies or cloud-based platforms.
- Experience mentoring or leading technical teams.

Be a key contributor to our customer's team, delivering backend systems that seamlessly bridge data engineering and AI innovation. We value professionals who thrive on clear communication, technical excellence, and collaborative problem-solving.
Posted 1 month ago
2.0 - 4.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Job Title: Backend Developer
Job Type: Full-time
Location: On-site, Hyderabad, Telangana, India

About us:
Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest-growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market.

Job Summary:
Join our customer's team as a Backend Developer and play a pivotal role in building high-impact backend solutions at the forefront of AI and data engineering. This is your chance to work in a collaborative, onsite environment where your technical expertise and communication skills will drive the success of next-generation AI/ML applications.

Key Responsibilities:
- Develop, test, and maintain scalable backend components and microservices using Python and PySpark.
- Build and optimize advanced data pipelines leveraging Databricks and distributed computing platforms.
- Design and administer efficient MySQL databases, focusing on data integrity, availability, and performance.
- Integrate machine learning models into production-grade backend systems powering innovative AI features.
- Collaborate with data scientists and engineering peers to deliver comprehensive, business-driven solutions.
- Monitor, troubleshoot, and enhance system performance using Redis for caching and scalability.
- Create clear technical documentation and communicate proactively with the team, emphasizing both written and verbal skills.

Required Skills and Qualifications:
- Proficient in Python for backend development with strong coding standards.
- Practical experience with Databricks and PySpark in live production environments.
- Advanced knowledge of MySQL database design, query optimization, and maintenance.
- Solid foundation in machine learning concepts and deploying ML models in backend systems.
- Experience utilizing Redis for effective caching and state management.
- Outstanding written and verbal communication abilities with strong attention to detail.
- Demonstrated success working collaboratively in a fast-paced onsite setting in Hyderabad.

Preferred Qualifications:
- Background in high-growth AI/ML or complex data engineering projects.
- Familiarity with additional backend technologies or cloud-based platforms.
- Experience mentoring or leading technical teams.

Be a key contributor to our customer's team, delivering backend systems that seamlessly bridge data engineering and AI innovation. We value professionals who thrive on clear communication, technical excellence, and collaborative problem-solving.
Posted 1 month ago
8.0 - 13.0 years
2 - 30 Lacs
Hyderabad
Work from Office
About us:
Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest-growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market.

Job Summary:
Join our customer's team as a Software Developer and play a pivotal role in building high-impact backend solutions at the forefront of AI and data engineering. This is your chance to work in a collaborative, onsite environment where your technical expertise and communication skills will drive the success of next-generation AI/ML applications.

Key Responsibilities:
- Develop, test, and maintain scalable backend components and microservices using Python and PySpark.
- Build and optimize advanced data pipelines leveraging Databricks and distributed computing platforms.
- Design and administer efficient MySQL databases, focusing on data integrity, availability, and performance.
- Integrate machine learning models into production-grade backend systems powering innovative AI features.
- Collaborate with data scientists and engineering peers to deliver comprehensive, business-driven solutions.
- Monitor, troubleshoot, and enhance system performance using Redis for caching and scalability.
- Create clear technical documentation and communicate proactively with the team, emphasizing both written and verbal skills.

Required Skills and Qualifications:
- Proficient in Python for backend development with strong coding standards.
- Practical experience with Databricks and PySpark in live production environments.
- Advanced knowledge of MySQL database design, query optimization, and maintenance.
- Solid foundation in machine learning concepts and deploying ML models in backend systems.
- Experience utilizing Redis for effective caching and state management.
- Outstanding written and verbal communication abilities with strong attention to detail.
- Demonstrated success working collaboratively in a fast-paced onsite setting in Hyderabad.

Preferred Qualifications:
- Background in high-growth AI/ML or complex data engineering projects.
- Familiarity with additional backend technologies or cloud-based platforms.
- Experience mentoring or leading technical teams.

Be a key contributor to our customer's team, delivering backend systems that seamlessly bridge data engineering and AI innovation. We value professionals who thrive on clear communication, technical excellence, and collaborative problem-solving.
Posted 1 month ago
1.0 - 4.0 years
3 - 7 Lacs
Gurugram
Work from Office
Job Title: Backend Developer
Job Type: Full-time
Location: On-site, Hyderabad, Telangana, India

About us:
Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest-growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market.

Job Summary:
Join our customer's team as a Backend Developer and play a pivotal role in building high-impact backend solutions at the forefront of AI and data engineering. This is your chance to work in a collaborative, onsite environment where your technical expertise and communication skills will drive the success of next-generation AI/ML applications.

Key Responsibilities:
- Develop, test, and maintain scalable backend components and microservices using Python and PySpark.
- Build and optimize advanced data pipelines leveraging Databricks and distributed computing platforms.
- Design and administer efficient MySQL databases, focusing on data integrity, availability, and performance.
- Integrate machine learning models into production-grade backend systems powering innovative AI features.
- Collaborate with data scientists and engineering peers to deliver comprehensive, business-driven solutions.
- Monitor, troubleshoot, and enhance system performance using Redis for caching and scalability.
- Create clear technical documentation and communicate proactively with the team, emphasizing both written and verbal skills.

Required Skills and Qualifications:
- Proficient in Python for backend development with strong coding standards.
- Practical experience with Databricks and PySpark in live production environments.
- Advanced knowledge of MySQL database design, query optimization, and maintenance.
- Solid foundation in machine learning concepts and deploying ML models in backend systems.
- Experience utilizing Redis for effective caching and state management.
- Outstanding written and verbal communication abilities with strong attention to detail.
- Demonstrated success working collaboratively in a fast-paced onsite setting in Hyderabad.

Preferred Qualifications:
- Background in high-growth AI/ML or complex data engineering projects.
- Familiarity with additional backend technologies or cloud-based platforms.
- Experience mentoring or leading technical teams.

Be a key contributor to our customer's team, delivering backend systems that seamlessly bridge data engineering and AI innovation. We value professionals who thrive on clear communication, technical excellence, and collaborative problem-solving.
Posted 1 month ago
5.0 - 7.0 years
7 - 9 Lacs
Noida, Hyderabad
Work from Office
Location: Hyderabad / Noida / Gurugram (onsite preferred)
Type: Contract position (6-12 months)
Experience: 9+ years
Apply: Please share your resume with current CTC, expected CTC, location, and notice period to [your-email@example.com]

We are hiring a Senior Java Full Stack Developer for a contract role at one of our client locations. The ideal candidate should have 9+ years of strong hands-on experience in Java-based software development, front-end technologies, and cloud platforms.

Key Responsibilities:
- Design and develop modern full stack applications using Java, Spring Boot, and ReactJS
- Build and maintain single-page web applications using HTML5, CSS3, Bootstrap, and JavaScript
- Develop and integrate RESTful APIs and microservices
- Work on Node.js backend services and support UI integration
- Architect, develop, and deploy applications in cloud environments such as AWS or Azure
- Work with databases: RDBMS (PostgreSQL, SQL Server) and NoSQL (MongoDB, Elasticsearch)
- Ensure security using OAuth 2.0, OpenID Connect, or similar frameworks
- Apply best practices for clean, maintainable, and efficient code
- Conduct code reviews, testing, and optimization for performance and scalability

Required Skills:
- Strong proficiency in Java, Spring, and Spring Boot
- Hands-on experience with ReactJS, Node.js, and frontend architecture
- Cloud deployment experience on AWS, Azure, or other platforms
- Experience with REST API development and integration
- Experience with OAuth 2.0, OpenID Connect, and security implementation
- Strong understanding of SQL, query optimization, and NoSQL
- Good problem-solving and debugging skills
- Experience with automated testing frameworks such as Jest, Mocha, etc.

To apply, please share:
- Full name
- Total experience / relevant experience
- Current CTC
- Expected CTC
- Current location
- Preferred work location (Hyderabad/Noida/Gurugram)
- Notice period / availability

Send to: [your-email@example.com] (Replace with the actual contact)
Posted 1 month ago
3.0 - 8.0 years
9 - 15 Lacs
Hyderabad
Work from Office
Job Title: Database Developer
Location: Madhapur
Industry: IT Services & Consulting
Department: Engineering - Software & QA
Employment Type: Full-Time
Role Category: DBA / Data Warehousing

Job Description:
We are on the lookout for a skilled Database Developer to join our team. In this role, you will work closely with our client to enhance their product and provide essential post-go-live support for users across the US, Bangkok, the Philippines, Shanghai, and Penang. If you are passionate about database development and eager to tackle complex challenges, we invite you to apply!

Key Responsibilities:
- Develop and implement product enhancements.
- Provide post-go-live production support, troubleshooting issues as they arise.
- Write and optimize complex SQL queries using advanced SQL functions.
- Perform query performance tuning, optimization, and debugging.
- Design and maintain database triggers, indexes, and views.
- Manage and understand complex data organization within RDBMS environments.

Required Candidate Profile:
- Database experience: proficiency in Oracle, MySQL, or MS SQL Server.
- Stored procedures expertise: strong background in stored procedures, including writing and debugging complex queries.
- Query optimization: proven expertise in query performance tuning and optimization.
- Database design: competency in writing triggers and creating indexes and views.
- Industry experience: experience in the manufacturing domain is a significant advantage.

Educational Requirements:
- Undergraduate degree: any graduate
- Postgraduate degree: other postgraduate - other specialization
- Doctorate: other doctorate - other specialization

Key Skills: Query Optimization, MySQL, SQL Queries, PL/SQL, Data Warehousing, Performance Tuning, Oracle
Role: Database Developer / Engineer

If you are a proactive, detail-oriented database professional with a knack for problem-solving and performance tuning, we would love to hear from you. Apply now to join our dynamic team and make a meaningful impact!
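The query performance tuning this role describes usually starts with reading execution plans before and after adding an index. A minimal self-contained sketch using SQLite's EXPLAIN QUERY PLAN (the posting targets Oracle/MySQL/MS SQL Server, which have their own plan tools; the table and index names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, float(i)) for i in range(1000)],
)

def plan(sql: str) -> str:
    # EXPLAIN QUERY PLAN reports how SQLite intends to execute the statement.
    rows = conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()
    return " ".join(str(r) for r in rows)

query = "SELECT total FROM orders WHERE customer_id = 42"
before = plan(query)  # typically reports a full scan of the table
conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
after = plan(query)   # typically reports a search using the new index
print(before)
print(after)
```

The exact detail strings vary by SQLite version, but the shift from a table scan to an index search is the signal a tuner looks for.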
Posted 1 month ago
5.0 - 7.0 years
8 - 12 Lacs
Mohali
Work from Office
Senior SQL Cloud Database Administrator | CS Soft Solutions

Senior SQL Cloud Database Administrator (DBA)

Role and Responsibilities:
- Manage, optimize, and secure our cloud-based SQL databases, ensuring high availability and performance.
- Design and implement scalable and secure SQL database structures in AWS and GCP environments.
- Plan and execute data migration from on-premises or legacy systems to AWS and GCP cloud platforms.
- Monitor database performance, identify bottlenecks, and fine-tune queries and indexes for optimal efficiency.
- Implement and manage database security protocols, including encryption, access controls, and compliance with regulations.
- Develop and maintain robust backup and recovery strategies to ensure data integrity and availability.
- Perform regular maintenance tasks such as patching, updates, and troubleshooting database issues.
- Work closely with developers, DevOps, and data engineers to support application development and deployment.
- Ensure data quality, consistency, and governance across distributed systems.
- Keep up with emerging technologies, cloud services, and best practices in database management.

Required Skills:
- Proven experience as a SQL Database Administrator with expertise in AWS and GCP cloud platforms.
- Strong knowledge of SQL database design, implementation, and optimization.
- Experience with data migration to cloud environments.
- Proficiency in performance monitoring and query optimization.
- Knowledge of database security protocols and compliance regulations.
- Familiarity with backup and disaster recovery strategies.
- Excellent troubleshooting and problem-solving skills.
- Strong collaboration and communication skills.
- Knowledge of DevOps integration.
Posted 1 month ago
3.0 - 6.0 years
7 - 11 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
To design, build, and optimize scalable data pipelines and solutions using Azure Databricks and related technologies, enabling Zodiac Maritime to make faster, data-driven decisions as part of its data transformation journey. Requires proficiency in data integration techniques, ETL processes, and data pipeline architectures, and someone well versed in data quality rules, principles, and implementation.

Key Result Areas and Activities:
- Data Pipeline Development: design and implement robust batch and streaming data pipelines using Azure Databricks and Spark.
- Data Architecture Implementation: apply Medallion Architecture to structure data layers (raw, enriched, curated).
- Data Quality & Governance: ensure data accuracy, consistency, and governance using tools like Azure Purview and Unity Catalog.
- Performance Optimization: optimize Spark jobs, Delta Lake tables, and SQL queries for efficiency and cost-effectiveness.
- Collaboration & Delivery: work closely with analysts, architects, and business teams to deliver end-to-end data solutions.

Technical Experience:
Must Have:
- Hands-on experience with Azure Databricks, Delta Lake, and Data Factory.
- Proficiency in Python, PySpark, and SQL with strong query optimization skills.
- Deep understanding of Lakehouse architecture and Medallion design patterns.
- Experience building scalable ETL/ELT pipelines and data transformations.
- Familiarity with Git, CI/CD pipelines, and Agile methodologies.
Good To Have:
- Knowledge of data quality frameworks and monitoring practices.
- Experience with Power BI or other data visualization tools.
- Understanding of IoT data pipelines and streaming technologies like Kafka/Event Hubs.
- Awareness of emerging technologies such as Knowledge Graphs.

Qualifications:
- Education: a degree in Computer Science, Data Engineering, Information Systems, or a related field.
- Experience: proven hands-on experience with the Azure data stack (Databricks, Data Factory, Delta Lake); experience building scalable ETL/ELT pipelines; familiarity with data governance and DevOps practices.

Qualities:
- Strong problem-solving and analytical skills
- Attention to detail and commitment to data quality
- Collaborative mindset and effective communication
- Proactive and self-driven
- Passion for learning and staying updated with emerging data technologies
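The Medallion Architecture mentioned in this posting (raw, enriched, curated layers) can be sketched in plain Python to show the shape of each layer; in practice these would be Delta Lake tables transformed with PySpark. The data below is entirely made up:

```python
from collections import defaultdict

# Raw (bronze) layer: records exactly as ingested, including bad rows.
raw = [
    {"vessel": "MV Alpha", "speed_knots": "12.4"},
    {"vessel": "MV Alpha", "speed_knots": ""},        # missing reading
    {"vessel": "MV Beta",  "speed_knots": "9.1"},
]

# Enriched (silver) layer: typed, validated records; bad rows dropped.
enriched = [
    {"vessel": r["vessel"], "speed_knots": float(r["speed_knots"])}
    for r in raw
    if r["speed_knots"]
]

# Curated (gold) layer: business-level aggregate ready for reporting.
totals = defaultdict(list)
for r in enriched:
    totals[r["vessel"]].append(r["speed_knots"])
curated = {v: round(sum(s) / len(s), 2) for v, s in totals.items()}
print(curated)  # {'MV Alpha': 12.4, 'MV Beta': 9.1}
```

Each layer only reads from the one before it, which is the property that makes the pattern auditable and re-runnable.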
Posted 1 month ago
5.0 - 10.0 years
6 - 10 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
JD:
- 5+ years of experience in software engineering
- Strong proficiency in SQL, with a deep understanding of query optimization and performance tuning
- Experience in implementing automated SQL code review using AI/ML techniques to identify performance bottlenecks and suggest query optimizations
- Experience working with GCP services
- Solid hands-on experience with Python for scripting
- Experience with automation of GitHub Actions
- Hands-on experience in designing, developing, and deploying microservices
- Experience in building APIs in FastAPI/Flask for data services and system integration
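A minimal sketch of the automated SQL review idea this posting mentions, using simple heuristic rules; a production system of the kind described would layer ML-based ranking on top. The rule set and messages here are hypothetical:

```python
import re

# Hypothetical advisory rules; a real reviewer would have many more.
RULES = [
    (re.compile(r"\bSELECT\s+\*", re.I), "avoid SELECT *; project only needed columns"),
    (re.compile(r"\bWHERE\b.*\bLIKE\s+'%", re.I), "leading-wildcard LIKE defeats index use"),
]

def review_sql(sql: str) -> list:
    """Return advisory findings for a single SQL statement."""
    flat = " ".join(sql.split())  # normalize whitespace
    findings = [msg for pattern, msg in RULES if pattern.search(flat)]
    # Flag DELETE/UPDATE statements that have no WHERE clause at all.
    if re.match(r"\s*(DELETE|UPDATE)\b", flat, re.I) and not re.search(r"\bWHERE\b", flat, re.I):
        findings.append("DELETE/UPDATE without WHERE affects every row")
    return findings

print(review_sql("SELECT * FROM users WHERE name LIKE '%son'"))
print(review_sql("DELETE FROM logs"))
```

Heuristics like these catch the cheap, common mistakes; the AI/ML part the posting asks for would prioritize findings against observed query plans and runtime statistics.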
Posted 1 month ago
3.0 - 5.0 years
0 Lacs
Noida
Work from Office
Proficient in database technologies, with a specific understanding of RDBMS like PostgreSQL and MySQL and NoSQL data stores such as HBase, MongoDB, etc. Database query development (writing and optimizing DB queries) and migration.
Posted 1 month ago
1.0 - 6.0 years
3 - 8 Lacs
Bengaluru
Work from Office
Number of Openings: 1
ECMS ID in sourcing stage: 530186
Assignment Duration: 12 months
Total Yrs. of Experience: 7+
Relevant Yrs. of Experience: 5 years

Detailed JD (Roles and Responsibilities):
- Develop and implement logical and physical data models to meet business requirements
- Experience in data modeling tools: Erwin / PowerDesigner
- Develop, optimize, and maintain DB tables, schemas, procedures, etc.
- Ensure DB performance through query optimization and indexing techniques
- Strong experience with relational databases
- Create and maintain documentation for data processes and architectures

Mandatory skills: Data Modeling, Erwin or PowerDesigner, RDBMS
Desired/secondary skills: knowledge of ETL
Vendor Rate: 8500 INR/day
Delivery Anchor (for tracking sourcing statistics, technical evaluation, interviews, feedback, etc.): Selvakumar_R
Work Location given in ECMS ID: Bangalore
Complete WFO or hybrid model: Hybrid (3 days)
BG check (before or after onboarding): Before
Working in shifts outside standard daylight hours: No
Posted 1 month ago
4.0 - 6.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Experience: 6+ years
Location: Bangalore, Chennai, Gurgaon

Join our engineering team as a Senior Backend Engineer and lead the development of cloud-native, scalable microservices and RESTful APIs using modern Python frameworks. You'll work with Docker, AWS, and CI/CD tools to build robust backend systems powering next-gen platforms. If you have hands-on experience with FastAPI, Flask, or Django, and are skilled in distributed systems, Kafka, and relational/NoSQL databases, we want to hear from you.

Key Responsibilities:
- Microservices development: design, build, and optimize microservices architecture using patterns like Service Discovery, Circuit Breaker, API Gateway, and Saga orchestration.
- REST API engineering: develop high-performance RESTful APIs using Python frameworks like FastAPI, Flask, or Django REST Framework.
- Cloud-native backend systems: build and deploy containerized applications using Docker; familiarity with Kubernetes (K8s) for orchestration is a plus.
- CI/CD automation: create and maintain DevOps pipelines using GitLab CI/CD, GitHub Actions, or Jenkins for automated testing and deployment.
- Source code management: collaborate through Git-based version control, ensuring code quality via pull requests and peer reviews on platforms like GitHub or GitLab.
- Event-driven architecture: implement and manage data streaming and messaging pipelines with Apache Kafka, Amazon Kinesis, or equivalent.
- Database engineering: work with PostgreSQL, MySQL, and optionally NoSQL solutions such as MongoDB, DynamoDB, or Cassandra.
- Cloud infrastructure: architect and manage AWS backend services using EC2, ECS, S3, Lambda, RDS, and CloudFormation.
- Big data integration (desirable): leverage PySpark for distributed data processing and scalable ETL workflows in data engineering pipelines.
- Polyglot collaboration: integrate with backend services or data processors developed in Java, Scala, or other enterprise technologies.

Required Skills & Qualifications:
- Bachelor's or Master's in Computer Science, Software Engineering, or a related technical field.
- 6+ years in backend development using Python.
- Proven expertise in API development, microservices, and cloud-native applications.
- Proficiency in SQL, database schema design, and query optimization.
- Strong grasp of DevOps best practices, Git workflows, and code quality standards.
- Experience with streaming platforms, message queues, or event-driven design.

Nice to Have:
- Experience with Kubernetes, Terraform, or CloudWatch.
- Exposure to big data tools (e.g., Spark, Airflow, Glue).
- Familiarity with Agile/Scrum methodologies and cross-functional teams.

Benefits:
- Competitive salary and performance-based bonuses.
- Opportunity to build next-gen backend platforms for global-scale applications.
- Work with a team that values engineering best practices, code quality, and continuous learning.
- Flexible work model with remote and hybrid options.
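One of the microservices patterns this posting names, Circuit Breaker, can be sketched in a few lines of Python. This is a simplified illustration of the pattern, not a production implementation:

```python
import time

class CircuitBreaker:
    """After max_failures consecutive errors the circuit opens and calls
    fail fast; once reset_after seconds pass, one trial call is allowed
    (the half-open state), and a success closes the circuit again."""

    def __init__(self, max_failures=3, reset_after=30.0):
        self.max_failures = max_failures
        self.reset_after = reset_after
        self.failures = 0
        self.opened_at = None  # timestamp when the circuit opened

    def call(self, fn, *args, **kwargs):
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.reset_after:
                raise RuntimeError("circuit open: failing fast")
            self.opened_at = None  # half-open: permit one trial call
        try:
            result = fn(*args, **kwargs)
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = time.monotonic()
            raise
        self.failures = 0  # success closes the circuit
        return result
```

A real service would wrap outbound HTTP or database calls in `call` and return a fallback response while the circuit is open, protecting a struggling dependency from a stampede of retries.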
Posted 1 month ago
1.0 - 2.0 years
1 - 2 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Job Title: Node.js Backend Developer Client: Neutrinos Location: Bangalore About the Role We are seeking experienced Node.js Backend Engineers to develop and optimize microservices for data-intensive applications . This is a hands-on role focused on building scalable systems, integrating with event-driven architectures, and delivering high-performance backend solutions. Key Responsibilities 1. Backend Development Design, build, and maintain microservices using Node.js (Express.js or Nest.js). Ensure clean, modular architecture with scalability and maintainability as core priorities. 2. Performance Optimization Optimize Node.js runtime performance and reduce API latency. Implement caching strategies (e.g., Redis) for throughput and response time improvements. 3. Kafka Integration Design and manage Kafka consumers/producers for event-driven microservices. Collaborate with data teams on message schemas and data flow orchestration . Use gRPC for efficient inter-service communication where necessary. 4. Observability & Monitoring Integrate Open Telemetry or similar tools for monitoring, tracing, and logging. Implement logging best practices and metrics for production readiness and reliability. 5. Cross-functional Collaboration Work closely with Data Integration Engineers to ensure smooth pipeline integration. Coordinate with PostgreSQL experts on query optimization and database performance. Desired Candidate Profile Experience Minimum 1-2 years of hands-on experience in Node.js backend development . Solid background in microservices , event-driven systems , and high-volume data processing . Technical Skills Proficiency in RESTful APIs , JWT/OAuth , and core Node.js libraries. Strong knowledge of Kafka or similar messaging platforms. Familiarity with gRPC for structured, high-performance communication. Experience with observability tools like Open Telemetry , Prometheus, or Jaeger. 
Database Expertise
- Good command of PostgreSQL: writing optimized queries, indexing, and tuning.
- Understanding of partitioning, read/write optimization, and data modelling.

Soft Skills & Team Collaboration
- Strong team player with experience in code reviews, mentoring, and guiding juniors.
- Comfortable working in Agile environments, participating in sprint planning, and collaborating across teams.

Cultural Fit
We value:
- A high-performance mindset with attention to quality, on-time delivery, efficiency, and accuracy.
- Passion for solving complex data challenges using modern engineering practices.
- Ownership, continuous learning, and a collaborative approach to problem-solving.

Why Join Us
Be part of a dynamic team pushing the boundaries of data-intensive software engineering, working with cutting-edge technologies including:
- Node.js microservices
- Kafka-driven architectures
- gRPC-based service communication
- Advanced observability and monitoring
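The caching strategies this role mentions (e.g., Redis) typically follow the cache-aside pattern: check the cache, fall back to the database on a miss, then populate the cache. A minimal Python sketch of the idea, with an in-memory dict standing in for a real Redis client and a made-up `query_database` function (both assumptions for illustration):

```python
import time

class Cache:
    """Toy stand-in for a Redis client with per-entry TTL."""
    def __init__(self, ttl_seconds=60):
        self.ttl = ttl_seconds
        self.store = {}  # key -> (value, expiry timestamp)

    def get(self, key):
        entry = self.store.get(key)
        if entry is None:
            return None
        value, expiry = entry
        if time.monotonic() > expiry:
            del self.store[key]  # expired entry
            return None
        return value

    def set(self, key, value):
        self.store[key] = (value, time.monotonic() + self.ttl)

db_calls = 0

def query_database(user_id):
    # Stand-in for a slow PostgreSQL query; counts how often it runs.
    global db_calls
    db_calls += 1
    return {"id": user_id, "name": f"user-{user_id}"}

cache = Cache(ttl_seconds=60)

def get_user(user_id):
    # Cache-aside: cache first, database on miss, then populate the cache.
    key = f"user:{user_id}"
    cached = cache.get(key)
    if cached is not None:
        return cached
    value = query_database(user_id)
    cache.set(key, value)
    return value

first = get_user(1)   # hits the "database"
second = get_user(1)  # served from the cache; db_calls stays at 1
```

The same shape applies with a real Redis client: only `Cache.get`/`Cache.set` change, the calling code does not.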
Posted 1 month ago
4.0 - 6.0 years
9 - 13 Lacs
Indore, Pune
Work from Office
What will your role look like
- Develop requirements, wireframes, and dashboards for varied audiences covering financial and project-management metrics. These dashboards will pull from multiple data sources and require formatting and aggregating data.
- Produce compelling and informative visualizations using various native and custom chart types.
- Create and maintain relationships between visuals, filters, bookmarks, and numeric/field parameters.
- Design strategic visual interactions that enhance the end users' experience using cross-filtering and cross-highlighting.
- Write and optimize DAX expressions to create measures and calculated columns.
- Create effective data models; maintain relationship cardinality and cross-filtering between tables.
- Manage online deployment pipelines to test and publish Power BI reports.
- Manage user roles and implement row-level security to restrict data access.
- Clean, transform, reshape, and aggregate data from different sources such as Excel, SQL Server, SharePoint, etc.
- Create dynamic, reusable queries using parameters.

Why you will love this role
- Besides a competitive package, an open workspace full of smart and pragmatic team members, with ever-growing opportunities for professional and personal growth.
- Be part of a learning culture where teamwork and collaboration are encouraged, diversity is valued, and excellence, compassion, openness, and ownership are rewarded.

We would like you to bring along
- Strong Power BI expertise with an in-depth understanding of dataflows, datasets, and integration with backend databases.
- Python exposure will be a huge plus.
M Language
- Knowledge of different data types and data structures such as values, records, tables, and lists.
- Familiarity with built-in functions and the ability to write custom functions.
- Understanding of native query folding to optimize performance.
SQL
- Experience writing and optimizing queries.
- Strong understanding of relational databases.
DAX
- Familiarity with common filtering functions (CALCULATE, FILTER, etc.) and iteration functions (SUMX, AVERAGEX, etc.).
- Understanding of the use cases for Import, DirectQuery, Dual, and Live data storage modes.
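DAX iteration functions such as SUMX and AVERAGEX evaluate an expression row by row over a table and then aggregate the results. A rough Python analogy of that behavior (an illustration with a made-up sales table, not real DAX):

```python
# Toy rows standing in for a DAX table.
sales = [
    {"Quantity": 2, "UnitPrice": 10.0},
    {"Quantity": 1, "UnitPrice": 25.0},
    {"Quantity": 4, "UnitPrice": 5.0},
]

def sumx(table, expression):
    # SUMX(table, expr): evaluate expr for each row, then sum the results.
    return sum(expression(row) for row in table)

def averagex(table, expression):
    # AVERAGEX(table, expr): mean of expr evaluated per row.
    return sumx(table, expression) / len(table)

# Revenue = SUMX(Sales, Sales[Quantity] * Sales[UnitPrice])
revenue = sumx(sales, lambda r: r["Quantity"] * r["UnitPrice"])
avg_line = averagex(sales, lambda r: r["Quantity"] * r["UnitPrice"])
```

In DAX proper the same measure would be written as `Revenue = SUMX(Sales, Sales[Quantity] * Sales[UnitPrice])`, with filter context supplied by the visual rather than passed explicitly.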
Posted 1 month ago
1.0 - 5.0 years
2 - 6 Lacs
Nagercoil
Work from Office
Job Summary:
We are seeking a skilled Data Migration Specialist to support critical data transition initiatives, particularly involving Salesforce and Microsoft SQL Server. This role is responsible for the end-to-end migration of data between systems, including data extraction, transformation, cleansing, loading, and validation. The ideal candidate will have a strong foundation in relational databases, a deep understanding of the Salesforce data model, and proven experience handling large-volume data loads.

Required Skills and Qualifications:
- 1+ years of experience in data migration, ETL, or database development roles.
- Strong hands-on experience with Microsoft SQL Server and T-SQL (complex queries, joins, indexing, and profiling).
- Proven experience using Salesforce Data Loader for bulk data operations.
- Solid understanding of Salesforce CRM architecture, including object relationships and schema design.
- Strong background in data transformation and cleansing techniques.

Nice to Have:
- Experience with large-scale data migration projects involving CRM or ERP systems.
- Exposure to ETL tools such as Talend, Informatica, MuleSoft, or custom scripts.
- Salesforce certifications (e.g., Administrator, Data Architecture & Management Designer) are a plus.
- Knowledge of Apex, Salesforce Flows, or other declarative tools is a bonus.

Roles and Responsibilities
Key Responsibilities:
- Execute end-to-end data migration activities, including data extraction, transformation, and loading (ETL).
- Develop and optimize complex SQL queries, joins, and stored procedures for data profiling, analysis, and validation.
- Utilize Salesforce Data Loader and/or the Data Loader CLI to manage high-volume data imports and exports.
- Understand and work with the Salesforce data model, including standard/custom objects and relationships (Lookup, Master-Detail).
- Perform data cleansing, de-duplication, and transformation to ensure quality and consistency.
- Troubleshoot and resolve data-related issues, load failures, and anomalies.
- Collaborate with cross-functional teams to gather data mapping requirements and ensure accurate system integration.
- Ensure data integrity and adherence to compliance standards; document migration processes and mappings.
- Independently analyze, troubleshoot, and resolve data-related issues effectively.
- Follow best practices for data security, performance tuning, and migration efficiency.
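The cleansing and de-duplication step in a migration usually means normalizing a matching key and keeping one record per key before loading. A generic Python sketch of that idea (field names and sample data are made up; real migrations would use the actual source extract and matching rules):

```python
import csv
import io

# Hypothetical contact records as they might arrive from a legacy export.
raw = """email,name
 ALICE@Example.com ,Alice Smith
alice@example.com,Alice Smith
bob@example.com,Bob Jones
"""

def normalize(record):
    # Cleansing: trim whitespace and lower-case the matching key so that
    # " ALICE@Example.com " and "alice@example.com" compare equal.
    record["email"] = record["email"].strip().lower()
    record["name"] = record["name"].strip()
    return record

def deduplicate(records, key):
    # De-duplication: keep the first record seen for each key value.
    seen, result = set(), []
    for rec in records:
        if rec[key] not in seen:
            seen.add(rec[key])
            result.append(rec)
    return result

rows = [normalize(r) for r in csv.DictReader(io.StringIO(raw))]
clean = deduplicate(rows, key="email")  # two unique contacts remain
```

In practice the surviving-record rule (first seen, most recently modified, most complete) is a business decision that should be agreed on before the load, not an implementation detail.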
Posted 1 month ago
5.0 - 9.0 years
8 - 13 Lacs
Hyderabad
Work from Office
Educational Qualification: Bachelor of Engineering, Bachelor of Computer Applications, Master of Computer Applications
Service Line: Application Development and Maintenance

Responsibilities
- Analyze user requirements; envision system features and functionality.
- Design, build, and maintain efficient, reusable, and reliable .NET code, setting expectations and feature priorities throughout the development life cycle.
- Identify bottlenecks and bugs, and recommend system solutions by comparing the advantages and disadvantages of custom development.
- Contribute to team meetings; troubleshoot development and production problems across multiple environments and operating platforms.
- Understand the architecture and ensure effective design, development, validation, and support activities.

Additional Responsibilities:
- Min. 5 years of relevant .NET experience with team-handling experience
- Must have design experience using best practices, design patterns, SDLC, OOP, OOD
- Must have experience in leading and mentoring teams
- Must be experienced in developing applications using SQL databases, schemas, and SQL queries
- Must be experienced in Git and version control systems
- Must be skilled in database constructs, schema design, SQL Server or Oracle, SQL queries, and query optimization
- Must have hands-on experience in MSTest or NUnit, mocking frameworks, Jasmine, Karma, Cucumber
- Solid understanding of object-oriented programming
- Experience with both external and embedded databases
- Creating database schemas that represent and support business processes
- Implementing automated testing platforms and unit tests
- Good verbal and written communication skills
- Ability to communicate effectively with remote teams
- High flexibility to travel
- Strong analytical, logical, and team-leading skills

Technical and Professional Skills: .NET, ASP.NET, MVC, C#, WPF, WCF, SQL Server, Entity Framework
Preferred Skills:
- Technology - Microsoft Technologies - .NET Frameworks
- Technology - ASP.Net - ASP.Net Web API
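The mocking frameworks this role asks for (e.g., Moq alongside MSTest/NUnit in the .NET world) isolate the unit under test by replacing real dependencies with controllable test doubles. The idea, sketched in Python with the stdlib's `unittest.mock` (class and method names here are hypothetical, chosen only for illustration):

```python
from unittest.mock import Mock

class InvoiceService:
    """Unit under test: depends on a repository it does not construct."""
    def __init__(self, repository):
        self.repository = repository

    def total_due(self, customer_id):
        invoices = self.repository.find_open_invoices(customer_id)
        return sum(inv["amount"] for inv in invoices)

# Replace the real repository (database access) with a mock whose
# return value we control, so the test exercises only total_due().
repo = Mock()
repo.find_open_invoices.return_value = [{"amount": 40.0}, {"amount": 60.0}]

service = InvoiceService(repo)
due = service.total_due(customer_id=7)
```

The equivalent C# test would inject a `Mock<IInvoiceRepository>` with a stubbed `Setup(...)`; the pattern (constructor injection plus a stubbed dependency) is the same in either stack.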
Posted 1 month ago
7.0 - 12.0 years
9 - 14 Lacs
Bengaluru
Work from Office
Your responsibilities
- Develop and implement scalable applications using a mix of Microsoft technologies (Power Platform, Power Automate, .NET, SQL Server) and Pega, following best practices for architecture, coding, and design patterns.
- Build high-quality, maintainable, test-covered solutions, ensuring seamless deployment across cloud (Azure, AWS) and on-premises environments with a focus on security and compliance.
- Design, develop, and integrate APIs and automation workflows leveraging Power Automate, Pega, and cloud-native services to enable seamless interoperability and process automation.
- Troubleshoot and resolve complex implementation, environment, and deployment issues across both the Pega and Microsoft stacks, minimizing downtime and ensuring system reliability.
- Develop and automate comprehensive testing frameworks (unit, system) and CI/CD pipelines using tools like Azure DevOps and GitHub to support continuous integration and delivery.
- Analyze business requirements and translate them into robust technical solutions, applying secure development practices, especially in payments processing and enterprise integrations.
- Leverage agentic AI, advanced analytics, and data-driven insights to automate workflows, optimize processes, and enhance system intelligence within Pega and Microsoft environments.
- Stay current with emerging technologies and industry trends in Pega, Microsoft Power Platform, AI, and cloud computing, integrating new best practices into development workflows.
- Collaborate with cross-functional teams, including Solution Architects and SMEs, to prototype, validate, and refine scalable, enterprise-grade solutions.
- Develop, review, and maintain architecture artifacts, reference models, and platform initiatives impacting Pega, Power Platform, Azure, and other cloud ecosystems.
- Apply a solid understanding of payment operations to ensure software solutions support secure, compliant, and efficient transaction processing aligned with business workflows.
- Implement monitoring and observability capabilities to track application performance, detect issues early, and ensure system health across Pega and Microsoft platforms.

Your skills & experience
- 7-10 years of experience with Power Platform (Power Automate, Power Apps), .NET, and SQL Server
- Strong expertise in database design, schema development, and query optimization
- Experience developing scalable, secure enterprise applications on cloud and on-premises
- API design, development, and system integration, with some Pega development experience
- Troubleshooting complex deployment and performance issues across Pega, Microsoft, and database layers
- CI/CD pipeline automation using Azure DevOps and GitHub
- Knowledge of secure payments processing, database encryption, and compliance standards
- Monitoring and observability tools for database and system health management
- Experience with AI, advanced analytics, and workflow automation
- Collaboration with cross-functional teams and stakeholders
- Understanding of payment operations, business workflows, and data security best practices
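Monitoring and observability usually starts with recording per-operation latency metrics, which dedicated tooling then exports and aggregates. A minimal, library-free Python sketch of the underlying idea (the decorator and operation name are made up for illustration; production systems would use a real observability SDK):

```python
import time
from collections import defaultdict

# Toy metric store: operation name -> list of observed durations (seconds).
metrics = defaultdict(list)

def observed(name):
    """Decorator that records how long each call to fn takes."""
    def wrap(fn):
        def inner(*args, **kwargs):
            start = time.perf_counter()
            try:
                return fn(*args, **kwargs)
            finally:
                # Record even when fn raises, so failures are visible too.
                metrics[name].append(time.perf_counter() - start)
        return inner
    return wrap

@observed("process_payment")
def process_payment(amount):
    # Stand-in for real work, e.g. adding a hypothetical 2% fee.
    return round(amount * 1.02, 2)

result = process_payment(100.0)
```

From data like this, a monitoring layer derives the usual health signals (call rate, error rate, latency percentiles) and raises alerts when they drift.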
Posted 1 month ago
7.0 - 12.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Your responsibilities
- Develop and implement scalable applications using a mix of Microsoft technologies (Power Platform, Power Automate, .NET, SQL Server) and Pega, following best practices for architecture, coding, and design patterns.
- Build high-quality, maintainable, test-covered solutions, ensuring seamless deployment across cloud (Azure, AWS) and on-premises environments with a focus on security and compliance.
- Design, develop, and integrate APIs and automation workflows leveraging Power Automate, Pega, and cloud-native services to enable seamless interoperability and process automation.
- Troubleshoot and resolve complex implementation, environment, and deployment issues across both the Pega and Microsoft stacks, minimizing downtime and ensuring system reliability.
- Develop and automate comprehensive testing frameworks (unit, system) and CI/CD pipelines using tools like Azure DevOps and GitHub to support continuous integration and delivery.
- Analyze business requirements and translate them into robust technical solutions, applying secure development practices, especially in payments processing and enterprise integrations.
- Leverage agentic AI, advanced analytics, and data-driven insights to automate workflows, optimize processes, and enhance system intelligence within Pega and Microsoft environments.
- Stay current with emerging technologies and industry trends in Pega, Microsoft Power Platform, AI, and cloud computing, integrating new best practices into development workflows.
- Collaborate with cross-functional teams, including Solution Architects and SMEs, to prototype, validate, and refine scalable, enterprise-grade solutions.
- Develop, review, and maintain architecture artifacts, reference models, and platform initiatives impacting Pega, Power Platform, Azure, and other cloud ecosystems.
- Apply a solid understanding of payment operations to ensure software solutions support secure, compliant, and efficient transaction processing aligned with business workflows.
- Implement monitoring and observability capabilities to track application performance, detect issues early, and ensure system health across Pega and Microsoft platforms.

Your skills & experience
- 5-7 years of experience with Power Platform (Power Automate, Power Apps), .NET, and SQL Server
- Strong expertise in database design, schema development, and query optimization
- Experience developing scalable, secure enterprise applications on cloud and on-premises
- API design, development, and system integration, with some Pega development experience
- Troubleshooting complex deployment and performance issues across Pega, Microsoft, and database layers
- CI/CD pipeline automation using Azure DevOps and GitHub
- Knowledge of secure payments processing, database encryption, and compliance standards
- Monitoring and observability tools for database and system health management
- Experience with AI, advanced analytics, and workflow automation
- Collaboration with cross-functional teams and stakeholders
- Understanding of payment operations, business workflows, and data security best practices
Posted 1 month ago
3.0 - 5.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Educational Qualification: Bachelor of Computer Applications, Bachelor of Engineering, Bachelor of Technology, Master of Technology, Master of Engineering, Master of Science
Service Line: Cloud & Infrastructure Services

Roles and Responsibilities:
- Set up and support HA/DR solutions and replication.
- Lead efforts related to system and SQL performance tuning, and index/partition creation and management.
- Set up log shipping and mirroring/log forwarding; analyze traces.
- Architect, design, implement, and administer database consolidation platforms.
- Perform duties including monitoring, software installs and upgrades, scripting, automation, incident response, and documentation.
- Perform DB restores and point-in-time recovery.

If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Experience in troubleshooting and resolving database integrity issues, performance issues, blocking and deadlocking issues, replication issues, log shipping issues, connectivity issues, security issues, etc.
- Hands-on experience in performance tuning, query optimization, and monitoring and troubleshooting tools.
- Solid understanding of indexes, index management, integrity checks, configuration, and patching: how statistics work, how indexes are stored, and how they can be created and managed effectively.

Technical and Professional Skills:
- Technology | Database Administration | MS SQL Server
- Technology | Database Administration | Oracle DBA
- Technology | Database Administration | PostgreSQL

Preferred Skills:
- Technology-Database-Database-ALL
- Technology-Database-Oracle Database
- Technology-Database Administration-MS SQL Server-SQL Server
- Technology-Database Administration-PostgreSQL
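Index management in miniature: creating an index turns a full table scan into an index seek, which is the core of the tuning work described above. A small sketch using Python's stdlib sqlite3 as a stand-in for SQL Server/Oracle/PostgreSQL (table and index names are made up; each engine has its own plan syntax, but the before/after pattern is the same):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

def plan(sql):
    # EXPLAIN QUERY PLAN reports how SQLite intends to execute the query;
    # the detail text is in the last column of each plan row.
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT total FROM orders WHERE customer_id = 42"
before = plan(query)  # no index yet: a scan of the whole table
conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
after = plan(query)   # planner now searches via the new index
```

On SQL Server the equivalent check is the actual execution plan (scan vs. seek operators); the diagnostic habit of comparing plans before and after an index change carries over directly.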
Posted 1 month ago
12.0 - 17.0 years
5 - 9 Lacs
Pune
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: SAP HANA DB Administration
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: BTech

- Install, configure, and administer SAP HANA databases across various environments.
- Perform database upgrades, patches, and system refreshes.
- Monitor and optimize SAP HANA memory, CPU, and disk utilization to ensure high availability and reliability.
- Implement and manage SAP HANA high availability (HA) and disaster recovery (DR) solutions.
- Conduct performance tuning and query optimization to enhance system efficiency.
- Manage backup, restore, and disaster recovery procedures for SAP HANA databases.
- Troubleshoot and resolve database-related issues, working closely with application teams and SAP support.
- Ensure database security, user management, and authorization compliance in accordance with IT policies.
- Collaborate with SAP BASIS, infrastructure, and security teams to support SAP applications running on HANA.
- Document best practices, troubleshooting steps, and operational procedures for SAP HANA administration.
Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Collaborate with stakeholders to understand application requirements.
- Integrate functional, security, integration, performance, quality, and operations requirements.
- Review and integrate technical architecture requirements.
- Provide input into final decisions regarding hardware, network products, system software, and security.
- Ensure successful integration of application and technical architecture.
- Analyze requirements and provide solutions to problems.
- Manage and coordinate with team members.
- Stay updated with industry trends and advancements.
- Conduct research and make recommendations for improvements.
- Ensure compliance with coding standards and best practices.
- Identify and resolve technical issues and bugs.
- Collaborate with cross-functional teams to deliver high-quality solutions.

Professional & Technical Skills:
- Must-have skills: proficiency in SAP HANA DB Administration.
- Good-to-have skills: experience with SAP BASIS Administration.
- Strong understanding of database administration principles.
- Experience in managing and optimizing SAP HANA databases.
- Knowledge of SAP HANA architecture and components.
- Experience in performance tuning and troubleshooting.
- Familiarity with SAP HANA security and authorization concepts.
- Experience in backup and recovery strategies for SAP HANA databases.

Additional Information:
- The candidate should have a minimum of 12 years of experience in SAP HANA DB Administration.
- This position is based at our Pune office.
- 15 years of full-time education is required.

Qualification: BTech
Posted 1 month ago
10.0 - 15.0 years
11 - 16 Lacs
Hyderabad
Work from Office
- Bachelor's degree in Computer Science, Information Technology, or an equivalent degree and/or experience
- 10+ years of software development experience

Primary Skill
- 8+ years of experience as an MS SQL DB Developer
- Should know SSRS (report creation and server configuration)
- Hands-on experience with the following DB development skills:
  - SQL querying
  - Join operations
  - Stored procedure writing
  - SQL functions / triggers / indexing / temp tables
  - CTEs
  - Cursors
  - Query optimization
  - Tables / views (creation / update)
  - System objects
  - SQL Server security
  - Synonyms
  - SQL Profiler
  - Query Analyzer
- GitLab CI/CD pipelines
- PowerShell scripting
- Google Cloud Firestore and Firebase Hosting skills and experience will be an added advantage
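Among the skills listed above, CTEs are worth a concrete example: a recursive CTE lets a single query walk hierarchical data such as a reporting chain. A small sketch using Python's stdlib sqlite3 (the employees table and names are made up; the same `WITH` syntax works in T-SQL without the `RECURSIVE` keyword):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT, manager_id INTEGER)"
)
conn.executemany(
    "INSERT INTO employees VALUES (?, ?, ?)",
    [(1, "CEO", None), (2, "VP", 1), (3, "Manager", 2), (4, "Engineer", 3)],
)

# Recursive CTE: start from one employee (anchor member), then repeatedly
# join back to employees to climb the management chain (recursive member).
rows = conn.execute("""
    WITH RECURSIVE chain(id, name, manager_id) AS (
        SELECT id, name, manager_id FROM employees WHERE name = 'Engineer'
        UNION ALL
        SELECT e.id, e.name, e.manager_id
        FROM employees e JOIN chain c ON e.id = c.manager_id
    )
    SELECT name FROM chain
""").fetchall()

names = [row[0] for row in rows]  # Engineer up to CEO
```

The same traversal with a cursor would need a loop and repeated single-row queries; the set-based CTE form is both shorter and easier for the optimizer to plan.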
Posted 1 month ago