
271 NoSQL Databases Jobs - Page 11

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

8 - 10 years

13 - 17 Lacs

Kochi

Work from Office


We are looking for a skilled Data Engineering Lead with 8 to 10 years of experience, based in Bengaluru. The ideal candidate will have a strong background in designing and implementing scalable data lake architectures and data pipelines.

### Roles and Responsibility
- Design and implement scalable data lake architectures using Azure Data Lake services.
- Develop and maintain data pipelines to ingest data from various sources.
- Optimize data storage and retrieval processes for efficiency and performance.
- Ensure data security and compliance with industry standards.
- Collaborate with data scientists and analysts to facilitate data accessibility.
- Monitor and troubleshoot data pipeline issues to ensure reliability.
- Document data lake designs, processes, and best practices.
- Work with SQL and NoSQL databases and big data file formats such as Parquet and Avro.
- Develop streaming pipelines using Azure Event Hub, Azure Stream Analytics, and Spark Streaming.
- Integrate with business intelligence tools such as Power BI.

### Job Requirements
- Strong knowledge of Azure Data Lake, Azure Synapse Analytics, Azure Data Factory, and Azure Databricks.
- Proficiency in Python (PySpark, NumPy), SQL, ETL, and data warehousing.
- Experience with Agile and DevOps methodologies and the software development lifecycle.
- Proactive ownership of deliverables; escalates dependencies and risks.
- Works with most DevOps tools under limited supervision, completing assigned tasks on time with regular status reporting.
- Ability to train new team members and build strong relationships with project stakeholders.
- Knowledge of cloud solutions such as Azure or AWS; DevOps/Cloud certifications desired.
- Ability to work virtually with multicultural, global teams.
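The core of a role like this is ingest-transform-load pipeline work. As a minimal, hypothetical sketch (pure Python rather than Azure services, with invented field names), a batch pipeline breaks down into extract, transform, and load stages:

```python
import csv
import io
import json

def extract(csv_text):
    """Ingest raw records from a CSV source (here, an in-memory string)."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Clean and cast types; drop rows that fail validation."""
    out = []
    for row in rows:
        try:
            out.append({"id": int(row["id"]),
                        "amount": round(float(row["amount"]), 2)})
        except (KeyError, ValueError):
            continue  # a real pipeline would route bad rows to a dead-letter store
    return out

def load(records):
    """Serialize to line-delimited JSON, a stand-in for a data lake write."""
    return "\n".join(json.dumps(r, sort_keys=True) for r in records)

raw = "id,amount\n1,10.5\n2,oops\n3,7.25\n"
result = load(transform(extract(raw)))  # row 2 is dropped by validation
```

In production the same three stages would map onto Data Factory activities or PySpark jobs, with Parquet/Avro instead of JSON lines.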

Posted 1 month ago


3 - 8 years

5 - 9 Lacs

Kochi

Work from Office


We are looking for a skilled professional with 3 to 8 years of experience to join our team as an EY Data Engineer in Bengaluru. The ideal candidate will have a strong background in data engineering, analytics, and reporting.

### Roles and Responsibility
- Collaborate with cross-functional teams to design and implement data solutions.
- Develop and maintain large-scale data pipelines using tools like Azure Data Factory and Azure Synapse.
- Design and implement data models and architectures to support business intelligence and analytics.
- Work with stakeholders to understand business requirements and develop solutions that meet their needs.
- Ensure data quality and integrity by implementing data validation and testing procedures.
- Optimize data storage and retrieval processes for improved performance and efficiency.

### Job Requirements
- Strong knowledge of data modeling, architecture, and visualization techniques.
- Experience with big data technologies such as Hadoop, Spark, and NoSQL databases.
- Proficiency in programming languages like Python, Java, and SQL.
- Excellent problem-solving skills and attention to detail.
- Ability to work collaboratively in a team environment and communicate effectively with stakeholders.
- Strong understanding of data governance principles and practices.
- A B.Tech/B.E. degree is required; a higher professional or master's qualification is preferred.
- Active membership in related professional bodies or industry groups is preferred.

Posted 1 month ago


4 - 9 years

6 - 10 Lacs

Kolkata

Work from Office


We are looking for a highly skilled and experienced Senior Data Scientist to join our team in Bengaluru. The ideal candidate will have 4 to 9 years of experience in data science, with a strong background in machine learning, deep learning, and natural language processing.

### Roles and Responsibility
- Develop and implement innovative AI solutions using Python and open-source models.
- Extract insights from complex PDF/Word documents and forms using techniques such as entity extraction, table extraction, and information comparison.
- Collaborate with Solution Architects on the deployment of AI POCs and scaling them up to production-level applications.
- Own the AI/ML implementation process, including model design, feature planning, testing, production setup, monitoring, and release management.
- Work closely with cross-functional teams to support global clients.
- Develop analytics-enabled solutions by integrating data science activities with business-relevant aspects.

### Job Requirements
- Excellent academic background in Data Science, Business Analytics, Statistics, Engineering, Operational Research, or a related field.
- Solid background in Python with excellent coding skills.
- Minimum 4 years of core data science experience in areas such as machine learning, deep learning, and natural language processing.
- Good understanding of open-source LLM frameworks like Mistral and Llama, and of fine-tuning on custom datasets.
- Experience with SQL/NoSQL databases and API deployment (Flask/FastAPI/Azure Function Apps).
- Strong written, oral, presentation, and facilitation skills.
- Ability to coordinate multiple projects and initiatives simultaneously through effective prioritization, organization, flexibility, and self-discipline.
- Demonstrated project management experience.
- Knowledge of the firm's reporting tools and processes.
- Proactive, organized, and self-sufficient, with the ability to prioritize and multitask.
- Ability to analyze complex or unusual problems and deliver insightful, pragmatic solutions.
- Ability to quickly create, gather, and analyze data from a variety of sources.
- A robust and resilient disposition, able to encourage discipline in team behaviors.
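The document-extraction work described here (pulling entities out of PDF/Word content) usually starts from text already extracted from the file. A toy, hypothetical sketch of rule-based entity extraction using only the standard library; the patterns and sample invoice are invented, and real systems would use a trained NER model or an LLM:

```python
import re

# Hypothetical patterns for illustration only.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "date": re.compile(r"\b\d{4}-\d{2}-\d{2}\b"),
    "amount": re.compile(r"₹\s?[\d,]+"),
}

def extract_entities(text):
    """Return {entity_type: [matches]} for each pattern found in the text."""
    return {name: pat.findall(text) for name, pat in PATTERNS.items()}

doc = "Invoice dated 2024-03-01 for ₹1,20,000; contact billing@example.com now."
entities = extract_entities(doc)
```

The same interface (text in, typed spans out) is what an LLM- or model-backed extractor would expose; only the internals change.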

Posted 1 month ago


5 - 10 years

12 - 17 Lacs

Bengaluru

Work from Office


We are looking for a skilled Full Stack Engineer with expertise in .NET Core and Angular to join our Security Remediation Team in Bangalore. The ideal candidate will have 5-10 years of experience.

### Roles and Responsibility
- Design and develop secure, enterprise-level web applications using .NET Core and Angular, following best practices for code security.
- Collaborate with cross-functional teams to gather security requirements and deliver secure software solutions.
- Write clean, maintainable, and secure code in C# and .NET Core, incorporating request sanitization and input validation to mitigate vulnerabilities.
- Enhance and remediate existing applications by identifying and addressing security vulnerabilities within the codebase.
- Conduct code reviews to ensure adherence to security standards, and apply security testing techniques.
- Work closely with the security team to integrate security best practices throughout the Software Development Life Cycle (SDLC).
- Investigate and implement security tools and techniques to continuously enhance the security posture of applications.
- Test, deploy, and maintain secure applications, ensuring timely remediation of vulnerabilities through security-focused development tools.
- Develop documentation for security processes and decisions, ensuring compliance with internal security guidelines and industry standards.
- Stay informed about emerging security trends, threats, and technologies, and recommend practices and tools to strengthen application security.

### Job Requirements
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience as a Full Stack Developer or in a similar role, with a focus on secure software development.
- Strong knowledge of .NET Core and C#, with expertise in Object-Oriented Programming (OOP) concepts.
- Practical experience with Angular for developing secure front-end interfaces.
- Solid understanding of security concepts such as request sanitization, input validation, and secure coding practices to protect against OWASP Top 10 vulnerabilities.
- Familiarity with API architecture styles (e.g., REST, GraphQL, RPC) and security protocols (e.g., OAuth2, JWT).
- Experience with SQL/NoSQL databases, including secure data handling and storage practices.
- Proficiency with Git and other version control tools.
- Excellent troubleshooting, debugging, and communication skills, with the ability to convey security-related issues to both technical and non-technical stakeholders.
- Detail-oriented, with a commitment to writing secure, high-quality code.
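Since the posting stresses OAuth2/JWT, here is a hypothetical sketch of the HMAC signing scheme that JWT's HS256 mode builds on, shown in Python rather than the role's C#/.NET for brevity. The secret and payload are invented, and a real service would use a vetted JWT library rather than hand-rolled tokens:

```python
import base64
import hashlib
import hmac
import json

SECRET = b"demo-secret"  # illustration only; load from a secret store in practice

def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as JWT segments are encoded."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign(payload: dict) -> str:
    """Produce a JWT-style token: header.payload.signature (HS256-like)."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload, sort_keys=True).encode())
    msg = f"{header}.{body}".encode()
    sig = b64url(hmac.new(SECRET, msg, hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def verify(token: str) -> bool:
    """Recompute the signature and compare in constant time."""
    header, body, sig = token.split(".")
    msg = f"{header}.{body}".encode()
    expected = b64url(hmac.new(SECRET, msg, hashlib.sha256).digest())
    return hmac.compare_digest(sig, expected)

token = sign({"sub": "user-42"})
```

The constant-time comparison (`hmac.compare_digest`) is the detail interviewers for a security role tend to probe: a naive `==` would leak timing information about the signature.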

Posted 1 month ago


4 - 7 years

6 - 10 Lacs

Kolkata

Work from Office


We are looking for a highly skilled and experienced Senior Data Scientist to join our team in Bengaluru. The ideal candidate will have 4-7 years of experience in data science, with a strong background in machine learning, deep learning, and natural language processing.

### Roles and Responsibility
- Develop and implement innovative AI solutions using Python and other programming languages.
- Collaborate with cross-functional teams to design and deploy scalable data pipelines.
- Conduct complex data analysis and provide actionable insights to stakeholders.
- Design and develop predictive models to drive business growth and improvement.
- Work closely with Solution Architects on deploying AI POCs and scaling them up to production-level applications.
- Extract entities, tables, and comparative information from complex PDF/Word documents and forms.

### Job Requirements
- Excellent academic background in Data Science, Business Analytics, Statistics, Engineering, Operational Research, or a related field.
- Strong proficiency in Python, with excellent coding skills and experience deploying open-source models.
- Experience in machine learning, deep learning, and natural language processing.
- Good understanding of SQL/NoSQL databases and their manipulation components.
- Ability to coordinate multiple projects and initiatives simultaneously through effective prioritization and organization.
- Proactive, organized, and self-sufficient, with the ability to prioritize and multitask.
- Knowledge of the firm's reporting tools and processes.
- Demonstrated project management experience.

### What We Offer
- A team with commercial acumen, technical experience, and enthusiasm to learn new things in a fast-moving environment.
- An opportunity to be part of a market-leading, multi-disciplinary team of 7,200+ professionals in the only integrated global assurance business worldwide.
- Opportunities to work with EY GDS Assurance practices globally, with leading businesses across a range of industries.

Posted 1 month ago


10 - 12 years

14 - 19 Lacs

Hyderabad

Work from Office


We are looking for a skilled AI Architect with 10-12 years of experience in data science, including 4-5 years focused on architecting and designing AI solutions. The ideal candidate will have a strong background in machine learning, deep learning, natural language processing, and computer vision.

### Roles and Responsibility
- Lead the design and implementation of comprehensive AI architectures, encompassing machine learning, generative AI, and intelligent agent systems.
- Develop detailed technical documentation covering end-to-end system architecture, infrastructure requirements, API specifications, and deployment processes.
- Conduct research into new advancements in machine learning, GenAI, and agents, and incorporate them into current and future projects.
- Establish guidelines and frameworks to maintain ethical AI development, mitigate biases, and ensure compliance with data privacy regulations and industry standards.
- Collaborate with cross-functional teams to support global clients and drive innovation.
- Design and implement scalable, efficient AI solutions that meet business needs.

### Job Requirements
- Excellent academic background in Data Science, Business Analytics, Statistics, Engineering, Operational Research, or a related field.
- Solid background in Python with excellent coding skills.
- Experience in machine learning, deep learning, natural language processing, and computer vision.
- Strong understanding of large language models such as OpenAI's GPT-4, function calling, frameworks like LangChain and LlamaIndex, and agents.
- Experience with AIOps and MLOps.
- Good understanding of open-source LLM frameworks like Mistral and Llama, and of fine-tuning on custom datasets.
- Deep learning: DNN, RNN, LSTM, and encoder-decoder models.
- Natural language processing: text summarization, aspect mining, question answering, text classification, NER, language translation, NLG, and sentiment analysis.
- Computer vision: image classification, object detection, tracking, etc.
- SQL/NoSQL databases and their manipulation components.
- Working knowledge of API deployment (Flask/FastAPI/Azure Function Apps), web app creation, Docker, and Kubernetes.
- Excellent written, oral, presentation, and facilitation skills.
- Ability to coordinate multiple projects and initiatives simultaneously through effective prioritization, organization, flexibility, and self-discipline.
- Demonstrated project management experience.
- Knowledge of the firm's reporting tools and processes.
- Proactive, organized, and self-sufficient, with the ability to prioritize and multitask.
- Ability to analyze complex or unusual problems and deliver insightful, pragmatic solutions.
- Ability to quickly create, gather, and analyze data from a variety of sources.
- A robust and resilient disposition, able to encourage discipline in team behaviors.
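Of the NLP tasks this role lists, text classification is the easiest to illustrate. A minimal naive Bayes classifier in pure Python, with an invented four-sentence training set; real work would use scikit-learn, transformers, or an LLM:

```python
import math
from collections import Counter, defaultdict

def train(samples):
    """samples: list of (text, label). Returns per-label word counts and label counts."""
    counts = defaultdict(Counter)
    labels = Counter()
    for text, label in samples:
        labels[label] += 1
        counts[label].update(text.lower().split())
    return counts, labels

def predict(counts, labels, text):
    """Pick the label maximizing log P(label) + sum log P(word|label), Laplace-smoothed."""
    vocab = {w for c in counts.values() for w in c}
    best, best_score = None, -math.inf
    for label, n in labels.items():
        total = sum(counts[label].values())
        score = math.log(n / sum(labels.values()))
        for w in text.lower().split():
            score += math.log((counts[label][w] + 1) / (total + len(vocab)))
        if score > best_score:
            best, best_score = label, score
    return best

# Tiny invented corpus for illustration.
data = [("great product loved it", "pos"),
        ("terrible awful broke fast", "neg"),
        ("loved the great service", "pos"),
        ("awful terrible experience", "neg")]
counts, labels = train(data)
```

Laplace smoothing (the `+ 1` in the numerator) is what keeps unseen words from zeroing out a label's probability.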

Posted 1 month ago


2 - 5 years

3 - 7 Lacs

Hyderabad

Work from Office


ABOUT THE ROLE

Role Description: The Product Master Data Management (PMDM) Data Engineer is responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and executing data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes. You will play a key role in a regulatory submission content automation initiative that will modernize and digitize the regulatory submission process, positioning Amgen as a leader in regulatory innovation. The initiative leverages state-of-the-art technologies, including generative AI, structured content management, and integrated data, to automate the creation, review, and approval of regulatory content.

Roles & Responsibilities:
- Design, develop, and maintain solutions for data generation, collection, and processing.
- Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions.
- Build and maintain back-end services, using languages like Python, Java, or Node.js, that provide secure, reliable, and scalable access to Product Master Data.
- Collaborate with the design and product teams to understand user needs and translate them into technical requirements.
- Write clean, efficient, and well-tested code; participate in code reviews and provide constructive feedback.
- Maintain system uptime and optimal performance.
- Learn and adapt to new technologies and industry trends.
- Collaborate and communicate effectively with product teams.
- Participate in sprint planning meetings and provide estimates for technical implementation.

Basic Qualifications and Experience:
- Bachelor's degree and 0 to 3 years of Computer Science, IT, or related field experience, OR
- Diploma and 4 to 7 years of Computer Science, IT, or related field experience.

Functional Skills:

Must-Have Skills:
- Hands-on experience with web API development.
- Hands-on experience with back-end development; proficient with SQL/NoSQL databases, Python, and SQL.
- Ability to learn new technologies quickly.
- Strong problem-solving and analytical skills.
- Excellent communication and teamwork skills.

Good-to-Have Skills:
- Strong understanding of data modeling, data warehousing, and data integration concepts.

Professional Certifications:
- Certified Data Engineer / Data Analyst (preferably on Databricks or cloud environments).
- Machine Learning certification (preferably on Databricks or cloud environments).

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Strong presentation and public speaking skills.

EQUAL OPPORTUNITY STATEMENT

Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.

Posted 1 month ago


1 - 3 years

5 - 8 Lacs

Hyderabad

Work from Office


About The Role

The Data Analytics & Automation Sr. Associate will support the Finance Corporate Services teams, including Global Meetings Management, Global Travel & Expense, Global Learning Solutions, Payroll, and Treasury. This role is responsible for executing data collection, aggregation, and report development to support business leaders in tracking key performance metrics and service delivery effectiveness. Reporting to the Data Analytics & Automation Manager, this individual will play a critical role in developing and maintaining performance management dashboards and automation solutions. They will assist in analyzing data to identify trends, support investigations, and uncover opportunities for process improvement. The Sr. Associate will also work closely with the Corporate Services Product Team to ensure data connectivity and IT governance compliance. A successful candidate will be comfortable executing tasks under the guidance of leadership, compiling data-driven insights, and contributing to long- and short-term projects that enhance business functions. He or she will also gather, analyze, and prepare data in various forms for the Corporate Services leadership team, as well as for other ad hoc needs and requests.

Responsibilities
- Design and develop visual performance dashboards for Corporate Services and Finance functions to monitor key service delivery metrics.
- Utilize tools such as Tableau, Power BI, and Smartsheet to create effective reporting solutions, ensuring data accuracy and integrity.
- Execute and implement automation solutions to enhance efficiency and reduce manual effort, leveraging tools such as Power Automate, Power Apps, Power Query, Tableau, Smartsheet, and SharePoint.
- Develop and maintain data pipelines, queries, and reports to support strategic decision-making, business operations, and ad hoc analytics requests.
- Collect, aggregate, and analyze data from multiple systems and data warehouses (e.g., Cvent, Concur, SAP) to provide actionable insights and drive process improvements.
- Support AI automation initiatives, including the maintenance of intake and AI self-service platforms like ServiceNow, while identifying opportunities for AI-driven process enhancements.
- Ensure seamless data integration and system configurations in collaboration with Technology teams, enforcing data governance policies and standardized data connectivity.
- Proactively identify trends, conduct investigations, and provide data-driven recommendations to functional leaders to improve business performance and operational efficiency.
- Prepare recurring reports and dashboards, including monthly, quarterly, and annual performance metrics for Corporate Services leadership.
- Develop and optimize data analytic queries, standardized/custom report layouts, and a library of executive report formats to align reporting processes with business objectives.
- Apply data science methodologies, including regression, classification, clustering, and predictive modeling, to enhance reporting and analytics capabilities.
- Conduct in-depth, ad hoc analyses to investigate operational challenges and provide data-driven insights.

Qualifications
- Experience with data analytics, reporting tools, and automation solutions.
- Strong skills in data visualization and dashboard creation (e.g., Power BI, Tableau).
- Proficiency in SQL and NoSQL databases, including relational table design, indexing strategies, and writing complex queries, with experience handling big data models, data lakes, and distributed computing frameworks.
- Ability to work with large datasets and extract meaningful insights.
- Expertise in automation platforms and workflows, including the Microsoft Power Platform (Power Automate, Power Query, Power Apps, SharePoint, and Pages) to streamline processes and improve efficiency.
- Experience with programming languages such as Python and R, and formats such as JSON, for data processing, automation, and analytics.
- Experience with AI-driven analytics and large language models (LLMs) to enhance data insights and automation capabilities.
- Experience working with self-service platforms such as ServiceNow to support business functions and automation.
- Understanding of enterprise data governance principles to ensure data accuracy, integrity, and compliance across reporting and automation systems.
- Familiarity with additional automation tools, such as UiPath and emerging AI technologies, to drive process optimization.
- Strong data visualization and storytelling skills, with the ability to translate complex data into meaningful dashboards, executive reports, and infographics.
- Knowledge of statistical techniques, including regression, clustering, and classification, as well as data discovery and visualization methods such as distributions, histograms, and bar charts.
- Demonstrated ability to take initiative and execute projects independently, while effectively collaborating across teams and influencing without direct authority.
- Strong attention to detail and ability to manage multiple tasks effectively.
- Strong communication skills, with the ability to present insights clearly to leadership and coordinate cross-functional data requests and updates.
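Of the statistical techniques this posting names (regression, clustering, classification), simple linear regression is the easiest to sketch. A hypothetical, dependency-free ordinary-least-squares fit with invented sample data; production work would use pandas/scikit-learn or the BI tool's built-ins:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b over paired samples."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope = covariance(x, y) / variance(x); intercept pins the line at the means.
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# Invented monthly metric: spend (x) vs. bookings (y) with an exact linear trend.
slope, intercept = fit_line([1, 2, 3, 4], [3, 5, 7, 9])  # fits y = 2x + 1
```

The same slope/intercept pair is what a Power BI or Tableau trend line reports; knowing the formula helps sanity-check the tool's output.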

Posted 1 month ago


5 - 10 years

25 - 35 Lacs

Hyderabad

Work from Office


As a Database Architect specializing in NoSQL and MongoDB, you will play a pivotal role in designing and optimizing data architectures that power our systems and applications. Your focus will be on creating scalable, resilient, and high-performance database solutions that handle large volumes of data while ensuring data integrity and security. You will work closely with cross-functional teams to architect and build innovative solutions that solve complex operational problems and drive business value.

In this role, you will leverage your deep expertise in NoSQL databases, particularly MongoDB, to develop and optimize data infrastructure, ensuring that our systems run efficiently and effectively. Much of your work will center on building and maintaining databases that can scale to meet growing demands, optimizing performance, and automating processes to reduce operational overhead. As part of a team of highly skilled problem-solvers, you will have the opportunity to take risks, experiment with new approaches, and implement cutting-edge technologies. Your contributions will directly impact the success of our production applications and systems, allowing us to deliver high-quality, scalable solutions in a fast-paced environment.

Responsibilities
- Assist and work with Cloud Solution and Infrastructure Architects.
- Administer, design, and deploy MongoDB (Atlas) clusters and databases.
- Design and implement sharding and indexing strategies for MongoDB (Atlas).
- Advise on MongoDB (Atlas) HA strategies, including replica sets and sharding.
- Maintain MongoDB (Atlas) databases and DB projects.
- Design, implement, and manage the security of MongoDB (Atlas) databases.
- Manage MongoDB backups and restores.
- Implement and maintain MongoDB Ops Manager.
- Solve difficult technical challenges.
- Administer MongoDB to achieve 100% availability.
- Collaborate with other teams to solve technical issues.
- Maintain detailed documentation of database design/architecture and setup.

Experience and Skills
- 3+ years of experience in MongoDB database administration.
- Very good experience in a Linux environment in a database administrator role.
- 1+ years of shell scripting.
- Hands-on experience solving MongoDB performance issues.
- Hands-on experience building and maintaining MongoDB replica sets.
- Hands-on experience building and maintaining MongoDB sharded environments.
- Proactive monitoring, identification, and resolution of database-related issues.
- Ability to learn quickly and handle ambiguous situations.
- Ability to work under pressure and to deadlines.
- Ability to work in a collaborative, team-oriented environment.
- Team player with good interpersonal and communication skills.
- Experience automating database administration tasks.
- Experience with privileged access management and authentication mechanisms such as LDAP, Kerberos, HashiCorp, and Active Directory.

Qualifications
- Bachelor's degree in any stream.

Benefits
- Best-in-industry salary
- Health insurance
- Flexible working hours
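To make the sharding responsibility above concrete: MongoDB routes each document to a shard based on its shard key, either by key ranges or by a hash of the key. A toy, hypothetical model of hashed routing in pure Python (the shard names are invented and MongoDB's actual hashing and chunk mechanics differ; this only illustrates why a deterministic key-to-shard mapping spreads data evenly):

```python
import hashlib

SHARDS = ["shard-a", "shard-b", "shard-c"]  # invented shard names

def route(shard_key_value: str) -> str:
    """Deterministically map a shard-key value to one shard, as hashed sharding does."""
    digest = hashlib.md5(shard_key_value.encode()).hexdigest()
    return SHARDS[int(digest, 16) % len(SHARDS)]

# The same key always lands on the same shard; distinct keys spread across shards.
placement = {doc_id: route(doc_id) for doc_id in ("u1", "u2", "u3", "u4")}
```

The practical consequence for an architect: a high-cardinality, evenly distributed shard key spreads load, while a monotonically increasing key (under ranged sharding) funnels all inserts to one shard.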

Posted 1 month ago


5 - 7 years

0 - 0 Lacs

Hyderabad

Work from Office


Senior Big Data Engineer

Experience: 7-9 years. Preferred location: Hyderabad.

Must-have skills: Big Data, AWS cloud, Java/Scala/Python, CI/CD.

Good-to-have skills: relational databases (any), NoSQL databases (any), microservices/domain services/API gateways or similar, containers (Docker, Kubernetes, etc.).

Required Skills: Big Data, AWS Cloud, CI/CD, Java/Scala/Python

Posted 1 month ago


3 - 7 years

20 - 25 Lacs

Gurugram

Work from Office


Management Level: 09 - Senior Consultant
Location: Gurgaon/Bangalore/Mumbai

Must-have skills: Front-End Frameworks (React, Angular, Vue.js), Back-End Technologies (Node.js, Python/Django, Java), REST APIs, SQL & NoSQL, Cloud (AWS/Azure/GCP), DevOps Practices, CI/CD Pipelines, Web Application Development
Good-to-have skills: Docker/Kubernetes, Performance Optimization, Cloud Integrations, API Development, UI/UX Best Practices, Security Best Practices, Clean Code Practices, Microservices Architecture

Job Summary: We are seeking a talented and motivated Full-Stack Developer to join our Banking team. The ideal candidate will leverage their expertise in designing and implementing scalable web applications, ensuring optimal performance and user experience. This role involves collaborating with cross-functional teams to deliver innovative solutions that meet business needs.

Roles & Responsibilities:
- Web application development: develop and maintain web applications using modern frameworks and technologies.
- Collaboration: work closely with product managers, engineers, and other stakeholders to understand requirements and deliver impactful solutions.
- Optimization: optimize application performance and scalability to ensure seamless user experiences.
- Code quality: write clean, maintainable, and well-documented code.
- REST API development: develop and integrate robust APIs for communication between front-end and back-end systems.
- Cloud platform integration: utilize cloud platforms like AWS, Azure, or GCP to build and deploy applications and services.
- CI/CD: implement and maintain DevOps practices and CI/CD pipelines to ensure continuous delivery and integration.
- Security: follow security best practices and ensure compliance with industry standards for web application development.

Professional & Technical Skills:
- 3+ years of experience in full-stack development, focusing on front-end frameworks (e.g., React, Angular, Vue.js) and back-end technologies (e.g., Node.js, Python/Django, Java).
- Strong knowledge of REST APIs and of SQL and NoSQL databases.
- Hands-on experience with cloud platforms (AWS, Azure, or GCP).
- Proficiency in DevOps practices and the ability to implement and maintain CI/CD pipelines.
- Experience optimizing web application performance and ensuring scalability.
- Strong problem-solving and communication skills, with the ability to explain technical concepts clearly to diverse audiences.

Additional Information:
- Portfolio: candidates are expected to have a demonstrable portfolio of visualization work and web application development projects.
- Security & compliance: adherence to security best practices and data privacy standards across applications.

Qualifications:
- Experience: minimum 3+ years in full-stack development, with a focus on building scalable web applications, REST API integrations, and cloud-based solutions.
- Education: Bachelor's or Master's in Computer Science, Software Engineering, or a related discipline from a premier institute.

Posted 1 month ago


3 - 5 years

5 - 10 Lacs

Mumbai, Gurugram, Bengaluru

Work from Office


Management Level: 11 - Analyst
Location: Gurgaon/Bangalore/Mumbai

Must-have skills: Front-End Frameworks (React, Angular, Vue.js), Back-End Technologies (Node.js, Python/Django, Java), REST APIs, SQL & NoSQL, Cloud (AWS/Azure/GCP), DevOps Practices, CI/CD Pipelines, Web Application Development
Good-to-have skills: Docker/Kubernetes, Performance Optimization, Cloud Integrations, API Development, UI/UX Best Practices, Security Best Practices, Clean Code Practices, Microservices Architecture

Job Summary: We are seeking a talented and motivated Full-Stack Developer to join our Banking team. The ideal candidate will leverage their expertise in designing and implementing scalable web applications, ensuring optimal performance and user experience. This role involves collaborating with cross-functional teams to deliver innovative solutions that meet business needs.

Roles & Responsibilities:
- Web application development: develop and maintain web applications using modern frameworks and technologies.
- Collaboration: work closely with product managers, engineers, and other stakeholders to understand requirements and deliver impactful solutions.
- Optimization: optimize application performance and scalability to ensure seamless user experiences.
- Code quality: write clean, maintainable, and well-documented code.
- REST API development: develop and integrate robust APIs for communication between front-end and back-end systems.
- Cloud platform integration: utilize cloud platforms like AWS, Azure, or GCP to build and deploy applications and services.
- CI/CD: implement and maintain DevOps practices and CI/CD pipelines to ensure continuous delivery and integration.
- Security: follow security best practices and ensure compliance with industry standards for web application development.

Professional & Technical Skills:
- 3+ years of experience in full-stack development, focusing on front-end frameworks (e.g., React, Angular, Vue.js) and back-end technologies (e.g., Node.js, Python/Django, Java).
- Strong knowledge of REST APIs and of SQL and NoSQL databases.
- Hands-on experience with cloud platforms (AWS, Azure, or GCP).
- Proficiency in DevOps practices and the ability to implement and maintain CI/CD pipelines.
- Experience optimizing web application performance and ensuring scalability.
- Strong problem-solving and communication skills, with the ability to explain technical concepts clearly to diverse audiences.

Additional Information:
- Portfolio: candidates are expected to have a demonstrable portfolio of visualization work and web application development projects.
- Security & compliance: adherence to security best practices and data privacy standards across applications.

Qualifications:
- Experience: minimum 3+ years in full-stack development, with a focus on building scalable web applications, REST API integrations, and cloud-based solutions.
- Education: Bachelor's or Master's in Computer Science, Software Engineering, or a related discipline from a premier institute.

Posted 1 month ago

Apply


3 - 5 years

10 - 14 Lacs

Bengaluru

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: PySpark
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: An engineering graduate, preferably in Computer Science, with 15 years of full-time education.

Summary:
Overall 3+ years of experience working in Data Analytics projects. MUST be able to understand ETL technology code (Ab Initio) and translate it into Azure-native tools or PySpark. MUST have worked on complex projects.
Good to have:
1. Any ETL tool development experience.
2. Cloud (Azure) exposure or experience.
As an Application Lead, you will be responsible for designing, building, and configuring applications using PySpark. Your typical day will involve leading the effort to develop and deploy PySpark applications, collaborating with cross-functional teams, and ensuring timely delivery of high-quality solutions.

Roles & Responsibilities:
Lead the effort to design, build, and configure PySpark applications, acting as the primary point of contact.
Collaborate with cross-functional teams to ensure timely delivery of high-quality solutions.
Develop and deploy PySpark applications, utilizing best practices and ensuring adherence to coding standards.
Provide technical guidance and mentorship to junior team members, fostering a culture of continuous learning and improvement.
Stay updated with the latest advancements in PySpark and related technologies, integrating innovative approaches for sustained competitive advantage.

Professional & Technical Skills:
Must-have skills: Proficiency in PySpark.
Good-to-have skills: Experience with Hadoop, Hive, and other Big Data technologies.
Strong understanding of distributed computing principles and data processing frameworks.
Experience with data ingestion, transformation, and storage using PySpark.
Solid grasp of SQL and NoSQL databases, including experience with data modeling and schema design.

Additional Information:
The candidate should have a minimum of 3 years of experience in PySpark.
The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions.
This position is based at our Bengaluru office.

Qualifications: An engineering graduate, preferably in Computer Science, with 15 years of full-time education.
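For orientation, the core of the PySpark transformation work this role describes (the equivalent of df.filter(...).groupBy(...).agg(...)) can be sketched in plain Python; PySpark's own API is deliberately not reproduced here, and the field names and threshold are illustrative assumptions.

```python
from collections import defaultdict

def summarise(records, min_amount=100):
    """Keep rows above a threshold and sum amounts per region.

    A pure-Python stand-in for a PySpark filter-then-aggregate stage;
    in Spark the same logic runs partitioned across executors.
    """
    totals = defaultdict(int)
    for row in records:
        if row["amount"] >= min_amount:
            totals[row["region"]] += row["amount"]
    return dict(totals)
```

The point of the sketch is the shape of the job, not the engine: ingestion yields rows, a predicate prunes them, and a keyed aggregation produces the output table.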

Posted 1 month ago

Apply

3 - 6 years

6 - 10 Lacs

Bengaluru

Work from Office


Locations: India, Bangalore | Time type: Full time | Posted: 30+ days ago | Job requisition id: JR0035398

Job Title: Senior Software Engineer

About Skyhigh Security:
Skyhigh Security is a dynamic, fast-paced cloud company that is a leader in the security industry. Our mission is to protect the world's data, and because of this, we live and breathe security. We value learning at our core, underpinned by openness and transparency. Since 2011, organizations have trusted us to provide them with a complete, market-leading security platform built on a modern cloud stack. Our industry-leading suite of products radically simplifies data security through easy-to-use, cloud-based, Zero Trust solutions that are managed in a single dashboard, powered by hundreds of employees across the world. With offices in Santa Clara, Aylesbury, Paderborn, Bengaluru, Sydney, Tokyo and more, our employees are the heart and soul of our company.

Skyhigh Security is more than a company; here, when you invest your career with us, we commit to investing in you. We embrace a hybrid work model, creating the flexibility and freedom you need from your work environment to reach your potential. From our employee recognition program to our 'Blast Talks' learning series and team celebrations (we love to have fun!), we strive to be an interactive and engaging place where you can be your authentic self.

Role Overview:
Software development engineer with expertise in networking and security systems and applications. Strong hands-on experience programming in C/C++ and Python/Bash/other scripting languages on the Windows operating system.

In this role, you can expect to:
Write code to design, develop, maintain and implement scalable, flexible and user-friendly software modules in a given product.
Complete major portions of complex functional specs/design documents and/or maintenance assignments.
Identify and suggest solutions to problems of significant scope while generating engineering test plans from functional specification documents.
Develop secure and highly performant services and APIs.
Develop compute/memory-efficient solutions that maintain system responsiveness under normal/peak processing.
Use distributed computing to validate and process large volumes of data.
Continuously scale our systems for additional users/transactions, reducing or eliminating latency.
Collaborate with technical support and operations to deploy, monitor, and patch fixes and enhancements as necessary.
Ensure the maintainability and quality of code.
Evaluate technologies we can leverage, including open-source frameworks, libraries, and tools as applicable for new feature development.

To fly high in this role, you have:
6+ years of programming experience in an enterprise-scale environment, with strong hands-on experience programming in C/C++/Golang and Python/Bash/other scripting languages.
Strong knowledge of the TCP/IP protocol stack, HTTP, DNS, and other related protocols.
Strong hands-on development experience in networking and security systems and applications on Windows operating systems.
Strong code design, profiling and verification skills.
Strong knowledge of data structures, algorithms and designing for performance, scalability and availability.
Strong knowledge of and experience with various SQL and NoSQL databases.
Strong experience in designing and building multithreaded distributed systems.
Strong, demonstrated ability to develop code in high-volume applications and large data sets.
Experience in agile software development practices and DevOps.

It would be great if you also have:
Development experience on multiple operating systems - Windows, Linux, macOS.
Development experience in web technologies and API frameworks, such as JavaScript, CSS, REST.

Company Benefits and Perks:
We work hard to embrace diversity and inclusion and encourage everyone to bring their authentic selves to work every day. We offer a variety of social programs, flexible work hours and family-friendly benefits to all of our employees.
Retirement Plans
Medical, Dental and Vision Coverage
Paid Time Off
Paid Parental Leave
Support for Community Involvement
We're serious about our commitment to diversity, which is why we prohibit discrimination based on race, color, religion, gender, national origin, age, disability, veteran status, marital status, pregnancy, gender expression or identity, sexual orientation or any other legally protected status.

Posted 1 month ago

Apply

8 - 13 years

25 - 30 Lacs

Bengaluru

Work from Office


Education: A Bachelor's degree in Computer Science, Engineering (B.Tech, BE), or a related field such as MCA (Master of Computer Applications) is required for this role.
Experience: 8+ years in data engineering with a focus on building scalable and reliable data infrastructure.

Skills:
Language: Proficiency in Java, Python, or Scala. Prior experience in Oil & Gas, Titles & Leases, or Financial Services is a must-have.
Databases: Expertise in relational and NoSQL databases like PostgreSQL, MongoDB, Redis, and Elasticsearch.
Data Pipelines: Strong experience in designing and implementing ETL/ELT pipelines for large datasets.
Tools: Hands-on experience with Databricks, Spark, and cloud platforms.
Data Lakehouse: Expertise in data modeling, designing Data Lakehouses, and building data pipelines.
Modern Data Stack: Familiarity with the modern data stack and data governance practices.
Data Orchestration: Proficient in data orchestration and workflow tools.
Data Modeling: Proficient in modeling and building data architectures for high-throughput environments.
Stream Processing: Extensive experience with stream processing technologies such as Apache Kafka.
Distributed Systems: Strong understanding of distributed systems, scalability, and availability.
DevOps: Familiarity with DevOps practices, continuous integration, and continuous deployment (CI/CD).
Problem-Solving: Strong problem-solving skills with a focus on scalable data infrastructure.

Key Responsibilities:
This is a role with high expectations of hands-on design and development.
Design and develop systems for ingestion, persistence, consumption, ETL/ELT, and versioning for different data types (e.g. relational, document, geospatial, graph, time series) in transactional and analytical patterns.
Drive the development of applications related to data extraction, especially from formats like TIFF, PDF, and others, including OCR and data classification/categorization.
Analyze and improve the efficiency, scalability, and reliability of our data infrastructure.
Assist in the design and implementation of robust ETL/ELT pipelines for processing large volumes of data.
Collaborate with cross-functional scrum teams to respond quickly and effectively to business needs.
Work closely with data scientists and analysts to define data requirements and develop comprehensive data solutions.
Implement data quality checks and monitoring to ensure data integrity and reliability across all systems.
Develop and maintain data models, schemas, and documentation to support data-driven decision-making.
Manage and scale data infrastructure on cloud platforms, leveraging cloud-native tools and services.

Benefits:
Salary: Competitive and aligned with local standards.
Performance Bonus: According to company policy.
Benefits: Includes medical insurance and group term life insurance.
Continuous learning and development.
10 recognized public holidays.
Parental leave.
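The data-quality checks this posting mentions typically run as a validation pass before records are loaded. The following is a minimal sketch, with an entirely hypothetical rule set (required fields and non-null fields); production systems would use a dedicated framework such as Great Expectations rather than hand-rolled rules.

```python
def check_batch(records, required, non_null):
    """Return a list of (row_index, problem) findings for a batch of dicts.

    required: field names every record must contain.
    non_null: field names that, when present, must not be None.
    """
    findings = []
    for i, row in enumerate(records):
        for field in required:
            if field not in row:
                findings.append((i, f"missing field: {field}"))
            elif field in non_null and row.get(field) is None:
                findings.append((i, f"null value: {field}"))
    return findings
```

A pipeline would typically route rows with findings to a quarantine table and feed the finding counts into monitoring, so data-integrity regressions alert before they reach consumers.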

Posted 1 month ago

Apply

6 - 10 years

18 - 20 Lacs

Bengaluru

Remote


Greetings!!! We have an urgent opening for a Sr. Full Stack Developer with Java, Modern JavaScript, and Spring Boot - Remote.
Role: Sr. Full Stack Developer
Location: Remote
Duration: Long-term contract
Budget: 22 LPA
Shift Time: Rotational shift (9am to 6pm, 12pm to 9pm, and 6pm to 3am)
Joining: Immediate to 15 days
JD: Qualifications
• Proven strong experience in building robust services, preferably on a Java, Spring Boot-based cloud development stack
• Strong proficiency in modern JavaScript (ES6+) and TypeScript
• Extensive experience in building complex frontend applications using React and related libraries such as Redux or Context API
• Experience with UI frameworks such as Bootstrap, Material-UI, or Ant Design
• Strong knowledge of Node.js and experience in creating RESTful APIs and microservices
• Solid understanding of object-oriented programming, design and architectural patterns, messaging and event-based systems, and REST API design
• Experience with multiple architecture styles, including API-first and microservices architectures
• Experience in architecting and building large-scale systems using a scale-out architecture that requires high availability, performance, high scalability and multi-tenancy
• Experience working on cloud-based SaaS/PaaS products
• Understanding of web frontends, including the HTML DOM, CSS, and event scripting
• Experience using Kubernetes, Docker, API Gateways, Service Mesh and related technologies
• Hands-on experience working within the agile process and CI/CD frameworks such as GitHub Actions, Opsera, DevOps
• Ability to transition between programming languages and toolsets
• Ability to effectively communicate new ideas and design tradeoffs
• Must have: Java, Spring Boot, PostgreSQL/NoSQL DB, JPA/Hibernate - 6+ years of experience
If you're interested, please send your resume to suhas@iitjobs.com.

Posted 1 month ago

Apply

1 - 5 years

12 - 17 Lacs

Hyderabad

Work from Office


Job Area: Information Technology Group, Information Technology Group > IT Data Engineer

General Summary:
The developer will play an integral role in the PTEIT Machine Learning Data Engineering team: design, develop and support data pipelines in a hybrid cloud environment to enable advanced analytics, and design, develop and support CI/CD of data pipelines and services.
- 5+ years of experience with Python or equivalent programming using OOP, data structures and algorithms
- Develop new services in AWS using serverless and container-based services
- 3+ years of hands-on experience with the AWS suite of services (EC2, IAM, S3, CDK, Glue, Athena, Lambda, Redshift, Snowflake, RDS)
- 3+ years of expertise in scheduling data flows using Apache Airflow
- 3+ years of strong data modelling (functional, logical and physical) and data architecture experience in Data Lake and/or Data Warehouse
- 3+ years of experience with SQL databases
- 3+ years of experience with CI/CD and DevOps using Jenkins
- 3+ years of experience with event-driven architecture, especially change data capture
- 3+ years of experience in Apache Spark, SQL, Redshift (or BigQuery or Snowflake), Databricks
- Deep understanding of building efficient data pipelines with data observability, data quality, schema drift, alerting and monitoring
- Good understanding of data catalogs, data governance, compliance, security, and data sharing
- Experience in building reusable services across data processing systems
- Ability to work and contribute beyond defined responsibilities
- Excellent communication and interpersonal skills with deep problem-solving skills

Minimum Qualifications:
3+ years of IT-related work experience with a Bachelor's degree in Computer Engineering, Computer Science, Information Systems or a related field, OR 5+ years of IT-related work experience without a Bachelor's degree.
2+ years of any combination of academic or work experience with programming (e.g., Java, Python).
1+ year of any combination of academic or work experience with SQL or NoSQL databases.
1+ year of any combination of academic or work experience with data structures and algorithms.
5 years of industry experience, with a minimum of 3 years of experience in data engineering development at highly reputed organizations.
- Proficiency in Python and AWS
- Excellent problem-solving skills
- Deep understanding of data structures and algorithms
- Proven experience in building cloud-native software, preferably with the AWS suite of services
- Proven experience in designing and developing data models using RDBMS (Oracle, MySQL, etc.)

Desirable:
- Exposure to or experience in other cloud platforms (Azure and GCP)
- Experience working on the internals of large-scale distributed systems and databases such as Hadoop and Spark
- Working experience on data lakehouse platforms (Onehouse, Databricks Lakehouse)
- Working experience with data lakehouse file formats (Delta Lake, Iceberg, Hudi)
Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.
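At its simplest, the change-data-capture pattern this posting emphasises means diffing a keyed view of a table against its previous state and emitting insert/update/delete events. The sketch below shows that core idea in plain Python; real CDC tooling (e.g. Debezium or AWS DMS, whose APIs are not shown here) reads the database's write-ahead log instead of comparing snapshots.

```python
def capture_changes(old, new):
    """Diff two {key: row} snapshots into (event, key, row) change events."""
    events = []
    for key, row in new.items():
        if key not in old:
            events.append(("insert", key, row))
        elif old[key] != row:
            events.append(("update", key, row))
    for key in old:
        if key not in new:
            events.append(("delete", key, old[key]))
    return events
```

In an event-driven architecture these change events would be published to a stream (Kafka, Kinesis) so downstream consumers stay in sync without polling the source tables.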

Posted 1 month ago

Apply

1 - 5 years

6 - 11 Lacs

Pune

Work from Office


About The Role
Job Title: Data Engineer for Private Bank One Data Platform on Google Cloud
Corporate Title: Associate
Location: Pune, India

Role Description:
As part of one of the internationally staffed agile teams of the Private Bank One Data Platform, you are part of the "TDI PB Germany Enterprise & Data" division. The focus here is on the development, design, and provision of different solutions in the field of data warehousing, reporting and analytics for the Private Bank, to ensure that necessary data is provided for operational and analytical purposes. The PB One Data Platform is the new strategic data platform of the Private Bank and uses the Google Cloud Platform as its basis. With Google as a close partner, we are following Deutsche Bank's cloud strategy with the aim of transferring or rebuilding a significant share of today's on-prem applications to the Google Cloud Platform.

What we'll offer you:
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
Best-in-class leave policy.
Gender-neutral parental leaves.
100% reimbursement under childcare assistance benefit (gender neutral).
Sponsorship for industry-relevant certifications and education.
Employee Assistance Program for you and your family members.
Comprehensive hospitalization insurance for you and your dependents.
Accident and term life insurance.
Complementary health screening for those 35 yrs. and above.

Your key responsibilities:
Work within software development applications as a Data Engineer to provide fast and reliable data solutions for warehousing, reporting, and Customer and Business Intelligence solutions.
Partner with service/backend engineers to integrate data provided by legacy IT solutions into the databases you design, and make it accessible to the services consuming those data.
Focus on the design and setup of databases, data models, data transformations (ETL), and critical online banking business processes in the context of Customer Intelligence, financial reporting and performance controlling.
Contribute to data harmonization as well as data cleansing.
Bring a passion for constantly learning and applying new technologies and programming languages in a constantly evolving environment.
Build solutions that are highly scalable and can be operated flawlessly under high-load scenarios.
Together with your team, you will run and develop your application self-sufficiently.
You'll collaborate with Product Owners as well as team members on the design and implementation of data analytics solutions, and act as support during the conception of products and solutions.
When you see a process running with high manual effort, you'll fix it to run automated, optimizing not only our operating model but also giving yourself more time for development.

Your skills and experience:
Mandatory skills:
Hands-on development work building scalable data engineering pipelines and other data engineering/modelling work using Java/Python.
Excellent knowledge of SQL and NoSQL databases.
Experience working in a fast-paced and agile work environment.
Working knowledge of public cloud environments.
Preferred skills:
Experience in Dataflow (Apache Beam), Cloud Functions, or Cloud Run.
Knowledge of workflow management tools such as Apache Airflow/Composer.
Demonstrated ability to write clear code that is well-documented and stored in a version control system (GitHub).
Knowledge of GCS buckets, Google Pub/Sub, and BigQuery.
Knowledge of ETL processes in a Data Warehouse/Data Lake environment and how to automate them.
Nice to have:
Knowledge of provisioning cloud resources using Terraform.
Knowledge of shell scripting.
Experience with Git, CI/CD pipelines, Docker, and Kubernetes.
Knowledge of Google Cloud Monitoring & Alerting.
Knowledge of Cloud Run, Dataform, and Cloud Spanner.
Knowledge of the Data Vault 2.0 Data Warehouse methodology.
Knowledge of New Relic.
Excellent analytical and conceptual thinking.
Excellent communication skills, strong independence and initiative, and the ability to work in agile delivery teams.
Good communication and experience in working with distributed teams (especially Germany + India).

How we'll support you:
Training and development to help you excel in your career.
Coaching and support from experts in your team.
A culture of continuous learning to aid progression.
A range of flexible benefits that you can tailor to suit your needs.

About us and our teams:
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
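The warehousing work described above follows the extract, transform, load pattern. As a tool-agnostic sketch (in practice this would run as a Dataflow/Apache Beam job or an Airflow/Composer task, whose APIs are not reproduced here), with illustrative field names:

```python
def extract(rows):
    """Extract: yield raw rows from a source (a list stands in for a table)."""
    yield from rows

def transform(rows):
    """Transform: harmonise names and drop rows with no customer id."""
    for row in rows:
        if row.get("customer_id") is not None:
            yield {**row, "name": row["name"].strip().title()}

def load(rows, sink):
    """Load: write transformed rows to a sink; return the row count."""
    count = 0
    for row in rows:
        sink.append(row)
        count += 1
    return count
```

Because each stage is a generator, rows stream through one at a time, which is the same composition model Beam pipelines scale out across workers.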

Posted 1 month ago

Apply

- 1 years

3 - 7 Lacs

Bengaluru

Work from Office


Required Experience: 0 - 1 Years
Skills: DataOps
Strong proficiency in MySQL 5.x database management
Decent experience with recent versions of MySQL
Understanding of MySQL's underlying storage engines, such as InnoDB and MyISAM
Tuning of MySQL parameters
Administration of MySQL and monitoring of performance
Experience with master-master replication configuration in MySQL and troubleshooting replication
Proficiency in writing complex queries, stored procedures, triggers, and event schedulers
Strong Linux/Unix shell scripting skills
Familiarity with other SQL/NoSQL databases such as MongoDB is desirable.
* Install, deploy and manage MongoDB on physical, virtual, and AWS EC2 instances
* Should have experience with MongoDB active-active sharded cluster setup with high availability
* Should have experience in administering MongoDB on the Linux platform
* Experience with MongoDB version upgrades, preferably from version 4.0 to 4.4, on a production environment with zero or very minimal application downtime, either with Ops Manager or a custom script
* Good understanding of and experience with MongoDB sharding and disaster recovery plans
* Knowledge of cloud technologies is an added advantage
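As a small, self-contained illustration of the trigger skills this posting asks for, here is an audit-log trigger firing on updates, written against Python's built-in sqlite3 module. SQLite stands in for MySQL here purely so the example runs anywhere; MySQL's trigger syntax differs in details (DELIMITER handling, FOR EACH ROW), and the table names are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE accounts(id INTEGER PRIMARY KEY, balance INTEGER);
    CREATE TABLE audit(account_id INTEGER, old_balance INTEGER, new_balance INTEGER);
    -- Trigger: record every balance change in the audit table.
    CREATE TRIGGER log_balance AFTER UPDATE OF balance ON accounts
    BEGIN
        INSERT INTO audit VALUES (OLD.id, OLD.balance, NEW.balance);
    END;
""")
conn.execute("INSERT INTO accounts VALUES (1, 100)")
conn.execute("UPDATE accounts SET balance = 250 WHERE id = 1")
```

The update fires the trigger, so the audit table now holds one row recording the old and new balances without the application code having to write it explicitly.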

Posted 1 month ago

Apply

2 - 6 years

12 - 16 Lacs

Pune

Work from Office


As a Data Engineer, you will develop, maintain, evaluate and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the Azure Cloud Data Platform.

Responsibilities:
Build data pipelines to ingest, process, and transform data from files, streams and databases.
Process data with Spark, Python, PySpark, and Hive, HBase or other NoSQL databases on the Azure Cloud Data Platform or HDFS.
Develop efficient software code for multiple use cases built on the platform, leveraging the Spark framework with Python or Scala and Big Data technologies.
Develop streaming pipelines.
Work with Hadoop/Azure ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data/cloud technologies such as Apache Spark, Kafka, and cloud computing services.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
Minimum 4+ years of experience in Big Data technologies with extensive data engineering experience in Spark with Python or Scala.
Minimum 3 years of experience on cloud data platforms on Azure.
Experience with Databricks, Azure HDInsight, Azure Data Factory, Synapse, and SQL Server DB.
Good to excellent SQL skills.
Exposure to streaming solutions and message brokers like Kafka.

Preferred technical and professional experience:
Certification in Azure, and Databricks or Cloudera Spark certified developer.
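Streaming pipelines of the kind mentioned above (Spark Structured Streaming reading from Kafka) commonly reduce to windowed aggregation over event time. This is a plain-Python sketch of a tumbling window count; the window size and event fields are illustrative, and the real engines add watermarking and incremental state that are not shown here.

```python
def tumbling_counts(events, window_secs=60):
    """Count events per (window_start, key) for (timestamp, key) pairs.

    A tumbling window assigns each event to exactly one fixed-size,
    non-overlapping interval based on its event-time timestamp.
    """
    counts = {}
    for ts, key in events:
        window_start = ts - (ts % window_secs)
        counts[(window_start, key)] = counts.get((window_start, key), 0) + 1
    return counts
```

In Spark the equivalent is roughly df.groupBy(window("ts", "1 minute"), "key").count(), but the bucketing arithmetic is the same.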

Posted 1 month ago

Apply