
909 NoSQL Databases Jobs - Page 37

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3 - 7 years

6 - 10 Lacs

Mumbai, Gurugram, Bengaluru

Work from Office

Management Level: 11 - Analyst
Location: Gurgaon/Bangalore/Mumbai
Must have skills: Front-End Frameworks (React, Angular, Vue.js), Back-End Technologies (Node.js, Python/Django, Java), REST APIs, SQL & NoSQL, Cloud (AWS/Azure/GCP), DevOps Practices, CI/CD Pipelines, Web Application Development
Good to have skills: Docker/Kubernetes, Performance Optimization, Cloud Integrations, API Development, UI/UX Best Practices, Security Best Practices, Clean Code Practices, Microservices Architecture

Job Summary: We are seeking a talented and motivated Full-Stack Developer to join our Banking team. The ideal candidate will leverage their expertise in designing and implementing scalable web applications, ensuring optimal performance and user experience. This role involves collaborating with cross-functional teams to deliver innovative solutions that meet business needs.

Roles & Responsibilities:
- Web Application Development: Develop and maintain web applications using modern frameworks and technologies.
- Collaboration: Work closely with product managers, engineers, and other stakeholders to understand requirements and deliver impactful solutions.
- Optimization: Optimize application performance and scalability to ensure seamless user experiences.
- Code Quality: Write clean, maintainable, and well-documented code.
- REST API Development: Develop and integrate robust APIs for communication between front-end and back-end systems.
- Cloud Platform Integration: Utilize cloud platforms like AWS, Azure, or GCP to build and deploy applications and services.
- CI/CD: Implement and maintain DevOps practices and CI/CD pipelines to ensure continuous delivery and integration.
- Security: Follow security best practices and ensure compliance with industry standards for web application development.

Professional & Technical Skills:
- 3+ years of experience in Full-Stack Development, focusing on front-end frameworks (e.g., React, Angular, Vue.js) and back-end technologies (e.g., Node.js, Python/Django, Java).
- Strong knowledge of REST APIs, SQL and NoSQL databases.
- Hands-on experience with cloud platforms (AWS, Azure, or GCP).
- Proficiency in DevOps practices and the ability to implement and maintain CI/CD pipelines.
- Experience optimizing web application performance and ensuring scalability.
- Strong problem-solving and communication skills, with the ability to explain technical concepts clearly to diverse audiences.

Additional Information:
- Portfolio: Resources are expected to have a demonstrable portfolio of visualization work and web application development projects.
- Security & Compliance: Adherence to security best practices and ensuring data privacy standards are maintained across applications.

Qualifications:
- Experience: Minimum 3+ years of experience in Full-Stack Development, with a focus on building scalable web applications, REST API integrations, and cloud-based solutions.
- Educational Qualification: Bachelor's or Master's in Computer Science, Software Engineering, or a related discipline from a premier institute.
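For candidates gauging the REST API expectation, here is a minimal sketch of the kind of endpoint work this role describes. Flask is used purely for brevity (the listing itself names Node.js, Python/Django, and Java as the expected stacks), and the resource names and in-memory store are hypothetical stand-ins for a real SQL/NoSQL backend.

```python
# Minimal REST endpoint sketch; Flask chosen only for brevity.
from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical in-memory store standing in for a SQL/NoSQL database.
accounts = {}

@app.route("/accounts/<account_id>", methods=["GET"])
def get_account(account_id):
    account = accounts.get(account_id)
    if account is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(account)

@app.route("/accounts", methods=["POST"])
def create_account():
    payload = request.get_json()
    accounts[payload["id"]] = payload
    return jsonify(payload), 201

if __name__ == "__main__":
    app.run()
```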

Posted 3 months ago

Apply

3 - 5 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: PySpark
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: An Engineering graduate, preferably in Computer Science; 15 years of full-time education

Summary: Overall 3+ years of experience working in Data Analytics projects. MUST be able to understand ETL technologies code (Ab Initio) and translate it into Azure native tools or PySpark. MUST have worked on complex projects. Good to have: 1. Any ETL tool development experience. 2. Cloud (Azure) exposure or experience. As an Application Lead, you will be responsible for designing, building, and configuring applications using PySpark. Your typical day will involve leading the effort to develop and deploy PySpark applications, collaborating with cross-functional teams, and ensuring timely delivery of high-quality solutions.

Roles & Responsibilities:
- Lead the effort to design, build, and configure PySpark applications, acting as the primary point of contact.
- Collaborate with cross-functional teams to ensure timely delivery of high-quality solutions.
- Develop and deploy PySpark applications, utilizing best practices and ensuring adherence to coding standards.
- Provide technical guidance and mentorship to junior team members, fostering a culture of continuous learning and improvement.
- Stay updated with the latest advancements in PySpark and related technologies, integrating innovative approaches for sustained competitive advantage.

Professional & Technical Skills:
- Must Have Skills: Proficiency in PySpark.
- Good To Have Skills: Experience with Hadoop, Hive, and other Big Data technologies.
- Strong understanding of distributed computing principles and data processing frameworks.
- Experience with data ingestion, transformation, and storage using PySpark.
- Solid grasp of SQL and NoSQL databases, including experience with data modeling and schema design.

Additional Information:
- The candidate should have a minimum of 3 years of experience in PySpark.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Bengaluru office.

Qualifications: An Engineering graduate, preferably in Computer Science; 15 years of full-time education
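As a rough illustration of the ingestion-transformation-storage loop this role centers on, here is a minimal PySpark batch sketch. The file paths, column names, and aggregation are hypothetical, chosen only to show the pattern.

```python
# Minimal PySpark sketch of an ingest-transform-store step.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Ingest: read raw records from a source system (path is hypothetical).
raw = spark.read.option("header", True).csv("/data/raw/transactions.csv")

# Transform: clean types, drop bad rows, aggregate per account.
summary = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount").isNotNull())
       .groupBy("account_id")
       .agg(F.sum("amount").alias("total_amount"),
            F.count("*").alias("txn_count"))
)

# Store: write the result in a columnar format for downstream use.
summary.write.mode("overwrite").parquet("/data/curated/account_summary")
```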

Posted 3 months ago

Apply

3 - 6 years

6 - 10 Lacs

Bengaluru

Work from Office

Location: India, Bangalore | Time type: Full time | Posted 30+ Days Ago | Job requisition ID: JR0035398

Job Title: Senior Software Engineer

About Skyhigh Security: Skyhigh Security is a dynamic, fast-paced cloud company that is a leader in the security industry. Our mission is to protect the world's data, and because of this, we live and breathe security. We value learning at our core, underpinned by openness and transparency. Since 2011, organizations have trusted us to provide them with a complete, market-leading security platform built on a modern cloud stack. Our industry-leading suite of products radically simplifies data security through easy-to-use, cloud-based, Zero Trust solutions that are managed in a single dashboard, powered by hundreds of employees across the world. With offices in Santa Clara, Aylesbury, Paderborn, Bengaluru, Sydney, Tokyo and more, our employees are the heart and soul of our company. Skyhigh Security is more than a company; here, when you invest your career with us, we commit to investing in you. We embrace a hybrid work model, creating the flexibility and freedom you need from your work environment to reach your potential. From our employee recognition program, to our 'Blast Talks' learning series, and team celebrations (we love to have fun!), we strive to be an interactive and engaging place where you can be your authentic self.

Role Overview: Software development engineer with expertise in networking and security systems and applications. Strong hands-on experience programming in C/C++ and Python/Bash/other scripting languages on Windows operating systems.

In this role, you can expect to:
- Write code to design, develop, maintain and implement scalable, flexible and user-friendly software modules in a given product.
- Complete major portions of complex functional specs/design documents and/or maintenance assignments.
- Identify and suggest solutions to problems of significant scope while generating engineering test plans from functional specification documents.
- Develop secure and highly performant services and APIs.
- Develop compute/memory-efficient solutions that maintain system responsiveness under normal/peak processing.
- Use distributed computing to validate and process large volumes of data.
- Continuously scale our systems for additional users/transactions, reducing/eliminating latency.
- Collaborate with technical support and operations to deploy, monitor, and patch fixes and enhancements as necessary.
- Ensure the maintainability and quality of code.
- Evaluate technologies we can leverage, including open-source frameworks, libraries, and tools as applicable for new feature development.
To fly high in this role, you have:
- 6+ years of programming experience in an enterprise-scale environment, with strong hands-on experience programming in C/C++/Golang and Python/Bash/other scripting languages
- Strong knowledge of the TCP/IP protocol stack, HTTP, DNS, and other related protocols
- Strong hands-on development experience in networking and security systems and applications on Windows operating systems
- Strong code design, profiling and verification skills
- Strong knowledge of data structures, algorithms and designing for performance, scalability and availability
- Strong knowledge of and experience with various SQL and NoSQL databases
- Strong experience in designing and building multithreaded distributed systems
- Strong, demonstrated ability to develop code in high-volume applications and large data sets
- Experience in agile software development practices and DevOps

It would be great if you also have:
- Development experience on multiple operating systems - Windows, Linux, MacOS
- Development experience in web technologies and API frameworks, such as JavaScript, CSS, REST

Company Benefits and Perks: We work hard to embrace diversity and inclusion and encourage everyone to bring their authentic selves to work every day. We offer a variety of social programs, flexible work hours and family-friendly benefits to all of our employees.
- Retirement Plans
- Medical, Dental and Vision Coverage
- Paid Time Off
- Paid Parental Leave
- Support for Community Involvement

We're serious about our commitment to diversity, which is why we prohibit discrimination based on race, color, religion, gender, national origin, age, disability, veteran status, marital status, pregnancy, gender expression or identity, sexual orientation or any other legally protected status.
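To make the protocol-stack requirement concrete, here is a tiny sketch covering DNS resolution, a TCP connection, and a hand-written HTTP/1.1 request. The hostname is hypothetical, and production code would of course use TLS and a proper HTTP client library.

```python
# DNS -> TCP -> HTTP/1.1 by hand; illustrative only.
import socket

host = "example.com"  # hypothetical target

# DNS: resolve the hostname to an IPv4 address explicitly.
ip = socket.gethostbyname(host)

# TCP: open a connection and send a minimal HTTP/1.1 request.
with socket.create_connection((ip, 80), timeout=5) as sock:
    request = f"GET / HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"
    sock.sendall(request.encode("ascii"))

    response = b""
    while chunk := sock.recv(4096):
        response += chunk

# Print the status line, e.g. "HTTP/1.1 200 OK".
print(response.split(b"\r\n", 1)[0].decode())
```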

Posted 3 months ago

Apply

8 - 13 years

25 - 30 Lacs

Bengaluru

Work from Office

Education: A Bachelor's degree in Computer Science, Engineering (B.Tech, BE), or a related field such as MCA (Master of Computer Applications) is required for this role.
Experience: 8+ years in data engineering with a focus on building scalable and reliable data infrastructure.

Skills:
- Language: Proficiency in Java, Python or Scala. Prior experience in Oil & Gas, Titles & Leases, or Financial Services is a must-have.
- Databases: Expertise in relational and NoSQL databases like PostgreSQL, MongoDB, Redis, and Elasticsearch.
- Data Pipelines: Strong experience in designing and implementing ETL/ELT pipelines for large datasets.
- Tools: Hands-on experience with Databricks, Spark, and cloud platforms.
- Data Lakehouse: Expertise in data modeling, designing Data Lakehouses, and building data pipelines.
- Modern Data Stack: Familiarity with the modern data stack and data governance practices.
- Data Orchestration: Proficient in data orchestration and workflow tools.
- Data Modeling: Proficient in modeling and building data architectures for high-throughput environments.
- Stream Processing: Extensive experience with stream processing technologies such as Apache Kafka.
- Distributed Systems: Strong understanding of distributed systems, scalability, and availability.
- DevOps: Familiarity with DevOps practices, continuous integration, and continuous deployment (CI/CD).
- Problem-Solving: Strong problem-solving skills with a focus on scalable data infrastructure.

Key Responsibilities: This is a role with high expectations of hands-on design and development.
- Design and develop systems for ingestion, persistence, consumption, ETL/ELT, and versioning for different data types (e.g., relational, document, geospatial, graph, timeseries) in transactional and analytical patterns.
- Drive the development of applications related to data extraction, especially from formats like TIFF, PDF, and others, including OCR and data classification/categorization.
- Analyze and improve the efficiency, scalability, and reliability of our data infrastructure.
- Assist in the design and implementation of robust ETL/ELT pipelines for processing large volumes of data.
- Collaborate with cross-functional scrum teams to respond quickly and effectively to business needs.
- Work closely with data scientists and analysts to define data requirements and develop comprehensive data solutions.
- Implement data quality checks and monitoring to ensure data integrity and reliability across all systems.
- Develop and maintain data models, schemas, and documentation to support data-driven decision-making.
- Manage and scale data infrastructure on cloud platforms, leveraging cloud-native tools and services.

Benefits:
- Salary: Competitive and aligned with local standards.
- Performance Bonus: According to company policy.
- Benefits: Includes medical insurance and group term life insurance.
- Continuous learning and development.
- 10 recognized public holidays.
- Parental Leave
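As an illustration of the stream-processing skill this listing names, here is a minimal consume-transform-produce loop using the kafka-python client. The topic names, broker address, group id, and enrichment step are all hypothetical.

```python
# Minimal Kafka consume-transform-produce sketch (kafka-python client).
import json
from kafka import KafkaConsumer, KafkaProducer

consumer = KafkaConsumer(
    "raw-leases",                      # hypothetical input topic
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    group_id="lease-enricher",         # hypothetical consumer group
)
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda d: json.dumps(d).encode("utf-8"),
)

for message in consumer:
    record = message.value
    # Hypothetical enrichment: normalize a field before republishing.
    record["state"] = record.get("state", "").upper()
    producer.send("enriched-leases", record)  # hypothetical output topic
```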

Posted 3 months ago

Apply

6 - 10 years

18 - 20 Lacs

Bengaluru

Remote

Greetings!!! We have an urgent opening for a Sr. Full Stack Developer with Java, Modern JavaScript, Spring Boot - Remote.

Role: Sr. Full Stack Developer
Location: Remote
Duration: Long-term Contract
Budget: 22 LPA
Shift Time: Rotational shift (9am to 6pm, 12pm to 9pm & 6pm to 3am)
Immediate to 15 days joiners

JD: Qualifications
- Proven strong experience in building robust services, preferably on a Java, Spring Boot-based cloud development stack
- Strong proficiency in modern JavaScript (ES6+) and TypeScript
- Extensive experience in building complex frontend applications using React and related libraries such as Redux or Context API
- Experience with UI frameworks such as Bootstrap, Material-UI, or Ant Design
- Strong knowledge of Node.js and experience in creating RESTful APIs and microservices
- Solid understanding of object-oriented programming; design and architectural patterns; messaging and event-based systems; and REST API design
- Experience with multiple architecture styles, including API-first and microservices architectures
- Experience in architecting and building large-scale systems using a scale-out architecture that requires high availability, performance, high scalability and multi-tenancy
- Experience working on cloud-based SaaS/PaaS products
- Understanding of web frontends, including the HTML DOM, CSS, and event scripting
- Experience using Kubernetes, Docker, API Gateways, Service Mesh and related technologies
- Hands-on experience working within the agile process and CI/CD frameworks such as GitHub Actions, Opsera, DevOps
- Ability to transition between programming languages and toolsets
- Ability to effectively communicate new ideas and design tradeoffs
- Must have: Java, Spring Boot, PostgreSQL/NoSQL DB, JPA/Hibernate - 6+ years experience

If you're interested, please send your resume to suhas@iitjobs.com.

Posted 3 months ago

Apply

1 - 5 years

12 - 17 Lacs

Hyderabad

Work from Office

Job Area: Information Technology Group, Information Technology Group > IT Data Engineer

General Summary: The developer will play an integral role in the PTEIT Machine Learning Data Engineering team, designing, developing and supporting data pipelines in a hybrid cloud environment to enable advanced analytics, along with CI/CD of data pipelines and services.
- 5+ years of experience with Python or equivalent programming using OOPS, Data Structures and Algorithms
- Develop new services in AWS using server-less and container-based services.
- 3+ years of hands-on experience with the AWS suite of services (EC2, IAM, S3, CDK, Glue, Athena, Lambda, RedShift, Snowflake, RDS)
- 3+ years of expertise in scheduling data flows using Apache Airflow
- 3+ years of strong data modelling (Functional, Logical and Physical) and data architecture experience in Data Lake and/or Data Warehouse
- 3+ years of experience with SQL databases
- 3+ years of experience with CI/CD and DevOps using Jenkins
- 3+ years of experience with event-driven architecture, especially Change Data Capture
- 3+ years of experience in Apache Spark, SQL, Redshift (or) BigQuery (or) Snowflake, Databricks
- Deep understanding of building efficient data pipelines with data observability, data quality, schema drift, alerting and monitoring
- Good understanding of Data Catalogs, Data Governance, Compliance, Security and Data Sharing
- Experience in building reusable services across data processing systems
- Ability to work and contribute beyond defined responsibilities
- Excellent communication and interpersonal skills with deep problem-solving skills

Minimum Qualifications:
- 3+ years of IT-related work experience with a Bachelor's degree in Computer Engineering, Computer Science, Information Systems or a related field, OR 5+ years of IT-related work experience without a Bachelor's degree.
- 2+ years of any combination of academic or work experience with programming (e.g., Java, Python).
- 1+ year of any combination of academic or work experience with SQL or NoSQL databases.
- 1+ year of any combination of academic or work experience with Data Structures and Algorithms.
- 5 years of industry experience and a minimum of 3 years of Data Engineering development experience with highly reputed organizations
- Proficiency in Python and AWS
- Excellent problem-solving skills
- Deep understanding of data structures and algorithms
- Proven experience in building cloud-native software, preferably with the AWS suite of services
- Proven experience in designing and developing data models using RDBMS (Oracle, MySQL, etc.)

Desirable:
- Exposure or experience in other cloud platforms (Azure and GCP)
- Experience working on the internals of large-scale distributed systems and databases such as Hadoop, Spark
- Working experience on Data Lakehouse platforms (Onehouse, Databricks Lakehouse)
- Working experience on Data Lakehouse file formats (Delta Lake, Iceberg, Hudi)

Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.
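To make the "scheduling data flows using Apache Airflow" requirement concrete, here is a minimal DAG sketch (Airflow 2.x style). The DAG id, schedule, and task bodies are hypothetical placeholders for real extract/load logic.

```python
# Minimal Airflow DAG sketch: two dependent tasks on a daily schedule.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Hypothetical extract step: pull a day's records from a source.
    print("extracting records")

def load():
    # Hypothetical load step: write curated records to the warehouse.
    print("loading records")

with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # load runs only after extract succeeds
```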

Posted 3 months ago

Apply

1 - 5 years

6 - 11 Lacs

Pune

Work from Office

About The Role:
Job Title: Data Engineer for Private Bank One Data Platform on Google Cloud
Corporate Title: Associate
Location: Pune, India

Role Description: As part of one of the internationally staffed agile teams of the Private Bank One Data Platform, you are part of the "TDI PB Germany Enterprise & Data" division. The focus here is on the development, design, and provision of different solutions in the field of data warehousing, reporting and analytics for the Private Bank, to ensure that necessary data is provided for operational and analytical purposes. The PB One Data Platform is the new strategic data platform of the Private Bank and uses the Google Cloud Platform as its basis. With Google as a close partner, we are following Deutsche Bank's cloud strategy with the aim of transferring or rebuilding a significant share of today's on-prem applications to the Google Cloud Platform.

What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leaves
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive Hospitalization Insurance for you and your dependents
- Accident and Term Life Insurance
- Complimentary health screening for those 35 yrs. and above

Your key responsibilities:
- Work within software development applications as a Data Engineer to provide fast and reliable data solutions for warehousing, reporting, and Customer and Business Intelligence solutions.
- Partner with Service/Backend Engineers to integrate data provided by legacy IT solutions into the databases you design, and make it accessible to the services consuming those data.
- Focus on the design and setup of databases, data models, data transformations (ETL), and critical online banking business processes in the context of Customer Intelligence, Financial Reporting and performance controlling.
- Contribute to data harmonization as well as data cleansing.
- Bring a passion for constantly learning and applying new technologies and programming languages in a constantly evolving environment.
- Build solutions that are highly scalable and can be operated flawlessly under high-load scenarios.
- Together with your team, run and develop your application self-sufficiently.
- Collaborate with Product Owners as well as team members regarding the design and implementation of data analytics solutions, and act as support during the conception of products and solutions.
- When you see a process running with high manual effort, fix it to run automated, optimizing not only our operating model but also giving yourself more time for development.

Your skills and experience

Mandatory Skills:
- Hands-on development work building scalable data engineering pipelines and other data engineering/modelling work using Java/Python.
- Excellent knowledge of SQL and NoSQL databases.
- Experience working in a fast-paced and Agile work environment.
- Working knowledge of public cloud environments.

Preferred Skills:
- Experience in Dataflow (Apache Beam)/Cloud Functions/Cloud Run
- Knowledge of workflow management tools such as Apache Airflow/Composer.
- Demonstrated ability to write clear code that is well-documented and stored in a version control system (GitHub).
- Knowledge of GCS Buckets, Google Pub/Sub, BigQuery
- Knowledge about ETL processes in the Data Warehouse/Data Lake environment and how to automate them.
Nice to have:
- Knowledge of provisioning cloud resources using Terraform.
- Knowledge of Shell Scripting.
- Experience with Git, CI/CD pipelines, Docker, and Kubernetes.
- Knowledge of Google Cloud Monitoring & Alerting
- Knowledge of Cloud Run, Data Form, Cloud Spanner
- Knowledge of the Data Vault 2.0 Data Warehouse solution
- Knowledge of New Relic
- Excellent analytical and conceptual thinking.
- Excellent communication skills, strong independence and initiative, ability to work in agile delivery teams.
- Good communication and experience in working with distributed teams (especially Germany + India)

How we'll support you:
- Training and development to help you excel in your career.
- Coaching and support from experts in your team.
- A culture of continuous learning to aid progression.
- A range of flexible benefits that you can tailor to suit your needs.

About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm

We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
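For a sense of the Dataflow (Apache Beam) work listed above, here is a minimal batch pipeline sketch. The bucket paths, record layout, and aggregation are hypothetical; on GCP this would run with the DataflowRunner rather than the default local DirectRunner.

```python
# Minimal Apache Beam sketch: read, clean, aggregate, write.
import apache_beam as beam

with beam.Pipeline() as pipeline:
    (
        pipeline
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/raw/*.csv")
        | "Parse" >> beam.Map(lambda line: line.split(","))
        | "FilterValid" >> beam.Filter(lambda fields: len(fields) == 3)
        | "KeyByAccount" >> beam.Map(lambda f: (f[0], float(f[2])))
        | "SumPerAccount" >> beam.CombinePerKey(sum)
        | "Format" >> beam.MapTuple(lambda acct, total: f"{acct},{total}")
        | "Write" >> beam.io.WriteToText("gs://example-bucket/curated/totals")
    )
```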

Posted 3 months ago

Apply

0 - 1 years

3 - 7 Lacs

Bengaluru

Work from Office

Required Experience: 0 - 1 Years
Skills: DataOps
- Strong proficiency in MySQL 5.x database management
- Decent experience with recent versions of MySQL
- Understanding of MySQL's underlying storage engines, such as InnoDB and MyISAM
- Tuning of MySQL parameters
- Administration of MySQL and monitoring of performance
- Experience with master-master replication configuration in MySQL and troubleshooting replication
- Proficiency in writing complex queries, stored procedures, triggers, and event scheduler
- Strong Linux/Unix shell scripting skills
- Familiarity with other SQL/NoSQL databases such as MongoDB desirable
- Install, deploy and manage MongoDB on physical, virtual and AWS EC2 instances
- Should have experience with MongoDB active-active sharded cluster setup with high availability
- Should have experience in administering MongoDB on the Linux platform
- Experience with MongoDB version upgrades, preferably from version 4.0 to 4.4, in a production environment with zero or very minimal application downtime, either with Ops Manager or a custom script
- Good understanding of and experience with MongoDB sharding and Disaster Recovery plans
- Knowledge of Cloud technologies is an added advantage
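As a small taste of the MongoDB administration work described above, here is a pymongo sketch that inspects replica-set health and a couple of the server counters a DBA would monitor. The connection string and credentials are hypothetical.

```python
# Minimal pymongo admin sketch: replica-set status and server counters.
from pymongo import MongoClient

client = MongoClient("mongodb://admin-user:secret@db1.example.com:27017/")

# replSetGetStatus reports the state of each member of the replica set.
status = client.admin.command("replSetGetStatus")
for member in status["members"]:
    print(member["name"], member["stateStr"], member["health"])

# serverStatus exposes performance counters worth monitoring.
server = client.admin.command("serverStatus")
print("connections in use:", server["connections"]["current"])
```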

Posted 3 months ago

Apply

2 - 6 years

12 - 16 Lacs

Pune

Work from Office

As a Data Engineer, you will develop, maintain, evaluate and test big data solutions. You will be involved in the development of data solutions using the Spark Framework with Python or Scala on Hadoop and the Azure Cloud Data Platform.

Responsibilities:
- Build data pipelines to ingest, process, and transform data from files, streams and databases.
- Process data with Spark, Python, PySpark and Hive, HBase or other NoSQL databases on the Azure Cloud Data Platform or HDFS.
- Develop efficient software code for multiple use cases built on the platform, leveraging the Spark Framework with Python or Scala and Big Data technologies.
- Develop streaming pipelines.
- Work with Hadoop/Azure ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data/cloud technologies such as Apache Spark, Kafka, and cloud computing services.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Minimum 4+ years of experience in Big Data technologies with extensive data engineering experience in Spark with Python or Scala
- Minimum 3 years of experience on Cloud Data Platforms on Azure
- Experience in Databricks / Azure HDInsight / Azure Data Factory, Synapse, SQL Server DB
- Good to excellent SQL skills
- Exposure to streaming solutions and message brokers like Kafka

Preferred technical and professional experience:
- Certification in Azure, and Databricks or Cloudera Spark Certified Developer
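For the streaming-pipeline responsibility above, here is a minimal PySpark Structured Streaming sketch that reads events from Kafka and lands them as Parquet. The topic, broker, and paths are hypothetical.

```python
# Minimal Spark Structured Streaming sketch: Kafka -> Parquet sink.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("example-stream").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")  # hypothetical
    .option("subscribe", "events")                      # hypothetical topic
    .load()
    .selectExpr("CAST(value AS STRING) AS payload")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "/data/streams/events")             # hypothetical sink
    .option("checkpointLocation", "/data/checkpoints/events")
    .start()
)
query.awaitTermination()
```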

Posted 3 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies