
286 DynamoDB Jobs - Page 3

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 10.0 years

7 - 12 Lacs

Pune

Work from Office

Naukri logo

As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities such as creating source-to-target pipelines/workflows and implementing solutions that address client needs. Your primary responsibilities include:
- Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements.
- Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing, data-driven organization.
- Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise:
- Must have 5+ years of experience in Big Data: Hadoop, Spark (Scala, Python), HBase, Hive.
- Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git.
- Developed Python and PySpark programs for data analysis.
- Good working experience using Python to develop a custom framework for generating rules (much like a rules engine).
- Developed Python code to gather data from HBase and designed solutions implemented using PySpark.
- Used Apache Spark DataFrames/RDDs to apply business transformations and Hive context objects to perform read/write operations.
Preferred technical and professional experience:
- Understanding of DevOps.
- Experience building scalable end-to-end data ingestion and processing solutions.
- Experience with object-oriented and/or functional programming languages such as Python, Java, and Scala.
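The posting mentions a custom Python framework for generating rules, "just like a rules engine". That pattern can be sketched in a few lines of plain Python (illustrative only; all names here are hypothetical, not from any specific framework):

```python
# Minimal rules-engine sketch: each rule pairs a predicate with an action.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    predicate: Callable[[dict], bool]   # decides whether the rule fires
    action: Callable[[dict], dict]      # transformation applied when it fires

def apply_rules(record: dict, rules: list[Rule]) -> dict:
    """Run every matching rule against a record, in declaration order."""
    for rule in rules:
        if rule.predicate(record):
            record = rule.action(record)
    return record

# Example rules: normalise a city field and flag senior candidates.
rules = [
    Rule("upper_city", lambda r: "city" in r,
         lambda r: {**r, "city": r["city"].upper()}),
    Rule("flag_senior", lambda r: r.get("experience_years", 0) >= 5,
         lambda r: {**r, "senior": True}),
]
```

A real framework would typically load predicates and actions from configuration rather than hard-coding them, but the fire-if-matching loop is the core idea.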

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Navi Mumbai

Work from Office


As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities such as creating source-to-target pipelines/workflows and implementing solutions that address client needs. Your primary responsibilities include:
- Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements.
- Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing, data-driven organization.
- Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise:
- Must have 5+ years of experience in Big Data: Hadoop, Spark (Scala, Python), HBase, Hive.
- Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git.
- Developed Python and PySpark programs for data analysis.
- Good working experience using Python to develop a custom framework for generating rules (much like a rules engine).
- Developed Python code to gather data from HBase and designed solutions implemented using PySpark.
- Used Apache Spark DataFrames/RDDs to apply business transformations and Hive context objects to perform read/write operations.
Preferred technical and professional experience:
- Understanding of DevOps.
- Experience building scalable end-to-end data ingestion and processing solutions.
- Experience with object-oriented and/or functional programming languages such as Python, Java, and Scala.

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Navi Mumbai

Work from Office


As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities such as creating source-to-target pipelines/workflows and implementing solutions that address client needs. Your primary responsibilities include:
- Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements.
- Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing, data-driven organization.
- Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise:
- Must have 5+ years of experience in Big Data: Hadoop, Spark (Scala, Python), HBase, Hive.
- Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git.
- Developed Python and PySpark programs for data analysis.
- Good working experience using Python to develop a custom framework for generating rules (much like a rules engine).
- Developed Python code to gather data from HBase and designed solutions implemented using PySpark.
- Used Apache Spark DataFrames/RDDs to apply business transformations and Hive context objects to perform read/write operations.
Preferred technical and professional experience:
- Understanding of DevOps.
- Experience building scalable end-to-end data ingestion and processing solutions.
- Experience with object-oriented and/or functional programming languages such as Python, Java, and Scala.

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Mumbai

Work from Office


As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities such as creating source-to-target pipelines/workflows and implementing solutions that address client needs. Your primary responsibilities include:
- Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements.
- Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing, data-driven organization.
- Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise:
- Must have 5+ years of experience in Big Data: Hadoop, Spark (Scala, Python), HBase, Hive.
- Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git.
- Developed Python and PySpark programs for data analysis.
- Good working experience using Python to develop a custom framework for generating rules (much like a rules engine).
- Developed Python code to gather data from HBase and designed solutions implemented using PySpark.
- Used Apache Spark DataFrames/RDDs to apply business transformations and Hive context objects to perform read/write operations.
Preferred technical and professional experience:
- Understanding of DevOps.
- Experience building scalable end-to-end data ingestion and processing solutions.
- Experience with object-oriented and/or functional programming languages such as Python, Java, and Scala.

Posted 1 week ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Kochi

Work from Office


As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform.
Responsibilities:
- Build data pipelines to ingest, process, and transform data from files, streams, and databases.
- Process data with Spark, Python, PySpark, Scala, and Hive, HBase, or other NoSQL databases on cloud data platforms (AWS) or HDFS.
- Develop efficient software code for multiple use cases built on the platform, leveraging the Spark framework with Python or Scala and big data technologies.
- Develop streaming pipelines.
- Work with Hadoop/AWS ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data and cloud technologies such as Apache Spark, Kafka, and cloud computing services.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise:
- Minimum 4+ years of experience in big data technologies, with extensive data engineering experience in Spark with Python or Scala.
- Minimum 3 years of experience on cloud data platforms on AWS.
- Experience with AWS EMR / AWS Glue / Databricks, Amazon Redshift, and DynamoDB.
- Good to excellent SQL skills.
- Exposure to streaming solutions and message brokers such as Kafka.
Preferred technical and professional experience: Certification in AWS and Databricks, or Cloudera Certified Spark developers.
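Independent of Spark, the ingest/process/transform pipelines described above all follow the extract-transform-load shape. A minimal pure-Python sketch of that shape (function names and sample data are hypothetical):

```python
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Extract: parse raw CSV rows into dictionaries."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: cast types and drop incomplete records."""
    out = []
    for row in rows:
        if row.get("amount"):                    # skip rows with no amount
            out.append({"id": row["id"], "amount": float(row["amount"])})
    return out

def load(rows: list[dict], sink: list) -> None:
    """Load: append to a target store (a list stands in for a table)."""
    sink.extend(rows)

sink: list = []
load(transform(extract("id,amount\n1,10.5\n2,\n3,7.25\n")), sink)
```

In the Spark/Glue setting the posting describes, each stage would operate on DataFrames rather than lists of dicts, but the staging is the same.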

Posted 1 week ago

Apply

5.0 - 7.0 years

7 - 9 Lacs

Bengaluru

Work from Office


As a senior SAP Consultant, you will serve as a client-facing practitioner, working collaboratively with clients to deliver high-quality solutions and acting as a trusted business advisor with a deep understanding of the SAP Accelerate delivery methodology (or equivalent) and associated work products. You will work on projects that assist clients in integrating strategy, process, technology, and information to enhance effectiveness, reduce costs, and improve profit and shareholder value. There are opportunities for you to acquire new skills, work across different disciplines, take on new challenges, and develop a comprehensive understanding of various industries. Your primary responsibilities include:
- Strategic SAP Solution Focus: Working across technical design, development, and implementation of SAP solutions for simplicity, amplification, and maintainability that meet client needs.
- Comprehensive Solution Delivery: Involvement in strategy development and solution implementation, leveraging your knowledge of SAP and working with the latest technologies.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise:
- Total 5-7+ years of experience in data management (DW, DL, data platform, lakehouse) and data engineering skills.
- Minimum 4+ years of experience in big data technologies, with extensive data engineering experience in Spark with Python or Scala.
- Minimum 3 years of experience on cloud data platforms on AWS.
- Exposure to streaming solutions and message brokers such as Kafka.
- Experience with AWS EMR / AWS Glue / Databricks, Amazon Redshift, and DynamoDB.
- Good to excellent SQL skills.
Preferred technical and professional experience: Certification in AWS and Databricks, or Cloudera Certified Spark developers.

Posted 1 week ago

Apply

5.0 - 7.0 years

7 - 9 Lacs

Bengaluru

Work from Office


As a senior SAP Consultant, you will serve as a client-facing practitioner, working collaboratively with clients to deliver high-quality solutions and acting as a trusted business advisor with a deep understanding of the SAP Accelerate delivery methodology (or equivalent) and associated work products. You will work on projects that assist clients in integrating strategy, process, technology, and information to enhance effectiveness, reduce costs, and improve profit and shareholder value. There are opportunities for you to acquire new skills, work across different disciplines, take on new challenges, and develop a comprehensive understanding of various industries. Your primary responsibilities include:
- Strategic SAP Solution Focus: Working across technical design, development, and implementation of SAP solutions for simplicity, amplification, and maintainability that meet client needs.
- Comprehensive Solution Delivery: Involvement in strategy development and solution implementation, leveraging your knowledge of SAP and working with the latest technologies.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise:
- Total 5-7+ years of experience in data management (DW, DL, data platform, lakehouse) and data engineering skills.
- Minimum 4+ years of experience in big data technologies, with extensive data engineering experience in Spark with Python or Scala.
- Minimum 3 years of experience on cloud data platforms on AWS.
- Exposure to streaming solutions and message brokers such as Kafka.
- Experience with AWS EMR / AWS Glue / Databricks, Amazon Redshift, and DynamoDB.
- Good to excellent SQL skills.
Preferred technical and professional experience: Certification in AWS and Databricks, or Cloudera Certified Spark developers.

Posted 1 week ago

Apply

4.0 - 7.0 years

5 - 15 Lacs

Mumbai

Hybrid


Role: Sr. Python FastAPI Developer
Location: Mumbai
Experience: 4 to 7 years
Technologies / Skills: Python (FastAPI), Advanced SQL, PostgreSQL, DynamoDB, Docker
Responsibilities:
- Build high-performance REST APIs and WebSockets to power web applications.
- Design, develop, and maintain scalable and efficient backend services using FastAPI for web applications.
- Coordinate with development teams to determine application requirements and integration points.
- Understand the fundamental design principles behind a scalable application and write scalable code.
- Implement security best practices to safeguard sensitive data and ensure compliance with privacy regulations.
- Own and manage all phases of the software development lifecycle: planning, design, implementation, deployment, and support.
- Build reusable, high-quality code and libraries for future use that are high-performance and can be used across multiple projects.
- Conduct code reviews and provide constructive feedback to team members.
- Stay up to date with emerging technologies and trends in Python development and the FastAPI framework.
- Ensure the reliability and correctness of FastAPI applications using Pytest.
- Define and document business requirements for complex system development or testing.
- Comfortable working with Agile / Scrum / Kanban.
- Willingness to join a distributed team operating across different time zones.
Required qualifications for Sr. Python FastAPI Developer:
- Bachelor's degree in IT, computer science, computer engineering, or similar.
- Minimum 3+ years of experience in Python (FastAPI) development.
- Strong understanding of asynchronous programming and background tasks.
- Knowledge of Pydantic, cron job schedulers, and Swagger UI for endpoints.
- Proficiency in database management systems (e.g., DynamoDB, PostgreSQL).
- Familiarity with containerization technologies such as Docker.
- Excellent verbal and written communication skills.
- Experience with version control systems (e.g., Git, GitHub Actions) is a plus.
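The "asynchronous programming and background tasks" requirement above can be illustrated with the standard library alone, no FastAPI dependency (names are hypothetical; `fetch_record` stands in for a real async database call, e.g. DynamoDB via an async client):

```python
import asyncio

async def fetch_record(key: str) -> dict:
    """Stand-in for an async I/O call such as a database lookup."""
    await asyncio.sleep(0)          # yield control, as real I/O would
    return {"key": key, "ok": True}

async def handler(keys: list[str]) -> list[dict]:
    """Fan out concurrent lookups, as an async endpoint handler might."""
    return await asyncio.gather(*(fetch_record(k) for k in keys))

results = asyncio.run(handler(["a", "b"]))
```

FastAPI routes declared with `async def` run on the same kind of event loop, so the same fan-out pattern applies inside a route handler.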

Posted 1 week ago

Apply

6.0 - 8.0 years

11 - 12 Lacs

Hyderabad

Work from Office


We are seeking a highly skilled Java Full Stack Engineer to join our dynamic development team. In this role, you will be responsible for designing, developing, and maintaining both frontend and backend components of our applications using Java and associated technologies. You will collaborate with cross-functional teams to deliver robust, scalable, and high-performing software solutions that meet our business needs. The ideal candidate will have a strong background in Java programming, experience with modern frontend frameworks, and a passion for full-stack development.
Requirements: Bachelor's degree in Computer Science, Engineering, or a related field. 6 to 8+ years of experience in full-stack development, with a strong focus on Java.
Java Full Stack Developer Roles & Responsibilities:
- Develop scalable web applications using Java (Spring Boot) for the backend and React/Angular for the frontend.
- Implement RESTful APIs to facilitate communication between frontend and backend.
- Design and manage databases using MySQL, PostgreSQL, Oracle, or MongoDB.
- Write complex SQL queries and procedures, and perform database optimization.
- Build responsive, user-friendly interfaces using HTML, CSS, and JavaScript, with frameworks like Bootstrap, React, and Angular, plus Node.js and Python integration.
- Integrate APIs with frontend components.
- Participate in designing microservices and modular architecture.
- Apply design patterns and object-oriented programming (OOP) concepts.
- Write unit and integration tests using JUnit, Mockito, Selenium, or Cypress.
- Debug and fix bugs across full-stack components.
- Use Git, Jenkins, Docker, and Kubernetes for version control, continuous integration, and deployment.
- Participate in code reviews, automation, and monitoring.
- Deploy applications on AWS, Azure, or Google Cloud platforms; use Elastic Beanstalk, EC2, S3, or Cloud Run for backend hosting.
- Work in Agile/Scrum teams; attend daily stand-ups, sprints, and retrospectives, and deliver iterative enhancements.
- Document code, APIs, and configurations; collaborate with QA, DevOps, Product Owners, and other stakeholders.
Must-Have Skills:
- Java Programming: Deep knowledge of the Java language, its ecosystem, and best practices.
- Frontend Technologies: Proficiency in HTML, CSS, JavaScript, and modern frontend frameworks like React or Angular.
- Backend Development: Expertise in developing and maintaining backend services using Java, Spring, and related technologies.
- Full Stack Development: Experience in both frontend and backend development, with the ability to work across the entire application stack.
Soft Skills:
- Problem-Solving: Ability to analyze complex problems and develop effective solutions.
- Communication Skills: Strong verbal and written communication skills to effectively collaborate with cross-functional teams.
- Analytical Thinking: Ability to think critically and analytically to solve technical challenges.
- Time Management: Capable of managing multiple tasks and deadlines in a fast-paced environment.
- Adaptability: Ability to quickly learn and adapt to new technologies and methodologies.
Interview Mode: face-to-face for candidates residing in Hyderabad; Zoom for other states.
Location: 43/A, MLA Colony, Road No. 12, Banjara Hills, 500034
Time: 2-4 pm

Posted 1 week ago

Apply

3.0 - 5.0 years

3 - 8 Lacs

Noida

Work from Office


Roles & Responsibilities:
- Proficient in Python, including GitHub and Git commands.
- Develop code based on functional specifications through an understanding of project code.
- Test code to verify it meets the technical specifications and is working as intended before submitting it to code review.
- Experience writing tests in Python using Pytest.
- Follow prescribed standards and processes as applicable to the software development methodology, including planning, work estimation, solution demos, and reviews.
- Read and understand basic software requirements.
- Assist with the implementation of a delivery pipeline, including test automation, security, and performance.
- Assist in troubleshooting and responding to production issues to ensure the stability of the application.
Must-have and mandatory:
- Very good experience in Python: Flask, SQLAlchemy, Pytest.
- Knowledge of cloud services such as AWS: Lambda, S3, DynamoDB.
- Database: PostgreSQL, MySQL, or any relational database.
- Can provide suggestions for performance improvements, strategy, etc.
- Expertise in object-oriented design and multi-threaded programming.
Total experience expected: 4-6 years
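The Pytest experience asked for above amounts to writing plain `test_*` functions with bare `assert` statements; pytest discovers and runs them automatically. A minimal sketch (the function under test is hypothetical, loosely modeled on the salary bands shown on this page):

```python
# Function under test: a stand-in for a small piece of service logic.
def parse_salary_band(band: str) -> tuple[int, int]:
    """Parse a band like '7 - 12 Lacs' into (low, high) integers."""
    low, high = band.replace("Lacs", "").split("-")
    return int(low.strip()), int(high.strip())

# Pytest collects functions named test_*; bare asserts are the convention,
# and pytest rewrites them to show both sides of a failing comparison.
def test_parse_salary_band():
    assert parse_salary_band("7 - 12 Lacs") == (7, 12)

def test_parse_single_digit():
    assert parse_salary_band("3 - 8 Lacs") == (3, 8)

# The tests are plain functions, so they also run without pytest installed.
test_parse_salary_band()
test_parse_single_digit()
```

Under pytest, the explicit calls at the bottom are unnecessary; `pytest` on the command line finds and runs every `test_*` function itself.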

Posted 1 week ago

Apply

12.0 - 15.0 years

35 - 60 Lacs

Chennai

Work from Office


AWS Solution Architect:
- Experience driving enterprise architecture for large commercial customers.
- Experience in healthcare enterprise transformation.
- Prior experience architecting cloud-first applications.
- Experience leading a customer through a migration journey and proposing competing views to drive a mutual solution.
- Knowledge of cloud architecture concepts, application deployment, and data migration.
- Ability to design high-availability applications on AWS across availability zones and regions.
- Ability to design applications on AWS taking advantage of disaster recovery design guidelines.
- Design, implement, and maintain streaming solutions using Amazon Managed Streaming for Apache Kafka (MSK).
- Monitor and manage Kafka clusters to ensure optimal performance, scalability, and uptime.
- Configure and fine-tune MSK clusters, including partitioning strategies, replication, and retention policies.
- Analyze and optimize the performance of Kafka clusters and streaming pipelines to meet high-throughput, low-latency requirements.
- Design and implement data integration solutions to stream data between various sources and targets using MSK.
- Lead data transformation and enrichment processes to ensure data quality and consistency in streaming applications.
Mandatory technical skillset:
- AWS architectural concepts: designing, implementing, and managing cloud infrastructure.
- AWS services (EC2, S3, VPC, Lambda, ELB, Route 53, Glue, RDS, DynamoDB, PostgreSQL, Aurora, API Gateway, CloudFormation, etc.)
- Kafka / Amazon MSK
Domain experience: Healthcare domain experience is required; Blues experience is preferred.
Location: Pan India
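The "partitioning strategies" mentioned for MSK tuning come down to routing each message key to a stable partition so that per-key ordering is preserved. A simplified pure-Python sketch (Kafka's default partitioner actually uses murmur2 hashing; MD5 here is only a stand-in, and the function name is hypothetical):

```python
import hashlib

def partition_for(key: str, num_partitions: int) -> int:
    """Route a message key to a stable partition index.

    Stand-in for Kafka's default key partitioner: hash the key,
    then take it modulo the partition count.
    """
    digest = hashlib.md5(key.encode()).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

# The same key always lands on the same partition, so all events for
# one entity (e.g. one order) stay in order within that partition.
p1 = partition_for("order-123", 6)
p2 = partition_for("order-123", 6)
```

This is also why changing the partition count of a live topic breaks key-to-partition stability: the modulo changes, so existing keys may map to new partitions.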

Posted 1 week ago

Apply

12.0 - 18.0 years

35 - 45 Lacs

Hyderabad, Bengaluru, Mumbai (All Areas)

Work from Office


Job Summary: We are seeking an experienced Amazon Connect Architect with 12 to 15 years of experience to design, develop, and implement scalable, reliable cloud-based contact center solutions using Amazon Connect and AWS ecosystem services. You will play a key role in translating business needs into technical solutions and lead implementation across clients or business units.
Key Responsibilities:
- Architect and design contact center solutions using Amazon Connect and AWS services such as Lambda, Lex, DynamoDB, S3, and CloudWatch.
- Lead the end-to-end implementation and configuration of Amazon Connect.
- Integrate Amazon Connect with CRMs (Salesforce, ServiceNow, etc.), ticketing systems, and third-party tools.
- Define call flows, IVR designs, routing profiles, and queue configurations.
- Implement Contact Lens, real-time metrics, and historical reporting.
- Collaborate with cross-functional teams: developers, business analysts, and project managers.
- Create technical documentation, diagrams, and handoff materials.
- Stay updated on AWS best practices and new Amazon Connect features.
- Provide technical leadership and mentorship to development and support teams.
Required Skills:
- Proven experience designing and deploying Amazon Connect solutions.
- Strong hands-on knowledge of AWS Lambda, IAM, S3, DynamoDB, Kinesis, and CloudFormation.
- Experience with Amazon Lex and AI/ML for voice bots.
- Proficiency in programming/scripting: JavaScript, Node.js.
- Familiarity with CRM integrations, especially Salesforce Service Cloud Voice.
- Understanding of telephony concepts: SIP, DID, ACD, IVR, CTI.
- Experience with CI/CD pipelines and version control (Git).
- Strong documentation and communication skills.
Preferred Skills: AWS Certified Solutions Architect or Amazon Connect accreditation.

Posted 1 week ago

Apply

8.0 - 13.0 years

22 - 30 Lacs

Hyderabad, Bengaluru, Mumbai (All Areas)

Work from Office


Location: PAN India
Experience: 8+ years
Support Model: 24x7 rotational
Role Overview:
- Handle service delivery and ensure performance across all Amazon Connect support areas.
- Oversee overall support operations, enhancements, and system updates.
- Act as the primary escalation point for incidents.
- Manage SLAs and ensure service standards are met.
- Identify process gaps and implement improvements.
- Lead and mentor junior engineers.
- Maintain relationships with internal and external stakeholders.
Skills Required:
- Deep hands-on experience with Amazon Connect.
- Strong knowledge of AWS Lambda, DynamoDB, and S3.
- In-depth understanding of contact flows, queues, routing profiles, quick connects, telephony configuration, and call routing.
- Strong troubleshooting skills for WebRTC and voice issues.
- Experience with CloudWatch, Connect metrics, and CI/CD pipelines.
- Experience integrating with Salesforce (Service Cloud Voice).
- Good documentation and process improvement capability.
- Strong leadership and communication skills.

Posted 1 week ago

Apply

3.0 - 8.0 years

15 - 25 Lacs

Hyderabad, Bengaluru, Mumbai (All Areas)

Work from Office


Location: PAN India
Experience: 3-5 years
Support Model: 24x7 rotational
Role Overview:
- Provide support on Amazon Connect-related incidents and user issues.
- Handle basic troubleshooting of voice, call routing, and UI-based configurations.
- Support change announcements and basic deployment activities.
- Coordinate with L2/L3 engineers for escalation.
- Maintain documentation and update the knowledge base.
Skills Required:
- Hands-on experience with Amazon Connect (basic flows, routing, and settings).
- Exposure to AWS Lambda, S3, and DynamoDB.
- Basic understanding of WebRTC and voice troubleshooting.
- Familiarity with CloudWatch and Connect metrics.
- Willingness to learn Salesforce integration (Service Cloud Voice).
- Strong willingness to work in a support model and take ownership.

Experience: 5-8 years
Support Model: 24x7 rotational
Role Overview:
- Provide L2-level support for Amazon Connect and associated AWS services.
- Address incidents and troubleshoot system or telephony-related issues.
- Support service delivery and ensure announced changes are implemented.
- Maintain SLAs and escalate where required.
- Contribute to documentation and improvement plans.
- Support deployment through the CI/CD pipeline.
Skills Required:
- Strong hands-on experience with Amazon Connect.
- Working knowledge of Lambda, DynamoDB, and S3.
- Good understanding of call flows, routing, and WebRTC troubleshooting.
- Familiarity with CloudWatch, Connect metrics, and CI/CD.
- Exposure to Salesforce integration helpful (Service Cloud Voice).
- Ability to work independently on issue resolution.
- Good communication and support handling.

Posted 1 week ago

Apply

5.0 - 10.0 years

15 - 25 Lacs

Hyderabad/Secunderabad, Bangalore/Bengaluru, Delhi / NCR

Hybrid


Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose, the relentless pursuit of a world that works better for people, we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.
Inviting applications for the role of Lead Consultant - Data Engineer (AWS + Python, Spark, Kafka for ETL)!
Responsibilities:
- Develop, deploy, and manage ETL pipelines using AWS services, Python, Spark, and Kafka.
- Integrate structured and unstructured data from various data sources into data lakes and data warehouses.
- Design and deploy scalable, highly available, and fault-tolerant AWS data processes using AWS data services (Glue, Lambda, Step Functions, Redshift).
- Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness.
- Implement and maintain security measures to protect data and systems within the AWS environment, including IAM policies, security groups, and encryption mechanisms.
- Migrate application data from legacy databases to cloud-based solutions (Redshift, DynamoDB, etc.) for high availability at low cost.
- Develop application programs using big data technologies like Apache Hadoop and Apache Spark, with appropriate cloud-based services such as Amazon AWS.
- Build data pipelines by building ETL (extract-transform-load) processes.
- Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data.
- Analyse business and functional requirements, which involves reviewing existing system configurations and operating methodologies as well as understanding evolving business needs.
- Analyse requirements/user stories in business meetings, assess the impact of requirements on different platforms/applications, and convert business requirements into technical requirements.
- Participate in design reviews to provide input on functional requirements, product designs, schedules, and potential problems.
- Understand the current application infrastructure and suggest cloud-based solutions that reduce operational cost and require minimal maintenance while providing high availability with improved security.
- Perform unit testing on modified software to ensure that new functionality works as expected while existing functionality continues to work unchanged.
- Coordinate with release management and other supporting teams to deploy changes to the production environment.
Qualifications we seek in you!
Minimum Qualifications:
- Experience designing and implementing data pipelines, building data applications, and performing data migration on AWS.
- Strong experience implementing data lakes using AWS services such as Glue, Lambda, Step Functions, and Redshift.
- Experience with Databricks is an added advantage.
- Strong experience in Python and SQL.
- Proven expertise in AWS services such as S3, Lambda, Glue, EMR, and Redshift.
- Advanced programming skills in Python for data processing and automation.
- Hands-on experience with Apache Spark for large-scale data processing.
- Experience with Apache Kafka for real-time data streaming and event processing.
- Proficiency in SQL for data querying and transformation.
- Strong understanding of security principles and best practices for cloud-based environments.
- Experience with monitoring tools and implementing proactive measures to ensure system availability and performance.
- Excellent problem-solving skills and the ability to troubleshoot complex issues in a distributed, cloud-based environment.
- Strong communication and collaboration skills to work effectively with cross-functional teams.
Preferred Qualifications / Skills:
- Master's degree in Computer Science, Electronics, or Electrical Engineering.
- AWS Data Engineering & Cloud certifications; Databricks certifications.
- Experience with multiple data integration technologies and cloud platforms.
- Knowledge of the Change & Incident Management process.
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 1 week ago

Apply

7.0 - 8.0 years

11 - 12 Lacs

Hyderabad

Work from Office


We are seeking a highly skilled Java Full Stack Engineer to join our dynamic development team. In this role, you will be responsible for designing, developing, and maintaining both frontend and backend components of our applications using Java and associated technologies. You will collaborate with cross-functional teams to deliver robust, scalable, and high-performing software solutions that meet our business needs. The ideal candidate will have a strong background in Java programming, experience with modern frontend frameworks, and a passion for full-stack development.
Requirements: Bachelor's degree in Computer Science, Engineering, or a related field. 7 to 8+ years of experience in full-stack development, with a strong focus on Java.
Java Full Stack Developer Roles & Responsibilities:
- Develop scalable web applications using Java (Spring Boot) for the backend and React/Angular for the frontend.
- Implement RESTful APIs to facilitate communication between frontend and backend.
- Design and manage databases using MySQL, PostgreSQL, Oracle, or MongoDB.
- Write complex SQL queries and procedures, and perform database optimization.
- Build responsive, user-friendly interfaces using HTML, CSS, and JavaScript, with frameworks like Bootstrap, React, and Angular, plus Node.js and Python integration.
- Integrate APIs with frontend components.
- Participate in designing microservices and modular architecture.
- Apply design patterns and object-oriented programming (OOP) concepts.
- Write unit and integration tests using JUnit, Mockito, Selenium, or Cypress.
- Debug and fix bugs across full-stack components.
- Use Git, Jenkins, Docker, and Kubernetes for version control, continuous integration, and deployment.
- Participate in code reviews, automation, and monitoring.
- Deploy applications on AWS, Azure, or Google Cloud platforms; use Elastic Beanstalk, EC2, S3, or Cloud Run for backend hosting.
- Work in Agile/Scrum teams; attend daily stand-ups, sprints, and retrospectives, and deliver iterative enhancements.
- Document code, APIs, and configurations; collaborate with QA, DevOps, Product Owners, and other stakeholders.
Must-Have Skills:
- Java Programming: Deep knowledge of the Java language, its ecosystem, and best practices.
- Frontend Technologies: Proficiency in HTML, CSS, JavaScript, and modern frontend frameworks like React or Angular.
- Backend Development: Expertise in developing and maintaining backend services using Java, Spring, and related technologies.
- Full Stack Development: Experience in both frontend and backend development, with the ability to work across the entire application stack.
Soft Skills:
- Problem-Solving: Ability to analyze complex problems and develop effective solutions.
- Communication Skills: Strong verbal and written communication skills to effectively collaborate with cross-functional teams.
- Analytical Thinking: Ability to think critically and analytically to solve technical challenges.
- Time Management: Capable of managing multiple tasks and deadlines in a fast-paced environment.
- Adaptability: Ability to quickly learn and adapt to new technologies and methodologies.
Interview Mode: face-to-face for candidates residing in Hyderabad; Zoom for other states.
Location: 43/A, MLA Colony, Road No. 12, Banjara Hills, 500034
Time: 2-4 pm

Posted 1 week ago

Apply

2.0 - 5.0 years

6 - 17 Lacs

Noida

Work from Office


Responsibilities:
* Design, develop, test & maintain Vue.js applications using TypeScript, Node.js, GraphQL & AWS services.
* Collaborate with cross-functional teams on API Gateway integration & DynamoDB data management.

Perks: Food allowance

Posted 1 week ago

Apply

3.0 - 6.0 years

7 - 12 Lacs

Bengaluru

Work from Office


Job Description: Intellimind is seeking a PL/SQL Developer with 3-6 years of hands-on experience to join our expanding IT team. The ideal candidate will possess robust technical knowledge of Oracle databases, be skilled in creating and configuring application pages using Oracle objects, and have actively contributed to both the design and development phases. This role involves collaborating with a team of skilled professionals to drive our database initiatives forward.

Key Responsibilities:
• Participate in the full lifecycle of Oracle PL/SQL development.
• Develop, test, and implement Oracle PL/SQL packages, tables, views, procedures, functions, and triggers.
• Create and configure application pages using Oracle or other database objects, based on the product training provided by us.
• Assist in the analysis and gathering of business requirements.
• Improve query performance and optimize queries using both optimization tools and manual methods.
• Collaborate with other team members to integrate databases with other applications.
• Maintain data integrity and security using quality procedures and Oracle features.
• Follow assigned tasks and resolve issues using ticketing tools.
• Conduct root-cause analysis of technical and application issues.
• Create documentation for technical specifications and user manuals.
• Prepare development time estimates for requirements.
• Participate in regular team meetings to discuss ongoing projects and potential roadblocks with Project Managers and the Operations team.
• Stay updated with the latest Oracle features, technologies, and best practices.
• Work closely with the Business and Operations teams, collaborating effectively to ensure alignment on project goals and priorities.

Qualifications & Skills:
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• 4-5 years of hands-on experience with Oracle PL/SQL development.
• Proficient in Oracle Database SQL and PL/SQL, with a strong working knowledge of DDL, DML, DQL, DCL, and TCL commands.
• Strong working knowledge of IDE tools like Toad for Oracle and optimization tools.
• Working experience with AWS RDS databases and NoSQL databases.
• Familiarity with database design principles and normalization.
• Strong problem-solving abilities and attention to detail.
• Ability to work both independently and as part of a collaborative team.
• Effective communication skills, both written and verbal.
• Knowledge of XML batch configuration and integration with Oracle.
• Experience with version control tools like GitHub.
• Ensure close collaboration with the Reporting Manager, internal teams, and stakeholders.

Working Hours: The shift timings for this role are flexible and may vary between 9:00 AM to 6:00 PM and 1:00 PM to 10:00 PM. Candidates should be available for both shifts as required.

Posted 1 week ago

Apply

4.0 - 9.0 years

7 - 17 Lacs

Pune, Chennai, Mumbai (All Areas)

Hybrid


Client name: Atos Syntel
Payroll Company: Compunnel Inc. in Noida has completed six months of direct collaboration with Atos.
Job Location: Pune/Mumbai/Chennai
Experience Required: 4+ years
Mode of Work: Hybrid; 3 days work from the office and 2 days work from home
Job Title: AWS Java Lead

• Should have a minimum of 4 years of experience in the software industry.
• Must have experience in Java, data structures, algorithms, Spring Boot, microservices, REST APIs, design patterns, and problem solving, plus knowledge of any cloud.
• 4+ years of experience with AWS (S3, Lambda, DynamoDB, API Gateway, etc.)
• Hands-on with engineering excellence, AWS CloudFormation, the AWS DevOps toolchain, and practices.
• Excellent problem-solving and critical thinking.
• Independent, with strong ownership of business problems and technical solutions.
• Strong communication and interpersonal skills.
• Experience with open source (Apache projects, Spring, Maven, etc.)
• Expert knowledge of the Java language, platform, ecosystem, and underlying concepts and constructs.
• Knowledge of common design patterns and design principles.
• Good knowledge of networking and security constructs specific to AWS.
• An AWS Associate or Professional Solutions Architect certification will carry additional weight.

Please fill in all the essential details given below, attach your updated resume, and send it to ralish.sharma@compunnel.com

1. Total Experience:
2. Relevant Experience in Java:
3. Relevant Experience in AWS:
4. Experience in S3/Lambda/DynamoDB/API Gateway:
5. Experience in Design Patterns:
6. Experience in Spring Boot:
7. Experience in Microservices:
8. Current Company:
9. Current Designation:
10. Highest Education:
11. Notice Period:
12. Current CTC:
13. Expected CTC:
14. Current Location:
15. Preferred Location:
16. Hometown:
17. Contact No:
18. If you have an offer from another company, please mention the offer amount and offer location:
19. Reason for looking for a change:
20. PAN Card:

If the job description is suitable for you, please get in touch with me at the number below: 9910044363.

Posted 1 week ago

Apply

2.0 - 7.0 years

10 - 20 Lacs

Bengaluru

Work from Office


Apply Here (mandatory to consider your profile): https://forms.gle/8UigBWPhwYFM6Uuu5

Hiring for a FAANG company. Note: This position is part of a program designed to support women professionals returning to the workforce after a career break (9+ months career gap).

The role is within a dynamic and fast-evolving team focused on developing innovative technology solutions to convert offline customers into online customers. The team builds scalable, strategic, and sustainable products tailored to market needs, often deploying solutions that can be adapted globally. In this position, you will lead end-to-end development of solutions, leveraging a wide range of technologies, including cloud computing platforms, big data, machine learning, mobile platforms, APIs, and modern frontend frameworks. Your responsibilities will include maintaining high code quality standards, optimizing development processes, and mentoring junior engineers. This role thrives in a fast-paced environment where delivering impactful features and products is key.

The ideal candidate is a skilled software engineer with experience building and launching distributed systems at scale. You are adaptable, thrive in dynamic and entrepreneurial environments, and are eager to mentor others. You excel at managing competing priorities, navigating ambiguity, and making data-driven decisions. Strong communication skills and the ability to influence and lead are critical for success in this role.

Key Responsibilities:
• Collaborate with senior engineers to design and deliver high-quality technology solutions.
• Contribute to the development of distributed workflows hosted in cloud-native architecture.
• Maintain operational excellence for a rapidly scaling technology stack.
• Drive innovation through patents, technical presentations, and ideation sessions.
• Play a key role in hiring and nurturing technical talent.
• Define and measure success metrics to guide the evolution of technology products.

Basic Qualifications:
• 2-8 years of professional software development experience (excluding internships).
• Proficiency in at least one software programming language.

Preferred Qualifications:
• Bachelor's degree in Computer Science or a related field (or equivalent experience).
• Coding efficiency.

Key Skills: Arrays, Graphs, User Acceptance Testing (UAT), MySQL, Oracle, DynamoDB, DNS, VPN, Unix, Java, C++, C#, Python, SQL, Kotlin, TypeScript, Greedy Algorithms, Backtracking, Infrastructure as Code (IaC), Cloud Platform Management, Serverless Computing, Continuous Integration (CI), IDE, Jenkins, Docker, Web Design

Note: Work with FAANG companies, top MNCs, and fast-growing startups. Lorvensoft connects you to premier tech roles via our recruitment partners like JCurve.

Posted 1 week ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Maharashtra

Work from Office


Technical Infrastructure: Here's just some of what we use:
• AWS (EC2, IAM, EKS, etc.), Terraform Enterprise, Docker, Kubernetes, Aurora, Mesos, HashiCorp Vault and Consul
• Datadog and PagerDuty
• Microservices architecture, Spring, Java & Node.js, React, Koa, Express.js
• Amazon RDS, DynamoDB, Postgres, Oracle, MySQL, GitHub, Jenkins, Concourse CI, JFrog Artifactory

About the role: You will constantly be asking: what are the most important infrastructure problems we need to solve for today that will increase our applications' and infrastructure's reliability and performance? You will apply your deep technical knowledge, taking a broad look at our technology infrastructure. You'll help us identify common and systematic issues and validate these, prioritizing which to strategically address first. We value collaboration, so you will partner with our SRE/DevOps team, discussing and refining your ideas and preparing proofs of concept. You will present and validate these across technology teams, figuring out the best solution, and you'll be given ownership to engineer and implement your solutions. There are lots of interesting technology problems for you to solve, so you are constantly applying the latest thinking. These include implementing canary deployments, designing a new automated pipeline solution, extending Kubernetes capabilities, implementing machine learning to build load testing, and ensuring immutability of containers. You will get to evaluate existing technologies and design the future state without being afraid to challenge the status quo. And you'll regularly review existing infrastructure, looking for opportunities to improve (e.g. service improvements, cost reduction, security, performance). You'll also get to automate everything necessary, combining reliability with a pragmatic approach, doing it right the first time. We're continuing our journey of making our code and configuration deployments self-serve for our development teams. You'll help us build and maintain the right tooling, and you'll have ownership to design and implement the infrastructure needed. You'll also be involved in the daily management of our AWS infrastructure. This means working with our Agile development teams to troubleshoot server, application, and performance issues.

Skills & Experience:
• 5 to 8 years of relevant hands-on SRE/DevOps experience in an Agile environment.
• Substantial experience with AWS services in a production environment.
• Demonstrated expertise in managing and modernizing legacy systems and infrastructure.
• You'll be able to collaborate effectively with both engineers and operations, and be comfortable recommending best practices.
• You have the expertise and skills to navigate the AWS ecosystem and will know when and where to recommend the most appropriate service and/or usage pattern.
• You have experience resolving outages, are able to quickly diagnose issues, and have been instrumental in restoring normal service levels.
• You have intellectual curiosity and an appetite to learn more.
• Strong hands-on experience working with Linux environments; Windows experience is a plus.
• Strong proficiency in scripting languages (e.g., Bash, Python) for automation and process optimization.
• Experience with CI/CD tools such as Jenkins, GitHub Actions, and preferably Concourse CI.
• Expertise in containerization technologies like Docker and orchestration tools such as Kubernetes.
• Practical experience managing event-driven systems, messaging queues, and load balancers.
• Strong understanding of monitoring, logging, and observability tools to ensure system reliability; Datadog and PagerDuty exposure is good to have.
• Proven ability to troubleshoot critical outages, identify root causes, and restore service quickly.
• Proficiency in HashiCorp technologies, including Terraform (IaC), Vault (secret management), and Consul (service discovery and config management).

You'll also have significant experience in, and/or an interest in, the following:
• Managing cloud infrastructure as code, preferably using Terraform.
• Application container management and orchestration, primarily in Kubernetes environments, preferably AWS EKS.
• Maintaining managed databases, including AWS RDS, with experience in how to tune and scale them and how performance and reliability are achieved.
• Good understanding of PKI infrastructure and CDN technologies, including AWS CloudFront.
• Expertise in AWS security, including the AWS IAM service.
• Experience with AWS Lambda and AWS SageMaker.
• Experience working with, and a strong understanding of, firewalls and network and application load balancing.
• A strong and informed point of view with respect to monitoring tools and how best to use them.
• Ability to work in cloud-based environments spanning multiple AWS accounts, including account management and integration.
• An analytical mindset with a passion for identifying and solving infrastructure bottlenecks.

Posted 1 week ago

Apply

8.0 - 9.0 years

11 - 12 Lacs

Hyderabad

Work from Office


We are seeking a highly skilled Java Full Stack Engineer to join our dynamic development team. In this role, you will be responsible for designing, developing, and maintaining both frontend and backend components of our applications using Java and associated technologies. You will collaborate with cross-functional teams to deliver robust, scalable, and high-performing software solutions that meet our business needs. The ideal candidate will have a strong background in Java programming, experience with modern frontend frameworks, and a passion for full-stack development.

Requirements: Bachelor's degree in Computer Science, Engineering, or a related field. 5 to 10+ years of experience in full-stack development, with a strong focus on Java.

Java Full Stack Developer Roles & Responsibilities:
• Develop scalable web applications using Java (Spring Boot) for the backend and React/Angular for the frontend.
• Implement RESTful APIs to facilitate communication between frontend and backend.
• Design and manage databases using MySQL, PostgreSQL, Oracle, or MongoDB.
• Write complex SQL queries and procedures, and perform database optimization.
• Build responsive, user-friendly interfaces using HTML, CSS, JavaScript, and frameworks like Bootstrap, React, and Angular, with Node.js and Python integration.
• Integrate APIs with frontend components.
• Participate in designing microservices and modular architecture.
• Apply design patterns and object-oriented programming (OOP) concepts.
• Write unit and integration tests using JUnit, Mockito, Selenium, or Cypress.
• Debug and fix bugs across full-stack components.
• Use Git, Jenkins, Docker, and Kubernetes for version control, continuous integration, and deployment.
• Participate in code reviews, automation, and monitoring.
• Deploy applications on AWS, Azure, or Google Cloud platforms.
• Use Elastic Beanstalk, EC2, S3, or Cloud Run for backend hosting.
• Work in Agile/Scrum teams; attend daily stand-ups, sprints, and retrospectives, and deliver iterative enhancements.
• Document code, APIs, and configurations.
• Collaborate with QA, DevOps, Product Owners, and other stakeholders.

Must-Have Skills:
• Java Programming: Deep knowledge of the Java language, its ecosystem, and best practices.
• Frontend Technologies: Proficiency in HTML, CSS, JavaScript, and modern frontend frameworks like React or Angular.
• Backend Development: Expertise in developing and maintaining backend services using Java, Spring, and related technologies.
• Full Stack Development: Experience in both frontend and backend development, with the ability to work across the entire application stack.

Soft Skills:
• Problem-Solving: Ability to analyze complex problems and develop effective solutions.
• Communication Skills: Strong verbal and written communication skills to collaborate effectively with cross-functional teams.
• Analytical Thinking: Ability to think critically and analytically to solve technical challenges.
• Time Management: Capable of managing multiple tasks and deadlines in a fast-paced environment.
• Adaptability: Ability to quickly learn and adapt to new technologies and methodologies.

Interview Mode: F2F for candidates residing in Hyderabad; Zoom for other states
Location: 43/A, MLA Colony, Road No. 12, Banjara Hills, 500034
Time: 2 - 4 pm

Posted 1 week ago

Apply

5.0 - 10.0 years

0 - 0 Lacs

Noida

Work from Office


Responsibilities:
* Design, develop & maintain backend applications using Node.js, NestJS & Python.
* Collaborate with cross-functional teams on project delivery.

Perks: Annual bonus

Posted 1 week ago

Apply

5.0 - 10.0 years

14 - 17 Lacs

Navi Mumbai

Work from Office


As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows from source to target and implementing solutions that tackle the client's needs.

Your primary responsibilities include:
• Design, build, optimize, and support new and existing data models and ETL processes based on our client's business requirements.
• Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization.
• Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
• Must have 5+ years of experience in Big Data: Hadoop, Spark (Scala, Python), HBase, Hive.
• Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git.
• Developed Python and PySpark programs for data analysis.
• Good working experience with Python to develop a custom framework for generating rules (much like a rules engine).
• Developed Python code to gather data from HBase and designed the solution for implementation using PySpark.
• Used Apache Spark DataFrames/RDDs to apply business transformations and utilized Hive Context objects to perform read/write operations.

Preferred technical and professional experience:
• Understanding of DevOps.
• Experience in building scalable end-to-end data ingestion and processing solutions.
• Experience with object-oriented and/or functional programming languages, such as Python, Java, and Scala.

Posted 1 week ago

Apply

3.0 - 7.0 years

10 - 14 Lacs

Mumbai

Work from Office


The developer leads cloud application development/deployment for the client based on AWS development methodology, tools, and best practices. A developer's responsibility is to lead the execution of a project by working with a senior-level resource on assigned development/deployment activities, and to design, build, and maintain cloud environments focusing on uptime, access, control, and network security using automation and configuration management tools.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
• Experience with technologies like Spring Boot and Java.
• Demonstrated technical leadership experience on high-impact, customer-facing projects.
• Experience in building web applications in the Java/J2EE stack.
• Experience in a UI framework such as React JS.
• Working knowledge of any messaging system (Kafka preferred).
• Experience designing and integrating REST APIs using Spring Boot.

Preferred technical and professional experience:
• Strong experience in concurrent design and multi-threading; general experience with Object-Oriented Programming (OOP) and SQL Server/Oracle/MySQL.
• Working knowledge of Azure or AWS cloud.
• Experience in building applications in a container-based environment (Docker/Kubernetes) on AWS Cloud preferred.
• Basic knowledge of SQL or NoSQL database (Postgres, MongoDB; DynamoDB preferred) design and queries.

Posted 1 week ago

Apply

Exploring Dynamo DB Jobs in India

With the increasing demand for cloud-based solutions, Dynamo DB jobs in India are on the rise. Companies are looking for skilled professionals who can manage and optimize their NoSQL databases effectively. If you are a job seeker interested in pursuing a career in Dynamo DB, here is a guide to help you navigate the job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

These cities are known for their thriving tech industries and have a high demand for Dynamo DB professionals.

Average Salary Range

The average salary range for Dynamo DB professionals in India varies based on experience levels. Entry-level positions can expect a salary of INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 15 lakhs per annum.

Career Path

A typical career progression in Dynamo DB may look like this:

  1. Junior Developer
  2. Developer
  3. Senior Developer
  4. Tech Lead

As you gain experience and expertise in Dynamo DB, you can move up the ladder to more senior roles with increased responsibilities.

Related Skills

In addition to Dynamo DB proficiency, employers often look for candidates with the following skills:

  • AWS (Amazon Web Services)
  • NoSQL databases
  • Data modeling
  • Database optimization
  • Query optimization

Having a combination of these skills can make you a more competitive candidate in the job market.
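Data modeling is the skill interviewers probe most directly: you are typically asked to design composite partition and sort keys for a given access pattern (single-table design). A minimal sketch of the idea in plain Python, with the entity names and key format being illustrative assumptions rather than AWS requirements:

```python
# Single-table design sketch: customers and their orders in one
# DynamoDB-style table, with the entity type encoded in composite keys.
# Assumed key convention (illustrative, not mandated by AWS):
#   pk = "CUSTOMER#<id>"                 -> groups all of a customer's items
#   sk = "PROFILE" or "ORDER#<date>#<id>" -> sorts items within the partition

def customer_pk(customer_id: str) -> str:
    return f"CUSTOMER#{customer_id}"

def order_sk(order_date: str, order_id: str) -> str:
    # ISO dates sort lexicographically, so orders come back in date order.
    return f"ORDER#{order_date}#{order_id}"

def query_orders(items: list, customer_id: str) -> list:
    """Simulate Query(pk = ..., begins_with(sk, 'ORDER#')) in plain Python."""
    pk = customer_pk(customer_id)
    return sorted(
        (it for it in items if it["pk"] == pk and it["sk"].startswith("ORDER#")),
        key=lambda it: it["sk"],
    )

table = [
    {"pk": customer_pk("42"), "sk": "PROFILE", "name": "Asha"},
    {"pk": customer_pk("42"), "sk": order_sk("2024-03-01", "o9"), "total": 1200},
    {"pk": customer_pk("42"), "sk": order_sk("2024-01-15", "o3"), "total": 450},
    {"pk": customer_pk("7"), "sk": order_sk("2024-02-02", "o5"), "total": 80},
]

orders = query_orders(table, "42")
print([o["sk"] for o in orders])  # oldest order first
```

The design choice being illustrated: because a DynamoDB Query can only filter efficiently on the partition key plus a sort-key condition, the access pattern ("all orders for a customer, by date") must be baked into the key format up front.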

Interview Questions

  • What is Dynamo DB? (basic)
  • Explain the difference between SQL and NoSQL databases. (basic)
  • How does Dynamo DB ensure high availability and durability? (medium)
  • What is the partition key in Dynamo DB? (basic)
  • How does Dynamo DB handle data consistency? (medium)
  • Describe the read and write capacity units in Dynamo DB. (medium)
  • What is the difference between Provisioned and On-Demand capacity modes in Dynamo DB? (medium)
  • How can you optimize Dynamo DB performance? (medium)
  • Explain the concepts of item, attribute, and table in Dynamo DB. (basic)
  • How does secondary indexing work in Dynamo DB? (medium)
  • What is the importance of scaling in Dynamo DB? (basic)
  • How does Dynamo DB handle data replication and backups? (medium)
  • Explain the concept of Dynamo DB Streams. (medium)
  • How can you secure your data in Dynamo DB? (medium)
  • What are the different data types supported by Dynamo DB? (basic)
  • Describe the pricing model of Dynamo DB. (medium)
  • How can you monitor and troubleshoot performance issues in Dynamo DB? (medium)
  • What are the best practices for designing Dynamo DB tables? (medium)
  • How does Dynamo DB handle schema changes? (medium)
  • Explain the concept of eventual consistency in Dynamo DB. (medium)
  • How can you automate tasks in Dynamo DB using AWS SDK? (advanced)
  • Describe the differences between Dynamo DB and other NoSQL databases like MongoDB. (medium)
  • What are the limitations of Dynamo DB? (basic)
  • How does Dynamo DB handle partition keys with high request rates? (medium)
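Several of the medium-difficulty questions above (capacity units, provisioned vs. on-demand mode) reduce to simple arithmetic. A back-of-envelope sketch of the provisioned-capacity calculation, based on the publicly documented unit sizes (1 RCU = one strongly consistent read per second of an item up to 4 KB, with eventually consistent reads costing half; 1 WCU = one write per second of an item up to 1 KB):

```python
import math

def read_capacity_units(item_size_kb: float, reads_per_sec: int,
                        eventually_consistent: bool = False) -> int:
    # Item size rounds UP to the next 4 KB boundary per read.
    units_per_read = math.ceil(item_size_kb / 4)
    total = units_per_read * reads_per_sec
    if eventually_consistent:
        # Eventually consistent reads cost half a strongly consistent read.
        total = math.ceil(total / 2)
    return total

def write_capacity_units(item_size_kb: float, writes_per_sec: int) -> int:
    # Item size rounds UP to the next 1 KB boundary per write.
    return math.ceil(item_size_kb) * writes_per_sec

# 10 strongly consistent reads/sec of 6 KB items -> ceil(6/4) = 2 RCU each
print(read_capacity_units(6, 10))                              # 20
print(read_capacity_units(6, 10, eventually_consistent=True))  # 10
# 5 writes/sec of 2.5 KB items -> ceil(2.5) = 3 WCU each
print(write_capacity_units(2.5, 5))                            # 15
```

Being able to walk through this rounding behavior, and to explain when on-demand mode (pay per request, no pre-provisioned units) is the better fit, covers a good share of the capacity questions listed above.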

Closing Remark

As you prepare for Dynamo DB job interviews, make sure to brush up on your technical skills and be ready to showcase your understanding of NoSQL databases and cloud computing. With the right preparation and confidence, you can land a rewarding job in the thriving tech industry in India. Good luck!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies