1421 Dynamodb Jobs - Page 29

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 years

0 Lacs

Kozhikode, Kerala, India

On-site

Overview: 3+ years' experience in Core Java and Enterprise Java technologies, with the following skills: Core Java (8+), Spring, Spring Boot. Front-end technologies such as HTML, CSS, TypeScript, and popular JavaScript frameworks (Angular 9+), NodeJS, RxJS. Proficiency in working with RDBMS (SQL). Good knowledge of REST APIs and microservice architectures. Awareness of the DevOps (CI/CD) process: Jenkins, Docker, Kubernetes. Knowledge of cloud services (AWS Lambda, SQS, EKS, DynamoDB, Redshift, etc.). Experience with the following tools: IntelliJ, Maven, DB tools, Bitbucket, Confluence. Should be hands-on with SQL. Knowledge of the data life cycle with ETL and semantic data processing. Responsibilities: Design and develop Java applications for data ingestion. Understand the existing system, optimise the code, and develop new capabilities. Build UI components on Angular and NodeJS for admin, audit, monitoring, and self-service data ingestion. Take ownership of individual tasks end to end. Qualifications: Bachelor's or master's degree in engineering in Computer Science, Information Technology, or a similar stream. Good overall academic background.
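The listing above names DynamoDB among its cloud services. In DynamoDB single-table designs, the entity type and id are commonly packed into composite partition and sort keys so related items can be fetched with a key-prefix query. A minimal in-memory sketch of that key scheme; the tenant/order naming is purely illustrative, not from any listed role:

```python
def item_key(tenant, entity, entity_id):
    """Compose DynamoDB-style partition/sort keys for a single-table design."""
    return {"pk": f"TENANT#{tenant}", "sk": f"{entity.upper()}#{entity_id}"}

def query_prefix(items, pk, sk_prefix):
    """Mimic a Query with a begins_with(sk) condition on an in-memory item list."""
    return [i for i in items if i["pk"] == pk and i["sk"].startswith(sk_prefix)]

items = [item_key("acme", "order", "42"),
         item_key("acme", "order", "43"),
         item_key("acme", "invoice", "9")]
# Fetch every order for one tenant with a single sort-key prefix match.
orders = query_prefix(items, "TENANT#acme", "ORDER#")
```

Against a real table the same prefix condition would be expressed in a Query's key condition; the sketch only shows the key layout that makes such queries possible.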

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Introduction In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. Your Role and Responsibilities As a Data Engineer at IBM, you'll play a vital role in application design and development, and provide regular support and guidance to project teams on complex coding, issue resolution, and execution. Your primary responsibilities include: Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements. Strive for continuous improvement by testing the built solution and working under an agile framework. Discover and implement the latest technology trends to build creative solutions. Preferred Education: Master's degree. Required Technical and Professional Expertise: Design and develop data solutions; design and implement efficient data processing pipelines using AWS services such as AWS Glue, AWS Lambda, Amazon S3, and Amazon Redshift. Develop and manage ETL (Extract, Transform, Load) workflows to clean, transform, and load data into structured and unstructured storage systems. Build scalable data models and storage solutions in Amazon Redshift, DynamoDB, and other AWS services. Data integration: Integrate data from multiple sources, including relational databases, third-party APIs, and internal systems, to create a unified data ecosystem. Work with data engineers to optimize data workflows and ensure data consistency, reliability, and performance. Automation and optimization: Automate data pipeline processes to ensure efficiency. Preferred Technical and Professional Experience: Define, drive, and implement an architecture strategy and standards for end-to-end monitoring.
Partner with the rest of the technology teams, including application development, enterprise architecture, testing services, and network engineering. Good to have: experience with detection and prevention tools for company products, the platform, and customer-facing systems.
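The clean/transform/load workflow this listing describes can be sketched in plain Python; the record fields and cleaning rules below are illustrative assumptions, not part of any actual pipeline named in the role:

```python
def transform(records):
    """Clean and normalise raw records before loading (illustrative ETL step)."""
    cleaned = []
    for rec in records:
        # Drop rows missing the required key.
        if not rec.get("id"):
            continue
        cleaned.append({
            "id": str(rec["id"]).strip(),
            # Normalise the free-text field, defaulting when absent.
            "city": (rec.get("city") or "unknown").strip().lower(),
            # Coerce the numeric field, defaulting to 0 on missing input.
            "amount": float(rec.get("amount") or 0),
        })
    return cleaned

raw = [
    {"id": " 1 ", "city": "Pune ", "amount": "10.5"},
    {"id": None, "city": "Delhi", "amount": "3"},  # dropped: no id
    {"id": 2, "amount": None},                     # city and amount default
]
rows = transform(raw)
```

In a managed service such as AWS Glue the same logic would live in a job script; the pattern of validate, normalise, coerce is the portable part.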

Posted 1 week ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Description Job Purpose ICE Mortgage Technology is driving value to every customer through our effort to automate everything that can be automated in the residential mortgage industry. Our integrated solutions touch each aspect of the loan lifecycle, from the borrower's "point of thought" through e-Close and secondary solutions. Drive real automation that reduces manual workflows, increases productivity, and decreases risk. You will be working in a dynamic product development team while collaborating with other developers, management, and customer support teams. You will have an opportunity to participate in designing and developing services utilized across product lines. The ideal candidate should possess a product mentality, have a strong sense of ownership, and strive to be a good steward of his or her software. More than any concrete experience with specific technology, it is critical for the candidate to have a strong sense of what constitutes good software; be thoughtful and deliberate in picking the right technology stack; and be always open-minded to learn (from others and from failures). Responsibilities Develop high-quality data processing infrastructure and scalable services that are capable of ingesting and transforming data at huge scale from many different sources on schedule. Turn ideas and concepts into carefully designed and well-authored quality code. Articulate the interdependencies and the impact of design choices. Develop APIs to power data-driven products and external APIs consumed by internal and external customers of the data platform. Collaborate with QA, product management, engineering, and UX to achieve well-groomed, predictable results. Improve and develop new engineering processes and tools. Knowledge and Experience 3+ years of building enterprise software products. Experience in object-oriented design and development with languages such as Java, J2EE, and related frameworks.
Experience building REST-based microservices in a distributed architecture, along with cloud technologies (AWS preferred). Knowledge of Java/J2EE frameworks such as Spring Boot, microservices, JPA, JDBC, and related frameworks is a must. Built high-throughput real-time and batch data processing pipelines using Kafka in an AWS environment with AWS services like S3, Kinesis, Lambda, RDS, DynamoDB, or Redshift (should know the basics at least). Experience with a variety of data stores for unstructured and columnar data as well as traditional database systems, for example, MySQL and Postgres. Proven ability to deliver working solutions on time. Strong analytical thinking to tackle challenging engineering problems. Great energy and enthusiasm with a positive, collaborative working style and clear communication and writing skills. Experience working in a DevOps environment: "you build it, you run it". Demonstrated ability to set priorities and work in a fast-paced, dynamic team environment within a start-up culture. Experience with big data technologies and exposure to Hadoop, Spark, AWS Glue, AWS EMR, etc. (nice to have). Experience handling large data sets using technologies like HDFS, S3, Avro, and Parquet (nice to have).
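Producers for streaming systems like the Kafka/Kinesis pipelines this listing mentions typically group records into fixed-size batches before sending, to amortise per-request overhead. A minimal batching sketch; the batch size is an assumed tuning parameter, not something the role specifies:

```python
def batches(records, max_batch=3):
    """Yield fixed-size batches, as a producer might before sending to a stream."""
    batch = []
    for rec in records:
        batch.append(rec)
        if len(batch) == max_batch:
            yield batch
            batch = []
    if batch:  # flush the final partial batch
        yield batch

out = list(batches(range(7), max_batch=3))
```

Real producer clients add time-based flushing and retries on top of this shape, but the size-triggered flush is the core of the pattern.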

Posted 1 week ago

Apply

2.0 - 5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Requirements Role/Job Title: Developer Function/Department: Information Technology Job Purpose As a Backend Developer, you will play a crucial role in designing, developing, and maintaining complex backend systems. You will work closely with cross-functional teams to deliver high-quality software solutions and drive the technical direction of our projects. Your experience and expertise will be vital in ensuring the performance, scalability, and reliability of our applications. Roles and Responsibilities: Solid understanding of backend performance optimization and debugging. Formal training or certification in software engineering concepts, with proficient applied experience. Strong hands-on experience with Python. Experience in developing microservices using Python with FastAPI. Commercial experience in both backend and frontend engineering. Hands-on experience with AWS cloud-based application development, including EC2, ECS, EKS, Lambda, SQS, SNS, RDS (Aurora MySQL & Postgres), DynamoDB, EMR, and Kinesis. Strong engineering background in machine learning, deep learning, and neural networks. Experience with a containerized stack using Kubernetes or ECS for development, deployment, and configuration. Experience with Single Sign-On/OIDC integration and a deep understanding of OAuth, JWT/JWE/JWS. Knowledge of AWS SageMaker and data analytics tools. Proficiency in frameworks such as TensorFlow, PyTorch, or similar. Educational Qualification (Full-time): Bachelor of Technology (B.Tech) / Bachelor of Science (B.Sc) / Master of Science (M.Sc) / Master of Technology (M.Tech) / Bachelor of Computer Applications (BCA) / Master of Computer Applications (MCA) Experience: 2-5 years
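The listing above asks for a deep understanding of JWT. A compact JWT is three base64url segments (header, payload, signature) joined by dots; the sketch below decodes the first two with the standard library only, performs no signature verification, and uses a hand-built unsigned token purely for illustration:

```python
import base64
import json

def b64url_decode(seg):
    # JWT segments are base64url without padding; restore padding before decoding.
    return base64.urlsafe_b64decode(seg + "=" * (-len(seg) % 4))

def b64url_encode(obj):
    raw = json.dumps(obj, separators=(",", ":")).encode()
    return base64.urlsafe_b64encode(raw).rstrip(b"=").decode()

def decode_jwt(token):
    """Split a compact JWT and decode header and payload (no signature check)."""
    header_b64, payload_b64, _sig = token.split(".")
    return json.loads(b64url_decode(header_b64)), json.loads(b64url_decode(payload_b64))

# Hand-built unsigned token for illustration only; production code must verify
# the signature with a vetted library before trusting any claims.
token = ".".join([b64url_encode({"alg": "none", "typ": "JWT"}),
                  b64url_encode({"sub": "user-1"}), ""])
header, payload = decode_jwt(token)
```

The padding arithmetic matters because standard base64 decoders reject unpadded input, while the JWT wire format strips the `=` characters.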

Posted 2 weeks ago

Apply

7.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

We’re seeking a highly skilled Senior Fullstack Engineer with deep expertise in TypeScript, a strong preference for React on the frontend and NestJS on the backend, and rock-solid software-engineering fundamentals. You’ll balance frontend and backend work, contribute ideas when new technical challenges arise, and help deliver features quickly and reliably. Familiarity with AWS and modern event-driven patterns is valuable, but you won’t own infrastructure or DevOps pipelines. Key Responsibilities: Design, develop, and maintain scalable fullstack applications using TypeScript (React + NestJS preferred). Comfortably tackle complex sprint tickets and help teammates unblock issues, delivering high-quality solutions efficiently. Propose and discuss technical approaches with the team when new problems surface. Collaborate closely with designers, product managers, and engineers to ship high-quality features. Write clean, testable, maintainable code and participate in code reviews. Deploy and troubleshoot applications in AWS-based environments. Qualifications: 7+ years of professional experience across frontend and backend development. Advanced proficiency in TypeScript with significant React and NestJS experience. Strong foundations in design patterns, automated testing, clean architecture, and SOLID principles. Experience with relational databases (e.g., PostgreSQL) and ORMs such as Prisma or TypeORM. Practiced in writing and maintaining automated tests (e.g., Jest, Playwright, Cypress). Fluent English—clear, efficient verbal and written communication. Experience deploying applications to AWS and understanding common services (Lambda, S3, DynamoDB, API Gateway, IAM). Comfortable working in Agile environments, with a strong sense of ownership and accountability for quality and performance. Preferred Qualifications: Familiarity with event-driven architectures, including tools and patterns such as Kafka / Amazon MSK, SNS + SQS fan-out, and Amazon EventBridge. 
Experience building microservices or modular monoliths and understanding their trade-offs. Familiarity with CI/CD pipelines (including GitHub Actions) and infrastructure-as-code tooling. Awareness of application-security best practices and performance-tuning techniques. Experience with GraphQL, WebSockets, or other real-time communication patterns. Exposure to additional tech stacks and a demonstrated eagerness to learn new technologies quickly. About Us: TechAhead is a global digital transformation company with a strong presence in the USA and India. We specialize in AI-first product design thinking and bespoke development solutions. With over 15 years of proven expertise, we have partnered with Fortune 500 companies and leading global brands to drive digital innovation and deliver excellence. At TechAhead, we are committed to continuous learning, growth, and crafting tailored solutions that meet the unique needs of our clients. Join us to shape the future of digital innovation worldwide and drive impactful results with cutting-edge AI tools and strategies!

Posted 2 weeks ago

Apply

3.0 - 5.0 years

0 Lacs

Hyderābād

On-site

Job requisition ID: 82489 Date: Jun 4, 2025 Location: Hyderabad Designation: Associate Director Entity: Job Overview: We are seeking a skilled and motivated Cloud Database Administrator (Cloud DBA) to join our growing team. The ideal candidate will be responsible for the administration, optimization, and management of cloud-based database platforms across major providers like Amazon Web Services (AWS), Microsoft Azure, and Google Cloud. As a Cloud DBA, you will work closely with the infrastructure, development, and DevOps teams to ensure that the cloud databases are secure, scalable, and performant. You will have the opportunity to implement cutting-edge cloud database solutions and work with large-scale, mission-critical systems. Key Responsibilities: Cloud Database Administration: Administer and manage cloud-based databases across AWS, Azure, and Google Cloud, including databases like Amazon RDS, Azure SQL Database, Google Cloud SQL, and NoSQL databases like DynamoDB, Cosmos DB, and MongoDB. Manage database instances in cloud environments, including provisioning, scaling, patching, and backups. Collaborate with cloud architecture teams to ensure the database infrastructure is aligned with best practices for scalability, security, and performance. Performance Tuning & Optimization: Continuously monitor cloud database performance, including query optimization, resource usage, and storage optimization. Implement proactive tuning techniques to ensure high availability and efficient performance of cloud-based databases. Review and optimize SQL queries, indexes, and database structures for optimal performance in the cloud. High Availability & Disaster Recovery: Design and implement high-availability solutions in the cloud, such as automated failover, replication, and clustering for cloud-based databases. Implement robust disaster recovery plans with real-time backup and restore procedures to minimize downtime.
Regularly test and update disaster recovery strategies to ensure quick and efficient data restoration during emergencies. Cloud Database Security & Compliance: Enforce cloud database security policies, including data encryption, secure user access management, and access control configurations. Implement secure cloud database environments by applying cloud provider security tools and best practices (e.g., IAM policies, encryption, VPC configurations). Ensure compliance with regulations and industry standards (e.g., GDPR, HIPAA, PCI-DSS) within the cloud databases and perform regular security audits. Automation & Infrastructure as Code: Leverage cloud-native tools and frameworks (e.g., AWS CloudFormation, Azure Resource Manager, Terraform) to automate database provisioning, scaling, and management. Develop and maintain automation scripts for routine tasks, such as backups, health checks, and monitoring. Implement infrastructure as code (IaC) practices for reproducible and scalable cloud database deployments. Database Monitoring & Reporting: Utilize cloud-native monitoring tools such as Amazon CloudWatch, Azure Monitor, or Google Stackdriver to track database performance, availability, and health. Design and maintain custom dashboards and alerting systems for proactive database management. Generate regular performance and health reports to provide insights into database operations and guide improvements. Collaboration & Support: Collaborate with application development teams to design, deploy, and maintain cloud-native database solutions that meet business needs. Provide second- and third-level support for database-related issues, troubleshooting performance or security problems in the cloud environment. Assist with cloud database migrations from on-premises systems to cloud environments, including hybrid and multi-cloud strategies. 
Innovation & Continuous Improvement: Stay current with the latest trends, tools, and technologies in cloud computing and database management. Propose and implement database innovations that improve performance, security, and cost-efficiency in the cloud. Continuously optimize cloud database costs by monitoring and optimizing resource usage and storage. Experience: At least 3-5 years of experience as a Database Administrator, with a strong focus on cloud-based databases. Hands-on experience with major cloud platforms such as AWS, Microsoft Azure, or Google Cloud, and their database services (Amazon RDS, Azure SQL Database, Google Cloud SQL, etc.). Experience with database migration to the cloud and managing hybrid cloud environments. Familiarity with cloud-based NoSQL databases, including DynamoDB, Cosmos DB, or Firebase. Technical Skills: Strong expertise in managing relational databases (SQL Server, MySQL, PostgreSQL) and NoSQL databases (MongoDB, DynamoDB, Cosmos DB). Familiarity with cloud-native database tools and services for high availability, backups, security, and scaling. Proficient with SQL, database performance tuning, and optimization techniques. Strong scripting skills (e.g., Python, PowerShell, Bash) and experience with cloud automation tools (Terraform, CloudFormation, Ansible). Understanding of cloud security principles (encryption, VPCs, IAM, security groups, access control). Experience with cloud monitoring and logging services (Amazon CloudWatch, Azure Monitor, Google Stackdriver). Certifications (Preferred): AWS Certified Database - Specialty. Microsoft Certified: Azure Database Administrator Associate (DP-300). Google Professional Cloud Database Engineer. AWS Certified Solutions Architect – Associate. Microsoft Certified: Azure Solutions Architect Expert. Key Attributes: Problem-Solving: Strong troubleshooting skills and the ability to analyze complex database problems and implement efficient solutions. 
Adaptability: Comfort with cloud technologies, willingness to learn new tools and platforms. Collaboration: Ability to work effectively with cross-functional teams, including development, infrastructure, and security teams. Attention to Detail: Commitment to database security, data integrity, and performance monitoring. Communication: Clear communication skills for liaising with technical and non-technical stakeholders.
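The automation of routine backup tasks this DBA listing describes often reduces to a retention policy: keep the N most recent snapshots and flag the rest for deletion. A minimal sketch in plain Python; the snapshot names, dates, and keep-count are illustrative assumptions, not values from the role:

```python
from datetime import date

def prune(backups, keep=3):
    """Return (kept, expired) backup names under a keep-most-recent policy."""
    ordered = sorted(backups, key=lambda b: b["taken"], reverse=True)
    return ([b["name"] for b in ordered[:keep]],
            [b["name"] for b in ordered[keep:]])

# Five daily snapshots, deliberately out of order to show the sort matters.
snaps = [{"name": f"db-{d}", "taken": date(2025, 6, d)} for d in (1, 3, 2, 5, 4)]
kept, expired = prune(snaps, keep=3)
```

A production script would call a cloud SDK to list and delete snapshots, but the selection logic is the part worth getting right and testing first.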

Posted 2 weeks ago

Apply

0 years

0 Lacs

Delhi

On-site

Job requisition ID: 78129 Date: Jun 4, 2025 Location: Delhi Designation: Consultant Entity: Your potential, unleashed. India's impact on the global economy has increased at an exponential rate, and Deloitte presents an opportunity to unleash and realize your potential amongst cutting-edge leaders and organizations shaping the future of the region, and indeed, the world beyond. At Deloitte, you can bring your whole self to work, every day. Combine that with our drive to propel with purpose and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters. The Team Deloitte's Technology & Transformation practice can help you uncover and unlock the value buried deep inside vast amounts of data. Our global network provides strategic guidance and implementation services to help companies manage data from disparate sources and convert it into accurate, actionable information that can support fact-driven decision-making and generate an insight-driven advantage. Our practice addresses the continuum of opportunities in business intelligence & visualization, data management, performance management, and next-generation analytics and technologies, including big data, cloud, cognitive, and machine learning. Your work profile: As an Analyst/Consultant/Senior Consultant in our T&T Team you'll build and nurture positive working relationships with teams and clients with the intention to exceed client expectations: - Design, develop, and deploy solutions using different tools, design principles, and conventions. Configure robotics processes and objects using core workflow principles in an efficient way; ensure they are easily maintainable and easy to understand. Understand existing processes and facilitate change requirements as part of a structured change control process. Solve day-to-day issues arising while running robotics processes and provide timely resolutions.
Maintain proper documentation for the solutions, test procedures, and scenarios during the UAT and production phases. Coordinate with process owners and the business to understand the as-is process and design the automation process flow. Desired Qualifications Good hands-on experience in GCP services including BigQuery, Cloud Storage, Dataflow, Cloud Dataproc, Cloud Composer/Airflow, and IAM. Proficient experience with GCP databases: Bigtable, Spanner, Cloud SQL, and AlloyDB. Proficiency in SQL, Python, Java, or Scala for data processing and scripting. Experience in development and test automation processes through the CI/CD pipeline (Git, Jenkins, SonarQube, Artifactory, Docker containers). Experience in orchestrating data processing tasks using tools like Cloud Composer or Apache Airflow. Strong understanding of data modeling, data warehousing, and big data processing concepts. Solid understanding and experience of relational database concepts and technologies such as SQL, MySQL, PostgreSQL, or Oracle. Design and implement data migration strategies for various database types (PostgreSQL, Oracle, AlloyDB, etc.). Deep understanding of at least one database type, with the ability to write complex SQL. Experience with NoSQL databases such as MongoDB, Scylla, Cassandra, or DynamoDB is a plus. Optimize data pipelines for performance and cost-efficiency, adhering to GCP best practices. Implement data quality checks, data validation, and monitoring mechanisms to ensure data accuracy and integrity. Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and translate them into technical solutions. Ability to work independently and manage multiple priorities effectively. Preferably having expertise in end-to-end DW implementation. Location and way of working: Base location: Bangalore, Mumbai, Delhi, Pune, Hyderabad. This profile involves occasional travelling to client locations. Hybrid is our default way of working.
Each domain has customized the hybrid approach to their unique needs. Your role as an Analyst/Consultant/Senior Consultant: We expect our people to embrace and live our purpose by challenging themselves to identify issues that are most important for our clients, our people, and for society. In addition to living our purpose, Analysts/Consultants/Senior Consultants across our organization must strive to be: Inspiring - Leading with integrity to build inclusion and motivation. Committed to creating purpose - Creating a sense of vision and purpose. Agile - Achieving high-quality results through collaboration and team unity. Skilled at building diverse capability - Developing diverse capabilities for the future. Persuasive / Influencing - Persuading and influencing stakeholders. Collaborating - Partnering to build new solutions. Delivering value - Showing commercial acumen. Committed to expanding business - Leveraging new business opportunities. Analytical Acumen - Leveraging data to recommend impactful approaches and solutions through the power of analysis and visualization. Effective communication - Must be able to hold well-structured and well-articulated conversations to achieve win-win possibilities. Engagement Management / Delivery Excellence - Effectively managing engagements to ensure timely and proactive execution as well as course correction for the success of engagements. Managing change - Responding to a changing environment with resilience. Managing Quality & Risk - Delivering high-quality results and mitigating risks with utmost integrity and precision. Strategic Thinking & Problem Solving - Applying a strategic mindset to solve business issues and complex problems.
Tech Savvy - Leveraging ethical technology practices to deliver high impact for clients and for Deloitte. Empathetic leadership and inclusivity - Creating a safe and thriving environment where everyone is valued for who they are, using empathy to understand others and adapting our behaviours and attitudes to become more inclusive. How you'll grow Connect for impact Our exceptional team of professionals across the globe are solving some of the world's most complex business problems, as well as directly supporting our communities, the planet, and each other. Know more in our Global Impact Report and our India Impact Report. Empower to lead You can be a leader irrespective of your career level. Our colleagues are characterised by their ability to inspire, support, and provide opportunities for people to deliver their best and grow both as professionals and human beings. Know more about Deloitte and our One Young World partnership. Inclusion for all At Deloitte, people are valued and respected for who they are and are trusted to add value to their clients, teams, and communities in a way that reflects their own unique capabilities. Know more about everyday steps that you can take to be more inclusive. At Deloitte, we believe in the unique skills, attitude, and potential each and every one of us brings to the table to make an impact that matters. Drive your career At Deloitte, you are encouraged to take ownership of your career. We recognise there is no one-size-fits-all career path, and global, cross-business mobility and up/re-skilling are all within the range of possibilities to shape a unique and fulfilling career. Know more about Life at Deloitte. Everyone's welcome… entrust your happiness to us Our workspaces and initiatives are geared towards your 360-degree happiness. This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here's a glimpse of things that are in store for you.
Interview tips We want job seekers exploring opportunities at Deloitte to feel prepared, confident and comfortable. To help you with your interview, we suggest that you do your research, know some background about the organisation and the business area you’re applying to. Check out recruiting tips from Deloitte professionals.
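The data quality checks and validation the Deloitte listing asks for can be sketched as rule-based row validation: each rule names a field, a predicate, and a failure message. The field names and rules below are illustrative assumptions:

```python
def validate(rows, rules):
    """Apply (field, predicate, message) rules; return a failure per violation."""
    failures = []
    for i, row in enumerate(rows):
        for field, ok, msg in rules:
            if not ok(row.get(field)):
                failures.append((i, field, msg))  # record row index and reason
    return failures

rules = [
    ("id", lambda v: v is not None, "missing id"),
    ("amount", lambda v: isinstance(v, (int, float)) and v >= 0, "bad amount"),
]
rows = [{"id": 1, "amount": 5}, {"id": None, "amount": -2}]
problems = validate(rows, rules)
```

In a pipeline, the failure list would typically be routed to a monitoring sink or used to quarantine bad rows before the load step.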

Posted 2 weeks ago

Apply

5.0 years

5 - 5 Lacs

Gurgaon

On-site

Job Summary: We are looking for a Tech Lead – Java to drive the architecture, design, and development of scalable, high-performance applications. The ideal candidate will have expertise in Java, Spring Boot, Microservices, and AWS and be capable of leading a team of engineers in building enterprise-grade solutions. Key Responsibilities: Lead the design and development of complex, scalable, and high-performance Java applications. Architect and implement microservices-based solutions using Spring Boot. Optimize and enhance existing applications for performance, scalability, and reliability. Provide technical leadership, mentoring, and guidance to the development team. Work closely with cross-functional teams, including Product Management, DevOps, and QA, to deliver high-quality software. Ensure best practices in coding, testing, security, and deployment. Design and implement cloud-native applications using AWS services such as EC2, Lambda, S3, RDS, API Gateway, and Kubernetes. Troubleshoot and resolve technical issues and system bottlenecks. Stay up-to-date with the latest technologies and drive innovation within the team. Required Skills & Qualifications: 5+ years of experience in Java development. Strong expertise in Spring Boot, Spring Cloud, and microservices architecture. Hands-on experience with RESTful APIs, event-driven architecture, and messaging systems (Kafka, RabbitMQ, etc.). Deep understanding of database technologies such as MySQL, PostgreSQL, or NoSQL (MongoDB, DynamoDB, etc.). Experience with CI/CD pipelines and DevOps tools (Jenkins, Docker, Kubernetes, Terraform, etc.). Proficiency in AWS cloud services and infrastructure. Strong knowledge of security best practices, performance tuning, and monitoring. Excellent problem-solving skills and ability to work in an Agile environment. Strong communication and leadership skills. Job Types: Full-time, Permanent Schedule: Day shift, Monday to Friday Work Location: In person
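Event-driven systems built on the messaging tools this listing names (Kafka, RabbitMQ) commonly retry failed messages and park repeat offenders on a dead-letter queue. An in-memory sketch of that consumer pattern; the retry limit and the failure condition are illustrative assumptions:

```python
def consume(messages, handler, max_retries=2):
    """Process messages, retrying failures; park exhausted ones on a dead-letter list."""
    done, dead = [], []
    for msg in messages:
        for attempt in range(max_retries + 1):
            try:
                done.append(handler(msg))
                break
            except ValueError:
                if attempt == max_retries:
                    dead.append(msg)  # retries exhausted -> dead-letter queue

    return done, dead

def handler(msg):
    # Stand-in for real business logic; "bad" simulates a poison message.
    if msg == "bad":
        raise ValueError("unprocessable")
    return msg.upper()

done, dead = consume(["a", "bad", "b"], handler)
```

With a real broker, the dead-letter list becomes a dedicated topic or queue so poison messages stop blocking the main consumer and can be inspected later.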

Posted 2 weeks ago

Apply

0 years

0 Lacs

Gurgaon

On-site

Senior Engineering Manager Job Req ID: 47563 Posting Date: 4 Jun 2025 Function: Engineering Unit: Digital Location: Building No 14 Sector 24 & 25A, Gurugram, India Salary: Competitive Why this job matters At BT we are undergoing a huge end-to-end transformation of our business, transforming journeys and technology with process, IT, and network change. As part of this transformation we are redefining how customers interact with us across digital channels. As Head of Digital Engineering, you will be responsible for leading the engineering strategy across self-serve web and app journeys, helping 1.1 million B2B customers manage their services seamlessly. You will define and drive a transformation roadmap that enhances our digital platforms, exploits technologies like React and Flutter on platforms like AWS, and improves how we deliver identity management and user access management (UAM), while ensuring we have a solid application support service to maintain these capabilities in life. Your leadership will ensure our digital experiences are secure, intuitive, and constantly evolving to meet user expectations. What you'll be doing Set the technical vision and engineering strategy for self-serve journeys on web and mobile platforms. Build diverse, high-performing engineering teams across the UK and India, cultivating a culture of innovation, technical excellence, and inclusion. Architect and drive the transition to cloud-native, event-driven, microservices-based architectures (AWS, Kubernetes, Kafka). Embed world-class observability, resilience, and reliability practices, leveraging platforms like ServiceNow, Dynatrace, and PagerDuty. Champion security-by-design, identity and access management (IAM/UAM), and Zero Trust architectures. Lead backend modernisation with technologies like Node.js and Java, and with relational (Postgres, MySQL) and non-relational (MongoDB, DynamoDB) databases. Integrate customer experience management platforms like Sprinklr and drive real-time improvements to journeys.
Own, execute, and deliver business outcomes for digital channels such as web, app, and chatbots. Establish robust application support models ensuring 24x7 operational excellence, rapid incident recovery, and proactive service quality. Lead Talent Councils for talent management, skills inventories, and succession planning. Encourage a strong innovation engine, encouraging open-source contributions and participation in industry tech forums. The skills you'll need to succeed Extensive leadership across full-stack engineering environments (ReactJs, NextJs, Android/iOS native, Flutter, Node.js, AWS). Knowledge of chatbots, machine learning, conversational AI, generative AI, and large language models (LLMs). Deep expertise in cloud-native architectures, container orchestration (Kubernetes), and serverless deployments. Knowledge of microservice patterns, architecture patterns, and micro-frontend patterns. Knowledge of mock services, A/B frameworks, internationalisation, multi-tenancy, etc. Advanced knowledge of event-driven architectures (Kafka, Pub/Sub) and API-first service designs. Strong hands-on background in both relational and non-relational database technologies. Operational excellence in site reliability engineering (SRE) principles, DevSecOps, and automated observability. In-depth understanding of identity and user access management, security frameworks, and compliance. Experience you'd be expected to have Led digital engineering delivery for large-scale B2B or B2C customer self-serve journeys across web and mobile. Successfully driven cloud-native modernisation programmes on AWS, leveraging event-driven and microservices architectures. Designed and operated scalable authentication, authorisation, and audit (AAA) systems with built-in security and compliance. Designed scalable, highly performant systems over non-performant telco systems like OneSiebel. Embedded agile at scale, DevOps maturity, and SRE practices across complex engineering organisations.
Delivered measurable improvements in customer experience, platform resilience, and operational performance. Operated effectively in fast-paced digital environments where resilience, customer centricity, and speed to market are critical. About us BT Group was the world’s first telco and our heritage in the sector is unrivalled. As home to several of the UK’s most recognised and cherished brands – BT, EE, Openreach and Plusnet, we have always played a critical role in creating the future, and we have reached an inflection point in the transformation of our business. Over the next two years, we will complete the UK’s largest and most successful digital infrastructure project – connecting more than 25 million premises to full fibre broadband. Together with our heavy investment in 5G, we play a central role in revolutionising how people connect with each other. While we are through the most capital-intensive phase of our fibre investment, meaning we can reward our shareholders for their commitment and patience, we are absolutely focused on how we organise ourselves in the best way to serve our customers in the years to come. This includes radical simplification of systems, structures, and processes on a huge scale. Together with our application of AI and technology, we are on a path to creating the UK’s best telco, reimagining the customer experience and relationship with one of this country’s biggest infrastructure companies. Change on the scale we will all experience in the coming years is unprecedented. BT Group is committed to being the driving force behind improving connectivity for millions and there has never been a more exciting time to join a company and leadership team with the skills, experience, creativity, and passion to take this company into a new era. A FEW POINTS TO NOTE: Although these roles are listed as full-time, if you’re a job share partnership, work reduced hours, or any other way of working flexibly, please still get in touch.
We will also offer reasonable adjustments for the selection process if required, so please do not hesitate to inform us. DON'T MEET EVERY SINGLE REQUIREMENT? Studies have shown that women and people who are disabled, LGBTQ+, neurodiverse or from ethnic minority backgrounds are less likely to apply for jobs unless they meet every single qualification and criteria. We're committed to building a diverse, inclusive, and authentic workplace where everyone can be their best, so if you're excited about this role but your past experience doesn't align perfectly with every requirement on the Job Description, please apply anyway - you may just be the right candidate for this or other roles in our wider team.

Posted 2 weeks ago


0 years

5 - 6 Lacs

Gurgaon

On-site


Strong proficiency in Java (8 or higher) and the Spring Boot framework. Basic foundation in AWS services such as EC2, Lambda, API Gateway, S3, CloudFormation, DynamoDB, RDS. Experience developing microservices and RESTful APIs. Understanding of cloud architecture and deployment strategies. Familiarity with CI/CD pipelines and tools such as Jenkins, GitHub Actions, or AWS CodePipeline. Experience with monitoring/logging tools like CloudWatch, ELK Stack, or Prometheus is desirable. Familiarity with security best practices for cloud-native apps (IAM roles, encryption, etc.). About Virtusa Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.

Posted 2 weeks ago


0 years

0 Lacs

India

Remote


About Us We are a cutting-edge technology company specializing in media forensics. Our mission is to develop advanced AI solutions that detect and localize tampering attempts in digital media such as images and PDFs, ensuring the authenticity and integrity of our clients' content. The Role We are seeking a DevOps Engineer to design, implement, and maintain robust cloud infrastructure that underpins mission-critical workflows for our international clients. The successful candidate will collaborate closely with development teams to automate deployments, ensure scalability, and maintain high availability across our services. Key Responsibilities Infrastructure as Code: Define and manage cloud resources with Terraform or AWS CDK, adhering to strict version-control and peer-review practices. Continuous Integration / Continuous Deployment (CI/CD): Build and maintain pipelines that enable reliable, zero-downtime deployments. Scalability and Reliability: Architect and optimise serverless and containerised solutions (AWS Lambda, ECS/EKS) capable of handling variable workloads. Observability: Implement comprehensive logging, metrics, and tracing to facilitate proactive incident detection and resolution. Collaboration: Work with backend engineers to streamline deployment processes, improve system performance, and uphold security best practices. Required Qualifications Fluent in English Extensive experience with Amazon Web Services (Cognito, API Gateway, Lambda, S3, DynamoDB, ECS/EKS, IAM, and CloudWatch) Proficiency with Terraform or AWS CDK Proficiency in Python Knowledge of Kubernetes (managed via EKS or self-hosted). CI/CD using Bitbucket Pipelines and Bitbucket Pipes, ECR (Elastic Container Registry) for Docker images, and CodeArtifact for hosting private libraries Nice-to-Haves Proficiency in Node.js Expertise in designing and optimising DynamoDB architectures. Why Photocert?
Fully remote work Salary based on experience Coaching and learning opportunities Social virtual Fridays

Posted 2 weeks ago


5.0 - 8.0 years

2 - 3 Lacs

Chennai

On-site


Job Description The Role The Data Engineer is accountable for developing high-quality data products to support the Bank’s regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence they will contribute to business outcomes on an agile team. Responsibilities Developing and supporting scalable, extensible, and highly available data solutions Deliver on critical business priorities while ensuring alignment with the wider architectural vision Identify and help address potential risks in the data supply chain Follow and contribute to technical standards Design and develop analytical data models Required Qualifications & Work Experience First Class Degree in Engineering/Technology/MCA 5 to 8 years’ experience implementing data-intensive solutions using agile methodologies Experience of relational databases and using SQL for data querying, transformation and manipulation Experience of modelling data for analytical consumers Ability to automate and streamline the build, test and deployment of data pipelines Experience in cloud native technologies and patterns A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training Excellent communication and problem-solving skills Technical Skills (Must Have) ETL: Hands-on experience of building data pipelines.
Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica Big Data: Experience of ‘big data’ platforms such as Hadoop, Hive or Snowflake for data storage and processing Data Warehousing & Database Management: Understanding of Data Warehousing concepts, Relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures Languages: Proficient in one or more programming languages commonly used in data engineering such as Python, Java or Scala DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management Technical Skills (Valuable) Ab Initio: Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across full suite of Ab Initio toolsets e.g., GDE, Express>IT, Data Profiler and Conduct>IT, Control>Center, Continuous>Flows Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of underlying architectures and trade-offs Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls Containerization: Fair understanding of containerization platforms like Docker, Kubernetes File Formats: Exposure in working on Event/File/Table Formats such as Avro, Parquet, Protobuf, Iceberg, Delta Others: Basics of Job scheduler like Autosys. Basics of Entitlement management Certification on any of the above topics would be an advantage.
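The data validation and cleansing work listed above under Data Quality & Controls usually boils down to a pipeline step that accepts, normalises, or rejects each incoming record. A rough, framework-free sketch of that step (field names and rules here are invented for illustration, not taken from the posting):

```python
# Hypothetical sketch of a data-quality step in an ETL pipeline: validate raw
# records, normalise fields, and split accepted rows from rejects. Field names
# and rules are invented for illustration.

def clean_records(raw_rows):
    """Return (valid, rejected) after basic validation and cleansing."""
    valid, rejected = [], []
    for row in raw_rows:
        # Required fields must be present and non-empty.
        if not row.get("id") or row.get("amount") is None:
            rejected.append(row)
            continue
        try:
            amount = float(row["amount"])  # coerce to numeric
        except (TypeError, ValueError):
            rejected.append(row)
            continue
        valid.append({
            "id": str(row["id"]).strip(),           # trim whitespace
            "amount": round(amount, 2),             # normalise precision
            "currency": str(row.get("currency", "USD")).upper(),
        })
    return valid, rejected

raw = [
    {"id": " 42 ", "amount": "19.5", "currency": "usd"},
    {"id": "", "amount": "5"},       # missing id -> rejected
    {"id": "43", "amount": "oops"},  # non-numeric amount -> rejected
]
valid, rejected = clean_records(raw)
```

In Ab Initio or Spark the same checks would be expressed as reformat/filter components or DataFrame operations; the accept/reject split shown here mirrors the usual reject-port pattern.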
- Job Family Group: Technology - Job Family: Digital Software Engineering - Time Type: - Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.

Posted 2 weeks ago


5.0 - 8.0 years

4 - 9 Lacs

Noida

On-site


Posted On: 5 Jun 2025 Location: Noida, UP, India Company: Iris Software Why Join Us? Are you inspired to grow your career at one of India’s Top 25 Best Workplaces in the IT industry? Do you want to do the best work of your life at one of the fastest growing IT services companies? Do you aspire to thrive in an award-winning work culture that values your talent and career aspirations? It’s happening right here at Iris Software. About Iris Software At Iris Software, our vision is to be our client’s most trusted technology partner, and the first choice for the industry’s top professionals to realize their full potential. With over 4,300 associates across India, U.S.A, and Canada, we help our enterprise clients thrive with technology-enabled transformation across financial services, healthcare, transportation & logistics, and professional services. Our work covers complex, mission-critical applications with the latest technologies, such as high-value complex Application & Product Engineering, Data & Analytics, Cloud, DevOps, Data & MLOps, Quality Engineering, and Business Automation. Working at Iris Be valued, be inspired, be your best. At Iris Software, we invest in and create a culture where colleagues feel valued, can explore their potential, and have opportunities to grow. Our employee value proposition (EVP) is about “Being Your Best” – as a professional and person. It is about being challenged by work that inspires us, being empowered to excel and grow in your career, and being part of a culture where talent is valued. We’re a place where everyone can discover and be their best version. Job Description Full-stack developer with 5-8 years of experience in designing and developing robust, scalable, and maintainable applications applying Object Oriented Design principles. Strong experience in Spring frameworks like Spring Boot, Spring Batch, Spring Data etc. and Hibernate, JPA.
Strong experience in microservices architecture and implementation Strong knowledge of HTML, CSS and JavaScript, React Experience with SOAP Web-Services, REST Web-Services and Java Messaging Service (JMS) API. Familiarity with designing, developing, and deploying web applications using Amazon Web Services (AWS). Good experience on AWS Services - S3, Lambda, SQS, SNS, DynamoDB, IAM, API Gateways Hands-on experience in SQL, PL/SQL and should be able to write complex queries. Hands-on experience in REST APIs Experience with version control systems (e.g., Git) Knowledge of web standards and accessibility guidelines Knowledge of CI/CD pipelines and experience in tools such as JIRA, Splunk, SONAR etc. Must have strong analytical and problem-solving abilities Good experience in JUnit testing and mocking techniques Experience in SDLC processes (Waterfall/Agile), Docker, Git, SonarQube Excellent communication and interpersonal skills, ability to work independently and as part of a team. Mandatory Competencies Java - Core JAVA Others - Micro services Java Fullstack - React JS Java Fullstack - HTML CSS Fundamental Technical Skills - Spring Framework/Hibernate/Junit etc. Java - Spring Framework Core Java Others - Spring Boot Java Others - Spring Batch Cloud - AWS Data on Cloud - AWS S3 Cloud - AWS Lambda DevOps - Git DevOps - CI/CD Agile - Agile Database - SQL Database - PL/SQL Java Fullstack - Javascript DevOps - Docker Python - Rest API Java Fullstack - WebServices/REST Beh - Communication and collaboration Perks and Benefits for Irisians At Iris Software, we offer world-class benefits designed to support the financial, health and well-being needs of our associates to help achieve harmony between their professional and personal growth. From comprehensive health insurance and competitive salaries to flexible work arrangements and ongoing learning opportunities, we're committed to providing a supportive and rewarding work environment.
Join us and experience the difference of working at a company that values its employees' success and happiness.

Posted 2 weeks ago


15.0 - 20.0 years

30 - 32 Lacs

Noida

Work from Office


The Team: We are seeking a seasoned engineering leader to join us and lead our technology team. In this role, you will be leading by example and responsible for executing our strategy to modernize the existing platform and make it scalable and cost-efficient. You'll work closely with cross-functional teams to ensure seamless transitions and optimal performance. Responsibilities and Impact: In this role, you will have the opportunity to lead a highly skilled and technical team currently working in an Agile model, ensuring we meet our customer requirements and deliver impactful, quality software. Moreover, you are required to exhibit the below responsibilities as well: Execute the engineering strategy, ensuring alignment with business objectives, technology roadmaps, and industry trends. Lead and oversee multiple engineering teams, fostering a high-performance culture focused on innovation, scalability, and delivery excellence. Architect and govern large-scale, distributed systems and enterprise-level solutions, ensuring technical best practices and design principles are followed. Shape the technology vision, evaluating emerging trends and recommending strategic investments in tools, frameworks, and infrastructure. Establish and enforce engineering excellence, including coding standards, hygiene, architectural guidelines, security practices, and automation. Lead technical governance and decision-making, balancing innovation with risk management, cost efficiency, and long-term maintainability. Collaborate with software architects and developers to assess existing applications. Design and implement modernization strategies, including refactoring, containerization, and microservices adoption. Develop and maintain scalable, secure, and efficient solutions on AWS. Optimize application performance, reliability, and scalability. Conduct code reviews and provide constructive feedback. Troubleshoot and resolve issues related to application modernization.
Stay up to date with industry trends and best practices in development and AWS services. What We're Looking For: Bachelor's degree in Computer Science, Engineering, or related field. 15+ years of experience in software development with a strong focus on AWS and .NET technologies 8+ years of experience in leading engineering teams Proven experience in technical leadership, mentoring engineers, and driving architectural decisions. Expert proficiency in C# and .NET Core. Advanced SQL programming with expertise in database performance tuning for large-scale datasets. Strong experience with relational (MS SQL, PostgreSQL) or NoSQL databases (MongoDB, DynamoDB, etc.). Knowledge of UI, Python is a plus. Hands-on and design-level experience in designing AWS cloud-native services. Strong knowledge of CI/CD for automated deployments. Hands-on experience with large-scale messaging systems or commercial equivalents. Proven ability to lead and mentor engineering teams, fostering a culture of technical excellence and innovation. Strong problem-solving skills and ability to work in a collaborative manner. Excellent communication and teamwork abilities. Basic Required Qualifications: Education & Experience: Bachelor's degree in Computer Science, Software Engineering, or a related field (or equivalent practical experience) Soft Skills: Strong problem-solving skills and attention to detail Excellent communication skills and the ability to collaborate in a team environment Ability to handle multiple tasks and meet deadlines in a fast-paced environment

Posted 2 weeks ago


8.0 years

6 - 7 Lacs

Noida

On-site


Job Description: Key Responsibilities Hands-on Development: Develop and implement machine learning models and algorithms, including supervised, unsupervised, deep learning, and reinforcement learning techniques. Implement Generative AI solutions using technologies like RAG (Retrieval-Augmented Generation), Vector DBs, Agentic AI, and frameworks such as LangChain and Hugging Face. Utilize popular AI/ML frameworks and libraries such as TensorFlow, PyTorch, and scikit-learn. Design and deploy NLP models and techniques, including text classification, RNNs, CNNs, and Transformer-based models like BERT. Ensure robust end-to-end AI/ML solutions, from data preprocessing and feature engineering to model deployment and monitoring. Technical Proficiency: Demonstrate strong programming skills in languages commonly used for data science and ML, particularly Python. Leverage cloud platforms and services for AI/ML, especially AWS, with knowledge of AWS SageMaker, Lambda, DynamoDB, S3, and other AWS resources. Mentorship: Mentor and coach a team of data scientists and machine learning engineers, fostering skill development and professional growth. Provide technical guidance and support, helping team members overcome challenges and achieve project goals. Set technical direction and strategy for AI/ML projects, ensuring alignment with business goals and objectives. Facilitate knowledge sharing and collaboration within the team, promoting best practices and continuous learning. Strategic Advisory: Collaborate with cross-functional teams to integrate AI/ML solutions into business processes and products. Provide strategic insights and recommendations to support decision-making processes. Communicate effectively with stakeholders at various levels, including technical and non-technical audiences. Qualifications Bachelor’s degree in a relevant field (e.g., Computer Science) or equivalent combination of education and experience.
Typically 8-10 years of relevant work experience in AI/ML/GenAI and 12+ years of overall work experience, with a proven ability to manage projects and activities. Extensive experience with generative AI technologies, including RAG, Vector DBs, Agentic AI, and frameworks such as LangChain and Hugging Face Proficiency in machine learning algorithms and techniques, including supervised and unsupervised learning, deep learning, and reinforcement learning. Extensive experience with AI/ML frameworks and libraries such as TensorFlow, PyTorch, and scikit-learn. Strong knowledge of natural language processing (NLP) techniques and models, including Transformer-based models like BERT. Proficient programming skills in Python and experience with cloud platforms like AWS. Experience with AWS Cloud Resources, including AWS SageMaker, Lambda, DynamoDB, S3, etc., is a plus. Proven experience leading a team of data scientists or machine learning engineers on complex projects. Strong project management skills, with the ability to prioritize tasks, allocate resources, and meet deadlines. Excellent communication skills and the ability to convey complex technical concepts to diverse audiences. Preferred Qualifications Experience in setting technical direction and strategy for AI/ML projects. Experience in the Insurance domain Ability to mentor and coach junior team members, fostering growth and development. Proven track record of successfully managing AI/ML projects from conception to deployment. Recruitment fraud is a scheme in which fictitious job opportunities are offered to job seekers typically through online services, such as false websites, or through unsolicited emails claiming to be from the company. These emails may request recipients to provide personal information or to make payments as part of their illegitimate recruiting process.
DXC does not make offers of employment via social media networks and DXC never asks for any money or payments from applicants at any point in the recruitment process, nor ask a job seeker to purchase IT or other equipment on our behalf. More information on employment scams is available here .
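The RAG (Retrieval-Augmented Generation) work this role describes centres on retrieving the most relevant stored documents for a query before prompting an LLM. A minimal, dependency-free sketch of just the retrieval step, with made-up toy embeddings standing in for real model output and a vector DB:

```python
import math

# Toy sketch of the retrieval half of a RAG pipeline: rank stored document
# embeddings by cosine similarity to a query embedding. In practice the
# embeddings come from a model and live in a vector DB; the vectors and
# document ids below are invented for illustration.

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def retrieve(query_vec, docs, k=2):
    """Return the ids of the k stored docs most similar to the query."""
    ranked = sorted(docs, key=lambda d: cosine(query_vec, d["vec"]), reverse=True)
    return [d["id"] for d in ranked[:k]]

docs = [
    {"id": "policy-faq", "vec": [0.9, 0.1, 0.0]},
    {"id": "claims-guide", "vec": [0.1, 0.9, 0.0]},
    {"id": "release-notes", "vec": [0.0, 0.1, 0.9]},
]
top = retrieve([0.8, 0.2, 0.0], docs, k=1)
```

The retrieved documents would then be stuffed into the prompt as context; frameworks like LangChain wrap exactly this retrieve-then-generate loop, swapping the linear scan for an indexed vector store.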

Posted 2 weeks ago


2.0 years

4 - 6 Lacs

Noida

On-site


Job Description: Technical Skills - Required - Java, Spring, JSF, PostgreSQL, Angular/React, GIT, HTML, JavaScript Good to have - AWS services like Lambda, ECS, EC2, DynamoDB, RDS, Cognito etc., Linux, Jenkins, Terraform, DevOps, Artifactory Essential Job Functions: Participate in software development projects by writing, testing, and debugging code, under the guidance of more experienced team members. Collaborate with team members to achieve project objectives and meet deadlines. Contribute to the documentation of software requirements and specifications. Assist in diagnosing and resolving technical issues, seeking guidance from senior team members. Support the implementation of emerging technologies and best practices. Participate in training and development programs to enhance technical skills. Contribute to knowledge sharing within the team. Actively learn from and assist more experienced team members. Basic Qualifications: Bachelor's degree in a relevant field (i.e., Computer Science) or equivalent combination of education and experience Relevant experience, typically 2+ years of relevant software engineering experience Proficiency in 1 or more software languages and development methodologies Basic understanding of coding and debugging Willingness to learn and grow in the field Effective communication and collaboration abilities Other Qualifications: Advanced degree in a related field is a plus Relevant certifications or training a plus Recruitment fraud is a scheme in which fictitious job opportunities are offered to job seekers typically through online services, such as false websites, or through unsolicited emails claiming to be from the company. These emails may request recipients to provide personal information or to make payments as part of their illegitimate recruiting process.
DXC does not make offers of employment via social media networks and DXC never asks for any money or payments from applicants at any point in the recruitment process, nor ask a job seeker to purchase IT or other equipment on our behalf. More information on employment scams is available here .

Posted 2 weeks ago


3.0 - 5.0 years

4 - 6 Lacs

Noida

On-site


About the Role: Grade Level (for internal use): 09 The Role: Platform Engineer Department overview PVR DevOps is a global team that provides specialized technical builds across a suite of products. DevOps members work closely with the Development, Testing and Client Services teams to build and develop applications using the latest technologies to ensure the highest availability and resilience of all services. Our work helps ensure that PVR continues to provide high quality service and maintain client satisfaction. Position Summary S&P Global is seeking a highly motivated engineer to join our PVR DevOps team in Noida. DevOps is a rapidly growing team at the heart of ensuring the availability and correct operation of our valuations, market and trade data applications. The team prides itself on its flexibility and technical diversity to maintain service availability and contribute improvements through design and development. Duties & accountabilities The role of Principal DevOps Engineer is primarily focused on building functional systems that improve our customer experience. Responsibilities include: Creating infrastructure and environments to support our platforms and applications using Terraform and related technologies to ensure all our environments are controlled and consistent. Implementing DevOps technologies and processes, e.g. containerisation, CI/CD, infrastructure as code, metrics, monitoring, etc. Automating wherever possible Supporting, monitoring, maintaining and improving our infrastructure and the live running of our applications Maintaining the health of cloud accounts for security, cost and best practices Providing assistance to other functional areas such as development, test and client services.
Knowledge, Skills & Experience Strong background of at least 3 to 5 years of experience in Linux/Unix administration in IaaS / PaaS / SaaS models Deployment, maintenance and support of enterprise applications into AWS including (but not limited to) Route53, ELB, VPC, EC2, S3, ECS, SQS Good understanding of Terraform and similar ‘Infrastructure as Code’ technologies Strong experience with SQL and NoSQL databases such as MySQL, PostgreSQL, DB/2, MongoDB, DynamoDB Experience with automation/configuration management using toolsets such as Chef, Puppet or equivalent Experience of enterprise systems deployed as micro-services through code pipelines utilizing containerization (Docker) Working knowledge, understanding and ability to write scripts using languages including Bash, Python and an ability to understand Java, JavaScript and PHP Personal competencies Personal Impact Confident individual – able to represent the team at various levels Strong analytical and problem-solving skills Demonstrated ability to work independently with minimal supervision Highly organised with very good attention to detail Takes ownership of issues and drives through the resolution. Flexible and willing to adapt to changing situations in a fast-moving environment Communication Demonstrates a global mindset, respects cultural differences and is open to new ideas and approaches Able to build relationships with all teams, identifying and focusing on their needs Ability to communicate effectively at business and technical level is essential. Experience working in a global team Teamwork An effective team player and strong collaborator across technology and all relevant areas of the business. Enthusiastic with a drive to succeed.
Thrives in a pressurized environment with a “can do” attitude Must be able to work under own initiative ​ About S&P Global Market Intelligence At S&P Global Market Intelligence, a division of S&P Global we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence . What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. 
Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. 
----------------------------------------------------------- Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ----------------------------------------------------------- IFTECH202.1 - Middle Professional Tier I (EEO Job Group) Job ID: 309235 Posted On: 2025-06-04 Location: Noida, Uttar Pradesh, India
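Operational scripting of the kind this role asks for (Bash/Python around AWS services) frequently needs a retry-with-exponential-backoff wrapper for flaky or throttled calls. A small illustrative sketch, with the sleep function injectable so the logic can be exercised without actually waiting:

```python
# Illustrative sketch (not from the posting) of retry with exponential
# backoff, a staple of operational Python scripting. The sleep function is
# injectable so the logic can be tested without real delays.

def retry_with_backoff(fn, attempts=4, base_delay=1.0, sleep=lambda s: None):
    """Call fn(), retrying on exception with delays base, 2*base, 4*base, ..."""
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise  # out of attempts: surface the last error
            sleep(base_delay * (2 ** i))

calls = {"n": 0}
def flaky():
    # Fails twice, then succeeds; stands in for a transient API error.
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient")
    return "ok"

slept = []
result = retry_with_backoff(flaky, sleep=slept.append)  # record delays instead of sleeping
```

In a real script the default would be time.sleep, and the bare except would be narrowed to the specific throttling or timeout exceptions worth retrying.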

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Linkedin logo

Line of Service Advisory Industry/Sector FS X-Sector Specialism Data, Analytics & AI Management Level Associate Job Description & Summary At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. 
" Responsibilities Job Description & Summary – AWS Job Description (Aws Data Engineer) :- Minimum 3 years of professional experience with working knowledge in a Data and Analytics role with a Global organization AWS developers are expected to understand the core AWS services and apply best practices regarding security and scalability. In depth knowledge of Key services like EC2, S3, Athena, Glue, dynamodb, redshift etc Experience provisioning and spinning up AWS clusters Develop business case and cloud adoption roadmap Ability to drive project from architecture standpoint Good hands-on experience in Pyspark Should have good knowledge of Python and spark concepts Develop and maintain data pipelines and ETL processes using Python and Pyspark. Design, implement, and optimize Spark jobs for performance and scalability. Perform data analysis and troubleshooting to ensure data quality and reliability. Experience with building CI/CD pipelines in Data environments would be good Should have Understanding of data modelling, Data warehousing concepts Understand the current application infrastructure and suggest changes to it. 
Define and document best practices and strategies regarding application deployment and infrastructure Mandatory Skill Sets AWS Preferred Skill Sets AWS Years Of Experience Required 3+ Education Qualification BE/BTech/MBA/MCA Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Engineering, Master of Business Administration Degrees/Field Of Study Preferred Certifications (if blank, certifications not specified) Required Skills Business Analyzer Optional Skills Accepting Feedback, Accepting Feedback, Active Listening, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis, Intellectual Curiosity, Java (Programming Language), Market Development {+ 7 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date Show more Show less

Posted 2 weeks ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Node.js Developer with AWS 3–7 Years Experience Key Responsibilities: Develop and maintain backend services using Node.js and AWS. Design and implement RESTful APIs and integrate with front-end applications. Utilize AWS services like Lambda, API Gateway, and DynamoDB for serverless applications. Participate in Agile ceremonies and collaborate with cross-functional teams. Experience in TDD/BDD Required Skills: Strong proficiency in Node.js and JavaScript. Experience with AWS services (Lambda, API Gateway, DynamoDB). Knowledge of database systems like MongoDB or PostgreSQL. Knowledge of containerization tools like Docker. Background in finance-related projects is advantageous. Good To have Skills: Develop and maintain responsive web applications using Angular framework. Create engaging and interactive web interfaces using HTML, CSS, and JavaScript Optimize web performance and ensure cross-browser compatibility Integrate APIs and backend systems to enable seamless data flow EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today. Show more Show less

Posted 2 weeks ago

Apply

0 years

4 - 8 Lacs

Calcutta

On-site

GlassDoor logo

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose – the relentless pursuit of a world that works better for people – we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI. Inviting applications for the role of Lead Consultant-Data Engineer, Azure+Python ! Responsibilities Design and deploy scalable, highly available , and fault-tolerant AWS data processes using AWS data services (Glue, Lambda, Step, Redshift) Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness. Implement and maintain security measures to protect data and systems within the AWS environment, including IAM policies, security groups, and encryption mechanisms. Migrate the application data from legacy databases to Cloud based solutions (Redshift, DynamoDB, etc) for high availability with low cost Develop application programs using Big Data technologies like Apache Hadoop, Apache Spark, etc with appropriate cloud-based services like Amazon AWS, etc. Build data pipelines by building ETL processes (Extract-Transform-Load) Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data. 
Responsible for analysing business and functional requirements which involves a review of existing system configurations and operating methodologies as well as understanding evolving business needs Analyse requirements/User stories at the business meetings and strategize the impact of requirements on different platforms/applications, convert the business requirements into technical requirements Participating in design reviews to provide input on functional requirements, product designs, schedules and/or potential problems Understand current application infrastructure and suggest Cloud based solutions which reduces operational cost, requires minimal maintenance but provides high availability with improved security Perform unit testing on the modified software to ensure that the new functionality is working as expected while existing functionalities continue to work in the same way Coordinate with release management, other supporting teams to deploy changes in production environment Qualifications we seek in you! Minimum Qualifications Experience in designing, implementing data pipelines, build data applications, data migration on AWS Strong experience of implementing data lake using AWS services like Glue, Lambda, Step, Redshift Experience of Databricks will be added advantage Strong experience in Python and SQL Strong understanding of security principles and best practices for cloud-based environments. Experience with monitoring tools and implementing proactive measures to ensure system availability and performance. Excellent problem-solving skills and ability to troubleshoot complex issues in a distributed, cloud-based environment. Strong communication and collaboration skills to work effectively with cross-functional teams. Preferred Qualifications/ Skills Master’s Degree-Computer Science, Electronics, Electrical. 
AWS Data Engineering & Cloud certifications, Databricks certifications Experience of working with Oracle ERP Experience with multiple data integration technologies and cloud platforms Knowledge of Change & Incident Management process Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color , religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn , X , YouTube , and Facebook . Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training. Job Lead Consultant Primary Location India-Kolkata Schedule Full-time Education Level Master's / Equivalent Job Posting Jun 4, 2025, 6:38:02 AM Unposting Date Ongoing Master Skills List Digital Job Category Full Time

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Andhra Pradesh

Remote

GlassDoor logo

HIH - Software Engineering Associate Advisor Position Overview The successful candidate will be a member of our US medical Integration Solutions ETL team. They will play a major role in the design and development of the ETL application in support of various portfolio projects. Responsibilities Analyze business requirements and translate into ETL architecture and data rules Serve as advisor and subject matter expert on project teams Manage both employees and consultants on multiple ETL projects. Oversee and review all design and coding from developers to ensure they follow company standards and best practices, as well as architectural direction Assist in data analysis and metadata management Test planning and execution Effectively operate within a team of technical and business professionals Assess new talent and mentor direct reports on best practices Review all designs and code from developers Qualifications Desired Skills & Experience: 8 - 11 Years of Experience in Java and Python, PySpark to support new development as well as support existing 7+ Years of Experience with Cloud technologies, specifically AWS Experience in AWS services such as Lambda, Glue, S3, MWAA, API Gateway and Route53, DynamoDB, RDS MySQL, SQS, CloudWatch, Secrets Manager, KMS, IAM, EC2 and Auto Scaling Group, VPC and Security Groups Experience with Boto3, Pandas and Terraform for building Infrastructure as Code Experience with IBM DataStage ETL tool Experience with CI/CD methodologies and processes and the development of these processes DevOps experience Knowledge in writing SQL Data mapping: source to target : target to multiple formats Experience in the development of data extraction and load processes in a parallel framework Understanding of normalized and de-normalized data repositories Ability to define ETL standards & processes SQL Standards / Processes / Tools: Mapping of data sources ETL Development, monitoring, reporting and metrics Focus on data quality Experience with DB2/ZOS, 
Oracle, SQL Server, Teradata and other database environments Unix experience Excellent problem solving and organizational skills Strong teamwork and interpersonal skills and ability to communicate with all management levels Leads others toward technical accomplishments and collaborative project team efforts Very strong communication skills, both verbal and written, including technical writing Strong analytical and conceptual skills Location & Hours of Work (Specify whether the position is remote, hybrid, in-office and where the role is located as well as the required hours of work) Equal Opportunity Statement Evernorth is an Equal Opportunity Employer actively encouraging and supporting organization-wide involvement of staff in diversity, equity, and inclusion efforts to educate, inform and advance both internal practices and external work with diverse client populations. About Evernorth Health Services Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.

Posted 2 weeks ago

Apply

5.0 - 8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Job Title: Python Developer Experience: 5 to 8 Years Location: Hyderabad Work Model: Onsite Notice Period: Immediate Required Skills and Experience 3-5 years of hands-on Python development experience, including experience with libraries like boto3, Pandas, or similar tools for data processing. Strong knowledge of AWS services, especially Lambda, S3, DynamoDB, Step Functions, SNS, SQS, and API Gateway. Experience building data pipelines or workflows to process and transform large datasets. Familiarity with serverless architecture and event-driven programming. Knowledge of best practices for designing secure and scalable serverless applications. Proficiency in version control systems (e.g., Git) and collaboration tools. Understanding of CI/CD pipelines and DevOps practices. Strong debugging and problem-solving skills. Familiarity with database systems, both SQL (e.g., RDS) and NoSQL (e.g., DynamoDB). Preferred Qualifications AWS certifications (e.g., AWS Certified Developer – Associate or AWS Certified Solutions Architect – Associate). Familiarity with testing frameworks (e.g., pytest) and ensuring test coverage for Python applications. Experience with Infrastructure as Code (IaC) tools such as AWS CDK, CloudFormation. Knowledge of monitoring and logging tools. If interested, please share your resume on lraveena@charterglobal.com

Posted 2 weeks ago

Apply

0 years

0 Lacs

Andhra Pradesh, India

On-site

Linkedin logo

Strong proficiency in Java (8 or higher) and Spring Boot framework. Basic foundation on AWS services such as EC2, Lambda, API Gateway, S3, CloudFormation, DynamoDB, RDS. Experience developing microservices and RESTful APIs. Understanding of cloud architecture and deployment strategies. Familiarity with CI/CD pipelines and tools such as Jenkins, GitHub Actions, or AWS CodePipeline. Experience with monitoring/logging tools like CloudWatch, ELK Stack, or Prometheus is desirable. Familiarity with security best practices for cloud-native apps (IAM roles, encryption, etc.). Show more Show less

Posted 2 weeks ago

Apply

7.0 years

0 Lacs

Jaipur, Rajasthan, India

On-site

Linkedin logo

About Hakkoda Hakkoda, an IBM Company, is a modern data consultancy that empowers data driven organizations to realize the full value of the Snowflake Data Cloud. We provide consulting and managed services in data architecture, data engineering, analytics and data science. We are renowned for bringing our clients deep expertise, being easy to work with, and being an amazing place to work! We are looking for curious and creative individuals who want to be part of a fast-paced, dynamic environment, where everyone’s input and efforts are valued. We hire outstanding individuals and give them the opportunity to thrive in a collaborative atmosphere that values learning, growth, and hard work. Our team is distributed across North America, Latin America, India and Europe. If you have the desire to be a part of an exciting, challenging, and rapidly-growing Snowflake consulting services company, and if you are passionate about making a difference in this world, we would love to talk to you! As an AWS Managed Services Architect, you will play a pivotal role in architecting and optimizing the infrastructure and operations of a complex Data Lake environment for BOT clients. You’ll leverage your strong expertise with AWS services to design, implement, and maintain scalable and secure data solutions while driving best practices. You will work collaboratively with delivery teams across the U.S., Costa Rica, Portugal, and other regions, ensuring a robust and seamless Data Lake architecture. In addition, you’ll proactively engage with clients to support their evolving needs, oversee critical AWS infrastructure, and guide teams toward innovative and efficient solutions. This role demands a hands-on approach, including designing solutions, troubleshooting, optimizing performance, and maintaining operational excellence. Role Description AWS Data Lake Architecture: Design, build, and support scalable, high-performance architectures for complex AWS Data Lake solutions. 
AWS Services Expertise: Deploy and manage cloud-native solutions using a wide range of AWS services, including but not limited to- Amazon EMR (Elastic MapReduce): Optimize and maintain EMR clusters for large-scale big data processing. AWS Batch: Design and implement efficient workflows for batch processing workloads. Amazon SageMaker: Enable data science teams with scalable infrastructure for model training and deployment. AWS Glue: Develop ETL/ELT pipelines using Glue to ensure efficient data ingestion and transformation. AWS Lambda: Build serverless functions to automate processes and handle event-driven workloads. IAM Policies: Define and enforce fine-grained access controls to secure cloud resources and maintain governance. AWS IoT & Timestream: Design scalable solutions for collecting, storing, and analyzing time-series data. Amazon DynamoDB: Build and optimize high-performance NoSQL database solutions. Data Governance & Security: Implement best practices to ensure data privacy, compliance, and governance across the data architecture. Performance Optimization: Monitor, analyze, and tune AWS resources for performance efficiency and cost optimization. Develop and manage Infrastructure as Code (IaC) using AWS CloudFormation, Terraform, or equivalent tools to automate infrastructure deployment. Client Collaboration: Work closely with stakeholders to understand business objectives and ensure solutions align with client needs. Team Leadership & Mentorship: Provide technical guidance to delivery teams through design reviews, troubleshooting, and strategic planning. Continuous Innovation: Stay current with AWS service updates, industry trends, and emerging technologies to enhance solution delivery. Documentation & Knowledge Sharing: Create and maintain architecture diagrams, SOPs, and internal/external documentation to support ongoing operations and collaboration. Qualifications 7+ years of hands-on experience in cloud architecture and infrastructure (preferably AWS). 
3+ years of experience specifically in architecting and managing Data Lake or big data solutions on AWS. Bachelor’s Degree in Computer Science, Information Systems, or a related field (preferred) AWS Certifications such as Solutions Architect Professional or Big Data Specialty. Experience with Snowflake, Matillion, or Fivetran in hybrid cloud environments. Familiarity with Azure or GCP cloud platforms. Understanding of machine learning pipelines and workflows. Technical Skills: Expertise in AWS services such as EMR, Batch, SageMaker, Glue, Lambda, IAM, IoT Timestream, DynamoDB, and more. Strong programming skills in Python for scripting and automation. Proficiency in SQL and performance tuning for data pipelines and queries. Experience with IaC tools like Terraform or CloudFormation. Knowledge of big data frameworks such as Apache Spark, Hadoop, or similar. Data Governance & Security: Proven ability to design and implement secure solutions, with strong knowledge of IAM policies and compliance standards. Problem-Solving: Analytical and problem-solving mindset to resolve complex technical challenges. Collaboration: Exceptional communication skills to engage with technical and non-technical stakeholders. Ability to lead cross-functional teams and provide mentorship. Benefits Health Insurance Paid leave Technical training and certifications Robust learning and development opportunities Incentive Toastmasters Food Program Fitness Program Referral Bonus Program Hakkoda is committed to fostering diversity, equity, and inclusion within our teams. A diverse workforce enhances our ability to serve clients and enriches our culture. We encourage candidates of all races, genders, sexual orientations, abilities, and experiences to apply, creating a workplace where everyone can succeed and thrive. Ready to take your career to the next level? 🚀 💻 Apply today👇 and join a team that’s shaping the future!! 
Hakkoda is an IBM subsidiary which has been acquired by IBM and will be integrated in the IBM organization. Hakkoda will be the hiring entity. By Proceeding with this application, you understand that Hakkoda will share your personal information with other IBM subsidiaries involved in your recruitment process, wherever these are located. More information on how IBM protects your personal information, including the safeguards in case of cross-border data transfer, are available here. Show more Show less

Posted 2 weeks ago

Apply

5.0 - 7.0 years

25 - 32 Lacs

Noida

Work from Office

Naukri logo

  • 5-7 years of experience in Software/Application development/enhancement and handling high-priority customer escalations.
  • Rich experience in Node.js, JavaScript, Angular, AWS (S3, Lambda, EC2, Dynamo, CloudFront, ALB).
  • Good experience in Redis, DynamoDB, SQL databases.
  • Good experience with microservices.
  • Strong analytical, communication and interpersonal skills.

Posted 2 weeks ago

Apply

Exploring DynamoDB Jobs in India

DynamoDB is a popular NoSQL database service offered by Amazon Web Services (AWS) that is widely used by companies in India. The job market for DynamoDB professionals in India is currently booming, with many opportunities available for skilled individuals.
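To ground the terminology used throughout these listings, the sketch below shows the low-level JSON shape DynamoDB uses for items: every attribute carries an explicit type descriptor such as "S" for string or "N" for number, and numbers travel as strings. The table and attribute names here are invented for illustration, and a real request would go through an AWS SDK such as boto3 rather than hand-built JSON.

```python
import json

# Illustrative PutItem request body in DynamoDB's low-level wire format.
# Each attribute value is wrapped in a type descriptor: "S" (string),
# "N" (number, serialized as a string), "BOOL", "L" (list), "M" (map), etc.
# Table and attribute names below are invented for the example.
put_item_request = {
    "TableName": "Users",
    "Item": {
        "pk": {"S": "USER#42"},   # partition key
        "sk": {"S": "PROFILE"},   # sort key
        "name": {"S": "Asha"},
        "logins": {"N": "17"},    # note: numbers are sent as strings
        "active": {"BOOL": True},
    },
}

# Round-trip through JSON, as the request body would be on the wire.
body = json.dumps(put_item_request)
decoded = json.loads(body)
print(decoded["Item"]["logins"]["N"])  # "17"
```

The typed wrappers are what let DynamoDB stay schemaless while still distinguishing the number 17 from the string "17"; higher-level SDK helpers hide this wrapping but produce the same wire format underneath.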

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Mumbai
  5. Delhi

Average Salary Range

The average salary range for DynamoDB professionals in India varies based on experience level:

  • Entry-level: INR 4-6 lakhs per annum
  • Mid-level: INR 8-12 lakhs per annum
  • Experienced: INR 15-25 lakhs per annum

Career Path

A typical career path in DynamoDB may involve progressing from roles such as Junior Developer to Senior Developer and eventually to a Tech Lead position. Opportunities for specialization in areas like database architecture or cloud solutions may also arise.

Related Skills

In addition to expertise in DynamoDB, professionals in this field are often expected to have knowledge of related technologies and tools such as AWS services, NoSQL databases, data modeling, and serverless architecture.
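The data-modeling skill mentioned above mostly comes down to working with DynamoDB's composite primary key (partition key plus sort key). As a rough, stdlib-only sketch — with a plain dict standing in for a table and all key names invented — the snippet below mimics why a query (one partition, optional sort-key prefix) is cheap while a scan (every partition) is not:

```python
from collections import defaultdict

# Toy in-memory stand-in for a DynamoDB table with a composite primary key.
# Items are grouped by partition key; each item also carries a sort key.
table = defaultdict(list)

def put_item(pk: str, sk: str, attrs: dict) -> None:
    table[pk].append({"pk": pk, "sk": sk, **attrs})

def query(pk: str, sk_prefix: str = "") -> list:
    # Query: reads a single partition, optionally narrowed by a
    # begins_with-style condition on the sort key.
    return sorted(
        (item for item in table[pk] if item["sk"].startswith(sk_prefix)),
        key=lambda item: item["sk"],
    )

def scan() -> list:
    # Scan: walks every partition -- the operation to avoid on large tables.
    return [item for items in table.values() for item in items]

put_item("USER#42", "ORDER#2024-01-05", {"total": 120})
put_item("USER#42", "ORDER#2024-03-18", {"total": 80})
put_item("USER#7", "ORDER#2024-02-02", {"total": 55})

print(len(query("USER#42", "ORDER#")))  # 2 -- only user 42's orders
print(len(scan()))                      # 3 -- the whole table
```

The real service hashes the partition key to place data on storage partitions, so a well-chosen key that spreads traffic evenly is the core of DynamoDB data modeling.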

Interview Questions

  • What is DynamoDB and how does it differ from traditional relational databases? (basic)
  • Explain the primary key structure in DynamoDB. (basic)
  • What are the different types of DynamoDB primary keys? (basic)
  • How does DynamoDB handle read and write capacity units? (medium)
  • Can you explain the concepts of eventual consistency and strong consistency in DynamoDB? (medium)
  • How can you optimize DynamoDB performance? (medium)
  • What is the difference between partition key, sort key, and secondary index in DynamoDB? (medium)
  • How does DynamoDB handle data partitioning? (advanced)
  • Explain the DynamoDB Streams feature and its use cases. (advanced)
  • How can you implement transactions in DynamoDB? (advanced)
  • Describe the benefits and limitations of using DynamoDB Global Tables. (advanced)
  • How does DynamoDB handle data backups and restores? (medium)
  • Can you explain the capacity modes in DynamoDB? (medium)
  • What is the difference between a scan and a query operation in DynamoDB? (medium)
  • How can you troubleshoot performance issues in DynamoDB? (advanced)
  • Explain the concepts of read/write sharding in DynamoDB. (advanced)
  • How does DynamoDB handle conflicts in concurrent write operations? (advanced)
  • What are the best practices for designing DynamoDB tables? (medium)
  • How can you monitor and optimize costs in DynamoDB? (medium)
  • Describe the differences between DynamoDB and Aurora. (medium)
  • What are the security features available in DynamoDB? (medium)
  • How does DynamoDB handle data durability and availability? (medium)
  • Can you explain how to implement data encryption in DynamoDB? (medium)
  • Describe the DynamoDB Accelerator (DAX) service and its benefits. (medium)
  • How can you integrate DynamoDB with other AWS services like Lambda or S3? (medium)
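For the capacity-units question in the list above, the arithmetic follows DynamoDB's published provisioned-throughput rules: one read capacity unit covers a strongly consistent read of an item up to 4 KB (an eventually consistent read costs half), and one write capacity unit covers a write of an item up to 1 KB. The helper functions below are a study sketch of those rules, with names of my own choosing, not an official billing calculator:

```python
import math

def read_capacity_units(item_size_bytes: int, strongly_consistent: bool = True) -> float:
    # One RCU = one strongly consistent read per second of an item up to 4 KB;
    # larger items round up to the next 4 KB multiple.
    units = math.ceil(item_size_bytes / 4096)
    # An eventually consistent read costs half as much.
    return units if strongly_consistent else units / 2

def write_capacity_units(item_size_bytes: int) -> int:
    # One WCU = one write per second of an item up to 1 KB, rounded up.
    return math.ceil(item_size_bytes / 1024)

print(read_capacity_units(6000))         # 6000 B rounds up to 2 x 4 KB -> 2
print(read_capacity_units(6000, False))  # eventually consistent -> 1.0
print(write_capacity_units(3500))        # 3500 B rounds up to 4 x 1 KB -> 4
```

This rounding is why item size matters so much in interviews: a 4.1 KB item costs twice the read capacity of a 4.0 KB one, and on-demand capacity mode bills by the same unit definitions even though you no longer provision them up front.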

Closing Remark

As you explore opportunities in the DynamoDB job market in India, remember to stay updated on the latest trends and technologies in the field. Prepare thoroughly for interviews by honing your skills and showcasing your expertise confidently. Good luck in your job search!

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies