
2905 DynamoDB Jobs

JobPe aggregates results for easy access to listings; you apply directly on the original job portal.

3.0 years

18 Lacs

Mohali

On-site

Key Responsibilities:
- Design and develop full-stack web applications using the MERN (MongoDB, Express, React, Node.js) stack.
- Build RESTful APIs and integrate front-end and back-end systems.
- Deploy and manage applications using AWS services such as EC2, S3, Lambda, API Gateway, DynamoDB, CloudFront, and RDS.
- Implement CI/CD pipelines using AWS CodePipeline, CodeBuild, or other DevOps tools.
- Monitor, optimize, and scale applications for performance and availability.
- Ensure security best practices in both code and AWS infrastructure.
- Write clean, modular, and maintainable code with proper documentation.
- Work closely with product managers, designers, and QA to deliver high-quality products on schedule.

Required Skills & Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
- 3+ years of professional experience with MERN stack development.
- Strong knowledge of JavaScript (ES6+), React.js (Hooks, Redux), and Node.js.
- Hands-on experience with MongoDB, including complex queries and aggregations.
- Proficiency in deploying and managing applications on AWS.
- Experience with AWS services such as EC2, S3, Lambda, API Gateway, RDS, and CloudWatch.
- Knowledge of Git, Docker, and CI/CD pipelines.
- Understanding of RESTful API design, microservices architecture, and serverless computing.
- Strong debugging and problem-solving skills.

Preferred Qualifications:
- AWS certification (e.g., AWS Certified Developer – Associate).
- Experience with Infrastructure as Code (IaC) using Terraform or AWS CloudFormation.
- Experience with GraphQL and WebSockets.
- Familiarity with container orchestration tools such as Kubernetes or AWS ECS/EKS.
- Exposure to Agile/Scrum methodologies.

Company overview: smartData is a leader in the global software business space for business consulting and technology integration, making business easier, more accessible, secure, and meaningful for its target segment of startups to small and medium enterprises.
As your technology partner, we provide both domain and technology consulting; our in-house products and unique productized-service approach let us act as business integrators, saving substantial time to market for our customers. With 8000+ projects and 20+ years of experience, backed by offices in the US, Australia, and India providing next-door assistance and round-the-clock connectivity, we ensure continual business growth for all our customers. Our business consulting and integrator services focus on the key industries of healthcare, B2B, B2C, and B2B2C platforms, online delivery services, video platform services, and IT services. Strong expertise in Microsoft, LAMP, and MEAN/MERN stacks, with a mobility-first approach via native (iOS, Android, Tizen) or hybrid (React Native, Flutter, Ionic, Cordova, PhoneGap) stacks mixed with AI and ML, helps us deliver on the ongoing needs of our customers.

Job Type: Full-time
Pay: Up to ₹1,800,000.00 per year
Benefits: Health insurance, Provident Fund
Schedule: Day shift, Monday to Friday
Supplemental Pay: Performance bonus
Work Location: In person
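The DynamoDB work this role mentions typically revolves around single-table design with composite partition/sort keys. Below is a minimal sketch of that access pattern, using a plain Python dict as a stand-in for a real table (no AWS calls are made; the table layout, key format, and attribute names are all hypothetical):

```python
# DynamoDB-style single-table key design, sketched with a plain dict
# standing in for the real table. No AWS SDK involved.

table = {}  # maps (partition_key, sort_key) -> item

def put_item(pk: str, sk: str, attrs: dict) -> None:
    """Store an item under a composite (partition, sort) key."""
    table[(pk, sk)] = {"PK": pk, "SK": sk, **attrs}

def query(pk: str, sk_prefix: str = "") -> list:
    """Emulate a Query: items in one partition whose sort key matches a prefix."""
    return [item for (p, s), item in sorted(table.items())
            if p == pk and s.startswith(sk_prefix)]

# One partition per user; orders sort by date within the partition.
put_item("USER#42", "PROFILE", {"name": "Asha"})
put_item("USER#42", "ORDER#2024-01-03", {"total": 1200})
put_item("USER#42", "ORDER#2024-02-17", {"total": 450})

orders = query("USER#42", sk_prefix="ORDER#")
print(len(orders))  # 2
```

The design choice being imitated: by packing entity type and date into the sort key, one `Query` call fetches a user's profile or their date-ordered orders without a table scan.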

Posted 11 hours ago

2.0 years

0 Lacs

Mohali

On-site

Hi, greetings from CS Soft Solutions (India) Pvt Ltd. We are looking for highly skilled Node.js developers to join our dynamic team.

Minimum Experience Required: 2 years

Required Skills:
- Strong proficiency in Node.js.
- Hands-on experience with AWS Lambda and serverless architecture.
- Familiarity with API Gateway, DynamoDB, S3, and other core AWS services.
- Proficiency in designing and consuming RESTful APIs.
- Knowledge of CI/CD pipelines, preferably with AWS CodePipeline or similar tools.
- Ability to write clean, modular, and scalable code.

Company Details:
Company: CS Soft Solutions (I) Pvt Ltd (ISO 9001:2015, ISO/IEC 27001:2013 & NASSCOM certified)
Address: CS Soft Solutions (I) Pvt Ltd, i-18, Sector 101-A, IT City, SAS Nagar, Mohali
Industry: Software Services (Mobile, Web Design & Development)

If you're passionate and ready to take your career to the next level, we'd love to hear from you!

Job Type: Full-time
Pay: ₹14,640.29 - ₹450,000.00 per month
Benefits: Health insurance, Leave encashment, Paid sick time, Provident Fund
Work Location: In person
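The Lambda-plus-API-Gateway pattern this role centers on boils down to a handler that receives a proxy event and returns a response dict. A minimal sketch in Python (no AWS SDK calls; the event shape follows the API Gateway proxy format, but the route and field names are invented for illustration):

```python
import json

# Minimal AWS Lambda handler behind an API Gateway proxy integration.
# Pure function, so it can be exercised locally with a sample event.

def handler(event: dict, context=None) -> dict:
    """Return a greeting for GET /hello?name=..."""
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

# Local invocation with a sample API Gateway event:
resp = handler({"queryStringParameters": {"name": "Mohali"}})
print(resp["statusCode"])  # 200
```

Keeping the handler a pure event-in/response-out function is what makes serverless code easy to unit-test before it ever touches AWS.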

Posted 11 hours ago

5.0 years

3 - 5 Lacs

Hyderābād

On-site

DESCRIPTION
The Amazon Web Services Professional Services (ProServe) team is seeking a skilled Delivery Consultant to join our team at Amazon Web Services (AWS). In this role, you'll work closely with customers to design, implement, and manage AWS solutions that meet their technical requirements and business objectives. You'll be a key player in driving customer success through their cloud journey, providing technical expertise and best practices throughout the project lifecycle.

AWS Global Services includes experts from across AWS who help our customers design, build, operate, and secure their cloud environments. Customers innovate with AWS Professional Services, upskill with AWS Training and Certification, optimize with AWS Support and Managed Services, and meet objectives with AWS Security Assurance Services. Our expertise and emerging technologies include AWS Partners, AWS Sovereign Cloud, AWS International Product, and the Generative AI Innovation Center. You'll join a diverse team of technical experts in dozens of countries who help customers achieve more with the AWS cloud.

As a Delivery Consultant with a deep understanding of AWS products and services, you will be proficient in architecting complex, scalable, and secure solutions tailored to the specific needs of each customer. You'll work closely with stakeholders to gather requirements, assess current infrastructure, and propose effective migration strategies to AWS. As a trusted advisor to our customers, providing guidance on industry trends, emerging technologies, and innovative solutions, you will lead the implementation process, ensuring adherence to best practices, optimizing performance, and managing risk throughout the project.

The AWS Professional Services organization is a global team of experts that helps customers realize their desired business outcomes on the AWS Cloud. We work together with customer teams and the AWS Partner Network (APN) to execute enterprise cloud computing initiatives. Our team provides assistance through a collection of offerings that help customers achieve specific outcomes related to enterprise cloud adoption. We also deliver focused guidance through our global specialty practices, which cover a variety of solutions, technologies, and industries.

Key job responsibilities
As an experienced technology professional, you will be responsible for:
- Designing and implementing complex, scalable, and secure AWS solutions tailored to customer needs
- Providing technical guidance and troubleshooting support throughout project delivery
- Collaborating with stakeholders to gather requirements and propose effective migration strategies
- Acting as a trusted advisor to customers on industry trends and emerging technologies
- Sharing knowledge within the organization through mentoring, training, and creating reusable artifacts

About the team
Diverse Experiences: AWS values diverse experiences. Even if you do not meet all of the preferred qualifications and skills listed in the job description, we encourage candidates to apply. If your career is just starting, hasn't followed a traditional path, or includes alternative experiences, don't let it stop you from applying.
Why AWS? Amazon Web Services (AWS) is the world's most comprehensive and broadly adopted cloud platform. We pioneered cloud computing and never stopped innovating; that's why customers from the most successful startups to Global 500 companies trust our robust suite of products and services to power their businesses.
Inclusive Team Culture: AWS values curiosity and connection. Our employee-led and company-sponsored affinity groups promote inclusion and empower our people to take pride in what makes us unique. Our inclusion events foster stronger, more collaborative teams.
Mentorship & Career Growth: We're continuously raising our performance bar as we strive to become Earth's Best Employer. That's why you'll find endless knowledge-sharing, mentorship, and other career-advancing resources here to help you develop into a better-rounded professional.
Work/Life Balance: We value work-life harmony. Achieving success at work should never come at the expense of sacrifices at home, which is why we strive for flexibility as part of our working culture. When we feel supported in the workplace and at home, there's nothing we can't achieve in the cloud.

BASIC QUALIFICATIONS
- Bachelor's degree in computer science, engineering, a related field, or equivalent experience
- 5+ years of hands-on experience migrating and modernizing mainframe applications to cloud platforms using a refactoring approach
- Strong hands-on experience in Java and Spring Boot development, including building RESTful web services with Spring Boot
- Proficiency in Spring framework components (Spring MVC, Spring Data, Spring Security) and experience with ORM frameworks such as Hibernate/JPA
- Hands-on experience with mainframe technologies, including COBOL, JCL, DB2, CICS, IMS, VSAM, PL/1, Assembler, and REXX
- Knowledge of modernization strategies such as rehosting, replatforming, and refactoring
- AWS experience, with proficiency in services such as EC2, S3, RDS, DynamoDB, Lambda, IAM, VPC, and CloudFormation
- Experience with build tools such as Maven and Gradle, and with agile development environments using automated build-test-deploy pipelines
- Strong communication skills, with the ability to explain complex technical concepts to both technical and non-technical audiences

PREFERRED QUALIFICATIONS
- AWS Professional-level certifications (e.g., Solutions Architect Professional, DevOps Engineer Professional)
- AWS Blu Age L3 certification
- Knowledge of testing frameworks such as JUnit and Mockito
- Knowledge of mainframe modernization tools such as Micro Focus, Blu Age, Astadia, and AWS Mainframe Modernization Service
- Familiarity with containerizing Spring Boot applications using Docker
- Exposure to generative AI coding assistants such as Amazon Q Developer and GitHub Copilot
- Experience with automation and scripting (e.g., Python, shell scripting)
- Experience migrating mainframe databases to cloud databases (e.g., DB2 to Amazon Aurora)
- Knowledge of security and compliance standards (e.g., HIPAA, GDPR)
- Ability to conduct technical workshops, training sessions, and knowledge-sharing initiatives to upskill teams
- Experience writing technical documentation and providing mentorship

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Posted 11 hours ago

0 years

6 - 8 Lacs

Hyderābād

On-site

DESCRIPTION
Amazon Business Customer Support (ABCS) is looking for a Business Intelligence Engineer to help build next-generation metrics and drive business analytics that have measurable impact. The successful candidate will have a strong understanding of different businesses and customer profiles, the underlying analytics, and the ability to translate business requirements into analysis, collect and analyze data, and make recommendations back to the business. BIEs also continuously learn new systems, tools, and industry best practices to help design new studies and build new tools that help our team automate and accelerate analytics.

As a Business Intelligence Engineer, you will develop strategic reports, design UIs, and drive projects to support ABCS decision making. This role is inherently cross-functional: you will work closely with finance teams, engineering, and leadership across Amazon Business Customer Service. A successful candidate will be a self-starter, comfortable with ambiguity, and able to think big and be creative (while still paying careful attention to detail). You should be skilled in database design, comfortable dealing with large and complex data sets, and experienced in building self-service dashboards and using visualization tools, especially Tableau. You should have strong analytical and communication skills.

You will work with a team of analytics professionals who are passionate about using machine learning to build automated systems and solve problems that matter to our customers. Your work will directly impact our customers and operations. Members of this team will be challenged to innovate using the latest big data techniques. We are looking for people who are motivated by thinking big, moving fast, and exploring business insights. If you love to implement solutions to complex problems while working hard, having fun, and making history, this may be the opportunity for you.

You will own the design, development, and maintenance of ongoing metrics, reports, analyses, and dashboards on the key drivers of our business.

Key job responsibilities
- Scope, design, and build database structure and schema
- Create data pipelines using ETL connections and SQL queries
- Retrieve and analyze data using a broad set of Amazon's data technologies
- Pull data on an ad-hoc basis using SQL queries
- Design, build, and maintain automated reporting and dashboards
- Conduct deep dives to identify root causes of pain points and opportunities for improvement
- Become a subject matter expert in ABCS data and support team members in diving deep
- Work closely with CSBI teams to ensure ABCS uses globally aligned standard metrics and definitions
- Collaborate with finance and business teams to gather data and metrics requirements

A day in the life
We thrive on solving challenging problems to innovate for our customers. By pushing the boundaries of technology, we create unparalleled experiences that enable us to rapidly adapt in a dynamic environment. If you are not sure that every qualification on the list above describes you exactly, we'd still love to hear from you! At Amazon, we value people with unique backgrounds, experiences, and skillsets. If you're passionate about this role and want to make an impact on a global scale, please apply!

BASIC QUALIFICATIONS
- Experience with data visualization using Tableau, QuickSight, or similar tools
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience writing complex SQL queries
- Experience using SQL to pull data from a database or data warehouse, and scripting experience (Python) to process data for modeling

PREFERRED QUALIFICATIONS
- Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift
- Experience in data mining, ETL, etc., and using databases in a business environment with large-scale, complex datasets

Our inclusive culture empowers Amazonians to deliver the best results for our customers.
If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
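The ad-hoc SQL pulls and dashboard metrics this BIE role describes can be sketched with an in-memory SQLite database standing in for Redshift or a QuickSight data source; the table and column names below are hypothetical:

```python
import sqlite3

# A typical dashboard metric pull: resolution rate per region.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE contacts (region TEXT, resolved INTEGER)")
con.executemany(
    "INSERT INTO contacts VALUES (?, ?)",
    [("IN", 1), ("IN", 0), ("IN", 1), ("US", 1)],
)

rows = con.execute("""
    SELECT region,
           COUNT(*)                AS contacts,
           ROUND(AVG(resolved), 2) AS resolution_rate
    FROM contacts
    GROUP BY region
    ORDER BY region
""").fetchall()

print(rows)  # [('IN', 3, 0.67), ('US', 1, 1.0)]
```

The same `GROUP BY` aggregate is what a Tableau or QuickSight dashboard would issue behind the scenes; scripting it in Python is the "SQL pull plus processing" combination the qualifications ask for.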

Posted 11 hours ago

3.0 years

0 Lacs

Hyderābād

On-site

DESCRIPTION
The AOP team within Amazon Transportation is looking for an innovative, hands-on, and customer-obsessed Business Intelligence Engineer for its Analytics team. The candidate must be detail oriented, have superior verbal and written communication skills, strong organizational skills, and excellent technical skills, and should be able to juggle multiple tasks at once. The ideal candidate must be able to identify problems before they happen and implement solutions that detect and prevent outages. The candidate must be able to accurately prioritize projects, make sound judgments, work to improve the customer experience, and get the right things done. This job requires you to constantly hit the ground running and to learn quickly. Primary responsibilities include defining the problem and building analytical frameworks to help operations streamline processes, identifying gaps in existing processes by analyzing data and liaising with the relevant team(s) to plug them, and analyzing data and metrics and sharing updates with internal teams.

Key job responsibilities
1) Apply multi-domain/process expertise in day-to-day activities and own the end-to-end roadmap.
2) Translate complex or ambiguous business problem statements into analysis requirements and maintain a high bar throughout execution.
3) Define the analytical approach; review and vet it with stakeholders.
4) Proactively and independently work with stakeholders to construct use cases and associated standardized outputs.
5) Scale data processes and reports; write queries that clients can update themselves; lead work with data engineering for full-scale automation.
6) Have a working knowledge of the data available or needed by the wider business for more complex or comparative analysis.
7) Work with a variety of data sources and pull data using efficient query development that requires less post-processing (e.g., window functions, virt usage).
8) When needed, pull data from multiple similar sources to triangulate on data fidelity.
9) Actively manage the timeline and deliverables of projects, focusing on interactions in the team.
10) Provide program communications to stakeholders.
11) Communicate roadblocks to stakeholders and propose solutions.
12) Represent the team on medium-size analytical projects in your own organization and communicate effectively across teams.

A day in the life
1) Solve ambiguous analyses with less well-defined inputs and outputs; drive to the heart of the problem and identify root causes.
2) Handle large data sets in analysis through the use of additional tools.
3) Derive recommendations from analysis that significantly impact a department, create new processes, or change existing processes.
4) Understand the basics of test-and-control comparison; may provide insights through basic statistical measures such as hypothesis testing.
5) Identify and implement optimal communication mechanisms based on the data set and the stakeholders involved.
6) Communicate complex analytical insights and business implications effectively.

About the team
The AOP (Analytics Operations and Programs) team's mission is to standardize BI and analytics capabilities and reduce repeat analytics/reporting/BI workload for operations across the IN, AU, BR, MX, SG, AE, EG, and SA marketplaces. AOP is responsible for providing visibility into operations performance and implementing programs to improve network efficiency and reduce defects. The team has a diverse mix of strong engineers, analysts, and scientists who champion customer obsession. We enable operations to make data-driven decisions by developing near-real-time dashboards and self-serve deep-dive capabilities and by building advanced analytics capabilities. We identify and implement data-driven metric-improvement programs in collaboration (co-owning) with operations teams.

BASIC QUALIFICATIONS
- 3+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.
- Experience with data visualization using Tableau, QuickSight, or similar tools
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience with statistical analysis packages such as R, SAS, and MATLAB
- Experience using SQL to pull data from a database or data warehouse, and scripting experience (Python) to process data for modeling

PREFERRED QUALIFICATIONS
- Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift
- Experience in data mining, ETL, etc., and using databases in a business environment with large-scale, complex datasets

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
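The "efficient query development that requires less post-processing" in the responsibilities above usually means pushing work like per-group ranking into a SQL window function instead of doing it in Python afterwards. A small sketch using SQLite (which supports window functions since 3.25) as a stand-in for Redshift; the table, nodes, and figures are invented:

```python
import sqlite3

# Rank shipments by delay within each delivery node, in SQL,
# so no per-group sorting is needed after the pull.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE shipments (node TEXT, delay_hours REAL)")
con.executemany(
    "INSERT INTO shipments VALUES (?, ?)",
    [("DEL1", 5.0), ("DEL1", 2.0), ("BOM2", 7.5), ("BOM2", 1.0)],
)

rows = con.execute("""
    SELECT node, delay_hours,
           RANK() OVER (PARTITION BY node ORDER BY delay_hours DESC) AS rnk
    FROM shipments
    ORDER BY node, rnk
""").fetchall()

print(rows)
# [('BOM2', 7.5, 1), ('BOM2', 1.0, 2), ('DEL1', 5.0, 1), ('DEL1', 2.0, 2)]
```

Without the window function, the same result would require pulling every row and grouping/sorting client-side, which is exactly the post-processing the posting wants to avoid.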

Posted 11 hours ago

4.0 years

0 Lacs

Hyderābād

On-site

DESCRIPTION
The Amazon Transportation team is looking for an innovative, hands-on, and customer-obsessed Business Analyst for its Analytics team. The candidate must be detail oriented, have superior verbal and written communication skills, strong organizational skills, and excellent technical skills, and should be able to juggle multiple tasks at once. The ideal candidate must be able to identify problems before they happen and implement solutions that detect and prevent outages. The candidate must be able to accurately prioritize projects, make sound judgments, work to improve the customer experience, and get the right things done. This job requires you to constantly hit the ground running and to learn quickly. Primary responsibilities include defining the problem and building analytical frameworks to help operations streamline processes, identifying gaps in existing processes by analyzing data and liaising with the relevant team(s) to plug them, and analyzing data and metrics and sharing updates with internal teams.

Key job responsibilities
1) Apply multi-domain/process expertise in day-to-day activities and own the end-to-end roadmap.
2) Translate complex or ambiguous business problem statements into analysis requirements and maintain a high bar throughout execution.
3) Define the analytical approach; review and vet it with stakeholders.
4) Proactively and independently work with stakeholders to construct use cases and associated standardized outputs.
5) Scale data processes and reports; write queries that clients can update themselves; lead work with data engineering for full-scale automation.
6) Have a working knowledge of the data available or needed by the wider business for more complex or comparative analysis.
7) Work with a variety of data sources and pull data using efficient query development that requires less post-processing (e.g., window functions, virt usage).
8) When needed, pull data from multiple similar sources to triangulate on data fidelity.
9) Actively manage the timeline and deliverables of projects, focusing on interactions in the team.
10) Provide program communications to stakeholders.
11) Communicate roadblocks to stakeholders and propose solutions.
12) Represent the team on medium-size analytical projects in your own organization and communicate effectively across teams.

A day in the life
1) Solve ambiguous analyses with less well-defined inputs and outputs; drive to the heart of the problem and identify root causes.
2) Handle large data sets in analysis through the use of additional tools.
3) Derive recommendations from analysis that significantly impact a department, create new processes, or change existing processes.
4) Understand the basics of test-and-control comparison; may provide insights through basic statistical measures such as hypothesis testing.
5) Identify and implement optimal communication mechanisms based on the data set and the stakeholders involved.
6) Communicate complex analytical insights and business implications effectively.

About the team
The AOP (Analytics Operations and Programs) team's mission is to standardize BI and analytics capabilities and reduce repeat analytics/reporting/BI workload for operations across the IN, AU, BR, MX, SG, AE, EG, and SA marketplaces. AOP is responsible for providing visibility into operations performance and implementing programs to improve network efficiency and reduce defects. The team has a diverse mix of strong engineers, analysts, and scientists who champion customer obsession. We enable operations to make data-driven decisions by developing near-real-time dashboards and self-serve deep-dive capabilities and by building advanced analytics capabilities. We identify and implement data-driven metric-improvement programs in collaboration (co-owning) with operations teams.

BASIC QUALIFICATIONS
- 4+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.
- Experience with data visualization using Tableau, QuickSight, or similar tools
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift
- Experience using SQL to pull data from a database or data warehouse, and scripting experience (Python) to process data for modeling
- Experience developing and presenting recommendations for new metrics that allow a better understanding of business performance
- 4+ years of experience in ecommerce, transportation, finance, or a related analytical field

PREFERRED QUALIFICATIONS
- Experience with statistical analysis packages such as R, SAS, and MATLAB
- Experience in data mining, ETL, etc., and using databases in a business environment with large-scale, complex datasets

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
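The "test and control comparison ... basic statistical measures such as hypothesis testing" in the day-in-the-life section can be sketched as a two-sample z-test using only the Python standard library; the metric and the data below are invented purely for illustration:

```python
from statistics import NormalDist, mean, stdev
from math import sqrt

# Did the test group's metric (e.g., deliveries per hour) improve
# over control? Normal-approximation two-sample z-test.
control = [10.0, 12.0, 11.0, 13.0, 12.0, 11.0, 10.0, 12.0]
test    = [13.0, 14.0, 15.0, 13.0, 14.0, 16.0, 14.0, 15.0]

# Standard error of the difference in means.
se = sqrt(stdev(control) ** 2 / len(control) + stdev(test) ** 2 / len(test))
z = (mean(test) - mean(control)) / se

# Two-sided p-value under the standard normal.
p = 2 * (1 - NormalDist().cdf(abs(z)))
significant = p < 0.05
print(round(z, 2), significant)
```

With samples this small a t-test would be the stricter choice; the z-test is used here only because `statistics.NormalDist` keeps the sketch dependency-free.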

Posted 11 hours ago

5.0 years

4 - 8 Lacs

Hyderābād

On-site

About Kanerika
Who we are: Kanerika Inc. is a premier global software products and services firm that specializes in providing innovative solutions and services for data-driven enterprises. Our focus is to empower businesses to achieve their digital transformation goals and maximize their business impact through the effective use of data and AI. We leverage cutting-edge technologies in data analytics, data governance, AI/ML, GenAI/LLMs, and industry best practices to deliver custom solutions that help organizations optimize their operations, enhance customer experiences, and drive growth.

Awards and Recognitions
Kanerika has won several awards over the years, including:
- CMMI Level 3 Appraised in 2024
- Best Place to Work 2022 & 2023 by Great Place to Work®
- Top 10 Most Recommended RPA Start-Ups in 2022 by RPA Today
- NASSCOM Emerge 50 Award in 2014
- Frost & Sullivan India 2021 Technology Innovation Award for its Kompass composable solution architecture
- Recognized for ISO 27701, ISO 27001, SOC 2, and GDPR compliance
- Featured as a Top Data Analytics Services Provider by GoodFirms

Working for us
Kanerika is rated 4.6/5 on Glassdoor, for many good reasons. We truly value our employees' growth, well-being, and diversity, and people's experiences bear this out. At Kanerika, we offer a host of enticing benefits that create an environment where you can thrive both personally and professionally. From our inclusive hiring practices and mandatory training on creating a safe work environment to our flexible working hours and generous parental leave, we prioritize the well-being and success of our employees. Our commitment to professional development is evident through our mentorship programs, job training initiatives, and support for professional certifications. Additionally, our company-sponsored outings and various time-off benefits ensure a healthy work-life balance. Join us at Kanerika and become part of a vibrant and diverse community where your talents are recognized, your growth is nurtured, and your contributions make a real impact. See the benefits section below for the perks you'll get while working for Kanerika.

Locations
We are located in Austin (USA), Singapore, and Hyderabad, Indore, and Ahmedabad (India).
Job Location: Hyderabad, Indore, or Ahmedabad (India)

Requirements
Key Responsibilities:
- Lead the design and development of AI-driven applications, particularly RAG-based chatbot solutions.
- Architect robust solutions leveraging Python and Java to ensure scalability, reliability, and maintainability.
- Deploy, manage, and scale AI applications using AWS cloud infrastructure, optimizing performance and resource utilization.
- Collaborate closely with cross-functional teams to understand requirements, define project scopes, and deliver solutions effectively.
- Mentor team members, providing guidance on best practices in software development, AI methodologies, and cloud deployments.
- Ensure solutions meet quality standards, including thorough testing, debugging, performance tuning, and documentation.
- Continuously research emerging AI technologies and methodologies to incorporate best practices and innovation into our products.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, Data Science, Mathematics, Statistics, or a related field.
- At least 5 years of professional experience in AI/machine learning engineering.
- Strong programming skills in Python and Java.
- Demonstrated hands-on experience building Retrieval-Augmented Generation (RAG)-based chatbots or similar generative AI applications.
- Proficiency in cloud platforms, particularly AWS, including experience with EC2, Lambda, SageMaker, DynamoDB, CloudWatch, and API Gateway.
- Solid understanding of AI methodologies, including natural language processing (NLP), vector databases, embedding models, and large language model integrations.
- Experience leading projects or teams, managing technical deliverables, and ensuring high-quality outcomes.
- AWS certifications (e.g., AWS Solutions Architect, AWS Machine Learning Specialty).
- Familiarity with popular AI/ML frameworks and libraries such as Hugging Face Transformers, TensorFlow, PyTorch, LangChain, or similar.
- Experience with Agile development methodologies.
- Excellent communication skills, capable of conveying complex technical concepts clearly and effectively.
- Strong analytical and problem-solving capabilities, with the ability to navigate ambiguous technical challenges.

Benefits
Employee Benefits
1. Culture:
   a. Open Door Policy: encourages open communication and accessibility to management.
   b. Open Office Floor Plan: fosters a collaborative and interactive work environment.
   c. Flexible Working Hours: allows employees flexibility in their work schedules.
   d. Employee Referral Bonus: rewards employees for referring qualified candidates.
   e. Appraisal Process Twice a Year: provides regular performance evaluations and feedback.
2. Inclusivity and Diversity:
   a. Hiring practices that promote diversity: ensures a diverse and inclusive workforce.
   b. Mandatory POSH training: promotes a safe and respectful work environment.
3. Health Insurance and Wellness Benefits:
   a. GMC and Term Insurance: offers medical coverage and financial protection.
   b. Health Insurance: provides coverage for medical expenses.
   c. Disability Insurance: offers financial support in case of disability.
4. Child Care & Parental Leave Benefits:
   a. Company-sponsored family events: creates opportunities for employees and their families to bond.
   b. Generous Parental Leave: allows parents to take time off after the birth or adoption of a child.
   c. Family Medical Leave: offers leave for employees to take care of family members' medical needs.
5. Perks and Time-Off Benefits:
   a. Company-sponsored outings: organizes recreational activities for employees.
   b. Gratuity: provides a monetary benefit as a token of appreciation.
   c. Provident Fund: helps employees save for retirement.
   d. Generous PTO: offers more than the industry standard for paid time off.
   e. Paid sick days: allows employees to take paid time off when they are unwell.
   f. Paid holidays: gives employees paid time off for designated holidays.
   g. Bereavement Leave: provides time off for employees to grieve the loss of a loved one.
6. Professional Development Benefits:
   a. L&D with FLEX, an enterprise learning repository: provides access to a learning repository for professional development.
   b. Mentorship Program: offers guidance and support from experienced professionals.
   c. Job Training: provides training to enhance job-related skills.
   d. Professional Certification Reimbursements: assists employees in obtaining professional certifications.
   e. Promote from Within: encourages internal growth and advancement opportunities.
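The RAG-based chatbot work in the qualifications above hinges on a retrieval step: embed documents and the user's query, rank by cosine similarity, and splice the top hit into the LLM prompt. A toy sketch with hand-made 3-dimensional "embeddings" (real systems use learned embedding models and a vector database; every document id and vector here is invented):

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b)))

# Toy "embedding index": document id -> embedding vector.
docs = {
    "refund-policy":  [0.9, 0.1, 0.0],
    "shipping-times": [0.1, 0.8, 0.1],
    "api-reference":  [0.0, 0.1, 0.9],
}

def retrieve(query_vec, k=1):
    """Return the k document ids most similar to the query embedding."""
    ranked = sorted(docs, key=lambda d: cosine(docs[d], query_vec), reverse=True)
    return ranked[:k]

top = retrieve([0.85, 0.15, 0.05])  # a "how do refunds work?" style query
print(top)  # ['refund-policy']

# The retrieved text is then spliced into the chatbot's LLM prompt:
prompt = f"Answer using only this context: <{top[0]}> ..."
```

Swapping the dict for a vector database and `cosine` for an approximate-nearest-neighbor index is what makes the same design scale to millions of documents.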

Posted 11 hours ago

Apply

15.0 years

0 Lacs

Gurgaon

On-site

DESCRIPTION At AWS, we are looking for a Delivery Practice Manager with a successful record of leading enterprise customers through a variety of transformative projects involving IT Strategy, distributed architecture, and hybrid cloud operations. AWS Global Services includes experts from across AWS who help our customers design, build, operate, and secure their cloud environments. Customers innovate with AWS Professional Services, upskill with AWS Training and Certification, optimize with AWS Support and Managed Services, and meet objectives with AWS Security Assurance Services. Our expertise and emerging technologies include AWS Partners, AWS Sovereign Cloud, AWS International Product, and the Generative AI Innovation Center. You’ll join a diverse team of technical experts in dozens of countries who help customers achieve more with the AWS cloud. Professional Services engages in a wide variety of projects for customers and partners, providing collective experience from across the AWS customer base, and is obsessed with strong success for the customer. Our team collaborates across the entire AWS organization to bring access to product and service teams, get the right solution delivered, and drive feature innovation based upon customer needs. Key job responsibilities - Engage customers - collaborate with enterprise sales managers to develop strong customer and partner relationships and build a growing business in a geographic territory, driving AWS adoption in key markets and accounts. - Drive infrastructure engagements - including short on-site projects proving the value of AWS services to support new distributed computing models. - Coach and teach - collaborate with AWS field sales, pre-sales, training and support teams to help partners and customers learn and use AWS services such as Amazon Databases – RDS/Aurora/DynamoDB/Redshift, Amazon Elastic Compute Cloud (EC2), Amazon Simple Storage Service (S3), AWS Identity and Access Management (IAM), etc.
- Deliver value - lead high-quality delivery of a variety of customized engagements with partners and enterprise customers in the commercial and public sectors. - Lead great people - attract top IT architecture talent to build high-performing teams of consultants with superior technical depth and customer relationship skills. - Be a customer advocate - work with AWS engineering teams to convey partner and enterprise customer feedback as input to AWS technology roadmaps. - Build organization assets - identify patterns and implement solutions that can be leveraged across the customer base. Improve productivity through tooling and process improvements. About the team Diverse Experiences AWS values diverse experiences. Even if you do not meet all of the qualifications and skills listed in the job description, we encourage candidates to apply. If your career is just starting, hasn’t followed a traditional path, or includes alternative experiences, don’t let it stop you from applying. Why AWS? Amazon Web Services (AWS) is the world’s most comprehensive and broadly adopted cloud platform. We pioneered cloud computing and never stopped innovating — that’s why customers from the most successful startups to Global 500 companies trust our robust suite of products and services to power their businesses. Inclusive Team Culture AWS values curiosity and connection. Our employee-led and company-sponsored affinity groups promote inclusion and empower our people to take pride in what makes us unique. Our inclusion events foster stronger, more collaborative teams. Our continual innovation is fueled by the bold ideas, fresh perspectives, and passionate voices our teams bring to everything we do. Mentorship & Career Growth We’re continuously raising our performance bar as we strive to become Earth’s Best Employer. That’s why you’ll find endless knowledge-sharing, mentorship and other career-advancing resources here to help you develop into a better-rounded professional.
Work/Life Balance We value work-life harmony. Achieving success at work should never come at the expense of sacrifices at home, which is why we strive for flexibility as part of our working culture. When we feel supported in the workplace and at home, there’s nothing we can’t achieve in the cloud. BASIC QUALIFICATIONS Bachelor’s degree in Information Science / Information Technology, Computer Science, Engineering, Mathematics, Physics, or a related field. 15+ years of IT implementation and/or delivery experience, with 5+ years working in an IT Professional Services and/or consulting organization, and 5+ years of direct people management leading a team of consultants. Deep understanding of cloud computing, adoption strategy, and transition challenges. Experience managing a consulting practice or teams responsible for KRAs. Ability to travel to client locations to deliver professional services as needed. PREFERRED QUALIFICATIONS Demonstrated ability to think strategically about business, product, and technical challenges. Vertical industry sales and delivery experience of contemporary services and solutions. Experience with design of modern, scalable delivery models for technology consulting services. Business development experience including complex agreements with integrators and ISVs. International sales and delivery experience with global F500 enterprise customers and partners. Direct people management experience leading a team of at least 20, or manager-of-managers experience in a consulting practice. Use of AWS services in distributed environments with Microsoft, IBM, Oracle, HP, SAP, etc. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information.
If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Posted 11 hours ago

Apply

5.0 years

0 Lacs

Gurgaon

On-site

DESCRIPTION The Amazon Web Services Professional Services (ProServe) team is seeking a skilled Delivery Consultant to join our team at Amazon Web Services (AWS). In this role, you'll work closely with customers to design, implement, and manage AWS solutions that meet their technical requirements and business objectives. You'll be a key player in driving customer success through their cloud journey, providing technical expertise and best practices throughout the project lifecycle. AWS Global Services includes experts from across AWS who help our customers design, build, operate, and secure their cloud environments. Customers innovate with AWS Professional Services, upskill with AWS Training and Certification, optimize with AWS Support and Managed Services, and meet objectives with AWS Security Assurance Services. Our expertise and emerging technologies include AWS Partners, AWS Sovereign Cloud, AWS International Product, and the Generative AI Innovation Center. You’ll join a diverse team of technical experts in dozens of countries who help customers achieve more with the AWS cloud. Possessing a deep understanding of AWS products and services, as a Delivery Consultant you will be proficient in architecting complex, scalable, and secure solutions tailored to meet the specific needs of each customer. You’ll work closely with stakeholders to gather requirements, assess current infrastructure, and propose effective migration strategies to AWS. As trusted advisors to our customers, providing guidance on industry trends, emerging technologies, and innovative solutions, you will be responsible for leading the implementation process, ensuring adherence to best practices, optimizing performance, and managing risks throughout the project. The AWS Professional Services organization is a global team of experts that help customers realize their desired business outcomes when using the AWS Cloud. 
We work together with customer teams and the AWS Partner Network (APN) to execute enterprise cloud computing initiatives. Our team provides assistance through a collection of offerings which help customers achieve specific outcomes related to enterprise cloud adoption. We also deliver focused guidance through our global specialty practices, which cover a variety of solutions, technologies, and industries. Key job responsibilities As an experienced technology professional, you will be responsible for: Designing and implementing complex, scalable, and secure AWS solutions tailored to customer needs Providing technical guidance and troubleshooting support throughout project delivery Collaborating with stakeholders to gather requirements and propose effective migration strategies Acting as a trusted advisor to customers on industry trends and emerging technologies Sharing knowledge within the organization through mentoring, training, and creating reusable artifacts About the team Diverse Experiences AWS values diverse experiences. Even if you do not meet all of the preferred qualifications and skills listed in the job description, we encourage candidates to apply. If your career is just starting, hasn’t followed a traditional path, or includes alternative experiences, don’t let it stop you from applying. Why AWS? Amazon Web Services (AWS) is the world’s most comprehensive and broadly adopted cloud platform. We pioneered cloud computing and never stopped innovating — that’s why customers from the most successful startups to Global 500 companies trust our robust suite of products and services to power their businesses. Inclusive Team Culture AWS values curiosity and connection. Our employee-led and company-sponsored affinity groups promote inclusion and empower our people to take pride in what makes us unique. Our inclusion events foster stronger, more collaborative teams.
Our continual innovation is fueled by the bold ideas, fresh perspectives, and passionate voices our teams bring to everything we do. Mentorship & Career Growth We’re continuously raising our performance bar as we strive to become Earth’s Best Employer. That’s why you’ll find endless knowledge-sharing, mentorship and other career-advancing resources here to help you develop into a better-rounded professional. Work/Life Balance We value work-life harmony. Achieving success at work should never come at the expense of sacrifices at home, which is why we strive for flexibility as part of our working culture. When we feel supported in the workplace and at home, there’s nothing we can’t achieve in the cloud. BASIC QUALIFICATIONS Bachelor's degree in computer science, engineering, a related field, or equivalent experience. 5+ years of hands-on experience in migration and modernization of mainframe applications to cloud platforms using a refactoring approach. Strong hands-on experience in Java and Spring Boot framework development, and experience with RESTful web services using Spring Boot. Proficiency in Spring framework components (Spring MVC, Spring Data, Spring Security) and experience with ORM frameworks like Hibernate/JPA. Hands-on experience in mainframe technologies including COBOL, JCL, DB2, CICS, IMS, VSAM, PL/1, Assembler, REXX, etc. Knowledge of various modernization strategies such as rehosting, replatforming, and refactoring. AWS experience required, with proficiency in services such as EC2, S3, RDS, DynamoDB, Lambda, IAM, VPC, and CloudFormation. Experience with build tools like Maven and Gradle, and working in agile software development environments utilizing automated build-test-deploy pipelines. Strong communication skills, with the ability to explain complex technical concepts to both technical and non-technical audiences. PREFERRED QUALIFICATIONS AWS Professional level certifications (e.g., Solutions Architect Professional, DevOps Engineer Professional) AWS Blu Age L3 certification Knowledge of testing frameworks like JUnit, Mockito Knowledge of mainframe modernization tools like Micro Focus, Blu Age, Astadia, AWS Mainframe Modernization Service Familiarity with containerization of Spring Boot applications using Docker Exposure to Generative AI coding assistants such as Amazon Q Developer, GitHub Copilot Experience with automation and scripting (e.g., Python, Shell scripting) Experience in mainframe database migration to cloud databases (e.g., DB2 to Amazon Aurora) Knowledge of security and compliance standards (e.g., HIPAA, GDPR) Conduct technical workshops, training sessions, and knowledge-sharing initiatives to upskill teams Experience in writing technical documentation and providing mentorship Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Posted 11 hours ago

Apply

6.0 years

4 - 7 Lacs

Gurgaon

On-site

DESCRIPTION AWS Sales, Marketing, and Global Services (SMGS) is responsible for driving revenue, adoption, and growth from the largest and fastest growing small- and mid-market accounts to enterprise-level customers, including the public sector. Are you a Cloud Consultant who has hands-on experience with building cloud-native applications? Would you like to work with our customers to help them architect, develop and re-engineer applications to fully leverage the AWS Cloud? Do you like to work on a variety of business-critical projects using the latest technology stacks, at the forefront of application development and cloud technology adoption? AWS ProServe India LLP is looking for an experienced cloud consultant. In this role, you will work with our internal customers in architecting, developing and re-engineering applications that can fully leverage the AWS Cloud in India. You will work on a variety of game-changing projects, at the forefront of application development and cloud technology adoption. Achieving success will require coordination across many internal AWS teams and external AWS Partners, with impact and visibility at the highest levels of the company. For applications to be cloud optimized, they need to be architected correctly, enabling them to reap the benefits of elasticity, horizontal scalability, automation and high availability. On the AWS platform, services such as Amazon EC2, Auto Scaling, Elastic Load Balancing, AWS Elastic Beanstalk, Serverless Architectures, and Amazon Elastic Container Service, to name just a few, provide opportunities to design and build cloud-ready applications.
Key job responsibilities We are looking for hands-on application developers with: Full-stack development experience designing and developing front ends and back ends for web applications, APIs, microservices, and data integrations Proficiency in at least one programming language such as Java, Python, Go (Golang), or JavaScript/TypeScript, along with practical experience in modern frameworks and libraries like Angular, ReactJS, Vue.js, or Node.js. Working knowledge of AWS services, experience with both SQL and NoSQL databases, and familiarity with modern communication protocols such as gRPC, WebSockets, and GraphQL. Knowledge of cloud-native design patterns, including microservices architecture and event-driven systems. Demonstrated experience building scalable and highly available applications on AWS, leveraging services such as Lambda, ECS, API Gateway, DynamoDB, S3, etc. Preferred experience in optimizing cloud-based architectures for scalability, security, and high performance. Experience working in Agile development environments, with a strong focus on iterative delivery and continuous improvement. Ability to advise on and implement AWS best practices across application development, deployment, and monitoring About the team Diverse Experiences Amazon values diverse experiences. Even if you do not meet all of the preferred qualifications and skills listed in the job description, we encourage candidates to apply. If your career is just starting, hasn’t followed a traditional path, or includes alternative experiences, don’t let it stop you from applying. Why AWS Amazon Web Services (AWS) is the world’s most comprehensive and broadly adopted cloud platform. We pioneered cloud computing and never stopped innovating — that’s why customers from the most successful startups to Global 500 companies trust our robust suite of products and services to power their businesses. Work/Life Balance We value work-life harmony.
Achieving success at work should never come at the expense of sacrifices at home, which is why we strive for flexibility as part of our working culture. When we feel supported in the workplace and at home, there’s nothing we can’t achieve in the cloud. Inclusive Team Culture Here at AWS, it’s in our nature to learn and be curious. Our employee-led affinity groups foster a culture of inclusion that empower us to be proud of our differences. Ongoing events and learning experiences, including our Conversations on Race and Ethnicity (CORE) and AmazeCon (diversity) conferences, inspire us to never stop embracing our uniqueness. Mentorship and Career Growth We’re continuously raising our performance bar as we strive to become Earth’s Best Employer. That’s why you’ll find endless knowledge-sharing, mentorship and other career-advancing resources here to help you develop into a better-rounded professional. BASIC QUALIFICATIONS 6+ years of experience in application technologies with 4+ years on any Cloud Platform. Programming Language experience (e.g. JavaScript Frameworks, Java, Python, Golang, etc.) with a good understanding of OOAD principles. Experience developing Microservices architecture and API Frameworks supporting application development. Experience in designing architecture for highly available systems that utilize load balancing, horizontal scalability and high availability.
Hands-on experience using AI-powered developer tools. PREFERRED QUALIFICATIONS Experience leading the design, development and deployment of business software at scale, or recent hands-on technology infrastructure, network, compute, storage, and virtualization experience Experience and technical expertise (design and implementation) in cloud computing technologies A passion for exploring and adopting emerging technologies, with a growth mindset and curiosity to experiment and innovate. Ability to think strategically across business needs, product strategy, and technical implementation, contributing to high-impact decisions. Code generation platforms (e.g. GitHub, Amazon Q Developer). Automated test case generation and AI-assisted code reviews. Integrating machine learning models into applications, e.g., recommendation engines, NLP-based search, predictive analytics. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
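As a rough illustration of the serverless pattern this posting names (API Gateway invoking a Lambda that reads DynamoDB), here is a minimal Python sketch. The orders table, orderId key, and the make_handler dependency-injection wrapper are illustrative assumptions, not details from the role:

```python
import json

def make_handler(table):
    """Build an API Gateway-style Lambda handler around an injected
    DynamoDB Table resource. Injecting the table (instead of creating a
    boto3 client at import time) keeps the handler testable offline."""
    def handler(event, context=None):
        # API Gateway places URL path variables under "pathParameters".
        order_id = (event.get("pathParameters") or {}).get("orderId")
        if not order_id:
            return {"statusCode": 400, "body": json.dumps({"error": "orderId required"})}
        # DynamoDB GetItem: fetch a single item by its primary key.
        item = table.get_item(Key={"orderId": order_id}).get("Item")
        if item is None:
            return {"statusCode": 404, "body": json.dumps({"error": "not found"})}
        return {"statusCode": 200, "body": json.dumps(item)}
    return handler

# In a real deployment this would be wired up with boto3, e.g.:
#   import boto3
#   lambda_handler = make_handler(boto3.resource("dynamodb").Table("orders"))
```

The injection shape is one common way to keep Lambda handlers unit-testable without AWS credentials; it is a design choice, not something the posting prescribes.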

Posted 11 hours ago

Apply

0 years

0 Lacs

Gurgaon

On-site

DESCRIPTION Amazon Business Customer Support (ABCS) is looking for a Business Intelligence Engineer to help build next generation metrics and drive business analytics that have measurable impact. The successful candidate will have a strong understanding of different businesses and customer profiles and the underlying analytics, and the ability to translate business requirements into analysis, collect and analyze data, and make recommendations back to the business. BIEs also continuously learn new systems, tools, and industry best practices to help design new studies and build new tools that help our team automate and accelerate analytics. As a Business Intelligence Engineer, you will develop strategic reports, design UIs and drive projects to support ABCS decision making. This role is inherently cross-functional — you will work closely with finance teams, engineering, and leadership across Amazon Business Customer Service. A successful candidate will be a self-starter, comfortable with ambiguity, able to think big and be creative (while still paying careful attention to detail). You should be skilled in database design, be comfortable dealing with large and complex data sets, and have experience building self-service dashboards and using visualization tools, especially Tableau. You should have strong analytical and communication skills. You will work with a team of analytics professionals who are passionate about using machine learning to build automated systems and solve problems that matter to our customers. Your work will directly impact our customers and operations. Members of this team will be challenged to innovate using the latest big data techniques. We are looking for people who are motivated by thinking big, moving fast, and exploring business insights. If you love to implement solutions to complex problems while working hard, having fun, and making history, this may be the opportunity for you.
Own the design, development, and maintenance of ongoing metrics, reports, analyses, and dashboards on the key drivers of our business. Key job responsibilities Scope, design, and build database structures and schemas. Create data pipelines using ETL connections and SQL queries. Retrieve and analyze data using a broad set of Amazon's data technologies. Pull data on an ad-hoc basis using SQL queries. Design, build and maintain automated reporting and dashboards. Conduct deep dives to identify root causes of pain points and opportunities for improvement. Become a subject matter expert in ABCS data, and support team members in diving deep. Work closely with CSBI teams to ensure ABCS uses globally aligned standard metrics and definitions. Collaborate with finance and business teams to gather data and metrics requirements. A day in the life We thrive on solving challenging problems to innovate for our customers. By pushing the boundaries of technology, we create unparalleled experiences that enable us to rapidly adapt in a dynamic environment. If you are not sure that every qualification on the list above describes you exactly, we'd still love to hear from you! At Amazon, we value people with unique backgrounds, experiences, and skillsets. If you’re passionate about this role and want to make an impact on a global scale, please apply! BASIC QUALIFICATIONS Experience with data visualization using Tableau, QuickSight, or similar tools Experience with data modeling, warehousing and building ETL pipelines Experience writing complex SQL queries Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling PREFERRED QUALIFICATIONS Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets Our inclusive culture empowers Amazonians to deliver the best results for our customers.
If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
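The pull-with-SQL-then-process-with-Python workflow this role describes can be sketched in a few lines. The example below is a hypothetical illustration using an in-memory SQLite database as a stand-in for a warehouse such as Redshift; the support_contacts table and its columns are invented for the sketch:

```python
import sqlite3

def top_contact_drivers(conn, limit=3):
    """Pull ticket counts per contact reason with SQL, then post-process
    in Python to attach each reason's share of overall volume."""
    rows = conn.execute(
        """
        SELECT reason, COUNT(*) AS tickets
        FROM support_contacts
        GROUP BY reason
        ORDER BY tickets DESC
        LIMIT ?
        """,
        (limit,),
    ).fetchall()
    total = conn.execute("SELECT COUNT(*) FROM support_contacts").fetchone()[0] or 1
    # Scripting step: derive a metric (share of total volume) the raw query lacks.
    return [(reason, tickets, round(tickets / total, 2)) for reason, tickets in rows]

# Example: build a toy dataset and rank the top contact drivers.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE support_contacts (reason TEXT)")
conn.executemany(
    "INSERT INTO support_contacts VALUES (?)",
    [("refund",)] * 3 + [("shipping",)] * 2 + [("login",)],
)
print(top_contact_drivers(conn, 2))  # [('refund', 3, 0.5), ('shipping', 2, 0.33)]
```

The same shape (parameterized aggregate query, then light Python enrichment) carries over to warehouse clients; only the connection object changes.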

Posted 11 hours ago

Apply

6.0 years

0 Lacs

Meerut

On-site

DESCRIPTION AWS Sales, Marketing, and Global Services (SMGS) is responsible for driving revenue, adoption, and growth from the largest and fastest growing small- and mid-market accounts to enterprise-level customers, including the public sector. Are you a Cloud Consultant who has hands-on experience with building cloud-native applications? Would you like to work with our customers to help them architect, develop and re-engineer applications to fully leverage the AWS Cloud? Do you like to work on a variety of business-critical projects using the latest tech stacks, at the forefront of application development and cloud technology adoption? AWS ProServe India LLP is looking for an experienced cloud consultant. In this role, you will work with our internal customers in architecting, developing and re-engineering applications that can fully leverage the AWS Cloud in India. You will work on a variety of game-changing projects, at the forefront of application development and cloud technology adoption. Achieving success will require coordination across many internal AWS teams and external AWS Partners, with impact and visibility at the highest levels of the company. For applications to be cloud optimized, they need to be architected correctly, enabling them to reap the benefits of elasticity, horizontal scalability, automation and high availability. On the AWS platform, services such as Amazon EC2, Auto Scaling, Elastic Load Balancing, AWS Elastic Beanstalk, Serverless Architectures, and Amazon Elastic Container Service, to name just a few, provide opportunities to design and build cloud-ready applications.
Key job responsibilities We are looking for hands-on application developers with: Full-stack development experience designing and developing front ends and back ends for web applications, APIs, microservices, and data integrations Proficiency in at least one programming language such as Java, Python, Go (Golang), or JavaScript/TypeScript, along with practical experience in modern frameworks and libraries like Angular, ReactJS, Vue.js, or Node.js. Working knowledge of AWS services, experience with both SQL and NoSQL databases, and familiarity with modern communication protocols such as gRPC, WebSockets, and GraphQL. Knowledge of cloud-native design patterns, including microservices architecture and event-driven systems. Demonstrated experience building scalable and highly available applications on AWS, leveraging services such as Lambda, ECS, API Gateway, DynamoDB, S3, etc. Preferred experience in optimizing cloud-based architectures for scalability, security, and high performance. Experience working in Agile development environments, with a focus on iterative delivery and continuous improvement. Ability to advise on and implement AWS best practices across application development, deployment, and monitoring About the team Diverse Experiences Amazon values diverse experiences. Even if you do not meet all of the preferred qualifications and skills listed in the job description, we encourage candidates to apply. If your career is just starting, hasn’t followed a traditional path, or includes alternative experiences, don’t let it stop you from applying. Why AWS Amazon Web Services (AWS) is the world’s most comprehensive and broadly adopted cloud platform. We pioneered cloud computing and never stopped innovating — that’s why customers from the most successful startups to Global 500 companies trust our robust suite of products and services to power their businesses. Work/Life Balance We value work-life harmony.
Achieving success at work should never come at the expense of sacrifices at home, which is why we strive for flexibility as part of our working culture. When we feel supported in the workplace and at home, there’s nothing we can’t achieve in the cloud. Inclusive Team Culture Here at AWS, it’s in our nature to learn and be curious. Our employee-led affinity groups foster a culture of inclusion that empower us to be proud of our differences. Ongoing events and learning experiences, including our Conversations on Race and Ethnicity (CORE) and AmazeCon (diversity) conferences, inspire us to never stop embracing our uniqueness. Mentorship and Career Growth We’re continuously raising our performance bar as we strive to become Earth’s Best Employer. That’s why you’ll find endless knowledge-sharing, mentorship and other career-advancing resources here to help you develop into a better-rounded professional. BASIC QUALIFICATIONS 6+ years of experience in application technologies with 4+ years on any Cloud Platform. Programming Language experience (e.g. JavaScript Frameworks, Java, Python, Golang, etc.) with a good understanding of OOAD principles. Experience developing Microservices architecture and API Frameworks supporting application development. Experience in designing architecture for highly available systems that utilize load balancing, horizontal scalability and high availability. Hands-on experience using AI-powered developer tools. PREFERRED QUALIFICATIONS Experience leading the design, development and deployment of business software at scale, or recent hands-on technology infrastructure, network, compute, storage, and virtualization experience Experience and technical expertise (design and implementation) in cloud computing technologies A passion for exploring and adopting emerging technologies, with a growth mindset and curiosity to experiment and innovate.
Ability to think strategically across business needs, product strategy, and technical implementation, contributing to high-impact decisions. Code generation platforms (e.g. GitHub, Amazon Q Developer). Automated test case generation and AI-assisted code reviews. Integrating machine learning models into applications, e.g., recommendation engines, NLP-based search, predictive analytics. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Posted 11 hours ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

🚀 Hiring: Java Developer | Hyderabad | 4–8 Years Experience Location: Hyderabad Experience Required: 4–8 years Employment Type: Full-time We are looking for a skilled and motivated Java Developer with 4 to 8 years of experience to join our team in Hyderabad. 🔹 Key Skills & Experience: Solid understanding of Service-Oriented Architecture – SOAP & RESTful Web Services, Microservices API Design & Development Strong hands-on experience with Java/J2EE for web-based applications Proficiency in Angular 15 or above for building highly responsive web applications Familiarity with JavaScript build tools (npm, bower, grunt, gulp) Experience using version control systems like Git, SVN, or CVS Good knowledge of HTML5, CSS, JavaScript, AJAX, and XML Strong understanding of SQL and NoSQL databases (DynamoDB, Aurora, MongoDB, etc.) Exposure to AWS Lambda and serverless architecture (AWS SAM is a plus) Ability to apply design patterns and independently design small to medium modules Experience working in Agile environments Strong communication and collaboration skills 📩 Interested candidates can DM me at lraveena@charterglobal.com

Posted 11 hours ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Company: Continental Location: Bangalore Experience: 3-6 yrs Job Description Overview of the role: We are looking for a Backend Developer who is extremely passionate about backend software development for our wide range of cloud-based web/mobile B2B & B2C products targeted towards a wide range of industries and applications. You will be based in our new ground-breaking innovation hub with a start-up atmosphere and get the opportunity to work and collaborate with internal and external technology specialists in the areas of AI, IoT, and VR/AR. Do you want to be a part of this exciting journey and make a difference? Key Responsibilities: Design, develop, and maintain scalable and reliable backend systems and APIs using modern technologies and best practices. Own a module and work closely with the TL. Identify, prioritize and execute tasks in the software development life cycle. Write clean, efficient, and maintainable code following best practices and coding standards. Optimize backend systems for performance, scalability, and reliability. Implement security best practices to protect sensitive data and ensure compliance with security standards. Troubleshoot and debug issues and provide timely resolutions to ensure smooth operation of backend systems. Work closely with DevOps and Infrastructure teams to deploy and monitor backend services. Qualifications Technical Skills Strong experience in Node.js as the backend technology.
Experience in Java and Python is a plus.
Strong experience working with microservice architecture.
Good experience with the AWS cloud platform.
Good experience with CI/CD tools and methodologies is a plus.
Experience using relevant tools for unit testing, code quality, etc.
Strong experience building web front ends using Angular (or a comparable front-end framework) is a plus.
Other skills
Experience with agile software development methodologies
Excellent communication skills in English (spoken and written)
Great team player with the ability to work in a highly international team
Willingness to sometimes travel nationally and internationally to various Continental R&D centers and external development partner locations
Willingness to learn new things
Experience
Around 3 to 6 years of experience overall
3+ years of experience in backend development
3+ years with AWS services such as API Gateway, Lambda, DynamoDB, S3, etc.
2+ years with CI/CD topics
1+ years using tools like SonarQube, Datadog, AppDynamics, etc.
3+ years in Agile delivery
AWS certification is a plus
Experience working with tools like JIRA, Confluence, Git, Jenkins, etc.
BE in engineering with a focus on computer science/software engineering, or MCA with professional experience; other relevant education streams with strong tech experience can be considered
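The API Gateway + Lambda combination listed above follows a fixed response contract (proxy integration: `statusCode`, `headers`, `body`). A minimal sketch of that handler shape, shown in Python rather than the posting's Node.js stack, with the handler name and payload invented for illustration:

```python
import json

def lambda_handler(event, context):
    """Shape of an API Gateway Lambda proxy-integration handler.
    The posting's stack is Node.js; Python is used here only for brevity."""
    # queryStringParameters is None when the request has no query string
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello {name}"}),
    }

resp = lambda_handler({"queryStringParameters": {"name": "Continental"}}, None)
print(resp["statusCode"], resp["body"])
```

API Gateway serializes the returned dict back into the HTTP response, which is why `body` must be a string rather than a nested object.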

Posted 15 hours ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Join us as a Solution Architect
This is an opportunity for an experienced Solution Architect to help us define the high-level technical architecture and design for a key data analytics and insights platform that powers the personalised customer engagement initiatives of the business. You’ll define and communicate a shared technical and architectural vision of end-to-end designs that may span multiple platforms and domains. Take on this exciting new challenge and hone your technical capabilities while advancing your career and building your network across the bank. We're offering this role at vice president level.
What you'll do
We’ll look to you to influence and promote collaboration across platform and domain teams on solution delivery. Partnering with platform and domain teams, you’ll elaborate the solution and its interfaces, validating technology assumptions, evaluating implementation alternatives, and creating the continuous delivery pipeline. You’ll also provide analysis of options and deliver end-to-end solution designs using the relevant building blocks, as well as producing designs for features that allow frequent incremental delivery of customer value.
On Top Of This, You’ll Be
Owning the technical design and architecture development that aligns with bank-wide enterprise architecture principles, security standards, and regulatory requirements
Participating in activities to shape requirements, validating designs and prototypes to deliver change that aligns with the target architecture
Promoting adaptive design practices to drive collaboration of feature teams around a common technical vision using continuous feedback
Making recommendations on the potential impacts of the latest technology and customer trends for existing and prospective customers
Engaging with the wider architecture community within the bank to ensure alignment with enterprise standards
Presenting solutions to governance boards and design review forums to secure approvals
Maintaining up-to-date architectural documentation to support audits and risk assessments
The skills you'll need
As a Solution Architect, you’ll bring expert knowledge of application architecture, and of business, data or infrastructure architecture, with working knowledge of industry architecture frameworks such as TOGAF or ArchiMate. You’ll also need an understanding of Agile and contemporary methodologies, with experience of working in Agile teams. A certification in cloud solutions such as AWS Solutions Architect is desirable, while awareness of agentic AI-based application architectures using LLMs such as OpenAI models and agentic frameworks such as LangGraph or CrewAI will be advantageous.
Furthermore, You’ll Need
Strong experience in solution design, enterprise architecture patterns, and cloud-native applications, including the ability to produce multiple views to highlight different architectural concerns
Familiarity with big data processing in the banking industry
Hands-on experience with AWS services, including but not limited to S3, Lambda, EMR, DynamoDB and API Gateway
An understanding of big data processing using frameworks or platforms like Spark, EMR, Kafka, Apache Flink or similar
Knowledge of real-time data processing, event-driven architectures, and microservices
A conceptual understanding of data modelling and analytics, and of machine learning or deep-learning models
The ability to communicate complex technical concepts clearly to peers and leadership-level colleagues
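The event-driven architecture knowledge the list above asks for reduces, at its core, to producers publishing events on topics and decoupled consumers reacting to them. A toy in-memory sketch of that idea (class and topic names invented; on the platform described, Kafka or Flink would play this role at scale):

```python
from collections import defaultdict

class EventBus:
    """Minimal publish/subscribe bus illustrating event-driven decoupling."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Every registered handler sees the event; the producer knows none of them.
        for handler in self._subscribers[topic]:
            handler(event)

bus = EventBus()
seen = []
bus.subscribe("customer.signup", lambda e: seen.append(e["id"]))
bus.publish("customer.signup", {"id": "c-42"})
print(seen)
```

The architectural point is the indirection: adding a new consumer of `customer.signup` requires no change to the publisher, which is what makes event-driven designs attractive for personalised engagement pipelines.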

Posted 15 hours ago

Apply

5.0 years

0 Lacs

India

Remote

Job Role: Python Developer – UPI Systems & Serverless Architecture
Location: Remote
Experience: 5 years
Start Date: Immediate
About the Role: We are seeking a highly skilled Python contractor with hands-on experience in Unified Payments Interface (UPI) systems and modern Python development. The ideal candidate will be proficient in building scalable, serverless microservices and integrating secure payment APIs. You will work closely with our engineering team to enhance and maintain our UPI-based transaction platform.
Key Responsibilities:
Design, develop, and maintain UPI-integrated microservices using modern Python 3.
Implement robust and reusable code using data structures, decorators, context managers, and functional programming paradigms.
Build and consume HTTP APIs using Python libraries like requests.
Ensure high-quality logging and observability using Python’s logging module.
Develop and deploy serverless applications on AWS (Lambda, API Gateway, DynamoDB, etc.).
Write and maintain unit tests and integration tests using Pytest.
Collaborate with cross-functional teams to ensure secure and efficient payment processing.
Optimize performance and scalability of microservices in production environments.
Required Skills & Experience:
Strong proficiency in Python 3, with a deep understanding of its advanced features.
Experience with UPI systems and payment gateway integrations.
Solid grasp of data structures, decorators, context managers, and functional programming.
Expertise in building and consuming RESTful APIs using Python.
Familiarity with Python logging best practices.
Hands-on experience with AWS serverless architecture (Lambda, S3, DynamoDB, CloudWatch).
Experience in designing and deploying microservices.
Strong testing skills using Pytest and other unit testing frameworks.
Ability to work independently and deliver high-quality code on time.
Preferred Qualifications:
Experience with CI/CD pipelines and infrastructure-as-code tools (e.g., Terraform, AWS SAM).
Knowledge of security best practices in financial applications.
Familiarity with event-driven architectures and message queues (e.g., SQS, SNS).
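Several of the Python skills this role lists (decorators, the `logging` module, testable validation logic) combine naturally in one small unit. A hedged sketch: the decorator and the VPA check below are invented for illustration and are far looser than the real UPI specification.

```python
import functools
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("upi")

def logged(fn):
    """Decorator adding call-level observability via the stdlib logging module."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        log.info("calling %s", fn.__name__)
        return fn(*args, **kwargs)
    return wrapper

@logged
def validate_vpa(vpa: str) -> bool:
    """Very rough UPI virtual-payment-address shape check: exactly one '@'
    with non-empty user and handle parts. Illustrative only; the real UPI
    spec imposes stricter character and length rules."""
    parts = vpa.split("@")
    return len(parts) == 2 and all(parts)

print(validate_vpa("alice@upi"))   # True
print(validate_vpa("missing-at"))  # False
```

In a real service, a function like this would be covered by a small Pytest module asserting both the accept and reject paths, which is exactly the testing discipline the posting asks for.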

Posted 17 hours ago

Apply

5.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

About Company: They balance innovation with an open, friendly culture and the backing of a long-established parent company, known for its ethical reputation. We guide customers from what’s now to what’s next by unlocking the value of their data and applications to solve their digital challenges, achieving outcomes that benefit both business and society.
Job Title: AWS Infra
Location: Mumbai, Pune, Chennai, Bangalore
Work Mode: Hybrid
Experience: 5+ years (5 years relevant)
Job Type: Contract to hire (C2H)
Notice Period: Immediate joiners or 15 days
Key Responsibilities:
Lead efforts to troubleshoot and resolve AWS infrastructure and operational issues, ensuring minimal downtime and optimal performance.
Architect and deploy scalable, secure, and efficient solutions on AWS that align with business objectives.
Provide hands-on support for migrating Azure and on-premises systems to AWS, ensuring smooth transitions and minimizing disruption.
Monitor, assess, and enhance the performance of AWS environments using tools like CloudWatch, AWS Trusted Advisor, and Cost Explorer.
Automate AWS infrastructure provisioning and management using CloudFormation and Terraform.
Monitor and optimize cloud costs and implement best practices for security using AWS IAM, KMS, GuardDuty, and other security tools.
Collaborate with development, DevOps, and operations teams to ensure seamless integration of AWS services and support day-to-day operations.
Create and maintain technical documentation and ensure that the operations team follows AWS best practices.
Qualifications:
1. 6 years of experience in AWS cloud architecture and operations
2. Expertise in AWS services such as EC2, Lambda, S3, RDS, DynamoDB, VPC, Route 53, and more
3. Proven experience migrating on-premises and Azure cloud workloads to AWS using migration tools
4. Strong understanding of AWS networking, including VPCs, VPNs, and Direct Connect
5. AWS Certified Solutions Architect – Professional and AWS DevOps certifications preferred
DO NOT share profiles which only have: CI/CD, AWS DevOps, Jenkins, Python, Ansible
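The infrastructure-as-code responsibility above (CloudFormation/Terraform) amounts to describing resources declaratively rather than clicking them together. A minimal sketch, generating a valid CloudFormation template body programmatically; the logical resource name `AppBucket` and the bucket name are invented, and in practice the template would usually be written directly in YAML or Terraform HCL:

```python
import json

def s3_bucket_template(bucket_name: str) -> dict:
    """Build a minimal CloudFormation template declaring one S3 bucket."""
    return {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {
            "AppBucket": {                      # logical ID, invented here
                "Type": "AWS::S3::Bucket",      # real CloudFormation resource type
                "Properties": {"BucketName": bucket_name},
            }
        },
    }

template = s3_bucket_template("example-app-artifacts")
# This JSON is what `aws cloudformation deploy` would consume.
print(json.dumps(template, indent=2))
```

Because the template is data, the same document can be diffed, reviewed, and replayed across environments, which is the core advantage over manual provisioning.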

Posted 18 hours ago

Apply

15.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About Company: They balance innovation with an open, friendly culture and the backing of a long-established parent company, known for its ethical reputation. We guide customers from what’s now to what’s next by unlocking the value of their data and applications to solve their digital challenges, achieving outcomes that benefit both business and society. About Client: Our client is a global digital solutions and technology consulting company headquartered in Mumbai, India. The company generates annual revenue of over $4.29 billion (₹35,517 crore), reflecting a 4.4% year-over-year growth in USD terms. It has a workforce of around 86,000 professionals operating in more than 40 countries and serves a global client base of over 700 organizations. Our client operates across several major industry sectors, including Banking, Financial Services & Insurance (BFSI), Technology, Media & Telecommunications (TMT), Healthcare & Life Sciences, and Manufacturing & Consumer. In the past year, the company achieved a net profit of $553.4 million (₹4,584.6 crore), marking a 1.4% increase from the previous year. It also recorded a strong order inflow of $5.6 billion, up 15.7% year-over-year, highlighting growing demand across its service lines. Key focus areas include Digital Transformation, Enterprise AI, Data & Analytics, and Product Engineering, reflecting its strategic commitment to driving innovation and value for clients across industries.
· Job Title: Microservice Architect
· Location: PAN India
· Experience: 15+ years
· Job Type: Contract to hire
· Notice Period: Immediate joiners
Mandatory Skills: Microservices architecture, API development and REST services, Docker and containerization, Redis/DynamoDB, OAuth and OpenID Connect protocols, very good communication skills
Job Summary:
Required Technical Skills:
- Experience in microservices architecture and development
- Strong containerization knowledge (Docker)
- Experience with API development and REST services
- Knowledge of caching solutions (Redis/DynamoDB)
- Experience with user profile management systems, shopping cart implementations, and product catalog services
- Understanding of OAuth and OpenID Connect protocols
- Experience with distributed application logging
- Proficiency in unit testing and integration testing
- Knowledge of application security best practices
- Experience with session management optimization
- Understanding of cloud-native application patterns
Seniority Level: Director
Industry: IT Services and IT Consulting
Employment Type: Contract
Job Functions: Business Development, Consulting
Skills: Microservices architecture, OAuth, OpenID Connect protocols, Docker, REST API

Posted 18 hours ago

Apply

5.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Description GPP Database Link (https://cummins365.sharepoint.com/sites/CS38534/) Leads projects for the design, development and maintenance of a data and analytics platform. Effectively and efficiently processes, stores and makes data available to analysts and other consumers. Works with key business stakeholders, IT experts and subject-matter experts to plan, design and deliver optimal analytics and data science solutions. Works on one or many product teams at a time. Key Responsibilities Designs and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured). Designs and implements a framework to continuously monitor and troubleshoot data quality and data integrity issues. Implements data governance processes and methods for managing metadata, access and retention for internal and external users. Designs and provides guidance on building reliable, efficient, scalable, quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages. Designs and implements physical data models to define the database structure, optimizing database performance through efficient indexing and table relationships. Participates in optimizing, testing, and troubleshooting of data pipelines. Designs, develops and operates large-scale data storage and processing solutions using different distributed and cloud-based platforms for storing data (e.g. Data Lakes, Hadoop, HBase, Cassandra, MongoDB, Accumulo, DynamoDB, others). Uses innovative and modern tools, techniques and architectures to partially or completely automate the most common, repeatable and tedious data preparation and integration tasks in order to minimize manual and error-prone processes and improve productivity. Assists with renovating the data management infrastructure to drive automation in data integration and management.
Ensures the timeliness and success of critical analytics initiatives by using agile development technologies such as DevOps, Scrum, Kanban Coaches and develops less experienced team members. Responsibilities Competencies: System Requirements Engineering - Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule, and resources; creates and maintains information linkages to related artifacts. Collaborates - Building partnerships and working collaboratively with others to meet shared objectives. Communicates effectively - Developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences. Customer focus - Building strong customer relationships and delivering customer-centric solutions. Decision quality - Making good and timely decisions that keep the organization moving forward. Data Extraction - Performs data extract-transform-load (ETL) activities from variety of sources and transforms them for consumption by various downstream applications and users using appropriate tools and technologies. Programming - Creates, writes and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements. Quality Assurance Metrics - Applies the science of measurement to assess whether a solution meets its intended outcomes using the IT Operating Model (ITOM), including the SDLC standards, tools, metrics and key performance indicators, to deliver a quality product. 
Solution Documentation - Documents information and solution based on knowledge gained as part of product development activities; communicates to stakeholders with the goal of enabling improved productivity and effective knowledge transfer to others who were not originally part of the initial learning. Solution Validation Testing - Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure that it works as designed and meets customer requirements. Data Quality - Identifies, understands and corrects flaws in data that supports effective information governance across operational business processes and decision making. Problem Solving - Solves problems and may mentor others on effective problem solving by using a systematic analysis process by leveraging industry standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies the systemic root causes and ensures actions to prevent problem reoccurrence are implemented. Values differences - Recognizing the value that different perspectives and cultures bring to an organization. Education, Licenses, Certifications College, university, or equivalent degree in relevant technical discipline, or relevant equivalent experience required. This position may require licensing for compliance with export controls or sanctions regulations. Experience Intermediate experience in a relevant discipline area is required. 
Knowledge of the latest technologies and trends in data engineering is highly preferred and includes: 5-8 years of experience Familiarity analyzing complex business systems, industry requirements, and/or data regulations Background in processing and managing large data sets Design and development for a Big Data platform using open source and third-party tools SPARK, Scala/Java, Map-Reduce, Hive, HBase, and Kafka or equivalent college coursework SQL query language Clustered compute cloud-based implementation experience Experience developing applications requiring large file movement for a Cloud-based environment and other data extraction tools and methods from a variety of sources Experience in building analytical solutions Intermediate Experiences In The Following Are Preferred Experience with IoT technology Experience in Agile software development Qualifications Work closely with business Product Owner to understand product vision. Play a key role across DBU Data & Analytics Power Cells to define, develop data pipelines for efficient data transport into Cummins Digital Core (Azure DataLake, Snowflake). Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment of DBU project data pipeline design standards. Independently design, develop, test, implement complex data pipelines from transactional systems (ERP, CRM) to Datawarehouses, DataLake. Responsible for creation, maintenance and management of DBU Data & Analytics data engineering documentation and standard operating procedures (SOP). Take part in evaluation of new data tools, POCs and provide suggestions. Take full ownership of the developed data pipelines, providing ongoing support for enhancements and performance optimization. Proactively address and resolve issues that compromise data accuracy and usability. Preferred Skills Programming Languages: Proficiency in languages such as Python, Java, and/or Scala. Database Management: Expertise in SQL and NoSQL databases.
Big Data Technologies: Experience with Hadoop, Spark, Kafka, and other big data frameworks. Cloud Services: Experience with Azure, Databricks and AWS cloud platforms. ETL Processes: Strong understanding of Extract, Transform, Load (ETL) processes. Data Replication: Working knowledge of replication technologies like Qlik Replicate is a plus API: Working knowledge of API to consume data from ERP, CRM
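The pipeline responsibilities above (transform, monitor data quality, quarantine bad records) can be sketched in miniature as a pure-Python transform with a quality gate. Field names (`id`, `qty`) are invented for illustration; at the scale this role describes, the same logic would typically run on Spark rather than plain Python:

```python
def transform(rows):
    """Normalize records and quarantine those failing basic quality checks."""
    clean, rejected = [], []
    for row in rows:
        # Quality gate: required key present, required measure not null.
        if not row.get("id") or row.get("qty") is None:
            rejected.append(row)        # kept aside for troubleshooting/alerting
            continue
        clean.append({"id": str(row["id"]).strip(),   # normalize whitespace
                      "qty": int(row["qty"])})        # coerce to a typed column
    return clean, rejected

clean, rejected = transform([{"id": " A1 ", "qty": "3"}, {"id": None, "qty": 5}])
print(clean, len(rejected))
```

Keeping rejects separate rather than silently dropping them is what makes the "continuously monitor and troubleshoot data quality" responsibility tractable: the reject stream is exactly what the monitoring and alert mechanisms watch.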

Posted 18 hours ago

Apply

4.0 - 5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Description GPP Database Link (https://cummins365.sharepoint.com/sites/CS38534/) Supports, develops and maintains a data and analytics platform. Effectively and efficiently processes, stores and makes data available to analysts and other consumers. Works with the Business and IT teams to understand the requirements to best leverage the technologies to enable agile data delivery at scale. Key Responsibilities Implements and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured). Implements methods to continuously monitor and troubleshoot data quality and data integrity issues. Implements data governance processes and methods for managing metadata, access and retention for internal and external users. Develops reliable, efficient, scalable, quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages. Develops physical data models and implements data storage architectures as per design guidelines. Analyzes complex data elements and systems, data flow, dependencies, and relationships in order to contribute to conceptual, physical and logical data models. Participates in testing and troubleshooting of data pipelines. Develops and operates large-scale data storage and processing solutions using different distributed and cloud-based platforms for storing data (e.g. Data Lakes, Hadoop, HBase, Cassandra, MongoDB, Accumulo, DynamoDB, others). Uses agile development technologies, such as DevOps, Scrum, Kanban and continuous improvement cycles, for data-driven applications.
Responsibilities Competencies: System Requirements Engineering - Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule, and resources; creates and maintains information linkages to related artifacts. Collaborates - Building partnerships and working collaboratively with others to meet shared objectives. Communicates effectively - Developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences. Customer focus - Building strong customer relationships and delivering customer-centric solutions. Decision quality - Making good and timely decisions that keep the organization moving forward. Data Extraction - Performs data extract-transform-load (ETL) activities from variety of sources and transforms them for consumption by various downstream applications and users using appropriate tools and technologies. Programming - Creates, writes and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements. Quality Assurance Metrics - Applies the science of measurement to assess whether a solution meets its intended outcomes using the IT Operating Model (ITOM), including the SDLC standards, tools, metrics and key performance indicators, to deliver a quality product. 
Solution Documentation - Documents information and solution based on knowledge gained as part of product development activities; communicates to stakeholders with the goal of enabling improved productivity and effective knowledge transfer to others who were not originally part of the initial learning. Solution Validation Testing - Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure that it works as designed and meets customer requirements. Data Quality - Identifies, understands and corrects flaws in data that supports effective information governance across operational business processes and decision making. Problem Solving - Solves problems and may mentor others on effective problem solving by using a systematic analysis process by leveraging industry standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies the systemic root causes and ensures actions to prevent problem reoccurrence are implemented. Values differences - Recognizing the value that different perspectives and cultures bring to an organization. Education, Licenses, Certifications College, university, or equivalent degree in relevant technical discipline, or relevant equivalent experience required. This position may require licensing for compliance with export controls or sanctions regulations. Experience 4-5 Years of experience. Relevant experience preferred such as working in a temporary student employment, intern, co-op, or other extracurricular team activities. 
Knowledge of the latest technologies in data engineering is highly preferred and includes: Exposure to Big Data open source SPARK, Scala/Java, Map-Reduce, Hive, HBase, and Kafka or equivalent college coursework SQL query language Clustered compute cloud-based implementation experience Familiarity developing applications requiring large file movement for a Cloud-based environment Exposure to Agile software development Exposure to building analytical solutions Exposure to IoT technology Qualifications Work closely with business Product Owner to understand product vision. Participate in DBU Data & Analytics Power Cells to define, develop data pipelines for efficient data transport into Cummins Digital Core (Azure DataLake, Snowflake). Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment of DBU project data pipeline design standards. Work under limited supervision to design, develop, test, implement complex data pipelines from transactional systems (ERP, CRM) to Datawarehouses, DataLake. Responsible for creation of DBU Data & Analytics data engineering documentation and standard operating procedures (SOP) with guidance and help from senior data engineers. Take part in evaluation of new data tools, POCs with guidance and help from senior data engineers. Take ownership of the developed data pipelines, providing ongoing support for enhancements and performance optimization under limited supervision. Assist in resolving issues that compromise data accuracy and usability. Programming Languages: Proficiency in languages such as Python, Java, and/or Scala. Database Management: Intermediate-level expertise in SQL and NoSQL databases. Big Data Technologies: Experience with Hadoop, Spark, Kafka, and other big data frameworks. Cloud Services: Experience with Azure, Databricks and AWS cloud platforms. ETL Processes: Strong understanding of Extract, Transform, Load (ETL) processes. API: Working knowledge of API to consume data from ERP, CRM
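The ERP/CRM-to-lake pipelines this role builds commonly use a high-watermark incremental extract: pull only rows updated since the last run, then advance the watermark. A hedged sketch; the row shape and function name are invented, and in a real pipeline the watermark would be persisted (e.g. in a control table) between runs:

```python
from datetime import datetime

def incremental_extract(rows, last_watermark):
    """Return rows newer than the watermark, plus the advanced watermark."""
    fresh = [r for r in rows if r["updated_at"] > last_watermark]
    # If nothing is fresh, keep the old watermark so the next run re-checks.
    new_watermark = max((r["updated_at"] for r in fresh), default=last_watermark)
    return fresh, new_watermark

rows = [
    {"id": 1, "updated_at": datetime(2024, 1, 1)},
    {"id": 2, "updated_at": datetime(2024, 2, 1)},
]
fresh, wm = incremental_extract(rows, datetime(2024, 1, 15))
print([r["id"] for r in fresh], wm)
```

Tools like Qlik Replicate, mentioned in the sibling posting, achieve the same effect via change data capture instead of timestamp scans; the watermark approach is the simplest scriptable equivalent.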

Posted 18 hours ago

Apply

5.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Description GPP Database Link (https://cummins365.sharepoint.com/sites/CS38534/) Leads projects for design, development and maintenance of a data and analytics platform. Effectively and efficiently process, store and make data available to analysts and other consumers. Works with key business stakeholders, IT experts and subject-matter experts to plan, design and deliver optimal analytics and data science solutions. Works on one or many product teams at a time. Key Responsibilities Designs and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured). Designs and implements framework to continuously monitor and troubleshoot data quality and data integrity issues. Implements data governance processes and methods for managing metadata, access, retention to data for internal and external users. Designs and provide guidance on building reliable, efficient, scalable and quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages. Designs and implements physical data models to define the database structure. Optimizing database performance through efficient indexing and table relationships. Participates in optimizing, testing, and troubleshooting of data pipelines. Designs, develops and operates large scale data storage and processing solutions using different distributed and cloud based platforms for storing data (e.g. Data Lakes, Hadoop, Hbase, Cassandra, MongoDB, Accumulo, DynamoDB, others). Uses innovative and modern tools, techniques and architectures to partially or completely automate the most-common, repeatable and tedious data preparation and integration tasks in order to minimize manual and error-prone processes and improve productivity. Assists with renovating the data management infrastructure to drive automation in data integration and management. 
Ensures the timeliness and success of critical analytics initiatives by using agile development technologies such as DevOps, Scrum, Kanban Coaches and develops less experienced team members. Responsibilities Competencies: System Requirements Engineering - Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule, and resources; creates and maintains information linkages to related artifacts. Collaborates - Building partnerships and working collaboratively with others to meet shared objectives. Communicates effectively - Developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences. Customer focus - Building strong customer relationships and delivering customer-centric solutions. Decision quality - Making good and timely decisions that keep the organization moving forward. Data Extraction - Performs data extract-transform-load (ETL) activities from variety of sources and transforms them for consumption by various downstream applications and users using appropriate tools and technologies. Programming - Creates, writes and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements. Quality Assurance Metrics - Applies the science of measurement to assess whether a solution meets its intended outcomes using the IT Operating Model (ITOM), including the SDLC standards, tools, metrics and key performance indicators, to deliver a quality product. 
Solution Documentation - Documents information and solution based on knowledge gained as part of product development activities; communicates to stakeholders with the goal of enabling improved productivity and effective knowledge transfer to others who were not originally part of the initial learning. Solution Validation Testing - Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure that it works as designed and meets customer requirements. Data Quality - Identifies, understands and corrects flaws in data that supports effective information governance across operational business processes and decision making. Problem Solving - Solves problems and may mentor others on effective problem solving by using a systematic analysis process by leveraging industry standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies the systemic root causes and ensures actions to prevent problem reoccurrence are implemented. Values differences - Recognizing the value that different perspectives and cultures bring to an organization. Education, Licenses, Certifications College, university, or equivalent degree in relevant technical discipline, or relevant equivalent experience required. This position may require licensing for compliance with export controls or sanctions regulations. Experience Intermediate experience in a relevant discipline area is required. 
Knowledge of the latest technologies and trends in data engineering is highly preferred and includes: 5-8 years of experience Familiarity with analyzing complex business systems, industry requirements, and/or data regulations Background in processing and managing large data sets Design and development for a Big Data platform using open source and third-party tools Spark, Scala/Java, MapReduce, Hive, HBase, and Kafka or equivalent college coursework SQL query language Clustered compute cloud-based implementation experience Experience developing applications requiring large file movement for a Cloud-based environment and other data extraction tools and methods from a variety of sources Experience in building analytical solutions Intermediate experience in the following is preferred Experience with IoT technology Experience in Agile software development Qualifications Work closely with business Product Owner to understand product vision. Play a key role across DBU Data & Analytics Power Cells to define and develop data pipelines for efficient data transport into Cummins Digital Core (Azure Data Lake, Snowflake). Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment of DBU project data pipeline design standards. Independently design, develop, test, and implement complex data pipelines from transactional systems (ERP, CRM) to data warehouses and data lakes. Responsible for creation, maintenance, and management of DBU Data & Analytics data engineering documentation and standard operating procedures (SOPs). Take part in evaluation of new data tools and POCs and provide suggestions. Take full ownership of the developed data pipelines, providing ongoing support for enhancements and performance optimization. Proactively address and resolve issues that compromise data accuracy and usability. Preferred Skills Programming Languages: Proficiency in languages such as Python, Java, and/or Scala. Database Management: Expertise in SQL and NoSQL databases.
Big Data Technologies: Experience with Hadoop, Spark, Kafka, and other big data frameworks. Cloud Services: Experience with Azure, Databricks, and AWS cloud platforms. ETL Processes: Strong understanding of Extract, Transform, Load (ETL) processes. Data Replication: Working knowledge of replication technologies like Qlik Replicate is a plus. APIs: Working knowledge of APIs to consume data from ERP and CRM systems.

Posted 19 hours ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Solution Designer (Cloud Data Integration) at Barclays within the Customer Digital and Data Business Area, you will play a vital role in supporting the successful delivery of location strategy projects. Your responsibilities will include ensuring projects are delivered according to plan, budget, quality standards, and governance protocols. By spearheading the evolution of the digital landscape, you will drive innovation and excellence, utilizing cutting-edge technology to enhance our digital offerings and deliver unparalleled customer experiences. To excel in this role, you should possess hands-on experience working with large-scale data platforms and developing cloud solutions within the AWS data platform. Your track record should demonstrate a history of driving business success through your expertise in AWS, distributed computing paradigms, and designing data ingestion programs using technologies like Glue, Lambda, S3, Redshift, Snowflake, Apache Kafka, and Spark Streaming. Proficiency in Python, PySpark, SQL, and database management systems is essential, along with a strong understanding of data governance principles and tools. Additionally, valued skills for this role may include experience in multi-cloud solution design, data modeling, data governance frameworks, agile methodologies, project management tools, business analysis, and product ownership within a data analytics context. A basic understanding of the banking domain, along with excellent analytical, communication, and interpersonal skills, will be crucial for success in this position. Your main purpose as a Solution Designer will involve designing, developing, and implementing solutions to complex business problems by collaborating with stakeholders to understand their needs and requirements. 
You will be accountable for designing solutions that balance technology risks against business delivery, driving consistency and aligning with modern software engineering practices and automated delivery tooling. Furthermore, you will be expected to provide impact assessments, fault finding support, and architecture inputs required to comply with the bank's governance processes. As an Assistant Vice President in this role, you will be responsible for advising on decision-making processes, contributing to policy development, and ensuring operational effectiveness. If the position involves leadership responsibilities, you will lead a team to deliver impactful work and set objectives for employees while demonstrating leadership behaviours focused on listening, inspiring, aligning, and developing others. Alternatively, as an individual contributor, you will lead collaborative assignments, guide team members, identify new directions for projects, consult on complex issues, and collaborate with other areas to support business activities. All colleagues at Barclays are expected to embody the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, as well as the Barclays Mindset to Empower, Challenge, and Drive. By demonstrating these values and mindset, you will contribute to creating an environment where colleagues can thrive and deliver consistently excellent results.

Posted 20 hours ago

Apply

4.0 - 8.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As a Solution Architect in the Pre-Sales department, with 4-6 years of experience in cloud infrastructure deployment, migration, and managed services, your primary responsibility will be to design AWS Cloud Professional Services and AWS Cloud Managed Services solutions tailored to meet customer needs and requirements. You will engage with customers to analyze their requirements, ensuring cost-effective and technically sound solutions are provided. Your role will also involve developing technical and commercial proposals in response to various client inquiries such as Requests for Information (RFI), Requests for Quotation (RFQ), and Requests for Proposal (RFP). Additionally, you will prepare and deliver technical presentations to clients, highlighting the value and capabilities of AWS solutions. Collaborating closely with the sales team, you will work towards supporting their objectives and closing deals that align with business needs. Your ability to apply creative and analytical problem-solving skills to address complex challenges using AWS technology will be crucial. Furthermore, you should possess hands-on experience in planning, designing, and implementing AWS IaaS, PaaS, and SaaS services. Experience in executing end-to-end cloud migrations to AWS, including discovery, assessment, and implementation, is required. You must also be proficient in designing and deploying well-architected landing zones and disaster recovery environments on AWS. Excellent communication skills, both written and verbal, are essential for effectively articulating solutions to technical and non-technical stakeholders. Your organizational, time management, problem-solving, and analytical skills will play a vital role in driving consistent business performance and exceeding targets. 
Desirable skills include intermediate-level experience with AWS services like AppStream, Elastic Beanstalk, ECS, ElastiCache, and more, as well as IT orchestration and automation tools such as Ansible, Puppet, and Chef. Knowledge of Terraform, Azure DevOps, and AWS development services will be advantageous. In this role based in Noida, Uttar Pradesh, India, you will have the opportunity to collaborate with technical and non-technical teams across the organization, ensuring scalable, efficient, and secure solutions are delivered on the AWS platform.

Posted 20 hours ago

Apply

3.0 years

0 Lacs

India

On-site

Description About Norstella At Norstella, our mission is simple: to help our clients bring life-saving therapies to market quicker—and help patients in need. Founded in 2022, but with a history going back to 1939, Norstella unites best-in-class brands to help clients navigate the complexities at each step of the drug development life cycle—and get the right treatments to the right patients at the right time. Each organization (Citeline, Evaluate, MMIT, Panalgo, The Dedham Group) delivers must-have answers for critical strategic and commercial decision-making. Together, via our market-leading brands, we help our clients: Citeline – accelerate the drug development cycle Evaluate – bring the right drugs to market MMIT – identify barriers to patient access Panalgo – turn data into insight faster The Dedham Group – think strategically for specialty therapeutics By combining the efforts of each organization under Norstella, we can offer an even wider breadth of expertise, cutting-edge data solutions and expert advisory services alongside advanced technologies such as real-world data, machine learning and predictive analytics. As one of the largest global pharma intelligence solution providers, Norstella has a footprint across the globe with teams of experts delivering world-class solutions in the USA, UK, The Netherlands, Japan, China and India. Job Description We are looking for an experienced Technical Lead with great communication skills, deep experience in software engineering, and most importantly, the ability and willingness to keep learning in this ever-changing technology landscape. Reporting to the Senior Director, the Technical Lead will be responsible for leading and managing a team of developers and engineers to deliver high-quality software products or services. In addition to technical expertise, the Tech Lead should have experience managing people, providing leadership, and creating a positive team culture.
The Technical Lead will also work with external developers responsible for the development and subsequent support of the various platform services components underpinning all the content applications across Norstella. As part of a large technology group, the Technical Lead will work with product management, architecture, and other software engineering teams in support of the product development roadmap. Key Duties & Responsibilities Technical Lead – Product Engineering Build a strong engineering team and culture. Work on projects with significant complexity Have a strong sense of ownership of the solutions that your team works on Be willing to work with and invest yourself in learning new technologies, programming languages, databases etc. Communicate effectively Be an "agile" person. You desire a fast-paced dynamic work environment Review existing technologies for suitability and make recommendations for change Develop, test, and maintain high-quality software applications using Spring Boot, AWS, and database technologies Work with designers and project managers to understand client requirements and translate them into technical specifications Collaborate with developers on the team to ensure code quality and consistency Help troubleshoot and debug issues in production and non-production environments Participate in code reviews and contribute to improving our coding practices and standards Key Requirements 3+ years of experience designing, developing and architecting backend microservices with RESTful API and API security frameworks, service-oriented and/or microservices architecture. 3+ years of people management experience. 9+ years of experience in programming with .NET. 4+ years of experience working with Node.js and the associated ecosystem.
2+ years of experience with AWS, GCP, Azure, or another cloud service 4+ years of experience in open-source frameworks (React, etc.) 2+ years of experience in Agile practices 2+ years of experience with modern relational and NoSQL databases (experience with relational and/or non-relational databases like PostgreSQL, DynamoDB, MySQL, etc. is a big plus) 4+ years of experience using modern build and deployment tools such as Jenkins and Docker Experience with “Test First” (TDD) software development process Experience within the pharma/healthcare sector is a plus. The Guiding Principles For Success At Norstella 01: Bold, Passionate, Mission-First We have a lofty mission to Smooth Access to Life Saving Therapies and we will get there by being bold and passionate about the mission and our clients. Our clients and the mission we are trying to accomplish must be at the forefront of our minds in everything we do. 02: Integrity, Truth, Reality We make promises that we can keep, and set goals that push us to new heights. Our integrity offers us the opportunity to learn and improve by being honest about what works and what doesn’t. By being true to the data and producing realistic metrics, we are able to create plans and resources to achieve our goals. 03: Kindness, Empathy, Grace We will empathize with everyone's situation, provide positive and constructive feedback with kindness, and accept opportunities for improvement with grace and gratitude. We use this principle across the organization to collaborate and build lines of open communication. 04: Resilience, Mettle, Perseverance We will persevere – even in difficult and challenging situations. Our ability to recover from missteps and failures in a positive way will help us to be successful in our mission. 05: Humility, Gratitude, Learning We will be true learners by showing humility and gratitude in our work.
We recognize that the smartest person in the room is the one who is always listening, learning, and willing to shift their thinking. Benefits Health Insurance Provident Fund Reimbursement of Certification Expenses Gratuity 24x7 Health Desk Norstella is an equal opportunities employer and does not discriminate on the grounds of gender, sexual orientation, marital or civil partner status, pregnancy or maternity, gender reassignment, race, color, nationality, ethnic or national origin, religion or belief, disability or age. Our ethos is to respect and value people’s differences, to help everyone achieve more at work as well as in their personal lives so that they feel proud of the part they play in our success. We believe that all decisions about people at work should be based on the individual’s abilities, skills, performance and behavior and our business requirements. Norstella operates a zero-tolerance policy to any form of discrimination, abuse or harassment. Sometimes the best opportunities are hidden by self-doubt. We disqualify ourselves before we have the opportunity to be considered. Regardless of where you came from, how you identify, or the path that led you here, you are welcome. If you read this job description and feel passion and excitement, we’re just as excited about you.

Posted 20 hours ago

Apply

8.0 years

0 Lacs

Itanagar, Arunachal Pradesh, India

On-site

About The Role We are seeking a highly skilled and experienced AWS Solution Architect to lead the design and implementation of scalable, secure, and high-performing cloud architectures. The ideal candidate will possess deep technical knowledge of AWS services, cloud-native design principles, and enterprise architecture. This is a critical role that will work cross-functionally with engineering, DevOps, product management, and business stakeholders to define and drive the cloud strategy across the organization. Responsibilities Design & Development: Design end-to-end cloud architecture solutions on AWS aligned with enterprise requirements, business goals, and security policies. Define solution blueprints including infrastructure, application, data, and security layers using the AWS Well-Architected Framework. Create architecture diagrams, integration patterns, and technical documentation for proposed solutions. Strategy & Roadmapping: Collaborate with business stakeholders and engineering teams to define cloud adoption strategy and migration roadmaps. Conduct AWS readiness assessments, gap analysis, and technology evaluations to recommend modernization. Oversight: Guide development and DevOps teams throughout the lifecycle of cloud solutions from proof-of-concept to deployment and optimization. Ensure best practices in CI/CD pipelines, infrastructure-as-code (IaC), observability, and cloud operations. Conduct design and code reviews to ensure architectural integrity and operational excellence. Security & Compliance: Define and enforce cloud security architectures including IAM, encryption, network security, and data privacy compliance (e.g., GDPR, HIPAA). Implement and review automated security controls, policy-based governance, and audit readiness. Optimization & Cost Management: Identify performance bottlenecks and implement scalable solutions for compute, storage, and networking.
Design for cost-efficiency using services such as Auto Scaling, Spot Instances, Reserved Instances, and Savings Plans. Engagement & Technical Leadership: Serve as a trusted advisor to internal teams and external clients on AWS technology stacks and cloud-native practices. Conduct architecture reviews, knowledge-sharing sessions, and enablement workshops for developers and engineers. Skills and Qualifications: 8+ years of overall IT experience with at least 4+ years in cloud architecture and AWS cloud engineering. Proven experience designing and implementing solutions using core AWS services such as: Compute: EC2, Lambda, ECS; Storage: S3, EBS; Databases: RDS, DynamoDB; Networking: VPC, Route 53, API Gateway; Security: IAM, KMS, WAF; DevOps & Monitoring: CodePipeline, CodeDeploy, CloudFormation, Terraform, CloudWatch. Strong understanding of distributed systems, microservices architecture, RESTful APIs, event-driven architecture, and serverless design patterns. Hands-on experience with Infrastructure as Code (IaC) using CloudFormation, Terraform, or CDK. Proficient in scripting or programming languages such as Python, Bash, or Node.js. Familiarity with containerization and orchestration using Docker and Kubernetes (EKS preferred). Experience with application modernization, cloud migrations, and hybrid cloud architectures. Deep understanding of network security, authentication protocols, encryption, and identity federation. (ref:hirist.tech)

Posted 21 hours ago

Apply

Exploring DynamoDB Jobs in India

DynamoDB is a popular NoSQL database service offered by Amazon Web Services (AWS) that is widely used by companies in India. The job market for DynamoDB professionals in India is currently booming, with many opportunities available for skilled individuals.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Mumbai
  5. Delhi

Average Salary Range

The average salary range for DynamoDB professionals in India varies based on experience level:

  • Entry-level: INR 4-6 lakhs per annum
  • Mid-level: INR 8-12 lakhs per annum
  • Experienced: INR 15-25 lakhs per annum

Career Path

A typical career path in DynamoDB may involve progressing from roles such as Junior Developer to Senior Developer and eventually to a Tech Lead position. Opportunities for specialization in areas like database architecture or cloud solutions may also arise.

Related Skills

In addition to expertise in DynamoDB, professionals in this field are often expected to have knowledge of related technologies and tools such as AWS services, NoSQL databases, data modeling, and serverless architecture.
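The data-modeling skill mentioned above is where DynamoDB differs most from relational design: access patterns drive the schema, and a composite primary key (partition key plus sort key) lets one query retrieve an ordered item collection. As a rough illustration in plain Python (no AWS calls; the key names `PK`/`SK` and the `CUST#`/`ORDER#` prefixes are hypothetical conventions, not an official API), a single-table layout might look like:

```python
from collections import defaultdict

# In-memory stand-in for a DynamoDB table with a composite primary key.
# PK (partition key) groups related items; SK (sort key) orders them.
table = defaultdict(list)

def put_item(item):
    """Store an item under its partition key, keeping sort-key order."""
    pk = item["PK"]
    table[pk] = sorted(table[pk] + [item], key=lambda i: i["SK"])

def query(pk, sk_prefix=""):
    """Mimic a Query: one partition, optional begins_with on the sort key."""
    return [i for i in table[pk] if i["SK"].startswith(sk_prefix)]

# Hypothetical single-table design: a customer profile and their orders
# share a partition key, so one Query fetches the whole item collection.
put_item({"PK": "CUST#42", "SK": "PROFILE", "name": "Asha"})
put_item({"PK": "CUST#42", "SK": "ORDER#2024-01-15", "total": 1200})
put_item({"PK": "CUST#42", "SK": "ORDER#2024-03-02", "total": 450})

orders = query("CUST#42", sk_prefix="ORDER#")
print(len(orders))  # 2
```

In real code the same shape maps onto boto3's `Table.put_item` and `Table.query` with a `begins_with` key condition; the point of the sketch is the design choice, co-locating related items under one partition key so reads stay single-partition.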

Interview Questions

  • What is DynamoDB and how does it differ from traditional relational databases? (basic)
  • Explain the primary key structure in DynamoDB. (basic)
  • What are the different types of DynamoDB primary keys? (basic)
  • How does DynamoDB handle read and write capacity units? (medium)
  • Can you explain the concepts of eventual consistency and strong consistency in DynamoDB? (medium)
  • How can you optimize DynamoDB performance? (medium)
  • What is the difference between partition key, sort key, and secondary index in DynamoDB? (medium)
  • How does DynamoDB handle data partitioning? (advanced)
  • Explain the DynamoDB Streams feature and its use cases. (advanced)
  • How can you implement transactions in DynamoDB? (advanced)
  • Describe the benefits and limitations of using DynamoDB Global Tables. (advanced)
  • How does DynamoDB handle data backups and restores? (medium)
  • Can you explain the capacity modes in DynamoDB? (medium)
  • What is the difference between a scan and a query operation in DynamoDB? (medium)
  • How can you troubleshoot performance issues in DynamoDB? (advanced)
  • Explain the concepts of read/write sharding in DynamoDB. (advanced)
  • How does DynamoDB handle conflicts in concurrent write operations? (advanced)
  • What are the best practices for designing DynamoDB tables? (medium)
  • How can you monitor and optimize costs in DynamoDB? (medium)
  • Describe the differences between DynamoDB and Aurora. (medium)
  • What are the security features available in DynamoDB? (medium)
  • How does DynamoDB handle data durability and availability? (medium)
  • Can you explain how to implement data encryption in DynamoDB? (medium)
  • Describe the DynamoDB Accelerator (DAX) service and its benefits. (medium)
  • How can you integrate DynamoDB with other AWS services like Lambda or S3? (medium)
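Several of the questions above (capacity units, consistency models, cost optimization) reduce to sizing arithmetic worth rehearsing before an interview. A minimal sketch of the provisioned-throughput rules as AWS documents them: reads are metered in 4 KB units, writes in 1 KB units, and eventually consistent reads cost half:

```python
import math

# Provisioned-capacity sizing rules, per AWS's published definitions:
#   1 RCU = one strongly consistent read/sec of an item up to 4 KB
#           (an eventually consistent read costs half)
#   1 WCU = one write/sec of an item up to 1 KB
# Larger items consume multiple units, rounded up per chunk.

def read_capacity_units(item_size_bytes, strongly_consistent=True):
    """RCUs consumed by one read of an item of the given size."""
    units = math.ceil(item_size_bytes / 4096)
    return units if strongly_consistent else units / 2

def write_capacity_units(item_size_bytes):
    """WCUs consumed by one write of an item of the given size."""
    return math.ceil(item_size_bytes / 1024)

# A 10 KB item: 3 RCUs strongly consistent, 1.5 eventually consistent.
print(read_capacity_units(10 * 1024))         # 3
print(read_capacity_units(10 * 1024, False))  # 1.5
print(write_capacity_units(1500))             # 2
```

The same arithmetic explains why a Scan is expensive: it reads every item in the table and bills for all of them, while a Query touches only one partition's item collection.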

Closing Remark

As you explore opportunities in the DynamoDB job market in India, remember to stay updated on the latest trends and technologies in the field. Prepare thoroughly for interviews by honing your skills and showcasing your expertise confidently. Good luck in your job search!
