
2969 Dynamodb Jobs - Page 19

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

7.0 - 11.0 years

0 Lacs

Maharashtra

On-site

At Crimson Enago, we are dedicated to developing AI-powered tools and services that enhance the productivity of researchers and professionals. Through our flagship products Trinka and RAx, we aim to streamline the stages of knowledge discovery, acquisition, creation, and dissemination.

Trinka is an AI-driven English grammar checker and writing assistant tailored for academic and technical writing. Crafted by linguists, scientists, and language enthusiasts, Trinka identifies and rectifies numerous intricate writing errors, including contextual spelling mistakes, advanced grammar issues, and vocabulary enhancements in real-time. Moreover, Trinka offers writing suggestions to ensure professional, concise, and engaging content. With subject-specific corrections, Trinka guarantees that the writing aligns with the subject matter, and its Enterprise solutions provide unlimited access and customizable features.

RAx is a smart workspace designed to support researchers, including students, professors, and corporate researchers, in their projects. Powered by proprietary AI algorithms, RAx serves as an integrated workspace for research endeavors, connecting various sources of information to user behaviors such as reading, writing, annotating, and discussions. This synergy reveals new insights and opportunities in the academic realm, revolutionizing traditional research practices.

Our team comprises passionate researchers, engineers, and designers united by the vision of transforming research-intensive projects. By alleviating cognitive burdens and facilitating the conversion of information into knowledge, we strive to simplify and enrich the research process. With a focus on scalability, data processing, AI integration, and global user interactions, our engineering team aims to empower individuals worldwide in their research pursuits.
As a Principal Engineer Fullstack at Trinka, you will lead a team of web developers, driving top-tier engineering standards and overseeing end-to-end project development and delivery. Collaborating with the Engineering Manager, Principal Engineer, SDE-3 leads, and Technical Project Manager, you will play a pivotal role in team management, recruitment, and training. Your primary focus will be hands-on coding, constituting a significant portion of your daily responsibilities.

Ideal candidates for this role at Trinka possess over 7 years of enterprise frontend-full-stack web experience, with expertise in the AngularJS-Java-AWS stack. Key characteristics we value include exceptional research skills, a commitment to testing and code quality, a penchant for scalable solutions, adeptness at project estimation and communication, and proficiency in cloud infrastructure optimization. Additionally, candidates should exhibit a keen eye for detail, a passion for user experience excellence, and a collaborative spirit essential for high-impact project delivery.

Experience requirements for this role encompass proven expertise in solution architecting, frontend-full-stack development, backend technologies, AWS services, HTML5, CSS3, CSS frameworks, developer workflows, testing practices, and collaborative software engineering. A deep-rooted interest in profiling, impact analysis, root cause analysis, and Elasticsearch server cluster optimization is advantageous, reflecting a holistic approach to software development and problem-solving. Join us at Crimson Enago and be part of a dynamic team committed to reshaping research practices and empowering professionals worldwide with innovative tools and services.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

You will be working as a Senior Software Engineer with 3-6 years of experience in Noida, UP. Your primary responsibilities will include developing and building highly reliable, scalable, and high-performing web applications for clients. It will be essential to review and comprehend business requirements, ensuring timely completion of development tasks with thorough testing to minimize defects. Collaboration across departments and customer interaction will be crucial in defining, designing, and presenting new concepts and solutions. Working closely with other developers is necessary to meet client needs effectively. You will be part of a rapid and agile development process to expedite time-to-market while maintaining appropriate controls. Implementing robust development and testing standards is a key aspect to ensure the quality of deliverables.

To qualify for this role, you need to have a B.Tech/MCA degree with a minimum of 3 years of relevant experience. Familiarity with MVC frameworks like Spring, ORM tools like Hibernate, and a strong grasp of OOP concepts, microservices, and the Java programming language is necessary. Proficiency in programming on relational platforms such as MySQL and Oracle is required, while exposure to non-relational platforms like DynamoDB/MongoDB (NoSQL) would be beneficial. Knowledge of JavaScript, jQuery, HTML, and XML will be advantageous. You must possess sound analytical skills, excellent communication skills, and experience with an agile development methodology, preferably Scrum. Additionally, experience in cloud computing or Linux, prior involvement in client-handling roles, being a proactive self-starter with a results-oriented mindset, and flexibility with good interpersonal skills are desirable attributes for this position.

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

As a Blockchain Developer with a minimum of 5 years of experience, you will be responsible for designing and developing blockchain-based solutions that meet the requirements of the business. Your expertise in blockchain and blockchain networks such as Hyperledger Fabric, Stellar, and Solana will be crucial for this role. Your proficiency in Golang, with at least 4 years of experience, will enable you to develop and optimize smart contracts and decentralized applications effectively. Hands-on experience with NoSQL databases like CouchDB, Cassandra, or DynamoDB is preferred, along with familiarity with CI/CD tools and processes such as Docker, Kubernetes, Jenkins, and Git.

Your key responsibilities will include collaborating with cross-functional teams to deliver high-quality solutions, implementing security best practices across blockchain solutions, and analyzing and enhancing blockchain network performance. Your strong understanding of cryptography, resource optimization, and data clustering will be valuable in this role. Moreover, your project knowledge in crypto exchange platforms, smart contract development, tokenization, and NFT projects will further enhance your ability to contribute effectively to the development and optimization of blockchain solutions.

Posted 1 week ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Description

Amazon Time & Pay Innovation (T&PA) organisation is looking for Software Development Engineers who enjoy working in a greenfield environment building large-scale intelligent products and services that offer a consumer-grade user experience using a service-oriented architecture based on native AWS components. T&PA's mission is to build world-class time systems that are intuitive and friction-free for all users, enable management of time and pay with close to zero defects, are available on-demand as new businesses launch, and flexibly support future business innovation. T&PA serves all Amazon employees across more than 25 countries and a dozen lines of business. We are tackling new, hard problems that Amazon has not solved at scale, creating fundamentally improved ways for employees to record time and reduce pay defects. T&PA has it all - early-stage hustle, operational excellence, technical complexity and global scope across multiple Amazon businesses.

Key job responsibilities

We are looking for a strong Software Development Engineer who can take the lead in identifying and solving ambiguous problems. You will own and heavily influence the architecture and design, and will write a significant portion of the “critical-path” code. You will solve problems at Amazon scale, an order of magnitude larger than supported by any commodity solutions and growing exponentially. You will use highly scalable AWS technologies such as Lambda, EMR, DynamoDB, S3, Kinesis, ECS and many others to do this. Successful candidates will be strong leaders who can prioritize well, communicate clearly, and have a consistent track record of delivering large-scale solutions. You will contribute to all aspects of an agile software development lifecycle, including design, architecture, development, documentation, testing and operations. You will push your design and architecture by owning all aspects of solutions end-to-end, through full-stack software development.
A day in the life

The day-to-day activities of a Software Development Engineer will include:
- Designing, implementing, and testing software solutions using native AWS technologies
- Collaborating with product managers and customers to manage requirements for new features and updates to existing systems
- Partnering with internal customers to translate complex business logic into reusable and scalable code
- Periodically providing on-call support
- Coaching and mentoring junior engineers on the team
- Working with some of the most talented and dedicated team members you can find
- Having fun

Basic Qualifications
- 3+ years of non-internship professional software development experience
- 2+ years of non-internship design or architecture (design patterns, reliability and scaling) of new and existing systems experience
- Experience programming with at least one software programming language

Preferred Qualifications
- 3+ years of full software development life cycle experience, including coding standards, code reviews, source control management, build processes, testing, and operations
- Bachelor's degree in computer science or equivalent

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Company - ADCI HYD 13 SEZ Job ID: A3039206
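Since the role leans on DynamoDB among its AWS services, a small illustration of the data format DynamoDB works with may help readers gauge the stack. This is a minimal, hypothetical sketch (not taken from the posting) that marshals a plain Python value into DynamoDB's low-level attribute-value representation, the JSON shape accepted by the PutItem API:

```python
from decimal import Decimal

def to_dynamodb_item(obj):
    """Marshal a plain Python value into DynamoDB's low-level
    attribute-value representation. Covers the common scalar and
    container types; set types are omitted for brevity."""
    if obj is None:
        return {"NULL": True}
    if isinstance(obj, bool):          # must precede the int check
        return {"BOOL": obj}
    if isinstance(obj, (int, float, Decimal)):
        return {"N": str(obj)}         # numbers travel as strings
    if isinstance(obj, str):
        return {"S": obj}
    if isinstance(obj, (list, tuple)):
        return {"L": [to_dynamodb_item(v) for v in obj]}
    if isinstance(obj, dict):
        return {"M": {k: to_dynamodb_item(v) for k, v in obj.items()}}
    raise TypeError(f"unsupported type: {type(obj).__name__}")

# Illustrative item; the attribute names are invented, not from the posting.
item = to_dynamodb_item({"pk": "EMP#1", "hours": 7.5, "active": True})
```

In practice the AWS SDK's higher-level helpers produce this shape automatically; the sketch only shows the underlying representation a DynamoDB-focused engineer ends up reasoning about.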

Posted 1 week ago

Apply

7.0 - 11.0 years

0 Lacs

Haryana

On-site

Genpact is a global professional services and solutions firm dedicated to delivering outcomes that shape the future. With a workforce of over 125,000 professionals spanning more than 30 countries, we are fueled by our innate curiosity, entrepreneurial agility, and commitment to creating lasting value for our clients. Our purpose, the relentless pursuit of a world that works better for people, drives us to serve and transform leading enterprises, including the Fortune Global 500, leveraging our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

We are currently seeking applications for the position of Principal Consultant - Databricks Lead Developer. As a Databricks Developer in this role, you will be tasked with solving cutting-edge real-world problems to meet both functional and non-functional requirements.

Responsibilities:
- Keep abreast of new and emerging technologies and assess their potential application for service offerings and products.
- Collaborate with architects and lead engineers to devise solutions that meet functional and non-functional requirements.
- Demonstrate proficiency in understanding relevant industry trends and standards.
- Showcase strong analytical and technical problem-solving skills.
- Possess experience in the Data Engineering domain.

Qualifications we are looking for:

Minimum qualifications:
- Bachelor's Degree or equivalency in CS, CE, CIS, IS, MIS, or an engineering discipline, or equivalent work experience.
- <<>> years of experience in IT.
- Familiarity with new and emerging technologies and their possible applications for service offerings and products.
- Collaboration with architects and lead engineers to develop solutions meeting functional and non-functional requirements.
- Understanding of industry trends and standards.
- Strong analytical and technical problem-solving abilities.
- Proficiency in either Python or Scala, preferably Python.
- Experience in the Data Engineering domain.

Preferred qualifications:
- Knowledge of Unity Catalog and basic governance.
- Understanding of Databricks SQL Endpoint.
- Experience with CI/CD for building Databricks job pipelines.
- Exposure to migration projects for building unified data platforms.
- Familiarity with DBT, Docker, and Kubernetes.

If you are a proactive individual with a passion for innovation and a strong commitment to continuous learning and upskilling, we invite you to apply for this exciting opportunity to join our team at Genpact.

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

Karnataka

On-site

The ideal candidate for this position should have a Bachelor's or Master's degree in Computer Science or Computer Engineering, or an equivalent field. You should possess at least 2-6 years of experience in server-side development using technologies such as Golang, Node.js, or Python. You should demonstrate proficiency in working with AWS services like Lambda, DynamoDB, Step Functions, and S3. Additionally, you should have hands-on experience in deploying and managing serverless service environments. Experience with Docker, containerization, and Kubernetes is also required for this role. A strong background in database technologies including MongoDB and DynamoDB is preferred. You should also have experience with CI/CD pipelines and automation processes. Any experience in video transcoding/streaming on the cloud would be considered a plus. Problem-solving skills are essential for this role, as you may encounter various challenges while working on projects.
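For readers unfamiliar with the serverless stack this listing describes, here is a minimal, hypothetical sketch of the AWS Lambda handler shape used with an API Gateway proxy integration in Python. The event fields and names are illustrative assumptions, not taken from the posting:

```python
import json

def lambda_handler(event, context):
    """Minimal Lambda handler for an API Gateway proxy integration:
    read an optional path parameter and return a JSON response in the
    statusCode/headers/body shape API Gateway expects."""
    name = (event.get("pathParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }

# Local invocation with a fabricated event, the way a unit test would call it.
resp = lambda_handler({"pathParameters": {"name": "dev"}}, None)
```

In a real deployment, API Gateway serializes the returned dict into the HTTP response, and services such as DynamoDB or Step Functions would be called from inside the handler via the AWS SDK.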

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

We are looking for a highly skilled and motivated Python Backend Engineer with a strong background in AWS to join our dynamic team. As an ideal candidate, you should have a deep understanding of AWS services and infrastructure and a passion for developing scalable backend solutions. You should have at least 5 years of experience in backend development using Python and a solid grasp of designing and developing RESTful APIs. Proficiency in working with relational and NoSQL databases like PostgreSQL, MySQL, and DynamoDB is essential. Strong problem-solving skills and the ability to work independently as well as collaboratively with a team are key attributes we are looking for. Excellent communication skills are crucial as you will be required to articulate technical concepts to non-technical stakeholders. If you are enthusiastic about Python backend development and have a strong AWS background, we encourage you to apply and become a part of our team. Your responsibilities will include designing, developing, and maintaining backend systems using Python and related technologies. You will be responsible for implementing and managing various AWS services such as EC2, S3, RDS, Lambda, and API Gateway to support scalable and reliable applications. Collaboration with cross-functional teams to define, design, and deliver new features is an integral part of the role. Optimizing application performance, ensuring infrastructure security and scalability, developing and maintaining RESTful APIs, participating in code reviews, troubleshooting production issues, and automating deployment processes using CI/CD tools are some of the tasks you will be involved in. Documenting system designs, processes, and workflows for future reference and knowledge sharing will also be a part of your responsibilities. At GlobalLogic, you will have the opportunity to work on exciting projects for industries such as High-Tech, communication, media, healthcare, retail, and telecom. 
You will collaborate with a diverse team of highly talented individuals in a laid-back environment and have the chance to work from one of our global centers or client facilities. We prioritize work-life balance by offering flexible work schedules, opportunities to work from home, paid time off, and holidays. Our dedicated Learning & Development team organizes various training programs to enhance your professional skills and personal growth. We provide competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), health awareness programs, extended maternity leave, performance bonuses, and referral bonuses. In addition, you can enjoy sports events, cultural activities, food at subsidized rates, corporate parties, and discounts at popular stores and restaurants within our vibrant offices.

GlobalLogic is a leader in digital engineering, helping global brands design and build innovative products, platforms, and digital experiences. With headquarters in Silicon Valley, we operate design studios and engineering centers worldwide, serving customers in various industries. As a Hitachi Group Company, we contribute to driving innovation through data and technology in the Social Innovation Business, aiming to create a sustainable society with a higher quality of life.

Posted 1 week ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

ABOUT TRIBUTE TECHNOLOGY:

At Tribute Technology, we make end-of-life celebrations memorable, meaningful, and effortless through thoughtful and innovative technology solutions. Our mission is to help communities around the world celebrate life and pay tribute to those we love. Our comprehensive platform brings together software and technology to provide a fully integrated experience for all users, whether that is a family, a funeral home, or an online publisher. We are the market leader in the US and Canada, with global expansion plans and a growing international team of more than 400 individuals in the US, Canada, Philippines, Ukraine and India.

ABOUT YOU:

We are seeking an experienced Software Engineer to help develop our innovative video creation platform. You will focus on building cloud-native services capable of merging audio and visual assets into high-quality videos. Experience with AWS cloud-native architectures, serverless computing, and media processing tools (FFmpeg) is essential.

ESSENTIAL DUTIES AND RESPONSIBILITIES:
- Design and implement scalable, efficient video processing and asset-merging pipelines using FFmpeg.
- Develop and optimize cloud-native applications leveraging AWS services such as Lambda, ECS, EKS, Step Functions, S3, DynamoDB, and CloudFront.
- Create robust APIs and integration services to manage video workflows, ensuring high availability, reliability, and scalability.
- Implement advanced rendering techniques optimized for both web (HTML5, WebAssembly, WebGL) and native platforms (iOS, Android, desktop).
- Ensure system performance through rigorous monitoring, logging, and proactive optimization.
- Contribute to continuous integration and continuous deployment (CI/CD) pipelines, automation, and tooling to accelerate delivery.
- Collaborate closely with product management, frontend engineers, designers, and DevOps teams to translate business requirements into technical solutions.
QUALIFICATIONS:
- Strong written and verbal communication skills working with software development teams
- 5+ years of experience developing cloud-native applications, preferably on AWS
- Strong hands-on experience using FFmpeg for video and audio processing
- Proficiency in one or more programming languages: Go, Rust, Node.js, or C#
- Solid understanding of web and native video rendering technologies
- Familiarity with infrastructure-as-code (Terraform, CloudFormation) and CI/CD tooling
- Excellent problem-solving skills, attention to detail, and ability to work collaboratively in agile teams

PREFERRED EXPERIENCE:
- High-performance programming experience in Rust or Go
- Experience optimizing cloud resource usage and performance tuning for large-scale multimedia applications
- Deep knowledge of video streaming, codecs, and performance optimization techniques

BENEFITS:
- Competitive salary
- Fully remote across India
- An outstanding collaborative work environment

Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions of the position.
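As a concrete (and hypothetical, not from the posting) illustration of the FFmpeg asset-merging work this listing describes, the sketch below builds the ffmpeg argument list for muxing a replacement audio track onto an existing video stream without re-encoding the video:

```python
def build_merge_command(video_path, audio_path, out_path):
    """Build an ffmpeg argv list that keeps the video stream as-is and
    replaces the audio track, re-encoding audio to AAC. The file names
    are illustrative; the flags are standard ffmpeg options."""
    return [
        "ffmpeg",
        "-i", video_path,   # input 0: video (may carry old audio)
        "-i", audio_path,   # input 1: replacement audio
        "-map", "0:v:0",    # select the video stream from input 0
        "-map", "1:a:0",    # select the audio stream from input 1
        "-c:v", "copy",     # do not re-encode video
        "-c:a", "aac",      # encode audio to AAC
        "-shortest",        # stop at the shorter of the two inputs
        out_path,
    ]

cmd = build_merge_command("clip.mp4", "narration.wav", "tribute.mp4")
```

The list can be handed to `subprocess.run(cmd, check=True)` on a host with ffmpeg installed; a production pipeline would add error handling, timeouts, and per-asset codec choices on top of this.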

Posted 1 week ago

Apply

9.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Our Company

Changing the world through digital experiences is what Adobe’s all about. We give everyone—from emerging artists to global brands—everything they need to design and deliver exceptional digital experiences! We’re passionate about empowering people to create beautiful and powerful images, videos, and apps, and transform how companies interact with customers across every screen. We’re on a mission to hire the very best and are committed to creating exceptional employee experiences where everyone is respected and has access to equal opportunity. We realize that new ideas can come from everywhere in the organization, and we know the next big idea could be yours!

What You'll Do

Lead an engineering team at Adobe, developing adaptable web services for business needs.

What You Need To Succeed
- 9+ years of experience in software development, with a proven track record in a leadership or management role.
- Proficiency in backend development.
- Strong database skills with expertise in SQL and NoSQL databases (e.g., MySQL, DynamoDB).
- Deep understanding of modern software architecture, including microservices, event-driven systems, and API-first development.
- Experience with cloud platforms (AWS) and their services for enterprise applications.
- Proficiency in version control (Git), CI/CD pipelines, and DevOps practices for software delivery.
- Familiarity with Docker, Kubernetes, and Infrastructure as Code (IaC) tools.
- Proven understanding of software development methodologies (Agile, Scrum, or Kanban) and design patterns.
- Hands-on experience with scalability and performance tuning for large applications.
- Ability to lead multi-functional teams, mentor engineers, and build high-performing development teams.
- Proven problem-solving, analytical, and decision-making skills with a strong bias for action.
- Excellent communication, collaboration, and management skills.
- Experience in hiring, mentoring, and career development of software engineers.
Passion for building high-quality software and improving engineering processes. BS/MS or equivalent experience in Computer Science or a related field. Adobe is proud to be an Equal Employment Opportunity employer. We do not discriminate based on gender, race or color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, veteran status, or any other applicable characteristics protected by law. Learn more about our vision here. Adobe aims to make Adobe.com accessible to any and all users. If you have a disability or special need that requires accommodation to navigate our website or complete the application process, email accommodations@adobe.com or call (408) 536-3015.

Posted 1 week ago

Apply

5.0 - 7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

We’re Hiring: AWS Data Engineers at Coforge Ltd. Immediate joiners preferred.

Location: Hyderabad only
Experience Required: 5 to 7 years
Send your CV to Gaurav.2.Kumar@coforge.com

About the Role:

As an AWS Data Engineer, you will be responsible for designing, developing, and optimizing robust, scalable, and secure data pipelines and infrastructure on AWS. You’ll work with diverse datasets and collaborate with cross-functional teams to enable data-driven decision-making across the organization.

Key Responsibilities
- Design and develop end-to-end ETL/ELT pipelines using AWS Glue, Lambda, Step Functions, Kinesis, and Airflow
- Leverage AWS services like S3, Redshift, RDS, DynamoDB, EMR, Athena, and Lake Formation
- Write and optimize Python code for data processing and automation (PySpark experience preferred)
- Design and manage relational and NoSQL databases (MS SQL Server, PostgreSQL, MongoDB)
- Implement data modeling, warehousing, and governance best practices
- Monitor and optimize performance, scalability, and cost of data infrastructure
- Collaborate with data scientists, analysts, and engineering teams
- Maintain clear technical documentation and stay updated with AWS innovations

Required Skills & Qualifications
- Bachelor’s degree in Computer Science or related field
- 6–8 years of hands-on experience in data engineering on AWS
- Strong proficiency in Python and AWS data services
- Expertise in MS SQL Server and other databases
- Experience with orchestration tools like Step Functions and Airflow
- Solid understanding of data lakes, warehousing, and ETL/ELT methodologies
- Excellent communication and problem-solving skills
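To make the ETL/ELT responsibilities above concrete, here is a deliberately tiny, self-contained sketch of the extract-transform-load shape in plain Python. Column names and cleaning rules are invented for illustration; a real pipeline on AWS Glue would express the same three steps over Spark DataFrames:

```python
import csv
import io

def extract(csv_text):
    """Extract: parse raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: enforce types, normalize fields, drop incomplete rows."""
    out = []
    for r in rows:
        if not r.get("order_id") or not r.get("amount"):
            continue  # skip records missing required fields
        out.append({
            "order_id": r["order_id"].strip(),
            "amount": round(float(r["amount"]), 2),
            "region": r.get("region", "").strip().upper() or "UNKNOWN",
        })
    return out

def load(rows, sink):
    """Load: append transformed rows to a destination (here a plain
    list standing in for a warehouse table)."""
    sink.extend(rows)
    return len(rows)

raw = "order_id,amount,region\nA1,19.999,south\nA2,,north\nA3,5,\n"
sink = []
loaded = load(transform(extract(raw)), sink)
```

The same extract/transform/load separation is what orchestration tools such as Step Functions or Airflow schedule and retry at the step level.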

Posted 1 week ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Description

Amazon Business Customer Support (ABCS) is looking for a Business Intelligence Engineer to help build next-generation metrics and drive business analytics that have measurable impact. The successful candidate will have a strong understanding of different businesses and customer profiles - the underlying analytics, and the ability to translate business requirements into analysis, collect and analyze data, and make recommendations back to the business. BIEs also continuously learn new systems, tools, and industry best practices to help design new studies and build new tools that help our team automate and accelerate analytics. As a Business Intelligence Engineer, you will develop strategic reports, design UIs and drive projects to support ABCS decision making. This role is inherently cross-functional: you will work closely with finance teams, engineering, and leadership across Amazon Business Customer Service. A successful candidate will be a self-starter, comfortable with ambiguity, able to think big and be creative (while still paying careful attention to detail). You should be skilled in database design, be comfortable dealing with large and complex data sets, and have experience building self-service dashboards and using visualization tools, especially Tableau. You should have strong analytical and communication skills. You will work with a team of analytics professionals who are passionate about using machine learning to build automated systems and solve problems that matter to our customers. Your work will directly impact our customers and operations. Members of this team will be challenged to innovate using the latest big data techniques. We are looking for people who are motivated by thinking big, moving fast, and exploring business insights. If you love to implement solutions to complex problems while working hard, having fun, and making history, this may be the opportunity for you.
Own the design, development, and maintenance of ongoing metrics, reports, analyses, and dashboards on the key drivers of our business.

Key job responsibilities
- Scope, design, and build database structure and schema.
- Create data pipelines using ETL connections/SQL queries.
- Retrieve and analyze data using a broad set of Amazon's data technologies.
- Pull data on an ad-hoc basis using SQL queries.
- Design, build and maintain automated reporting and dashboards.
- Conduct deep dives to identify root causes of pain points and opportunities for improvement.
- Become a subject matter expert in ABCS data, and support team members in Dive Deep.
- Work closely with CSBI teams to ensure ABCS uses globally aligned standard metrics and definitions.
- Collaborate with finance and business to gather data and metrics requirements.

A day in the life

We thrive on solving challenging problems to innovate for our customers. By pushing the boundaries of technology, we create unparalleled experiences that enable us to rapidly adapt in a dynamic environment. If you are not sure that every qualification on the list above describes you exactly, we'd still love to hear from you! At Amazon, we value people with unique backgrounds, experiences, and skillsets. If you’re passionate about this role and want to make an impact on a global scale, please apply!

Basic Qualifications
- Experience with data visualization using Tableau, Quicksight, or similar tools
- Experience with data modeling, warehousing and building ETL pipelines
- Experience writing complex SQL queries
- Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling

Preferred Qualifications
- Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift
- Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets

Our inclusive culture empowers Amazonians to deliver the best results for our customers.
If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - ADCI - Haryana Job ID: A3038973

Posted 1 week ago

Apply


8.0 - 10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

8-10 years of relevant work experience showing growth as a Data Engineer. Hands-on programming experience. Implementation experience with Kafka, Kinesis, Spark, AWS Glue, and AWS Lake Formation. Experience with performance optimization in batch and real-time processing applications. Expertise in data governance and data security implementation. Good hands-on design and programming skills building reusable tools and products. Experience developing in AWS or similar cloud platforms. Preferred: ECS, EKS, S3, EMR, DynamoDB, Aurora, Redshift, QuickSight, or similar. Familiarity with systems with a very high volume of transactions, microservice design, or data processing pipelines (Spark). Knowledge of and hands-on experience with serverless technologies such as Lambda, MSK, MWAA, and Kinesis Analytics a plus. Expertise in practices like Agile, peer reviews, and continuous integration.
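Real-time processing of the Kafka/Kinesis/Spark kind listed above usually reduces to windowed aggregation over an event stream. The sketch below shows a tumbling-window count in plain Python, with invented event tuples and an illustrative 60-second window; a real Spark Structured Streaming or Kinesis Analytics job expresses the same idea through its own APIs.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_s=60):
    """Group (timestamp_seconds, key) events into fixed windows and
    count occurrences per key -- the core of a streaming aggregation."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_s)  # start of this event's bucket
        counts[(window_start, key)] += 1
    return dict(counts)

# Window [0, 60) receives two clicks; window [60, 120) a view and a click.
events = [(5, "click"), (42, "click"), (61, "view"), (70, "click")]
print(tumbling_window_counts(events))
```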

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Description GPP Database Link (https://cummins365.sharepoint.com/sites/CS38534/) Leads projects for design, development and maintenance of a data and analytics platform. Effectively and efficiently process, store and make data available to analysts and other consumers. Works with key business stakeholders, IT experts and subject-matter experts to plan, design and deliver optimal analytics and data science solutions. Works on one or many product teams at a time. Key Responsibilities Designs and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured). Designs and implements framework to continuously monitor and troubleshoot data quality and data integrity issues. Implements data governance processes and methods for managing metadata, access, retention to data for internal and external users. Designs and provide guidance on building reliable, efficient, scalable and quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages. Designs and implements physical data models to define the database structure. Optimizing database performance through efficient indexing and table relationships. Participates in optimizing, testing, and troubleshooting of data pipelines. Designs, develops and operates large scale data storage and processing solutions using different distributed and cloud based platforms for storing data (e.g. Data Lakes, Hadoop, Hbase, Cassandra, MongoDB, Accumulo, DynamoDB, others). Uses innovative and modern tools, techniques and architectures to partially or completely automate the most-common, repeatable and tedious data preparation and integration tasks in order to minimize manual and error-prone processes and improve productivity. Assists with renovating the data management infrastructure to drive automation in data integration and management. 
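One of the responsibilities above is continuously monitoring and troubleshooting data quality and data integrity issues. A minimal sketch of such a batch-level check, with field names and rules invented for the example, might look like:

```python
def check_batch(rows):
    """Return a list of human-readable issues found in a batch of records:
    null checks plus a simple range rule, run before loading downstream."""
    issues = []
    for i, row in enumerate(rows):
        if row.get("customer_id") is None:
            issues.append(f"row {i}: missing customer_id")
        amount = row.get("amount")
        if amount is not None and amount < 0:
            issues.append(f"row {i}: negative amount {amount}")
    return issues

batch = [
    {"customer_id": "C1", "amount": 10.0},
    {"customer_id": None, "amount": 5.0},
    {"customer_id": "C3", "amount": -2.5},
]
print(check_batch(batch))
```

In a production pipeline the returned issues would feed the monitoring and alert mechanisms the posting describes rather than being printed.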
Ensures the timeliness and success of critical analytics initiatives by using agile development technologies such as DevOps, Scrum, and Kanban. Coaches and develops less experienced team members. Responsibilities Competencies: System Requirements Engineering - Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule, and resources; creates and maintains information linkages to related artifacts. Collaborates - Building partnerships and working collaboratively with others to meet shared objectives. Communicates effectively - Developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences. Customer focus - Building strong customer relationships and delivering customer-centric solutions. Decision quality - Making good and timely decisions that keep the organization moving forward. Data Extraction - Performs data extract-transform-load (ETL) activities from a variety of sources and transforms them for consumption by various downstream applications and users using appropriate tools and technologies. Programming - Creates, writes and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements. Quality Assurance Metrics - Applies the science of measurement to assess whether a solution meets its intended outcomes using the IT Operating Model (ITOM), including the SDLC standards, tools, metrics and key performance indicators, to deliver a quality product. 
Solution Documentation - Documents information and solution based on knowledge gained as part of product development activities; communicates to stakeholders with the goal of enabling improved productivity and effective knowledge transfer to others who were not originally part of the initial learning. Solution Validation Testing - Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure that it works as designed and meets customer requirements. Data Quality - Identifies, understands and corrects flaws in data that supports effective information governance across operational business processes and decision making. Problem Solving - Solves problems and may mentor others on effective problem solving by using a systematic analysis process by leveraging industry standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies the systemic root causes and ensures actions to prevent problem reoccurrence are implemented. Values differences - Recognizing the value that different perspectives and cultures bring to an organization. Education, Licenses, Certifications College, university, or equivalent degree in relevant technical discipline, or relevant equivalent experience required. This position may require licensing for compliance with export controls or sanctions regulations. Experience Intermediate experience in a relevant discipline area is required. 
Knowledge of the latest technologies and trends in data engineering is highly preferred and includes: Familiarity with analyzing complex business systems, industry requirements, and/or data regulations. Background in processing and managing large data sets. Design and development for a Big Data platform using open source and third-party tools (Spark, Scala/Java, Map-Reduce, Hive, HBase, and Kafka) or equivalent college coursework. SQL query language. Clustered compute cloud-based implementation experience. Experience developing applications requiring large file movement for a cloud-based environment, and other data extraction tools and methods from a variety of sources. Experience in building analytical solutions. Intermediate experience in the following is preferred: Experience with IoT technology. Experience in Agile software development. Qualifications: Strong programming skills in SQL, Python and PySpark for data processing and automation. Experience with Databricks and Snowflake (preferred) for building and maintaining data pipelines. Understanding of Machine Learning and AI techniques, especially for data quality and anomaly detection. Experience with cloud platforms such as Azure and AWS, and familiarity with Azure Web Apps. Knowledge of Data Quality and Data Governance concepts (preferred). Nice to have: Power BI dashboard development experience.

Posted 1 week ago

Apply

3.0 - 6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Our Company Changing the world through digital experiences is what Adobe’s all about. We give everyone—from emerging artists to global brands—everything they need to design and deliver exceptional digital experiences! We’re passionate about empowering people to create beautiful and powerful images, videos, and apps, and transform how companies interact with customers across every screen. We’re on a mission to hire the very best and are committed to creating exceptional employee experiences where everyone is respected and has access to equal opportunity. We realize that new ideas can come from everywhere in the organization, and we know the next big idea could be yours! About the team: Our team integrates technology, architecture, knowledge, tools, and processes to deliver all sub-systems. What you need to succeed - Essential skills: 3-6 years of experience with full-stack software development using backend technologies like Java, Go, and REST APIs; databases, both NoSQL (DynamoDB) and SQL; and DevOps tools such as AWS, Kubernetes, and Docker. Hands-on experience in crafting and developing complex, large-scale microservices and RESTful APIs for real-time responsive applications. Proficiency in data structures and low-level design. Strong verbal and written communication skills. What you'll do: Design, build, and deploy high-quality code for the Adobe Unified platform. Innovate within the current system to improve robustness, ease, and convenience. Articulate design and code choices to cross-functional teams. Review and provide feedback on features, technology, architecture, designs, time and budget estimates, and test strategies. Develop and evolve engineering and business processes to optimize team efficiency. Partner with senior engineers across Adobe to achieve common goals. Be part of the Adobe Unified Platform by building next-generation products and platforms! Develop and maintain RESTful APIs and microservices. 
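The role centers on RESTful microservices. As a framework-agnostic sketch (shown in Python for brevity; the routes, resource names, and payloads are invented), the testable core of such a service is a pure function from a request to a status code and JSON body, which a Java or Go framework would wrap with HTTP plumbing:

```python
import json

# Hypothetical in-memory resource store standing in for a real database.
PROFILES = {"42": {"id": "42", "name": "Ada"}}

def handle(method, path):
    """Route (method, path) to a (status_code, json_body) response."""
    parts = path.strip("/").split("/")
    if method == "GET" and len(parts) == 2 and parts[0] == "profiles":
        profile = PROFILES.get(parts[1])
        if profile is None:
            return 404, json.dumps({"error": "not found"})
        return 200, json.dumps(profile)
    return 405, json.dumps({"error": "unsupported"})

print(handle("GET", "/profiles/42"))
```

Keeping routing and serialization in a pure function like this is what makes the microservice easy to unit-test and review, which the responsibilities above emphasize.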
Write clean, well-documented, and reusable code that follows industry standards. Participate in code reviews, testing, and debugging to ensure the quality and reliability of our applications. Collaborate with other developers, project managers, and collaborators to deliver high-quality software solutions. Stay up-to-date with the latest industry trends and technologies, and share your knowledge with the team. Experience working on large software applications including cloud services. Proficient in web and cloud technologies. Good understanding of architecture, design, performance, and reliability issues in global, high-volume applications. Expertise in object-oriented design and knowledge of product life cycles. Excellent written and verbal communication skills. Adobe is proud to be an Equal Employment Opportunity employer. We do not discriminate based on gender, race or color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, veteran status, or any other applicable characteristics protected by law. Learn more about our vision here. Adobe aims to make Adobe.com accessible to any and all users. If you have a disability or special need that requires accommodation to navigate our website or complete the application process, email accommodations@adobe.com or call (408) 536-3015.

Posted 1 week ago

Apply

8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About the Company Bosch Global Software Technologies Private Limited is a 100% owned subsidiary of Robert Bosch GmbH, one of the world's leading global suppliers of technology and services, offering end-to-end Engineering, IT and Business Solutions. With over 28,200 associates, it’s the largest software development center of Bosch outside Germany, making it the Technology Powerhouse of Bosch in India, with a global footprint and presence in the US, Europe and the Asia Pacific region. About the Role We are seeking a highly skilled and experienced Java AWS React Developer to join our dynamic development team. The ideal candidate will have a strong background in Java backend development, cloud-native application design using AWS, and modern front-end development with React.js. You will be responsible for designing, developing, and maintaining scalable web applications that deliver high performance and reliability. Responsibilities Design, develop, and maintain scalable and secure backend services using Java (Spring Boot). Build responsive and dynamic user interfaces using React.js. Develop and deploy cloud-native applications on AWS using services like Lambda, API Gateway, S3, DynamoDB, ECS, etc. Collaborate with cross-functional teams including product managers, designers, and QA engineers. Write clean, maintainable, and efficient code following best practices. Participate in code reviews, unit testing, and integration testing. Troubleshoot and resolve technical issues across the stack. Ensure application performance, uptime, and scale, maintaining high standards of code quality and thoughtful design. Qualifications 5–8 years of hands-on experience in software development. Strong proficiency in Java, especially with Spring Boot. Solid experience with React.js, JavaScript (ES6+), HTML5, and CSS3. Proficiency in AWS services (EC2, Lambda, S3, RDS, DynamoDB, CloudFormation, etc.). Experience with RESTful APIs and microservices architecture. 
Familiarity with CI/CD pipelines and DevOps practices. Strong understanding of software engineering principles, design patterns, and best practices. Excellent problem-solving and communication skills. Preferred Skills AWS Certification (e.g., AWS Certified Developer or Solutions Architect). Experience with containerization tools like Docker and orchestration with Kubernetes. Familiarity with Agile/Scrum methodologies. Knowledge of security best practices in cloud and web development. Qualifications BE/B.Tech/MCA

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

Remote

Description GPP Database Link (https://cummins365.sharepoint.com/sites/CS38534/) Job Summary Leads projects for design, development and maintenance of a data and analytics platform. Effectively and efficiently process, store and make data available to analysts and other consumers. Works with key business stakeholders, IT experts and subject-matter experts to plan, design and deliver optimal analytics and data science solutions. Works on one or many product teams at a time. Key Responsibilities Designs and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured). Designs and implements framework to continuously monitor and troubleshoot data quality and data integrity issues. Implements data governance processes and methods for managing metadata, access, retention to data for internal and external users. Designs and provide guidance on building reliable, efficient, scalable and quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages. Designs and implements physical data models to define the database structure. Optimizing database performance through efficient indexing and table relationships. Participates in optimizing, testing, and troubleshooting of data pipelines. Designs, develops and operates large scale data storage and processing solutions using different distributed and cloud based platforms for storing data (e.g. Data Lakes, Hadoop, Hbase, Cassandra, MongoDB, Accumulo, DynamoDB, others). Uses innovative and modern tools, techniques and architectures to partially or completely automate the most-common, repeatable and tedious data preparation and integration tasks in order to minimize manual and error-prone processes and improve productivity. Assists with renovating the data management infrastructure to drive automation in data integration and management. 
Ensures the timeliness and success of critical analytics initiatives by using agile development technologies such as DevOps, Scrum, and Kanban. Coaches and develops less experienced team members. Responsibilities Competencies: System Requirements Engineering - Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule, and resources; creates and maintains information linkages to related artifacts. Collaborates - Building partnerships and working collaboratively with others to meet shared objectives. Communicates effectively - Developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences. Customer focus - Building strong customer relationships and delivering customer-centric solutions. Decision quality - Making good and timely decisions that keep the organization moving forward. Data Extraction - Performs data extract-transform-load (ETL) activities from a variety of sources and transforms them for consumption by various downstream applications and users using appropriate tools and technologies. Programming - Creates, writes and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements. Quality Assurance Metrics - Applies the science of measurement to assess whether a solution meets its intended outcomes using the IT Operating Model (ITOM), including the SDLC standards, tools, metrics and key performance indicators, to deliver a quality product. 
Solution Documentation - Documents information and solution based on knowledge gained as part of product development activities; communicates to stakeholders with the goal of enabling improved productivity and effective knowledge transfer to others who were not originally part of the initial learning. Solution Validation Testing - Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure that it works as designed and meets customer requirements. Data Quality - Identifies, understands and corrects flaws in data that supports effective information governance across operational business processes and decision making. Problem Solving - Solves problems and may mentor others on effective problem solving by using a systematic analysis process by leveraging industry standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies the systemic root causes and ensures actions to prevent problem reoccurrence are implemented. Values differences - Recognizing the value that different perspectives and cultures bring to an organization. Education, Licenses, Certifications College, university, or equivalent degree in relevant technical discipline, or relevant equivalent experience required. This position may require licensing for compliance with export controls or sanctions regulations. Experience Intermediate experience in a relevant discipline area is required. 
Knowledge of the latest technologies and trends in data engineering is highly preferred and includes: Familiarity with analyzing complex business systems, industry requirements, and/or data regulations. Background in processing and managing large data sets. Design and development for a Big Data platform using open source and third-party tools (Spark, Scala/Java, Map-Reduce, Hive, HBase, and Kafka) or equivalent college coursework. SQL query language. Clustered compute cloud-based implementation experience. Experience developing applications requiring large file movement for a cloud-based environment, and other data extraction tools and methods from a variety of sources. Experience in building analytical solutions. Intermediate experience in the following is preferred: Experience with IoT technology. Experience in Agile software development. Qualifications: Strong programming skills in SQL, Python and PySpark for data processing and automation. Experience with Databricks and Snowflake (preferred) for building and maintaining data pipelines. Understanding of Machine Learning and AI techniques, especially for data quality and anomaly detection. Experience with cloud platforms such as Azure and AWS, and familiarity with Azure Web Apps. Knowledge of Data Quality and Data Governance concepts (preferred). Nice to have: Power BI dashboard development experience. Job: Systems/Information Technology. Organization: Cummins Inc. Role Category: Remote. Job Type: Exempt - Experienced. ReqID: 2417177. Relocation Package: No.

Posted 1 week ago

Apply

4.0 years

1 - 3 Lacs

Hyderābād

On-site

Java AWS engineer with experience building AWS services such as Lambda, Batch, SQS, S3, and DynamoDB using the AWS Java SDK and CloudFormation templates. 4 to 8 years of experience in design, development, and triaging for large, complex systems. Experience in Java and object-oriented design skills. 3-4+ years of microservices development and multithreading. 2+ years working in Spring Boot. Experienced using API development tools like IntelliJ/Eclipse, Postman, Git, and Cucumber. Hands-on experience building microservices-based applications using Spring Boot, REST, and JSON. DevOps understanding – containers, cloud, automation, security, configuration management, CI/CD. Experience using CI/CD processes for application software integration and deployment using Maven, Git, and Jenkins. Experience dealing with NoSQL databases like Cassandra. Experience building scalable and resilient applications in private or public cloud environments and cloud technologies. Experience utilizing tools such as Maven, Docker, Kubernetes, ELK, and Jenkins. Agile software development (typically Scrum, Kanban, SAFe). Experience with API gateways and API security.

Posted 1 week ago

Apply

1.0 years

9 - 12 Lacs

Delhi

On-site

Backend developer Quick Apply: https://goodspace.ai/jobs/backend-developer?id=28368&source=campaign_Indeed-Kritika_Backenddeveloper-2836 Years of Experience: 1-3 years. Location: Delhi, India. Key Skills: Java, Node.js, Python, SQL, Docker, RESTful APIs. Job Description About the Role: We are looking for a skilled Backend Developer to join our dynamic team and take ownership of the backend infrastructure for our application. The ideal candidate will have experience working with AWS services, Node.js, and the Serverless framework, and be capable of designing scalable, secure, and efficient backend solutions. Key Responsibilities: * Develop, deploy, and maintain serverless backend APIs using AWS Lambda and API Gateway * Design and manage scalable databases using DynamoDB * Handle file storage and retrieval with AWS S3 * Implement backend logic and integrations using Node.js * Ensure best practices for security, performance, and scalability * Collaborate with the frontend team to integrate APIs and troubleshoot issues * Monitor, debug, and optimize backend performance Required Skills: * Strong proficiency in Node.js * Hands-on experience with AWS services: Lambda, API Gateway, DynamoDB, S3 * Proficiency with the Serverless Framework * Knowledge of RESTful APIs and best practices * Familiarity with CI/CD, logging, and monitoring tools * Good problem-solving and communication skills Job Type: Full-time Pay: ₹900,000.00 - ₹1,200,000.00 per year Location Type: In-person Schedule: Day shift Work Location: In person Speak with the employer +91 9646963015
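The serverless stack described above follows a simple event-to-response contract between API Gateway and a Lambda handler. The sketch below mimics that contract in Python (the posting targets Node.js, where the handler shape is the same); the event fields follow API Gateway's proxy-integration format, and an in-memory dict stands in for DynamoDB.

```python
import json

# Hypothetical in-memory store standing in for a DynamoDB table.
ITEMS = {}

def handler(event, context=None):
    """Lambda-style handler: API Gateway proxy event in, response dict out."""
    method = event.get("httpMethod")
    if method == "POST":
        body = json.loads(event.get("body") or "{}")
        ITEMS[body["id"]] = body  # a real handler would call DynamoDB here
        return {"statusCode": 201, "body": json.dumps(body)}
    if method == "GET":
        item_id = (event.get("pathParameters") or {}).get("id")
        item = ITEMS.get(item_id)
        if item is None:
            return {"statusCode": 404, "body": json.dumps({"error": "not found"})}
        return {"statusCode": 200, "body": json.dumps(item)}
    return {"statusCode": 405, "body": json.dumps({"error": "method not allowed"})}

resp = handler({"httpMethod": "POST", "body": '{"id": "a1", "note": "hi"}'})
print(resp["statusCode"])
```

Because the handler is a pure function of its event, it can be unit-tested locally before being deployed behind API Gateway with the Serverless Framework.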

Posted 1 week ago

Apply

3.0 years

8 - 10 Lacs

Bengaluru

On-site

A Software Development Engineer may assist their colleagues and more junior Software Development Engineer team members by solving problems, providing technical guidance, training and mentoring others. Exceptional attention to detail, strong analytical skills, excellent communication skills, an innovative mindset, the ability to solve complex problems and deep technical ability in software development programming within an agile environment will be key for success. We are actively seeking an exceptionally motivated individual who thrives on continuous learning and embraces the dynamic environment of a high-velocity team. Joining the Content Productization & Delivery (CPD) organization at Thomson Reuters, you will play a pivotal role in ensuring the quality, reliability, and availability of critical systems. These systems provide a suite of infrastructure services supporting a common set of search and information retrieval capabilities necessary for Thomson Reuters's research-based applications and APIs across its core products. Your responsibilities will encompass delivering content via shared services that underpin all our Tax and Legal Research products. About the role: In this opportunity as a Software Engineer, you will : Actively participates and collaborates in meetings, processes, agile ceremonies, and interaction with other technology groups. Works with Lead Engineers and Architects to develop high performing and scalable software solutions to meet requirement and design specifications. Provides technical guidance, mentoring, or coaching to software or systems engineering teams that are distributed across geographic locations. Proactively share knowledge and best practices on using new and emerging technologies across all the development and testing groups. Assists in identifying and correcting software performance bottlenecks. Provides regular progress and status updates to management. 
Provides technical support to operations or other development teams by assisting in troubleshooting, debugging, and solving critical issues in the production environment promptly to minimize user and revenue impact. Ability to interpret code and solve problems based on existing standards. Creates and maintains all required technical documentation/manuals related to assigned components to ensure supportability. About You: You're a fit for the role of Software Engineer if your background includes: Bachelor’s or master’s degree in computer science, engineering, information technology, or equivalent experience. 3 years of professional software development experience. 3+ years of experience with Java and REST-based services. Ability to debug and diagnose issues. Experience with version control (Git, GitHub). Experience working with various AWS technologies (DynamoDB, S3, EKS). Experience with Linux. Infrastructure as Code, CI/CD pipelines. #LI-HG1 What’s in it For You? Hybrid Work Model: We’ve adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected. Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance. Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow’s challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future. 
Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing. Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together. Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives. Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world. About Us Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. 
At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.

Posted 1 week ago

Apply

7.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY – Consulting – AWS Data Engineering Manager The Opportunity We are seeking an experienced and visionary AWS Data Engineering Manager to lead our data engineering initiatives within the Consulting practice and have about 7+ years of experience. This role is ideal for a strategic thinker with a strong technical foundation in AWS and data engineering, who can guide teams, architect scalable solutions, and drive innovation in data platforms. You will play a pivotal role in shaping data strategies, mentoring teams, and delivering impactful solutions for our clients. Key Responsibilities Lead the design and implementation of scalable data pipelines using AWS technologies, supporting both batch and real-time data processing. Architect robust data lake solutions based on the Medallion Architecture using Amazon S3 and integrate with Redshift and Oracle for downstream analytics. Oversee the development of data ingestion frameworks from diverse sources including on-premise databases, batch files, and Kafka streams. Guide the development of Spark streaming applications on Amazon EMR and batch processing using AWS Glue and Python. Manage workflow orchestration using Apache Airflow and ensure operational excellence through monitoring and optimization. Collaborate with cross-functional teams including data scientists, analysts, and DevOps to align data solutions with business goals. Provide technical leadership, mentorship, and performance management for a team of data engineers. Engage with clients to understand business requirements, define data strategies, and deliver high-quality solutions. 
Required Skills and Experience
- Proven leadership experience managing data engineering teams and delivering complex data solutions.
- Deep expertise in AWS services, including S3, Redshift, Glue, and EMR, as well as Oracle.
- Strong programming skills in Python and Spark, with a solid understanding of data modeling and ETL frameworks.
- Hands-on experience with Kafka for real-time data ingestion and processing.
- Proficiency in workflow orchestration tools such as Apache Airflow.
- Strong understanding of the Medallion Architecture and data lake best practices.

Preferred / Nice-to-Have Skills
- Experience with Infrastructure as Code (IaC) using Terraform.
- Familiarity with additional AWS services such as SNS, SQS, DynamoDB, DMS, Athena, and Lake Formation.
- Knowledge of monitoring and alerting tools such as CloudWatch, Datadog, or Splunk.
- Understanding of security best practices for data at rest and in transit.

Qualifications
- BTech / MTech / MCA / MBA or equivalent.
- AWS certifications (e.g., AWS Certified Data Analytics – Specialty, AWS Certified Solutions Architect) are a plus.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society, and to build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
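For candidates unfamiliar with the term, the Medallion Architecture mentioned above organizes a data lake into bronze (raw), silver (cleaned and deduplicated), and gold (analytics-ready) layers. A minimal, self-contained Python sketch of that layering, using in-memory records in place of S3 objects (the field names are hypothetical, purely for illustration):

```python
# Medallion-style layering sketch: bronze (raw) -> silver (clean) -> gold (aggregate).
# All names here (order_id, amount) are made up for illustration.

def to_silver(bronze_records):
    """Clean raw records: drop malformed rows, deduplicate, normalize types."""
    seen = set()
    silver = []
    for rec in bronze_records:
        key = rec.get("order_id")
        if key is None or key in seen:
            continue  # skip rows missing a key, and duplicates
        seen.add(key)
        silver.append({"order_id": key, "amount": float(rec["amount"])})
    return silver

def to_gold(silver_records):
    """Aggregate cleaned records into an analytics-ready summary."""
    return {
        "order_count": len(silver_records),
        "total_amount": sum(r["amount"] for r in silver_records),
    }

bronze = [
    {"order_id": 1, "amount": "10.5"},
    {"order_id": 1, "amount": "10.5"},  # duplicate row
    {"amount": "3.0"},                  # malformed: no order_id
    {"order_id": 2, "amount": "4.5"},
]
gold = to_gold(to_silver(bronze))
print(gold)  # {'order_count': 2, 'total_amount': 15.0}
```

In a production pipeline of the kind described above, each stage would typically be a Glue or EMR Spark job reading from and writing to a per-layer S3 prefix, with Airflow orchestrating the stage order.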

Posted 1 week ago

Apply

7.0 years

6 - 10 Lacs

Noida

On-site

(Role description identical to the EY – Consulting – AWS Data Engineering Manager posting listed above.)

Posted 1 week ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Position Summary...
Demonstrates up-to-date expertise and applies this to the development, execution, and improvement of action plans by providing expert advice and guidance to others in the application of information and best practices; supporting and aligning efforts to meet customer and business needs; and building commitment for perspectives and rationales. Provides and supports the implementation of business solutions by building relationships and partnerships with key stakeholders; identifying business needs; determining and carrying out necessary processes and practices; monitoring progress and results; recognizing and capitalizing on improvement opportunities; and adapting to competing demands, organizational changes, and new responsibilities. Models compliance with company policies and procedures and supports the company mission, values, and standards of ethics and integrity by incorporating these into the development and implementation of business plans; using the Open Door Policy; and demonstrating and assisting others with how to apply these in executing business processes and practices.

What you'll do...

About the Team:
The Data and Customer Analytics Team is a strategic unit dedicated to transforming data into actionable insights that drive customer-centric decision-making across the organization. Our mission is to harness the power of data to understand customer behavior, optimize business performance, and enable personalized experiences. The team is responsible for building and maintaining a centralized, scalable, and secure data platform that consolidates customer-related data from diverse sources across the organization, playing a foundational role in enabling data-driven decision-making, advanced analytics, and personalized customer experiences. It also plays a critical role in building trust with customers by implementing robust privacy practices, policies, and technologies that protect personal information throughout its lifecycle.
What You’ll Do
- Design, build, test and deploy cutting-edge solutions at scale, impacting a multi-billion-dollar business.
- Work closely with the product owner and technical lead, and play a major role in the overall delivery of the assigned project/enhancements.
- Interact with Walmart engineering teams across geographies to leverage expertise and contribute to the tech community.
- Provide business insights while leveraging internal tools and systems, databases and industry data.
- Drive the success of the implementation by applying technical skills to design and build enhanced processes and technical solutions in support of strategic initiatives.

What You’ll Bring
- 3-5 years' experience building highly scalable, high-performance, responsive web applications.
- Experience building customizable, reusable, and dynamic API components using Java, NodeJS, serverless APIs, RESTful APIs and GraphQL.
- Experience with Java Spring Boot API deployment for server-side development, following sound design principles.
- Familiarity with React Navigation for handling screens and navigation.
- Understanding of RESTful APIs and GraphQL.
- Strong work experience with Google Cloud Platform services.
- Strong creative, collaboration, and communication skills.
- Ability to multitask between several different requirements and features concurrently.
- Familiarity with CI/CD, unit testing, and automated frontend testing.
- Ability to build high-quality code by conducting unit testing and enhancing designs to prevent recurrence of defects.
- Ability to perform in a team environment.
- Understanding of native development (Swift for iOS, Kotlin for Android) is a plus.
- Strong expertise in Java, Spring Boot, Spring MVC, and Spring Cloud.
- Hands-on experience with Apache Kafka (topics, partitions, consumer groups, Kafka Streams).
- Solid understanding of microservices architecture and event-driven systems.
- Experience with RESTful APIs, OAuth, JWT, and API gateways.
- Proficiency in SQL (PostgreSQL, MySQL, and GCP services such as BigQuery and BigLake) and NoSQL (MongoDB, Cassandra, DynamoDB).
- Knowledge of Docker, Kubernetes, and cloud platforms (Azure, AWS, or GCP).
- Strong debugging and performance optimization skills.

About Walmart Global Tech
Imagine working in an environment where one line of code can make life easier for hundreds of millions of people. That’s what we do at Walmart Global Tech. We’re a team of software engineers, data scientists, cybersecurity experts and service professionals within the world’s leading retailer who make an epic impact and are at the forefront of the next retail disruption. People are why we innovate, and people power our innovations. We are people-led and tech-empowered. We train our team in the skillsets of the future and bring in experts like you to help us grow. We have roles for those chasing their first opportunity as well as those looking for the opportunity that will define their career. Here, you can kickstart a great career in tech, gain new skills and experience for virtually every industry, or leverage your expertise to innovate at scale, impact millions and reimagine the future of retail.

Flexible, hybrid work
We use a hybrid way of working, with a primary in-office presence coupled with an optimal mix of virtual presence. We use our campuses to collaborate and be together in person, as business needs require and for development and networking opportunities. This approach helps us make quicker decisions, remove location barriers across our global team, and be more flexible in our personal lives.

Benefits
Beyond our great compensation package, you can receive incentive awards for your performance. Other great perks include a host of best-in-class benefits: maternity and parental leave, PTO, health benefits, and much more.

Belonging
We aim to create a culture where every associate feels valued for who they are, rooted in respect for the individual. Our goal is to foster a sense of belonging, to create opportunities for all our associates, customers and suppliers, and to be a Walmart for everyone. At Walmart, our vision is "everyone included." By fostering a workplace culture where everyone is, and feels, included, everyone wins. Our associates and customers reflect the makeup of all 19 countries where we operate. By making Walmart a welcoming place where all people feel like they belong, we’re able to engage associates, strengthen our business, improve our ability to serve customers, and support the communities where we operate.

Equal Opportunity Employer
Walmart, Inc., is an Equal Opportunity Employer – By Choice. We believe we are best equipped to help our associates, customers and the communities we serve live better when we really know them. That means understanding, respecting and valuing unique styles, experiences, identities, ideas and opinions, while being inclusive of all people.

Minimum Qualifications...
Outlined below are the required minimum qualifications for this position. If none are listed, there are no minimum qualifications.
- Option 1: Bachelor's degree in computer science, information technology, engineering, information systems, cybersecurity, or related area, and 2 years’ experience in software engineering or related area at a technology, retail, or data-driven company.
- Option 2: 4 years’ experience in software engineering or related area at a technology, retail, or data-driven company.

Preferred Qualifications...
Outlined below are the optional preferred qualifications for this position. If none are listed, there are no preferred qualifications.
- Certification in Security+, Network+, GISF, GSEC, CISSP, or CCSP
- Master’s degree in Computer Science, Information Technology, Engineering, Information Systems, Cybersecurity, or related area

Primary Location...
BLOCK-1, PRESTIGE TECH PACIFIC PARK, SY NO. 38/1, OUTER RING ROAD, KADUBEESANAHALLI, India R-2221347
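As context for the Kafka requirement above (topics, partitions, consumer groups): a consumer group spreads a topic's partitions across its members so that each partition is consumed by exactly one group member. A rough, plain-Python sketch of a round-robin style assignment, purely for illustration (real Kafka brokers perform this via pluggable partition assignors, not client code like this):

```python
# Toy illustration of consumer-group partition assignment (round-robin style).
# Consumer names and partition IDs are invented for the example.

def assign_partitions(partitions, consumers):
    """Assign partition i (in sorted order) to consumer i mod len(consumers)."""
    consumers = sorted(consumers)
    assignment = {c: [] for c in consumers}
    for i, p in enumerate(sorted(partitions)):
        assignment[consumers[i % len(consumers)]].append(p)
    return assignment

# Six partitions shared by a group of three consumers:
print(assign_partitions(list(range(6)), ["c1", "c2", "c3"]))
# {'c1': [0, 3], 'c2': [1, 4], 'c3': [2, 5]}
```

The key property this demonstrates is that no partition appears under two consumers, which is what lets a group scale consumption horizontally while preserving per-partition ordering.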

Posted 1 week ago

Apply

7.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

(Role description identical to the EY – Consulting – AWS Data Engineering Manager posting listed above.)

Posted 1 week ago

Apply

2.0 years

12 - 15 Lacs

Bengaluru, Karnataka, India

On-site

This role is for one of Weekday's clients
Salary range: Rs 1200000 - Rs 1500000 (i.e., INR 12-15 LPA)
Min Experience: 2 years
Location: Bengaluru
Job Type: full-time

Requirements

Key Responsibilities:
- Design, develop, test, and maintain scalable fullstack applications using modern technologies.
- Implement responsive and cross-browser-compatible UI components with HTML, CSS (SASS/LESS), and JavaScript.
- Develop robust backend services and APIs using Node.js, Express.js, and JavaScript/TypeScript.
- Collaborate with cross-functional teams including designers, product managers, and other developers.
- Ensure code quality through best practices, including unit testing and code reviews.
- Contribute to architectural decisions and provide innovative solutions to complex challenges.

Must-Have Qualifications:
- 2-3 years of professional experience as a Fullstack Engineer.
- Strong proficiency in HTML, CSS (LESS/SASS), and JavaScript, with a deep understanding of responsive and cross-browser web design principles.
- Hands-on experience with modern front-end frameworks such as React or Angular.
- Solid backend development experience in Node.js, Express.js, and either JavaScript or TypeScript.
- Strong grasp of software engineering fundamentals, including data structures, algorithms, and problem-solving skills.

Good-to-Have Skills:
- Experience building applications that integrate AI tools or AI-driven workflows.
- Exposure to backend development using Python or Java.
- Knowledge of databases, including MongoDB, DynamoDB, MySQL, Redis, ElastiCache, and Elasticsearch.
- Experience designing and developing RESTful APIs, preferably with metric-driven API Gateway integrations.
- Familiarity with AWS services, Kubernetes, microservices, and domain-driven architecture.
- Excellent written and verbal communication skills, with the ability to clearly present technical concepts to stakeholders and team members.
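As an aside on the RESTful API skills this role calls for: at its core, REST routing maps an HTTP method plus a path pattern to a handler. A toy, framework-free Python sketch of that dispatch (the routes and handler are invented for illustration; a real service would use Express.js or a similar framework rather than hand-rolled matching):

```python
# Minimal illustration of REST-style routing: method + path pattern -> handler.
import re

ROUTES = []

def route(method, pattern):
    """Register a handler for paths like /users/{user_id} (named placeholders)."""
    regex = re.compile("^" + re.sub(r"\{(\w+)\}", r"(?P<\1>[^/]+)", pattern) + "$")
    def decorator(fn):
        ROUTES.append((method, regex, fn))
        return fn
    return decorator

def dispatch(method, path):
    """Find the first matching route and call its handler with path parameters."""
    for m, regex, fn in ROUTES:
        if m == method:
            match = regex.match(path)
            if match:
                return fn(**match.groupdict())
    return {"status": 404}

@route("GET", "/users/{user_id}")
def get_user(user_id):
    # A real handler would fetch the user from a database here.
    return {"status": 200, "id": user_id}

print(dispatch("GET", "/users/42"))   # {'status': 200, 'id': '42'}
print(dispatch("GET", "/missing"))    # {'status': 404}
```

The same pattern-to-handler idea underlies Express.js's `app.get("/users/:id", handler)` style routing mentioned in the stack above.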

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies