3.0 - 6.0 years
4 - 8 Lacs
Hyderabad, Bengaluru
Work from Office
Your Profile
You would be working on Amazon Lex and Amazon Connect; AWS services such as Lambda, CloudWatch, DynamoDB, S3, IAM, and API Gateway; Node.js or Python; contact center workflows and customer experience design; NLP and conversational design best practices; and CI/CD processes and tools in the AWS ecosystem.
Your Role
Experience in designing, developing, and maintaining voice bots and chatbots using Amazon Lex. Experience integrating Lex bots with Amazon Connect for a seamless customer experience. Experience with Amazon Connect contact flows, queues, routing profiles, and Lambda integrations. Experience in developing and deploying AWS Lambda functions for backend logic and Lex fulfillment.
What you'll love about working here
You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group. You will also get personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage or new parent support via flexible work. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.
About Capgemini
Location - Chennai (ex Madras), Bangalore, Hyderabad, Mumbai, Pune
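For context on the "Lambda functions for Lex fulfillment" requirement, here is a minimal Python sketch of an Amazon Lex V2 fulfillment handler. The intent name and `City` slot are illustrative, not from any specific bot; real bots define their own intents and slots.

```python
def lambda_handler(event, context):
    """Minimal Amazon Lex V2 fulfillment handler (illustrative sketch).

    Lex invokes the function with the current intent and slots; the
    handler marks the intent fulfilled and returns a Close dialog
    action to end the conversation turn.
    """
    intent = event["sessionState"]["intent"]
    slots = intent.get("slots") or {}

    # Hypothetical slot lookup; Lex V2 nests the value under
    # slots[<name>]["value"]["interpretedValue"].
    city_slot = slots.get("City")
    city = (city_slot or {}).get("value", {}).get("interpretedValue", "your area")

    intent["state"] = "Fulfilled"
    return {
        "sessionState": {
            "dialogAction": {"type": "Close"},
            "intent": intent,
        },
        "messages": [
            {
                "contentType": "PlainText",
                "content": f"Thanks! An agent for {city} will call you back.",
            }
        ],
    }
```

In practice the same handler is wired into an Amazon Connect contact flow via the bot's alias, so the fulfillment logic is shared between voice and chat channels.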
Posted Date not available
3.0 - 7.0 years
6 - 10 Lacs
Hyderabad, Bengaluru
Work from Office
Your Role
You would be implementing and supporting the following Enterprise Planning & Budgeting Cloud Services (EPBCS) modules: Financials, Workforce, Capital, and Projects. Experience in Enterprise Data Management Consolidation (EDMCS), Enterprise Profitability & Cost Management Cloud Services (EPCM), and Oracle Integration Cloud (OIC). Oracle EPM Cloud implementation experience in creating forms, OIC integrations, and complex business rules. Understand dependencies and interrelationships between various components of Oracle EPM Cloud. Keep abreast of the Oracle EPM roadmap and key functionality to identify opportunities where it will enhance the current process within the entire Financials ecosystem. Proven ability to collaborate with internal clients in an agile manner, leveraging design thinking approaches. Collaborate with FP&A to facilitate the planning, forecasting, and reporting process for the organization. Create and maintain system documentation, both functional and technical.
Your Profile
Proven ability to collaborate with internal clients in an agile manner, leveraging design thinking approaches. Collaborate with FP&A to facilitate the planning, forecasting, and reporting process for the organization. Create and maintain system documentation, both functional and technical. Experience with Python and AWS Cloud (Lambda, Step Functions, EventBridge, etc.) is preferred.
What you'll love about Capgemini
You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group. You will also get personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage or new parent support via flexible work. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.
About Capgemini Location - Hyderabad, Bangalore, Chennai (ex Madras), Mumbai (ex Bombay), Pune
Posted Date not available
5.0 - 10.0 years
15 - 25 Lacs
Kolkata
Hybrid
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose, the relentless pursuit of a world that works better for people, we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.
Inviting applications for the role of Lead Consultant - Data Engineer (AWS + Python, Spark, Kafka) for ETL!
Responsibilities
Develop, deploy, and manage ETL pipelines using AWS services, Python, Spark, and Kafka. Integrate structured and unstructured data from various sources into data lakes and data warehouses. Design and deploy scalable, highly available, and fault-tolerant AWS data processes using AWS data services (Glue, Lambda, Step Functions, Redshift). Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness. Implement and maintain security measures to protect data and systems within the AWS environment, including IAM policies, security groups, and encryption mechanisms. Migrate application data from legacy databases to cloud-based solutions (Redshift, DynamoDB, etc.) for high availability at low cost. Develop application programs using big data technologies like Apache Hadoop and Apache Spark with appropriate cloud-based services such as AWS. Build data pipelines by building ETL (Extract-Transform-Load) processes. Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data.
Responsible for analysing business and functional requirements, which involves reviewing existing system configurations and operating methodologies as well as understanding evolving business needs. Analyse requirements/user stories in business meetings, strategize the impact of requirements on different platforms/applications, and convert business requirements into technical requirements. Participate in design reviews to provide input on functional requirements, product designs, schedules, and potential problems. Understand the current application infrastructure and suggest cloud-based solutions that reduce operational cost and require minimal maintenance while providing high availability with improved security. Perform unit testing on modified software to ensure that new functionality works as expected while existing functionality continues to work unchanged. Coordinate with release management and other supporting teams to deploy changes to the production environment.
Qualifications we seek in you!
Minimum Qualifications
Experience in designing and implementing data pipelines, building data applications, and data migration on AWS. Strong experience implementing data lakes using AWS services like Glue, Lambda, Step Functions, and Redshift. Experience with Databricks is an added advantage. Strong experience in Python and SQL. Proven expertise in AWS services such as S3, Lambda, Glue, EMR, and Redshift. Advanced programming skills in Python for data processing and automation. Hands-on experience with Apache Spark for large-scale data processing. Experience with Apache Kafka for real-time data streaming and event processing. Proficiency in SQL for data querying and transformation. Strong understanding of security principles and best practices for cloud-based environments. Experience with monitoring tools and implementing proactive measures to ensure system availability and performance.
Excellent problem-solving skills and the ability to troubleshoot complex issues in a distributed, cloud-based environment. Strong communication and collaboration skills to work effectively with cross-functional teams.
Preferred Qualifications / Skills
Master's degree in Computer Science, Electronics, or Electrical Engineering. AWS Data Engineering and Cloud certifications; Databricks certifications. Experience with multiple data integration technologies and cloud platforms. Knowledge of the Change & Incident Management process.
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
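The extract-transform-load flow this role centers on can be sketched in plain Python. The stage functions and field names below are illustrative; a real Glue or Lambda job would extract from S3 or Kafka and load into Redshift or a data-lake table rather than in-memory lists.

```python
def extract(rows):
    """Extract stage: a real job would read from S3, Kafka, or a source DB."""
    yield from rows


def transform(rows):
    """Transform stage: drop malformed records and normalize the amount field."""
    for row in rows:
        if row.get("amount") is None:
            continue  # skip records missing a required field
        yield {"id": row["id"], "amount": round(float(row["amount"]), 2)}


def load(rows, sink):
    """Load stage: a real job would write to Redshift or a data lake."""
    count = 0
    for row in rows:
        sink.append(row)
        count += 1
    return count


def run_pipeline(source, sink):
    """Wire the three stages together; returns the number of rows loaded."""
    return load(transform(extract(source)), sink)
```

Because each stage is a generator, rows stream through one at a time, which is the same back-pressure-friendly shape that Spark and Kafka consumers encourage at much larger scale.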
Posted Date not available
10.0 - 15.0 years
11 - 16 Lacs
Mumbai
Work from Office
Grade Level (for internal use): 11
S&P Global Dow Jones Indices
The Role: Senior Lead Development Engineer - Python Full Stack
S&P Dow Jones Indices, a global leader in providing investable and benchmark indices to the financial markets, is looking for a Senior Lead Development Engineer with full stack experience to join our technology team. In this full stack engineer role, both front-end and back-end skills will be utilized.
The Team: You will be part of a global technology team comprising Dev, QA, and BA teams, and will be responsible for analysis, design, development, and testing.
Responsibilities and Impact: You will be working on one of the key systems responsible for calculating re-balancing weights and asset selections for S&P indices. Ultimately, the output of this team is used to maintain some of the most recognized and important investable assets globally. Design and development of RESTful web services, working closely with UI developers. Interfacing with various AWS infrastructure and services, deploying to containerized environments. Coding, documentation, testing, debugging, and tier-3 support. Taking ownership of code modules and leading code review processes. Work directly with stakeholders and the technical architect to formalize and document requirements, both for supporting the existing application and for new initiatives. Perform application and system performance tuning and troubleshoot performance issues. Define and refine agile stories and tasks, delegate to the team, and conduct code reviews and pull requests. Coordinate closely with the QA team and the scrum master to optimize team velocity and task flow. Help establish and maintain technical standards via code reviews and pull requests.
What's in it for you: This is an opportunity to work on a team of highly talented and motivated engineers at a highly respected company. You will work on new development as well as enhancements to existing functionality.
You will use your full range of skills as a full stack developer.
What We're Looking For
Basic Qualifications: 10 - 15 years of IT experience in application development and support. Bachelor's degree in Computer Science, Information Systems, or Engineering, or, in lieu, demonstrated equivalent work experience. Expert in modern Python 3.10+ (minimum 5 years of dedicated, recent Python experience). AWS services experience including API Gateway, ECS containerization, DynamoDB, S3, distributed streaming platforms, and SQS. SQL database experience, including proficiency with Postgres. Python library experience including pandas, NumPy, Pydantic, and SQLAlchemy. Demonstrated experience creating RESTful endpoints in Python (Flask, FastAPI, Sanic). JavaScript and UI development experience including one of the following: Vue 3, React, Angular. REST API testing experience with Postman or Bruno. Strong CI/CD build process experience using Jenkins. Backend services development, including distributed libraries and packages in Python. Experience with software testing (unit testing, integration testing, test-driven development). Strong work ethic, communication, and thoughtfulness.
Additional Preferred Qualifications: Good understanding of financial markets and investing (stocks, funds, indices, etc.). Experience working in mission-critical enterprise organizations. A passion for creating high-quality code and broad unit test coverage. Ability to understand complex business problems, break them into smaller executable parts, and delegate.
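As an illustration of the kind of computation a re-balancing system performs, here is a toy capped market-cap weighting in plain Python. This is only a sketch of the general idea under simplified assumptions; it is not S&P DJI's actual methodology, which is iterative and rule-specific.

```python
def cap_weights(market_caps, cap_limit=1.0):
    """Toy capped market-cap weighting (illustrative only).

    Weight each constituent by market cap, clip any weight above
    cap_limit, and redistribute the clipped excess to uncapped
    names in proportion to their weights. Single pass: production
    index rules iterate until no weight breaches the cap.
    """
    total = sum(market_caps)
    weights = [c / total for c in market_caps]

    # Total weight removed by clipping at the cap.
    excess = sum(max(w - cap_limit, 0.0) for w in weights)
    weights = [min(w, cap_limit) for w in weights]
    if excess == 0.0:
        return weights

    # Redistribute the excess proportionally among uncapped names.
    uncapped_sum = sum(w for w in weights if w < cap_limit)
    return [w + excess * w / uncapped_sum if w < cap_limit else w
            for w in weights]
```

For example, caps of 50/30/20 with a 40% cap clip the largest name to 0.40 and push the other two up proportionally, keeping the weights summing to 1.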
Posted Date not available
5.0 - 7.0 years
15 - 30 Lacs
Kochi, Bengaluru, Thiruvananthapuram
Work from Office
Job Title: AWS Serverless Developer
Location: Kochi
Experience: 5-9 years
Work Mode: Hybrid
Job Summary: We are seeking an experienced AWS Serverless Developer with strong proficiency in TypeScript and expertise in building scalable cloud-native applications using AWS services. The ideal candidate will be hands-on with AWS Lambda, API Gateway, and DynamoDB, and will have a deep understanding of serverless architecture and best practices.
Key Responsibilities: Design, develop, and maintain serverless applications on AWS. Write clean, scalable, and efficient code in TypeScript. Build and manage APIs using API Gateway and Lambda functions. Design and manage data persistence using DynamoDB. Implement and maintain CI/CD pipelines and IaC (Infrastructure as Code) for AWS environments. Work closely with product managers, architects, and other developers to deliver high-quality solutions. Participate in code reviews and ensure adherence to best practices. Monitor application performance and troubleshoot production issues.
Required Skills: Strong hands-on experience with AWS serverless architecture. Proficient in TypeScript. Solid experience with AWS Lambda, API Gateway, and DynamoDB. Familiar with AWS tools and services such as CloudWatch, IAM, and CloudFormation or CDK. Good understanding of RESTful API design and microservices architecture. Strong problem-solving and debugging skills.
Nice to Have: Experience with DevOps practices in AWS environments. Familiarity with GraphQL. Prior experience in Agile/Scrum environments.
Education: Bachelor's degree in Computer Science, Engineering, or a related field.
Required Skills: AWS Cloud, React.js, Terraform, Jira
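The Lambda-behind-API-Gateway pattern at the heart of this role follows a simple request/response contract: with the Lambda proxy integration, API Gateway passes the HTTP request in as the event and expects `statusCode`/`headers`/`body` back, with the body as a JSON string. A minimal sketch (shown in Python for brevity, though the role calls for TypeScript; the `name` query parameter is illustrative):

```python
import json


def handler(event, context):
    """Illustrative Lambda proxy handler behind API Gateway.

    Reads an optional `name` query-string parameter and returns a
    JSON response in the shape the proxy integration requires.
    """
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        # The proxy integration requires body to be a string, not a dict.
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```

The same contract holds whatever the implementation language, so the TypeScript version differs only in syntax, not in the event and response shapes.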
Posted Date not available
6.0 - 11.0 years
3 - 7 Lacs
Pune
Work from Office
Job Purpose
ICE Mortgage Technology is the leading cloud-based platform provider for the mortgage finance industry. ICE Mortgage Technology solutions enable lenders to originate more loans, reduce origination costs, and reduce the time to close, all while ensuring the highest levels of compliance, quality, and efficiency. We're looking for motivated, results-oriented people to join our team. The Sr. Engineer is a key member of the technology organization, contributing to multiple shared services. The ideal candidate should be self-directed, team-oriented, and should know and care about what the customer wants from our service. The candidate will contribute to ICE Mortgage Technology's product development team in moving our leading mortgage software solutions to the next level. Analyze, design, develop, and unit test software applications with high quality and on schedule, including business-critical Web services to be consumed by internal/external applications. Use state-of-the-art technologies and best practices to deliver your implementation. Collaborate closely with groups in and outside the development team (e.g., QA, Product Management, UE, Tech Pub) to achieve high-quality, predictable results. As a Senior Engineer, you will be working in a dynamic product development team while collaborating with other developers, management, and customer support teams. You will have an opportunity to participate in designing and developing services utilized across product lines. All our products are deployed in public (AWS) and/or private cloud environments.
Responsibilities
Build scalable services and applications optimized for the best customer experience, with scale, performance, security, and availability considerations. Be able to lead an effort to design, architect, and write software components. Be able to independently handle activities related to builds and deployments. Create design documentation for new software development and subsequent versions.
Identify opportunities to improve and optimize applications. Diagnose complex development and operational problems and recommend upgrades and improvements at a component level. Collaborate with global stakeholders and business partners for product delivery. Follow company software development processes and standards. Work on POCs and guide team members. Unblock team members from a technical and solutioning perspective. If required, collaborate across different teams. Provide required support and assistance for production outages.
Knowledge and Experience
Bachelor's or master's degree in Computer Science, Engineering, or a related field. 6+ years of software product development experience. Solid experience in object-oriented design and development with Java is a must. Solid knowledge of high-scale, multi-tenant Web service development, including REST/JSON and microservice patterns. Spring Boot and similar application framework experience. Strong experience with database concepts and databases such as MS SQL, Mongo, MySQL, PostgreSQL, or DynamoDB. Experience in large-scale, multi-tenant microservice deployments that leverage REST/JSON. Must be able to deliver high-quality code on schedule and communicate with groups in and outside the development team. Experience in UI development frameworks like ReactJS is preferred. Experience with at least one public cloud (AWS, GCP, or Azure; AWS preferred) and exposure to serverless. Solid understanding of security concerns for web-based applications. Proficiency in the development environment, including IDEs, web and application servers, Git, continuous integration, unit-testing tools, Kafka, AWS SQS, containerization and container orchestration (e.g., Docker, ECS, and Kubernetes), and defect management tools. Solid experience with Agile methodology; familiar with continuous integration tools such as Jenkins, Hudson, etc. Other desirable technical knowledge (nice to have): Kubernetes, Docker.
Participate in the agile feature/product design process, working with cross-functional teams. Self-starter with a strong work ethic and a passion for problem-solving.
Posted Date not available
4.0 - 9.0 years
4 - 8 Lacs
Hyderabad
Work from Office
As a Senior Java/AWS Developer, you will be part of a team responsible for contributing to the design, development, maintenance, and support of ICE Digital Trade, a suite of highly configurable enterprise applications. The ideal candidate must be results-oriented, self-motivated, and able to thrive in a fast-paced environment. This role requires frequent interaction with project and product managers, developers, quality assurance, and other stakeholders to ensure delivery of a world-class application to our users.
Responsibilities
Reviewing application requirements and interface designs. Contributing to the design and development of enterprise Java applications. Developing and implementing highly responsive user interface components using React concepts. Writing application interface code using JavaScript following React.js workflows. Troubleshooting interface software and debugging application code. Developing and implementing front-end architecture to support user interface concepts. Monitoring and improving front-end performance. Documenting application changes and developing updates. Collaborating with the QA team to ensure quality production code. Supporting and enhancing multiple mission-critical enterprise applications. Writing unit and integration tests for new and legacy code. Taking initiative and working independently on some projects while contributing to a large team on others. Providing second-tier production support for 24/7 applications. Following team guidelines for quality and consistency within the design and development phases of the application. Identifying opportunities to improve and optimize the application.
Knowledge and Experience
Bachelor's degree in Computer Science or Information Technology. 4+ years of full stack development experience. In-depth knowledge of Java, JavaScript, CSS, HTML, and front-end languages.
Knowledge of performance testing frameworks and proven success with test-driven development. Experience with browser-based debugging and performance testing software. Excellent troubleshooting skills. Good grasp of object-oriented concepts and knowledge of core Java and Java EE. First-hand experience with enterprise messaging (IBM WebSphere MQ or equivalent). Practical knowledge of Java application servers (JBoss, Tomcat) preferred. Working knowledge of the Spring Framework. Experience with core AWS services. Experience with serverless approaches using AWS resources. Experience developing infrastructure as code using the CDK, making efficient use of AWS services. Experience with AWS services such as API Gateway, Lambda, DynamoDB, S3, Cognito, and the AWS CLI. Experience using the AWS SDK. Understanding of distributed transactions. Track record of completing assignments on time with a high degree of quality. Experience and/or knowledge of all aspects of the SDLC methodology and related concepts and practices. Experience with Agile development methodologies preferred. Knowledge of Gradle/Maven preferred. Experience working with commodity markets or financial trading environments preferred. Open to learning and willing to participate in development using new frameworks and programming languages.
Good to Have
Knowledge of React tools including React.js, TypeScript, JavaScript ES6, Webpack, Enzyme, Redux, and Flux. Experience with user interface design. Experience in AWS Amplify, RDS, EventBridge, SNS, SQS, and SES.
Posted Date not available
3.0 - 8.0 years
2 - 6 Lacs
Pune
Work from Office
ICE Mortgage Technology is driving value to every customer through our effort to automate everything that can be automated in the residential mortgage industry. Our integrated solutions touch each aspect of the loan lifecycle, from the borrower's "point of thought" through e-Close and secondary solutions, driving real automation that reduces manual workflows, increases productivity, and decreases risk. You will be working in a dynamic product development team while collaborating with other developers, management, and customer support teams. You will have an opportunity to participate in designing and developing services utilized across product lines. The ideal candidate should possess a product mentality, have a strong sense of ownership, and strive to be a good steward of his or her software. More than any concrete experience with specific technology, it is critical for the candidate to have a strong sense of what constitutes good software, be thoughtful and deliberate in picking the right technology stack, and be always open-minded to learn (from others and from failures).
Responsibilities
Develop high-quality data processing infrastructure and scalable services capable of ingesting and transforming data at huge scale from many different sources on schedule. Turn ideas and concepts into carefully designed and well-authored quality code. Articulate the interdependencies and the impact of design choices. Develop APIs to power data-driven products and external APIs consumed by internal and external customers of the data platform. Collaborate with QA, product management, engineering, and UX to achieve well-groomed, predictable results. Improve and develop new engineering processes and tools.
Knowledge and Experience
3+ years of building enterprise software products. Experience in object-oriented design and development with languages such as Java, plus J2EE and related frameworks.
Experience building REST-based microservices in a distributed architecture along with cloud technologies (AWS preferred). Knowledge of Java/J2EE frameworks like Spring Boot, microservices, JPA, JDBC, and related frameworks is a must. Built high-throughput real-time and batch data processing pipelines using Kafka in an AWS environment with AWS services like S3, Kinesis, Lambda, RDS, DynamoDB, or Redshift (should know at least the basics). Experience with a variety of data stores for unstructured and columnar data as well as traditional database systems, for example, MySQL and Postgres. Proven ability to deliver working solutions on time. Strong analytical thinking to tackle challenging engineering problems. Great energy and enthusiasm with a positive, collaborative working style and clear communication and writing skills. Experience working in a "you build it, you run it" DevOps environment. Demonstrated ability to set priorities and work in a fast-paced, dynamic team environment within a start-up culture. Experience with big data technologies and exposure to Hadoop, Spark, AWS Glue, AWS EMR, etc. (nice to have). Experience handling large data sets using technologies like HDFS, S3, Avro, and Parquet (nice to have).
Posted Date not available
1.0 - 6.0 years
30 - 35 Lacs
Pune
Work from Office
Front-End Development: Design and implement responsive, user-friendly interfaces using modern frameworks (e.g., React, Angular). Collaborate with designers to translate Figma designs/wireframes into functional code.
Back-End Development: Develop robust, secure, and scalable back-end services using technologies like Node.js and Python. Create and maintain RESTful APIs for seamless data exchange between client and server.
Database Management: Design and implement SQL and NoSQL (e.g., MongoDB, DynamoDB) databases. Optimize queries for performance and scalability.
Cloud and Deployment: Manage cloud platforms (e.g., AWS, Azure). Use containerization tools like Docker and orchestration tools like Kubernetes for deployment. Integrate CI/CD pipelines to automate builds, testing, and deployments.
Testing and Debugging: Conduct unit, integration, and end-to-end testing.
Collaboration and Documentation: Work closely with product managers, designers, and other developers to define project requirements and deliverables. Document code, APIs, and technical processes for maintainability and knowledge sharing.
Requirements
Creating modular and reusable components. Translating UI/UX reference designs into code as closely as possible. Creating more than basic CRUD applications. Unit testing and E2E testing automation. Maintaining documentation.
Nice to have: Knowledge of deploying to any hosting platform. Experience working in a Linux/Unix environment. Experience working with Three.js.
Must have: HTML/CSS. JavaScript. Any data store such as SQL, Mongo, etc. Fluency in English communication and comprehension.
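The "more than basic CRUD" requirement starts from the basic pattern itself. A minimal in-memory sketch in Python (the class and method names are illustrative, standing in for the REST-plus-database layer described above):

```python
import itertools
from typing import Optional


class CrudStore:
    """Minimal in-memory CRUD store (illustrative sketch, not a
    production design; a real service would back this with SQL or
    a NoSQL store and expose it over a REST API)."""

    def __init__(self):
        self._items = {}
        self._ids = itertools.count(1)  # auto-incrementing ids

    def create(self, data: dict) -> int:
        item_id = next(self._ids)
        self._items[item_id] = dict(data)  # copy so callers can't mutate
        return item_id

    def read(self, item_id: int) -> Optional[dict]:
        return self._items.get(item_id)

    def update(self, item_id: int, data: dict) -> bool:
        if item_id not in self._items:
            return False
        self._items[item_id].update(data)
        return True

    def delete(self, item_id: int) -> bool:
        return self._items.pop(item_id, None) is not None
```

Going "beyond basic CRUD" then means layering validation, authorization, pagination, and transactional guarantees on top of these four operations rather than inventing new ones.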
Posted Date not available