3.0 - 8.0 years
14 - 20 Lacs
Hyderabad
Work from Office
Job Area: Information Technology Group > IT Software Developer

General Summary:
The Qualcomm OneIT team is looking for a talented senior full-stack developer to join our dynamic team and contribute to exciting projects. The ideal candidate will have a strong understanding of Java, Spring Boot, Angular/React, and AWS technologies, as well as experience designing, managing, and deploying applications to the cloud.

Key Responsibilities:
- Design, develop, and maintain web applications using Java, Spring Boot, and Angular/React.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Write clean, maintainable, and efficient code.
- Ensure the performance, quality, and responsiveness of applications.
- Identify and correct bottlenecks and fix bugs.
- Help maintain code quality, organization, and automation.
- Stay up to date with the latest industry trends and technologies.

Minimum Qualifications:
- 3+ years of IT-relevant work experience with a Bachelor's degree in a technical field (e.g., Computer Engineering, Computer Science, Information Systems), OR 5+ years of IT-relevant work experience without a Bachelor's degree.
- 3+ years of any combination of academic or work experience with full-stack application development (e.g., Java, Python, JavaScript).
- 1+ year of any combination of academic or work experience with data structures, algorithms, and data stores.

The candidate should have:
- Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.
- 5-7 years of experience, with a minimum of 3 years as a full-stack developer using Java, Spring Boot, and Angular/React.
- Strong proficiency in Java and Spring Boot.
- Experience with front-end frameworks such as Angular or React.
- Familiarity with RESTful APIs and web services.
- Knowledge of database systems such as Oracle, MySQL, PostgreSQL, or MongoDB.
- Experience with AWS services such as EC2, S3, RDS, Lambda, and API Gateway.
- Understanding of version control systems, preferably Git.
- Excellent problem-solving skills and attention to detail.
- Strong communication and teamwork abilities.
- Experience with other programming languages such as C# or Python.
- Knowledge of containerization technologies such as Docker and Kubernetes.
- Familiarity with CI/CD pipelines and DevOps practices.
- Experience with Agile/Scrum/SAFe methodologies.
Posted 6 hours ago
3.0 - 5.0 years
27 - 32 Lacs
Bengaluru
Work from Office
Job Title: Data Engineer (DE) / SDE – Data
Location: Bangalore
Experience range: 3-15 years

What we offer
Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That is why we at Kotak Group are dedicated to transforming banking by taking a technology-first approach in everything we do, with the aim of enhancing customer experience through superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team
DEX (Kotak's Data Exchange) is the central data org for Kotak Bank and manages the bank's entire data experience. The org comprises the Data Platform, Data Engineering, and Data Governance charters, and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives engineers the opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills this team needs are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based in Bangalore, comprising ~10 sub-teams independently driving their charters.

As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member of Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions (real-time, micro-batch, batch, and analytics) in a programmatic way; and help build forward-looking systems that can be operated by machines using AI technologies. The data platform org is divided into three key verticals:

Data Platform
This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; managed compute and orchestration frameworks, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak's data platform. The team will also be the center of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering
This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial orgs. It will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases.

Data Governance
This will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform.

If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, this is the team for you. Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data
- Bachelor's degree in Computer Science, Engineering, or a related field
- 3-5 years of experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

PREFERRED QUALIFICATIONS
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech is desired
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices for the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills

For Managers
- Customer centricity and obsession for the customer
- Ability to manage stakeholders (product owners, business stakeholders, cross-functional teams) and coach agile ways of working
- Ability to structure and organize teams and streamline communication
- Prior work experience executing large-scale Data Engineering projects
Posted 8 hours ago
0.0 - 2.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Data Engineer - 1 (Experience: 0-2 years)

(The "What we offer" and "About our team" sections, the day-to-day responsibilities, and the qualifications for this posting are the same as the Data Engineer (DE) / SDE – Data posting above; this opening is for 0-2 years of experience.)
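Several of these postings ask for "strong advanced SQL skills", and window functions are a typical example of what that means in analytics work. A self-contained illustration using SQLite via Python's standard library (the table and columns are made up for the example):

```python
import sqlite3

# Illustrative only: a per-branch running total via a window function,
# the kind of "advanced SQL" the qualifications above refer to.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE transactions (txn_date TEXT, branch_id TEXT, amount REAL);
    INSERT INTO transactions VALUES
        ('2024-01-01', 'BR001', 100.0),
        ('2024-01-02', 'BR001', 50.0),
        ('2024-01-01', 'BR002', 75.0);
""")

# SUM(...) OVER (PARTITION BY ... ORDER BY ...) accumulates within each branch.
rows = conn.execute("""
    SELECT branch_id, txn_date, amount,
           SUM(amount) OVER (
               PARTITION BY branch_id ORDER BY txn_date
           ) AS running_total
    FROM transactions
    ORDER BY branch_id, txn_date
""").fetchall()

conn.close()
```

The same pattern carries over directly to Redshift or Spark SQL, both of which support standard window functions.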
Posted 8 hours ago
3.0 - 5.0 years
3 - 5 Lacs
Bengaluru
Work from Office
(The "What we offer" and "About our team" sections, the day-to-day responsibilities, and the Data Engineer basic and preferred qualifications for this posting are the same as the Data Engineer (DE) / SDE – Data posting above. In addition, this posting covers a management role:)

BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager
- 10+ years of engineering experience, most of it in the data domain
- 5+ years of engineering team management experience
- 10+ years of experience planning, designing, developing, and delivering consumer software
- Experience partnering with product or program management teams
- 5+ years of experience managing data engineers, business intelligence engineers, and/or data scientists
- Experience designing or architecting (design patterns, reliability, and scaling) new and existing systems
- Experience managing multiple concurrent programs, projects, and development teams in an Agile environment
- Strong understanding of Data Platform, Data Engineering, and Data Governance
- Experience designing and developing large-scale, high-traffic applications
- For Managers: customer centricity, ability to manage stakeholders (product owners, business stakeholders, cross-functional teams) and coach agile ways of working, ability to structure and organize teams and streamline communication, and prior experience executing large-scale Data Engineering projects
Posted 8 hours ago
2.0 - 5.0 years
30 - 32 Lacs
Bengaluru
Work from Office
Data Engineer - 2 (Experience: 2-5 years)

(The "What we offer" and "About our team" sections, the day-to-day responsibilities, and the qualifications for this posting are the same as the Data Engineer (DE) / SDE – Data posting above; this opening is for 2-5 years of experience.)
Posted 8 hours ago
3.0 - 5.0 years
30 - 35 Lacs
Bengaluru
Work from Office
(The role description, day-to-day responsibilities, and qualifications for this posting, including the Data Engineering Manager / Software Development Manager requirements, are the same as the corresponding Kotak postings above.)
Posted 8 hours ago
3.0 - 5.0 years
4 - 8 Lacs
Bengaluru
Work from Office
What we offer: Our mission is simple – building trust. Our customers’ trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That’s why we at Kotak Group are dedicated to transforming banking by taking a technology-first approach in everything we do, with the aim of enhancing customer experience through superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team: DEX (Kotak’s Data Exchange) is the central data org for Kotak Bank and manages the bank’s entire data experience. The org comprises the Data Platform, Data Engineering and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists a great opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills this team should encompass are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters.
As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member in Kotak’s digital transformation journey; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch and analytics solutions, in a programmatic way; and look ahead to building systems that can be operated by machines using AI technologies. The data platform org is divided into three key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank; a centralized data lake; managed compute and orchestration frameworks, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak’s data platform. The team will also be the center of data engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build some of the most leveraged data models for financial orgs. It will also enable centralized reporting for Kotak Bank, cutting across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers and all analytics use cases.

Data Governance: This will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship and the data quality platform.
If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you. Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data:
- Bachelor's degree in Computer Science, Engineering, or a related field
- 3-5 years of experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager:
- 10+ years of engineering experience, most of it in the data domain
- 5+ years of engineering team management experience
- 10+ years of planning, designing, developing and delivering consumer software experience
- Experience partnering with product or program management teams
- 5+ years of experience managing data engineers, business intelligence engineers and/or data scientists
- Experience designing or architecting (design patterns, reliability and scaling) new and existing systems
- Experience managing multiple concurrent programs, projects and development teams in an Agile environment
- Strong understanding of Data Platform, Data Engineering and Data Governance
- Experience designing and developing large-scale, high-traffic applications

PREFERRED QUALIFICATIONS:
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech is desired
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices for the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills
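Several of the responsibilities above (managing S3, Glue and Lambda together) follow a common event-driven pattern: an S3 upload triggers a Lambda function that extracts the object reference and hands it to a downstream ETL job. A minimal, hedged sketch of that pattern (the routing rules and job names are hypothetical, not from the posting):

```python
import json
from urllib.parse import unquote_plus

def route_s3_event(event):
    """Parse an S3 ObjectCreated event and decide which ETL job each file feeds.

    Returns a list of (bucket, key, job) tuples; in a real deployment the
    Lambda body would start a Glue job run (e.g. via boto3) instead of
    merely returning them.
    """
    routes = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        # S3 event keys are URL-encoded (spaces arrive as '+')
        key = unquote_plus(record["s3"]["object"]["key"])
        # Hypothetical routing rule: the key prefix decides the downstream job
        job = "curate_transactions" if key.startswith("raw/txn/") else "curate_generic"
        routes.append((bucket, key, job))
    return routes

def handler(event, context):
    # Lambda entry point: log and return the routing decisions
    routes = route_s3_event(event)
    print(json.dumps([{"bucket": b, "key": k, "job": j} for b, k, j in routes]))
    return {"processed": len(routes)}
```

Invoking `handler` locally with a sample S3 event dict is enough to exercise the routing logic before wiring the function to a real bucket notification.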
Posted 8 hours ago
15.0 - 20.0 years
15 - 19 Lacs
Ahmedabad
Work from Office
Project Role: Technology Architect
Project Role Description: Review and integrate all application requirements, including functional, security, integration, performance, quality and operations requirements. Review and integrate the technical architecture requirements. Provide input into final decisions regarding hardware, network products, system software and security.
Must have skills: Amazon Web Services (AWS)
Good to have skills: Java Full Stack Development
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education.

Key Responsibilities:
1. Experience designing multiple cloud-native application architectures
2. Experience developing and deploying cloud-native applications, including serverless environments like Lambda
3. Optimize applications for the AWS environment
4. Design, build and configure applications on the AWS environment to meet business process and application requirements
5. Understanding of security, performance and cost optimizations for AWS
6. Understanding of AWS Well-Architected best practices

Technical Experience:
1. 8/15 years of experience in the industry, with at least 5 years in AWS
2. Strong development background with exposure to the majority of services in AWS
3. AWS Certified Developer Professional and/or AWS specialty-level certification (DevOps/Security)
4. Application development skills on the AWS platform with the Java SDK, Python SDK, or React.js
5. Strong coding skills in any of Python/Node.js/Java/.NET; understanding of AWS architectures across containerization, microservices and serverless on AWS
6. Preferred knowledge of Cost Explorer, budgeting and tagging in AWS
7. Experience with DevOps tools, including AWS-native DevOps tools like CodeDeploy

Professional Attributes:
a. Ability to harvest solutions and promote reusability across implementations
b. Self-motivated experts who can work under their own direction with the right design-thinking expertise
c. Proven interpersonal skills, contributing to team effort by accomplishing related results as needed

Additional Information:
1. Application development skills on the AWS platform with the Java SDK, Python SDK, Node.js, or React.js
2. AWS services: Lambda, AWS Amplify, AWS App Runner, AWS CodePipeline, AWS Cloud9, EBS, Fargate
Additional comments: Only Bangalore; no location flex and no level flex.
Qualification: 15 years of full-time education.
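Serverless development with Lambda behind API Gateway, as described above, hinges on Lambda returning responses in API Gateway's proxy-integration shape (integer `statusCode`, header dict, JSON string `body`). A hedged sketch of a helper for that contract (the route logic is illustrative):

```python
import json

def proxy_response(status, body, headers=None):
    """Build a response in the API Gateway Lambda proxy-integration format:
    statusCode (int), headers (dict), and a JSON-serialized string body."""
    base = {"Content-Type": "application/json"}
    if headers:
        base.update(headers)
    return {"statusCode": status, "headers": base, "body": json.dumps(body)}

def handler(event, context):
    # Illustrative route: echo a path parameter back to the caller
    name = (event.get("pathParameters") or {}).get("name")
    if not name:
        return proxy_response(400, {"error": "missing 'name' path parameter"})
    return proxy_response(200, {"greeting": f"Hello, {name}"})
```

Returning a raw dict instead of this shape is a classic cause of API Gateway's opaque 502 "malformed Lambda proxy response" error, which is why the helper centralizes the format.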
Posted 9 hours ago
15.0 - 20.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Microservices and Lightweight Architecture
Good to have skills: NA
Minimum 15 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Modernization Lead:
- Lead modernization initiatives by re-architecting legacy systems using Java, applying modern software design principles and AWS-based architecture patterns.
- Drive end-to-end modernization efforts, including re-architecture, refactoring of legacy systems, and cloud migration strategies.
- Provide architectural guidance and mentorship to engineering teams, fostering best practices in code quality, design, testing, and deployment.
- Apply Domain-Driven Design (DDD) principles to structure systems aligned with core business domains, ensuring modular and maintainable solutions.
- Design and implement scalable, decoupled services leveraging AWS services such as EKS, Lambda, API Gateway, SQS/SNS, and Oracle/RDS.
- Drive system decomposition, refactoring, and migration planning with a clear understanding of system interdependencies and data flows.
- Promote infrastructure-as-code, CI/CD automation, and observability practices to ensure system reliability, performance, and operational readiness.
- Proficient in architecting applications with the Java and AWS technology stack, microservices, and containers.

Qualification: 15 years full-time education
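Decoupling services over SQS/SNS, as this role calls for, usually means wrapping domain events in an envelope whose deduplication id is derived from content, so retried publishes are dropped by an SQS FIFO queue instead of double-processed. A hedged sketch of that envelope (in Python for brevity, though the posting's stack is Java; the field names match SQS `send_message` parameters, the event schema is invented):

```python
import hashlib
import json

def make_envelope(source, event_type, payload):
    """Wrap a domain event for publication to an SQS FIFO queue.

    The deduplication id is a content hash, so republishing the same event
    (e.g. after a retry) yields the same id and the queue can drop the
    duplicate -- one common way to keep decoupled services idempotent.
    """
    body = json.dumps(
        {"source": source, "type": event_type, "payload": payload},
        sort_keys=True,  # stable serialization -> stable hash
    )
    dedup_id = hashlib.sha256(body.encode()).hexdigest()
    return {"MessageBody": body,
            "MessageDeduplicationId": dedup_id,
            "MessageGroupId": source}
```

The returned dict can be splatted into `sqs_client.send_message(QueueUrl=..., **envelope)`; grouping by `source` keeps per-producer ordering without serializing the whole queue.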
Posted 9 hours ago
3.0 - 8.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: The Data Engineering Sr. Advisor demonstrates expertise in data engineering technologies with a focus on engineering, innovation, strategic influence and a product mindset. This individual will act as a key contributor on the team to design, build, test and deliver large-scale software applications, systems, platforms, services or technologies in the data engineering space, and will have the opportunity to work directly with partner IT and business teams, owning and driving major deliverables across all aspects of software delivery. The candidate will play a key role in automating processes on Databricks and AWS, collaborating with business and technology partners to gather requirements, then develop and implement solutions. The individual must have strong analytical and technical skills coupled with the ability to positively influence the delivery of data engineering products. The applicant will be working in a team that demands an innovation, cloud-first, self-service-first, and automation-first mindset coupled with technical excellence.

The applicant will be working with internal and external stakeholders and customers to build solutions as part of Enterprise Data Engineering and will need to demonstrate very strong technical and communication skills.
- Delivery: Intermediate delivery skills, including the ability to deliver work at a steady, predictable pace to achieve commitments, decompose work assignments into small-batch releases and contribute to tradeoff and negotiation discussions.
- Domain Expertise: A demonstrated track record of domain expertise, including the ability to understand the technical concepts necessary to do the job effectively, demonstrate willingness, cooperation, and concern for business issues, and possess in-depth knowledge of the immediate systems worked on.
- Problem Solving: Proven problem-solving skills, including debugging skills that let you determine the source of issues in unfamiliar code or systems; the ability to recognize and solve repetitive problems rather than working around them; recognizing mistakes and using them as learning opportunities; and breaking down large problems into smaller, more manageable ones.

Responsibilities:
- Deliver business needs end to end, from requirements through development into production.
- Through a hands-on engineering approach in the Databricks environment, deliver data engineering toolchains, platform capabilities and reusable patterns.
- Follow software engineering best practices with an automation-first approach and a continuous learning and improvement mindset.
- Ensure adherence to enterprise architecture direction and architectural standards.
- Collaborate in a high-performing team environment, with an ability to influence and be influenced by others.

Experience Required:
- More than 12 years of experience in software engineering, building data engineering pipelines, middleware and API development and automation
- More than 3 years of experience in Databricks within an AWS environment
- Data engineering experience

Experience Desired:
- Expertise in Agile software development principles and patterns
- Expertise in building streaming, batch and event-driven architectures and data pipelines

Primary Skills:
- Cloud-based security principles and protocols like OAuth2, JWT, data encryption, hashing data, secret management, etc.
- Expertise in big data technologies such as Spark, Hadoop, Databricks, Snowflake, EMR, Glue
- Good understanding of Kafka, Kafka Streams, Spark Structured Streaming, and configuration-driven data transformation and curation
- Expertise in building cloud-native microservices, containers, Kubernetes and platform-as-a-service technologies such as OpenShift, Cloud Foundry
- Experience in multi-cloud software-as-a-service products such as Databricks, Snowflake
- Experience in Infrastructure-as-Code (IaC) tools such as Terraform and AWS CloudFormation
- Experience in messaging systems such as Apache ActiveMQ, WebSphere MQ, Apache Artemis, Kafka, AWS SNS
- Experience in API and microservices stacks such as Spring Boot, Quarkus
- Expertise in cloud technologies such as AWS Glue, Lambda, S3, Elasticsearch, API Gateway, CloudFront
- Experience with one or more of the following programming and scripting languages: Python, Scala, JVM-based languages, or JavaScript, and the ability to pick up new languages
- Experience in building CI/CD pipelines using Jenkins, GitHub Actions
- Strong expertise with source code management and its best practices
- Proficiency in self-testing of applications, unit testing and use of mock frameworks, test-driven development (TDD)
- Knowledge of the Behavior-Driven Development (BDD) approach

Additional Skills:
- Ability to perform detailed analysis of business problems and technical environments
- Strong oral and written communication skills
- Ability to think strategically, implement iteratively and estimate the financial impact of design/architecture alternatives
- Continuous focus on ongoing learning and development

Qualification:
15 years full time education
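The posting's mention of configuration-driven data transformation and curation is a pattern worth illustrating: transformation rules live in declarative config rather than code, so new datasets are onboarded by editing config. A minimal pure-Python sketch (the rule vocabulary and dataset are invented for illustration; real platforms would drive Spark/Databricks jobs from similar config):

```python
def apply_rules(record, rules):
    """Apply a list of declarative transformation rules to one record.

    Supported (illustrative) rule types:
      rename - move a value from one field name to another
      upper  - uppercase a string field
      drop   - remove a field
    """
    out = dict(record)  # leave the input record untouched
    for rule in rules:
        kind, field = rule["op"], rule["field"]
        if kind == "rename":
            out[rule["to"]] = out.pop(field)
        elif kind == "upper":
            out[field] = out[field].upper()
        elif kind == "drop":
            out.pop(field, None)
        else:
            raise ValueError(f"unknown rule op: {kind}")
    return out

# Config, not code, describes the curation of this (hypothetical) dataset
CUSTOMER_RULES = [
    {"op": "rename", "field": "cust_nm", "to": "customer_name"},
    {"op": "upper", "field": "customer_name"},
    {"op": "drop", "field": "ssn"},
]
```

Because the rules are plain data, they can be versioned, validated, and reviewed independently of the engine that executes them.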
Posted 9 hours ago
7.0 - 12.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Project Role: Cloud Native Engineer
Project Role Description: Select and deploy appropriate cloud-native tools to accelerate application development. Knowledge of the target cloud-native tools is necessary, and this role can specialize in one specific native cloud, e.g. Azure, AWS, GCP, etc.
Must have skills: Amazon Connect
Good to have skills: Java, Python (Programming Language), Node.js
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As a Cloud Native Engineer, you will be responsible for selecting and deploying suitable cloud-native tools to expedite application development. Expertise in the target cloud-native tools is crucial, and this role may focus on a specific native cloud platform like Azure, AWS, or GCP.

Roles & Responsibilities:
- Manage and guide a team of engineers to design, build, and deploy cloud-native solutions using AWS services.
- Own project planning, resource allocation, and delivery management while ensuring adherence to timelines and quality standards.
- Architect scalable and secure solutions using AWS Connect, Amazon Lex, Lambda, and API Gateway.
- Act as the primary point of contact for technical and delivery matters, liaising with cross-functional teams and business stakeholders.
- Promote best practices in cloud architecture, including the AWS Well-Architected Framework.
- Drive team performance through regular coaching, performance reviews, and mentorship.
- Ensure continuous integration and delivery pipelines are in place, and champion a DevOps culture.
- Oversee the implementation of Infrastructure as Code using Terraform or similar tools.

Must have:
- Deep experience with AWS Connect, Amazon Lex, AWS Lambda, and API Gateway
- Strong programming knowledge in at least one language: Python, Node.js, or Java
- Proven leadership experience managing cloud/DevOps teams
- Solid grasp of serverless architecture, microservices, and cloud-native design patterns
- Excellent communication, planning, and stakeholder management skills

Professional & Technical Skills:
- Must-have skills: Proficiency in Amazon Connect.
- Good-to-have skills: Experience with Python (Programming Language), Node.js, Java.
- Strong understanding of cloud-native architecture principles.
- Hands-on experience in deploying cloud-native applications.
- Knowledge of containerization technologies like Docker and Kubernetes.
- Familiarity with DevOps tools and practices.
- Hands-on experience with Terraform for managing infrastructure.
- In-depth understanding of the AWS Well-Architected Framework.
- Certifications: AWS Certified Solutions Architect Associate/Professional; any Associate/Professional/Specialty-level certification is mandatory.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Amazon Connect.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualification: 15 years full-time education
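Amazon Connect contact flows commonly hand off to an Amazon Lex bot whose fulfillment logic is an AWS Lambda function. A hedged sketch of a Lex V2-style fulfillment handler (the intent and slot names are hypothetical; the response shape follows Lex V2's `sessionState`/`dialogAction` contract):

```python
def close(intent_name, message, state="Fulfilled"):
    """Build a Lex V2 'Close' response ending the dialog with one message."""
    return {
        "sessionState": {
            "dialogAction": {"type": "Close"},
            "intent": {"name": intent_name, "state": state},
        },
        "messages": [{"contentType": "PlainText", "content": message}],
    }

def handler(event, context):
    intent = event["sessionState"]["intent"]
    slots = intent.get("slots") or {}
    if intent["name"] == "CheckOrderStatus":  # hypothetical intent
        slot = slots.get("OrderId") or {}
        order_id = (slot.get("value") or {}).get("interpretedValue")
        if order_id:
            return close(intent["name"], f"Order {order_id} is being processed.")
        return close(intent["name"], "I could not find an order id.", "Failed")
    return close(intent["name"], "Sorry, I can't handle that request.", "Failed")
```

Exercising the handler locally with a sample Lex event catches format mistakes before the bot is wired into a Connect flow.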
Posted 9 hours ago
4.0 - 7.0 years
15 - 22 Lacs
Bengaluru
Work from Office
We are looking for a top-notch Senior Software Engineer who is passionate about writing clean, scalable, and secure code. If you take pride in building sustainable applications that meet customer needs and thrive in a collaborative, agile environment, this role is for you. You’ll work with experienced engineers across the enterprise and gain exposure to a variety of automation and cloud technologies. As a Python developer, you will contribute to complex assignments involving cloud-native architectures, automation pipelines, serverless computing, and object-oriented programming.

Technical Skills:
- Proficiency in Python and cloud platforms (AWS, Azure)
- Experience with MLflow, Kubernetes, Terraform, AWS SageMaker, Lambda, Step Functions
- Familiarity with configuration management tools (Terraform, Ansible, CloudFormation)
- Experience with CI/CD pipelines (e.g., Jenkins, Groovy scripts)
- Containerization and orchestration (Docker, Kubernetes, ECS, ECR)
- Understanding of serverless architecture and cloud-native application design
- Knowledge of infrastructure as code (IaC), IaaS, PaaS, and SaaS models
- Exposure to AI/ML technologies and model management is a plus
- Strong verbal and written communication skills

Qualifications:
- Bachelor’s degree in Computer Science, Information Systems, or a related field
- 4+ years of experience in architecting, designing, and implementing cloud solutions on AWS and/or Azure
- Proven experience with both relational and non-relational database systems
- Experience leading data architecture or cloud transformation initiatives
- Strong troubleshooting and analytical skills
- Relevant certifications in AWS or Azure preferred

Roles and Responsibilities:
- Analyze and translate business requirements into scalable and resilient designs
- Own and continuously improve parts of the application in an agile environment
- Develop high-quality, maintainable products using best engineering practices
- Collaborate with other developers and share design philosophies across the team
- Work in cross-functional teams including DevOps, Data, UX, and QA
- Build and manage fully automated build/test/deployment environments
- Ensure high availability and provide rapid response to production issues
- Contribute to the design of useful, usable, and desirable products
- Adapt to new programming languages, platforms, and frameworks as needed
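Automation pipelines of the kind this role describes (e.g. orchestrating SageMaker or Step Functions calls) typically wrap transient cloud API failures in retries with exponential backoff. A small, hedged sketch of that pattern (the delays, attempt count, and exception choice are illustrative):

```python
import time
from functools import wraps

def retry(times=3, base_delay=0.1, exc=Exception):
    """Retry a flaky call with exponential backoff: base_delay * 2**attempt."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(times):
                try:
                    return fn(*args, **kwargs)
                except exc:
                    if attempt == times - 1:
                        raise  # out of attempts: surface the error
                    time.sleep(base_delay * (2 ** attempt))
        return wrapper
    return decorator

@retry(times=3, base_delay=0)  # zero delay here just to keep the demo fast
def flaky(counter):
    # Stand-in for a transient cloud API failure on the first two calls
    counter["calls"] += 1
    if counter["calls"] < 3:
        raise ConnectionError("transient")
    return "ok"
```

In production code the decorator would usually narrow `exc` to throttling/connection errors and add jitter so many retrying clients do not synchronize.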
Posted 10 hours ago
4.0 - 9.0 years
10 - 20 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Role: AWS Data Engineer
Location: Hyderabad, Bangalore, Chennai, Mumbai, Pune, Kolkata, Gurgaon
Experience: 4-8 years
Work Mode: Hybrid
Engagement: Contract to hire

Job Description: We are seeking a skilled AWS Data Engineer to design, develop, and support large-scale data solutions on AWS. The ideal candidate will have hands-on expertise in data engineering, automation, and cloud technologies, enabling data-driven decision-making and operational excellence.

Key Responsibilities:
- Design, develop, and deploy data pipelines and solutions using AWS services such as S3, Glue, Lambda, API Gateway, and SQS.
- Write clean, efficient code using Python, PySpark, and SQL to process and transform data.
- Implement batch job scheduling, manage data dependencies, and ensure reliable data processing workflows.
- Develop and maintain Spark and Airflow jobs for large-scale data processing and orchestration.
- Automate repetitive tasks and build reusable frameworks to enhance efficiency and reduce manual intervention.
- Provide Run/DevOps support, monitor pipelines, and manage the ongoing operation of data services on AWS.
- Ensure high standards for data quality, reliability, and performance.
- Collaborate with data scientists, analysts, and other engineers to support business initiatives.

Must-Have Skills:
- Strong hands-on experience with AWS services: S3, Lambda, Glue, API Gateway, SQS
- Proficiency in Python, PySpark, and SQL
- Experience with batch job scheduling and managing data dependencies
- Strong knowledge of Spark and Airflow for data processing and orchestration
- Solid understanding of DevOps practices and operational support for cloud data services

Good to Have:
- Experience with containerization (Docker, Kubernetes)
- Exposure to monitoring/logging tools (CloudWatch, Datadog, etc.)
- AWS certifications (e.g., Solutions Architect, Data Analytics Specialty)
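The batch-scheduling responsibility above boils down to running jobs in dependency order, which is what orchestrators like Airflow derive from a DAG. A hedged, minimal sketch of the underlying idea (Kahn's topological sort; the pipeline and job names are hypothetical):

```python
from collections import deque

def run_order(deps):
    """Return a valid execution order for jobs given {job: [upstream jobs]}.

    Implements Kahn's topological sort; raises if the dependencies contain
    a cycle (which an orchestrator like Airflow would also reject).
    """
    indegree = {job: len(up) for job, up in deps.items()}
    downstream = {job: [] for job in deps}
    for job, ups in deps.items():
        for up in ups:
            downstream[up].append(job)
    ready = deque(sorted(j for j, d in indegree.items() if d == 0))
    order = []
    while ready:
        job = ready.popleft()
        order.append(job)
        for nxt in downstream[job]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)
    if len(order) != len(deps):
        raise ValueError("cycle in job dependencies")
    return order

# Hypothetical pipeline: land raw files, then transform, then load the mart
PIPELINE = {"extract": [], "transform": ["extract"], "load": ["transform"],
            "report": ["load", "transform"]}
```

Jobs whose indegree hits zero simultaneously are independent and could run in parallel, which is exactly how Airflow schedules sibling tasks.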
Posted 11 hours ago
5.0 - 10.0 years
10 - 14 Lacs
Hyderabad
Work from Office
Project Role: Cloud Platform Engineer
Project Role Description: Designs, builds, tests, and deploys cloud application solutions that integrate cloud and non-cloud infrastructure. Can deploy infrastructure and platform environments, and creates a proof of architecture to test architecture viability, security and performance.
Must have skills: DevOps Architecture
Good to have skills: AWS Architecture, Microsoft Azure Architecture
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary / Roles & Responsibilities: The Cloud Solution Consultant is a member of the Global Digital organization and is responsible for professional/technical work related to public cloud platform usage across the enterprise. This is an advanced position, primarily responsible for developing and supporting the implementation of solutions using public cloud platforms and the underlying architecture. These implementations span a broad range of vendor-supplied global services and solutions being utilized across the enterprise. The incumbent leads and participates in a broad spectrum of cloud topics (securing alignment, reusability and partnership) across the enterprise where public cloud services are being utilized, and will work in close collaboration with the Cyber and Global Infrastructure Services teams to lead the delivery of solutions across the enterprise. This role will directly support digital transformation, technology strategy and cloud adoption as part of the Global Digital team. This team is constantly confronted with business challenges, and the Cloud Solution Consultant will help cultivate the understanding of high-level, loosely specified requirements and transform them into value for our business through direct contribution to the design, architecture and implementation of solutions on the public cloud platforms.

This associate is responsible for communicating and influencing technology decisions for both technical and non-technical audiences. The position will support the Operations team by instilling the right mindset and securing adequate tools, methodology and metrics. The incumbent must embrace corporate values and digital/analytic principles. In addition, he/she needs to secure adherence to the vision and enable an inclusive (open to diversity) working environment, valuing networks/communities of practice.

Professional & Technical Skills:
- Must-have skills: Proficiency in cloud infrastructure.
- Strong understanding of cloud computing platforms on AWS and Azure.
- Multi-cloud architecture and deployment: demonstrated experience designing, deploying, and managing complex cloud architectures across both AWS (e.g., VPC, EC2, Lambda) and Azure (e.g., Virtual Networks, VMs, Azure Functions).
- Proficiency with IaC tools such as Terraform, AWS CloudFormation, or Azure Resource Manager (ARM) templates.
- Hands-on experience with CI/CD pipelines and automation tools like Jenkins or GitLab.
- Knowledge of containerization technologies like Docker and Kubernetes.
- Security and compliance: proven track record implementing best practices for identity and well-architected design.

Additional Information:
- The candidate should have a minimum of 5 years of experience in cloud infrastructure.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Qualification: 15 years full-time education
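IaC tools like CloudFormation ultimately consume declarative templates, and generating those templates programmatically is one way teams keep multi-environment deployments consistent. A hedged Python sketch that emits a minimal CloudFormation template for an S3 bucket (the bucket naming scheme is hypothetical; the resource properties follow the `AWS::S3::Bucket` schema):

```python
import json

def s3_bucket_template(env, app):
    """Render a minimal CloudFormation template declaring one S3 bucket,
    with a per-environment name (hypothetical naming convention)."""
    return {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Description": f"{app} storage for the {env} environment",
        "Resources": {
            "AppBucket": {
                "Type": "AWS::S3::Bucket",
                "Properties": {
                    "BucketName": f"{app}-{env}-data",
                    # Block all public access, a common baseline control
                    "PublicAccessBlockConfiguration": {
                        "BlockPublicAcls": True,
                        "BlockPublicPolicy": True,
                        "IgnorePublicAcls": True,
                        "RestrictPublicBuckets": True,
                    },
                },
            }
        },
    }

# The same function serves dev and prod, so the two stacks cannot drift
print(json.dumps(s3_bucket_template("dev", "reports"), indent=2))
```

Generating templates from one function (rather than copying YAML between environments) is the design choice that keeps the dev and prod stacks structurally identical.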
Posted 1 day ago
2.0 - 3.0 years
3 - 7 Lacs
Ahmedabad
Work from Office
- Enterprise-Level Development: Design, develop, and deploy full-stack web applications for enterprise systems, gaming platforms, and social media applications.
- Backend Development: Build and maintain scalable backend services using Node.js, Next.js, Nest.js, and related frameworks.
- Frontend Development: Create dynamic and responsive user interfaces using React.js. Familiarity with Next.js and Vue.js is a plus.
- Third-Party Integrations: Integrate APIs, payment gateways, cloud services, and social media platforms effectively.
- Database Management: Design, optimize, and manage both relational (PostgreSQL, MySQL) and NoSQL (MongoDB) databases for large-scale data handling.
- Cloud Infrastructure: Deploy and manage applications on AWS (e.g., EC2, S3, Lambda, RDS) for scalable and reliable performance.
- Containerization: Use Docker for application containerization and ensure smooth deployment across environments.
- Team Leadership & Communication: Lead development teams, mentor junior developers, and maintain clear communication with clients to gather requirements and ensure delivery satisfaction.
- Performance Optimization: Improve application speed, scalability, and security to ensure high availability and an excellent user experience.
- Agile Collaboration: Work in Agile teams, participating in sprint planning and reviews, and delivering consistent, high-quality results.
- AI Integration: Collaborate on AI features such as chatbots, recommendation systems, sentiment analysis, or content generation using tools like OpenAI, AWS AI/ML, or Google Cloud AI.

Required Skills & Experience:
- Node.js: 2+ years of hands-on experience with scalable backend development.
- React.js & Next.js: Strong expertise in building modern frontends, including SSR and SSG.
- Database Expertise: Solid experience with PostgreSQL, MySQL, and MongoDB, including schema design and performance tuning.
- Cloud Platforms: Proficient in AWS services (EC2, S3, Lambda, RDS) for hosting and scaling applications.
- Docker: Deep understanding of containerization with Docker; familiarity with orchestration tools like Kubernetes.
- AI Tooling: Exposure to AI platforms such as OpenAI, Google Cloud AI, or similar tools.

Why Join Webs Optimization Software Solution:
- Working Days: 5 days a week
- Company History: Incorporated since 2013
- Team: An ever-growing team of 80+ highly talented professionals
- Work Schedule: Flexible working hours
- Health Benefits: Medical insurance
- Work Culture: Positive atmosphere and culture promoting personal growth
- Job Satisfaction: Job satisfaction and stability with a suitable leave policy
- Company Activities: Fun company activities
- Benefit of WFH policy
Posted 1 day ago
4.0 - 8.0 years
5 - 9 Lacs
Hyderabad
Work from Office
About Persistent
We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what's next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world, including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we've onboarded over 4,900 new employees in the past year, bringing our total employee count to more than 23,500 people located in 19 countries across the globe.
Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details, please visit www.persistent.com
About The Position
We are looking for an experienced AWS Developer responsible for making our app more scalable and reliable. You will containerize our application and migrate it to EKS or another AWS service such as ECS or Lambda; at present we are running our services on EC2 machines using Auto Scaling Groups. You will be responsible for setting up a monitoring stack, and these metrics will be used for service capacity planning. Additionally, you will update our deployment model to cover automatic rollbacks, short downtime when a new version is deployed to production servers, and similar challenges. Migration to the AWS CI/CD stack will also form a part of your responsibilities.
What You'll Do
Assist in the rollout and training of resources on utilizing AWS data science support tools and the AWS environment for development squads.
Work within the client's AWS environment to help implement AI/ML model development and data platform architecture.
Help evaluate, recommend, and assist with installation of cloud-based tools.
Wrangle data, and host and deploy AI models.
Expertise You'll Bring
A Bachelor's or Master's degree in science, engineering, mathematics, or equivalent experience
5+ years working as a DevOps engineer
Strong hands-on experience with AWS (Lambda, S3, or similar tools)
Working in an Agile environment
Object-Oriented Programming (OOP)
Relational databases (MySQL preferred)
Proficiency in containerization tools
Linux shell scripting (Bash, Python)
Git
CI/CD using Jenkins
Containerization (Kubernetes, Pivotal Cloud Foundry, or other similar tools)
Software development process, including architectural styles and design patterns
Creating CI/CD pipelines using Jenkins, CodeBuild, AWS ECR, and Helm
Jenkins job as code and infrastructure as code
All aspects of provisioning compute resources within the AWS environment
Benefits
Competitive salary and benefits package
Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications
Opportunity to work with cutting-edge technologies
Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
Annual health check-ups
Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents
Inclusive Environment
• We offer hybrid work options and flexible working hours to accommodate various needs and preferences.
• Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities.
Let's unleash your full potential. See Beyond, Rise Above
Posted 1 day ago
4.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
About Persistent
We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what's next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world, including 14 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,186M annual revenue (13.2% Y-o-Y). Along with our growth, we've onboarded over 4,900 new employees in the past year, bringing our total employee count to more than 23,850 people located in 21 countries across the globe. Throughout this market-leading growth, we've maintained strong employee satisfaction: over 94% of our employees approve of the CEO and 89% recommend working at Persistent to a friend.
At Persistent, we embrace diversity to unlock everyone's potential. Our programs empower our workforce by harnessing varied backgrounds for creative, innovative problem-solving. Our inclusive environment fosters belonging, encouraging employees to unleash their full potential. For more details, please visit www.persistent.com
About The Position
We are looking for an experienced AWS Developer responsible for making our app more scalable and reliable. You will containerize our application and migrate it to EKS or another AWS service such as ECS or Lambda; at present we are running our services on EC2 machines using Auto Scaling Groups. You will be responsible for setting up a monitoring stack, and these metrics will be used for service capacity planning. Additionally, you will update our deployment model to cover automatic rollbacks, short downtime when a new version is deployed to production servers, and similar challenges. Migration to the AWS CI/CD stack will also form a part of your responsibilities.
What You'll Do
Assist in the rollout and training of resources on utilizing AWS data science support tools and the AWS environment for development squads.
Work within the client's AWS environment to help implement AI/ML model development and data platform architecture.
Help evaluate, recommend, and assist with installation of cloud-based tools.
Wrangle data, and host and deploy AI models.
Expertise You'll Bring
A Bachelor's or Master's degree in science, engineering, mathematics, or equivalent experience
5+ years working as a DevOps engineer
Strong hands-on experience with AWS (Lambda, S3, or similar tools)
Working in an Agile environment
Object-Oriented Programming (OOP)
Relational databases (MySQL preferred)
Proficiency in containerization tools
Linux shell scripting (Bash, Python)
Git
CI/CD using Jenkins
Containerization (Kubernetes, Pivotal Cloud Foundry, or other similar tools)
Software development process, including architectural styles and design patterns
Creating CI/CD pipelines using Jenkins, CodeBuild, AWS ECR, and Helm
Jenkins job as code and infrastructure as code
All aspects of provisioning compute resources within the AWS environment
Benefits
Competitive salary and benefits package
Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications
Opportunity to work with cutting-edge technologies
Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
Annual health check-ups
Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents
Inclusive Environment
• We offer hybrid work options and flexible working hours to accommodate various needs and preferences.
• Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities.
Let's unleash your full potential. See Beyond, Rise Above
Posted 1 day ago
4.0 - 8.0 years
5 - 9 Lacs
Pune
Work from Office
About Persistent
We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what's next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world, including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we've onboarded over 4,900 new employees in the past year, bringing our total employee count to more than 23,500 people located in 19 countries across the globe.
Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details, please visit www.persistent.com
About The Position
We are looking for an experienced AWS Developer responsible for making our app more scalable and reliable. You will containerize our application and migrate it to EKS or another AWS service such as ECS or Lambda; at present we are running our services on EC2 machines using Auto Scaling Groups. You will be responsible for setting up a monitoring stack, and these metrics will be used for service capacity planning. Additionally, you will update our deployment model to cover automatic rollbacks, short downtime when a new version is deployed to production servers, and similar challenges. Migration to the AWS CI/CD stack will also form a part of your responsibilities.
What You'll Do
Assist in the rollout and training of resources on utilizing AWS data science support tools and the AWS environment for development squads.
Work within the client's AWS environment to help implement AI/ML model development and data platform architecture.
Help evaluate, recommend, and assist with installation of cloud-based tools.
Wrangle data, and host and deploy AI models.
Expertise You'll Bring
A Bachelor's or Master's degree in science, engineering, mathematics, or equivalent experience
5+ years working as a DevOps engineer
Strong hands-on experience with AWS (Lambda, S3, or similar tools)
Working in an Agile environment
Object-Oriented Programming (OOP)
Relational databases (MySQL preferred)
Proficiency in containerization tools
Linux shell scripting (Bash, Python)
Git
CI/CD using Jenkins
Containerization (Kubernetes, Pivotal Cloud Foundry, or other similar tools)
Software development process, including architectural styles and design patterns
Creating CI/CD pipelines using Jenkins, CodeBuild, AWS ECR, and Helm
Jenkins job as code and infrastructure as code
All aspects of provisioning compute resources within the AWS environment
Benefits
Competitive salary and benefits package
Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications
Opportunity to work with cutting-edge technologies
Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
Annual health check-ups
Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents
Inclusive Environment
• We offer hybrid work options and flexible working hours to accommodate various needs and preferences.
• Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities.
Let's unleash your full potential. See Beyond, Rise Above
Posted 1 day ago
7.0 - 13.0 years
9 - 15 Lacs
Bengaluru
Work from Office
Location: Bangalore
About LeadSquared
One of the fastest-growing SaaS unicorn companies in the CRM space, LeadSquared empowers organizations with the power of automation. More than 2,000 customers with 2 lakh+ users across the globe utilize the LeadSquared platform to automate their sales and marketing processes and run high-velocity sales at scale. We are backed by prominent investors such as Westbridge Capital, Stakeboat Capital, and Gaja Capital, to name a few. We are expanding rapidly, and our 1,300+ strong and still-growing workforce is spread across India, the US, the Middle East, ASEAN, ANZ, and South Africa.
Among the Top 50 fastest-growing tech companies in India as per the Deloitte Fast 50 program.
Winner of Frost & Sullivan's 2019 Marketing Automation Company of the Year award.
Among the Top 100 fastest-growing companies in FT 1000: High-Growth Companies Asia-Pacific.
Listed as a Top Rated Product on G2 Crowd, GetApp, and TrustRadius.
Engineering @ LeadSquared
At LeadSquared, we like being up to date with the latest technology and utilizing trending tech stacks to build our product. By joining the engineering team, you get to work first-hand with the latest web and mobile technologies and solve challenges of scale, performance, security, and cost optimization. Our goal is to build the best SaaS platform for sales execution in the industry, and what better place than LeadSquared for an exciting career?
The Role
The LeadSquared platform and product suite are 100% on the cloud, currently all on AWS. The product suite comprises a large number of applications, services, and APIs built on various open-source and AWS-native tech stacks and deployed across multiple AWS accounts. The role involves leading the mission-critical responsibility of ensuring that all our online services are available, reliable, secure, performant, and running at optimal cost. We firmly believe in a code- and automation-driven approach to Site Reliability.
Responsibilities
Take ownership of release management with effective build and deployment processes by collaborating with development teams.
Manage infrastructure and configuration of production systems.
Be a stakeholder in product scoping, performance enhancement, cost optimization, and architecture discussions with Engineering leaders.
Automate DevOps functions and take full control of source code repository management with continuous integration.
Build a strong understanding of product functionality, customers' use cases, and architecture.
Prioritize and meet SLAs for incidents and service management; ensure projects are managed and delivered on time and with quality.
Recommend new technologies and tools that automate manual tasks, improve observability, and speed up troubleshooting.
Ensure the team adheres to compliance and company policies with regular audits.
Motivate, empower, and improve the team's technical skills.
Requirements
13+ years' experience in building, deploying, and scaling software applications on the AWS cloud (preferably in SaaS).
Deep understanding of observability and cost optimization across all major AWS services: EC2, RDS, Elasticsearch, Redis, SQS, API Gateway, Lambda, etc.
AWS certification is a plus.
Experience in building tools for deployment automation and observability response management for AWS resources; .NET, Python, and CFTs or Terraform are preferred.
Operational experience in deploying, operating, scaling, and troubleshooting large-scale production systems on the cloud.
Strong interpersonal communication skills (including listening, speaking, and writing) and ability to work well in a diverse, team-focused environment with other DevOps and engineering teams.
Ability to function well in a fast-paced, rapidly changing environment.
5+ years' experience in people management.
Why Should You Apply?
Fast-paced environment
Accelerated growth and rewards
Easily approachable management
Work with the best minds and industry leaders
Flexible work timings
Interested?
If this role sounds like you, then apply with us! You have plenty of room for growth at LeadSquared.
Posted 4 days ago
2.0 - 6.0 years
3 - 6 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
hackajob is collaborating with LexisNexis Risk Solutions to connect them with exceptional tech professionals for this role.
Rust Developer @ IDVerse
Full time — Remote — AU / US
Join a strong team of passionate engineers and build a world-class platform to fight identity fraud at a global scale. In Rust.
The Position
You will work in close collaboration with our SVP of Architecture, our engineering team, and our product team to:
Write and re-write hardened, documented, and tested web- and API-based applications.
Build our growing collection of libraries.
Define and enforce best practices for an expanding Rust team.
A fair chunk is green-field work (and no, it's not cryptocurrency/blockchain related). Even our front-end applications are written in Rust, using Leptos for the WASM (and Tailwind for CSS). We prefer event-based architectures, cloud (AWS), and serverless. Only the good stuff.
Needed Qualifications
Whilst technical competence is critical, we place great emphasis on passion, communication, and collaboration across the business.
You have solid experience creating and maintaining web-based and API-based applications (in Rust or not).
You can demonstrate having built non-trivial Rust projects, ideally web-related.
You are comfortable with JavaScript/TypeScript.
You are able to communicate clearly, both in writing and orally, and collaborate effectively with a remote team.
You understand that documentation is half the battle, and that untested code is broken code.
You know it takes time to build anything correctly, and you also know how to "get things done" when the situation calls for it.
You are autonomous, but also know it's better to ask than to guess.
You are dependable, responsible, and committed.
Nice-to-Haves
It would be even more awesome if you have experience:
Building front-end WebAssembly applications (with Leptos, maybe?).
Solving problems with machine learning.
Developing for/with AWS serverless technologies (API Gateway, Lambda, DynamoDB...).
Location and Time Zone
Our team is globally distributed and fully remote. The higher concentration is based around the Australian / East Asia time zones. For this role, we'll consider any location, but will favour either the American or European time zones.
About Us
IDVerse is a Sydney-based start-up that is a global pioneer in the development of digital identity verification technology. We've built everything from the ground up and have a broad range of blue-chip customers across banking, telecommunications, government, and more. We've perfected the technology locally in Australia and New Zealand, and are quickly expanding into the northern hemisphere. We're still a small team, and take pride in making it smart and inclusive. The position is remote and the work week can be flexible. Remuneration will be competitive and based on experience. We encourage people from all backgrounds and genders to apply to this position. As an early member of the team, you will have a great impact on its future shape.
Instructions On How To Apply
Send an email to devjobs@idverse.com with "Rust Up!" in the title (be exact, we have automated filters that will discard anything else. This is your first test!).
Write a few lines about yourself and attach your résumé.
Add any link you think will help us assess both your soft and hard skills.
If you pique our interest, we'll set up a video call and go from there.
Posted 4 days ago
1.0 - 4.0 years
2 - 5 Lacs
Bengaluru
Work from Office
Role Overview
We are seeking a skilled Backend Developer with expertise in TypeScript and AWS to design and implement scalable, event-driven microservices. The ideal candidate will have a strong background in serverless architectures and backend development.
Key Responsibilities
Backend Development: Develop and maintain server-side applications using TypeScript and Node.js.
API Design: Create and manage RESTful APIs adhering to OpenAPI specifications.
Serverless Architecture: Implement serverless solutions using AWS Lambda, API Gateway, and DynamoDB.
Event-Driven Systems: Design and build event-driven architectures utilizing AWS SQS and SNS.
Microservices: Develop microservices that are scalable and maintainable.
Collaboration: Work closely with frontend developers and other stakeholders to integrate APIs and ensure seamless functionality.
Code Quality: Write clean, maintainable code and conduct code reviews.
Continuous Improvement: Stay updated with the latest industry trends and technologies to continuously improve backend systems.
Required Skills & Qualifications
Experience: 7–10 years in backend development with a focus on TypeScript and Node.js.
AWS Expertise: Proficiency in AWS services such as Lambda, API Gateway, DynamoDB, SQS, and SNS.
API Development: Experience in designing and implementing RESTful APIs.
Event-Driven Architecture: Familiarity with building event-driven systems using AWS services.
Microservices: Experience in developing microservices architectures.
Version Control: Proficiency in using Git for version control.
CI/CD: Experience with continuous integration and continuous deployment pipelines.
Collaboration: Strong communication skills and ability to work in a team environment.
Preferred Skills
Infrastructure as Code: Experience with tools like Terraform or AWS CloudFormation.
Containerization: Familiarity with Docker and container orchestration tools.
Monitoring & Logging: Experience with monitoring and logging tools to ensure system reliability.
Agile Methodologies: Experience working in Agile development environments.
Posted 4 days ago
4.0 - 7.0 years
9 - 13 Lacs
Kolkata
Work from Office
Join our Team
About this opportunity:
We are looking for skilled Java Developers at all levels (2-8 years) to join our team. The ideal candidate will have strong expertise in Spring Boot, Kafka, AWS, Docker, and Kubernetes, with a passion for building scalable and efficient backend systems. Knowledge of Generative AI (GenAI) would be a big plus!
We are open to Noida, Gurgaon, Kolkata, Pune, Bangalore, and Chennai locations.
Key Responsibilities:
Design, develop, and maintain backend services using Java and Spring Boot.
Implement event-driven architectures using Kafka.
Deploy and manage applications on AWS, leveraging cloud-native services.
Containerize applications using Docker and orchestrate deployments with Kubernetes.
Write efficient, scalable, and secure code following best practices.
Collaborate with cross-functional teams, including frontend developers, DevOps, and product teams.
Optimize application performance, troubleshoot issues, and ensure high availability.
Stay updated with emerging technologies, particularly Generative AI trends.
Requirements:
2-8 years of experience in Java and Spring Boot.
Hands-on experience with Kafka for real-time data streaming.
Knowledge of AWS services (EC2, S3, Lambda, etc.).
Experience with Docker and Kubernetes for containerized deployments.
Understanding of microservices architecture and distributed systems.
Familiarity with RESTful APIs, database management (SQL/NoSQL), and caching strategies.
Strong problem-solving skills and a passion for writing clean, maintainable code.
Preferred Qualifications:
Knowledge of Generative AI (GenAI) and AI/ML models.
Experience with CI/CD pipelines and DevOps practices.
Familiarity with monitoring and logging tools.
Exposure to Agile methodologies and team collaboration tools.
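The "event-driven architectures" requirement above boils down to decoupling producers from consumers: one side publishes events and the other processes them asynchronously. A minimal, Kafka-free Java sketch of that pattern using an in-memory queue as a stand-in for a topic (all names are illustrative; a real system would use the Kafka client API against a broker):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class EventDrivenSketch {
    public static void main(String[] args) throws InterruptedException {
        // In-memory stand-in for a topic: producers append events,
        // a consumer thread processes them asynchronously.
        BlockingQueue<String> topic = new ArrayBlockingQueue<>(100);
        List<String> processed = new ArrayList<>();

        Thread consumer = new Thread(() -> {
            try {
                while (true) {
                    String event = topic.take();            // blocks until an event arrives
                    if (event.equals("POISON_PILL")) break; // sentinel value stops the consumer
                    processed.add("handled:" + event);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        consumer.start();

        // Producer side: publish events without waiting for the consumer.
        topic.put("order-created");
        topic.put("payment-received");
        topic.put("POISON_PILL");

        consumer.join();
        System.out.println(processed); // prints [handled:order-created, handled:payment-received]
    }
}
```

The key property the sketch demonstrates is that the producer never calls the consumer directly; swapping the queue for a Kafka topic preserves the same shape while adding durability and multi-consumer fan-out.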
Posted 4 days ago
2.0 - 6.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Location: OnebyZero Bangalore, India / Ho Chi Minh City, Vietnam / Bangkok, Thailand / Makati, Philippines
Work Set-up: Hybrid
The Role: DevSecOps Engineer
We are looking for a skilled DevSecOps Engineer with over 3 years of experience and expertise in AWS security. This role focuses on ensuring the security of our cloud infrastructure and applications while fostering collaboration between development, operations, and security teams. In addition to security, the role involves managing cloud infrastructure using Terraform and contributing to overall DevOps practices.
What You'll Do
Cloud Security Design & Implementation: Design, implement, and manage secure AWS cloud infrastructure, ensuring adherence to best practices in security, scalability, and availability.
Infrastructure as Code (IaC): Develop and maintain cloud infrastructure using Terraform, ensuring version control, scalability, and ease of deployment.
Security Automation: Develop and maintain CI/CD pipelines with integrated security checks to enable secure and rapid software delivery.
Risk Assessment: Identify vulnerabilities, assess risks, and implement security measures to protect cloud environments.
Compliance Management: Ensure compliance with regulatory standards and internal policies (e.g., GDPR, HIPAA, ISO 27001) across the cloud infrastructure.
Monitoring & Incident Response: Monitor and respond to security incidents using AWS services like CloudTrail, GuardDuty, and Security Hub.
Collaboration & Training: Work with development and operations teams to implement secure coding practices and conduct security training.
DevOps Practices: Collaborate with teams to ensure smooth integration of security into the DevOps pipeline, enabling automated deployments and scaling.
Requirements
Basic Qualifications
3+ years of hands-on experience in a DevOps, SecOps, or cloud security role.
Extensive experience with AWS services (EC2, S3, VPC, Lambda, RDS, etc.).
Strong proficiency in Infrastructure as Code (IaC) using Terraform, AWS CDK, or CloudFormation.
Demonstrated expertise in building, managing, and automating CI/CD pipelines (e.g., GitHub Actions, Jenkins).
Advanced scripting skills in Python and Bash for automation and tool development.
Expertise in Linux system administration (Ubuntu, CentOS, etc.).
Deep understanding of networking, security practices, and monitoring in cloud environments.
Experience with containerization and orchestration tools such as Docker and Kubernetes.
Knowledge of security testing tools (e.g., OWASP ZAP, Snyk, or Burp Suite).
Skills
Cloud Platforms: Advanced AWS expertise (EC2, VPC, S3, Lambda, RDS, CloudFront, etc.)
IaC Tools: Terraform, AWS CDK, CloudFormation
CI/CD Tools: GitHub Actions, Jenkins
Scripting Languages: Python, Bash
Containerization: Docker, Kubernetes
Operating Systems: Linux (Ubuntu, CentOS, etc.)
Version Control: Git, GitHub, GitLab
Posted 4 days ago
3.0 - 5.0 years
12 - 16 Lacs
Bengaluru
Work from Office
Job Description
We are seeking a highly skilled and motivated Cloud Data Engineer with a strong background in computer science or statistics, coupled with at least 5 years of professional experience. The ideal candidate will possess a deep understanding of cloud computing, particularly in AWS, and should have a proven track record in data engineering, Big Data applications, and AI/ML applications.
Responsibilities
Cloud Expertise:
Proficiency in AWS services such as EC2, VPC, Lambda, DynamoDB, API Gateway, EBS, S3, IAM, and more.
Design, implement, and maintain scalable cloud-based solutions.
Execute efficient and secure cloud infrastructure configurations.
Data Engineering:
Develop, construct, test, and maintain architectures such as databases and processing systems.
Utilize coding skills in Spark and Python for data processing and manipulation.
Administer multiple ETL applications to ensure seamless data flow.
Big Data Applications:
Work on end-to-end Big Data application projects, from conception to deployment.
Optimize and troubleshoot Big Data solutions to ensure high performance.
AI/ML Applications:
Develop and deploy AI/ML applications based on NLP, CV, and GenAI.
Collaborate with data scientists to implement machine learning models into production environments.
DevOps and Infrastructure as a Service (IaaS):
Apply knowledge of and experience with DevOps applications for continuous integration and deployment.
Set up and maintain infrastructure as a service, ensuring scalability and reliability.
Qualifications
Bachelor's degree in Computer Science, Statistics, or a related field.
5+ years of professional experience in cloud computing, data engineering, and related fields.
Proven expertise in AWS services, with a focus on EC2, VPC, Lambda, DynamoDB, API Gateway, EBS, S3, IAM, etc.
Proficient coding skills in Spark and Python for data processing.
Hands-on experience with Big Data application projects.
Experience in AI/ML applications, particularly in NLP, CV, and GenAI.
Administration experience with multiple ETL applications.
Knowledge of and experience with DevOps tools and processes.
Ability to set up and maintain infrastructure as a service.
Soft Skills
Strong analytical and problem-solving skills.
Excellent communication and collaboration abilities.
Ability to work effectively in a fast-paced and dynamic team environment.
Proactive mindset with a commitment to continuous learning and improvement.
Posted 4 days ago
8.0 - 13.0 years
25 - 30 Lacs
Mumbai
Work from Office
Our mission is to make meaningful learning a part of your everyday ????. The shelf life of our skills is now less than 5 years. So, if you stopped learning today, your skills would soon be irrelevant. Think that’s a big problem? You’d be right.. Enter HowNow. Founded in 2019, our Learning and Skills Platform is disrupting the way people learn and upskill through technology. Whether it's finding a quick answer, learning skills or tapping into shared knowledge, we make it easy for people to learn what they need, when they need it.. Already used by fast-growing scale-ups and global enterprises, such as Trainline, Depop and Sanofi, we’re pushing the boundaries of how people learn.. Hi I'm Naaz the People Advisor at HowNow ???????? I’m looking for a Senior DevOps Engineer to join us.. Joining us as the first DevOps engineer offers a unique opportunity to shape our culture and practices from the ground up. You'll have the autonomy to drive innovation and make a visible impact on the company’s growth and success, setting the foundation for our future.. As the company grows, so will your opportunities. You'll be in a prime position to evolve into a leadership role, guiding the DevOps function as it becomes central to our operations. This role is perfect for someone eager to make a lasting impact and grow alongside a dynamic company.. Alongside the opportunities to develop and grow your career, we're a fun and friendly bunch. Have a watch of the video below to get an understanding of what it's like to work here.. Day-to-day tasks will include ????. You’ll design and manage scalable and highly available cloud infrastructure on AWS, leveraging services such as EC2, RDS, ElastiCache, OpenSearch, Lambda, and ECS. Implement and maintain EC2 Auto Scaling and load balancing to ensure seamless performance.. You’ll develop, refine, and oversee continuous integration and deployment pipelines, ensuring rapid and reliable code deployment using Jenkins, Terraform, and CloudFormation.. 
You’ll drive automation initiatives using advanced scripting in Bash, Python, and PowerShell to improve system deployments, upgrades, and day-to-day operations.. You’ll utilize AWS CloudWatch and custom monitoring tools to track system performance and health. Proactively handle incidents, troubleshoot system issues, and perform root cause analysis to prevent future disruptions.. You’ll work closely with engineering and product teams to define and implement infrastructure solutions that support the development of new features and products.. The key things that we will be looking for in applicants ????. You have minimum of 5 years in DevOps or similar roles with a strong background in software development and system administration.. You have bachelor’s degree in Computer Science, Engineering, or related field; or equivalent practical experience.. You have proficiency with AWS, including managing databases such as MongoDB and MySQL on RDS. Strong experience in building and managing containerized applications using Docker and Kubernetes.. You have excellent analytical and troubleshooting skills, with the capability to work under pressure and manage multiple priorities.. You have strong interpersonal and communication skills, capable of working in a team environment and interacting with all levels of management.. Nice To Have. Familiarity with the Ruby on Rails ecosystem.. AWS Certification is highly regarded.. Active contributions to open-source projects.. What You’ll Get. ???? Our salaries are calculated using a SaaS benchmarking tool called (Figures). Happy to disclose upon application. You’ll also receive a performance based bonus on top.. ???? Hybrid working (in our offices 3x a week Mon-Wed). ???? Wind-down Fridays. No meetings from 3pm onwards on Fridays, for you to wind down for the weekend. Our HowNow’ers use this time to exercise, study, or spend some time with their family and friends, which you can read about here. ???????? 
Enhanced maternity and paternity policies, which you can read about here.

25 days holiday, plus bank holidays and your birthday off.

An annual £200 learning and development budget.

Dog-friendly offices (we love our pets!).

Monthly socials and team dinners, which have included Bounce, Mystery Rooms, ITC Maratha, JW Marriott and many more.

Access to the very best learning platform out there (HowNow+) to keep you at the top of your game.

What's next?

Once you've applied, we'll get back in touch with you, usually within the next 5 working days. Sometimes it can take slightly longer, but we will get back to you regardless of the outcome of your application.

You'll be invited to a 30-minute video call with Naaz, our People Operations Coordinator, to discuss your experience and the role.

You'll be invited to a 60-minute technical interview.

You'll be invited to a 60-minute video call with Hardik, our Backend Team Lead, and Ashish, our CTO and Co-Founder.
Lambda expressions have become increasingly popular in the tech industry, with many companies in India actively seeking professionals with this skill set. Job seekers with expertise in lambda expressions can find promising opportunities in various sectors across the country.
The salary range for lambda expressions professionals in India varies based on experience level. Entry-level positions can expect to earn between INR 4-6 lakhs per annum, while experienced professionals can command salaries ranging from INR 10-15 lakhs per annum.
In the lambda expressions field, a typical career path may involve starting as a Junior Developer, progressing to a Senior Developer, and eventually moving up to a Tech Lead position. With continuous learning and skill development, professionals can advance their careers in this domain.
Apart from lambda expressions, professionals in this field are often expected to have knowledge of the following related skills:

- Java programming
- Functional programming concepts
- Spring framework
- Distributed systems
- Problem-solving abilities
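Since the skills above center on Java, here is a minimal sketch of what a lambda expression looks like in practice (the class name and values are illustrative only, not taken from any specific codebase):

```java
import java.util.List;
import java.util.function.Function;
import java.util.stream.Collectors;

// Minimal illustration of Java lambda expressions (Java 8+):
// a lambda provides an inline implementation of a functional interface.
public class LambdaDemo {
    public static void main(String[] args) {
        // A lambda assigned to the Function functional interface
        Function<Integer, Integer> square = x -> x * x;
        System.out.println(square.apply(5)); // prints 25

        // Lambdas and method references with the Streams API
        List<String> names = List.of("spring", "lambda", "stream");
        List<String> upper = names.stream()
                .map(String::toUpperCase) // shorthand for s -> s.toUpperCase()
                .collect(Collectors.toList());
        System.out.println(upper); // prints [SPRING, LAMBDA, STREAM]
    }
}
```

Interviewers commonly probe exactly this ground: how a lambda relates to a functional interface, and how lambdas combine with the Streams API for collection processing.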
As you prepare for lambda expressions job opportunities in India, remember to showcase your expertise and skills confidently during interviews. Keep learning and honing your abilities to stay competitive in this dynamic field. Best of luck in your job search!