8.0 - 13.0 years
25 - 30 Lacs
Mangaluru
Work from Office
Job Summary: As a DevOps Engineer specializing in data, you will be responsible for implementing and managing our cloud-based data infrastructure using AWS and Snowflake. You will collaborate with data engineers, data scientists, and other stakeholders to design, deploy, and maintain a robust data ecosystem that supports our analytics and business intelligence initiatives. Your expertise in modern data tech stacks, MLOps methodologies, automation, and information security will be crucial in enhancing our data pipelines and ensuring data integrity and availability. Technical Skills: 3+ years of experience in Data Warehousing and BI. Experience working with the Snowflake database. In-depth knowledge of data warehouse concepts. Experience in designing, developing, testing, and implementing ETL solutions using enterprise ETL tools. Experience with large or partitioned relational databases (Aurora / MySQL / DB2). Very strong SQL and data analysis capabilities. Familiarity with billing and payment data is a plus. Agile development (Scrum) experience. Other preferred experience includes DevOps practices, SaaS, IaaS, code management (CodeCommit, Git), deployment tools (CodeBuild, CodeDeploy, Jenkins, shell scripting), and continuous delivery. Primary AWS development skills include S3, IAM, Lambda, RDS, Kinesis, API Gateway, Redshift, EMR, Glue, and CloudFormation. Responsibilities: Be a key contributor to the design and development of a scalable and cost-effective cloud-based data platform based on a data lake design. Develop data platform components in a cloud environment to ingest data and events from cloud and on-premises environments as well as third parties. Build automated pipelines and data services to validate, catalog, aggregate, and transform ingested data. Build automated data delivery pipelines and services to integrate data from the data lake to internal and external consuming applications and services.
Build and deliver cloud-based deployment and monitoring capabilities consistent with DevOps models. Keep deep knowledge and skills current with the latest cloud services, features, and best practices. Who We Are: unifyCX is an emerging Global Business Process Outsourcing company with a strong presence in the U.S., Colombia, Dominican Republic, India, Jamaica, Honduras, and the Philippines. We provide personalized contact centers, business processing, and technology outsourcing solutions to clients worldwide. In nearly two decades, unifyCX has grown from a small team to a global organization with staff members all over the world dedicated to supporting our international clientele. At unifyCX, we leverage advanced AI technologies to elevate the customer experience (CX) and drive operational efficiency for our clients. Our commitment to innovation positions us as a trusted partner, enabling businesses across industries to meet the evolving demands of a global market with agility and precision. unifyCX is a certified minority-owned business and an EOE employer who welcomes diversity.
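The validate/catalog/aggregate pipeline work described above can be sketched in plain Python. Everything below (the record shape, the required fields, the rule for numeric amounts) is an illustrative assumption, not part of any actual unifyCX stack:

```python
# Minimal sketch of a validate-and-catalog step for ingested records.
# Record shape and validation rules are hypothetical illustrations.

def validate_record(record, required_fields=("id", "source", "amount")):
    """Return a list of validation errors (an empty list means valid)."""
    errors = []
    for field in required_fields:
        if field not in record or record[field] in (None, ""):
            errors.append(f"missing field: {field}")
    if record.get("amount") is not None:
        try:
            float(record["amount"])
        except (TypeError, ValueError):
            errors.append("amount is not numeric")
    return errors

def catalog_batch(records):
    """Split a batch into valid/invalid and record simple lineage stats."""
    valid, invalid = [], []
    for rec in records:
        (invalid if validate_record(rec) else valid).append(rec)
    stats = {"total": len(records), "valid": len(valid), "invalid": len(invalid)}
    return valid, invalid, stats

batch = [
    {"id": 1, "source": "billing", "amount": "42.50"},
    {"id": 2, "source": "billing", "amount": "oops"},
    {"source": "payments", "amount": "10"},
]
valid, invalid, stats = catalog_batch(batch)
print(stats)  # {'total': 3, 'valid': 1, 'invalid': 2}
```

In a real pipeline the invalid records would typically be routed to a quarantine location and the stats published to a data catalog, rather than printed.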
Posted 3 days ago
3.0 - 7.0 years
8 - 12 Lacs
Mangaluru
Work from Office
Job Summary: As a DevOps Engineer specializing in data, you will be responsible for implementing and managing our cloud-based data infrastructure using AWS and Snowflake. You will collaborate with data engineers, data scientists, and other stakeholders to design, deploy, and maintain a robust data ecosystem that supports our analytics and business intelligence initiatives. Your expertise in modern data tech stacks, MLOps methodologies, automation, and information security will be crucial in enhancing our data pipelines and ensuring data integrity and availability. Technical Skills: 3+ years of experience in Data Warehousing and BI. Experience working with the Snowflake database. In-depth knowledge of data warehouse concepts. Experience in designing, developing, testing, and implementing ETL solutions using enterprise ETL tools. Experience with large or partitioned relational databases (Aurora / MySQL / DB2). Very strong SQL and data analysis capabilities. Familiarity with billing and payment data is a plus. Agile development (Scrum) experience. Other preferred experience includes DevOps practices, SaaS, IaaS, code management (CodeCommit, Git), deployment tools (CodeBuild, CodeDeploy, Jenkins, shell scripting), and continuous delivery. Primary AWS development skills include S3, IAM, Lambda, RDS, Kinesis, API Gateway, Redshift, EMR, Glue, and CloudFormation. Responsibilities: Be a key contributor to the design and development of a scalable and cost-effective cloud-based data platform based on a data lake design. Develop data platform components in a cloud environment to ingest data and events from cloud and on-premises environments as well as third parties. Build automated pipelines and data services to validate, catalog, aggregate, and transform ingested data. Build automated data delivery pipelines and services to integrate data from the data lake to internal and external consuming applications and services.
Build and deliver cloud-based deployment and monitoring capabilities consistent with DevOps models. Keep deep knowledge and skills current with the latest cloud services, features, and best practices. Who We Are: unifyCX is an emerging Global Business Process Outsourcing company with a strong presence in the U.S., Colombia, Dominican Republic, India, Jamaica, Honduras, and the Philippines. We provide personalized contact centers, business processing, and technology outsourcing solutions to clients worldwide. In nearly two decades, unifyCX has grown from a small team to a global organization with staff members all over the world dedicated to supporting our international clientele. At unifyCX, we leverage advanced AI technologies to elevate the customer experience (CX) and drive operational efficiency for our clients. Our commitment to innovation positions us as a trusted partner, enabling businesses across industries to meet the evolving demands of a global market with agility and precision. unifyCX is a certified minority-owned business and an EOE employer who welcomes diversity.
Posted 3 days ago
2.0 - 5.0 years
5 - 8 Lacs
Bengaluru
Work from Office
Job Description. Responsibilities: Design, implement, and manage cloud infrastructure using AWS services, including EC2, Lambda, API Gateway, Step Functions, EKS clusters, and Glue. Develop and maintain Infrastructure as Code (IaC) using Terraform to ensure consistent and reproducible deployments. Set up and optimize CI/CD pipelines using tools such as Azure Pipelines and AWS pipelines to automate software delivery processes. Containerize applications using Docker and orchestrate them with Kubernetes for efficient deployment and scaling. Write and maintain Python scripts to automate tasks, improve system efficiency, and integrate various tools and services. Develop shell scripts for system administration, automation, and troubleshooting. Implement and manage monitoring and logging solutions to ensure system health and performance. Collaborate with development teams to improve application deployment processes and reduce time-to-market. Ensure high availability, scalability, and security of cloud-based systems. Troubleshoot and resolve infrastructure and application issues in production environments. Implement and maintain backup and disaster recovery solutions. Stay up to date with emerging technologies and industry best practices in DevOps and cloud computing. Document processes, configurations, and system architectures for knowledge sharing and compliance purposes. Mentor junior team members and contribute to the overall growth of the DevOps practice within the organization. Additional Information: At Tietoevry, we believe in the power of diversity, equity, and inclusion. We encourage applicants of all backgrounds, genders (m/f/d), and walks of life to join our team, as we believe that this fosters an inspiring workplace and fuels innovation. Our commitment to openness, trust, and diversity is at the heart of our mission to create digital futures that benefit businesses, societies, and humanity. Diversity, equity and inclusion (tietoevry.com).
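The Python automation work listed above often amounts to rendering per-environment configuration from a single source of truth, in the spirit of Infrastructure as Code. This is a minimal sketch under assumed names; the service, image, and environment values are hypothetical, and a real pipeline would feed Terraform variables or Kubernetes manifests rather than plain JSON:

```python
# Minimal sketch: render one deployment manifest per environment from a
# base template plus overrides. All names and values are hypothetical.

import json

BASE_MANIFEST = {
    "service": "orders-api",
    "image": "registry.example.com/orders-api",
    "replicas": 2,
    "resources": {"cpu": "250m", "memory": "512Mi"},
}

ENV_OVERRIDES = {
    "dev":  {"replicas": 1},
    "prod": {"replicas": 4, "resources": {"cpu": "500m", "memory": "1Gi"}},
}

def render_manifest(env):
    """Merge environment-specific overrides into a copy of the base manifest."""
    manifest = json.loads(json.dumps(BASE_MANIFEST))  # cheap deep copy
    manifest.update(ENV_OVERRIDES.get(env, {}))
    manifest["environment"] = env
    return manifest

print(render_manifest("prod")["replicas"])  # 4
print(render_manifest("dev")["replicas"])   # 1
```

Keeping overrides declarative like this makes the difference between environments reviewable in one place, which is the same reproducibility goal Terraform serves at the infrastructure layer.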
Posted 3 days ago
8.0 - 13.0 years
25 - 30 Lacs
Bengaluru
Work from Office
Join our Team. About this opportunity: We are looking for an experienced Java Developer or Architect with strong technical expertise to design and lead development of scalable, high-performance Java applications. The ideal candidate should have an in-depth understanding of Java/J2EE technologies, design patterns, microservice architecture, Docker & Kubernetes, and integration frameworks. This role requires design skills, excellent problem-solving skills, and the ability to collaborate with cross-functional teams, including DevOps and front-end developers. What you will do: Architect, design, and implement back-end solutions using Java/J2EE, Spring MVC, Spring Boot and related frameworks. Design, develop and maintain scalable Java components using REST- or SOAP-based web services. Design & develop enterprise solutions with messaging or streaming frameworks like ActiveMQ, HornetQ & Kafka. Work with integration frameworks like Apache Camel/JBoss Fuse/Mule ESB/EAI/Spring Integration. Make effective use of caching technologies (like Hazelcast/Redis/Infinispan/EHCache/Memcached) in the application to handle large volumes of data. Deploy the application in middleware or an app server (like JBoss/WebLogic/Tomcat). Collaborate with the DevOps team to manage builds and CI/CD pipelines using Jira, GitLab, Sonar and other tools. The skills you bring: Strong expertise in Java/J2EE, Spring Boot & microservices. Good understanding of core Java concepts (like the Collections Framework and object-oriented design). Experience working with multithreading concepts (like thread pools, ExecutorService, FutureTask, the concurrent API, CountDownLatch). Detailed working exposure to Java 8 with the Stream API, lambdas, interfaces and functional interfaces. Proficiency in Java web application development using Spring MVC & Spring Boot. Good knowledge of data access frameworks using ORM (Hibernate & JPA). Familiar with database concepts with knowledge in RDBMS/SQL.
Good understanding of monolithic & microservice architecture. What happens once you apply? Click Here to find all you need to know about what our typical hiring process looks like. We encourage you to consider applying to jobs where you might not meet all the criteria. We recognize that we all have transferable skills, and we can support you with the skills that you need to develop. Encouraging a diverse and inclusive organization is core to our values at Ericsson; that's why we champion it in everything we do. We truly believe that by collaborating with people with different experiences we drive innovation, which is essential for our future growth. We encourage people from all backgrounds to apply and realize their full potential as part of our Ericsson team. Ericsson is proud to be an Equal Opportunity Employer; learn more. Primary country and city: India (IN) || Kolkata. Job details: Software Developer. Job Stage: Job Stage 4. Primary Recruiter: Avishek Lama. Hiring Manager: Suranjit Dutta.
Posted 3 days ago
5.0 - 10.0 years
14 - 17 Lacs
Navi Mumbai
Work from Office
As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for source-to-target and implementing solutions that tackle the clients' needs. Your primary responsibilities include: Design, build, optimize and support new and existing data models and ETL processes based on our clients' business requirements. Build, deploy and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need it. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Must have 5+ years of experience in Big Data: Hadoop, Spark (Scala, Python), HBase, Hive. Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git. Developed Python and PySpark programs for data analysis. Good working experience with Python to develop a custom framework for generating rules (just like a rules engine). Developed Python code to gather data from HBase and designed the solution to implement using PySpark. Apache Spark DataFrames/RDDs were used to apply business transformations, and Hive Context objects were utilized to perform read/write operations. Preferred technical and professional experience: Understanding of DevOps. Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala.
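The config-driven rules framework the posting describes ("just like a rules engine") might look something like the following minimal Python sketch; the rule names, the row format, and the config layout are assumptions for illustration, not the posting's actual framework:

```python
# Minimal sketch of a config-driven rules engine for data checks.
# Rule names, row format, and config layout are hypothetical.

RULES = {
    "not_null": lambda value, _: value is not None,
    "min":      lambda value, threshold: value is not None and value >= threshold,
    "max":      lambda value, threshold: value is not None and value <= threshold,
}

def apply_rules(row, config):
    """config maps column -> list of (rule_name, argument) pairs.
    Returns (column, rule_name) pairs for every failed check."""
    failures = []
    for column, checks in config.items():
        for rule_name, arg in checks:
            if not RULES[rule_name](row.get(column), arg):
                failures.append((column, rule_name))
    return failures

config = {"age": [("not_null", None), ("min", 0), ("max", 120)]}
print(apply_rules({"age": 35}, config))  # []
print(apply_rules({"age": -1}, config))  # [('age', 'min')]
```

Because the checks live in a plain data structure, the same engine can be driven from a JSON or YAML config file, and in a Spark job each rule could be applied as a DataFrame filter instead of a per-row call.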
Posted 4 days ago
4.0 - 9.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Designs, develops and supports application solutions with a focus on the HANA version of Advanced Business Application Programming (ABAP). This specialty may design, develop and/or re-engineer highly complex application components, and integrate software packages, programs and reusable objects residing on multiple platforms. This specialty may additionally have working knowledge of SAP HANA Technical Concept and Architecture, Data Modeling using HANA Studio, ABAP Development Tools (ADT), Code Performance Rules and Guidelines for SAP HANA, ADBC, Native SQL, ABAP Core Data Services, Database Procedures, Text Search, ALV on HANA, and HANA Live model consumption. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: 4-12 years of experience required. The ABAP on HANA application developers would possess knowledge of the following topics and apply them to bring value and innovation to client engagements: SAP HANA Technical Concept and Architecture, Data Modeling using HANA Studio, ABAP Development Tools (ADT), Code Performance Rules and Guidelines for SAP HANA, ADBC, Native SQL, ABAP Core Data Services, Database Procedures, Text Search, ALV on HANA, and HANA Live model consumption. Designing and developing data dictionary objects, data elements, domains, structures, views, lock objects and search helps, and formatting the output of SAP documents with multiple options. Modifying standard layout sets in SAP Scripts, Smart Forms & Adobe Forms. Development experience in RICEF (Reports, Interfaces, Conversions, Enhancements, Forms). Preferred technical and professional experience: Experience working in implementation, upgrade, maintenance and post-production support projects would be an advantage. Understanding of SAP functional requirements, and their conversion into technical design and development using the ABAP language for reports, interfaces, conversions, enhancements and forms in implementation or support projects.
Posted 4 days ago
5.0 - 10.0 years
14 - 17 Lacs
Mumbai
Work from Office
As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for source-to-target and implementing solutions that tackle the clients' needs. Your primary responsibilities include: Design, build, optimize and support new and existing data models and ETL processes based on our clients' business requirements. Build, deploy and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need it. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Must have 5+ years of experience in Big Data: Hadoop, Spark (Scala, Python), HBase, Hive. Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git. Developed Python and PySpark programs for data analysis. Good working experience with Python to develop a custom framework for generating rules (just like a rules engine). Developed Python code to gather data from HBase and designed the solution to implement using PySpark. Apache Spark DataFrames/RDDs were used to apply business transformations, and Hive Context objects were utilized to perform read/write operations. Preferred technical and professional experience: Understanding of DevOps. Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala.
Posted 4 days ago
8.0 - 13.0 years
25 - 30 Lacs
Chennai
Work from Office
Join our Team. Job Description: We are looking for an experienced Java Developer with strong technical expertise to design and lead development of scalable, high-performance Java/Python applications. The ideal candidate should have an in-depth understanding of Java/Python technologies, design patterns, microservice architecture, Docker & Kubernetes, and integration frameworks. This role requires design skills, excellent problem-solving skills, and the ability to collaborate with cross-functional teams, including DevOps and front-end developers. What You Will Do: Design and implement back-end solutions using Java, Spring Boot and related frameworks. Design, develop and maintain scalable Java components using REST microservices. Design & develop enterprise solutions with messaging or streaming frameworks like ActiveMQ, HornetQ & Kafka. Work experience in AWS/GCP/Azure or Ericsson internal ADP Services. Work with integration frameworks like Apache Camel/JBoss Fuse/Mule ESB/EAI/Spring Integration. Make effective use of caching technologies (like Hazelcast/Redis/Infinispan/EHCache/Memcached) in the application to handle large volumes of data. Deploy the application in middleware or an app server (like JBoss/WebLogic/Tomcat). Collaborate with the DevOps team to manage builds and CI/CD pipelines using Jira, GitLab, Sonar and other tools. Communicate effectively with a diverse set of technical audiences to convey complex concepts. You will bring: Strong expertise in Java/J2EE technologies. Good understanding of core Java concepts (like the Collections Framework and object-oriented design). Experience working with multithreading concepts (like thread pools, ExecutorService, FutureTask, the concurrent API, CountDownLatch). Detailed working exposure to Java 8 with the Stream API, lambdas, interfaces and functional interfaces. Proficiency in Java web application development using Spring MVC & Spring Boot. Good knowledge of data access frameworks using ORM (Hibernate & JPA). Familiar with
database concepts with knowledge in NoSQL. Good understanding of microservice architecture. Familiarity working with design patterns: creational, behavioral, structural & dependency injection (Spring IoC). Proficiency in exposing & consuming REST-based web services. Expertise in a messaging or streaming framework (any one of ActiveMQ/HornetQ/Kafka). Knowledge of caching technologies (any one of Hazelcast/Redis/Infinispan/EHCache/Memcached). Working experience with an integration framework (any one of Apache Camel/JBoss Fuse/Mule ESB/EAI/Spring Integration). Hands-on experience with any middleware or app server (any one of JBoss/Tomcat). Knowledge of Java enterprise technologies (like filters and interceptors). Familiar with JEE security (like encryption/decryption, Spring Security, SSL/TLS). Understanding of high-performance real-time and distributed transactional processing. Experience in unit testing (like JUnit/NUnit/Mockito) and code coverage (like JaCoCo or others). Knowledge of cloud technologies (like Docker & Kubernetes). Familiar with DevOps tools like Git, GitLab, Bitbucket, SVN, etc., static code review with SonarQube, and Agile processes/tools like Jira. Good to have: telecom domain knowledge; knowledge of distributed log management (ELK, Splunk); scripting proficiency on Unix/Linux platforms; understanding of security (vulnerability/privacy/hardening, etc.) and related tools. Why join Ericsson? At Ericsson, you'll have an outstanding opportunity: the chance to use your skills and imagination to push the boundaries of what's possible, to build solutions never seen before to some of the world's toughest problems. You'll be challenged, but you won't be alone. You'll be joining a team of diverse innovators, all driven to go beyond the status quo to craft what comes next. What happens once you apply? Click Here to find all you need to know about what our typical hiring process looks like. Encouraging a diverse and inclusive organization is
core to our values at Ericsson; that's why we champion it in everything we do. We truly believe that by collaborating with people with different experiences we drive innovation, which is essential for our future growth. We encourage people from all backgrounds to apply and realize their full potential as part of our Ericsson team. Ericsson is proud to be an Equal Opportunity Employer; learn more. Primary country and city: India (IN) || Chennai. Req ID: 765318
Posted 5 days ago
8.0 - 13.0 years
13 - 18 Lacs
Bengaluru
Work from Office
We are seeking a Senior Snowflake Developer/Architect who will be responsible for designing, developing, and maintaining scalable data solutions that effectively meet the needs of our organization. The role will serve as the primary point of accountability for the technical implementation of the data flows, repositories and data-centric solutions in your area, translating requirements into efficient implementations. The data repositories, data flows and data-centric solutions you create will support a wide range of reporting, analytics, decision support and (generative) AI solutions. Your Role: Implement and manage data modelling techniques, including OLTP, OLAP, and Data Vault 2.0 methodologies. Write optimized SQL queries for data extraction, transformation, and loading. Utilize Python for advanced data processing, automation tasks, and system integration. Be an advisor with your in-depth knowledge of Snowflake architecture, features, and best practices. Develop and maintain complex data pipelines and ETL processes in Snowflake. Collaborate with data architects, analysts, and stakeholders to design optimal and scalable data solutions. Automate DBT jobs and build CI/CD pipelines using Azure DevOps for seamless deployment of data solutions. Ensure data quality, integrity, and compliance throughout the data lifecycle. Troubleshoot, optimize, and enhance existing data processes and queries for performance improvements. Document data models, processes, and workflows clearly for future reference and knowledge sharing. Build data tests, unit tests and mock data frameworks. Who You Are: Bachelor's or master's degree in computer science, mathematics, or related fields. At least 8 years of experience as a data warehouse expert, data engineer or data integration specialist. In-depth knowledge of Snowflake components including security and governance. Proven experience in implementing complex data models (e.g.
OLTP, OLAP, Data Vault). A strong understanding of ETL, including end-to-end data flows from ingestion to data modeling and solution delivery. Proven industry experience with DBT and Jinja scripts. Strong proficiency in SQL, with additional knowledge of Python (i.e. pandas and PySpark) being advantageous. Familiarity with data & analytics solutions such as AWS (especially Glue, Lambda, DMS) is nice to have. Experience working with Azure DevOps and warehouse automation tools (e.g., Coalesce) is a plus. Experience with healthcare R&D is a plus. Excellent English communication skills, with the ability to effectively engage both with R&D scientists and software engineers. Experience working in virtual and agile teams.
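The "data tests" this role mentions (in the style popularized by DBT's uniqueness and not-null tests) can be sketched in Python. Here the standard-library sqlite3 module stands in for Snowflake, and the table and column names are hypothetical:

```python
# Minimal sketch of DBT-style data tests, with sqlite3 standing in for
# Snowflake. Table and column names are hypothetical.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER, email TEXT);
    INSERT INTO customers VALUES (1, 'a@example.com'), (2, 'b@example.com');
""")

def assert_unique(conn, table, column):
    """Fail if any value in `column` appears more than once."""
    dupes = conn.execute(
        f"SELECT {column}, COUNT(*) FROM {table} "
        f"GROUP BY {column} HAVING COUNT(*) > 1"
    ).fetchall()
    if dupes:
        raise AssertionError(f"{table}.{column} has duplicates: {dupes}")

def assert_not_null(conn, table, column):
    """Fail if `column` contains any NULLs."""
    n = conn.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL"
    ).fetchone()[0]
    if n:
        raise AssertionError(f"{table}.{column} has {n} NULLs")

assert_unique(conn, "customers", "id")
assert_not_null(conn, "customers", "email")
print("all data tests passed")
```

Running such checks after every pipeline load, and failing the CI/CD deployment when one raises, is the practical link between the "data tests" and "Azure DevOps pipelines" bullets above.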
Posted 5 days ago
8.0 - 13.0 years
15 - 30 Lacs
Pune
Work from Office
Job Description: SecurityHQ is a global cybersecurity company. Our specialist teams design, engineer and manage systems that promote clarity and an inclusive culture of trust, build momentum around improving security posture, and increase the value of cybersecurity investment. Around the clock, 365 days per year, our customers are never alone. We're SecurityHQ. We're focused on engineering cybersecurity, by design. Responsibilities: Lead response to complex, high-impact security incidents in AWS, including unauthorized access, data breaches, malware infections, DDoS attacks, phishing, APTs, zero-day exploits, and cloud misconfigurations. Perform in-depth analysis of security incidents, including advanced log analysis, digital forensic investigation, and root cause analysis. Develop and implement containment, eradication, and recovery plans for complex security incidents, minimizing disruption and improving security posture. Coordinate with internal and external stakeholders during incident response activities. Document incident details, analysis findings, and remediation actions, including detailed forensic reports and security posture assessments. Identify and recommend security improvements to prevent future incidents and enhance cloud security posture, including: AWS security best practices; security tool implementation and configuration (with a focus on CSPM tools); vulnerability management; security awareness training; threat hunting strategies; security architecture enhancements; CSPM implementation and optimization. Develop and maintain AWS-specific incident response plans, playbooks, and procedures, emphasizing automation, orchestration, and continuous security posture improvement. Stay current on cloud security, digital forensics, and cloud security posture management. Mentor junior security analysts in incident response and security posture management. Participate in on-call rotation, providing expert-level support and guidance on security posture.
Develop and deliver training on incident response, forensic best practices, and cloud security posture management. Conduct proactive threat hunting and security posture assessments. Contribute to the development of security tools and automation to improve incident response efficiency, effectiveness, and security posture. Essential Skills: Expert-level understanding of AWS services, including EC2, S3, RDS, VPC, Lambda, CloudTrail, CloudWatch, Config, Security Hub, GuardDuty, IAM, KMS, AWS Organizations and AWS Control Tower. Extensive experience with SIEM systems (e.g., Datadog, QRadar, Azure Sentinel) in a cloud environment, with a focus on security posture monitoring. Mastery of log analysis, network analysis, and digital forensic investigation techniques, including experience with specialized forensic tools (e.g., EnCase, FTK, Autopsy, Velociraptor) and CSPM tools. Strong experience with scripting (e.g., Python, PowerShell) for automation, analysis, tool development, and security posture management. Deep familiarity with security tools and technologies, including IDS/IPS, EDR, vulnerability scanners, firewalls, network forensics tools and CSPM tools. Excellent communication and interpersonal skills, with the ability to convey highly technical information to technical and non-technical audiences, including executive leadership and legal counsel, regarding incident response and security posture. Exceptional problem-solving and analytical skills; ability to remain calm, focused, and decisive under high-pressure situations, including those involving significant security posture deficiencies. Ability to work independently, lead a team, and collaborate effectively to improve the organization's security posture.
Education Requirements & Experience: Master's degree in Computer Science, Cybersecurity, or a related field. AWS security certifications (e.g., AWS Certified Security - Specialty). Relevant security certifications (e.g., CISSP, GCIH, GCIA, GREM, GNFA, OSCP). Experience leading incident response teams and security posture improvement initiatives. Experience with cloud automation and orchestration (e.g., AWS Systems Manager, Lambda) for incident response and security posture management.
Knowledge of DevSecOps principles and practices, including security integration into CI/CD pipelines and infrastructure as code (IaC) security. Experience with container security (e.g., Docker, Kubernetes) in AWS, including forensic analysis and security posture assessment. Experience with reverse engineering and malware analysis, focused on identifying threats that impact cloud security posture. Strong understanding of legal and regulatory issues related to digital forensics, incident response, and cloud security posture (e.g., data privacy, chain of custody, compliance requirements).
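The advanced log analysis this role centers on often starts with scanning CloudTrail-style JSON events for anomalies. Below is a minimal Python sketch; the event records follow CloudTrail's general shape but are simplified, and the three-failure threshold is a hypothetical tuning choice:

```python
# Minimal sketch of incident-triage log analysis: flag source IPs with
# repeated failed console logins in CloudTrail-style events. Event data
# and the alert threshold are hypothetical.

from collections import Counter

events = [
    {"eventName": "ConsoleLogin", "sourceIPAddress": "203.0.113.9",
     "responseElements": {"ConsoleLogin": "Failure"}},
    {"eventName": "ConsoleLogin", "sourceIPAddress": "203.0.113.9",
     "responseElements": {"ConsoleLogin": "Failure"}},
    {"eventName": "ConsoleLogin", "sourceIPAddress": "203.0.113.9",
     "responseElements": {"ConsoleLogin": "Failure"}},
    {"eventName": "ConsoleLogin", "sourceIPAddress": "198.51.100.7",
     "responseElements": {"ConsoleLogin": "Success"}},
]

def suspicious_ips(events, threshold=3):
    """Return IPs with `threshold` or more failed console logins."""
    failures = Counter(
        e["sourceIPAddress"]
        for e in events
        if e.get("eventName") == "ConsoleLogin"
        and e.get("responseElements", {}).get("ConsoleLogin") == "Failure"
    )
    return [ip for ip, n in failures.items() if n >= threshold]

print(suspicious_ips(events))  # ['203.0.113.9']
```

In production this kind of rule would typically live in the SIEM or in a GuardDuty/Security Hub finding rather than a standalone script, with the script form reserved for ad-hoc forensic sweeps over exported logs.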
Posted 5 days ago
8.0 - 13.0 years
30 - 35 Lacs
Bengaluru
Work from Office
About The Role: Data Engineer - 1 (Experience: 0-2 years). What we offer: Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to come join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many. About our team: DEX is the central data org for Kotak Bank, managing the entire data experience of the bank. DEX stands for Kotak's Data Exchange. This org comprises the Data Platform, Data Engineering and Data Governance charters, and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technology fellows great opportunities to build things from scratch and create one of the best-in-class data lakehouse solutions. The primary skills this team should encompass are software development skills, preferably Python, for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org size is expected to be around a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters.
As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member of Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and help build forward-looking systems that can be operated by machines using AI technologies. The data platform org is divided into 3 key verticals: Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank; a centralized data lake; managed compute and orchestration frameworks, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak's data platform. The team will also be the center of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data-consumer base within Kotak. Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, cutting across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases. Data Governance: This will be the central data governance team for Kotak Bank, managing the metadata platform, data privacy, data security, data stewardship, and the data quality platform. 
If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you. Your day-to-day role will include: drive business decisions with technical input and lead the team; design, implement, and support a data infrastructure from scratch; manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA; extract, transform, and load data from various sources using SQL and AWS big data technologies; explore and learn the latest AWS technologies to enhance capabilities and efficiency; collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis; improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers; and build data platforms, data pipelines, or data management and governance tools. BASIC QUALIFICATIONS for Data Engineer / SDE in Data: Bachelor's degree in Computer Science, Engineering, or a related field. Experience in data engineering. Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR. Experience with data pipeline tools such as Airflow and Spark. Experience with data modeling and data quality best practices. Excellent problem-solving and analytical skills. Strong communication and teamwork skills. Experience in at least one modern scripting or programming language, such as Python, Java, or Scala. Strong advanced SQL skills. PREFERRED QUALIFICATIONS: AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow. Prior experience in the Indian banking segment and/or fintech is desired. 
Experience with non-relational databases and data stores. Building and operating highly available, distributed data processing systems for large datasets. Professional software engineering and best practices for the full software development life cycle. Designing, developing, and implementing different types of data warehousing layers. Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions. Building scalable data infrastructure and understanding distributed systems concepts. SQL, ETL, and data modelling. Ensuring the accuracy and availability of data to customers. Proficiency in at least one scripting or programming language for handling large-volume data processing. Strong presentation and communication skills.
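For candidates gauging the "strong advanced SQL" expectation, a staple warehouse pattern is keeping only the latest record per business key with a window function. The sketch below is illustrative only: sqlite3 stands in for Redshift/Snowflake (it needs a Python build with SQLite 3.25+ for window functions), and the table and column names are invented.

```python
# Illustrative dedup: keep the latest record per customer_id using
# ROW_NUMBER(). sqlite3 is a stand-in for a real warehouse engine.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customer_stg (customer_id INT, email TEXT, updated_at TEXT);
INSERT INTO customer_stg VALUES
  (1, 'old@example.com',  '2024-01-01'),
  (1, 'new@example.com',  '2024-03-01'),
  (2, 'solo@example.com', '2024-02-01');
""")

rows = conn.execute("""
SELECT customer_id, email FROM (
  SELECT customer_id, email,
         ROW_NUMBER() OVER (PARTITION BY customer_id
                            ORDER BY updated_at DESC) AS rn
  FROM customer_stg
) WHERE rn = 1
ORDER BY customer_id
""").fetchall()
print(rows)
```

The same query shape works for slowly changing dimension loads, where the staging table holds multiple versions of each key.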
Posted 5 days ago
9.0 - 14.0 years
30 - 35 Lacs
Bengaluru
Work from Office
About The Role: Data Engineer-2 (Experience: 2-5 years). What we offer: Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by taking a technology-first approach in everything we do, with the aim of enhancing customer experience through superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many. About our team: DEX is the central data org for Kotak Bank and manages the entire data experience of the bank. DEX stands for Kotak's Data Exchange. The org comprises the Data Platform, Data Engineering, and Data Governance charters, and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premises solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technology fellows a great opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills for this team are software development, preferably Python, for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters. 
As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member of Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and help build forward-looking systems that can be operated by machines using AI technologies. The data platform org is divided into 3 key verticals: Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank; a centralized data lake; managed compute and orchestration frameworks, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak's data platform. The team will also be the center of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data-consumer base within Kotak. Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, cutting across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases. Data Governance: This will be the central data governance team for Kotak Bank, managing the metadata platform, data privacy, data security, data stewardship, and the data quality platform. 
If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you. Your day-to-day role will include: drive business decisions with technical input and lead the team; design, implement, and support a data infrastructure from scratch; manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA; extract, transform, and load data from various sources using SQL and AWS big data technologies; explore and learn the latest AWS technologies to enhance capabilities and efficiency; collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis; improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers; and build data platforms, data pipelines, or data management and governance tools. BASIC QUALIFICATIONS for Data Engineer / SDE in Data: Bachelor's degree in Computer Science, Engineering, or a related field. Experience in data engineering. Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR. Experience with data pipeline tools such as Airflow and Spark. Experience with data modeling and data quality best practices. Excellent problem-solving and analytical skills. Strong communication and teamwork skills. Experience in at least one modern scripting or programming language, such as Python, Java, or Scala. Strong advanced SQL skills. PREFERRED QUALIFICATIONS: AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow. Prior experience in the Indian banking segment and/or fintech is desired. 
Experience with non-relational databases and data stores. Building and operating highly available, distributed data processing systems for large datasets. Professional software engineering and best practices for the full software development life cycle. Designing, developing, and implementing different types of data warehousing layers. Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions. Building scalable data infrastructure and understanding distributed systems concepts. SQL, ETL, and data modelling. Ensuring the accuracy and availability of data to customers. Proficiency in at least one scripting or programming language for handling large-volume data processing. Strong presentation and communication skills.
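The micro-batch pipelines this role owns typically use watermark-based incremental loading: only rows newer than the last successfully loaded timestamp are pulled each run. The sketch below is a minimal illustration under invented names; a real pipeline would read from a source system and persist the watermark in a metadata store.

```python
# Watermark-based incremental load, the core pattern behind micro-batch
# ingestion. Record shape ("id"/"updated_at") is hypothetical.
def load_incrementally(source_rows, last_watermark):
    """Return rows newer than the watermark, plus the advanced watermark."""
    new_rows = [r for r in source_rows if r["updated_at"] > last_watermark]
    new_watermark = max((r["updated_at"] for r in new_rows),
                        default=last_watermark)
    return new_rows, new_watermark

source = [
    {"id": 1, "updated_at": "2024-01-05"},
    {"id": 2, "updated_at": "2024-01-10"},
    {"id": 3, "updated_at": "2024-01-20"},
]

batch, wm = load_incrementally(source, "2024-01-07")
print(len(batch), wm)  # 2 2024-01-20
```

Keeping the watermark advance tied to the rows actually loaded is what makes re-runs safe: an empty batch leaves the watermark unchanged.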
Posted 5 days ago
10.0 - 15.0 years
15 - 30 Lacs
Pune, Chennai, Bengaluru
Hybrid
Role & responsibilities Key Responsibilities: Design, develop, and maintain backend services using Python and AWS serverless technologies. Implement event-driven architectures to ensure efficient and scalable solutions. Utilize Terraform for infrastructure as code to manage and provision AWS resources. Configure and manage AWS networking components to ensure secure and reliable communication between services. Develop and maintain serverless applications using AWS Lambda functions, DynamoDB, and other AWS services. Collaborate with cross-functional teams to define, design, and ship new features. Write clean, maintainable, and efficient code while following best practices. Troubleshoot and resolve issues in a timely manner. Stay up to date with the latest industry trends and technologies to ensure our solutions remain cutting-edge. Required Qualifications: 9 to 15 years of experience in backend development with a strong focus on Python. Proven experience with AWS serverless technologies, including Lambda, DynamoDB, and other related services. Strong understanding of event-driven architecture and its implementation. Hands-on experience with Terraform for infrastructure as code. In-depth knowledge of AWS networking components and best practices. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills. Preferred Qualifications: AWS certifications (e.g., AWS Certified Developer, AWS Certified Solutions Architect). Experience with other programming languages and frameworks. Familiarity with CI/CD pipelines and DevOps practices.
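The event-driven serverless architecture described above usually centers on a Lambda-style handler consuming batches of events. The sketch below is illustrative only: the event shape mirrors an SQS "Records" batch, `process_order` and its fields are invented, and the response follows Lambda's partial-batch-failure convention (`batchItemFailures`) so only failed messages are redelivered.

```python
# Sketch of an event-driven handler in the style of an AWS Lambda
# consuming an SQS batch. No real AWS calls are made here.
import json

def process_order(order):
    # Hypothetical business logic; fails on malformed payloads.
    if "order_id" not in order:
        raise ValueError("missing order_id")
    return f"processed {order['order_id']}"

def handler(event, context=None):
    failures = []
    for record in event.get("Records", []):
        try:
            process_order(json.loads(record["body"]))
        except Exception:
            # Report only the failed message so SQS redelivers just it.
            failures.append({"itemIdentifier": record["messageId"]})
    return {"batchItemFailures": failures}

event = {"Records": [
    {"messageId": "m1", "body": json.dumps({"order_id": 42})},
    {"messageId": "m2", "body": json.dumps({"customer": "x"})},
]}
print(handler(event))
```

Wiring this to DynamoDB or other services would happen inside `process_order`, keeping the handler itself a thin dispatch loop.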
Posted 5 days ago
5.0 - 10.0 years
30 - 40 Lacs
Pune, Ahmedabad
Work from Office
We are seeking an experienced Sr. Java Developer with expertise in the Java Spring and Spring Boot frameworks, REST APIs, and cloud. The ideal candidate will have 6+ years of hands-on experience developing scalable and robust applications, with experience in any cloud service (AWS/Azure/GCP). Job Title: Sr. Java Developer. Location: Ahmedabad/Pune. Experience: 5+ years. Educational Qualification: UG: BS/MS in Computer Science or other engineering/technical degree. Key Responsibilities: Responsible for the complete software development life cycle, including requirement analysis, design, development, deployment, and support. Responsible for developing software products for Agentic AI Security. Write clean, testable, readable, scalable, and maintainable Java code. Design, develop, and implement highly scalable software features and infrastructure on our security platform, ready for cloud-native deployment, from inception to completion. Participate actively and contribute to design and development discussions. Develop a solid understanding of advanced cloud computing and cloud security concepts and be able to explain them to others. Work cross-functionally with Product Management, SRE, Software, and Quality Engineering teams to deliver new security-as-a-service offerings to the market in a timely fashion with excellent quality. Clearly communicate goals and desired outcomes to internal project teams. Work closely with customer support teams to improve end-customer outcomes. Required Skills: Strong programming skills in Java, with experience in building distributed systems. 6+ years of experience in software engineering, with a focus on cloud-native application development, at large organizations or innovative startups. 3+ years of experience and a deep understanding of building connectors for low-code/no-code and agentic AI platforms such as Microsoft Copilot Studio, Microsoft Power Platform, Salesforce Agentforce, Zapier, CrewAI, Marketo, etc. 
5+ years of experience building connectors for SaaS applications such as Microsoft O365, Power Apps, Salesforce, ServiceNow, etc. Preferred: experience with security products (data security and DLP, CASB, SASE) and integration with third-party APIs and services. 5+ years of experience running workloads on cloud-based architectures (AWS/GCP experience preferred). 5+ years of experience with cloud technologies such as Elasticsearch, Redis, Kafka, MongoDB, and Spring Boot. Experience with Docker and Kubernetes or other container orchestration platforms. Excellent troubleshooting abilities; able to isolate issues found during testing and verify bug fixes once they are resolved. Experience with backend development (REST APIs, databases, and serverless computing) of distributed cloud applications. Experience building and delivering services and workflows at scale, leveraging microservices architectures. Experience with the agile process and working with software development teams building full-stack products deployed on the cloud at scale. Good understanding of public cloud design considerations and limitations in the areas of microservice architectures, security, global network infrastructure, distributed systems, and load balancing. Strong understanding of the principles of DevOps and continuous delivery. Can-do attitude and the ability to make trade-off judgements with data-driven decision-making. High energy and the ability to work in a fast-paced environment. Enjoys working with many different teams, with strong collaboration and communication skills.
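At the heart of every SaaS connector this role describes is a pagination loop against the vendor's API. The sketch below shows that core loop in a testable form; the page shape (`items`/`next_cursor`) is a made-up convention, and the fetch function is injected so the logic runs offline. Real APIs (Salesforce, ServiceNow, etc.) each have their own cursor scheme.

```python
# Core loop of a SaaS connector: cursor-based pagination with an
# injected fetch function so the logic is testable without HTTP.
def fetch_all(fetch_page, max_pages=1000):
    items, cursor = [], None
    for _ in range(max_pages):       # hard stop guards against bad cursors
        page = fetch_page(cursor)
        items.extend(page["items"])
        cursor = page.get("next_cursor")
        if cursor is None:
            break
    return items

# Fake two-page API standing in for a real HTTP client.
PAGES = {None: {"items": [1, 2], "next_cursor": "p2"},
         "p2": {"items": [3]}}

result = fetch_all(lambda c: PAGES[c])
print(result)  # [1, 2, 3]
```

Separating the traversal logic from the transport layer is also what makes connectors portable across the many platforms listed above.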
Posted 5 days ago
6.0 - 11.0 years
6 - 11 Lacs
Bengaluru
Work from Office
Job description: We are looking for a highly motivated individual who is excited about continually learning new things and being part of a fast-paced team delivering cutting-edge solutions that drive new products and features critical for our customers. As part of the Development team, you will help maintain the quality, reliability, and availability of key systems that provide search and information retrieval for our core products. The ideal candidate will have extensive skills in the areas of design, development, delivery, execution, leadership, and mentoring and directing co-workers across geographic boundaries. About the Role: In this role you will work as a Senior Software Engineer. Primary responsibilities include working with technology peers and business partners to solve business problems and providing support to ensure the availability of our products. Develop high-quality code. Maintain existing software solutions by fixing bugs and optimizing performance. Provide technical support to operations or other development teams by troubleshooting, debugging, and solving critical issues in the production environment in a timely manner to minimize user and revenue impact. Work closely with our business partners and stakeholders to identify requirements and the priority of new enhancements and features, and ensure that stakeholders' needs are being met. 
Improve system reliability by implementing automated testing and deployment strategies. Design microservices architecture for complex applications. Participate in design discussions with other engineers. Continuously improve your knowledge of programming languages and technologies. Lead small projects or tasks within a larger project team. Guide junior developers on best practices and coding standards. About you: Bachelor's degree in computer science, engineering, information technology, or equivalent experience. Java, microservices, Spring Boot, and JavaScript with AWS and Python. 6+ years of experience in Java technologies. 1+ years of experience in Python. 3+ years of experience developing Spring Boot-based microservices. 2+ years of experience in frontend technologies such as JavaScript. 3+ years of background in software development using AWS capabilities (EC2, Lambda, IAM, RDS, CloudFormation). Broad experience in enterprise-class system design and development, including use of the following technologies: Java, Oracle, SQL, and messaging technologies. Experience working across geographical sites. Experience with CI/CD pipelines and GitHub Actions. Demonstrated understanding of the Linux operating system. Excellent interpersonal, verbal, and written communication skills. #LI-SA1 What's in it For You: Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office, depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected. Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance. 
Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future. Industry Competitive Benefits: We offer comprehensive benefit plans that include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing. Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together. Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives. Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world. Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. 
Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world-leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information about Thomson Reuters can be found on thomsonreuters.com.
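The reliability responsibilities in this posting (minimizing production impact, supporting microservices) commonly rely on retry-with-exponential-backoff when calling downstream services. The sketch below is illustrative only: `flaky_call` is a stand-in for a real dependency, and the sleep function is injectable so the example runs instantly.

```python
# Retry with exponential backoff, a common microservice reliability
# pattern. Delays double each attempt: base, 2*base, 4*base, ...
import time

def retry(fn, attempts=3, base_delay=0.1, sleep=time.sleep):
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise            # out of attempts: surface the error
            sleep(base_delay * (2 ** i))

calls = {"n": 0}
def flaky_call():
    # Pretend dependency that fails twice, then succeeds.
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient")
    return "ok"

print(retry(flaky_call, sleep=lambda s: None))  # ok
```

In production this is usually paired with jitter and a cap on the delay so many clients do not retry in lockstep.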
Posted 6 days ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
We are seeking an experienced DevOps Engineer to join our team, focusing on automating and optimizing our deployment processes. The candidate will be responsible for setting up GitHub Actions and AWS CodePipeline to build and deploy the Kore Chatbot and Product Training applications in the AWS environment. This position plays a crucial role in ensuring efficient, error-free deployments and upholding strong continuous integration and continuous delivery (CI/CD) practices. About the Role: CI/CD Pipeline Management: Design, implement, and maintain CI/CD pipelines using tools such as GitHub Actions and AWS CodePipeline. Debug and resolve issues within the CI/CD pipelines to ensure smooth and efficient operations. Develop build and deployment scripts for multiple CI/CD pipelines to support various applications and services. Build and deploy various technology stacks such as Java, Angular, Node.js, and React. Infrastructure as Code (IaC): Utilize IaC tools like CloudFormation and Terraform to automate cloud infrastructure management. Manage and optimize cloud resources on platforms such as AWS (ECS, Lambda, EC2, API Gateway, IAM, S3, CloudFront, RDS) and Google Cloud Platform (GCP). Containerization and Orchestration: Create, manage, and deploy Docker containers and Kubernetes services to support scalable application deployment. Write and maintain Dockerfiles for containerized applications. Collaboration and Support: Collaborate with development teams to understand build and deployment requirements for various technology stacks, including AWS, Angular, React, and others. Ensure all build, deployment, and automation needs are met for platforms such as Optimus, Kore, Docebo, Kaltura, and others. Process Improvement: Continuously improve and streamline existing processes for enhanced efficiency and reliability. Utilize scripting tools like Python and Bash to streamline deployments. 
Monitoring and Performance: Implement and manage Datadog monitoring to ensure system performance and reliability. Set up metrics, dashboards, and alarms in Datadog. Team members are tasked with deploying packages to Development, QA, and Production environments. Create build and deployment scripts for multiple CI/CD pipelines to support a range of applications and services. Troubleshoot and resolve issues within CI/CD pipelines to ensure smooth and efficient operations. Maintain version-based pipelines and perform bug fixes as needed. Create a base-image pipeline as a one-time task. Integrate UI services with Datadog for continuous and detailed monitoring. Set up metrics, dashboards, and alarms in Datadog for UI services. Conduct a proof of concept for the LMS API Gateway. Implement AWS secrets management through the Infrastructure as Code pipeline. Automate the rotation of JFrog keys leveraging AWS Lambda. About You: A bachelor's degree in Computer Science, Engineering, or a related discipline. Over 5 years of experience in DevOps roles, with a focus on CI/CD pipelines and cloud services. Proven experience in a DevOps or similar role. Strong expertise in CI/CD tools such as GitHub Actions and AWS CodePipeline. Proficient in using IaC tools like CloudFormation and Terraform. Experience with cloud platforms, particularly AWS and GCP. Solid understanding of Docker and Kubernetes. Strong scripting skills in Python and Bash. Excellent problem-solving and communication skills. Ability to work collaboratively in a team environment. #LI-AD2 
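The posting's "automate the rotation of JFrog keys leveraging AWS Lambda" task boils down to a scheduled age check. The sketch below shows only that decision logic under invented names; a real job would read creation timestamps from the secrets store and call its rotation API.

```python
# Decision logic behind scheduled key rotation: flag keys older than
# the rotation window. No real secrets store is touched here.
from datetime import datetime, timedelta

def keys_due_for_rotation(keys, now, max_age_days=90):
    """Return names of keys older than the rotation window."""
    cutoff = now - timedelta(days=max_age_days)
    return [name for name, created in keys.items() if created < cutoff]

now = datetime(2024, 6, 1)
keys = {
    "ci-token": datetime(2024, 1, 1),      # ~150 days old -> rotate
    "deploy-token": datetime(2024, 5, 1),  # ~30 days old  -> keep
}

print(keys_due_for_rotation(keys, now))  # ['ci-token']
```

Run from a scheduled Lambda, the returned names would feed the actual rotate-and-update step, keeping rotation idempotent if a run is repeated.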
Posted 6 days ago
2.0 - 5.0 years
5 - 9 Lacs
Bengaluru
Work from Office
ISRM is looking for a Security Software Engineer to join our ISRM Software & Data Engineering team to develop innovative software solutions. Security Software Engineers are experienced professionals who design, develop, test, deploy, maintain, and enhance security software solutions. They have good knowledge and subject matter expertise in secure software development. The Security Software Engineer interacts with internal and external teams, works on projects independently, and collaborates with cross-functional teams to manage project priorities, deadlines, and deliverables. About The Role: Security Software Engineer: Delivers end-to-end technical solutions for multiple products or complex projects. Solves complex problems with minimal guidance. Designs, develops, and tests software systems and/or applications for enhancements and new products. Writes code according to coding specifications established for software solutions. Delivers software features with exceptional quality, meeting designated release plans and delivery commitments. Develops software solutions by studying information needs; conferring with users; studying systems flow, data usage, and work processes; investigating problem areas; and following the software development lifecycle. Documents and demonstrates solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code. Improves operations by conducting systems analysis and recommending changes in policies and procedures. About You: 2+ years of experience in software application development. Hands-on experience with Golang, Vue/React/Angular, and JavaScript/TypeScript. Experience with REST APIs and microservices. Good understanding of CI/CD concepts and pipelines. Working proficiency leveraging and operating AWS services such as (but not limited to) IAM, SQS, S3, Lambda, ECS, and EC2, or other cloud experience. Understanding of container technologies (Docker). 
Good written and verbal communication skills. Knowledge of SCRUM Agile methodology. Experience with any SCM like GitHub. Experience with automation and scripting with either Bash or PowerShell. Experience with Infrastructure as Code (IaC) such as Terraform. Hands-on security engineering or application security experience a plus. #LI-AD2 What's in it For You: Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected. Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance. Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future. Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing. Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together. Social Impact: Make an impact in your community with our Social Impact Institute. 
We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives. Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world. Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world-leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. 
To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
Posted 6 days ago
7.0 - 12.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Develops software programs in order to test code and applications. Involved in requirements analysis, test planning activities, test case automation, and execution of critical test cases/scripts required to ensure that the software meets business requirements. Functional testing levels may include application integration, system, system/network integration, and acceptance testing; may also include non-functional testing such as performance, volume, load, and fallback. Mentors and guides the team to perform functional testing, non-functional testing, and other testing activities. Software process professionals are involved in driving the corporate CMM process improvement program and in assessing and reporting on the software development processes adopted by projects through audits, reviews, and inspections. About The Role: As a Senior QA Engineer, you will: Plan, design, maintain, and execute test cases with minimal direction. Write and maintain automation code as new features are added. Deliver end-to-end testing solutions for multiple products or complex projects. Lead routine projects with manageable risks and resource requirements. Solve complex problems with minimal guidance. Train and mentor junior colleagues. About You: You are a fit for this position if your background includes: Must Have: Over 7 years of experience in software testing and quality assurance. Excellent problem-solving skills with the ability to work independently. Strong written and verbal communication skills for articulating technical concepts to stakeholders. Proficiency in developing and executing test plans and automation scripts using Python and test frameworks such as Pytest and Robot Framework. Solid experience with GitHub. Proficiency in leveraging and operating AWS services, including but not limited to IAM, SQS, S3, Lambda, CloudFormation, CloudFront, DynamoDB, ECS, and EC2. 
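The automation stack named above (Python with Pytest) can be sketched minimally. The helper function and its behavior below are illustrative assumptions, not part of the posting; Pytest discovers and runs the `test_*` functions via plain `assert` statements:

```python
# Minimal Pytest-style sketch for Python test automation.
# parse_price and its expected behavior are illustrative assumptions.

def parse_price(raw: str) -> float:
    """Parse a price string like '25 Lacs' into a float (hypothetical helper)."""
    value = raw.strip().split()[0]
    return float(value)

def test_parse_price_basic():
    # Pytest rewrites plain asserts to give rich failure messages.
    assert parse_price("25 Lacs") == 25.0

def test_parse_price_strips_whitespace():
    assert parse_price("  8.5 Lacs ") == 8.5
```

Running `pytest` in the directory containing this file would collect both `test_*` functions automatically, with no boilerplate test-runner code.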
#LI-AM1
Posted 6 days ago
4.0 - 6.0 years
8 - 12 Lacs
Hyderabad
Work from Office
About the Role: Grade Level (for internal use): 09. The Team: We are looking for a highly motivated, enthusiastic, and skilled software engineer to join an agile scrum team developing technology solutions for S&P Global Market Intelligence. The team is responsible for modernizing and maintaining the product by utilizing the latest technologies. The Impact: Contribute significantly to the growth of the firm by: Developing innovative functionality in existing and new products. Supporting and maintaining high-revenue products. Achieving the above intelligently and economically using best practices. What's in it for you: Build a career with a global company. Work on code that fuels the global financial markets. Grow and improve your skills by working on enterprise-level products and new technologies. Responsibilities: Architect, design, and implement software-related projects. Perform analysis and articulate solutions. Manage and improve existing solutions. Solve a variety of complex problems and figure out possible solutions, weighing the costs and benefits. Collaborate effectively with technical and non-technical stakeholders. Active participation in all scrum ceremonies following Agile principles and best practices. What We're Looking For: Basic Qualifications: Bachelor's/Master's Degree in Computer Science, Information Systems, or equivalent. 4-6 years of strong hands-on development experience in React, .Net, and SQL Server technologies. 
Able to demonstrate strong OOP skills. Knowledge of fundamentals or the financial industry. Mentoring experience. Understanding of database performance tuning in SQL Server. Ability to manage multiple priorities efficiently and effectively within specific timeframes. Experience in conducting application design and code reviews. Experience implementing AWS Lambda, SES, SQS, SNS, DynamoDB, and unit tests. Excellent problem-solving and troubleshooting skills. Preferred Qualifications: Experience with containerization using Docker and Kubernetes is a plus. About S&P Global Market Intelligence: At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep, and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence. What's In It For You: Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. 
We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global. Health & Wellness: Health care coverage designed for the mind and body. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference. For more information on benefits by country visit https://spgbenefits.com/benefit-summaries. Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. ----------------------------------------------------------- Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. 
Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ----------------------------------------------------------- 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority Ratings - (Strategic Workforce Planning)
Posted 6 days ago
4.0 - 9.0 years
6 - 15 Lacs
Bengaluru
Work from Office
As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating and integrating code into enterprise systems. You will have access to the latest education, tools, and technology, and a limitless career path with the world's technology leader. Come to IBM and make a global impact! Your Role and Responsibilities: Designs, develops, and supports application solutions with a focus on the HANA version of Advanced Business Application Programming (ABAP). This specialty may design, develop, and/or re-engineer highly complex application components, and integrate software packages, programs, and reusable objects residing on multiple platforms. This specialty may additionally have working knowledge of SAP HANA Technical Concept and Architecture, Data Modelling using HANA Studio, ABAP Development Tools (ADT), Code Performance Rules and Guidelines for SAP HANA, ADBC, Native SQL, ABAP Core Data Services, Database Procedures, Text Search, ALV on HANA, and HANA Live models consumption. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: 4-12 years of experience required. The ABAP on HANA Application Developers would possess knowledge of the following topics and apply them to bring value and innovation to client engagements: SAP HANA Technical Concept and Architecture, Data Modelling using HANA Studio, ABAP Development Tools (ADT), Code Performance Rules and Guidelines for SAP HANA, ADBC, Native SQL, ABAP Core Data Services, Database Procedures, Text Search, ALV on HANA, and HANA Live models consumption. 
Designing and developing data dictionary objects, data elements, domains, structures, views, lock objects, and search helps, and formatting the output of SAP documents with multiple options. Modifying standard layout sets in SAP Scripts, Smart Forms, and Adobe Forms. Development experience in RICEF (Reports, Interfaces, Conversions, Enhancements, and Forms). Preferred technical and professional experience: Experience working on Implementation, Upgrade, Maintenance, and Post-Production support projects would be an advantage. Understanding of SAP functional requirements and their conversion into technical design and development using the ABAP language for Reports, Interfaces, Conversions, Enhancements, and Forms in implementation or support projects
Posted 1 week ago
3.0 - 6.0 years
5 - 8 Lacs
Pune
Work from Office
Hello Visionary! We know that the only way a business thrives is if our people are growing. That's why we always put our people first. Our global, diverse team would be happy to support you and challenge you to grow in new ways. Who knows where our shared journey will take you? We are looking for a Golang Developer. You'll make a difference by: Designing, developing, and maintaining robust backend services using Go, including RESTful APIs and microservices. Developing and maintaining automated unit and integration tests to ensure feature reliability and performance. Adhering to established quality processes to support the project in achieving its quality objectives. Investigating and debugging reported software defects effectively, demonstrating strong problem-solving skills. Communicating clearly and professionally across various levels, within the team and across groups, both verbally and in writing, including emails, presentations, and technical documentation. You'll win us over by: Holding a graduate BE/B.Tech/MCA/M.Tech/M.Sc degree with a good academic record. 3-6 years of experience in software development with a strong focus on Go (Golang). Working experience in building and maintaining production-grade microservices and APIs. Strong grasp of cloud platforms (AWS), including services like Lambda, ECS, and S3. Hands-on experience with CI/CD, Git, and containerization (Docker). Familiarity with distributed systems, message queues, and API design best practices. Experience with observability tools for logging, monitoring, and tracing. Passion for innovation and building quick PoCs in a startup-like environment. Personal Attributes: Excellent problem-solving and communication skills, able to articulate technical ideas clearly to stakeholders. Adaptable to fast-paced environments with a solution-oriented, startup mindset. Proactive and self-driven, with a strong sense of ownership and accountability. 
Actively seeks clarification and asks questions rather than waiting for instructions. Create a better #TomorrowWithUs! This role, based in Pune, is an individual contributor position. You may be required to visit other locations within India and internationally. In return, you'll have the opportunity to work with teams shaping the future. At Siemens, we are a collection of over 312,000 minds building the future, one day at a time, worldwide. We are dedicated to equality and welcome applications that reflect the diversity of the communities we serve. All employment decisions at Siemens are based on qualifications, merit, and business need. Bring your curiosity and imagination, and help us shape tomorrow. Find out more about Siemens careers at www.siemens.com/careers
Posted 1 week ago
5.0 - 10.0 years
7 - 12 Lacs
Pune
Work from Office
As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for source-to-target and implementing solutions that tackle the client's needs. Your primary responsibilities include: Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements. Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Must have 5+ years of experience in Big Data: Hadoop, Spark, Scala, Python, HBase, Hive. Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git. Developed Python and PySpark programs for data analysis. Good working experience with Python to develop a custom framework for generating rules (similar to a rules engine). Developed Python code to gather data from HBase and design solutions implemented using PySpark. Used Apache Spark DataFrames/RDDs to apply business transformations and utilized Hive context objects to perform read/write operations. Preferred technical and professional experience: Understanding of DevOps. Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java, and Scala
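The "custom framework for generating rules (similar to a rules engine)" mentioned above can be sketched in plain Python. The rule format (field, operator, value triples) and the sample records are assumptions for illustration, not from the posting:

```python
# Minimal rules-engine sketch: compile declarative rule specs into predicates.
# The (field, op, value) rule format and record fields are illustrative assumptions.
import operator

# Supported operators; real frameworks would register many more.
OPS = {"eq": operator.eq, "gt": operator.gt, "lt": operator.lt}

def make_rule(field, op, value):
    """Compile a (field, op, value) spec into a predicate over dict records."""
    fn = OPS[op]
    return lambda record: fn(record.get(field), value)

def apply_rules(records, rules):
    """Keep only records that satisfy every rule."""
    return [r for r in records if all(rule(r) for rule in rules)]

records = [
    {"city": "Pune", "years": 6},
    {"city": "Mumbai", "years": 4},
]
rules = [make_rule("city", "eq", "Pune"), make_rule("years", "gt", 5)]
print(apply_rules(records, rules))  # keeps only the Pune record
```

The same pattern scales to PySpark by translating each rule spec into a DataFrame filter expression instead of a Python predicate, which is one common way such generated rules are pushed down to Spark.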
Posted 1 week ago
5.0 - 10.0 years
7 - 12 Lacs
Navi Mumbai
Work from Office
As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for source-to-target and implementing solutions that tackle the client's needs. Your primary responsibilities include: Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements. Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Must have 5+ years of experience in Big Data: Hadoop, Spark, Scala, Python, HBase, Hive. Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git. Developed Python and PySpark programs for data analysis. Good working experience with Python to develop a custom framework for generating rules (similar to a rules engine). Developed Python code to gather data from HBase and design solutions implemented using PySpark. Used Apache Spark DataFrames/RDDs to apply business transformations and utilized Hive context objects to perform read/write operations. Preferred technical and professional experience: Understanding of DevOps. Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java, and Scala
Posted 1 week ago