- 5 years
4 - 9 Lacs
Chennai, Delhi / NCR, Mumbai (All Areas)
Work from Office
Experience: 0 to 4 years. Qualification: B.Tech/BE/MCA. Skills in one or more of JavaScript, CSS, web application frameworks (e.g. Sencha Ext JS, jQuery), Delphi, C, C++, or Java. Cloud Administrator: managing Windows-based servers.
Posted 1 month ago
5 - 10 years
10 - 15 Lacs
Bengaluru
Work from Office
Wissen Infotech Pvt Ltd is hiring an AWS Cloud Developer with Node.js. Experience: 5+ years. Location: Bangalore. Notice: Immediate.

Job Description
As part of End-to-End Digital Customer Relationships, installed-base tracking has become one of the key pillars for growing Services revenues significantly: both field services and digital services. Schneider Electric is seeking a highly skilled and experienced AWS Developer with strong AWS Cloud skills, strong development skills in Node.js, and proficiency in GitHub to join our Installed Base team. The ideal candidate will have a robust understanding of, and hands-on experience with, building, deploying, and maintaining microservices on AWS services such as Lambda, DynamoDB, and API Gateway. You will play a crucial role in developing services and ensuring our infrastructure is maintained and secure. Experience with IaC/DevOps tools such as Terraform and CloudFormation is a plus.

Schneider Installed Base (IB) is the central data platform where all the product asset data we track is collected, qualified, consolidated, and exposed. Under the IB AWS lead, the IB AWS Cloud Developer is a key member of the installed-base custom development team, whose role is to:
- Work with the IB AWS lead on the design of architectures for new capabilities hosted on our AWS platform.
- Design and develop microservices using Node.js. (Previous experience with Python following OOP is a plus.)
- Evaluate product requirements for operational feasibility and create detailed specifications based on user stories.
- Write clean, efficient, high-quality, secure, testable, maintainable code based on specifications.
- Coordinate with stakeholders (Product Owner, Scrum Master, Architect, Quality, and DevOps teams) to ensure successful execution of the project.
- Troubleshoot and resolve issues related to the infrastructure.
- Ensure best practices are followed in cloud services, focusing on scalability, maintainability, and security.
- Keep abreast of the latest advancements in AWS cloud technologies and trends to recommend process improvements and technology upgrades.
- Mentor and guide junior team members, fostering a culture of continuous learning and innovation.
- Participate in architecture review board meetings and make strategic recommendations on the choice of services.

Qualifications
- 3+ years of experience working with Node.js along with AWS Cloud services, or in a similar role.
- Master's degree in computer science with a focus on Cloud/Data or equivalent (or a Bachelor's with more years of experience).
- Comprehensive knowledge and hands-on experience with Node.js and AWS Cloud services, especially Lambda, serverless, API Gateway, SQS, SNS, SES, DynamoDB, and CloudWatch.
- Knowledge of best practices in Python is a plus.
- Knowledge of branching and version control systems like Git (mandatory).
- Experience with IaC tools such as Terraform and/or CloudFormation is a plus.
- Proficient in data structures and algorithms.
- Excellent collaboration skills.
- A desire for continuous learning and staying updated with emerging technologies.

Skills
- As this position sits on a global team, fluent English communication skills (written and spoken) are required.
- Strong interpersonal skills, with the ability to communicate and convince at various levels of the organization and in a multicultural environment.
- Ability to effectively multi-task and manage priorities.
- Strong analytical and synthesis skills.
- Initiative to uncover and solve problems proactively.
- Ability to understand complex software development environments.
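The role above centres on microservices built from Lambda, API Gateway, and DynamoDB. As a rough, hypothetical sketch of that request-handling pattern (in Python rather than the Node.js the posting asks for, and with an in-memory dict standing in for DynamoDB), a handler might look like:

```python
import json

# Hypothetical asset store; a real handler would query DynamoDB (e.g. via an
# AWS SDK) instead of this in-memory stand-in.
ASSETS = {"a-100": {"id": "a-100", "site": "Bangalore"}}

def lambda_handler(event, context):
    # With API Gateway's proxy integration, path parameters arrive in the event.
    asset_id = (event.get("pathParameters") or {}).get("assetId")
    asset = ASSETS.get(asset_id)
    if asset is None:
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}
    return {"statusCode": 200, "body": json.dumps(asset)}

# Local invocation with a sample API Gateway-style event:
resp = lambda_handler({"pathParameters": {"assetId": "a-100"}}, None)
```

The same shape (event in, status/body dict out) carries over to Node.js; only the runtime and SDK calls change.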
Posted 1 month ago
11 - 17 years
15 - 27 Lacs
Hyderabad, Bengaluru
Work from Office
Role: Solution Architect - Java. Experience: 12-18 years. Notice period: Immediate to 15 days. Location: Ban. Mode of hire: Permanent. Interested candidates can share an updated CV (or any references) to mansoor@burgeonits.com

Job Description:
The Solution Architect / Cloud Architect shall have at least a degree in Computer Science, Computer/Electrical Engineering, Information Technology, or a related discipline, with at least 8 years of design/implementation/consulting experience with cloud hosting and distributed applications, and a minimum of 5 years of hands-on experience as a technical lead and system architect (i.e. system design, performance tuning, and system prototyping and maintenance). A solid software development background and multiple industry certifications would be advantageous, e.g. AWS Certified Solutions Architect or equivalent, Certified Enterprise Java Architect for J2EE platforms, SEI Software Architecture Certification, or IASA Certified IT Architect Professional or equivalent.

Responsibilities:
- Design, build, and implement modern cloud-deployable applications to meet business requirements.
- Gather functional and technical requirements as inputs into the overall solution design.
- Estimate and plan work for all phases of the project, participating in process and functional design activities, and carrying out application design, build, test, deployment, and integration activities.
- Drive user acceptance test planning and execution.
- Design, develop, and implement back-end applications in Java with Spring Boot on AWS Cloud.
- Manage the leads to deliver the requirements as per the design.

Experience:
- 8+ years of relevant experience.
- Experience in the design and development of scalable architectures.
Posted 1 month ago
4 - 7 years
20 - 25 Lacs
Mumbai
Work from Office
Senior CloudOps Engineer

Congratulations, you have taken the first step towards bagging a career-defining role. Join the team of superheroes that safeguard data wherever it goes.

What should you know about us?
Seclore protects and controls digital assets to help enterprises prevent data theft and achieve compliance. Permissions and access to digital assets can be granularly assigned and revoked, or dynamically set at the enterprise level, including when shared with external parties. Asset discovery and automated policy enforcement allow enterprises to adapt to changing security threats and regulatory requirements in real time and at scale. Know more about us at www.seclore.com

You would love our tribe: if you are a risk-taker, innovator, and fearless problem solver who loves solving challenges of data security, then this is the place for you!

Role: Senior CloudOps Engineer. Experience: 4-7 years. Location: Mumbai.

A sneak peek into the role: this position is for self-motivated and highly energetic individuals who can think of multiple solutions to a given problem and help in decision-making while working in a super-agile environment.

Here's what you will get to explore:
- Design, build, and maintain scalable, fault-tolerant software systems that can handle increasing load and recover from failures quickly.
- Combine technology, tools, and global best practices of cloud operations for innovation, efficiency, and compliance.
- Understand and manage the existing setup of the Seclore Cloud offering as per internal and external SLAs.
- Update or write automation tools to commission and configure additional infrastructure elements on AWS cloud when needed.
- Deploy and upgrade the Seclore application suite on cloud and work on customer requests for specific changes or issues, using existing automation scripts, monitoring tools, and manual steps.
- Continuously improve automation and monitoring tools for better effectiveness and efficiency.
- Understand Seclore product features, the technology platform, and deployment and configuration options.
- Work closely with other teams to understand requirements, upcoming features, and their impact on the cloud infrastructure, ensuring DevOps requirements are communicated and understood.
- Learn and build expertise on the latest technologies and best practices.

We can see the next Entrepreneur at Seclore if you have:
- A technical degree (Engineering, MCA) from a reputed institute with a minimum of 4 years of experience in automation.
- At least 2 years of experience working with AWS or Azure cloud.
- Experience writing applications or automation tools using one or more of batch script and PowerShell.
- Experience working in a SaaS/managed-services or production environment.
- Experience with Linux OS.
- Hands-on experience with application configuration and deployment on cloud-based servers.
- Experience with tools and processes for application monitoring, incident, and log management.
- Good knowledge of troubleshooting web-based application deployments and configurations.
- Adequate knowledge of an RDBMS like SQL Server and/or Oracle.
- Good verbal and written communication skills to interact with technical and non-technical staff.
- An analytical frame of mind to identify and evaluate multiple solutions to the same problem and come up with a solution roadmap.

Why do we call Seclorites Entrepreneurs, not Employees? We value and support those who take the initiative and calculate risks. We have the attitude of a problem solver and an aptitude that is tech-agnostic. You get to work with the smartest minds in the business. We are thriving, not just living. At Seclore, it is not just about work but about creating outstanding employee experiences. Our supportive and open culture enables our team to thrive.

Excited to be the next Entrepreneur? Apply today! Don't have some of the above points in your resume at the moment? Don't worry. We will help you build it.
Let's build the future of data security at Seclore together.
Posted 1 month ago
6 - 8 years
12 - 15 Lacs
Bengaluru
Work from Office
5+ years of experience as a Backend Java Developer.
- Experience in Java, Spring Boot, Kafka, Jenkins.
- Experience with databases like MySQL, Postgres, Oracle.
- Experience in microservices.
- Experience in AWS.
- Experience in Java integration.

Required Candidate Profile
Mandatory skills:
- Architecture design
- Code development
- DevOps
- Jenkins
- Kafka
- Microservices
- API
- AWS Cloud

Qualification: Graduate in a Science or Engineering discipline.
Posted 1 month ago
10 - 18 years
11 - 15 Lacs
Chennai
Work from Office
About Us
Why a career at Zuci is unique! Constant attention is the source of our perfection. We fundamentally believe that building a career is all about consistency. If you jog or walk for a few days, it won't bring big results. If you do the right things every day for hundreds of days, you'll become lighter, more flexible, and you'll start enjoying your work and life more. Our customers trust us because of our unwavering consistency, enabling us to deliver high-quality work and thereby give our customers and Team Zuci the best shot at extraordinary outcomes. Do you see the big picture? Is Digital Engineering your forte?

About The Role
Job Title: AWS Solution Architect

Job Summary
We are seeking an experienced AWS Solution Architect to design, implement, and optimize the cloud infrastructure for the client's enterprise-level web application. The successful candidate will be responsible for architecting, deploying, and managing the end-to-end AWS environment, ensuring it meets the business requirements, security standards, and performance objectives.

Responsibilities
- Design and optimize the overall AWS cloud architecture, including VPC, IAM, S3, RDS, ECS, and other relevant services.
- Ensure seamless integration and connectivity between the various components of the architecture, such as databases, load balancers, and monitoring tools.
- Implement high availability and resilience mechanisms, including auto-scaling, failover, and disaster recovery strategies.
- Configure and manage security controls, access management, and compliance requirements (e.g., LAVA properties) for the cloud environment.
- Monitor and optimize the performance of the infrastructure, addressing bottlenecks and improving resource utilization.
- Establish and maintain CI/CD pipelines for efficient application deployments and infrastructure changes.
- Collaborate with the development, DevOps, and database teams to understand their requirements and integrate the cloud solutions accordingly.
- Mentor and upskill the cross-functional team on AWS best practices, architecture patterns, and design principles.
- Provide ongoing support and troubleshooting for the cloud environment, quickly identifying and resolving issues.
- Document processes, procedures, and knowledge for maintainability and knowledge sharing.

Qualifications
- 10+ years of experience as an AWS Solution Architect
- Deep understanding of AWS services, including VPC, EC2, RDS, ECS, CloudWatch, and security services
- Proficient in designing and implementing highly available, scalable, and secure cloud architectures
- Strong knowledge of containerization, CI/CD, and Infrastructure as Code (IaC) tools
- Excellent problem-solving skills and the ability to quickly identify and resolve issues
- Experience in collaborating with cross-functional teams and providing technical leadership
- Familiarity with web application development, databases, and monitoring/logging tools
- Excellent communication and documentation skills

Preferred Qualifications
- AWS Certified Solutions Architect - Professional certification
Posted 1 month ago
2 - 6 years
5 - 15 Lacs
Noida, Gurugram, Delhi / NCR
Work from Office
Note: preferring candidates from product firms only.

Hi candidates, we have an opportunity for a Full Stack Developer at a leading product-based company. Interested candidates can mail their CVs to Abhishek.saxena@mounttalent.com

Job Responsibilities - what we expect from you:
- Work with the product team on requirements analysis and make technical trade-off decisions at the application level (e.g., component design).
- Define and develop solutions to technical problems within an Agile framework.
- Use your expertise to input into re-engineering and design.
- Act as a subject matter expert for focus areas across the technology space.
- Follow and improve coding standards with the team; ensure high code quality.
- Actively look for the latest tools and technologies; actively learn and help implement new industry standards related to products, people, and processes.
- Identify inefficiencies in the process and help the team improve them to maximize the team's performance.
- Help identify dependencies, risks, and bottlenecks in projects proactively; work actively with the Engineering Manager/Team Lead to resolve them.

Skill-set requirements:
Required skills:
- Strong understanding of object-oriented programming concepts, data structures, and algorithms.
- Experience working with either React or Angular (React preferred).
- Strong proficiency in C# and .NET Core, with a deep understanding of developing RESTful APIs.
- Experience with unit testing and ensuring code quality.
- Familiarity with any relational database (Postgres preferred).
- Knowledge and experience with Docker.
Preferred skills:
- Experience with Kubernetes (K8s) and cloud platforms such as GCP or Azure.
- Knowledge of microservices and event-driven architecture, particularly pub/sub.
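The preferred skills above mention event-driven architecture built on pub/sub. As a minimal, illustrative sketch of that pattern (plain Python with an in-memory broker standing in for a managed service such as GCP Pub/Sub), the decoupling looks like:

```python
from collections import defaultdict
from typing import Callable

# In-memory stand-in for a pub/sub broker: publishers emit events by topic,
# subscribers register callbacks and never reference publishers directly.
class Broker:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # Fan-out: every subscriber of the topic receives the event.
        for handler in self._subscribers[topic]:
            handler(event)

broker = Broker()
audit_log = []

# Two independent services react to the same event without knowing each other.
broker.subscribe("order.created", lambda e: audit_log.append(("audit", e["id"])))
broker.subscribe("order.created", lambda e: audit_log.append(("email", e["id"])))

broker.publish("order.created", {"id": 42})
```

Swapping the in-memory broker for a managed service changes only the transport; the handlers stay the same, which is the loose coupling such architectures aim for. Topic and field names here are hypothetical.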
Posted 1 month ago
1 - 4 years
11 - 15 Lacs
Hyderabad
Work from Office
Job Area: Engineering Group > Software Applications Engineering

General Summary: The Tools team at Qualcomm is looking for an experienced software engineer to help us design and build quality software.

Job Overview
Develops, creates, and modifies general computer applications software or specialized utility programs. Analyzes user needs and develops software solutions. Designs software or customizes software for client use with the aim of optimizing operational efficiency. Modifies existing software to correct errors, allow it to adapt to new hardware, or improve its performance. Stores, retrieves, and manipulates data for analysis of system capabilities and requirements.

Minimum Qualifications
- Bachelor's degree in Computer Science Engineering or a related field.
- 3+ years of software engineering or related work experience.
- 3+ years of experience in the C# programming language (mandatory).
- 3+ years of experience in web application development using Angular, ASP.NET, and Web APIs (mandatory).
- 3+ years of experience in application development in .NET Core and .NET Framework.
- Knowledge of Python and AWS/Azure cloud.
- Knowledge of SQL Server databases, operating systems, and algorithm design.

Preferred Qualifications
- Experience in developing one or more products throughout all phases of the software development lifecycle.
- Exposure to web development using C#.NET, Python, ASP.NET, Angular, Web APIs, JavaScript, HTML5, CSS, and REST web services.
- Experience working on AWS cloud platforms.
- Exposure to Business Intelligence and Machine Learning will be an added advantage.
- Exposure to CMS applications, reStructuredText, Markdown, and DITA will be an added advantage.
- Experience working in an Agile/Scrum development process.

Education Requirements
Required: Bachelor's, Computer Engineering and/or Computer Science.
Preferred: Master's, Computer Engineering and/or Computer Science.

Minimum Qualifications: Bachelor's degree in Engineering, Information Systems, Computer Science, or a related field.
Posted 1 month ago
5 - 10 years
30 - 32 Lacs
Bengaluru
Work from Office
About The Role
Data Engineering Manager: 5-9 years. Software Development Manager: 9+ years. Kotak Mahindra Bank, Bengaluru, Karnataka, India (on-site).

What we offer
Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team
DEX is the central data org for Kotak Bank, managing the bank's entire data experience. DEX stands for Kotak's Data Exchange. This org comprises the Data Platform, Data Engineering, and Data Governance charters, and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities to build things from scratch and create one of the best-in-class data lakehouse solutions. The primary skills this team should encompass are: software development, preferably Python, for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to be a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters.
As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member in Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and be futuristic in building systems that can be operated by machines using AI technologies.

The data platform org is divided into 3 key verticals:

Data Platform
This vertical is responsible for building the data platform, which includes optimized storage for the entire bank: building a centralized data lake; a managed compute and orchestration framework, including serverless data solutions; managing a central data warehouse for extremely high-concurrency use cases; building connectors for different sources; building a customer feature repository; building cost-optimization solutions like EMR optimizers; performing automations; and building observability capabilities for Kotak's data platform. The team will also be the center for Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering
This team will own data pipelines for thousands of datasets, be skilled at sourcing data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based, programmatic way, and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, cutting across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, Branch Managers, and all analytics use cases.

Data Governance
This will be the central data governance team for Kotak Bank, managing metadata platforms, Data Privacy, Data Security, Data Stewardship, and the Data Quality platform.
If you've the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you. Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager
- 10+ years of engineering experience, most of it in the data domain
- 5+ years of engineering team management experience
- 10+ years of planning, designing, developing, and delivering consumer software experience
- Experience partnering with product or program management teams
- 5+ years of experience managing data engineers, business intelligence engineers, and/or data scientists
- Experience designing or architecting (design patterns, reliability, and scaling) new and existing systems
- Experience managing multiple concurrent programs, projects, and development teams in an Agile environment
- Strong understanding of Data Platform, Data Engineering, and Data Governance
- Experience designing and developing large-scale, high-traffic applications

PREFERRED QUALIFICATIONS
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech is desired.
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices for the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills

For Managers
- Customer centricity; obsession for the customer
- Ability to manage stakeholders (product owners, business stakeholders, cross-functional teams) and coach agile ways of working
- Ability to structure and organize teams, and streamline communication
- Prior work experience executing large-scale Data Engineering projects
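The posting repeatedly stresses config-based, programmatic data pipelines (extract, transform, load with data-quality checks). As a toy sketch of that idea in plain Python with hypothetical column names (a production pipeline would express the same step as Spark transformations on the AWS services listed above), a transform driven by a config dict might look like:

```python
# Hypothetical pipeline config: rename source columns to the target data
# model's names, and require certain fields to be present (a simple
# data-quality gate). A real config would typically live in version control.
CONFIG = {
    "rename": {"cust_id": "customer_id", "amt": "amount"},
    "required": ["customer_id", "amount"],
}

def transform(rows, config):
    out = []
    for row in rows:
        # Rename source columns to the target model's names.
        mapped = {config["rename"].get(k, k): v for k, v in row.items()}
        # Drop records missing required fields instead of loading bad data.
        if all(mapped.get(col) is not None for col in config["required"]):
            out.append(mapped)
    return out

raw = [{"cust_id": 1, "amt": 250.0}, {"cust_id": 2, "amt": None}]
clean = transform(raw, CONFIG)
```

The point of the config-driven style is that onboarding a new source changes only the config, not the pipeline code, which is what makes owning "thousands of datasets" tractable.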
Posted 1 month ago
5 - 10 years
30 - 35 Lacs
Bengaluru
Work from Office
About The Role

What we offer
Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team
DEX is the central data org for Kotak Bank, managing the bank's entire data experience. DEX stands for Kotak's Data Exchange. This org comprises the Data Platform, Data Engineering, and Data Governance charters, and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities to build things from scratch and create one of the best-in-class data lakehouse solutions. The primary skills this team should encompass are: software development, preferably Python, for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to be a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters.
As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member in Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and be futuristic in building systems that can be operated by machines using AI technologies.

The data platform org is divided into 3 key verticals:

Data Platform
This vertical is responsible for building the data platform, which includes optimized storage for the entire bank: building a centralized data lake; a managed compute and orchestration framework, including serverless data solutions; managing a central data warehouse for extremely high-concurrency use cases; building connectors for different sources; building a customer feature repository; building cost-optimization solutions like EMR optimizers; performing automations; and building observability capabilities for Kotak's data platform. The team will also be the center for Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering
This team will own data pipelines for thousands of datasets, be skilled at sourcing data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based, programmatic way, and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, cutting across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, Branch Managers, and all analytics use cases.

Data Governance
This will be the central data governance team for Kotak Bank, managing metadata platforms, Data Privacy, Data Security, Data Stewardship, and the Data Quality platform.
If you've the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you. Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data
- Bachelor's degree in Computer Science, Engineering, or a related field
- 3-5 years of experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager
- 10+ years of engineering experience, most of it in the data domain
- 5+ years of engineering team management experience
- 10+ years of planning, designing, developing, and delivering consumer software experience
- Experience partnering with product or program management teams
- 5+ years of experience managing data engineers, business intelligence engineers, and/or data scientists
- Experience designing or architecting (design patterns, reliability, and scaling) new and existing systems
- Experience managing multiple concurrent programs, projects, and development teams in an Agile environment
- Strong understanding of Data Platform, Data Engineering, and Data Governance
- Experience designing and developing large-scale, high-traffic applications

PREFERRED QUALIFICATIONS
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech is desired
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices for the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills
Posted 1 month ago
8 - 13 years
30 - 32 Lacs
Bengaluru
Work from Office
About The Role
Data Engineer - 2 (Experience: 2-5 years)

What we offer
Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team
DEX is the central data org for Kotak Bank, managing the bank's entire data experience. DEX stands for Kotak's Data Exchange. This org comprises the Data Platform, Data Engineering, and Data Governance charters, and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities to build things from scratch and create one of the best-in-class data lakehouse solutions. The primary skills this team should encompass are: software development, preferably Python, for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to be a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters.
As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member in the digital transformation journey of Kotak; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch and analytics solutions, in a programmatic way; and also be futuristic in building systems which can be operated by machines using AI technologies. The data platform org is divided into 3 key verticals: Data Platform This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and building a centralized data lake, managed compute and orchestration frameworks including concepts of serverless data solutions, managing a central data warehouse for extremely high concurrency use cases, building connectors for different sources, building a customer feature repository, building cost optimization solutions like EMR optimizers, performing automations and building observability capabilities for Kotak's data platform. The team will also be the center for Data Engineering excellence, driving trainings and knowledge sharing sessions with the large data consumer base within Kotak. Data Engineering This team will own data pipelines for thousands of datasets, source data from 100+ source systems and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank which cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, Branch Managers and all analytics use cases. Data Governance This team will be the central data governance team for Kotak Bank, managing Metadata platforms, Data Privacy, Data Security, Data Stewardship and the Data Quality platform.
If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you. Your day-to-day role will include: Drive business decisions with technical input and lead the team. Design, implement, and support a data infrastructure from scratch. Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA. Extract, transform, and load data from various sources using SQL and AWS big data technologies. Explore and learn the latest AWS technologies to enhance capabilities and efficiency. Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis. Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers. Build data platforms, data pipelines, or data management and governance tools. BASIC QUALIFICATIONS for Data Engineer/SDE in Data Bachelor's degree in Computer Science, Engineering, or a related field Experience in data engineering Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR Experience with data pipeline tools such as Airflow and Spark Experience with data modeling and data quality best practices Excellent problem-solving and analytical skills Strong communication and teamwork skills Experience in at least one modern scripting or programming language, such as Python, Java, or Scala Strong advanced SQL skills PREFERRED QUALIFICATIONS AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow Prior experience in the Indian Banking segment and/or Fintech is desired.
Experience with non-relational databases and data stores Building and operating highly available, distributed data processing systems for large datasets Professional software engineering and best practices for the full software development life cycle Designing, developing, and implementing different types of data warehousing layers Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions Building scalable data infrastructure and understanding distributed systems concepts SQL, ETL, and data modelling Ensuring the accuracy and availability of data to customers Proficient in at least one scripting or programming language for handling large-volume data processing Strong presentation and communication skills.
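The config-based, programmatic pipeline work this team describes can be illustrated with a minimal sketch. This is pure Python with hypothetical column names and a toy runner, not the team's actual Spark stack; in practice the same idea runs as PySpark over declarative job configs.

```python
# Minimal sketch of a config-driven ETL step: the pipeline is declared as
# data (columns to select, renames, a row filter) and executed by a generic
# runner. All table and column names here are hypothetical illustrations.

PIPELINE_CONFIG = {
    "select": ["account_id", "balance", "branch"],
    "rename": {"balance": "balance_inr"},
    "filter": lambda row: row["balance"] > 0,
}

def run_pipeline(rows, config):
    out = []
    for row in rows:
        if not config["filter"](row):
            continue  # drop rows the config excludes
        projected = {col: row[col] for col in config["select"]}
        for old, new in config["rename"].items():
            projected[new] = projected.pop(old)
        out.append(projected)
    return out

raw = [
    {"account_id": 1, "balance": 2500, "branch": "BLR", "extra": "x"},
    {"account_id": 2, "balance": -10, "branch": "MUM", "extra": "y"},
]
clean = run_pipeline(raw, PIPELINE_CONFIG)
```

The payoff of the config-based approach is that adding a dataset means adding a config entry, not writing a new job.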
Posted 1 month ago
2 - 6 years
3 - 7 Lacs
Hyderabad
Work from Office
At F5, we strive to bring a better digital world to life. Our teams empower organizations across the globe to create, secure, and run applications that enhance how we experience our evolving digital world. We are passionate about cybersecurity, from protecting consumers from fraud to enabling companies to focus on innovation. Everything we do centers around people. That means we obsess over how to make the lives of our customers, and their customers, better. And it means we prioritize a diverse F5 community where each individual can thrive. Join a team using leading-edge security technology and processes to protect the F5 enterprise and product environment. The Security Engineer position will execute strategic processes and implement technical solutions to enable our information security program and address day-to-day security challenges amidst the industry's evolving technology landscape. Primary Responsibilities Build and implement new security controls, processes and tools. Identify organizational risks to confidentiality, integrity, and availability, and determine appropriate mitigations. Leverage native Azure, GCP, and AWS cloud services to automate and improve existing security and control activities. Develop or implement open-source/third-party tools to assist in detection, prevention and analysis of security threats. Perform technical security assessments against product and enterprise cloud-hosted, virtual, and on-premise systems including static and dynamic analysis, and threat modeling. Review and test changes to services, applications, and networks for potential security impacts. Collaborate with Architecture, Site Reliability Engineering and Operations teams to develop and implement technical solutions and security standards. Stay abreast of security best practices and secure design principles. Review changes to and ongoing operations of enterprise environments and supporting systems for security and compliance impacts.
Assist in incident detection and response efforts. Implement zero-trust patterns with cloud-agnostic tools to support enterprise business units. Implement, design, develop, administer, and manage enterprise security tooling. Knowledge, Skills and Abilities Experience working with high-availability enterprise production environments Familiarity with scripting languages (e.g., Go, Python, Ruby, Rust) and building scripts for process improvements Experience automating security testing and reporting outputs Technical knowledge and hands-on experience with security and networking security, basic networking protocols, cloud security, network security design, intrusion prevention/detection, and firewall architecture Experience assessing and implementing technical security controls Willingness to innovate and learn new technologies Excellent interpersonal and relationship skills with a collaborative mindset Knowledge or familiarity with our technology stack (BIG-IP, Azure, AWS, GCP, CentOS, HashiCorp Vault, Palo Alto, Qualys). Experience with network and application vulnerability and penetration testing tools. Baseline competency in administration of Microsoft Azure Cloud, Amazon Web Services (AWS), Google Cloud Platform (GCP) or equivalent public cloud infrastructure. Exposure to DevOps tooling, CI/CD pipelines, container orchestration, and infrastructure-as-code approaches (e.g., Puppet, Chef, Ansible, Terraform, Jenkins, CircleCI, Artifactory, Git) Strong written and verbal communication skills. Strong self-directed work habits, exhibiting initiative, drive, creativity, maturity, self-assurance and professionalism. Agile, tactful, and proactive attitude that can manage prioritization and know when to escalate. Qualifications B.S. or M.S. in Computer Science, Engineering, or related field, or equivalent experience.
3+ years of relevant security and networking experience This role description is intended to be a general representation of the responsibilities and requirements of the job. However, the description may not be all-inclusive, and responsibilities and requirements are subject to change. Please note that F5 only contacts candidates through an F5 email address (ending with @f5.com) or auto email notification from Workday (ending with f5.com or @myworkday.com). Equal Employment Opportunity It is the policy of F5 to provide equal employment opportunities to all employees and employment applicants without regard to unlawful considerations of race, religion, color, national origin, sex, sexual orientation, gender identity or expression, age, sensory, physical, or mental disability, marital status, veteran or military status, genetic information, or any other classification protected by applicable local, state, or federal laws. This policy applies to all aspects of employment, including, but not limited to, hiring, job assignment, compensation, promotion, benefits, training, discipline, and termination. F5 offers a variety of reasonable accommodations for candidates. Requesting an accommodation is completely voluntary. F5 will assess the need for accommodations in the application process separately from those that may be needed to perform the job. Request by contacting accommodations@f5.com.
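The automated security control checks this role describes might look like the following sketch. The rule schema is a hypothetical simplification for illustration, not any cloud provider's real API; in practice a tool like this would read security-group data from the provider's SDK.

```python
# Sketch of an automated control check: flag firewall-style ingress rules
# that expose sensitive ports to the whole internet. The rule dicts below
# are a hypothetical simplification of real security-group data.

SENSITIVE_PORTS = {22, 3389}  # SSH and RDP should not be world-reachable

def find_open_sensitive_rules(rules):
    """Return the ids of rules exposing a sensitive port to 0.0.0.0/0."""
    findings = []
    for rule in rules:
        if rule["cidr"] == "0.0.0.0/0" and rule["port"] in SENSITIVE_PORTS:
            findings.append(rule["id"])
    return findings

rules = [
    {"id": "sg-1", "port": 443, "cidr": "0.0.0.0/0"},  # public HTTPS: fine
    {"id": "sg-2", "port": 22, "cidr": "0.0.0.0/0"},   # public SSH: flag
    {"id": "sg-3", "port": 22, "cidr": "10.0.0.0/8"},  # internal SSH: fine
]
flagged = find_open_sensitive_rules(rules)
```

Running such checks on a schedule and reporting findings is one concrete form of the "automating security testing and reporting outputs" skill listed above.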
Posted 1 month ago
12 - 15 years
35 - 45 Lacs
Pune, Bengaluru, Mumbai (All Areas)
Hybrid
Strong frontend development experience with ReactJS, JavaScript or TypeScript. Proficiency in HTML5, CSS3 & responsive design best practices. Hands-on experience with AWS Cloud Services, specifically designing systems with SNS, SQS, EC2, Lambda & S3. Required Candidate profile Expert-level experience in backend development using .NET Core, C# & EF Core. Strong expertise in PostgreSQL & efficient database design. Proficient in building & maintaining RESTful APIs at scale.
Posted 1 month ago
2 - 5 years
6 - 10 Lacs
Bengaluru
Work from Office
12+ years of overall IT experience 5+ years of cloud implementation experience (AWS - S3), Terraform, Docker, Kubernetes Expert in troubleshooting cloud implementation projects Expert in cloud-native technologies Good working knowledge of Terraform and Quarkus Must-have skills: AWS cloud knowledge (AWS S3, Load Balancers, VPC/VPC Peering/Private-Public Subnets, EKS, SQS, Lambda, Docker/Container Services, Terraform or other IaC technologies for normal deployment), Quarkus, PostgreSQL, Flyway, Kubernetes, OpenID flow, OpenSearch/Elasticsearch, OpenAPI/Swagger, Java Optional: Kafka, Python #LI-INPAS Job Segment: Developer, Java, Technology
Posted 1 month ago
2 - 6 years
8 - 12 Lacs
Bengaluru
Work from Office
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Digital Engineering Sr. Staff Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN). Title - Lead Data Architect (Streaming) Required Skills and Qualifications Overall 10+ years of IT experience of which 7+ years of experience in data architecture and engineering Strong expertise in AWS cloud services, particularly Lambda, SNS, S3, and EKS Strong experience with Confluent Strong experience in Kafka Solid understanding of data streaming architectures and best practices Strong problem-solving skills and ability to think critically Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders Knowledge of Apache Airflow for data orchestration Bachelor's degree in Computer Science, Engineering, or related field Preferred Qualifications An understanding of cloud networking patterns and practices Experience with working on a library or other long-term product Knowledge of the Flink ecosystem Experience with Terraform Deep experience with CI/CD pipelines Strong understanding of the JVM language family Understanding of GDPR and the correct handling of PII Expertise with technical interface design Use of Docker Key Responsibilities Architect end-to-end data solutions using AWS services, including Lambda, SNS, S3, and EKS, Kafka and Confluent, all within a larger and overarching programme ecosystem Architect data processing applications using Python, Kafka, Confluent Cloud and AWS Develop data ingestion, processing, and storage solutions using Python and AWS Lambda, Confluent and Kafka Ensure data security and compliance throughout the architecture Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions Optimize data
flows for performance, cost-efficiency, and scalability Implement data governance and quality control measures Ensure delivery of CI, CD and IaC for NTT tooling, and as templates for downstream teams Provide technical leadership and mentorship to development teams and lead engineers Stay current with emerging technologies and industry trends Collaborate with data scientists and analysts to enable efficient data access and analysis Evaluate and recommend new technologies to improve data architecture Position Overview: We are seeking a highly skilled and experienced Data Architect to join our dynamic team. The ideal candidate will have a strong background in designing and implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies. This role requires a deep understanding of data architecture, cloud services, and the ability to drive innovative solutions to meet business needs. About NTT DATA NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users.
If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here. Job Segment Developer, Computer Science, Consulting, Technology
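The streaming ingestion-and-processing responsibilities described above reduce, at the unit level, to a validate-and-enrich step applied to each incoming message. A minimal sketch follows; the message shape and field names are hypothetical, and a real implementation would sit behind Kafka, SNS, or a Lambda trigger.

```python
import json

# Sketch of one streaming processing step: parse an incoming message,
# validate required fields, and emit an enriched, normalized record.
# The message schema here is a hypothetical illustration.

REQUIRED = {"event_id", "event_type", "payload"}

def process_message(raw: str) -> dict:
    event = json.loads(raw)
    missing = REQUIRED - event.keys()
    if missing:
        # In a real pipeline this would route the message to a dead-letter queue
        raise ValueError(f"missing fields: {sorted(missing)}")
    return {
        "event_id": event["event_id"],
        "event_type": event["event_type"].lower(),  # normalize casing
        "payload": event["payload"],
        "valid": True,
    }

msg = json.dumps({"event_id": "e1", "event_type": "ORDER_CREATED",
                  "payload": {"qty": 2}})
record = process_message(msg)
```

Keeping the step pure (string in, dict out) is what makes it equally deployable as a Kafka consumer callback or a Lambda handler body.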
Posted 1 month ago
2 - 6 years
5 - 9 Lacs
Hyderabad
Work from Office
AWS Data Engineer: ***************** As an AWS Data Engineer, you will contribute to our client and will have the responsibilities below: Work with the technical development team and team lead to understand desired application capabilities. The candidate will develop applications following established development lifecycles and continuous integration/deployment practices. Work to integrate open-source components into data-analytic solutions Willingness to continuously learn & share learnings with others Required: 5+ years of directly applicable experience with key focus: Glue and Python; AWS; data pipeline creation Develop code using Python, such as: developing data pipelines from various external data sources to internal data; using Glue to extract data from the design database; developing Python APIs as needed Minimum 3 years of hands-on experience in Amazon Web Services including EC2, VPC, S3, EBS, ELB, CloudFront, IAM, RDS, CloudWatch. Able to interpret business requirements, analyzing, designing and developing applications on AWS Cloud and ETL technologies Able to design and architect serverless applications using AWS Lambda, EMR, and DynamoDB Ability to leverage AWS data migration tools and technologies including Storage Gateway, Database Migration and Import/Export services. Understands relational database design, stored procedures, triggers, user-defined functions, SQL jobs. Familiar with CI/CD tools, e.g., Jenkins, UCD for automated application deployments Understanding of OLAP, OLTP, Star Schema, Snowflake Schema, Logical/Physical/Dimensional Data Modeling. Ability to extract data from multiple operational sources and load into staging, Data Warehouse, Data Marts etc. using SCDs (Type 1/Type 2/Type 3/Hybrid) loads. Familiar with Software Development Life Cycle (SDLC) stages in Waterfall and Agile environments.
Nice to have: Familiar with the use of source control management tools for branching, merging, labeling/tagging and integration, such as Git and SVN. Experience working with UNIX/LINUX environments Hands-on experience with IDEs such as Jupyter Notebook Education & Certification University degree or diploma and applicable years of experience Job Segment: Developer, Open Source, Data Warehouse, Cloud, Database, Technology
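The SCD loads mentioned in this posting can be illustrated with a minimal Type 2 sketch in plain Python. Column names are hypothetical, and in practice this logic would run as SQL MERGE statements or Glue/Spark jobs rather than in-memory dicts.

```python
from datetime import date

# Sketch of a Type 2 slowly changing dimension (SCD) load: when a tracked
# attribute changes, the current row is expired and a new current row is
# inserted, preserving history. Column names are hypothetical.

def scd2_upsert(dimension, incoming, today):
    """dimension: list of dicts with key, attr, valid_from, valid_to, current."""
    current = next((r for r in dimension
                    if r["key"] == incoming["key"] and r["current"]), None)
    if current is None:
        # brand-new key: insert as the current version
        dimension.append({**incoming, "valid_from": today,
                          "valid_to": None, "current": True})
    elif current["attr"] != incoming["attr"]:
        current["valid_to"] = today      # expire the old version
        current["current"] = False
        dimension.append({**incoming, "valid_from": today,
                          "valid_to": None, "current": True})
    # unchanged attr: no-op (Type 2 only versions on change)
    return dimension

dim = []
scd2_upsert(dim, {"key": "C1", "attr": "Chennai"}, date(2024, 1, 1))
scd2_upsert(dim, {"key": "C1", "attr": "Mumbai"}, date(2024, 6, 1))
# dim now holds the expired Chennai row and the current Mumbai row
```

A Type 1 load would instead overwrite `attr` in place, losing history; the hybrid variants in the posting mix these behaviors per column.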
Posted 1 month ago
2 - 6 years
8 - 12 Lacs
Bengaluru
Work from Office
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Digital Engineering Sr. Staff Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN). Title: Lead Data Architect (Warehousing) Required Skills and Qualifications Overall 10+ years of IT experience of which 7+ years of experience in data architecture and engineering Strong expertise in AWS cloud services, particularly Lambda, SNS, S3, and EKS Proficiency in Python Solid understanding of data warehousing architectures and best practices Strong Snowflake skills Strong data warehouse skills Strong problem-solving skills and ability to think critically Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders Experience of data cataloguing Knowledge of Apache Airflow for data orchestration Experience modelling, transforming and testing data in DBT Bachelor's degree in Computer Science, Engineering, or related field Preferred Qualifications Familiarity with Atlan for data catalog and metadata management Experience integrating with IBM MQ Familiarity with SonarQube for code quality analysis AWS certifications (e.g., AWS Certified Solutions Architect) Experience with data modeling and database design Knowledge of data privacy regulations and compliance requirements An understanding of Lakehouses An understanding of Apache Iceberg tables SnowPro Core certification Key Responsibilities Architect end-to-end data solutions using AWS services, including Lambda, SNS, S3, and EKS, as well as Snowflake, DBT and Apache Airflow, all within a larger and overarching programme ecosystem Develop data ingestion, processing, and storage solutions using Python, AWS Lambda and Snowflake Architect data processing applications using Python Ensure data security and compliance throughout the
architecture Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions Optimize data flows for performance, cost-efficiency, and scalability Implement data governance and quality control measures Ensure delivery of CI, CD and IaC for NTT tooling, and as templates for downstream teams Provide technical leadership and mentorship to development teams and lead engineers Stay current with emerging technologies and industry trends Ensure data security and implement best practices using tools like Snyk Collaborate with data scientists and analysts to enable efficient data access and analysis Evaluate and recommend new technologies to improve data architecture Position Overview: We are seeking a highly skilled and experienced Data Architect to join our dynamic team. The ideal candidate will have a strong background in designing and implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies. This role requires a deep understanding of data architecture, cloud services, and the ability to drive innovative solutions to meet business needs.
Job Segment: Developer, Solution Architect, Data Warehouse, Computer Science, Database, Technology
Posted 1 month ago
2 - 6 years
8 - 12 Lacs
Bengaluru
Work from Office
Req ID: 306668 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Digital Engineering Sr. Staff Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN). Job Title: Lead Data Engineer (Warehouse) Required Skills and Qualifications - 7+ years of experience in data engineering, of which at least 3+ years as a lead managing a data engineering team of 5+ - Experience in AWS cloud services - Expertise with Python and SQL - Experience of using Git / GitHub for source control management - Experience with Snowflake - Strong understanding of lakehouse architectures and best practices - Strong problem-solving skills and ability to think critically - Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders - Strong use of version control and proven ability to govern a team in the best practice use of version control - Strong understanding of Agile and proven ability to govern a team in the best practice use of Agile methodologies Preferred Skills and Qualifications - An understanding of Lakehouses - An understanding of Apache Iceberg tables - An understanding of data cataloguing - Knowledge of Apache Airflow for data orchestration - An understanding of DBT - SnowPro Core certification - Bachelor's degree in Computer Science, Engineering, or related field Key Responsibilities Lead and direct a small team of engineers engaged in - Engineer end-to-end data solutions using AWS services, including Lambda, S3, Snowflake, DBT, Apache Airflow - Cataloguing data - Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions - Providing best-in-class documentation for downstream teams to develop, test and run data products built using our tools - Testing our tooling, and providing a
framework for downstream teams to test their utilisation of our products - Helping to deliver CI, CD and IaC for both our own tooling, and as templates for downstream teams - Use DBT projects to define re-usable pipelines Position Overview: We are seeking a highly skilled and experienced Lead Data Engineer to join our dynamic team. The ideal candidate will have a strong background in implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies, leading teams and directing engineering workloads. This role requires a deep understanding of data engineering, cloud services, and the ability to implement high-quality solutions.
Job Segment: Computer Science, Database, SQL, Consulting, Technology
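The orchestration work this posting describes (Airflow DAGs over Lambda/S3/Snowflake tasks) boils down to ordering tasks by their dependencies. A minimal sketch with a hypothetical task graph:

```python
# Sketch of orchestration-style dependency resolution, the core of what a
# scheduler like Apache Airflow does for a DAG of pipeline tasks: compute
# an execution order in which every task runs after its upstream tasks.
# Task names are hypothetical; cycle detection is omitted for brevity.

def topo_order(deps):
    """deps maps task -> set of upstream tasks it waits on."""
    order, done = [], set()

    def visit(task):
        if task in done:
            return
        for upstream in sorted(deps.get(task, ())):
            visit(upstream)  # ensure upstreams are scheduled first
        done.add(task)
        order.append(task)

    for task in sorted(deps):
        visit(task)
    return order

deps = {
    "load_warehouse": {"transform"},
    "transform": {"extract_orders", "extract_customers"},
    "extract_orders": set(),
    "extract_customers": set(),
}
plan = topo_order(deps)
```

Airflow layers scheduling, retries, and backfills on top of exactly this ordering; the sketch shows only the dependency-resolution core.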
Posted 1 month ago
4 - 9 years
16 - 20 Lacs
Bengaluru
Work from Office
Req ID: 301930 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Digital Solution Architect Lead Advisor to join our team in Bangalore, Karnataka (IN-KA), India (IN). Title: Data Solution Architect Position Overview: We are seeking a highly skilled and experienced Data Solution Architect to join our dynamic team. The ideal candidate will have a strong background in designing and implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies. This role requires a deep understanding of data architecture, cloud services, and the ability to drive innovative solutions to meet business needs. Required Skills and Qualifications - Bachelor's degree in Computer Science, Engineering, or related field - 7+ years of experience in data architecture and engineering - Strong expertise in AWS cloud services, particularly Lambda, SNS, S3, and EKS - Proficiency in Kafka/Confluent Kafka and Python - Experience with Snyk for security scanning and vulnerability management - Solid understanding of data streaming architectures and best practices - Strong problem-solving skills and ability to think critically - Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders Preferred Qualifications - Experience with Kafka Connect and Confluent Schema Registry - Familiarity with Atlan for data catalog and metadata management - Knowledge of Apache Flink for stream processing - Experience integrating with IBM MQ - Familiarity with SonarQube for code quality analysis - AWS certifications (e.g., AWS Certified Solutions Architect) - Experience with data modeling and database design - Knowledge of data privacy regulations and compliance requirements Key Responsibilities - Design and implement scalable data architectures using AWS
services and Kafka - Develop data ingestion, processing, and storage solutions using Python and AWS Lambda - Ensure data security and implement best practices using tools like Snyk - Optimize data pipelines for performance and cost-efficiency - Collaborate with data scientists and analysts to enable efficient data access and analysis - Implement data governance policies and procedures - Provide technical guidance and mentorship to junior team members - Evaluate and recommend new technologies to improve data architecture - Architect end-to-end data solutions using AWS services, including Lambda, SNS, S3, and EKS - Design and implement data streaming pipelines using Kafka/Confluent Kafka - Develop data processing applications using Python - Ensure data security and compliance throughout the architecture - Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions - Optimize data flows for performance, cost-efficiency, and scalability - Implement data governance and quality control measures - Provide technical leadership and mentorship to development teams - Stay current with emerging technologies and industry trends
NTT DATA is an equal opportunity employer and considers all applicants without regard to race, color, religion, citizenship, national origin, ancestry, age, sex, sexual orientation, gender identity, genetic information, physical or mental disability, veteran or marital status, or any other characteristic protected by law. We are committed to creating a diverse and inclusive environment for all employees. If you need assistance or an accommodation due to a disability, please inform your recruiter so that we may connect you with the appropriate team. Job Segment: Solution Architect, Consulting, Database, Computer Science, Technology
Posted 1 month ago
2 - 5 years
6 - 10 Lacs
Mumbai
Work from Office
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Systems Integration Specialist to join our team in Mumbai, Maharashtra (IN-MH), India (IN).
Cloud Security Lead
a. Experience: 6-8 years in cloud/security
b. Skills: Actively and aggressively fix security observations on AWS cloud - Security Hub, GuardDuty, Detective, Inspector, etc. Experience working on Check Point and Prisma Cloud will be an advantage. Should work toward definitive targets to improve the security posture.
NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer.
Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
Posted 1 month ago
3 - 6 years
2 - 6 Lacs
Hyderabad
Work from Office
ABOUT AMGEN
Amgen harnesses the best of biology and technology to fight the world’s toughest diseases, and make people’s lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what’s known today.
ABOUT THE ROLE
Role Description
We are seeking an experienced MDM Manager with 10–14 years of experience to lead strategic development and operations of our Master Data Management (MDM) platforms, with hands-on experience in Informatica or Reltio. This role will involve managing a team of data engineers, architects, and quality experts to deliver high-performance, scalable, and governed MDM solutions that align with enterprise data strategy. To succeed in this role, the candidate must have strong MDM experience along with Data Governance, DQ, and Data Cataloging implementation knowledge; candidates must have a minimum of 6-8 years of core MDM technical experience (within a total experience range of 10-14 years).
Roles & Responsibilities
- Lead the implementation and optimization of MDM solutions using Informatica or Reltio platforms.
- Define and drive enterprise-wide MDM architecture, including IDQ, data stewardship, and metadata workflows.
- Match/merge and survivorship strategy and implementation experience.
- Design and deliver MDM processes and data integrations using Unix, Python, and SQL.
- Collaborate with the backend data engineering team and the frontend custom UI team for strong integrations and a seamless, enhanced user experience.
- Manage cloud-based infrastructure using AWS and Databricks to ensure scalability and performance.
- Coordinate with business and IT stakeholders to align MDM capabilities with organizational goals.
- Establish data quality metrics and monitor compliance using automated profiling and validation tools.
- Promote data governance and contribute to enterprise data modeling and approval workflows (DCRs).
- Ensure data integrity, lineage, and traceability across MDM pipelines and solutions.
- Provide mentorship and technical leadership to junior team members and ensure project delivery timelines.
- Lead custom UI design for a better user experience in data stewardship.
Basic Qualifications and Experience
Master’s degree with 8-10 years of experience in Business, Engineering, IT or related field OR Bachelor’s degree with 10-14 years of experience in Business, Engineering, IT or related field OR Diploma with 14-16 years of experience in Business, Engineering, IT or related field
Functional Skills:
Must-Have Skills:
- Deep knowledge of MDM tools (Informatica, Reltio) and data quality frameworks (IDQ), from configuring data assets to building end-to-end data pipelines and integrations for data mastering and orchestration of ETL pipelines
- Very good understanding of reference data, hierarchies, and their integration with MDM
- Hands-on experience with custom workflows (AVOS, Eclipse, etc.)
- Strong experience with external data enrichment services like D&B, Address Doctor, etc.
- Strong experience with match/merge and survivorship rules strategy and implementation
- Strong experience with group fields, cross-reference data, and UUIDs
- Strong understanding of AWS cloud services and Databricks architecture
- Proficiency in Python, SQL, and Unix for data processing and orchestration
- Experience with data modeling, governance, and DCR lifecycle management
- Proven leadership and project management in large-scale MDM implementations
- Able to implement end-to-end integrations, including API-based, batch, and flat-file-based integrations
- Must have worked on at least 3 end-to-end MDM implementations
Good-to-Have Skills:
- Experience with Tableau or Power BI for reporting MDM insights
- Exposure to Agile practices and tools (JIRA, Confluence)
- Prior experience in Pharma/Life Sciences
- Understanding of compliance and regulatory considerations in master data
Professional Certifications
- Any MDM certification (e.g., Informatica, Reltio)
- Any Data Analysis certification (SQL)
- Any cloud certification (AWS or Azure)
Soft Skills:
- Strong analytical abilities to assess and improve master data processes and solutions
- Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders
- Effective problem-solving skills to address data-related issues and implement scalable solutions
- Ability to work effectively with global, virtual teams
EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 1 month ago
6 - 10 years
8 - 12 Lacs
Hyderabad
Work from Office
Join Amgen's Mission to Serve Patients
If you feel like you’re part of something bigger, it’s because you are. At Amgen, our shared mission—to serve patients—drives all that we do. It is key to our becoming one of the world’s leading biotechnology companies. We are global collaborators who achieve together—researching, manufacturing, and delivering ever-better products that reach over 10 million patients worldwide. It’s time for a career you can be proud of.
Digital Product Manager/Content Curator
Live
What you will do
Let’s do this. Let’s change the world. In this vital role, we are seeking a detail-oriented and research-savvy Content Curator to support our enterprise Search Program within the pharmaceutical sector. This role is critical to improving how scientists, researchers, clinicians, and business teams discover relevant, accurate, and well-structured information across vast internal and external data sources. You will curate, classify, and optimize content to ensure it is accessible, contextual, and aligned with regulatory standards.
- Curate scientific, clinical, regulatory, and commercial content for use within internal search platforms.
- Source and aggregate relevant content across various platforms.
- Ensure high-value content is properly tagged, described, and categorized using standard metadata and taxonomies.
- Identify and fill content gaps based on user needs and search behavior.
- Organize and schedule content publication to maintain consistency.
- Analyze content performance and make data-driven decisions to optimize engagement.
- Provide feedback and input on synonym lists, controlled vocabularies, and NLP enrichment tools.
- Apply and help maintain consistent metadata standards, ontologies, and classification schemes (e.g., MeSH, SNOMED, MedDRA).
- Work with taxonomy and knowledge management teams to evolve tagging strategies and improve content discoverability.
- Capture and highlight the best content from a wide range of topics
- Stay up to date on best practices and make recommendations for content strategy
- Edit and optimize content for search engine optimization
- Perform quality assurance checks on all content before publication
- Identify and track metrics to measure the success of content curation efforts
- Review and curate content from a wide variety of categories with a focus
- Understanding of fundamental data structures and algorithms
- Understanding how to optimize content for search engines is important for visibility
- Experience in identifying, organizing, and sharing content
- Ability to clearly and concisely communicate complex information
- Ability to analyze data and track the performance of content
- Ability to quickly adapt to changing information landscapes and find new resources
- A deep understanding of Google Cloud Platform services and technologies is crucial and will be an added advantage
- Check and update digital assets regularly and, if needed, modify their accessibility and security settings
- Investigate, secure, and properly document permission clearance to publish data, graphics, videos, and other media
- Develop and manage a system for storing and organizing digital material
- Convert collected assets to a different digital format and discard material that is no longer relevant or needed
- Investigate new trends and tools connected with the generation and curation of digital material
Basic Qualifications:
- Degree in Data Management, Mass Communication, or Computer Science & Engineering preferred, with 9-12 years of software development experience
- 5+ years of experience in (digital) content curation or a related position
- Excellent organizational and time-management skills
- Ability to analyze data and derive insights for content optimization
- Familiarity with metadata standards, taxonomy tools, and content management systems
- Ability to interpret scientific or clinical content and structure it for digital platforms
- Exceptional written and verbal communication skills
- Experience with Content Management Systems (CMS), SEO, Google Analytics, GXP Search Engine/Solr search, enterprise search platforms, and Databricks
- Strong problem-solving and analytical skills; ability to learn quickly; excellent communication and interpersonal skills
- Excellent organizational and time-management skills
Preferred Qualifications:
- Experience with enterprise search platforms (e.g., Lucene, Elasticsearch, Coveo, Sinequa)
- Experience with GCP Cloud/AWS Cloud/Azure Cloud
- Experience with GXP Search Engine/Solr search
- Experience with PostgreSQL/MongoDB, SQL databases, vector databases for large language models, Databricks or RDS, DynamoDB, S3
- Experience in Agile software development methodologies
Good-to-Have Skills:
- Willingness to work on AI applications
- Experience with popular large language models
- Experience with the LangChain or LlamaIndex frameworks for language models
- Experience with prompt engineering and model fine-tuning
- Knowledge of NLP techniques for text analysis and sentiment analysis
Soft Skills:
- Excellent analytical and troubleshooting skills
- Strong verbal and written communication skills
- Ability to work effectively with global teams
- High degree of initiative and self-motivation
- Team-oriented, with a focus on achieving team goals
- Strong presentation and public speaking skills
Thrive
What you can expect from us
As we work to develop treatments that take care of others, we also work to care for our teammates’ professional and personal growth and well-being. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
Apply now for a career that defies imagination
In our quest to serve patients above all else, Amgen is the first to imagine, and the last to doubt. Join us.
careers.amgen.com
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 1 month ago
14 - 20 years
40 - 45 Lacs
Bengaluru
Work from Office
About the Company
Ness is a full lifecycle digital engineering firm offering digital advisory through scaled engineering services. With 20+ years of specialization in product and platform engineering, Ness is a global leader in digital transformation. We design, build, and integrate digital platforms and enterprise software that help organizations engage with customers, differentiate their brands, and drive profitable growth. Our experience designers, software engineers, data experts, and business consultants partner with clients to develop roadmaps that identify ongoing opportunities to increase the value of their digital solutions and enterprise systems. The exciting work happens through 11 innovation hubs with 5000+ Nessians located across the globe. Please visit our website www.ness.com to learn about our work.
We are inviting applications for Engineering Manager. In this role you would work towards developing the organization’s strategy for using technological resources: ensuring technologies are used efficiently, profitably, and securely, and evaluating and implementing new solutions.
Roles & Responsibilities:
- Should have 14+ years of experience working in product development organizations, with proven experience developing enterprise-scale products in a highly Agile/Scrum environment.
- Should be able to manage releases for products having multiple live versions and multiple releases through the year.
- Ability to manage delivery with 20+ engineers, including architecture, design, code reviews, and people's career management, with good exposure to engineering processes, product delivery, playbooks, frameworks, etc.
- Specific responsibilities include driving the team on innovation and implementation, creating and reviewing architectural designs, mentoring the team, and honing its engineering skills.
- Strong knowledge of the Java/Spring technical stack, databases (SQL Server, Oracle), modern JS frameworks like React, AWS cloud, and design and architectural patterns and frameworks
- Good understanding of application security, performance and quality, and DevOps processes
- Very good knowledge of software development tools, patterns, and processes (Agile principles, Scrum, SAFe)
- Collaborate with architects, product management, and engineering teams to create solutions that increase the platform's value.
- Create technical specifications, prototypes, and presentations to communicate your ideas.
- Well-versed in emerging industry technologies and trends, with the ability to communicate that knowledge to the team and influence product direction.
- Own progress of the product through the development life cycle, identifying risks and opportunities, and ensuring visibility to senior leadership.
- Partner with product management to define and refine our product roadmap, user experience, priorities, and schedule.
- Excellent critical thinking, analytical, problem-solving, and solutioning skills with a customer-first mindset.
Good to have:
- Highly motivated, with the ability to convert vague and ill-defined problems into well-defined ones, take initiative, and encourage consensus building in the team.
- Strong written and verbal communication and articulation skills.
- Demonstrable project management, stakeholder management, and organizational skills.
- Proven ability to lead in a matrix environment.
- Strong interpersonal and talent management skills, including the ability to identify and develop product management talent.
Posted 1 month ago
8 - 13 years
25 - 40 Lacs
Pune
Work from Office
Role & responsibilities
- Design and develop infrastructure solutions to support business functions, processes, and applications.
- Responsible for designing and building AWS cloud infrastructure solutions and automation frameworks.
- Participate in the creation of new infrastructure and modification of existing infrastructure environments in the cloud.
- Communicate with stakeholders and development teams to assist in coordinating the successful delivery of tools and software.
- Continuously improve the existing infrastructure by identifying gaps and ticket trends.
- Develop, evaluate, and make recommendations for alternative infrastructure solutions.
- Stay current with new technology options and cloud infrastructure solutions.
- Build Infrastructure as Code services, i.e., CloudFormation and/or Terraform.
- Set up and drive a standard enterprise process for cloud infrastructure setup and deployment.
- Help other development and engineering teams resolve application-to-platform integration issues for Platform as a Service (PaaS) and Infrastructure as a Service (IaaS) services.
- Work closely with stakeholders and product teams to gather technical and non-technical requirements and translate them into well-architected solutions.
- Stay up to date with the latest developments in cloud services and bring that knowledge and best practices to the team.
Preferred candidate profile
- Hands-on experience designing, implementing, and managing cloud-based solutions, ensuring scalability, reliability, security, compliance, and performance across various AWS services.
- Collaborate with development, operations, and architecture teams to integrate with AWS services.
- Optimize cost management by setting up budgets, analysing cost usage using tools like Cost Explorer, and providing recommendations for cost-saving measures.
- Develop migration strategies and lead execution of migration activities.
- Implement security controls and align the cloud environment with relevant industry regulations (e.g., GDPR, HIPAA).
- Conduct security assessments and set up control towers.
- Develop recovery strategies: design disaster recovery (DR) and business continuity (BC) plans.
- Develop and maintain cloud governance frameworks, policies, and procedures.
- Develop team proficiency in infrastructure automation tools like Terraform, AWS CloudFormation, and Azure Resource Manager (ARM) templates.
- Should be proficient in scripting languages like PowerShell, infrastructure automation frameworks, and CI/CD pipelines to automate configuration management, provisioning, and deployment processes.
Posted 1 month ago
8 - 13 years
30 - 35 Lacs
Pune
Hybrid
We are seeking a highly skilled and experienced Cloud Solution Architect to lead the design, implementation, and management of cloud-based solutions across Azure and AWS. The ideal candidate will have deep expertise in cloud architecture, infrastructure, security, and automation, while demonstrating strong capabilities in proposal writing, RFPs, architecture diagrams, and technical presentations.
Key Responsibilities:
- Cloud Architecture & Design: Design and implement scalable, secure, and cost-effective cloud architectures across Azure and AWS.
- Pre-Sales & Solutioning: Engage with clients to understand business requirements and develop cloud solutions, including responding to RFPs and crafting technical proposals.
- Technical Documentation & Diagrams: Create high-level and low-level architecture diagrams, design documents, and best practices for cloud adoption.
- Presentations & Stakeholder Communication: Deliver technical presentations, PPTs, and cloud strategy discussions to clients and internal teams.
- Migration & Modernization: Assess and execute cloud migration strategies, including lift-and-shift, re-platforming, and re-architecting applications.
- Security & Compliance: Ensure cloud solutions comply with industry standards such as SOC 2, ISO 27001, NIST, and CIS benchmarks.
- Automation & Optimization: Utilize Infrastructure-as-Code (IaC) tools like Terraform, ARM Templates, or CloudFormation for automation.
- Collaboration & Leadership: Work closely with DevOps, engineering, and security teams to drive cloud adoption and best practices.
Required Skills & Experience:
- 15+ years of experience in IT, with 10+ years in cloud architecture (Azure, AWS).
- Expertise in Azure services (VMs, AKS, AAD, Networking, Security, Storage, etc.) and AWS services (EC2, RDS, Lambda, VPC, IAM, etc.).
- Strong experience with architecture frameworks like TOGAF and the Well-Architected Framework (Azure & AWS).
- Hands-on experience with Infrastructure as Code (Terraform, ARM, CloudFormation) and automation using PowerShell, Python, or Bash.
- Knowledge of cloud security, identity & access management, and compliance frameworks.
- Experience working on RFPs, proposals, and pre-sales activities.
- Strong communication and presentation skills, with the ability to create and deliver PPTs, whitepapers, and technical documentation.
- Experience with hybrid cloud solutions, multi-cloud strategies, and cloud governance.
- Understanding of networking, VPNs, firewalls, load balancing, and DNS in a cloud environment.
- Certifications such as Azure Solutions Architect Expert or AWS Certified Solutions Architect.
Posted 1 month ago