Home
Jobs

295 Lambda Expressions Jobs - Page 12

Filter Interviews
Min: 0 years
Max: 25 years
Min: ₹0
Max: ₹10000000
Set up a Job Alert
Filter
JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

3 - 7 years

8 - 12 Lacs

Pune

Work from Office

Naukri logo

Cloud Administrator (AWS Operations) - Pune, India | Enterprise IT - 22705. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about.

Cloud Administrator / AWS: We seek a professional IT administrator to join our Pune, India office. This role is responsible for deploying and administering a cloud-based computing platform within ZS.

What You'll Do:
- Participate in AWS deployment, configuration, and optimization to provide a cloud-based platform that addresses business problems across multiple client engagements;
- Leverage information from the requirements-gathering phase and draw on past experience to implement a flexible and scalable solution;
- Collaborate with other team members (involved in the requirements-gathering, testing, roll-out, and operations phases) to ensure seamless transitions;
- Operate scalable, highly available, and fault-tolerant systems on AWS;
- Control the flow of data to and from AWS;
- Apply appropriate AWS operational best practices;
- Configure and understand VPC networking and its associated nuances for AWS infrastructure;
- Configure and understand AWS IAM policies;
- Configure secured AWS infrastructure;
- Manage users and access for different AWS services.

What You'll Bring:
- Bachelor's degree in CS, IT, or EE
- 1-2 years of experience in AWS administration, application deployment, and configuration management
- Good knowledge of AWS services (EC2, S3, IAM, VPC, Lambda, EMR, CloudFront, Elastic Load Balancer, etc.)
- Basic knowledge of CI/CD tools such as JetBrains TeamCity, SVN, and Bitbucket
- Good knowledge of Windows and Linux operating system administration
- Basic knowledge of RDBMS and database technologies such as SQL Server and PostgreSQL
- Basic knowledge of web server software such as IIS or Apache Tomcat
- Experience with a scripting language such as PowerShell, Python, or Bash/shell scripting
- Knowledge of DevOps methodology
- Strong verbal, written, and team presentation communication skills

Perks & Benefits: ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth, and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths, and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.

Travel: Travel is a requirement at ZS for client-facing ZSers; the business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.

Considering applying? At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.
To complete your application: Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE. Find Out More At
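The IAM duties listed above (configuring IAM policies, user and access management) can be sketched with boto3. This is a minimal illustration, not the posting's actual setup: the policy name, bucket, and policy document are assumptions.

```python
import json


def build_readonly_s3_policy(bucket: str) -> dict:
    """Build an IAM policy document granting read-only access to one bucket.

    The statement structure follows the standard IAM policy grammar;
    ListBucket applies to the bucket ARN, GetObject to its objects.
    """
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": ["s3:GetObject", "s3:ListBucket"],
                "Resource": [
                    f"arn:aws:s3:::{bucket}",
                    f"arn:aws:s3:::{bucket}/*",
                ],
            }
        ],
    }


if __name__ == "__main__":
    # Creating the policy requires valid AWS credentials; this call is
    # illustrative and uses a hypothetical policy and bucket name.
    import boto3

    iam = boto3.client("iam")
    iam.create_policy(
        PolicyName="example-s3-readonly",
        PolicyDocument=json.dumps(build_readonly_s3_policy("example-bucket")),
    )
```

Keeping the policy document as a pure function makes the access rules reviewable and testable without touching AWS.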

Posted 1 month ago

Apply

5 - 10 years

10 - 14 Lacs

Hyderabad

Work from Office


Project Role: Cloud Platform Engineer
Project Role Description: Designs, builds, tests, and deploys cloud application solutions that integrate cloud and non-cloud infrastructure. Can deploy infrastructure and platform environments, and creates a proof of architecture to test architecture viability, security, and performance.
Must-have skills: DevOps Architecture
Good-to-have skills: Microsoft Azure Architecture, AWS Administration
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary / Roles & Responsibilities: The Cloud Solution Consultant is a member of the Global Digital organization and is responsible for professional/technical work related to public cloud platform usage across the enterprise. This is an advanced position, primarily responsible for developing and supporting the implementation of solutions using public cloud platforms and the underlying architecture. These implementations span a broad range of vendor-supplied global services and solutions utilized across the enterprise. The incumbent leads and participates in a broad spectrum of cloud topics (securing alignment, reusability, and partnership) across the enterprise where public cloud services are utilized. He/she will work in close collaboration with the Cyber and Global Infrastructure Services teams to lead the delivery of solutions across the enterprise. This role will directly support digital transformation, technology strategy, and cloud adoption as part of the Global Digital team. This team is constantly confronted with business challenges, and the Cloud Solution Consultant will help cultivate the understanding of high-level, loosely specified requirements to transform them into value for our business through direct contribution to the design, architecture, and implementation of solutions on public cloud platforms.
This associate is responsible for communicating and influencing technology decisions to both technical and non-technical audiences. This position will support the Operations team by instilling the right mindset and securing adequate tools, methodology, and metrics. The incumbent must embrace corporate values and digital/analytic principles. In addition, he/she needs to secure adherence to the vision and enable an inclusive (open to diversity) working environment, valuing networks and communities of practice.

Professional & Technical Skills:
- Must-have skills: Proficiency in cloud infrastructure; strong understanding of cloud computing platforms on AWS and Azure.
- Multi-cloud architecture and deployment: Demonstrated experience designing, deploying, and managing complex cloud architectures across both AWS (e.g., VPC, EC2, Lambda) and Azure (e.g., Virtual Networks, VMs, Azure Functions).
- Proficiency with IaC tools such as Terraform, AWS CloudFormation, or Azure Resource Manager (ARM) templates.
- Hands-on experience with CI/CD pipelines and automation tools like Jenkins or GitLab.
- Knowledge of containerization technologies like Docker and Kubernetes.
- Security and compliance: Proven track record implementing best practices for identity management and well-designed architecture.

Additional Information:
- The candidate should have a minimum of 5 years of experience in cloud infrastructure.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Qualification: 15 years of full-time education
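The IaC proficiency the posting asks for (Terraform, CloudFormation, ARM templates) can be illustrated with a minimal CloudFormation template assembled in Python and deployed via boto3. The stack name, function name, runtime, and role ARN below are hypothetical, not taken from the posting.

```python
import json


def lambda_stack_template(function_name: str, role_arn: str) -> dict:
    """Minimal CloudFormation template declaring one inline Lambda function.

    The handler/runtime values are illustrative; Code.ZipFile is the
    CloudFormation mechanism for small inline function bodies.
    """
    return {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {
            "AppFunction": {
                "Type": "AWS::Lambda::Function",
                "Properties": {
                    "FunctionName": function_name,
                    "Runtime": "python3.12",
                    "Handler": "index.handler",
                    "Role": role_arn,
                    "Code": {
                        "ZipFile": "def handler(event, context):\n"
                                   "    return {'ok': True}\n"
                    },
                },
            }
        },
    }


if __name__ == "__main__":
    # Deploying requires AWS credentials; the names below are hypothetical.
    import boto3

    cfn = boto3.client("cloudformation")
    cfn.create_stack(
        StackName="example-lambda-stack",
        TemplateBody=json.dumps(lambda_stack_template(
            "example-fn", "arn:aws:iam::123456789012:role/example-role")),
        Capabilities=["CAPABILITY_NAMED_IAM"],
    )
```

Generating the template as a plain dict keeps it diffable and unit-testable before any cloud API is called, which is the core habit the IaC tooling above encourages.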

Posted 1 month ago

Apply

5 - 10 years

10 - 14 Lacs

Hyderabad

Work from Office


Project Role: Cloud Platform Engineer
Project Role Description: Designs, builds, tests, and deploys cloud application solutions that integrate cloud and non-cloud infrastructure. Can deploy infrastructure and platform environments, and creates a proof of architecture to test architecture viability, security, and performance.
Must-have skills: DevOps Architecture
Good-to-have skills: AWS Architecture, Microsoft Azure Architecture
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary / Roles & Responsibilities: The Cloud Solution Consultant is a member of the Global Digital organization and is responsible for professional/technical work related to public cloud platform usage across the enterprise. This is an advanced position, primarily responsible for developing and supporting the implementation of solutions using public cloud platforms and the underlying architecture. These implementations span a broad range of vendor-supplied global services and solutions utilized across the enterprise. The incumbent leads and participates in a broad spectrum of cloud topics (securing alignment, reusability, and partnership) across the enterprise where public cloud services are utilized. He/she will work in close collaboration with the Cyber and Global Infrastructure Services teams to lead the delivery of solutions across the enterprise. This role will directly support digital transformation, technology strategy, and cloud adoption as part of the Global Digital team. This team is constantly confronted with business challenges, and the Cloud Solution Consultant will help cultivate the understanding of high-level, loosely specified requirements to transform them into value for our business through direct contribution to the design, architecture, and implementation of solutions on public cloud platforms.
This associate is responsible for communicating and influencing technology decisions to both technical and non-technical audiences. This position will support the Operations team by instilling the right mindset and securing adequate tools, methodology, and metrics. The incumbent must embrace corporate values and digital/analytic principles. In addition, he/she needs to secure adherence to the vision and enable an inclusive (open to diversity) working environment, valuing networks and communities of practice.

Professional & Technical Skills:
- Must-have skills: Proficiency in cloud infrastructure; strong understanding of cloud computing platforms on AWS and Azure.
- Multi-cloud architecture and deployment: Demonstrated experience designing, deploying, and managing complex cloud architectures across both AWS (e.g., VPC, EC2, Lambda) and Azure (e.g., Virtual Networks, VMs, Azure Functions).
- Proficiency with IaC tools such as Terraform, AWS CloudFormation, or Azure Resource Manager (ARM) templates.
- Hands-on experience with CI/CD pipelines and automation tools like Jenkins or GitLab.
- Knowledge of containerization technologies like Docker and Kubernetes.
- Security and compliance: Proven track record implementing best practices for identity management and well-designed architecture.

Additional Information:
- The candidate should have a minimum of 5 years of experience in cloud infrastructure.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Qualification: 15 years of full-time education

Posted 1 month ago

Apply

5 - 10 years

5 - 9 Lacs

Hyderabad

Work from Office


Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Spring Boot
Good-to-have skills: GitHub, Microservices and Light Weight Architecture
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with teams to develop innovative solutions and ensure seamless application functionality.

Roles & Responsibilities:
- 4-8 years of professional software development experience in Java
- Excellent knowledge of design patterns and architecture
- Extensive server-side software development experience
- Solid hands-on experience with J2EE frameworks such as Spring, and adherence to SOLID principles
- Hands-on experience developing microservices with Spring Boot
- Strong experience writing unit tests using JUnit
- Strong SQL skills, with experience in databases including Oracle and MySQL
- Experience with JDBC and ORM persistence technologies (JPA, Hibernate)
- Good understanding of developing serverless applications using AWS Lambda
- Experience with Docker and Kubernetes
- Core Java: OOP, collections, concurrency
- Spring Core and Boot: dependency injection, REST, auto-configuration
- Microservices: service discovery, API gateway, communication
- AWS: Docker, Kubernetes, EC2, Lambda, API Gateway, IAM roles
- Design patterns
- SQL and databases: queries, indexing, transactions

Professional & Technical Skills:
- Must-have skills: Proficiency in Spring Boot
- Good-to-have skills: Experience with Microservices and Light Weight Architecture
- Strong understanding of RESTful web services
- Knowledge of cloud platforms like AWS or Azure
- Experience with version control systems like Git

Additional Information:
- The candidate should have a minimum of 5 years of experience in Spring Boot.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Qualification: 15 years of full-time education
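The serverless requirement above (developing applications with AWS Lambda behind an API gateway) can be sketched as a minimal Python handler. The event shape follows the API Gateway proxy integration; the greeting logic itself is purely illustrative.

```python
import json


def handler(event, context):
    """Minimal AWS Lambda handler for an API Gateway proxy integration.

    Reads an optional 'name' query parameter from the proxy event and
    returns the statusCode/headers/body structure API Gateway expects.
    """
    params = (event or {}).get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```

The same function can be invoked locally for unit tests, e.g. `handler({"queryStringParameters": {"name": "dev"}}, None)`, before it is packaged and deployed.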

Posted 1 month ago

Apply

5 - 10 years

5 - 9 Lacs

Hyderabad

Work from Office


Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: The Data Engineering Sr. Advisor demonstrates expertise in data engineering technologies with a focus on engineering, innovation, strategic influence, and a product mindset. This individual will act as a key contributor on the team to design, build, test, and deliver large-scale software applications, systems, platforms, services, or technologies in the data engineering space, and will have the opportunity to work directly with partner IT and business teams, owning and driving major deliverables across all aspects of software delivery. The candidate will play a key role in automating processes on Databricks and AWS, collaborating with business and technology partners to gather requirements and to develop and implement solutions. The individual must have strong analytical and technical skills coupled with the ability to positively influence the delivery of data engineering products. The applicant will be working in a team that demands an innovation-first, cloud-first, self-service-first, and automation-first mindset coupled with technical excellence.
The applicant will be working with internal and external stakeholders and customers to build solutions as part of Enterprise Data Engineering, and will need to demonstrate very strong technical and communication skills.

- Delivery - Intermediate delivery skills, including the ability to deliver work at a steady, predictable pace to achieve commitments, decompose work assignments into small-batch releases, and contribute to tradeoff and negotiation discussions.
- Domain Expertise - A demonstrated track record of domain expertise, including the ability to understand the technical concepts necessary to do the job effectively, demonstrate willingness, cooperation, and concern for business issues, and possess in-depth knowledge of the immediate systems worked on.
- Problem Solving - Proven problem-solving skills, including debugging skills that allow you to determine the source of issues in unfamiliar code or systems; the ability to recognize and solve repetitive problems rather than working around them; recognizing mistakes and using them as learning opportunities; and breaking down large problems into smaller, more manageable ones.

Role & Responsibilities:
- Deliver business needs end to end, from requirements through development into production.
- Through a hands-on engineering approach in the Databricks environment, deliver data engineering toolchains, platform capabilities, and reusable patterns.
- Follow software engineering best practices with an automation-first approach and a continuous learning and improvement mindset.
- Ensure adherence to enterprise architecture direction and architectural standards.
- Collaborate in a high-performing team environment, with an ability to influence and be influenced by others.

Experience Required:
- More than 12 years of experience in software engineering, building data engineering pipelines, middleware and API development, and automation
- More than 3 years of experience in Databricks within an AWS environment
- Data engineering experience

Experience Desired:
- Expertise in Agile software development principles and patterns
- Expertise in building streaming, batch, and event-driven architectures and data pipelines

Primary Skills:
- Cloud-based security principles and protocols such as OAuth2, JWT, data encryption, hashing, and secret management
- Expertise in big data technologies such as Spark, Hadoop, Databricks, Snowflake, EMR, and Glue
- Good understanding of Kafka, Kafka Streams, Spark Structured Streaming, and configuration-driven data transformation and curation
- Expertise in building cloud-native microservices, containers, Kubernetes, and platform-as-a-service technologies such as OpenShift and Cloud Foundry
- Experience with multi-cloud software-as-a-service products such as Databricks and Snowflake
- Experience with Infrastructure-as-Code (IaC) tools such as Terraform and AWS CloudFormation
- Experience with messaging systems such as Apache ActiveMQ, WebSphere MQ, Apache Artemis, Kafka, and AWS SNS
- Experience with API and microservices stacks such as Spring Boot and Quarkus
- Expertise in cloud technologies such as AWS Glue, Lambda, S3, Elasticsearch, API Gateway, and CloudFront
- Experience with one or more of the following programming and scripting languages - Python, Scala, JVM-based languages, or JavaScript - and the ability to pick up new languages
- Experience building CI/CD pipelines using Jenkins or GitHub Actions
- Strong expertise with source code management and its best practices
- Proficiency in self-testing of applications, unit testing, use of mock frameworks, and test-driven development (TDD)
- Knowledge of the Behavior-Driven Development (BDD) approach

Additional Skills:
- Ability to perform detailed analysis of business problems and technical environments
- Strong oral and written communication skills
- Ability to think strategically, implement iteratively, and estimate the financial impact of design/architecture alternatives
- Continuous focus on ongoing learning and development

Qualification: 15 years of full-time education
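One of the primary skills listed above, configuration-driven data transformation and curation, can be sketched in plain Python: transformation rules live in data (which could be loaded from JSON or YAML) rather than in code. The rule vocabulary (rename/upper/default) and the field names are assumptions for illustration, not any particular platform's schema.

```python
def apply_rules(record: dict, rules: list) -> dict:
    """Apply an ordered list of declarative transformation rules to a record.

    Each rule names a field and an operation; adding a new transformation
    means adding a rule entry, not changing pipeline code.
    """
    out = dict(record)
    for rule in rules:
        field, op = rule["field"], rule["op"]
        if op == "rename":
            out[rule["to"]] = out.pop(field)
        elif op == "upper":
            out[field] = str(out[field]).upper()
        elif op == "default":
            out.setdefault(field, rule["value"])
        else:
            raise ValueError(f"unknown op: {op}")
    return out


# Hypothetical rule set: normalize a column name, uppercase it, backfill a default.
rules = [
    {"field": "cust_nm", "op": "rename", "to": "customer_name"},
    {"field": "customer_name", "op": "upper"},
    {"field": "country", "op": "default", "value": "IN"},
]
```

In a Databricks/Spark setting the same rule list would typically be compiled into DataFrame column expressions, but the configuration-driven shape is identical.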

Posted 1 month ago

Apply

3 - 5 years

0 - 0 Lacs

Kochi

Work from Office


Job Summary: We are seeking a highly skilled Senior Python Developer with expertise in Machine Learning (ML), Large Language Models (LLMs), and cloud technologies. The ideal candidate will be responsible for end-to-end execution, from requirement analysis and discovery to the design, development, and implementation of ML-driven solutions. The role demands both technical excellence and strong communication skills to work directly with clients, delivering POCs, MVPs, and scalable production systems.

Key Responsibilities:
- Collaborate with clients to understand business needs and identify ML-driven opportunities.
- Independently design and develop robust ML models, time-series models, deep learning solutions, and LLM-based systems.
- Deliver Proof of Concepts (POCs) and Minimum Viable Products (MVPs) with agility and innovation.
- Architect and optimize Python-based ML applications, focusing on performance and scalability.
- Utilize GitHub for version control, collaboration, and CI/CD automation.
- Deploy ML models on cloud platforms such as AWS, Azure, or GCP.
- Follow best practices in software development, including clean code, automated testing, and thorough documentation.
- Stay updated with evolving trends in ML, LLMs, and the cloud ecosystem.
- Work collaboratively with data scientists, DevOps engineers, and business analysts.

Must-Have Skills:
- Strong programming experience in Python and frameworks such as FastAPI, Flask, or Django.
- Solid hands-on expertise in ML using Scikit-learn, TensorFlow, PyTorch, Prophet, etc.
- Experience with LLMs (e.g., OpenAI, LangChain, Hugging Face, vector search).
- Proficiency in cloud services like AWS (S3, Lambda, SageMaker), Azure ML, or GCP Vertex AI.
- Strong grasp of software engineering concepts: OOP, design patterns, data structures.
- Experience with version control systems (Git/GitHub/GitLab) and setting up CI/CD pipelines.
- Ability to work independently and solve complex problems with minimal supervision.
- Excellent communication and client interaction skills.

Required Skills: Python, Machine Learning, Machine Learning Models
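The POC-to-MVP workflow described above can be illustrated with a deliberately tiny model: fitting a line by ordinary least squares in pure Python. In a real engagement the posting's stack (Scikit-learn, TensorFlow, PyTorch) would replace this hand-rolled estimator; the point is the fit/predict shape, not the math library.

```python
def fit_line(xs, ys):
    """Fit y = a*x + b by ordinary least squares (closed form).

    Returns (a, b). Uses the covariance/variance formulation:
    a = cov(x, y) / var(x), b = mean(y) - a * mean(x).
    """
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var
    return a, my - a * mx


def predict(model, x):
    """Apply a fitted (a, b) model to a new input."""
    a, b = model
    return a * x + b


# Toy data lying exactly on y = 2x, so the fit is exact.
model = fit_line([1, 2, 3, 4], [2, 4, 6, 8])
```

Wrapping fit/predict behind functions like these is what makes the later steps (FastAPI endpoint, Lambda deployment, CI tests) straightforward to bolt on.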

Posted 1 month ago

Apply

5 - 7 years

0 - 0 Lacs

Thiruvananthapuram

Work from Office


Role Proficiency: Act creatively to develop applications and select appropriate technical options, optimizing application development, maintenance, and performance by employing design patterns and reusing proven solutions; account for others' developmental activities.

Outcomes:
- Interpret the application/feature/component design and develop it in accordance with specifications.
- Code, debug, test, document, and communicate product/component/feature development stages.
- Validate results with user representatives; integrate and commission the overall solution.
- Select appropriate technical options for development, such as reusing, improving, or reconfiguring existing components, or creating own solutions.
- Optimize efficiency, cost, and quality.
- Influence and improve customer satisfaction.
- Set FAST goals for self/team; provide feedback on team members' FAST goals.

Measures of Outcomes:
- Adherence to engineering processes and standards (coding standards)
- Adherence to project schedule/timelines
- Number of technical issues uncovered during project execution
- Number of defects in the code
- Number of defects post-delivery
- Number of non-compliance issues
- On-time completion of mandatory compliance trainings

Outputs Expected:
- Code: Code as per design; follow coding standards, templates, and checklists; review code for team and peers.
- Documentation: Create/review templates, checklists, guidelines, and standards for design/process/development; create/review deliverable documents, design documentation, and requirements test cases/results.
- Configure: Define and govern the configuration management plan; ensure compliance from the team.
- Test: Review and create unit test cases, scenarios, and execution; review the test plan created by the testing team; provide clarifications to the testing team.
- Domain relevance: Advise software developers on the design and development of features and components with a deep understanding of the business problem being addressed for the client.
- Learn more about the customer domain, identifying opportunities to provide valuable additions to customers; complete relevant domain certifications.
- Manage Project: Manage delivery of modules and/or manage user stories.
- Manage Defects: Perform defect RCA and mitigation; identify defect trends and take proactive measures to improve quality.
- Estimate: Create and provide input for effort estimation for projects.
- Manage Knowledge: Consume and contribute to project-related documents, SharePoint libraries, and client universities; review reusable documents created by the team.
- Release: Execute and monitor the release process.
- Design: Contribute to the creation of design (HLD, LLD, SAD)/architecture for applications/features/business components/data models.
- Interface with Customer: Clarify requirements and provide guidance to the development team; present design options to customers; conduct product demos.
- Manage Team: Set FAST goals and provide feedback; understand the aspirations of team members and provide guidance, opportunities, etc.; ensure the team is engaged in the project.
- Certifications: Take relevant domain/technology certifications.

Skill Examples:
- Explain and communicate the design/development to the customer
- Perform and evaluate test results against product specifications
- Break down complex problems into logical components
- Develop user interfaces and business software components
- Use data models
- Estimate the time and effort required for developing/debugging features/components
- Perform and evaluate tests in the customer or target environment
- Make quick decisions on technical/project-related challenges
- Manage a team; mentor and handle people-related issues within the team
- Maintain high motivation levels and positive dynamics in the team
- Interface with other teams, designers, and other parallel practices
- Set goals for self and team
- Provide feedback to team members
- Create and articulate impactful technical presentations
- Follow a high level of business etiquette in emails and other business communication
- Drive conference calls with customers, addressing customer questions
- Proactively ask for and offer help
- Work under pressure, determine dependencies and risks, facilitate planning, and handle multiple tasks
- Build confidence with customers by meeting deliverables on time and with quality
- Estimate the time, effort, and resources required for developing/debugging features/components
- Make appropriate utilization of software and hardware
- Strong analytical and problem-solving abilities

Knowledge Examples:
- Appropriate software programs/modules
- Functional and technical design
- Programming languages: proficiency in multiple skill clusters
- DBMS
- Operating systems and software platforms
- Software Development Life Cycle
- Agile (Scrum or Kanban) methods
- Integrated development environments (IDE)
- Rapid application development (RAD)
- Modelling technologies and languages
- Interface definition languages (IDL)
- Knowledge of the customer domain and a deep understanding of the sub-domain where the problem is solved

Additional Comments: Snowflake Developer Job Description

Position Overview: We are seeking an experienced Snowflake Developer with a proven track record in designing, implementing, and optimizing data solutions using Snowflake's cloud data platform. The ideal candidate will have extensive experience in data-loading processes from AWS S3 to Snowflake and be well versed in AWS services and DevOps practices.

Required Qualifications:
- Minimum 4 years of professional experience as a Snowflake Developer or in a similar data engineering role
- Strong expertise in Snowflake's architecture, features, and best practices
- Demonstrated experience loading data from AWS S3 to Snowflake using various methods (COPY, Snowpipe, etc.)
- Proficient in writing optimized SQL queries for Snowflake
- Experience with AWS services (S3, Lambda, IAM, etc.)
- Knowledge of CI/CD pipelines and AWS CloudFormation
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience)

Preferred Qualifications:
- Snowflake certification
- Experience with Snowflake DevOps practices
- Experience with version control tools (Git, GitHub, etc.)
- Experience with Python or another programming language for scripting and automation
- Understanding of data governance and security principles

Key Responsibilities:
- Design and implement efficient data-loading processes from S3 to Snowflake
- Create and maintain Snowflake objects (warehouses, databases, schemas, tables, views, stored procedures)
- Collaborate with data engineers, analysts, and business stakeholders
- Assist in establishing and maintaining CI/CD pipelines for Snowflake deployments
- Document processes, configurations, and implementations
- Support Snowflake maintenance activities, including user management and resource monitoring
- Troubleshoot and resolve data loading and processing issues

Skills:
- Advanced SQL knowledge
- AWS services (S3, Lambda, IAM, CloudFormation)
- Snowflake architecture and features
- Data integration and ETL processes
- CI/CD and DevOps practices
- Problem-solving and analytical thinking
- Effective communication and collaboration
- Strong experience in Python, Snowflake, and Streamlit
- Well aware of the AWS ecosystem
- Health insurance domain knowledge is nice to have

Required Skills: Python, Snowflake
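The core S3-to-Snowflake loading task above can be sketched by generating the COPY INTO statement the posting refers to. The stage, table, and file-format names here are hypothetical; a Snowpipe setup would wrap essentially the same statement inside a CREATE PIPE definition.

```python
def build_copy_into(table: str, stage: str, prefix: str, file_format: str) -> str:
    """Render a Snowflake COPY INTO statement for a named external stage.

    The @stage/prefix form points at S3 files registered behind the stage;
    ON_ERROR = 'ABORT_STATEMENT' fails the whole load on a bad file.
    """
    return (
        f"COPY INTO {table}\n"
        f"  FROM @{stage}/{prefix}\n"
        f"  FILE_FORMAT = (FORMAT_NAME = '{file_format}')\n"
        f"  ON_ERROR = 'ABORT_STATEMENT';"
    )


# Hypothetical names: a landing stage over S3 feeding a raw events table.
sql = build_copy_into("raw.events", "s3_landing_stage", "events/2024/", "csv_std")
```

Generating the SQL from parameters keeps load definitions consistent across environments and lets a CI pipeline review them as plain text before execution.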

Posted 1 month ago

Apply

5 - 10 years

30 - 32 Lacs

Bengaluru

Work from Office


About The Role: Data Engineering Manager (5-9 years) / Software Development Manager (9+ years), Kotak Mahindra Bank, Bengaluru, Karnataka, India (on-site).

What we offer: Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team: DEX is the central data org for Kotak Bank and manages the entire data experience of the bank. DEX stands for Kotak's Data Exchange. The org comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities for technology fellows to build things from scratch and create one of the best-in-class data lakehouse solutions. The primary skills this team should encompass are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to be a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters.
As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains in the current world; to be an early member in the digital transformation journey of Kotak; to learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions in a programmatic way; and to be futuristic, building systems that can be operated by machines using AI technologies. The data platform org is divided into three key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; managed compute and orchestration frameworks, including concepts of serverless data solutions; managing a central data warehouse for extremely high-concurrency use cases; building connectors for different sources; building a customer feature repository; building cost-optimization solutions like EMR optimizers; performing automations; and building observability capabilities for Kotak's data platform. The team will also be the center for data engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, be skilled at sourcing data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way, and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, cutting across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, and branch managers, and by all analytics use cases.

Data Governance: This will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform.
If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you. Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, and data management and governance tools.

BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager
- 10+ years of engineering experience, most of it in the data domain
- 5+ years of engineering team management experience
- 10+ years of experience planning, designing, developing and delivering consumer software
- Experience partnering with product or program management teams
- 5+ years of experience managing data engineers, business intelligence engineers and/or data scientists
- Experience designing or architecting (design patterns, reliability and scaling) new and existing systems
- Experience managing multiple concurrent programs, projects and development teams in an Agile environment
- Strong understanding of Data Platform, Data Engineering and Data Governance
- Experience designing and developing large-scale, high-traffic applications

PREFERRED QUALIFICATIONS
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech is desired.
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices for the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills

For managers:
- Customer centricity and obsession for the customer
- Ability to manage stakeholders (product owners, business stakeholders, cross-functional teams) and coach agile ways of working
- Ability to structure and organize teams and streamline communication
- Prior work experience executing large-scale data engineering projects
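The advanced-SQL and data-modelling skills this posting asks for can be illustrated with a miniature star schema. A minimal sketch: table and column names below are hypothetical, and SQLite stands in for a warehouse engine such as Redshift or Snowflake.

```python
import sqlite3

# Hypothetical star schema: a transaction fact table joined to a branch
# dimension. SQLite stands in for the warehouse engine; names are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_branch (branch_id INTEGER PRIMARY KEY, city TEXT);
CREATE TABLE fact_txn (txn_id INTEGER PRIMARY KEY,
                       branch_id INTEGER REFERENCES dim_branch(branch_id),
                       amount REAL);
INSERT INTO dim_branch VALUES (1, 'Bengaluru'), (2, 'Pune');
INSERT INTO fact_txn VALUES (10, 1, 500.0), (11, 1, 250.0), (12, 2, 100.0);
""")

# Aggregate the fact table over the dimension: total transaction value per city.
rows = conn.execute("""
    SELECT b.city, SUM(f.amount) AS branch_total
    FROM fact_txn f JOIN dim_branch b USING (branch_id)
    GROUP BY b.city ORDER BY branch_total DESC
""").fetchall()
print(rows)  # → [('Bengaluru', 750.0), ('Pune', 100.0)]
```

Separating facts from dimensions this way is the basic move behind the centralized reporting models the Data Engineering vertical describes.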

Posted 1 month ago

Apply

5 - 10 years

30 - 35 Lacs

Bengaluru

Work from Office

Naukri logo

About The Role

What we offer: Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by taking a technology-first approach in everything we do, with the aim of enhancing customer experience through superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team: DEX is the central data org for Kotak Bank and manages the bank's entire data experience. DEX stands for Kotak's Data Exchange. The org comprises the Data Platform, Data Engineering and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists a great opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills this team should encompass are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to be a 100+ member team, primarily based out of Bangalore and comprising ~10 sub-teams independently driving their charters.
As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member of Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch and analytics solutions, in a programmatic way; and help build forward-looking systems that can be operated by machines using AI technologies. The data platform org is divided into 3 key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank, a centralized data lake, managed compute and orchestration frameworks (including serverless data solutions), a central data warehouse for extremely high-concurrency use cases, connectors for different sources, a customer feature repository, cost-optimization solutions such as EMR optimizers, automations, and observability capabilities for Kotak's data platform. The team will also be the center of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data-consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers and all analytics use cases.

Data Governance: This will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship and the data quality platform.
If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you. Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, and data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data
- Bachelor's degree in Computer Science, Engineering, or a related field
- 3-5 years of experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager
- 10+ years of engineering experience, most of it in the data domain
- 5+ years of engineering team management experience
- 10+ years of experience planning, designing, developing and delivering consumer software
- Experience partnering with product or program management teams
- 5+ years of experience managing data engineers, business intelligence engineers and/or data scientists
- Experience designing or architecting (design patterns, reliability and scaling) new and existing systems
- Experience managing multiple concurrent programs, projects and development teams in an Agile environment
- Strong understanding of Data Platform, Data Engineering and Data Governance
- Experience designing and developing large-scale, high-traffic applications

PREFERRED QUALIFICATIONS
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech is desired.
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices for the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills

Posted 1 month ago

Apply

8 - 13 years

30 - 32 Lacs

Bengaluru

Work from Office

Naukri logo

About The Role: Data Engineer-2 (Experience: 2-5 years)

What we offer: Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by taking a technology-first approach in everything we do, with the aim of enhancing customer experience through superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team: DEX is the central data org for Kotak Bank and manages the bank's entire data experience. DEX stands for Kotak's Data Exchange. The org comprises the Data Platform, Data Engineering and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists a great opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills this team should encompass are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to be a 100+ member team, primarily based out of Bangalore and comprising ~10 sub-teams independently driving their charters.
As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member of Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch and analytics solutions, in a programmatic way; and help build forward-looking systems that can be operated by machines using AI technologies. The data platform org is divided into 3 key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank, a centralized data lake, managed compute and orchestration frameworks (including serverless data solutions), a central data warehouse for extremely high-concurrency use cases, connectors for different sources, a customer feature repository, cost-optimization solutions such as EMR optimizers, automations, and observability capabilities for Kotak's data platform. The team will also be the center of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data-consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers and all analytics use cases.

Data Governance: This will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship and the data quality platform.
If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you. Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, and data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data
- Bachelor's degree in Computer Science, Engineering, or a related field
- Experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

PREFERRED QUALIFICATIONS
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech is desired.
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices for the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills
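The data-quality best practices this posting mentions often take the form of declarative rules run against each dataset, in the spirit of the config-based pipelines described above. A minimal plain-Python sketch; the rule names, fields and thresholds are illustrative assumptions, not any bank's actual framework:

```python
# Minimal config-driven data-quality checker: each rule is a small dict,
# so checks can live in configuration rather than code. Illustrative only.

def run_checks(rows, rules):
    """Apply simple quality rules to a list of dict records.
    Returns {rule_name: passed_bool}."""
    results = {}
    for rule in rules:
        col = rule["column"]
        if rule["check"] == "not_null":
            null_rate = sum(r.get(col) is None for r in rows) / len(rows)
            results[rule["name"]] = null_rate <= rule["max_null_rate"]
        elif rule["check"] == "unique":
            vals = [r.get(col) for r in rows]
            results[rule["name"]] = len(vals) == len(set(vals))
    return results

rows = [
    {"account_id": 1, "balance": 100.0},
    {"account_id": 2, "balance": None},
    {"account_id": 3, "balance": 50.0},
]
rules = [
    {"name": "account_id_unique", "column": "account_id", "check": "unique"},
    {"name": "balance_mostly_set", "column": "balance",
     "check": "not_null", "max_null_rate": 0.5},
]
print(run_checks(rows, rules))
# → {'account_id_unique': True, 'balance_mostly_set': True}
```

Production pipelines typically fail or quarantine a dataset when a rule returns False, rather than just reporting it.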

Posted 1 month ago

Apply

2 - 5 years

6 - 10 Lacs

Bengaluru

Work from Office

Naukri logo

12+ years of overall IT experience; 5+ years of cloud implementation experience (AWS - S3), Terraform, Docker, Kubernetes. Expert in troubleshooting cloud implementation projects. Expert in cloud-native technologies. Good working knowledge of Terraform and Quarkus. Must-have skills: Cloud AWS knowledge (AWS S3, Load Balancers, VPC / VPC Peering / private-public subnets, EKS, SQS, Lambda, Docker/container services, Terraform or other IaC technologies for normal deployment), Quarkus, PostgreSQL, Flyway, Kubernetes, OpenID flow, OpenSearch/Elasticsearch, OpenAPI/Swagger, Java. Optional: Kafka, Python. #LI-INPAS Job Segment: Developer, Java, Technology
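Among the must-have services, SQS's visibility-timeout semantics are worth understanding precisely: a received message becomes invisible to other consumers until it is either deleted or the timeout lapses, at which point it is redelivered. A toy in-memory model of that behavior (illustrative only; production code would use boto3 against a real queue):

```python
# Toy model of SQS visibility-timeout semantics. Illustrative only --
# real code would call sqs.receive_message / sqs.delete_message via boto3.

class MiniQueue:
    def __init__(self, visibility_timeout):
        self.visibility_timeout = visibility_timeout
        self._messages = []  # list of [body, invisible_until_timestamp]

    def send(self, body):
        self._messages.append([body, 0.0])  # immediately visible

    def receive(self, now):
        """Return the first visible message and hide it (like ReceiveMessage)."""
        for msg in self._messages:
            if msg[1] <= now:
                msg[1] = now + self.visibility_timeout
                return msg[0]
        return None  # nothing visible right now

    def delete(self, body):
        """Acknowledge a message so it is never redelivered (like DeleteMessage)."""
        self._messages = [m for m in self._messages if m[0] != body]

q = MiniQueue(visibility_timeout=30)
q.send("job-1")
print(q.receive(now=0))   # → job-1 (consumer takes the message)
print(q.receive(now=10))  # → None  (still invisible to other consumers)
print(q.receive(now=31))  # → job-1 (timeout lapsed without delete: redelivered)
```

The redelivery on the last line is why SQS consumers must be idempotent: a slow worker that misses the timeout causes the same message to be processed twice.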

Posted 1 month ago

Apply

2 - 6 years

8 - 12 Lacs

Bengaluru

Work from Office

Naukri logo

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Digital Engineering Sr. Staff Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Title: Lead Data Architect (Streaming)

Required Skills and Qualifications
- Overall 10+ years of IT experience, of which 7+ years are in data architecture and engineering
- Strong expertise in AWS cloud services, particularly Lambda, SNS, S3, and EKS
- Strong experience with Confluent
- Strong experience with Kafka
- Solid understanding of data streaming architectures and best practices
- Strong problem-solving skills and ability to think critically
- Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders
- Knowledge of Apache Airflow for data orchestration
- Bachelor's degree in Computer Science, Engineering, or related field

Preferred Qualifications
- An understanding of cloud networking patterns and practices
- Experience working on a library or other long-term product
- Knowledge of the Flink ecosystem
- Experience with Terraform
- Deep experience with CI/CD pipelines
- Strong understanding of the JVM language family
- Understanding of GDPR and the correct handling of PII
- Expertise with technical interface design
- Use of Docker

Key Responsibilities
- Architect end-to-end data solutions using AWS services (including Lambda, SNS, S3, and EKS), Kafka and Confluent, all within a larger, overarching programme ecosystem
- Architect data processing applications using Python, Kafka, Confluent Cloud and AWS
- Develop data ingestion, processing, and storage solutions using Python, AWS Lambda, Confluent and Kafka
- Ensure data security and compliance throughout the architecture
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions
- Optimize data flows for performance, cost-efficiency, and scalability
- Implement data governance and quality control measures
- Ensure delivery of CI, CD and IaC for NTT tooling, and as templates for downstream teams
- Provide technical leadership and mentorship to development teams and lead engineers
- Stay current with emerging technologies and industry trends
- Collaborate with data scientists and analysts to enable efficient data access and analysis
- Evaluate and recommend new technologies to improve data architecture

Position Overview: We are seeking a highly skilled and experienced Data Architect to join our dynamic team. The ideal candidate will have a strong background in designing and implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies. This role requires a deep understanding of data architecture, cloud services, and the ability to drive innovative solutions to meet business needs.

About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com. NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users.
If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here. Job Segment: Developer, Computer Science, Consulting, Technology
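The streaming architectures this role describes usually reduce to windowed computations over keyed event streams. The core idea behind engines like Kafka Streams or Flink can be sketched in plain Python; the event shapes and the 60-second tumbling window below are illustrative assumptions:

```python
from collections import defaultdict

# Tumbling-window aggregation: events are bucketed into fixed,
# non-overlapping time windows and counted per key -- the basic
# primitive behind stream processors such as Kafka Streams or Flink.

def tumbling_window_counts(events, window_seconds=60):
    """Group (timestamp, key) events into fixed windows; count per key."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = (ts // window_seconds) * window_seconds
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

events = [(5, "login"), (42, "login"), (61, "payment"),
          (119, "login"), (130, "payment")]
print(tumbling_window_counts(events))
# → {0: {'login': 2}, 60: {'payment': 1, 'login': 1}, 120: {'payment': 1}}
```

A real deployment adds what this sketch omits: consuming from Kafka partitions, handling late and out-of-order events, and checkpointing window state.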

Posted 1 month ago

Apply

2 - 6 years

8 - 12 Lacs

Bengaluru

Work from Office

Naukri logo

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Digital Engineering Sr. Staff Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Title: Lead Data Architect (Warehousing)

Required Skills and Qualifications
- Overall 10+ years of IT experience, of which 7+ years are in data architecture and engineering
- Strong expertise in AWS cloud services, particularly Lambda, SNS, S3, and EKS
- Proficiency in Python
- Solid understanding of data warehousing architectures and best practices
- Strong Snowflake skills
- Strong data warehouse skills
- Strong problem-solving skills and ability to think critically
- Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders
- Experience of data cataloguing
- Knowledge of Apache Airflow for data orchestration
- Experience modelling, transforming and testing data in DBT
- Bachelor's degree in Computer Science, Engineering, or related field

Preferred Qualifications
- Familiarity with Atlan for data catalog and metadata management
- Experience integrating with IBM MQ
- Familiarity with SonarQube for code quality analysis
- AWS certifications (e.g., AWS Certified Solutions Architect)
- Experience with data modeling and database design
- Knowledge of data privacy regulations and compliance requirements
- An understanding of lakehouses
- An understanding of Apache Iceberg tables
- SnowPro Core certification

Key Responsibilities
- Architect end-to-end data solutions using AWS services (including Lambda, SNS, S3, and EKS) as well as Snowflake, DBT and Apache Airflow, all within a larger, overarching programme ecosystem
- Develop data ingestion, processing, and storage solutions using Python, AWS Lambda and Snowflake
- Architect data processing applications using Python
- Ensure data security and compliance throughout the architecture
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions
- Optimize data flows for performance, cost-efficiency, and scalability
- Implement data governance and quality control measures
- Ensure delivery of CI, CD and IaC for NTT tooling, and as templates for downstream teams
- Provide technical leadership and mentorship to development teams and lead engineers
- Stay current with emerging technologies and industry trends
- Ensure data security and implement best practices using tools like Snyk
- Collaborate with data scientists and analysts to enable efficient data access and analysis
- Evaluate and recommend new technologies to improve data architecture

Position Overview: We are seeking a highly skilled and experienced Data Architect to join our dynamic team. The ideal candidate will have a strong background in designing and implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies. This role requires a deep understanding of data architecture, cloud services, and the ability to drive innovative solutions to meet business needs.

About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world.
NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com. NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here. Job Segment: Developer, Solution Architect, Data Warehouse, Computer Science, Database, Technology

Posted 1 month ago

Apply

2 - 6 years

8 - 12 Lacs

Bengaluru

Work from Office

Naukri logo

Req ID: 306668. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Digital Engineering Sr. Staff Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Job Title: Lead Data Engineer (Warehouse)

Required Skills and Qualifications
- 7+ years of experience in data engineering, of which at least 3+ years as lead/manager of a data engineering team of 5+
- Experience with AWS cloud services
- Expertise in Python and SQL
- Experience using Git/GitHub for source control management
- Experience with Snowflake
- Strong understanding of lakehouse architectures and best practices
- Strong problem-solving skills and ability to think critically
- Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders
- Strong use of version control and proven ability to govern a team in the best-practice use of version control
- Strong understanding of Agile and proven ability to govern a team in the best-practice use of Agile methodologies

Preferred Skills and Qualifications
- An understanding of lakehouses
- An understanding of Apache Iceberg tables
- An understanding of data cataloguing
- Knowledge of Apache Airflow for data orchestration
- An understanding of DBT
- SnowPro Core certification
- Bachelor's degree in Computer Science, Engineering, or related field

Key Responsibilities: Lead and direct a small team of engineers engaged in:
- Engineering end-to-end data solutions using AWS services, including Lambda, S3, Snowflake, DBT and Apache Airflow
- Cataloguing data
- Collaborating with cross-functional teams to understand business requirements and translate them into technical solutions
- Providing best-in-class documentation for downstream teams to develop, test and run data products built using our tools
- Testing our tooling, and providing a framework for downstream teams to test their utilisation of our products
- Helping to deliver CI, CD and IaC both for our own tooling and as templates for downstream teams
- Using DBT projects to define re-usable pipelines

Position Overview: We are seeking a highly skilled and experienced Lead Data Engineer to join our dynamic team. The ideal candidate will have a strong background in implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies, and in leading teams and directing engineering workloads. This role requires a deep understanding of data engineering, cloud services, and the ability to implement high-quality solutions.

About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com. NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer.
Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here. Job Segment: Computer Science, Database, SQL, Consulting, Technology
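The Airflow orchestration and re-usable DBT pipelines this posting mentions both come down to the same mechanism: resolving a DAG of tasks and running them in dependency order. The standard library can sketch that scheduling idea; the task names below are hypothetical:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# A tiny DAG of warehouse tasks: each task maps to the set of tasks it
# depends on -- the same shape Airflow or DBT resolve before execution.
dag = {
    "load_raw": set(),
    "clean": {"load_raw"},
    "dim_customer": {"clean"},
    "fact_txn": {"clean"},
    "report": {"dim_customer", "fact_txn"},
}

# static_order() yields a valid execution order: every task appears
# only after all of its dependencies have appeared.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

Real orchestrators add retries, scheduling, and parallel execution of independent tasks (here, `dim_customer` and `fact_txn` could run concurrently), but the dependency resolution is the same.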

Posted 1 month ago

Apply

4 - 9 years

16 - 20 Lacs

Bengaluru

Work from Office

Naukri logo

Req ID: 301930 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Digital Solution Architect Lead Advisor to join our team in Bangalore, Karnataka (IN-KA), India (IN). TitleData Solution Architect Position Overview: We are seeking a highly skilled and experienced Data Solution Architect to join our dynamic team. The ideal candidate will have a strong background in designing and implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies. This role requires a deep understanding of data architecture, cloud services, and the ability to drive innovative solutions to meet business needs. Required Skills and Qualifications - Bachelor's degree in Computer Science, Engineering, or related field - 7+ years of experience in data architecture and engineering - Strong expertise in AWS cloud services, particularly Lambda, SNS, S3, and EKS - Proficiency in Kafka/Confluent Kafka and Python - Experience with Synk for security scanning and vulnerability management - Solid understanding of data streaming architectures and best practices - Strong problem-solving skills and ability to think critically - Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders Preferred Qualifications - Experience with Kafka Connect and Confluent Schema Registry - Familiarity with Atlan for data catalog and metadata management - Knowledge of Apache Flink for stream processing - Experience integrating with IBM MQ - Familiarity with Sonarcube for code quality analysis - AWS certifications (e.g., AWS Certified Solutions Architect) - Experience with data modeling and database design - Knowledge of data privacy regulations and compliance requirements Key Responsibilities - Design and implement scalable data architectures using AWS 
services and Kafka - Develop data ingestion, processing, and storage solutions using Python and AWS Lambda - Ensure data security and implement best practices using tools like Snyk - Optimize data pipelines for performance and cost-efficiency - Collaborate with data scientists and analysts to enable efficient data access and analysis - Implement data governance policies and procedures - Provide technical guidance and mentorship to junior team members - Evaluate and recommend new technologies to improve data architecture - Architect end-to-end data solutions using AWS services, including Lambda, SNS, S3, and EKS - Design and implement data streaming pipelines using Kafka/Confluent Kafka - Develop data processing applications using Python - Ensure data security and compliance throughout the architecture - Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions - Optimize data flows for performance, cost-efficiency, and scalability - Implement data governance and quality control measures - Provide technical leadership and mentorship to development teams - Stay current with emerging technologies and industry trends About NTT DATA NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. 
NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com NTT DATA is an equal opportunity employer and considers all applicants without regard to race, color, religion, citizenship, national origin, ancestry, age, sex, sexual orientation, gender identity, genetic information, physical or mental disability, veteran or marital status, or any other characteristic protected by law. We are committed to creating a diverse and inclusive environment for all employees. If you need assistance or an accommodation due to a disability, please inform your recruiter so that we may connect you with the appropriate team. Job Segment: Solution Architect, Consulting, Database, Computer Science, Technology
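The Python-on-Lambda ingestion work this posting describes follows a common pattern: an SNS-triggered Lambda unpacks each delivered record and applies a transform. The sketch below is a minimal local illustration, not NTT DATA's actual code; the event shape follows the standard SNS-to-Lambda envelope, and `process_record` is a hypothetical transform.

```python
import json

def process_record(payload: dict) -> dict:
    # Hypothetical transform: normalize keys and tag the record's source.
    return {k.lower(): v for k, v in payload.items()} | {"source": "sns"}

def handler(event: dict, context=None) -> dict:
    # An SNS-triggered Lambda receives an envelope whose business payload
    # is JSON-encoded in Records[].Sns.Message.
    results = [
        process_record(json.loads(record["Sns"]["Message"]))
        for record in event.get("Records", [])
    ]
    return {"processed": len(results), "records": results}

# Invoke locally with a sample SNS envelope (no AWS access required).
sample_event = {
    "Records": [
        {"Sns": {"Message": json.dumps({"OrderId": 42, "Status": "NEW"})}}
    ]
}
print(handler(sample_event))
# → {'processed': 1, 'records': [{'orderid': 42, 'status': 'NEW', 'source': 'sns'}]}
```

In production the same `handler` would be deployed as the Lambda entry point, with the transform swapped for real business logic.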

Posted 1 month ago

Apply

4 - 9 years

18 - 30 Lacs

Hyderabad, India

Hybrid

Naukri logo

Department: Software Engineering Employment Type: Full Time Location: India Reporting To: Manoj Puranik Description At Vitech, we believe in the power of technology to simplify complex business processes. Our mission is to bring better software solutions to market, addressing the intricacies of the insurance and retirement industries. We combine deep domain expertise with the latest technological advancements to deliver innovative, user-centric solutions that future-proof and empower our clients to thrive in an ever-changing landscape. With over 1,600 talented professionals on our team, our innovative solutions are recognized by industry leaders like Gartner, Celent, Aite-Novarica, and ISG. We offer a competitive compensation package along with comprehensive benefits that support your health, well-being, and financial security. Location: Hyderabad (Hybrid Role) Role: Full-Stack Java Developer Are you a Java Developer with 4-7+ years of experience eager to elevate your career? At Vitech, we're looking for a talented professional with a solid background in Core Java who's ready to make a significant impact. As a Full-Stack Developer at Vitech, you'll dive deep into backend development while also contributing to frontend work with ReactJS/GWT. Our small, agile pods allow you to spend up to 40% of your time on innovation and writing new software, pushing our products forward. What you will do: Lead and contribute to the full software development lifecycle, from design and coding to testing, deployment, and support. Apply advanced Core Java concepts such as inheritance, interfaces, and abstract classes to solve complex business challenges. Develop and maintain applications across the full stack, with a strong focus on backend development in Java and frontend work using ReactJS or GWT. Collaborate with a cross-functional, high-performing team to deliver scalable, customer-centric solutions. 
Drive innovation by designing and building software that fuels product enhancements and supports business growth. What We're Looking For: Advanced Core Java skills with deep expertise in object-oriented programming concepts like inheritance, interfaces, abstract/concrete classes, and control structures. Ability to apply these principles to solve complex, business-driven challenges. Proficient SQL knowledge with the ability to write and optimize complex queries in relational databases. Hands-on experience with Spring Boot, Spring MVC, and Hibernate for backend development. Familiarity with REST APIs and microservices architecture. Frontend development experience using ReactJS, Angular, or GWT, with the ability to build responsive, user-friendly interfaces and integrate them in a full-stack environment. Experience with AWS services such as EC2, S3, RDS, Lambda, API Gateway, CloudWatch, and IAM is a plus. Strong analytical and problem-solving skills. Experience in technical leadership or mentoring is preferred. Excellent communication and collaboration skills. A commitment to clean, maintainable code and a passion for continuous learning. Join Us at Vitech! Career Development: At Vitech, we're committed to your growth. You'll have ample opportunities to deepen your expertise in both Java and ReactJS, advancing your career in a supportive environment. Innovative Environment: Work with cutting-edge technologies in an Agile setting where your ideas and creativity are welcomed and encouraged. Impactful Work: Your contributions will be crucial in shaping our products and delivering exceptional solutions to our global clients. At Vitech, you're not just maintaining software but creating it. At Vitech, you'll be part of a forward-thinking team that values collaboration, innovation, and continuous improvement. We provide a supportive and inclusive environment where you can grow as a leader while helping shape the future of our organization.
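The OOP fundamentals this role emphasizes (abstract vs. concrete classes, interface-style contracts, inheritance, polymorphism) are language-agnostic. A compact sketch, written in Python's `abc` module for brevity rather than Java, with hypothetical domain names loosely inspired by the retirement/insurance setting:

```python
from abc import ABC, abstractmethod

class BenefitPlan(ABC):
    """Abstract base class: defines a contract, like a Java abstract class."""

    def __init__(self, salary: float):
        self.salary = salary

    @abstractmethod
    def annual_contribution(self) -> float:
        """Subclasses must implement this, as with a Java interface method."""

    def describe(self) -> str:
        # Concrete method inherited unchanged by all subclasses.
        return f"{type(self).__name__}: {self.annual_contribution():.2f}/yr"

class Pension(BenefitPlan):
    def annual_contribution(self) -> float:
        return self.salary * 0.05  # hypothetical 5% rate

class Retirement401k(BenefitPlan):
    def annual_contribution(self) -> float:
        return self.salary * 0.03  # hypothetical 3% rate

for plan in [Pension(80000), Retirement401k(80000)]:
    print(plan.describe())  # polymorphic dispatch via the shared contract
# → Pension: 4000.00/yr
# → Retirement401k: 2400.00/yr
```

The same shape maps directly onto Java: `BenefitPlan` as an abstract class (or interface plus default method), with `@abstractmethod` playing the role of an unimplemented abstract method.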

Posted 1 month ago

Apply

3 - 5 years

14 - 18 Lacs

Pune

Work from Office

Naukri logo

In this role, you will maintain and enhance frontend services, working with AngularJS to support existing user-facing features while contributing to modernization efforts using React. Collaboration is key, as you'll work closely with designers, backend developers, and product managers to integrate systems involving XHTML/JSF-based UIs and Java backend frameworks like Spring and JBoss Seam. You'll play a pivotal role in ensuring the technical feasibility of UI/UX designs while optimizing applications for performance, scalability, and accessibility. A strong emphasis on code quality is essential, requiring you to write clean, maintainable, and well-documented code that adheres to industry best practices. Additionally, you'll stay informed about emerging technologies and industry trends, continuously learning and applying this knowledge to improve the system and processes. Required Experience & Skills: 5 to 8 years' experience in a relevant software development role. Frontend Frameworks: Proficiency in AngularJS; familiarity with React is a plus. Web Development: Expertise in HTML5, CSS3, JavaScript, and TypeScript. Backend Integration: Experience with Node.js, Express.js, and API-driven development. Build Tools and Testing: Familiarity with task automation tools like Grunt and unit testing frameworks like Karma. Containerization: Understanding of Docker for application deployment. AWS Services: Working knowledge of AWS services such as S3, Lambda, and CloudFront. Soft Skills: Strong communication, problem-solving skills, and a collaborative mindset. Preferred Skills: Agile Practices: Familiarity with agile development methodologies. Testing Frameworks: Proficiency with Jest or Mocha for testing. Advanced AWS Services: Familiarity with CloudWatch, DynamoDB, API Gateway, AppSync, Route 53, CloudTrail, WAF, and X-Ray.

Posted 1 month ago

Apply

3 - 6 years

20 - 25 Lacs

Hyderabad

Work from Office

Naukri logo

Overview Job Title: Senior DevOps Engineer Location: Bangalore / Hyderabad / Chennai / Coimbatore Position: Full-time Department: Annalect Engineering Position Overview Annalect is currently seeking a Senior DevOps Engineer to join our technology team remotely. We are passionate about building distributed back-end systems in a modular and reusable way. We're looking for people who have a shared passion for data and desire to build cool, maintainable and high-quality applications to use this data. In this role you will participate in shaping our technical architecture, design and development of software products, collaborate with back-end developers from other tracks, as well as research and evaluation of new technical solutions. Responsibilities Key Responsibilities: Build and maintain cloud infrastructure through Terraform IaC. Cloud networking and orchestration with AWS (EKS, ECS, VPC, S3, ALB, NLB). Improve and automate processes and procedures. Constructing CI/CD pipelines. Monitoring and handling incident response of the infrastructure, platforms, and core engineering services. Troubleshooting infrastructure, network, and application issues. Help identify and troubleshoot problems within the environment. Qualifications Required Skills 5+ years of DevOps experience 5+ years of hands-on experience in administering cloud technologies on AWS, especially with IAM, VPC, Lambda, EKS, EC2, S3, ECS, CloudFront, ALB, API Gateway, RDS, CodeBuild, SSM, Secrets Manager etc. Experience with microservices, containers (Docker), container orchestration (Kubernetes). Demonstrable experience of using Terraform to provision and configure infrastructure. Scripting ability - PowerShell, Python, Bash etc. Comfortable working with Linux/Unix based operating systems (Ubuntu preferred) Familiarity with software development, CI/CD and DevOps tools (Bitbucket, Jenkins, GitLab, CodeBuild, CodePipeline) Knowledge of writing Infrastructure as Code (IaC) using Terraform. 
Experience with microservices, containers (Docker), container orchestration (Kubernetes), serverless computing (AWS Lambda) and distributed/scalable systems. Possesses a problem-solving attitude. Creative, self-motivated, a quick study, and willing to develop new skills. Additional Skills Familiarity with working with data and databases (SQL, MySQL, PostgreSQL, Amazon Aurora, Redis, Amazon Redshift, Google BigQuery). Knowledge of database administration. Experience with continuous deployment/continuous delivery (Jenkins, Bamboo). AWS/GCP/Azure certification is a plus. Experience in Python coding is welcome. Passion for data-driven software. All of our tools are built on top of data and require work with data. Knowledge of IaaS/PaaS architecture with good understanding of infrastructure and web application security. Experience with logging/monitoring (CloudWatch, Datadog, Loggly, ELK). Passion for writing good documentation and creating architecture diagrams.
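The "scripting ability" this role lists typically means small automation utilities around the infrastructure. A hedged, purely local sketch of one such task, checking a resource inventory against a tagging policy (the policy, resource IDs, and inventory shape are all hypothetical; a real version would pull the inventory from a cloud API or Terraform state):

```python
# Hypothetical tagging policy: every resource must carry these tag keys.
REQUIRED_TAGS = {"Owner", "Environment", "CostCenter"}

def audit_tags(resources: list[dict]) -> list[str]:
    """Return the IDs of resources whose tags violate the policy."""
    failures = []
    for res in resources:
        missing = REQUIRED_TAGS - set(res.get("Tags", {}))
        if missing:
            failures.append(res["Id"])
    return failures

# Sample inventory, as might be exported from a cloud inventory tool.
inventory = [
    {"Id": "i-0abc", "Tags": {"Owner": "data-eng", "Environment": "prod",
                              "CostCenter": "42"}},
    {"Id": "i-0def", "Tags": {"Owner": "data-eng"}},  # missing two tags
]
print(audit_tags(inventory))
# → ['i-0def']
```

Scripts like this are often wired into a CI/CD pipeline so non-compliant resources fail the build before they reach production.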

Posted 1 month ago

Apply

4 - 6 years

25 - 27 Lacs

Bengaluru

Work from Office

Naukri logo

Overview Annalect is currently seeking a senior developer to join our technology team. In this role, you will contribute to the design and development of intuitive front-end applications and distributed backend microservices. We are passionate about modular, reusable software architecture. We are looking for people who have a shared passion for developing and building cool reusable user interfaces and services. In this role you will contribute to the technical architecture of the product as well as research and evaluation of new technical solutions while coordinating between interdisciplinary teams to help shape the perfect solution for us and our agencies. Responsibilities Development and unit testing of web applications including front-end (SPA) and back-end (microservices), plus maintenance and support of the same. Provide assistance to Project Managers and Technical Leads in the planning of projects (e.g. provision of estimates, risk analysis, requirements analysis, technical options). Involvement in full life cycle of projects (including requirement analysis and system design, development and support if required). Support and work collaboratively with teams across areas of design, development, quality assurance and operations. Commit your knowledge and experience to team success, and be a knowledge keeper for the product, its architecture, design, and implementation details. Provide overall mentorship, coaching and on-demand training to improve and unify development style. Qualifications 5 - 7 years in application development. Understanding of OOP/OOD/DDD and design patterns. .NET Full Stack, Angular 11+, .NET Core 6+, Web API. TypeScript, Jest, Node.js, OAuth2. Database experience (SQL Server) and ORM technologies (LINQ, EF or similar). NoSQL: MongoDB. AWS experience: Lambda, Step Functions, S3, ECS, RDS, SQS, Elasticsearch. Performance optimization. Security design and implementation. CI/CD practices. Docker: good to have.

Posted 1 month ago

Apply

2 - 5 years

14 - 17 Lacs

Hyderabad

Work from Office

Naukri logo

As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for Source to Target and implementing solutions that tackle the client's needs. Your primary responsibilities include: Design, build, optimize and support new and existing data models and ETL processes based on our clients' business requirements. Build, deploy and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to. Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Must have 5+ years of experience in Big Data: Hadoop, Spark, Scala, Python, HBase, Hive. Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git. Developed Python and PySpark programs for data analysis. Good working experience with Python to develop a custom framework for generating rules (much like a rules engine). Developed Python code to gather the data from HBase and designed the solution to implement using PySpark. Apache Spark DataFrames/RDDs were used to apply business transformations, and Hive Context objects were utilized to perform read/write operations. Preferred technical and professional experience Understanding of DevOps. Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala.
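The "custom framework for generating rules (much like a rules engine)" mentioned above is a common data-quality pattern: each rule is a named predicate applied to every record, with failures collected per rule. A minimal pure-Python sketch (PySpark is omitted so the example stays self-contained; rule names and the sample rows are hypothetical):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    predicate: Callable[[dict], bool]  # returns True when the row passes

def run_rules(rows: list[dict], rules: list[Rule]) -> dict:
    """Apply every rule to every row; collect failing row IDs per rule name."""
    failures = {rule.name: [] for rule in rules}
    for row in rows:
        for rule in rules:
            if not rule.predicate(row):
                failures[rule.name].append(row.get("id"))
    return failures

# Hypothetical data-quality rules over ingested records.
rules = [
    Rule("amount_positive", lambda r: r.get("amount", 0) > 0),
    Rule("has_customer", lambda r: bool(r.get("customer"))),
]
rows = [
    {"id": 1, "amount": 10.0, "customer": "acme"},
    {"id": 2, "amount": -3.0, "customer": ""},
]
print(run_rules(rows, rows and rules))
# → {'amount_positive': [2], 'has_customer': [2]}
```

In a Spark setting the same predicates would typically be expressed as DataFrame filter expressions, with the per-rule failure counts aggregated across partitions instead of collected into Python lists.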

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies