4.0 - 7.0 years
3 - 7 Lacs
Mumbai
Work from Office
Apptad is looking for a Backend Developer; this is a full-time, long-term job opportunity with us. Requirements: advanced Java concepts and Spring Boot applications; solid experience with microservice architecture; application deployment using Docker and Kubernetes; experience with AWS cloud services such as ECS Fargate, DynamoDB, RDS, and Lambda; experience with CI/CD pipelines such as Jenkins; database knowledge in PostgreSQL/AWS RDS or NoSQL databases. Nice to have: knowledge of Elasticsearch querying. We value a strong foundation in computer science principles and a great teammate who is flexible to work in shifts, detail-oriented, motivated, dedicated, positive, polite, and approachable.
Posted Date not available
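The database requirement in the listing above comes down to comfort with parameterized queries and basic schema work. A minimal sketch using the standard-library sqlite3 module as a stand-in for PostgreSQL/AWS RDS (table and column names are invented for illustration, not taken from the listing):

```python
import sqlite3

# In-memory database standing in for an RDS PostgreSQL instance.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer, amount) VALUES (?, ?)",
    [("acme", 120.0), ("acme", 30.0), ("globex", 75.0)],
)

def total_for(customer: str) -> float:
    # Placeholders (?) keep the query parameterized and injection-safe.
    row = conn.execute(
        "SELECT COALESCE(SUM(amount), 0) FROM orders WHERE customer = ?",
        (customer,),
    ).fetchone()
    return row[0]

print(total_for("acme"))  # 150.0
```

The same parameterized style carries over to PostgreSQL drivers such as psycopg, which use `%s` placeholders instead of `?`.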
4.0 - 7.0 years
3 - 7 Lacs
Ranchi
Work from Office
Apptad is looking for a Backend Developer; this is a full-time, long-term job opportunity with us. Requirements: advanced Java concepts and Spring Boot applications; solid experience with microservice architecture; application deployment using Docker and Kubernetes; experience with AWS cloud services such as ECS Fargate, DynamoDB, RDS, and Lambda; experience with CI/CD pipelines such as Jenkins; database knowledge in PostgreSQL/AWS RDS or NoSQL databases. Nice to have: knowledge of Elasticsearch querying. We value a strong foundation in computer science principles and a great teammate who is flexible to work in shifts, detail-oriented, motivated, dedicated, positive, polite, and approachable.
Posted Date not available
4.0 - 7.0 years
3 - 7 Lacs
Amravati
Hybrid
Job Type: Contractual; Duration: 6 months. Job Description: Apptad is looking for a Backend Developer; this is a full-time, long-term job opportunity with us. Requirements: advanced Java concepts and Spring Boot applications; solid experience with microservice architecture; application deployment using Docker and Kubernetes; experience with AWS cloud services such as ECS Fargate, DynamoDB, RDS, and Lambda; experience with CI/CD pipelines such as Jenkins; database knowledge in PostgreSQL/AWS RDS or NoSQL databases. Nice to have: knowledge of Elasticsearch querying. We value a strong foundation in computer science principles and a great teammate who is flexible to work in shifts, detail-oriented, motivated, dedicated, positive, polite, and approachable.
Posted Date not available
4.0 - 7.0 years
3 - 7 Lacs
Bengaluru
Hybrid
Job Type: Contractual; Duration: 6 months. Job Description: Apptad is looking for a Backend Developer; this is a full-time, long-term job opportunity with us. Requirements: advanced Java concepts and Spring Boot applications; solid experience with microservice architecture; application deployment using Docker and Kubernetes; experience with AWS cloud services such as ECS Fargate, DynamoDB, RDS, and Lambda; experience with CI/CD pipelines such as Jenkins; database knowledge in PostgreSQL/AWS RDS or NoSQL databases. Nice to have: knowledge of Elasticsearch querying. We value a strong foundation in computer science principles and a great teammate who is flexible to work in shifts, detail-oriented, motivated, dedicated, positive, polite, and approachable.
Posted Date not available
4.0 - 7.0 years
3 - 7 Lacs
Patna
Hybrid
Job Type: Contractual; Duration: 6 months. Job Description: Apptad is looking for a Backend Developer; this is a full-time, long-term job opportunity with us. Requirements: advanced Java concepts and Spring Boot applications; solid experience with microservice architecture; application deployment using Docker and Kubernetes; experience with AWS cloud services such as ECS Fargate, DynamoDB, RDS, and Lambda; experience with CI/CD pipelines such as Jenkins; database knowledge in PostgreSQL/AWS RDS or NoSQL databases. Nice to have: knowledge of Elasticsearch querying. We value a strong foundation in computer science principles and a great teammate who is flexible to work in shifts, detail-oriented, motivated, dedicated, positive, polite, and approachable.
Posted Date not available
4.0 - 7.0 years
3 - 7 Lacs
Imphal
Hybrid
Job Type: Contractual; Duration: 6 months. Job Description: Apptad is looking for a Backend Developer; this is a full-time, long-term job opportunity with us. Requirements: advanced Java concepts and Spring Boot applications; solid experience with microservice architecture; application deployment using Docker and Kubernetes; experience with AWS cloud services such as ECS Fargate, DynamoDB, RDS, and Lambda; experience with CI/CD pipelines such as Jenkins; database knowledge in PostgreSQL/AWS RDS or NoSQL databases. Nice to have: knowledge of Elasticsearch querying. We value a strong foundation in computer science principles and a great teammate who is flexible to work in shifts, detail-oriented, motivated, dedicated, positive, polite, and approachable.
Posted Date not available
4.0 - 7.0 years
3 - 7 Lacs
Hyderabad
Hybrid
Job Type: Contractual; Duration: 6 months. Job Description: Apptad is looking for a Backend Developer; this is a full-time, long-term job opportunity with us. Requirements: advanced Java concepts and Spring Boot applications; solid experience with microservice architecture; application deployment using Docker and Kubernetes; experience with AWS cloud services such as ECS Fargate, DynamoDB, RDS, and Lambda; experience with CI/CD pipelines such as Jenkins; database knowledge in PostgreSQL/AWS RDS or NoSQL databases. Nice to have: knowledge of Elasticsearch querying. We value a strong foundation in computer science principles and a great teammate who is flexible to work in shifts, detail-oriented, motivated, dedicated, positive, polite, and approachable.
Posted Date not available
2.0 - 4.0 years
1 - 6 Lacs
Noida
Work from Office
Job Role: Java Developer. Work Experience: 2-4 years. Minimum Education: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. Job Location: Noida. Working Time: 10 AM to 7 PM office hours (occasional support outside these hours may be required). 5 days working (Monday - Friday).

About WebBee Global: At WebBee Global, we specialize in providing top-line solutions for eCommerce merchants to help drive online business success. With comprehensive integrations for platforms like Amazon, Shopify, BigCommerce, Magento, WooCommerce, and NetSuite, we ensure seamless operations for our clients. E-commerce integration solution provider for NetSuite integration, Amazon MCF integration, and EDI integration.

About the Co-Founders: Abhishek Jain (Co-Founder & Director), a postgraduate from the University of Northampton, England, spearheads WebBee Global's solutions for scaling eCommerce merchants. With two decades of experience in application development, integration, and customization, he started WebBee Global in 2005. Himani Jain (Co-Founder & Director) is passionate about building a culture that fosters collaboration, creativity, and growth. She is also the company's Marketing Head, committed to ensuring the security of every individual associated with the organization and to delivering top-quality software solutions that help eCommerce businesses scale.

Job Summary: We are seeking a highly skilled Software Engineer to join our dynamic team. The ideal candidate will have strong expertise in Java development, with experience building robust and scalable web applications.

Key Responsibilities: Develop, implement, and maintain high-performance Java-based applications. Collaborate with cross-functional teams to analyze business requirements and provide technical solutions. Design and develop RESTful APIs and integrate third-party services. Optimize application performance and ensure scalability. Conduct code reviews, mentor junior developers, and uphold best coding practices. Troubleshoot, debug, and resolve technical issues promptly.

Required Skills & Qualifications: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. 2+ years of hands-on experience in Java development. Strong understanding of the Spring Framework, Hibernate, and related technologies. Proficiency in RESTful API development and microservices architecture. Experience with cloud platforms such as AWS. Familiarity with databases like MySQL, PostgreSQL, or MongoDB.

Benefits: 5 days working (Monday to Friday). Office near the Sector 62 metro station, Noida. Medical insurance and meals. Positive and healthy working environment. If interested, kindly share your resume at pooja.singh@webbeeglobal.com
Posted Date not available
4.0 - 7.0 years
3 - 7 Lacs
Srinagar
Hybrid
Job Type: Contractual; Duration: 6 months. Job Description: Apptad is looking for a Backend Developer; this is a full-time, long-term job opportunity with us. Requirements: advanced Java concepts and Spring Boot applications; solid experience with microservice architecture; application deployment using Docker and Kubernetes; experience with AWS cloud services such as ECS Fargate, DynamoDB, RDS, and Lambda; experience with CI/CD pipelines such as Jenkins; database knowledge in PostgreSQL/AWS RDS or NoSQL databases. Nice to have: knowledge of Elasticsearch querying. We value a strong foundation in computer science principles and a great teammate who is flexible to work in shifts, detail-oriented, motivated, dedicated, positive, polite, and approachable.
Posted Date not available
4.0 - 7.0 years
3 - 7 Lacs
Kohima
Hybrid
Job Type: Contractual; Duration: 6 months. Job Description: Apptad is looking for a Backend Developer; this is a full-time, long-term job opportunity with us. Requirements: advanced Java concepts and Spring Boot applications; solid experience with microservice architecture; application deployment using Docker and Kubernetes; experience with AWS cloud services such as ECS Fargate, DynamoDB, RDS, and Lambda; experience with CI/CD pipelines such as Jenkins; database knowledge in PostgreSQL/AWS RDS or NoSQL databases. Nice to have: knowledge of Elasticsearch querying. We value a strong foundation in computer science principles and a great teammate who is flexible to work in shifts, detail-oriented, motivated, dedicated, positive, polite, and approachable.
Posted Date not available
4.0 - 7.0 years
3 - 7 Lacs
Sikkim
Hybrid
Job Type: Contractual; Duration: 6 months. Job Description: Apptad is looking for a Backend Developer; this is a full-time, long-term job opportunity with us. Requirements: advanced Java concepts and Spring Boot applications; solid experience with microservice architecture; application deployment using Docker and Kubernetes; experience with AWS cloud services such as ECS Fargate, DynamoDB, RDS, and Lambda; experience with CI/CD pipelines such as Jenkins; database knowledge in PostgreSQL/AWS RDS or NoSQL databases. Nice to have: knowledge of Elasticsearch querying. We value a strong foundation in computer science principles and a great teammate who is flexible to work in shifts, detail-oriented, motivated, dedicated, positive, polite, and approachable.
Posted Date not available
1.0 - 6.0 years
8 - 13 Lacs
Pune
Work from Office
Cloud Observability Administrator (Pune, India | Enterprise IT - 22685)

ZS is looking for a Cloud Observability Administrator to join our team in Pune. In this role, you will configure various observability tools and create solutions to address business problems across multiple client engagements. You will leverage information from the requirements-gathering phase and utilize past experience to design a flexible and scalable solution, and collaborate with other team members (involved in the requirements-gathering, testing, roll-out, and operations phases) to ensure seamless transitions.

What You'll Do: Deploy, manage, and operate a scalable, highly available, and fault-tolerant Splunk architecture. Onboard various kinds of log sources (Windows, Linux, firewalls, network devices) into Splunk. Develop alerts, dashboards, and reports in Splunk. Write complex SPL queries. Manage and administer a distributed Splunk architecture. Apply very good knowledge of the configuration files Splunk uses for data ingestion and field extraction. Perform regular upgrades of Splunk and relevant apps/add-ons. Maintain a comprehensive understanding of AWS infrastructure, including EC2, EKS, VPC, CloudTrail, Lambda, etc. Automate manual tasks using Shell/PowerShell scripting; knowledge of Python scripting is a plus. Use Linux commands for server administration.

What You'll Bring: 1+ years of experience in Splunk development and administration. Bachelor's degree in CS, EE, or a related discipline. Strong analytic, problem-solving, and programming ability. 1-1.5 years of relevant consulting-industry experience on medium-to-large-scale technology solution delivery engagements. Strong verbal, written, and team presentation communication skills, with the ability to articulate results and issues to internal and client teams. Proven ability to work creatively and analytically in a problem-solving environment. Ability to work within a virtual global team environment and contribute to the overall timely delivery of multiple projects. Knowledge of observability tools such as Cribl, Datadog, or PagerDuty is a plus. Knowledge of AWS Prometheus and Grafana is a plus. Knowledge of APM concepts is a plus. Knowledge of Linux/Python scripting is a plus. Splunk certification is a plus.

Perks & Benefits: ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth, and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths, and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working: a flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.

Travel: Travel is a requirement at ZS for client-facing ZSers; the business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.

Considering applying? At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.

To Complete Your Application: Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE.
Posted Date not available
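The field-extraction work described in the Splunk listing above amounts to pulling structured fields out of raw log lines. A hedged Python sketch of the same idea (the log format and field names are invented for illustration; real Splunk extractions are defined with regexes in props.conf/transforms.conf):

```python
import re

# Example key=value log line; the format is invented for illustration.
LINE = '2024-05-01T12:00:00Z host=web-01 status=500 bytes=2048 path="/api/users"'

# Named groups play the role of Splunk's extracted fields.
PATTERN = re.compile(
    r'host=(?P<host>\S+)\s+status=(?P<status>\d+)\s+'
    r'bytes=(?P<bytes>\d+)\s+path="(?P<path>[^"]*)"'
)

def extract_fields(line: str) -> dict:
    """Return a dict of extracted fields, or an empty dict on no match."""
    m = PATTERN.search(line)
    return m.groupdict() if m else {}

fields = extract_fields(LINE)
print(fields["status"], fields["path"])  # 500 /api/users
```

In Splunk itself, the equivalent would be an `EXTRACT-` stanza carrying a similar named-group regex; the Python version is only meant to show what that extraction computes.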
3.0 - 7.0 years
8 - 12 Lacs
Pune
Work from Office
Cloud Administrator (AWS Operations) (Pune, India | Enterprise IT - 22705)

We seek a professional IT administrator to join our Pune, India office. This role is responsible for deploying and administering a cloud-based computing platform within ZS.

What You'll Do: Participate in AWS deployment, configuration, and optimization to provide a cloud-based platform that addresses business problems across multiple client engagements. Leverage information from the requirements-gathering phase and utilize past experience to implement a flexible and scalable solution. Collaborate with other team members (involved in the requirements-gathering, testing, roll-out, and operations phases) to ensure seamless transitions. Operate scalable, highly available, and fault-tolerant systems on AWS. Control the flow of data to and from AWS. Use appropriate AWS operational best practices. Configure and understand VPC networks and the associated nuances of AWS infrastructure. Configure and understand AWS IAM policies. Configure secured AWS infrastructure. Manage users and access for different AWS services.

What You'll Bring: Bachelor's degree in CS, IT, or EE. 1-2 years of experience in AWS administration, application deployment, and configuration management. Good knowledge of AWS services (EC2, S3, IAM, VPC, Lambda, EMR, CloudFront, Elastic Load Balancer, etc.). Basic knowledge of CI/CD tools like JetBrains TeamCity, SVN, Bitbucket, etc. Good knowledge of Windows and Linux operating system administration. Basic knowledge of RDBMS and database technologies like SQL Server and PostgreSQL. Basic knowledge of web server software like IIS or Apache Tomcat. Experience with a scripting language such as PowerShell, Python, or Bash/Shell. Knowledge of DevOps methodology. Strong verbal, written, and team presentation communication skills.

Perks & Benefits: ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth, and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths, and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working: a flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.

Travel: Travel is a requirement at ZS for client-facing ZSers; the business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.

Considering applying? At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.

To Complete Your Application: Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE.
Posted Date not available
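The IAM-policy work mentioned in the Cloud Administrator listing above is largely about composing least-privilege JSON policy documents. A minimal sketch in Python (the bucket name and the specific actions are placeholders chosen for illustration, not requirements from the listing):

```python
import json

def s3_read_only_policy(bucket: str) -> dict:
    """Build a least-privilege IAM policy granting read access to one bucket."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": ["s3:GetObject", "s3:ListBucket"],
                # The bucket ARN covers ListBucket; the /* ARN covers GetObject.
                "Resource": [
                    f"arn:aws:s3:::{bucket}",
                    f"arn:aws:s3:::{bucket}/*",
                ],
            }
        ],
    }

policy = s3_read_only_policy("example-bucket")
print(json.dumps(policy, indent=2))
```

In practice such a document would be attached via the console, CloudFormation/Terraform, or an AWS SDK call; the sketch only shows the document's shape.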
2.0 - 4.0 years
7 - 12 Lacs
Pune, Gurugram
Work from Office
What you'll do: Co-create a transformative end-to-end platform that enables ZS's clients to go from data ingestion to data analytics to insight creation to applying those insights, helping them make critical decisions and improve their sales operations. Implement major full-stack features with limited guidance from leads and managers. Demonstrate a passion for SaaS product development and be extremely detail-oriented. Show willingness to rapidly learn new languages and platforms. Be a technical expert, persuade and mentor junior developers, and share your expertise with the team. Provide accurate effort estimates on work. Take full ownership of the code you write, from design to development to maintenance of your modules. Work to build future-ready, cloud-native software assets that are maintainable, testable, and extendable.

What you'll bring: ZS welcomes candidates with a bachelor's or master's degree in computer science, electrical engineering, mathematics, or a related discipline with a demonstrated record of academic success. Candidates should have 1+ years of software product development experience. Experience developing high-performing, secure, production-quality code. Experience developing full-stack SaaS software products. Expertise developing SOA web services and REST/OData-based APIs. Expertise with IoC containers and DI frameworks (like Guice), SOLID and DRY principles, REST frameworks, ORMs (like Hibernate), and CI/CD solutions using Maven, Jenkins, TeamCity, etc. Deep expertise in object-oriented programming, preferably in Java and Java-based frameworks. Nice to have: experience with the AWS platform, specifically Lambda, API Gateway, and IAM. Nice to have: experience with Big Data, specifically EMR, Spark, and Scala. Strong analytic, problem-solving, and programming ability. Strong understanding of algorithms and data structures. Initiative and willingness to work in fast-paced, agile teams. Excellent organizational and task-management skills. Strong communication and persuasion skills. Ability and excitement to quickly learn new programming languages, platforms, and frameworks. Ability to work in global cross-office teams, including travelling to remote offices as required.

Perks & Benefits: ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth, and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths, and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working: a flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.

Travel: Travel is a requirement at ZS for client-facing ZSers; the business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.

Considering applying? At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above.

To Complete Your Application: Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE. Find out more at www.zs.com
Posted Date not available
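The full-stack listing above asks for familiarity with IoC containers and DI frameworks such as Guice. The core idea, constructor injection, can be sketched without any framework (the class names here are illustrative, not from the listing):

```python
from typing import Protocol

class Repository(Protocol):
    """Abstract dependency: anything with a get(key) -> str method."""
    def get(self, key: str) -> str: ...

class InMemoryRepository:
    """Concrete implementation used for this sketch (and handy in tests)."""
    def __init__(self) -> None:
        self._data = {"greeting": "hello"}

    def get(self, key: str) -> str:
        return self._data[key]

class GreetingService:
    # The dependency is injected, not constructed here -- the caller
    # (or an IoC container such as Guice in Java) decides which
    # implementation to wire in.
    def __init__(self, repo: Repository) -> None:
        self._repo = repo

    def greet(self) -> str:
        return self._repo.get("greeting").upper()

service = GreetingService(InMemoryRepository())
print(service.greet())  # HELLO
```

The design pay-off is that `GreetingService` never names a concrete repository, so swapping a database-backed implementation for the in-memory one requires no change to the service itself.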
2.0 - 5.0 years
14 - 19 Lacs
Chennai
Work from Office
What you'll do: Develop and maintain web applications using frontend frameworks (React, Angular, or Vue) and backend technologies (Node.js, Python, Java, or .NET). Integrate applications with cloud services (preferably AWS) and enterprise APIs. Design database models and manage data storage using relational or NoSQL databases. Optimize application performance, scalability, and security. Collaborate with DevOps on CI/CD pipeline integration and cloud deployments. Participate in code reviews, design discussions, and agile ceremonies.

What you'll bring: 1-3 years of experience in full-stack web application development. Strong knowledge of JavaScript frameworks (React, Angular) and backend technologies (Node.js, Python, Java). Experience working with AWS (EC2, Lambda, S3, API Gateway, RDS) preferred. Familiarity with RESTful API design, OAuth, and modern application security practices.

Additional skills: Strong communication skills, both verbal and written, with the ability to structure thoughts logically during discussions and presentations. Capability to simplify complex concepts into easily understandable frameworks and presentations. Proficiency in working within a virtual global team environment, contributing to the timely delivery of multiple projects. Travel to other offices as required to collaborate with clients and internal project teams.
Posted Date not available
3.0 - 8.0 years
15 - 20 Lacs
Chennai
Work from Office
What you'll do: Develop and maintain web applications using frontend frameworks (React, Angular, or Vue) and backend technologies (Node.js, Python, Java, or .NET). Integrate applications with cloud services (preferably AWS) and enterprise APIs. Design database models and manage data storage using relational or NoSQL databases. Optimize application performance, scalability, and security. Collaborate with DevOps on CI/CD pipeline integration and cloud deployments. Participate in code reviews, design discussions, and agile ceremonies.

What you'll bring: 3+ years of experience in full-stack web application development. Strong knowledge of JavaScript frameworks (React, Angular) and backend technologies (Node.js, Python, Java). Experience working with AWS (EC2, Lambda, S3, API Gateway, RDS) preferred. Familiarity with RESTful API design, OAuth, and modern application security practices.

Additional skills: Strong communication skills, both verbal and written, with the ability to structure thoughts logically during discussions and presentations. Capability to simplify complex concepts into easily understandable frameworks and presentations. Proficiency in working within a virtual global team environment, contributing to the timely delivery of multiple projects. Travel to other offices as required to collaborate with clients and internal project teams.
Posted Date not available
6.0 - 11.0 years
18 - 22 Lacs
Pune
Work from Office
As a Senior Cloud Engineering Lead, you will be an individual contributor and subject-matter expert who maintains and participates in the design and implementation of technology solutions. The engineer will collaborate within a team of technologists to produce enterprise-scale solutions for our clients' needs. This position will work with the latest Amazon Web Services technologies around cloud architecture, infrastructure automation, and network security.

What You'll Do: Identify, test, and prototype solutions and proofs of concept on public clouds. Help develop architectural standards and guidelines for scalability, performance, resilience, and efficient operations while adhering to necessary security and compliance standards. Work with application development teams to select and automate repeatable tasks, and participate and assist in root-cause analysis activities. Architect cloud solutions using industry-leading DevSecOps best practices and technologies. Review software product designs to ensure consistency with architectural best practices; participate in regular implementation reviews to ensure consistent quality and adherence to internal standards. Partner closely with cross-functional leaders (platform engineers, software development, product management, business leaders) to ensure a clear understanding of business and technical needs; jointly select the best strategy after evaluating the benefits and costs of different approaches. Closely collaborate with implementation teams to ensure understanding and utilization of the most optimal approach.

What You'll Bring: 6+ years in an Infrastructure Engineering / Software Engineering / DevOps role, deploying and maintaining SaaS applications. 4+ years of experience with AWS/Azure/GCP cloud technologies; at least one cloud proficiency certification is required. 3+ years functioning as a senior member of an infrastructure/software team. Hands-on experience with AWS services like Lambda, S3, RDS, EMR, CloudFormation (or Terraform), CodeBuild, Config, Systems Manager, Service Catalog, etc. Experience building automation using scripting languages like Bash/Python/PowerShell. Experience working on and contributing to software application development, deployment, and management processes. Experience architecting and implementing cloud-based solutions with robust business continuity and disaster recovery requirements. Experience working in agile teams with short release cycles. Strong verbal, written, and team presentation communication skills; ZS is a global firm, so fluency in English is required. This role requires healthy doses of initiative and the ability to remain flexible and responsive in a very dynamic environment. Ability to work around unknowns and develop robust solutions. Experience delivering quality work on defined tasks with limited oversight. Ability to quickly learn new platforms, cloud technologies, languages, tools, and techniques as needed to meet project requirements.
Posted Date not available
5.0 - 10.0 years
4 - 8 Lacs
bengaluru
Work from Office
About The Role What we offer: Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by embracing a technology-first approach in everything we do, with the aim of enhancing customer experience through superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many. About our team: DEX (Kotak's Data Exchange) is the central data org for Kotak Bank and manages the bank's entire data experience. It comprises the Data Platform, Data Engineering and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which offers a great opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills this team should encompass are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to be a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters.
As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member of Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch and analytics solutions, in a programmatic way; and look ahead to building systems that can be operated by machines using AI technologies. The data platform org is divided into three key verticals: Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake, a managed compute and orchestration framework (including serverless data solutions), a central data warehouse for extremely high-concurrency use cases, connectors for different sources, a customer feature repository, cost-optimization solutions such as EMR optimizers, automation, and observability capabilities for Kotak's data platform. The team will also be the center of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak. Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, cutting across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers and all analytics use cases. Data Governance: This will be the central data governance team for Kotak Bank, managing the metadata platform, data privacy, data security, data stewardship and the data quality platform.
If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency, multi-system environments, then this is the team for you. Your day-to-day role will include: drive business decisions with technical input and lead the team; design, implement, and support a data infrastructure from scratch; manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA; extract, transform, and load data from various sources using SQL and AWS big data technologies; explore and learn the latest AWS technologies to enhance capabilities and efficiency; collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis; improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers; build data platforms, data pipelines, or data management and governance tools. BASIC QUALIFICATIONS for Data Engineer / SDE in Data: Bachelor's degree in Computer Science, Engineering, or a related field; 3-5 years of experience in data engineering; strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR; experience with data pipeline tools such as Airflow and Spark; experience with data modeling and data quality best practices; excellent problem-solving and analytical skills; strong communication and teamwork skills; experience in at least one modern scripting or programming language, such as Python, Java, or Scala; strong advanced SQL skills. BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager: 10+ years of engineering experience, most of it in the data domain; 5+ years of engineering team management experience; 10+ years of planning, designing, developing and delivering consumer software; experience partnering with product or program management teams; 5+ years of experience managing data engineers, business intelligence engineers and/or data scientists; experience designing or architecting (design patterns,
reliability and scaling) new and existing systems; experience managing multiple concurrent programs, projects and development teams in an Agile environment; strong understanding of Data Platform, Data Engineering and Data Governance; experience partnering with product and program management teams; experience designing and developing large-scale, high-traffic applications. PREFERRED QUALIFICATIONS: AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow; prior experience in the Indian banking segment and/or fintech is desired; experience with non-relational databases and data stores; building and operating highly available, distributed data processing systems for large datasets; professional software engineering best practices for the full software development life cycle; designing, developing, and implementing different types of data warehousing layers; leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions; building scalable data infrastructure and understanding distributed systems concepts; SQL, ETL, and data modelling; ensuring the accuracy and availability of data to customers; proficiency in at least one scripting or programming language for handling large-volume data processing; strong presentation and communication skills.
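The SQL-and-ETL expectations in the qualifications above can be sketched with a minimal extract-transform-load pass. Here SQLite stands in for a warehouse such as Redshift, and the table and column names are invented for illustration.

```python
import sqlite3

def run_etl():
    """Toy ETL: land raw rows in a staging table, then aggregate per product.

    Extract = the raw_rows list (as data might arrive from a source system),
    transform = the GROUP BY aggregation, load = writing the reporting table.
    """
    raw_rows = [
        ("TXN1", "savings", 1500.0),
        ("TXN2", "current", -200.0),  # refunds arrive as negative amounts
        ("TXN3", "savings", 300.0),
    ]
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE txns (txn_id TEXT, product TEXT, amount REAL)")
    conn.executemany("INSERT INTO txns VALUES (?, ?, ?)", raw_rows)

    # Transform + load: per-product totals into a reporting table.
    conn.execute(
        "CREATE TABLE product_totals AS "
        "SELECT product, SUM(amount) AS total_amount FROM txns GROUP BY product"
    )
    totals = dict(conn.execute("SELECT product, total_amount FROM product_totals"))
    conn.close()
    return totals

if __name__ == "__main__":
    print(run_etl())  # {'current': -200.0, 'savings': 1800.0} (order may vary)
```

The same shape scales up directly: in a real pipeline the staging insert would be a bulk load (e.g. COPY from S3) and the aggregation would run as a scheduled warehouse query.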
Posted Date not available
5.0 - 10.0 years
3 - 5 Lacs
Bengaluru
Work from Office
About The Role What we offer: Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by embracing a technology-first approach in everything we do, with the aim of enhancing customer experience through superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many. About our team: DEX (Kotak's Data Exchange) is the central data org for Kotak Bank and manages the bank's entire data experience. It comprises the Data Platform, Data Engineering and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which offers a great opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills this team should encompass are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to be a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters.
As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member of Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch and analytics solutions, in a programmatic way; and look ahead to building systems that can be operated by machines using AI technologies. The data platform org is divided into three key verticals: Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake, a managed compute and orchestration framework (including serverless data solutions), a central data warehouse for extremely high-concurrency use cases, connectors for different sources, a customer feature repository, cost-optimization solutions such as EMR optimizers, automation, and observability capabilities for Kotak's data platform. The team will also be the center of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak. Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, cutting across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers and all analytics use cases. Data Governance: This will be the central data governance team for Kotak Bank, managing the metadata platform, data privacy, data security, data stewardship and the data quality platform.
If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency, multi-system environments, then this is the team for you. Your day-to-day role will include: drive business decisions with technical input and lead the team; design, implement, and support a data infrastructure from scratch; manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA; extract, transform, and load data from various sources using SQL and AWS big data technologies; explore and learn the latest AWS technologies to enhance capabilities and efficiency; collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis; improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers; build data platforms, data pipelines, or data management and governance tools. BASIC QUALIFICATIONS for Data Engineer / SDE in Data: Bachelor's degree in Computer Science, Engineering, or a related field; 3-5 years of experience in data engineering; strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR; experience with data pipeline tools such as Airflow and Spark; experience with data modeling and data quality best practices; excellent problem-solving and analytical skills; strong communication and teamwork skills; experience in at least one modern scripting or programming language, such as Python, Java, or Scala; strong advanced SQL skills. BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager: 10+ years of engineering experience, most of it in the data domain; 5+ years of engineering team management experience; 10+ years of planning, designing, developing and delivering consumer software; experience partnering with product or program management teams; 5+ years of experience managing data engineers, business intelligence engineers and/or data scientists; experience designing or architecting (design patterns,
reliability and scaling) new and existing systems; experience managing multiple concurrent programs, projects and development teams in an Agile environment; strong understanding of Data Platform, Data Engineering and Data Governance; experience partnering with product and program management teams; experience designing and developing large-scale, high-traffic applications. PREFERRED QUALIFICATIONS: AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow; prior experience in the Indian banking segment and/or fintech is desired; experience with non-relational databases and data stores; building and operating highly available, distributed data processing systems for large datasets; professional software engineering best practices for the full software development life cycle; designing, developing, and implementing different types of data warehousing layers; leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions; building scalable data infrastructure and understanding distributed systems concepts; SQL, ETL, and data modelling; ensuring the accuracy and availability of data to customers; proficiency in at least one scripting or programming language for handling large-volume data processing; strong presentation and communication skills.
Posted Date not available
2.0 - 5.0 years
4 - 8 Lacs
Bengaluru
Work from Office
About The Role Data Engineer -1 (Experience 0-2 years) What we offer: Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by embracing a technology-first approach in everything we do, with the aim of enhancing customer experience through superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many. About our team: DEX (Kotak's Data Exchange) is the central data org for Kotak Bank and manages the bank's entire data experience. It comprises the Data Platform, Data Engineering and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which offers a great opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills this team should encompass are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to be a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters.
As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member of Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch and analytics solutions, in a programmatic way; and look ahead to building systems that can be operated by machines using AI technologies. The data platform org is divided into three key verticals: Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake, a managed compute and orchestration framework (including serverless data solutions), a central data warehouse for extremely high-concurrency use cases, connectors for different sources, a customer feature repository, cost-optimization solutions such as EMR optimizers, automation, and observability capabilities for Kotak's data platform. The team will also be the center of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak. Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, cutting across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers and all analytics use cases. Data Governance: This will be the central data governance team for Kotak Bank, managing the metadata platform, data privacy, data security, data stewardship and the data quality platform.
If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency, multi-system environments, then this is the team for you. Your day-to-day role will include: drive business decisions with technical input and lead the team; design, implement, and support a data infrastructure from scratch; manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA; extract, transform, and load data from various sources using SQL and AWS big data technologies; explore and learn the latest AWS technologies to enhance capabilities and efficiency; collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis; improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers; build data platforms, data pipelines, or data management and governance tools. BASIC QUALIFICATIONS for Data Engineer / SDE in Data: Bachelor's degree in Computer Science, Engineering, or a related field; experience in data engineering; strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR; experience with data pipeline tools such as Airflow and Spark; experience with data modeling and data quality best practices; excellent problem-solving and analytical skills; strong communication and teamwork skills; experience in at least one modern scripting or programming language, such as Python, Java, or Scala; strong advanced SQL skills. PREFERRED QUALIFICATIONS: AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow; prior experience in the Indian banking segment and/or fintech is desired.
Experience with non-relational databases and data stores; building and operating highly available, distributed data processing systems for large datasets; professional software engineering best practices for the full software development life cycle; designing, developing, and implementing different types of data warehousing layers; leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions; building scalable data infrastructure and understanding distributed systems concepts; SQL, ETL, and data modelling; ensuring the accuracy and availability of data to customers; proficiency in at least one scripting or programming language for handling large-volume data processing; strong presentation and communication skills.
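The posting above describes building data models "in a config based and programmatic" way. A minimal sketch of that idea is a config-driven pipeline runner, where a dataset's processing is declared as configuration and generic steps do the work; the step names and config shape here are invented for illustration.

```python
# Each dataset is described by config; the same generic steps run for all.
CONFIG = {
    "dataset": "branch_transactions",
    "steps": ["extract", "clean", "load"],
}

def extract(rows):
    # Stand-in for reading from a source system.
    return list(rows)

def clean(rows):
    # Drop rows with a missing amount.
    return [r for r in rows if r.get("amount") is not None]

def load(rows):
    # Stand-in for writing to a lake/warehouse: just hand the rows back.
    return rows

STEP_REGISTRY = {"extract": extract, "clean": clean, "load": load}

def run_pipeline(config, rows):
    """Run the configured steps, in order, over the input rows."""
    for step_name in config["steps"]:
        rows = STEP_REGISTRY[step_name](rows)
    return rows

if __name__ == "__main__":
    data = [{"amount": 10.0}, {"amount": None}, {"amount": 5.5}]
    print(run_pipeline(CONFIG, data))  # the None-amount row is dropped
```

Onboarding a new dataset then means adding a config entry rather than writing new pipeline code, which is what makes the approach scale to thousands of datasets.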
Posted Date not available
8.0 - 13.0 years
30 - 32 Lacs
Bengaluru
Work from Office
About The Role Data Engineer -2 (Experience 2-5 years) What we offer: Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by embracing a technology-first approach in everything we do, with the aim of enhancing customer experience through superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many. About our team: DEX (Kotak's Data Exchange) is the central data org for Kotak Bank and manages the bank's entire data experience. It comprises the Data Platform, Data Engineering and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which offers a great opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills this team should encompass are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to be a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters.
As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member of Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch and analytics solutions, in a programmatic way; and look ahead to building systems that can be operated by machines using AI technologies. The data platform org is divided into three key verticals: Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake, a managed compute and orchestration framework (including serverless data solutions), a central data warehouse for extremely high-concurrency use cases, connectors for different sources, a customer feature repository, cost-optimization solutions such as EMR optimizers, automation, and observability capabilities for Kotak's data platform. The team will also be the center of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak. Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, cutting across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers and all analytics use cases. Data Governance: This will be the central data governance team for Kotak Bank, managing the metadata platform, data privacy, data security, data stewardship and the data quality platform.
If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency, multi-system environments, then this is the team for you. Your day-to-day role will include: drive business decisions with technical input and lead the team; design, implement, and support a data infrastructure from scratch; manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA; extract, transform, and load data from various sources using SQL and AWS big data technologies; explore and learn the latest AWS technologies to enhance capabilities and efficiency; collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis; improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers; build data platforms, data pipelines, or data management and governance tools. BASIC QUALIFICATIONS for Data Engineer / SDE in Data: Bachelor's degree in Computer Science, Engineering, or a related field; experience in data engineering; strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR; experience with data pipeline tools such as Airflow and Spark; experience with data modeling and data quality best practices; excellent problem-solving and analytical skills; strong communication and teamwork skills; experience in at least one modern scripting or programming language, such as Python, Java, or Scala; strong advanced SQL skills. PREFERRED QUALIFICATIONS: AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow; prior experience in the Indian banking segment and/or fintech is desired.
Experience with non-relational databases and data stores; building and operating highly available, distributed data processing systems for large datasets; professional software engineering best practices for the full software development life cycle; designing, developing, and implementing different types of data warehousing layers; leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions; building scalable data infrastructure and understanding distributed systems concepts; SQL, ETL, and data modelling; ensuring the accuracy and availability of data to customers; proficiency in at least one scripting or programming language for handling large-volume data processing; strong presentation and communication skills.
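The "data quality best practices" requirement above can be illustrated with two of the simplest checks a pipeline typically runs before publishing a dataset: a null-rate check and a uniqueness check. Thresholds, column names and the row shape are assumptions for this sketch.

```python
def null_rate(rows, column):
    """Fraction of rows where the given column is missing or None."""
    if not rows:
        return 0.0
    missing = sum(1 for r in rows if r.get(column) is None)
    return missing / len(rows)

def is_unique(rows, column):
    """True if every non-null value in the column appears exactly once."""
    values = [r.get(column) for r in rows if r.get(column) is not None]
    return len(values) == len(set(values))

if __name__ == "__main__":
    rows = [
        {"txn_id": "T1", "amount": 10.0},
        {"txn_id": "T2", "amount": None},
        {"txn_id": "T2", "amount": 3.0},  # duplicate id: fails uniqueness
    ]
    # A pipeline would compare these against configured thresholds
    # and block publication when a check fails.
    print(null_rate(rows, "amount"))
    print(is_unique(rows, "txn_id"))
```

Real deployments usually express such checks declaratively per dataset (which columns must be non-null, which form the primary key) and alert or quarantine on failure rather than silently dropping data.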
Posted Date not available
5.0 - 10.0 years
30 - 35 Lacs
Bengaluru
Work from Office
About The Role What we offer Our mission is simple Building trust. Our customer's trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That"™s why, we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to come join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many. About our team DEX is a central data org for Kotak Bank which manages entire data experience of Kotak Bank. DEX stands for Kotak"™s Data Exchange. This org comprises of Data Platform, Data Engineering and Data Governance charter. The org sits closely with Analytics org. DEX is primarily working on greenfield project to revamp entire data platform which is on premise solutions to scalable AWS cloud-based platform. The team is being built ground up which provides great opportunities to technology fellows to build things from scratch and build one of the best-in-class data lake house solutions. The primary skills this team should encompass are Software development skills preferably Python for platform building on AWS; Data engineering Spark (pyspark, sparksql, scala) for ETL development, Advanced SQL and Data modelling for Analytics. The org size is expected to be around 100+ member team primarily based out of Bangalore comprising of ~10 sub teams independently driving their charter. 
As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; to be an early member of Kotak's digital transformation journey; to learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and to help build systems that can one day be operated by machines using AI technologies. The data platform org is divided into three key verticals:
Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank; a centralized data lake; managed compute and orchestration frameworks, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak's data platform. The team will also be the center of data engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.
Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way, and think big to build one of the most leveraged data models for a financial org. It will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases.
Data Governance: This will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform.
If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems spanning multiple source systems, this is the team for you. Your day-to-day role will include:
Drive business decisions with technical input and lead the team.
Design, implement, and support data infrastructure from scratch.
Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
Extract, transform, and load data from various sources using SQL and AWS big data technologies.
Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
Build data platforms, data pipelines, or data management and governance tools.
BASIC QUALIFICATIONS for Data Engineer / SDE in Data
Bachelor's degree in Computer Science, Engineering, or a related field.
3-5 years of experience in data engineering.
Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR.
Experience with data pipeline tools such as Airflow and Spark.
Experience with data modeling and data quality best practices.
Excellent problem-solving and analytical skills.
Strong communication and teamwork skills.
Experience in at least one modern scripting or programming language, such as Python, Java, or Scala.
Strong advanced SQL skills.
BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager
10+ years of engineering experience, most of it in the data domain.
5+ years of engineering team management experience.
10+ years of planning, designing, developing, and delivering consumer software.
Experience partnering with product or program management teams.
5+ years of experience managing data engineers, business intelligence engineers, and/or data scientists.
Experience designing or architecting (design patterns, reliability, and scaling) new and existing systems.
Experience managing multiple concurrent programs, projects, and development teams in an Agile environment.
Strong understanding of Data Platform, Data Engineering, and Data Governance.
Experience designing and developing large-scale, high-traffic applications.
PREFERRED QUALIFICATIONS
AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow.
Prior experience in the Indian banking segment and/or fintech is desired.
Experience with non-relational databases and data stores.
Building and operating highly available, distributed data processing systems for large datasets.
Professional software engineering best practices across the full software development life cycle.
Designing, developing, and implementing different types of data warehousing layers.
Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions.
Building scalable data infrastructure and a solid understanding of distributed systems concepts.
SQL, ETL, and data modelling.
Ensuring the accuracy and availability of data to customers.
Proficiency in at least one scripting or programming language for large-volume data processing.
Strong presentation and communication skills.
Posted Date not available
5.0 - 10.0 years
30 - 32 Lacs
bengaluru
Work from Office
About The Role
Data Engineering Manager: 5-9 years | Software Development Manager: 9+ years
Kotak Mahindra Bank, Bengaluru, Karnataka, India (On-site)
What we offer
Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by taking a technology-first approach in everything we do, with the aim of enhancing customer experience through superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.
About our team
DEX (Kotak's Data Exchange) is the central data organization for Kotak Bank and manages the bank's entire data experience. The org comprises the Data Platform, Data Engineering, and Data Governance charters, and it works closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, migrating from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists a great opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills for this team are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters.
As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; to be an early member of Kotak's digital transformation journey; to learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and to help build systems that can one day be operated by machines using AI technologies. The data platform org is divided into three key verticals:
Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank; a centralized data lake; managed compute and orchestration frameworks, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak's data platform. The team will also be the center of data engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.
Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way, and think big to build one of the most leveraged data models for a financial org. It will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases.
Data Governance: This will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform.
If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems spanning multiple source systems, this is the team for you. Your day-to-day role will include:
Drive business decisions with technical input and lead the team.
Design, implement, and support data infrastructure from scratch.
Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
Extract, transform, and load data from various sources using SQL and AWS big data technologies.
Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
Build data platforms, data pipelines, or data management and governance tools.
BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager
10+ years of engineering experience, most of it in the data domain.
5+ years of engineering team management experience.
10+ years of planning, designing, developing, and delivering consumer software.
Experience partnering with product or program management teams.
5+ years of experience managing data engineers, business intelligence engineers, and/or data scientists.
Experience designing or architecting (design patterns, reliability, and scaling) new and existing systems.
Experience managing multiple concurrent programs, projects, and development teams in an Agile environment.
Strong understanding of Data Platform, Data Engineering, and Data Governance.
Experience designing and developing large-scale, high-traffic applications.
PREFERRED QUALIFICATIONS
AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow.
Prior experience in the Indian banking segment and/or fintech is desired.
Experience with non-relational databases and data stores.
Building and operating highly available, distributed data processing systems for large datasets.
Professional software engineering best practices across the full software development life cycle.
Designing, developing, and implementing different types of data warehousing layers.
Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions.
Building scalable data infrastructure and a solid understanding of distributed systems concepts.
SQL, ETL, and data modelling.
Ensuring the accuracy and availability of data to customers.
Proficiency in at least one scripting or programming language for large-volume data processing.
Strong presentation and communication skills.
For managers:
Customer centricity and an obsession for the customer.
Ability to manage stakeholders (product owners, business stakeholders, cross-functional teams) and coach agile ways of working.
Ability to structure and organize teams and streamline communication.
Prior work experience executing large-scale data engineering projects.
Posted Date not available