3.0 - 5.0 years
5 - 7 Lacs
Bengaluru, Karnataka
Work from Office
Data Governance

The team will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform. If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you.

Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer
- Bachelor's degree in Computer Science, Engineering, or a related field
- 3-5 years of experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

PREFERRED QUALIFICATIONS
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices for the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills

For Managers
- Customer centricity and obsession for the customer
- Ability to manage stakeholders (product owners, business stakeholders, cross-functional teams) and coach agile ways of working
- Ability to structure and organize teams and streamline communication
- Prior work experience executing large-scale data engineering projects
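The responsibilities above center on extract-transform-load work. As a tool-agnostic sketch of that pattern (hypothetical file, column, and table names; in this role the same shape would be built with Spark/Glue on AWS rather than sqlite3):

```python
import csv
import io
import sqlite3

def extract(csv_text):
    """Extract: parse raw CSV rows into dictionaries."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: cast types and drop malformed records."""
    clean = []
    for row in rows:
        try:
            clean.append((row["account_id"], float(row["balance"])))
        except (KeyError, ValueError):
            continue  # skip rows that fail validation
    return clean

def load(conn, records):
    """Load: idempotent upsert into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS balances "
        "(account_id TEXT PRIMARY KEY, balance REAL)"
    )
    conn.executemany(
        "INSERT INTO balances VALUES (?, ?) "
        "ON CONFLICT(account_id) DO UPDATE SET balance = excluded.balance",
        records,
    )
    conn.commit()

raw = "account_id,balance\nA1,100.50\nA2,oops\nA3,250.00\n"
conn = sqlite3.connect(":memory:")
load(conn, transform(extract(raw)))
total = conn.execute("SELECT COUNT(*), SUM(balance) FROM balances").fetchone()
print(total)  # (2, 350.5) — the malformed A2 row is dropped
```

The upsert in `load` makes re-runs safe, which is the property pipeline schedulers like Airflow rely on when retrying a failed task.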
Posted 1 month ago
8.0 - 13.0 years
27 - 32 Lacs
Pune
Work from Office
You will:
- Deliver relevant insights that inform and influence strategic corporate programs through the derivation and curation of key performance metrics
- Develop an understanding of both business processes and the technologies that enable them, delivering insights that are grounded in operational reality and technical feasibility
- Help translate complex data into meaningful narratives that inspire executive decision-making and shape future performance
- Find success in this role by demonstrating the ability to connect the dots between data, strategy, and outcomes

You will be part of the Global Analytics and Insights organization, reporting to the Pre/Post Sales Insights team. You will partner with Marketing, Sales Development, Customer Success, Global Support, Go-Live, Compliance, and many other teams to provide detailed analyses and insights.

What Your Responsibilities Will Be
- Lead the value stream for critical business processes, ensuring alignment between insights, operational goals, and the development of scalable, long-term analytical solutions
- Facilitate and manage cross-functional partnerships, agreement, and execution of process improvements and strategic programs
- Deliver data-driven insights and performance metrics that inform key decisions and support the successful implementation and measurement of business strategies
- Engage with senior executives and partners, including major customers, to present findings, influence direction, and ensure solutions address our needs

You will report to the Senior Director, Value Creation Insights.

What You'll Need to Be Successful
- A minimum of 8 years of related experience with a Bachelor's degree, 5 years with a Master's degree, or an equivalent combination of education and experience
- Demonstrated advanced SQL skills and data modeling expertise
- Proven ability to bring in best practices and streamline workflows
- Ability to create solutions for process improvement and efficiency gains
- Subject matter expertise (SME) and intellectual curiosity to achieve results
Posted 1 month ago
6.0 - 11.0 years
15 - 30 Lacs
Bengaluru
Work from Office
Hello job seekers, I am hiring for a senior Adobe Analytics engineer role for my client.
Location: Bangalore
Experience: 6-12 years
Notice period: Immediate to 30 days

Must have:
- 6+ years of overall experience working in web analytics or a related field
- Bachelor's/Master's degree in Computer Science or equivalent work experience
- Solid understanding of online marketing, tools, and technology
- Strong understanding of HTML and web protocols
- Strong-to-advanced JavaScript skills
- Passion for the internet domain and the use of technology to solve business problems
- Solid understanding of general business models, concepts, and strategies
- Self-motivated, responsive, professional, and dedicated to customer success
- An innovative, problem-solving, solutions-oriented mindset
- Exceptional organizational, presentation, and communication skills, both verbal and written
- Demonstrated ability to learn quickly, be a team player, and manage change effectively
- Extensive knowledge of Microsoft Office

Special consideration given for:
- Previous experience working with Adobe Analytics or similar tools
- Website optimization consulting experience
- Advanced SQL skills, data modelling / data warehousing skills
- Web development experience
- Experience working with mobile/media analytics implementations down to core code-level details and development
- Experience using APIs, with hands-on development experience in languages such as Node.js or Python
- ERP, SaaS, or other software implementation experience
- Deep vertical industry experience (e.g., retail, media, financial services, high tech)
- Expertise in mobile or social media analytics

Kindly share your resume at chanchal@oitindia.com, or share this job in your network.
Posted 1 month ago
5.0 - 10.0 years
20 - 35 Lacs
Hyderabad, Gurugram, Bengaluru
Hybrid
Salary: 20 to 35 LPA
Experience: 5 to 10 years
Location: Bangalore/Gurgaon
Notice: Immediate joiners only
Key skills: SQL, advanced SQL, BI tools, ETL

Roles and Responsibilities
- Extract, manipulate, and analyze large datasets from various sources such as Hive, SQL databases, and BI tools.
- Develop and maintain dashboards using Tableau to provide insights on banking performance, market trends, and customer behavior.
- Collaborate with cross-functional teams to identify key performance indicators (KPIs) and develop data visualizations to drive business decisions.

Desired Candidate Profile
- 6-10 years of experience in data analytics or a related field, with expertise in banking analytics, business intelligence, campaign analytics, marketing analytics, etc.
- Strong proficiency in tools like Tableau for data visualization; advanced SQL knowledge preferred.
- Experience working with big data technologies such as the Hadoop ecosystem (Hive) and Spark; familiarity with Python required.
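"Advanced SQL" in analytics roles like this usually means window functions and aggregation beyond a plain GROUP BY. A small illustrative example with a hypothetical transactions schema, runnable against SQLite for brevity:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE txns (branch TEXT, month TEXT, amount REAL);
INSERT INTO txns VALUES
  ('BLR', '2024-01', 100), ('BLR', '2024-02', 150),
  ('GGN', '2024-01', 200), ('GGN', '2024-02', 120);
""")

# Month-over-month running total per branch: a typical KPI query
# built with a window function instead of a self-join.
rows = conn.execute("""
    SELECT branch, month, amount,
           SUM(amount) OVER (
               PARTITION BY branch ORDER BY month
           ) AS running_total
    FROM txns
    ORDER BY branch, month
""").fetchall()
for r in rows:
    print(r)
```

The same query shape ports to Hive or any warehouse SQL dialect; only the DDL and connection differ.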
Posted 1 month ago
2.0 - 5.0 years
30 - 32 Lacs
Bengaluru
Work from Office
Data Engineer - 2 (Experience: 2-5 years)

What we offer
Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with the aim of enhancing customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team
DEX is the central data org for Kotak Bank and manages the bank's entire data experience. DEX stands for Kotak's Data Exchange. The org comprises the Data Platform, Data Engineering, and Data Governance charters, and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities for technologists to build things from scratch and create a best-in-class data lakehouse solution. The primary skills this team should encompass are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to be a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters.

As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member of Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and build forward-looking systems that can be operated by machines using AI technologies.

The data platform org is divided into three key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; a managed compute and orchestration framework, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automations; and observability capabilities for Kotak's data platform. The team will also be the center of data engineering excellence, driving trainings and knowledge-sharing sessions with the large data-consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, cutting across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases.

Data Governance: This team will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform.
If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you.

Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data
- Bachelor's degree in Computer Science, Engineering, or a related field
- Experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

PREFERRED QUALIFICATIONS
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices for the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills
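The qualifications above call out data quality best practices and ensuring data accuracy for customers. One common pattern is a set of declarative column checks run before a dataset is published; a minimal sketch (hypothetical rules and field names, not any bank's actual tooling):

```python
# Declarative data-quality checks: each rule is a named predicate
# applied row-wise; failures are counted rather than raised, so a
# pipeline can decide whether to block publication.

RULES = {
    "account_id_present": lambda row: bool(row.get("account_id")),
    "balance_non_negative": lambda row: row.get("balance", 0) >= 0,
}

def run_checks(rows, rules=RULES):
    """Return a failure count per rule over all rows."""
    failures = {name: 0 for name in rules}
    for row in rows:
        for name, predicate in rules.items():
            if not predicate(row):
                failures[name] += 1
    return failures

data = [
    {"account_id": "A1", "balance": 120.0},
    {"account_id": "", "balance": -5.0},
    {"account_id": "A3", "balance": 0.0},
]
report = run_checks(data)
print(report)  # {'account_id_present': 1, 'balance_non_negative': 1}
```

Keeping rules as data (rather than inline `if` statements) is what makes a check suite config-driven, in the spirit of the config-based pipelines described above.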
Posted 1 month ago
5.0 - 9.0 years
30 - 32 Lacs
Bengaluru
Work from Office
Data Engineering Manager: 5-9 years | Software Development Manager: 9+ years
Kotak Mahindra Bank, Bengaluru, Karnataka, India (On-site)

What we offer
Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with the aim of enhancing customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team
DEX is the central data org for Kotak Bank and manages the bank's entire data experience. DEX stands for Kotak's Data Exchange. The org comprises the Data Platform, Data Engineering, and Data Governance charters, and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities for technologists to build things from scratch and create a best-in-class data lakehouse solution. The primary skills this team should encompass are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to be a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters.

As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member of Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and build forward-looking systems that can be operated by machines using AI technologies.

The data platform org is divided into three key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; a managed compute and orchestration framework, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automations; and observability capabilities for Kotak's data platform. The team will also be the center of data engineering excellence, driving trainings and knowledge-sharing sessions with the large data-consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, cutting across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases.

Data Governance: This team will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform.
If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you.

Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager
- 10+ years of engineering experience, most of it in the data domain
- 5+ years of engineering team management experience
- 10+ years of experience planning, designing, developing, and delivering consumer software
- Experience partnering with product or program management teams
- 5+ years of experience managing data engineers, business intelligence engineers, and/or data scientists
- Experience designing or architecting (design patterns, reliability, and scaling) new and existing systems
- Experience managing multiple concurrent programs, projects, and development teams in an Agile environment
- Strong understanding of data platform, data engineering, and data governance
- Experience designing and developing large-scale, high-traffic applications

PREFERRED QUALIFICATIONS
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices for the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills

For Managers
- Customer centricity and obsession for the customer
- Ability to manage stakeholders (product owners, business stakeholders, cross-functional teams) and coach agile ways of working
- Ability to structure and organize teams and streamline communication
- Prior work experience executing large-scale data engineering projects
Posted 1 month ago
3.0 - 8.0 years
15 - 30 Lacs
Pune, Gurugram, Bengaluru
Hybrid
Salary: 15 to 30 LPA
Experience: 3 to 8 years
Location: Bangalore/Gurgaon
Notice: Immediate joiners only
Key skills: SQL, advanced SQL, BI tools, ETL

Roles and Responsibilities
- Extract, manipulate, and analyze large datasets from various sources such as Hive, SQL databases, and ETL processes.
- Develop and maintain dashboards using Tableau to provide insights on banking performance, market trends, and customer behavior.
- Collaborate with cross-functional teams to identify key performance indicators (KPIs) and develop data visualizations to drive business decisions.

Desired Candidate Profile
- 3-8 years of experience in data analytics or a related field, with expertise in banking analytics, business intelligence, campaign analytics, marketing analytics, etc.
- Strong proficiency in tools like Tableau for data visualization; advanced SQL knowledge preferred.
- Experience working with big data technologies such as the Hadoop ecosystem (Hive) and Spark; familiarity with Python required.
Posted 1 month ago
8.0 - 12.0 years
10 - 20 Lacs
Noida, Greater Noida
Work from Office
Hi All,
We are looking for a Sr. Business Analyst.
Experience: 8+ years
CTC: 27L
Skills: Advanced SQL, Power BI, data build tool (dbt), Alteryx, Azure (a combination of any two of these)
If interested, call me on 9820389632 or mail your resume to vinoda@phebushr.com
Posted 1 month ago
5.0 - 10.0 years
20 - 25 Lacs
Noida
Remote
Architect and manage data solutions using Snowflake and advanced SQL. Design and implement data pipelines, data warehouses, and data lakes, ensuring efficient data transformation. Develop best practices for data security, access control, and compliance.

Required Candidate Profile
- Experience: 8-14 years
- Strong data architecture background; SQL and Snowflake experience is a must
- Collaborate with cross-functional teams, integrate their requirements, and translate them into robust data architectures
- Manufacturing industry experience is a must
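The role calls for best practices in data security and access control. One simple building block is column-level masking applied by role before data leaves the warehouse layer; a sketch with hypothetical role and field names, not any specific Snowflake feature:

```python
# Column-level masking by role: analysts see masked PII, while an
# "auditor" role sees raw values. Field and role names are illustrative.

SENSITIVE_FIELDS = {"pan", "phone"}

def mask(value):
    """Keep the last 4 characters, mask the rest."""
    s = str(value)
    return "*" * max(len(s) - 4, 0) + s[-4:]

def apply_policy(record, role):
    """Return a copy of the record with PII masked for non-auditors."""
    if role == "auditor":
        return dict(record)
    return {
        k: (mask(v) if k in SENSITIVE_FIELDS else v)
        for k, v in record.items()
    }

row = {"customer": "C42", "pan": "ABCDE1234F", "phone": "9820012345"}
print(apply_policy(row, "analyst"))  # pan/phone masked
print(apply_policy(row, "auditor")["pan"])  # raw value
```

Warehouses such as Snowflake offer native masking policies that express the same idea declaratively; the sketch just shows the underlying behavior.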
Posted 1 month ago
8.0 - 10.0 years
25 - 30 Lacs
Bengaluru
Work from Office
We are looking for an experienced data modelling professional, proficient in tools such as Erwin and ER/Studio. A strong understanding of Azure Databricks, Snowflake/Redshift, SAP HANA, and advanced SQL is required. Prior experience leading teams is also preferred.
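The dimensional models that tools like Erwin and ER/Studio are used to design can be illustrated with a minimal star schema: one fact table of measures keyed to dimension tables of descriptive attributes. A sketch with hypothetical tables, run in SQLite purely for brevity:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Dimensions describe the who/what/when; the fact table holds measures.
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, month TEXT);
CREATE TABLE fact_sales (
    product_key INTEGER REFERENCES dim_product(product_key),
    date_key    INTEGER REFERENCES dim_date(date_key),
    revenue     REAL
);
INSERT INTO dim_product VALUES (1, 'Widget'), (2, 'Gadget');
INSERT INTO dim_date VALUES (10, '2024-01'), (11, '2024-02');
INSERT INTO fact_sales VALUES (1, 10, 500), (1, 11, 700), (2, 10, 300);
""")

# Analytical queries join the fact table to the dimensions it references.
rows = conn.execute("""
    SELECT p.name, SUM(f.revenue)
    FROM fact_sales f JOIN dim_product p USING (product_key)
    GROUP BY p.name ORDER BY p.name
""").fetchall()
print(rows)  # [('Gadget', 300.0), ('Widget', 1200.0)]
```

The same schema shape maps directly onto Databricks, Snowflake, or Redshift; only the DDL dialect changes.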
Posted 1 month ago
3.0 - 5.0 years
30 - 32 Lacs
India, Bengaluru
Work from Office
Job Title: Data Engineer (DE) / SDE - Data
Location: Bangalore
Experience range: 3-15 years

What we offer
Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with the aim of enhancing customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team
DEX is the central data org for Kotak Bank and manages the bank's entire data experience. DEX stands for Kotak's Data Exchange. The org comprises the Data Platform, Data Engineering, and Data Governance charters, and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities for technologists to build things from scratch and create a best-in-class data lakehouse solution. The primary skills this team should encompass are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to be a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters.

As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member of Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and build forward-looking systems that can be operated by machines using AI technologies.

The data platform org is divided into three key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; a managed compute and orchestration framework, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automations; and observability capabilities for Kotak's data platform. The team will also be the center of data engineering excellence, driving trainings and knowledge-sharing sessions with the large data-consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, cutting across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases.

Data Governance: This team will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform.
If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you.

Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data
- Bachelor's degree in Computer Science, Engineering, or a related field
- 3-5 years of experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

PREFERRED QUALIFICATIONS
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices for the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills

For Managers
- Customer centricity and obsession for the customer
- Ability to manage stakeholders (product owners, business stakeholders, cross-functional teams) and coach agile ways of working
- Ability to structure and organize teams and streamline communication
- Prior work experience executing large-scale data engineering projects
Posted 1 month ago
8.0 - 12.0 years
30 - 35 Lacs
Mumbai, Pune
Hybrid
Essential Duties & Responsibilities:
- Provides support to the engineering teams with high attention to detail
- Researches, analyzes, and documents findings; may influence others within the software engineering team through the explanation of facts, policies, and practices
- Designs, builds, and maintains large-scale production services, web applications, data pipelines, and streaming systems
- Works on systems critical to the company's current and future operations
- Debugs production issues across services and multiple levels of the stack
- Assists with improvement of organizational engineering standards, tooling, and processes
- Participates in the testing process through test review and analysis, test witnessing, and certification of software
- Evaluates code to ensure validity, proper structure, alignment with industry standards, and compatibility with operating systems
- Maintains an understanding of current technologies and programming practices through continuing education, reading, or participation in professional conferences, workshops, and/or groups

Required Skills
- Hands-on experience with Java 8+ and object-oriented programming principles
- Understanding of JVM internals, garbage collection, and performance tuning
- Concurrency and multithreading concepts
- Exception handling and debugging techniques
- Experience with testing frameworks (JUnit, Mockito)
- Understanding of application server deployment and configuration
- Spring Boot: strong experience building RESTful APIs using the Spring Framework
- Spring: knowledge of additional Spring modules, including Spring Security, Spring Data JPA, and Spring Web MVC
- Maven: experience with Maven for project management, dependency management, and build automation
- SQL: advanced SQL skills, including complex queries, database optimization, and performance tuning
- Database: experience with relational databases (PostgreSQL)
- Version Control: proficiency with Git and collaborative development workflows

Qualifications: BE/BTech/MCA
Posted 1 month ago
8.0 - 13.0 years
30 - 35 Lacs
Bengaluru
Work from Office
About The Role Data Engineer -1 (Experience 0-2 years) What we offer Our mission is simple Building trust. Our customer's trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That"s why, we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to come join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many. About our team DEX is a central data org for Kotak Bank which manages entire data experience of Kotak Bank. DEX stands for Kotak"s Data Exchange. This org comprises of Data Platform, Data Engineering and Data Governance charter. The org sits closely with Analytics org. DEX is primarily working on greenfield project to revamp entire data platform which is on premise solutions to scalable AWS cloud-based platform. The team is being built ground up which provides great opportunities to technology fellows to build things from scratch and build one of the best-in-class data lake house solutions. 
The primary skills this team should encompass are software development skills (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org size is expected to be around 100+ members, primarily based out of Bangalore and comprising ~10 sub-teams independently driving their charters. As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member in Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions in a programmatic way; and be futuristic in building systems that can be operated by machines using AI technologies. The data platform org is divided into 3 key verticals: Data Platform This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and building a centralized data lake, managed compute and orchestration frameworks (including serverless data solutions), managing the central data warehouse for extremely high-concurrency use cases, building connectors for different sources, building a customer feature repository, building cost-optimization solutions like EMR optimizers, performing automations, and building observability capabilities for Kotak's data platform. The team will also be the center of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak. Data Engineering This team will own data pipelines for thousands of datasets, be skilled at sourcing data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way, and think big to build one of the most leveraged data models for financial orgs.
This team will also enable centralized reporting for Kotak Bank, which cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, Branch Managers, and all analytics use cases. Data Governance The team will be the central data governance team for Kotak Bank, managing metadata platforms, Data Privacy, Data Security, Data Stewardship, and the Data Quality platform. If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you. Your day-to-day role will include: Drive business decisions with technical input and lead the team. Design, implement, and support a data infrastructure from scratch. Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA. Extract, transform, and load data from various sources using SQL and AWS big data technologies. Explore and learn the latest AWS technologies to enhance capabilities and efficiency. Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis. Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers. Build data platforms, data pipelines, or data management and governance tools.
BASIC QUALIFICATIONS for Data Engineer/SDE in Data Bachelor's degree in Computer Science, Engineering, or a related field Experience in data engineering Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR Experience with data pipeline tools such as Airflow and Spark Experience with data modeling and data quality best practices Excellent problem-solving and analytical skills Strong communication and teamwork skills Experience in at least one modern scripting or programming language, such as Python, Java, or Scala Strong advanced SQL skills PREFERRED QUALIFICATIONS AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow Prior experience in the Indian Banking segment and/or Fintech is desired. Experience with non-relational databases and data stores Building and operating highly available, distributed data processing systems for large datasets Professional software engineering and best practices for the full software development life cycle Designing, developing, and implementing different types of data warehousing layers Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions Building scalable data infrastructure and understanding distributed systems concepts SQL, ETL, and data modelling Ensuring the accuracy and availability of data to customers Proficiency in at least one scripting or programming language for handling large-volume data processing Strong presentation and communication skills.
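The extract-transform-load responsibility described above can be sketched in miniature with plain Python and SQLite standing in for the AWS stack (Glue, Redshift, and Spark are not shown; the feed, table, and column names are invented for illustration):

```python
import csv
import io
import sqlite3

# Raw extract, as it might arrive from a source system (one row has a missing amount).
RAW = "order_id,amount\n1,100.5\n2,\n3,75.0\n"

def extract(text):
    """Parse the raw feed into dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Drop rows with missing amounts and cast types."""
    return [(int(r["order_id"]), float(r["amount"])) for r in rows if r["amount"]]

def load(rows, conn):
    """Load the cleaned rows into the warehouse table."""
    conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
```

On real infrastructure the same three stages would map to Glue jobs or Spark tasks orchestrated by Airflow/MWAA, with Redshift or the data lake as the load target.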
Posted 1 month ago
9.0 - 14.0 years
30 - 35 Lacs
Bengaluru
Work from Office
About our team DEX is the central data org for Kotak Bank, managing the entire data experience of the bank. DEX stands for Kotak's Data Exchange. The org comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities for technology fellows to build things from scratch and create one of the best-in-class data lakehouse solutions. The primary skills this team should encompass are software development skills (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org size is expected to be around 100+ members, primarily based out of Bangalore and comprising ~10 sub-teams independently driving their charters. As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member in Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions in a programmatic way; and be futuristic in building systems that can be operated by machines using AI technologies.
The data platform org is divided into 3 key verticals: Data Platform This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and building a centralized data lake, managed compute and orchestration frameworks (including serverless data solutions), managing the central data warehouse for extremely high-concurrency use cases, building connectors for different sources, building a customer feature repository, building cost-optimization solutions like EMR optimizers, performing automations, and building observability capabilities for Kotak's data platform. The team will also be the center of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak. Data Engineering This team will own data pipelines for thousands of datasets, be skilled at sourcing data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way, and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, which cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, Branch Managers, and all analytics use cases. Data Governance The team will be the central data governance team for Kotak Bank, managing metadata platforms, Data Privacy, Data Security, Data Stewardship, and the Data Quality platform. If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you. Your day-to-day role will include: Drive business decisions with technical input and lead the team. Design, implement, and support a data infrastructure from scratch. Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
Extract, transform, and load data from various sources using SQL and AWS big data technologies. Explore and learn the latest AWS technologies to enhance capabilities and efficiency. Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis. Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers. Build data platforms, data pipelines, or data management and governance tools. BASIC QUALIFICATIONS for Data Engineer/SDE in Data Bachelor's degree in Computer Science, Engineering, or a related field Experience in data engineering Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR Experience with data pipeline tools such as Airflow and Spark Experience with data modeling and data quality best practices Excellent problem-solving and analytical skills Strong communication and teamwork skills Experience in at least one modern scripting or programming language, such as Python, Java, or Scala Strong advanced SQL skills PREFERRED QUALIFICATIONS AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow Prior experience in the Indian Banking segment and/or Fintech is desired. Experience with non-relational databases and data stores Building and operating highly available, distributed data processing systems for large datasets Professional software engineering and best practices for the full software development life cycle Designing, developing, and implementing different types of data warehousing layers Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions Building scalable data infrastructure and understanding distributed systems concepts SQL, ETL, and data modelling Ensuring the accuracy and availability of data to customers Proficiency in at least one scripting or programming language for handling large-volume data processing Strong presentation and communication skills.
Posted 1 month ago
6.0 - 9.0 years
10 - 16 Lacs
Pune, Chennai, Bengaluru
Work from Office
6+ years of proven experience in software testing, both manual and automation. Experience with UI automation tools such as Selenium WebDriver with any programming language (C#/Java/Python). Experience in manual and automation testing of web services/APIs. Experience with API automation tools (JMeter, REST Assured, Postman, etc.). Working experience with Python (mandatory). Experience with database testing against SQL/NoSQL databases. Nice to have: experience in a cloud environment (AWS). Nice to have: experience in Linux.
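As a rough illustration of API test automation in Python using only the standard library (the endpoint path, port, and payload are invented, and the server is a stub standing in for the application under test; a real suite would use pytest plus a client library):

```python
import http.server
import json
import threading
import urllib.request

class StubAPI(http.server.BaseHTTPRequestHandler):
    """A stand-in endpoint so the test below is self-contained."""
    def do_GET(self):
        body = json.dumps({"id": 1, "status": "active"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence per-request logging

server = http.server.HTTPServer(("127.0.0.1", 0), StubAPI)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "test": call the endpoint and capture status, headers, and parsed body.
with urllib.request.urlopen(f"http://127.0.0.1:{server.server_port}/users/1") as resp:
    status = resp.status
    content_type = resp.headers["Content-Type"]
    payload = json.loads(resp.read())
server.shutdown()
```

Assertions on `status`, `content_type`, and `payload` are the same checks a Postman or REST Assured suite would express declaratively.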
Posted 1 month ago
4.0 - 9.0 years
16 - 20 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Project description An excellent opportunity for personal development in a dynamic environment. You will join a highly skilled and dynamic team supporting Murex applications in the UK and our global practice focused on application installation support around the world. We are one of the largest Murex partners and offer a wide range of opportunities in the region. There are good opportunities to develop in different areas. The team is highly skilled and will provide a great opportunity to expand your knowledge. Responsibilities Act as the subject matter expert for datamart and integration, ensuring that all functionality of the product is installed and leveraged to its best capability Technical analysis of changes, solution design, development/configuration, and unit testing of MxML workflows and datamart Analysis and documentation of user requirements, transposing them into functional specifications Define the systems and data requirements and validate the systems design and processes from functional and technical aspects End-to-end ownership of tasks in cooperation with Business Analysts and the Testing team Contribute to user training activities through one-to-one discussion and preparation of user training guides and presentations Follow up with vendor support as necessary to resolve bugs/issues Ensure technical and functional handover of the project and changes to the relevant teams Participate in fixing production and test defects Skills Must have 4+ years of Murex development experience Experience working in the financial industry with relevant experience in business analysis and project implementation. Experience in managing and delivering trading platforms for Treasury products on a global scale, integrated within the organization's treasury product systems. Strong team player with excellent communication and interpersonal skills. Strong problem solver who can question and understand proposed solutions and business drivers.
Strong organizational and leadership skills Strong understanding of treasury products and experience in back-office projects. Good knowledge of the different post-trade interactions between the various actors of capital markets, including service providers Advanced MxML workflow and formulae development Strong datamart knowledge Advanced SQL Good general financial market understanding Knowledge of the pre-trade framework along with the MSL scripting language Unix Nice to have Experience in other Murex modules
Posted 1 month ago
5.0 - 10.0 years
17 - 22 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Project description Our customer is a leading bank in Australia that provides a front-to-back integrated platform for straight-through processing and risk management. This is a multi-year initiative where different projects run concurrently under the program's various milestones. These streams include new product initiatives, new entity roll-outs, and regulatory compliance. We will have key roles in projects such as managing the scope, design, and delivery of requirements from front to back office with DXC Luxoft. This specific initiative consists of supporting the replacement of a front-office Securities Lending platform with a new vendor system. We are looking for talented and ambitious people. The roles are in the respective Functional, Test Management, Development, Test Support, Environment Management, and Release teams. These units will collectively undertake the scoping, design, building, testing, and implementation phases to deliver the various program milestones. We require an experienced MxML developer with strong knowledge of Murex, experience in MxML solution design, and broad exposure to financial markets. You will be working as a Senior Developer in a team of Murex developers on a variety of tasks.
Responsibilities Write transformation logic from source data formats to the Murex-understandable format (MxML) Create MxML import and export workflows using MxML Exchange Configure messaging queues for real-time interfacing Configure real-time and EOD market data uploads from various source systems Document functional and technical specifications for integration Build custom tasks in MxML Exchange for specific processing not available through the standard task library Ensure alignment with the latest Murex CI/CD processes, including working with MxCI/MxConfig/MxPipeline Skills Must have 5+ years of Murex development experience Minimum 3 years of Murex MxML experience on 3.1 Outstanding MxML workflow and (XSL) formulae development Advanced SQL Good general financial market understanding, especially in IRD and Rates Unix scripting (Shell, Perl), SQL (T-SQL/PL/SQL), messaging queues (MQ, TIBCO) Basic pre-trade Nice to have Experience interfacing market data Good Java experience, especially integrating Java code in MxML DevOps on Murex experience (Git, Jenkins, JIRA, etc.) Technical solution design experience and start-to-end solution ownership
Posted 1 month ago
4.0 - 9.0 years
15 - 19 Lacs
Bengaluru
Work from Office
Project description Luxoft has been engaged by a leading UK financial services organization to provide Murex implementation services across a varied portfolio of projects. We require experienced integration developers with strong knowledge of Murex and broad exposure to financial markets to work on a multi-year program. You will be working in a high-performing team of Luxoft and client staff. Responsibilities Technical analysis of changes, solution design, development/configuration, and unit testing of MxML workflows Technical analysis of changes, solution design, development, and unit testing Participate in fixing production and test defects End-to-end ownership of tasks in cooperation with Business Analysts and the Testing team Skills Must have 4+ years of Murex MxML experience Advanced MxML workflow and formulae development Advanced SQL Good general financial market understanding Knowledge of the pre-trade framework along with the MSL scripting language Unix Nice to have Good Java experience, especially integrating Java code in MxML Pre-trade experience DevOps on Murex experience (Git, Jenkins, JIRA, etc.) Technical solution design experience and start-to-end solution ownership.
Posted 1 month ago
8.0 - 13.0 years
40 - 60 Lacs
Gurugram, Bengaluru
Hybrid
Exp: 8 to 15 years Location: Gurgaon/Bengaluru Notice: Immediate joiners only Key Skills: SQL, advanced SQL, BI tools, etc. Role and Responsibilities: - Primary day-to-day client contact, often interacting with C-level executives - Ensures client needs are well understood, defined, and met - Serves as the primary interface between senior client management and senior leadership (VPs and SVPs) - Interacts regularly with clients to understand business requirements, define analytical problems, structure and communicate solutions, and ensure client satisfaction with strong business-driving results - Manages/leads 1-2 engagements/projects with different team structures and service delivery models; responsible for driving revenue generated from these accounts - Leads project teams of 3-6 consultants or more, and/or Team Leads and Analysts, in all aspects of project execution - Plays a critical role in defining the problem, structuring the solution, and executing against it - Clearly defines project deliverables, timelines, and methodology, laying out the project plan - Owns the execution of the project, with on-time delivery every time, ensuring all project goals are met - Manages team members, including definition of objectives, oversight of execution, and evaluation of performance - Provides thought leadership and delivers business insights to identify and resolve complex issues critical to clients' success Candidate Profile: - 7+ years of experience comprising analytics service delivery, consulting, solution design, and client management - Experience and strong knowledge of the analytics service industry with a focus on B2C and omni-channel retail across the customer lifecycle and customer journey - Demonstrable leadership ability, superior problem solving, and people management skills - Excellent listening, written communication, and presentation skills - Experience/exposure to marketing, operations, sales, merchandizing, supply chain, retail strategy, project management, cost reduction, and business development for retail and media organizations - Master's degree in management, data science, economics, mathematics, operations research, or related analytics areas; candidates with BA/BS degrees in the same fields from top-tier academic institutions are also welcome to apply
Posted 1 month ago
6.0 - 8.0 years
15 - 20 Lacs
Chennai
Work from Office
Senior Data Engineer: Job Title: Senior Data Engineer Experience: 6 to 8 years Location: Chennai Job Description: Movate is seeking a highly skilled Senior Data Engineer to lead the development of scalable, modular, and high-performance data pipelines. You will work closely with cross-functional teams to support data integration, transformation, and delivery for analytics and business intelligence. Key Responsibilities: Design and maintain ETL/ELT pipelines using Apache Airflow, Azure Databricks, and Azure Data Factory Build scalable data infrastructure and optimize data workflows Ensure data quality, security, and governance across platforms Collaborate with data scientists and BI developers to support analytics and reporting Monitor and troubleshoot data pipelines for reliability and performance Document data processes and workflows for knowledge sharing Technical Skills Required: Strong proficiency in Python (Pandas, NumPy, REST APIs) Advanced SQL skills (joins, CTEs, performance tuning) Experience with Databricks, Apache Airflow, and Azure cloud services Knowledge of Spark SQL, PySpark, and containerization using Docker Familiarity with data lake vs. data warehouse architectures Experience in data security, encryption, and access provisioning Qualifications: Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field Excellent problem-solving and communication skills Ability to work independently and manage end-to-end delivery Comfortable in agile development environments EEO Statement: Movate provides equal opportunity in all our employment practices to all qualified employees and applicants without regard to race, color, religion, sex (including gender identity, sexual orientation, and pregnancy), national origin, age, disability, or genetic information and other characteristics that are protected by applicable law.
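Airflow pipelines like the ones described above are DAGs of tasks. The dependency ordering such an orchestrator enforces can be sketched with the standard-library `graphlib` (the task names are hypothetical, and Airflow itself is not imported; each task maps to the set of upstream tasks it waits on):

```python
from graphlib import TopologicalSorter

# Hypothetical DAG: each task maps to the set of tasks that must finish first.
dag = {
    "transform_sales": {"extract_sales"},
    "transform_customers": {"extract_customers"},
    "load_warehouse": {"transform_sales", "transform_customers"},
    "refresh_dashboard": {"load_warehouse"},
}

# A valid execution order: every task appears after all of its upstream tasks.
order = list(TopologicalSorter(dag).static_order())
```

In a real Airflow DAG the same structure would be declared with `>>` operators between task objects, and the scheduler rather than `static_order()` would decide what runs when.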
Posted 1 month ago
2.0 - 7.0 years
18 - 22 Lacs
Bengaluru
Work from Office
About The Role Job Title: Data Science/Data Engineering ML9 - Sales Excellence COE - Data Engineering Specialist Management Level: ML9 Location: Open Must-have skills: strong GCP cloud technology experience, BigQuery, data science basics. Good-to-have skills: building and maintaining data models from different sources. Experience: Minimum 5 year(s) of experience is required Educational Qualification: Graduate/Post graduate Job Summary: The Center of Excellence (COE) makes sure that the sales and pricing methods and offerings of Sales Excellence are effective. The COE supports salespeople through its business partners and Analytics and Sales Operations teams. The Data Engineer helps manage data sources and environments, utilizing large data sets and maintaining their integrity to create models and apps that deliver insights to the organization. Roles & Responsibilities: Build and manage data models that bring together data from different sources. Understand the existing data model in SQL Server and help redesign/migrate it to GCP BigQuery. Help consolidate and cleanse data for use by the modeling and development teams. Structure data for use in analytics applications. Lead a team of Data Engineers effectively. Professional & Technical Skills: A bachelor's degree or equivalent A minimum of 2 years of strong GCP cloud technology experience A minimum of 2 years of advanced SQL knowledge and experience working with relational databases A minimum of 2 years of familiarity and hands-on experience with different SQL objects such as stored procedures, functions, and views A minimum of 2 years of building data flow components and processing systems to extract, transform, load, and integrate data from various sources A basic knowledge of data science models and tools. Additional Information: Extra credit if you have: Understanding of sales processes and systems. Master's degree in a technical field. Experience with Python. Experience with quality assurance processes.
Experience in project management. You May Also Need: Ability to work flexible hours according to business needs. Must have a good internet connection and a distraction-free environment for working at home, in accordance with local guidelines. About Our Company | Accenture
Posted 1 month ago
0.0 years
1 - 2 Lacs
Noida, Chennai, Bengaluru
Hybrid
Responsibilities Include: • Assembles data from multiple sources; conducts analyses; performs research; and designs, documents, and implements analytical solutions. • Develops and maintains scripts and queries to import, manipulate, clean, and transform data from different sources. • Presents analysis conclusions in a clear, understandable, and valuable way; tells compelling stories supported by data using visualization tools (Power BI, Tableau, or similar). • Evaluates and interprets data to find risks and opportunities, relaying them to the team and management. • Creates best practices in tools and visual technologies to help evolve the project dashboards that provide essential data for project monitoring. • Builds metrics reporting systems and tracks process improvements across the team. • Relays insights in layman's terms, visualized so they can be used for data-driven decisions. • Provides predictive insights to inform newer prediction/data models. • Performs quality assurance on generated and final results to ensure consistency and accuracy. Skills & Abilities: • Professional analytical experience, applying data to drive process improvements. • Programming and database experience, with advanced SQL skills and experience with other languages such as JavaScript and Python. • Expert in data visualization techniques for charts and reports; good knowledge of Microsoft Power BI and/or Tableau. • Strong knowledge of math and statistical techniques, including probability, correlation, regression, classification, and the development of statistical models. • Strong problem-solving, creative-thinking, and analytical skills. • Knowledge of ML/AI is a plus. • Highly effective communicator, both verbal and written.
Posted 1 month ago
2.0 - 5.0 years
20 Lacs
Hyderabad
Hybrid
Responsibilities Include: • Assembles data from multiple sources; conducts analyses; performs research; and designs, documents, and implements analytical solutions. • Develops and maintains scripts and queries to import, manipulate, clean, and transform data from different sources. • Presents analysis conclusions in a clear, understandable, and valuable way; tells compelling stories supported by data using visualization tools (Power BI, Tableau, or similar). • Evaluates and interprets data to find risks and opportunities, relaying them to the team and management. • Creates best practices in tools and visual technologies to help evolve the project dashboards that provide essential data for project monitoring. • Builds metrics reporting systems and tracks process improvements across the team. • Relays insights in layman's terms, visualized so they can be used for data-driven decisions. • Provides predictive insights to inform newer prediction/data models. • Performs quality assurance on generated and final results to ensure consistency and accuracy. Skills & Abilities: • Professional analytical experience, applying data to drive process improvements. • Programming and database experience, with advanced SQL skills and experience with other languages such as JavaScript and Python. • Expert in data visualization techniques for charts and reports; good knowledge of Microsoft Power BI and/or Tableau. • Strong knowledge of math and statistical techniques, including probability, correlation, regression, classification, and the development of statistical models. • Strong problem-solving, creative-thinking, and analytical skills. • Knowledge of ML/AI is a plus. • Highly effective communicator, both verbal and written.
Posted 1 month ago
4.0 - 6.0 years
6 - 8 Lacs
Mumbai, Navi Mumbai
Work from Office
Proven work experience as a backend developer. Hands-on experience with Java (Java 8 is a must; Java 11/17 is an added advantage). Strong knowledge of Spring Framework, Spring Boot, and RESTful API design. Leverage AI-assisted development tools such as GitHub Copilot and ChatGPT to enhance code quality, accelerate development, and automate routine tasks. Utilize AI models and frameworks, including Llama, for natural language understanding and generation tasks relevant to product features. Implement and optimize AI inference using Groq hardware accelerators for performance-critical workloads. Employ Langsmith or similar AI workflow management tools to design, monitor, and improve AI model pipelines and integrations. Experience with advanced SQL and PL/SQL. Familiarity with version control systems, especially Git. Adherence to coding conventions and best practices. Excellent analytical and debugging skills. Requirements / Desired Candidate Profile: Experience working with AI-powered development environments and tools to boost productivity and innovation. Strong interest in staying updated with AI trends and integrating AI-driven solutions within software products. Strong expertise in Java, Spring Boot, advanced SQL, and PL/SQL for complex data querying and database programming. Familiarity with containerization using Docker, K8s, GCP.
Posted 1 month ago
5.0 - 10.0 years
10 - 19 Lacs
Hyderabad
Work from Office
Required Skills & Qualifications: 5+ years of hands-on experience in data analysis roles. Expertise in advanced SQL, including window functions, CTEs, subqueries, and performance tuning. Strong understanding of data structures, ETL processes, and relational databases (e.g., MySQL, PostgreSQL, SQL Server, Snowflake). Experience with BI tools like Power BI, Tableau, Looker, or similar. Proficiency in Excel (PivotTables, advanced formulas, charts). Ability to translate business problems into analytical solutions. Strong attention to detail, critical thinking, and problem-solving skills. Excellent verbal and written communication skills. Good to Have: Experience with scripting languages like Python or R for data manipulation. Familiarity with cloud platforms like AWS, Azure, or GCP. Knowledge of data warehousing concepts and tools (e.g., Redshift, BigQuery). Exposure to CRM, ERP, or marketing analytics datasets.
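A minimal sketch of the window-function and CTE skills called out above, run through Python's built-in sqlite3 driver (requires SQLite 3.25+ for window functions; the sales table and its values are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, month TEXT, revenue REAL);
    INSERT INTO sales VALUES
      ('north', '2024-01', 100), ('north', '2024-02', 150),
      ('south', '2024-01', 80),  ('south', '2024-02', 60);
""")

# A CTE feeding a window function: running revenue total within each region.
query = """
    WITH ordered AS (
        SELECT region, month, revenue FROM sales
    )
    SELECT region, month,
           SUM(revenue) OVER (PARTITION BY region ORDER BY month) AS running_total
    FROM ordered
    ORDER BY region, month
"""
rows = conn.execute(query).fetchall()
```

`PARTITION BY` restarts the running total for each region, which is exactly the kind of query that avoids a self-join in the warehouses listed (Redshift, BigQuery, Snowflake) as well.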
Posted 1 month ago