Home
Jobs

95 Advanced SQL Jobs

Filter Interviews
Min: 0 years
Max: 25 years
Min: ₹0
Max: ₹10000000
Set up a Job Alert
Filter
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

1.0 - 4.0 years

11 - 15 Lacs

Mumbai

Work from Office

Naukri logo

Job Title: Entry Level – Data Engineer/SDE

About Kotak Mahindra Group
Established in 1985, the Kotak Mahindra Group is one of India’s leading financial services conglomerates. In February 2003, Kotak Mahindra Finance Ltd (KMFL), the group’s flagship company, received a banking license from the Reserve Bank of India (RBI). With this, KMFL became the first non-banking finance company in India to become a bank – Kotak Mahindra Bank Limited. The consolidated balance sheet of Kotak Mahindra Group is over ₹1 lakh crore, and the consolidated net worth of the Group stands at ₹13,943 crore (approx. US$2.6 billion) as on September 30, 2012. The Group offers a wide range of financial services that encompass every sphere of life: from commercial banking to stock broking, mutual funds, life insurance and investment banking, the Group caters to the diverse financial needs of individuals and the corporate sector. The Group has a wide distribution network through branches and franchisees across India, and international offices in London, New York, California, Dubai, Abu Dhabi, Bahrain, Mauritius and Singapore. For more information, please visit the company’s website at http://www.kotak.com

What we offer
Our mission is simple – building trust. Our customers’ trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That’s why we at Kotak Group are dedicated to transforming banking by embracing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team
DEX (Kotak’s Data Exchange) is the central data org for Kotak Bank and manages the bank’s entire data experience. It comprises the Data Platform, Data Engineering and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists a great opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills for this team are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters. As a member of this team, you get the opportunity to learn the fintech space, currently one of the most sought-after domains; be an early member of Kotak’s digital transformation journey; learn and leverage technology to build complex data platform solutions (real-time, micro-batch, batch and analytics) in a programmatic way; and build future-ready systems that can be operated by machines using AI technologies.
The data platform org is divided into three key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; a managed compute and orchestration framework, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak’s data platform. The team will also be the center of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build some of the most leveraged data models for financial orgs. It will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers and all analytics use cases.

Data Governance: The team will be the central data governance team for Kotak Bank, building and managing metadata platforms and the Data Privacy, Data Security, Data Stewardship and Data Quality platforms. If you have the right data skills and are ready to build high-concurrency systems involving multiple systems from scratch, then this is the team for you.

Responsibilities:
- Develop and maintain scalable data pipelines and databases.
- Assist in collecting, cleaning, and transforming data from various sources.
- Work with AWS services such as EC2, EKS, EMR, S3, Glue, Redshift, and MWAA.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Work with data scientists and analysts to provide the data needed for analysis.
- Ensure data quality and integrity throughout the data lifecycle.
- Participate in code reviews and contribute to a collaborative, high-performing team environment.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Create state-of-the-art software solutions that are durable and reusable across multiple teams.

Requirements:
- Bachelor’s degree in Computer Science, Engineering, Information Technology, or a related field.
- Proficiency in SQL and experience with at least one programming language (Python, Scala, Java, etc.).
- Familiarity with data warehousing concepts and ETL processes.
- Understanding of big data technologies like Hadoop, Spark, etc., is a plus.
- Strong analytical and problem-solving skills.
- Excellent communication and teamwork abilities.

Good to have:
- Experience with cloud platforms (AWS, Azure, Google Cloud).
- Familiarity with data visualization tools (Tableau, Power BI).

Personal attributes:
- Strong written and verbal communication skills.
- Self-starter who requires minimal oversight.
- Ability to prioritize and manage multiple tasks.
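To give a concrete flavor of the Spark (PySpark) ETL work these roles describe, here is a minimal illustrative sketch; the S3 paths, table layout, and column names are hypothetical and not taken from the posting.

```python
# Minimal PySpark ETL sketch: read raw CSV from S3, clean it, and write
# partitioned Parquet. All paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Extract: raw transactions landed in a data lake bucket
raw = spark.read.option("header", True).csv("s3://example-raw/transactions/")

# Transform: type casting, null filtering, and a derived partition column
clean = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("account_id").isNotNull())
       .withColumn("txn_date", F.to_date("txn_ts"))
)

# Load: partitioned Parquet, ready for downstream Glue/Redshift-style querying
clean.write.mode("overwrite").partitionBy("txn_date").parquet(
    "s3://example-curated/transactions/"
)
```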

Posted 19 hours ago

Apply

3.0 - 5.0 years

27 - 32 Lacs

Bengaluru

Work from Office

Naukri logo

Job Title: Data Engineer (DE) / SDE – Data
Location: Bangalore
Experience range: 3-15 years

What we offer
Our mission is simple – building trust. Our customers’ trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That’s why we at Kotak Group are dedicated to transforming banking by embracing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team
DEX (Kotak’s Data Exchange) is the central data org for Kotak Bank and manages the bank’s entire data experience. It comprises the Data Platform, Data Engineering and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists a great opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills for this team are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters. As a member of this team, you get the opportunity to learn the fintech space, currently one of the most sought-after domains; be an early member of Kotak’s digital transformation journey; learn and leverage technology to build complex data platform solutions (real-time, micro-batch, batch and analytics) in a programmatic way; and build future-ready systems that can be operated by machines using AI technologies.

The data platform org is divided into three key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; a managed compute and orchestration framework, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak’s data platform. The team will also be the center of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build some of the most leveraged data models for financial orgs. It will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers and all analytics use cases.
Data Governance: The team will be the central data governance team for Kotak Bank, managing metadata platforms and the Data Privacy, Data Security, Data Stewardship and Data Quality platforms. If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you.

Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data:
- Bachelor’s degree in Computer Science, Engineering, or a related field
- 3-5 years of experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

PREFERRED QUALIFICATIONS:
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech
- Experience with non-relational databases and data stores
- Experience building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices across the full software development life cycle
- Experience designing, developing, and implementing different types of data warehousing layers
- Experience leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Experience building scalable data infrastructure and understanding of distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills

For managers:
- Customer centricity and obsession for the customer
- Ability to manage stakeholders (product owners, business stakeholders, cross-functional teams) and coach agile ways of working
- Ability to structure and organize teams and streamline communication
- Prior experience executing large-scale data engineering projects
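Since the role calls out Airflow on MWAA for orchestration, a minimal DAG sketch may help illustrate the idea; the DAG id, schedule, and task bodies are hypothetical placeholders, not part of the posting.

```python
# Minimal Airflow DAG sketch for a daily ETL run (as on MWAA).
# DAG id, schedule, and task logic are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull data from source systems")


def transform_load():
    print("transform and load into the warehouse")


with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform_load", python_callable=transform_load)
    t1 >> t2  # extract must finish before transform_load starts
```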

Posted 19 hours ago

Apply

1.0 - 4.0 years

11 - 15 Lacs

Bengaluru

Work from Office

Naukri logo

Job Title: Entry Level – Data Engineer/SDE

About Kotak Mahindra Group
Established in 1985, the Kotak Mahindra Group is one of India’s leading financial services conglomerates. In February 2003, Kotak Mahindra Finance Ltd (KMFL), the group’s flagship company, received a banking license from the Reserve Bank of India (RBI). With this, KMFL became the first non-banking finance company in India to become a bank – Kotak Mahindra Bank Limited. The consolidated balance sheet of Kotak Mahindra Group is over ₹1 lakh crore, and the consolidated net worth of the Group stands at ₹13,943 crore (approx. US$2.6 billion) as on September 30, 2012. The Group offers a wide range of financial services that encompass every sphere of life: from commercial banking to stock broking, mutual funds, life insurance and investment banking, the Group caters to the diverse financial needs of individuals and the corporate sector. The Group has a wide distribution network through branches and franchisees across India, and international offices in London, New York, California, Dubai, Abu Dhabi, Bahrain, Mauritius and Singapore. For more information, please visit the company’s website at http://www.kotak.com

What we offer
Our mission is simple – building trust. Our customers’ trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That’s why we at Kotak Group are dedicated to transforming banking by embracing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team
DEX (Kotak’s Data Exchange) is the central data org for Kotak Bank and manages the bank’s entire data experience. It comprises the Data Platform, Data Engineering and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists a great opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills for this team are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters. As a member of this team, you get the opportunity to learn the fintech space, currently one of the most sought-after domains; be an early member of Kotak’s digital transformation journey; learn and leverage technology to build complex data platform solutions (real-time, micro-batch, batch and analytics) in a programmatic way; and build future-ready systems that can be operated by machines using AI technologies.
The data platform org is divided into three key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; a managed compute and orchestration framework, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak’s data platform. The team will also be the center of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build some of the most leveraged data models for financial orgs. It will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers and all analytics use cases.

Data Governance: The team will be the central data governance team for Kotak Bank, building and managing metadata platforms and the Data Privacy, Data Security, Data Stewardship and Data Quality platforms. If you have the right data skills and are ready to build high-concurrency systems involving multiple systems from scratch, then this is the team for you.

Responsibilities:
- Develop and maintain scalable data pipelines and databases.
- Assist in collecting, cleaning, and transforming data from various sources.
- Work with AWS services such as EC2, EKS, EMR, S3, Glue, Redshift, and MWAA.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Work with data scientists and analysts to provide the data needed for analysis.
- Ensure data quality and integrity throughout the data lifecycle.
- Participate in code reviews and contribute to a collaborative, high-performing team environment.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Create state-of-the-art software solutions that are durable and reusable across multiple teams.

Requirements:
- Bachelor’s degree in Computer Science, Engineering, Information Technology, or a related field.
- Proficiency in SQL and experience with at least one programming language (Python, Scala, Java, etc.).
- Familiarity with data warehousing concepts and ETL processes.
- Understanding of big data technologies like Hadoop, Spark, etc., is a plus.
- Strong analytical and problem-solving skills.
- Excellent communication and teamwork abilities.

Good to have:
- Experience with cloud platforms (AWS, Azure, Google Cloud).
- Familiarity with data visualization tools (Tableau, Power BI).

Personal attributes:
- Strong written and verbal communication skills.
- Self-starter who requires minimal oversight.
- Ability to prioritize and manage multiple tasks.

Posted 19 hours ago

Apply

0.0 - 2.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Naukri logo

Data Engineer – 1 (Experience: 0-2 years)

What we offer
Our mission is simple – building trust. Our customers’ trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That’s why we at Kotak Group are dedicated to transforming banking by embracing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team
DEX (Kotak’s Data Exchange) is the central data org for Kotak Bank and manages the bank’s entire data experience. It comprises the Data Platform, Data Engineering and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists a great opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills for this team are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters. As a member of this team, you get the opportunity to learn the fintech space, currently one of the most sought-after domains; be an early member of Kotak’s digital transformation journey; learn and leverage technology to build complex data platform solutions (real-time, micro-batch, batch and analytics) in a programmatic way; and build future-ready systems that can be operated by machines using AI technologies.

The data platform org is divided into three key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; a managed compute and orchestration framework, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak’s data platform. The team will also be the center of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build some of the most leveraged data models for financial orgs. It will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers and all analytics use cases.
Data Governance: The team will be the central data governance team for Kotak Bank, managing metadata platforms and the Data Privacy, Data Security, Data Stewardship and Data Quality platforms. If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you.

Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data:
- Bachelor’s degree in Computer Science, Engineering, or a related field
- Experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

PREFERRED QUALIFICATIONS:
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech
- Experience with non-relational databases and data stores
- Experience building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices across the full software development life cycle
- Experience designing, developing, and implementing different types of data warehousing layers
- Experience leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Experience building scalable data infrastructure and understanding of distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills

Posted 19 hours ago

Apply

3.0 - 5.0 years

3 - 5 Lacs

Bengaluru

Work from Office

Naukri logo

What we offer
Our mission is simple – building trust. Our customers’ trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That’s why we at Kotak Group are dedicated to transforming banking by embracing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team
DEX (Kotak’s Data Exchange) is the central data org for Kotak Bank and manages the bank’s entire data experience. It comprises the Data Platform, Data Engineering and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists a great opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills for this team are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters. As a member of this team, you get the opportunity to learn the fintech space, currently one of the most sought-after domains; be an early member of Kotak’s digital transformation journey; learn and leverage technology to build complex data platform solutions (real-time, micro-batch, batch and analytics) in a programmatic way; and build future-ready systems that can be operated by machines using AI technologies.

The data platform org is divided into three key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; a managed compute and orchestration framework, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak’s data platform. The team will also be the center of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build some of the most leveraged data models for financial orgs. It will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers and all analytics use cases.
Data Governance: The team will be the central data governance team for Kotak Bank, managing metadata platforms and the Data Privacy, Data Security, Data Stewardship and Data Quality platforms. If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you.

Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data:
- Bachelor’s degree in Computer Science, Engineering, or a related field
- 3-5 years of experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager:
- 10+ years of engineering experience, most of it in the data domain
- 5+ years of engineering team management experience
- 10+ years of experience planning, designing, developing and delivering consumer software
- Experience partnering with product or program management teams
- 5+ years of experience managing data engineers, business intelligence engineers and/or data scientists
- Experience designing or architecting (design patterns, reliability and scaling) new and existing systems
- Experience managing multiple concurrent programs, projects and development teams in an Agile environment
- Strong understanding of Data Platform, Data Engineering and Data Governance
- Experience designing and developing large-scale, high-traffic applications

PREFERRED QUALIFICATIONS:
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech
- Experience with non-relational databases and data stores
- Experience building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices across the full software development life cycle
- Experience designing, developing, and implementing different types of data warehousing layers
- Experience leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Experience building scalable data infrastructure and understanding of distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills

Posted 19 hours ago

Apply

2.0 - 5.0 years

30 - 32 Lacs

Bengaluru

Work from Office

Naukri logo

Data Engineer – 2 (Experience: 2-5 years)

What we offer
Our mission is simple – building trust. Our customers’ trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That’s why we at Kotak Group are dedicated to transforming banking by embracing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team
DEX (Kotak’s Data Exchange) is the central data org for Kotak Bank and manages the bank’s entire data experience. It comprises the Data Platform, Data Engineering and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists a great opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills for this team are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters. As a member of this team, you get the opportunity to learn the fintech space, currently one of the most sought-after domains; be an early member of Kotak’s digital transformation journey; learn and leverage technology to build complex data platform solutions (real-time, micro-batch, batch and analytics) in a programmatic way; and build future-ready systems that can be operated by machines using AI technologies.

The data platform org is divided into three key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; a managed compute and orchestration framework, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak’s data platform. The team will also be the center of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build some of the most leveraged data models for financial orgs. It will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers and all analytics use cases.
Data Governance: The team will be the central data governance team for Kotak Bank, managing metadata platforms and the Data Privacy, Data Security, Data Stewardship and Data Quality platforms. If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you.

Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data:
- Bachelor’s degree in Computer Science, Engineering, or a related field
- Experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

PREFERRED QUALIFICATIONS:
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech
- Experience with non-relational databases and data stores
- Experience building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices across the full software development life cycle
- Experience designing, developing, and implementing different types of data warehousing layers
- Experience leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Experience building scalable data infrastructure and understanding of distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills

Posted 19 hours ago

Apply

3.0 - 5.0 years

30 - 35 Lacs

Bengaluru

Work from Office

Naukri logo

What we offer
Our mission is simple – building trust. Our customers’ trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That’s why we at Kotak Group are dedicated to transforming banking by embracing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team
DEX (Kotak’s Data Exchange) is the central data org for Kotak Bank and manages the bank’s entire data experience. It comprises the Data Platform, Data Engineering and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists a great opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills for this team are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters. As a member of this team, you get the opportunity to learn the fintech space, currently one of the most sought-after domains; be an early member of Kotak’s digital transformation journey; learn and leverage technology to build complex data platform solutions (real-time, micro-batch, batch and analytics) in a programmatic way; and build future-ready systems that can be operated by machines using AI technologies.

The data platform org is divided into three key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; a managed compute and orchestration framework, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak’s data platform. The team will also be the center of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build some of the most leveraged data models for financial orgs. It will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers and all analytics use cases.
Data Governance: The team will be the central data governance team for Kotak Bank, managing metadata platforms and the Data Privacy, Data Security, Data Stewardship and Data Quality platforms. If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you.

Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data:
- Bachelor’s degree in Computer Science, Engineering, or a related field
- 3-5 years of experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager:
- 10+ years of engineering experience, most of it in the data domain
- 5+ years of engineering team management experience
- 10+ years of experience planning, designing, developing and delivering consumer software
- Experience partnering with product or program management teams
- 5+ years of experience managing data engineers, business intelligence engineers and/or data scientists
- Experience designing or architecting (design patterns, reliability and scaling) new and existing systems
- Experience managing multiple concurrent programs, projects and development teams in an Agile environment
- Strong understanding of Data Platform, Data Engineering and Data Governance
- Experience designing and developing large-scale, high-traffic applications

PREFERRED QUALIFICATIONS:
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech
- Experience with non-relational databases and data stores
- Experience building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices across the full software development life cycle
- Experience designing, developing, and implementing different types of data warehousing layers
- Experience leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Experience building scalable data infrastructure and understanding of distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills

Posted 19 hours ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Naukri logo

What we offer
Our mission is simple – building trust. Our customers’ trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That’s why we at Kotak Group are dedicated to transforming banking by embracing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team
DEX (Kotak’s Data Exchange) is the central data org for Kotak Bank and manages the bank’s entire data experience. It comprises the Data Platform, Data Engineering and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists a great opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills for this team are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters. As a member of this team, you get the opportunity to learn the fintech space, currently one of the most sought-after domains; be an early member of Kotak’s digital transformation journey; learn and leverage technology to build complex data platform solutions (real-time, micro-batch, batch and analytics) in a programmatic way; and build future-ready systems that can be operated by machines using AI technologies.

The data platform org is divided into three key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; a managed compute and orchestration framework, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak’s data platform. The team will also be the center of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build some of the most leveraged data models for financial orgs. It will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers and all analytics use cases.
Data Governance: The team will be the central data governance team for Kotak Bank, managing metadata platforms and the Data Privacy, Data Security, Data Stewardship and Data Quality platforms. If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you.

Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data:
- Bachelor’s degree in Computer Science, Engineering, or a related field
- 3-5 years of experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager:
- 10+ years of engineering experience, most of it in the data domain
- 5+ years of engineering team management experience
- 10+ years of experience planning, designing, developing and delivering consumer software
- Experience partnering with product or program management teams
- 5+ years of experience managing data engineers, business intelligence engineers and/or data scientists
- Experience designing or architecting (design patterns, reliability and scaling) new and existing systems
- Experience managing multiple concurrent programs, projects and development teams in an Agile environment
- Strong understanding of Data Platform, Data Engineering and Data Governance
- Experience designing and developing large-scale, high-traffic applications

PREFERRED QUALIFICATIONS:
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech
- Experience with non-relational databases and data stores
- Experience building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices across the full software development life cycle
- Experience designing, developing, and implementing different types of data warehousing layers
- Experience leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Experience building scalable data infrastructure and understanding of distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills

Posted 19 hours ago

Apply

5.0 - 9.0 years

12 - 16 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

Naukri logo

The role involves hands-on experience with data testing, data integration, and supporting data quality in big data environments. Key responsibilities include selecting and integrating data tools and frameworks, providing technical guidance to software engineers, and collaborating with data scientists, data engineers, and other stakeholders. The role requires implementing ETL processes, monitoring performance, advising on infrastructure, and defining data retention policies. Candidates should be proficient in Python, advanced SQL, HiveQL, and Spark SQL, with hands-on experience in data testing tools like DBT, iCEDQ, QuerySurge, Denodo, or Informatica. Strong experience with NoSQL, Linux/Unix, and messaging systems (Kafka or RabbitMQ) is also required. Additional responsibilities include troubleshooting, debugging, UAT with business users in Agile environments, and automating tests to increase coverage and efficiency. Location: Chennai, Hyderabad, Pune, Kolkata, Ahmedabad, Remote
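As an illustration of the data-testing work described above, a minimal source-to-target reconciliation check in PySpark might look like the sketch below; the table names are hypothetical, and in practice dedicated tools such as DBT or QuerySurge would manage these checks at scale.

```python
# Minimal source-to-target reconciliation sketch in PySpark.
# Table names are hypothetical placeholders for registered metastore tables.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("data-test").getOrCreate()

src_count = spark.table("staging.orders").count()
tgt_count = spark.table("warehouse.orders").count()

# Row-count parity is the simplest smoke test after an ETL load
assert src_count == tgt_count, f"row mismatch: {src_count} vs {tgt_count}"

# Null check on a key column in the target table
nulls = spark.table("warehouse.orders").filter("order_id IS NULL").count()
assert nulls == 0, f"{nulls} null order_id values in target"
```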

Posted 1 day ago

Apply

8.0 - 13.0 years

20 - 35 Lacs

Pune, Chennai, Bengaluru

Work from Office

Naukri logo

Role & responsibilities

Position: Senior Snowflake Developer
Primary skills: Snowflake + Advanced SQL (5+ yrs) + DWH (data warehouse)
Work mode: Hybrid (2 days from office)

Job description: The Snowflake Data Specialist will manage projects in data warehousing, focusing on Snowflake and related technologies. The role requires expertise in data modeling, ETL processes, and cloud-based data solutions.

Requirements:
- Bachelor’s degree in Computer Science, Engineering, or related fields, or equivalent practical experience.
- 8+ years of industry experience, with hands-on management of data warehousing projects.
- Minimum 4 years of experience working on Snowflake.

Responsibilities:
- Lead the development of project requirements for end-to-end data integration using ETL for structured, semi-structured and unstructured data.
- Make ongoing updates to modeling principles, processes, solutions, and best practices to ensure alignment with business needs.
- Design and build manual or auto-ingestion data pipelines using Snowpipe.
- Develop stored procedures and write queries to analyze and transform data.

Good to have: dbt / Matillion

Interested candidates can apply to kinnera259@gmail.com

Regards,
HR Manager
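For the Snowpipe auto-ingestion work this posting mentions, a minimal sketch using the Snowflake Python connector could look like the following; the account, credentials, stage, pipe, and table names are all hypothetical.

```python
# Minimal Snowpipe setup sketch via the Snowflake Python connector.
# Credentials, stage, pipe, and table names are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="...",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)

cur = conn.cursor()
# A pipe with AUTO_INGEST loads new files from the stage as they arrive
cur.execute("""
    CREATE PIPE IF NOT EXISTS raw.orders_pipe AUTO_INGEST = TRUE AS
    COPY INTO raw.orders
    FROM @raw.orders_stage
    FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
""")
cur.close()
conn.close()
```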

Posted 2 days ago

Apply

4.0 - 7.0 years

15 - 25 Lacs

Noida, Gurugram, Delhi / NCR

Hybrid

Naukri logo

Salary: 15 to 30 LPA
Experience: 4 to 7 years
Location: Gurgaon
Notice period: Immediate to 30 days

Role & responsibilities:
- Strong knowledge of SQL and Tableau, including Tableau dashboards
- Emphasis on end-to-end delivery of analysis
- Contribute to how the analytical approach is structured for the specification of analysis
- Contribute insights from the conclusions of analysis that integrate with the initial hypothesis and business objective
- Independently address complex problems
- Participate in the design of the analysis and modelling approach with managers and modelers
- Be extremely comfortable working with data, including managing a large number of data sources, analyzing data quality, and proactively working with clients’ data/IT teams to resolve issues
- Use a variety of analytical tools (Python, SQL, Tableau, Power BI, etc.) to carry out analysis and drive conclusions
- Reformulate highly technical information into concise, understandable terms for presentations
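Since the posting emphasizes strong SQL alongside Python, here is a small self-contained example of the window-function style of "advanced SQL" such roles commonly involve, using Python's built-in sqlite3 (window functions require SQLite 3.25+); the data is made up for illustration.

```python
# Self-contained window-function example using Python's built-in sqlite3
# (requires SQLite >= 3.25 for window functions). Data is made up.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, month TEXT, revenue REAL);
    INSERT INTO sales VALUES
      ('North', '2024-01', 100), ('North', '2024-02', 140),
      ('South', '2024-01', 90),  ('South', '2024-02', 60);
""")

# Rank months by revenue within each region
rows = conn.execute("""
    SELECT region, month, revenue,
           RANK() OVER (PARTITION BY region ORDER BY revenue DESC) AS rnk
    FROM sales
""").fetchall()

for r in rows:
    print(r)
```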

Posted 4 days ago

Apply

5.0 - 7.0 years

7 - 9 Lacs

Bengaluru

Work from Office

Naukri logo

Company Overview
Docusign brings agreements to life. Over 1.5 million customers and more than a billion people in over 180 countries use Docusign solutions to accelerate the process of doing business and simplify people’s lives. With intelligent agreement management, Docusign unleashes business-critical data that is trapped inside of documents. Until now, these were disconnected from business systems of record, costing businesses time, money, and opportunity. Using Docusign’s Intelligent Agreement Management platform, companies can create, commit to, and manage agreements with solutions created by the #1 company in e-signature and contract lifecycle management (CLM).

What you'll do
Docusign is seeking a talented and results-oriented Director, Data Engineering to lead and grow the Bengaluru hub of our Global Data & Analytics organisation. Your mission is to build the modern, trusted data foundation that powers enterprise analytics, internal decision-making, and customer-facing insights. As a key leader, you will partner closely with Engineering, Product, GTM, Finance, and Customer Success to set the data strategy for Docusign India, focusing on data architecture, the enterprise data foundation and data ingestion. During a typical day, you will drive the development of governed data products, operationalize quality and observability, and lead a high-performing team of data engineers and architects. The ideal candidate will demonstrate strong leadership, a passion for innovation in AI and data technologies, and the drive to achieve "five-nines" reliability for our data platforms. This position is a people manager role reporting to the Senior Director, Global Data & Analytics.

Responsibilities
- Own Snowflake architecture, performance and cost governance; define modelling standards (star, data vault, or ELT-first) and enforce security/RBAC best practices.
- Scale our dbt codebase: design project structure, modular macros, and end-to-end CI/CD and automated testing so every pull request ships with quality gates and lineage metadata.
- Drive ingestion excellence via Fivetran (50+ connectors today, growing fast); establish SLAs for freshness, completeness and incident response.
- Embed with business stakeholders (Finance, Sales Ops, Product, Legal & Compliance) to translate ambiguous questions into governed data products and trusted KPIs, especially SaaS ARR, churn, agreement throughput and IAM adoption metrics.
- Lead and inspire a high-performing team of data engineers and architects; hire, coach, and set OKRs.
- Operationalise quality and observability using dbt tests, Great Expectations, lineage graphs and alerting so we achieve "five-nines" reliability.
- Partner on AI initiatives: deliver well-modelled features to data scientists; evaluate semantic layers, data contracts and cost-optimisation techniques.

Job Designation
Hybrid: Employees divide their time between in-office and remote work. Access to an office location is required (frequency: minimum 2 days per week; may vary by team, but there will be a weekly in-office expectation). Positions at Docusign are assigned a job designation of either In Office, Hybrid or Remote, specific to the role/job. Preferred job designations are not guaranteed when changing positions within Docusign. Docusign reserves the right to change a position's job designation depending on business needs and as permitted by local law.

What you bring
Basic:
- Advanced SQL and Python programming skills.
- 12+ years in data engineering with demonstrable success in an enterprise environment.
5+ years of experience as a people manager, Comfortable with Git-based DevOps. Preferred. Expert in Snowflake, dbt (production CI/CD, macros, tests) and Fivetran (setup, monitoring, log-based replication). Proven ability to model SaaS business processes—ARR/ACV, funnel, usage, billing, marketing, security and compliance. Track record building inclusive, global distributed teams. Adept at executive communication and prioritisation. Experience operationalizing data quality and observability using tools like dbt tests, data lineage, and alerting. Experience partnering on AI initiatives and delivering features for data scientists. Experience evaluating and implementing semantic layers, data contracts, and cost-optimization techniques. Life at Docusign. Working here. Docusign is committed to building trust and making the world more agreeable for our employees, customers and the communities in which we live and work. You can count on us to listen, be honest, and try our best to do what’s right, every day. At Docusign, everything is equal, We each have a responsibility to ensure every team member has an equal opportunity to succeed, to be heard, to exchange ideas openly, to build lasting relationships, and to do the work of their life. Best of all, you will be able to feel deep pride in the work you do, because your contribution helps us make the world better than we found it. And for that, you’ll be loved by us, our customers, and the world in which we live, Accommodation. Docusign is committed to providing reasonable accommodations for qualified individuals with disabilities in our job application procedures. If you need such an accommodation, or a religious accommodation, during the application process, please contact us at accommodations@docusign,, If you experience any issues, concerns, or technical difficulties during the application process please get in touch with our Talent organization at taops@docusign, for assistance, Applicant and Candidate Privacy Notice. Show more Show less
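The dbt tests this listing keeps coming back to boil down to machine-checked assertions about a table. As a rough, hypothetical illustration in Python (pandas stands in for the warehouse; the table and column names are invented, not Docusign's models), the two most common checks look like this:

```python
import pandas as pd

# Toy stand-ins for dbt's built-in not_null and unique schema tests.
# Table and column names are illustrative only.
def assert_not_null(df: pd.DataFrame, column: str) -> None:
    nulls = df[column].isna().sum()
    assert nulls == 0, f"{column}: {nulls} null value(s) found"

def assert_unique(df: pd.DataFrame, column: str) -> None:
    dupes = df[column].duplicated().sum()
    assert dupes == 0, f"{column}: {dupes} duplicate value(s) found"

agreements = pd.DataFrame(
    {"agreement_id": [1, 2, 3], "status": ["sent", "signed", "signed"]}
)
assert_not_null(agreements, "agreement_id")
assert_unique(agreements, "agreement_id")
print("quality gates passed")
```

In a real dbt project these assertions are declared in YAML and run in CI, which is what "every pull request ships with quality gates" amounts to.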

Posted 4 days ago

Apply

2.0 - 5.0 years

7 - 10 Lacs

Chennai

Hybrid


Job Description. Position Title: Data Support Analyst, India. Location: Chennai, India.

Summary/Objective: Reveleer is a healthcare data and analytics company that uses Artificial Intelligence to give health plans across all business lines greater control over their Quality Improvement, Risk Adjustment, and Member Management programs. With one transformative solution, the Reveleer platform enables plans to independently execute and manage every aspect of enrollment, provider outreach and data retrieval, coding, abstraction, reporting, and submissions. Leveraging proprietary technology, robust data sets, and subject matter expertise, Reveleer provides complete record retrieval and review services so health plans can confidently plan and execute risk, quality, and member management programs to deliver more value and improved outcomes.

General Summary: We are seeking a highly motivated and detail-oriented Data Support Analyst to join our team in Chennai. The ideal candidate will have strong SQL skills, a customer-first mindset, excellent analytical capabilities, and a proactive attitude. This role requires flexibility to provide support on weekends as needed and a continuous improvement mindset through data analysis and operational insights.

Key Responsibilities: Write and execute complex SQL queries, including SELECT statements, joins, updates, deletes, inserts, and data import/export tasks. Manage and manipulate large datasets using tools like MS Excel, MS Access, and SQL Server (or other RDBMS). Monitor, start, and stop Medicare batch jobs based on Standard Operating Procedures (SOPs). Use job scheduling tools to oversee and maintain automated data jobs. Perform high-level ETL (Extract, Transform, Load) activities and analyze data for patterns and continuous improvement. Provide weekend support on a rotational or on-call basis when needed.

Required Skills & Qualifications: Strong working knowledge of SQL, including complex queries and data operations. Experience with data management tools such as Excel, MS Access, and RDBMS platforms. Familiarity with Medicare batch job operations and SOPs. Hands-on experience with job scheduling and ETL tools. Strong analytical skills to interpret data and identify actionable insights. Ability to troubleshoot using system/application logs and command-line tools. Willingness to work flexible hours, including weekends when necessary. A strong sense of accountability and a positive, solution-oriented attitude.

Desired Skills: 2+ years of professional SQL experience. Experience using job scheduling tools such as vCron, Tidal, or BMC. 2+ years of experience working with MS Excel and MS Access (or equivalent). Prior experience reviewing and interpreting system and application logs.

Education: Bachelor's degree or higher in Computer Science, Information Technology, or equivalent experience.

Why Join Reveleer? Impactful Work: Be part of a mission-driven company dedicated to improving healthcare outcomes and driving innovation. Leadership Opportunity: Shape the growth and culture of our India technology team and influence the global technology roadmap. Collaboration & Culture: Work closely with a globally distributed team of passionate professionals committed to excellence. Competitive Compensation: Receive a comprehensive benefits package and opportunities for professional growth. Flexible Work Environment: Collaborate with modern tools and practices, with flexibility to balance on-site and remote work.
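For illustration, the join-plus-update SQL work this role describes can be sketched with Python's built-in sqlite3 module; the schema and values below are invented stand-ins, not Reveleer's data:

```python
import sqlite3

# In-memory SQLite database with a toy members/charts schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE members (member_id INTEGER PRIMARY KEY, plan TEXT);
    CREATE TABLE charts (chart_id INTEGER PRIMARY KEY, member_id INTEGER,
                         status TEXT);
    INSERT INTO members VALUES (1, 'Medicare'), (2, 'Commercial');
    INSERT INTO charts VALUES (10, 1, 'pending'), (11, 2, 'retrieved');
""")

# A join: charts still pending, with the owning member's plan.
rows = conn.execute("""
    SELECT c.chart_id, m.plan
    FROM charts AS c
    JOIN members AS m ON m.member_id = c.member_id
    WHERE c.status = 'pending'
""").fetchall()
print(rows)  # [(10, 'Medicare')]

# An update driven by a subquery: close out charts for Medicare members.
conn.execute("""
    UPDATE charts SET status = 'closed'
    WHERE member_id IN (SELECT member_id FROM members WHERE plan = 'Medicare')
""")
conn.commit()
```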

Posted 4 days ago

Apply

4.0 - 10.0 years

6 - 9 Lacs

Chennai

Work from Office


Experienced. Chennai. Posted 1 week ago. SolvEdge. We're dedicated to leveraging technology to make a positive impact in healthcare. Our software solutions are crafted to optimize processes, support patient care, and drive better health outcomes. As we continue to innovate, we're seeking an experienced PostgreSQL Developer to join our team. If you're enthusiastic about scalable database development and eager to contribute to meaningful healthcare technology projects, we want you on our journey to empower healthcare professionals with advanced tools and insights.

What You'll Do: We are looking for a skilled and detail-oriented PostgreSQL Developer with 4-6 years of hands-on experience to join our dynamic engineering team. In this role, you will be responsible for designing, developing, and optimizing PostgreSQL databases that power high-performance applications in the healthcare sector. You will collaborate with architects, backend engineers, and business analysts to deliver reliable and scalable data solutions.

Responsibilities. Database Development and Optimization: Design and implement efficient PostgreSQL schemas, indexes, constraints, and relationships. Develop advanced SQL queries, stored procedures, views, and triggers using PostgreSQL. Optimize complex queries and database performance for scalability and speed. Perform data profiling, query tuning, and performance analysis. Data Architecture and Modeling: Create and maintain logical and physical data models based on business requirements. Define standards for data consistency, normalization, and integrity. Implement data validation rules and constraints to ensure data accuracy. Integration and Collaboration: Collaborate with backend developers to ensure seamless data access through APIs and services. Design and implement ETL processes for internal data flows and external data ingestion. Work with cross-functional teams to translate business requirements into database logic. Tools and Automation: Utilize tools for database versioning (e.g., Flyway, Liquibase). Automate database deployments and migrations within CI/CD pipelines. Continuous Improvement: Monitor emerging PostgreSQL features and best practices. Recommend and implement improvements in data design, coding practices, and performance strategy.

Qualifications: Bachelor's degree in Computer Science, Engineering, or an equivalent technical field. 4-6 years of professional experience with PostgreSQL database development. Experience working in Agile/Scrum environments. Exposure to microservices and cloud-native applications is an advantage.

Primary Skills. PostgreSQL: Strong proficiency in PostgreSQL and advanced SQL. SQL Development: Experience building reusable stored procedures, functions, views, CTEs, and triggers. Performance Tuning: Expertise in optimizing complex queries using indexing, execution plans, and materialized views. Schema Design: In-depth knowledge of data modeling, normalization, and relational design. Data Integration: Experience with data pipelines, ETL processes, and transforming structured/semi-structured data. JSON/JSONB: Practical experience working with unstructured data and PostgreSQL's advanced JSON features. ORMs: Experience integrating PostgreSQL with ORMs such as Sequelize, Hibernate, or SQLAlchemy.

Secondary Skills: Experience working with cloud-based PostgreSQL (e.g., AWS RDS, Azure Database for PostgreSQL). Familiarity with RESTful APIs and backend service integration. Working knowledge of NoSQL alternatives, hybrid storage strategies, or data lakes. CI/CD and DevOps understanding for integrating DB updates into pipelines. Strong analytical and debugging skills. Effective communication and documentation abilities to interact with stakeholders.

Why Apply? Even if you feel you don't meet every single requirement, we encourage you to apply. We're looking for passionate individuals who may bring diverse perspectives and skills to our team. At SolvEdge, we value talent and dedication and are committed to fostering growth within our organization.

How to Apply? Ready to make a difference? Submit your resume, a cover letter that highlights your qualifications, and any relevant experience. We look forward to hearing from you! SolvEdge is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.

About SolvEdge: Pioneering the future of digital healthcare.

Our Expertise: SOLVEDGE stands at the forefront of digital healthcare innovation as a premier healthcare performance company. With over 18 years of dedicated service in the healthcare industry, we specialize in a digital care journey platform that revolutionizes how hospitals and health systems engage, monitor, and connect with patients throughout their healthcare experiences. Our partnership with Fortune 100 medical device companies and hospitals nationwide underscores our position as a trusted partner in healthcare solutions.

Key Features of SOLVEDGE: Our platform is designed to empower healthcare providers with the tools they need to automate and streamline care delivery, thereby improving clinical outcomes and patient satisfaction. Personalized Care Plans: Leveraging evidence-based data, SOLVEDGE delivers digital care plans customized to meet the individual needs and conditions of each patient. Real-Time Patient Monitoring: Through daily health checks, assessments, surveys, and integration with wearable devices, our platform facilitates continuous monitoring of patient health. Automated Care Delivery: We automate essential tasks, including appointment scheduling, sending reminders, and delivering educational content, to enhance patient engagement and reduce administrative tasks. Remote Patient Monitoring: Healthcare providers can monitor vital signs, symptoms, and treatment plan adherence remotely, enabling timely interventions and proactive care management.

The SOLVEDGE Advantage: Our platform offers significant benefits to healthcare providers and patients alike. Improved Clinical Outcomes: By facilitating more effective care pathways and enabling early intervention, SOLVEDGE contributes to reduced readmission rates, fewer emergency department visits, and shorter hospital stays. Enhanced Patient Satisfaction: Patients enjoy a higher quality of care with SOLVEDGE, benefiting from improved communication, comprehensive education, and continuous support. Cost Savings: Healthcare organizations can achieve substantial cost reductions by minimizing unnecessary readmissions, emergency visits, and complications associated with poor care management.

Applications and Impact: SOLVEDGE's versatility allows for its application across various aspects of healthcare, with a particular emphasis on surgical care. From preparing patients for surgery to monitoring their post-operative recovery, our platform ensures a seamless and supportive care journey. Beyond surgical care, our focus encompasses managing care pathways, enhancing patient engagement through patient-reported outcomes, providing advanced data analytics, integrating with electronic medical records (EMR), and streamlining billing processes. Our comprehensive approach addresses the myriad challenges faced by today's healthcare industry, backed by our commitment to excellence in service, communication, and customer experience.

A Trusted Partner in Healthcare Innovation: Our strategic relationships and deep understanding of healthcare challenges have positioned us as an indispensable ally to healthcare providers nationwide. As we continue to develop innovative solutions, our goal remains unchanged: to simplify healthcare delivery, improve patient outcomes, and enhance the overall patient experience.

Job Category: Developer.
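As a hedged sketch of the JSONB skill listed under Primary Skills: the query below uses PostgreSQL's ->> field-extraction and @> containment operators via psycopg2. The DSN, table, and payload shape are hypothetical, and a reachable PostgreSQL server is assumed:

```python
import psycopg2  # requires a running PostgreSQL instance to actually execute

# Hypothetical DSN and table, shown only to illustrate JSONB operators.
conn = psycopg2.connect("dbname=careplans user=app password=secret host=localhost")
with conn, conn.cursor() as cur:
    # ->> extracts a JSON field as text; @> tests JSONB containment.
    cur.execute(
        """
        SELECT id, payload ->> 'patient_name'
        FROM care_plans
        WHERE payload @> %s::jsonb
        """,
        ('{"status": "active"}',),
    )
    for row in cur.fetchall():
        print(row)
```

Indexing the JSONB column with a GIN index is the usual companion to containment queries like this one.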

Posted 4 days ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Hyderabad

Work from Office


Job Title: Data Engineer. Job Type: Full-Time. Location: On-site, Hyderabad, Telangana, India.

Job Summary: We are seeking an accomplished Data Engineer to join the dynamic team of one of our top customers in Hyderabad. You will be instrumental in designing, implementing, and optimizing data pipelines that drive our business insights and analytics. If you are passionate about harnessing the power of big data, possess a strong technical skill set, and thrive in a collaborative environment, we would love to hear from you.

Key Responsibilities: Develop and maintain scalable data pipelines using Python, PySpark, and SQL. Implement robust data warehousing and data lake architectures. Leverage the Databricks platform to enhance data processing and analytics capabilities. Model, design, and optimize complex database schemas. Collaborate with cross-functional teams to understand data requirements and deliver actionable insights. Lead and mentor junior data engineers and establish best practices. Troubleshoot and resolve data processing issues promptly.

Required Skills and Qualifications: Strong proficiency in Python and PySpark. Extensive experience with the Databricks platform. Advanced SQL and data modeling skills. Demonstrated experience in data warehousing and data lake architectures. Exceptional problem-solving and analytical skills. Strong written and verbal communication skills.

Preferred Qualifications: Experience with graph databases, particularly MarkLogic. Proven track record of leading data engineering teams. Understanding of data governance and best practices in data management.
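To make the PySpark requirement concrete, here is a minimal, illustrative pipeline step of the kind this listing describes; the S3 paths, columns, and aggregation are placeholders, not the customer's actual systems:

```python
from pyspark.sql import SparkSession, functions as F

# A small batch ETL step: filter, aggregate by day, write back out.
spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Hypothetical input location and schema.
orders = spark.read.parquet("s3://example-bucket/raw/orders/")

daily = (
    orders
    .filter(F.col("status") == "complete")
    .groupBy(F.to_date("created_at").alias("order_date"))
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("orders"))
)

# Hypothetical curated-zone output location.
daily.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_orders/")
```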

Posted 4 days ago

Apply

2.0 - 6.0 years

3 - 7 Lacs

Bengaluru

Work from Office


Headquartered in Noida, India, Paytm Insurance Broking Private Limited (PIBPL), a wholly owned subsidiary of One97 Communications (OCL), is an online insurance marketplace that offers insurance products across all leading insurance companies, with products across auto, life, and health insurance, and provides policy management and claim services for our customers.

Expectations/Responsibilities: 1. Using automated tools to extract data from primary and secondary sources. 2. Removing corrupted data and fixing coding errors and related problems. 3. Developing and maintaining databases and data systems, reorganizing data in a readable format. 4. Preparing reports for the management stating trends, patterns, and predictions using relevant data. 5. Preparing final analysis reports for the stakeholders to understand the data-analysis steps, enabling them to take important decisions based on various facts and trends. 6. Supporting the data warehouse in identifying and revising reporting requirements. 7. Setting up robust automated dashboards to drive performance management. 8. Deriving business insights from data with a focus on driving business-level metrics. 9. 1-2 years of experience in business analysis or a related field.

Superpowers/Skills that will help you succeed in this role: 1. Problem solving: assess what data is required to prove hypotheses and derive actionable insights. 2. Analytical skills: top-notch Excel skills are necessary. 3. Strong communication and project management skills. 4. Hands-on with SQL, Hive, and Excel, and comfortable handling very large-scale data. 5. Ability to interact with and convince business stakeholders. 6. Experience working with web analytics platforms is an added advantage. 7. Experimentative mindset with attention to detail. 8. Proficiency in Advanced SQL, MS Excel, and Python or R is a must. 9. Exceptional analytical and conceptual thinking skills. 10. The ability to influence stakeholders and work closely with them to determine acceptable solutions. 11. Advanced technical skills. 12. Excellent documentation skills. 13. Fundamental analytical and conceptual thinking skills. 14. Experience creating detailed reports and giving presentations. 15. Competency in Microsoft applications including Word, Excel, and Outlook. 16. A track record of following through on commitments. 17. Excellent planning, organizational, and time management skills. 18. Experience leading and developing top-performing teams. 19. A history of leading and supporting successful projects.

Preferred Industry: Fintech / E-commerce / Data Analytics. Education: Any graduate; a graduate from a premium institute is preferred.

Why join us: 1. We give immense opportunities to make a difference, and have a great time doing that. 2. You are challenged and encouraged here to do meaningful work for yourself and customers/clients. 3. We are successful, and our successes are rooted in our people's collective energy and unwavering focus on the customer, and that's how it will always be.

Posted 4 days ago

Apply

4.0 - 7.0 years

9 - 13 Lacs

Noida

Work from Office


Paytm is India's leading mobile payments and financial services distribution company. Pioneer of the mobile QR payments revolution in India, Paytm builds technologies that help small businesses with payments and commerce. Paytm's mission is to serve half a billion Indians and bring them to the mainstream economy with the help of technology.

Job Location: Noida (Work From Office). Max Experience: 2-6 Yrs.

About the team: You will be working on millions of merchants' device data and getting insights from it.

Expectations: 1. Must have advanced SQL experience. 2. Basic knowledge of Python. 3. Working experience with Google Looker or another visualization tool. 4. Should have worked on automation projects related to data integration, extraction, and visualization. 5. Should have advanced Excel knowledge. 6. Should have built management reports, MIS, and dashboards; experience in VBA. 7. Analytics and critical thinking are important. 8. Good to have: worked on Looker Studio for visualization.

Superpowers/Skills that will help you succeed in this role: 1. Bachelor's degree in Computer Science, Engineering, or a related field. 2. 2+ years of experience as a Product Analyst, preferably in the financial services or technology industry. 3. Strong analytical and problem-solving skills, with the ability to understand complex business problems and translate them into technical requirements. 4. Excellent communication and interpersonal skills, with the ability to work effectively with cross-functional teams and stakeholders. 5. Experience in market research and analysis is a plus.

Posted 4 days ago

Apply

4.0 - 6.0 years

7 - 10 Lacs

Bengaluru, Mumbai (All Areas)

Work from Office


Job Title: Business Analyst (Advanced SQL & Power BI). Location: Bangalore/Mumbai. Job Type: Full-time. Experience: 5-7 years. Industry: BFSI / FinTech / Analytics / Consulting.

Job Summary: We are looking for a highly skilled Business Analyst with deep expertise in Advanced SQL and Power BI to join our analytics team. The ideal candidate will be responsible for translating business requirements into data models, dashboards, and actionable insights, while working closely with cross-functional teams to drive data-informed decisions.

Key Responsibilities: Collaborate with stakeholders to gather and document business and data requirements. Design and develop interactive dashboards and reports using Power BI. Write and optimize complex SQL queries for data extraction, transformation, and analysis. Perform data profiling, cleansing, and validation to ensure data quality. Conduct root cause analysis, identify trends, and provide strategic recommendations. Create data dictionaries and ER diagrams, and maintain documentation for BI solutions. Support data governance, compliance, and audit requirements. Work in Agile/Scrum environments and participate in sprint planning and reviews.

Technical Skills (Detailed). SQL (Advanced): Writing complex queries using joins (inner, outer, self, cross), Common Table Expressions (CTEs), window functions (RANK, ROW_NUMBER, LEAD/LAG), subqueries and correlated subqueries, stored procedures, triggers, and views. Query optimization and performance tuning. Experience with an RDBMS like SQL Server, PostgreSQL, MySQL, or Oracle.

Power BI: Advanced DAX for calculated columns, measures, and KPIs. Power Query (M language) for data transformation. Designing data models with star/snowflake schemas. Implementing Row-Level Security (RLS). Creating custom visuals, bookmarks, and drill-through reports. Publishing and managing reports on Power BI Service.

Data Integration & ETL: Experience with ETL tools like SSIS, Azure Data Factory, or Talend. Connecting Power BI to various data sources (SQL, Excel, APIs, SharePoint, etc.). Understanding of data warehousing and data lake concepts.

Business Analysis Tools & Techniques: Process modelling (BPMN, flowcharts). Use case and user story development. Wireframing tools (e.g., Balsamiq, Figma). Agile tools (JIRA, Confluence).

Preferred Qualifications: Experience in BFSI, FinTech, or regulated industries. Familiarity with cloud platforms (Azure, AWS, GCP). Knowledge of Python or R for data analysis is a plus. Microsoft certifications (e.g., PL-300: Power BI Data Analyst).

Soft Skills: Strong analytical and problem-solving mindset. Excellent communication and stakeholder engagement. Ability to work independently and in cross-functional teams. Attention to detail and a passion for data storytelling.
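Two of the advanced SQL constructs this listing names, CTEs and window functions, can be demonstrated with Python's bundled sqlite3 module (SQLite 3.25+); the table and rows below are invented for illustration:

```python
import sqlite3  # SQLite 3.25+ ships window-function support

# A CTE plus ROW_NUMBER(): top payment per customer.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE payments (id INTEGER, customer TEXT, amount REAL);
    INSERT INTO payments VALUES
        (1, 'acme', 120.0), (2, 'acme', 340.0), (3, 'globex', 90.0);
""")
top_per_customer = conn.execute("""
    WITH ranked AS (
        SELECT customer, amount,
               ROW_NUMBER() OVER (
                   PARTITION BY customer ORDER BY amount DESC
               ) AS rn
        FROM payments
    )
    SELECT customer, amount FROM ranked WHERE rn = 1
""").fetchall()
print(top_per_customer)  # e.g. [('acme', 340.0), ('globex', 90.0)]
```

The same WITH ... ROW_NUMBER() OVER (PARTITION BY ...) pattern carries over to SQL Server, PostgreSQL, MySQL 8+, and Oracle.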

Posted 5 days ago

Apply

5.0 - 10.0 years

7 - 11 Lacs

Pune

Work from Office


5+ years of experience with BI tools, with expertise and/or certification in at least one major BI platform - Tableau preferred. Advanced knowledge of SQL, including the ability to write complex stored procedures, views, and functions. Proven capability in data storytelling and visualization, delivering actionable insights through compelling presentations. Excellent communication skills, with the ability to convey complex analytical findings to non-technical stakeholders in a clear, concise, and meaningful way. Identifying and analyzing industry trends, geographic variations, competitor strategies, and emerging customer behavior.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: Provide expertise in analysis, requirements gathering, design, coordination, customization, testing, and support of reports in the client's environment. Develop and maintain a strong working relationship with business and technical members of the team. Relentless focus on quality and continuous improvement. Perform root cause analysis of report issues. Development / evolutionary maintenance of the environment, performance, capability, and availability. Assisting in defining technical requirements and developing solutions. Effective content and source-code management, troubleshooting, and debugging.

Preferred technical and professional experience: Troubleshooting capabilities to debug data controls. Capable of converting business requirements into a workable model. Good communication skills, willingness to learn new technologies, team player, self-motivated, positive attitude. Must have a thorough understanding of SQL and advanced SQL (joining and relationships).

Posted 5 days ago

Apply

2.0 - 5.0 years

7 - 11 Lacs

Pune

Work from Office


Provide expertise in analysis, requirements gathering, design, coordination, customization, testing, and support of reports in the client's environment. Develop and maintain a strong working relationship with business and technical members of the team. Relentless focus on quality and continuous improvement. Perform root cause analysis of report issues. Development / evolutionary maintenance of the environment, performance, capability, and availability. Assisting in defining technical requirements and developing solutions. Effective content and source-code management, troubleshooting, and debugging.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: Tableau Desktop Specialist. SQL: strong understanding of SQL for querying databases. Good to have: Python, Snowflake, statistics, ETL experience. Extensive knowledge of creating impactful visualizations using Tableau. Must have a thorough understanding of SQL and advanced SQL (joining and relationships). Must have experience working with different databases and how to blend and create relationships in Tableau. Must have extensive knowledge of creating Custom SQL to pull desired data from databases. Troubleshooting capabilities to debug data controls.

Preferred technical and professional experience: Troubleshooting capabilities to debug data controls. Capable of converting business requirements into a workable model. Good communication skills, willingness to learn new technologies, team player, self-motivated, positive attitude. Must have a thorough understanding of SQL and advanced SQL (joining and relationships).

Posted 5 days ago

Apply

3.0 - 8.0 years

10 - 20 Lacs

Bengaluru

Work from Office


Hiring for a FAANG company. Note: This position is part of a program designed to support women professionals returning to the workforce after a career break (9+ months career gap).

About the Role: A global analytics team is seeking a Business Intelligence Engineer to drive data-backed insights, design robust reporting frameworks, and influence key business strategies across international markets. This role is ideal for professionals returning to the workforce and looking to re-engage in high-impact analytical work. You will collaborate closely with business stakeholders across geographies (Europe, US, Japan, Asia), working on payments and lending analytics. This is a high-ownership, high-impact role requiring a passion for data, a knack for storytelling through dashboards, and the ability to work independently in a fast-paced environment.

Key Responsibilities: Design and maintain dashboards, reports, and metrics to support executive-level business decision-making. Ensure data accuracy and integrity across tools, dashboards, and reporting pipelines. Use SQL, Excel, and scripting languages (e.g., Python, R, Java) for deep-dive analysis. Develop BI tools and data visualizations using platforms like Tableau, AWS QuickSight, Looker, etc. Analyze business trends and apply statistical rigor (t-tests, chi-squared tests, regressions, forecasting) to derive insights. Lead alignment and standardization of key metrics across global BI teams. Model data and metadata to support robust analytics infrastructure. Automate manual reporting efforts to enhance operational efficiency. Work with cross-functional teams to recommend data-driven growth strategies. Present insights and narratives to stakeholders, including global leaders and executives.

A Day in the Life: Define and refine performance metrics, reports, and insights for international payment systems. Drive analytical alignment with global BI leaders and executive stakeholders. Lead deep dives into metrics influencing revenue, signups, and operational performance. Own VP- and Director-level reporting initiatives and decision-support analysis. Collaborate across regions to deliver unified and actionable analytics strategies.

Basic Qualifications: 2+ years of experience in data analytics using Redshift, Oracle, NoSQL, or similar data sources. Strong SQL skills for data retrieval and analysis. Proficiency in data visualization using Tableau, QuickSight, Power BI, or similar tools. Comfort with scripting languages like Python, Java, or R. Experience applying statistical techniques to real-world data problems.

Preferred Qualifications: Master's degree or other advanced technical degree. Experience with data modeling and data pipeline architecture. Strong grasp of statistical analysis techniques, including correlation analysis and hypothesis testing.

Top 10 Must-Have Skills: Advanced SQL. Data Visualization (Tableau, QuickSight, Power BI, Looker). Statistical Analysis (t-test, chi-squared, regression). Scripting (Python / R / Java). Redshift / Oracle / NoSQL Databases. Dashboard & Report Development. Data Modeling & Pipeline Design. Cross-functional Global Collaboration. Business Metrics & KPI Definition. Executive-Level Reporting.
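The statistical tests this listing names are one-liners with SciPy. A toy, hedged example (all numbers invented) of a two-sample t-test and a chi-squared test:

```python
from scipy import stats

# Invented conversion rates for two sign-up flows, five sample periods each.
control = [0.12, 0.15, 0.11, 0.14, 0.13]
variant = [0.16, 0.18, 0.15, 0.17, 0.19]

# Two-sample t-test: did the variant shift the mean?
t_stat, p_value = stats.ttest_ind(variant, control)
print(f"t={t_stat:.2f}, p={p_value:.4f}")

# Chi-squared test on a 2x2 contingency table (converted vs. not, per flow).
observed = [[180, 820], [230, 770]]
chi2, p, dof, expected = stats.chi2_contingency(observed)
print(f"chi2={chi2:.2f}, p={p:.4f}")
```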

Posted 5 days ago

Apply

8.0 - 13.0 years

30 - 35 Lacs

Bengaluru

Work from Office


About The Role: Data Engineer - 1 (Experience: 0-2 years)

What we offer: Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to come join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team: DEX is a central data org for Kotak Bank which manages the entire data experience of Kotak Bank. DEX stands for Kotak's Data Exchange. This org comprises the Data Platform, Data Engineering, and Data Governance charters. The org sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, currently an on-premise solution, into a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities for technology fellows to build things from scratch and build one of the best-in-class data lake house solutions. The primary skills this team should encompass are software development skills, preferably Python, for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org size is expected to be around a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charter. As a member of this team, you get the opportunity to learn the fintech space, the most sought-after domain in the current world; be an early member in the digital transformation journey of Kotak; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and also be futuristic to build systems which can be operated by machines using AI technologies.

The data platform org is divided into 3 key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and building a centralized data lake, a managed compute and orchestration framework including concepts of serverless data solutions, managing a central data warehouse for extremely high concurrency use cases, building connectors for different sources, building a customer feature repository, building cost optimization solutions like EMR optimizers, performing automations, and building observability capabilities for Kotak's data platform. The team will also be the center for Data Engineering excellence, driving trainings and knowledge sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, be skilled to source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way, and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, which cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, Branch Managers, and all analytics use cases.

Data Governance: The team will be the central data governance team for Kotak Bank, managing metadata platforms, Data Privacy, Data Security, Data Stewardship, and the Data Quality platform. If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you.

Your day-to-day role will include: Drive business decisions with technical input and lead the team. Design, implement, and support a data infrastructure from scratch. Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA. Extract, transform, and load data from various sources using SQL and AWS big data technologies. Explore and learn the latest AWS technologies to enhance capabilities and efficiency. Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis. Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers. Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer/SDE in Data: Bachelor's degree in Computer Science, Engineering, or a related field. Experience in data engineering. Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR. Experience with data pipeline tools such as Airflow and Spark. Experience with data modeling and data quality best practices. Excellent problem-solving and analytical skills. Strong communication and teamwork skills. Experience in at least one modern scripting or programming language, such as Python, Java, or Scala. Strong advanced SQL skills.

PREFERRED QUALIFICATIONS: AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow. Prior experience in the Indian banking segment and/or fintech is desired. Experience with non-relational databases and data stores. Building and operating highly available, distributed data processing systems for large datasets. Professional software engineering and best practices for the full software development life cycle. Designing, developing, and implementing different types of data warehousing layers. Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions. Building scalable data infrastructure and understanding distributed systems concepts. SQL, ETL, and data modelling. Ensuring the accuracy and availability of data to customers. Proficient in at least one scripting or programming language for handling large-volume data processing. Strong presentation and communication skills.
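As an illustrative sketch of the Airflow-orchestrated pipeline work named in the qualifications (not Kotak's actual DAGs; the task bodies are placeholders), a minimal two-step daily DAG looks like this:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Placeholder task bodies; real pipelines would trigger Spark/EMR/Glue jobs.
def extract():
    print("pull data from a source system")

def load():
    print("write curated data to the lake")

# Minimal Airflow (2.4+) DAG: one daily run, extract before load.
with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```

On AWS, MWAA (mentioned in the role) runs exactly this kind of DAG as a managed service.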

Posted 6 days ago

Apply

9.0 - 14.0 years

30 - 35 Lacs

Bengaluru

Work from Office


About The Role: Data Engineer - 2 (Experience: 2-5 years)

What we offer: Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to come join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team: DEX is a central data org for Kotak Bank which manages the entire data experience of Kotak Bank. DEX stands for Kotak's Data Exchange. This org comprises the Data Platform, Data Engineering, and Data Governance charters. The org sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, currently an on-premise solution, into a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities for technology fellows to build things from scratch and build one of the best-in-class data lake house solutions. The primary skills this team should encompass are software development skills, preferably Python, for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org size is expected to be around a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charter. As a member of this team, you get the opportunity to learn the fintech space, the most sought-after domain in the current world; be an early member in the digital transformation journey of Kotak; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and also be futuristic to build systems which can be operated by machines using AI technologies.

The data platform org is divided into 3 key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and building a centralized data lake, a managed compute and orchestration framework including concepts of serverless data solutions, managing a central data warehouse for extremely high concurrency use cases, building connectors for different sources, building a customer feature repository, building cost optimization solutions like EMR optimizers, performing automations, and building observability capabilities for Kotak's data platform. The team will also be the center for Data Engineering excellence, driving trainings and knowledge sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, be skilled to source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way, and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, which cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, Branch Managers, and all analytics use cases.

Data Governance: The team will be the central data governance team for Kotak Bank, managing metadata platforms, Data Privacy, Data Security, Data Stewardship, and the Data Quality platform. If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you.

Your day-to-day role will include: Drive business decisions with technical input and lead the team. Design, implement, and support a data infrastructure from scratch. Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA. Extract, transform, and load data from various sources using SQL and AWS big data technologies. Explore and learn the latest AWS technologies to enhance capabilities and efficiency. Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis. Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers. Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer/SDE in Data: Bachelor's degree in Computer Science, Engineering, or a related field. Experience in data engineering. Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR. Experience with data pipeline tools such as Airflow and Spark. Experience with data modeling and data quality best practices. Excellent problem-solving and analytical skills. Strong communication and teamwork skills. Experience in at least one modern scripting or programming language, such as Python, Java, or Scala. Strong advanced SQL skills.

PREFERRED QUALIFICATIONS: AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow. Prior experience in the Indian banking segment and/or fintech is desired. Experience with non-relational databases and data stores. Building and operating highly available, distributed data processing systems for large datasets. Professional software engineering and best practices for the full software development life cycle. Designing, developing, and implementing different types of data warehousing layers. Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions. Building scalable data infrastructure and understanding distributed systems concepts. SQL, ETL, and data modelling. Ensuring the accuracy and availability of data to customers. Proficient in at least one scripting or programming language for handling large-volume data processing. Strong presentation and communication skills.

Posted 6 days ago

Apply

5.0 - 10.0 years

5 - 15 Lacs

Noida, Gurugram, Delhi / NCR

Work from Office


Role: Senior Test Analyst. Experience: 5+ years. Location: Gurgaon only.

Must have: ETL testing with advanced SQL.

Good to have: automation using Selenium with Java; banking domain experience; CI/CD knowledge.
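A common "ETL testing with advanced SQL" check is reconciling row counts between source and target tables after a load. A self-contained toy sketch using Python's sqlite3 (tables and rows invented):

```python
import sqlite3

# Miniature source and target tables standing in for a real ETL hand-off.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (id INTEGER);
    CREATE TABLE tgt_orders (id INTEGER);
    INSERT INTO src_orders VALUES (1), (2), (3);
    INSERT INTO tgt_orders VALUES (1), (2), (3);
""")

src = conn.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
tgt = conn.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]

# A mismatch here would raise and fail the test run.
assert src == tgt, f"row-count mismatch: source={src}, target={tgt}"
print(f"row counts reconciled: {src}")
```

Real suites extend this with checksum comparisons and column-level MINUS/EXCEPT queries.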

Posted 6 days ago

Apply

4.0 - 6.0 years

12 - 18 Lacs

Noida, Greater Noida

Work from Office


Role & responsibilities: Utilize Python (specifically Pandas) to clean, transform, and analyze data, automate repetitive tasks, and create custom reports and visualizations. Analyze and interpret complex datasets, deriving actionable insights to support business decisions. Write and optimize advanced SQL queries for data extraction, manipulation, and analysis from various sources, including relational databases and cloud-based data storage. Collaborate with cross-functional teams to understand data needs and deliver data-driven solutions. Create and maintain dashboards and reports that visualize key metrics and performance indicators. Identify trends, patterns, and anomalies in data to support business intelligence efforts and provide strategic recommendations. Ensure data integrity and accuracy by developing and implementing data validation techniques. Support data migration, transformation, and ETL processes within cloud environments.

Requirements: 3-5 years of experience as a Data Analyst or in an equivalent role. Good experience in Python, with hands-on experience using Pandas for data analysis and manipulation. Expertise in analytical SQL, including writing complex queries for data extraction, aggregation, and transformation. Knowledge of cloud platforms, particularly AWS (Amazon Web Services). Strong analytical thinking, problem-solving, and troubleshooting abilities. Familiarity with data visualization tools (e.g., Tableau, Power BI, QuickSight, Superset, etc.) is a plus. Excellent communication skills, with the ability to explain complex data insights in a clear and actionable manner. Detail-oriented with a focus on data quality and accuracy.

Preferred Qualifications: Experience working in a cloud-based data analytics environment. Familiarity with additional cloud services and tools (e.g., Snowflake, Athena). Experience working in an Agile environment or with data-oriented teams.
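A small, hypothetical example of the Pandas cleaning-and-validation work described above (the frame, columns, and rules are invented for illustration):

```python
import pandas as pd

# Invented raw transactions with the usual defects: duplicates,
# stringly-typed numbers, inconsistent casing and stray whitespace.
raw = pd.DataFrame({
    "txn_id": [1, 2, 2, 3],
    "amount": ["100", "250", "250", None],
    "city": [" Noida", "noida", "noida", "Greater Noida "],
})

clean = (
    raw.drop_duplicates(subset="txn_id")           # remove repeated records
       .assign(
           amount=lambda d: pd.to_numeric(d["amount"], errors="coerce"),
           city=lambda d: d["city"].str.strip().str.title(),
       )
       .dropna(subset=["amount"])                  # basic validation gate
)
print(clean)
```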

Posted 6 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies