Jobs
Interviews

140 AWS Technologies Jobs

Set up a job alert
JobPe aggregates results for easy access, but you apply directly on the original job portal.

10.0 - 15.0 years

20 - 25 Lacs

Pune

Work from Office

Solution Design & Implementation: Designs, builds, and deploys solutions on AWS, ensuring scalability, security, and cost-effectiveness.
Customer Engagement: Works directly with customers to understand their needs, business objectives, and technical challenges.
Technical Expertise: Possesses deep knowledge of AWS services, best practices, and industry trends, including but not limited to AWS Control Tower, AWS Organizations, VPC, AWS Cloud WAN, and AWS Security Hub.
Networking: Strong expertise in configuring networking services in AWS.
Communication & Collaboration: Effectively communicates technical solutions to both technical and non-technical audiences.
Continuous Learning: Keeps abreast of the latest AWS technologies and trends.
Problem Solving: Identifies and resolves complex technical issues.
Digital Transformation: Helps organizations move from on-premises systems to the cloud, enabling digital transformation.

Qualifications:
Experience: Extensive experience in designing, implementing, and managing cloud solutions on AWS, typically 10+ years.
Skills: Strong technical skills in cloud computing, networking, security, and application development.
Certifications: AWS certifications, such as AWS Certified Solutions Architect - Professional, are highly valued.
Education: Bachelor's degree in Computer Science or a related field.
Soft Skills: Excellent communication, problem-solving, and collaboration skills.
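The VPC networking expertise this role calls for starts with tasks like carving a VPC CIDR block into subnets. As a hedged illustration (the CIDR ranges below are hypothetical examples, not taken from the posting), Python's standard `ipaddress` module can sketch the arithmetic:

```python
import ipaddress

# Hypothetical VPC CIDR block; real allocations come from your network plan.
vpc = ipaddress.ip_network("10.0.0.0/16")

# Split the /16 VPC into /24 subnets (256 addresses each) and take the
# first three, e.g. one per availability zone.
subnets = list(vpc.subnets(new_prefix=24))[:3]
for net in subnets:
    # Classic subnetting reserves network + broadcast addresses; note that
    # AWS additionally reserves five addresses per subnet.
    print(net, "addresses:", net.num_addresses)
```

This only shows the address arithmetic; actual subnet creation would go through the VPC APIs or infrastructure-as-code tooling.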

Posted 3 days ago

Apply

8.0 - 13.0 years

30 - 35 Lacs

Bengaluru

Work from Office

About The Role: Data Engineer - 1 (Experience: 0-2 years)

What we offer: Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by taking a technology-first approach in everything we do, with the aim of enhancing customer experience through superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team: DEX (Kotak's Data Exchange) is the central data org for Kotak Bank and manages the bank's entire data experience. The org comprises the Data Platform, Data Engineering, and Data Governance charters, and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premises solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists great opportunities to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills this team should encompass are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, SparkSQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based in Bangalore, comprising ~10 sub-teams independently driving their charters.

As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; to be an early member of Kotak's digital transformation journey; to learn and leverage technology to build complex data platform solutions (real-time, micro-batch, batch, and analytics) in a programmatic way; and to look ahead to systems that can be operated by machines using AI technologies. The data platform org is divided into three key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank; a centralized data lake; managed compute and orchestration frameworks, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak's data platform. The team will also be the centre of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial orgs. It will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, and Branch Managers, and by all analytics use cases.

Data Governance: This will be the central data governance team for Kotak Bank, managing the metadata platforms and the Data Privacy, Data Security, Data Stewardship, and Data Quality platforms.

If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency, multi-system environments, then this is the team for you. Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data:
- Bachelor's degree in Computer Science, Engineering, or a related field
- Experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

PREFERRED QUALIFICATIONS:
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech is desired
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices for the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills
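The extract-transform-load responsibilities described in this posting can be sketched end to end. The example below is a minimal local stand-in, with an in-memory CSV and SQLite standing in for S3 and Redshift purely by analogy; the data and table names are invented for illustration and are not Kotak's pipeline:

```python
import csv
import io
import sqlite3

# Extract: in a real pipeline this would read from S3; here, an in-memory CSV.
raw = io.StringIO("account_id,txn_amount\nA1,100\nA1,250\nB2,75\n")
rows = list(csv.DictReader(raw))

# Transform: cast string fields to typed records.
records = [(r["account_id"], int(r["txn_amount"])) for r in rows]

# Load: a local SQLite table stands in for the warehouse (e.g. Redshift).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE txns (account_id TEXT, txn_amount INTEGER)")
con.executemany("INSERT INTO txns VALUES (?, ?)", records)

# Analytics-style SQL, the kind of aggregation the role describes.
totals = dict(con.execute(
    "SELECT account_id, SUM(txn_amount) FROM txns GROUP BY account_id"
))
print(totals)  # totals == {'A1': 350, 'B2': 75}
```

At production scale the same extract/transform/load shape would typically run as Spark jobs orchestrated by Airflow (MWAA), with the SQL executed against the warehouse.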

Posted 3 days ago

Apply

5.0 - 9.0 years

0 - 1 Lacs

Noida

Remote

Roles and Responsibilities:
- Deliver technical solutions with robust, well-tested code.
- Advocate and advance modern, agile software development practices, and help develop and evangelise engineering and organisational practices.
- Contribute to growing a healthy, collaborative engineering culture in line with the organisation's defined values.
- Be an active part of the engineering community and collaborate with other technical leads.
- Identify and resolve technical debt.
- Balance business deliverables and technical excellence.
- Document and share knowledge across teams where required.

Skillset Needed:
- Spring Boot, MySQL/Aurora, Spring Cloud, Spring Data, Java 8-11, JUnit, Spring integration tests.
- Strong understanding of Agile Scrum methodology.
- Strong working knowledge of RESTful APIs.
- Experience working with AWS technologies (Kubernetes, Lambda, Elasticsearch, etc.) and Docker.
- Developing in a microservices architecture style.
- Good knowledge of software development methodologies and techniques.
- Develops maintainable and supportable code: clean, reusable code that's easy to read and test.
- Quality-oriented; understands software testing, writes tests where appropriate, and understands test/behaviour-driven development.
- Strong problem-solving capability, using appropriate debugging tools.

Experience: 5+ years.

Organization: This is a direct job with iMEGH Private Limited (India). Fluid.Live is the hiring partner for iMEGH Private Limited.
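The RESTful API knowledge asked for here boils down to exposing resources over HTTP with well-defined status codes and JSON bodies. The posting's actual stack is Spring Boot in Java; as a stack-neutral sketch of the same request/response shape, here is a hypothetical `/health` endpoint using only Python's standard library (endpoint name and payload are invented for illustration):

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    """Minimal REST-style handler: one GET resource, JSON response."""

    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"status": "UP"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)  # unknown resource

    def log_message(self, *args):
        pass  # keep the sketch quiet

server = HTTPServer(("127.0.0.1", 0), Handler)  # port 0: pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()
print("listening on port", server.server_address[1])
```

In Spring Boot the equivalent would be a `@RestController` method; the transferable part is the resource-per-path, status-code, and content-type discipline, not this particular server.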

Posted 5 days ago

Apply

5.0 - 8.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Educational Requirements: Bachelor of Engineering
Service Line: Data & Analytics Unit

Responsibilities: A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance and issue resolution, and to ensure high customer satisfaction. You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Knowledge of more than one technology
- Basics of architecture and design fundamentals
- Knowledge of testing tools
- Knowledge of agile methodologies
- Understanding of project life cycle activities on development and maintenance projects
- Understanding of one or more estimation methodologies; knowledge of quality processes
- Basics of the business domain, to understand business requirements
- Analytical abilities, strong technical skills, and good communication skills
- Good understanding of the technology and domain
- Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods
- Awareness of the latest technologies and trends
- Excellent problem-solving, analytical, and debugging skills

Technical and Professional Requirements:
Primary skills: Technology -> Big Data - Data Processing -> Spark; Technology -> Data On Cloud - NoSQL -> Amazon DynamoDB

Preferred Skills:
Technology -> Cloud Platform -> AWS Database -> AWS
Technology -> Big Data - Data Processing -> Spark -> SparkSQL
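The primary skill pairing above, Spark with SparkSQL, centres on expressing transformations as SQL over large datasets. As a hedged, self-contained sketch, the example below runs the same kind of query through Python's built-in `sqlite3` as a stand-in engine (a real engagement would submit it via `spark.sql(...)` on a SparkSession, which is assumed here rather than shown; table and column names are invented):

```python
import sqlite3

# Stand-in engine: sqlite3 instead of a SparkSession, so the sketch runs
# anywhere. The SQL statement itself is the portable part.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE events (user_id TEXT, event_type TEXT, amount REAL)")
con.executemany("INSERT INTO events VALUES (?, ?, ?)", [
    ("u1", "purchase", 20.0),
    ("u1", "refund", -5.0),
    ("u2", "purchase", 12.5),
])

# Typical SparkSQL-style aggregation: net amount per user.
query = """
    SELECT user_id, SUM(amount) AS net_amount
    FROM events
    GROUP BY user_id
    ORDER BY user_id
"""
result = con.execute(query).fetchall()
print(result)  # [('u1', 15.0), ('u2', 12.5)]
```

On Spark the same statement would execute distributed across partitions; the SQL skill being assessed is identical.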

Posted 6 days ago

Apply

8.0 - 13.0 years

30 - 35 Lacs

Bengaluru

Work from Office

About The Role: Data Engineer - 1 (Experience: 0-2 years)

What we offer: Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by taking a technology-first approach in everything we do, with the aim of enhancing customer experience through superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team: DEX (Kotak's Data Exchange) is the central data org for Kotak Bank and manages the bank's entire data experience. The org comprises the Data Platform, Data Engineering, and Data Governance charters, and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premises solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists great opportunities to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills this team should encompass are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, SparkSQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based in Bangalore, comprising ~10 sub-teams independently driving their charters.

As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; to be an early member of Kotak's digital transformation journey; to learn and leverage technology to build complex data platform solutions (real-time, micro-batch, batch, and analytics) in a programmatic way; and to look ahead to systems that can be operated by machines using AI technologies. The data platform org is divided into three key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank; a centralized data lake; managed compute and orchestration frameworks, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak's data platform. The team will also be the centre of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial orgs. It will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, and Branch Managers, and by all analytics use cases.

Data Governance: This will be the central data governance team for Kotak Bank, managing the metadata platforms and the Data Privacy, Data Security, Data Stewardship, and Data Quality platforms.

If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency, multi-system environments, then this is the team for you. Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data:
- Bachelor's degree in Computer Science, Engineering, or a related field
- Experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

PREFERRED QUALIFICATIONS:
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech is desired
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices for the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills

Posted 1 week ago

Apply

2.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

The ideal candidate for this position will have solid-line reporting relationships with development staff members and will be responsible for the work, products, and status communications of that team. As the team lead, you will provide technical support on new project initiatives and design, develop, and enhance existing systems.

To qualify for this role, you should possess a Bachelor's degree in Math, EE, CS, or Software Engineering, with a Master's degree preferred. You should have at least 8 years of hands-on experience in the development of commercial-grade software products, including 2+ years leading either the full development lifecycle or a technical team. Additionally, you should have 5+ years of proven experience developing and supporting applications using .NET Core, C#/Java/Python, and Angular, as well as 3+ years of experience with AWS technologies. It is essential that you have a strong understanding of AWS services and infrastructure and of deploying and managing applications on AWS. You should also possess a deep understanding of Agile processes and best practices, as well as strong knowledge of the latest technologies and trends. Your ability to develop use cases from business requirements and collaborate effectively with project stakeholders outside of the Development group, especially with Product Management on feature requirements and project schedules, will be crucial in this role. Furthermore, you must be able to provide effective technical leadership and oversight to the development team on a project, ensuring that software is developed in adherence to established architecture, design, quality standards, and delivery schedule.

Verisk has been the leading data analytics and technology partner to the global insurance industry for over 50 years, delivering value to clients through expertise and scale. As an employer, Verisk offers a unique and rewarding career opportunity with work flexibility, support, coaching, and training to help you succeed. Verisk is proud to be recognized as a Great Place to Work for outstanding workplace culture in multiple countries, emphasizing inclusivity, diversity, learning, caring, and results. Join Verisk's 7,000-strong workforce and contribute to the pursuit of innovation ethically. Help translate big data into big ideas and create an exceptional experience for yourself while shaping a better tomorrow for future generations. Verisk's business units include Underwriting Solutions, Claims Solutions, Property Estimating Solutions, Extreme Event Solutions, Specialty Business Solutions, Marketing Solutions, Life Insurance Solutions, and Verisk Maplecroft, all aimed at providing cutting-edge solutions to various sectors. Verisk Analytics is committed to being an equal opportunity employer. For more information and to explore career opportunities, visit the Verisk Careers page at https://www.verisk.com/company/careers/.

Posted 1 week ago

Apply

5.0 - 10.0 years

3 - 5 Lacs

Bengaluru

Work from Office

About The Role

What we offer: Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by taking a technology-first approach in everything we do, with the aim of enhancing customer experience through superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team: DEX (Kotak's Data Exchange) is the central data org for Kotak Bank and manages the bank's entire data experience. The org comprises the Data Platform, Data Engineering, and Data Governance charters, and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premises solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists great opportunities to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills this team should encompass are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, SparkSQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based in Bangalore, comprising ~10 sub-teams independently driving their charters.

As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; to be an early member of Kotak's digital transformation journey; to learn and leverage technology to build complex data platform solutions (real-time, micro-batch, batch, and analytics) in a programmatic way; and to look ahead to systems that can be operated by machines using AI technologies. The data platform org is divided into three key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank; a centralized data lake; managed compute and orchestration frameworks, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak's data platform. The team will also be the centre of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial orgs. It will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, and Branch Managers, and by all analytics use cases.

Data Governance: This will be the central data governance team for Kotak Bank, managing the metadata platforms and the Data Privacy, Data Security, Data Stewardship, and Data Quality platforms.

If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency, multi-system environments, then this is the team for you. Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data:
- Bachelor's degree in Computer Science, Engineering, or a related field
- 3-5 years of experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager:
- 10+ years of engineering experience, most of it in the data domain
- 5+ years of engineering team management experience
- 10+ years of experience planning, designing, developing, and delivering consumer software
- Experience partnering with product or program management teams
- 5+ years of experience managing data engineers, business intelligence engineers, and/or data scientists
- Experience designing or architecting (design patterns, reliability, and scaling) new and existing systems
- Experience managing multiple concurrent programs, projects, and development teams in an Agile environment
- Strong understanding of Data Platform, Data Engineering, and Data Governance
- Experience designing and developing large-scale, high-traffic applications

PREFERRED QUALIFICATIONS:
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech is desired
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices for the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills
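The orchestration side of this posting (MWAA is AWS's managed Apache Airflow) models a pipeline as a DAG of tasks with upstream dependencies. A minimal sketch of that idea, using only the standard library's `graphlib`; the task names are invented for illustration and this is not Airflow's API:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline tasks mapped to their upstream dependencies, in
# the spirit of an Airflow DAG: extract must finish before validate, etc.
dag = {
    "extract": set(),
    "validate": {"extract"},
    "transform": {"validate"},
    "load": {"transform"},
    "report": {"load"},
}

# An orchestrator runs tasks in an order that respects every dependency;
# static_order() yields one such order (and raises on cycles).
order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'validate', 'transform', 'load', 'report']
```

Airflow adds scheduling, retries, and distributed execution on top; the dependency-ordering core is what this sketch shows.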

Posted 1 week ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Bengaluru

Work from Office

About The Role What we offer Our mission is simple Building trust. Our customer's trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That"™s why, we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to come join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many. About our team DEX is a central data org for Kotak Bank which manages entire data experience of Kotak Bank. DEX stands for Kotak"™s Data Exchange. This org comprises of Data Platform, Data Engineering and Data Governance charter. The org sits closely with Analytics org. DEX is primarily working on greenfield project to revamp entire data platform which is on premise solutions to scalable AWS cloud-based platform. The team is being built ground up which provides great opportunities to technology fellows to build things from scratch and build one of the best-in-class data lake house solutions. The primary skills this team should encompass are Software development skills preferably Python for platform building on AWS; Data engineering Spark (pyspark, sparksql, scala) for ETL development, Advanced SQL and Data modelling for Analytics. The org size is expected to be around 100+ member team primarily based out of Bangalore comprising of ~10 sub teams independently driving their charter. 
As a member of this team, you get opportunity to learn fintech space which is most sought-after domain in current world, be a early member in digital transformation journey of Kotak, learn and leverage technology to build complex data data platform solutions including, real time, micro batch, batch and analytics solutions in a programmatic way and also be futuristic to build systems which can be operated by machines using AI technologies. The data platform org is divided into 3 key verticals Data Platform This Vertical is responsible for building data platform which includes optimized storage for entire bank and building centralized data lake, managed compute and orchestrations framework including concepts of serverless data solutions, managing central data warehouse for extremely high concurrency use cases, building connectors for different sources, building customer feature repository, build cost optimization solutions like EMR optimizers, perform automations and build observability capabilities for Kotak"™s data platform. The team will also be center for Data Engineering excellence driving trainings and knowledge sharing sessions with large data consumer base within Kotak. Data Engineering This team will own data pipelines for thousands of datasets, be skilled to source data from 100+ source systems and enable data consumptions for 30+ data analytics products. The team will learn and built data models in a config based and programmatic and think big to build one of the most leveraged data model for financial orgs. This team will also enable centralized reporting for Kotak Bank which cuts across multiple products and dimensions. Additionally, the data build by this team will be consumed by 20K + branch consumers, RMs, Branch Managers and all analytics usecases. Data Governance The team will be central data governance team for Kotak bank managing Metadata platforms, Data Privacy, Data Security, Data Stewardship and Data Quality platform. 
If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you. Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data
- Bachelor's degree in Computer Science, Engineering, or a related field
- 3-5 years of experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager
- 10+ years of engineering experience, most of which is in the Data domain
- 5+ years of engineering team management experience
- 10+ years of planning, designing, developing, and delivering consumer software experience
- Experience partnering with product or program management teams
- 5+ years of experience managing data engineers, business intelligence engineers, and/or data scientists
- Experience designing or architecting (design patterns, reliability and scaling) new and existing systems
- Experience managing multiple concurrent programs, projects, and development teams in an Agile environment
- Strong understanding of Data Platform, Data Engineering, and Data Governance
- Experience designing and developing large-scale, high-traffic applications

PREFERRED QUALIFICATIONS
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian Banking segment and/or Fintech is desired
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices for the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills
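The extract-transform-load loop named in the responsibilities above can be sketched in plain Python. This is an illustrative stand-in only: it uses `csv` and an in-memory SQLite table in place of the posting's actual S3/Glue/Redshift stack, and all table and column names are hypothetical.

```python
import csv
import io
import sqlite3

def etl(csv_text: str, conn: sqlite3.Connection) -> int:
    """Extract rows from CSV text, transform them, and load into SQLite."""
    conn.execute("CREATE TABLE IF NOT EXISTS txns (account TEXT, amount_paise INTEGER)")
    loaded = 0
    for row in csv.DictReader(io.StringIO(csv_text)):
        # Transform: store rupee amounts as integer paise to avoid float drift.
        paise = round(float(row["amount_rupees"]) * 100)
        conn.execute("INSERT INTO txns VALUES (?, ?)", (row["account"], paise))
        loaded += 1
    conn.commit()
    return loaded

conn = sqlite3.connect(":memory:")
n = etl("account,amount_rupees\nA1,10.50\nA2,3.25\n", conn)
total = conn.execute("SELECT SUM(amount_paise) FROM txns").fetchone()[0]
print(n, total)  # → 2 1375
```

The same extract/transform/load shape carries over when the source is an S3 object and the sink is a warehouse table; only the connectors change.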

Posted 1 week ago

Apply

2.0 - 5.0 years

4 - 8 Lacs

bengaluru

Work from Office

About The Role Data Engineer -1 (Experience 0-2 years)

What we offer: Our mission is simple: Building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team: DEX is the central data org for Kotak Bank and manages the entire data experience of Kotak Bank. DEX stands for Kotak's Data Exchange. The org comprises the Data Platform, Data Engineering, and Data Governance charters, and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities for technology fellows to build things from scratch and build one of the best-in-class data lakehouse solutions. The primary skills this team should encompass are software development skills (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to be a 100+ member team primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters.
As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today, be an early member in Kotak's digital transformation journey, learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions in a programmatic way, and be futuristic in building systems that can be operated by machines using AI technologies. The data platform org is divided into 3 key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and building a centralized data lake, a managed compute and orchestration framework including concepts of serverless data solutions, managing a central data warehouse for extremely high-concurrency use cases, building connectors for different sources, building a customer feature repository, building cost-optimization solutions like EMR optimizers, performing automations, and building observability capabilities for Kotak's data platform. The team will also be the center of Data Engineering excellence, driving training and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way, and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, Branch Managers, and all analytics use cases.

Data Governance: This team will be the central data governance team for Kotak Bank, managing metadata platforms, Data Privacy, Data Security, Data Stewardship, and the Data Quality platform.
If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you. Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data
- Bachelor's degree in Computer Science, Engineering, or a related field
- Experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

PREFERRED QUALIFICATIONS
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian Banking segment and/or Fintech is desired
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices for the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills
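The data-quality best practices this posting asks for often reduce to named, reusable validation rules applied per record. A minimal sketch in Python; the rule names and columns here are hypothetical, not the team's actual checks:

```python
def run_checks(rows, checks):
    """Apply named predicate checks to each row; return failing row indices per check."""
    failures = {name: [] for name in checks}
    for i, row in enumerate(rows):
        for name, predicate in checks.items():
            if not predicate(row):
                failures[name].append(i)
    return failures

rows = [
    {"account": "A1", "amount": 100},
    {"account": "", "amount": -5},  # fails both checks below
]
checks = {
    "account_not_empty": lambda r: bool(r["account"]),
    "amount_non_negative": lambda r: r["amount"] >= 0,
}
failures = run_checks(rows, checks)
print(failures)  # → {'account_not_empty': [1], 'amount_non_negative': [1]}
```

Keeping checks as data rather than inline code is what lets a data-quality platform report, alert, and quarantine per rule.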

Posted 1 week ago

Apply

8.0 - 13.0 years

30 - 32 Lacs

bengaluru

Work from Office

About The Role Data Engineer -2 (Experience 2-5 years)

What we offer: Our mission is simple: Building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team: DEX is the central data org for Kotak Bank and manages the entire data experience of Kotak Bank. DEX stands for Kotak's Data Exchange. The org comprises the Data Platform, Data Engineering, and Data Governance charters, and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities for technology fellows to build things from scratch and build one of the best-in-class data lakehouse solutions. The primary skills this team should encompass are software development skills (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to be a 100+ member team primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters.
As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today, be an early member in Kotak's digital transformation journey, learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions in a programmatic way, and be futuristic in building systems that can be operated by machines using AI technologies. The data platform org is divided into 3 key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and building a centralized data lake, a managed compute and orchestration framework including concepts of serverless data solutions, managing a central data warehouse for extremely high-concurrency use cases, building connectors for different sources, building a customer feature repository, building cost-optimization solutions like EMR optimizers, performing automations, and building observability capabilities for Kotak's data platform. The team will also be the center of Data Engineering excellence, driving training and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way, and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, Branch Managers, and all analytics use cases.

Data Governance: This team will be the central data governance team for Kotak Bank, managing metadata platforms, Data Privacy, Data Security, Data Stewardship, and the Data Quality platform.
If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you. Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data
- Bachelor's degree in Computer Science, Engineering, or a related field
- Experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

PREFERRED QUALIFICATIONS
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian Banking segment and/or Fintech is desired
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices for the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills
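The "config-based and programmatic" data modelling the team description mentions usually means pipelines declared as data and executed by a generic runner. A toy sketch under that assumption; the operation names and fields are invented for illustration:

```python
# Registry of reusable transforms; a pipeline is just a list of config steps.
TRANSFORMS = {
    "filter_ge": lambda rows, field, value: [r for r in rows if r[field] >= value],
    "project": lambda rows, fields: [{f: r[f] for f in fields} for r in rows],
}

def run_pipeline(rows, config):
    """Execute a list of {'op': ..., 'args': ...} steps against the rows."""
    for step in config:
        rows = TRANSFORMS[step["op"]](rows, **step["args"])
    return rows

rows = [{"id": 1, "score": 40}, {"id": 2, "score": 90}]
config = [
    {"op": "filter_ge", "args": {"field": "score", "value": 50}},
    {"op": "project", "args": {"fields": ["id"]}},
]
out = run_pipeline(rows, config)
print(out)  # → [{'id': 2}]
```

Because the pipeline is plain data, the same runner can execute thousands of dataset definitions checked in as config rather than bespoke code.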

Posted 1 week ago

Apply

6.0 - 11.0 years

0 - 0 Lacs

chennai, bengaluru

Hybrid

Role & responsibilities: Minimum of 6 years of software development experience in a professional environment and/or comparable experience, such as:
- Familiarity with Agile or other rapid application development methods
- Experience with design and coding in Java across one or more platforms, and additional languages as appropriate
- Experience with Big Data processing and batch/streaming technologies such as Apache Spark, Kafka, Flink, and Beam, with Scala as a programming language, preferred
- Experience with RxJava and functional programming preferred
- Experience in full-stack development using Java with React and Node preferred
- Experience with AWS or GCP cloud preferred
- Experience with AWS technologies like EMR, MSK (Kafka), EKS (Kubernetes), DynamoDB, and Aurora DB preferred
- Experience with GCP technologies like Dataflow (Apache Beam pipelines), BigQuery, Bigtable, Pub/Sub, and Dataproc preferred
- Backend experience including Apache Cassandra, and relational databases such as Oracle and PostgreSQL, a plus
- Hands-on expertise with application design, software development, and automated testing
- Experience with distributed (multi-tiered) systems, algorithms, and relational and NoSQL databases
- Confirmed experience with object-oriented design and coding in a variety of languages
- Experience managing and delivering applications and services using a cloud computing model across Public, Private, and Hybrid Cloud environments
- Bachelor's degree in computer science, computer science engineering, or related experience required; advanced degree preferred
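The batch/streaming technologies this role lists (Spark, Kafka, Flink, Beam) all share one core idea: grouping an unbounded event stream into fixed windows before aggregating. A framework-free sketch of a tumbling-window count, written in Python for brevity; the event shapes and window size are hypothetical:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs):
    """Group (timestamp, key) events into fixed windows and count per key."""
    counts = defaultdict(int)
    for ts, key in events:
        # Each event belongs to exactly one non-overlapping (tumbling) window.
        window_start = (ts // window_secs) * window_secs
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(0, "click"), (3, "click"), (5, "view"), (12, "click")]
out = tumbling_window_counts(events, 10)
print(out)  # → {(0, 'click'): 2, (0, 'view'): 1, (10, 'click'): 1}
```

Engines like Flink and Beam add what this sketch omits: distribution, fault tolerance, and watermark handling for late-arriving events.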

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

delhi

On-site

As a Senior API Engineer in the telecommunications industry, located in Okhla, Delhi, you will develop and enhance cutting-edge application integrations within an agile delivery team. You will own the full lifecycle of API development, from design and implementation to deployment and maintenance, ensuring robust and secure integrations across diverse systems.

Your major responsibilities will include:
- End-to-End API Integration: own the full lifecycle of API development
- API Design & Documentation: create detailed API specifications and design documents
- API Security & Integration: build secure, scalable integration APIs
- CI/CD Implementation: set up and maintain continuous integration and continuous deployment pipelines
- System Gap Analysis: identify gaps between current systems and desired end-state solutions
- Software Development Lifecycle: design, develop, and enhance software solutions
- Project Ownership & Mentorship: take ownership of moderately complex projects and mentor junior engineers

You should have 5+ years of hands-on experience in API development, with expertise in Java, Spring Boot, and API management platforms such as Apigee. Extensive experience with AWS technologies is required, along with strong familiarity with authentication/authorization protocols, build tools, unit testing frameworks, service orchestration, messaging technologies, and microservices architectures. Exposure to SOAP and traditional web services is a plus, and experience with API gateway features is preferred. Experience in the telecommunications industry, specifically with provisioning APIs or TMF APIs, would be considered a strong advantage.
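One of the authentication/authorization basics this role touches is bearer-token validation at the API edge. A deliberately minimal sketch in Python (the posting's stack is Java/Spring Boot and Apigee; this only illustrates the check itself, and the token value is invented):

```python
import hmac

VALID_TOKENS = {"s3cr3t-token"}  # hypothetical; real systems verify signed JWTs instead

def authorize(headers: dict) -> bool:
    """Accept the request only with a well-formed, recognized bearer token."""
    auth = headers.get("Authorization", "")
    if not auth.startswith("Bearer "):
        return False
    token = auth[len("Bearer "):]
    # Constant-time comparison avoids leaking token bytes via timing differences.
    return any(hmac.compare_digest(token, t) for t in VALID_TOKENS)

print(authorize({"Authorization": "Bearer s3cr3t-token"}))  # → True
print(authorize({"Authorization": "Basic abc"}))            # → False
```

In practice a gateway such as Apigee performs this step (plus quota, key rotation, and OAuth flows) before traffic ever reaches the backend service.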

Posted 1 week ago

Apply

3.0 - 5.0 years

0 Lacs

india

On-site

DESCRIPTION Retail Business Services (RBS) supports Amazon's Retail business growth WW through three core tasks: (a) Selection, where RBS sources, creates, and enriches ASINs to drive GMS growth; (b) Defect Elimination, where RBS resolves inbound supply chain defects and develops root-cause fixes to improve free cash flow; and (c) supporting operational processes for WW Retail teams where there is an air gap in the tech stack. The tech team in RBS develops automation that leverages Machine/Deep Learning to scale execution of these highly complex tasks that currently require human cognitive skills. Our solutions ensure that information in Amazon's catalog is complete, correct, and comprehensive enough to give Amazon customers a great shopping experience every time. We are building solutions to move towards retail automation and are using the latest technologies, like LLMs, to solve business problems. We believe in the "Work Hard. Have Fun. Make History" value by having a strong focus on sharing learning experiences from the front line with the development teams. So, the options for people in the team are vast. If you like mastering a domain and going deep, we need you. If you can juggle three tasks and coordinate with multiple people in the heat of an incident, we need you. If you love the benefits of process and methodical improvement, you will love it here. If you want to keep your head down, headphones on, and bash out code to support the team, we have a spot for you too. At Amazon, we hire the best minds in technology to innovate and build on behalf of our customers. The focus we have on our customers is why we are one of the world's most beloved brands - customer obsession is part of our company DNA. We are seeking a Software Development Engineer who can design and develop systems for its businesses. 
As an SDE, you would have immense opportunity to explore avenues of how to model things, figure out the right abstraction level, and take judgement calls around the multiple design trade-offs at hand that influence a multi-billion-dollar business. We are looking for highly talented software development engineers who are passionate not only about architecting and developing large-scale distributed technology solutions but also about innovating new ideas and providing direction to our business. You will get to work on some of the key initiatives planned to support our rapid evolution and the growth of our businesses. In this process you will drive best practices, mentor other engineers, and drive continuous improvements in engineering and operational excellence. You will use cutting-edge technology to solve complex problems and get to see the impact of your work first-hand. The challenges SDEs solve at Amazon are big and influence millions of customers, sellers, and products around the world. As an SDE with the RBS team, you will work with talented SDE and Applied Scientist peers to build robust engineering platforms that solve high-impact issues. You will constantly stretch the boundaries of innovation to tackle business challenges. If you enjoy designing and building highly distributed systems that can scale, and solving challenging problems, come join us! Work/Life Balance: The RBS Tech team puts a high value on work-life harmony. It isn't about how many hours you spend at home or at work; it's about the flow you establish that brings energy to both parts of your life. We believe striking the right balance between your personal and professional life is critical to life-long happiness, and we encourage you to find your own balance between your work and personal lives. Mentorship & Career Growth: Our team is dedicated to supporting new members. We have a broad mix of experience levels and tenures, and we're building an environment that celebrates knowledge sharing and mentorship. 
Our senior members enjoy one-on-one mentoring and detailed, constructive code reviews. We have casual coffee chats with Principal & Senior Engineers from RBS tech. You get an opportunity to network and have technical conversations around your work, technical challenges, suggestions, ideas, and proposals. You can also seek advice and discuss things outside work - life in general, your family, hobbies, etc. We provide training to employees through online learning platforms such as O'Reilly and also encourage them to take up AWS/ML certifications. Key job responsibilities: We are looking for a sharp, experienced software developer with a diverse skillset and background. As an SDE-2, your work is consistently of high quality. You solve difficult problems, applying appropriate technologies and best practices. You work with your team to invent, design, and build software that is stable and performant. You are proficient in a broad range of design approaches and know when it is appropriate to use them (and when it is not). You limit the use of short-term workarounds. You create flexible software without over-engineering. You make appropriate trade-offs, re-use where possible, and are judicious about introducing dependencies. You are efficient with resource usage (e.g., system hardware, database, memory/CPU, etc.). You work on project ideas with customers, stakeholders, and peers. You help your team evolve by actively participating in the code review process, design discussions, team planning, and ticket/metric reviews. You focus on operational excellence, constructively identifying problems and proposing solutions. You take on projects and make software enhancements that improve team software and processes. You are able to train new teammates on how your team's software is constructed, how it operates, how secure it is, and how it fits into the bigger picture. You foster a constructive dialogue and seek resolutions in a professional way. 
You help recruit and interview for your team. You mentor and help to develop others. BASIC QUALIFICATIONS - 3+ years of non-internship professional software development experience - 2+ years of non-internship design or architecture (design patterns, reliability and scaling) of new and existing systems experience - Experience programming with at least one software programming language - 1+ years of experience contributing to the architecture and design (architecture, design patterns, reliability and scaling) of new and current systems - Bachelor's degree in Computer Science, Software Engineering, or a related technical discipline - Computer Science fundamentals in object-oriented design, data structures, algorithm design, problem solving, and complexity analysis - A strong track record of project delivery for large, cross-functional projects - Experience building complex software systems that have been successfully delivered to customers - Experience building high-performance, highly available, and scalable distributed systems - A willingness to dive deep, experiment rapidly, and get things done PREFERRED QUALIFICATIONS - 3+ years of full software development life cycle experience, including coding standards, code reviews, source control management, build processes, testing, and operations - Bachelor's degree in computer science or equivalent - Deep knowledge of distributed SOA architecture, relational DBs, ElasticSearch, DynamoDB, and various AWS technologies - Exposure to Machine Learning/Deep Learning projects - Experience successfully mentoring junior SDEs - High attention to detail and proven ability to manage multiple, competing priorities simultaneously - Ability to work in a fast-paced environment where continuous innovation is desired - History of teamwork and willingness to roll up one's sleeves to get the job done. Our inclusive culture empowers Amazonians to deliver the best results for our customers. 
If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Posted 1 week ago

Apply

3.0 - 5.0 years

0 Lacs

india

On-site

DESCRIPTION Amazon World Wide Grocery Store Tech is seeking an experienced and proven Software Development Engineer (SDE) to lead medium to large cross-functional strategic initiatives that support WWGST strategic goals. This is a unique opportunity for someone who is inspired by WWGST's core values and interested in Amazon's high-growth grocery business to bring their whole self to the role. Key job responsibilities: As an SDE, you are responsible for setting a high bar throughout the software development and deployment lifecycle, including design, development, documentation, testing, and operations. The ideal candidate will have a strong background in software and application development, AWS technologies, business judgment, curiosity, and superior written and verbal communication skills. They will work closely with the business and technical teams to analyze many non-standard and unique business problems and use creative problem solving to deliver results. They will be a self-starter, comfortable with ambiguity, able to think big and be creative while paying careful attention to detail, and will enjoy working in a fast-paced and dynamic environment. 
BASIC QUALIFICATIONS - 3+ years of non-internship professional software development experience - 2+ years of non-internship design or architecture (design patterns, reliability and scaling) of new and existing systems experience - Experience building complex software systems that have been successfully delivered to customers - Experience programming with at least one software programming language - Experience contributing to the architecture and design (architecture, design patterns, reliability and scaling) of new and current systems - Experience with full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations PREFERRED QUALIFICATIONS - 3+ years of full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations experience - Bachelor's degree in computer science or equivalent Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Posted 1 week ago

Apply

5.0 - 10.0 years

30 - 35 Lacs

bengaluru

Work from Office

About The Role

What we offer: Our mission is simple: Building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team: DEX is the central data org for Kotak Bank and manages the entire data experience of Kotak Bank. DEX stands for Kotak's Data Exchange. The org comprises the Data Platform, Data Engineering, and Data Governance charters, and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities for technology fellows to build things from scratch and build one of the best-in-class data lakehouse solutions. The primary skills this team should encompass are software development skills (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to be a 100+ member team primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters.
As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today, be an early member in Kotak's digital transformation journey, learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions in a programmatic way, and be futuristic in building systems that can be operated by machines using AI technologies. The data platform org is divided into 3 key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and building a centralized data lake, a managed compute and orchestration framework including concepts of serverless data solutions, managing a central data warehouse for extremely high-concurrency use cases, building connectors for different sources, building a customer feature repository, building cost-optimization solutions like EMR optimizers, performing automations, and building observability capabilities for Kotak's data platform. The team will also be the center of Data Engineering excellence, driving training and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way, and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, Branch Managers, and all analytics use cases.

Data Governance: This team will be the central data governance team for Kotak Bank, managing metadata platforms, Data Privacy, Data Security, Data Stewardship, and the Data Quality platform.
If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you. Your day-to-day role will include: Drive business decisions with technical input and lead the team. Design, implement, and support a data infrastructure from scratch. Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA. Extract, transform, and load data from various sources using SQL and AWS big data technologies. Explore and learn the latest AWS technologies to enhance capabilities and efficiency. Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis. Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers. Build data platforms, data pipelines, or data management and governance tools. BASIC QUALIFICATIONS for Data Engineer / SDE in Data: Bachelor's degree in Computer Science, Engineering, or a related field. 3-5 years of experience in data engineering. Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR. Experience with data pipeline tools such as Airflow and Spark. Experience with data modeling and data quality best practices. Excellent problem-solving and analytical skills. Strong communication and teamwork skills. Experience in at least one modern scripting or programming language, such as Python, Java, or Scala. Strong advanced SQL skills. BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager: 10+ years of engineering experience, most of it in the data domain. 5+ years of engineering team management experience. 10+ years of planning, designing, developing, and delivering consumer software experience. Experience partnering with product or program management teams. 5+ years of experience managing data engineers, business intelligence engineers, and/or data scientists. Experience designing or architecting (design patterns, reliability, and scaling) new and existing systems. Experience managing multiple concurrent programs, projects, and development teams in an Agile environment. Strong understanding of Data Platform, Data Engineering, and Data Governance. Experience designing and developing large-scale, high-traffic applications. PREFERRED QUALIFICATIONS: AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow. Prior experience in the Indian banking segment and/or fintech is desired. Experience with non-relational databases and data stores. Building and operating highly available, distributed data processing systems for large datasets. Professional software engineering and best practices for the full software development life cycle. Designing, developing, and implementing different types of data warehousing layers. Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions. Building scalable data infrastructure and understanding distributed systems concepts. SQL, ETL, and data modelling. Ensuring the accuracy and availability of data to customers. Proficiency in at least one scripting or programming language for handling large-volume data processing. Strong presentation and communication skills.
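The config-based, programmatic data-model approach described above can be sketched in miniature. All names below are hypothetical and invented for illustration, not Kotak's actual framework: the idea is a registry of named transforms that a pipeline applies in the order a config lists them.

```python
# Toy sketch of a config-driven pipeline step. Transform names, the config
# shape, and the sample data are all illustrative assumptions.
from typing import Callable, Dict, List

# Registry of reusable transform steps, looked up by name from config.
TRANSFORMS: Dict[str, Callable[[dict], dict]] = {
    "mask_pan": lambda row: {**row, "pan": "XXXX" + row["pan"][-4:]},
    "to_upper": lambda row: {**row, "branch": row["branch"].upper()},
}

def run_pipeline(rows: List[dict], config: List[str]) -> List[dict]:
    """Apply the transforms named in `config`, in order, to every row."""
    for step in config:
        rows = [TRANSFORMS[step](r) for r in rows]
    return rows

rows = [{"pan": "ABCDE1234F", "branch": "mumbai"}]
out = run_pipeline(rows, ["mask_pan", "to_upper"])
print(out)  # [{'pan': 'XXXX234F', 'branch': 'MUMBAI'}]
```

In a production setting the config would live outside the code (e.g. in version-controlled YAML), which is what makes the data models "config-based" rather than hand-written per dataset.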

Posted 1 week ago

Apply

9.0 - 14.0 years

30 - 35 Lacs

bengaluru

Work from Office

About The Role Data Engineer -2 (Experience 2-5 years) What we offer Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to come join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many. About our team DEX is the central data org for Kotak Bank, managing the entire data experience of the bank. DEX stands for Kotak's Data Exchange. The org comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities for technology fellows to build things from scratch and deliver one of the best-in-class data lakehouse solutions. The primary skills this team should encompass are software development skills (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org size is expected to be 100+ members, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters.
As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today, be an early member in the digital transformation journey of Kotak, learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way, and also be futuristic in building systems that can be operated by machines using AI technologies. The data platform org is divided into 3 key verticals: Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; managed compute and orchestration frameworks, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions like EMR optimizers; automations; and observability capabilities for Kotak's data platform. The team will also be the center for Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak. Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, cutting across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, Branch Managers, and all analytics use cases. Data Governance: This team will be the central data governance team for Kotak Bank, managing metadata platforms, Data Privacy, Data Security, Data Stewardship, and the Data Quality platform.
If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you. Your day-to-day role will include: Drive business decisions with technical input and lead the team. Design, implement, and support a data infrastructure from scratch. Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA. Extract, transform, and load data from various sources using SQL and AWS big data technologies. Explore and learn the latest AWS technologies to enhance capabilities and efficiency. Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis. Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers. Build data platforms, data pipelines, or data management and governance tools. BASIC QUALIFICATIONS for Data Engineer / SDE in Data: Bachelor's degree in Computer Science, Engineering, or a related field. Experience in data engineering. Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR. Experience with data pipeline tools such as Airflow and Spark. Experience with data modeling and data quality best practices. Excellent problem-solving and analytical skills. Strong communication and teamwork skills. Experience in at least one modern scripting or programming language, such as Python, Java, or Scala. Strong advanced SQL skills. PREFERRED QUALIFICATIONS: AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow. Prior experience in the Indian banking segment and/or fintech is desired.
Experience with non-relational databases and data stores. Building and operating highly available, distributed data processing systems for large datasets. Professional software engineering and best practices for the full software development life cycle. Designing, developing, and implementing different types of data warehousing layers. Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions. Building scalable data infrastructure and understanding distributed systems concepts. SQL, ETL, and data modelling. Ensuring the accuracy and availability of data to customers. Proficiency in at least one scripting or programming language for handling large-volume data processing. Strong presentation and communication skills.
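As a toy illustration of the SQL-driven transform work this role describes, the sketch below uses SQLite from the Python standard library as a stand-in for warehouse tables like Redshift. The table, columns, and sample data are invented for the example.

```python
# Minimal extract-transform sketch: load raw rows, then aggregate per-branch
# totals with SQL, the kind of rollup that feeds centralized reporting.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txns (branch TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO txns VALUES (?, ?)",
    [("pune", 120.0), ("pune", 80.0), ("mumbai", 50.0)],
)

rows = conn.execute(
    "SELECT branch, SUM(amount) FROM txns GROUP BY branch ORDER BY branch"
).fetchall()
print(rows)  # [('mumbai', 50.0), ('pune', 200.0)]
```

At scale the same SQL shape would run on Redshift or Spark SQL over data landed in S3; only the engine changes, not the transform logic.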

Posted 1 week ago

Apply

3.0 - 8.0 years

19 - 25 Lacs

bengaluru

Work from Office

Your Role Design and implement generative AI solutions using AWS services such as Amazon Bedrock, SageMaker, Lambda, and Step Functions. Fine-tune and deploy large language models (LLMs) for specific business use cases. Develop and optimize prompt engineering strategies for various foundation models (e.g., Anthropic Claude, Mistral, Meta Llama, Amazon Titan). Build scalable, secure, and cost-effective ML pipelines and APIs. Collaborate with data scientists, ML engineers, and product teams to integrate AI capabilities into applications. Monitor and evaluate model performance, ensuring fairness, accuracy, and compliance. Stay updated with the latest trends in generative AI and AWS technologies. Your Profile 3+ years of experience in machine learning or AI engineering. Strong hands-on experience with the AWS AI/ML stack (SageMaker, Bedrock, Comprehend, etc.). Proficiency in Python, PyTorch or TensorFlow, and API development. Experience with LLMs, transformers, and embedding models. Familiarity with prompt engineering and RAG (Retrieval-Augmented Generation) architectures. Strong understanding of cloud security, scalability, and DevOps practices. What you'll love about working here You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group. You will also get personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, or new parent support via flexible work. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.
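Since the profile calls out RAG (Retrieval-Augmented Generation) architectures, here is a deliberately tiny sketch of the pattern: retrieve the most relevant document for a query, then splice it into the prompt sent to a foundation model. A real system would use embedding models and a vector store (for example via Amazon Bedrock); the word-overlap scoring and the documents below are purely illustrative.

```python
# Toy RAG retrieval: score documents by word overlap with the query,
# then build a context-grounded prompt. Not production retrieval.
DOCS = [
    "Amazon Bedrock offers foundation models behind a single API.",
    "SageMaker supports training and deploying custom ML models.",
]

def retrieve(query: str, docs: list) -> str:
    """Return the doc sharing the most words with the query."""
    q = set(query.lower().split())
    return max(docs, key=lambda d: len(q & set(d.lower().split())))

def build_prompt(query: str) -> str:
    context = retrieve(query, DOCS)
    return f"Answer using this context:\n{context}\nQuestion: {query}"

print(build_prompt("What does Amazon Bedrock offer?"))
```

Swapping the overlap score for cosine similarity over embeddings, and the list for a vector index, turns this toy into the architecture the role describes.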

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

As a senior software engineer at our company, you will have the opportunity to work with cutting-edge technologies to design and build high-performance and scalable products in the AWS Cloud environment. Your responsibilities will include developing solutions and components using JavaScript frameworks, Python, and various AWS technologies such as Redshift, Aurora, and Glue for enterprise-scale multi-user applications. To qualify for this role, you should have a Bachelor's degree in Math, EE, CS, or Software Engineering, with a Master's degree being preferred. You should also have at least 5 years of hands-on experience in developing commercial-grade software products and a minimum of 3 years of experience working with AWS technologies like AWS Lambda, AWS Glue, and S3. Additionally, you should possess a strong understanding of AWS services and infrastructure, as well as experience in deploying and managing applications on AWS. Your expertise should also include a minimum of 3 years of proven experience in developing applications using .NET Core, C#/Java, and Python, as well as experience in developing with Angular. Knowledge of Angular/React is considered a plus. It is essential that you have a solid understanding of Agile processes and best practices, as well as knowledge of the latest technologies and trends. In this role, you will be expected to collaborate effectively with stakeholders, particularly Product Management, on feature requirements and project schedules. Strong communication skills, both written and verbal, are necessary for successful interaction with team members. Your problem-solving skills, enthusiasm, creativity, and ability to learn quickly will be valuable assets in ensuring that software development adheres to established architecture, design, quality standards, and delivery schedules. 
Verisk, a leading data analytics and technology partner to the global insurance industry, is committed to creating a diverse and inclusive workplace where employees are valued for their unique contributions. As part of our team, you will have the opportunity to build a rewarding career supported by coaching, training, and work flexibility. We are dedicated to fostering a culture of learning, caring, and results, and we prioritize inclusivity and diversity in everything we do. Join us at Verisk and become a part of a dynamic team of 7,000 individuals who are passionate about pursuing innovation ethically. Together, we can transform big data into big ideas that will shape a better tomorrow for future generations. Don't miss this opportunity to create an exceptional experience for yourself while contributing to the success of our company. Verisk offers a range of businesses including Underwriting Solutions, Claims Solutions, Property Estimating Solutions, Extreme Event Solutions, Specialty Business Solutions, Marketing Solutions, Life Insurance Solutions, and Verisk Maplecroft. As an equal opportunity employer, Verisk Analytics is dedicated to providing a positive workplace culture where every employee can thrive. For more information about career opportunities at Verisk, please visit our website: https://www.verisk.com/company/careers/

Posted 1 week ago

Apply

10.0 - 12.0 years

0 Lacs

pune, maharashtra, india

On-site

KONE is a global leader in the elevator and escalator industry, making people's journeys safe, convenient, and reliable with smart and sustainable People Flow. In a world where cities are constantly evolving and more and more people choose to live in them, we at KONE stand with one clear aim: to shape the future of cities. We make urban life more vibrant and livable. And we do it by enabling safe, sustainable, and effortless people flow for all. We help cities leave a positive mark on the planet - for the next century and beyond. We shape the future of cities. The KONE Technology and Innovation unit (KTI) is where we combine the physical world - escalators and elevators - with smart and connected digital systems. We are changing and improving the way billions of people move within buildings every day. We are on a mission to shape the future of the industry with new technologies and sustainable innovations. KONE IT (part of KTI) is a team of expert professionals working along with business functions and area teams to develop new capabilities and enable new business opportunities. We are trusted partners of KONE business lines and functions to develop, transform, manage, and run their information technology solutions. Sustainability, a curious mindset, and innovation are at the core of everything we do, and this makes us an integral part of KONE's success. KONE IT is now looking for a leader to further develop our newly established AWS team as our AWS Full Stack Development Leader. The role holds a critical position, responsible for the performance and development of a team of 50+ Full Stack developers. KONE IT has made a strategic decision to use Full Stack and specifically AWS technologies, especially for those customer- and other end-user-centered solutions with which KONE wants to differentiate.
Therefore, the team members will work in multiple IT projects and multiple IT product teams, primarily within the Commercial & Operations (C&O) domain, developing the capabilities/products that are essential to KONE's new RISE strategy. This role encompasses leadership across KONE IT and leads the team by example and with great drive. The role is in Pune, India, and you will report to the Head of C&O IT and in dotted line to the Head of IT India. Additionally, you will serve as a member of the India Technology & Engineering Centre (ITEC) Leadership Team (LT), acting as the Pune site lead and representing local operations in strategic decisions. Lead and mentor teams of Full Stack professionals, fostering a high-performance culture and driving excellence in modern software engineering practice. Provide expert leadership in Full Stack and modern technology domains, with a strong emphasis on cloud-native and scalable enterprise solutions. Drive employee development and satisfaction through structured talent and competence management strategies. Demonstrate tangible impact through organizational transformation and sustained business outcomes, not limited to individual project or service-level achievements. Oversee team budgeting, ensuring effective resource allocation and cost optimization. Champion and drive AI-driven software development practices and DevOps methodologies, and lead successful project delivery practices in a dynamic, fast-paced environment. Coach the team to ensure that all deliverables align with, e.g., enterprise architecture and cybersecurity standards at a global level. Contribute to the ITEC LT's strategic direction by promoting a transformative, forward-thinking IT culture. Facilitate collaboration between regional stakeholders and the global IT leadership team through consistent governance and communication channels.
Proven leadership experience in a senior IT role, ideally with participation in a global or regional leadership team, and capability to represent the organization as a senior line manager and site lead. Deep technical expertise in Full Stack development, modern software technologies, and agile software development practices. Demonstrated and successful track record of leading, mentoring, and growing groups of teams specializing in Full Stack and modern technology stacks. Prior line management of medium to large organizations (50+ employees), with responsibility for operational and strategic outcomes. Master's degree in Information Technology, Computer Science, or another relevant field. Experience of developing and executing successful organizational development, transformation, or business strategies and plans. Proven ability to achieve tangible results and to develop organizations or businesses. Prior experience leading in an international matrix organization. Change management, financial, and project management skills. Good presentation skills. What are we looking for in an ideal candidate: Comes from a modern, forward-thinking tech environment, where innovation is valued more than formality (no need for suits and ties here). Ideally has spent time living or working in Europe or the US, bringing global perspective and cross-cultural adaptability. Demonstrates resilience and tenacity in navigating and overcoming organizational bureaucracy, driving meaningful change against structural inertia. Passionate about modern software development practices, with hands-on experience and a strong vision for integrating AI tools seamlessly into the daily workflows of developers. What do you get in return? KONE is building up a unique community, aiming to provide the most advanced People Flow solutions for its customers and lifting the customer experience to the top.
At KONE, we are focused on creating an innovative and collaborative working culture where we value the contribution of each individual and where we actively share ideas. Sustainability is an integral part of our culture and a daily practice. We are proud to offer a range of opportunities that will support you in achieving your career and personal goals and enable you to live a healthy and balanced life. We believe in improving performance through inspiring, engaging, and developing our people. Read more on

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

Epergne Solutions is seeking a Data Platform Engineer with over 6 years of total experience, including at least 5 years of relevant experience in Python and Airflow Data Engineering. The role is based in India and is a contract position with a work-from-office mode. As a Data Platform Engineer at Epergne Solutions, you will be responsible for designing, developing, and maintaining complex data pipelines using Python for efficient data processing and orchestration. You will collaborate with cross-functional teams to understand data requirements and architect robust solutions within the AWS environment. Your role will also involve implementing data integration and transformation processes, optimizing existing data pipelines, and troubleshooting issues related to data pipelines to ensure smooth operation and minimal downtime. The ideal candidate should have a Bachelor's degree in Computer Science, Engineering, or a related field, along with proficiency in Python and SQL for data processing and manipulation. You should have a minimum of 5 years of experience in data engineering, with a strong background in Apache Airflow and AWS technologies, particularly S3, Glue, EMR, Redshift, and AWS Lambda. Knowledge of Snowflake is preferred, along with experience in optimizing and scaling data pipelines for performance and efficiency. In addition to technical skills, you should possess excellent problem-solving abilities, effective communication skills, and the capacity to work in a fast-paced, collaborative environment. Keeping abreast of the latest industry trends and best practices related to data engineering and AWS services is crucial for this role. Preferred qualifications include AWS certifications related to data engineering or big data, experience with big data technologies like Snowflake, Spark, Hadoop, or related frameworks, familiarity with other data orchestration tools besides Apache Airflow, and knowledge of version control systems like Bitbucket and Git. 
Epergne Solutions prefers candidates who can join within 30 days or have a shorter notice period.

Posted 2 weeks ago

Apply

4.0 - 6.0 years

0 Lacs

hyderabad, telangana, india

Remote

Job Description You will be #LI-hybrid based in Hyderabad and reporting to the Director of Engineering. Working as part of an existing agile team to develop quality solutions within required deadlines. Collaborating effectively to support and enhance the full product lifecycle. Reviewing proposals, evaluating alternatives, providing estimates, and making recommendations. Shape and lead the infrastructure and security strategy and the technical roadmaps across the teams. Lead the design and implementation of scalable, secure, and compliant cloud solutions, collaborating across engineering, security, and business teams. Champion DevSecOps practices, CI/CD pipelines, and cloud-native tooling to enhance posture and reduce friction. Ensure operational coverage whilst driving automation and innovation using modern engineering practices such as Infrastructure as Code and Policy as Code, engineering delivery, and maintaining platform security posture. Individuals will ideally be qualified to Degree, HND or HNC standard in a software development discipline, or can demonstrate commercial experience of managing infrastructure deployed on AWS. Qualifications 4+ years of experience in the full development lifecycle You should be able to explain solutions to technical and non-technical audiences Ability to analyse problems and requirements. Expertise in DevOps & IaC tooling - CDK, Terraform, CI/CD pipelines, Git, Groovy, shell scripting Proficient with scripting language(s): Python (preferred), PowerShell, Configuration as Code principles, and API integration Familiarity with AWS technologies and principles such as Lake Formation, IAM, Glue, EC2 Expertise with automated testing methodologies. Exposure to TDD and BDD.
Remain up to date with the terminology, concepts and best practice Expertise in troubleshooting and debugging production systems in AWS Desired Skills Understanding of Agile methodologies AWS Solutions Architect Certification Expertise in using CloudFormation and CDK RESTful and Microservice Architectures Familiarity with SonarQube and Veracode Expertise in data engineering practices Additional Information Our uniqueness is that we celebrate yours. Experian's culture and people are important differentiators. We take our people agenda very seriously and focus on what matters: DEI, work/life balance, development, authenticity, collaboration, wellness, reward & recognition, volunteering... the list goes on. Experian's people-first approach is award-winning: World's Best Workplaces 2024 (Fortune Global Top 25), Great Place To Work in 24 countries, and Glassdoor Best Places to Work 2024, to name a few. Check out Experian Life on social or our Careers Site and Glassdoor to understand why. Experian is proud to be an Equal Opportunity and Affirmative Action employer. Innovation is a critical part of Experian's DNA and practices, and our diverse workforce drives our success. Everyone can succeed at Experian and bring their whole self to work, irrespective of their gender, ethnicity, religion, color, sexuality, physical ability or age. If you have a disability or special need that requires accommodation, please let us know at the earliest opportunity. Benefits Experian cares for employees' work-life balance, health, safety and wellbeing. In support of this endeavor, we offer best-in-class family well-being benefits, enhanced medical benefits and paid time off. This is a hybrid remote/in-office role. Experian Careers - Creating a better tomorrow together
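The Policy as Code practice this posting mentions can be illustrated with a minimal sketch: a security rule expressed as code and evaluated against resource configurations before deployment. The rule, config shape, and resource names below are invented for illustration; real setups would use tools like CDK aspects or OPA against actual infrastructure definitions.

```python
# Toy policy check: flag storage resources that lack encryption at rest.
# The config dictionaries stand in for parsed IaC resource definitions.
def check_encryption(resources: list) -> list:
    """Return names of resources violating the encryption-at-rest rule."""
    return [
        r["name"]
        for r in resources
        if r.get("type") == "s3_bucket" and not r.get("encrypted", False)
    ]

resources = [
    {"name": "logs-bucket", "type": "s3_bucket", "encrypted": True},
    {"name": "raw-data", "type": "s3_bucket"},  # encryption not set
]
print(check_encryption(resources))  # ['raw-data']
```

Running such checks in a CI/CD pipeline, and failing the build on violations, is what turns a written security standard into enforceable Policy as Code.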

Posted 2 weeks ago

Apply

5.0 - 7.0 years

0 Lacs

hyderabad, telangana, india

On-site

Job Description We are looking for a passionate Data Engineer to join our agile team. You will be #LI-hybrid (hybrid work schedule) based in Hyderabad and reporting to the Director of Engineering. You will help build high-quality solutions that meet the highest technical standards and deliver value to our customers. The ideal candidate will have 5+ years of experience in the software development lifecycle, a strong understanding of business needs, and responsibility for the products and services we deliver. Collaborate with an agile team to develop quality solutions within deadlines. Support and enhance the full product lifecycle through effective collaboration. Review proposals, evaluate alternatives, provide estimates, and make recommendations. Serve as an expert on applications and provide technical support. Revise, update, refactor, and debug both new and existing codebases. Support the development of other team members. Qualifications Degree, HND, or HNC in a software development discipline, or equivalent commercial experience in developing applications deployed on AWS. Minimum 5+ years of experience. Expertise with the full development lifecycle. Able to explain solutions to both technical and non-technical audiences. Ability to write clean, scalable code, with a focus on design patterns and best practices. Proficiency with Application Lifecycle Management tools (e.g., Git, Jira, Confluence). Familiarity with CI/CD pipeline tools. Expertise in Scala development and the Spark framework. Familiarity with AWS technologies. Commitment to staying updated with the latest terminology, concepts, and best practices. Desired Skills: Understanding of Agile methodologies. AWS Developer Certification. Expertise in Python, OpenShift/Kubernetes. Proficiency with automated testing tools (e.g., ScalaTest). Knowledge of RESTful and microservice architectures. Expertise in Terraform or CloudFormation. Familiarity with SonarQube and Veracode.
Additional Information Our uniqueness is that we celebrate yours. Experian's culture and people are important differentiators. We take our people agenda very seriously and focus on what matters: DEI, work/life balance, development, authenticity, collaboration, wellness, reward & recognition, volunteering... the list goes on. Experian's people-first approach is award-winning: World's Best Workplaces 2024 (Fortune Global Top 25), Great Place To Work in 24 countries, and Glassdoor Best Places to Work 2024, to name a few. Check out Experian Life on social or our Careers Site and Glassdoor to understand why. Experian is proud to be an Equal Opportunity and Affirmative Action employer. Innovation is a critical part of Experian's DNA and practices, and our diverse workforce drives our success. Everyone can succeed at Experian and bring their whole self to work, irrespective of their gender, ethnicity, religion, color, sexuality, physical ability or age. If you have a disability or special need that requires accommodation, please let us know at the earliest opportunity. Benefits Experian cares for employees' work-life balance, health, safety and wellbeing. In support of this endeavor, we offer best-in-class family well-being benefits, enhanced medical benefits and paid time off. Experian Careers - Creating a better tomorrow together

Posted 2 weeks ago

Apply

12.0 - 16.0 years

0 Lacs

karnataka

On-site

You will be working with a cross-functional team to create software products following approved architecture and the roadmap set by the product management team. Additionally, you will be responsible for managing a team of software engineers to achieve the goals set by the business teams. Your main tasks will include writing high-quality distributed system software and utilizing hands-on experience in Cloud technologies, as well as exposure and experience with the latest technology stack. Collaboration with architects and technical product managers to translate system architecture and product requirements into well-designed software components will be crucial. You will take ownership of implementing individual software components, focusing on quality, test-driven development, and sound software engineering practices. Participation in software design reviews, conducting peer code reviews, and providing feedback to other team members will also be part of your responsibilities. Designing, implementing, testing, deploying, and maintaining innovative software solutions to enhance service performance, durability, cost, and security will be key objectives. It is essential to utilize software engineering best practices to ensure a high standard of quality for all team deliverables. To be successful in this role, you must have a Bachelor's Degree in Computer Science or STEM Majors (Science, Technology, Engineering, and Math) with a minimum of 12+ years of experience. You should have at least 3 years of experience in building scalable, distributed systems using modern cloud frameworks such as AWS and 2+ years of experience in leading design or architecture (design patterns, reliability, and scaling) of new and existing systems. Hands-on experience in backend product development using Python is required, along with deep knowledge of microservices architecture and containerization technologies (e.g., Docker, Kubernetes). 
An excellent understanding of cloud-native design patterns and best practices is essential. Strong problem-solving skills and the ability to troubleshoot complex technical issues are crucial. You should also be capable of consulting customers on the alignment of outcomes and desired technical solutions at an enterprise level. Experience with designing/architecting large-scale distributed systems, preferably using AWS technologies, is preferred. Bringing fresh ideas from various areas, including testing and validation automation, while maintaining production availability, conversion automation, distributed computing, and large-scale system design, is important. Experience in creating, documenting, and communicating software architectures for complex products and building, tracking, and communicating plans within Agile processes will be beneficial for this role.

Posted 2 weeks ago

Apply

6.0 - 11.0 years

13 - 18 Lacs

gurugram

Work from Office

Job Summary: We're hiring for a Technical Account Manager (TAM) with a strong background in AWS cloud technologies. As part of the TAM team, you'll provide guidance to both new and existing customers specifically about AWS technologies, services, and offerings. You'll have the opportunity to collaborate with other Rackspace teams to develop and deliver innovative solutions utilizing various public cloud services. Ultimately, you'll be assisting customers in rethinking their technical platforms and application landscapes, promoting the adoption of cloud-native technologies, and driving innovation within their business. Work Location: If you live within a 40km radius of our Mexico City office, we'd like for you to work in the office two days a week. If further than 40km, you may work fully remote from one of these states: Monterey, Aguascalientes, Jalisco, Nuevo Leon, Puebla, Guadalajara, or Queretaro. Please submit a resume in English to be considered for the role. All interviews will be held in English. Key Responsibilities: Conduct implementation calls. Present written and verbal proposals and grow the installed base. Understand invoicing, contracts, proposals and renewals. Escalate with guidance and receive escalations. Identify the issue, understand its cause, and determine the appropriate solution. Assess deficiencies in the existing customer configuration and forecast potential issues arising from planned modifications. Describe the effectiveness of each option and highlight the key differences between the recommendations. Act as the customer's primary technical advocate. Work with Rackspace cloud engineers, CSMs, and other stakeholders to translate business and technical objectives into Rackspace solutions. Provide impartial advocacy, ongoing support, and assistance for workloads. Bring innovative technologies and products to the customer. Provide value-added insight to the customer, ensuring AWS cloud efficiency and operational excellence in a hands-off keyboard role.
Qualifications: 6+ years of relevant experience. AWS Certified Solutions Architect - Associate or other AWS certifications strongly preferred. Excellent communication, both verbal and written, in Spanish and English. Excellent technical knowledge of AWS technologies, services, and offerings. Must understand and be able to conduct AWS Well-Architected Framework reviews and proactively assess workloads. Ability to scale customer configurations. Ability to use account management tools. Ability to drive the technical strategy of a customer's cloud-native transformation. Strong knowledge of the configuration build process.

Posted 2 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

Maharashtra

On-site

Flexing It is a freelance consulting marketplace that connects freelancers and independent consultants with organizations seeking independent talent. Our client, a leading global specialist in energy management and automation, is looking to engage a Consultant Tableau Developer. In this role, you will play an active part in accelerating the company's Big Data and Analytics environment, contributing to Digital initiatives aimed at enhancing, automating, and accelerating the implementation of master data management, adoption of big data platforms, data excellence and data dictionary evolution, data security, and business intelligence and analytics. You will collaborate with different business units of the company and with team members distributed across Paris, Grenoble, Bangalore, and Barcelona.

Key responsibilities include:
- Designing, developing, and delivering Analytics solutions integrated with the corporate Data Platform, for yourself and a team of developers
- Conducting data analysis and data modeling, designing the Analytics Dashboards architecture, and delivering in alignment with Global Platform standards
- Interacting with customers to understand their business problems and provide analytics solutions
- Collaborating with Global Data Platform leaders to integrate analytics with corporate platforms
- Working with UX/UI global functions to design visualizations for customers
- Building interactive, rich visualization dashboards showcasing KPIs
- Demonstrating strength in data modeling, ETL development, and data warehousing
- Utilizing SQL and query performance tuning skills
- Developing solutions using Tableau to meet enterprise-level requirements
- Operating large-scale data warehousing and analytics projects using AWS technologies

Duration: 3 to 4 months
Location: On-site, Bagmane, Bangalore (Hybrid work mode)
Capacity: Full time

Skills Required:
- B.E./B.Tech/Master's in Computer Science, Electronics, or a relevant technical certification
- 7 years of experience in Analytics Development, with data modeling experience on Tableau
- Certified on Tableau
- Strong presentation, communication, and interpersonal skills
- Ability to work effectively with globally dispersed stakeholders
- Ability to manage multiple priorities in a fast-paced environment
- Data-driven mindset and ability to communicate complex business problems and technical solutions
- Strong team player/leader with analytical skills and problem-solving abilities

In summary, this role as a Consultant Tableau Developer offers an opportunity to contribute significantly to the enhancement of the Big Data and Analytics environment in a global work culture, collaborating with different business units and team members across multiple locations. The position requires a strong technology and solution delivery focus, with responsibilities spanning from data analysis to visualization design and integration with corporate platforms, utilizing Tableau and other analytics tools.
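As a rough illustration of the ETL, data-modeling, and SQL-tuning skills this posting lists, here is a minimal sketch of a transform-and-aggregate step feeding a dashboard KPI. All table and column names are hypothetical, and SQLite stands in for whichever warehouse the role actually uses:

```python
# Hypothetical minimal ETL step: load raw sales rows, filter out dirty
# records, and aggregate a per-region KPI a dashboard might chart.
import sqlite3

raw_rows = [
    ("2024-01-05", "EMEA", "120.50"),
    ("2024-01-06", "EMEA", "80.00"),
    ("2024-01-06", "APAC", "bad-value"),  # dirty row, will be rejected
    ("2024-01-07", "APAC", "200.00"),
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (day TEXT, region TEXT, amount REAL)")

# Transform: keep only rows whose amount parses as a number.
clean = []
for day, region, amount in raw_rows:
    try:
        clean.append((day, region, float(amount)))
    except ValueError:
        continue  # a real pipeline would route this to a reject/audit table

conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", clean)

# Aggregate: revenue per region, the shape a dashboard data source expects.
kpi = dict(conn.execute("SELECT region, SUM(amount) FROM sales GROUP BY region"))
print(kpi)  # {'APAC': 200.0, 'EMEA': 200.5}
```

In practice the same pattern scales up: validation rules live in the ETL layer, the aggregate query is what gets tuned (indexes, partitioning), and the resulting table is published as the dashboard's data source.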

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.


Featured Companies