
160 Kinesis Jobs

Set up a job alert
JobPe aggregates results for easy access, but you apply directly on the original job portal.

4.0 - 8.0 years

0 Lacs

noida, uttar pradesh

On-site

As a Solution Architect in the Pre-Sales department, with 4-6 years of experience in cloud infrastructure deployment, migration, and managed services, your primary responsibility will be to design AWS Cloud Professional Services and AWS Cloud Managed Services solutions tailored to customer needs and requirements. You will engage with customers to analyze their requirements, ensuring cost-effective and technically sound solutions are provided. Your role will also involve developing technical and commercial proposals in response to client inquiries such as Requests for Information (RFI), Requests for Quotation (RFQ), and Requests for Proposal (RFP). Additionally, you will prepare and deliver technical presentations to clients, highlighting the value and capabilities of AWS solutions. Collaborating closely with the sales team, you will support their objectives and help close deals that align with business needs.

Your ability to apply creative and analytical problem-solving to complex challenges using AWS technology will be crucial. You should have hands-on experience in planning, designing, and implementing AWS IaaS, PaaS, and SaaS services. Experience executing end-to-end cloud migrations to AWS, including discovery, assessment, and implementation, is required, and you must be proficient in designing and deploying well-architected landing zones and disaster recovery environments on AWS. Excellent written and verbal communication skills are essential for articulating solutions to technical and non-technical stakeholders, and strong organizational, time management, problem-solving, and analytical skills will drive consistent business performance and help you exceed targets.

Desirable skills include intermediate-level experience with AWS services such as AppStream, Elastic Beanstalk, ECS, and ElastiCache, as well as IT orchestration and automation tools such as Ansible, Puppet, and Chef. Knowledge of Terraform, Azure DevOps, and AWS development services will be advantageous. In this role, based in Noida, Uttar Pradesh, India, you will collaborate with technical and non-technical teams across the organization, ensuring scalable, efficient, and secure solutions are delivered on the AWS platform.

Posted 1 day ago

Apply

6.0 - 10.0 years

0 Lacs

indore, madhya pradesh

On-site

You should have 6-8 years of hands-on experience with Big Data technologies such as PySpark (DataFrame and SparkSQL), Hadoop, and Hive, along with good hands-on experience with Python and Bash scripts and a solid understanding of SQL and data warehouse concepts. Strong analytical, problem-solving, data analysis, and research skills are crucial for this role, as is a demonstrable ability to think creatively and independently rather than relying solely on readily available tools. Excellent communication, presentation, and interpersonal skills are a must for effective collaboration within the team. Hands-on experience with cloud-platform Big Data technologies such as IAM, Glue, EMR, Redshift, S3, and Kinesis is required. Experience orchestrating with Airflow or any job scheduler is highly beneficial, and familiarity with migrating workloads from on-premise to cloud and cloud to cloud is also desired.

In this role, you will develop efficient ETL pipelines based on business requirements while adhering to development standards and best practices. Integration testing of different pipelines in the AWS environment and providing estimates for development, testing, and deployments on various environments will be part of your responsibilities, along with participation in code peer reviews to ensure compliance with best practices. Creating cost-effective AWS pipelines using services such as S3, IAM, Glue, EMR, and Redshift is a key aspect of this position. The expected experience ranges from 6 to 8 years in relevant fields. The job reference number for this position is 13024.
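To give a flavor of the PySpark (DataFrame and SparkSQL) work this role describes, here is a minimal sketch; the S3 paths and column names are hypothetical:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Ingest raw data from S3 (hypothetical path) into a DataFrame
orders = spark.read.parquet("s3://example-raw-bucket/orders/")

# DataFrame API: basic cleansing and derivation
cleaned = (
    orders
    .dropDuplicates(["order_id"])
    .filter(F.col("amount") > 0)
    .withColumn("order_date", F.to_date("created_at"))
)

# SparkSQL: register a view and aggregate with plain SQL
cleaned.createOrReplaceTempView("orders")
daily = spark.sql("""
    SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
    FROM orders
    GROUP BY order_date
""")

# Write results back to S3, partitioned for downstream consumers (e.g. Redshift Spectrum)
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-curated-bucket/daily_orders/"
)
```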

Posted 2 days ago

Apply

4.0 - 8.0 years

0 Lacs

punjab

On-site

As a skilled and versatile NodeJS & Python Engineer, you will play a crucial role in designing, developing, and maintaining robust server-side logic and APIs to support a suite of applications. Collaborating closely with front-end developers, cross-functional engineering teams, and product stakeholders, you will ensure smooth integration of user-facing features with back-end functionality, leveraging the Serverless framework within an AWS Lambda environment. Your expertise in NodeJS and proficiency in Python are essential for supporting and extending services written in both languages, and your adaptability across multiple technologies will be pivotal in building scalable, high-performance, and secure applications for a global user base. Escalon, a rapidly growing company offering essential back-office services worldwide, relies on the engineering team to develop tools and platforms that drive success and scalability for both Escalon and its clients.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field, or 4+ years of enterprise software development experience.
- Minimum 4 years of hands-on experience with NodeJS in the Serverless framework.
- Professional experience in Python (2+ years) for back-end scripting and service development.
- Solid grasp of object-oriented programming principles in JavaScript and Python.
- Experience with the AWS serverless environment, including Lambda, Fargate, S3, RDS, SQS, SNS, Kinesis, and Parameter Store.
- Understanding of asynchronous programming patterns and challenges.
- Knowledge of front-end technologies like HTML5 and templating systems.
- Proficiency in designing and developing loosely coupled serverless applications and REST APIs.
- Strong experience with SQL and database schema design.
- Familiarity with service-oriented architecture (SOA) principles and microservices best practices.
- Effective verbal and written communication skills.
- Experience with modern software engineering practices such as version control (Git), CI/CD, unit testing, and agile development.
- Strong analytical, problem-solving, and debugging skills.

Responsibilities:
- Write reusable, testable, and efficient code in both NodeJS and Python.
- Develop and maintain unit tests and automated testing coverage.
- Integrate front-end elements with server-side logic in a Serverless architecture.
- Design and implement low-latency, high-availability, and high-performance applications.
- Ensure security, data protection, and adherence to compliance standards.
- Build and consume RESTful APIs and microservices using AWS Lambda and related services.
- Actively participate in code reviews, design discussions, and architecture planning.
- Promote the use of quality open-source libraries, considering licensing and long-term support.
- Leverage and enhance the existing CI/CD DevOps pipeline for code integration and deployment.
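As an illustration of the serverless back-end work described above, here is a minimal Python AWS Lambda handler; the SQS trigger, bucket name, and environment variable are assumptions for this sketch:

```python
import json
import os

import boto3

s3 = boto3.client("s3")  # created once, reused across warm invocations
BUCKET = os.environ.get("ARCHIVE_BUCKET", "example-archive-bucket")  # hypothetical

def handler(event, context):
    """Archive each SQS message to S3; a typical small serverless back-end unit."""
    records = event.get("Records", [])
    for record in records:
        body = json.loads(record["body"])
        key = f"events/{record['messageId']}.json"
        s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(body))
    return {"statusCode": 200, "body": json.dumps({"processed": len(records)})}
```

In the Serverless framework this handler would be wired to its SQS trigger and environment variables in serverless.yml, keeping the function itself a small, testable unit.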

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

noida, uttar pradesh

On-site

As a Senior Software Engineer (Python Fullstack) at Coders Brain Technology Pvt. Ltd., you will play a crucial role in designing, implementing, and maintaining scalable software solutions. Your primary responsibilities will involve using Python and React to develop end-to-end solutions for various projects, and leveraging AWS services such as Lambda, DynamoDB, API Gateway, and Kinesis to build efficient and scalable systems. Working on complex data projects will be a key aspect of your role, ensuring data integrity, security, and optimal performance.

Collaboration with a cross-functional team will be essential to deliver high-quality software solutions. You will conduct code reviews, offer constructive feedback to team members, and ensure adherence to coding standards and best practices. Writing bug-free, scalable, and maintainable code is a core part of your responsibilities; you will contribute to enhancing the codebase and optimize database structures for seamless data retrieval and storage. Staying updated with the latest industry trends and technologies will be crucial to enhancing the technical capabilities of the team.

The ideal candidate should hold a Bachelor's or Master's degree in Computer Science or a related field, along with a minimum of 5 years of experience in Full Stack development. Proficiency in core Python, React.js, AWS services (Lambda, DynamoDB, API Gateway, Kinesis), JavaScript/TypeScript, and Full Stack development is required. Additional skills such as NoSQL databases, pytest, Redux, React Testing Library, Jest, Elasticsearch, Redis, scalability, performance optimization, team collaboration, and a focus on code quality are highly desirable.
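To illustrate the Lambda/DynamoDB/API Gateway stack named above, a minimal Python sketch; the table name and payload shape are hypothetical:

```python
import json
import os

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table(os.environ.get("TABLE_NAME", "example-users"))  # hypothetical table

def handler(event, context):
    """Handle a POST proxied by API Gateway and persist the payload to DynamoDB."""
    item = json.loads(event["body"])
    table.put_item(Item=item)
    return {
        "statusCode": 201,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"saved": item.get("id")}),
    }
```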

Posted 3 days ago

Apply

6.0 - 11.0 years

6 - 11 Lacs

Bengaluru

Work from Office

We are seeking an exceptionally motivated and experienced Senior Software Engineer to join the Content Productization & Delivery (CPD) organization at Thomson Reuters. As a key member of our team, you will play a pivotal role in ensuring the quality, reliability, and availability of critical systems that support our research-based applications and APIs across our core products.

About the Role: In this opportunity as Senior Software Engineer, you will:
- Actively participate and collaborate in meetings, processes, agile ceremonies, and interactions with other technology groups
- Work with Lead Engineers and Architects to develop high-performing and scalable software solutions
- Provide technical guidance, mentoring, and coaching to software or systems engineering teams
- Assist in identifying and correcting software performance bottlenecks
- Provide regular progress and status updates to management
- Provide technical support to operations or other development teams by troubleshooting, debugging, and solving critical issues
- Interpret code and solve problems based on existing standards
- Create and maintain technical documentation related to assigned components
- Deliver high-quality software solutions that meet requirements and design specifications
- Collaborate with cross-functional teams to ensure the success of our products
- Share knowledge and best practices on using new and emerging technologies
- Ensure the reliability and availability of critical systems

About You: You're a fit for the role of Senior Software Engineer (Python with AWS) if your background includes:
- Bachelor's or master's degree in computer science, engineering, information technology, or equivalent experience
- 6+ years of professional software development experience
- Strong Python programming
- AWS experience with EKS/Kubernetes
- Experience with LLMs, AI solutions, and evaluation
- Understanding of agentic systems and workflows
- Experience creating retrieval systems leveraging tools like OpenSearch
- Experience with event-driven/asynchronous programming
- Experience with high-concurrency systems
- Experience with CI/CD using GitHub Actions and AWS services (CodePipeline/CodeBuild)
- Strong understanding of microservices and RESTful APIs
- FastAPI and Celery
- Data engineering background
- Experience with AWS services (Redis, DynamoDB, S3, SQS, Kinesis, KMS, IAM, Secrets Manager, etc.)
- Performance optimization and security practices

#LI-AD2

What's in it For You:
- Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
- Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
- Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
- Industry-Competitive Benefits: We offer comprehensive benefit plans including flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
- Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
- Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
- Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world-leading provider of trusted journalism and news.

We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward.

As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
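Since the role names FastAPI explicitly, here is a minimal sketch of a FastAPI service; the endpoints and in-memory store are illustrative assumptions, not the team's actual API:

```python
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="content-api")  # hypothetical service name

class Document(BaseModel):
    doc_id: str
    title: str

# An in-memory dict stands in for OpenSearch/DynamoDB in this sketch
STORE: dict[str, Document] = {}

@app.put("/documents/{doc_id}")
async def upsert_document(doc_id: str, doc: Document) -> Document:
    STORE[doc_id] = doc
    return doc

@app.get("/documents/{doc_id}")
async def get_document(doc_id: str) -> Document:
    if doc_id not in STORE:
        raise HTTPException(status_code=404, detail="document not found")
    return STORE[doc_id]
```

Run locally with `uvicorn app:app --reload`; async endpoints like these are what the event-driven/asynchronous programming requirement points at.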

Posted 3 days ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

As an AWS Big Data Engineer working remotely, you will be a crucial part of a company that provides enterprise-grade platforms designed to accelerate the adoption of Kubernetes and data. Our flagship platform, Gravity, offers developers a simplified Kubernetes experience by handling the underlying complexities. You will use tailor-made workflows to deploy microservices, workers, data, and MLOps workloads across multiple cloud providers. Gravity takes care of Kubernetes-related orchestration tasks including cluster provisioning, workload deployments, configuration management, secret management, scaling, and provisioning of cloud services, and it provides out-of-the-box observability for workloads, enabling developers to quickly engage in Day 2 operations.

You will also work with Dark Matter, a unified data platform that enables enterprises to extract value from their data lakes. Within this platform, data engineers and data analysts can easily discover datasets through an Augmented Data Catalog. Data Profile, Data Quality, and Data Privacy functionality is deeply integrated within the catalog, offering an immediate snapshot of datasets in data lakes. Organizations can maintain data quality by defining quality rules that automatically monitor the accuracy, validity, and consistency of data to meet their data governance standards, and the built-in Data Privacy engine can identify sensitive data in data lakes and take automated actions, such as redactions, through an integrated Policy and Governance engine.

You should have a minimum of 5 years of experience working with high-volume data infrastructure; proficiency in AWS and/or Databricks, Kubernetes, ETL, and job orchestration tooling; extensive programming experience in either Python or Java; and skills in data modeling, optimizing SQL queries, and system performance tuning. Knowledge of the latest open-source data frameworks and modern data platform tech stacks is essential, along with proficiency in SQL, AWS, databases, Apache Spark, Spark Streaming, EMR, Kubernetes, and Kinesis/Kafka. Your passion should lie in tackling messy unstructured data and transforming it into clean, usable data that contributes to a more organized world, and continuous learning in a rapidly evolving data landscape should be a priority. Strong communication skills, the ability to work independently, and a degree in Computer Science, Software Engineering, Mathematics, or equivalent experience are necessary for this role. The benefit of working from home is provided as part of this position.
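As a rough illustration of the Spark Streaming and Kafka skills listed above, a minimal Spark Structured Streaming sketch; the brokers, topic, and schema are hypothetical:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("clickstream").getOrCreate()

schema = StructType([
    StructField("user_id", StringType()),
    StructField("page", StringType()),
    StructField("ts", TimestampType()),
])

# Read a Kafka topic as an unbounded stream (hypothetical brokers/topic)
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "clickstream")
    .load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Windowed counts per page; console sink keeps the sketch self-contained
query = (
    events.withWatermark("ts", "10 minutes")
    .groupBy(F.window("ts", "5 minutes"), "page")
    .count()
    .writeStream.outputMode("update")
    .format("console")
    .start()
)
query.awaitTermination()
```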

Posted 4 days ago

Apply

5.0 - 8.0 years

15 - 25 Lacs

Kochi, Bengaluru, Thiruvananthapuram

Work from Office

NP: Immediate to 15 days. Mandatory skills: OMS, L3 Support, Ecom domain, Java, Microservices, AWS, and monitoring tools such as New Relic, Datadog, Grafana, and Splunk.

You should have good end-to-end knowledge of the various Commerce subsystems, including Storefront, Core Commerce back end, Post-Purchase processing, OMS, Store/Warehouse Management processes, Supply Chain, and Logistics processes. Extensive backend development knowledge with core Java/J2EE and microservice-based, event-driven architecture on a cloud platform (preferably AWS) is required. You should be cognizant of key integrations undertaken in eCommerce and associated downstream subsystems, including but not limited to search frameworks, payment gateways, product lifecycle management systems, loyalty platforms, recommendation engines, and promotion frameworks. We recommend someone with good knowledge of integrations with downstream eCommerce systems like OMS, store systems, and ERP.

A good understanding of data structures and entity models is expected, along with an understanding of building, deploying, and maintaining server-based as well as serverless applications on the cloud, preferably AWS. Expertise in integrating synchronously and asynchronously with third-party web services is essential. Concrete knowledge of AWS Lambda functions, API Gateway, AWS CloudWatch, SQS, SNS, EventBridge, Kinesis, Secrets Manager, S3 storage, and server architectural models is good to have.

You must have working knowledge of production application support; good knowledge of Agile methodology, CI/CD pipelines, code repositories, and branching strategies, preferably with GitHub or Bitbucket; and good knowledge of observability tools like New Relic, Datadog, Grafana, and Splunk. A fairly good understanding of L3 support processes, roles, and responsibilities is expected, and you should be flexible to overlap with part of onsite (PST) hours to hand off and transition the day's work to the onsite counterpart for L3.

Posted 4 days ago

Apply

5.0 - 8.0 years

15 - 25 Lacs

Pune

Hybrid

So, what's the role all about? We are seeking a highly skilled Backend Software Engineer to join GenAI Solutions for CX, our fully integrated AI cloud customer experience platform. In this role you will get exposure to new and exciting technologies and collaborate with professional engineers, architects, and product managers to create NICE's advanced line of AI cloud products.

How will you make an impact?
- Design and implement high-performance microservices using AWS cloud technologies
- Build scalable backend systems using Python
- Lead the development of event-driven architectures utilizing Kafka and AWS Firehose
- Integrate with Athena, DynamoDB, S3, and other AWS services to deliver end-to-end solutions
- Ensure high-quality deliverables with testable, reusable, and production-ready code
- Collaborate within an agile team, influencing architecture, design, and technology adoption

Have you got what it takes?
- 5+ years of backend software development experience
- Strong expertise in Python/C#
- Deep knowledge of microservices architecture, RESTful APIs, and cloud-native development
- Hands-on experience with AWS Lambda, S3, Athena, Kinesis Firehose, and Kafka
- Strong database skills (SQL & NoSQL), including schema design and performance tuning
- Experience designing scalable systems and delivering enterprise-grade software
- Comfortable working with CI/CD pipelines and DevOps practices
- Passion for clean code, best practices, and continuous improvement
- Excellent communication and collaboration abilities
- Fluent in English (written and spoken)

What's in it for you? Join an ever-growing, market-disrupting global company where the teams – comprised of the best of the best – work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NiCE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NiCEr!

Enjoy FLEX! At NiCE, we work according to the NiCE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.

Requisition ID: 7981
Reporting into: Tech Manager
Role Type: Individual Contributor
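To illustrate the Kafka-to-Firehose event flow this role describes, a minimal Python sketch using boto3 and the kafka-python client; the topic, broker, and delivery-stream names are assumptions:

```python
import json

import boto3
from kafka import KafkaConsumer  # kafka-python; an assumption for this sketch

firehose = boto3.client("firehose")
STREAM = "example-delivery-stream"  # hypothetical Firehose delivery stream

consumer = KafkaConsumer(
    "interactions",                      # hypothetical topic
    bootstrap_servers=["broker:9092"],
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

# Forward each Kafka event into Firehose, which lands it in S3 where Athena can query it
for message in consumer:
    firehose.put_record(
        DeliveryStreamName=STREAM,
        Record={"Data": (json.dumps(message.value) + "\n").encode("utf-8")},
    )
```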

Posted 4 days ago

Apply

3.0 - 7.0 years

0 Lacs

kochi, kerala

On-site

You should have experience designing and building serverless data lake solutions using a layered components architecture, covering the ingestion, storage, processing, security & governance, data cataloguing & search, and consumption layers. Proficiency in AWS serverless technologies such as Lake Formation, Glue, Glue Python, Glue Workflows, Step Functions, S3, Redshift, QuickSight, Athena, AWS Lambda, and Kinesis is required, and hands-on experience with Glue is essential. You must be skilled in designing, building, orchestrating, and deploying multi-step data processing pipelines using Python and Java. Experience managing source data access security, configuring authentication and authorization, and enforcing data policies and standards is also necessary, along with familiarity with AWS environment setup and configuration. A minimum of 6 years of relevant experience, with at least 3 years building solutions on AWS, is mandatory. The ability to work under pressure and a commitment to meeting customer expectations are essential traits for this role. If you meet these requirements and are ready to take on this challenging opportunity, please reach out to hr@Stanratech.com.
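As a flavor of the Glue Python work required above, a minimal AWS Glue job skeleton; the catalog database, table name, and S3 path are hypothetical:

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Lake Formation / Glue Data Catalog (hypothetical database/table)
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="raw_zone", table_name="transactions"
)

# Light transformation, then write curated Parquet back to S3 for Athena/QuickSight
glue_context.write_dynamic_frame.from_options(
    frame=dyf.drop_fields(["_corrupt_record"]),
    connection_type="s3",
    connection_options={"path": "s3://example-curated-zone/transactions/"},
    format="parquet",
)

job.commit()
```

A Glue Workflow or Step Functions state machine would then chain jobs like this one into the multi-step pipelines the posting describes.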

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

indore, madhya pradesh

On-site

At InfoBeans, we believe in making other people's lives better through our work and everyday interactions. We are currently looking for a Java Fullstack with Angular professional to join our team in Indore/Pune. With a focus on web application development and over 10 years of experience, you will play a crucial role in developing microservices using Spring/AWS technologies and deploying them on the AWS platform. Your responsibilities will include supporting Java/Angular enterprise applications with multi-region setups, performing unit and system testing of application code, and executing implementation activities. You will be involved in designing, building, and testing Java EE and Angular full stack applications.

In this role, you will work in an open workspace with smart and pragmatic team members, with ever-growing opportunities for professional and personal growth in a learning culture that encourages teamwork, collaboration, and diversity. Excellence, compassion, openness, and ownership are highly valued and rewarded in our environment.

To excel in this role, we expect you to have in-depth knowledge of popular Java frameworks such as Spring Boot and Spring, experience with Object-Oriented Design (OOD), and proficiency in relational databases, MySQL, and ORM technologies (JPA2, Hibernate). Experience working in Agile (Scrum/Lean) with a DevSecOps focus is essential, along with familiarity with AWS, Kubernetes, Docker containers, and AWS component usage, configuration, and deployment, including Elasticsearch, EC2, S3, SNS, SQS, API Gateway Service, and Kinesis. An AWS certification would be advantageous, and any knowledge of health and related technologies will be beneficial. If you are looking for a challenging yet rewarding opportunity to contribute to cutting-edge projects in a supportive and dynamic environment, we encourage you to apply for this position.

Posted 1 week ago

Apply

3.0 - 15.0 years

0 Lacs

delhi

On-site

At Smart Joules, we are committed to offering innovative, data-driven solutions that enable organizations to meet sustainability objectives while significantly reducing energy expenses. We are currently looking for a dedicated and customer-centric professional to assume the role of Head of Engineering.

As Head of Engineering, your primary responsibility will be to spearhead the development of our technology platform, DeJoule, to make energy optimization simple and profitable at scale. DeJoule, built on cutting-edge IoT and web technologies integrated with machine learning, continuously detects and rectifies inefficiencies in dynamic energy systems like air conditioning and compressed air. The goal is to expand deployment, enhance data-driven intelligence, and achieve continuous optimization through full automation.

In this position, you will collaborate closely with Smart Joules' leadership, management team, and clients to align our software capabilities with our mission. Your duties will entail leading a team of engineers, collaborating with various departments, and ensuring the successful delivery of high-quality products and solutions. The ideal candidate will possess a robust technical background, exceptional leadership qualities, and a fervor for innovation.

Key responsibilities include building DeJoule into a globally competitive product for automatic and continuous performance optimization, user engagement, and cost efficiency. Recruiting and inspiring India's top energy-tech team, managing the engineering team, fostering innovation, and aligning engineering strategies with company objectives are crucial aspects of the role. You will oversee the design and implementation of energy optimization solutions, collaborate across departments, monitor project timelines, and stay abreast of emerging energy technologies to drive innovation.

The ideal candidate should have a proven track record of leading engineering teams and delivering complex projects in a fast-paced environment. Specific requirements include 10-15 years of engineering experience, with at least 3 years in a leadership role; excellent communication skills; and proficiency in technologies spanning system design, database administration, JavaScript frameworks, AWS managed services, Python, AI, and machine learning. Bonus points for experience in startup environments or HVAC-related industries.

Joining Smart Joules offers the opportunity to work on cutting-edge technology that can significantly reduce energy consumption in prominent buildings and factories, a collaborative work environment, competitive salary and benefits, and the chance to contribute to sustainability goals. (ref:hirist.tech)

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

pune, maharashtra

On-site

As a Data Platform developer at Barclays, you will play a crucial role in shaping the digital landscape and enhancing customer experiences. Leveraging cutting-edge technology, you will work alongside a team of engineers, business analysts, and stakeholders to deliver high-quality solutions that meet business requirements. Your responsibilities will include tackling complex technical challenges, building efficient data pipelines, and staying updated on the latest technologies to continuously enhance your skills.

To excel in this role, you should have hands-on coding experience in Python, along with a strong understanding of and practical experience in AWS development. Experience with tools such as Lambda, Glue, Step Functions, IAM roles, and various other AWS services is essential, and your expertise in building data pipelines using Apache Spark and AWS services will be highly valued. Strong analytical skills, troubleshooting abilities, and a proactive approach to learning new technologies are key attributes for success. Experience designing and developing enterprise-level software solutions, knowledge of file formats like JSON, Iceberg, and Avro, and familiarity with streaming services such as Kafka, MSK, Kinesis, and Glue Streaming will be advantageous. Effective communication and collaboration skills are essential for interacting with cross-functional teams and documenting best practices.

Your role will involve developing and delivering high-quality software solutions, collaborating with various stakeholders to define requirements, promoting a culture of code quality, and staying updated on industry trends. Adherence to secure coding practices, implementation of effective unit testing, and continuous improvement are integral parts of your responsibilities. You will be expected to lead and supervise a team, guide professional development, and ensure the delivery of work to a consistently high standard. Your impact will extend to related teams within the organization, and you will be responsible for managing risks, strengthening controls, and contributing to the achievement of organizational objectives. Ultimately, you will be part of a team that upholds Barclays' values of Respect, Integrity, Service, Excellence, and Stewardship, while embodying the Barclays Mindset of Empower, Challenge, and Drive in your daily interactions and work.

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

surat, gujarat

On-site

At devx, we specialize in helping some of India's most innovative brands unlock growth through AI-powered and cloud-native solutions developed in collaboration with AWS. As a rapidly growing consultancy, we are dedicated to addressing real-world business challenges with cutting-edge technology.

We are currently seeking a proactive and customer-focused AWS Solutions Architect to join our team. In this position, you will work directly with clients to craft scalable, secure, and cost-effective cloud architectures that tackle significant business obstacles. Your role will bridge the gap between business requirements and technical implementation, establishing you as a trusted advisor to our clients.

Key Responsibilities:
- Engage with clients to understand their business goals and translate them into cloud-based architectural solutions.
- Design, deploy, and document AWS architectures, emphasizing scalability, security, and performance.
- Develop solution blueprints and work closely with engineering teams to ensure successful execution.
- Conduct workshops, presentations, and in-depth technical discussions with client teams.
- Stay informed about the latest AWS offerings and best practices, integrating them into solution designs.
- Collaborate with sales, product, and engineering teams to provide comprehensive solutions.

We are looking for individuals with the following qualifications:
- Minimum of 2 years of experience designing and implementing solutions on AWS.
- Proficiency in fundamental AWS services such as EC2, S3, Lambda, RDS, API Gateway, IAM, and VPC.
- Proficiency in fundamental AI/ML and data services such as Bedrock, SageMaker, Glue, Athena, and Kinesis.
- Proficiency in fundamental DevOps services such as ECS, EKS, CI/CD pipelines, Fargate, and Lambda.
- Excellent written and verbal communication skills in English.
- Comfort in client-facing capacities, with the ability to lead technical dialogues and establish credibility with stakeholders.
- The ability to balance technical detail with business context, effectively communicating value to decision-makers.

Location: Surat, Gujarat
Note: This is an on-site role in Surat, Gujarat. Please apply only if you are willing to relocate.
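To illustrate the Bedrock familiarity listed above, a minimal boto3 sketch; the region and model ID are assumptions for this example:

```python
import json

import boto3

# Bedrock runtime client; region and model ID below are assumptions for this sketch
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.invoke_model(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",
    body=json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 256,
        "messages": [{"role": "user", "content": "Summarize our Q3 sales data."}],
    }),
)

# The response body is a stream; read and decode the JSON payload
print(json.loads(response["body"].read()))
```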

Posted 1 week ago

Apply

6.0 - 11.0 years

15 - 25 Lacs

Bengaluru

Work from Office

Hiring a Data Engineer in Bangalore with 6+ years of experience in the skills below.

Must Have:
- Big Data technologies: Hadoop, MapReduce, Spark, Kafka, Flink
- Programming languages: Java/Scala/Python
- Cloud: Azure, AWS, Google Cloud
- Docker/Kubernetes

Required Candidate Profile:
- Strong communication skills
- Experience with relational SQL/NoSQL databases: Postgres & Cassandra
- Experience with the ELK stack
- Immediate joiners are a plus
- Must be ready to work from office

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

You are a Senior Python Data Application Developer with strong expertise in core Python and data-focused libraries. Your primary responsibility is to design, develop, and maintain data-driven applications optimized for performance and scalability, building robust data pipelines, ETL processes, and APIs that integrate various data sources efficiently within the cloud environment.

In this role, you will work on AWS using serverless and microservices architectures, with services such as AWS Lambda, API Gateway, S3, DynamoDB, Kinesis, and other AWS tools as required. Collaboration with cross-functional teams is essential to deliver feature-rich applications that meet business requirements, and you will apply software design principles and best practices to ensure applications are maintainable, modular, and highly testable. Your tasks will also involve setting up monitoring solutions to proactively track application performance, detect anomalies, and resolve issues, as well as optimizing data applications for cost, performance, and reliability on AWS.

You should have at least 5 years of professional experience in data-focused application development using Python, with proficiency in core Python and data libraries such as Pandas, NumPy, and PySpark. A strong understanding of AWS services like ECS, Lambda, API Gateway, S3, DynamoDB, and Kinesis is required, along with experience building highly distributed and scalable solutions via serverless, microservice, and service-oriented architectures. Familiarity with unit-test frameworks, code quality tools, and CI/CD practices is expected, and knowledge of database management and ORM concepts, with experience in both relational (PostgreSQL, MySQL) and NoSQL (DynamoDB) databases, is desired. An understanding of the end-to-end software development lifecycle and Agile methodology, as well as AWS certification, would be advantageous. Strong problem-solving abilities, attention to detail, critical thinking, and excellent communication skills are necessary for effective collaboration with technical and non-technical teams, and you will mentor junior developers and contribute to a collaborative team environment.

This is a full-time position located in Bangalore with a hybrid work schedule. If you have proficiency in Pandas, NumPy, and PySpark, along with 5 years of experience in Python, we encourage you to apply and join our team dedicated to developing, optimizing, and deploying scalable data applications that support company growth and innovation.
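As a small illustration of the Pandas/NumPy work this role emphasizes, a minimal data-wrangling sketch; the input file and columns are hypothetical:

```python
import numpy as np
import pandas as pd

# Load raw events (hypothetical file and columns)
df = pd.read_csv("events.csv")

# Typical wrangling: type coercion, null handling, outlier clipping, derived features
df["ts"] = pd.to_datetime(df["ts"], errors="coerce")
df = df.dropna(subset=["ts", "user_id"])
df["amount"] = df["amount"].clip(lower=0, upper=df["amount"].quantile(0.99))
df["log_amount"] = np.log1p(df["amount"])

# Aggregate per user for downstream APIs or ML features
features = df.groupby("user_id").agg(
    events=("ts", "count"),
    total_amount=("amount", "sum"),
    avg_log_amount=("log_amount", "mean"),
)
print(features.head())
```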

Posted 1 week ago

Apply

4.0 - 6.0 years

6 - 10 Lacs

Tamil Nadu

Work from Office

Introduction to the Role:
Are you passionate about unlocking the power of data to drive innovation and transform business outcomes? Join our cutting-edge Data Engineering team and be a key player in delivering scalable, secure, and high-performing data solutions across the enterprise. As a Data Engineer, you will play a central role in designing and developing modern data pipelines and platforms that support data-driven decision-making and AI-powered products. With a focus on Python, SQL, AWS, PySpark, and Databricks, you'll enable the transformation of raw data into valuable insights by applying engineering best practices in a cloud-first environment. We are looking for a highly motivated professional who can work across teams to build and manage robust, efficient, and secure data ecosystems that support both analytical and operational workloads.

Accountabilities:
- Design, build, and optimize scalable data pipelines using PySpark, Databricks, and SQL on AWS cloud platforms.
- Collaborate with data analysts, data scientists, and business users to understand data requirements and ensure reliable, high-quality data delivery.
- Implement batch and streaming data ingestion frameworks from a variety of sources (structured, semi-structured, and unstructured data).
- Develop reusable, parameterized ETL/ELT components and data ingestion frameworks (as sketched after this listing).
- Perform data transformation, cleansing, validation, and enrichment using Python and PySpark.
- Build and maintain data models, data marts, and logical/physical data structures that support BI, analytics, and AI initiatives.
- Apply best practices in software engineering, version control (Git), code reviews, and agile development processes.
- Ensure data pipelines are well-tested and monitored, with proper logging and alerting mechanisms.
- Optimize performance of distributed data processing workflows and large datasets.
- Leverage AWS services (such as S3, Glue, Lambda, EMR, Redshift, Athena) for data orchestration and lakehouse architecture design.
- Participate in data governance practices and ensure compliance with data privacy, security, and quality standards.
- Contribute to documentation of processes, workflows, metadata, and lineage using tools such as Data Catalogs or Collibra (if applicable).
- Drive continuous improvement in engineering practices, tools, and automation to increase productivity and delivery quality.

Essential Skills / Experience:
- 4 to 6 years of professional experience in Data Engineering or a related field.
- Strong programming experience with Python, including data wrangling, pipeline automation, and scripting.
- Deep expertise in writing complex and optimized SQL queries on large-scale datasets.
- Solid hands-on experience with PySpark and distributed data processing frameworks.
- Expertise working with Databricks for developing and orchestrating data pipelines.
- Experience with AWS cloud services such as S3, Glue, EMR, Athena, Redshift, and Lambda.
- Practical understanding of ETL/ELT development patterns and data modeling principles (Star/Snowflake schemas).
- Experience with job orchestration tools like Airflow, Databricks Jobs, or AWS Step Functions.
- Understanding of data lake, lakehouse, and data warehouse architectures.
- Familiarity with DevOps and CI/CD tools for code deployment (e.g., Git, Jenkins, GitHub Actions).
- Strong troubleshooting and performance-optimization skills in large-scale data processing environments.
- Excellent communication and collaboration skills, with the ability to work in cross-functional agile teams.

Desirable Skills / Experience:
- AWS or Databricks certifications (e.g., AWS Certified Data Analytics, Databricks Data Engineer Associate/Professional).
- Exposure to data observability, monitoring, and alerting frameworks (e.g., Monte Carlo, Datadog, CloudWatch).
- Experience working in healthcare, life sciences, finance, or another regulated industry.
- Familiarity with data governance and compliance standards (GDPR, HIPAA, etc.).
- Knowledge of modern data architectures (Data Mesh, Data Fabric).
- Exposure to streaming data tools like Kafka, Kinesis, or Spark Structured Streaming.
- Experience with data visualization tools such as Power BI, Tableau, or QuickSight.
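A minimal PySpark sketch of the reusable, parameterized ETL/ELT component mentioned in the accountabilities above; the paths and column names are hypothetical:

```python
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F

def ingest_to_curated(
    spark: SparkSession,
    source_path: str,
    target_path: str,
    dedupe_keys: list[str],
    partition_col: str,
) -> DataFrame:
    """Reusable ELT step: read raw JSON, dedupe, validate, write curated Parquet."""
    df = spark.read.json(source_path)
    curated = (
        df.dropDuplicates(dedupe_keys)
          .filter(F.col(partition_col).isNotNull())
    )
    curated.write.mode("overwrite").partitionBy(partition_col).parquet(target_path)
    return curated

# Example invocation with hypothetical parameters; the same function serves any source
spark = SparkSession.builder.appName("curate_orders").getOrCreate()
ingest_to_curated(
    spark,
    source_path="s3://example-raw/orders/",
    target_path="s3://example-curated/orders/",
    dedupe_keys=["order_id"],
    partition_col="order_date",
)
```

Parameterizing the paths, keys, and partition column is what lets a single component be reused across datasets from an orchestrator such as Airflow or Databricks Jobs.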

Posted 1 week ago

Apply

12.0 - 20.0 years

35 - 60 Lacs

Bengaluru

Work from Office

Who We Are
At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.

The Role
Join the innovative team at Kyndryl as a Client Technical Solutioner and unlock your potential to shape the future of technology solutions. As a key player in our organization, you will embark on an exciting journey where you get to work closely with customers, understand their unique challenges, and provide them with cutting-edge technical solutions and services. Picture yourself as a trusted advisor, collaborating directly with customers to unravel their business needs, pain points, and technical requirements. Your expertise and deep understanding of our solutions will empower you to craft tailored solutions that address their specific challenges and drive their success.

Your role as a Client Technical Solutioner is pivotal in developing domain-specific solutions for our cutting-edge services and offerings. You will be at the forefront of crafting tailored domain solutions and cost cases for both simple and complex, long-term opportunities, demonstrating that we meet our customers' requirements while helping them overcome their business challenges. At Kyndryl, we believe in the power of collaboration, and your expertise will be essential in supporting our Technical Solutioning and Solutioning Managers during customer technology and business discussions, even at the highest levels of Business/IT Director/LOB. You will have the chance to demonstrate the value of our solutions and products, effectively communicating their business and technical benefits to decision makers and customers.

In this role, you will thrive as you create innovative technical solutions that align with industry trends and exceed customer expectations. Your ability to collaborate seamlessly with internal stakeholders will enable you to gather the necessary documents and technical insights to deliver compelling bid submissions. Not only will you define winning cost models for deals, but you will also lead these deals to profitability, ensuring the ultimate success of both our customers and Kyndryl. You will play an essential role in contract negotiations, up to the point of signature, and facilitate a smooth engagement hand-over process. As the primary source of engagement management and solution design within your technical domain, you will compile, refine, and take ownership of final solution documents, presenting them in a professional and concise manner that showcases your mastery of the subject matter.

You'll have the opportunity to contribute to the growth and success of Kyndryl by standardizing our go-to-market pitches across various industries. By creating differentiated propositions that align with market requirements, you will position Kyndryl as a leader in the industry, opening new avenues of success for our customers and our organization. Join us as a Client Technical Solutioner at Kyndryl and unleash your potential to shape the future of technical solutions while enjoying a stimulating and rewarding career journey filled with innovation, collaboration, and growth.

Your Future at Kyndryl
Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won't find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.

Who You Are
You're good at what you do and possess the required experience to prove it. However, equally as important, you have a growth mindset: keen to drive your own personal and professional development. You are customer-focused, someone who prioritizes customer success in their work. And finally, you're open and borderless, naturally inclusive in how you work with others.

Required Skills and Experience
- 10-15 years of experience (Specialist Seller / Consultant) is a must, with 3-4 years of relevant experience in Data.
- Hands-on experience with data platforms (DWH/data lake) such as Cloudera, Databricks, MS Data Fabric, Teradata, Apache Hadoop, BigQuery, AWS Big Data solutions (EMR, Redshift, Kinesis), Qlik, etc.
- Proven experience modernizing legacy data and applications and transforming them to cloud architectures.
- Strong understanding of data modelling and database design.
- Expertise in data integration and ETL processes.
- Knowledge of data warehousing and business intelligence concepts.
- Experience with data governance and data quality management.
- Good domain experience in the BFSI or Manufacturing area.
- Experience with cloud-based data platforms (e.g., AWS, Azure, GCP).
- Strong understanding of data integration techniques, including ETL (Extract, Transform, Load) processes, data pipelines, and data streaming using Python, Kafka for streams, PySpark, DBT, and ETL services.
- Understanding of and experience with data security principles, including data masking and encryption.
- Knowledge of data governance principles and practices, including data quality, data lineage, data privacy, and compliance.
- Knowledge of systems development, including the system development life cycle, project management approaches, and requirements, design, and testing techniques.
- Excellent communication skills to engage with clients and influence decisions.
- High level of competence in preparing architectural documentation and presentations.
- Organized, self-sufficient, and able to manage multiple initiatives simultaneously.
- Ability to coordinate with other teams and vendors independently.
- Deep knowledge of services offerings and technical solutions in a practice.
- Demonstrated experience translating distinctive technical knowledge into actionable customer insights and solutions.
- Prior consultative selling experience.
- Externally recognized as an expert in the technology and/or solutioning areas, including technical certifications supporting subdomain focus area(s).
- Responsible for prospecting and qualifying leads, and doing the relevant product/market research independently in response to a customer's requirement or pain point.
- Advising on and shaping client requirements to produce high-level designs and technical solutions in response to opportunities and requirements from customers and partners.
- Working with both internal and external stakeholders to identify business requirements and develop solutions to meet those requirements and build the opportunity.
- Understanding and analyzing the application requirements in client RFPs.
- Designing software applications based on the requirements, within specified architectural guidelines and constraints.
- Leading, designing, and implementing Proofs of Concept and pilots to demonstrate the solution to clients and prospects.

Being You
Diversity is a whole lot more than what we look like or where we come from; it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.

What You Can Expect
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees, and support you and your family through the moments that matter, wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.

Get Referred!
If you know someone that works at Kyndryl, when asked 'How Did You Hear About Us' during the application process, select 'Employee Referral' and enter your contact's Kyndryl email address.

Posted 1 week ago

Apply

4.0 - 8.0 years

14 - 24 Lacs

Hyderabad

Work from Office

Experience Required: Minimum 4.5+ years

Job Summary: We are seeking a skilled Data Engineer with a strong background in data ingestion, processing, and storage. The ideal candidate will have experience working with various data sources and technologies, particularly in a cloud environment. You will be responsible for designing and implementing data pipelines, ensuring data quality, and optimizing data storage solutions.

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines for data ingestion and processing using Python, Spark, and AWS services.
- Work with on-prem Oracle databases, batch files, and Confluent Kafka for data sourcing.
- Implement and manage ETL processes using AWS Glue and EMR for batch and streaming data.
- Develop and maintain data storage solutions using Medallion Architecture in S3, Redshift, and Oracle.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs.
- Monitor and optimize data workflows using Airflow and other orchestration tools (a sketch follows this listing).
- Ensure data quality and integrity throughout the data lifecycle.
- Implement CI/CD practices for data pipeline deployment using Terraform and other tools.
- Utilize monitoring and logging tools such as CloudWatch, Datadog, and Splunk to ensure system reliability and performance.
- Communicate effectively with stakeholders to gather requirements and provide updates on project status.

Technical Skills Required:
- Proficient in Python for data processing and automation.
- Strong experience with Apache Spark for large-scale data processing.
- Familiarity with AWS S3 for data storage and management.
- Experience with Kafka for real-time data streaming.
- Knowledge of Redshift for data warehousing solutions.
- Proficient in Oracle databases for data management.
- Experience with AWS Glue for ETL processes.
- Familiarity with Apache Airflow for workflow orchestration.
- Experience with EMR for big data processing.

Mandatory: Strong AWS data engineering skills.
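To illustrate the Airflow orchestration responsibility noted above, a minimal DAG sketch (Airflow 2.4+ style); the DAG ID and task bodies are placeholders:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    # Placeholder: in the real pipeline this would pull a batch from Oracle/Kafka
    return ["record-1", "record-2"]

def load(**context):
    records = context["ti"].xcom_pull(task_ids="extract")
    print(f"loading {len(records)} records to Redshift")  # placeholder load step

with DAG(
    dag_id="oracle_to_redshift",          # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # dependency: extract before load
```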

Posted 1 week ago

Apply

6.0 - 10.0 years

8 - 12 Lacs

Hyderabad

Work from Office

Responsibilities:
- Design, build, and optimize data pipelines to ingest, process, transform, and load data from various sources into our data platform.
- Implement and maintain ETL workflows using tools like Debezium, Kafka, Airflow, and Jenkins to ensure reliable and timely data processing (a sketch of handling Debezium change events follows this listing).
- Develop and optimize SQL and NoSQL database schemas, queries, and stored procedures for efficient data retrieval and processing.
- Work with both relational databases (MySQL, PostgreSQL) and NoSQL databases (MongoDB, DocumentDB) to build scalable data solutions.
- Design and implement data warehouse solutions that support analytical needs and machine learning applications.
- Collaborate with data scientists and ML engineers to prepare data for AI/ML models and implement data-driven features.
- Implement data quality checks, monitoring, and alerting to ensure data accuracy and reliability.
- Optimize query performance across various database systems through indexing, partitioning, and query refactoring.
- Develop and maintain documentation for data models, pipelines, and processes.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs.
- Stay current with emerging technologies and best practices in data engineering.

Requirements:
- 5+ years of experience in data engineering or related roles with a proven track record of building data pipelines and infrastructure.
- Strong proficiency in SQL and experience with relational databases like MySQL and PostgreSQL.
- Hands-on experience with NoSQL databases such as MongoDB or AWS DocumentDB.
- Expertise in designing, implementing, and optimizing ETL processes using tools like Kafka, Debezium, Airflow, or similar technologies.
- Experience with data warehousing concepts and technologies.
- Solid understanding of data modeling principles and best practices for both operational and analytical systems.
- Proven ability to optimize database performance, including query optimization, indexing strategies, and database tuning.
- Experience with AWS data services such as RDS, Redshift, S3, Glue, Kinesis, and the ELK stack.
- Proficiency in at least one programming language (Python, Node.js, Java).
- Experience with version control systems (Git) and CI/CD pipelines.

Nice to Have:
- Experience with graph databases (Neo4j, Amazon Neptune).
- Knowledge of big data technologies such as Hadoop, Spark, Hive, and data lake architectures.
- Experience working with streaming data technologies and real-time data processing.
- Familiarity with data governance and data security best practices.
- Experience with containerization technologies (Docker, Kubernetes).
- Understanding of financial back-office operations and the FinTech domain.
- Experience working in a high-growth startup environment.
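A rough sketch of consuming Debezium change events from Kafka, as the ETL workflows above describe; the topic name and downstream load steps are hypothetical, and the kafka-python client is an assumption:

```python
import json

from kafka import KafkaConsumer  # kafka-python; an assumption for this sketch

# Debezium publishes one change-event topic per table; topic name is hypothetical
consumer = KafkaConsumer(
    "dbserver1.inventory.customers",
    bootstrap_servers=["broker:9092"],
    value_deserializer=lambda b: json.loads(b.decode("utf-8")) if b else None,
)

for message in consumer:
    event = message.value
    if event is None:              # tombstone record emitted after a delete
        continue
    payload = event.get("payload", {})
    op = payload.get("op")         # "c" = insert, "u" = update, "d" = delete
    if op in ("c", "u"):
        row = payload["after"]
        print(f"upsert into warehouse: {row}")       # placeholder for the real load
    elif op == "d":
        print(f"delete from warehouse: {payload['before']}")
```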

Posted 1 week ago

Apply

10.0 - 20.0 years

30 - 45 Lacs

Bengaluru

Work from Office

Preferred candidate profile:
- Strong proficiency in GoLang
- Hands-on experience with AWS serverless offerings (Lambda, API Gateway, SQS, Kinesis, S3, DynamoDB, CloudFormation, etc.)
- Built and deployed cloud-based SaaS
- Solid grasp of AWS security (IAM, VPC, KMS)
- Familiar with modern software engineering practices (CI/CD, code reviews, testing)
- Strong communicator who thrives in cross-functional teams

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

As a Data Platform Engineer Lead at Barclays, your role is crucial in building and maintaining the systems that collect, store, process, and analyze data, including data pipelines, data warehouses, and data lakes. Your responsibility includes ensuring the accuracy, accessibility, and security of all data.

To excel in this role, you should have hands-on coding experience in Java or Python and a strong understanding of AWS development, encompassing services such as Lambda, Glue, Step Functions, and IAM roles. Proficiency in building efficient data pipelines using Apache Spark and AWS services is essential. You are expected to possess strong technical acumen, troubleshoot complex systems, and apply sound engineering principles to problem-solving; continuous learning and staying updated with new technologies are key attributes for success. Design experience on diverse projects where you have led the technical development is advantageous, especially in the Big Data/Data Warehouse domain within financial services. Additional skills in developing enterprise-level software solutions, knowledge of file formats like JSON, Iceberg, and Avro, and familiarity with streaming services such as Kafka, MSK, and Kinesis are highly valued. Effective communication, collaboration with cross-functional teams, documentation skills, and experience mentoring team members are also important aspects of this role.

Your accountabilities will include constructing and maintaining data architectures and pipelines, designing and implementing data warehouses and data lakes, developing processing and analysis algorithms, and collaborating with data scientists to deploy machine learning models. You will also contribute to strategy, drive requirements for change, manage resources and policies, deliver continuous improvements, and demonstrate leadership behaviors. Ultimately, as a Data Platform Engineer Lead at Barclays in Pune, you will play a pivotal role in ensuring data accuracy, accessibility, and security while leveraging your technical expertise and collaborative skills to drive innovation and excellence in data management.

Posted 1 week ago

Apply

7.0 - 11.0 years

0 Lacs

maharashtra

On-site

As a Solutions Architect with over 7 years of experience, you will leverage your expertise in cloud data solutions to architect scalable, modern solutions on AWS. In this role at Quantiphi, you will be a key member of our high-impact engineering teams, working closely with clients to solve complex data challenges and design cutting-edge data analytics solutions.

Your responsibilities will include acting as a trusted advisor to clients, leading discovery/design workshops with global customers, and collaborating with AWS subject matter experts to develop compelling proposals and Statements of Work (SOWs). You will also represent Quantiphi in forums such as tech talks, webinars, and client presentations, providing strategic insights and solutioning support during pre-sales activities.

To excel in this role, you should have a strong background in AWS Data Services including DMS, SCT, Redshift, Glue, Lambda, EMR, and Kinesis. Your experience in data migration and modernization, particularly from Oracle, Teradata, and Netezza to AWS, will be crucial. Hands-on experience with ETL tools such as SSIS, Informatica, and Talend, as well as a solid understanding of OLTP/OLAP, Star and Snowflake schemas, and data modeling methodologies, is essential for success in this position.

Additionally, familiarity with backend development using Python, APIs, and stream-processing technologies like Kafka, along with knowledge of distributed computing concepts including Hadoop and MapReduce, will be beneficial. A DevOps mindset, with experience in CI/CD practices and Infrastructure as Code, is also desired.

Joining Quantiphi as a Solutions Architect is more than just a job; it's an opportunity to shape digital transformation journeys and influence business strategies across various industries. If you are a cloud data enthusiast looking to make a significant impact in the field of data analytics, this role is perfect for you.
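As a small, hedged illustration of the migration tooling named above: starting an AWS DMS replication task from Python. The task ARN is hypothetical, and the replication instance, endpoints, and task are assumed to already exist (for example, provisioned via Infrastructure as Code).

# Kick off a full-load-plus-CDC DMS task (hypothetical ARN).
import boto3

dms = boto3.client("dms", region_name="us-east-1")

response = dms.start_replication_task(
    ReplicationTaskArn="arn:aws:dms:us-east-1:123456789012:task:EXAMPLE",
    StartReplicationTaskType="start-replication",  # first run: full load + CDC
)
print(response["ReplicationTask"]["Status"])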

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

noida, uttar pradesh

On-site

You will be responsible for building the most personalized and intelligent news experiences for India's next 750 million digital users. As our Principal Data Engineer, your main task will be designing and maintaining the data infrastructure that powers personalization systems and analytics platforms. This involves ensuring seamless data flow from source to consumption, architecting scalable data pipelines to process massive volumes of user interaction and content data, and developing robust ETL processes for large-scale transformations and analytical processing.

You will also create and maintain data lakes/warehouses that consolidate data from multiple sources, optimized for ML model consumption and business intelligence. Additionally, you will implement data governance practices and collaborate with the ML team to ensure the right data is available for recommendation systems.

To excel in this role, you should have a Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related field, along with 8-12 years of data engineering experience, including at least 3 years in a senior role. You must possess expert-level SQL skills and strong experience in the Apache Spark ecosystem (Spark SQL, Streaming, SparkML), as well as proficiency in Python/Scala. Experience with the AWS data ecosystem (Redshift, S3, Glue, EMR, Kinesis, Lambda, Athena) and ETL frameworks (Glue, Airflow) is essential. A proven track record of building large-scale data pipelines in production environments, particularly in high-traffic digital media, will be advantageous. Excellent communication skills are also required, as you will need to collaborate effectively across teams in a fast-paced environment that demands engineering agility.
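For a concrete flavour of the Glue-based ETL mentioned above, a minimal sketch of a Glue job script, assuming the Glue 4.0 runtime; the catalog database, table, and bucket are hypothetical.

# Glue ETL sketch: catalog table -> Parquet in the data lake.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read user-interaction events registered in the Glue Data Catalog.
events = glue_context.create_dynamic_frame.from_catalog(
    database="analytics", table_name="user_events"
)

# Land them as Parquet, ready for ML feature pipelines and BI.
glue_context.write_dynamic_frame.from_options(
    frame=events,
    connection_type="s3",
    connection_options={"path": "s3://example-lake/curated/user_events/"},
    format="parquet",
)
job.commit()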

Posted 2 weeks ago

Apply

3.0 - 6.0 years

5 - 9 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Experience required: 3-6 years
Tech stack: Java 17/21; AWS hands-on in services like ECS, Lambda, SQS, IoT Core, Kinesis, ECR, S3, Secrets Manager, and CloudFormation templates; JUnit, Cucumber, Python 3.9 and above, CI/CD
Soft skills: Good team collaborator, quick learner
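A minimal sketch of the SQS consume loop from the stack above, written in Python to keep this page's sketches in one language (the role itself is Java 17/21); the queue URL and region are hypothetical.

# Long-poll a queue, process messages, delete only on success.
import boto3

sqs = boto3.client("sqs", region_name="ap-south-1")
queue_url = "https://sqs.ap-south-1.amazonaws.com/123456789012/example-queue"

messages = sqs.receive_message(
    QueueUrl=queue_url,
    MaxNumberOfMessages=10,
    WaitTimeSeconds=20,  # long polling reduces empty responses
).get("Messages", [])

for message in messages:
    print(message["Body"])  # stand-in for real processing
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=message["ReceiptHandle"])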

Posted 2 weeks ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Bengaluru, Karnataka

Work from Office

Data Governance

The team is the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform. If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you.

Your day-to-day role will include:
- Driving business decisions with technical input and leading the team
- Designing, implementing, and supporting a data infrastructure from scratch
- Managing AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA
- Extracting, transforming, and loading data from various sources using SQL and AWS big data technologies (a small data-quality sketch follows this listing)
- Exploring and learning the latest AWS technologies to enhance capabilities and efficiency
- Collaborating with data scientists and BI engineers to adopt best practices in reporting and analysis
- Improving ongoing reporting and analysis processes, automating or simplifying self-service support for customers
- Building data platforms, data pipelines, and data management and governance tools

BASIC QUALIFICATIONS (Data Engineer):
- Bachelor's degree in Computer Science, Engineering, or a related field
- 3-5 years of experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

PREFERRED QUALIFICATIONS:
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices for the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modeling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills

For managers:
- Customer centricity and obsession for the customer
- Ability to manage stakeholders (product owners, business stakeholders, cross-functional teams) and coach agile ways of working
- Ability to structure and organize teams and streamline communication
- Prior work experience executing large-scale data engineering projects
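As referenced in the listing above, a minimal sketch of a data-quality gate of the kind a governance pipeline might run before publishing a dataset; the threshold, path, and column names are hypothetical.

# Fail the publish step if a key column's null rate breaches the threshold.
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F


def null_rate(df: DataFrame, column: str) -> float:
    """Fraction of rows where `column` is null."""
    total = df.count()
    if total == 0:
        return 0.0
    return df.filter(F.col(column).isNull()).count() / total


spark = SparkSession.builder.appName("dq-check").getOrCreate()
accounts = spark.read.parquet("s3://example-lake/curated/accounts/")

rate = null_rate(accounts, "customer_id")
assert rate < 0.01, f"customer_id null rate {rate:.2%} breaches the 1% threshold"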

Posted 2 weeks ago

Apply