5.0 - 9.0 years
0 Lacs
noida, uttar pradesh
On-site
Role Overview: You will be joining the Platform team at BOLD as a Senior Software Engineer, where you will be responsible for designing, developing, and optimizing enterprise-grade applications using Java and Spring Boot. Your role will involve collaborating with product managers, DevOps, and QA teams to ensure high-quality solutions are delivered. Additionally, you will be mentoring and guiding junior engineers while troubleshooting production issues and implementing long-term fixes.
Key Responsibilities:
- Design, develop, and maintain enterprise-grade applications using Java and Spring Boot.
- Build and optimize APIs (REST, gRPC, and GraphQL) for high performance and scalability.
- Ensure system reliability, scalability, and security in production environments.
- Collaborate with cross-functional teams for delivering high-quality solutions.
- Mentor and guide junior engineers, sharing best practices in coding, testing, and cloud development.
- Troubleshoot production issues, perform root cause analysis, and implement long-term fixes.
Qualifications Required:
- 5-8 years of professional software engineering experience.
- Strong expertise in Java and Spring Boot.
- Hands-on experience with event-driven architectures using SQS, SNS, and Kinesis.
- Proficiency in GraphQL API development.
- Strong background in AWS services such as Lambda, ECS, EKS, Auto Scaling, and API Gateway.
- Solid understanding of containerization and orchestration with Docker and Kubernetes.
- Experience with CI/CD pipelines and automation frameworks.
- Knowledge of microservices architecture and distributed system design.
- Excellent debugging, problem-solving, and performance optimization skills.
Additional Details: BOLD is an established global organization that helps people find jobs by creating digital products that empower individuals to build stronger resumes, cover letters, and CVs. The company celebrates diversity and promotes inclusion, with a focus on professional fulfillment and growth. Employees at BOLD are encouraged to be experts, learners, contributors, and creatives in a supportive and inclusive environment.
Posted 5 days ago
7.0 - 12.0 years
25 - 35 Lacs
ahmedabad
Remote
Data Architecture Design: Develop and maintain a comprehensive data architecture strategy that aligns with the business objectives and technology landscape.
Data Modeling: Create and manage logical, physical, and conceptual data models to support various business applications and analytics.
Database Design: Design and implement database solutions, including data warehouses, data lakes, and operational databases.
Data Integration: Oversee the integration of data from disparate sources into unified, accessible systems using ETL/ELT processes.
Data Governance: Implement and enforce data governance policies and procedures to ensure data quality, consistency, and security.
Technology Evaluation: Evaluate and recommend data management tools, technologies, and best practices to improve data infrastructure and processes.
Collaboration: Work closely with data engineers, data scientists, business analysts, and other stakeholders to understand data requirements and deliver effective solutions.
Documentation: Create and maintain documentation related to data architecture, data flows, data dictionaries, and system interfaces.
Performance Tuning: Optimize database performance through tuning, indexing, and query optimization.
Security: Ensure data security and privacy by implementing best practices for data encryption, access controls, and compliance with relevant regulations (e.g., GDPR, CCPA).
Requirements:
Helping project teams with solutions architecture, troubleshooting, and technical implementation assistance.
Proficiency in SQL and database management systems (e.g., MySQL, PostgreSQL, Oracle, SQL Server).
Minimum 7 to 15 years of experience in data architecture or related roles.
Experience with big data technologies (e.g., Hadoop, Spark, Kafka, Airflow).
Expertise with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services.
Knowledge of data integration tools (e.g., Informatica, Talend, Fivetran, Meltano).
Understanding of data warehousing concepts and tools (e.g., Snowflake, Redshift, Synapse, BigQuery).
Experience with data governance frameworks and tools.
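For illustration only, here is a minimal sketch of the ETL/ELT pattern this role references, using just the Python standard library: extract rows from a CSV file, apply a small transformation, and load them into a SQLite table. The file name, table name, and columns are hypothetical placeholders, not part of the posting.

```python
# Minimal ETL sketch: extract rows from a CSV file, apply a light
# transformation, and load them into a SQLite table. File name, table
# name, and column layout are illustrative assumptions only.
import csv
import sqlite3

def run_etl(csv_path: str = "customers.csv", db_path: str = "warehouse.db") -> int:
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS customers (id INTEGER PRIMARY KEY, name TEXT, country TEXT)"
    )
    loaded = 0
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            # Transform: trim names and normalise country codes to upper case before loading.
            conn.execute(
                "INSERT OR REPLACE INTO customers (id, name, country) VALUES (?, ?, ?)",
                (int(row["id"]), row["name"].strip(), row["country"].upper()),
            )
            loaded += 1
    conn.commit()
    conn.close()
    return loaded

if __name__ == "__main__":
    print(f"Loaded {run_etl()} rows")
```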
Posted 1 week ago
8.0 - 13.0 years
18 - 22 Lacs
pune
Hybrid
Full-stack Java software engineer with 8+ years of experience, strong in Core Java and Collections.
- Preferred experience with Java 8 features such as Lambda Expressions and Streams.
- Extensive experience with the Spring Framework (Core / Boot / Integration).
- Good knowledge of the design patterns applicable to data streaming.
- Experience with Apache Flink/Apache Kafka and the ELK stack (Elasticsearch, Logstash & Kibana) is highly desirable.
- Experience with Flowable or similar BPMN/CMMN tooling is also highly desirable.
- Knowledge of front-end technologies like Angular / JavaScript / React / Redux is also applicable.
- Familiarity with CI / CD (TeamCity / Jenkins) and Git / GitHub / GitLab.
- Familiarity with Docker/containerization technologies.
- Familiarity with Microsoft Azure.
- Proven track record in an agile SDLC in a large-scale enterprise environment.
- Knowledge of post-trade processing in large financial institutions is an added bonus!
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
We are looking for a skilled and experienced Microsoft Fabric Engineer to join the data engineering team. Your main responsibilities will include designing, developing, and maintaining data solutions using Microsoft Fabric. This will involve working across key workloads such as Data Engineering, Data Factory, Data Science, Real-Time Analytics, and Power BI.
In this role, you will need to have a deep understanding of Synapse Data Warehouse, OneLake, Notebooks, Lakehouse architecture, and Power BI integration within the Microsoft ecosystem. Some of your key responsibilities will include designing and implementing scalable and secure data solutions, building and maintaining data pipelines using Dataflows Gen2 and Data Factory, working with Lakehouse architecture, and managing datasets in OneLake. You will also be responsible for developing and optimizing notebooks (PySpark or T-SQL) for data transformation and processing, collaborating with data analysts and business users to create interactive dashboards and reports using Power BI (within Fabric), leveraging Synapse Data Warehouse and KQL databases for structured and real-time analytics, monitoring and optimizing performance of data pipelines and queries, and ensuring data quality, security, and governance practices are adhered to.
To excel in this role, you should have at least 3 years of hands-on experience with Microsoft Fabric or similar tools in the Microsoft data stack. You must be proficient in tools such as Data Factory (Fabric), Synapse Data Warehouse/SQL Analytics Endpoints, Power BI integration, and DAX, as well as have a solid understanding of data modeling, ETL/ELT processes, and real-time data streaming. Experience with KQL (Kusto Query Language) is a plus, and familiarity with Microsoft Purview, Azure Data Lake, or Azure Synapse Analytics is advantageous.
Overall, as a Microsoft Fabric Engineer, you will play a crucial role in designing, developing, and maintaining data solutions using Microsoft Fabric, collaborating with various teams to ensure data quality and security, and staying current with Microsoft Fabric updates and best practices to recommend enhancements. Please note that the qualifications required for this role include proficiency in Microsoft Fabric, OneLake, Data Factory, Data Lake, and DataMesh.
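As a rough illustration of the notebook-based PySpark transformation work this posting describes, here is a hedged sketch: it assumes a hypothetical Lakehouse Delta table of orders, standardises a couple of columns, and writes a cleaned table. The paths and column names are invented for the example, and in a Fabric notebook the Spark session is normally pre-provisioned rather than created by hand.

```python
# Notebook-style PySpark transformation sketch: read raw records, deduplicate,
# standardise a column, and write a Delta table. Table paths and columns are
# hypothetical; the session is created here only to keep the example self-contained.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("fabric-transform-sketch").getOrCreate()

raw = spark.read.format("delta").load("Tables/raw_orders")   # hypothetical Lakehouse path
cleaned = (
    raw.dropDuplicates(["order_id"])                          # remove repeated order events
       .withColumn("country", F.upper(F.col("country")))      # standardise country codes
       .filter(F.col("amount") > 0)                           # drop invalid amounts
)
cleaned.write.format("delta").mode("overwrite").save("Tables/clean_orders")
```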
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
As a Senior Pega Decisioning Architect at Axioms Consulting Ltd & CollabPartnerz, you will play a crucial role in leading the design and implementation of complex, enterprise-scale Pega Decisioning solutions. Your responsibilities will involve mentoring junior team members, driving technical excellence, and collaborating with stakeholders to deliver innovative solutions that cater to critical business needs. This position offers you the chance to make a significant impact on our clients' success while further honing your expertise in Pega Decisioning.
With 6+ years of Pega development experience, focusing primarily on Decisioning, and 3+ years of CDH experience, you will be expected to have a solid grasp of data integration and SQL. Your analytical and problem-solving skills, along with strong communication abilities and the capacity to work both independently and as part of a team, will be essential in this role. A Bachelor's degree or equivalent experience is required, while CPDC, CPDS, and CSSA certifications are preferred qualifications. Your expertise should encompass integrating CDH with channels, real-time data ingestion, connections, and management, as well as API integrations, CDH upgrades, troubleshooting, debugging, and resolution. Experience with newer versions of CDH Infinity 23/24.x would be advantageous.
Key responsibilities will include leading the design and configuration of complex decisioning architectures within Pega CDH, developing reusable components and frameworks, and designing and implementing advanced decision strategies incorporating predictive analytics and machine learning models. You will also lead the design and implementation of complex data integration solutions for decisioning, collaborate with data architects, define data models and governance policies, and optimize data integration performance.
Moreover, you will be responsible for leading requirements gathering sessions, developing comprehensive functional and non-functional requirements, conducting gap analysis, and providing process improvement recommendations. Testing efforts for complex decisioning solutions, including performance testing, load testing, and user acceptance testing (UAT), will be under your purview. Additionally, establishing and enforcing Pega coding standards and best practices, mentoring junior and mid-level architects, and promoting a culture of technical excellence and continuous improvement within the team will be crucial aspects of the role. Your adaptability to complex project environments, ability to identify and address technical challenges, and skills in project management, planning, and estimation for complex decisioning implementations will be vital in ensuring the successful delivery of high-quality solutions.
Posted 1 week ago
5.0 - 10.0 years
15 - 30 Lacs
pune, chennai
Hybrid
Key Responsibilities:
- 5 to 10 years of experience designing and building data pipelines using Apache Spark, Databricks, or equivalent big data frameworks.
- Hands-on expertise with streaming and messaging systems such as Apache Kafka (publish-subscribe architecture), Confluent Cloud, RabbitMQ, or Azure Event Hub. Experience creating producers, consumers, and topics and integrating them into downstream processing.
- Deep understanding of relational databases and CDC. Proficiency in SQL Server, Oracle, or other RDBMSs; experience capturing change events using Debezium or native CDC tools and transforming them for downstream consumption.
- Implement CDC and deduplication logic. Capture change events from source databases using Debezium, built-in CDC features of SQL Server/Oracle, or other connectors. Apply watermarking and drop-duplicate strategies based on primary keys and event timestamps (a minimal sketch follows this posting).
- Proficiency in programming languages such as Python, Scala, or Java and solid knowledge of SQL for data manipulation and transformation.
- Cloud platform expertise: experience with Azure or AWS services for data storage, compute, and orchestration (e.g., ADLS, S3, Azure Data Factory, AWS Glue, Airflow, DBX, DLT).
- Data modelling and warehousing: knowledge of data Lakehouse architectures, Delta Lake, partitioning strategies, and performance optimisation.
- Version control and DevOps: familiarity with Git and CI/CD pipelines; ability to automate deployment and manage infrastructure as code.
- Strong problem-solving and communication skills: ability to work with cross-functional teams and articulate complex technical concepts to non-technical stakeholders.
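To make the watermarking and drop-duplicate strategy above concrete, here is a minimal PySpark Structured Streaming sketch: it reads CDC events from a Kafka topic, deduplicates on the primary key and event timestamp within a watermark, and writes the result to Delta. The broker address, topic, schema, and paths are assumptions for illustration only, not part of the posting.

```python
# Structured Streaming sketch of the watermark + drop-duplicates pattern:
# read CDC events from Kafka, deduplicate per key within a watermark window,
# and write to a Delta table. Names and paths are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("cdc-dedup-sketch").getOrCreate()

schema = StructType([
    StructField("customer_id", StringType()),
    StructField("op", StringType()),            # c / u / d emitted by the CDC connector
    StructField("event_ts", TimestampType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "cdc.customers")
    .load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

deduped = (
    events.withWatermark("event_ts", "10 minutes")        # bound state for late events
          .dropDuplicates(["customer_id", "event_ts"])    # keep one row per key/timestamp
)

query = (
    deduped.writeStream.format("delta")
    .option("checkpointLocation", "/chk/cdc_customers")
    .start("/tables/customers_changes")
)
```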
Posted 1 week ago
4.0 - 8.0 years
9 - 15 Lacs
hyderabad
Hybrid
We are hiring for a Data Engineer role for an investment banking and financial services company. This is a permanent opportunity for the Hyderabad location. If you're interested in this opportunity, please share your updated profile along with your updated resume at subnayak@allegisglobalsolutions.com.
Expertise in big data technologies.
Proficient in AWS Glue, DBT, Data Streaming.
Hands-on experience building, implementing, and enhancing enterprise-scale data platforms.
Proficiency in big data with expertise in Spark, Python, Hive, SQL, Presto, storage formats like Parquet, and orchestration tools such as Apache Airflow.
Knowledgeable in cloud environments (preferably AWS), with an understanding of EC2, S3, Linux, Docker, and Kubernetes.
ETL Tools: Proficient in Talend, Apache Airflow, dbt, and Informatica.
Data Warehousing: Experience with Amazon Redshift and Athena.
Posted 1 week ago
8.0 - 13.0 years
25 - 40 Lacs
chennai
Work from Office
Roles and Responsibilities:
- Working with clients to understand their data and, based on that understanding, building the data structures and pipelines.
- Working on the application end to end, collaborating with UI and other development teams.
- Working with various cloud providers such as Azure & AWS.
- Engineering data using the Hadoop/Spark ecosystem.
- Designing, building, optimizing, and supporting new and existing data pipelines.
- Orchestrating jobs using various tools such as Oozie, Airflow, etc.
- Developing programs for cleaning and processing data.
- Building the data pipelines to migrate and load the data into HDFS, either on-prem or in the cloud.
- Developing data ingestion/processing/integration pipelines effectively.
- Creating Hive data structures and metadata, and loading the data into data lakes / big data warehouse environments.
- Optimizing (performance tuning) data pipelines effectively to minimize cost.
- Keeping code version control and the Git repository up to date.
- Explaining the data pipeline to internal and external stakeholders.
- Building and maintaining CI/CD for the data pipelines.
Preferred Qualifications:
- Bachelor's degree in computer science or a related field.
- Minimum of 5+ years of working experience with the Spark and Hadoop ecosystems.
- Minimum of 4+ years of working experience designing data streaming pipelines.
- Minimum of 3+ years of experience with NoSQL and Spark Streaming.
- Proven experience with big data ecosystem tools such as Sqoop, Spark, SQL, API, Hive, Oozie, Airflow, etc.
- Solid experience in all phases of the SDLC (plan, design, develop, test, release, maintain, and support) with 10+ years of experience.
- Hands-on experience using Azure's data engineering stack.
- Should have implemented projects using programming languages such as Scala or Python.
- Working experience with complex SQL data merging techniques such as windowing functions.
- Hands-on experience with on-prem distributions such as Cloudera/Hortonworks/MapR.
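As a small illustration of the job orchestration mentioned above, here is a hedged Airflow 2.x sketch of a three-stage daily pipeline. The DAG id, task commands, and schedule are placeholder assumptions, not a specific client pipeline.

```python
# Minimal Airflow DAG sketch (Airflow 2.4+): ingest, clean, then load, once a day.
# All ids, commands, and the schedule are illustrative assumptions.
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_ingest_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest = BashOperator(task_id="ingest", bash_command="spark-submit ingest.py")
    clean = BashOperator(task_id="clean", bash_command="spark-submit clean.py")
    load_hive = BashOperator(task_id="load_hive", bash_command="spark-submit load_hive.py")

    ingest >> clean >> load_hive   # run the stages strictly in order
```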
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
noida, uttar pradesh
On-site
We are searching for a highly skilled and seasoned Senior ETL & Data Streaming Engineer with over 10 years of experience to take on a crucial role in the design, development, and maintenance of our robust data pipelines. The ideal candidate will possess in-depth expertise in batch ETL processes as well as real-time data streaming technologies, along with extensive hands-on experience with AWS data services. A proven track record of working with Data Lake architectures and traditional Data Warehousing environments is a must.
Your responsibilities will include designing, developing, and implementing highly scalable, fault-tolerant, and performant ETL processes using leading ETL tools to extract, transform, and load data from diverse source systems into our Data Lake and Data Warehouse. You will also be tasked with architecting and constructing batch and real-time data streaming solutions using technologies like Talend, Informatica, Apache Kafka, or AWS Kinesis to facilitate immediate data ingestion and processing requirements. Furthermore, you will need to leverage and optimize various AWS data services such as AWS S3, AWS Glue, AWS Redshift, AWS Lake Formation, AWS EMR, and others to develop and manage data pipelines.
Collaboration with data architects, data scientists, and business stakeholders to comprehend data requirements and translate them into efficient data pipeline solutions is a key aspect of the role. It will also be essential for you to ensure data quality, integrity, and security across all data pipelines and storage solutions, as well as monitor, troubleshoot, and optimize existing data pipelines for performance, cost-efficiency, and reliability. Additionally, you will be responsible for developing and maintaining comprehensive documentation for all ETL and streaming processes, data flows, and architectural designs, and implementing data governance policies and best practices within the Data Lake and Data Warehouse environments. As a mentor to junior engineers, you will contribute to fostering a culture of technical excellence and continuous improvement. Staying updated on emerging technologies and industry best practices in data engineering, ETL, and streaming will also be expected.
Required Qualifications:
- 10+ years of progressive experience in data engineering, focusing on ETL, ELT, and data pipeline development.
- Extensive hands-on experience with commercial or open-source ETL tools (Talend).
- Proven experience with real-time data ingestion and processing using platforms such as AWS Glue, Apache Kafka, AWS Kinesis, or similar.
- Proficiency with AWS S3, AWS Glue, AWS Redshift, AWS Lake Formation, and potentially AWS EMR.
- Strong background in traditional data warehousing concepts, dimensional modeling, and DWH design principles.
- Proficient in SQL and at least one scripting language (e.g., Python, Scala) for data manipulation and automation.
- Strong understanding of relational databases and NoSQL databases.
- Experience with version control systems (e.g., Git).
- Excellent analytical and problem-solving skills with attention to detail.
- Strong verbal and written communication skills for conveying complex technical concepts to diverse audiences.
Preferred Qualifications:
- Certifications in AWS Data Analytics or related areas.
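For a concrete flavour of the real-time ingestion this role describes, here is a minimal sketch that publishes a JSON event to an AWS Kinesis stream with boto3. The stream name, region, and event fields are hypothetical, and valid AWS credentials are assumed.

```python
# Hedged sketch of real-time ingestion into AWS Kinesis with boto3.
# Stream name, region, and payload fields are illustrative assumptions.
import json
import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

def publish_event(event: dict, stream_name: str = "orders-stream") -> str:
    """Send one JSON event; the partition key keeps a customer's events ordered."""
    response = kinesis.put_record(
        StreamName=stream_name,
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=str(event["customer_id"]),
    )
    return response["SequenceNumber"]

if __name__ == "__main__":
    seq = publish_event({"customer_id": 42, "amount": 19.99, "type": "purchase"})
    print("Published with sequence number", seq)
```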
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
Bluevine is revolutionizing small business banking by offering innovative solutions like checking, lending, and credit tailored to support entrepreneurs in their growth journey. With top-notch technology, advanced security measures, and a profound understanding of the small business community, Bluevine empowers entrepreneurs to expand their ventures confidently. Backed by prominent investors including Lightspeed Venture Partners, Menlo Ventures, 83North, and Citi Ventures, Bluevine has been dedicated to assisting small and medium-sized businesses since 2013. Having served over 500,000 customers nationwide, Bluevine has built a dynamic global team of 500 individuals. The ultimate mission is to equip small businesses with the financial resources necessary for success. Joining Bluevine means becoming a part of a collaborative, fast-paced team that is reshaping the future of banking. Are you ready to create an impact?
This is a hybrid role at Bluevine, emphasizing the importance of a collaborative culture upheld through in-person interactions and a lively office environment. Following local guidelines, all Bluevine offices have reopened and are operating on a hybrid model, with in-office days determined by location and role.
**Responsibilities:**
- Managing cloud-based databases on AWS.
- Monitoring database infrastructure for availability, reliability, and performance.
- Detecting, troubleshooting, and resolving incidents in real-time.
- Designing scalable data architectures aligned with business requirements.
- Supporting ETL processes and data streaming (CDC) in AWS.
- Optimizing database performance and minimizing downtime.
- Collaborating with IL and US teams on daily operations.
- Working with DevOps, NOC, and R&D teams on procedures and workbooks.
- Performing routine maintenance tasks such as upgrades, user management, and backups.
**Requirements:**
- 5+ years of expertise in relational databases (PostgreSQL), Datawarehouse DBs, and data lakes on AWS Cloud.
- Strong understanding of Data streaming and/or Change Data Capture (CDC) methodologies and implementation.
- Hands-on experience with observability tools like Airflow, Prometheus, Grafana, OpenSearch, and New Relic.
- Proficiency in scripting and automation using Python.
- Understanding of Graph DBs.
- Familiarity with CI/CD Pipelines, AWS Lambda, and Terraform (not mandatory).
- Exceptional troubleshooting skills with a proactive problem-solving approach.
- Strong communication and collaboration skills to engage effectively across global teams.
**Benefits & Perks:**
- Excellent group health coverage and life insurance.
- Stock options.
- Hybrid work model.
- Meal allowance.
- Transportation assistance (terms and conditions apply).
- Generous paid time off plan, including holidays.
- Company-sponsored mental health benefits.
- Financial advisory services for short- and long-term goals.
- Learning and development opportunities to support career growth.
- Community-based volunteering opportunities.
Posted 2 weeks ago
6.0 - 10.0 years
22 - 27 Lacs
chennai
Hybrid
• Strong hands-on experience with Salesforce Data Cloud and the core Salesforce platform
• Solid understanding of data modelling and integration patterns
• Solid understanding of Data Streams, Data Lake, Data Models, Data Transforms and Data Analysis
Posted 2 weeks ago
8.0 - 13.0 years
0 Lacs
chennai, tamil nadu
On-site
We are looking for an accomplished and innovative Hands-On Technology Lead - VP to drive the development and delivery of cloud-based applications, Kafka implementations, and Java full-stack solutions. As a seasoned technical leader, you will be responsible for defining the architecture and leading the implementation of robust, scalable, and secure cloud-based applications. Your role will involve overseeing Kafka integrations for data streaming and event-driven architectures, as well as driving best practices in Java full-stack development. In this position, you will actively engage in coding, debugging, and designing solutions to ensure high-quality deliverables. You will also perform code reviews, optimize system performance, and leverage cloud technologies to build resilient, distributed applications. Additionally, you will lead and mentor a team of engineers, fostering a culture of innovation and collaboration, while identifying and addressing skill gaps within the team. As a Hands-On Technology Lead - VP, you will design and implement cloud-native solutions leveraging AWS, Azure, or Google Cloud Platform (GCP). You will ensure the optimal use of cloud resources to achieve scalability, performance, and cost-efficiency, while driving the adoption of Infrastructure as Code (IaC) and DevOps best practices. Your responsibilities will also include leading Kafka implementation projects, designing event-driven architectures aligned with business needs, and ensuring high availability, fault tolerance, and security of data streams. You will collaborate with business stakeholders to align technical solutions with organizational goals and effectively communicate technical concepts to non-technical audiences, including senior management. Furthermore, you will stay updated with emerging technologies and industry trends, evaluate and integrate new tools, frameworks, and methodologies, and contribute to the long-term technology roadmap and strategy. The ideal candidate will have a Bachelor's or Master's degree in Computer Science, Engineering, or a related field, with 13+ years of experience in software development and 8+ years in a technical leadership role. You should have proven experience with cloud platforms (AWS, Azure, GCP) and Kafka implementations, expertise in Java full-stack development, and strong knowledge of relational and non-relational databases. Additionally, hands-on expertise in cloud-native application design, Kubernetes, Docker, and CI/CD pipelines is preferred. If you have experience in real-time analytics or big data platforms, knowledge of security best practices for cloud-based and distributed systems, and relevant certifications (e.g., AWS Certified Solutions Architect, Certified Kubernetes Administrator), it would be an advantage. This is a full-time position in the Applications Development job family group within the Technology domain at Citi.,
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
maharashtra
On-site
You will play a crucial role in contributing to the development of a high-throughput, stable, and real-time gaming platform. Working closely with cross-functional teams comprising product managers, designers, and backend/frontend engineers, you will participate in defining project requirements and deliverables. Your primary responsibility will involve developing and maintaining efficient and reliable Java code that aligns with the project's performance and scalability objectives. Moreover, you will be tasked with implementing and optimizing data streaming mechanisms to facilitate real-time player interactions, ensuring minimal latency and high responsiveness. It will be essential for you to identify and address bottlenecks, performance issues, and potential areas for enhancement to uphold system efficiency. Furthermore, you will be expected to write and maintain unit tests to guarantee the reliability and robustness of the codebase. To excel in this role, you should possess a Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field. You must have a proven track record of designing, implementing, and maintaining high-performance, real-time systems, preferably within the gaming or a similar industry. A strong command of the Java programming language and related frameworks is essential, along with a solid understanding of distributed systems, microservices architecture, and cloud technologies. Experience with data streaming platforms like Apache Kafka or RabbitMQ is highly desirable. Proficiency in database design and optimization for high throughput and low latency is also expected. You should exhibit strong problem-solving skills and the ability to troubleshoot complex issues in a real-time environment. Excellent communication skills and a collaborative mindset are essential for effective teamwork. Lastly, a passion for gaming and a user-centric approach will be beneficial in this role.,
Posted 3 weeks ago
4.0 - 8.0 years
10 - 20 Lacs
pune
Hybrid
Job Title: Data Engineer - Ingestion, Storage & Streaming (Confluent Kafka)
Job Summary: As a Data Engineer specializing in Ingestion, Storage, and Streaming, you will design, implement, and maintain robust, scalable, and high-performance data pipelines for the efficient flow of data through our systems. You will work with Confluent Kafka to build real-time data streaming platforms, ensuring high availability and fault tolerance. You will also ensure that data is ingested, stored, and processed efficiently and in real-time to provide immediate insights.
Key Responsibilities:
Kafka-Based Streaming Solutions:
- Design, implement, and manage scalable and fault-tolerant data streaming platforms using Confluent Kafka.
- Develop real-time data streaming applications to support business-critical processes.
- Implement Kafka producers and consumers for ingesting data from various sources (a minimal sketch follows this posting).
- Handle message brokering, processing, and event streaming within the platform.
Ingestion & Data Integration:
- Build efficient data ingestion pipelines to bring real-time and batch data from various data sources into Kafka.
- Ensure smooth data integration across Kafka topics and handle multi-source data feeds.
- Develop and optimize connectors for data ingestion from diverse systems (e.g., databases, external APIs, cloud storage).
Data Storage and Management:
- Manage and optimize data storage solutions in conjunction with Kafka, including topics, partitions, retention policies, and data compression.
- Work with distributed storage technologies to store large volumes of structured and unstructured data, ensuring accessibility and compliance.
- Implement strategies for schema management, data versioning, and data governance.
Data Streaming & Processing:
- Leverage Kafka Streams and other stream processing frameworks (e.g., Apache Flink, ksqlDB) to process real-time data and provide immediate analytics.
- Build and optimize data processing pipelines to transform, filter, aggregate, and enrich streaming data.
Monitoring, Optimization, and Security:
- Set up and manage monitoring tools to track the performance of Kafka clusters, ingestion, and streaming pipelines.
- Troubleshoot and resolve issues related to data flows, latency, and failures.
- Ensure data security and compliance by enforcing appropriate data access policies and encryption techniques.
Collaboration and Documentation:
- Collaborate with data scientists, analysts, and other engineers to align data systems with business objectives.
- Document streaming architecture, pipeline workflows, and data governance processes to ensure system reliability and scalability.
- Provide regular updates on streaming and data ingestion pipeline performance and improvements to stakeholders.
Required Skills & Qualifications:
Experience:
- 3+ years of experience in data engineering, with a strong focus on Kafka, data streaming, ingestion, and storage solutions.
- Hands-on experience with Confluent Kafka, Kafka Streams, and related Kafka ecosystem tools.
- Experience with stream processing and real-time analytics frameworks (e.g., ksqlDB, Apache Flink).
Technical Skills:
- Expertise in Kafka Connect, Kafka Streams, and Kafka producer/consumer APIs.
- Proficient in data ingestion and integration techniques from diverse sources (databases, APIs, etc.).
- Strong knowledge of cloud data storage and distributed systems.
- Experience with programming languages like Java, Scala, or Python for Kafka integration and stream processing.
- Familiarity with tools such as Apache Spark, Flink, Hadoop, or other data processing frameworks.
- Experience with containerization and orchestration tools such as Docker, Kubernetes.
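Below is a minimal, hedged sketch of the Kafka producer/consumer pattern this posting centres on, using the confluent-kafka Python client. The broker address, topic, and consumer group are placeholders, and a reachable Kafka cluster is assumed.

```python
# Producer/consumer sketch with the confluent-kafka client: publish one JSON
# event, then read it back. Broker, topic, and group id are assumptions.
import json
from confluent_kafka import Producer, Consumer

BROKER = "localhost:9092"
TOPIC = "ingest.events"

def produce_one(event: dict) -> None:
    producer = Producer({"bootstrap.servers": BROKER})
    producer.produce(TOPIC, key=str(event["id"]), value=json.dumps(event))
    producer.flush()   # block until the broker acknowledges delivery

def consume_some(max_messages: int = 10) -> None:
    consumer = Consumer({
        "bootstrap.servers": BROKER,
        "group.id": "ingest-demo",
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe([TOPIC])
    seen = 0
    try:
        while seen < max_messages:
            msg = consumer.poll(timeout=1.0)
            if msg is None or msg.error():
                continue
            print(msg.key(), json.loads(msg.value()))
            seen += 1
    finally:
        consumer.close()

if __name__ == "__main__":
    produce_one({"id": 1, "source": "api", "payload": "hello"})
    consume_some(1)
```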
Posted 3 weeks ago
8.0 - 13.0 years
0 Lacs
chennai, tamil nadu
On-site
As an accomplished and innovative Hands-On Technology Lead - VP, you will be responsible for driving the development and delivery of cloud-based applications, Kafka implementations, and Java full-stack solutions. This role requires a seasoned technical leader with extensive experience in architecting, developing, and deploying scalable enterprise solutions. You will combine hands-on technical expertise with leadership acumen to guide teams, implement best practices, and deliver world-class software products.
Key Responsibilities
Technical Leadership:
- Define the architecture and lead the implementation of robust, scalable, and secure cloud-based applications.
- Oversee Kafka integrations for data streaming and event-driven architectures.
- Drive best practices in Java full-stack development, including backend and frontend technologies.
- Provide technical direction for major database solutions and data management strategies.
Hands-On Development:
- Actively engage in coding, debugging, and designing solutions to ensure high-quality deliverables.
- Perform code reviews and optimize system performance.
- Leverage cloud technologies to build resilient, distributed applications.
Team Leadership and Mentorship:
- Lead and mentor a team of engineers, fostering a culture of innovation and collaboration.
- Identify skill gaps within the team and provide targeted training and development.
- Collaborate with global teams across time zones to deliver unified solutions.
Cloud Strategy and Implementation:
- Design and implement cloud-native solutions leveraging AWS, Azure, or Google Cloud Platform (GCP).
- Ensure optimal use of cloud resources to achieve scalability, performance, and cost-efficiency.
- Drive the adoption of Infrastructure as Code (IaC) and DevOps best practices.
Kafka Implementation and Data Streaming:
- Lead Kafka implementation projects, including setup, configuration, and optimization for real-time data processing.
- Design event-driven architectures that align with business needs.
- Ensure high availability, fault tolerance, and security of data streams.
Stakeholder Management:
- Partner with business stakeholders to align technical solutions with organizational goals.
- Effectively communicate technical concepts to non-technical audiences, including senior management.
- Manage expectations and ensure timely delivery of complex projects.
Technical Innovation and Strategy:
- Stay updated with emerging technologies and industry trends.
- Evaluate and integrate new tools, frameworks, and methodologies to improve system performance and team productivity.
- Contribute to the long-term technology roadmap and strategy.
Qualifications And Skills
Education: Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
Experience:
- 13+ years of experience in software development, with 8+ years in a technical leadership role.
- Proven experience with cloud platforms (AWS, Azure, GCP) and Kafka implementations.
- Expertise in Java full-stack development, including Spring Boot, Hibernate, and modern frontend frameworks (e.g., Angular, React).
- Strong knowledge of relational and non-relational databases (e.g., Oracle, MySQL, PostgreSQL, MongoDB).
Technical Skills:
- Hands-on expertise in cloud-native application design, Kubernetes, Docker, and CI/CD pipelines.
- In-depth understanding of microservices architecture and API design principles.
- Strong problem-solving and analytical skills with a focus on system performance optimization.
Leadership Skills:
- Ability to lead cross-functional teams and manage complex projects.
- Strong interpersonal and communication skills, with the ability to build relationships across the organization.
Preferred Skills
- Experience in real-time analytics or big data platforms.
- Knowledge of security best practices for cloud-based and distributed systems.
- Certifications in relevant technologies (e.g., AWS Certified Solutions Architect, Certified Kubernetes Administrator).
Posted 1 month ago
10.0 - 14.0 years
0 Lacs
pune, maharashtra
On-site
You will be joining Programming.com, a global enterprise specializing in next-gen digital solutions to drive agility, efficiency, and impact. With over 22 years of experience and a team of 2200+ tech experts across multiple delivery hubs worldwide, we have successfully completed more than 1000 projects in various industries. Our key competencies include AI & Machine Learning, Digital Transformation, Cloud Enablement, Custom Applications, and Blockchain solutions, leading to significant operational cost reductions and faster time-to-market for our Fortune 500 clients. As a Kafka Developer with over 10 years of experience, you will play a crucial role in designing, developing, and maintaining Kafka solutions for real-time data processing and integration. Based in Noida, your responsibilities will include collaborating with software development teams, ensuring seamless data flow, optimizing Kafka clusters, and troubleshooting any issues. You will also contribute to the architecture and strategy of data pipelines, ensuring high performance, resilience, and scalability of systems. To excel in this role, you should possess a strong background in Computer Science and Programming, proficiency in Back-End Web Development, and expertise in Object-Oriented Programming (OOP). Experience with Kafka, data streaming, and real-time processing is essential, along with proven teamwork skills, problem-solving abilities, and analytical thinking. A Bachelor's or Master's degree in Computer Science or a related field is required, and prior experience in the technology or enterprise industry would be advantageous. Join us at Programming.com and be part of "Programming the future" of the tech revolution.,
Posted 1 month ago
6.0 - 10.0 years
0 Lacs
chennai, tamil nadu
On-site
At Ford, we take pride in being an industry icon with a well-deserved reputation for crafting iconic, world-class vehicles. The commitment and expertise of our exceptional teams play a pivotal role in making this possible. We highly value the talent, experience, and dedication of our people and teams, considering them as the cornerstones of our success. Embracing diversity of voices and perspectives, we believe that every individual's contribution is invaluable. We encourage you to bring your authentic self and join us in our pursuit of greatness to achieve remarkable things together. We are currently looking for a talented and experienced Engineering Product Manager to join our team. In this role, you will lead the development, enhancement, and extension of our observability platform. Your responsibilities will include shaping product roadmaps, collaborating with cross-functional teams, defining strategies, driving the delivery of innovative features to meet customer needs, and transforming monitoring practices to comprehensive full-stack observability solutions. The ideal candidate for this role will have a Bachelor's degree in Computer Science or equivalent work experience, along with at least 6 years of proven experience in product management for software engineering projects. Strong technical background in observability, monitoring, data management, and visualization are essential, including experience with technologies such as Kafka, data streaming, and visualization platforms. You should also have a track record of building and managing successful engineering teams, excellent project management skills, exceptional communication abilities, and an analytical mindset for data-driven decision-making. As a Product Manager, you will play a crucial role in creating clarity for our teams to execute effectively. You will define and communicate a clear product vision and strategy, collaborate on roadmap planning, lead feature development, work closely with cross-functional teams, champion a user-centric design approach, stay informed about industry trends, manage product releases, monitor key performance metrics, engage with customers, and drive ongoing product improvement. If you are someone who dreams big, plans exceptionally well, and has experience in balancing complex tasks at scale, we invite you to apply for this exciting opportunity to shape the future of our observability initiatives and strategies at Ford.,
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
Bluevine is transforming small business banking with innovative solutions like checking, lending, and credit tailored to help entrepreneurs thrive. With best-in-class technology, advanced security, and a deep understanding of the small business community, Bluevine is empowering entrepreneurs to grow with confidence. Backed by leading investors like Lightspeed Venture Partners, Menlo Ventures, 83North, and Citi Ventures, Bluevine has been supporting SMBs since 2013, serving over 500,000 customers nationwide and growing a dynamic global team of 500 people. The mission of Bluevine is to fuel small businesses with the financial tools they need to succeed. Joining Bluevine means being part of a collaborative, fast-paced team that is reshaping the future of banking. If you are ready to make an impact, Bluevine is the place for you.
This is a hybrid role at Bluevine. The company prides itself on its collaborative culture, which is best maintained through in-person interactions and a vibrant office environment. All Bluevine offices have reopened in accordance with local guidelines and are following a hybrid model. In-office days will be determined by location and discipline.
**What You'll Do:**
- Manage cloud-based databases on AWS.
- Monitor database infrastructure for availability, reliability, and performance.
- Detect, troubleshoot, and resolve incidents in real-time.
- Design scalable data architectures aligned with business needs.
- Support ETL processes and data streaming (CDC) in AWS.
- Optimize database performance and minimize downtime.
- Collaborate with IL and US teams on daily operations.
- Work with DevOps, NOC, and R&D teams on procedures and workbooks.
- Perform routine maintenance (upgrades, user management, backups).
**What We Look For:**
- 5+ years of expertise in relational databases (PostgreSQL), Datawarehouse DBs, and data lakes on AWS Cloud.
- Strong understanding of Data streaming and/or Change Data Capture (CDC) methodologies and implementation.
- Hands-on experience in observability tools such as Airflow, Prometheus, Grafana, OpenSearch, and New Relic.
- Experience in scripting and automation using Python.
- Understanding of Graph DBs.
- Familiarity with CI/CD Pipelines, AWS Lambda, and Terraform (not a must).
- Exceptional troubleshooting skills with a proactive approach to problem-solving.
- Strong communication and collaboration skills to work across global teams.
**Benefits & Perks:**
- Excellent group health coverage and life insurance.
- Stock options.
- Hybrid work model.
- Meal allowance.
- Transportation assistance (terms and conditions apply).
- Generous paid time off plan, Holidays.
- Company-sponsored mental health benefits.
- Financial advisory services for both short- and long-term goals.
- Learning and development opportunities to support career growth.
- Community-based volunteering opportunities.
Posted 1 month ago
10.0 - 14.0 years
0 Lacs
haryana
On-site
As a Systems Architect specializing in BizTech Integrations at Airbnb, you will be a crucial part of a team working towards the mission of creating a world where everyone can belong anywhere. The BizTech department plays a pivotal role in providing internal systems, enterprise technologies, and technical support to enable a seamless experience for both the business and its people. You will join the Integrations team within BizTech, responsible for managing integrations and APIs across various SaaS products and internal systems. Your primary focus will be on owning the end-to-end architecture of finance integrations, leading integration projects, and enhancing processes while ensuring security and best practices are implemented. Collaboration with functional and partner teams will be a key aspect of this role. Your role as a Systems Architect at Airbnb involves advancing integration platforms and SaaS integrations across the enterprise. Working closely with Finance and Procurement departments, you will lead over 200 finance integrations within the Oracle Integration framework. Your responsibilities include guiding engineers in creating robust integrations, driving the evolution of integration architecture, and managing key integration projects to ensure secure and efficient data exchanges. A typical day in this role will involve leading Oracle Integration practices, driving multiple projects simultaneously, collaborating with stakeholders, and architecting complex integrations and APIs to automate processes and connect systems. You will also be responsible for ensuring the quality of deliverables, communicating effectively with teams and stakeholders, and contributing to organizational priorities and strategies. To excel in this role, you should have 10+ years of hands-on experience with Oracle technologies, specifically Oracle Integration technologies including Oracle Integration Cloud (OIC). Expertise in designing and implementing integrations and APIs, familiarity with various SaaS apps, and strong SQL proficiency are essential. Experience with Oracle Cloud Infrastructure, programming languages like Java, CI/CD tools, and a desire to mentor other engineers are also valuable assets. At Airbnb, we are committed to fostering an inclusive and diverse work environment where all qualified individuals are encouraged to apply. By joining our team, you will have the opportunity to contribute to innovative solutions and products that resonate with our core values and mission.,
Posted 1 month ago
5.0 - 10.0 years
0 Lacs
karnataka
On-site
As a Data Architect at Cigna International Markets, your primary responsibility is to define commercially aware and technically astute solutions that align with the architectural direction while considering project delivery constraints. You will be an integral part of the Architecture function, collaborating with senior stakeholders to establish strategic direction and ensure that business solutions reflect this intent. Your role will involve leading and defining effective business solutions within complex project environments, showcasing the ability to cultivate strong relationships across Business, IT, and 3rd Party stakeholders. Your main duties and responsibilities will include performing key enterprise-wide Data Architecture tasks within International Markets, particularly focusing on on-premise and cloud solution deployments. You will engage proactively with various stakeholders to ensure that business investments result in cost-effective and suitable data-driven solutions. Additionally, you will assist sponsors in creating compelling business cases for change and work with Solution Architects to define data solution designs that meet business and operational expectations. As a Data Architect, you will own and manage data models and design artifacts, offering guidance on best practices and standards for customer-centric data delivery and management. You will advocate for data-driven design within an agile delivery framework and actively participate in the full project lifecycle, from shaping estimates to governing solutions during development. Furthermore, you will be responsible for identifying and managing risks, issues, and assumptions throughout the project lifecycle and play a lead role in selecting 3rd Party solutions. Your skills and experience should include a minimum of 10 years in IT with 5 years in a Data Architecture or Data Design role. You should have experience leading data design projects and delivering significant assets to organizations such as Data Warehouse, Data Lake, or Customer 360 Data Platform. Proficiency in various data capabilities like data modeling, database design, data migration, and data integration (ETL/ELT and data streaming) is essential. Familiarity with toolsets and platforms like AWS, SQL Server, Qlik, and Collibra is preferred. A successful track record of working in globally dispersed teams, technical acumen across different domains, and a collaborative mindset are desirable attributes. Your commercial awareness, financial planning skills, and ability to work with diverse stakeholders to achieve mutually beneficial solutions will be crucial in this role. Join Cigna Healthcare, a division of The Cigna Group, and contribute to our mission of advocating for better health and improving lives.,
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
karnataka
On-site
As a Senior Platform Engineer at Kenvue Data Platforms, you will have an exciting opportunity to be part of our growing Data & Analytics product line team. Your role involves collaborating closely with various teams such as Business partners, Product Owners, Data Strategy, Data Platform, Data Science, and Machine Learning (MLOps) to drive innovative data products for end users. You will play a key role in shaping the overall solution and data platforms, ensuring their stability, responsiveness, and alignment with business and cloud computing needs. Your expertise will be crucial in optimizing business outcomes and contributing to the growth and success of the organization. Your responsibilities will include providing leadership for data platforms in partnership with architecture teams, conducting proof of concepts to deliver secure and scalable platforms, staying updated on emerging technologies, mentoring other platform engineers, and focusing on the execution and delivery of reliable data platforms. You will work closely with Business Analytics leaders to understand business needs and create value through technology. Additionally, you will lead data platforms operations, build next-generation data and analytics capabilities, and drive the adoption and scaling of data products within the organization. To be successful in this role, you should have an undergraduate degree in Technology, Computer Science, applied data sciences, or related fields, with an advanced degree being preferred. You should possess strong analytical skills, effective communication abilities, and a proven track record in developing and maintaining data platforms. Experience with cloud platforms such as Azure, GCP, AWS, cloud-based databases, data streaming platforms, and Agile methodology will be essential. Your ability to define platforms tech stack, prioritize work items, and work effectively in a diverse and inclusive company culture will be critical to your success in this role. If you are passionate about leveraging data and technology to drive business growth, make a positive impact on personal health, and shape the future of data platforms, then this role at Kenvue Data Platforms is the perfect opportunity for you. Join us in our mission to empower millions of people every day through insights, innovation, and care. We look forward to welcoming you to our team! Location: Asia Pacific-India-Karnataka-Bangalore Function: Digital Product Development Qualifications: - Undergraduate degree in Technology, Computer Science, applied data sciences or related fields; advanced degree preferred - Strong interpersonal and communication skills, ability to explain digital concepts to business leaders and vice versa - 4 years of data platforms experience in Consumer/Healthcare Goods companies - 6 years of progressive experience in developing and maintaining data platforms - Minimum 5 years hands-on experience with Cloud Platforms and cloud-based databases - Experience with data streaming platforms, microservices, and data integration - Proficiency in Agile methodology within DevSecOps model - Ability to define platforms tech stack to address data challenges - Proven track record of delivering high-profile projects within defined resources - Commitment to diversity, inclusion, and equal opportunity employment,
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
At JP Morgan Chase, you will play a crucial role in delivering end-to-end data pipeline solutions on cloud infrastructure to enhance the digital banking experience for our customers. Your expertise will help us leverage the latest technologies and industry best practices to build innovative business products and ensure a seamless and secure banking environment. Your responsibilities will include using domain modeling techniques to create top-tier business products, structuring software for easy understanding and evolution, and implementing scalable architectural patterns to avoid single points of failure. You will develop secure code to safeguard our customers and systems from malicious activities and promptly address and resolve any issues that may arise. Additionally, you will focus on optimizing data processing, monitoring system performance, and ensuring reliable and efficient operations. Your role will also involve updating technologies and patterns continuously, supporting products throughout their lifecycle, and managing incidents effectively to minimize downtime for end-users. To excel in this role, you should have formal training or certification in data engineering concepts, along with recent hands-on experience as a data engineer. Proficiency in coding with Python, designing effective tests, and strong communication skills in English are essential. Experience with cloud technologies, distributed systems, data transformation frameworks, and data pipeline orchestration tools will be beneficial for this position. Moreover, your ability to manage large volumes of data, optimize data processing, and work with event-based architecture, data streaming, and messaging frameworks will be valuable. You should also be capable of coaching team members on coding practices, design principles, and implementation patterns, as well as managing stakeholders and prioritizing tasks across multiple work streams. Preferred qualifications include experience in a highly regulated environment, familiarity with AWS cloud technologies, expertise in data governance frameworks, and an understanding of incremental data processing and versioning. Knowledge of RESTful APIs and web technologies will be an added advantage for this role.,
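As a small illustration of the "designing effective tests" expectation above, here is a hedged Python sketch: a pure data-transformation function with two pytest cases. The function and its fields are hypothetical examples invented for illustration, not part of any actual pipeline.

```python
# Illustrative only: a pure transformation function plus pytest cases that pin
# down its behaviour. Field names and rules are hypothetical assumptions.
import pytest

def normalise_transaction(record: dict) -> dict:
    """Uppercase the currency code and reject non-positive amounts."""
    if record["amount"] <= 0:
        raise ValueError("amount must be positive")
    return {**record, "currency": record["currency"].upper()}

def test_normalise_transaction_uppercases_currency():
    out = normalise_transaction({"amount": 10.0, "currency": "gbp"})
    assert out["currency"] == "GBP"

def test_normalise_transaction_rejects_bad_amount():
    with pytest.raises(ValueError):
        normalise_transaction({"amount": 0, "currency": "gbp"})
```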
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
vadodara, gujarat
On-site
The purpose of this role is to design, test, and maintain software programs for operating systems or applications which need to be deployed at a client's end, and to ensure they meet 100% quality assurance parameters. You should have experience that demonstrates proficiency and ease with one or more programming languages, quality assurance, scripting languages, and operating systems. Solid hands-on development experience in backend technologies including JAVA, J2EE, SQL and the related technology stack, preferably incorporating open-source libraries, is required. Hands-on experience with Spring Framework, Spring Boot, MongoDB, and JPA / Hibernate is a strong plus. Exposure to frameworks like Karate and TestNG to carry out QA tasks is good to have, and knowledge of test automation frameworks is also a strong plus. A strong foundation in data structures, algorithms, problem-solving, and complexity analysis is expected. You should possess strong designing, analytical, programming, and communication skills, along with an aptitude for building stable solutions. Knowledge of writing unit test cases using frameworks like JUnit / TestNG is desired. Some demonstrated experience with n-tier web application development and experience with the latest JDK is desired. Java / J2EE certification is a plus. Experience with web services standards and related technologies (XML, JSON, REST, SOAP, WS*, AXIS, JERSEY) is nice to have. Demonstrable experience utilizing object-oriented patterns and design best practices is a strong plus. Exposure to tools like Postman or any REST client is desired. LINUX skills are required. Working knowledge of Continuous Integration / Delivery and Test Driven Development is good to have. Knowledge of microservices and hands-on experience with container platforms like Kubernetes, Docker, and OpenShift would be a strong plus. Hands-on experience in distributed architecture and data streaming approaches like Kafka and RabbitMQ is a strong plus.
Mandatory Skills: Fullstack Java Enterprise
Experience: 5-8 Years
Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
You will be part of a dynamic team at JP Morgan Chase, dedicated to providing exceptional value and a seamless experience to customers as a trusted financial institution. Chase UK, our digital banking arm, is at the forefront of transforming the banking experience through intuitive and enjoyable customer journeys. Leveraging the strong foundation of trust built by millions of customers in the US, we are rapidly expanding our presence in the UK and soon across Europe, shaping the bank of the future. This is your opportunity to join us and create a significant impact. Your responsibilities will include delivering end-to-end data pipeline solutions on cloud infrastructure by harnessing the latest technologies and industry best practices. You will utilize domain modeling techniques to develop top-notch business products, structuring software for easy understanding, testing, and evolution. Building solutions that are resilient and scalable, you will focus on developing secure code to safeguard customers and the institution from potential threats. Timely investigation and resolution of issues, along with ensuring zero downtime during releases, will be crucial aspects of your role. Optimizing data processing, monitoring performance, and ensuring reliable and efficient systems operation will also be part of your duties. Continuous learning and updating of technologies and patterns, as well as providing support throughout the product lifecycle, including production and incident management, will be key components of your role. To excel in this role, you should possess formal training or certification in data engineering concepts, along with recent hands-on experience as a data engineer. Proficiency in coding with Python, designing effective tests, and strong written and verbal communication skills in English are essential. Experience with cloud technologies, distributed systems, data transformation frameworks, and data pipeline orchestration tools will be beneficial. Managing large volumes of data, optimizing data processing, and understanding event-based architecture, data streaming, and messaging frameworks are also important skills. Your ability to coach team members, manage stakeholders, prioritize effectively across multiple work streams, and adapt to a fast-paced environment will be valuable. Preferred qualifications include experience in a highly regulated industry, familiarity with AWS cloud technologies, expertise in data governance frameworks, understanding of incremental data processing and versioning, and knowledge of RESTful APIs and web technologies.,
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
At JP Morgan Chase, we prioritize providing exceptional value and a seamless experience to our customers as a trusted financial institution. Chase UK was established with the goal of revolutionizing digital banking through intuitive and enjoyable customer journeys. Backed by the trust of millions of customers in the US, our presence in the UK is rapidly expanding, with plans to extend across Europe. We are in the process of building the bank of the future from scratch, offering you the opportunity to join us and contribute significantly to our mission. As a part of our team, your responsibilities will include delivering end-to-end data pipeline solutions on cloud infrastructure by leveraging cutting-edge technologies and industry best practices. You will apply domain modeling techniques to develop top-notch business products and structure software for enhanced understanding, testing, and evolution. Building solutions with scalable architectural patterns to avoid single points of failure, you will also prioritize developing secure code to safeguard our customers and organization from malicious threats. Timely investigation and resolution of issues, ensuring they do not recur, and facilitating zero downtime releases for end-users are crucial aspects of your role. Additionally, optimizing data reading and writing, monitoring performance, and updating technologies and patterns continuously will be part of your daily tasks. Supporting products throughout their lifecycle, including production and incident management, will also fall under your purview. To excel in this role, you should possess formal training or certification in data engineering concepts, along with recent hands-on professional experience as a data engineer. Proficiency in coding with Python, designing and implementing effective tests, and excellent written and verbal communication skills in English are essential requirements. Experience with cloud technologies, distributed systems, data transformation frameworks, and data pipeline orchestration tools is vital. Managing large volumes of data, optimizing data processing, understanding event-based architecture, data streaming, and messaging frameworks are key competencies we seek. Your ability to coach team members on coding practices, design principles, and implementation patterns, as well as manage stakeholders and prioritize effectively across multiple work streams, will be highly valued. Additionally, preferred qualifications for this role include experience in a highly regulated environment/industry, familiarity with AWS cloud technologies, expertise with data governance frameworks, understanding of incremental data processing and versioning, and knowledge of RESTful APIs and web technologies.,
Posted 1 month ago