
6920 Kafka Jobs - Page 17

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

0.0 - 2.0 years

0 Lacs

Pune, Maharashtra, India

On-site


The Applications Development Programmer Analyst is an intermediate-level position responsible for participating in the establishment and implementation of new or revised application systems and programs, in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities.

Responsibilities:
- Utilize knowledge of applications development procedures and concepts, and basic knowledge of other technical areas, to identify and define necessary system enhancements
- Identify and analyze issues, make recommendations, and implement solutions
- Utilize knowledge of business processes, system processes, and industry standards to solve complex issues
- Analyze information and make evaluative judgements to recommend solutions and improvements
- Conduct testing and debugging, utilize script tools, and write basic code to design specifications
- Assess applicability of similar experiences and evaluate options under circumstances not covered by procedures
- Develop working knowledge of Citi's information systems, procedures, standards, client-server application development, network operations, database administration, systems administration, data center operations, and PC-based applications
- Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency

Additional Job Description:
We are looking for a Big Data Engineer to work on collecting, storing, processing, and analyzing huge sets of data. The primary focus will be on choosing optimal solutions for these purposes, then maintaining, implementing, and monitoring them. You will also be responsible for integrating them with the architecture used across the company.

Responsibilities:
- Selecting and integrating any Big Data tools and frameworks required to provide requested capabilities
- Implementing data wrangling, scraping, and cleaning using Java or Python
- Strong experience with data structures
- Extensive work on API integration
- Monitoring performance and advising on any necessary infrastructure changes
- Defining data retention policies

Skills and Qualifications:
- Proficient understanding of distributed computing principles
- Proficient in Java or Python, with some machine-learning exposure
- Proficiency with Hadoop v2, MapReduce, HDFS, PySpark, Spark
- Experience building stream-processing systems using solutions such as Storm or Spark Streaming
- Good knowledge of Big Data querying tools, such as Pig, Hive, and Impala
- Experience with Spark
- Experience integrating data from multiple data sources
- Experience with NoSQL databases, such as HBase, Cassandra, MongoDB
- Knowledge of various ETL techniques and frameworks, such as Flume
- Experience with messaging systems, such as Kafka or RabbitMQ (a minimal producer/consumer sketch follows this listing)
- Experience with Big Data ML toolkits, such as Mahout, SparkML, or H2O
- Good understanding of Lambda Architecture, along with its advantages and drawbacks
- Experience with Cloudera/MapR/Hortonworks

Qualifications:
- 0-2 years of relevant experience
- Experience in programming/debugging for business applications
- Working knowledge of industry practices and standards
- Comprehensive knowledge of the specific business area for application development
- Working knowledge of programming languages
- Consistently demonstrates clear and concise written and verbal communication

Education: Bachelor's degree/University degree or equivalent experience

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.

Job Family Group: Technology
Job Family: Applications Development
Time Type: Full time
Most Relevant Skills: Please see the requirements listed above.
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
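Since the listing centers on messaging systems such as Kafka, here is a minimal Python sketch of the producer/consumer pattern it references, assuming the kafka-python package and a broker at localhost:9092; the topic name and payload shape are illustrative assumptions, not part of the posting.

```python
import json

from kafka import KafkaProducer, KafkaConsumer

# Producer side: serialize dicts as JSON and send to a topic
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("events", {"user_id": 42, "action": "click"})  # hypothetical payload
producer.flush()  # block until the message is actually delivered

# Consumer side: join a consumer group and read from the beginning
consumer = KafkaConsumer(
    "events",
    bootstrap_servers="localhost:9092",
    group_id="demo-group",
    auto_offset_reset="earliest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
for message in consumer:
    print(message.value)
```

In a real pipeline the consumer loop would feed a stream processor or sink rather than printing; this only shows the wire-level pattern the posting names.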

Posted 1 day ago

Apply

8.0 - 12.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Major Duties:
- Monitor the production environment; identify and implement opportunities to improve production stability
- Ensure incidents are prioritized and worked in proper order, and review backlog items
- Investigate, diagnose, and solve application issues; resolve problems in an analytical and logical manner, troubleshooting root causes and resolving production incidents
- Follow up on cross-team incidents to drive them to resolution
- Develop and deliver product changes and enhancements in a collaborative, agile team environment
- Build solutions to fix production issues and participate in ongoing software maintenance activities
- Understand, define, estimate, develop, test, deploy, and support change requests
- Monitor and attend to all alerts, and escalate production issues as needed to relevant teams and management
- Operate independently; have in-depth knowledge of the business unit/function
- Communicate with stakeholders and the business on escalated items
- As subject-area expert, provide comprehensive, in-depth consulting to the team and partners at a high technical level
- Develop periodic goals, organize the work, set short-term priorities, monitor all activities, and ensure timely and accurate completion of the work
- Periodically engage with business partners to review progress and priorities, and develop and maintain rapport through professional interactions with clear, concise communications
- Ensure cross-functional duties, including bug fixes and scheduling changes, are scheduled and completed by the relevant teams
- Work with the team to resolve problems and improve production reliability, stability, and availability
- Follow the ITIL processes of Incident, Problem, and Change Management
- Ability to solve complex technical issues

Must Have:
- 8-12 years of professional experience in software maintenance/support/development with a strong programming/technical background; 80% technical and 20% managerial skills
- Proficient in working with ITIL/ITSM (ServiceNow) and data analysis
- Expert in Unix commands and scripting
- Working knowledge of SQL (preferably Oracle, MSSQL)
- Experience supporting ETL/EDM/MDM platforms using tools like SSIS, Informatica, Markit EDM, IBM InfoSphere DataStage; ETL experience is mandatory if EDM experience is not present
- Understanding of batch scheduling system usage and implementation concepts; trigger solutions using external schedulers (Control-M), services (Process Launchers & Event Watchers), and UI
- Well versed in the Change Management process and tools
- Experience in incident management, understanding of ticket workflows, and use of escalation
- Good understanding of MQ/Kafka (both consumer and producer solutions)
- Good understanding of REST/SOAP

Nice to Have:
- Proficient in Java and able to go into code to investigate and fix issues
- Understanding of DevOps, CI/CD, and Agile techniques preferred
- Basic understanding of front-end technologies, such as React JS, JavaScript, HTML5, and CSS3
- Banking and financial services knowledge is preferred; more importantly, the candidate should have a strong technical background

(ref:hirist.tech)

Posted 1 day ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site


Position Overview:
We are seeking a highly skilled and motivated Data Engineer to join our dynamic team. The ideal candidate will have extensive experience with AWS Glue, Apache Airflow, Kafka, SQL, Python, and DataOps tools and technologies; knowledge of SAP HANA and Snowflake is a plus. This role is critical for designing, developing, and maintaining our client's data pipeline architecture, ensuring the efficient and reliable flow of data across the organization.

Key Responsibilities:
Design, Develop, and Maintain Data Pipelines:
- Develop robust and scalable data pipelines using AWS Glue, Apache Airflow, and other relevant technologies
- Integrate various data sources, including SAP HANA, Kafka, and SQL databases, to ensure seamless data flow and processing
- Optimize data pipelines for performance and reliability
Data Management and Transformation:
- Design and implement data transformation processes to clean, enrich, and structure data for analytical purposes
- Utilize SQL and Python for extraction, transformation, and loading (ETL) tasks
- Ensure data quality and integrity through rigorous testing and validation processes
Collaboration and Communication:
- Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions that meet their needs
- Collaborate with cross-functional teams to implement DataOps practices and improve data lifecycle management
Monitoring and Optimization:
- Monitor data pipeline performance and implement improvements to enhance efficiency and reduce latency
- Troubleshoot and resolve data-related issues, ensuring minimal disruption to data workflows
- Implement and manage monitoring and alerting systems to proactively identify and address potential issues
Documentation and Best Practices:
- Maintain comprehensive documentation of data pipelines, transformations, and processes
- Adhere to best practices in data engineering, including code versioning, testing, and deployment procedures
- Stay up to date with the latest industry trends and technologies in data engineering and DataOps

Required Skills and Qualifications:
Technical Expertise:
- Extensive experience with AWS Glue for data integration and transformation
- Proficient in Apache Airflow for workflow orchestration (a minimal DAG sketch follows this listing)
- Strong knowledge of Kafka for real-time data streaming and processing
- Advanced SQL skills for querying and managing relational databases
- Proficiency in Python for scripting and automation tasks
- Experience with SAP HANA for data storage and management
- Familiarity with DataOps tools and methodologies for continuous integration and delivery in data engineering
Preferred Skills:
- Knowledge of Snowflake for cloud-based data warehousing solutions
- Experience with other AWS data services such as Redshift, S3, and Athena
- Familiarity with big data technologies such as Hadoop, Spark, and Hive
Soft Skills:
- Strong analytical and problem-solving skills
- Excellent communication and collaboration abilities
- Detail-oriented with a commitment to data quality and accuracy
- Ability to work independently and manage multiple projects simultaneously

(ref:hirist.tech)
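For the Airflow orchestration skills this posting emphasizes, here is a minimal DAG sketch, assuming Apache Airflow 2.x; the DAG id, task callables, and schedule are hypothetical placeholders, not details from the posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Placeholder task callables; real tasks would invoke AWS Glue jobs,
# Kafka consumers, or SQL transformations as the posting describes.
def extract():
    print("pull source data")

def transform():
    print("clean and enrich")

def load():
    print("write to the warehouse")

with DAG(
    dag_id="daily_events_etl",        # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3  # linear dependency: extract, then transform, then load
```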

Posted 1 day ago

Apply

7.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Designation: Solution Architect
Office Location: Gurugram

Position Description:
As a Solution Architect, you will be responsible for leading the development and delivery of our platforms. This includes overseeing the entire product lifecycle from solution design through execution and launch, building the right team, and collaborating closely with business and product teams.

Primary Responsibilities:
- Design end-to-end solutions that meet business requirements and align with the enterprise architecture
- Define the architecture blueprint, including integration, data flow, application, and infrastructure components
- Evaluate and select appropriate technology stacks, tools, and frameworks
- Ensure proposed solutions are scalable, maintainable, and secure
- Collaborate with business and technical stakeholders to gather requirements and clarify objectives
- Act as a bridge between business problems and technology solutions
- Guide development teams during the execution phase to ensure solutions are implemented according to design
- Identify and mitigate architectural risks and issues
- Ensure compliance with architecture principles, standards, policies, and best practices
- Document architectures, designs, and implementation decisions clearly and thoroughly
- Identify opportunities for innovation and efficiency within existing and upcoming solutions
- Conduct regular performance and code reviews, and provide feedback to development team members to support their professional development
- Lead proof-of-concept initiatives to evaluate new technologies

Additional Responsibilities:
- Facilitate daily stand-up meetings, sprint planning, sprint reviews, and retrospective meetings
- Work closely with the product owner to prioritize the product backlog and ensure that user stories are well defined and ready for development
- Identify and address issues or conflicts that may impact project delivery or team morale
- Experience with Agile project management tools such as Jira and Trello

Required Skills:
- Bachelor's degree in Computer Science, Engineering, or a related field
- 7+ years of experience in software engineering, with at least 3 years in a solution architecture or technical leadership role
- Proficiency with the AWS or GCP cloud platform
- Strong implementation knowledge of the JS tech stack: NodeJS and ReactJS
- Experience with database engines (MySQL and PostgreSQL), with proven knowledge of database migrations and high-throughput, low-latency use cases
- Experience with key-value stores like Redis, MongoDB, and similar
- Preferred knowledge of distributed technologies (Kafka, Spark, Trino, or similar) with proven experience in event-driven data pipelines
- Proven experience setting up big data pipelines to handle high-volume transactions and transformations
- Experience with BI tools: Looker, PowerBI, Metabase, or similar
- Experience with data warehouses like BigQuery, Redshift, or similar
- Familiarity with CI/CD pipelines, containerization (Docker/Kubernetes), and IaC

Good to Have:
- Certifications such as AWS Certified Solutions Architect, Azure Solutions Architect Expert, TOGAF, etc.
- Experience setting up analytical pipelines using BI tools (Looker, PowerBI, Metabase, or similar) and low-level Python tools like Pandas, NumPy, and PyArrow
- Experience with data transformation tools like DBT, SQLMesh, or similar
- Experience with data orchestration tools like Apache Airflow, Kestra, or similar

Work Environment Details:
About Affle: Affle is a global technology company with a proprietary consumer intelligence platform that delivers consumer engagement, acquisitions, and transactions through relevant mobile advertising. The platform aims to enhance returns on marketing investment through contextual mobile ads and by reducing digital ad fraud. While Affle's Consumer platform is used by online and offline companies for measurable mobile advertising, its Enterprise platform helps offline companies go online through platform-based app development, enablement of O2O commerce, and its customer data platform. Affle India successfully completed its IPO in India on 08 Aug 2019 and now trades on the stock exchanges (BSE: 542752 & NSE: AFFLE). Affle Holdings is the Singapore-based promoter for Affle India, and its investors include Microsoft and Bennett Coleman & Company (BCCL), amongst others. For more details: www.affle.com

About BU:
Ultra - Access deals, coupons, and walled-garden-based user acquisition on a single platform to offer bottom-funnel optimization across multiple inventory sources. For more details, please visit: https://www.ultraplatform.io/

(ref:hirist.tech)

Posted 1 day ago

Apply

3.0 - 6.0 years

0 Lacs

Gurugram, Haryana, India

On-site


About the Role:
We are on a mission to build scalable, high-performance systems, and we're looking for a Backend Engineer (SDE II) who can design, build, and maintain services that power our core platform.

Key Responsibilities:
- Architect and implement scalable backend systems using Python (Django/FastAPI) and TypeScript (Node.js)
- Lead system design discussions and own the design of backend modules and infrastructure
- Design and optimize PostgreSQL schemas and queries for performance and reliability
- Build microservices and deploy them using Docker and Kubernetes
- Drive DevOps best practices including CI/CD, infrastructure automation, and cloud deployment
- Integrate and manage RabbitMQ for asynchronous processing and event-driven workflows (see the sketch after this listing)
- Set up and manage log aggregation, monitoring, and alerting using tools like Prometheus, Grafana, and the ELK stack
- Conduct code reviews, share knowledge, and mentor junior engineers and interns
- Proactively monitor and improve the reliability, scalability, and performance of backend systems
- Collaborate with cross-functional teams on features, architecture, and tech strategy

Experience & Qualifications:
- 3-6 years of experience in backend development with a strong command of Python and TypeScript
- Expertise in building web services and APIs using Django, FastAPI, or Node.js
- Strong knowledge of relational databases, particularly PostgreSQL
- Solid experience with Kubernetes and Docker for deploying and managing microservices
- Experience in DevOps operations, CI/CD pipelines, and infrastructure as code
- Proficiency in RabbitMQ or similar message queue technologies
- Hands-on experience with monitoring, logging, and alerting stacks (e.g., ELK, Prometheus, Grafana)
- Strong system design skills: able to design scalable, fault-tolerant, and maintainable systems
- Familiarity with Git workflows, agile processes, and collaborative software development

Good to Have:
- Experience with cloud platforms like AWS, Azure, or GCP
- Knowledge of Helm, Terraform, or similar IaC tools
- Understanding of GraphQL and streaming data pipelines (Kafka, Redis streams, etc.)
- Exposure to event-driven architectures and distributed systems
- Publicly available GitHub contributions or tech blog posts

(ref:hirist.tech)
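As a rough illustration of the FastAPI-plus-RabbitMQ pattern named in the responsibilities above, here is a minimal sketch assuming the fastapi, pydantic v2, and pika packages with a local broker; the route, queue name, and payload model are hypothetical.

```python
import json

import pika
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Task(BaseModel):
    user_id: int
    action: str

@app.post("/tasks")
def enqueue_task(task: Task) -> dict:
    # Publish the task to RabbitMQ so a worker can process it asynchronously;
    # a per-request connection keeps the sketch simple (pool in production)
    connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
    channel = connection.channel()
    channel.queue_declare(queue="tasks", durable=True)
    channel.basic_publish(
        exchange="",
        routing_key="tasks",
        body=json.dumps(task.model_dump()),
    )
    connection.close()
    return {"status": "queued"}
```

Run with `uvicorn main:app`; a separate consumer process would read from the tasks queue, which is what decouples the API's response time from the work itself.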

Posted 1 day ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site


Job Description:
We are seeking a highly motivated and skilled Java Developer with a strong understanding of IoT technologies to join our dynamic team in Gurgaon. The ideal candidate will be responsible for designing, developing, and implementing robust and scalable software solutions that integrate with IoT devices and cloud platforms. You will play a crucial role in building and maintaining our IoT ecosystem, ensuring seamless data flow and efficient device management. This role requires a proactive individual with excellent problem-solving skills and a passion for working with cutting-edge technologies.

Responsibilities:
- Design, develop, and maintain Java-based applications and microservices for our IoT platform
- Integrate IoT devices and sensors with backend systems using various communication protocols (e.g., MQTT, CoAP, HTTP); a minimal MQTT ingestion sketch follows this listing
- Develop and consume RESTful APIs for data exchange between different components of the system
- Work with databases such as PostgreSQL and/or MySQL for data storage and retrieval
- Utilize cloud platforms (preferably Azure or AWS) for deploying, managing, and scaling IoT solutions
- Implement security measures for IoT devices and data transmission
- Write clean, well-documented, and efficient code following best practices and coding standards
- Participate in code reviews to ensure code quality and knowledge sharing
- Troubleshoot and debug issues across the entire IoT solution stack
- Collaborate effectively with cross-functional teams including hardware engineers, data scientists, and product managers
- Stay up to date with the latest trends and technologies in Java, IoT, and cloud computing
- Contribute to the continuous improvement of our development processes and tools
- Participate in the full software development lifecycle, from requirements gathering to deployment and maintenance

Requirements:
- Programming Languages: Strong proficiency in Java and JavaScript
- Databases: Experience with relational databases such as PostgreSQL and/or MySQL, including database design and querying
- IoT Fundamentals: Solid understanding of IoT concepts, device communication protocols, and data management in IoT environments
- Cloud Platforms: Hands-on experience with at least one major cloud platform (Azure or AWS), including services related to IoT, compute, storage, and networking
- API Development: Experience in designing, developing, and consuming RESTful APIs
- Version Control: Proficient in using Git for version control and collaboration
- Problem-Solving: Excellent analytical and problem-solving skills with the ability to diagnose and resolve complex issues
- Communication: Strong written and verbal communication skills
- Teamwork: Ability to work effectively in a collaborative team environment

Nice to Have:
- Experience with other IoT platforms or services
- Knowledge of other programming languages (e.g., Python)
- Experience with message queuing systems (e.g., Kafka, RabbitMQ)
- Understanding of security best practices for IoT devices and cloud environments
- Experience with containerization technologies (e.g., Docker, Kubernetes)
- Familiarity with agile development methodologies

(ref:hirist.tech)
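Although this is a Java role, a compact way to see the MQTT subscribe/callback flow it describes is the following Python sketch, assuming the paho-mqtt 1.x client library and a local broker; the topic hierarchy and broker address are hypothetical.

```python
import paho.mqtt.client as mqtt

def on_connect(client, userdata, flags, rc):
    # Subscribe once the broker connection is established;
    # '+' matches any single device-id segment in the topic path
    client.subscribe("sensors/+/temperature")

def on_message(client, userdata, msg):
    # Each message carries one sensor reading as a UTF-8 payload;
    # a real handler would validate it and forward it to the backend
    print(msg.topic, msg.payload.decode("utf-8"))

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect("localhost", 1883)
client.loop_forever()  # blocks, dispatching the callbacks above
```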

Posted 1 day ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Job Description:
Key Responsibilities:
- Design and build solutions for complex business workflows
- Understand the user persona and deliver a slick experience
- Take end-to-end ownership of components and be responsible for the subsystems you work on, from design, code, testing, integration, and deployment through enhancements
- Write high-quality code and take responsibility for your tasks
- Solve performance bottlenecks
- Mentor junior engineers
- Communicate and collaborate with management, product, QA, and UI/UX teams
- Deliver with quality, on time, in a fast-paced start-up environment

Minimum Qualifications:
- Bachelor's/Master's in computer science or relevant fields
- 3+ years of relevant experience
- Strong sense of ownership
- Excellent Java and object-oriented development skills
- Experience in building and scaling microservices
- Strong problem-solving skills, technical troubleshooting, and diagnosing
- Expected to be a role model for young engineers, with a strong sense of code quality and the ability to enforce code quality within the team
- Strong knowledge of RDBMS and NoSQL technologies
- Experience in developing backends for enterprise systems like eCommerce, manufacturing, supply chain, etc.
- Excellent understanding of debugging, performance, and optimization techniques
- Experience in Java, Mongo, MySQL, AWS technologies, the ELK stack, Spring Boot, and Kafka
- Experience in developing large-scale systems
- Experience in cloud technologies
- Demonstrated ability to deliver in a fast-paced environment

Preferred Skills and Attributes:
- Experience with modern cloud platforms and microservices-based deployments
- Knowledge of supply chain and eCommerce backend architectures
- Excellent communication and collaboration skills to work effectively in cross-functional teams

(ref:hirist.tech)

Posted 1 day ago

Apply

3.0 - 4.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Data Engineer (3-4 Years Experience) - Real-time & Batch Processing | AWS, Kafka, ClickHouse, Python
Location: NOIDA
Experience: 3-4 years
Job Type: Full-Time

About the Role:
We are looking for a skilled Data Engineer with 3-4 years of experience to design, build, and maintain real-time and batch data pipelines for handling large-scale datasets. You will work with AWS, Kafka, Cloudflare Workers, Python, ClickHouse, Redis, and other modern technologies to enable seamless data ingestion, transformation, merging, and storage. Bonus: Web Data Analytics or Programmatic Advertising knowledge is a big plus!

Responsibilities:
Real-Time Data Processing & Transformation:
- Build low-latency, high-throughput real-time pipelines using Kafka, Redis, Firehose, Lambda, and Cloudflare Workers
- Perform real-time data transformations like filtering, aggregation, enrichment, and deduplication using Kafka Streams, Redis Streams, or AWS Lambda (a minimal deduplication sketch follows this listing)
- Merge data from multiple real-time sources into a single structured dataset for analytics
Batch Data Processing & Transformation:
- Develop batch ETL/ELT pipelines for processing large-scale structured and unstructured data
- Perform data transformations, joins, and merging across different sources in ClickHouse, AWS Glue, or Python
- Optimize data ingestion, transformation, and storage workflows for efficiency and reliability
Data Pipeline Development & Optimization:
- Design, develop, and maintain scalable, fault-tolerant data pipelines for real-time and batch processing
- Optimize data workflows to reduce latency, cost, and compute load
Data Integration & Merging:
- Combine real-time and batch data streams for unified analytics
- Integrate data from various sources (APIs, databases, event streams, cloud storage)
Cloud Infrastructure & Storage:
- Work with AWS services (S3, EC2, ECS, Lambda, Firehose, RDS, Redshift, ClickHouse) for scalable data processing
- Implement data lake and warehouse solutions using S3, Redshift, and ClickHouse
Data Visualization & Reporting:
- Work with Power BI, Tableau, or Grafana to create real-time dashboards and analytical reports
Web Data Analytics & Programmatic Advertising (Big Plus!):
- Experience working with web tracking data, user behavior analytics, and digital marketing datasets
- Knowledge of programmatic advertising, ad impressions, clickstream data, and real-time bidding (RTB) analytics
Monitoring & Performance Optimization:
- Implement monitoring and logging of data pipelines using AWS CloudWatch, Prometheus, and Grafana
- Tune Kafka, ClickHouse, and Redis for high performance
Collaboration & Best Practices:
- Work closely with data analysts, software engineers, and DevOps teams to enhance data accessibility
- Follow best practices for data governance, security, and compliance

Must-Have Skills:
- Programming: Strong experience in Python and JavaScript
- Real-time Data Processing & Merging: Expertise in Kafka, Redis, Cloudflare Workers, Firehose, Lambda
- Batch Processing & Transformation: Experience with ClickHouse, Python, AWS Glue, and SQL-based transformations
- Data Storage & Integration: Experience with MySQL, ClickHouse, Redshift, and S3-based storage
- Cloud Technologies: Hands-on with AWS (S3, EC2, ECS, RDS, Firehose, ClickHouse, Lambda, Redshift)
- Visualization & Reporting: Knowledge of Power BI, Tableau, or Grafana
- CI/CD & Infrastructure as Code (IaC): Familiarity with Terraform, CloudFormation, Git, Docker, and Kubernetes

(ref:hirist.tech)
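One concrete reading of the real-time deduplication responsibility above: use a Redis SET with NX and a TTL as a seen-before check while consuming from Kafka. A minimal sketch, assuming the kafka-python and redis-py packages with local services; the topic, key scheme, and downstream handler are hypothetical.

```python
import json

import redis
from kafka import KafkaConsumer

r = redis.Redis(host="localhost", port=6379)

consumer = KafkaConsumer(
    "impressions",                        # hypothetical topic
    bootstrap_servers="localhost:9092",
    group_id="dedup-worker",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

def process(event: dict) -> None:
    # Hypothetical downstream handler (enrich, merge, write to ClickHouse, ...)
    print("processing", event)

for message in consumer:
    event = message.value
    # SET with nx=True succeeds only the first time a key is seen;
    # the one-hour TTL bounds Redis memory while deduplicating within a window
    if r.set(f"evt:{event['event_id']}", 1, nx=True, ex=3600):
        process(event)
```

The TTL is the key design trade-off here: it caps memory, but duplicates arriving further apart than the window would pass through, so the window must exceed the producer's worst-case retry horizon.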

Posted 1 day ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site


We are looking for a highly skilled and experienced Full Stack Developer with a strong background in Java, Spring Boot, Microservices, and Angular to join our development team. The ideal candidate will be responsible for the design, development, and maintenance of scalable and high-performance applications. This role requires deep technical knowledge, strong problem-solving abilities, and the capacity to lead and mentor junior developers.

Key Responsibilities:
- Design and develop robust, scalable, and secure full-stack applications using Java (Spring Boot) on the backend and Angular with Material Design on the frontend
- Own end-to-end application development, from requirement analysis and design to development, testing, deployment, and maintenance
- Write clean, efficient, and maintainable code that adheres to best practices in software engineering
- Collaborate with cross-functional teams including Product Managers, UI/UX Designers, QA Engineers, and DevOps
- Review and interpret business requirements into technical solutions
- Conduct code reviews, mentor junior developers, and ensure code quality through static analysis, unit testing, and integration testing
- Optimize application performance through monitoring, tuning, and debugging multithreaded and high-throughput systems
- Participate in architectural discussions and design planning for new features and system improvements
- Write and maintain technical documentation, including design documents and implementation specifications
- Stay current with emerging technologies and industry trends

Required Technical Skills:
Backend:
- Strong proficiency in Java (Java 8 or higher)
- Spring Framework / Spring Boot
- Microservices architecture and RESTful API development
- Multithreading and concurrent programming
- Experience with JPA/Hibernate, SQL, and relational databases (e.g., MySQL, PostgreSQL)
- Familiarity with messaging frameworks (e.g., Apache Kafka, RabbitMQ, ActiveMQ)
Frontend:
- Solid experience with Angular (Angular 8+)
- Experience using Angular Material Design
- Proficient in TypeScript, HTML5, CSS3, and responsive design
- Understanding of state management and component-based architecture
DevOps & Deployment:
- Working knowledge of application servers (Tomcat, JBoss, etc.)
- Familiarity with CI/CD pipelines (Jenkins, GitHub Actions, etc.)
- Experience with containerization (Docker; Kubernetes is a plus)
- Version control using Git (GitHub, GitLab, Bitbucket)
Testing & Quality:
- Unit and integration testing frameworks (JUnit, Mockito, Jasmine, Karma)
- Understanding of automated build and test environments

Soft Skills & Competencies:
- Strong analytical and problem-solving skills
- Ability to work independently and as part of a team
- Strong written and verbal communication skills
- Attention to detail and commitment to producing high-quality work
- Proactive in identifying problems and suggesting solutions
- Experience working in Agile/Scrum environments

Preferred Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field
- Experience in cloud-based development (AWS, Azure, or GCP)
- Exposure to monitoring tools (New Relic, Prometheus, etc.)
- Familiarity with performance profiling and system tuning

(ref:hirist.tech)

Posted 1 day ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Spendflo is a fast-growing Series A startup helping companies streamline how they procure, manage, and optimize their software and services. Backed by top-tier investors, we're building the most intelligent, automated platform for procurement operations. We are now looking for a Senior Data Engineer to design, build, and scale our data infrastructure. You'll be the backbone of all data movement at Spendflo, from ingestion through transformation to reporting.

What You'll Do:
- Design, implement, and own the end-to-end data architecture at Spendflo
- Build and maintain robust, scalable ETL/ELT pipelines across multiple sources and systems
- Develop and optimize data models for analytics, reporting, and product needs
- Own the reporting layer and work with PMs, analysts, and leadership to deliver actionable data
- Ensure data quality, consistency, and lineage through validation and monitoring
- Collaborate with engineering, product, and data science teams to build seamless data flows
- Optimize data storage and query performance for scale and speed
- Own documentation for pipelines, models, and data flows
- Stay current with the latest data tools and bring in the right technologies
- Mentor junior data engineers and help establish data best practices

Required Qualifications:
- 5+ years of experience as a data engineer, preferably in a product/startup environment
- Strong expertise in building ETL/ELT pipelines using modern frameworks (e.g., Dagster, dbt, Airflow)
- Deep knowledge of data modeling (star/snowflake schemas, denormalization, dimensional modeling)
- Hands-on with SQL (advanced queries, performance tuning, window functions, etc.)
- Experience with cloud data warehouses like Redshift, BigQuery, Snowflake, or similar
- Comfortable working with cloud platforms (AWS/GCP/Azure) and tools like S3, Lambda, etc.
- Exposure to BI tools like Looker, Power BI, Tableau, or equivalent
- Strong debugging and performance tuning skills
- Excellent communication and documentation skills

Preferred Qualifications:
- Built or managed large-scale, cloud-native data pipelines
- Experience with real-time or stream processing (Kafka, Kinesis, etc.)
- Understanding of data governance, privacy, and security best practices
- Exposure to machine learning pipelines or collaboration with data science teams
- Startup experience: able to handle ambiguity, a fast pace, and end-to-end ownership

(ref:hirist.tech)

Posted 1 day ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Job Summary:
Prana tree is seeking a highly skilled and experienced Software Engineer Level 2 (Backend Developer with Java and object-oriented development skills) to design and implement solutions for complex business workflows. The ideal candidate will possess exceptional coding skills, a strong sense of ownership, and the ability to mentor junior engineers while collaborating across teams to deliver high-quality solutions in a fast-paced startup environment.

Key Responsibilities:
- Design and build solutions for complex business workflows
- Understand the user persona and deliver a slick experience
- Take end-to-end ownership of components and be responsible for the subsystems you work on, from design, code, testing, integration, and deployment through enhancements
- Write high-quality code and take responsibility for your tasks
- Solve performance bottlenecks
- Mentor junior engineers
- Communicate and collaborate with management, product, QA, and UI/UX teams
- Deliver with quality, on time, in a fast-paced start-up environment

Minimum Qualifications:
- Bachelor's/Master's in computer science or relevant fields
- 3+ years of relevant experience
- Strong sense of ownership
- Excellent Java and object-oriented development skills
- Experience in building and scaling microservices
- Strong problem-solving skills, technical troubleshooting, and diagnosing
- Expected to be a role model for young engineers, with a strong sense of code quality and the ability to enforce code quality within the team
- Strong knowledge of RDBMS and NoSQL technologies
- Experience in developing backends for enterprise systems like eCommerce, manufacturing, supply chain, etc.
- Excellent understanding of debugging, performance, and optimization techniques
- Experience in Java, Mongo, MySQL, AWS technologies, the ELK stack, Spring Boot, and Kafka
- Experience in developing large-scale systems
- Experience in cloud technologies

(ref:hirist.tech)

Posted 1 day ago

Apply

2.0 years

0 Lacs

Mumbai Metropolitan Region

On-site


We're hiring a Python SDE 1 to join our Commerce Team. The Commerce Engineering Team forms the backbone of our core business. We build and iterate over our core platform, which handles everything from onboarding a seller to serving the finished products to end customers across different channels, with customisation and configuration. Our team consists of generalist engineers who work on building REST APIs, internal tools, and infrastructure.

Some Specific Requirements:
- At least 2+ years of development experience
- Prior experience developing and working on consumer-facing web/app products
- Solid experience in Python, with experience in building web/app-based tech products
- Experience in at least one of the following frameworks: Sanic, Django, Flask, Falcon, web2py, Twisted, Tornado
- Working knowledge of MySQL, MongoDB, Redis, Aerospike
- Good understanding of Data Structures, Algorithms, and Operating Systems
- Experience with core AWS services: EC2, ELB, AutoScaling, CloudFront, S3, ElastiCache
- Understanding of Kafka, Docker, Kubernetes
- Knowledge of Solr, Elasticsearch
- Attention to detail
- Ability to dabble in frontend codebases using HTML, CSS, and JavaScript
- You love doing things efficiently; the work you do will have a disproportionate impact on the business. We believe in systems and processes that let us scale our impact to be larger than ourselves
- You might not have experience with all the tools we use, but you can learn them given guidance and resources

(ref:hirist.tech)

Posted 1 day ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Job Summary:
We are seeking a versatile and highly skilled Senior Software Engineer with expertise in full stack development, mobile application development using Flutter, and backend systems using Java/Spring Boot. The ideal candidate will have strong experience across modern development stacks, cloud platforms (AWS), containerization, and CI/CD pipelines.

Key Responsibilities:
- Design and develop scalable web, mobile, and backend applications
- Build high-quality, performant cross-platform mobile apps using Flutter and Dart
- Develop RESTful APIs and services using Node.js/Express and Java/Spring Boot
- Integrate frontend components with backend logic and databases (Oracle, PostgreSQL, MongoDB)
- Work with containerization tools like Docker and orchestration platforms like Kubernetes or ROSA
- Leverage AWS cloud services for deployment, scalability, and monitoring (e.g., EC2, S3, RDS, Lambda)
- Collaborate with cross-functional teams including UI/UX, QA, DevOps, and product managers
- Participate in Agile ceremonies, code reviews, unit/integration testing, and performance tuning
- Maintain secure coding practices and ensure compliance with security standards

Required Skills & Qualifications:
- Strong programming in Java (Spring Boot), Node.js, and React.js
- Proficiency in Flutter & Dart for mobile development
- Experience with REST APIs, JSON, and third-party integrations
- Hands-on experience with cloud platforms (preferably AWS)
- Strong skills in databases such as Oracle, PostgreSQL, MongoDB
- Experience with Git and CI/CD tools (Jenkins, GitLab CI, GitHub Actions)
- Familiarity with containerization using Docker and orchestration via Kubernetes
- Knowledge of secure application development (OAuth, JWT, encryption)
- Solid understanding of Agile/Scrum methodologies

Preferred Qualifications:
- Experience with Firebase, messaging queues (Kafka/RabbitMQ), and server-side rendering (Next.js)

(ref:hirist.tech)

Posted 1 day ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Job Summary:
Our client is seeking a highly skilled and experienced Software Engineer Level 2 (Backend Developer with Java and object-oriented development skills) to design and implement solutions for complex business workflows. The ideal candidate will possess exceptional coding skills, a strong sense of ownership, and the ability to mentor junior engineers while collaborating across teams to deliver high-quality solutions in a fast-paced startup environment.

Key Responsibilities:
- Design and build solutions for complex business workflows
- Understand the user persona and deliver a slick experience
- Take end-to-end ownership of components and be responsible for the subsystems you work on, from design, code, testing, integration, and deployment through enhancements
- Write high-quality code and take responsibility for your tasks
- Solve performance bottlenecks
- Mentor junior engineers
- Communicate and collaborate with management, product, QA, and UI/UX teams
- Deliver with quality, on time, in a fast-paced start-up environment

Minimum Qualifications:
- Bachelor's/Master's in computer science or relevant fields
- 3+ years of relevant experience
- Strong sense of ownership
- Excellent Java and object-oriented development skills
- Experience in building and scaling microservices
- Strong problem-solving skills, technical troubleshooting, and diagnosing
- Expected to be a role model for young engineers, with a strong sense of code quality and the ability to enforce code quality within the team
- Strong knowledge of RDBMS and NoSQL technologies
- Experience in developing backends for enterprise systems like eCommerce, manufacturing, supply chain, etc.
- Excellent understanding of debugging, performance, and optimization techniques
- Experience in Java, Mongo, MySQL, AWS technologies, the ELK stack, Spring Boot, and Kafka
- Experience in developing large-scale systems
- Experience in cloud technologies
- Demonstrated ability to deliver in a fast-paced environment

Preferred Skills and Attributes:
- Experience with modern cloud platforms and microservices-based deployments
- Knowledge of supply chain and eCommerce backend architectures
- Excellent communication and collaboration skills to work effectively in cross-functional teams

(ref:hirist.tech)

Posted 1 day ago

Apply

0 years

0 Lacs

Delhi, India

On-site


What You'll Do:
- Architect and scale modern data infrastructure: ingestion, transformation, warehousing, and access
- Define and drive enterprise data strategy: governance, quality, security, and lifecycle management
- Design scalable data platforms that support both operational insights and ML/AI applications
- Translate complex business requirements into robust, modular data systems
- Lead cross-functional teams of engineers, analysts, and developers on large-scale data initiatives
- Evaluate and implement best-in-class tools for orchestration, warehousing, and metadata management
- Establish technical standards and best practices for data engineering at scale
- Spearhead integration efforts to unify data across legacy and modern platforms

What You Bring:
- Experience in data engineering, architecture, or backend systems
- Strong grasp of system design, distributed data platforms, and scalable infrastructure
- Deep hands-on experience with cloud platforms (AWS, Azure, or GCP) and tools like Redshift, BigQuery, Snowflake, S3, Lambda
- Expertise in data modeling (OLTP/OLAP), ETL pipelines, and data warehousing
- Experience with big data ecosystems: Kafka, Spark, Hive, Presto
- Solid understanding of data governance, security, and compliance frameworks
- Proven track record of technical leadership and mentoring
- Strong collaboration and communication skills to align tech with business
- Bachelor's or Master's in Computer Science, Data Engineering, or a related field

Nice to Have (Your Edge):
- Experience with real-time data streaming and event-driven architectures
- Exposure to MLOps and model deployment pipelines
- Familiarity with data DevOps and Infrastructure as Code (Terraform, CloudFormation, CI/CD pipelines)

(ref:hirist.tech)

Posted 1 day ago

Apply

1.0 years

0 Lacs

Surat, Gujarat, India

On-site


We are looking for a skilled and motivated Node.js Backend Developer to join our dynamic team. You will be responsible for developing, maintaining, and optimizing scalable backend solutions that power our web and mobile applications.

Key Responsibilities:
- Design, develop, and maintain RESTful APIs using Node.js
- Write clean, scalable, and efficient backend code
- Integrate third-party APIs and data sources
- Optimize application performance and scalability
- Collaborate with frontend developers, designers, and product teams
- Troubleshoot, debug, and upgrade existing systems
- Ensure high-quality code through code reviews and unit testing
- Implement security and data protection best practices
- Work with databases like MongoDB, MySQL, or PostgreSQL
- Participate in agile development processes (Scrum/Kanban)

Requirements:
- 1-3 years of experience in Node.js backend development
- Strong understanding of JavaScript (ES6+), Node.js, and Express.js
- Experience working with databases: MongoDB / MySQL / PostgreSQL
- Familiarity with RESTful API design and integration
- Knowledge of authentication & authorization (JWT, OAuth)
- Experience with version control systems like Git
- Good understanding of asynchronous programming
- Knowledge of API security, performance optimization, and scalability
- Familiarity with cloud platforms (AWS, Azure, or GCP) is a plus
- Experience with Docker & CI/CD pipelines is a plus

Soft Skills:
- Problem-solving attitude
- Strong communication & collaboration skills
- Ability to work independently and in a team
- Willingness to learn and adapt to new technologies

Good to Have:
- Experience with GraphQL
- Familiarity with microservices architecture
- Experience with message brokers like RabbitMQ or Kafka
- Basic knowledge of DevOps practices

Salary: 6-10 LPA

(ref:hirist.tech)

Posted 1 day ago

Apply

7.0 years

0 Lacs

Greater Kolkata Area

Remote


Omni's team is passionate about Commerce and Digital Transformation. We've been successfully delivering Commerce solutions for clients across North America, Europe, Asia, and Australia. The team has experience executing and delivering projects in B2B and B2C solutions.

Job Description:
This is a remote position. We are seeking a Senior Data Engineer to architect and build robust, scalable, and efficient data systems that power AI and Analytics solutions. You will design end-to-end data pipelines, optimize data storage, and ensure seamless data availability for machine learning and business analytics use cases. This role demands deep engineering excellence, balancing performance, reliability, security, and cost to support real-world AI applications.

Key Responsibilities:
- Architect, design, and implement high-throughput ETL/ELT pipelines for batch and real-time data processing
- Build cloud-native data platforms: data lakes, data warehouses, feature stores
- Work with structured, semi-structured, and unstructured data at petabyte scale
- Optimize data pipelines for latency, throughput, cost-efficiency, and fault tolerance
- Implement data governance, lineage, quality checks, and metadata management
- Collaborate closely with Data Scientists and ML Engineers to prepare data pipelines for model training and inference
- Implement streaming data architectures using Kafka, Spark Streaming, or AWS Kinesis (a minimal structured-streaming sketch follows this listing)
- Automate infrastructure deployment using Terraform, CloudFormation, or Kubernetes operators

Requirements:
- 7+ years in Data Engineering, Big Data, or Cloud Data Platform roles
- Strong proficiency in Python and SQL
- Deep expertise in distributed data systems (Spark, Hive, Presto, Dask)
- Cloud-native engineering experience (AWS, GCP, Azure): BigQuery, Redshift, EMR, Databricks, etc.
- Experience designing event-driven architectures and streaming systems (Kafka, Pub/Sub, Flink)
- Strong background in data modeling (star schema, OLAP cubes, graph databases)
- Proven experience with data security, encryption, and compliance standards (e.g., GDPR, HIPAA)

Preferred Skills:
- Experience in MLOps enablement: creating feature stores and versioned datasets
- Familiarity with real-time analytics platforms (ClickHouse, Apache Pinot)
- Exposure to data observability tools like Monte Carlo, Databand, or similar
- Passionate about building high-scale, resilient, and secure data systems
- Excited to support AI/ML innovation with state-of-the-art data infrastructure
- Obsessed with automation, scalability, and best engineering practices

(ref:hirist.tech)
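For the Kafka/Spark Streaming architectures named above, here is a minimal PySpark Structured Streaming sketch, assuming the spark-sql-kafka connector is on the classpath; the topic name and console sink are illustrative stand-ins for a real source and destination.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("kafka-stream-demo").getOrCreate()

# Read a stream of records from Kafka; key and value arrive as binary columns
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "events")  # hypothetical topic
    .load()
)

# Cast the binary payload to string and stream it to the console sink;
# a real pipeline would parse JSON and write to a lake or warehouse instead
query = (
    events.selectExpr("CAST(value AS STRING) AS value")
    .writeStream.format("console")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```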

Posted 1 day ago

Apply

5.0 years

0 Lacs

Greater Kolkata Area

Remote


Data Engineer - Google Cloud
Location: Remote, India

About Us:
Aviato Consulting is looking for a highly skilled and motivated Data Engineer to join our expanding team. This role is ideal for someone with a deep understanding of cloud-based data solutions, with a focus on Google Cloud (GCP) and associated technologies. GCP certification is mandatory for this position to ensure the highest level of expertise and professionalism. You will work directly with clients, translating their business requirements into scalable data solutions, while providing technical expertise and guidance.

Key Responsibilities:
- Client Engagement: Work closely with clients to understand business needs, gather technical requirements, and design solutions leveraging GCP services
- Data Pipeline Design & Development: Build and manage scalable data pipelines using tools such as Apache Beam, Cloud Dataflow, and Cloud Composer (a minimal Beam sketch follows this listing)
- Data Warehousing & Lake Solutions: Architect, implement, and optimize BigQuery-based data lakes and warehouses
- Real-Time Data Processing: Implement and manage streaming data pipelines using Kafka, Pub/Sub, and similar technologies
- Data Analysis & Visualization: Create insightful data dashboards and visualizations using tools like Looker, Data Studio, or Tableau
- Technical Leadership & Mentorship: Provide guidance and mentorship to team members and clients, helping them leverage the full potential of Google Cloud

Required Qualifications:
Experience:
- 5+ years as a Data Engineer working with cloud-based platforms
- Proven experience in Python with libraries like Pandas and NumPy
- Strong understanding of and experience with FastAPI for building APIs
- Expertise in building data pipelines using Apache Beam, Cloud Dataflow, or similar tools
- Solid knowledge of Kafka for real-time data streaming
- Proficiency with BigQuery, Google Pub/Sub, and other Google Cloud services
- Familiarity with Apache Hadoop for distributed data processing
Technical Skills:
- Strong understanding of data architecture and processing techniques
- Experience with big data environments and tools like Apache Hadoop
- Solid understanding of ETL pipelines: data ingestion, transformation, and storage
- Knowledge of data modeling, data warehousing, and big data management principles
Certifications:
- Google Cloud certification (Professional Data Engineer, Cloud Architect) is mandatory for this role
Soft Skills:
- Excellent English communication skills
- Client-facing experience and the ability to manage client relationships effectively
- Strong problem-solving skills with a results-oriented approach

Preferred Qualifications:
- Visualization Tools: Experience with tools like Looker, Power BI, or Tableau

Benefits:
- Competitive salary and benefits package
- Opportunities to work with cutting-edge cloud technologies with large customers
- Collaborative work environment that encourages learning and professional growth
- A chance to work on high-impact projects for leading clients in diverse industries

If you're passionate about data engineering, cloud technologies, and solving complex data problems for clients, we'd love to hear from you! (ref:hirist.tech)
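As a sketch of the Beam/Dataflow pipeline work described above, assuming the apache-beam package; the bucket paths and record shape are hypothetical, and the same pipeline can target Cloud Dataflow by changing only the runner options.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# DirectRunner by default; pass --runner=DataflowRunner plus GCP project,
# region, and staging options to execute the same pipeline on Cloud Dataflow
options = PipelineOptions()

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/input/*.csv")
        | "Parse" >> beam.Map(lambda line: line.split(","))
        | "KeepValid" >> beam.Filter(lambda fields: len(fields) == 3)
        | "Format" >> beam.Map(",".join)
        | "Write" >> beam.io.WriteToText("gs://example-bucket/output/part")
    )
```

The point of the Beam model is that this transform graph is runner-independent: the identical code runs locally for tests and on Dataflow for production scale.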

Posted 1 day ago

Apply

5.0 years

0 Lacs

Greater Kolkata Area

On-site


About Sleek:
Through proprietary software and AI, along with a focus on customer delight, Sleek makes the back office easy for micro SMEs. We give Entrepreneurs time back to focus on what they love doing: growing their business and being with customers. With a surging number of Entrepreneurs globally, we are innovating in a highly lucrative space.

We Operate 3 Business Segments:
- Corporate Secretary: Automating the company incorporation, secretarial, filing, Nominee Director, mailroom, and immigration processes via custom online robots and SleekSign. We are the market leaders in Singapore with 5% market share of all new business incorporations.
- Accounting & Bookkeeping: Redefining what it means to do Accounting, Bookkeeping, Tax, and Payroll thanks to our proprietary SleekBooks ledger, AI tools, and exceptional customer service.
- FinTech payments: Overcoming a key challenge for Entrepreneurs by offering digital banking services to new businesses.

Sleek launched in 2017 and now has around 15,000 customers across our offices in Singapore, Hong Kong, Australia and the UK. We have around 450 staff with an intact startup mindset. We have achieved >70% compound annual growth in revenue over the last 5 years and, as a result, have been recognised by The Financial Times, The Straits Times, Forbes and LinkedIn as one of the fastest growing companies in Asia. Backed by world-class investors, we are on track to be one of the few cash flow positive, tech-enabled unicorns based out of Singapore.

The Role:
We are looking for an experienced Senior Data Engineer to join our growing team. As a key member of our data team, you will design, build, and maintain scalable data pipelines and infrastructure to enable data-driven decision-making across the organization. This role is ideal for a proactive, detail-oriented individual passionate about optimizing and leveraging data for impactful business outcomes.

Objectives:
- Work closely with cross-functional teams to translate our business vision into impactful data solutions
- Drive the alignment of data architecture requirements with strategic goals, ensuring each solution not only meets analytical needs but also advances our core objectives
- Be pivotal in bridging the gap between business insights and technical execution by tackling complex challenges in data integration, modeling, and security, and by setting the stage for exceptional data performance and insights
- Shape the data roadmap, influence design decisions, and empower our team to deliver innovative, scalable, high-quality data solutions every day

Key Outcomes:
- Achieve and maintain a data accuracy rate of at least 99% for all business-critical dashboards by start of day (accounting for corrections and job failures), with 24-business-hour detection of errors and a 5-day correction SLA
- Ensure 95% of data on dashboards originates from technical data pipelines to mitigate data drift
- Set up strategic dashboards based on business needs which are robust, scalable, and easy and quick to operate and maintain
- Reduce costs of data warehousing and pipelines by 30%, then maintain costs as data needs grow
- Achieve 50 eNPS on data services (e.g. dashboards) from key business stakeholders

Responsibilities:
- Data Pipeline Development: Design, implement, and optimize robust, scalable ETL/ELT pipelines to process large volumes of structured and unstructured data
- Data Modeling: Develop and maintain conceptual, logical, and physical data models to support analytics and reporting requirements
- Infrastructure Management: Architect, deploy, and maintain cloud-based data platforms (e.g., AWS, GCP)
- Collaboration: Work closely with data analysts, business owners, and stakeholders to understand data requirements and deliver reliable solutions, including designing and implementing robust, efficient, and scalable data visualization on Tableau or LookerStudio
- Data Governance: Ensure data quality, consistency, and security through robust validation and monitoring frameworks
- Performance Optimization: Monitor, troubleshoot, and optimize the performance of data systems and pipelines
- Innovation: Stay up to date with the latest industry trends and emerging technologies to continuously improve data engineering practices

Skills & Qualifications:
- Experience: 5+ years in data engineering, software engineering, or a related field
- Technical Proficiency: Proficiency in working with relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra); familiarity with big data frameworks like Hadoop, Hive, Spark, Airflow, BigQuery, etc.; strong expertise in programming languages such as Python, NodeJS, SQL, etc.
- Cloud Platforms: Advanced knowledge of cloud platforms (AWS or GCP) and their associated data services
- Data Warehousing: Expertise in modern data warehouses like BigQuery, Snowflake, or Redshift
- Tools & Frameworks: Expertise in version control systems (e.g., Git), CI/CD, and JIRA pipelines
- Big Data Ecosystems / BI: BigQuery, Tableau, LookerStudio
- Industry Domain Knowledge: Google Analytics (GA), Hubspot, Accounting/Compliance, etc.
- Soft Skills: Excellent problem-solving abilities, attention to detail, and strong communication skills

Preferred Qualifications:
- Degree in Computer Science, Engineering, or a related field
- Experience with real-time data streaming technologies (e.g., Kafka, Kinesis)
- Familiarity with machine learning pipelines and tools
- Knowledge of data security best practices and regulatory requirements

The Interview Process:
The successful candidate will participate in the below interview stages (note that the order might be different to what you read below). We anticipate the process to last no more than 3 weeks from start to finish. Whether the interviews are held over video call or in person will depend on your location and the role.
- Case study: a 60-minute chat with the Data Analyst, who will give you some real-life challenges that this role faces and ask for your approach to solving them
- Career deep dive: a 60-minute chat with the Hiring Manager (COO), who will discuss your last 1-2 roles to understand your experience in more detail
- Behavioural fit assessment: a 60-minute chat with our Head of HR or Head of Hiring, who will dive into some of your recent work situations to understand how you think and work
- Offer + reference interviews: we'll make a non-binding offer verbally or over email, followed by a couple of short phone or video calls with references that you provide

Background Screening:
Please be aware that Sleek is a regulated entity and as such is required to perform different levels of background checks on staff depending on their role. This may include using external vendors to verify the below:
- Your education
- Any criminal history
- Any political exposure
- Any bankruptcy or adverse credit history
We will ask for your consent before conducting these checks. Depending on your role at Sleek, an adverse result on one of these checks may prohibit you from passing probation.

(ref:hirist.tech)

Posted 1 day ago

Apply

2.0 years

0 Lacs

Greater Kolkata Area

On-site


Role & Responsibilities

Minimum 2 years of experience in Java and related technologies.
Must have: Core Java, Spring, Spring Boot, APIs, and SQL.
Good to have: Kafka, Angular, React.
Excellent verbal and written communication skills.
Solid experience working with clients directly.
Strong understanding of data structures, algorithms, object-oriented design, and design patterns.
Solid understanding of and experience with agile software development.

Notes: We are looking only for developers who are strong at and enjoy programming. Presence in office all 5 days of the week is mandatory (except in cases of genuine need).

Qualification: Engineering Grad.

Interview Process: Candidates should expect 3 rounds of personal or telephonic interviews to assess suitability. Compensation will be competitive according to industry standards.

The opportunity is now! If you are interested in being part of a dynamic team, serving clients across industry domains, learning the latest technologies, and reaching your full potential, this is the role for you. (ref:hirist.tech)

Posted 1 day ago

Apply

10.0 - 15.0 years

0 Lacs

Sahibzada Ajit Singh Nagar, Punjab, India

On-site


Job Title: Director - AI Automation & Data Sciences
Experience Required: 10-15 Years
Industry: Legal Technology / Cybersecurity / Data Science
Department: Technology & Innovation

About The Role

We are seeking an exceptional Director of AI Automation & Data Sciences to lead the innovation engine behind our Managed Document Review and Cyber Incident Response services. This is a senior leadership role in which you'll leverage advanced AI and data science to drive automation, scalability, and differentiation in service delivery. If you are a visionary leader who thrives at the intersection of technology and operations, this is your opportunity to make a global impact.

Why Join Us

Cutting-edge AI & data science technologies at your fingertips
Globally recognized Cyber Incident Response Team
Prestigious clientele of Fortune 500 companies and industry leaders
Award-winning, inspirational workspaces
Transparent, inclusive, and growth-driven culture
Industry-best compensation that recognizes excellence

Key Responsibilities (KRAs)

Lead and scale AI & data science initiatives across Document Review and Incident Response programs
Architect intelligent automation workflows to streamline legal review, anomaly detection, and threat analytics
Drive end-to-end deployment of ML and NLP models into production environments
Identify and implement AI use cases that deliver measurable business outcomes
Collaborate with cross-functional teams including Legal Tech, Cybersecurity, Product, and Engineering
Manage and mentor a high-performing team of data scientists, ML engineers, and automation specialists
Evaluate and integrate third-party AI platforms and open-source tools for accelerated innovation
Ensure AI models comply with privacy, compliance, and ethical AI principles
Define and monitor key metrics to track model performance and automation ROI
Stay abreast of emerging trends in generative AI, LLMs, and cybersecurity analytics

Technical Skills & Tools

Proficiency in Python, R, or Scala for data science and automation scripting
Expertise in machine learning, deep learning, and NLP techniques
Hands-on experience with LLMs, Transformer models, and vector databases
Strong knowledge of data engineering pipelines: ETL, data lakes, and real-time analytics
Familiarity with cyber threat intelligence, anomaly detection, and event correlation
Experience with platforms like AWS SageMaker, Azure ML, Databricks, and HuggingFace
Advanced use of TensorFlow, PyTorch, spaCy, scikit-learn, or similar frameworks
Knowledge of containerization (Docker, Kubernetes) and CI/CD pipelines for MLOps
Strong command of SQL, NoSQL, and big data tools (Spark, Kafka)

Qualifications

Bachelor's or Master's in Computer Science, Data Science, AI, or a related field
10-15 years of progressive experience in AI, Data Science, or Automation
Proven leadership of cross-functional technology teams in high-growth environments
Experience working in LegalTech, Cybersecurity, or related high-compliance industries preferred

(ref:hirist.tech)

Posted 1 day ago

Apply

2.0 years

0 Lacs

Pune, Maharashtra, India

On-site


The Java Full Stack Developer is responsible for establishing and implementing new or revised application systems and programs in coordination with the Technology team.

Responsibilities:

Work in an agile environment, following the best practices of agile Scrum.
Analyze requirements, seek clarifications, contribute to good acceptance criteria, estimate, and be committed.
Take pride in designing solutions and developing code free from defects and vulnerabilities, meeting functional and non-functional requirements by following modern engineering practices, reducing rework, and continuously addressing technical debt.
Contribute to overall team performance by helping others and peer-reviewing code diligently.
Bring agility to application development through DevOps practices: automated builds, unit/functional tests, static/dynamic scans, regression tests, etc.
Provide the best possible customer support by troubleshooting and resolving production incidents and by eliminating problems at the root.
Bring innovative solutions that reduce operational risk by automating mundane, repetitive tasks across the SDLC.
Learn to become a full stack developer to address end-to-end delivery of user stories.

Qualifications:

2+ years of professional full stack software engineering experience developing enterprise-scale applications.
Expertise in building web applications using the Java, Angular/React, and Oracle/PostgreSQL technology stack.
Expertise in enterprise integrations through RESTful APIs, Kafka messaging, etc.
Expertise in Elasticsearch, NoSQL databases, and caching solutions.
Expertise in designing and optimizing software solutions for performance and stability.
Expertise in troubleshooting and problem solving.
Expertise in test-driven development.
Expertise in authentication, authorization, and security.

Education:

Bachelor’s degree/University degree or equivalent experience

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.

------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Applications Development
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Most Relevant Skills: Please see the requirements listed above.
------------------------------------------------------
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.

Posted 1 day ago

Apply

0 years

0 Lacs

Jaipur, Rajasthan, India

On-site


Job Description: Java/J2EE Developer
Location: Jaipur, Rajasthan, India (Work From Office)

Job Summary

We are seeking a highly skilled and motivated Senior Java/J2EE Developer with a strong foundation in core Java and J2EE technologies. The ideal candidate will have proven expertise in designing and developing robust, scalable, and highly available applications from the ground up. This role requires a deep understanding of architectural patterns, a strong algorithmic thought process, and hands-on experience in delivering solutions across diverse deployment environments, including traditional data centers and cloud platforms. The candidate should be able to work independently, possess excellent problem-solving skills, and have a passion for learning and adopting new technologies.

Responsibilities

Design, develop, and implement high-performance, scalable, and secure Java/J2EE applications and microservices.
Write clean, well-documented, and efficient code adhering to best practices and coding standards.
Participate in all phases of the software development lifecycle, including requirements gathering, design, development, testing, deployment, and maintenance.
Contribute to the application and core design, making critical architectural decisions.
Apply sound algorithmic thinking to solve complex technical challenges.
Develop and integrate with relational databases (e.g., MySQL, MSSQL, Oracle) and NoSQL databases (e.g., MongoDB).
Implement and consume web services (SOAP/RESTful).
Work with messaging systems like JMS, RabbitMQ, or Kafka.
Ensure the performance, scalability, and availability of applications deployed across various environments (traditional data centers, public clouds like AWS, Azure, and Google Cloud, and private clouds).
Implement security best practices in application design and development.
Troubleshoot and resolve complex technical issues, including performance bottlenecks and scalability challenges.
Collaborate effectively with cross-functional teams, including product managers, QA engineers, and DevOps engineers.
Contribute to the continuous improvement of development processes and methodologies.
Stay up to date with the latest technology trends and proactively suggest adoption where beneficial.
Work independently with minimal supervision and take ownership of assigned tasks.
Contribute to and adhere to microservices design principles and best practices.
Utilize and integrate with CI/CD pipelines (e.g., Jenkins, Bitrise, CircleCI, TravisCI).
Understand and work with Content Delivery Networks (CDNs) like CloudFront, Akamai, and Cloudflare.
Apply strong analytical and problem-solving skills to identify and resolve technical issues.

Must Have Skills & Experience

Core Java: Deep understanding of core Java concepts, including data structures, algorithms, multithreading, concurrency, and garbage collection.
J2EE: Extensive experience with J2EE technologies and frameworks, including Servlets, JSP, EJBs (preferably stateless), and related APIs.
Spring Framework: Strong proficiency in the Spring ecosystem, including Spring Core, Spring MVC, Spring Boot, Spring Security, and Spring Data JPA/Hibernate.
Hibernate/JPA: Solid experience with object-relational mapping (ORM) frameworks like Hibernate and JPA.
Messaging Systems: Hands-on experience with at least one of the following messaging systems: JMS, RabbitMQ, or Kafka.
Web Services: Proven ability to design, develop, and consume web services (RESTful and/or SOAP).
Databases: Strong working knowledge of relational databases such as MySQL, MSSQL, and Oracle, including SQL query optimization.
Design Patterns: In-depth understanding and practical application of various design patterns (e.g., creational, structural, behavioral).
NoSQL Databases: Familiarity with NoSQL databases like MongoDB and their use cases.
Microservices Architecture: Knowledge and practical experience in designing, developing, and deploying microservices.
Security Design: Understanding of security principles and best practices in application development, including authentication, authorization, and data protection.
Cloud Platforms: Sound knowledge of at least one major cloud platform (AWS, Azure, or Google Cloud) and its services.
CDNs: Familiarity with Content Delivery Networks (e.g., CloudFront, Akamai, Cloudflare) and their integration.
Problem Solving & Analytics: Excellent analytical and problem-solving skills with a strong aptitude for identifying and resolving complex technical issues, particularly those related to performance and scalability.
CI/CD: Experience working with Continuous Integration/Continuous Delivery platforms like Jenkins, Bitrise, CircleCI, and TravisCI.
Networking Protocols: Excellent understanding of standard internet protocols such as HTTP/HTTPS, DNS, and SSL/TLS.
Independent Work: Demonstrated ability to work independently, manage tasks effectively, and take ownership of deliverables.
Learning Agility: A strong passion for learning new technologies and proactively upgrading existing technology versions.

Good To Have Skills

Experience with containerization technologies like Docker and orchestration tools like Kubernetes.
Knowledge of front-end technologies like HTML, CSS, JavaScript, and related frameworks (e.g., React, Angular, Vue.js).
Experience with performance monitoring and logging tools (e.g., Prometheus, Grafana, ELK stack).
Familiarity with Agile development methodologies.
Experience with testing frameworks (e.g., JUnit, TestNG, Mockito).

Education And Experience

Bachelor's degree in Computer Science or a related field

(ref:hirist.tech)

Posted 1 day ago

Apply

0 years

0 Lacs

Jaipur, Rajasthan, India

On-site


Job Summary

We are seeking a highly skilled and motivated Java Developer with a strong foundation in Core Java and J2EE to join our dynamic team in Jaipur. The ideal candidate will possess hands-on experience in designing and developing robust and scalable applications from the ground up. You should have a proven track record of delivering highly available services across various technology stacks, including traditional data centers and cloud environments. This role requires a strong problem-solving aptitude, excellent analytical skills, and the ability to work independently while contributing effectively within a team. You will be involved in the full software development lifecycle, from design and implementation to testing and deployment.

Responsibilities

Design, develop, and implement high-performance and scalable Java/J2EE applications and microservices.
Write clean, well-documented, and efficient code following best practices and coding standards.
Participate in the entire application lifecycle, including requirements analysis, design, development, testing, deployment, and maintenance.
Design and implement robust and secure APIs and web services (RESTful/SOAP).
Work with relational databases (e.g., MySQL, MSSQL, Oracle, PostgreSQL) and NoSQL databases (e.g., MongoDB) to design and optimize data models and queries.
Apply design patterns and architectural best practices to ensure maintainability, scalability, and reliability of applications.
Develop and implement solutions for delivering highly available services on traditional data centers, public clouds (AWS, Azure, Google Cloud), and private clouds.
Implement security best practices in application development and deployment.
Troubleshoot and resolve complex technical issues related to performance, scalability, and stability.
Collaborate effectively with cross-functional teams, including product managers, designers, and QA engineers.
Contribute to the continuous improvement of development processes and tools.
Stay up to date with the latest Java technologies, frameworks, and industry trends.
Participate in code reviews to ensure code quality and adherence to standards.
Work with CI/CD pipelines (e.g., Jenkins, Bitrise, CircleCI, TravisCI) for automated build, test, and deployment processes.
Understand and implement standard protocols such as HTTP/HTTPS, DNS, SSL, etc.
Demonstrate a passion for learning new technologies and proactively upgrading existing technology stacks.

Must Have Skills

Core Java: Strong fundamentals and in-depth understanding of Core Java concepts (OOPs, data structures, algorithms, multithreading, concurrency, collections).
J2EE: Proven experience with J2EE technologies and frameworks (Servlets, JSP, and EJB (good to have), etc.).
Spring Framework: Extensive experience with the Spring ecosystem (Spring Core, Spring MVC, Spring Boot, Spring Security, Spring Data JPA/Hibernate).
Hibernate/JPA: Solid understanding and practical experience with object-relational mapping frameworks.
Messaging Systems: Hands-on experience with at least one of the following: JMS, RabbitMQ, or Kafka.
Web Services: Strong experience in developing and consuming RESTful and/or SOAP web services.
Databases: Proficient in working with relational databases (MySQL, MSSQL, Oracle) and writing complex SQL queries.
Design Patterns: Strong understanding and practical application of various design patterns (creational, structural, behavioral).
Database Knowledge: In-depth knowledge of relational database design principles and NoSQL database concepts.
Microservices: Strong understanding of and experience in microservices architecture, design principles, and security considerations, with the ability to work independently.
Problem-Solving: Excellent analytical and problem-solving skills with a strong aptitude for identifying and resolving technical challenges.
Cloud Platforms: Sound knowledge and practical experience with at least one major cloud platform (AWS, Azure, Google Cloud).
CDNs: Sound understanding of Content Delivery Networks (CloudFront, Akamai, Cloudflare) and their integration.
Performance and Scalability: Strong problem-solving and analytical skills specifically related to performance optimization and ensuring scalability of applications built on the mentioned technologies.
CI/CD: Experience working with Continuous Integration/Continuous Delivery platforms (Jenkins, Bitrise, CircleCI, TravisCI, etc.).
Networking Protocols: Excellent understanding of standard internet protocols such as HTTP/HTTPS, DNS, SSL, etc.
Learning Agility: Passion and eagerness to learn new technologies and adapt to evolving technology stacks.

Good To Have Skills

Experience with containerization technologies like Docker and orchestration tools like Kubernetes.
Familiarity with front-end technologies like HTML, CSS, JavaScript, and modern JavaScript frameworks (React, Angular, Vue.js).
Experience with testing frameworks (JUnit, Mockito, TestNG).
Knowledge of security best practices and common security vulnerabilities (OWASP).
Experience with monitoring and logging tools (e.g., Prometheus, Grafana, ELK stack).
Familiarity with Agile development methodologies.

(ref:hirist.tech)

Posted 1 day ago

Apply

5.0 years

0 Lacs

Greater Lucknow Area

On-site


Job Position: Python Developer - Kafka
Experience: 7+ Yrs
Location: Anywhere in India (Bangalore / any UST location)
NP: Immediate - 30 days
Skills: Python, FastAPI, LLM, SQL, Kafka, MongoDB

Job Description

We are seeking an experienced Python Developer with a strong background in web development and GenAI technologies. The ideal candidate will have a minimum of five years of experience in Python development, including working with frameworks like Flask or FastAPI, and integrating AI-driven features into applications.

Responsibilities

Design, develop, and maintain scalable web applications using Python and frameworks such as Flask or FastAPI, with a focus on AI-powered e-commerce solutions.
Develop and deploy RESTful APIs to integrate GenAI models into the e-commerce platform.
Implement and optimize GenAI models and frameworks to enhance application functionality and performance.
Identify, troubleshoot, and resolve technical issues to ensure seamless application performance.
Collaborate with cross-functional teams (engineering, product, design) to define, develop, and ship new features.

Requirements

5+ years of hands-on experience in Python development, with expertise in Flask or FastAPI.
Proven experience with Large Language Models (LLMs) and Natural Language Processing (NLP) libraries.
Strong working knowledge of web frameworks like Flask/FastAPI and their integration with AI technologies.
Practical experience with vector databases and Retrieval-Augmented Generation (RAG) frameworks.
Proficiency in working with SQL/NoSQL databases and caching mechanisms.
In-depth understanding of API security principles and microservices architecture.
Experience with cloud platforms (AWS preferred) and CI/CD pipelines (e.g., GitHub Actions, Jenkins).
Familiarity with version control systems, particularly Git.
Strong problem-solving abilities, with the capacity to work independently and collaboratively.
Excellent communication skills with the ability to explain complex technical concepts to non-technical stakeholders.

(ref:hirist.tech)

Posted 1 day ago

Apply

Exploring Kafka Jobs in India

Kafka, a popular distributed streaming platform, has gained significant traction in the tech industry in recent years. Job opportunities for Kafka professionals in India have been on the rise, with many companies looking to leverage Kafka for real-time data processing and analytics. If you are a job seeker interested in Kafka roles, here is a comprehensive guide to help you navigate the job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Mumbai
  5. Gurgaon

These cities are known for their thriving tech industries and have a high demand for Kafka professionals.

Average Salary Range

The average salary range for Kafka professionals in India varies by experience level. Entry-level positions may start at around INR 6-8 lakhs per annum, while experienced professionals can earn INR 12-20 lakhs per annum.

Career Path

Career progression in Kafka typically follows a path from Junior Developer to Senior Developer, and then to a Tech Lead role. As you gain more experience and expertise in Kafka, you may also explore roles such as Kafka Architect or Kafka Consultant.

Related Skills

In addition to Kafka expertise, employers often look for professionals with skills in the following areas (a short Kafka-to-Spark sketch follows this list):

  • Apache Spark
  • Apache Flink
  • Hadoop
  • Java/Scala programming
  • Data engineering and data architecture
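
The Kafka-plus-Spark pairing appears often enough in these roles that a concrete picture helps. Below is a minimal PySpark sketch, assuming a local broker at localhost:9092, an illustrative topic named page_views, and a connector package matching Spark 3.5 / Scala 2.12; all of these are placeholders to adapt, not details from any posting above. It reads the topic with Spark Structured Streaming and keeps a running count of events per key.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

# A minimal sketch: stream events from Kafka and count them per key.
spark = (
    SparkSession.builder
    .appName("kafka-spark-sketch")
    # The Kafka connector must match your Spark/Scala versions (assumed here).
    .config("spark.jars.packages",
            "org.apache.spark:spark-sql-kafka-0-10_2.12:3.5.0")
    .getOrCreate()
)

# Kafka records arrive with binary key/value columns.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")  # assumed broker
    .option("subscribe", "page_views")                    # assumed topic
    .load()
)

# Running count of events per key.
counts = events.select(col("key").cast("string")).groupBy("key").count()

# Print the running totals to the console; "complete" re-emits the full table.
query = counts.writeStream.outputMode("complete").format("console").start()
query.awaitTermination()
```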

Interview Questions

  • What is Apache Kafka and how does it differ from other messaging systems? (basic)
  • Explain the role of Zookeeper in Apache Kafka. (medium)
  • How does Kafka guarantee fault tolerance? (medium)
  • What are the key components of a Kafka cluster? (basic)
  • Describe the process of message publishing and consuming in Kafka. (medium)
  • How can you achieve exactly-once message processing in Kafka? (advanced)
  • What is the role of Kafka Connect in Kafka ecosystem? (medium)
  • Explain the concept of partitions in Kafka. (basic)
  • How does Kafka handle consumer offsets? (medium)
  • What is the role of a Kafka Producer API? (basic)
  • How does Kafka ensure high availability and durability of data? (medium)
  • Explain the concept of consumer groups in Kafka. (basic; see the sketch after this list)
  • How can you monitor Kafka performance and throughput? (medium)
  • What is the purpose of Kafka Streams API? (medium)
  • Describe the use cases where Kafka is not a suitable solution. (advanced)
  • How does Kafka handle data retention and cleanup policies? (medium)
  • Explain the Kafka message delivery semantics. (medium)
  • What are the different security features available in Kafka? (medium)
  • How can you optimize Kafka for high throughput and low latency? (advanced)
  • Describe the role of a Kafka Broker in a Kafka cluster. (basic)
  • How does Kafka handle data replication across brokers? (medium)
  • Explain the significance of serialization and deserialization in Kafka. (basic)
  • What are the common challenges faced while working with Kafka? (medium)
  • How can you scale Kafka to handle increased data loads? (advanced)
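
Several of the questions above (partitions, consumer groups, offsets, delivery semantics) become much easier to answer with a small working example in mind. Here is a minimal sketch using the kafka-python client; the broker address, the topic name orders, and the group id billing are illustrative assumptions, not details from any posting above.

```python
import json
from kafka import KafkaConsumer, KafkaProducer

# Producer: messages with the same key hash to the same partition,
# which preserves per-key ordering.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",  # assumed local broker
    key_serializer=lambda k: k.encode("utf-8"),
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("orders", key="customer-42", value={"order_id": 1, "amount": 99.0})
producer.flush()  # block until the broker acknowledges the send

# Consumer: all consumers sharing a group_id divide the topic's partitions
# among themselves; committed offsets record the group's progress.
consumer = KafkaConsumer(
    "orders",
    bootstrap_servers="localhost:9092",
    group_id="billing",            # consumer group (assumed name)
    auto_offset_reset="earliest",  # where to start if no offset is committed
    enable_auto_commit=False,      # commit manually, after processing
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for record in consumer:
    print(record.partition, record.offset, record.value)
    consumer.commit()  # committing after processing gives at-least-once delivery
```

Because the offset commit happens after processing, a crash in between replays the message on restart (at-least-once); committing before processing would give at-most-once, and exactly-once relies on the idempotent/transactional producer features exposed by clients such as confluent-kafka.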

Closing Remark

As you explore Kafka job opportunities in India, remember to showcase your expertise in Kafka and related skills during interviews. Prepare thoroughly, demonstrate your knowledge confidently, and stay updated with the latest trends in Kafka to excel in your career as a Kafka professional. Good luck with your job search!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
