
25 ClickHouse Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

10.0 - 17.0 years

40 - 45 Lacs

Pune, Gurugram, Bengaluru

Work from Office

Source: Naukri

Role & responsibilities: We are looking for a Python backend developer for a permanent remote position with an MNC.

Preferred candidate profile:
- Deep hands-on experience with Python, SQL, and NoSQL is required. Candidates should be able to dockerize the microservices they build, though not necessarily set up pods, services, or deployments themselves.
- Proven expertise in microservices architecture, containerization (Docker), and cloud-native application development (any cloud).
- Build and scale RESTful APIs, async jobs, background schedulers, and data pipelines for high-volume systems.
- Strong understanding of API design, rate limiting, secure auth (OAuth2), and best practices.
- Create and optimize NoSQL and SQL data models (MongoDB, DynamoDB, PostgreSQL, ClickHouse).

Soft skills: clear communication, an ownership mindset, and self-drive.
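The API-design bullet above calls out rate limiting specifically. As a generic illustration of that idea, not this employer's actual stack, here is a minimal FastAPI sketch with a hand-rolled in-memory token bucket; the endpoint, capacity, and refill rate are invented:

```python
import time
from fastapi import FastAPI, HTTPException, Request

app = FastAPI()

# One token bucket per client IP: capacity 10 requests, refilled at 5 tokens/sec.
BUCKET_CAPACITY = 10
REFILL_RATE = 5.0
_buckets: dict[str, tuple[float, float]] = {}  # ip -> (tokens, last_refill_ts)

def allow(ip: str) -> bool:
    tokens, last = _buckets.get(ip, (BUCKET_CAPACITY, time.monotonic()))
    now = time.monotonic()
    # Refill proportionally to elapsed time, capped at bucket capacity.
    tokens = min(BUCKET_CAPACITY, tokens + (now - last) * REFILL_RATE)
    if tokens < 1:
        _buckets[ip] = (tokens, now)
        return False
    _buckets[ip] = (tokens - 1, now)
    return True

@app.get("/orders")
async def list_orders(request: Request):
    if not allow(request.client.host):
        raise HTTPException(status_code=429, detail="rate limit exceeded")
    return {"orders": []}  # placeholder payload
```

In production one would typically back the buckets with Redis so limits hold across multiple API replicas.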

Posted 1 week ago

Apply

7.0 - 10.0 years

30 - 35 Lacs

Pune, Gurugram, Bengaluru

Work from Office

Source: Naukri

Role & responsibilities: We are looking for a Python backend developer for a permanent remote position with an MNC.

Preferred candidate profile:
- Deep hands-on experience with Python, SQL, and NoSQL is required. Candidates should be able to dockerize the microservices they build, though not necessarily set up pods, services, or deployments themselves.
- Proven expertise in microservices architecture, containerization (Docker), and cloud-native application development (any cloud).
- Build and scale RESTful APIs, async jobs, background schedulers, and data pipelines for high-volume systems.
- Strong understanding of API design, rate limiting, secure auth (OAuth2), and best practices.
- Create and optimize NoSQL and SQL data models (MongoDB, DynamoDB, PostgreSQL, ClickHouse).

Soft skills: clear communication, an ownership mindset, and self-drive.

Posted 1 week ago

Apply

4.0 - 9.0 years

15 - 25 Lacs

India

Work from Office

Source: Naukri

Role & responsibilities / Preferred candidate profile:
- Proficient in Python programming.
- Experience with Neo4j for graph database management and querying.
- Knowledge of cloud platforms including AWS, Azure, and GCP.
- Familiarity with PostgreSQL and ClickHouse for database management and optimization.
- Understanding of serverless architecture for building and deploying applications.
- Experience with Docker for containerization and deployment.
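Given the combination of Python and Neo4j querying above, here is a minimal sketch using the official neo4j Python driver (5.x API); the connection details and graph schema are invented for illustration:

```python
from neo4j import GraphDatabase  # pip install neo4j

# Hypothetical connection details; replace with your own.
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

def coworkers_of(tx, name: str):
    # Cypher query over an invented (:Person)-[:WORKS_AT]->(:Company) schema.
    result = tx.run(
        "MATCH (p:Person {name: $name})-[:WORKS_AT]->(c:Company)"
        "<-[:WORKS_AT]-(other:Person) RETURN other.name AS name",
        name=name,
    )
    return [record["name"] for record in result]

with driver.session() as session:
    names = session.execute_read(coworkers_of, "Alice")
    print(names)
driver.close()
```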

Posted 1 week ago

Apply

4.0 - 9.0 years

7 - 16 Lacs

Pune, Bengaluru, Greater Noida

Work from Office

Source: Naukri

About the Role: We are seeking a skilled and security-conscious Backend Engineer to join our growing engineering team. In this role, you will design, develop, and maintain secure backend systems and services. You'll work with modern technologies across cloud platforms, graph databases, and containerized environments to build scalable and resilient infrastructure.

Key Responsibilities:
- Design and implement backend services and APIs using Python.
- Manage and query graph data using Neo4j.
- Work across cloud platforms (AWS, Azure, GCP) to build and deploy secure, scalable applications.
- Optimize and maintain relational and analytical databases, including PostgreSQL and ClickHouse.
- Develop and deploy serverless applications and microservices.
- Containerize applications using Docker and manage deployment pipelines.
- Collaborate with security teams to integrate best practices and tools into the development lifecycle.

Mandatory Skills:
- Proficiency in Python programming.
- Hands-on experience with Neo4j for graph database management and Cypher querying.
- Working knowledge of AWS, Azure, and Google Cloud Platform (GCP).
- Experience with PostgreSQL and ClickHouse for database optimization and management.
- Understanding of serverless architecture and deployment strategies.
- Proficiency with Docker for containerization and deployment.

Nice to Have:
- Experience with AWS ECS and EKS for container orchestration.
- Familiarity with open-source vulnerability/secret scanning tools (e.g., Trivy, Gitleaks).
- Exposure to CI/CD pipelines and DevSecOps practices.

What We Offer: Competitive compensation and benefits. Flexible work environment. Opportunities to work on cutting-edge security and cloud technologies. A collaborative and inclusive team culture.

Posted 1 week ago

Apply

10.0 - 17.0 years

20 - 35 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Source: Naukri

Project description: Migration of user knowledge objects from Splunk to ClickHouse / Grafana / Bosun. The PRE Observability team is undertaking a large-scale observability transformation aimed at migrating from Splunk to a more cost-effective and scalable open-source observability stack based on ClickHouse and Grafana. As part of this effort, we are seeking POD-based teams to support the migration of user knowledge objects such as dashboards, reports, alerts, macros, and lookups.

Key Responsibilities:
- Lead the design, development, and deployment of AI-based automation tools for data artifact migration.
- Develop machine learning models to intelligently map, transform, and validate data across different big data platforms.
- Build robust data pipelines to handle high-volume, high-velocity data migration.
- Collaborate with data engineers and architects to integrate AI-driven solutions into existing data workflows.
- Implement NLP and pattern-recognition algorithms to automate schema conversion and data validation.
- Design custom algorithms for automated data quality checks and anomaly detection.
- Mentor junior engineers and contribute to technical leadership within the AI engineering team.
- Stay current with the latest advancements in AI, big data technologies, and automation frameworks.
- Create comprehensive technical documentation and best-practice guidelines for AI-based data migration.

Tech stack: Splunk, ClickHouse, Grafana, data migration automation.

About the client: Grid Dynamics (NASDAQ: GDYN) is a leading provider of technology consulting, platform and product engineering, and advanced analytics services. Fusing technical vision with business acumen, we enable positive business outcomes for enterprise companies undergoing business transformation by solving their most pressing technical challenges. A key differentiator for Grid Dynamics is our 7+ years of experience and leadership in enterprise AI, supported by profound expertise and ongoing investment in data, analytics, cloud & DevOps, application modernization, and customer experience. Founded in 2006, Grid Dynamics is headquartered in Silicon Valley with offices across the Americas, Europe, and India. Follow us on LinkedIn.
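The core technical task here is translating Splunk knowledge objects (searches, dashboards, alerts) into ClickHouse/Grafana equivalents. Purely as a toy illustration of that kind of mapping, and not Grid Dynamics' actual tooling, the sketch below converts one narrow SPL pattern into ClickHouse SQL; the table and field names are invented:

```python
import re

def spl_stats_to_sql(spl: str, table: str = "logs") -> str:
    """Translate a narrow SPL shape, e.g.
    'index=web status=500 | stats count by host',
    into an equivalent ClickHouse SELECT. Handles only this toy pattern."""
    search, stats = [part.strip() for part in spl.split("|", 1)]
    # key=value search terms become WHERE equality predicates; the index
    # term is dropped because the target table stands in for it.
    terms = [t for t in search.split() if "=" in t and not t.startswith("index=")]
    where = " AND ".join(f"{k} = '{v}'" for k, v in (t.split("=", 1) for t in terms))
    m = re.match(r"stats\s+count\s+by\s+(\w+)", stats)
    group_field = m.group(1)
    sql = f"SELECT {group_field}, count() AS count FROM {table}"
    if where:
        sql += f" WHERE {where}"
    return sql + f" GROUP BY {group_field} ORDER BY count DESC"

print(spl_stats_to_sql("index=web status=500 | stats count by host"))
# SELECT host, count() AS count FROM logs WHERE status = '500'
#   GROUP BY host ORDER BY count DESC
```

A production migration tool would parse the full SPL grammar and emit Grafana dashboard JSON alongside the SQL; this only shows the shape of the problem.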

Posted 2 weeks ago

Apply

8.0 - 13.0 years

10 - 15 Lacs

Bengaluru

Work from Office

Source: Naukri

In this role, you will play a key part in designing, building, and optimizing scalable data products within the Telecom Analytics domain. You will collaborate with cross-functional teams to implement AI-driven analytics, autonomous operations, and programmable data solutions. This position offers the opportunity to work with cutting-edge Big Data and Cloud technologies, deepen your data engineering expertise, and contribute to advancing Nokia's data-driven telecom strategies. If you are passionate about creating innovative data solutions, mastering cloud and big data platforms, and working in a fast-paced, collaborative environment, this role is for you!

You have:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field, with 8+ years of experience in data engineering focused on Big Data, Cloud, and Telecom Analytics.
- Hands-on expertise in Ab Initio for data cataloguing, metadata management, and lineage.
- Skills in data warehousing, OLAP, and modelling using BigQuery, ClickHouse, and SQL.
- Experience with data persistence technologies such as S3, HDFS, and Iceberg.
- Hands-on Python and scripting languages.

It would be nice if you also had:
- Experience with data exploration and visualization using Superset or BI tools.
- Knowledge of ETL processes and streaming tools such as Kafka.
- Background in building data products for the telecom domain and an understanding of AI and machine learning pipeline integration.

Responsibilities:
- Data Governance: Manage source data within the Metadata Hub and Data Catalog.
- ETL Development: Develop and execute data processing graphs using Express It and the Co-Operating System.
- ETL Optimization: Debug and optimize data processing graphs using the Graphical Development Environment (GDE).
- API Integration: Leverage Ab Initio APIs for metadata and graph artifact management.
- CI/CD Implementation: Implement and maintain CI/CD pipelines for metadata and graph deployments.
- Team Leadership & Mentorship: Mentor team members and foster best practices in Ab Initio development and deployment.

Posted 2 weeks ago

Apply

3.0 - 5.0 years

50 - 60 Lacs

Bengaluru

Work from Office

Source: Naukri

Staff Data Engineer. Experience: 3-5 years. Salary: INR 50-60 Lacs per annum. Preferred notice period: within 30 days. Shift: 4:00 PM to 1:00 AM IST. Opportunity type: Remote. Placement type: Permanent. (Note: this is a requirement for one of Uplers' clients.)

Must-have skills: ClickHouse, DuckDB, AWS, Python, SQL. Good-to-have skills: dbt, Iceberg, Kestra, Parquet, SQLGlot.

Rill Data (one of Uplers' clients) is looking for a Staff Data Engineer who is passionate about their work, eager to learn and grow, and committed to delivering exceptional results. If you are a team player with a positive attitude and a desire to make a difference, we want to hear from you.

Role Overview: Rill is the world's fastest BI tool, designed from the ground up for real-time databases like DuckDB and ClickHouse. Our platform combines last-mile ETL, an in-memory database, and interactive dashboards into a full-stack solution that's easy to deploy and manage. With a BI-as-code approach, Rill empowers developers to define and collaborate on metrics using SQL and YAML. Trusted by leading companies in e-commerce, digital marketing, and financial services, Rill provides the speed and scalability needed for operational analytics and partner-facing reporting.

Job Summary: Rill is looking for a Staff Data Engineer to join our Field Engineering team. In this role, you will work closely with enterprise customers to design and optimize high-performance data pipelines powered by DuckDB and ClickHouse. You will also collaborate with our platform engineering team to evolve our incremental ingestion architectures and support proof-of-concept sales engagements. The ideal candidate has strong SQL fluency, experience with orchestration frameworks (e.g., Kestra, dbt, SQLGlot), familiarity with data lake table formats (e.g., Iceberg, Parquet), and an understanding of cloud databases (e.g., Snowflake, BigQuery). Most importantly, you should have a passion for solving real-world data engineering challenges at scale.

Key Responsibilities:
- Collaborate with enterprise customers to optimize data models for performance and cost efficiency.
- Work with the platform engineering team to enhance and refine our incremental ingestion architectures.
- Partner with account executives and solution architects to rapidly prototype solutions for proof-of-concept sales engagements.

Qualifications (required):
- Fluency in SQL and competency in Python.
- Bachelor's degree in a STEM discipline or equivalent industry experience.
- 3+ years of experience in a data engineering or related role.
- Familiarity with major cloud environments (AWS, Google Cloud, Azure).

Benefits: Competitive salary, health insurance, flexible vacation policy.

How to apply (easy 3-step process): 1. Click Apply and register or log in on our portal. 2. Upload an updated resume and complete the screening form. 3. Increase your chances of being shortlisted and meet the client for the interview!

About Our Client: Rill is an operational BI tool that provides fast dashboards that your team will actually use. Data teams build fewer, more flexible dashboards for business users, while business users make faster decisions and perform root-cause analysis with fewer ad hoc requests. Rill's unique architecture combines a last-mile ETL service, an in-memory database, and operational dashboards in a single solution. Our customers are leading media and advertising platforms, including Comcast's FreeWheel, tvScientific, AT&T's DishTV, and more.
About Uplers: Our goal is to make hiring and getting hired reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant product and engineering job opportunities and progress in their career. (Note: There are many more opportunities apart from this on the portal.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
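The "BI-as-code" idea in the Rill listing above, defining metrics in SQL and YAML, can be illustrated generically. The sketch below is not Rill's actual project-file format; it parses a hypothetical YAML metric spec with PyYAML and renders a SQL query from it:

```python
import yaml  # pip install pyyaml

# Hypothetical metrics spec; Rill's real project files differ.
spec = yaml.safe_load("""
model: events
dimensions: [country]
measures:
  - name: total_revenue
    expression: sum(revenue)
  - name: orders
    expression: count()
""")

# Render the spec into a GROUP BY query a DuckDB/ClickHouse engine could run.
measures = ", ".join(f"{m['expression']} AS {m['name']}" for m in spec["measures"])
dims = ", ".join(spec["dimensions"])
query = f"SELECT {dims}, {measures} FROM {spec['model']} GROUP BY {dims}"
print(query)
# SELECT country, sum(revenue) AS total_revenue, count() AS orders
#   FROM events GROUP BY country
```

The appeal of this pattern is that metric definitions live in version control and code review, like any other source file.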

Posted 2 weeks ago

Apply

1.0 - 4.0 years

15 - 20 Lacs

Bengaluru

Work from Office

Source: Naukri

Job Area: Information Technology Group > IT Data Engineer

General Summary: We are looking for a savvy Data Engineer to join our analytics team. The candidate will be responsible for expanding and optimizing our data and data pipelines, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate has Python development experience and is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. We believe a candidate with a solid Software Engineering/Development background is a great fit, but we also recognize that each candidate has a unique blend of skills. The Data Engineer will work with database architects, data analysts, and data scientists on data initiatives and will ensure optimal data delivery is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams. The right candidate will be excited by the prospect of optimizing data to support our next generation of products and data initiatives.

Responsibilities:
- Create and maintain optimal data pipelines; assemble large, complex data sets that meet functional and non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing for greater scalability, etc.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
- Work with stakeholders, including the Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.
- Work with data and analytics experts to strive for greater functionality in our data systems.
- Perform ad hoc analysis and report QA testing.
- Follow Agile/SCRUM development methodologies within analytics projects.
- Build processes supporting data transformation, data structures, metadata, dependency, and workload management.
- Construct methods to test user acceptance and usage of data.

Desired skills and experience:
- Working SQL knowledge, experience with relational databases and query authoring, and working familiarity with a variety of databases.
- Experience building and optimizing big data pipelines and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills for working with unstructured datasets.
- A successful history of manipulating, processing, and extracting value from large, disconnected datasets.
- Working knowledge of Unix and shell scripting.
- Knowledge of predictive analytics tools and problem solving using statistical methods is a plus.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- Demonstrated understanding of the Software Development Life Cycle.
- Good communication skills, a team-player attitude, and a hunger to learn new approaches to problem solving.
- Ability to work independently and with a team in a diverse, fast-paced, collaborative environment; excellent written and verbal communication; a quick learner able to handle development tasks with minimal supervision; ability to multitask.

We are looking for a candidate with 7+ years of experience in a Data Engineering role, with experience in the following software/tools:
- Python, Java, etc.
- Google Cloud Platform.
- Big data frameworks and tools: Apache Hadoop, Beam, Spark, Kafka.
- Workflow management and scheduling using Airflow, Prefect, or Dagster.
- Databases such as BigQuery and ClickHouse.
- Container orchestration (Kubernetes).
- Optional: one or more BI tools (Tableau, Splunk, or equivalent).

Minimum Qualifications: 4+ years of IT-related work experience with a Bachelor's degree in Computer Engineering, Computer Science, Information Systems, or a related field, OR 6+ years of IT-related work experience without a Bachelor's degree. 2+ years of work experience with programming (e.g., Java, Python). 1+ year of work experience with SQL or NoSQL databases. 1+ year of work experience with data structures and algorithms. Bachelor's or Master's (or equivalent) degree in computer engineering or an equivalent stream.

Applicants: Qualcomm is an equal opportunity employer. If you are an individual with a disability and need an accommodation during the application/hiring process, rest assured that Qualcomm is committed to providing an accessible process. You may e-mail disability-accomodations@qualcomm.com or call Qualcomm's toll-free number found here. Upon request, Qualcomm will provide reasonable accommodations to support individuals with disabilities to be able to participate in the hiring process. Qualcomm is also committed to making our workplace accessible for individuals with disabilities. (Keep in mind that this email address is used to provide reasonable accommodations for individuals with disabilities. We will not respond here to requests for updates on applications or resume inquiries.) Qualcomm expects its employees to abide by all applicable policies and procedures, including but not limited to security and other requirements regarding protection of Company confidential information and other confidential and/or proprietary information, to the extent those requirements are permissible under applicable law. To all Staffing and Recruiting Agencies: please do not forward resumes to our jobs alias, Qualcomm employees, or any other company location. Qualcomm is not responsible for any fees related to unsolicited resumes/applications. If you would like more information about this role, please contact Qualcomm Careers.

Posted 3 weeks ago

Apply

3.0 - 5.0 years

3 - 7 Lacs

Mumbai, Maharashtra, India

On-site

Source: Foundit

Responsibilities:
- Design and implement data pipelines and frameworks that provide a better developer experience for our dev teams.
- Help other PODs in IDfy define their data landscape and onboard them onto our platform.
- Keep abreast of the latest trends and technologies in Data Engineering, GenAI, and Natural Language Query.
- Set up logging, monitoring, and alerting mechanisms for better visibility into data pipelines and platform health.
- Automate repetitive data tasks to improve efficiency and free up engineering bandwidth.
- Maintain technical documentation to ensure knowledge sharing and onboarding efficiency.
- Troubleshoot and resolve bottlenecks in data processing, ingestion, and transformation pipelines.

We are the perfect match if you:
- Have experience creating and managing large-scale data ingestion pipelines using the ELT (Extract, Load, Transform) model.
- Take ownership of defining data models, transformation logic, and data flow in your current role.
- Are proficient in Logstash, Apache Beam / Dataflow, Apache Airflow, ClickHouse, Grafana, InfluxDB/VictoriaMetrics, and BigQuery.
- Have a strong understanding of and hands-on experience with data warehouses, with at least 3 years of experience in any data warehousing stack.
- Have a keen eye for data and can derive meaningful insights from it.
- Understand product development methodologies; we follow Agile.
- Have experience with time series databases (we use InfluxDB and VictoriaMetrics) and alerting/anomaly detection frameworks (preferred but not mandatory).
- Are familiar with visualization tools such as Metabase, Power BI, or Tableau.
- Have experience developing software in the cloud (GCP/AWS preferred, but hands-on experience is not mandatory).
- Are passionate about exploring new technologies and enjoy sharing your knowledge through technical blogs.
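The large-scale ELT ingestion work described above usually hinges on incremental loads rather than full re-reads. A minimal watermark-based sketch, assuming a reachable ClickHouse server, the clickhouse-driver package, and an invented events table:

```python
from datetime import datetime
from clickhouse_driver import Client  # pip install clickhouse-driver

client = Client("localhost")  # assumes a reachable ClickHouse server

def load_increment(source_rows, last_watermark: datetime) -> datetime:
    """Insert only rows newer than the watermark and return the new watermark.

    `source_rows` is an iterable of (event_time, user_id, payload) tuples from
    any upstream system; the column names and `events` table are invented.
    """
    fresh = [row for row in source_rows if row[0] > last_watermark]
    if fresh:
        client.execute(
            "INSERT INTO events (event_time, user_id, payload) VALUES", fresh
        )
        last_watermark = max(row[0] for row in fresh)
    return last_watermark

# Each scheduled run persists the returned watermark (e.g. in a small state
# table) and passes it back next time, so only new rows are shipped.
```

In the stack named by this listing, an Airflow or Beam job would wrap this same idea in operators and handle retries and backfills.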

Posted 3 weeks ago

Apply

6 - 10 years

7 - 17 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Source: Naukri

Job Scope: Responsible for creating, monitoring, and maintaining various databases such as MySQL, MongoDB, PostgreSQL, etc.

Job Responsibilities:
- Ensure optimal health, integrity, performance, and security of all databases.
- Develop and maintain data categorization and security standards.
- Develop and maintain data movement, archiving, and purging scripts.
- Evaluate and recommend new database technologies and management tools; optimize existing and future technology investments to maximize returns.
- Provide day-to-day support to internal IT support groups, external partners, and customers as required.
- Manage outsourced database administration services to perform basic monitoring and administrative-level tasks as directed.
- Participate in change and problem management activities, root cause analysis, and development of knowledge articles to support the organization's program.
- Provide subject matter expertise to internal and external project teams, application developers, and others as needed.
- Support application testing and production operations.
- Document, monitor, test, and adjust backup and recovery procedures to ensure important data is available in a disaster.
- Serve as on-call database administrator on a rotating basis.
- Develop, implement, and maintain MySQL, PostgreSQL, and Mongo instances, including scripts for monitoring and maintenance of individual databases.
- Manage and monitor file systems.
- Team diligently with the infrastructure, network, database, application, and business intelligence teams to guarantee high data quality and availability.

Qualification and Experience: B.E./B.Tech/MCA with 6-10 years of experience managing enterprise databases.

Knowledge and Skills:
- MySQL, PostgreSQL, and knowledge of NoSQL stores such as Mongo and Redis; ClickHouse DB admin skills are an added advantage.
- Backup and recovery for MySQL, PostgreSQL, and other databases.
- User-level access management: risks and threats.
- Synchronous and asynchronous replication, converged systems, partitioning, and storage-as-a-service (cloud technologies).
- Linux operating systems, including shell scripting; Windows Server.
- Industry-leading database monitoring tools and platforms.
- Data integration techniques, platforms, and tools.
- Modern database backup technologies and strategies.

Why join us? Impactful Work: Play a pivotal role in safeguarding Tanla's assets, data, and reputation in the industry. Tremendous Growth Opportunities: Be part of a rapidly growing company in the telecom and CPaaS space, with opportunities for professional development. Innovative Environment: Work alongside a world-class team in a challenging and fun environment, where innovation is celebrated. Tanla is an equal opportunity employer. We champion diversity and are committed to creating an inclusive environment for all employees. https://www.tanla.com
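Among the duties listed above are data movement, archiving, and purging scripts. A minimal sketch of a nightly logical backup with a retention purge via mysqldump; the backup path, database name, and 14-day window are invented for illustration:

```python
import subprocess
import time
from pathlib import Path

BACKUP_DIR = Path("/var/backups/mysql")  # hypothetical location
RETENTION_DAYS = 14                      # arbitrary retention window

def nightly_backup(database: str = "appdb") -> Path:
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    out = BACKUP_DIR / f"{database}-{time.strftime('%Y%m%d')}.sql"
    # --single-transaction gives a consistent InnoDB snapshot without locking.
    with out.open("wb") as fh:
        subprocess.run(
            ["mysqldump", "--single-transaction", database],
            stdout=fh, check=True,
        )
    return out

def purge_old_backups() -> None:
    cutoff = time.time() - RETENTION_DAYS * 86400
    for dump in BACKUP_DIR.glob("*.sql"):
        if dump.stat().st_mtime < cutoff:
            dump.unlink()  # drop dumps older than the retention window

if __name__ == "__main__":
    nightly_backup()
    purge_old_backups()
```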

Posted 1 month ago

Apply

3 - 5 years

10 - 16 Lacs

Bengaluru

Work from Office

Source: Naukri

We are seeking a highly skilled DevOps Engineer with deep expertise in database management (MongoDB, Redis, ClickHouse), containerization (Kubernetes, Docker), and cloud security. You will be responsible for designing, implementing, and maintaining scalable infrastructure while ensuring security, reliability, and performance across our cloud environments.

Role & responsibilities:
- Database Management: Deploy, optimize, and maintain MongoDB, Redis, SQL, and ClickHouse databases for high availability, scalability, and performance. Monitor database health, optimize queries, and ensure data integrity and backup strategies. Implement replication, sharding, and clustering strategies for distributed database systems. Implement database security best practices, including authentication, encryption, and access controls. Automate database provisioning and configuration.
- Containerization & Orchestration: Deploy and manage Docker containers and Kubernetes clusters across multiple environments. Automate container orchestration, scaling, and load balancing. Implement Helm charts, operators, and custom Kubernetes configurations to enhance deployment efficiency.
- Security & Compliance: Enforce RBAC, IAM policies, and security best practices across infrastructure. Perform vulnerability assessments and manage patching strategies for databases and containers.
- Cloud & Infrastructure: Work with cloud providers such as AWS and GCP to optimize cloud-based workloads. Implement backup and disaster recovery strategies for critical data and infrastructure.
- Performance Optimization & Reliability: Enhance system performance by fine-tuning Kubernetes clusters, databases, and caching mechanisms. Implement disaster recovery, failover strategies, and high-availability architectures. Work on incident response, troubleshooting, and RCA (Root Cause Analysis) for production issues. Monitor and fine-tune NGINX performance to handle high-traffic workloads efficiently, working with NGINX Ingress controllers in Kubernetes environments.

Required Skills & Experience:
- 4+ years of experience in a DevOps, SRE, or Cloud Engineering role.
- Strong expertise in MongoDB, Redis, and ClickHouse, including replication, clustering, and optimization.
- Strong experience with Docker and Kubernetes, including Helm and operators.
- Proficiency in Linux administration, networking, and system performance tuning.
- Deep understanding of cloud security principles, including encryption, authentication, and compliance.
- Knowledge of GCP and its managed Kubernetes service (GKE).
- A security-first mindset, with experience in RBAC, IAM, and security hardening.

Preferred candidate profile: familiarity with service mesh architecture (Istio) and API gateways; knowledge of Kafka.
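For the ClickHouse replication and monitoring duties mentioned above, a common pattern is polling the system.replicas table for replication delay. A small sketch, assuming a reachable server and the clickhouse-driver package; the alert threshold is arbitrary:

```python
from clickhouse_driver import Client  # pip install clickhouse-driver

client = Client("localhost")  # assumes a reachable ClickHouse node
MAX_DELAY_SECONDS = 300       # alert threshold, chosen arbitrarily for the sketch

# system.replicas exposes per-table replication state on each node.
rows = client.execute(
    "SELECT database, table, absolute_delay, queue_size "
    "FROM system.replicas WHERE absolute_delay > %(max)s",
    {"max": MAX_DELAY_SECONDS},
)
for database, table, delay, queue in rows:
    # In production this would page an on-call channel instead of printing.
    print(f"LAGGING: {database}.{table} delay={delay}s queue={queue}")
```

A cron job or Prometheus exporter running this kind of query is usually the first line of defence for catching stuck replication queues.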

Posted 1 month ago

Apply

6 - 10 years

12 - 22 Lacs

Coimbatore

Work from Office

Source: Naukri

Looking for a Database Developer.

Posted 1 month ago

Apply

1 - 4 years

6 - 10 Lacs

Bengaluru

Work from Office

Source: Naukri

What You'll Own:
- Full Stack Systems: Architect and build end-to-end applications using Flask, FastAPI, Node.js, React (or Next.js), and Tailwind.
- AI Integrations: Build and optimize pipelines involving LLMs (OpenAI, Groq, LLaMA), Whisper, TTS, embeddings, RAG, LangChain, LangGraph, and vector DBs like Pinecone/Milvus.
- Cloud Infrastructure: Deploy, monitor, and scale systems on AWS/GCP using EC2, S3, IAM, Lambda, Kafka, and ClickHouse.
- Real-time Systems: Design asynchronous workflows (Kafka, Celery, WebSockets) for voice-based agents, event tracking, or search indexing.
- System Orchestration: Set up scalable infrastructure with autoscaling groups, Docker, and Kubernetes (PoC-ready, if not full production).
- Growth-Ready Features: Implement in-app nudges, tracking with Amplitude, A/B testing, and funnel optimization.

Tech Stack You'll Work With:
- Backend & Infrastructure: Python (Flask, FastAPI), Node.js; PostgreSQL, Redis, ClickHouse; Kafka, Docker, Kubernetes, GitHub Actions, Cloudflare; AWS (EC2, S3, RDS), GCP.
- Frontend: React / Next.js, TailwindCSS, Zustand, Shadcn/UI; WebGL and Three.js for 3D rendering.
- AI/ML & Computer Vision: LangChain, LangGraph, HuggingFace, OpenAI, Groq; Whisper (ASR), Eleven Labs (TTS); diffusion models, StyleGAN, Stable Diffusion, GANs; MediaPipe, ARKit/ARCore; computer vision for face tracking, real-time try-on, and pose estimation; virtual try-on with face/body detection and cloth/hairstyle try-ons.
- APIs: Stripe, VAPI, Algolia, OpenAI, Amplitude.
- Vector DB & Search: Pinecone, Milvus (Zilliz), custom vector search pipelines.
- Other: vibe-coding culture, prompt engineering, system-level optimization.

Must-Haves:
- 1+ years of experience building production-grade full-stack systems.
- Fluency in Python and JS/TS (Node.js, React), shipping independently without handholding.
- Deep understanding of LLM pipelines, embeddings, vector search, and retrieval-augmented generation (RAG).
- Experience with AR frameworks (ARKit, ARCore), 3D rendering (Three.js), and real-time computer vision (MediaPipe).
- Strong grasp of modern AI model architectures: diffusion models, GANs, AI agents.
- Hands-on experience with system debugging, performance profiling, and infrastructure cost optimization.
- Comfort with ambiguity: fast iteration, shipping prototypes, breaking things to learn faster.
- Bonus if you've built agentic apps, AI workflows, or virtual try-ons.
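The AI-integration work above centers on embeddings, vector search, and RAG. A stripped-down sketch of the retrieval step using the OpenAI embeddings API, with brute-force cosine similarity in NumPy standing in for a managed vector DB like Pinecone or Milvus; the documents and model choice are illustrative:

```python
import numpy as np
from openai import OpenAI  # pip install openai; needs OPENAI_API_KEY set

client = OpenAI()
DOCS = [
    "ClickHouse stores data in a columnar format for fast analytics.",
    "Kafka is a distributed log used for event streaming.",
    "Redis is an in-memory key-value store.",
]

def embed(texts: list[str]) -> np.ndarray:
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])

doc_vecs = embed(DOCS)

def retrieve(query: str, k: int = 2) -> list[str]:
    q = embed([query])[0]
    # Cosine similarity; a vector DB would use an ANN index for this instead.
    sims = doc_vecs @ q / (np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(q))
    return [DOCS[i] for i in np.argsort(sims)[::-1][:k]]

print(retrieve("Which database is good for analytics queries?"))
```

In a full RAG pipeline the retrieved passages would then be stuffed into the LLM prompt as grounding context.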

Posted 1 month ago

Apply

3 - 5 years

8 - 18 Lacs

Gurgaon

Work from Office

Source: Naukri

Title: Sr. Business Analyst. Location: Gurgaon, India. Type: Hybrid (work from office).

Who We Are: Fareportal is a travel technology company powering a next-generation travel concierge service. Utilizing its innovative technology and company-owned and -operated global contact centers, Fareportal has built strong industry partnerships providing customers access to over 600 airlines, a million lodgings, and hundreds of car rental companies around the globe. With a portfolio of consumer travel brands including CheapOair and OneTravel, Fareportal enables consumers to book online, on mobile apps for iOS and Android, by phone, or via live chat. Fareportal provides its airline partners with access to a broad customer base that books high-yielding international travel and add-on ancillaries. Fareportal is one of the leading sellers of airline tickets in the United States. We are a progressive company that leverages technology and expertise to deliver optimal solutions for our suppliers, customers, and partners.

Fareportal Highlights:
- Fareportal is the number 1 privately held online travel company in flight volume.
- Fareportal partners with over 600 airlines, 1 million lodgings, and hundreds of car rental companies worldwide.
- 2019 annual sales exceeded $5 billion.
- Fareportal sees over 150 million unique visitors annually to our desktop and mobile sites.
- Fareportal, with its global workforce of over 2,600 employees, is strategically positioned with 9 offices in 6 countries and headquartered in New York City.

Role Overview: The BI Engineer will support many areas of the business with analysis, visualization, and recommendations by leveraging our diverse data sources and applying them appropriately to the business's needs and goals.

Responsibilities:
- Create data-driven, high-impact insights independently.
- Ideate, develop, and deploy dashboards and visualizations for key business metrics.
- Perform advanced data profiling, modeling, and business logic analysis.
- Implement alerting tools and systems to quickly identify issues, notify stakeholders, and coordinate resolution.
- Collaborate with business units to perform requirements analysis, project scoping, data analysis, and business logic transformation.
- Support data warehousing and automation projects, including logic and validation, for use in BI analysis and insights.
- Provide guidance to reporting users to maximize understanding and use of reporting technologies.
- Efficiently manage the backlog and delivery of analytical projects.

Requirements:
- Bachelor's degree in a technical or analytical field, or another field with related work experience.
- 2-3 years of work experience in business intelligence or other data analysis roles.
- Strong experience querying relational databases such as Microsoft SQL Server, Oracle Database, MySQL, and ClickHouse.
- High proficiency with visualization tools such as Power BI.
- Proven track record of data-driven insights.
- Advanced Excel skills.
- Data modeling, validation, data storytelling, and statistical analysis.
- Critical thinking and problem solving.
- Preferred: experience in the travel or e-commerce industries.

Disclaimer: This job description is not designed to cover or contain a comprehensive listing of activities, duties, or responsibilities that are required of the employee. Fareportal reserves the right to change the job duties, responsibilities, expectations, or requirements posted here at any time at the Company's sole discretion, with or without notice.

Posted 1 month ago

Apply

5 - 10 years

10 - 20 Lacs

Chennai, Bengaluru, Hyderabad

Work from Office

Source: Naukri

What you'll be responsible for:
- Software design and coding, maintenance, and performance tuning.
- Understanding use cases and implementing them; developing new modules as well as supporting existing ones.
- Interpreting business plans for automation requirements.
- Ongoing support of existing Java projects and new development.
- Creating technical documentation and specifications.
- Planning, organizing, coordinating, and multitasking; excellent communication in English (written and verbal) and interpersonal skills.

What you'd have:
- 5-10 years of experience developing resilient and scalable distributed systems and microservices architectures.
- Strong technical background in Core Java, Servlets, XML, and RDBMS.
- Experience developing REST APIs using Spring Boot (or similar frameworks) and webhooks for async communication.
- Good understanding of async architecture using queues and message brokers like RabbitMQ, Kafka, etc.
- Deep insight into Java, garbage collection systems, and multi-threading.
- Experience with container platforms like Docker and Kubernetes; a good understanding of how Kubernetes works and experience with EKS, GKE, or AKS.
- Significant experience with various open-source tools and frameworks like Spring, Hibernate, Apache Camel, Guava Cache, etc.
- Along with RDBMS, exposure to various NoSQL databases like Mongo, Redis, ClickHouse, and Cassandra.
- Good analytical skills.

Why join us? Impactful Work: Play a pivotal role in safeguarding Tanla's assets, data, and reputation in the industry. Tremendous Growth Opportunities: Be part of a rapidly growing company in the telecom and CPaaS space, with opportunities for professional development. Innovative Environment: Work alongside a world-class team in a challenging and fun environment, where innovation is celebrated. Tanla is an equal opportunity employer. We champion diversity and are committed to creating an inclusive environment for all employees. www.tanla.com

Posted 2 months ago

Apply

3 - 8 years

15 - 25 Lacs

Noida

Remote

Source: Naukri

Customer Support Engineer (US Shift), Remote. Location: Remote. Shift: US (EST), 9 AM to 6 PM.

Role Overview: We’re looking for a Customer Support Engineer to be the technical backbone of our support team during US hours. You’ll handle deep troubleshooting, debugging, RCA investigations, and urgent issue resolution, from database inconsistencies to AWS infrastructure issues. Your work will reduce escalations, improve resolution speed, and keep our systems running smoothly.

Key Responsibilities:
- Technical Troubleshooting & RCA: Investigate complex customer issues, analyze AWS logs, and debug SQL databases (PostgreSQL, ClickHouse).
- Support Team Enablement: Act as the go-to technical expert for escalations and guide support agents.
- Bug Fixing & Deployments: Identify and patch minor bugs, deploy hotfixes, and document recurring issues.
- Database Management & Migrations: Execute safe data migrations, optimize SQL queries, and ensure data integrity.
- Incident Handling & Emergency Response: Own high-priority (P0/P1) incidents, deploy fixes, and write post-mortems.
- Proactive Monitoring: Set up alerts, monitor logs, and implement preventative solutions.

Who You Are:
- A Problem Solver: You thrive on debugging and finding root causes.
- An Analytical Thinker: You connect symptoms to real issues using data analysis.
- Experienced in SQL, AWS & Debugging: Strong skills in Node.js, PostgreSQL, and AWS (CloudWatch, Lambda, EC2, S3, RDS).
- Reliable Under Pressure: Quick thinking and decision-making in high-priority situations.
- A Strong Communicator: Able to explain technical findings to non-technical stakeholders.
- Independent & Self-Sufficient: Comfortable making decisions and taking ownership.

Ideal Qualifications:
- 3+ years in software engineering, DevOps, or technical support.
- Strong SQL and backend debugging experience.
- Hands-on experience with AWS services and database migrations.
- Prior experience in SaaS or customer support engineering is a plus.
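For the SQL-debugging and RCA work described above, a common first stop in PostgreSQL is the pg_stat_statements extension. A quick sketch, assuming the extension is enabled and psycopg2 is installed; the DSN is a placeholder, and the column is mean_exec_time on PostgreSQL 13+ (mean_time on older versions):

```python
import psycopg2  # pip install psycopg2-binary

conn = psycopg2.connect("dbname=app user=postgres host=localhost")  # placeholder DSN
with conn, conn.cursor() as cur:
    # Top 5 statements by average execution time.
    cur.execute(
        """
        SELECT query, calls, mean_exec_time
        FROM pg_stat_statements
        ORDER BY mean_exec_time DESC
        LIMIT 5
        """
    )
    for query, calls, mean_ms in cur.fetchall():
        print(f"{mean_ms:8.1f} ms avg over {calls} calls: {query[:80]}")
conn.close()
```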

Posted 2 months ago

Apply

1 - 3 years

2 - 4 Lacs

Chennai

Work from Office

Source: Naukri

Key Responsibilities:
- Design and implement scalable, secure, and high-performance data architectures.
- Define and enforce data modelling standards, best practices, and data governance policies.
- Develop data strategies that align with business objectives and future growth.
- Design, optimize, and maintain relational and NoSQL databases (e.g., PostgreSQL, MySQL, ClickHouse, MongoDB).
- Implement and manage data warehouses and data lakes for analytics and reporting (e.g., Snowflake, BigQuery, Redshift).
- Ensure efficient ETL/ELT processes for data integration and transformation.
- Define and enforce data security policies, access controls, and compliance with regulations (GDPR, HIPAA, etc.).
- Implement data lineage, data cataloging, and metadata management solutions.
- Work closely with data engineers, analysts, and business teams to understand data requirements.
- Provide technical guidance and mentorship to data teams.
- Collaborate with IT and DevOps teams to ensure seamless integration of data solutions.
- Optimize query performance, indexing strategies, and storage solutions.
- Evaluate and integrate emerging technologies such as AI/ML-driven data processing and real-time analytics.
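One concrete instance of the ETL/ELT and query-optimization responsibilities above is pre-aggregating a fact table with a ClickHouse materialized view. A sketch with invented table and column names, issued through clickhouse-driver:

```python
from clickhouse_driver import Client  # pip install clickhouse-driver

client = Client("localhost")  # assumes a reachable ClickHouse server

# Raw fact table (invented schema for illustration).
client.execute("""
    CREATE TABLE IF NOT EXISTS page_views (
        ts DateTime, user_id UInt64, url String
    ) ENGINE = MergeTree ORDER BY ts
""")

# Rollup maintained automatically on insert: daily view counts per URL.
client.execute("""
    CREATE MATERIALIZED VIEW IF NOT EXISTS page_views_daily
    ENGINE = SummingMergeTree ORDER BY (day, url) AS
    SELECT toDate(ts) AS day, url, count() AS views
    FROM page_views GROUP BY day, url
""")
```

Dashboards then query the small rollup instead of the raw table, which is the usual lever for keeping analytical query latency flat as data grows.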

Posted 2 months ago

Apply

10 - 15 years

37 - 40 Lacs

Hyderabad

Work from Office

Source: Naukri

About The Role: The purpose of this role is to design, test, and maintain software programs for operating systems or applications to be deployed at a client end, and to ensure they meet 100% quality assurance parameters.

Responsibilities:

1. Understand the requirements and design of the product/software:
- Develop software solutions by studying information needs, systems flow, data usage, and work processes.
- Investigate problem areas throughout the software development life cycle.
- Facilitate root cause analysis of system issues and problem statements.
- Identify ideas to improve system performance and availability.
- Analyze client requirements and convert them into feasible designs.
- Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements.
- Confer with project managers to obtain information on software capabilities.

2. Perform coding and ensure optimal software/module development:
- Determine operational feasibility by evaluating analysis, problem definition, requirements, and the proposed software.
- Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases and executing them.
- Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces.
- Analyze information to recommend and plan the installation of new systems or modifications to existing ones.
- Ensure that code is error-free, with no bugs or test failures.
- Prepare reports on programming project specifications, activities, and status.
- Ensure all issues are raised per the norms defined for the project/program/account, with clear descriptions and replication patterns.
- Compile timely, comprehensive, and accurate documentation and reports as requested.
- Coordinate with the team on daily project status and progress, and document it.
- Provide feedback on usability and serviceability, trace results to quality risks, and report them to the concerned stakeholders.

3. Status reporting and customer focus on an ongoing basis with respect to the project and its execution:
- Capture all requirements and clarifications from the client for better-quality work.
- Take feedback regularly to ensure smooth and on-time delivery.
- Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members.
- Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements.
- Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code.
- Document necessary details and reports formally for a proper understanding of the software, from client proposal to implementation.
- Ensure good quality of interaction with the customer regarding e-mail content, fault report tracking, voice calls, business etiquette, etc.
- Respond to customer requests on time, with no instances of complaints, either internally or externally.

Deliverables (performance parameter: measure):
1. Continuous integration, deployment, and monitoring of software: 100% error-free onboarding and implementation, throughput %, adherence to the schedule/release plan.
2. Quality & CSAT: on-time delivery, software management, query troubleshooting, customer experience, completion of assigned certifications for skill upgradation.
3. MIS & reporting: 100% on-time MIS and report generation.

Posted 2 months ago

Apply

1 - 5 years

8 - 18 Lacs

Navi Mumbai, Mumbai, Delhi

Work from Office

Source: Naukri

Below is the JD for the ClickHouse Database role:
- Help build production-grade systems based on ClickHouse: advise on schema design, cluster planning, etc. Environments range from single-node setups to clusters with hundreds of nodes, including Cloud and managed ClickHouse services.
- Work on infrastructure projects related to ClickHouse.
- Improve ClickHouse itself: fix bugs, improve docs, create test cases, etc.
- Study new usage patterns, ClickHouse functions, and integrations with other products.
- Work with the community: GitHub, Stack Overflow, Telegram.
- Install, configure, back up, recover, and maintain multi-node ClickHouse clusters.
- Monitor and optimize database performance, ensuring high availability and responsiveness.
- Troubleshoot database issues; identify and resolve performance bottlenecks.
- Design and implement database backup and recovery strategies.
- Develop and implement database security policies and procedures.
- Collaborate with development teams to optimize database schema design and queries.
- Provide technical guidance and support to development and operations teams.
- Experience with big data stack components like Hadoop, Spark, Kafka, and NiFi.
- Experience with data science/data analysis.
- Knowledge of SRE/DevOps stacks: monitoring and system management tools (Prometheus, Ansible, ELK).
- Version control using git.
- Handle support calls from customers using ClickHouse, including diagnosing problems connecting to ClickHouse, advising on application design, deploying/upgrading ClickHouse, and operations.
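For the cluster-design and schema-advice work described above, the canonical ClickHouse building block is a ReplicatedMergeTree table. A minimal DDL sketch; the ZooKeeper path, table, and columns are invented, and the {shard} and {replica} macros must be defined in each server's configuration:

```python
from clickhouse_driver import Client  # pip install clickhouse-driver

client = Client("localhost")  # assumes a reachable ClickHouse node

client.execute("""
    CREATE TABLE IF NOT EXISTS events_replicated (
        event_date Date,
        user_id UInt64,
        action String
    )
    ENGINE = ReplicatedMergeTree(
        '/clickhouse/tables/{shard}/events_replicated',  -- ZooKeeper path
        '{replica}'                                      -- replica name macro
    )
    PARTITION BY toYYYYMM(event_date)
    ORDER BY (event_date, user_id)
""")
```

The ORDER BY key doubles as the primary index, so choosing it to match the dominant query filters is the single biggest schema-design lever in ClickHouse.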

Posted 2 months ago

Apply

7 - 10 years

10 - 15 Lacs

Chennai

Work from Office

Source: Naukri

Job Scope: Responsible for creating, monitoring, and maintaining various databases such as MySQL, MongoDB, PostgreSQL, etc.

You'll be responsible for:
- Ensuring optimal health, integrity, performance, and security of all databases.
- Developing and maintaining data categorization and security standards.
- Developing and maintaining data movement, archiving, and purging scripts.
- Evaluating and recommending new database technologies and management tools; optimizing existing and future technology investments to maximize returns.
- Providing day-to-day support to internal IT support groups, external partners, and customers as required.
- Managing outsourced database administration services to perform basic monitoring and administrative-level tasks as directed.
- Participating in change and problem management activities, root cause analysis, and development of knowledge articles to support the organization's program.
- Providing subject matter expertise to internal and external project teams, application developers, and others as needed.
- Supporting application testing and production operations.
- Documenting, monitoring, testing, and adjusting backup and recovery procedures to ensure important data is available in a disaster.
- Serving as on-call database administrator on a rotating basis.
- Developing, implementing, and maintaining MySQL, PostgreSQL, and Mongo instances, including scripts for monitoring and maintenance of individual databases.
- Managing and monitoring file systems.
- Teaming diligently with the infrastructure, network, database, application, and business intelligence teams to guarantee high data quality and availability.

You'd have:
- B.E./B.Tech/MCA with 7-10 years of experience managing enterprise databases.
- MySQL, PostgreSQL, and knowledge of NoSQL stores such as Mongo and Redis; ClickHouse DB admin skills are an added advantage.
- Backup and recovery for MySQL, PostgreSQL, and other databases.
- User-level access management: risks and threats.
- Synchronous and asynchronous replication, converged systems, partitioning, and storage-as-a-service (cloud technologies).
- Linux operating systems, including shell scripting; Windows Server.
- Industry-leading database monitoring tools and platforms.
- Data integration techniques, platforms, and tools.
- Modern database backup technologies and strategies.

Why join us? Impactful Work: Play a pivotal role in safeguarding Tanla's assets, data, and reputation in the industry. Tremendous Growth Opportunities: Be part of a rapidly growing company in the telecom and CPaaS space, with opportunities for professional development. Innovative Environment: Work alongside a world-class team in a challenging and fun environment, where innovation is celebrated. Tanla is an equal opportunity employer. We champion diversity and are committed to creating an inclusive environment for all employees. www.tanla.com

Posted 3 months ago

Apply

5 - 10 years

15 - 24 Lacs

Bengaluru

Work from Office

Source: Naukri

Description:
Key skills (must have): Hadoop, Spark, ClickHouse, Kafka.

Job Responsibilities: Work hands-on across the Hadoop, Spark, ClickHouse, and Kafka stack.

What We Offer:
- Exciting Projects: We focus on industries like High-Tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.
- Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even abroad in one of our global centers or client facilities!
- Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
- Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toastmasters), stress management programs, professional certifications, and technical and soft-skill trainings.
- Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses.
- Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidized rates, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks, and a GL Club where you can enjoy coffee or tea with your colleagues over a game of table tennis, plus discounts at popular stores and restaurants!

Posted 3 months ago

Apply

4 - 7 years

6 - 9 Lacs

Chennai

Work from Office

Source: Naukri

Applied's AIx Products group is searching for front-end developers to join our team. AIx (Actionable Insight Accelerator) is an ML/AI data analytics platform that enables development and deployment of new chip technologies. AIx allows engineers to innovate and optimize semiconductor processes in real time, and to control thousands of variables to improve semiconductor performance, power, area-cost, and time to market (PPACt).

Essential Skillset:
- Experience working with columnar DBs, relational DBs, and document DBs.
- Experience with setup, scaling, and maintenance of ClickHouse and Mongo at a minimum.
- Experience with schema development and performance optimization.

Qualifications: Education: High School Diploma/GED.

Posted 3 months ago

Apply

3 - 6 years

7 - 12 Lacs

Chennai

Work from Office

Source: Naukri

Primary Responsibilities:
- Design and build data pipelines to process terabytes of data.
- Orchestrate data tasks in Airflow to run on Kubernetes/Hadoop for the ingestion, processing, and cleaning of data.
- Create Docker images for various applications and deploy them on Kubernetes.
- Design and build best-in-class processes to clean and standardize data.
- Troubleshoot production issues in our Elastic environment.
- Tune and optimize data processes.
- Advance the team's DataOps culture (CI/CD, orchestration, testing, monitoring) and build out standard development patterns.
- Drive innovation by testing new technology and approaches to continually advance the capability of the data engineering function.
- Drive efficiencies in current engineering processes via standardization and migration of existing on-premises processes to the cloud.
- Ensure data quality by building best-in-class data quality monitoring that guarantees all data products exceed customer expectations.
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
- Bachelor's degree in Computer Science or similar.
- Hands-on experience with the following technologies: developing processes in Spark; writing complex SQL queries; building ETL/data pipelines; related and complementary open-source software platforms and languages (e.g., Scala, Python, Java, Linux).
- Experience building cloud-native data pipelines on AWS, Azure, or GCP, following best practices in cloud deployments.
- Solid DataOps experience (CI/CD, orchestration, testing, monitoring).
- Good experience handling real-time, near-real-time, and batch data ingestion.
- Good understanding of data modelling techniques, i.e., Data Vault and Kimball star schema.
- Proven excellent understanding of column-store RDBMS (Databricks, Snowflake, Redshift, Vertica, ClickHouse).
- Proven track record of designing effective data strategies and leveraging modern data architectures that resulted in business value.
- Demonstrated effective interpersonal, influence, collaboration, and listening skills.
- Demonstrated solid stakeholder management skills.
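The Airflow orchestration described above typically starts from a DAG like the sketch below (Airflow 2.x assumed). The task bodies and schedule are placeholders; a real deployment would swap the Python callables for Kubernetes or Spark operators:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest():
    print("pulling raw files")      # placeholder for the real ingestion step

def clean():
    print("standardizing records")  # placeholder for the cleaning step

with DAG(
    dag_id="daily_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+ spelling; older versions use schedule_interval
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    clean_task = PythonOperator(task_id="clean", python_callable=clean)
    ingest_task >> clean_task  # clean runs only after ingest succeeds
```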

Posted 3 months ago

Apply

12 - 18 years

35 - 55 Lacs

Bengaluru, Kochi

Work from Office

Source: Naukri

As a Delivery Consultant, you will work closely with IBM clients and partners to design, deliver, and optimize IBM Technology solutions that align with your clients' goals. In this role, you will apply your technical expertise to ensure world-class delivery while leveraging your consultative skills, such as problem-solving, issue-/hypothesis-based methodologies, communication, and service orientation. As a member of IBM Technology Expert Labs, a team that is client-focused, courageous, pragmatic, and technical, you'll collaborate with clients to optimize and trailblaze new solutions that address real business challenges. If you are passionate about success, both in your career and in solving clients' business challenges, this role is for you. To help achieve this win-win outcome, a 'day in the life' of this opportunity may include, but is not limited to:
- Solving Client Challenges Effectively: Understanding clients' main challenges and developing solutions that help them reach true business value by working through the phases of design, development, integration, implementation, migration, and product support with a sense of urgency.
- Agile Planning and Execution: Creating and executing agile plans where you are responsible for installing and provisioning assets, testing, migrating to production, and day-two operations.
- Technical Solution Workshops: Conducting and participating in technical solution workshops.
- Building Effective Relationships: Developing successful relationships at all levels, from engineers to CxOs, with experience navigating challenging debate to reach healthy resolutions.
- Self-Motivated Problem Solver: Demonstrating a natural bias towards self-motivation, curiosity, and initiative, in addition to navigating data and people to find answers and present solutions.
- Collaboration and Communication: Strong collaboration and communication skills as you work across the client, partner, and IBM teams.

Required education: Bachelor's degree. Preferred education: Bachelor's degree.

Required technical and professional expertise:
- 12-18 years of relevant experience with any of the following products: AIOps, Netcool OMNIbus, Netcool Impact, IBM Turbonomic, IBM Instana, and observability tools.
- Relevant experience with any of the following products: HP OpenView, TrueSight, BMC Patrol, Moogsoft, Splunk Event Management, HP OSEM, CA Tech, BMC Event Manager, ScienceLogic, BigPanda, ServiceNow, PagerDuty, ManageEngine, Dynatrace, OpsRamp, AppDynamics, VMware.
- Knowledge of Cloud Pak for Watson AIOps.
- Extensive knowledge of deploying, maintaining, and automating a wide range of operational tasks for Instana observability, and/or equal experience with other application performance monitoring (APM) tools.
- Knowledge of Turbonomic Application Resource Management.
- Extensive experience in the following: virtualization (VMware/Hyper-V); at least one of AWS, Google Cloud Platform (GCP), or IBM Cloud; containers (Kubernetes/OpenShift/Docker); infrastructure management (server/storage/network); system administration/engineering (Ubuntu and RedHat); infrastructure-as-code and configuration management tools (e.g., Terraform, Chef, Ansible); and at least one of these datastores: Kafka, Cassandra, Elasticsearch, ClickHouse, CockroachDB.

Preferred technical and professional experience:
- Networking (HTTP, Cloudflare, TLS, Akamai, DNS) to troubleshoot network and load balancer issues.
- Source control (Git, GitHub) and CI/CD pipelines (Jenkins).
- Software development experience (Golang and Java preferred).
- Expertise in designing, building, deploying, and running large-scale production environments, and ensuring reliability through analysis and troubleshooting.
- Ability to debug, optimize code, and automate routine tasks; a systematic problem-solving approach is key.
- Effective communication skills and a sense of ownership and drive.
- Familiarity with languages such as Java, C, C++, .NET, Python, Shell, Perl, JavaScript, Go, and Ruby is an added advantage.
- Familiarity with Java virtual machines and web technologies, and with debugging Java exceptions in logs, HTTP, HTML, DNS, TCP, etc.
- Experience developing monitoring for production components and instrumenting code for observability using Instana or LogDNA.
- Cloud Pak for Watson AIOps and Turbonomic certifications.

Posted 3 months ago

Apply

3 - 6 years

15 - 20 Lacs

Delhi NCR, Bengaluru, Kolkata

Hybrid

Source: Naukri

Develop backend REST APIs for new and ongoing betting projects, contribute to architectural design, and optimize system performance. Propose and implement architectural solutions, understand business processes, and maintain technical documentation.

Required candidate profile: 3+ years with Python, Django, Celery, PostgreSQL, and Redis/ClickHouse. Hands-on experience with microservice architecture and service-to-service interaction, OOP, SOLID principles, and distributed message brokers (Kafka/RabbitMQ).
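The Celery requirement above implies background jobs running alongside the Django REST APIs. A minimal, self-contained Celery sketch; the broker URL and settlement logic are placeholders:

```python
from celery import Celery  # pip install celery

# Hypothetical Redis broker; in this listing's stack it could equally be
# RabbitMQ, since Celery speaks both Redis and AMQP.
app = Celery("betting", broker="redis://localhost:6379/0")

@app.task(bind=True, max_retries=3)
def settle_bet(self, bet_id: int) -> None:
    """Placeholder settlement job, retried with exponential backoff on failure."""
    try:
        print(f"settling bet {bet_id}")  # real logic would hit the DB here
    except Exception as exc:
        raise self.retry(exc=exc, countdown=2 ** self.request.retries)

# Enqueue from a Django view or another service:
# settle_bet.delay(42)
```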

Posted 3 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies