4.0 - 9.0 years
4 - 20 Lacs
Hyderābād
On-site
Job Description: We are currently looking to onboard a Platform Engineer to support one of our critical engagements with a global investment banking client. This is a permanent position based out of Hyderabad, working closely with teams in South Africa and the Czech Republic. Please find below the detailed JD and expectations for the role.

Role Overview: You will be part of a cross-functional team supporting infrastructure, automation, and deployment pipelines for enterprise-scale financial applications. This role focuses on platform stability, infrastructure readiness, deployment scripting, environment setup, and CI/CD automation.

Key Responsibilities:
Maintain and enhance platform infrastructure across Linux and Windows environments
Develop scripts to automate system monitoring, deployment, and recovery processes
Troubleshoot and resolve environment-level issues impacting application performance
Build and manage CI/CD pipelines using tools like Jenkins, Azure DevOps, or GitHub Actions
Collaborate with development, support, and cloud teams to ensure high platform availability
Support and automate tasks like patching, environment readiness, and DR test setups
Work with DBAs, application teams, and product vendors to resolve infra-related performance bottlenecks
Document processes, create knowledge articles, and ensure knowledge continuity

Mandatory Skills:
4-9 years of experience in infrastructure/platform engineering
Strong hands-on skills in Windows, Linux, Bash scripting, and PowerShell
Experience with CI/CD pipelines and deployment automation
Proficiency in tools such as Ansible, Jenkins, Azure DevOps, Git
Experience with log aggregation and monitoring (e.g., ELK, Grafana, Prometheus)
Comfortable supporting enterprise-grade applications in a financial services environment

Preferred Skills:
Exposure to cloud platforms like AWS (especially EC2, S3, IAM, CloudWatch)
Familiarity with application support tools and release pipelines
SQL knowledge and ability to work with DB teams for performance tuning
Prior experience working with geographically distributed teams

Position: Platform Engineer
Location: Hyderabad (Onsite)
Engagement Type: Full-Time, Permanent
Job Types: Full-time, Permanent
Pay: ₹400,000.00 - ₹2,000,000.00 per year
Benefits: Health insurance, Provident Fund
Work Location: In person
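For illustration, the monitoring-automation duties described above could look something like the minimal health-check sketch below (shown in Python for brevity; the role itself names Bash and PowerShell). The endpoint names and URLs are placeholders, not details from the posting.

```python
# Minimal endpoint health-check sketch; service names and URLs are placeholders.
import logging
import urllib.request

ENDPOINTS = {
    "app-frontend": "https://app.example.internal/health",
    "app-api": "https://api.example.internal/health",
}

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")

def check(name: str, url: str, timeout: float = 5.0) -> bool:
    """Return True if the endpoint answers HTTP 200 within the timeout."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            if resp.status != 200:
                logging.warning("%s returned HTTP %s", name, resp.status)
                return False
            return True
    except Exception as exc:  # DNS failures, timeouts, connection resets, ...
        logging.error("%s unreachable: %s", name, exc)
        return False

if __name__ == "__main__":
    results = {name: check(name, url) for name, url in ENDPOINTS.items()}
    logging.info("healthy services: %d/%d", sum(results.values()), len(results))
```

A script like this would typically be scheduled (cron, Jenkins job) and wired into the alerting stack rather than run by hand.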
Posted 6 days ago
5.0 - 8.0 years
0 Lacs
Gurgaon
On-site
202505104 Gurugram, Haryana, India

Description
Job Responsibilities:
Design, develop, and optimize MongoDB data models for various business and analytics use cases.
Implement and maintain efficient MongoDB CRUD operations, indexes, and schema evolution strategies.
Experience with self-hosted MongoDB deployments, including installation, configuration, scaling, backup/restore, and monitoring.
Build and maintain reporting and analytics pipelines using the MongoDB Reporting suite.
Develop, monitor, and tune MongoDB (both self-hosted and cloud-managed) deployments for scalability, reliability, and security.
Collaborate with engineering and product teams to translate requirements into MongoDB-backed solutions.
Support integration with Azure cloud services (e.g., Azure Cosmos DB for MongoDB, Azure Functions, Blob Storage).
Maintain documentation and contribute to database standards and best practices.
(Nice to have) Support data ingestion and automation tasks using Python.

Qualifications:
Bachelor's or Master's in Computer Science, Engineering, or a related quantitative discipline.

Experience:
5 to 8 years of hands-on experience in data engineering or backend development with MongoDB.
Demonstrated experience with self-hosted MongoDB, including cluster setup, maintenance, and troubleshooting.

Technical Competencies:
Deep hands-on experience with MongoDB data modelling, schema design, and normalization/denormalization strategies.
Strong proficiency in MongoDB development: aggregation pipelines, CRUD, performance tuning, and index management.
Experience in building reporting and analytics using the MongoDB Reporting suite.
Experience with self-hosted MongoDB deployments (e.g., sharding, replication, monitoring, security configuration).
Working knowledge of Azure cloud services (Azure Cosmos DB, VMs, App Service, networking for secure deployments).
(Nice to have) Experience in Python for backend integration, data processing, or scripting
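Since the role centres on aggregation pipelines, indexing, and CRUD, here is a minimal illustrative sketch in Python (PyMongo). The "orders" collection, its fields, and the connection string are assumptions made for illustration, not details from the posting.

```python
# Hypothetical "orders" collection on a self-hosted MongoDB instance.
from datetime import datetime
from pymongo import MongoClient, ASCENDING, DESCENDING

client = MongoClient("mongodb://localhost:27017")
db = client["reporting"]

# Compound index to support the grouped report below.
db.orders.create_index([("status", ASCENDING), ("created_at", DESCENDING)])

# Aggregation pipeline: monthly revenue and order count per status for 2025.
pipeline = [
    {"$match": {"created_at": {"$gte": datetime(2025, 1, 1)}}},
    {"$group": {
        "_id": {"status": "$status", "month": {"$month": "$created_at"}},
        "revenue": {"$sum": "$amount"},
        "orders": {"$sum": 1},
    }},
    {"$sort": {"_id.month": 1}},
]
for row in db.orders.aggregate(pipeline):
    print(row["_id"], row["revenue"], row["orders"])
```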
Posted 6 days ago
6.0 years
3 - 6 Lacs
Chennai
On-site
6+ years of IT experience and 4+ years of experience in Neo4j.
Design and implement efficient graph models using Neo4j to represent complex relationships.
Write optimized Cypher queries for data retrieval, manipulation, and aggregation.
Develop and maintain ETL pipelines to integrate data from various sources into the graph database.
Integrate Neo4j databases with existing systems using APIs and other middleware technologies.

About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
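As a rough illustration of the Cypher work described, here is a minimal sketch using the official Neo4j Python driver. The (:Person)-[:WORKS_WITH]-(:Person) graph model, connection details, and credentials are invented for the example.

```python
# Minimal Cypher query via the Neo4j Python driver; graph model is hypothetical.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

CYPHER = """
MATCH (p:Person {name: $name})-[:WORKS_WITH*1..2]-(colleague:Person)
RETURN colleague.name AS name, count(*) AS paths
ORDER BY paths DESC
LIMIT 10
"""

def colleagues_of(name: str):
    # Parameters are passed separately, never string-formatted into the query.
    with driver.session() as session:
        result = session.run(CYPHER, name=name)
        return [(record["name"], record["paths"]) for record in result]

if __name__ == "__main__":
    print(colleagues_of("Alice"))
    driver.close()
```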
Posted 6 days ago
0 years
2 - 3 Lacs
Chennai
On-site
Responsible for designing, developing, and optimizing data processing solutions using a combination of Big Data technologies. Focus on building scalable and efficient data pipelines for handling large datasets and enabling batch and real-time data streaming and processing.

Responsibilities:
> Develop Spark applications using Scala or Python (PySpark) for data transformation, aggregation, and analysis.
> Develop and maintain Kafka-based data pipelines: this includes designing Kafka Streams, setting up Kafka clusters, and ensuring efficient data flow.
> Create and optimize Spark applications using Scala and PySpark: leverage these languages to process large datasets and implement data transformations and aggregations.
> Integrate Kafka with Spark for real-time processing: build systems that ingest real-time data from Kafka and process it using Spark Streaming or Structured Streaming.
> Collaborate with data teams, including data engineers, data scientists, and DevOps, to design and implement data solutions.
> Tune and optimize Spark and Kafka clusters: ensure high performance, scalability, and efficiency of data processing workflows.
> Write clean, functional, and optimized code, adhering to coding standards and best practices.
> Troubleshoot and resolve issues: identify and address any problems related to Kafka and Spark applications.
> Maintain documentation: create and maintain documentation for Kafka configurations, Spark jobs, and other processes.
> Stay updated on technology trends: continuously learn and apply new advancements in functional programming, big data, and related technologies.

Proficiency in:
Hadoop ecosystem big data tech stack (HDFS, YARN, MapReduce, Hive, Impala).
Spark (Scala, Python) for data processing and analysis.
Kafka for real-time data ingestion and processing.
ETL processes and data ingestion tools.
Deep hands-on expertise in PySpark, Scala, and Kafka.

Programming Languages: Scala, Python, or Java for developing Spark applications. SQL for data querying and analysis.

Other Skills: Data warehousing concepts. Linux/Unix operating systems. Problem-solving and analytical skills. Version control systems.

Job Family Group: Technology
Job Family: Applications Development
Time Type: Full time
Most Relevant Skills: Please see the requirements listed above.
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
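To make the Kafka-plus-Spark integration concrete, here is a minimal Structured Streaming sketch in PySpark. The topic name, broker address, and JSON schema are assumptions; the job also requires the spark-sql-kafka connector on the Spark classpath.

```python
# Sketch: consume JSON events from Kafka and run a windowed aggregation.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType, DoubleType

spark = SparkSession.builder.appName("kafka-structured-streaming").getOrCreate()

schema = StructType([
    StructField("event_time", TimestampType()),
    StructField("event_type", StringType()),
    StructField("amount", DoubleType()),
])

raw = (spark.readStream.format("kafka")
       .option("kafka.bootstrap.servers", "localhost:9092")   # placeholder broker
       .option("subscribe", "transactions")                    # placeholder topic
       .load())

events = (raw.select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
             .select("e.*"))

# 5-minute tumbling windows, tolerating 10 minutes of late data via a watermark.
agg = (events.withWatermark("event_time", "10 minutes")
             .groupBy(F.window("event_time", "5 minutes"), "event_type")
             .agg(F.count("*").alias("events"), F.sum("amount").alias("total_amount")))

query = agg.writeStream.outputMode("update").format("console").start()
query.awaitTermination()
```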
Posted 6 days ago
3.0 years
6 Lacs
Calcutta
On-site
MERN Stack Developer (Full Stack - Single Hand Project Delivery Expert) - Onsite Kolkata

Location: Onsite (Kolkata)
Salary: ₹6 Lakhs/Annum
Working Hours: 10:00 AM to 7:00 PM IST
Work Days: Monday to Friday, with work required on the 2nd and 4th Saturday of the month.
Verification: Background + Employment History Checks

About the Role: We are actively seeking a highly autonomous and experienced Full Stack MERN Developer who can excel in single-hand project delivery to join our growing team in Kolkata. This is a critical role for individuals who thrive on end-to-end ownership, from initial concept and design to development, testing, and deployment. If you possess the expertise to independently drive projects, solve complex challenges across the entire stack, and deliver high-quality, impactful solutions, we want to hear from you. You will be instrumental in building and enhancing our core web applications and mobile experiences, requiring a blend of technical prowess, strategic thinking, and exceptional execution capabilities within a collaborative onsite environment.

What You'll Do:
Lead the end-to-end development lifecycle of assigned projects, from conceptualization and architectural design to deployment and maintenance.
Design, develop, and maintain responsive and performant web applications using the MERN stack (MongoDB, Express.js, React.js, Node.js).
Develop and maintain mobile applications using React Native.
Develop robust, scalable, and secure RESTful APIs using Node.js and Express.js, ensuring optimal backend functionality.
Implement intuitive and interactive user interfaces using React.js and React Native, focusing on exceptional user experience and responsiveness.
Architect and optimize database schemas and queries for MongoDB, ensuring efficient data management and retrieval.
Independently integrate front-end components (web and mobile) with back-end services, ensuring seamless data flow and complete application functionality.
Write clean, well-documented, and highly maintainable code following industry best practices and coding standards.
Rigorously test, debug, and resolve issues across the entire stack (web and mobile), ensuring high availability, performance, and reliability of applications.
Take full ownership of project timelines, quality assurance, and successful delivery.
Collaborate effectively with stakeholders to gather requirements, provide technical insights, and communicate project progress and challenges.
Stay updated with the latest trends and best practices in MERN stack, React Native, and broader web/mobile technologies, proactively suggesting improvements.

Who You Are:
3+ years of proven professional experience as a Full Stack Developer with the MERN stack (MongoDB, Express.js, React.js, Node.js), demonstrating capability in delivering projects independently.
Strong proficiency and hands-on experience with React Native for mobile application development.
Exceptional proficiency in JavaScript (ES6+), including asynchronous programming, modern JavaScript features, and strong problem-solving skills.
Extensive experience with React.js and its core principles, including advanced state management (e.g., Redux, Context API), hooks, and routing.
Solid expertise in Node.js and Express.js for building high-performance, scalable backend services and APIs.
Proficient in working with MongoDB, including advanced schema design, indexing, aggregation pipelines, and performance optimization.
Demonstrated ability to architect, design, and implement complex features and entire applications (web and mobile) from scratch. Strong command of version control systems, particularly Git, and best practices for managing solo projects or leading contributions. Deep understanding of web fundamentals (HTML5, CSS3, JavaScript, RESTful APIs, security best practices). Experience with various testing frameworks (e.g., Jest, React Testing Library, Mocha, Chai) and a commitment to writing comprehensive tests. Proven track record of taking ownership of projects, managing priorities, and ensuring successful delivery with minimal supervision. Excellent analytical and problem-solving skills, with the ability to identify and resolve issues across the full stack independently. Outstanding written and verbal communication skills, able to clearly articulate technical concepts, progress, and potential blockers. Highly self-motivated, proactive, and capable of driving initiatives with a strong sense of urgency and accountability. Must be able to work onsite from our Kolkata office. Interview Process: Round 1: Application Review + Comprehensive Technical Assessment: We review your resume and provide a challenging take-home coding assignment that requires full-stack (web and potentially mobile aspects) implementation and demonstrates independent problem-solving and clean code. Round 2: Technical Deep-Dive & Project Ownership Interview (Live): An in-depth discussion about your past projects where you had significant individual ownership. We'll explore architectural decisions, challenges faced, how you managed the project end-to-end, and a live coding session focused on complex MERN/React Native scenarios. Round 3: System Design & Autonomy Assessment: Focus on your ability to design scalable and maintainable MERN/React Native architectures from a high level, troubleshoot complex system-wide issues, and discuss how you plan and execute projects when given significant autonomy. We'll also review your GitHub profile or personal projects that showcase independent work. Round 4: Final Round with Leadership: A conversation to assess cultural fit, discuss your approach to self-management, continuous learning, and how you thrive in a highly autonomous role. This will also include the offer discussion. Job Type: Full-time Pay: ₹600,000.00 per year Schedule: Monday to Friday
Posted 6 days ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Company Description myfollo.com is a technology brand of Valion E Assets Pvt Ltd, India's premier "Private Real Estate Family Office". The platform employs an "Aggregate, Control & Transact" model, engaging landlords, property owners, and developers online to aggregate inventory for optimal distribution and transactions. myfollo.com aims to lead the online real estate transaction space with an O2O (Online to Offline) activation model through its property management brand, PropCare, and its free aggregation master platform, myfollo.com. Valion P.R.E.F.O. has over a decade of experience in Real Estate Advisory, serving more than 350 companies and 1700+ families in India and Singapore, with investments and assets totaling over INR 9600 crores. Role Description This is a full-time on-site role for an Investor Relationship Manager located in Gurugram. The role involves managing relationships with investors, providing them with up-to-date information on their real estate portfolio, and assisting them in real estate transactions. Daily tasks include client communications, market research, preparing and presenting investment proposals, coordinating with internal teams, and ensuring a seamless transaction process for investors. Qualifications Excellent verbal and written communication skills Strong customer relationship management abilities and experience Proficiency in market research, analysis, and understanding of real estate trends Ability to prepare and present investment proposals effectively Strong organizational skills and the ability to manage multiple tasks simultaneously Experience in real estate, finance, or a related field is preferred Bachelor's degree in Business, Finance, Real Estate, or a related discipline
Posted 1 week ago
0.0 - 3.0 years
0 - 0 Lacs
Panchavati, Nashik, Maharashtra
On-site
We’re Hiring: MERN Stack Developer (1.5–3 Years Experience)
Location: Nashik, Maharashtra (On-site)
Job Type: Full-time

About Us
At AnantKamalStudios, we craft high-impact digital experiences that solve real-world problems. Join our team of creative technologists and help us build scalable web applications using the MERN Stack.

Role Overview
We are looking for a MERN Stack Developer with a passion for building modern, user-friendly web applications. You will be responsible for developing and maintaining full-stack applications using MongoDB, Express.js, React.js, and Node.js.

Responsibilities
Develop and maintain scalable web applications using the MERN stack
Collaborate with designers, developers, and product managers to deliver high-quality features
Build and consume RESTful APIs
Optimize applications for speed and scalability
Write clean, modular, and reusable code
Participate in code reviews and team discussions

Requirements
1.5–3 years of experience with MERN stack development
Strong proficiency in JavaScript (ES6+), Node.js, and Express.js
Solid experience with React.js (hooks, Context API, component architecture)
Good understanding of MongoDB (Mongoose, aggregation, schema design)
Familiarity with Git, REST APIs, and deployment tools (Heroku, Vercel, or AWS)
Ability to write clean and testable code

Bonus Points
Experience with Next.js or TypeScript
Understanding of CI/CD pipelines
Knowledge of Socket.IO or real-time data apps
Familiarity with UI libraries like Material-UI, Tailwind CSS

What We Offer
Competitive salary (based on experience)
Opportunity to work on live client projects
Creative, growth-oriented environment
Performance-based incentives

Job Type: Full-time
Pay: ₹13,042.99 - ₹68,822.98 per month
Work Location: In person
Posted 1 week ago
2.0 years
0 Lacs
New Delhi, Delhi, India
On-site
We're Hiring: Human Resources Manager 📍 Location : On-Site – Delhi NCR 🚛 Company : Tender Truck Tender Truck is India’s leading truck aggregation platform connecting fleet owners with verified logistics service providers. We’re looking for a smart and driven Junior HR Executive who can handle end-to-end hiring, employee coordination, and also support the leadership team with calendar and meeting management. Key Responsibilities: Handle hiring across roles – sourcing, screening, and scheduling interviews Maintain employee records and assist in HR operations Manage onboarding and exit formalities Support in maintaining a positive and professional work environment Assist senior leadership with calendar management, scheduling, and follow-ups Coordinate internal meetings, reminders, and basic documentation Who You Are: 0.5–2 years of experience in HR or admin roles Strong communication and coordination skills Organized, proactive, and willing to take ownership Familiar with basic hiring tools, Google Calendar, and Excel 📩 Apply Now : Send your resume to hr@tendertruck.com or apply via LinkedIn. Be a key part of Tender Truck’s growth journey — across people and process! 🚚👥
Posted 1 week ago
7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
It was great talking to you; I was very much impressed by your vast experience and in-depth knowledge of the skill sets we discussed. I would like to propose your candidature to our client for further technical discussion. As discussed over the phone, please find below the detailed JD for your reference.

Client: LTIMindtree
Job Type: C2H
Role: Python Developer
IT Experience: 7+ years
Relevant: 5+ years
Work Location: Hyderabad, Pune
Payroll on: People Prime Worldwide
Notice: 0-15 days

Job Description
Design, develop and maintain scalable RESTful and asynchronous APIs using FastAPI
Build and design data pipelines integrated with Kafka for real-time streaming and message-driven architectures
Implement robust backend services with a focus on modularity, testability and cloud readiness
Write unit and integration test cases
Proficiency with AWS services like ECS, EC2, Lambda, CloudWatch, IAM, etc.
Solid working knowledge of MongoDB, including data modelling, aggregation pipelines and performance tuning
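For illustration, a FastAPI service backed by MongoDB along the lines described could start like the sketch below (the "orders" collection, database name, and routes are assumptions; the Kafka integration is omitted for brevity). It would typically be run with an ASGI server, e.g. `uvicorn main:app --reload`.

```python
# Minimal async FastAPI + MongoDB (Motor) sketch; collection and fields are hypothetical.
from fastapi import FastAPI, HTTPException
from motor.motor_asyncio import AsyncIOMotorClient

app = FastAPI(title="orders-api")
mongo = AsyncIOMotorClient("mongodb://localhost:27017")
orders = mongo["shop"]["orders"]

@app.get("/orders/{order_id}")
async def get_order(order_id: str):
    # Non-blocking read; Motor shares the event loop with FastAPI.
    doc = await orders.find_one({"_id": order_id}, {"_id": 0})
    if doc is None:
        raise HTTPException(status_code=404, detail="order not found")
    return doc

@app.get("/orders/stats/by-status")
async def stats_by_status():
    # Aggregation pipeline pushed down to MongoDB.
    pipeline = [
        {"$group": {"_id": "$status", "count": {"$sum": 1}, "value": {"$sum": "$amount"}}},
        {"$sort": {"count": -1}},
    ]
    return [row async for row in orders.aggregate(pipeline)]
```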
Posted 1 week ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description

Key Responsibilities:
Full Stack Development: Design, develop, and maintain high-performance, scalable, and secure web applications across the MERN stack (MongoDB, Express.js, React, Node.js).
Frontend Expertise: Implement responsive and intuitive user interfaces using React.js (or similar modern JavaScript frameworks), alongside HTML5 and CSS3.
Backend & API Development: Design and develop robust RESTful APIs using Node.js and Express.js to support frontend functionalities and integrate with external systems.
Database Management: Work with MongoDB for database design, optimization, and management.
Collaboration & Feature Delivery: Collaborate effectively with cross-functional teams, including product managers, UI/UX designers, and other developers, to define, design, and ship new features.
Code Quality & Optimization: Write clean, efficient, well-documented, and testable code following best practices and coding standards. Optimize applications for maximum speed, scalability, and security.
Code Reviews & Mentorship: Participate in and conduct code reviews to ensure code quality, performance, and adherence to coding standards. Potentially mentor junior developers, sharing knowledge and best practices.
Troubleshooting & Maintenance: Identify, troubleshoot, and debug complex issues reported by users or discovered during testing phases.

Required Skills & Qualifications
Experience: 5 to 9 years of hands-on experience as a MERN Stack Developer.
Frontend: Expertise in React.js (including Hooks, Context API/Redux, React Router). Strong proficiency in HTML5, CSS3 (with experience in pre-processors like Sass/Less and frameworks like Bootstrap/Tailwind CSS). Deep understanding of JavaScript (ES6+) and its asynchronous patterns (Promises, Async/Await).
Backend: Strong experience with Node.js and Express.js for building scalable backend services. Proficiency in designing and developing RESTful APIs.
Database: Hands-on experience with MongoDB (including Mongoose ODM, aggregation framework, indexing, performance tuning).
Version Control: Proficient with Git and collaborative development workflows (e.g., GitHub, GitLab, Bitbucket).
Problem-Solving: Excellent analytical, debugging, and problem-solving skills.
Communication: Strong verbal and written communication skills, with the ability to articulate technical concepts clearly to both technical and non-technical audiences.
(ref:hirist.tech)
Posted 1 week ago
5.0 - 7.0 years
0 Lacs
Pune, Maharashtra, India
On-site
PLSQL Developer - Pune

Job Title: PL/SQL Developer
Experience: 5 to 7 years
Location: Pune/Hybrid
Notice Period: Immediate to 15 days

Mandatory Skills
Languages: SQL, T-SQL, PL/SQL, Python libraries (PySpark, Pandas, NumPy, Matplotlib, Seaborn)

Roles & Responsibilities
Design and maintain efficient data pipelines and ETL processes using SQL and Python.
Write optimized queries (T-SQL, PL/SQL) for data manipulation across multiple RDBMS.
Use Python libraries for data processing, analysis, and visualization.
Perform EOD (end-of-day) data aggregation and reporting based on business needs.
Work on Azure Synapse Analytics for scalable data transformations.
Monitor and manage database performance across Oracle, SQL Server, Synapse, and PostgreSQL.
Collaborate with cross-functional teams to understand and translate reporting requirements.
Ensure secure data handling and compliance with organizational data policies.
Debug Unix-based scripts and automate batch jobs as needed.

Qualifications
Bachelor's/Master's degree in Computer Science, IT, or a related field.
5-8 years of hands-on experience in data engineering and analytics.
Solid understanding of database architecture and performance tuning.
Experience in end-of-day reporting setups and cloud-based analytics platforms.
(ref:hirist.tech)
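As a rough sketch of the Python-side EOD aggregation mentioned above, the pandas snippet below rolls raw trades up to a daily summary. The file name and column names (trade_ts, desk, notional, trade_id) are assumptions for illustration.

```python
# End-of-day aggregation sketch with pandas; input layout is hypothetical.
import pandas as pd

trades = pd.read_csv("trades.csv", parse_dates=["trade_ts"])

eod = (
    trades
    .assign(trade_date=trades["trade_ts"].dt.date)          # derive the business date
    .groupby(["trade_date", "desk"], as_index=False)
    .agg(gross_notional=("notional", "sum"),
         trade_count=("trade_id", "count"))
)

eod.to_csv("eod_summary.csv", index=False)
print(eod.head())
```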
Posted 1 week ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Responsible for designing, developing, and optimizing data processing solutions using a combination of Big Data technologies. Focus on building scalable and efficient data pipelines for handling large datasets and enabling batch and real-time data streaming and processing.

Responsibilities:
> Develop Spark applications using Scala or Python (PySpark) for data transformation, aggregation, and analysis.
> Develop and maintain Kafka-based data pipelines: this includes designing Kafka Streams, setting up Kafka clusters, and ensuring efficient data flow.
> Create and optimize Spark applications using Scala and PySpark: leverage these languages to process large datasets and implement data transformations and aggregations.
> Integrate Kafka with Spark for real-time processing: build systems that ingest real-time data from Kafka and process it using Spark Streaming or Structured Streaming.
> Collaborate with data teams, including data engineers, data scientists, and DevOps, to design and implement data solutions.
> Tune and optimize Spark and Kafka clusters: ensure high performance, scalability, and efficiency of data processing workflows.
> Write clean, functional, and optimized code, adhering to coding standards and best practices.
> Troubleshoot and resolve issues: identify and address any problems related to Kafka and Spark applications.
> Maintain documentation: create and maintain documentation for Kafka configurations, Spark jobs, and other processes.
> Stay updated on technology trends: continuously learn and apply new advancements in functional programming, big data, and related technologies.

Proficiency in:
Hadoop ecosystem big data tech stack (HDFS, YARN, MapReduce, Hive, Impala).
Spark (Scala, Python) for data processing and analysis.
Kafka for real-time data ingestion and processing.
ETL processes and data ingestion tools.
Deep hands-on expertise in PySpark, Scala, and Kafka.

Programming Languages: Scala, Python, or Java for developing Spark applications. SQL for data querying and analysis.

Other Skills: Data warehousing concepts. Linux/Unix operating systems. Problem-solving and analytical skills. Version control systems.

Job Family Group: Technology
Job Family: Applications Development
Time Type: Full time
Most Relevant Skills: Please see the requirements listed above.
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
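To complement the streaming responsibilities, here is a minimal batch-side PySpark sketch over a Hive table from the Hadoop stack listed above. The database, table, column names, and output path are assumptions, and a configured Hive metastore is presumed.

```python
# Batch rollup of a hypothetical Hive table into partitioned Parquet.
from pyspark.sql import SparkSession, functions as F

spark = (SparkSession.builder
         .appName("daily-clickstream-rollup")
         .enableHiveSupport()          # assumes a Hive metastore is available
         .getOrCreate())

events = spark.table("raw.clickstream")            # hypothetical table

daily = (events
         .where(F.col("event_date") == "2025-01-01")
         .groupBy("event_date", "country", "event_type")
         .agg(F.count("*").alias("events"),
              F.countDistinct("user_id").alias("users")))

(daily.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("hdfs:///warehouse/rollups/clickstream_daily"))
```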
Posted 1 week ago
5.0 - 7.0 years
0 Lacs
Delhi, India
On-site
About AlphaSense
The world’s most sophisticated companies rely on AlphaSense to remove uncertainty from decision-making. With market intelligence and search built on proven AI, AlphaSense delivers insights that matter from content you can trust. Our universe of public and private content includes equity research, company filings, event transcripts, expert calls, news, trade journals, and clients’ own research content. The acquisition of Tegus by AlphaSense in 2024 advances our shared mission to empower professionals to make smarter decisions through AI-driven market intelligence. Together, AlphaSense and Tegus will accelerate growth, innovation, and content expansion, with complementary product and content capabilities that enable users to unearth even more comprehensive insights from thousands of content sets. Our platform is trusted by over 6,000 enterprise customers, including a majority of the S&P 500. Founded in 2011, AlphaSense is headquartered in New York City with more than 2,000 employees across the globe and offices in the U.S., U.K., Finland, India, Singapore, Canada, and Ireland. Come join us!

The Role
As a Senior Product Content Analyst, you will serve as a key steward of AlphaSense’s entity reference data, with a focus on driving data integrity through advanced analysis, investigative research, and workflow improvements. In this role, you will conduct root-cause analyses of complex and systematic issues, such as duplicate LEIs or incorrect merges, while collaborating with product, engineering, and vendor teams. You’ll write and execute SQL queries to investigate inconsistencies, support product fixes, and surface insights through reports and trend analysis. Your work will directly inform improvements to both the quality of our entity data and the efficiency of our workflows, including identifying opportunities for automation. The ideal candidate brings strong SQL skills, experience with large-scale entity or financial data, and a proven ability to analyze, communicate, and resolve data quality issues in high-volume environments. Experience with big data technologies like ClickHouse is a plus.

Roles And Responsibilities
Serve as a senior member of the team responsible for AlphaSense’s entity reference data, focusing on data quality, consistency, and integrity across multiple vendor sources.
Conduct root-cause analysis of complex, systematic data issues such as duplicate LEIs, incorrect entity merges, or mismatched hierarchies.
Write and run SQL queries to investigate inconsistencies, extract relevant datasets, and provide actionable insights.
Support ad-hoc data investigations to troubleshoot client-reported issues, validate vendor integrations, or inform product enhancements.
Generate recurring reports and trend analyses to identify patterns in data discrepancies and proactively flag emerging risks.
Recommend and support workflow enhancements, including identifying opportunities for automation and process optimization.
Collaborate with product managers, engineers, and external content vendors to resolve issues and ensure data changes align with system requirements and business logic.
Document investigation results, data lineage, and remediation plans to ensure transparency and traceability across teams.
Communicate findings and propose strategic data integrity improvements to internal stakeholders.
Stay current with industry standards related to entity resolution, corporate hierarchies, and legal identifiers (e.g., LEI, DUNS, etc.).
Candidate Requirements
5-7 years of experience in entity/reference data and data management
SQL proficiency (ability to query, analyze, and extract data for insights)
Experience working with large-scale entity or financial data
Strong problem-solving skills to track down and prevent future data inconsistencies
Ability to communicate findings and recommend data integrity improvements
Outstanding oral and written communication skills
Knowledge of Google Suite and advanced Excel skills
Must be able to work a late shift to support the U.S. team
Bachelor’s degree
Experience with financial information/data and analyst workflows

Optional/Strong Plus Qualifications
Experience managing content aggregation processes and mentoring junior analysts
Familiarity with corporate entity structures and business classifications is a plus
Knowledge of ClickHouse or similar big data environments is a plus

AlphaSense is an equal-opportunity employer. We are committed to a work environment that supports, inspires, and respects all individuals. All employees share in the responsibility for fulfilling AlphaSense’s commitment to equal employment opportunity. AlphaSense does not discriminate against any employee or applicant on the basis of race, color, sex (including pregnancy), national origin, age, religion, marital status, sexual orientation, gender identity, gender expression, military or veteran status, disability, or any other non-merit factor. This policy applies to every aspect of employment at AlphaSense, including recruitment, hiring, training, advancement, and termination. In addition, it is the policy of AlphaSense to provide reasonable accommodation to qualified employees who have protected disabilities to the extent required by applicable laws, regulations, and ordinances where a particular employee works.

Recruiting Scams and Fraud
We at AlphaSense have been made aware of fraudulent job postings and individuals impersonating AlphaSense recruiters. These scams may involve fake job offers, requests for sensitive personal information, or demands for payment. Please note: AlphaSense never asks candidates to pay for job applications, equipment, or training. All official communications will come from an @alpha-sense.com email address. If you’re unsure about a job posting or recruiter, verify it on our Careers page. If you believe you’ve been targeted by a scam or have any doubts regarding the authenticity of any job listing purportedly from or on behalf of AlphaSense, please contact us. Your security and trust matter to us.
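For a concrete sense of the duplicate-LEI investigations the role describes, the sketch below runs a GROUP BY / HAVING check. It uses the Python standard-library SQLite driver purely for illustration; in practice the same query would target the production warehouse (e.g., ClickHouse), and the table and column names are assumed.

```python
# Illustrative duplicate-LEI check; schema (entity_master: entity_id, lei) is hypothetical.
import sqlite3

conn = sqlite3.connect("entities.db")

QUERY = """
SELECT lei,
       COUNT(*)                AS records,
       GROUP_CONCAT(entity_id) AS entity_ids
FROM entity_master
WHERE lei IS NOT NULL
GROUP BY lei
HAVING COUNT(*) > 1
ORDER BY records DESC;
"""

for lei, records, entity_ids in conn.execute(QUERY):
    print(f"duplicate LEI {lei}: {records} records -> {entity_ids}")
```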
Posted 1 week ago
3.0 years
0 Lacs
Greater Kolkata Area
On-site
About Klizo Klizo Solutions builds and launches multiple in‑house AI products while delivering complex platforms for US clients across healthcare, staffing, e‑commerce, and data aggregation. We are scaling fast and need Business Analysts who can own discovery, scope, and documentation—freeing leadership from daily calls and accelerating delivery. Apply directly here: https://klizos.com/careers/?job=6883c8ad991cca6fef44b3f3 Role Summary You will lead requirement discovery, turn client and internal ideas into clear SOWs, user stories, and roadmaps, and drive execution through Jira. You’ll use AI tools to move faster, reduce manual effort, and ensure every release is documented, testable, and billable. This shift will be from 5pm - 2:30am, working with the CEO directly. He is an American, and spends 6 months out of the year in India. Key Responsibilities Run discovery sessions with clients and internal product owners; capture goals, constraints, and success metrics Produce SOWs, BRDs, user stories, acceptance criteria, process/UML diagrams Use AI tools (ChatGPT/internal models) to draft documentation, test cases, and summaries Build and groom Jira backlogs; maintain release roadmaps and priority alignment Partner with Design (Figma) and Engineering to validate feasibility and edge cases Define data flows/integrations for AI, web, and mobile products Support estimation and pricing; identify and document upsell opportunities Coordinate UAT: prepare test scenarios, validate deliverables before client review Control scope creep and manage change requests with versioned documentation Provide weekly status, risk, and action reports Improve BA templates, processes, and playbooks continuously Must-Have Qualifications 3+ years as a Business Analyst / Product Analyst in software development Strong requirement elicitation & documentation skills (SOWs, user stories, UML/process maps) Hands-on experience with Jira, Confluence; Figma familiarity Excellent written and spoken English; confident leading US client calls during night shift Ability to challenge assumptions, propose better workflows, and manage deadlines Organized, detail-oriented, and proactive Nice to Have Exposure to AI/ML or data-heavy platforms Basic understanding of APIs, databases, and system architecture Experience creating test cases and supporting QA Demonstrated use of AI tools to accelerate BA outputs Compensation & Perks Salary: ₹20,000 – ₹40,000 per month (based on experience & assessment) Performance bonuses; housing assistance for relocators Paid leave (Casual, Sick, Holidays) after confirmation Late shift drop facility Work across 10+ in‑house AI products plus diverse client builds How to Apply Apply online at: https://klizos.com/careers/?job=6883c8ad991cca6fef44b3f3 Or email your CV and a sample (redacted) SOW or user story to jobs@klizos.com with subject: “Business Analyst – Your Name”. Bring structure, speed, and AI-driven efficiency to how we build. Apply now.
Posted 1 week ago
5.0 - 7.0 years
0 Lacs
Pune, Maharashtra, India
On-site
About AlphaSense
The world’s most sophisticated companies rely on AlphaSense to remove uncertainty from decision-making. With market intelligence and search built on proven AI, AlphaSense delivers insights that matter from content you can trust. Our universe of public and private content includes equity research, company filings, event transcripts, expert calls, news, trade journals, and clients’ own research content. The acquisition of Tegus by AlphaSense in 2024 advances our shared mission to empower professionals to make smarter decisions through AI-driven market intelligence. Together, AlphaSense and Tegus will accelerate growth, innovation, and content expansion, with complementary product and content capabilities that enable users to unearth even more comprehensive insights from thousands of content sets. Our platform is trusted by over 6,000 enterprise customers, including a majority of the S&P 500. Founded in 2011, AlphaSense is headquartered in New York City with more than 2,000 employees across the globe and offices in the U.S., U.K., Finland, India, Singapore, Canada, and Ireland. Come join us!

The Role
As a Senior Product Content Analyst, you will serve as a key steward of AlphaSense’s entity reference data, with a focus on driving data integrity through advanced analysis, investigative research, and workflow improvements. In this role, you will conduct root-cause analyses of complex and systematic issues, such as duplicate LEIs or incorrect merges, while collaborating with product, engineering, and vendor teams. You’ll write and execute SQL queries to investigate inconsistencies, support product fixes, and surface insights through reports and trend analysis. Your work will directly inform improvements to both the quality of our entity data and the efficiency of our workflows, including identifying opportunities for automation. The ideal candidate brings strong SQL skills, experience with large-scale entity or financial data, and a proven ability to analyze, communicate, and resolve data quality issues in high-volume environments. Experience with big data technologies like ClickHouse is a plus.

Roles And Responsibilities
Serve as a senior member of the team responsible for AlphaSense’s entity reference data, focusing on data quality, consistency, and integrity across multiple vendor sources.
Conduct root-cause analysis of complex, systematic data issues such as duplicate LEIs, incorrect entity merges, or mismatched hierarchies.
Write and run SQL queries to investigate inconsistencies, extract relevant datasets, and provide actionable insights.
Support ad-hoc data investigations to troubleshoot client-reported issues, validate vendor integrations, or inform product enhancements.
Generate recurring reports and trend analyses to identify patterns in data discrepancies and proactively flag emerging risks.
Recommend and support workflow enhancements, including identifying opportunities for automation and process optimization.
Collaborate with product managers, engineers, and external content vendors to resolve issues and ensure data changes align with system requirements and business logic.
Document investigation results, data lineage, and remediation plans to ensure transparency and traceability across teams.
Communicate findings and propose strategic data integrity improvements to internal stakeholders.
Stay current with industry standards related to entity resolution, corporate hierarchies, and legal identifiers (e.g., LEI, DUNS, etc.).
Candidate Requirements
5-7 years of experience in entity/reference data and data management
SQL proficiency (ability to query, analyze, and extract data for insights)
Experience working with large-scale entity or financial data
Strong problem-solving skills to track down and prevent future data inconsistencies
Ability to communicate findings and recommend data integrity improvements
Outstanding oral and written communication skills
Knowledge of Google Suite and advanced Excel skills
Must be able to work a late shift to support the U.S. team
Bachelor’s degree
Experience with financial information/data and analyst workflows

Optional/Strong Plus Qualifications
Experience managing content aggregation processes and mentoring junior analysts
Familiarity with corporate entity structures and business classifications is a plus
Knowledge of ClickHouse or similar big data environments is a plus

AlphaSense is an equal-opportunity employer. We are committed to a work environment that supports, inspires, and respects all individuals. All employees share in the responsibility for fulfilling AlphaSense’s commitment to equal employment opportunity. AlphaSense does not discriminate against any employee or applicant on the basis of race, color, sex (including pregnancy), national origin, age, religion, marital status, sexual orientation, gender identity, gender expression, military or veteran status, disability, or any other non-merit factor. This policy applies to every aspect of employment at AlphaSense, including recruitment, hiring, training, advancement, and termination. In addition, it is the policy of AlphaSense to provide reasonable accommodation to qualified employees who have protected disabilities to the extent required by applicable laws, regulations, and ordinances where a particular employee works.

Recruiting Scams and Fraud
We at AlphaSense have been made aware of fraudulent job postings and individuals impersonating AlphaSense recruiters. These scams may involve fake job offers, requests for sensitive personal information, or demands for payment. Please note: AlphaSense never asks candidates to pay for job applications, equipment, or training. All official communications will come from an @alpha-sense.com email address. If you’re unsure about a job posting or recruiter, verify it on our Careers page. If you believe you’ve been targeted by a scam or have any doubts regarding the authenticity of any job listing purportedly from or on behalf of AlphaSense, please contact us. Your security and trust matter to us.
Posted 1 week ago
0 years
0 Lacs
Bhagalpur, Bihar, India
On-site
Company Description Infracred is a new-age B2B SaaS commerce and fintech startup offering smart procurement and financing solutions for SMEs. Our platform aggregates materials such as TMT, bars, polymers, billet, cement, and other building materials, passing on the aggregation benefits to SMEs. By providing unsecured credit lines, we serve as a single-window for SMEs in the manufacturing and trading space, making financing more accessible and cost-effective. Role Description This is a full-time hybrid role for an Area Sales Officer located in Bhagalpur, with some work-from-home flexibility. The Area Sales Officer will be responsible for lead generation, managing customer service, executing sales operations, and supporting channel sales activities. They will focus on building relationships with SMEs, understanding their needs, and providing tailored solutions to enhance their procurement and financing experiences. Qualifications Proficiency in Customer Service and Communication skills Experience in Lead Generation and Sales Operations Knowledge in Channel Sales Excellent negotiation and interpersonal skills Ability to work independently and collaboratively in a hybrid work environment Experience in the manufacturing and trading sectors is a plus Bachelor's degree in Business, Marketing, or related field
Posted 1 week ago
5.0 - 8.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Description
Job Responsibilities:
Design, develop, and optimize MongoDB data models for various business and analytics use cases.
Implement and maintain efficient MongoDB CRUD operations, indexes, and schema evolution strategies.
Experience with self-hosted MongoDB deployments, including installation, configuration, scaling, backup/restore, and monitoring.
Build and maintain reporting and analytics pipelines using the MongoDB Reporting suite.
Develop, monitor, and tune MongoDB (both self-hosted and cloud-managed) deployments for scalability, reliability, and security.
Collaborate with engineering and product teams to translate requirements into MongoDB-backed solutions.
Support integration with Azure cloud services (e.g., Azure Cosmos DB for MongoDB, Azure Functions, Blob Storage).
Maintain documentation and contribute to database standards and best practices.
(Nice to have) Support data ingestion and automation tasks using Python.

Qualifications:
Bachelor’s or Master’s in Computer Science, Engineering, or a related quantitative discipline.

Experience:
5 to 8 years of hands-on experience in data engineering or backend development with MongoDB.
Demonstrated experience with self-hosted MongoDB, including cluster setup, maintenance, and troubleshooting.

Technical Competencies:
Deep hands-on experience with MongoDB data modelling, schema design, and normalization/denormalization strategies.
Strong proficiency in MongoDB development: aggregation pipelines, CRUD, performance tuning, and index management.
Experience in building reporting and analytics using the MongoDB Reporting suite.
Experience with self-hosted MongoDB deployments (e.g., sharding, replication, monitoring, security configuration).
Working knowledge of Azure cloud services (Azure Cosmos DB, VMs, App Service, networking for secure deployments).
(Nice to have) Experience in Python for backend integration, data processing, or scripting
Posted 1 week ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
This role is for one of Weekday's clients Min Experience: 5 years Location: Bengaluru JobType: full-time Requirements We are looking for an experienced Data Scientist with a strong background in the CPG (Consumer Packaged Goods) or Retail domain , focusing on category and product analytics , forecasting , and machine learning workflows . The ideal candidate will possess advanced analytical skills, business acumen, and hands-on expertise in modern data science tools and platforms such as Python, SQL, Databricks, PySpark , and CI/CD ML pipelines . As a Data Scientist, you will be responsible for generating actionable insights across product assortment, category performance, sales trends, and customer behaviors. Your work will directly influence decision-making for new product launches , inventory optimization , campaign effectiveness , and category planning , enabling our teams to enhance operational efficiency and drive business growth. Key Responsibilities Category & Product Analytics: Conduct deep-dive analysis into product assortment, SKU performance, pricing effectiveness, and category trends. Evaluate new product launches and provide recommendations for optimization based on early performance indicators. Sales Data Analysis & Forecasting: Analyze historical and real-time sales data to identify key growth drivers, seasonality, and demand patterns. Build statistical and ML-based models to forecast demand and category-level performance at multiple aggregation levels. Customer Analytics (Nice to Have): Analyze loyalty program data and campaign performance metrics to assess customer retention and ROI of promotions. ML Model Development & Deployment: Design, build, and deploy machine learning models using Python and PySpark to address business problems in forecasting, product clustering, and sales optimization. Maintain and scale CI/CD pipelines for ML workflows using tools like MLflow, Azure ML, or similar. Data Engineering and Tooling: Develop and optimize data pipelines on Databricks and ensure reliable data ingestion and transformation for analytics use cases. Use SQL and PySpark to manipulate and analyze large datasets with performance and scalability in mind. Visualization & Stakeholder Communication: Build impactful dashboards using Power BI (preferred) to enable self-service analytics for cross-functional teams. Translate data insights into clear, compelling business narratives for leadership and non-technical stakeholders. Collaboration & Strategic Insights: Work closely with category managers, marketing, and supply chain teams to align data science initiatives with key business objectives. Proactively identify opportunities for innovation and efficiency across product and sales functions. Required Skills & Qualifications Bachelor's or Master's degree in Data Science, Statistics, Computer Science, or a related quantitative field. 5+ years of experience in applied data science, preferably in CPG/Retail/FMCG domains. Proficient in Python, SQL, Databricks, and MLflow. Experience with PySpark and Azure ML is a strong plus. Deep experience with time-series forecasting, product affinity modeling, and campaign analytics. Familiarity with Power BI for dashboarding and visualization. Strong storytelling skills, with the ability to explain complex data-driven insights to senior stakeholders. Solid understanding of challenges and opportunities within the retail and FMCG space. Ability to work independently as well as in cross-functional teams in a fast-paced environment.
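As a lightweight illustration of the forecasting workflow described above, the pandas sketch below aggregates sales to a category-by-week level and produces a naive seasonal baseline. Column names and the forecasting rule are assumptions; a production model would use proper time-series or ML methods.

```python
# Naive weekly forecast baseline per category; input layout is hypothetical.
import pandas as pd

sales = pd.read_csv("sales.csv", parse_dates=["date"])

# Roll daily transactions up to category x week.
weekly = (sales
          .groupby(["category", pd.Grouper(key="date", freq="W")])["units"]
          .sum()
          .reset_index())

# Forecast next week per category as the mean of that category's last 4 weeks.
forecast = (weekly.sort_values("date")
                  .groupby("category")["units"]
                  .apply(lambda s: s.tail(4).mean())
                  .rename("forecast_units")
                  .reset_index())

print(forecast.head())
```

Forecast accuracy would then be tracked by comparing these baselines against actuals (e.g., MAPE per category) before moving to richer models.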
Posted 1 week ago
2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Mandate 3 – Employees will work from the respective base location of the office or on the field on all days of the week.

About Swiggy Instamart
Swiggy Instamart is building the convenience grocery segment in India. We offer more than 40,000+ assortments/products to our customers within 10-15 minutes. We are striving to augment our consumer promise of enabling unparalleled convenience by making grocery delivery instant and delightful. Instamart has been operating in 120+ cities across India and plans to expand to a few more soon. We have seen immense love from customers till now and are excited to redefine how India shops.

Job Description
Demand Planning:
Responsible for running the weekly demand planning and monitoring forecast accuracy
Collaborate with cross-functional teams (Growth/Category) to understand demand forecast drivers; ensure demand plan optimization in line with customer and product category strategy
Include causal factors (weather, natural disasters, competitor actions, etc.) and model events (sales promotions, marketing events, etc.) in demand forecasts
Develop demand forecasts at multiple levels of aggregation for multiple time horizons, across multiple regions
Release final demand plans by incorporating all inputs as per agreed timelines

Supply/Replenishment Planning:
Manage the entire replenishment process
Ensure timely release of POs to all vendors across all cities
Manage buying quantities – ensuring optimal order quantities by considering inventory norms, stock on hand, forecast and vendor constraints
Collaborate with the category team to ensure high levels of supply reliability (OTIF)

Inventory Management:
Manage inventory health – availability and ageing
Define and manage inventory norms at a Region x SKU level
Maintain desired inventory levels at each of the regions
Monitor and plan for liquidation of ageing/slow-moving SKUs
Collaborate with city teams and operations to ensure accuracy of stock levels

Desired Candidate
MBA/B.Tech with 2-4 years of relevant experience in an inventory management role in FMCG/Retail/Q-Com/E-Com
Solid understanding of inventory management practices and procedures
Rational decision-making, negotiating and influencing skills with an analytical work style
Excellent interpersonal and communication skills
The ability to thrive in a fast-paced environment

"We are an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, colour, religion, sex, disability status, or any other characteristic protected by the law"
Posted 1 week ago
1.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
ESSENTIAL DUTIES AND RESPONSIBILITIES:
The position is wide in scope of responsibilities and requires the candidate to be able to analyze and execute quickly, tackling and straddling multiple responsibilities simultaneously.
Review historical sales trends, research demand drivers, prepare forecast data, develop statistical forecast models, and evaluate forecast results.
Create demand forecasts at various levels of aggregation and seasonality as a component of demand planning.
Help develop sales and marketing attributes and communicate them to the sales team to drive sales.
Provide input to the National Product Team in developing inventory strategies on existing items, new products, and discontinued products.
Review competitive market dynamics to identify opportunities and maximize monetization.
Oversee incoming and outgoing internal transfers to maintain optimal inventory levels and meet turns targets.
Identify slow-moving and obsolete inventory (SLOB) and manage the markdown/close-out process.
Address demand-related queries and issues in a timely manner.
Analyze product life cycle at a regional level.
Create ad-hoc reports.
The candidate must have a high level of curiosity and a propensity to ask the right questions.
The candidate must be aggressive in execution but fluid and agreeable in collaborative settings.

DESIRED CANDIDATE PROFILE
Bachelor’s degree is required.
Minimum of 1 year in an analytical role preferred.
High level of comfort in working with details (self-learner).
Experience in sourcing and product development is desired but not necessary.
Superb analytical and critical thinking skills, as proven by academic records and/or demonstrable work experience.
Must be highly capable of working in a fast-paced, fast-growth environment.
Requires strong organization skills and must be able to work independently with minimal supervision.
Excellent communication skills: presentation, writing, and delivery.
Excel experience required: intermediate (pivot tables, VLOOKUP/HLOOKUP) to advanced (macros).
Posted 1 week ago
10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Executive Summary:
Skyroot Aerospace is India’s leading spacetech startup developing world-class launch vehicles for the small-satellite segment. We are preparing for the maiden launch of Vikram-1, India’s first private orbital-class vehicle. As we commence our commercial operations, we are seeking an accomplished Lead – Business Development to own revenue generation, partnerships and customer success for orbital launch services.

Role Purpose:
Drive topline growth by converting the Vikram launch vehicle family into the preferred ride to Low Earth Orbit for commercial, civil-government and defence customers worldwide. The role spans the complete sales life cycle—from market strategy and lead generation through contract execution and long-term account management.

Key Responsibilities:
Build and execute the multi-year commercial launch sales strategy, including annual revenue, backlog and margin targets.
Identify, qualify and close new business with satellite operators, manufacturers, space agencies, constellation primes and rideshare aggregators.
Own end-to-end customer engagements: requirements capture, orbit and mass trade-offs, mission design coordination, pricing, proposal preparation, and negotiation of Launch Service Agreements (LSAs).
Maintain a robust opportunity pipeline in CRM; prepare accurate forecasts, win–loss analyses and executive briefings.
Monitor global launch-market dynamics—pricing, regulatory changes, competitor capabilities—and brief leadership on strategic implications.
Represent Skyroot at international conferences, trade shows, launch campaigns and customer site visits; host VIP delegations at the Hyderabad headquarters.
Build, mentor and scale a high-performing business-development team as flight rates increase.

Required Qualifications:
Bachelor’s degree in Aerospace, Mechanical, Electrical or a related Engineering discipline; Master’s in Engineering or MBA is advantageous.
Minimum 10 years’ experience in B2B technical sales or business development, with at least 5 years selling high-value, complex hardware or services (space, defence, avionics, satellites or adjacent sectors). Prior international experience is a must.
Proven track record of closing multi-stakeholder, cross-border contracts.
Expertise in contract law, ITAR/EAR, export controls and IN-SPACe/ISRO customer interfaces.
Proficiency with HubSpot (or a similar CRM) and MS Office/Google Workspace.
Excellent negotiation, relationship-building and communication abilities; fluency in English (additional languages a plus).
Willingness to travel.

Preferred Attributes:
Prior experience working at a launch provider, satellite OEM, or space consultancy.
Established network within commercial and governmental small-satellite markets.
Familiarity with rideshare aggregation platforms and multi-manifest campaign planning.
Start-up or scale-up mindset—comfort with rapid iteration, ambiguity and lean resources.

What Skyroot Offers:
Opportunity to shape India’s private space sector and influence global access to space.
Fast-growing, mission-driven culture that combines deep engineering heritage with entrepreneurial agility.
Competitive compensation with performance incentives, employee stock options and comprehensive benefits.
Front-row seat at Vikram flight campaigns and the chance to leave a lasting legacy in the NewSpace era.
Posted 1 week ago
7.0 - 10.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
About The Company
Tata Communications redefines connectivity with innovation and intelligence. Driving the next level of intelligence powered by Cloud, Mobility, Internet of Things, Collaboration, Security, Media services and Network services, we at Tata Communications are envisaging a New World of Communications.
Roles/Responsibilities
Experience with SIEM tools like ArcSight and LogRhythm SIEM, plus threat intelligence, malware analysis, and incident response.
Experience in handling SOC customers in an MSSP/multi-tenant environment.
Responsible for technical administration and troubleshooting in SIEM, ensuring the efficient functionality of the solution.
Responsible for incident validation, incident analysis, and solution recommendation.
Good knowledge of implementation, installation, integration, troubleshooting and overall functionalities of LogRhythm/ArcSight/QRadar/Splunk.
ArcSight/LogRhythm/QRadar platform administration and management experience, including platform upgrades.
Experience in troubleshooting platform-related issues, data backup, restoration and retention.
Experience in creating content based on the MITRE framework (a minimal detection-rule sketch follows this listing).
Exposure to SOAR, alert aggregation, automation, and playbook creation.
ArcSight/LogRhythm rule-base fine-tuning, ongoing log source modifications, configuration/policy changes, general SIEM administration, and SIEM content development.
Troubleshooting of incidents within the SOC’s IT Security incident response teams.
Maintains awareness of new and emerging cyber-attack threats with potential to harm company systems and networks.
Devises and implements countermeasures to mitigate potential security threats.
Assists with the development and maintenance of IT Security measurement and reporting systems to aid in monitoring the effectiveness of IT Security programs.
Assists with the development, revision, and maintenance of Standard Operating Procedures and Working Instructions related to IT Security.
Good coordination skills with other teams for faster resolution/completion.
Threat hunting knowledge is good to have.
Education/Skills
BE/B.Tech or equivalent with a minimum of 7-10 years of experience.
Work experience of minimum 6 years in SOC incident handling, incident response, trend analysis, and administration/monitoring of SIEM tools like ArcSight and LogRhythm SIEM; threat intelligence and malware analysis.
Ability to adapt to and follow processes and guidelines.
Possess an impeccable work ethic and a high degree of integrity.
Good analytical and problem-solving skills.
Able to communicate with technical staff and management.
Flexible to work after office hours and over weekends if required.
Highly motivated and customer-centric.
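For illustration only (not from the listing): the bullets on MITRE-based content and alert aggregation come down to correlation logic like the sketch below, which counts failed logons per source and alerts over a threshold, in the spirit of a brute-force detection. The log format, field names, and threshold are assumptions.

```python
from collections import Counter

# Hypothetical, simplified auth events; real sources would be syslog/Windows events
# normalized by the SIEM (ArcSight, LogRhythm, QRadar, Splunk, ...).
raw_events = [
    "2025-07-25T10:00:01 auth_failure src=10.0.0.5 user=alice",
    "2025-07-25T10:00:03 auth_failure src=10.0.0.5 user=alice",
    "2025-07-25T10:00:04 auth_failure src=10.0.0.5 user=bob",
    "2025-07-25T10:00:06 auth_failure src=10.0.0.5 user=bob",
    "2025-07-25T10:00:08 auth_failure src=10.0.0.5 user=carol",
    "2025-07-25T10:00:09 auth_failure src=192.168.1.9 user=dave",
    "2025-07-25T10:00:11 auth_success src=192.168.1.9 user=dave",
]

# Aggregate failures per source address within the (implicit) time window.
failures_by_source = Counter(
    line.split("src=")[1].split()[0]
    for line in raw_events
    if "auth_failure" in line
)

THRESHOLD = 5  # assumed tuning value; in a SIEM this would be a rule parameter
for src, count in failures_by_source.items():
    if count >= THRESHOLD:
        print(f"ALERT: possible brute force from {src} ({count} failed logons)")
```

A production rule would also bound the time window, suppress known scanners, and hand the alert to a SOAR playbook for enrichment and ticketing.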
Posted 1 week ago
5.0 - 8.0 years
0 Lacs
Gurugram, Haryana, India
On-site
As a MongoDB Data Engineer, you will be a key contributor in architecting, modelling, and developing data solutions using MongoDB to support our document and metadata workflows. You will collaborate closely with cross-functional teams to deliver scalable, performant, and secure data platforms, with exposure to Azure cloud infrastructure. You will play a central role in modelling document and transactional data, building aggregation and reporting pipelines, and ensuring best practices in database performance and reliability, including deploying, configuring, and tuning self-hosted MongoDB environments. You will work in a start-up-like environment but with the scale and mission of a global business behind you.
The Role:
Design, develop, and optimize MongoDB data models for various business and analytics use cases.
Implement and maintain efficient MongoDB CRUD operations, indexes, and schema evolution strategies.
Experience with self-hosted MongoDB deployments, including installation, configuration, scaling, backup/restore, and monitoring.
Build and maintain reporting and analytics pipelines using the MongoDB Reporting suite.
Develop, monitor, and tune MongoDB (both self-hosted and cloud-managed) deployments for scalability, reliability, and security.
Collaborate with engineering and product teams to translate requirements into MongoDB-backed solutions.
Support integration with Azure cloud services (e.g., Azure Cosmos DB for MongoDB, Azure Functions, Blob Storage).
Maintain documentation and contribute to database standards and best practices.
(Nice to have) Support data ingestion and automation tasks using Python.
Qualifications:
Bachelor’s or Master’s in Computer Science, Engineering, or a related quantitative discipline.
Experience: 5 to 8 years of hands-on experience in data engineering or backend development with MongoDB.
Demonstrated experience with self-hosted MongoDB, including cluster setup, maintenance, and troubleshooting.
Technical Competencies:
Deep hands-on experience with MongoDB data modelling, schema design, and normalization/denormalization strategies.
Strong proficiency in MongoDB development: aggregation pipelines, CRUD, performance tuning, and index management (a minimal aggregation sketch follows this listing).
Experience in building reporting and analytics using the MongoDB Reporting suite.
Experience with self-hosted MongoDB deployments (e.g., sharding, replication, monitoring, security configuration).
Working knowledge of Azure cloud services (Azure Cosmos DB, VMs, App Service, networking for secure deployments).
(Nice to have) Experience in Python for backend integration, data processing, or scripting.
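As an illustration of the aggregation-pipeline and indexing skills called out above, here is a minimal PyMongo sketch; the connection string, database, collection, and field names (orders, status, region, amount) are placeholders, not details from the role.

```python
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # assumes a local self-hosted instance
orders = client["reporting_demo"]["orders"]

# Aggregation pipeline: filter, group per region, sort by revenue.
pipeline = [
    {"$match": {"status": "completed"}},
    {"$group": {
        "_id": "$region",
        "total": {"$sum": "$amount"},
        "orders": {"$sum": 1},
    }},
    {"$sort": {"total": -1}},
]
for row in orders.aggregate(pipeline):
    print(row)

# Compound index supporting the $match stage, a typical performance-tuning lever.
orders.create_index([("status", 1), ("region", 1)])
```

The compound index shown is one common way to keep the $match stage from scanning the whole collection; the right index always depends on the actual query shape and write load.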
Posted 1 week ago
4.0 years
10 - 13 Lacs
Janakpuri
Remote
Key Responsibilities:
Application Deployment & Environment Setup:
Deploy and manage Ruby on Rails applications across development, staging, and production environments.
Implement automated deployment pipelines using tools like Capistrano, Docker, or CI/CD services (GitHub Actions, GitLab CI, etc.).
Configure and maintain environment-specific settings and secrets (e.g., using dotenv, Rails credentials, or ENV variables).
Set up and manage hosting environments (Heroku, AWS, DigitalOcean, etc.).
Configuration Management:
Manage and maintain environment variables, feature toggles, and application settings.
Ensure consistency across all environments through configuration tracking and versioning.
Support feature rollout strategies (e.g., flags, toggles) for safe production releases.
Background Job Monitoring (Sidekiq/Queue Management):
Set up, monitor, and manage Sidekiq for background job processing.
Monitor queue performance and implement strategies for scaling workers.
Troubleshoot and recover failed jobs with retry strategies and alert mechanisms.
Maintain the Sidekiq Web UI and implement job prioritization.
Log Analysis & Performance Monitoring:
Parse and analyze Rails logs to identify performance bottlenecks and recurring errors (a minimal log-analysis sketch follows this listing).
Integrate log aggregation tools (e.g., Logstash, Fluentd, or the ELK stack) for better visibility.
Implement tools for proactive error detection (e.g., Sentry, Honeybadger, Rollbar).
Collaborate with development teams to troubleshoot issues and suggest improvements.
Job Type: Full-time
Pay: ₹1,000,000.00 - ₹1,300,000.00 per year
Benefits:
Flexible schedule
Health insurance
Life insurance
Paid sick time
Paid time off
Work from home
Ability to commute/relocate: Janakpuri, Delhi, Delhi: Reliably commute or planning to relocate before starting work (Preferred)
Experience: Ruby on Rails: 4 years (Required)
Location: Janakpuri, Delhi, Delhi (Required)
Work Location: In person
Application Deadline: 04/08/2025
Expected Start Date: 04/08/2025
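As a rough illustration of the log-analysis responsibility above (not prescribed by the listing), the Python sketch below aggregates request durations and 5xx counts per path from Rails-style "Completed" lines; the exact log format, including the path field, is an assumption and varies with the app's logger configuration.

```python
import re
from collections import defaultdict
from statistics import mean

# Toy log lines mimicking Rails "Completed" entries.
log_lines = [
    "Completed 200 OK in 182ms (Views: 120.1ms | ActiveRecord: 40.2ms) path=/orders",
    "Completed 500 Internal Server Error in 910ms (ActiveRecord: 700.5ms) path=/orders",
    "Completed 200 OK in 95ms (Views: 60.0ms) path=/health",
]

pattern = re.compile(r"Completed (\d{3}).* in (\d+)ms.*path=(\S+)")
durations, errors = defaultdict(list), defaultdict(int)

for line in log_lines:
    match = pattern.search(line)
    if not match:
        continue
    status, ms, path = int(match.group(1)), int(match.group(2)), match.group(3)
    durations[path].append(ms)
    if status >= 500:
        errors[path] += 1

# Summarize per endpoint: average and worst-case latency plus server-error count.
for path, times in durations.items():
    print(f"{path}: avg={mean(times):.0f}ms max={max(times)}ms 5xx={errors[path]}")
```

In a real setup the same aggregation would typically live in the ELK stack or an APM tool, with the script reserved for ad-hoc digging on a single host.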
Posted 1 week ago
8.0 years
0 Lacs
Delhi
On-site
Full time | Work From Office
This Position is Currently Open
Department / Category: TEAM LEAD
Listed on Jul 25, 2025
Work Location: NEW DELHI BANGALORE HYDERABAD
Job Description of Application Support Lead
8+ Years Relevant Experience
We are seeking an experienced Application Support Lead with over 8 years of hands-on expertise in production support, incident management, and performance monitoring. The ideal candidate should possess strong problem-solving capabilities, leadership qualities, and a deep understanding of modern monitoring and alerting tools, API troubleshooting, and SLA/KPI compliance. You will lead a team of support engineers, ensuring the stability, performance, and reliability of production systems.
Key Responsibilities:
Lead and coordinate day-to-day application support activities for production environments.
Use tools like Postman to troubleshoot API/web page issues and understand HTTP response codes (2xx, 3xx, 4xx, 5xx); a minimal scripted check follows this listing.
Leverage monitoring/reporting tools such as Grafana, Coralogix, and Datadog for system performance and health checks.
Analyze and optimize key business performance indicators (KPIs) and service level metrics.
Collect and analyze performance data to perform root cause analysis (RCA) and implement corrective and preventive actions (CAPA).
Drive incident resolution and ensure SLA adherence with proper documentation and tracking.
Train new team members, set clear expectations on SOPs/SLAs, and perform regular knowledge-sharing sessions.
Proactively identify and suggest process improvements or automation opportunities.
Collaborate with cross-functional teams and stakeholders to resolve issues and implement improvements.
Maintain a clear view of personal and team priorities and tasks; manage a 3–5 member team effectively.
Participate in Agile ceremonies and provide support documentation and backlog grooming when needed.
Required Skills & Qualifications:
8+ years of experience in application/production support and leading support teams.
Hands-on experience with Postman and API troubleshooting.
Familiarity with HTTP status codes and common server error diagnostics.
Proficiency in tools such as Grafana, Datadog, Coralogix, or similar.
Strong troubleshooting, debugging, and performance-tuning abilities.
Experience in incident management, change control processes, and technical documentation.
Strong leadership, organizational, and communication skills.
Ability to function both independently and collaboratively in a fast-paced environment.
Good to Have:
Experience with automation scripting/tools to reduce manual effort.
Working knowledge of AWS cloud services, including alert configuration and tracing in APM/log aggregation tools (e.g., ELK, Splunk).
Familiarity with Agile methodologies, Kanban, and JIRA.
Understanding of CI/CD pipelines and DevOps culture.
Required Skills for Application Support Lead Job: Postman, Grafana, Coralogix, Datadog
Our Hiring Process:
Screening (HR Round)
Technical Round 1
Technical Round 2
Final HR Round
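The scripted check referenced in the responsibilities above might look like the following sketch; the endpoint URL is a placeholder, the classification simply mirrors the 2xx/3xx/4xx/5xx families named in the listing, and the third-party requests library is assumed to be available.

```python
import requests


def classify(status: int) -> str:
    # Map a status code to its family, the same grouping used when triaging in Postman.
    return {2: "success", 3: "redirect", 4: "client error", 5: "server error"}.get(
        status // 100, "unexpected"
    )


# Placeholder health endpoint; a real check would read targets from configuration.
resp = requests.get("https://api.example.com/health", timeout=5, allow_redirects=False)
print(resp.status_code, classify(resp.status_code))

# Anything outside 2xx would typically raise an alert or open an incident,
# with latency and the response body attached for root cause analysis.
if resp.status_code // 100 != 2:
    print(f"ALERT: {resp.url} returned {resp.status_code} "
          f"in {resp.elapsed.total_seconds():.2f}s")
```

In practice this kind of check is usually wired into Grafana/Datadog synthetic monitors rather than run by hand, but the status-family logic is the same.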
Posted 1 week ago