
149365 Python Jobs - Page 19

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4.0 years

20 - 37 Lacs

Noida, Uttar Pradesh, India

On-site

Job Title: Kinaxis RapidResponse Consultant Industry & Sector: Enterprise supply-chain software and consulting for manufacturing, retail, high-tech and life-sciences companies focused on digital supply chain planning, S&OP/IBP and demand-supply orchestration. We deliver on-site implementations of advanced planning systems and business-process transformation to improve forecast accuracy, inventory efficiency and resilience. Location & Workplace: India — On-site role (client-facing consulting and implementation). Role & Responsibilities Lead end-to-end Kinaxis RapidResponse implementations on-site: requirements gathering, solution design, configuration, testing, cutover and hypercare for S&OP/IBP, demand & supply planning use-cases. Design and build RapidResponse models, rulesets, schedules and scenario-based what-if engines; translate business processes into robust data and planning models. Develop and validate data integration pipelines: ETL/data mapping, SQL transforms, flat-file and API-based integrations (REST/SOAP), ensuring end-to-end data quality and lineage. Create analytics, dashboards and KPIs inside RapidResponse; implement alerts, exception management and executive reporting to enable faster decision-making. Drive testing and deployment activities—unit, system and UAT—deliver training, user documentation and provide on-site go-live support and hypercare. Collaborate with cross-functional client teams (IT, supply chain, finance), mentor junior consultants and recommend continuous improvements and best practices. Skills & Qualifications Must-Have 4+ years hands-on Kinaxis RapidResponse implementation experience (model building, rules, scheduler and scenario planning). Strong understanding of supply-chain domains: demand planning, supply planning, inventory optimisation and S&OP/IBP processes. Proficient in SQL, data mapping and ETL processes; experience integrating enterprise data via APIs/web services. Proven client-facing consulting experience with strong communication, workshop facilitation and stakeholder-management skills. Experience with UAT, testing cycles, go-live support and creation of training materials and documentation. Bachelor’s degree in Engineering, Supply Chain, Computer Science, Business or related field; willing to work on-site across client locations in India. Preferred Kinaxis RapidResponse certification or formal RapidResponse training. Familiarity with Kinaxis adapters/MDS, JavaScript within RapidResponse and Python-based automation for data processing. Experience implementing RapidResponse for manufacturing, automotive, retail, semiconductor or pharmaceutical clients. Benefits & Culture Highlights Competitive compensation with opportunities for certification, professional development and rapid career progression within supply-chain consulting. High-impact, client-facing role in a fast-paced transformation practice—exposure to large-scale global projects and senior stakeholders. Collaborative, mentor-driven environment that values practical problem-solving and continuous improvement. How to Apply: If you are an experienced RapidResponse implementation professional available for on-site assignments in India, submit your CV highlighting Kinaxis RapidResponse projects, supply-chain domain experience and availability. Shortlisted candidates will be contacted for an initial technical and behavioural interview. Skills: kinaxis,rapid response,data mapping

Posted 11 hours ago

Apply

3.0 - 4.0 years

0 Lacs

Mohali district, India

On-site

The Digital Customer Solutions department at Accelleron is developing performance and condition-based maintenance solutions. We are looking for a dedicated, motivated, and highly skilled Data Engineer with strong Python skills and extensive experience in building from scratch, maintaining and automating data pipelines. The candidate will be part of an agile team responsible for developing analytics solutions based on operational sensor data from a wide range of applications in various industries such as marine, power generation and locomotive. This data is the foundation to develop and provide advanced, value-adding analytic solutions for our external and internal customers. Your Responsibilities: Drive development of data transformation and management systems. Continuously improve existing systems to maintain required performance. Design, implement and maintain the appropriate APIs and solutions to ensure that the analytics team can query time-series and transactional data. Manage, monitor, and improve our workflow automation data pipelines with tools such as Apache Airflow. Support the Data Science team in accessing needed data. Proactively search for components in the data collection and analysis processes that need adjustments and build improvements. Ensure the scalability of the developed solutions with proper infrastructure (e.g. Cloud-based), in collaboration with the frontend and backend developer teams. Your Background: 3-4 years of experience in data engineering/software development and academic background in S.T.E.M. Advanced Python, SQL, and Bash knowledge - any other language is a plus Worked with cloud services, preferably MS Azure Experience in building and automating data pipelines Familiarity with Apache Airflow is a strong plus Experience with time series databases (e.g. InfluxDB) is a strong plus Hands-on experience with relational databases (e.g. PostgreSQL) Fluent in English (written and spoken)
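For context on the pipeline-automation work this role describes, below is a minimal sketch of an Apache Airflow DAG that extracts operational sensor data, transforms it, and loads the result. The DAG id, task logic, and schedule are illustrative assumptions, not Accelleron's actual pipeline; the `schedule` argument assumes Airflow 2.4+.

```python
from datetime import datetime, timedelta
from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_sensor_data(**context):
    # Placeholder extract step: in a real pipeline this would pull from an API or time-series DB.
    return [{"asset": "turbo-01", "temp_c": 412.5}, {"asset": "turbo-02", "temp_c": 398.1}]


def transform_readings(ti, **context):
    # Pull the upstream task's output from XCom and compute a simple aggregate.
    readings = ti.xcom_pull(task_ids="extract")
    avg_temp = sum(r["temp_c"] for r in readings) / len(readings)
    return {"avg_temp_c": round(avg_temp, 2), "count": len(readings)}


def load_summary(ti, **context):
    # Placeholder load step: in practice this would write to a warehouse or time-series store.
    print("Loading summary:", ti.xcom_pull(task_ids="transform"))


with DAG(
    dag_id="sensor_data_pipeline",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",                      # Airflow 2.4+ style scheduling argument
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_sensor_data)
    transform = PythonOperator(task_id="transform", python_callable=transform_readings)
    load = PythonOperator(task_id="load", python_callable=load_summary)

    extract >> transform >> load
```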

Posted 11 hours ago

Apply

8.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

About Birlasoft: Birlasoft is a powerhouse where domain expertise, enterprise solutions, and digital technologies converge to redefine business processes. We take pride in our consultative and design thinking approach, driving societal progress by enabling our customers to run businesses with unmatched efficiency and innovation. As part of the CKA Birla Group, a multibillion-dollar enterprise, we boast a 12,500+ professional team committed to upholding the Group's 162-year legacy. Our core values prioritize Diversity, Equity, and Inclusion (DEI) initiatives, along with Corporate Sustainable Responsibility (CSR) activities, demonstrating our dedication to building inclusive and sustainable communities. Join us in shaping a future where technology seamlessly aligns with purpose. About the Job – Our project is hiring a skilled Data Science candidate with good hands-on experience in Agentic AI frameworks. Job Title - Data Science Location: Noida Educational Background: Bachelor’s degree in Computer Science, Information Technology, or related field. Mode of Work- Hybrid Experience Required - 8+ years Key Responsibilities Must have 10+ years of experience working in Data science, Machine learning and especially NLP technologies. Solid understanding of Model development, model serving, training/re-training techniques in a data sparse environment. Must have experience with Agentic AI frameworks – LangGraph, LlamaIndex, MCP etc. Expert in using paid (OpenAI on Azure) and open source LLMs. Strong understanding of Agents development. Experience with the Python programming language is a must. Ability to develop Python code as needed and train the Developers based on business needs. Experience with AWS / Azure ecosystem is a must. Candidates with a Pharma R&D background are preferred. Strong understanding of cloud native application development in AWS / Azure. Able to apply deep learning and generative modeling techniques to develop LLM solutions in the field of Artificial Intelligence. Utilize your extensive knowledge and expertise in machine learning (ML) with a focus on generative models, including but not limited to generative adversarial networks (GANs), variational autoencoders (VAEs), and transformer-based architectures. Very good understanding of Prompt engineering techniques in developing instruction-based LLMs. Must be able to design and implement state-of-the-art generative models for natural language processing (NLP) tasks such as text generation, text completion, language translation, and document summarization. Work with SAs and collaborate with cross-functional teams to identify business requirements and deliver solutions that meet customer needs. Passionate to learn and stay updated with the latest advancements in generative AI and LLM. Nice to have: contributions to the research community through publications, presentations, and participation in relevant conferences or workshops. Evaluate and preprocess large-scale datasets, ensuring data quality and integrity, and develop data pipelines for training and evaluation of generative models. Ability to articulate to business stakeholders on the hallucination effects and various model behavioral analysis techniques followed. Exposure to developing Guardrails for LLMs both with open source and cloud native models. Collaborate with software engineers to deploy and optimize generative models in production environments, considering factors such as scalability, efficiency, and real-time performance.
Nice to have: provide guidance to junior data scientists, sharing expertise and knowledge in generative AI and LLM, and contribute to the overall growth and success of the data science team. Expert in RDBMS databases. Experience with MarkLogic / NoSQL databases. Experience with Elasticsearch.

Posted 11 hours ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

About Amber Amber is a global student accommodation platform that helps students find and book their ideal housing near top universities across the world. With a presence in 100+ cities and partnerships with 300+ property providers, we're transforming the student housing experience through technology, transparency, and data-driven growth. Role Overview We’re looking for a skilled and analytical SEO Analyst with 1–3 years of hands-on experience to help scale our organic growth across global markets. You’ll be responsible for driving technical and content SEO improvements, analyzing performance data, tracking keyword/rank movements, and staying ahead of evolving trends such as Google’s Generative Search (SGE/GEO) and Answer Engine Optimization (AEO). CTC Offered: 4.5–5 LPA. This is strictly a work-from-office role. Key Responsibilities Conduct detailed keyword research, URL mapping, and content gap analysis for service, city, and blog pages. Execute regular technical SEO audits using Screaming Frog, Google Search Console, and PageSpeed tools and work with developers to fix crawl, indexation, and CWV issues. Monitor and report rank tracking across high-priority keywords and regions using tools like Ahrefs and SEMrush. Build automated SEO dashboards and reports using Google Sheets, Looker Studio, and formulas. Perform competitor analysis to benchmark Amber’s performance and discover new backlink/content opportunities. Implement and optimize schema markup (FAQ, Review, How-to, etc.) to enhance rich results and AI Overview visibility. Stay updated on the latest Google algorithm updates, SGE/AEO developments, and zero-click SERP trends. Support link-building and content strategy teams by surfacing high-opportunity pages and keywords. Required Skills & Experience 2–3 years of experience in SEO, ideally with marketplaces or global B2C platforms. Proficient in Screaming Frog, SEMrush, Ahrefs, Google Search Console, and GA4. Strong analytical skills with fluency in Google Sheets (including advanced formulas and visual reporting). Experience in rank tracking, SERP feature targeting, and competitor benchmarking. Knowledge of emerging SEO trends like GEO/SGE, AEO, and entity-based search. Ability to work collaboratively with product, content, and engineering teams. Bonus Points Experience with mobile/app SEO (App Store Optimization for iOS/Android). Experience with China-specific SEO platforms or strategies (e.g., Baidu, Sogou, Shenma). Exposure to international SEO. Familiarity with Looker Studio connectors, automated reporting, or SEO Python scripts. Why Join Amber? Work at one of the fastest-growing global student platforms. Exposure to high-scale SEO challenges across international markets. Flat hierarchy, high ownership, and direct access to leadership. A collaborative team that values innovation, data, and transparency.

Posted 11 hours ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

CCAI Lead Location: Pune/Hyderabad/ Bangalore/ Noida/ Delhi NCR/ Gurgaon Exp: 5+ Years We are looking for a Conversational AI Developer to join our team. Your role involves creating chatbots and virtual assistants to improve user experiences. You will work closely with other teams to understand needs, design solutions, and deploy them effectively. Develop chatbots and virtual assistants using AI techniques. Collaborate with teams to gather requirements and design conversational flows. Integrate AI capabilities with existing systems and services. Test and refine AI models for accuracy and usability. Stay updated on AI advancements to improve our solutions. Help mentor junior developers. Required Skills and Abilities: Experience with AI platforms like Dialogflow or Lex. 5+ years of experience in software development, with a focus on AI or NLP. Proficient in Python, Java, or JavaScript. Familiarity with NLP libraries like NLTK or spaCy. Strong problem-solving skills and ability to work in a team. Good communication skills. About Us! A global leader in Data Warehouse Migration and Modernization to the Cloud, we empower businesses by migrating their Data/Workload/ETL/Analytics to the Cloud by leveraging Automation. We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, Greenplum along with ETLs like Informatica, DataStage, AbInitio & others, to cloud-based data warehousing with other capabilities in data engineering, advanced analytics solutions, data management, data lake and cloud optimization. Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake. We have our own products! Eagle – Data Warehouse Assessment & Migration Planning Product Raven – Automated Workload Conversion Product Pelican – Automated Data Validation Product, which helps automate and accelerate data migration to the cloud. Why join us! Datametica is a place to innovate, bring new ideas to life and learn new things. We believe in building a culture of innovation, growth and belonging. Our people and their dedication over these years are the key factors in achieving our success. Check out more about us on our website below! www.datametica.com
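For readers new to the Dialogflow development this role centres on, here is a minimal detect-intent sketch using the google-cloud-dialogflow Python client (Dialogflow ES). The project ID, session ID, and the agent's intents are placeholders; a real agent and GCP credentials are assumed.

```python
# pip install google-cloud-dialogflow
from google.cloud import dialogflow


def detect_intent_text(project_id: str, session_id: str, text: str, language_code: str = "en-US"):
    """Send one user utterance to a Dialogflow ES agent and return the matched intent and reply."""
    session_client = dialogflow.SessionsClient()
    session = session_client.session_path(project_id, session_id)

    text_input = dialogflow.TextInput(text=text, language_code=language_code)
    query_input = dialogflow.QueryInput(text=text_input)

    response = session_client.detect_intent(
        request={"session": session, "query_input": query_input}
    )
    result = response.query_result
    return result.intent.display_name, result.fulfillment_text


if __name__ == "__main__":
    # Hypothetical project/session IDs -- replace with a real agent before running.
    intent, reply = detect_intent_text("my-gcp-project", "session-123", "I want to reset my password")
    print(intent, reply)
```

In practice the conversational flow (intents, entities, fulfillment webhooks) lives in the Dialogflow console or CX flows; the Python layer mainly brokers messages between the channel and the agent.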

Posted 11 hours ago

Apply

1.0 years

0 Lacs

Pune, Maharashtra, India

Remote

ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you’ll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers and consumers, worldwide. ZSers drive impact by bringing a client first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning; bold ideas; courage and passion to drive life-changing impact to ZS. Our most valuable asset is our people. At ZS we honor the visible and invisible elements of our identities, personal experiences and belief systems—the ones that comprise us as individuals, shape who we are and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about. What you’ll do: Power BI We are looking for experienced Power BI developers who have the following set of technical skillsets and experience. Undertake complete ownership in accomplishing activities and assigned responsibilities across all phases of project lifecycle to solve business problems across one or more client engagements. Apply appropriate development methodologies (e.g.: agile, waterfall) and best practices (e.g.: mid-development client reviews, embedded QA procedures, unit testing) to ensure successful and timely completion of assignments. Collaborate with other team members to leverage expertise and ensure seamless transitions; Exhibit flexibility in undertaking new and challenging problems and demonstrate excellent task management. Assist in creating project outputs such as business case development, solution vision and design, user requirements, prototypes, and technical architecture (if needed), test cases, and operations management. Bring transparency in driving assigned tasks to completion and report accurate status. Bring a consulting mindset to problem solving and innovation by leveraging technical and business knowledge/expertise and collaborate across other teams. Assist senior team members, delivery leads in project management responsibilities. Build complex solutions using programming languages, ETL service platforms, etc. Power Apps We are looking for experienced Power Apps developers who have the following set of technical skillsets and experience. Create multi-page complex Canvas PowerApps using CDS/SharePoint, SQL etc. Create model-driven apps, with an in-depth understanding of Dataverse, Business Rules, JavaScript embedding, and PCF components. Detailed understanding of Power BI concepts and DAX is a standout skill required. Use data modelling and transformation techniques to create complex tools/processes. Strong understanding of Power Automate, Power Automate Desktop and using Automate flows in PowerApps. Good understanding of Python is a must. Strong understanding of various controls and limitations in PowerApps like delegation, charts etc. In-depth understanding of Components within Apps, integration of components with Canvas and Model Driven Apps.
Take ownership of high-quality deliverables by QC-ing end-to-end tools (both functionality and performance). Understanding of basic concepts of Agile/Waterfall development methodologies. What you’ll bring: Power BI Bachelor’s or master’s degree in Computer Science, Engineering, or a related field. 1+ years of professional experience in Power BI development. Data Visualization: Proficiency in creating compelling and effective visualizations to communicate insights using Power BI's various chart types and features. Power BI Desktop: Mastery of Power BI Desktop for designing reports and dashboards, including data loading, data modeling, and creating calculated measures. Data Transformation: Ability to clean, transform, and shape data using Power Query in Power BI, ensuring data quality and relevance. DAX (Data Analysis Expressions): Strong understanding and application of DAX, a formula language used in Power BI for creating custom calculations and aggregations. Power BI Service: Knowledge of Power BI Service for publishing, sharing, and collaborating on Power BI reports and dashboards. Data Connectivity: Experience connecting Power BI to various data sources, including databases, cloud services, and on-premises data sources. Performance Optimization: Knowledge of techniques to optimize Power BI dashboards for speed and efficiency. Data Modeling: Proficiency in designing effective data models within Power BI, including relationships between tables and optimizing data for reporting. Power Apps Bachelor’s or master’s degree in Computer Science, Engineering, or a related field. 1+ years of professional experience in Power Apps development. Proficient in understanding data and Excel or SQL data transformations. It is preferred that the candidate has working experience of connecting PowerApps with multiple sources like Dynamics Dataverse, SharePoint, Excel, API etc. Awareness and familiarity with the evolving nature of constant updates in Power Platform. Experienced with designing and developing complex processes and functions in the most efficient manner. Strong analytic, problem solving, and programming ability. Innovative mindset with motivation to try new methodologies and contribute. Strong oral and written communication skills with fluency in English. Experience in Python coding. Ability to work in a cross-office environment. PowerApps App Maker or Power Platform + Dynamics Certification like PL 100, PL 900, MB 200 etc. Additional Skills: Strong communication skills, both verbal and written, with the ability to structure thoughts logically during discussions and presentations. Capability to simplify complex concepts into easily understandable frameworks and presentations. Proficiency in working within a virtual global team environment, contributing to the timely delivery of multiple projects. Travel to other offices as required to collaborate with clients and internal project teams. Perks & Benefits: ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. Our robust skills development programs, multiple career progression options and internal mobility paths and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week.
The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections. Travel: Travel is a requirement at ZS for client-facing ZSers; business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures. Considering applying? At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law. To Complete Your Application: Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE. Find Out More At: www.zs.com

Posted 11 hours ago

Apply

4.0 - 8.0 years

0 Lacs

Nagpur, Maharashtra, India

On-site

About Company At Delaplex, we believe true organizational distinction comes from exceptional products and services. Founded in 2008 by a team of like-minded business enthusiasts, we have grown into a trusted name in technology consulting and supply chain solutions. Our reputation is built on trust, innovation, and the dedication of our people who go the extra mile for our clients. Guided by our core values, we don’t just deliver solutions, we create meaningful impact. QA Automation Engineer We are seeking a highly skilled and experienced QA Automation Engineer to join our team. The ideal candidate will have a strong background in test automation, with specific expertise in Warehouse Management Systems (WMS) projects and a proven ability to manage the complete test automation cycle . This role is critical for ensuring the quality, reliability, and performance of our software solutions. Key Responsibilities Design, develop, and maintain robust test automation frameworks and scripts. Lead the entire test automation cycle, from test case design and execution to defect tracking and reporting. Specialize in testing applications related to Warehouse Management Systems (WMS), including but not limited to inventory control, order processing, and logistics workflows. Collaborate closely with product managers, developers, and other QA team members to ensure comprehensive test coverage. Analyze and document test results, providing detailed reports on application quality and performance. Continuously research and implement new testing tools, strategies, and technologies to improve the overall QA process. Required Skills & Experience 4-8 years of professional experience in Quality Assurance, with a significant focus on test automation. Demonstrable experience working on projects involving Warehouse Management Systems (WMS). Strong proficiency in developing and maintaining end-to-end automation test suites and frameworks. Hands-on experience with at least one major automation tool such as Selenium, Cypress, or Playwright. Solid programming skills in a language like Python, Java, or JavaScript. Familiarity with CI/CD tools and integrating automated tests into the development pipeline. Excellent analytical, problem-solving, and communication skills. Qualifications Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent practical experience. Skills: test automation,qa automation,warehouse management systems
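To illustrate the kind of automation suite this role maintains, here is a small pytest + Selenium sketch for a hypothetical WMS order-search screen. The base URL, element IDs, and order number are assumptions, not details from the posting.

```python
# pip install selenium pytest
import pytest
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

BASE_URL = "https://wms.example.com"  # hypothetical WMS under test


@pytest.fixture
def driver():
    options = webdriver.ChromeOptions()
    options.add_argument("--headless=new")
    drv = webdriver.Chrome(options=options)
    yield drv
    drv.quit()


def test_order_search_returns_matching_order(driver):
    """Searching by order number should show that order in the results grid."""
    driver.get(f"{BASE_URL}/orders")

    search_box = WebDriverWait(driver, 10).until(
        EC.presence_of_element_located((By.ID, "order-search"))  # assumed element id
    )
    search_box.send_keys("ORD-1001")
    driver.find_element(By.ID, "search-btn").click()  # assumed element id

    rows = WebDriverWait(driver, 10).until(
        EC.presence_of_all_elements_located((By.CSS_SELECTOR, "table#results tbody tr"))
    )
    assert any("ORD-1001" in row.text for row in rows)
```

Tests like this are typically wired into a CI pipeline so that inventory, order-processing, and logistics workflows are regression-checked on every build.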

Posted 11 hours ago

Apply

3.0 years

10 - 18 Lacs

Bengaluru, Karnataka, India

On-site

Key Responsibilities Partner with product managers, engineers, and business stakeholders to define KPIs and success metrics for Creator Success Create comprehensive dashboards and self-service analytics tools using QuickSight, Tableau, or similar BI platforms Design, build, and maintain robust ETL/ELT pipelines to process large volumes of streaming and batch data from the Creator Success platform Develop and optimize data warehouses, data lakes, and real-time analytics systems using AWS services (Redshift, S3, Kinesis, EMR, Glue) 3+ years of experience in business intelligence/analytics roles with proficiency in SQL, Python, and/or Scala Strong experience with AWS cloud services (Redshift, S3, EMR, Glue, Lambda, Kinesis) Expertise in building and optimizing ETL pipelines and data warehousing solutions Proficiency with big data technologies (Spark, Hadoop) and distributed computing frameworks High proficiency in SQL and Python Experience with business intelligence tools (QuickSight, Tableau, Looker) and data visualization best practices Experience with AWS cloud services (Redshift, S3, EMR) Skills: business intelligence,sql,aws,python,power bi,tableau
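As a rough illustration of the batch ETL work described above, here is a minimal PySpark job that reads raw events from S3, aggregates a daily creator KPI, and writes partitioned Parquet for the BI layer. The bucket paths, column names, and KPI definitions are illustrative assumptions.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("creator-success-daily-kpis").getOrCreate()

# Assumed input: newline-delimited JSON engagement events landed in S3.
events = spark.read.json("s3://example-bucket/creator-events/dt=2024-01-01/")

daily_kpis = (
    events
    .withColumn("event_date", F.to_date("event_timestamp"))
    .groupBy("event_date", "creator_id")
    .agg(
        F.countDistinct("viewer_id").alias("unique_viewers"),
        F.sum("watch_seconds").alias("total_watch_seconds"),
        F.count("*").alias("events"),
    )
)

# Partitioned Parquet output feeds the warehouse / BI layer (e.g. Redshift Spectrum or QuickSight).
(daily_kpis
    .write.mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3://example-bucket/curated/creator_daily_kpis/"))
```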

Posted 11 hours ago

Apply

0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Mandatory: Proficiency in Python with experience in Databricks (PySpark). Good to Have: Hands-on experience with Apache Airflow. Working knowledge of PostgreSQL, MongoDB. Basic experience with cloud technologies such as Azure, AWS, and Google Cloud.

Posted 11 hours ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Description We are seeking a skilled professional with expertise in Machine Learning, Data Science, Statistics, Generative AI, Retrieval-Augmented Generation (RAG), and Python programming. The role involves end-to-end project ownership, from requirement gathering and model design to deployment and performance monitoring. The candidate will lead and mentor a small team of 2–3 members, ensuring high-quality deliverables and adherence to timelines. Key Responsibilities: Design, develop, and deploy ML and AI-based solutions. Apply statistical analysis and data modeling to derive actionable insights. Implement Generative AI and RAG techniques in real-world applications. Write efficient, maintainable code in Python. Manage full project lifecycle, from concept to delivery. Mentor junior team members, providing technical guidance and performance feedback. Qualifications: Proven experience in ML, Data Science, Statistics, and Python. Hands-on expertise in Generative AI and RAG. Strong problem-solving and analytical skills. Experience leading small teams and managing end-to-end projects.
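For readers unfamiliar with RAG, the following deliberately simplified, library-free Python sketch shows the retrieve-then-prompt pattern: keyword-overlap scoring stands in for embedding similarity, and the final LLM call is left as a placeholder. The documents and query are made up; production systems use a vector store and an actual model endpoint.

```python
from collections import Counter

DOCS = [
    "Refunds are processed within 5 business days of approval.",
    "Premium subscribers get priority support via chat and email.",
    "Passwords can be reset from the account settings page.",
]


def score(query: str, doc: str) -> int:
    """Toy relevance score: word overlap. Real systems use embedding similarity."""
    q, d = Counter(query.lower().split()), Counter(doc.lower().split())
    return sum((q & d).values())


def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k most relevant documents for the query."""
    return sorted(DOCS, key=lambda doc: score(query, doc), reverse=True)[:k]


def build_prompt(query: str) -> str:
    """Ground the model by packing retrieved context into the prompt."""
    context = "\n".join(f"- {c}" for c in retrieve(query))
    return (
        "Answer using only the context below.\n"
        f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"
    )


if __name__ == "__main__":
    prompt = build_prompt("How long do refunds take?")
    print(prompt)  # In practice this prompt is sent to an LLM (e.g. OpenAI on Azure).
```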

Posted 11 hours ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

About Tide At Tide, we are building a business management platform designed to save small businesses time and money. We provide our members with business accounts and related banking services, but also a comprehensive set of connected administrative solutions from invoicing to accounting. Launched in 2017, Tide is now used by over 1 million small businesses across the world and is available to UK, Indian and German SMEs. Headquartered in central London, with offices in Sofia, Hyderabad, Delhi, Berlin and Belgrade, Tide employs over 2,000 employees. Tide is rapidly growing, expanding into new products and markets and always looking for passionate and driven people. Join us in our mission to empower small businesses and help them save time and money. About The Role We are seeking a highly skilled and experienced Senior Data Engineer with a deep expertise in PySpark to join our ML/Data engineering team. This team is responsible for feature development, data quality checks, deploying and integrating ML models with backend services and the overall Tide platform. In this role, you will be instrumental in designing, developing, and optimizing our next-generation data pipelines and data platforms. You will work with large-scale datasets, solve complex data challenges, and contribute to building robust, scalable, and efficient data solutions that drive business value. This is an exciting opportunity for someone passionate about big data technologies, performance optimization, and building resilient data infrastructure. As a Data Engineer You’ll Be Performance Optimization: Identify, diagnose, and resolve complex performance bottlenecks in PySpark jobs and Spark clusters, leveraging Spark UI, query plans, and advanced optimization techniques (e.g., partitioning, caching, broadcasting, AQE, UDF optimization). Design & Development: Lead the design and implementation of highly scalable, fault-tolerant, and optimized ETL/ELT pipelines using PySpark for batch and potentially real-time data processing. Data Modeling: Collaborate with data scientists, analysts, and product teams to understand data requirements and design efficient data models (e.g., star/snowflake schemas, SCDs) for analytical and operational use cases. Data Quality & Governance: Implement robust data quality checks, monitoring, and alerting mechanisms to ensure the accuracy, consistency, and reliability of our data assets. Architectural Contributions: Contribute to the overall data architecture strategy, evaluating new technologies and best practices to enhance our data platform's capabilities and efficiency. Code Review & Best Practices: Promote and enforce engineering best practices, including code quality, testing, documentation, and version control (Git). Participate actively in code reviews. Mentorship & Leadership: Mentor junior data engineers, share knowledge, and contribute to a culture of continuous learning and improvement within the team. Collaboration: Work closely with cross-functional teams including software engineers, data scientists, product managers, and business stakeholders to deliver impactful data solutions. What Are We Looking For 8+ years of professional experience in data engineering, with at least 4+ years specifically focused on PySpark development and optimization in a production environment. Expert-level proficiency in PySpark including Spark SQL, DataFrames, RDDs, and understanding of Spark's architecture (Driver, Executors, Cluster Manager, DAG). 
Strong hands-on experience with optimizing PySpark performance on large datasets, debugging slow jobs using Spark UI, and addressing common issues like data skew, shuffles, and memory management. Excellent programming skills in Python with a focus on writing clean, efficient, and maintainable code. Proficiency in SQL for complex data manipulation, aggregation, and querying. Basic understanding of data warehousing concepts (dimensional modeling, ETL/ELT processes, data lakes, data marts). Experience with distributed data storage solutions such as Delta Lake, Apache Parquet, etc. Familiarity with version control systems (Git). Strong problem-solving abilities, analytical skills, and attention to detail. Excellent communication and interpersonal skills, with the ability to explain complex technical concepts to both technical and non-technical audiences. Bachelor's or Master's degree in Computer Science, Engineering, or a related quantitative field. What You Will Get In Return Make work, work for you! We are embracing new ways of working and support flexible working arrangements. With our Working Out of Office (WOO) policy our colleagues can work remotely from home or anywhere in their assigned Indian state. Additionally, you can work from a different country or Indian state for 90 days of the year. Plus, you’ll get: Competitive salary Self & Family Health Insurance Term & Life Insurance OPD Benefits Mental wellbeing through Plumm Learning & Development Budget WFH Setup allowance 15 days of Privilege leaves 12 days of Casual leaves 12 days of Sick leaves 3 paid days off for volunteering or L&D activities Stock Options TIDEAN WAYS OF WORKING At Tide, we champion a flexible workplace model that supports both in-person and remote work to cater to the specific needs of our different teams. While remote work is supported, we believe in the power of face-to-face interactions to foster team spirit and collaboration. Our offices are designed as hubs for innovation and team-building, where we encourage regular in-person gatherings to foster a strong sense of community. TIDE IS A PLACE FOR EVERYONE At Tide, we believe that we can only succeed if we let our differences enrich our culture. Our Tideans come from a variety of backgrounds and experience levels. We consider everyone irrespective of their ethnicity, religion, sexual orientation, gender identity, family or parental status, national origin, veteran, neurodiversity or differently-abled status. We celebrate diversity in our workforce as a cornerstone of our success. Our commitment to a broad spectrum of ideas and backgrounds is what enables us to build products that resonate with our members’ diverse needs and lives. We are One Team and foster a transparent and inclusive environment, where everyone’s voice is heard. At Tide, we thrive on diversity, embracing various backgrounds and experiences. We welcome all individuals regardless of ethnicity, religion, sexual orientation, gender identity, or disability. Our inclusive culture is key to our success, helping us build products that meet our members' diverse needs. We are One Team, committed to transparency and ensuring everyone’s voice is heard. Your personal data will be processed by Tide for recruitment purposes and in accordance with Tide's Recruitment Privacy Notice.
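The posting names several PySpark optimization techniques (broadcasting, partitioning, AQE, skew handling). Below is a small sketch of two of them: broadcasting a small dimension table to avoid shuffling the large fact table, and repartitioning by the aggregation key before a wide aggregation. Table paths, sizes, and column names are assumptions, not Tide's actual pipelines.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("feature-pipeline-optimization").getOrCreate()
# Adaptive Query Execution lets Spark coalesce shuffle partitions and mitigate skew at runtime.
spark.conf.set("spark.sql.adaptive.enabled", "true")

transactions = spark.read.parquet("s3://example-bucket/transactions/")   # large fact table (assumed)
merchants = spark.read.parquet("s3://example-bucket/merchants/")         # small dimension (assumed)

# Broadcasting the small dimension avoids a shuffle of the large fact table during the join.
enriched = transactions.join(F.broadcast(merchants), on="merchant_id", how="left")

# Repartition by the aggregation key so the wide aggregation distributes evenly across executors.
features = (
    enriched
    .repartition(200, "member_id")
    .groupBy("member_id")
    .agg(
        F.sum("amount").alias("total_spend_30d"),
        F.count("*").alias("txn_count_30d"),
    )
)

features.write.mode("overwrite").parquet("s3://example-bucket/features/member_spend/")
```

Whether broadcasting or salting a skewed key is the right fix depends on table sizes and the Spark UI evidence; the sketch only shows the mechanics.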

Posted 11 hours ago

Apply

7.0 years

0 Lacs

Udaipur, Rajasthan, India

On-site

🚀 We’re Hiring: Senior DevOps Engineer – II Industry : IT Services Job Type : Full-time Department : Engineering Technology Focus : DevOps Experience : 5–7 years Location : Udaipur & Gurugram, India We’re seeking a Senior DevOps Engineer who excels in building secure, scalable, and high-performance infrastructure for complex, high-load environments. You’ll partner with leading engineering teams to design, optimise, and deploy resilient systems that power mission-critical applications. What You’ll Do Design, implement, and manage CI/CD pipelines (Jenkins, GitHub Actions, CircleCI, Azure DevOps). Architect enterprise-scale infrastructure using AWS, Kubernetes, Terraform, Ansible, Linux . Manage and optimise distributed databases ( PostgreSQL, MySQL, MongoDB ). Implement infrastructure-as-code, automation, and monitoring ( Prometheus, Grafana, ELK, Datadog, CloudWatch ). Build secure, compliant systems aligned with SOC 2, HIPAA , and CIS standards. Lead multi-cloud and hybrid deployments, including AWS, Azure, and on-prem . Mentor junior engineers and drive best practices in DevOps. Mandatory Skills 5+ years in DevOps with production-grade systems. Strong expertise in AWS, Kubernetes, Terraform, Linux, Networking & Security . Proven CI/CD pipeline implementation experience. Proficiency in Bash/Python scripting and infrastructure automation. Deep understanding of network stacks, load balancing, SSL, IAM . Preferred : AWS Solutions Architect / RHCE certification, container orchestration (ECS, Docker Swarm). Why Join Us? Premium MacBook setup, flexible hours, quarterly bonuses, learning budget, healthcare for family, celebrations, retreats, and more. 📩 Apply Now and help us build the future of secure, scalable systems.

Posted 11 hours ago

Apply

0.0 - 4.0 years

6 - 32 Lacs

Hyderabad, Telangana

On-site

Senior Software Engineer Location: Bangalore or Hyderabad, India Workplace Type: Hybrid About the Role We are seeking a talented and passionate Senior Software Engineer to join our dynamic team. In this role, you will be instrumental in crafting innovative software solutions, building greenfield products, and mentoring junior developers. You will thrive in our flat, transparent culture, collaborating with high-impact teams to shape the future of software engineering. If you have a passion for software engineering, a customer-centric mindset, and a desire to work with modern technologies, we encourage you to apply. Key Responsibilities Craft beautiful software experiences using Design Thinking, Lean, and Agile methodologies. Build greenfield products with modern tech stacks such as Java, Python, JavaScript, Go, and Scala. Collaborate effectively in a flat, transparent culture within high-impact teams. Mentor junior developers, providing guidance and support to foster their growth. Participate in code reviews to ensure code quality and adherence to best practices. Contribute to the design and architecture of new and existing systems. Troubleshoot and resolve complex technical issues. Stay up-to-date with the latest technologies and trends in software development. Required Skills & Qualifications Minimum of 4 years of experience in software development. Hands-on development experience with a broad mix of languages such as Java, Python, and JavaScript. Strong server-side development experience, primarily in Java (Python and NodeJS are also considerable). Experience with UI development using ReactJS, AngularJS, PolymerJS, EmberJS, or jQuery. Passion for software engineering and following best coding practices. Excellent problem-solving and analytical skills. Strong communication and collaboration skills. Additional Information This is a hybrid role based in either Bangalore or Hyderabad, India. We are looking for immediate joiners or candidates who can join within 15 days. The interview process consists of two technical rounds: the first round is virtual, and the second round is face-to-face in Hyderabad. Nice to Have: Product and customer-centric mindset. Great OO skills, including design patterns. Experience with DevOps, continuous integration & deployment. Exposure to big data technologies, Machine Learning, and NLP. Job Types: Full-time, Permanent Pay: ₹660,995.08 - ₹3,200,000.00 per year Benefits: Paid sick time Provident Fund Ability to commute/relocate: Hyderabad, Telangana: Reliably commute or planning to relocate before starting work (Required) Application Question(s): Are you available for face to face interview in Hyderabad? Willingness to travel: 100% (Required) Work Location: In person

Posted 11 hours ago

Apply

0.0 - 3.0 years

6 - 8 Lacs

Mohali, Punjab

On-site

The Role As a DevOps Engineer , you will be an integral part of the product and service division, working closely with development teams to ensure seamless deployment, scalability, and reliability of our infrastructure. You'll help build and maintain CI/CD pipelines, manage cloud infrastructure, and contribute to system automation. Your work will directly impact the performance and uptime of our flagship product, BotPenguin. What you need for this role Education: Bachelor's degree in Computer Science, IT, or a related field. Experience: 2-5 years in DevOps or similar roles. Technical Skills: Proficiency in CI/CD tools like Jenkins, GitLab CI, or GitHub Actions. Experience with containerization and orchestration using Docker and Kubernetes. Strong understanding of cloud platforms, especially AWS & Azure. Familiarity with infrastructure as code tools such as Terraform or CloudFormation. Knowledge of monitoring and logging tools like Prometheus, Grafana, and ELK Stack. Good scripting skills in Bash, Python, or similar languages. What you will be doing Build, maintain, and optimize CI/CD pipelines. Monitor and improve system performance, uptime, and scalability. Manage and automate cloud infrastructure deployments. Work closely with developers to support release processes and environments. Implement security best practices in deployment and infrastructure management. Ensure high availability and reliability of services. Document procedures and provide support for technical troubleshooting. Contribute to training junior team members, and assist HR and operations teams with tech-related concerns as required. Top reasons to work with us Be part of a cutting-edge AI startup driving innovation in chatbot automation. Work with a passionate and talented team that values knowledge-sharing and problem-solving. Growth-oriented environment with ample learning opportunities. Exposure to top-tier global clients and projects with real-world impact. A culture that fosters creativity, ownership, and collaboration. Detail-oriented with a focus on automation and efficiency. Strong problem-solving abilities and proactive mindset. Effective communication and collaboration skills. Job Type: Full-time Pay: ₹600,000.00 - ₹800,000.00 per year Benefits: Health insurance Leave encashment Provident Fund Ability to commute/relocate: Mohali, Punjab: Reliably commute or planning to relocate before starting work (Required) Experience: DevOps: 3 years (Required) Work Location: In person
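As a small example of the uptime-monitoring scripting this role involves, here is a self-contained Python health-check probe that could run from cron or a CI step. The endpoint URLs are placeholders; real setups usually feed results into Prometheus, CloudWatch, or an alerting channel instead of printing them.

```python
# pip install requests
import time
import requests

ENDPOINTS = [
    "https://example.com/health",       # placeholder URLs -- swap in real service health endpoints
    "https://api.example.com/health",
]


def check(url: str, timeout: float = 5.0) -> dict:
    """Probe one endpoint and record status code and latency; any exception counts as down."""
    start = time.monotonic()
    try:
        resp = requests.get(url, timeout=timeout)
        return {
            "url": url,
            "up": resp.ok,
            "status": resp.status_code,
            "latency_ms": round((time.monotonic() - start) * 1000, 1),
        }
    except requests.RequestException as exc:
        return {"url": url, "up": False, "status": None, "error": str(exc)}


if __name__ == "__main__":
    results = [check(url) for url in ENDPOINTS]
    for r in results:
        print(r)
    # A non-zero exit code lets cron/CI flag the failure to an alerting system.
    raise SystemExit(0 if all(r["up"] for r in results) else 1)
```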

Posted 11 hours ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Work Level : Individual Core : Responsible Leadership : Team Alignment Industry Type : Information Technology Function : Database Administrator Key Skills : PLSQL,SQL Writing,mSQL Education : Graduate Note: This is a requirement for one of the Workassist Hiring Partner. Primary Responsibility: Collect, clean, and analyze data from various sources. Assist in creating dashboards, reports, and visualizations. We are looking for a SQL Developer Intern to join our team remotely. As an intern, you will work with our database team to design, optimize, and maintain databases while gaining hands-on experience in SQL development. This is a great opportunity for someone eager to build a strong foundation in database management and data analysis. Responsibilities Write, optimize, and maintain SQL queries, stored procedures, and functions. This is a Remote Position. Assist in designing and managing relational databases. Perform data extraction, transformation, and loading (ETL) tasks. Ensure database integrity, security, and performance. Work with developers to integrate databases into applications. Support data analysis and reporting by writing complex queries. Document database structures, processes, and best practices. Requirements Currently pursuing or recently completed a degree in Computer Science, Information Technology, or a related field. Strong understanding of SQL and relational database concepts. Experience with databases such as MySQL, PostgreSQL, SQL Server, or Oracle. Ability to write efficient and optimized SQL queries. Basic knowledge of indexing, stored procedures, and triggers. Understanding of database normalization and design principles. Good analytical and problem-solving skills. Ability to work independently and in a team in a remote setting. Preferred Skills (Nice to Have) Experience with ETL processes and data warehousing. Knowledge of cloud-based databases (AWS RDS, Google BigQuery, Azure SQL). Familiarity with database performance tuning and indexing strategies. Exposure to Python or other scripting languages for database automation. Experience with business intelligence (BI) tools like Power BI or Tableau. Company Description Workassist is an online recruitment and employment solution platform based in Lucknow, India. We provide relevant profiles to employers and connect job seekers with the best opportunities across various industries. With a network of over 10,000+ recruiters, we help employers recruit talented individuals from sectors such as Banking & Finance, Consulting, Sales & Marketing, HR, IT, Operations, and Legal. We have adapted to the new normal and strive to provide a seamless job search experience for job seekers worldwide. Our goal is to enhance the job seeking experience by leveraging technology and matching job seekers with the right employers. For a seamless job search experience, visit our website: https://bit.ly/3QBfBU2 (Note: There are many more opportunities apart from this on the portal. Depending on the skills, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
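To make the extract-transform-load tasks mentioned above concrete, here is a tiny self-contained example using Python's built-in sqlite3 module: raw rows are cleaned, loaded into a table, and summarised with an aggregate query. The table and data are invented for illustration; the same pattern applies to MySQL or PostgreSQL with their respective drivers.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Load target: table for cleaned order data.
cur.execute("""
    CREATE TABLE orders_clean (
        order_id INTEGER PRIMARY KEY,
        customer TEXT NOT NULL,
        amount REAL NOT NULL
    )
""")

# Extract: raw rows as they might arrive from a source system (illustrative data).
raw_rows = [
    (1, "  Alice ", "1200.50"),
    (2, "Bob", "89.99"),
    (3, "  Carol", "420.00"),
]

# Transform: trim names and cast amounts before loading.
clean_rows = [(oid, name.strip(), float(amount)) for oid, name, amount in raw_rows]
cur.executemany("INSERT INTO orders_clean VALUES (?, ?, ?)", clean_rows)
conn.commit()

# Report: a parameterless aggregate query of the kind the internship involves.
cur.execute("SELECT customer, SUM(amount) FROM orders_clean GROUP BY customer ORDER BY 2 DESC")
for customer, total in cur.fetchall():
    print(customer, total)
```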

Posted 12 hours ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Work Level : Individual Core : Communication Skills, Problem Solving, Execution Leadership : Decisive, Team Alignment, Working Independently Industry Type : IT Services & Consulting Function : Data Analyst Key Skills : MySQL,Python,Bigdata,Data Science,Data Analytics,Data Analysis,Cloud,AWS,Business Intelligence (BI),Statistical Modeling,R,Big Data Platforms,Tableau Education : Graduate Note: This is a requirement for one of the Workassist Hiring Partner. Requirement: Currently pursuing or recently completed a degree in Data Science, Statistics, Mathematics, Computer Science, or a related field. Strong analytical and problem-solving skills. Proficiency in Excel and SQL for data analysis. Experience with data visualization tools like Power BI, Tableau, or Google Data Studio. Basic knowledge of Python or R for data analysis is a plus. Understanding of statistical methods and data modeling concepts. Strong attention to detail and ability to work independently. Excellent communication skills to present insights clearly. Preferred Skills: Experience with big data technologies (Google BigQuery, AWS, etc.). Familiarity with machine learning techniques and predictive modeling. Knowledge of business intelligence (BI) tools and reporting frameworks. Company Description Workassist is an online recruitment and employment solution platform based in Lucknow, India. We provide relevant profiles to employers and connect job seekers with the best opportunities across various industries. With a network of over 10,000+ recruiters, we help employers recruit talented individuals from sectors such as Banking & Finance, Consulting, Sales & Marketing, HR, IT, Operations, and Legal. We have adapted to the new normal and strive to provide a seamless job search experience for job seekers worldwide. Our goal is to enhance the job seeking experience by leveraging technology and matching job seekers with the right employers. For a seamless job search experience, visit our website: https://bit.ly/3QBfBU2 (Note: There are many more opportunities apart from this on the portal. Depending on the skills, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!

Posted 12 hours ago

Apply

3.0 years

0 - 0 Lacs

Bhopal, Madhya Pradesh

On-site

Cybersecurity Application Security Consultant – DevSecOps Position Type: Permanent Location: Bhopal, Madhya Pradesh, India Salary: ₹50,000 INR per month About the Role We are seeking a highly motivated and skilled Cybersecurity Application Security Consultant with expertise in DevSecOps practices to join our growing team in Bhopal. This is a permanent position where you will play a crucial role in integrating security into every phase of the Software Development Life Cycle (SDLC), from design to deployment and operations. You will work closely with development, operations, and QA teams to ensure our applications are secure by design and by default. Key Responsibilities Security Integration: Integrate security tools and processes into CI/CD pipelines (DevSecOps) to automate security testing, vulnerability scanning, and compliance checks. Application Security Testing: Conduct various application security tests, including Static Application Security Testing (SAST), Dynamic Application Security Testing (DAST), Interactive Application Security Testing (IAST), and Software Composition Analysis (SCA). Vulnerability Management: Identify, analyze, and prioritize security vulnerabilities in applications and provide actionable recommendations for remediation. Security Architecture Review: Participate in the design and architecture reviews of new and existing applications to identify potential security risks and recommend secure design patterns. Threat Modeling: Perform threat modeling exercises to identify potential threats and vulnerabilities early in the development lifecycle. Security Best Practices: Advocate for and implement secure coding guidelines, industry standards (e.g., OWASP Top 10, SANS Top 25), and security best practices within development teams. Security Training & Awareness: Provide guidance and training to development teams on secure coding practices and application security principles. Incident Response Support: Assist in the investigation and resolution of application security incidents. Documentation: Maintain comprehensive documentation of security findings, remediation efforts, and security policies. Required Skills and Qualifications Education: Bachelor's degree or Engineering in Computer Science, Information Technology, Cybersecurity, or a related field. Experience: Proven experience (3+ years) in application security, with a strong focus on DevSecOps principles and practices. Development Experience: Practical experience in software development and understanding the full development lifecycle. Technical Proficiency: Strong understanding of web application security vulnerabilities (OWASP Top 10) and secure coding practices. Experience with security testing tools (e.g., Burp Suite, OWASP ZAP, Nessus, SonarQube, Checkmarx, Fortify). Familiarity with CI/CD tools (e.g., Jenkins, GitLab CI/CD, Azure DevOps, GitHub Actions, Semgrep, OpenGrep). Proficiency in at least one scripting language (e.g., Python, Bash) for automation. Understanding of cloud security principles (AWS, Azure, GCP) is a plus. Knowledge of containerisation technologies (Docker, Kubernetes) and their security implications. DevSecOps Mindset: Strong understanding of how to embed security into agile and DevOps methodologies. Communication: Excellent written and verbal communication skills, with the ability to explain complex security concepts to technical and non-technical stakeholders. Problem-Solving: Strong analytical and problem-solving skills with a keen eye for detail. 
Preferred Qualifications Engineering in Computer Science or Cybersecurity Job Type: Full-time Pay: ₹40,000.00 - ₹45,000.00 per month Work Location: In person

Posted 12 hours ago

Apply

0.0 years

0 - 0 Lacs

Viman Nagar, Pune, Maharashtra

On-site

About the Role: We’re looking for a sharp and motivated Software Testing Intern with manual and automation testing knowledge. Join our QA team in Pune to work on real-world applications and gain hands-on experience in ensuring software quality. Roles & Responsibilities: Execute manual and automated test cases for web/mobile apps Develop test scripts using tools like Selenium, Postman, JUnit , etc. Identify and report bugs; track issues using Jira/Bugzilla Collaborate with developers and designers to improve product quality Participate in sprint planning and QA discussions. Qualifications: Pursuing/completed a degree in CS, IT, or related field Familiarity with manual + automation testing tools Basic coding/scripting (Java, Python, or JavaScript) Understanding of SDLC, STLC, and Agile workflows Detail-oriented with strong communication skills What You’ll Gain: Real-world QA experience (manual + automation) Mentorship from experienced professionals Internship certificate & PPO opportunity Exposure to live projects in an agile environment. Job Type: Internship Contract length: 6 months Pay: ₹5,000.00 - ₹10,000.00 per month Benefits: Flexible schedule Education: Bachelor's (Required) Location: Viman Nagar, Pune, Maharashtra (Required) Shift availability: Night Shift (Preferred) Day Shift (Preferred) Work Location: In person

Posted 12 hours ago

Apply

5.0 years

0 Lacs

Delhi, India

On-site

JOB PURPOSE Responsible for building growth product experiences and infrastructure to grow the business across Mobile APP & Web platforms. Role will entail owning the strategy, roadmap, and execution of key growth product initiatives focused on unlocking remarkable experiences for our existing customers through data-driven customer insights. PRINCIPAL ACCOUNTABILITIES Product Vision & Roadmap for driving X-Sell/Up-Sell of Financial Services through APP & Web Channels Be responsible for creating a product vision and a roadmap that aligns with the overall cross-sell strategy of DMI vision leveraging customer research, customer experience and new technologies. Work with cross-functional teams to design, develop, and launch new products to drive higher X-Sell/Up-Sell/retention. End-to-end ownership of delivering the products in line with the roadmap Think holistically across the end-to-end journey of a customer and develop strategy and roadmap for when and how we should engage customers across different automated and human-assisted channels (sales, support, self-serve emails, …) to successfully address awareness gaps and friction points in the most cost-effective way. Write crystal clear User Stories & product requirements & work with the team to create designs/wireframes. Cooperating with the team to ensure quick implementation of product growth plans. Prior Experience in Mobile APP is a must. Responsible for monetization of the customer base by launching new products & business lines e.g., Insurance, Warranty, Payments, UPI etc. Optimize the product for customer and business needs through Funnel Optimization Be outcome-oriented; everything we do needs metrics, and the role will be expected to define and improve these metrics. Setting and ensuring the company achieves specific KPIs in product-related metrics, such as activation, acquisition, conversion, retention, referrals, and revenue. Performing detailed product analysis which covers the product management funnel with all its aspects. Improving various aspects of customer experience, such as early marketing impressions and new account onboarding Determining and monitoring the effects of the metrics optimization Drive retention strategies & drive higher DAU, MAU & reduce Churn. P&L Management Maximizing the customer lifetime value by managing the lifecycle & increasing retention Increase the X-Sell ratio for both the Credit & Fee products. Manage the marketing budget across channels and ensure blended CAC is within the target range. Manage the P&L for a few of the business lines (Repeat Consumption Loans / Fee Income) Stakeholder Management Interacting with cross-functional teams such as Product, Design, Tech, Credit, Operations, Communication & Customer Success to drive the Growth agenda & ensuring correct implementation of all the product growth hacks. Lead discussions with external stakeholders/channel partners during engagement calls Role requires a very high level of cross-functional interaction to successfully drive the Product Growth initiatives. Qualifications MBA from a reputed institute with 5-7+ years of full-time experience as a product manager launching consumer or B2B products. Knowledge of tools- WebEngage, MoEngage, Clevertap, Adobe, Google Analytics, AppsFlyer, Branch etc.
Work Experience
5-7+ years of relevant, proven experience in a numbers-driven and analytical marketing environment, ideally in an online consumer business (fintech) or a bank/NBFC.
Experience with iterative, hypothesis-driven product development and experimentation.
Experience working with a broad set of stakeholders and driving alignment.
Strong written and verbal communication skills.
Ownership mindset; does whatever it takes to solve problems and delight users.
Excels at cutting through ambiguity to produce strategic insights and opportunities.
Expert knowledge of various testing methodologies (A/B testing, multivariate testing, incrementality testing, usability testing).
Strong analytical skills & business acumen.
Skills: SQL, Advanced Excel including Pivot Tables; R/Python is good to have.
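Since the role lists A/B and incrementality testing with Python as good-to-have, here is a minimal, hypothetical sketch of a two-proportion significance check for a conversion experiment. The counts are invented for illustration and it assumes the statsmodels package.

```python
# Hypothetical two-proportion z-test for an A/B conversion experiment.
# The numbers below are made up purely for illustration.
from statsmodels.stats.proportion import proportions_ztest

conversions = [420, 480]        # conversions in control vs. variant
exposures = [10_000, 10_000]    # users exposed to each arm

z_stat, p_value = proportions_ztest(count=conversions, nobs=exposures)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
# A small p-value (e.g. < 0.05) suggests the observed lift is unlikely to be noise.
```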

Posted 12 hours ago

Apply

5.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Title: Python – Team Lead
Location: Chennai
Job Type: Full-time
Experience: 5 - 10 Years
Key Responsibilities
Designing and developing robust, secure and reliable web applications and web services
Designing and developing front-end and back-end web architecture, creating user interactions on web pages, developing web APIs, and developing and configuring servers and databases
Software requirement collection, customer interaction, and ensuring project documents are prepared and maintained
Involvement in support activities, including site support
Staying up to date with industry developments
SPECIAL ASSIGNMENT / PROJECT
All existing Python projects, including Vesuvius, JKTIL, YOHT, Rane, etc.
All upcoming Python projects
Requirements
Python
Django
MongoDB
HTML, CSS, JS, ReactJS
Server Configuration (IIS, Apache)
ELK (Elasticsearch/Logstash/Kibana)
Manufacturing Industry Experience
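For illustration, a minimal sketch of the kind of Django JSON endpoint this role describes. It assumes a standard Django project/app is already configured; the WorkOrder model and its fields are hypothetical, not from the posting.

```python
# Minimal Django JSON endpoint sketch (class-based view).
# Assumes a configured Django app; model name and fields are hypothetical.
from django.http import JsonResponse
from django.views import View

from .models import WorkOrder  # hypothetical manufacturing model


class WorkOrderListView(View):
    def get(self, request):
        # Return the 50 most recent work orders as plain dicts.
        orders = WorkOrder.objects.values("id", "status", "plant")[:50]
        return JsonResponse({"results": list(orders)})
```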

Posted 12 hours ago

Apply

50.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

About The Opportunity
Job Type: Permanent
Application Deadline: 31 August 2025
Job Description
Title: Analyst Programmer - Site Reliability Engineer
Department: ISS Distribution
Location: Gurgaon
Level: 2
Fidelity International offers investment solutions and services and retirement expertise to more than 2.5 million customers globally. As a privately-held, purpose-driven company with a 50-year heritage, we think generationally and invest for the long term. Operating in more than 25 locations and with $739.9 billion in total assets, our clients range from central banks, sovereign wealth funds, large corporates, financial institutions, insurers and wealth managers, to private individuals.
Our Workplace & Personal Financial Health business provides individuals, advisers and employers with access to world-class investment choices, third-party solutions, administration services and pension guidance. Together with our Investment Solutions & Services business, we invest $567 billion on behalf of our clients. By combining our asset management expertise with our solutions for workplace and personal investing, we work together to build better financial futures.
Find out more about what we do, our history, and how you could be a part of our future at careers.fidelityinternational.com/about-us.
Our clients come from all walks of life and so do we. We are proud of our inclusive culture and encourage applications from the widest mix of talent, whatever your age, gender, ethnicity, sexual orientation, gender identity, social background and more. As a flexible employer, we trust our people to perform their role in the way that works best for them, our clients and our business. We are a disability-friendly company and would welcome a conversation with you if you feel you might benefit from any reasonable adjustments to perform to the best of your ability during the recruitment process and beyond.
About Your Team
The ISS Distribution business comprises Fidelity's Institutional Business Units in the UK, EMEA and Asia Pacific and is a strategic area targeted for growth over the coming years. The Technology Department has been acting as the key enabler for the business in achieving its goals. The Institutional portfolio of projects will include a large collection of strategic initiatives as well as tactical ones to support day-to-day operations and strengthen the technical environment. Primary technologies used in these applications are: Java/J2EE, AWS, Snowflake, SpringMVC, React, Layer-7.
About Your Role
We are seeking a talented Site Reliability Engineer (SRE) to join our Technology team supporting critical applications within ISS Production Services. This role blends traditional software engineering practices with reliability-focused operations, aiming to enhance the scalability, availability, and performance of client- and market-facing applications. The SRE will work directly with application development, architecture, DevOps, and business teams to ensure systems are designed and maintained with reliability and performance in mind, while meeting the demanding requirements of financial services operations.
About You
Reliability & Performance Engineering
Partner with development teams to define SLOs, SLIs, and error budgets that align with business needs.
Influence the design and architecture of systems to ensure high availability, resilience, and scalability across trading, portfolio management, compliance, and research platforms.
Proactively identify bottlenecks and implement performance improvements for latency-sensitive applications.
Application Support & Incident Management
Serve as an escalation point for production issues affecting business-critical client reporting applications.
Perform real-time troubleshooting and root cause analysis during incidents, followed by detailed postmortems and action items.
Collaborate with product and operations teams to prioritize and remediate reliability risks.
Observability & Automation
Implement and evolve observability stacks (metrics, logging, tracing) to provide actionable insights into application health and user experience.
Automate manual processes for deployment, monitoring, and incident remediation using scripting and configuration management tools (e.g., Ansible, Terraform, Python).
Business Context & Domain Alignment
Apply understanding of trading workflows, portfolio analytics, risk management, and regulatory reporting to prioritize engineering efforts.
Translate domain-specific requirements into technical reliability strategies for applications handling large volumes of financial data.
Experience And Qualifications Required
We are seeking a motivated and skilled SRE with 3-4 years of experience to join our team. The ideal candidate should have hands-on experience in automation and monitoring, and good knowledge of containerization concepts.
Strong programming/scripting background (e.g., Python, Go, Shell) with a focus on automation and tooling.
Deep understanding of distributed systems and modern application architectures (microservices, containers, service mesh).
Experience supporting mission-critical applications in a highly regulated financial services environment.
Familiarity with event-driven systems, message queues (e.g., Kafka), databases (Oracle), and cloud-native platforms.
Knowledge of financial services processes such as trade lifecycle, NAV calculations, order management, and market data integration is highly desirable.
Essential Skills
2+ years of hands-on experience with cloud platforms (e.g., AWS, GCP, Azure) and infrastructure-as-code practices.
Knowledge of ITIL practices and support experience.
Good knowledge of Oracle database concepts, SQL statements (DML/DDL), stored procedures and functions.
Strong collaboration and communication skills, with an ability to influence development teams and business stakeholders.
Experience in Python and shell scripting.
Understanding of container orchestration principles (Kubernetes) and infrastructure-as-code tools.
Experience using monitoring tools like ELK and New Relic.
Experience with GitHub/Bitbucket as source control tools and build tools like Jenkins and UrbanDeploy.
Proven ability to work well under pressure and in a team environment.
Self-motivated, flexible, responsible, with a penchant for quality.
Ability to work closely with cross-functional teams.
Ability to prioritise their own activities and work to hard deadlines.
Desirable Skills
Good analytical, problem-solving and documentation skills.
Calm approach when under pressure.
Solid organisational skills.
A real desire to do things the right way whilst remaining delivery-focused.
Feel Rewarded
For starters, we’ll offer you a comprehensive benefits package. We’ll value your wellbeing and support your development. And we’ll be as flexible as we can about where and when you work – finding a balance that works for all of us. It’s all part of our commitment to making you feel motivated by the work you do and happy to be part of our team.
For more about our work, our approach to dynamic working and how you could build your future here, visit careers.fidelityinternational.com.
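Because the role centres on defining SLOs, SLIs, and error budgets, here is a minimal illustrative calculation of an availability SLI and the remaining error budget. The SLO target, window and request counts are invented examples, not figures from the posting.

```python
# Illustrative error-budget calculation for an availability SLO.
# The target, window and request counts below are made-up examples.
SLO_TARGET = 0.999              # 99.9% availability over the measurement window
total_requests = 4_200_000
failed_requests = 3_150

sli = 1 - failed_requests / total_requests             # observed availability (the SLI)
allowed_failures = (1 - SLO_TARGET) * total_requests   # total error budget in requests
budget_remaining = 1 - failed_requests / allowed_failures

print(f"SLI: {sli:.5f}")
print(f"Error budget remaining: {budget_remaining:.1%}")  # 25.0% with these numbers
```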

Posted 12 hours ago

Apply

0.0 - 2.0 years

0 - 0 Lacs

Mohali, Punjab

On-site

Job Title: QA Engineer (Automation)
Experience: 2–4 Years
Location: Mohali
Contact at 9915991662
Apply here - https://beyondroot.keka.com/#/hire/jobs/view/34740/candidates/sourced
Job Summary: We are seeking a QA Engineer with 2–4 years of experience in both manual and automation testing for web, mobile, and API applications. The ideal candidate should have good hands-on experience with automation tools like Selenium, Appium, JMeter, or similar, and a strong understanding of manual testing processes, including test planning, test case design, and defect tracking.
Key Responsibilities
Design and execute manual test cases for functional, integration, regression, and system testing.
Develop, maintain, and execute automated test scripts for web, mobile, and APIs using tools like Selenium, Appium, Postman, JMeter, or similar.
Prepare and maintain QA documentation such as test plans, test cases, test scenarios, traceability matrices, and defect reports.
Participate in requirement analysis and review meetings to ensure test coverage.
Log and track bugs in tools like JIRA, and work with developers to ensure timely resolution.
Contribute to building and maintaining CI/CD pipelines for automated testing.
Ensure test environments are properly set up for both manual and automated testing.
Required Skills & Qualifications
2–4 years of experience in manual and automation testing.
Strong knowledge of testing methodologies, SDLC, and STLC.
Hands-on experience with automation tools such as Selenium (web testing), Appium (mobile testing), and Postman, JMeter, or RestAssured (API & performance testing).
Experience in preparing and executing QA documents: test plans, test cases, and reports.
Good knowledge of bug tracking and test management tools (e.g., JIRA, TestRail).
Basic knowledge of programming/scripting in Java, Python, or JavaScript.
Familiar with Git, CI/CD tools, and Agile development environments.
Nice to Have
Knowledge of BDD frameworks (Cucumber, SpecFlow).
Experience with Docker, Kubernetes, or cloud platforms (AWS, Azure).
Exposure to security or performance testing.
Job Types: Full-time, Permanent
Pay: ₹30,000.00 - ₹50,000.00 per month
Experience: Test Automation Engineer: 2 years (Required)
Location: Mohali, Punjab (Required)
Work Location: In person
Speak with the employer: +91 9817558892
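As a rough illustration of the API-testing side of this role, here is a minimal pytest-style check written with the requests library. The base URL, payload and expected response are hypothetical placeholders rather than a real service from the posting.

```python
# Minimal API test sketch (requests + pytest conventions).
# The endpoint, payload and expected fields are hypothetical placeholders.
import requests

BASE_URL = "https://api.example.com"  # placeholder service


def test_create_user_returns_201():
    payload = {"name": "QA Bot", "email": "qa@example.com"}
    response = requests.post(f"{BASE_URL}/users", json=payload, timeout=10)

    assert response.status_code == 201            # resource created
    body = response.json()
    assert body["email"] == payload["email"]      # echo check on the created resource
```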

Posted 12 hours ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site

We are seeking an AI/ML Engineer with expertise in generative AI, large language models (LLMs), and image-to-text workflows to develop cutting-edge solutions for the medical imaging domain. You will work closely with software engineers and domain experts to design, train, and deploy advanced AI models that push the boundaries of healthcare innovation.
Requirements and Qualifications
3+ years of full-time experience in the industry.
Hands-on experience in contemporary AI, such as training generative AI models like LLMs and image-to-text models, improving upon pre-trained models, evaluating these models, feedback loops, etc.
Specialized expertise in model fine-tuning, RLHF, RAG, LLM tool use, etc.
Experience with LLM prompt engineering and familiarity with LLM-based workflows/architectures.
Proficiency in Python, PySpark, TensorFlow, PyTorch, Keras, Transformers, and cloud platforms such as Google Cloud Platform (GCP), Vertex AI or similar.
Ability to collaborate with software engineers to integrate generative models into production systems, ensuring scalability, reliability, and efficiency of the deployed models.
Experience in effective data visualization approaches and a keen eye for detail in the visual communication of findings.
Your Responsibilities
Develop new LLMs for medical imaging based on state-of-the-art LLMs.
Develop and implement methods that improve training efficiency and extend or improve LLM capabilities, reliability, and safety in the realm of image-to-text generation using medical data.
Perform data preprocessing, indexing, and feature engineering specific to healthcare image and text data.
Keep up to date with the research literature and think beyond the state of the art to address the needs of our users.
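To make the image-to-text workflow concrete, here is a minimal inference sketch using the Hugging Face Transformers image-to-text pipeline. The checkpoint is a general-purpose captioning model and the image path is a placeholder; real medical-imaging work would require domain-specific fine-tuning, evaluation and safety review, none of which is shown here.

```python
# Minimal image-to-text inference sketch with Hugging Face Transformers.
# The checkpoint is a generic captioning model and the image path is a placeholder.
from transformers import pipeline

captioner = pipeline("image-to-text", model="nlpconnect/vit-gpt2-image-captioning")
result = captioner("sample_scan.png")      # local image path or URL
print(result[0]["generated_text"])         # generated caption / draft description
```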

Posted 12 hours ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Company Description
At PropGrow, we are transforming the real estate investment experience in Gurugram's residential and commercial property markets, making it easier and more rewarding for both seasoned investors and first-time buyers. Our mission is to simplify real estate investment complexities through personalized, data-driven guidance that aligns high-growth opportunities with investor goals. Leveraging advanced technology, market surveys, and robust data analytics, we identify prime investment opportunities focused on maximizing ROI. We foster long-term relationships to ensure our clients feel valued and supported at all stages of their journey. Join PropGrow to be part of a community dedicated to building wealth and achieving real estate success.
Role Description
This is a full-time on-site role for a Python Developer located in Gurugram. The Python Developer will be responsible for back-end web development, developing software solutions, utilizing object-oriented programming (OOP) principles, and working with various databases. Day-to-day tasks involve designing, coding, testing, debugging, and maintaining Python applications to enhance our real estate investment platform and ensure seamless user experiences.
Qualifications
Proficiency in Back-End Web Development and Software Development
Strong skills in Object-Oriented Programming (OOP) and general Programming
Experience working with Databases
Excellent problem-solving skills and attention to detail
Ability to work collaboratively in a team environment on-site
Bachelor's degree in Computer Science, Information Technology, or a related field
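As a small illustration of the OOP-plus-database work this role describes, here is a repository-style class over SQLite from the standard library. The table name and columns are invented for illustration and have no connection to PropGrow's actual schema.

```python
# Small OOP + database sketch using the standard-library sqlite3 module.
# Table name and columns are invented purely for illustration.
import sqlite3
from dataclasses import dataclass


@dataclass
class Listing:
    id: int
    title: str
    price_lakh: float


class ListingRepository:
    def __init__(self, path: str = ":memory:"):
        self.conn = sqlite3.connect(path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS listings "
            "(id INTEGER PRIMARY KEY, title TEXT, price_lakh REAL)"
        )

    def add(self, title: str, price_lakh: float) -> int:
        cur = self.conn.execute(
            "INSERT INTO listings (title, price_lakh) VALUES (?, ?)",
            (title, price_lakh),
        )
        self.conn.commit()
        return cur.lastrowid

    def all(self) -> list[Listing]:
        rows = self.conn.execute("SELECT id, title, price_lakh FROM listings").fetchall()
        return [Listing(*row) for row in rows]
```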

Posted 12 hours ago

Apply

10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

About Client: Our client is a global IT services company headquartered in Southborough, Massachusetts, USA. Founded in 1996, with revenue of $1.8B and 35,000+ associates worldwide, it specializes in digital engineering and IT services, helping clients modernize their technology infrastructure, adopt cloud and AI solutions, and accelerate innovation. It partners with major firms in banking, healthcare, telecom, and media. Our client is known for combining deep industry expertise with agile development practices, enabling scalable and cost-effective digital transformation. The company operates in over 50 locations across more than 25 countries, has delivery centers in Asia, Europe, and North America and is backed by Baring Private Equity Asia.
Job Title: AWS Data Engineer
Key Skills: Python, PySpark, SQL, Glue, Redshift, Lambda, DMS, RDS, CloudFormation and other AWS serverless services
Job Locations: Chennai, Hyderabad, Pune, Bangalore
Experience: 6 – 10 Years
Work Timings: 2 to 11 PM
Budget: 13 – 16 LPA
Education Qualification: Any Graduation
Work Mode: Hybrid
Employment Type: Contract
Notice Period: Immediate - 15 Days
Interview Mode: Online test followed by technical evaluation (1 round of technical interview, including a client round)
Job Description
Primary Skills: Python, PySpark, Glue, Redshift, Lambda, DMS, RDS, CloudFormation and other AWS serverless services; strong experience in SQL
Detailed JD
Seeking a developer with good experience in Athena, Python, Glue, Lambda, DMS, RDS, Redshift, CloudFormation and other AWS serverless resources.
Can optimize data models for performance and efficiency.
Able to write SQL queries to support data analysis and reporting.
Design, implement, and maintain the data architecture for all AWS data services.
Work with stakeholders to identify business needs and requirements for data-related projects.
Design and implement ETL processes to load data into the data warehouse.
Interested candidates, please share your CV to pnomula@people-prime.com
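For illustration of the Glue/PySpark ETL work this JD lists, here is a minimal AWS Glue job skeleton that reads a catalogued table and writes Parquet to S3. The database, table and bucket names are placeholders; a real job for this role would add transforms and a Redshift target.

```python
# Minimal AWS Glue (PySpark) job skeleton.
# Database, table and S3 path are placeholders, not real resources.
import sys

from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a catalogued source table (placeholder names).
orders = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="raw_orders"
)

# Write the frame out as Parquet to a curated S3 location (placeholder bucket).
glue_context.write_dynamic_frame.from_options(
    frame=orders,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/curated/orders/"},
    format="parquet",
)

job.commit()
```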

Posted 12 hours ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies