4.0 - 10.0 years
25 - 70 Lacs
India
On-site
Sr. Software Engineer - 1 | Java Location: Gurugram, Haryana, India About Naehas Naehas is a pre-IPO, fast-paced, and rapidly growing SaaS company based in Silicon Valley. We help regulated enterprises streamline and accelerate customer communications through intelligent content management and automation. Our platform empowers marketing, legal, and compliance teams to deliver personalized, compliant experiences at scale—faster and more efficiently. Trusted by some of the world's largest financial institutions, Naehas is driving innovation, accuracy, and governance in customer engagement. We're growing our team in the Gurgaon area (India) and are looking for individuals who thrive in a culture of autonomy, collaboration, and purpose-driven work. Our formula for success is simple: passionate, engaged employees lead to satisfied customers and a world-class product. About The Position The software engineer will be part of a team developing a custom workflow engine that will be integrated with Naehas's core product to provide a degree of automation. Since this is a new product in the nascent stages of the SDLC, the engineer is expected to write clean, optimized and efficient code, write unit test cases, maintain Javadoc, document APIs, perform peer reviews and create relevant documentation of the architecture and features being built. This position also requires the engineer to be proactive in collaborating with the front-end developers and other team members to design and develop scalable and robust solutions. Skills & Responsibilities: Analyze, design and develop extremely reliable, scalable and high-performing web applications. Collaborate with product management from time to time to understand and evaluate business requirements and translate them into new features within the timeline. Should be a proactive self-starter who can propose and implement solutions and underlying techniques for problem statements in pursuit of results. Work with other developers and the team to ensure that new features are delivered and that issues are fully tested with minimal defects. Contribute towards the development and application of advanced concepts and technologies, and hold an area of expertise within the team. Address architecture and design issues of products or technologies and provide strategic reasoning for introducing new technologies in their area of expertise. Requirements Education must be equivalent to a Bachelor's or Master's degree in Computer Science or a related field, with a strong preference for candidates who have completed their B.Tech/M.Tech from top Tier-1 institutes. 4-10 years of hands-on coding experience in Java 8, Spring Boot, Spring Data JPA, Hibernate. Experience building microservices and RESTful web services. Experience working with RDBMS (preferably MySQL) and NoSQL databases like MongoDB, Neo4j, etc. Working experience with OAuth2, OpenID Connect/SAML, JWT and Spring Security. Understanding of Java build tools like Maven/Gradle. Software Engineering - design, test and implement software systems that optimize all phases of the data operations process and new solution designs. Perform requirements analysis, understand business requirements, and design and develop optimized, customized solutions for customers. Demonstrated leadership ability to work effectively with cross-functional teams.
Involved in research and development activities to understand and identify the product requirements aligned to Naehas's vision and business needs. Ability to function in a fast-paced environment. Nice to Have: Hands-on experience with Cloud platforms and services, preferably AWS. Exposure to UI and front-end technologies like HTML, CSS, JavaScript and frameworks and libraries such as Angular/React.js. An understanding of CI/CD pipelines and build automation tools like Jenkins. Experience in Linux and shell scripting. Working knowledge of messaging queues like RabbitMQ, ActiveMQ etc. Working knowledge of Apache Kafka and pub-sub systems. Experience with enterprise authentication and authorization solutions like Okta. Experience with SSO. Knowledge of deployment using Docker will be an added advantage. Benefits At Naehas, people and culture are at the heart of everything we do. The best way to understand our work environment is through the values we live by: Reality - Recognize and address challenges early Ownership - Be self-aware and take personal responsibility Courage - Always give your best, even in tough situations Trust - Respect and uplift your teammates; focus on building each other up Curiosity - Stay driven to explore and understand new ideas and solutions Flexibility - Embrace adaptability and innovation over rigid efficiency Integrity - Uphold our reputation above all else If these values speak to you, we're excited to meet you. Here's what you can expect as a team member at Naehas: Competitive compensation A comprehensive benefits package, including health coverage A casual, inclusive workplace where your ideas are valued and respected The agility and energy of a fast-growing, profitable startup Flexible work arrangements to support work-life balance Complimentary meals to keep you energized throughout the day For more information, please visit https://www.naehas.com/
Posted 1 week ago
6.0 - 10.0 years
8 - 12 Lacs
Hyderabad
Work from Office
Responsibilities: Design, build, and optimize data pipelines to ingest, process, transform, and load data from various sources into our data platform Implement and maintain ETL workflows using tools like Debezium, Kafka, Airflow, and Jenkins to ensure reliable and timely data processing Develop and optimize SQL and NoSQL database schemas, queries, and stored procedures for efficient data retrieval and processing Work with both relational databases (MySQL, PostgreSQL) and NoSQL databases (MongoDB, DocumentDB) to build scalable data solutions Design and implement data warehouse solutions that support analytical needs and machine learning applications Collaborate with data scientists and ML engineers to prepare data for AI/ML models and implement data-driven features Implement data quality checks, monitoring, and alerting to ensure data accuracy and reliability Optimize query performance across various database systems through indexing, partitioning, and query refactoring Develop and maintain documentation for data models, pipelines, and processes Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs Stay current with emerging technologies and best practices in data engineering Requirements: 5+ years of experience in data engineering or related roles with a proven track record of building data pipelines and infrastructure Strong proficiency in SQL and experience with relational databases like MySQL and PostgreSQL Hands-on experience with NoSQL databases such as MongoDB or AWS DocumentDB Expertise in designing, implementing, and optimizing ETL processes using tools like Kafka, Debezium, Airflow, or similar technologies Experience with data warehousing concepts and technologies Solid understanding of data modeling principles and best practices for both operational and analytical systems Proven ability to optimize database performance, including query optimization, indexing strategies, and database tuning Experience with AWS data services such as RDS, Redshift, S3, Glue, Kinesis, and ELK stack Proficiency in at least one programming language (Python, Node.js, Java) Experience with version control systems (Git) and CI/CD pipelines Nice to Have: Experience with graph databases (Neo4j, Amazon Neptune) Knowledge of big data technologies such as Hadoop, Spark, Hive, and data lake architectures Experience working with streaming data technologies and real-time data processing Familiarity with data governance and data security best practices Experience with containerization technologies (Docker, Kubernetes) Understanding of financial back-office operations and FinTech domain Experience working in a high-growth startup environment
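As an illustration of the pipeline work this role describes, here is a minimal sketch of an hourly ETL DAG in Airflow 2.x; the DAG name, task bodies, and schedule are hypothetical placeholders, not the employer's actual pipeline:

```python
# Minimal hourly ETL DAG sketch (Airflow 2.x style).
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # e.g., pull a change batch captured by Debezium/Kafka into staging
    ...

def transform(**context):
    # e.g., normalize types, deduplicate, apply business rules
    ...

def load(**context):
    # e.g., upsert the cleaned batch into the warehouse (Redshift, etc.)
    ...

with DAG(
    dag_id="orders_etl",  # hypothetical pipeline name
    start_date=datetime(2025, 1, 1),
    schedule_interval="@hourly",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3  # linear dependency: extract, then transform, then load
```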
Posted 1 week ago
6.0 years
9 - 11 Lacs
Bengaluru
On-site
Experience: 6+ years Mandatory Skills: Generative AI (GenAI) Python Cloud Experience (Azure preferred) Job Summary: We are looking for a highly skilled and innovative Generative AI Engineer with: Strong foundational knowledge in GenAI concepts Hands-on experience in building Retrieval-Augmented Generation (RAG) pipelines Expertise in Python and prompt engineering Exposure to agent-based architectures or AI agent POCs Working knowledge of DevOps environments Key Responsibilities: Design and optimize GenAI solutions, including RAG pipelines Build and test LLM agent-based architectures Develop scalable Python code with prompt engineering techniques Collaborate across DevOps, Data Science, and Product teams Use Git, Jira, or Azure DevOps to manage progress Contribute to scalable deployment of AI applications Must-Have Qualifications: Solid understanding of Generative AI fundamentals Proven experience with RAG pipelines Strong Python programming and prompt engineering skills Exposure to agent-based frameworks (e.g., LangGraph, Autogen) Familiarity with CI/CD and DevOps practices Good-to-Have Skills: Microsoft Document Intelligence or similar tools Graph Databases (e.g., Neo4j, Amazon Neptune) Exposure to NLP or Computer Vision Tools: Git, Azure DevOps (ADO), Jira Preferred Qualities: Proactive problem-solver Able to work independently and in teams Strong communication and documentation skills Job Type: Contractual / Temporary Contract length: 6-12 months Pay: ₹900,000.00 - ₹1,100,000.00 per year Work Location: In person
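For context on what a basic RAG step looks like, here is a minimal Python sketch of retrieval plus prompt assembly; embed() is a random-vector placeholder standing in for a real embedding model, and the chunk store and prompt wording are hypothetical:

```python
# Minimal RAG sketch: rank context chunks by cosine similarity, then
# build a grounded prompt for an LLM call (the call itself is omitted).
import numpy as np

def embed(text: str) -> np.ndarray:
    # Placeholder: a real pipeline would call an embedding model/API here.
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.normal(size=384)
    return v / np.linalg.norm(v)

def retrieve(query: str, chunks: list[str], k: int = 3) -> list[str]:
    q = embed(query)
    # Vectors are unit-normalized, so the dot product is cosine similarity.
    return sorted(chunks, key=lambda c: float(embed(c) @ q), reverse=True)[:k]

def build_prompt(query: str, context: list[str]) -> str:
    joined = "\n---\n".join(context)
    return (
        "Answer using only the context below. If the answer is not in the "
        f"context, say so.\n\nContext:\n{joined}\n\nQuestion: {query}"
    )

print(build_prompt("What is the refund policy?",
                   retrieve("What is the refund policy?",
                            ["Refunds within 30 days.", "Shipping is free."], k=1)))
```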
Posted 1 week ago
0 years
5 - 7 Lacs
Bengaluru
On-site
A Principal AI Engineer leads the design, development, and deployment of advanced AI systems. This role blends deep technical expertise with strategic leadership to drive innovation across AI initiatives. Key Responsibilities Architect AI Solutions: Design scalable AI systems using frameworks like LangChain, LangGraph, or Strands. Lead GenAI Projects: Build and optimize GenAI agents using RAG (Retrieval-Augmented Generation), knowledge graphs, and LLMs. Mentor Teams: Guide junior engineers and foster a culture of learning and experimentation. Collaborate Cross-Functionally: Work with product managers, data scientists, and stakeholders to align AI strategies with business goals. Stay Ahead of Trends: Research emerging technologies and integrate best practices into development cycles. Ensure Scalability & Reliability: Maintain high-performance AI platforms with robust data pipelines and cloud infrastructure. Preferred Skills Experience with multi-agent systems, reinforcement learning, and graph databases (e.g., Neo4j, Gremlin) Familiarity with MLOps, AutoGPT, CrewAI, and OpenAI Function Calling Contributions to open-source AI projects or publications in top-tier conferences About Virtusa Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
Posted 1 week ago
4.0 years
0 Lacs
Gurgaon, Haryana, India
Remote
ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you’ll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers and consumers, worldwide. ZSers drive impact by bringing a client first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning; bold ideas; courage and passion to drive life-changing impact to ZS. Our most valuable asset is our people. At ZS we honor the visible and invisible elements of our identities, personal experiences and belief systems—the ones that comprise us as individuals, shape who we are and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about. What you’ll do: We are looking for experienced Knowledge Graph developers who have the following set of technical skillsets and experience. Undertake complete ownership in accomplishing activities and assigned responsibilities across all phases of the project lifecycle to solve business problems across one or more client engagements. Apply appropriate development methodologies (e.g., agile, waterfall) and best practices (e.g., mid-development client reviews, embedded QA procedures, unit testing) to ensure successful and timely completion of assignments. Collaborate with other team members to leverage expertise and ensure seamless transitions; Exhibit flexibility in undertaking new and challenging problems and demonstrate excellent task management. Assist in creating project outputs such as business case development, solution vision and design, user requirements, prototypes, and technical architecture (if needed), test cases, and operations management. Bring transparency in driving assigned tasks to completion and report accurate status. Bring a consulting mindset to problem solving and innovation by leveraging technical and business knowledge/expertise, and collaborate across other teams. Assist senior team members and delivery leads in project management responsibilities. Build complex solutions using programming languages, ETL service platforms, etc. What you’ll bring: Bachelor’s or master’s degree in computer science, Engineering, or a related field. 4+ years of professional experience in Knowledge Graph development in Neo4j, AWS Neptune, or Anzo knowledge graph databases. 3+ years of experience in RDF ontologies, data modelling & ontology development Strong expertise in Python, PySpark, SQL Strong ability to identify data anomalies, design data validation rules, and perform data cleanup to ensure high-quality data. Project management and task planning experience, ensuring smooth execution of deliverables and timelines. Strong communication and interpersonal skills to collaborate with both technical and non-technical teams. Experience with automation testing Performance Optimization: Knowledge of techniques to optimize knowledge graph operations like data inserts.
Data Modeling: Proficiency in designing effective data models within Knowledge Graph, including relationships between tables and optimizing data for reporting. Motivation and willingness to learn new tools and technologies as per the team’s requirements. Additional Skills: Strong communication skills, both verbal and written, with the ability to structure thoughts logically during discussions and presentations Experience in pharma or life sciences data: Familiarity with pharmaceutical datasets, including product, patient, or healthcare provider data, is a plus. Experience in manufacturing data is a plus Capability to simplify complex concepts into easily understandable frameworks and presentations Proficiency in working within a virtual global team environment, contributing to the timely delivery of multiple projects Travel to other offices as required to collaborate with clients and internal project teams Perks & Benefits: ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. Our robust skills development programs, multiple career progression options and internal mobility paths and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections. Travel: Travel is a requirement at ZS for client-facing ZSers; business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures. Considering applying? At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law. To Complete Your Application: Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE. Find Out More At: www.zs.com
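Since the posting highlights optimizing knowledge-graph inserts, here is a brief, hedged sketch of the standard UNWIND batching pattern via the official Neo4j Python driver; the connection URI, credentials, and the Product node model are hypothetical:

```python
# Batched Neo4j insert via UNWIND: one round trip per batch rather than
# one query per node, a common bulk-load optimization.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687",
                              auth=("neo4j", "password"))

rows = [
    {"id": "P1", "name": "Aspirin"},
    {"id": "P2", "name": "Ibuprofen"},
]

query = """
UNWIND $rows AS row
MERGE (p:Product {id: row.id})
SET p.name = row.name
"""

with driver.session() as session:
    session.run(query, rows=rows)  # MERGE keeps re-runs idempotent
driver.close()
```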
Posted 1 week ago
4.0 years
0 Lacs
Pune, Maharashtra, India
Remote
AI Engineer – Computer Vision, NLP & Deep Learning Type: Full Time | Location: Pune Desired Experience: 4+ years Job Description Role Develop, fine-tune, and deploy deep learning models for computer vision and NLP use cases using PyTorch or TensorFlow. Design and maintain model pipelines, data loaders, and API layers in Python for scalable inference and integration. Implement frontend and backend integrations of AI models using TypeScript/JavaScript (e.g., Node.js, LangChain.js, Transformers.js). Work with datasets for annotation, augmentation, and visualization using tools like LabelImg, Albumentations, and FiftyOne. Build semantic search and recommendation systems using vector databases like Pinecone, Weaviate, or Milvus. Integrate NoSQL and graph-based storage systems like MongoDB and Neo4j for AI-related data operations. Collaborate with cross-functional teams to deliver production-ready, observable, and testable ML components. Contribute to infrastructure and CI/CD for AI model deployment and versioning. Document architectures, APIs, model behavior, and performance tuning guidelines. Participate in sprint planning, reviews, and architecture discussions within a remote-first engineering team. Qualifications 5+ years of experience in AI/ML engineering with deep expertise in computer vision, NLP, and model lifecycle management. Advanced proficiency in Python for deep learning, data pipelines, and API development. Hands-on experience with PyTorch or TensorFlow, with strong skills in training, fine-tuning, and optimizing DL models. Experience in TypeScript/JavaScript for integrating AI models into web applications using Node.js or frontend frameworks. Familiarity with C++ for performance-critical or system-level tasks is a plus. Experience working with vision tools (OpenCV, Detectron2/MMDetection) and managing large annotated datasets. Strong understanding of NoSQL (MongoDB, Redis), vector databases (Pinecone, Milvus), and graph DBs (Neo4j). Proficiency in building scalable ML services, with observability (logging, metrics, alerting) and test coverage. Exposure to orchestration tools (Airflow, Prefect) and cloud deployment workflows (Azure preferred). Comfortable working in remote, agile teams with strong communication and problem-solving skills. Ability to work independently and drive AI projects from experimentation to production.
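To ground the fine-tuning work described here, below is a minimal, hedged PyTorch sketch of one common pattern: freezing a pretrained backbone and training a new classification head. The model choice, class count, and hyperparameters are illustrative only:

```python
# Transfer-learning sketch: freeze a pretrained ResNet backbone and
# train only a new classification head.
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights="DEFAULT")      # downloads pretrained weights
for p in model.parameters():
    p.requires_grad = False                     # freeze the backbone
model.fc = nn.Linear(model.fc.in_features, 5)   # new 5-class head (trainable)

optimizer = torch.optim.AdamW(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

def train_step(images: torch.Tensor, labels: torch.Tensor) -> float:
    model.train()
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()

# Smoke test on random data shaped like a batch of 224x224 RGB images.
print(train_step(torch.randn(4, 3, 224, 224), torch.randint(0, 5, (4,))))
```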
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
You should have expertise in Neo4j, including its core concepts, the Cypher query language, and best practices. You will be responsible for designing and implementing graph database solutions, creating and maintaining graph schemas, models, and architectures. Your role will involve migrating data from relational or other databases into Neo4j and optimizing Cypher queries for performance to ensure efficient data retrieval and manipulation. It is essential to have familiarity with graph theory, graph data modeling, and other graph database technologies. You will also be developing and optimizing Cypher queries, as well as integrating Neo4j with BI and other systems. Additionally, it would be good to have experience in developing Spark applications using Scala or Python (PySpark) for data transformation, aggregation, and analysis. You should also be able to develop and maintain Kafka-based data pipelines, create and optimize Spark applications using Scala and PySpark, and have proficiency in the Hadoop ecosystem big data tech stack (HDFS, YARN, MapReduce, Hive, Impala). Furthermore, hands-on expertise in building Neo4j graph solutions, Spark (Scala, Python) for data processing and analysis, Kafka for real-time data ingestion and processing, ETL processes, data ingestion tools, PySpark, Scala, and Kafka would be beneficial for this role. You will be part of the Technology department in the Applications Development job family, working full-time to leverage your skills and experience in graph databases, data processing, and big data technologies to drive impactful business solutions. If you require a reasonable accommodation due to a disability to use search tools or apply for a career opportunity, please review Citi's Accessibility information. Additionally, you can refer to Citi's EEO Policy Statement and the Know Your Rights poster for more details.
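Since Cypher performance tuning is central here, a brief illustrative sketch via the official Python driver follows; the :Account data model, index name, and credentials are hypothetical, not from the posting:

```python
# Two staple Cypher optimizations: a schema index so the anchor lookup
# is an index seek rather than a label scan, and parameterized queries
# so Neo4j can cache the execution plan across calls.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687",
                              auth=("neo4j", "password"))

with driver.session() as session:
    session.run("CREATE INDEX account_id IF NOT EXISTS "
                "FOR (a:Account) ON (a.id)")

    result = session.run(
        "MATCH (a:Account {id: $id})-[:TRANSFERRED_TO*1..2]->(b:Account) "
        "RETURN DISTINCT b.id AS counterparty",
        id="A-1001",
    )
    for record in result:
        print(record["counterparty"])
driver.close()
```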
Posted 1 week ago
10.0 - 18.0 years
0 Lacs
Pune, Maharashtra
On-site
You have 10 to 18 years of relevant experience in Data Science. As a Data Scientist, your responsibilities will include modeling and data processing using Scala Spark/PySpark. You should have expert-level knowledge of Python for data science purposes. Additionally, you will be required to work on data science concepts, model building using sklearn/PyTorch, and graph analytics using NetworkX, Neo4j, or similar graph databases. Experience in model deployment and monitoring (MLOps) is also desirable. The required skills for this Data Science position include: - Data Science - Python - Scala - Spark/PySpark - MLOps - GraphDB - Neo4j - NetworkX Our hiring process consists of the following steps: 1. Screening (HR Round) 2. Technical Round 1 3. Technical Round 2 4. Final HR Round This position is based in Pune.
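As a small illustration of the graph-analytics skills listed (NetworkX), here is a sketch ranking nodes in a toy directed graph with PageRank; the edge list is made up:

```python
# Build a small directed graph and compute two structural signals.
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([
    ("acct_a", "acct_b"), ("acct_b", "acct_c"),
    ("acct_c", "acct_a"), ("acct_d", "acct_c"),
])

# PageRank: influence of each node under random-walk dynamics.
for node, score in sorted(nx.pagerank(G, alpha=0.85).items(),
                          key=lambda kv: -kv[1]):
    print(f"{node}: {score:.3f}")

# Degree centrality: a cheaper, purely local signal.
print(nx.degree_centrality(G))
```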
Posted 1 week ago
6.0 years
20 - 35 Lacs
India
On-site
Data Engineer (6–8 Years) | Hyderabad, India | SaaS Product | MongoDB | Finance Automation Resourcedekho is hiring for a leading client in the agentic AI-based finance automation space. We’re looking for a passionate and experienced Data Engineer to join a high-impact team in Hyderabad. Why Join Us? Open budget for the right talent—compensation based on your expertise and interview performance. Work with cutting-edge technologies in a high-growth, product-driven environment. Collaborate with top minds from reputed institutions (IIT/IIM or similar). What You’ll Do: Design, build, and optimize robust data pipelines for ingesting, processing, and transforming data from diverse sources. Implement and maintain ETL workflows using tools like Debezium, Kafka, Airflow, Jenkins. Develop and optimize SQL/NoSQL schemas, queries, and stored procedures for efficient data retrieval. Work with both relational (MySQL, PostgreSQL) and NoSQL (MongoDB, DocumentDB) databases. Design and implement scalable data warehouse solutions for analytics and ML applications. Collaborate with data scientists and ML engineers to prepare data for AI/ML models. Ensure data quality, monitoring, and alerting for accuracy and reliability. Optimize query performance through indexing, partitioning, and query refactoring. Maintain comprehensive documentation for data models, pipelines, and processes. Stay updated with the latest in data engineering tech and best practices. What We’re Looking For: 6+ years of experience in data engineering or related roles. Strong proficiency in SQL and experience with MySQL, PostgreSQL. Hands-on expertise with MongoDB (or AWS DocumentDB)—mandatory. Proven experience designing and optimizing ETL processes (Kafka, Debezium, Airflow, etc.). Solid understanding of data modeling, warehousing, and performance optimization. Experience with AWS data services (RDS, Redshift, S3, Glue, Kinesis, ELK stack). Proficiency in at least one programming language (Python, Node.js, Java). Experience with Git and CI/CD pipelines. Bachelor’s degree in Computer Science, Engineering, or related field. SaaS product experience is a must. Preference for candidates from reputed colleges (IIT/IIM or similar) and with a stable career history. Bonus Points For: Experience with graph databases (Neo4j, Amazon Neptune). Knowledge of big data tech (Hadoop, Spark, Hive, data lakes). Real-time/streaming data processing. Familiarity with data governance, security, Docker, Kubernetes. FinTech or financial back-office domain experience. Startup/high-growth environment exposure. Ready to take your data engineering career to the next level? Apply now or reach out to us at career@resourceDekho.com to learn more! Please note: Only candidates with relevant SaaS product experience and strong MongoDB skills will be considered. Job Type: Full-time Pay: ₹2,000,000.00 - ₹3,500,000.00 per year Application Deadline: 22/07/2025 Expected Start Date: 18/08/2025
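Given the mandatory MongoDB emphasis, here is a short, hedged PyMongo sketch of the everyday pattern of indexing a query path and aggregating over it; the connection string, database, and field names are hypothetical:

```python
# Index a filter/sort pattern, then run a grouped aggregation.
from pymongo import MongoClient, ASCENDING

client = MongoClient("mongodb://localhost:27017")
invoices = client["finance"]["invoices"]

# Compound index supporting queries that filter on status and sort by date.
invoices.create_index([("status", ASCENDING), ("created_at", ASCENDING)])

pipeline = [
    {"$match": {"status": "open"}},
    {"$group": {"_id": "$customer_id",
                "total": {"$sum": "$amount"},
                "count": {"$sum": 1}}},
    {"$sort": {"total": -1}},
    {"$limit": 10},
]
for doc in invoices.aggregate(pipeline):
    print(doc)  # top customers by open invoice total
```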
Posted 1 week ago
10.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Title: Data Scientist Location: [Insert Location] Experience: 5–10 years (flexible based on expertise) Employment Type: Full-Time Compensation: [Insert Budget / Competitive as per industry standards] About the Role: We are looking for a highly skilled and innovative Data Scientist with deep expertise in Machine Learning, AI, and Cloud Technologies to join our dynamic analytics team. The ideal candidate will have hands-on experience in NLP, LLMs, Computer Vision, and advanced statistical techniques, along with the ability to lead cross-functional teams and drive data-driven strategies in a fast-paced environment. Key Responsibilities: Develop and deploy end-to-end machine learning pipelines including data preprocessing, modeling, evaluation, and production deployment. Work on cutting-edge AI/ML applications such as LLM fine-tuning, NLP, Computer Vision, Hybrid Recommendation Systems, and RAG/CAG techniques. Leverage platforms like AWS (SageMaker, EC2) and Databricks for scalable model development and deployment. Handle data at scale using Spark, Python, SQL, and integrate with NoSQL and Vector Databases (Neo4j, Cassandra). Design interactive dashboards and visualizations using Tableau for actionable insights. Collaborate with cross-functional stakeholders to translate business problems into analytical solutions. Guide data curation efforts and ensure high-quality training datasets for supervised and unsupervised learning. Lead initiatives around AutoML, XGBoost, Topic Modeling (LDA/LSA), Doc2Vec, and Object Detection & Tracking. Drive agile practices including Sprint Planning, Resource Allocation, and Change Management. Communicate results and recommendations effectively to executive leadership and business teams. Mentor junior team members and foster a culture of continuous learning and innovation. Technical Skills Required: Programming: Python, SQL, Spark Machine Learning & AI: NLP, LLMs, Deep Learning, Computer Vision, Hybrid Recommenders Techniques: RAG, CAG, LLM Fine-tuning, Statistical Modeling, AutoML, Doc2Vec Data Platforms: AWS (SageMaker, EC2), Databricks Databases: SQL, NoSQL, Neo4j, Cassandra, Vector DBs Visualization Tools: Tableau Certifications (Preferred): IBM Data Science Specialization Deep Learning Nanodegree (Udacity) SAFe® DevOps Practitioner Certified Agile Scrum Master Professional Competencies: Proven experience in team leadership, stakeholder management, and strategic planning. Strong cross-functional collaboration and ability to drive alignment across product, engineering, and analytics teams. Excellent problem-solving, communication, and decision-making skills. Ability to manage conflict resolution, negotiation, and performance optimization within teams.
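As one concrete example of the topic-modeling work named above (LDA), here is a minimal scikit-learn sketch; the four-document corpus and topic count are toy placeholders:

```python
# Fit a 2-topic LDA model on a toy corpus and print top terms per topic.
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

docs = [
    "loan interest rate mortgage credit",
    "credit card payment balance interest",
    "image model detection object vision",
    "vision transformer image classification",
]

vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(docs)                     # document-term counts

lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(X)

terms = vec.get_feature_names_out()
for i, comp in enumerate(lda.components_):
    top = [terms[j] for j in comp.argsort()[-4:][::-1]]
    print(f"topic {i}: {top}")
```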
Posted 2 weeks ago
50.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Your Team Responsibilities The Data Technology team at MSCI is responsible for meeting the data requirements across various business areas, including Index, Analytics, and Sustainability. Our team collates data from multiple sources such as vendors (e.g., Bloomberg, Reuters), website acquisitions, and web scraping (e.g., financial news sites, company websites, exchange websites, filings). This data can be in structured or semi-structured formats. We normalize the data, perform quality checks, assign internal identifiers, and release it to downstream applications. Your Key Responsibilities As data engineers, we build scalable systems to process data in various formats and volumes, ranging from megabytes to terabytes. Our systems perform quality checks, match data across various sources, and release it in multiple formats. We leverage the latest technologies, sources, and tools to process the data. Some of the exciting technologies we work with include Snowflake, Databricks, and Apache Spark. Your Skills And Experience That Will Help You Excel Core Java, Spring Boot, Apache Spark, Spring Batch, Python. Exposure to SQL databases like Oracle, MySQL, and Microsoft SQL Server is a must. Any experience/knowledge/certification in cloud technology, preferably Microsoft Azure or Google Cloud Platform, is good to have. Exposure to NoSQL databases like Neo4j or document databases is also good to have. About MSCI What we offer you Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing. Flexible working arrangements, advanced technology, and collaborative workspaces. A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results. A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients. Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development. Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles. We actively nurture an environment that builds a sense of inclusion, belonging and connection, including eight Employee Resource Groups. All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women’s Leadership Forum. At MSCI we are passionate about what we do, and we are inspired by our purpose – to power better investment decisions. You’ll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer.
It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries. To all recruitment agencies MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes. Note on recruitment scams We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com
Posted 2 weeks ago
7.0 - 10.0 years
13 - 17 Lacs
Bengaluru
Work from Office
Our Team: The Integration Process & Tooling Landscape (ITL) function plays a strategic role in orchestrating diverse ecosystems for Siemens Healthineers' global engineering communities. This function is instrumental in delivering comprehensive solutions across a broad spectrum of operational domains, including: Manufacturing and Shop-Floor Management Requirements Engineering Materials Management Engineering Designs Software Realization System Simulations Verification & Validation Development of customized Generative AI solutions Through these efforts, the ITL function significantly enhances capabilities and supports the global organization's objectives. Role Overview: We are seeking a highly skilled and adaptable individual to join our team, focusing on driving innovation and delivering robust solutions. This role requires a blend of technical expertise, collaborative spirit, and a proactive approach to problem-solving. Key Qualifications and Attributes: Education & Experience: Comes from a Computer Science background, with 7-10 years of prototyping experience. Integration Expertise: Proficient in integration principles, with a strong ability to understand and work with data schemas across diverse IT landscapes, including platforms such as Teamcenter, Polarion, MagicDraw, Azure DevOps, etc. Data Management & Engineering: Demonstrated hands-on experience with industry-standard platforms for data management, data streaming, and graph modeling, including technologies like MongoDB, SnapLogic, Apache Kafka, and Neo4j. Programming Proficiency: A seasoned programmer with extensive experience in Python and C#, capable of developing high-quality, scalable solutions. Collaborative Prototyping: Proven experience collaborating directly with Senior Key Experts and Product Management groups, translating strategic visions into functional working prototypes. Prototype Development & Design: Prior experience in the pre-development phase of prototypes, including system or software simulations and detailed design, coupled with a meticulous approach to requirement elicitation with subject matter experts. Exceptional Communication: An outstanding communicator, adept at deconstructing complex challenges into clear, actionable problem statements through constructive questioning and effective stakeholder engagement. Innovation & Resilience: Possesses the courage to iterate rapidly on innovations and prototypes, embracing iterative learning and suggesting necessary course corrections to achieve successful outcomes. Accountable Contributor: Functions as a highly independent contributor, taking full accountability for commitments, understanding the broader impact of deliverables, and consistently ensuring project success. Adaptive Learning: A rapid learner with a strong capability to swiftly adapt to and master new technologies as project demands evolve. AI-Enhanced Productivity: Demonstrates an appreciation for and ability to leverage advanced AI platforms, including Copilot and other Generative AI tools, to significantly enhance productivity and accelerate development cycles. Agile Development: Proficient in adapting to evolving requirements within prototype development and adept at prioritizing deliveries to meet dynamic project needs.
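To make the data-streaming side of this prototyping concrete, here is a minimal, hedged sketch using the kafka-python client; the broker address, topic name, and event payload are hypothetical:

```python
# Produce one JSON event to a topic and read it back.
import json
from kafka import KafkaProducer, KafkaConsumer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("shopfloor.events", {"station": "S12", "state": "RUNNING"})
producer.flush()  # block until the broker acknowledges the send

consumer = KafkaConsumer(
    "shopfloor.events",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
for msg in consumer:
    print(msg.value)  # e.g., route into MongoDB or a Neo4j graph model
    break
```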
Posted 2 weeks ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
We are seeking a skilled Data Engineer to join our growing data team in India. You will be responsible for designing, building, and maintaining scalable data infrastructure and pipelines that enable data-driven decision making across our organization and client projects. This role offers the opportunity to work with cutting-edge technologies and contribute to innovative data solutions for global clients. What you do Technical Skills Minimum 3+ years of experience in data engineering or related field Strong programming skills in Python and/or Scala/Java Experience with SQL and database technologies (PostgreSQL, MySQL, MongoDB) Hands-on experience with data processing frameworks: Apache Spark, Hadoop ecosystem Apache Kafka for streaming data Apache Airflow or similar workflow orchestration tools Knowledge of data warehouse concepts and technologies Experience with containerization (Docker, Kubernetes) Understanding of data modeling principles and best practices Cloud & Platform Experience Experience with at least one major cloud platform (AWS, Azure, or GCP) Familiarity with cloud-native data services: Data lakes, data warehouses, and analytics services Serverless computing and event-driven architectures Identity and access management for data systems Knowledge of Infrastructure as Code (Terraform, CloudFormation, ARM templates) Data & Analytics Understanding of data governance and security principles Experience with data quality frameworks and monitoring Knowledge of dimensional modeling and data warehouse design Familiarity with business intelligence and analytics tools Understanding of data privacy regulations (GDPR, CCPA) Preferred Qualifications Advanced Technical Skills Experience with modern data stack tools (dbt, Fivetran, Snowflake, Databricks) Knowledge of machine learning pipelines and MLOps practices Experience with event-driven architectures and microservices Familiarity with data mesh and data fabric concepts Experience with graph databases (Neo4j, Amazon Neptune) Industry Experience Experience in digital agency or consulting environment Background in financial services, e-commerce, retail, or customer experience platforms Knowledge of marketing technology and customer data platforms Experience with real-time analytics and personalization systems Soft Skills Strong problem-solving and analytical thinking abilities Excellent communication skills for client-facing interactions Ability to work independently and manage multiple projects Adaptability to rapidly changing technology landscape Experience mentoring junior team members What we ask Data Infrastructure & Architecture Design and implement robust, scalable data architectures and pipelines Build and maintain ETL/ELT processes for batch and real-time data processing Develop data models and schemas optimized for analytics and reporting Ensure data quality, consistency, and reliability across all data systems Platform-Agnostic Development Work with multiple cloud platforms (AWS, Azure, GCP) based on client requirements Implement data solutions using various technologies and frameworks Adapt quickly to new tools and platforms as project needs evolve Maintain expertise across different cloud ecosystems and services Data Pipeline Development Create automated data ingestion pipelines from various sources (APIs, databases, files, streaming) Implement data transformation logic using modern data processing frameworks Build monitoring and alerting systems for data pipeline health Optimize pipeline performance and cost-efficiency
Collaboration & Integration Work closely with data scientists, analysts, and business stakeholders Collaborate with DevOps teams to implement CI/CD for data pipelines Partner with client teams to understand data requirements and deliver solutions Participate in architecture reviews and technical decision-making What we offer You’ll join an international network of data professionals within our organisation. We support continuous development through our dedicated Academy. If you're looking to push the boundaries of innovation and creativity in a culture that values freedom and responsibility, we encourage you to apply. At Valtech, we’re here to engineer experiences that work and reach every single person. To do this, we are proactive about creating workplaces that work for every person at Valtech. Our goal is to create an equitable workplace which gives people from all backgrounds the support they need to thrive, grow and meet their goals (whatever they may be). You can find out more about what we’re doing to create a Valtech for everyone here. Please do not worry if you do not meet all of the criteria or if you have some gaps in your CV. We’d love to hear from you and see if you’re our next member of the Valtech team!
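For the data-quality checks and monitoring this posting mentions, a small rule-based sketch in pandas follows; the column names and rules are illustrative, and a production pipeline would typically use a dedicated framework and raise or alert rather than print:

```python
# Validate a batch before loading it downstream.
import pandas as pd

df = pd.DataFrame({
    "order_id": [1, 2, 2, 4],
    "amount": [100.0, -5.0, 30.0, None],
})

checks = {
    "no_duplicate_ids": df["order_id"].is_unique,
    "no_null_amounts": df["amount"].notna().all(),
    "amounts_positive": (df["amount"].dropna() > 0).all(),
}

failures = [name for name, ok in checks.items() if not ok]
if failures:
    print("data quality failed:", failures)  # would block the load / page on-call
else:
    print("all checks passed")
```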
Posted 2 weeks ago
6.0 years
0 Lacs
India
Remote
Ekyam.ai: Integrating Systems, Unleashing Intelligence - Join Our India Expansion! Are you ready to solve complex integration challenges and build the next generation of AI-driven retail technology? Ekyam.ai, headquartered in New York, US, is expanding globally and establishing its new team in India! We are looking for talented individuals like you to be foundational members of our Indian presence. Ekyam.ai is developing a groundbreaking AI-native middleware platform that connects disparate retail systems (ERP, OMS, WMS, POS, etc.) and creates a unified, real-time, vectorized data layer. We enable intelligent automation and transform how retailers leverage their data by integrating cutting-edge AI capabilities. Role We are seeking an experienced AI Developer (4–6 years) skilled in applying Large Language Models (LLMs) and building AI-driven applications to join our growing team. A significant part of this role involves designing and developing AI Agents within our platform, with an initial focus on integrating external LLM APIs (e.g., OpenAI, Anthropic, Google) via sophisticated prompt engineering and RAG techniques into these agents, built using Python + FastAPI. You will architect the logic for these agents, enabling them to perform complex tasks within our e-commerce and retail data orchestration pipelines. Furthermore, as Ekyam.ai evolves, this role offers the potential to grow into customizing and deploying LLMs in-house, so adaptability and a strong foundation in ML/LLM principles are key. Key Responsibilities AI Agent Development: Design, develop, test, and maintain the core logic for AI Agents within FastAPI services. Orchestrate agent tasks, manage state, interact with platform data/workflows, and integrate LLM capabilities. LLM API Integration & Prompt Engineering: Integrate with external LLM provider APIs. Design, implement, and rigorously test effective prompts for diverse retail-specific tasks (generation, Q&A, summarization). RAG Implementation: Implement and optimize Retrieval-Augmented Generation (RAG) patterns using vector databases to provide relevant context to LLM API calls made by agents. FastAPI Microservice Development: Build and maintain the scalable FastAPI microservices that host AI Agent logic and handle interactions with LLMs and other platform components in a containerized environment (Docker, Kubernetes). Data Processing for AI: Prepare and preprocess data required for effective prompt context, RAG retrieval, and potentially for future fine-tuning tasks. Collaboration & Future Adaptation: Work with cross-functional teams to deliver AI features. Stay updated on LLM advancements and be prepared to learn and contribute to potential future in-house LLM fine-tuning and deployment efforts. Required Skills & Qualifications 3–6 years of hands-on experience in software development with a strong focus on AI/ML application development. Demonstrable experience integrating and utilizing external LLM APIs (e.g., OpenAI, Anthropic, Google) in applications. Proven experience with Prompt Engineering techniques. Strong Python programming skills. Practical experience building and deploying RESTful APIs using FastAPI. Experience designing and implementing application logic for AI-driven features or agents. Understanding and practical experience with RAG concepts and vector databases (Pinecone, FAISS, etc.).
Solid understanding of core Machine Learning concepts and familiarity with frameworks like PyTorch, TensorFlow, or Hugging Face (important for understanding models and future adaptation). Familiarity with cloud platforms (AWS, GCP, or Azure) and containerization (Docker, Kubernetes) for application deployment. Solid problem-solving skills and clear communication abilities. Experience working effectively in an agile environment. Willingness and capacity to learn and adapt towards future work involving deeper LLM customization and deployment. Bachelor's or Master's degree in Computer Science, AI, or a related field. Ability to work independently and collaborate effectively in a remote setting. Preferred Qualifications Experience with frameworks like LangChain or LlamaIndex. Experience with observability and debugging tools for LLM applications, such as LangSmith. Experience with graph databases (e.g., Neo4j) and query languages (e.g., Cypher). Experience with MLOps practices, applicable to both current application monitoring and future model lifecycle management. Experience optimizing API call performance (latency/cost) or model inference. Knowledge of AI security considerations and bias mitigation. Why Join Ekyam.ai? Be a foundational member of our new India team! This role offers a unique blend: build intelligent AI Agents leveraging cutting-edge external LLMs today, while positioning yourself at the forefront of our future plans for deeper AI customization. You'll gain expertise across the AI application stack (APIs, RAG, Agents, potential future MLOps) and collaborate within a vibrant global team shaping the future of AI in e-commerce. We offer competitive compensation that values your current skills and growth potential.
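To sketch how the FastAPI-hosted agent pattern described above can look, here is a minimal, hedged example; retrieve() and call_llm() are hypothetical stand-ins for a vector-database lookup and an external LLM provider call, and the route shape is illustrative:

```python
# A FastAPI endpoint that wraps an LLM call with retrieved context (RAG).
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Ask(BaseModel):
    question: str

def retrieve(question: str) -> list[str]:
    # Placeholder: query a vector DB (Pinecone, FAISS, ...) for top chunks.
    return ["<context chunk 1>", "<context chunk 2>"]

def call_llm(prompt: str) -> str:
    # Placeholder: call an external LLM provider API here.
    return "<model answer>"

@app.post("/ask")
def ask(req: Ask) -> dict:
    context = "\n".join(retrieve(req.question))
    prompt = f"Use only this context:\n{context}\n\nQuestion: {req.question}"
    return {"answer": call_llm(prompt)}

# Run with: uvicorn app:app --reload (assuming this file is app.py)
```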
Posted 2 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Company Description Varseno Solutions is a software and product development company in Pune that specializes in rapid development of engineering software by providing fast, reliable and accurate solutions. As a true software services organization, Varseno Solutions is a long-term partner providing solutions and efficiently managing the software needs of its clients. The team consists of experienced techies who have spent years in the IT industry. The company prides itself on delivering innovative solutions that use tomorrow’s technology today with AI. Role Description To lead the Development team; the candidate should have strong technical and team management skills, including the ability to drive analytical rigor in decision making, possess deep technical insights, and be able to speak in both business and technical terms within the team and with clients. The candidate should ensure all project activities are delivered to agreed quality, time, and process standards, and must ensure adherence to the development process defined at Varseno by the Technology team. The role will also be responsible for managing end-to-end delivery of Varseno products/projects with defined processes. Must Haves: Experience in working on cloud-enabled, independent applications and systems using .Net and open-source technologies. Good understanding and expertise in JavaScript, Angular/ReactJS, HTML, CSS, jQuery, Bootstrap, Vue.js. Strong technical expertise and work experience in .NET, .NET Core, ASP.net, C# and LINQ, MS SQL Server Good technical expertise working with NoSQL solutions such as MongoDB, Neo4J, Redis, Cassandra, etc. Proven ability to use Design Patterns to accomplish scalable architecture. Good understanding of WebAPI, REST. Exposure to Azure technologies and components. Ability to understand customer business and thereby assess the criticality of their needs and requests Proven success implementing client-side MVVM frameworks such as Angular or React as well as expert proficiency with JavaScript. Excellent verbal and written communication skills. Strong analytical and problem-solving skills. Manage and mentor junior team members. Experience with continuous integration, unit testing, static analysis, and automated integration tests. Continuous delivery experience preferred Strong experience in one of the cloud technologies – Azure or AWS Managing change effectively
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
At ITIDATA, an EXL Company, you will be responsible for utilizing Cypher or Gremlin query languages, Neo4j, Python, PySpark, Hive, and Hadoop to work on tasks related to graph theory. Specifically, your role will involve creating and managing knowledge graphs using Neo4j. We are seeking Neo4j developers with 7-10 years of experience in data engineering, including 2-3 years of hands-on experience with Neo4j. If you are looking for an exciting opportunity in graph databases, this position is ideal for you. Key Skills & Responsibilities: - Expertise in Cypher or Gremlin query languages - Strong understanding of graph theory - Experience in creating and managing knowledge graphs using Neo4j - Optimizing performance and scalability of graph databases - Researching & implementing new technology solutions - Working with application teams to integrate graph database solutions Candidates who can be available immediately or within 30 days will be given preference. Join us and be a part of our dynamic team working on cutting-edge graph database technologies.
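As a brief illustration of the knowledge-graph construction this role centers on, here is a hedged sketch run through the Neo4j Python driver; the (Company)-[:SUPPLIES]->(Company) model, data, and credentials are hypothetical:

```python
# MERGE keeps entity nodes unique while adding relationships, which is
# the usual idiom for incrementally building a knowledge graph.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687",
                              auth=("neo4j", "password"))

with driver.session() as session:
    session.run(
        """
        MERGE (a:Company {name: $src})
        MERGE (b:Company {name: $dst})
        MERGE (a)-[:SUPPLIES]->(b)
        """,
        src="Acme Metals", dst="Globex Manufacturing",
    )
    # Variable-length traversal over the resulting graph (up to 3 hops).
    for rec in session.run(
        "MATCH (a:Company)-[:SUPPLIES*1..3]->(b:Company) "
        "RETURN a.name AS src, b.name AS dst"
    ):
        print(rec["src"], "->", rec["dst"])
driver.close()
```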
Posted 2 weeks ago
4.0 - 9.0 years
25 - 40 Lacs
Pune, Chennai
Hybrid
Hi, Wishes from GSN! Pleasure connecting with you. About the job: This is a golden opportunity with a leading BigTech IT Services company, a valued client of GSN HR. Exp Range : 4+ yrs Work Loc : PUNE Work Mode : WFO - Hybrid Work Timing : General CTC Range : 25LPA to 40 LPA ******** Looking for SHORT JOINERS ******** Required Skills : Neo4j Expertise : 4+ yrs of proven, in-depth EXP with Neo4j, including its core concepts (nodes, relationships, properties, labels), architectural components, and deployment models (standalone, causal cluster). Strong in Cypher query language for complex graph traversals, pattern matching and data manipulation. Strong understanding of Neo4j indexing strategies (schema indexes, full-text indexes) and their impact on query performance. Graph Database Solutions: Strong EXP in designing, implementing and maintaining scalable graph database solutions and architectures. Familiarity with graph theory concepts, graph data modeling principles, and their application in real-world scenarios. ******** Looking for SHORT JOINERS ******** If this role excites you and aligns with your aspirations, don't hesitate to call me @ 9840035825 directly or click APPLY. Let's explore this opportunity together! Best Regards, Ananth | GSN | 9840035825 | Google review : https://g.co/kgs/UAsF9W
Posted 2 weeks ago
4.0 - 9.0 years
10 - 20 Lacs
Pune, Bengaluru, Delhi / NCR
Work from Office
Neo4J Developer Locations: India Type: Full-time Experience: 5+ years Functions: Consulting, Finance, Information Technology, Data Governance Industries: Capital Markets, Investment Banking, Alternative Investments, Financial Services, Management Consulting, Information Technology and Services, Business Travel, Healthcare. Role: We are looking for a talented Knowledge Graph Engineer to join our team. As a key member of our data engineering team, you will be responsible for designing, implementing, and optimizing graph databases to efficiently store and retrieve high-dimensional data. Responsibilities: Help design, build, and continuously improve the client's online platform. Design and implement graph databases to efficiently store and retrieve dimensional data. Utilize Neo4j for creating and managing knowledge graphs, ensuring optimal performance and scalability. Research, suggest, and implement new technology solutions following best practices/standards. Develop and maintain knowledge graphs using Neo4j, incorporating domain-specific data and referential integrities. Work with application teams to integrate graph database and knowledge graph solutions within the existing infrastructure. Provide support for query optimization and data modeling for application-specific requirements. Ensure that proper data security measures are implemented for graph databases and knowledge graphs. Requirements : • 5-10 years of experience working with data engineering, with at least 2-3 years of experience working on graph databases. • Proficiency in query languages like Cypher or Gremlin and a solid foundation in graph theory are crucial for success in this position. • Outstanding written and verbal communication skills. • Superior analytical and problem-solving skills. • Experience in working in a dual-shore engagement is preferred.
Posted 2 weeks ago
7.0 years
0 Lacs
Pune, Maharashtra, India
On-site
We are looking for a full stack core software engineer with a deep understanding of Java/Python and its ecosystems, and strong hands-on experience in building high-performing, scalable, enterprise-grade applications. You will be part of a talented software team that works on mission-critical applications. As a full stack core software engineer, your responsibilities include understanding user requirements and working with a development team on the design, implementation and delivery of Java/Python applications while providing expertise in the full software development lifecycle, from concept and design to testing. The candidate will work closely with the business architecture group to design and implement current and target-state business processes using various tools and technologies. The candidate should ideally have knowledge of several of these technologies: the Java/Python/Unix technology stack, Angular, JavaScript, and SQL/NoSQL and graph databases for data storage (we tailor the tools to the needs), integrated with other bank systems via RESTful APIs/web services and Kafka Streams. Qualifications: 7+ years of industry experience, with strong hands-on experience in the development of mission-critical applications using Java/Python technologies, aligning each project with the firm's strategic objectives, and overseeing team operations to ensure project success. Experience with complex system integration projects. Java, Spring, Spring Boot, Spring Cloud, J2EE Design Patterns, REST services. Front-end technologies like JavaScript and Angular, CSS2/CSS3, HTML Strong knowledge of SQL, JDBC, Unix commands. Hands-on database experience in relational (Oracle/DB2) and NoSQL (MongoDB) databases. Hands-on experience working with and deploying applications on the cloud. Hands-on experience with code testing tools like JUnit/Mockito/Cucumber. Deployment acquaintance with Apache Tomcat, OpenShift or other cloud environments. Expertise in test-driven development (JUnit, JMeter), continuous integration (Jenkins), build tools (Maven), version control (Git), and development tools (Eclipse, IntelliJ). Excellent communication skills (written and verbal), ability to work in a team environment. Excellent analytical and problem-solving skills and the ability to work well independently. Experience working with business analysts, database administrators, project managers and technical architects in multiple geographical areas. Experience in the financial services industry is an added advantage. Understanding financial and reporting hierarchies will be beneficial. Education : Bachelor’s or equivalent degree in Computer Science Experience : Minimum 7+ years of relevant experience developing applications/solutions preferably in the financial services industry. Required Skills: Minimum 7+ years of application development experience in Java/Python with: Spring Boot & Microservices; REST Web Services; JPA with Hibernate; Core Java/Python. Minimum 3+ years of hands-on experience in designing architecture for enterprise applications. Angular and JavaScript Experience in working on a native cloud platform. Experience with development IDEs such as Eclipse and IntelliJ Experience with SQL/NoSQL databases such as Oracle, PostgreSQL, Neo4j, and MongoDB Experience with caching frameworks such as Redis. Experience with CI/CD systems such as Helm and Harness. Experience with messaging services such as Kafka. Experience in Python and Unix shell scripting will be an added plus Excellent troubleshooting skills.
Strong problem-solving skills, business acumen, and demonstrated excellent oral and written communication skills with both technical and non-technical audiences. Experience with the Agile software development lifecycle methodology and related tooling, for example JIRA and Scrum.
------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Applications Development
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Most Relevant Skills: Please see the requirements listed above.
------------------------------------------------------
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
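Illustrative only, not part of the listing: the role spans REST web services on a Java/Python stack. A minimal sketch of a Python REST endpoint, assuming Flask as the framework (the listing names Spring Boot on the Java side; Flask is a stand-in choice here):

from flask import Flask, jsonify

app = Flask(__name__)

# In-memory stand-in for the relational stores the listing mentions (Oracle/DB2).
ACCOUNTS = {"A1": {"id": "A1", "balance": 1200.0}}

@app.route("/accounts/<account_id>", methods=["GET"])
def get_account(account_id):
    # Return the resource as JSON, or a 404 in the usual REST style.
    account = ACCOUNTS.get(account_id)
    if account is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(account)

if __name__ == "__main__":
    app.run(port=8080)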
Posted 2 weeks ago
0 years
5 - 8 Lacs
Hyderabad
On-site
Summary: As a key member of the Software Engineering team in the 400-person-strong Informatics organization (NX), you will be responsible for developing next-generation software tools that optimize the acquisition, storage, integration, mining, analysis, visualization and interpretation of chemical, biological, clinical and operational data.
About the Role
Major accountabilities:
Develop state-of-the-art software tools and methodologies to support the discovery process across the entire design-make-test-analyze cycle.
Play a key role in the design and development of tools and technologies for integrating, processing, analyzing and visualizing data at scale.
Operate as part of a cross-functional product team to translate business needs into powerful, functional and beautiful products.
Participate in the full development cycle, from product inception, research and prototyping to production release.
Embrace a bias-to-action mindset, agile development principles, and industry-standard software development best practices.
Balance strong technical and thought leadership with a learning and listening mindset.
Embody and integrate software development best practices into your everyday work and inspire others within the engineering community to emulate these practices.
Ensure adherence to Novartis global Information Security and Quality standards and policies for all products/services. If applicable, ensure Regulatory Compliance (e.g. GLP & GCP) standards and policies for GxP products/services.
Key performance indicators:
Timely execution of projects and data requests.
Feedback from project sponsors and key stakeholders.
Minimum Requirements - Skills:
Experience building commercial-quality cloud-based solutions at scale via various SDLC and product-led approaches, delivering performance, quality and reliability.
Demonstrated ability to act as technical lead of products, enabling the team to be more impactful.
Excellent interpersonal skills with the ability to communicate effectively in a matrix environment.
Experience with modern programming languages (Java, JavaScript, Python, etc.), operating systems and software development environments.
Experience with relational and non-relational databases (Oracle, SQL Server, PostgreSQL, Couch, Mongo, Neo4j, etc.).
Experience with cloud technologies (AWS, Azure) and container technologies (e.g. Docker, Kubernetes).
Experience with web service development.
Experience with software development code management principles and tools (e.g. JIRA, Bitbucket, Jenkins, CI/CD).
Well-structured working style with open and clear communication that enables effective collaboration across multiple teams, sites and time zones.
Attention to detail and passion for the end-user experience.
Languages: English.
Why Novartis: Helping people with disease and their families takes more than innovative science. It takes a community of smart, passionate people like you, collaborating, supporting and inspiring each other, combining to achieve breakthroughs that change patients' lives. Ready to create a brighter future together? https://www.novartis.com/about/strategy/people-and-culture
Join our Novartis Network: Not the right Novartis role for you?
Sign up to our talent community to stay connected and learn about suitable career opportunities as soon as they come up: https://talentnetwork.novartis.com/network
Benefits and Rewards: Read our handbook to learn about all the ways we'll help you thrive personally and professionally: https://www.novartis.com/careers/benefits-rewards
Division: Biomedical Research
Business Unit: Pharma Research
Location: India
Site: Hyderabad (Office)
Company / Legal Entity: IN10 (FCRS = IN010) Novartis Healthcare Private Limited
Functional Area: Research & Development
Job Type: Full time
Employment Type: Regular
Shift Work: No
Accessibility and accommodation: Novartis is committed to working with and providing reasonable accommodation to individuals with disabilities. If, because of a medical condition or disability, you need a reasonable accommodation for any part of the recruitment process, or in order to perform the essential functions of a position, please send an e-mail to [email protected] and let us know the nature of your request and your contact information. Please include the job requisition number in your message.
Novartis is committed to building an outstanding, inclusive work environment and diverse teams representative of the patients and communities we serve.
Posted 2 weeks ago
2.0 years
4 - 6 Lacs
Mohali
Remote
Job Profile: Python Developer with AI Experience (Work From Home Opportunity)
Experience: 2 - 3 years
We are seeking a talented and experienced Python and Django developer with expertise in modern web frameworks, databases, AI development, and front-end technologies. The ideal candidate will work on cutting-edge projects, including AI-driven applications, knowledge graph systems, and scalable web solutions. The job can be either part-time or full-time. A minimum of 2 years of experience is needed.
Key Responsibilities:
Design, develop, and maintain scalable backend systems using Python and Django.
Must have experience with AI and automation tools such as n8n, make.com and Zapier.
Develop and integrate user-facing features with ReactJS or Next.js for responsive, intuitive front-end applications.
Build and optimize knowledge graphs and vector-based databases for advanced data retrieval and AI applications (a minimal retrieval sketch follows this listing).
Collaborate with AI/ML teams to integrate and deploy AI models in production environments.
Implement APIs and microservices to support various application functionalities.
Optimize application performance, ensuring scalability and efficiency.
Research and implement best practices in AI systems, including natural language processing (NLP) and recommendation systems.
Work collaboratively in an Agile environment with cross-functional teams to deliver high-quality solutions.
Troubleshoot, debug, and upgrade existing systems to ensure security and efficiency.
Required Skills and Qualifications:
3+ years of professional experience in Python development.
Proficiency in Django and related backend technologies.
Strong expertise in ReactJS for front-end development.
Hands-on experience with vector databases (e.g., Pinecone, Weaviate) and knowledge graph databases (e.g., Neo4j, Stardog).
Familiarity with building AI models and deploying them in production, including frameworks like TensorFlow or PyTorch.
Knowledge of RESTful APIs and/or GraphQL for efficient data transfer.
Experience with cloud platforms like AWS, Azure, or Google Cloud.
Proficiency with DevOps tools like Docker, Kubernetes, and CI/CD pipelines.
Solid understanding of database systems (SQL and NoSQL).
Experience with version control systems, especially Git.
Knowledge of data structures, algorithms, and system design principles.
Nice-to-Have Skills:
Familiarity with LLMs (Large Language Models) and prompt engineering.
Experience with real-time data streaming frameworks like Apache Kafka.
Knowledge of data visualization tools such as Plotly, Dash, or D3.js.
Knowledge of React.
Understanding of cybersecurity practices and secure coding principles.
Proficiency in other backend frameworks like Flask or FastAPI.
Soft Skills:
Strong problem-solving and analytical skills.
Excellent written and verbal communication abilities.
Team player with the ability to work in a collaborative, fast-paced environment.
A passion for staying updated with the latest technologies and trends.
Job Type: Full-time, Day shift, Night shift, Flexible Schedule.
Working Days: Monday to Friday
Interested candidates can apply directly, share their CVs at hr.infugin@gmail.com, or contact us at 8360228824.
Job Types: Full-time, Permanent
Pay: ₹35,925.20 - ₹50,771.82 per month
Benefits: Flexible schedule; Work from home
Schedule: Day shift; Evening shift; Monday to Friday; Night shift; Rotational shift
Supplemental Pay: Overtime pay; Performance bonus; Yearly bonus
Education: Bachelor's (Required)
Experience: n8n: 2 years (Required); AI Automation Tools: 2 years (Required); Python: 2 years (Required); Django: 2 years (Required); React Js: 1 year (Required)
Language: English (Required)
Work Location: Remote
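Illustrative only, not part of the listing: the role calls for vector databases (Pinecone, Weaviate) for data retrieval. A minimal sketch of the underlying idea (embed, index, rank by cosine similarity), using plain numpy in place of a real vector store; the embed function is a hash-seeded stand-in for a real embedding model:

import numpy as np

def embed(text: str, dim: int = 64) -> np.ndarray:
    # Toy embedding: a pseudo-random unit vector seeded from the text's hash
    # (stable within one run). A real system would call an embedding model.
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.standard_normal(dim)
    return v / np.linalg.norm(v)

docs = ["Django deployment checklist", "Pinecone index tuning", "n8n workflow triggers"]
index = np.stack([embed(d) for d in docs])

def search(query: str, k: int = 2):
    q = embed(query)
    scores = index @ q  # cosine similarity, since all vectors are unit-norm
    top = np.argsort(scores)[::-1][:k]
    return [(docs[i], float(scores[i])) for i in top]

print(search("workflow automation"))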
Posted 2 weeks ago
4.0 - 7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Key responsibilities: minimum 4-7 years of software development/deployment experience, as per the three profiles below.
1. Full-Stack Developer (API & Serverless Specialist)
Experience: 5-7 years
Develop RESTful & GraphQL APIs using Node.js and TypeScript.
Implement serverless software development using AWS Lambda (Node.js & TypeScript); a minimal handler sketch follows this listing.
Work with SQL & NoSQL databases, including schema design, performance tuning, and debugging.
Strong experience with AWS services such as S3, RDS, DynamoDB, Cognito, IAM, SQS, and CloudWatch.
Write unit tests using Jest, Mocha, or Chai.
Experience with CI/CD pipelines, Terraform, and CloudFormation.
Good understanding of software design principles (OOD, SOLID, and design patterns).
Preferred: AWS Certification; experience with GraphDB (Neo4j, AWS Timestream).
2. Full-Stack Developer (IoT & AWS Cloud Solutions Specialist)
Experience: 6-8 years
Design and develop enterprise IoT solutions using AWS IoT Core and related AWS services.
Build real-time, event-driven architectures integrating IoT devices with cloud applications.
Implement React.js-based frontend applications for IoT platforms.
Develop APIs and backend services using Node.js, TypeScript, REST & GraphQL.
Strong understanding of AWS services such as AWS IoT, S3, DynamoDB, Lambda, SQS, and IAM.
Strong experience with microservices and event-driven architecture.
Preferred: Experience with GraphDB (Neo4j), AWS Timestream, real-time dashboards.
3. Full-Stack Developer (Cloud & DevOps Focused)
Experience: 7-9 years
Design and develop cloud-based applications using Node.js, TypeScript, and React.js.
Work extensively with AWS services, including Lambda, S3, RDS, DynamoDB, Cognito, IAM, and CloudWatch.
Implement CI/CD pipelines in AWS environments using Terraform, CloudFormation, and CircleCI.
Design and implement secure APIs (OpenAPI standards preferred).
Strong knowledge of object-oriented programming (OOD, SOLID principles, design patterns).
Experience in unit testing & automation using Jest, Mocha, and Chai.
Preferred: AWS Certification; experience with GraphDB (Neo4j), IoT solutions, or real-time data processing.
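Illustrative only, not part of the listing: the listing puts Lambda work in Node.js/TypeScript; to keep one language across the sketches in this document, here is the same handler shape in AWS Lambda's Python runtime, assuming an API Gateway proxy integration (the event fields and response format follow that integration's documented contract):

import json

def handler(event, context):
    # Minimal AWS Lambda handler (API Gateway proxy integration):
    # 'event' carries the HTTP request; return a proxy-format response.
    device_id = (event.get("pathParameters") or {}).get("deviceId", "unknown")
    body = {"deviceId": device_id, "status": "ok"}
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(body),
    }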
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
Hyderabad, Telangana
On-site
You will be responsible for designing architectures for meta-learning, self-reflective agents, and recursive optimization loops. Your role will involve building simulation frameworks for behavior grounded in Bayesian dynamics, attractor theory, and teleo-dynamics. Additionally, you will develop systems that integrate graph rewriting, knowledge representation, and neurosymbolic reasoning (a toy graph-rewriting sketch follows this listing). Conducting research on fractal intelligence structures, swarm-based agent coordination, and autopoietic systems will be part of your responsibilities. You are expected to advance Mobius's knowledge graph with ontologies supporting logic, agency, and emergent semantics. Integration of logic into distributed, policy-scoped decision graphs aligned with business and ethical constraints is crucial. Furthermore, you will publish cutting-edge results and mentor contributors in reflective system design and emergent AI theory. Finally, you will build scalable simulations of multi-agent, goal-directed, and adaptive ecosystems within the Mobius runtime.
Qualifications: proven expertise in meta-learning, recursive architectures, and AI safety; proficiency in distributed systems, multi-agent environments, and decentralized coordination; strong implementation skills in Python, with additional proficiency in C++, functional, or symbolic languages being a plus; and a publication record in areas intersecting AI research, complexity science, and/or emergent systems.
Preferred qualifications: experience with neurosymbolic architectures and hybrid AI systems; fractal modeling, attractor theory, and complex adaptive dynamics; topos theory, category theory, and logic-based semantics; knowledge ontologies, OWL/RDF, and semantic reasoners; autopoiesis, teleo-dynamics, and biologically inspired system design; and swarm intelligence, self-organizing behavior, emergent coordination, and distributed learning systems.
Technical proficiency: programming languages such as Python (required) and C++, Haskell, Lisp, or Prolog (preferred for symbolic reasoning); frameworks like PyTorch and TensorFlow; distributed systems including Ray, Apache Spark, Dask, and Kubernetes; knowledge technologies like Neo4j, RDF, OWL, and SPARQL; experiment management tools like MLflow and Weights & Biases; and GPU/HPC systems like CUDA, NCCL, and Slurm. Familiarity with formal modeling tools like Z3, TLA+, Coq, and Isabelle is also beneficial.
Core research domains: recursive self-improvement and introspective AI; graph theory, graph rewriting, and knowledge graphs; neurosymbolic systems and ontological reasoning; fractal intelligence and dynamic attractor-based learning; Bayesian reasoning under uncertainty and cognitive dynamics; swarm intelligence and decentralized consensus modeling; topos theory and the abstract structure of logic spaces; autopoietic, self-sustaining system architectures; and teleo-dynamics and goal-driven adaptation in complex systems.
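Illustrative only, not part of the listing: "graph rewriting" refers to transforming a graph by applying local rules. A toy sketch under that reading: one rule that collapses marked intermediate nodes by bridging their neighbors; the node names and rule are made up for illustration:

# Toy graph-rewriting step: for every path a -> b -> c where b is marked
# 'intermediate', add a direct edge a -> c, then drop edges touching b.
def rewrite_collapse_intermediate(edges, intermediate):
    out = set(edges)
    for (a, b) in edges:
        if b in intermediate:
            for (b2, c) in edges:
                if b2 == b:
                    out.add((a, c))
    return {(a, c) for (a, c) in out
            if a not in intermediate and c not in intermediate}

edges = {("parse", "normalize"), ("normalize", "plan"), ("plan", "act")}
print(rewrite_collapse_intermediate(edges, {"normalize"}))
# -> {('parse', 'plan'), ('plan', 'act')}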
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
Karnataka
On-site
You are a Data Science Engineer who will contribute to the development of intelligent, autonomous AI systems. The ideal candidate has a strong background in agentic AI, LLMs, SLMs, vector databases, and knowledge graphs. Your responsibilities will include deploying AI solutions that leverage technologies such as Retrieval-Augmented Generation (RAG), multi-agent frameworks, and hybrid search techniques to enhance enterprise applications. As part of the flexible scheme, you will enjoy benefits such as a best-in-class leave policy, gender-neutral parental leave, childcare assistance benefit reimbursement, sponsorship for industry-relevant certifications, an employee assistance program, comprehensive hospitalization insurance, accident and term life insurance, and health screening. Your key responsibilities will involve designing and developing agentic AI applications using frameworks like LangChain, CrewAI, and AutoGen; implementing RAG pipelines (a skeletal sketch follows this listing); fine-tuning language models; training NER models; developing knowledge graphs; collaborating cross-functionally; and optimizing AI workflows. To excel in this role, you should have at least 4 years of professional experience in AI/ML development; proficiency in Python, Python API frameworks, and SQL; and familiarity with AI/ML frameworks like TensorFlow or PyTorch. Experience deploying AI models on cloud platforms and an understanding of LLMs, SLMs, semantic technologies, and MLOps tools are required. Additionally, hands-on experience with vector databases and embedding techniques, and experience developing AI solutions for specific industries, will be beneficial. You will receive support through training, coaching, and a culture of continuous learning to aid your career progression. The company strives for a culture of empowerment, responsibility, commercial thinking, initiative, and collaboration, and promotes a positive, fair, and inclusive work environment for all. For further information about the company and its teams, please visit https://www.db.com/company/company.htm. Join a team that celebrates success and fosters a culture of excellence and inclusivity.
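Illustrative only, not part of the listing: a skeletal sketch of the RAG pipeline shape the listing describes: retrieve context, compose a grounded prompt, generate. The retriever and generate function are stand-ins for a vector store and an LLM client (e.g. via LangChain), not a real integration:

def retrieve(query, corpus, k=2):
    # Naive keyword-overlap retriever standing in for a vector database.
    def score(doc):
        return len(set(query.lower().split()) & set(doc.lower().split()))
    return sorted(corpus, key=score, reverse=True)[:k]

def generate(prompt):
    # Stand-in for an LLM call.
    return f"[model answer grounded in: {prompt[:60]}...]"

def rag_answer(query, corpus):
    context = "\n".join(retrieve(query, corpus))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
    return generate(prompt)

corpus = ["KYC rules for onboarding", "trade settlement cycles", "onboarding KYC checklist"]
print(rag_answer("What are the KYC onboarding rules?", corpus))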
Posted 2 weeks ago
5.0 years
0 Lacs
India
On-site
Locations: India
Type: Full-time
Experience: 5+ years
Functions: Consulting, Finance, Information Technology, Data Governance
Industries: Capital Markets, Investment Banking, Alternative Investments, Financial Services, Management Consulting, Information Technology and Services, Business Travel, Healthcare
Role: We are looking for a talented Knowledge Graph Engineer to join our team. As a key member of our data engineering team, you will be responsible for designing, implementing, and optimizing graph databases to efficiently store and retrieve high-dimensional data.
Responsibilities:
Help design, build and continuously improve the client's online platform.
Design and implement graph databases to efficiently store and retrieve dimensional data.
Utilize Neo4j for creating and managing knowledge graphs, ensuring optimal performance and scalability (a minimal driver sketch follows this listing).
Research, suggest and implement new technology solutions following best practices/standards.
Develop and maintain knowledge graphs using Neo4j, incorporating domain-specific data and referential integrities.
Work with application teams to integrate graph database and knowledge graph solutions within existing infrastructure.
Provide support for query optimization and data modeling for application-specific requirements.
Ensure that proper data security measures are implemented for graph databases and knowledge graphs.
Requirements:
5-10 years of experience in data engineering, with at least 2-3 years working on graph databases.
Proficiency in query languages like Cypher or Gremlin and a solid foundation in graph theory are crucial for success in this position.
Outstanding written and verbal communication skills.
Superior analytical and problem-solving skills.
Experience working in dual-shore engagements is preferred.
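Illustrative only, not part of the listing: a minimal sketch of the Neo4j work described, assuming the official neo4j Python driver; the URI, credentials, and the Fund/Company model are placeholders:

from neo4j import GraphDatabase

# Placeholder connection details: adjust for a real deployment.
URI, AUTH = "bolt://localhost:7687", ("neo4j", "password")

CREATE = """
MERGE (c:Company {name: $company})
MERGE (f:Fund {name: $fund})
MERGE (f)-[:INVESTS_IN]->(c)
"""

QUERY = """
MATCH (f:Fund)-[:INVESTS_IN]->(c:Company {name: $company})
RETURN f.name AS fund
"""

def main():
    with GraphDatabase.driver(URI, auth=AUTH) as driver:
        with driver.session() as session:
            # Upsert a small subgraph, then read it back with Cypher.
            session.run(CREATE, company="Acme Corp", fund="Growth Fund I")
            for record in session.run(QUERY, company="Acme Corp"):
                print(record["fund"])

if __name__ == "__main__":
    main()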
Posted 2 weeks ago