7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: Gen AI Developer
Location: Chennai
Salary: Open for discussion
Experience Required: 5-7 years
Employment Type: Full-Time

Role Overview:
We are seeking a highly skilled and innovative Gen AI Developer with hands-on experience in Azure Gen AI services, Python, and advanced machine learning. The ideal candidate will be responsible for building and deploying AI-powered enterprise solutions, working on cutting-edge technologies such as LLMs, RAG, and intelligent document processing.

Key Responsibilities:
- Develop and deploy Azure Gen AI solutions (AI Search, Document Intelligence, Azure Functions, Logic Apps).
- Implement intelligent pipelines using LangChain and Python-based Gen AI tools.
- Build and fine-tune LLMs with capabilities like reasoning agents, RAG (Retrieval-Augmented Generation), and chain-of-thought modeling.
- Design and manage prompt engineering workflows to optimize AI outcomes.
- Construct robust APIs for enterprise AI use cases.
- Work with graph databases such as Cosmos DB and Neo4j to manage and query large-scale AI data.
- Apply domain knowledge in finance and data management for contextual model training.
- Support code deployment and automation using GitHub workflows.
- Collaborate with cross-functional teams to deliver live AI applications and ensure scalability, security, and performance.

Mandatory Skills & Proficiency:
- Azure Gen AI (AI Search, Doc Intelligence, Azure Functions, Logic Apps): Mandatory, proficiency 4/5
- Python (LangChain, related AI frameworks): Mandatory, proficiency 4/5
- Machine Learning: Mandatory, proficiency 4/5
- DevOps (Terraform, etc.): Not mandatory, proficiency 3/5

Preferred Qualifications:
- Experience deploying at least one enterprise AI solution live in production.
- Solid understanding of data pipelines, RAG architecture, and LLM customization.
- Excellent problem-solving and prompt design skills for generative AI.
- Familiarity with DevOps practices and tools like Terraform is a plus.
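The RAG and prompt-engineering responsibilities above follow a common retrieve-then-generate pattern. Below is a minimal, framework-agnostic Python sketch of that flow, not the employer's implementation; the embed, search_index, and complete callables are hypothetical stand-ins for an embedding model, a vector index such as Azure AI Search, and an LLM completion call.

```python
from typing import Callable, List

def answer_with_rag(
    question: str,
    embed: Callable[[str], List[float]],                    # hypothetical: text -> embedding vector
    search_index: Callable[[List[float], int], List[str]],  # hypothetical: vector -> top-k passages
    complete: Callable[[str], str],                         # hypothetical: prompt -> LLM completion
    top_k: int = 4,
) -> str:
    """Retrieval-Augmented Generation in three steps: embed, retrieve, generate."""
    # 1. Embed the user question.
    query_vector = embed(question)

    # 2. Retrieve the top-k most relevant passages from the vector index.
    passages = search_index(query_vector, top_k)
    context = "\n\n".join(passages)

    # 3. Ground the LLM answer in the retrieved context via the prompt.
    prompt = (
        "Answer the question using only the context below.\n"
        f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    )
    return complete(prompt)
```

In a real pipeline the same three seams are where Azure OpenAI embeddings, the search index, and the chat model would plug in; keeping them as injected callables also makes the flow easy to unit test.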
Posted 2 weeks ago
7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: Gen AI Developer
Location: Chennai
Salary: Open for discussion
Experience Required: 5-7 years
Employment Type: Full-Time

Role Overview:
We are seeking a highly skilled and innovative Gen AI Developer with hands-on experience in Azure Gen AI services, Python, and advanced machine learning. The ideal candidate will be responsible for building and deploying AI-powered enterprise solutions, working on cutting-edge technologies such as LLMs, RAG, and intelligent document processing.

Key Responsibilities:
- Develop and deploy Azure Gen AI solutions (AI Search, Document Intelligence, Azure Functions, Logic Apps).
- Implement intelligent pipelines using LangChain and Python-based Gen AI tools.
- Build and fine-tune LLMs with capabilities like reasoning agents, RAG (Retrieval-Augmented Generation), and chain-of-thought modeling.
- Design and manage prompt engineering workflows to optimize AI outcomes.
- Construct robust APIs for enterprise AI use cases.
- Work with graph databases such as Cosmos DB and Neo4j to manage and query large-scale AI data.
- Apply domain knowledge in finance and data management for contextual model training.
- Support code deployment and automation using GitHub workflows.
- Collaborate with cross-functional teams to deliver live AI applications and ensure scalability, security, and performance.

Mandatory Skills & Proficiency:
- Azure Gen AI (AI Search, Doc Intelligence, Azure Functions, Logic Apps): Mandatory, proficiency 4/5
- Python (LangChain, related AI frameworks): Mandatory, proficiency 4/5
- Machine Learning: Mandatory, proficiency 4/5
- DevOps (Terraform, etc.): Not mandatory, proficiency 3/5

Preferred Qualifications:
- Experience deploying at least one enterprise AI solution live in production.
- Solid understanding of data pipelines, RAG architecture, and LLM customization.
- Excellent problem-solving and prompt design skills for generative AI.
- Familiarity with DevOps practices and tools like Terraform is a plus.

Skills: learning, langchain, graph dbs, devops, azure, cosmos db, azure gen ai, machine learning, python, terraform, neo4j, intelligence
Posted 2 weeks ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

RCE - Risk Data Engineer/Leads

Description - External

Job Description:
Our Technology team builds innovative digital solutions rapidly and at scale to deliver the next generation of Financial and Non-Financial services across the globe. This is a senior technical, hands-on delivery role requiring knowledge of data engineering, cloud infrastructure and platform engineering, platform operations, and production support using ground-breaking cloud and big data technologies. The ideal candidate, with 3-6 years of experience, will possess strong technical skills, an eagerness to learn, a keen interest in the three key pillars our team supports (Financial Crime, Financial Risk, and Compliance technology transformation), the ability to work collaboratively in a fast-paced environment, and an aptitude for picking up new tools and techniques on the job, building on existing skillsets as a foundation.

In this role you will:
- Develop a thorough understanding of the data science lifecycle, including data exploration, preprocessing, modelling, validation, and deployment.
- Design, build, and maintain tree-based predictive models, such as decision trees, random forests, and gradient-boosted trees, with a low-level understanding of their algorithms and functioning.
- Ingest and provision raw datasets, enriched tables, and/or curated, re-usable data assets to enable a variety of use cases.
- Evaluate modern technologies, frameworks, and tools in the data engineering space to drive innovation and improve data processing capabilities.

Core/Must-Have Skills:
- Significant data analysis experience using Python, SQL, and Spark.
- Experience in Python or Spark writing scripts for data transformation, integration, and automation tasks.
- 3-6 years' experience with cloud ML (AWS) or similar tools.
- Design, build, and maintain tree-based predictive models, such as decision trees, random forests, and gradient-boosted trees, with a low-level understanding of their algorithms and functioning.
- Strong experience with statistical analytical techniques, data mining, and predictive models.
- Conduct A/B testing and other model validation techniques to ensure the accuracy and reliability of data models.
- Experience with optimization modelling, machine learning, forecasting, and/or natural language processing.
- Hands-on experience with Amazon S3 data storage, data lifecycle policies, and integration with other AWS services.
- Maintain, optimize, and scale AWS Redshift clusters to ensure efficient data storage, retrieval, and query performance.
- Utilize Amazon S3 to store raw data, manage large datasets, and integrate with other AWS services to ensure secure, scalable, and cost-effective data solutions.
- Experience implementing CI/CD pipelines in AWS.
- At least 4+ years of experience in database design and dimension modelling using SQL.
- Advanced working SQL knowledge and experience with relational and NoSQL databases, as well as working familiarity with a variety of databases (SQL Server, Neo4j).
- Strong analytical and critical thinking skills, with the ability to identify and resolve issues in data pipelines and systems.
- Strong communication skills to effectively collaborate with team members and present findings to stakeholders.
- Collaborate with cross-functional teams to ensure successful implementation of solutions.
- Experience with OLAP and OLTP databases, and data structuring/modelling with an understanding of key data points.

Good to have:
- Domain knowledge (if applicable) in financial fraud to enhance predictive modelling and anomaly detection capabilities.
- Knowledge of AWS IAM for managing secure access to data resources.
- Familiarity with DevOps practices and automation tools like Terraform or CloudFormation.
- Experience with data visualization tools like Amazon QuickSight or integrating Redshift data with BI tools (Tableau, Power BI, etc.).
- AWS certifications such as AWS Certified Data Analytics - Specialty or AWS Certified Solutions Architect are a plus.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
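Several of the must-have items in this listing centre on tree-based predictive models and model validation. As a rough illustration only (not EY's codebase), the following scikit-learn sketch trains a gradient-boosted classifier on a synthetic dataset and checks it against a held-out validation split; it assumes scikit-learn is installed, and the synthetic feature table is a stand-in for real risk or fraud data.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic, imbalanced stand-in for a risk/fraud feature table.
X, y = make_classification(
    n_samples=5_000, n_features=20, weights=[0.9, 0.1], random_state=42
)

# Hold out a validation split to check accuracy and reliability of the model.
X_train, X_valid, y_train, y_valid = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

# Gradient-boosted trees: an ensemble of shallow decision trees fit to residual errors.
model = GradientBoostingClassifier(
    n_estimators=200, learning_rate=0.05, max_depth=3, random_state=42
)
model.fit(X_train, y_train)

# Evaluate with ROC AUC, a common metric for imbalanced binary targets.
scores = model.predict_proba(X_valid)[:, 1]
print(f"Validation ROC AUC: {roc_auc_score(y_valid, scores):.3f}")
```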
Posted 2 weeks ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

RCE - Risk Data Engineer/Leads

Description - External

Job Description:
Our Technology team builds innovative digital solutions rapidly and at scale to deliver the next generation of Financial and Non-Financial services across the globe. This is a senior technical, hands-on delivery role requiring knowledge of data engineering, cloud infrastructure and platform engineering, platform operations, and production support using ground-breaking cloud and big data technologies. The ideal candidate, with 3-6 years of experience, will possess strong technical skills, an eagerness to learn, a keen interest in the three key pillars our team supports (Financial Crime, Financial Risk, and Compliance technology transformation), the ability to work collaboratively in a fast-paced environment, and an aptitude for picking up new tools and techniques on the job, building on existing skillsets as a foundation.

In this role you will:
- Develop a thorough understanding of the data science lifecycle, including data exploration, preprocessing, modelling, validation, and deployment.
- Design, build, and maintain tree-based predictive models, such as decision trees, random forests, and gradient-boosted trees, with a low-level understanding of their algorithms and functioning.
- Ingest and provision raw datasets, enriched tables, and/or curated, re-usable data assets to enable a variety of use cases.
- Evaluate modern technologies, frameworks, and tools in the data engineering space to drive innovation and improve data processing capabilities.

Core/Must-Have Skills:
- Significant data analysis experience using Python, SQL, and Spark.
- Experience in Python or Spark writing scripts for data transformation, integration, and automation tasks.
- 3-6 years' experience with cloud ML (AWS) or similar tools.
- Design, build, and maintain tree-based predictive models, such as decision trees, random forests, and gradient-boosted trees, with a low-level understanding of their algorithms and functioning.
- Strong experience with statistical analytical techniques, data mining, and predictive models.
- Conduct A/B testing and other model validation techniques to ensure the accuracy and reliability of data models.
- Experience with optimization modelling, machine learning, forecasting, and/or natural language processing.
- Hands-on experience with Amazon S3 data storage, data lifecycle policies, and integration with other AWS services.
- Maintain, optimize, and scale AWS Redshift clusters to ensure efficient data storage, retrieval, and query performance.
- Utilize Amazon S3 to store raw data, manage large datasets, and integrate with other AWS services to ensure secure, scalable, and cost-effective data solutions.
- Experience implementing CI/CD pipelines in AWS.
- At least 4+ years of experience in database design and dimension modelling using SQL.
- Advanced working SQL knowledge and experience with relational and NoSQL databases, as well as working familiarity with a variety of databases (SQL Server, Neo4j).
- Strong analytical and critical thinking skills, with the ability to identify and resolve issues in data pipelines and systems.
- Strong communication skills to effectively collaborate with team members and present findings to stakeholders.
- Collaborate with cross-functional teams to ensure successful implementation of solutions.
- Experience with OLAP and OLTP databases, and data structuring/modelling with an understanding of key data points.

Good to have:
- Domain knowledge (if applicable) in financial fraud to enhance predictive modelling and anomaly detection capabilities.
- Knowledge of AWS IAM for managing secure access to data resources.
- Familiarity with DevOps practices and automation tools like Terraform or CloudFormation.
- Experience with data visualization tools like Amazon QuickSight or integrating Redshift data with BI tools (Tableau, Power BI, etc.).
- AWS certifications such as AWS Certified Data Analytics - Specialty or AWS Certified Solutions Architect are a plus.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 2 weeks ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

RCE - Risk Data Engineer/Leads

Description - External

Job Description:
Our Technology team builds innovative digital solutions rapidly and at scale to deliver the next generation of Financial and Non-Financial services across the globe. This is a senior technical, hands-on delivery role requiring knowledge of data engineering, cloud infrastructure and platform engineering, platform operations, and production support using ground-breaking cloud and big data technologies. The ideal candidate, with 3-6 years of experience, will possess strong technical skills, an eagerness to learn, a keen interest in the three key pillars our team supports (Financial Crime, Financial Risk, and Compliance technology transformation), the ability to work collaboratively in a fast-paced environment, and an aptitude for picking up new tools and techniques on the job, building on existing skillsets as a foundation.

In this role you will:
- Develop a thorough understanding of the data science lifecycle, including data exploration, preprocessing, modelling, validation, and deployment.
- Design, build, and maintain tree-based predictive models, such as decision trees, random forests, and gradient-boosted trees, with a low-level understanding of their algorithms and functioning.
- Ingest and provision raw datasets, enriched tables, and/or curated, re-usable data assets to enable a variety of use cases.
- Evaluate modern technologies, frameworks, and tools in the data engineering space to drive innovation and improve data processing capabilities.

Core/Must-Have Skills:
- Significant data analysis experience using Python, SQL, and Spark.
- Experience in Python or Spark writing scripts for data transformation, integration, and automation tasks.
- 3-6 years' experience with cloud ML (AWS) or similar tools.
- Design, build, and maintain tree-based predictive models, such as decision trees, random forests, and gradient-boosted trees, with a low-level understanding of their algorithms and functioning.
- Strong experience with statistical analytical techniques, data mining, and predictive models.
- Conduct A/B testing and other model validation techniques to ensure the accuracy and reliability of data models.
- Experience with optimization modelling, machine learning, forecasting, and/or natural language processing.
- Hands-on experience with Amazon S3 data storage, data lifecycle policies, and integration with other AWS services.
- Maintain, optimize, and scale AWS Redshift clusters to ensure efficient data storage, retrieval, and query performance.
- Utilize Amazon S3 to store raw data, manage large datasets, and integrate with other AWS services to ensure secure, scalable, and cost-effective data solutions.
- Experience implementing CI/CD pipelines in AWS.
- At least 4+ years of experience in database design and dimension modelling using SQL.
- Advanced working SQL knowledge and experience with relational and NoSQL databases, as well as working familiarity with a variety of databases (SQL Server, Neo4j).
- Strong analytical and critical thinking skills, with the ability to identify and resolve issues in data pipelines and systems.
- Strong communication skills to effectively collaborate with team members and present findings to stakeholders.
- Collaborate with cross-functional teams to ensure successful implementation of solutions.
- Experience with OLAP and OLTP databases, and data structuring/modelling with an understanding of key data points.

Good to have:
- Domain knowledge (if applicable) in financial fraud to enhance predictive modelling and anomaly detection capabilities.
- Knowledge of AWS IAM for managing secure access to data resources.
- Familiarity with DevOps practices and automation tools like Terraform or CloudFormation.
- Experience with data visualization tools like Amazon QuickSight or integrating Redshift data with BI tools (Tableau, Power BI, etc.).
- AWS certifications such as AWS Certified Data Analytics - Specialty or AWS Certified Solutions Architect are a plus.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 2 weeks ago
6.0 - 9.0 years
32 - 35 Lacs
Noida, Kolkata, Chennai
Work from Office
Dear Candidate,

We are hiring a Rust Developer to build safe, concurrent, and high-performance applications for system-level or blockchain development.

Key Responsibilities:
- Develop applications using Rust and its ecosystem (Cargo, crates)
- Write memory-safe and zero-cost abstractions for systems or backends
- Build RESTful APIs, CLI tools, or blockchain smart contracts
- Optimize performance using async/await and the ownership model
- Ensure safety through unit tests, benchmarks, and fuzzing

Required Skills & Qualifications:
- Proficient in Rust, lifetimes, and borrowing
- Experience with Tokio, Actix, or Rocket frameworks
- Familiarity with WebAssembly, blockchain (e.g. Substrate), or embedded Rust
- Bonus: background in C/C++, systems programming, or cryptography

Soft Skills:
- Strong troubleshooting and problem-solving skills.
- Ability to work independently and in a team.
- Excellent communication and documentation skills.

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Srinivasa Reddy Kandi
Delivery Manager
Integra Technologies
Posted 2 weeks ago
8.0 - 11.0 years
35 - 37 Lacs
Kolkata, Ahmedabad, Bengaluru
Work from Office
Dear Candidate,

We are hiring a Clojure Developer to build modern applications with simplicity, immutability, and strong functional design.

Key Responsibilities:
- Write functional code using Clojure and ClojureScript
- Develop APIs, web apps, or backends using Ring, Compojure, or Luminus
- Work with immutable data structures and REPL-driven development
- Build scalable microservices or event-driven systems
- Maintain clean, modular, and expressive codebases

Required Skills & Qualifications:
- Strong experience with Clojure, Leiningen, and core.async
- Familiarity with functional programming and persistent data structures
- Experience integrating with Java, Datomic, or Kafka
- Bonus: frontend experience with Reagent or Re-frame

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa Reddy
Delivery Manager
Integra Technologies
Posted 2 weeks ago
4.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Experience: 4.00+ years
Salary: Confidential (based on experience)
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Office (Ahmedabad)
Placement Type: Full-time Permanent Position
(Note: This is a requirement for one of Uplers' clients - Attri)

What do you need for this opportunity?
Must have skills required: Python, Python Programming

Attri is looking for:

About The Role:
We are a global team with our people spread out across different countries. We strive to build a diverse team of passionate people who believe in bringing change through their work. At Attri, we are seeking a talented Frontend Engineer to join our dynamic team. We are a cutting-edge company, and we're looking for an individual who is passionate, inquisitive, and a self-learner, to contribute to the success of our projects.

Responsibilities:
- Modern Web Development: Proficiency in HTML5, CSS3, ES6+, TypeScript, and Node.js, with a strong emphasis on staying up-to-date with the latest technologies.
- TypeScript: Hands-on with generics, template literals, mapped types, and conditional types.
- Flexible Approach: Based on the problem at hand, apply the appropriate solution while considering all the risks.

Frontend:
- React.js and Flux Architecture: Extensive experience in React.js and Flux architecture, along with external state management, to build robust and performant web applications.
- JS Event Loop: Understanding of the event loop, the criticality of not blocking the main thread, and cooperative scheduling in React.
- State Management: Hands-on with more than one state management library.
- Ecosystem: Ability to leverage the vast JS ecosystem and hands-on experience with non-typical libraries.

Backend:
- SQL: Extensive hands-on experience with Postgres; comfortable with json_agg, json_build_object, WITH clause, CTEs, views/materialized views, and transactions.
- Redis: Hands-on with different data structures and their usage.
- Architectural Patterns: Backend for Frontend, background workers, CQRS, event sourcing, orchestration/choreography, etc.
- Transport Protocols: HTTP(S), SSE, and WS(S), to optimize data transfer and enhance application performance.
- Serialization Protocols: JSON and at least one more protocol.
- Authentication/Authorization: Comfortable with OAuth, JWT, and other mechanisms for different use cases.
- Comfortable reading the open-source code of libraries in use and understanding their internals; able to fork a library to improve it, fix a bug, or redesign it.

Tooling:
- Knowledge of essential frontend tools like Prettier, ESLint, and Conventional Commits to maintain code quality and consistency.
- Dependency management and versioning.
- Familiarity with CI/CD.

Testing:
- Utilize Jest/Vitest and React Testing Library for comprehensive testing of your code, ensuring high code quality and reliability.

Collaboration:
- Collaborate closely with our design team to craft responsive and themable components for data-intensive applications, ensuring a seamless user experience.

Programming Paradigms:
- Solid grasp of both Object-Oriented Programming and Functional Programming concepts to create clean and maintainable code.

Design/Architectural Patterns:
- Identify suitable design and architectural patterns to solve the problem at hand; comfortable tailoring the pattern to fit the problem optimally.

Modular and Reusable Code:
- Write modular, reusable, and testable code that enhances codebase maintainability.

DSA:
- Basic understanding of DSA when required to optimize hot paths.
Good To Have:
- Python: Django Rest Framework, Celery, Pandas/NumPy, LangChain, Ollama
- Storybook: Use Storybook to develop components in isolation, streamlining the UI design and development process.
- Charting and Visualization: Experience with charting and visualization libraries, especially ECharts by Apache, to create compelling data representations.
- Tailwind CSS: Understanding of Tailwind CSS for efficient and responsive UI development.
- NoSQL Stores: ElasticSearch, Neo4j, Cassandra, Qdrant, etc.
- Functional Reactive Programming
- RabbitMQ/Kafka

Great To Have:
- Open Source Contribution: Experience contributing to open-source projects (not limited to personal projects or forks) that showcases your commitment to the development community.
- Renderless/Headless React Components: Developing renderless or headless React components to provide flexible and reusable UI solutions.
- End-to-End Testing: Experience with Cypress or any other end-to-end (E2E) testing framework, ensuring the robustness and quality of the entire application.
- Deployment: Being target-agnostic and understanding the nuances of the application in operation.

What You Bring:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 5+ years of relevant experience in frontend web development, including proficiency in HTML5, CSS3, ES6+, TypeScript, React.js, and related technologies.
- Solid understanding of Object-Oriented Programming, Functional Programming, SOLID principles, and Design Patterns.
- Proven experience in developing modular, reusable, and testable code.
- Prior work on data-intensive applications and collaboration with design teams to create responsive and themable components.
- Experience with testing frameworks like Jest/Vitest and React Testing Library.

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers:
Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 2 weeks ago
3.0 - 7.0 years
5 - 10 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
1-year contract

Job Title: Python Developer with Neo4j & ExpressJS
Location: Bangalore / Chennai / Hyderabad
Experience: 3+ years
Mode: Work from Office
Salary: 5-10 LPA
Employment Type: Full-Time (Immediate to 15 Days Joiners Preferred)
Client: Brillio
On Payroll of: Nyxtech
Hiring Contact: Yash Sharma (LinkedIn): linkedin.com/in/yashsharma1608
Notice Period: Immediate to 15 Days

Job Description:
Brillio is seeking a highly capable Python Developer with hands-on expertise in Neo4j (Cypher, APOC) and ExpressJS. The ideal candidate will have a strong backend development background and experience working with modern databases and CI/CD tools. This is a fast-paced role requiring immediate availability and a passion for problem-solving and scalable architecture.

Key Responsibilities:
- Develop and maintain backend services using Python and ExpressJS.
- Design and optimize graph-based data models using Neo4j, Cypher queries, and APOC procedures.
- Work with relational and NoSQL databases like PostgreSQL and Cassandra.
- Collaborate with cross-functional teams to translate business needs into technical solutions.
- Implement CI/CD workflows using Git and Jenkins.
- Ensure code quality, security, and performance in a production environment.

Required Skills & Experience:
- Minimum 3 years of development experience.
- Strong proficiency in Python and ExpressJS.
- Deep understanding of Neo4j, including Cypher and APOC.
- Working knowledge of PostgreSQL, Cassandra, Git, and Jenkins.
- Familiarity with Java and the Spring Framework is a plus.
- Good analytical skills with a proactive approach to problem-solving.
- Immediate to 15 days availability is mandatory.

Additional Details:
- Be part of innovative and data-intensive application development.
- Work in a supportive, high-energy environment with cutting-edge technologies.
- Immediate joiners preferred due to project urgency.
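For context on how the Python and Cypher pieces of a role like this typically meet, here is a small sketch using the official neo4j Python driver. The connection details and the Person/KNOWS data model are hypothetical placeholders, not from the posting, and the query is a deliberately simple degree count rather than anything APOC-specific.

```python
from neo4j import GraphDatabase

# Hypothetical connection details; replace with your own instance and credentials.
URI = "bolt://localhost:7687"
AUTH = ("neo4j", "password")

# Cypher query: count each person's outgoing KNOWS relationships (a simple graph feature).
CYPHER = """
MATCH (p:Person)-[:KNOWS]->(friend)
RETURN p.name AS name, count(friend) AS degree
ORDER BY degree DESC
LIMIT $limit
"""

def top_connected_people(limit: int = 10):
    driver = GraphDatabase.driver(URI, auth=AUTH)
    try:
        with driver.session() as session:
            result = session.run(CYPHER, limit=limit)
            return [(record["name"], record["degree"]) for record in result]
    finally:
        driver.close()

if __name__ == "__main__":
    for name, degree in top_connected_people():
        print(f"{name}: {degree} connections")
```

Parameters are passed with `$limit` rather than string formatting, which keeps the query plan cacheable and avoids injection issues.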
Posted 2 weeks ago
10.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Title: Solution Architect
Location: CIPL, Noida Sector 138
Job Type: Full-time
Experience: 10+ years in IT, with 10+ years of relevant experience
Educational Qualification: BE/B.Tech or MCA in Computer Science/Information Technology
Industry Preference: Information Technology

Job Summary:
We are looking for an experienced Solution Architect to join our dynamic team. The ideal candidate will have a strong background in designing and implementing scalable, high-performance enterprise architectures using modern technologies, cloud platforms, and Gen AI tools. You will play a key role in driving technology strategy, aligning IT solutions with business goals, and ensuring architectural integrity across projects.

Key Responsibilities:
- Analyze complex business requirements and translate them into scalable, secure, and efficient technical architectures.
- Design end-to-end solutions and document architectural decisions, diagrams, and system specifications.
- Ensure architectural alignment with enterprise IT standards and frameworks.
- Collaborate with cross-functional teams, including developers, project managers, and business stakeholders.
- Oversee implementation to ensure solutions are built as per architectural guidelines.
- Identify potential technical risks and develop mitigation strategies.
- Stay updated on emerging technologies and evaluate their potential impact on the organization's architecture.

Technical Skills:
- Programming Languages: Java, Node.js, Go
- Cloud Platforms (Preferred):
  - AWS: IoT Core, DynamoDB, Lambda, Elasticsearch, Glue, Athena, EKS, SQS, SNS, Kinesis Streams/Firehose, API Gateway, CloudFront, Cognito
  - Azure: Service Bus, SQL Database, Redis (Azure Cache), Blob Storage, API Management, Key Vault
- Generative AI Tools: AWS Lex, Bedrock, vector DBs, GitHub Copilot, Amazon Q, Cluster AI
- Cloud Infrastructure Tools: Cloud Foundry, Terraform, AWS CloudFormation (or similar)
- UI & Mobile Frameworks: React.js, React Native, Flutter
- Databases:
  - NoSQL: Neo4j, Cassandra, InfluxDB, DynamoDB, MongoDB, Cosmos DB
  - Relational: Oracle, MySQL, MS SQL Server
- Frameworks & Tools: Spring Framework, Kafka, Kinesis Data Streams, Docker, Kubernetes

Soft Skills & Competencies:
- Strong analytical and problem-solving skills
- Excellent communication and stakeholder management
- Leadership and collaboration across multidisciplinary teams
- Commitment to best practices in software development and architecture
- Proactive in risk management and solution delivery

Interested candidates can apply by sending their updated resume to d.darshani@cipl.org.in or apply through the job post directly!

Thanks!
TA Team - CIPL
Posted 2 weeks ago
3.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Join our dynamic AI team as AI/ML Engineer focused on backend and AI system development. You will design, develop, and scale robust backend services and AI/ML pipelines using Python (3.11+), FastAPI. Your work will involve building advanced generative AI solutionsβsuch as retrieval-augmented generation (RAG) and agent-based workflowsβleveraging large language models to solve real-world problems. Our projects span multiple verticals, including healthcare and management consulting, providing a unique opportunity to work on impactful, real-world challenges. What you will be doing: Develop and maintain scalable backend APIs and services in Python (3.11+) using FastAPI Design and implement LLM-driven solutions, including prompt engineering and intelligent AI agent workflows Build and deploy Retrieval-Augmented Generation (RAG) pipelines using vector databases (Weaviate, Neo4j, Firestore) Leverage LangChain and LangGraph frameworks for orchestrating LLM interactions and knowledge graph tasks Integrate and manage data storage with Firebase Firestore or similar NoSQL databases on Google Cloud Platform (GCP) Process, clean, and analyze text data using NLP techniques to feed into AI pipelines Collaborate with cross-functional teams to translate requirements into robust AI features and ensure production readiness Create dashboards and developer tools (using JavaScript/TypeScript) for monitoring and analytics of AI systems Ensure code quality and follow best practices (testing, CI/CD) for reliable production deployments You are a good match if you have: 3+ years of software engineering experience with a strong focus on Python (3.11+) and backend development Proficiency with FastAPI or similar Python web frameworks for building RESTful APIs Hands-on experience with large language models (LLMs) and Generative AI, including prompt engineering techniques Solid experience with LangChain and LangGraph (or equivalent orchestration frameworks) Proven track record in building RAG systems and deploying AI agents in production environments Proficient in using vector databases like Weaviate for semantic search and retrieval Strong NLP and text data processing skills (tokenization, embeddings, text analytics) Experience with Google Cloud Platform (GCP) services and Firebase Firestore (or other NoSQL databases) Basic proficiency in JavaScript/TypeScript for developing internal dashboards or front-end components Excellent communication, teamwork and problem-solving skills is must You get extra points for: Experience with other cloud platforms (AWS, Azure) and cloud-native architectures Familiarity with containerization (Docker, Kubernetes) and CI/CD pipelines Experience with frontend frameworks (React, Vue) or visualization libraries for dashboards Why join us? Youβll be a key early hire in a fast-moving, supportive, and experienced team. Reporting to the Head of Product Engineering, youβll have the autonomy to shape support, release, and onboarding processes as we scale. Youβll work across the business and play a vital role in our customersβ success. Show more Show less
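As an illustrative sketch of the FastAPI-plus-RAG stack this listing describes (not the company's actual service), the endpoint below wraps a retrieval step and a generation step behind a typed API. The retrieve_passages and generate_answer helpers are hypothetical placeholders standing in for a Weaviate query and an LLM call.

```python
from typing import List

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="RAG query service (sketch)")

class QueryRequest(BaseModel):
    question: str
    top_k: int = 4

class QueryResponse(BaseModel):
    answer: str
    sources: List[str]

def retrieve_passages(question: str, top_k: int) -> List[str]:
    # Hypothetical placeholder: in practice this would query a vector store
    # (e.g. Weaviate) for the passages most similar to the question.
    return [f"stub passage {i} relevant to: {question}" for i in range(top_k)]

def generate_answer(question: str, passages: List[str]) -> str:
    # Hypothetical placeholder for an LLM call grounded in the retrieved passages.
    return f"Answer to '{question}' based on {len(passages)} retrieved passages."

@app.post("/query", response_model=QueryResponse)
def query(req: QueryRequest) -> QueryResponse:
    passages = retrieve_passages(req.question, req.top_k)
    answer = generate_answer(req.question, passages)
    return QueryResponse(answer=answer, sources=passages)

# Run locally (assuming uvicorn is installed): uvicorn main:app --reload
```

Pydantic models give the endpoint request validation and a generated OpenAPI schema for free, which is typically why FastAPI is paired with RAG backends like the one described above.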
Posted 2 weeks ago
4.0 years
0 Lacs
Greater Ahmedabad Area
On-site
Sr. Fullstack Developer
Experience: 4-8 years
Salary: Competitive
Preferred Notice Period: Within 30 days
Shift: 10:00 AM to 7:00 PM IST
Opportunity Type: Onsite (Ahmedabad)
Placement Type: Permanent
(Note: This is a requirement for one of Uplers' clients)
Must have skills required: Python, Python Programming

Attri (one of Uplers' clients) is looking for a Senior DevOps Engineer who is passionate about their work, eager to learn and grow, and committed to delivering exceptional results. If you are a team player with a positive attitude and a desire to make a difference, then we want to hear from you.

Role Overview

About Attri:
Attri is an AI organization that helps businesses initiate and accelerate their AI efforts. We offer the industry's first end-to-end enterprise machine learning platform, empowering teams to focus on ML development rather than infrastructure. From ideation to execution, our global team of AI experts supports organizations in building scalable, state-of-the-art ML solutions. Our mission is to redefine businesses by harnessing cutting-edge technology and a unique, value-driven approach. With team members across continents, we celebrate diversity, curiosity, and innovation.

About The Role:
We are a global team with our people spread out across different countries. We strive to build a diverse team of passionate people who believe in bringing change through their work. At Attri, we are seeking a talented Frontend Engineer to join our dynamic team. We are a cutting-edge company, and we're looking for an individual who is passionate, inquisitive, and a self-learner, to contribute to the success of our projects.

Responsibilities:
- Modern Web Development: Proficiency in HTML5, CSS3, ES6+, TypeScript, and Node.js, with a strong emphasis on staying up-to-date with the latest technologies.
- TypeScript: Hands-on with generics, template literals, mapped types, and conditional types.
- Flexible Approach: Based on the problem at hand, apply the appropriate solution while considering all the risks.

Frontend:
- React.js and Flux Architecture: Extensive experience in React.js and Flux architecture, along with external state management, to build robust and performant web applications.
- JS Event Loop: Understanding of the event loop, the criticality of not blocking the main thread, and cooperative scheduling in React.
- State Management: Hands-on with more than one state management library.
- Ecosystem: Ability to leverage the vast JS ecosystem and hands-on experience with non-typical libraries.

Backend:
- SQL: Extensive hands-on experience with Postgres; comfortable with json_agg, json_build_object, WITH clause, CTEs, views/materialized views, and transactions.
- Redis: Hands-on with different data structures and their usage.
- Architectural Patterns: Backend for Frontend, background workers, CQRS, event sourcing, orchestration/choreography, etc.
- Transport Protocols: HTTP(S), SSE, and WS(S), to optimize data transfer and enhance application performance.
- Serialization Protocols: JSON and at least one more protocol.
- Authentication/Authorization: Comfortable with OAuth, JWT, and other mechanisms for different use cases.
- Comfortable reading the open-source code of libraries in use and understanding their internals; able to fork a library to improve it, fix a bug, or redesign it.

Tooling:
- Knowledge of essential frontend tools like Prettier, ESLint, and Conventional Commits to maintain code quality and consistency.
- Dependency management and versioning.
- Familiarity with CI/CD.

Testing:
- Utilize Jest/Vitest and React Testing Library for comprehensive testing of your code, ensuring high code quality and reliability.

Collaboration:
- Collaborate closely with our design team to craft responsive and themable components for data-intensive applications, ensuring a seamless user experience.

Programming Paradigms:
- Solid grasp of both Object-Oriented Programming and Functional Programming concepts to create clean and maintainable code.

Design/Architectural Patterns:
- Identify suitable design and architectural patterns to solve the problem at hand; comfortable tailoring the pattern to fit the problem optimally.

Modular and Reusable Code:
- Write modular, reusable, and testable code that enhances codebase maintainability.

DSA:
- Basic understanding of DSA when required to optimize hot paths.

Good To Have:
- Python: Django Rest Framework, Celery, Pandas/NumPy, LangChain, Ollama
- Storybook: Use Storybook to develop components in isolation, streamlining the UI design and development process.
- Charting and Visualization: Experience with charting and visualization libraries, especially ECharts by Apache, to create compelling data representations.
- Tailwind CSS: Understanding of Tailwind CSS for efficient and responsive UI development.
- NoSQL Stores: ElasticSearch, Neo4j, Cassandra, Qdrant, etc.
- Functional Reactive Programming
- RabbitMQ/Kafka

Great To Have:
- Open Source Contribution: Experience contributing to open-source projects (not limited to personal projects or forks) that showcases your commitment to the development community.
- Renderless/Headless React Components: Developing renderless or headless React components to provide flexible and reusable UI solutions.
- End-to-End Testing: Experience with Cypress or any other end-to-end (E2E) testing framework, ensuring the robustness and quality of the entire application.
- Deployment: Being target-agnostic and understanding the nuances of the application in operation.

What You Bring:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 5+ years of relevant experience in frontend web development, including proficiency in HTML5, CSS3, ES6+, TypeScript, React.js, and related technologies.
- Solid understanding of Object-Oriented Programming, Functional Programming, SOLID principles, and Design Patterns.
- Proven experience in developing modular, reusable, and testable code.
- Prior work on data-intensive applications and collaboration with design teams to create responsive and themable components.
- Experience with testing frameworks like Jest/Vitest and React Testing Library.

Benefits:
- Competitive salary
- Support for continual learning (free books and online courses)
- Leveling-up opportunities
- Diverse team environment

How to apply for this opportunity (easy 3-step process):
1. Click on Apply and register or log in on our portal.
2. Upload your updated resume and complete the screening form.
3. Increase your chances of getting shortlisted and meeting the client for the interview!

About Our Client:
Attri, an AI organization, leads the way in enterprise AI, offering advanced solutions and services driven by AI agents and powered by Foundation Models. Our comprehensive suite of AI-enabled tools drives business impact, enhances quality, mitigates risk, and also helps unlock growth opportunities.

About Uplers:
Our goal is to make hiring and getting hired reliable, simple, and fast.
Our role will be to help all our talents find and apply for relevant product and engineering job opportunities and progress in their career. (Note: There are many more opportunities apart from this on the portal.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 2 weeks ago
2.0 - 3.0 years
6 - 10 Lacs
Bengaluru
Work from Office
- Solid experience in Node.js, TypeScript, React, Neo4j, and Firestore (GCP).
- In-depth knowledge of software design and development practices.
- Design and develop scalable systems using advanced concepts in Node.js, TypeScript, JavaScript, and React.
- Good understanding of deploying and working with GKE.
- Ability to design for scale and performance; peer code reviews.
- Architecture/platform development, API design, and data modelling at scale.
- Excellent working experience with Express, Knex, and serverless Google Cloud Functions.
- Solid experience with JavaScript frameworks (Angular/React.js), Redux, JavaScript, jQuery, CSS, HTML5, ES5, ES6 & ES7, in-memory databases (Redis/Hazelcast), and build tools (webpack).
- Good error and exception handling skills.
- Ability to work with Git repositories and remote code hosting services like GitHub and GitLab.
- Ability to deliver amazing results with minimal guidance and supervision.
- Passionate (especially about web development!), highly motivated, and fun to work with.

Skills: React, API, E-commerce, Node & Express.js, JavaScript
Posted 3 weeks ago
3.0 - 5.0 years
8 - 14 Lacs
Bengaluru
Work from Office
Key Responsibilities:
- Design, develop, and deploy AI/ML models for data-driven applications.
- Work with Neo4j (graph database) to model and analyze complex relationships in datasets.
- Implement machine learning algorithms for recommendation systems, predictive analytics, and knowledge graphs.
- Develop and optimize graph-based data pipelines and algorithms using Neo4j Cypher queries.
- Process, clean, and analyze large datasets for feature engineering and model training.
- Integrate ML models into scalable applications using Python, TensorFlow, PyTorch, or Scikit-learn.
- Optimize model performance, accuracy, and scalability for production deployment.
- Collaborate with data scientists, backend engineers, and product teams to enhance AI capabilities.

Required Skills:
- Machine Learning & Deep Learning: Hands-on with ML models, classification, clustering, NLP.
- Neo4j & Graph Databases: Experience in graph data modeling, Cypher queries, and data relationships.
- Programming: Proficiency in Python, TensorFlow, PyTorch, Scikit-learn.
- Data Engineering: Experience with ETL pipelines, data preprocessing, and feature engineering.
- Big Data & Cloud: Exposure to Spark, Kafka, AWS/GCP/Azure (good to have).
- MLOps & Deployment: Working knowledge of Docker, Kubernetes, CI/CD pipelines.
- Strong analytical and problem-solving skills for AI-driven applications.

Good to Have:
- Experience with Graph Neural Networks (GNNs) and Graph Data Science.
- Knowledge of NLP, Computer Vision, or Time-Series Analysis.
- Exposure to automated ML pipelines and model monitoring.
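The recommendation-system and Cypher items above often reduce to simple co-occurrence queries over the graph. The following is an illustrative sketch only, assuming a hypothetical (User)-[:LIKED]->(Item) data model and the official neo4j Python driver; it is not taken from the posting.

```python
from neo4j import GraphDatabase

# Hypothetical connection details and data model: (User)-[:LIKED]->(Item).
URI = "bolt://localhost:7687"
AUTH = ("neo4j", "password")

# "People who liked what you liked also liked..." expressed directly in Cypher.
RECOMMEND = """
MATCH (u:User {id: $user_id})-[:LIKED]->(:Item)<-[:LIKED]-(:User)-[:LIKED]->(rec:Item)
WHERE NOT (u)-[:LIKED]->(rec)
RETURN rec.name AS item, count(*) AS score
ORDER BY score DESC
LIMIT 5
"""

def recommend(user_id: str):
    driver = GraphDatabase.driver(URI, auth=AUTH)
    try:
        with driver.session() as session:
            return [(r["item"], r["score"]) for r in session.run(RECOMMEND, user_id=user_id)]
    finally:
        driver.close()

if __name__ == "__main__":
    print(recommend("u-42"))
```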
Posted 3 weeks ago
3.0 - 5.0 years
8 - 14 Lacs
Bengaluru
Work from Office
Artificial Intelligence/Machine Learning Engineer - Neo4j/Deep Learning

Role: AI/ML Engineer
Location: Bangalore
Experience: 3-4 years
Joining: Immediate (preferred)

Key Responsibilities:
- Design, develop, and deploy AI/ML models for data-driven applications.
- Work with Neo4j (graph database) to model and analyze complex relationships in datasets.
- Implement machine learning algorithms for recommendation systems, predictive analytics, and knowledge graphs.
- Develop and optimize graph-based data pipelines and algorithms using Neo4j Cypher queries.
- Process, clean, and analyze large datasets for feature engineering and model training.
- Integrate ML models into scalable applications using Python, TensorFlow, PyTorch, or Scikit-learn.
- Optimize model performance, accuracy, and scalability for production deployment.
- Collaborate with data scientists, backend engineers, and product teams to enhance AI capabilities.

Required Skills:
- Machine Learning & Deep Learning: Hands-on with ML models, classification, clustering, NLP.
- Neo4j & Graph Databases: Experience in graph data modeling, Cypher queries, and data relationships.
- Programming: Proficiency in Python, TensorFlow, PyTorch, Scikit-learn.
- Data Engineering: Experience with ETL pipelines, data preprocessing, and feature engineering.
- Big Data & Cloud: Exposure to Spark, Kafka, AWS/GCP/Azure (good to have).
- MLOps & Deployment: Working knowledge of Docker, Kubernetes, CI/CD pipelines.
- Strong analytical and problem-solving skills for AI-driven applications.

Good to Have:
- Experience with Graph Neural Networks (GNNs) and Graph Data Science.
- Knowledge of NLP, Computer Vision, or Time-Series Analysis.
- Exposure to automated ML pipelines and model monitoring.
Posted 3 weeks ago
0 years
0 Lacs
Greater Chennai Area
On-site
Overview
- Java development with hands-on experience in Spring Boot.
- Strong knowledge of UI frameworks, particularly Angular, for developing dynamic, interactive web applications.
- Experience with Kubernetes for managing microservices-based applications in a cloud environment.
- Familiarity with Postgres (relational) and Neo4j (graph database) for managing complex data models.
- Experience in metadata modeling and designing data structures that support high performance and scalability.
- Expertise in Camunda BPMN and business process automation.
- Experience implementing rules with the Drools rules engine.
- Knowledge of Unix/Linux systems for application deployment and management.
- Experience building data ingestion frameworks to process and handle large datasets.

Responsibilities
- Java development with hands-on experience in Spring Boot.
- Strong knowledge of UI frameworks, particularly Angular, for developing dynamic, interactive web applications.
- Experience with Kubernetes for managing microservices-based applications in a cloud environment.
- Familiarity with Postgres (relational) and Neo4j (graph database) for managing complex data models.
- Experience in metadata modeling and designing data structures that support high performance and scalability.
- Expertise in Camunda BPMN and business process automation.
- Experience implementing rules with the Drools rules engine.
- Knowledge of Unix/Linux systems for application deployment and management.
- Experience building data ingestion frameworks to process and handle large datasets.

Requirements
- Java development with hands-on experience in Spring Boot.
- Strong knowledge of UI frameworks, particularly Angular, for developing dynamic, interactive web applications.
- Experience with Kubernetes for managing microservices-based applications in a cloud environment.
- Familiarity with Postgres (relational) and Neo4j (graph database) for managing complex data models.
- Experience in metadata modeling and designing data structures that support high performance and scalability.
- Expertise in Camunda BPMN and business process automation.
- Experience implementing rules with the Drools rules engine.
- Knowledge of Unix/Linux systems for application deployment and management.
- Experience building data ingestion frameworks to process and handle large datasets.
Posted 3 weeks ago
9.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Generative AI - Application Developer, Python

EY's GDS Tax Technology team's mission is to develop, implement and integrate technology solutions that better serve our clients and engagement teams. As a member of EY's core Tax practice, you'll develop deep tax technical knowledge and outstanding database, data analytics and programming skills. Ever-increasing regulations require tax departments to gather, organize and study more data than ever before. Often the data necessary to satisfy these ever-increasing and complex regulations must be collected from a variety of systems and departments throughout an organization. Effectively and efficiently handling the variety and volume of data is often extremely challenging and time consuming for a company.

EY's GDS Tax Technology team members work side-by-side with the firm's partners, clients and tax technical subject matter experts to develop and incorporate technology solutions that enhance value-add, improve efficiencies and enable our clients with disruptive and market-leading tools supporting Tax. GDS Tax Technology works closely with clients and professionals in the following areas: Federal Business Tax Services, Partnership Compliance, Corporate Compliance, Indirect Tax Services, Human Capital, and Internal Tax Services. GDS Tax Technology provides solution architecture, application development, testing and maintenance support to the global Tax service line, both on a proactive basis and in response to specific requests.

EY is currently seeking a Generative AI - Application Developer (Python) to join our Tax Technology practice in Bangalore and Kolkata, India.

The opportunity
We're looking for Tax Seniors with expertise in full-stack application development using Python for generative AI applications to join the TTT team in the Tax service line. This is a fantastic opportunity to be part of a pioneer firm whilst being instrumental in the growth of a new service offering.

Your Key Responsibilities
- Design, develop, and implement AI agents/plugins/interfaces and APIs, ensuring integration with various systems aligns with the core product/platform development strategy.
- Estimate and manage technical efforts, including work breakdown structures, risks, and solutions, while adhering to development methodologies and KPIs.
- Maintain effective communication within the team and with stakeholders, proactively managing expectations and collaborating on problem-solving.
- Contribute to the refinement of development/engineering methodologies and standards, anticipating potential issues and leading the resolution process.

Skills And Attributes For Success
Must-Have:
- Skilled in full-stack application development with Python, FastAPI, React or other TypeScript-based UI frameworks, and SQL databases.
- Advanced knowledge of Azure services such as Azure App Service, Azure Functions, Entra ID, etc.
- Containerisation: Docker, Azure Container Apps, Azure Kubernetes Service (AKS).
- NoSQL databases such as Cosmos DB or MongoDB.
- Working experience with source control such as Git or TFVC.
- CI/CD pipelines: Azure DevOps, GitHub Actions, etc.
- Generative AI application development with Azure OpenAI, Semantic Kernel, LangChain, and vector databases such as Azure AI Search, Postgres, etc.
- Fundamental understanding of various types of Large Language Models (LLMs).
- Fundamental understanding of Retrieval-Augmented Generation (RAG) techniques.
- Fundamental understanding of classical AI/ML.
- Skilled in advanced prompt engineering.

Nice-to-Have:
- Awareness of various AI agent / agentic workflow frameworks and libraries.
- Graph databases such as Neo4j.
- Experience with M365 Copilot Studio.
- Microsoft Azure AI-900 / AI-102 certification.

Behavioural Skills:
- Excellent learning ability.
- Strong communication skills.
- Flexibility to work both independently and as part of a larger team.
- Strong analytical skills and attention to detail.
- The ability to adapt your work style to work with both internal and client team members.

To qualify for the role, you must have:
- Bachelor's or master's degree in software engineering, information technology, or BE/B.Tech.
- An overall 5-9 years of experience.

Ideally, you'll also have:
- Thorough knowledge of the Tax or Finance domain.
- Strong analytical skills and attention to detail.

What We Look For
A team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment. An opportunity to be a part of a market-leading, multi-disciplinary team of 1400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY TAS practices globally with leading businesses across a range of industries.

What We Offer
EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations (Argentina, China, India, the Philippines, Poland and the UK) and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We'll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career.

- Continuous learning: You'll develop the mindset and skills to navigate whatever comes next.
- Success, as defined by you: We'll provide the tools and flexibility, so you can make a meaningful impact, your way.
- Transformative leadership: We'll give you the insights, coaching and confidence to be the leader the world needs.
- Diverse and inclusive culture: You'll be embraced for who you are and empowered to use your voice to help others find theirs.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
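To make the Azure OpenAI and prompt-engineering requirements above concrete, here is a minimal, hedged sketch using the AzureOpenAI client from the openai Python package (v1-style API). The endpoint, API version, deployment name, and environment variables are placeholders rather than values from the posting, and a real project would typically add retries, streaming, and RAG context on top.

```python
import os

from openai import AzureOpenAI  # assumes the openai>=1.x package is installed

# Placeholder configuration; supply your own Azure OpenAI resource details.
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",  # assumed version string; check your resource's supported versions
)

def summarise(text: str, deployment: str = "gpt-4o") -> str:
    """Simple prompt-engineering example: a system role constrains the model's output."""
    response = client.chat.completions.create(
        model=deployment,  # the Azure *deployment* name, not the raw model id
        messages=[
            {"role": "system", "content": "You are a concise assistant. Reply in three bullet points."},
            {"role": "user", "content": f"Summarise the following text:\n\n{text}"},
        ],
        temperature=0.2,
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(summarise("Retrieval-augmented generation grounds LLM answers in retrieved documents."))
```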
Posted 3 weeks ago
6.0 - 9.0 years
32 - 35 Lacs
Noida, Kolkata, Chennai
Work from Office
Dear Candidate,

We are hiring a Lua Developer to create lightweight scripting layers in games, embedded systems, or automation tools.

Key Responsibilities:
- Develop scripts and integrations using Lua
- Embed Lua in C/C++ applications for extensibility
- Write custom modules or bindings for game engines or IoT devices
- Optimize Lua code for memory and execution time
- Integrate with APIs, data sources, or hardware systems

Required Skills & Qualifications:
- Proficient in Lua and its integration with host languages
- Experience with Love2D, Corona SDK, or custom engines
- Familiarity with C/C++, embedded Linux, or IoT
- Bonus: game scripting or automation experience

Soft Skills:
- Strong troubleshooting and problem-solving skills.
- Ability to work independently and in a team.
- Excellent communication and documentation skills.

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Srinivasa Reddy Kandi
Delivery Manager
Integra Technologies
Posted 3 weeks ago
10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Overview: The Technology Solution Delivery - Front Line Manager (M1) is responsible for providing leadership and day-to-day direction to a cross-functional engineering team. This role involves establishing and executing operational plans, managing relationships with internal and external customers, and overseeing technical fulfillment projects. The manager also supports sales verticals in customer interactions and ensures the delivery of technology solutions aligns with business needs. What you will do: Build strong relationships with both internal and external stakeholders including product, business and sales partners. Demonstrate excellent communication skills, with the ability both to simplify complex problems and to dive deeper when needed. Manage teams with cross-functional skills that include software, quality, reliability engineers, project managers and scrum masters. Mentor, coach and develop junior and senior software, quality and reliability engineers. Collaborate with the architects, SRE leads and other technical leadership on strategic technical direction, guidelines, and best practices. Ensure compliance with EFX secure software development guidelines and best practices, and be responsible for meeting and maintaining QE, DevSec, and FinOps KPIs. Define, maintain and report SLAs, SLOs and SLIs meeting EFX engineering standards in partnership with the product, engineering and architecture teams. Drive technical documentation including support, end user documentation and run books. Lead sprint planning, sprint retrospectives, and other team activities. Implement architecture decision making associated with product features/stories, refactoring work, and EOSL decisions. Create and deliver technical presentations to internal and external technical and non-technical stakeholders, communicating with clarity and precision, and present complex information in a concise format that is audience appropriate. Provide coaching, leadership and talent development; ensure teams function as high-performing teams; identify performance gaps and opportunities for upskilling and transition when necessary. Drive a culture of accountability through actions and stakeholder engagement and expectation management. Develop the long-term technical vision and roadmap within, and often beyond, the scope of your teams. Oversee systems designs within the scope of the broader area, and review product or system development code to solve ambiguous problems. Identify and resolve problems affecting day-to-day operations. Set priorities for the engineering team and coordinate work activities with other supervisors. Cloud certification strongly preferred.
What experience you need: BS or MS degree in a STEM major or equivalent job experience required. 10+ years' experience in software development and delivery. You adore working in a fast-paced and agile development environment. You possess excellent communication, sharp analytical abilities, and proven design skills. You have detailed knowledge of modern software development lifecycles including CI/CD. You have the ability to operate across a broad and complex business unit with multiple stakeholders. You have an understanding of the key aspects of finance, especially as related to technology, specifically including total cost of ownership and value. You are a self-starter, highly motivated, and have a real passion for actively learning and researching new methods of work and new technology. You possess excellent written and verbal communication skills with the ability to communicate with team members at various levels, including business leaders. What Could Set You Apart: UI development (e.g. HTML, JavaScript, AngularJS, Angular 4/5 and Bootstrap). Source code control management systems (e.g. SVN/Git, Subversion) and build tools like Maven. Big Data, Postgres, Oracle, MySQL, NoSQL databases (e.g. Cassandra, Hadoop, MongoDB, Neo4j). Design patterns. Agile environments (e.g. Scrum, XP). Software development best practices such as TDD (e.g. JUnit), automated testing (e.g. Gauge, Cucumber, FitNesse), continuous integration (e.g. Jenkins, GoCD). Linux command line and shell scripting languages. Relational databases (e.g. SQL Server, MySQL). Cloud computing, SaaS (Software as a Service). Atlassian tooling (e.g. JIRA, Confluence, and Bitbucket). Experience working in financial services. Experience working with open source frameworks; preferably Spring, though we would also consider Ruby, Apache Struts, Symfony, Django, etc. Automated testing: JUnit, Selenium, LoadRunner, SoapUI. Customer-focused with a drive to exceed expectations. Demonstrates integrity and accountability. Intellectually curious and driven to innovate. Values diversity and fosters collaboration. Results-oriented with a sense of urgency and agility.
Posted 3 weeks ago
2.0 - 5.0 years
15 - 19 Lacs
Mumbai
Work from Office
Overview The Data Technology team at MSCI is responsible for meeting the data requirements across various business areas, including Index, Analytics, and Sustainability. Our team collates data from multiple sources such as vendors (e.g., Bloomberg, Reuters), website acquisitions, and web scraping (e.g., financial news sites, company websites, exchange websites, filings). This data can be in structured or semi-structured formats. We normalize the data, perform quality checks, assign internal identifiers, and release it to downstream applications. Responsibilities As data engineers, we build scalable systems to process data in various formats and volumes, ranging from megabytes to terabytes. Our systems perform quality checks, match data across various sources, and release it in multiple formats. We leverage the latest technologies, sources, and tools to process the data. Some of the exciting technologies we work with include Snowflake, Databricks, and Apache Spark. Qualifications Core Java, Spring Boot, Apache Spark, Spring Batch, Python. Exposure to SQL databases like Oracle, MySQL, and Microsoft SQL Server is a must. Any experience/knowledge/certification in cloud technology, preferably Microsoft Azure or Google Cloud Platform, is good to have. Exposure to non-SQL databases like Neo4j or document databases is also good to have. What we offer you Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing. Flexible working arrangements, advanced technology, and collaborative workspaces. A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results. A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients. Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development. Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles. We actively nurture an environment that builds a sense of inclusion, belonging and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum. At MSCI we are passionate about what we do, and we are inspired by our purpose - to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer.
It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries. To all recruitment agencies MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes. Note on recruitment scams We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com
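As a rough illustration of the normalize-and-quality-check step described in the responsibilities above, here is a small PySpark sketch. It is not MSCI code; the column names, sources, and rules are invented purely for the example.

```python
# Illustrative PySpark sketch of a normalize-and-quality-check step.
# Column names and rules are hypothetical, not taken from the job description.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("vendor-data-qc").getOrCreate()

raw = spark.createDataFrame(
    [("US0378331005", " apple inc ", "BLOOMBERG"),
     ("US0378331005", "Apple Inc", "REUTERS"),
     (None,           "Unknown Co", "WEBSCRAPE")],
    ["isin", "company_name", "source"],
)

# Normalize: trim and upper-case names, then deduplicate on the identifier.
normalized = (
    raw.withColumn("company_name", F.upper(F.trim(F.col("company_name"))))
       .dropDuplicates(["isin"])
)

# Quality check: records missing the identifier are held back for review,
# the rest are released to downstream consumers.
rejected = normalized.filter(F.col("isin").isNull())
released = normalized.filter(F.col("isin").isNotNull())

print("released:", released.count(), "rejected:", rejected.count())
```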
Posted 3 weeks ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About Us: Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market. Role Overview: Joining one of our top customers as a skilled AI Engineer, you will design, develop, and deploy machine learning models and systems that drive our products and enhance user experiences. You will work closely with cross-functional teams to implement cutting-edge AI solutions, including recommendation engines and large language models. Key Responsibilities: Design and implement robust machine learning models and algorithms, focusing on recommendation systems. Conduct data analysis to identify trends, insights, and opportunities for model improvement. Collaborate with data scientists and software engineers to build and integrate end-to-end machine learning systems. Optimize and fine-tune models for performance and scalability, ensuring seamless deployment. Work with large datasets using SQL and Postgres to support model training and evaluation. Implement and refine prompt engineering techniques for large language models (LLMs). Stay current with advancements in AI/ML technologies, particularly in core ML algorithms like clustering and community detection. Monitor model performance, conduct regular evaluations, and retrain models as needed. Document processes, model performance metrics, and technical specifications. Required Skills and Qualifications: Bachelor's or Master's degree in Computer Science, Data Science, or a related field. Strong expertise in Python and experience with machine learning libraries (e.g., TensorFlow, PyTorch, Scikit-learn). Proven experience with SQL and Postgres for data manipulation and analysis. Demonstrated experience building and deploying recommendation engines. Solid understanding of core machine learning algorithms, including clustering and community detection. Prior experience in building end-to-end machine learning systems. Familiarity with prompt engineering and working with large language models (LLMs). Experience working with near-real-time recommendation systems. Hands-on experience with graph databases such as Neo4j or Neptune. Experience with the Flask or FastAPI frameworks. Experience with SQL to write, modify, and understand existing queries and optimize DB connections. Experience with AWS services like ECS, EC2, S3, and CloudWatch. Preferred Qualifications: Experience with graph databases (specifically Neo4j and the Cypher query language). Knowledge of large-scale data handling and optimization techniques. Experience improving models with RLHF.
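For context on the graph-database requirement above, a "users who liked X also liked Y" query is one of the simplest recommendation patterns Cypher expresses well. The sketch below is illustrative only: the :User and :Item labels, the :LIKED relationship, and the connection details are hypothetical, not taken from this posting, and it uses the official neo4j Python driver.

```python
# Sketch of a graph-based "users who liked X also liked Y" recommendation query.
# Labels, relationship types, and credentials are hypothetical.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

CYPHER = """
MATCH (u:User {id: $user_id})-[:LIKED]->(:Item)<-[:LIKED]-(peer:User)-[:LIKED]->(rec:Item)
WHERE NOT (u)-[:LIKED]->(rec)
RETURN rec.id AS item, count(*) AS score
ORDER BY score DESC
LIMIT $k
"""

def recommend(user_id: str, k: int = 5):
    """Return up to k items liked by users who share likes with this user."""
    with driver.session() as session:
        result = session.run(CYPHER, user_id=user_id, k=k)
        return [(record["item"], record["score"]) for record in result]

print(recommend("user-42"))
driver.close()
```

The same traversal in SQL would typically take several joins; keeping it as a single readable pattern is a common reason recommendation work is paired with a graph database.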
Posted 3 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
About Us: Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market. Role Overview: Joining one of our top customers as a skilled AI Engineer, you will design, develop, and deploy machine learning models and systems that drive our products and enhance user experiences. You will work closely with cross-functional teams to implement cutting-edge AI solutions, including recommendation engines and large language models. Key Responsibilities: Design and implement robust machine learning models and algorithms, focusing on recommendation systems. Conduct data analysis to identify trends, insights, and opportunities for model improvement. Collaborate with data scientists and software engineers to build and integrate end-to-end machine learning systems. Optimize and fine-tune models for performance and scalability, ensuring seamless deployment. Work with large datasets using SQL and Postgres to support model training and evaluation. Implement and refine prompt engineering techniques for large language models (LLMs). Stay current with advancements in AI/ML technologies, particularly in core ML algorithms like clustering and community detection. Monitor model performance, conduct regular evaluations, and retrain models as needed. Document processes, model performance metrics, and technical specifications. Required Skills and Qualifications: Bachelor's or Master's degree in Computer Science, Data Science, or a related field. Strong expertise in Python and experience with machine learning libraries (e.g., TensorFlow, PyTorch, Scikit-learn). Proven experience with SQL and Postgres for data manipulation and analysis. Demonstrated experience building and deploying recommendation engines. Solid understanding of core machine learning algorithms, including clustering and community detection. Prior experience in building end-to-end machine learning systems. Familiarity with prompt engineering and working with large language models (LLMs). Experience working with near-real-time recommendation systems. Hands-on experience with graph databases such as Neo4j or Neptune. Experience with the Flask or FastAPI frameworks. Experience with SQL to write, modify, and understand existing queries and optimize DB connections. Experience with AWS services like ECS, EC2, S3, and CloudWatch. Preferred Qualifications: Experience with graph databases (specifically Neo4j and the Cypher query language). Knowledge of large-scale data handling and optimization techniques. Experience improving models with RLHF.
Posted 3 weeks ago
7.0 - 10.0 years
16 - 30 Lacs
Noida, Lucknow
Work from Office
Hi, we have an urgent requirement for Python Developer roles; candidates must be able to join within a maximum of 30 days. About HCL Software: HCLSoftware, a division of HCLTech, develops, markets, sells, and supports software for AI and Automation, Data, Analytics and Insights, Digital Transformation, and Enterprise Security. HCLSoftware is the cloud-native solution factory for enterprise software and powers millions of apps at more than 20,000 organizations, including more than half of the Fortune 1000 and Global 2000 companies. HCLSoftware's mission is to drive ultimate customer success through relentless product innovation. Website: hcl-software.com Please also find the JD below. Job Title: Python Developer Location: Noida Experience Required: 7 to 10 years Notice Period: Serving notice and able to join within 2-4 weeks (early joiners only) Job Description: We are seeking an experienced Python Developer with a strong background in application and product development to join our team in Noida. The ideal candidate will have extensive experience in Python programming, with a focus on building robust and scalable applications. We are specifically looking for professionals who have been involved in full-cycle product development, rather than those whose experience is limited to writing scripts for testing or automation purposes. Key Responsibilities: Design, develop, and maintain Python-based applications and products. Collaborate with cross-functional teams to define, design, and ship new features. Ensure the performance, quality, and responsiveness of applications. Identify and correct bottlenecks and fix bugs. Help maintain code quality, organization, and automation. Participate in code reviews. Work with other developers, designers, and stakeholders to build high-quality, innovative, and fully performing software. Requirements: 7 to 10 years of proven experience in Python development. Strong understanding of Python programming and application development. Hands-on experience in full-cycle application/product development and Agile/Scrum development methodology. Solid understanding of software development principles, algorithms, and data structures. Experience with Python frameworks such as Django, Flask, or FastAPI. Proficient understanding of code versioning tools like Git. Familiarity with databases (GraphDB/Neo4j) and cloud services is a plus. Experience with deploying applications in cloud environments (AWS, Azure, or GCP). Hands-on experience with containerization technologies like Docker. Familiarity with orchestration tools such as Kubernetes for deploying and managing microservices. Excellent problem-solving skills and attention to detail. Good communication and collaboration skills. Thanks & Regards, Syed Hasan Abbas (He/Him), Senior Executive HR | HCL Software || TAG LinkedIn - www.linkedin.com/in/hasan-abbas
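Because the role above stresses full-cycle application work with frameworks such as FastAPI rather than test scripting, a bare-bones service sketch shows the kind of code involved. Everything here is hypothetical: the routes, the model, and the in-memory dict standing in for a GraphDB/Neo4j or cloud database backend; it is not HCL code.

```python
# Minimal FastAPI application sketch (illustrative; endpoints and models are hypothetical).
# Run locally with: uvicorn app:app --reload
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="demo-service")

class Product(BaseModel):
    id: int
    name: str

# Stand-in for a real data store such as Neo4j or a cloud database.
PRODUCTS = {1: Product(id=1, name="widget"), 2: Product(id=2, name="gadget")}

@app.get("/health")
def health() -> dict:
    return {"status": "ok"}

@app.get("/products/{product_id}", response_model=Product)
def get_product(product_id: int) -> Product:
    product = PRODUCTS.get(product_id)
    if product is None:
        raise HTTPException(status_code=404, detail="product not found")
    return product
```

In a setup like the one described, the dict would be replaced by a database client, and the service would typically be containerized with Docker and shipped through a CI/CD pipeline.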
Posted 3 weeks ago
0.0 - 2.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
The Engineering Analyst 2 is an intermediate-level position responsible for a variety of engineering activities including the design, acquisition and development of hardware, software and network infrastructure in coordination with the Technology team. The overall objective of this role is to ensure quality standards are being met within existing and planned frameworks. Responsibilities: Perform system and application monitoring, capacity planning and systems tests to ensure products meet performance requirements. Evaluate technologies, develop prototypes, contribute to design issues, and implement solutions. Work with various internal and external teams to identify and resolve problems. Consult with end users and clients to identify and correct systems problems or propose solutions. Assist in the development of software and systems tools used by integration teams to create end user packages. Provide support for operating systems and in-house applications, including third-party applications, as needed. Perform coding, analysis, testing or other appropriate functions in order to identify problems and propose solutions. Adhere to Citi technology standards, audit requirements and corporate compliance issues and requirements. Apply knowledge of engineering procedures and concepts and basic knowledge of other technical areas to day-to-day activities. Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency.
Qualifications: 0-2 years of relevant experience in an Engineering role. Experience working in Financial Services or a large, complex and/or global environment. Project management experience. Consistently demonstrates clear and concise written and verbal communication. Comprehensive knowledge of design metrics, analytics tools, benchmarking activities and related reporting to identify best practices. Demonstrated analytic/diagnostic skills. Ability to work in a matrix environment and partner with virtual teams. Ability to work independently, multi-task, and take ownership of various parts of a project or initiative. Ability to work under pressure and manage to tight deadlines or unexpected changes in expectations or requirements. Proven track record of operational process change and improvement. Education: Bachelor's degree/University degree or equivalent experience. Roles & Responsibilities: Knowledge of APIGEE implementation and support. Working experience with all CI/CD processes, including LSE and ECS. Hadoop cluster experience. Cloud computing knowledge on AWS. Must have SRE knowledge and self-healing implementation experience. Experience with automated server patching and batch management. Working experience with DevOps tools and technologies. Skillset: Big Data, Hadoop cluster, Kafka, GemFire, Neo4j, TeamCity, uDeploy, Autosys, RHEL, Oracle ------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Systems & Engineering ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
Posted 3 weeks ago
7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
The Job We're looking for an experienced, smart, driven individual who will help analyze the data used in our linking solutions through quantitative approaches, blending analytical, mathematical, and technical skills. Must-Have Skills Bachelor's or Master's degree in computer science, engineering, mathematics, statistics, or an equivalent technical discipline. 7+ years of experience working with data mapping, data analysis, and managing large data sets/data warehouses. Strong application development experience in Java. Strong proficiency in Angular 1.x. Familiarity with Singleton and MVC design patterns. Strong proficiency in SQL and/or MySQL, including optimization techniques (at least MySQL). Experience using tools such as Eclipse, Git, Postman, JIRA, and Confluence. Knowledge of test-driven development. Solid understanding of object-oriented programming. Proficiency in Azure Databricks, Azure Data Explorer, ADLS Gen2, and Event Hubs technologies. Hands-on experience with Docker, GitHub, and CI/CD pipelines. Experience working with Cosmos DB is preferred. Ability to analyze, evaluate, and make data-driven recommendations from big data. Strong understanding of data structures, algorithms, and their applications in solving business problems. Excellent analytical, problem-solving, and communication skills, both verbal and written. Strong organizational, project planning, time management, and change management skills. Good-to-Have Skills Expertise in Spring Boot, Microservices, and API development. Familiarity with OAuth 2.0 patterns (experience with at least two patterns). Knowledge of graph databases (e.g., Neo4j, Apache TinkerPop, Gremlin). Experience with Kafka messaging. Familiarity with Docker, Kubernetes, and cloud development. Experience with CI/CD tools like Jenkins and GitHub Actions. Knowledge of industry-wide technology trends and best practices. Familiarity with Azure Data Factory. Experience in containerization and deployment processes (Docker, GitHub, CI/CD). Experience with data visualization tools and techniques. Knowledge of cloud-native technologies and architectures. Experience Range 5+ years of relevant experience in data engineering, data analysis, and working with large datasets. Hiring Locations Chennai, Mumbai, Gurgaon Skills Java, Angular, Azure (Databricks, Data Factory)
Posted 3 weeks ago
Neo4j, a popular graph database management system, is seeing a growing demand in the job market in India. Companies are looking for professionals who are skilled in working with Neo4j to manage and analyze complex relationships in their data. If you are a job seeker interested in Neo4j roles, this article will provide you with valuable insights to help you navigate the job market in India.
The average salary range for Neo4j professionals in India varies based on experience level:
- Entry-level: INR 4-6 lakhs per annum
- Mid-level: INR 8-12 lakhs per annum
- Experienced: INR 15-20 lakhs per annum
In the Neo4j skill area, a typical career progression may look like:
- Junior Developer
- Developer
- Senior Developer
- Tech Lead
Apart from expertise in Neo4j, professionals in this field are often expected to have or develop skills in:
- Cypher Query Language
- Data modeling
- Database management
- Java or Python programming
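To make the first two items concrete, here is a tiny, self-contained illustration of Cypher and graph data modeling run through the official Neo4j Python driver. The movie data, credentials, and connection URI are made up for the example.

```python
# Tiny Cypher example via the official Neo4j Python driver.
# The person/movie data and credentials are invented purely for illustration.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

with driver.session() as session:
    # Data modeling: nodes carry labels and properties, relationships carry the verbs.
    session.run(
        "MERGE (p:Person {name: $name}) "
        "MERGE (m:Movie {title: $title}) "
        "MERGE (p)-[:ACTED_IN]->(m)",
        name="Priya", title="Example Film",
    )
    # Querying: pattern-match the relationship back out.
    for record in session.run(
        "MATCH (p:Person)-[:ACTED_IN]->(m:Movie) RETURN p.name AS actor, m.title AS movie"
    ):
        print(record["actor"], "acted in", record["movie"])

driver.close()
```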
As you explore Neo4j job opportunities in India, it's essential to not only possess the necessary technical skills but also be prepared to showcase your expertise during interviews. Stay updated with the latest trends in Neo4j and continuously enhance your skills to stand out in the competitive job market. Prepare thoroughly, demonstrate your knowledge confidently, and land your dream Neo4j job in India. Good luck!