10.0 - 13.0 years
35 - 50 Lacs
Chennai
Work from Office
Cognizant is hiring a Payments BA. Location: Chennai, Bangalore, Hyderabad.

Job Summary: At least 10 years of experience in a BA role, including a couple of years in a BA lead role. Good domain knowledge of SWIFT/ISO 20022 payments and stakeholder management. Java, Microservices and Spring Boot.

Technical knowledge: Java/Spring Boot, Kafka Streams, REST, JSON, Netflix microservices suite (Zuul, Eureka, Hystrix, etc.), 12-Factor Apps, Oracle, PostgreSQL, Cassandra and ELK. Ability to work with geographically dispersed and highly varied stakeholders.

Responsibilities

Strategy: Develop the strategic direction and roadmap for our flagship payments platform, aligning with business strategy, tech and ops strategy, and investment priorities. Tap into the latest industry trends and innovative products and solutions to deliver effective product capabilities faster. Support cash management operations, leveraging technology to streamline processes, enhance productivity, reduce risk and improve controls.

Business: Work hand in hand with the payments business, taking product programmes from investment decisions through design specification, solutioning, development, implementation and hand-over to operations, securing support and collaboration from other teams. Ensure delivery to the business within time, cost and quality constraints. Support the respective businesses in growing return on investment: commercialisation of capabilities, bid teams, monitoring of usage, improving client experience, enhancing operations, addressing defects and continuous improvement of systems. Foster an ecosystem of innovation and enable the business through technology.

Processes: Responsible for end-to-end delivery of the technology portfolio comprising key business product areas such as payments and clearing.
Own technology delivery of projects and programmes across global markets that (a) develop or enhance core product capabilities, (b) ensure compliance with regulatory mandates, (c) support operational improvements, process efficiencies and the zero-touch agenda, and (d) build the payments platform in line with the latest technology and architecture trends for improved stability and scale. Interface with business and technology leaders of other systems for collaborative delivery.
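The SWIFT/ISO 20022 domain knowledge this role asks for centres on structured payment messages such as the pacs.008 credit transfer. As a minimal, illustrative sketch (namespaces omitted for brevity and all sample values invented; real messages carry a full XML namespace and many more fields):

```python
import xml.etree.ElementTree as ET

# A tiny illustrative fragment of an ISO 20022 pacs.008 credit transfer.
# Element names follow the standard's abbreviated style; values are invented.
SAMPLE = """
<Document>
  <FIToFICstmrCdtTrf>
    <CdtTrfTxInf>
      <IntrBkSttlmAmt Ccy="EUR">250.00</IntrBkSttlmAmt>
      <Dbtr><Nm>Alice Ltd</Nm></Dbtr>
      <Cdtr><Nm>Bob GmbH</Nm></Cdtr>
    </CdtTrfTxInf>
  </FIToFICstmrCdtTrf>
</Document>
"""

def parse_payment(xml_text):
    # Pull out the settlement amount, currency, and party names.
    root = ET.fromstring(xml_text)
    amt = root.find(".//IntrBkSttlmAmt")
    return {
        "amount": float(amt.text),
        "currency": amt.get("Ccy"),
        "debtor": root.findtext(".//Dbtr/Nm"),
        "creditor": root.findtext(".//Cdtr/Nm"),
    }

print(parse_payment(SAMPLE))
# {'amount': 250.0, 'currency': 'EUR', 'debtor': 'Alice Ltd', 'creditor': 'Bob GmbH'}
```

A BA in this space would typically map such message fields to internal data models rather than write the parser, but the field-level structure above is the vocabulary the role trades in.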
Posted 2 months ago
7.0 - 9.0 years
1 - 3 Lacs
Pune
Work from Office
About the Role: We are looking for a highly skilled and experienced Lead Gen AI Engineer to spearhead AI/ML initiatives and oversee the development of advanced machine learning models, deep learning architectures, and generative AI systems. The ideal candidate will have 7-8 years of hands-on experience in data science, machine learning, and data engineering, with a strong focus on leadership, innovation, and generative AI technologies. You will be responsible for guiding a team, delivering AI solutions, and collaborating with cross-functional stakeholders to meet business goals. The desired candidate should be well versed in AI/ML solutioning and should have worked on end-to-end product deliveries.

Key Responsibilities:
- Lead the development and deployment of machine learning models, deep learning frameworks, and AI-driven solutions across the organization.
- Work closely with stakeholders to define data-driven strategies and drive innovation using AI and machine learning.
- Design and implement robust data pipelines and workflows in collaboration with data engineers and software developers.
- Develop and deploy APIs using web frameworks for seamless integration of AI/ML models into production environments.
- Mentor and lead a team of data scientists and engineers, providing technical guidance and fostering professional growth.
- Leverage LangChain or LlamaIndex to enhance model integration, document management, and data retrieval capabilities.
- Lead projects in generative AI technologies, such as large language models (LLMs), Retrieval-Augmented Generation (RAG), and AI agents, to create innovative AI-driven products and services.
- Stay updated on the latest AI/ML trends, ensuring that cutting-edge methodologies are adopted across projects.
- Collaborate with cross-functional teams to translate business problems into technical solutions and communicate findings effectively to both technical and non-technical stakeholders.
Required Skills and Qualifications:
- Experience: 7-8 years of experience in data science and AI/ML, with a strong foundation in machine learning, deep learning, generative AI and data engineering.
- Generative AI Expertise: Minimum 2+ years of experience with generative AI. Hands-on experience with LLMs, RAG, and AI agents.
- AI Agents & Frameworks: Hands-on experience with AI agent frameworks/libraries (e.g., AutoGen, CrewAI, OpenAI's Function Calling, Semantic Kernel, etc.).
- Programming: Strong proficiency in Python, with experience using TensorFlow, PyTorch, and Scikit-learn.
- LangChain & LlamaIndex: Experience integrating LLMs with structured and unstructured data.
- Knowledge Graphs: Expertise in building and utilizing knowledge graphs for AI-driven applications.
- SQL & NoSQL Databases: Hands-on experience with SQL (PostgreSQL, MySQL) and NoSQL (MongoDB, Cassandra, etc.) databases.
- API Development: Experience in developing APIs using Flask, FastAPI, or Django.
- Cloud & MLOps: Experience working with AWS, GCP, or Azure and MLOps best practices.
- Excellent communication, leadership, and project management skills.
- Strong problem-solving ability with a focus on delivering scalable, impactful solutions.

Preferred Skills:
- Experience with Computer Vision applications.
- Chain-of-Thought Reasoning: Familiarity with CoT prompting and reasoning techniques.
- Ontology: Understanding of ontologies for knowledge representation in AI systems.
- Data Engineering: Experience with ETL pipelines and data engineering workflows.
- Familiarity with big data tools like Spark, Hadoop, or distributed computing.
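The RAG pattern this role revolves around reduces to: retrieve the documents most relevant to a query, then prepend them to the LLM prompt. A toy, library-free sketch of that retrieval step, using bag-of-words cosine similarity as a stand-in for real embeddings (all names and documents are invented; production systems use an embedding model plus a vector store):

```python
import math
from collections import Counter

def vectorize(text):
    # Toy "embedding": bag-of-words term counts (real systems use learned embeddings).
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=2):
    # Rank documents by similarity to the query and keep the top k.
    qv = vectorize(query)
    return sorted(docs, key=lambda d: cosine(qv, vectorize(d)), reverse=True)[:k]

def build_prompt(query, docs):
    # Retrieval-augmented prompt: retrieved context first, then the question.
    context = "\n".join(retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}"

docs = [
    "Kafka Streams processes records in real time.",
    "PostgreSQL is a relational database.",
    "RAG augments an LLM prompt with retrieved context.",
]
prompt = build_prompt("What does RAG add to an LLM prompt?", docs)
```

Frameworks like LangChain or LlamaIndex, named in the posting, package exactly this retrieve-then-prompt loop with real embedding models and document stores.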
Posted 2 months ago
7.0 - 10.0 years
10 - 15 Lacs
Bengaluru
Work from Office
Role Overview
We are seeking an experienced Data Engineer with 7-10 years of experience to design, develop, and optimize data pipelines while integrating machine learning (ML) capabilities into production workflows. The ideal candidate will have a strong background in data engineering, big data technologies, cloud platforms, and ML model deployment. This role requires expertise in building scalable data architectures, processing large datasets, and supporting machine learning operations (MLOps) to enable data-driven decision-making.

Key Responsibilities

Data Engineering & Pipeline Development
- Design, develop, and maintain scalable, robust, and efficient data pipelines for batch and real-time data processing.
- Build and optimize ETL/ELT workflows to extract, transform, and load structured and unstructured data from multiple sources.
- Work with distributed data processing frameworks like Apache Spark, Hadoop, or Dask for large-scale data processing.
- Ensure data integrity, quality, and security across the data pipelines.
- Implement data governance, cataloging, and lineage tracking using appropriate tools.

Machine Learning Integration
- Collaborate with data scientists to deploy, monitor, and optimize ML models in production.
- Design and implement feature engineering pipelines to improve model performance.
- Build and maintain MLOps workflows, including model versioning, retraining, and performance tracking.
- Optimize ML model inference for low-latency and high-throughput applications.
- Work with ML frameworks such as TensorFlow, PyTorch, Scikit-learn, and deployment tools like Kubeflow, MLflow, or SageMaker.

Cloud & Big Data Technologies
- Architect and manage cloud-based data solutions using AWS, Azure, or GCP.
- Utilize serverless computing (AWS Lambda, Azure Functions) and containerization (Docker, Kubernetes) for scalable deployment.
- Work with data lakehouses (Delta Lake, Iceberg, Hudi) for efficient storage and retrieval.

Database & Storage Management
- Design and optimize relational (PostgreSQL, MySQL, SQL Server) and NoSQL (MongoDB, Cassandra, DynamoDB) databases.
- Manage and optimize data warehouses (Snowflake, BigQuery, Redshift, Databricks) for analytical workloads.
- Implement data partitioning, indexing, and query optimizations for performance improvements.

Collaboration & Best Practices
- Work closely with data scientists, software engineers, and DevOps teams to develop scalable and reusable data solutions.
- Implement CI/CD pipelines for automated testing, deployment, and monitoring of data workflows.
- Follow best practices in software engineering, data modeling, and documentation.
- Continuously improve the data infrastructure by researching and adopting new technologies.

Required Skills & Qualifications

Technical Skills:
- Programming Languages: Python, SQL, Scala, Java
- Big Data Technologies: Apache Spark, Hadoop, Dask, Kafka
- Cloud Platforms: AWS (Glue, S3, EMR, Lambda), Azure (Data Factory, Synapse), GCP (BigQuery, Dataflow)
- Data Warehousing: Snowflake, Redshift, BigQuery, Databricks
- Databases: PostgreSQL, MySQL, MongoDB, Cassandra
- ETL/ELT Tools: Airflow, dbt, Talend, Informatica
- Machine Learning Tools: MLflow, Kubeflow, TensorFlow, PyTorch, Scikit-learn
- MLOps & Model Deployment: Docker, Kubernetes, SageMaker, Vertex AI
- DevOps & CI/CD: Git, Jenkins, Terraform, CloudFormation

Soft Skills:
- Strong analytical and problem-solving abilities.
- Excellent collaboration and communication skills.
- Ability to work in an agile and cross-functional team environment.
- Strong documentation and technical writing skills.

Preferred Qualifications
- Experience with real-time streaming solutions like Apache Flink or Spark Streaming.
- Hands-on experience with vector databases and embeddings for ML-powered applications.
- Knowledge of data security, privacy, and compliance frameworks (GDPR, HIPAA).
- Experience with GraphQL and REST API development for data services.
- Understanding of LLMs and AI-driven data analytics.
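The extract-transform-load workflow at the heart of this role can be sketched in miniature. This toy version uses the stdlib `sqlite3` module in place of a warehouse, with a basic data-quality gate in the transform step and an idempotent load; the table and field names are invented for illustration:

```python
import sqlite3

def extract():
    # Extract: in practice this would read from an API, files, or a Kafka topic.
    return [{"id": 1, "amount": "42.50"}, {"id": 2, "amount": "bad"}, {"id": 3, "amount": "7.00"}]

def transform(rows):
    # Transform: cast types and drop records that fail validation (a data-quality gate).
    clean = []
    for r in rows:
        try:
            clean.append((r["id"], float(r["amount"])))
        except ValueError:
            continue  # a real pipeline would route this to a dead-letter store
    return clean

def load(rows, conn):
    # Load: idempotent upsert, so pipeline reruns don't duplicate data.
    conn.execute("CREATE TABLE IF NOT EXISTS payments (id INTEGER PRIMARY KEY, amount REAL)")
    conn.executemany("INSERT OR REPLACE INTO payments VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
count, total = conn.execute("SELECT COUNT(*), SUM(amount) FROM payments").fetchone()
print(count, total)  # 2 49.5  (the invalid record was dropped)
```

Orchestrators like Airflow or dbt, listed above, schedule and chain exactly these kinds of extract/transform/load steps at scale.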
Posted 2 months ago
6.0 - 7.0 years
8 - 12 Lacs
Bengaluru
Work from Office
About The Role As a Node.js Developer, you will be responsible for building and maintaining the backbone of our applications. You will utilize your deep understanding of JavaScript, the Node.js ecosystem, and asynchronous programming principles to create efficient and reliable server-side code. You will participate in architectural discussions, contribute to technical designs, and ensure the delivery of robust and well-tested backend services. Responsibilities - Design, develop, and maintain robust and scalable server-side applications and APIs using Node.js and related frameworks (e.g., Express.js, NestJS, Koa.js). - Architect and implement well-documented and secure RESTful APIs and potentially other web service protocols (e.g., GraphQL). - Follow best practices for API design, versioning, and error handling. - Integrate applications with various types of databases, including relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra, Redis). - Design and implement efficient database schemas and data models. - Write optimized database queries and utilize ORM/ODM libraries (e.g., Sequelize, Mongoose) effectively. - Leverage Node.js's non-blocking, event-driven architecture to build high-performance and scalable applications. - Implement efficient asynchronous patterns using Promises, Async/Await, and event loops. - Identify and address performance bottlenecks through profiling and optimization techniques. - Write comprehensive unit, integration, and end-to-end tests using appropriate testing frameworks (e.g., Jest, Mocha, Chai, Supertest). - Participate actively in code reviews to ensure code quality, maintainability, and adherence to coding standards. - Implement and maintain CI/CD pipelines for automated testing and deployment. - Collaborate effectively with frontend developers to define API contracts and ensure seamless integration. 
- Communicate technical concepts clearly and concisely to both technical and non-technical stakeholders. - Participate in team meetings, sprint planning, and other agile ceremonies. - Participate in the deployment process of Node.js applications to various environments (e.g., cloud platforms like AWS, Azure, GCP; on-premise servers). - Understand and utilize containerization technologies like Docker and orchestration tools like Kubernetes (preferred). - Implement monitoring and logging solutions to ensure application health and facilitate troubleshooting. - Implement security best practices to protect applications from common vulnerabilities (e.g., OWASP Top 10). - Implement authentication and authorization mechanisms. - Ensure data security and compliance. - Stay up-to-date with the latest trends and advancements in Node.js, JavaScript, and backend development. - Proactively explore and evaluate new technologies and tools to improve our development processes and application quality. Skills Required - Deep understanding of JavaScript fundamentals, including asynchronous programming, closures, and prototypal inheritance. - Comprehensive knowledge of the Node.js runtime environment, event loop, and core modules. - Strong experience with at least one popular Node.js framework (Express.js is highly preferred; experience with NestJS or Koa.js is a plus). - Proven ability to design, develop, and document RESTful APIs. - Hands-on experience integrating with various types of databases (both SQL and NoSQL). - Testing: Proficiency in writing unit, integration, and end-to-end tests with relevant testing frameworks.
Posted 2 months ago
3.0 - 5.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Job Posting: Back-End Developer

Your Role: As a Back-End Developer, you'll collaborate with the development team to build and maintain scalable, secure, and high-performing back-end systems for our SaaS products. You will play a key role in designing and implementing microservices architectures, integrating databases, and ensuring seamless operation of cloud-based applications.

Responsibilities:
- Design, develop, and maintain robust and scalable back-end solutions using modern frameworks and tools.
- Create, manage, and optimize microservices architectures, ensuring efficient communication between services.
- Develop and integrate RESTful APIs to support front-end and third-party systems.
- Design and implement database schemas and optimize performance for SQL and NoSQL databases.
- Support deployment processes by aligning back-end development with CI/CD pipeline requirements.
- Implement security best practices, including authentication, authorization, and data protection.
- Collaborate with front-end developers to ensure seamless integration of back-end services.
- Monitor and enhance application performance, scalability, and reliability.
- Keep up-to-date with emerging technologies and industry trends to improve back-end practices.

Your Qualifications:

Must-Have Skills
- Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.
- Proven experience as a Back-End Developer with expertise in modern frameworks such as Node.js, Express.js, or Django.
- Expertise in .NET frameworks, including development in C++ and C# for high-performance databases.
- Strong proficiency in building and consuming RESTful APIs.
- Expertise in database design and management with both SQL (e.g., PostgreSQL, MS SQL Server) and NoSQL (e.g., MongoDB, Cassandra) databases.
- Hands-on experience with microservices architecture and containerization tools like Docker and Kubernetes.
- Strong understanding of cloud platforms like Microsoft Azure, AWS, or Google Cloud for deployment, monitoring, and management. - Proficiency in implementing security best practices (e.g., OAuth, JWT, encryption techniques). - Experience with CI/CD pipelines and tools such as Jenkins, GitHub Actions, or Azure DevOps. - Familiarity with Agile methodologies and participation in sprint planning and reviews. Good-to-Have Skills - Experience with time-series databases like TimescaleDB or InfluxDB. - Experience with monitoring solutions like Datadog or Splunk. - Experience with real-time data processing frameworks like Kafka or RabbitMQ. - Familiarity with serverless architecture and tools like Azure Functions or AWS Lambda. - Expertise in Java backend services and microservices. - Hands-on experience with visualization tools like Grafana or Kibana for monitoring. - Knowledge of API management platforms like Kong or Apigee. - Experience with integrating AI/ML models into back-end systems. - Familiarity with MLOps pipelines and managing AI/ML workloads. - Understanding of iPaaS (Integration Platform as a Service) and related technologies. Key Competencies & Attributes: - Strong problem-solving and analytical skills. - Exceptional organizational skills with the ability to manage multiple priorities. - Adaptability to evolving technologies and industry trends. - Excellent collaboration and communication skills to work effectively in cross-functional teams. - Ability to thrive in self-organizing teams with a focus on transparency and trust.
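The JWT-based authentication mentioned above comes down to signing a payload and later verifying that signature. A stdlib-only sketch of the HS256 idea (the secret and claims are invented for illustration; a real service should use a vetted library such as PyJWT and include standard claims like `exp`):

```python
import base64, hashlib, hmac, json

SECRET = b"demo-secret"  # illustrative only; load from a secrets manager in production

def b64url(data: bytes) -> str:
    # JWTs use unpadded URL-safe base64.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign(payload: dict) -> str:
    # header.payload.signature, each segment base64url-encoded.
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    sig = b64url(hmac.new(SECRET, f"{header}.{body}".encode(), hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def verify(token: str) -> bool:
    header, body, sig = token.split(".")
    expected = b64url(hmac.new(SECRET, f"{header}.{body}".encode(), hashlib.sha256).digest())
    # Constant-time comparison guards against timing attacks.
    return hmac.compare_digest(sig, expected)

token = sign({"sub": "user-123", "role": "admin"})
tampered = token.rsplit(".", 1)[0] + "." + b64url(b"forged")
print(verify(token))     # True
print(verify(tampered))  # False: the forged signature does not match
```

The design point is that the server never stores session state: any instance holding the secret can verify the token, which is what makes this pattern fit the stateless microservices described above.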
Posted 2 months ago
5.0 - 10.0 years
14 - 18 Lacs
Bengaluru
Work from Office
Experience with:
- Leading Java development (Spring Boot, Java 8+, Spring, microservices)
- REST APIs: how to design, build and maintain microservices, and best practices, including security and performance tuning
- AWS (Amazon Web Services) services
- Relational and NoSQL databases, e.g., MySQL, MongoDB, Redis, Cassandra
- Architecture of cloud solutions, with a focus on scalability and high availability in a microservice architecture (Spring Boot)
- Designing, enhancing, updating, and programming changes for portions and subsystems of systems software, including utilities, databases, and CI/CD tools
- Basic knowledge of Kubernetes deployments for microservices and tools like kOps, Helm, etc.
- Strong core Java skills: design patterns, collections, garbage collection, multithreading
- Extensive experience with Java frameworks and technologies, including Spring Boot, Hibernate, and Maven
- Experience in writing and managing JUnit tests and mock frameworks like Mockito, JMock or equivalent
- Knowledge of relational databases and SQL (e.g., Oracle, SQL Server), as well as NoSQL databases such as MongoDB or Cassandra
Experience with RESTful web services and API design. Knowledge of relational databases and SQL, as well as NoSQL. Required Skills: 5+ years of Java and microservices experience. Excellent analytical and problem-solving skills. Strong ability to work independently, propose architectural solutions, create prototypes, and deliver necessary technical documentation. Ability to provide technical guidance on full stack, design, coding, and delivery. Good at extracting and writing requirements and specifications; extensive experience with multiple software application design tools and languages. Excellent written and verbal communication skills; proficiency in English and the local language. Ability to effectively communicate product architectures and design proposals, and to negotiate. Ability to work independently in a fast-paced environment and deliver results under pressure. Passion for quality and attention to detail.
Posted 2 months ago
7.0 - 8.0 years
17 - 22 Lacs
Mumbai
Work from Office
About the Role - We are seeking a highly skilled and experienced Senior Data Architect to join our growing data engineering team. - As a Senior Data Architect, you will play a critical role in designing, developing, and implementing robust and scalable data solutions to support our business needs. - You will be responsible for defining data architectures, ensuring data quality and integrity, and driving data-driven decision making across the organization. Key Responsibilities - Design and implement data architectures for various data initiatives, including data warehouses, data lakes, and data marts. - Define data models, data schemas, and data flows for complex data integration projects. - Develop and maintain data dictionaries and metadata repositories. - Ensure data quality and consistency across all data sources. - Design and implement data warehousing solutions, including ETL/ELT processes, data transformations, and data aggregations. - Support the development and implementation of business intelligence and reporting solutions. - Optimize data warehouse performance and scalability. - Define and implement data governance policies and procedures. - Ensure data security and compliance with relevant regulations (e.g., GDPR, CCPA). - Develop and implement data access controls and data masking strategies. - Design and implement data solutions on cloud platforms (AWS, Azure, GCP), leveraging cloud-native data services. - Implement data pipelines and data lakes on cloud platforms. - Collaborate effectively with data engineers, data scientists, business analysts, and other stakeholders. - Communicate complex technical information clearly and concisely to both technical and non-technical audiences. - Present data architecture designs and solutions to stakeholders. Qualifications Essential - 7+ years of experience in data architecture, data modeling, and data warehousing. 
- Strong understanding of data warehousing concepts, including dimensional modeling, ETL/ELT processes, and data quality. - Experience with relational databases (e.g., SQL Server, Oracle, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra). - Experience with data integration tools and technologies. - Excellent analytical and problem-solving skills. - Strong communication and interpersonal skills. - Bachelor's degree in Computer Science, Computer Engineering, or a related field.
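The dimensional modeling this role requires is most often realized as a star schema: a central fact table of measures keyed to surrounding dimension tables of descriptive attributes. A toy sketch using stdlib `sqlite3` (the table names and sample data are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimensions hold descriptive attributes; the fact table holds measures plus
# foreign keys to each dimension -- the "star" shape of the schema.
cur.executescript("""
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, day TEXT);
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales  (date_id INTEGER, product_id INTEGER, amount REAL);
""")
cur.executemany("INSERT INTO dim_date VALUES (?, ?)", [(1, "2024-01-01"), (2, "2024-01-02")])
cur.executemany("INSERT INTO dim_product VALUES (?, ?)", [(10, "widget"), (11, "gadget")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 10, 5.0), (1, 11, 3.0), (2, 10, 2.0)])

# A typical analytical query: join facts to a dimension and aggregate a measure.
rows = cur.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product p ON f.product_id = p.product_id
    GROUP BY p.name ORDER BY p.name
""").fetchall()
print(rows)  # [('gadget', 3.0), ('widget', 7.0)]
```

A snowflake schema differs only in that the dimensions themselves are further normalized into sub-dimensions; the fact table stays the same.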
Posted 2 months ago
6.0 - 7.0 years
16 - 20 Lacs
Mumbai
Work from Office
About The Role We are seeking a highly skilled and motivated Solutions Architect or Backend Architect to play a pivotal role in designing, developing, and integrating robust and scalable solutions. This role will focus heavily on API development and seamless system integration, leveraging a diverse technology stack including Python (with Django and Flask frameworks), Angular, Java, and JavaScript. You will be instrumental in translating business needs into technical blueprints, guiding development teams, and ensuring the delivery of high-quality, secure, and efficient IT solutions. Responsibilities - Lead and facilitate brainstorming sessions with stakeholders to understand business requirements, challenges, and opportunities, and translate them into potential technical solutions. - Design and architect scalable, secure, and high-performance backend systems and APIs. - Develop comprehensive architectural blueprints, diagrams, and documentation outlining system components, data flows, and integration strategies. - Define and enforce architectural standards, patterns, and best practices across development teams. - Evaluate and recommend appropriate technologies, frameworks, and tools to meet project requirements and business objectives. - Consider non-functional requirements such as performance, scalability, reliability, security, and maintainability in architectural designs. - Possess an in-depth understanding of coding languages relevant to API development, including Angular, Python (with expertise in Django and/or Flask), Java, and JavaScript. - Design and implement RESTful and other API architectures, ensuring clarity, consistency, and ease of use. - Develop and maintain API documentation using tools like Swagger/OpenAPI. - Architect and implement robust system integration solutions, connecting diverse applications and data sources. - Experience with various integration patterns and technologies (e.g., message queues, event-driven architectures). 
- Demonstrate good working knowledge of various Cloud Service Providers (e.g., AWS, Azure, GCP) and their core services relevant to backend development and deployment. - Design and architect cloud-native solutions, leveraging platform-as-a-service (PaaS) and serverless technologies where appropriate. - Understand cloud security best practices and ensure solutions are deployed securely in the cloud environment. - Possess experience with enterprise architecture principles and methodologies. - Familiarity with IT architecture frameworks (e.g., TOGAF, Zachman) is a plus. - Understand fundamental IT security principles and best practices, ensuring security is integrated into the solution design. - Have a working understanding of IT infrastructure components and their impact on solution architecture. - Appreciate IT governance and compliance requirements and ensure solutions adhere to relevant policies. - Actively identify opportunities for process improvements within the development lifecycle and IT operations. - Effectively analyze and resolve complex technical problems that arise during development or in production environments. - Provide technical leadership, guidance, and mentorship to development teams. - Collaborate effectively with cross-functional teams, including product managers, designers, and QA engineers. - Participate in code reviews and ensure code quality and adherence to architectural standards. - Continuously research current and emerging technologies, trends, and best practices in backend development, API design, and system integration. - Proactively propose and evaluate new technologies and approaches that could benefit the organization. - Demonstrate sound knowledge of various operating systems (e.g., Linux, Windows Server). - Possess a strong understanding of different database technologies (both relational and NoSQL) and their appropriate use cases. 
- Prepare and document comprehensive testing requirements to ensure the quality and reliability of the developed solutions. - Create and maintain clear and concise technical documentation for architectural designs, APIs, and integration processes. - Identify areas where IT can effectively support business needs and contribute to achieving organizational goals. - Collaborate with business units to understand their strategic objectives and translate them into actionable IT strategies and solutions. - Work with business units to improve current IT implementations and drive efficiency. - Participate in building and migrating software applications and services across the organization, ensuring minimal disruption and data integrity. - Work closely with product and delivery teams to architect and develop highly scalable and reliable solutions and products. - Communicate effectively and consult with both technical and non-technical clients and internal stakeholders to understand their needs and develop appropriate solutions. - Manage customer satisfaction by setting realistic expectations for the end-product and ensuring clear communication throughout the development process. - Demonstrate strong organizational skills to manage complex projects and tasks effectively. - Exhibit leadership qualities to guide and influence development teams and stakeholders. Qualifications - Proven experience as a Solutions Architect, Backend Architect, or in a similar role focused on API development and system integration. - Deep expertise in one or more of the following programming languages Python (Django, Flask), Java, JavaScript, and experience with Angular framework. - Strong understanding of API design principles (REST, SOAP, etc.) and experience with API management platforms. - Solid experience with database systems (e.g., PostgreSQL, MySQL, MongoDB, Cassandra). - Hands-on experience with cloud platforms (e.g., AWS, Azure, GCP) and their services. 
- Excellent communication, presentation, and interpersonal skills. - Strong analytical and problem-solving abilities. - Ability to work independently and as part of a collaborative team. - Strong organizational and time management skills.
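The event-driven integration pattern named in this posting decouples producers from consumers through a broker: the publisher emits an event to a topic and never references who handles it. A minimal in-process sketch (illustrative only; production systems would use Kafka, RabbitMQ, or a managed cloud equivalent):

```python
from collections import defaultdict

class EventBus:
    """Minimal in-process publish/subscribe bus."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Decoupling: the publisher never references its consumers directly,
        # so new consumers can be added without touching publisher code.
        for handler in self._subscribers[topic]:
            handler(event)

bus = EventBus()
received = []
# Two independent services react to the same event.
bus.subscribe("order.created", lambda e: received.append(("inventory", e["id"])))
bus.subscribe("order.created", lambda e: received.append(("billing", e["id"])))
bus.publish("order.created", {"id": 42})
print(received)  # [('inventory', 42), ('billing', 42)]
```

This is the property that lets an architect add, say, a fraud-check consumer later without modifying the order service, which is the main argument for the pattern in the systems described above.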
Posted 2 months ago
8.0 - 12.0 years
45 - 50 Lacs
Bengaluru
Work from Office
Member of a software engineering team involved in the development and design of an AI Data Platform built on NetApp's flagship storage operating system, ONTAP. ONTAP is a feature-rich stack whose rich data management capabilities deliver tremendous value to our customers and are used in mission-critical applications across the world. You will work as part of a team responsible for the development, testing and debugging of distributed software that drives NetApp cloud, hybrid-cloud and on-premises solutions. As part of the Research and Development function, the overall focus of the group is on competitive market and customer requirements, supportability, technology advances, product quality, product cost and time-to-market. Software engineers focus on new product development along with enhancements to existing products. This is a mid-level technical lead position that requires an individual to be broad-thinking, systems-focused, creative, team-oriented, technologically savvy, able to lead large cross-functional teams, and driven to produce results.

Job Requirements
- Proficiency in programming languages like Go/Golang.
- Experience with machine learning libraries and frameworks: PyTorch, TensorFlow, Keras, OpenAI, open-source LLMs, LangChain, etc.
- Experience working in Linux, AWS/Azure/GCP, and Kubernetes (control plane, auto-scaling, orchestration, containerization) is a must.
- Experience with NoSQL document databases (e.g., MongoDB, Cassandra, Cosmos DB, DocumentDB).
- Experience building microservices, REST APIs and related API frameworks.
- Experience with big data technologies: understanding of platforms like Spark, Hadoop and distributed storage systems for handling large-scale datasets and parallel processing.
- Experience with filesystems, networking, or file/cloud protocols is a must.
- Proven track record of leading mid- to large-sized projects.
This position requires an individual to be creative, team-oriented, a quick learner and driven to produce results. Responsible for providing support in the development and testing activities of other engineers that involve several inter-dependencies. Participate in technical discussions within the team and with other groups within business units associated with specified projects. Willing to work on additional tasks and responsibilities that will contribute towards team, department and company goals. A strong understanding of, and experience with, concepts related to computer architecture, data structures and programming practices. Experience with AI/ML frameworks like PyTorch or TensorFlow is a plus. Education: Typically requires a minimum of 8-12 years of related experience with a bachelor's degree and a master's degree, or a PhD with relevant experience.
Posted 2 months ago
2.0 - 5.0 years
7 - 10 Lacs
Bengaluru
Work from Office
Skills: Java->Spring Boot, Java->Microservices, Java

Responsibilities / Required Experience:
- Java 8, Spring, Spring Boot, microservices, APIs, XML and JSON formats
- Experience working in the media domain
- Working knowledge of design and build in AWS environments, AWS networking
- Knowledge of DevOps-related technologies like Ansible, Terraform, Bitbucket, Jenkins
- Good understanding of networking
- Requirement gathering and impact analysis
- Good troubleshooting/problem-solving skills
- Databases: MySQL, NoSQL (Cassandra), PostgreSQL
- Other tools: Kibana, Grafana, Logstash, ELK, Kubernetes

Preferred:
- Experience working on large enterprise-level microservice solutions
- Experience working with automation testing and related tools
Posted 2 months ago
6.0 - 11.0 years
8 - 13 Lacs
Gurugram
Work from Office
Join us as a Data Engineer We'll look to you to drive the build of effortless, digital-first customer experiences as you simplify our bank while keeping our data safe and secure Day-to-day, you'll develop innovative, data-driven solutions through data pipeline modelling and ETL design, striving to be commercially successful through insights This is your opportunity to explore your leadership potential while bringing a competitive edge to your career profile by solving problems and creating smarter solutions We're offering this role at director level What you'll do In this role, you'll develop and share knowledge of business data structures and metrics, advocating for changes when needed for product development. You'll also educate and embed new data techniques into the business through role modelling, training, and experiment design oversight. We'll look to you to drive DevOps adoption into the delivery of data engineering, proactively performing root cause analysis while resolving issues. You'll also deliver a clear understanding of data platform cost levers to meet department cost savings and income targets. 
You'll also be responsible for: Driving customer value by understanding complex business problems and requirements to correctly apply the most appropriate and reusable tools to gather and build data solutions Actively participating in the data engineering community to deliver opportunities to support our bank's strategic direction Driving data engineering strategies to build complex, scalable data architecture and a feature-rich customer dataset Driving the advanced automation of data engineering pipelines through the removal of manual stages Working alongside colleagues, scrums and project teams while liaising with technology and engineering teams to build business stakeholder engagement and to develop data solutions The skills you'll need We're looking for someone with strong communication skills and the ability to proactively engage and manage a wide range of stakeholders. You'll have extensive experience working in a governed, regulated environment. You'll also need: Experience of extracting value and features from large-scale data Advanced experience of designing efficient data models that meet organisational needs. Familiarity with various modelling techniques like star and snowflake schemas can significantly enhance data retrieval and reporting performance Proven proficiency in relational databases (e.g., MySQL, PostgreSQL) as well as NoSQL databases (e.g., MongoDB, Cassandra) is critical for storing and managing various types of data. An understanding of modern code development practices Hours: 45 Job Posting Closing Date: 03/06/2025
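The star schema modelling mentioned above can be sketched in a few lines: a central fact table keyed by surrogate IDs, joined to small dimension tables at query time. The table and column names below are hypothetical, purely for illustration.

```python
# A toy star schema: one fact table plus a customer dimension.
fact_payments = [
    {"date_id": 1, "customer_id": 10, "amount": 250.0},
    {"date_id": 1, "customer_id": 11, "amount": 100.0},
    {"date_id": 2, "customer_id": 10, "amount": 75.0},
]
dim_customer = {
    10: {"name": "Acme", "region": "EMEA"},
    11: {"name": "Globex", "region": "APAC"},
}

def revenue_by_region(facts, customers):
    """Aggregate fact rows after resolving the customer dimension."""
    totals = {}
    for row in facts:
        region = customers[row["customer_id"]]["region"]
        totals[region] = totals.get(region, 0.0) + row["amount"]
    return totals
```

In a warehouse the same shape holds at scale: narrow fact tables with foreign keys, wide but small dimensions, and reporting queries that join and aggregate across them.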
Posted 2 months ago
7.0 - 12.0 years
9 - 14 Lacs
Pune
Work from Office
About Enlyft Data and AI are at the core of the Enlyft platform. We are looking for creative, customer- and detail-obsessed data engineers who can contribute to our strong engineering culture. Our big data engine indexes billions of structured/unstructured documents and leverages data science to accurately infer the footprint of thousands of technologies and products across millions of businesses worldwide. The complex and evolving relationships between products and companies form a technological graph that is core to our predictive modeling solutions. Our machine learning based models work by combining data from our customers' CRMs with our proprietary technological graph and firmographic data, and reliably predict an account's propensity to buy. About the Role As a key member of our data platform team, you'll be tasked with development of our next-generation data platform. Your responsibilities will include building robust data pipelines for data acquisition and processing, implementing optimized data models, and creating APIs and data products to support our machine learning models, insights engine, and customer-facing applications. Additionally, you'll harness the power of GenAI throughout the data platform lifecycle, while maintaining a strong focus on data governance to uphold timely data availability with high accuracy. What we're looking for Bachelor's degree or higher in Computer Science, Engineering or a related field with 7+ years of experience in data engineering and a strong focus on designing and building scalable data platforms and products Proven expertise in data modeling, ETL/ELT processes, and data warehousing with distributed computing - Hadoop, Spark and Kafka Proficient in programming languages such as Python, Java and SQL Experience with clouds such as AWS, Azure or GCP and related services (S3, Redshift, BigQuery, Dataflow) Strong understanding of SQL / NoSQL databases (e.g. 
PostgreSQL, MySQL, Cassandra) Proven expertise in data quality checks to ensure data accuracy, completeness, consistency, and timeliness Excellent problem-solving skills in a fast-paced, collaborative environment, coupled with strong communication for effective interaction with technical and non-technical stakeholders Why join Enlyft? A top-notch culture that is customer-obsessed, transparent and constantly strives for excellence A top-notch team with colleagues who will help you learn and grow in a collaborative environment Competitive pay and great benefits Enlyft is an equal opportunity employer and values diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.
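One of the data quality dimensions listed above, completeness, reduces to a simple check in code: what fraction of rows have every required field populated? The field names and threshold below are hypothetical; a production pipeline would run checks like this per batch and alert on regressions.

```python
def completeness(rows, required):
    """Fraction of rows where every required field is present and non-null."""
    if not rows:
        return 1.0
    ok = sum(all(r.get(f) is not None for f in required) for r in rows)
    return ok / len(rows)

# Illustrative batch: one row is missing its email.
batch = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},
]
score = completeness(batch, ["id", "email"])
```

Consistency and timeliness checks follow the same pattern: a pure function over a batch that yields a score, compared against an agreed threshold.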
Posted 2 months ago
9.0 - 14.0 years
35 - 40 Lacs
Gurugram
Work from Office
Join us as a Data Engineer We'll look to you to drive the build of effortless, digital-first customer experiences as you simplify our bank while keeping our data safe and secure Day-to-day, you'll develop innovative, data-driven solutions through data pipeline modelling and ETL design, striving to be commercially successful through insights This is your opportunity to explore your leadership potential while bringing a competitive edge to your career profile by solving problems and creating smarter solutions We're offering this role at director level What you'll do In this role, you'll develop and share knowledge of business data structures and metrics, advocating for changes when needed for product development. You'll also educate and embed new data techniques into the business through role modelling, training, and experiment design oversight. We'll look to you to drive DevOps adoption into the delivery of data engineering, proactively performing root cause analysis while resolving issues. You'll also deliver a clear understanding of data platform cost levers to meet department cost savings and income targets. 
You'll also be responsible for: Driving customer value by understanding complex business problems and requirements to correctly apply the most appropriate and reusable tools to gather and build data solutions Actively participating in the data engineering community to deliver opportunities to support our bank's strategic direction Driving data engineering strategies to build complex, scalable data architecture and a feature-rich customer dataset Driving the advanced automation of data engineering pipelines through the removal of manual stages Working alongside colleagues, scrums and project teams while liaising with technology and engineering teams to build business stakeholder engagement and to develop data solutions The skills you'll need We're looking for someone with strong communication skills and the ability to proactively engage and manage a wide range of stakeholders. You'll have extensive experience working in a governed, regulated environment. You'll also need: Experience of extracting value and features from large-scale data Advanced experience of designing efficient data models that meet organisational needs. Familiarity with various modelling techniques like star and snowflake schemas can significantly enhance data retrieval and reporting performance Proven proficiency in relational databases (e.g., MySQL, PostgreSQL) as well as NoSQL databases (e.g., MongoDB, Cassandra) is critical for storing and managing various types of data. An understanding of modern code development practices Hours: 45 Job Posting Closing Date: 03/06/2025
Posted 2 months ago
7.0 - 11.0 years
0 - 1 Lacs
Hyderabad
Work from Office
We are seeking a highly skilled Java Full Stack Engineer to join our dynamic development team. In this role, you will be responsible for designing, developing, and maintaining both frontend and backend components of our applications using Java and associated technologies. You will collaborate with cross-functional teams to deliver robust, scalable, and high-performing software solutions that meet our business needs. The ideal candidate will have a strong background in Java programming, experience with modern frontend frameworks, and a passion for full-stack development. Requirements: Bachelor's degree in Computer Science, Engineering, or a related field. 7 to 10+ years of experience in full-stack development, with a strong focus on Java. Java Full Stack Developer Roles & Responsibilities Develop scalable web applications using Java (Spring Boot) for backend and React/Angular for frontend. Implement RESTful APIs to facilitate communication between frontend and backend. Design and manage databases using MySQL, PostgreSQL, Oracle, or MongoDB. Write complex SQL queries and procedures, and perform database optimization. Build responsive, user-friendly interfaces using HTML, CSS, JavaScript, and frameworks like Bootstrap, React, or Angular, with Node.js and Python integration. Integrate APIs with frontend components. Participate in designing microservices and modular architecture. Apply design patterns and object-oriented programming (OOP) concepts. Write unit and integration tests using JUnit, Mockito, Selenium, or Cypress. Debug and fix bugs across full stack components. Use Git, Jenkins, Docker, and Kubernetes for version control, continuous integration, and deployment. Participate in code reviews, automation, and monitoring. Deploy applications on AWS, Azure, or Google Cloud platforms. Use Elastic Beanstalk, EC2, S3, or Cloud Run for backend hosting. Work in Agile/Scrum teams, attend daily stand-ups, sprints, and retrospectives, and deliver iterative enhancements. 
Document code, APIs, and configurations. Collaborate with QA, DevOps, Product Owners, and other stakeholders. Must-Have Skills: Java Programming: Deep knowledge of the Java language, its ecosystem, and best practices. Frontend Technologies: Proficiency in HTML, CSS, JavaScript, and modern frontend frameworks like React or Angular. Backend Development: Expertise in developing and maintaining backend services using Java, Spring, and related technologies. Full Stack Development: Experience in both frontend and backend development, with the ability to work across the entire application stack. Soft Skills: Problem-Solving: Ability to analyze complex problems and develop effective solutions. Communication Skills: Strong verbal and written communication skills to effectively collaborate with cross-functional teams. Analytical Thinking: Ability to think critically and analytically to solve technical challenges. Time Management: Capable of managing multiple tasks and deadlines in a fast-paced environment. Adaptability: Ability to quickly learn and adapt to new technologies and methodologies. Hard Skills: Java Programming: Expert-level knowledge in Java and its application in full-stack development. Frontend Technologies: Proficient in frontend development using HTML, CSS, JavaScript, and frameworks like React or Angular. Backend Development: Skilled in backend development using Java, Spring, Hibernate, and RESTful services. Full Stack Development: Experience in developing end-to-end solutions, from the user interface to backend services and databases. Interview Mode: F2F for candidates residing in Hyderabad / Zoom for other states Location: 43/A, MLA Colony, Road No. 12, Banjara Hills, 500034 Time: 2-4pm (Monday, 26th May to Friday, 30th May)
Posted 2 months ago
5.0 - 10.0 years
17 - 25 Lacs
Bengaluru
Work from Office
Senior WASM Developer: Will be part of a team focused on creating our main platform experience with WASM technologies. This position is perfect for someone enthusiastic about WebAssembly products and community and who wants to work on initiatives with major impact and cutting-edge technologies. You enjoy building next-gen products on WASM and want to push the boundaries of what WASM computing applications can do. Responsibilities: Leverage the power of WASM to build next-gen products at edge and cloud. Work with evangelists in the organization with a deep desire to build next-gen products and computing platforms using WebAssembly Be part of an expert team working on improving specifications and developing pallets with close attention to extensive testing and security Work on use cases across serverless apps, embedded functions, microservices and IoT devices Support the latest WASM standards and proposal development Provide easy-to-understand documentation for the development team. Coordinate with architects to produce technical designs. Document the development process, architecture, and standard components. Job Description: 5+ years of development experience, with mastery of at least one of the languages Rust, Golang, C++ Expertise in JavaScript, TypeScript, Node.js Experience with Linux Knowledge of security best practices and standards. Knowledge of low-level code execution. Experience with relational databases like MySQL, MSSQL and NoSQL databases like MongoDB. Work across multiple teams, mentor junior developers and actively participate in code reviews. Knowledge of design patterns and best practices. Should possess good coding skills and the ability to approach a given problem statement Strong in software programming fundamentals Good to have: Proficiency with WASM and familiarity with ink!, and the ability to extend Wasm for distributed cloud computing Experience with any one of the WASM runtimes: Wasmtime, Lucet, WAMR, WasmEdge. 
Knowledge of distributed communication protocols: devp2p, libp2p Hands-on applied cryptography: signing, hashing, encryption, PKCS, key management. Familiarity with Docker, Kubernetes, Nginx, Git Knowledge of cloud services (AWS, Azure, GCP) Knowledge of stack machines. Awareness of embedded systems and prototype boards like Raspberry Pi, i.MX. Awareness of CI/CD pipelines. Any open-source contribution in the field of WASM. Any certifications or whitepapers. Sr WASM Developer Overall Exp. 5+ Skills - Years of experience - Remarks - Weightage Rust/Golang/C++ - 4+ - Mandatory - 40% WASM - 2+ - Mandatory - 30% Edge - 2+ - Good to have - 15% Linux - 2+ - Good to have - 15%
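The "stack machine" knowledge this listing asks for is the heart of WebAssembly's execution model: instructions pop their operands from an implicit stack and push results back. A toy interpreter in Python (the opcodes are simplified stand-ins, not real Wasm instructions):

```python
def run(program):
    """Evaluate a list of (opcode, *operand) tuples on a value stack."""
    stack = []
    for op, *arg in program:
        if op == "const":
            stack.append(arg[0])          # push an immediate value
        elif op == "add":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)           # pop two, push the sum
        elif op == "mul":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)           # pop two, push the product
    return stack.pop()

# (2 + 3) * 4 expressed in stack order, like Wasm's i32.const/i32.add/i32.mul
prog = [("const", 2), ("const", 3), ("add",), ("const", 4), ("mul",)]
```

Real Wasm runtimes such as Wasmtime add typed values, linear memory, and validation on top of this same pop/push discipline.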
Posted 2 months ago
2.0 - 4.0 years
10 - 18 Lacs
Bengaluru
Work from Office
About Lowe's Lowe's Companies, Inc. (NYSE: LOW) is a FORTUNE 50 home improvement company serving approximately 16 million customer transactions a week in the United States. With total fiscal year 2024 sales of more than $83 billion, Lowe's operates over 1,700 home improvement stores and employs approximately 300,000 associates. Based in Mooresville, N.C., Lowe's supports the communities it serves through programs focused on creating safe, affordable housing, improving community spaces, helping to develop the next generation of skilled trade experts and providing disaster relief to communities in need. For more information, visit Lowes.com. About the Team This hiring is for the Personalization team, a mix of Software Engineers, Data Engineers and Data Scientists. This team builds signals and segments based on the customer's real-time and in-session features and generates predictions that help personalize the customer's digital experience. This in turn reduces friction and helps customers find the right product more quickly. I. Job Summary: The primary purpose of this role is to translate business requirements and functional specifications into logical program designs and to deliver code modules, stable application systems, and software solutions. This includes developing, configuring, or modifying integrated business and/or enterprise application solutions within various computing environments. This role will be working closely with stakeholders and cross-functional departments to communicate project statuses and proposals. II. Roles & Responsibilities: Core Responsibilities: • Translates business requirements and specifications into logical program designs, code modules, stable application systems, and software solutions with occasional guidance from senior colleagues; partners with the product team to understand business needs and functional specifications. 
• Develops, configures, or modifies integrated business and/or enterprise application solutions within various computing environments by designing and coding component-based applications using various programming languages. • Tests applications using test-driven development and behavior-driven development frameworks to ensure the integrity of the application. • Conducts root cause analysis of issues and participates in the code review process to identify gaps. • Implements continuous integration/continuous delivery processes to ensure quality and efficiency in the development cycle using DevOps automation processes and tools. • Ideates, builds, and publishes reusable libraries to improve productivity across teams. • Conducts the implementation and maintenance of complex business and enterprise software solutions to ensure successful deployment of released applications. • Solves difficult technical problems to ensure solutions are testable, maintainable, and efficient. III. Years of Experience: 2-4 years of experience IV. Education Qualification & Certifications: B.Tech in Computer Science or an equivalent stream. Primary Skills (must have) Java Spring Boot Kafka Cassandra Redis GCP Secondary Skills (desired) Apache Beam React Lowe's is an equal opportunity employer and administers all personnel practices without regard to race, color, religious creed, sex, gender, age, national origin, mental or physical disability or medical condition, sexual orientation, gender identity or expression, marital status, military or veteran status, genetic information, or any other category protected under state or local law. Lowe's wishes to maintain appropriate standards and integrity in meeting the requirements of the Information Technology Act's privacy provisions.
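The "signals and segments" work described above typically boils down to rules or models that map in-session features to a segment, which downstream systems use to personalize the experience. A minimal rule-based sketch; the feature names, thresholds, and segment labels are hypothetical, not Lowe's actual logic:

```python
def assign_segment(signals):
    """Map a dict of in-session signals to a coarse customer segment."""
    if signals.get("cart_value", 0) > 500:
        return "high_intent"          # large basket: likely to convert
    if signals.get("pages_viewed", 0) >= 5:
        return "browsing"             # deep session, no commitment yet
    return "casual"                   # default bucket

# Example in-session feature vectors
session_a = {"cart_value": 620.0, "pages_viewed": 3}
session_b = {"pages_viewed": 8}
```

In production these rules would be replaced or augmented by model scores streamed through Kafka, with the segment cached (e.g. in Redis) for low-latency lookup at render time.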
Posted 2 months ago
3.0 - 5.0 years
7 - 9 Lacs
Bengaluru
Work from Office
We are seeking a Generative AI Engineer with 3-5 years of experience and a strong background in developing agentic architectures, with experience across various frameworks and observability tools relevant to generative AI applications. This role demands proficiency in Python, deployment experience, and familiarity with Retrieval-Augmented Generation (RAG) applications. Responsibilities Design, develop, and implement agentic architecture for generative AI systems. Utilize frameworks such as LangChain to enhance application performance and monitoring. Collaborate with cross-functional teams to build and deploy generative AI solutions that address business needs. Develop, test, and maintain agentic RAG applications, ensuring high performance and reliability. Develop, test, and maintain backend services and APIs using Python frameworks Experience in deploying machine learning models in production environments. Design and implement robust testing frameworks to assess model performance, including precision, recall, and other key performance indicators (KPIs). Continuously improve algorithms for accuracy, speed, and robustness in real-time. Stay updated with the latest advancements in generative AI technologies and methodologies to implement innovative solutions. Document processes, architectures, and code to ensure maintainability and knowledge sharing. Required Skills Technical Skills: Proficiency in the Python programming language for developing, training and evaluating deep learning models. Hands-on experience with frameworks like LangChain and LangGraph. Version control systems: Git, GitHub. Web frameworks: FastAPI, Flask, or Django REST Framework. Conduct evaluations of chat applications to assess effectiveness and user experience. Knowledge of various RAG applications and their implementation. Containerization and orchestration: Docker and Kubernetes. Work with relational databases (PostgreSQL/MySQL) and NoSQL databases (MongoDB/Cassandra). 
Develop both synchronous (REST, gRPC) and asynchronous communication channels (using message brokers like RabbitMQ or Kafka). Experience: Minimum of 3 years of experience in software engineering or related fields. Proven track record of developing generative AI applications. Experience in evaluating chat applications for functionality and user engagement. Soft Skills: Strong analytical and problem-solving skills. Excellent communication skills for collaboration with team members and stakeholders. Ability to work independently as well as part of a team in a fast-paced environment.
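The retrieval step at the core of the RAG applications mentioned above can be shown in miniature: rank documents against the query, then hand the top matches to the generator as context. A real system would use embeddings and a vector store; plain token overlap stands in here, and the example documents are hypothetical.

```python
def retrieve(query, documents, k=2):
    """Return the k documents sharing the most tokens with the query."""
    q = set(query.lower().split())
    return sorted(
        documents,
        key=lambda d: len(q & set(d.lower().split())),  # overlap score
        reverse=True,
    )[:k]

docs = [
    "payments run on kafka streams",
    "the api gateway uses zuul",
    "kafka topics hold payment events",
]
top = retrieve("kafka payment events", docs)
```

The retrieved snippets would then be concatenated into the LLM prompt, which is what grounds the generated answer in the source corpus.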
Posted 2 months ago
2 - 5 years
2 - 5 Lacs
Bengaluru
Work from Office
Databricks Engineer Full-time Department: Digital, Data and Cloud Company Description Version 1 has celebrated over 26 years in Technology Services and continues to be trusted by global brands to deliver solutions that drive customer success. Version 1 has several strategic technology partners including Microsoft, AWS, Oracle, Red Hat, OutSystems and Snowflake. We're also an award-winning employer, reflecting how employees are at the heart of Version 1. We've been awarded: Innovation Partner of the Year Winner 2023 Oracle EMEA Partner Awards, Global Microsoft Modernising Applications Partner of the Year Award 2023, AWS Collaboration Partner of the Year - EMEA 2023 and Best Workplaces for Women by Great Place To Work in UK and Ireland 2023. As a consultancy and service provider, Version 1 is a digital-first environment and we do things differently. We're focused on our core values; using these we've seen significant growth across our practices and our Digital, Data and Cloud team is preparing for the next phase of expansion. This creates new opportunities for driven and skilled individuals to join one of the fastest-growing consultancies globally. About The Role This is an exciting opportunity for an experienced developer of large-scale data solutions. You will join a team delivering a transformative cloud-hosted data platform for a key Version 1 customer. The ideal candidate will have a proven track record as a senior, self-starting data engineer in implementing data ingestion and transformation pipelines for large-scale organisations. We are seeking someone with deep technical skills in a variety of technologies, specifically Spark performance tuning/optimisation and Databricks, to play an important role in developing and delivering early proofs of concept and production implementations. 
You will ideally have experience in building solutions using a variety of open-source tools and Microsoft Azure services, and a proven track record in delivering high-quality work to tight deadlines. Your main responsibilities will be: Designing and implementing highly performant metadata-driven data ingestion & transformation pipelines from multiple sources using Databricks Streaming and batch processes in Databricks Spark performance tuning/optimisation Providing technical guidance for complex geospatial problems and Spark dataframes Developing scalable and re-usable frameworks for ingestion and transformation of large data sets Data quality system and process design and implementation. Integrating the end-to-end data pipeline to take data from source systems to target data repositories, ensuring the quality and consistency of data is maintained at all times Working with other members of the project team to support delivery of additional project components (reporting tools, API interfaces, search) Evaluating the performance and applicability of multiple tools against customer requirements Working within an Agile delivery / DevOps methodology to deliver proof of concept and production implementation in iterative sprints. Qualifications Direct experience of building data pipelines using Azure Data Factory and Databricks Experience required is 6 to 8 years. Building data integration with Python Databricks Engineer certification Microsoft Azure Data Engineer certification. Hands-on experience designing and delivering solutions using the Azure Data Analytics platform. Experience building data warehouse solutions using ETL / ELT tools like Informatica, Talend. Comprehensive understanding of data management best practices, including demonstrated experience with data profiling, sourcing, and cleansing routines utilizing typical data quality functions involving standardization, transformation, rationalization, linking and matching. 
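"Metadata-driven" pipelines, as asked for above, mean each source is described by a config entry rather than bespoke code, and one generic runner applies the declared steps. A stripped-down sketch in plain Python (in Databricks the same idea drives parameterised notebooks or Spark jobs; the source names and steps here are hypothetical):

```python
# Per-source metadata: which transformation steps apply, in what order.
METADATA = {
    "orders":  {"steps": ["strip", "upper"]},
    "refunds": {"steps": ["strip"]},
}

# The reusable transformation library the metadata refers to.
TRANSFORMS = {
    "strip": lambda v: v.strip(),
    "upper": lambda v: v.upper(),
}

def ingest(source, records):
    """Run every configured step for `source` over each record."""
    steps = METADATA[source]["steps"]
    out = []
    for rec in records:
        for step in steps:
            rec = TRANSFORMS[step](rec)
        out.append(rec)
    return out
```

Adding a new source then becomes a metadata change, not a code change, which is what makes the approach scale across many feeds.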
Nice to have: Experience working in a DevOps environment with tools such as Microsoft Visual Studio Team Services, Chef, Puppet or Terraform Experience working with structured and unstructured data, including imaging & geospatial data. Experience with open-source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j) Experience with Azure Event Hub, IoT Hub, Apache Kafka, NiFi for use with streaming data / event-based data Additional Information At Version 1, we believe in providing our employees with a comprehensive benefits package that prioritises their well-being, professional growth, and financial stability. One of our standout advantages is the ability to work with a hybrid schedule along with business travel, allowing our employees to strike a balance between work and life. We also offer a range of tech-related benefits, including an innovative Tech Scheme to help keep our team members up-to-date with the latest technology. We prioritise the health and safety of our employees, providing private medical and life insurance coverage, as well as free eye tests and contributions towards glasses. Our team members can also stay ahead of the curve with incentivised certifications and accreditations, including AWS, Microsoft, Oracle, and Red Hat. Our employee-designed Profit Share scheme divides a portion of our company's profits each quarter amongst employees. We are dedicated to helping our employees reach their full potential, offering Pathways Career Development Quarterly, a programme designed to support professional growth.
Posted 2 months ago
5 - 7 years
8 - 14 Lacs
Hyderabad
Work from Office
Responsibilities for this position include : - Provides technical leadership in the Big Data space (Hadoop stack: M/R, HDFS, Pig, Hive, HBase, Flume, Sqoop; NoSQL stores like Cassandra, HBase, etc.) across Fractal and contributes to open-source Big Data technologies - Write and tune complex Java, MapReduce, Pig and Hive jobs - Adapt quickly to changes in requirements and be willing to work with different technologies if required - Experience leading a Backend/Distributed Data Systems team while remaining hands-on is very important - Lead the effort to build, implement and support the data infrastructure - Manage the business intelligence team and vendor partners, ensuring projects are prioritized according to customer and internal needs, and develop top-quality dashboards using industry best practices - Manage a team of data engineers (both full-time associates and/or third-party resources) - Own the majority of deliverables for the Big Data team from a delivery perspective - Analyzes and confirms the integrity of source data to be evaluated - Leads in deploying and auditing models and attributes for accuracy Education for Lead Data Engineer : Have a relevant degree such as a Bachelor's or Master's Degree in Computer Science, Engineering, Statistics, Education, Technical, Information Technology, Information Systems, Mathematics, Computer Engineering, or Management Skills for Lead Data Engineer : Desired skills for lead data engineer include : - Python - Spark - Java - Hive - SQL - Hadoop architecture - Large scale search applications and building high volume data pipelines - Message queuing - NoSQL - Scala Desired experience for lead data engineer includes : - Experience in development utilizing C# .NET 4.5 - Experience in managing a live service that customers depend on - Experience in coaching and managing other engineers - Be a team player and enjoy collaboration with other engineers and teams - Experience with software version management systems - Experience with task/bug tracking software.
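The MapReduce jobs this role writes and tunes all follow one model: a map phase emits key/value pairs, a shuffle groups them by key, and a reduce phase aggregates each group. The canonical word-count example, sketched in plain Python rather than Hadoop:

```python
from collections import defaultdict

def map_phase(lines):
    """Map: emit (word, 1) for every word in the input."""
    for line in lines:
        for word in line.split():
            yield word, 1

def reduce_phase(pairs):
    """Shuffle + reduce: group pairs by key and sum the values."""
    groups = defaultdict(int)
    for key, value in pairs:
        groups[key] += value
    return dict(groups)

counts = reduce_phase(map_phase(["big data big jobs"]))
```

On a real cluster the shuffle happens across machines and the reducers run in parallel per key, but the programmer-facing contract is exactly these two functions, which is also what Pig and Hive compile down to.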
Posted 2 months ago
5 - 9 years
11 - 15 Lacs
Mumbai, Hyderabad
Work from Office
Senior Data Scientist - NAV02EG Company Worley Primary Location IND-MM-Navi Mumbai Other Locations IND-KR-Bangalore, IND-MM-Mumbai, IND-WB-Kolkata, IND-MM-Pune, IND-TN-Chennai Job Digital Solutions Schedule Full-time Employment Type Employee Job Level Experienced Job Posting May 8, 2025 Unposting Date Jun 7, 2025 Reporting Manager Title Head of Data Intelligence Duration of Contract 0 Building on our past. Ready for the future Worley is a global professional services company of energy, chemicals and resources experts. We partner with customers to deliver projects and create value over the life of their assets. We're bridging two worlds, moving towards more sustainable energy sources, while helping to provide the energy, chemicals and resources needed now. Worley Digital At Worley, our Digital team collaborates closely with the business to deliver efficient, technology-enabled sustainable solutions that will be transformational for Worley. This team, aptly named Worley Digital, is currently seeking talented individuals who would be working on a wide range of the latest technologies, including solutions based on Automation and Generative AI. What drives us at Worley Digital? It's our shared passion for pushing the boundaries of technological innovation, embracing best practices, and propelling Worley to the forefront of industry advancements. If you're naturally curious, open-minded, and a self-motivated learner - one who's ready to invest time and effort to stay future-ready - then Worley could be your ideal workplace. Major Accountabilities of Position: An AI/ML Architect must have experience defining, designing, and delivering ML architecture patterns operable in native and hybrid cloud architectures. Collaborate with Enterprise Architecture, Info Security, DevOps and Data Intelligence teams to implement ML solutions. Define data augmentation pipelines for unstructured data like documents, engineering drawings, etc. 
Build new network architectures in CNN/LSTM/RCNN or develop wrappers for pre-trained models. Conduct feasibility studies of transfer learning fitment for a given problem. Research, analyze, recommend, and select technical approaches to address challenging development and data integration problems related to ML model training and deployment in enterprise applications. Perform research activities to identify emerging technologies (Generative AI) and trends that may affect the Data Science / ML life-cycle management in the enterprise application portfolio. Design and deploy AI/ML models in real-world environments, integrating AI/ML into large-scale enterprise applications using cloud-native or hybrid technologies. Demonstrated experience developing best practices and recommendations around tools/technologies for ML life-cycle capabilities such as data collection, data preparation, feature engineering, model management, MLOps, model deployment approaches, and model monitoring and tuning. Knowledge / Experience / Competencies Required IT Skills & Experience (priority-wise) 1. Hands-on programming and architecture capabilities in Python. 2. Demonstrated technical expertise in architecting solutions around AI, ML, deep learning and Generative AI related technologies. 3. Experience in implementing and deploying machine learning solutions (using various models, such as GPT-4, Llama 2, Mistral AI, text-embedding-ada, Linear/Logistic Regression, Support Vector Machines, (Deep) Neural Networks, Topic Modeling, Game Theory, etc.) 4. Understanding of the NVIDIA NeMo Enterprise Suite. 5. Expertise in popular deep learning frameworks, such as TensorFlow, PyTorch, and Keras, for building, training, and deploying neural network models. 6. Experience in AI solution development with external SaaS products like Azure OCR 7. Experience with AI/ML components like Azure ML Studio, JupyterHub, TensorFlow & scikit-learn 8. Hands-on knowledge of API frameworks. 9. 
Familiarity with the transformer architecture and its applications in natural language processing (NLP), such as machine translation, text summarization, and question-answering systems. 10. Expertise in designing and implementing CNNs for computer vision tasks, such as image classification, object detection, and semantic segmentation. 11. Hands on experience in RDBMS, NoSQL, big data stores likeElastic, Cassandra. 12. Experience with open source software 13. Experience using the cognitive APIs machine learning studios on cloud. 14. Hands-on knowledge of image processing with deep learning ( CNN,RNN,LSTM,GAN) 15. Familiarity with GPU computing and tools like CUDA and cu DNN to accelerate deep learning computations and reduce training times. 16. Understanding of complete AI/ML project life cycle 17.
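As a minimal sketch of the transfer-learning pattern this role calls for (a frozen feature extractor feeding a small trainable head), assuming scikit-learn as a stand-in for a pre-trained deep-learning backbone; all data, names, and dimensions below are illustrative:

```python
# Illustrative "frozen backbone + trainable head" transfer-learning
# pattern, with PCA standing in for pre-trained CNN features.
# All data is synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic raw inputs: a 1-D latent signal embedded in 64 dimensions.
z = rng.normal(size=(300, 1))
direction = rng.normal(size=(1, 64))
X = 5.0 * z @ direction + 0.1 * rng.normal(size=(300, 64))
y = (z[:, 0] > 0).astype(int)

# Step 1: fit the "backbone" once and freeze it (analogous to loading
# pre-trained weights and disabling gradient updates).
backbone = PCA(n_components=4).fit(X)
features = backbone.transform(X)

# Step 2: train only the lightweight head on the frozen features.
head = LogisticRegression().fit(features, y)
print(round(head.score(features, y), 2))
```

The same two-step shape (freeze the expensive representation, retrain only a cheap head) is what transfer-learning fitment studies typically evaluate first.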
Posted 2 months ago
5 - 9 years
11 - 15 Lacs
Mumbai, Hyderabad
Work from Office
Senior Data Scientist - NAV02ED Company Worley Primary Location IND-MM-Navi Mumbai Other Locations IND-KR-Bangalore, IND-MM-Mumbai, IND-WB-Kolkata, IND-MM-Pune, IND-TN-Chennai Job Digital Solutions Schedule Full-time Employment Type Employee Job Level Experienced Job Posting May 8, 2025 Unposting Date Jun 7, 2025 Reporting Manager Title Head of Data Intelligence Duration of Contract 0 Building on our past. Ready for the future. Worley is a global professional services company of energy, chemicals and resources experts. We partner with customers to deliver projects and create value over the life of their assets. We're bridging two worlds, moving towards more sustainable energy sources, while helping to provide the energy, chemicals and resources needed now. Major Accountabilities of Position: An AI/ML Architect must have experience in: Defining, designing, and delivering ML architecture patterns operable in native and hybrid cloud architectures. Collaborating with the Enterprise Architecture, Info Security, DevOps and Data Intelligence teams to implement ML solutions. Defining data augmentation pipelines for unstructured data such as documents, engineering drawings, etc. Building new network architectures in CNN/LSTM/RCNN, or developing wrappers for pre-trained models. Conducting feasibility studies of transfer-learning fitment for a given problem. Researching, analyzing, recommending, and selecting technical approaches to address challenging development and data integration problems related to ML model training and deployment in enterprise applications. Performing research activities to identify emerging technologies (Generative AI) and trends that may affect Data Science / ML life-cycle management in the enterprise application portfolio.
Designing and deploying AI/ML models in real-world environments, and integrating AI/ML into large-scale enterprise applications using cloud-native or hybrid technologies. Demonstrated experience developing best practices and recommendations around tools/technologies for ML life-cycle capabilities such as data collection, data preparation, feature engineering, model management, MLOps, model deployment approaches, and model monitoring and tuning.

Knowledge / Experience / Competencies Required - IT Skills & Experience (priority-wise):
1. Hands-on programming and architecture capabilities in Python.
2. Demonstrated technical expertise in architecting solutions around AI, ML, deep learning and Generative AI related technologies.
3. Experience in implementing and deploying Machine Learning solutions (using various models, such as GPT-4, Llama 2, Mistral AI, text-embedding-ada, Linear/Logistic Regression, Support Vector Machines, (Deep) Neural Networks, Topic Modeling, Game Theory, etc.)
4. Understanding of the NVIDIA NeMo enterprise suite.
5. Expertise in popular deep learning frameworks, such as TensorFlow, PyTorch, and Keras, for building, training, and deploying neural network models.
6. Experience in AI solution development with external SaaS products like Azure OCR.
7. Experience with AI/ML components like Azure ML Studio, JupyterHub, TensorFlow & scikit-learn.
8. Hands-on knowledge of API frameworks.
9. Familiarity with the transformer architecture and its applications in natural language processing (NLP), such as machine translation, text summarization, and question-answering systems.
10. Expertise in designing and implementing CNNs for computer vision tasks, such as image classification, object detection, and semantic segmentation.
11. Hands-on experience with RDBMS, NoSQL, and big data stores like Elasticsearch and Cassandra.
12. Experience with open source software.
13. Experience using cognitive APIs and machine learning studios on the cloud.
14. Hands-on knowledge of image processing with deep learning (CNN, RNN, LSTM, GAN).
15. Familiarity with GPU computing and tools like CUDA and cuDNN to accelerate deep learning computations and reduce training times.
16. Understanding of the complete AI/ML project life cycle.
17. Understanding of data structures, data modelling and software architecture.
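As a companion to the transformer-architecture requirement above, a minimal NumPy sketch of scaled dot-product attention, the core operation inside transformer models; the shapes and values are illustrative, not taken from any production model:

```python
# Minimal single-head scaled dot-product attention in NumPy.
# Shapes and inputs below are illustrative.
import numpy as np

def attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for one attention head."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # query/key similarities
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # rows sum to 1
    return weights @ V, weights

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))   # 4 query positions, d_k = 8
K = rng.normal(size=(6, 8))   # 6 key positions
V = rng.normal(size=(6, 8))   # one value vector per key

out, w = attention(Q, K, V)
print(out.shape)  # one weighted value vector per query position
```

Each output row is a weighted mix of the value vectors, with weights given by how strongly the query matches each key.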
Posted 2 months ago
5 - 8 years
9 - 14 Lacs
Bengaluru
Work from Office
About The Role

Role Purpose: The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do: Oversee and support the process by reviewing daily transactions on performance parameters. Review the performance dashboard and the scores for the team. Support the team in improving performance parameters by providing technical support and process guidance. Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions. Ensure standard processes and procedures are followed to resolve all client queries. Resolve client queries as per the SLAs defined in the contract. Develop understanding of the process/product for the team members to facilitate better client interaction and troubleshooting. Document and analyze call logs to spot the most frequently occurring trends and prevent future problems. Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution. Ensure all product information and disclosures are given to clients before and after call/email requests. Avoid legal challenges by monitoring compliance with service agreements.
Handle technical escalations through effective diagnosis and troubleshooting of client queries. Manage and resolve technical roadblocks/escalations as per SLA and quality requirements. If unable to resolve an issue, escalate it to TA & SES in a timely manner. Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions. Troubleshoot all client queries in a user-friendly, courteous and professional manner. Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business. Organize ideas and effectively communicate oral messages appropriate to listeners and situations. Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs.

Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client. Mentor and guide Production Specialists on improving technical knowledge. Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists. Develop and conduct trainings (triages) within products for Production Specialists as per target. Inform the client about the triages being conducted. Undertake product trainings to stay current with product features, changes and updates. Enroll in product-specific and any other trainings per client requirements/recommendations. Identify and document the most common problems and recommend appropriate resolutions to the team. Update job knowledge by participating in self-learning opportunities and maintaining personal networks.

Deliver (No. / Performance Parameter / Measure):
1. Process - No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, Customer feedback, NSAT/ESAT
2. Team Management - Productivity, efficiency, absenteeism
3. Capability Development - Triages completed, Technical Test performance

Mandatory Skills: Apache Cassandra database. Experience: 5-8 Years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
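The SLA-compliance measure described above can be illustrated with a toy calculation; the 8-hour resolution window and the ticket durations below are hypothetical, not taken from any actual contract:

```python
# Illustrative SLA-compliance metric: the share of client queries
# resolved within the contractual window. All values are hypothetical.
SLA_HOURS = 8  # assumed contractual resolution window

resolution_hours = [2.5, 7.0, 9.5, 4.0, 8.0, 12.0]  # sample tickets

within_sla = sum(1 for h in resolution_hours if h <= SLA_HOURS)
compliance_pct = 100.0 * within_sla / len(resolution_hours)
print(f"{compliance_pct:.1f}% of cases met the {SLA_HOURS}h SLA")
```

In a real support process this figure would be computed per reporting period from the ticketing system and tracked against the contract's target.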
Posted 2 months ago
- 5 years
3 - 6 Lacs
Gurugram
Work from Office
The Role and Responsibilities We have open positions ranging from Associate Data Engineer to Lead Data Engineer, providing talented and motivated professionals with excellent career and growth opportunities. We seek individuals with relevant prior experience in quantitatively intense areas to join our team. You'll be working with varied and diverse teams to deliver unique and unprecedented solutions across all industries. In the data engineering track, you will be primarily responsible for developing and monitoring high-performance applications that can rapidly deploy the latest machine learning frameworks and other advanced analytical techniques at scale. This role requires you to be a proactive learner and to pick up new technologies quickly whenever required. Most of the projects require handling big data, so you will be required to work extensively with related technologies. You will work closely with other team members to support project delivery and ensure client satisfaction. Your responsibilities will include: Working alongside Oliver Wyman consulting teams and partners, engaging directly with clients to understand their business challenges; Exploring large-scale data and designing, developing, and maintaining data/software pipelines and ETL processes for internal and external stakeholders; Explaining, refining, and developing the necessary architecture to guide stakeholders through the journey of model building; Advocating the application of best practices in data engineering, code hygiene, and code reviews; Leading the development of proprietary data engineering assets, ML algorithms, and analytical tools on varied projects; Creating and maintaining documentation to support stakeholders and runbooks for operational excellence; Working with partners and principals to shape proposals that showcase our data engineering and analytics capabilities; Travelling to clients' locations across the globe, when required, understanding their problems, and delivering appropriate solutions in collaboration with them; Keeping up with emerging state-of-the-art data engineering techniques in your domain.

Your Attributes, Experience & Qualifications: Bachelor's or Master's degree in a computational or quantitative discipline from a top academic program (Computer Science, Informatics, Data Science, or related); Exposure to building cloud-ready applications; Exposure to test-driven development and integration; Pragmatic and methodical approach to solutions and delivery with a focus on impact; Independent worker with the ability to manage workload and meet deadlines in a fast-paced environment; Collaborative team player; Excellent verbal and written communication skills and command of English; Willingness to travel; Respect for confidentiality.

Technical Background: Prior experience in designing and deploying large-scale technical solutions; Fluency in modern programming languages (Python is mandatory; R, SAS desired); Experience with AWS/Azure/Google Cloud, including familiarity with services such as S3, EC2, Lambda, Glue; Strong SQL skills and experience with relational databases such as MySQL, PostgreSQL, or Oracle; Experience with big data tools like Hadoop, Spark, Kafka; Demonstrated knowledge of data structures and algorithms; Familiarity with version control systems like GitHub or Bitbucket; Familiarity with modern storage and computational frameworks; Basic understanding of modern delivery practices such as CI/CD, application resiliency, and security.

Valued but not required: Compelling side projects or contributions to the open-source community; Prior experience with machine learning frameworks (e.g., scikit-learn, TensorFlow, Keras/Theano, Torch, Caffe, MxNet); Familiarity with containerization technologies such as Docker and Kubernetes; Experience with UI development using frameworks such as Angular, Vue, or React; Experience with NoSQL databases such as MongoDB or Cassandra; Experience presenting at data science conferences and connections within the data science community;
Interest/background in Financial Services in particular, as well as other sectors where Oliver Wyman has a strategic presence. Roles and levels: We are hiring for engineering roles across levels, from Associate Data Engineer to Lead Data Engineer, for experience ranging from 0-8 years. In addition to the base salary, this position may be eligible for performance-based incentives.
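The data/software pipelines and ETL processes this track centres on can be sketched at toy scale with pandas; the source records, column names, and transformation rules below are entirely hypothetical:

```python
# Toy extract-transform-load (ETL) pipeline. In a real project the
# extract step would read from S3, a database, or an API, and the load
# step would write to a warehouse table. All data here is made up.
import pandas as pd

def extract():
    # Stand-in for pulling raw records from a source system.
    return pd.DataFrame({
        "client": ["acme", "acme", "globex", None],
        "revenue": ["100", "250", "75", "40"],
    })

def transform(df):
    df = df.dropna(subset=["client"])            # drop incomplete rows
    df["revenue"] = df["revenue"].astype(float)  # enforce numeric types
    return df.groupby("client", as_index=False)["revenue"].sum()

def load(df):
    # Stand-in for writing curated rows to a target store.
    return df.to_dict(orient="records")

rows = load(transform(extract()))
print(rows)
```

Production pipelines add the pieces the posting lists around this core: scheduling, monitoring, documentation/runbooks, and code review of each stage.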
Posted 2 months ago
7 - 12 years
25 - 30 Lacs
Pune, Bengaluru
Work from Office
Tech Lead / Associate Architect / Architect - Artificial Intelligence and Machine Learning. Technical Skills: Candidates with 7-17 years of total experience. Strong experience in Artificial Intelligence and Machine Learning. Experience with common data science toolkits such as Python is a must. Should have worked on concurrency, data pipelines, and data ingestion for models. Should have actually worked on ML models beyond parameter tuning and interfacing. Experience with data visualization tools such as Tableau, Power BI, D3.js, ggplot, etc. (mandatory for Architect). Experience with SQL databases and time series databases. Experience with NoSQL databases such as MongoDB, Cassandra, or HBase would be an added advantage. Other Skills: A Bachelor's degree from an accredited college or university, or equivalent years (4 years). Creates and manages a machine learning pipeline, from raw data acquisition, to merging and normalizing, to sophisticated feature engineering, to model execution. Designs, leads, and actively engages in projects with broad implications for the business and/or the future architecture, successfully addressing cross-technology and cross-platform issues. Selects tools and methodologies for projects and negotiates terms and conditions with vendors. Curiosity about and a deep interest in how digital technology and systems are powering the way users do their jobs. Comfortable working in a dynamic environment where digital is still evolving as a core offering. For the Architect role, supporting business development and presales activities is a must.
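The pipeline the role describes (raw data to normalisation to feature step to model execution, with tuning as one step rather than the whole job) can be sketched with scikit-learn; the dataset and parameter grid are synthetic and illustrative:

```python
# Sketch of an end-to-end ML pipeline with parameter tuning folded in
# as one stage. Data and the search grid are synthetic/illustrative.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Stand-in for acquired-and-merged raw data.
X, y = make_classification(n_samples=300, n_features=10, random_state=0)

pipe = Pipeline([
    ("scale", StandardScaler()),      # normalise raw features
    ("model", LogisticRegression()),  # trainable estimator
])

# Parameter tuning is just one step, applied to the whole pipeline
# so scaling is re-fit inside each cross-validation fold.
search = GridSearchCV(pipe, {"model__C": [0.1, 1.0, 10.0]}, cv=3)
search.fit(X, y)
print(search.best_params_)
```

Wrapping the scaler and model in one `Pipeline` is the standard way to keep preprocessing out of the validation folds, which is the kind of leakage-avoidance an architect is expected to enforce.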
Posted 2 months ago
13 - 20 years
14 - 18 Lacs
Chennai
Work from Office
About The Role Solution Architects assess a project's technical feasibility as well as its implementation risks. They are responsible for the design and implementation of the overall technical and solution architecture. They define the structure of a system, its interfaces, the solution principles guiding the organisation, the software design, and the implementation. The scope of the Solution Architect's role is defined by the business issue at hand. To fulfil the role, a Solution Architect utilises business and technology expertise and experience. About The Role - Grade Specific: Managing Solution/Delivery Architect - Design, deliver, and manage complete solutions. Demonstrate leadership of topics in the architect community and show a passion for technology and business acumen. Work as a stream lead at CIO/CTO level for an internal or external client. Lead Capgemini operations relating to market development and/or service delivery excellence. Be seen as a role model in their (local) community. Certification: preferably Capgemini Architects certification level 2 or above, relevant solution certifications, IAF, and/or industry certifications such as TOGAF 9 or equivalent.
Skills (competencies): Software Development Life Cycle (SDLC) Methodology, Active Listening, Adaptability, Agile (Software Development Framework), Analytical Thinking, APIs, Automation (Frameworks), AWS (Cloud Platform), AWS Architecture, Business Acumen, Business Analysis, C#, Capgemini Integrated Architecture Framework (IAF), Cassandra (NoSQL Database), Change Management, Cloud Architecture, Coaching, Collaboration, Confluence, Delegation, DevOps, Docker, ETL Tools, Executive Presence, GitHub, Google Cloud Platform (GCP), Influencing, Innovation, Java (Programming Language), Jira, Kubernetes, Managing Difficult Conversations, Microsoft Azure DevOps, Negotiation, Network Architecture, Oracle (Relational Database), Problem Solving, Project Governance, Python, Relationship-Building, Risk Assessment, Risk Management, SAFe, Salesforce (Integration), SAP (Integration), SharePoint, Slack, SQL Server (Relational Database), Stakeholder Management, Storage Architecture, Storytelling, Strategic Thinking, Sustainability Awareness, Teamwork, Technical Governance, Time Management, TOGAF (Framework), Verbal Communication, Written Communication
Posted 2 months ago