6.0 - 7.0 years
16 - 20 Lacs
Mumbai
Work from Office
About The Role
We are seeking a highly skilled and motivated Solutions Architect or Backend Architect to play a pivotal role in designing, developing, and integrating robust and scalable solutions. This role will focus heavily on API development and seamless system integration, leveraging a diverse technology stack including Python (with the Django and Flask frameworks), Angular, Java, and JavaScript. You will be instrumental in translating business needs into technical blueprints, guiding development teams, and ensuring the delivery of high-quality, secure, and efficient IT solutions.
Responsibilities
- Lead and facilitate brainstorming sessions with stakeholders to understand business requirements, challenges, and opportunities, and translate them into potential technical solutions.
- Design and architect scalable, secure, and high-performance backend systems and APIs.
- Develop comprehensive architectural blueprints, diagrams, and documentation outlining system components, data flows, and integration strategies.
- Define and enforce architectural standards, patterns, and best practices across development teams.
- Evaluate and recommend appropriate technologies, frameworks, and tools to meet project requirements and business objectives.
- Consider non-functional requirements such as performance, scalability, reliability, security, and maintainability in architectural designs.
- Possess an in-depth understanding of coding languages relevant to API development, including Angular, Python (with expertise in Django and/or Flask), Java, and JavaScript.
- Design and implement RESTful and other API architectures, ensuring clarity, consistency, and ease of use.
- Develop and maintain API documentation using tools like Swagger/OpenAPI.
- Architect and implement robust system integration solutions, connecting diverse applications and data sources.
- Experience with various integration patterns and technologies (e.g., message queues, event-driven architectures).
- Demonstrate good working knowledge of various cloud service providers (e.g., AWS, Azure, GCP) and their core services relevant to backend development and deployment.
- Design and architect cloud-native solutions, leveraging platform-as-a-service (PaaS) and serverless technologies where appropriate.
- Understand cloud security best practices and ensure solutions are deployed securely in the cloud environment.
- Possess experience with enterprise architecture principles and methodologies.
- Familiarity with IT architecture frameworks (e.g., TOGAF, Zachman) is a plus.
- Understand fundamental IT security principles and best practices, ensuring security is integrated into the solution design.
- Have a working understanding of IT infrastructure components and their impact on solution architecture.
- Appreciate IT governance and compliance requirements and ensure solutions adhere to relevant policies.
- Actively identify opportunities for process improvements within the development lifecycle and IT operations.
- Effectively analyze and resolve complex technical problems that arise during development or in production environments.
- Provide technical leadership, guidance, and mentorship to development teams.
- Collaborate effectively with cross-functional teams, including product managers, designers, and QA engineers.
- Participate in code reviews and ensure code quality and adherence to architectural standards.
- Continuously research current and emerging technologies, trends, and best practices in backend development, API design, and system integration.
- Proactively propose and evaluate new technologies and approaches that could benefit the organization.
- Demonstrate sound knowledge of various operating systems (e.g., Linux, Windows Server).
- Possess a strong understanding of different database technologies (both relational and NoSQL) and their appropriate use cases.
- Prepare and document comprehensive testing requirements to ensure the quality and reliability of the developed solutions.
- Create and maintain clear and concise technical documentation for architectural designs, APIs, and integration processes.
- Identify areas where IT can effectively support business needs and contribute to achieving organizational goals.
- Collaborate with business units to understand their strategic objectives and translate them into actionable IT strategies and solutions.
- Work with business units to improve current IT implementations and drive efficiency.
- Participate in building and migrating software applications and services across the organization, ensuring minimal disruption and data integrity.
- Work closely with product and delivery teams to architect and develop highly scalable and reliable solutions and products.
- Communicate effectively and consult with both technical and non-technical clients and internal stakeholders to understand their needs and develop appropriate solutions.
- Manage customer satisfaction by setting realistic expectations for the end product and ensuring clear communication throughout the development process.
- Demonstrate strong organizational skills to manage complex projects and tasks effectively.
- Exhibit leadership qualities to guide and influence development teams and stakeholders.
Qualifications
- Proven experience as a Solutions Architect, Backend Architect, or in a similar role focused on API development and system integration.
- Deep expertise in one or more of the following programming languages: Python (Django, Flask), Java, JavaScript, and experience with the Angular framework.
- Strong understanding of API design principles (REST, SOAP, etc.) and experience with API management platforms.
- Solid experience with database systems (e.g., PostgreSQL, MySQL, MongoDB, Cassandra).
- Hands-on experience with cloud platforms (e.g., AWS, Azure, GCP) and their services.
- Excellent communication, presentation, and interpersonal skills.
- Strong analytical and problem-solving abilities.
- Ability to work independently and as part of a collaborative team.
- Strong organizational and time management skills.
Posted 3 weeks ago
8.0 - 12.0 years
45 - 50 Lacs
Bengaluru
Work from Office
Member of a software engineering team involved in the development and design of an AI Data Platform built on NetApp's flagship storage operating system, ONTAP. ONTAP is a feature-rich stack whose rich data management capabilities deliver tremendous value to our customers and are used in mission-critical applications across the world. You will work as part of a team responsible for the development, testing, and debugging of distributed software that drives NetApp cloud, hybrid-cloud, and on-premises solutions. As part of the Research and Development function, the overall focus of the group is on competitive market and customer requirements, supportability, technology advances, product quality, product cost, and time-to-market. Software engineers focus on new product development along with enhancements to existing products. This is a mid-level technical lead position that requires an individual to be broad-thinking, systems-focused, creative, team-oriented, technologically savvy, able to lead large cross-functional teams, and driven to produce results.
Job Requirements
- Proficiency in programming languages like Go/Golang.
- Experience with machine learning libraries and frameworks: PyTorch, TensorFlow, Keras, OpenAI, LLMs (open source), LangChain, etc.
- Experience working with Linux, AWS/Azure/GCP, and Kubernetes (control plane, auto scaling, orchestration, containerization) is a must.
- Experience with NoSQL document databases (e.g., MongoDB, Cassandra, Cosmos DB, DocumentDB).
- Experience building microservices, REST APIs, and related API frameworks.
- Experience with big data technologies: understanding of platforms like Spark, Hadoop, and distributed storage systems for handling large-scale datasets and parallel processing.
- Experience with filesystems, networking, or file/cloud protocols is a must.
- Proven track record of leading mid- to large-sized projects.
This position requires an individual to be creative, team-oriented, a quick learner, and driven to produce results. You will be responsible for providing support in the development and testing activities of other engineers that involve several inter-dependencies, and will participate in technical discussions within the team and with other groups within business units associated with specified projects. You should be willing to take on additional tasks and responsibilities that contribute towards team, department, and company goals. A strong understanding of, and experience with, concepts related to computer architecture, data structures, and programming practices is required. Experience with AI/ML frameworks like PyTorch or TensorFlow is a plus.
Education
Typically requires a minimum of 8-12 years of related experience with a bachelor's degree and a master's degree, or a PhD with relevant experience.
Posted 3 weeks ago
2.0 - 5.0 years
7 - 10 Lacs
Bengaluru
Work from Office
Skills: Java->Spring Boot, Java->Microservices, Java
Responsibilities
Required:
- Experience with Java 8, Spring, Spring Boot, microservices, APIs, and XML and JSON formats
- Experience working in the media domain
- Working knowledge of design and build using AWS environments and AWS networking
- Knowledge of DevOps-related technologies like Ansible, Terraform, Bitbucket, and Jenkins
- Good understanding of networking
- Requirement gathering and impact analysis
- Good troubleshooting/problem-solving skills
- Databases: MySQL, NoSQL (Cassandra), PostgreSQL
- Other tools: Kibana, Grafana, Logstash, ELK, Kubernetes
Preferred:
- Experience working on large enterprise-level microservice solutions
- Experience working with automation testing and related tools
Posted 3 weeks ago
6.0 - 11.0 years
8 - 13 Lacs
Gurugram
Work from Office
Join us as a Data Engineer
We'll look to you to drive the build of effortless, digital-first customer experiences as you simplify our bank while keeping our data safe and secure. Day-to-day, you'll develop innovative, data-driven solutions through data pipeline modelling and ETL design, aspiring to be commercially successful through insights. This is your opportunity to explore your leadership potential while bringing a competitive edge to your career profile by solving problems and creating smarter solutions. We're offering this role at director level.
What you'll do
In this role, you'll develop and share knowledge of business data structures and metrics, advocating for changes when needed for product development. You'll also educate and embed new data techniques into the business through role modelling, training, and experiment design oversight. We'll look to you to drive DevOps adoption into the delivery of data engineering, proactively performing root cause analysis while resolving issues. You'll also deliver a clear understanding of data platform cost levers to meet department cost savings and income targets.
You'll also be responsible for:
- Driving customer value by understanding complex business problems and requirements to correctly apply the most appropriate and reusable tools to gather and build data solutions
- Actively participating in the data engineering community to deliver opportunities to support our bank's strategic direction
- Driving data engineering strategies to build complex, scalable data architecture and a feature-rich customer dataset
- Driving the advanced automation of data engineering pipelines through the removal of manual stages
- Working alongside colleagues, scrums, and project teams while liaising with technology and engineering teams to build business stakeholder engagement and to develop data solutions
The skills you'll need
We're looking for someone with strong communication skills and the ability to proactively engage and manage a wide range of stakeholders. You'll have extensive experience working in a governed, regulated environment. You'll also need:
- Experience of extracting value and features from large-scale data
- Advanced experience of designing efficient data models that meet organisational needs; familiarity with modelling techniques like star and snowflake schemas can significantly enhance data retrieval and reporting performance
- Proven proficiency in relational databases (e.g., MySQL, PostgreSQL) as well as NoSQL databases (e.g., MongoDB, Cassandra), which is critical for storing and managing various types of data
- An understanding of modern code development practices
Hours: 45
Job Posting Closing Date: 03/06/2025
Posted 3 weeks ago
7.0 - 12.0 years
9 - 14 Lacs
Pune
Work from Office
About Enlyft
Data and AI are at the core of the Enlyft platform. We are looking for creative, customer- and detail-obsessed data engineers who can contribute to our strong engineering culture. Our big data engine indexes billions of structured/unstructured documents and leverages data science to accurately infer the footprint of thousands of technologies and products across millions of businesses worldwide. The complex and evolving relationships between products and companies form a technological graph that is core to our predictive modeling solutions. Our machine learning based models work by combining data from our customers' CRMs with our proprietary technological graph and firmographic data, and reliably predict an account's propensity to buy.
About the Role
As a key member of our data platform team, you'll be tasked with the development of our next-gen, cutting-edge data platform. Your responsibilities will include building robust data pipelines for data acquisition and processing, implementing optimized data models, and creating APIs and data products to support our machine learning models, insights engine, and customer-facing applications. Additionally, you'll harness the power of GenAI throughout the data platform lifecycle, while maintaining a strong focus on data governance to uphold timely data availability with high accuracy.
What we're looking for
- Bachelor's degree or higher in Computer Science, Engineering, or a related field, with 7+ years of experience in data engineering and a strong focus on designing and building scalable data platforms and products
- Proven expertise in data modeling, ETL/ELT processes, and data warehousing with distributed computing: Hadoop, Spark, and Kafka
- Proficiency in programming languages such as Python, Java, and SQL
- Experience with clouds such as AWS, Azure, or GCP and related services (S3, Redshift, BigQuery, Dataflow)
- Strong understanding of SQL/NoSQL databases (e.g., PostgreSQL, MySQL, Cassandra)
- Proven expertise in data quality checks to ensure data accuracy, completeness, consistency, and timeliness
- Excellent problem-solving in a fast-paced, collaborative environment, coupled with strong communication for effective interaction with technical and non-technical stakeholders
Why join Enlyft
- A top-notch culture that is customer-obsessed, transparent, and constantly strives for excellence
- A top-notch team with colleagues who will help you learn and grow in a collaborative environment
- Competitive pay and great benefits
Enlyft is an equal opportunity employer and values diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.
Posted 3 weeks ago
9.0 - 14.0 years
35 - 40 Lacs
Gurugram
Work from Office
Join us as a Data Engineer
We'll look to you to drive the build of effortless, digital-first customer experiences as you simplify our bank while keeping our data safe and secure. Day-to-day, you'll develop innovative, data-driven solutions through data pipeline modelling and ETL design, aspiring to be commercially successful through insights. This is your opportunity to explore your leadership potential while bringing a competitive edge to your career profile by solving problems and creating smarter solutions. We're offering this role at director level.
What you'll do
In this role, you'll develop and share knowledge of business data structures and metrics, advocating for changes when needed for product development. You'll also educate and embed new data techniques into the business through role modelling, training, and experiment design oversight. We'll look to you to drive DevOps adoption into the delivery of data engineering, proactively performing root cause analysis while resolving issues. You'll also deliver a clear understanding of data platform cost levers to meet department cost savings and income targets.
You'll also be responsible for:
- Driving customer value by understanding complex business problems and requirements to correctly apply the most appropriate and reusable tools to gather and build data solutions
- Actively participating in the data engineering community to deliver opportunities to support our bank's strategic direction
- Driving data engineering strategies to build complex, scalable data architecture and a feature-rich customer dataset
- Driving the advanced automation of data engineering pipelines through the removal of manual stages
- Working alongside colleagues, scrums, and project teams while liaising with technology and engineering teams to build business stakeholder engagement and to develop data solutions
The skills you'll need
We're looking for someone with strong communication skills and the ability to proactively engage and manage a wide range of stakeholders. You'll have extensive experience working in a governed, regulated environment. You'll also need:
- Experience of extracting value and features from large-scale data
- Advanced experience of designing efficient data models that meet organisational needs; familiarity with modelling techniques like star and snowflake schemas can significantly enhance data retrieval and reporting performance
- Proven proficiency in relational databases (e.g., MySQL, PostgreSQL) as well as NoSQL databases (e.g., MongoDB, Cassandra), which is critical for storing and managing various types of data
- An understanding of modern code development practices
Hours: 45
Job Posting Closing Date: 03/06/2025
Posted 3 weeks ago
7.0 - 11.0 years
0 - 1 Lacs
Hyderabad
Work from Office
We are seeking a highly skilled Java Full Stack Engineer to join our dynamic development team. In this role, you will be responsible for designing, developing, and maintaining both frontend and backend components of our applications using Java and associated technologies. You will collaborate with cross-functional teams to deliver robust, scalable, and high-performing software solutions that meet our business needs. The ideal candidate will have a strong background in Java programming, experience with modern frontend frameworks, and a passion for full-stack development.
Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 7 to 10+ years of experience in full-stack development, with a strong focus on Java.
Java Full Stack Developer Roles & Responsibilities
- Develop scalable web applications using Java (Spring Boot) for the backend and React/Angular for the frontend.
- Implement RESTful APIs to facilitate communication between frontend and backend.
- Design and manage databases using MySQL, PostgreSQL, Oracle, or MongoDB.
- Write complex SQL queries and procedures, and perform database optimization.
- Build responsive, user-friendly interfaces using HTML, CSS, and JavaScript, with frameworks like Bootstrap, React, and Angular, plus Node.js and Python integration.
- Integrate APIs with frontend components.
- Participate in designing microservices and modular architecture.
- Apply design patterns and object-oriented programming (OOP) concepts.
- Write unit and integration tests using JUnit, Mockito, Selenium, or Cypress.
- Debug and fix bugs across full stack components.
- Use Git, Jenkins, Docker, and Kubernetes for version control, continuous integration, and deployment.
- Participate in code reviews, automation, and monitoring.
- Deploy applications on AWS, Azure, or Google Cloud platforms.
- Use Elastic Beanstalk, EC2, S3, or Cloud Run for backend hosting.
- Work in Agile/Scrum teams; attend daily stand-ups, sprints, and retrospectives; and deliver iterative enhancements.
- Document code, APIs, and configurations.
- Collaborate with QA, DevOps, Product Owners, and other stakeholders.
Must-Have Skills:
- Java Programming: Deep knowledge of the Java language, its ecosystem, and best practices.
- Frontend Technologies: Proficiency in HTML, CSS, JavaScript, and modern frontend frameworks like React or Angular.
- Backend Development: Expertise in developing and maintaining backend services using Java, Spring, and related technologies.
- Full Stack Development: Experience in both frontend and backend development, with the ability to work across the entire application stack.
Soft Skills:
- Problem-Solving: Ability to analyze complex problems and develop effective solutions.
- Communication Skills: Strong verbal and written communication skills to effectively collaborate with cross-functional teams.
- Analytical Thinking: Ability to think critically and analytically to solve technical challenges.
- Time Management: Capable of managing multiple tasks and deadlines in a fast-paced environment.
- Adaptability: Ability to quickly learn and adapt to new technologies and methodologies.
Hard Skills:
- Java Programming: Expert-level knowledge of Java and its application in full-stack development.
- Frontend Technologies: Proficient in frontend development using HTML, CSS, JavaScript, and frameworks like React or Angular.
- Backend Development: Skilled in backend development using Java, Spring, Hibernate, and RESTful services.
- Full Stack Development: Experience in developing end-to-end solutions, from the user interface to backend services and databases.
Interview Mode: F2F for candidates residing in Hyderabad; Zoom for other states
Location: 43/A, MLA Colony, Road No. 12, Banjara Hills, 500034
Time: 2-4 pm (Monday, 26th May to Friday, 30th May)
Posted 3 weeks ago
5.0 - 10.0 years
17 - 25 Lacs
Bengaluru
Work from Office
Senior WASM Developer: You will be part of a team focused on creating our main platform experience with WASM technologies. This position is perfect for someone enthusiastic about WebAssembly products and the community, who wants to work on initiatives with major impact and cutting-edge technologies. You enjoy building next-gen products on WASM and want to push the boundaries of what WASM computing applications can do.
Responsibilities:
- Leverage the power of WASM to build next-gen products at the edge and in the cloud.
- Work with evangelists in the organization with a deep desire to build next-gen products and computing platforms using WebAssembly.
- Be part of an expert team working on improving specifications and developing pallets, with close attention to extensive testing and security.
- Work on use cases across serverless apps, embedded functions, microservices, and IoT devices.
- Support the latest WASM standards and proposal development.
- Provide easy-to-understand documentation for the development team.
- Coordinate with architects to produce technical designs.
- Document the development process, architecture, and standard components.
Job Description:
- 5+ years of development experience, with mastery in at least one of Rust, Golang, or C++.
- Expertise in JavaScript, TypeScript, and Node.js.
- Experience with Linux.
- Knowledge of security best practices and standards.
- Knowledge of low-level code execution.
- Experience with relational databases like MySQL and MSSQL, and NoSQL databases like MongoDB.
- Work across multiple teams, mentor junior developers, and actively participate in code reviews.
- Knowledge of design patterns and best practices.
- Good coding skills and the ability to approach a given problem statement.
- Strong grasp of software programming fundamentals.
Good to have:
- Proficiency with WASM and familiarity with ink!, and the ability to extend Wasm for distributed cloud computing.
- Experience with any one of the WASM runtimes: Wasmtime, Lucet, WAMR, WasmEdge.
- Knowledge of distributed communication protocols: devp2p, libp2p.
- Hands-on applied cryptography: signing, hashing, encryption, PKCS, key management.
- Familiarity with Docker, Kubernetes, Nginx, Git.
- Knowledge of cloud services (AWS, Azure, GCP).
- Knowledge of stack machines.
- Awareness of embedded systems and prototype boards like Raspberry Pi and IMX.
- Awareness of CI/CD pipelines.
- Any open-source contributions in the field of WASM.
- Any certifications or whitepapers.
Sr WASM Developer (Overall Exp. 5+)
Skills (years of experience, remarks, weightage):
- Rust/Golang/C++: 4+, Mandatory, 40%
- WASM: 2+, Mandatory, 30%
- Edge: 2+, Good to have, 15%
- Linux: 2+, Good to have, 15%
Posted 3 weeks ago
2.0 - 4.0 years
10 - 18 Lacs
Bengaluru
Work from Office
About Lowe's
Lowe's Companies, Inc. (NYSE: LOW) is a FORTUNE 50 home improvement company serving approximately 16 million customer transactions a week in the United States. With total fiscal year 2024 sales of more than $83 billion, Lowe's operates over 1,700 home improvement stores and employs approximately 300,000 associates. Based in Mooresville, N.C., Lowe's supports the communities it serves through programs focused on creating safe, affordable housing, improving community spaces, helping to develop the next generation of skilled trade experts, and providing disaster relief to communities in need. For more information, visit Lowes.com.
About the Team
This hiring is for the Personalization team, a mix of software engineers, data engineers, and data scientists. The team builds signals and segments based on customers' real-time and in-session features and produces multiple generative predictions that help personalize the customer digital experience. This in turn reduces friction and helps customers find the right product more quickly.
I. Job Summary:
The primary purpose of this role is to translate business requirements and functional specifications into logical program designs and to deliver code modules, stable application systems, and software solutions. This includes developing, configuring, or modifying integrated business and/or enterprise application solutions within various computing environments. This role will work closely with stakeholders and cross-functional departments to communicate project statuses and proposals.
II. Roles & Responsibilities:
Core Responsibilities:
• Translates business requirements and specifications into logical program designs, code modules, stable application systems, and software solutions with occasional guidance from senior colleagues; partners with the product team to understand business needs and functional specifications.
• Develops, configures, or modifies integrated business and/or enterprise application solutions within various computing environments by designing and coding component-based applications using various programming languages.
• Tests applications using test-driven development and behavior-driven development frameworks to ensure the integrity of the application.
• Conducts root cause analysis of issues and participates in the code review process to identify gaps.
• Implements continuous integration/continuous delivery processes to ensure quality and efficiency in the development cycle using DevOps automation processes and tools.
• Ideates, builds, and publishes reusable libraries to improve productivity across teams.
• Conducts the implementation and maintenance of complex business and enterprise software solutions to ensure successful deployment of released applications.
• Solves difficult technical problems to ensure solutions are testable, maintainable, and efficient.
III. Years of Experience: 2-4 years of experience
IV. Education Qualification & Certifications: B.Tech in Computer Science or an equivalent stream.
Primary Skills (must have): Java, Spring Boot, Kafka, Cassandra, Redis, GCP
Secondary Skills (desired): Apache Beam, React
Lowe's is an equal opportunity employer and administers all personnel practices without regard to race, color, religious creed, sex, gender, age, national origin, mental or physical disability or medical condition, sexual orientation, gender identity or expression, marital status, military or veteran status, genetic information, or any other category protected under state or local law. Lowe's wishes to maintain appropriate standards and integrity in meeting the requirements of the Information Technology Act's privacy provisions.
Posted 3 weeks ago
3.0 - 5.0 years
7 - 9 Lacs
Bengaluru
Work from Office
We are seeking a Generative AI Engineer with 3-5 years of experience, a strong background in developing agentic architectures, and experience with the frameworks and observability tools relevant to generative AI applications. This role demands proficiency in Python, deployment experience, and familiarity with Retrieval-Augmented Generation (RAG) applications.
Responsibilities
- Design, develop, and implement agentic architectures for generative AI systems.
- Utilize frameworks such as LangChain to enhance application performance and monitoring.
- Collaborate with cross-functional teams to build and deploy generative AI solutions that address business needs.
- Develop, test, and maintain agentic RAG applications, ensuring high performance and reliability.
- Develop, test, and maintain backend services and APIs using Python frameworks.
- Deploy machine learning models in production environments.
- Design and implement robust testing frameworks to assess model performance, including precision, recall, and other key performance indicators (KPIs).
- Continuously improve algorithms for accuracy, speed, and robustness in real time.
- Stay updated with the latest advancements in generative AI technologies and methodologies to implement innovative solutions.
- Document processes, architectures, and code to ensure maintainability and knowledge sharing.
Required Skills
Technical Skills:
- Proficiency in Python for developing, training, and evaluating deep learning models.
- Hands-on experience with frameworks like LangChain and LangGraph.
- Version control systems: Git, GitHub.
- FastAPI, Flask, or Django REST Framework.
- Evaluation of chat applications to assess effectiveness and user experience.
- Knowledge of various RAG applications and their implementation.
- Containerization and orchestration: Docker and Kubernetes.
- Relational databases (PostgreSQL/MySQL) and NoSQL databases (MongoDB/Cassandra).
- Develop both synchronous (REST, gRPC) and asynchronous communication channels (using message brokers like RabbitMQ or Kafka).
Experience:
- Minimum of 3 years of experience in software engineering or related fields.
- Proven track record of developing generative AI applications.
- Experience in evaluating chat applications for functionality and user engagement.
Soft Skills:
- Strong analytical and problem-solving skills.
- Excellent communication skills for collaboration with team members and stakeholders.
- Ability to work independently as well as part of a team in a fast-paced environment.
Posted 3 weeks ago
2 - 5 years
2 - 5 Lacs
Bengaluru
Work from Office
Databricks Engineer (Full-time)
Department: Digital, Data and Cloud
Company Description
Version 1 has celebrated over 26 years in technology services and continues to be trusted by global brands to deliver solutions that drive customer success. Version 1 has several strategic technology partners including Microsoft, AWS, Oracle, Red Hat, OutSystems, and Snowflake. We're also an award-winning employer, reflecting how employees are at the heart of Version 1. We've been awarded: Innovation Partner of the Year Winner 2023 Oracle EMEA Partner Awards, Global Microsoft Modernising Applications Partner of the Year Award 2023, AWS Collaboration Partner of the Year - EMEA 2023, and Best Workplaces for Women by Great Place To Work in UK and Ireland 2023.
As a consultancy and service provider, Version 1 is a digital-first environment and we do things differently. We're focused on our core values; using these, we've seen significant growth across our practices and our Digital, Data and Cloud team is preparing for the next phase of expansion. This creates new opportunities for driven and skilled individuals to join one of the fastest-growing consultancies globally.
About The Role
This is an exciting opportunity for an experienced developer of large-scale data solutions. You will join a team delivering a transformative cloud-hosted data platform for a key Version 1 customer. The ideal candidate will have a proven track record as a senior, self-starting data engineer in implementing data ingestion and transformation pipelines for large-scale organisations. We are seeking someone with deep technical skills in a variety of technologies, specifically Spark performance tuning/optimisation and Databricks, to play an important role in developing and delivering early proofs of concept and production implementations.
You will ideally have experience in building solutions using a variety of open source tools & Microsoft Azure services, and a proven track record in delivering high-quality work to tight deadlines. Your main responsibilities will be:
- Designing and implementing highly performant, metadata-driven data ingestion & transformation pipelines from multiple sources using Databricks and Spark
- Streaming and batch processes in Databricks
- Spark performance tuning/optimisation
- Providing technical guidance for complex geospatial problems and Spark dataframes
- Developing scalable and re-usable frameworks for ingestion and transformation of large data sets
- Data quality system and process design and implementation
- Integrating the end-to-end data pipeline to take data from source systems to target data repositories, ensuring the quality and consistency of data is maintained at all times
- Working with other members of the project team to support delivery of additional project components (reporting tools, API interfaces, search)
- Evaluating the performance and applicability of multiple tools against customer requirements
- Working within an Agile delivery / DevOps methodology to deliver proof of concept and production implementation in iterative sprints
Qualifications
- Direct experience of building data pipelines using Azure Data Factory and Databricks
- Experience required is 6 to 8 years
- Building data integration with Python
- Databricks Engineer certification
- Microsoft Azure Data Engineer certification
- Hands-on experience designing and delivering solutions using the Azure Data Analytics platform
- Experience building data warehouse solutions using ETL / ELT tools like Informatica, Talend
- Comprehensive understanding of data management best practices, including demonstrated experience with data profiling, sourcing, and cleansing routines utilizing typical data quality functions involving standardization, transformation, rationalization, linking and matching
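The "metadata-driven pipeline" pattern asked for above can be sketched independently of Databricks: a metadata table decides which sources are read and which transforms apply, so onboarding a new feed is a metadata change rather than a code change. The sketch below uses plain Python dicts; all source names and transforms are invented for illustration, and in Databricks the same shape would be driven by a config table plus `spark.read`/`writeStream`.

```python
# Each metadata entry names a source and the ordered transforms to apply.
PIPELINE_METADATA = [
    {"source": "orders",    "transforms": ["strip_nulls", "uppercase_keys"]},
    {"source": "customers", "transforms": ["strip_nulls"]},
]

# Illustrative in-memory "sources" standing in for files or streams.
SOURCES = {
    "orders":    [{"id": 1, "item": "widget"}, {"id": 2, "item": None}],
    "customers": [{"id": 7, "name": "Acme"}, None],
}

# Registry of reusable transforms, looked up by name from the metadata.
TRANSFORMS = {
    "strip_nulls":    lambda rows: [r for r in rows if r is not None],
    "uppercase_keys": lambda rows: [{k.upper(): v for k, v in r.items()} for r in rows],
}

def run_pipeline(metadata, sources, transforms):
    """Apply each configured transform, in order, to each configured source."""
    out = {}
    for entry in metadata:
        rows = sources[entry["source"]]
        for name in entry["transforms"]:
            rows = transforms[name](rows)
        out[entry["source"]] = rows
    return out

if __name__ == "__main__":
    print(run_pipeline(PIPELINE_METADATA, SOURCES, TRANSFORMS))
```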
Nice to have:
- Experience working in a DevOps environment with tools such as Microsoft Visual Studio Team Services, Chef, Puppet or Terraform
- Experience working with structured and unstructured data, including imaging & geospatial data
- Experience with open source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j)
- Experience with Azure Event Hub, IoT Hub, Apache Kafka, NiFi for use with streaming data / event-based data
Additional Information At Version 1, we believe in providing our employees with a comprehensive benefits package that prioritises their well-being, professional growth, and financial stability. One of our standout advantages is the ability to work with a hybrid schedule along with business travel, allowing our employees to strike a balance between work and life. We also offer a range of tech-related benefits, including an innovative Tech Scheme to help keep our team members up-to-date with the latest technology. We prioritise the health and safety of our employees, providing private medical and life insurance coverage, as well as free eye tests and contributions towards glasses. Our team members can also stay ahead of the curve with incentivized certifications and accreditations, including AWS, Microsoft, Oracle, and Red Hat. Our employee-designed Profit Share scheme divides a portion of our company's profits each quarter amongst employees. We are dedicated to helping our employees reach their full potential, offering Pathways Career Development Quarterly, a programme designed to support professional growth.
Posted 1 month ago
5 - 7 years
8 - 14 Lacs
Hyderabad
Work from Office
Responsibilities for this position include:
- Provides technical leadership in the Big Data space (Hadoop stack like M/R, HDFS, Pig, Hive, HBase, Flume, Sqoop; NoSQL stores like Cassandra, HBase etc.) across Fractal and contributes to open source Big Data technologies
- Write and tune complex Java, MapReduce, Pig and Hive jobs
- Adapt quickly to changes in requirements and be willing to work with different technologies if required
- Experience leading a Backend/Distributed Data Systems team while remaining hands-on is very important
- Lead the effort to build, implement and support the data infrastructure
- Manage the business intelligence team and vendor partners, prioritizing projects according to customer and internal needs, and develop top-quality dashboards using industry best practices
- Manage a team of data engineers (both full-time associates and/or third-party resources)
- Own the majority of deliverables for the Big Data team from a delivery perspective
- Analyze and confirm the integrity of source data to be evaluated
- Lead in deploying and auditing models and attributes for accuracy
Education for Lead Data Engineer: A relevant degree such as a Bachelor's or Master's Degree in Computer Science, Engineering, Statistics, Education, Technical, Information Technology, Information Systems, Mathematics, Computer Engineering, or Management
Skills for Lead Data Engineer: Desired skills for a lead data engineer include:
- Python
- Spark
- Java
- Hive
- SQL
- Hadoop architecture
- Large-scale search applications and building high-volume data pipelines
- Message queuing
- NoSQL
- Scala
Desired experience for a lead data engineer includes:
- Experience in development utilizing C# .NET 4.5
- Experience in managing a live service that customers depend on
- Experience in coaching and managing other engineers
- Be a team player and enjoy collaboration with other engineers and teams
- Experience with software version management systems
- Experience with task/bug tracking software
Posted 1 month ago
5 - 9 years
11 - 15 Lacs
Mumbai, Hyderabad
Work from Office
Senior Data Scientist - NAV02EG Company Worley Primary Location IND-MM-Navi Mumbai Other Locations IND-KR-Bangalore, IND-MM-Mumbai, IND-WB-Kolkata, IND-MM-Pune, IND-TN-Chennai Job Digital Solutions Schedule Full-time Employment Type Employee Job Level Experienced Job Posting May 8, 2025 Unposting Date Jun 7, 2025 Reporting Manager Title Head of Data Intelligence Duration of Contract 0 Building on our past. Ready for the future. Worley is a global professional services company of energy, chemicals and resources experts. We partner with customers to deliver projects and create value over the life of their assets. We're bridging two worlds, moving towards more sustainable energy sources, while helping to provide the energy, chemicals and resources needed now. Worley Digital At Worley, our Digital team collaborates closely with the business to deliver efficient, technology-enabled sustainable solutions that will be transformational for Worley. This team, aptly named Worley Digital, is currently seeking talented individuals who would be working on a wide range of the latest technologies, including solutions based on Automation and Generative AI. What drives us at Worley Digital? It's our shared passion for pushing the boundaries of technological innovation, embracing best practices, and propelling Worley to the forefront of industry advancements. If you're naturally curious, open-minded, and a self-motivated learner - one who's ready to invest time and effort to stay future-ready - then Worley could be your ideal workplace. Major Accountabilities of Position An AI/ML Architect must have: Defining, designing, and delivering ML architecture patterns operable in native and hybrid cloud architectures. Collaborate with Enterprise Architecture, Info Security, DevOps and Data Intelligence teams to implement ML solutions. Defining data augmentation pipelines for unstructured data like documents, engineering drawings etc.
Build new network architectures in CNN/LSTM/RCNN or develop wrappers for pre-trained models. Conduct feasibility of transfer learning fitment for a given problem. Research, analyze, recommend, and select technical approaches to address challenging development and data integration problems related to ML model training and deployment in enterprise applications. Perform research activities to identify emerging technologies (Generative AI) and trends that may affect Data Science/ML life-cycle management in the enterprise application portfolio. Design and deploy AI/ML models in real-world environments and integrate AI/ML using cloud-native or hybrid technologies into large-scale enterprise applications. Demonstrated experience developing best practices and recommendations around tools/technologies for ML life-cycle capabilities such as data collection, data preparation, feature engineering, model management, MLOps, model deployment approaches, and model monitoring and tuning. Knowledge / Experience / Competencies Required IT Skills & Experience (priority-wise):
1. Hands-on programming and architecture capabilities in Python.
2. Demonstrated technical expertise in architecting solutions around AI, ML, deep learning and Generative AI related technologies.
3. Experience in implementing and deploying Machine Learning solutions (using various models, such as GPT-4, Llama 2, Mistral AI, text-embedding-ada, Linear/Logistic Regression, Support Vector Machines, (Deep) Neural Networks, Topic Modeling, Game Theory etc.)
4. Understanding of the Nvidia Enterprise NeMo Suite.
5. Expertise in popular deep learning frameworks, such as TensorFlow, PyTorch, and Keras, for building, training, and deploying neural network models.
6. Experience in AI solution development with external SaaS products like Azure OCR.
7. Experience with AI/ML components like Azure ML Studio, JupyterHub, TensorFlow & Scikit-Learn.
8. Hands-on knowledge of API frameworks.
9. Familiarity with the transformer architecture and its applications in natural language processing (NLP), such as machine translation, text summarization, and question-answering systems.
10. Expertise in designing and implementing CNNs for computer vision tasks, such as image classification, object detection, and semantic segmentation.
11. Hands-on experience in RDBMS, NoSQL, and big data stores like Elastic, Cassandra.
12. Experience with open source software.
13. Experience using cognitive APIs and machine learning studios on cloud.
14. Hands-on knowledge of image processing with deep learning (CNN, RNN, LSTM, GAN).
15. Familiarity with GPU computing and tools like CUDA and cuDNN to accelerate deep learning computations and reduce training times.
16. Understanding of the complete AI/ML project life cycle.
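Logistic regression, one of the classical model families the posting lists alongside the deep learning stack, fits in a few lines of plain Python. This is a from-scratch teaching sketch on an invented toy dataset; real project work would use scikit-learn or PyTorch as the posting notes.

```python
import math

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(xs, ys, lr=0.5, epochs=2000):
    """Fit weight w and bias b for a 1-feature logistic model with
    plain batch gradient descent on the cross-entropy loss."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        grad_w = grad_b = 0.0
        for x, y in zip(xs, ys):
            p = sigmoid(w * x + b)
            grad_w += (p - y) * x / n   # dL/dw averaged over the batch
            grad_b += (p - y) / n       # dL/db averaged over the batch
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

def predict(w, b, x) -> int:
    return 1 if sigmoid(w * x + b) >= 0.5 else 0

if __name__ == "__main__":
    # Toy, linearly separable data: label is 1 once x exceeds ~2.5
    xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
    ys = [0, 0, 0, 1, 1, 1]
    w, b = train_logistic(xs, ys)
    print([predict(w, b, x) for x in xs])
```

The decision boundary settles near x = 2.5, midway between the two classes, which is what the symmetric toy data implies.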
Posted 1 month ago
5 - 9 years
11 - 15 Lacs
Mumbai, Hyderabad
Work from Office
Senior Data Scientist - NAV02ED Company Worley Primary Location IND-MM-Navi Mumbai Other Locations IND-KR-Bangalore, IND-MM-Mumbai, IND-WB-Kolkata, IND-MM-Pune, IND-TN-Chennai Job Digital Solutions Schedule Full-time Employment Type Employee Job Level Experienced Job Posting May 8, 2025 Unposting Date Jun 7, 2025 Reporting Manager Title Head of Data Intelligence Duration of Contract 0 Building on our past. Ready for the future. Worley is a global professional services company of energy, chemicals and resources experts. We partner with customers to deliver projects and create value over the life of their assets. We're bridging two worlds, moving towards more sustainable energy sources, while helping to provide the energy, chemicals and resources needed now. Major Accountabilities of Position An AI/ML Architect must have: Defining, designing, and delivering ML architecture patterns operable in native and hybrid cloud architectures. Collaborate with Enterprise Architecture, Info Security, DevOps and Data Intelligence teams to implement ML solutions. Defining data augmentation pipelines for unstructured data like documents, engineering drawings etc. Build new network architectures in CNN/LSTM/RCNN or develop wrappers for pre-trained models. Conduct feasibility of transfer learning fitment for a given problem. Research, analyze, recommend, and select technical approaches to address challenging development and data integration problems related to ML model training and deployment in enterprise applications. Perform research activities to identify emerging technologies (Generative AI) and trends that may affect Data Science/ML life-cycle management in the enterprise application portfolio. Design and deploy AI/ML models in real-world environments and integrate AI/ML using cloud-native or hybrid technologies into large-scale enterprise applications. Demonstrated experience developing best practices and recommendations around tools/technologies for ML life-cycle capabilities such as data collection, data preparation, feature engineering, model management, MLOps, model deployment approaches, and model monitoring and tuning. Knowledge / Experience / Competencies Required IT Skills & Experience (priority-wise):
1. Hands-on programming and architecture capabilities in Python.
2. Demonstrated technical expertise in architecting solutions around AI, ML, deep learning and Generative AI related technologies.
3. Experience in implementing and deploying Machine Learning solutions (using various models, such as GPT-4, Llama 2, Mistral AI, text-embedding-ada, Linear/Logistic Regression, Support Vector Machines, (Deep) Neural Networks, Topic Modeling, Game Theory etc.)
4. Understanding of the Nvidia Enterprise NeMo Suite.
5. Expertise in popular deep learning frameworks, such as TensorFlow, PyTorch, and Keras, for building, training, and deploying neural network models.
6. Experience in AI solution development with external SaaS products like Azure OCR.
7. Experience with AI/ML components like Azure ML Studio, JupyterHub, TensorFlow & Scikit-Learn.
8. Hands-on knowledge of API frameworks.
9. Familiarity with the transformer architecture and its applications in natural language processing (NLP), such as machine translation, text summarization, and question-answering systems.
10. Expertise in designing and implementing CNNs for computer vision tasks, such as image classification, object detection, and semantic segmentation.
11. Hands-on experience in RDBMS, NoSQL, and big data stores like Elastic, Cassandra.
12. Experience with open source software.
13. Experience using cognitive APIs and machine learning studios on cloud.
14. Hands-on knowledge of image processing with deep learning (CNN, RNN, LSTM, GAN).
15. Familiarity with GPU computing and tools like CUDA and cuDNN to accelerate deep learning computations and reduce training times.
16. Understanding of the complete AI/ML project life cycle.
17. Understanding of data structures, data modelling and software architecture.
Posted 1 month ago
5 - 8 years
9 - 14 Lacs
Bengaluru
Work from Office
About The Role Role Purpose The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations and developing technical capability within the Production Specialists. Do: Oversee and support the process by reviewing daily transactions on performance parameters. Review the performance dashboard and the scores for the team. Support the team in improving performance parameters by providing technical support and process guidance. Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions. Ensure standard processes and procedures are followed to resolve all client queries. Resolve client queries as per the SLAs defined in the contract. Develop understanding of the process/product for the team members to facilitate better client interaction and troubleshooting. Document and analyze call logs to spot the most frequently occurring trends to prevent future problems. Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution. Ensure all product information and disclosures are given to clients before and after the call/email requests. Avoid legal challenges by monitoring compliance with service agreements.
Handle technical escalations through effective diagnosis and troubleshooting of client queries. Manage and resolve technical roadblocks/escalations as per SLA and quality requirements. If unable to resolve the issues, escalate them to TA & SES in a timely manner. Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions. Troubleshoot all client queries in a user-friendly, courteous and professional manner. Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business. Organize ideas and effectively communicate oral messages appropriate to listeners and situations. Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs. Build people capability to ensure operational excellence and maintain superior customer service levels of the existing account/client. Mentor and guide Production Specialists on improving technical knowledge. Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists. Develop and conduct trainings (triages) within products for Production Specialists as per target. Inform the client about the triages being conducted. Undertake product trainings to stay current with product features, changes and updates. Enroll in product-specific and any other trainings per client requirements/recommendations. Identify and document the most common problems and recommend appropriate resolutions to the team. Update job knowledge by participating in self-learning opportunities and maintaining personal networks.
Deliver:
No. | Performance Parameter | Measure
1 | Process | No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2 | Team Management | Productivity, efficiency, absenteeism
3 | Capability development | Triages completed, Technical Test performance
Mandatory Skills: Apache Cassandra database. Experience: 5-8 years. Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention - of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Posted 1 month ago
- 5 years
3 - 6 Lacs
Gurugram
Work from Office
The Role and Responsibilities We have open positions ranging from Associate Data Engineer to Lead Data Engineer, providing talented and motivated professionals with excellent career and growth opportunities. We seek individuals with relevant prior experience in quantitatively intense areas to join our team. You'll be working with varied and diverse teams to deliver unique and unprecedented solutions across all industries. In the data engineering track, you will be primarily responsible for developing and monitoring high-performance applications that can rapidly deploy the latest machine learning frameworks and other advanced analytical techniques at scale. This role requires you to be a proactive learner and quickly pick up new technologies whenever required. Most of the projects require handling big data, so you will be required to work on related technologies extensively. You will work closely with other team members to support project delivery and ensure client satisfaction. Your responsibilities will include:
- Working alongside Oliver Wyman consulting teams and partners, engaging directly with clients to understand their business challenges
- Exploring large-scale data and designing, developing, and maintaining data/software pipelines and ETL processes for internal and external stakeholders
- Explaining, refining, and developing the necessary architecture to guide stakeholders through the journey of model building
- Advocating application of best practices in data engineering, code hygiene, and code reviews
- Leading the development of proprietary data engineering assets, ML algorithms, and analytical tools on varied projects
- Creating and maintaining documentation to support stakeholders and runbooks for operational excellence
- Working with partners and principals to shape proposals that showcase our data engineering and analytics capabilities
- Travelling to clients' locations across the globe, when required, understanding their problems, and delivering appropriate solutions in collaboration with them
- Keeping up with emerging state-of-the-art data engineering techniques in your domain
Your Attributes, Experience & Qualifications
- Bachelor's or master's degree in a computational or quantitative discipline from a top academic program (Computer Science, Informatics, Data Science, or related)
- Exposure to building cloud-ready applications
- Exposure to test-driven development and integration
- Pragmatic and methodical approach to solutions and delivery with a focus on impact
- Independent worker with the ability to manage workload and meet deadlines in a fast-paced environment
- Collaborative team player
- Excellent verbal and written communication skills and command of English
- Willingness to travel
- Respect for confidentiality
Technical Background
- Prior experience in designing and deploying large-scale technical solutions
- Fluency in modern programming languages (Python is mandatory; R, SAS desired)
- Experience with AWS/Azure/Google Cloud, including familiarity with services such as S3, EC2, Lambda, Glue
- Strong SQL skills and experience with relational databases such as MySQL, PostgreSQL, or Oracle
- Experience with big data tools like Hadoop, Spark, Kafka
- Demonstrated knowledge of data structures and algorithms
- Familiarity with version control systems like GitHub or Bitbucket
- Familiarity with modern storage and computational frameworks
- Basic understanding of agile methodologies such as CI/CD, Applicant Resiliency, and Security
Valued but not required:
- Compelling side projects or contributions to the Open-Source community
- Prior experience with machine learning frameworks (e.g., Scikit-Learn, TensorFlow, Keras/Theano, Torch, Caffe, MxNet)
- Familiarity with containerization technologies, such as Docker and Kubernetes
- Experience with UI development using frameworks such as Angular, Vue, or React
- Experience with NoSQL databases such as MongoDB or Cassandra
- Experience presenting at data science conferences and connections within the data science community
- Interest/background in Financial Services in particular, as well as other sectors where Oliver Wyman has a strategic presence
Roles and levels We are hiring for engineering roles across the levels, from Associate Data Engineer to Lead Data Engineer, for experience ranging from 0-8 years. In addition to the base salary, this position may be eligible for performance-based incentives.
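The ETL work and SQL skills called out above can be illustrated with a compact extract-transform-load pass using only the standard library. The table name, column names and CSV content below are invented for the example; production pipelines would target tools like those listed (Spark, a managed warehouse, an orchestration layer).

```python
import csv
import io
import sqlite3

# Illustrative raw feed standing in for an extracted CSV file.
RAW_CSV = """id,name,amount
1,alice,100.5
2,bob,
3,carol,42
"""

def extract(text: str):
    """Extract: parse CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop rows missing an amount, cast types, normalise names."""
    out = []
    for r in rows:
        if not r["amount"]:        # skip incomplete records
            continue
        out.append((int(r["id"]), r["name"].title(), float(r["amount"])))
    return out

def load(rows, conn):
    """Load: write the cleaned rows into a relational table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS payments (id INTEGER, name TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO payments VALUES (?, ?, ?)", rows)
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    load(transform(extract(RAW_CSV)), conn)
    print(conn.execute("SELECT name, amount FROM payments ORDER BY id").fetchall())
```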
Posted 1 month ago
7 - 12 years
25 - 30 Lacs
Pune, Bengaluru
Work from Office
Tech Lead/Associate Architect/Architect - Artificial Intelligence and Machine Learning Technical Skills:
- Candidates with 7-17 years of total experience
- Strong experience in Artificial Intelligence and Machine Learning
- Experience with common data science toolkits; Python is a must
- Should have worked on concurrency, data pipelines and data ingestion for models
- Should have actually worked on ML models beyond parameter tuning and interfacing
- Experience with data visualization tools, such as Tableau, Power BI, D3.js, ggplot, etc. - mandatory for Architect
- Experience with SQL databases and time series databases
- Experience with NoSQL databases such as MongoDB, Cassandra, HBase would be an added advantage
Other Skills:
- A Bachelor's Degree from an accredited college or university or equivalent years (4 years)
- Creates and manages a machine learning pipeline, from raw data acquisition to merging and normalizing to sophisticated feature engineering development to model execution
- Designs, leads and actively engages in projects with broad implications for the business and/or the future architecture, successfully addressing cross-technology and cross-platform issues
- Selects tools and methodologies for projects and negotiates terms and conditions with vendors
- Curiosity about and a deep interest in how digital technology and systems are powering the way users do their jobs
- Comfortable working in a dynamic environment where digital is still evolving as a core offering
- For the Architect role, supporting business development and presales activities is a must
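The "raw data to merging and normalizing to feature engineering" stage of the pipeline described above can be sketched in standard-library Python. The column names and the outlier rule here are invented for illustration; a real pipeline would use pandas or Spark and project-specific features.

```python
import statistics

def normalize(values):
    """Min-max scale a numeric column into [0, 1]."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

def engineer_features(rows):
    """Turn raw records into model-ready features: add a scaled copy of
    'amount' and a z-score-based outlier flag (threshold |z| > 2)."""
    amounts = [r["amount"] for r in rows]
    scaled = normalize(amounts)
    mean = statistics.mean(amounts)
    stdev = statistics.pstdev(amounts)
    out = []
    for r, s in zip(rows, scaled):
        z = (r["amount"] - mean) / stdev if stdev else 0.0
        out.append({**r, "amount_scaled": s, "is_outlier": abs(z) > 2})
    return out

if __name__ == "__main__":
    raw = [{"amount": 0.0}, {"amount": 5.0}, {"amount": 10.0}]
    print(engineer_features(raw))
```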
Posted 1 month ago
13 - 20 years
14 - 18 Lacs
Chennai
Work from Office
About The Role Solution Architects assess a project's technical feasibility, as well as implementation risks. They are responsible for the design and implementation of the overall technical and solution architecture. They define the structure of a system, its interfaces, the solution principles guiding the organisation, the software design and the implementation. The scope of the Solution Architect's role is defined by the business issue at hand. To fulfil the role, a Solution Architect utilises business and technology expertise and experience. About The Role - Grade Specific Managing Solution/Delivery Architect - Design, deliver and manage complete solutions. Demonstrate leadership of topics in the architect community and show a passion for technology and business acumen. Work as a stream lead at CIO/CTO level for an internal or external client. Lead Capgemini operations relating to market development and/or service delivery excellence. Are seen as a role model in their (local) community. Certification: preferably Capgemini Architects certification level 2 or above, relevant solution certifications, IAF and/or industry certifications such as TOGAF 9 or equivalent.
Skills (competencies): SDLC Methodology, Active Listening, Adaptability, Agile (Software Development Framework), Analytical Thinking, APIs, Automation (Frameworks), AWS (Cloud Platform), AWS Architecture, Business Acumen, Business Analysis, C#, Capgemini Integrated Architecture Framework (IAF), Cassandra (NoSQL Database), Change Management, Cloud Architecture, Coaching, Collaboration, Confluence, Delegation, DevOps, Docker, ETL Tools, Executive Presence, GitHub, Google Cloud Platform (GCP), IAF (Framework), Influencing, Innovation, Java (Programming Language), Jira, Kubernetes, Managing Difficult Conversations, Microsoft Azure DevOps, Negotiation, Network Architecture, Oracle (Relational Database), Problem Solving, Project Governance, Python, Relationship-Building, Risk Assessment, Risk Management, SAFe, Salesforce (Integration), SAP (Integration), SharePoint, Slack, SQL Server (Relational Database), Stakeholder Management, Storage Architecture, Storytelling, Strategic Thinking, Sustainability Awareness, Teamwork, Technical Governance, Time Management, TOGAF (Framework), Verbal Communication, Written Communication
Posted 1 month ago
9 - 14 years
14 - 18 Lacs
Kolkata
Work from Office
Capgemini Invent Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science and creative design, to help CxOs envision and build what's next for their businesses. Your role
- 5+ years of experience in creating data strategy frameworks/roadmaps
- Relevant experience in data exploration & profiling; involvement in data literacy activities for all stakeholders
- 5+ years in analytics and data maturity evaluation based on the current as-is vs to-be framework
- 5+ years of relevant experience in creating functional requirements documents and enterprise to-be data architecture
- Relevant experience in identifying and prioritizing use cases for the business; important KPI identification and opex/capex for CXOs
- 2+ years of working knowledge in Data Strategy, Data Governance/MDM etc.
- 4+ years of experience in a Data Analytics operating model with vision on prescriptive, descriptive, predictive and cognitive analytics
- Identify, design, and recommend internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Identify data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader
- Work with data and analytics experts to create frameworks for digital twins/digital threads
- Relevant experience in coordinating with cross-functional teams; acting as the SPOC for global master data
Your Profile 8+ years of experience in a Data Strategy role, with a graduate degree in Computer Science, Informatics, Information Systems, or another quantitative field. You should also have experience using the following software/tools:
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra/MongoDB
- Experience with data pipeline and workflow management tools: Luigi, Airflow, etc.
- Good-to-have cloud skillsets (Azure/AWS/GCP)
- 5+ years of advanced working SQL knowledge and experience working with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases (Postgres/SQL/Mongo)
- 2+ years of working knowledge in Data Strategy, Data Governance/MDM etc.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
- Strong analytic skills related to working with unstructured datasets
- A successful history of manipulating, processing, and extracting value from large, disconnected datasets
- Working knowledge of message queuing, stream processing, and highly scalable big data stores
- Strong project management and organizational skills
- Experience supporting and working with cross-functional teams in a dynamic environment
About Capgemini Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.
Posted 1 month ago
12 - 15 years
20 - 25 Lacs
Bengaluru
Work from Office
About this opportunity: This position plays a crucial role in the development of Python-based solutions, their deployment within a Kubernetes-based environment, and ensuring the smooth data flow for our machine learning and data science initiatives. The ideal candidate will possess a strong foundation in Python programming, hands-on experience with ElasticSearch, Logstash, and Kibana (ELK), a solid grasp of fundamental Spark concepts, and familiarity with visualization tools such as Grafana and Kibana. Furthermore, a background in ML Ops and expertise in both machine learning model development and deployment will be highly advantageous What you will do: Generative AI & LLM Development, 12-15 Yrs of experience as Enterprise Software Architect with strong hands-on experience Strong hands-on experience in Python and microservice architecture concepts and development Expertise in crafting technical guides, architecture designs for AI platform Experience in Elastic Stack , Cassandra or any Big Data tool Experience with advance distributed systems and tooling, for example, Prometheus, Terraform, Kubernetes, Helm, Vault, CI/CD systems. Prior experience to build multiple AI/ML based models and deployed the models into production environment and creating the data pipelines Experience in guiding teams working on AI, ML, BigData and Analytics Strong understanding of development practices like architecture design, coding, test and verification. Experience with delivering software products, for example release management, documentation What you will Bring: Python Development: Write clean, efficient, and maintainable Python code to support data engineering tasks, including data collection, transformation, and integration with machine learning models. 
- Data Pipeline Development: Design, develop, and maintain robust data pipelines that efficiently gather, process, and transform data from various sources into a format suitable for machine learning and data science tasks, using the ELK stack, Python, and other leading technologies.
- Spark Knowledge: Apply basic Spark concepts for distributed data processing when necessary, optimizing data workflows for performance and scalability.
- ELK Integration: Utilize ElasticSearch, Logstash, and Kibana (ELK) for data management, data indexing, and real-time data visualization. Knowledge of OpenSearch and its related stack would be beneficial.
- Grafana and Kibana: Create and manage dashboards and visualizations using Grafana and Kibana to provide real-time insights into data and system performance.
- Kubernetes Deployment: Deploy data engineering solutions and machine learning models to a Kubernetes-based environment, ensuring security, scalability, reliability, and high availability.
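The ELK integration work described above typically involves shaping records into the newline-delimited JSON (NDJSON) body that Elasticsearch's `_bulk` endpoint expects: an action line followed by a source document, one JSON object per line. A minimal sketch in pure Python, with an assumed index name (`logs`) and invented sample records; a real pipeline would POST this body with an HTTP client or the official Elasticsearch client.

```python
# Hedged sketch: build an Elasticsearch _bulk request body without any
# client library. Index name and records are made up for illustration.
import json

def to_bulk_body(index, docs):
    """Interleave action lines and source documents, one JSON object per line."""
    lines = []
    for doc in docs:
        lines.append(json.dumps({"index": {"_index": index, "_id": doc["id"]}}))
        lines.append(json.dumps(doc))
    return "\n".join(lines) + "\n"  # _bulk requires a trailing newline

body = to_bulk_body("logs", [
    {"id": 1, "level": "INFO", "msg": "pipeline started"},
    {"id": 2, "level": "ERROR", "msg": "retrying shard"},
])
print(body)
```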
Posted 1 month ago
2 - 6 years
10 - 14 Lacs
Pune
Work from Office
About The Role: Job Title: Data Tribe | Java Backend Engineer. Location: Pune (India).

Role Description: In our squad Data Driven Services we develop and maintain high-quality microservices for digital private banking solutions, such as our messaging product InfoServices and the digital customer advisory Finanzcheck. We are part of the tribe Customer Data Intelligence in TDI PB Germany, which covers the complete value chain of customer data processing, including mobilization, integration, enrichment, and storage. On our platforms we offer analytical capabilities and value-added services. Exciting tasks await us in the context of the bank's cloud strategy (Google partnership). If you are looking for a new challenge in a truly agile team building excellent solutions, now is a perfect time to join us as an Engineer. As an Engineer you will build significant parts of the entire engineering solution. You will actively participate in the solution design and help ensure that the solutions are maintainable and stable, built according to the DevOps philosophy. You will engage in the agile team and strive to expand your solid technical expertise.

What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 yrs. and above

Your key responsibilities:
- Design and maintain high-quality, data-driven microservices
- Follow the DevOps model to build stable, ideally zero-ops applications, and share the responsibility for a stable production
- Find new ways to work more efficiently and support the introduction of productivity-increasing measures, leaner processes, efficient tools, and automation in general
- Actively and continuously look to expand your understanding of technologies currently or prospectively employed in our squad
- Act as a true agile team member, prioritizing team goals
- Deepen your knowledge of architecture and solution design

Your skills and experience:
- Successfully completed degree in computer science, maths, or a related field
- Extensive demonstrated hands-on experience using Java; experience designing, developing, and maintaining complex applications; database experience (SQL Server and/or Oracle)
- Java 8, Spring, Hibernate, REST APIs, JUnit, Oracle/SQL/PL SQL, Linux, shell scripting, JMS, MQ
- Jira, Confluence, Git, Maven, TeamCity/Jenkins, Artifactory
- Solid understanding of design patterns
- Experience working in an Agile/DevOps environment
- Strong analytical and design skills
- Proficient communication skills (written/verbal)
- Experience with Google Cloud, AWS, or Azure
- Event-based frameworks, Spring Boot, Apache Camel, Kafka, BigQuery, Apache Beam, NoSQL (PostgreSQL, Couchbase, Cassandra, Oracle NoSQL), Docker, Kubernetes, OpenShift, GKE
- TDD, BDD
- Experience in the Investment Banking/Financial domain
- Practical experience with build tools (preferably Maven), source code control (preferably Git), continuous integration (Hudson, Jenkins, or TeamCity), and cloud/Docker-based application deployment
- TDD experience; data modeling experience
- Ability to produce well-tested, documented, high-performance code to tight schedules
- Minimum of 3 years' hands-on programming experience with Java
- Deep knowledge of frameworks such as Spring Boot
- Profound knowledge of design patterns and principles
- Deep understanding of database models and SQL tools
- Understanding of ITIL processes in a DevOps context
- Experience in agile environments with methodologies such as Scrum, Kanban, or similar
- Excellent command of spoken and written English
- Experience working in distributed, multi-cultural teams

How we'll support you:
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative, and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair, and inclusive work environment.
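The messaging work above (JMS, MQ, Kafka) usually implies at-least-once delivery, so consumers must tolerate duplicate redeliveries. The role is Java-centric, but the pattern is language-agnostic; here is a minimal sketch in Python using the stdlib `queue` module, with an invented message shape and a set of seen IDs standing in for a dedupe store.

```python
# Language-agnostic sketch of an idempotent, at-least-once consumer.
# Message shape ({"id": ..., "body": ...}) and the in-memory dedupe set
# are assumptions for the demo; real systems persist processed IDs.
import queue

def consume(q, handler, seen):
    """Drain the queue, skipping messages whose id was already processed."""
    results = []
    while True:
        try:
            msg = q.get_nowait()
        except queue.Empty:
            break
        if msg["id"] in seen:      # duplicate redelivery -> ignore
            continue
        seen.add(msg["id"])
        results.append(handler(msg))
    return results

q = queue.Queue()
for m in [{"id": 1, "body": "a"}, {"id": 1, "body": "a"}, {"id": 2, "body": "b"}]:
    q.put(m)

processed = consume(q, lambda m: m["body"].upper(), set())
print(processed)  # duplicate id 1 is handled only once
```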
Posted 1 month ago
5 - 10 years
45 - 50 Lacs
Pune
Work from Office
About The Role: Job Title: Solution Architect, VP. Location: Pune, India. Corporate Title: VP.

Role Description: Your role will be as an individual contributor in the team. You will work closely with a team comprising engineers, a lead, functional analysts, and a test lead. The team is responsible for developing and implementing microservices, front-end application development and enhancements, and integrating further partner and client integrations. As a Solution Architect you are expected to be hands-on with software development, contribute towards good software design, and test developed software. You will also be engaged in peer code reviews and will document design decisions and component APIs. You will participate in daily stand-up meetings, analyse software defects and fix them in a timely manner, and work closely with the Functional Analysis and Quality Assurance teams. As and when required, you are also expected to train other team members to bring them up to speed. The bank's Corporate Bank division is a leading provider of cash management, trade finance, and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion, and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.
What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 yrs. and above

Your key responsibilities: As a Solution Architect, you will be responsible for creating the architecture and design of applications on the cloud platform. You are expected to have practical architecture depth to meet business and technical requirements, along with hands-on engineering experience. You are expected to create the design and architecture blueprint for multi-region, highly available applications on Google Cloud Platform. You will be responsible for the design and implementation of various non-functional requirements like security, scalability, observability, disaster recovery, and data protection. You will provide technical leadership to the engineering teams and deliver the application releases. You will work with the SRE and support teams to help bring architectural improvements.

Your skills and experience:
- Expert-level cloud architecture experience at the solution design and implementation level
- Overall 15+ years of hands-on coding and engineering experience, with at least 5 years of experience designing and building applications on cloud platforms
- Cloud certifications for GCP (preferred), AWS, or Azure
- Well versed in the Well-Architected Framework pillars: Security, Availability, Reliability, Operational Excellence, and Cost Optimization
- Hands-on experience with cloud services like Kubernetes, API gateways, load balancers, cloud storage services, VPCs, NAT gateways, Cloud SQL databases, VMs, and compute services like Cloud Run
- Hands-on development experience building applications using Core Java, Spring Boot, REST APIs, databases like Oracle and MongoDB, and Apache Kafka
- Good knowledge of frontend technologies like JavaScript, React.js, and TypeScript
- Experience designing multi-region Disaster Recovery (DR) solutions and achieving Recovery Point Objectives (RPO) and Recovery Time Objectives (RTO)
- Experience building highly available, low-latency, high-volume applications, including performance testing and tuning
- Good knowledge of microservices architecture
- Working knowledge of DevOps tools like Jenkins/GitHub Actions, Terraform, and Helm charts
- Experience building application observability using tools like Prometheus/Grafana and New Relic, and creating SLO dashboards
- Good understanding of security principles like encryption techniques, handling security vulnerabilities, and building solutions to prevent DDoS attacks

Nice-to-have skills:
- Functional: payment industry overview, payment processing, real-time payments processing
- Shell scripting
- Change management process exposure
- Software and infrastructure production promotion experience
- Test automation frameworks
- Moderate coding skills in Python
- Experience in distributed system development
- Cross-platform development in several CPU/operating system environments and network protocols
- Demonstrated expertise in problem-solving and technical innovation
- Data structures, algorithms, and design patterns
- Data stores, persistence, caching (Oracle, MongoDB, Cassandra, Hadoop tools, Memcached, etc.)

How we'll support you:
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative, and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair, and inclusive work environment.
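The SLO dashboards mentioned in this posting rest on simple arithmetic: an availability target implies a fixed "error budget" of allowed downtime per window. A quick sketch; the 99.9% target and 30-day window are illustrative assumptions, not figures from the posting.

```python
# Error-budget arithmetic behind an availability SLO dashboard.
# Target (99.9%) and window (30 days) are assumed for illustration.
def error_budget_minutes(slo, days=30):
    """Allowed downtime, in minutes, over the window for a given SLO."""
    total_minutes = days * 24 * 60
    return (1 - slo) * total_minutes

budget = error_budget_minutes(0.999)   # 99.9% over 30 days
print(round(budget, 1))                # -> 43.2 minutes
```

This is why "one more nine" is expensive: 99.99% over the same window leaves only about 4.3 minutes of budget.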
Posted 1 month ago
2 - 3 years
0 - 0 Lacs
Bengaluru
Work from Office
Experience Range: 2-3 years
Hiring Location: Bangalore

Must-Have Skills:
- Proficiency in Java and/or Python with strong OOP concepts
- Solid experience with SQL and relational databases (e.g., MySQL, PostgreSQL)
- Experience with the Spring Framework (for Java) or Django/Flask (for Python)
- Basic understanding of Unix shell scripting
- Strong analytical, problem-solving, and communication skills
- Proficiency with version control systems like Git

Good-to-Have Skills:
- Knowledge of Hibernate, JPA, or Pandas
- Experience with NoSQL databases such as MongoDB or Cassandra
- Familiarity with front-end technologies like HTML, CSS, and JavaScript, and frameworks such as React, Angular, or Vue.js
- Hands-on experience with RESTful API development
- Exposure to test automation frameworks like JUnit and PyTest
- Experience with CI/CD tools such as Jenkins and GitLab
- Familiarity with cloud platforms (AWS, Azure, GCP)
- Experience in the investment banking domain

Required Skills: Java/Python, OOP, SQL
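The posting lists PyTest exposure as good-to-have. PyTest collects plain functions named `test_*` and uses bare `assert` statements, so a minimal example also runs as ordinary Python. The `slugify` helper below is a made-up function introduced purely to have something to test.

```python
# Illustrative PyTest-style tests. slugify is a hypothetical helper;
# pytest would discover the test_* functions automatically, but they
# can also be called directly as shown.
import re

def slugify(title):
    """Lower-case a title and join word runs with hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")

def test_basic():
    assert slugify("Hello World") == "hello-world"

def test_punctuation_collapses():
    assert slugify("Java/Python, OOPS!") == "java-python-oops"

# Run directly, without the pytest runner:
test_basic()
test_punctuation_collapses()
print("all tests passed")
```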
Posted 1 month ago
6 - 10 years
11 - 21 Lacs
Bengaluru
Hybrid
RESPONSIBILITIES:
- Choose the right technologies for our use cases, then deploy and operate them
- Set up data stores: structured, semi-structured, and unstructured
- Secure data at rest via encryption
- Implement tooling to securely access multiple data sources
- Implement solutions to run real-time analytics
- Use container technologies

Required Experience & Skills:
- Experience in one of the following: ElasticSearch, Cassandra, Hadoop, MongoDB
- Experience in Spark and Presto/Trino
- Experience with microservice-based architectures
- Experience with Kubernetes
- Experience with Unix/Linux environments is a plus
- Experience with Agile/Scrum development methodologies is a plus
- Cloud knowledge is a big plus (AWS/GCP, Kubernetes/Docker)
- Be nice and respectful, and able to work in a team
- Willingness to learn
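"Real-time analytics" in the responsibilities above means computing over a stream incrementally instead of loading the whole dataset. A hedged, framework-free sketch of the idea: a Python generator that emits a running average as events arrive (the event values are invented); Spark Structured Streaming or Trino would apply the same incremental-aggregation principle at scale.

```python
# Minimal streaming-aggregation sketch: consume events one at a time and
# yield a running average, keeping only O(1) state. Event values are made up.
def running_average(stream):
    total, count = 0.0, 0
    for value in stream:
        total += value
        count += 1
        yield total / count   # result is available after every event

events = iter([10, 20, 30, 40])   # stand-in for a live event source
averages = list(running_average(events))
print(averages)  # -> [10.0, 15.0, 20.0, 25.0]
```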
Posted 1 month ago
Cassandra is a popular open-source distributed database management system that is widely used in the tech industry. In India, the demand for professionals with Cassandra skills is on the rise, with many companies actively hiring for roles related to this technology.
Here are 5 major cities in India where there is a high demand for Cassandra professionals:
The salary range for Cassandra professionals in India varies based on experience level. Entry-level positions can expect to earn around INR 5-8 lakhs per annum, while experienced professionals can earn upwards of INR 15 lakhs per annum.
Typically, a career in Cassandra progresses from roles such as Junior Developer or Database Administrator to Senior Developer, Tech Lead, and eventually Architect or Data Engineer.
In addition to proficiency in Cassandra, employers often look for candidates with the following skills:
As you explore job opportunities in the Cassandra domain, make sure to brush up on your skills and be well-prepared for interviews. With the right preparation and confidence, you can land a rewarding career in this growing field. Good luck!