8.0 - 12.0 years
0 Lacs
thiruvananthapuram, kerala
On-site
UST is looking for a Lead AI Engineer with at least 8 years of experience to join its team in Trivandrum. As a Lead AI Engineer, you will be responsible for designing and implementing end-to-end AI/ML solutions using Python and deep learning frameworks like Keras. Your role will involve working with Generative AI and LLM architectures such as GPT-3 and BERT, as well as leading AI/ML teams and mentoring junior engineers. Your key responsibilities will include developing data pipelines and infrastructure using tools like Kafka, Big Data tools, Aurora DB, and the ELK Stack. You will also be expected to deploy and optimize ML models on cloud platforms such as AWS, Azure, or GCP. Your expertise in Time Series Forecasting, Predictive Analytics, and Deep Learning will be crucial in translating business requirements into scalable AI solutions. In addition to hands-on experience in AI and Machine Learning, you should have a strong programming background in Python, Keras, and Pandas. Knowledge of advanced ML tooling and models such as LangChain, GPT-3, and Transformers, as well as analytical models such as Time Series Forecasting, will be highly beneficial. Familiarity with vector databases like ChromaDB or Pinecone is also desired. Experience with PoC creation, model experimentation, and AI governance principles will be an advantage. You should also keep abreast of emerging trends and tools in the AI/ML space to evaluate their strategic fit and contribute to AI roadmap planning. UST is a global digital transformation solutions provider known for partnering with leading companies to drive real impact through innovation. With a workforce of over 30,000 employees in 30 countries, UST is committed to embedding innovation and agility into its clients' organizations, touching billions of lives in the process.
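To make the forecasting requirement concrete, here is a minimal, hedged sketch of a univariate time-series model in Keras, the framework the posting names; the window length, layer sizes, and synthetic data are illustrative assumptions, not anything specified by UST.

```python
import numpy as np
import tensorflow as tf

def make_windows(series, window=24):
    """Slice a 1-D series into (window -> next value) training pairs."""
    X = np.stack([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X[..., np.newaxis], y

# Synthetic demand series stands in for real business data.
series = np.sin(np.linspace(0, 60, 2000)) + np.random.normal(0, 0.1, 2000)
X, y = make_windows(series)

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(32, input_shape=(24, 1)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=3, batch_size=64, verbose=0)

next_value = model.predict(X[-1:], verbose=0)  # one-step-ahead forecast
```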
Posted 1 week ago
3.0 - 14.0 years
0 Lacs
hyderabad, telangana
On-site
As an Enterprise Architect & AI Expert, your role will involve defining and maintaining the enterprise architecture framework, standards, and governance. You will align IT strategy with business goals to ensure architectural integrity across systems and platforms. Leading the development of roadmaps for cloud, data, application, and infrastructure architectures will be a key responsibility. It will also be crucial to evaluate and select technologies, platforms, and tools that align with enterprise goals. You will be responsible for designing and implementing AI/ML solutions to solve complex business problems. Leading AI initiatives such as NLP, computer vision, predictive analytics, and generative AI will be part of your duties. Collaborating with data scientists, engineers, and business stakeholders to deploy AI models at scale will be essential. Ensuring ethical AI practices, data governance, and compliance with regulatory standards will also be critical. In terms of leadership and collaboration, you will act as a strategic advisor to senior leadership on technology trends and innovation. Mentoring cross-functional teams, promoting architectural best practices, and facilitating enterprise-wide workshops and architecture review boards will be part of your role. To qualify for this position, you should have a Bachelors or Masters degree in Computer Science, Engineering, or a related field. You should possess 14+ years of experience in enterprise architecture, with at least 3 years in AI/ML domains. Proven experience with cloud platforms such as AWS, Azure, GCP, microservices, and API management is required. Strong knowledge of AI/ML frameworks like TensorFlow, PyTorch, Scikit-learn, and MLOps practices is essential. Familiarity with data architecture, data lakes, and real-time analytics platforms is also expected. Excellent communication, leadership, and stakeholder management skills are necessary for this role. Mandatory skills for this position include experience in designing GenAI and RAG architectures, familiarity with AWS, Vector DB - Milvus (preferred), OpenAI or Claude, LangChain, and LlamaIndex. Thank you for considering this opportunity. Siva,
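The GenAI/RAG requirement reduces to retrieve-then-generate: embed documents, fetch the most similar ones for a question, and pass them to the LLM as context. A framework-free sketch of the retrieval step follows; the posting names LangChain, LlamaIndex, and Milvus, but this minimal version uses only a sentence encoder and NumPy, and the documents and model name are placeholders.

```python
import numpy as np
from sentence_transformers import SentenceTransformer

encoder = SentenceTransformer("all-MiniLM-L6-v2")  # any sentence encoder works

documents = [
    "Refund requests must be filed within 30 days of purchase.",
    "Enterprise customers get a dedicated support manager.",
]
doc_vectors = encoder.encode(documents, normalize_embeddings=True)

def retrieve(question: str, k: int = 1) -> list:
    """Rank documents by cosine similarity to the question."""
    q = encoder.encode([question], normalize_embeddings=True)[0]
    scores = doc_vectors @ q
    return [documents[i] for i in np.argsort(-scores)[:k]]

question = "How long do I have to ask for a refund?"
context = "\n".join(retrieve(question))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
# `prompt` would then be sent to the chosen LLM (OpenAI, Claude, etc.).
```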
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
pune, maharashtra
On-site
Join us as a "Lending DevOps Efficiency Engineering Lead" at Barclays, where you will play a crucial role in supporting the successful delivery of Location Strategy projects. Your responsibilities will include planning, budgeting, ensuring agreed quality, and adhering to governance standards. As a key member of the team, you will lead the evolution of our digital landscape, driving innovation and excellence to enhance our digital offerings and deliver unparalleled customer experiences. To excel in this role, you should have a minimum of 7 years of project management experience in technology environments with a strong DevOps background. You must possess a solid understanding of the software development lifecycle and DevOps practices and have experience implementing security controls within development processes. Additionally, familiarity with CI/CD pipelines, infrastructure as code, and automation tools is essential. Excellent communication and stakeholder management skills, along with a proven ability to drive organizational change and process improvements, are key requirements. Experience with Agile methodologies and tools is also highly desirable. In addition to the above requirements, the following skills will be considered advantageous: - Possession of PMP, CSM, or other project management certifications - Experience working with Java-based applications and microservices architecture - Knowledge of cloud platforms such as AWS and OpenShift - Understanding of regulatory compliance requirements in technology - Experience with security frameworks - Background in financial services or other regulated industries As the "Lending DevOps Efficiency Engineering Lead," you will be based in Pune and will be responsible for leading and managing engineering teams. Your primary focus will be to provide technical guidance, mentorship, and support to ensure the delivery of high-quality software solutions. By driving technical excellence, fostering innovation, and collaborating with cross-functional teams, you will align technical decisions with business objectives and contribute to the overall success of the projects. Your key accountabilities will include: - Leading engineering teams effectively to achieve project goals and organizational objectives - Overseeing timelines, team allocation, risk management, and task prioritization for successful solution delivery - Mentoring team members, conducting performance reviews, and identifying opportunities for growth - Evaluating and enhancing engineering processes, tools, and methodologies to increase efficiency and optimize team productivity - Collaborating with stakeholders to translate business requirements into technical solutions and ensure a cohesive approach to product development - Enforcing technology standards, facilitating peer reviews, and implementing robust testing practices to deliver high-quality solutions If you are appointed as an Assistant Vice President, you will be expected to advise and influence decision-making, contribute to policy development, and ensure operational effectiveness. As a leader, you will set objectives, coach employees, and drive performance excellence. If you are an individual contributor, you will lead collaborative assignments, guide team members, and identify new directions for projects to meet required outcomes. All colleagues at Barclays are expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship. 
Additionally, they should embody the Barclays Mindset of Empower, Challenge, and Drive, which serves as the operating manual for the organization's behavior.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As a Senior Associate DevOps Engineer at NTT DATA, you will be a developing subject matter expert responsible for consulting with internal clients regarding the software technology environment with medium complexity. Your primary role will involve translating business requirements into technology terms, developing algorithms and programming code for software solutions, and designing, testing, implementing, and maintaining software code to enable computer systems to perform specific tasks. Key Responsibilities: - Work closely with Platform Specialists to receive detailed business/user requirement specifications. - Assist in preparing technical requirements documents and systems flowcharts. - Analyze business requirements and ensure that the designed solution meets those requirements. - Transform detailed requirements into a complete systems design document. - Compile detailed technical specifications for developers or IT Programmers. - Convert designs into a complete information system, including testing and refining programs. - Document the development process and procedures for application use and security. - Identify and correct program errors in logic and data. - Assist with the installation, deployment, and resolution of any problems in the integration and test phases. - Support all code sets and implementations in production in accordance with defined Service Level Agreements (SLAs). - Write and distribute task descriptions for operating and maintaining the implemented solution. To excel in this role, you need to have: - Knowledge of multi-technology application design best practices. - Ability to take holistic views of application environments. - Strong problem-solving and analytical skills. - Technical understanding of development and platform engineering. - Client-centric focus on business outcomes. - Excellent planning and organizational skills. - Effective communication skills. - Attention to detail and ability to work well in a team environment. - Knowledge of project management principles. - Knowledge of continuous integration and deployment processes. - Experience with deployment and release management. - Knowledge of automation tooling and source code repositories. Academic Qualifications and Certifications: - Bachelor's degree in Information Technology or related field. - Relevant DevOps certification preferred. - Relevant Agile related certification preferred. - Certification of cloud platforms and services (e.g., AWS, Azure, GCP). - Certification of scripting and programming languages (e.g., Bash, Python, Ruby). Required Experience: - Experience in Software Development and Support. - Experience deploying software solutions in an outsourced or similar IT environment. - Experience working in a multi-team environment across multiple geographies. - Experience in programming/development including Agile processes such as SCRUM, KANBAN. Workplace Type: Hybrid Working About NTT DATA: NTT DATA is a trusted global innovator of business and technology services committed to helping clients innovate, optimize, and transform for long-term success. With a diverse workforce and a robust partner ecosystem, we provide business and technology consulting, data and artificial intelligence services, industry solutions, and application development and management. As a leading provider of digital and AI infrastructure, NTT DATA invests in research and development to support organizations and society in the digital future. NTT DATA is an Equal Opportunity Employer.,
Posted 1 week ago
12.0 - 19.0 years
0 Lacs
karnataka
On-site
As a technical leader at Sun Life Global Solutions (SLGS), you will play a crucial role in shaping the technical strategy and driving innovation to support our mission of helping clients achieve lifetime financial security and live healthier lives. You will collaborate with architects to make build vs. buy decisions and create technical roadmaps. Evaluating and selecting appropriate technology stacks, platforms, and vendors will be key responsibilities, including web application frameworks and cloud providers for solution development. Your involvement in team ceremonies and removing technical impediments will be essential to ensure project success. You will be responsible for owning the success of foundational enablers, championing research and innovation, and leading scaled agile ceremonies and activities. Collaborating with the Platform Owner, you will prioritize technical capabilities and present platform delivery metrics to executive audiences. Working closely with other Technical Leads, you will create and maintain the technical roadmap for products and services within the platform. The preferred candidate profile for this role includes a total work experience of 12-19 years, a B.E./B.Tech or equivalent engineering background, and a master's degree or equivalent experience in Marketing, Business, or Finance. You should have 15+ years of experience in technical architecture, solution design, and platform engineering. Strong expertise in MDM, Data Quality, and Data Governance practices, including tools like Informatica MDM SaaS, Informatica Data Quality, and Collibra, will be advantageous. Additionally, experience with major cloud platforms and data tools such as AWS, Microsoft Azure, Kafka, CDC, Tableau, and Data virtualization tools is preferred. A solid background in ETL and BI solution development, familiarity with agile methodologies, and knowledge of industry standards and regulations related to data management (e.g., HIPAA, PCI-DSS, GDPR) are desired qualifications for this role. Your technical acumen in areas like Data Architecture, SQL, NoSQL, REST API, data security, and AI concepts will be essential for success in this position.,
Posted 1 week ago
0.0 - 3.0 years
0 Lacs
chennai, tamil nadu
On-site
As a member of the US Omni tech team at Walmart Global Tech, you will be playing a crucial role in enhancing the quality of Catalog data in the fast-growing E-Commerce sector. Your responsibilities will include analyzing data to identify gaps, recommending solutions, and collaborating with cross-functional teams to drive operational decisions. Effective communication with stakeholders, building SOPs, template management, and ensuring adherence to quality processes will be key components of your role. You will proactively address item-related issues reported by Merchants and Suppliers, independently handle complex problems, and work towards eliminating process redundancies. Your proficiency in Microsoft Office applications, strong analytical skills, and ability to bring operational efficiencies by following best practices will be essential for success in this role. The ideal candidate will hold a bachelor's degree with 0-3 years of experience in the Retail/E-Commerce industry, possess excellent English communication skills, and be adept at email etiquette. Flexibility to work in multiple shifts, along with technical skills such as system administration concepts, familiarity with ticketing systems, and basic scripting knowledge will be advantageous. Experience with cloud platforms and data querying tools will also be beneficial. At Walmart Global Tech, you will be part of a dynamic team that leverages technology to make a significant impact on the retail industry. You will have the opportunity to grow your skills, collaborate with experts, and drive innovation at scale. The hybrid work model at Walmart allows for a mix of in-office and virtual presence, providing flexibility and enabling quick decision-making. Apart from a competitive compensation package, you will have access to incentive awards, best-in-class benefits, and a supportive work culture that values diversity and inclusion. By fostering a workplace where everyone feels included, Walmart aims to create opportunities for associates, customers, and suppliers globally. Join us at Walmart Global Tech to be a part of a team that is shaping the future of retail and making a positive impact on millions of lives.,
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
Arivon is seeking an Artificial Intelligence (AI) Software Engineer with expertise in generative AI, Natural Language Processing (NLP), and LLM. The ideal candidate should have a minimum of 5 years of experience in NLP and Generative AI and LLM research. The role involves tasks such as language model evaluation, data processing for pre-training and fine-tuning, LLM alignment, reinforcement learning for language model tuning, and efficient training and inference. Additionally, the candidate should have experience in multilingual and multimodal modeling. The AI Software Engineer should have hands-on experience with Machine Learning (ML), NLP, and Deep Learning (DL) frameworks like PyTorch, Tensorflow, and SFT on LLM. Proficiency in leveraging LangChain for the development and integration of LLMs into various applications is essential. The candidate should demonstrate a strong ability to enhance AI functionalities through innovative applications of natural language processing and understanding. Experience with model quantization or computational optimizations is considered a plus. Familiarity with cloud platforms such as AWS, GCP, and Azure is advantageous, and prior experience in the Healthcare domain is a bonus. The role also involves designing methods, tools, and infrastructure to advance the state of the art in large language models. Responsibilities include deploying, monitoring, and managing ML models in production to ensure high availability and low latency. The candidate will be expected to manage and optimize ML infrastructure, including cloud services, containers, and orchestration tools. A Master's degree in computer science, computer engineering, or a related technical field is required, while a Ph.D. in AI, computer science, data science, or related fields is preferred. Proficiency in programming languages like Python is necessary for this role.,
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Backend Developer at our company, you will be responsible for architecting, designing, building, and maintaining high-performance backend systems in modern Java. These systems need to be scalable and capable of handling millions of concurrent users in our multitenant SaaS environment. You will play a key role in API design and implementation, developing clean and well-documented RESTful APIs, as well as high-efficiency gRPC services to expose backend functionality. Your expertise in distributed systems will be crucial as you architect and build microservices and event-driven systems using frameworks like Spring Boot and Quarkus. You will work with both SQL (PostgreSQL, MySQL) and NoSQL (Cassandra, Redis) databases to ensure efficient data storage and retrieval. Additionally, you will deploy and manage services on cloud platforms (AWS, GCP, Azure) using Kubernetes, Docker, and serverless technologies as part of our cloud-native development approach. Identifying bottlenecks, optimizing queries, and implementing caching strategies for low-latency responses will be part of your responsibilities for performance optimization. You will also be involved in implementing security and compliance measures, including authentication/authorization, encryption, and data protection best practices. Collaboration with frontend, DevOps, and product teams is essential to define API contracts, troubleshoot issues, and deliver end-to-end features. Furthermore, you will set up logging, metrics, and alerts to ensure system health and rapid incident response, and drive code reviews while advocating for engineering best practices. Your qualifications should include 8+ years of development experience in Java, expertise in RESTful API design, gRPC implementation, and hands-on experience with microservices, message brokers, and API gateways. Strong database skills, familiarity with cloud platforms and containerization, knowledge of DevOps practices, CI/CD pipelines, and IaC tools are also required. Excellent problem-solving skills and a passion for writing clean, maintainable code will be key to success in this role. This is a full-time, in-person position. Application Question(s): How many years of experience do you have with gRPC?
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
pune, maharashtra
On-site
We are looking for an exceptional leader to oversee our service ecosystem with vision and precision as the Head of IT Service Excellence and Orchestration for Global Cloud and Hosting services. In this role, you will be responsible for managing a comprehensive portfolio of global managed services, ranging from traditional data centers to cutting-edge cloud platforms and emerging edge computing solutions. Your focus will be on balancing technical excellence with business acumen to ensure that our services deliver maximum value while upholding the highest standards of reliability, security, and performance. This position is at the intersection of technology strategy and operational excellence, requiring the ability to navigate complex multi-vendor relationships, drive innovation in hybrid environments, and inspire teams to deliver exceptional service experiences. You will collaborate with senior stakeholders across Siemens to understand business requirements and translate them into effective IT service strategies that drive our organization's success. **What we offer you:** - An attractive remuneration package - Access to Siemens share plans - 30 days of paid vacation and flexible work schedules - 2 to 3 days of mobile working per week as a global standard - Flexible training opportunities for professional and personal development - A work environment that values your individuality and fosters a sense of belonging at Siemens In this role, you will: - Develop and execute multi-sourcing strategies across hybrid environments - Integrate cutting-edge hyperscaler solutions - Elevate operational excellence across our service ecosystem - Lead and evolve a global multi-team department focused on service excellence - Ensure seamless integration of in-house and partner-delivered services - Drive continuous improvement in service quality, operational efficiency, and customer experience - Partner with senior leadership to align technical execution with strategic business goals - Build high-performing teams with a culture of ownership, innovation, and excellence **Qualifications:** - Masters degree or equivalent experience in Computer Science, Information Technology, or related field - Proven leadership experience managing diverse teams in complex IT service environments - 10 years of leadership experience in high-tech or engineering companies - Deep expertise in hybrid IT service management, including data centers, cloud, and edge computing - Strategic mindset with the ability to align technology roadmaps with business objectives - Successful track record implementing multi-cloud strategies and hyperscaler integrations - Experience managing in-house and partner service delivery models - Strong understanding of operational excellence frameworks and continuous service improvement - Exceptional stakeholder management and communication skills - Experience with agile methodologies and fostering innovation-driven cultures - Resilient, hands-on leadership approach in dynamic environments - Fluent English communication skills; additional languages beneficial We believe in the potential of every candidate and welcome applications from individuals with disabilities. At Siemens, we value diversity and support your personal and professional development. Join us in shaping the future of Siemens" Digital Infrastructure and creating a better future together. To learn more about job opportunities at Siemens, visit www.siemens.de/careers. 
If you have any questions about the application process, please refer to our FAQ section.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
You will be joining the quality engineering team at PTC as a Frontend Automation QA Engineer. Your role will involve automating web applications, ensuring high-quality software delivery in an Agile environment. You should have at least 3 years of experience in automation testing for web applications and a strong understanding of Object-Oriented Programming languages, preferably TypeScript or JavaScript. Experience with Agile methodologies like Scrum or Kanban is essential, along with the ability to independently create effective test cases. Hands-on experience with Playwright for frontend automation is required, and you should be a quick learner who can adapt to new technologies. Good communication skills, both verbal and written, are important, as well as the ability to collaborate effectively within a team. Knowledge of Cucumber BDD for behavior-driven testing is a plus, as is experience with CI/CD pipelines, preferably using GitLab. Basic understanding of API testing, tools like Postman or REST clients, web accessibility standards, and cloud platforms such as Azure or AWS will be beneficial. At PTC, you will be part of a global team focused on creating opportunities for personal and professional growth while celebrating diversity and innovation. If you are passionate about problem-solving through innovation and ready to embark on a rewarding career journey, we invite you to explore the possibilities at PTC. PTC values the privacy rights of individuals and ensures the responsible handling of Personal Information in compliance with all relevant privacy and data protection laws. For more information, you can review our Privacy Policy on our website.,
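Playwright also ships first-class Python bindings, so a smoke test of the kind described would look roughly like the sketch below; the URL and selectors are invented placeholders, and real suites at PTC would presumably be written against their own applications (and often in TypeScript, as the posting prefers).

```python
from playwright.sync_api import sync_playwright

def test_login_smoke():
    """Minimal smoke test: load the page, log in, assert the dashboard appears."""
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        page.goto("https://example.com/login")      # placeholder URL
        page.fill("#username", "qa-user")           # placeholder selectors
        page.fill("#password", "not-a-real-password")
        page.click("button[type=submit]")
        page.wait_for_selector("text=Dashboard")    # waits for the post-login view
        assert "Dashboard" in page.content()
        browser.close()
```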
Posted 1 week ago
3.0 - 8.0 years
5 - 9 Lacs
Nagar, Chennai
Work from Office
What you will do: As a Data Engineer at ACV Auctions, you HAVE FUN !! You will design, develop, write, and modify code. You will be responsible for development of ETLs, application architecture, optimizing databases & SQL queries. You will work alongside other data engineers and data scientists in the design and development of solutions to ACV’s most complex software problems. It is expected that you will be able to operate in a high performing team, that you can balance high quality delivery with customer focus, and that you will have a record of delivering and guiding team members in a fast-paced environment. Design, develop, and maintain scalable ETL pipelines using Python and SQL to ingest, process, and transform data from diverse sources. Write clean, efficient, and well-documented code in Python and SQL. Utilize Git for version control and collaborate effectively with other engineers. Implement and manage data orchestration workflows using industry-standard orchestration tools (e.g., Apache Airflow, Prefect).. Apply a strong understanding of major data structures (arrays, dictionaries, strings, trees, nodes, graphs, linked lists) to optimize data processing and storage. Support multi-cloud application development. Contribute, influence, and set standards for all technical aspects of a product or service including but not limited to, testing, debugging, performance, and languages. Support development stages for application development and data science teams, emphasizing in MySQL and Postgres database development. Influence companywide engineering standards for tooling, languages, and build systems. Leverage monitoring tools to ensure high performance and availability; work with operations and engineering to improve as required. Ensure that data development meets company standards for readability, reliability, and performance. Collaborate with internal teams on transactional and analytical schema design. Conduct code reviews, develop high-quality documentation, and build robust test suites Respond-to and troubleshoot highly complex problems quickly, efficiently, and effectively. Participate in engineering innovations including discovery of new technologies, implementation strategies, and architectural improvements. Participate in on-call rotation What you will need: Bachelor’s degree in computer science, Information Technology, or a related field (or equivalent work experience) Ability to read, write, speak, and understand English. 3+ years of experience programming in Python 3+ years of experience with ETL workflow implementation (Airflow, Python) 3+ years work with continuous integration and build tools. 2+ year of experience with Cloud platforms preferably in AWS or GCP Knowledge of database architecture, infrastructure, performance tuning, and optimization techniques. Knowledge in day-day tools and how they work including deployments, k8s, monitoring systems, and testing tools. Proficient in version control systems including trunk-based development, multiple release planning, cherry picking, and rebase. Proficient in databases (RDB), SQL, and can contribute to table definitions. Self-sufficient debugger who can identify and solve complex problems in code. Deep understanding of major data structures (arrays, dictionaries, strings). Experience with Domain Driven Design. Experience with containers and Kubernetes. Experience with database monitoring and diagnostic tools, preferably Data Dog. 
Hands-on skills and the ability to drill deep into the complex system design and implementation. Proficiency in SQL query writing and optimization. Familiarity with database security principles and best practices. Familiarity with in-memory data processing Knowledge of data warehousing concepts and technologies, including dimensional modeling and ETL frameworks Strong communication and collaboration skills, with the ability to work effectively in a fast paced global team environment. Experience working with: SQL data-layer development experience; OLTP schema design Using and integrating with cloud services, specifically: AWS RDS, Aurora, S3, GCP Github, Jenkins, Python Nice to Have Qualifications: Experience with Airflow, Docker, Visual Studio, Pycharm, Redis, Kubernetes, Fivetran, Spark, Dataflow, Dataproc, EMR Experience with database monitoring and diagnostic tools, preferably DataDog. Hands-on experience with Kafka or other event streaming technologies. Hands-on experience with micro-service architecture ACV Auctions in Chennai, India are looking for talented individuals to join our team As we expand our platform, we're offering a wide range of exciting opportunities across various roles in corpo
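As a rough illustration of the ETL-plus-orchestration work described above, here is a hedged sketch of a small Airflow DAG (Airflow 2.x style); the file paths, schedule, DAG name, and toy data are assumptions, not ACV specifics.

```python
from datetime import datetime
import pandas as pd
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**_):
    # Placeholder: in practice this would query MySQL/Postgres or an API.
    pd.DataFrame({"vin": ["1ABC"], "price": [12500]}).to_csv("/tmp/raw.csv", index=False)

def transform(**_):
    df = pd.read_csv("/tmp/raw.csv")
    df["price_usd_k"] = df["price"] / 1000  # simple derived column
    df.to_csv("/tmp/clean.csv", index=False)

def load(**_):
    # Placeholder for a warehouse COPY/INSERT step.
    print(pd.read_csv("/tmp/clean.csv").to_string(index=False))

with DAG(
    dag_id="daily_auction_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    (
        PythonOperator(task_id="extract", python_callable=extract)
        >> PythonOperator(task_id="transform", python_callable=transform)
        >> PythonOperator(task_id="load", python_callable=load)
    )
```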
Posted 1 week ago
4.0 - 9.0 years
0 - 0 Lacs
Hyderabad, Chennai
Hybrid
Job Description: Design, develop, and maintain data pipelines and ETL processes using AWS and Snowflake. Implement data transformation workflows using DBT (Data Build Tool). Write efficient, reusable, and reliable code in Python. Optimize and tune data solutions for performance and scalability. Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions. Ensure data quality and integrity through rigorous testing and validation. Stay updated with the latest industry trends and technologies in data engineering. Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Proven experience as a Data Engineer or similar role. Strong proficiency in AWS and Snowflake. Expertise in DBT and Python programming. Experience with data modeling, ETL processes, and data warehousing. Familiarity with cloud platforms and services. Excellent problem-solving skills and attention to detail. Strong communication and teamwork abilities.
Posted 1 week ago
5.0 - 10.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Capco, a Wipro company, is a global technology and management consulting firm. Awarded with Consultancy of the year in the British Bank Award and has been ranked Top 100 Best Companies for Women in India 2022 by Avtar & Seramount . With our presence across 32 cities across globe, we support 100+ clients acrossbanking, financial and Energy sectors. We are recognized for our deep transformation execution and delivery. WHY JOIN CAPCO You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers and other key players in the industry. The projects that will transform the financial services industry. MAKE AN IMPACT Innovative thinking, delivery excellence and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services. #BEYOURSELFATWORK Capco has a tolerant, open culture that values diversity, inclusivity, and creativity. CAREER ADVANCEMENT With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands. DIVERSITY & INCLUSION We believe that diversity of people and perspective gives us a competitive advantage. MAKE AN IMPACT Work location Pune JD as below 5+ Years Only 30 days : We are looking for a highly skilled Java Developer with expertise in Spring Boot, Confluent Kafka, and distributed systems . The ideal candidate should have strong experience in designing, developing, and optimizing event-driven applications using Confluent Kafka while leveraging Spring Boot/Spring Cloud for microservices-based architectures. Key Responsibilities: Develop, deploy, and maintain scalable and high-performance applications using Java (Core Java, Collections, Multithreading, Executor Services, CompletableFuture, etc.) Work extensively with Confluent Kafka , including producer-consumer frameworks, offset management, and optimization of consumer instances based on message volume. Ensure efficient message serialization and deserialization using JSON, Avro, and Protobuf with Kafka Schema Registry . Design and implement event-driven architectures with real-time processing capabilities. Optimize Kafka consumers for high-throughput and low-latency scenarios. Collaborate with cross-functional teams to ensure seamless integration and deployment of services. Troubleshoot and resolve performance bottlenecks and scalability issues in distributed environments. Familiarity with containerization (Docker, Kubernetes) and cloud platforms is a plus. Experience with monitoring and logging tool- Splunk is a plus.
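In the role itself the consumer-side concerns highlighted here (deserialization, manual offset commits) would be Spring Kafka in Java; purely for illustration, the same pattern in Python's kafka-python client looks roughly like this, with topic and broker names as placeholders.

```python
import json
from kafka import KafkaConsumer  # kafka-python client

consumer = KafkaConsumer(
    "payments.events",                       # placeholder topic
    bootstrap_servers=["localhost:9092"],    # placeholder broker
    group_id="payments-processor",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    enable_auto_commit=False,                # commit offsets only after processing
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    # ... business logic here ...
    print(f"partition={message.partition} offset={message.offset} event={event}")
    consumer.commit()  # manual offset management, as the posting emphasizes
```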
Posted 1 week ago
3.0 - 7.0 years
12 - 16 Lacs
Bengaluru
Work from Office
Develop, test and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance and information management and associated technologies. Communicates risks and ensures understanding of these risks. Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Graduate with a minimum of 5+ years of related experience required. Experience in modelling and business system designs. Good hands-on experience on DataStage, Cloud-based ETL Services. Have great expertise in writing TSQL code. Well-versed with data warehouse schemas and OLAP techniques. Preferred technical and professional experience Ability to manage and make decisions about competing priorities and resources. Ability to delegate where appropriate. Must be a strong team player/leader. Ability to lead Data transformation projects with multiple junior data engineers. Strong oral written and interpersonal skills for interacting throughout all levels of the organization. Ability to communicate complex business problems and technical solutions.
Posted 1 week ago
2.0 - 5.0 years
9 - 13 Lacs
Gurugram
Work from Office
Develop, test and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance and information management and associated technologies. Communicates risks and ensures understanding of these risks. Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Graduate with a minimum of 5+ years of related experience required. Experience in modelling and business system designs. Good hands-on experience on DataStage, Cloud-based ETL Services. Have great expertise in writing TSQL code. Well-versed with data warehouse schemas and OLAP techniques. Preferred technical and professional experience Ability to manage and make decisions about competing priorities and resources. Ability to delegate where appropriate. Must be a strong team player/leader. Ability to lead Data transformation projects with multiple junior data engineers. Strong oral written and interpersonal skills for interacting throughout all levels of the organization. Ability to communicate complex business problems and technical solutions.
Posted 1 week ago
2.0 - 5.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Develop, test and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance and information management and associated technologies. Communicates risks and ensures understanding of these risks. Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Graduate with a minimum of 5+ years of related experience required. Experience in modelling and business system designs. Good hands-on experience on DataStage, Cloud-based ETL Services. Have great expertise in writing TSQL code. Well-versed with data warehouse schemas and OLAP techniques. Preferred technical and professional experience Ability to manage and make decisions about competing priorities and resources. Ability to delegate where appropriate. Must be a strong team player/leader. Ability to lead Data transformation projects with multiple junior data engineers. Strong oral written and interpersonal skills for interacting throughout all levels of the organization. Ability to communicate complex business problems and technical solutions.
Posted 1 week ago
3.0 - 8.0 years
3 - 7 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
We are hiring a Delphix Engineer with 3-12 years of experience for a 12-month, full-time onsite role across Bengaluru, Chennai, Hyderabad, Pune, and Vadodara. The candidate must have strong hands-on experience with Test Data Management (TDM), particularly Delphix, along with data de-identification, masking, and synthetic data generation. The ideal engineer will work closely with consumers to enable fast, secure test data provisioning. Exposure to Python or .NET is a plus, as is knowledge of CI/CD pipelines and cloud-hosted platforms. The candidate must be a proactive contributor and effective collaborator, comfortable in dynamic environments. Location: Bengaluru, Chennai, Hyderabad, Pune, Vadodara.
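Delphix handles masking through its own engine, but the underlying de-identification idea can be sketched in a few lines of pandas; the salt, columns, and data below are illustrative assumptions only.

```python
import hashlib
import pandas as pd

SALT = "rotate-me"  # assumption: a per-environment secret, kept apart from the data

def pseudonymize(value: str) -> str:
    """Deterministic, irreversible token so joins still work across masked tables."""
    return hashlib.sha256((SALT + value).encode()).hexdigest()[:12]

customers = pd.DataFrame({
    "customer_id": ["C001", "C002"],
    "email": ["a@example.com", "b@example.com"],
    "balance": [1520.50, 89.99],
})

masked = customers.assign(
    customer_id=customers["customer_id"].map(pseudonymize),
    email="masked@example.invalid",   # direct identifiers replaced outright
)  # non-sensitive measures like `balance` stay usable for testing
print(masked)
```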
Posted 1 week ago
2.0 - 5.0 years
14 - 17 Lacs
Hyderabad
Work from Office
As a Data Engineer at IBM, you'll play a vital role in the development and design of applications, providing regular support and guidance to project teams on complex coding, issue resolution, and execution. Your primary responsibilities include: leading the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements; striving for continuous improvement by testing the built solution and working under an agile framework; and discovering and implementing the latest technology trends to build creative solutions. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Experience with Apache Spark (PySpark): in-depth knowledge of Spark's architecture, core APIs, and PySpark for distributed data processing. Big Data Technologies: familiarity with Hadoop, HDFS, Kafka, and other big data tools. Data Engineering Skills: strong understanding of ETL pipelines, data modeling, and data warehousing concepts. Strong proficiency in Python: expertise in Python programming with a focus on data processing and manipulation. Data Processing Frameworks: knowledge of data processing libraries such as Pandas and NumPy. SQL Proficiency: experience writing optimized SQL queries for large-scale data analysis and transformation. Cloud Platforms: experience working with cloud platforms like AWS, Azure, or GCP, including cloud storage systems. Preferred technical and professional experience: Define, drive, and implement an architecture strategy and standards for end-to-end monitoring. Partner with the rest of the technology teams, including application development, enterprise architecture, testing services, and network engineering. Good to have: detection and prevention tools for company products, platform, and customer-facing systems.
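For a concrete picture of the PySpark work described, a minimal batch aggregation job might look like the sketch below; the input path, schema, and output location are placeholders, not IBM specifics.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Placeholder path and schema; real jobs would read from HDFS/S3 or Kafka.
orders = spark.read.option("header", True).csv("/tmp/orders.csv")

daily_revenue = (
    orders
    .withColumn("amount", F.col("amount").cast("double"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("orders"))
)

daily_revenue.write.mode("overwrite").parquet("/tmp/daily_revenue")
```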
Posted 1 week ago
3.0 - 6.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Advanced Programming Skills in Python, Scala, Go: strong expertise in developing and maintaining microservices in Go (or other similar languages), with the ability to lead and mentor others in this area. Extensive exposure to developing Big Data applications, Data Engineering, ETL, and Data Analytics. Cloud Expertise: in-depth knowledge of IBM Cloud or similar cloud platforms, with a proven track record of deploying and managing cloud-native applications. Leadership and Collaboration: ability to lead cross-functional teams, work closely with product owners, and drive platform enhancements while mentoring junior team members. Security and Compliance: strong understanding of security best practices and compliance standards, with experience ensuring that platforms meet or exceed these requirements. Analytical and Problem-Solving Skills: excellent problem-solving abilities with a proven track record of resolving complex issues in a multi-tenant environment. Required education: Bachelor's Degree. Required technical and professional expertise: 4-7 years' experience, primarily using Apache Spark, Kafka, and SQL, preferably in Data Engineering projects with a strong TDD approach. Advanced programming skills in languages like Python, Java, and Scala, with proficiency in SQL. Extensive exposure to developing Big Data applications, Data Engineering, ETL tools, and Data Analytics. Exposure to Data Modelling, Data Quality, and Data Governance. Extensive exposure to creating and maintaining data pipelines: workflows to move data from various sources into data warehouses or data lakes. Cloud Expertise: in-depth knowledge of IBM Cloud or similar cloud platforms, with a proven track record of developing, deploying, and managing cloud-native applications. Good to have: Front-End Development experience with React, Carbon, and Node for managing and improving user-facing portals. Leadership and Collaboration: ability to lead cross-functional teams, work closely with product owners, and drive platform enhancements while mentoring junior team members. Security and Compliance: strong understanding of security best practices and compliance standards, with experience ensuring that platforms meet or exceed these requirements. Analytical and Problem-Solving Skills: excellent problem-solving abilities with a proven track record of resolving complex issues in a multi-tenant environment. Preferred technical and professional experience: Hands-on experience with data analysis and querying using SQL, and considerable exposure to ETL processes. Expertise in developing cloud applications with high-volume data processing. Experience building scalable microservices components using various API development frameworks.
Posted 1 week ago
10.0 - 15.0 years
12 - 16 Lacs
Jaipur
Remote
Job Summary Auriga is seeking a dynamic and experienced Staff Software Engineer / Technology Lead to lead and manage multiple cutting-edge digital projects. This role demands expertise in modern front-end technologies, a strong grasp of design system architecture, and a passion for creating exceptional user experiences for our enterprise clients. If you thrive on working with innovative digital products and enjoy collaborating with a team that values continuous learning and delivers high-quality results, then this is the perfect opportunity for you! Key Skills Strong proficiency in English (written and verbal communication) is required. Experience driving delivery excellence, leading and mentoring a team of engineers, QA automation engineers and data analysts. Experience working with remote teams in North America and Latin America (LATAM), ensuring smooth collaboration across time zones. Deep expertise in TypeScript, with extensive experience in modern Next.js (10+) and React (10+). A strong understanding of a11y and WCAG principles. Strong experience with modern CSS methodologies, specifically Tailwind CSS. Experience with modular front-end architectures, component-driven development, and design systems. Solid understanding of API consumption patterns, including REST, GraphQL and WebSockets. Experience with performance optimization techniques, including code-splitting, lazy loading, image optimization, and CDN strategies. Familiarity with headless CMS platforms, specifically Contentful. Experience with cloud platforms such as Google Cloud Platform (preferred) or similar. Understanding of containerization technologies for development environments. Understanding of Google Cloud Run as a web application run time environment. Experience with CI/CD pipelines for front-end deployments (GitHub Actions preferred). Knowledge of front-end security best practices, including CSP, OWASP Top 10, and secure authentication/authorization mechanisms (OAuth, JWT).Ability to communicate effectively with technical and non-technical stakeholders. You should feel comfortable explaining technical concepts in simple terms. Experience working in fast-paced, Agile environments, balancing priorities across multiple projects. Key Responsibilities Lead the architecture and development of scalable, performant, and maintainable front-end applications. Drive the adoption of modern front-end frameworks and technologies, ensuring best practices in React.js and Next.js. Optimize applications for Core Web Vitals (LCP, CLS, FID) to ensure high performance and superior user experience. Collaborate with designers and UX teams to implement seamless, accessible, and visually appealing interfaces. Define and maintain scalable component architecture using Storybook, and Tailwind CSS, or similar libraries. Understand and implement client-side state management solutions, React Query in particular. Work closely with backend teams to optimize REST API integrations, ensuring efficient data fetching and caching strategies. Lead and manage engineers and QA Automation engineers, providing mentorship, technical guidance, and career development support. Lead front-end testing initiatives, including unit, integration, and end-to-end testing (Playwright preferred). Integrate third-party headless CMS (Contentful) and personalization engines (Algolia, Cloudinary, Talon.One and Segment). Partner with Tech Directors and cross-functional teams to ensure front-end scalability, security, and maintainability. 
Stay updated with cutting-edge front-end technologies, continuously improving our development workflows and tools. About Company Who Has not dreamt of Working with Friends for a Lifetime. Come Join our team and be part of a dynamic and innovative organization that is dedicated to driving success for our clients through cutting-edge ERP solutions. Apply now to take the next step in your career journey with us!
Posted 1 week ago
12.0 - 17.0 years
45 - 50 Lacs
Hyderabad, Gurugram, Bengaluru
Work from Office
We are seeking a highly skilled and visionary Agentic AI Architect to lead the strategic design, development, and scalable implementation of autonomous AI systems within our organization. This role demands an individual with deep expertise in cutting-edge AI architectures, a strong commitment to ethical AI practices, and a proven ability to drive innovation. The ideal candidate will architect intelligent, self-directed decision-making systems that integrate seamlessly with enterprise workflows and propel our operational efficiency forward. Key Responsibilities As an Agentic AI Architect, you will: AI Architecture and System Design: Architect and design robust, scalable, and autonomous AI systems that seamlessly integrate with enterprise workflows, cloud platforms, and advanced LLM frameworks. Define blueprints for APIs, agents, and pipelines to enable dynamic, context-aware AI decision-making. Strategic AI Leadership: Provide technical leadership and strategic direction for AI initiatives focused on agentic systems. Guide cross-functional teams of AI engineers, data scientists, and developers in the adoption and implementation of advanced AI architectures. Framework and Platform Expertise: Evaluate, recommend, and implement leading AI tools and frameworks, with a strong focus on autonomous AI solutions (e.g., multi-agent frameworks, self-optimizing systems, LLM-driven decision engines). Drive the selection and utilization of cloud platforms (AWS SageMaker preferred, Azure ML, Google Cloud Vertex AI) for scalable AI deployments. Customization and Optimization: Design strategies for optimizing autonomous AI models for domain-specific tasks (e.g., real-time analytics, adaptive automation). Define methodologies for fine-tuning LLMs, multi-agent frameworks, and feedback loops to align with overarching business goals and architectural principles. Innovation and Research Integration: Spearhead the integration of R&D initiatives into production architectures, advancing agentic AI capabilities. Evaluate and prototype emerging frameworks (e.g., Autogen, AutoGPT, LangChain), neuro-symbolic architectures, and self-improving AI systems for architectural viability. Documentation and Architectural Blueprinting: Develop comprehensive technical white papers, architectural diagrams, and best practices for autonomous AI system design and deployment. Serve as a thought leader, sharing architectural insights at conferences and contributing to open-source AI communities. System Validation and Resilience: Design and oversee rigorous architectural testing of AI agents, including stress testing, adversarial scenario simulations, and bias mitigation strategies, ensuring alignment with compliance, ethical and performance benchmarks for robust production systems. Stakeholder Collaboration & Advocacy: Collaborate with executives, product teams, and compliance officers to align AI architectural initiatives with strategic objectives. Advocate for AI-driven innovation and architectural best practices across the organization. Qualifications: Technical Expertise: 12+ years of progressive experience in AI/ML, with a strong track record as an AI Architect , ML Architect, or AI Solutions Lead. 7+ years specifically focused on designing and architecting autonomous/agentic AI systems (e.g., multi-agent frameworks, self-optimizing systems, or LLM-driven decision engines). Expertise in Python (mandatory) and familiarity with Node.js for architectural integrations. 
Extensive hands-on experience with autonomous AI tools and frameworks : LangChain, Autogen, CrewAI, or architecting custom agentic frameworks. Proficiency in cloud platforms for AI architecture : AWS SageMaker (most preferred), Azure ML, or Google Cloud Vertex AI, with a deep understanding of their AI service offerings. Demonstrable experience with MLOps pipelines (e.g., Kubeflow, MLflow) and designing scalable deployment strategies for AI agents in production environments. Leadership & Strategic Acumen: Proven track record of leading the architectural direction of AI/ML teams, managing complex AI projects, and mentoring senior technical staff. Strong understanding and practical application of AI governance frameworks (e.g., EU AI Act, NIST AI RMF) and advanced bias mitigation techniques within AI architectures. Exceptional ability to translate complex technical AI concepts into clear, concise architectural plans and strategies for non-technical stakeholders and executive leadership. Ability to envision and articulate a long-term strategy for AI within the business, aligning AI initiatives with business objectives and market trends. Foster collaboration across various practices, including product management, engineering, and marketing, to ensure cohesive implementation of AI strategies that meet business goals.
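Stripped of any particular framework, the "LLM-driven decision engine" this role architects reduces to an observe-decide-act loop. The sketch below shows that loop with a stubbed call_llm function and a toy tool; both are hypothetical stand-ins, not real APIs from Autogen, CrewAI, or LangChain.

```python
import json

def call_llm(prompt: str) -> str:
    """Stub for the model endpoint (OpenAI, Bedrock, etc.); returns a JSON action."""
    return json.dumps({"tool": "lookup_inventory", "args": {"sku": "A-42"}, "done": True})

TOOLS = {
    "lookup_inventory": lambda sku: {"sku": sku, "on_hand": 17},  # toy tool
}

def run_agent(goal: str, max_steps: int = 5) -> list:
    """Minimal observe-decide-act loop behind most agentic frameworks."""
    history = []
    for _ in range(max_steps):
        decision = json.loads(call_llm(f"Goal: {goal}\nHistory: {history}"))
        result = TOOLS[decision["tool"]](**decision["args"])
        history.append({"tool": decision["tool"], "result": result})
        if decision.get("done"):
            break
    return history

print(run_agent("Check stock for SKU A-42"))
```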
Posted 1 week ago
6.0 - 11.0 years
10 - 15 Lacs
Bengaluru
Hybrid
As a Salesforce Senior Developer at Clarivate, you will be instrumental in the design, development, and implementation of intricate Salesforce solutions aligned with our clients' business objectives. Collaborating closely with our Development Manager and his team, you'll utilize agile methodologies to deliver robust solutions. Whether as a team player or individual contributor, you'll work towards achieving business objectives set by the Enterprise Services team, ensuring the delivery of Salesforce projects with advanced technical expertise and adherence to best practices. About You (experience, education, skills, and accomplishments): Bachelor's or Master's degree in Computer Science, Information Technology, or a related field, or equivalent experience. Salesforce certification required: Certified Conga CPQ or Certified Salesforce CPQ is a must for this position. At least 6 years of Salesforce development experience in Sales and Service cloud platforms. At least 3 years of design and development experience in Conga CPQ or Salesforce CPQ, Communities (Experience Cloud), and related products. Proven experience in M&A transformation projects in Sales, CPQ, Service, and Experience Cloud. Proficiency in Apex, Visualforce, Lightning components, Lightning Web Components, and other Salesforce development techniques. Experience with Agile and Scrum methodologies. Proficiency in solution documentation and design processes to maintain complex Salesforce instances. It would be great if you also had: the ability to work independently with minimal supervision and within a team structure supporting multiple Salesforce ecosystems. What will you be doing in this role? Collaborate with product specialists and the Development Manager to understand complex requirements and translate them into Salesforce solutions. Design and develop Salesforce Sales solutions, emphasizing Conga CPQ, Salesforce CPQ, or both, plus Sales, Communities, and related technologies. Define development timelines and technical vision in close coordination with the Development Manager. Work closely with the team of developers under the Development Manager's guidance to deliver solutions aligned with project objectives. Ensure scalability, maintainability, and adherence to best practices in Salesforce design and development. Conduct code reviews and quality assurance to maintain high coding standards and deliver high-quality solutions. Demonstrate expertise in the Salesforce platform and products, including Sales Cloud, Service Cloud, Experience Cloud (Community), Force.com, Apex Code, Aura Components, and Lightning Web Components. Stay updated with Salesforce vendor releases and emerging technologies to enhance client solutions. Troubleshoot and resolve technical issues, optimizing system performance and functionality. Provide expertise on Salesforce.com Apex, APIs, profiles, the security model, Chat, Service Console, and Experience Cloud (Community). Take ownership, work under pressure, and manage multiple projects simultaneously. Interact with team members to deliver fast and reliable code, contribute ideas, provide feedback, and collaborate on various projects. About the Team: We are a Salesforce Business Technology, Application Development team that supports various mission-critical business processes. Hours of Work: 45 hours/week, 2-11 PM IST.
Posted 1 week ago
3.0 - 8.0 years
10 - 18 Lacs
Mumbai
Work from Office
We are looking for an experienced DevOps Engineer to join our infrastructure and platform team. You will play a key role in designing, implementing, and maintaining our CI/CD pipelines, automating infrastructure, ensuring system reliability, and improving overall developer productivity. The ideal candidate is well-versed in on-prem and cloud platforms, infrastructure as code, and modern DevOps practices. Role & responsibilities: Design, build, and maintain CI/CD pipelines using tools like Jenkins and GitLab CI. Automate infrastructure provisioning and configuration using Terraform, Ansible, or CloudFormation. Manage and monitor production and staging environments across on-prem and cloud platforms (AWS). Implement containerization and orchestration using Docker and Kubernetes. Ensure system availability, scalability, and performance via monitoring, logging, and alerting tools (e.g., Prometheus, Grafana, ELK, Datadog). Maintain and improve infrastructure security, compliance, and cost optimization. Collaborate with development, QA, and security teams to streamline code deployment and feedback loops. Participate in on-call rotations and troubleshoot production incidents. Write clear and maintainable documentation for infrastructure, deployments, and processes. Preferred candidate profile: 3-15 years of experience in DevOps, SRE, or infrastructure engineering. Proficiency in scripting languages like Bash, Python, or Go. Strong hands-on experience with cloud platforms (preferably AWS). Deep understanding of the Docker and Kubernetes ecosystem. Experience with infrastructure automation tools such as Ansible, Terraform, or Chef. Familiarity with source control (Git), branching strategies, and code review practices. Solid experience with Linux administration, system performance tuning, and troubleshooting. Knowledge of networking concepts, load balancers, VPNs, DNS, and firewalls. Experience with monitoring/logging tools like Prometheus, Grafana, ELK, Splunk, Datadog, or Nagios, and log shippers like Filebeat, Fluentd, and Fluent Bit. Familiarity with security tools like Vault, AWS IAM, or cloud workload protection. Experience in high-availability, multi-region architecture design. Strong understanding of creating RPM packages and Yum repos. Strong understanding of JMeter scripting and test case writing. Strong understanding of artifact repository managers (JFrog, Nexus, Maven, NPM, NVM) and installation of open-source/enterprise tools from source files or RPM packages. Strong understanding of the tech stack (Redis, MySQL, Nginx, RabbitMQ, Tomcat, Apache, JBoss). Ability to implement cloud-native solutions including load balancers, VPCs, IAM, Auto Scaling groups, CDNs, S3, Route 53, etc. Experience with SAST tools like SonarQube, Checkmarx, and JFrog Xray. Expertise in configuring and upgrading API gateways, preferably Google Apigee or Kong.
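Monitoring and alerting in this role would normally live in Prometheus, Grafana, or Datadog, but teams often back those systems with small scripts; below is a hedged sketch of a health-check probe with retries and backoff, with endpoint URLs invented purely for illustration.

```python
import time
import requests

ENDPOINTS = {
    "api": "https://internal.example.com/healthz",       # placeholder URLs
    "auth": "https://internal.example.com/auth/health",
}

def probe(name: str, url: str, retries: int = 3) -> bool:
    """Return True if the service answers 200 within the retry budget."""
    for attempt in range(retries):
        try:
            if requests.get(url, timeout=5).status_code == 200:
                return True
        except requests.RequestException:
            pass
        time.sleep(2 ** attempt)  # simple backoff between attempts
    return False

for name, url in ENDPOINTS.items():
    if not probe(name, url):
        # Real setups would page via PagerDuty/Slack; here we just log.
        print(f"ALERT: {name} failed health check at {url}")
```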
Posted 1 week ago
8.0 - 13.0 years
20 - 35 Lacs
Bengaluru
Hybrid
About Guidesly: Do you enjoy the outdoors? Are you an enthusiast who wants to work in a place where you can follow your personal and professional passions? Guidesly is a well-funded startup building software for the outdoor industry: mobile apps, SaaS software, product services, and websites; we have a bit of everything. Sophisticated technology and a dream to change the world. If your passion is the outdoors and you want to live your dream, then this is your job. Work with guides, build products, and get honest feedback from enthusiasts to create the very best solutions in the marketplace. The outdoor recreation industry includes fishing, wildlife, boating, water sports, snow sports, hunting, ATV, hiking, and biking. We are starting with the fishing vertical in the outdoor recreation industry. The product manager will work with professional guides throughout the United States and enthusiasts in all fishing varieties.

Key Responsibilities:
Develop and deploy machine learning, deep learning, and NLP models for AI applications.
Process and analyze large datasets, including feature extraction, vectorization, and embedding-based retrieval (a minimal retrieval sketch follows this posting).
Build and maintain backend systems and APIs to support AI-powered applications.
Implement scalable AI solutions in cloud environments, primarily AWS.
Develop end-to-end pipelines for model training, testing, and deployment using AWS services such as SageMaker, Bedrock, Lambda, and Step Functions.
Optimize AI models for efficiency, scalability, and low-latency inference.
Implement MLOps best practices for continuous integration, deployment, and monitoring of AI models.
Document system architecture, API specifications, and performance benchmarks.
Stay up to date with advancements in AI, including LLMs, FAISS, and deep learning frameworks, to drive innovation.

Required Skills & Qualifications:
Hands-on experience in AI/ML development, with expertise in natural language processing (NLP) and/or computer vision.
Proficiency in Python, SQL, and machine learning libraries such as TensorFlow, PyTorch, and Scikit-learn.
Strong understanding of LLM architectures, embeddings, and transformer-based models.
Hands-on experience processing large datasets, including feature engineering, embedding retrieval, and vectorization.
Experience with cloud services (preferably AWS) for model deployment, data processing, and cloud-based AI solutions.
Strong grasp of MLOps principles, including CI/CD pipelines, model monitoring, and workflow automation.
Strong analytical and problem-solving skills with a proactive approach to AI model optimization.
Excellent communication and teamwork skills to collaborate effectively with engineers, data scientists, and stakeholders.

Key Technologies & Tools:
Languages & Frameworks: Python, SQL, TensorFlow, PyTorch, Hugging Face
Data Processing & Feature Engineering: Vectorization, Embeddings, FAISS
Cloud & AI Services: AWS SageMaker AI, Bedrock, Lambda, Step Functions, Inference
MLOps & Model Deployment: CI/CD, Model Optimization, API Integration
Workflow Automation & AI Infrastructure

Key skills: Machine Learning, Deep Learning, Python, SQL, TensorFlow, PyTorch, Hugging Face, AWS SageMaker AI, Lambda, Model Optimization, API Integration, AI Infrastructure
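The embedding-based retrieval work described above follows a common pattern: embed documents, index them, and search by vector similarity. Below is a minimal, hypothetical Python sketch using sentence-transformers and FAISS; the model name, sample documents, and query are illustrative assumptions, not the company's data or code.

```python
# Hypothetical sketch: embed a handful of documents and run a nearest-neighbour query with FAISS.
import numpy as np
import faiss
from sentence_transformers import SentenceTransformer

docs = [
    "Guided fly fishing trip on the Madison River",
    "Offshore charter for tuna and mahi-mahi",
    "Beginner-friendly bass fishing lesson on a local lake",
]

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model
embeddings = model.encode(docs, normalize_embeddings=True).astype("float32")

index = faiss.IndexFlatIP(embeddings.shape[1])  # inner product == cosine on normalized vectors
index.add(embeddings)

query = model.encode(["saltwater fishing charter"], normalize_embeddings=True).astype("float32")
scores, ids = index.search(query, 2)  # top-2 matches
for score, i in zip(scores[0], ids[0]):
    print(f"{score:.3f}  {docs[i]}")
```

In a production pipeline of the kind the posting describes, the index would be persisted or replaced by a managed vector store, with embeddings produced by whichever model the team standardizes on.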
Posted 1 week ago
3.0 - 4.0 years
3 - 4 Lacs
Bengaluru, Karnataka, India
On-site
Adobe InDesign is the industry-leading page design software and layout app that allows you to build, pre-flight, and publish professional documents for print and digital media. InDesign has everything you need to make posters, books, digital magazines, eBooks, interactive PDFs, and more. InDesign is part of the creative suite of products from Digital Media. InDesign is used by millions every month and touches the lives of people across the globe. Our development team solves a wide variety of engineering problems: complex typography and layout algorithms, graphics handling, rendering, sophisticated UI design, and much more. We are looking for a technical member to join our team!

What you'll do
Own product solutions through their end-to-end development lifecycle, ensuring high quality. As a key member of the team you will be expected to design, build, test, and deploy stable, scalable, and simple solutions.
Work very closely with product management, experience designers, and quality engineering to define requirements for features.
Work in a highly collaborative, fast-paced, agile environment.
Work with highly flexible and geographically distributed teams.
Mentor junior team members.

Must have skills:
B.Tech / M.Tech in Computer Science & Engineering from a best-in-class institute.
3 to 4 years of hands-on design / development experience.
Proficient in C/C++ or Java, data structures, and algorithms. Should be willing to work on C++.
Knowledge of application development on multiple platforms, including various flavours of Windows and Macintosh.
Solid understanding of design patterns and how to apply them to real-world problems.
Applying standard methodologies and experience, find opportunities for automation and solve problems using automation tools (sometimes AI-based) and technologies.
Performance tuning of applications.
Can-do attitude and willingness to take on challenges.

Nice to have skills:
Working knowledge of JavaScript.
Client-server / web-based development.
Python.
AI and ML.
Cloud platforms like AWS/Azure; develop cloud-enabled, highly scalable, and distributed solutions.
Posted 1 week ago