Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
10.0 years
0 Lacs
Greater Kolkata Area
On-site
Join our Team

About this opportunity:
We are seeking a highly skilled, hands-on AI Architect - GenAI to lead the design and implementation of production-grade, cloud-native AI and NLP solutions that drive business value and enhance decision-making. The ideal candidate will have a strong background in machine learning, generative AI, and the architecture of scalable production systems. As an AI Architect, you will play a key role in shaping the direction of advanced AI technologies and leading teams in the development of cutting-edge solutions.

What you will do:
- Architect and design AI and NLP solutions to address complex business challenges and support strategic decision-making.
- Lead the design and development of scalable machine learning models and applications using Python, Spark, NoSQL databases, and other advanced technologies.
- Spearhead the integration of generative AI techniques into production systems to deliver innovative solutions such as chatbots, automated document generation, and workflow optimization.
- Guide teams in conducting comprehensive data analysis and exploration to extract actionable insights from large datasets, ensuring these findings are communicated effectively to stakeholders.
- Collaborate with cross-functional teams, including software engineers and data engineers, to integrate AI models into production environments, ensuring scalability, reliability, and performance.
- Stay at the forefront of advancements in AI, NLP, and generative AI, incorporating emerging methodologies into existing models and developing new algorithms to solve complex challenges.
- Provide thought leadership on best practices for AI model architecture, deployment, and continuous optimization.
- Ensure that AI solutions are built with scalability, reliability, and compliance in mind.

The skills you bring:
- 10+ years of experience in AI, machine learning, or a similar role, with a proven track record of delivering AI-driven solutions.
- Hands-on experience designing and implementing end-to-end GenAI-based solutions, particularly chatbots, document generation, workflow automation, and other generative use cases.
- Expertise in Python programming and extensive experience with AI frameworks and libraries such as TensorFlow, PyTorch, scikit-learn, and vector databases.
- Deep understanding of and experience with distributed data processing using Spark.
- Proven experience architecting, deploying, and optimizing machine learning models in production environments at scale.
- Expertise in working with open-source generative AI models (e.g., GPT-4, Mistral, Code-Llama, StarCoder) and applying them to real-world use cases.
- Expertise in designing cloud-native architectures and microservices for AI/ML applications.

Why join Ericsson?
At Ericsson, you'll have an outstanding opportunity: the chance to use your skills and imagination to push the boundaries of what's possible, and to build solutions never seen before to some of the world's toughest problems. You'll be challenged, but you won't be alone. You'll be joining a team of diverse innovators, all driven to go beyond the status quo to craft what comes next.

What happens once you apply?
Click Here to find all you need to know about what our typical hiring process looks like.

Encouraging a diverse and inclusive organization is core to our values at Ericsson; that's why we champion it in everything we do. We truly believe that by collaborating with people with different experiences we drive innovation, which is essential for our future growth. We encourage people from all backgrounds to apply and realize their full potential as part of our Ericsson team. Ericsson is proud to be an Equal Opportunity Employer.

Primary country and city: India (IN) || Kolkata
Req ID: 763161
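As an illustration of the retrieval step behind GenAI chatbot use cases like those listed above, here is a minimal, self-contained vector similarity search sketch in Python. It is not tied to any particular vector database, and the toy three-dimensional "embeddings" are hypothetical stand-ins for vectors a real embedding model would produce.

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def retrieve(query_vec, store, k=2):
    # Rank stored (doc, vector) pairs by similarity to the query vector
    # and return the k best-matching documents.
    ranked = sorted(store, key=lambda item: cosine(query_vec, item[1]), reverse=True)
    return [doc for doc, _ in ranked[:k]]

# Toy "embeddings" -- invented for illustration; a real system would
# compute these with an embedding model and store them in a vector DB.
store = [
    ("refund policy", [0.9, 0.1, 0.0]),
    ("shipping times", [0.1, 0.9, 0.0]),
    ("warranty terms", [0.8, 0.2, 0.1]),
]
print(retrieve([1.0, 0.0, 0.0], store, k=2))
```

The retrieved passages would then be placed into the prompt of a generative model, which is the usual retrieval-augmented pattern behind production chatbots.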
Posted 1 day ago
6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
The ideal candidate will be responsible for conceptualizing and executing clear, quality code to develop the best software. You will test your code, identify errors, and iterate to ensure quality. You will also support our customers and partners by troubleshooting their software issues. The role requires strong leadership, critical thinking, and problem-solving skills to ensure a high-performing development team.

Responsibilities
- Detect and troubleshoot software issues.
- Write clear, quality code for software and applications, and perform test reviews.
- Develop, implement, and test APIs.
- Provide input on software development projects.
- Lead and manage a development team, fostering collaboration and efficiency.
- Apply problem-solving skills to overcome challenges in development.
- Demonstrate critical thinking in optimizing software architecture and design.

Requirements
Technical Skills & Tech Stack:
- Proficiency in Node.js, React.js, and Next.js.
- Experience with NoSQL databases, MongoDB, and MySQL.
- Strong understanding of API development and integration.
- Familiarity with scalable and secure application architectures.

Qualifications:
- 6 years of development experience.
- Proven experience in team handling and leadership.
- Strong problem-solving and critical thinking abilities.
- Comfortable using programming languages and relational databases.
- Strong debugging and troubleshooting skills.
- Ability to mentor and guide junior developers.
- Effective communication and decision-making skills.

This job was posted by Saswati Ray from Automios.
Posted 1 day ago
12.0 - 15.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Our Company
Changing the world through digital experiences is what Adobe’s all about. We give everyone—from emerging artists to global brands—everything they need to design and deliver exceptional digital experiences! We’re passionate about empowering people to create beautiful and powerful images, videos, and apps, and transform how companies interact with customers across every screen. We’re on a mission to hire the very best and are committed to creating exceptional employee experiences where everyone is respected and has access to equal opportunity. We realize that new ideas can come from everywhere in the organization, and we know the next big idea could be yours!

The Opportunity
Join Adobe in the heart of Bangalore, where brand-new engineering meets outstanding innovation. As a Software Development Engineer, you will play a pivotal role in crafting the future of digital experiences. This is an outstanding opportunity to develop groundbreaking systems and services as part of a multifaceted and ambitious team made up of machine learning engineers, data engineers, and front-end engineers. Your work will be instrumental in delivering powerful technology that empowers users globally. You will be an experienced backend engineer for the AI/ML, Data Platform, Search, and Recommendations teams of the Adobe Learning Manager.

What You’ll Do
- Build Java-based services to power APIs for search, recommendations, AI assistants, reporting, and analytics.
- Build backend systems such as indexing pipelines for search and vector datastores.
- Build horizontally scalable data pipelines.
- Provide technical leadership for the design and architecture of systems that blend data, ML, and services stacks.
- Work closely with machine learning scientists, data engineers, UX designers, and product managers to develop solutions across search, recommendations, AI assistants, and data engineering.
- Integrate Natural Language Processing (NLP) capabilities into the stack.
- Analyze and present key findings, insights, and concepts to key influencers and leaders, and contribute to building the product roadmap.
- Deliver highly reliable services with great quality and operational excellence.

What you need to succeed
- A Bachelor's degree in Computer Science or a relevant stream.
- 12 to 15 years of relevant experience.
- At least 5 years of hands-on experience building microservices and REST APIs using Java.
- At least 5 years of hands-on experience building data pipelines using Big Data technologies such as Hadoop, Spark, or Storm.
- Strong hands-on experience with RDBMS and NoSQL databases.
- Strong grasp of web services and distributed computing fundamentals.
- Strong background in data engineering and hands-on experience with big data technologies.
- Strong analytical and problem-solving skills.
- Hands-on experience with Python, Elasticsearch, Spark, and Kafka would be a plus.
- Hands-on experience rolling out AI- and ML-based solutions would be a plus.
- Enthusiasm for technological trends and eagerness to innovate.
- Ability to quickly ramp up on new technologies.
- Proven track record of engineering-generalist resourcefulness.

Adobe is proud to be an Equal Employment Opportunity employer. We do not discriminate based on gender, race or color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, veteran status, or any other applicable characteristics protected by law. Learn more about our vision here. Adobe aims to make Adobe.com accessible to any and all users. If you have a disability or special need that requires accommodation to navigate our website or complete the application process, email accommodations@adobe.com or call (408) 536-3015.
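To illustrate the kind of search indexing pipeline the role above describes, here is a minimal inverted-index sketch in Python. The tiny document set and the AND-query semantics are simplifying assumptions; production search stacks (e.g., Elasticsearch, mentioned in the posting) add tokenization, relevance scoring, and sharding on top of this core idea.

```python
from collections import defaultdict

def build_index(docs):
    # Map each token to the set of document ids containing it.
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for token in text.lower().split():
            index[token].add(doc_id)
    return index

def search(index, query):
    # AND semantics: return ids of documents containing every query token.
    token_sets = [index.get(t, set()) for t in query.lower().split()]
    return sorted(set.intersection(*token_sets)) if token_sets else []

# Invented sample documents, for illustration only.
docs = {1: "spark data pipeline", 2: "search and recommendations", 3: "data search service"}
index = build_index(docs)
print(search(index, "data search"))  # -> [3]
```

The "indexing pipeline" in a real system is essentially this build step run continuously over incoming documents, with the index stored in a distributed datastore rather than a Python dict.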
Posted 1 day ago
5.0 - 10.0 years
12 - 15 Lacs
Bengaluru
Work from Office
Who You Are:

Technical Expertise:
- Proficient in Java/J2EE with 5+ years of hands-on expertise. Specializes in: Spring Framework (expert level), microservices (advanced level), RESTful/GraphQL APIs (advanced level), and cloud environments like AWS/Azure (intermediate level).
- Skilled at writing clean, scalable code that drives innovation.
- Experience includes working with: ORM, JSON, event-driven architecture, IoC, AOP, containerization, service discovery, service mesh, lambda expressions, and multi-threading.

Experience:
- Proficient with: RDBMS (intermediate level), NoSQL (intermediate level), Jira (advanced level), Git (advanced level), Maven (intermediate level), and Jenkins (intermediate level).
- Utilizes these tools and platforms effectively in software development processes.

Analytical Thinker: A strategic thinker passionate about engaging in requirements analysis and solving complex issues through software design and architecture.

Team Player: A supportive teammate ready to mentor, uplift your team, and collaborate with internal teams to foster an environment of growth and innovation.

Innovation-Driven: Always on the lookout for new technologies to disrupt the norm, you're committed to improving existing software and eager to lead the charge in integrating AI and cutting-edge technologies.

(Flexibility for remote work within these locations may be considered for the right candidate.)
Posted 1 day ago
1.0 - 3.0 years
0 Lacs
Bangalore Urban, Karnataka, India
On-site
Who we are?
Mindtickle is the market-leading revenue productivity platform that combines on-the-job learning and deal execution to get more revenue per rep. Mindtickle is recognized as a market leader by top industry analysts and is ranked by G2 as the #1 sales onboarding and training product. We’re honoured to be recognized as a Leader in the first-ever Forrester Wave™: Revenue Enablement Platforms, Q3 2024!

We are seeking a talented Backend Developer to join our dynamic team. You will work directly with our engineering teams to develop our next-generation, cloud-based solutions. This is an opportunity to join a fast-paced, innovative company and make a strong impact on new product development and user experience. The candidate will be responsible for designing, building, maintaining, and scaling components powering the platform at Mindtickle, selecting the most appropriate architecture for a system (or systems) to suit business needs and achieve desired results under given constraints.

What’s in it for you?
- Design & Build: Design and develop high-volume, low-latency, highly reliable applications for mission-critical systems, delivering high availability and performance.
- Collaborate: Collaborate within your product streams and team to bring best practices and leverage a world-class tech stack.
- Measurable outcome: Set quantifiable objectives that encapsulate the quality attributes of a system; the fitness of the application is measured against these marks.

We’d love to hear from you, if you:
- Hold a Bachelor's (B.Tech/BS/BE) or Master's (M.Tech/MS/ME) degree in Computer Science or its equivalent from a top-tier engineering college.
- Have 1-3 years of strong software development experience and software engineering skills in Java, Golang, Ruby, etc. An understanding of front-end technologies like React and TypeScript is a plus.
- Have expertise and practical knowledge of operating systems and MySQL/NoSQL.
- Working knowledge of data warehouses and SQL queries.
- Sound understanding of REST APIs, OAuth, OOP, SAML, and JWT concepts.
- Working knowledge of Amazon Web Services (AWS).
- A positive attitude toward problem-solving and a structured thought process for solving problems.
- Experience with Docker and Kubernetes is a plus.

Skills and Attributes:
- Self-motivated and a team player.
- Excellent communication skills: written, verbal, and presentation.
- Ability to learn and adapt to new technologies in a fast-paced environment.
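Since the posting asks for an understanding of JWT concepts, the following is a minimal sketch of HS256 JWT signing and verification using only Python's standard library. The secret and claims are invented for illustration; a real service would use a maintained library such as PyJWT and would also validate registered claims like expiry.

```python
import base64, hashlib, hmac, json

def b64url(data: bytes) -> str:
    # JWT uses base64url encoding with padding stripped.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(payload: dict, secret: bytes) -> str:
    # Token = base64url(header) . base64url(payload) . base64url(HMAC-SHA256 signature)
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def verify_jwt(token: str, secret: bytes) -> bool:
    # Recompute the signature over header.payload and compare in constant time.
    header, body, sig = token.split(".")
    expected = b64url(hmac.new(secret, f"{header}.{body}".encode(), hashlib.sha256).digest())
    return hmac.compare_digest(sig, expected)

token = sign_jwt({"sub": "user-42"}, b"dev-secret")
print(verify_jwt(token, b"dev-secret"))    # True
print(verify_jwt(token, b"wrong-secret"))  # False
```

The key property this demonstrates is that anyone can read a JWT's payload, but only a holder of the shared secret can produce or verify a valid signature.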
Posted 1 day ago
2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Lowe’s
Lowe’s Companies, Inc. (NYSE: LOW) is a FORTUNE® 50 home improvement company serving approximately 16 million customer transactions a week in the United States. With total fiscal year 2024 sales of more than $83 billion, Lowe’s operates over 1,700 home improvement stores and employs approximately 300,000 associates. Based in Mooresville, N.C., Lowe’s supports the communities it serves through programs focused on creating safe, affordable housing, improving community spaces, helping to develop the next generation of skilled trade experts and providing disaster relief to communities in need. For more information, visit Lowes.com.

About The Team
We're a team in Bengaluru passionate about crafting robust and scalable software using cutting-edge technologies like Java, Spring, and microservices. Our work is vital to Lowe's, directly impacting critical applications for buying, moving, and selling products. We value open communication, continuous learning, and a supportive environment where everyone's contributions lead to exceptional results in tackling complex challenges.

Job Summary
We are seeking a talented and passionate Software Engineer to join our growing engineering team. You will play a key role in designing, developing, and deploying scalable and resilient microservices using Java, Spring, and event-driven architectures with Kafka. You will also work with a variety of data storage solutions, both SQL and NoSQL, to build robust and efficient applications. This is an excellent opportunity to contribute to impactful projects and work with cutting-edge technologies in a collaborative and innovative environment.

Roles & Responsibilities
Core Responsibilities:
- Design, develop, and maintain high-performance, scalable microservices using Java and the Spring framework (Spring Boot, Spring Cloud).
- Implement robust and efficient communication between microservices using RESTful APIs and event streaming with Kafka.
- Work with both relational (e.g., PostgreSQL, MySQL) and NoSQL (e.g., MongoDB, Cassandra) databases.
- Write clean, well-documented, and testable code following best practices and coding standards.
- Participate in the full software development lifecycle, including requirements gathering, design, development, testing, deployment, and maintenance.
- Collaborate effectively with cross-functional teams, including product managers, designers, and engineers.
- Troubleshoot and resolve production issues efficiently.
- Contribute to the continuous improvement of our development processes and technology stack.
- Stay up-to-date with the latest trends and technologies in software development, particularly in the Java/Spring ecosystem, microservices architecture, and data management.
- Participate in code reviews to ensure code quality and knowledge sharing.

Years Of Experience
- Experience in core Java development (Java 11 and above) and the Spring microservices stack.
- Knowledge of API integrations and REST services.
- Knowledge of event-driven architecture with Kafka.
- Good knowledge of SQL/NoSQL and working experience with databases such as Oracle, Microsoft SQL Server, IBM DB2, or MongoDB.
- Knowledge of the React framework, JS, and HTML is good to have.

Education Qualification & Certifications
Required Minimum Qualifications:
- Bachelor's degree in Computer Science, CIS, or a related field and 2+ years of experience in software development or a related field. Additional equivalent work experience may be substituted for the degree requirement.

Skill Set Required
Primary Skills (must have):
- Strong proficiency in Java and the Spring framework (Spring Boot, Spring Cloud).
- Hands-on experience designing and building microservices architectures.
- Solid understanding of RESTful API design principles.
- Experience with event-driven architectures and Kafka.
- Experience working with relational databases (e.g., PostgreSQL, MySQL) and writing efficient SQL queries.
- Familiarity with NoSQL databases (e.g., MongoDB, Cassandra, Redis) and understanding of their use cases.
- Experience with version control systems (e.g., Git).
- Knowledge of the React framework, JS, and HTML is good to have.

Secondary Skills (desired):
- Self-learner, ready to pick up other technologies and work in application support.
- Experience with React would be preferred.
- Writes clean, well-documented, and testable code following best practices and coding standards.
- Good written and verbal communication skills.
- Strong individual contributor to the team.
- Able to collaborate effectively with cross-functional teams, including product managers, designers, and engineers.
- Good analytical skills.
- Retail industry experience is optional.

Lowe's is an equal opportunity employer and administers all personnel practices without regard to race, color, religious creed, sex, gender, age, ancestry, national origin, mental or physical disability or medical condition, sexual orientation, gender identity or expression, marital status, military or veteran status, genetic information, or any other category protected under federal, state, or local law. Starting rate of pay may vary based on factors including, but not limited to, position offered, location, education, training, and/or experience. For information regarding our benefit programs and eligibility, please visit https://talent.lowes.com/us/en/benefits.
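The event-driven style this posting emphasizes can be illustrated with a tiny in-process publish/subscribe sketch in Python. This is a conceptual stand-in only: a real system would publish to durable, partitioned Kafka topics rather than an in-memory dict, and the topic and handler names here are invented.

```python
from collections import defaultdict

class EventBus:
    # Minimal in-process stand-in for a topic-based broker like Kafka.
    def __init__(self):
        self.handlers = defaultdict(list)

    def subscribe(self, topic, handler):
        # Register a consumer callback for a topic.
        self.handlers[topic].append(handler)

    def publish(self, topic, event):
        # Deliver the event to every subscriber of that topic.
        for handler in self.handlers[topic]:
            handler(event)

bus = EventBus()
audit_log = []
# Two independent services react to the same event without knowing
# about each other -- the decoupling that event-driven design buys.
bus.subscribe("order.created", lambda e: audit_log.append(("inventory", e["id"])))
bus.subscribe("order.created", lambda e: audit_log.append(("email", e["id"])))
bus.publish("order.created", {"id": 101})
print(audit_log)  # [('inventory', 101), ('email', 101)]
```

With Kafka, the producer and consumers would additionally be separated in time: events persist in the topic log, so a consumer can replay them after downtime.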
Posted 1 day ago
0.0 - 4.0 years
0 Lacs
Mohali, Punjab
On-site
Job Title: Python Developer
Location: Mohali, Punjab
Company: RevClerx

About RevClerx: RevClerx Pvt. Ltd., founded in 2017 and based in the Chandigarh/Mohali area (India), is a dynamic Information Technology firm providing comprehensive IT services with a strong focus on client-centric solutions. As a global provider, we cater to diverse business needs including website designing and development, digital marketing, lead generation services (including telemarketing and qualification), and appointment setting.

Job Summary: We are seeking a motivated and skilled Python Developer with 3-4 years of professional experience to join our dynamic engineering team. The ideal candidate will be proficient in developing, deploying, and maintaining robust Python-based applications and services. You will play a key role in the entire software development lifecycle, from conceptualization and design through testing, deployment, and ongoing maintenance. While core Python development is essential, we highly value candidates with an interest or experience in emerging technologies like AI/ML and Large Language Model (LLM) applications.

Key Responsibilities:
- Design, develop, test, deploy, and maintain high-quality, scalable, and efficient Python code.
- Collaborate closely with product managers, designers, and other engineers to understand requirements and translate them into technical solutions.
- Participate in the full software development lifecycle (SDLC) using Agile methodologies.
- Write clean, maintainable, well-documented, and testable code.
- Contribute to code reviews to ensure code quality, share knowledge, and identify potential issues.
- Troubleshoot, debug, and upgrade existing software systems.
- Develop and integrate with RESTful APIs and potentially other web services.
- Work with databases (like PostgreSQL) to store and retrieve data efficiently.
- Optimize applications for maximum speed, scalability, and reliability.
- Stay up-to-date with the latest industry trends, technologies, and best practices in Python development and related fields.
- Potentially assist in the integration of AI/ML models or contribute to projects involving LLM-based agents or applications.

Minimum Qualifications:
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
- 3-4 years of professional software development experience with a primary focus on Python.
- Strong proficiency in Python and its standard libraries.
- Proven experience with at least one major Python web framework (e.g., Django, Flask, FastAPI).
- Solid understanding of object-oriented programming (OOP) principles.
- Experience working with relational databases (e.g., PostgreSQL, MySQL) and/or NoSQL databases (e.g., MongoDB, Redis).
- Proficiency with version control systems, particularly Git.
- Experience designing, building, and consuming RESTful APIs.
- Familiarity with Agile development methodologies (e.g., Scrum, Kanban).
- Strong problem-solving and analytical skills.
- Excellent communication and teamwork abilities.

Preferred (Good-to-Have) Qualifications:
AI/ML Knowledge:
- Basic understanding of machine learning concepts and algorithms.
- Experience with relevant Python libraries for data science and ML (e.g., Pandas, NumPy, scikit-learn).
- Experience integrating pre-trained ML models into applications.
- Familiarity with deep learning frameworks (e.g., TensorFlow, PyTorch) is a plus.
LLM Experience:
- Demonstrable interest or hands-on experience in building applications leveraging Large Language Models (LLMs).
- Experience working with LLM APIs (e.g., OpenAI GPT, Anthropic Claude, Google Gemini).
- Familiarity with LLM frameworks or libraries (e.g., LangChain, LlamaIndex).
- Understanding of basic prompt engineering techniques.
- Experience building or experimenting with LLM-powered agents or chatbots.
Containerization & Orchestration:
- Experience with containerization technologies like Docker and orchestration tools like Kubernetes.
CI/CD:
- Experience setting up or working with Continuous Integration/Continuous Deployment (CI/CD) pipelines (e.g., Jenkins, GitLab CI, GitHub Actions).
Asynchronous Programming:
- Experience with Python's asynchronous libraries (e.g., asyncio, aiohttp).

What We Offer:
- Challenging projects with opportunities to work on cutting-edge technologies, especially in the field of AI.
- Competitive salary and comprehensive benefits package.
- Opportunities for professional development and learning (e.g., conferences, courses, certifications).
- A collaborative, innovative, and supportive work environment.

Job Type: Full-time
Pay: ₹16,526.97 - ₹68,399.45 per month
Benefits: Food provided; Health insurance
Location Type: In-person
Schedule: Monday to Friday (day and morning shifts)
Supplemental Pay: Yearly bonus
Work Location: In person
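For the asynchronous-programming qualification mentioned above, here is a minimal asyncio sketch: three simulated I/O calls run concurrently, so total wall time is roughly the longest single delay rather than the sum. The `fetch` function is a hypothetical stand-in for a real HTTP or database call.

```python
import asyncio

async def fetch(name, delay):
    # Stand-in for an I/O-bound operation (e.g., an HTTP request).
    await asyncio.sleep(delay)
    return name

async def main():
    # gather() schedules all three coroutines concurrently and
    # returns their results in argument order, not completion order.
    return await asyncio.gather(fetch("a", 0.03), fetch("b", 0.01), fetch("c", 0.02))

print(asyncio.run(main()))  # ['a', 'b', 'c']
```

This only helps for I/O-bound work; CPU-bound tasks still need processes or threads, since a coroutine blocks the event loop while it computes.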
Posted 1 day ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Description
Alimentation Couche-Tard Inc. (ACT) is a global Fortune 200 company and a leader in the convenience store and fuel space, with over 17,000 stores in 31 countries serving more than 6 million customers each day.

It is an exciting time to be a part of the growing Data Engineering team at Circle K. We are driving a well-supported cloud-first strategy to unlock the power of data across the company and help teams discover, value, and act on insights from data across the globe. With our strong data pipeline, this position will play a key role in partnering with our Technical Development stakeholders to enable analytics for long-term success.

About The Role
We are looking for a Senior Data Engineer with a collaborative, "can-do" attitude who is committed and strives with determination and motivation to make their team successful; a Sr. Data Engineer who has experience architecting and implementing technical solutions as part of a greater data transformation strategy. This role is responsible for hands-on sourcing, manipulation, and delivery of data from enterprise business systems to the data lake and data warehouse. It will help drive Circle K's next phase in the digital journey by modeling and transforming data to achieve actionable business outcomes. The Sr. Data Engineer will create, troubleshoot, and support ETL pipelines and the cloud infrastructure involved in the process, and will be able to support the visualizations team.

Roles and Responsibilities
- Collaborate with business stakeholders and other technical team members to acquire and migrate data sources that are most relevant to business needs and goals.
- Demonstrate deep technical and domain knowledge of relational and non-relational databases, data warehouses, and data lakes, among other structured and unstructured storage options.
- Determine solutions that are best suited to develop a pipeline for a particular data source.
- Develop data flow pipelines to extract, transform, and load data from various data sources in various forms, including custom ETL pipelines that enable model and product development.
- Be efficient in ETL/ELT development using Azure cloud services and Snowflake, including testing and operation/support processes (RCA of production issues, code/data fix strategy, monitoring, and maintenance).
- Work with modern data platforms including Snowflake to develop, test, and operationalize data pipelines for scalable analytics delivery.
- Provide clear documentation for delivered solutions and processes, integrating documentation with the appropriate corporate stakeholders.
- Identify and implement internal process improvements for data management (automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability).
- Stay current with and adopt new tools and applications to ensure high-quality and efficient solutions.
- Build a cross-platform data strategy to aggregate multiple sources and process development datasets.
- Be proactive in stakeholder communication; mentor and guide junior resources through regular KT/reverse KT, and help them identify production bugs/issues and provide resolution recommendations.

Job Requirements
- Bachelor's degree in Computer Engineering, Computer Science, or a related discipline; Master's degree preferred.
- 5+ years of ETL design, development, and performance tuning using ETL tools such as SSIS/ADF in a multi-dimensional data warehousing environment.
- 5+ years of experience setting up and operating data pipelines using Python or SQL.
- 5+ years of advanced SQL programming: PL/SQL, T-SQL.
- 5+ years of experience working with Snowflake, including Snowflake SQL, data modeling, and performance optimization.
- Strong hands-on experience with cloud data platforms such as Azure Synapse and Snowflake for building data pipelines and analytics workloads.
- 5+ years of strong, extensive hands-on experience in Azure, preferably on data-heavy/analytics applications leveraging relational and NoSQL databases, data warehouses, and Big Data.
- 5+ years of experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Databricks/Spark, Azure SQL DW/Synapse, and Azure Functions.
- 5+ years of experience defining and enabling data quality standards for auditing and monitoring.
- Strong analytical abilities and intellectual curiosity.
- In-depth knowledge of relational database design, data warehousing, and dimensional data modeling concepts.
- Understanding of REST and good API design.
- Experience working with Apache Iceberg, Delta tables, and distributed computing frameworks.
- Strong collaboration and teamwork skills, plus excellent written and verbal communication skills.
- Self-starter, motivated, with the ability to work in a fast-paced development environment; Agile experience highly desirable.
- Proficiency in the development environment, including IDE, database server, Git, continuous integration, unit-testing tools, and defect management tools.

Knowledge
- Strong knowledge of data engineering concepts (data pipeline creation, data warehousing, data marts/cubes, data reconciliation and audit, data management).
- Strong working knowledge of Snowflake, including warehouse management, Snowflake SQL, and data sharing techniques; experience building pipelines that source from or deliver data into Snowflake in combination with tools like ADF and Databricks.
- Working knowledge of DevOps processes (CI/CD), the Git/Jenkins version control tools, Master Data Management (MDM), and data quality tools.
- Strong experience in ETL/ELT development, QA, and operation/support processes (RCA of production issues, code/data fix strategy, monitoring, and maintenance).
- Hands-on experience with databases (Azure SQL DB, MySQL, Cosmos DB, etc.), file systems (Blob Storage), and Python/Unix shell scripting.
ADF, Databricks, and Azure certifications are a plus.

Technologies we use: Databricks, Azure SQL DW/Synapse, Azure Tabular, Azure Data Factory, Azure Functions, Azure Containers, Docker, DevOps, Python, PySpark, scripting (PowerShell, Bash), Git, Terraform, Power BI, Snowflake
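As a toy illustration of the extract-transform-load flow this role centers on, here is a self-contained Python sketch using only the standard library: a CSV string stands in for a source file, and an in-memory SQLite database stands in for the warehouse. The sales figures are invented; a production pipeline would use the ADF/Databricks/Snowflake stack listed above instead.

```python
import csv, io, sqlite3

# Extract: pretend this string was read from a source file or API.
raw = "store,sales\nA,120\nB,80\nA,40\n"

def etl(csv_text):
    rows = list(csv.DictReader(io.StringIO(csv_text)))           # extract
    conn = sqlite3.connect(":memory:")                           # load target
    conn.execute("CREATE TABLE sales (store TEXT, amount INT)")
    conn.executemany(
        "INSERT INTO sales VALUES (?, ?)",
        [(r["store"], int(r["sales"])) for r in rows],           # transform types
    )
    # A downstream "report" query over the loaded table.
    return conn.execute(
        "SELECT store, SUM(amount) FROM sales GROUP BY store ORDER BY store"
    ).fetchall()

print(etl(raw))  # [('A', 160), ('B', 80)]
```

The structure (ingest raw records, cast and clean them, load into a queryable store, aggregate) is the same whether the target is SQLite or a cloud warehouse; only the scale and tooling change.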
Posted 1 day ago
4.0 years
0 Lacs
India
Remote
We are seeking a skilled and passionate full-stack .NET developer to join our growing engineering team. As a .NET developer, you will be responsible for designing, developing, and maintaining end-to-end applications using .NET Core, Angular, and C#. You will work closely with backend and frontend teams to deliver high-quality, scalable, and secure web applications.

Qualifications:
- 4+ years of experience in full-stack development.
- Strong experience with the .NET Core framework and C#.
- Good hands-on experience with Angular (versions 11/12).
- Experience with RESTful APIs and web services.
- Strong understanding of SQL (e.g., SQL Server, PostgreSQL), and experience with database design and query optimization.
- Familiarity with NoSQL databases (e.g., MongoDB, Cassandra) is a plus.

Why join us?
You'll have the opportunity to collaborate on multiple global projects, essentially gaining experience across multiple technologies simultaneously.

More reasons to join us:
- 4.4 Glassdoor rating
- Fully remote work environment
- Exposure to cutting-edge technologies and international clients spanning various industries
- Opportunities to engage in diverse projects and technologies, with cross-domain training and support for career or domain transitions, including certification reimbursement
- Profitable and bootstrapped company
- Flexible working hours with a 5-day workweek
- Over 30 paid leaves annually
- Merit-based compensation with above-average annual increments
- Sponsored team luncheons, festive celebrations, and semi-annual retreats
Posted 1 day ago
2.0 - 4.0 years
8 - 12 Lacs
Bengaluru
Work from Office
At Broadridge, we've built a culture where the highest goal is to empower others to accomplish more. If you're passionate about developing your career, while helping others along the way, come join the Broadridge team.

Key Responsibilities
- Collaborate with internal stakeholders to understand pain points and translate them into technical requirements
- Design and develop full-stack applications that leverage generative AI and other AI technologies
- Create functional prototypes and proof-of-concept solutions using design thinking methodologies
- Integrate with our established systems and enterprise data sources
- Build intuitive user interfaces that make AI solutions accessible to non-technical colleagues
- Work in a fast-paced environment with rapid iteration cycles

Required Qualifications
- 6+ years of experience in full-stack development
- Strong proficiency in modern web frameworks (React, Angular, Vue) and backend technologies (Node.js, Python, Java)
- Experience with database design and implementation (SQL and NoSQL)
- Familiarity with cloud platforms (AWS, Azure, GCP)
- Understanding of AI/ML concepts and experience integrating with AI services or APIs
- Ability to work with established systems and navigate complex enterprise environments
- Excellent problem-solving skills and ability to learn new technologies quickly

Preferred Qualifications
- Experience with LLM integration, prompt engineering, or RAG architectures
- Knowledge of containerization and microservices architecture
- Background in UI/UX design principles
- Experience with agile development methodologies
- Understanding of financial services technology and workflows

Personal Attributes
- Entrepreneurial mindset with comfort in ambiguous situations
- Scrappy problem-solver who can work with limited resources
- Strong communicator who can translate between technical and business languages
- Self-starter who thrives in a fast-paced environment
- Collaborator who works well in small, close-knit teams
- Practical approach to technology that recognizes AI is not a silver bullet
- Adaptable and resilient in the face of changing requirements

We are dedicated to fostering a collaborative, engaging, and inclusive environment and are committed to providing a workplace that empowers associates to be authentic and bring their best to work. We believe that associates do their best when they feel safe, understood, and valued, and we work diligently and collaboratively to ensure Broadridge is a company and ultimately a community that recognizes and celebrates everyone's unique perspective.
Posted 1 day ago
4.0 - 14.0 years
35 - 40 Lacs
Bengaluru
Work from Office
NVIDIA is searching for a highly motivated software engineer for the NVIDIA NetQ team that is building a next-gen network management and telemetry system in the cloud using modern design principles at internet scale. NVIDIA NetQ is a highly scalable, modern network operations toolset that provides visibility, troubleshooting, and validation of your Cumulus fabrics in real time. NetQ utilizes telemetry and delivers actionable insights about the health of your data center network, integrating the fabric into your DevOps ecosystem.
What you'll be doing:
- Building and maintaining infrastructure components like NoSQL DBs (Cassandra, Mongo), TSDBs, Kafka, etc.
- Maintaining CI/CD pipelines to automate the build, test, and deployment process, and building improvements on the bottlenecks
- Managing tools and enabling automation for redundant manual workflows via Jenkins, Ansible, Terraform, etc.
- Performing scans and handling security CVEs for infrastructure components
- Triaging and handling production issues to improve system reliability and service for customers
What we need to see:
- 5+ years of experience in complex microservices-based architectures and a Bachelor's degree
- Highly skilled in Kubernetes and Docker/containerd
- Experienced with modern deployment architecture for non-disruptive cloud operations, including blue-green and canary rollouts
- Automation expert with hands-on skills in frameworks like Ansible and Terraform
- Strong knowledge of NoSQL DBs (preferably Cassandra), Kafka/Kafka Streams, and Nginx
- Expert in AWS, Azure, or GCP
- Good programming background in languages like Scala or Python
- Knows best practices and the discipline of managing a highly available and secure production infrastructure
Ways to stand out from the crowd:
- Experience with APM tools like Dynatrace, Datadog, AppDynamics, New Relic, etc.
- Skills in Linux/Unix administration
- Experience with Prometheus/Grafana
- Implemented highly scalable log aggregation systems in the past using the ELK stack or similar
- Implemented robust metrics collection and alerting infrastructure
NVIDIA is widely considered to be one of the technology world's most desirable employers. We have some of the most forward-thinking and hardworking people on the planet working for us. If you're creative, passionate and self-motivated, we want to hear from you! NVIDIA is leading the way in ground-breaking developments in Artificial Intelligence, High-Performance Computing and Visualization. The GPU, our invention, serves as the visual cortex of modern computers and is at the heart of our products and services.
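The blue-green and canary rollouts this posting mentions hinge on splitting traffic deterministically between two service versions. A minimal sketch of hash-based canary routing, in Python for illustration (the function name and percentages are invented, not part of NetQ):

```python
import hashlib

def canary_route(request_id: str, canary_percent: int) -> str:
    """Deterministically route a request to the 'stable' or 'canary' version.

    Hashing a stable request/session id keeps routing sticky: the same
    client keeps hitting the same version while the rollout percentage holds.
    """
    bucket = int(hashlib.sha256(request_id.encode()).hexdigest(), 16) % 100
    return "canary" if bucket < canary_percent else "stable"
```

In practice this decision lives in a load balancer or service mesh rather than application code, but the sticky-bucketing idea is the same.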
Posted 1 day ago
2.0 - 6.0 years
7 - 11 Lacs
Pune
Work from Office
Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Consultant Specialist. The individual in this role as ML Engineer will be accountable for the design, development, and deployment of machine learning models and algorithms to solve business problems. Below are the key responsibilities/activities that need to be planned, attested and executed under the remit of this role, working effectively and collaboratively with the different delivery teams.
In this role, you will:
- Design, develop, and deploy machine learning models and algorithms to solve business problems
- Preprocess and analyze large datasets to extract meaningful insights
- Build and optimize data pipelines (extract data from Oracle, Elasticsearch, storage buckets, etc.) for training and deploying ML models
- Collaborate with data scientists, software engineers, and stakeholders to integrate ML solutions into production systems
- Monitor and maintain deployed models to ensure performance and accuracy over time
- Research and implement state-of-the-art machine learning techniques and tools
- Document processes, experiments, and results for reproducibility and knowledge sharing
- Stay up to date with technology, prototype with and learn new technologies, and be proactive in technology communities
- Develop and maintain ML models for the supervision domain, e.g. anomaly detection, global search engine, chatbot, and specialized/customized models
- Develop innovative solutions in areas such as machine learning, Natural Language Processing (NLP), advanced and semantic information search, extraction, induction, classification and exploration
- Create products that provide a great user experience along with high performance, security, quality, and stability
Requirements
To be successful in this role, you should meet the following requirements:
- Minimum 7 years of software development experience, with 2+ years of relevant experience in the ML technologies mentioned below
- Excellent problem-solving and communication skills
- Strong experience in Python (3.x)
- Excellent working knowledge of scikit-learn, TensorFlow/PyTorch, and Docker/Kubernetes
- Good experience with SQL, Oracle/PostgreSQL, any NoSQL database, and file buckets
- Excellent knowledge of, and demonstrable experience in, using open-source NLP packages such as NLTK, Word2Vec, and spaCy
- Experience in setting up supervised and unsupervised learning models, including data cleaning, data analytics, feature creation, model selection and ensemble methods, performance metrics and visualization
- Solid understanding of ML algorithms, ML statistics, and data structures
- Excellent interpersonal, presentation and analytical skills
What additional skills will be good to have:
- Familiarity with MLOps practices for model deployment and monitoring
- Experience working in the investment banking domain with exposure to the trade life cycle and front office controls supervision
- Experience in automating the continuous integration/continuous delivery pipeline within a DevOps product/service team, driving a culture of continuous improvement
You'll achieve more when you join HSBC.
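The supervised-learning workflow this posting describes (data cleaning, feature scaling, model fitting, prediction) can be sketched end to end. This is a toy, stdlib-only illustration using a nearest-centroid classifier; a real pipeline would use scikit-learn or similar, and all data and names below are invented:

```python
from statistics import mean, pstdev

def fit_scaler(X):
    """Learn per-feature z-score parameters from training data only."""
    return [(mean(col), pstdev(col) or 1.0) for col in zip(*X)]

def transform(X, params):
    return [[(v - mu) / sd for v, (mu, sd) in zip(row, params)] for row in X]

def fit_centroids(X, y):
    """Nearest-centroid model: one mean vector per class label."""
    return {lab: [mean(col) for col in zip(*[x for x, t in zip(X, y) if t == lab])]
            for lab in set(y)}

def predict(model, row):
    dist = lambda a, b: sum((p - q) ** 2 for p, q in zip(a, b))
    return min(model, key=lambda lab: dist(model[lab], row))

# Pipeline: clean -> scale -> fit.  (All data below is invented.)
raw = [([1.0, 2.0], "low"), ([1.2, 1.8], "low"),
       ([8.0, 9.0], "high"), ([7.5, 9.5], "high"),
       ([None, 3.0], "low")]
data = [(x, t) for x, t in raw if None not in x]   # data cleaning: drop bad rows
X, y = [x for x, _ in data], [t for _, t in data]
params = fit_scaler(X)
model = fit_centroids(transform(X, params), y)
```

Note the scaler is fitted on training data and then reused for queries, which is the same train/serve discipline the MLOps bullet points at.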
Posted 1 day ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Role summary: Reporting to the Lead Service Reliability Engineer, the Service Reliability Engineer is part of an enablement team that provides expertise and support to specialist teams designing, developing and running customer-facing products as well as internal systems. The Service Reliability Engineer will, on a day-to-day basis, be responsible for the observability of Creditsafe's technology estate and will be involved in the monitoring and escalation of events. A large part of the role will involve improving the monitoring system and processes. This role will involve integrating AI capabilities to reduce noise and improve incident mean time to repair.
Role objectives:
- Ensure our products are ready for life in production
- Embed reliability, observability, and supportability as features across the lifecycle of solution development
- Help to guide our engineering team's transformation
- Raise the bar for engineering quality
- Deliver higher service availability
- Improve Creditsafe's monitoring capabilities utilizing AI technologies
Personal qualities:
- Trustworthy and quick thinking
- Optimistic and resilient; breed positivity and don't give up on the "right thing"
- Leadership and negotiation; sell, not tell; build support and consensus
- Creativity and high standards; develop imaginative solutions without cutting corners
- Fully rounded; experience of dev, support, security, ops, architecture and sales
As a Service Reliability Engineer, you should have:
- A track record of troubleshooting and resolving issues in live production environments and implementing strategies to eliminate them
- Experience in a technical operations support role
- Demonstrable knowledge of AWS CloudWatch, including creating dashboards, metrics, and log analytics
- Knowledge of one or more high-level programming languages such as Python, Node, or C#, and shell scripting experience
- Proactive monitoring and alert validation: monitor critical infrastructure and services; validate alerts by analyzing logs, performance metrics, and historical data to reduce false positives
- Incident response and troubleshooting: perform troubleshooting; escalate unresolved issues to appropriate technical teams; actively participate in incident management and communication
- Knowledge of AI/ML frameworks and tools for building operational intelligence solutions and automating repetitive SRE tasks
- Continuous improvement: improvement of monitoring solutions, reduction of alert noise and implementation of AI technologies, including predictive analytics for system health, automated root cause analysis, intelligent alert correlation to reduce noise and false positives, and hands-on experience with AI-powered monitoring solutions for anomaly detection and automated incident response
- Strong ability and enthusiasm to learn new technologies in a short time, particularly emerging AI/ML technologies in the DevOps, platform and SRE space
- Proficiency in container-based environments, including Docker and Amazon ECS
- Experience of automating infrastructure using "as code" tooling
- Strong OS skills, Windows and Linux
- Understanding of relational and NoSQL databases
- Experience in a hybrid cloud-based infrastructure
- Understanding of infrastructure services including DNS, DHCP, LDAP, virtualization, server monitoring, and cloud services (Azure and AWS)
- Knowledge of continuous integration and continuous delivery, testing methodologies, TDD and agile development methodologies
- Experience using CI/CD technologies such as Terraform and Azure DevOps Pipelines
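The alert-validation and noise-reduction work this role describes often starts with a simple statistical baseline rather than a static threshold. A minimal sketch, assuming a rolling z-score over recent samples (the window size and threshold are illustrative defaults, not Creditsafe's configuration):

```python
from collections import deque
from statistics import mean, pstdev

class AnomalyDetector:
    """Flag a metric sample only when it deviates sharply from recent
    history, instead of alerting on every static-threshold breach."""

    def __init__(self, window=30, threshold=3.0):
        self.history = deque(maxlen=window)
        self.threshold = threshold

    def is_anomalous(self, value):
        if len(self.history) >= 5:  # need a few samples as a baseline
            mu, sd = mean(self.history), pstdev(self.history)
            anomalous = sd > 0 and abs(value - mu) / sd > self.threshold
        else:
            anomalous = False
        self.history.append(value)
        return anomalous
```

Production AIOps tooling typically replaces the mean/standard deviation with robust statistics (median/MAD) or learned seasonal baselines, since a single spike entering the window inflates the variance and can mask follow-on anomalies.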
Posted 1 day ago
4.0 - 7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Transport is at the core of modern society. Imagine using your expertise to shape sustainable transport and infrastructure solutions for the future. If you seek to make a difference on a global scale, working with next-gen technologies and the sharpest collaborative teams, then we could be a perfect match.
Job Description
You are an experienced software engineer with a minimum of 4-7 years of experience in software development using Microsoft .NET technologies.
Required Technical Competencies
- At least 3 years of experience in .NET Core
- Technically very skilled in C# and related frameworks and libraries
- Experience with SQL and NoSQL databases
- Proficiency in cloud and microservices architectures; understanding of domain-driven design principles and event-driven communication
- REST APIs/gRPC
- Hands-on experience with containerization (Docker) and orchestration tools (Kubernetes) is a plus
Preferred Qualifications
- Certifications: Microsoft certifications in .NET Core, Azure Kubernetes, or similar are a plus
- Agile methodologies: familiarity with Agile and Scrum development methodologies
- Familiarity with web-based front-end technologies, such as SAP Fiori/UI5, is a plus
You strive to improve and develop yourself, the team and the product. You are organized, self-driven and take ownership of the tasks given to you. You communicate well, both when it comes to technical details as well as business needs. You have analytical skills, a problem-solving mindset, and you work at ease in an environment which is continuously evolving. We value your data privacy and therefore do not accept applications via mail.
Who We Are And What We Believe In
We are committed to shaping the future landscape of efficient, safe, and sustainable transport solutions. Fulfilling our mission creates countless career opportunities for talents across the group's leading brands and entities. Applying to this job offers you the opportunity to join Volvo Group. Every day, you will be working with some of the sharpest and most creative brains in our field to be able to leave our society in better shape for the next generation. We are passionate about what we do, and we thrive on teamwork. We are almost 100,000 people united around the world by a culture of care, inclusiveness, and empowerment. Group Digital & IT is the hub for digital development within Volvo Group. Imagine yourself working with cutting-edge technologies in a global team, represented in more than 30 countries. We are dedicated to leading the way of tomorrow's transport solutions, guided by a strong customer mindset and high level of curiosity, both as individuals and as a team. Here, you will thrive in your career in an environment where your voice is heard and your ideas matter.
Posted 1 day ago
4.0 years
0 Lacs
Andhra Pradesh, India
On-site
A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. As part of our Analytics and Insights Consumption team, you'll analyze data to drive useful insights for clients to address core business issues or to drive strategic outcomes. You'll use visualization, statistical and analytics models, AI/ML techniques, ModelOps and other techniques to develop these insights.
Years of Experience: Candidates with 4+ years of hands-on experience
Required Skills
- Familiarity with the Conversational AI domain, conversational design and implementation, customer experience metrics, and industry-specific challenges
- Understanding of conversational (chat, email and call) data and its preprocessing (including feature engineering if required) to train Conversational AI systems
- Strong problem-solving and analytical skills to troubleshoot and optimize conversational AI systems
- Familiarity with NLP/NLG techniques such as part-of-speech tagging, lemmatization, canonicalization, Word2vec, sentiment analysis, topic modeling, and text classification
- NLP and NLU verticals expertise: Text to Speech (TTS), Speech to Text (STT), SSML modeling, intent analytics, proactive outreach orchestration, omnichannel AI and IVR (incl. testing), intelligent agent assist, Contact Center as a Service (CCaaS), and modern data for Conversational AI and Generative AI
- Experience building chatbots using bot frameworks like RASA, LUIS, Dialogflow, Lex, etc., and building NLU model pipelines using feature extraction, entity extraction, intent classification, etc.
- Understanding of, and experience with, cloud platforms (e.g., AWS, Azure, Google Cloud, Omilia Cloud Platform, Kore.ai, OneReach.ai, NICE, Salesforce, etc.) and their services for building Conversational AI solutions for clients
- Expertise in Python or PySpark; experience with R and JavaScript frameworks
- Expertise in visualization tools such as Power BI, Tableau, QlikView, Spotfire, etc.
- Experience with evaluating and improving conversational AI system performance through metrics and user feedback
- Excellent communication and collaboration skills to work effectively with cross-functional teams and stakeholders
- Proven track record of successfully delivering conversational AI projects on time
- Familiarity with Agile development methodologies and version control systems
- Ability to stay updated with the latest advancements and trends in conversational AI technologies
- Strong strategic thinking and ability to align conversational AI initiatives with business goals
- Knowledge of regulatory and compliance requirements related to conversational AI applications
- Experience in the telecom industry or a similar field
- Familiarity with customer service operations and CRM systems
Nice To Have
- Familiarity with data wrangling tools such as Alteryx, Excel and relational storage (SQL)
- ML modeling skills: experience in various statistical techniques such as regression, time series forecasting, classification, XGBoost, clustering, neural networks, simulation modelling, etc.
- Experience in survey analytics and organizational functions such as pricing, sales, marketing, operations, customer insights, etc.
- Understanding of NoSQL databases (e.g., MongoDB, Cassandra) for handling unstructured and semi-structured data
- Good communication and presentation skills
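As an illustration of the intent-classification step in the NLU pipelines this posting mentions, here is a toy bag-of-words classifier. Real systems would use RASA, Dialogflow, or similar; the utterances and intent names below are invented:

```python
from collections import Counter

def train_intents(examples):
    """examples: (utterance, intent) pairs.  Builds a word-count profile
    per intent -- a toy stand-in for a trained NLU model."""
    profiles = {}
    for text, intent in examples:
        profiles.setdefault(intent, Counter()).update(text.lower().split())
    return profiles

def classify(profiles, utterance):
    """Score each intent by normalized word overlap; return the best one."""
    tokens = Counter(utterance.lower().split())
    def score(intent):
        prof, total = profiles[intent], sum(profiles[intent].values())
        return sum(n * prof[tok] / total for tok, n in tokens.items())
    return max(profiles, key=score)

examples = [
    ("what is my account balance", "check_balance"),
    ("show balance please", "check_balance"),
    ("transfer money to savings", "transfer_funds"),
    ("send money now", "transfer_funds"),
]
profiles = train_intents(examples)
```

A production pipeline would add the feature-extraction and entity-extraction stages around this step, plus confidence thresholds so out-of-scope utterances fall back to a human agent.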
Posted 1 day ago
6.0 - 9.0 years
0 Lacs
Andhra Pradesh, India
On-site
At PwC, our people in software and product innovation focus on developing cutting-edge software solutions and driving product innovation to meet the evolving needs of clients. These individuals combine technical experience with creative thinking to deliver innovative software products and solutions. Those in software engineering at PwC will focus on developing innovative software solutions to drive digital transformation and enhance business performance. In this field, you will use your knowledge to design, code, and test cutting-edge applications that revolutionise industries and deliver exceptional user experiences. Responsibilities The Back End Developer resource will have the following responsibilities as it relates to their workstream: Developing and implementing low-latency, highly performant server-side components by writing efficient, reusable, and maintainable code; Collaborating with team members to contribute to improvements in process and infrastructure; Leveraging available tools/systems to proactively identify and resolve defects; Implementing effective security protocols and data protection measures; and, Working with frontend developers on the integration of application elements. 
Skills Required
- Experience of 6-9 years, with at least 3 years of relevant experience working as a back-end developer on enterprise applications
- Expert-level Node.js and TypeScript experience with components, classes and associated technologies
- Good knowledge of popular Node libraries/components in the open-source community
- Able to brainstorm and collaborate with architects, front-end developers, and product team members to come up with practical application solutions
- Follows and implements the latest coding trends, tricks/hacks and best practices, and conveys them to the other team members
- Excellent verbal and written communication skills; able to interact professionally with a diverse group of people
- Experience working with microservice architecture
- Familiarity with integrating external APIs and SDKs
- Strong skills writing testable and maintainable code, with a strong foundation in unit testing
- A good understanding of asynchronous programming
- Proficient understanding of Git
- Experience with both RDBMS and NoSQL databases
Preferred Skills
- Node.js, TypeScript, JavaScript, Express.js, NestJS
- Couchbase, PostgreSQL, MongoDB, Redis
- Jest, Yarn, NPM
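The "good understanding of asynchronous programming" this posting asks for comes down to overlapping independent I/O waits instead of serializing them. The role is Node.js-centric, but the pattern is language-agnostic; a minimal sketch in Python's asyncio (the fetch names and delays are invented):

```python
import asyncio, time

async def fetch(name, delay):
    """Stand-in for a non-blocking I/O call (a DB query or API request)."""
    await asyncio.sleep(delay)
    return name

async def main():
    start = time.perf_counter()
    # Awaiting both concurrently: wall time ~= max(delay), not the sum.
    results = await asyncio.gather(fetch("users", 0.1), fetch("orders", 0.1))
    return results, time.perf_counter() - start

results, elapsed = asyncio.run(main())
```

The equivalent in Node.js is `await Promise.all([...])`; in both runtimes the key insight is that the event loop interleaves waits rather than blocking a thread per request.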
Posted 1 day ago
0 years
0 Lacs
Chandigarh, India
On-site
We are seeking a skilled C Developer to work on Linux-based applications, focusing on designing, developing, and optimizing system-level and embedded applications. The ideal candidate will have strong proficiency in C programming, experience in Linux environments, and a nice-to-have background in Point of Sale (POS) applications.
Responsibilities
- Develop and maintain C-based applications for Linux environments, ensuring high performance and reliability
- Work with system-level APIs, inter-process communication (IPC), multi-threading, and memory management
- Optimize Linux system performance and troubleshoot low-level issues
- Integrate and enhance POS systems (if applicable) with custom business logic and security features
- Work with device drivers, hardware integration, and embedded systems if required
- Collaborate with cross-functional teams, including product managers, testers, and other engineers
- Perform code reviews, debugging, and performance tuning to improve application efficiency
- Ensure compliance with security standards and industry best practices
Requirements
- Strong proficiency in C programming with experience in system-level programming on Linux
- Experience with Linux internals, shell scripting, and debugging tools (GDB, Valgrind, strace, etc.)
- Knowledge of multi-threading, IPC (pipes, message queues, shared memory), and network programming
- Familiarity with SQL or NoSQL databases for data storage and retrieval
- Experience with Makefiles, CMake, and version control systems (Git, SVN, etc.)
- Knowledge of POS-based application development (optional but preferred)
- Ability to troubleshoot performance bottlenecks and security vulnerabilities
This job was posted by Kalpana Choudhary from Antier Solutions.
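The producer/consumer messaging pattern behind the multi-threading and IPC requirements above is language-agnostic. A minimal sketch using a thread-safe queue, written in Python for brevity (in C this role would use pthreads with a POSIX message queue or pipe; the workload here is invented):

```python
import queue
import threading

def worker(tasks, results):
    """Consumer thread: drain the task queue, push results back."""
    while True:
        item = tasks.get()
        if item is None:          # sentinel value: shut down cleanly
            break
        results.put(item * item)  # stand-in for real work
        tasks.task_done()

tasks, results = queue.Queue(), queue.Queue()
t = threading.Thread(target=worker, args=(tasks, results))
t.start()
for n in range(5):
    tasks.put(n)
tasks.put(None)                   # tell the worker to exit
t.join()
squares = sorted(results.queue)
```

The sentinel-based shutdown shown here mirrors the clean-termination discipline expected when managing pthreads and message queues in C.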
Posted 1 day ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Description
Alimentation Couche-Tard Inc. (ACT) is a global Fortune 200 company. A leader in the convenience store and fuel space with over 17,000 stores in 31 countries, serving more than 6 million customers each day. It is an exciting time to be a part of the growing Data Engineering team at Circle K. We are driving a well-supported cloud-first strategy to unlock the power of data across the company and help teams to discover, value and act on insights from data across the globe. With our strong data pipeline, this position will play a key role partnering with our Technical Development stakeholders to enable analytics for long-term success.
About The Role
We are looking for a Data Engineer with a collaborative, "can-do" attitude who is committed and strives with determination and motivation to make their team successful. A Data Engineer who has experience implementing technical solutions as part of a greater data transformation strategy. This role is responsible for hands-on sourcing, manipulation, and delivery of data from enterprise business systems to the data lake and data warehouse. This role will help drive Circle K's next phase in the digital journey by transforming data to achieve actionable business outcomes.
Roles and Responsibilities
- Collaborate with business stakeholders and other technical team members to acquire and migrate data sources that are most relevant to business needs and goals
- Demonstrate technical and domain knowledge of relational and non-relational databases, data warehouses, and data lakes, among other structured and unstructured storage options
- Determine solutions that are best suited to develop a pipeline for a particular data source
- Develop data flow pipelines to extract, transform, and load data from various data sources in various forms, including custom ETL pipelines that enable model and product development
- Be efficient in ELT/ETL development using Azure cloud services and Snowflake, including testing and operational support (RCA, monitoring, maintenance)
- Work with modern data platforms including Snowflake to develop, test, and operationalize data pipelines for scalable analytics delivery
- Provide clear documentation for delivered solutions and processes, integrating documentation with the appropriate corporate stakeholders
- Identify and implement internal process improvements for data management (automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability)
- Stay current with and adopt new tools and applications to ensure high-quality and efficient solutions
- Build a cross-platform data strategy to aggregate multiple sources and process development datasets
- Be proactive in stakeholder communication; mentor/guide junior resources through regular KT/reverse KT, help them identify production bugs/issues as needed, and provide resolution recommendations
Job Requirements
- Bachelor's degree in Computer Engineering, Computer Science or a related discipline; Master's degree preferred
- 3+ years of ETL design, development, and performance tuning using ETL tools such as SSIS/ADF in a multi-dimensional data warehousing environment
- 3+ years of experience with setting up and operating data pipelines using Python or SQL
- 3+ years of advanced SQL programming: PL/SQL, T-SQL
- 3+ years of experience working with Snowflake, including Snowflake SQL, data modeling, and performance optimization
- Strong hands-on experience with cloud data platforms such as Azure Synapse and Snowflake for building data pipelines and analytics workloads
- 3+ years of strong and extensive hands-on experience in Azure, preferably in data-heavy/analytics applications leveraging relational and NoSQL databases, data warehouses and big data
- 3+ years of experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Databricks/Spark, Azure SQL DW/Synapse, and Azure Functions
- 3+ years of experience in defining and enabling data quality standards for auditing and monitoring
- Strong analytical abilities and strong intellectual curiosity
- In-depth knowledge of relational database design, data warehousing and dimensional data modeling concepts
- Understanding of REST and good API design
- Experience working with Apache Iceberg, Delta tables and distributed computing frameworks
- Strong collaboration and teamwork skills; excellent written and verbal communication skills
- Self-starter, motivated, with the ability to work in a fast-paced development environment
- Agile experience highly desirable
- Proficiency in the development environment, including IDE, database server, Git, continuous integration, unit-testing tools, and defect management tools
Preferred Skills
- Strong knowledge of data engineering concepts (data pipeline creation, data warehousing, data marts/cubes, data reconciliation and audit, data management)
- Strong working knowledge of Snowflake, including warehouse management, Snowflake SQL, and data sharing techniques
- Experience building pipelines that source from or deliver data into Snowflake in combination with tools like ADF and Databricks
- Working knowledge of DevOps processes (CI/CD), Git/Jenkins version control tools, Master Data Management (MDM) and data quality tools
- Strong experience in ETL/ELT development, QA and operation/support processes (RCA of production issues, code/data fix strategy, monitoring and maintenance)
- Hands-on experience with databases (Azure SQL DB, MySQL, Cosmos DB, etc.), file systems (Blob Storage), and Python/Unix shell scripting
- ADF, Databricks and Azure certification is a plus
Technologies we use: Databricks, Azure SQL DW/Synapse, Azure Tabular, Azure Data Factory, Azure Functions, Azure Containers, Docker, DevOps, Python, PySpark, scripting (PowerShell, Bash), Git, Terraform, Power BI, Snowflake
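The extract-transform-load flow at the heart of this role can be sketched in a few lines. A toy, stdlib-only illustration with invented data, where an in-memory SQLite database stands in for the warehouse (a real pipeline here would use ADF/Databricks targeting Snowflake):

```python
import csv
import io
import sqlite3

RAW = """store_id,sales,region
1, 1200 ,North
2,not_available,South
3,950,North
"""

def extract(text):
    """Extract: parse the raw feed (a file or API payload in practice)."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: coerce types and drop unparseable rows (a toy DQ rule)."""
    clean = []
    for r in rows:
        try:
            clean.append((int(r["store_id"]), float(r["sales"]), r["region"]))
        except ValueError:
            continue  # a real pipeline would quarantine these for audit
    return clean

def load(rows):
    """Load: write into the warehouse (in-memory SQLite stands in here)."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE sales (store_id INT, sales REAL, region TEXT)")
    con.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
    return con

con = load(transform(extract(RAW)))
total_north = con.execute(
    "SELECT SUM(sales) FROM sales WHERE region = 'North'").fetchone()[0]
```

The dropped `not_available` row is the kind of record the data quality standards bullet is about: production pipelines route such rows to an audit table rather than silently discarding them.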
Posted 1 day ago
4.0 - 8.0 years
6 - 10 Lacs
Mumbai
Work from Office
JOB SUMMARY
This position leads various data science teams and advocates best practices around the development and implementation of advanced analytic systems and predictive and prescriptive models. This position works with a team of data scientists, data analysts, data engineers, machine learning engineers, business and data domain owners, application developers, and architects in the creation and delivery of insights from large and disparate data to empower confidence in business decisions. This position leads the evaluation and adoption of emerging technologies that support the use of statistical modeling, machine learning, distributed computing, and runtime performance tuning with the goal of deploying optimal processes and introducing new products and services to the market. This position supports senior leadership by planning and championing the execution of broad advanced analytics initiatives aimed at delivering value to internal and external stakeholders. This position may manage people within the department.
RESPONSIBILITIES
Leads and oversees the data analysts, data scientist team, machine learning engineers, and big data specialists in the implementation of models and systems that provide optimal results, as well as scaling and evolving the solutions to meet future business needs. Acts as subject matter expert on UPS business processes, data, and advanced analytics capabilities to scope problems, data and model requirements, and proven predictive and prescriptive techniques. Maintains broad understanding of implementation, integration, and inter-connectivity issues with emerging technologies to define strategies that support the creation, development and delivery of analytic solutions that meet business needs. Develops and prototypes algorithms to ensure analytic results satisfy problem statements and business needs.
Interprets and analyzes large-scale datasets to discover insights to support the build of analytic systems and predictive models, as well as experimenting with new and emerging models and techniques. Identifies and evaluates emerging and cutting-edge open-source data science/machine learning libraries, data platforms, and vendor solutions to support the conception, planning, and prioritization of data projects across the enterprise. Provides thought leadership, technical guidance, and counsel for data science project teams to evaluate strategic alternatives, determine impact, recommend courses of action, and design and implement solutions. Champions best practices for the adoption of Cloud-AI technologies, open-source software, machine learning libraries/packages, and data science platforms to derive useful information and insights that empower business decisions. Communicates with business customers and the senior leadership team with various levels of technical knowledge, educates them about our systems, and shares insights and recommendations that can inform business strategies. Manages analytics projects/teams and serves as a point of contact for teams when multiple units are assigned to the same project to ensure team actions remain in synergy while communicating with stakeholders to keep the project aligned with goals.
QUALIFICATIONS
Requirements: Ability to engage key business and executive-level stakeholders to translate business problems into a high-level analytics solution approach. Multiple years of experience working with large-scale, complex datasets to create machine learning, predictive, forecasting, and/or optimization models. Demonstrable track record of dealing well with ambiguity, prioritizing needs, and delivering results in a dynamic environment. Expertise in data management pipelines involving data extraction, analysis and transformation using data querying languages (e.g. SQL, NoSQL, BQ), scripting languages (e.g. Python, R) and/or statistical/mathematical software (e.g. R, Matlab, SAS). Hands-on experience in launching moderate- to large-scale advanced analytics projects in production at scale; adapts available Cloud-AI technologies and machine learning frameworks with or without the use of enterprise data science platforms. Proven ability to convey rigorous technical concepts and considerations to non-experts, and strong analytical skills with attention to detail. Direct experience in developing analytical solutions that empower business decisions and product creation using a varied set of techniques (e.g. supervised, unsupervised, deep learning, NLP). Excellent verbal and written communication skills, with the ability to communicate data through a story framework, convey data-driven results to technical and non-technical audiences, and effectively advocate technical solutions to research scientists and engineering teams as well as business audiences. Master's degree in a quantitative field of mathematics, computer science, physics, economics, engineering, statistics (operations research, quantitative social science, etc.), international equivalent, or equivalent job experience.
Posted 1 day ago
8.0 - 10.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Role Description: As a Senior Technical Lead - Front End - React, you will be responsible for developing user interfaces using ReactJS. You will be expected to have a strong understanding of HTML, CSS, JavaScript, and ReactJS. You should also have experience working with state management libraries such as Redux and MobX.
Roles & Responsibilities:
• Strong proficiency in JavaScript, including DOM manipulation and the JavaScript object model
• Thorough understanding of React.js, its core principles such as Hooks and the component lifecycle, and workflows such as Flux/Redux
• Familiar with writing test cases and providing thorough test coverage
• Familiar with newer ECMAScript specifications, along with Bootstrap, HTML, and CSS
• Experience in designing RESTful APIs
• Hands-on with design patterns, error/exception handling, and resource management
• Exposure to DevOps, associated CI/CD tools, and code versioning tools like Git
• Knowledge of modern authorization mechanisms such as JSON Web Tokens
• Experience working with various data stores, SQL or NoSQL
• Decent knowledge of OOP concepts
Technical Skills Requirements:
• Strong proficiency in React.js and JavaScript.
• Experience in front-end web development using HTML, CSS, and JavaScript frameworks.
• Knowledge of web design principles and web accessibility standards.
• Familiarity with the software development life cycle (SDLC) and agile methodologies.
• Must have excellent communication skills and be able to communicate complex technical information to non-technical stakeholders in a clear and concise manner.
• Must understand the company's long-term vision and align with it.
• Should be open to new ideas and be willing to learn and develop new skills.
• Should also be able to work well under pressure and manage multiple tasks and priorities.
Qualifications:
• 8-10 years of work experience in a relevant field
• B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred.
Posted 1 day ago
4.0 - 9.0 years
20 - 35 Lacs
Bengaluru
Hybrid
Responsibilities
Build services and a platform that power TD AI.
Understand, follow, and contribute to TD AI's standards for the software development life cycle.
Work closely with the product and tech teams to deliver efficient, maintainable, and unit-tested software.
Collaborate with product and business teams to understand customer journeys and solve problems that help TD AI grow.
Lead microservice-based architecture and design of new and existing systems, with a good record of estimated vs. actual delivery time, and work on cloud infrastructures such as GCP and AWS.
Drive product and design discussions.
Elevate the skills of the team through technical mentorship.
Own a functional area from product conception to delivery, including documentation, code quality, and unit-test coverage.
Participate in code reviews, write tech specs, and collaborate closely with other people.
Requirements
Strong problem-solving skills and a fast learner.
Good knowledge of data structures and algorithms.
At least 3+ years of experience working in a product company.
At least 4+ years of experience in back-end development.
Proficient in at least one programming language such as Java, C#, Python, or Go.
Experience working with different kinds of data stores (SQL, NoSQL, etc.) and queuing systems.
Extensive experience in the development of backend architecture, system designs, and the design of distributed systems and microservices.
Experience with application logging and monitoring systems such as Prometheus.
Should know how to improve the performance of backend systems.
Good grasp of the fundamentals of computer science and web development.
A team player, able to give and receive constructive feedback.
A passion for learning new technologies to deliver exceptional products.
Benefits
Work with a world-class team on a very forward-looking problem.
Competitive salaries.
Medical/health insurance for self and family.
A reward system that recognizes your hard work and achievements.
Surprise gifts whenever you perform exceptionally.
Open culture.
Work accessories and a great work location.
Hybrid mode and flexible timings.
Good snacks, games, and a cool office to make you feel at home.
Frequent office parties to build bonds between colleagues.
Events and hackathons with exciting prizes.
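Since the requirements above name Prometheus-style application monitoring, here is a minimal stdlib sketch of the underlying idea: labeled counters rendered in the Prometheus text exposition format. In a real service you would use the official prometheus_client library; the metric and label names below are hypothetical.

```python
from collections import Counter

# In-memory stand-in for a Prometheus counter with (method, status) labels.
requests = Counter()

def record_request(method: str, status: int) -> None:
    """Increment the counter for one (method, status) label combination."""
    requests[(method, status)] += 1

def render_metrics() -> str:
    """Render the counters in Prometheus text exposition format."""
    lines = ["# TYPE http_requests_total counter"]
    for (method, status), count in sorted(requests.items()):
        lines.append(
            f'http_requests_total{{method="{method}",status="{status}"}} {count}'
        )
    return "\n".join(lines)

record_request("GET", 200)
record_request("GET", 200)
record_request("POST", 500)
metrics = render_metrics()
```

A real exporter would serve this text on an HTTP endpoint for the Prometheus server to scrape; the exposition format itself is what the sketch shows.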
Posted 1 day ago
3.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Title: SQL Database Administrator (DBA)
Experience: 3+ years
Location: Noida (Work from Office)
Requirements & Responsibilities:
Minimum 3 years of experience as a DBA or in a related role.
Proficient in database technologies, with a specific understanding of RDBMSs such as PostgreSQL and MySQL and NoSQL data stores such as HBase and MongoDB.
Install, configure, and maintain the performance of both relational and non-relational database servers.
Develop and implement processes for optimizing database security and integrity.
Set and enforce database standards, policies, and procedures.
Manage database access and permissions.
Perform regular database backups and ensure data restoration strategies are in place.
Monitor and tune database performance for high availability and efficiency.
Install, upgrade, and manage database tools and applications.
Diagnose and troubleshoot database issues and performance bottlenecks.
Research, recommend, and implement emerging database technologies.
Create and maintain database reports, dashboards, and visualizations.
Automate recurring database tasks to improve operational efficiency.
Strong command of SQL and familiarity with SQL Server tools.
Advanced knowledge of database security, backup and recovery techniques, and performance monitoring standards.
Sound understanding of relational and dimensional data modeling.
Proficient in PowerShell and Unix shell scripting.
Impeccable attention to detail.
Qualification: Bachelor's degree in Computer Science, Information Technology, or a related field (B.E./B.Tech/MCA)
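The "automate recurring database backups" duty above can be sketched as follows. This uses SQLite's online-backup API purely so the example is self-contained and runnable; for the PostgreSQL/MySQL servers the posting names, the same automation would typically wrap pg_dump or mysqldump on a schedule. The table name and rows are invented for illustration.

```python
import sqlite3

def backup_database(src: sqlite3.Connection, dest_path: str) -> int:
    """Copy a live database to dest_path and return the row count found
    in the backup, as a cheap restoration sanity check."""
    dest = sqlite3.connect(dest_path)
    src.backup(dest)  # consistent online copy, safe while src is in use
    count = dest.execute("SELECT COUNT(*) FROM users").fetchone()[0]
    dest.close()
    return count

# Demo: build a small source database, then back it up and verify.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO users (name) VALUES (?)",
                 [("asha",), ("ravi",), ("meera",)])
conn.commit()

# A scheduled job would write to a timestamped file path rather than
# another in-memory database as done here for the demo.
backed_up = backup_database(conn, ":memory:")
```

Verifying the backup immediately after taking it, as the sanity check does, is the part worth automating along with the copy itself: an unverified backup is not a restoration strategy.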
Posted 1 day ago
2.0 - 7.0 years
4 - 7 Lacs
Mumbai
Work from Office
Responsibilities: Build scalable APIs with Node.js/Express. Deploy & manage AWS infrastructure. Develop scalable Node.js apps using Express, Firestore & Docker. Design databases & optimize performance. 2+ years experience required.
Posted 1 day ago
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Proven experience as a Full Stack Developer, with at least 8+ years of experience in healthcare software development.
Strong proficiency in front-end technologies such as .NET, HTML, and CSS, and modern frameworks/libraries (e.g., React, Angular, Vue.js).
Solid understanding of back-end technologies such as Node.js and Python, and experience with server-side frameworks (e.g., Express, Spring Boot).
Experience with relational databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra) in healthcare applications.
Strong knowledge of algorithms, design patterns, and fundamental object-oriented concepts.
Familiarity with cloud platforms such as AWS, Azure, or Google Cloud Platform, and experience deploying and managing applications in cloud environments.
Knowledge of healthcare standards and regulations, including HIPAA, HL7, and FHIR, and an understanding of interoperability and data exchange protocols.
Strong problem-solving and analytical skills, with the ability to "roll up your sleeves" and work with stakeholders to create timely solutions and resolutions.
Excellent problem-solving skills, attention to detail, and the ability to work effectively in a fast-paced environment.
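To illustrate the FHIR interoperability knowledge mentioned above, here is a small sketch that extracts a display name from a FHIR R4 Patient resource. The resource below is a minimal hand-written example, not real patient data, and production systems would validate against the full FHIR specification (typically via a dedicated FHIR library) rather than using raw dict access as done here.

```python
import json

# Minimal hand-written FHIR R4 Patient resource (illustrative only).
patient_json = """
{
  "resourceType": "Patient",
  "id": "example",
  "name": [{"use": "official", "family": "Iyer", "given": ["Kavya"]}],
  "birthDate": "1987-02-14"
}
"""

def display_name(resource: dict) -> str:
    """Return 'Given Family' from the first name entry of a Patient."""
    if resource.get("resourceType") != "Patient":
        raise ValueError("expected a Patient resource")
    name = resource["name"][0]
    return " ".join(name.get("given", []) + [name.get("family", "")])

patient = json.loads(patient_json)
label = display_name(patient)
```

Note the shape of the `name` element: FHIR models it as a list of HumanName entries, with `given` itself a list, which is why naive "firstName/lastName" assumptions break on real FHIR payloads.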
Posted 1 day ago
8.0 - 12.0 years
30 - 35 Lacs
Pune
Work from Office
Role Description
Our team is part of the area Technology, Data, and Innovation (TDI) Private Bank. Within TDI, Partner data is the central client reference data system in Germany. As a core banking system, many banking processes and applications are integrated with it and communicate via roughly 2k interfaces. From a technical perspective, we focus on the mainframe but also build solutions on on-premise cloud, RESTful services, and an Angular frontend. Next to maintenance and the implementation of new CTB requirements, the content focus also lies on the regulatory and tax topics surrounding a partner/client. We are looking for a highly motivated candidate for the Cloud Data Engineer area.
Your key responsibilities
You are responsible for the implementation of the new project on GCP (Spark, Dataproc, Dataflow, BigQuery, Terraform, etc.) across the whole SDLC chain.
You support the migration of current functionalities to Google Cloud.
You are responsible for the stability of the application landscape and support software releases.
You also support L3 topics and application governance.
You are responsible in the CTM area for coding as part of an agile team (Java, Scala, Spring Boot).
Your skills and experience
You have experience with databases (BigQuery, Cloud SQL, NoSQL, Hive, etc.) and development, preferably with Big Data and GCP technologies.
Strong understanding of the Data Mesh approach and integration patterns.
Understanding of Party data and integration with Product data.
Your architectural skills for big data solutions, especially interface architecture, allow a fast start.
You have experience with at least: Spark, Java, Scala, Python, Maven, Artifactory, the Hadoop ecosystem, GitHub Actions, GitHub, and Terraform scripting.
You have knowledge of customer reference data, customer opening processes, and preferably regulatory topics around know-your-customer processes.
You work very well in teams but also independently, and you are constructive and target-oriented.
Your English skills are good and you can communicate both professionally and informally in small talk with the team.
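The responsibilities above center on GCP data pipelines (Dataflow, BigQuery). As a hedged, plain-Python stand-in for one such pipeline step, the sketch below groups partner reference records by country and counts them, the shape a `GROUP BY country` BigQuery query or a Beam aggregation would take. Record fields and values are invented for illustration.

```python
from collections import defaultdict

# Invented partner reference records standing in for pipeline input.
partners = [
    {"partner_id": "P1", "country": "DE"},
    {"partner_id": "P2", "country": "DE"},
    {"partner_id": "P3", "country": "FR"},
]

def count_by_country(records):
    """SELECT country, COUNT(*) ... GROUP BY country, as plain Python."""
    counts = defaultdict(int)
    for rec in records:
        counts[rec["country"]] += 1
    return dict(counts)

by_country = count_by_country(partners)
```

In the actual stack this aggregation would be pushed down to BigQuery or expressed as a Dataflow transform rather than materialized in application memory; the sketch only shows the logical operation.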
Posted 1 day ago
NoSQL is a rapidly growing field in India with plenty of job opportunities for skilled professionals. Companies across various industries are increasingly adopting NoSQL databases to handle massive amounts of data efficiently. If you are a job seeker interested in pursuing a career in NoSQL, here is a guide to help you navigate the job market in India.
These cities are known for their thriving tech industry and have a high demand for NoSQL professionals.
The average salary range for NoSQL professionals in India varies based on experience and expertise. Entry-level positions can expect to earn around INR 4-6 lakhs per annum, while experienced professionals with multiple years of experience can earn upwards of INR 15 lakhs per annum.
Typically, a career in NoSQL progresses as follows:
- Junior Developer
- Developer
- Senior Developer
- Tech Lead
- Architect
With each role, you take on more responsibilities and work on more complex projects.
In addition to NoSQL expertise, other skills that are often expected or helpful in this field include:
- Data modeling
- Database administration
- Cloud computing
- Programming languages such as Java, Python, or JavaScript
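The data modeling skill listed above often comes down to the embed-vs-reference decision in a document store. The sketch below shows both designs with plain Python dicts standing in for documents in a store such as MongoDB; the collection layout and field names are invented for illustration.

```python
# Embedded design: order lines live inside the order document, so a
# single read returns everything (good when lines are always read with
# their order and stay reasonably small).
order_embedded = {
    "_id": "o1001",
    "customer": "c42",
    "lines": [
        {"sku": "A17", "qty": 2, "price": 499.0},
        {"sku": "B03", "qty": 1, "price": 1299.0},
    ],
}

# Referenced design: lines are separate documents keyed by order id
# (better when lines grow large or are updated independently).
order_ref = {"_id": "o1001", "customer": "c42"}
order_lines = [
    {"order_id": "o1001", "sku": "A17", "qty": 2, "price": 499.0},
    {"order_id": "o1001", "sku": "B03", "qty": 1, "price": 1299.0},
]

def order_total(lines):
    """Total of an order's line items, whichever design supplied them."""
    return sum(line["qty"] * line["price"] for line in lines)

total_embedded = order_total(order_embedded["lines"])
total_ref = order_total(
    [l for l in order_lines if l["order_id"] == "o1001"]
)
```

Both designs yield the same answer; interviewers typically probe the read/write patterns that make one of them the better fit.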
Here are 25 interview questions for NoSQL roles to help you prepare:
As you prepare for your journey into the world of NoSQL jobs in India, remember to stay updated on industry trends, continuously upskill yourself, and showcase your expertise confidently during interviews. With determination and dedication, you can land a rewarding career in the dynamic field of NoSQL. Good luck!