878 Job openings at Rarr Technologies
About Rarr Technologies

Rarr Technologies specializes in providing innovative software solutions to optimize business processes and improve efficiency.

Automation Test Engineer + ETL

Hyderabad

8 - 12 years

INR 7.0 - 11.0 Lacs P.A.

Work from Office

Full Time

Responsibilities:
- Maintain and adopt Agile best practices and lifecycles for process workflows (e.g., Kanban, CI/CD).
- Collaborate with business users and analysts to refine and understand both functional and non-functional requirements during SIT & UAT stages.
- Develop automated test scripts to validate functional and technical requirements in the data processing pipeline and perform data quality checks.
- Collaborate with data analysts in profiling data and monitoring data trends.
- Collaborate with Developers/DevOps Engineers on code management, peer review, and continuous integrated testing in CI/CD pipelines.
- Assure quality at different phases of the SDLC by adhering to processes and strategies defined by Eastspring IT.
- Execute manual/automated/exploratory tests and provide QA sign-off to business users for releases.
- Maintain test process, design, and execution artifacts in the test management system, complying with audit regulations.
- Prepare testing traceability reports and other testing metrics.

Qualifications / Experience:
- Recognized degree or higher in Computer Science or related Engineering fields.
- At least 8 years of experience in Test Automation, using test frameworks for Database (ETL Testing) and data analytical testing.
- Working knowledge of testing data management platform tools similar to Golden Source.
- Sound knowledge of Java programming, SQL queries, and the Cucumber (Java) testing framework.
- Good knowledge of testing scheduling/orchestration tools (e.g., Control-M, Azure Data Factory).
- Working knowledge of relational databases; comfortable with testing SQL jobs and stored procedures, with awareness of data security.
- Basic understanding of data quality, profiling, and analytics concepts.
- Working experience with test management tools such as Jira with Xray/Zephyr.
- Working knowledge of tools such as Bitbucket, Jenkins, and Confluence; familiarity with the Git branching model.
- Working experience in Agile projects and the Behavior-Driven Development (BDD) approach to software development and testing.
- Good to have: basic programming knowledge in Python.
- Good to have: knowledge of the Azure cloud platform.
- Good to have: working experience in the Investment Banking or Asset Management industry.

Other Traits:
- Positive attitude and collaborative mindset.
- Willing to work across projects and perform manual/automation/exploratory testing.
- Highly motivated to stay updated with the latest developments in technology and acquire deep technical knowledge and skills.
- Excellent communication, presentation, and interpersonal skills.

Skills: Java, Selenium, Testing, SQL, ETL
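The automated data-quality checks described above often reduce to simple assertions over an extracted table. A minimal sketch in Python, assuming a hypothetical table with a `trade_id` key column (names and thresholds are illustrative, not from the posting):

```python
# Minimal sketch of automated data-quality checks for an ETL pipeline.
# The table, column names, and thresholds are hypothetical examples.

def null_rate(rows, column):
    """Fraction of rows where `column` is missing or None."""
    if not rows:
        return 0.0
    missing = sum(1 for r in rows if r.get(column) is None)
    return missing / len(rows)

def check_quality(rows, key_column, max_null_rate=0.01):
    """Return a dict of check name -> pass/fail for common data-quality rules."""
    keys = [r.get(key_column) for r in rows]
    return {
        "non_empty": len(rows) > 0,                  # extract produced data
        "unique_keys": len(keys) == len(set(keys)),  # key column not duplicated
        "null_rate_ok": null_rate(rows, key_column) <= max_null_rate,
    }

rows = [{"trade_id": 1, "amount": 100.0}, {"trade_id": 2, "amount": None}]
results = check_quality(rows, "trade_id")
```

In practice checks like these run as steps in the CI/CD pipeline after each load, with failures blocking QA sign-off.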

Lead Cloud Engineer

Bengaluru

7 - 10 years

INR 10.0 - 14.0 Lacs P.A.

Work from Office

Full Time

Job Summary: We are seeking an experienced Lead Cloud Engineer to take over an existing approval engine product and deploy it throughout our organization. The ideal candidate will possess a strong technical background and be proficient in managing, maintaining, and growing the solution over time. This role requires a deep understanding of various Azure services, the Power Platform, and TypeScript. The Lead Cloud Engineer will work closely with a dedicated Project Manager to ensure the successful implementation and management of the approval engine.

Key Responsibilities:
- Design, develop, and maintain backend systems using .NET technologies.
- Develop, manage, and optimize Azure Functions using TypeScript.
- Implement and manage decision rules engines for automated decision-making processes.
- Work with Azure Storage Accounts, including Queues, Blob Storage, and Table Storage.
- Design, implement, and troubleshoot Power Automate workflows, including handling HTTP triggers.
- Secure and manage sensitive information using Azure Key Vault.
- Deploy applications and services to Azure using YAML scripts and Azure Deployment Slots.
- Deploy and manage Azure infrastructure using Azure Resource Manager (ARM) templates.
- Utilize Mustache templating for dynamic content generation.
- Understand and leverage PowerApps and Power Automate workflows.
- Ensure the ongoing maintenance and updating of existing software.
- Collaborate closely with team members to develop new features and enhance functionality.
- Write clean, secure, efficient, and well-documented code.
- Troubleshoot and debug issues as they arise.

Required Skills and Experience:
- Back end: C#/.NET, .NET Core, ASP.NET Web API, Dapper, microservices, event-driven architecture/Kafka.
- Databases: relational (MSSQL); non-relational (Table Storage, Blob Storage, Cosmos DB).
- Infrastructure (DevOps): Git repositories, CI/CD pipelines, containerization.
- Infrastructure (Microsoft Azure): developing fully cloud-native apps using serverless architecture.
- Fundamentals: strong understanding of OOP, SOLID principles, development practices, and design patterns.
- Strong understanding of service-oriented architecture/microservices.
- Enjoys producing top-quality code in a fast-moving environment.
- Effective team player; willingness to put the needs of the team over their own.
- Azure Functions in TypeScript: proficiency in creating, managing, and optimizing Azure Functions using TypeScript.
- Decision rules engine: experience implementing and managing decision rules engines for automated decision-making; ability to learn and adapt to new skills on the job.
- Azure Storage Accounts: knowledge of Queues, Blob Storage, and Table Storage.
- Power Automate flows and HTTP triggers: expertise in designing, implementing, and troubleshooting Power Automate workflows; proficiency in handling HTTP triggers.
- Azure Key Vault: experience securing and managing sensitive information.
- Deployment to Azure: proficiency in deploying applications and services using YAML scripts for configuration and automation, and in using Azure Deployment Slots.
- Azure infrastructure deployment: experience deploying and managing Azure infrastructure using ARM templates.
- Mustache templating: proficiency in using Mustache templating for dynamic content generation.
- Power Platform: understanding of how PowerApps work and of Power Automate workflows.
- Excellent English communication skills (written and oral), with good listening capabilities.
- Exceptionally good technical analytical, debugging, and problem-solving skills.
- A reasonable balance between getting the job done and managing technical debt.

Qualifications:
- Minimum of 8 years of experience in a similar role, with at least 3 years in a senior or lead position.
- Strong problem-solving and analytical skills.
- Excellent communication and teamwork abilities.
- Ability to work independently and manage multiple tasks effectively.
- Experience collaborating with a Project Manager to ensure successful project delivery.
- Certification in Azure Solutions Architect or similar credentials.
- Experience in large-scale deployment projects.
- Familiarity with Agile methodologies.

Skills: JavaScript, Azure, .NET Core, TypeScript
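The posting does not describe the approval engine's rules internally; as a rough illustration of the decision-rules-engine pattern it names, here is a minimal first-match rules engine (rule names, fields, and thresholds are all hypothetical):

```python
# Minimal sketch of a decision rules engine for an approval workflow.
# Rule names, request fields, and thresholds are hypothetical examples.

RULES = [
    # (rule name, predicate over the request, decision when it matches)
    ("auto_reject_over_limit", lambda req: req["amount"] > 50_000, "reject"),
    ("manager_approval_needed", lambda req: req["amount"] > 5_000, "escalate"),
    ("auto_approve_small",      lambda req: True,                  "approve"),
]

def decide(request):
    """Return (decision, rule name) for the first rule whose predicate matches."""
    for name, predicate, decision in RULES:
        if predicate(request):
            return decision, name
    return "escalate", "no_rule_matched"  # safe default: route to a human

decision, rule = decide({"amount": 1_200})
```

Keeping rules as data (ordered, named, first-match-wins) makes the decision path auditable, which matters when the engine is deployed organization-wide.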

GS Data Test Engineer

Hyderabad

8 - 13 years

INR 5.0 - 9.0 Lacs P.A.

Work from Office

Full Time

KEY ACCOUNTABILITIES
- Maintain and adopt Agile best practices and lifecycles for process workflows (e.g., Kanban, CI/CD).
- Collaborate with business users and business analysts to refine and understand both functional and non-functional requirements during SIT & UAT stages.
- Develop automated test scripts to validate functional and technical requirements in the data processing pipeline and to perform data quality checks.
- Collaborate with data analysts in profiling data and monitoring data trends.
- Collaborate with Developers/DevOps Engineers on code management, peer review, and continuous integrated testing in CI/CD pipelines.
- Assure quality at different phases of the SDLC by adhering to processes and strategies defined by Eastspring IT.
- Execute manual/automated/exploratory tests and provide QA sign-off to business users for releases.
- Maintain test process, design, and execution artifacts in the test management system, complying with audit regulations.
- Prepare testing traceability reports and other testing metrics.

Skills: ETL Testing, Selenium, Java, SQL, SOS

Senior Python Engineer

Chennai, Bengaluru

4 - 7 years

INR 5.0 - 9.0 Lacs P.A.

Work from Office

Full Time

About Us: We are a fast-growing AI-first SaaS company revolutionizing enterprise IT and HR operations with cutting-edge AI and automation. Our AI-powered solutions optimize workflows, enhance efficiency, and drive digital transformation. Recognized for innovation, we are expanding our engineering team and looking for passionate professionals to build, scale, and optimize AI-driven applications.

Role Overview: We are looking for an experienced Python Engineer specializing in data engineering, LLM API development, and AI-driven applications. If you have expertise in building scalable data pipelines, optimizing AI models, and deploying APIs, this is your opportunity to work on next-generation AI solutions.

Key Responsibilities:
- Develop & optimize data pipelines: build scalable, high-performance pipelines for structured and unstructured datasets.
- AI-driven API development: design, develop, and deploy secure, scalable APIs for LLM-powered solutions in production environments.
- LLM model optimization: work with OpenAI and Hugging Face models to refine AI performance and accuracy.
- Cloud & AI infrastructure: implement MLOps best practices using AWS, Docker, Kubernetes, and vector databases such as Pinecone and FAISS.
- Big data & stream processing: utilize Kafka and Spark Streaming for large-scale AI data processing.
- Security & monitoring: implement Prometheus, Grafana, OAuth, and JWT for secure API integrations.

Must-Have Skills:
- Python programming: expertise in developing robust data pipelines and APIs (FastAPI, Flask, RESTful APIs).
- Data engineering: strong experience with Pandas, PySpark, SQL, and Airflow for handling large datasets.
- Cloud & containerization: hands-on experience with AWS, Docker, and Kubernetes.
- LLM & NLP expertise: familiarity with OpenAI and Hugging Face for prompt optimization.
- API development & deployment: proficiency in FastAPI, Flask, and RESTful APIs for AI-driven applications.

Good to Have:
- MLOps & AI infrastructure: experience with MLflow, Kubeflow, TensorFlow, PyTorch.
- Big data processing: exposure to Kafka, Spark Streaming.
- Monitoring & security: understanding of Prometheus, Grafana, OAuth, JWT.

Who Should Apply?
- AI enthusiasts passionate about AI, machine learning, and LLMs.
- Tech innovators experienced in SaaS, AI, or product-based companies.
- Problem-solvers who thrive in a fast-paced, high-growth environment.

Preferred Industry Experience: AI, SaaS, or product-based companies. Candidates from IT services with strong AI/ML exposure may also be considered.

Selection Process:
1. HR screening & profile shortlisting
2. Technical task evaluation
3. L1 interview with CTO
4. Final evaluation & offer

Apply Now & Be a Part of the AI Revolution!

Skills: Data Engineering, API Development, MLOps, Big Data Processing, Cloud Connector Configuration, Monitoring & Controlling, Python Programming, LLM (Large Language Models), Strong Problem-Solving and Analytical Skills
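The "scalable data pipelines" mentioned above are commonly structured as composable streaming stages; a minimal generator-based sketch in plain Python (the record format and transforms are hypothetical, and real pipelines would use Pandas/PySpark/Airflow as the posting lists):

```python
# Minimal sketch of a composable data pipeline built from Python generators.
# Each stage consumes and yields records lazily, so large inputs stream
# through without being held in memory. Field names are hypothetical.

def extract(raw_lines):
    """Parse raw 'name,score' lines into dicts, skipping blank lines."""
    for line in raw_lines:
        line = line.strip()
        if line:
            name, score = line.split(",")
            yield {"name": name, "score": float(score)}

def transform(records, min_score):
    """Keep records at or above `min_score` and normalize names."""
    for rec in records:
        if rec["score"] >= min_score:
            yield {**rec, "name": rec["name"].title()}

def load(records):
    """Materialize the stream (stand-in for a warehouse or database write)."""
    return list(records)

raw = ["alice,0.91", "", "bob,0.42", "carol,0.77"]
result = load(transform(extract(raw), min_score=0.5))
```

Because every stage is a generator, stages can be swapped or chained without changing their neighbours, which is the same property orchestration tools like Airflow provide at the task level.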

Power BI SCM

Bengaluru

6 - 9 years

INR 8.0 - 11.0 Lacs P.A.

Work from Office

Full Time

Minimum of 6 years of professional experience. Minimum of 4+ years of experience with Power BI and SQL.

Specific Knowledge/Skills:
- Bachelor's/master's degree in computer science or equivalent.
- Fluent in Power BI, DAX, and Power Query.
- Experienced in creating PowerApps and in using patch functions in PowerApps.
- Working knowledge of UI/UX design, in order to implement interactive dashboards.
- Able to form a good narrative (storytelling) using visualization capabilities.
- Good to have: SQL (MSSQL)/Python.
- Good to have: ERP systems experience, such as SAP.
- Good to have: experience in a similar role in supply chain.

PREFERRED QUALIFICATIONS:
- Experience in a large corporate company with complex supply chain processes and multiple inventory locations.
- Excellent communication skills.
- Ability to effectively manage, influence, negotiate, and communicate with internal business partners to meet organizational capacity needs.

Skills: SQL, Power BI, Supply Chain, Power Query, SCM, DAX

Data Scientist / ML Engineer

Bengaluru

5 - 10 years

INR 9.0 - 13.0 Lacs P.A.

Work from Office

Full Time

Overview: We are seeking a talented and innovative Data Scientist to join our growing team. In this role, you will be responsible for applying machine learning and generative AI techniques to solve complex problems, drive business insights, and enhance product offerings. You will collaborate with engineers to build and deploy ML models in microservices architectures, ensuring the solutions are scalable, maintainable, and integrated with APIs.

Key Responsibilities:
- Analyze large structured and unstructured datasets to extract meaningful insights and identify business opportunities.
- Develop, test, and implement machine learning and generative AI models to drive intelligent decision-making and automation.
- Work closely with engineering teams to integrate machine learning models into production systems using microservices architectures.
- Design and develop APIs to enable seamless communication between data models and applications.
- Build and maintain scalable data pipelines to facilitate data collection, transformation, and storage for model training and inference.
- Utilize Python and relevant libraries (e.g., Pandas, NumPy, TensorFlow, PyTorch) to preprocess data, train models, and perform statistical analysis.
- Collaborate with product and business teams to understand requirements and deploy machine learning solutions that meet business needs.
- Perform continuous monitoring, testing, and optimization of deployed models to ensure high performance and reliability.
- Stay updated on the latest trends and advancements in machine learning, generative AI, and related fields to apply cutting-edge techniques in your work.
- Document processes, methodologies, and model outputs for transparency and future improvements.

Required Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, Statistics, or a related field.
- Proven experience in Python programming, including the use of libraries such as Pandas, NumPy, scikit-learn, TensorFlow, PyTorch, or Keras for machine learning.
- Hands-on experience with microservices architecture and containerization technologies like Docker or Kubernetes.
- Experience with building and deploying APIs to enable machine learning models to interact with other systems and applications.
- Strong understanding of machine learning algorithms, including supervised and unsupervised learning, deep learning, and generative AI techniques such as GANs (Generative Adversarial Networks) and language models.
- Ability to work with cloud platforms (AWS, GCP, or Azure) for model deployment and scalability.
- Knowledge of data engineering concepts, including data wrangling, ETL processes, and working with distributed systems.
- Familiarity with modern version control systems (e.g., Git) and agile development practices.
- Strong analytical and problem-solving skills, with the ability to communicate complex technical concepts to non-technical stakeholders.
- Experience with data visualization tools (e.g., Tableau, Power BI, or Matplotlib) is a plus.

Preferred Qualifications:
- Experience with advanced generative AI models, such as transformers (e.g., GPT, BERT).
- Knowledge of DevOps practices and CI/CD pipelines for machine learning deployment.
- Familiarity with the integration of ML models into business applications and customer-facing products.
- Strong communication skills and the ability to collaborate with cross-functional teams, including engineers, product managers, and business stakeholders.

Skills: API, Python, Gen AI, Machine Learning, Microservices
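The posting covers model development broadly rather than any single algorithm; as a self-contained illustration of the train-and-evaluate loop such roles automate, here is a tiny linear-regression fit via batch gradient descent (pure Python; the data and hyperparameters are hypothetical, and real work would use the NumPy/scikit-learn/PyTorch stack listed above):

```python
# Minimal sketch: fitting y ≈ w*x + b with batch gradient descent.
# Data and hyperparameters are hypothetical illustrations.

def fit_linear(xs, ys, lr=0.05, epochs=2000):
    """Return (w, b) minimizing mean squared error on (xs, ys)."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        # Gradients of MSE = (1/n) * sum((w*x + b - y)^2)
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]   # generated from y = 2x + 1
w, b = fit_linear(xs, ys)
```

The same loop, vectorized and wrapped behind an API endpoint, is essentially what "integrating ML models into production microservices" packages up.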

Python Developer

Bengaluru

2 - 3 years

INR 3.0 - 7.0 Lacs P.A.

Work from Office

Full Time

About the Role: We are looking for a skilled Python Developer to join our team. The ideal candidate will have experience in building scalable applications, writing clean and efficient code, and working with modern frameworks and libraries. You will collaborate with cross-functional teams to develop, optimize, and maintain high-quality software solutions.

Key Responsibilities:
- Develop, test, and maintain Python-based applications.
- Write clean, scalable, and efficient code following best practices.
- Integrate third-party APIs and services as needed.
- Work with databases (SQL/NoSQL) for data storage and retrieval.
- Optimize applications for performance and scalability.
- Collaborate with frontend developers, designers, and other stakeholders.
- Troubleshoot and debug issues in production and development environments.
- Write unit tests and participate in code reviews.
- Stay updated with emerging technologies and industry best practices.

Required Qualifications:
- Strong experience in Python (3.x) and its ecosystem.
- Experience with one or more Python frameworks (Django, Flask, FastAPI, etc.).
- Proficiency in working with relational and/or NoSQL databases (PostgreSQL, MySQL, MongoDB, etc.).
- Knowledge of RESTful API design and implementation.
- Familiarity with version control systems (Git, GitHub, GitLab).
- Experience with cloud platforms (AWS, GCP, Azure) is a plus.
- Understanding of CI/CD pipelines and containerization (Docker, Kubernetes) is a plus.
- Strong problem-solving skills and attention to detail.

Preferred Qualifications:
- Experience with data engineering, machine learning, or big data technologies.
- Familiarity with message queues (Kafka, RabbitMQ, etc.).
- Knowledge of distributed computing and microservices architecture.
- Experience with DevOps practices and infrastructure automation.

Skills: REST API, Python, MongoDB, Postgres, Cloud, CI/CD, Containerization
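The RESTful API design called for above can be illustrated without any framework; a minimal JSON endpoint as a plain WSGI app from the standard library (the route and payload are hypothetical), exercised here by calling the app directly rather than through a server:

```python
# Minimal sketch of a RESTful JSON endpoint as a plain WSGI app (stdlib only).
# In practice a framework such as Django, Flask, or FastAPI handles routing;
# the route and response shape here are hypothetical examples.
import json

USERS = {"1": {"id": "1", "name": "Ada"}}  # stand-in for a database

def app(environ, start_response):
    path = environ.get("PATH_INFO", "")
    if environ.get("REQUEST_METHOD") == "GET" and path.startswith("/users/"):
        user = USERS.get(path.removeprefix("/users/"))
        if user:
            start_response("200 OK", [("Content-Type", "application/json")])
            return [json.dumps(user).encode()]
    start_response("404 Not Found", [("Content-Type", "application/json")])
    return [json.dumps({"error": "not found"}).encode()]

# Exercise the app with a fake WSGI environ (no network needed).
captured = {}
def fake_start_response(status, headers):
    captured["status"] = status

response = b"".join(app({"REQUEST_METHOD": "GET", "PATH_INFO": "/users/1"},
                        fake_start_response))
```

Driving the app through a fake `environ` like this is also how unit tests for WSGI applications avoid opening sockets, which ties into the "write unit tests" responsibility above.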

NodeJS Engineer

Bengaluru

2 - 3 years

INR 4.0 - 7.0 Lacs P.A.

Work from Office

Full Time

We are looking for:
- Engineering graduates or postgraduates with 2-7 years of experience in software development.
- Strong proficiency with JavaScript and ES6 (ECMAScript 2015) on the Node.js platform (must have).
- Knowledge and understanding of async, non-blocking architecture.
- Knowledge of creating RESTful APIs.
- Knowledge of data structures and algorithms, Node global variables, and built-in libraries.
- Experience in building high-volume, transactional, customer-facing systems.
- Understanding of the nature of asynchronous programming and its quirks and workarounds.
- Basic understanding of front-end technologies such as HTML5 and CSS3.
- Writing reusable, testable, and efficient code.
- Integration of data storage solutions (RDBMS, NoSQL DB).
- Agile development experience in a fast-paced environment; working experience in small teams/pods/squads (highly desired).
- Contribution to open-source projects or experience working with open-source ecosystems is a good addition.
- Strong analytical skills, with a penchant for solving complex programming problems.

Good to have:
- Knowledge of frameworks such as Express and Koa.
- Knowledge of packages such as Babel and Webpack.

Soft skills required:
- Excellent communication skills.
- Team player with the ability to work with different multi-cultural teams.
- Proactive attitude in identifying problems and providing solutions.
- Creative and innovative thinking.

Skills: RESTful API, HTML, TypeScript, JavaScript, Node.js, CSS, API

Ruby on Rails Full Stack Developers

Hyderabad

5 - 10 years

INR 8.0 - 12.0 Lacs P.A.

Work from Office

Full Time

MUST HAVE SKILLS:
- Strong RoR on the backend, with a minimum of 6+ months of experience in front-end technologies such as Vue.js/React.js/AngularJS.
- AWS/Azure cloud (minimum 2+ years).

Details:
- 0.6+ years of experience with web frameworks (preferred: Rails or Rack, Django).
- 1+ years of Angular, React, or Vue.js.
- Demonstrated experience with AWS services (preferred: Lambda, SQS, S3).
- Experience working in a software-product-driven environment.
- Demonstrable knowledge of front-end technologies such as JavaScript, HTML5, CSS3.
- Workable knowledge of relational databases (e.g., MySQL, Postgres).

Skills: React.js, Azure, RoR (Ruby on Rails), Vue.js, AWS, AngularJS

Software Development in Test (SDET)

Hyderabad

5 - 10 years

INR 10.0 - 15.0 Lacs P.A.

Work from Office

Full Time

- 5+ years of SDET experience in a complex web-based software development environment.
- 3+ years of performance and API testing experience using tools like LoadRunner, WebLOAD, or Apache JMeter.
- Experience with the CodeceptJS/Playwright/Cypress testing frameworks.
- Experience using GitHub.
- Post-secondary education in Computer Science, Engineering, the Sciences, or Mathematics.
- Experience with Quality Assurance in an Agile environment.
- Experience with Ruby and JavaScript test libraries (RSpec, MiniTest, Jest) is an asset.
- Experience with test automation in CI/CD workflows in GitHub Actions or an equivalent such as Jenkins or AWS CodeBuild.

Skills: GitHub, API, Playwright, Cypress

API Analyst / Developer

Bengaluru

8 - 12 years

INR 7.0 - 11.0 Lacs P.A.

Work from Office

Full Time

About the Role: We are looking for a skilled API Governance Expert who is also a hands-on developer to join our team. In this hybrid role, your primary responsibility will be to define and enforce API governance frameworks, ensuring that our APIs align with industry standards and organisational objectives. Additionally, you will contribute as a developer within a squad, actively participating in the design, coding, and deployment of software solutions.

Key Responsibilities

API Governance:
- Establish and enforce API governance frameworks, standards, and best practices across the organisation.
- Review and approve API designs, ensuring they meet governance criteria, security, and performance benchmarks.
- Collaborate with teams to implement API lifecycle management, including versioning, documentation, and deprecation policies.
- Provide guidance on API security, compliance, and performance optimisation.
- Act as a subject matter expert on API-related governance and serve as a point of contact for queries and escalations.

Development:
- Work closely with a development squad to design, develop, and deploy software solutions.
- Participate in code reviews, ensuring adherence to best practices and coding standards.
- Assist in API integration with various internal and external systems.
- Contribute to the continuous improvement of the development process, including CI/CD practices, automated testing, and deployment strategies.
- Troubleshoot and resolve issues in both governance and development activities.

Must Have

Knowledge:
- Strong understanding of API governance principles, standards, and best practices.
- Deep knowledge of RESTful APIs, GraphQL, and microservices architecture.
- Expertise in API security protocols (e.g., OAuth2, JWT, mTLS).
- Understanding of API lifecycle management, including versioning, documentation, and deprecation policies.

Skills:
- Hands-on experience in one or more modern programming languages (e.g., Java, GoLang, Node.js, Python).
- Ability to design, build, and maintain APIs following governance best practices.
- Proficiency with API management platforms (e.g., Apigee, Kong, AWS API Gateway).
- Strong problem-solving and debugging skills.
- Excellent communication and leadership skills to influence and mentor teams.

Experience:
- Minimum 5 years of experience in API governance, API design, or software development.
- Proven track record in establishing and enforcing API governance frameworks.
- Experience working in agile development environments and cross-functional teams.

Good to Have

Knowledge:
- Familiarity with cloud-based services and DevOps practices (e.g., Docker, Kubernetes, AWS, Azure).
- Understanding of service mesh technologies (e.g., Istio, Linkerd) and API observability.
- Knowledge of regulatory and compliance requirements related to API security.

Skills:
- Experience in CI/CD pipeline automation and API testing frameworks.
- Ability to advocate for and drive API-first development within engineering teams.
- Strong stakeholder management skills to align API governance with business objectives.

Experience:
- Exposure to regulated industries, such as financial services.
- Prior experience in a governance- or compliance-focused role.
- Contributions to open-source API governance tools or frameworks.

Skills: Governance, Design & Development, Microservice Architecture, API
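Of the API security protocols the posting lists, JWT verification is compact enough to sketch with the standard library alone. A minimal HS256 sign/verify pair (the secret and claims are hypothetical, and production services should use a vetted library such as PyJWT that also validates expiry and audience):

```python
# Minimal sketch of HS256 JWT signing and verification, stdlib only.
# For illustration: real services should use a vetted JWT library.
import base64, hashlib, hmac, json

def _b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign(payload: dict, secret: bytes) -> str:
    """Build header.payload.signature for alg=HS256."""
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64url(json.dumps(payload).encode())
    sig = hmac.new(secret, f"{header}.{body}".encode(), hashlib.sha256).digest()
    return f"{header}.{body}.{_b64url(sig)}"

def verify(token: str, secret: bytes):
    """Return the payload if the signature matches, else None."""
    header, body, sig = token.split(".")
    expected = hmac.new(secret, f"{header}.{body}".encode(),
                        hashlib.sha256).digest()
    if not hmac.compare_digest(_b64url(expected), sig):
        return None
    padded = body + "=" * (-len(body) % 4)  # restore base64url padding
    return json.loads(base64.urlsafe_b64decode(padded))

token = sign({"sub": "user-42"}, b"demo-secret")
claims = verify(token, b"demo-secret")
```

Note the constant-time comparison (`hmac.compare_digest`) when checking the signature; an ordinary `==` would be a textbook governance-review finding.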

Azure Databricks Architect

Bengaluru

8 - 15 years

INR 14.0 - 18.0 Lacs P.A.

Work from Office

Full Time

Lead data architects lead the design and implementation of data collection, storage, transformation, orchestration (movement), and consumption to achieve optimum value from data. They are the technical leaders within data delivery teams. They play a key role in modelling data for optimal reuse, interoperability, security, and accessibility, as well as in the design of efficient ingestion and transformation pipelines. They ensure data accessibility through a performant, cost-effective consumption layer that supports use by citizen developers, data scientists, AI, and application integration, and they instill trust through the employment of data quality frameworks and tools. The data architect at Chevron predominantly works within the Azure Data Analytics Platform but is not limited to it. The senior data architect is responsible for optimizing the costs of delivering data. They are also responsible for ensuring compliance with enterprise standards and are expected to contribute to the evolution of those standards as technologies and best practices change.

Key Responsibilities:
- Design and oversee the entire data architecture strategy.
- Mentor junior data architects to ensure skill development in alignment with the team strategy.
- Design and implement complex, scalable, high-performance data architectures that meet business requirements.
- Model data for optimal reuse, interoperability, security, and accessibility.
- Develop and maintain data flow diagrams and data dictionaries.
- Collaborate with stakeholders to understand data needs and translate them into technical solutions.
- Ensure data accessibility through a performant, cost-effective consumption layer that supports use by citizen developers, data scientists, AI, and application integration.
- Ensure data quality, integrity, and security across all data systems.

Qualifications:
- Experience in Erwin, Azure Synapse, Azure Databricks, Azure DevOps, SQL, Power BI, Spark, Python, R.
- Ability to drive business results by building optimal-cost data landscapes.
- Familiarity with Azure AI/ML services; Azure analytics (Event Hub, Azure Stream Analytics); scripting (Ansible).
- Experience with machine learning and advanced analytics.
- Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes).
- Understanding of CI/CD pipelines and automated testing frameworks.
- Certifications such as AWS Certified Solutions Architect, IBM Certified Data Architect, or similar are a plus.

Skills: Azure Synapse, SQL, Spark, Python, Power BI, Azure DevOps, Azure Databricks

UIV Developer

Bengaluru

5 - 8 years

INR 4.0 - 8.0 Lacs P.A.

Work from Office

Full Time

UIV DEVELOPER

The Unified Inventory (UIV) is Nokia's cutting-edge, cloud-native, multi-vendor, and multi-domain software solution that delivers real-time, end-to-end visibility of service and resource inventories. It underpins fulfilment and assurance applications for real-time automatic operations such as service fulfilment, closed-loop operations, and automatic network recovery. We are seeking an experienced senior developer with hands-on expertise in network domains and strong programming skills to contribute to the UIV team.

Must have:
- Minimum 5 years of hands-on experience in NMS, inventory, network topology, assurance, and fulfilment domains.
- Experience in UIV (Unified Inventory) or UIM (Unified Inventory Management) in the telco domain.
- Strong Telecom expertise in at least one of the following domains: 5G slicing, 5G SA/NSA networks, IP MPLS, optical transport, IMS, VoLTE, NFV/SDN networks.
- Proficient in programming languages such as Java and Python, and in working with XML, XSLT, and JSON.
- Hands-on experience in Java development, with exposure to frameworks such as microservices.
- Experience in network communication protocols & technologies: SNMP, REST, SOAP, JMS, Kafka.
- Deep knowledge of Telecom and networking fundamentals.

Technical Experience & Skills:
- Hands-on experience in one or more of the following domains (experience in multi-vendor networks is highly preferred): 5G slicing, 5G SA/NSA networks; IP MPLS, optical transport; IMS, VoLTE; NFV/SDN networks.
- Experience in UIV (Unified Inventory) or UIM (Unified Inventory Management) in the telco domain.
- Experience in building UIV Discovery and Reconciliation config and end-to-end development of inventory adaptors (network adaptors, federation adaptors, event adaptors, etc.).
- Hands-on experience in: Java, Python; XML, XSLT, JSON; SNMP, REST, SOAP, JMS, Kafka.
- Knowledge of protocols such as SMPP, SNMP, OpenSSL/SSH.
- Hands-on experience with any relational database and complex SQL queries.
- Knowledge of scripting languages like Shell, Perl, or Python.
- Knowledge of REST APIs and web services.
- Experience with Agile and JIRA as a way of working and delivery.
- Familiarity with cloud-native architecture and basic cloud concepts.
- Familiarity with Docker containers and monitoring tools such as Prometheus, Grafana, and Elasticsearch.
- Basic knowledge of Kubernetes and Helm charts.
- Experience with CI/CD processes, Jenkins, GitLab, and unit testing frameworks like JUnit.
- Nokia UIV Solution Developer trainings and certifications recommended.

Skills: Unified Inventory, NSA, 5G Slicing, NMS, UIV, UIM

Matillion ETL Developer

Bengaluru

5 - 9 years

INR 5.0 - 9.0 Lacs P.A.

Work from Office

Full Time

Job Summary: The ETL Developer will be responsible for designing, developing, and maintaining ETL processes using Matillion and Snowflake. The ideal candidate will have a minimum of 5 years of experience in ETL development, a strong background in data integration, and a deep understanding of data warehousing concepts. This role requires close collaboration with data analysts, data scientists, and business stakeholders to ensure data is effectively integrated, transformed, and made accessible for various business needs.

Key Responsibilities:
  • Design, develop, and maintain ETL processes using Matillion to extract, transform, and load data into Snowflake.
  • Optimize and troubleshoot ETL workflows to ensure high performance and reliability.
  • Collaborate with data architects and business stakeholders to understand data requirements and translate them into technical solutions.
  • Develop and implement data quality checks to ensure the accuracy and consistency of data.
  • Monitor ETL processes and address any issues promptly to minimize downtime.
  • Maintain comprehensive documentation of ETL processes, data mappings, and data flow diagrams.
  • Stay current with industry best practices and emerging technologies in ETL, data warehousing, and data integration.

Qualifications:
  • Bachelor's degree in Computer Science, Information Technology, or a related field.
  • Minimum of 5 years of experience in ETL development.
  • Proficiency in the Matillion ETL tool.
  • Extensive experience with the Snowflake cloud data warehouse.
  • Strong SQL skills and experience with complex queries and performance tuning.
  • Solid understanding of data warehousing concepts, data modeling, and schema design.
  • Experience with data integration from various sources such as APIs, flat files, and databases.
  • Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and their data services.
  • Excellent problem-solving skills and attention to detail.
  • Strong communication and collaboration skills.
  • Ability to work in a fast-paced, dynamic environment and manage multiple priorities.

Preferred Qualifications:
  • Experience with other ETL tools and data integration platforms.
  • Knowledge of programming languages such as Python or Java.
  • Experience with big data technologies (e.g., Hadoop, Spark).
  • Certifications in Matillion, Snowflake, or related technologies.

Python, Hadoop, Data Warehouse, Snowflake
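The data quality checks this role calls for can be sketched as parameterized SQL assertions run after a load. The example below is a minimal illustration only: it uses an in-memory SQLite table as a stand-in for a Snowflake staging table (in practice you would run these through the Snowflake connector or a Matillion job), and the table and column names are hypothetical.

```python
import sqlite3

# Hypothetical staging table standing in for a Snowflake target;
# in production the connection would come from the Snowflake connector.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_orders (order_id INTEGER, customer_id INTEGER, amount REAL);
    INSERT INTO stg_orders VALUES (1, 101, 250.0), (2, NULL, 99.5), (2, 103, 10.0);
""")

def run_quality_checks(conn):
    """Return a dict mapping check name -> number of offending rows."""
    checks = {
        # Completeness: business keys must not be NULL after the load.
        "null_customer_id": "SELECT COUNT(*) FROM stg_orders WHERE customer_id IS NULL",
        # Uniqueness: order_id is expected to be unique.
        "duplicate_order_id": """
            SELECT COALESCE(SUM(cnt - 1), 0) FROM
            (SELECT COUNT(*) AS cnt FROM stg_orders GROUP BY order_id HAVING COUNT(*) > 1)
        """,
        # Validity: amounts must be positive.
        "non_positive_amount": "SELECT COUNT(*) FROM stg_orders WHERE amount <= 0",
    }
    return {name: conn.execute(sql).fetchone()[0] for name, sql in checks.items()}

results = run_quality_checks(conn)
print(results)  # {'null_customer_id': 1, 'duplicate_order_id': 1, 'non_positive_amount': 0}
```

A pipeline would typically fail or quarantine the batch when any count is non-zero, and log the results for the traceability reports mentioned elsewhere in these listings.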

FlowOne BST/Catalog Developer

Bengaluru

6 - 9 years

INR 5.0 - 8.0 Lacs P.A.

Work from Office

Full Time

FlowOne BST/Catalog Developer

Must have:
  • Hands-on experience with the Nokia FlowOne product: development, testing, and integration.
  • Experience in BST/Catalog design and development based on CDPA and CDFF.
  • Strong expertise in the Telecom OSS, Assurance, and Fulfilment domains.
  • Minimum 6 years of experience in the OSS Telecom domain and Service Provisioning and Activation.
  • Experience building Nokia FlowOne service templates/workflows for onboarding/service stitching.
  • Basic knowledge of relational databases, e.g., PostgreSQL.
  • Experience in Unix and Shell/Perl scripting.
  • Knowledge of REST APIs/XML/JSON/SOAP/SOA and web services.
  • Familiarity with tools like SoapUI and Postman.
  • Knowledge of telecom and networking fundamentals.

Technical Experience & Skills:
  • Develop, configure, and maintain workflows within the Nokia FlowOne platform, focusing on modules like Order Manager, Order Hub, Catalogue, and BST.
  • Basic understanding of FlowOne operational activity.
  • BST implementation with a core focus on development, testing, and integration.
  • Catalog implementation based on the CDFF framework.
  • Good experience with relational databases, e.g., PostgreSQL/SQL Server.
  • Experience with Unix, Shell/Perl, and Python scripting.
  • Knowledge of REST APIs/XML/SOAP/SOA and web services.
  • Understanding of JSON structure and XML parsing.
  • Familiarity with creating simulators using tools like SoapUI and Postman.
  • Experience working with file-based protocols such as SFTP and FTP for data exchange.
  • Experience configuring and managing transaction workflows in LDAP.
  • Knowledge of protocols such as SMPP, SNMP, and OpenSSL/SSH.
  • Deep knowledge of 4G and 5G call flows.
  • Experience with Agile and JIRA as a way of working and delivery.
  • Experience with DevOps processes on any cloud platform.
  • Experience collaborating with the team to enhance deployment processes through CI/CD pipelines (e.g., Jenkins, GitLab CI).
  • Nokia FlowOne training and certifications recommended.

Order Management, Nokia FlowOne, Testing, CDFF, REST & SOAP API, Integration
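The JSON-structure and XML-parsing skills called out above can be illustrated with a short stdlib sketch. The service-order payload here is entirely hypothetical (it is not an actual FlowOne or BST message format); the point is parsing both representations and cross-checking fields.

```python
import json
import xml.etree.ElementTree as ET

# Hypothetical service-order payload; real FlowOne/BST messages differ.
json_payload = '{"orderId": "SO-1001", "action": "ACTIVATE", "params": {"msisdn": "919800000000"}}'
order = json.loads(json_payload)

xml_payload = """
<serviceOrder>
  <orderId>SO-1001</orderId>
  <action>ACTIVATE</action>
  <params><msisdn>919800000000</msisdn></params>
</serviceOrder>
"""
root = ET.fromstring(xml_payload)

# Cross-check that both representations describe the same order.
assert order["orderId"] == root.findtext("orderId")
assert order["params"]["msisdn"] == root.findtext("params/msisdn")
print(order["orderId"], order["action"])  # SO-1001 ACTIVATE
```

The same parse-and-assert pattern is what simulator tools like SoapUI or Postman automate when validating request/response payloads against an expected structure.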

Murex Datamart Lead

Bengaluru

6 - 11 years

INR 12.0 - 16.0 Lacs P.A.

Work from Office

Full Time

Murex Datamart

Key Duties & Responsibilities:
  • Work as part of the development team within the Bank's Treasury and Markets IT team.
  • Work on multiple projects related to automation and digitization of the Bank's processes and systems.
  • Analyze, design, and configure the database for the Murex Datamart implementation.
  • Detailed understanding and working knowledge of the configuration of Murex Datamart objects.
  • Ability to carry out configuration for accounting, transaction, compliance, P&L, cash flow, and similar reports.
  • SQL, preferably in a Sybase environment; should have knowledge of procedures, functions, triggers, indexes, and tuning.
  • Market data experience, including MDRS, MDCS, and RTBS.
  • Unix system commands and Shell/Perl script programming.
  • Experience creating technical/functional documents, along with requirement analysis and client-facing roles.

Technical Skills (Must Have):
  • Good knowledge of and expertise in Datamart.
  • Knowledge of Datamart reporting, feeders, and batches of feeders.
  • Strong analytical and debugging ability; able to modify and enhance existing complex Datamart objects.
  • Good exposure to dynamic tables, pre- and post-filters, feeders, batches of feeders, extractions, reporting tables, and processing scripts.
  • Experience with simulation-based reports, risk-matrix-based reports, and other complex reports.
  • Able to configure, execute, and troubleshoot batch reports in MXG.
  • Able to design and optimize the usage of dynamic tables.
  • Experience with Sybase or Oracle databases.
  • Experience in Datamart/EOD solution design and effort estimation with limited support required.
  • Knowledgeable in Unix shell scripting.
  • Knowledge and hands-on experience (at least 5 years) implementing, developing, and supporting MX.3 End of Day processes and Trade Life Cycle Management using the workflow engine.
  • Experience leading the integration stream for an MX.3 implementation (Integration or Reporting stream), including leading and coordinating design sessions and sprint show sessions.
  • Strong knowledge of SQL/RDBMS technology.

Good to Have:
  • Knowledge of GOM definition and MxML development.
  • Experience with different asset classes, trade workflows, trade attributes, and financial and non-financial static data.
  • Good understanding of both exchange-traded and OTC derivatives, with a specific focus on Credit and Rates products and clarity on their life cycle.
  • Understanding of Murex BO functionalities such as confirmations and settlements.

Murex Datamart, DB, Sybase, MXG, MxML

Nodejs Devops Specialist

Bengaluru

5 - 12 years

INR 9.0 - 13.0 Lacs P.A.

Work from Office

Full Time

As a Developer, you are accountable for:
  • Develop, enhance, and maintain our engineering portal based on Spotify Backstage.
  • Write clean, efficient, and maintainable JavaScript and Node.js code.
  • Implement and improve CI/CD pipelines to ensure seamless deployment.
  • Apply DevOps principles to streamline development workflows and infrastructure.
  • Design and implement robust testing strategies, including unit, integration, and end-to-end tests.
  • Monitor, troubleshoot, and optimise application performance.
  • Collaborate with architects, engineers, and product owners to deliver high-quality solutions.

Must-have knowledge, skills, and experience:
  • Knowledge: Strong understanding of JavaScript and Node.js development. Solid grasp of DevOps practices and modern CI/CD pipelines. Knowledge of testing techniques, including unit and integration testing.
  • Skills: Proficiency in JavaScript and Node.js. Ability to design, develop, and maintain scalable backend applications. Experience implementing and optimising CI/CD pipelines (e.g., GitHub Actions, Jenkins, GitLab CI). Strong debugging and performance-optimisation skills.
  • Experience: Minimum 5 years of experience developing Node.js applications. Experience working in Agile development environments. Hands-on experience with Spotify Backstage, including developing plugins or extending its functionality.

Good-to-have knowledge, skills, and experience:
  • Knowledge: Familiarity with containerisation and orchestration (Docker, Kubernetes). Understanding of cloud platforms (AWS, Azure, or GCP). Knowledge of Infrastructure as Code (Terraform, Helm, or similar).
  • Skills: Experience with observability tools (Prometheus, Grafana, OpenTelemetry). Ability to implement security best practices in CI/CD and DevOps pipelines. Strong communication and collaboration skills.
  • Experience: Exposure to large-scale distributed systems. Experience contributing to open-source projects or internal developer platforms. Prior experience in a financial or enterprise environment is a plus.

JavaScript, Node.js, CI/CD Pipeline

Senior Developer 1

Bengaluru, Gurgaon

8 - 10 years

INR 6.0 - 10.0 Lacs P.A.

Work from Office

Full Time

Job Description: 8-10 years of software development experience; Java, Spring Boot, Kubernetes, containers, microservices, Helm; knowledge of Azure/AWS services; working knowledge of SQL/NoSQL databases; good knowledge of SIP signaling. Kubernetes, Containers, Java, Spring Boot, Microservices, Helm

Senior Developer 2

Bengaluru, Gurgaon

5 - 7 years

INR 5.0 - 9.0 Lacs P.A.

Work from Office

Full Time

Job Description: 5-7 years of software development experience; Java, Spring Boot, Kubernetes, containers, microservices, Helm; knowledge of Azure/AWS services; working knowledge of SQL/NoSQL databases; good knowledge of SIP signaling. Microservices, Kubernetes, Java, Spring Boot, Containers, Helm

Active Directory Professional

Hyderabad

5 - 8 years

INR 5.0 - 9.0 Lacs P.A.

Work from Office

Full Time

We are looking for a strong Active Directory professional with expertise in Azure, migration, and hardening. The ideal candidate should have a deep understanding of Active Directory fundamentals and hands-on experience in managing and securing directory services.

Required Skills:
  • Strong expertise in Active Directory administration.
  • Experience with Azure AD integration.
  • Hands-on experience with AD migrations and hardening.
  • Good troubleshooting and problem-solving skills.

Hardening, Active Directory, Azure, Migration

Rarr Technologies

Information Technology

San Francisco

50-100 Employees

878 Jobs


Job Titles Overview

Automation Test Engineer + ETL (1)
Lead Cloud Engineer (1)
GS Data Test Engineer (1)
Senior Python Engineer (1)