About the Role
We are seeking a highly motivated and talented AWS QuickSight Developer to join our growing team. The Developer will be responsible for designing, developing, and maintaining data visualizations and reports that provide valuable insights into business performance. This role requires a strong understanding of AWS QuickSight and data visualization tools.

Responsibilities:
• Design, develop, and maintain data visualization dashboards and reports using AWS QuickSight.
• Apply good knowledge of ETL pipelines.
• Write SQL queries for data extraction, transformation, and loading.
• Conduct data analysis and identify trends, patterns, and anomalies in data.
• Collaborate with business stakeholders to understand their data needs and translate them into actionable insights.
• Ensure data quality and accuracy.
• Stay abreast of the latest trends and technologies in the BI and data analytics space.

Qualifications:
• Bachelor's degree in Computer Science, Statistics, Mathematics, or a related field.
• 5+ years of experience in Business Intelligence and Data Analytics.
• Strong experience with data visualization using QuickSight, Tableau, or similar tools.
• Excellent analytical and problem-solving skills.
• Strong attention to detail and accuracy.
• Ability to work independently and as part of a team.

Bonus Points:
• Banking domain knowledge.
• Experience with Tableau and similar tools.
• Experience with cloud-based data platforms (AWS, Azure, GCP).
• Experience with Agile development methodologies.
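As a rough illustration of the "write SQL queries for data extraction, transformation, and loading" responsibility above, here is a minimal, self-contained sketch using SQLite as a stand-in database. The table, columns, and figures are invented for the example; a real QuickSight dataset would be fed by a production source such as Redshift, Athena, or RDS.

```python
import sqlite3

# Hypothetical sales table; names and values are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (month TEXT, region TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("2024-01", "EU", 120.0), ("2024-01", "US", 200.0),
     ("2024-02", "EU", 150.0), ("2024-02", "US", 180.0)],
)

# The kind of aggregation query a dashboard dataset is typically built on:
# revenue per month, ready to plot as a trend line.
rows = conn.execute(
    "SELECT month, SUM(revenue) AS total "
    "FROM sales GROUP BY month ORDER BY month"
).fetchall()
print(rows)  # [('2024-01', 320.0), ('2024-02', 330.0)]
```

The same query shape (group, aggregate, order) underlies most trend and anomaly visuals, whatever the BI front end.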
Job description
We're looking for a highly skilled and visionary Agentic AI Implementation Engineer who can design, build, and deploy intelligent systems using agentic techniques. This role involves creating autonomous or semi-autonomous agents powered by LLMs that are capable of task planning, contextual decision-making, and iterative information retrieval. You'll work at the intersection of advanced prompt engineering, tool orchestration, and multi-step reasoning, with a strong focus on performance, scalability, and user-centric design.

Key Responsibilities
• Architect and implement agentic workflows using RAG pipelines, LLM agents, and external tool integrations.
• Design modular, agentic systems that include planning, memory, tool use, and context-aware reasoning.
• Develop and optimize custom GPTs using advanced prompt engineering and OpenAI's custom instructions, functions, and APIs.
• Integrate knowledge bases, vector stores (e.g., FAISS, Pinecone, Weaviate), and APIs into a cohesive Agentic RAG architecture.
• Fine-tune agent behaviors for various real-world applications (e.g., customer support, research assistants, code agents).
• Collaborate closely with product managers, UX designers, and backend engineers to ship scalable and robust solutions.
• Rapidly prototype ideas, run LLM experiments, and iterate on designs using both quantitative and qualitative metrics.
• Monitor system performance; detect and address reasoning failures, hallucinations, and retrieval mismatches.
• Stay up to date with the latest research and advancements in Agentic AI, RAG, tool use, and autonomous agents.

Must-Have Skills
• Strong expertise in agentic RAG frameworks such as LangGraph, AutoGPT, CrewAI, and LangChain Agents.
• Proven ability to design and implement custom GPTs using advanced prompt strategies and the OpenAI API (functions, tools, memory).
• Hands-on experience with vector databases (e.g., Cosmos DB, Pinecone, ChromaDB), embeddings, and semantic search.
• Deep understanding of retrieval augmentation, context compression, multi-hop querying, and memory management.
• Fluency in Python and experience with modern LLM tooling (e.g., LangChain, LlamaIndex).
• Strong systems thinking and the ability to balance trade-offs between model performance, latency, and accuracy.
• Comfortable with fast-paced, iterative environments and exploratory development.

Desired Skills
• Experience with autonomous agents and frameworks like AutoGen, OpenAgents, or BabyAGI.
• Understanding of AI safety, ethics, and control mechanisms in agentic systems.
• Familiarity with evaluation techniques for LLM pipelines (e.g., hallucination detection, prompt testing frameworks).
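To make the retrieve-plan-act loop behind such agentic RAG systems concrete, here is a toy, framework-free sketch. Everything in it is a stand-in: the "vector store" is a keyword lookup rather than an embedding index, and the planning and query-rewrite steps are one-liners. In practice these would be FAISS/Pinecone searches and LLM calls via LangChain, LangGraph, or the OpenAI API.

```python
from dataclasses import dataclass, field

@dataclass
class VectorStoreStub:
    """Stand-in for an embedding index (FAISS, Pinecone, etc.)."""
    docs: dict
    def search(self, query: str) -> str:
        # Real systems rank by embedding similarity; a keyword match stands in here.
        return next((t for k, t in self.docs.items() if k in query.lower()),
                    "no match")

@dataclass
class Agent:
    store: VectorStoreStub
    memory: list = field(default_factory=list)   # short-term memory of past steps

    def run(self, task: str, max_steps: int = 3) -> str:
        for _ in range(max_steps):
            context = self.store.search(task)    # retrieval step
            self.memory.append(context)          # record for later reasoning
            if context != "no match":            # "planner" decides we can answer
                return f"answer using: {context}"
            task = task + " refunds"             # toy query rewrite (multi-hop retrieval)
        return "gave up"

agent = Agent(VectorStoreStub({"refunds": "refunds take 5 days"}))
print(agent.run("how long do refunds take?"))  # answer using: refunds take 5 days
```

The loop structure (retrieve, update memory, decide, rewrite and retry) is the skeleton that the frameworks named in the posting formalize with graphs, tools, and real models.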
Job Description: SEO Expert (Contract, 5+ Years' Experience)
Duration: A few months (contract)
Location: Remote/flexible
Compensation: Competitive, based on experience

Overview
We are seeking an experienced SEO Expert with a proven track record of over 5 years to join us on a short-term contract basis. The ideal candidate will drive our SEO project independently, deliver tangible improvements, and provide comprehensive reports on progress and results.

Key Responsibilities
• Develop, implement, and manage effective SEO strategies to improve organic search rankings, website traffic, and conversion rates.
• Conduct in-depth keyword research, competition analysis, and technical SEO audits.
• Optimize website content, structure, and on-page elements for search engines and user experience.
• Build and execute link-building strategies to increase website authority.
• Monitor and report on key SEO performance metrics, providing detailed analysis and actionable recommendations.
• Continuously track industry trends, algorithm updates, and best practices to ensure ongoing results.
• Collaborate with content, web development, and marketing teams as needed.
• Take full ownership of the project, working with minimal supervision and proactively communicating status and results.

Qualifications
• 5+ years of proven SEO experience with successful project outcomes.
• Strong expertise with SEO tools (e.g., Google Analytics, Search Console, SEMrush, Ahrefs, Screaming Frog).
• Deep understanding of on-page, off-page, and technical SEO.
• Excellent analytical, problem-solving, and communication skills.
• Self-motivated, results-driven, and able to work independently.
• Experience generating detailed SEO reports and presenting them to stakeholders.
• Ability to commit to a contract role for a few months and deliver results within set timelines.

Nice to Have
• Previous experience in B2B or similar industry domains.
• Experience working in a remote or flexible setup.
QuickSight experience is a must: minimum of 1.5 years.

About the Role
We are seeking a highly motivated and talented AWS QuickSight Developer to join our growing team. The Developer will be responsible for designing, developing, and maintaining data visualizations and reports that provide valuable insights into business performance. This role requires a strong understanding of AWS QuickSight and data visualization tools.

Responsibilities:
• Design, develop, and maintain data visualization dashboards and reports using AWS QuickSight.
• Apply good knowledge of ETL pipelines.
• Write SQL queries for data extraction, transformation, and loading.
• Conduct data analysis and identify trends, patterns, and anomalies in data.
• Collaborate with business stakeholders to understand their data needs and translate them into actionable insights.
• Ensure data quality and accuracy.
• Stay abreast of the latest trends and technologies in the BI and data analytics space.

Qualifications:
• Bachelor's degree in Computer Science, Statistics, Mathematics, or a related field.
• 5+ years of experience in Business Intelligence and Data Analytics.
• Strong experience with data visualization using QuickSight, Tableau, or similar tools.
• Excellent analytical and problem-solving skills.
• Strong attention to detail and accuracy.
• Ability to work independently and as part of a team.

Bonus Points:
• Banking domain knowledge.
• Experience with Tableau and similar tools.
• Experience with cloud-based data platforms (AWS, Azure, GCP).
• Experience with Agile development methodologies.
Seeking an experienced Avaloq Data Specialist (10+ years) with expertise in Avaloq, Oracle, and banking data models. This fully remote role offers flexibility, career growth, and impact on banking transformation projects.
Benefits: health insurance, provident fund.
Job Description
We are seeking a highly experienced Avaloq Data Specialist to join our team. This is a senior-level role for a professional with a strong background in Avaloq data extraction, Oracle backend systems, and the Avaloq data model. The successful candidate will play a key role in extracting, structuring, and analyzing data to support business-critical decision-making within a banking environment. This is a remote role, offering flexibility and the opportunity to work with a leading banking technology platform from anywhere.

Key Responsibilities
• Perform data extraction, transformation, and analysis from the Avaloq Banking Suite, leveraging Oracle as the backend.
• Develop a deep understanding of Avaloq's data structures, models, and business logic to ensure accurate data handling.
• Partner with business and technical stakeholders to gather data requirements and deliver actionable insights.
• Ensure data quality, consistency, and compliance with banking standards.
• Support data optimization, query tuning, and reporting frameworks for efficiency.
• Provide subject matter expertise in Avaloq data management for projects, audits, and regulatory needs.

Requirements
• 10+ years of experience working with Avaloq in a banking environment.
• Proven expertise in Oracle databases, including SQL and performance optimization for data extraction.
• Strong knowledge of Avaloq data models, object layers, and customization possibilities.
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• Preferred: Avaloq-specific certifications such as:
  - Avaloq Certified Customization Professional (ACCP)
  - Avaloq Certified Integration Professional (ACIP)
  - Avaloq Certified Operations Professional (ACOP)
• Excellent analytical, problem-solving, and communication skills.
• Ability to work independently and effectively in a fully remote setup.

Why Join Us
• Work from Anywhere: 100% remote role with flexibility.
• Banking Domain Expertise: Be part of projects at the core of financial services and digital banking transformation.
• Career Growth: Gain exposure to cutting-edge Avaloq implementations and expand your specialization with certification opportunities.
• Impactful Work: Directly contribute to critical banking operations by ensuring data accuracy, reliability, and availability.
• Collaborative Culture: Join a team of experienced Avaloq and Oracle professionals in a supportive environment.

This role is ideal for a senior Avaloq expert who thrives in complex banking data environments and wants to make an impact while enjoying the benefits of remote work.
Job Responsibilities:
As a Java, Spring Boot, and Angular Developer, you will play a key role in the end-to-end development, design, and maintenance of full-stack applications. You'll work on both the front end and back end to deliver high-performance solutions. Experience: 8-10 years.
• Design and implement code using Java, Spring Boot, and Angular.
• Write well-structured, clean, and efficient code.
• Develop RESTful APIs and microservices.
• Develop dynamic, interactive, and responsive web applications using Angular.
• Build reusable UI components and manage state using RxJS and NgRx.
• Ensure the application is optimized for speed and scalability.
• Implement user-friendly interfaces with a focus on UI/UX best practices.
• Collaborate with UX/UI designers to transform designs into seamless user experiences.

Collaborative Development:
• Work closely with other developers and product managers to deliver high-quality solutions.
• Write unit, integration, and end-to-end tests for backend and frontend components.
• Use Git for version control.

Required Skills:
Core Technologies:
• Java 19+: Strong knowledge of Java language features, OOP principles, and JVM internals.
• Spring Boot: Expertise in building RESTful web services and microservices, plus knowledge of Spring Cloud.
• Angular (15+): Proficiency in building dynamic web applications using Angular, TypeScript, HTML, and CSS.
• REST APIs: Solid experience in designing and developing RESTful APIs.
• Version Control: Familiarity with Git and branching strategies.

Additional Skills:
• Security: Experience in securing applications using JWT, OAuth2, and role-based access control.
• Exposure to Oracle databases is an added advantage.
Job description
Required:
• Hands-on experience writing and optimizing complex SQL queries involving multiple joins, subqueries, and aggregations.
• Proficiency in performance tuning SQL queries for efficiency and scalability.
• Worked on at least one development project from an ETL perspective, demonstrating expertise in extracting, transforming, and loading data.
• File processing using ETL tools, showcasing the ability to handle various file formats and processing requirements.
• Experience in shell/Python scripting, particularly for automating ETL processes and data manipulation tasks.
• Hands-on experience writing business-logic SQL or PL/SQL, ensuring data integrity and adherence to business rules.
• ETL testing and troubleshooting, including identifying and resolving data quality issues and ETL process failures.
• Good to have: experience building a cloud ETL pipeline, leveraging cloud services for scalable and reliable data processing.
• Hands-on experience with code versioning tools like Git and SVN, ensuring code integrity and collaboration.

Required Skills and Abilities:
• Mandatory: hands-on, deep experience in ETL development, demonstrating proficiency in all stages of the ETL process.
• Mandatory: strong SQL query and shell scripting skills, with a focus on writing efficient, maintainable code.
• Good communication skills to understand business requirements from SMEs, facilitating clear communication between technical and non-technical stakeholders.
• Basic knowledge of data modeling, enabling understanding of database structures and relationships.
• Good understanding of end-to-end data pipelines and code optimization, ensuring efficient data flow and processing.

SQL and Complex SQL Experience:
• Extensive experience crafting SQL queries for data analysis, reporting, and decision-making purposes.
• Hands-on experience optimizing SQL queries for large datasets, including index optimization and query plan analysis.
• Demonstrated ability to work with complex data structures and hierarchies within SQL queries, ensuring accurate and efficient data retrieval.
• Experience designing and implementing database schemas to support ETL processes, including table partitioning and indexing strategies.
• Experience troubleshooting and resolving performance issues related to SQL queries and database operations.
• Proven track record of delivering high-quality SQL code within tight deadlines and under changing requirements.

Note: Minimum 5-7 years of experience.
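The "index optimization and query plan analysis" skill mentioned above can be illustrated with a small, self-contained sketch. SQLite stands in for a production database here, and the table, data, and index names are invented; the workflow (inspect the plan, add an index, confirm the plan changed from a full scan to an index search) is the same one applied to Oracle or PostgreSQL with their respective EXPLAIN tools.

```python
import sqlite3

# Hypothetical orders table with enough rows for the planner to care about.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INT, amount REAL)")
conn.executemany(
    "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

query = "SELECT SUM(amount) FROM orders WHERE customer_id = 42"

# Without an index, the plan shows a full table scan.
before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()[3]

conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")

# With the index, the plan switches to an index search.
after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()[3]
print(before)  # a SCAN over orders
print(after)   # a SEARCH using idx_orders_customer
```

The exact wording of the plan strings varies by SQLite version, but the scan-to-search transition is the point of the check.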
Job Description:
Responsible for the day-to-day maintenance of the application systems in operation, including identifying and troubleshooting application issues and resolving or escalating them. Responsibilities also include root cause analysis, management communication, and client relationship management in partnership with Infrastructure Service Support team members. Ensures all production changes are made in accordance with life-cycle methodology and risk guidelines. Responsible for coaching and mentoring less experienced team members and/or acting as a subject matter expert. Requires in-depth functional knowledge of the supported application(s) and their interdependencies.

Key Responsibilities
These include, but are not limited to:
• Knowledgeable in ANSI SQL.
• Knowledgeable in the deployment, stabilization, and support of at least one of the following:
  - ETL tools (Informatica, dbt, and the like)
  - Dashboards (Tableau, AWS QuickSight, and the like)
  - Messaging tools (Confluent, Kafka, and the like)
  - Machine learning platforms (CDSW, CML, and the like)
  - Databases (Snowflake, RDBMS, and the like)
• Able to create processes and procedures that operationalize and improve manual workflows.
• Good documentation skills for root cause analysis (RCA).

Scope and Responsibilities:
1. Advanced Troubleshooting: L2 support teams possess a deeper understanding of ETL systems and processes, allowing them to investigate and resolve more complex issues. This includes analyzing log files, examining database performance metrics, and tracing data flow to identify the root causes of problems.
2. Incident Resolution: L2 support teams are responsible for resolving incidents that cannot be addressed at the L1 level. They may work closely with development teams or external vendors to implement fixes or workarounds for issues affecting ETL processes.
3. Performance Tuning: L2 support teams focus on optimizing the performance of ETL jobs and processes to ensure efficient data extraction, transformation, and loading. This may involve tuning database queries, optimizing ETL workflows, or recommending infrastructure upgrades to improve system throughput and reduce processing times.
4. Capacity Planning: L2 support teams monitor resource utilization and system capacity to anticipate and mitigate potential bottlenecks in ETL processing. They may provide recommendations for scaling infrastructure or adjusting job schedules to accommodate growing data volumes or changing business requirements.
5. Change Management: L2 support teams participate in the change management process to review and approve proposed changes to ETL systems and configurations. They assess the potential impact of changes on production environments and ensure that proper testing and validation procedures are followed before updates are implemented.
6. Knowledge Sharing and Training: L2 support teams contribute to knowledge base articles, best practices, and training materials to equip L1 support teams and other stakeholders with the skills and information needed to manage and support ETL processes effectively.

High-Level Requirements
These include, but are not limited to:
• Excellent problem-solving and documentation skills.
• Proven customer service skills.
• Familiarity with Information Technology Infrastructure Library (ITIL) methodologies.
• Good communication skills (verbal and written) and the ability to work within a 24x7 support environment.
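The log-file analysis step described under Advanced Troubleshooting can be sketched minimally as follows. The log format, job names, and error messages here are all invented for illustration; real ETL platforms each have their own log layout, but the pattern of extracting error lines and grouping them by message to surface the dominant root cause is the same.

```python
import re
from collections import Counter

# Hypothetical ETL log excerpt (format and messages are invented).
log = """\
2024-03-01 02:00:01 INFO  job=load_orders started
2024-03-01 02:03:12 ERROR job=load_orders step=extract ORA-01555 snapshot too old
2024-03-01 02:10:44 ERROR job=load_fx     step=load    target table locked
2024-03-01 03:00:02 ERROR job=load_orders step=extract ORA-01555 snapshot too old
"""

# Pull out (job, step, message) for every ERROR line.
errors = re.findall(r"ERROR job=(\S+)\s+step=(\S+)\s+(.*)", log)

# Group by message: the most frequent error is the first RCA candidate.
by_cause = Counter(msg for _, _, msg in errors)
print(by_cause.most_common(1))  # [('ORA-01555 snapshot too old', 2)]
```

Grouping by recurring message is only a first pass; a real RCA would follow the dominant error back through the data flow as the posting describes.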
Job description
QuickSight experience is a must: minimum of 3 years.

About the Role
We are seeking a highly motivated and talented AWS QuickSight Developer to join our growing team. The Developer will be responsible for designing, developing, and maintaining data visualizations and reports that provide valuable insights into business performance. This role requires a strong understanding of AWS QuickSight and data visualization tools.

Responsibilities:
• Design, develop, and maintain data visualization dashboards and reports using AWS QuickSight.
• Apply good knowledge of ETL pipelines.
• Write SQL queries for data extraction, transformation, and loading.
• Conduct data analysis and identify trends, patterns, and anomalies in data.
• Collaborate with business stakeholders to understand their data needs and translate them into actionable insights.
• Ensure data quality and accuracy.
• Stay abreast of the latest trends and technologies in the BI and data analytics space.

Qualifications:
• Bachelor's degree in Computer Science, Statistics, Mathematics, or a related field.
• 5+ years of experience in Business Intelligence and Data Analytics.
• Strong experience with data visualization using QuickSight, Tableau, or similar tools.
• Excellent analytical and problem-solving skills.
• Strong attention to detail and accuracy.
• Ability to work independently and as part of a team.

Bonus Points:
• Banking domain knowledge.
• Experience with Tableau and similar tools.
• Experience with cloud-based data platforms (AWS, Azure, GCP).
• Experience with Agile development methodologies.