4.0 years
0 Lacs
Srinagar, Jammu & Kashmir, India
On-site
We are looking for a Backend Developer who is self-driven, solution-oriented, and thrives in a collaborative environment. You'll play a key role in developing new backend modules, enhancing existing systems, and ensuring our platforms remain robust, secure, and scalable. Ideal candidates will have hands-on experience with Node.js frameworks and building enterprise-grade applications.

Position: Backend Developer
Location: Onsite (Srinagar, Jammu and Kashmir)
Type: Full-Time
Experience: 3–4 Years

Key Responsibilities:
- Design, build, and maintain secure, scalable RESTful APIs using Node.js and Express.js.
- Architect and manage relational database schemas involving complex modules such as appointments, prescriptions, CRM logs, invoices, and more.
- Implement essential backend features: secure authentication mechanisms (JWT/Auth0), role-based access control, real-time communication using WebSockets/Socket.IO, and geo-location based logic (e.g., pharmacy/lab matching; see the sketch after this listing).
- Integrate third-party services: video SDKs (Jitsi or Twilio), payment gateways, and cloud storage (Cloudinary or Amazon S3).
- Develop reusable modules for CRM, order tracking, appointment scheduling, and notifications.
- Work closely with frontend, mobile, and DevOps teams to deliver well-integrated, optimized features.
- Maintain high standards for code quality, database performance, and API documentation.

Requirements:
- 3–4 years of hands-on experience in backend development with Node.js and Express.js.
- Strong understanding of relational databases (preferably PostgreSQL; MySQL is also acceptable with strong normalization/indexing experience).
- Practical experience with ORMs like Prisma, TypeORM, or Sequelize; schema design, data relationships, and transactions; and handling JSON data, advanced search, pagination, and filtering.
- Experience with real-time communication tools (WebSockets/Socket.IO).
- Integration experience with services such as Stripe/Razorpay, Jitsi/Twilio, Cloudinary, or S3.
- Familiarity with Redis for caching, background jobs, or session handling is a plus.
- Proficiency with Postman, Swagger/OpenAPI, and Git workflows.

Bonus Skills:
- Experience in healthcare, booking, or delivery-based platforms.
- Built or contributed to multi-role applications with complex permission logic.
- Exposure to cloud deployments on platforms like AWS or Hostinger.

We value real-world skills over purely theoretical knowledge. If you've built something impressive, we'd love to see it! Please include links to your GitHub repositories, live projects, portfolios, or demo apps along with your application.
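As an aside on the geo-location requirement above: matching a user to the nearest pharmacy or lab usually reduces to ranking candidates by great-circle distance. Below is a minimal, hypothetical sketch of that idea in Python (the role itself is Node.js-centric; the function names and sample coordinates are illustrative only, not part of the posting):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two (lat, lon) points in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))  # mean Earth radius ~6371 km

def nearest_pharmacies(user: dict, pharmacies: list[dict], limit: int = 3) -> list[dict]:
    """Rank candidate pharmacies by distance from the user's location."""
    return sorted(
        pharmacies,
        key=lambda p: haversine_km(user["lat"], user["lon"], p["lat"], p["lon"]),
    )[:limit]

# Hypothetical sample data for illustration only.
pharmacies = [
    {"name": "A", "lat": 34.08, "lon": 74.80},
    {"name": "B", "lat": 34.12, "lon": 74.84},
]
print(nearest_pharmacies({"lat": 34.09, "lon": 74.79}, pharmacies, limit=1))
```

In production this ranking is typically pushed down into the database (e.g., a PostGIS spatial index) rather than computed in application code over every row.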
Posted 4 weeks ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
As a Fullstack SDE - II at NxtWave, you:
- Build applications at a scale and see them released quickly to the NxtWave learners (within weeks)
- Get to take ownership of the features you build and work closely with the product team
- Work in a great culture that continuously empowers you to grow in your career
- Enjoy freedom to experiment & learn from mistakes (Fail Fast, Learn Faster)
- NxtWave is one of the fastest growing edtech startups. Get first-hand experience in scaling the features you build as the company grows rapidly
- Build in a world-class developer environment by applying clean coding principles, code architecture, etc.

Responsibilities
- Lead design and delivery of complex end-to-end features across frontend, backend, and data layers.
- Make strategic architectural decisions on frameworks, datastores, and performance patterns.
- Review and approve pull requests, enforcing clean-code guidelines, SOLID principles, and design patterns.
- Build and maintain shared UI component libraries and backend service frameworks for team reuse.
- Identify and eliminate performance bottlenecks in both browser rendering and server throughput.
- Instrument services with metrics and logging, driving SLIs, SLAs, and observability.
- Define and enforce comprehensive testing strategies: unit, integration, and end-to-end.
- Own CI/CD pipelines, automating builds, deployments, and rollback procedures.
- Ensure OWASP Top-10 mitigations, WCAG accessibility, and SEO best practices.
- Partner with Product, UX, and Ops to translate business objectives into technical roadmaps.
- Facilitate sprint planning, estimation, and retrospectives for predictable deliveries.
- Mentor and guide SDE-1s and interns; participate in hiring.

Qualifications & Skills
- 3–5 years building production full-stack applications end-to-end with measurable impact.
- Proven leadership in Agile/Scrum environments with a passion for continuous learning.
- Deep expertise in React (or Angular/Vue) with TypeScript and modern CSS methodologies.
- Proficient in Node.js (Express/NestJS) or Python (Django/Flask/FastAPI) or Java (Spring Boot).
- Expert in designing RESTful and GraphQL APIs and scalable database schemas.
- Knowledge of MySQL/PostgreSQL indexing, NoSQL (ElasticSearch/DynamoDB), and caching (Redis).
- Knowledge of containerization (Docker) and commonly used AWS services such as Lambda, EC2, S3, API Gateway, etc.
- Skilled in unit/integration (Jest, pytest) and E2E testing (Cypress, Playwright).
- Frontend profiling (Lighthouse) and backend tracing for performance tuning.
- Secure coding: OAuth2/JWT, XSS/CSRF protection, and familiarity with compliance regimes.
- Strong communicator able to convey technical trade-offs to non-technical stakeholders.
- Experience in reviewing pull requests and providing constructive feedback to the team.

Qualities we'd love to find in you:
- The attitude to always strive for the best outcomes and an enthusiasm to deliver high quality software
- Strong collaboration abilities and a flexible & friendly approach to working with teams
- Strong determination with a constant eye on solutions
- Creative ideas with a problem-solving mind-set
- Be open to receiving objective criticism and improving upon it
- Eagerness to learn and zeal to grow
- Strong communication skills is a huge plus

Work Location: Hyderabad

About NxtWave
NxtWave is one of India's fastest-growing ed-tech startups, revolutionizing the 21st-century job market. NxtWave is transforming youth into highly skilled tech professionals through its CCBP 4.0 programs, regardless of their educational background. NxtWave is founded by Rahul Attuluri (Ex Amazon, IIIT Hyderabad), Sashank Reddy (IIT Bombay) and Anupam Pedarla (IIT Kharagpur). Supported by Orios Ventures, Better Capital, and Marquee Angels, NxtWave raised $33 million in 2023 from Greater Pacific Capital. As an official partner for NSDC (under the Ministry of Skill Development & Entrepreneurship, Govt. of India) and recognized by NASSCOM, NxtWave has earned a reputation for excellence.

Some of its prestigious recognitions include:
- Technology Pioneer 2024 by the World Economic Forum, one of only 100 startups chosen globally
- 'Startup Spotlight Award of the Year' by T-Hub in 2023
- 'Best Tech Skilling EdTech Startup of the Year 2022' by Times Business Awards
- 'The Greatest Brand in Education' in a research-based listing by URS Media
- NxtWave Founders Anupam Pedarla and Sashank Gujjula were honoured in the 2024 Forbes India 30 Under 30 for their contributions to tech education

NxtWave breaks learning barriers by offering vernacular content for better comprehension and retention. NxtWave now has paid subscribers from 650+ districts across India. Its learners are hired by over 2,000 companies including Amazon, Accenture, IBM, Bank of America, TCS, Deloitte and more.
Know more about NxtWave: https://www.ccbp.in
Read more about us in the news – Economic Times | CNBC | YourStory | VCCircle
Posted 4 weeks ago
6.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Introduction
A career in IBM Software means you'll be part of a team that transforms our customers' challenges into solutions. Seeking new possibilities and always staying curious, we are a team dedicated to creating the world's leading AI-powered, cloud-native software solutions for our customers. Our renowned legacy creates endless global opportunities for our IBMers, so the door is always open for those who want to grow their career. IBM's product and technology landscape includes Research, Software, and Infrastructure. Entering this domain positions you at the heart of IBM, where growth and innovation thrive.

Your Role And Responsibilities
- Lead the design, development, and deployment of scalable, secure backend systems using Java, J2EE, and GoLang.
- Architect and implement robust RESTful APIs and microservices aligned with enterprise cloud-native standards.
- Collaborate closely with DevOps, QA, and frontend teams to deliver end-to-end product functionality.
- Set coding standards, influence architectural direction, and drive adoption of best practices across backend systems.
- Own performance tuning, monitoring, and high availability for backend services using tools like Prometheus, ELK, and Grafana.
- Implement security, compliance, and privacy by design principles in backend systems.
- Lead incident response and resolution of complex production issues across multi-cloud environments (e.g., AWS, Azure, OCP).
- Mentor and guide junior developers and contribute to team-wide knowledge sharing and skill development.
- Actively participate in Agile ceremonies and contribute to continuous delivery and process improvement.

Preferred Education
Master's Degree

Required Technical And Professional Expertise
- 6+ years of backend software development experience focused on scalable, secure, cloud-native enterprise systems.
- Deep expertise in Java, J2EE, and GoLang for building distributed backend systems.
- Advanced experience in architecting and implementing RESTful APIs, service meshes, and inter-service communication.
- Expert in Postgres or equivalent RDBMS: data modeling, indexing, and performance optimization at scale.
- Proven track record with microservices architecture, including Docker, Kubernetes, and service deployment patterns.
- Expert-level familiarity with backend-focused CI/CD tooling (Jenkins, GitLab CI/CD, ArgoCD) and IaC tools (Terraform, CloudFormation).
- Strong knowledge of monitoring/logging tools such as Prometheus, Grafana, ELK, and Splunk, focusing on backend telemetry and observability.
- Experience deploying applications on cloud platforms: AWS (EKS, ECS, Lambda, CloudFormation), Azure, or GCP.
- Familiarity with DevSecOps, secure coding practices, and compliance-aware architecture for regulated environments.
- Proficient in integration, load, and unit testing using JMeter, RestAssured, JUnit, etc.
- Leadership in backend architecture, performance tuning, platform modernization, and mentoring of technical teams.
- Effective cross-functional collaboration skills in multi-team, multi-region environments.

Preferred Technical And Professional Experience
- Deep understanding of backend architecture patterns including microservices, event-driven architecture, and domain-driven design.
- Experience implementing security and privacy by design principles in cloud-native backend systems.
- Hands-on expertise with cryptographic protocols and standards such as TLS and FIPS, and experience integrating with Java security frameworks (e.g., JCE, Spring Security).
- Strong grasp of secure coding practices, with experience identifying and mitigating OWASP Top 10 vulnerabilities.
- Exposure to designing and developing shared platform services or backend frameworks reused across products or tenants (e.g., in multi-tenant SaaS environments).
- Familiarity with API security patterns, including OAuth2, JWT, and API gateways (e.g., Kong, Apigee).
- Prior experience working on compliance-oriented systems (e.g., SOC2, HIPAA, FedRAMP) or architecting for high-assurance environments.
- Proficiency with Shell scripting, Python, or Node.js for infrastructure automation or backend utilities.
Posted 4 weeks ago
0.0 - 5.0 years
0 Lacs
Delhi, Delhi
On-site
Vacancy No.: S14969
Category of Contract: National
Position Type: National
Application Deadline: 07/08/2025
Job Posted On: 09/07/2025
Duty Station: Outposted. Role can be based in New Delhi, Bucharest, Ankara, Budapest
Country: GLOBAL, Different Locations
Duration: 12

Organizational Context
The International Federation of Red Cross and Red Crescent Societies (IFRC) is the world's largest humanitarian organization, with a network of 191 member National Societies. The overall aim of IFRC is "to inspire, encourage, facilitate, and promote at all times all forms of humanitarian activities by National Societies with a view to preventing and alleviating human suffering and thereby contributing to the maintenance and promotion of human dignity and peace in the world." IFRC works to meet the needs and improve the lives of vulnerable people before, during and after disasters, health emergencies and other crises.
IFRC is part of the International Red Cross and Red Crescent Movement (Movement), together with its member National Societies and the International Committee of the Red Cross (ICRC). The work of IFRC is guided by the following fundamental principles: humanity, impartiality, neutrality, independence, voluntary service, unity, and universality.
IFRC is led by its Secretary General, and has its Headquarters in Geneva, Switzerland. The Headquarters are organized into three main Divisions: (i) National Society Development and Operations Coordination; (ii) Global Relations, Humanitarian Diplomacy and Digitalization; and (iii) Management Policy, Strategy and Corporate Services. IFRC has five regional offices in Africa, Asia Pacific, Middle East and North Africa, Europe, and the Americas. IFRC also has country cluster delegations and country delegations throughout the world. Together, the Geneva Headquarters and the field structure (regional, cluster and country) comprise the IFRC Secretariat.

Background to the position
In virtually all countries, people increasingly rely on and expect a diverse range of data and digital services (e.g., through their mobile devices) to interact with local governments, companies, and community organizations and services. This disruption is already happening to humanitarian assistance. Yet, the Digital Divide remains a persistent and significant challenge at both national and local levels. The need for a successful and large-scale digital transformation is urgent. Furthermore, digitally transforming the IFRC and its 191 members is a complex process which requires collaborative action and support across the membership. Therefore, IFRC recently developed a Digital Transformation Strategy which was approved by the IFRC Governing Board in May 2021.
The Digital Transformation Department (DTD) has full leadership responsibility for the implementation of the digital transformation strategy and the positive impact it will have on the 191 National Society members of the IFRC. The DTD provides strategic leadership and guides the IFRC Secretariat as well as the members network to adapt and innovate humanitarian services, drawing on digital services, data-enabled decision-making, and other opportunities for digital transformation in support of the IFRC's Strategy 2030.

Job Purpose
The Data Platform Engineer is responsible for designing, implementing, and maintaining the global IFRC data platform solutions.
This role involves engineering scalable and reliable data solutions to enable data ingestion, storage, processing, and analysis, ultimately leading to reliable data-enabled decision making. The Data Platform Engineer will collaborate with multiple cross-functional teams at a global level, on a variety of projects that support the internal and external-facing activities of the Red Cross and Red Crescent Movement.

Job Duties and Responsibilities
- Data Platform Engineering: Design, implement, and manage end-to-end data solutions using Microsoft Azure services (Microsoft SQL Server, Azure SQL Database, Azure Data Lake, Azure Synapse Analytics, Microsoft Fabric).
- Data Ingestion and Integration: Develop and optimize data pipelines, ETL processes, and database performance using tools like SSIS, Azure Data Factory, and Databricks. Establish data quality checks and validation mechanisms during the ingestion process (a minimal sketch of such a check follows this listing).
- Data Storage and Management: Determine appropriate data storage technologies and structures (e.g., databases, data lakes, object storage) based on the organization's needs, and support teams in implementation of the proper solution. Develop data management strategies, including data partitioning, indexing, and archiving, to optimize performance and storage efficiency.
- Data Processing and Analytics: Design and implement data pipelines to transform and analyze data at scale, primarily utilizing the Microsoft technology stack. Select and configure appropriate processing technologies, such as distributed computing platforms, data processing frameworks, and streaming systems. Collaborate with data analysts and data scientists to ensure the platform supports advanced analytics and machine learning workloads, ensuring data accessibility and accuracy.
- Data Security and Governance: Contribute to data governance policies on relevant topics, such as security and storage of data. Monitor and optimize data platform performance and availability to ensure high availability and performance. Ensure data security, backup, and disaster recovery strategies are in place and effective.
- Collaboration and Stakeholder Management: Collaborate with cross-functional teams, including data engineers, data scientists, and business stakeholders, to understand their requirements and align the data platform accordingly. Explore and implement emerging features in Microsoft technologies (e.g., Fabric) and integrate them into our data architecture to support business goals. Provide technical guidance and training to team members on data management best practices and Microsoft data technologies.
- Contribute to an effective, high quality IFRC team: Support the unit manager with regular progress reports on results against objectives and responsibilities. Work in close consultation and develop partnerships with colleagues with data roles across the secretariat in Geneva and in the regions.

Job Duties and Responsibilities (continued)
Duties applicable to all staff:
- Work actively towards the achievement of the IFRC Secretariat's goals.
- Abide by and work in accordance with the Red Cross and Red Crescent principles.
- Perform any other work-related duties and responsibilities that may be assigned by the line manager.

Education
Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.

Experience
A minimum of 5 years of progressively responsible postgraduate experience in data platforms engineering.
Core expertise in the Microsoft Data Stack, in particular proficiency with Azure Data Factory, Azure Synapse Analytics, Azure Data Lake, and Azure SQL Database.
- Strong knowledge of Azure Cloud architecture and networking principles.
- Familiarity with CI/CD pipelines for data workflows (e.g., using Azure DevOps).
- Proficiency in Python, PowerShell, or similar scripting languages.
- Strong knowledge of data platform technologies, including data ingestion, storage, processing, and analytics.
- Strong experience with ETL tools like SQL Server Integration Services (SSIS) and Azure Data Factory.
- Familiarity with Microsoft Azure services such as Azure Data Lake, Azure Synapse Analytics, and Azure Databricks.
- Proficiency in database design, data warehousing, and data integration concepts.
- Proficiency in cloud platforms and technologies, such as AWS, Azure, or Google Cloud.
- Experience with big data technologies, such as Hadoop, Spark, and distributed storage systems.
- Familiarity with data governance, data security, and data privacy regulations (e.g., GDPR, CCPA).
- Experience within the RC/RC Movement and/or international humanitarian or development organizations is preferred.

Knowledge, Skills and Languages
- Strong strategic and conceptual thinking: setting meaningful, long-term vision and strategy, considering long-term potential, proposing challenging strategic goals.
- Propensity for embracing change and ambiguity: anticipating emerging conditions and demands, embracing widespread organisational change, navigating complex dynamics, viewing uncertainty and disruption as an opportunity.
- Ability to drive results and create a culture that fosters proactive action, actively prioritizes, and sets high standards.
- Developing others: pushing autonomy and empowerment, viewing people development as imperative, creating a culture of accountability.
- Data products development, business value development, data product deployment, and resource mobilization.
- Knowledge of applying artificial intelligence techniques, such as NLP and machine learning, is preferred.
- Data modelling and statistics are preferred.
- Strong presentation, written and oral communication skills; able to network effectively and influence and inspire others including peers, the membership and other stakeholders.
- Focused on quality and standards, results, and accountability.
- Excellent interpersonal skills; proven people management skills (staff and consultants), including conflict resolution.
- Preferred certifications: Microsoft Certified: Azure Data Engineer Associate; Microsoft Certified: Azure Solutions Architect Expert.
- Proactive approach to finding creative and constructive solutions to difficult issues.
- Proven teamwork and trust-building skills, including development of effective and efficient networks and partnerships within and outside of the organisation.
- Proven training, knowledge transfer and supervisory skills as part of people management.

Competencies, Values and Comments
Values: Respect for diversity; Integrity; Professionalism; Accountability
Core competencies: Communication; Collaboration and teamwork; Judgement and decision making; National society and customer relations; Creativity and innovation; Building trust
Functional competencies: Strategic orientation; Building alliances; Leadership; Empowering others
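As context for the ingestion-time data quality checks this listing calls for, here is a minimal sketch in Python with pandas; the column names and rules are hypothetical stand-ins for whatever a real pipeline would define in configuration:

```python
import pandas as pd

# Hypothetical required columns; a real pipeline would load rules from config.
REQUIRED_COLUMNS = {"record_id", "country", "reported_at", "beneficiaries"}

def validate_batch(df: pd.DataFrame) -> list[str]:
    """Return a list of data-quality findings for one ingested batch."""
    missing = REQUIRED_COLUMNS - set(df.columns)
    if missing:
        # Stop early: later checks assume these columns exist.
        return [f"missing columns: {sorted(missing)}"]
    findings = []
    if df["record_id"].duplicated().any():
        findings.append("duplicate record_id values")
    if df["reported_at"].isna().any():
        findings.append("null reported_at timestamps")
    if (df["beneficiaries"] < 0).any():
        findings.append("negative beneficiary counts")
    return findings

batch = pd.DataFrame({
    "record_id": [1, 2, 2],
    "country": ["IN", "RO", "TR"],
    "reported_at": pd.to_datetime(["2025-01-01", None, "2025-01-03"]),
    "beneficiaries": [120, -5, 40],
})
print(validate_batch(batch))  # flags the duplicate id, null date, negative count
```

In an Azure Data Factory or Databricks pipeline the same checks would typically run as a validation activity between the landing and curated zones, quarantining failing batches rather than rejecting the whole load.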
Posted 4 weeks ago
5.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Company Description
Innopad Solutions is a technology company specializing in software, web, and mobile app development services for startups and SMEs. With a team of 40+ developers skilled in cutting-edge technologies like .NET, Python, MERN/MEAN, React Native, Swift, and Kotlin, we offer rapid and cost-effective solutions. Our clientele ranges from small businesses to Fortune 500 companies, seeking our expertise to achieve their digital goals.

About Us
Innopad Solutions is a technology-driven company specializing in building custom software, mobile apps, and SaaS platforms for global clients. We're looking for a Lead .NET Developer to join our growing team in Ahmedabad, who can lead high-impact projects, communicate effectively with clients, and thrive in a fast-paced, collaborative environment.

Key Responsibilities
- Lead and manage .NET-based web and software development projects from planning to deployment.
- Collaborate directly with clients and stakeholders to gather requirements, provide updates, and guide technical decisions.
- Architect and implement solutions using .NET 8+, .NET Core, and .NET MVC.
- Build responsive, component-driven applications using Blazor.
- Work with MS SQL Server for designing, optimizing, and maintaining relational databases.
- Deploy and manage applications in Microsoft Azure and AWS environments.
- Maintain coding standards, conduct code reviews, and mentor junior developers.
- Optional but valuable: contribute to cross-platform development using MAUI or WPF, and to front-end development using React, Next.js, or Angular.

Required Skills and Qualifications
- 5+ years of hands-on experience with .NET Core, .NET MVC, and .NET 8 or later.
- Proficiency in Blazor for modern front-end development.
- Strong experience in MS SQL Server (queries, stored procedures, indexing, performance tuning).
- Experience with cloud hosting and deployment on Azure and/or AWS.
- Strong problem-solving, system design, and architecture skills.
- Excellent spoken and written English; this is a client-facing role with daily interactions.
- Ability to manage priorities and deliver high-quality solutions in a fast-paced environment.

Nice To Have
- Experience with MAUI or WPF for desktop or cross-platform app development.
- Exposure to DevOps, CI/CD pipelines, or Infrastructure as Code (IaC).

How To Apply
If you meet the above criteria and are ready to take on a leadership role, send your resume and portfolio to career@innopadsolutions.com
Posted 4 weeks ago
10.0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
Greetings from Ionixx!!!

Job Title: Senior Backend Engineer / Java API Developer
Experience: 10+ Years
Employment Type: Full-time, Remote

Job Overview
We are seeking a highly experienced Senior Backend Engineer with over 10 years of experience in backend development, specifically using Java, PostgreSQL, AWS core services, and Apache Kafka. The ideal candidate will have a strong background in designing, developing, and maintaining scalable APIs and distributed systems in a cloud-based environment.

Key Responsibilities
- Design, develop, and maintain scalable and secure RESTful APIs using Java (Spring Boot or similar frameworks).
- Optimize and manage PostgreSQL databases, including schema design, indexing, and performance tuning.
- Build and maintain data pipelines and real-time processing systems using Apache Kafka (see the consumer sketch after this listing).
- Leverage AWS core services (e.g., EC2, S3, Lambda, RDS, CloudWatch, DynamoDB) to build cloud-native applications.
- Collaborate with cross-functional teams, including product managers, front-end developers, and QA, to deliver high-quality software.
- Participate in code reviews and design discussions, and mentor junior team members.
- Troubleshoot production issues and ensure high system availability and performance.

Required Qualifications
- 10+ years of hands-on experience in backend development.
- Strong proficiency in Java, with experience in frameworks like Spring Boot, Dropwizard, or similar.
- Deep understanding of PostgreSQL: SQL queries, indexing, performance optimization, and database design.
- Hands-on experience with Apache Kafka for building event-driven and real-time applications.
- Proven experience working with AWS core services in a production environment.
- Solid understanding of microservices architecture, security best practices, and REST API standards.
- Experience with version control systems like Git and CI/CD tools.
- Strong problem-solving skills and the ability to work independently or in a team.

Preferred Qualifications
- Experience with containerization tools like Docker and orchestration tools like Kubernetes.
- Exposure to monitoring and logging tools like Prometheus, Grafana, CloudWatch, or the ELK stack.
- Experience working in Agile/Scrum environments.

(ref:hirist.tech)
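For readers unfamiliar with the Kafka work described above: the core pattern is a consumer group that processes events and commits offsets only after successful handling, giving at-least-once delivery. The role itself is Java-centric; as a compact, hypothetical illustration, here is the same pattern in Python with the kafka-python client (broker address, topic, and group id are placeholders):

```python
import json
from kafka import KafkaConsumer  # pip install kafka-python

# Broker address and topic name are hypothetical placeholders.
consumer = KafkaConsumer(
    "orders.events",
    bootstrap_servers="localhost:9092",
    group_id="order-projector",
    auto_offset_reset="earliest",
    enable_auto_commit=False,  # commit only after successful processing
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # Handling should be idempotent (e.g., key downstream writes on an
    # event id) so a redelivered message does not double-apply.
    print(message.topic, message.partition, message.offset, event)
    consumer.commit()  # at-least-once delivery semantics
```

In Spring Boot the equivalent would be a `@KafkaListener` with manual acknowledgment; the offset-after-processing discipline is the same.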
Posted 1 month ago
4.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Designation: PostgreSQL Developer
No. of Positions: 1
Location: Ahmedabad
Qualification: BE/B.Tech (IT or CS), MCA, or certifications
Experience range: Minimum 4 years of experience as a PostgreSQL Developer. Experience of working as a DBA will be an added advantage.

Critical skills required:
1. PostgreSQL developer skills are essential.
2. Sound operating knowledge of Windows/UNIX/Linux platforms.
3. Should have worked on EnterpriseDB.
4. Should have PostGIS experience (optional).

Cost to company: Best in the industry

Roles and Responsibilities:
1. Planning, installation and configuration of PostgreSQL 9.4/9.6/10/11/12.
2. PostgreSQL tools installation and configuration: pgAdmin III/IV, repmgr, Pgpool, PgBouncer, pgBadger.
3. Performance tuning by enabling and adjusting configuration parameters.
4. Advanced Server versions 9.x/10.x/11.x installation, configuration, backup & recovery, high availability, security and performance tuning.
5. Coordinate with and support other teams to resolve troubleshooting issues.
6. Manage production databases in a 24/7 environment.
7. Use pg_dump, pg_restore, pg_basebackup and other PostgreSQL built-in packages.
8. Upgrade PostgreSQL servers, e.g., from 9.4 to 9.6 and from 9.6 to 10.
9. Design, develop and maintain PostgreSQL database servers.
10. Prepare and update scripts for various administrative tasks; write UNIX shell scripts for scheduling backup jobs and day-to-day routine tasks.
11. Configuration of streaming replication using replication slots.
12. Conduct Linux administration activities and manage test databases.
13. Check load balancing, replication, and connections.
14. Work with PostGIS, pgRouting, and Multicorn extensions.
15. Create databases and indexes on tables.
16. Check base backup files and maintain physical storage on servers.
17. Create users, roles, and necessary privileges for users according to business requirements, with a password policy.
18. Re-index and vacuum databases from time to time (see the maintenance sketch after this listing).
19. Provide technical support to the applications team for DB-related issues.
20. Data loading and data migration.
21. Prepare the database design and E-R diagram.

Company Profile
Nascent Info Technologies Pvt Ltd is an Ahmedabad-based, CMMI Dev Level 3 certified IT/ITeS company engaged in software and applications development as well as digital communications since its inception. Nascent Info Technologies specializes in providing services such as software design and development, product planning and development, mobile apps development, datacentre management, datacentre consultancy and technical support, and GIS application development and deployment. Our expertise helps in reducing costs and enhancing output by bringing the strategic advantage of software outsourcing. Nascent works with PHP, Java and other open source technologies, adding value to information systems through R&D and crafting machine intelligence in line with human intelligence.
Nascent has developed a wide range of mobile apps for smartphones: useful apps for travellers and book readers, comprehensive apps for conducting surveys, integrated apps for ERP, and decision-making mobile apps for tourism. In GIS, we deal in GIS-based products and services. Nascent has developed decision-making web-based tools for the hospitality and power sectors, and has provided services to urban development authorities, municipal corporations, and the private sector.
Contact: hr@nascentinfo.com
Company: www.nascentinfo.com
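To illustrate items 11 and 18 above (replication health checks and routine vacuum/re-index work), here is a minimal, hypothetical Python sketch using psycopg2; the connection settings and table name are placeholders:

```python
import psycopg2  # pip install psycopg2-binary

# Connection settings are placeholders; point them at your primary server.
conn = psycopg2.connect(host="localhost", dbname="postgres", user="postgres")
conn.autocommit = True  # VACUUM cannot run inside a transaction block

with conn.cursor() as cur:
    # One row per connected standby; the *_lag columns need PostgreSQL 10+.
    cur.execute("""
        SELECT application_name, state, sync_state,
               write_lag, flush_lag, replay_lag
        FROM pg_stat_replication
    """)
    for row in cur.fetchall():
        print(row)

    # Routine maintenance on a hypothetical table.
    cur.execute("VACUUM (VERBOSE, ANALYZE) public.bookings")
    cur.execute("REINDEX TABLE public.bookings")

conn.close()
```

In practice autovacuum handles most of this; explicit VACUUM/REINDEX runs are usually reserved for bloated tables identified from monitoring, and scheduled via cron or a shell script as item 10 describes.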
Posted 1 month ago
1.0 - 3.0 years
3 - 6 Lacs
New Delhi, Gurugram
Work from Office
Hiring for US / UK Travel BPO with Meta / PPC call experience.
- Cruise / flight sales experience is a must
- Fluent English communication
- Open for immediate joining; rotational shift is a must
- No other process experience can apply
Call Shristi: 7838882457

Required Candidate profile
Current profile and salary brackets:
- Customer Support: 30 to 45k
- Sales: 40 to 65k (PPC/Meta/Cruise)
- SEO: 30k
- QA: up to 35k

Perks and benefits: Both-side transport; meal incentive
Posted 1 month ago
3.0 years
0 Lacs
Gurugram, Haryana, India
Remote
Job Title: AI Engineer / Developer
Location: Gurugram
Job Type: Full-Time
Shift: EST shift (7pm IST-4am IST)

Company Overview:
Serigor Inc is a Maryland-based, CMMI L3, Woman Owned Small Business (WOSB) specializing in IT Services, IT Staff Augmentation, Government Solutions and Global Delivery. Founded in 2009, we are a leading IT services firm that delivers deep expertise, objective insights, a tailored approach and unparalleled collaboration to help US government agencies and Fortune 500 companies confidently face the future while increasing the efficiency of their current operations. Our professional services primarily focus on our ITS services portfolio, including but not limited to Managed IT Services, Enterprise Application Development, Testing and Management Consulting, Salesforce, Cloud and Infrastructure Consulting, DevOps Consulting, Migration Consulting, Service Management, Custom Implementation and IT Operations & Maintenance, and Remote Application & Infrastructure Monitoring and Management practices.

Position Overview:
We are seeking a talented and self-driven AI Engineer / Developer to join our team and contribute to cutting-edge projects involving large language models (LLMs) and document intelligence. This role offers the flexibility to work remotely and can be structured as full-time or part-time, depending on your availability and interest. You will play a critical role in leveraging generative AI to extract structured insights from unstructured content, refine prompt engineering strategies, and build functional prototypes that bridge AI outputs with real-world applications. If you're passionate about NLP, LLMs, and building AI-first solutions, we want to hear from you.

Key Responsibilities:
- Document Intelligence: Leverage large language models (e.g., OpenAI GPT, Anthropic Claude) to analyze and extract meaningful information from various types of documents, including PDFs, contracts, compliance records, and reports.
- Data Structuring: Convert natural language outputs into structured data formats such as JSON, tables, custom templates, or semantic tags for downstream integration (see the extraction sketch after this listing).
- Prompt Engineering: Design, write, and iterate on prompts to ensure high-quality, repeatable, and reliable responses from AI models.
- Tooling & Prototyping: Develop lightweight tools, scripts, and workflows (using Python or similar) to automate, visualize, and test AI interactions.
- Model Evaluation: Run controlled experiments to evaluate the performance of AI-generated outputs, identifying gaps, edge cases, and potential improvements.
- Pipeline Integration: Collaborate with software engineers and product teams to integrate LLM pipelines into broader applications and systems.
- Traceability & Transparency: Ensure each piece of extracted information can be traced back to its original source within the document for auditing and validation purposes.

Required Skills & Qualifications:
- Experience: Minimum of 3 years in AI/ML development, with a strong focus on natural language processing (NLP), document analysis, or conversational AI.
- LLM Expertise: Hands-on experience working with large language models (e.g., GPT-4, Claude, Mistral) and prompt-based interactions.
- Programming Skills: Proficient in Python and experienced with modern AI frameworks such as LangChain, Hugging Face Transformers, or spaCy.
- Document Processing: Knowledge of embeddings, chunking strategies, and vectorization techniques for efficient document indexing and retrieval.
- Vector Databases: Familiarity with FAISS, Chroma, Pinecone, or similar vector DBs for storing and querying embedding data.
- Analytical Mindset: Strong ability to design, run, and interpret structured tests to measure and enhance the accuracy of AI outputs.

Preferred Qualifications:
- RAG Workflows: Experience implementing Retrieval-Augmented Generation (RAG) systems for dynamic document querying and synthesis.
- Domain Exposure: Familiarity with legal, regulatory, or compliance-based documents and the unique challenges they pose.
- LLMOps & Deployment: Exposure to deploying AI models or pipelines, including experience with web APIs, LLMOps tooling, or cloud-native AI environments.
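To make the data-structuring responsibility concrete: the usual pattern is prompt, then JSON, then schema validation, so malformed model output is rejected rather than passed downstream. A minimal sketch with Pydantic follows; the schema, prompt, and the stubbed call_llm function are hypothetical (a real implementation would call OpenAI, Claude, or similar):

```python
import json
from pydantic import BaseModel, Field  # pip install pydantic (v2)

class ContractClause(BaseModel):
    """Target schema for one extracted clause; source_page gives traceability."""
    clause_type: str
    summary: str
    source_page: int = Field(ge=1, description="Page the clause came from")

PROMPT_TEMPLATE = (
    "Extract every obligation clause from the contract text below. "
    "Respond with ONLY a JSON array of objects with keys "
    "clause_type, summary, source_page.\n\n{document}"
)

def call_llm(prompt: str) -> str:
    """Placeholder for a real model call (OpenAI, Anthropic, etc.)."""
    return '[{"clause_type": "payment", "summary": "Net-30 invoicing", "source_page": 4}]'

def extract_clauses(document: str) -> list[ContractClause]:
    raw = call_llm(PROMPT_TEMPLATE.format(document=document))
    # Validation rejects malformed model output instead of passing it on.
    return [ContractClause.model_validate(item) for item in json.loads(raw)]

print(extract_clauses("...contract text..."))
```

Keeping a source pointer (here, source_page) in the schema is what makes the traceability requirement auditable: every structured field can be checked against the original document.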
Posted 1 month ago
7.0 - 11.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Skill required: Marketing Operations - Digital Asset Management (DAM)
Designation: Digital Content Management Specialist
Qualifications: Any Graduation
Years of Experience: 7 to 11 years

About Accenture
Accenture is a global professional services company with leading capabilities in digital, cloud and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners and communities. Visit us at www.accenture.com

What would you do?
Help balance increased marketing complexity and diminishing marketing resources. Drive marketing performance with deep functional and technical expertise, while accelerating time-to-market and operating efficiencies at scale through Data and Technology, Next Generation Content Services, Digital Marketing Services & Customer Engagement and Media Growth Services. The role requires Digital Marketing Ads & Promotion creation/design, and consists of management tasks and decisions surrounding the ingestion, annotation, cataloguing, storage, retrieval and distribution of digital assets.

What are we looking for?
- Problem solving and research skills
- Project management skills
- Ability to work under pressure and to work on multiple projects concurrently
- Excellent organizational and communication skills
- Effective verbal and written communication skills
- Demonstrates collaborative and professional work ethic
- Willingness to be flexible and respond to quickly changing priorities
- Experienced in one or more digital asset management technologies
- Ability to design systems for tagging images and cataloging taxonomy
- Knowledge of Adobe Creative Cloud software
- Knowledge of Capture One (for photo assets)
- In-depth knowledge of file formats (print, digital, and video)
- Basic knowledge of Search Engine Optimization (SEO)
- Organizational skills to develop a successful strategy that expands future digital content discovery and delivery to the CMS
- Basic knowledge of talent contracts and Rights Management
- Working knowledge of media usage rights terminology
- Familiarity with legal approval processes in the licensing industry

Roles and Responsibilities:
The DAM librarian manages internal company assets and provides skills in content and digital asset management (i.e., taxonomy, indexing, cataloging, archiving, metadata tagging, content management systems). As a function reporting to Creative Services, the Digital Asset Management (DAM) team strategically implements and manages the DAM tool which will house all final creative assets, including, but not limited to, graphic design elements, photography, packaging, catalog content, iconography, video, music, and content. This role will need to work with the users of the DAM system to support their use and navigation around the solution. The DAM Librarian will be the first point of contact for the users and be responsible for working collaboratively in order to understand their working processes with respect to metadata and taxonomy. This requires working closely with the other content teams, such as CMS, Platform Marketing, Regulatory Operations, Labeling, and Publishing, to ensure consistent metadata standards are used.
In addition, the DAM Librarian is responsible for managing schemas and dictionaries within the DAM system. This position will influence the strategic set-up and business use of digital assets and unstructured content (i.e., videos, images, etc.). It will help establish process / user management standards, including best practice metadata, taxonomy, permissions and workflows that span the company's global businesses. It will lead the alignment of taxonomy across company BUs and markets. It will provide support for user accounts and training, and establish and track KPIs. It will be the functional steward for the "single source of the truth" by making DAM the master resource for digital assets.

Primary Responsibilities and Duties:
- Prepare a long-term plan for categorizing, indexing and archiving all content and information resources, whether they be generated in-house or derived from third-party agents
- Develop a comprehensive taxonomy for organizing information resources based on business goals and requirements from stakeholders
- Assess, recommend, and purchase corporate library development tools as required, and track new standards and methodologies
- Compile and maintain a detailed inventory of existing electronic and print marketing tactic resources, identify knowledge gaps, and make recommendations
- Properly select and annotate a large volume of media into the digital asset management system to facilitate retrieval and use in production
- Evaluate digital assets for archiving, research rights issues, and maintain quality control of archiving operations
- Support effective working relationships
Posted 1 month ago
10.0 years
0 Lacs
Pune, Maharashtra, India
Remote
About Fusemachines
Fusemachines is a 10+ year old AI company, dedicated to delivering state-of-the-art AI products and solutions to a diverse range of industries. Founded by Sameer Maskey, Ph.D., an Adjunct Associate Professor at Columbia University, our company is on a steadfast mission to democratize AI and harness the power of global AI talent from underserved communities. With a robust presence in four countries and a dedicated team of over 400 full-time employees, we are committed to fostering AI transformation journeys for businesses worldwide. At Fusemachines, we not only bridge the gap between AI advancement and its global impact but also strive to deliver the most advanced technology solutions to the world.

About the role:
This is a remote, full-time consulting position (contract) responsible for designing, building, and maintaining the infrastructure required for data integration, storage, processing, and analytics (BI, visualization and advanced analytics) to optimize digital channels and technology innovations, with the end goal of creating competitive advantages for the food services industry around the globe.
We're looking for a solid lead engineer who brings fresh ideas from past experiences and is eager to tackle new challenges. We're in search of a candidate who is knowledgeable about and loves working with modern data integration frameworks, big data and cloud technologies. Candidates must also be proficient with data programming languages (Python and SQL), AWS cloud and the Snowflake Data Platform. The data engineer will build a variety of data pipelines and models to support advanced AI/ML analytics projects, with the intent of elevating the customer experience and driving revenue and profit growth globally.

Qualification & Experience:
- Must have a full-time Bachelor's degree in Computer Science or similar from an accredited institution
- At least 3 years of experience as a data engineer with strong expertise in Python, Snowflake, PySpark, and AWS
- Proven experience delivering large-scale projects and products for Data and Analytics, as a data engineer

Skill Set Requirement:
- Vast background in all things data-related
- 3+ years of real-world data engineering development experience in Snowflake and AWS (certifications preferred)
- Highly skilled in one or more programming languages; must have Python, and be proficient in writing efficient and optimized code for data integration, storage, processing, manipulation and automation
- Strong experience in working with ELT and ETL tools and the ability to develop custom integration solutions as needed, from different sources such as APIs, databases, flat files, and event streaming, including experience with modern ETL tools such as Informatica, Matillion, or DBT; Informatica CDI is a plus
- Strong experience with scalable and distributed data technologies such as Spark/PySpark, DBT and Kafka, to handle large volumes of data
- Strong programming skills in SQL, with proficiency in writing efficient and optimized code for data integration, storage, processing, and manipulation
- Strong experience in designing and implementing Data Warehousing solutions in AWS with Snowflake
- Good understanding of data modelling and database design principles, and the ability to design and implement efficient database schemas that meet the requirements of the data architecture to support data solutions
- Proven experience as a Snowflake developer, with a strong understanding of Snowflake architecture and concepts
- Proficient in Snowflake services such as Snowpipe, stages, stored procedures, views, materialized views, tasks and streams
- Robust understanding of data partitioning and other optimization techniques in Snowflake
- Knowledge of data security measures in Snowflake, including role-based access control (RBAC) and data encryption
- Experience with Kafka, Pulsar, or other streaming technologies
- Experience orchestrating complex task flows across a variety of technologies; Apache Airflow preferred
- Expert in cloud computing on AWS, including deep knowledge of a variety of AWS services like Lambda, Kinesis, S3, Lake Formation, EC2, ECS/ECR, IAM, CloudWatch, EKS, API Gateway, etc.
- Good understanding of data quality and governance, including implementation of data quality checks and monitoring processes to ensure that data is accurate, complete, and consistent
- Good problem-solving skills: able to troubleshoot data processing pipelines and identify performance bottlenecks and other issues

Responsibilities:
- Follow established designs and constructed data architectures.
- Develop and maintain data pipelines (streaming and batch), ensuring data flows smoothly from source (point-of-sale, back of house, operational platforms and more of a Global Data Hub) to destination (see the batch-transform sketch after this listing).
- Handle ETL/ELT processes, including data extraction, transformation, and loading from various sources into Snowflake to enable best-in-class technology solutions.
- Play a key role in the Data Operations team, developing data solutions responsible for driving growth.
- Contribute to standardizing and developing a framework to extend these pipelines globally, across markets and business areas.
- Develop on a data platform by building applications using a mix of open-source frameworks (PySpark, Kubernetes, Airflow, etc.) and best-in-breed SaaS tools (Informatica Cloud, Snowflake, Domo, etc.).
- Implement and manage production support processes around data lifecycle, data quality, coding utilities, storage, reporting and other data integration points.
- Ensure the reliability, scalability, and efficiency of data systems are maintained at all times.
- Assist in the configuration and management of Snowflake data warehousing and data lake solutions, working under the guidance of senior team members.
- Work with cross-functional teams, including Product, Engineering, Data Science, and Analytics teams to understand and fulfill data requirements.
- Contribute to data quality assurance through validation checks and support data governance initiatives, including cataloging and lineage tracking.
- Take ownership of the storage layer and SQL database management tasks, including schema design, indexing, and performance tuning.
- Continuously evaluate and integrate new technologies to enhance data engineering capabilities, and actively participate in our Agile team meetings and improvement activities.

Fusemachines is an Equal Opportunity Employer, committed to diversity and inclusion. All qualified applicants will receive consideration for employment without regard to race, colour, religion, sex, sexual orientation, gender identity, national origin, disability, or any other characteristic protected by applicable federal, state, or local laws.
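As a small illustration of the batch side of such pipelines, here is a hedged PySpark sketch of a daily aggregation from a landing zone to a curated zone; the S3 paths and column names are hypothetical, not taken from the posting:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("pos-daily-aggregate").getOrCreate()

# Input path and columns are hypothetical; a real job would read from the
# landing zone defined by the pipeline configuration.
orders = spark.read.parquet("s3://landing-zone/pos/orders/")

daily_sales = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("store_id", "order_date")
    .agg(
        F.sum("total_amount").alias("gross_sales"),
        F.countDistinct("order_id").alias("order_count"),
    )
)

# A partitioned write keeps downstream Snowflake loads incremental:
# only new order_date partitions need to be ingested (e.g., via Snowpipe).
daily_sales.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://curated-zone/pos/daily_sales/"
)
```

An orchestrator such as Airflow would typically schedule this job per day and trigger the Snowflake load of the newly written partition as a downstream task.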
Posted 1 month ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: Database Designer
Location: Work From Office – Chennai
Employment Type: Contract

About The Role
We're hiring a Database Designer to build, structure, and optimize our database systems. You'll focus on creating reliable, efficient, and scalable databases that serve our products and internal tools.

Responsibilities
- Design logical and physical databases based on user and business requirements
- Translate data models into optimized database schemas
- Work closely with software developers and data teams to integrate systems
- Optimize performance and storage
- Maintain documentation for schemas, relationships, and procedures

Requirements
- 5+ years of experience in database design or data modeling
- Strong knowledge of RDBMSs like PostgreSQL, MySQL, or Oracle
- Familiarity with normalization, indexing, and performance tuning
- Hands-on experience with ER modeling tools (e.g., ERwin, Lucidchart)
- Strong SQL skills

Nice To Have
- Exposure to NoSQL databases (MongoDB, Cassandra)
- Understanding of data privacy and security standards

Skills: normalization, PostgreSQL, performance tuning, Cassandra, SQL, security standards, indexing, RDBMS, databases, MySQL, Oracle, AWS, NoSQL, data privacy, MongoDB, ER modeling tools, data modeling
Posted 1 month ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Summary
Job Description for Python Developer

Must have:

Python Programming:
- Proficiency in Python 3.12, with a strong understanding of its syntax and features.
- Experience in writing clean, readable, and maintainable code.
- Knowledge of Python's standard library, and data processing and transformation using the pandas library.
- Strong understanding of asynchronous programming paradigms in Python.
- Experience with async/await syntax and event loops.

Data Validation and Serialization:
- Proficiency in using Pydantic for data validation and settings management for Python-based applications.
- Understanding of type annotations and data modelling with Pydantic.

Web Frameworks and API Development:
- Experience in developing web applications using FastAPI.
- Knowledge of FastAPI's features such as dependency injection, routing, and authentication mechanisms.
- Ability to design and implement RESTful APIs with FastAPI (see the sketch after this listing).
- Writing unit test cases.

Database Management:
- Experience with MongoDB, including schema design, querying, indexing, and performance optimization.
- Familiarity with Motor, the async MongoDB driver for Python, and the ability to write asynchronous database operations.

Nice to have:

Microservices Architecture:
- Experience with designing, developing, and deploying microservices-based applications.
- Understanding of microservices patterns and principles, including service discovery, configuration management, and inter-service communication.
- Familiarity with container orchestration platforms like Kubernetes.

Software Development Best Practices:
- Familiarity with version control systems, preferably Git.
- Understanding of testing practices, including unit testing and integration testing.
- Experience with continuous integration/continuous deployment (CI/CD) workflows using Azure Pipelines or similar tools.
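Putting the must-have stack together (FastAPI + Pydantic + Motor), a minimal sketch of an async CRUD endpoint might look like the following; the connection string, database, and collection names are placeholders:

```python
from fastapi import FastAPI, HTTPException
from motor.motor_asyncio import AsyncIOMotorClient  # async MongoDB driver
from pydantic import BaseModel, Field

app = FastAPI()
# Connection string and database/collection names are hypothetical.
client = AsyncIOMotorClient("mongodb://localhost:27017")
orders = client["shop"]["orders"]

class Order(BaseModel):
    order_id: str
    amount: float = Field(gt=0)  # Pydantic enforces amount > 0
    status: str = "pending"

@app.post("/orders", status_code=201)
async def create_order(order: Order) -> Order:
    # The request body has already been validated against the Order schema.
    await orders.insert_one(order.model_dump())
    return order

@app.get("/orders/{order_id}")
async def get_order(order_id: str) -> Order:
    # Exclude Mongo's _id so the document maps cleanly onto the model.
    doc = await orders.find_one({"order_id": order_id}, {"_id": 0})
    if doc is None:
        raise HTTPException(status_code=404, detail="order not found")
    return Order(**doc)
```

Run with `uvicorn app:app`; because Motor is async, the event loop stays free while MongoDB round-trips are in flight, which is the point of the async/await requirement above.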
Posted 1 month ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Percona is looking for a flexible and efficient Senior C/C++ Software Engineer to join its Open Source Software Development team. You will work on Percona Server for MySQL, Percona XtraBackup, Percona XtraDB Cluster and Oracle MySQL. You will work from your home office, using online tools and resources to contribute to a fast-moving and high quality development environment for Percona and its customers. Travel approximately once per year for meetings. Access to a reliable high-speed internet connection is required. Flexible work hours. Laptop & internet access are provided.

What You Will Do:
- Design and implement new features and improvements
- Diagnose and fix defects in Percona Server for MySQL, Percona XtraBackup, Percona XtraDB Cluster and Oracle MySQL products
- Perform periodic source code merges from other open source repositories
- Contribute to upstream projects that we leverage to bring our own solutions to the community
- Develop test cases for continuous integration deployment
- Participate in code and design reviews
- Mentor and guide other team members in their own career development
- Blog, present and evangelize our software to help increase adoption and keep our thriving community growing
- Engage with other departments in Percona to ensure we're delivering value-driven solutions

What Have You Done:
- Bachelor's degree in Computer Science or a related field, or equivalent experience
- Ability to speak, listen and write effectively in fluent English
- Minimum 3+ years of experience in C/C++ development and object-oriented design
- Strong understanding of SQL databases, preferably MySQL
- Strong understanding of DBMS internal algorithms such as those used in the query optimiser, B-tree indexing, multiversion concurrency control, ARIES crash recovery, and others
- Strong understanding of compression and encryption algorithms and techniques
- Experience with client/server or distributed network communication systems
- Experience with high-concurrency threading models, atomics, and locking primitives
- Knowledge of various programming and scripting languages such as BASH, Perl, and Python
- Proficiency with tools like gdb and strace
- Proficiency with git, GitHub, and source code management methodologies
- Knowledge of Open Source software concepts and community

What Will Make You Stand Out:
- Knowledge of high-performance algorithms on modern multicore hardware
- Established history of driving performance enhancements
- Proven experience in implementing features in databases
- Experience with Continuous Integration and Continuous Delivery tools such as Jenkins, CircleCI, and Travis
- Experience with issue tracking, communication, and information sharing tools such as JIRA, G-Suite, Slack
- Experience/familiarity with various Linux distribution packaging systems
- Experience with virtualization and containment tools such as OpenVZ, VirtualBox, Docker, Kubernetes, etc.
- Experience in Linux systems administration, including suitable expertise with file systems, hardware, and networking
- Success working in a distributed environment where e-mail, Slack, and voice calls are the only interactions with clients, colleagues, and managers on a daily basis
- Ability to work autonomously and mostly asynchronously with the rest of the team

Why Percona?
At Percona, we believe an open world is a better world. Our mission is to enable everyone to innovate freely, by providing the best open source database software, support, and services.
We make databases and applications run better through a unique combination of expertise and open source software built with the community for you. Our technical teams are experts in MySQL, MongoDB, PostgreSQL, and MariaDB. Percona is proud to be a remote-only and globally dispersed workforce – we have colleagues in more than 50 countries! We offer a collaborative, highly-engaged culture where your ideas are welcome and your voice is heard. Our staff receives generous benefits including flexible work hours and various paid time off programs, all your equipment for your remote office, funds for career development (external training, certifications, conferences), ongoing connectivity allowances, and the opportunity to participate in our equity incentive plan. We also have benefits that support a healthy work/life balance such as The Percona Adventure Team, Work-from-Anywhere, FlowDays, FryDays, and overall flexibility. We also support being socially responsible through our PAVE volunteering program and Women Transforming Technology. If you love the idea of working with a high-growth tech company that is one of the best in the business and known globally as a leader in the open-source database space, let’s talk! Connect with us and stay up to date on our latest news and developments by following us on LinkedIn and Twitter. We look forward to connecting with you!
Posted 1 month ago
5.0 years
0 Lacs
India
On-site
We are seeking an experienced Oracle SQL and PL/SQL Performance Tuning Specialist to join our team. The ideal candidate will be responsible for analyzing, troubleshooting, and optimizing database queries and PL/SQL code to ensure maximum performance and scalability for mission-critical applications.

Key Responsibilities
- Analyze and improve the performance of complex SQL queries, PL/SQL packages, procedures, triggers, and functions.
- Identify bottlenecks in database applications and recommend tuning solutions.
- Work with developers to optimize database designs, indexing strategies, and SQL code.
- Perform root cause analysis for performance issues in production and non-production environments.
- Utilize Oracle tools (e.g., AWR, ADDM, SQL Trace, TKPROF, SQL Monitor) to monitor and analyze performance.
- Provide guidance and best practices for writing efficient SQL and PL/SQL code.
- Collaborate with DBAs, application developers, and architects to ensure performance considerations are addressed during development and deployment.
- Review execution plans and suggest query rewrites or optimizations (see the sketch after this listing).
- Participate in code reviews with a focus on performance aspects.
- Prepare documentation for tuning recommendations and performance baselines.

Required Skills and Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience in Oracle SQL and PL/SQL development and tuning.
- Strong knowledge of Oracle database architecture and internals.
- Expertise in interpreting Oracle execution plans.
- Experience using Oracle performance tools such as AWR, ADDM, SQL Trace, TKPROF, SQL Monitor, etc.
- Proficiency in writing complex SQL queries and PL/SQL code.
- Understanding of indexing strategies, partitioning, and optimizer hints.
- Knowledge of performance tuning in high-volume transactional systems.
- Strong analytical and problem-solving skills.
- Excellent communication and documentation skills.

Preferred Skills
- Experience with Oracle versions 12c, 19c, or later.
- Exposure to Exadata, RAC, or Oracle Cloud Infrastructure (OCI).
- Familiarity with performance testing tools.
- Understanding of data modeling and database design.
- Experience working in Agile environments.
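A typical first step in the execution-plan reviews described above is EXPLAIN PLAN followed by DBMS_XPLAN. Here is a minimal, hypothetical sketch using the python-oracledb driver; the credentials, DSN, and sample query are placeholders:

```python
import oracledb  # pip install oracledb

# Credentials and DSN are placeholders for illustration.
conn = oracledb.connect(user="app", password="secret", dsn="dbhost/ORCLPDB1")

query = """
    SELECT o.order_id, c.customer_name
    FROM orders o JOIN customers c ON c.customer_id = o.customer_id
    WHERE o.created_at > SYSDATE - 7
"""

with conn.cursor() as cur:
    # Populate PLAN_TABLE, then read the formatted plan back.
    cur.execute("EXPLAIN PLAN FOR " + query)
    cur.execute("SELECT plan_table_output FROM TABLE(DBMS_XPLAN.DISPLAY())")
    for (line,) in cur:
        # Look for full table scans, poor cardinality estimates, missing
        # index access paths, etc.
        print(line)

conn.close()
```

For plans of statements that have actually executed, DBMS_XPLAN.DISPLAY_CURSOR against V$SQL (or an AWR/SQL Monitor report) gives runtime statistics rather than the optimizer's estimate, which is usually where the tuning work starts.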
Posted 1 month ago
5.0 - 10.0 years
7 - 12 Lacs
Gurugram
Work from Office
Competencies / Experience:
PL/SQL experience: 5 years.
Creating CTEs, window functions, views, and stored procedures: 2 years (see the sketch below).
Hands-on experience with Snowflake: 2 years.
Strong programming experience in JavaScript: 1 year.
Additionally, familiarity with any of the following is highly desirable: Jira, GitHub, Python, AWS Athena, Tableau.
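To make the CTE and window-function requirement concrete, here is a minimal, hedged sketch of the style of query such a role involves; the sales table and its columns are illustrative only:

```sql
-- Hypothetical CTE + window-function example; table and column names
-- (sales, region, sale_date, amount) are illustrative.
WITH regional_sales AS (
    SELECT region,
           sale_date,
           amount,
           SUM(amount) OVER (PARTITION BY region)                       AS region_total,
           ROW_NUMBER() OVER (PARTITION BY region ORDER BY amount DESC) AS rn
    FROM   sales
)
SELECT region, sale_date, amount, region_total
FROM   regional_sales
WHERE  rn <= 3;   -- top three sales per region
```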
Posted 1 month ago
6.0 - 8.0 years
12 - 18 Lacs
Kochi, Chennai, Bengaluru
Hybrid
Greetings! Currently hiring: MSBI Developer & Support.
Role: MSBI Developer - Production Support
Shift: Night Shift
Exp: 6 to 8 Years
Notice: Immediate to 20 Days
Remote option available.
JD:
Work Type: Night Shift
SSIS:
Highly proficient in SSIS, with strong knowledge of ETL development, performance tuning, and data profiling.
Proficient in data migration from different sources using SSIS.
Able to independently and logically walk through existing SSIS packages, job scheduling, and debugging.
SQL:
Well-versed in T-SQL programming.
Highly proficient in complex SQL queries, indexing, stored procedures, functions, and joins (see the sketch below).
Performance tuning & optimizations.
Basic DB administration knowledge to handle common database procedures such as upgrade, backup, recovery, migration, etc.
Knowledge of best practices when dealing with relational databases.
SSRS:
Basic knowledge of SSRS report development & deployment.
Azure:
Profound expertise in SQL, T-SQL, database design, and data warehousing principles.
Strong experience with Microsoft Azure tools including SQL Azure, Azure Data Factory, Azure Databricks, and Azure Data Lake.
Proficient in Python, PySpark, and PySQL for data processing and analytics tasks.
Performance optimizations in database and query processing.
Excellent problem-solving, analytical, and communication skills.
Thanks,
Safoora Nausheen
Talent Acquisition
Mobile: 9384788107
Mail to - safoora.imthiyas@aspiresys.com
Website: www.aspiresys.com | Blog: http://blog.aspiresys.com
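As a concrete illustration of the T-SQL indexing and stored-procedure skills listed under SQL above, here is a minimal, hedged sketch; the Orders table, its columns, and the procedure name are hypothetical:

```sql
-- Hypothetical T-SQL sketch; dbo.Orders and its columns are illustrative only.
CREATE NONCLUSTERED INDEX IX_Orders_CustomerId_OrderDate
    ON dbo.Orders (CustomerId, OrderDate)
    INCLUDE (OrderTotal);          -- covering index avoids key lookups
GO

CREATE PROCEDURE dbo.usp_GetRecentOrders
    @CustomerId INT
AS
BEGIN
    SET NOCOUNT ON;
    -- Seek on the index above, newest orders first.
    SELECT TOP (50) OrderId, OrderDate, OrderTotal
    FROM   dbo.Orders
    WHERE  CustomerId = @CustomerId
    ORDER BY OrderDate DESC;
END;
GO
```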
Posted 1 month ago
6.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
The Business Lead Data Analyst role is responsible for ensuring timely and accurate definition and execution of Enterprise Risk Management (ERM) requirements according to the Citi Data Governance Policy (CDGP) and Standard (CDGS). The role reports into the Enterprise Risk Management Data & Technology ERM Data Governance Lead. This is an opportunity to contribute to a data program initiative supporting the development and creation of the data management and data governance model within ERM.
The Business Lead Data Analyst responsibilities include, but are not limited to, defining and documenting Enterprise Risk Management data requirements for senior management reporting needs, defining data lineage, identifying gaps in data quality controls to reduce risk, partnering and collaborating with ERM's data consumers and data providers on data quality remediation challenges; documenting data flows, processes, and procedures as needed; and ensuring data requirements are implemented, including key metrics measuring the successful implementation.
Focusing on commitments of each phase of the Enterprise Risk Management Data Plans, the Business Lead Data Analyst is instrumental in ensuring the documentation, collection, and indexing of artifacts, ensuring various data management program tools are kept up to date, project tracking, and ensuring adherence to Citi policy, standards, and guidelines for regulatory remediation efforts.
The role requires an understanding of data management, data governance, and data operating models; risk management or enterprise risk management; and risk and/or enterprise risk processes, systems, and reporting. The role requires a solid conceptual and practical grounding in Enterprise Risk Management, along with excellent systems and data analysis skills and strong presentation/communication skills to negotiate internally regarding conflicting priorities, often at a senior level.
Responsibilities:
Engage business data owners and technology teams to gather and document data requirements including data lineage, system and data flows, and data quality rules for critical Enterprise regulatory and management reports.
Aggregate, analyze, and document success criteria and metrics to measure progress against Benefits/Outcomes as defined by the report owner.
Manage data quality scorecards and ongoing monitoring controls to identify data quality issues, perform root-cause analysis, identify recommendations for improvement, and prioritize remediation (see the sketch at the end of this posting).
Support data accuracy, timeliness, and completeness by aligning work output to key data capabilities and tools including metadata repositories, data dictionaries, business process maps, metrics, controls, and scorecards.
Partner with data consumers and upstream data providers to agree on the scope of critical data quality challenges and ensure implementation of and adherence to Citi's Data Operating Model.
Identify and document key project risks and manage them to resolution or escalate accordingly.
Support execution of and alignment to Citi's Data Governance Policy (CDGP) and corresponding Standards.
Support Data Leads in Milestone and Deliverable execution, including gathering, storing, and publishing key project artifacts for closure.
Manage individual project responsibilities including task and action management, coordination and execution of plan activities, minutes, and status reporting within required timelines and to stakeholder quality expectations.
Support standing up governance forums, reporting, and tooling for ERM Data Operating Model implementation.
Coordinate between Enterprise Data Office (EDO), ERM, Risk Category, Risk Pillar, Finance, Technology, and PMO teams.
Provide project status reporting updates in coordination with respective PMO teams, including change controls, risks, issues, and path-to-green submissions.
Support the ERM Data & Tech Team in tracking and remediation of RAID log items.
Distribute meeting invites as needed based on cadence and audience identified by the ERM Data Use Case (project) leads.
Provide support documenting meeting minutes and action items in a centralized location.
Qualifications:
6-8 years of relevant experience with business systems and/or data analysis.
5+ years of experience in the banking and financial services industry.
Extensive experience working in data governance, data management, or related roles, including support of data policies and standards.
Proven experience driving data quality initiatives, aligning business processes with data, data standards/policies, and data-related issue management and remediation efforts.
Experience with implementing data technology solutions and capabilities and/or working on large cross-functional business initiatives.
Strong understanding of data governance principles, frameworks, and best practices, including supporting data quality initiatives, aligning business processes with data, implementing technology data quality solutions, data-related issue management and remediation efforts, and working on large, global, cross-line-of-business initiatives.
Enterprise risk management or risk management category (e.g., Markets, Wholesale, Credit, Operational) experience preferred.
Experience with data management processes, tools, and applications, including process mapping and lineage toolsets.
Ability to communicate (both verbally and in writing) in a clear, confident, and open-minded manner to build trusting partnerships with a wide variety of audiences and stakeholders.
Proven relationship management skills to partner and influence across organizational lines.
Ability to quickly grasp and master new concepts/requirements and related product/functional knowledge.
High-level professional proficiency in Excel, PowerPoint, and data flowcharts are must-have skills.
Should be an initiative-taking, highly focused, meticulous collaborator with high energy levels and the desire to learn and progress within the company.
Proven ability to work with large data volumes and a firm understanding of logical data structures and analysis techniques.
Ability to work independently, multi-task, and take ownership of various parts of a project or initiative.
Education:
Bachelor's/University degree; Master's degree preferred.
Certification in data governance or related areas (e.g., CDMP) is a plus.
------------------------------------------------------
Job Family Group: Decision Management
------------------------------------------------------
Job Family: Business Analysis
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Most Relevant Skills: Please see the requirements listed above.
------------------------------------------------------
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.
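For context, a data quality rule of the kind referenced above is often expressed as a simple completeness or validity check feeding a scorecard. The sketch below is illustrative only; the customer_accounts table, its columns, and any threshold are hypothetical:

```sql
-- Hypothetical completeness check backing a data quality scorecard entry;
-- table and column names are illustrative.
SELECT COUNT(*)                                            AS total_rows,
       SUM(CASE WHEN account_id IS NULL THEN 1 ELSE 0 END) AS missing_account_id,
       100.0 * SUM(CASE WHEN account_id IS NULL THEN 1 ELSE 0 END)
             / NULLIF(COUNT(*), 0)                         AS pct_missing
FROM   customer_accounts;
-- A scorecard rule might flag the feed when pct_missing exceeds an agreed threshold.
```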
------------------------------------------------------
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
Posted 1 month ago
3.0 - 8.0 years
11 - 15 Lacs
Pune
Work from Office
About the job:
The Red Hat Quality Engineering team is looking for a Quality Engineer to join us in Pune, India. In this role, you will write new and optimize existing test automation scripts and frameworks. You'll also analyze non-deterministic failures, find ways to reduce them, and report bugs and defects. This team is home to some of the most well-known faces in the international QE community and will set you up for an exciting and rewarding SDET journey. Our team culture encourages you to come up with innovative solutions to problems and allows you to work with some of the brightest engineers in the open-source industry.
What will you do:
Test new features and ensure fast feedback for developers.
Create and maintain testing artifacts.
Write automated test scripts, analyze test results, and identify and file bugs accordingly.
Test bugs and write bug reproducers.
Write API and performance test scripts.
What will you bring:
3+ years of experience in Quality Engineering.
Hands-on experience writing code in at least one programming language, preferably Python, Java, or JavaScript.
Strong understanding of HTTP communication protocols.
Expertise in testing and working with RESTful APIs.
Proficiency in API test automation.
Experience building and testing systems designed to handle large datasets, with attention to performance and scalability.
Familiarity with distributed systems and an understanding of challenges related to consistency, availability, and partitioning.
Solid grasp of core testing concepts, including functional, sanity, regression testing, etc.
Excellent observation skills with strong attention to detail.
Self-motivated, with the ability to take direction and collaborate effectively within a team.
Eagerness to learn new technologies and apply problem-solving skills.
Decision-making skills for day-to-day development.
Intermediate written and verbal communication skills in English.
Apache Solr, indexing, RestAssured, Python, and AI/ML experience is good to have.
About Red Hat:
Red Hat is the world's leading provider of enterprise open source software solutions, using a community-powered approach to deliver high-performing Linux, cloud, container, and Kubernetes technologies. Spread across 40+ countries, our associates work flexibly across work environments, from in-office, to office-flex, to fully remote, depending on the requirements of their role. Red Hatters are encouraged to bring their best ideas, no matter their title or tenure. We're a leader in open source because of our open and inclusive environment. We hire creative, passionate people ready to contribute their ideas, help solve complex problems, and make an impact.
Inclusion at Red Hat:
Red Hat's culture is built on the open source principles of transparency, collaboration, and inclusion, where the best ideas can come from anywhere and anyone. When this is realized, it empowers people from different backgrounds, perspectives, and experiences to come together to share ideas, challenge the status quo, and drive innovation. Our aspiration is that everyone experiences this culture with equal opportunity and access, and that all voices are not only heard but also celebrated. We hope you will join our celebration, and we welcome and encourage applicants from all the beautiful dimensions that compose our global village.
Equal Opportunity Policy (EEO):
Red Hat is proud to be an equal opportunity workplace and an affirmative action employer.
We review applications for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, ancestry, citizenship, age, veteran status, genetic information, physical or mental disability, medical condition, marital status, or any other basis prohibited by law. Red Hat supports individuals with disabilities and provides reasonable accommodations to job applicants. If you need assistance completing our online job application, email application-assistance@redhat.com. General inquiries, such as those regarding the status of a job application, will not receive a reply.
Posted 1 month ago
4.0 - 8.0 years
8 - 13 Lacs
Bengaluru
Work from Office
In an increasingly connected world, the pandemic has highlighted just how essential telecom networks are to keeping society running. The Network Infrastructure group is at the heart of a revolution in connectivity, pushing the boundaries to deliver more and faster network capacity to people worldwide through our ambition, innovation, and technical expertise.
Join the Optical Networks division, where innovation meets scale in the AI-driven data center era. With the recent acquisition of Infinera, we've united two industry leaders to create an optical networking powerhouse, combining cutting-edge technology with proven leadership to redefine the future of connectivity. Infinera is now part of the Nokia Corporation and its subsidiaries. When you apply, the information you share will be handled with care and used only for recruitment purposes within the group.
We are looking for experienced SW development engineers to join our R&D team.
Should have good working experience on Java Single Stack SAP PI/PO systems.
Should have strong expertise in all the standard
Should have strong Java knowledge for development of complex UDFs and Java mapping programs.
Knowledge of test tools such as SOAPUI, FTP, and Integration Directory, for B2B and A2A with SAP PI/PO.
A good understanding of PO installation, and experience of integrating SAP and non-SAP cloud.
Good to have: knowledge of SAP CPI.
Good communication skills.
Excellent client liaising skills.
The resource is expected to work closely on designing, configuring, developing, and troubleshooting integration scenarios, SLD, Integration Repository maintenance, user administration, and client administration.
The resource is expected to work closely with functional and application teams regarding the viability of to cloud
Supporting the design, development, and maintenance of SAP PI/PO interface solutions and implementation of new interfaces, interpreting functional requirements and developing
Developing and supporting complex Java mapping programs and custom UDFs.
Creating technical specifications, mapping documents, unit test documents, and functional unit testing.
Perform unit tests and support functional, integration, performance, and regression testing.
Preparing technical and user documentation for entire systems and interdependent applications.
Working with application teams and users to identify, troubleshoot, and remedy issues.
Posted 1 month ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Summary:
The SAP HANA DB Modeler/Developer is responsible for designing, developing, and maintaining data models within the SAP HANA database, utilizing advanced modeling techniques to optimize data access and analysis for reporting and applications, often collaborating with business analysts to translate requirements into efficient database structures while ensuring data integrity and performance within the HANA platform.
Must-have skills include: HANA views (analytical, calculation, etc.), SAP HANA XS JavaScript and XS OData services, advanced DB modelling for HANA, and SAP HANA Data Services ETL-based replication, among others. Minimum 2-3 end-to-end implementations.
Key responsibilities may include:
Data Modeling:
Designing and creating complex data models in SAP HANA Studio using analytical views, attribute views, calculation views, and hierarchies to represent business data effectively.
Implementing data transformations, calculations, and data cleansing logic within the HANA model.
Optimizing data structures for fast query performance and efficient data access.
Development:
Writing SQL scripts and stored procedures to manipulate and retrieve data from the HANA database (see the sketch below).
Developing custom HANA functions (CE functions) for advanced data processing and calculations.
Implementing data loading and ETL processes to populate the HANA database.
Performance Tuning:
Analyzing query performance and identifying bottlenecks to optimize data access and query execution.
Implementing indexing strategies and data partitioning for improved query performance.
Collaboration:
Working closely with business analysts to understand data requirements and translate them into technical data models.
Collaborating with application developers to integrate HANA data models into applications.
Security and Governance:
Implementing data security measures within the HANA database, defining user roles and permissions.
Maintaining data quality and consistency by defining data validation rules.
Required Skills:
Technical Skills:
Strong understanding of relational database concepts and data modeling principles.
Expertise in SAP HANA modeling tools and features (HANA Studio, calculation views, analytical views).
Proficiency in SQL and SQL optimization techniques.
Knowledge of data warehousing concepts and best practices.
Soft Skills:
Excellent analytical and problem-solving abilities.
Strong communication skills to collaborate with business users and technical teams.
Ability to work independently and as part of a team.
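To illustrate the SQLScript stored-procedure work mentioned under Development, here is a minimal, hedged sketch; the procedure name, the sales table, and its columns are hypothetical:

```sql
-- Hypothetical HANA SQLScript procedure; sales, region, amount, and sale_date
-- are illustrative names.
CREATE PROCEDURE get_region_totals (IN in_year INTEGER)
LANGUAGE SQLSCRIPT
SQL SECURITY INVOKER
READS SQL DATA AS
BEGIN
    -- Intermediate table variable, a common SQLScript pattern.
    filtered = SELECT region, amount
               FROM   sales
               WHERE  YEAR(sale_date) = :in_year;

    -- Final result set returned to the caller.
    SELECT region, SUM(amount) AS total_amount
    FROM   :filtered
    GROUP BY region;
END;
```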
Posted 1 month ago
0 years
0 Lacs
Indore, Madhya Pradesh, India
On-site
Are you ready to write your next chapter? Make your mark at one of the biggest names in payments. We're looking for a Software Engineer Specialist to join our ever-evolving Valutec team and help us unleash the potential of every business.
What you'll own as Software Engineer Specialist:
Needs to be a quick study. Strong Microsoft SQL developer; POS transaction experience preferred. Ability to read .NET C# code. Strong understanding of indexing, triggers, and stored procedures (see the sketch below). Works well with users; flexible hours, available to take calls when necessary.
What You'll Bring:
Strong SQL skills; works well with a Kanban team.
POS transaction experience.
Microsoft T-SQL.
Ability to deliver results and targets.
Excellent communication and interpersonal skills.
Ability to maintain internal and external stakeholder relationships.
Ability to work to tight deadlines in a fast-paced environment.
About The Team:
Our Tech and Security teams keep us moving each day, no matter where we are in the world. From the hardware to the networks and everything between, they humbly make it all happen. The Valutec team supports the Valutec platform, which is a closed-loop gift card platform. We provide customized reports to customers for their gift cards as well as making sure the platform is up and running. Work with internal groups such as IT and outside entities including internet service providers, financial institutions, legal counsel, and law enforcement in the implementation of policies and evaluation of specific fraud-related issues, and deter fraudulent online customer-related activity.
What Makes a Worldpayer:
What makes a Worldpayer? It's simple: Think, Act, Win. We stay curious, always asking the right questions and finding creative solutions to simplify the complex. We're dynamic; every Worldpayer is empowered to make the right decisions for their customers. And we're determined, always staying open and winning and failing as one.
Does this sound like you? Then you sound like a Worldpayer. Apply now to write the next chapter in your career.
LinkedIn # (Susmita Tripathy)
Privacy Statement:
Worldpay is committed to protecting the privacy and security of all personal information that we process in order to provide services to our clients. For specific information on how Worldpay protects personal information online, please see the Online Privacy Notice.
Sourcing Model:
Recruitment at Worldpay works primarily on a direct sourcing model; a relatively small portion of our hiring is through recruitment agencies. Worldpay does not accept resumes from recruitment agencies which are not on the preferred supplier list and is not responsible for any related fees for resumes submitted to job postings, our employees, or any other part of our company.
#pridepass
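As an illustration of the trigger knowledge this posting emphasizes, here is a minimal, hedged T-SQL sketch; the GiftCards table, its audit companion, and all column names are hypothetical:

```sql
-- Hypothetical audit trigger; dbo.GiftCards and dbo.GiftCardAudit are
-- illustrative names only.
CREATE TABLE dbo.GiftCardAudit (
    AuditId    INT IDENTITY(1,1) PRIMARY KEY,
    CardId     INT           NOT NULL,
    OldBalance DECIMAL(10,2) NOT NULL,
    NewBalance DECIMAL(10,2) NOT NULL,
    ChangedAt  DATETIME2     NOT NULL DEFAULT SYSUTCDATETIME()
);
GO

CREATE TRIGGER trg_GiftCards_BalanceAudit
ON dbo.GiftCards
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    -- Record every balance change by joining the inserted/deleted pseudo-tables.
    INSERT INTO dbo.GiftCardAudit (CardId, OldBalance, NewBalance)
    SELECT d.CardId, d.Balance, i.Balance
    FROM   inserted i
    JOIN   deleted  d ON d.CardId = i.CardId
    WHERE  i.Balance <> d.Balance;
END;
GO
```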
Posted 1 month ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Are you ready to write your next chapter? Make your mark at one of the biggest names in payments. We're looking for a Software Engineer Specialist to join our ever-evolving Valutec team and help us unleash the potential of every business.
What you'll own as Software Engineer Specialist:
Needs to be a quick study. Strong Microsoft SQL developer; POS transaction experience preferred. Ability to read .NET C# code. Strong understanding of indexing, triggers, and stored procedures. Works well with users; flexible hours, available to take calls when necessary.
What You'll Bring:
Strong SQL skills; works well with a Kanban team.
POS transaction experience.
Microsoft T-SQL.
Ability to deliver results and targets.
Excellent communication and interpersonal skills.
Ability to maintain internal and external stakeholder relationships.
Ability to work to tight deadlines in a fast-paced environment.
About The Team:
Our Tech and Security teams keep us moving each day, no matter where we are in the world. From the hardware to the networks and everything between, they humbly make it all happen. The Valutec team supports the Valutec platform, which is a closed-loop gift card platform. We provide customized reports to customers for their gift cards as well as making sure the platform is up and running. Work with internal groups such as IT and outside entities including internet service providers, financial institutions, legal counsel, and law enforcement in the implementation of policies and evaluation of specific fraud-related issues, and deter fraudulent online customer-related activity.
What Makes a Worldpayer:
What makes a Worldpayer? It's simple: Think, Act, Win. We stay curious, always asking the right questions and finding creative solutions to simplify the complex. We're dynamic; every Worldpayer is empowered to make the right decisions for their customers. And we're determined, always staying open and winning and failing as one.
Does this sound like you? Then you sound like a Worldpayer. Apply now to write the next chapter in your career.
LinkedIn # (Susmita Tripathy)
Privacy Statement:
Worldpay is committed to protecting the privacy and security of all personal information that we process in order to provide services to our clients. For specific information on how Worldpay protects personal information online, please see the Online Privacy Notice.
Sourcing Model:
Recruitment at Worldpay works primarily on a direct sourcing model; a relatively small portion of our hiring is through recruitment agencies. Worldpay does not accept resumes from recruitment agencies which are not on the preferred supplier list and is not responsible for any related fees for resumes submitted to job postings, our employees, or any other part of our company.
#pridepass
Posted 1 month ago
7.0 - 10.0 years
0 Lacs
Navi Mumbai, Maharashtra, India
Remote
Mizuho Global Services India Pvt. Ltd.
Mizuho Global Services Pvt Ltd (MGS) is a subsidiary company of Mizuho Bank, Ltd, which is one of the largest banks, or so-called 'Mega Banks', of Japan. MGS was established in the year 2020 as part of Mizuho's long-term strategy of creating a captive global processing centre for remotely handling banking and IT related operations of Mizuho Bank's domestic and overseas offices and Mizuho's group companies across the globe.
At Mizuho we are committed to a culture that is driven by ethical values and supports diversity in all its forms for its talent pool. The direction of MGS's development is paved by its three key pillars, Mutual Respect, Discipline, and Transparency, which are set as the baseline of every process and operation carried out at MGS.
Position: Senior Splunk SME
About the Role:
We are seeking a highly skilled and experienced Senior Splunk SME to join our dynamic team. You will play a pivotal role in leveraging Splunk's capabilities to drive actionable insights from our vast data sets, enabling us to make informed decisions and optimize our operations.
Roles and Responsibilities:
· Design, implement, and maintain Splunk environments, including data ingestion, indexing, search, and reporting.
· Develop and optimize Splunk dashboards, alerts, and reports to meet specific business requirements.
· Provide technical expertise and support for Splunk-related projects and initiatives.
· Troubleshoot and resolve Splunk-related issues in a timely manner.
· Collaborate with cross-functional teams to identify and address data security and compliance risks.
· Stay up to date on the latest Splunk technologies and best practices.
Additional Skills (Preferred):
· Experience with Splunk Enterprise Security (ES).
· Experience with Splunk Machine Learning Toolkit (MLTK).
· Experience with Splunk Cloud.
· Experience with scripting languages (e.g., Python, PowerShell).
Qualifications:
· Bachelor's degree in computer science, information technology, or a related field.
· Splunk certifications (e.g., Splunk Certified Administrator, Splunk Certified Developer).
· Strong problem-solving and analytical skills.
· Excellent communication and interpersonal skills.
· Ability to work independently and as part of a team.
Relevant Experience:
· 7-10 years of experience in Splunk administration, engineering, or a related field.
· Strong understanding of Splunk architecture, components, and workflows.
· Experience with designing and implementing Splunk search processing orders (SPOs).
· Experience with developing and optimizing Splunk dashboards, alerts, and reports.
· Experience with integrating Splunk with other enterprise applications and systems.
· Experience with data security and compliance best practices.
· Experience with cloud platforms (e.g., AWS, Azure, GCP).
Address: Mizuho Global Services India Pvt. Ltd, 11th Floor, Q2 Building Aurum Q Park, Gen 4/1, TTC, Thane Belapur Road, MIDC Industrial Area, Ghansoli, Navi Mumbai - 400710.
Interested candidates can send their resume to mgs.rec@mizuho-cb.com along with the below details:
Current CTC
Expected CTC
Notice period
Experience in SOC
Available for F2F?
Posted 1 month ago
5.0 years
0 Lacs
India
Remote
Client Type: US
Client Location: Remote
This is a 6-month freelance contract, offering up to 30 hours per week.
We are seeking a dynamic and knowledgeable Subject Matter Expert (SME) to play a key role in the development and delivery of a certification program focused on Microsoft Data Architecture for Modern Data Stacks. The successful candidate will leverage their deep expertise in designing, implementing, and managing modern data architecture using Microsoft Fabric, Azure, and Power Platform tools to shape the curriculum, create training materials, and empower learners with practical skills in areas such as unified data lakes, lakehouse and data warehousing patterns, data ingestion, transformation, orchestration, and advanced data governance. This role requires a strong passion for education, excellent communication skills, and the ability to effectively convey complex technical information to support beginner, intermediate, and advanced architects. As this is a freelance position, we're seeking individuals with a proven track record of successfully managing freelance engagements and multiple client relationships.
Responsibilities:
Collaborate with a team of learning experience designers to help create and validate training materials.
Review a skills task or job task analysis for accuracy and completeness, providing feedback on essential vs. nice-to-know tasks and suggesting improvements.
Review a high-level program outline and provide feedback on the order and complexity of topics for the intended audience.
Review a detailed program outline, ensuring alignment with the high-level program outline, and helping to confirm that content is presented in the correct order and format.
Validate AI-generated content to ensure it conforms to learning objectives and is technically accurate.
Support creation of content-specific graphics such as tables, flow charts, screen captures, etc.
Create any assets required to make demonstrations that showcase specific procedures and skills.
Create recordings of software demonstrations and related audio scripts.
Coordinate with learning experience designers to build out the assets, steps, and technical elements needed for hands-on projects such as exercises, labs, and projects.
Be available during US business hours, M-F, for content reviews, questions, and occasional meetings.
Work on the company's systems for all work, including email, messaging platform, and cloud-based file storage systems.
Log time weekly and invoice time monthly.
Essential Tools & Platforms:
Microsoft Fabric, OneLake, Fabric Lakehouse, Fabric Data Warehouse, Data Factory (Fabric), Dataflows Gen2, Event Streams, Data Activator, Microsoft Purview, Power BI, Copilot in Fabric, Copilot Studio, Azure Monitor, Azure Stream Analytics, Microsoft Entra ID (formerly Azure AD), SQL Server / T-SQL, Visual Studio Code (for development, if applicable for notebooks/scripts).
Required Skills & Experience:
Proven hands-on experience designing, building, and optimizing modern data architectures using Microsoft Fabric, including OneLake, Fabric Lakehouse, Fabric Data Warehouse, Data Factory (Fabric), Dataflows Gen2, Event Streams, Data Activator, Microsoft Purview, and Power BI.
Demonstrated ability to architect and implement unified data lakes with OneLake, leveraging open data formats (Delta, Parquet, Iceberg), and medallion architectures (bronze/silver/gold zones); a minimal example appears after this posting's qualifications.
Skilled in building and managing Lakehouse solutions using Delta tables, managed folders, and notebooks.
Expertise in designing, deploying, and querying Fabric Data Warehouses with advanced T-SQL, including schema design (star, snowflake, data vault), partitioning, indexing, and compute scaling.
Experience with distributed and replicated tables, and DirectLake mode for high-speed analytics.
Practical experience creating robust ETL/ELT pipelines using Data Factory (Fabric), mapping dataflows, notebooks, and SQL transformations.
Skilled in handling schema evolution, parameterization, error handling, and performance tuning.
Experience with real-time data processing using Event Streams and Data Activator.
Deep understanding of data governance using Microsoft Purview, including data cataloging, classification, sensitivity labeling, lineage visualization, and compliance mapping (GDPR, HIPAA).
Ability to define domain ownership, stewardship, and glossary terms within Purview.
Proficiency in enforcing identity and access control with Microsoft Entra ID, configuring row-level and column-level security, and applying RBAC and service principal authentication across Lakehouse, Warehouse, and Power BI.
Experience with auditing, monitoring, and securing data architectures.
Strong experience building reusable Power BI semantic models, defining DAX measures, implementing incremental refresh, and leveraging DirectLake connectivity.
Skilled in designing and publishing secure, interactive dashboards and embedded analytics solutions.
Ability to recognize and apply Lakehouse, Warehouse, Mesh, and hybrid patterns based on business needs.
Experience with performance optimization, cost control, modular design, and decentralized domain architectures.
Familiarity with Copilot in Fabric, Data Agents, and AI-powered automation in Power BI for pipeline generation, natural language querying, and workflow optimization.
Demonstrated success designing, reviewing, and delivering hands-on labs, real-world projects, and portfolio artifacts (architecture diagrams, pipeline configs, governance plans, BI reports) for intermediate data professionals.
Exceptional ability to articulate complex technical concepts clearly for diverse audiences.
Experience creating technical documentation, screencasts, and video tutorials.
Strong understanding of adult learning principles and instructional best practices.
Essential experience in training, learning and development, or teaching.
Experience in reviewing training materials for technical accuracy and clarity.
Availability to work during US business hours.
Qualifications:
Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related technical field.
5+ years of hands-on experience designing, building, and deploying complex data solutions specifically on Microsoft Fabric, Azure data services, and Power Platform, covering areas like data lakes, lakehouses, data warehouses, ETL/ELT, and governance.
Demonstrable expertise with key Microsoft data services and tools, including, but not limited to, Microsoft Fabric, OneLake, Fabric Lakehouse, Fabric Data Warehouse, Data Factory (Fabric), Microsoft Purview, Power BI, Event Streams, and Data Activator.
Proven experience in technical training, learning & development, or teaching, with a strong ability to create, review, and deliver high-quality, technically accurate training materials (e.g., course outlines, hands-on exercises, video tutorials).
Exceptional technical communication skills, including the ability to articulate complex concepts clearly to diverse audiences, and a strong understanding of adult learning principles.
Strong problem-solving, debugging, and communication skills.
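As referenced in the requirements above, here is a minimal, hedged sketch of a bronze-to-silver medallion step expressed in Fabric Data Warehouse T-SQL; the schemas, tables, and columns are illustrative only, and a real pipeline would typically add auditing and incremental-load logic:

```sql
-- Hypothetical bronze-to-silver transform; bronze.raw_orders and silver.orders
-- are illustrative names.
CREATE TABLE silver.orders
AS
SELECT CAST(order_id    AS INT)            AS order_id,
       CAST(order_date  AS DATE)           AS order_date,
       UPPER(TRIM(region))                 AS region,        -- light standardization
       CAST(order_total AS DECIMAL(12,2))  AS order_total
FROM   bronze.raw_orders
WHERE  order_id IS NOT NULL;               -- drop unusable rows at the silver layer
```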
Posted 1 month ago