
356 Parsing Jobs - Page 6

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Description
The Amazon Payment Experience Platform team at the Amazon India Development Center, Hyderabad, is looking for an SDM to build the next generation of payments platforms and products from the ground up. This is a rare opportunity to be part of a team responsible for building a successful, sustainable, and strategic business for Amazon. You will get the opportunity to manage Tier-1 platforms such as Reminders, SMS Parsing, and Bills and Recharge AutoPay systems. The team works across a diverse technology stack spanning SOA, UI frameworks, big data, and ML algorithms. You will help shape the product, be actively involved in defining key product features that impact the business, and be responsible for setting and upholding a high software-quality bar in a highly technical team of software engineers.

Basic Qualifications
- 3+ years of engineering team management experience
- Knowledge of engineering practices and patterns for the full software/hardware/networks development life cycle, including coding standards, code reviews, source control management, build processes, testing, certification, and livesite operations
- Experience partnering with product or program management teams

Preferred Qualifications
- Experience communicating with users, other technical teams, and senior leadership to collect requirements and describe software product features, technical designs, and product strategy
- Experience recruiting, hiring, mentoring/coaching, and managing teams of software engineers to improve their skills and make them more effective product software engineers

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information.

If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Company - ADCI HYD 13 SEZ
Job ID: A3002055

Posted 1 week ago

Apply

4.0 years

0 Lacs

Mumbai, Maharashtra, India

Remote


Our Technology team is the backbone of our company: constantly creating, testing, learning, and iterating to better meet the needs of our customers. If you thrive in a fast-paced, ideas-led environment, you’re in the right place.

Why This Job’s a Big Deal
Join an Agile team of professionals who are instrumental in building the next generation of travel applications. We constantly explore new technologies and engineer better solutions for ever-demanding business needs. Our engineers at all levels work with business leaders to define the product roadmap and come up with innovative solutions to grow the future of travel. We design and develop our back-end systems and REST APIs that serve hundreds of millions of searches a day, collecting and parsing data across thousands of partners to get the best deals for our customers.

In This Role You Will Get To
- Participate in mission-critical projects with direct impact on the evolution of Priceline's business.
- Be part of a cross-functional agile team that continuously experiments, iterates, and delivers on new product objectives.
- Showcase your development skills in Core Java or similar programming languages.
- Apply your programming skills toward building low-latency, high-throughput transactional services with continuous integration and automated testing.
- Apply SQL composition skills to collect and query data for real-time investigation and analysis from our applications.
- Use your understanding of our codebase, systems, and business requirements to make changes to our applications effectively.
- Collaborate and engage in team efforts, speak up for what you think are the best solutions, and converse respectfully and compromise when necessary.

Who You Are
- Bachelor’s degree or higher in Computer Science or a related field.
- 4+ years of experience in software engineering and development.
- Strong coding experience with Core Java.
- Thorough SQL composition skills for writing queries and analysis.
- Comfort and experience with Spring Boot and REST APIs.
- Experience with microservices is a must.
- Experience developing on the cloud, especially GCP, AWS, or Azure, is an advantage.
- An illustrated history of living the values necessary to Priceline: Customer, Innovation, Team, Accountability, and Trust. The Right Results, the Right Way is not just a motto at Priceline; it’s a way of life. Unquestionable integrity and ethics are essential.

Who We Are
WE ARE PRICELINE. Our success as one of the biggest players in online travel is all thanks to our incredible, dedicated team of talented employees. Priceliners are focused on being the best travel deal makers in the world, motivated by our passion to help everyone experience the moments that matter most in their lives. Whether it’s a dream vacation, your cousin’s graduation, or your best friend’s wedding, we make travel affordable and accessible to our customers.

Our culture is unique and inspiring (that’s what our employees tell us). We’re a grown-up startup: we deliver the excitement of a new venture without the struggles and chaos that can come with a business that hasn’t stabilized. We’re on the cutting edge of innovative technologies, and we keep the customer at the center of all that we do. Our ability to meet their needs relies on the strength of a workforce as diverse as the customers we serve. We bring together employees from all walks of life, and we are proud to provide the kind of inclusive environment that stimulates innovation, creativity, and collaboration.

Priceline is part of the Booking Holdings, Inc. (Nasdaq: BKNG) family of companies, a highly profitable global online travel company with a market capitalization of over $80 billion. Our sister companies include Booking.com, BookingGo, Agoda, Kayak, and OpenTable. If you want to be part of something truly special, check us out!

Flexible Work at Priceline
Priceline follows a hybrid working model, which includes two onsite days as determined by you and your manager (ideally selected from Tuesday, Wednesday, or Thursday). On the remaining days, you can choose to be remote or in the office.

Diversity and Inclusion Are a Big Deal!
To be the best travel dealmakers in the world, it’s important we have a workforce that reflects the diverse customers and communities we serve. We are committed to cultivating a culture where all employees have the freedom to bring their individual perspectives, life experiences, and passion to work. Priceline is a proud equal opportunity employer. We embrace and celebrate the unique lenses through which our employees see the world. We’d love you to join us and add to our rich mix!

Applying for This Position
We're excited that you are interested in a career with us. Current employees, please use the internal portal to find jobs and apply. External candidates are required to have an account before applying: when you click Apply, returning candidates can log in, and new candidates can quickly create an account to save and view applications.

Posted 1 week ago

Apply

5.0 years

0 Lacs

Delhi, India

Remote


Elastic, the Search AI Company, enables everyone to find the answers they need in real time, using all their data, at scale, unleashing the potential of businesses and people. The Elastic Search AI Platform, used by more than 50% of the Fortune 500, brings together the precision of search and the intelligence of AI to enable everyone to accelerate the results that matter. By taking advantage of all structured and unstructured data, and securing and protecting private information more effectively, Elastic’s complete, cloud-based solutions for search, security, and observability help organizations deliver on the promise of AI.

What Is The Role
You will have the opportunity to work with a tremendous services, engineering, and sales team and wear many hats. This is a meaningful role: as a Consulting Architect, Observability, you have an outstanding chance to create an immediate impact on the success of Elastic and our customers.

What You Will Be Doing
- Deliver Elastic solutions and Elastic Stack expertise to drive customer business value from our products.
- Work with clients to facilitate strategy, roadmap, design, and capacity-planning workshops in mission-critical environments.
- Advocate strongly for customers, build relationships, and communicate clearly.
- Work comfortably in a remote, highly distributed team.
- Develop demos and proofs of concept that highlight the value of the Elastic Stack and its solutions.
- Accelerate the adoption of Elastic solutions, including data modeling, query development and optimization, and cluster tuning and scaling, with a focus on fast search and analytics at scale.
- Drive and manage objectives, requirements gathering, project tasks and milestones, project status, dependencies, and timelines to ensure engagements are delivered optimally and on time while meeting business objectives.
- Work closely with engineering, product management, and support teams to identify feature improvements, extensions, and product defects, and facilitate feedback from the field back to the product.
- Engage with the Elastic sales team to scope opportunities while assessing technical risks, questions, or concerns.
- Mentor your team members.

What You Bring
- Bachelor’s, Master’s, or PhD in Computer Science or a related engineering field preferred, or an equivalent combination of education, training, and experience.
- A minimum of 5 years as a consultant, engineer, or architect.
- Experience with time-series data ingestion and end-to-end ingestion methods (Agent, Beats, and Logstash).
- Familiarity with messaging queues (Kafka, Redis).
- Experience with ingest optimization, data streams, and sharding strategy.
- Experience with ingest lag analysis and improvement.
- Knowledge of the Elastic Common Schema, data parsing, and normalization.
- Ability to enable customers to adopt the Elastic Observability solution and related out-of-the-box features.
- Ability to design and build custom visual artifacts, understand the key metrics that make valuable contributions for your customer, and identify thresholds for alerting.
- Familiarity with Fleet, agent installation policies, and scalability considerations.
- Knowledge of deploying enterprise observability (metrics and logs) solutions at scale: application performance monitoring (APM), user experience monitoring (UEM), infrastructure optimization, and network visibility and monitoring.
- Experience leading observability projects at both the architectural and program level.
- Experience working with monitoring tools that integrate into service management.
- Experience delivering and completing professional services engagements.
- Experience speaking to large audiences of engineers, developers, and other technical roles about enterprise infrastructure software technology.
- Hands-on experience with, and an understanding of, Elasticsearch and/or Lucene.
- Excellence at working directly with customers to gather, prioritize, plan, and implement solutions to customer business requirements as they relate to our technologies.
- An understanding of and passion for open-source technology, and proficiency in at least one programming language.
- Strong hands-on experience with large distributed systems and application infrastructure from an architecture and development perspective.
- Knowledge of the information retrieval and/or analytics domain.
- Understanding of and/or certification in one or more of the following technologies: Kubernetes, Linux, Java and databases, Docker, Amazon Web Services (AWS), Azure, Google Cloud (GCP), Kafka, Redis, VMs, Lucene.
- Availability for occasional travel, up to 20%.

Bonus Points
- Big 4 experience.
- Deep understanding of our product, including the Elastic Certified Engineer certification.
- Comfort with Ansible, JavaScript, or Terraform.
- ECK or Kubernetes experience.
- Knowledge of machine learning and artificial intelligence (AI).
- Proven understanding of Java and Linux/Unix environments, software development, and/or experience with distributed systems.
- Experience with and curiosity about delivering and/or developing product training.
- Experience contributing to an open-source project or documentation.

Additional Information - We Take Care Of Our People
As a distributed company, diversity drives our identity. Whether you’re looking to launch a new career or grow an existing one, Elastic is the type of company where you can balance great work with great life. Your age is only a number: it doesn’t matter if you’re just out of college or your children are; we need you for what you can do. We strive to have parity of benefits across regions, and while regulations differ from place to place, we believe taking care of our people is the right thing to do.

- Competitive pay based on the work you do here, not your previous salary
- Health coverage for you and your family in many locations
- Ability to craft your calendar with flexible locations and schedules for many roles
- Generous number of vacation days each year
- Increase your impact: we match up to $2,000 (or local currency equivalent) for financial donations and service
- Up to 40 hours each year to use toward volunteer projects you love
- Embracing parenthood with a minimum of 16 weeks of parental leave

Different people approach problems differently. We need that. Elastic is an equal opportunity/affirmative action employer committed to diversity, equity, and inclusion. Qualified applicants will receive consideration for employment without regard to race, ethnicity, color, religion, sex, pregnancy, sexual orientation, gender perception or identity, national origin, age, marital status, protected veteran status, disability status, or any other basis protected by federal, state, or local law, ordinance, or regulation. We welcome individuals with disabilities and strive to create an accessible and inclusive experience for all individuals. To request an accommodation during the application or recruiting process, please email candidate_accessibility@elastic.co; we will reply to your request within 24 business hours of submission.

Applicants have rights under federal employment laws; see the posters linked below: Family and Medical Leave Act (FMLA) Poster; Equal Employment Opportunity (EEO) Poster; Pay Transparency Nondiscrimination Provision Poster; Employee Polygraph Protection Act (EPPA) Poster; and Know Your Rights Poster.

Elasticsearch develops and distributes encryption software and technology that is subject to U.S. export controls and licensing requirements for individuals who are located in or are nationals of the following sanctioned countries and regions: Belarus, Cuba, Iran, North Korea, Russia, Syria, the Crimea Region of Ukraine, the Donetsk People’s Republic (“DNR”), and the Luhansk People’s Republic (“LNR”). If you are located in or are a national of one of the listed countries or regions, an export license may be required as a condition of your employment in this role. Please note that national origin and/or nationality do not affect eligibility for employment with Elastic.

Please see here for our Privacy Statement.
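The ingest-lag analysis called out in the qualifications above can be illustrated with a minimal Python sketch (not part of the posting; timestamps and field choices are illustrative): it measures the delay between an event's occurrence and the moment it was indexed.

```python
from datetime import datetime, timezone

def ingest_lag_seconds(event_ts, indexed_ts):
    """Lag between when an event occurred and when it was indexed."""
    return (indexed_ts - event_ts).total_seconds()

# Illustrative event and index timestamps.
event = datetime(2024, 6, 1, 12, 0, 0, tzinfo=timezone.utc)
indexed = datetime(2024, 6, 1, 12, 0, 42, tzinfo=timezone.utc)
print(ingest_lag_seconds(event, indexed))  # 42.0
```

In practice the same computation would be tracked across a stream of documents to spot growing ingest backlogs.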

Posted 1 week ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Solutions Architect / Technical Lead - AI & Automation

Key Responsibilities
Solution Architecture & Development:
- Design end-to-end solutions using Node.js (backend) and Vue.js (frontend) for custom portals and administration interfaces.
- Integrate Azure AI services, Google OCR, and Azure OCR into client workflows.
AI/ML Engineering:
- Develop and optimize vision-based AI models (Layout Parsing/LP, Layout Inference/LI, Layout Transformation/LT) using Python.
- Implement NLP pipelines for document extraction, classification, and data enrichment.
Cloud & Database Management:
- Architect and optimize MongoDB databases hosted on Azure for scalability, security, and performance.
- Manage cloud infrastructure (Azure) for AI workloads, including containerization and serverless deployments.
Technical Leadership:
- Lead cross-functional teams (AI engineers, DevOps, BAs) in solution delivery.
- Troubleshoot complex technical issues in OCR accuracy, AI model drift, or system integration.
Client Enablement:
- Advise clients on technical best practices for scaling AI solutions.
- Document architectures, conduct knowledge transfers, and mentor junior engineers.

Required Technical Expertise
- Frontend/Portal: Vue.js (advanced components, state management), Node.js (Express, REST/GraphQL APIs).
- AI/ML Stack: Python (PyTorch/TensorFlow), Azure AI (Cognitive Services, Computer Vision), NLP techniques (NER, summarization).
- Layout Engineering: LP/LI/LT for complex documents (invoices, contracts).
- OCR Technologies: Production experience with Google Vision OCR and Azure Form Recognizer.
- Database & Cloud: MongoDB (sharding, aggregation, indexing) hosted on Azure (Cosmos DB, Blob Storage, AKS); Infrastructure-as-Code (Terraform/Bicep); CI/CD pipelines (Azure DevOps).
- Experience: 10+ years in software development, including 5+ years specializing in AI/ML, OCR, or document automation, with a proven track record deploying enterprise-scale solutions in cloud environments (Azure preferred).

Preferred Qualifications
- Certifications: Azure Solutions Architect Expert, MongoDB Certified Developer, or Google Cloud AI/ML.
- Experience with alternative OCR tools (ABBYY, Tesseract) or AI platforms (GCP Vertex AI, AWS SageMaker).
- Knowledge of DocuSign CLM, Coupa, or SAP Ariba integrations.
- Familiarity with Kubernetes, Docker, and MLOps practices.
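As a hedged illustration of the MongoDB aggregation skills this listing calls for (not part of the posting; collection and field names are assumptions), an aggregation pipeline that groups OCR extraction results by document type and counts low-confidence fields can be expressed as plain Python dicts:

```python
def ocr_accuracy_pipeline(min_confidence=0.9):
    """Build a MongoDB aggregation pipeline (as plain Python dicts) that
    groups OCR results by document type and counts fields whose extraction
    confidence falls below a threshold. Field names are hypothetical."""
    return [
        # Keep only documents that carry a confidence score.
        {"$match": {"field_confidence": {"$exists": True}}},
        # Group by document type; count totals and low-confidence fields.
        {"$group": {
            "_id": "$doc_type",
            "total": {"$sum": 1},
            "low_confidence": {
                "$sum": {"$cond": [
                    {"$lt": ["$field_confidence", min_confidence]}, 1, 0
                ]}
            },
        }},
        # Surface the worst-performing document types first.
        {"$sort": {"low_confidence": -1}},
    ]

pipeline = ocr_accuracy_pipeline()
print(len(pipeline))  # 3 stages
```

Such a pipeline would be passed to `collection.aggregate(pipeline)` against the hypothetical collection; building it as data makes the stages easy to unit-test.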

Posted 1 week ago

Apply

10.0 years

2 - 7 Lacs

Hyderābād

On-site

Key Responsibilities
Solution Architecture & Development:
- Design end-to-end solutions using Node.js (backend) and Vue.js (frontend) for custom portals and administration interfaces.
- Integrate Azure AI services, Google OCR, and Azure OCR into client workflows.
AI/ML Engineering:
- Develop and optimize vision-based AI models (Layout Parsing/LP, Layout Inference/LI, Layout Transformation/LT) using Python.
- Implement NLP pipelines for document extraction, classification, and data enrichment.
Cloud & Database Management:
- Architect and optimize MongoDB databases hosted on Azure for scalability, security, and performance.
- Manage cloud infrastructure (Azure) for AI workloads, including containerization and serverless deployments.
Technical Leadership:
- Lead cross-functional teams (AI engineers, DevOps, BAs) in solution delivery.
- Troubleshoot complex technical issues in OCR accuracy, AI model drift, or system integration.
Client Enablement:
- Advise clients on technical best practices for scaling AI solutions.
- Document architectures, conduct knowledge transfers, and mentor junior engineers.

Required Technical Expertise
- Frontend/Portal: Vue.js (advanced components, state management), Node.js (Express, REST/GraphQL APIs).
- AI/ML Stack: Python (PyTorch/TensorFlow), Azure AI (Cognitive Services, Computer Vision), NLP techniques (NER, summarization).
- Layout Engineering: LP/LI/LT for complex documents (invoices, contracts).
- OCR Technologies: Production experience with Google Vision OCR and Azure Form Recognizer.
- Database & Cloud: MongoDB (sharding, aggregation, indexing) hosted on Azure (Cosmos DB, Blob Storage, AKS); Infrastructure-as-Code (Terraform/Bicep); CI/CD pipelines (Azure DevOps).
- Experience: 10+ years in software development, including 5+ years specializing in AI/ML, OCR, or document automation, with a proven track record deploying enterprise-scale solutions in cloud environments (Azure preferred).

Preferred Qualifications
- Certifications: Azure Solutions Architect Expert, MongoDB Certified Developer, or Google Cloud AI/ML.
- Experience with alternative OCR tools (ABBYY, Tesseract) or AI platforms (GCP Vertex AI, AWS SageMaker).
- Knowledge of DocuSign CLM, Coupa, or SAP Ariba integrations.
- Familiarity with Kubernetes, Docker, and MLOps practices.

Posted 1 week ago

Apply

4.0 years

0 - 1 Lacs

India

On-site

Experience: 4+ years

Job Description
We are looking for consultants with the below skill set:
- Good understanding of cloud integration (mandatory).
- Experience with any SIEM tool is preferred. (Securonix SIEM PS support is available, so the candidate is not expected to work on Securonix-side configuration, but should understand how integration works with any SIEM solution.)
- Good understanding of the available cloud integration methods (cloud-native connectors, API-based ingestion, agent-based).
- Understanding of the cloud models (IaaS, PaaS, SaaS) and the security responsibilities in each.
- Understanding of logging services: AWS CloudWatch and Azure Monitor.
- Scripting and automation knowledge is preferred: Python, PowerShell, Bash for automation and log parsing; Lambda functions, Azure Logic Apps, Amazon EventBridge.

Job Types: Full-time, Permanent
Pay: ₹90,000.00 - ₹150,000.00 per month
Schedule: Day shift / Morning shift
Work Location: In person
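As one hedged sketch of the "Python for automation and log parsing" skill this listing asks for (not part of the posting; the log format and field names are assumptions), a minimal parser that turns a syslog-style line into a JSON-ready record for SIEM ingestion might look like:

```python
import json
import re

# Illustrative pattern for a syslog-style line:
# "<month day time> <host> <process>: <message>"
LOG_PATTERN = re.compile(
    r"^(?P<timestamp>\w{3}\s+\d{1,2}\s[\d:]{8})\s+"
    r"(?P<host>\S+)\s+"
    r"(?P<process>[\w\-/]+):\s+"
    r"(?P<message>.*)$"
)

def parse_log_line(line):
    """Parse one syslog-style line into a dict, or return None on no match."""
    match = LOG_PATTERN.match(line.strip())
    if match is None:
        return None
    return match.groupdict()

line = "Jun 14 15:16:01 web01 sshd: Accepted password for admin from 10.0.0.5"
record = parse_log_line(line)
print(json.dumps(record))
```

A real pipeline would emit these records to the SIEM's ingestion endpoint (API- or agent-based, as the listing notes) rather than printing them.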

Posted 1 week ago

Apply

3.0 years

0 Lacs

Andhra Pradesh, India

On-site


Key Responsibilities
- Set up and maintain monitoring dashboards for ETL jobs using Datadog, including metrics, logs, and alerts.
- Monitor daily ETL workflows and proactively detect and resolve data pipeline failures or performance issues.
- Create Datadog monitors for job status (success/failure), job duration, resource utilization, and error trends.
- Work closely with data engineering teams to onboard new pipelines and ensure observability best practices.
- Integrate Datadog with related tools.
- Conduct root cause analysis of ETL failures and performance bottlenecks.
- Tune thresholds, baselines, and anomaly detection settings in Datadog to reduce false positives.
- Document incident handling procedures and contribute to improving overall ETL monitoring maturity.
- Participate in on-call rotations or scheduled support windows to manage ETL health.

Required Skills & Qualifications
- 3+ years of experience in ETL/data pipeline monitoring, preferably in a cloud or hybrid environment.
- Proficiency in using Datadog for metrics, logging, alerting, and dashboards.
- Strong understanding of ETL concepts and tools (e.g., Airflow, Informatica, Talend, AWS Glue, or dbt).
- Familiarity with SQL and querying large datasets.
- Experience with Python, shell scripting, or Bash for automation and log parsing.
- Understanding of cloud platforms (AWS/GCP/Azure) and services like S3, Redshift, BigQuery, etc.
- Knowledge of CI/CD and DevOps principles related to data infrastructure monitoring.

Preferred Qualifications
- Experience with distributed tracing and APM in Datadog.
- Prior experience monitoring Spark, Kafka, or streaming pipelines.
- Familiarity with ticketing tools (e.g., Jira, ServiceNow) and incident management workflows.
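The threshold and baseline tuning this listing describes can be sketched with simple statistics. This illustrative Python snippet (not a Datadog API call; the sigma multiplier and sample data are assumptions) derives an alert threshold from historical ETL job durations so that only genuinely unusual runs fire an alert:

```python
from statistics import mean, stdev

def duration_threshold(history, sigmas=3.0):
    """Derive an alert threshold (seconds) from historical ETL job durations:
    the baseline mean plus `sigmas` standard deviations."""
    return mean(history) + sigmas * stdev(history)

def is_anomalous(duration, history, sigmas=3.0):
    """Flag a run whose duration exceeds the derived threshold."""
    return duration > duration_threshold(history, sigmas)

# Illustrative recent run durations, in seconds.
history = [300, 310, 295, 305, 320, 298, 312]
print(round(duration_threshold(history), 1))
print(is_anomalous(600, history))   # a 10-minute run stands out
print(is_anomalous(315, history))   # within normal variation
```

The same mean-plus-k-sigma idea underlies the baselines a monitoring tool computes automatically; tuning `sigmas` up or down trades missed incidents against false positives.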

Posted 1 week ago

Apply

0.0 - 5.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Quick Take:
We are looking for a motivated and detail-oriented junior to mid-level Database Administrator (DBA) with experience or strong interest in PostgreSQL. This is an excellent opportunity for candidates early in their database career to work in a collaborative IT environment and gain exposure to real-world production systems. The role involves supporting the design, implementation, and maintenance of PostgreSQL databases while ensuring high availability, performance, and data integrity.

The Work:
Database Implementation and Support
• Assist in the installation, configuration, and maintenance of PostgreSQL databases.
• Collaborate with senior DBAs and developers to understand database structure and application needs.
• Support the creation and management of database objects (schemas, tables, indexes, views, functions).
• Assist in the assessment, planning, and execution of database migrations from Microsoft SQL Server to PostgreSQL.
• Work with tools such as pgloader or custom scripts to migrate schema, data, and logic.
• Participate in identifying incompatibilities and rewriting T-SQL logic into PL/pgSQL.
High Availability & Disaster Recovery
• Learn and assist with PostgreSQL replication (streaming, logical) and HA technologies.
• Participate in PITR (point-in-time recovery), base backup strategies, and recovery testing processes under guidance.
Monitoring and Alerting
• Utilize monitoring tools (e.g., pg_stat_activity, Prometheus, pgAdmin) to observe performance metrics and database health.
• Raise alerts and work with senior team members to troubleshoot performance or availability issues.
Capacity Planning
• Track basic storage and performance trends.
• Assist in gathering data for future capacity planning.
Backup and Recovery Operations
• Run and validate pg_basebackup, pg_dump, and pgBackRest backups.
• Participate in periodic backup restore drills and understand RTO/RPO implications.
Performance Tuning
• Get hands-on with PostgreSQL query planning using EXPLAIN (ANALYZE) and auto_explain.
• Assist in analyzing long-running queries and identifying missing indexes.
Security and Access Control
• Manage PostgreSQL roles and privileges with oversight.
• Participate in periodic access reviews and implement role-based access controls (RBAC).
Patching and Updates
• Support patching and PostgreSQL version upgrades under supervision.
• Track new PostgreSQL releases and understand feature deprecations.
Incident Handling
• Assist in troubleshooting database-related incidents.
• Escalate complex incidents with detailed documentation.
Automation and Scripting
• Write basic Bash, SQL, or Python scripts to automate regular maintenance tasks (e.g., VACUUM, backups).
• Contribute to the automation of alerts and log parsing.
Documentation
• Maintain up-to-date documentation for database configurations and procedures.
• Contribute to standard operating procedures (SOPs).
Team Collaboration
• Work closely with developers, sysadmins, and support teams.
• Take part in team meetings, planning sessions, and code reviews where applicable.

The Must-Haves
• Bachelor's degree in Computer Science, Information Technology, or a related field.
Experience:
• 0 to 5 years of experience working with PostgreSQL.
• For freshers, internship or project-based exposure to database management is a plus.
Required Skills:
• Understanding of relational database concepts and PL/pgSQL.
• Exposure to PostgreSQL installation and configuration, security, and basic tuning.
• Problem-solving and analytical skills.
Communication:
• Strong verbal and written communication skills.
• Ability to document processes and follow standard procedures.
Good-to-have skills (not mandatory):
• Basic understanding of Microsoft SQL Server.
• Basic understanding of Kubernetes, Helm, or container orchestration.
• Knowledge of scripting languages like PowerShell or Bash.
• Awareness of monitoring tools like Prometheus/Grafana or pgAdmin.

Over the years, we’ve discovered that the most effective and successful associates at apexanalytix are people who have a specific combination of values, skills, and behaviors that we call “The apex Way”. Read more about The apex Way: https://www.apexanalytix.com/careers/

Benefits
At apexanalytix we know that our associates are the reason behind our successes. We truly value you as an associate and part of our professional family. Our goal is to offer the very best benefits possible to you and your loved ones. When it comes to benefits, whether for yourself or your family, the most important aspect is choice. And we get that. apexanalytix offers competitive benefits for the countries that we serve, in addition to our BeWell@apex initiative that encourages employees’ growth in six key wellness areas: Emotional, Physical, Community, Financial, Social, and Intelligence. With resources such as a strong Mentor Program, Internal Training Portal, plus Education, Tuition, and Certification Assistance, we provide tools for our associates to grow and develop.
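The backup-automation scripting this listing describes can be sketched in Python. This is an illustrative helper, not part of the posting: the backup directory, parallelism, and naming scheme are assumptions, but the `pg_dump` flags (`--format=directory`, `--jobs`, `--file`) are real options:

```python
from datetime import datetime, timezone

def pg_dump_command(dbname, backup_dir="/var/backups/pg", jobs=2):
    """Assemble an illustrative pg_dump invocation (directory format,
    parallel jobs) with a UTC-timestamped output path."""
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    out = f"{backup_dir}/{dbname}_{stamp}"
    # Directory format is required for --jobs (parallel dump workers).
    return ["pg_dump", "--format=directory", f"--jobs={jobs}",
            f"--file={out}", dbname]

cmd = pg_dump_command("appdb")
print(" ".join(cmd))
```

In a real maintenance script the list would be passed to `subprocess.run(cmd, check=True)` on a schedule, with the resulting directories rotated and restore-tested per the drills mentioned above.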

Posted 1 week ago

Apply

0 years

0 Lacs

New Delhi, Delhi, India

On-site


Role Summary
We are seeking a full-time AI Engineer to lead the development of a smart application that reads Bills of Quantities (BoQs) and automatically proposes unit prices for each line item using OpenAI GPT models and internal pricing data. You will work at the intersection of natural language processing, construction data interpretation, and AI automation, creating tools that will shape the future of project estimation.

Key Responsibilities
- Design and build a software application that reads and interprets BoQ documents (PDF, Excel, structured formats)
- Integrate with OpenAI (ChatGPT API) to analyze BoQ line items and suggest unit prices
- Fine-tune prompt engineering for accurate, context-aware pricing outputs
- Connect the system to internal historical databases or market references
- Develop interfaces for estimators, project managers, and procurement teams to interact with AI-generated results
- Ensure outputs are transparent, auditable, and adjustable
- Continuously improve model performance through user feedback and active learning

Qualifications
- Strong programming skills in Python and experience working with APIs (especially OpenAI)
- Experience with document parsing tools (e.g., PDF parsers, OCR, NLP libraries)
- Background in machine learning or AI application development
- Familiarity with construction data, unit pricing, or quantity surveying (bonus)
- Ability to bridge AI and practical use cases in a multidisciplinary environment
- Strong problem-solving skills and a passion for applied AI
- Bachelor’s or Master’s in Computer Science, AI, Data Science, or related fields

Bonus Skills
- Experience with LangChain, vector databases (e.g., Pinecone, Weaviate), or fine-tuning LLMs
- Knowledge of construction standards (e.g., CESMM, NRM, SMM7)
- Prior work in cost estimation or construction tech

What We Offer
- Opportunity to work on a real-world AI application with immediate impact
- Work inside a leading construction firm actively investing in digital innovation
- Competitive salary and benefits
- Collaborative and supportive environment
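A minimal sketch of the BoQ line-item parsing and prompt-building steps this listing describes, assuming a simple four-column row layout (the column order, helper names, and prompt wording are all illustrative, not from the posting):

```python
def parse_boq_rows(rows):
    """Turn raw BoQ rows (item, description, unit, quantity) into
    structured line items; the column layout is a hypothetical example."""
    items = []
    for item_no, description, unit, qty in rows:
        items.append({
            "item": item_no,
            "description": description.strip(),
            "unit": unit,
            "quantity": float(qty),
        })
    return items

def pricing_prompt(line_item):
    """Build a hypothetical prompt asking a model to propose a unit price."""
    return (
        "Propose a unit price for the following bill-of-quantities item, "
        "using comparable historical rates:\n"
        f"- Description: {line_item['description']}\n"
        f"- Unit: {line_item['unit']}, Quantity: {line_item['quantity']}"
    )

rows = [("1.01", "Excavation in ordinary soil", "m3", "250")]
items = parse_boq_rows(rows)
print(pricing_prompt(items[0]))
```

In the full application, each prompt would be sent to the OpenAI API along with retrieved historical pricing context, and the model's answer surfaced for estimator review rather than applied automatically.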

Posted 1 week ago

Apply

10.0 years

0 Lacs

India

On-site


Job Title Principal NLP Engineer based out of Mumbai. Note: This role is for one of our clients based out of Mumbai. Company Details We are on a mission to make the life of sales and solutions teams more efficient and help them to focus on selling. Our AI Sales Engineer empowers sales and pre-sales teams to improve win rates and close deals faster. We are an AI platform that acts as a central hub to collate and sift through all your content scattered across multiple repositories and tools such as Drive, Sharepoint, Confluence, CRM, Slack, product knowledge base. As a member of our team, you'll be at the forefront of this exciting technology revolution, working alongside some of the brightest minds in the industry to bring our platform to life. We're looking for individuals who are passionate about AI and its potential to drive real-world impact. Whether you're an AI expert or an aspiring one, we offer an environment that supports and challenges you, enabling your growth and success. Join us today and help us shape the future of AI for enterprises! 
Job Roles & Responsibilities
- Lead the design, development, testing and deployment of advanced NLP techniques and models, ensuring high performance, accuracy, and scalability across the application
- Collaborate with cross-functional teams, including product managers, UI/UX designers, and developers, to develop the best user experiences
- Conduct user research and evaluate user feedback in order to continuously fine-tune the models
- Stay updated on industry trends, new technologies, and best practices in AI, NLP, and conversational design
- Actively participate in code reviews, architecture discussions, and technical debates to ensure quality
- Mentor developers by providing guidance, feedback, and support throughout their career growth within the organization

Cultural Expectations
- Bachelor's or Master's degree in Computer Science or Information Technology with a focus on language processing
- 10+ years of experience in developing NLP-based applications, with strong proficiency in Python
- Experience with machine learning frameworks and libraries such as TensorFlow, PyTorch, Hugging Face Transformers, or spaCy; with large language models (LLMs) such as Claude and GPT; and practical knowledge of retrieval-augmented generation (RAG) systems
- Expertise in designing, implementing, and exposing RESTful APIs or microservices using FastAPI
- In-depth understanding of various NLP techniques for text processing: semantic extraction, tokenization, named entity recognition, dependency parsing, and knowledge of embeddings
- Exceptional problem-solving skills, with the ability to break down complex technical challenges, present effective solutions, and implement them effectively
- Proven ability to build and lead high-performing engineering teams
- Excellent communication and decision-making skills

Hiring Process
R1: Profile Shortlisting
R2: Initial Screening
R3: Tech Evaluation
R4: Founder Interview
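The RAG systems named in the requirements above center on a retrieval step: rank stored content chunks against a user query before handing the best matches to an LLM. The sketch below illustrates only that ranking, using toy bag-of-words vectors and cosine similarity; a production system would use learned embeddings and a vector database, and the corpus here is invented.

```python
# Minimal sketch of RAG retrieval: rank content chunks against a query
# by cosine similarity over toy bag-of-words "embeddings". Illustrative
# only; real systems use learned embeddings and a vector store.
import math
from collections import Counter

def embed(text):
    """Toy embedding: lowercase bag-of-words counts."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, chunks, k=1):
    """Return the top-k chunks most similar to the query."""
    q = embed(query)
    ranked = sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)
    return ranked[:k]

corpus = [
    "Pricing sheet for enterprise plan renewals",
    "Security questionnaire answers for SOC 2",
    "Slack integration setup guide",
]
top = retrieve("what did we answer on the soc 2 security review", corpus)
```

Swapping `embed` for a real embedding model and `retrieve` for a vector-DB query leaves the overall shape of the pipeline unchanged, which is why the retrieval step is usually designed behind exactly this kind of interface.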

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Helios Full Stack Senior Java Developer – C12

Project Description: Citi is embarking on a multi-year technology initiative in the Wholesale Lending Credit Risk (WLCR) Technology space. For this journey, we are looking for a highly motivated, hands-on senior developer. We are building a platform that supports various messaging, API, and workflow components for loan services across the bank. The solution will be built from scratch using the latest technologies. The candidate will be a core member of the technology team responsible for implementing projects based on Java, Spring Boot, and Kafka. This is an excellent opportunity to immerse yourself in the Wholesale Lending Division and gain exposure to business and technology initiatives targeted at maintaining its lead position among competitors. We work in a Hybrid-Agile environment. The Applications Development Intermediate Programmer Analyst is an intermediate-level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities.
Responsibilities:
- Individual contributor: write good quality code in Angular 16; well versed with UI/UX designs (Figma) and unit testing using Jest
- Individual contributor: write good quality code in Java and Spring Boot (related stack); well versed with JUnit, Mockito, integration tests and performance tests
- Design and develop components with minimal assistance
- Effectively interact and collaborate with the development team
- Effectively communicate development progress to the Project Lead
- Work with onshore, offshore and matrix teams of developers to implement business solutions
- Write user/support documentation
- Evaluate and adopt new dev tools, libraries, and approaches to improve delivery quality
- Perform peer code review of project codebase changes
- Act as an SME to senior stakeholders and/or other team members
- Utilize knowledge of applications development procedures and concepts, and basic knowledge of other technical areas, to identify and define necessary system enhancements, including using script tools and analyzing/interpreting code
- Consult with users, clients, and other technology groups on issues; recommend programming solutions; install and support customer exposure systems
- Apply fundamental knowledge of programming languages for design specifications
- Analyze applications to identify vulnerabilities and security issues, and conduct testing and debugging
- Serve as advisor or coach to new or lower-level analysts
- Identify problems, analyze information, and make evaluative judgements to recommend and implement solutions
- Resolve issues by identifying and selecting solutions based on acquired technical experience and guided by precedent
- Operate with a limited level of direct supervision, exercising independence of judgement and autonomy
Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency.

Skills Required: 4-8 years of experience
- Good knowledge of UI/UX design and tools (e.g. Figma), Angular and Jest for unit testing
- Good knowledge of Spring, including Spring Framework, Spring Boot, Spring Security, Spring Web, and Spring Data
- Hands-on knowledge of: threading, collections, exception handling, JDBC, Java OOD/OOP concepts, GoF design patterns, MoM and SOA design patterns, file I/O, parsing XML and JSON, delimited and fixed-length files, string matching/parsing/building, and working with binary data / byte arrays
- Good knowledge of SQL (Oracle dialect preferable)
- Experience working with SOA and microservices utilizing REST
- Experience with design and implementation of cloud-ready applications and deployment pipelines on large-scale container platform clusters is a plus
- Experience working in a Continuous Integration and Continuous Delivery environment, familiar with Tekton, Harness, Jenkins, code quality tools, etc.
- Knowledge of industry-standard best practices such as design patterns, coding standards, coding modularity, prototyping, etc.
- Experience in debugging, tuning and optimizing components
- Understanding of the SDLC lifecycle for Agile methodologies
- Excellent written and oral communication skills
- Experience developing applications in the financial services industry is preferred

Nice-to-have experience:
- Messaging systems: RabbitMQ, ActiveMQ, Kafka, Tibco, IBM MQ, etc.
- Tomcat, Jetty, Apache HTTPD
- Build/configure/deploy automation tools
- Linux ecosystem
- Kubernetes and Docker
- Autosys
- APIm
- APM tools: Dynatrace, AppDynamics, etc.
- Caching technologies: Hazelcast, Memcached, Redis, etc.

Qualifications:
- 8-10 years of relevant experience in the financial services industry
- Intermediate-level experience in an applications development role
- Consistently demonstrates clear and concise written and verbal communication
- Demonstrated problem-solving and decision-making skills
- Ability to work under pressure and manage deadlines or unexpected changes in expectations or requirements

Education:
- Bachelor's degree/University degree or equivalent experience

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.

------------------------------------------------------

Job Family Group: Technology

------------------------------------------------------

Job Family: Applications Development

------------------------------------------------------

Time Type: Full time

------------------------------------------------------

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
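The skills list above calls out parsing fixed-length files alongside delimited ones. Shown below is the fixed-length idea in a short Python sketch (the role itself is Java-based, so treat this as a language-neutral illustration); the field layout is an invented example, not a real loan-record format.

```python
# Sketch of fixed-length record parsing: each field is a fixed byte
# range within the line. The layout below is a made-up example.
FIELDS = [            # (name, start, end) character offsets per record
    ("account", 0, 10),
    ("amount", 10, 20),
    ("currency", 20, 23),
]

def parse_record(line):
    """Slice one fixed-length record into a dict of stripped fields."""
    return {name: line[start:end].strip() for name, start, end in FIELDS}

record = parse_record("ACC00123450000150.00USD")
```

The same table-driven layout generalizes: adding a field means adding one tuple, and the parser code never changes, which keeps file-format drift out of the business logic.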

Posted 1 week ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote


When you join Verizon

You want more out of a career. A place to share your ideas freely — even if they’re daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife.

What You’ll Be Doing...
As a Data Engineer with ETL/ELT expertise on our growing data platform and analytics teams, you will understand and enable the required data sets from different sources, ingesting both structured and unstructured data into our data warehouse and data lake with real-time streaming and/or batch processing to generate insights and perform analytics for business teams within Verizon.
- Understanding business requirements and translating them into technical designs.
- Working on data ingestion, preparation and transformation.
- Developing scripts for data sourcing and parsing.
- Developing data streaming applications.
- Debugging production failures and identifying solutions.
- Working on ETL/ELT development.

What We’re Looking For...
You’re curious about new technologies and the game-changing possibilities they create. You like to stay up to date with the latest trends and apply your technical expertise to solve business problems.

You'll Need To Have:
- Bachelor’s degree or one or more years of relevant experience, demonstrated through work experience and/or military experience.
- Experience with data warehouse concepts and the data management life cycle.

Even better if you have one or more of the following:
- Any related certification as an ETL/ELT developer.
- Accuracy and attention to detail.
- Good problem solving, analytical, and research capabilities.
- Good verbal and written communication.
- Experience presenting to and influencing partners.

Why Verizon?
Verizon is committed to maintaining a Total Rewards package which is competitive, valued by our employees, and differentiates us as an Employer of Choice.
- We are a ‘pay for performance’ company and your contribution is rewarded through competitive salaries, performance-based incentives and an employee Stock Program. We create an opportunity for us all to share in the success of Verizon and the value we help to create through this broad-based discretionary equity award program.
- Your benefits are market competitive and delivered by some of the best providers.
- You are provided with a full spectrum of health and wellbeing resources, including a first-in-class Employee Assistance Program, to empower you to make positive health decisions.
- We offer generous paid time off benefits to help you manage your work-life balance and opportunities for flexible working arrangements*.
- Verizon provides training and development for all levels, to help you enhance your skills and develop your career, from funding towards education assistance, award-winning training, online development tools and access to industry research.
- You will be able to take part in volunteering opportunities as part of our environmental, community and sustainability commitment.

Your benefits package will vary depending on the country in which you work.
*Subject to business approval

If Verizon and this role sound like a fit for you, we encourage you to apply even if you don’t meet every “even better” qualification listed above.

Where you’ll be working
In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.

Scheduled Weekly Hours
40

Equal Employment Opportunity
Verizon is an equal opportunity employer.
We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics.
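The "data sourcing and parsing" responsibility above typically starts with turning raw feed lines into typed records while routing malformed rows aside. A minimal sketch, assuming an invented pipe-delimited feed layout (not a real Verizon format):

```python
# Sketch of a parsing step in an ETL/ELT pipeline: convert delimited
# feed lines into typed records, diverting bad rows to a reject list.
# The field layout is an invented example.
def parse_feed(lines):
    records, rejects = [], []
    for line in lines:
        parts = line.strip().split("|")
        try:
            records.append({
                "customer_id": parts[0],
                "event": parts[1],
                "bytes_used": int(parts[2]),   # typed conversion can fail
            })
        except (IndexError, ValueError):
            rejects.append(line)               # route bad rows for review
    return records, rejects

raw = ["C001|stream|1048576", "C002|call|not-a-number", "C003|sms|140"]
records, rejects = parse_feed(raw)
```

Separating rejects from parsed records mirrors the common warehouse pattern of a reject/quarantine table, which makes the "debugging production failures" part of the job tractable.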

Posted 1 week ago

Apply

0 years

6 - 9 Lacs

Calcutta

On-site

Ready to shape the future of work? At Genpact, we don’t just adapt to change—we drive it. AI and digital innovation are redefining industries, and we’re leading the charge. Genpact’s AI Gigafactory, our industry-first accelerator, is an example of how we’re scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies’ most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that’s shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Consultant, AI/ML Lead! In this role, we are looking for candidates with relevant experience in text mining. The Text Mining Scientist (TMS) is expected to play a pivotal bridging role between enterprise database teams and business/functional resources. At a broad level, the TMS will leverage his/her solutioning expertise to translate the customer’s business need into a techno-analytic problem and work with database teams to bring large-scale text analytic solutions to fruition. The right candidate should have prior experience in developing text mining and NLP solutions using open-source tools.
Responsibilities
- Develop transformative AI/ML solutions to address our clients' business requirements and challenges
- Project delivery: successful delivery of projects involving data pre-processing, model training and evaluation, and parameter tuning
- Manage stakeholder/customer expectations
- Project blueprinting and project documentation; creating the project plan
- Understand and research cutting-edge industrial and academic developments in AI/ML with NLP/NLU applications in diverse industries such as CPG, Finance, etc.
- Conceptualize, design, build and develop solution algorithms which demonstrate the minimum required functionality within tight timelines
- Interact with clients to collect, synthesize, and propose requirements and create an effective analytics/text mining roadmap
- Work with digital development teams to integrate and transform these algorithms into production-quality applications
- Do applied research on a wide array of text analytics and machine learning projects, file patents and publish papers

Qualifications we seek in you!
Minimum Qualifications / Skills
- MS in Computer Science, Information Systems, Computer Engineering, or Systems Engineering, with relevant experience in text mining / natural language processing (NLP) tools, data sciences, big data and algorithms
- Post-graduation in MBA and an undergraduate degree in any engineering discipline, preferably Computer Science, with relevant experience
- Full-cycle experience desirable in at least one large-scale text mining/NLP project: creating a business use case, text analytics assessment/roadmap, technology and analytic solutioning, implementation and change management; considerable experience in Hadoop, including development in the MapReduce framework

Technology
- Open-source text mining paradigms such as NLTK, OpenNLP, OpenCalais, StanfordNLP, GATE, UIMA, Lucene, and cloud-based NLU tools such as DialogFlow, MS LUIS
- Exposure to statistical toolkits such as R, Weka, S-Plus, Matlab, SAS Text Miner
- Strong core Java experience in large-scale product development and functional knowledge of RDBMSs
- Hands-on programming in the Hadoop ecosystem, and concepts in distributed computing
- Very good Python/R programming skills; Java programming skills a plus

Methodology
- Relevant experience in solutioning and consulting in verticals such as BFSI and CPG, with hands-on delivery of text analytics on large structured and unstructured data
- A solid foundation in AI methodologies like ML, DL, NLP, neural networks, information retrieval and extraction, NLG, NLU
- Exposure to concepts in natural language processing and statistics, especially in applications such as sentiment analysis, contextual NLP, dependency parsing, parsing, chunking, summarization, etc.
- Demonstrated ability to conduct look-ahead client research with a focus on supplementing and strengthening the client's analytics agenda with newer tools and techniques

Preferred Qualifications / Skills
Technology
- Expert-level understanding of NLP, NLU and machine learning/deep learning methods
- OpenNLP, OpenCalais, StanfordNLP, GATE, UIMA, Lucene, NoSQL
- UI development paradigms that enable text mining insights visualization, e.g., Adobe Flex Builder, HTML5, CSS3
- Linux, Windows, GPU experience
- Spark, Scala for distributed computing
- Deep learning frameworks such as TensorFlow, Keras, Torch, Theano

Methodology
- Social network modeling paradigms, tools and techniques
- Text analytics using natural language processing tools such as Support Vector Machines and social network analysis
- Previous experience with text analytics implementations using open-source packages and/or SAS Text Miner
- Ability to prioritize, a consultative mindset, and time management skills

Why join Genpact?
- Be a transformation leader: work at the cutting edge of AI, automation, and digital innovation
- Make an impact: drive change for global enterprises and solve business challenges that matter
- Accelerate your career: get hands-on experience, mentorship, and continuous learning opportunities
- Work with the best: join 140,000+ bold thinkers and problem-solvers who push boundaries every day
- Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
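Sentiment analysis, named among the NLP applications above, can be illustrated at its simplest with a lexicon-based scorer. The word lists below are invented toy data; real text mining work would use trained models via the libraries the posting names (NLTK, spaCy, StanfordNLP, etc.).

```python
# Toy lexicon-based sentiment scoring: count positive vs. negative
# tokens. Illustrative only; production systems use trained models.
import re

POSITIVE = {"good", "great", "excellent", "love"}
NEGATIVE = {"bad", "poor", "terrible", "hate"}

def sentiment(text):
    tokens = re.findall(r"[a-z']+", text.lower())
    score = sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

label = sentiment("Customers love the great new flavor")
```

Even this toy version shows why "contextual NLP" matters: a pure lexicon scorer cannot handle negation ("not great"), which is exactly where the model-based approaches in the posting earn their keep.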
Job: Consultant
Primary Location: India-Kolkata
Schedule: Full-time
Education Level: Bachelor's / Graduation / Equivalent
Job Posting: Jun 4, 2025, 6:34:09 AM
Unposting Date: Ongoing
Master Skills List: Digital
Job Category: Full Time

Posted 1 week ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Job Description: SDET (Software Development Engineer in Test)

Notice Period Requirement: Immediate to 2 months (official)
Job Location: Gurgaon
Experience: 5 to 8 years
Skills: SDET, automation, Java programming, Selenium, Cucumber, Web API, Rest Assured
Job Type: Full-time

Job Description
We are seeking an experienced and highly skilled SDET (Software Development Engineer in Test) to join our Quality Engineering team. The ideal candidate will possess a strong background in test automation for APIs, mobile, or web, with hands-on experience in creating robust automation frameworks and scripts. This role demands a thorough understanding of quality engineering practices, microservices architecture, and software testing tools.

Key Responsibilities:
- Design and develop scalable and modular automation frameworks using best industry practices such as the Page Object Model.
- Automate testing for distributed, highly scalable systems.
- Create and execute test scripts for GUI-based, API, and mobile applications.
- Perform end-to-end testing for APIs, ensuring thorough validation of request and response schemas, status codes, and exception handling.
- Conduct API testing using tools like Rest Assured, SOAP UI, NodeJS, and Postman, and validate data with serialization techniques (e.g., POJO classes).
- Implement and maintain BDD/TDD frameworks using tools like Cucumber, TestNG, or JUnit.
- Write and optimize SQL queries for data validation and backend testing.
- Integrate test suites into test management systems and CI/CD pipelines using tools like Maven, Gradle, and Git.
- Mentor team members and quickly adapt to new technologies and tools.
- Select and implement appropriate test automation tools and strategies based on project needs.
- Apply design patterns, modularization, and user libraries for efficient framework creation.
- Collaborate with cross-functional teams to ensure the quality and scalability of microservices and APIs.
Must-Have Skills:
- Proficiency in designing and developing automation frameworks from scratch.
- Strong programming skills in Java, Groovy, or JavaScript with a solid understanding of OOP concepts.
- Hands-on experience with at least one GUI automation tool (desktop/mobile). Experience with multiple tools is an advantage.
- In-depth knowledge of API testing and microservices architecture.
- Experience with BDD and TDD methodologies and associated tools.
- Familiarity with SOAP and REST principles.
- Expertise in parsing and validating complex JSON and XML responses.
- Ability to create and manage test pipelines in CI/CD environments.

Nice-to-Have Skills:
- Experience with multiple test automation tools for GUI or mobile platforms.
- Knowledge of advanced serialization techniques and custom test harness implementation.
- Exposure to various test management tools and automation strategies.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 5+ years in software quality engineering and test automation.
- Strong analytical and problem-solving skills with attention to detail.
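The API-testing responsibilities above come down to checking status codes and response schemas. A framework-free sketch of that check follows, in Python rather than the Rest Assured/Java stack the posting names; the response shape and `validate_response` helper are invented for illustration.

```python
# Minimal API response validation: check status code plus presence and
# type of required fields. Returns a list of failures (empty = pass).
def validate_response(status, body, expected_status=200, required=()):
    errors = []
    if status != expected_status:
        errors.append(f"expected status {expected_status}, got {status}")
    for field, ftype in required:
        if field not in body:
            errors.append(f"missing field: {field}")
        elif not isinstance(body[field], ftype):
            errors.append(f"wrong type for {field}")
    return errors

schema = [("id", int), ("name", str), ("active", bool)]
ok = validate_response(200, {"id": 7, "name": "demo", "active": True}, required=schema)
bad = validate_response(404, {"id": "7"}, required=schema)
```

Returning a failure list instead of raising on the first problem mirrors how assertion-aggregation works in mature API test frameworks: one request, every schema violation reported at once.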

Posted 1 week ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary
A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients to more effectively run their business and understand what business questions can be answered and how to unlock the answers.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities
Data Engineer Profile: PwC is looking for hands-on data engineers who can produce beautiful and functional code to solve complex analytics problems.
If you are an exceptional developer with an aptitude to learn and implement new technologies, and who loves to push the boundaries to solve complex business problems innovatively, then we would like to talk with you.
- Strong interpersonal skills and ability to manage and work with cross-functional teams
- Able to perform on multiple levels, both as an individual contributor and driving initiatives
- Go-getter attitude with an innovative mindset
- Experience building and optimizing ‘big data’ data pipelines, architectures and data sets involving petabytes and terabytes of data
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
- Expand and grow data platform capabilities to solve new data problems and challenges
- Apply complex big data concepts with a focus on collecting, parsing, managing, and analyzing large sets of data to turn information into insights

Technical Skill Set (Must Have):
- Extensive experience with big data platforms including Hadoop, Spark, Hive, Presto, Kudu
- Hands-on experience in design and development of hybrid cloud architecture
- Extensive experience in at least one of the cloud platforms: Azure (Azure Data Factory, Databricks, Synapse Analytics, Azure Functions, Azure Stream Analytics) or AWS (S3, Lambda, Glue, QuickSight, EMR, EC2, Redshift, Athena and Presto)
- Hands-on experience in developing large-scale distributed applications in either the Java or Python programming language
- Hands-on experience with at least one real-time streaming platform: Kafka, Azure Event Hubs, or AWS Kinesis
- Advanced SQL programming skills
- Design DWH models, ensuring the data design follows the prescribed reference architecture framework while reflecting appropriate business rules built for logical, physical and conceptual models
- Experience in performance tuning of complex ETL mappings for relational and non-relational workloads
- Well versed with relational and NoSQL databases like MySQL, PostgreSQL, MongoDB and Cassandra
- Hands-on experience in migration from on-premise systems to cloud

Good to have:
- Hands-on experience with microservice architecture
- Knowledge of the Spring Boot framework
- Knowledge of developing scalable web applications
- Developing distributed applications using cloud platforms

Mandatory Skill Sets: PySpark, SQL, Databricks
Preferred Skill Sets: PySpark, SQL, Databricks
Years of Experience Required: 5-8
Education Qualification: B.Tech / M.Tech / MBA / MCA
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Master of Business Administration, Master of Engineering, Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Android Debug Bridge (ADB), Microsoft Azure
Optional Skills: Accepting Feedback, Active Listening, Analytical Reasoning, Analytical Thinking, Application Software, Business Data Analytics, Business Management, Business Technology, Business Transformation, Communication, Creativity, Documentation Development, Embracing Change, Emotional Regulation, Empathy, Implementation Research, Implementation Support, Implementing Technology, Inclusion, Intellectual Curiosity, Learning Agility, Optimism, Performance Assessment, Performance Management Software {+ 16 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date:
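The "advanced SQL" and aggregation work described above can be shown in miniature with the stdlib `sqlite3` module standing in for a real warehouse engine (Hive, Redshift, Databricks SQL, etc.); the table and columns below are invented.

```python
# Small GROUP BY aggregation, the bread-and-butter of warehouse SQL,
# using an in-memory SQLite database as a stand-in for a real engine.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE events (user_id TEXT, bytes INTEGER)")
con.executemany("INSERT INTO events VALUES (?, ?)",
                [("u1", 100), ("u1", 300), ("u2", 50)])
rows = con.execute(
    "SELECT user_id, SUM(bytes) AS total FROM events "
    "GROUP BY user_id ORDER BY total DESC"
).fetchall()
```

The same statement runs essentially unchanged on the platforms the posting lists, which is why SQL fluency transfers so well between warehouse engines even when the surrounding tooling differs.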

Posted 1 week ago

Apply

4.0 years

0 Lacs

Andhra Pradesh, India

On-site


At PwC, our people in software and product innovation focus on developing cutting-edge software solutions and driving product innovation to meet the evolving needs of clients. These individuals combine technical experience with creative thinking to deliver innovative software products and solutions. Those in software engineering at PwC will focus on developing innovative software solutions to drive digital transformation and enhance business performance. In this field, you will use your knowledge to design, code, and test cutting-edge applications that revolutionise industries and deliver exceptional user experiences.

Focused on relationships, you are building meaningful client connections, and learning how to manage and inspire others. Navigating increasingly complex situations, you are growing your personal brand, deepening technical expertise and awareness of your strengths. You are expected to anticipate the needs of your teams and clients, and to deliver quality. Embracing increased ambiguity, you are comfortable when the path forward isn’t clear, you ask questions, and you use these moments as opportunities to grow.

Skills
Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to:
- Respond effectively to the diverse perspectives, needs, and feelings of others.
- Use a broad range of tools, methodologies and techniques to generate new ideas and solve problems.
- Use critical thinking to break down complex concepts.
- Understand the broader objectives of your project or role and how your work fits into the overall strategy.
- Develop a deeper understanding of the business context and how it is changing.
- Use reflection to develop self-awareness, enhance strengths and address development areas.
- Interpret data to inform insights and recommendations.
- Uphold and reinforce professional and technical standards (e.g.
refer to specific PwC tax and audit guidance), the Firm's code of conduct, and independence requirements. The Opportunity When you join PwC Acceleration Centers (ACs), you step into a pivotal role focused on actively supporting various Acceleration Center services, from Advisory to Assurance, Tax and Business Services. In our innovative hubs, you’ll engage in challenging projects and provide distinctive services to support client engagements through enhanced quality and innovation. You’ll also participate in dynamic and digitally enabled training that is designed to grow your technical and professional skills. As part of the Software and Product Innovation team you lead the implementation of user stories and solve business problems. As a Senior Associate, you guide and mentor junior team members while maintaining professional and technical standards to deliver quality client solutions. Responsibilities Design and implement Java Microservices architecture Collaborate with teams to define project scope and objectives Conduct code reviews to maintain quality standards Mentor junior developers in microservices practices Troubleshoot and resolve application issues promptly Stay updated on microservices trends and technologies Contribute to the software development lifecycle Document technical specifications and workflows What You Must Have Bachelor's Degree 4 years of experience in software engineering Oral and written proficiency in English required What Sets You Apart Proven experience in Java, Spring Boot, and Microservices Familiarity with RESTful APIs and JMS Understanding of financial applications, especially payment/wires Hands-on experience with DevOps practices and tools Demonstrating exceptional communication and interpersonal skills Knowledge of containerized deployments preferred Experience with microservices architecture and related technologies Ability to present technical solutions to executive stakeholders Understanding of message parsing in banking 
messages preferred Show more Show less

Posted 1 week ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Description
About Amazon.com: Amazon.com strives to be Earth's most customer-centric company where people can find and discover virtually anything they want to buy online. By giving customers more of what they want - low prices, vast selection, and convenience - Amazon.com continues to grow and evolve as a world-class e-commerce platform. Amazon's evolution from Web site to e-commerce partner to development platform is driven by the spirit of innovation that is part of the company's DNA. The world's brightest technology minds come to Amazon.com to research and develop technology that improves the lives of shoppers and sellers around the world.

Overview of the role
The Business Research Analyst will be responsible for the data and machine learning parts of continuous improvement projects across the compatibility and basket-building space. This will require collaboration with local and global teams that have process and technical expertise. The RA should therefore be a self-starter who is passionate about discovering and solving complicated problems, learning complex systems, working with numbers, and organizing and communicating data and reports. In the compatibility program, the RA performs big-data analysis to identify patterns and trains models to generate product-to-product relationships and product-to-brand/model relationships. The RA also continuously improves the ML solution for higher accuracy, efficiency, and scalability, and writes clear and detailed functional specifications based on business requirements as well as writing and reviewing business cases.

Key job responsibilities
- Scoping, driving, and delivering complex projects across multiple teams.
- Performing root cause analysis: understanding the data need, pulling the data, analyzing it to form a hypothesis, and validating the hypothesis with data.
- Conducting thorough analysis of large datasets to identify patterns, trends, and insights that can inform the development of NLP applications.
- Developing and implementing machine learning models and deep learning architectures to improve NLP systems.
- Designing and implementing core NLP tasks such as named entity recognition, classification, and part-of-speech tagging.
- Diving deep to drive product pilots, building and analyzing large data sets, and constructing problem hypotheses that help steer the product feature roadmap, e.g. with Python, database tools (SQL, Spark), and ML platforms (TensorFlow, PyTorch).
- Conducting regular code reviews and implementing quality assurance processes to maintain high standards of code quality and performance optimization.
- Providing technical guidance and mentorship to junior team members and collaborating with external partners to integrate cutting-edge technologies.
- Finding scalable solutions for business problems by executing pilots and building deterministic and ML models (plug-and-play on ready-made ML models plus Python skills).
- Performing supporting research, conducting analysis of the larger parts of projects, and effectively interpreting reports to identify opportunities, optimize processes, and implement changes within their part of the project.
- Coordinating design effort between internal and external teams to develop optimal solutions for their part of the project for Amazon's network.
- Convincing and interacting with stakeholders at all levels, either to gather data and information or to execute and implement according to plan.

About The Team
Amazon.com operates in a virtual, global eCommerce environment without boundaries, and operates a diverse set of businesses in 14 countries, including Retail, third-party marketplaces, eCommerce platforms, and web services for developers. The Retail Business Services (RBS) organization is a core part of optimizing the customer experience and the selling-partner experience. This team is part of the RBS Customer Experience business unit. The team's primary role is to create and enhance retail selection on the worldwide Amazon online catalog. The compatibility program handled by this team has a direct impact on customer buying decisions and the online user experience. The compatibility program aims to answer customer purchase questions about whether two products work together, as well as to reduce returns due to incompatibility.

Basic Qualifications
- Ability to analyse and then articulate business issues to a wide range of audiences using strong data, written, and verbal communication skills
- Good mastery of BERT and other NLP frameworks such as GPT-2, XLNet, and Transformer models
- Experience in NLP techniques such as tokenization, parsing, lexing, named entity recognition, sentiment analysis, and spellchecking
- Strong problem-solving skills, creativity, and ability to overcome challenges
- SQL/ETL, automation tools
- Relevant bachelor's degree or higher
- 3+ years combined of relevant work experience in a related field (project management, customer advocate, product owner, engineering, business analysis) - diverse experience will be favored, e.g. a mix of experience across different roles
- Self-motivated and autonomous, with an ability to prioritize well and remain focused when working within a team located across several countries and time zones

Preferred Qualifications
- Understanding of machine learning concepts, including developing and tuning models (hyper-parameters), as well as deploying models and building ML services
- Experience with computer vision algorithms and libraries such as OpenCV, TensorFlow, Caffe, or PyTorch
- Technical expertise and experience in data science and ML

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company - ADCI - BLR 14 SEZ
Job ID: A2896681
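The core NLP tasks named in this listing (tokenization, named entity recognition, part-of-speech tagging) can be sketched as a toy rule-based pipeline. This is an illustration only: the sentence, the `BRANDS` gazetteer, and the labels are invented for the example, and a production compatibility system would use trained models rather than rules.

```python
import re

def tokenize(text):
    # Split into word tokens and single punctuation tokens.
    return re.findall(r"\w+|[^\w\s]", text)

# Toy gazetteer standing in for a trained NER model (illustrative only).
BRANDS = {"canon", "nikon"}

def tag_entities(tokens):
    # Label gazetteer hits as BRAND, everything else as O (outside).
    return [(t, "BRAND" if t.lower() in BRANDS else "O") for t in tokens]

tokens = tokenize("Does the Canon EF 50mm lens fit a Nikon body?")
entities = tag_entities(tokens)
```

A real pipeline would swap the gazetteer for a statistical tagger, but the token/label interface stays the same.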

Posted 1 week ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Description
Do you want to join an innovative team of scientists who use machine learning and statistical techniques to create state-of-the-art solutions for providing better value to Amazon's customers? Do you want to build and deploy advanced algorithmic systems that help optimize millions of transactions every day? Are you excited by the prospect of analyzing and modeling terabytes of data to solve real-world problems? Do you like to own end-to-end business problems/metrics and directly impact the profitability of the company? Do you like to innovate and simplify? If yes, then you may be a great fit for the Machine Learning and Data Sciences team for India Consumer Businesses. If you have an entrepreneurial spirit, know how to deliver, love to work with data, are deeply technical, highly innovative, and long for the opportunity to build solutions to challenging problems that directly impact the company's bottom line, we want to talk to you.

Major responsibilities
- Use machine learning and analytical techniques to create scalable solutions for business problems
- Analyze and extract relevant information from large amounts of Amazon's historical business data to help automate and optimize key processes
- Design, develop, evaluate, and deploy innovative and highly scalable models for predictive learning
- Research and implement novel machine learning and statistical approaches
- Work closely with software engineering teams to drive real-time model implementations and new feature creation
- Work closely with business owners and operations staff to optimize various business operations
- Establish scalable, efficient, automated processes for large-scale data analysis, model development, model validation, and model implementation
- Mentor other scientists and engineers in the use of ML techniques

Basic Qualifications
- 3+ years of experience building models for business applications
- PhD, or Master's degree and 4+ years of CS, CE, ML or related field experience
- Experience in patents or publications at top-tier peer-reviewed conferences or journals
- Experience programming in Java, C++, Python, or a related language
- Experience in any of the following areas: algorithms and data structures, parsing, numerical optimization, data mining, parallel and distributed computing, high-performance computing

Preferred Qualifications
- Experience using Unix/Linux
- Experience in professional software development

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company - ADCI - Karnataka - A66
Job ID: A2851691

Posted 1 week ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


The Sr. Data Quality Engineer manages and coordinates, with internal or external parties, the collection, compilation, normalization, and standard analysis of data assets across diverse projects and data platforms; develops, maintains, evaluates, and tests data solutions within an organization; and develops and executes plans, policies, and practices that control, protect, deliver, and enhance the value and integrity of the organization's data and information assets and programs.

Typical Functions
- Manipulates and queries data by building SQL and stored procedures single-handedly in Snowflake
- Writes and executes test cases for data-related release items within Agile processes
- Writes complex SQL queries and handles quality validation for large data sets
- Builds a data quality testing framework from scratch to monitor the quality of data
- Understands the basic concepts of data warehousing and ETL in order to validate changes to them
- Executes common data quality tests to validate data accuracy, completeness, freshness, integrity, and consistency
- Builds test cases to validate generated analytical reports and dashboards
- Creates database tests to enforce data validation and constraint quality standards
- Applies knowledge of dimensional modeling and data warehouse concepts, such as star schemas, snowflake schemas, dimensions, and facts, to conduct data analysis
- Performs statistical tests on large datasets to determine data quality and integrity
- Evaluates system performance and design, as well as their effect on data quality
- Collaborates with database developers to improve data collection and storage processes
- Runs data queries to identify coding issues and data exceptions, and cleans data
- Gathers data from primary or secondary data sources to identify and interpret trends
- Keeps abreast of developments and trends in data quality analysis
- Collects, stores, processes, and analyses raw and/or complex data from multiple sources; recommends ways to apply the data; chooses and designs optimal data solutions; and builds data processing systems, using expertise in data warehousing solutions and the latest database technologies
- Maintains, implements, and monitors the quality of data and information within the architecture used across the company; reports on results; and identifies and recommends system application changes required to improve the quality of data in all applications
- Investigates data quality problems, conducts analysis to determine root causes, corrects errors, and develops prototypes, process improvement plans across all programs, and proofs of concept for the selected solutions
- Processes unstructured data into a form suitable for analysis, and then performs the analysis
- Integrates innovations and algorithms into the organization's data systems, working closely with the engineering team
- Implements complex data projects with a focus on collecting, parsing, managing, analysing, and visualising large sets of data to turn information into insights, using multiple platforms
- Serves as a data subject matter expert; collaborates with business owners or external clients to establish an analysis plan to answer key business questions; and delivers both reporting results and insights
- Generates specific and operational reports in support of objectives and initiatives, and presents and communicates complex analytical data and results to appropriate audiences

Requirements
Other duties or functions may be assigned.
- Bachelor's Degree in Engineering, Computer Science, or equivalent experience
- 6+ years of software quality assurance experience, including data warehouse testing
- Proficiency in SQL, ideally Snowflake SQL, along with MySQL, SQL Server, and/or PostgreSQL
- Experience with Sisense or Power BI
- 5+ years of experience developing data-driven software
- Proficiency in programming languages, including Structured Query Language (SQL)
- Experience writing dbt tests
- Experience with Agile methods, particularly Scrum, preferred
- Demonstrated critical thinking
- Professional experience across the data science lifecycle (feature engineering, training, model deployment)
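The common data quality tests this role describes (completeness, uniqueness/integrity, and similar checks) reduce to plain SQL assertions. A minimal sketch against an in-memory SQLite table is shown below; the `orders` table and its columns are hypothetical, and a real framework at this level would run such checks against Snowflake, e.g. as dbt tests.

```python
import sqlite3

# Hypothetical sample table with one NULL amount and one duplicate id.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL);
    INSERT INTO orders VALUES (1, 'a', 10.0), (2, 'b', 20.0), (2, 'c', NULL);
""")

def scalar(sql):
    # Run a query expected to return a single value.
    return conn.execute(sql).fetchone()[0]

# Completeness check: amount should never be NULL.
null_amounts = scalar("SELECT COUNT(*) FROM orders WHERE amount IS NULL")

# Integrity check: id is meant to be unique, so count duplicated ids.
dup_ids = scalar(
    "SELECT COUNT(*) FROM (SELECT id FROM orders GROUP BY id HAVING COUNT(*) > 1)"
)

# A framework would fail the release when any counter is non-zero.
failures = {"null_amount_rows": null_amounts, "duplicate_ids": dup_ids}
```

Freshness checks follow the same shape, comparing `MAX(loaded_at)` against an allowed lag.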

Posted 1 week ago

Apply

3.0 years

0 Lacs

Hubli, Karnataka, India

On-site


Description
Do you want to join an innovative team of scientists who use machine learning and statistical techniques to create state-of-the-art solutions for providing better value to Amazon's customers? Do you want to build and deploy advanced algorithmic systems that help optimize millions of transactions every day? Are you excited by the prospect of analyzing and modeling terabytes of data to solve real-world problems? Do you like to own end-to-end business problems/metrics and directly impact the profitability of the company? Do you like to innovate and simplify? If yes, then you may be a great fit for the Machine Learning and Data Sciences team for India Consumer Businesses. If you have an entrepreneurial spirit, know how to deliver, love to work with data, are deeply technical, highly innovative, and long for the opportunity to build solutions to challenging problems that directly impact the company's bottom line, we want to talk to you.

Major responsibilities
- Use machine learning and analytical techniques to create scalable solutions for business problems
- Analyze and extract relevant information from large amounts of Amazon's historical business data to help automate and optimize key processes
- Design, develop, evaluate, and deploy innovative and highly scalable models for predictive learning
- Research and implement novel machine learning and statistical approaches
- Work closely with software engineering teams to drive real-time model implementations and new feature creation
- Work closely with business owners and operations staff to optimize various business operations
- Establish scalable, efficient, automated processes for large-scale data analysis, model development, model validation, and model implementation
- Mentor other scientists and engineers in the use of ML techniques

Basic Qualifications
- 3+ years of experience building models for business applications
- PhD, or Master's degree and 4+ years of CS, CE, ML or related field experience
- Experience in patents or publications at top-tier peer-reviewed conferences or journals
- Experience programming in Java, C++, Python, or a related language
- Experience in any of the following areas: algorithms and data structures, parsing, numerical optimization, data mining, parallel and distributed computing, high-performance computing

Preferred Qualifications
- Experience using Unix/Linux
- Experience in professional software development

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company - ADCI - Karnataka - A66
Job ID: A2851691

Posted 1 week ago

Apply

1.0 - 2.0 years

0 Lacs

Kollam

On-site

Amrita Vishwa Vidyapeetham, Bengaluru Campus is inviting applications from qualified candidates for the post of Flutter Developer. For details, contact: paikrishnang@am.amrita.edu

Job Title: Flutter Developer
Location: Kollam, Kerala
Required Number: 2

Job Description
App Development
- Develop and maintain cross-platform mobile applications using Flutter and Dart.
- Build responsive and pixel-perfect UIs based on Figma/Adobe XD/UI designs.
- Implement new features and functionalities based on project requirements.
State Management
- Use appropriate state management techniques such as BLoC, Provider, Riverpod, or GetX.
- Maintain scalable and clean state handling across screens and modules.
API Integration
- Integrate RESTful APIs and handle data fetching, parsing, and error handling.
- Use tools like Dio or HTTP for network calls.
Code Quality
- Write clean, maintainable, and testable Dart code.
- Follow version control best practices using Git.
Testing and Debugging
- Conduct unit testing and widget testing.
- Debug and fix performance, UI, and logic issues during development and after release.
Build & Deployment
- Understand how to build, sign, and release Android (APK/AAB) and iOS apps.
- Collaborate with seniors for publishing apps to the Play Store or App Store.
Documentation
- Maintain proper documentation of code and app architecture.
- Write README files and API usage notes where applicable.
Learning & Improvement
- Stay updated with Flutter releases and best practices.
- Actively learn and apply new tools or libraries relevant to the project.

Qualification: BTech/BCA/MCA/MTech
Job category: Project
Experience: 1-2 years
Last date to apply: June 20, 2025
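The API-integration duties above (fetch, parse, handle errors) have the same shape in any client stack. A language-agnostic sketch in Python is shown below; the payload fields are purely illustrative, and a Flutter app would express the same pattern in Dart with Dio/http and try/catch.

```python
import json

def parse_user(payload: str):
    """Parse an API response body, returning None for malformed data.

    Treats both invalid JSON and missing fields as errors, the way a
    client would treat a failed or malformed network response.
    """
    try:
        data = json.loads(payload)
        return {"id": data["id"], "name": data["name"]}
    except (json.JSONDecodeError, KeyError):
        return None

ok = parse_user('{"id": 7, "name": "Asha"}')
bad = parse_user('{"id": 7}')      # missing "name" field
broken = parse_user('not json')    # malformed body
```

Surfacing failures as a sentinel (or a typed error) rather than letting exceptions escape keeps UI code simple: it only has to branch on success vs. failure.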

Posted 1 week ago

Apply

0 years

0 - 0 Lacs

Delhi

On-site

What You'll Do (Key Responsibilities)
As a Developer Trainee, you'll be part of a structured training and hands-on development track designed to build your capability in Zoho Creator and Deluge scripting. Here's what your role will involve:

Zoho Creator Application Development
- Learn to design and build custom applications using Zoho Creator's drag-and-drop interface.
- Create and configure forms, reports, dashboards, and workflows tailored to specific business use cases.
- Implement best practices in app structuring, form relationships, and user interface optimization.

Deluge Scripting and Logic Building
- Use Deluge scripting to write server-side logic, automate processes, and create dynamic behaviors in apps.
- Write functions for validations, conditional workflows, API calls, and data transformations.
- Maintain readable, modular, and reusable code for future scalability.

Workflow Automation and Business Rules
- Build multi-step workflows using Creator's process automation tools (workflow builder, schedules, approvals).
- Translate client business processes into logical, streamlined automation.
- Configure notifications, escalations, and reminders based on system or user actions.

Integration and API Handling
- Assist in integrating Zoho Creator apps with other Zoho apps (CRM, Books, Desk, etc.) and third-party platforms using REST APIs.
- Configure webhooks, custom functions, and connectors for end-to-end data flow and synchronization.
- Learn OAuth tokens, API authentication, and JSON parsing in a guided setup.

Data Modeling and Reports
- Design efficient database structures with proper form linking and relationship mapping.
- Create dynamic reports, charts, and dashboards to visualize critical business data.
- Optimize performance through effective use of filters, formulas, and custom views.

Testing, Debugging, and Documentation
- Test applications across different scenarios and user roles.
- Identify and debug errors in forms, scripts, or workflows during development and deployment.
- Document modules, logic flow, known issues, and version changes clearly for internal and client use.

Job Type: Full-time
Pay: ₹18,000.00 - ₹20,000.00 per month
Location Type: In-person
Schedule: Day shift, Monday to Friday
Application Question(s):
- Do you reside in West Delhi? Please mention your current location.
- Can you join on an immediate basis?
Work Location: In person
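Conditional workflow logic of the kind described here (notifications and escalations driven by record state) is written in Deluge inside Creator; the decision rule itself can be sketched in any language. The Python sketch below uses invented field names (`status`, `due_date`) and thresholds purely as illustrative assumptions.

```python
from datetime import date, timedelta

def escalation_action(record, today):
    """Decide which workflow action a record should trigger.

    Hypothetical rule: open records more than 7 days overdue escalate
    to a manager; open records past due get a reminder; otherwise nothing.
    """
    overdue = today - record["due_date"]
    if record["status"] == "Open" and overdue > timedelta(days=7):
        return "escalate_to_manager"
    if record["status"] == "Open" and overdue > timedelta(days=0):
        return "send_reminder"
    return "no_action"

record = {"status": "Open", "due_date": date(2025, 6, 1)}
action = escalation_action(record, date(2025, 6, 12))  # 11 days overdue
```

In Creator, the same branches would live in a workflow's Deluge function, with the return value mapped to a notification, reminder, or approval step.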

Posted 1 week ago

Apply

6.0 years

0 - 0 Lacs

Coimbatore

Remote

SFCC | 6 to 8 years | Onsite | Bangalore/Pune | Work Timing: Standard IST

Job Description:
- 6 to 8+ years of SFCC experience
- Experience leading and working with geographically dispersed development teams
- Experience in design and development of third-party (backend) integrations
- Expertise in developing third-party backend integrations such as address/shipping/tax validation, payment authorization, CRM, and OMS
- Certified Demandware developer
- Demandware architecture knowledge
- Experience with complex ecommerce implementations on the Demandware platform
- Good understanding of object-oriented programming principles
- Proficient in Demandware foundational concepts, with knowledge of Site Genesis, UX Studio, Pipelines, ISML templates, DW Scripts, Content Assets/Slots, and the Demandware catalogue, including catalogue, category, and products
- Creating pipelines for feed jobs
- Understanding of product catalogs/price books
- Understanding of global expansion and locales
- Deep understanding of J2EE, including knowledge of EJBs, JDBC, Servlets/JSP, XML parsing and manipulation, and SOA
- Proficient in the DOM structure and jQuery implementation
- Knowledge of Site Genesis, search/category landing, and PDP page code flow and pipeline structure
- Hands-on experience with all options/tasks/interfaces available in Business Manager
- Experience with web technologies: JavaScript, jQuery, HTML5, CSS, LESS, ecommerce, enterprise application integration, security, performance & scalability
- Experience building responsive and adaptive pages
- Good understanding of deployment considerations specific to the Demandware SaaS solution
- Experience with agile Scrum development methodologies
- Computer science degree, or equivalent qualification/experience
- Design and develop solutions that incorporate industry best practices for internet application design, usability, and data security
- Experience working in high-volume user environments where scalability is a primary concern
- Demandware CSSuite
- Gitlab/Github/SVN
- Passionate yet methodical approach to problem solving against tight deliverables
- Self-starter with excellent communication skills

In summary, we are looking for:
- Good knowledge of SFRA architecture and hands-on experience in SFRA
- Good knowledge of product catalogs, search, refinements, promotions, price books, and inventory
- Expertise in third-party service implementation
- Hands-on experience in custom basket, shipment, and order implementations
- Hands-on experience in payment integration (at least one end-to-end flow)
- Good knowledge of OCAPI

Job Type: Contractual / Temporary
Contract length: 6 months
Pay: ₹70,000.00 - ₹80,000.00 per month
Benefits: Work from home
Schedule: Monday to Friday; morning, UK, or US shift
Application Question(s): Are you ready to move onsite (Bangalore/Pune)?
Education: Bachelor's (Preferred)
Experience: SFCC: 6 years (Preferred)

Posted 1 week ago

Apply

2.0 - 3.0 years

0 - 0 Lacs

Tiruchchirāppalli

On-site

Job Summary
We are happy to introduce ourselves: TechZarInfo offers a complete web design and development service. We provide the expertise and know-how to deliver unique websites to clients across a wide range of sectors, from branding, website design, website development, ecommerce, content management systems (CMS website development), and intranets to search engine optimization and mobile applications for iPhone and Android. We are looking for experienced Android application developers.

Job Description:
· Excellent technical knowledge of Core Java for Android and the Eclipse IDE
· Good understanding of OOP concepts and design patterns
· Familiar with Eclipse IDE, Android Studio, Kotlin, Android debugging tools, and Material Design
· Experience in web service integration, XML/JSON parsing, and RESTful web services
· Familiarity with the process of deploying apps to Google Play
· Good understanding of the following concepts: memory management, application life cycle, multi-threading, web service consumption, databases, social media integration, cloud integration, location-based services, media streaming, encryption/decryption, push notifications, in-app purchases, mobile ads, etc.
· Experience in working with third-party libraries and tools (for example: Facebook and Twitter APIs, Google Maps API)

Experience: 2 - 3 years
Benefits: The company provides performance awards every month and seasonal appraisals.
Shift: Day shift
Notice Period: Immediate joiners preferred; a maximum of 30 days is permitted.
Pay Scale: Best in industry.

If the above JD matches your profile, kindly forward your resume to career@techzarinfo.com and hr.techzar@gmail.com. Kindly do not submit irrelevant profiles. You can contact us at 9943761441, 9952885799.
Note: Interested candidates can walk in to our office on weekdays from 10.00 am to 5.00 pm.
Job Type: Full-time
Pay: ₹15,000.00 - ₹35,000.00 per month
Schedule: Day shift
Expected Start Date: 05/06/2025

Posted 1 week ago

Apply

4.0 years

12 - 15 Lacs

Ahmedabad

Remote

Role: Xamarin Developer
Responsibilities:
- 4+ years of hands-on experience in designing, architecting, developing, and delivering cross-platform Xamarin.Android, Xamarin.iOS, and MAUI applications using Microsoft Visual Studio / VS for Mac.
- Experience in designing and developing custom controls for Android, Windows, and iOS.
- Strong knowledge of OOPS and intermediate knowledge of front-end GUI design and development.
- Working knowledge of data parsing, storing, and related patterns.
- Strong in C#, the .NET framework, and object-oriented programming patterns and practices.
- Expertise in the MVVM pattern and code-sharing patterns such as dependency service and service locator.
- Capability to write and execute unit test cases.
- Solid foundation in data structures, algorithms, and object-oriented design.
- Independent researching, solution-finding, analysis, and problem-solving skills and capabilities.
- Knowledge of the Agile and Scrum approach is a must.
- Demonstrated ability to apply Microsoft practices and patterns as guided by the Architect.
- Ability to complete all phases of the software development life cycle, including analysis, design, functionality, testing, and support.
- Knowledge of source control such as SVN, GitLab, and GitHub is preferred.
- Must be a team player and self-starter with the ability to communicate clearly with clients and team members in English as the main language.
- Knowledge of Bluetooth connectivity (BLE) is a plus. Knowledge of Drawing Library will be an added benefit.
- Similar to Bikash and Pallav, the candidate should primarily work remotely but be available to work onsite in Pune on a need basis.
Job Type: Full-time Pay: ₹1,200,000.00 - ₹1,500,000.00 per year Benefits: Flexible schedule Health insurance Provident Fund Work from home Schedule: Day shift Monday to Friday Supplemental Pay: Performance bonus Experience: MAUI: 1 year (Required) Work Location: In person

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.


Download the Mobile App

Instantly access job listings, apply easily, and track applications.
