Jobs
Interviews

4894 Data Processing Jobs - Page 42

Set up a Job Alert
JobPe aggregates results for easy access, but you apply directly on the job portal.

2.0 - 5.0 years

5 - 9 Lacs

Pune

Work from Office

Project Role: Advanced Application Engineer
Project Role Description: Develop innovative technology solutions for emerging industries and products. Interpret system requirements into design specifications.
Must Have Skills: Machine Learning
Good to Have Skills: NA
Minimum 2 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As an Advanced Application Engineer, you will engage in a dynamic environment where you will utilize modular architectures and next-generation integration techniques. Your typical day will involve collaborating with Application Development Teams, fostering an Agile mindset, and contributing to projects of varying scopes and scales, all while maintaining a cloud-first and mobile-first approach to deliver innovative solutions.

Roles & Responsibilities:
- Perform independently and become an SME.
- Participate actively and contribute in team discussions.
- Contribute solutions to work-related problems.
- Assist in the design and implementation of modular architectures to enhance application performance.
- Collaborate with cross-functional teams to ensure alignment and integration of project goals.

Professional & Technical Skills:
- Must Have: Proficiency in Machine Learning.
- Good to Have: Experience with data processing frameworks such as Apache Spark or Hadoop.
- Strong understanding of machine learning algorithms and their applications.
- Experience with programming languages such as Python or R for data analysis.
- Familiarity with cloud platforms like AWS or Azure for deploying machine learning models.

Additional Information:
- The candidate should have a minimum of 2 years of experience in Machine Learning.
- This position is based at our Pune office.
- 15 years of full-time education is required.

Posted 3 weeks ago

Apply

3.0 - 5.0 years

4 - 7 Lacs

Hyderabad

Work from Office

Profile:
We are looking for a highly skilled Deep Learning Engineer to develop, implement, and optimize deep learning models. The ideal candidate will have expertise in neural networks, data processing, and model deployment to drive advanced AI solutions across various industries.

Requirements:
- Experience developing and implementing computer vision products with large-scale data processing/analytics.
- Track record of building state-of-the-art AI/ML and DL applications/products.
- Experience with TensorFlow, PyTorch, ONNX, MXNet, Caffe, OpenCV, Keras, and neural network architectures such as CNN, R-CNN, Mask R-CNN, DNN, RNN, YOLO, ResNeXt, GoogLeNet, AlexNet, ResNet, SegNet, and VGG.
- Solid programming skills in Python and C/C++; MATLAB experience is a plus.
- Experience with GPU computing (CUDA, cuDNN, OpenCL) and HPC.
- Knowledge of and experience with pykafka, Kafka brokers, and various messaging and streaming services.
- Hands-on experience developing efficient computer vision products.
- Extensive experience with clouds such as AWS, Azure, and Google Cloud.
- Ability to implement research papers at the expected level of quality.
- Experience deploying applications as Dockerized containers with microservices.
- Experience with NoSQL databases.
- Ability to optimize models with TensorRT and publish them to various edge devices.
- Hands-on experience deploying optimized models on NVIDIA Jetson Nano, TX1, TX2, Xavier NX, AGX Xavier, Raspberry Pi, and other edge devices with DeepStream, GStreamer, etc.
- Experience with Samsung Neural is an advantage.

Responsibilities:
- Own product development milestones, ensure delivery to the architecture, and identify challenges.
- Drive innovation in the product and establish engineering best practices to be followed by core product development teams in the company.
- Develop, port, and optimize computer vision algorithms and data structures on proprietary cores; design, implement, validate, and release applications.
- Contribute to research and development of advanced product-critical computer vision components covering key perception features such as feature extraction, object tracking, and sensor calibration.
- Apply a solid foundation in computer vision: photogrammetry, multi-view geometry, visual SLAM, detection and recognition, 3D reconstruction.
- Write maintainable, reusable code, leveraging test-driven principles to develop high-quality computer vision and machine learning modules.
- Demonstrate visualization skills and a solid understanding of deep neural networks; experience with object detection, tracking, classification, recognition, and scene understanding.
- Show excellent knowledge of core computer vision concepts (image classification, object detection, semantic segmentation) developed using state-of-the-art deep learning algorithms.
- Experience/exposure to open-source technologies.
- Own the technical architecture of products, plan roadmaps, and technically manage the team.
- Evaluate and advise on new technologies, vendors, products, and competitors.
- Show initiative and the ability to work independently and in a team; must have managed teams while remaining a hands-on technology person.

Posted 3 weeks ago

Apply

4.0 - 6.0 years

6 - 10 Lacs

Pune

Work from Office

Skills:
1] Hands-on experience in LLM/ML-based application development
2] Productionizing LLM/ML apps; data handling/processing/engineering; dataset creation/curation
3] Understanding of different LLM performance metrics, fine-tuning, and prompt engineering
4] Image/video processing
5] Generative AI, OpenAI, and Claude knowledge; strong prompt engineering; AutoGen or similar agentic framework knowledge

Posted 3 weeks ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Pune

Work from Office

Skills: Django, Django Custom UI, Python, REST Framework, ORM, HTML & CSS, ChatGPT prompting, Git, SQL, Postgres, industry standards and best practices, JSON handling, data processing, working in a team environment. WhatsApp Meta API experience is a plus.

Posted 3 weeks ago

Apply

1.0 - 3.0 years

3 - 5 Lacs

Dhule

Work from Office

Job Overview:
The Computer Operator is responsible for overseeing the daily operation of computer systems used in construction project management. This includes managing project data, assisting with design and scheduling software, troubleshooting hardware and software issues, and ensuring that all construction technology runs smoothly to support ongoing and upcoming projects.

Key Responsibilities:
- System Management: Monitor and maintain construction management software systems, including project management, scheduling, and budgeting tools. Operate and oversee design and drafting software (e.g., AutoCAD, Revit) to support engineers, architects, and project managers. Ensure all construction-related data is accurately stored, organized, and backed up in the system.
- Data Entry & Processing: Input and update project information, including material costs, labor hours, progress reports, and other relevant construction data. Generate daily, weekly, and monthly reports for project managers and stakeholders. Maintain accurate digital records of construction timelines, budgets, and resource allocation.
- Collaboration: Work closely with engineers, project managers, architects, and field teams to ensure that technological systems meet project needs. Communicate system issues to management and IT personnel to ensure quick resolution.
- Compliance & Security: Ensure that all construction data complies with industry regulations and company policies. Assist in maintaining cybersecurity protocols for sensitive construction data. Perform regular data backups and ensure the security of digital project information.

Skills & Qualifications:
- Technical Skills: Proficiency in construction management software (e.g., Procore, Buildertrend, Microsoft Project). Familiarity with design and drafting software (e.g., AutoCAD, Revit). Basic knowledge of networking and troubleshooting hardware and software issues.
- Experience: Previous experience in a computer operator or similar role, preferably within the construction industry. Understanding of construction terminology, processes, and industry standards.
- Education: High school diploma or equivalent; associate's degree in computer science, information technology, or a related field preferred.

Mandatory Key Skills: troubleshooting, compliance, cybersecurity protocols, data entry, data processing, system management, construction project management, AutoCAD, Revit

Posted 3 weeks ago

Apply

0.0 - 5.0 years

1 - 5 Lacs

Mumbai, Pune, Mumbai (All Areas)

Work from Office

Hiring Data Entry Operator. Typing speed 30+ WPM. Enter, update, and maintain data in computer systems. Proficient in MS Excel and Word. Freshers and experienced candidates can apply. Back office, computer operator, and data entry roles. Immediate joiners preferred.

Posted 3 weeks ago

Apply

2.0 - 5.0 years

18 - 25 Lacs

Bengaluru

Work from Office

Job Title: Data Engineer | Python & SQL | Intelligence AI Platform
Experience: 2-5 Years

Job Description:
We are looking for a skilled Data Engineer (2-5 years) to join our team working on Intelligine, an AI-powered content creation and marketing platform. The ideal candidate will have strong expertise in Python, SQL, and ETL/ELT pipelines, with a proven ability to work on scalable data solutions.

Key Responsibilities:
- Design, build, and maintain robust ETL/ELT pipelines using Python and SQL
- Develop optimized SQL queries (joins, CTEs, window functions)
- Integrate data from multiple sources: APIs, flat files, databases, and third-party tools
- Optimize workflows, query performance, and database design
- Collaborate with cross-functional teams to deliver secure and scalable data solutions
- Automate processes and monitoring to ensure pipeline reliability

Qualifications:
- Bachelor's/Master's in Computer Science, Data Engineering, or a related field
- 2-5 years of professional experience as a Data Engineer (Python & SQL focus)
- Strong understanding of data structures, algorithms, and DBMS

Attributes:
- Strong analytical and problem-solving mindset
- Effective communication and collaboration skills
- Attention to detail and code quality
- Self-driven; able to work independently or in teams
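As a rough illustration of the CTE and window-function skills this role lists, here is a minimal, self-contained sketch using Python's built-in sqlite3 module (the orders table and column names are hypothetical, and window functions require SQLite 3.25+):

```python
import sqlite3

# In-memory database with a hypothetical orders table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer TEXT, amount REAL);
INSERT INTO orders VALUES
  ('alice', 120.0), ('alice', 80.0), ('bob', 200.0), ('bob', 50.0);
""")

# A CTE computes per-customer totals; a window function then ranks
# customers by total spend without collapsing rows.
query = """
WITH totals AS (
  SELECT customer, SUM(amount) AS total
  FROM orders
  GROUP BY customer
)
SELECT customer, total,
       RANK() OVER (ORDER BY total DESC) AS spend_rank
FROM totals
ORDER BY spend_rank;
"""
rows = conn.execute(query).fetchall()
print(rows)  # [('bob', 250.0, 1), ('alice', 200.0, 2)]
```

The same query shape carries over to Postgres or BigQuery; only the connection layer changes.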

Posted 3 weeks ago

Apply

4.0 - 8.0 years

5 - 8 Lacs

Chennai

Work from Office

Responsibilities (What you'll do):
- Engineer, test, document, and manage GCP Dataproc, Dataflow, and Vertex AI services used in high-performance data processing pipelines and machine learning.
- Help developers optimize data processing jobs using Spark, Python, and Java.
- Collaborate with development teams to integrate data processing pipelines with other cloud services and applications.
- Utilize Terraform and Tekton for infrastructure as code (IaC) and CI/CD pipelines, ensuring efficient deployment and management.

Good to have:
- Experience with Spark for large-scale data processing.
- Solid understanding of and experience with GitHub for version control and collaboration.
- Experience with Terraform for infrastructure management and Tekton for continuous integration and deployment.
- Experience with Apache NiFi for data flow automation.
- Knowledge of Apache Kafka for real-time data streaming.
- Familiarity with Google Cloud Pub/Sub for event-driven systems and messaging.
- Familiarity with Google BigQuery.

Mandatory Key Skills: Python, Java, Google Cloud Pub/Sub, Apache Kafka, BigQuery, CI/CD, Machine Learning, Spark
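The Spark job optimization mentioned above revolves around a map-then-merge pattern: each partition is aggregated independently, and the partial results are combined. As a rough, dependency-free sketch of that pattern (plain Python, not actual Spark code; the log-line data is invented):

```python
from collections import Counter
from functools import reduce

# Hypothetical input partitions, as a distributed job might read them
# from storage; each partition is processed independently (the "map" side).
partitions = [
    ["error", "info", "info"],
    ["warn", "error", "info"],
]

def count_partition(lines):
    """Map step: build a partial count for one partition."""
    return Counter(lines)

def merge(a, b):
    """Reduce step: combine partial counts, as executor results are merged."""
    return a + b

partial_counts = [count_partition(p) for p in partitions]
totals = reduce(merge, partial_counts)
print(dict(totals))  # {'error': 2, 'info': 3, 'warn': 1}
```

In Spark this shape becomes `rdd.map(...).reduceByKey(...)` (or a DataFrame group-by), with the framework handling partitioning and shuffles.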

Posted 3 weeks ago

Apply

4.0 - 9.0 years

10 - 20 Lacs

Pune

Work from Office

Skills:
1] Hands-on experience in LLM/ML-based application development
2] Productionizing LLM/ML apps; data handling/processing/engineering; dataset creation/curation
3] Understanding of different LLM performance metrics, fine-tuning, and prompt engineering
4] Image/video processing
5] Generative AI, OpenAI, and Claude knowledge; strong prompt engineering; AutoGen or similar agentic framework knowledge

Mandatory Key Skills: prompt engineering, generative AI, performance metrics, data processing, OpenAI, Claude, LLM, data engineering, artificial intelligence, video processing, application development

Posted 3 weeks ago

Apply

8.0 - 13.0 years

8 - 12 Lacs

Hyderabad, Ahmedabad

Work from Office

The Team: The Market Intelligence Industry Data Solutions business line provides data technology and services supporting acquisition, ingestion, content management, mastering, and distribution to power our Financial Institution Group (FIG) business and customer needs. We focus on platform scalability to support business operations by following a common data lifecycle that accelerates business value. Our team provides essential intelligence for the Financial Services, Real Estate, and Insurance industries.

The Impact: The Data Engineering team will be responsible for implementing and maintaining services and tools to support existing feed systems. This enables users to consume FIG datasets and makes FIG data available for broader consumption and processing within the company.

What's in it for you: Opportunity to work with global stakeholders and engage with the latest tools and technologies.

Responsibilities:
- Build new data acquisition and transformation pipelines using advanced data processing and cloud technologies.
- Collaborate with the broader technology team, including information architecture and data integration teams, to align pipelines with strategic initiatives.

What We're Looking For:
- Bachelor's degree in computer science or a related field, with at least 8 years of professional software development experience.
- Must have: programming languages commonly used for data processing; data orchestration and workflow management systems; distributed data processing frameworks; relational database management systems; big data processing frameworks.
- Experience with large-scale data processing platforms.
- Deep understanding of RESTful services, good API design, and object-oriented programming principles.
- Proficiency in object-oriented or functional scripting languages.
- Good working knowledge of relational and NoSQL databases.
- Experience maintaining and developing software in production environments using cloud-based tools.
- Strong collaboration and teamwork skills, along with excellent written and verbal communication abilities.
- Self-starter and motivated individual with the ability to thrive in a fast-paced software development environment.
- Agile experience is highly desirable.
- Experience with data warehousing and analytics platforms is a significant advantage.

Technical Expertise:
- Data Engineering: Strong experience in distributed data processing and optimization using modern frameworks.
- Cloud Platforms: Proficient in leveraging cloud services for scalable data solutions, including ETL orchestration, containerized deployments, and data storage.
- Workflow Orchestration: Skilled in designing and managing complex data pipelines and workflows.
- Programming & Scripting: Proficient in writing clean, modular, and testable code for data processing tasks.
- Database Management: Solid understanding of both relational and non-relational databases, including data querying and modeling.
- ETL & Data Architecture: Proven ability to design and implement robust ETL pipelines and optimize data models for performance and scalability.

Soft Skills:
- Excellent communication skills; able to articulate technical concepts to non-technical stakeholders.
- Strong interpersonal skills; collaborative, empathetic, and team-oriented.
- Demonstrated ability to work on challenging projects and go the extra mile to deliver results.

Preferred Qualifications:
- Experience with CI/CD pipelines, GitHub, and DevOps practices is a must.
- Familiarity with data lake and data warehouse architectures.
- Exposure to real-time data processing frameworks and observability tools such as Grafana is an added advantage.

Posted 3 weeks ago

Apply

3.0 - 5.0 years

9 - 11 Lacs

Pune

Work from Office

Responsibilities:
- Design, develop, and maintain robust and scalable backend systems using Django and Python.
- Develop RESTful APIs using Django REST Framework to power our frontend applications.
- Implement efficient database solutions using PostgreSQL and Django ORM.
- Write clean, well-documented, and maintainable code.
- Collaborate with the frontend team to ensure seamless integration between frontend and backend components.
- Optimize application performance and scalability.
- Implement security best practices to protect our applications and user data.
- Stay up to date with the latest technologies and industry trends.
- Contribute to the development of new features and improvements.

Skills: Django, Django Custom UI, Python, REST Framework, ORM, HTML & CSS, ChatGPT prompting, Git, SQL, Postgres, industry standards and best practices, JSON handling, data processing, working in a team environment. WhatsApp Meta API experience is a plus.

Mandatory Key Skills: ChatGPT, Git, SQL, Postgres, JSON, data processing, WhatsApp Meta API, Django, Python, REST Framework, ORM, HTML, CSS
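The JSON handling and validation such a backend performs can be sketched framework-free in a few lines (the field names and rules below are hypothetical, standing in for what a Django REST Framework serializer's is_valid()/validated_data flow would do on a POST body):

```python
import json

def validate_order(payload: str) -> dict:
    """Parse and validate a hypothetical JSON order payload.

    Returns cleaned data, mimicking in plain Python what a DRF
    serializer does before data reaches the ORM.
    """
    data = json.loads(payload)
    errors = {}
    if not isinstance(data.get("customer"), str) or not data["customer"].strip():
        errors["customer"] = "a non-empty string is required"
    if not isinstance(data.get("amount"), (int, float)) or data["amount"] <= 0:
        errors["amount"] = "a positive number is required"
    if errors:
        # DRF would surface these as a 400 response body.
        raise ValueError(errors)
    return {"customer": data["customer"].strip(), "amount": float(data["amount"])}

print(validate_order('{"customer": " alice ", "amount": 42}'))
# {'customer': 'alice', 'amount': 42.0}
```

In the actual stack, the cleaned dict would feed a Django ORM `create()` call against Postgres.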

Posted 3 weeks ago

Apply

8.0 - 13.0 years

14 - 18 Lacs

Chennai

Work from Office

About the Team: The ZF COE team communicates complex technical concepts related to AI, ML, DL, and RL to both technical and non-technical audiences. This might involve presenting research findings at conferences or writing papers for academic journals.

What you can look forward to as Senior Data Scientist - AI/ML (m/f/d):
- Conduct cutting-edge research to identify and develop novel AI/ML methodologies, including Deep Learning (DL) and Reinforcement Learning (RL).
- Design and conduct experiments to test hypotheses, validate new approaches, and compare the effectiveness of different ML algorithms.
- Analyze data to uncover hidden patterns and relationships that can inform the development of new AI techniques.
- Stay at the forefront of the field by keeping abreast of the latest advancements in algorithms, tools, and theoretical frameworks. This might involve researching areas like interpretability of machine learning models or efficient training methods for deep neural networks.
- Prototype and explore the potential of advanced machine learning models, including deep learning architectures such as convolutional neural networks (CNNs) or recurrent neural networks (RNNs).
- Contribute to the development of fundamental algorithms and frameworks that can be applied to various machine learning problems, whether by improving existing algorithms or exploring entirely new approaches.
- Focus on theoretical aspects of model design, such as improving model efficiency, reducing bias, or achieving explainability in complex models.
- Document research methodologies, experimental procedures, and theoretical contributions for future reference and knowledge sharing within the research community.
- Contribute to the research team's long-term research goals and strategies.

Your profile as Senior Data Scientist - AI/ML (m/f/d):
- Master's degree in Mathematics, Computer Science, or another related technical field; a PhD is a plus.
- 8+ years of experience in research and development, with a strong understanding of data structures and algorithms.
- 5+ years of experience building and deploying machine learning models in production.
- Proven experience with machine learning frameworks such as TensorFlow, PyTorch, or scikit-learn.
- Experience with distributed computing systems and large-scale data processing.
- Excellent communication and collaboration skills.
- Contribution to the invention disclosure process.
- A strong publication record in top machine learning conferences or journals.

Posted 3 weeks ago

Apply

0.0 - 1.0 years

1 - 3 Lacs

Bengaluru

Work from Office

Opening for Research Analyst Consultant
Experience: Freshers
Qualification: 2nd PUC, Diploma, B.A., BCA, B.E., MCA (please do not share CVs with non-relevant qualifications)

Job Details:
The company provides IT-enabled data research and processing services for US organizations in the academic domain. We are a team of highly motivated professionals providing the highest quality data with the shortest turnaround time. We collect data related to academic faculty at various US universities affiliated with government or privately aided organizations. This role focuses on collecting the career history of such faculty members manually from websites such as university webpages, LinkedIn, ScienceDirect, Google Scholar, etc. Data collection includes personal information, articles, books, awards, salary, etc. This data is collected and fed into our database through a web application, based on classified affiliations (such as university type, department type, and degree type).

Qualifications & Skills:
- Diploma or graduation from any reputed university; freshers preferred. Experience of 0-1 years can be considered.
- Proficiency in English (oral and written).
- Good internet research ability is a must.
- Ability to work independently with minimal follow-up on assigned tasks.
- Familiarity with Microsoft Office applications.
- Good learner; able to grasp concepts and think creatively when necessary.
- Open-minded, receptive, and adaptable to our work culture.
- Able to meet required goals with high quality and on time, stretching timelines when needed to meet deadlines.

Responsibilities (including but not limited to):
- Article matching via the RealTime tool and DVI application.
- Onboard new contacts to the database.
- Collect and update degree information (degree year, institution, and highest degree, usually a Ph.D.).
- Collect person-specific URLs (used as an article-matching aid and in system scoring).
- Collect faculty information from university websites.
- Ad hoc assignments as needed.

Preferences:
- Candidates should preferably be located in Bangalore.
- A laptop and broadband connection are mandatory.
- This is a non-voice, non-technical opportunity.

Mandatory Key Skills: data processing, research analyst, internet research, data collection, data research

Posted 3 weeks ago

Apply

3.0 - 5.0 years

10 - 15 Lacs

Bengaluru

Work from Office

Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow
Programming Languages: Java; scripting languages like Python, Shell Script, SQL
- 5+ years of experience in IT application delivery with proven experience in agile development methodologies
- 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, and data processing with Dataflow)

Posted 3 weeks ago

Apply

8.0 - 13.0 years

14 - 18 Lacs

Bengaluru

Work from Office

The Solution Architect / Data Engineer will design, implement, and manage data solutions for the insurance business, leveraging expertise in Cognos, DB2, Azure Databricks, ETL processes, and SQL. The role involves working with cross-functional teams to design scalable data architectures and enable advanced analytics and reporting, supporting the company's finance, underwriting, claims, and customer service operations.

Key Responsibilities:
- Data Architecture & Design: Design and implement robust, scalable data architectures and solutions in the insurance domain using Azure Databricks, DB2, and other data platforms.
- Data Integration & ETL: Lead the development and optimization of ETL pipelines to extract, transform, and load data from multiple sources, ensuring data integrity and performance.
- Cognos Reporting: Oversee the design and maintenance of Cognos reporting systems, developing custom reports and dashboards to support business users in finance, claims, underwriting, and operations.
- Data Engineering: Design, build, and maintain data models, data pipelines, and databases to enable business intelligence and advanced analytics across the organization.
- Cloud Infrastructure: Develop and manage data solutions on Azure, including Databricks for data processing, ensuring seamless integration with existing systems (e.g., DB2, legacy platforms).
- SQL Development: Write and optimize complex SQL queries for data extraction, manipulation, and reporting, with a focus on performance and scalability.
- Data Governance & Quality: Ensure data quality, consistency, and governance across all data solutions, implementing best practices and adhering to industry standards (e.g., GDPR, insurance regulations).
- Collaboration: Work closely with business stakeholders, data scientists, and analysts to understand business needs and translate them into technical solutions that drive actionable insights.
- Solution Architecture: Provide architectural leadership in designing data platforms, ensuring that solutions meet business requirements, are cost-effective, and can scale for future growth.
- Performance Optimization: Continuously monitor and tune the performance of databases, ETL processes, and reporting tools to meet service level agreements (SLAs).
- Documentation: Create and maintain comprehensive technical documentation, including architecture diagrams, ETL process flows, and data dictionaries.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Proven experience as a Solution Architect or Data Engineer in the insurance industry, with a strong focus on data solutions.
- Hands-on experience with Cognos (reporting and dashboarding) and DB2 (database management).
- Proficiency in Azure Databricks for data processing, machine learning, and real-time analytics.
- Extensive experience in ETL development, data integration, and data transformation processes.
- Strong knowledge of Python and SQL (advanced query writing, optimization, and troubleshooting).
- Experience with cloud platforms (Azure preferred) and hybrid data environments (on-premises and cloud).
- Familiarity with data governance and regulatory requirements in the insurance industry (e.g., Solvency II, IFRS 17).
- Strong problem-solving skills, with the ability to troubleshoot and resolve complex technical issues related to data architecture and performance.
- Excellent verbal and written communication skills, with the ability to work effectively with both technical and non-technical stakeholders.

Preferred Qualifications:
- Experience with other cloud-based data platforms (e.g., Azure Data Lake, Azure Synapse, AWS Redshift).
- Knowledge of machine learning workflows, leveraging Databricks for model training and deployment.
- Familiarity with insurance-specific data models and their use in finance, claims, and underwriting operations.
- Certifications in Azure Databricks, Microsoft Azure, DB2, or related technologies.
- Knowledge of additional reporting tools (e.g., Power BI, Tableau) is a plus.

Key Competencies:
- Technical Leadership: Ability to guide and mentor development teams in implementing best practices for data architecture and engineering.
- Analytical Skills: Strong analytical and problem-solving skills, with a focus on optimizing data systems for performance and scalability.
- Collaborative Mindset: Ability to work effectively in a cross-functional team, communicating complex technical solutions in simple terms to business stakeholders.
- Attention to Detail: Meticulous attention to detail, ensuring high-quality data output and system performance.

Mandatory Key Skills: ETL development, SQL, data governance, machine learning, Microsoft Azure, solution architecture, data engineering, Cognos reporting, DB2, Azure Databricks, Python

Posted 3 weeks ago

Apply

5.0 - 7.0 years

15 - 17 Lacs

Chennai

Work from Office

Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow
Programming Languages: Java; scripting languages like Python, Shell Script, SQL
- 5+ years of experience in IT application delivery with proven experience in agile development methodologies
- 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, and data processing with Dataflow)
Mandatory Key Skills: Java, Dataflow, SQL, shell scripting, agile, cloud platform, Google, gen, data processing, Hive, Sqoop, Spark, Hadoop, AWS, big data, JavaScript, Ansible, Docker, Jenkins, Linux, Microsoft Azure, HTML, Git, Google Cloud Platform

Posted 3 weeks ago

Apply

0.0 - 2.0 years

1 - 4 Lacs

Chandigarh, Gurugram, Mumbai (All Areas)

Work from Office

Roles and Responsibilities:
This is a voice and non-voice process. Candidates must be able to deal with customers and meet customer requirements, and should be ready to work.

Desired Candidate Profile:
- Industry: BPO / Call Centre / ITES
- Functional Area: ITES, BPO, KPO, LPO, Customer Service, Operations
- Role Category: Voice
- Role: Associate / Senior Associate (Non-Technical)
- Education: Graduate in any discipline / undergraduates / dropouts
- Age limit: 18 to 32; 12th pass or any degree or diploma holders can apply

Perks and Benefits:
- Domestic and international call center roles
- Salary 15,000 to 35,000, plus incentives up to 1 lakh
- No fees
- WhatsApp: 9781021114; Call: 9988350971, 01725000971, 7508062612, 9988353971

Posted 3 weeks ago

Apply

0.0 - 4.0 years

1 - 4 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Urgent hiring for Data Entry Operator. Basic typing speed and basic computer knowledge required. Freshers and experienced candidates can both apply. No targets and no time constraints.

Posted 3 weeks ago

Apply

0.0 - 2.0 years

1 - 2 Lacs

Bengaluru

Remote

Ability to type and perform data entry with speed and accuracy alongside work management. Ability to communicate effectively with others. Prior experience using a computer terminal for data entry preferred.

Posted 3 weeks ago

Apply

0.0 - 2.0 years

1 - 3 Lacs

Noida, New Delhi, Delhi / NCR

Work from Office

Role: Urgent Opening: Data Entry / Back Office Coordinator
Industry Type: BPM / BPO
Department: Customer Success, Service & Operations
Employment Type: Full Time, Permanent
Role Category: Back Office
Education: UG: Graduation Not Required

Posted 3 weeks ago

Apply

0.0 - 2.0 years

1 - 3 Lacs

Mumbai, Mumbai Suburban

Work from Office

Role: Urgent Opening: Data Entry / Back Office Coordinator
Industry Type: BPM / BPO
Department: Customer Success, Service & Operations
Employment Type: Full Time, Permanent
Role Category: Back Office
Education: UG: Graduation Not Required

Posted 3 weeks ago

Apply

0.0 - 4.0 years

2 - 5 Lacs

Suryapet

Remote

Updating and maintaining databases, archives, and filing systems. Monitoring and reviewing databases and correcting errors or inconsistencies. Generating and exporting data reports, spreadsheets, and documents as needed.

Required Candidate Profile: The ability to manage and process high volumes of data accurately. Good understanding of databases and digital and paper filing systems. Knowledge of administrative and clerical operations.

Perks and Benefits: Flexible hours. Retirement plans.

Posted 3 weeks ago

Apply

10.0 - 15.0 years

16 - 22 Lacs

Chennai

Remote

Looking for an Oracle EBS Technical expert with strong skills in PL/SQL, Forms, Reports, Workflows, Interfaces & Conversions. Proficient in Python scripting & Java APIs. Knowledge of R12, performance tuning, debugging.

Posted 3 weeks ago

Apply

0.0 - 2.0 years

1 - 3 Lacs

Noida, Ghaziabad, New Delhi

Work from Office

Role & responsibilities: Good English typing with at least 30 WPM typing speed. Basic knowledge of MS Office. Coordination with the internal team. Candidates based in Delhi preferred.

Posted 3 weeks ago

Apply

3.0 - 5.0 years

0 Lacs

India

On-site

DESCRIPTION Join our Global Retail Systems Development team at Amazon, where you'll collaborate on innovative Competitive Monitoring solutions. Your work will focus on creating effective distributed systems, data mining applications, and scalable web solutions that drive retail insights. We're seeking imaginative engineers who are curious about solving technical challenges in data processing and analysis. You'll develop efficient solutions for competitive data collection while working with modern technologies that impact Amazon's retail decisions. Your role involves designing intuitive APIs and creating accessible user interfaces that enhance our competitive analysis capabilities. As a self-starter based in Bangalore, you'll be part of a collaborative team that values creative approaches to technical challenges. Working with colleagues in Chennai and Seattle, you'll help develop adaptable solutions for data collection and analysis. Your responsibilities will include building resilient crawling systems, implementing data processing pipelines, and developing analytical tools that support informed decision-making. You'll create trustworthy solutions that prioritize security, performance, and reliability. This role offers opportunities to persist through challenging projects while maintaining up-to-date knowledge of emerging technologies. This position combines retail technology expertise with competitive analysis, providing space for multitasking and innovative thinking. You'll contribute to Amazon's future by developing efficient monitoring systems and collaborating with teams across global locations. 
BASIC QUALIFICATIONS
- 3+ years of non-internship professional software development experience
- 2+ years of non-internship design or architecture (design patterns, reliability, and scaling) experience with new and existing systems
- Experience programming with at least one software programming language

PREFERRED QUALIFICATIONS
- 3+ years of full software development life cycle experience, including coding standards, code reviews, source control management, build processes, testing, and operations
- Bachelor's degree in computer science or equivalent

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Posted 3 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies