
4902 Data Processing Jobs - Page 50

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

0.0 - 5.0 years

1 - 5 Lacs

Mumbai, Pune, Mumbai (All Areas)

Work from Office

Freshers and experienced candidates are both welcome. Vacancies available for Data Entry / Computer Operator roles; immediate joiners preferred. Education: 12th pass / any graduate. Basic computer skills required (typing, MS Excel).

Posted 3 weeks ago

Apply

1.0 - 5.0 years

1 - 2 Lacs

Ghaziabad

Work from Office

Responsibilities: Accurately enter, update, and maintain data in company systems and databases. Verify and review data for errors, corrections, and completeness. Assistive technologies and an accessible workspace are provided.

Posted 3 weeks ago

Apply

0.0 - 2.0 years

1 - 3 Lacs

Coimbatore

Work from Office

Role & responsibilities / preferred candidate profile: MS Office (including PowerPoint), good email communication, and mathematical knowledge.

Posted 3 weeks ago

Apply

4.0 - 8.0 years

15 - 25 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Salary: 15 to 25 LPA. Experience: 4 to 7 years. Location: Gurgaon / Pune / Bengaluru. Notice: immediate to 30 days.

Job Profile: Experienced Data Engineer with a strong foundation in designing, building, and maintaining scalable data pipelines and architectures. Skilled in transforming raw data into clean, structured formats for analytics and business intelligence. Proficient in modern data tools and technologies such as SQL, T-SQL, Python, Databricks, and cloud platforms (Azure). Adept at data wrangling, modeling, ETL/ELT development, and ensuring data quality, integrity, and security. Collaborative team player with a track record of enabling data-driven decision-making across business units.

As a Data Engineer, the candidate will work on assignments for one of our utilities clients. Collaborating with cross-functional teams and stakeholders involves gathering data requirements, aligning business goals, and translating them into scalable data solutions. The role includes working closely with data analysts, scientists, and business users to understand needs, designing robust data pipelines, and ensuring data is accessible, reliable, and well documented. Regular communication, iterative feedback, and joint problem-solving are key to delivering high-impact, data-driven outcomes that support organizational objectives. This position requires a proven track record of transforming processes and driving customer value and cost savings, with experience running end-to-end analytics for large-scale organizations.

Responsibilities:
- Design, build, and maintain scalable data pipelines to support analytics, reporting, and advanced modeling needs.
- Collaborate with consultants, analysts, and clients to understand data requirements and translate them into effective data solutions.
- Ensure data accuracy, quality, and integrity through validation, cleansing, and transformation processes.
- Develop and optimize data models, ETL workflows, and database architectures across cloud and on-premises environments.
- Support data-driven decision-making by delivering reliable, well-structured datasets and enabling self-service analytics.
- Build and deploy end-to-end data pipelines on cloud platforms (Azure), using scalable Databricks clusters to handle large datasets and complex computations while optimizing performance and cost.

Must have:
- Client engagement experience and collaboration with cross-functional teams.
- Data engineering background in Databricks.
- Capable of working effectively as an individual contributor or in collaborative team environments.
- Effective communication and thought leadership with a proven record.

Candidate Profile:
- Bachelor's/Master's degree in economics, mathematics, computer science/engineering, operations research, or a related analytics area.
- 3+ years of experience in data engineering.
- Hands-on experience with SQL, Python, Databricks, and cloud platforms such as Azure.
- Prior experience managing and delivering end-to-end projects.
- Outstanding written and verbal communication skills.
- Able to work in a fast-paced, continuously evolving environment and ready to take on uphill challenges.
- Able to understand cross-cultural differences and work with clients across the globe.
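The validation, cleansing, and transformation step this posting emphasizes can be sketched in plain Python. This is a minimal illustration only, not the client's actual pipeline; the field names (`customer_id`, `reading_kwh`) are assumptions for the example.

```python
# Minimal sketch of a validate-and-cleanse pass in an ETL pipeline.
# Field names (customer_id, reading_kwh) are illustrative assumptions.

def cleanse(records):
    """Drop rows missing a key, trim whitespace, and coerce numeric fields."""
    clean, rejected = [], []
    for row in records:
        cid = (row.get("customer_id") or "").strip()
        if not cid:
            rejected.append(row)  # quarantine rows failing validation
            continue
        try:
            kwh = float(row.get("reading_kwh"))
        except (TypeError, ValueError):
            rejected.append(row)  # non-numeric or missing reading
            continue
        clean.append({"customer_id": cid, "reading_kwh": kwh})
    return clean, rejected

raw = [
    {"customer_id": " C001 ", "reading_kwh": "42.5"},
    {"customer_id": "",       "reading_kwh": "10"},    # missing key -> rejected
    {"customer_id": "C002",   "reading_kwh": "oops"},  # bad number -> rejected
]
clean, rejected = cleanse(raw)
print(len(clean), len(rejected))  # 1 2
```

In a Databricks pipeline the same pattern would typically be expressed with DataFrame operations, but the quarantine-don't-drop-silently principle is the same.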

Posted 3 weeks ago

Apply

2.0 - 4.0 years

2 - 4 Lacs

Mumbai Suburban, Mumbai (All Areas)

Work from Office

Responsibilities: Supporting a project team with all operational tasks, including but not limited to data entry, claim review, UND/FWD linking, and case status reporting; together, the team will provide consistently superior client service. The ideal candidate will have excellent verbal and written communication skills, be consultative and solution-oriented, and be capable of managing multiple priorities. The Analyst must be a team player, committed to fostering a team environment and building cooperation between team members to provide the highest quality service to clients.

Duties and responsibilities include:
- Handle claim-related data entry tasks to support mailed letters and claim forms.
- Proactively review active cases to ensure mail and web correspondence is handled timely and effectively, escalating questions or potential challenges identified on the case.
- Support project teams in ensuring timely completion of claim review.
- Prepare weekly status reports and summaries on cases according to requirements set by clients; perform analysis of reporting and ensure quality assurance.
- Participate in training and ensure understanding of materials before beginning tasks.
- Communicate verbally and in writing with project team members to ensure appropriate understanding of all projects.
- Track all hours worked on each project accurately.

Requirements:
- Attention to detail; ability to perform repetitive tasks with a high degree of accuracy.
- Comfortable working with minimal supervision.
- Knowledge of Microsoft Office; high proficiency in the MS Office Suite, particularly Excel.
- Experience in the legal field or in a service industry highly preferred.
- Excellent verbal and written communication skills with a professional, calm demeanor.
- Critical thinking skills and the ability to efficiently gather and process information in a fast-paced environment.

Posted 3 weeks ago

Apply

0.0 - 5.0 years

1 - 2 Lacs

Hyderabad

Work from Office

Proficient in typing with speed and accuracy to efficiently input and update data. Responsibilities include data verification, correction, and management.

Posted 3 weeks ago

Apply

2.0 - 4.0 years

2 - 3 Lacs

Dharuhera, Bawal, Neemrana

Work from Office

Role & responsibilities: Enter and update data accurately into company databases. Maintain records and ensure data integrity. Verify and correct errors in data before final submission. Handle confidential information with discretion. Generate basic reports from collected data.

Posted 3 weeks ago

Apply

8.0 - 13.0 years

14 - 18 Lacs

Chennai

Work from Office

Senior Data Scientist - AI/ML

About the Team: The ZF COE team effectively communicates complex technical concepts related to AI, ML, DL, and RL to both technical and non-technical audiences. This might involve presenting research findings at conferences or writing papers for academic journals.

What you can look forward to as Senior Data Scientist - AI/ML (m/f/d):
- Conduct cutting-edge research to identify and develop novel AI/ML methodologies, including Deep Learning (DL) and Reinforcement Learning (RL).
- Design and conduct experiments to test hypotheses, validate new approaches, and compare the effectiveness of different ML algorithms.
- Analyze data to uncover hidden patterns and relationships that can inform the development of new AI techniques.
- Stay at the forefront of the field by keeping abreast of the latest advancements in algorithms, tools, and theoretical frameworks. This might involve researching areas like interpretability of machine learning models or efficient training methods for deep neural networks.
- Prototype and explore the potential of advanced machine learning models, including deep learning architectures like convolutional neural networks (CNNs) or recurrent neural networks (RNNs).
- Contribute to the development of fundamental algorithms and frameworks that can be applied to various machine learning problems. This may involve improving existing algorithms or exploring entirely new approaches.
- Focus on theoretical aspects of model design, such as improving model efficiency, reducing bias, or achieving explainability in complex models.
- Document research methodologies, experimental procedures, and theoretical contributions for future reference and knowledge sharing within the research community.
- Contribute to the development of the research team's long-term research goals and strategies.

Your profile as Senior Data Scientist - AI/ML (m/f/d):
- Master's degree in Mathematics, Computer Science, or another related technical field; a PhD is a plus.
- 8+ years of experience in research and development, with a strong understanding of data structures and algorithms.
- 5+ years of experience building and deploying machine learning models in production.
- Proven experience with machine learning frameworks like TensorFlow, PyTorch, or scikit-learn.
- Experience with distributed computing systems and large-scale data processing.
- Excellent communication and collaboration skills.
- Contribution to the invention disclosure process.
- A strong publication record in top machine learning conferences or journals.

Posted 3 weeks ago

Apply

0.0 - 2.0 years

1 - 2 Lacs

Visakhapatnam

Work from Office

We are looking for a Typist - Documentation (RC) to manage documentation, data entry, and record-keeping related to regulatory/compliance activities. The role involves accurate typing, formatting, and maintaining digital as well as physical files. Requirements: Graduate with proficiency in typing (English and local language preferred). Good knowledge of MS Office and document management systems. Strong accuracy, attention to detail, and organizational skills.

Posted 3 weeks ago

Apply

4.0 - 7.0 years

7 - 11 Lacs

Noida

Work from Office

AWS-Specific Skills

AWS Storage & Compute Services:
- S3: deep understanding of bucket policies, lifecycle rules, and data organization.
- Glue: ETL jobs, crawlers, and data cataloging.
- Athena: querying data directly from S3 using SQL.
- EMR: big data processing using Spark/Hadoop.

Lakehouse Technologies:
- Apache Hudi: knowledge of transactional data lakes, schema evolution, etc.

Data Cataloging & Governance:
- AWS Glue Data Catalog; integration with Lake Formation.

Security & Compliance:
- IAM roles and policies; encryption; data masking and tokenization strategies.

Mandatory Competencies:
- Cloud - AWS: S3, S3 Glacier, EBS
- Big Data: Hadoop, Spark
- Database Programming: SQL
- ETL: AWS Glue, DataStage
- Behavioral: communication and collaboration
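The S3 lifecycle-rule knowledge called for above boils down to a small JSON document. Below is a sketch of the configuration shape that boto3's `put_bucket_lifecycle_configuration` accepts; the bucket and prefix names are hypothetical and no AWS call is made here.

```python
import json

# Shape of an S3 lifecycle configuration (sketch only; the prefix and
# day thresholds are illustrative assumptions, not a recommendation).
lifecycle = {
    "Rules": [
        {
            "ID": "archive-raw-data",
            "Filter": {"Prefix": "raw/"},
            "Status": "Enabled",
            # Move objects to cheaper storage classes as they age...
            "Transitions": [
                {"Days": 30, "StorageClass": "STANDARD_IA"},
                {"Days": 90, "StorageClass": "GLACIER"},
            ],
            # ...and expire them after a year.
            "Expiration": {"Days": 365},
        }
    ]
}

# With boto3 this would be applied roughly as:
#   s3 = boto3.client("s3")
#   s3.put_bucket_lifecycle_configuration(
#       Bucket="example-data-lake", LifecycleConfiguration=lifecycle)
print(json.dumps(lifecycle["Rules"][0]["Expiration"]))
```

Interviews for roles like this often probe exactly this kind of rule: which storage class a transition targets and when expiration fires relative to transitions.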

Posted 3 weeks ago

Apply

1.0 - 3.0 years

2 - 3 Lacs

Bengaluru

Work from Office

Google Cloud Platform: GCS, DataProc, BigQuery, Dataflow. Programming languages: Java, plus scripting languages such as Python, shell script, and SQL. 5+ years of experience in IT application delivery with proven experience in agile development methodologies. 1 to 2 years of experience in Google Cloud Platform (GCS, DataProc, BigQuery, Composer, and data processing with Dataflow).

Posted 3 weeks ago

Apply

10.0 - 18.0 years

18 - 25 Lacs

Mumbai

Work from Office

Responsibilities:
- Design, develop, and maintain robust and scalable backend systems using Django and Python.
- Develop RESTful APIs using Django REST Framework to power our frontend applications.
- Implement efficient database solutions using PostgreSQL and the Django ORM.
- Write clean, well-documented, and maintainable code.
- Collaborate with the frontend team to ensure seamless integration between frontend and backend components.
- Optimize application performance and scalability.
- Implement security best practices to protect our applications and user data.
- Stay up to date with the latest technologies and industry trends.
- Contribute to the development of new features and improvements.

Skills: Django, Django custom UI, Python, REST Framework, ORM, HTML & CSS, ChatGPT prompting, Git, SQL, Postgres, industry standards and best practices, JSON handling, data processing, and working in a team environment. WhatsApp Meta API experience is a plus.

Posted 3 weeks ago

Apply

1.0 - 4.0 years

1 - 3 Lacs

Pune

Work from Office

Review data for errors and correct any incompatibilities. Data entry with accuracy. Strong typing speed, attention to detail. Basic Proficiency in MS Office (Excel, Word) and Google Sheets. Maintain confidentiality and security of all data handled.

Posted 3 weeks ago

Apply

2.0 - 4.0 years

0 Lacs

Bengaluru, Karnataka, India

Remote

Who we are: Founded in 1982, Workplace Options (WPO) is the largest independent provider of holistic wellbeing solutions. Through our customized programs and comprehensive global network of credentialed providers and professionals, we support individuals to become healthier, happier, and more productive, both personally and professionally. Trusted by 51% of Fortune 500 companies, we deliver high-quality care digitally and in person to over 75 million individuals across 116,000 organizations in more than 200 countries and territories. At WPO, you will be joining a team that is committed to improving employee wellbeing around the world.

Current Opportunity: Data Processing Specialist
Location: India
Remote/Hybrid/Onsite: Remote or hybrid depending on proximity to our Bangalore office.

What you will do: The Data Processing Specialist is responsible for sourcing, validating, and maintaining high-quality provider databases for WPO's Locator tools across multiple regions. This includes managing complex processing operations, overseeing vendor performance, ensuring accuracy of the data, and supporting quarterly updates for a multi-type provider directory (child care, elder care, schools, and specialty services). The role requires strong data handling skills, meticulous attention to detail, and the ability to manage technically demanding processes. This position is embedded within a cross-functional team and works closely with the Product, Content, Reporting, and Technology teams. It plays a key role in sustaining the accuracy, usability, and legal compliance of WPO's provider databases.

Responsibilities:

Data Sourcing and Vendor Oversight:
- Lead the sourcing of provider data via government websites, public records, and direct outreach to licensing bodies.
- Write structured mining and deduplication instructions for the vendor (Sasta Outsourcing Services).
- Ensure compliance with regional licensing and regulatory requirements.

Data Management and Quality Assurance:
- Perform address formatting, subtype mapping, geocoding, duplicate handling, etc. on vendor data sets.
- Apply 4-layer deduplication protocols to merge vendor and in-house datasets.
- Use lookup tools and geocoding APIs (e.g., EXE tool, Geoapify) to fill missing fields (e.g., zip code, county, local authority).
- Maintain data formatting and subtype consistency per the Locator taxonomy.

Publishing and Reporting Support:
- Prepare quarterly database updates and coordinate handoffs with the Reporting team for publishing on WPO platforms.
- Validate that publishing metrics (record counts by region/subtype) match source files.
- Flag anomalies or failures in geocoding, QA, or publication output.

Documentation and Process Optimization:
- Maintain documentation for geocoding workflows, vendor instructions, and subtype standards.
- Identify opportunities for automation and process improvement.
- Support audits, updates, and transitions related to Locator tools and content infrastructure.

Qualifications/Skills:
- Bachelor's degree in a data, technology, or information science field.
- 2+ years in data processing and vendor management.
- Experience in database content verification, public data sourcing, or regulated information processing.
- Strong proficiency in Excel and familiarity with lookup formulas, VLOOKUP, and batch processing.
- Knowledge of geocoding tools, APIs, or GIS software is an asset.
- Excellent organizational, written, and verbal communication skills.
- Self-driven, detail-oriented, and comfortable working in a cross-regional environment.

What we offer: At Workplace Options, we don't just deliver wellbeing services to our clients; we champion wellbeing for our own employees as well. Examples of our benefits and commitment to employee wellbeing include: Group Mediclaim Insurance for 6 lacs INR, accident insurance, gym reimbursement, tuition reimbursement, EAP support services, a mentorship program, WPO Cares, an employee exchange program, and comprehensive training provided for this position.

At Workplace Options, we are committed to and accountable for building a workplace where individuals feel empowered to bring their whole selves to work, free from judgment or fear of discrimination. We understand that having a diverse organization is only the beginning, and it will require nurturing and care to thrive. We will continue to take action to ensure we achieve equitable and measurable outcomes. We strive to cultivate a space where diverse voices are not only heard but actively sought out and valued for the unique insights they bring. By embracing and promoting authenticity, we aim to build a vibrant and inclusive community that fosters collaboration, innovation, belonging, and personal growth.

For further details about WPO, please check out our website www.workplaceoptions.com; the short videos Human-Powered Care and The WPO Global Experience give a great overview of what we do. Workplace Options collects and processes personal data in accordance with applicable data protection laws. If you are a European job applicant, refer to our Privacy Notice for further details (https://www.workplaceoptions.com/privacy-notice-for-recruitment/).
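The deduplication work this posting describes, merging vendor and in-house provider records, typically keys on a normalized name plus a location field. Here is a minimal sketch; the normalization rules and field names are assumptions for illustration, not WPO's actual 4-layer protocol.

```python
import re

def normalize(record):
    """Build a dedup key from name + zip, ignoring case, punctuation,
    and common corporate suffixes. Rules are illustrative only."""
    name = re.sub(r"[^a-z0-9 ]", "", record["name"].lower())
    name = re.sub(r"\b(inc|llc|ltd)\b", "", name).strip()
    name = re.sub(r"\s+", " ", name)  # collapse internal whitespace
    return (name, record["zip"])

def merge(vendor_rows, inhouse_rows):
    """Prefer in-house records; add vendor rows only for unseen keys."""
    seen = {normalize(r): r for r in inhouse_rows}
    for r in vendor_rows:
        seen.setdefault(normalize(r), r)
    return list(seen.values())

inhouse = [{"name": "Sunrise Child Care, Inc.", "zip": "560001"}]
vendor = [
    {"name": "sunrise child care", "zip": "560001"},   # duplicate of above
    {"name": "Lakeview Elder Care", "zip": "560002"},  # new provider
]
merged = merge(vendor, inhouse)
print(len(merged))  # 2
```

A real multi-layer protocol would add further passes (e.g., address normalization and geocode-distance matching) on top of this exact-key layer.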

Posted 3 weeks ago

Apply

1.0 - 4.0 years

3 - 6 Lacs

Bengaluru

Work from Office

Job Description: Technical Animator with Unreal Engine Expertise

Overview: We are seeking a skilled animator with strong proficiency in Unreal Engine, as well as excellent rigging and skinning capabilities. The ideal candidate will have a passion for creating immersive, visually stunning experiences and will play a key role in bringing our projects to life.

Responsibilities:
- Animation creation: design and develop high-quality animations for characters, objects, and environments using industry-standard software and tools, with a focus on Unreal Engine.
- Rigging and skinning: utilize advanced rigging and skinning techniques to create flexible and realistic character and object models, ensuring smooth deformations and natural movements.
- Collaboration: work closely with the art, design, and programming teams to integrate animations seamlessly into the game engine, ensuring optimal performance and visual fidelity.
- Problem solving: identify and resolve technical challenges related to animation, rigging, and skinning, employing creative solutions to achieve desired outcomes.
- Optimization: optimize animation assets and processes to maintain efficient workflows and meet performance targets, particularly within the constraints of real-time rendering in Unreal Engine.
- Documentation: document animation pipelines, techniques, and best practices to facilitate knowledge sharing and ensure consistency across projects.
- Continuous learning: stay updated on industry trends, tools, and techniques related to animation, rigging, and Unreal Engine, actively seeking out opportunities for skill development and improvement.

Requirements:
- Professional experience: proven experience as an animator, with a strong portfolio demonstrating proficiency in animation, rigging, and skinning, particularly within Unreal Engine environments.
- Technical skills: expertise in industry-standard animation software such as Maya, Blender, or 3ds Max, along with advanced knowledge of Unreal Engine animation tools and workflows.
- Rigging and skinning expertise: demonstrated proficiency in character and object rigging, skinning, and deformation techniques, with a keen eye for detail and a focus on realism and performance.
- Creativity: strong creative and artistic abilities, with a passion for storytelling and the ability to breathe life into characters and environments through animation.
- Communication skills: excellent communication and collaboration skills, with the ability to effectively convey ideas and feedback to team members of varying disciplines.
- Problem-solving skills: strong analytical and problem-solving abilities, with a proactive approach to identifying and addressing technical challenges in animation production.
- Adaptability: ability to adapt to evolving project requirements and timelines, working efficiently under pressure while maintaining a high standard of quality and attention to detail.

Preferred qualifications:
- Experience with motion capture technology and data processing.
- Knowledge of scripting and programming languages such as Python or C++.
- Familiarity with game development pipelines and methodologies.
- Previous experience working on AAA game titles or equivalent projects.

Posted 3 weeks ago

Apply

1.0 - 5.0 years

3 - 7 Lacs

Pune

Work from Office

Job Description: Developer - Python | aiohttp | REST | Pandas

At Assent, we're solving complex technical challenges with global impact. Our multi-tenant SaaS platform helps the world's most influential companies gain deep visibility into their supply chains, enabling them to manage risk, ensure compliance, and take real action on issues like forced labor, environmental sustainability, and ethical sourcing.

We're looking for a Developer with deep expertise in Python and back-end architecture to own the design system strategy and lead the evolution of our product. You should have a basic understanding of Angular and UI technologies. You'll work closely with engineering leaders, product managers, and architects to define technical direction, enforce architectural standards, and ensure delivery of scalable, accessible, and well-tested web app solutions across multiple teams.

You'll operate in a high-trust CI/CD environment: every commit goes to production, and there's no separate QA team. That means you'll be responsible for setting expectations and ensuring quality practices are embedded in the development process. You'll drive the creation of automated test coverage strategies, web app development, and documentation. We work with the latest technology and plan to keep doing so. Our team uses tools like GitHub Copilot to help with AI-powered coding and Lucid for architectural design, and we're always open to trying new tools that can help us create better experiences for our customers.

Our product tech stack includes: C#, Python, Pandas, NumPy/SciPy, aiohttp, Angular, and CI/CD pipelines that support rapid, high-quality delivery to production. We value systems thinking, performance, and platform maintainability. You'll help ensure that UI decisions support both short-term product goals and long-term architectural integrity.

What You'll Do: As a Python Developer for this foundational pilot, you will play a pivotal role in designing and implementing the core serverless backend architecture that powers the user interaction and data processing pipeline. Your focus will be on building a scalable, observable, and maintainable API and orchestration layer leveraging AWS services. Key responsibilities include:
- Design and develop RESTful APIs using AWS API Gateway integrated directly with Lambda functions, removing the need for a traditional monolithic BFF.
- Implement endpoint logic in Python-based Lambda functions, focusing on modular, single-responsibility functions such as file upload, status checks, and analytics retrieval.
- Ensure the API layer abstracts away all backend complexity from the Angular frontend, providing clean and reliable interfaces to support user interactions.
- Support core user flows, such as uploading CMRT files and retrieving analytic results, by implementing and exposing APIs that connect to appropriate backend services.
- Partner closely with frontend engineers, data platform teams, and Gold-tier data engineers to define requirements, ensure backend compatibility, and surface data insights effectively.
- Work with data engineers to integrate the API layer with the medallion architecture (Bronze, Silver, Gold), routing requests dynamically based on the data and analytics context.
- Implement and maintain AWS Step Functions workflows to orchestrate file processing, from ingestion through to entity resolution and persistence.
- Use EventBridge to trigger workflows on CMRT file uploads to S3, ensuring the system is extensible for future ingestion paths (e.g., SFTP, email, or third-party systems).
- Set coding standards, enforce clean architecture principles, and mentor other developers working on the Lambda and orchestration components.

By leading the development of this cloud-native, event-driven architecture, you'll enable rapid iteration, scalability, and future extensibility of the RM Foundational Pilot platform.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent practical experience).
- 1 to 5 years of experience in front-end development, with at least 4 years focused on Angular.
- Strong Python expertise, especially in building modular, serverless applications using AWS Lambda.
- Experience with AWS services, including API Gateway, Step Functions, EventBridge, and S3.
- Strong understanding of Apache Kafka/Amazon Kinesis.
- Basic understanding of Pandas and PySpark.
- Familiarity with ML integration.
- Familiarity with event-driven architecture and orchestrating workflows using AWS-native tools.
- Understanding of data processing pipelines and integrating with layered data architectures (e.g., Bronze, Silver, Gold).
- Experience collaborating across frontend, backend, and data teams to deliver end-to-end functionality.
- Strong emphasis on clean testing and performance optimization.
- Leadership in setting technical direction, mentoring developers, and maintaining code quality at scale.
- Excellent communication, leadership, and problem-solving skills.

Preferred Qualifications:
- Strong understanding of event-driven and asynchronous workflows in cloud environments.
- Familiarity with data platform integration, including medallion architectures (Bronze, Silver, Gold) and analytics consumption patterns.
- Experience with CI/CD pipelines, infrastructure as code (e.g., CloudFormation, CDK, or Terraform), and version control best practices.
- Experience with Angular frontend integration and knowledge of CMRT or similar compliance data formats.
- Familiarity with CI/CD pipelines for front-end deployments.
- Experience with cloud platforms (e.g., AWS, Azure, GCP).
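A single-responsibility Lambda endpoint of the kind this posting describes, such as a status check behind API Gateway (proxy integration), might look like the sketch below. The event/response shapes follow the Lambda proxy-integration convention; the in-memory `JOBS` table and field names are stand-in assumptions for whatever backend store the real pilot uses.

```python
import json

# In-memory stand-in for the real backend store (illustrative only).
JOBS = {"abc123": {"status": "PROCESSING", "stage": "entity-resolution"}}

def lambda_handler(event, context):
    """Return the processing status for one uploaded file.

    `event` follows API Gateway's Lambda proxy-integration shape:
    path parameters arrive under event["pathParameters"].
    """
    job_id = (event.get("pathParameters") or {}).get("job_id")
    job = JOBS.get(job_id)
    if job is None:
        return {"statusCode": 404,
                "body": json.dumps({"error": "unknown job"})}
    return {"statusCode": 200, "body": json.dumps(job)}

# Local smoke test with a simulated API Gateway event:
resp = lambda_handler({"pathParameters": {"job_id": "abc123"}}, None)
print(resp["statusCode"])  # 200
```

Keeping each handler to one concern like this is what makes the API Gateway-plus-Lambda layout a workable replacement for a monolithic BFF.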

Posted 3 weeks ago

Apply

2.0 - 7.0 years

4 - 9 Lacs

Bengaluru

Work from Office

Huron is redefining what a global consulting organization can be. Advancing new ideas every day to build even stronger clients, individuals, and communities. We're helping our clients find new ways to drive growth, enhance business performance, and sustain leadership in the markets they serve. And we're developing strategies and implementing solutions that enable the transformative change they need to own their future.

As a member of the Huron corporate team, you'll help to evolve our business model to stay ahead of market forces, industry trends, and client needs. Our accounting, finance, human resources, IT, legal, marketing, and facilities management professionals work collaboratively to support Huron's collective strategies and enable real transformation to produce sustainable business results. Join our team and create your future.

In the Data Engineer role, you will work with the data engineers, MS Power BI developers, data architect, team leads, and practice leadership to contribute to the operational effectiveness of the Software Engineering function, team, and strategy. This role is based in Bengaluru, India, and will work with Software Development and Corporate IT, and business/practice resources based in the US and India. Under limited direction, this individual is responsible for designing and developing solutions that improve how Huron does business. This individual participates in the full software development life cycle, including requirements development, analysis, design, implementation, and support. In this role, you will also create technical specifications based on conceptual design and business requirements, and will consult with project and business teams to prototype, refine, test, and debug solutions. You will use current programming languages and technologies to build solutions and integrations between systems, and develop documentation and procedures for installation, support, and maintenance. All solutions will be built on Huron's enterprise platforms and technologies.

Requirements and Preferences:
- Bachelor's degree in Engineering (or a related field) with 2+ years of relevant industry experience.
- Hands-on experience in building data lake, data warehouse, and big data/analytics solutions.
- Proficiency in ETL development, data modeling (dimensional and normalized), and batch data processing (Hadoop/Spark).
- Strong programming skills in Python, SQL, and Bash, with the ability to create robust and reusable code.
- Experience working with the AWS (S3, Lambda, Glue, RDS, Redshift, Athena) and Azure (Data Factory, containers) ecosystems.
- Demonstrated experience with Power BI, including building data models and dashboards, especially using data from Databricks.

Preferred certifications:
- Microsoft Certified: Power BI Data Analyst Associate (PL-300)
- AWS Certified Developer - Associate (DVA-C01)
- AWS Certified Data Analytics - Specialty (DAS-C01)
- Azure Data Engineer Associate (DP-203)

Position Level: Analyst
Country: India

Posted 3 weeks ago

Apply

8.0 - 13.0 years

10 - 15 Lacs

Chennai

Work from Office

About the Team: The ZF COE team effectively communicates complex technical concepts related to AI, ML, DL, and RL to both technical and non-technical audiences. This might involve presenting research findings at conferences or writing papers for academic journals.

What you can look forward to as AI Research Scientist (m/f/d):
- Conduct cutting-edge research to identify and develop novel AI/ML methodologies, including Deep Learning (DL) and Reinforcement Learning (RL).
- Design and conduct experiments to test hypotheses, validate new approaches, and compare the effectiveness of different ML algorithms.
- Analyze data to uncover hidden patterns and relationships that can inform the development of new AI techniques.
- Stay at the forefront of the field by keeping abreast of the latest advancements in algorithms, tools, and theoretical frameworks. This might involve researching areas like interpretability of machine learning models or efficient training methods for deep neural networks.
- Prototype and explore the potential of advanced machine learning models, including deep learning architectures like convolutional neural networks (CNNs) or recurrent neural networks (RNNs).
- Contribute to the development of fundamental algorithms and frameworks that can be applied to various machine learning problems. This may involve improving existing algorithms or exploring entirely new approaches.
- Focus on theoretical aspects of model design, such as improving model efficiency, reducing bias, or achieving explainability in complex models.
- Document research methodologies, experimental procedures, and theoretical contributions for future reference and knowledge sharing within the research community.
- Contribute to the development of the research team's long-term research goals and strategies.

Your profile as AI Research Scientist (m/f/d):
- Master's degree in Mathematics, Computer Science, or another related technical field; a PhD is a plus.
- 8+ years of experience in research and development, with a strong understanding of data structures and algorithms.
- 5+ years of experience building and deploying machine learning models in production.
- Proven experience with machine learning frameworks like TensorFlow, PyTorch, or scikit-learn.
- Experience with distributed computing systems and large-scale data processing.
- Excellent communication and collaboration skills.
- Contribution to the invention disclosure process.
- A strong publication record in top machine learning conferences or journals.

Posted 3 weeks ago

Apply

0.0 - 1.0 years

1 - 3 Lacs

mumbai, thane, navi mumbai

Work from Office

1) Monitoring CCTV cameras for suspicious activities
2) Reporting unusual activities such as unauthorized access to restricted lobbies, vandalism, and theft attempts
3) Checking alerts and alarms generated through the computer system
4) Maintaining records in Excel

Posted 3 weeks ago

Apply

3.0 - 5.0 years

2 - 3 Lacs

vapi

Work from Office

Candidate must have 3-4 years of experience in computer data entry. A B.Com qualification is compulsory.

Posted 3 weeks ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must Have Skills: IBM WebSphere DataPower
Good To Have Skills: NA
Minimum Experience: 3 year(s)
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will collaborate with teams to ensure seamless integration and functionality.

Roles & Responsibilities:
- Expertise in XSLT or GatewayScript: Proficient in using XSLT for transforming and processing data.
- REST and SOAP Web Services: Extensive experience in developing and managing REST-based and SOAP-based web services using IBM DataPower.
- Code Migration and Implementation: Skilled in migrating and implementing code on DataPower appliances.
- Solution Development: Proven ability to develop solutions using Web-Service Proxies, Multi-Protocol Gateways (MPG), and XML Firewalls.
- XML and Related Technologies: Strong knowledge of XML, WSDL, XSLT, JSON, XML Schema, and XPath.

Professional & Technical Skills:
- Must Have Skills: Expertise in XSLT or GatewayScript, including proficiency in using GatewayScript for transforming and processing data.
- Good To Have Skills: Strong understanding of REST and SOAP web services, with extensive experience developing and managing them using IBM DataPower; familiarity with code migration and implementation on DataPower appliances; strong knowledge of JSON and schema; proven ability to develop solutions using Web-Service Proxies and Multi-Protocol Gateways (MPG).

Additional Information:
- The candidate should have a minimum of 3 years of experience in IBM WebSphere DataPower.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Posted 3 weeks ago

Apply

3.0 - 8.0 years

10 - 14 Lacs

bengaluru

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must Have Skills: PySpark
Good To Have Skills: NA
Minimum Experience: 3 year(s)
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for overseeing the application development process and ensuring successful project delivery.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Lead the design and development of applications.
- Act as the primary point of contact for application-related queries.
- Collaborate with team members to ensure project success.
- Provide technical guidance and mentorship to junior team members.
- Stay updated on industry trends and best practices.

Professional & Technical Skills:
- Must Have Skills: Proficiency in PySpark.
- Strong understanding of big data processing and analytics.
- Experience with data processing frameworks such as Apache Spark.
- Hands-on experience in building scalable data pipelines.
- Knowledge of cloud platforms for data processing.
- Experience in performance tuning and optimization.

Additional Information:
- The candidate should have a minimum of 3 years of experience in PySpark.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Posted 3 weeks ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must Have Skills: Databricks Unified Data Analytics Platform
Good To Have Skills: Python (Programming Language), Apache Airflow
Minimum Experience: 12 year(s)
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with teams to develop solutions that align with business needs and requirements, ensuring efficient application performance and functionality.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Lead the application development process.
- Implement best practices for application design and development.
- Conduct code reviews and ensure code quality standards are met.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Good To Have Skills: Experience with Python (Programming Language) and Apache Airflow.
- Strong understanding of data analytics and data processing.
- Experience in building and optimizing data pipelines.
- Knowledge of cloud platforms and services for data processing.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Posted 3 weeks ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must Have Skills: Apache Hadoop
Good To Have Skills: NA
Minimum Experience: 3 year(s)
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and enhance application functionality.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and implement software solutions to meet business requirements.
- Collaborate with cross-functional teams to enhance application functionality.
- Conduct code reviews and provide technical guidance to team members.
- Troubleshoot and debug applications to ensure optimal performance.
- Stay updated on industry trends and technologies to drive continuous improvement.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Apache Hadoop.
- Strong understanding of distributed computing principles.
- Experience with data processing frameworks such as MapReduce and Spark.
- Hands-on experience in designing and implementing scalable data pipelines.
- Knowledge of Hadoop ecosystem components such as HDFS, YARN, and Hive.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Apache Hadoop.
- This position is based at our Hyderabad office.
- A 15 years full-time education is required.

Posted 3 weeks ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

chennai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must Have Skills: Apache Hadoop
Good To Have Skills: NA
Minimum Experience: 3 year(s)
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the applications function smoothly and efficiently. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking ways to improve processes and deliver high-quality solutions.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in code reviews to ensure best practices and quality standards are met.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Apache Hadoop.
- Strong understanding of distributed computing principles and frameworks.
- Experience with data processing and analysis using Hadoop ecosystem tools.
- Familiarity with programming languages such as Java or Python.
- Knowledge of data storage solutions and data management best practices.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Apache Hadoop.
- This position is based at our Chennai office.
- A 15 years full-time education is required.

Posted 3 weeks ago

Apply