
4894 Data Processing Jobs - Page 36

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

5.0 - 10.0 years

12 - 15 Lacs

Pune

Work from Office

Design, develop, and implement data solutions using Azure Data Stack components. Write and optimize advanced SQL queries for data extraction, transformation, and analysis. Develop data processing workflows and ETL processes using Python and PySpark.
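As a rough illustration of the extract-transform-load workflow this role describes, here is a minimal sketch in plain Python. In the role itself this would be built with PySpark and Azure services; the record layout and function names below are invented for illustration.

```python
# Minimal ETL sketch: plain Python stands in for PySpark/Azure Data Factory.

def extract(rows):
    """Pretend source extraction: yield raw records as-is."""
    yield from rows

def transform(records):
    """Drop incomplete records and normalise city names."""
    for r in records:
        if r.get("city") and r.get("salary_lacs") is not None:
            yield {**r, "city": r["city"].title()}

def load(records):
    """Pretend load step: collect into a list (stand-in for a target table)."""
    return list(records)

raw = [
    {"city": "pune", "salary_lacs": 14},
    {"city": "mumbai", "salary_lacs": None},   # dropped: missing salary
    {"city": "bengaluru", "salary_lacs": 22},
]
loaded = load(transform(extract(raw)))
print(loaded)  # two cleaned records with title-cased city names
```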

Posted 2 weeks ago

Apply

3.0 - 6.0 years

12 - 14 Lacs

Mumbai

Work from Office

Design, develop, and implement data solutions using Azure Data Stack components. Write and optimize advanced SQL queries for data extraction, transformation, and analysis. Develop data processing workflows and ETL processes using Python and PySpark.

Posted 2 weeks ago

Apply

3.0 - 5.0 years

13 - 16 Lacs

Mumbai

Work from Office

Design, develop, and implement data solutions using Azure Data Stack components. Write and optimize advanced SQL queries for data extraction, transformation, and analysis. Develop data processing workflows and ETL processes using Python and PySpark.

Posted 2 weeks ago

Apply

5.0 - 8.0 years

10 - 15 Lacs

Pune

Work from Office

Design, develop, and implement data solutions using Azure Data Stack components. Write and optimize advanced SQL queries for data extraction, transformation, and analysis. Develop data processing workflows and ETL processes using Python and PySpark.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Bengaluru

Work from Office

The Platform Data Engineer will be responsible for designing and implementing robust data platform architectures, integrating diverse data technologies, and ensuring scalability, reliability, performance, and security across the platform. The role involves setting up and managing infrastructure for data pipelines, storage, and processing; developing internal tools to enhance platform usability; implementing monitoring and observability; collaborating with software engineering teams for seamless integration; and driving capacity planning and cost optimization initiatives.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

10 - 15 Lacs

Mumbai, Pune, Chennai

Work from Office

Senior-Level Data Engineer. Bachelor's with 6+ years' experience, or Master's with 5+ years' experience, in Computer Science, Engineering, Math, or another quantitative field.
- 5+ years' experience developing batch ETL/ELT processes using SQL Server and SSIS, ensuring all related data pipelines meet best-in-class standards and offer high performance.
- 5+ years' experience writing and optimizing SQL queries and stored procedures for data processing and data analysis.
- 5+ years' experience designing and building complete data pipelines, moving and transforming data for ODS, staging, data warehousing, and data marts using SQL Server Integration Services (SSIS) or other related technologies.
- 5+ years' experience implementing data warehouse solutions (star schema, snowflake schema) for reporting and analytical applications using SQL Server and SSIS, or other related technologies.
- 5+ years' experience with large-scale data processing and query optimization techniques using T-SQL.
- 5+ years' experience implementing audit, balance, and control mechanisms in data solutions.
- 3+ years' experience with source control repositories such as Git, TFVC, or Azure DevOps, including branching and merging, and implementing CI/CD pipelines for database and ETL workloads.
- 2+ years' experience working with Python pandas to process semi-structured data sets and load them into a SQL Server database.
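The last requirement above, processing semi-structured data and loading it into a relational table, can be sketched as follows. This is a minimal stdlib-only illustration: SQLite stands in for SQL Server, and the JSON payload and table schema are invented.

```python
import json
import sqlite3

# Semi-structured input: nested JSON, as an API payload might return.
payload = json.loads("""
[
  {"id": 1, "name": "alpha", "meta": {"region": "west", "score": 0.9}},
  {"id": 2, "name": "beta",  "meta": {"region": "east"}}
]
""")

# Flatten each record; a missing nested field becomes NULL in the table.
rows = [
    (r["id"], r["name"], r["meta"].get("region"), r["meta"].get("score"))
    for r in payload
]

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT, region TEXT, score REAL)"
)
conn.executemany("INSERT INTO items VALUES (?, ?, ?, ?)", rows)

count = conn.execute("SELECT COUNT(*) FROM items").fetchone()[0]
print(count)  # 2
```

In practice pandas (`json_normalize` plus a bulk load) would do the flattening at scale; the shape of the work is the same.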

Posted 2 weeks ago

Apply

3.0 - 6.0 years

7 - 11 Lacs

Mumbai, Pune, Chennai

Work from Office

Roles and Responsibilities: Design, develop, and implement big data solutions using various technologies. Collaborate with cross-functional teams to identify business requirements and develop technical solutions. Develop and maintain large-scale data processing systems and pipelines. Ensure data quality and integrity by implementing data validation and testing procedures. Optimize system performance and scalability through tuning and optimization techniques. Participate in code reviews and contribute to improving overall code quality. Job Requirements: Strong understanding of big data concepts and technologies such as Hadoop, Spark, and NoSQL databases. Experience with programming languages like Java, Python, or Scala. Knowledge of cloud-based big data platforms such as AWS or GCP. Excellent problem-solving skills and attention to detail. Ability to work collaboratively in a team environment and communicate effectively with stakeholders. Strong analytical and critical thinking skills.

Posted 2 weeks ago

Apply

8.0 - 12.0 years

6 - 10 Lacs

Pune

Work from Office

We are seeking a dynamic and experienced Tech Lead with a strong foundation in Java and Apache Spark to join our team. In this role, you will lead the development and deployment of scalable cloud-based data solutions, leveraging your expertise in AWS and big data technologies. Key Responsibilities: Lead the design, development, and deployment of scalable and reliable data processing solutions on AWS using Java and Spark. Architect and implement big data processing pipelines using Apache Spark on AWS EMR. Develop and deploy serverless applications using AWS Lambda, integrating with other AWS services. Utilize Amazon EKS for container orchestration and microservices management. Design and implement workflow orchestration using Apache Airflow for complex data pipelines. Collaborate with cross-functional teams to define project requirements and ensure seamless integration of services. Mentor and guide team members in Java development best practices, cloud architecture, and data engineering. Monitor and optimize performance and cost of deployed solutions across AWS infrastructure. Stay current with emerging technologies and industry trends to drive innovation and maintain a competitive edge. Required Skills: Strong hands-on experience in Java development. Proficiency in Apache Spark for distributed data processing. Experience with AWS services including EMR, Lambda, EKS, and Airflow. Solid understanding of serverless architecture and microservices. Proven leadership and mentoring capabilities. Excellent problem-solving and communication skills.

Posted 2 weeks ago

Apply

0.0 - 1.0 years

3 - 6 Lacs

Hyderabad

Work from Office

Job Summary: As a Clinical Reporting Analyst, you will be integral to our mission of providing accurate and timely analysis of ECG data, contributing to the improvement of patient care and outcomes. The team looks forward to your contributions and the impact you will make in enhancing our data processing capabilities. Join us in embracing the startup vibe of agility, open communication, and teamwork. Here, you'll thrive in an environment where learning, challenging the status quo, and unleashing your creativity are encouraged. Your voice matters, and together, we move swiftly, learn from missteps, and make meaningful impacts. Let's forge ahead, innovate, and make a difference. Come be a part of our dynamic team! Job Responsibilities: Every candidate goes through a 6-week training program covering ECG analysis, data processing techniques, and software training. Once training completes, your primary duties will include: Sanitising and processing up Beat data as per the standard process. Preparing up Beat data with appropriate highlights for further processing. Effectively communicating ECG abnormalities by notifying lead technicians and/or physicians and clinical staff as necessary. Maintaining compliance with job-specific proficiency requirements. Your specific responsibilities may change from time to time at the discretion of the Company. You will also be expected to comply with all rules, policies, and procedures of the Company, as they may be adopted and modified from time to time. Candidate Requirements: 12th grade plus a Diploma in Cardiology, or a Bachelor's degree in Zoology or life sciences. Experience as a Holter scanner or telemetry/monitor technician is an added advantage. Proficiency in handling computers. Excellent attention to detail. Positive attitude and team player, with the ability to use critical thinking skills. Knowledge of medical terminology specific to Cardiology and Electrophysiology. Excellent written and verbal communication skills. Strong analytical, communication, and interpersonal skills.

Posted 2 weeks ago

Apply

9.0 - 12.0 years

12 - 20 Lacs

Hyderabad, Gachibowli

Work from Office

- Data Processing: BigQuery, Apache Spark, Hadoop, Dataflow
- BI Tools: Tableau, Power BI, Looker
- Languages: Python, SQL, Java, Scala
- ETL Tools: Apache NiFi, Talend, Informatica, Dataform
- Cloud: GCP (BigQuery, Dataflow, Pub/Sub, Cloud Storage)
- Data Modeling: Kimball, Star Schema, Snowflake Schema
- Version Control: Git, GitLab

Posted 2 weeks ago

Apply

9.0 - 12.0 years

12 - 20 Lacs

Hyderabad

Work from Office

- Data Processing: BigQuery, Apache Spark, Hadoop, Dataflow
- BI Tools: Tableau, Power BI, Looker
- Languages: Python, SQL, Java, Scala
- ETL Tools: Apache NiFi, Talend, Informatica, Dataform
- Cloud: GCP (BigQuery, Dataflow, Pub/Sub, Cloud Storage)
- Data Modeling: Kimball, Star Schema, Snowflake Schema
- Version Control: Git, GitLab

Posted 2 weeks ago

Apply

9.0 - 12.0 years

12 - 20 Lacs

Hyderabad, Hitech City

Work from Office

- Data Processing: BigQuery, Apache Spark, Hadoop, Dataflow
- BI Tools: Tableau, Power BI, Looker
- Languages: Python, SQL, Java, Scala
- ETL Tools: Apache NiFi, Talend, Informatica, Dataform
- Cloud: GCP (BigQuery, Dataflow, Pub/Sub, Cloud Storage)
- Data Modeling: Kimball, Star Schema, Snowflake Schema
- Version Control: Git, GitLab

Posted 2 weeks ago

Apply

9.0 - 12.0 years

12 - 20 Lacs

Navi Mumbai

Work from Office

- Data Processing: BigQuery, Apache Spark, Hadoop, Dataflow
- BI Tools: Tableau, Power BI, Looker
- Languages: Python, SQL, Java, Scala
- ETL Tools: Apache NiFi, Talend, Informatica, Dataform
- Cloud: GCP (BigQuery, Dataflow, Pub/Sub, Cloud Storage)
- Data Modeling: Kimball, Star Schema, Snowflake Schema
- Version Control: Git, GitLab

Posted 2 weeks ago

Apply

9.0 - 12.0 years

12 - 20 Lacs

Mumbai Suburban

Work from Office

- Data Processing: BigQuery, Apache Spark, Hadoop, Dataflow
- BI Tools: Tableau, Power BI, Looker
- Languages: Python, SQL, Java, Scala
- ETL Tools: Apache NiFi, Talend, Informatica, Dataform
- Cloud: GCP (BigQuery, Dataflow, Pub/Sub, Cloud Storage)
- Data Modeling: Kimball, Star Schema, Snowflake Schema
- Version Control: Git, GitLab

Posted 2 weeks ago

Apply

9.0 - 12.0 years

12 - 20 Lacs

Mumbai

Work from Office

- Data Processing: BigQuery, Apache Spark, Hadoop, Dataflow
- BI Tools: Tableau, Power BI, Looker
- Languages: Python, SQL, Java, Scala
- ETL Tools: Apache NiFi, Talend, Informatica, Dataform
- Cloud: GCP (BigQuery, Dataflow, Pub/Sub, Cloud Storage)
- Data Modeling: Kimball, Star Schema, Snowflake Schema
- Version Control: Git, GitLab

Posted 2 weeks ago

Apply

12.0 - 15.0 years

15 - 20 Lacs

Pune

Work from Office

We are looking for a highly motivated Engineering Lead, Cloud and Data Engineer, to lead the development of scalable, cloud-native data solutions. This role demands both strong technical expertise and leadership capabilities to ensure seamless integration of cloud technologies with modern data engineering practices. Key Responsibilities: Lead the architecture, design, and implementation of cloud-native data engineering solutions on platforms like AWS. Orchestrate integration of applications and data sources across multi-cloud or hybrid cloud environments. Design and implement automated data pipelines using tools like Apache Spark and cloud-native services (AWS Glue, EMR, etc.). Collaborate with cross-functional teams to ensure secure and efficient connectivity and data flow between systems. Provide technical leadership and mentorship to junior engineers in both cloud and data engineering domains. Ensure solutions are built with scalability, maintainability, and performance in mind. Lead efforts in data governance, security, and compliance, aligned with enterprise standards. Conduct code reviews, performance tuning, and troubleshooting across the cloud data stack. Stay current with emerging cloud technologies, trends, and best practices to drive innovation in architecture and engineering processes. Qualifications: 8+ years of experience in cloud engineering, data engineering, or related areas. Deep expertise in AWS cloud services (e.g., EC2, Lambda, S3, EMR, Glue). Hands-on experience with Apache Spark for big data processing is highly preferred. Proven experience in application and data integration across cloud environments. Strong leadership, analytical thinking, and excellent communication skills.

Posted 2 weeks ago

Apply

1.0 - 4.0 years

1 - 2 Lacs

Ahmedabad, Surat, Vadodara

Work from Office

Urgent hiring for a Data Entry Operator to update and maintain information in our company databases and computer systems. Data Entry Operator responsibilities include collecting and entering data in databases and maintaining accurate company records. Required candidate profile: data entry work experience as a Data Entry Operator, basic typing speed, and basic computer knowledge. Both freshers and experienced candidates can apply. Interested candidates can send their resume to sankalpmanpowerservicesjobs@gmail.com

Posted 2 weeks ago

Apply

1.0 - 4.0 years

1 - 2 Lacs

Kolkata, Mumbai, Rudrapur

Work from Office

Urgent hiring for a Data Entry Operator to update and maintain information in our company databases and computer systems. Data Entry Operator responsibilities include collecting and entering data in databases and maintaining accurate company records. Required candidate profile: data entry work experience as a Data Entry Operator, basic typing speed, and basic computer knowledge. Both freshers and experienced candidates can apply. Interested candidates can send their resume to sankalpmanpowerservicesjobs@gmail.com

Posted 2 weeks ago

Apply

1.0 - 4.0 years

1 - 2 Lacs

Haridwar, Dispur, Sivaganga

Work from Office

Urgent hiring for a Data Entry Operator to update and maintain information in our company databases and computer systems. Data Entry Operator responsibilities include collecting and entering data in databases and maintaining accurate company records. Required candidate profile: data entry work experience as a Data Entry Operator, basic typing speed, and basic computer knowledge. Both freshers and experienced candidates can apply. Interested candidates can send their resume to sankalpmanpowerservicesjobs@gmail.com

Posted 2 weeks ago

Apply

1.0 - 4.0 years

1 - 2 Lacs

Ballari, Kolhapur, Hassan

Work from Office

Urgent hiring for a Data Entry Operator to update and maintain information in our company databases and computer systems. Data Entry Operator responsibilities include collecting and entering data in databases and maintaining accurate company records. Required candidate profile: data entry work experience as a Data Entry Operator, basic typing speed, and basic computer knowledge. Both freshers and experienced candidates can apply. Interested candidates can send their resume to sankalpmanpowerservicesjobs@gmail.com

Posted 2 weeks ago

Apply

4.0 - 9.0 years

6 - 10 Lacs

Noida

Work from Office

Primary Responsibilities: Work on business analysis requests around different business problems. Create analytical findings for internal UHG clients, as per specifications received from them, using information in UHG-specific databases. Manage changing business priorities and scope, and work on multiple projects concurrently. Self-motivated and proactive, with the ability to work in a fast-paced environment. Document, discuss, and resolve business, data, data processing, and BI/reporting issues within the team, across functional teams, and with business stakeholders. Present written and verbal data analysis findings to both the project team and business stakeholders as required to support the requirements-gathering phase and issue-resolution activities. Coach and mentor other team members and help them with business and technical challenges. Draw up a project plan for the analysis and execute it as per schedule. Perform quality checks on the analysis and data for accuracy, completeness, and consistency before sending them to clients, both internal and external. End-to-end experience in designing and deploying analyses, dashboards, and data visualizations using Tableau/Power BI. Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, changes in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment).
The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so. Required Qualifications: Bachelor's or 4-year university degree. 7+ years' business experience on analytics projects in the Business Intelligence/Business Analysis space. 6+ years of solid work experience, with the ability to convert business requirements into technical requirements and to develop best-in-class code per technical/business requirements. Solid work experience in SQL or associated languages and tools, viz. R, Python, Hive, Databricks. In-depth project knowledge of business intelligence tools like Tableau, Power BI, etc. Proven interpersonal, collaboration, diplomatic, influencing, planning, and organizational skills. Proven relationship management skills to partner and influence across organizational lines. Proven ability to consistently demonstrate clear and concise written and verbal communication. Proven ability to effectively use complex analytical, interpretive, and problem-solving techniques. Demonstrated ability to work under pressure and meet tight deadlines with proactiveness, decisiveness, and flexibility.

Posted 2 weeks ago

Apply

10.0 - 15.0 years

11 - 16 Lacs

Mumbai, Pune, Chennai

Work from Office

Experience Level: 10+ years. Required Skills and Expertise: 10+ years of experience in data engineering, data architecture, data science, and reporting. Data Architecture: proven expertise in designing and managing scalable data architecture, including data lakes, warehouses, and pipelines. Data Science: strong understanding of machine learning, statistical modeling, and data analytics. Power BI: understanding of dashboards, reports, and the underlying semantic layer and queries. MongoDB: understanding and use of MongoDB from a troubleshooting and development perspective. Kubernetes/Docker: the current environment is container-based, so this is nice to have. Python Expertise: hands-on, advanced proficiency in Python for data engineering, automation, and analytics tasks. Key Responsibilities: Design, implement, and optimize scalable data architectures to support business intelligence, analytics, reporting, and machine learning initiatives. Develop and execute an overarching data strategy that aligns with organizational goals and objectives. Lead the development and management of the entire data landscape, including data pipelines, data lakes, and data warehouses. Leverage Python programming for data engineering tasks, including automation, data processing, and analytics. Provide thought leadership and guidance on emerging trends and technologies in data engineering, data science, data architecture, and reporting.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

2 - 6 Lacs

Hyderabad

Work from Office

Strong experience in programming fundamentals in .NET, C#, and API-based development. Expertise in working with service-oriented architectures, microservices, and web services. Object-oriented programming and design patterns, in addition to distributed computing. Proven experience in debugging large, complex software and working on production-quality applications. Expertise in data processing and storage technologies like SQL and/or NoSQL database systems. Strong experience working with Microsoft software development tools (Visual Studio, performance profilers, debugging and analysis tools). Strong knowledge of various Microsoft frameworks for UI development. Strong ability to understand software architectures and designs and to develop solutions conformant to them. Experience in requirements elicitation, design, development, effective reuse, diagnostics, and configuration management. Good knowledge of the SDLC and software engineering. Understand, troubleshoot, and drive difficult issues that span whole software systems. Able to uncover root causes and to devise and drive innovative analyses and solutions for complex problems. Creative thinker with good problem-solving abilities. Collaborate with teams across different geographical zones to drive, develop, and deliver software solutions. Strong analytical and problem-solving abilities. Self-learner, able to work with minimum supervision.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

3 - 7 Lacs

Mumbai, Pune, Chennai

Work from Office

Requirements: 5+ years' experience with the Microsoft Azure suite of products, mainly from a data ingestion perspective using ADF pipelines and Azure Databricks. 5+ years' hands-on experience with Databricks, ADF, PySpark, and Azure SQL is mandatory. Python/PySpark coding skills. Experience working with ADLS delta tables. Good understanding of data models and databases. Strong SQL skills. Worked on different SCD types. Able to deliver independently. Strong SQL procedure-writing skills and knowledge of ACID properties. Should be able to follow coding standards, take ownership, and be highly competent with data processing and optimization. Bachelor's degree in a related field.
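"Different SCD types" above refers to Slowly Changing Dimension handling in a warehouse. A Type 2 change, for instance, expires the current dimension row and inserts a new one so that history is preserved. A minimal sketch follows, with SQLite standing in for Azure SQL or Delta tables; the schema and values are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER,
        city        TEXT,
        valid_from  TEXT,
        valid_to    TEXT,      -- NULL while the row is current
        is_current  INTEGER
    )
""")
conn.execute(
    "INSERT INTO dim_customer VALUES (42, 'Pune', '2023-01-01', NULL, 1)"
)

def scd2_update(conn, customer_id, new_city, change_date):
    """Apply an SCD Type 2 change: expire the current row, insert the new one."""
    conn.execute(
        "UPDATE dim_customer SET valid_to = ?, is_current = 0 "
        "WHERE customer_id = ? AND is_current = 1",
        (change_date, customer_id),
    )
    conn.execute(
        "INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
        (customer_id, new_city, change_date),
    )

scd2_update(conn, 42, "Mumbai", "2024-06-01")
history = conn.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY valid_from"
).fetchall()
print(history)  # [('Pune', 0), ('Mumbai', 1)]
```

On Databricks the same two steps are typically expressed as a single `MERGE` against a Delta table, but the logic is the one shown.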

Posted 2 weeks ago

Apply

6.0 - 11.0 years

10 - 15 Lacs

Mumbai, Pune, Chennai

Work from Office

Responsibilities: Leads the delivery processes of data extraction, transformation, and load from disparate sources into a form consumable by analytics processes, for projects of moderate complexity, using strong technical capabilities and a sense of database performance. Designs, develops, and produces data models of relatively high complexity, leveraging a sound understanding of data modelling standards to suggest the right model for the requirement. Batch Processing: capability to design an efficient way of processing high volumes of data, where a group of transactions is collected over a period. Data Integration (Sourcing, Storage and Migration): capability to design and implement models, capabilities, and solutions to manage data within the enterprise (structured and unstructured, data archiving principles, data warehousing, data sourcing, etc.), including data models, storage requirements, and migration of data from one system to another. Data Quality, Profiling and Cleansing: capability to review (profile) a data set to establish its quality against a defined set of parameters and to highlight data where corrective action (cleansing) is required to remediate it. Stream Systems: capability to discover, integrate, and ingest all available data from the machines that produce it, as fast as it is produced, in any format and at any quality. Excellent interpersonal skills to build a network across departments to understand data and deliver business value; may interface and communicate with program teams, management, and stakeholders as required to deliver small to medium-sized projects. Understands the difference between on-prem and cloud-based data integration technologies.
The Role Offers: An opportunity to join a global team doing meaningful work that contributes to global strategy and individual development. An outstanding opportunity to re-imagine, redesign, and apply technology to add value to the business and operations. An opportunity to showcase strong analytical skills and problem-solving ability. Learning and growth opportunities in the cloud and big data engineering spaces. Essential Skills: 6+ years' experience developing large-scale data pipelines in a cloud/on-prem environment. Highly proficient in one or more market-leading ETL tools such as Informatica, DataStage, SSIS, or Talend. Deep knowledge of data warehouse/data mart architecture and modelling. Define and develop data ingest, validation, and transform pipelines. Deep knowledge of distributed data processing and storage. Deep knowledge of working with structured, unstructured, and semi-structured data. Working experience with ETL/ELT patterns. Extensive experience applying analytics, insights, and data mining to commercial real-world problems. Technical experience in at least one programming language, preferably Java, .NET, or Python. Essential Qualification: BE/BTech in Computer Science, Engineering, or a relevant field.
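The profiling capability this listing describes, reviewing a data set against a defined set of quality parameters and flagging data that needs cleansing, can be sketched in a few lines of plain Python. The column name, thresholds, and data below are invented; in practice this would run inside one of the ETL tools the listing names.

```python
# Profile one column of a list-of-dicts data set against simple quality
# parameters: maximum null rate and an allowed value range.

def profile(rows, column, max_null_rate=0.1, value_range=(0, 100)):
    """Return quality findings for one column: null rate and range violations."""
    values = [r.get(column) for r in rows]
    nulls = sum(v is None for v in values)
    null_rate = nulls / len(values)
    lo, hi = value_range
    out_of_range = [v for v in values if v is not None and not (lo <= v <= hi)]
    return {
        "null_rate": null_rate,
        "null_rate_ok": null_rate <= max_null_rate,
        "out_of_range": out_of_range,   # values needing corrective cleansing
    }

data = [{"age": 34}, {"age": None}, {"age": 130}, {"age": 28}]
report = profile(data, "age")
print(report)  # null rate 0.25 (over threshold), 130 flagged as out of range
```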

Posted 2 weeks ago

Apply