1578 Data Processing Jobs - Page 11

JobPe aggregates results for easy application access, but you apply directly on the original job portal.

4.0 - 8.0 years

10 - 14 Lacs

Pune

Work from Office

Source: Naukri

Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Associate Project Manager. In this role, you will:

Agile Leadership & Team Management: Serve as a servant leader, facilitating agile ceremonies (daily stand-ups, sprint planning, retrospectives, etc.) and coaching teams on Scrum practices to enhance delivery and efficiency. Ensure alignment of project goals with broader program objectives and actively promote Agile and DevOps practices. Manage team performance, fostering an environment of collaboration, continuous improvement, and accountability.

Project Delivery & Reporting: Drive end-to-end project delivery, including resource planning, scheduling, risk and dependency management, and reporting. Implement effective project controls, including change control processes, to manage scope and mitigate risks proactively. Provide regular updates on project status, budget, and resource utilization to leadership and stakeholders.

Stakeholder Management & Communication: Build strong relationships with internal and external stakeholders, acting as a liaison to ensure alignment and buy-in on project plans, changes, and decisions. Manage stakeholder expectations and report on project progress, challenges, and achievements to influence key decision-making.

Cloud Migration & Data Management: Lead projects within data domains, focusing on cloud migration (GCP), data processing, reporting, and automation. Oversee the design and implementation of data solutions aligned with regulatory requirements, ensuring the successful handling of complex, large-scale data processing needs.

Quality Assurance & Benefit Realization: Establish quality assurance processes and enforce best practices for continuous improvement and optimization. Develop a benefits realization plan to track and report on project outcomes against goals.

Requirements: To be successful in this role, you should meet the following requirements: Proven experience in Agile methodologies (Scrum, SAFe) and a strong understanding of agile product delivery in practice. Hands-on experience with cloud migration projects. Solid foundation in project management principles, with demonstrated success in managing data migration, reporting, and production support projects. Proven experience in handling change management, troubleshooting, and root cause analysis within IT environments. Familiarity with DevOps practices and metrics-driven productivity improvement. Strong analytical and decision-making abilities, with a focus on prioritizing competing demands effectively. Excellent communication and interpersonal skills for effective stakeholder management and team collaboration. Ability to mentor and guide teams, promoting team wellness, respect for diverse skill sets, and leveraging individual strengths for collective success.
Behavioral Attributes: Demonstrates a team-first attitude, safeguarding team interests and promoting shared achievements. Skilled in servant leadership, coaching, and fostering team growth. Ability to work collaboratively with global and cross-cultural teams, with a "share the glory, shield the blame" approach.

Preferred Qualifications: Certification in Agile methodologies (e.g., CSM, SAFe) is a plus. Experience in regulatory and compliance environments within the data domain.

Location: Pune and Hyderabad

Posted 1 week ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Bengaluru

Work from Office

Source: Naukri

As an HR Practitioner - Learning, you will support employees and business needs by delivering outstanding HR services. Your primary responsibilities include:

Manage end-to-end learning administration tasks via the Learning Management System (LMS), including but not limited to creating, modifying, and cancelling courses, classes, and curricula, and registering, assigning, completing, and updating learners on them. Provide employee service and act as a point of contact for employees with learning queries. Support queries related to employee password resets. Handle dispatching of the work queue effectively. Maintain a good understanding of the internal and external policies, procedures, regulations, and compliance requirements related to Human Resources and respond to employee queries. Contribute actively to any ongoing projects.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: Graduate/Postgraduate (MBA HR preferred) with a minimum of 2-4 years of experience in learning administration, customer relationship management, business administration, or a related field. Proven experience in Microsoft Excel, PowerPoint, MS Word, and G Suite. Excellent verbal and written English language skills. Prior experience in training or presentation is a plus. Experience working in a fast-paced, client-facing environment. Proven organizational skills and the ability to prioritize and manage time are essential for this role. Strong attention to detail, follow-up, and quick responsiveness are imperative. Ability to develop effective relationships with internal and external stakeholders of the organization. Flexibility to work in shifts, including night shifts (during training/knowledge transfer activities), is preferred. Problem-solving skills and the ability to analyze errors and complex issues and identify appropriate solutions.

Preferred technical and professional experience: Prior experience in data processing or data management. Working knowledge of Workday is an added advantage.

Posted 1 week ago

Apply

4.0 - 7.0 years

6 - 9 Lacs

Gurugram

Work from Office

Source: Naukri

A Data Engineer specializing in enterprise data platforms, experienced in building, managing, and optimizing data pipelines for large-scale environments, with expertise in big data technologies, distributed computing, data ingestion, and transformation frameworks. Proficient in Apache Spark, PySpark, Kafka, and Iceberg tables, with an understanding of how to design and implement scalable, high-performance data processing solutions.

What you'll do: As a Data Engineer - Data Platform Services, responsibilities include:

Data Ingestion & Processing: Designing and developing data pipelines to migrate workloads from IIAS to Cloudera Data Lake. Implementing streaming and batch data ingestion frameworks using Kafka and Apache Spark (PySpark). Working with IBM CDC and Universal Data Mover to manage data replication and movement.

Big Data & Data Lakehouse Management: Implementing Apache Iceberg tables for efficient data storage and retrieval. Managing distributed data processing with Cloudera Data Platform (CDP). Ensuring data lineage, cataloging, and governance for compliance with bank and regulatory policies.

Optimization & Performance Tuning: Optimizing Spark and PySpark jobs for performance and scalability. Implementing data partitioning, indexing, and caching to enhance query performance. Monitoring and troubleshooting pipeline failures and performance bottlenecks.

Security & Compliance: Ensuring secure data access, encryption, and masking using Thales CipherTrust. Implementing role-based access controls (RBAC) and data governance policies. Supporting metadata management and data quality initiatives.

Collaboration & Automation: Working closely with data scientists, analysts, and DevOps teams to integrate data solutions. Automating data workflows using Airflow and implementing CI/CD pipelines with GitLab and Sonatype Nexus. Supporting Denodo-based data virtualization for seamless data access.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: 4-7 years of experience in big data engineering, data integration, and distributed computing. Strong skills in Apache Spark, PySpark, Kafka, SQL, and Cloudera Data Platform (CDP). Proficiency in Python or Scala for data processing. Experience with data pipeline orchestration tools (Apache Airflow, Stonebranch UDM). Understanding of data security, encryption, and compliance frameworks.

Preferred technical and professional experience: Experience in banking or financial services data platforms. Exposure to Denodo for data virtualization and DGraph for graph-based insights. Familiarity with cloud data platforms (AWS, Azure, GCP). Certifications in Cloudera Data Engineering, IBM Data Engineering, or AWS Data Analytics.
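
As a rough illustration of the streaming-ingestion work this role describes, here is a minimal PySpark Structured Streaming sketch that reads a Kafka topic and lands it in a data lake. The broker, topic, and paths are placeholders, a production job would write to an Iceberg table rather than raw Parquet, and the spark-sql-kafka connector is assumed to be on the classpath.

from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

# Read a Kafka topic as a streaming DataFrame (placeholder broker and topic)
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")
       .option("subscribe", "transactions")
       .load())

# Kafka delivers the payload as bytes; cast it to a string column
events = raw.select(col("value").cast("string").alias("payload"))

# Land micro-batches in the lake; a production job would target an Iceberg table
query = (events.writeStream
         .format("parquet")
         .option("path", "/data/lake/transactions")
         .option("checkpointLocation", "/data/checkpoints/transactions")
         .start())
query.awaitTermination()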

Posted 1 week ago

Apply

7.0 - 11.0 years

9 Lacs

Bengaluru

Work from Office

Source: Naukri

Skill required: Property & Casualty - Catastrophe Risk Management. Designation: CAT Modeling & Analytics Specialist. Qualifications: BE. Years of Experience: 7 to 11 years.

About Accenture: Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com

What would you do? We help insurers redefine their customer experience while accelerating their innovation agenda to drive sustainable growth by transforming to an intelligent operating model. Intelligent Insurance Operations combines our advisory, technology, and operations expertise, global scale, and robust ecosystem with our insurance transformation capabilities. It is structured to address the scope and complexity of the ever-changing insurance environment and offers a flexible operating model that can meet the unique needs of each market segment.

The Cat Risk Analytics team develops technical solutions, applications, and tools to support the General Insurance business with risk analysis, pricing, portfolio analytics, and cat modeling: Develop, maintain, and support account pricing/valuation tools and modules. Develop, maintain, and support cat modeling for accounts and portfolios. Develop, maintain, and support data for hazard analytics. Develop, maintain, and support hazard analytics for natcat and man-made cat risk. Develop, maintain, and support portfolio analytics and portfolio modeling tools. Proof-of-concept new ideas (tools and data) quickly for risk analysis, pricing, and portfolio analysis. Provide business support by responding to ad hoc requests and reports in a timely manner.

What are we looking for? Geo-Spatial Developer/Expert: Strong hands-on experience in geo-spatial data and technologies, and depth in GIS best practices and principles; must be able to work independently on technical tasks. Expert in the ESRI/ArcGIS tool suite: ArcMap/ArcGIS Pro, ArcGIS Server, ESRI/JavaScript, ArcPy (Python). Geo-spatial SQL programming (Oracle Spatial or PostgreSQL) is required. Experience in Python programming is required. Experience with a spatial data processing tool like Safe/FME Desktop is required. Demonstrated experience with third-party data sources and geo-coding tools is required. Demonstrated willingness to perform data analysis, and the ability to abstract rules for data storage and processing; experience with large data is required. Experience in the insurance/reinsurance domain is highly desirable. Able to think independently and creatively. Experience responding to production business queries and issues in a timely manner. Clear and effective communication. Strong attention to detail. Excellent time-management skills and multi-tasking under pressure. Must be able to balance and adjust to changing organizational priorities. Drive to solve problems and self-motivated learning. Must be able to think outside of the box and work in a small, collaborative team environment. Ability to work with global teams in a collaborative mode. Ability to deal effectively and courteously with users and colleagues. Must be able to work outside of normal business hours, including evenings, weekends, and public holidays, as necessary.

Qualification: BE
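
For a sense of the geo-spatial SQL programming this posting asks for, the following hedged sketch runs a point-in-polygon query against a PostGIS-enabled PostgreSQL database from Python. The DSN, table names, and columns are hypothetical.

import psycopg2

conn = psycopg2.connect("dbname=hazard user=analyst")  # placeholder DSN

with conn, conn.cursor() as cur:
    # Point-in-polygon join: which insured locations sit inside hazard zones?
    cur.execute("""
        SELECT l.location_id, z.zone_name
        FROM locations l
        JOIN flood_zones z ON ST_Contains(z.geom, l.geom)
        WHERE z.severity >= %s
    """, (3,))
    for location_id, zone_name in cur.fetchall():
        print(location_id, zone_name)

conn.close()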

Posted 1 week ago

Apply

15.0 - 20.0 years

17 - 22 Lacs

Mumbai

Work from Office

Source: Naukri

Project Role: Application Developer. Project Role Description: Design, build and configure applications to meet business process and application requirements. Must-have skills: Apache Spark. Good-to-have skills: Java Enterprise Edition. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education.

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of the projects you are involved in, ensuring that the applications you develop are efficient and effective in meeting user needs.

Roles & Responsibilities: Expected to be an SME. Collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Provide solutions to problems for the immediate team and across multiple teams. Mentor junior team members to enhance their skills and knowledge. Continuously evaluate and improve application performance and user experience.

Professional & Technical Skills: Must-have: Proficiency in Apache Spark. Good-to-have: Experience with Java Enterprise Edition. Strong understanding of distributed computing principles. Experience with data processing frameworks and tools. Familiarity with cloud platforms and services.

Additional Information: The candidate should have a minimum of 5 years of experience in Apache Spark. This position is based in Mumbai. 15 years of full-time education is required.
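
As a simple illustration of the must-have Apache Spark skill, here is a minimal batch aggregation in PySpark. The file paths and column names are invented for the example.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("batch-aggregate").getOrCreate()

# Placeholder input: a CSV of orders with customer_id and amount columns
orders = spark.read.option("header", True).csv("/data/in/orders.csv")

# Total order value per customer, keeping the top 100 spenders
top = (orders
       .withColumn("amount", F.col("amount").cast("double"))
       .groupBy("customer_id")
       .agg(F.sum("amount").alias("total_spend"))
       .orderBy(F.desc("total_spend"))
       .limit(100))

top.write.mode("overwrite").parquet("/data/out/top_customers")
spark.stop()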

Posted 1 week ago

Apply

8.0 - 13.0 years

10 - 15 Lacs

Bengaluru

Work from Office

Source: Naukri

Very good experience with the Continuous Flow Graph tool used for point-based development. Design, develop, and maintain ETL processes using Ab Initio tools. Write, test, and deploy Ab Initio graphs, scripts, and other necessary components. Troubleshoot and resolve data processing issues and improve performance.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: 8 years overall and 5+ years relevant experience. Extract, transform, and load data from various sources into data warehouses, operational data stores, or other target systems. Work with different data formats, including structured, semi-structured, and unstructured data.

Preferred technical and professional experience: Effective communication and presentation skills. Industry expertise/specialization.
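
Ab Initio graphs are built in the proprietary GDE rather than in open code, so as a rough stand-in this sketch shows the same extract-transform-load pattern in plain Python. The file names and transform rules are hypothetical.

import csv

def extract(path):
    # Source: rows from a delimited file, as dicts keyed by header
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(row):
    # Stand-in business rules: normalise the amount and country code
    row["amount"] = f"{float(row['amount']):.2f}"
    row["country"] = row["country"].strip().upper()
    return row

# Load: write the transformed records to the target file
rows = [transform(r) for r in extract("source.csv")]
with open("target.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)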

Posted 1 week ago

Apply

3.0 - 7.0 years

1 - 3 Lacs

Greater Noida

Work from Office

Source: Naukri

Role & responsibilities: Sound knowledge of and hands-on experience with HLOOKUP, VLOOKUP, Pivot Tables, Conditional Formatting, etc. Good at preparing MIS reports. Perform data analysis to generate reports on a periodic basis. Provide strong reporting and analytical information support. Knowledge of various MIS reporting tools.
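
For illustration, the Excel techniques above map directly onto pandas: a VLOOKUP is a merge on a key column, and a Pivot Table is pivot_table(). The data below is made up.

import pandas as pd

sales = pd.DataFrame({"emp_id": [1, 2, 1, 3],
                      "region": ["North", "South", "North", "East"],
                      "amount": [120, 80, 200, 150]})
staff = pd.DataFrame({"emp_id": [1, 2, 3],
                      "name": ["Asha", "Ravi", "Meena"]})

# VLOOKUP equivalent: pull each employee's name into the sales table
report = sales.merge(staff, on="emp_id", how="left")

# Pivot Table equivalent: total amount by employee and region
pivot = report.pivot_table(index="name", columns="region",
                           values="amount", aggfunc="sum", fill_value=0)
print(pivot)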

Posted 1 week ago

Apply

5.0 - 9.0 years

20 - 35 Lacs

Noida, Gurugram, Delhi / NCR

Hybrid

Source: Naukri

Salary: 20 to 35 LPA. Experience: 5 to 8 years. Location: Gurgaon (Hybrid). Notice: Immediate to 30 days.

Roles and Responsibilities: Design, develop, test, deploy, and maintain large-scale data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, Dataproc, and Cloud Storage. Collaborate with cross-functional teams to identify business requirements and design solutions that meet those needs. Develop complex SQL queries to extract insights from large datasets stored in Google Cloud SQL databases. Troubleshoot issues related to data processing workflows and provide timely resolutions.

Desired Candidate Profile: 5-9 years of experience in data engineering with expertise in GCP and BigQuery. Strong understanding of GCP platform administration, including Compute Engine (Dataproc), Kubernetes Engine (K8s), Cloud Storage, Cloud SQL, etc. Experience working on big data analytics projects involving ETL processes using tools like Airflow or similar technologies.
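
A minimal sketch of the BigQuery side of this work, using the google-cloud-bigquery Python client. The project, dataset, and table names are placeholders, and authentication is assumed to be already configured.

from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project ID

query = """
    SELECT user_id, COUNT(*) AS events
    FROM `my-project.analytics.events`   -- placeholder dataset and table
    GROUP BY user_id
    ORDER BY events DESC
    LIMIT 10
"""

# Run the query and iterate the result rows
for row in client.query(query).result():
    print(row.user_id, row.events)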

Posted 1 week ago

Apply

4.0 - 8.0 years

20 - 35 Lacs

Noida, Gurugram, Delhi / NCR

Hybrid

Source: Naukri

Salary: 20 to 35 LPA. Experience: 4 to 8 years. Location: Gurgaon. Notice: Immediate to 30 days. Key Skills: GCP, Cloud, Pub/Sub, Data Engineering.
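
As a quick illustration of the Pub/Sub skill named above, this sketch publishes a message with the google-cloud-pubsub Python client. The project and topic names are placeholders.

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "ingest-events")  # placeholders

# Messages are bytes; extra keyword arguments become message attributes
future = publisher.publish(topic_path, b'{"order_id": 42}', source="demo")
print("Published message ID:", future.result())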

Posted 1 week ago

Apply

5.0 - 10.0 years

12 - 24 Lacs

Hyderabad / Secunderabad, Telangana, India

On-site

Source: Foundit

Description: We are seeking an experienced NLP Engineer to join our team in India. The ideal candidate will have a strong background in natural language processing and machine learning, with the ability to develop and implement innovative solutions to complex language challenges.

Responsibilities: Design and implement natural language processing (NLP) models and algorithms. Collaborate with data scientists and software engineers to integrate NLP capabilities into existing applications. Conduct research to improve existing NLP models and explore new approaches to solving language-related problems. Analyze and preprocess large datasets for training NLP models. Deploy and maintain NLP models in production environments, ensuring high performance and scalability.

Skills and Qualifications: 5-10 years of experience in natural language processing or related fields. Strong programming skills in Python, Java, or similar languages. Proficiency in NLP libraries and frameworks such as NLTK, spaCy, TensorFlow, or PyTorch. Experience with machine learning algorithms and statistical methods. Familiarity with data preprocessing techniques for text data. Knowledge of deep learning architectures for NLP, including LSTM, CNN, and Transformers. Strong analytical and problem-solving skills. Excellent communication and teamwork abilities.
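
One concrete example from the toolkit above: named-entity extraction with spaCy. This assumes the small English model has been installed (python -m spacy download en_core_web_sm); the sample sentence is made up.

import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Acme Analytics opened a new office in Hyderabad in 2023.")

# Each recognised entity carries its surface text and predicted label
for ent in doc.ents:
    print(ent.text, ent.label_)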

Posted 1 week ago

Apply

8.0 - 13.0 years

4 - 9 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Source: Foundit

Responsibilities: Take end-to-end responsibility to build, optimize, and support existing and new data products towards the defined target vision. Be a champion of the DevOps mindset and principles, able to manage CI/CD pipelines and Terraform as well as cloud infrastructure; in our context, GCP (Google Cloud Platform). Transform data into meaningful insights, primarily using dbt and GCP BigQuery. Ensure that our data products work as independent units of deployment and that their non-functional aspects follow the defined standards for security, scalability, observability, and performance. Work closely with the Product Owner and other stakeholders on the vision for existing data products and on identifying new data products to support our customer needs. Work with product teams within and outside our domain on topics related to the data mesh concept. Evaluate and drive continuous improvement and reduce technical debt in the teams. Maintain expertise in the latest data/analytics and cloud technologies.

Qualifications: Passion for data, people, and technology. At least 8 years of work experience, including hands-on experience as either a data engineer on modern cloud data platforms and advanced analytics environments, or a software engineer with cloud technologies and infrastructure. Excellent experience with dbt. Experience with different data formats (Avro, Parquet). Experience with data query languages (SQL or similar). Experience in data-centric programming using one or more programming languages such as Python, Java, or Scala. Good understanding of different data modelling techniques and their trade-offs. Knowledge of NoSQL and RDBMS databases. A collaborative and co-creative mindset with excellent communication skills. Motivated to work in an environment that allows you to work and take decisions independently. Experience working with data visualization tools. Fluent in English, both written and verbal.

An advantage if you also have: Experience with GCP tools such as Dataflow, Dataproc, and BigQuery. Experience with a data processing framework such as Beam, Spark, Hive, or Flink (see the sketch below). Business understanding of the retail industry.
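
Since Beam is listed as an advantage, here is a minimal, hedged Apache Beam pipeline: a word count over in-memory data, runnable with the default DirectRunner. Real pipelines in this stack would read from and write to GCS or BigQuery.

import apache_beam as beam

with beam.Pipeline() as p:  # DirectRunner by default
    (p
     | "Create" >> beam.Create(["data mesh", "data product", "data mesh"])
     | "Pair" >> beam.Map(lambda term: (term, 1))
     | "Count" >> beam.CombinePerKey(sum)
     | "Print" >> beam.Map(print))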

Posted 1 week ago

Apply

0.0 - 1.0 years

0 - 1 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Source: Foundit

Role Responsibilities: Perform manual data collection from online sources Update academic profiles into internal database tools Match articles and degrees through in-house applications Execute ad hoc research tasks and meet delivery timelines Key Deliverables: Accurate and timely research data entry Faculty career histories mapped comprehensively Onboarding of new contacts to database systems High-quality results aligned with turnaround targets

Posted 1 week ago

Apply

10.0 - 15.0 years

10 - 15 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Source: Foundit

Very strong hands-on experience in Databricks with AWS/Azure cloud services in data engineering and data processing. Must have Databricks implementation experience. Hands-on experience in AWS cloud-based development and integration. Strong knowledge of PySpark, Python, and PL/SQL for data engineering pipelines. Practical experience with data engineering and the accompanying DevOps/DataOps workflows. Experience with the offshore/onshore model and agile methodology. Gather requirements, understand the business need, and hold regular discussions with the tech team on design and development activities. Should have good experience working with the client architecture/design team to understand the architecture and requirements and work on the development. Experience working in the financial industry. Certification in Databricks and AWS will be an added advantage.

Posted 1 week ago

Apply

3.0 - 8.0 years

1 - 2 Lacs

Ambarnath

Work from Office

Source: Naukri

* Maintain database accuracy through data management practices, and input data into computer systems with high speed and accuracy
* Prepare reports from processed data using software tools

Email your resume to satishg@bidhata.com. Badlapur location.

Posted 1 week ago

Apply

6.0 - 11.0 years

25 - 40 Lacs

Gurugram, Bengaluru

Hybrid

Source: Naukri

Salary: 25 to 40 LPA. Experience: 7 to 11 years. Location: Bangalore/Gurgaon. Notice: Immediate only. Key Skills: SQL, Advanced SQL, BI tools, ETL, etc.

Roles and Responsibilities: Extract, manipulate, and analyze large datasets from various sources such as Hive, SQL databases, and BI tools. Develop and maintain dashboards using Tableau to provide insights on banking performance, market trends, and customer behavior. Collaborate with cross-functional teams to identify key performance indicators (KPIs) and develop data visualizations to drive business decisions.

Desired Candidate Profile: 6-10 years of experience in data analytics or a related field, with expertise in banking analytics, business intelligence, campaign analytics, marketing analytics, etc. Strong proficiency in tools like Tableau for data visualization; advanced SQL knowledge preferred. Experience working with big data technologies like the Hadoop ecosystem (Hive) and Spark; familiarity with the Python programming language required.
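
To illustrate the advanced-SQL skill this role calls for, the sketch below computes a month-over-month change with the LAG() window function, run here against an in-memory SQLite database with toy data; on the job this would target Hive or an enterprise SQL database instead.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE txns (branch TEXT, month TEXT, revenue REAL);
    INSERT INTO txns VALUES
        ('BLR', '2024-01', 100), ('BLR', '2024-02', 130),
        ('GGN', '2024-01', 90),  ('GGN', '2024-02', 80);
""")

# Month-over-month revenue change per branch via the LAG() window function
for row in conn.execute("""
    SELECT branch, month, revenue,
           revenue - LAG(revenue) OVER (
               PARTITION BY branch ORDER BY month) AS mom_change
    FROM txns
    ORDER BY branch, month
"""):
    print(row)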

Posted 1 week ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Source: Naukri

Salary: 20 to 35 LPA. Experience: 3 to 8 years. Location: Pune/Bangalore/Gurgaon (Hybrid). Notice: Immediate only. Key Skills: SQL, Advanced SQL, BI tools, etc.

Roles and Responsibilities: Extract, manipulate, and analyze large datasets from various sources such as Hive, SQL databases, and BI tools. Develop and maintain dashboards using Tableau to provide insights on banking performance, market trends, and customer behavior. Collaborate with cross-functional teams to identify key performance indicators (KPIs) and develop data visualizations to drive business decisions.

Desired Candidate Profile: 3-8 years of experience in data analytics or a related field, with expertise in banking analytics, business intelligence, campaign analytics, marketing analytics, etc. Strong proficiency in tools like Tableau for data visualization; advanced SQL knowledge preferred. Experience working with big data technologies like the Hadoop ecosystem (Hive) and Spark; familiarity with the Python programming language required.

Posted 1 week ago

Apply

3.0 - 8.0 years

15 - 30 Lacs

Noida, Gurugram, Delhi / NCR

Hybrid

Source: Naukri

Salary: 20 to 35 LPA. Experience: 3 to 8 years. Location: Gurgaon (Hybrid). Notice: Immediate only. Key Skills: SQL, Advanced SQL, BI tools, etc.

Roles and Responsibilities: Extract, manipulate, and analyze large datasets from various sources such as Hive, SQL databases, and BI tools. Develop and maintain dashboards using Tableau to provide insights on banking performance, market trends, and customer behavior. Collaborate with cross-functional teams to identify key performance indicators (KPIs) and develop data visualizations to drive business decisions.

Desired Candidate Profile: 3-8 years of experience in data analytics or a related field, with expertise in banking analytics, business intelligence, campaign analytics, marketing analytics, etc. Strong proficiency in tools like Tableau for data visualization; advanced SQL knowledge preferred. Experience working with big data technologies like the Hadoop ecosystem (Hive) and Spark; familiarity with the Python programming language required.

Posted 1 week ago

Apply

0.0 - 1.0 years

0 Lacs

Greater Noida

Work from Office

Source: Naukri

Position: Data Management Intern. Location: Noida - Sector 135. Working days: 6 days (10:30 am to 7:30 pm). Skills: Data entry, basic Excel, data correction. Open to students and freshers with basic Excel skills and attention to detail.

Role & responsibilities: Web scraping: extract course information from university websites using the Octoparse tool. Manual data correction/management: ensure the accuracy and quality of the extracted data through manual review and correction. Data gathering: collect structured and unstructured data through web scraping and other methods. Data cleaning: ensure data accuracy by identifying inconsistencies, duplication, and errors in collected datasets.

Preferred candidate profile: Education: any field of study. Technical skills: basic understanding of Excel and data entry tools. Analytical skills: strong attention to detail and data accuracy. Communication skills: able to communicate professionally with cross-functional teams. Time management: ability to handle multiple tasks and meet deadlines in a fast-paced environment. Problem-solving skills: a proactive approach to addressing data collection challenges.
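
Octoparse is a point-and-click scraper, but the gather-and-clean step it automates looks roughly like this hedged Python sketch using requests and BeautifulSoup. The URL and CSS selector are hypothetical.

import requests
from bs4 import BeautifulSoup

resp = requests.get("https://example.edu/courses", timeout=10)  # placeholder URL
resp.raise_for_status()

soup = BeautifulSoup(resp.text, "html.parser")
# Hypothetical selector for course titles on the page
courses = [el.get_text(strip=True) for el in soup.select(".course-title")]

# Basic cleaning: drop blanks and duplicates before manual review
cleaned = sorted({c for c in courses if c})
print(cleaned)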

Posted 1 week ago

Apply

4.0 - 9.0 years

17 - 19 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Source: Naukri

Be an Operation Automation Associate driving global success with innovative solutions and collaboration. As an Operation Automation Associate within the MART team, you will be dedicated to automating and optimizing reports and creating dashboards along with end-to-end ETL solutions. You will achieve this through both strategic and tactical solutions, utilizing a variety of Business Intelligence (BI) tools, ETL tools, and databases to create low-touch processes that enhance reporting. Your efforts will focus on mitigating risks and delivering efficiencies by improving processes. Our group is heavily focused on data processing utilizing several different technology stacks, and we continually seek to improve our technology environment as part of our ongoing modernization journey. Our modernization plans include automating manual legacy processes and migrating to the cloud (AWS).

Job Responsibilities: Manage stakeholder expectations effectively and facilitate decision-making by providing the right level of information and timely escalation when required. Ensure all process and supporting documents are kept up to date, and all escalations are done on a timely basis. Collaborate cross-functionally to efficiently cater to business deliverables. Drive change management projects and new reporting requirements independently. Provide ad-hoc data support upon request by the business. Ensure accurate and timely resolution of all queries. Engage in continuous learning and upskilling to effectively contribute towards business deliveries. Exhibit a proactive approach towards identifying problems and solutions. Demonstrate ownership in ensuring the completion of assigned projects in a timely manner. Articulate problem statements and solution strategies impactfully. Plan and report the status of ongoing projects and tasks to senior management.

Required qualifications, capabilities, and skills: 4+ years of experience working with databases and ETL. Experience with relational enterprise databases (Oracle and/or SQL Server). Experience in SQL and query optimization concepts (T-SQL and/or PL/SQL). Exposure to BI tools like QlikSense/Tableau. Experience in migrating on-premises data workflows to the public cloud (AWS). Creating and manipulating Unix scripts. Creating and maintaining ETL processes using Pentaho. Strong data analytical skills and a financial services or business consultancy background. Advanced knowledge of application, data, and infrastructure architecture disciplines. Good project management skills with the ability to plan, prioritize, and deliver against deadlines. Must hold a Bachelor's degree or above.

Preferred qualifications, capabilities, and skills: A programming language such as Python.

Posted 1 week ago

Apply

0.0 - 2.0 years

2 - 5 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Source: Naukri

A motivated Life Science graduate with 0-2 years of experience, preferably in medical records review/summarization or medical content writing. In this role, you will be responsible for analyzing and summarizing medical records to support case evaluations, ensuring accuracy and adherence to timelines. On-site work opportunity in our Chennai office. India compensation is based upon the local competitive market.

Responsibilities: Review and summarize medical records with attention to detail. Identify key data points and compile concise summaries. Collaborate with team members to ensure timely completion of cases. Maintain confidentiality and comply with medical record handling standards.

Qualifications: Bachelor's degree in Life Sciences or a related field. 0-2 years of experience in medical records review or summarization (preferred). Strong analytical and written communication skills. Familiarity with medical terminology is a plus.

Our Cultural Values: Entrepreneurs at heart, we are a customer-first team sharing one goal and one vision. We seek team members who are: Humble - no one is above another; we all work together to meet our clients' needs and we acknowledge our own weaknesses. Hungry - we all are driven internally to be successful and to continually expand our contribution and impact. Smart - we use emotional intelligence when working with one another and with clients. Our culture shapes our actions, our products, and the relationships we forge with our customers.

Who We Are: KLDiscovery provides technology-enabled services and software to help law firms, corporations, government agencies, and consumers solve complex data challenges. The company, with offices in 26 locations across 17 countries, is a global leader in delivering best-in-class eDiscovery, information governance, and data recovery solutions to support the litigation, regulatory compliance, internal investigation, and data recovery and management needs of our clients. Serving clients for over 30 years, KLDiscovery offers data collection and forensic investigation, early case assessment, electronic discovery and data processing, application software and data hosting for web-based document reviews, and managed document review services. In addition, through its global Ontrack Data Recovery business, KLDiscovery delivers world-class data recovery, email extraction and restoration, data destruction, and tape management. KLDiscovery has been recognized as one of the fastest growing companies in North America by both Inc. Magazine (Inc. 5000) and Deloitte (Deloitte's Technology Fast 500). Additionally, KLDiscovery is an Orange-level Relativity Best in Service Partner, a Relativity Premium Hosting Partner, and maintains ISO/IEC 27001 certified data centers. KLDiscovery is an Equal Opportunity Employer.

Posted 1 week ago

Apply

10.0 - 12.0 years

20 - 25 Lacs

Bengaluru

Work from Office

Source: Naukri

Designing technical architecture for IET Digital utilizing current tools and platforms focused on cloud architecture, cloud build, etc., including the SPARQ deployment strategy in Azure (IaaS or PaaS) and AWS and its applicability across the rest of the enterprise. Designing the technical stack for Cordant (the IET Digital flagship product), including tooling decisions. Enabling the enterprise architecture vision and strategy, and supporting the enterprise-wide applications and business systems roadmap. Designing servers, storage, security, networks, virtualization/cloud, systems software, tools, and governance to meet specified goals. Applying existing technologies, approaches, and methodologies in new combinations to design new products, systems, or processes. Viewed internally and externally as a specialist in the discipline. Leading the definition, design, and documentation of technical environments. Deploying solution architectures, conducting analysis of alternative architectures, and creating architectural standards. Defining processes to ensure conformance with standards, instituting solution-testing criteria, and promoting a clear and consistent business vision through technical architectures. Planning and delivering legacy infrastructure transformation and migration to drive next-generation business outcomes. Driving Continuous Integration and Continuous Delivery (CI/CD) based application and cloud infrastructure development. Understanding, learning, and applying new automated build, test, and deployment capabilities, and helping develop project teams towards integrating such solutions. Collaborating with internal development teams, external partners, and QA teams to help ensure end-to-end quality.

Fuel your passion. To be successful in this role you will: Have a Bachelor's degree from an accredited university or college with a minimum of 10 additional years of experience in infrastructure architecture. Have experience working with the Linux operating system. Have knowledge of hybrid cloud environments and an understanding of microservice design and architectural patterns. Have strong expertise in DevOps and CI/CD implementation and thorough knowledge of cloud-native development. Have expertise in implementing Keycloak-based IAM solutions that support the integration of enterprise user directories such as LDAP and AD, and/or third-party SSO providers, for identity information and applications via standards-based tokens. Have expertise in implementing Role-Based Access Control (RBAC) and Attribute-Based Access Control (ABAC) policy creation and enforcement. Have expertise with SAML 2.0, OpenID Connect, and OAuth 2.0. Have experience with load balancers such as Apache, Nginx, and HAProxy. Have familiarity with PKI infrastructure, certificate authorities, OCSP, CRLs, CSRs, X.509 certificate structures, and PKCS#12 certificate containers. Have experience with microservices architectures and their components, including Docker and Kubernetes. Have experience and domain knowledge related to data processing, plus experience with DevSecOps and identity access management. Have experience with software configuration management tools such as Git/GitLab, and with software development environments and CI/CD tools such as Jenkins.
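
As one concrete slice of the Keycloak/OAuth 2.0 expertise described, this hedged sketch obtains an access token via the client-credentials grant from a Keycloak realm's token endpoint. The host, realm, and client details are placeholders.

import requests

# Placeholder host, realm, and client credentials
TOKEN_URL = "https://keycloak.example.com/realms/iet/protocol/openid-connect/token"

resp = requests.post(TOKEN_URL, data={
    "grant_type": "client_credentials",
    "client_id": "sparq-service",
    "client_secret": "change-me",
}, timeout=10)
resp.raise_for_status()

token = resp.json()["access_token"]
# Calls to protected APIs then send the header: Authorization: Bearer <token>
print(token[:40], "...")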

Posted 1 week ago

Apply

3.0 - 9.0 years

25 - 30 Lacs

Bengaluru

Work from Office

Source: Naukri

Job Title: Senior Software Engineer

About Skyhigh Security: Skyhigh Security is a dynamic, fast-paced cloud company that is a leader in the security industry. Our mission is to protect the world's data, and because of this, we live and breathe security. We value learning at our core, underpinned by openness and transparency. Skyhigh Security is more than a company; here, when you invest your career with us, we commit to investing in you. We embrace a hybrid work model, creating the flexibility and freedom you need from your work environment to reach your potential. From our employee recognition program to our Blast Talks learning series and team celebrations (we love to have fun!), we strive to be an interactive and engaging place where you can be your authentic self. We are on social media too! Follow us on LinkedIn and Twitter @SkyhighSecurity.

Role Overview: Software Engineers conduct or participate in multidisciplinary research and collaborate with design, layout, and/or hardware engineers on the design, development, and utilization of productivity-enhancing layout tools, design rule checkers, and electronic data processing systems software. They determine computer users' needs; advise hardware designers on machine characteristics that affect software systems, such as storage capacity, processing speed, and input/output requirements; and design and develop compilers, assemblers, utility programs, and operating systems. They respond to customer/client requests or events as they occur, and develop solutions to problems utilizing formal education, judgement, and a formal software process.

About the Role: Design and develop high-performing and responsive web applications using AngularJS best practices. Write clean and efficient JavaScript, CSS, and HTML code. Analyze and resolve debugging issues in the application code to maintain the sustainability of the application. Proficiency in JavaScript, TypeScript, Angular (v15 and above), HTML, and CSS. Create system configuration functions using a component-based architecture. Troubleshoot bugs and resolve issues. Hands-on experience in developing modularized or template-based implementations. Perform product analysis and development tasks that may require extensive research and analysis. Improve application performance through JavaScript profiling and code optimization. Development experience with cloud/SaaS deployments is strongly desired. Candidates with prior experience in developing software for security/networking products will be preferred. Should have excellent debugging, troubleshooting, analytical, and problem-solving skills. Good verbal and written communication in English. Working experience with JIRA/Git is desired.

About You: Practical knowledge of AngularJS, Angular 2+, and TypeScript. Bachelor's degree in IT, computer science, computer engineering, or similar. Good understanding of UI/UX design principles. Excellent Node.js and Express knowledge. Experience with database technologies (e.g., MySQL and MongoDB). Problem-solving and analytical skills. Strong knowledge of CSS, HTML, and writing cross-browser compatible code. Experience using JavaScript build tools like Gulp or Grunt. Familiarity and sensibility with UX and design application and execution. Good comprehension of server-side CSS pre-processors such as Stylus and Less. Comfortable with Java and JavaScript (workflow and decision engines). Good understanding of and familiarity with Kafka and MongoDB. Should be a team player with strong attention to detail. Excellent verbal and written communication skills.

Company Benefits and Perks: We believe that the best solutions are developed by teams who embrace each other's unique experiences, skills, and abilities. We work hard to create a dynamic workforce where we encourage everyone to bring their authentic selves to work every day. We offer a variety of social programs, flexible work hours, and family-friendly benefits to all of our employees: Retirement Plans; Medical, Dental and Vision Coverage; Paid Time Off; Paid Parental Leave; Support for Community Involvement.

We're serious about our commitment to a workplace where everyone can thrive and contribute to our industry-leading products and customer support, which is why we prohibit discrimination and harassment based on race, color, religion, gender, national origin, age, disability, veteran status, marital status, pregnancy, gender expression or identity, sexual orientation, or any other legally protected status.

Posted 1 week ago

Apply
Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies