
1649 Data Processing Jobs - Page 27

JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

5.0 - 10.0 years

20 - 25 Lacs

Bengaluru

Work from Office

Grow with us

About this opportunity: Ericsson is seeking a dedicated and skilled CI/CD Engineer to develop and maintain our CI/CD (Continuous Integration and Continuous Delivery) toolkits. This role serves a critical function, enabling our product and engineering teams to operate effectively within a CI/CD pipeline. You will also support teams in deploying and using the CI/CD toolchain. Familiarity with Agile methodology and principles is vital in this role to foster a productive and efficient work environment.

What you will do:
- Develop and fine-tune CI/CD toolkits for use by our product and engineering teams.
- Build and manage products/services within the CI/CD pipeline, tailored to user needs and our target architecture.
- Craft multistage YAML pipelines for the build and deploy process, ensuring a consistent and repeatable procedure.
- Guide and implement branching and merging strategies for concurrent development.
- Leverage tools like Terraform to automate infrastructure provisioning.
- Contribute to the refinement of our DevOps processes while exploring new technologies to spur innovation and boost development productivity.
- Document configurations to foster self-service capabilities among teams.
- Actively participate in CI/CD Communities of Practice/Interest, continuously learn on the job, and experiment with new technology.
- Support onboarding of new team members by sharing knowledge and providing guidance.

The skills you bring (in order of importance):
1. Ansible automation - robust understanding of the framework; experience structuring automation code; building your own collections and roles with inter-dependencies; writing plugins (inventory, action, lookup, etc.) and modules
2. Python development - writing scripts and class libraries for system administration and data processing; interacting with REST APIs; handling embedded documentation; JSON, YAML, Jinja2, RegExp
3. Working Linux/Unix platform experience (preferably Rocky / Red Hat Linux) - command shell; SSH config; managing files and configs; SELinux; firewalld; logging; troubleshooting; installation package build and management
4. Git and GitLab - good working knowledge of Git as the day-to-day SCM; experience with GitLab (alternatively GitHub) for code handling and CI pipelines
5. Terraform / Kubernetes - hands-on experience working with and implementing Terraform and Kubernetes is an added advantage
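The Python competencies listed above (scripts for system administration, REST APIs, JSON, RegExp) can be illustrated with a minimal, hedged sketch. The payload shape, field names, and the `cmdb.example.com` endpoint mentioned in the comment are invented for illustration; only the standard library is used.

```python
import json
import re

# Simple lowercase-hostname check (illustrative, not a full RFC validator).
HOSTNAME_RE = re.compile(r"^[a-z][a-z0-9-]{0,62}$")

def parse_inventory(raw: str) -> list:
    """Parse a JSON inventory payload, e.g. fetched from a REST API
    such as a hypothetical https://cmdb.example.com/api/hosts endpoint."""
    return json.loads(raw)["hosts"]

def valid_hostnames(hosts: list) -> list:
    """Return the hostnames that pass the regular-expression check."""
    return [h["name"] for h in hosts if HOSTNAME_RE.match(h["name"])]

# Example payload standing in for a live API response.
raw = '{"hosts": [{"name": "web-01"}, {"name": "BAD_HOST"}, {"name": "db-02"}]}'
print(valid_hostnames(parse_inventory(raw)))  # ['web-01', 'db-02']
```

A real automation script along these lines would typically feed the cleaned host list into an Ansible dynamic inventory plugin.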

Posted 2 weeks ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Chennai

Work from Office

Position Summary...

About Team: The Walmart Contact Center specializes in providing best-in-class service to customers, stores, and associates via phone, chat, and email. We are a metrics-driven center dedicated to driving results, where our associates thrive in a high-volume environment that handles over 10 million contacts per year.

We Invest in You! At Walmart, we focus on the growth and development of our associates. We are a highly engaged team that prides itself on exceeding customer expectations, building relationships, career progression, and providing individual and team recognition.

What you'll do... We are looking for career-minded, customer-centric individuals who are experienced in providing best-in-class customer service. As a Customer Care Senior Resolution Coordinator, you will take a high volume of incoming calls, chats, and emails from customers, stores, and associates while navigating multiple systems to aid in answering questions and resolving issues. All Customer Care Coordinators must be able to communicate professionally in a conversational manner while utilizing all available resources to ensure customer satisfaction. To exceed our customers' needs, our associates must be punctual, reliable, able to problem-solve, act with integrity, and be dedicated to making a difference.

What you'll bring:
- 0-12 months of relevant customer service experience
- Excellent written and verbal communication skills
- Ability to interact professionally with customers
- Ability to manage multiple tasks simultaneously
- Customer-focused mindset with a high level of urgency; a role model for delivering Extraordinary Customer Care
- Flexibility to switch between any support channel (phone, chat, and email) based on business requirements
- Review, analyze, and process critical customer queries with accuracy to provide customer satisfaction
- Adherence to quality and compliance guidelines and SLAs
- Willingness to take continuous voice calls
- Typing speed of at least 25 WPM
- Proficiency with Microsoft Office programs (Outlook, Word)
- Successful completion of mandatory training
- Flexibility to work in a 24/7 environment with rotating weekly time off
- Ability to work permanent night shifts or any assigned shifts on a rotational basis
- Any graduation

About Walmart Global Tech / Benefits / Equal Opportunity Employer: Belonging at Walmart. At Walmart, our vision is "everyone included." By fostering a workplace culture where everyone is, and feels, included, everyone wins. Approximately 90% of the U.S. population lives within 10 miles of a Walmart or Sam's Club; our associates and customers reflect the makeup of all of America, as well as the 18 other countries where we operate. By making Walmart a welcoming place where all people feel like they belong, we're able to engage associates, strengthen our business, improve our ability to serve customers, and support the communities where we operate.

Belonging: We aim to create a culture where every associate feels valued for who they are, rooted in respect for the individual.

Associates: We want to ensure our associates worldwide are seen for their unique contributions, supported in their daily work, and connected to co-workers. Walmart is the largest private employer in the U.S. Our policies, practices, and programs promote fairness and the same treatment for all associates. Everyone in our workforce has the same access to opportunities for growth, development, and advancement. We transparently report on our workforce twice a year, and we have associate resource groups to further engagement, networking, connection, and a sense of community.

Business & Customers: We provide an assortment of products and services that meet the unique needs of our customers and members while strengthening our connection to the communities we serve. We operate sensory-friendly hours in all stores from 8 a.m. to 10 a.m. daily and offer Caroline's Carts, a specially designed shopping cart for children and adults with disabilities. Our focus every day is how we can best serve our customers with quality food and goods at everyday low prices, which are 10-25% lower than those of competitors.

Communities: Walmart thrives when we take a shared-value approach, complementing business with philanthropy to strengthen the communities where we operate and prioritize issues that are meaningful to our business and all customers. Walmart is one of the most charitable companies in the Fortune 500. Last year we gave away over 8% of profits through a combination of in-kind and cash gifts totalling more than $1.7 billion.

You make sound judgments and promote an associate/candidate-focused environment. You optimize execution and results. You inspire commitment through communication and influence. You demonstrate adaptability while thinking and acting strategically. You build and sustain internal and external relationships. Flexible to work US-hours shifts.

Minimum Qualifications... Preferred Qualifications... Basic computer processing/data entry software.

Primary Location: 3rd Floor, B Block, Tecci Park, 173, Old Mahabalipuram Road, Sholinganallur, India

Posted 2 weeks ago

Apply

3.0 - 4.0 years

5 - 6 Lacs

Bengaluru

Work from Office

We are seeking a skilled Python Data Engineer (with some overlap with developer skills) to join our dynamic team. In this role, you will be responsible for developing and maintaining preprocessing scripts for streaming data from a video OTT platform. You will work closely with our data engineering team to ensure seamless data integration and processing, and you will also be responsible for optimizing SQL queries for performance.

Key Responsibilities:
- Develop and maintain preprocessing scripts to handle streaming data from a video OTT platform.
- Retrieve data from data lakes or data warehouses, process it according to business needs, and load it back into the data warehouse.
- Utilize GitHub or similar repositories to manage code versions and collaborate with team members.
- Access servers directly to test and debug scripts, ensuring optimal performance.
- Analyse data within data warehouses using SQL to identify anomalies and prepare data for processing.
- Optimize SQL queries for faster execution and lower memory consumption.
- Implement and manage Redis queues or sets to control the flow of data processing tasks.

Required Knowledge and Skills:
- Proficiency in Python, with a strong understanding of object-oriented programming and familiarity with packages such as Pandas, NumPy, os, sys, datetime, json, and traceback.
- Expertise in exception handling, logging, file handling, and multiprocessing/multithreading.
- Strong knowledge of SQL for data analysis and processing.
- Strong knowledge of Spark with hands-on PySpark experience.
- Experience with Redis, Git, and Linux commands.
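The "Redis queues to control the flow of data processing tasks" responsibility usually means using a Redis list as a work queue (LPUSH to enqueue, RPOP/BRPOP to dequeue). A minimal sketch of that pattern follows; it is simulated with an in-memory deque so no Redis server is assumed, and the chunk names are invented for illustration.

```python
from collections import deque

class TaskQueue:
    """In-memory stand-in for a Redis list used as a work queue.
    With a real redis-py client this would be r.lpush(key, task)
    to enqueue and r.rpop(key) / r.brpop(key) to dequeue."""

    def __init__(self):
        self._q = deque()

    def enqueue(self, task):   # ~ LPUSH: push onto the left end
        self._q.appendleft(task)

    def dequeue(self):         # ~ RPOP: pop from the right end (FIFO overall)
        return self._q.pop() if self._q else None

queue = TaskQueue()
for chunk in ["chunk-001", "chunk-002", "chunk-003"]:
    queue.enqueue(chunk)

processed = []
while (task := queue.dequeue()) is not None:
    processed.append(task)  # a worker would parse/transform the chunk here

print(processed)  # FIFO order: ['chunk-001', 'chunk-002', 'chunk-003']
```

The left-push/right-pop pairing is what gives FIFO ordering; multiple worker processes popping from the same Redis key get a simple, atomic task-distribution mechanism.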

Posted 2 weeks ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Job Description: Business Operations Analyst (Non-Billable Contractor), CSS - OU Delivery Team

This position is part of the CSS OU Delivery Team, providing end-to-end support for (a) supply chain or procurement, (b) partner support, and (c) change management, working with the shared service centre. The successful candidate is expected to be flexible about time zones.

Location: Candidate should be based in Bangalore.

Scope:
- Works as part of the Oracle University (OU) delivery operations team
- Reports to the Operations Director
- Regularly provides status updates on progress of tasks/work
- Understands and analyses reports for business needs
- Collaborates with the Delivery & Partner Support team and other teams for business support
- Manages the end-to-end supplier change management role

Responsibilities:
- Maintain the integrity of all records, working with the internal operations teams.
- Create and manage reports, based on correct data and handled confidentially, at daily, weekly, and monthly frequencies.
- Maintain an excellent understanding of all tools and applications to ensure that all transactions are managed smoothly.
- Manage or escalate issues and queries connected to data collection and processing quickly, accurately, and professionally.
- Develop and maintain good working relationships with internal support groups (shared service centre) to ensure data processing is managed in a timely and accurate manner.
- Manage the supply chain flow end to end, in a timely fashion, with an adequate reporting system for audit.
- Operate in line with Oracle and OU business policies and procedures, ensuring sales adherence to business practices and compliance.
- Manage and maintain partner transactions end to end based on compliance and audit parameters, with reports thereof.

Profile:
- 3-5 years of experience in the essential functions - Essential
- Highly computer literate - Essential
- Excellent English verbal and written communication skills - Essential
- Advanced MS Excel skills - Essential
- Experience building analyses and dashboards in Oracle Business Intelligence (OBIEE) - Good to have
- Knowledge of SQL and experience running SQL queries in Oracle SQL Developer - Good to have
- Accuracy and attention to detail
- Excellent planning, organising, and presentation skills; ability to prioritize workload
- Team player, personal drive, professional attitude

Posted 2 weeks ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Bengaluru

Work from Office

Software Development Engineer II - Azure Data Engineering
Tesco India | Bengaluru, Karnataka, India | Full-Time | Permanent | Apply by 07-Jun-2025

About the role: We are looking for a skilled Data Engineer to join our team, working on end-to-end data engineering and data science use cases. The ideal candidate will have strong expertise in Python or Scala, Spark (Databricks), and SQL, building scalable and efficient data pipelines on Azure.

What is in it for you: At Tesco, we are committed to providing the best for you. As a result, our colleagues enjoy a unique, differentiated, market-competitive reward package, based on current industry practices, for all the work they put into serving our customers, communities, and planet a little better every day. Our Tesco Rewards framework consists of pillars - Fixed Pay, Incentives, and Benefits. Total Rewards offered at Tesco are determined by four principles: simple, fair, competitive, and sustainable.
- Salary - Your fixed pay is the guaranteed pay as per your contract of employment.
- Leave & Time-off - Colleagues are entitled to 30 days of leave (18 days of Earned Leave, 12 days of Casual/Sick Leave) and 10 national and festival holidays, as per the company's policy.
- Making Retirement Tension-Free - In addition to statutory retirement benefits, Tesco enables colleagues to participate in voluntary programmes like NPS and VPF.
- Health is Wealth - Tesco promotes programmes that support a culture of health and wellness, including insurance for colleagues and their family. Our medical insurance provides coverage for dependents, including parents or in-laws.
- Mental Wellbeing - We offer mental health support through self-help tools, community groups, ally networks, face-to-face counselling, and more for both colleagues and dependents.
- Financial Wellbeing - Through our financial literacy partner, we offer one-to-one financial coaching at discounted rates, as well as salary advances on earned wages upon request.
- Save As You Earn (SAYE) - Our SAYE programme allows colleagues to transition from being employees to Tesco shareholders through a structured 3-year savings plan.
- Physical Wellbeing - Our green campus promotes physical wellbeing with facilities that include a cricket pitch, football field, and badminton and volleyball courts, along with indoor games, encouraging a healthier lifestyle.

You will be responsible for:
- Design, build, and maintain scalable ETL/ELT data pipelines using Azure Data Factory, Databricks, and Spark.
- Develop and optimize data workflows using SQL and Python or Scala for large-scale data processing and transformation.
- Implement performance tuning and optimization strategies for data pipelines and Spark jobs to ensure efficient data handling.
- Collaborate with data engineers to support feature engineering, model deployment, and end-to-end data engineering workflows.
- Ensure data quality and integrity by implementing validation, error-handling, and monitoring mechanisms.
- Work with structured and unstructured data using technologies such as Delta Lake and Parquet within a Big Data ecosystem.
- Contribute to MLOps practices, including integrating ML pipelines, managing model versioning, and supporting CI/CD processes.

You will need:
Primary Skills:
- Data Engineering & Cloud: Proficiency in the Azure data platform (Data Factory, Databricks). Strong skills in SQL and Python or Scala for data manipulation. Experience with ETL/ELT pipelines and data transformations. Familiarity with Big Data technologies (Spark, Delta Lake, Parquet).
- Data Optimization & Performance: Expertise in data pipeline optimization and performance tuning. Experience with feature engineering and model deployment.
- Analytical & Problem-Solving: Strong troubleshooting and problem-solving skills. Experience with data quality checks and validation.
Nice-to-Have Skills:
- Exposure to NLP, time-series forecasting, and anomaly detection.
- Familiarity with data governance frameworks and compliance practices.
- Basics of AI/ML, such as ML & MLOps integration: experience supporting ML pipelines with efficient data workflows; knowledge of MLOps practices (CI/CD, model monitoring, versioning).

About us: Tesco in Bengaluru is a multi-disciplinary team serving our customers, communities, and planet a little better every day across markets. Our goal is to create a sustainable competitive advantage for Tesco by standardising processes, delivering cost savings, enabling agility through technological solutions, and empowering our colleagues to do even more for our customers. With cross-functional expertise, a wide network of teams, and strong governance, we reduce complexity, thereby offering high-quality services for our customers. Tesco in Bengaluru, established in 2004 to enable standardisation and build centralised capabilities and competencies, makes the experience better for our millions of customers worldwide and simpler for over 3,30,000 colleagues.

Tesco Technology: Today, our Technology team consists of over 5,000 experts spread across the UK, Poland, Hungary, the Czech Republic, and India. In India, our Technology division includes teams dedicated to Engineering, Product, Programme, Service Desk and Operations, Systems Engineering, Security & Capability, Data Science, and other roles. At Tesco, our retail platform comprises a wide array of capabilities, value propositions, and products, essential for crafting exceptional retail experiences for our customers and colleagues across all channels and markets. This platform encompasses all aspects of our operations, from identifying and authenticating customers, managing products, pricing, and promoting, to enabling customers to discover products, facilitating payment, and ensuring delivery. By developing a comprehensive retail platform, we ensure that as customer touchpoints and devices evolve, we can consistently deliver seamless experiences. This adaptability allows us to respond flexibly without the need to overhaul our technology, thanks to the capabilities we have built.
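The "ensure data quality and integrity by implementing validation" responsibility above typically starts with per-record checks before data enters a pipeline. A minimal, hedged sketch follows; the column names (`order_id`, `quantity`) are invented for illustration, and a real Azure/Databricks pipeline would express equivalent rules as Spark expectations or schema constraints.

```python
def validate_record(rec: dict) -> list:
    """Return a list of data-quality errors for one record (empty = clean)."""
    errors = []
    if not rec.get("order_id"):
        errors.append("missing order_id")
    qty = rec.get("quantity")
    if not isinstance(qty, int) or qty <= 0:
        errors.append("quantity must be a positive integer")
    return errors

rows = [
    {"order_id": "A1", "quantity": 3},   # clean
    {"order_id": "", "quantity": 0},     # two violations
]
clean = [r for r in rows if not validate_record(r)]
rejected = [r for r in rows if validate_record(r)]
print(len(clean), len(rejected))  # 1 1
```

Routing rejected rows to a quarantine table rather than dropping them silently is the usual monitoring-friendly choice.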

Posted 2 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Grow with us

About this opportunity: Ericsson is seeking a dedicated and skilled CI/CD Engineer to develop and maintain our CI/CD (Continuous Integration and Continuous Delivery) toolkits. This role serves a critical function, enabling our product and engineering teams to operate effectively within a CI/CD pipeline. You will also support teams in deploying and using the CI/CD toolchain. Familiarity with Agile methodology and principles is vital in this role to foster a productive and efficient work environment.

What you will do:
- Develop and fine-tune CI/CD toolkits for use by our product and engineering teams.
- Build and manage products/services within the CI/CD pipeline, tailored to user needs and our target architecture.
- Craft multistage YAML pipelines for the build and deploy process, ensuring a consistent and repeatable procedure.
- Guide and implement branching and merging strategies for concurrent development.
- Leverage tools like Terraform to automate infrastructure provisioning.
- Contribute to the refinement of our DevOps processes while exploring new technologies to spur innovation and boost development productivity.
- Document configurations to foster self-service capabilities among teams.
- Actively participate in CI/CD Communities of Practice/Interest, continuously learn on the job, and experiment with new technology.
- Support onboarding of new team members by sharing knowledge and providing guidance.

The skills you bring (in order of importance):
1. Ansible automation - robust understanding of the framework; experience structuring automation code; building your own collections and roles with inter-dependencies; writing plugins (inventory, action, lookup, etc.) and modules
2. Python development - writing scripts and class libraries for system administration and data processing; interacting with REST APIs; handling embedded documentation; JSON, YAML, Jinja2, RegExp
3. Working Linux/Unix platform experience (preferably Rocky / Red Hat Linux) - command shell; SSH config; managing files and configs; SELinux; firewalld; logging; troubleshooting; installation package build and management
4. Git and GitLab - good working knowledge of Git as the day-to-day SCM; experience with GitLab (alternatively GitHub) for code handling and CI pipelines
5. Terraform / Kubernetes - hands-on experience working with and implementing Terraform and Kubernetes is an added advantage

Why join Ericsson? What happens once you apply? We are committed to providing reasonable accommodations to all individuals participating in the application and interview process. If you need assistance or need to request an accommodation due to a disability, please reach out via Contact Us.

We are proud to announce that Ericsson India is ranked 19th among all 50 countries and is once again officially Great Place to Work Certified in 2024. Every year, more than 10,000 organizations from over 60 countries partner with the Great Place to Work Institute for assessment, benchmarking, and planning actions to strengthen their workplace culture; this certification acknowledges that our employees value their employee experience and our workplace culture.

Primary country and city: India (IN) || Bangalore
Req ID: 767916

Posted 2 weeks ago

Apply

5.0 - 6.0 years

7 - 8 Lacs

Bengaluru

Work from Office

Data Scientist | 2.5-6 Years | Bengaluru | data science, NLP

Role - Data/Applied Scientist (Search/Recommendation)
Experience - 2.5 to 6 years
Location - Bangalore

Strong in Python, with experience in Jupyter notebooks and Python packages like polars, pandas, numpy, scikit-learn, matplotlib, etc.
Must have: Experience with the machine learning lifecycle, including data preparation, training, evaluation, and deployment
Must have: Hands-on experience with GCP services for ML & data science
Must have: Deep understanding of modern recommendation systems, including two-tower, multi-tower, and cross-encoder architectures
Must have: Hands-on experience with deep learning for recommender systems using TensorFlow, Keras, or PyTorch
Must have: Experience generating and using text and image embeddings (e.g., CLIP, ViT, BERT, Sentence Transformers) for content-based recommendations
Must have: Experience with semantic similarity search and vector retrieval for matching user-item representations
Must have: Proficiency in building embedding-based retrieval models, ANN search, and re-ranking strategies
Must have: Experience with vector search and hybrid search techniques
Must have: Experience with embedding generation using models like BERT, Sentence Transformers, or custom models
Must have: Experience in embedding indexing and retrieval (e.g., Elastic, FAISS, ScaNN, Annoy)
Must have: Experience with LLMs and use cases like RAG (Retrieval-Augmented Generation)
Must have: Understanding of semantic vs. lexical search paradigms
Must have: Experience with Learning to Rank (LTR) techniques and libraries (e.g., XGBoost, LightGBM with LTR support)
Must have: Awareness of evaluation metrics for search relevance (e.g., precision@k, recall, nDCG, MRR)
Should be proficient in SQL and BigQuery for analytics and feature generation
Should have experience with Dataproc clusters for distributed data processing using Apache Spark or PySpark
Should have experience deploying models and services using Vertex AI, Cloud Run, or Cloud Functions
Should be comfortable working with BM25 ranking (via Elasticsearch or OpenSearch) and blending it with vector-based approaches
Should understand how to build end-to-end ML pipelines for search and ranking applications
Should have exposure to CI/CD pipelines and model versioning practices
Good to have: Familiarity with Vertex AI Matching Engine for scalable vector retrieval
Good to have: Familiarity with TensorFlow Hub, Hugging Face, or other model repositories
Good to have: Experience with prompt engineering, context windowing, and embedding optimization for LLM-based systems

GCP Tools Experience:
ML & AI: Vertex AI, Vertex AI Matching Engine, AutoML, AI Platform
Storage: BigQuery, Cloud Storage, Firestore
Ingestion: Pub/Sub, Cloud Functions, Cloud Run
Search: vector databases (e.g., Matching Engine, Qdrant on GKE), Elasticsearch/OpenSearch
Compute: Cloud Run, Cloud Functions, Vertex Pipelines, Cloud Dataproc (Spark/PySpark)
CI/CD & IaC: GitLab/GitHub Actions
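The "semantic similarity search and vector retrieval" requirement above boils down to scoring a query embedding against item embeddings, usually by cosine similarity, and ranking by that score. A minimal brute-force sketch follows; the toy 3-dimensional vectors and item names are invented (real embeddings from BERT/CLIP would have hundreds of dimensions, served via FAISS/ScaNN rather than a linear scan).

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy item embeddings; in practice these would come from an encoder model.
items = {
    "red shoes":   [0.9, 0.1, 0.0],
    "blue jeans":  [0.1, 0.9, 0.1],
    "red sandals": [0.8, 0.2, 0.1],
}
query = [1.0, 0.0, 0.0]  # embedding of the user's query

# Brute-force retrieval: rank all items by similarity to the query.
ranked = sorted(items, key=lambda k: cosine(query, items[k]), reverse=True)
print(ranked[0])  # 'red shoes'
```

ANN libraries (FAISS, ScaNN, Annoy) replace the `sorted` scan with an approximate index so the same ranking idea scales to millions of items; a re-ranking stage (e.g., a cross-encoder or LTR model) would then reorder the top-k candidates.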

Posted 2 weeks ago

Apply

3.0 - 4.0 years

5 - 6 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Technical Skills: Very good knowledge of Python, with experience working on data-related projects. Extensive use of Python libraries like Pandas, SQLAlchemy, NumPy, and more as part of data processing design and code. Experienced in connecting to databases, preferably AWS Redshift, to read data, then process and clean/enrich the data to improve its quality and value. Highly skilled in SQL for querying several databases, with excellent tuning skills.

Analytical Skills: Ability to review data in alignment with the business process; identify data that is not in line with expectations and take a logic-driven approach based on the data/pattern. Should be able to review data, identify gaps, and find outliers that impact the data. Excellent skills in data exploration and assessment of data based on data types, categories, and distributions. Ability to spot inconsistencies, inaccuracies, missing values, and outliers, and to come up with a plan for improvement.
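The outlier-spotting skill described above is often implemented with the interquartile-range (IQR) rule: flag values outside [Q1 - 1.5·IQR, Q3 + 1.5·IQR]. A minimal standard-library sketch, with invented sample data:

```python
import statistics

def iqr_outliers(values):
    """Return values outside [Q1 - 1.5*IQR, Q3 + 1.5*IQR]."""
    q1, _, q3 = statistics.quantiles(values, n=4)  # quartiles
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return [v for v in values if v < lo or v > hi]

data = [10, 12, 11, 13, 12, 11, 95]  # 95 is the obvious anomaly
print(iqr_outliers(data))  # [95]
```

On real project data the same rule is typically applied per column with Pandas (`quantile(0.25)`/`quantile(0.75)`), and flagged rows are reviewed against the business process rather than dropped automatically.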

Posted 2 weeks ago

Apply

12.0 - 14.0 years

35 - 40 Lacs

Bengaluru

Work from Office

Position Summary: Experienced Senior Data Engineer utilizing Big Data & Google Cloud technologies to develop large-scale, on-cloud data processing pipelines and data warehouses.

What you'll do:
- Consult customers across the world on their data engineering needs around Adobe's Customer Data Platform.
- Support pre-sales discussions around complex, large-scale cloud data engineering solutions.
- Design custom solutions on cloud, integrating Adobe's solutions in a scalable and performant manner.
- Deliver complex, large-scale, enterprise-grade on-cloud data engineering and integration solutions in a hands-on manner.

Good to have: Experience consulting with customers in India. Multi-cloud expertise, preferably AWS and GCP.

EXPERIENCE: 12-14 Years
SKILLS: Primary Skill: Data Engineering | Sub Skill(s): Data Engineering | Additional Skill(s): Python, BigQuery

Posted 2 weeks ago

Apply

12.0 - 14.0 years

35 - 40 Lacs

Bengaluru

Work from Office

Key Responsibilities:
- Design and implement scalable, reliable, and high-performance data architectures to support business needs.
- Develop and maintain real-time data streaming solutions using Kafka and other streaming technologies.
- Utilize AWS cloud services to build and manage data infrastructure, ensuring security, performance, and cost optimization.
- Create efficient and optimized data models for structured and unstructured datasets.
- Develop, optimize, and maintain SQL queries for data processing, analysis, and reporting.
- Work with cross-functional teams to define data requirements and implement solutions that align with business goals.
- Implement ETL/ELT pipelines using Python and other relevant tools.
- Ensure data quality, consistency, and governance across the organization.
- Troubleshoot and resolve issues related to data pipelines and infrastructure.

Required Skills and Qualifications:
- Experience in data engineering and architecture.
- Proficiency in Python for data processing and automation.
- Strong expertise in AWS (S3, Redshift, Glue, Lambda, EMR, etc.) for cloud-based data solutions.
- Hands-on experience with Kafka for real-time data streaming.
- Deep understanding of data modeling principles for transactional and analytical workloads.
- Strong knowledge of SQL for querying and performance optimization.
- Experience in building and maintaining ETL/ELT pipelines.
- Familiarity with big data technologies like Spark, Hadoop, or Snowflake is a plus.
- Strong problem-solving skills and ability to work in a fast-paced environment.
- Excellent communication and stakeholder management skills.

EXPERIENCE: 12-14 Years
SKILLS: Primary Skill: Data Engineering | Sub Skill(s): Data Engineering | Additional Skill(s): Kafka, Python, Data Modeling, ETL, Data Architecture, SQL, Redshift, PySpark
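The "implement ETL/ELT pipelines using Python" responsibility can be sketched as three composable stages. This is a hedged illustration: the record shape is invented, and plain Python lists stand in for the real source and sink (a Kafka topic or S3 bucket on the extract side, Redshift on the load side).

```python
def extract(source):
    """Read raw events (stand-in for a Kafka consumer or S3 reader)."""
    yield from source

def transform(events):
    """Normalise fields and filter out unusable records."""
    for e in events:
        if e.get("amount") is not None:
            yield {"user": e["user"].lower(),
                   "amount": round(float(e["amount"]), 2)}

def load(records, sink):
    """Append to the sink (stand-in for a warehouse write / COPY)."""
    sink.extend(records)
    return len(sink)

raw = [{"user": "Alice", "amount": "10.5"},
       {"user": "Bob", "amount": None}]       # dropped by transform
warehouse = []
load(transform(extract(raw)), warehouse)
print(warehouse)  # [{'user': 'alice', 'amount': 10.5}]
```

Using generators for extract/transform keeps memory flat on large streams; the same staged shape maps directly onto Glue jobs or Lambda-triggered steps when moved to AWS.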

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Job Summary: We are looking for an experienced Data Engineer specializing in SQL Server and SSIS (SQL Server Integration Services) to join our development team. In this role, you will design, develop, and optimize complex database solutions and ETL (Extract, Transform, Load) processes using SQL Server and SSIS. You will work closely with other developers and analysts to ensure efficient and reliable data processing across our systems. The ideal candidate will have a deep understanding of database design and performance tuning, and the ability to develop and maintain SSIS packages for data integration and transformation. Strong problem-solving skills, attention to detail, and the ability to work both independently and as part of a team are essential for success in this role.

Role and Responsibilities:
- Design, develop, and maintain SQL Server databases and SSIS packages
- Create and optimize complex SQL queries, stored procedures, and triggers
- Develop ETL processes to integrate data from multiple sources into our data warehouse
- Troubleshoot and resolve database performance issues and data integrity problems
- Ensure data security and compliance with relevant regulations
- Participate in code reviews and contribute to best practices for database development

Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field
- 5 years of proven experience as a Data Engineer with a focus on SQL Server and SSIS
- Expertise in designing and developing SSIS packages for ETL processes
- Strong knowledge of T-SQL, including stored procedures, functions, and query optimization
- Experience with data warehousing and star-schema data models
- Familiarity with version control systems and CI/CD pipelines
- Excellent problem-solving abilities and attention to detail
- Strong communication and teamwork skills
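The staging-then-load pattern behind the SSIS ETL processes described above can be sketched in a few lines. This is an illustration only: Python's built-in sqlite3 stands in for SQL Server (an actual job would use SSIS data flows and T-SQL), and the table and column names are invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Staging and target tables (a real job would define these in SQL Server).
cur.execute("CREATE TABLE staging (id INTEGER, amount REAL)")
cur.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, amount REAL)")

# Extract: raw rows land in staging (the SSIS data-flow equivalent),
# including a NULL amount and a duplicate.
cur.executemany("INSERT INTO staging VALUES (?, ?)",
                [(1, 10.0), (2, None), (1, 10.0)])

# Transform + load: deduplicate and drop NULL amounts in one set-based step.
cur.execute("""
    INSERT INTO sales (id, amount)
    SELECT DISTINCT id, amount FROM staging
    WHERE amount IS NOT NULL
""")
conn.commit()

rows = cur.execute("SELECT id, amount FROM sales").fetchall()
print(rows)  # [(1, 10.0)]
```

Doing the cleanup as one set-based INSERT...SELECT, rather than row-by-row in application code, is the same performance principle that query optimization in T-SQL rewards.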

Posted 2 weeks ago

Apply

8.0 - 10.0 years

25 - 30 Lacs

Pune

Work from Office


Working at Freudenberg: We will wow your world!

Responsibilities:
Manage manufacturing production processes and implement company policies; ensure manufacturing process performance, volumes, quality goals, and KRAs are met by utilizing efficient methods. Lead the technical part of capital investment in processes to optimize cost without deviation in quality and deliveries; design and facilitate procurement of jigs, fixtures, and tooling. Initiate and implement lean practices such as Poka-yoke, 3P, and SMED. Create and control internal drawings as per specifications in customer drawings. Optimize the existing layout of the plant facility. Perform periodic BOM mass updates in the ERP as per ECNs, manufacturing process improvements, and VAVE activities, and update routing run times as per cycle time studies of existing products and new product integration. Ensure the safety of all employees working in the plant.

Qualifications:
Graduate in Engineering (Mechanical, Industrial, or Production); a master's or management degree will be an additional benefit. 8 to 10 years of experience in a manufacturing setup within a manufacturing engineering department. Experience with periodic BOM mass updates in ERP as per ECN / manufacturing process improvements and VAVE activities. Advanced knowledge of computing, especially big data processing. Working proficiency in advanced Microsoft Office. Good English communication skills and a very high level of interpersonal skill.

Freudenberg Filtration Technologies India Private Limited

Posted 2 weeks ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Bengaluru

Work from Office


JD:
1) Must have 8+ years of experience in Python/Java, Spark, Scala, Hive, and Microsoft Azure Cloud Services (Databricks platform and developer tools)
2) 4 to 8 years of experience in Data Warehouse, ETL, Snowflake, and Report Testing
3) Strong in writing SQL scripts, with database knowledge of Oracle, SQL Server, and Snowflake
4) Hands-on working experience with any of the ETL tools, preferably Informatica, and with report/dashboard tools
5) Ability to work independently and in the 12:30 pm to 9:30 pm shift

Good to have:
Data Processing: Ability to build optimized and clean ETL pipelines using Databricks workflows, Scala, Python, Spark, and SQL
Testing and Deployment: Preparing pipelines for deployment
Data Modeling: Knowledge of general data modeling concepts to model data into a lakehouse; building custom utilities using Python for ETL automation
Experience working in an Agile (Scrum) environment and usage of tools like JIRA

Must haves:
Databricks: 4/5
PySpark with Scala: 4/5
SQL: 3/5

Posted 2 weeks ago

Apply

10.0 - 15.0 years

30 - 35 Lacs

Bengaluru

Work from Office


About Company: Azure Data Engineer

About the Role: We are seeking an experienced Azure Data Engineer to join our team onsite in Bangalore. The ideal candidate will have a strong background in Azure data engineering with expertise in handling large-scale, complex datasets and modern data platforms. This role offers an exciting opportunity to work on cutting-edge projects within a highly collaborative and innovative team. Our team's culture emphasizes continuous learning, knowledge sharing, and leveraging the latest technologies to solve real-world business problems. You will be involved in projects related to advanced analytics, data integration, and real-time data processing that have a significant impact on our business.

Key Responsibilities:
Design, develop, and maintain data solutions on Azure cloud platforms. Work extensively with data lakes, data warehouses, and Snowflake (preferred). Assemble and integrate large, complex datasets meeting functional and non-functional business requirements. Model and orchestrate complex finance-related data workflows. Perform performance tuning, optimization, bottleneck analysis, and troubleshooting in ambiguous environments. Develop and maintain CI/CD pipelines and data integrity checks. Implement and monitor best practices in dev frameworks and cloud systems. Work with Kafka for data streaming and real-time data processing. Collaborate with cross-functional teams to deliver high-quality data solutions.

Required Skills & Experience:
10+ years of hands-on experience in Data Engineering with a strong focus on Azure. Strong understanding of modern data platforms including data lakes and data warehouses; Snowflake experience preferred. Proficiency in SQL, Python, PowerShell, and JavaScript. Experience with cloud-based data platforms, especially Azure and Snowflake. Expertise in large-volume data processing; retail industry experience is a plus. Familiarity with Kafka technologies and real-time data streaming.
Experience building CI/CD pipelines and implementing data integrity checks. Ability to troubleshoot and optimize performance in complex environments. Strong analytical, problem-solving, and communication skills.
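One common way to implement data integrity checks of the kind this listing mentions is to compare a row count plus an order-independent checksum between source and target tables. This is a generic sketch, not a specific product feature; the sample rows are hypothetical:

```python
import hashlib

def table_fingerprint(rows):
    """Order-independent fingerprint of a table: hash each row, XOR the
    digests, and pair the result with the row count. Comparing fingerprints
    between source and target catches dropped, duplicated, or altered rows
    without having to sort either side."""
    acc = 0
    for row in rows:
        digest = hashlib.sha256(repr(row).encode()).digest()
        acc ^= int.from_bytes(digest, "big")
    return len(rows), acc

source = [("A1", 10.5), ("A2", 7.25), ("A3", 3.0)]
target = [("A3", 3.0), ("A1", 10.5), ("A2", 7.25)]  # same rows, different order
print(table_fingerprint(source) == table_fingerprint(target))
```

Against real systems the rows would come from a `SELECT`, and the check would run as a post-load pipeline step that fails the deployment on mismatch.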

Posted 2 weeks ago

Apply

12.0 - 18.0 years

30 - 37 Lacs

Hyderabad

Work from Office


Develop and execute the engineering strategy that aligns with the company's vision, goals, and business objectives. Collaborate with executive leadership to shape the product roadmap and ensure that engineering efforts are in sync with business priorities. Drive innovation within the engineering team, identifying emerging technologies and trends that can create competitive advantages.

Customer Trust & Success:
Champion customer-centric development practices, ensuring that all engineering efforts are focused on delivering value and building trust with customers. Collaborate with customer success, product, and sales teams to understand customer needs and feedback, and translate them into actionable engineering strategies. Ensure that engineering teams are equipped to deliver reliable, secure, and scalable products that instill confidence in our customers.

Technical Leadership & Operations:
Cloud & Infrastructure Management: Design and implement robust system and network architectures utilizing AWS and GCP to build scalable, reliable cloud solutions. Deploy and manage applications on Kubernetes, ensuring optimal performance and scalability. Handle traffic routing with Ingress Controllers (Nginx), oversee certificate management using cert-manager, and manage secrets with Sealed Secrets and Vault. Enhance application performance with caching solutions like Redis and Memcache, and implement comprehensive logging and tracing systems using Loki, Promtail, Tempo, and OpenTelemetry (Otel). Establish and maintain monitoring and alerting systems with Grafana, Prometheus, and Blackbox Exporter. Manage Infrastructure as Code using Terraform, oversee manifest management with GitLab, and lead release management workflows using GitLab and ArgoCD.

Application & Data Management: Manage authentication and authorization services using Keycloak and implement event streaming solutions with Kafka and Pulsar.
Oversee database management and optimization utilizing tools such as PgBouncer, Milvus, OpenSearch, and ClickHouse. Implement and manage distributed and real-time systems with Temporal. Leverage advanced data processing tools like Trino, Apache Superset, Livy, and Hive to meet specialized data-specific requirements.

Machine Learning Integration: Collaborate with data scientists to integrate and host machine learning models within applications, implementing MLOps practices to streamline the deployment, monitoring, and management of ML models in production. Utilize tools such as TensorFlow Extended (TFX), Kubeflow, MLflow, or SageMaker for comprehensive ML lifecycle management, ensuring robust model versioning, experimentation, and reproducibility, and optimizing ML pipelines for performance, scalability, and efficiency.

Project Management: Oversee project timelines, deliverables, and resource allocation. Coordinate with cross-functional teams to align on project goals and deliverables. Ensure timely and high-quality delivery of software products.

Requirements / Qualifications:
Education & Experience: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Proven experience (12+ years) in software engineering, with a strong focus on B2B SaaS applications. At least 5 years of experience in a senior leadership role, preferably at the VP level.

Strategic & Technical Skills: Demonstrated ability to develop and execute engineering strategies that align with business goals. Expertise in full stack development, cloud platforms (AWS, GCP), and Kubernetes. Strong experience with infrastructure management, MLOps, and integrating machine learning models. Ability to translate customer needs into technical requirements and ensure the delivery of high-quality products.

Leadership & Soft Skills: Visionary leadership with the ability to inspire and guide large engineering teams.
Strong business acumen with the ability to align technical efforts with business objectives. Excellent communication and interpersonal skills, with a focus on building strong cross-functional relationships. Proven track record of fostering customer trust and delivering products that drive customer success.

Posted 2 weeks ago

Apply

10.0 - 15.0 years

10 - 14 Lacs

Bengaluru

Work from Office


Job Summary
What you need to know about the role: As part of the Merchant Reporting & Insights team at PayPal, you will lead the development of mission-critical systems that ingest, orchestrate, and analyze massive volumes of data. These systems generate millions of high-volume reports, transforming complex fintech data into actionable business insights. You'll be responsible for ensuring timely delivery of reports aligned with merchant SLAs and business cadences.

Meet our team: The Merchant Reporting & Insights team is at the heart of PayPal's data-driven decision-making. We build scalable data platforms and reporting tools that empower merchants with real-time insights. You'll collaborate with cross-functional teams across engineering, product, and customers to deliver impactful solutions that drive business outcomes.

Your way to impact: You will lead the design and development of scalable, enterprise-grade data solutions that power critical reporting for merchants globally. Your leadership will drive innovation in big data and real-time processing, ensuring high-throughput systems meet business needs. You'll foster a culture of technical excellence, collaboration, and continuous improvement.

Job Description
Your Day-to-Day: Provide technical leadership and guidance to teams of software engineers, fostering a culture of collaboration, innovation, and continuous improvement. Establish outcomes and key results (OKRs) and successfully deliver them. Drive improvements in key performance indicators (KPIs). Increase the productivity and velocity of delivery teams. Develop, plan, and execute engineering roadmaps that bring value and quality to our customers. Collaborate and coordinate across teams and functions to ensure technical, product, and business objectives are met. Instill end-to-end ownership of products, projects, features, modules, and services that you and your team deliver in all phases of the software development lifecycle.
What do you need to bring:
10+ years of experience in the software industry, with 3+ years of professional experience leading software development teams. Strong critical thinking and problem-solving skills with the ability to address complex technical and non-technical challenges. Experience building and developing engineering teams that exhibit strong ownership, user empathy, and engineering excellence. Proven track record of delivering high-quality systems and software in big data technologies including Spark, Airflow, Hive, etc., with practical exposure to integrating machine learning workflows into data pipelines. Proven track record of delivering high-quality systems and software in Java/J2EE technologies and distributed systems, with experience deploying ML models into production at scale using REST APIs, streaming platforms, or batch inference. Excellent communication skills with the ability to collaborate effectively with cross-functional teams (including data scientists and ML engineers) and manage stakeholders' expectations. Ability to coach and mentor talent to reach their full potential, including guiding teams in adopting MLOps best practices and understanding AI model lifecycle management. Experience in building large-scale, high-throughput, low-latency systems, including real-time data processing systems that support personalization, anomaly detection, or predictive analytics. Strong understanding of software development methodologies, modern technology topics and frameworks, and developer operations best practices. Experience with ML platforms (e.g., Kubeflow, MLflow) and familiarity with model monitoring, feature engineering, and data versioning tools is a plus. Provide leadership to others, particularly junior engineers who work on the same team or related product features. Proven experience delivering complex software projects and solutions effectively through Agile methodologies on a regular release cadence.
Strong verbal and written communication skills. Strong customer focus, ownership, urgency and drive.

Posted 2 weeks ago

Apply

0.0 - 2.0 years

3 - 7 Lacs

Mumbai

Work from Office


Our Global Nielsen Media Campaign Analytics Research team works collaboratively to deliver actionable recommendations that help clients win in the marketplace. Focused on market impact and business growth, we're at the forefront of customer experience as we navigate the complex needs of our industry.

Qualifications:
MBA in Marketing or related field preferred. 0-2 years' experience in Marketing or Media research preferred. Knowledge of marketing and advertising a plus, ideally of digital ad unit types and the digital ad buying/selling ecosystem. Good understanding of survey methodology. Strong project management skills. Strong interpersonal skills required. Knowledge of SPSS, VBA, and the R scripting language a strong plus. Very strong quantitative, data tabulation, analytic thinking, and data mining skills. Excellent skills with Microsoft Office and the Google suite of products (especially Excel/Sheets, PowerPoint/Slides). Knowledge of relational databases a plus. Strong written and verbal communication skills in English. Strong time management skills. Ability to deliver under deadlines. Effective organizational skills and ability to multitask. Close attention to detail. Eager to learn and develop skills. Ability to work across time zones. Willingness to work in the 2:30 pm - 11:30 pm shift.

Responsibilities:
Create detailed research analyses focused on the effectiveness of advertising on a variety of media platforms using established test vs. control methodology. Decide on the correct analytic approach(es) to measure campaigns and evaluate question selection/wording. Work directly with clients from the study kickoff phase through to delivery. Perform strong quality assurance checks on poll grammar and tone, data collection during survey flight, and finalized reports. Investigate/raise questions when issues are discovered and proactively work to help find the root cause and resolve them. Work as part of a team to create research solutions for new product developments that would better serve our clients.
Design and implement brand impact surveys and analyze and interpret findings as necessary. Responsible for supporting survey-based primary research quote requests and project work (cost/feasibility requests, survey design, field management, and report slide creation): drafting proposals, pricing, and performing feasibility checks; drafting/editing survey questionnaires; online survey link checking; preparing the analysis plan (detailing table specifications); coordinating with the various teams (Programming, Data Processing, Open-End Coding, Translations, etc.); fieldwork monitoring/communication; sample performance and analysis; managing the sampling process; preparing the PowerPoint report template and participating in creating research reports; report population and quality checking; analysis and report writing, analyzing and summarizing the data to answer client questions and provide meaningful recommendations. Work on different tools: SPSS, Decipher, Primelingo/Scarborough database, data visualization tool (Displayr), etc. Notifies the project lead/manager of any problem/risk areas on a timely basis. Coordinates with multiple project members/teams for query solving and keeps track of project timelines. Responsible for the quality of deliverables, ensuring they are error-free. Guide clients in the interpretation of analytics results, partnering with the global client insights team to present results directly to agencies, advertisers, and media companies. Interact and partner with the global client insights team based out of the U.S. to ensure smooth delivery of projects. The Research Analyst will be part of a fast-paced team responsible for dealing directly with media companies and their agency/advertiser clients daily to understand an advertiser's campaign, advising on survey setup, and providing analysis on the campaign's performance, all while maintaining a high level of quality assurance throughout each step of the process.
As a Research Analyst on the Campaign Analytics team, you may execute brand impact surveys measuring ad effectiveness on media platforms of all kinds, from digital, to social, to streaming, to podcasting, and beyond. Additionally, the Research Analyst will assist in developing and performing deep-dive custom analyses under the guidance of research leads.
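At its simplest, the test-vs-control methodology these analyses rest on compares response rates between respondents exposed to a campaign and a matched control group. A sketch with hypothetical survey counts:

```python
def brand_lift(exposed_yes, exposed_n, control_yes, control_n):
    """Absolute brand lift in percentage points: the exposed (test) group's
    response rate minus the control group's response rate."""
    exposed_rate = exposed_yes / exposed_n
    control_rate = control_yes / control_n
    return round((exposed_rate - control_rate) * 100, 1)

# Hypothetical awareness question: 420 of 1,000 exposed respondents agreed,
# vs 350 of 1,000 control respondents who never saw the campaign.
print(brand_lift(exposed_yes=420, exposed_n=1000, control_yes=350, control_n=1000))  # 7.0
```

A real study would additionally test whether the lift is statistically significant and weight the groups so they are demographically comparable; this shows only the core calculation.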

Posted 2 weeks ago

Apply

1.0 - 3.0 years

4 - 8 Lacs

Kolkata, Mumbai, New Delhi

Work from Office


Responsibilities:
Data Analysis and Management: Conduct advanced spatial data analysis using GIS software (ArcGIS, QGIS) to derive meaningful insights. Manage, manipulate, and analyze large geospatial datasets to produce high-quality maps and actionable reports. Ensure data accuracy and integrity through rigorous quality control measures and regular audits.
Programming and Automation: Develop and implement Python scripts for data processing, analysis, and automation, with proficiency in SQL for querying and managing databases.
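As a small example of the Python-based spatial processing this role describes, the haversine formula gives the great-circle distance between two latitude/longitude points without needing a GIS package (the coordinates below are approximate city centers, used only for illustration):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points,
    via the haversine formula on a spherical Earth model."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))  # 6371 km: mean Earth radius

# Approximate Mumbai to Delhi distance
print(round(haversine_km(19.076, 72.8777, 28.7041, 77.1025)))
```

For production work a projected coordinate system or a geodesic library gives better accuracy, but this is the standard quick calculation for point-to-point distances.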

Posted 2 weeks ago

Apply

3.0 - 5.0 years

15 - 16 Lacs

Bengaluru

Work from Office


We are looking to hire Snowflake professionals in the following areas. Experience required: 3-5 years.

Snowflake Architecture and Administration: Demonstrate a deep understanding of Snowflake architecture, including compute and storage management. Illustrate the administrative tasks involved in managing Snowflake accounts, optimizing performance, and ensuring fail-safe operations.
Snowflake Utilities and Features: Hands-on experience with Snowflake utilities, such as SnowSQL, Snowpipe, and Time Travel, is essential. Proficiency in writing stored procedures, using Snowflake functions, and managing tasks will be required.
Data Engineering Expertise: Exhibit expertise in engineering platform components like data pipelines, data masking, data orchestration, data quality, data governance, and analytics within the Snowflake environment.
Load Operations and Performance: Review and implement Snowflake best practices for managing load operations and optimizing performance to ensure efficient data processing.
Data Security and Governance: Describe data governance in Snowflake, including the use of secure views and dynamic data masking features for column-level data security. Design and develop secure access to objects using Role-Based Access Control (RBAC).
Data Sharing and Replication: Utilize data replication for sharing data across accounts securely and managing failover scenarios.
Large-Scale Data Intelligence: Demonstrate hands-on experience in implementing large-scale data intelligence solutions around Snowflake data warehousing. Knowledge of scripting languages like Spark, PySpark, Snowpark, or SQL is highly desirable.
Performance Tuning: Implement advanced performance tuning techniques in Snowflake to optimize query performance and reduce data processing time.
Collaboration and Leadership: Work collaboratively with cross-functional teams to understand data requirements and contribute to the data engineering strategy.
Provide technical leadership and mentorship to junior team members.

Strategic Impact: Impacts the effectiveness of their own work team through the quality and timeliness of the work produced. Largely works within standardized procedures and practices to achieve objectives and meet deadlines, with some discretion in problem solving.
Scope of People Responsibility: Manages own workload effectively and efficiently. Expands technical contribution, encourages knowledge management, and promotes the cross-fertilization of ideas and information between teams. Mentors Snowflake and other developers to help ensure overall solutions are delivered in a timely manner. Provides training for members of the team and ensures team effectiveness. May provide informal guidance and support to colleagues with less experience.
Cooperation: Communicates difficult concepts and negotiates with others to adopt a different viewpoint. Fosters accountability throughout the team to uphold strong governance in the form of standards, methodology, and requirements. Demonstrates the ability to work as a team member on a project and can effectively work with a larger global team.

Candidate's Profile
Work Experience: Overall 3-5 years of experience in IT. Bachelor's or master's degree in Computer Science, Information Technology, or a related field. Extensive experience working with Snowflake data warehousing technology, including hands-on experience with various Snowflake utilities and features. Proficiency in SQL and scripting languages such as Spark, PySpark, Python, or Snowpark. Strong knowledge of data engineering concepts, data integration, and ETL processes. Familiarity with data governance principles, data security, and RBAC. Excellent understanding of data replication and data sharing across accounts. Proven experience in performance tuning and optimization of Snowflake queries. Exceptional problem-solving skills and ability to address complex data engineering challenges.
Excellent communication and leadership skills with a collaborative mindset. Ability to manage multiple accounts across the organization and ensure smooth operations. Knowledge of transaction and concurrency models, DDL operations, and DML considerations in Snowflake.

Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; and stable employment with a great atmosphere and ethical corporate culture.
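The column-level security the listing mentions (secure views and dynamic data masking) boils down to letting consumers query a masked projection rather than the base table. Snowflake's actual mechanism is `CREATE MASKING POLICY` attached to columns, and its syntax differs; this SQLite sketch only illustrates the engine-agnostic idea:

```python
import sqlite3

# Base table holds raw PII; consumers are granted only the masked view.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE customers (id INTEGER, email TEXT)")
cur.executemany("INSERT INTO customers VALUES (?, ?)",
                [(1, "a@example.com"), (2, "b@example.com")])

# Mask the local part of the email, keeping the domain for analytics.
cur.execute("""
    CREATE VIEW customers_masked AS
    SELECT id, '***' || substr(email, instr(email, '@')) AS email
    FROM customers
""")

cur.execute("SELECT email FROM customers_masked ORDER BY id")
print([row[0] for row in cur.fetchall()])
```

In Snowflake the same effect is role-aware: the masking policy can return the raw value to privileged roles and the masked value to everyone else, which a static view cannot do.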

Posted 2 weeks ago

Apply

2.0 - 7.0 years

10 - 11 Lacs

Hyderabad

Work from Office


We want Amazon.com to be the place where our customers can find, discover, and buy anything online! Whatever our customers tell us they want, we will find the means to deliver. In doing so, we will create the most customer-centric company in the universe, a company that customers from all over the globe will recognize, value, and trust for both our products and our service. With your help, Amazon.com will continue to enable people to discover new worlds and create change in a meaningful and lasting way. We are looking for experienced Support Engineers who can lead support activities for seller compliance, delivering high-quality software solutions that support a variety of customer use cases and scale to handle Amazon volume. The SE role in the EPR Pay-on-behalf team is responsible for working with tech and non-tech stakeholders to ensure smooth and on-time publication of EPR Reports, and on-time EPR Remit declarations for EU Sellers. The SE is expected to learn the technologies and use the tools required to perform data processing, such as Cradle, SQL, Ratchit, QuickSight, and AWS technologies like Lambda, SQS, SNS, DynamoDB, S3, and CloudWatch. Over the past 2 years, we have seen 50% YoY seller growth, and the SE should be invested in automating the manual processes associated with monthly cycles. The SE will own the process of driving discussions with PMT and other stakeholders to generate and maintain metrics, and will work with the SDM to build scripts and tools for test infrastructure and to troubleshoot reported bugs. In addition, the SE will maintain the pipelines to keep them healthy and drive other core support engineering initiatives.
2+ years of software development or 2+ years of technical support experience. Bachelor's degree in engineering or equivalent. Experience troubleshooting and debugging technical systems. Experience in Unix. Experience scripting in modern programming languages. Experience with REST web services, XML, and JSON. Experience with AWS, networks, and operating systems.

Posted 2 weeks ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Hyderabad

Work from Office


What you will do: In this vital role, we are looking for a highly motivated, expert Senior Data Engineer who can own the design and development of complex data pipelines, solutions, and frameworks. The ideal candidate will be responsible for designing, developing, and optimizing data pipelines, data integration frameworks, and metadata-driven architectures that enable seamless data access and analytics. This role requires deep expertise in big data processing, distributed computing, data modeling, and governance frameworks to support self-service analytics, AI-driven insights, and enterprise-wide data management.

Roles & Responsibilities:
Design, develop, and maintain scalable ETL/ELT pipelines to support structured, semi-structured, and unstructured data processing across the Enterprise Data Fabric. Implement real-time and batch data processing solutions, integrating data from multiple sources into a unified, governed data fabric architecture. Optimize big data processing frameworks using Apache Spark, Hadoop, or similar distributed computing technologies to ensure high availability and cost efficiency. Work with metadata management and data lineage tracking tools to enable enterprise-wide data discovery and governance. Ensure data security, compliance, and role-based access control (RBAC) across data environments. Optimize query performance, indexing strategies, partitioning, and caching for large-scale data sets. Develop CI/CD pipelines for automated data pipeline deployments, version control, and monitoring. Implement data virtualization techniques to provide seamless access to data across multiple storage systems. Collaborate with cross-functional teams, including data architects, business analysts, and DevOps teams, to align data engineering strategies with enterprise goals. Stay up to date with emerging data technologies and best practices, ensuring continuous improvement of the Enterprise Data Fabric architecture.
What we expect of you: We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
Master's degree and 4 to 6 years of Computer Science, IT, or related field experience OR Bachelor's degree and 6 to 8 years of Computer Science, IT, or related field experience. AWS Certified Data Engineer preferred. Databricks certification preferred. Scaled Agile SAFe certification preferred.

Preferred Qualifications:
Must-Have Skills: Hands-on experience in data engineering technologies such as Databricks, PySpark, Spark SQL, Apache Spark, AWS, Python, SQL, and Scaled Agile methodologies. Proficiency in workflow orchestration and performance tuning on big data processing. Strong understanding of AWS services. Experience with Data Fabric, Data Mesh, or similar enterprise-wide data architectures. Ability to quickly learn, adapt, and apply new technologies. Strong problem-solving and analytical skills. Excellent communication and collaboration skills. Experience with the Scaled Agile Framework (SAFe), Agile delivery practices, and DevOps practices.
Good-to-Have Skills: Deep expertise in the biotech and pharma industries. Experience in writing APIs to make data available to consumers. Experience with SQL/NoSQL databases and vector databases for large language models. Experience with data modeling and performance tuning for both OLAP and OLTP databases. Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.
Soft Skills: Excellent analytical and troubleshooting skills. Strong verbal and written communication skills. Ability to work effectively with global, virtual teams. High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Ability to learn quickly, be organized, and be detail-oriented. Strong presentation and public speaking skills.

Posted 2 weeks ago

Apply

1.0 - 5.0 years

5 - 6 Lacs

Bengaluru

Work from Office


Job Description: Role Responsibilities:
Data Collection & Cleaning: Collect and clean data from various internal and external sources to ensure accuracy and consistency.
Data Analysis: Analyze complex datasets to identify trends, patterns, and insights that inform business decisions.
Collaboration: Work closely with cross-functional teams to understand business needs and provide relevant data insights.
Data Interpretation: Provide actionable recommendations based on data analysis to guide business strategy and decision-making.
Data Management: Maintain and update databases, ensuring data integrity and accessibility.
Automation & Optimization: Develop automated systems for data processing and reporting to increase efficiency.

Required Skills & Qualifications:
Data Analysis & Statistical Techniques: Strong analytical skills with experience in data analysis, statistical methods, and modeling.
Programming Languages: Experience with SQL for database querying.
Data Management: Strong understanding of data cleaning, transformation, and database management practices.
Reporting & Documentation: Ability to write clear reports and document processes, methods, and findings.
Communication Skills: Strong written and verbal communication skills to present data findings to both technical and non-technical stakeholders.
Problem-Solving: Excellent problem-solving skills and the ability to think critically when analyzing data.

Recruitment fraud is a scheme in which fictitious job opportunities are offered to job seekers, typically through online services such as false websites, or through unsolicited emails claiming to be from the company. These emails may request recipients to provide personal information or to make payments as part of their illegitimate recruiting process.
DXC does not make offers of employment via social media networks and DXC never asks for any money or payments from applicants at any point in the recruitment process, nor ask a job seeker to purchase IT or other equipment on our behalf. More information on employment scams is available here .

Posted 2 weeks ago

Apply

0.0 - 1.0 years

2 Lacs

New Delhi, Gurugram, Delhi / NCR

Work from Office


Key Responsibilities:
- Verbal communication/articulate/polished: strong oral and written communication skills; must be able to communicate with clients through calls and emails.
- Flexible to work in night shifts (24x7 timings).
- Work ethic: accountability/ownership.
- Quick learner with the ability to deal with change management and adapt quickly.
- Strong reporting skills using MS Office applications (Word, Excel, and PowerPoint).
- Should be flexible in shifts, with the ability to prioritize and work on multiple tasks.
- Experience operating in a collaborative team environment.
Experience and Qualifications:
- Undergraduates will be considered (minimum qualification is higher secondary).

Posted 2 weeks ago


0.0 - 2.0 years

3 - 5 Lacs

Navi Mumbai

Work from Office


Roles & Responsibilities

The Analyst will work on back-office and middle-office processes for financial institutions, handling various stages of the client/product lifecycle across KYC, reference data management, legal docs, loans, portfolio reconciliation, document capture, system reconciliation, pre- and post-settlements, brokerage functions, drafting, trade support, corporate actions, tax operations, and more. Responsibilities also include data capture, cataloging, data processing, system inputs and updates, reconciliations, settlements, and fund transfers. The role involves preparing reports using MS Excel and may require external interaction with agents, counterparties, or clients to resolve process-related queries and discrepancies via phone or email.

Key responsibilities include:
- Identifying and escalating risks, and promptly reporting outstanding issues to clients.
- Performing various trade support activities across the trade lifecycle, such as trade confirmation matching, trade pre-settlements support, front-office to back-office reconciliation of trade positions, report generation, and settlement of cash flows from trading events (e.g., interest or premium).
- Handling operations of syndicated loans as well as corporate action setup and operations.
- Managing other capital market operational tasks beyond trade lifecycle support, including reference data support and regulatory reporting (Swaps Data Repository (SDR), Know Your Customer (KYC), and various front-office and back-office reconciliations).
- Learning and mastering various financial products, including equity securities and derivatives, interest rate swaps, FX spot, options, futures, credit derivative swaps, commodity derivatives, and fixed income products (e.g., corporate and treasury bonds).

Qualification and Skills
- Bachelor's degree (B.Com, BBA, BBM, BCA) / Master's degree (M.Com, MBA, PGDM).
- 0 to 2 years of experience in investment banking operations involving projects, people, process, and client management.
- Basic knowledge of finance, the trade lifecycle, investment banking, and derivatives.
- Strong logical and quantitative abilities to derive insights from data.
- Excellent time management skills and the ability to resolve issues promptly.
- Proficiency in planning and organizing.

Posted 2 weeks ago


0.0 - 4.0 years

1 - 4 Lacs

Hyderabad

Remote


Mainly responsible for customer service/data entry work. Freshers can also apply; graduates with experience are also welcome. Good knowledge of MS Excel, Word, PowerPoint, etc.

Required Candidate Profile
- Basic typing speed of 15 to 30 wpm.
- Basic computer knowledge (MS Office, MS Excel, etc.).
- Age criteria: 18 to 30 years.
- Qualification: HSC pass or above.

Posted 2 weeks ago
