0.0 - 4.0 years
1 - 4 Lacs
Chandigarh
Remote
Call Handling, Messaging: Answer inbound calls from potential job seekers, listen to their needs, and qualify them. Provide information on WhatsApp. Pass qualified leads to the recruitment team in a professional and timely manner. Work From Home
Posted 13 hours ago
0.0 - 4.0 years
1 - 5 Lacs
Bengaluru
Remote
Call Handling, Messaging: Answer inbound calls from potential job seekers, listen to their needs, and qualify them. Provide information on WhatsApp. Pass qualified leads to the recruitment team in a professional and timely manner. Work From Home
Posted 14 hours ago
0.0 - 4.0 years
2 - 5 Lacs
Hyderabad
Remote
Handle data entry for regulatory documents, MSDS, COAs, drug licenses, and audit records. Check for discrepancies in data and coordinate with relevant departments to resolve issues. Support the QA/QC team in maintaining GMP-compliant documentation.
Posted 14 hours ago
0.0 - 1.0 years
1 - 1 Lacs
Coimbatore
Work from Office
Responsibilities: * Operate scanner to digitize documents * Input data accurately into computer system * Ensure quality assurance of processed data * Maintain confidentiality of sensitive information. Work From Office only; call us on +91 7200986954. Performance bonus.
Posted 1 day ago
0.0 - 5.0 years
2 - 4 Lacs
Chennai, Coimbatore, Bengaluru
Hybrid
PERMANENT WORK FROM HOME. 2025 graduates can also apply. Urgent requirement for graduates and undergraduates for Data Entry. Salary: 10 to 35k take-home. Age: 18 to 35 years. Full Time. Easy selection process. Please apply for the job on Naukri.com; we will check and update you. Do not search the number on Google and do not call us. Note: the requirement is currently on hold from the client side.
Posted 1 day ago
0.0 - 5.0 years
2 - 4 Lacs
Vijayawada, Visakhapatnam, Hyderabad
Hybrid
PERMANENT WORK FROM HOME. 2025 graduates can also apply. Urgent requirement for graduates and undergraduates for Data Entry. Salary: 10 to 35k take-home. Age: 18 to 35 years. Full Time. Easy selection process. Please apply for the job on Naukri.com; we will check and update you. Do not search the number on Google and do not call us. Note: the requirement is currently on hold from the client side.
Posted 1 day ago
6.0 - 10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
This role is central to the development and deployment of intelligent AI systems that enhance the service value proposition, operational efficiency, and strategic decision-making. The individual will design, develop, train, and implement machine learning and deep learning models capable of processing large datasets and generating actionable insights. Collaboration with data scientists, software developers, and business leaders is essential to integrate these models into existing systems. The position requires a deep understanding of machine learning/NLP, deep learning, data processing, and generative AI to ensure that solutions are robust, scalable, and impactful. Responsibilities: Create and optimize models capable of generating high-quality content, such as text, images, audio, or video, based on specified parameters and datasets. Design, build, and optimize machine learning models and deep learning architectures to solve specific business challenges. Collect, clean, and preprocess large datasets to ensure high-quality data is used for training models. Train AI models on large datasets, tune hyperparameters, and evaluate model performance using appropriate metrics. Conduct experiments to test model accuracy, efficiency, and scalability. Deploy AI models into production environments, ensuring they integrate smoothly with existing software systems. Monitor, update, and refine the performance of deployed models. Stay up to date with the latest advancements in AI, machine learning, and related technologies. Communicate technical concepts, model results, and AI capabilities to non-technical stakeholders. Ensure AI models and solutions adhere to ethical guidelines. Enable and support fellow team members. Career Level - IC3
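As a hedged illustration of the train/tune/evaluate loop this role describes, here is a minimal scikit-learn sketch; the synthetic dataset, model choice, and parameter grid are assumptions for the example, not details from the posting.

```python
# Minimal train/tune/evaluate sketch (illustrative assumptions throughout).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import GridSearchCV, train_test_split

# Synthetic stand-in for a real business dataset.
X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Tune hyperparameters with cross-validation, then evaluate on held-out data.
search = GridSearchCV(
    RandomForestClassifier(random_state=42),
    param_grid={"n_estimators": [100, 300], "max_depth": [None, 10]},
    scoring="f1",
    cv=3,
)
search.fit(X_train, y_train)
print("Best params:", search.best_params_)
print(classification_report(y_test, search.best_estimator_.predict(X_test)))
```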
Posted 1 day ago
2.0 - 6.0 years
0 Lacs
Maharashtra
On-site
Role Overview: You will be responsible for providing documentation and credit-related services to the Business and Credit Control Unit to process disbursement of loan applications, as well as support in post-disbursement activities. Key Responsibilities: - Prepare Offer letters, Facility documents, and security documents (non-mortgage) - Conduct preliminary checking of filled documents (Facility and security documents) - Initiate and follow up with Legal and Valuation agencies for timely submission of reports - Perform miscellaneous activities related to disbursement depending on the product - Liaise with RMs and customers for various post-disbursement documents/exception tracking - Process and input data for vendor bills Qualification Required: - Experience in a similar role in back-office credit operations/documentation with another bank/NBFC is preferable - Graduate in the Commerce stream; post-graduate preferred - Basic knowledge of MS Office and the various systems used by Credit Services is an added advantage Additional Details: You will need to have thorough job knowledge of documentation and various corporate bank products. Organizational skills and the ability to meet conflicting deadlines are essential. Being proactive in understanding system implementations and changes is important. Effective verbal and written communication skills are necessary to liaise with various stakeholders including RMs, CCU, Legal Compliance, and Finance teams. Company Information: DBS Corporate and Investment Banking provides a full range of commercial banking products and services to corporate customers, fostering long-standing relationships based on account management, service differentiation, product development, and rigorous credit standards. Apply Now: If you are looking for a dynamic environment that supports your development and recognizes your achievements, we offer a competitive salary and benefits package.
Posted 2 days ago
2.0 - 6.0 years
10 - 18 Lacs
Hyderabad, Bengaluru
Hybrid
Indium is hiring Data Analysts and Data Engineers! Join our dynamic team and work on cutting-edge data projects. We're conducting a Walk-In Drive on 20th September 2025 at our Hyderabad and Chennai offices. Open Roles: Data Analyst Experience: 2 to 5 Years Key Skills: Advanced SQL Tableau / Looker / GDS / Power BI Python (NumPy, Pandas) Strong data visualization and communication skills Understanding of business metrics and KPIs Data Engineer Experience: 2 to 6 Years Key Skills: Python, SQL ETL Pipelines, Data Warehousing Cloud Platforms (GCP or any other cloud) Experience with Big Data tools. How to Apply / Participate: Please share your details in the form - https://forms.gle/ECPdpDo6yxUvoZxX8 After reviewing your details, if your profile is shortlisted, we will send you an invitation to the walk-in drive. Looking forward to your responses!
Posted 2 days ago
3.0 - 8.0 years
9 - 14 Lacs
Chennai
Work from Office
Location - Chennai NP - Immediate Exp - 3 to 8 Years Mandatory Skills - AI, ML, Python, Data Scientist Walk-in Drive 18th Sep 2025 at Chennai (In Person only) Shift Timing: 11 AM – 8 PM (Monday to Friday) Contact - 7976457434
Posted 2 days ago
4.0 - 6.0 years
8 - 13 Lacs
Saidapet, Tamil Nadu
Work from Office
Introduction to the Role: Are you passionate about unlocking the power of data to drive innovation and transform business outcomes? Join our cutting-edge Data Engineering team and be a key player in delivering scalable, secure, and high-performing data solutions across the enterprise. As a Data Engineer, you will play a central role in designing and developing modern data pipelines and platforms that support data-driven decision-making and AI-powered products. With a focus on Python, SQL, AWS, PySpark, and Databricks, you'll enable the transformation of raw data into valuable insights by applying engineering best practices in a cloud-first environment. We are looking for a highly motivated professional who can work across teams to build and manage robust, efficient, and secure data ecosystems that support both analytical and operational workloads. Accountabilities: Design, build, and optimize scalable data pipelines using PySpark, Databricks, and SQL on AWS cloud platforms. Collaborate with data analysts, data scientists, and business users to understand data requirements and ensure reliable, high-quality data delivery. Implement batch and streaming data ingestion frameworks from a variety of sources (structured, semi-structured, and unstructured data). Develop reusable, parameterized ETL/ELT components and data ingestion frameworks. Perform data transformation, cleansing, validation, and enrichment using Python and PySpark. Build and maintain data models, data marts, and logical/physical data structures that support BI, analytics, and AI initiatives. Apply best practices in software engineering, version control (Git), code reviews, and agile development processes. Ensure data pipelines are well-tested, monitored, and robust with proper logging and alerting mechanisms. Optimize performance of distributed data processing workflows and large datasets. Leverage AWS services (such as S3, Glue, Lambda, EMR, Redshift, Athena) for data orchestration and lakehouse architecture design. Participate in data governance practices and ensure compliance with data privacy, security, and quality standards. Contribute to documentation of processes, workflows, metadata, and lineage using tools such as Data Catalogs or Collibra (if applicable). Drive continuous improvement in engineering practices, tools, and automation to increase productivity and delivery quality. Essential Skills / Experience: 4 to 6 years of professional experience in Data Engineering or a related field. Strong programming experience with Python, including data wrangling, pipeline automation, and scripting. Deep expertise in writing complex and optimized SQL queries on large-scale datasets. Solid hands-on experience with PySpark and distributed data processing frameworks. Expertise working with Databricks for developing and orchestrating data pipelines. Experience with AWS cloud services such as S3, Glue, EMR, Athena, Redshift, and Lambda. Practical understanding of ETL/ELT development patterns and data modeling principles (Star/Snowflake schemas). Experience with job orchestration tools like Airflow, Databricks Jobs, or AWS Step Functions. Understanding of data lake, lakehouse, and data warehouse architectures. Familiarity with DevOps and CI/CD tools for code deployment (e.g., Git, Jenkins, GitHub Actions). Strong troubleshooting and performance optimization skills in large-scale data processing environments. Excellent communication and collaboration skills, with the ability to work in cross-functional agile teams.
Desirable Skills / Experience: AWS or Databricks certifications (e.g., AWS Certified Data Analytics, Databricks Data Engineer Associate/Professional). Exposure to data observability, monitoring, and alerting frameworks (e.g., Monte Carlo, Datadog, CloudWatch). Experience working in healthcare, life sciences, finance, or another regulated industry. Familiarity with data governance and compliance standards (GDPR, HIPAA, etc.). Knowledge of modern data architectures (Data Mesh, Data Fabric). Exposure to streaming data tools like Kafka, Kinesis, or Spark Structured Streaming. Experience with data visualization tools such as Power BI, Tableau, or QuickSight.
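For orientation, a minimal sketch of the kind of PySpark batch pipeline this posting centres on; the S3 paths and column names below are hypothetical.

```python
# Illustrative PySpark batch ETL: ingest, cleanse, enrich, write (paths assumed).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Ingest raw JSON from a landing bucket (hypothetical path).
raw = spark.read.json("s3://example-raw-bucket/orders/")

# Cleanse and enrich: drop rows missing keys, normalise types, stamp load date.
clean = (
    raw.dropna(subset=["order_id", "amount"])
       .withColumn("amount", F.col("amount").cast("double"))
       .withColumn("load_date", F.current_date())
)

# Write partitioned Parquet to the curated zone for downstream consumers.
clean.write.mode("overwrite").partitionBy("load_date").parquet(
    "s3://example-curated-bucket/orders/"
)
```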
Posted 2 days ago
3.0 - 8.0 years
2 - 6 Lacs
Pune
Work from Office
Job Purpose ICE Mortgage Technology is driving value to every customer through our effort to automate everything that can be automated in the residential mortgage industry. Our integrated solutions touch each aspect of the loan lifecycle, from the borrower's "point of thought" through e-Close and secondary solutions. Drive real automation that reduces manual workflows, increases productivity, and decreases risk. You will be working in a dynamic product development team while collaborating with other developers, management, and customer support teams. You will have an opportunity to participate in designing and developing services utilized across product lines. The ideal candidate should possess a product mentality, have a strong sense of ownership, and strive to be a good steward of his or her software. More than any concrete experience with specific technology, it is critical for the candidate to have a strong sense of what constitutes good software; be thoughtful and deliberate in picking the right technology stack; and be always open-minded to learn (from others and from failures). Responsibilities Develop high-quality data processing infrastructure and scalable services capable of ingesting and transforming data at huge scale from many different sources on schedule. Turn ideas and concepts into carefully designed and well-authored quality code. Articulate the interdependencies and the impact of design choices. Develop APIs to power data-driven products and external APIs consumed by internal and external customers of the data platform. Collaborate with QA, product management, engineering, and UX to achieve well-groomed, predictable results. Improve and develop new engineering processes and tools. Knowledge and Experience 3+ years of building Enterprise Software Products. Experience in object-oriented design and development with languages such as Java, J2EE, and related frameworks. Experience building REST-based microservices in a distributed architecture along with any cloud technologies (AWS preferred). Knowledge of Java/J2EE frameworks like Spring Boot, Microservices, JPA, JDBC, and related frameworks is a must. Built high-throughput real-time and batch data processing pipelines using Kafka, on an AWS environment with services like S3, Kinesis, Lambda, RDS, DynamoDB, or Redshift (should know the basics at least). Experience with a variety of data stores for unstructured and columnar data as well as traditional database systems, for example, MySQL, Postgres. Proven ability to deliver working solutions on time. Strong analytical thinking to tackle challenging engineering problems. Great energy and enthusiasm with a positive, collaborative working style, clear communication and writing skills. Experience working in a DevOps environment ("you build it, you run it"). Demonstrated ability to set priorities and work in a fast-paced, dynamic team environment within a start-up culture. Experience with big data technologies and exposure to Hadoop, Spark, AWS Glue, AWS EMR, etc. (nice to have). Experience handling large data sets using technologies like HDFS, S3, Avro, and Parquet (nice to have).
Posted 2 days ago
6.0 - 10.0 years
20 - 25 Lacs
Gurugram
Work from Office
We're seeking a detail-oriented and technically skilled Data Engineer to streamline our data operations. The ideal candidate will take ownership of the stability, integrity, and performance of our enterprise databases. This role involves hands-on maintenance, data lifecycle management, automation, and support for business-critical analytics. Monitor data consistency and integrity across multiple database platforms. Extract, transform, and load (ETL) data from various sources. Manage and support data transmission, synchronization, and consolidation between business systems. Build and maintain robust database queries and interfaces. Perform complex data imports, cleansing, structuring, and exports across varied sources. Design and implement automated workflows for database operations, including deployment and routine maintenance. Collaborate across teams to identify data needs and improve processes. Troubleshoot issues, optimize performance, and ensure high availability of data systems. Work with Snowflake (preferred) to manage large datasets and performance tuning. Support financial data processing activities with a focus on accuracy and compliance. What we look for: Proven experience working directly with relational databases (SQL Server, MySQL, etc.). Expertise in Oracle and SQL Server database platforms. Strong knowledge of stored procedures, views, triggers, and query performance optimization. Hands-on experience in data security, compliance, and capacity planning. Previous experience handling financial data in a secure and controlled environment. Programming skills (e.g., VBA, scripting languages) are a strong asset. Familiarity with Snowflake or willingness to learn. Excellent attention to detail, organization, and problem-solving mindset. Power BI / Power Query experience for reporting and dashboarding. Strong communication skills for turning data into stories.
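A hedged sketch of the extract-check-load work described above, using pandas with SQLAlchemy; the connection strings, schema, and table names are placeholders, not details from the posting.

```python
# Illustrative extract/validate/load step (all identifiers are placeholders).
import pandas as pd
from sqlalchemy import create_engine

source = create_engine("mssql+pyodbc://user:pass@source_dsn")      # SQL Server source
target = create_engine("snowflake://user:pass@account/db/schema")  # Snowflake target

# Extract, then run a simple consistency check: flag duplicate business keys.
df = pd.read_sql("SELECT * FROM finance.transactions", source)
dupes = df[df.duplicated(subset=["txn_id"], keep=False)]
if not dupes.empty:
    print(f"{len(dupes)} duplicate txn_id rows need review")

# Load de-duplicated rows into a staging table in the warehouse.
df.drop_duplicates(subset=["txn_id"]).to_sql(
    "transactions", target, schema="staging", if_exists="append", index=False
)
```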
Posted 2 days ago
5.0 - 10.0 years
18 - 20 Lacs
Pune
Work from Office
Join us as a Full Stack Developer at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences. To be successful as a Full Stack Developer you should have experience with: Front End technologies, UI/UX understanding. Proficiency in Java, Spring Boot, data processing, RESTful and GraphQL API designs. Strong knowledge of SQL. Database design patterns, scalability, and performance. Version control, best security practices. Some other highly valued skills may include: Build and deployment pipelines (Gradle/Maven). Monitoring and logging. E2E testing, performance testing. DSA, understanding of solving complex problems. You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking and digital and technology, as well as job-specific technical skills. This role is based out of Pune. Purpose of the role: To design, develop and improve software, utilising various engineering methodologies, that provides business, platform, and technology capabilities for our customers and colleagues. Accountabilities: Development and delivery of high-quality software solutions by using industry-aligned programming languages, frameworks, and tools. Ensuring that code is scalable, maintainable, and optimized for performance. Cross-functional collaboration with product managers, designers, and other engineers to define software requirements, devise solution strategies, and ensure seamless integration and alignment with business objectives. Collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing. Staying informed of industry technology trends and innovations and actively contributing to the organization's technology communities to foster a culture of technical excellence and growth. Adherence to secure coding practices to mitigate vulnerabilities, protect sensitive data, and ensure secure software solutions. Implementation of effective unit testing practices to ensure proper code design, readability, and reliability. Assistant Vice President Expectations: Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues. Identify ways to mitigate risk and develop new policies/procedures in support of the control and governance agenda. Take ownership for managing risk and strengthening controls in relation to the work done. Perform work that is closely related to that of other areas, which requires understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function. Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and the business strategy. Engage in complex analysis of data from multiple sources of information, internal and external sources such as procedures and practices (in other areas, teams, companies, etc.) to solve problems creatively and effectively. Communicate complex information. Complex information could include sensitive information or information that is difficult to communicate because of its content or its audience. Influence or convince stakeholders to achieve outcomes.
Posted 2 days ago
1.0 - 3.0 years
3 - 7 Lacs
Bengaluru
Work from Office
We are seeking a DevOps Engineer to lead the migration of multiple applications and services into a new AWS environment. This role requires a strategic thinker with hands-on technical expertise, a deep understanding of DevOps best practices, and the ability to guide and mentor other engineers. You will work closely with architects and technical leads to design, plan, and execute cloud-native solutions with a strong emphasis on automation, scalability, security, and performance. Key Responsibilities: Take full ownership of the migration process to AWS, including planning and execution. Work closely with architects to define the best approach for migrating applications into Amazon EKS. Mentor and guide a team of DevOps Engineers, assigning tasks and ensuring quality execution. Design and implement CI/CD pipelines using Jenkins, with an emphasis on security, maintainability, and scalability. Integrate static and dynamic code analysis tools (e.g., SonarQube) into the CI/CD process. Manage secure access to AWS services using IAM roles, least-privilege principles, and container-based identity (e.g., workload identity). Create and manage Helm charts for Kubernetes deployments across multiple environments. Conduct data migrations between S3 buckets, PostgreSQL databases, and other data stores, ensuring data integrity and minimal downtime. Troubleshoot and resolve infrastructure and deployment issues, both in local containers and Kubernetes clusters. Required Skills & Expertise: CI/CD & DevOps Tools: Jenkins pipelines (DSL), SonarQube, Nexus or Artifactory. Shell scripting, Python (with YAML/JSON handling). Git and version control best practices. Containers & Kubernetes: Docker (multi-stage builds, non-root containers, troubleshooting). Kubernetes (services, ingress, service accounts, RBAC, DNS, Helm). Cloud Infrastructure (AWS): AWS services: EC2, EKS, S3, IAM, Secrets Manager, Route 53, WAF, KMS, RDS, VPC, Load Balancers. Experience with IAM roles, workload identities, and secure AWS access patterns. Network fundamentals: subnets, security groups, NAT, TLS/SSL, CA certificates, DNS routing. Databases: PostgreSQL: pg_dump/pg_restore, user management, RDS troubleshooting. Web & Security Concepts: NGINX, web servers, reverse proxies, path-based/host-based routing. Session handling, load balancing (stateful vs stateless). Security best practices, OWASP Top 10, WAF (configuration/training), network-level security, RBAC, IAM policies. Candidate Expectations: Explain best practices around CI/CD pipeline design and secure AWS integrations. Demonstrate complex scripting solutions and data processing tasks in Bash and Python. Describe container lifecycle, troubleshooting steps, and security hardening practices. Detail Kubernetes architecture, Helm chart design, and access control configurations. Show a deep understanding of AWS IAM, networking, service integrations, and cost-conscious design. Discuss TLS certificate lifecycle, trusted CA usage, and implementation in cloud-native environments. Preferred Qualifications: AWS Certified DevOps Engineer or equivalent certifications. Experience in FinTech, SaaS, or other regulated industries. Knowledge of cost optimization strategies in cloud environments. Familiarity with Agile/Scrum methodologies. Certifications or experience with ITIL or ISO 20000 frameworks are advantageous.
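One of the tasks above, migrating data between S3 buckets, might look roughly like this boto3 sketch; the bucket names and prefix are hypothetical, and a real migration would add retries and integrity checks.

```python
# Illustrative server-side S3-to-S3 copy (bucket names are placeholders).
import boto3

s3 = boto3.client("s3")
SRC, DST = "example-old-bucket", "example-new-bucket"

paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=SRC, Prefix="exports/"):
    for obj in page.get("Contents", []):
        # copy() performs a managed server-side copy, so object data
        # never transits the client machine.
        s3.copy({"Bucket": SRC, "Key": obj["Key"]}, DST, obj["Key"])
        print("copied", obj["Key"])
```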
Posted 2 days ago
4.0 - 8.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Sr Power BI/Data Engineer Location: Hybrid at Bengaluru, Karnataka, India Roles and Responsibilities Design, develop, and implement robust data models using state-of-the-art data management tools to support business objectives and provide insights. Collaborate with cross-functional teams to gather requirements and translate business needs into technical solutions that leverage data effectively. Manage data integration processes to ensure seamless data flow and accuracy across various systems and platforms. Oversee data warehousing solutions that optimize performance and reliability, ensuring that data availability aligns with business needs. Develop comprehensive data reports and dashboards that enable stakeholders to make informed decisions based on clear insights. Implement data governance and security measures to protect sensitive information and ensure compliance with industry standards. Continuously evaluate and improve ETL processes to increase efficiency and adaptability within rapidly changing data environments. Monitor data health and troubleshoot any issues that arise, maintaining the highest levels of data integrity and performance. Stay updated with emerging trends and technologies in the field of data engineering to incorporate innovative approaches and tools. Required Qualifications Proven experience in data engineering, with a deep understanding of data architecture and data management principles. Strong proficiency in designing and developing data processing systems that are scalable and reliable. Ability to translate complex technical concepts into actionable data strategies that align with business needs. In-depth knowledge of database management and data warehousing solutions, with a track record of successful implementations. Excellent problem-solving skills and the ability to troubleshoot complex data scenarios efficiently. Strong communication skills, both written and verbal, with the ability to explain technical concepts to non-technical stakeholders. Demonstrated ability to work effectively in a hybrid work environment, collaborating with onsite and remote teams. Key Responsibilities Lead the development and execution of data strategies that enable organizational growth and innovation. Spearhead the optimization of data warehousing and data integration processes to enhance system performance. Drive the implementation of best practices in data governance, ensuring data is secure and compliant with applicable regulations. Mentor junior data engineers, fostering a collaborative team environment and promoting knowledge sharing. Serve as a subject matter expert in data engineering, providing guidance and insights that influence strategic decisions. Continuously evaluate the effectiveness of current data systems and propose improvements to address emerging business needs. Oversee the development and maintenance of data-driven applications, ensuring they meet quality and security standards.
Posted 2 days ago
3.0 - 7.0 years
4 - 8 Lacs
Pune, Bengaluru
Work from Office
The purpose of this role is to deliver analysis in line with client business objectives and goals, and to maintain, develop, and exceed client performance targets. Job Description: Understands the client's specific needs. Ensures crisp communication with clients and works as an interface between team members and the client counterpart. Discusses issues related to questionnaires with clients and suggests solutions for the same. Uses specialised knowledge of market research tools / programming languages to understand client requirements and build surveys / deliver data tables as per the requirement with the required quality and productivity levels. Reviews project requirements and executes projects independently as per requirements by following the guidelines and deploying the tools/systems as applicable. Assigns and allocates work to junior team members, coordinates with them and helps them programme the surveys. Creates and follows the work allocation schedule and project plan.
Posted 2 days ago
4.0 - 8.0 years
5 - 9 Lacs
Pune, Bengaluru
Work from Office
The purpose of this role is to deliver analysis in line with client business objectives and goals, and to maintain, develop, and exceed client performance targets. Job Description: Understands the client's specific needs. Ensures crisp communication with clients and works as an interface between team members and the client counterpart. Discusses issues related to questionnaires with clients and suggests solutions for the same. Uses specialised knowledge of market research tools / programming languages to understand client requirements and build surveys / deliver data tables as per the requirement with the required quality and productivity levels. Reviews project requirements and executes projects independently as per requirements by following the guidelines and deploying the tools/systems as applicable. Assigns and allocates work to junior team members, coordinates with them and helps them programme the surveys. Creates and follows the work allocation schedule and project plan.
Posted 2 days ago
2.0 - 5.0 years
5 - 8 Lacs
Mohali
Work from Office
The Database Developer will be responsible for database design, development, tuning, and ensuring data integrity. Key Responsibilities Develop and optimize stored procedures, functions, and views. Perform data analysis and migration. Monitor and tune database performance. Technical Skills SQL Server/PostgreSQL/Oracle Stored Procedures, Triggers, Indexing Performance Tuning, ETL tools Data modeling Experience with AI-powered data processing Nice to Have Understanding of Microservice-based DB design Exposure to DevOps for database deployment automation
Posted 2 days ago
0.0 - 4.0 years
2 - 6 Lacs
Bengaluru
Work from Office
EY - Global Delivery Services (GDS) Consulting People Consulting (PC), Asset Services (CASS) Staff/Consultant (EYWP) Managing the global workforce in today's fast-paced and highly disrupted environment is becoming increasingly complex. As a member of our PC practice, you'll be part of a team that supports clients in aligning their HR function with the organizational plans while keeping employee experience as one of the core considerations. When you join us, you will gain cross-functional, multi-industry, and truly global work experience to take your career in the right direction. The opportunity We are looking for a Consultant with expertise in providing production management support to customers and engagement teams to join the PC team. This is a fantastic opportunity to be a part of a leading global professional services firm whilst being instrumental in the growth of the People Consulting team. Your key responsibilities Manage and support technical queries from end-users based on the support model. Resolve product/service problems by answering client queries, determining the cause of the problem; select and explain the best solution to solve the problem; expedite correction or adjustment; follow up to ensure resolution. Design and develop interactive dashboards and reports using Power BI. Customize visuals and implement DAX measures to meet client-specific KPIs. Build and maintain ETL pipelines using Python. Perform data ingestion, transformation, and validation for structured and semi-structured data. Automate pre-ETL and post-ETL processes to improve data flow efficiency. Support OCN setup, including hierarchy mapping and data integration. Troubleshoot OCN-related issues and collaborate with stakeholders for enhancements. Maintain and enhance existing dashboards based on evolving business needs. Execute work associated with the implementation of EY tools and environment set-up for new clients, aligned with the project. Assist coordination of project activities between clients, project teams and directly with the external client. Create knowledge base materials dedicated towards operational efficiency. Skills and attributes for success High integrity and commitment to work in a new and challenging environment. Ability to manage ambiguity and be proactive. Strong communication and presentation skills. Cross-cultural awareness and sensitivity. High energy levels, agility and adaptability. Ability to effectively prioritize and escalate customer issues. Ability to multitask and comfort working in a large organization across multiple teams. Ability to adapt quickly in using and supporting various technologies. Qualifications Bachelor's degree in Computer Science, Information Systems, or a related field. Certifications in Power BI, Python, or ETL tools. Experience working in Agile environments. To qualify for the role, you must have 0-4 years of experience in data analytics, BI, or ETL development. Proficiency in Power BI (DAX, Power Query). Hands-on experience with Python for data processing. Familiarity with ETL tools. Exposure to OrgChart Now or similar organizational charting tools is a plus. Strong conceptual and analytical skills. Problem solving, performance tuning, test case analysis, Azure DevOps, ServiceNow, Synthesia. Experience in supporting web applications that use MongoDB, Neo4j and/or NoSQL, ADLS. Strong analytical and problem-solving skills.
Willingness to work in shifts or extended hours, depending on the geography of the project. Ideally, you'll also have Experience in using and supporting various technologies. Strong hands-on experience with Power BI or similar reporting tools, along with broad exposure to data architecture across web applications hosted in a cloud environment. Customer support processes and technologies. Reporting on service level agreements and usage statistics. Technical experts with commercial acumen, relevant experience, and a high degree of enthusiasm to adapt and learn in a fast-moving environment. Knowledge and experience of working in a cross-cultural setup. At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Support, coaching and feedback from some of the most engaging colleagues around. Opportunities to develop new skills and progress your career. The freedom and flexibility to handle your role in a way that's right for you.
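The Python ETL and validation duties listed above might, in miniature, look like this pandas sketch; the file path and expected columns are assumptions for illustration.

```python
# Illustrative ingest-validate-transform step (schema and path assumed).
import pandas as pd

EXPECTED = {"employee_id", "department", "start_date"}

def load_and_validate(path: str) -> pd.DataFrame:
    df = pd.read_csv(path)
    missing = EXPECTED - set(df.columns)
    if missing:
        raise ValueError(f"missing columns: {missing}")
    # Transform: parse dates, drop rows without a business key.
    df["start_date"] = pd.to_datetime(df["start_date"], errors="coerce")
    return df.dropna(subset=["employee_id"])

if __name__ == "__main__":
    print(load_and_validate("hr_extract.csv").head())
```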
Posted 2 days ago
1.0 - 5.0 years
3 - 7 Lacs
Gurugram
Work from Office
At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express. How will you make an impact in this role? This role will be part of the Treasury Applications Platform team; we are currently modernizing our platform and migrating it to GCP. You will contribute towards making the platform more resilient and secure for future regulatory requirements and ensuring compliance and adherence to Federal Regulations. Preferably a BS or MS degree in computer science, computer engineering, or other technical discipline. 8+ years of software development experience. Ability to effectively interpret technical and business objectives and challenges and articulate solutions. Experience with managing large teams and balancing multiple priorities. Willingness to learn new technologies and exploit them to their optimal potential. Strong background with Java, PySpark, SQL, concurrency/parallelism, Oracle, big data. Cloud experience with GCP would be a preference. Define problems and provide solution alternatives. Create detailed computer system design documentation. Implement deployment plans. Support the consulting team in different phases of the project including problem definition, effort estimation, diagnosis, solution generation, design and deployment. Under supervision, participate in unit-level and organizational initiatives with the objective of providing high-quality and value-adding consulting solutions. Understand issues and diagnose the root cause of issues. Perform secondary research as instructed by supervisor to assist in strategy and business planning. Minimum Qualifications: Deep understanding of SDLC, Java, PL/SQL. Comfortable with Java, Python, GitHub, CI/CD; ReactJS knowledge. Preferred Qualifications: GCP experience would be preferred. Exposure to large data processing systems. Treasury domain knowledge.
Posted 2 days ago
2.0 - 7.0 years
4 - 9 Lacs
Noida
Work from Office
Location: Sector 63, Noida (WFO) Timings: Mon - Fri; 10:30 AM to 7:30 PM About the Role We are seeking a skilled Data Engineer with hands-on experience in building and maintaining scalable data pipelines and analytics solutions. The ideal candidate will be highly proficient in PySpark and Data Build Tool (DBT), with strong experience in managing large datasets, data warehousing, and modern data stack technologies. Key Responsibilities Design, build, and maintain ETL/ELT pipelines for efficient data ingestion, transformation, and storage. Collaborate with data analysts, data scientists, and stakeholders to understand data requirements and deliver actionable solutions. Implement and maintain data models using DBT for analytics and reporting. Optimize performance of large-scale distributed data processing jobs using PySpark. Ensure data quality, consistency, and integrity through validation and monitoring. Work with cloud-based data platforms (AWS, GCP, or Azure) to manage data storage and pipelines. Troubleshoot and resolve data-related technical issues. Participate in code reviews, documentation, and best practice enforcement for data engineering workflows. Required Skills and Qualifications Minimum 2 years of experience in Data Engineering or a related role. Experience writing code in Python. Proficiency in PySpark for distributed data processing. Expertise in Data Build Tool (DBT) for data modeling and transformation. Strong knowledge of SQL and relational databases. Familiarity with data warehousing solutions such as Snowflake, BigQuery, or Redshift. Experience with cloud platforms like AWS, GCP, or Azure. Understanding of data pipelines, ETL/ELT processes, and data architecture principles. Knowledge of version control systems such as Git. Strong problem-solving skills and attention to detail. Preferred Skills Experience with orchestration tools like Airflow or Prefect. Knowledge of streaming data pipelines such as Kafka or Kinesis. Familiarity with containerization using Docker or Kubernetes. Understanding of data governance, security, and compliance practices. Perks and benefits of working at Algoscale: Opportunity to collaborate with leading companies across the globe. Opportunity to work with the latest and trending technologies. Competitive salary and performance-based bonuses. Comprehensive group health insurance. Flexible working hours and remote work options. (For some positions only) Generous vacation and paid time off. Professional learning and development programs and certifications.
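As a rough sketch of the PySpark side of this role, here is a de-duplication step that lands a staging table for downstream DBT models; the paths, table, and column names are illustrative assumptions.

```python
# Illustrative PySpark staging job feeding DBT models (identifiers assumed).
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("events_staging").getOrCreate()
events = spark.read.parquet("s3://example-landing/events/")

# Keep only the latest record per event_id, ranked by ingestion time.
w = Window.partitionBy("event_id").orderBy(F.col("ingested_at").desc())
latest = (
    events.withColumn("rn", F.row_number().over(w))
          .filter(F.col("rn") == 1)
          .drop("rn")
)

# Land a clean staging table; DBT models can then build marts on top of it.
latest.write.mode("overwrite").saveAsTable("staging.events_latest")
```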
Posted 2 days ago
2.0 - 7.0 years
4 - 9 Lacs
Kochi
Work from Office
At EY, we're all in to shape your future with confidence. We'll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world. Job Description Position Details Job Title Associate Data Engineer/Analyst Experience and Qualification Over 2 years of experience as a data analyst or data engineer. Bachelor's degree in a relevant field. Specific Role requirements Create, develop, and maintain scalable and efficient big data processing pipelines within distributed computing environments. Design, develop, and implement interactive Power BI reports and dashboards tailored to the needs of different business units. Work collaboratively with cross-functional teams to gather data requirements and design suitable data solutions. Execute data ingestion, processing, and transformation workflows to facilitate various analytical and machine learning applications. Keep updated with emerging technologies and best practices in data processing and analytics, integrating them into our data engineering methodologies. Possess expertise in data modelling, data warehousing principles, data governance best practices, and ETL processes. Demonstrate excellent written and verbal communication skills, with strong capabilities in documentation, presentation, and data storytelling. Have experience in utilizing and fine-tuning large language models (LLMs) and developing generative AI solutions. Technology Requirement Strong proficiency in SQL. Solid understanding of Azure data engineering tools, particularly Azure Data Factory. Proficiency in Python programming. Competence in using Azure Databricks. Expertise in Power BI and other Power Platform tools, including Power Apps and Power Automate. Knowledge of large language models and generative AI tools. Experience in a multi-cloud environment is advantageous.
Posted 2 days ago
4.0 - 5.0 years
6 - 7 Lacs
Bengaluru
Work from Office
1.0 What this job involves: As the Incident Reporting and Analytics Manager reporting directly to the PSCC Lead, you will be part of a team that leads our Global Incident Management Reporting and Analytics function based in the Property Services Command Centre (PSCC) in Bangalore, India. You will establish and oversee centralized incident data management, analysis, and standardized reporting across JLL's global portfolio spanning the APAC, US and EMEA regions. This role requires expertise in data analytics, facility engineering knowledge, and cross-regional communication skills to drive continuous improvement in incident management practices. 2.0 What your day-to-day will look like: 2.1 Data Collection and Management Establish and maintain a centralized database system for global incident data from country teams and the incident management system. Develop and implement data quality protocols and standards for the effective management of global incident data. Create standardized data collection templates that integrate with existing systems. Produce and maintain Standard Operating Procedures (SOPs) for all critical reporting functions. Deliver monthly data quality assessments identifying areas for improvement. Ensure proper data storage, security, and access controls for incident information. Adapt to take on additional roles and responsibilities as required by the company or clients. 2.2 Stakeholder Communication: Establish regular communication channels with country and regional teams. Implement and maintain a comprehensive action tracking system with target closure dates. Conduct monthly update meetings with regional representatives. Develop standardized templates for cross-regional communications. 2.3 Analytics and Reporting: Create customized dashboards for key stakeholders showing critical metrics. Deliver quarterly trend analysis identifying recurring issues and systemic risks. Provide analytical reports highlighting global trends affecting multiple regions. Standardize reporting metrics and formats across different countries and regions. Develop visualization tools that effectively communicate complex incident data. Ensure consistent implementation and tracking of risk mitigation strategies across regions. 2.4 Technology and Process Continuous Improvement: Deliver recommendations for reporting enhancements and process improvements wherever possible, including: Research and propose suitable analytics tools that can help to enhance reporting quality. Develop automation solutions for routine reporting tasks to improve efficiency. Create user-friendly documentation for all reporting processes and systems. Conduct bi-annual effectiveness reviews of existing reporting methodologies. Implement process improvements periodically based on stakeholder feedback. Create training materials to support adoption of new reporting technologies. 2.5 Compliance and Governance: Ensure compliance with internal standards and policies. Support internal and external audit processes as required. Develop compliant standards for the effective management of global incident data. 2.6 Critical Facilities System Monitoring (APAC Backup): Maintain capability and readiness to monitor APAC critical facility systems as needed. 3.0 Desired or preferred experience and technical skills: 3.1 Data Analytics Proficiency in Excel, Power BI and/or similar data processing tools. Statistical analysis capabilities. Data visualization expertise. Ensure data accuracy and completeness.
3.2 Facility Engineering Knowledge Understanding of critical facility systems and operations. Familiarity with common incident types and resolution approaches. Knowledge of industry standards and best practices. Ability to interpret technical system alerts (for APAC monitoring backup). 3.3 Communication and Use of Tools Microsoft Teams collaboration skills. Email management and organization. Virtual meeting facilitation. Document sharing and version control. Ability and willingness to communicate adequately across different countries and regions. Fluency at a professional level in written and spoken English. 3.4 Project Management Tracking capabilities for follow-up actions. Management of deadlines and structured timeframes for completing and issuing reports. Stakeholder management across time zones. Resource prioritization. 4.0 Required Skills and Experience: Bachelor's degree required, with 4-5 years of proven experience in a similar role, preferably in a command centre / facilities management / data analytics environment. Location: On-site Bengaluru, KA
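The quarterly trend analysis described in this role could start from something as simple as this pandas sketch; the file and column names are assumed for illustration.

```python
# Illustrative incident trend roll-up by region and quarter (columns assumed).
import pandas as pd

incidents = pd.read_csv("incidents.csv", parse_dates=["reported_at"])

trend = (
    incidents.assign(quarter=incidents["reported_at"].dt.to_period("Q"))
             .groupby(["region", "quarter"])
             .size()
             .rename("incident_count")
             .reset_index()
)
# Highest-volume region/quarter pairs surface recurring, systemic issues.
print(trend.sort_values("incident_count", ascending=False).head(10))
```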
Posted 2 days ago
5.0 - 10.0 years
7 - 12 Lacs
Pune
Work from Office
What makes a Worldpayer? It's simple: Think, Act, Win. We stay curious, always asking the right questions to be better every day, finding creative solutions to simplify the complex. We're dynamic: every Worldpayer is empowered to make the right decisions for their customers. And we're determined, always staying open, winning and failing as one. We're looking for a Sr AWS Databricks Admin to join our Big Data Team to help us unleash the potential of every business. About the team: We are seeking a talented and experienced Senior AWS Data Lake Engineer to join our dynamic team who can design, develop, and maintain scalable data pipelines and manage AWS Data Lake solutions. The ideal candidate will have extensive experience in handling sensitive data, including Personally Identifiable Information (PII) and Payment Card Industry (PCI) data, using advanced tokenization and masking techniques. What you will be doing Design, develop, and maintain scalable data pipelines using Python and AWS services. Implement and manage AWS Data Lake solutions, including ingestion, storage, and cataloging of structured and unstructured data. Apply data tokenization and masking techniques to protect sensitive information in compliance with data privacy regulations (e.g., GDPR, HIPAA). Collaborate with data engineers, architects, and security teams to ensure secure and efficient data flows. Optimize data workflows for performance, scalability, and cost-efficiency. Monitor and troubleshoot data pipeline issues and implement robust logging and alerting mechanisms. Document technical designs, processes, and best practices. Provide support on Databricks and Snowflake. Maintain comprehensive documentation for configurations, procedures, and troubleshooting steps. What you bring: 5+ years of experience working as a Python developer/architect. Strong proficiency in Python, with experience in data processing libraries (e.g., Pandas, PySpark). Proven experience with AWS services such as S3, Glue, Lake Formation, Lambda, Athena, and IAM. Solid understanding of data lake architecture and best practices. Experience with data tokenization, encryption, and anonymization techniques. Familiarity with data governance, compliance, and security standards. Experience with Snowflake and/or Databricks (nice to have). Experience with CI/CD tools and version control (e.g., Git, CodePipeline). Strong problem-solving skills and attention to detail. Where you'll own it: You'll own it in our modern Bangalore/Pune/Indore hub. With hubs in the heart of city centers and tech capitals, things move fast in APAC. We pride ourselves on being an agile and dynamic collective, collaborating with different teams and offices across the globe. Worldpay perks - what we'll bring for you We know it's bigger than just your career. It's your life, and your world. That's why we offer global benefits and programs to support you at every stage. Here's a taste of what you can expect. A competitive salary and benefits. Time to support charities and give back to your community. Parental leave policy. Global recognition platform. Virgin Pulse access. Global employee assistance program. What makes a Worldpayer At Worldpay, we take our Values seriously, and we live them every day. Think like a customer, Act like an owner, and Win as a team. Curious. Humble. Creative. We ask the right questions, listening and learning to get better every day. We simplify the complex and we're always looking to create a bigger impact for our colleagues and customers. Empowered.
Accountable. Dynamic. We stay agile, using our initiative, taking calculated risks to progress. Never standing still, never settling, we work at pace to achieve our goals. We champion our ideas and stay flexible to make them happen. We know that every action adds up. Determined. Inclusive. Open. Unlocking potential means working as one global community. Our work spans borders, and we stay united by our purpose. We collaborate, always encouraging others to perform at their best, welcoming new perspectives. We can't wait to hear from you. To find out more about working with us, find us on LinkedIn.
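A minimal sketch of the column-level tokenization this role calls for, assuming irreversible tokens via salted hashing; production systems would typically use a KMS- or vault-backed tokenization service, and every name here is illustrative.

```python
# Illustrative PII tokenization: replace a sensitive column with salted hashes.
import hashlib
import pandas as pd

SALT = b"example-secret-salt"  # in practice, fetched from a secrets manager

def tokenize(value: str) -> str:
    # Same input always yields the same token, so joins still work,
    # but the original value cannot be recovered from the token.
    return hashlib.sha256(SALT + value.encode()).hexdigest()

df = pd.DataFrame({"email": ["a@x.com", "b@y.com"], "amount": [10.0, 20.5]})
df["email"] = df["email"].map(tokenize)
print(df)
```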
Posted 2 days ago
The data processing job market in India is thriving with opportunities for job seekers in the field. With the growing demand for data-driven insights in various industries, the need for professionals skilled in data processing is on the rise. Whether you are a fresh graduate looking to start your career or an experienced professional looking to advance, there are ample opportunities in India for data processing roles.
Major Indian cities such as Bengaluru, Hyderabad, Chennai, Pune, and Gurugram, as seen in the listings above, are actively hiring for data processing roles, with a multitude of opportunities available for job seekers.
The average salary range for data processing professionals in India varies based on experience and skill level. Entry-level positions can expect to earn between INR 3-6 lakh per annum, while experienced professionals can earn upwards of INR 10 lakh per annum.
A typical career path in data processing may include roles such as Data Analyst, Data Engineer, Data Scientist, and Data Architect. As professionals gain experience and expertise in the field, they may progress from Junior Data Analyst to Senior Data Analyst, and eventually to roles such as Data Scientist or Data Architect.
In addition to data processing skills, professionals in this field are often expected to have knowledge of programming languages such as Python, SQL, and R. Strong analytical and problem-solving skills are also essential for success in data processing roles.
As you explore opportunities in the data processing job market in India, remember to prepare thoroughly for interviews and showcase your skills and expertise confidently. With the right combination of skills and experience, you can embark on a successful career in data processing in India. Good luck!