
1585 Data Processing Jobs - Page 14

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

7.0 - 12.0 years

6 - 10 Lacs

Bengaluru

Work from Office


We're Celonis, the global leader in Process Mining technology and one of the world's fastest-growing SaaS firms. We believe there is a massive opportunity to unlock productivity by placing data and intelligence at the core of business processes - and for that, we need you to join us.

The Team: Our team is responsible for building the Celonis end-to-end Task Mining solution. Task Mining is the technology that allows businesses to capture user interaction (desktop) data, so they can analyze how people get work done, and how they can do it even better. We own all the related components, e.g. the desktop client, the related backend services, the data processing capabilities, and Studio frontend applications.

The Role: Celonis is looking for a Senior Software Engineer to build new features and increase the reliability of our Task Mining solution. You would contribute to the development of our Task Mining Client, so expertise in C# and the .NET Framework is required; knowledge of Java and Spring Boot is a plus.

The work you'll do:
- Implement highly performant and scalable desktop components to improve our existing Task Mining software
- Own the implementation of end-to-end solutions: leading the design, implementation, build, and delivery to customers
- Increase the maintainability, reliability, and robustness of our software
- Continuously improve and automate our development processes
- Document procedures and concepts, and share knowledge within and across teams
- Manage complex requests from support, finding the right technical solution and managing the communication with stakeholders
- Occasionally work directly with customers, including getting to know their systems in detail and helping them debug and improve their setup

The qualifications you need:
- 7+ years of professional experience building .NET applications
- Passion for writing clean code that follows SOLID principles
- Hands-on experience in C# and the .NET Framework
- Experience in user interface development using WPF and MVVM
- Familiarity with Java and the Spring framework is a plus
- Familiarity with containerization technologies (e.g. Docker)
- Experience with REST APIs and/or distributed microservice architectures
- Experience with monitoring and log analysis tools (e.g. Datadog)
- Experience writing and setting up unit and integration tests
- Experience refactoring legacy components
- Able to supervise and coach junior colleagues
- Experience interacting with customers is a plus
- Strong communication skills

What Celonis Can Offer You:
- Pioneer Innovation: Work with the leading, award-winning process mining technology, shaping the future of business.
- Accelerate Your Growth: Benefit from clear career paths, internal mobility, a dedicated learning program, and mentorship opportunities.
- Receive Exceptional Benefits: Including generous PTO, hybrid working options, company equity (RSUs), comprehensive benefits, extensive parental leave, dedicated volunteer days, and much more.
- Prioritize Your Well-being: Access to resources such as gym subsidies, counseling, and well-being programs.
- Connect and Belong: Find community and support through dedicated inclusion and belonging programs.
- Make Meaningful Impact: Be part of a company driven by strong values that guide everything we do: Live for Customer Value, The Best Team Wins, We Own It, and Earth Is Our Future.
- Collaborate Globally: Join a dynamic, international team of talented individuals.
- Empowered Environment: Contribute your ideas in an open culture with autonomous teams.
About Us: Celonis makes processes work for people, companies, and the planet. The Celonis Process Intelligence Platform uses industry-leading process mining and AI technology and augments it with business context to give customers a living digital twin of their business operation. It's system-agnostic and without bias, and provides everyone with a common language for understanding and improving businesses. Celonis enables its customers to continuously realize significant value across the top, bottom, and green line. Celonis is headquartered in Munich, Germany, and New York City, USA, with more than 20 offices worldwide.

Celonis Inclusion Statement: At Celonis, we believe our people make us who we are and that The Best Team Wins. We know that the best teams are made up of people who bring different perspectives to the table. And when everyone feels included, able to speak up, and knows their voice is heard - that's when creativity and innovation happen.

Your Privacy: Any information you submit to Celonis as part of your application will be processed in accordance with the Celonis Accessibility and Candidate Notices. By submitting this application, you confirm that you agree to the storing and processing of your personal data by Celonis as described in our Privacy Notice for the Application and Hiring Process. Please be aware of common job offer scams, impersonators, and frauds.

Posted 1 week ago

Apply

4.0 - 9.0 years

25 - 30 Lacs

Bengaluru

Work from Office


The Lending team at Grab is dedicated to building safe, secure loan products catering to all user segments across SEA. Our mission is to promote financial inclusion and support underbanked partners across the region. Data plays a pivotal role in our lending operations, guiding decisions across credit assessment, collections, reporting, and beyond. You will report to the Lead Data Engineer. This role is based in Bangalore.

Get to Know the Role: As a Data Engineer in the Lending Data Engineering team, you will work with data modellers, product analytics, product managers, software engineers, and business stakeholders across SEA to understand the business and data requirements. You will build and manage the data asset, including acquisition, storage, processing, and use channels, using some of the most scalable and resilient open-source big data technologies like Flink, Airflow, Spark, Kafka, Trino, and more on cloud infrastructure. You are encouraged to think out of the box and have fun exploring the latest patterns and designs.

The Critical Tasks You Will Perform:
- Develop scalable, reliable ETL pipelines to ingest data from diverse sources (a minimal sketch appears at the end of this listing).
- Build expertise in real-time data availability to support accurate real-time metric definitions.
- Implement data quality checks and governance best practices for data cleansing, assurance, and ETL operations.
- Use existing data platform tools to set up and manage pipelines.
- Improve data infrastructure performance to ensure reliable insights for decision-making.
- Design next-gen data lifecycle management tools/frameworks for batch, real-time, API-based, and serverless use cases.
- Build solutions using AWS services like Glue, Redshift, Athena, Lambda, S3, Step Functions, EMR, and Kinesis.
- Use tools like Amazon MSK/Kinesis for real-time data processing and metric tracking.

Essential Skills You'll Need:
- 4+ years of experience building scalable, secure, distributed data pipelines.
- Proficiency in Python, Scala, or Java for data engineering solutions.
- Knowledge of big data technologies like Flink, Spark, Trino, Airflow, Kafka, and AWS services (EMR, Glue, Redshift, Kinesis, and Athena).
- Solid experience with SQL, data modelling, and schema design.
- Hands-on with AWS storage and compute services (S3, DynamoDB, Athena, and Redshift Spectrum).
- Experience working with NoSQL, columnar, and relational databases.
- Curious and eager to explore new data technologies and solutions.
- Familiarity with in-house and AWS-native tools for efficient pipeline development.
- Ability to design event-driven architectures using SNS, SQS, Lambda, or similar serverless technologies.
- Experience with data structures, algorithms, or ML concepts.

About Grab and Our Workplace: Grab is Southeast Asia's leading superapp. From getting your favourite meals delivered to helping you manage your finances and getting around town hassle-free, we've got your back with everything. At Grab, purpose gives us joy and habits build excellence, while harnessing the power of technology and AI to deliver the mission of driving Southeast Asia forward by economically empowering everyone, with heart, hunger, honour, and humility.

Life at Grab: We care about your well-being at Grab; here are some of the global benefits we offer: We have your back with Term Life Insurance and comprehensive Medical Insurance. With GrabFlex, create a benefits package that suits your needs and aspirations. Celebrate moments that matter in life with loved ones through Parental and Birthday leave, and give back to your communities through Love-all-Serve-all (LASA) volunteering leave. We have a confidential Grabber Assistance Programme to guide and uplift you and your loved ones through life's challenges.

What We Stand For at Grab: We are committed to building an inclusive and equitable workplace that enables diverse Grabbers to grow and perform at their best. As an equal opportunity employer, we consider all candidates fairly and equally regardless of nationality, ethnicity, religion, age, gender identity, sexual orientation, family commitments, physical and mental impairments or disabilities, and other attributes that make them unique.
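For a concrete sense of the pipeline work this listing describes, here is a minimal PySpark sketch of a batch ETL job with a simple data-quality gate. Everything in it - the S3 paths, field names, and the 95% threshold - is a hypothetical illustration, not Grab's actual code or schema.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("loan_events_etl").getOrCreate()

# Ingest raw loan-application events (hypothetical JSON landing zone in S3).
raw = spark.read.json("s3://example-bucket/raw/loan_events/ds=2024-01-01/")

# Standardise types, de-duplicate on the event key, and drop unusable rows.
clean = (
    raw.withColumn("event_ts", F.to_timestamp("event_ts"))
       .dropDuplicates(["event_id"])
       .filter(F.col("user_id").isNotNull())
)

# Minimal data-quality gate: fail the run if too many rows were dropped.
raw_count, clean_count = raw.count(), clean.count()
if raw_count and clean_count / raw_count < 0.95:  # threshold is an assumption
    raise ValueError(f"DQ gate failed: kept {clean_count}/{raw_count} rows")

# Publish partitioned Parquet for downstream consumers (e.g. Athena).
clean.write.mode("overwrite").partitionBy("country").parquet(
    "s3://example-bucket/curated/loan_events/"
)
```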

Posted 1 week ago

Apply

2.0 - 7.0 years

12 - 17 Lacs

Bengaluru

Work from Office


2-7 years of experience in Python. Good understanding of big data ecosystems and frameworks such as Hadoop, Spark, etc. Experience in developing data processing tasks using PySpark. Expertise in at least one popular cloud provider, preferably AWS, is a plus. Good knowledge of any RDBMS/NoSQL database with strong SQL-writing skills. Experience with data warehouse tools like Snowflake is a plus. Experience with any one ETL tool is a plus. Strong analytical and problem-solving capability. Excellent verbal and written communication skills. Client-facing skills: solid experience working with clients directly and building trusted relationships with stakeholders. Ability to collaborate effectively across global teams. Strong understanding of data structures, algorithms, object-oriented design, and design patterns. Experience in the use of multi-dimensional data, data curation processes, and the measurement/improvement of data quality. General knowledge of business processes, data flows, and quantitative models that generate or consume data. Independent thinker, willing to engage, challenge, and learn new technologies.

Qualification: Bachelor's or Master's degree in Computer Science or a related field. Certification from professional bodies is a plus.

Selection Process: Candidates should expect 3-4 rounds of personal or telephonic interviews to assess fitment and communication skills.
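As a rough illustration of the PySpark-plus-SQL skill set this listing asks for, the following self-contained sketch builds a small DataFrame and aggregates it with plain Spark SQL; the table and column names are invented for the example.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("orders_summary").getOrCreate()

# Toy input data; a real task would read from a source system instead.
orders = spark.createDataFrame(
    [(1, "A", 120.0), (2, "B", 60.0), (3, "A", 45.5)],
    ["order_id", "customer", "amount"],
)
orders.createOrReplaceTempView("orders")

# Aggregate with plain Spark SQL, the kind of query writing the role expects.
summary = spark.sql("""
    SELECT customer,
           COUNT(*)    AS order_count,
           SUM(amount) AS total_amount
    FROM orders
    GROUP BY customer
    HAVING SUM(amount) > 50
""")
summary.show()
```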

Posted 1 week ago

Apply

9.0 - 11.0 years

11 - 13 Lacs

Gurugram

Work from Office


Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express. How will you make an impact in this role? You will be responsible for the delivery of highly impactful analytics to understand and optimize our commercial acquisition site experience and increase digital conversion.
- Deliver strategic analytics focused on digital acquisition and membership experiences.
- Define and build key KPIs to monitor channel/product/platform health and success.
- Support the development of new products and capabilities.
- Deliver read-outs of campaigns, uncovering insights and learnings that can be utilized to further optimize the channels.
- Gain a deep functional understanding of the enterprise-wide product capabilities and associated platforms over time, and ensure analytical insights are relevant and actionable.
- Power in-depth strategic analysis and provide analytical and decision support by mining digital activity data along with American Express closed-loop data.

Minimum Qualifications:
- Advanced degree in a quantitative field (e.g. Finance, Engineering, Mathematics, Computer Science).
- Some experience with big data programming languages (BigQuery, Hive, Spark), Python, and SQL.
- Experience in large-scale data processing and handling; an understanding of data science is a plus.
- Ability to work in a dynamic, cross-functional environment, with strong attention to detail.
- Excellent communication skills with the ability to engage, influence, and encourage partners to drive collaboration and alignment.

Preferred Qualifications:
- Strong analytical/conceptual thinking competence to solve unstructured and complex business problems and articulate key findings to senior leaders/partners in a succinct and concise manner.
- Basic knowledge of statistical techniques for experimentation and hypothesis testing: regression, t-test, or chi-square test.
- Strong programming skills.

Posted 1 week ago

Apply

10.0 - 14.0 years

25 - 30 Lacs

Bengaluru

Work from Office


At Visa, the Corporate Information Technology, Billing & Incentives Platforms team enables Visa's revenue growth through flexible pricing engines and global revenue platforms built on next-generation technologies. This includes managing system requirements, evaluating cutting-edge technologies, design, development, integration, quality assurance, implementation, and maintenance of corporate revenue applications. The team works closely with the business owners of these services to deliver custom-developed solutions as well as implement industry-leading packaged software. This team has embarked on a major transformational journey to build and implement best-of-breed revenue and billing applications to transform our business as well as our technology. The candidate should enjoy working on diverse technologies and should be excited to take the initiative to solve complex business problems and get the job done while taking on new challenges. You should thrive in team-oriented and fast-paced environments where each team member is vital to the overall success of the projects.

Key Responsibilities:
- Develop and maintain test automation scripts using PySpark for big data applications (a minimal test sketch follows below).
- Collaborate with data engineers and developers to understand data processing workflows and requirements.
- Design and implement automated tests for data ingestion, processing, and transformation in a Hadoop ecosystem.
- Perform data validation, data integrity, and performance testing for Spark applications.
- Utilize Spark-specific concepts such as RDDs, DataFrames, Datasets, and Spark SQL in test automation.
- Create and manage CI/CD pipelines for automated testing in a big data environment.
- Identify, report, and track defects, and work with the development team to resolve issues.
- Optimize and tune Spark jobs for performance and scalability.
- Maintain and update test cases based on new features and changes in the application.
- Document test plans, test cases, and test results comprehensively.
- Perform QA and manual testing for payments applications, ensuring compliance with business requirements and standards.
- Work with limited direction, usually within a complex environment, to drive delivery of solutions and meet service levels.
- Work productively with stakeholders in multiple countries and time zones.
- Through active engagement, collaboration, effective communication, quality, integrity, and reliable delivery, develop and maintain a trusted and valued relationship with the team, customers, and business partners.

Basic Qualifications: Bachelor's degree, OR 3+ years of relevant work experience.
Preferred Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Relevant certifications in Big Data or Spark.
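The test-automation work described above typically pairs a local Spark session with pytest. The sketch below is illustrative only - the add_fee transformation and its rate are invented for the example, not Visa's code - and shows a transformation under test plus a unit test asserting its output.

```python
import pytest
from pyspark.sql import SparkSession, functions as F


@pytest.fixture(scope="session")
def spark():
    # Local two-core session is enough for fast, isolated tests.
    return SparkSession.builder.master("local[2]").appName("tests").getOrCreate()


def add_fee(df, rate=0.02):
    """Transformation under test: append a fee column (hypothetical logic)."""
    return df.withColumn("fee", F.col("amount") * rate)


def test_add_fee_computes_expected_values(spark):
    input_df = spark.createDataFrame([(100.0,), (50.0,)], ["amount"])
    result = {r["amount"]: r["fee"] for r in add_fee(input_df).collect()}
    assert result[100.0] == pytest.approx(2.0)
    assert result[50.0] == pytest.approx(1.0)
```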

Posted 1 week ago

Apply

0.0 - 4.0 years

6 - 7 Lacs

Pune

Work from Office


About KPI Partners: KPI Partners is a leading provider of data analytics solutions, dedicated to helping organizations transform data into actionable insights. Our innovative approach combines advanced technology with expert consulting, allowing businesses to leverage their data for improved performance and decision-making.

Job Description: We are seeking a skilled and motivated Data Engineer with experience in Databricks to join our dynamic team. The ideal candidate will be responsible for designing, building, and maintaining scalable data pipelines and data processing solutions that support our analytics initiatives. You will collaborate closely with data scientists, analysts, and other engineers to ensure the consistent flow of high-quality data across our platforms.

Key skills: Python, PySpark, Databricks, ETL, Cloud (AWS, Azure, or GCP)

Key Responsibilities:
- Develop, construct, test, and maintain data architectures (e.g., large-scale data processing systems) in Databricks.
- Design and implement ETL (Extract, Transform, Load) processes to move and transform data from various sources to target systems.
- Collaborate with data scientists and analysts to understand data requirements and design appropriate data models and structures.
- Optimize data storage and retrieval for performance and efficiency.
- Monitor and troubleshoot data pipelines to ensure reliability and performance.
- Engage in data quality assessments, validation, and troubleshooting of data issues.
- Stay current with emerging technologies and best practices in data engineering and analytics.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
- Proven experience as a Data Engineer or in a similar role, with hands-on experience in Databricks.
- Strong proficiency in SQL and programming languages such as Python or Scala.
- Experience with cloud platforms (AWS, Azure, or GCP) and related technologies.
- Familiarity with data warehousing concepts and data modeling techniques.
- Knowledge of data integration tools and ETL frameworks.
- Strong analytical and problem-solving skills.
- Excellent communication and teamwork abilities.

Why Join KPI Partners?
- Be part of a forward-thinking team that values innovation and collaboration.
- Opportunity to work on exciting projects across diverse industries.
- Continuous learning and professional development opportunities.
- Competitive salary and benefits package.
- Flexible work environment with hybrid work options.

If you are passionate about data engineering and excited about using Databricks to drive impactful insights, we would love to hear from you! KPI Partners is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.

Posted 1 week ago

Apply

3.0 - 8.0 years

2 - 6 Lacs

Hyderabad

Work from Office


Azure Data Engineer - Soulpage IT Solutions (posted March 6, 2025)

Position: Azure Data Engineer. Skill set: Azure Databricks and Data Lake implementation. Experience: 3+ years. Notice Period: Immediate to 15 days. Location: WFO, Hyderabad. Job Type: Full-Time. Positions: 2.

Job Summary: We are looking for a highly skilled Azure Data Engineer with expertise in Azure Databricks and Data Lake implementation to design, develop, and optimize our data pipelines. The engineer will be responsible for integrating data from multiple sources, ensuring data is cleaned, standardized, and normalized for ML model building. This role involves working closely with stakeholders to understand data requirements and ensuring seamless data flow across different platforms.

Key Responsibilities (Data Lake & Pipeline Development):
- Design, develop, and implement scalable Azure Data Lake solutions.
- Build robust ETL/ELT pipelines using Azure Databricks, Data Factory, and Synapse Analytics.
- Optimize data ingestion and processing from multiple structured and unstructured sources.
- Implement data cleaning, standardization, and normalization processes to ensure high data quality (a minimal sketch follows below).
- Implement best practices for data governance, security, and compliance.
- Optimize data storage and retrieval for performance and cost-efficiency.
- Monitor and troubleshoot data pipelines, ensuring minimal downtime.
- Work closely with data scientists, analysts, and business stakeholders to define data needs.
- Maintain thorough documentation for data pipelines, transformations, and integrations.
- Assist in developing ML-ready datasets by ensuring consistency across integrated data sources.

Required Skills & Qualifications:
- 3+ years of experience in data engineering, with a focus on Azure cloud technologies.
- Expertise in Azure Databricks, Data Factory, and Data Lake.
- Strong proficiency in Python, SQL, and PySpark for data processing and transformations.
- Understanding of ML data preparation workflows, including feature engineering and data normalization.
- Knowledge of data security and governance principles.
- Experience in optimizing ETL pipelines for scalability and performance.
- Strong analytical and problem-solving skills.
- Excellent written and verbal communication skills.

Preferred Qualifications: Azure certifications - Azure Data Engineer Associate, Azure Solutions Architect.

Why Join Us? Work on cutting-edge Azure cloud and data technologies. Collaborate with a dynamic and innovative team solving complex data challenges. Competitive compensation and career growth opportunities.

Application Process: Interested candidates can send their resumes to [email protected] with the subject line: Application for Azure Data Engineer. We look forward to welcoming passionate individuals to our team!
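As an illustration of the cleaning, standardization, and normalization step mentioned in the responsibilities, here is a minimal PySpark sketch; the ADLS path, column names, and the min-max normalization choice are assumptions made for the example, not the client's schema.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("standardise_patients").getOrCreate()

# Hypothetical raw zone in ADLS Gen2; path and columns are placeholders.
df = spark.read.parquet("abfss://raw@exampleaccount.dfs.core.windows.net/patients/")

# Min-max bounds for normalisation (assumes the column varies; guard in real code).
stats = df.agg(F.min("age").alias("lo"), F.max("age").alias("hi")).first()

cleaned = (
    df.dropDuplicates(["patient_id"])                  # de-duplicate
      .withColumn("city", F.initcap(F.trim("city")))   # standardise free text
      .withColumn(                                     # normalise to [0, 1] for ML
          "age_norm",
          (F.col("age") - F.lit(stats["lo"])) / F.lit(stats["hi"] - stats["lo"]),
      )
)

cleaned.write.mode("overwrite").parquet(
    "abfss://curated@exampleaccount.dfs.core.windows.net/patients/"
)
```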

Posted 1 week ago

Apply

2.0 - 7.0 years

3 - 7 Lacs

Kochi

Work from Office


Create the Annual Activity Planner and share it with the client and TPV. Approve and publish the final version of the agreed annual payroll calendar and system set-up. Agree the password format for the year.

Service Delivery: Act as the first point of escalation for payroll queries. Handle all non-payroll-related tickets under the correct function. Mass upload and master data processing in hrX (only if applicable). Exchange event monitoring (only for hrX clients). Manage RCAs: arrange RCAs and validate their quality. Use LVMS or BO reports to ensure all tickets are closed on time by the TPV. Responsible for updating, maintaining, and enforcing the Defined Work Instructions (DWIs) and the CLIENT Solution Workbook. Responsible for the resolution of technical/functional issues escalated from the team, CLIENT, and/or Partner, and for ensuring all system issues/defects are reported correctly and tickets are logged with the necessary details and evidence so Application Services and/or Products can investigate.

SLA Reporting: Cross-check the KPIs against actual results and report to the TPV to identify and correct any deviation. Update SLAs and fail reasons in LVMS, reported on a monthly basis.

Change Requests: Check the Client/Strada CSW/SOW for compliance. Check the Strada/TPV CSW/SOW for compliance. Notify the PSM of Change Requests raised. Apply the CR process as per the VPS 3.0 standard process. Update the CSW and get the client's approval on the changes in the documents.

Escalations: SPOC for TPVs and first escalation point for clients. Include escalations in the RAG with the PSM's help. Manage issues that need to be escalated (TPV-related).

Security and Compliance: Initiate the SI process in case any SI is detected by the PSA. Perform SOC1 controls.

Hyper-care: Participate in hyper-care calls. Collaborate with the Project Manager, PSM, and OA team for integration support, etc. Support and validate the tests performed during the pre-go-live phase (UAT/SIT testing and data mapping configuration; support in process definition). Conduct the VPS process walkthrough call with all new clients during hyper-care.

Governance: Manage regular operations calls (corrections call, post-payroll call, etc.). Prepare the post-payroll review deck. Manage the operational plan to track actions/issues. Manage issues that need to be escalated (TPV-related). Ensure adherence to all agreed schedules as per the SOW for the Client/TPV. Collaborate with PSMs to ensure the quality of the services provided by the TPV to the client.

Requirements: 2+ years of client/vendor management experience in similar industries. Experience in leading and handling client calls (3 years). Degree/Diploma.

Posted 1 week ago

Apply

7.0 - 10.0 years

30 - 35 Lacs

Surat

Work from Office


KP Group is looking for a Sr. Manager to join our dynamic team and embark on a rewarding career journey. Responsibilities include: delegating responsibilities and supervising business operations; hiring, training, motivating, and coaching employees as they provide attentive, efficient service to customers; assessing employee performance and providing helpful feedback and training opportunities; resolving conflicts or complaints from customers and employees; monitoring store activity and ensuring it is properly provisioned and staffed; analyzing information and processes and developing more effective or efficient processes and strategies; establishing and achieving business and profit objectives; maintaining a clean, tidy business and ensuring that signage and displays are attractive; generating reports and presenting information to upper-level managers or other parties; ensuring staff members follow company policies and procedures; and other duties to ensure the overall health and success of the business.

Posted 1 week ago

Apply

8.0 - 10.0 years

10 - 12 Lacs

Bengaluru

Work from Office


ECMS Req # / Demand ID: 529266. PU: DNA. Role: Technology Lead. Number of openings: 1. Duration of project: 1-2 years. Years of experience: 8-10.

Detailed job description / skill set: 6+ years of industry experience, including cloud technologies. Very strong hands-on experience in Databricks with AWS cloud services for data engineering/data processing. Hands-on experience in AWS cloud-based development and integration. Proficiency in Scala and the Spark DataFrame API for data processing and application development. Practical experience with data engineering and data ingestion/orchestration with Apache Airflow, and the accompanying DevOps with CI/CD tools. Strong knowledge of Spark and Databricks SQL data engineering pipelines. Experience with the offshore/onshore model and agile methodology. Gathering requirements, understanding the business need, and holding regular discussions with the tech team on design and development activities. Good experience working with the client architect/design team to understand the architecture and requirements and carry out the development. Experience working in the financial industry. Certification in Databricks and AWS is an added advantage.

Posted 1 week ago

Apply

0.0 - 5.0 years

2 - 3 Lacs

Noida, Ghaziabad, Faridabad

Work from Office


We are looking for a Data Entry Operator to update and maintain information on our company databases and computer systems. Data Entry Operator responsibilities include collecting and entering data in databases and maintaining accurate records.

Posted 1 week ago

Apply

10.0 - 15.0 years

18 - 20 Lacs

Hyderabad

Work from Office


We are looking for a Senior Data Engineer to lead the design and implementation of scalable data infrastructure and engineering practices. This role will be critical in laying down the architectural foundations for advanced analytics and AI/ML use cases across global business units. You'll work closely with the Data Science Lead, Product Manager, and other cross-functional stakeholders to ensure data systems are robust, secure, and future-ready.

Key Responsibilities:
- Architect and implement end-to-end data infrastructure, including ingestion, transformation, storage, and access layers, to support enterprise-scale analytics and machine learning.
- Define and enforce data engineering standards, design patterns, and best practices across the CoE.
- Lead the evaluation and selection of tools, frameworks, and platforms (cloud, open source, commercial) for scalable and secure data processing.
- Work with data scientists to enable efficient feature extraction, experimentation, and model deployment pipelines.
- Design for real-time and batch processing architectures, including support for streaming data and event-driven workflows.
- Own the data quality, lineage, and governance frameworks to ensure trust and traceability in data pipelines.
- Collaborate with central IT, data platform teams, and business units to align on data strategy, infrastructure, and integration patterns.
- Mentor and guide junior engineers as the team expands, creating a culture of high performance and engineering excellence.

Qualifications:
- 10+ years of hands-on experience in data engineering, data architecture, or platform development.
- Strong expertise in building distributed data pipelines using tools like Spark, Kafka, Airflow, or equivalent orchestration frameworks (see the orchestration sketch below).
- Deep understanding of data modeling, data lake/lakehouse architectures, and scalable data warehousing (e.g., Snowflake, BigQuery, Redshift).
- Advanced proficiency in Python and SQL, with working knowledge of Java or Scala preferred.
- Strong experience working on cloud-native data architectures (AWS, GCP, or Azure), including serverless, storage, and compute optimization.
- Proven experience in architecting ML/AI-ready data environments, supporting MLOps pipelines and production-grade data flows.
- Familiarity with DevOps practices, CI/CD for data, and infrastructure-as-code (e.g., Terraform) is a plus.
- Excellent problem-solving skills and the ability to communicate technical solutions to non-technical stakeholders.
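For the orchestration experience this listing highlights, a minimal Airflow 2.x DAG of the extract-transform-load shape might look like the sketch below; the DAG id, schedule, and task bodies are placeholders, not a production pipeline.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**_):
    print("pull from source systems")


def transform(**_):
    print("clean and model the data")


def load(**_):
    print("publish to the warehouse")


with DAG(
    dag_id="example_daily_pipeline",  # placeholder name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+ 'schedule' parameter
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3  # linear dependency: extract, then transform, then load
```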

Posted 1 week ago

Apply

12.0 - 17.0 years

32 - 37 Lacs

Pune

Work from Office


Senior Software Engineer - AWS Cloud Engineer: As a Senior Software Engineer - AWS Cloud Engineer with Convera, you will join motivated and experienced Voice Engineers and professionals who are eager to expand their expertise into the dynamic world of Amazon Connect, a cutting-edge, cloud-based contact center solution that offers complete customization with scalable cloud technology. If you're looking to advance your career in software development, AWS, or AI, this is the perfect opportunity to upskill and work on innovative solutions.

In your role as a Senior AWS Cloud Engineer, you will be responsible for:
- Architect and Develop Cloud Solutions: Lead the end-to-end design and development of robust data pipelines and data architectures using AWS tools and platforms, including AWS Glue, S3, RDS, Lambda, EMR, and Redshift.
- Analyze, implement, support, and provide recommendations for AWS cloud solutions.
- Design, deploy, and manage AWS network infrastructure using VPC, Transit Gateway, Direct Connect, Route 53, and AWS Security Groups, while also supporting on-premises networking technologies.
- Architect and deploy AWS infrastructure for hosting new and existing line-of-business applications using EC2, Lambda, RDS, S3, EFS, and AWS Auto Scaling.
- Ensure compliance with the AWS Well-Architected Framework and security best practices using IAM, AWS Organizations, GuardDuty, and Security Hub.
- Container orchestration: deploy and manage containerized applications using AWS ECS and EKS.
- Event-Driven Serverless Architecture: design and implement event-driven serverless architectures using AWS Lambda, API Gateway, SQS, SNS, and EventBridge (a minimal handler sketch appears after this listing).
- Implement and test system recovery strategies in accordance with the company's AWS Backup, Disaster Recovery (DR), and Business Continuity (BC) plans.
- Collaborate with AWS Technical Account Managers (TAMs) and customers to provide cloud strategy, cost optimization, and technology roadmaps that align with business objectives.
- Design AWS cloud architectures following Well-Architected guidelines, leveraging CloudFormation, Terraform, and AWS Control Tower.
- Actively participate in team meetings, project discussions, and cross-functional collaboration to enhance AWS cloud adoption and optimization.
- Maintain customer runbooks, automating and improving them with AWS-native solutions such as AWS Systems Manager, CloudWatch, and Lambda.
- Provide off-hours support on a rotational basis, including on-call responsibilities and scheduled maintenance windows.
- Contribute to internal R&D projects, validating and testing new processes, tools, and services for integration into Innovative Solutions offerings.
- Lead or contribute to internal process-improvement initiatives, leveraging various DevOps tools to enhance automation and efficiency.
AWS services within the scope of this role are not limited to the ones specifically called out in this list of responsibilities.

A successful candidate for this position should have:
- A Bachelor's degree in business or computer science and 12+ years of experience in software engineering or IT, including at least four years in a role whose primary responsibility is git-based application code development and/or DevOps engineering and/or the development, maintenance, and support of CI/CD pipelines, or an appropriate combination of industry-related professional experience and education.
- Proven experience with AWS services such as EC2, S3, Lambda, CloudFormation, and VPC, among others.
- Skill in scripting languages such as Python, Bash, or PowerShell.
- Experience with Infrastructure as Code (IaC) tools such as Terraform and AWS CloudFormation, and with monitoring and logging tools such as AWS CloudWatch and the ELK stack.
- A strong understanding of cloud security best practices.
- Great communication and collaboration skills.
- Ability to work independently and with a team.

Preferred Qualifications:
- AWS Certified Solutions Architect - Associate or Professional
- AWS Certified DevOps Engineer - Professional
- HashiCorp Certified: Terraform Associate
- Experience with CI/CD pipelines and DevOps practices
- Knowledge of scalable data architecture to ensure efficient and scalable data processing and storage solutions

About Convera: Convera is the largest non-bank B2B cross-border payments company in the world. Formerly Western Union Business Solutions, we leverage decades of industry expertise and technology-led payment solutions to deliver smarter money movements to our customers, helping them capture more value with every transaction. Convera serves more than 30,000 customers ranging from small business owners to enterprise treasurers to educational institutions to financial institutions to law firms to NGOs. Our teams care deeply about the value we bring to our customers, which makes Convera a rewarding place to work. This is an exciting time for our organization as we build our team with growth-minded, results-oriented people who are looking to move fast in an innovative environment. As a truly global company with employees in over 20 countries, we are passionate about diversity; we seek and celebrate people from different backgrounds, lifestyles, and unique points of view. We want to work with the best people and ensure we foster a culture of inclusion and belonging.

We offer an abundance of competitive perks and benefits, including: a competitive salary; the opportunity to earn an annual bonus; great career growth and development opportunities in a global organization; and a flexible approach to work. #LI-KP1
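As a sketch of the event-driven serverless pattern named in the responsibilities, here is a minimal Python Lambda handler that consumes an SQS batch and publishes results to SNS. The topic ARN, environment variable name, and message shape are illustrative assumptions only.

```python
import json
import os

import boto3

sns = boto3.client("sns")
# Hypothetical results topic, normally injected via environment configuration.
TOPIC_ARN = os.environ.get(
    "RESULT_TOPIC_ARN", "arn:aws:sns:us-east-1:123456789012:example-results"
)


def handler(event, context):
    """Process an SQS batch delivered by the Lambda event source mapping."""
    for record in event.get("Records", []):
        payload = json.loads(record["body"])
        # ... domain-specific processing of the message would go here ...
        sns.publish(
            TopicArn=TOPIC_ARN,
            Message=json.dumps({"id": payload.get("id"), "status": "processed"}),
        )
    # Empty list = no failed messages (partial-batch response contract for SQS).
    return {"batchItemFailures": []}
```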

Posted 1 week ago

Apply

4.0 - 7.0 years

7 - 11 Lacs

Bengaluru

Work from Office


At Quanticate, we're pioneers in providing top-tier statistical and data management support to our clients. We're seeking a dedicated Clinical Data Manager I who's committed to upholding the highest standards, following procedures, and ensuring compliance with regulations, all while providing exceptional customer care. As a Clinical Data Manager I, you will lead, coordinate, and action all tasks relating to Clinical Data Management from the start to the finish of a study, and project-manage studies across CDM functions.

Core Accountabilities: Activities required of a Clinical Data Manager I include (but are not restricted to):
- Contribute to the efficient running of the CDM department as part of the CDM leadership team.
- Ensure launch, delivery, and completion of all CDM procedures according to the contractual agreement and relevant SOPs, guidelines, and regulations.
- Proactively keep abreast of current clinical data management developments and systems.
- Assist in the creation and review of in-house SOPs.
- Research and provide input into in-house strategies and systems.
- Perform medical coding activities on projects, if assigned.
- Perform other reasonable tasks as requested by management.
- Ensure consistency of process and quality across projects.

Project management for allocated projects:
- Help plan and manage study timelines and resources.
- Manage progress against schedules and report to management.
- Perform project management across all functions for a study as appropriate.
- Manage CRFs and all related tasks.

Management of allocated staff:
- Allocate projects in conjunction with Project Management, as appropriate.
- Conduct performance reviews, as required.
- Administer training and development of staff, as required.

Key Relationships:
- Act as the primary CDM contact, both external and internal, for Quanticate projects.
- Manage work assignment and delivery of project tasks to the data processing and programming teams as required.
- Line management responsibilities for any assigned direct reports, including professional development/training and performance appraisals.

Requirements: Qualified to an appropriate standard, preferably to degree level in a life sciences subject. Four to seven years of relevant experience in the CRO Clinical Data Management domain. Extensive knowledge of

Posted 1 week ago

Apply

3.0 - 8.0 years

0 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site


NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a GenAI + Full-Stack Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Requirements: 3-8 years of strong hands-on experience in software development, with a focus on AI/ML and Generative AI. Hands-on with Generative AI technologies, with at least one of the following experiences: working with Large Language Models (LLMs) such as GPT, LLaMA, Claude, etc.; building intelligent systems using LangGraph, Agentic AI frameworks, or similar orchestration tools; implementing Retrieval-Augmented Generation (RAG), prompt engineering, and knowledge augmentation techniques (a toy RAG sketch follows below). Proficiency in Python, including experience with data processing, API integration, and automation scripting. Demonstrated experience in the end-to-end SDLC (Software Development Life Cycle): requirement gathering, design, development, testing, deployment, and support. Proficient in CI/CD pipelines and version control systems like Git. Experience with containerization technologies such as Docker, and orchestration using Kubernetes. Strong problem-solving and debugging skills, with an ability to write clean, efficient, and maintainable code. Excellent verbal and written communication skills, with the ability to collaborate effectively across technical and business teams.

About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future.

NTT DATA endeavors to make its website accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status.
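To make the RAG requirement concrete, here is a deliberately tiny, stdlib-only sketch of the retrieve-then-prompt pattern; real systems would use an embedding model and a vector store rather than the bag-of-words overlap used here, and the documents are placeholders.

```python
from collections import Counter

# Placeholder corpus; a real system would retrieve from a vector store.
DOCS = [
    "NTT DATA provides business and technology consulting services.",
    "Retrieval-augmented generation grounds LLM answers in retrieved context.",
]


def score(query: str, doc: str) -> int:
    """Bag-of-words overlap, a stand-in for embedding similarity."""
    q, d = Counter(query.lower().split()), Counter(doc.lower().split())
    return sum((q & d).values())


def build_prompt(query: str) -> str:
    """Retrieve the best-matching document and splice it into the prompt."""
    context = max(DOCS, key=lambda doc: score(query, doc))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"


print(build_prompt("What is retrieval-augmented generation?"))
```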

Posted 1 week ago

Apply

10.0 - 11.0 years

50 - 60 Lacs

Bengaluru

Work from Office


About the position: Chevron ENGINE is looking for high-performing Technical Geophysicist candidates to join our Earth Science team. The role provides a wide variety of technical products to support asset teams across the enterprise in a range of development settings and basins: offshore, onshore, conventional and unconventional, exploration through to development.

Key responsibilities: Delivers key technical geophysical analysis products and interpretation, such as: seismic-to-well tie, acoustic impedance inversion, acoustic FD earth model building, time-to-depth velocity modeling, depth uncertainty analysis, seismic data processing support, and 4D & CCUS (Carbon Capture, Utilization and Storage) feasibility modeling and analysis. Tasks will also include performance of routine compliance tasks, automated and manual software compatibility testing for periodic system and software upgrades, and close coordination with Subsurface Platform Systems Engineers. Petrel and geophysical software development skills. Teaming with US-based research and development groups focused on developing and deploying geophysical products and workflows. Continual communication with asset and exploration teams spanning the globe.

Required Qualifications: MSc degree in Earth Science (Geophysics preferred) from a deemed/recognized (AICTE) university. At least 5 years of industry-related experience. Industry experience in technical geophysics including, but not limited to, seismic interpretation, seismic-to-well tie, acoustic impedance inversion, acoustic FD earth model building, time-to-depth velocity modeling, depth uncertainty analysis, and seismic data processing support. Experience with Petrel and DELFI would be a differentiator. Familiarity with Hampson-Russell and Jason, as well as seismic processing software skills, will be a benefit. Understanding of physical processes associated with earth science, reservoir modeling, and the subsurface. Good communication skills and the ability to work effectively in a team environment. Fundamental knowledge of geophysical workflows applied to the subsurface. Skill in using ML/AI to accelerate the performance and accuracy of reservoir characterization is a plus. Experience with geophysical applications within the oil and gas industry is preferred. C# programming skills or Ocean SDK experience will be differentiating.

Chevron ENGINE supports global operations, supporting business requirements across the world. Accordingly, the work hours for employees will be aligned to support business requirements. The standard work week will be Monday to Friday, with working hours of 8:00am to 5:00pm or 1:30pm to 10:30pm. Chevron participates in E-Verify in certain locations as required by law.

Posted 1 week ago

Apply

4.0 - 12.0 years

8 - 12 Lacs

Hyderabad

Work from Office


Career Category: Operations. Job Description: Join Amgen's Mission of Serving Patients. At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission to serve patients living with serious illnesses drives all that we do. Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas - Oncology, Inflammation, General Medicine, and Rare Disease - we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

GCP Quality Compliance Manager - What you will do: The Quality Compliance Manager is a global role and part of the Process Quality team within the R&D Quality organization. In this vital role you will work with a team of process-focused colleagues who execute Amgen's Process Quality strategy, which is vital to ensuring that Amgen's Research and Development standards (SOPs and associated documentation) are adequate, clear, and up to all applicable current regulations and quality requirements. The R&D Process Quality team supports the Quality Management System (QMS) across all areas of research at Amgen, from discovery through the full clinical development lifecycle. This team ensures that all of Amgen's business procedures meet internal and external quality standards and are managed for optimum efficiency and effectiveness. The Process Quality team also ensures that Amgen's R&D Business Process Network develops and manages fit-for-purpose standards (SOPs) that are continuously improved using quality-by-design (QbD) and risk management methods, including QMS analytics showing quality signals and trends. In addition, this individual will help support end users in R&D with the digital quality management system (DQMS), including queries, deviations, and Corrective and Preventive Actions (CAPAs). The Quality Compliance Manager will contribute to implementing strategies and providing leadership to ensure excellence in R&D quality processes, working globally with Business Process Owners as an integral team member to ensure compliance with regulations and other requirements.

Roles & Responsibilities: This role will work both independently and in a team environment. The primary responsibility is to support continuous improvement initiatives for R&D quality, along with any other operational or strategy activities assigned.
- Generate and review process area Knowledge Maps (spider maps, lessons learned, and data processing techniques, stored in a graph-based database for better search, analysis, and visualization) to help determine inherent and residual risks, document risk assessments, and collaborate with Business Process Owners and Quality Leads to ensure accurate risk classification and preventive actions.
- Support Amgen's procedural framework so that all procedures maintain compliance with relevant laws, regulations, and internal quality standards; work to ensure that procedures maintain the ethical and safe treatment of all research subjects and that all data has integrity.
- Provide real-time, site-level quality oversight using analytical tools to identify trends, weaknesses, and data quality issues. Perform focused quality control checks on-site and remotely at clinical trial locations, especially key target sites. Offer independent and objective quality advice to local study teams.
- Conduct risk assessments to inform audit site selection and pre-inspection/mock-inspection visits. Support site/sponsor inspection readiness and management, including the preparation, conduct, response, and close-out phases.
- Ensure that all procedures are written clearly for the execution of Amgen's research tasks within a diverse, complex, and cross-functional team of researchers.
- Support incoming procedural change requests, including the assessment of changes (impact to the QMS, including traceability of changes across other document sets).
- Support the work of Business Process Owners and apply risk-based strategies consistently to identify and mitigate risks towards the continuous advancement of Amgen's R&D QMS.
- Apply industry-standard methodologies for optimal (standardized and lean) procedural documentation, and use technology to drive an efficient and effective knowledge management system.
- Support the application of process metrics (KQIs and KPIs, leading and lagging) and modern analytic methods across the Business Process Network to enable Management Reviews (periodic review by management to ensure QMS health is maintained).
- Collaborate with other quality professionals within R&D to support the QMS continuous improvement cycle (Plan, Do, Check, Act), including Deviation Management and Corrective and Preventive Actions (CAPA).

What we expect of you - Basic Qualifications and Experience: Master's degree and 4-6 years in Pharma and Biotechnology R&D Quality; OR Bachelor's degree and 6-8 years in Pharma and Biotechnology R&D Quality; OR Diploma and 10-12 years in Pharma and Biotechnology R&D Quality.

Functional Skills - Must-Have: Exceptional attention to detail and accuracy in all deliverables. Ability to work independently and proactively in a fast-paced environment. Proficiency in the Microsoft Office Suite (Word, Excel, PowerPoint, Outlook) and virtual collaboration tools (e.g., Teams, WebEx). Solid understanding of SOP/standards management and the methods/technology used to drive knowledge management across a diverse R&D environment.

Good-to-Have: Familiarity with project management tools and methodologies. Knowledge of GCP, GLP, and/or GPvP. Experience working in a multinational environment with global teams. Experience within biotech/pharmaceutical research, including the application of global regulations. Direct experience working with standard procedural documentation, including its creation and change control (requests for change and the execution of changes).

Soft Skills: Excellent verbal and written communication skills. A high degree of professionalism and interpersonal skills. Strong problem-solving abilities and adaptability to changing priorities. A collaborative attitude and the ability to build positive relationships across diverse teams. Resilience, discretion, and the ability to thrive under pressure.

What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team: careers.amgen.com. As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.

Posted 1 week ago

Apply

5.0 - 10.0 years

22 - 27 Lacs

Chennai

Work from Office


TransUnion's Job Applicant Privacy Notice

What We'll Bring: Data Pipeline Engineers on the Orion project are embedded within our engineering teams and support development and operations.

What You'll Bring: Lead Data Engineer. We are looking for an individual to be part of an autonomous, cross-functional agile/scrum team where everyone shares responsibility for all aspects of the work. The ideal candidate will have a strong interest in joining our growing Data Engineering and Analytics track of GFS Core Services, and will drive building the next-generation suite of products and platforms by designing, coding, building, and deploying highly scalable and robust solutions. We are looking for enthusiastic professionals who are excited to learn, love a good challenge, and are always looking for opportunities to contribute. Last but not least, we look for dedicated team players who enjoy collaboration and can work effectively with others to achieve common goals. TransUnion is currently seeking a Lead Data Engineer with 7+ years of experience to work in our Chennai office, India. You will be working with some of the latest tools and a great team of cross-functional engineers. We work with multiple technologies. This will be an opportunity to work on the core services of an industrial-strength identity and risk solution by streamlining design and collaborating with the team to build an orchestration platform in the cloud.

Who We Are: At TransUnion, we are dedicated to finding ways information can be used to help people make better and smarter decisions. As a trusted provider of global information solutions, our mission is to help people around the world access the opportunities that lead to a higher quality of life, by helping organizations optimize their risk-based decisions and enabling consumers to understand and manage their personal information. Because when people have access to more complete and multidimensional information, they can make decisions that are more informed and achieve great things. Every day TransUnion offers our employees the tools and resources they need to find ways information can be used in diverse ways.

What you'll bring:
- Bachelor's degree in a quantitative field, plus 7+ years of work experience or equivalent practical experience.
- 5+ years of experience in big data technologies.
- Experience designing and implementing data pipelines.
- Experience with SQL, PostgreSQL and/or Redshift, or other data management, reporting, and query tools.
- Big data technologies: Hadoop HDFS, Hive, Spark, Kafka, Sqoop.
- Designing logical and physical data models, including data warehouse and data mart designs.
- Expertise in writing complex, highly optimized queries across large data sets for data pipelines and data processing layers.
- Cloud experience on AWS, Azure, or GCP, preferably GCP.
- Coach, mentor, and lead a team of data engineers.
- Design, build, test, and deploy cutting-edge big data solutions at scale.
- Extract, clean, transform, and analyze vast amounts of raw data from various data sources.
- Build data pipelines and API integrations with various internal systems.
- Proactively monitor, identify, and escalate issues or root causes of systemic issues.
- Evaluate and communicate technical risks effectively and ensure delivery of assignments on schedule and with the desired quality.
- Work across Data Engineering, Data Architecture, and Data Visualization functions.

What we'll bring: At TransUnion, we have a welcoming and energetic environment that encourages collaboration and innovation; we're consistently exploring new technologies and tools to be agile. This environment gives our people the opportunity to hone current skills and build new capabilities, while discovering their genius. Come be a part of our team; you will work with great people, pioneering products, and cutting-edge technology. This role is for a Lead Data Engineer who will operate as the lead for the Data Pipeline track, responsible for the development of TransUnion's global fraud solutions. We pride ourselves on working in a collaborative, cross-functional manner where all engineers are expected to contribute to the design, build, deployment, and operation of our cloud platform.

Location: Chennai. Job Type: Full-time day job. Impact You'll Make: N/A. This is a hybrid position and involves regular performance of job responsibilities virtually as well as in-person at an assigned TU office location for a minimum of two days a week.

TransUnion Job Title: Lead Developer, Software Development

Posted 1 week ago

Apply

11.0 - 19.0 years

32 - 40 Lacs

Hyderabad

Work from Office


You strive to be an essential member of a diverse team of visionaries dedicated to making a lasting impact. Don't pass up this opportunity to collaborate with some of the brightest minds in the field and deliver best-in-class solutions to the industry. As a Senior Lead Data Architect at JPMorgan Chase within Consumer and Community Banking Data Technology, you are an integral part of a team that works to develop high-quality data architecture solutions for various software applications, platforms, and data products. Drive significant business impact and help shape the global target-state architecture through your capabilities in multiple data architecture domains.

Job responsibilities:
- Represents the data architecture team at technical governance bodies and provides feedback regarding proposed improvements to data architecture governance practices
- Evaluates new and current technologies using existing data architecture standards and frameworks
- Regularly provides technical guidance and direction to support the business and its technical teams, contractors, and vendors
- Designs secure, high-quality, scalable solutions and reviews architecture solutions designed by others
- Drives data architecture decisions that impact data product platform design, application functionality, and technical operations and processes
- Serves as a function-wide subject matter expert in one or more areas of focus
- Actively contributes to the data engineering community as an advocate of firmwide data frameworks, tools, and practices in the Software Development Life Cycle
- Influences peers and project decision-makers to consider the use and application of leading-edge technologies
- Advises junior architects and technologists

Required qualifications, capabilities, and skills:
- 7+ years of hands-on practical experience delivering data architecture and system designs, data engineering, testing, and operational stability
- Advanced knowledge of architecture, applications, and technical processes, with considerable in-depth knowledge in the data architecture discipline and its solutions (e.g., data modeling, native cloud data services, business intelligence, artificial intelligence, machine learning, data domain-driven design, etc.)
- Practical cloud-based data architecture and deployment experience, preferably AWS
- Practical SQL development experience in cloud-native relational databases, e.g., Snowflake, Athena, Postgres
- Ability to deliver various types of data models with multiple deployment targets, e.g., conceptual, logical, and physical data models deployed as operational vs. analytical data stores
- Advanced skill in one or more data engineering disciplines, e.g., streaming, ELT, event processing
- Ability to tackle design and functionality problems independently with little to no oversight
- Ability to evaluate current and emerging technologies to select or recommend the best solutions for the future-state data architecture

Preferred qualifications, capabilities, and skills:
- Financial services experience; card and banking a big plus
- Practical experience with modern data processing technologies, e.g., Kafka streaming, DBT, Spark, Airflow, etc.
- Practical experience with data mesh and/or data lakes
- Practical experience in machine learning/AI, with Python development a big plus
- Practical experience with graph and semantic technologies, e.g., RDF, LPG, Neo4j, Gremlin
- Knowledge of architecture assessment frameworks, e.g., Architecture Tradeoff Analysis

Posted 1 week ago

Apply

1.0 - 3.0 years

3 - 4 Lacs

Faridabad

Work from Office


We are seeking a highly detail-oriented and technically adept 3D Data Annotation Specialist to join our growing team. This role is critical in shaping high-quality datasets for training cutting-edge AI and computer vision models, particularly in domains such as LiDAR data processing and 3D object detection.

Qualifications:
B.Tech in Computer Science, IT, or a related field preferred (others may also apply; strong analytical and software learning abilities required)
Strong analytical and reasoning skills, with attention to spatial geometry and object relationships in 3D space
Basic understanding of 3D data formats (e.g., .LAS, .LAZ, .PLY) and visualization tools
Ability to work independently while maintaining high-quality standards
Excellent communication skills and the ability to collaborate in a fast-paced environment
Attention to detail and the ability to work with precision in visual/manual tasks
Good understanding of basic geometry, coordinate systems, and file handling

Preferred Qualifications:
Prior experience in 3D data annotation or LiDAR data analysis
Exposure to computer vision workflows
Comfort working with large datasets and remote sensing data

Key Responsibilities:
Annotate 3D point cloud data with precision using specialized tools (training will be provided)
Label and segment objects within LiDAR data, aerial scans, or 3D models
Follow annotation guidelines while applying logical and spatial reasoning to 3D environments
Collaborate with ML engineers and data scientists to ensure annotation accuracy and consistency
Provide feedback to improve annotation tools and workflow automation
Participate in quality control reviews and conduct re-annotation as needed
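
As a hedged sketch of the spatial reasoning this role involves, the following Python/NumPy snippet labels the points of a point cloud that fall inside an axis-aligned 3D bounding box, which is the basic operation behind cuboid annotation. The point values, box extents, and class name are invented for illustration.

    # 3D bounding-box annotation sketch: label points inside an axis-aligned box.
    # Point coordinates, box extents, and the class label are illustrative only.
    import numpy as np

    points = np.random.uniform(-10, 10, size=(1000, 3))  # fake (x, y, z) cloud
    box_min = np.array([-2.0, -1.0, 0.0])                # hypothetical cuboid corner
    box_max = np.array([2.0, 1.0, 1.5])                  # hypothetical opposite corner

    # A point is inside the box when all three coordinates lie between the corners.
    inside = np.all((points >= box_min) & (points <= box_max), axis=1)
    labels = np.where(inside, 1, 0)  # 1 = "vehicle" (example class), 0 = background
    print(f"{labels.sum()} of {len(points)} points fall inside the box")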

Posted 1 week ago

Apply

6.0 - 10.0 years

15 - 25 Lacs

Bengaluru

Work from Office


Who We Are
At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.

The Role
As a Data Engineer, you will leverage your expertise in Databricks, big data platforms, and modern data engineering practices to develop scalable data solutions for our clients. Candidates with healthcare experience, particularly with EPIC systems, are strongly encouraged to apply. This includes creating data pipelines, integrating data from various sources, and implementing data security and privacy measures. The Data Engineer will also be responsible for monitoring and troubleshooting data flows and optimizing data storage and processing for performance and cost efficiency.

Responsibilities:
Develop data ingestion, data processing and analytical pipelines for big data, relational databases and data warehouse solutions
Design and implement data pipelines and ETL/ELT processes using Databricks, Apache Spark, and related tools
Collaborate with business stakeholders, analysts, and data scientists to deliver accessible, high-quality data solutions
Provide guidance on cloud migration strategies and data architecture patterns such as Lakehouse and Data Mesh
Provide pros/cons and migration considerations for private and public cloud architectures
Provide technical expertise in troubleshooting, debugging, and resolving complex data and system issues
Create and maintain technical documentation, including system diagrams, deployment procedures, and troubleshooting guides
Work with Data Governance, Data Security and Data Privacy tooling (Unity Catalog or Purview)

If you're ready to embrace the power of data to transform our business and embark on an epic data adventure, then join us at Kyndryl. Together, let's redefine what's possible and unleash your potential.

Your Future at Kyndryl
Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won't find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.

Who You Are
You're good at what you do and possess the required experience to prove it. However, equally as important – you have a growth mindset: keen to drive your own personal and professional development. You are customer-focused – someone who prioritizes customer success in their work. And finally, you're open and borderless – naturally inclusive in how you work with others.

Required Technical and Professional Experience:
3+ years of consulting or client service delivery experience on Azure
Graduate/Postgraduate in computer science, computer engineering, or equivalent, with a minimum of 8 years of experience in the IT industry
3+ years of experience developing data ingestion, data processing and analytical pipelines for big data, relational databases such as SQL Server, and data warehouse solutions such as Azure Synapse
Extensive hands-on experience implementing data ingestion, ETL and data processing
Hands-on experience with Big Data technologies such as Java, Python, SQL, ADLS/Blob, PySpark and Spark SQL, Databricks, HDInsight, and live streaming technologies such as Event Hubs
Experience with cloud-based database technologies (Azure PaaS DB, AWS RDS and NoSQL)
Cloud migration methodologies and processes, including tools like Azure Data Factory, Database Migration Service, etc.
Experience with monitoring and diagnostic tools (SQL Profiler, Extended Events, etc.)
Expertise in data mining, data storage and Extract-Transform-Load (ETL) processes
Experience with relational databases and expertise in writing and optimizing T-SQL queries and stored procedures
Experience using Big Data file formats and compression techniques
Experience with developer tools such as Azure DevOps, Visual Studio Team Services, Git, Jenkins, etc.
Experience with private and public cloud architectures, their pros/cons, and migration considerations
Excellent problem-solving, analytical, and critical thinking skills
Ability to manage multiple projects simultaneously while maintaining a high level of attention to detail
Communication skills: able to communicate with both technical and nontechnical audiences and to derive technical requirements from stakeholders

Preferred Technical and Professional Experience:
Cloud platform certification, e.g. Microsoft Certified: Azure Data Engineer Associate (DP-700), AWS Certified Data Analytics – Specialty, Elastic Certified Engineer, Google Cloud Professional Data Engineer
Professional certification, e.g. Open Certified Technical Specialist with Data Engineering Specialization
Experience working with EPIC healthcare systems (e.g., Clarity, Caboodle)
Databricks certifications (e.g., Databricks Certified Data Engineer Associate or Professional)
Knowledge of GenAI tools, Microsoft Fabric, or Microsoft Copilot
Familiarity with healthcare data standards and compliance (e.g., HIPAA, GDPR)
Experience with DevSecOps and CI/CD deployments
Experience in NoSQL database design
Knowledge of GenAI fundamentals and supporting industry use cases
Hands-on experience with Delta Lake and Delta Tables within the Databricks environment for building scalable and reliable data pipelines

Being You
Diversity is a whole lot more than what we look like or where we come from; it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.

What You Can Expect
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.

Get Referred!
If you know someone that works at Kyndryl, when asked 'How Did You Hear About Us' during the application process, select 'Employee Referral' and enter your contact's Kyndryl email address.
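
As a hedged illustration of the Databricks pipeline work this listing describes, here is a minimal PySpark sketch that ingests a CSV extract, deduplicates it, and writes a Delta table. The file paths and column names (/mnt/raw/encounters.csv, encounter_id) are hypothetical; on Databricks the SparkSession already exists, so the builder line is only needed when running elsewhere, and the Delta format assumes a Delta Lake-enabled cluster.

    # Minimal Databricks-style ETL sketch (PySpark + Delta). Paths and columns are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("demo-etl").getOrCreate()

    # Ingest: read a raw CSV landing file with a header row.
    raw = (spark.read
           .option("header", "true")
           .option("inferSchema", "true")
           .csv("/mnt/raw/encounters.csv"))       # hypothetical landing path

    # Transform: drop duplicate records and stamp the ingestion date.
    cleaned = (raw
               .dropDuplicates(["encounter_id"])  # hypothetical business key
               .withColumn("ingest_date", F.current_date()))

    # Load: persist as a Delta table for downstream consumers.
    (cleaned.write
            .format("delta")
            .mode("overwrite")
            .save("/mnt/curated/encounters"))     # hypothetical curated path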

Posted 1 week ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Chennai, Bengaluru, Mumbai (All Areas)

Hybrid


Data Cleansing & Integration Project Delivery: Execute high visibility data programs as assigned by the Data Cleansing Manager. Utilize SAP data load solutions such as SAP Migration Cockpit and LSMW for data loading and template creation. FDO Data Change Management Methodology: Assist in defining data cleansing approaches using Mass Change functionality. Develop and prepare data cleansing strategies. Data Cleansing & Integration Technical Guidance: Understand SAP landscape and data flow to underlying/consumed systems to prevent data synchronization issues. Data Quality: Collaborate with the Data Quality (DQ) team to define DQ rules and enhance visibility of existing data quality. Data Governance: Work with the Data Governance (DG) team to ensure proper governance before implementing system changes. Conduct necessary data load testing in test systems. Data Sourcing: Maintain and update the data catalogue/data dictionary, creating a defined list of data sources indicating the best versions (golden copies). Data Ingestion: Collaborate with DG and project teams on data harmonization by integrating data from multiple sources. Develop sustainable integration routines and methods. Qualifications: Experience: Minimum of 6 years in data-related disciplines such as data management, quality, and cleansing. Technical Skills: Proven experience in delivering data initiatives (cleansing, integration, migrations) using established technical data change methodologies. Proficiency in handling large data sets with tools like Microsoft Excel and Power BI. Experience with SAP native migration and cleansing tools such as SAP Migration Cockpit, LSMW, and MASS. Knowledge of Master Data Management in SAP MDG, SAP ECC, and associated data structures. Collaboration: Ability to work effectively with internal cross-functional teams.

Posted 1 week ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Bengaluru

Work from Office


Educational Requirements: Bachelor of Engineering
Service Line: Data & Analytics Unit

Responsibilities
A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management and adherence to organizational guidelines and processes. You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
Knowledge of more than one technology
Basics of architecture and design fundamentals
Knowledge of testing tools
Knowledge of agile methodologies
Understanding of project life cycle activities on development and maintenance projects
Understanding of one or more estimation methodologies; knowledge of quality processes
Basics of the business domain, to understand the business requirements
Analytical abilities, strong technical skills, good communication skills
Good understanding of the technology and domain
Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles and modelling methods
Awareness of latest technologies and trends
Excellent problem solving, analytical and debugging skills

Technical and Professional Requirements:
Primary skills: PySpark, Spark, Python

Preferred Skills:
Technology->Analytics - Packages->Python - Big Data
Technology->Big Data - Data Processing->Spark
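
As a hedged illustration of the PySpark/Spark skills this listing names, here is a minimal Python sketch that registers a DataFrame as a temporary view and aggregates it with Spark SQL. The data values and column names are invented for illustration; this is a sketch of the technique, not project code.

    # PySpark sketch: register a DataFrame as a view and aggregate it with Spark SQL.
    # Data values and column names are invented for illustration.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("pyspark-demo").getOrCreate()

    orders = spark.createDataFrame(
        [("north", 120.0), ("south", 75.5), ("north", 42.0)],
        ["region", "amount"],
    )
    orders.createOrReplaceTempView("orders")

    # Aggregate with plain SQL over the registered view.
    totals = spark.sql(
        "SELECT region, SUM(amount) AS total_amount "
        "FROM orders GROUP BY region ORDER BY total_amount DESC"
    )
    totals.show()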

Posted 1 week ago

Apply

2.0 - 3.0 years

4 - 8 Lacs

Bengaluru

Work from Office


Responsibilities
A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation, design and deployment. You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs. You will create requirement specifications from the business needs, and define the to-be processes and detailed functional designs based on requirements. You will support configuring solution requirements on the products; understand any issues, diagnose their root cause, seek clarifications, and then identify and shortlist solution alternatives. You will also contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
Ability to work with clients to identify business challenges and contribute to client deliverables by refining, analyzing, and structuring relevant data
Awareness of latest technologies and trends
Logical thinking and problem solving skills, along with an ability to collaborate
Ability to assess current processes, identify improvement areas and suggest technology solutions
Knowledge of one or two industry domains

Technical and Professional Requirements:
Primary skills: Technology->Big Data - Data Processing->Spark

Preferred Skills:
Technology->Big Data - Data Processing->Spark

Educational Requirements: MCA, MSc, Bachelor of Engineering, BBA, BCom
Service Line: Data & Analytics Unit
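
As a hedged sketch of the Spark data-processing skill this listing names, here is a minimal Spark Structured Streaming example in Python. It uses Spark's built-in "rate" test source so it runs without external infrastructure; the window size, row rate, and run duration are arbitrary illustrative choices.

    # Spark Structured Streaming sketch using the built-in "rate" test source.
    # Window size, row rate, and run duration are arbitrary illustrative choices.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("streaming-demo").getOrCreate()

    # The rate source emits (timestamp, value) rows at a fixed rate.
    stream = spark.readStream.format("rate").option("rowsPerSecond", 5).load()

    # Count events in tumbling 10-second windows.
    counts = (stream
              .groupBy(F.window(F.col("timestamp"), "10 seconds"))
              .count())

    query = (counts.writeStream
             .outputMode("complete")   # emit the full updated counts each trigger
             .format("console")
             .start())
    query.awaitTermination(30)         # run for ~30 seconds in this demo, then return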

Posted 1 week ago

Apply

2.0 - 7.0 years

1 - 3 Lacs

Guwahati

Work from Office


Proficiency in Microsoft Excel / Google Sheets
Strong knowledge of Excel formulas
Experience with Pivot Tables
Knowledge of Macros (preferred)
Background in Mathematics (advantageous)
Prior experience in MIS reporting
If interested, kindly share your resume with your updated details to t.globalzonehr@gmail.com
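
As a hedged illustration of the pivot-table reporting this role centers on, here is a small Python/pandas equivalent of an Excel pivot: summing sales by region and month. All data values and column names are invented for illustration.

    # Pandas equivalent of an Excel pivot table: sum of sales by region x month.
    # All values and column names are invented for illustration.
    import pandas as pd

    df = pd.DataFrame({
        "region": ["North", "North", "South", "South"],
        "month":  ["Jan", "Feb", "Jan", "Feb"],
        "sales":  [100, 150, 80, 120],
    })

    pivot = pd.pivot_table(df, index="region", columns="month",
                           values="sales", aggfunc="sum")
    print(pivot)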

Posted 1 week ago

Apply