
1625 Data Processing Jobs - Page 16

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

0.0 - 5.0 years

2 - 3 Lacs

Noida, Ghaziabad, Faridabad

Work from Office

Source: Naukri

We are looking for a Data Entry Operator to update and maintain information on our company databases and computer systems. Data Entry Operator responsibilities include collecting and entering data in databases and maintaining accurate records.

Posted 1 week ago

Apply

10.0 - 15.0 years

18 - 20 Lacs

Hyderabad

Work from Office

Source: Naukri

We are looking for a Senior Data Engineer to lead the design and implementation of scalable data infrastructure and engineering practices. This role will be critical in laying down the architectural foundations for advanced analytics and AI/ML use cases across global business units. You'll work closely with the Data Science Lead, Product Manager, and other cross-functional stakeholders to ensure data systems are robust, secure, and future-ready.

Key Responsibilities:
- Architect and implement end-to-end data infrastructure, including ingestion, transformation, storage, and access layers, to support enterprise-scale analytics and machine learning.
- Define and enforce data engineering standards, design patterns, and best practices across the CoE.
- Lead the evaluation and selection of tools, frameworks, and platforms (cloud, open source, commercial) for scalable and secure data processing.
- Work with data scientists to enable efficient feature extraction, experimentation, and model deployment pipelines.
- Design real-time and batch processing architectures, including support for streaming data and event-driven workflows.
- Own the data quality, lineage, and governance frameworks to ensure trust and traceability in data pipelines.
- Collaborate with central IT, data platform teams, and business units to align on data strategy, infrastructure, and integration patterns.
- Mentor and guide junior engineers as the team expands, creating a culture of high performance and engineering excellence.

Qualifications:
- 10+ years of hands-on experience in data engineering, data architecture, or platform development.
- Strong expertise in building distributed data pipelines using tools such as Spark, Kafka, Airflow, or equivalent orchestration frameworks (a minimal orchestration sketch follows this posting).
- Deep understanding of data modeling, data lake/lakehouse architectures, and scalable data warehousing (e.g., Snowflake, BigQuery, Redshift).
- Advanced proficiency in Python and SQL; working knowledge of Java or Scala preferred.
- Strong experience with cloud-native data architectures (AWS, GCP, or Azure), including serverless, storage, and compute optimization.
- Proven experience architecting ML/AI-ready data environments, supporting MLOps pipelines and production-grade data flows.
- Familiarity with DevOps practices, CI/CD for data, and infrastructure-as-code (e.g., Terraform) is a plus.
- Excellent problem-solving skills and the ability to communicate technical solutions to non-technical stakeholders.
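To illustrate the kind of batch orchestration this posting describes, here is a minimal sketch of a daily ingest-transform-load pipeline expressed as an Apache Airflow DAG. It assumes Airflow 2.x; the DAG id, task bodies, and schedule are illustrative placeholders, not details from the posting.

```python
# Minimal illustrative Airflow 2.x DAG: daily ingest -> transform -> load.
# DAG id, function bodies, and schedule are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest():
    # Pull raw records from a source system into a staging area.
    print("ingesting raw data")


def transform():
    # Clean and reshape the staged records.
    print("transforming staged data")


def load():
    # Write the transformed records to the warehouse / lakehouse layer.
    print("loading curated data")


with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    ingest_task >> transform_task >> load_task
```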

Posted 1 week ago

Apply

12.0 - 17.0 years

32 - 37 Lacs

Pune

Work from Office

Source: Naukri

Senior Software Engineer - AWS Cloud Engineer

As a Senior Software Engineer - AWS Cloud Engineer with Convera, you will join a team of motivated and experienced engineers eager to expand their expertise into the dynamic world of Amazon Connect, a cutting-edge, cloud-based contact center solution that offers complete customization with scalable cloud technology. If you're looking to advance your career in software development, AWS, or AI, this is the perfect opportunity to upskill and work on innovative solutions.

In your role as a Senior AWS Cloud Engineer, you will:
- Architect and develop cloud solutions: lead the end-to-end design and development of robust data pipelines and data architectures using AWS tools and platforms, including AWS Glue, S3, RDS, Lambda, EMR, and Redshift.
- Analyze, implement, support, and provide recommendations for AWS cloud solutions.
- Design, deploy, and manage AWS network infrastructure using VPC, Transit Gateway, Direct Connect, Route 53, and AWS Security Groups, while also supporting on-premises networking technologies.
- Architect and deploy AWS infrastructure for hosting new and existing line-of-business applications using EC2, Lambda, RDS, S3, EFS, and AWS Auto Scaling.
- Ensure compliance with the AWS Well-Architected Framework and security best practices using IAM, AWS Organizations, GuardDuty, and Security Hub.
- Container orchestration: deploy and manage containerized applications using AWS ECS and EKS.
- Event-driven serverless architecture: design and implement event-driven serverless architectures using AWS Lambda, API Gateway, SQS, SNS, and EventBridge (a minimal Lambda example follows this posting).
- Implement and test system recovery strategies in accordance with the company's AWS Backup, Disaster Recovery (DR), and Business Continuity (BC) plans.
- Collaborate with AWS Technical Account Managers (TAMs) and customers to provide cloud strategy, cost optimization, and technology roadmaps that align with business objectives.
- Design AWS cloud architectures following Well-Architected guidelines, leveraging CloudFormation, Terraform, and AWS Control Tower.
- Actively participate in team meetings, project discussions, and cross-functional collaboration to enhance AWS cloud adoption and optimization.
- Maintain customer runbooks, automating and improving them with AWS-native solutions such as AWS Systems Manager, CloudWatch, and Lambda.
- Provide off-hours support on a rotational basis, including on-call responsibilities and scheduled maintenance windows.
- Contribute to internal R&D projects, validating and testing new processes, tools, and services for integration into Innovative Solutions offerings.
- Lead or contribute to internal process improvement initiatives, leveraging various DevOps tools to enhance automation and efficiency.
AWS services within the scope of this role are not limited to the ones specifically called out in this list of responsibilities.

A successful candidate for this position should have:
- Bachelor's degree in business or computer science and 12+ years of experience in software engineering or IT, including at least four years in a role whose primary responsibility is git-based application code development, DevOps engineering, and/or the development, maintenance, and support of CI/CD pipelines, or an appropriate combination of industry-related professional experience and education.
- Proven experience with AWS services such as EC2, S3, Lambda, CloudFormation, and VPC, among others.
- Skill in scripting languages such as Python, Bash, or PowerShell.
- Experience with Infrastructure as Code (IaC) tools such as Terraform and AWS CloudFormation, and with monitoring and logging tools such as AWS CloudWatch and the ELK stack.
- Strong understanding of cloud security best practices.
- Great communication and collaboration skills; ability to work independently and with a team.

Preferred Qualifications:
- AWS Certified Solutions Architect - Associate or Professional.
- AWS Certified DevOps Engineer - Professional.
- HashiCorp Certified: Terraform Associate.
- Experience with CI/CD pipelines and DevOps practices.
- Knowledge of scalable data architecture to ensure efficient and scalable data processing and storage solutions.

About Convera: Convera is the largest non-bank B2B cross-border payments company in the world. Formerly Western Union Business Solutions, we leverage decades of industry expertise and technology-led payment solutions to deliver smarter money movements to our customers - helping them capture more value with every transaction. Convera serves more than 30,000 customers ranging from small business owners to enterprise treasurers to educational institutions to financial institutions to law firms to NGOs. Our teams care deeply about the value we bring to our customers, which makes Convera a rewarding place to work. This is an exciting time for our organization as we build our team with growth-minded, results-oriented people who are looking to move fast in an innovative environment. As a truly global company with employees in over 20 countries, we are passionate about diversity; we seek and celebrate people from different backgrounds, lifestyles, and unique points of view. We want to work with the best people and ensure we foster a culture of inclusion and belonging.

We offer an abundance of competitive perks and benefits, including a competitive salary, great career growth and development opportunities in a global organization, and a flexible approach to work.
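As a rough illustration of the event-driven serverless pattern mentioned above (Lambda consuming messages from SQS), here is a minimal Python handler sketch. The function name and message fields are illustrative assumptions, not details from the posting; the SQS-to-Lambda trigger itself would be configured separately (for example via Terraform or CloudFormation).

```python
# Minimal sketch of an AWS Lambda handler consuming SQS messages.
# Assumes the Lambda is configured with an SQS event source mapping;
# the payload fields below are hypothetical.
import json


def handler(event, context):
    """Process each SQS record delivered in the Lambda invocation batch."""
    processed = 0
    for record in event.get("Records", []):
        # SQS delivers the message body as a string; assume JSON here.
        body = json.loads(record["body"])
        # Hypothetical business logic: route the message by its "type" field.
        message_type = body.get("type", "unknown")
        print(f"processing message {record.get('messageId')} of type {message_type}")
        processed += 1
    # Returning normally tells Lambda the whole batch succeeded.
    return {"processed": processed}
```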

Posted 1 week ago

Apply

4.0 - 7.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Source: Naukri

At Quanticate, we're pioneers in providing top-tier statistical and data management support to our clients. We're seeking a dedicated Clinical Data Manager I who is committed to upholding the highest standards, following procedures, and ensuring compliance with regulations, all while providing exceptional customer care. As a Clinical Data Manager I you will lead, coordinate, and action all tasks relating to Clinical Data Management from the start to the finish of a study, and project-manage studies across CDM functions.

Core Accountabilities (activities required of a Clinical Data Manager I include, but are not restricted to, the following):
- Contribute to the efficient running of the CDM department as part of the CDM leadership team.
- Ensure launch, delivery, and completion of all CDM procedures according to contractual agreements and relevant SOPs, guidelines, and regulations.
- Proactively keep abreast of current clinical data management developments and systems.
- Assist in the creation and review of in-house SOPs; research and provide input into in-house strategies and systems.
- Perform medical coding activities on projects, if assigned.
- Perform other reasonable tasks as requested by management.
- Ensure consistency of process and quality across projects.
- Project management for allocated projects: help plan and manage study timelines and resources; manage progress against schedules and report to management; perform project management across all functions for a study as appropriate; manage CRFs and all related tasks.
- Management of allocated staff: allocation of projects in conjunction with Project Management, as appropriate; performance reviews, as required; administer training and development of staff, as required.

Key Relationships:
- Act as the primary CDM contact, both external and internal, for Quanticate projects.
- Manage work assignment and delivery of project tasks to the data processing and programming team as required.
- Line management responsibilities for any assigned direct reports, including professional development/training and performance appraisals.

Requirements:
- Qualified to an appropriate standard, preferably to degree level in a life sciences subject.
- Four to seven years of relevant experience in the CRO Clinical Data Management domain.
- Extensive knowledge of

Posted 1 week ago

Apply

3.0 - 8.0 years

0 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Source: Foundit

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a GenAI + Full-Stack Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Requirements:
- 3-8 years of strong hands-on experience in software development, with a focus on AI/ML and Generative AI.
- Hands-on experience with Generative AI technologies, with at least one of the following: working with Large Language Models (LLMs) such as GPT, LLaMA, Claude, etc.; building intelligent systems using LangGraph, Agentic AI frameworks, or similar orchestration tools; implementing Retrieval-Augmented Generation (RAG), prompt engineering, and knowledge augmentation techniques (see the retrieval sketch after this posting).
- Proficiency in Python, including experience with data processing, API integration, and automation scripting.
- Demonstrated experience in the end-to-end SDLC (Software Development Life Cycle): requirement gathering, design, development, testing, deployment, and support.
- Proficiency with CI/CD pipelines and version control systems like Git.
- Experience with containerization technologies such as Docker, and orchestration using Kubernetes.
- Strong problem-solving and debugging skills, with an ability to write clean, efficient, and maintainable code.
- Excellent verbal and written communication skills, with the ability to collaborate effectively across technical and business teams.

About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future.

NTT DATA endeavors to make its website accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us to request accommodation. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status.
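To ground the Retrieval-Augmented Generation (RAG) requirement above, here is a minimal retrieval sketch in plain Python: documents are embedded, ranked by cosine similarity against the query, and the top matches would then be passed to an LLM as context. The embed() function is a hypothetical placeholder standing in for a real embedding model; no specific framework from the posting is assumed.

```python
# Minimal RAG retrieval sketch: rank documents by cosine similarity to a query.
# embed() is a hypothetical stand-in for a real embedding model.
import numpy as np


def embed(text: str) -> np.ndarray:
    # Placeholder: a real system would call an embedding model here.
    # This toy version hashes characters into a fixed-size unit vector.
    vec = np.zeros(64)
    for i, ch in enumerate(text.lower()):
        vec[(i + ord(ch)) % 64] += 1.0
    return vec / (np.linalg.norm(vec) + 1e-9)


def top_k(query: str, documents: list[str], k: int = 2) -> list[str]:
    """Return the k documents most similar to the query (cosine similarity)."""
    q = embed(query)
    scores = [float(np.dot(q, embed(doc))) for doc in documents]
    ranked = sorted(zip(scores, documents), reverse=True)
    return [doc for _, doc in ranked[:k]]


docs = [
    "Invoices are processed within 30 days of receipt.",
    "Support tickets are triaged by severity and age.",
    "Employee onboarding requires a signed NDA.",
]
context = top_k("How long does invoice processing take?", docs)
# The retrieved context would then be inserted into the LLM prompt.
print(context)
```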

Posted 1 week ago

Apply

10.0 - 11.0 years

50 - 60 Lacs

Bengaluru

Work from Office

Source: Naukri

About the position: Chevron ENGINE is looking for high-performing Technical Geophysicist candidates to join our Earth Science team. The role provides a wide variety of technical products to support asset teams across the enterprise in a range of development settings and basins: offshore, onshore, conventional and unconventional, exploration through to development.

Key responsibilities:
- Deliver key technical geophysical analysis products and interpretation such as: seismic-to-well tie, acoustic impedance inversion, acoustic FD earth model building, time-to-depth velocity modeling (a toy example follows this posting), depth uncertainty analysis, seismic data processing support, and 4D & CCUS (Carbon Capture, Utilization and Storage) feasibility modeling and analysis.
- Perform routine compliance tasks, automated and manual software compatibility testing for periodic system and software upgrades, and close coordination with Subsurface Platform Systems Engineers.
- Apply Petrel and geophysical software development skills.
- Team with US-based research and development groups focused on developing and deploying geophysical products and workflows.
- Communicate continually with asset and exploration teams spanning the globe.

Required qualifications:
- MSc degree in Earth Science (Geophysics preferred) from a deemed/recognized (AICTE) university.
- At least 5 years of industry-related experience.
- Industry experience in technical geophysics including, but not limited to, seismic interpretation, seismic-to-well tie, acoustic impedance inversion, acoustic FD earth model building, time-to-depth velocity modeling, depth uncertainty analysis, and seismic data processing support.
- Experience with Petrel and DELFI would be a differentiator; familiarity with Hampson-Russell, Jason, and seismic processing software will be a benefit.
- Understanding of physical processes associated with earth science, reservoir modeling and the subsurface.
- Good communication skills; works effectively in a team environment.
- Fundamental knowledge of geophysical workflows applied to the subsurface.
- Skill in using ML/AI to accelerate the performance and accuracy of reservoir characterization is a plus.
- Experience with geophysical applications within the oil and gas industry is preferred.
- C# programming skills or Ocean SDK experience will be differentiating.

Chevron ENGINE supports global operations, supporting business requirements across the world. Accordingly, the work hours for employees will be aligned to support business requirements. The standard work week will be Monday to Friday. Working hours are 8:00am to 5:00pm or 1:30pm to 10:30pm. Chevron participates in E-Verify in certain locations as required by law.
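As a small, self-contained illustration of one routine task mentioned above (time-to-depth velocity modeling), the sketch below converts two-way travel times to depth using a simple layered interval-velocity model. The velocities and sample times are made-up example values; real workflows in Petrel or similar tools are far more involved.

```python
# Illustrative time-to-depth conversion with a layered interval-velocity model.
# Velocities and times below are made-up example values.
import numpy as np

# Two-way travel time at layer tops (seconds) and one interval velocity (m/s)
# for each layer between consecutive samples.
twt = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
v_interval = np.array([1800.0, 2200.0, 2600.0, 3000.0])

# Depth increment per layer = interval velocity * one-way time thickness.
dt_one_way = np.diff(twt) / 2.0
depth = np.concatenate(([0.0], np.cumsum(v_interval * dt_one_way)))

for t, z in zip(twt, depth):
    print(f"TWT {t:4.2f} s  ->  depth {z:7.1f} m")
```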

Posted 1 week ago

Apply

4.0 - 12.0 years

8 - 12 Lacs

Hyderabad

Work from Office

Source: Naukri

Career Category: Operations

Job Description: Join Amgen's Mission of Serving Patients. At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission to serve patients living with serious illnesses drives all that we do. Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas - Oncology, Inflammation, General Medicine, and Rare Disease - we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lie within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

GCP Quality Compliance Manager - What you will do: The Quality Compliance Manager is a global role and part of the Process Quality team within the R&D Quality Organization. In this vital role you will work with a team of process-focused colleagues who execute Amgen's Process Quality strategy, which is vital to ensuring that Amgen's Research and Development standards (SOPs and associated documentation) are adequate, clear, and compliant with all applicable current regulations and quality requirements. The R&D Process Quality team supports the Quality Management System (QMS) across all areas of research at Amgen, from discovery through the full clinical development lifecycle. This team ensures that all of Amgen's business procedures meet internal and external quality standards and are managed for optimum efficiency and effectiveness. The Process Quality team also ensures that Amgen's R&D Business Process Network develops and manages fit-for-purpose standards (SOPs) that are continuously improved using quality by design (QbD) and risk management methods, including QMS analytics showing quality signals and trends. In addition, this individual will help support end users in R&D with the digital quality management system (DQMS), including queries, deviations, and Corrective and Preventive Actions (CAPAs). The Quality Compliance Manager will contribute to implementing strategies and providing leadership to ensure excellence in R&D quality processes, working globally with Business Process Owners as an integral team member to ensure compliance with regulations and other requirements.

Roles and Responsibilities: This role will work both independently and in a team environment. The primary responsibility is to support continuous improvement initiatives for R&D quality, along with any other operational or strategic activities assigned.
- Generate and review process area Knowledge Maps (spider maps, lessons learned, and data processing techniques, stored in a graph-based database for better search, analysis, and visualization) to help determine inherent and residual risks, document risk assessments, and collaborate with Business Process Owners and Quality Leads to ensure accurate risk classification and preventive actions.
- Support Amgen's procedural framework so that all procedures maintain compliance with relevant laws, regulations, and internal quality standards; work to ensure that procedures maintain the ethical and safe treatment of all research subjects and that all data has integrity.
- Provide real-time, site-level quality oversight using analytical tools to identify trends, weaknesses, and data quality issues.
- Perform focused quality control checks on-site and remotely at clinical trial locations, especially key target sites; offer independent and objective quality advice to local study teams.
- Conduct risk assessments to inform audit site selection and pre-inspection/mock inspection visits.
- Support site/sponsor inspection readiness and management, including preparation, conduct, response, and close-out phases.
- Ensure that all procedures are written clearly for the execution of Amgen's research tasks within a diverse, complex, and cross-functional team of researchers.
- Support incoming procedural change requests, including the assessment of changes (impact to the QMS, including traceability of changes across other document sets).
- Support the work of Business Process Owners and apply risk-based strategies consistently to identify and mitigate risks towards the continuous advancement of Amgen's R&D QMS.
- Apply industry-standard methodologies for optimal (standardized and lean) procedural documentation, and use technology to drive an efficient and effective knowledge management system.
- Support the application of process metrics (KQI, KPI - leading and lagging) and modern analytic methods across the Business Process Network to enable Management Reviews (periodic review by management to ensure QMS health is maintained).
- Collaborate with other quality professionals within R&D to support the QMS continuous improvement cycle (Plan, Do, Check, Act), including Deviation Management / Corrective and Preventive Actions (CAPA).

What we expect of you - Basic Qualifications and Experience: Master's degree and 4-6 years in Pharma and Biotechnology R&D Quality; OR Bachelor's degree and 6-8 years in Pharma and Biotechnology R&D Quality; OR Diploma and 10-12 years in Pharma and Biotechnology R&D Quality.

Functional Skills - Must-Have Skills: Exceptional attention to detail and accuracy in all deliverables. Ability to work independently and proactively in a fast-paced environment. Proficiency in Microsoft Office Suite (Word, Excel, PowerPoint, Outlook) and virtual collaboration tools (e.g., Teams, WebEx). Solid understanding of SOP/standards management and the methods/technology used to drive knowledge management across a diverse R&D environment.

Good-to-Have Skills: Familiarity with project management tools and methodologies. Knowledge of GCP, GLP and/or GPvP. Experience working in a multinational environment with global teams. Experience within biotech/pharmaceutical research, including the application of global regulations. Direct experience working with standard procedural documentation, including its creation and change control (requests for change and the execution of changes).

Soft Skills: Excellent verbal and written communication skills. High degree of professionalism and interpersonal skills. Strong problem-solving abilities and adaptability to changing priorities. Collaborative attitude and ability to build positive relationships across diverse teams. Resilience, discretion, and the ability to thrive under pressure.

What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team: careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 1 week ago

Apply

5.0 - 10.0 years

22 - 27 Lacs

Chennai

Work from Office

Source: Naukri

TransUnion's Job Applicant Privacy Notice

What We'll Bring: Data Pipeline Engineers on the Orion project are embedded within our engineering teams and support development and operations.

What You'll Bring: Lead Data Engineer

What We Offer: We are looking for an individual to be part of an autonomous, cross-functional agile/scrum team where everyone shares responsibility for all aspects of the work. The ideal candidate will have a strong interest in joining our growing Data Engineering and Analytics track of GFS Core Services and will drive building the next-generation suite of products and platforms by designing, coding, building and deploying highly scalable and robust solutions. We are looking for enthusiastic professionals who are excited to learn, love a good challenge, and are always looking for opportunities to contribute. Last but not least, we look for dedicated team players who enjoy collaboration and can work effectively with others to achieve common goals. TransUnion is currently seeking a Lead Data Engineer with 7+ years of experience to work in our Chennai office, India. You will be working with some of the latest tools and a great team of cross-functional engineers. We work with multiple technologies. This will be an opportunity to work on core services of an industrial-strength identity and risk solution by streamlining design and collaborating with the team to build an orchestration platform in the cloud.

Who We Are: At TransUnion, we are dedicated to finding ways information can be used to help people make better and smarter decisions. As a trusted provider of global information solutions, our mission is to help people around the world access the opportunities that lead to a higher quality of life, by helping organizations optimize their risk-based decisions and enabling consumers to understand and manage their personal information. Because when people have access to more complete and multidimensional information, they can make decisions that are more informed and achieve great things. Every day TransUnion offers our employees the tools and resources they need to find ways information can be used in diverse ways.

What you'll bring:
- Bachelor's degree in a quantitative field, plus 7+ years of work experience or equivalent practical experience.
- 5+ years of experience in Big Data technologies.
- Experience designing and implementing data pipelines.
- Experience with SQL, PostgreSQL and/or Redshift, or other data management, reporting and query tools.
- Big Data technologies: Hadoop HDFS, Hive, Spark, Kafka, Sqoop (a minimal streaming-ingest sketch follows this posting).
- Designing logical and physical data models, including data warehouse and data mart designs.
- Expertise in writing complex, highly optimized queries across large data sets, and in building data pipelines and data processing layers.
- Cloud system experience on AWS, Azure, or GCP - preferably GCP.

What you'll do:
- Coach, mentor, and lead a team of Data Engineers.
- Design, build, test and deploy cutting-edge Big Data solutions at scale.
- Extract, clean, transform, and analyze vast amounts of raw data from various data sources.
- Build data pipelines and API integrations with various internal systems.
- Proactively monitor, identify, and escalate issues or root causes of systemic issues.
- Evaluate and communicate technical risks effectively and ensure assignments are delivered on schedule with the desired quality.
- Work across Data Engineering, Data Architecture, and Data Visualization functions.

What we'll bring: At TransUnion, we have a welcoming and energetic environment that encourages collaboration and innovation; we're consistently exploring new technologies and tools to be agile. This environment gives our people the opportunity to hone current skills and build new capabilities, while discovering their genius. Come be a part of our team - you will work with great people, pioneering products and cutting-edge technology. This role is for a Lead Data Engineer who will operate as the lead for the Data Pipeline track and be responsible for development of TransUnion's global fraud solutions. We pride ourselves on working in a collaborative, cross-functional manner where all engineers are expected to contribute to design, build, deployment and operation of our cloud platform.

Location: Chennai. Job Type: Full-time day job. Impact You'll Make: N/A. This is a hybrid position and involves regular performance of job responsibilities virtually as well as in person at an assigned TU office location for a minimum of two days a week.

TransUnion Job Title: Lead Developer, Software Development
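As a rough sketch of the kind of pipeline work described above (Spark reading a Kafka stream and landing it in storage), the snippet below uses PySpark Structured Streaming. The broker address, topic name, and output paths are illustrative placeholders, and running it requires a Spark installation with the Kafka connector package; it is not part of the posting itself.

```python
# Illustrative PySpark Structured Streaming job: Kafka topic -> Parquet files.
# Broker, topic, and paths are hypothetical placeholders; requires the
# spark-sql-kafka connector package on the Spark classpath.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka_to_parquet_example").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
    .option("subscribe", "events-topic")                # placeholder topic
    .option("startingOffsets", "latest")
    .load()
    .select(col("key").cast("string"), col("value").cast("string"), col("timestamp"))
)

query = (
    events.writeStream.format("parquet")
    .option("path", "/data/landing/events")             # placeholder output path
    .option("checkpointLocation", "/data/checkpoints/events")
    .outputMode("append")
    .start()
)

query.awaitTermination()
```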

Posted 1 week ago

Apply

11.0 - 19.0 years

32 - 40 Lacs

Hyderabad

Work from Office

Source: Naukri

You strive to be an essential member of a diverse team of visionaries dedicated to making a lasting impact. Don't pass up this opportunity to collaborate with some of the brightest minds in the field and deliver best-in-class solutions to the industry. As a Senior Lead Data Architect at JPMorgan Chase within Consumer and Community Banking Data Technology, you are an integral part of a team that works to develop high-quality data architecture solutions for various software applications, platforms and data products. Drive significant business impact and help shape the global target-state architecture through your capabilities in multiple data architecture domains.

Job responsibilities:
- Represents the data architecture team at technical governance bodies and provides feedback regarding proposed improvements to data architecture governance practices.
- Evaluates new and current technologies using existing data architecture standards and frameworks.
- Regularly provides technical guidance and direction to support the business and its technical teams, contractors, and vendors.
- Designs secure, high-quality, scalable solutions and reviews architecture solutions designed by others.
- Drives data architecture decisions that impact data product platform design, application functionality, and technical operations and processes.
- Serves as a function-wide subject matter expert in one or more areas of focus.
- Actively contributes to the data engineering community as an advocate of firmwide data frameworks, tools, and practices in the Software Development Life Cycle.
- Influences peers and project decision-makers to consider the use and application of leading-edge technologies.
- Advises junior architects and technologists.

Required qualifications, capabilities, and skills:
- 7+ years of hands-on practical experience delivering data architecture and system designs, data engineering, testing, and operational stability.
- Advanced knowledge of architecture, applications, and technical processes, with considerable in-depth knowledge of the data architecture discipline and solutions (e.g., data modeling, native cloud data services, business intelligence, artificial intelligence, machine learning, data domain driven design, etc.).
- Practical cloud-based data architecture and deployment experience, preferably AWS.
- Practical SQL development experience in cloud-native relational databases, e.g., Snowflake, Athena, Postgres.
- Ability to deliver various types of data models with multiple deployment targets, e.g., conceptual, logical and physical data models deployed as operational vs. analytical data stores.
- Advanced in one or more data engineering disciplines, e.g., streaming, ELT, event processing.
- Ability to tackle design and functionality problems independently with little to no oversight.
- Ability to evaluate current and emerging technologies to select or recommend the best solutions for the future-state data architecture.

Preferred qualifications, capabilities, and skills:
- Financial services experience; card and banking a big plus.
- Practical experience with modern data processing technologies, e.g., Kafka streaming, DBT, Spark, Airflow, etc.
- Practical experience with data mesh and/or data lakes.
- Practical experience in machine learning/AI with Python development a big plus.
- Practical experience with graph and semantic technologies, e.g., RDF, LPG, Neo4j, Gremlin.
- Knowledge of architecture assessment frameworks, e.g., Architecture Tradeoff Analysis.

Posted 1 week ago

Apply

1.0 - 3.0 years

3 - 4 Lacs

Faridabad

Work from Office

Source: Naukri

We are seeking a highly detail-oriented and technically adept 3D Data Annotation Specialist to join our growing team. This role is critical in shaping high-quality datasets for training cutting-edge AI and computer vision models, particularly in domains such as LiDAR data processing and 3D object detection.

Qualifications:
- B.Tech in Computer Science, IT, or a related field preferred (others may also apply; strong analytical and software learning abilities required).
- Strong analytical and reasoning skills, with attention to spatial geometry and object relationships in 3D space.
- Basic understanding of 3D data formats (e.g., .LAS, .LAZ, .PLY) and visualization tools (a small loading example follows this posting).
- Ability to work independently while maintaining high quality standards.
- Excellent communication skills and the ability to collaborate in a fast-paced environment.
- Attention to detail and ability to work with precision in visual/manual tasks.
- Good understanding of basic geometry, coordinate systems, and file handling.

Preferred Qualifications:
- Prior experience in 3D data annotation or LiDAR data analysis.
- Exposure to computer vision workflows.
- Comfort working with large datasets and remote sensing data.

Key Responsibilities:
- Annotate 3D point cloud data with precision using specialized tools (training will be provided).
- Label and segment objects within LiDAR data, aerial scans, or 3D models.
- Follow annotation guidelines while applying logical and spatial reasoning to 3D environments.
- Collaborate with ML engineers and data scientists to ensure annotation accuracy and consistency.
- Provide feedback to improve annotation tools and workflow automation.
- Participate in quality control reviews and conduct re-annotation as needed.
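To give a concrete flavour of the 3D point cloud data this role annotates, here is a minimal Python sketch that loads a .PLY file and inspects its points. It assumes the open3d library is available, and the file path and the 2.0 m height threshold are placeholders; production annotation would be done in the specialised tools mentioned above.

```python
# Minimal sketch: load a .PLY point cloud and inspect its extent.
# Assumes the open3d package is installed; the file path is a placeholder.
import numpy as np
import open3d as o3d

pcd = o3d.io.read_point_cloud("sample_scene.ply")   # placeholder path
points = np.asarray(pcd.points)                     # (N, 3) array of x, y, z

print(f"{points.shape[0]} points loaded")
print("min corner:", points.min(axis=0))
print("max corner:", points.max(axis=0))

# A crude selection example: keep points above 2 m as a candidate object slice.
elevated = points[points[:, 2] > 2.0]
print(f"{elevated.shape[0]} points above 2.0 m on the z-axis")
```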

Posted 1 week ago

Apply

6.0 - 10.0 years

15 - 25 Lacs

Bengaluru

Work from Office

Source: Naukri

Who We Are: At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.

The Role: As a Data Engineer, you will leverage your expertise in Databricks, big data platforms, and modern data engineering practices to develop scalable data solutions for our clients. Candidates with healthcare experience, particularly with EPIC systems, are strongly encouraged to apply. This includes creating data pipelines, integrating data from various sources, and implementing data security and privacy measures. The Data Engineer will also be responsible for monitoring and troubleshooting data flows and optimizing data storage and processing for performance and cost efficiency.

Responsibilities:
- Develop data ingestion, data processing and analytical pipelines for big data, relational databases and data warehouse solutions.
- Design and implement data pipelines and ETL/ELT processes using Databricks, Apache Spark, and related tools.
- Collaborate with business stakeholders, analysts, and data scientists to deliver accessible, high-quality data solutions.
- Provide guidance on cloud migration strategies and data architecture patterns such as Lakehouse and Data Mesh.
- Provide pros/cons and migration considerations for private and public cloud architectures.
- Provide technical expertise in troubleshooting, debugging, and resolving complex data and system issues.
- Create and maintain technical documentation, including system diagrams, deployment procedures, and troubleshooting guides.
- Experience working with Data Governance, Data Security and Data Privacy (Unity Catalog or Purview).

If you're ready to embrace the power of data to transform our business and embark on an epic data adventure, then join us at Kyndryl. Together, let's redefine what's possible and unleash your potential.

Your Future at Kyndryl: Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won't find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.

Who You Are: You're good at what you do and possess the required experience to prove it. However, equally as important – you have a growth mindset; keen to drive your own personal and professional development. You are customer-focused – someone who prioritizes customer success in their work. And finally, you're open and borderless – naturally inclusive in how you work with others.

Required Technical and Professional Experience:
- 3+ years of consulting or client service delivery experience on Azure.
- Graduate/Postgraduate in computer science, computer engineering, or equivalent, with a minimum of 8 years of experience in the IT industry.
- 3+ years of experience developing data ingestion, data processing and analytical pipelines for big data, relational databases such as SQL Server, and data warehouse solutions such as Azure Synapse.
- Extensive hands-on experience implementing data ingestion, ETL and data processing.
- Hands-on experience with Big Data technologies such as Java, Python, SQL, ADLS/Blob, PySpark and Spark SQL, Databricks, HDInsight, and live streaming technologies such as Event Hub.
- Experience with cloud-based database technologies (Azure PaaS DB, AWS RDS and NoSQL).
- Cloud migration methodologies and processes, including tools like Azure Data Factory, Data Migration Service, etc.
- Experience with monitoring and diagnostic tools (SQL Profiler, Extended Events, etc.).
- Expertise in data mining, data storage and Extract-Transform-Load (ETL) processes.
- Experience with relational databases and expertise in writing and optimizing T-SQL queries and stored procedures.
- Experience using Big Data file formats and compression techniques.
- Experience with developer tools such as Azure DevOps, Visual Studio Team Server, Git, Jenkins, etc.
- Experience with private and public cloud architectures, pros/cons, and migration considerations.
- Excellent problem-solving, analytical, and critical thinking skills.
- Ability to manage multiple projects simultaneously while maintaining a high level of attention to detail.
- Communication skills: must be able to communicate with both technical and non-technical audiences and derive technical requirements with stakeholders.

Preferred Technical and Professional Experience:
- Cloud platform certification, e.g., Microsoft Certified: (DP-700) Azure Data Engineer Associate, AWS Certified Data Analytics – Specialty, Elastic Certified Engineer, Google Cloud Professional Data Engineer.
- Professional certification, e.g., Open Certified Technical Specialist with Data Engineering Specialization.
- Experience working with EPIC healthcare systems (e.g., Clarity, Caboodle).
- Databricks certifications (e.g., Databricks Certified Data Engineer Associate or Professional).
- Knowledge of GenAI tools, Microsoft Fabric, or Microsoft Copilot.
- Familiarity with healthcare data standards and compliance (e.g., HIPAA, GDPR).
- Experience with DevSecOps and CI/CD deployments.
- Experience in NoSQL database design.
- Knowledge of GenAI fundamentals and industry supporting use cases.
- Hands-on experience with Delta Lake and Delta Tables within the Databricks environment for building scalable and reliable data pipelines (a short write/read sketch follows at the end of this posting).

Being You: Diversity is a whole lot more than what we look like or where we come from; it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.

What You Can Expect: With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value.
Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you, we want you to succeed so that together, we will all succeed. Get Referred! If you know someone that works at Kyndryl, when asked ‘How Did You Hear About Us’ during the application process, select ‘Employee Referral’ and enter your contact's Kyndryl email address.
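As a small illustration of the Delta Lake work listed in the preferred experience above, the sketch below writes a DataFrame as a Delta table and reads it back with PySpark. It assumes a Databricks or Spark environment where the Delta Lake libraries are already available; the table path and columns are made-up placeholders.

```python
# Illustrative Delta Lake round trip on Databricks / Spark with Delta installed.
# Path and columns are made-up placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta_example").getOrCreate()

rows = [(1, "admitted", "2024-01-05"), (2, "discharged", "2024-01-07")]
df = spark.createDataFrame(rows, ["patient_id", "event", "event_date"])

# Write (or overwrite) a Delta table at a storage path.
df.write.format("delta").mode("overwrite").save("/mnt/lake/curated/patient_events")

# Read it back and query it like any other DataFrame.
events = spark.read.format("delta").load("/mnt/lake/curated/patient_events")
events.filter(events.event == "admitted").show()
```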

Posted 1 week ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Chennai, Bengaluru, Mumbai (All Areas)

Hybrid

Source: Naukri

Data Cleansing & Integration Project Delivery: Execute high-visibility data programs as assigned by the Data Cleansing Manager. Utilize SAP data load solutions such as SAP Migration Cockpit and LSMW for data loading and template creation.

FDO Data Change Management Methodology: Assist in defining data cleansing approaches using Mass Change functionality. Develop and prepare data cleansing strategies.

Data Cleansing & Integration Technical Guidance: Understand the SAP landscape and data flow to underlying/consumed systems to prevent data synchronization issues.

Data Quality: Collaborate with the Data Quality (DQ) team to define DQ rules and enhance visibility of existing data quality.

Data Governance: Work with the Data Governance (DG) team to ensure proper governance before implementing system changes. Conduct necessary data load testing in test systems.

Data Sourcing: Maintain and update the data catalogue/data dictionary, creating a defined list of data sources indicating the best versions (golden copies).

Data Ingestion: Collaborate with DG and project teams on data harmonization by integrating data from multiple sources. Develop sustainable integration routines and methods.

Qualifications:
- Experience: minimum of 6 years in data-related disciplines such as data management, quality, and cleansing.
- Technical skills: proven experience in delivering data initiatives (cleansing, integration, migrations) using established technical data change methodologies. Proficiency in handling large data sets with tools like Microsoft Excel and Power BI. Experience with SAP native migration and cleansing tools such as SAP Migration Cockpit, LSMW, and MASS. Knowledge of Master Data Management in SAP MDG, SAP ECC, and associated data structures.
- Collaboration: ability to work effectively with internal cross-functional teams.

Posted 1 week ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Bengaluru

Work from Office

Source: Naukri

Educational Requirements: Bachelor of Engineering. Service Line: Data & Analytics Unit.

Responsibilities: A day in the life of an Infoscion - As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management and adherence to organizational guidelines and processes. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities: Knowledge of more than one technology. Basics of architecture and design fundamentals. Knowledge of testing tools. Knowledge of agile methodologies. Understanding of project life cycle activities on development and maintenance projects. Understanding of one or more estimation methodologies. Knowledge of quality processes. Basics of the business domain to understand the business requirements. Analytical abilities, strong technical skills, good communication skills. Good understanding of the technology and domain. Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles and modelling methods. Awareness of latest technologies and trends. Excellent problem solving, analytical and debugging skills.

Technical and Professional Requirements: Primary skills: PySpark, Spark, Python. Preferred Skills: Technology->Analytics - Packages->Python - Big Data, Technology->Big Data - Data Processing->Spark

Posted 1 week ago

Apply

2.0 - 3.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Source: Naukri

Responsibilities: A day in the life of an Infoscion - As part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation, design and deployment. You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs. You will create requirement specifications from the business needs, define the to-be processes and detailed functional designs based on requirements. You will support configuring solution requirements on the products; understand any issues, diagnose their root cause, seek clarifications, and then identify and shortlist solution alternatives. You will also contribute to unit-level and organizational initiatives with an objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities: Ability to work with clients to identify business challenges and contribute to client deliverables by refining, analyzing, and structuring relevant data. Awareness of latest technologies and trends. Logical thinking and problem-solving skills, along with an ability to collaborate. Ability to assess current processes, identify improvement areas and suggest technology solutions. Knowledge of one or two industry domains.

Technical and Professional Requirements: Primary skills: Technology->Big Data - Data Processing->Spark. Preferred Skills: Technology->Big Data - Data Processing->Spark.

Educational Requirements: MCA, MSc, Bachelor of Engineering, BBA, BCom. Service Line: Data & Analytics Unit

Posted 1 week ago

Apply

2.0 - 7.0 years

1 - 3 Lacs

Guwahati

Work from Office

Source: Naukri

Proficiency in Microsoft Excel / Google Sheets. Strong knowledge of Excel formulas. Experience with Pivot Tables. Knowledge of Macros (preferred). Background in Mathematics (advantageous). Prior experience in MIS reporting. If interested, kindly share your resume with your updated details at t.globalzonehr@gmail.com

Posted 1 week ago

Apply

0.0 - 4.0 years

1 - 4 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Source: Naukri

We are looking for a Computer Operator who can perform defined tasks per documented instructions/processes. Both male and female candidates can apply. Both freshers and experienced candidates can apply. Basic computer knowledge is a must. Must be hardworking. Work from home.

Posted 1 week ago

Apply

0.0 - 4.0 years

1 - 4 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Source: Naukri

Applicants must have a relevant qualification and any number of years of experience in Data Entry or Back Office work, demonstrating proficiency with computers. Success in this role requires strong organizational skills, attention to detail, and the ability to handle multiple tasks.

Posted 1 week ago

Apply

0.0 - 2.0 years

1 - 5 Lacs

Hyderabad, Bengaluru, Delhi / NCR

Work from Office

Source: Naukri

Call Handling and Messaging: Answer inbound calls from potential job seekers, listen to their needs, and qualify them. Provide information on WhatsApp. Pass qualified leads to the recruitment team in a professional and timely manner. Work from home.

Required candidate profile: Immediate joiner. Work from home. Candidate should be from Hyderabad, New Delhi, Mumbai, Pune, or Bangalore.

Posted 1 week ago

Apply

2.0 - 3.0 years

2 - 3 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Source: Foundit

Accountabilities: As a Nonclinical SEND Data Associate, you will provide support across nonclinical functional areas, promoting and enforcing the use of nonclinical data standards. You will maintain metadata libraries, review and assess the impact of newly released and updated data standards, and communicate AZ data requirements to external vendors. You will also be responsible for quality validation and management of nonclinical datasets, training internal customers, assisting in warehousing and visualization activities, and ensuring submission-ready datasets are produced.

Essential Skills/Experience: - Experience in a scientific environment - Knowledge of CDISC standards - Understanding of nonclinical study designs, data and documentation - Basic knowledge of regulatory guidelines and industry standards (FDA, ICH/GLP, PhUSE) - Experience in data process builds - Experience with LIMS and SEND solution software - Bachelor's degree (B.A./B.S.) or equivalent in a scientific or related discipline - Ability to communicate effectively (written and spoken) in English

Desirable Skills/Experience: - Open to candidates with diverse skills and experiences

Posted 1 week ago

Apply

0.0 years

1 - 3 Lacs

Ahmedabad

Work from Office

Source: Naukri

Ready to shape the future of work? At Genpact, we don't just adapt to change - we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Mega Virtual Drive for Customer Service roles - English + Hindi Language on 13th June 2025 (Friday) || Ahmedabad Location
Date: 13-June-2025 (Friday)
MS Teams meeting ID: 495 160 646 115 5
MS Teams Passcode: f75F7qa3
Time: 12:00 PM - 1:00 PM
Job Location: Ahmedabad (Work from office)
Languages Known: Hindi + English
Shifts: Flexible with any shift

Responsibilities
• Respond to customer queries and customers' concerns
• Provide support for data collection to enable recovery of the account for the end user
• Maintain a deep understanding of client processes and policies
• Reproduce customer issues and escalate product bugs
• Provide excellent customer service to our customers
• Exhibit a capacity for critical thinking and analysis
• Showcase a proven work ethic, with the ability to work well both independently and within the context of a larger collaborative environment

Qualifications we seek in you - Minimum qualifications
• Graduate (any discipline except law)
• Only freshers are eligible
• Fluency in English and Hindi is mandatory

Preferred qualifications
• Effective probing, analyzing and understanding skills
• Analytical skills with a customer-centric approach
• Excellent proficiency in written English and a neutral English accent
• Ability to work on a flexible schedule (including weekend shifts)

Why join Genpact? Be a transformation leader - work at the cutting edge of AI, automation, and digital innovation. Make an impact - drive change for global enterprises and solve business challenges that matter. Accelerate your career - get hands-on experience, mentorship, and continuous learning opportunities. Work with the best - join 140,000+ bold thinkers and problem-solvers who push boundaries every day. Thrive in a values-driven culture - our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress. Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training. **Note: Please keep your E-Aadhaar card handy while appearing for the interview.

Posted 1 week ago

Apply

5.0 - 10.0 years

17 - 19 Lacs

Gurugram

Work from Office

Source: Naukri

The Finance Automation COE team within the Finance Technology Data & Transformation (FTD&T) group is responsible for strengthening business engagement framework, identifying and delivering results on critical projects and initiatives leveraging process automation and the use of advanced technologies. We are looking for a motivated colleague who is collaborative and passionate about transforming processes using automation tools & capabilities. The individual is expected to play a critical role of partnering with Controllership and broader Finance leadership teams for regulatory & treasury processes supporting Fast Forward initiatives. The position is expected to develop automated reports and analytical capabilities by leveraging tools like Tableau & Power Products, including process analysis, and partner with technical teams to deploy solutions for various finance processes. A strong focus on process optimization and driving results is required. Key Responsibilities: Collaborate with business partners, product owners and developers to conceptualize & deliver analytical and reporting capabilities for regulatory reporting initiatives associated to Fast Forward processes undertaken under Automation COE. Design and develop dashboards in Tableau, power products & similar reporting platforms. Develop SQL queries, scripts and routines to automate data processing. Work with complex data sets to generate insightful data visualization and reports to aid in decision-making and demonstrate value to stakeholders. Design and develop business process flows, UI using tools like Power Automate & ACE. Lead training sessions and create comprehensive documentation to empower end users to leverage self-servicing capabilities like Power BI & Power Apps to automate simple processes. Able to support agile development life-cycle, including writing user stories, support solving issues that arise during development, support SIT & UAT and facilitate deployment of the code. Partner with business SMEs and product owners to design a solution working in Agile environment. Skills required: 5+ years of experience on reporting platforms like Tableau, MicroStrategy & Power Applications. Advanced proficiency in SQL and familiarity with other relational database technologies. Strong analytical, problem-solving and project management skills, coupled with a continuous improvement mindset. Innovative mindset and experience in evaluating business processes to identify opportunities for improvement and automation. Strong communication and written skills, with the ability to interact with and present to all levels of the organization. Proven ability to build and leverage relationships and influence key partners to drive collaboration. Awareness of Automation tools and capabilities like Data Watch, ML, Business Process Management (BPM) and open-source features to make recommendations for the identified opportunities. Development & delivery of projects by using any of the tools will be a plus. Superior problem-solving and analytical skills, strong learning agility, curiosity and willingness to embrace new challenges. Exposure to agile methodologies, ability to coordinate multiple priorities at once and work in a dynamic, time-critical environment. bachelors degree in Computer Science, Engineering or Finance, Technologies or similar field preferred We back you with benefits that support your holistic we'll-being so you can be and deliver your best. 
This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
- Competitive base salaries
- Bonus incentives
- Support for financial well-being and retirement
- Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
- Flexible working model with hybrid, onsite, or virtual arrangements depending on role and business need
- Generous paid parental leave policies (depending on your location)
- Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
- Free and confidential counseling support through our Healthy Minds program
- Career development and training opportunities
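The SQL automation responsibility above lends itself to a short illustration. The query below is a minimal, hedged sketch of an automated regulatory-reporting extract that a Tableau or Power BI dashboard could consume; the table reg_balances and every column name in it are hypothetical and stand in for whatever the actual Finance data model provides.

-- Illustrative only: reg_balances and its columns are invented for this sketch.
-- Aggregates period-end balances by legal entity and regulatory category for a dashboard extract.
SELECT
    report_period,
    legal_entity,
    regulatory_category,
    SUM(balance_amount)        AS total_balance,
    COUNT(DISTINCT account_id) AS account_count
FROM reg_balances
WHERE report_period = '2024-12'
GROUP BY report_period, legal_entity, regulatory_category
ORDER BY legal_entity, regulatory_category;

A query like this would typically be scheduled (for example via a stored procedure or an extract refresh) so the dashboard picks up each new reporting period without manual intervention.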

Posted 1 week ago

Apply

3.0 - 8.0 years

9 - 14 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Naukri logo

Handle all Data/BI responsibilities, with a major focus on Azure Data Factory (ADF) development and team management.

Key Responsibilities:

Data Warehouse Development:
- Design and implement scalable and efficient data warehouse solutions.
- Develop complex SQL Server-based solutions, including T-SQL queries, stored procedures, and performance tuning (an illustrative stored procedure follows this listing).
- Optimize SQL Server databases, develop T-SQL scripts, and improve query performance.

ETL Development and Maintenance:
- Build and optimize ETL workflows using Azure Data Factory (ADF) for data integration from multiple sources.
- Ensure high-performance data pipelines for large-scale data processing.
- Integrate and automate data processes using Azure Functions to extend ETL capabilities.

Cloud Integration:
- Implement cloud-native solutions leveraging Azure SQL Database, Azure Functions, and Synapse Analytics.
- Support hybrid data integration scenarios combining on-premises and Azure services.

Data Governance and Quality:
- Establish and maintain robust data quality frameworks and governance standards.
- Ensure consistency, accuracy, and security of data across all platforms.

Leadership and Collaboration:
- Lead a team of BI and data professionals, providing mentorship and technical direction.
- Partner with stakeholders to understand business requirements and deliver data-driven solutions.
- Define project goals, timelines, and resources for successful execution.
- Be flexible in supporting multiple IT platforms.
- Manage day-to-day activities: Jira requests, SQL execution, access requests, resolving alerts, and updating tickets.
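To illustrate the T-SQL development called out above, here is a minimal, hedged sketch of a warehouse load procedure. The schema and object names (stg.Customer, dw.DimCustomer, dw.usp_LoadDimCustomer) are hypothetical and only show the general pattern of upserting staged data into a dimension table.

-- Illustrative T-SQL sketch; all object and column names are hypothetical.
CREATE OR ALTER PROCEDURE dw.usp_LoadDimCustomer
AS
BEGIN
    SET NOCOUNT ON;

    -- Upsert staged rows into the warehouse dimension.
    MERGE dw.DimCustomer AS tgt
    USING stg.Customer   AS src
        ON tgt.CustomerKey = src.CustomerKey
    WHEN MATCHED THEN
        UPDATE SET tgt.CustomerName = src.CustomerName,
                   tgt.Country      = src.Country,
                   tgt.UpdatedAt    = SYSUTCDATETIME()
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (CustomerKey, CustomerName, Country, UpdatedAt)
        VALUES (src.CustomerKey, src.CustomerName, src.Country, SYSUTCDATETIME());
END;

In an ADF pipeline, a procedure like this would usually run as a Stored Procedure activity after the copy step that lands data in the staging table.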

Posted 1 week ago

Apply

5.0 - 7.0 years

10 - 14 Lacs

Pune

Work from Office

Naukri logo

Project description
A Tagetik Developer is responsible for designing, developing, and maintaining financial performance management solutions using the Tagetik platform. This role involves working closely with finance and IT teams to ensure the effective implementation and support of Tagetik applications.

Responsibilities
- Development and Customization: Develop and customize Tagetik applications, including creating processes, workflows, and ETL (Extract, Transform, Load) solutions.
- Reporting and Analytics: Design and develop financial reports and dashboards using Tagetik's reporting tools and SQL queries (a generic SQL sketch follows this listing).
- Data Integration: Manage data integration processes, ensuring accurate data flow between Tagetik and other systems.
- System Configuration: Configure Tagetik applications to meet business requirements, including setting up financial models, budgeting, and forecasting modules.
- Support and Maintenance: Provide ongoing support and maintenance for Tagetik applications, including troubleshooting issues and implementing enhancements.
- User Training: Conduct training sessions for end users to ensure they are proficient in using Tagetik applications.
- Collaboration: Work closely with functional heads and stakeholders to understand requirements and deliver solutions.

Skills

Must have
- Overall IT experience of 5-7 years, with a minimum of 3 years in Tagetik application development.
- Hands-on experience with Analytical Information Hub (AIH).
- Expertise in ETL and DTPs.
- Experience in developing Forms and Reports.
- Proficiency in Tagetik Workflow, Data Processing, and Process Cockpit.
- Hands-on experience with JBOSS and Control-M is an added advantage.
- Problem-Solving: Ability to investigate and resolve complex problems.
- Documentation: Experience in creating technical design documents, unit test scripts, and code migration documents.
- Implementation: Minimum of two implementations.
- Support: Deliver functional and technical Tagetik consolidation support to meet client needs.

Preferred Qualifications:
- Experience with financial modeling and forecasting within the Tagetik platform.
- Knowledge of budgeting, planning, and consolidation processes.
- Familiarity with other financial performance management tools and technologies.

Nice to have
N/A

Other Languages
English: C2 Proficient

Seniority
Senior
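Since the role combines Tagetik consolidation work with SQL-based reporting, a generic, platform-agnostic sketch may help illustrate the kind of data-quality query such work involves. Everything here is hypothetical: the staging table ic_transactions, its columns, and the assumption that matched intercompany postings are recorded with offsetting signs.

-- Illustrative, platform-agnostic sketch; ic_transactions and its columns are invented.
-- Pre-consolidation check: if matched intercompany postings carry offsetting signs,
-- the group-wide intercompany total should net to zero in every period.
SELECT
    period,
    SUM(amount) AS unmatched_total
FROM ic_transactions
GROUP BY period
HAVING SUM(amount) <> 0;   -- any rows returned flag periods with unmatched balances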

Posted 1 week ago

Apply

5.0 - 7.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Naukri logo

Project description
A Tagetik Developer is responsible for designing, developing, and maintaining financial performance management solutions using the Tagetik platform. This role involves working closely with finance and IT teams to ensure the effective implementation and support of Tagetik applications.

Responsibilities
- Development and Customization: Develop and customize Tagetik applications, including creating processes, workflows, and ETL (Extract, Transform, Load) solutions.
- Reporting and Analytics: Design and develop financial reports and dashboards using Tagetik's reporting tools and SQL queries.
- Data Integration: Manage data integration processes, ensuring accurate data flow between Tagetik and other systems.
- System Configuration: Configure Tagetik applications to meet business requirements, including setting up financial models, budgeting, and forecasting modules.
- Support and Maintenance: Provide ongoing support and maintenance for Tagetik applications, including troubleshooting issues and implementing enhancements.
- User Training: Conduct training sessions for end users to ensure they are proficient in using Tagetik applications.
- Collaboration: Work closely with functional heads and stakeholders to understand requirements and deliver solutions.

Skills

Must have
- Overall IT experience of 5-7 years, with a minimum of 3 years in Tagetik application development.
- Hands-on experience with Analytical Information Hub (AIH).
- Expertise in ETL and DTPs.
- Experience in developing Forms and Reports.
- Proficiency in Tagetik Workflow, Data Processing, and Process Cockpit.
- Hands-on experience with JBOSS and Control-M is an added advantage.
- Problem-Solving: Ability to investigate and resolve complex problems.
- Documentation: Experience in creating technical design documents, unit test scripts, and code migration documents.
- Implementation: Minimum of two implementations.
- Support: Deliver functional and technical Tagetik consolidation support to meet client needs.

Preferred Qualifications:
- Experience with financial modeling and forecasting within the Tagetik platform.
- Knowledge of budgeting, planning, and consolidation processes.
- Familiarity with other financial performance management tools and technologies.

Nice to have
N/A

Other Languages
English: C2 Proficient

Seniority
Senior

Posted 1 week ago

Apply

5.0 - 8.0 years

11 - 15 Lacs

Pune

Work from Office

Naukri logo

Project description
Are you passionate about leveraging the latest technologies for strategic change? Do you enjoy problem solving in clever ways? Are you organized enough to drive change across complex data systems? If so, you could be the right person for this role. As an experienced data engineer, you will join a global data analytics team in our Group Chief Technology Officer / Enterprise Architecture organization, supporting our strategic initiatives, which range from portfolio health to integration.

Responsibilities
- Help the Group Enterprise Architecture team develop our suite of EA tools and workbenches
- Work in the development team to support the development of portfolio health insights
- Build data applications from the cloud infrastructure to the visualization layer
- Produce clear and commented code
- Produce clear and comprehensive documentation
- Play an active role with technology support teams and ensure deliverables are completed or escalated on time
- Provide support for any related presentations, communications, and trainings
- Be a team player, working across the organization with the skills to indirectly manage and influence
- Be a self-starter willing to inform and educate others

Skills

Must have
- B.Sc./M.Sc. degree in computing or similar
- 5-8+ years' experience as a Data Engineer, ideally in a large corporate environment
- In-depth knowledge of SQL and data modelling/data processing (an illustrative metrics query follows this listing)
- Strong experience working with Microsoft Azure
- Experience with visualization tools like Power BI (or Tableau, QlikView, or similar)
- Experience working with Git, JIRA, GitLab
- Strong flair for data analytics
- Strong flair for IT architecture and IT architecture metrics
- Excellent stakeholder interaction and communication skills
- Understanding of the performance implications of design decisions, to deliver performant and maintainable software
- Excellent end-to-end SDLC process understanding
- Proven track record of delivering complex data apps on tight timelines
- Fluent in English, both written and spoken
- Passionate about development with a focus on data and cloud
- Analytical and logical, with strong problem-solving skills
- A team player, comfortable with taking the lead on complex tasks
- An excellent communicator who is adept at handling ambiguity and communicating with both technical and non-technical audiences
- Comfortable working in cross-functional global teams to effect change
- Passionate about learning and developing your hard and soft professional skills

Nice to have
- Experience working in the financial industry
- Experience in complex metrics design and reporting
- Experience in using artificial intelligence for data analytics

Other Languages
English: C1 Advanced

Seniority
Senior
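The SQL and architecture-metrics requirements above can be illustrated with a small, hedged sketch. The table app_inventory and its columns are hypothetical names for an application-portfolio dataset; the query computes one simple portfolio-health metric, the share of applications in each lifecycle status per business unit.

-- Illustrative sketch; app_inventory and its columns are invented for this example.
SELECT
    business_unit,
    lifecycle_status,
    COUNT(*) AS app_count,
    CAST(100.0 * COUNT(*)
         / SUM(COUNT(*)) OVER (PARTITION BY business_unit) AS DECIMAL(5,2)) AS pct_of_unit
FROM app_inventory
GROUP BY business_unit, lifecycle_status
ORDER BY business_unit, pct_of_unit DESC;

A result set shaped like this feeds naturally into a Power BI or Tableau visual without further transformation.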

Posted 1 week ago

Apply