5.0 - 10.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Job Description
Where you'll work: India (Remote)

Engineering at GoTo
We're the trailblazers of remote work technology. We build powerful, flexible work software that empowers everyone to live their best life, at work and beyond. And blaze even more trails along the way. There's ample room for growth, so you can blaze your own trail here too. When you join a GoTo product team, you'll take on a key role in this process and see your work used by millions of users worldwide.

Your Day to Day
As a Senior Software Engineer - Big Data, you will:
- Design, develop, and maintain robust, scalable, and efficient ETL/ELT pipelines to process structured and unstructured data from various sources.
- Apply expertise in Python programming.
- Leverage AWS services (e.g., S3, EKS, Lambda, EMR) to architect and implement cloud-native data solutions.
- Work with Apache Spark and Databricks to process large-scale datasets, optimize performance, and build reusable data transformations.
- Design and implement data models (both relational and dimensional) that support analytics, reporting, and machine learning use cases.
- Schedule, monitor, and orchestrate workflows using Apache Airflow or equivalent tools.
- Collaborate with analysts, data scientists, and business stakeholders to deliver trusted, high-quality data for downstream consumption.
- Build data quality checks, logging, monitoring, and alerting to ensure pipeline reliability and visibility.
- Develop SQL-based transformations and optimize queries for performance in cloud data warehouses and lakehouses.
- Enable data-driven decisions by supporting self-service BI tools like Tableau, ensuring accurate and timely data availability.
- Ensure adherence to data governance, security, and compliance requirements.
- Mentor junior engineers and contribute to engineering best practices, including CI/CD, testing, and documentation.

What We're Looking For
As a Senior Software Engineer - Big Data, your background will look like:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering or similar roles, with a proven ability to build and scale end-to-end data pipelines.
- Strong expertise in ETL/ELT development, data ingestion, and transformation using SQL and scripting languages (Python preferred).
- Hands-on experience with Apache Spark and Databricks for big data processing.
- In-depth knowledge of AWS services such as S3, Hive, Lambda, and EMR.
- Proficiency in data modeling, including dimensional and normalized models.
- Experience with Airflow or similar orchestration frameworks.
- Familiarity with BI tools like Tableau for reporting and dashboarding.
- Strong understanding of data warehousing, lakehouse architectures, and modern data stack concepts.
- Excellent problem-solving skills, communication, and the ability to work in an agile and collaborative environment.

At GoTo, authenticity and inclusive culture are key to our thriving workplace, where diverse perspectives drive innovation and growth. Our team of GoGetters is passionate about learning, exploring, and working together to achieve success while staying committed to delivering exceptional experiences for our customers. We take pride in supporting our employees with comprehensive benefits, wellness programs, and global opportunities for professional and personal development. By maintaining an inclusive environment, we empower our teams to do their best work, make a meaningful impact, and grow their careers. Learn more.
Posted 2 weeks ago
5.0 - 10.0 years
25 - 30 Lacs
Hyderabad
Work from Office
Business Data Technologies (BDT) seeks a Software Development Manager to lead initiatives in data privacy and comprehension while advancing the organization's mission of accelerating data-driven innovation and business insights. The role involves managing a team focused on building innovative systems capable of classifying, protecting, and enriching SDO data at scale for AI and data analytics use cases, while raising the bar on global customer trust.

Key Responsibilities:
1. Lead and develop engineering teams while providing mentorship and leadership on complex technology issues
2. Own the full technology lifecycle, including development, operations, and system deprecation
3. Drive both the technology vision and the business vision for the team
4. Guide software and database engineers in determining appropriate strategies

Required Qualifications:
1. 5+ years of software development experience and 3+ years of people management experience
2. Strong technical background with a proven ability to execute both strategically and tactically
3. Demonstrated experience working with cross-functional teams and exceptional problem-solving skills
4. Customer-focused mindset with the ability to lead teams handling complex software problems at the architectural level

The role aligns with BDT's core tenets of protecting data privacy, security, and compliance as the first priority, while maintaining data quality and timeliness of insights. The successful candidate will contribute to BDT's vision of providing trustworthy, intuitive, and cost-efficient solutions for Amazon's growing analytics needs.

In this role, you will be responsible for:
- Building, maintaining, and organizing your team
- Defining your technical strategy and product roadmap
- Defining, measuring, and reporting on your key performance and operational excellence metrics
- Recruiting and retaining top talent
- Driving clarity in highly ambiguous technical environments
- Developing long-term technical roadmaps
- Guiding and coaching developers
- Managing projects effectively
- Communicating effectively with both technical and non-technical audiences
- Motivating your team to achieve results in a fast-paced environment

A day in the life
A day in the life of this role is a good mix of managing multiple programs, focusing on the long term, managing the operational health of systems, and making high-impact decisions.

About the team
This is a team with a vision to enable BDT's AI-powered experience, where every Amazon employee can have natural, insightful conversations with their data, from discovery to insights to actions, by implementing data classification at scale and accelerating Amazon's data-driven culture.

- 10+ years of engineering experience
- 5+ years of engineering team management experience
- Experience partnering with product or program management teams
- Experience designing or architecting (design patterns, reliability, and scaling) new and existing systems
Posted 2 weeks ago
7.0 - 12.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Senior Azure Databricks Engineer

We are looking for a Senior Azure Databricks Engineer to support and maintain our internal BI platform, used by our Finance and Business Operations teams. This is a hands-on technical role focused on backend data operations, including data ingestion, transformation, and CI/CD support within a cloud-based data warehouse environment.

Key Responsibilities:
- Ensure stable operation of the internal BI platform used by Finance and Business Operations.
- Develop, maintain, and troubleshoot data pipelines for ingestion, transformation, and load using Azure Databricks (PySpark, SQL).
- Support and optimize CI/CD pipelines (Azure DevOps) for smooth deployments and minimal downtime.
- Collaborate with BI front-end analysts, IT teams, and business stakeholders to ensure alignment of data needs and delivery.
- Monitor and improve system performance, resolve incidents, and ensure data quality and consistency.
- Maintain data architecture standards and support platform scalability and compliance.
- Integrate data from systems like D365 Finance & Operations and other business applications.
- Work with Azure services such as Data Lake, Key Vaults, Service Principals, and SQL Database.
- Maintain proper documentation of processes, configurations, and procedures.
- Participate in improvement initiatives to enhance platform efficiency and usability.

What you need to succeed:
- 7+ years of experience with business data analytics platforms.
- Strong hands-on experience with Azure Databricks, PySpark, and Spark SQL or SQL.
- Solid understanding of CI/CD pipelines (preferably with Azure DevOps) and troubleshooting deployment issues.
- Proficiency in Python and working knowledge of shell scripting.
- Experience with data ingestion, ETL processes, and managing large-scale data pipelines.
- Experience with Azure services such as Azure Key Vault, Azure SQL, Azure Data Lake, and Service Principals.
- Understanding of data governance, security standards, and handling of sensitive data.
- Ability to work closely with both IT and finance/business stakeholders.
- Good knowledge of data integration from sources like D365 F&O, Unit4, and the Azure Portal.
- Strong analytical, problem-solving, collaboration, and communication skills.

Department: CFO | Remote status: Hybrid | Employment type: Full-time | Employment level: First/Mid-Level Officials | Application deadline: 30 June 2025 | Contact: Lead - Talent Acquisition | Location: Bengaluru

OUR POWER IS CURIOSITY, CREATION AND INNOVATION
We believe you love to experiment, challenge the established, co-create, develop and cultivate. Together we can explore new answers to today's challenges and future opportunities, and talk about how industrial digitalisation can be part of the solution for a better tomorrow. We believe that different perspectives are crucial for developing game-changing technology for a better tomorrow. Join us in taking on this challenge!

About Kongsberg Digital
Kongsberg Digital is a provider of next-generation software and digital solutions to customers within maritime, oil & gas and utilities. Together with the rest of KONGSBERG, Kongsberg Digital offers solutions within autonomy, smart data, augmented reality and other areas. Join Kongsberg Digital as we pursue our mission to digitalize the world's industries for a better tomorrow. We truly believe that technology will drive more efficient and sustainable operations, making the oil sector more energy efficient, ships less polluting and green energy future-proof.
Founded in 2016 | Co-workers: 1,316
Posted 2 weeks ago
5.0 - 10.0 years
25 - 30 Lacs
Mumbai
Work from Office
Data Privacy and Governance Consultant
Posted: 28/05/2025 | Closing Date: 28/06/2025 | Job Type: Permanent - Full Time | Location: Mumbai | Job Category: IT

Job Description
Are you ready to grow your career in our global tech hub? Zurich Cover-More helps people travel safely across the globe every day. We are there at every step of a traveller's journey, to keep them safe and help them out if something goes wrong. We are committed to providing reliable, fast, flexible and bespoke services for our customers as well as the many well-known brands we partner with!

So, what's the job?
- You'll support the implementation of the data governance framework across all Zurich Cover-More entities, ensuring alignment with regulations and Zurich standards.
- You'll collaborate with data owners and stakeholders to create guiding documents and procedures, including data quality, privacy, and retention rules.
- You'll manage metadata within the data catalogue, ensuring accuracy and supporting internal reviews and functionality tests.
- You'll engage with stakeholders across business units, IT, Legal, and Risk & Compliance to drive adoption of the data governance framework.
- You'll monitor the project's scope, timelines, resources, and communication plans, identifying risks and ensuring milestone delivery.
- You'll work with data owners and stewards to define and implement data quality rules, monitoring processes, and reporting mechanisms.
- You'll collaborate with Information Security, legal, and risk teams to ensure compliance with data security, privacy controls, and regulations like GDPR.
- You'll develop and deliver training and communication materials to educate stakeholders on data governance principles and processes.
- You'll establish metrics and reporting to track the effectiveness of the data governance framework and identify areas for improvement.
- You'll review and process access requests, refine processes, and evaluate change requests related to data assets and metadata.
- You'll continuously refine the data governance framework based on feedback and evolving business needs.
- You'll work closely with global teams to ensure consistency and alignment of data governance practices across Zurich Cover-More.
- You'll conduct Privacy Impact Assessments to understand and identify risks arising from the processing of personal data, and to minimize these risks as far and as early as possible.
- You'll manage and maintain the OneTrust production platform (primarily Data Mapping and Automated Assessments).

And what are we looking for?
- You'll hold a bachelor's degree in Computer Science, Engineering, or a related field, with at least 5 years of industry experience.
- Previous experience in data management, data quality or a related field.
- Understanding of data governance frameworks and principles, and proficiency in data management concepts, data quality, metadata management, etc.
- Ability to maintain documentation related to data governance processes, policies and procedures.
- Ability to identify areas for improvement in data governance processes and implement enhancements.
- Strong collaboration skills to work with cross-functional teams, data stewards and business stakeholders to align data governance initiatives with business goals.
- CIPP/E, CDMP, or other industry-recognized security certification(s) can be beneficial.
- Familiarity with relevant regulations such as GDPR, CCPA, etc.
- Comprehensive understanding of data privacy methodologies, technologies, and best practices, and working experience with data protection frameworks.
- Experience conducting DPIAs and risk assessments to identify and address privacy risks.

So, why choose us?
We value optimism, caring, togetherness, reliability and determination. We have more than 2,600 employees worldwide: we're a global group of digital natives, actuaries, marketers, doctors, nurses, case managers, claims specialists, finance experts and customer service professionals. We share a global mission to look after travellers, at every step of their journey.

Job flexibility. We understand the importance of making sure that work fits into your life, not the other way around. Our hybrid work week policy ensures our employees maintain work-life balance with the flexibility of 3 days in the office and 2 days working from home.

Career growth. We want you to continue to learn, develop and bring your ideas to the table. We want to hear what you think, and we want you to work with the business, not for the business!

Diversity and inclusion. We respect who you are and thoroughly embrace diversity. So whatever walk of life you wander, just be you and come as you are.

Take the time you need, for you and your community. We encourage you to take the time you need, when you need it. We offer regular annual and personal leave benefits along with anniversary leave, COVID leave (to get vaccinated and for when you're sick), volunteer leave and a comprehensive paid parental leave scheme.

We also offer some other perks, including:
- Mediclaim insurance cover in case of any health emergency
- Coverage under group personal accident insurance
- Flexible and compressed work weeks and hybrid working options
- A generous range of paid leave: 21 annual leave days, 6 sick leave days, 12 public holidays
- An extra day off for you to take on your birthday or your annual work anniversary

Apply today and let's go great places together! #LI-Hybrid
Posted 2 weeks ago
8.0 - 13.0 years
25 - 30 Lacs
Mumbai
Work from Office
About this role

Overview:
We are looking for an innovative, hands-on technology leader to run Global Data Operations for one of the largest global FinTechs. This is a new role that will transform how we manage and process high-quality data at scale, and it reflects our commitment to invest in an Enterprise Data Platform to unlock our data strategy for BlackRock and our Aladdin Client Community. A technology-first mindset, to manage and run a modern global data operations function with high levels of automation and engineering, is essential. This role requires a deep understanding of data, domains, and the associated controls.

Key responsibilities:
The ideal candidate will be a high-energy, technology- and data-driven individual who has a track record of leading and doing the day-to-day operations.
- Ensure on-time, high-quality data delivery with a single pane of glass for data pipeline observability and support
- Live and breathe data ops best practices across culture, processes and technology
- Partner cross-functionally to enhance existing data sets, eliminating manual inputs and ensuring high quality, and to onboard new data sets
- Lead change while ensuring daily operational excellence, quality, and control
- Build and maintain deep alignment with key internal partners on ops tooling and engineering
- Foster an agile, collaborative culture which is creative, open, supportive, and dynamic

Knowledge and Experience:
- 8+ years of experience in hands-on data operations, including data pipeline monitoring and engineering
- Technical expertise, including experience with data processing, orchestration (Airflow), data ingestion, cloud-based databases/warehousing (Snowflake) and business intelligence tools
- The ability to operate and monitor large data sets through the data lifecycle, including the tooling and observability required to ensure data quality and control at scale
- Experience implementing, monitoring, and operating data pipelines that are fast, scalable, reliable, and accurate
- Understanding of modern-day data highways, the associated challenges, and effective controls
- Passionate about data platforms, data quality and everything data
- Practical and detail-oriented operations leader
- Inquisitive leader who will bring new ideas that challenge the status quo
- Ability to navigate a large, highly matrixed organization
- Strong presence with clients
- Bachelor's degree in Computer Science, Engineering, Mathematics or Statistics

About BlackRock
This mission would not be possible without our smartest investment: the one we make in our employees. It's why we're dedicated to creating an environment where our colleagues feel welcomed, valued and supported, with networks, benefits and development opportunities to help them thrive. For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock

BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law.
Posted 2 weeks ago
10.0 - 15.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Utilizes software engineering principles to deploy and maintain fully automated data transformation pipelines that combine a large variety of storage and computation technologies to handle a distribution of data types and volumes in support of data architecture design. A Senior Data Engineer designs and oversees the entire data infrastructure, data products and data pipelines, which must be resilient to change, modular, flexible, scalable, reusable, and cost-effective. This lead role carries an expectation of 10-15 years of relevant experience and will provide mentorship to junior members of the team.

Key responsibilities:
- Oversee the entire data infrastructure to ensure scalability, operational efficiency and resiliency.
- Mentor junior data engineers within the organization.
- Design, develop, and maintain data pipelines and ETL processes using Microsoft Azure services (e.g., Azure Data Factory, Azure Synapse, Azure Databricks, Azure Fabric).
- Utilize Azure data storage accounts for organizing and maintaining data pipeline outputs (e.g., Azure Data Lake Storage Gen2, Azure Blob Storage).
- Collaborate with data scientists, data analysts, data architects and other stakeholders to understand data requirements and deliver high-quality data solutions.
- Optimize data pipelines in the Azure environment for performance, scalability, and reliability.
- Ensure data quality and integrity through data validation techniques and frameworks.
- Develop and maintain documentation for data processes, configurations, and best practices.
- Monitor and troubleshoot data pipeline issues to ensure timely resolution.
- Stay current with industry trends and emerging technologies to ensure our data solutions remain cutting-edge.
- Manage the CI/CD process for deploying and maintaining data solutions.

Required Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience), with demonstrable high proficiency in programming fundamentals.
- At least 5 years of proven experience as a Data Engineer or in a similar role dealing with data and ETL processes.
- 10-15 years of overall experience.
- Strong knowledge of Microsoft Azure services, including Azure Data Factory, Azure Synapse, Azure Databricks, Azure Blob Storage and Azure Data Lake Gen2.
- Experience using SQL DML to query modern RDBMSs efficiently (e.g., SQL Server, PostgreSQL).
- Strong understanding of software engineering principles and how they apply to data engineering (e.g., CI/CD, version control, testing).
- Experience with big data technologies (e.g., Spark).
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.

Preferred Qualifications:
- Learning agility
- Technical leadership
- Consulting on and managing business needs
- Strong experience in Python is preferred, but experience in other languages such as Scala, Java, C#, etc. is accepted.
- Experience building Spark applications using PySpark.
- Experience with file formats such as Parquet, Delta, Avro.
- Experience efficiently querying API endpoints as a data source.
- Understanding of the Azure environment and related services such as subscriptions, resource groups, etc.
- Understanding of Git workflows in software development.
- Experience using Azure DevOps pipelines and repositories to deploy and maintain solutions.
- Understanding of Ansible and how to use it in Azure DevOps pipelines.

Chevron participates in E-Verify in certain locations as required by law.
Posted 2 weeks ago
10.0 - 15.0 years
20 - 25 Lacs
Bengaluru
Work from Office
The Cost Engineering Services Lead role is part of the Technical Services organization and leads a team that provides cost estimating, planning/scheduling, and project controls services for small capital projects within the portfolio of two LNG facilities. The role will ensure the adoption and consistent use of standardized and scalable cost engineering processes, data digitalization, and performance measurement, and will implement improvement initiatives. This position actively advocates cost engineering principles, relentlessly pursuing a One Team mentality across multifunctional teams in support of the delivery of competitive, predictable, and realistic cost and schedule performance.

Key Responsibilities:
- Lead and provide cost engineering services for the execution of small capital and maintenance projects
- Lead the assessment and implementation of cost engineering project services processes
- Support performance measurement: benchmarking, KPIs, and project lookbacks as required
- Support assessment and implementation of a cost engineering digital platform for improvements in data quality and accuracy
- Execute continuous improvement in alignment with best practices and business needs
- Support ENGINE recruiting efforts and help establish the Cost Engineering function within the ENGINE

Required Qualifications:
- Engineering degree in a relevant discipline (B.E./B.Tech.) from a deemed/recognized (AICTE-approved) university
- Demonstrated skills in assessing established cost engineering systems and developing shaping plans for improvements and initiatives
- Fundamental knowledge of industry trends, lessons learned and best practices

Preferred Qualifications:
- 10+ years of cost engineering experience (estimating, scheduling, cost control, management of change, progress measurement and assessment, forecasting)
- Prior experience in execution planning and risk management of small capital and maintenance projects
- Prior supervisory experience

Chevron participates in E-Verify in certain locations as required by law.
Posted 2 weeks ago
0.0 - 2.0 years
8 - 12 Lacs
Gurugram
Work from Office
Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.

How will you make an impact in this role?
The Credit and Fraud Risk (CFR) team helps drive profitable business growth by reducing the risk of fraud and maintaining the industry's lowest credit loss rates. It utilizes an array of tools and ever-evolving technology to detect and combat fraud, minimize the disruption of good spending, and provide a world-class customer experience. The team leads efforts that leverage data and digital advancements to improve risk management, enable commerce and drive innovation.

A single decision can have many outcomes. And when that decision affects millions of card members and merchants, it needs to be the right one. That is where our Product teams come in. Product teams are the backbone of all financial services operations at American Express; they impact every aspect of the company. As a part of this team, you will work with the industry's best talent to create smart and innovative strategies that advance our market share and the way we do business. If you are interested in getting to know all areas of our business and can translate our business needs into remarkable solutions, you should consider a career in Product teams.

Job Responsibilities:
There is a diverse set of roles within the Product job family, with varying responsibilities and skill requirements. A brief description of the roles and skills is outlined below:

(1) Product Development: Develop next-generation software products and solutions to solve complex business problems using the latest tools and technologies. Collaborate with multiple business stakeholders, technology teams and other product teams to build and iterate on products that directly impact millions of customers and prospects. Manage the implementation of critical products; drive global, reusable, and configurable design, rule authoring, testing, integration, and product launch using low-code tools. This cluster includes a diverse set of roles, with varying requirements on technical acumen, from low-code tools to pro-code programming skills.

(2) Data Steward: Manage end-to-end ownership of enterprise data assets that are used in making business decisions for millions of customers and billions of transactions across the globe. Develop strong subject matter expertise on both internal and external data assets. Act as the custodian for data standardization, data governance, data quality and data ownership, while ensuring compliance and security of the data. Build strong relationships, operate effectively within large cross-functional teams, and influence business stakeholders to drive change.

(4) Data Governance: Plan or facilitate the execution of data risk management and governance requirements to ensure compliance of CFR data with enterprise governance and data-related policies. Collaborate closely with policy owners, enterprise governance and product teams, CFR Data Stewards, and Data Custodians (and/or Operational Excellence teams) to execute requirements for managing data risk, and provide subject matter expertise for the remediation of data risk issues. Demonstrate a deep understanding of the evolving risk management space and bring external best practices in-house.

The selected candidate will be allocated to one of these roles depending on fitment and business needs.
Responsibilities:
- Develop robust data management, data integration and data quality processes by leveraging best-in-class technology
- Innovate with a focus on developing newer and better approaches using big data technologies
- Find innovative techniques to bring scale to critical initiatives and enhance productivity
- Manage world-class data products by partnering with enterprise teams, including Technology, Design and End Users, to enable the building of new capabilities and modules and the maintenance of existing assets

Minimum Qualifications:
- 0-2 years of relevant experience preferred
- Strong analytical and problem-solving skills
- Hands-on experience with big data and SQL preferred
- Effective communication and interpersonal skills
- Ability to work effectively in a team environment
- Ability to learn quickly and work independently on complex, unstructured initiatives
- Ability to challenge the status quo and drive innovation
- Good programming skills; knowledge of GCP-native tools and other platforms preferred
- Prior experience in product development, data analytics, governance or stewardship is an added advantage
Posted 2 weeks ago
5.0 - 8.0 years
6 - 16 Lacs
Hyderabad
Hybrid
We have an opening at EY GDS for Collibra with Data Governance, based in Hyderabad.

Experience: 5 to 8 years
Must have: Collibra and Data Governance

- BE/BTech/MCA/MBA with adequate industry experience
- Around 5 to 8 years of experience in data governance (Collibra, Alation, Ataccama, Mantas), Informatica DQ, Axon, etc.
- Experience working with a data governance tool (Collibra, Alation, Ataccama, Mantas) through end-to-end implementation
- Understanding of governance, metadata management, and profiling processes
- Experience integrating reporting tools with data governance tools (Alation, Collibra, Ataccama, Mantas)
- Knowledge of data modelling
- Experience in security implementation in data governance tools
- Ability to manage self-service data preparation, data sync, and data flows, and to work with curated data sets
- Good understanding of various data models, e.g., snowflake schemas, star schemas, data marts, data lakes, etc.

Please apply via the link below to process your profile: https://careers.ey.com/job-invite/1611911/

Interested candidates, please send your updated resume to the email ID below: Smita.Dattu.Sarwade@gds.ey.com
Posted 2 weeks ago
2.0 - 7.0 years
4 Lacs
Mumbai
Work from Office
Derivative Operations provides operational support across CIB, covering key product areas including FX, OTC Derivatives, Principal Collateral, 3rd Party Derivatives, Cleared Derivatives, Agency Collateral, Billing and CASS.

Job Summary
As a Team Leader in Collateral Operations, you will be responsible for margin call management, regulatory adherence for all upcoming regulations, and cross-LOB metrics and projects. Additionally, you will build a culture of continuous improvement, supporting the business across the Back Office and Middle Office as well as global teams. You will interact with multiple Operations and Technology teams within the organization to provide business support.

Job Responsibilities
- Manage collateral disputes.
- Perform daily margin exchange: same-day settlement and exception management.
- Perform supervisory controls around collateral exposure.
- Supervise MTM breaks, including data quality and strategic projects. Partner with Middle Offices, Credit Risk, VCG, etc. Focus on deep dives and fixing upstream issues to keep breaks to a minimum.
- Play a key role in regulatory compliance: CFTC, EMIR, NCMR, etc. Improve controls in the process to ensure 100% accuracy and compliance with regulatory rules.
- Manage any new analysis requirements across multiple stakeholders.
- Provide regular updates to senior management on BAU, projects, etc.
- Supervise UAT testing.
- Manage strategic automation projects.

Required qualifications, capabilities and skills
- CA/MBA with 2 years, or Graduate or Post-Graduate with 4 years, of experience in operations.
- Familiarity with a global bank's process and operational environment, including management and external reporting, is a must.
- Strong business knowledge, i.e., Investment Banking, including OTC product, process and system knowledge.
- Ability to think and act strategically: deal with day-to-day issues as well as plan and execute projects/initiatives, ensuring the team's activities support Operations in attaining its strategic goals.
- Excellent attention to detail, and an ability to know when a deep-dive approach is appropriate.
- Ability to drive results through a "hands-on" approach.
- Excellent verbal and written communication skills; adept at communicating with all levels of the business and technical parts of the organization.
- Skilled in MS Office applications, including Outlook, PowerPoint, Excel, Word, Access and Project.
- Can operate effectively in a dynamic environment with tight deadlines, and can prioritize one's own and the team's work to achieve them.
- Flexibility to work global hours and willingness to travel globally, if needed.

Preferred qualifications, capabilities and skills
- Knowledge of CFTC, EMIR, NCMR regulations preferable.
- Experience with OTC Confirmations, Collateral Management and Reconciliation platforms will be an advantage.
Posted 2 weeks ago
3.0 - 5.0 years
1 - 5 Lacs
Hyderabad
Work from Office
FactSet creates flexible, open data and software solutions for over 200,000 investment professionals worldwide, providing instant access to financial data and analytics that investors use to make crucial decisions.

At FactSet, our values are the foundation of everything we do. They express how we act and operate, serve as a compass in our decision-making, and play a big role in how we treat each other, our clients, and our communities. We believe that the best ideas can come from anyone, anywhere, at any time, and that curiosity is the key to anticipating our clients' needs and exceeding their expectations.

Your Team's Impact
Our clients are increasingly seeking ways to optimize the integration and maintenance of their workflows to free up time, reduce operational costs, and minimize personnel risk. FactSet's Managed Services, leveraging our middle office solutions, allow users to navigate complex workflows, ensure data quality, and access actionable insights.

What You'll Do
- Deliver technical effort estimates to the analytics team and other business stakeholders when planning new features and updating existing implementations.
- Partner with other internal teams to design and improve efficiency and optimize current processes.
- Consistently engage with and address client needs while serving as a primary point of contact.
- Use Python and SQL to design and implement scalable data processing solutions.
- Design, develop, and review SQL and Python scripts for aggregating and visualizing complex datasets.
- Demonstrate technical expertise with the following:
  Databases: SQL Server, PostgreSQL
  Scripting and exploration: Python, SQL
  Visualization: Power BI
- Work with cloud platforms to deploy and maintain data solutions (AWS, Snowflake).
- Lead a team of 3-5 members.

What We're Looking For
Required Skills:
- Bachelor's degree in engineering with specialization in Computer Science, IT, or Electronics.
- 3-5 years of relevant experience, preferably with 1-2 years leading a team.
- Working knowledge of diverse financial concepts, including bonds, equities, and similar assets.
- Understanding of both business and technical requirements, and the ability to serve as a conduit between business and technology teams.
- Ability to prioritize multiple projects and work independently while managing the team.
- Flexibility to work in a hybrid model.

What's In It For You
At FactSet, our people are our greatest asset, and our culture is our biggest competitive advantage. Being a FactSetter means:
- The opportunity to join an S&P 500 company with over 45 years of sustainable growth powered by the entrepreneurial spirit of a start-up.
- Support for your total well-being. This includes health, life, and disability insurance, as well as retirement savings plans and a discounted employee stock purchase program, plus paid time off for holidays, family leave, and company-wide wellness days.
- Flexible work accommodations. We value work/life harmony and offer our employees a range of accommodations to help them achieve success both at work and in their personal lives.
- A global community dedicated to volunteerism and sustainability, where collaboration is always encouraged, and individuality drives solutions.
- Career progression planning with dedicated time each month for learning and development.
- Business Resource Groups open to all employees that serve as a catalyst for connection, growth, and belonging.
Learn more about our benefits here.
Salary is just one component of our compensation package and is based on several factors, including but not limited to education, work experience, and certifications.

Company Overview:
FactSet (NYSE:FDS | NASDAQ:FDS) helps the financial community to see more, think bigger, and work better. Our digital platform and enterprise solutions deliver financial data, analytics, and open technology to more than 8,200 global clients, including over 200,000 individual users. Clients across the buy-side and sell-side, as well as wealth managers, private equity firms, and corporations, achieve more every day with our comprehensive and connected content, flexible next-generation workflow solutions, and client-centric specialized support. As a member of the S&P 500, we are committed to sustainable growth and have been recognized among the Best Places to Work in 2023 by Glassdoor as a Glassdoor Employees' Choice Award winner. Learn more at www.factset.com and follow us on X and LinkedIn.

At FactSet, we celebrate difference of thought, experience, and perspective. Qualified applicants will be considered for employment without regard to characteristics protected by law.
Posted 2 weeks ago
1.0 - 2.0 years
8 - 9 Lacs
Hyderabad
Work from Office
FactSet creates flexible, open data and software solutions for over 200,000 investment professionals worldwide, providing instant access to financial data and analytics that investors use to make crucial decisions.

At FactSet, our values are the foundation of everything we do. They express how we act and operate, serve as a compass in our decision-making, and play a big role in how we treat each other, our clients, and our communities. We believe that the best ideas can come from anyone, anywhere, at any time, and that curiosity is the key to anticipating our clients' needs and exceeding their expectations.

Your Team's Impact
Our clients are increasingly seeking ways to optimize the integration and maintenance of their workflows to free up time, reduce operational costs, and minimize personnel risk. FactSet's Managed Services, leveraging our middle office solutions, allow users to navigate complex workflows, ensure data quality, and access actionable insights.

What You'll Do
- Provide input on technical effort estimates for new features and updates, collaborating with analytics and business teams.
- Work in a team of 3-5 members to maintain and improve efficiency and optimize current processes.
- Use Python and SQL to design and implement scalable data processing solutions.
- Design and peer review SQL and Python scripts used for aggregating and visualizing complex data.
- Apply strong technical knowledge of the following:
  Databases: SQL Server, PostgreSQL
  Scripting and exploration: Python, SQL
  Visualization: Power BI
- Work with cloud platforms to deploy and maintain data solutions (AWS, Snowflake).

What We're Looking For
- Bachelor's degree in engineering with specialization in Computer Science, IT, or Electronics.
- A minimum of 1-2 years of relevant experience.
- Working knowledge of diverse financial concepts, including bonds, equities, and similar assets.
- Ability to prioritize multiple projects and work independently while reporting to the team.
- Flexibility to work in a hybrid model.

What's In It For You
At FactSet, our people are our greatest asset, and our culture is our biggest competitive advantage. Being a FactSetter means:
- The opportunity to join an S&P 500 company with over 45 years of sustainable growth powered by the entrepreneurial spirit of a start-up.
- Support for your total well-being. This includes health, life, and disability insurance, as well as retirement savings plans and a discounted employee stock purchase program, plus paid time off for holidays, family leave, and company-wide wellness days.
- Flexible work accommodations. We value work/life harmony and offer our employees a range of accommodations to help them achieve success both at work and in their personal lives.
- A global community dedicated to volunteerism and sustainability, where collaboration is always encouraged, and individuality drives solutions.
- Career progression planning with dedicated time each month for learning and development.
- Business Resource Groups open to all employees that serve as a catalyst for connection, growth, and belonging.
Learn more about our benefits here.

Salary is just one component of our compensation package and is based on several factors, including but not limited to education, work experience, and certifications.

Company Overview:
FactSet (NYSE:FDS | NASDAQ:FDS) helps the financial community to see more, think bigger, and work better. Our digital platform and enterprise solutions deliver financial data, analytics, and open technology to more than 8,200 global clients, including over 200,000 individual users.
Clients across the buy-side and sell-side, as well as wealth managers, private equity firms, and corporations, achieve more every day with our comprehensive and connected content, flexible next-generation workflow solutions, and client-centric specialized support. As a member of the S&P 500, we are committed to sustainable growth and have been recognized among the Best Places to Work in 2023 by Glassdoor as a Glassdoor Employees' Choice Award winner. Learn more at www.factset.com and follow us on X and LinkedIn.

At FactSet, we celebrate difference of thought, experience, and perspective. Qualified applicants will be considered for employment without regard to characteristics protected by law.
Posted 2 weeks ago
1.0 - 2.0 years
12 - 14 Lacs
Hyderabad
Work from Office
FactSet creates flexible, open data and software solutions for over 200,000 investment professionals worldwide, providing instant access to financial data and analytics that investors use to make crucial decisions.

At FactSet, our values are the foundation of everything we do. They express how we act and operate, serve as a compass in our decision-making, and play a big role in how we treat each other, our clients, and our communities. We believe that the best ideas can come from anyone, anywhere, at any time, and that curiosity is the key to anticipating our clients' needs and exceeding their expectations.

Your Team's Impact
Our clients are increasingly seeking ways to optimize the integration and maintenance of their workflows to free up time, reduce operational costs, and minimize personnel risk. FactSet's Managed Services, leveraging our middle office solutions, allow users to navigate complex workflows, ensure data quality, and access actionable insights.

What You'll Do
- Provide input on technical effort estimates for new features and updates, collaborating with analytics and business teams.
- Work in a team of 3-5 members to maintain and improve efficiency and optimize current processes.
- Use Python and SQL to design and implement scalable data processing solutions.
- Design and peer review SQL and Python scripts used for aggregating and visualizing complex data.
- Apply strong technical knowledge of the following:
  Databases: SQL Server, PostgreSQL
  Scripting and exploration: Python, SQL
  Visualization: Power BI
- Work with cloud platforms to deploy and maintain data solutions (AWS, Snowflake).

What We're Looking For
- Bachelor's degree in engineering with specialization in Computer Science, IT, or Electronics.
- A minimum of 1-2 years of relevant experience.
- Working knowledge of diverse financial concepts, including bonds, equities, and similar assets.
- Ability to prioritize multiple projects and work independently while reporting to the team.
- Flexibility to work in a hybrid model.

What's In It For You
At FactSet, our people are our greatest asset, and our culture is our biggest competitive advantage. Being a FactSetter means:
- The opportunity to join an S&P 500 company with over 45 years of sustainable growth powered by the entrepreneurial spirit of a start-up.
- Support for your total well-being. This includes health, life, and disability insurance, as well as retirement savings plans and a discounted employee stock purchase program, plus paid time off for holidays, family leave, and company-wide wellness days.
- Flexible work accommodations. We value work/life harmony and offer our employees a range of accommodations to help them achieve success both at work and in their personal lives.
- A global community dedicated to volunteerism and sustainability, where collaboration is always encouraged, and individuality drives solutions.
- Career progression planning with dedicated time each month for learning and development.
- Business Resource Groups open to all employees that serve as a catalyst for connection, growth, and belonging.
Learn more about our benefits here.

Salary is just one component of our compensation package and is based on several factors, including but not limited to education, work experience, and certifications.

Company Overview:
FactSet (NYSE:FDS | NASDAQ:FDS) helps the financial community to see more, think bigger, and work better. Our digital platform and enterprise solutions deliver financial data, analytics, and open technology to more than 8,200 global clients, including over 200,000 individual users.
Clients across the buy-side and sell-side, as well as wealth managers, private equity firms, and corporations, achieve more every day with our comprehensive and connected content, flexible next-generation workflow solutions, and client-centric specialized support. As a member of the S&P 500, we are committed to sustainable growth and have been recognized among the Best Places to Work in 2023 by Glassdoor as a Glassdoor Employees' Choice Award winner. Learn more at www.factset.com and follow us on X and LinkedIn.

At FactSet, we celebrate difference of thought, experience, and perspective. Qualified applicants will be considered for employment without regard to characteristics protected by law.
Posted 2 weeks ago
6.0 - 9.0 years
12 - 16 Lacs
Bengaluru
Work from Office
Get to Know the Team:
At Grabber Technology Solutions (GTS), we revolutionise the technology experience for every Grabber. Our mission is to empower our team with seamless solutions that enhance their daily work. We are a diverse group of forward-thinkers committed to creating personalised IT experiences. If you're passionate about customer-centric innovation and technology at Grab, come join us and help shape the future of technology!

Get to Know the Role:
We are looking for an experienced Senior Configuration Manager to drive the accuracy, integrity, and strategic value of our Configuration Management Database (CMDB). This important individual contributor role will be the primary owner and performer of CMDB operations, ensuring it serves as the definitive source of truth for our IT landscape. You understand configuration management mechanics, including the seamless integration of hardware and software assets within the CMDB framework. You will report to Manager II, Change & Release Management. This role is based in Bangalore.

The Critical Tasks You Will Perform:
- Own and maintain the Configuration Management Database (CMDB), ensuring accuracy and completeness by collaborating with cross-functional teams on Configuration Item (CI) identification, documentation, and lifecycle management.
- Lead and evolve Software Asset Management (SAM) processes, defining inclusive policies, tools, and procedures for licence tracking, compliance, usage, and optimisation.
- Identify and implement opportunities to streamline and automate Configuration Management processes within the ITSM platform, ensuring seamless integration with core ITSM functions like Change, Incident, Problem, and Release Management.
- Generate regular reports and KPIs, conduct configuration audits, and support risk assessments to address discrepancies and ensure compliance.
- Provide expert support for Change Management processes, contributing to accurate and collaborative impact assessments for changes affecting configurations.
- Stay current with industry trends and emerging technologies, recommending strategic process and tool improvements to enhance Configuration and Asset Management practices.

What Essential Skills You Will Need:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 6 to 9 years of hands-on experience in IT Operations, Service Management or Configuration Management roles.
- Deep, hands-on expertise in configuration management principles and practices, including CMDB data modelling, CI lifecycle, relationships and data quality.
- Track record in defining and implementing Hardware Asset Management (HAM) and Software Asset Management (SAM) processes, policies and tools.
- Hands-on experience with automated discovery and reconciliation tools and integrating data from multiple IT systems.
- Demonstrated experience defining and generating reports on KPIs and building data visualisations.

Good to have:
- ITIL Expert (v3/v4) certified
- COBIT 5 Foundation certified
- Lean/Six Sigma certified

What we offer
About Grab and Our Workplace
Grab is Southeast Asia's leading superapp. From getting your favourite meals delivered to helping you manage your finances and getting around town hassle-free, we've got your back with everything. In Grab, purpose gives us joy and habits build excellence, while harnessing the power of Technology and AI to deliver the mission of driving Southeast Asia forward by economically empowering everyone, with heart, hunger, honour, and humility.
Life at Grab
We care about your well-being at Grab. Here are some of the global benefits we offer:
- We have your back with Term Life Insurance and comprehensive Medical Insurance.
- With GrabFlex, create a benefits package that suits your needs and aspirations.
- Celebrate moments that matter in life with loved ones through Parental and Birthday leave, and give back to your communities through Love-all-Serve-all (LASA) volunteering leave.
- We have a confidential Grabber Assistance Programme to guide and uplift you and your loved ones through life's challenges.

What We Stand For at Grab
We are committed to building an inclusive and equitable workplace that enables diverse Grabbers to grow and perform at their best. As an equal opportunity employer, we consider all candidates fairly and equally regardless of nationality, ethnicity, religion, age, gender identity, sexual orientation, family commitments, physical and mental impairments or disabilities, and other attributes that make them unique.
Posted 2 weeks ago
5.0 - 7.0 years
11 - 15 Lacs
Chennai
Work from Office
Job Title: SAP Analytics Cloud Analyst
Career Level: C3

Introduction to role:
Are you ready to play a central role in transforming AstraZeneca's business processes through automation, analytics, and AI? Join our Process Insights team within Global Business Services as an SAP Analytics Cloud Analyst. We are seeking specialists in SAP Analytics (Datasphere and Analytics Cloud) to scale our capabilities and deliver impactful solutions across AstraZeneca. Dive into meaningful work that makes a bigger impact and helps shape the future of healthcare!

Accountabilities:
- Collaborate with stakeholders to understand their business process requirements and objectives, translating them into SAP Analytics solutions (SAC and Datasphere).
- Create Extract, Transform, and Load (ETL) data pipelines, data warehousing, and testing.
- Validate and assure data quality and accuracy through data cleansing, enrichment, and building data models.
- Develop comprehensive analytics and dashboards for business stakeholders for reporting, business planning, and KPI tracking purposes.
- Enhance solution experiences and visualizations using low/no-code development.

Essential Skills/Experience:
- Degree in Computer Science, Business Informatics or a comparable field.
- 5-7 years of overall experience, with at least 2 years working on SAP SAC/Datasphere solutions as a Data Analyst and/or Data Engineer.
- Experience in SAP Datasphere, ETL, building data pipelines, preparing and integrating data, and data modelling, with an understanding of relational data modelling and denormalization techniques.
- Experience in SAP Analytics Cloud creating advanced analytics/dashboards, i.e., stories, boardrooms, planning.
- Knowledge of analytics standard processes.
- Understanding of SAP-related Finance and/or Operations processes will be valued.
- Certification in one or more of the following will be appreciated: SAC Data Analyst, Data Engineer, Low-Code/No-Code Developer.
- Superb communication skills and the ability to work in an Agile environment.
- Energetic, organised and self-motivated.
- Fluent in business English.

Desirable Skills/Experience: NA

At AstraZeneca, you'll be part of a distributed team that drives excellence and breakthroughs. Here, you can apply your skills to genuinely impact patients' lives while shaping our digital transformation journey. With a focus on innovation and intelligent risks, we empower our teams to make a difference. Our collaborative environment fosters learning, sharing, and celebrating successes together. Join us as we harness radical technologies to achieve more and contribute to society. Ready to make an impact? Apply now to join AstraZeneca's dynamic team!

13-Jun-2025
Posted 2 weeks ago
0.0 - 3.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Location(s): Quay Building 8th Floor, Bagmane Tech Park, Bengaluru, IN
Line of Business: ROC (ROC)
Job Category: Corporate Services
Experience Level: Experienced Hire

Skills and Competencies
- Some relevant work experience and/or relevant internship experience and/or knowledge of financial instruments preferred
- Excellent attention to detail and the ability to complete repetitive processes with no errors
- Clear written and verbal communication skills, with an ability to communicate complex business concepts to a senior audience
- Highly organized and efficient, along with strong interpersonal skills
- Competency in Microsoft Office (Outlook, Excel, Word, and PowerPoint)
- A strong client-focused orientation, with the drive and enthusiasm required to achieve results and ensure customer satisfaction

Education
Postgraduate or graduate with 0-3 years of experience and a good academic record

Role
As a Data Operations Specialist II, you will work in a dedicated team supporting a global process, entering financial data into Moody's internal databases. In this role, you will be required to understand an operational process, perform market data research, navigate various data environments to make data updates, perform data integrity checks, interpret policies and procedures, provide a high level of service, and track and report on activity.

Responsibilities
- Support various ratings groups and the business with data maintenance for debt attributes, and ensure quality assurance through various backend activities, including but not limited to: new debt/deal setup; data capture and updates (e.g., amendments, redemptions, additional offerings); identifying data inconsistencies through review of publicly available documents; sourcing deal documents for the analytical group; screening and reporting to meet regulatory requirements; workflow management; and invoicing for various products other than ratings.
- Monitor market data feeds and various other periodic reports to identify in-scope candidates for the process, and search documents on various public and other data sources.
- Monitor designated mailboxes to ensure timely and effective handling of internal and external client requests.
- Organize work to meet deadlines and time-sensitive requests/projects.
- Facilitate resolution of technical issues and/or more complex external inquiries with supervision by a Data Operations Specialist III / Data Operations Associate.
- Build strong stakeholder relationships and deliver professional, high-quality service across many transactions.
- Promptly and efficiently escalate conflicts, problems, and database or data inconsistencies; identify and research data discrepancies; and resolve basic client inquiries.
- Demonstrate increasing awareness of procedures, guidelines, and regulatory requirements as they pertain to the job function by asking relevant questions.
- Liaise with Rating Teams and other Moody's departments (Commercial, Information Technology, etc.) as required.
- Provide back-up coverage for a designated associate in the event of absence and holidays to ensure seamless service to GMO clients.
- Continue to develop broad-based knowledge of financial instruments, terminology, and related business practices.
- Place the interest of the team above individual self-interest; be willing to accept new challenges.
Contributes positively to the team even under pressure or when performing routine and/or administrative tasks. This job description is issued as a guideline to assist you in your duties; it is not exhaustive, and we would be pleased to discuss any constructive comments you may have. Because of the evolving nature and changing demands of our business, this job description may be subject to change. You may, on occasion, be required to undertake additional or other duties within the context of this job description, and according to the needs of the business.
About the team: The Global Middle Office (GMO) provides transaction management support, workflow coordination, rating desk services, and other broad operational support to Moody's Ratings teams. The department works closely with the lines of business to improve both business process and data quality across the rating lifecycle. The GMO has over 100 employees in 5 countries and is a key player in business process development, new company-wide initiatives, and technology projects.
Posted 2 weeks ago
5.0 - 10.0 years
10 - 15 Lacs
Hyderabad
Work from Office
Job Title: ServiceNow SAM Integration Lead
About the Team: The Software Asset Management team is part of the Digital Strategy and Operations department within Sanofi. Our mission is to ensure software assets are governed and monitored throughout their lifecycle within our organization, to ensure proper governance, optimize investments, maintain compliance with licensing agreements, and enhance security across all business units globally. The SAM team operates globally, covering all Sanofi business units (R&D, industrial, commercial, support functions) across all geographies, with strong partnerships with Procurement, Legal, and IT teams. We have successfully implemented the ServiceNow SAM Pro module to monitor software license compliance for key vendors. We are now expanding to manage the complete software asset lifecycle by integrating SAM Pro with other ServiceNow modules and external solutions (e.g., SCCM). We are looking for a skilled ServiceNow SAM integration expert to join our team. You will be responsible for leading this ServiceNow SAM integration project.
Main responsibilities: Lead the ServiceNow integration project to manage the full lifecycle of software assets. Collaborate with the ServiceNow CoE team to design and implement integrations between SAM Pro and other ServiceNow modules. Define and implement processes for software request, approval, deployment, monitoring, and retirement. Configure and customize ServiceNow SAM Pro to meet Sanofi's specific requirements. Develop integration solutions with external systems (SCCM, etc.) as required. Create dashboards and reports to provide visibility into software assets. Train and support users on the new processes and tools. Ensure data quality and integrity across the integrated platform. Document processes and technical solutions.
About you
Experience: 5+ years of experience in ITAM/SAM, and more specifically in deploying and integrating ServiceNow SAM Pro with other ServiceNow modules to manage the full lifecycle of software assets.
Soft skills: Strong project leadership capabilities. Excellent communication and interpersonal skills with the ability to work across global teams. Analytical mindset with attention to detail and data accuracy. Pragmatic, results-oriented approach to problem-solving. Ability to translate technical concepts to business stakeholders. Self-motivated, with the ability to work independently and as part of a team.
Technical skills: Strong expertise in the ServiceNow platform, particularly the SAM Pro module. Experience integrating ServiceNow modules (e.g., CMDB, ITAM, ITSM, CSD, HR) to manage the software asset lifecycle. Experience with software discovery tools and integration methods. Proficiency in Agile project management methodologies. Experience with JIRA is a plus.
Education: Bachelor's degree in computer science, information technology, or a related field (MBA in IT Governance/MIS a plus). 5-8 years of professional working experience in Information Technology. 5+ years of relevant experience implementing and integrating ServiceNow SAM Pro.
Languages: Fluent English (written and verbal).
Travel requirements: Occasional short-term international travel (approximately 1%).
Why choose us? Bring the miracles of science to life alongside a supportive, future-focused team. Discover endless opportunities to grow your talent and drive your career, whether it's through a promotion or lateral move, at home or internationally. Enjoy a thoughtful, well-crafted rewards package that recognizes your contribution and amplifies your impact.
Take good care of yourself and your family, with a wide range of health and wellbeing benefits including high-quality healthcare, prevention and wellness programs, and at least 14 weeks of gender-neutral parental leave. Opportunity to work in an international environment, collaborating with diverse business teams and vendors, working in a dynamic team, and fully empowered to propose and implement innovative ideas. Pursue Progress. Discover Extraordinary.
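As a hedged illustration of the kind of external-system integration this role describes, the sketch below reads records from a ServiceNow instance through the standard Table API using Python's requests library; the instance URL, credentials, and table name are placeholders (actual SAM Pro table names vary by instance and version), not details taken from the posting.

```python
# Hedged sketch of pulling records from a ServiceNow instance via the
# standard Table API. Instance URL, credentials, and the table name are
# placeholders, not values from the posting.
import requests

INSTANCE = "https://example.service-now.com"   # placeholder instance
TABLE = "alm_license"                          # placeholder; real SAM Pro table names vary by version

def fetch_records(user: str, password: str, limit: int = 10) -> list[dict]:
    resp = requests.get(
        f"{INSTANCE}/api/now/table/{TABLE}",
        params={"sysparm_limit": limit},
        auth=(user, password),
        headers={"Accept": "application/json"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("result", [])

if __name__ == "__main__":
    for rec in fetch_records("svc_account", "secret"):
        print(rec.get("sys_id"))
```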
Posted 2 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Kochi, Bengaluru
Work from Office
Job Description
Overview: Join a global organization with 82,000+ employees around the world in an ETL Databricks Developer role based in IQVIA Bangalore. You will be part of IQVIA's world-class technology team and will be involved in the design, development, and enhancement of software programs, cloud applications, and proprietary products. The selected candidate will primarily work on Databricks and Reltio projects, focusing on data integration and transformation tasks. This role requires a deep understanding of Databricks, ETL tools, and data warehousing/data lake concepts. Experience in the life sciences domain is preferred. Candidates with Databricks certification are preferred.
Key Responsibilities: Design, develop, and maintain data integration solutions using Databricks. Collaborate with cross-functional teams to understand data requirements and deliver efficient data solutions. Implement ETL processes to extract, transform, and load data from various sources into data warehouses/data lakes. Optimize and troubleshoot Databricks workflows and performance issues. Ensure data quality and integrity throughout the data lifecycle. Provide technical guidance and mentorship to junior developers. Stay updated with the latest industry trends and best practices in data integration and Databricks.
Required Qualifications: Bachelor's degree in computer science or equivalent. Minimum of 5 years of hands-on experience with Databricks. Strong knowledge of any ETL tool (e.g., Informatica, Talend, SSIS). Well-versed in data warehousing and data lake concepts. Proficient in SQL and Python for data manipulation and analysis. Experience with cloud platforms (e.g., AWS, Azure, GCP) and their data services. Excellent problem-solving skills. Strong communication and collaboration skills.
Preferred Qualifications: Certified Databricks Engineer. Experience in the life sciences domain. Familiarity with Reltio or similar MDM (Master Data Management) tools. Experience with data governance and data security best practices.
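For context on the extract-transform-load work the responsibilities above describe, here is a minimal PySpark sketch of that pattern as it might run on Databricks; the paths and column names are illustrative placeholders, not project specifics.

```python
# Minimal PySpark ETL sketch: extract raw files, clean them, and write a
# curated output. Paths and column names are illustrative placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl_sketch").getOrCreate()

# Extract: read raw source files (CSV used here for simplicity).
raw = spark.read.option("header", True).csv("/tmp/raw/customers.csv")

# Transform: standardise columns and drop obvious bad rows.
clean = (
    raw.withColumn("email", F.lower(F.trim(F.col("email"))))
       .filter(F.col("customer_id").isNotNull())
       .dropDuplicates(["customer_id"])
)

# Load: write out in a columnar format partitioned by country.
clean.write.mode("overwrite").partitionBy("country").parquet("/tmp/curated/customers")
```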
Posted 2 weeks ago
3.0 - 8.0 years
11 - 15 Lacs
Bengaluru
Work from Office
Servicenow Expert-Consultant
Kongsberg Digital (KDI) is a provider of next generation software and digital solutions to customers within maritime, oil & gas, and renewables & utilities. The company consists of more than 500 software experts with leading competence within internet of things, smart data, artificial intelligence, automation and autonomous operations. In addition, Kongsberg Digital is the group-wide center of digital expertise for KONGSBERG. We are seeking a skilled and proactive ServiceNow Expert to join our Global Customer Support team. This role will focus on unlocking the full potential of the ServiceNow platform, driving operational excellence, and enabling data-driven decision-making through robust analytics and reporting capabilities.
Key Responsibilities:
Platform Enablement: Manage user access and permissions within the ServiceNow system, including creating and updating user accounts, defining roles and access controls, and ensuring proper authentication mechanisms. Implement and maintain security best practices to protect sensitive data and ensure compliance with relevant regulations and standards. Manage platform upgrades and apply patches to ensure the system is up-to-date with the latest features and security fixes. Drive continuous improvement of ServiceNow capabilities, including Incident, Problem, Knowledge, and Request Management. Implement and optimize support workflows, SLAs, notifications, and self-service capabilities. Collaborate with internal stakeholders to gather requirements and translate them into scalable ServiceNow solutions. Support the rollout of new features aligned with business goals.
Analytics & Reporting: Develop, maintain, and enhance real-time dashboards and performance analytics. Provide actionable insights on team performance, SLA compliance, CSAT, NPS, and ticket trends. Support voice-of-the-customer and quality initiatives through analytics. Drive automation of data collection and reporting where possible. Ensure data quality, integrity, and consistency across the platform. Maintain documentation, configuration standards, and user guides.
Train and support global users on ServiceNow functionalities and best practices. Identify and implement opportunities for efficiency, automation, and scalability. Stay current with ServiceNow updates and industry trends to recommend improvements.
Qualifications: 3+ years of hands-on experience with ServiceNow platform administration and development. Strong knowledge of ServiceNow ITSM modules and reporting capabilities. Proficiency in performance analytics, dashboards, and KPIs in ServiceNow. Strong analytical, problem-solving, and communication skills. Experience working in a fast-paced, cross-cultural, and collaborative environment. ServiceNow certifications (e.g., Certified System Administrator, ITSM) are a plus. Experience with customer support metrics (e.g., CSAT, NPS, SLA). Understanding of ITIL best practices. Familiarity with integrations, scripting, and automation within ServiceNow.
OUR POWER IS CURIOSITY, CREATION AND INNOVATION: We believe you love to experiment, challenge the established, co-create, develop and cultivate. Together we can explore new answers to today's challenges and future opportunities, and talk about how industrial digitalisation can be a part of the solution for a better tomorrow. We believe that different perspectives are crucial for developing game-changing technology for a better tomorrow. Join us in taking on this challenge!
About Kongsberg Digital: Kongsberg Digital is a provider of next generation software and digital solutions to customers within maritime, oil & gas and utilities. Together with the rest of KONGSBERG, Kongsberg Digital offers solutions within autonomy, smart data, augmented reality and other areas. Join Kongsberg Digital as we pursue our mission to digitalize the world's industries for a better tomorrow. We truly believe that technology will drive more efficient and sustainable operations, making the oil sector more energy efficient, ships less polluting and green energy future-proof.
Posted 2 weeks ago
5.0 - 10.0 years
8 - 12 Lacs
Bengaluru
Work from Office
About Tarento: Tarento is a fast-growing technology consulting company headquartered in Stockholm, with a strong presence in India and clients across the globe. We specialize in digital transformation, product engineering, and enterprise solutions, working across diverse industries including retail, manufacturing, and healthcare. Our teams combine Nordic values with Indian expertise to deliver innovative, scalable, and high-impact solutions. We're proud to be recognized as a Great Place to Work, a testament to our inclusive culture, strong leadership, and commitment to employee well-being and growth. At Tarento, you'll be part of a collaborative environment where ideas are valued, learning is continuous, and careers are built on passion and purpose.
Job Description: We are seeking a highly skilled Data Modeler with deep experience in the CPG domain (preferred) to join our data engineering team. You will be responsible for designing scalable and business-aligned data models to support analytical and operational use cases across functions like Sales, Trade, Supply Chain, Marketing, and Finance. This role requires close collaboration with both business stakeholders and technical teams to ensure data structures accurately reflect business processes and drive actionable insights.
Key Responsibilities: Design and develop conceptual, logical, and physical data models aligned with CPG business requirements. Collaborate with stakeholders to gather and interpret data requirements across domains such as Retail Execution, Customer 360, Trade Promotion, and Supply Chain. Translate KPIs and business rules into structured models supporting reporting, data warehousing, and advanced analytics. Apply industry-standard modelling techniques like dimensional modelling (star/snowflake), 3NF, and data vault. Document models through ER diagrams, data dictionaries, and metadata to support governance and knowledge sharing. Work with data engineering teams to implement models on cloud platforms such as Snowflake, Databricks, or Azure Synapse. Ensure modelling standards and best practices are followed for consistency and scalability across teams. Support BI and analytics teams by building models that enable effective self-service analytics and reporting. Align models with data quality, lineage, and compliance frameworks (e.g., GS1 standards for product data). Continuously optimize data models based on evolving business needs and system performance.
Required Skills and Experience: BE/BS/MTech/MS in Computer Science or equivalent work experience. 5+ years of experience in data modelling, with exposure to the CPG domain preferred. Proficient in dimensional modelling, 3NF, and data vault methodologies. Hands-on experience with data modelling tools such as ER/Studio, Erwin, or dbt. Strong understanding of SQL and data warehousing principles. Experience working with modern cloud data platforms (Snowflake, Azure Synapse, Databricks). Excellent communication skills to bridge business and technical teams. Experience collaborating with cross-functional stakeholders like engineers, BI developers, and analysts. Ability to guide teams and promote modelling best practices in cloud-based data environments. Cloud certifications (e.g., Azure Data Engineer Associate, AWS Big Data Specialty, Google Data Engineer). Experience with data governance and catalog tools like Unity Catalog, Azure Purview, or Collibra. Knowledge of MDM concepts and tools, especially within the CPG context.
Familiarity with GS1 standards and product hierarchy management. Exposure to Agile or SAFe delivery frameworks. Certifications in data architecture or modelling (e.g., CDMP). Prior experience leading data teams of 10-20+ members. Strong ability to balance strategic thinking with hands-on implementation.
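To illustrate the dimensional (star-schema) modelling this role centres on, here is a small, hedged sketch built on SQLite so it runs anywhere; the fact and dimension tables and their columns are hypothetical CPG-style examples, not a prescribed model.

```python
# Illustrative star schema: one fact table with two dimensions.
# Table and column names are hypothetical CPG-style examples.
import sqlite3

ddl = """
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,
    gtin        TEXT,
    brand       TEXT,
    category    TEXT
);
CREATE TABLE dim_date (
    date_key      INTEGER PRIMARY KEY,  -- e.g. 20250613
    calendar_date TEXT,
    fiscal_period TEXT
);
CREATE TABLE fact_sales (
    product_key INTEGER REFERENCES dim_product(product_key),
    date_key    INTEGER REFERENCES dim_date(date_key),
    store_id    TEXT,
    units_sold  INTEGER,
    net_revenue REAL
);
"""

with sqlite3.connect(":memory:") as conn:
    conn.executescript(ddl)
    print([row[0] for row in conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table'")])
```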
Posted 2 weeks ago
5.0 - 10.0 years
6 - 10 Lacs
Noida, Gurugram, Bengaluru
Work from Office
">Informatica Developer 5-10 Years Noida, Gurgaon, Bengaluru Informatica Cloud IICS Experience Required : 5-10 Years Location : Delhi NCR Job Summary: We are seeking a skilled Application Integration Developer to design, develop, and maintain robust integration solutions using Informatica Intelligent Cloud Services (IICS). This role will be pivotal in connecting critical business systems such as Salesforce, ERP platforms, RevPro, and other SaaS applications to ensure seamless data flow, quality, integrity, and high performance across all integration points. Key Responsibilities: Design, develop, and maintain scalable and reliable integration solutions using IICS (Cloud Data Integration - CDI and Cloud Application Integration - CAI). Develop complex mappings, taskflows, and processes to support diverse integration needs. Integrate multiple data sources and targets including databases, APIs, and flat files. Collaborate with business and technical teams to understand requirements and translate them into efficient integration designs. Monitor and optimize the performance of integration workflows to meet business SLAs. Ensure data quality, integrity, and consistency across all integrated systems. Troubleshoot and resolve integration issues in a timely manner. Document integration processes, configurations, and best practices. Qualifications: Proven experience with Informatica Intelligent Cloud Services (IICS), specifically CDI and CAI modules. Strong knowledge of application integration concepts and techniques. Experience integrating Salesforce, ERP systems, RevPro, and other SaaS applications. Proficient with various data sources and targets including relational databases, REST/SOAP APIs, and file-based systems. Ability to develop complex mappings, workflows, and taskflows. Strong analytical, problem-solving, and communication skills.
Posted 2 weeks ago
5.0 - 10.0 years
9 - 13 Lacs
Noida
Work from Office
">QA Manual - EDW 5-10 Years Noida QA Manual EDW Job Summary: We are looking for a meticulous and analytical Quality Analyst to join our Enterprise Data Warehouse (EDW) team. In this role, you will be responsible for ensuring the accuracy, consistency, and integrity of data across the warehouse by designing and executing thorough testing strategies. You will collaborate closely with digital and business teams to validate data transformations, identify discrepancies, and ensure alignment with business requirements. Key Responsibilities: Develop and execute detailed test plans, test cases, and test strategies for EDW components and data pipelines. Perform functional and technical testing , including data validation, transformation logic verification, and end-to-end testing. Use SQL to validate data accuracy across source systems, staging, and data warehouse layers. Identify, log, and track defects using defect tracking tools; assist in root cause analysis and drive resolution in coordination with development teams. Collaborate with business analysts and developers to ensure thorough understanding of requirements and provide support during User Acceptance Testing (UAT) . Maintain testing documentation including test cases, data mapping documents, and QA status reports. Ensure adherence to business rules and data quality standards throughout the testing lifecycle. Continuously improve test processes and recommend quality best practices. Required Skills Qualifications: Proven experience in test planning, test execution , and data validation for data warehouse and ETL environments. Strong proficiency in SQL for data querying, validation, and debugging. Solid understanding of data warehousing concepts , data models, and ETL processes. Hands-on experience with defect tracking tools (e.g., JIRA, HP ALM, Azure DevOps). Excellent documentation and communication skills. Strong problem-solving ability and attention to detail . Ability to work independently and collaboratively in a cross-functional team environment. Preferred Qualifications: Experience in testing within major EDW platforms (e.g., Snowflake, Redshift, Teradata). Familiarity with automation tools or frameworks for data validation. Exposure to BI/reporting tools (e.g., Tableau, Power BI) for report testing.
Posted 2 weeks ago
8.0 - 10.0 years
16 - 20 Lacs
Hyderabad
Work from Office
Data: integrate data in a flexible, open, scalable platform to power healthcare's digital transformation. Analytics: deliver analytic applications and services that generate insight on how to measurably improve. Expertise: provide clinical, financial, and operational experts who enable and accelerate improvement. Engagement: attract, develop and retain world-class team members by being a best place to work.
Job Title: Principal Snowflake Data Engineer / Data Engineering Lead
Experience: 8-10 Years
Employment Type: Full-Time
About the Role: We are seeking a Principal Snowflake Data Engineer with 8-10 years of experience in data engineering, including deep specialization in the Snowflake Data Cloud, and a proven track record of technical leadership and team management. This role goes beyond individual contribution: you will also lead and mentor cross-functional teams across data synchronization, Data Operations, and ETL domains, driving best practices and architectural direction while ensuring the delivery of scalable, efficient, and secure data solutions across the organization.
Key Responsibilities
Technical Leadership: Own the architectural vision and implementation strategy for Snowflake-based data platforms. Lead the design, optimization, and maintenance of ELT pipelines and data lake integrations with Snowflake. Drive Snowflake performance tuning, warehouse sizing, clustering design, and cost governance. Leverage Snowflake-native features like Streams, Tasks, Time Travel, Snowpipe, and Materialized Views for real-time and batch workloads. Establish robust data governance, security policies (RBAC, data masking, row-level access), and regulatory compliance within Snowflake. Ensure best practices in schema design, data modeling, and version-controlled pipeline development using tools like dbt, Airflow, and Git.
Team & People Management: Lead and mentor the data synchronization, Data Operations, and ETL engineering teams, ensuring alignment with business and data strategies. Drive sprint planning, project prioritization, and performance management within the team. Foster a culture of accountability, technical excellence, collaboration, and continuous learning. Partner with product managers, business analysts, and senior leadership to translate business requirements into technical roadmaps.
Operational Excellence: Oversee end-to-end data ingestion and transformation pipelines using Spark, AWS Glue, and other cloud-native tools. Implement CI/CD pipelines and observability for data operations. Establish data quality monitoring, lineage tracking, and system reliability processes. Champion automation and Infrastructure-as-Code practices across the Snowflake and data engineering stack.
Required Skills: 8-10 years of data engineering experience with at least 4-5 years of hands-on Snowflake expertise. Proven leadership of cross-functional data teams (ETL, Data Operations, data synchronization).
Deep expertise in: Snowflake internals (clustering, caching, performance tuning); Streams, Tasks, Snowpipe, Materialized Views, UDFs; data governance (RBAC, secure views, masking policies). Strong SQL and data modeling (dimensional and normalized). Hands-on experience with: Apache Spark, PySpark, AWS Glue; orchestration frameworks (Airflow, dbt, Dagster, or AWS Step Functions); CI/CD and Git-based workflows. Strong understanding of data lakes, especially Delta Lake on S3 or similar.
Nice to Have: Snowflake certifications (SnowPro Advanced Architect preferred). Experience with Data Operations tools (e.g., Datadog, CloudWatch, Prometheus). Familiarity with Terraform, CloudFormation, and serverless technologies (AWS Lambda, Docker). Exposure to Databricks and distributed compute environments.
Why Join Us: Lead and shape the future of data architecture and engineering in a high-impact, cloud-native environment. Be the go-to Snowflake expert and technical mentor across the company. Enjoy the opportunity to manage teams, drive innovation, and influence strategy at scale. Flexible remote work options, high autonomy, and strong support for career development.
The above statements describe the general nature and level of work being performed in this job function. They are not intended to be an exhaustive list of all duties, and additional responsibilities may be assigned by Health Catalyst. Studies show that candidates from underrepresented groups are less likely to apply for roles if they don't have 100% of the qualifications shown in the job posting. While each of our roles has core requirements, please thoughtfully consider your skills and experience and decide if you are interested in the position. If you feel you may be a good fit for the role, even if you don't meet all of the qualifications, we hope you will apply. If you feel you are lacking the core requirements for this position, we encourage you to continue exploring our careers page for other roles for which you may be a better fit. At Health Catalyst, we appreciate the opportunity to benefit from the diverse backgrounds and experiences of others. Because of our deep commitment to respect every individual, Health Catalyst is an equal opportunity employer.
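To ground the Streams and Tasks expertise listed above, here is a hedged sketch of wiring a Snowflake stream to a scheduled task for incremental loads, issued through the snowflake-connector-python client; the connection parameters, warehouse, and object names are placeholders, and in practice such DDL would normally live in version-controlled migrations or dbt rather than an ad-hoc script.

```python
# Hedged sketch: create a stream on a raw table and a task that loads new
# rows on a schedule. Warehouse and table names are placeholders.
import snowflake.connector

STATEMENTS = [
    "CREATE OR REPLACE STREAM raw_orders_stream ON TABLE raw.orders",
    """
    CREATE OR REPLACE TASK load_orders_task
      WAREHOUSE = transform_wh
      SCHEDULE  = '15 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('raw_orders_stream')
    AS
      INSERT INTO curated.orders
      SELECT order_id, customer_id, amount
      FROM raw_orders_stream
      WHERE METADATA$ACTION = 'INSERT'
    """,
    "ALTER TASK load_orders_task RESUME",
]

def deploy(conn_params: dict) -> None:
    # conn_params would carry account, user, password/key, role, etc.
    conn = snowflake.connector.connect(**conn_params)
    try:
        cur = conn.cursor()
        for stmt in STATEMENTS:
            cur.execute(stmt)
    finally:
        conn.close()
```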
Posted 2 weeks ago
3.0 - 5.0 years
4 - 7 Lacs
Mumbai
Work from Office
This role supports our VFX studios by managing and troubleshooting SAP access requests, user interface issues, reporting configurations, data quality assurance, and overall data management. The ideal candidate will possess strong problem-solving abilities, collaboration skills, outstanding communication and organizational skills, as well as a strong customer-oriented mindset. Working within a fast-paced and dynamic environment, this role will be instrumental in ensuring seamless SAP operations, driving efficiency, and supporting users across multiple studios.
KEY RESPONSIBILITIES
1. SAP Access & User Support: Manage and troubleshoot user access requests, ensuring compliance with security policies. Administer role-based access control (RBAC), maintaining integrity and proper authorization structures. Provide first-line support for user interface (UI) issues, ensuring smooth system navigation.
2. Reporting Configuration & Troubleshooting: Configure, maintain, and troubleshoot SAP reporting tools to meet operational needs. Work with finance, HR, and production teams to optimize reporting structures for accurate data retrieval. Ensure reporting consistency across multiple studios in multiple locations worldwide.
3. Data Quality Assurance & Data Management: Monitor and enforce data integrity standards, ensuring accurate and reliable information across all SAP modules. Perform regular audits and validations to detect and correct data inconsistencies. Collaborate with IT and functional teams to ensure smooth data imports, migrations, and updates.
4. System Optimization & Collaboration: Drive SAP system improvements. Work closely with SAP functional consultants, developers, and end-users to ensure system efficiency. Partner with internal stakeholders to improve SAP workflows and usability.
5. Documentation & Training: Develop and maintain SAP system documentation, training guides, and troubleshooting manuals. Conduct training sessions to enhance user adoption and system proficiency.
Qualifications
THE IDEAL CANDIDATE PROFILE: 3-5+ years of experience in SAP system administration, user support, and troubleshooting. Strong knowledge of SAP modules relevant to VFX operations (e.g., Finance, HR, Project Management). Experience with reporting tools, data management, and access control mechanisms in SAP. Ability to troubleshoot technical issues and optimize workflows in a fast-paced environment. Familiarity with SAP security policies, role management, and compliance frameworks. Excellent problem-solving skills, with a proactive and analytical approach. Outstanding communication and collaboration skills, capable of engaging with technical and non-technical stakeholders. Strong organizational skills, able to manage multiple priorities efficiently. Customer-oriented mindset, ensuring users receive the highest level of support. Ability to work independently and take initiative while collaborating effectively with cross-functional teams.
Additional Information: All your information will be kept confidential according to EEO guidelines. Nothing in this job description restricts the Company's right to assign or reassign duties and responsibilities to this job at any time. The Company prohibits discrimination in employment against otherwise qualified applicants because of a physical or mental disability and will make reasonable accommodations to enable qualified persons with known disabilities to perform the essential functions of their job consistent with applicable law.
The Company will consider qualified applicants with criminal histories in a manner consistent with applicable law. To apply, please click the Apply button. Please review our Privacy Policy for information on how we collect and store your data.
Posted 2 weeks ago
6.0 - 8.0 years
6 - 10 Lacs
Bengaluru
Work from Office
" Senior Data Engineer (Contract) Location: Bengaluru, Karnataka, India About the Role: Were looking for an experienced Senior Data Engineer (6-8 years) to join our data team. Youll be key in building and maintaining our data systems on AWS. Youll use your strong skills in big data tools and cloud technology to help our analytics team get valuable insights from our data. Youll be in charge of the whole process of our data pipelines, making sure the data is good, reliable, and fast. What Youll Do: Design and build efficient data pipelines using Spark / PySpark / Scala . Manage complex data processes with Airflow , creating and fixing any issues with the workflows ( DAGs ). Clean, transform, and prepare data for analysis. Use Python for data tasks, automation, and building tools. Work with AWS services like S3, Redshift, EMR, Glue, and Athena to manage our data infrastructure. Collaborate closely with the Analytics team to understand what data they need and provide solutions. Help develop and maintain our Node.js backend, using Typescript , for data services. Use YAML to manage the settings for our data tools. Set up and manage automated deployment processes ( CI/CD ) using GitHub Actions . Monitor and fix problems in our data pipelines to keep them running smoothly. Implement checks to ensure our data is accurate and consistent. Help design and build data warehouses and data lakes. Use SQL extensively to query and work with data in different systems. Work with streaming data using technologies like Kafka for real-time data processing. Stay updated on the latest data engineering technologies. Guide and mentor junior data engineers. Help create data management rules and procedures. What Youll Need: Bachelors or Masters degree in Computer Science, Engineering, or a related field. 6-8 years of experience as a Data Engineer. Strong skills in Spark and Scala for handling large amounts of data. Good experience with Airflow for managing data workflows and understanding DAGs . Solid understanding of how to transform and prepare data. Strong programming skills in Python for data tasks and automation.. Proven experience working with AWS cloud services (S3, Redshift, EMR, Glue, IAM, EC2, and Athena ). Experience building data solutions for Analytics teams. Familiarity with Node.js for backend development. Experience with Typescript for backend development is a plus. Experience using YAML for configuration management. Hands-on experience with GitHub Actions for automated deployment ( CI/CD ). Good understanding of data warehousing concepts. Strong database skills - OLAP/OLTP Excellent command of SQL for data querying and manipulation. Experience with stream processing using Kafka or similar technologies. Excellent problem-solving, analytical, and communication skills. Ability to work well independently and as part of a team. Bonus Points: Familiarity with data lake technologies (e.g., Delta Lake, Apache Iceberg). Experience with other stream processing technologies (e.g., Flink, Kinesis). Knowledge of data management, data quality, statistics and data governance frameworks. Experience with tools for managing infrastructure as code (e.g., Terraform). Familiarity with container technologies (e.g., Docker, Kubernetes). Experience with monitoring and logging tools (e.g., Prometheus, Grafana).
Posted 2 weeks ago