12.0 - 15.0 years
55 - 60 Lacs
Ahmedabad, Chennai, Bengaluru
Work from Office
Dear Candidate,

We are looking for a Data Governance Specialist to establish and maintain data governance policies, ensuring data quality, compliance, and responsible usage across the organization.

Key Responsibilities:
- Define and enforce data governance policies and standards.
- Work with data owners and stewards to improve data quality.
- Monitor compliance with data privacy regulations (GDPR, HIPAA, etc.).
- Support metadata management, data lineage, and cataloging initiatives.
- Promote data literacy across departments.

Required Skills & Qualifications:
- Experience with data governance tools (Collibra, Alation, Informatica).
- Knowledge of regulatory frameworks and data privacy laws.
- Strong analytical and documentation skills.
- Understanding of data architecture, MDM, and data stewardship.
- Excellent stakeholder management and communication skills.

Soft Skills:
- Strong troubleshooting and problem-solving skills.
- Ability to work independently and in a team.
- Excellent communication and documentation skills.

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Srinivasa Reddy Kandi
Delivery Manager
Integra Technologies
Posted 2 weeks ago
1.0 - 6.0 years
4 - 8 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
CirrusLabs Private Limited is looking for an Informatica CDC Consultant to join our dynamic team and embark on a rewarding career journey.

Responsibilities:
- Developing and maintaining data integration workflows using Informatica PowerCenter.
- Designing and implementing data quality and data governance processes.
- Troubleshooting and resolving technical issues related to Informatica workflows and processes.
- Creating and maintaining technical documentation for data integration processes.
- Staying up to date with the latest Informatica technologies and trends in data management and integration.

About you:
- You have an entrepreneurial spirit.
- You enjoy working as part of well-knit teams.
- You value the team over the individual.
- You welcome diversity at work and within the greater community.
Posted 2 weeks ago
7.0 - 10.0 years
5 - 8 Lacs
Mumbai
Work from Office
Monitoring & Evaluation (M&E) System Design and Implementation:
- Manage and implement a robust M&E framework, tools, and systems aligned with AKAH's global and national strategies.
- Manage/establish performance indicators and benchmarks for ongoing and new programs.
- Ensure data collection, validation, analysis, and timely reporting across projects.

Program Monitoring and Reporting:
- Conduct regular monitoring visits to project sites.
- Work with program teams to ensure timely submission of quality data and reports.
- Prepare M&E reports, impact assessments, and dashboards for internal and donor use.
- Provide strategic feedback to program teams based on evidence and findings.

Capacity Building:
- Build capacity of staff and partners on M&E tools, processes, and techniques.
- Conduct training sessions on data collection, data quality assurance, and outcome tracking.

Evaluation and Learning:
- Coordinate mid-term and final evaluations of projects.
- Support baseline, endline, and impact assessment studies.
- Facilitate learning reviews and reflection workshops with teams and stakeholders.
- Document lessons learned, case studies, and best practices for knowledge sharing.

Data Management and Technology:
- Oversee data management platforms (e.g., KoboToolbox, Power BI, Excel dashboards).
- Ensure data security, integrity, and quality in compliance with AKAH and donor standards.
- Promote the use of digital tools for real-time M&E tracking.

Compliance and Donor Reporting:
- Support compliance with donor M&E requirements.
- Provide data and evidence for proposals, donor reports, and strategic plans.

Qualifications and Experience:
- Master's degree in Social Sciences, Development Studies, Statistics, Public Policy, or a related field.
- Minimum 7 to 10 years of experience in M&E, preferably in the development/humanitarian sector.
- Experience working with national/international NGOs or the UN is preferred.
- Strong knowledge of logical frameworks, RBM (Results-Based Management), and theory of change.

Skills and Competencies:
- Excellent analytical and statistical skills (SPSS, STATA, Excel, Power BI).
- Proficiency in digital data collection tools (e.g., ODK, Kobo).
- Strong communication and presentation skills in English and Hindi.
- Ability to synthesize complex information into actionable insights.
- Strong interpersonal skills and ability to work with diverse teams.
- High attention to detail and commitment to quality.
- Willingness to travel frequently to project sites in urban, rural, and remote areas.

Candidates from Mumbai are encouraged to apply.
Posted 2 weeks ago
3.0 - 8.0 years
20 - 25 Lacs
Hyderabad
Work from Office
The Sr Software Engineer will be part of a team of some of the best and brightest in the industry who are focused on full-cycle development of scalable web and responsive applications that touch our growing customer base every day. As part of the Labs team, you will work collaboratively with agile team members to design new system functionality and to research and remedy complex issues as they arise, embodying a passion for continuous improvement and test-driven development.

About Us: When you join iCIMS, you join the team helping global companies transform business and the world through the power of talent. Our customers do amazing things: design rocket ships, create vaccines, deliver consumer goods globally, overnight, with a smile. As the Talent Cloud company, we empower these organizations to attract, engage, hire, and advance the right talent. We're passionate about helping companies build a diverse, winning workforce and about building our home team. We're dedicated to fostering an inclusive, purpose-driven, and innovative work environment where everyone belongs.

Responsibilities:
- Design, develop, and maintain scalable data pipelines and ETL processes.
- Collaborate with software engineers, data scientists, and product managers to understand data requirements and provide tailored solutions.
- Optimize and enhance the performance of our data infrastructure to support analytics and reporting.
- Implement and maintain data governance and security best practices.
- Troubleshoot and resolve data-related issues and ensure data quality and integrity.
- Mentor and guide junior data engineers, fostering a culture of continuous learning and improvement.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 3+ years of experience in data engineering or a similar role.
- Proficiency in SQL and experience with relational databases (e.g., PostgreSQL, MySQL).
- Strong programming skills in Python.
- Familiarity with cloud platforms (e.g., AWS, GCP, Azure) and containerization (e.g., Docker).
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.
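The pipeline and SQL requirements above lend themselves to a small illustration. The sketch below uses Python's built-in sqlite3 in place of a production warehouse to show the extract-transform-load shape the role describes; the orders_raw/orders_clean tables and the paise-to-rupee conversion are invented for the example, not taken from any real schema.

```python
import sqlite3

def run_etl(conn):
    """Minimal extract-transform-load pass: copy raw order rows into a
    cleaned table, dropping rows with a missing customer id and
    converting amounts from paise to whole rupees."""
    cur = conn.cursor()
    cur.execute("CREATE TABLE IF NOT EXISTS orders_clean "
                "(customer_id TEXT, amount_inr INTEGER)")
    # Extract
    rows = cur.execute("SELECT customer_id, amount_paise FROM orders_raw").fetchall()
    # Transform: filter incomplete rows, normalise units
    cleaned = [(cid, paise // 100) for cid, paise in rows if cid]
    # Load
    cur.executemany("INSERT INTO orders_clean VALUES (?, ?)", cleaned)
    conn.commit()
    return len(cleaned)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders_raw (customer_id TEXT, amount_paise INTEGER)")
    conn.executemany("INSERT INTO orders_raw VALUES (?, ?)",
                     [("c1", 129900), (None, 500), ("c2", 4999)])
    print(run_etl(conn))  # 2 rows survive cleaning
```

A real pipeline would add idempotency and data-quality checks around each stage, but the extract/transform/load split stays the same.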
Posted 2 weeks ago
5.0 - 8.0 years
45 - 50 Lacs
Bengaluru
Work from Office
We are seeking a highly motivated and experienced Lead Data Engineer to join our growing engineering team. In this role, you will be a key contributor to building and scaling our data infrastructure, ensuring data quality, and enabling real-time analytics for our platform. You will work closely with engineers and product managers to deliver impactful solutions that drive business growth.

Key Focus Areas:
- Develop and maintain scalable backend services and APIs for our applications powered by real-time databases.
- Architect and implement data pipelines for ingestion, processing, and storage, leveraging cloud services, preferably AWS.
- Set up, manage, and optimize Apache Pinot for real-time analytics, including schema design, indexing strategies, and query tuning.
- Design and implement data lake strategies for storing and accessing raw and processed data.
- Implement data quality and anomaly checks and monitoring across backend and data pipelines.
- Develop and maintain ETL processes, integrating them with backend services.
- Optimize data models and storage for performance and efficiency.
- Contribute to data infrastructure design, focusing on scalability and fault tolerance.
- Troubleshoot and resolve data-related issues in backend and pipelines.
- Stay current with backend and data engineering trends.
- Participate in code reviews and improve engineering practices.
- Contribute to automated testing and deployment processes.

About You:
- 5-8 years of experience in data engineering.
- Strong understanding of data engineering concepts, including data architecture, pipeline design, data modeling, and data quality.
- Proficiency in SQL and experience with relational databases.
- Experience with Python and data processing libraries (e.g., Pandas).
- Experience with real-time analytics platforms (e.g., Apache Pinot/Druid) is highly desired.
- Experience with ETL tools and processes, including Fivetran or similar platforms.
- Experience with cloud platforms (e.g., AWS, GCP, Azure) is a plus.
- Strong problem-solving and debugging skills.
- Excellent communication and collaboration skills.
- Bachelor's degree in Computer Science or a related field.

Bonus Skills (Nice-to-Have):
- Experience with streaming technologies (e.g., Kafka, Flink).
- Experience setting up and managing open-source data tools (e.g., Superset, Redash, Airflow).
- Experience with monitoring and alerting tools (e.g., Prometheus, Datadog).
- Contributions to open-source projects.

What's in it for You:
- A well-deserved compensation package that recognizes your skills and contributions.
- Opportunity to share in the company's success through equity ownership.
- Work with hand-picked individuals who are experts in their domains and passionate about the product.
- Contribute to a rapidly growing product used by thousands of businesses globally.
- Flexible work options, including in-office and remote work for optimal work-life balance.
- Comprehensive health insurance coverage for you and your family.
- Dedicated mental health support, fostering a nurturing workplace environment.
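The "data quality and anomaly checks" responsibility above can be sketched in a few lines. This is a deliberately minimal z-score check of the kind a pipeline might run on a daily metric (e.g. row counts) before publishing a table; the threshold and data are illustrative and not tied to any specific stack.

```python
import statistics

def flag_anomalies(values, z_thresh=2.0):
    """Return indices of points whose z-score against the series mean
    exceeds z_thresh. Note: with a small sample, a single outlier caps
    the maximum attainable z-score near sqrt(n - 1), so the default
    threshold here is kept low."""
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:  # constant series: nothing to flag
        return []
    return [i for i, v in enumerate(values) if abs(v - mean) / stdev > z_thresh]

# Example: a daily row-count series with one suspicious day
print(flag_anomalies([100, 102, 98, 101, 99, 500]))  # [5]
```

Production systems typically use seasonal baselines rather than a global mean, but the flag-before-publish pattern is the same.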
Posted 2 weeks ago
1.0 - 3.0 years
3 - 7 Lacs
Bengaluru
Work from Office
We are seeking a skilled Data Analyst to join our dynamic team at Game Theory. This role is essential in transforming our raw data into actionable insights that drive product decisions and business strategy. The successful candidate will leverage strong SQL skills, MongoDB aggregation expertise, and proficiency with analytics tools like Metabase, ClickHouse, and Amplitude to extract meaningful patterns from our data. You will play a crucial role in helping us understand user behavior, optimize features, and redefine the way people play sports.

Key Responsibilities:
- SQL Development: Write efficient, complex SQL queries to extract and analyze data from relational databases, ensuring high performance and accuracy.
- MongoDB Analysis: Create advanced MongoDB aggregation pipelines to process and analyze large datasets from our NoSQL databases.
- Dashboard Creation: Design and maintain intuitive dashboards in Metabase that provide stakeholders with clear visualizations of key metrics and performance indicators.
- ClickHouse Implementation: Leverage ClickHouse for high-performance analytical queries on large datasets to support real-time analytics needs.
- User Behavior Analysis: Use Amplitude to track and analyze user behavior, identify patterns, and generate insights to improve user engagement and retention.
- Metrics Definition: Collaborate with product and business teams to define, implement, and track key performance metrics.
- Ad-hoc Analysis: Respond to business questions with data-driven insights through timely ad-hoc analyses.
- Data Quality: Ensure data accuracy and integrity through validation processes and quality checks.
- Reporting Automation: Develop automated reporting solutions to streamline data delivery and reduce manual effort.
- Insight Communication: Present data findings clearly to non-technical stakeholders, translating complex patterns into actionable recommendations.

Required Qualifications:
- Experience: 1-3 years of full-time work experience as a Data Analyst or in a similar analytics role.
- Technical Skills: Strong proficiency in SQL with the ability to write complex queries (joins, window functions, CTEs); experience with the MongoDB aggregation framework and query optimization; hands-on experience with analytics tools (Metabase, Amplitude, or similar platforms); familiarity with columnar databases like ClickHouse; competency in at least one programming or scripting language (Python, R, or JavaScript); knowledge of data visualization principles and best practices.
- Analytical Thinking: Strong ability to approach problems logically, break down complex issues, and derive meaningful conclusions from data.
- Business Acumen: Understanding of how data insights connect to business outcomes and product decisions.
- Communication: Excellent verbal and written communication skills to present findings clearly to diverse audiences.
- Education: Bachelor's degree in Computer Science, Statistics, Mathematics, Economics, or a related field.

Preferred Qualifications:
- Experience working with sports-related data or in the sports technology industry.
- Knowledge of A/B testing methodologies and statistical significance.
- Familiarity with data modeling concepts and dimensional modeling.
- Experience with version control systems (Git) for collaborative analytics work.
- Understanding of machine learning concepts and predictive analytics.
- Experience working in an agile development environment.
- Knowledge of data privacy regulations and best practices.
- Proficiency with Python data analysis libraries (Pandas, NumPy).

What We Offer:
- The opportunity to be at the forefront of sports technology innovation, using data to drive real impact.
- A dynamic and challenging environment that promotes personal and professional growth.
- Autonomy to explore datasets and uncover insights with significant business impact.
- Competitive compensation package, including equity options.
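As a concrete illustration of the "joins, window functions, CTEs" requirement, here is a self-contained sketch using Python's built-in sqlite3 (SQLite supports window functions from version 3.25): a CTE ranks each user's sessions by duration and keeps only the longest one. The sessions schema is invented for the example.

```python
import sqlite3

# CTE + window function: keep each user's longest session.
QUERY = """
WITH ranked AS (
    SELECT user_id,
           session_id,
           duration_s,
           ROW_NUMBER() OVER (
               PARTITION BY user_id ORDER BY duration_s DESC
           ) AS rn
    FROM sessions
)
SELECT user_id, session_id, duration_s
FROM ranked
WHERE rn = 1
ORDER BY user_id
"""

def longest_sessions(conn):
    return conn.execute(QUERY).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sessions (user_id TEXT, session_id TEXT, duration_s INTEGER)")
conn.executemany("INSERT INTO sessions VALUES (?, ?, ?)",
                 [("u1", "s1", 120), ("u1", "s2", 300), ("u2", "s3", 45)])
print(longest_sessions(conn))  # [('u1', 's2', 300), ('u2', 's3', 45)]
```

The same top-1-per-group shape appears constantly in product analytics (latest event per user, best score per player), which is why window functions are called out in the requirements.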
Posted 2 weeks ago
3.0 - 6.0 years
22 - 25 Lacs
Bengaluru
Work from Office
We are seeking a highly motivated and experienced SDE II - Backend and Data to join our growing platform engineering team. In this role, you will be a key contributor to building and scaling our data infrastructure, ensuring data quality, and enabling real-time analytics for our platform. You will work closely with other engineers and product managers to deliver impactful solutions that drive business growth.

What we are looking for in our Backend and Data Engineer (SDE2):
- Develop and maintain scalable backend services and APIs for our applications powered by real-time databases.
- Architect and implement data pipelines for ingestion, processing, and storage, leveraging cloud services, preferably AWS.
- Set up, manage, and optimize Apache Pinot for real-time analytics, including schema design, indexing strategies, and query tuning.
- Design and implement data lake strategies for storing and accessing raw and processed data.
- Implement data quality and anomaly checks and monitoring across backend and data pipelines.
- Develop and maintain ETL processes, integrating them with backend services.
- Optimize data models and storage for performance and efficiency.
- Contribute to data infrastructure design, focusing on scalability and fault tolerance.
- Troubleshoot and resolve data-related issues in backend and pipelines.
- Stay current with backend and data engineering trends.
- Participate in code reviews and improve engineering practices.
- Contribute to automated testing and deployment processes.

What we expect from you:
- 3-6 years of experience in backend development and data engineering.
- Strong understanding of data engineering concepts, including data architecture, pipeline design, data modeling, and data quality.
- Proficiency in SQL and experience with relational databases.
- Experience with Python and data processing libraries (e.g., Pandas).
- Experience with real-time analytics platforms (e.g., Apache Pinot/Druid) is highly desired.
- Experience with ETL tools and processes, including an understanding of Fivetran or similar platforms.
- Experience with cloud platforms (e.g., AWS, GCP, Azure) is a plus.
- Strong problem-solving and debugging skills.
- Excellent communication and collaboration skills.
- Bachelor's degree in Computer Science or a related field.

Bonus Points:
- Experience with streaming technologies (e.g., Kafka, Flink).
- Working experience setting up and managing open-source data tools (e.g., Superset, Redash, Airflow).
- Experience with monitoring and alerting tools (e.g., Prometheus, Datadog).
- Contributions to open-source projects.

What's in store for you at Uniqode:
- A well-deserved compensation package that recognizes your skills and contributions to our team.
- Join our journey with an opportunity to share in the company's success through equity ownership.
- Work with hand-picked individuals who are experts in their domain and passionate about the product.
- Contribute to a product that is rapidly growing and is the chosen platform of thousands of businesses across the globe.
- Experience the flexibility of working both in-office and remotely, optimizing your work-life balance.
- Secure your well-being with comprehensive health insurance coverage, ensuring you and your family peace of mind.
- Receive dedicated mental health support, fostering a nurturing workplace environment that values your emotional well-being.
Posted 2 weeks ago
2.0 - 4.0 years
11 - 15 Lacs
Gurugram
Work from Office
The role is responsible for executing data management processes aimed at ensuring clean and quality data in the KKR data ecosystem. They will be part of KKR's enterprise data group, which collects, manages, and harnesses the power of data across our diverse portfolio investments. They will work collaboratively across the firm to set standards and best practices for data management while providing the operating leverage to centrally support the roll-out and execution of these frameworks.

ROLES & RESPONSIBILITIES

Operational Excellence:
- Develop specifications, as well as testing and enhancing tools/applications in conjunction with the IT team, to maintain complete, accurate, and up-to-date data.
- Maintain consistent, accurate, and complete data within KKR's data ecosystem.
- Implement data quality controls leveraging industry-best tools, i.e., Collibra.
- Create and maintain data quality reporting functionality as per business needs.
- Ensure data governance practices and activities are embedded across business units.
- Execute and manage ad hoc data-related projects within specified deadlines.
- Collibra workflow development and maintenance.

Stakeholder Management:
- Collaborate with engineering and IT to support and make recommendations for enhanced digital reporting capabilities and automated data reconciliation.
- Communicate and work closely with relevant teams to close data gaps in a clear and timely manner.
- Serve as point of contact for data-related questions and updates from various internal and external groups, delivering ad-hoc analytics to answer key business questions in a timely manner.

Reporting & Governance:
- Design and document standard operating procedures for data management.
- Implement and own best-in-class data governance practices, ensuring that data is well defined and transparently documented.

QUALIFICATIONS
- Bachelor's degree or equivalent work experience required.
- 2-4 years of data operations experience in financial services.
- Experience in a multinational financial services organization and/or private equity preferred.
- Ability to manage standard reports, templates, and dashboards.
- Ability to validate and review data.
- Ability to provide support for internal stakeholders by sending email reminders, filling timesheets, and collecting information as per service requests.
- Ability to adhere to the compliance requirements of processes.
- Ability to develop and enhance data protection and management tools or applications.
- Ability to design and execute data management focusing on data governance and data quality activities.
- Experience using a tool like Collibra is a must.

Systems/Tools/Application knowledge:
- Experience with process design and process enhancement
- Proficiency in data operations and data management
- Advanced proficiency in Excel
- Skills in a BI tool such as Power BI
- Advanced SQL skills
- Experience with Python is a plus

- Displays high attention to detail
- Demonstrates outstanding initiative and strong work ethic
- Focuses on delivery excellence and accountability
- Displays teamwork orientation and is highly collaborative
- Displays strong integrity and professionalism
- Builds strong relationships with local and global colleagues
- Demonstrates a strong track record in accuracy and organization
- Demonstrates excellent written, verbal, and interpersonal communication skills
Posted 2 weeks ago
3.0 - 5.0 years
4 - 8 Lacs
Bengaluru
Work from Office
We are seeking a Regulatory Reporting Business Analyst to join our team and support the analysis, implementation, and maintenance of global transaction reporting obligations. This role requires deep expertise in regulatory reporting frameworks such as EMIR, MiFID II, SFTR, and CFTC reporting, along with a strong understanding of financial products and data management. As a Regulatory Reporting Business Analyst, you will play a crucial role in ensuring regulatory compliance and data accuracy by closely collaborating with multiple internal departments, including software development teams, the regulatory operations team, and the compliance team.

Primary duties will include:
- Analyze and interpret complex transaction reporting requirements across multiple regulatory frameworks (EMIR, MiFID, SFTR, CFTC, etc.), with a focus on field population, data transformation, and validation rules.
- Develop and maintain comprehensive data mapping documentation and data lineage diagrams, tracking data flows and transformations across various systems and products.
- Drive end-to-end investigation efforts to identify root causes and implement solutions for transaction reporting discrepancies and data quality issues.
- Lead User Acceptance Testing (UAT) initiatives by creating detailed test cases, coordinating with development teams, and collaborating with other business analysts to ensure quality deliverables.
- Coordinate cross-functionally with Operations, Technology, and Compliance teams to ensure successful implementation of regulatory reporting requirements across all regimes.
- Monitor regulatory changes and assess their impact on existing reporting processes, providing recommendations for necessary system or process modifications.
- Provide training and guidance to business users on regulatory reporting requirements and processes.
- Review and validate third-party vendor solutions for regulatory reporting, ensuring they meet business requirements and compliance standards.

To land this role you will need:
- 3-5 years of experience in a business analyst, data analyst, or quality analyst role associated with regulatory operations or compliance functions.
- Financial product knowledge (EQ, FI, COMM, IR, FX), including a detailed understanding of trade/transaction lifecycles and regulatory requirements.
- Deep understanding of global transaction reporting regulations, particularly MiFID, EMIR, SFTR, and CFTC.
- Experience in data mapping and data lineage documentation.
- Strong analytical and problem-solving skills with attention to detail.

What makes you stand out:
- Experience with SQL queries, stored procedures, and XML data formats.
- Experience with data visualization tools.
- Ability to work autonomously with minimum supervision.

Working environment: Hybrid
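The "field population ... and validation rules" work described above can be made concrete with a toy rule table. Everything here is illustrative: the field names and the LEI-shaped check sketch the pattern of per-field validation, and are not a reproduction of any actual EMIR/MiFID validation specification.

```python
import re
from datetime import date

# One rule per report field; a record fails a field if its rule returns False.
# Field names and rules are hypothetical examples, not a real regime's spec.
RULES = {
    "lei": lambda v: isinstance(v, str)
           and re.fullmatch(r"[A-Z0-9]{18}[0-9]{2}", v) is not None,
    "notional": lambda v: isinstance(v, (int, float)) and v > 0,
    "trade_date": lambda v: isinstance(v, date) and v <= date.today(),
}

def validate(record):
    """Return the names of fields that fail their validation rule."""
    return [field for field, rule in RULES.items() if not rule(record.get(field))]

record = {"lei": "5493001KJTIIGC8Y1R12",
          "notional": 1_000_000,
          "trade_date": date(2024, 1, 15)}
print(validate(record))  # []
```

Expressing rules as data rather than scattered if-statements is what makes it practical to reconcile the rule set against a regulator's field-by-field documentation.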
Posted 2 weeks ago
8.0 - 13.0 years
15 - 19 Lacs
Pune
Work from Office
KPI Partners is seeking a highly skilled and experienced GenAI Engineer with a strong background in data engineering and software development to join our team. The ideal candidate will focus on enhancing our information retrieval and generation capabilities, with specific experience in Azure AI Search, data processing for RAG (Retrieval-Augmented Generation), multimodal data integration, and familiarity with Databricks.

Key Responsibilities:
- Design, develop, and optimize Retrieval-Augmented Generation models to improve information retrieval and generation processes within our applications.
- Develop and maintain search solutions using Azure AI Search to ensure efficient and accurate information access.
- Process and prepare data to support RAG workflows, ensuring data quality and relevance.
- Integrate and manage various data types (e.g., text, images) to enhance retrieval and generation capabilities.
- Work closely with cross-functional teams to integrate data into our existing retrieval ecosystem, ensuring seamless functionality and performance.
- Ensure the scalability, reliability, and performance of data retrieval in production environments.
- Stay updated with the latest advancements in AI, ML, and data engineering to drive innovation and maintain a competitive edge.

What we're looking for:
- A Master's degree in Data Science or a related field is preferred.
- Approximately 8 years of experience in Data Science, MLOps, and Data Engineering.
- Proven experience in AI and ML solution implementation, particularly in semiconductor manufacturing.
- Proficiency in Python.
- Proven experience in data engineering and software development, with a focus on building and deploying RAG pipelines or similar information retrieval systems.
- Familiarity with processing multimodal data (e.g., text, images) for retrieval and generation tasks.
- Strong understanding of database systems (SQL and NoSQL) and data warehousing solutions.
- Proficiency in Azure AI, Databricks, and other relevant tools.
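To ground the retrieval half of the RAG pipeline described above, here is a deliberately naive sketch: documents are scored by term overlap with the query, and the top-k are what would be handed onward to a generator as context. A real deployment would use embeddings and a vector index such as Azure AI Search; the scoring function and the documents below are purely illustrative.

```python
def tokenize(text):
    """Lowercased bag-of-words; stands in for a real analyzer or embedder."""
    return set(text.lower().split())

def retrieve(query, documents, k=2):
    """Return the k documents sharing the most terms with the query."""
    q = tokenize(query)
    return sorted(documents, key=lambda d: len(q & tokenize(d)), reverse=True)[:k]

docs = [
    "spark cluster tuning guide",
    "pinot schema design notes",
    "office holiday rota",
]
print(retrieve("schema design for pinot", docs, k=1))  # ['pinot schema design notes']
```

However the scoring is implemented, the contract is the same: query in, ranked passages out, passages into the prompt; the "data processing for RAG" work in the posting is about making sure those passages are clean and well-chunked.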
Posted 2 weeks ago
3.0 - 6.0 years
5 - 8 Lacs
Noida
Work from Office
We are looking for a Data Engineer II, Analytics to join our growing team and help build scalable, high-performance data pipelines that enable meaningful business insights. You'll work with modern data tools, collaborate with cross-functional teams, and contribute to building a robust data foundation that supports Eightfold's AI-driven analytics and reporting needs.

What You Will Learn To Do:
- Design & Develop Pipelines: Build, maintain, and optimize reliable ETL/ELT pipelines to ingest, process, and transform large datasets from diverse sources using Databricks and Amazon Redshift.
- Data Modeling & Architecture: Support the design and implementation of scalable data models and architectures to meet evolving analytics and business intelligence needs.
- Data Quality & Integration: Ensure accuracy, consistency, and quality of integrated data from structured and unstructured sources across systems.
- Performance Tuning: Optimize the performance of queries, pipelines, and databases to ensure high efficiency and reliability for analytics workloads.
- Collaboration & Delivery: Partner with analytics engineers, data scientists, product managers, and business stakeholders to deliver high-quality data solutions that enable business insights and product innovations.
- Documentation & Best Practices: Contribute to documentation, promote data engineering best practices, and ensure data governance, security, and compliance standards are upheld.

What We Need:
- Experience: 3-6 years of hands-on experience as a Data Engineer or in a similar data engineering role, ideally in analytics-focused environments.
- Databricks: Practical experience building and managing pipelines and workflows using Databricks.
- Amazon Redshift: Strong understanding of Amazon Redshift, including data modeling and query optimization.
- Programming: Proficiency in SQL and working knowledge of Python or Scala for data processing tasks.
- ETL/ELT Tools: Hands-on experience developing and maintaining ETL/ELT processes.
- Big Data Tools (Good to Have): Familiarity with Apache Spark, Hadoop, Kafka, or other big data technologies is a plus.
- Analytical & Problem-Solving Skills: Ability to troubleshoot data issues and optimize performance effectively.
- Communication & Collaboration: Strong communication skills with a collaborative mindset in fast-paced environments.
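The ETL/ELT pipelines described above typically end in a merge ("upsert") step: a staged batch overwrites target rows that share its key and inserts the rest, which is what a warehouse MERGE statement does against a staging table. The sketch below mimics that contract in plain Python; the keys, fields, and returned counts are illustrative.

```python
def merge_batch(target, batch, key="id"):
    """Upsert batch rows into target (a dict keyed by primary key).
    Rows whose key already exists overwrite the old row; new keys are
    inserted. Returns (inserted, updated) counts, like a MERGE summary."""
    inserted = updated = 0
    for row in batch:
        k = row[key]
        if k in target:
            updated += 1
        else:
            inserted += 1
        target[k] = row
    return inserted, updated

target = {1: {"id": 1, "status": "pending"}}
batch = [{"id": 1, "status": "shipped"}, {"id": 2, "status": "pending"}]
print(merge_batch(target, batch))  # (1, 1)
```

Because re-applying the same batch only repeats the overwrites, the step is idempotent, which is what makes pipeline retries safe.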
Posted 2 weeks ago
2.0 - 7.0 years
4 - 8 Lacs
Hyderabad
Work from Office
The Salesforce Renewal Manager is responsible for owning and executing a portfolio of renewal contracts in an assigned territory. Renewal Managers partner with internal stakeholders such as the Sales and Customer Success & Growth (CSG) organizations to secure every renewal. They are responsible for minimizing financial attrition, locking in the most favorable terms, and identifying growth opportunities. Renewal Managers are responsible for ensuring that customers are set up for success while maximizing the financial results for Salesforce.

Responsibilities:
- Develop and execute win/win negotiation strategies for small account contract renewals that maximize contract value while protecting and enhancing customer trust.
- Own and manage the renewals process in collaboration with the account teams.
- Collaborate with internal resources such as Customer Success, Account Executives, and Sales Operations to ensure successful renewals.
- Identify customer requirements, uncover roadblocks, and manage the renewal to completion.
- Communicate risk clearly and take steps to mitigate it.
- Accurately maintain and update a rolling 120-day forecast of your territory and communicate any renewal risk to internal resources in order to develop resolution strategies.
- Follow and adhere to best practices for all internal processes including, but not limited to, Opportunity Management, Data Quality and accuracy, Quotations, and Forecasting.
- Achieve financial and strategic targets for minimizing attrition, positioning favorable terms, and boosting incremental revenue via up-sells, cross-sells, and add-ons.

Required Skills/Experience:
- 2+ years of demonstrated success in a Sales, Operations, or Account Management capacity with a focus on negotiating contracts.
- Negotiation skills that allow for value-based contract negotiations with the customer billing contact.
- Strong process management and the ability to manage a high volume of transactions and tasks.
- Customer management experience.
- Bachelor's degree.

Desired Skills/Experience:
- Knowledge of salesforce.com product and platform features, capabilities, and best use.
- Experience with an enterprise CRM or customer service application; experience with salesforce.com is a significant plus.
- Ability to manage transactions through every stage.

Leadership Qualities:
- PASSION: Passionate about Customer Success.
- BEGINNERS MIND: Always learning; approaches each interaction with an open mind; a great listener and hands-on.
- URGENCY: Ability to move fast and drive business value and results.
- OHANA: Embodies Aloha culture: a team player that everyone enjoys working with and has a generous heart.
- TRUST: Trusts the company's core values.
- ADAPTABLE: Excels in high levels of uncertainty and change.
Posted 2 weeks ago
3.0 - 6.0 years
15 - 19 Lacs
Bengaluru
Work from Office
We are seeking an enthusiastic and detail-oriented Entry-Level BI Analyst to join our Global Sales Compensation Team. This full-time role focuses on creating sales compensation-related reports, dashboards, and scorecards, primarily using Smartsheet, Power BI, and Excel. The ideal candidate will possess strong process improvement skills, a focus on data quality, and an interest in process automation. Knowledge of Sales Compensation and Callidus is desired.
How you'll spend your time here:
- Develop and maintain sales compensation reports, dashboards, and scorecards using Smartsheet, Power BI, and Excel.
- Collaborate with global cross-functional teams, including Sales Operations and Commissions, to gather requirements and ensure accurate reporting.
- Integrate data from various platforms, including Workday, Callidus, Salesforce, and Oracle, to ensure comprehensive reporting.
- Ensure data integrity and accuracy across all reporting platforms, with an emphasis on data quality.
- Assist in automating reporting processes to improve efficiency and reduce manual effort.
- Document processes and maintain operational documents for reporting activities.
- Contribute to process improvement initiatives to enhance reporting efficiency and effectiveness.
We'd love to talk to you if you have many of the following:
- Bachelor's degree in Business, Finance, Data Analytics, or a related field (recent graduates are encouraged to apply).
- Strong Excel skills for data analysis and reporting.
- Familiarity with Smartsheet and Power BI for report creation and data visualization.
- Experience with data integration from platforms such as Workday, Callidus, Salesforce, and Oracle is a plus.
- Knowledge of Sales Compensation is desired.
- Strong skills in process improvement, data quality, and process automation.
- Attention to detail and problem-solving abilities.
- Good communication and writing skills to effectively convey information and collaborate with team members.
- Ability to work collaboratively with diverse teams across different time zones.
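As a hedged illustration of the report automation and data-quality emphasis in this role: the sketch below aggregates raw commission rows into a per-rep attainment scorecard in plain Python. The field names (rep, quota, booked) and figures are invented, not a real compensation schema, and production work here would live in Smartsheet, Power BI, or Excel rather than code.

```python
# Hypothetical sketch of automating a sales-compensation scorecard refresh.
# All field names (rep, quota, booked) are illustrative, not a real schema.

def build_scorecard(rows):
    """Aggregate raw commission rows into a per-rep scorecard,
    flagging data-quality issues instead of silently dropping rows."""
    scorecard, issues = {}, []
    for row in rows:
        rep, quota, booked = row.get("rep"), row.get("quota"), row.get("booked")
        if not rep or quota in (None, 0):
            issues.append(row)          # surface bad data for manual review
            continue
        entry = scorecard.setdefault(rep, {"quota": 0.0, "booked": 0.0})
        entry["quota"] += quota
        entry["booked"] += booked or 0.0
    for entry in scorecard.values():
        entry["attainment_pct"] = round(100 * entry["booked"] / entry["quota"], 1)
    return scorecard, issues

rows = [
    {"rep": "A. Rao", "quota": 100_000, "booked": 85_000},
    {"rep": "A. Rao", "quota": 50_000, "booked": 60_000},
    {"rep": None, "quota": 20_000, "booked": 5_000},   # bad row: missing rep
]
scorecard, issues = build_scorecard(rows)
print(scorecard["A. Rao"]["attainment_pct"])  # 96.7
print(len(issues))                            # 1
```

The point of the sketch is the quality gate: bad rows are quarantined for review rather than skewing the scorecard.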
Posted 2 weeks ago
6.0 - 12.0 years
16 - 20 Lacs
Gurugram
Work from Office
Risk Assurance Services (RAS) is one of PwC's high-growth verticals. It supports clients in defining their strategy, formulating business objectives, and managing performance while achieving a balance between risk and opportunity or return. Our services within the Risk Assurance practice cover the entire risk & controls spectrum across Internal Audit, Governance, Risk & Controls, Contract & Compliance, Data Analytics, etc.
Technical Skills
- Experience in Internal Audit / Process Audit concepts & methodology
- Processes, subprocesses, and activities, as well as their relationships
- Must be proficient in MS Office
- Sarbanes-Oxley Act (SOX) / IFC reviews, SOPs
- Internal control concepts (e.g., preventive controls, detective controls, risk assessment, antifraud controls)
Soft Skills
- Clarity of thought, articulation, and expression
- Takes ownership; sincere and focused on execution
- Confident, with good verbal communication skills
- Ability to organize, prioritize, and meet deadlines
Mandatory skill sets: Internal Audit
Education qualification: MBA / M.Com / MCA / CA
Degrees/Field of Study required: Master of Business Administration, Chartered Accountant Diploma
Degrees/Field of Study preferred:
Required Skills: Internal Auditing
Additional Skills: Accepting Feedback, Accounting and Financial Reporting Standards, Active Listening, Analytical Thinking, Artificial Intelligence (AI) Platform, Auditing, Auditing Methodologies, Business Process Improvement, Coaching and Feedback, Communication, Compliance Auditing, Corporate Governance, Creativity, Data Analysis and Interpretation, Data Ingestion, Data Modeling, Data Quality, Data Security, Data Transformation, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Financial Accounting
Posted 2 weeks ago
3.0 - 8.0 years
5 - 10 Lacs
Bengaluru
Work from Office
Software Development Engineer II - Azure Data Engineering
Tesco India | Bengaluru, Karnataka, India | Full-Time | Permanent | Apply by 07-Jun-2025
About the role
We are looking for a skilled Data Engineer to join our team, working on end-to-end data engineering and data science use cases. The ideal candidate will have strong expertise in Python or Scala, Spark (Databricks), and SQL, building scalable and efficient data pipelines on Azure.
What is in it for you
At Tesco, we are committed to providing the best for you. As a result, our colleagues enjoy a unique, differentiated, market-competitive reward package, based on current industry practices, for all the work they put into serving our customers, communities and planet a little better every day. Our Tesco Rewards framework consists of three pillars - Fixed Pay, Incentives, and Benefits. Total Rewards offered at Tesco is determined by four principles - simple, fair, competitive, and sustainable.
- Salary - Your fixed pay is the guaranteed pay as per your contract of employment.
- Leave & Time-off - Colleagues are entitled to 30 days of leave (18 days of Earned Leave, 12 days of Casual/Sick Leave) and 10 national and festival holidays, as per the company's policy.
- Making Retirement Tension-Free - In addition to statutory retirement benefits, Tesco enables colleagues to participate in voluntary programmes like NPS and VPF.
- Health is Wealth - Tesco promotes programmes that support a culture of health and wellness, including insurance for colleagues and their family. Our medical insurance provides coverage for dependents, including parents or in-laws.
- Mental Wellbeing - We offer mental health support through self-help tools, community groups, ally networks, face-to-face counselling, and more for both colleagues and dependents.
- Financial Wellbeing - Through our financial literacy partner, we offer one-to-one financial coaching at discounted rates, as well as salary advances on earned wages upon request.
- Save As You Earn (SAYE) - Our SAYE programme allows colleagues to transition from being employees to Tesco shareholders through a structured 3-year savings plan.
- Physical Wellbeing - Our green campus promotes physical wellbeing with facilities that include a cricket pitch, football field, badminton and volleyball courts, along with indoor games, encouraging a healthier lifestyle.
You will be responsible for
- Design, build, and maintain scalable ETL/ELT data pipelines using Azure Data Factory, Databricks, and Spark.
- Develop and optimize data workflows using SQL and Python or Scala for large-scale data processing and transformation.
- Implement performance tuning and optimization strategies for data pipelines and Spark jobs to ensure efficient data handling.
- Collaborate with data engineers to support feature engineering, model deployment, and end-to-end data engineering workflows.
- Ensure data quality and integrity by implementing validation, error-handling, and monitoring mechanisms.
- Work with structured and unstructured data using technologies such as Delta Lake and Parquet within a Big Data ecosystem.
- Contribute to MLOps practices, including integrating ML pipelines, managing model versioning, and supporting CI/CD processes.
You will need
Primary Skills:
- Data Engineering & Cloud: Proficiency in the Azure data platform (Data Factory, Databricks). Strong skills in SQL and Python or Scala for data manipulation. Experience with ETL/ELT pipelines and data transformations. Familiarity with Big Data technologies (Spark, Delta Lake, Parquet).
- Data Optimization & Performance: Expertise in data pipeline optimization and performance tuning. Experience in feature engineering and model deployment.
- Analytical & Problem-Solving: Strong troubleshooting and problem-solving skills.
- Experience with data quality checks and validation.
Nice-to-Have Skills:
- Exposure to NLP, time-series forecasting, and anomaly detection.
- Familiarity with data governance frameworks and compliance practices.
- AI/ML basics, such as ML & MLOps integration: experience supporting ML pipelines with efficient data workflows; knowledge of MLOps practices (CI/CD, model monitoring, versioning).
About us
Tesco in Bengaluru is a multi-disciplinary team serving our customers, communities, and planet a little better every day across markets. Our goal is to create a sustainable competitive advantage for Tesco by standardising processes, delivering cost savings, enabling agility through technological solutions, and empowering our colleagues to do even more for our customers. With cross-functional expertise, a wide network of teams, and strong governance, we reduce complexity, thereby offering high-quality services for our customers. Tesco in Bengaluru, established in 2004 to enable standardisation and build centralised capabilities and competencies, makes the experience better for our millions of customers worldwide and simpler for over 3,30,000 colleagues.
Tesco Technology
Today, our Technology team consists of over 5,000 experts spread across the UK, Poland, Hungary, the Czech Republic, and India. In India, our Technology division includes teams dedicated to Engineering, Product, Programme, Service Desk and Operations, Systems Engineering, Security & Capability, Data Science, and other roles. At Tesco, our retail platform comprises a wide array of capabilities, value propositions, and products, essential for crafting exceptional retail experiences for our customers and colleagues across all channels and markets. This platform encompasses all aspects of our operations - from identifying and authenticating customers, managing products, pricing, promoting, and enabling customers to discover products, to facilitating payment and ensuring delivery.
By developing a comprehensive retail platform, we ensure that as customer touchpoints and devices evolve, we can consistently deliver seamless experiences. This adaptability allows us to respond flexibly without needing to overhaul our technology, thanks to the capabilities we have built.
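As a rough, purely illustrative sketch of the validate-transform-load shape described in the responsibilities above: the real pipelines would run on Azure Data Factory and Databricks/Spark, while this pure-Python version only shows validation, error routing, and transformation, with made-up record fields.

```python
# Illustrative only: this posting's pipelines run on Azure Data Factory /
# Databricks; this pure-Python sketch shows the validate -> transform -> load
# shape with explicit error handling, using invented record fields.

def validate(record):
    """Reject records that would corrupt downstream tables."""
    return isinstance(record.get("amount"), (int, float)) and bool(record.get("sku"))

def transform(record):
    """Normalise units; a stand-in for a Spark column expression."""
    return {"sku": record["sku"].upper(), "amount_gbp": record["amount"] / 100}

def run_pipeline(records):
    loaded, rejected = [], []
    for rec in records:
        if not validate(rec):
            rejected.append(rec)      # route to a quarantine/monitoring sink
            continue
        loaded.append(transform(rec))
    return loaded, rejected

loaded, rejected = run_pipeline([
    {"sku": "ab1", "amount": 250},
    {"sku": "", "amount": 99},        # fails validation: empty sku
    {"sku": "cd2", "amount": "bad"},  # fails validation: non-numeric amount
])
print(len(loaded), len(rejected))     # 1 2
```

Routing invalid records to a quarantine list instead of raising keeps the pipeline running while still making data-quality failures visible to monitoring.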
Posted 2 weeks ago
4.0 - 9.0 years
6 - 11 Lacs
Hyderabad
Work from Office
Software Engineering Advisor - HIH - Evernorth
About Evernorth: Evernorth Health Services, a division of The Cigna Group (NYSE: CI), creates pharmacy, care, and benefits solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention, and treatment of illness and disease more accessible to millions of people.
Position Summary: Evernorth, a leading health services company, is looking for an exceptional software engineer to add to our Data & Analytics Engineering organization. The software engineer is responsible for the creation and delivery of new application features and maintenance that help our customers reduce health costs, improve outcomes, provide financial security, satisfy our regulatory commitments, and measure business performance. This role requires you to be fluent in some of the critical technologies, proficient in others, and to have a hunger to learn on the job and add value to the business. Critical attributes of this role are the ability to work independently and collaboratively as part of a team, along with proficiency in the relevant technologies.
Job Description & Responsibilities: Software engineers translate business objectives into technical specifications, then code and work in an iterative, agile pattern daily. They have ownership over their work tasks and embrace interacting with all levels of the team, raising challenges when necessary. We aim to be cutting-edge engineers - not institutionalized developers.
Primary Skills: Python, SQL, Git, testing methodologies
Experience Required:
- 4+ years of experience in Python
- 4+ years of experience in SQL
- 4+ years of experience in Git
Experience Desired:
- Python frameworks: unit testing (pytest, unittest), API (Flask, Django), SQLAlchemy
- Web development and API development
- Experience with AWS development
- Experience with Databricks, PostgreSQL, Terraform, Docker, Linux
- Health care information domains preferred
Education and Training Required: Bachelor's degree (or equivalent) required.
Additional Skills:
- Strong communication skills.
- Take ownership and accountability.
- Write referenceable & modular code.
- Have a passion to learn.
- Have a quality mindset - not just code quality, but also ensuring ongoing data quality by monitoring data to identify problems before they have business impact.
General Shift (11:30 AM - 8:30 PM IST / 1:00 AM - 10:00 AM EST / 2:00 AM - 11:00 AM EDT)
Equal Opportunity Statement
Evernorth is an Equal Opportunity Employer, actively encouraging and supporting organization-wide involvement of staff in diversity, equity, and inclusion efforts to educate, inform, and advance both internal practices and external work with diverse client populations.
About Evernorth Health Services
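To illustrate the testing methodologies and modular-code expectations listed above, here is a minimal sketch in the pytest style the posting mentions; the function, claim fields, and threshold are all invented for the example.

```python
# A hedged sketch of the "referenceable & modular code" + testing mindset this
# role describes: a small pure function plus a pytest-style test. The function
# name and claim fields are invented for illustration.

def claims_over_threshold(claims, threshold):
    """Return claim ids whose amount exceeds threshold; pure and easy to test."""
    return [c["id"] for c in claims if c["amount"] > threshold]

def test_claims_over_threshold():
    claims = [
        {"id": "c1", "amount": 120.0},
        {"id": "c2", "amount": 80.0},
    ]
    assert claims_over_threshold(claims, 100.0) == ["c1"]
    assert claims_over_threshold([], 100.0) == []

test_claims_over_threshold()  # pytest would collect and run this automatically
print("tests passed")
```

Keeping the logic in a pure function with no I/O is what makes it trivially testable with pytest or unittest.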
Posted 2 weeks ago
2.0 - 4.0 years
3 - 6 Lacs
Hyderabad, Bengaluru
Work from Office
Paramatrix Technologies Pvt. Ltd is looking for a Data Modeller to join our dynamic team and embark on a rewarding career journey. As a Data Modeller, you will be responsible for designing and implementing data models, ensuring the integrity and performance of databases, and collaborating with other teams to understand data requirements. Your role is pivotal in creating efficient and effective data solutions that align with business objectives.
Key Responsibilities:
- Data Modeling: Develop, design, and maintain conceptual, logical, and physical data models based on business requirements. Ensure that data models are scalable, flexible, and support future business needs.
- Database Design: Collaborate with database administrators and developers to implement and optimize database structures. Design and implement indexing strategies to improve database performance.
- Requirements Analysis: Work closely with business analysts and stakeholders to understand data requirements and translate them into data models.
- Documentation: Create and maintain comprehensive documentation for data models, ensuring clarity and accessibility for other team members.
- Data Governance: Implement and enforce data governance policies and best practices to ensure data quality and consistency. Collaborate with data stewards to define and manage data standards.
- Data Integration: Collaborate with ETL (Extract, Transform, Load) developers to ensure smooth data integration processes. Design and optimize data integration workflows.
- Data Quality Assurance: Implement data quality checks and validation processes to ensure the accuracy and reliability of data.
- Collaboration: Work closely with cross-functional teams, including business analysts, data scientists, and software developers, to ensure seamless integration of data models into applications.
Posted 2 weeks ago
2.0 - 7.0 years
20 - 25 Lacs
Bengaluru
Work from Office
About Us
At ANZ, we're shaping a world where people and communities thrive, driven by a common goal: to improve the financial wellbeing and sustainability of our millions of customers.
About the Role
Role Location: Manyata Technology Park, Bangalore
We are looking for a talented Data Analyst to join our Australia Data tribe within the Australia Retail division. This role focuses on harnessing the power of data to generate insights, manage information, and support decision-making. As a Data Analyst, you will help foster a data-driven culture by developing analytics capabilities, promoting knowledge sharing, and ensuring adherence to standards, governance, and continuous learning across the tribe. In this role, you will support Customer Service and Operations (CSO) analytics initiatives, focusing on Home Lending and Workforce Management data.
What will your day look like?
- Solve complex business challenges by analysing large datasets, defining data roadmaps, and sourcing insights from multiple sources.
- Understand and manage the complete analytics process, from data collection to delivering actionable insights.
- Promote a fact-based decision-making and problem-solving culture, and present insights to drive innovation and improve propositions across the organization.
- Work closely with business stakeholders to understand their needs, challenges, and goals, and optimize strategies to enhance performance.
- Collaborate with other data analysts and data practitioners to define data requirements and ensure the delivery of impactful, data-driven solutions.
What will you bring?
- 7+ years of experience in the data domain, with expertise as a Data Analyst.
- Data Wrangling - Manages large, structured data from multiple sources, applying structured queries to transform, join, and extract data for enhanced analysis.
- Descriptive and Diagnostic Analytics - Analyses advanced trends in complex data, drawing insights from diverse sources to solve critical business challenges and develop prescriptive analytics.
- Data Visualization - Creates sophisticated visualizations, draws insights, and automates data visuals for decision-making using Qlik/Tableau.
- Risk & Issue Management, Agile Practices & Ways of Working - Independently assesses risks and applies data security principles while leading Agile teams, optimizing workflows, and coaching on best practices.
- Data Modelling - Handles complex data structures, building trust in data reliability and ensuring accurate connections for advanced analytics.
- Soft Skills - Problem solving, communication, collaboration, critical thinking, data storytelling.
- Data Quality & Validation Checks - Independently creates and applies checks to ensure accurate, complete data.
- Proficient in Python, Airflow, and Git for workflow automation and version control.
- Familiar with cloud platforms (AWS, Azure, GCP) for efficient data management and processing.
- Expertise in complex SQL queries for large-scale data analysis and reporting.
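The data quality & validation checks and SQL skills described above can be sketched with a small, self-contained example: a completeness check written in SQL against an in-memory SQLite table. The table and column names are made up for illustration; real checks would run against the bank's governed data platforms rather than SQLite.

```python
# Illustrative sketch of a data-quality completeness check expressed in SQL.
# Table and column names (loans, balance, branch) are invented for the example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE loans (loan_id TEXT, balance REAL, branch TEXT)")
conn.executemany(
    "INSERT INTO loans VALUES (?, ?, ?)",
    [("L1", 250.0, "Bangalore"), ("L2", None, "Melbourne"), ("L3", 90.0, None)],
)

# Completeness: share of rows with no missing critical fields.
row_total, row_complete = conn.execute(
    """
    SELECT COUNT(*),
           SUM(CASE WHEN balance IS NOT NULL AND branch IS NOT NULL
                    THEN 1 ELSE 0 END)
    FROM loans
    """
).fetchone()
completeness = row_complete / row_total
print(round(completeness, 2))  # 0.33
```

The same CASE-expression pattern extends to other dimensions of data quality (validity ranges, referential checks) as additional SUM columns in one scan of the table.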
Posted 2 weeks ago
12.0 - 15.0 years
20 - 25 Lacs
Bengaluru
Work from Office
What you'll do
Docusign is seeking a talented and results-oriented Director, Data Engineering to lead and grow the Bengaluru hub of our Global Data & Analytics Organisation. Your mission is to build the modern, trusted data foundation that powers enterprise analytics, internal decision-making, and customer-facing insights. As a key leader, you will partner closely with Engineering, Product, GTM, Finance, and Customer Success to set the data strategy for Docusign India, focusing on data architecture, the enterprise data foundation, and data ingestion. During a typical day, you will drive the development of governed data products, operationalize quality and observability, and lead a high-performing team of data engineers and architects. The ideal candidate will demonstrate strong leadership, a passion for innovation in AI & data technologies, and the drive to achieve "five-nines" reliability for our data platforms. This position is a people-manager role reporting to the Senior Director, Global Data & Analytics.
Responsibilities
- Own Snowflake architecture, performance, and cost governance; define modelling standards (star, data vault, or ELT-first) and enforce security/RBAC best practices.
- Scale our dbt codebase: design project structure, modular macros, end-to-end CI/CD, and automated testing so every pull request ships with quality gates and lineage metadata.
- Drive ingestion excellence via Fivetran (50+ connectors today, growing fast); establish SLAs for freshness, completeness, and incident response.
- Embed with business stakeholders (Finance, Sales Ops, Product, Legal & Compliance) to translate ambiguous questions into governed data products and trusted KPIs, especially SaaS ARR, churn, agreement throughput, and IAM adoption metrics.
- Lead and inspire a high-performing team of data engineers and architects; hire, coach, and set OKRs.
- Operationalise quality and observability using dbt tests, Great Expectations, lineage graphs, and alerting so we achieve five-nines reliability.
- Partner on AI initiatives: deliver well-modelled features to data scientists.
- Evaluate semantic layers, data contracts, and cost-optimisation techniques.
Job Designation
Hybrid: Employee divides their time between in-office and remote work. Access to an office location is required. (Frequency: minimum 2 days per week; may vary by team, but there will be a weekly in-office expectation.) Positions at Docusign are assigned a job designation of either In Office, Hybrid or Remote and are specific to the role/job. Preferred job designations are not guaranteed when changing positions within Docusign. Docusign reserves the right to change a position's job designation depending on business needs and as permitted by local law.
What you bring
Basic
- B.E./B.Tech. or M.S. in Computer Science, Data, or a related quantitative field.
- Advanced SQL and Python programming skills.
- 12+ years in data engineering with demonstrable success in an enterprise environment.
- 5+ years of experience as a people manager.
- Comfortable with Git-based DevOps.
Preferred
- Expert in Snowflake, dbt (production CI/CD, macros, tests), and Fivetran (setup, monitoring, log-based replication).
- Proven ability to model SaaS business processes: ARR/ACV, funnel, usage, billing, marketing, security, and compliance.
- Track record building inclusive, globally distributed teams.
- Adept at executive communication and prioritisation.
- Experience operationalizing data quality and observability using tools like dbt tests, data lineage, and alerting.
- Experience partnering on AI initiatives and delivering features for data scientists.
- Experience evaluating and implementing semantic layers, data contracts, and cost-optimization techniques.
Life at Docusign
Working here
Docusign is committed to building trust and making the world more agreeable for our employees, customers and the communities in which we live and work. You can count on us to listen, be honest, and try our best to do what's right, every day. At Docusign, everything is equal. We each have a responsibility to ensure every team member has an equal opportunity to succeed, to be heard, to exchange ideas openly, to build lasting relationships, and to do the work of their life. Best of all, you will be able to feel deep pride in the work you do, because your contribution helps us make the world better than we found it. And for that, you'll be loved by us, our customers, and the world in which we live.
Accommodation
Docusign is committed to providing reasonable accommodations for qualified individuals with disabilities in our job application procedures.
Applicant and Candidate Privacy Notice
#LI-Hybrid #LI-SA4
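One of the responsibilities above, establishing SLAs for data freshness, can be sketched in a hedged way as follows. In the Snowflake/dbt stack this posting describes, it would typically be a dbt source-freshness test or an observability tool; the plain-Python version below, with invented table names and SLA windows, only shows the underlying check.

```python
# Hedged sketch of a freshness-SLA check. Table names and SLA windows are
# assumptions for illustration, not Docusign's real data products.
from datetime import datetime, timedelta, timezone

FRESHNESS_SLA = {
    "arr_daily": timedelta(hours=24),
    "usage_events": timedelta(hours=2),
}

def stale_tables(last_loaded, now):
    """Return tables whose latest load breaches their freshness SLA."""
    return sorted(
        table for table, sla in FRESHNESS_SLA.items()
        if now - last_loaded[table] > sla
    )

now = datetime(2024, 1, 2, 12, 0, tzinfo=timezone.utc)
last_loaded = {
    "arr_daily": now - timedelta(hours=30),       # breaches the 24h SLA
    "usage_events": now - timedelta(minutes=45),  # within the 2h SLA
}
print(stale_tables(last_loaded, now))  # ['arr_daily']
```

In production the result would feed an alerting channel rather than a print, which is what turns the SLA from a document into an operational control.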
Posted 2 weeks ago
1.0 - 3.0 years
5 - 6 Lacs
Mumbai
Work from Office
Job Title: Senior Primary Research Associate - Company Primary Outreach (Private Markets)
Location: Mumbai, India
Team: PitchBook Company Primary Outreach Team
About PitchBook: The PitchBook team performs web-based research and utilizes technology tools to capture hard-to-find data on private capital markets, including private equity (PE), venture capital (VC), and mergers and acquisitions (M&A). This information allows PitchBook customers such as venture capital firms, private equity firms, limited partners, investment banks, law firms, and accounting firms to discover emerging companies, conduct research on potential investment opportunities, and gain a competitive edge in investment decision-making and negotiations.
Position Overview: We're looking for a Senior Primary Research Associate to join our dynamic Primary Outreach Team in Mumbai. This role focuses on building relationships and collecting proprietary data directly from industry professionals, including executives at private companies, investment firms, and service providers. You will play a key role in executing targeted outreach campaigns and ensuring high-quality data enrichment within the PitchBook platform. This role is ideal for someone who is curious, detail-oriented, great at communication, and passionate about delivering value through primary research and relationship management. This position will be with the PitchBook Primary Outreach team, based at our Mumbai office.
Key Responsibilities:
- Drive outreach initiatives to engage executives and key stakeholders across private equity, venture capital, and related industries.
- Design and execute email-based outreach campaigns to collect proprietary and high-impact data.
- Verify and enrich data for private companies, investors, and service providers featured in PitchBook's platform.
- Leverage internal tools and applications to input and manage data with accuracy and efficiency.
- Collaborate with internal experts and cross-functional teams to resolve data discrepancies and improve workflows.
- Support improvements in research processes, outreach strategies, and productivity tools.
- Act as a subject matter expert on outreach workflows, supporting the team with training, mentoring, and quality reviews.
- Manage special projects, own key processes, and contribute to team OKRs and performance metrics.
- Provide support and assist team members in navigating complex research problems.
- Actively participate in cross-team initiatives and provide insights for strategic improvements.
Requirements:
- 18 to 30 months of relevant work experience, preferably in financial research, data operations, or the broader financial services sector.
- Bachelor's degree required.
- Strong verbal and written communication skills, with the ability to interact professionally with stakeholders at all levels, including C-suite executives.
- Proven ability to manage multiple tasks under tight deadlines while maintaining a high level of data accuracy.
- High attention to detail, strong analytical skills, and a commitment to data quality.
- Proactive and self-motivated, with the ability to work independently as well as collaboratively.
- Comfortable working in a fast-paced, evolving environment, with a learning mindset and adaptability to new tools and processes.
- Demonstrated leadership potential through ownership of tasks, mentorship, and process improvement initiatives.
- Experience or familiarity with private markets (PE/VC/M&A) is a plus.
Morningstar is an equal opportunity employer.
Posted 2 weeks ago
15.0 - 16.0 years
8 - 11 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Program Manager | 15+ years of experience | Bangalore
Role:
- Lead Cross-Functional Teams: Oversee and coordinate AI/ML projects, ensuring they align with business objectives and are delivered on time and within budget.
- Develop Program Plans: Create and manage comprehensive program plans, including budgets, timelines, and resource allocation.
- Stakeholder Management: Collaborate with stakeholders, including data scientists, engineers, business analysts, and executives, to define project goals and scope.
- Performance Tracking: Monitor program progress, track performance metrics, and address potential roadblocks.
- Innovation and Strategy: Drive innovation by staying updated with the latest AI/ML trends and integrating them into the program strategy.
Mandatory Skillsets:
- 15 years of extensive experience in program management; adept at leading diverse teams and managing complex projects.
- Technical Expertise: Deep understanding of AI/ML technologies, including machine learning algorithms, tools (e.g., TensorFlow, PyTorch), and frameworks.
- Program Management: Proven experience in managing large-scale AI/ML projects, with strong skills in project planning, execution, and delivery.
- Leadership: Strong leadership skills with the ability to lead cross-functional teams and manage multiple projects simultaneously.
- Communication: Excellent communication skills to effectively convey technical concepts to non-technical stakeholders and executives.
- Analytical Skills: Proficiency in quantitative analysis, cost-effectiveness assessment techniques, and data quality metrics for AI products.
- Experience: Approximately 15 years of experience in program management, with a significant portion in AI/ML projects.
Good-to-Have Skillsets:
- Cloud Computing: Experience with cloud platforms (e.g., AWS, Google Cloud, Azure) and their AI/ML services.
- Business Acumen: Understanding of business processes and the ability to align AI/ML projects with business goals.
Posted 2 weeks ago
14.0 - 16.0 years
20 - 25 Lacs
Hyderabad
Work from Office
Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support, and rewards that will take you further. HSBC is one of the largest banking and financial services organizations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realize their ambitions. We are currently seeking an experienced professional to join our team in the role of Associate Director, Delivery Management Specialist.
Key Responsibilities:
- Provide technical guidance to the application teams on removal of IT configuration items as part of the end-to-end decommissioning process, and contribute towards project planning.
- Identify, assess, and mitigate risks associated with application decommissioning; provide guidance and support for data migration and ensure proper data archiving.
- Identify data quality issues pertaining to services/applications and work with ITSOs for remediation.
- Manage senior stakeholders, including business leaders, IT teams, and vendors, to ensure effective communication and collaboration.
- Monitor the execution of removal of IT assets and configuration items, ensuring zero tech debris and charges.
- Coordinate with various stakeholders for product/service delivery review, integration, and enhancement, and drive continuous improvements to related processes.
- Coordinate with decommissioning service request execution teams and escalate where necessary.
- Govern the application portfolio, ensuring correct ownership, including onboarding and transfer of applications in accordance with the organisation's procedures.
Requirements
To be successful in this role you should meet the following requirements:
- Bachelor's degree in computer science, engineering, or a related field.
- 14+ years of relevant IT experience.
- Strong understanding of application architecture and familiarity with decommissioning tools and techniques, such as data archiving/retention and infrastructure removal.
- Knowledge of IT service management frameworks, such as ITIL, and proven experience with project and programme management, primarily focused on IT Asset Management & Configuration Management.
- Good understanding of decommissioning processes for IT assets & configuration items.
- Good understanding of tools such as EIM, ServiceNow, ODS, UCMDB, Apptio, etc.
- Experience identifying and remediating data quality issues pertaining to application lifecycle data.
- Strong stakeholder management skills, including publishing scorecards/dashboards to CIOs.
- Experience with process improvements, automation, risk management, and control(s) adherence.
- Experience working on large-scale decoms, orphaned-assets management processes, and data centre exits/migrations would be an added advantage.
You'll achieve more when you join HSBC. HSBC is committed to building a culture where all employees are valued and respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working, and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.
Posted 2 weeks ago
12.0 - 14.0 years
35 - 40 Lacs
Bengaluru
Work from Office
Key Responsibilities:
Design and implement scalable, reliable, and high-performance data architectures to support business needs.
Develop and maintain real-time data streaming solutions using Kafka and other streaming technologies.
Utilize AWS cloud services to build and manage data infrastructure, ensuring security, performance, and cost optimization.
Create efficient and optimized data models for structured and unstructured datasets.
Develop, optimize, and maintain SQL queries for data processing, analysis, and reporting.
Work with cross-functional teams to define data requirements and implement solutions that align with business goals.
Implement ETL/ELT pipelines using Python and other relevant tools.
Ensure data quality, consistency, and governance across the organization.
Troubleshoot and resolve issues related to data pipelines and infrastructure.

Required Skills and Qualifications:
Experience in data engineering and architecture.
Proficiency in Python for data processing and automation.
Strong expertise in AWS (S3, Redshift, Glue, Lambda, EMR, etc.) for cloud-based data solutions.
Hands-on experience with Kafka for real-time data streaming.
Deep understanding of data modeling principles for transactional and analytical workloads.
Strong knowledge of SQL for querying and performance optimization.
Experience building and maintaining ETL/ELT pipelines.
Familiarity with big data technologies such as Spark, Hadoop, or Snowflake is a plus.
Strong problem-solving skills and the ability to work in a fast-paced environment.
Excellent communication and stakeholder management skills.

Experience: 12-14 years
Skills: Primary Skill: Data Engineering; Sub Skill(s): Data Engineering; Additional Skill(s): Kafka, Python, Data Modeling, ETL, Data Architecture, SQL, Redshift, PySpark
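The ETL/ELT pipeline responsibility above can be illustrated with a minimal, dependency-free sketch. This is a generic pattern, not this employer's stack; the table name, field names, and transformation rules are invented for illustration.

```python
# Minimal ETL sketch: extract raw records, transform them, load into SQLite.
# The payments schema and the validation rules are hypothetical examples.
import sqlite3

def extract(rows):
    """Extract: yield (user_id, amount) pairs, skipping malformed records."""
    for row in rows:
        try:
            yield int(row["user_id"]), float(row["amount"])
        except (KeyError, ValueError):
            continue  # data-quality guard: drop records that fail to parse

def transform(records):
    """Transform: keep positive amounts, rounded to 2 decimal places."""
    return [(uid, round(amt, 2)) for uid, amt in records if amt > 0]

def load(conn, records):
    """Load: write the cleaned records into the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS payments (user_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO payments VALUES (?, ?)", records)
    conn.commit()

def run_pipeline(raw_rows):
    """Run extract -> transform -> load and report (row_count, total_amount)."""
    conn = sqlite3.connect(":memory:")
    load(conn, transform(extract(raw_rows)))
    return conn.execute("SELECT COUNT(*), SUM(amount) FROM payments").fetchone()
```

For example, feeding the pipeline one valid record, one unparseable record, and one negative-amount record loads only the valid row. A production pipeline would swap the in-memory SQLite target for a warehouse such as Redshift and the list input for a Kafka consumer or S3 reader, but the extract/transform/load separation stays the same.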
Posted 2 weeks ago
10.0 - 15.0 years
20 - 25 Lacs
Pune
Work from Office
About Arctera
Arctera keeps the world's IT systems working. We can trust that our credit cards will work at the store, that power will be routed to our homes, and that factories will produce our medications because those companies themselves trust Arctera. Arctera is behind the scenes making sure that many of the biggest organizations in the world, and many of the smallest too, can face down ransomware attacks, natural disasters, and compliance challenges without missing a beat. We do this through the power of data and our flagship products: Insight, InfoScale, and Backup Exec. Illuminating data also helps our customers maintain personal privacy, reduce the environmental impact of data storage, and defend against illegal or immoral use of information. It's a task that continues to get more complex as data volumes surge. Every day, the world produces more data than it ever has before. And global digital transformation, and the arrival of the age of AI, has set the course for a new explosion in data creation. Joining the Arctera team, you'll be part of a group innovating to harness the opportunity of the latest technologies to protect the world's critical infrastructure and to keep all our data safe.

Principal IT Business Analyst - NetSuite - EPM, Procurement, Fixed Assets & Tax

Company Overview
Arctera is a new company recently separated from Veritas Technologies. Our mission is to help organizations thrive by delivering market-leading solutions that enable them to trust, access, and illuminate their critical data. From day one, we are proud to serve more than 75,000 customers worldwide. Arctera products are engineered to manage and protect data across the full span of its lifecycle. Arctera enables organizations to harness the power of their information, with Data Compliance, Data Protection & Data Resilience solutions designed to serve the world's largest and most complex heterogeneous environments.
Job Summary
We are seeking a skilled NetSuite ERP Business Analyst to work directly with our EPM, Procurement, Fixed Assets, and Tax teams to analyse and design robust end-to-end financial solutions. The role will lead business discussions to define business requirements and translate them into technology solutions by mapping business processes to NetSuite functionality.

Roles & Responsibilities
Analyse and understand complex business processes to identify requirements, working closely with business stakeholders and cross-functional IT teams.
Translate business requirements into NetSuite solutions by designing functions and workflows.
Ensure integrations with other applications support business operations.
Lead the design and implementation of financial process flows on the NetSuite platform, leveraging built-in NetSuite capabilities to streamline and enhance operations.
Support revenue recognition and month-end close processing.
Maintain data integrity and accuracy within the NetSuite system. Manage data imports and exports, and conduct regular audits to maintain data quality.
Provide technical support and training to end users. Troubleshoot issues and resolve system-related problems to ensure smooth operations.
Document and execute functional test plans, user guides, and other documentation to support the use and maintenance of the NetSuite platform.

Required Skills, Experience & Education
Bachelor's degree in computer science, engineering, or an equivalent field; Master's degree preferred.
10+ years of proven experience as a business analyst/functional consultant implementing ERP solutions across a variety of applications.
5 years of experience in NetSuite implementation/maintenance is required.
Exceptional knowledge of core global financial processes such as finance, order to cash, procure to pay, record to report, financial close, and intercompany transactions.
Strong command of NetSuite modules such as procurement, billing, accounting, revenue, tax, fixed assets, general ledger, advanced revenue recognition, and multi-currency.
Experience with the Enterprise Performance Management (EPM) module for planning, budgeting, forecasting, account reconciliation, financial close, and reporting processes across the entire organization.
Strong firsthand experience in customization and configuration of NetSuite, including creating custom fields, forms, reports, and dashboards.
Skilled with NetSuite data migration tools and proficient with SuiteScript, SuiteFlow, and SuiteAnalytics.
Exposure to integration technologies and patterns. Experience designing and implementing scalable and reliable integration solutions around NetSuite. Proficiency in mapping data flows, designing integration architectures, and documenting technical requirements.
Support for testing activities, including integration testing, end-to-end (business process) testing, and UAT.
Effective communication and collaboration skills, and adaptability to a dynamic start-up environment.
Familiarity with cloud platforms and services.
NetSuite certifications are a plus.
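The data-quality audit duty mentioned above can be sketched generically. Inside NetSuite such audits would typically use saved searches or SuiteScript (JavaScript); the Python sketch below only illustrates the idea, and the field names and rules are invented, not taken from the posting.

```python
# Generic record-audit sketch (illustrative only; field names are hypothetical,
# and a real NetSuite audit would run via saved searches or SuiteScript).
def audit_records(records, required_fields=("vendor_id", "amount", "currency")):
    """Return (index, problem) pairs for records failing basic quality checks."""
    problems = []
    for i, rec in enumerate(records):
        # Rule 1: every required field must be present and non-empty.
        for field in required_fields:
            if not rec.get(field):
                problems.append((i, f"missing {field}"))
        # Rule 2: numeric amounts must not be negative.
        amount = rec.get("amount")
        if isinstance(amount, (int, float)) and amount < 0:
            problems.append((i, "negative amount"))
    return problems
```

Running the audit over a clean record and a record with a blank vendor and a negative amount flags only the second record, once per violated rule.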
Posted 2 weeks ago
5.0 - 9.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Key Responsibilities:
Design and build scalable data architectures and pipelines that support machine learning, analytics, and IoT initiatives.
Develop and optimize data models and algorithms to process and analyse large-scale, complex data sets.
Implement data governance, security, and compliance measures to ensure high-quality data.
Collaborate with cross-functional teams (engineering, product, and business) to translate business requirements into data-driven solutions.
Evaluate, integrate, and optimize new data technologies to enhance analytics capabilities and drive business outcomes.
Apply statistical methods, machine learning models, and data visualization techniques to deliver actionable insights.
Establish best practices for data management, including data quality, consistency, and scalability.
Conduct analysis to identify trends, patterns, and correlations within data to support strategic business initiatives.
Stay updated on the latest trends and innovations in data technologies and emerging data management practices.

Skills Required:
Bachelor's or Master's degree in data science, computer science, engineering, statistics, or a related field.
4.5-9 years of experience in data engineering, data science, or a similar analytical role, with a focus on emerging technologies.
Proficiency with big data frameworks (e.g., Hadoop, Spark, Kafka) and experience with modern cloud platforms (AWS, Azure, or GCP).
Solid skills in Python, SQL, and optionally R, along with experience using machine learning libraries such as scikit-learn, TensorFlow, or PyTorch.
Experience with data visualization tools (e.g., Tableau, Power BI, or D3.js) to communicate insights effectively.
Familiarity with IoT and edge computing data architectures is a plus.
Understanding of data governance, compliance, and privacy standards.
Ability to work with both structured and unstructured data.
Excellent problem-solving, communication, and collaboration skills, with the ability to work in a fast-paced, cross-functional team environment.
A passion for emerging technologies and a continuous desire to learn and innovate.

Notice Period: Immediate to 15 days only
Work Location: Bangalore (Hybrid)
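The responsibility above of identifying trends and correlations in data can be sketched with a small, dependency-free example. The two sample series are invented for illustration and not drawn from the posting.

```python
# Minimal statistical-analysis sketch: Pearson correlation between two metrics,
# standard library only. The sample series below are hypothetical.
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Two perfectly linearly related series correlate at r = 1.0 (up to
# floating-point error); real IoT data would be noisier.
sensor_temp = [20.0, 21.5, 23.0, 24.5, 26.0]
energy_use = [5.0, 5.3, 5.6, 5.9, 6.2]
r = pearson_r(sensor_temp, energy_use)
```

In practice this single coefficient would be one step in a larger workflow (e.g., a correlation matrix over a Spark or pandas DataFrame feeding a dashboard), but the underlying statistic is the same.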
Posted 2 weeks ago