
1190 Normalization Jobs - Page 12

JobPe aggregates listings so they are easy to find in one place; you apply directly on the original job portal.

0 years

0 Lacs

Bengaluru, Karnataka, India

Remote

About Firstsource: Firstsource Solutions Limited, an RP-Sanjiv Goenka Group company (NSE: FSL, BSE: 532809, Reuters: FISO.BO, Bloomberg: FSOL:IN), is a specialized global business process services partner, providing transformational solutions and services spanning the customer lifecycle across Healthcare, Banking and Financial Services, Communications, Media and Technology, Retail, and other diverse industries. With an established presence in the US, the UK, India, Mexico, Australia, South Africa, and the Philippines, we make it happen for our clients, solving their biggest challenges with hyper-focused, domain-centered teams and cutting-edge tech, data, and analytics. Our real-world practitioners work collaboratively to deliver future-focused outcomes.

DBA Developer: Perform DBA tasks such as SQL Server installation, backups, and configuring HADR, clustering, and log shipping. Performance tuning; audit and compliance; knowledge of databases. Design database solutions using tables, stored procedures, functions, views, and indexes. Transfer data from the Dev environment to Production and other related environments; schema comparison; bulk operations; server-side coding. Understanding of normalization, denormalization, primary keys, foreign keys and constraints, transactions, ACID, indexes as an optimization tool, and views. Work with the Database Manager in creating physical tables from logical models. ETL and data migration (using CSV, Excel, and TXT files); ad hoc reporting. Migration of databases from older versions of SQL Server to newer versions. Distributed databases, remote servers, and configuring linked servers. Integrating SQL Server with Oracle using OPENQUERY.

⚠️ Disclaimer: Firstsource follows a fair, transparent, and merit-based hiring process. We never ask for money at any stage. Beware of fraudulent offers and always verify through our official channels or @firstsource.com email addresses.
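The duty list above leans on core relational concepts: normalization, primary and foreign keys, constraints, and indexes as an optimization tool. Purely as an illustration of those ideas (not part of the posting), the sketch below uses Python's standard-library sqlite3 module with invented table names; the role itself would work in SQL Server.

```python
import sqlite3

# Minimal sketch: a normalized customer/order schema with PK/FK constraints
# and an index used as an optimization tool. All names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # enforce FK constraints in SQLite

conn.executescript("""
CREATE TABLE customers (
    customer_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL,
    city        TEXT
);
CREATE TABLE orders (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customers(customer_id),
    amount      REAL CHECK (amount >= 0)
);
-- Indexing the foreign key speeds up joins and lookups by customer.
CREATE INDEX idx_orders_customer ON orders(customer_id);
""")

conn.execute("INSERT INTO customers VALUES (1, 'Acme Ltd', 'Bengaluru')")
conn.execute("INSERT INTO orders VALUES (100, 1, 2500.0)")

# The join relies on the key relationship instead of duplicating customer
# attributes inside the orders table, which is what normalization buys here.
for row in conn.execute("""
    SELECT c.name, o.order_id, o.amount
    FROM orders o JOIN customers c ON c.customer_id = o.customer_id
"""):
    print(row)
```

Keeping customer attributes in one place avoids the update anomalies that a denormalized orders table would introduce, which is the trade-off the posting expects candidates to be able to reason about.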

Posted 2 weeks ago

Apply

5.0 - 6.0 years

3 - 6 Lacs

Hyderābād

On-site

Job Title: SQL Developer Department: IT Reports To: HR Head Location: Hyderabad Employment Type: Full-time Job Summary: We are looking for a highly skilled SQL Developer with 5 to 6 years of hands-on experience in SQL development, database design, and performance tuning. The ideal candidate will work closely with data analysts, application developers, and business stakeholders to ensure efficient and optimized data solutions. Key Responsibilities: Design, develop, and maintain complex SQL queries, stored procedures, functions, and views. Perform performance tuning, query optimization, and troubleshooting. Work with large data sets and ensure data integrity and consistency. Collaborate with application developers and business analysts to understand data requirements. Maintain and optimize existing SQL code for better performance and scalability. Create and maintain documentation of database architecture and processes. Implement data security measures and follow best practices in database design. Support ETL processes and reporting needs when necessary. Required Skills: Strong proficiency in SQL (T-SQL / PL-SQL / MySQL / PostgreSQL – as applicable). Experience with one or more RDBMS: SQL Server, Oracle, MySQL, or PostgreSQL. Hands-on experience with writing complex queries, indexing, and performance tuning. Understanding of normalization, data modeling, and relational database concepts. Experience with SSIS/SSRS/ETL tools (if required). Job Type: Full-time Pay: ₹33,333.00 - ₹50,000.00 per month Benefits: Provident Fund Schedule: Day shift Work Location: In person
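The responsibilities above center on query optimization, indexing, and performance tuning. As a rough illustration only (using Python's built-in sqlite3 with an invented table, since SQLite's EXPLAIN QUERY PLAN is merely a stand-in for the execution-plan tooling in SQL Server or Oracle), the sketch below shows how adding an index changes a query plan from a full table scan to an index search.

```python
import sqlite3

# Illustrative only: compare query plans before and after adding an index.
# Schema and data are made up; on the job this analysis would use the RDBMS's
# own execution-plan tools.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales (region, amount) VALUES (?, ?)",
    [("south" if i % 2 else "north", float(i)) for i in range(10_000)],
)

query = "SELECT COUNT(*), SUM(amount) FROM sales WHERE region = ?"

print("Before index:")
for row in conn.execute("EXPLAIN QUERY PLAN " + query, ("north",)):
    print(" ", row)  # expect a full table SCAN of sales

conn.execute("CREATE INDEX idx_sales_region ON sales(region)")

print("After index:")
for row in conn.execute("EXPLAIN QUERY PLAN " + query, ("north",)):
    print(" ", row)  # expect a SEARCH using idx_sales_region
```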

Posted 2 weeks ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

What will you Do? Conduct comprehensive internet research to gather external data sources and leverage them to enrich our company's internal data. Utilize advanced Microsoft Excel functions and formulas to analyze, manipulate, and validate large datasets efficiently. Develop and maintain data enrichment processes, including company domain enrichment and matching against CRM data using fuzzy matching techniques. Collaborate with cross-functional teams to define data quality standards and establish best practices for data enrichment and maintenance. Identify and resolve data quality issues, such as duplicate records, inconsistent data formats, and missing values. Perform data cleansing activities, including data standardization, normalization, and deduplication. Evaluate and validate data from internet sources to ensure accuracy and reliability. Stay updated with industry trends and advancements in data enrichment techniques to enhance data quality and efficiency. Document data enrichment processes, procedures, and guidelines for future reference and training purposes. What will you Need? Certification in Korean Language - TOPIK II(Level 3/Level 4) Basic understanding of Microsoft Excel, such as data analysis, complex formulas, and pivot tables. Strong research and analytical skills, with the ability to identify relevant data sources and extract accurate information. Excellent attention to detail, ensuring accuracy and completeness in data enrichment processes. Strong communication and collaboration skills, with the ability to work effectively in cross-functional teams.
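The enrichment workflow above calls for matching externally gathered company data against CRM records using fuzzy matching, along with standardization and deduplication. Below is a minimal sketch of that idea, assuming made-up record lists and an arbitrary 0.85 score threshold, using Python's standard-library difflib; a production pipeline would add richer normalization (legal suffixes, token-based scoring) and manual review of low-confidence matches.

```python
from difflib import SequenceMatcher

# Hypothetical CRM accounts and externally researched names.
crm_accounts = ["Tata Consultancy Services", "Infosys Limited", "Wipro Ltd"]
scraped_names = ["TATA Consultancy Services Ltd.", "Infosys Ltd", "Wipro"]

def normalize(name: str) -> str:
    """Standardize case and strip common punctuation before comparing."""
    return name.lower().replace(".", "").replace(",", "").strip()

def best_match(name, candidates):
    """Return the candidate with the highest similarity ratio and its score."""
    scored = [(c, SequenceMatcher(None, normalize(name), normalize(c)).ratio())
              for c in candidates]
    return max(scored, key=lambda pair: pair[1])

for raw in scraped_names:
    match, score = best_match(raw, crm_accounts)
    status = "matched" if score >= 0.85 else "needs review"  # threshold is arbitrary
    print(f"{raw!r:35} -> {match!r:30} score={score:.2f} ({status})")
```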

Posted 2 weeks ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

Work Level: Individual Core: Responsible Leadership: Team Alignment Industry Type: Information Technology Function: Database Administrator Key Skills: PLSQL, SQL Writing, mSQL Education: Graduate Note: This is a requirement for one of the Workassist Hiring Partners. Requirements: Currently pursuing or recently completed a degree in Computer Science, Information Technology, or a related field. Strong understanding of SQL and relational database concepts. Experience with databases such as MySQL, PostgreSQL, SQL Server, or Oracle. Ability to write efficient and optimized SQL queries. Basic knowledge of indexing, stored procedures, and triggers. Understanding of database normalization and design principles. Good analytical and problem-solving skills. Ability to work independently and in a team in a remote setting. Company Description: Workassist is an online recruitment and employment solution platform based in Lucknow, India. We provide relevant profiles to employers and connect job seekers with the best opportunities across various industries. With a network of 10,000+ recruiters, we help employers recruit talented individuals from sectors such as Banking & Finance, Consulting, Sales & Marketing, HR, IT, Operations, and Legal. We have adapted to the new normal and strive to provide a seamless job search experience for job seekers worldwide. Our goal is to enhance the job seeking experience by leveraging technology and matching job seekers with the right employers. For a seamless job search experience, visit our website: https://bit.ly/3QBfBU2 (Note: There are many more opportunities apart from this one on the portal. Depending on your skills, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!

Posted 3 weeks ago

Apply

0 years

0 Lacs

Assam, India

On-site

The Database Administrator is responsible for the design, implementation, maintenance, and performance tuning of critical database systems to ensure high availability, security, and optimal performance. The Database Administrator will work closely with application teams, system administrators, and project stakeholders to ensure that database systems are robust, scalable, and aligned with organizational goals, while also managing data integrity, access controls, and compliance with relevant policies. Skilled in working with relational databases such as Postgres and MariaDB and non-relational databases like MongoDB. Expertise in database design, normalization, and optimization. Knowledge of SQL and query optimization. Familiarity with backup and recovery procedures. Understanding of high availability and disaster recovery solutions. Experience with database security and access control. Proven track record of managing and maintaining large-scale databases. Experience with both on-premises and cloud-based database environments. Strong analytical and problem-solving skills related to database performance and scalability. Installed, configured, and maintained database systems based on organizational needs. Implemented and optimized database parameters to ensure optimal performance. Conducted performance tuning and optimization of queries and database structures. Monitored and analyzed system performance, making recommendations for improvements. Designed and implemented backup and recovery strategies to ensure data integrity and availability. Conducted regular testing of backup and recovery procedures. Provided timely and effective support for database-related issues. Conducted root cause analysis for incidents and implemented preventive measures. Maintained comprehensive documentation of database configurations, procedures, and best practices. Responsibilities: Database Strategy & Architecture: Contribute to the design and implementation of scalable and secure database solutions that align with organizational needs. Work collaboratively with IT and development teams to support the development of reliable and efficient database architectures. Apply database design best practices and assist in enforcing standards across development and production environments. Support the evaluation and adoption of new database tools, technologies, and frameworks under the guidance of technical leads. Database Administration & Maintenance: Manage and maintain the operational health of production and non-production databases, ensuring optimal uptime and performance. Perform routine database maintenance tasks such as backups, indexing, archiving, and patching. Implement and regularly test disaster recovery plans, ensuring data availability and integrity. Monitor system logs, resolve issues related to slow queries, deadlocks, or storage bottlenecks, and escalate where needed. Security & Compliance: Ensure database security through role-based access control, encryption, and secure configurations. Monitor for unauthorized access or unusual activity, working with the security team to respond to threats. Support compliance initiatives by ensuring databases adhere to relevant regulatory standards (e.g., GDPR, HIPAA, or local data laws). Maintain and implement database security policies and assist in audits and reviews as required. Performance Tuning & Optimization: Analyze database workloads to identify and address performance bottlenecks. Optimize SQL queries, indexes, and execution plans for better efficiency.
Participate in capacity planning and help forecast database scaling needs. Collaborate with developers to review and optimize database schemas and application queries. Database Deployment & Integration: Coordinate the deployment of database updates, patches, and schema changes with minimal operational impact. Support database migration and integration efforts across systems and applications. Assist with cloud platform integrations and ensure database components interact smoothly with analytics tools and data pipelines. Database Monitoring & Reporting: Implement and manage monitoring tools to track database performance, uptime, and resource utilization. Generate routine health check reports and highlight areas for improvement. Provide input into database performance dashboards and reporting tools used by the IT or DevOps teams. Documentation & Best Practices: Maintain accurate documentation for database configurations, maintenance procedures, and incident resolutions. Follow and contribute to database management policies and operational standards. Keep troubleshooting guides and knowledge base entries up to date for use by the IT support team. Collaboration with Business Teams: Work closely with business and application teams to understand data requirements and support solution development. Ensure databases are structured to support reporting, analytics, and business intelligence tools. Assist in designing and maintaining data models that reflect evolving business processes. Qualification: B.E./B.Tech in any specialization or MCA. DBA certification or related certifications are preferable. Overall experience in the design, implementation, and management of database systems. 7 or more years of experience in large and complex IT systems development and implementation projects. Experienced in database management. Fluency in English and Hindi (speaking, reading, and writing). Fluency in Assamese preferable. (ref:hirist.tech)

Posted 3 weeks ago

Apply

5.0 - 31.0 years

3 - 6 Lacs

Secunderabad, Hyderabad Region

On-site

Job Title: SQL Developer / Database Specialist Experience: 5 to 6 Years Location: [Your Location or "Remote"] Employment Type: Full-time Job Description: We are looking for a highly skilled SQL Developer with 5 to 6 years of hands-on experience in SQL development, database design, and performance tuning. The ideal candidate will work closely with data analysts, application developers, and business stakeholders to ensure efficient and optimized data solutions. Key Responsibilities: Design, develop, and maintain complex SQL queries, stored procedures, functions, and views. Perform performance tuning, query optimization, and troubleshooting. Work with large data sets and ensure data integrity and consistency. Collaborate with application developers and business analysts to understand data requirements. Maintain and optimize existing SQL code for better performance and scalability. Create and maintain documentation of database architecture and processes. Implement data security measures and follow best practices in database design. Support ETL processes and reporting needs when necessary. Required Skills: Strong proficiency in SQL (T-SQL / PL-SQL / MySQL / PostgreSQL – as applicable). Experience with one or more RDBMS: SQL Server, Oracle, MySQL, or PostgreSQL. Hands-on experience with writing complex queries, indexing, and performance tuning. Understanding of normalization, data modeling, and relational database concepts. Experience with SSIS/SSRS/ETL tools (if required).

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Position Title: SQL/PLSQL Developer Experience Level: 5 Years Location: Gurgaon Employment Type: Full-Time Job Summary: We are seeking an experienced SQL/PLSQL Developer to join our team. The ideal candidate will have a strong background in designing, developing, and optimizing database solutions. They will work closely with business analysts, data architects, and application developers to ensure seamless integration of database solutions into our applications and processes. The candidate should have experience in data warehousing projects, should have worked on at least one ETL tool such as Informatica, ODI, or DataStage, and should also have a good understanding of data warehousing concepts and data modelling skills. Key Responsibilities: Database Development: Write and maintain complex SQL queries, stored procedures, functions, triggers, and packages. Design and optimize PLSQL-based solutions for high-performance transactional and analytical applications. Implement advanced database features like partitions, materialized views, and indexing strategies. Performance Tuning: Analyze and optimize SQL queries and PLSQL code for performance improvement. Debug and resolve performance bottlenecks in database operations. Data Integration & ETL: Develop and maintain ETL processes to integrate data from various sources. Handle data cleansing, transformation, and enrichment as per requirements. Good understanding of data warehouse concepts. Experience in data modelling of Facts and Dimensions. Collaboration & Documentation: Work with cross-functional teams to understand business requirements and translate them into technical specifications. Create and maintain documentation for database designs, data models, and codebases. Required Skills & Qualifications: Technical Skills: Proficiency in SQL and PLSQL development (Oracle preferred). Strong understanding of relational database concepts, normalization, and indexing. Experience with query optimization and troubleshooting performance issues. Familiarity with database tools like TOAD, SQL Developer, or similar. Exposure to Agile methodologies and tools like JIRA. Knowledge of reporting tools (Tableau, OBIEE) and data integration tools (Informatica, ODI). Understanding of cloud database services (e.g., AWS RDS, Azure SQL) is an added advantage. Analytical Skills: Ability to analyze complex requirements and design efficient database solutions. Strong problem-solving skills to resolve technical challenges effectively. Communication Skills: Excellent written and verbal communication to collaborate with stakeholders effectively. Ability to create clear and concise technical documentation. Experience: Hands-on experience with Oracle databases (version 11g or higher). Experience in database migration, data warehousing, or working with large datasets is a plus.
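The posting above asks for data-warehousing knowledge and data modelling of facts and dimensions. As a small, hedged illustration of that star-schema idea only (Python's sqlite3 stands in for Oracle, and all table and column names are invented), the sketch below loads one dimension and one fact table and runs a typical aggregation.

```python
import sqlite3

# Illustrative star-schema sketch: a dimension keyed by a surrogate id and a
# fact table referencing it. This is a toy stand-in, not the employer's model.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,
    product_code TEXT UNIQUE,
    category     TEXT
);
CREATE TABLE fact_sales (
    sale_date   TEXT,
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity    INTEGER,
    revenue     REAL
);
""")

# A toy ETL load: dimension rows first, then facts that reference them.
conn.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                 [(1, "P-100", "Pumps"), (2, "P-200", "Valves")])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                 [("2024-01-01", 1, 10, 5000.0), ("2024-01-01", 2, 4, 1200.0)])

# Typical warehouse query: aggregate the fact table by a dimension attribute.
for row in conn.execute("""
    SELECT d.category, SUM(f.revenue) AS total_revenue
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.category
"""):
    print(row)
```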

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Flowserve is a world-leading manufacturer and aftermarket service provider of comprehensive flow control systems. Join a company whose people are committed to building a more sustainable future to make the world better for everyone. With 16,000+ employees in 50+ countries, we combine our global reach with local presence. Our team challenges themselves to approach each situation with ingenuity and creativity to help provide our customers with the most innovative flow control products and services. We support 10,000+ customers worldwide, creating products to meet the needs of our customers who are supplying energy, fresh water, pharmaceuticals and other essentials to consumers, businesses and governments globally. We invite you to put your talents and career in motion at Flowserve. Company Overview: If a culture of excellence, innovation and ownership is what you’re searching for, consider putting your experience in motion at Flowserve. As an individual contributor, or as a leader of people, your enterprise mindset will ensure Flowserve’s position as the global standard in comprehensive flow control solutions. Here, your opportunity for professional development and industry leading rewards will be supported by our foundational commitments to the values of people first, integrity and safety. Thinking beyond opportunity and reward, at Flowserve, we are inspired by working together to create extraordinary flow control solutions to make the world better for everyone! Role Summary: Flowserve is seeking the best and the brightest to join its team of IT Analysts. Experienced in SAP S4 PP/QM implementation and rollouts across regions. Capable of leading the PTM thread, managing the full implementation cycle, and handling all business requirements. Responsibilities: PTM module configuration and implementation in SAP S4 / rollouts from legacy ERP to the S4 application. Ensure a high level of accuracy of acquired data. Provide high quality data reporting services. Perform data testing, cleaning, and normalization activities. Support data process leads in KPI maturity roadmaps. Exclusively work with standardized design format files and mapping templates. Support the setup of a ticket system to take in and manage data processing service requests. Complete data processing service requests within target lead-times. Team player with excellent communication and collaborative skills. Other duties as assigned. Requirements: Bachelor's degree in information systems, computer science or a related discipline. 8+ years of experience working in SAP PP/QM. Should be able to work on cross modules in S4. PTM module configuration and implementation in SAP S4 / rollouts from legacy ERP to the S4 application. Ability to work independently and as part of a team once requirements are provided, and to work efficiently with technical and non-technical stakeholders. Must have excellent data analytical and interpretation skills. Problem-solving mindset and attention to detail. Should be motivated and self-directed. Excellent analytical and communication skills (oral and written). Req ID: R-15364 Job Family Group: Information Technology Job Family: IT Business Analysis EOE including Disability/Protected Veterans. Flowserve will also not discriminate against an applicant or employee for inquiring about, discussing or disclosing their pay or, in certain circumstances, the pay of their co-workers.
Pay Transparency Nondiscrimination Provision: If you are a qualified individual with a disability or a disabled veteran, you have the right to request a reasonable accommodation if you are unable or limited in your ability to use or access flowservecareers.com as a result of your disability. You can request a reasonable accommodation by sending an email to employment@flowserve.com. In order to quickly respond to your request, please use the words "Accommodation Request" as the subject line of your email. For more information, read the Accessibility Process.

Posted 3 weeks ago

Apply

14.0 years

0 Lacs

India

Remote

At Medtronic you can begin a life-long career of exploration and innovation, while helping champion healthcare access and equity for all. You’ll lead with purpose, breaking down barriers to innovation in a more connected, compassionate world. Careers that Change Lives: Principal Software Engineer in the Cardiac Rhythm Management (CRM) R&D Software Organization, developing software supporting Medtronic implantable cardiac devices. The individual will operate in all phases and contribute to all activities of the software development process. Candidates must be willing to work in a fast-paced, multi-tasking team environment. A Day in the Life: Design, develop, and test high-integrity software for Class II and III medical devices. Learn and understand software standards for medical devices, e.g., IEC 62304. Define and implement software requirements and designs and review software developed by other team members. Contribute and apply advanced technical principles, theories, and concepts to solve complex technical problems. Participate in process improvement initiatives for the software team. This includes recognizing areas for improvement as well as working with others to develop and document process improvements. Demonstrate ownership of a software feature/module and drive development of the feature/module through the SDLC. Provide hands-on leadership, coaching, mentoring, and software engineering best practices to junior software engineers. Develop reusable patterns and encourage innovation that will increase team velocity. Maintain, improve, and design new software tools. These tools use either scripting languages (Perl, Python), programming languages (Java, C, C#), or web technology (HTML5, JavaScript). Work under general direction and collaboratively with internal and external partners. Continuously keep updated with the latest technology trends and channel that learning into Medtronic product development. Must-Have Job Responsibilities: Experience in software design for medical devices. Hands-on experience in developing implantable system software components related to data acquisition, real-time data processing, and data presentation. Experience in defining control system state machines for processing real-time data and synchronizing real-time data across different inputs. Applying industry-standard best practices to develop system software complying with security requirements to ensure patient privacy and safety. Experience in developing firmware and device drivers for embedded peripherals. Experience in developing simulators for simulating implantable device behavior through design patterns and architecture patterns. Hands-on experience in Bluetooth-enabled device communication. Hands-on experience in mobile operating system app development targeted at Class III medical systems. Strong oral and written communication skills. Experience with configuration management tools. Proficiency working in a team environment. Demonstrated skills in writing engineering documents (specifications, project plans, etc.). Must-Have Minimum Qualification: B.E./B.Tech. in Computer Science Engineering and 14+ years of experience (or M.E./M.Tech. in Computer Science and 12+ years). Strong programming skills in C# and .NET, and strong knowledge of software design, development, debugging, and testing practices. Apply best practices to develop software driven by a test-first approach. Create automation protocols to test a complex software stack for behavior and coverage.
Provide design guidance for designing networking services (web services, SOAP, and REST services) for communicating over TCP/UDP between the tablet and external servers. Perform thorough analysis and synthesis of the data at hand to apply relevant software engineering algorithms and provide the best user experience for real-time data representation. Should be able to design systems that comply with object-oriented design patterns for scalability and extensibility. Should be able to analyze system requirements, map them to subsystem requirements, create designs and design artifacts using UML diagrams, and provide traceability back to requirements. Should be able to understand operating system thread priorities and thread scheduling concepts and apply those concepts to realize efficient and optimal flow of data through the system for real-time data processing. Apply software engineering principles for requirement analysis, requirement prioritization, and life cycle models such as waterfall and Agile. Should be able to understand web-based application design, remote procedure calls, and distributed computing and apply those concepts to product development. Should be able to understand concepts of relational database management and normalization of tables, and design well-normalized database tables. Should be able to understand socket communication and the design/development of applications involving socket communication across process boundaries. Should be able to perform build system management through a thorough understanding of compiler optimization and compiler design. Principal Working Relationship: Reports to the Engineering Manager. The Principal Software Engineer frequently interacts with the Product Owner, Tech Lead, other developers, V&V engineers, internal partners, and stakeholders concerning estimations, design, implementation, or requirement clarifications, and works closely with global sites. Nice to Haves: 5+ years of experience in software design for medical devices. Strong leadership skills and mentoring capabilities. Experience in mobile software development, e.g., iOS, Android. Experience in web-based technologies, e.g., HTML5, JavaScript, CSS. Experience in Microsoft Visual Studio development platforms/Azure DevOps/GitHub/tools. Experience in open-source development platforms/tools. Effectively communicate and operate within a cross-functional work environment (Mechanical Engineering, Systems Engineering, Firmware Development, Software Development, Test Development, Manufacturing). Experience leading a software development team. TECHNICAL SPECIALIST CAREER STREAM: An individual contributor with responsibility in our technical functions to advance existing technology or introduce new technology and therapies. Formulates, delivers and/or manages assigned projects and works with other stakeholders to achieve desired results. May act as a mentor to colleagues or may direct the work of other lower-level professionals. The majority of time is spent delivering R&D, systems, or initiatives related to new technologies or therapies, from design to implementation, while adhering to policies, using specialized knowledge and skills. DIFFERENTIATING FACTORS Autonomy: Recognized expert, managing large projects or processes. Exercises considerable latitude in determining deliverables of assignments, with limited oversight from the manager. Coaches, reviews, and delegates work to lower-level specialists.
Organizational Impact: Contributes to defining the direction for new products, processes, standards, or operational plans based on business strategy with a significant impact on work group results. May manage large projects or processes that span outside of immediate job area. Innovation and Complexity: Problems and issues faced are difficult, moderately complex and undefined, and require detailed information gathering, analysis and investigation. Develops solutions to moderately complex problems, and/or makes moderate to significant improvements of processes, systems or products independently to enhance performance of job area. Implements solutions to problems. Communication and Influence: Represents organization as a primary contact for specific projects and initiatives; communicates with internal and external customers and vendors at various levels. May negotiate with others to reach understanding or agreement, and influence decision-making. Leadership and Talent Management: Typically provides guidance, coaching and training to other employees within job area. Typically manages major / moderately complex projects, involving delegation of work and review of work products, at times acting as a team leader. Required Knowledge and Experience: Requires mastery of a specialty area and full knowledge of industry practices, typically obtained through advanced education combined with experience. May have broad knowledge of project management. Requires a Baccalaureate Degree and minimum of 7 years of relevant experience, or advanced degree with a minimum of 5 years of relevant experience. (For degrees earned outside of the United States, a degree which satisfies the requirements of 8 C.F.R. § 214.2(h)(4)(iii)(A)). Physical Job Requirements The above statements are intended to describe the general nature and level of work being performed by employees assigned to this position, but they are not an exhaustive list of all the required responsibilities and skills of this position. Benefits & Compensation Medtronic offers a competitive Salary and flexible Benefits Package A commitment to our employees lives at the core of our values. We recognize their contributions. They share in the success they help to create. We offer a wide range of benefits, resources, and competitive compensation plans designed to support you at every career and life stage. About Medtronic We lead global healthcare technology and boldly attack the most challenging health problems facing humanity by searching out and finding solutions. Our Mission — to alleviate pain, restore health, and extend life — unites a global team of 95,000+ passionate people. We are engineers at heart— putting ambitious ideas to work to generate real solutions for real people. From the R&D lab, to the factory floor, to the conference room, every one of us experiments, creates, builds, improves and solves. We have the talent, diverse perspectives, and guts to engineer the extraordinary.

Posted 3 weeks ago

Apply

9.0 years

0 Lacs

Hyderābād

On-site

Role: Data Domain Expert - Commercial Location: Hyderabad, India Full/ Part-time: Full-time Build a career with confidence Carrier is a leading provider of heating, ventilating, air conditioning and refrigeration systems, building controls and automation, and fire and security systems leading to safer, smarter, sustainable, and high-performance buildings. Carrier is on a mission to make modern life possible by delivering groundbreaking systems and services that help homes, buildings and shipping become safer, smarter, and more sustainable. Our teams exceed the expectations of our customers by anticipating industry trends, working tirelessly to master and revolutionize them. About the role Carrier is looking for an experienced data management professional to join our team as Business Enterprise Data Domain Expert Level 3 - Senior Commercial. The Data Domain Expert Level 3 - Senior Commercial will be part of the Carrier Business Services (CBS) Global Master Data Management (MDM) team and will support the master data improvement strategy and roadmap for the Commercial function. The role will support the implementation of structural improvements, delivering the highest Master Data quality level aiming at running efficient end-to-end business processes and allowing for decision making based on trusted data. The role supports the Data Principal Commercial on the data aspect of the Commercial function in close collaboration with the people, process, and technology partners. He/She designs and implements standardized end to end processes and tools for the maintenance of Commercial Master Data, delivering improved service quality, optimized cost to serve, controls & compliance, as well as enhancing customer experience. It also supports some Commercial function projects from a data perspective. He/She operates in a highly complex systems landscape and within a matrix organization. System complexity includes at least 100+ ERP instances, several other enterprise-level systems, supported by multiple service providers. He/She manages stakeholders’ relationships with business process managers across the Commercial function to implement the strategy and roadmap for high quality Master Data. Key Responsibilities: Being part of an enthusiastic and dynamic team, your responsibilities include: Business Data Analysis Data Analysis and Interpretation: Analyse data using statistical methods and tools to discover trends, patterns, and insights that can inform business decisions. Reporting and Visualization: help businesses create reports and dashboards in Qualtrics using data visualization tools to present complex data in an understandable and visually appealing manner Developing and Implementing Data Models: Develop models to address business issues. This may include predictive models, segmentation strategies, or other statistical models. Collaborating with Stakeholders: Work closely with BU’s and stakeholders to understand their data requirements and deliver insights and insights training that meet their needs. Identifying Opportunities for Process Improvement: Use data to identify inefficiencies or areas for improvement in business processes and recommend solutions. Supporting Data-Driven Decision-Making: Provide the data and analysis needed to support decision-making across the company, including local continuous improvement Staying Up-to-Date with Industry Trends and Tools: Keep abreast of the latest trends in data analytics, tools, and technologies to continuously improve data analysis capabilities. 
Work closely with Carrier Commercial MDM team on the following Data Quality improvements Data Enrichment: Gather additional customer-related data attributes from well-defined sources (mainly Salesforce CRM), ensuring it is accurate, in time, and compliant with the architecture in place. Ensure compliance with data governance and regulatory requirements. Data Cleaning and Preprocessing: with support of MDM team, clean data to remove inaccuracies, inconsistencies, and duplicates. Prepare data for analysis by performing normalization, transformation, and encoding. Monitoring and Maintaining Data Quality: Regularly monitor data quality and accuracy, implementing measures to maintain high data standards. Support the CBS MDM team in the implementation of the company-wide Data Policies and Governance Framework Deliver the Master Data Management strategy and roadmap within the Commercial function Support the delivery of master data projects, ensuring timely completion within budget and value realization Together with Data Principals and Data Owner, define data standards, cleansing (Get Clean) and monitoring (Dashboard/KPIs) of the Commercial master data Supporting the delivery of controlled, governed, and harmonized data maintenance processes for the Commercial data domain, including their introduction into the CBS operating model, ensuring effective and efficient master data maintenance services Implementing the required KPIs in the Global MDM Dashboard to monitor data standards compliance The above includes supporting technology implementation and required change management activities Supporting the implementation of the Data Collection, Storage and Modelling strategy for the Commercial function Developing documentation and education materials to support business stakeholders and imbed Master Data Management best practices within the Commercial function Fostering a continuous improvement and agile mentality in the delivery of the Master Data Management Commercial roadmap Foster communication and teamwork within and across organizational boundaries Build working relationships and foster communication with internal customers within and across organizational and geographic boundaries Requirements Post-Graduation in Business Administration, Accounting, Finance, ISC, Economics, IT or equivalent work experience Minimum 9 to 12+ years’ overall experience with 6 to 7 years’ experience in process related function and with the Data supporting these processes Systems/IT knowledge required (e.g. MS Excel advanced, Salesforce experience) Experience in implementation of standardized E2E business processes, in MDM processes a plus Experience in interfacing with business stakeholders Experience in process and Data improvement, applying root cause analysis Experience in defining requirements for building data quality reports/dashboards Experience within a corporate business environment Keeps deadlines and produces high quality output Fosters a constructive dialogue within process owners, BU/Functions and other key stakeholders Strong communicator Good level of presentation skills Benefits We are committed to offering competitive benefits programs for all of our employees, and enhancing our programs when necessary. 
Make yourself a priority with flexible schedules, parental leave and our holiday purchase scheme Drive forward your career through professional development opportunities Achieve your personal goals with our Employee Assistance Programme Our commitment to you Our greatest assets are the expertise, creativity and passion of our employees. We strive to provide a great place to work that attracts, develops and retains the best talent, promotes employee engagement, fosters teamwork and ultimately drives innovation for the benefit of our customers. We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine to growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback and always challenging ourselves to do better. This is The Carrier Way . Join us and make a difference. Apply Now! Carrier is An Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age or any other federally protected class. Job Applicant's Privacy Notice: Click on this link to read the Job Applicant's Privacy Notice

Posted 3 weeks ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Title : T-SQL Database Developer Location: Bangalore / Pune / Chennai Experience: 10+ years Domain: Global US-based Bank We are looking for an Individual Contributor (IC) with deep T-SQL expertise to support high-performance transactional data systems in a regulated, mission-critical environment. Must-Have Skills 🔹 Advanced T-SQL Programming Expert in stored procedures, functions, triggers using joins, CTEs, window functions, dynamic SQL 🔹 Query Optimization & Execution Plans Hands-on tuning of multi-million row queries using indexing and execution plans 🔹 Schema Design & Normalization Strong experience with schema design, normalization, key definitions, and indexing strategies 🔹 Transactions, Locking & Concurrency Proficient with isolation levels, nested transactions, deadlock handling, and retry logic 🔹 SQL Server Tooling Expert in SSMS, DMVs, Query Store, and Activity Monitor for diagnostics and performance analysis 🔹 Code Quality & Peer Collaboration Experienced in code reviews, SQL coding standards, and collaborative development Good-to-Have Skills Azure SQL / Synapse – Experience with migrations or hybrid SQL cloud environments CI/CD for SQL – Familiarity with Git, Azure DevOps, and basic SQL deployment pipelines SQL Unit Testing (tSQLt) – Understanding of unit testing principles using tSQLt framework Monitoring Tools – Exposure to Redgate or SentryOne SSIS / SSRS / SSAS – Basic knowledge of SQL Server BI tools Security & Compliance – Knowledge of SOX/GDPR, encryption, and access control best practices
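The must-have skills above call out CTEs and window functions specifically. The sketch below is only an illustration of that pattern, written against SQLite 3.25+ via Python's sqlite3 so it is runnable as-is; on the job this would be T-SQL in SQL Server, and the table and values here are invented.

```python
import sqlite3

# Illustrative CTE + window-function pattern (SQLite 3.25+ stands in for
# SQL Server T-SQL). Table name and rows are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (account TEXT, trade_date TEXT, amount REAL)")
conn.executemany("INSERT INTO trades VALUES (?, ?, ?)", [
    ("A-1", "2024-01-01", 100.0),
    ("A-1", "2024-01-02", 250.0),
    ("A-2", "2024-01-01", 80.0),
    ("A-2", "2024-01-03", 40.0),
])

# The CTE filters raw rows; the window functions rank trades per account and
# compute a running total without collapsing the detail rows.
query = """
WITH recent AS (
    SELECT account, trade_date, amount
    FROM trades
    WHERE trade_date >= '2024-01-01'
)
SELECT account,
       trade_date,
       amount,
       ROW_NUMBER() OVER (PARTITION BY account ORDER BY trade_date) AS rn,
       SUM(amount)  OVER (PARTITION BY account ORDER BY trade_date) AS running_total
FROM recent
ORDER BY account, trade_date
"""
for row in conn.execute(query):
    print(row)
```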

Posted 3 weeks ago

Apply

0 years

0 Lacs

Gurgaon, Haryana, India

Remote

About This Role Role Description The Private Markets Insight Data Services (PDS) team seeks an Investor Account Services Lead for India region. This individual will lead efforts around Private Markets data processing and providing high quality service to our clients, leveraging technology and automation to drive scale, alongside disciplined Operations best practices. Insight’s Managed Data Service is a key foundation to the growing Data & Analytics solutions delivered by the Insight business, and critical to maintain the growth of the Insight business. The team is responsible for the document retrieval, data extraction, normalization, and delivery for investors in private markets products including Private Equity, Private Credit, Real Estate, and Infrastructure. Key Responsibilities Lead a team focused on creation of the market leading Private Markets database and analytics ecosystem for Cashflow & Capital Account Statement Services Manage a team of data analysts and work with team leads to manage workload and priorities Support a business growing by >30% per annum by ensuring scalable growth of services including new document types, asset classes and beyond Actively participate in Digital transformation of the Business, including transformation of process and workflow to leverage Aladdin's patented data and document automation solution. Partner closely with Insight’s Client Success and Sales teams planning for continued service delivery, on time (client SLAs) and with quality as well as supporting RFPs Create an inclusive environment oriented around trust, open communication, creative thinking, and cohesive team effort A Leader who grows the next set of Leaders in the business, and ability to become a BlackRock Global Citizen Experience Required Bachelor or Master degree (preferably in Economics, Organizational Sciences, Mathematics, or related Accounting background) Demonstrated experience in running an end-to-end managed data service organization Demonstrated transformational leader with the ability and desire to influence the people, process, and technology. Experience in Financial Markets, preferably Private Markets, is preferred Ability to succinctly communicate KPI-driven progress to stakeholders and Senior Leadership Strong organizational and change management skill Excellent communication skills: Fluency in English, both written and verbal Our Benefits To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about. Our hybrid work model BlackRock’s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock. About BlackRock At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. 
Our clients, and the people they serve, are saving for retirement, paying for their children’s educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. This mission would not be possible without our smartest investment – the one we make in our employees. It’s why we’re dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive. For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law.

Posted 3 weeks ago

Apply

0 years

0 Lacs

Secunderābād, Telangana, India

On-site

About Us JOB DESCRIPTION SBI Card is a leading pure-play credit card issuer in India, offering a wide range of credit cards to cater to diverse customer needs. We are constantly innovating to meet the evolving financial needs of our customers, empowering them with digital currency for seamless payment experience and indulge in rewarding benefits. At SBI Card, the motto 'Make Life Simple' inspires every initiative, ensuring that customer convenience is at the forefront of all that we do. We are committed to building an environment where people can thrive and create a better future for everyone. SBI Card is proud to be an equal opportunity & inclusive employer and welcome employees without any discrimination on the grounds of race, color, gender, religion, creed, disability, sexual orientation, gender identity, marital status, caste etc. SBI Card is committed to fostering an inclusive and diverse workplace where all employees are treated equally with dignity and respect which makes it a promising place to work. Join us to shape the future of digital payment in India and unlock your full potential. What’s In It For YOU SBI Card truly lives by the work-life balance philosophy. We offer a robust wellness and wellbeing program to support mental and physical health of our employees Admirable work deserves to be rewarded. We have a well curated bouquet of rewards and recognition program for the employees Dynamic, Inclusive and Diverse team culture Gender Neutral Policy Inclusive Health Benefits for all - Medical Insurance, Personal Accidental, Group Term Life Insurance and Annual Health Checkup, Dental and OPD benefits Commitment to the overall development of an employee through comprehensive learning & development framework Role Purpose Responsible for the management of all collections processes for allocated portfolio in the assigned CD/Area basis targets set for resolution, normalization, rollback/absolute recovery and ROR. 
Role Accountability Conduct timely allocation of portfolio to aligned vendors/NFTEs and conduct ongoing reviews to drive performance on the business targets through an extended team of field executives and callers Formulate tactical short term incentive plans for NFTEs to increase productivity and drive DRR Ensure various critical segments as defined by business are reviewed and performance is driven on them Ensure judicious use of hardship tools and adherence to the settlement waivers both on rate and value Conduct ongoing field visits on critical accounts and ensure proper documentation in Collect24 system of all field visits and telephone calls to customers Raise red flags in a timely manner basis deterioration in portfolio health indicators/frauds and raise timely alarms on critical incidents as per the compliance guidelines Ensure all guidelines mentioned in the SVCL are adhered to and that process hygiene is maintained at aligned agencies Ensure 100% data security using secured data transfer modes and data purging as per policy Ensure all customer complaints received are closed within time frame Conduct thorough due diligence while onboarding/offboarding/renewing a vendor and all necessary formalities are completed prior to allocating Ensure agencies raise invoices timely Monitor NFTE ACR CAPE as per the collection strategy Measures of Success Portfolio Coverage Resolution Rate Normalization/Roll back Rate Settlement waiver rate Absolute Recovery Rupee collected NFTE CAPE DRA certification of NFTEs Absolute Customer Complaints Absolute audit observations Process adherence as per MOU Technical Skills / Experience / Certifications Credit Card knowledge along with good understanding of Collection Processes Competencies critical to the role Analytical Ability Stakeholder Management Problem Solving Result Orientation Process Orientation Qualification Post-Graduate / Graduate in any discipline Preferred Industry FSI

Posted 3 weeks ago

Apply

0 years

0 Lacs

Lucknow, Uttar Pradesh, India

On-site

About Us JOB DESCRIPTION SBI Card is a leading pure-play credit card issuer in India, offering a wide range of credit cards to cater to diverse customer needs. We are constantly innovating to meet the evolving financial needs of our customers, empowering them with digital currency for seamless payment experience and indulge in rewarding benefits. At SBI Card, the motto 'Make Life Simple' inspires every initiative, ensuring that customer convenience is at the forefront of all that we do. We are committed to building an environment where people can thrive and create a better future for everyone. SBI Card is proud to be an equal opportunity & inclusive employer and welcome employees without any discrimination on the grounds of race, color, gender, religion, creed, disability, sexual orientation, gender identity, marital status, caste etc. SBI Card is committed to fostering an inclusive and diverse workplace where all employees are treated equally with dignity and respect which makes it a promising place to work. Join us to shape the future of digital payment in India and unlock your full potential. What’s In It For YOU SBI Card truly lives by the work-life balance philosophy. We offer a robust wellness and wellbeing program to support mental and physical health of our employees Admirable work deserves to be rewarded. We have a well curated bouquet of rewards and recognition program for the employees Dynamic, Inclusive and Diverse team culture Gender Neutral Policy Inclusive Health Benefits for all - Medical Insurance, Personal Accidental, Group Term Life Insurance and Annual Health Checkup, Dental and OPD benefits Commitment to the overall development of an employee through comprehensive learning & development framework Role Purpose Responsible for delivering on business metrics of portfolio resolution, norm, rollback and extraction/absolute recovery and ROR as per business operating plan through a team of Agency managers and Collection Vendors. 
Role Accountability Devise vendor allocation strategy for the CD/region and ensure appropriate capacity addition basis future business inflows in line with ACR guidelines Ensure adequate legal interventions on the portfolio Ensure various critical segments as defined by business are reviewed and performance is driven on them Conduct regular performance review with Vendors and Area collection managers for all critical metrics to track the portfolio health and performance trends Ensure judicious use of hardship tools and adherence to the settlement waivers both on rate and value Conduct ongoing field visits on critical accounts and ensure proper documentation in Collect24 system of all field visits and telephone calls to customers Raise red flags in a timely manner basis deterioration in portfolio health indicators/frauds and raise timely alarms on critical incidents as per the compliance guidelines Reinforce compliance standards with area collection managers and vendors to drive adherence to code of conduct Ensure all guidelines mentioned in the SVCL are adhered to and that process hygiene is maintained at aligned agencies Ensure all customer complaints received are closed within time frame Conduct thorough due diligence while onboarding/offboarding/renewing a vendor and all necessary formalities are completed prior to allocating Ensure monthly cost provisions are reported as per timelines Identify upcoming markets in accordance with the Sales growth plan and evaluate setting up/expanding operations basis volumes In cases pertaining to Banca delinquencies, collaborate with partner bank branches in respective locations to track customers Measures of Success Portfolio Coverage Resolution Rate Normalization/Roll back Rate Settlement waiver rate Absolute Recovery Settlement waiver rate Extraction Rate ACM CAPE ROR Regulatory Customer complaint % Vendor SVCL Audit adherence Process adherence as per MOU Technical Skills / Experience / Certifications Credit Card knowledge along with good understanding of Collection Processes Competencies critical to the role Analytical Ability Stakeholder Management Problem Solving Result Orientation Process Orientation Qualification Post-Graduate / Graduate in any discipline Preferred Industry FSI

Posted 3 weeks ago

Apply

2.0 - 31.0 years

3 - 9 Lacs

Gurgaon/Gurugram

On-site

Job Title: Manager – Collections (Telecalling Channel Management) Department: Collections Industry: Credit Cards / Financial Services About SBI Card SBI Card is a leading pure-play credit card issuer in India, committed to empowering customers with innovative and rewarding digital payment solutions. With our core philosophy of "Make Life Simple," we aim to enhance every customer interaction while fostering a vibrant, inclusive, and performance-driven workplace. We are proud to be an equal opportunity employer and strive to maintain a culture that celebrates diversity, inclusivity, and respect for all. What’s In It For You: Flexible work-life balance supported by wellness and mental health programs Comprehensive rewards and recognition framework Inclusive team culture with gender-neutral policies Health coverage that includes medical, accidental, dental, OPD, and more Learning & development frameworks for continuous career growth Role Purpose: To manage and drive the performance of telecalling channel partners handling collections across the assigned portfolio, ensuring operational efficiency, regulatory compliance, and achievement of key collection metrics. Key Responsibilities: Channel Partner & Portfolio Management Execute and monitor the collection strategy for the assigned site. Manage vendor capacity planning, resource optimization, and process effectiveness. Drive performance through structured reviews, portfolio segmentation, and strategic dialer management. Oversee agent-level performance including daily call audits, compliance checks, and productivity metrics. Operations & Compliance Ensure system uptime for CRM/Dialer and telecom infrastructure in coordination with internal teams. Conduct regular call quality reviews, sample evaluations, and agent feedback sessions. Track agent behavior trends – absenteeism, login issues, quality deviations – and drive remedial actions. Risk & Legal Recovery Support Identify accounts for legal recovery channels like arbitration, Lok Adalat, etc. Monitor field referrals based on segmentation strategy and historical performance. Ensure training & certifications of telecalling staff per compliance guidelines. Process Excellence & Reporting Maintain oversight on agency payouts, SLA compliance, and billing accuracy. Perform regular spot audits to ensure data security and adherence to internal controls. Analyze daily performance action codes and derive insights for corrective action. Measures of Success: Resolution Rate / Normalization Rate / Rollback Rate KP Target Achievement Money Collected / PLI Penetration / Tele Retention Rate NFTE Productivity and Training Coverage Customer Complaints Volume Vendor SLA Adherence Audit Observations (Zero non-compliance) MOU & Process Adherence Technical Skills / Experience Required: Strong knowledge of credit card products and collections operations Experience in managing large, distributed telecalling vendor teams Hands-on understanding of dialer strategies and contact center tools Competencies Critical to the Role: Stakeholder Management Result Orientation Process and Compliance Orientation Analytical Thinking & Problem Solving Qualifications: Graduate or Postgraduate in any discipline Preferred Industry Experience: Credit Card / Consumer Lending / Financial Services

Posted 3 weeks ago

Apply

8.0 - 11.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

JOB DESCRIPTION

Roles & responsibilities
Here are some of the key responsibilities of a Sr Generative AI Engineer:
Research and Development: Conduct original research on generative AI models, focusing on model architecture, training methodologies, fine-tuning techniques, and evaluation strategies. Maintain a strong publication record in top-tier conferences and journals, showcasing contributions to the fields of Natural Language Processing (NLP), Deep Learning (DL), and Machine Learning (ML).
Multimodal Model Development: Design and experiment with multimodal generative models that integrate various data types, including text, images, and other modalities, to enhance AI capabilities.
Agentic AI Systems: Develop and design autonomous AI systems that exhibit agentic behavior, capable of making independent decisions and adapting to dynamic environments.
Model Development and Implementation: Lead the design, development, and implementation of generative AI models and systems, ensuring a deep understanding of the problem domain. Select suitable models, train them on large datasets, fine-tune hyperparameters, and optimize overall performance.
Algorithm Optimization: Optimize generative AI algorithms to enhance their efficiency, scalability, and computational performance through techniques such as parallelization, distributed computing, and hardware acceleration, maximizing the capabilities of modern computing architectures.
Data Preprocessing and Feature Engineering: Manage large datasets by performing data preprocessing and feature engineering to extract critical information for generative AI models. This includes tasks such as data cleaning, normalization, dimensionality reduction, and feature selection.
Model Evaluation and Validation: Evaluate the performance of generative AI models using relevant metrics and validation techniques. Conduct experiments, analyze results, and iteratively refine models to meet desired performance benchmarks.
Technical Leadership: Provide technical leadership and mentorship to junior team members, guiding their development in generative AI through work reviews, skill-building, and knowledge sharing.
Documentation and Reporting: Document research findings, model architectures, methodologies, and experimental results thoroughly. Prepare technical reports, presentations, and whitepapers to effectively communicate insights and findings to stakeholders.
Continuous Learning and Innovation: Stay abreast of the latest advancements in generative AI by reading research papers, attending conferences, and engaging with relevant communities. Foster a culture of learning and innovation within the team to drive continuous improvement.

Mandatory technical & functional skills
Strong programming skills in Python and frameworks like PyTorch or TensorFlow.
In-depth knowledge of Deep Learning (CNN, RNN, LSTM, Transformers), LLMs (BERT, GPT, etc.) and NLP algorithms, along with familiarity with frameworks like LangGraph/CrewAI/AutoGen to develop, deploy and evaluate AI agents.
Ability to test and deploy open-source LLMs from Hugging Face (Meta LLaMA 3.1, BLOOM, Mistral AI, etc.).
Ensure scalability and efficiency, handle data tasks, stay current with AI trends, and contribute to model documentation for internal and external audiences.
Cloud computing experience, particularly with the Google or Azure cloud platforms, is essential, with a strong foundation in understanding the data analytics services offered by Google or Azure (BigQuery/Synapse).
Hands-on experience with ML platforms offered through GCP (Vertex AI), Azure (AI Foundry) or AWS (SageMaker).
Large-scale deployment of GenAI/DL/ML projects, with a good understanding of MLOps/LLMOps.

Preferred technical & functional skills
Strong oral and written communication skills with the ability to communicate technical and non-technical concepts to peers and stakeholders.
Ability to work independently with minimal supervision, and escalate when needed.

Key behavioral attributes/requirements
Ability to mentor junior developers.
Ability to own project deliverables, not just individual tasks.
Understand business objectives and functions to support data needs.

QUALIFICATIONS
This role is for you if you have the below:
Educational Qualifications: PhD or equivalent degree in Computer Science/Applied Mathematics/Applied Statistics/Artificial Intelligence. Preference given to research scholars from IITs, NITs and IIITs (research scholars who have submitted their thesis).
Work Experience: 8 to 11 years of experience with a strong record of publications in top-tier conferences and journals.
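To make the "test and deploy open-source LLMs from Hugging Face" requirement in the listing above concrete, here is a minimal, hedged sketch of a local smoke test using the transformers library. The checkpoint name and prompt are placeholders chosen only for illustration; they are not part of the job description.

```python
# Minimal smoke test for an open-source causal LLM from the Hugging Face Hub.
# distilgpt2 is used only because it is tiny and ungated; a real evaluation
# would swap in one of the checkpoints named in the posting (e.g. LLaMA 3.1,
# BLOOM, or a Mistral model) and add GPU placement.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "distilgpt2"  # placeholder checkpoint for illustration

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID)

prompt = "Generative AI systems are useful because"
inputs = tokenizer(prompt, return_tensors="pt")

# Greedy decoding keeps the smoke test deterministic; sampling would be tuned later.
outputs = model.generate(**inputs, max_new_tokens=40, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```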

Posted 3 weeks ago

Apply

8.0 - 11.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

JOB DESCRIPTION

Roles & responsibilities
Here are some of the key responsibilities of a Sr Generative AI Engineer:
Research and Development: Conduct original research on generative AI models, focusing on model architecture, training methodologies, fine-tuning techniques, and evaluation strategies. Maintain a strong publication record in top-tier conferences and journals, showcasing contributions to the fields of Natural Language Processing (NLP), Deep Learning (DL), and Machine Learning (ML).
Multimodal Model Development: Design and experiment with multimodal generative models that integrate various data types, including text, images, and other modalities, to enhance AI capabilities.
Agentic AI Systems: Develop and design autonomous AI systems that exhibit agentic behavior, capable of making independent decisions and adapting to dynamic environments.
Model Development and Implementation: Lead the design, development, and implementation of generative AI models and systems, ensuring a deep understanding of the problem domain. Select suitable models, train them on large datasets, fine-tune hyperparameters, and optimize overall performance.
Algorithm Optimization: Optimize generative AI algorithms to enhance their efficiency, scalability, and computational performance through techniques such as parallelization, distributed computing, and hardware acceleration, maximizing the capabilities of modern computing architectures.
Data Preprocessing and Feature Engineering: Manage large datasets by performing data preprocessing and feature engineering to extract critical information for generative AI models. This includes tasks such as data cleaning, normalization, dimensionality reduction, and feature selection.
Model Evaluation and Validation: Evaluate the performance of generative AI models using relevant metrics and validation techniques. Conduct experiments, analyze results, and iteratively refine models to meet desired performance benchmarks.
Technical Leadership: Provide technical leadership and mentorship to junior team members, guiding their development in generative AI through work reviews, skill-building, and knowledge sharing.
Documentation and Reporting: Document research findings, model architectures, methodologies, and experimental results thoroughly. Prepare technical reports, presentations, and whitepapers to effectively communicate insights and findings to stakeholders.
Continuous Learning and Innovation: Stay abreast of the latest advancements in generative AI by reading research papers, attending conferences, and engaging with relevant communities. Foster a culture of learning and innovation within the team to drive continuous improvement.

Mandatory technical & functional skills
Strong programming skills in Python and frameworks like PyTorch or TensorFlow.
In-depth knowledge of Deep Learning (CNN, RNN, LSTM, Transformers), LLMs (BERT, GPT, etc.) and NLP algorithms, along with familiarity with frameworks like LangGraph/CrewAI/AutoGen to develop, deploy and evaluate AI agents.
Ability to test and deploy open-source LLMs from Hugging Face (Meta LLaMA 3.1, BLOOM, Mistral AI, etc.).
Ensure scalability and efficiency, handle data tasks, stay current with AI trends, and contribute to model documentation for internal and external audiences.
Cloud computing experience, particularly with the Google or Azure cloud platforms, is essential, with a strong foundation in understanding the data analytics services offered by Google or Azure (BigQuery/Synapse).
Hands-on experience with ML platforms offered through GCP (Vertex AI), Azure (AI Foundry) or AWS (SageMaker).
Large-scale deployment of GenAI/DL/ML projects, with a good understanding of MLOps/LLMOps.

Preferred technical & functional skills
Strong oral and written communication skills with the ability to communicate technical and non-technical concepts to peers and stakeholders.
Ability to work independently with minimal supervision, and escalate when needed.

Key behavioral attributes/requirements
Ability to mentor junior developers.
Ability to own project deliverables, not just individual tasks.
Understand business objectives and functions to support data needs.

QUALIFICATIONS
This role is for you if you have the below:
Educational Qualifications: PhD or equivalent degree in Computer Science/Applied Mathematics/Applied Statistics/Artificial Intelligence. Preference given to research scholars from IITs, NITs and IIITs (research scholars who have submitted their thesis).
Work Experience: 8 to 11 years of experience with a strong record of publications in top-tier conferences and journals.
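The data preprocessing responsibility in this listing (cleaning, normalization, dimensionality reduction, feature selection) can be illustrated with a short, generic sketch using pandas and scikit-learn. The file path, column names, and variance threshold below are assumptions made purely for the example.

```python
# Illustrative preprocessing pipeline: drop duplicates and missing rows, apply
# z-score normalization, then reduce dimensionality with PCA.
# The file name and the "label" column are hypothetical placeholders.
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("training_features.csv")      # hypothetical input extract
df = df.drop_duplicates().dropna()             # basic cleaning

feature_cols = [c for c in df.columns if c != "label"]
scaled = StandardScaler().fit_transform(df[feature_cols])   # normalization

pca = PCA(n_components=0.95)                   # keep 95% of the variance
reduced = pca.fit_transform(scaled)
print(f"Reduced {len(feature_cols)} features to {reduced.shape[1]} components")
```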

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Summary: Open Rainbow is seeking an experienced Senior MS SQL Developer to join our on-site team in Hyderabad. The ideal candidate will have deep expertise in writing complex SQL queries and extensive experience in query optimisation and performance tuning. This role is critical to ensuring the performance, reliability, and scalability of our database systems. Success in this role requires a strong ability to analyze execution plans, identify bottlenecks, and implement solutions that enhance database efficiency. You will work closely with cross-functional teams to support data-driven decision-making and ensure high standards of data performance across the organization. Responsibilities: Design, develop, and maintain complex T-SQL queries, stored procedures, views, functions, and triggers. Analyze and interpret SQL Server execution plans to identify inefficiencies and implement performance improvements. Conduct query tuning, indexing strategies, and database refactoring to improve overall system performance. Collaborate with application developers, data analysts, and system architects to support data-related initiatives. Perform root cause analysis for performance issues and provide long-term, scalable solutions. Monitor database performance metrics and proactively address areas for improvement. Ensure high standards of data integrity, availability, and security within SQL Server environments. Participate in code reviews and mentor junior team members on SQL development best practices. Support ongoing database maintenance, including index management, statistics updates, and performance monitoring. Qualifications: Minimum 5 years of experience in MS SQL Server development with a focus on performance optimization. Strong proficiency in writing and tuning complex T-SQL queries. Expert-level knowledge in analysing and improving SQL execution plans. Hands-on experience with indexing, partitioning, and optimization techniques. Familiarity with performance monitoring tools and dynamic management views (DMVs). Solid understanding of database design, normalization, and transactional processing. Strong analytical and problem-solving skills with attention to detail. Effective verbal and written communication skills. Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent practical experience.
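Since the role above centres on T-SQL tuning and execution-plan analysis, the following sketch shows one common way to quantify a tuning change: comparing logical reads before and after adding a covering index. It is an illustration only; the server, database, table, and index names are hypothetical, and it assumes a reachable SQL Server instance plus the pyodbc driver.

```python
# Rough sketch: measure the effect of a covering index on a query's IO cost.
# Connection string, schema, and index definition are hypothetical placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=db-host;"
    "DATABASE=SalesDb;Trusted_Connection=yes;",
    autocommit=True,
)
cur = conn.cursor()

query = "SELECT OrderId, OrderDate, TotalAmount FROM dbo.Orders WHERE CustomerId = ?"

# SET STATISTICS IO emits logical-read counts as informational messages,
# which recent pyodbc versions expose on the cursor's messages attribute.
cur.execute("SET STATISTICS IO ON")
cur.execute(query, 42).fetchall()
print(cur.messages)   # baseline logical reads

cur.execute(
    "CREATE NONCLUSTERED INDEX IX_Orders_CustomerId "
    "ON dbo.Orders (CustomerId) INCLUDE (OrderDate, TotalAmount)"
)

cur.execute(query, 42).fetchall()
print(cur.messages)   # logical reads after the covering index
```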

Posted 3 weeks ago

Apply

10.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Acuity Knowledge Partners (Acuity) is a leading provider of bespoke research, analytics and technology solutions to the financial services sector, including asset managers, corporate and investment banks, private equity and venture capital firms, hedge funds and consulting firms. Its global network of over 6,000 analysts and industry experts, combined with proprietary technology, supports more than 600 financial institutions and consulting companies to operate more efficiently and unlock their human capital, driving revenue higher and transforming operations. Acuity is headquartered in London and operates from 10 locations worldwide. The company fosters a diverse, equitable and inclusive work environment, nurturing talent, regardless of race, gender, ethnicity or sexual orientation. Acuity was established as a separate business from Moody’s Corporation in 2019, following its acquisition by Equistone Partners Europe (Equistone). In January 2023, funds advised by global private equity firm Permira acquired a majority stake in the business from Equistone, which remains invested as a minority shareholder. For more information, visit www.acuitykp.com Position Title- Associate Director (Senior Architect – Data) Department-IT Location- Gurgaon/ Bangalore Job Summary The Enterprise Data Architect will enhance the company's strategic use of data by designing, developing, and implementing data models for enterprise applications and systems at conceptual, logical, business area, and application layers. This role advocates data modeling methodologies and best practices. We seek a skilled Data Architect with deep knowledge of data architecture principles, extensive data modeling experience, and the ability to create scalable data solutions. Responsibilities include developing and maintaining enterprise data architecture, ensuring data integrity, interoperability, security, and availability, with a focus on ongoing digital transformation projects. Key Responsibilities 1. Strategy & Planning o Develop and deliver long-term strategic goals for data architecture vision and standards in conjunction with data users, department managers, clients, and other key stakeholders. o Create short-term tactical solutions to achieve long-term objectives and an overall data management roadmap. o Establish processes for governing the identification, collection, and use of corporate metadata; take steps to assure metadata accuracy and validity. o Establish methods and procedures for tracking data quality, completeness, redundancy, and improvement. o Conduct data capacity planning, life cycle, duration, usage requirements, feasibility studies, and other tasks. o Create strategies and plans for data security, backup, disaster recovery, business continuity, and archiving. o Ensure that data strategies and architectures are aligned with regulatory compliance. o Develop a comprehensive data strategy in collaboration with different stakeholders that aligns with the transformational projects’ goals. o Ensure effective data management throughout the project lifecycle. 2. Acquisition & Deployment o Ensure the success of enterprise-level application rollouts (e.g. ERP, CRM, HCM, FP&A, etc.) Liaise with vendors and service providers to select the products or services that best meet company goals 3. Operational Management o Assess and determine governance, stewardship, and frameworks for managing data across the organization. o Develop and promote data management methodologies and standards. 
o Document information products from business processes and create data entities o Create entity relationship diagrams to show the digital thread across the value streams and enterprise o Create data normalization across all systems and data base to ensure there is common definition of data entities across the enterprise o Document enterprise reporting needs develop the data strategy to enable single source of truth for all reporting data o Address the regulatory compliance requirements of each country and ensure our data is secure and compliant o Select and implement the appropriate tools, software, applications, and systems to support data technology goals. o Oversee the mapping of data sources, data movement, interfaces, and analytics, with the goal of ensuring data quality. o Collaborate with project managers and business unit leaders for all projects involving enterprise data. o Address data-related problems regarding systems integration, compatibility, and multiple-platform integration. o Act as a leader and advocate of data management, including coaching, training, and career development to staff. o Develop and implement key components as needed to create testing criteria to guarantee the fidelity and performance of data architecture. o Document the data architecture and environment to maintain a current and accurate view of the larger data picture. o Identify and develop opportunities for data reuse, migration, or retirement. 4. Data Architecture Design: o Develop and maintain the enterprise data architecture, including data models, databases, data warehouses, and data lakes. o Design and implement scalable, high-performance data solutions that meet business requirements. 5. Data Governance: o Establish and enforce data governance policies and procedures as agreed with stakeholders. o Maintain data integrity, quality, and security within Finance, HR and other such enterprise systems. 6. Data Migration: o Oversee the data migration process from legacy systems to the new systems being put in place. o Define & Manage data mappings, cleansing, transformation, and validation to ensure accuracy and completeness. 7. Master Data Management: o Devise processes to manage master data (e.g., customer, vendor, product information) to ensure consistency and accuracy across enterprise systems and business processes. o Provide data management (create, update and delimit) methods to ensure master data is governed 8. Stakeholder Collaboration: o Collaborate with various stakeholders, including business users, other system vendors, and stakeholders to understand data requirements. o Ensure the enterprise system meets the organization's data needs. 9. Training and Support: o Provide training and support to end-users on data entry, retrieval, and reporting within the candidate enterprise systems. o Promote user adoption and proper use of data. Data Quality Assurance: o Implement data quality assurance measures to identify and correct data issues. o Ensure the Oracle Fusion and other enterprise systems contain reliable and up-to-date information. 11. Reporting and Analytics: o Facilitate the development of reporting and analytics capabilities within the Oracle Fusion and other systems o Enable data-driven decision-making through robust data analysis. 12. Continuous Improvement: o Continuously monitor and improve data processes and the Oracle Fusion and other system's data capabilities. o Leverage new technologies for enhanced data management to support evolving business needs. 
Technology and Tools: Oracle Fusion Cloud Data modeling tools (e.g., ER/Studio, ERwin) ETL tools (e.g., Informatica, Talend, Azure Data Factory) Data Pipelines: Understanding of data pipeline tools like Apache Airflow and AWS Glue. Database management systems: Oracle Database, MySQL, SQL Server, PostgreSQL, MongoDB, Cassandra, Couchbase, Redis, Hadoop, Apache Spark, Amazon RDS, Google BigQuery, Microsoft Azure SQL Database, Neo4j, OrientDB, Memcached) Data governance tools (e.g., Collibra, Informatica Axon, Oracle EDM, Oracle MDM) Reporting and analytics tools (e.g., Oracle Analytics Cloud, Power BI, Tableau, Oracle BIP) Hyperscalers / Cloud platforms (e.g., AWS, Azure) Big Data Technologies such as Hadoop, HDFS, MapReduce, and Spark Cloud Platforms such as Amazon Web Services, including RDS, Redshift, and S3, Microsoft Azure services like Azure SQL Database and Cosmos DB and experience in Google Cloud Platform services such as BigQuery and Cloud Storage. Programming Languages: (e.g. using Java, J2EE, EJB, .NET, WebSphere, etc.) SQL: Strong SQL skills for querying and managing databases. Python: Proficiency in Python for data manipulation and analysis. Java: Knowledge of Java for building data-driven applications. Data Security and Protocols: Understanding of data security protocols and compliance standards. Key Competencies Educational Qualification: Bachelor’s degree in computer science, Information Technology, or a related field. Master’s degree preferred. Experience: 10+ years overall and at least 7 years of experience in data architecture, data modeling, and database design. Proven experience with data warehousing, data lakes, and big data technologies. Expertise in SQL and experience with NoSQL databases. Experience with cloud platforms (e.g., AWS, Azure) and related data services. Experience with Oracle Fusion or similar ERP systems is highly desirable. Skills: Strong understanding of data governance and data security best practices. Excellent problem-solving and analytical skills. Strong communication and interpersonal skills. Ability to work effectively in a collaborative team environment. Leadership experience with a track record of mentoring and developing team members. Excellent in documentation and presentations. Good knowledge of applicable data privacy practices and laws. Certifications: Relevant certifications (e.g., Certified Data Management Professional, AWS Certified Big Data – Specialty) are a plus.
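Several responsibilities in the listing above (master data management, common definitions across systems, data quality) come down to standardizing and de-duplicating entity records. The sketch below is a simplified, generic illustration of that idea with pandas; the customer fields, matching key, and "keep the latest record" survivorship rule are assumptions, not the organization's actual method.

```python
# Toy master-data standardization: normalize key attributes, then collapse
# duplicate customer records onto a single "golden" row per business key.
# Column names and the survivorship rule are illustrative assumptions.
import pandas as pd

customers = pd.DataFrame(
    {
        "customer_name": ["Acme Corp.", "ACME CORP", "Globex Ltd"],
        "country":       ["IN", "in", "US"],
        "updated_at":    ["2024-01-05", "2024-03-01", "2024-02-10"],
    }
)

# Standardize formats so equivalent values compare equal across source systems.
customers["customer_name"] = (
    customers["customer_name"]
    .str.upper()
    .str.replace(r"[^\w\s]", "", regex=True)
    .str.strip()
)
customers["country"] = customers["country"].str.upper()
customers["updated_at"] = pd.to_datetime(customers["updated_at"])

# Survivorship: keep the most recently updated record per (name, country) key.
golden = (
    customers.sort_values("updated_at")
             .drop_duplicates(subset=["customer_name", "country"], keep="last")
)
print(golden)
```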

Posted 3 weeks ago

Apply

5.0 years

6 - 20 Lacs

India

On-site

Job Description: Senior Database Developer (MySQL & AWS Expert) Location: Hyderabad, India Experience: 5+ Years (Preferably 7+ Years) Employment Type: Full-time Role Overview: We are looking for an exceptionally strong Database Developer with 5+ years of hands-on experience specializing in MySQL database development on Amazon AWS Cloud. The ideal candidate should have deep expertise in high-performance query tuning, handling massive datasets, designing complex summary tables, and implementing scalable database architectures. This role demands a highly analytical and problem-solving mindset, capable of delivering optimized and mission-critical database solutions. Key Responsibilities: • Design, develop, and optimize highly scalable MySQL databases on AWS cloud infrastructure. • Expert-level performance tuning of queries, indexes, and stored procedures for mission-critical applications. • Handle large-scale datasets, ensuring efficient query execution and minimal latency. • Architect and implement summary tables for optimized reporting and analytical performance. • Work closely with software engineers to design efficient data models, indexing strategies, and partitioning techniques. • Ensure high availability, disaster recovery, and fault tolerance of database systems. • Perform root-cause analysis of database bottlenecks and implement robust solutions. • Implement advanced replication strategies, read/write separation, and data sharding for optimal performance. • Work with DevOps teams to automate database monitoring, backups, and performance metrics using AWS tools. • Optimize stored procedures, triggers, and complex database functions to enhance system efficiency. • Ensure best-in-class data security, encryption, and access control policies. Must-Have Skills: • Proven expertise in MySQL query optimization, indexing, and execution plan analysis. • Strong knowledge of AWS RDS, Aurora, and cloud-native database services. • Hands-on experience in tuning high-performance, high-volume transactional databases. • Deep understanding of database partitioning, sharding, caching, and replication strategies. • Experience working with large-scale datasets (millions to billions of records) and ensuring low-latency queries. • Advanced experience in database schema design, normalization, and optimization for high availability. • Proficiency in query profiling, memory management, and database load balancing. • Strong understanding of data warehousing, ETL processes, and analytics-driven data models. • Expertise in troubleshooting slow queries and deadlocks in a production environment. • Proficiency in scripting languages like Python, Shell, or SQL scripting for automation. Preferred Skills: • Experience with big data technologies like Redshift, Snowflake, Hadoop, or Spark. • Exposure to NoSQL databases (MongoDB, Redis) for hybrid data architectures. • Hands-on experience with CI/CD pipelines and DevOps database management. • Experience in predictive analytics and AI-driven data optimizations. Educational Qualification: • Bachelor's or Master’s degree in Computer Science, Information Technology, or a related field. Salary & Benefits: • Top-tier compensation package for highly skilled candidates. • Fast-track career growth with opportunities for leadership roles. • Comprehensive health benefits and performance-based bonuses. • Exposure to cutting-edge technologies and large-scale data challenges. 
If you are a world-class MySQL expert with a passion for solving complex database challenges and optimizing large-scale systems, apply now! Job Types: Full-time, Permanent Pay: ₹634,321.11 - ₹2,091,956.36 per year Benefits: Health insurance Paid sick time Paid time off Provident Fund Schedule: Day shift Monday to Friday Language: English (Required) Work Location: In person Expected Start Date: 21/07/2025
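The listing above emphasizes summary tables and query tuning on MySQL. As a simplified, hypothetical sketch (credentials, schema, and table names are invented for the example), the snippet below pre-aggregates a transactions table into a daily summary and uses EXPLAIN to confirm the reporting query hits an index rather than scanning the base data.

```python
# Illustrative only: build/refresh a daily summary table, then inspect the
# reporting query's plan with EXPLAIN. Connection details are placeholders.
import mysql.connector

conn = mysql.connector.connect(
    host="mydb.example.aws.com", user="app", password="***", database="sales"
)
cur = conn.cursor()

cur.execute(
    """
    CREATE TABLE IF NOT EXISTS daily_sales_summary (
        sale_date    DATE          NOT NULL,
        region       VARCHAR(32)   NOT NULL,
        total_amount DECIMAL(18,2) NOT NULL,
        txn_count    BIGINT        NOT NULL,
        PRIMARY KEY (sale_date, region)
    )
    """
)

# Incremental refresh for one day; in practice this would be scheduled.
cur.execute(
    """
    REPLACE INTO daily_sales_summary
    SELECT DATE(created_at), region, SUM(amount), COUNT(*)
    FROM transactions
    WHERE created_at >= %s AND created_at < %s + INTERVAL 1 DAY
    GROUP BY DATE(created_at), region
    """,
    ("2025-07-01", "2025-07-01"),
)
conn.commit()

cur.execute(
    "EXPLAIN SELECT total_amount FROM daily_sales_summary "
    "WHERE sale_date = %s AND region = %s",
    ("2025-07-01", "APAC"),
)
for row in cur.fetchall():
    print(row)   # expect the PRIMARY key to be used, not a full table scan
```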

Posted 3 weeks ago

Apply

9.0 years

0 Lacs

Hyderābād

On-site

Hyderabad, Telangana Job ID 30185920 Job Category Finance Role: Data Domain Expert - Commercial Location: Hyderabad, India Full/ Part-time: Full-time Build a career with confidence Carrier is a leading provider of heating, ventilating, air conditioning and refrigeration systems, building controls and automation, and fire and security systems leading to safer, smarter, sustainable, and high-performance buildings. Carrier is on a mission to make modern life possible by delivering groundbreaking systems and services that help homes, buildings and shipping become safer, smarter, and more sustainable. Our teams exceed the expectations of our customers by anticipating industry trends, working tirelessly to master and revolutionize them. About the role Carrier is looking for an experienced data management professional to join our team as Business Enterprise Data Domain Expert Level 3 - Senior Commercial. The Data Domain Expert Level 3 - Senior Commercial will be part of the Carrier Business Services (CBS) Global Master Data Management (MDM) team and will support the master data improvement strategy and roadmap for the Commercial function. The role will support the implementation of structural improvements, delivering the highest Master Data quality level aiming at running efficient end-to-end business processes and allowing for decision making based on trusted data. The role supports the Data Principal Commercial on the data aspect of the Commercial function in close collaboration with the people, process, and technology partners. He/She designs and implements standardized end to end processes and tools for the maintenance of Commercial Master Data, delivering improved service quality, optimized cost to serve, controls & compliance, as well as enhancing customer experience. It also supports some Commercial function projects from a data perspective. He/She operates in a highly complex systems landscape and within a matrix organization. System complexity includes at least 100+ ERP instances, several other enterprise-level systems, supported by multiple service providers. He/She manages stakeholders’ relationships with business process managers across the Commercial function to implement the strategy and roadmap for high quality Master Data. Key Responsibilities: Being part of an enthusiastic and dynamic team, your responsibilities include: Business Data Analysis Data Analysis and Interpretation: Analyse data using statistical methods and tools to discover trends, patterns, and insights that can inform business decisions. Reporting and Visualization: help businesses create reports and dashboards in Qualtrics using data visualization tools to present complex data in an understandable and visually appealing manner Developing and Implementing Data Models: Develop models to address business issues. This may include predictive models, segmentation strategies, or other statistical models. Collaborating with Stakeholders: Work closely with BU’s and stakeholders to understand their data requirements and deliver insights and insights training that meet their needs. Identifying Opportunities for Process Improvement: Use data to identify inefficiencies or areas for improvement in business processes and recommend solutions. 
Supporting Data-Driven Decision-Making: Provide the data and analysis needed to support decision-making across the company, including local continuous improvement Staying Up-to-Date with Industry Trends and Tools: Keep abreast of the latest trends in data analytics, tools, and technologies to continuously improve data analysis capabilities. Work closely with Carrier Commercial MDM team on the following Data Quality improvements Data Enrichment: Gather additional customer-related data attributes from well-defined sources (mainly Salesforce CRM), ensuring it is accurate, in time, and compliant with the architecture in place. Ensure compliance with data governance and regulatory requirements. Data Cleaning and Preprocessing: with support of MDM team, clean data to remove inaccuracies, inconsistencies, and duplicates. Prepare data for analysis by performing normalization, transformation, and encoding. Monitoring and Maintaining Data Quality: Regularly monitor data quality and accuracy, implementing measures to maintain high data standards. Support the CBS MDM team in the implementation of the company-wide Data Policies and Governance Framework Deliver the Master Data Management strategy and roadmap within the Commercial function Support the delivery of master data projects, ensuring timely completion within budget and value realization Together with Data Principals and Data Owner, define data standards, cleansing (Get Clean) and monitoring (Dashboard/KPIs) of the Commercial master data Supporting the delivery of controlled, governed, and harmonized data maintenance processes for the Commercial data domain, including their introduction into the CBS operating model, ensuring effective and efficient master data maintenance services Implementing the required KPIs in the Global MDM Dashboard to monitor data standards compliance The above includes supporting technology implementation and required change management activities Supporting the implementation of the Data Collection, Storage and Modelling strategy for the Commercial function Developing documentation and education materials to support business stakeholders and imbed Master Data Management best practices within the Commercial function Fostering a continuous improvement and agile mentality in the delivery of the Master Data Management Commercial roadmap Foster communication and teamwork within and across organizational boundaries Build working relationships and foster communication with internal customers within and across organizational and geographic boundaries Requirements Post-Graduation in Business Administration, Accounting, Finance, ISC, Economics, IT or equivalent work experience Minimum 9 to 12+ years’ overall experience with 6 to 7 years’ experience in process related function and with the Data supporting these processes Systems/IT knowledge required (e.g. 
MS Excel advanced, Salesforce experience) Experience in implementation of standardized E2E business processes, in MDM processes a plus Experience in interfacing with business stakeholders Experience in process and data improvement, including root cause analysis Experience in defining requirements for building data quality reports/dashboards Experience within a corporate business environment Keeps deadlines and produces high quality output Fosters a constructive dialogue within process owners, BU/Functions and other key stakeholders Strong communicator Good level of presentation skills Benefits We are committed to offering competitive benefits programs for all of our employees, and enhancing our programs when necessary. Make yourself a priority with flexible schedules, parental leave and our holiday purchase scheme Drive forward your career through professional development opportunities Achieve your personal goals with our Employee Assistance Programme Our commitment to you Our greatest assets are the expertise, creativity and passion of our employees. We strive to provide a great place to work that attracts, develops and retains the best talent, promotes employee engagement, fosters teamwork and ultimately drives innovation for the benefit of our customers. We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine to growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback and always challenging ourselves to do better. This is The Carrier Way. Join us and make a difference. Now! Carrier is An Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age or any other federally protected class.
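Much of the Carrier role above is about monitoring master data quality (completeness, duplicates) and feeding KPIs into a dashboard. A minimal, generic sketch of such a check is shown below using pandas; the extract name, field list, and 98% threshold are invented for illustration and are not Carrier's actual standards.

```python
# Toy data-quality KPIs for a customer master extract: completeness per
# mandatory field plus a duplicate rate on the business key.
import pandas as pd

customers = pd.read_csv("customer_master_extract.csv")   # hypothetical extract

mandatory = ["customer_id", "name", "country", "payment_terms"]
completeness = customers[mandatory].notna().mean() * 100  # % populated per field

dup_rate = customers.duplicated(subset=["customer_id"]).mean() * 100

kpis = completeness.to_frame("completeness_%")
kpis.loc["duplicate_rate_%"] = dup_rate
print(kpis.round(2))

# Simple red flag for the dashboard feed (threshold is an assumption).
if (completeness < 98).any() or dup_rate > 0:
    print("Data quality below target - investigate before month-end reporting.")
```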

Posted 3 weeks ago

Apply

10.0 years

0 Lacs

Gurgaon

On-site

Acuity Knowledge Partners (Acuity) is a leading provider of bespoke research, analytics and technology solutions to the financial services sector, including asset managers, corporate and investment banks, private equity and venture capital firms, hedge funds and consulting firms. Its global network of over 6,000 analysts and industry experts, combined with proprietary technology, supports more than 600 financial institutions and consulting companies to operate more efficiently and unlock their human capital, driving revenue higher and transforming operations. Acuity is headquartered in London and operates from 10 locations worldwide. The company fosters a diverse, equitable and inclusive work environment, nurturing talent, regardless of race, gender, ethnicity or sexual orientation. Acuity was established as a separate business from Moody’s Corporation in 2019, following its acquisition by Equistone Partners Europe (Equistone). In January 2023, funds advised by global private equity firm Permira acquired a majority stake in the business from Equistone, which remains invested as a minority shareholder. For more information, visit www.acuitykp.com Position Title- Associate Director (Senior Architect – Data) Department-IT Location- Gurgaon/ Bangalore Job Summary The Enterprise Data Architect will enhance the company's strategic use of data by designing, developing, and implementing data models for enterprise applications and systems at conceptual, logical, business area, and application layers. This role advocates data modeling methodologies and best practices. We seek a skilled Data Architect with deep knowledge of data architecture principles, extensive data modeling experience, and the ability to create scalable data solutions. Responsibilities include developing and maintaining enterprise data architecture, ensuring data integrity, interoperability, security, and availability, with a focus on ongoing digital transformation projects. Key Responsibilities 1. Strategy & Planning o Develop and deliver long-term strategic goals for data architecture vision and standards in conjunction with data users, department managers, clients, and other key stakeholders. o Create short-term tactical solutions to achieve long-term objectives and an overall data management roadmap. o Establish processes for governing the identification, collection, and use of corporate metadata; take steps to assure metadata accuracy and validity. o Establish methods and procedures for tracking data quality, completeness, redundancy, and improvement. o Conduct data capacity planning, life cycle, duration, usage requirements, feasibility studies, and other tasks. o Create strategies and plans for data security, backup, disaster recovery, business continuity, and archiving. o Ensure that data strategies and architectures are aligned with regulatory compliance. o Develop a comprehensive data strategy in collaboration with different stakeholders that aligns with the transformational projects’ goals. o Ensure effective data management throughout the project lifecycle. 2. Acquisition & Deployment o Ensure the success of enterprise-level application rollouts (e.g. ERP, CRM, HCM, FP&A, etc.) Liaise with vendors and service providers to select the products or services that best meet company goals 3. Operational Management o Assess and determine governance, stewardship, and frameworks for managing data across the organization. o Develop and promote data management methodologies and standards. 
o Document information products from business processes and create data entities o Create entity relationship diagrams to show the digital thread across the value streams and enterprise o Create data normalization across all systems and data base to ensure there is common definition of data entities across the enterprise o Document enterprise reporting needs develop the data strategy to enable single source of truth for all reporting data o Address the regulatory compliance requirements of each country and ensure our data is secure and compliant o Select and implement the appropriate tools, software, applications, and systems to support data technology goals. o Oversee the mapping of data sources, data movement, interfaces, and analytics, with the goal of ensuring data quality. o Collaborate with project managers and business unit leaders for all projects involving enterprise data. o Address data-related problems regarding systems integration, compatibility, and multiple-platform integration. o Act as a leader and advocate of data management, including coaching, training, and career development to staff. o Develop and implement key components as needed to create testing criteria to guarantee the fidelity and performance of data architecture. o Document the data architecture and environment to maintain a current and accurate view of the larger data picture. o Identify and develop opportunities for data reuse, migration, or retirement. 4. Data Architecture Design: o Develop and maintain the enterprise data architecture, including data models, databases, data warehouses, and data lakes. o Design and implement scalable, high-performance data solutions that meet business requirements. 5. Data Governance: o Establish and enforce data governance policies and procedures as agreed with stakeholders. o Maintain data integrity, quality, and security within Finance, HR and other such enterprise systems. 6. Data Migration: o Oversee the data migration process from legacy systems to the new systems being put in place. o Define & Manage data mappings, cleansing, transformation, and validation to ensure accuracy and completeness. 7. Master Data Management: o Devise processes to manage master data (e.g., customer, vendor, product information) to ensure consistency and accuracy across enterprise systems and business processes. o Provide data management (create, update and delimit) methods to ensure master data is governed 8. Stakeholder Collaboration: o Collaborate with various stakeholders, including business users, other system vendors, and stakeholders to understand data requirements. o Ensure the enterprise system meets the organization's data needs. 9. Training and Support: o Provide training and support to end-users on data entry, retrieval, and reporting within the candidate enterprise systems. o Promote user adoption and proper use of data. 10 Data Quality Assurance: o Implement data quality assurance measures to identify and correct data issues. o Ensure the Oracle Fusion and other enterprise systems contain reliable and up-to-date information. 11. Reporting and Analytics: o Facilitate the development of reporting and analytics capabilities within the Oracle Fusion and other systems o Enable data-driven decision-making through robust data analysis. 1. Continuous Improvement: o Continuously monitor and improve data processes and the Oracle Fusion and other system's data capabilities. o Leverage new technologies for enhanced data management to support evolving business needs. 
Technology and Tools: Oracle Fusion Cloud Data modeling tools (e.g., ER/Studio, ERwin) ETL tools (e.g., Informatica, Talend, Azure Data Factory) Data Pipelines: Understanding of data pipeline tools like Apache Airflow and AWS Glue. Database management systems: Oracle Database, MySQL, SQL Server, PostgreSQL, MongoDB, Cassandra, Couchbase, Redis, Hadoop, Apache Spark, Amazon RDS, Google BigQuery, Microsoft Azure SQL Database, Neo4j, OrientDB, Memcached) Data governance tools (e.g., Collibra, Informatica Axon, Oracle EDM, Oracle MDM) Reporting and analytics tools (e.g., Oracle Analytics Cloud, Power BI, Tableau, Oracle BIP) Hyperscalers / Cloud platforms (e.g., AWS, Azure) Big Data Technologies such as Hadoop, HDFS, MapReduce, and Spark Cloud Platforms such as Amazon Web Services, including RDS, Redshift, and S3, Microsoft Azure services like Azure SQL Database and Cosmos DB and experience in Google Cloud Platform services such as BigQuery and Cloud Storage. Programming Languages: (e.g. using Java, J2EE, EJB, .NET, WebSphere, etc.) o SQL: Strong SQL skills for querying and managing databases. Python: Proficiency in Python for data manipulation and analysis. Java: Knowledge of Java for building data-driven applications. Data Security and Protocols: Understanding of data security protocols and compliance standards. Key Competencies Qualifications: Education: o Bachelor’s degree in computer science, Information Technology, or a related field. Master’s degree preferred. Experience: 10+ years overall and at least 7 years of experience in data architecture, data modeling, and database design. Proven experience with data warehousing, data lakes, and big data technologies. Expertise in SQL and experience with NoSQL databases. Experience with cloud platforms (e.g., AWS, Azure) and related data services. Experience with Oracle Fusion or similar ERP systems is highly desirable. Skills: Strong understanding of data governance and data security best practices. Excellent problem-solving and analytical skills. Strong communication and interpersonal skills. Ability to work effectively in a collaborative team environment. Leadership experience with a track record of mentoring and developing team members. Excellent in documentation and presentations. Good knowledge of applicable data privacy practices and laws. Certifications: Relevant certifications (e.g., Certified Data Management Professional, AWS Certified Big Data – Specialty) are a plus. Behavioral A self-starter, an excellent planner and executor and above all, a good team player Excellent communication skills and inter-personal skills are a must Must possess organizational skills, including multi-task capability, priority setting and meeting deadlines Ability to build collaborative relationships and effectively leverage networks to mobilize resources Initiative to learn business domain is highly desirable Likes dynamic and constantly evolving environment and requirements

Posted 3 weeks ago

Apply

1.0 years

1 - 2 Lacs

Noida

On-site

Position: Web Developer We are looking for a highly skilled Web Developer with 1 year of experience in web-based project development. The successful candidate will be responsible for designing, developing, and implementing web applications using PHP and various open-source frameworks. Key Responsibilities: Collaborate with cross-functional teams to identify and prioritize project requirements Develop and maintain high-quality, efficient, and well-documented code Troubleshoot and resolve technical issues Implement Social Networks Integration, Payment Gateways Integration, and Web 2.0 in web-based projects Work with RDBMS design, normalization, Data modelling, Transactions, and distributed databases Develop and maintain database PL/SQL, stored procedures, and triggers Requirements: 1 year of experience in web-based project development using PHP Experience with various open-source frameworks such as Laravel, WordPress, Drupal, Joomla, OsCommerce, OpenCart, TomatoCart, VirtueMart, Magento, Yii 2, CakePHP 2.6, Zend 1.10, and Kohana Strong knowledge of Object-Oriented PHP, Curl, Ajax, Prototype.Js, JQuery, Web services, Design Patterns, MVC architecture, and Object-Oriented Methodologies Experience with RDBMS design, normalization, Data modelling, Transactions, and distributed databases Well-versed with RDBMS MySQL (can work with other SQL flavors too) Job Type: Full-time Pay: ₹15,000.00 - ₹20,000.00 per month Work Location: In person

Posted 3 weeks ago

Apply

4.0 years

18 - 22 Lacs

Bengaluru, Karnataka, India

On-site

This role is for one of the Weekday's clients Salary range: Rs 1800000 - Rs 2200000 (ie INR 18-22 LPA) Min Experience: 4 years Location: Bangalore, Bengaluru JobType: full-time We are seeking a skilled and detail-oriented Data Modeller with 4-6 years of experience to join our growing data engineering team. In this role, you will play a critical part in designing, implementing, and optimizing robust data models that support business intelligence, analytics, and operational data needs. You will collaborate with cross-functional teams to understand business requirements and convert them into scalable and efficient data solutions, primarily leveraging Amazon Redshift and Erwin Data Modeller. Requirements Key Responsibilities: Design and implement conceptual, logical, and physical data models that support business processes and reporting needs. Develop data models optimized for Amazon Redshift, ensuring performance, scalability, and integrity of data. Work closely with business analysts, data engineers, and stakeholders to translate business requirements into data structures. Use Erwin Data Modeller (Erwin ERP) to create and maintain data models and maintain metadata repositories. Collaborate with ETL developers to ensure efficient data ingestion and transformation pipelines that align with the data model. Apply normalization, denormalization, and indexing strategies to optimize data performance and access. Perform data profiling and source system analysis to validate assumptions and model accuracy. Create and maintain detailed documentation, including data dictionaries, entity relationship diagrams (ERDs), and data lineage information. Drive consistency and standardization across all data models, ensuring alignment with enterprise data architecture and governance policies. Identify opportunities to improve data quality, model efficiency, and pipeline performance. Required Skills and Qualifications: 4-6 years of hands-on experience in data modeling, including conceptual, logical, and physical modeling. Strong expertise in Amazon Redshift and Redshift-specific modeling best practices. Proficiency with Erwin Data Modeller (Erwin ERP) or similar modeling tools. Strong knowledge of SQL with experience writing complex queries and performance tuning. Solid understanding of ETL processes and experience working alongside ETL engineers to integrate data from multiple sources. Familiarity with dimensional modeling, data warehousing principles, and star/snowflake schemas. Experience with metadata management, data governance, and maintaining modeling standards. Ability to work independently and collaboratively in a fast-paced, data-driven environment. Strong analytical and communication skills with the ability to present technical concepts to non-technical stakeholders. Preferred Qualifications: Experience working in a cloud-native data environment (AWS preferred). Exposure to other data modeling tools and cloud data warehouses is a plus. Familiarity with data catalog tools, data lineage tracing, and data quality frameworks
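Given the emphasis above on Redshift-specific modeling (star schemas, distribution and sort keys), here is a deliberately small, hypothetical example of the kind of physical design involved: a fact table co-located with its largest dimension via DISTKEY and sorted for date-range pruning. Table and column names are invented for illustration, and the DDL is held in Python strings so it can be reviewed or run through any Postgres-compatible driver; this is not a prescribed schema.

```python
# Sketch of a star-schema physical design for Amazon Redshift.
# Names, keys, and data types are illustrative assumptions only.
DIM_CUSTOMER = """
CREATE TABLE dim_customer (
    customer_key BIGINT IDENTITY(1,1),
    customer_id  VARCHAR(32) NOT NULL,
    segment      VARCHAR(64),
    country      VARCHAR(2)
)
DISTSTYLE KEY
DISTKEY (customer_key)
SORTKEY (customer_key);
"""

FACT_SALES = """
CREATE TABLE fact_sales (
    sale_date    DATE          NOT NULL,
    customer_key BIGINT        NOT NULL,
    product_key  BIGINT        NOT NULL,
    quantity     INTEGER       NOT NULL,
    net_amount   DECIMAL(18,2) NOT NULL
)
DISTSTYLE KEY
DISTKEY (customer_key)   -- co-locate with dim_customer to avoid redistribution on joins
SORTKEY (sale_date);     -- supports date-range pruning for reporting queries
"""

for ddl in (DIM_CUSTOMER, FACT_SALES):
    print(ddl.strip())
```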

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Control Risks is seeking a highly technical, detail-oriented Data Analyst to join our Data & Technology Consulting (DTC) team. This role is deeply embedded in data analytics, scripting, ETL workflows, and reporting. The candidate is required to have strong skills in Python, SQL, Power BI, Microsoft Fabric, and PowerApps. The successful candidate will play a critical role in developing and implementing data solutions that power our consulting engagements. This is not a generic data analyst role, we're looking for a problem-solver who thrives in complex, fast-paced environments, is confident writing production-level code, and can develop intuitive, scalable reporting solutions. Tasks & responsibilities: • Interrogate, clean, and assess structured and unstructured data for integrity, completeness, and business relevance. • Build and optimize robust ETL pipelines to normalize disparate datasets and enable downstream analysis. • Write efficient SQL and Python scripts to support custom data transformations, enrichment, and automations. • Design, build, and maintain interactive Power BI dashboards and PowerApps solutions aligned to client and internal requirements. • Interpret and analyse complex financial, operational, and transactional datasets to surface insights and support investigative work. • Document methodologies, code logic, data assumptions, and business context throughout the project lifecycle. • Collaborate across multi-disciplinary teams to ensure timely delivery of work products and reporting solutions. Requirements • Minimum 3 years of hands-on experience with: • Writing production-level SQL and Python for data transformation and automation. • Building and maintaining ETL pipelines for large, messy, and complex datasets. • Designing and deploying workflows and reports using Power BI, PowerApps, and Microsoft Fabric. • Advanced proficiency in Excel (pivoting, modelling, formulas, data wrangling). • Demonstrated experience working with relational databases and open-source tools. • Strong understanding of data structures, normalization, and query optimization. • Proven ability to manage multiple priorities in a deadline-driven environment. • Self-motivated, methodical, and committed to high-quality outcomes. • Excellent written and verbal communication in English. Preferred Skills • Experience in consulting, compliance, or risk advisory environments. • Comfort navigating ambiguity and changing priorities. • Exposure to version control systems (e.g., Git), cloud data tools, or APIs is a plus. Education • Bachelor's Degree in Computer Science, Data Science, Information Systems, or a relevant quantitative field.
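Because the role above is explicitly about ETL pipelines, SQL, and Python for messy datasets, a compact extract-transform-load sketch follows. The source file, column names, and the SQLite target are placeholders standing in for whatever systems a real engagement would use; the point is only the normalize-then-load pattern the posting describes.

```python
# Minimal ETL sketch: extract a raw CSV, clean and normalize it with pandas,
# and load it into a relational table for downstream reporting.
# All file, table, and column names are illustrative placeholders.
import sqlite3
import pandas as pd

# Extract
raw = pd.read_csv("transactions_raw.csv")

# Transform: trim text, standardize dates and amounts, drop unusable rows.
raw["counterparty"] = raw["counterparty"].str.strip().str.title()
raw["txn_date"] = pd.to_datetime(raw["txn_date"], errors="coerce")
raw["amount"] = pd.to_numeric(raw["amount"], errors="coerce")
clean = raw.dropna(subset=["txn_date", "amount"]).drop_duplicates()

# Load into a local SQLite table (stand-in for the real warehouse target).
with sqlite3.connect("analytics.db") as conn:
    clean.to_sql("transactions", conn, if_exists="replace", index=False)
    count = conn.execute("SELECT COUNT(*) FROM transactions").fetchone()[0]
print(f"Loaded {count} clean rows out of {len(raw)} raw records")
```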

Posted 3 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies