5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
You will be joining ITSource Technologies Limited, a client-oriented IT services, BPO, and IT staffing company known for its commitment to excellence. With a track record of over 22 years, we specialize in Enterprise Applications, Big Data, Staffing, BI, Cloud, and Web Solutions. Our industry reputation is built on a customer-centric approach and exceptional talent management capabilities. As an AWS Data Architect based in Pune, you will take on a full-time, on-site role overseeing data governance, data architecture, data modeling, Extract, Transform, Load (ETL), and data warehousing processes. Your responsibilities will include applying data modeling techniques to address new business requirements, utilizing SQL, advanced SQL, and NoSQL databases, managing data warehousing and ETL tasks, engaging with customers to grasp business needs, and working with Tableau and Power BI. To excel in this role, you should possess skills in data governance, data architecture, data modeling, and ETL processes. Strong analytical and problem-solving abilities are essential, as is familiarity with AWS services and infrastructure. Effective communication and collaboration skills are key, along with a Bachelor's degree in Computer Science or a related field. An AWS certification would be considered a plus.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Databricks Architect at Gramener, you will engage in diverse, impactful customer Big Data projects. Your responsibilities will include developing reference architectures, how-to guides, and minimum viable products (MVPs). Leading strategic initiatives encompassing the end-to-end design, build, and deployment of industry-leading big data and AI applications will be a key aspect of your role. You will provide architectural guidance to foster the adoption of Databricks across business-facing functions and collaborate with platform engineering teams to implement Databricks services effectively within the infrastructure. Your expertise in AWS administration and architecture will be utilized to optimize cloud resources and ensure seamless integration with Databricks. You will also leverage your hands-on experience with Databricks Unity Catalog to implement robust data governance and lineage capabilities. Advocating for and implementing CI/CD practices to streamline the deployment of Databricks solutions will be part of your responsibilities. Additionally, you will contribute to developing a data mesh architecture that promotes decentralized data ownership and accessibility across the organization. To qualify for this role, you should have at least 7 years of experience with Big Data technologies, including Apache Spark, cloud-native data lakes, and data mesh platforms, in a technical architecture or consulting role. A minimum of 5 years of independent experience in Big Data architecture is required. Proficiency in Python, familiarity with data engineering best practices, and extensive experience working with AWS cloud platforms are essential qualifications. Strong documentation and whiteboarding skills will be necessary to communicate complex ideas effectively. You must possess in-depth knowledge of the latest services offered by Databricks and the ability to evaluate and integrate them into the platform. Demonstrated expertise in migrating from the Databricks classic platform to the Lakehouse architecture, utilizing the Delta file format and/or Delta Live Tables, is also required. A collaborative mindset with the ability to work effectively across teams and functions is crucial for success in this role. Gramener offers an inviting workplace with talented colleagues from diverse backgrounds, clear career paths, and steady growth prospects with great scope to innovate. The company aims to create an ecosystem of easily configurable data applications focused on storytelling for public and private use. If you are looking to be part of a dynamic team that consults on and delivers data-driven solutions to organizations, this role could be the perfect fit. To learn more about Gramener, visit the company's website and blog. If you are interested in this opportunity, we encourage you to apply for the Databricks Architect role at Gramener.
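For illustration, here is a minimal sketch of the classic-to-Lakehouse migration step this posting mentions: converting a Parquet dataset into a Delta table with PySpark. It assumes a Databricks runtime with the Delta format built in; the paths, schema, and table names are hypothetical placeholders.

```python
# Minimal sketch (assumptions noted above): Parquet -> managed Delta table.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("parquet-to-delta").getOrCreate()

raw = spark.read.parquet("s3://example-bucket/raw/orders/")  # hypothetical path

spark.sql("CREATE SCHEMA IF NOT EXISTS analytics")           # hypothetical schema
(raw.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("analytics.orders"))                        # hypothetical table

# A Delta table can then be queried like any other table:
spark.sql("SELECT COUNT(*) FROM analytics.orders").show()
```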
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As an ETL Developer with expertise in DataStage and Snowflake, you will play a crucial role in the organization by managing and improving its data pipeline initiatives. You will be responsible for extracting, transforming, and loading data from diverse sources into the Snowflake environment, ensuring high-quality data availability for analytics and reporting. As companies increasingly rely on data-driven decisions, the need for skilled ETL Developers has grown significantly. You will work closely with data analysts, data scientists, and other stakeholders to understand business requirements and translate them into robust ETL processes. By optimizing and maintaining data pipelines, you will enhance data accessibility and efficiency, driving informed decision-making and strategic initiatives within the company. Key responsibilities: - Design, develop, and maintain ETL processes using DataStage and Snowflake. - Collaborate with data architects and analysts to gather requirements and specifications. - Extract data from multiple sources, ensuring integrity and security. - Transform data according to business needs, applying rules and best practices. - Load transformed data into Snowflake, optimizing for performance and efficiency. - Monitor ETL jobs for performance, troubleshoot issues, and ensure timely execution. - Implement data quality checks and validation processes. - Document ETL processes, data flows, and transformations for future reference. - Work with the data team to design and implement scalable data models. - Enhance existing ETL processes for better performance and reliability. - Conduct root cause analysis for data discrepancies or failures in data pipelines. - Stay updated on new technologies and trends related to ETL and data warehousing. - Train and guide junior team members on ETL best practices. - Participate in data governance initiatives, ensuring compliance with policies. - Adopt Agile methodologies in project execution for efficient workflow. Required qualifications: - Bachelor's degree in Computer Science, Information Technology, or a related field. - 5+ years of experience working as an ETL Developer or in a similar role. - Proficiency in DataStage, with a solid understanding of its functionality. - Experience with Snowflake and cloud-based data solutions. - Strong command of SQL and relational databases. - Familiarity with data modeling concepts and dimensional modeling. - Experience with performance tuning and optimization of ETL jobs. - Knowledge of data governance and data quality frameworks. - Ability to work collaboratively in a team-oriented environment. - Strong analytical and problem-solving skills. - Excellent communication and interpersonal skills. - Experience with Agile/Scrum methodologies preferred. - Understanding of big data technologies and frameworks is a plus. - Certifications in data warehousing or ETL tools are advantageous. - Willingness to learn new technologies and tools as needed. - Attention to detail with a focus on maintaining data integrity.
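As a hedged illustration of the load-and-validate work described above, here is a minimal sketch using the Snowflake Python connector. The connection parameters, stage, and table names are hypothetical placeholders rather than details from the posting.

```python
# Minimal sketch: bulk-load a staged file into Snowflake, then run a
# simple data-quality check. All identifiers are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    user="ETL_USER", password="...", account="myorg-myaccount",  # placeholders
    warehouse="ETL_WH", database="EDW", schema="STAGE",
)
cur = conn.cursor()

# COPY INTO is Snowflake's bulk-load statement.
cur.execute(
    "COPY INTO STAGE.CUSTOMERS FROM @ETL_STAGE/customers/ "
    "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
)

# Data-quality check: fail the job if mandatory keys are missing.
cur.execute("SELECT COUNT(*) FROM STAGE.CUSTOMERS WHERE CUSTOMER_ID IS NULL")
null_keys = cur.fetchone()[0]
if null_keys:
    raise ValueError(f"{null_keys} rows arrived without a CUSTOMER_ID")

conn.close()
```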
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Hyderabad, Telangana
On-site
As the Manager, Strategic Program Analytics at Bristol Myers Squibb, you will be instrumental in supporting the Director, Agile Sourcing Hyderabad Site Lead by leveraging data, analytics, and performance efforts for the entire Global Procurement organization's strategic initiatives. Your role will involve enhancing analytics capabilities, improving processes for data governance and quality, and documenting performance against Global Procurement's priorities and objectives. You will play a crucial part in managing procurement activities strategically and efficiently, identifying areas for continuous improvement and efficiencies wherever applicable. Your responsibilities will include delivering analytics metrics and dashboards related to sourcing events, suppliers, cash flow, contracts, spend, savings, market intelligence, and cost intelligence to successfully achieve business objectives. You will collaborate with cross-functional teams, including Business Intelligence and IT teams, to deliver necessary data management tools and system solutions. Your role will also involve developing and monitoring key performance metrics, analyzing performance trends, identifying potential risks, and making fact-based recommendations to close gaps against targets. Furthermore, you will support the end-to-end performance reporting of the functional strategic roadmap through the development of global procurement and functional team scorecards. You will liaise with key stakeholders across all of Global Procurement for progress updates and report status to leadership and functional area teams as appropriate. Your role will require effective communication, presentation, and negotiation skills to build strong partnerships across the organization. To qualify for this role, you should have a BA/BS in a quantitative major or concentration, along with 5+ years of experience in developing and using advanced analytics and reporting techniques. Additionally, you should have 3+ years of experience in performing procurement analytics or relevant experience. Proficiency in tools across the analytics stack, such as data management tools like MapReduce/Hadoop, SPSS/R, SAS, and Workday, is required. Experience with tools like Tableau and Power BI for data integration and reporting is preferred. If you are intrigued by this role and believe you have the necessary skills, we encourage you to apply. At Bristol Myers Squibb, we offer a unique and inclusive work environment where every employee has the opportunity to make a meaningful impact on transforming patients' lives through science. Join us in our mission to bring out the highest potential in each of our colleagues while promoting diversity and innovation in an inspiring culture.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Hyderabad, Telangana
On-site
You will be responsible for driving the implementation of decisions and policies related to GPS Network standardization. This includes translating Functional Committee Network standardization decisions into action plans for delivery and ensuring that the GPS Network data standard documentation is updated and maintained. In addition, you will work on remediating data to adhere to new or updated data standards. This may involve coordinating with Functional Business leads on remediation activities, partnering with IT and Business teams on mass remediation activities, and collaborating with the Data Strategy and Governance team to integrate necessary business rule updates and ensure testing plans for the Data Quality dashboard. Furthermore, you will partner with IT on system alignment to Data Lifecycle processes. This includes performing impact assessments on GPS systems affected by data standard decisions, working with the broader team to understand the scope of work required for changes, and collaborating with IT on system implementation, testing, and User Acceptance Testing. You will also ensure sustainment plans are in place and aligned with GPS Data Lifecycle processes and SOP documentation. To qualify for this role, you should have a B.S. or B.A. in supply chain, management, or engineering, with experience in Global Product Development and Supply business functions. Operational excellence experience and knowledge of strategic data management disciplines are desired. Additionally, experience in Supply Chain, Manufacturing, Quality, or Regulatory Management is required, along with exposure to current Good Manufacturing Practices and regulatory requirements. Moreover, you should have experience in biotech/pharmaceutical manufacturing processes and ERP systems like SAP or Oracle, ideally including ERP deployment. An understanding of the SCOR methodology and experience with systems like LIMS and Veeva are highly preferred. In terms of professional and personal requirements, you should possess business acumen, an enterprise mindset, strong problem-solving and analytical skills, the ability to lead cross-functional project teams, change agility, and digital dexterity. You should be able to influence others, make data-driven decisions, lead collaborative projects, navigate change effectively, and leverage technology for business outcomes.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Maharashtra
On-site
At Derevo, we focus on empowering both companies and people to unlock the value of data within organizations. Our mission involves implementing analytics processes and platforms that cover the full cycle needed to achieve this goal. Since our founding in 2010, at Derevo we have sought to create more than a company; we have built a community and a space where every individual has the opportunity to pursue their dreams. We are currently looking for talent for the position of ETL ODI Consultant. As part of your responsibilities, you will design Data Warehouse tables and the corresponding documentation. You will develop table-creation scripts and run them for the Database Administrator (DBA). You will analyze data sources and determine data relationships to link them with the business requirements documentation (BRD). Familiarity with data quality issues and Extract, Transform, Load (ETL) processes is required. You will be responsible for designing and developing the Extract, Load, Transform (ELT) process to load data from source tables to staging tables and then to fact and dimension tables using Oracle Data Integrator (ODI). You will validate the data loaded into various tables using SQL and other analysis techniques. In addition, you are expected to have knowledge and understanding of the applicable tools and practices used in data governance, including data querying and profiling techniques. To be considered for this position, you must have an advanced level of English (B2 or higher) and at least 5 years of experience as an ODI consultant. We offer benefits ranging from holistic well-being to opportunities for specialization in a variety of areas and technologies. At Derevo, we foster creativity and personal growth through participation in innovative technology projects and collaboration with multinational teams. Our team works remotely, in a flexible yet structured scheme. We provide the equipment needed to work, along with internal communication tools that support our operations and those of our clients. If you meet most of the requirements and are interested in this opportunity, do not hesitate to apply. Our Talent team will get in touch with you! Join Derevo and develop your superpower.
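For illustration, here is a minimal sketch of the source-to-stage validation step this posting describes: reconciling row counts after an ODI load, using the python-oracledb driver. The connection details and table names are hypothetical placeholders.

```python
# Minimal sketch: row-count reconciliation between source and staging
# tables after an ELT load. All identifiers are hypothetical.
import oracledb

conn = oracledb.connect(user="etl_user", password="...", dsn="dbhost/ORCLPDB")
cur = conn.cursor()

for source, stage in [("SRC.ORDERS", "STG.ORDERS"), ("SRC.CUSTOMERS", "STG.CUSTOMERS")]:
    cur.execute(f"SELECT COUNT(*) FROM {source}")
    src_count = cur.fetchone()[0]
    cur.execute(f"SELECT COUNT(*) FROM {stage}")
    stg_count = cur.fetchone()[0]
    status = "OK" if src_count == stg_count else "MISMATCH"
    print(f"{source} -> {stage}: {src_count} vs {stg_count} [{status}]")

conn.close()
```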
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
You will be joining Tesco as a Data Science Manager focusing on AI Governance and Innovation. The Data Science team at Tesco is dedicated to solving complex business problems and implementing data products at scale. The team works across various domains such as physical stores, online platforms, supply chain, marketing, and Clubcard, promoting rotation among Data Scientists to build expertise in different subjects. In this role, you will be responsible for driving innovation across all business and technical domains, leading the development of data science algorithmic features, and implementing and managing AI governance frameworks to ensure ethical and responsible AI practices. You will also play a key role in setting up projects for success, framing and scoping Data Science problems, line-managing data scientists, and supporting the team in their daily tasks. Collaboration with legal and technical teams to address governance issues and educate employees on AI policies will be essential. To excel in this position, you should possess a higher degree in a mathematical, scientific, engineering, or computer science discipline with a strong numerical component and a focus on Data Science. Proficiency in Python, machine learning, and software engineering best practices, and experience in designing reliable and scalable ML systems, are crucial. Additionally, expertise in open-source big data technologies, cloud platforms, and managing open-source Data Science environments is required. Your role will involve leveraging your knowledge of AI technology and ethical considerations, legal and regulatory frameworks related to AI, data governance, and risk management. Strong leadership skills, experience in managing high-performing Data Science teams, and the ability to influence senior stakeholders are key qualifications. Previous experience in AI governance, collaborating with legal and technical teams, and coaching others in technical approaches will be advantageous. A background in or understanding of the retail sector, logistics, or e-commerce would be beneficial. Overall, this role offers a challenging opportunity to drive innovation projects, manage AI governance, and lead the implementation of data science developments at Tesco.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As an Informatica MDM Developer at RapidQube Digital Solutions Pvt Ltd, you will be responsible for designing, developing, and implementing Master Data Management (MDM) solutions to ensure high-quality, consistent, and secure data management across the organization. Your expertise in MDM concepts, data modeling, data integration, and the Informatica MDM platform will be crucial in meeting business and technical requirements. Your key responsibilities will include designing, developing, and implementing Informatica MDM solutions, configuring and customizing Informatica MDM components, developing and maintaining data models within the MDM Hub, implementing data quality rules, integrating Informatica MDM with other enterprise systems, performing data profiling and standardization, executing tests and optimization strategies, troubleshooting MDM-related issues, collaborating with stakeholders, and documenting technical solutions and workflows. To qualify for this role, you should hold a Bachelor's degree in Computer Science, Information Technology, or a related field; have hands-on experience in Informatica MDM development; be proficient in MDM Hub, Data Director, Entity 360, and Business Entity Services; have strong knowledge of data modeling and integration principles; have experience with SQL and data manipulation techniques; and understand data governance and stewardship practices. Additionally, you should possess excellent problem-solving, analytical, and communication skills to work effectively with cross-functional teams. Preferred qualifications include Informatica MDM certification, experience with Informatica PowerCenter and Informatica Data Quality, knowledge of cloud-based MDM solutions, experience with Agile development methodologies, familiarity with version control systems, and industry experience in sectors such as Healthcare, Finance, or Retail. Join us at RapidQube Digital Solutions Pvt Ltd to contribute to the continuous improvement of development processes and provide support, maintenance, and enhancements for production MDM environments while staying current with the latest Informatica MDM features and industry trends.
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Join us as a Data Engineer responsible for supporting the successful delivery of Location Strategy projects to plan, budget, and agreed quality and governance standards. You'll spearhead the evolution of our digital landscape, driving innovation and excellence. You will harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences. To be successful as a Data Engineer, you should have experience with: - Hands-on experience in PySpark and strong knowledge of DataFrames, RDDs, and SparkSQL. - Hands-on experience in developing, testing, and maintaining applications on AWS Cloud. - A strong command of the AWS data analytics technology stack (Glue, S3, Lambda, Lake Formation, Athena). - Designing and implementing scalable, efficient data transformation and storage solutions using Snowflake. - Data ingestion to Snowflake for storage formats such as Parquet, Iceberg, JSON, and CSV. - Using DBT (Data Build Tool) with Snowflake for ELT pipeline development. - Writing advanced SQL and PL/SQL programs. - Hands-on experience building reusable components using Snowflake and AWS tools/technology. - At least two major project implementations. - Exposure to data governance or lineage tools such as Immuta and Alation is an added advantage. - Experience using orchestration tools such as Apache Airflow or Snowflake Tasks is an added advantage. - Knowledge of the Ab Initio ETL tool is a plus. Some other highly valued skills may include: - Ability to engage with stakeholders, elicit requirements/user stories, and translate requirements into ETL components. - Ability to understand the infrastructure setup and provide solutions, either individually or working with teams. - Good knowledge of Data Mart and Data Warehousing concepts. - Good analytical and interpersonal skills. - Implementing a cloud-based enterprise data warehouse across multiple data platforms, along with Snowflake and NoSQL environments, to build a data movement strategy. You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. The role is based out of Chennai. Purpose of the role: To build and maintain the systems that collect, store, process, and analyze data, such as data pipelines, data warehouses, and data lakes, ensuring that all data is accurate, accessible, and secure. Accountabilities: - Building and maintaining data architecture pipelines that enable the transfer and processing of durable, complete, and consistent data. - Designing and implementing data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures. - Developing processing and analysis algorithms fit for the intended data complexity and volumes. - Collaborating with data scientists to build and deploy machine learning models. Analyst Expectations: To perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement. The role requires in-depth technical knowledge and experience in the assigned area of expertise and a thorough understanding of the underlying principles and concepts within that area. Analysts may lead and supervise a team, guiding and supporting professional development, allocating work requirements, and coordinating team resources.
If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L, Listen and be authentic; E, Energize and inspire; A, Align across the enterprise; D, Develop others. For an individual contributor, the expectation is to develop technical expertise in the work area, acting as an advisor where appropriate, to have an impact on the work of related teams within the area, and to partner with other functions and business areas. The role takes responsibility for the end results of a team's operational processing and activities: escalate breaches of policies/procedures appropriately; take responsibility for embedding new policies/procedures adopted due to risk mitigation; advise and influence decision-making within your own area of expertise; take ownership of managing risk and strengthening controls in relation to the work you own or contribute to; and deliver your work and areas of responsibility in line with relevant rules, regulations, and codes of conduct. Maintain and continually build an understanding of how your sub-function integrates with the function, alongside knowledge of the organization's products, services, and processes within the function. Demonstrate an understanding of how areas coordinate and contribute to the achievement of the objectives of the organization's sub-function. Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents. Guide and persuade team members and communicate complex/sensitive information. Act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organization. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset to Empower, Challenge, and Drive, the operating manual for how we behave.
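As an illustrative aside, here is a minimal PySpark sketch of the kind of transformation work this posting describes: reading raw CSV from S3, applying a simple cleanup, and writing partitioned Parquet. The bucket names and columns are hypothetical placeholders.

```python
# Minimal sketch: raw CSV in S3 -> cleaned, partitioned Parquet.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("location-strategy-etl").getOrCreate()

trades = (spark.read
          .option("header", "true")
          .csv("s3://example-raw/trades/"))            # hypothetical bucket

cleaned = (trades
           .withColumn("trade_date", F.to_date("trade_date", "yyyy-MM-dd"))
           .filter(F.col("amount").isNotNull()))       # drop incomplete rows

(cleaned.write
 .mode("overwrite")
 .partitionBy("trade_date")
 .parquet("s3://example-curated/trades/"))             # hypothetical bucket
```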
Posted 1 week ago
4.0 - 15.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You will be responsible for leading the end-to-end delivery of banking data warehouse engagements, including planning, execution, and delivery. It is essential to align with the long-term strategy of both the customer and TCS. You will need to liaise with multiple stakeholders across Technology, the Business, and vendors. With over 15 years of experience in banking data warehousing, Business Intelligence, and Analytics, including at least 4 years in Delivery Lead or Project Management roles, you will be well equipped for this position. A good understanding of the banking domain and of technologies such as ETL, databases, and BI reporting is crucial, as is proficiency in data quality, data governance, and data management. Strong communication, negotiation, and interpersonal skills will be beneficial. Your expertise across data quality, business intelligence, BI reporting, project management, delivery leadership, data warehousing, negotiation, banking, ETL, databases, data governance, communication, analytics, and data management will be key to your success in this position.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Maharashtra
On-site
You will be working as a Senior Business Intelligence Engineer at COVU, a venture-backed technology startup focused on modernizing independent insurance agencies using AI and data analytics. Your role will involve designing and building real-time dashboards, optimizing data pipelines, and establishing best practices for BI platforms. As a Senior Business Intelligence Engineer, you will own and operate the BI stack, including its architecture, security, and performance. You will design and develop real-time dashboards for various teams within the company and optimize data pipelines using AWS and SQL/NoSQL technologies. Additionally, you will document and promote BI best practices and present insights to both technical and non-technical stakeholders. To qualify for this position, you should have at least 5 years of experience building BI/analytics solutions on cloud data warehouses. Strong SQL skills and experience with AWS services like QuickSight, Redshift, Glue, and Lambda are required. You should also have expertise in modeling large relational and NoSQL datasets for efficient dashboards and in implementing data governance practices with a focus on PII protection. Bonus qualifications include familiarity with P&C insurance data, Applied Epic, and Salesforce. The company offers health and dental insurance benefits, and the position is fully remote, allowing candidates from anywhere in the world to apply.
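For illustration, here is a minimal sketch of querying Redshift through the boto3 Data API, a Lambda-friendly pattern for the kind of dashboard workloads this role describes. The cluster, database, user, and SQL are hypothetical placeholders.

```python
# Minimal sketch: run a dashboard query via the Redshift Data API
# (no persistent connection, so it works well from AWS Lambda).
import time
import boto3

client = boto3.client("redshift-data")

resp = client.execute_statement(
    ClusterIdentifier="analytics-cluster",  # hypothetical cluster
    Database="bi",
    DbUser="bi_reader",
    Sql="SELECT agency_id, COUNT(*) AS policies FROM policies GROUP BY agency_id",
)
statement_id = resp["Id"]

# Poll until the statement finishes, then fetch the result set.
while True:
    status = client.describe_statement(Id=statement_id)["Status"]
    if status in ("FINISHED", "FAILED", "ABORTED"):
        break
    time.sleep(1)

if status != "FINISHED":
    raise RuntimeError(f"query ended in state {status}")

rows = client.get_statement_result(Id=statement_id)["Records"]
print(f"fetched {len(rows)} rows")
```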
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Navi Mumbai, Maharashtra
On-site
You should have a Bachelor's or Master's degree in Computer Science, Data Science, Statistics, Mathematics, or a related field, with at least 3 years of experience. Ideally, you should hold AWS/Microsoft certifications in Data Engineering, Data Analytics, or Cloud. Proficiency in programming languages such as Python, SQL, Java, Scala, or similar is required. Additionally, experience with data visualization tools like Power BI, as well as front-end technologies such as JS/TypeScript, is expected. Knowledge of database technologies (e.g., SQL, NoSQL) and data warehousing is crucial, including an understanding of their application in the context of digital products. You should possess a strong understanding of data governance, data modeling, and data warehousing. Experience with big data technologies like Hadoop and Spark, and with intelligent automation platforms such as Dataiku, DataRobot, and Power Automate, is highly valued. Familiarity with SAFe Agile methodologies and the ability to work in a fast-paced environment are essential. Strong problem-solving skills are required, along with the capability to translate business needs into data solutions and an openness to working in agile environments with multiple stakeholders, especially those experienced in SAFe methodologies. Your role will involve translating complex technical concepts and solutions related to digital products into business language that non-technical stakeholders can understand. You will be responsible for conveying business requirements related to the digital core and digital products to DD&T talent supporting the strategic planning process. Additionally, you will articulate process digitization propositions to business stakeholders, conduct diagnostics to identify use-case opportunities, scout and deliver suitable solutions and services to the business, provide support for key users within the organization, and lead service portfolio and service delivery management.
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
Karnataka
On-site
As a Power BI Architect - Data Engineer, you will play a crucial role in designing, implementing, and managing comprehensive business intelligence solutions. Your focus will be on data modeling, report development, and ensuring data security and compliance. Working within high-performing, collaborative teams, you will present data migration solutions and influence key stakeholders in client groups. Your expertise will help clients drive toward strategic data architecture goals, enhancing the coherence, quality, security, and availability of the organization's data assets through the development of data migration roadmaps. Your responsibilities will include designing and leading real-time data architectures for large volumes of information, implementing integration flows with Data Lakes and Microsoft Fabric, optimizing and governing tabular models in Power BI, and ensuring high availability, security, and scalability. You will also coordinate data quality standards with a focus on DataOps for continuous deployment and automation. To be successful in this role, you should have demonstrable experience in master data management and at least 7 years of experience designing and implementing BI solutions and data architectures. You must possess advanced modeling skills, proficiency in DAX, and expertise in optimization and governance. Strong knowledge of Data Lake, Microsoft Fabric, and real-time ingestion methods is essential, as is hands-on knowledge of Python or R for data manipulation, transformation, and automation. You should also have proven experience in tabular modeling, DAX queries, and report optimization in Power BI. The ability to plan, define, estimate, and manage the delivery of work packages will be crucial. Excellent communication skills and the flexibility to respond to various program demands are essential for this role. You should have a deep understanding of key technical developments in your area of expertise and be able to lead the definition of information and data models, data governance structures, and processes. Experience working in complex environments across multiple business and technology domains is preferred, along with the ability to bridge the gap between functional and non-functional teams.
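As a small, hypothetical example of the Python data-preparation work this role mentions (feeding a Power BI tabular model), here is a pandas sketch that builds a date dimension; the date range and output file are placeholders.

```python
# Minimal sketch: build a date dimension table for a tabular model.
import pandas as pd

dates = pd.date_range("2024-01-01", "2025-12-31", freq="D")  # hypothetical range
dim_date = pd.DataFrame({
    "date": dates,
    "year": dates.year,
    "month": dates.month,
    "month_name": dates.strftime("%B"),
    "is_weekend": dates.dayofweek >= 5,   # Saturday/Sunday flag
})

# Parquet output is convenient for import into Power BI via a data lake.
dim_date.to_parquet("dim_date.parquet")  # hypothetical output path
```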
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Maharashtra
On-site
Bright Vision Technologies has an immediate full-time opportunity for a Lead Data Architect. Your path to a successful career in the U.S. starts here! At Bright Vision Technologies, we specialize in turning aspirations into reality. Whether you're an IT professional looking for growth or a recent graduate planning your next step, we've got you covered! Why partner with us: Proven Success: a trusted partner in successful H1B filings for over a decade. End-to-End Support: from documentation to final approval, we handle it all. Top Clients: access premium opportunities with Fortune 500 companies. Transparent Process: regular updates and personalized attention at every stage. Green Card Sponsorship: pathways to long-term residency for eligible candidates. About Bright Vision Technologies: Bright Vision Technologies is a rapidly expanding technology company specializing in innovative solutions. We are dedicated to developing cutting-edge applications that enhance user experiences. We are currently seeking a highly skilled and experienced Data Architect to join our dynamic team and contribute to the creation of industry-leading mobile applications. The ideal candidate will have a strong background in data architecture, data modeling, and database design. This role involves working closely with cross-functional teams to design and implement robust data solutions that support our business objectives. Responsibilities: - Design and implement scalable, high-performance data architectures - Develop and maintain data models, data flow diagrams, and database schemas - Collaborate with stakeholders to understand business requirements and translate them into technical specifications - Ensure data integrity, security, and compliance with industry standards and regulations - Optimize and tune database performance - Lead data integration and migration projects - Mentor and guide junior data architects and data engineers - Stay updated on the latest trends and technologies in data architecture and database management. Requirements: - Experience with big data technologies (e.g., Hadoop, Spark) - Knowledge of data governance and data quality best practices - Certification in data architecture or related fields. As part of the H-1B visa sponsorship process, there are various fees and expenses associated with filing and processing the application. While we are committed to supporting you through this process, we kindly request that candidates cover some of the costs, such as premium processing (if applicable) and dependent visa fees. This ensures a smooth and efficient process while aligning with our policies. As a gesture of goodwill, we will reimburse 50% of the fee after 4 years of continuous employment with us. If you are not comfortable with this arrangement, we kindly ask that you refrain from applying, as this is a necessary condition for sponsorship. Join the Bright Vision Technologies team and be part of an exciting journey to shape the future of mobile applications. As a Lead Data Architect, you will have the opportunity to work on innovative projects, collaborate with talented professionals, and contribute to the growth and success of our organization. Would you like to know more about this opportunity? For immediate consideration, please send your resume directly to Sweta Pandey at sweta@bvteck.com.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Hyderabad, Telangana
On-site
The Data Modeler (Erwin) plays a crucial role in the design and implementation of data models that meet organizational needs. You will be responsible for translating business requirements into well-structured, reusable data models while ensuring data integrity and efficiency. Working closely with stakeholders, you will gather data requirements and translate them into models that inform database architectures. Utilizing the Erwin tool, you will enhance data management strategies and ensure compliance with governance standards. Your role is vital because it supports the company's ability to make data-driven decisions and derive insights that align with strategic objectives. Key responsibilities: - Design and maintain logical and physical data models using Erwin. - Collaborate with business analysts and stakeholders to gather data requirements and translate business processes into comprehensive data models. - Ensure data integrity, quality, and security in all modeling activities and implement best practices for data governance and management. - Develop and update metadata associated with data models and provide technical support for database design and implementation. - Conduct data profiling and analysis to define requirements, create data flow diagrams and entity-relationship diagrams, and review and refine data models with stakeholders and development teams. - Perform impact analysis for changes in the modeling structure, train and mentor junior data modeling staff, and ensure compliance with data standards and regulations. - Collaborate with ETL developers to optimize data extraction processes and document modeling processes, methodologies, and standards for reference. Required qualifications: - Bachelor's degree in Computer Science, Information Technology, or a related field. - Minimum of 3 years of experience as a data modeler or in a related role, with proven expertise in using Erwin for data modeling. - Strong knowledge of relational databases and SQL, experience with data architecture and database design principles, and familiarity with data warehousing concepts and practices. - Ability to analyze complex data structures and recommend improvements, understanding of data governance frameworks and best practices, and excellent analytical and problem-solving skills. - Strong communication and documentation skills, the ability to work collaboratively in a team-oriented environment, experience with data integration and ETL processes, and the ability to manage multiple projects and deadlines effectively. - Familiarity with data visualization and reporting tools is a plus, along with a willingness to keep skills updated through ongoing training and learning; certification in data modeling or equivalent is desirable. Skills: entity-relationship diagrams, data modeling, documentation, database design principles, ETL processes, SQL, data integration, data architecture, DAX, data governance frameworks, data security, Power Query, relational databases, analytical thinking, problem-solving, data quality, communication, data warehousing, data flow diagrams, team collaboration, Erwin, data profiling.
Posted 1 week ago
12.0 - 17.0 years
0 Lacs
Hyderabad, Telangana
On-site
The candidate must have 12+ years of experience in IT or a relevant industry. Your main responsibility in this position is to design, implement, and maintain infrastructure and tools, with the primary goal of automating the provisioning and monitoring of DevOps and infrastructure in Azure and AWS. You will collaborate with the Release (CI/CD) team, the PMO team, QAI (CARS), and other team members to identify and implement automated build, test, and deployment processes in all environments. You will contribute to CI/CD development for deploying applications in Azure/AWS using tools like Terraform and Ansible, along with GitHub, TeamCity, Jenkins, etc. It will be your responsibility to troubleshoot issues with CI/CD pipelines, identify areas for improvement, and ensure the security and compliance of the CI/CD pipelines and infrastructure. You will develop and maintain scripts and tools to automate the CI/CD process, build one-click environments for any version of the software, including patches and hotfixes, and use Infrastructure as Code (IaC) and containerization to create reproducible deployments. Additionally, you will design and build systems for high availability, high throughput, data consistency, security, and end-user privacy. Furthermore, you will mentor other engineers, promote software engineering best practices, and embrace an automate-and-measure-everything mindset. You will devise solutions for scaling data systems to meet various business needs and collaborate in a dynamic environment. Qualifications: Education: - B.Tech/BE/MCA in Computer Science, Engineering, Mathematics, Physics, or another technical course of study, or Business with an MIS background. An MS degree is strongly preferred. Experience: - 12-17 years of experience managing and automating different environments. - 6+ years of DevOps experience with Azure or AWS. - 5+ years of cloud IaC expertise in Terraform, Ansible deployment, scripting, and automation. - Strong understanding of CI/CD tools and a desire to help teams release frequently. - Expertise in DevOps, DevSecOps, DataSecOps, and managing container-based infrastructure. - Experience with tools like TeamCity, Jenkins, Azure, Terraform, Ansible, and GitHub. - Strong end-to-end ownership and urgency, an understanding of cloud network models, and cloud architecture knowledge. - An interest in learning and innovating, accountability, coordination, and people management skills. Preferred requirements: - Certification in Azure or AWS. - Good communication skills. - HashiCorp Certified: Terraform Associate. - Familiarity with the SDLC, Agile methodologies, and working in the financial industry. Supervisory responsibility: - Individual contributor - Team lead - Manager of managers. Travel: - May be required on a limited basis.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
The Chief Data & Analytics Office (CDAO) at JPMorgan Chase is responsible for accelerating the firm's data and analytics journey. This includes ensuring the quality, integrity, and security of the company's data, as well as leveraging this data to generate insights and drive decision-making. The CDAO is also responsible for developing and implementing solutions that support the firm's commercial goals by harnessing artificial intelligence and machine learning technologies to develop new products, improve productivity, and enhance risk management effectively and responsibly. Within the CDAO, the Firmwide Chief Data Office (CDO) is responsible for maximizing the value and impact of data globally, in a highly governed way. It consists of several teams focused on accelerating JPMorgan Chase's data, analytics, and AI journey, including: data strategy, data impact optimization, privacy, data governance, transformation, and talent. As a Data Management Lead within the Firmwide CDO team, you will be responsible for working with stakeholders to define governance, guidance, risk metrics, and tooling requirements in support of the Data Ownership program and the overall Data Governance framework. In addition, you will be responsible for delivering on project milestones, tracking program adoption, and producing ongoing senior management reporting and metrics. Job Responsibilities: - Support the evolution and development of the Data Owner and Data Governance framework - Maintain and develop standards and procedures related to Data Governance, and support regulatory, compliance, and audit requests - Drive the definition of program requirements and deliverables for tools and metrics by identifying, driving, and participating in project milestones and phases, as well as specific action items - Coordinate execution activities, partnering with Technology and LOB/CF Chief Data Officer teams, and develop risk and operational metrics to facilitate oversight - Develop effective presentations and project update materials suitable for senior executives - Actively participate and collaborate in work stream meetings and ad hoc working sessions, including driving agendas, preparing meeting minutes, and keeping track of agreed actions - Monitor the details and status of the project, and escalate and resolve any project issues in an efficient manner - Develop training material, including online training courses, to support education and tooling. Required qualifications, capabilities, and skills: - Formal training or certification in Data Management concepts and 5+ years of applied experience - Hands-on experience writing and editing Standards, Control Procedures, Guidelines, and Training materials - Experience with Agile development methodologies and partnering with Development teams to deliver tools and solutions - Excellent communication skills (oral and written), interpersonal skills, and the ability to work effectively in cross-functional teams - Ability to manage complexity and multiple work streams while delivering solutions within a tight deadline - Excellent project management and organizational skills, with the ability to manage multiple deliverables simultaneously - Proficiency with project management tools and Microsoft Office, including Microsoft Excel, Word, and PowerPoint. Preferred qualifications, capabilities, and skills: - Knowledge of data management and governance - Bachelor's degree in Business, Finance, Economics, or another related area - Familiarity with developing or maintaining Risk, Program, or Operational Metrics - Familiarity with Tableau/visualization/workflow automation tools.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Maharashtra
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. As part of our EY-DnA team, you will help clients develop and implement data analyses, data collection systems, and other strategies that optimize statistical efficiency and quality. Working with the latest tools available in the market, you will help clients by acquiring data from primary or secondary data sources and maintaining databases/data systems to identify, analyze, and interpret trends or patterns in complex data sets. Using your expertise in data analytics, you will help our clients view data in context and make smarter business decisions to achieve improved products and services. We're looking for Gig professionals with expertise in Data & Analytics to join our dynamic Advisory DnA team. This is a fantastic opportunity to be part of a leading firm while being instrumental in the growth of a new service offering. Role overview: As a STIBO MDM Specialist, you will play a critical role in implementing and managing STIBO Master Data Management (MDM) solutions. You will collaborate with cross-functional teams to ensure seamless data management, integration, and governance, enabling data consistency and accuracy across the organization. Key responsibilities: - STIBO MDM Implementation: lead the design, development, and deployment of STIBO MDM solutions; configure data models, workflows, and hierarchies to meet business requirements. - Data Quality and Governance: define and implement data governance policies within the STIBO MDM framework; monitor and improve data quality by enforcing validation rules and workflows. - Integration and Mapping: integrate STIBO MDM with enterprise applications (ERP, CRM, etc.) using APIs or middleware; ensure data mapping, transformation, and synchronization across systems. - Collaboration and Training: work closely with business teams, data stewards, and IT teams to gather requirements and deliver solutions; provide training and support to end users on the STIBO MDM platform. - Issue and Risk Management: identify potential risks related to data inconsistencies or integration issues and propose mitigation strategies; troubleshoot and resolve technical issues in a timely manner. Required skills and experience: - In-depth knowledge of and hands-on experience with the STIBO MDM platform, including configuration and implementation. - Strong understanding of data modeling, data integration, and master data workflows in STIBO. - Proficiency in integrating MDM with enterprise systems using APIs, ETL tools, or middleware. - Expertise in data governance, quality management, and validation techniques. - Strong problem-solving and analytical skills to address complex data challenges. - Excellent communication skills to engage with business and technical stakeholders. What we look for: a team of people with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment; an opportunity to be part of a market-leading, multi-disciplinary team of 550+ professionals in the only integrated global advisory business worldwide; and opportunities to work with EY Advisory practices globally with leading businesses across a range of industries.
What working at EY offers: At EY, we're dedicated to helping our clients, from startups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to steer your development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: - Support, coaching, and feedback from some of the most engaging colleagues around. - Opportunities to develop new skills and progress your career. - The freedom and flexibility to handle your role in a way that's right for you. EY | Building a better working world: EY exists to build a better working world, helping to create long-term value for clients, people, and society, and to build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers to the complex issues facing our world today.
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
Vellore, Tamil Nadu
On-site
As a Data Engineer, you will be responsible for designing, developing, and optimizing data pipelines and ETL workflows using AWS Glue, AWS Lambda, and Apache Spark. Your role will involve implementing big data processing solutions utilizing AWS EMR and AWS Redshift. You will also be tasked with developing and maintaining data lakes and data warehouses on AWS, including S3, Redshift, and RDS. Ensuring data quality, integrity, and governance will be a key aspect of your responsibilities, achieved by leveraging the AWS Glue Data Catalog and AWS Lake Formation. It will be essential for you to optimize data storage and processing for both performance and cost efficiency. Working with structured, semi-structured, and unstructured data across various storage formats, such as Parquet, Avro, and JSON, will be part of your daily tasks. Automation and orchestration of data workflows using AWS Step Functions and Apache Airflow will also fall within your scope of work. You will be expected to implement best practices for CI/CD pipelines in data engineering with AWS CodePipeline and AWS CodeBuild. Monitoring, troubleshooting, and optimizing data pipeline performance and scalability will be critical to ensuring smooth operations. Collaborating with cross-functional teams, including data scientists, analysts, and software engineers, will be necessary to drive successful outcomes. This role requires a minimum of 6 years of experience in data engineering and big data processing. Proficiency in AWS cloud services like AWS Glue, AWS Lambda, AWS Redshift, AWS EMR, and S3 is paramount. Strong Python skills for data engineering tasks, hands-on experience with Apache Spark and SQL, and knowledge of data modeling, schema design, and performance tuning are essential. An understanding of AWS Lake Formation and Lakehouse principles, experience with version control using Git, and familiarity with CI/CD pipelines are also required, as is knowledge of data security, compliance, and governance best practices. Experience with real-time streaming technologies such as Kafka and Kinesis will be an added advantage. Strong problem-solving, analytical, and communication skills are key attributes for success in this role.
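For illustration, here is a minimal sketch of triggering the kind of Glue job this posting describes from a Lambda-style handler using boto3; the job name and arguments are hypothetical placeholders.

```python
# Minimal sketch: kick off a Glue job from AWS Lambda via boto3.
import boto3

glue = boto3.client("glue")

def handler(event, context):
    run = glue.start_job_run(
        JobName="curate-orders",                     # hypothetical Glue job
        Arguments={"--ingest_date": event["date"]},  # hypothetical job argument
    )
    return {"JobRunId": run["JobRunId"]}
```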
Posted 1 week ago
15.0 - 19.0 years
0 Lacs
Pune, Maharashtra
On-site
If you are seeking further opportunities to advance your career, you can take the next step in realizing your potential by joining HSBC. HSBC is a global banking and financial services organization operating in 62 countries and territories. The organization's goal is to be present where growth occurs, supporting businesses to thrive, economies to prosper, and individuals to achieve their aspirations. Currently, HSBC is looking for an experienced professional to join the team in the position of Data Technology Lead, focusing on Data Privacy. The role is open in Pune or Hyderabad. As a Data Technology Lead, you will be responsible for driving the strategy, engineering, and governance of Data Privacy technology within the CTO Data Technology function. Your role is crucial in ensuring the bank's compliance with complex global data privacy regulations and commitments to customer trust through scalable, automated, and resilient technology solutions. Your main responsibilities will include designing and executing enterprise-wide capabilities for classifying, protecting, governing, and monitoring personal and sensitive data throughout its lifecycle, from data discovery to secure deletion. This will involve integrating solutions across data platforms, operational systems, and third-party systems. You will lead cross-functional teams and collaborate closely with Group Privacy, Legal, Risk, Cybersecurity, and business-aligned CTOs to implement privacy-by-design practices across platforms and establish robust data protection standards. This leadership role is highly impactful, requiring expertise in privacy technologies, platform engineering, control automation, and global compliance. Key Responsibilities: - Define and lead the enterprise strategy for Data Privacy Technology across different data environments. - Design and implement technology capabilities to support privacy compliance frameworks such as GDPR. - Lead the development and integration of solutions for data classification, consent management, access controls, data subject rights fulfillment, data retention, and disposal. - Govern and oversee the privacy tooling landscape, ensuring robust metadata management and control enforcement. - Collaborate with Group CPO, Legal, and Compliance to translate regulatory mandates into implementable technology solutions. - Embed privacy-by-design principles into data pipelines, software development lifecycles, and DevSecOps practices. - Drive enterprise-wide adoption and monitor privacy controls" effectiveness using metrics, dashboards, and automated audits. - Lead engineering teams to deliver scalable services with resilience, performance, and observability. - Participate in regulatory engagements, internal audits, and risk forums related to data privacy controls. - Cultivate a high-performing team culture and advocate for privacy as a core design principle. Requirements: - 15+ years of relevant experience in enterprise data or technology roles, including senior leadership in data privacy, security, or compliance engineering. - Deep expertise in data privacy technologies, both product-based and in-house development. - Strong knowledge of global privacy regulations and frameworks, such as GDPR. - Technical proficiency in data discovery, classification, masking, encryption, and access control enforcement. - Understanding of metadata-driven architectures and control automation. - Experience in hybrid data estates, including data lakes and multi-cloud environments. 
- Proven track record of partnering with legal, compliance, and cybersecurity teams to implement privacy programs aligned with business needs. - Ability to lead multi-regional engineering teams, make architectural decisions, and deliver at scale. - Experience with platform monitoring, policy enforcement, and control assurance frameworks. Join HSBC to achieve more in your career. For more information and to explore opportunities, visit www.hsbc.com/careers. Please note that personal data provided by applicants will be handled in accordance with the Bank's Privacy Statement, available on the website.
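As an illustrative aside, one privacy-by-design control this role describes, pseudonymising PII before it lands in a data lake, might look like the following minimal Python sketch; the field names and the salt source are hypothetical.

```python
# Minimal sketch: salted SHA-256 pseudonymisation of PII fields, so equal
# inputs still join across datasets but raw PII never lands in the lake.
import hashlib
import os

SALT = os.environ.get("PII_SALT", "rotate-me")  # hypothetical secret source

def pseudonymise(value: str) -> str:
    return hashlib.sha256((SALT + value).encode("utf-8")).hexdigest()

record = {"customer_name": "A. Example", "email": "a@example.com", "balance": 1200}
masked = {key: pseudonymise(val) if key in {"customer_name", "email"} else val
          for key, val in record.items()}
print(masked)
```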
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
chennai, tamil nadu
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key business functions like Banking, Insurance, Manufacturing and Auto, Healthcare, Retail, Supply Chain, and Finance.

The opportunity
We're looking for Managers (Big Data Architects) with strong technology and data understanding and proven delivery capability. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your key responsibilities
- Develop standardized practices for delivering new products and capabilities using Big Data and cloud technologies, covering data acquisition, transformation, analysis, modelling, governance, and data management.
- Interact with senior client technology leaders to understand their business goals, create and propose solutions, estimate effort, build architectures, and develop and deliver technology solutions.
- Define and develop client-specific best practices around data management within a cloud environment.
- Recommend design alternatives for data ingestion, processing, and provisioning layers.
- Design and develop data ingestion programs to process large data sets in batch mode using ADB, ADF, PySpark, Python, and Synapse.
- Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming, and related technologies (see the sketch after this posting).
- Manage teams and deliver end-to-end, with experience in building technical capability and teams to deliver.

Skills and attributes for success
- Strong understanding of and familiarity with all Cloud Ecosystem components.
- Strong understanding of underlying Cloud Architectural concepts and distributed computing paradigms.
- Experience in the development of large-scale data processing.
- Hands-on programming experience in ADB, ADF, Synapse, Python, PySpark, and SQL.
- Hands-on expertise in cloud services like AWS and/or the Microsoft Azure ecosystem.
- Solid understanding of ETL methodologies in a multi-tiered stack with data modelling and data governance.
- Experience with BI and data analytics databases.
- Experience in converting business problems/challenges into technical solutions, considering security, performance, scalability, and similar constraints.
- Experience in enterprise-grade solution implementations.
- Experience in performance benchmarking enterprise applications.
- Strong stakeholder, client, team, process, and delivery management skills.

To qualify for the role, you must have
- A flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
- Excellent written and verbal communication skills, formal and informal.
- The ability to multi-task under pressure and work independently with minimal supervision.
- A team-player attitude and enjoyment of a cooperative and collaborative team environment.
- Adaptability to new technologies and standards.
- Participation in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support.
- Minimum 7 years of hands-on experience in one or more of the above areas.
- Minimum 10 years of industry experience.

Ideally, you'll also have
- Project management skills.
- Client management skills.
- Solutioning skills.

What we look for
People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What working at EY offers
At EY, we're dedicated to helping our clients, from startups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around.
- Opportunities to develop new skills and progress your career.
- The freedom and flexibility to handle your role in a way that's right for you.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
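The responsibilities above name real-time ingestion with Apache Kafka and Spark Streaming. The PySpark Structured Streaming sketch below shows the basic pattern; the broker address, topic, schema, and sink paths are placeholders, and the Kafka connector package is assumed to be available on the cluster.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("kafka-ingest-sketch").getOrCreate()

# Assumed event schema; in practice this would come from a schema registry.
schema = StructType([
    StructField("event_id", StringType()),
    StructField("payload", StringType()),
    StructField("event_time", TimestampType()),
])

# Subscribe to a placeholder Kafka topic; broker and topic names are illustrative.
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")
       .option("subscribe", "live-events")
       .load())

# Kafka delivers bytes; cast the value to a string and parse the JSON into columns.
events = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(from_json(col("json"), schema).alias("e"))
          .select("e.*"))

# Land micro-batches to storage; the checkpoint location gives restart safety.
query = (events.writeStream
         .format("parquet")
         .option("path", "/mnt/landing/live-events")
         .option("checkpointLocation", "/mnt/checkpoints/live-events")
         .start())
query.awaitTermination()
```

On Azure Databricks the same pattern would usually land into Delta tables rather than raw Parquet, but the read-parse-write structure is unchanged.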
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
maharashtra
On-site
As a highly skilled and results-driven AdTech and Data Monetization Specialist, your role will involve leading and optimizing our digital advertising technology stack to unlock revenue opportunities through data-driven strategies. You will be responsible for managing our AdTech infrastructure, developing monetization strategies, and collaborating with cross-functional teams to enhance advertising performance and maximize the value of first-party and third-party data assets.

Key Responsibilities:
- Design, implement, and optimize AdTech solutions, including DSPs, SSPs, DMPs, CDPs, and ad servers like Google Ad Manager, The Trade Desk, and Adobe Experience Platform.
- Develop and execute data monetization strategies leveraging audience segmentation, lookalike modeling, and programmatic advertising.
- Manage partnerships with ad networks, exchanges, and data marketplaces.
- Collaborate with Data, Product, and Marketing teams to ensure effective collection, enrichment, and activation of data for advertising and revenue purposes.
- Oversee yield optimization for programmatic and direct-sold campaigns.
- Stay updated on AdTech trends, regulatory changes (e.g., GDPR, CCPA), and competitive benchmarks through market research.
- Define and monitor key KPIs including fill rate, CPM, eCPM, CTR, data revenue, and ROI (a worked example of the core KPI arithmetic follows this posting).
- Evaluate and onboard new platforms or partners to increase data monetization potential.
- Ensure compliance with privacy regulations and internal data governance policies.

Requirements:
- Bachelor's degree in Marketing, Computer Science, Data Science, Business, or a related field.
- 5+ years of experience in AdTech, digital advertising, or data monetization.
- Strong understanding of programmatic advertising ecosystems, real-time bidding, and data privacy laws.
- Proficiency with tools like Google Ad Manager, The Trade Desk, Segment, Snowflake, or LiveRamp.
- Experience with data analytics platforms such as Google Analytics, Looker, or Tableau.
- Strong project management, analytical, and communication skills.

What We Value in Our People:
- Taking quick decisions and delivering with accuracy.
- Showing ownership, making things happen, and building solutions.
- Seeking continuous learning and taking pride in the work done.
- Acting in the interest of the organization and empowering others to grow.
- Empowering others with tough love and helping them develop.
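Fill rate, eCPM, and CTR from the KPI list above follow standard definitions; the numbers in this sketch are invented purely to show the arithmetic.

```python
# Standard ad-monetization KPI formulas; all input figures are made up.
ad_requests = 1_000_000   # requests sent to the ad server
impressions = 820_000     # requests actually filled with an ad
revenue_usd = 2_460.00    # earnings attributed to those impressions
clicks = 12_300

fill_rate = impressions / ad_requests       # share of requests monetized
ecpm = revenue_usd / impressions * 1000     # effective revenue per 1,000 impressions
ctr = clicks / impressions                  # click-through rate

print(f"fill rate: {fill_rate:.1%}")   # 82.0%
print(f"eCPM: ${ecpm:.2f}")            # $3.00
print(f"CTR: {ctr:.2%}")               # 1.50%
```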
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
haryana
On-site
As a Lead Consultant, Data Analyst at Genpact, you will be responsible for defining and developing data quality requirements and controls for Financial Crime & Risk Management systems. Your role will involve collaborating with Technology and Business teams to gather data requirements, analyzing Source to Target Mapping documents, and engaging stakeholders to resolve Data Quality (DQ) issues. You will also play a key role in performing root cause analysis across data issues, developing test scripts for various testing phases, and streamlining the DQ exception management process.

Your qualifications should include a university degree, preferably in technology/engineering, and experience working in a financial institution, preferably a global bank. Additionally, you should have expertise in the data lifecycle, data governance, and data quality, and experience with SQL and Python programming, particularly in Oracle-based tools (a generic sketch of such a DQ control follows this posting). Knowledge of Compliance concepts such as Anti-Money Laundering (AML), Know Your Customer (KYC), and Customer Risk Rating is essential. Experience with the Informatica ETL tool and project management would be advantageous.

This is a full-time position based in Gurugram, India. If you are a proactive individual with a strong background in data analysis and a passion for driving data quality improvements in the financial services domain, we invite you to apply for this exciting opportunity with Genpact. Please note that this job posting was created on Feb 20, 2025, and the unposting date is scheduled for Aug 18, 2025.
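As a generic illustration of the SQL-plus-Python DQ controls this role describes, the sketch below runs rule-driven completeness checks; the rule names and customer table are hypothetical, and sqlite3 stands in for an Oracle connection.

```python
import sqlite3

# In-memory stand-in for the source database; schema and rows are invented.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customer (id INTEGER, risk_rating TEXT, country TEXT);
    INSERT INTO customer VALUES (1, 'HIGH', 'GB'), (2, NULL, 'IN'), (3, 'LOW', NULL);
""")

# Each DQ rule counts exception rows where a mandatory field is missing.
DQ_RULES = {
    "risk_rating_populated": "SELECT COUNT(*) FROM customer WHERE risk_rating IS NULL",
    "country_populated": "SELECT COUNT(*) FROM customer WHERE country IS NULL",
}

for rule, sql in DQ_RULES.items():
    exceptions = conn.execute(sql).fetchone()[0]
    status = "PASS" if exceptions == 0 else f"FAIL ({exceptions} exceptions)"
    print(f"{rule}: {status}")
```

In an exception management process, the FAIL rows would be routed to data owners for remediation rather than just printed.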
Posted 1 week ago
3.0 - 15.0 years
7 - 25 Lacs
hyderabad, chennai, bengaluru
Work from Office
Roles and Responsibilities:
- Design, develop, and maintain metadata frameworks for data governance across various business units.
- Collaborate with stakeholders to identify and define data quality requirements, ensuring compliance with industry standards.
- Develop and implement effective data governance policies, procedures, and best practices to ensure high-quality data management.
- Provide training and support to end-users on metadata management tools and processes.

Job Requirements:
- 3-15 years of experience in Metadata Management or a related field (Data Governance).
- Strong understanding of data quality principles, including data validation, cleansing, and profiling techniques (a minimal profiling sketch follows this posting).
- Experience with developing metadata frameworks using industry-standard tools such as Informatica IDQ or similar technologies.

Location: PAN India
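Column profiling, one of the techniques the requirements mention, reduces to simple per-column statistics. The sketch below uses plain Python on an invented dataset rather than a tool like Informatica IDQ, which automates the same idea at scale.

```python
# A minimal column-profiling pass; the dataset is illustrative only.
rows = [
    {"account_id": "A1", "segment": "retail", "balance": "120.50"},
    {"account_id": "A2", "segment": None,     "balance": "88.00"},
    {"account_id": "A3", "segment": "retail", "balance": None},
]

for column in rows[0].keys():
    values = [r[column] for r in rows]
    non_null = [v for v in values if v is not None]
    profile = {
        "null_rate": 1 - len(non_null) / len(values),  # completeness
        "distinct": len(set(non_null)),                 # cardinality
    }
    print(column, profile)
```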
Posted 1 week ago
5.0 - 10.0 years
20 - 25 Lacs
mumbai
Work from Office
We are looking for a skilled Business Analyst to support remediation activities related to Account Reference Data. The ideal candidate will collaborate closely with SMEs and other stakeholders to ensure data accuracy, consistency, and compliance with internal standards.

Key Responsibilities:
- Work with Subject Matter Experts (SMEs) to gather, analyze, and document requirements related to account remediation.
- Investigate and cleanse Account Reference Data to ensure completeness and correctness.
- Identify data quality issues and drive resolutions in collaboration with relevant teams.
- Support remediation initiatives by providing detailed analysis and impact assessments.
- Prepare clear documentation, reports, and status updates for project stakeholders.
- Ensure alignment with regulatory requirements and internal data governance policies.

Requirements:
- Proven experience as a Business Analyst, preferably in a remediation or data quality-focused role.
- Strong understanding of account structures and account reference data.
- Ability to work independently and liaise effectively with SMEs and cross-functional teams.
- Excellent analytical, problem-solving, and communication skills.
- Experience working in a banking or financial services environment is a plus.
Posted 1 week ago