
4996 Data Governance Jobs - Page 40

Set up a job alert
JobPe aggregates results for easy access to applications, but you apply directly on the original job portal.

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

Mentoring and advising multiple technical teams while driving financial technologies forward is a significant challenge with substantial impact. The role of Senior Manager of Software Engineering at JPMorgan Chase within Consumer and Community Banking - Data Technology places you in a leadership position. Your responsibilities include providing technical coaching and advice to multiple technical teams and anticipating the needs and potential dependencies of other functions within the firm. Your expertise enables you to influence budget and technical considerations to enhance operational efficiencies and functionality.

As the Senior Manager of Software Engineering, you will:
- Provide overall direction, oversight, and coaching for a team of entry-level to mid-level software engineers handling basic to moderately complex tasks.
- Be accountable for decisions affecting team resources, budget, tactical operations, execution, and implementation of processes and procedures.
- Ensure successful collaboration across teams and stakeholders, identifying and mitigating issues and escalating them as necessary.
- Offer input to leadership on budget, approach, and technical considerations to enhance operational efficiencies and functionality for the team.
- Create a culture of diversity, equity, inclusion, and respect for team members, giving priority to diverse representation.
- Enable the Gen AI platform and implement Gen AI use cases, LLM fine-tuning, and multi-agent orchestration.
- Manage an AI/ML engineering scrum team comprising ML engineers, senior ML engineers, and a lead ML engineer.

Required qualifications, capabilities, and skills:
- Formal training or certification in software engineering concepts with over 5 years of applied experience.
- Extensive practical experience with Python and AWS cloud services, including EKS, EMR, ECS, and DynamoDB.
- Experience in Databricks ML lifecycle development.
- Advanced knowledge in software engineering, AI/ML, machine learning operations (MLOps), and data governance.
- Demonstrated experience leading complex projects, encompassing system design, testing, and operational stability.
- Expertise in computer science, computer engineering, mathematics, or a related technical field.
- Understanding of large language model (LLM) approaches, such as Retrieval-Augmented Generation (RAG).

Preferred qualifications, capabilities, and skills:
- Real-time model serving experience with Seldon, Ray, or AWS SM.
- Experience in agent-based modeling.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Haryana

On-site

Every day, tens of millions of people come to Roblox to explore, create, play, learn, and connect with friends in 3D immersive digital experiences, all created by our global community of developers and creators. Roblox is dedicated to building the tools and platform that empower the community to bring any experience they can imagine to life. The vision at Roblox is to reimagine the way people come together from anywhere in the world and on any device. The mission is to connect a billion people with optimism and civility, and the company is seeking exceptional talent to help achieve this goal.

As an Engineering Manager for the Roblox Operating System (ROS) team in India, you will be responsible for establishing and leading a high-performing data engineering team. This role will involve collaborating with US-based ROS and Data Engineering teams, as well as the People Science & Analytics team, to build scalable data pipelines, robust infrastructure, and impactful insights. Working closely with the US ROS Engineering Manager, you will set high technical standards, promote leadership principles, and drive innovation to shape the future of data engineering at Roblox.

**Key Responsibilities:**
- **Building and Leading:** Attract, hire, mentor, and inspire a team of exceptional engineers with varied strengths. Foster a collaborative and inclusive environment where all team members can thrive.
- **Setting the Bar:** Establish and maintain a high standard for technical excellence and data quality. Ensure the team delivers reliable, scalable, and secure solutions that align with Roblox's engineering principles. Be prepared to be hands-on, contributing to code reviews and technical design discussions.
- **Cross-Functional Collaboration:** Partner with data scientists, analysts, product, and engineering teams to understand business needs and translate them into technical solutions.
- **Strategic Planning:** Contribute to the overall engineering strategy for Roblox India. Identify opportunities for innovation and growth, and prioritize projects that deliver maximum value to users.
- **Continuous Improvement:** Foster a culture of learning and continuous improvement within the team. Encourage experimentation, knowledge sharing, and the adoption of new technologies.

**Requirements:**
- **Proven Leadership:** Demonstrated experience in leading and scaling data engineering teams, preferably in a high-growth environment.
- **Technical Expertise:** Solid understanding of data engineering principles and best practices for data governance. Experience in building scalable data pipelines using orchestration frameworks like Airflow. Proficiency in SQL, relational databases, data warehouse solutions (e.g., Snowflake, Redshift, BigQuery), and data streaming platforms (e.g., Kafka, Kinesis, Spark). Knowledge of containerization (e.g., Docker) and cloud infrastructure (e.g., AWS, Azure, GCP).
- **Roblox Alignment:** Strong alignment with Roblox's leadership principles focusing on respect, safety, creativity, and community.
- **Excellent Communication:** Exceptional communication and interpersonal skills to build relationships with team members, stakeholders, and leaders.
- **Problem-Solving:** Strong analytical and problem-solving skills to tackle complex challenges and develop creative solutions.
- **Passion for Roblox:** Genuine excitement for the Roblox platform and the possibilities of the metaverse.

Please note that roles based in our San Mateo, CA headquarters require in-office presence on Tuesday, Wednesday, and Thursday, with optional in-office days on Monday and Friday.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As a Principal Consultant in Finance Data specializing in Data Modeling, you will play a pivotal role in designing and developing finance data models within the Finance Data space. Your contribution will be essential to the Finance Data program's objective of establishing a self-service platform for the distribution of reliable data for the Finance sector. Collaborating with senior stakeholders across the bank, you will be tasked with dissecting intricate challenges and devising innovative solutions. You will be an integral part of a dynamic and supportive team dedicated to shaping the future of data modeling and architecture design in the financial data realm.

Your responsibilities will include assisting in the design and development of conceptual, logical, and application data models aligned with the organization's Future State Finance Data Asset Strategy. Working closely with Finance business teams, you will enhance understanding, interpretation, design, and implementation while supporting the transition to target-state data models and Data Asset delivery. Ensuring that Finance data models conform to Enterprise data models and comply with Enterprise Architecture principles will be a key aspect of your role. As a subject matter expert in Finance data modeling, you will contribute to model development planning, governance forums, and continuous improvement efforts within the data modeling estate. Your mandate will also encompass translating Finance business requirements into data modeling solutions, conducting audits of data models, providing technical advice, and communicating data modeling solutions to various audiences. Additionally, you will be expected to support cross-program and cross-group Data Assets execution and delivery strategies.

The ideal candidate will exhibit self-motivation, problem-solving skills, and strong collaboration abilities. Adaptability, excellent communication skills, and the capacity to work across multiple Business Functions concurrently are essential. A minimum of 5 years of experience in data management and modeling within the Financial Services sector, familiarity with Agile methodologies, and proficiency in data modeling tools are mandatory requirements.

If you are enthusiastic about data privacy, possess an entrepreneurial spirit, and are keen to contribute to the evolution of data modeling in the financial services industry, we look forward to your application. Join us at Capco, where you can engage in transformative projects with leading global banks and make a significant impact in the financial services sector.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

The Big Data Developer role is integral to our organization, involving the design, construction, and management of extensive data processing systems. Your primary responsibility will be to convert raw data into valuable insights that drive strategic decision-making throughout various departments. Working closely with data analysts, data scientists, and stakeholders, you will develop scalable data pipelines, enhance database architecture, and maintain data quality and accessibility. As businesses increasingly rely on data for decision-making, your expertise in handling large volumes of structured and unstructured data will be pivotal in helping the organization gain a competitive advantage through analytics. This position demands a blend of technical proficiency and a deep understanding of business operations to create data-driven solutions that cater to our diverse client base.

Key Responsibilities:
- Develop and maintain scalable data processing pipelines.
- Integrate data from multiple sources into a unified database structure.
- Optimize current data systems for improved performance and scalability.
- Analyze extensive datasets to identify trends and offer insights.
- Design and implement ETL processes for data transformation and loading.
- Collaborate with data scientists and analysts to refine data requirements.
- Write and optimize SQL queries for data retrieval and manipulation.
- Utilize Hadoop frameworks for managing and processing big data.
- Monitor system performance and troubleshoot issues in real time.
- Implement data security and governance measures.
- Participate in code reviews and maintain programming standards.
- Develop documentation for data architecture and processes.
- Stay abreast of emerging technologies in the big data field.
- Contribute to cross-functional team projects and initiatives.
- Provide training and support to team members on best practices.

Required Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Demonstrated experience as a Big Data Developer or in a similar capacity.
- Proficiency in programming languages like Java and Python.
- Hands-on experience with big data tools such as Hadoop, Spark, and Kafka.
- Strong SQL skills and familiarity with NoSQL databases like MongoDB.
- Knowledge of cloud platforms like AWS, Azure, or Google Cloud.
- Understanding of data warehousing concepts and design.
- Strong problem-solving abilities and analytical thinking.
- Experience with data integration and ETL tools.
- Familiarity with data visualization tools is advantageous.
- Ability to work under pressure and meet tight deadlines.
- Excellent communication skills for effective team collaboration.
- Capacity to adapt to new technologies and commit to continuous learning.
- Exposure to Agile methodologies and development processes.
- Understanding of data governance and compliance standards.
- Attention to detail and strong organizational skills.

Skills: data security, agile methodologies, data visualization, team collaboration, AWS, data processing, big data, Azure, data governance, Hadoop, problem-solving, NoSQL, SQL, Spark, Google Cloud, ETL, MongoDB, data warehousing, Java, Kafka, data integration, data modeling, Python

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

Job Description: As a STIBO MDM/PIM Developer located in Bengaluru, you will be responsible for managing Master Data and Product Information using STIBO on a full-time, on-site basis. Your day-to-day tasks will involve ensuring the smooth functioning of the MDM system and PIM processes.

To excel in this role, you should possess a deep understanding of concepts related to MDM or other enterprise platforms such as PIM, CMDM, SMDM, or Multi-domain MDM. Your expertise in STIBO MDM Strategy, Architecture, Design, Integration, Data Quality, Data Governance, and familiarity with Data Stewardship activities will be crucial in successfully carrying out your responsibilities.

Collaboration with customers to gather and comprehend requirements, as well as the ability to architect technical designs and solutions, will be key aspects of your role. You will also take the lead as a technical architect in implementing these solutions to meet the organization's needs effectively.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Informatica Data Quality
Good to have skills: NA
Minimum experience required: 3 year(s)
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with team members to enhance data workflows and contribute to the overall efficiency of data management practices.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the design and implementation of data architecture to support data initiatives.
- Monitor and optimize data pipelines for performance and reliability.

Professional & Technical Skills:
- Must-have skills: Proficiency in Informatica Data Quality.
- Strong understanding of data integration techniques and ETL processes.
- Experience with data profiling and data cleansing methodologies.
- Familiarity with database management systems and SQL.
- Knowledge of data governance and data quality best practices.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Informatica Data Quality.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Posted 2 weeks ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Microsoft Azure Data Services
Good to have skills: NA
Minimum experience required: 2 year(s)
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems, contributing to the overall efficiency and reliability of data management within the organization.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather requirements and deliver data solutions that meet business needs.
- Monitor and optimize data pipelines for performance and reliability.

Professional & Technical Skills:
- Must-have skills: Proficiency in Microsoft Azure Data Services.
- Good-to-have skills: Experience with Azure Data Factory, Azure SQL Database, and Azure Synapse Analytics.
- Strong understanding of data modeling and database design principles.
- Experience with data integration and ETL tools.
- Familiarity with data governance and data quality best practices.

Additional Information:
- The candidate should have a minimum of 2 years of experience in Microsoft Azure Data Services.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Posted 2 weeks ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Hyderabad

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Informatica MDM
Good to have skills: NA
Minimum experience required: 2 year(s)
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the specified needs. You will also engage in problem-solving discussions, providing insights and solutions to enhance application performance and user experience. Your role will require you to stay updated with the latest technologies and methodologies to ensure the applications are built using best practices.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Facilitate communication between technical teams and stakeholders to ensure alignment on project goals.
- Mentor junior team members, providing guidance and support in their professional development.

Professional & Technical Skills:
- Must-have skills: Proficiency in Informatica MDM.
- Strong understanding of data integration processes and methodologies.
- Experience with data quality management and data governance practices.
- Familiarity with database management systems and data modeling techniques.
- Ability to troubleshoot and resolve application issues efficiently.

Additional Information:
- The candidate should have a minimum of 2 years of experience in Informatica MDM.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Posted 2 weeks ago

Apply

15.0 - 20.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Project Role: Technology Consulting Practitioner
Project Role Description: Advises, leads and works on high-impact activities within the systems development lifecycle, and provides advisory work for the IT function itself.
Must have skills: SAP Master Data Governance (MDG) Tool
Good to have skills: NA
Minimum experience required: 5 year(s)
Educational Qualification: 15 years of full-time education

Summary: As a Technology Consulting Practitioner, you will advise, lead, and engage in high-impact activities throughout the systems development lifecycle. Your typical day will involve collaborating with various teams to ensure effective implementation of strategies, providing insights to enhance IT functions, and driving initiatives that align with organizational goals. You will also be responsible for managing project timelines and ensuring that deliverables meet quality standards, all while fostering a collaborative environment that encourages innovation and problem-solving.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and adjust strategies as necessary to meet objectives.

Professional & Technical Skills:
- Must-have skills: Proficiency in the SAP Master Data Governance (MDG) Tool.
- Strong understanding of data governance principles and practices.
- Experience with data integration and management processes.
- Ability to analyze and optimize master data processes.
- Familiarity with regulatory compliance related to data management.

Additional Information:
- The candidate should have a minimum of 5 years of experience with the SAP Master Data Governance (MDG) Tool.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office

- Assist clients in the selection, implementation, and support of SAP MDG.
- Participate in SAP MDG implementations and develop master data maintenance solutions using the latest technology and tools.
- Design and lead the overall strategic and tactical data migration, data cleansing, and data profiling initiatives, as well as data warehouse design, modeling, and implementation for our customers.
- Experience in implementation planning, fit analysis, configuration, testing, rollout, and post-implementation support.
- Good knowledge of and experience with Data Management Tools and a thorough understanding of SAP integration points.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Minimum 12 years of industry experience.
- Technical knowledge of and experience working with SAP Data Management Tools such as Data Services, Cockpit, MDG, and HANA EIM.
- Hands-on MDG configuration experience, including configuration related to customer and core STE processes.
- Experience in IM architecture, data migrations, data profiling, and data quality.
- Implementation experience of MDG in key domains such as Finance, Customer, Supplier, Material, and Business Partners.
- Experience in implementation, development, or configuration of one or more solutions from the Data Management Suite: SAP Data Services, SAP MDG, Migration Cockpit, HANA EIM SDI.
- Experience writing scripts and complex SQL statements.

Preferred technical and professional experience:
- Experience in data migration methodologies, specifically legacy-to-SAP migration using solutions like Data Services, is nice to have.
- Implementation experience and knowledge in at least two of these areas would be an added advantage: Master Data Management.

Posted 2 weeks ago

Apply

2.0 - 5.0 years

9 - 13 Lacs

Gurugram

Work from Office

Develop, test, and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, information management, and associated technologies. Communicate risks and ensure understanding of these risks.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Graduate with a minimum of 5+ years of related experience.
- Experience in modelling and business system designs.
- Good hands-on experience with DataStage and cloud-based ETL services.
- Strong expertise in writing T-SQL code.
- Well-versed in data warehouse schemas and OLAP techniques.

Preferred technical and professional experience:
- Ability to manage and make decisions about competing priorities and resources, and to delegate where appropriate.
- Must be a strong team player and leader, with the ability to lead data transformation projects with multiple junior data engineers.
- Strong oral, written, and interpersonal skills for interacting at all levels of the organization.
- Ability to communicate complex business problems and technical solutions.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

12 - 16 Lacs

Bengaluru

Work from Office

Develop, test, and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, information management, and associated technologies. Communicate risks and ensure understanding of these risks.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Graduate with a minimum of 5+ years of related experience.
- Experience in modelling and business system designs.
- Good hands-on experience with DataStage and cloud-based ETL services.
- Strong expertise in writing T-SQL code.
- Well-versed in data warehouse schemas and OLAP techniques.

Preferred technical and professional experience:
- Ability to manage and make decisions about competing priorities and resources, and to delegate where appropriate.
- Must be a strong team player and leader, with the ability to lead data transformation projects with multiple junior data engineers.
- Strong oral, written, and interpersonal skills for interacting at all levels of the organization.
- Ability to communicate complex business problems and technical solutions.

Posted 2 weeks ago

Apply

5.0 - 8.0 years

8 - 12 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

The Product Owner is a member of both the Product Management team and an Engineering delivery team and is responsible for feature requirements from definition to delivery. The Product Owner must understand the market, the customers, and the strategy to make sound decisions, work with key stakeholders throughout the organization, and communicate decisions and progress throughout the implementation process. The Product Owner must also have a thorough understanding of the overall product vision and strategy and convey that to the delivery team. This includes defining for the team how each feature provides customer value and how it fits into the larger picture. The Product Owner must actively engage with the delivery team, including engineering, quality assurance, and documentation, to ensure a successful release. They are also responsible for enabling internal teams such as Sales Engineering, Services, and Support on new features.

What you will do:
- Work with Product Managers to understand and validate customer needs and understand the Product Vision and Product Roadmap.
- Support the Product Managers in requirements validation by providing subject matter expertise, facilitating and documenting requirements validation reviews, and validating the requirements with customers.
- Break down Master Features into Features and Features into individual requirements and acceptance criteria, ensuring there is customer value in each individual feature/requirement.
- Prioritize the product backlog according to business value.
- Convey the requirements and customer value of features to the delivery teams at the beginning of each feature implementation.
- Ensure the engineering backlog addresses all detailed requirements by attending and advising in engineering backlog refinement sessions.
- Together with the delivery team, commit backlog items to sprint deliverables in Sprint Planning.
- Work with the delivery team to address questions and concerns during sprints related to product functionality, always making sure requirements are being met and scope is maintained.
- Make decisions during implementation, should they arise, around requirements, scope, and defects introduced, balancing customer value and timelines.
- Enable delivered features to internal and external teams.
- Other duties may be assigned.

What we are looking for:
- 3+ years working in a scrum agile software development environment; previous product owner experience preferred.
- 5+ years of enterprise software product management and/or business analyst experience.
- Outstanding communication, presentation, and leadership skills, with the ability to present to various audiences, from individuals on scrum teams to stakeholders and customers.
- BS in Computer Science or equivalent work experience.
- Experience in the master data management, data quality, data governance, or metadata management software space.
- Experience with SaaS software delivery.

Posted 2 weeks ago

Apply

10.0 - 15.0 years

25 - 30 Lacs

Pune

Work from Office

Technical Leadership:
- Stay abreast of emerging AI technologies and trends.
- Guide teams in the use of tools such as Python, TensorFlow, PyTorch, Azure ML, and LLM platforms.
- Oversee architecture decisions, model lifecycle management, and data governance.
- Promote experimentation and rapid prototyping to accelerate innovation.

To succeed, you will need:

Experience requirements:
- 10+ years of experience in AI leadership roles.
- Proven track record of managing teams.
- Experience in building and scaling high-performing teams.
- Strong background in service delivery frameworks (ITIL, Agile, DevOps).
- Experience in managing multi-location or global delivery teams.
- Demonstrated success in driving operational efficiency and excellence.
- Hands-on experience with AI/ML tools and platforms (e.g., Python, TensorFlow, PyTorch, Azure ML).
- Experience with data engineering, model deployment, and MLOps.
- Familiarity with LLMs, generative AI, and prompt engineering.
- Experience with cloud-native architectures and microservices.
- Knowledge of compliance and data privacy regulations (GDPR, HIPAA).
- Experience with enterprise AI platforms (e.g., Databricks).
- Knowledge of AI ethics, bias mitigation, and responsible AI practices.

Leadership traits:
- Experience in change management and organizational transformation.
- Exposure to leadership coaching or mentoring programs.
- You are passionate and have a big-picture vision for this role.
- You have excellent communication skills, both verbal and written.
- You are entrepreneurial and open to different cultures.
- You are customer-focused, enthusiastic, and professional.
- You can work under time pressure and respect deadlines.
- You can integrate smoothly into the existing team and stimulate knowledge sharing among your colleagues.
- You can collaborate easily with colleagues from other business functions in the Global IT Hub and GECIA.
- You also maintain good relations with third parties.

In return, we offer an opportunity to shape the future of AI in a global organization. This is a strategic role focused on defining the roadmap for the artificial intelligence competence and achieving it.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

12 - 15 Lacs

Bengaluru

Work from Office

Skill set required:
- 5 years of experience in ETL design, development, and business system design.
- Good understanding of AWS networking concepts: VPC, subnets, routing, NAT gateways, security groups.
- Proficiency in IAM concepts: roles, policies, assume-role, and cross-account access.
- Hands-on experience with AWS services including, but not limited to, S3, KMS, Lambda, Glue (or Spark), SQS, and EventBridge.
- Hands-on experience writing clean, modular, testable, and maintainable code using functional or OOP principles.
- Good experience in at least one programming language such as Python, Scala, or Java.
- Willingness to learn Python/PySpark if not already experienced.
- Good understanding of Spark architecture, execution plans, and performance tuning.
- Strong SQL skills and experience with Redshift and/or Athena, or other distributed compute engines.
- Experience writing and maintaining Terraform scripts.
- Familiarity with Git, GitHub workflows, and CI/CD tools like Jenkins.
- Exposure to data lake architectures and data governance practices.
- Familiarity with monitoring and logging tools (e.g., CloudWatch).

Posted 2 weeks ago

Apply

15.0 - 20.0 years

20 - 25 Lacs

Bengaluru

Work from Office

Description & Requirements

Introduction: A Career at HARMAN Technology Services (HTS)

We're a global, multi-disciplinary team that's putting the innovative power of technology to work and transforming tomorrow. At HARMAN HTS, you solve challenges by creating innovative solutions:
- Combine the physical and digital, making technology a more dynamic force to solve challenges and serve humanity's needs.
- Work at the convergence of cross-channel UX, cloud, insightful data, IoT, and mobility.
- Empower companies to create new digital business models, enter new markets, and improve customer experiences.

Role: Power BI Architect

Power BI Architect experience: A highly senior Power BI developer, skilled in Power BI report design and development; 15 years of experience preferred. Responsibilities include:
- Lead development teams.
- Manage semantic model design and development.
- Manage the release process and set up the development strategy.
- Review code of more junior members.
- Use Tabular Editor for model development and GitHub for version control.
- Power BI Desktop experience at an enterprise level with large data sets; proficient in DAX.
- Architect comprehensive data solutions aligned with business goals.
- Design, develop, and manage Power BI dashboards/semantic models to meet specific business requirements.
- Define data architecture strategy based on business reporting needs.
- Analyse and interpret complex data to create clear visualizations and reports.
- Collaborate with cross-functional teams to define business intelligence needs.
- Provide technical expertise and recommendations for data management and improvement.
- Ensure compliance with data governance and data security requirements.
- Proven experience in Power BI development and architecture.
- Optimize performance of dashboards and models.

Skills:
1. Deep knowledge of database management and data modelling
2. Expertise in Redshift, DAX, SQL, and other analytical tools
3. Expertise in Power BI setup and management
4. Expertise in AWS Cloud Services and Azure Cloud Services
5. Strong critical thinking and problem-solving capabilities
6. Experience in leading and mentoring development teams

Educational Qualification: Experience working in cross-functional teams and collaborating effectively with different stakeholders. Strong problem-solving and analytical skills. Excellent communication skills to document and present technical concepts clearly. Bachelor's or Master's degree in computer science or a related field with 10+ years of relevant industry experience.

What We Offer: Access to employee discounts on world-class HARMAN/Samsung products (JBL, Harman Kardon, AKG, etc.). Professional development opportunities through HARMAN University's business and leadership academies. An inclusive and diverse work environment that fosters and encourages professional and personal development.

You Belong Here: HARMAN is committed to making every employee feel welcomed, valued, and empowered. No matter what role you play, we encourage you to share your ideas, voice your distinct perspective, and bring your whole self with you, all within a support-minded culture that celebrates what makes each of us unique. We also recognize that learning is a lifelong pursuit and want you to flourish. We proudly offer added opportunities for training, development, and continuing education, further empowering you to live the career you want.

About HARMAN: Where Innovation Unleashes Next-Level Technology. Ever since the 1920s, we've been amplifying the sense of sound. Today, that legacy endures, with integrated technology platforms that make the world smarter, safer, and more connected. Across automotive, lifestyle, and digital transformation solutions, we create innovative technologies that turn ordinary moments into extraordinary experiences. Our renowned automotive and lifestyle solutions can be found everywhere, from the music we play in our cars and homes to venues that feature today's most sought-after performers, while our digital transformation solutions serve humanity by addressing the world's ever-evolving needs and demands. Marketing our award-winning portfolio under 16 iconic brands, such as JBL, Mark Levinson, and Revel, we set ourselves apart by exceeding the highest engineering and design standards for our customers, our partners, and each other. If you're ready to innovate and do work that makes a lasting impact, join our talent community today!

Important Notice: Recruitment Scams. Please be aware that HARMAN recruiters will always communicate with you from an @harman.com email address. We will never ask for payments, banking, credit card, personal financial information, or access to your LinkedIn/email account during the screening, interview, or recruitment process. If you are asked for such information or receive communication from an email address not ending in @harman.com about a job with HARMAN, please cease communication immediately and report the incident to us at harmancareers@harman.com.

HARMAN is proud to be an Equal Opportunity employer. HARMAN strives to hire the best-qualified candidates and is committed to building a workforce representative of the diverse marketplaces and communities of our global colleagues and customers. All qualified applicants will receive consideration for employment without regard to race, religion, color, national origin, gender (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. HARMAN attracts, hires, and develops employees based on merit, qualifications, and job-related performance. (www.harman.com)

Posted 2 weeks ago

Apply

1.0 - 2.0 years

3 - 4 Lacs

Noida

Work from Office

- Monitor, maintain, analyze, and provide effective resolution of incoming data quality requests.
- Perform the daily task of gathering, consolidating, aggregating, and ensuring quality within the data.
- Effectively communicate and collaborate with data engineering and other teams to meet business and information needs.
- Work with large amounts of data to drive business decisions.
- Traverse large data sets to provide a holistic view of customers.
- Assist in making current processes more efficient and scalable.
- Profile company data by analyzing, extracting, and optimizing relevant information.
- Assist in identifying common elements that may be used to enhance current procedures.
- Work with management, sales teams, and end users to create and manage workflows as well as data validation.
- Assist with technical operations as needed.
- Take on other projects or tasks that arise to further the quality of company data and contribute to the overall success of C2FO.

Requirements:
- Excellent ability to quickly troubleshoot, problem-solve, and analyze.
- Intermediate knowledge of Microsoft Excel.
- Ability to multi-task and prioritize multiple requests and projects.
- Excellent research acumen.
- Data management experience (data quality, data standardization, and data governance).
- Strong attention to detail and desire for accuracy and excellence in all tasks.
- Ability to learn in a fast-paced environment.
- Ability to work independently and as part of a team.
- Ability to assist in defining, developing, testing, and implementing business solutions.

Posted 2 weeks ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Gurugram

Work from Office

IT Offshore Job Description:
- Experience building, deploying, and supporting data ingestion and batch applications on Cornerstone / Google Cloud / LUMI using capabilities such as BigQuery, Cloud Storage, Dataproc, and Cloud Composer/Airflow.
- Strong SQL, Python, PySpark, and Hive.
- Must have good knowledge of data management principles, data governance, and database design concepts.
- Internal Amex experience with Cornerstone/Lumi.
- Strong understanding of governance processes.
- Strong data analysis skills.
- Clear and effective communicator.
- Strong collaborator with multiple business and technology teams.
- Preferred: hands-on experience with CI/CD tools such as XLR and GitHub Actions.
- Able to guide technical teams for delivery across time zones.
- Prior experience in American Express Technology is a significant advantage.

Posted 2 weeks ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Pune

Work from Office

KPI Partners is seeking a skilled Databricks Specialist to join our team. The ideal candidate will possess a strong background in data engineering, analytics, and machine learning, with substantial experience on the Databricks platform.

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and ETL processes on Databricks.
- Collaborate with data scientists and analysts to support analytics initiatives using Databricks and Apache Spark.
- Optimize data engineering workflows for performance and cost efficiency.
- Monitor and troubleshoot data processing jobs and workflows to ensure high availability and reliability.
- Implement and maintain data governance and security measures on Databricks.
- Provide technical guidance and support to team members on Databricks best practices and performance tuning.
- Stay updated with the latest trends in data engineering, big data, and cloud technologies.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Proven experience working with Databricks, Apache Spark, and big data technologies.
- Strong programming skills in languages such as Python, Scala, or SQL.
- Familiarity with cloud platforms such as AWS, Azure, or Google Cloud.
- Experience with data visualization tools and frameworks.
- Excellent problem-solving skills and the ability to work independently as well as part of a team.

Preferred Qualifications:
- Databricks certification or relevant big data certifications.
- Experience with machine learning libraries and frameworks.
- Knowledge of data warehousing solutions and methodologies.

If you are passionate about data and possess a deep understanding of Databricks and its capabilities, we encourage you to apply for this exciting opportunity with KPI Partners. Join us in our mission to leverage data for impactful decision-making.

Posted 2 weeks ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Pune

Work from Office

We are seeking a skilled Data Modeller to join our team at KPI Partners. The ideal candidate will be responsible for designing and implementing data models that effectively meet the business needs of our clients while ensuring data integrity and optimization.

Key Responsibilities:
- Collaborate with stakeholders to gather requirements and understand data architecture needs.
- Design logical and physical data models that align with business objectives and requirements.
- Optimize data models for performance and scalability.
- Develop and maintain documentation for data models, including data flow diagrams and metadata repositories.
- Ensure data quality by implementing data validation and verification procedures.
- Work closely with database administrators and developers to ensure seamless data integration and implementation.
- Participate in data governance and security initiatives.
- Stay updated on industry trends and best practices in data modelling and management.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Data Modeller or in a similar role, with a track record of successful data modeling projects.
- Strong understanding of data warehousing concepts and methodologies.
- Proficiency in data modeling tools and technologies such as Erwin, Oracle SQL, or other relevant tools.
- Excellent analytical and problem-solving skills.
- Strong communication and interpersonal skills, with the ability to work collaboratively in a team environment.

If you are passionate about data and have a desire to drive successful outcomes through effective data modeling, we invite you to apply to join our dynamic team at KPI Partners in one of our locations in Hyderabad, Bangalore, or Pune.

Posted 2 weeks ago

Apply

2.0 - 7.0 years

4 - 9 Lacs

Gurugram

Work from Office

Team Description: The Commercial Data Office (CoDO) team, within Global Commercial Services (GCS), is focused on powering the best customer experience and driving value through data. With continuous changes in the regulatory environment and data innovation, CoDO plays a key role in supporting GCS's sustainable growth, balanced with controls and risk management.

How will you make an impact in this role? As the Analyst of the Commercial Data Risk Management team, you will be responsible for working with a diverse set of stakeholders, such as the American Express Enterprise Data Office, GCS Control Management, business teams, technology teams, and product/platform teams, to implement and strengthen data risk management practices and enable secure, reliable, and compliant use of our data assets. We're looking for someone who enjoys a fast-paced environment, operates cross-functionally (with both business and technology stakeholders), and maintains a positive attitude and sense of humor in the face of challenges.

Key job responsibilities will include, but are not limited to:
- Identify potential data risks across systems, processes, and third parties.
- Support risk assessments to evaluate data retention and deletion, data sharing, and data sensitivity risks.
- Support implementation of data risk controls aligned with internal policies, regulatory requirements, and industry best practices.
- Support the adoption and execution of enterprise data risk management frameworks and standards.
- Monitor risk indicators and control effectiveness using metrics and dashboards.
- Support investigation, root cause analysis, and corrective action plans for data incidents or breaches.
- Contribute to promotion and awareness of data risk across the business unit to strengthen the data risk culture.

Minimum Qualifications:
- 2+ years of direct work experience in risk management, data governance, or a similar role within the financial services or banking industry.
- Strong understanding of data risk and compliance frameworks.
- Strong analytical, problem-solving, and critical thinking skills.
- Project management experience, with partners across a diverse, global set of business and technology teams.
- Strong written and verbal communication skills, with the ability to engage stakeholders across technical and business teams.
- Attention to detail and strong organizational skills.
- Confidence tackling high-importance initiatives that are ambiguous in nature, driving them to completion with minimal direction and demonstrating a sense of ownership.
- Willingness to challenge the status quo, learn new skills, innovate, and have fun in the process.
- Bachelor's degree in computer science, computer engineering, or other technical or business subject areas.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Bengaluru

Work from Office

As a Software Engineer III at JPMorgan Chase within Consumer & Community Banking - Data Technology, you are part of an agile team that works to enhance, design, and deliver the software components of the firm's state-of-the-art technology products in a secure, stable, and scalable way. As an emerging member of a software engineering team, you execute software solutions through the design, development, and technical troubleshooting of multiple components within a technical product, application, or system, while gaining the skills and experience needed to grow within your role.

Job responsibilities:
- Executes standard software solutions, design, development, and technical troubleshooting.
- Writes secure and high-quality code using the syntax of at least one programming language with limited guidance.
- Designs, develops, codes, and troubleshoots with consideration of upstream and downstream systems and technical implications.
- Applies knowledge of tools within the Software Development Life Cycle toolchain to improve the value realized by automation.
- Performs hands-on code development to enable our AI/ML platform, ensuring robustness, scalability, and high performance.
- Adopts best practices in software engineering, machine learning operations (MLOps), and data governance.
- Maintains consistent code check-ins every sprint to ensure continuous integration and development.
- Uses platform engineering to enable the Gen AI platform and develop Gen AI use cases, LLM fine-tuning, and multi-agent orchestration.
- Communicates technical concepts and solutions effectively across all levels of the organization.

Required qualifications, capabilities, and skills:
- Formal training or certification in software engineering concepts and 3+ years of applied experience.
- Extensive practical experience with Python and AWS cloud services, including EKS, EMR, ECS, and DynamoDB.
- Hands-on experience in Databricks ML lifecycle development.
- Advanced knowledge in software engineering, AI/ML, machine learning operations (MLOps), and data governance.
- Real-time model serving experience with Seldon.

Preferred qualifications, capabilities, and skills:
- Ray or AWS SM is a plus.

Posted 2 weeks ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Mumbai

Work from Office

Roles & Responsibilities:
- Implement, configure, and support SAP FICO/SD modules to meet business requirements.
- Work closely with stakeholders to gather, analyze, and document financial processes and needs.
- Design, test, and deploy SAP solutions, ensuring optimal performance and compliance.
- Provide end-user training and support to enhance SAP FICO adoption.
- Collaborate with cross-functional teams to integrate SAP FICO with other business modules.
- Troubleshoot and resolve SAP FICO/SD related issues in a timely manner.
- Should have technical awareness of development activities, i.e., FS preparation, debugging, User Exits, Enhancements, LSMW, BDC applications, BAPIs, BADI development, and background jobs.
- Knowledge of the SD area, including Sales Order Management, Shipping, Billing, Pricing, Credit Management, Availability Check, Variant Configuration, pricing rate determination, Shipment and Shipment Cost, GL Account Determination, and Batch Management.
- Must have exposure to different sales processes of the entire OTC cycle, such as Credit/Debit Memo Processing, Consignment Sales, third-party sales, intercompany sales, Scheduling Agreements, and Stock Transfer Orders, along with thorough taxation knowledge.
- Knowledge of FICO, including bank-integration-related issues, GST development, accounts payable and receivable, asset accounting, GL maintenance, etc.
- Create and maintain required documentation, including design, functional specs, and test cases.
- Experience integrating business process requirements with the technical implementation of SAP Master Data Governance.

Skills & Competencies:
- Strong understanding of financial processes, accounting principles, and reporting.
- Experience with integration of SAP FICO with other SAP modules (MM, SD, PP, etc.).
- Excellent problem-solving and analytical skills.
- Strong communication and interpersonal skills.
- Ability to work independently and collaboratively in a team environment.
- SAP FICO certification is an advantage.
- Knowledge of the Treasury module would be an added advantage.

Posted 2 weeks ago

Apply

4.0 - 7.0 years

6 - 9 Lacs

Bengaluru

Work from Office

KPMG India is looking for Senior - Data Governance to join our dynamic team and embark on a rewarding career journey.
- Develop and implement data governance frameworks and policies.
- Ensure compliance with regulatory requirements and internal standards.
- Monitor and maintain data quality across the organization.
- Collaborate with data owners to establish data stewardship programs.
- Conduct data governance training and awareness sessions.
- Identify and address data-related issues and risks.
- Support data governance initiatives and projects.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Job Title: Associate Specialist - Data Engineering
Location: Bengaluru
Shift: UK Shift

About the Role: We are seeking a skilled and experienced Data Engineer to join our team and help build, optimize, and maintain data pipelines and architectures. The ideal candidate will have deep expertise in the Microsoft data engineering ecosystem, particularly leveraging tools such as Azure Data Factory, Databricks, Synapse Analytics, and Microsoft Fabric, and a strong command of SQL, Python, and Apache Spark.

Key Responsibilities:
- Design, develop, and optimize scalable data pipelines and workflows using Azure Data Factory, Synapse Pipelines, and Microsoft Fabric.
- Build and maintain ETL/ELT processes for ingesting structured and unstructured data from various sources.
- Develop and manage data transformation logic using Databricks (PySpark/Spark SQL) and Python.
- Collaborate with data analysts, architects, and business stakeholders to understand requirements and deliver high-quality data solutions.
- Ensure data quality, integrity, and governance across the data lifecycle.
- Implement monitoring and alerting for data pipelines to ensure reliability and performance.
- Work with Azure Synapse Analytics to build data models and enable analytics and reporting.
- Utilize SQL for querying and managing large datasets efficiently.
- Participate in data architecture discussions and contribute to technical design decisions.

Required Skills and Qualifications:
- Experience in data engineering or a related field.
- Strong proficiency in the Microsoft Azure data ecosystem, including Azure Data Factory (ADF), Azure Synapse Analytics, Microsoft Fabric, and Azure Databricks.
- Solid experience with Python and Apache Spark (including PySpark).
- Advanced skills in SQL for data manipulation and transformation.
- Experience in designing and implementing data lakes and data warehouses.
- Familiarity with data governance, security, and compliance standards.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration abilities.

Preferred Qualifications:
- Microsoft Azure certifications (e.g., Azure Data Engineer Associate).
- Experience with DevOps tools and CI/CD practices in data workflows.
- Knowledge of REST APIs and integration techniques.
- Background in agile methodologies and working in cross-functional teams.

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
