7.0 - 11.0 years
7 - 11 Lacs
Hyderabad, Telangana, India
On-site
As a Data Stewardship Business Analyst, you will play a pivotal role in ensuring the quality of our data across all domains, which directly influences the patients who use our life-saving products. Key tasks include managing business and technical metadata and ensuring that data made available through our platforms is Findable, Accessible, Interoperable, and Reusable (FAIR). If you are passionate about data governance and want to make a significant impact, we encourage you to apply.

What will you do in this role:
- Manage business and technical metadata in Collibra to ensure accuracy, consistency, and accessibility.
- Implement and manage metadata processes to enhance data discoverability in our Data Marketplace.
- Develop and maintain standards for data quality metrics and monitor performance.
- Support initiatives such as configuring data quality rules, executing data quality checks, and analyzing results to improve data quality and resolve issues (illustrated in the sketch after this listing).
- Document and register data access rules for data products in our Data Marketplace.
- Assist in designing and implementing data governance frameworks, policies, and procedures to ensure data integrity, security, and compliance.
- Collaborate with cross-functional teams to establish data stewardship and ownership roles.
- Support the establishment and maintenance of master data and reference data management processes for consistent use across the organization.
- Utilize Collibra for data governance activities and train teams to enhance data governance initiatives.
- Project-manage data governance projects.

What should you have:
- Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent experience.
- Hands-on professional with a minimum of 7-11 years in the technology industry in a data governance role.
- Proficiency in using Collibra Data Catalog for data governance activities.
- Expertise in metadata management and implementing metadata cataloging processes.
- Proven experience in supporting the design and implementation of data governance frameworks, policies, and procedures.
- Experience in master and reference data management and collaborating with business units.
- Excellent communication and stakeholder engagement skills, including conducting training sessions and presenting reports to senior management.

Nice to have, but not essential:
- Understanding of data quality management and experience in developing and maintaining standards.
- Advanced degree in a related field.
- DAMA CDMP certification or equivalent.
- Experience in a research-intensive biopharmaceutical company.
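The data quality duties above lend themselves to concrete, executable rules. Below is a minimal, hypothetical sketch in Python (pandas) of the kind of completeness, validity, and uniqueness checks a steward might configure; the column names, thresholds, and sample data are illustrative assumptions, not taken from the posting.

```python
# A minimal sketch of data quality rules: completeness, validity, and
# uniqueness checks over a product dataset. All names are illustrative.
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> dict:
    results = {}
    # Completeness: share of non-null values per critical column
    for col in ["product_id", "batch_number", "expiry_date"]:
        results[f"{col}_completeness"] = df[col].notna().mean()
    # Validity: expiry dates must parse and lie in the future
    expiry = pd.to_datetime(df["expiry_date"], errors="coerce")
    results["expiry_date_validity"] = (expiry > pd.Timestamp.now()).mean()
    # Uniqueness: product_id should not repeat
    results["product_id_uniqueness"] = 1 - df["product_id"].duplicated().mean()
    return results

if __name__ == "__main__":
    sample = pd.DataFrame({
        "product_id": [101, 102, 102, None],
        "batch_number": ["B1", "B2", None, "B4"],
        "expiry_date": ["2030-01-01", "2029-06-30", "bad-date", "2031-03-15"],
    })
    for metric, score in run_quality_checks(sample).items():
        print(f"{metric}: {score:.2f}")
```

In a tool like Collibra, the analogous rules would be configured in the platform itself; the sketch only shows the underlying logic a steward would be validating.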
Posted 1 day ago
5.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Overview: TekWissen is a global workforce management provider with operations throughout India and many other countries. The client below is a global company with shared ideals and a deep sense of family. From its earliest days as a pioneer of modern transportation, it has sought to make the world a better place, one that benefits lives, communities, and the planet.

Job Title: Software Engineer Senior
Location: Chennai
Work Type: Hybrid

Position Description: This fast-paced position is intended for people who like to build analytics platforms and tooling that deliver real value to the business. Applicants should have a strong desire to learn new technologies and an interest in providing guidance that helps drive the adoption of these tools. The Analytics Data Management (ADM) Product Engineer will assist with the engineering of strategic data management platforms from Informatica, primarily Enterprise Data Catalog, and Apache NiFi. Other technologies include Informatica IICS/IDMC, PowerCenter, Data Catalog, and Master Data Management; IBM Information Server and Cloud Pak for Data (CP4D); and Google Cloud Data Fusion. This person will also collaborate with Infrastructure Architects to design and implement environments based on these technologies for use in the client's enterprise data centers. Platforms may be based on-premises or hosted in Google Cloud.

Skills Required: Informatica
Skills Preferred: Cloud Infrastructure
Experience Required:
- Informatica products (IICS/IDMC, PowerCenter, Data Catalog, Master Data Management): installation, configuration, administration, and troubleshooting. Specific experience with Informatica Data Catalog is essential.
- Apache NiFi: strong Java development experience to create custom NiFi processors, and expertise in deploying and managing NiFi applications on Red Hat OS environments (a support-tooling sketch follows this listing).
- Google Cloud Platform (GCP): provisioning, administration, and troubleshooting of products. Specific experience with Dataplex or Google Cloud Data Fusion (CDF) is highly preferred.

Experience Range: 5-8 years

Summary of Responsibilities:
- Engineer, test, and modernize data management platforms, primarily Informatica Enterprise Data Catalog and Apache NiFi.
- Enable cloud migrations for analytics platforms.
- Define, document, and monitor global (Follow-the-Sun) support procedures (Incident Management, Request Management, Event Management, etc.).
- Provide Asia-Pacific (IST) 2nd-level support for these products.

Responsibilities Detail: Installing and configuring products; working with platform support teams and vendor support to resolve issues; thoroughly testing product functionality on the platform; developing custom installation guides, configurations, and scripts consistent with the client's IT security policy; providing 2nd-level support for product-related issues; developing new tools and processes to ensure effective implementation and use of the technologies; implementing monitoring/alerting and analyzing usage data to ensure optimal performance of the infrastructure; maintaining a SharePoint site with relevant documentation, FAQs, and processes necessary to promote and support the use of these technologies.

Required Skills:
- Ability to collect and clearly document requirements.
- Ability to prioritize work and manage multiple assignments.
- Ability to create and execute detailed project plans and test plans.
Education Required: Bachelor's Degree
Education Preferred: Bachelor's Degree
TekWissen Group is an equal opportunity employer supporting workforce diversity.
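The 2nd-level support and monitoring duties above often involve polling NiFi's REST API. The sketch below is a hedged illustration: the /nifi-api paths follow NiFi's documented REST surface, but the exact response fields should be verified against your NiFi version, and the host, authentication, and TLS handling are environment-specific assumptions omitted here.

```python
# A hedged sketch of support tooling: polling the Apache NiFi REST API
# for basic health signals. Verify endpoint paths and response fields
# against your NiFi version; auth is omitted and the host is assumed.
import requests

NIFI_BASE = "https://nifi.example.internal:8443/nifi-api"  # assumed host

def check_nifi_health() -> None:
    # System diagnostics: heap usage and JVM-level metrics (field names
    # assumed from NiFi's documented API shape)
    diag = requests.get(f"{NIFI_BASE}/system-diagnostics", timeout=10).json()
    heap = diag["systemDiagnostics"]["aggregateSnapshot"]["heapUtilization"]
    print(f"Heap utilization: {heap}")

    # Controller status: counts of running/stopped/invalid components
    status = requests.get(f"{NIFI_BASE}/flow/status", timeout=10).json()
    cs = status["controllerStatus"]
    print(f"Running: {cs['runningCount']}, Stopped: {cs['stoppedCount']}, "
          f"Invalid: {cs['invalidCount']}")

if __name__ == "__main__":
    check_nifi_health()
```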
Posted 3 days ago
6.0 - 10.0 years
0 Lacs
Karnataka
On-site
As a Data Engineer at our company, you will be responsible for designing and implementing Azure Synapse Analytics solutions for data processing and reporting. Your role will involve optimizing ETL pipelines, SQL pools, and Synapse Spark workloads to ensure efficient data processing. It will also be crucial for you to uphold data quality, security, and governance best practices while collaborating with business stakeholders to develop data-driven solutions. Additionally, part of your responsibilities will include mentoring a team of data engineers.

To excel in this role, you should have 6-10 years of experience in Data Engineering, BI, or Cloud Analytics. Expertise in Azure Synapse, Azure Data Factory, SQL, and ETL processes is essential. Experience with Fabric is strongly desirable, and strong leadership, problem-solving, and stakeholder management skills are crucial. Knowledge of Power BI, Python, or Spark would be a plus. You should also have deep knowledge of data modelling techniques, design and development of ETL pipelines, Azure resource cost management, and proficiency in writing complex SQL queries.

Furthermore, you are expected to have knowledge and experience in master data and metadata management, including data governance, data quality, data catalog, and data security. Your ability to manage a complex and rapidly evolving business and to actively lead, develop, and support team members will be key. As an Agile practitioner and advocate, you must be highly dynamic in your approach, adapting to constant changes in risks and forecasts. Your role will involve ensuring data integrity within the dimensional model by validating data and identifying inconsistencies. You will also work closely with Product Owners and data engineers to translate business needs into effective dimensional models.

This position offers the opportunity to lead AI-driven data integration projects in real estate technology, work in a collaborative and innovative environment with global teams, and receive competitive compensation, career growth opportunities, and exposure to cutting-edge technologies. Ideally, you should hold a Bachelor's or Master's degree in Software Engineering, Computer Science, or a related area.

Our company offers a range of benefits, including hybrid working arrangements, an annual performance-related bonus, Flexi Any Days, medical insurance coverage for extended family members, and an engaging, fun, and inclusive culture. MRI Software is dedicated to delivering innovative applications and hosted solutions that empower real estate companies to elevate their business. With a strong focus on meeting the unique needs of real estate businesses globally, we have grown to include offices across various countries with over 4,000 team members supporting our clients. MRI is proud to be an Equal Employment Opportunity employer.
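The Synapse Spark optimization work described above typically reduces to a few standard PySpark habits: pruning columns early, controlling partitioning before wide aggregations, and writing partitioned output. This is a minimal sketch under assumed storage paths and column names, not the employer's actual pipeline.

```python
# A minimal PySpark sketch of Synapse Spark workload tuning: prune
# columns early, repartition before a wide aggregation, and write
# partitioned output. Paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("synapse-etl-sketch").getOrCreate()

sales = (
    spark.read.parquet("abfss://raw@account.dfs.core.windows.net/sales/")
    .select("sale_id", "region", "sale_date", "amount")  # prune early
)

daily = (
    sales.repartition("region")  # co-locate rows before the aggregation
    .groupBy("region", "sale_date")
    .agg(F.sum("amount").alias("total_amount"),
         F.count("*").alias("txn_count"))
)

# Partitioned write keeps downstream per-region reads cheap
(daily.write.mode("overwrite")
      .partitionBy("region")
      .parquet("abfss://curated@account.dfs.core.windows.net/daily_sales/"))
```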
Posted 4 days ago
4.0 - 8.0 years
8 - 15 Lacs
Mumbai
Work from Office
Min. 4 yrs. of hands-on experience with Talend Data Integration and Data Quality. Expert in Talend Studio, DQ, and Stewardship Console, with knowledge of data governance. Experience working with relational databases and cloud platforms, preferably GCP.

Required Candidate Profile: Knowledge of data profiling, cleansing, matching, and metadata management tools. Proficient in SQL for complex queries and data transformation; familiar with Git and CI/CD. Excellent communication skills.
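The profiling and matching skills this role asks for are normally exercised inside Talend Studio, but the underlying logic can be illustrated with a small Python sketch: profile null rates and cardinality, then run a naive fuzzy match on names. The data, threshold, and column names here are invented for illustration.

```python
# Illustrative data profiling and matching, outside any specific tool.
from difflib import SequenceMatcher
import pandas as pd

df = pd.DataFrame({
    "customer": ["Acme Corp", "ACME Corporation", "Globex", None],
    "city": ["Mumbai", "Mumbai", "Pune", "Pune"],
})

# Profiling: null rate and cardinality per column
profile = pd.DataFrame({
    "null_rate": df.isna().mean(),
    "distinct": df.nunique(),
})
print(profile)

# Matching: flag near-duplicate names above a similarity threshold
names = df["customer"].dropna().str.lower().tolist()
for i, a in enumerate(names):
    for b in names[i + 1:]:
        score = SequenceMatcher(None, a, b).ratio()
        if score > 0.7:
            print(f"possible duplicate: {a!r} ~ {b!r} ({score:.2f})")
```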
Posted 4 days ago
6.0 - 10.0 years
22 - 35 Lacs
Hyderabad, Chennai, Delhi / NCR
Work from Office
We are seeking a highly skilled and experienced Collibra Senior Developer. The candidate will be responsible for designing, developing, and implementing Collibra solutions. This role requires strong technical expertise in Collibra, data management, and integration with various data platforms.

Key Responsibilities:
1. Design, configure, and implement Collibra Data Governance Center (DGC) solutions to meet business requirements.
2. Develop and maintain workflows, data models, and integrations within the Collibra platform.
3. Collaborate with business stakeholders, data stewards, and IT teams to gather requirements and translate them into technical solutions.
4. Customize Collibra assets, domains, and communities to align with organizational data governance frameworks.
5. Develop and manage Collibra Connect integrations with other enterprise systems.
6. Provide technical leadership and mentorship to junior developers and team members.
7. Troubleshoot and resolve issues related to Collibra configurations, workflows, and integrations.
8. Ensure compliance with data governance policies, standards, and best practices.
9. Prepare and maintain technical documentation, including solution designs, configuration guides, and user manuals.
10. Support user training and adoption of Collibra solutions across the organization.
11. Experience developing Collibra workflows using BPMN and integrating Collibra with other platforms via APIs (see the sketch after this listing).
12. Proficiency in Java, Python, or similar programming languages for integration and automation tasks.
13. Strong communication and interpersonal skills, with the ability to work collaboratively across teams.
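Item 11 above mentions integrating Collibra with other platforms via APIs. The sketch below shows the general shape of such an integration in Python: the /rest/2.0/assets path follows Collibra's documented REST Core API v2, but it should be verified against your instance, and the UUIDs, credentials, and instance URL are all placeholders.

```python
# A hedged sketch of API-based Collibra integration: registering an
# asset via the REST Core API. Verify paths against your instance;
# IDs and credentials below are placeholders.
import requests

BASE = "https://collibra.example.com/rest/2.0"  # assumed instance URL
SESSION = requests.Session()
SESSION.auth = ("svc_governance", "***")  # use a secrets vault in practice

def create_asset(name: str, domain_id: str, type_id: str) -> str:
    resp = SESSION.post(
        f"{BASE}/assets",
        json={"name": name, "domainId": domain_id, "typeId": type_id},
        timeout=15,
    )
    resp.raise_for_status()
    return resp.json()["id"]

if __name__ == "__main__":
    asset_id = create_asset(
        name="customer_master.email",
        domain_id="00000000-0000-0000-0000-000000000000",  # placeholder
        type_id="00000000-0000-0000-0000-000000000001",    # placeholder
    )
    print(f"created asset {asset_id}")
```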
Posted 4 days ago
7.0 - 11.0 years
0 Lacs
Pune, Maharashtra
On-site
Our client is a global IT services company with offices in India and the United States, specializing in digital transformation, IT collaborations, and using technology to make a positive impact on businesses. They are currently seeking an experienced Informatica Architect to join their team.

As an Informatica Architect, you will lead data governance, data catalog, and data quality efforts. With over 7 years of expertise in data quality and data cataloging, you will work closely with the Data & Analytics lead to ensure the integrity and quality of critical data within the product. Your role will involve developing efficient data processes using tools such as Informatica, Alation, Atlan, or Collibra.

Key responsibilities include overseeing the data elements of a complex product catalog, designing and developing the data catalog and data assets on leading tools, managing the Enterprise Glossary, configuring data catalog resources, implementing Critical Data Elements, and ensuring compliance with Data Governance Standards.

The ideal candidate will have 7-8 years of enterprise IICS data integration and management experience, along with practical experience configuring data governance resources and hands-on experience with Informatica CDQ and Data Quality. A strong understanding of data quality, data cataloging, and data governance best practices is essential. Preferred qualifications include administration and management of the Collibra/Alation data catalog tool, configuration of data profiling and data lineage, and working with Data Owners and stewards to understand catalog requirements.

If you have the required qualifications and experience, we invite you to apply online through our portal or via email at careers@speedmart.co.in. Join us in driving digital transformation and making a difference in the world of business.
Posted 6 days ago
3.0 - 8.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
You should have a total of 8+ years of development/design experience, with a minimum of 3 years' experience in Big Data technologies on-premises and in the cloud. Proficiency in Snowflake and strong SQL programming skills are required.

In addition, you should have strong experience with data modeling and schema design, as well as extensive experience using data warehousing tools like Snowflake, BigQuery, or Redshift. Experience with BI tools like Tableau, QuickSight, or Power BI is a must (at least one is required). You should also have strong experience implementing ETL/ELT processes and building data pipelines, including workflow management, job scheduling, and monitoring. A good understanding of Data Governance, Security and Compliance, Data Quality, Metadata Management, Master Data Management, and Data Catalog is essential.

Moreover, you should have a strong understanding of cloud services (AWS or Azure), including IAM and log analytics. Excellent interpersonal and teamwork skills are necessary for this role, as is experience leading and mentoring other team members. Good knowledge of Agile Scrum and strong communication skills are also required. Your day-to-day responsibilities will cover these same areas.

At GlobalLogic, we prioritize a culture of caring. From day one, you will experience an inclusive culture of acceptance and belonging, where you will have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. We are committed to your continuous learning and development. You will learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic.

GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you will have the chance to work on projects that matter and engage your curiosity and creative problem-solving skills. We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. GlobalLogic is a high-trust organization where integrity is key. By joining us, you are placing your trust in a safe, reliable, and ethical global company.

GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world's largest and most forward-thinking companies. Since 2000, we have been at the forefront of the digital revolution, helping create innovative and widely used digital products and experiences. Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.
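As an illustration of the Snowflake-centred ELT work described above, here is a minimal sketch using the snowflake-connector-python client. The account locator, stage, and table names are placeholders, and the COPY/INSERT statements are generic examples under assumed schemas, not this employer's actual pipeline.

```python
# A minimal ELT sketch against Snowflake: bulk-load staged files, then
# transform inside the warehouse. All identifiers are placeholders;
# credentials belong in a secrets manager, not in code.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",  # placeholder account locator
    user="ETL_SVC",
    password="***",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # EL step: bulk-load staged CSV files into a staging table
    cur.execute(
        "COPY INTO staging.orders FROM @raw_stage/orders/ "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    # T step: transform inside Snowflake (the ELT pattern)
    cur.execute("""
        INSERT INTO curated.daily_orders
        SELECT order_date, COUNT(*) AS order_count, SUM(amount) AS revenue
        FROM staging.orders GROUP BY order_date
    """)
    cur.execute("SELECT COUNT(*) FROM curated.daily_orders")
    print("rows in curated.daily_orders:", cur.fetchone()[0])
finally:
    conn.close()
```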
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
Karnataka
On-site
The responsibilities of the role involve designing and implementing Azure Synapse Analytics solutions for data processing and reporting. You will be required to optimize ETL pipelines, SQL pools, and Synapse Spark workloads while ensuring data quality, security, and governance best practices are followed. Collaborating with business stakeholders to develop data-driven solutions and mentoring a team of data engineers are key aspects of this position.

To excel in this role, you should possess 6-10 years of experience in Data Engineering, BI, or Cloud Analytics. Expertise in Azure Synapse, Azure Data Factory, SQL, and ETL processes is essential, and experience with Fabric is strongly desirable. Strong leadership, problem-solving, and stakeholder management skills are required, and knowledge of Power BI, Python, or Spark is a plus. A deep understanding of data modelling techniques, design and development of ETL pipelines, Azure resource cost management, and writing complex SQL queries are important competencies. Familiarity with authorization and security best practices for Azure components, master data and metadata management, and data governance is crucial. You must be able to manage a complex and rapidly evolving business and actively lead, develop, and support team members. An Agile mindset and the ability to adapt to constant changes in risks and forecasts are expected.

Thorough knowledge of data warehouse architecture, principles, and best practices is necessary, including expertise in designing star and snowflake schemas, identifying facts and dimensions, and selecting appropriate granularity levels. Ensuring data integrity within the dimensional model by validating data and identifying inconsistencies is part of the role. You will work closely with Product Owners and data engineers to translate business needs into effective dimensional models.

Joining MRI Software offers the opportunity to lead AI-driven data integration projects in real estate technology, work in a collaborative and innovative environment with global teams, and access competitive compensation, career growth opportunities, and exposure to cutting-edge technologies. The ideal candidate should hold a Bachelor's or Master's degree in Software Engineering, Computer Science, or a related area.

The benefits of this position include hybrid working arrangements, an annual performance-related bonus, 6x Flexi Any Days, medical insurance coverage for extended family members, and an engaging, fun, and inclusive culture at MRI Software. MRI Software delivers innovative applications and hosted solutions that empower real estate companies to enhance their business. With a flexible technology platform and an open and connected ecosystem, we cater to the unique needs of real estate businesses globally. With offices across various countries and a diverse team, we provide expertise and insight to support our clients effectively. MRI Software is proud to be an Equal Employment Opportunity employer.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Catalogue Technical Specialist/Analyst, you will be responsible for supporting the technical integration of Alation into other enterprise systems that may contain critical data lineage or data quality information. Your daily tasks will include providing operational support for the integration catalog, managing Active Directory roles, maintaining integrations, and enhancing capabilities. Collaborating with integration owners, you will identify, define, and capture integration and other key data within the integration catalog. Your role will also involve optimizing integration descriptions, keywords, and categories for effective search and discovery, as well as resolving issues related to the integration catalog promptly.

Maintaining the data catalog to ensure accurate and up-to-date metadata for all data assets will be a crucial part of your responsibilities. You will establish and enforce data quality standards and guidelines across the organization, conducting regular data quality assessments and audits to address any data issues that arise. Additionally, you will act as a point of contact for data catalog-related inquiries, providing timely resolutions. Generating reports and dashboards to offer insights into data catalog usage, data quality, and metadata completeness will be an essential aspect of your role. You will also monitor and analyze data quality metrics to identify and resolve any anomalies or discrepancies, analyzing metadata to identify trends, patterns, and areas for improvement.

Your experience as a self-directed individual with over 3 years of experience in integration, data and analytics, or related roles will be beneficial for this position. Experience in integration design and development on iPaaS platforms, and with a DevOps and CI/CD approach to integration deployment, is preferred. Hands-on experience with catalog tools such as Alation or similar, as well as integrating with the ServiceNow platform, will be advantageous. Moreover, proven experience implementing and managing data lineage, catalog, or other solutions in complex enterprise environments, along with expertise in databases, business intelligence tools, and ETL tools, will be valuable. A strong understanding of data catalogs and their capabilities, including data dictionaries, business glossaries, business lineage, technical lineage, and data management workflows, is essential for this role, as is an understanding of multiple system integrations, data flow, and data schema changes.

In summary, as a Catalogue Technical Specialist/Analyst, you will play a pivotal role in ensuring the smooth integration of Alation into various enterprise systems, maintaining data quality standards, and providing support for data catalog-related activities to enhance organizational data management practices.
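One duty named above is reporting on metadata completeness. The sketch below deliberately avoids assuming any specific Alation API: it works from a hypothetical CSV export of catalog assets (the file layout and column names are invented for illustration, not Alation's actual export format) and computes completeness per attribute and per data source.

```python
# Metadata-completeness reporting from a hypothetical catalog export.
import pandas as pd

assets = pd.read_csv("catalog_export.csv")  # hypothetical export file
# Assumed columns: asset_name, source_system, description, steward,
# business_glossary_term

meta_cols = ["description", "steward", "business_glossary_term"]

# Overall completeness per metadata attribute
overall = assets[meta_cols].notna().mean().rename("completeness")
print(overall)

# Completeness by source system, to target remediation where it is worst
by_source = assets.groupby("source_system")[meta_cols].agg(
    lambda s: s.notna().mean()
)
print(by_source.round(2))
```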
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
Join our fast-growing data team at the forefront of cloud data architecture and innovation. We are focused on building scalable, secure, and modern data platforms using cutting-edge Snowflake and other modern data stack technologies. If you are passionate about creating high-performance data infrastructure and solving complex data challenges in a cloud-native environment, this opportunity is perfect for you.

As a Senior Data Engineer specializing in Snowflake and the modern data stack, your role will involve architecting and implementing enterprise-grade, cloud-native data warehousing solutions. This is a hands-on engineering position with significant architectural influence, in which you will work extensively with dbt, Fivetran, and other modern data tools to create efficient, maintainable, and scalable data pipelines using ELT-first approaches. You will be expected to demonstrate technical expertise across Snowflake, dbt, data ingestion, SQL and data modeling, cloud platforms, orchestration, programming, and DevOps, and to contribute to data management through an understanding of data governance frameworks, data quality practices, and data visualization tools.

Preferred qualifications and certifications include a Bachelor's degree in Computer Science or a related field, substantial hands-on experience in data engineering with a focus on cloud data warehousing, and relevant certifications such as Snowflake SnowPro and dbt Analytics Engineering.

Your work will revolve around designing and implementing robust data warehouse solutions, architecting ELT pipelines, building automated data ingestion processes, maintaining data transformation workflows, and developing data modeling best practices. You will optimize Snowflake warehouse performance, implement data quality tests and monitoring, build CI/CD pipelines, and collaborate with analytics teams to support self-service data access.

Valtech offers an international network of data professionals, continuous development opportunities, and a culture that values freedom and responsibility. We are committed to creating an equitable workplace that supports individuals from diverse backgrounds to thrive, grow, and achieve their goals. If you are ready to push the boundaries of innovation and creativity in a supportive environment, we encourage you to apply and join the Valtech team.
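Since the role leans on dbt-driven ELT and CI/CD, here is a minimal sketch of how such runs are often scripted in a pipeline. It shells out to the standard dbt CLI (the `dbt deps`, `dbt run`, and `dbt test` commands and the `--project-dir`, `--profiles-dir`, and `--select` flags are standard); the paths and the `staging+` selector are placeholder assumptions.

```python
# A minimal CI wrapper around the dbt CLI: build staging models and
# their children, then run the associated tests, failing fast on error.
import subprocess
import sys

def dbt(*args: str) -> None:
    cmd = ["dbt", *args, "--project-dir", "analytics/",
           "--profiles-dir", "ci/"]
    print("running:", " ".join(cmd))
    result = subprocess.run(cmd)
    if result.returncode != 0:
        sys.exit(result.returncode)  # fail the pipeline on any dbt error

if __name__ == "__main__":
    dbt("deps")                          # install package dependencies
    dbt("run", "--select", "staging+")   # build staging models + children
    dbt("test", "--select", "staging+")  # run schema and data tests
```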
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
As a Data Specialist, you will be responsible for using your expertise in ETL fundamentals, SQL, BigQuery, Dataproc, Python, Data Catalog, data warehousing, and various other tools to contribute to the successful implementation of data projects. Your role will involve working with technologies such as Cloud Trace, Cloud Logging, Cloud Storage, and Data Fusion to build and maintain a modern data platform.

To excel in this position, you should possess a minimum of 5 years of experience in the data engineering field, with a focus on the GCP cloud data implementation suite, including BigQuery, Pub/Sub, Dataflow/Apache Beam, Airflow/Composer, and Cloud Storage. Your strong understanding of very large-scale data architecture and hands-on experience in data warehouses, data lakes, and analytics platforms will be crucial for the success of our projects.

Key Requirements:
- Minimum 5 years of experience in data engineering
- Hands-on experience with the GCP cloud data implementation suite
- Strong expertise in Google BigQuery, Python, Apache Airflow, and SQL (BigQuery preferred)
- Extensive hands-on experience with SQL and Python for working with data

If you are passionate about data and have a proven track record of delivering results in a fast-paced environment, we invite you to apply for this exciting opportunity to be a part of our dynamic team.
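A minimal sketch of the BigQuery work named above, using the google-cloud-bigquery client library: run an aggregation query and write the result to a destination table. The project, dataset, and table names are placeholders, not taken from the posting.

```python
# Run a query and materialize the result into a destination table using
# the google-cloud-bigquery client. All identifiers are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")  # placeholder

sql = """
    SELECT DATE(event_ts) AS event_date, COUNT(*) AS events
    FROM `my-analytics-project.raw.events`
    GROUP BY event_date
"""

job_config = bigquery.QueryJobConfig(
    destination="my-analytics-project.curated.daily_events",
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)
query_job = client.query(sql, job_config=job_config)
query_job.result()  # block until the job completes
print(f"wrote {query_job.destination} in job {query_job.job_id}")
```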
Posted 1 week ago
5.0 - 10.0 years
15 - 20 Lacs
Madurai, Chennai
Work from Office
Dear Candidate,

Greetings of the day! I am Kantha, and I'm reaching out to you regarding an exciting opportunity with TechMango. You can connect with me on LinkedIn at https://www.linkedin.com/in/kantha-m-ashwin-186ba3244/ or by email: kanthasanmugam.m@techmango.net

Techmango Technology Services is a full-scale software development services company founded in 2014 with a strong focus on emerging technologies, holding the primary objective of delivering strategic solutions toward the goals of its business partners. We are a leading full-scale software and mobile app development company. Techmango is driven by the mantra "Client's Vision is our Mission", and we hold firmly to it. Our aim is to be the technologically advanced and most loved organization providing prime-quality, cost-efficient services with a long-term client relationship strategy. We are operational in the USA (Chicago, Atlanta), Dubai (UAE), and India (Bangalore, Chennai, Madurai, Trichy).

Job Title: GCP Data Engineer
Location: Madurai
Experience: 5+ Years
Notice Period: Immediate

Job Summary: We are seeking a hands-on GCP Data Engineer with deep expertise in real-time streaming data architectures to help design, build, and optimize data pipelines in our Google Cloud Platform (GCP) environment. The ideal candidate will have strong architectural vision and be comfortable rolling up their sleeves to build scalable, low-latency streaming data pipelines using Pub/Sub, Dataflow (Apache Beam), and BigQuery.

Key Responsibilities:
- Architect and implement end-to-end streaming data solutions on GCP using Pub/Sub, Dataflow, and BigQuery.
- Design real-time ingestion, enrichment, and transformation pipelines for high-volume event data.
- Work closely with stakeholders to understand data requirements and translate them into scalable designs.
- Optimize streaming pipeline performance, latency, and throughput.
- Build and manage orchestration workflows using Cloud Composer (Airflow).
- Drive schema design, partitioning, and clustering strategies in BigQuery for both real-time and batch datasets.
- Define SLAs, monitoring, logging, and alerting for streaming jobs using Cloud Monitoring, Error Reporting, and Stackdriver.
- Apply data modeling experience.
- Ensure robust security, encryption, and access controls across all data layers.
- Collaborate with DevOps on CI/CD automation of data workflows using Terraform, Cloud Build, and Git.
- Document streaming architecture, data lineage, and deployment runbooks.

Required Skills & Experience:
- 5+ years of experience in data engineering or architecture.
- 3+ years of hands-on GCP data engineering experience.
- Strong expertise in Google Pub/Sub, Dataflow (Apache Beam), BigQuery (including streaming inserts), Cloud Composer (Airflow), and Cloud Storage (GCS).
- Solid understanding of streaming design patterns, exactly-once delivery, and event-driven architecture.
- Deep knowledge of SQL and NoSQL data modeling.
- Hands-on experience with monitoring and performance tuning of streaming jobs.
- Experience using Terraform or equivalent for infrastructure as code.
- Familiarity with CI/CD pipelines for data workflows.
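The Pub/Sub-to-Dataflow-to-BigQuery pattern this listing centres on can be illustrated with a minimal Apache Beam (Python SDK) streaming pipeline: read JSON events from Pub/Sub, window and count them, and stream the aggregates into BigQuery. This is a hedged sketch, not the employer's actual pipeline; the topic, table, schema, and window size are placeholder assumptions.

```python
# Minimal Beam streaming pipeline: Pub/Sub -> fixed windows -> BigQuery.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.transforms import window

options = PipelineOptions(
    streaming=True, project="my-project", region="asia-south1"  # placeholders
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadEvents" >> beam.io.ReadFromPubSub(
            topic="projects/my-project/topics/events")  # placeholder topic
        | "Parse" >> beam.Map(lambda b: json.loads(b.decode("utf-8")))
        | "KeyByType" >> beam.Map(lambda e: (e["event_type"], 1))
        | "Window" >> beam.WindowInto(window.FixedWindows(60))  # 60-second windows
        | "Count" >> beam.CombinePerKey(sum)
        | "ToRow" >> beam.Map(lambda kv: {"event_type": kv[0], "count": kv[1]})
        | "WriteBQ" >> beam.io.WriteToBigQuery(
            "my-project:analytics.event_counts",  # placeholder table
            schema="event_type:STRING,count:INTEGER",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```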
Posted 1 week ago
6.0 - 11.0 years
0 - 3 Lacs
Hyderabad, Pune, Delhi / NCR
Hybrid
Greetings,

We have an opening with one of our clients, Deloitte, for the position of Collibra Data Governance Developer.

Role & Responsibilities: We are seeking a highly skilled and experienced Collibra Senior Developer. The candidate will be responsible for designing, developing, and implementing Collibra solutions. This role requires strong technical expertise in Collibra, data management, and integration with various data platforms.

Key Responsibilities:
1. Design, configure, and implement Collibra Data Governance Center (DGC) solutions to meet business requirements.
2. Develop and maintain workflows, data models, and integrations within the Collibra platform.
3. Collaborate with business stakeholders, data stewards, and IT teams to gather requirements and translate them into technical solutions.
4. Customize Collibra assets, domains, and communities to align with organizational data governance frameworks.
5. Develop and manage Collibra Connect integrations with other enterprise systems.
6. Provide technical leadership and mentorship to junior developers and team members.
7. Troubleshoot and resolve issues related to Collibra configurations, workflows, and integrations.
8. Ensure compliance with data governance policies, standards, and best practices.
9. Prepare and maintain technical documentation, including solution designs, configuration guides, and user manuals.
10. Support user training and adoption of Collibra solutions across the organization.
11. Experience developing Collibra workflows using BPMN and integrating Collibra with other platforms via APIs.
12. Proficiency in Java, Python, or similar programming languages for integration and automation tasks.
13. Strong communication and interpersonal skills, with the ability to work collaboratively across teams.

Work timing: 11 am - 8 pm
Location: Hyderabad / Mumbai / Delhi-NCR / Bengaluru / Kolkata / Pune / Chennai
Posted 1 week ago
12.0 - 17.0 years
27 - 35 Lacs
Madurai, Chennai
Hybrid
Dear Candidate,

Greetings of the day! I am Kantha, and I'm reaching out to you regarding an exciting opportunity with TechMango. You can connect with me on LinkedIn at https://www.linkedin.com/in/kantha-m-ashwin-186ba3244/ or by email: kanthasanmugam.m@techmango.net

Techmango Technology Services is a full-scale software development services company founded in 2014 with a strong focus on emerging technologies, holding the primary objective of delivering strategic solutions toward the goals of its business partners. We are a leading full-scale software and mobile app development company. Techmango is driven by the mantra "Client's Vision is our Mission", and we hold firmly to it. Our aim is to be the technologically advanced and most loved organization providing prime-quality, cost-efficient services with a long-term client relationship strategy. We are operational in the USA (Chicago, Atlanta), Dubai (UAE), and India (Bangalore, Chennai, Madurai, Trichy).

Job Title: Technical GCP Data Architect/Lead
Location: Madurai
Experience: 12+ Years
Notice Period: Immediate

Job Summary: We are seeking a hands-on Technical GCP Data Architect/Lead with deep expertise in real-time streaming data architectures to help design, build, and optimize data pipelines in our Google Cloud Platform (GCP) environment. The ideal candidate will have strong architectural vision and be comfortable rolling up their sleeves to build scalable, low-latency streaming data pipelines using Pub/Sub, Dataflow (Apache Beam), and BigQuery.

Key Responsibilities:
- Architect and implement end-to-end streaming data solutions on GCP using Pub/Sub, Dataflow, and BigQuery.
- Design real-time ingestion, enrichment, and transformation pipelines for high-volume event data.
- Work closely with stakeholders to understand data requirements and translate them into scalable designs.
- Optimize streaming pipeline performance, latency, and throughput.
- Build and manage orchestration workflows using Cloud Composer (Airflow).
- Drive schema design, partitioning, and clustering strategies in BigQuery for both real-time and batch datasets.
- Define SLAs, monitoring, logging, and alerting for streaming jobs using Cloud Monitoring, Error Reporting, and Stackdriver.
- Apply data modeling experience.
- Ensure robust security, encryption, and access controls across all data layers.
- Collaborate with DevOps on CI/CD automation of data workflows using Terraform, Cloud Build, and Git.
- Document streaming architecture, data lineage, and deployment runbooks.

Required Skills & Experience:
- 10+ years of experience in data engineering or architecture.
- 3+ years of hands-on GCP data engineering experience.
- Strong expertise in Google Pub/Sub, Dataflow (Apache Beam), BigQuery (including streaming inserts), Cloud Composer (Airflow), and Cloud Storage (GCS).
- Solid understanding of streaming design patterns, exactly-once delivery, and event-driven architecture.
- Deep knowledge of SQL and NoSQL data modeling.
- Hands-on experience with monitoring and performance tuning of streaming jobs.
- Experience using Terraform or equivalent for infrastructure as code.
- Familiarity with CI/CD pipelines for data workflows.
Posted 1 week ago
5.0 - 12.0 years
0 - 20 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Familiarity with data management standards. Ability to work with high volumes of detailed technical and business metadata. Experience documenting data element metadata (business elements vs. technical data elements). Experience understanding how data transformations materialize and determining the appropriate controls required to ensure a high level of data quality. Ability to understand and document application- and/or data-element-level flows (i.e., lineage). Ability to analyze both processes and datasets to identify meaningful, actionable outcomes. Understand and implement changes to business processes. Develop and influence business processes necessary to support data governance-related outcomes. Manage and influence across vertical organizations to achieve common objectives.

Intermediate to expert-level knowledge of MS products such as Excel, PowerPoint, Word, Skype, and Outlook. Working knowledge of metadata tools such as Collibra or equivalent. Familiarity with data analytics / BI tools such as Tableau, MicroStrategy, etc.

Communication Skills: Create visually and verbally engaging, informative materials for departmental leadership, business partners, executives, and stakeholders. Ability to tailor communication of topics to various levels of the organization (e.g., technical audiences vs. business stakeholders).

Desired Skills (nice-to-have): General knowledge of the banking industry.
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Data Management Technical Lead/Product Owner at Air Products, you will be responsible for leading the technical support and implementation of various data management tools, such as Alation Enterprise Data Catalog, SAP Information Steward, Precisely Management Studio, the Qlik suite, SAC (SAP Analytics Cloud), SAP Datasphere, and HANA. Your role requires technical knowledge of these applications, including upgrades and maintenance, while collaborating effectively with global teams to build relationships with key stakeholders and drive business value through the use of data tools.

In this hybrid role based in Pune, India, your primary responsibilities will include serving as the main point of contact for technical support, defining and prioritizing the technical product backlog in alignment with business objectives, collaborating with cross-functional teams, and leading the planning, execution, and delivery of technical environments. Your expertise will be crucial in providing technical guidance, training, and support to end users, ensuring successful deployment and utilization of data management platforms.

To excel in this role, you should have 8+ years of experience in applications development and/or business intelligence/database work, with a focus on requirements analysis. A Bachelor's degree in Computer Science, Information Systems, or a related field is required, with a preference for a Master's degree. Your technical skills should include experience with Terraform and Azure DevOps for provisioning infrastructure, along with a deep understanding of data catalog concepts and data integration. Your ability to troubleshoot technical issues, translate business requirements into technical solutions, and communicate effectively with stakeholders will be essential. Experience with Agile/Scrum methodologies, strong analytical and problem-solving skills, and knowledge of data privacy considerations are also desired.

By joining the Air Products team, you will contribute to building a cleaner future through safe, end-to-end solutions and drive innovation in the industrial gas industry. If you are a self-motivated and detail-oriented professional with a passion for data management and analytics solutions, we invite you to consider this exciting opportunity to grow with us at Air Products and be part of our mission to reimagine what's possible in the world of energy and environmental sustainability.
Posted 1 week ago
1.0 - 4.0 years
2 - 4 Lacs
Hyderabad
Remote
Job Description: We have a vacancy with the details below.

Role: Analyst, Data Sourcing Metadata - Cloud
Designation: Analyst
Experience: 1-4 years
Notice Period: Immediate to 60 days (currently serving)
Work Mode: WFH (Remote)
Working Days: 5 days
Mandatory Skills: Data Management, SQL, cloud tools (AWS/Azure/GCP), ETL tools (Ab Initio, Collibra, Informatica), Data Catalog, Data Lineage, Data Integration, Data Dictionary, Maintenance, RCA, Issue Analysis

Required Skills/Knowledge:
- Bachelor's degree, preferably in Engineering or Computer Science, with more than 1 year of hands-on data management experience, or in lieu of a degree, more than 3 years of experience.
- Minimum of 1 year of experience in data management, focusing on metadata management, data governance, or data lineage, with exposure to cloud environments (AWS, Azure, or Google Cloud) and on-premise infrastructure.
- Basic understanding of metadata management concepts; familiarity with data cataloging tools (e.g., AWS Glue Data Catalog, Ab Initio, Collibra); basic proficiency in data lineage tracking tools (e.g., Apache Atlas, Ab Initio, Collibra); and understanding of data integration technologies (e.g., ETL, APIs, data pipelines).
- Good communication and collaboration skills, strong analytical thinking and problem-solving abilities, ability to work independently and manage multiple tasks, and attention to detail.

Desired Characteristics:
- AWS certifications such as AWS Cloud Practitioner or AWS Certified Data Analytics - Specialty.
- Familiarity with hybrid cloud environments (a combination of cloud and on-prem).
- Skilled in Ab Initio Metadata Hub development and support, including importers, extractors, Metadata Hub database extensions, technical lineage, QueryIT, Ab Initio graph development, Ab Initio Control Center, and Express IT.
- Experience with harvesting technical lineage and producing lineage diagrams.
- Familiarity with Unix, Linux, and Stonebranch, and with database platforms such as Oracle and Hive.
- Basic knowledge of SQL and data query languages for managing and retrieving metadata.
- Understanding of data governance frameworks (e.g., EDMC DCAM, GDPR compliance).
- Familiarity with Collibra.
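Given the posting's mention of the AWS Glue Data Catalog alongside metadata management, a small boto3 sketch of a typical metadata-completeness check is shown below: walk the catalog and flag tables with no description. The database name and region are placeholder assumptions.

```python
# Walk the AWS Glue Data Catalog and flag undocumented tables.
import boto3

glue = boto3.client("glue", region_name="ap-south-1")  # assumed region

paginator = glue.get_paginator("get_tables")
missing_docs = []

for page in paginator.paginate(DatabaseName="analytics_db"):  # placeholder DB
    for table in page["TableList"]:
        if not table.get("Description"):
            missing_docs.append(table["Name"])

print(f"{len(missing_docs)} tables lack a description:")
for name in missing_docs:
    print(" -", name)
```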
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
Hyderabad, Telangana
On-site
You have a total of 4-6 years of development/design experience, with a minimum of 3 years' experience in Big Data technologies on-premises and in the cloud. You should be proficient in Snowflake and possess strong SQL programming skills. The role requires strong experience with data modeling and schema design, as well as extensive experience using data warehousing tools like Snowflake, BigQuery, or Redshift and BI tools like Tableau, QuickSight, or Power BI (at least one is a must-have). You must also have experience with orchestration tools like Airflow and the transformation tool dbt.

Your responsibilities will include implementing ETL/ELT processes and building data pipelines, along with workflow management, job scheduling, and monitoring. You should have a good understanding of Data Governance, Security and Compliance, Data Quality, Metadata Management, Master Data Management, and Data Catalog, as well as cloud services (AWS), including IAM and log analytics. Excellent interpersonal and teamwork skills are essential, along with experience leading and mentoring other team members. Good knowledge of Agile Scrum and strong communication skills are also required.

At GlobalLogic, the culture prioritizes caring and inclusivity. You'll join an environment where people come first, fostering meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. Continuous learning and development opportunities are provided to help you grow personally and professionally. Meaningful work awaits you at GlobalLogic, where you'll have the chance to work on impactful projects and engage your curiosity and problem-solving skills. The organization values balance and flexibility, offering various career areas, roles, and work arrangements to help you achieve a perfect balance between work and life. GlobalLogic is a high-trust organization where integrity is key, ensuring a safe, reliable, and ethical global environment for all employees. Truthfulness, candor, and integrity are fundamental values upheld in everything GlobalLogic does.

GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner that collaborates with the world's largest and most forward-thinking companies. Leading the digital revolution since 2000, GlobalLogic helps create innovative digital products and experiences, transforming businesses and redefining industries through intelligent products, platforms, and services.
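The stack named above (Airflow orchestration, dbt transformation, Snowflake warehouse) is commonly wired together as in this minimal Airflow 2-style DAG sketch (the `schedule` argument assumes Airflow 2.4+). The script names, paths, and cron schedule are placeholder assumptions, not taken from the posting.

```python
# Minimal Airflow DAG: load raw data, run dbt models, then dbt tests.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_elt",
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",  # 02:00 daily; assumes Airflow 2.4+
    catchup=False,
) as dag:
    load = BashOperator(
        task_id="load_raw",
        bash_command="python load_to_snowflake.py",  # hypothetical script
    )
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",
    )
    test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )
    load >> transform >> test  # linear dependency chain
```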
Posted 2 weeks ago
0.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Ready to shape the future of work? At Genpact, we don't just adapt to change, we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Assistant Vice President (AVP), Google Cloud Platform Pre-Sales Solution Architect - Data & AI

About Genpact: Genpact (NYSE: G) is a global professional services firm delivering outcomes that transform businesses. With a proud 25-year history and over 125,000 diverse professionals in 30+ countries, we are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for our clients. We serve leading global enterprises, including the Fortune Global 500, leveraging our deep industry expertise, digital innovation, and cutting-edge capabilities in data, technology, and AI. Join our team to shape the future of business through intelligent operations and drive meaningful impact.

The Opportunity: Genpact is seeking a highly accomplished and visionary Assistant Vice President (AVP), Google Cloud Platform Pre-Sales Solution Architect, specializing in Data and Artificial Intelligence. This pivotal role will be instrumental in driving Genpact's growth in the GCP ecosystem by leading complex pre-sales engagements, designing transformative data and AI solutions, and fostering executive-level client relationships. You will operate at the intersection of business strategy and cutting-edge technology, translating intricate client challenges into compelling, implementable solutions on Google Cloud.

Responsibilities:
- Executive Solutioning & Strategy: Lead the end-to-end technical pre-sales cycle for Genpact's most strategic data and AI opportunities on GCP. Engage at the CXO level and with senior business and IT stakeholders to deeply understand their strategic objectives, pain points, and competitive landscape.
- Architectural Leadership: Design and articulate highly scalable, secure, and resilient enterprise-grade data and AI architectures on Google Cloud Platform. This includes expertise in BigQuery, Dataflow, Dataproc, Vertex AI (MLOps, Generative AI), Cloud AI services, Looker, Pub/Sub, Cloud Storage, Data Catalog, and other relevant GCP services.
- Value Proposition & Storytelling: Develop and deliver highly impactful presentations, workshops, and proofs of concept (POCs) that clearly demonstrate the business value and ROI of Genpact's data and AI solutions on GCP. Craft compelling narratives that resonate with both technical and non-technical audiences.
- Deal Ownership & Closure: Work collaboratively with sales teams to own the technical solutioning and commercial structuring of deals from qualification to closure. Lead the estimation, negotiation, and transition of deals to the delivery organization, ensuring alignment and seamless execution.
- Technical Deep Dive & Expertise: Provide deep technical expertise on Google Cloud's Data & AI portfolio, staying at the forefront of new service offerings, product roadmaps, and competitive differentiators. Act as the subject matter expert in client discussions and internal enablement.
- Cross-Functional Collaboration: Partner effectively with Genpact's sales, delivery, product development, and industry vertical teams to ensure that proposed solutions are innovative, deliverable, and aligned with market demands and Genpact's capabilities.
- Thought Leadership: Contribute to Genpact's market presence and intellectual property through whitepapers, conference presentations, industry events, and client advisory sessions. Position Genpact as a leader in data-driven transformation on GCP.
- Team Mentorship & Enablement: Provide mentorship and technical guidance to junior pre-sales architects and delivery teams, fostering a culture of continuous learning and excellence in GCP Data & AI.

Qualifications we seek in you!
Minimum Qualifications:
- Progressive experience in data, analytics, artificial intelligence, and cloud technologies, with a strong focus on technical pre-sales, solution architecture, or consulting leadership roles.
- Hands-on experience architecting, designing, and delivering complex data and AI solutions on Google Cloud Platform.
- Deep and demonstrable expertise across the Google Cloud Data & AI stack:
  Core data services: BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, Cloud SQL, Cloud Spanner, Composer, Data Catalog, Dataplex.
  AI/ML services: Vertex AI (including MLOps, Workbench, Training, Prediction, Explainable AI), Generative AI offerings (e.g., Gemini, Imagen), Natural Language API, Vision AI, Speech-to-Text, Dialogflow, Recommendation AI.
  BI & visualization: Looker, Data Studio.
- Proven track record of successfully leading and closing multi-million-dollar deals involving complex data and AI solutions on cloud platforms.
- Exceptional executive presence with the ability to engage, influence, and build trusted relationships with C-level executives and senior stakeholders.
- Strong commercial acumen and experience structuring complex deals, including pricing models, risk assessment, and contract negotiation.
- Outstanding communication, presentation, and storytelling skills, with the ability to articulate complex technical concepts as clear, concise business benefits.
- Demonstrated ability to lead cross-functional teams and drive consensus in dynamic and ambiguous environments.
- Bachelor's degree in Computer Science, Engineering, or a related technical field. Master's degree or MBA preferred.
- Google Cloud Professional Certifications are highly preferred (e.g., Professional Cloud Architect, Professional Data Engineer, Professional Machine Learning Engineer).
- Ability to travel as required to client sites and internal meetings.

Why join Genpact?
- Be a transformation leader: work at the cutting edge of AI, automation, and digital innovation.
- Make an impact: drive change for global enterprises and solve business challenges that matter.
- Accelerate your career: get hands-on experience, mentorship, and continuous learning opportunities.
- Work with the best: join 140,000+ bold thinkers and problem-solvers who push boundaries every day.
- Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Posted 2 weeks ago
3.0 - 6.0 years
5 - 11 Lacs
Mumbai, Maharashtra, India
On-site
Position: Data Analyst
Location: Mumbai
Experience Range: 3 to 6 years
Notice Period: Immediate to 30 days
Must-have skills: Data Analyst, Data Catalog, Data Analytics, Data Governance, along with Collibra.

What You'll Do: As part of the Data & Analytics Group (DAG) and reporting to the Head of the India DAG locally, the individual is responsible for the following:
- Review, analyze, and resolve data quality issues.
- Coordinate with data owners and other teams to identify the root cause of data quality issues and implement solutions.
- Coordinate the onboarding of data from various internal/external sources into the central repository.
- Work closely with Data Owners/Owner delegates on data analysis and developing data quality (DQ) rules.
- Work with IT on enhancing DQ controls.
- Perform end-to-end analysis of business processes, data flows, and data usage to improve business productivity through re-engineering and data governance.
- Manage the change control process and participate in user acceptance testing (UAT) activities.
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Principal Data Engineer at Skillsoft, you will play a crucial role in driving the advancement of enterprise data infrastructure by designing and implementing the logic and structure for how data is set up, cleansed, and stored for organizational usage. You will be responsible for developing a knowledge management strategy to support Skillsoft's analytical objectives across various business areas.

Your role will involve building robust systems and reusable code modules to solve problems, working with the latest open-source tools and platforms to build data products, and collaborating with Product Owners and cross-functional teams in an agile environment. Additionally, you will champion the standardization of processes for data elements used in analysis, establish forward-looking data and technology objectives, manage a small team through project deliveries, and design rich data visualizations and interactive tools to communicate complex ideas to stakeholders. Furthermore, you will evangelize the Enterprise Data Strategy & Execution Team mission, identify opportunities to influence decision-making with supporting data and analysis, and seek additional data resources that align with strategic objectives.

To qualify for this role, you should possess a degree in Data Engineering, Information Technology, CIS, CS, or a related field, along with 7+ years of experience in data engineering/data management. You should have expertise in building cloud data applications, cloud computing, data engineering/analysis programming languages, and SQL Server. Proficiency in data architecture and data modeling, and experience with technology stacks for metadata management, data governance, and data quality, are essential. Additionally, experience working cross-functionally across an enterprise organization and in an Agile methodology environment is preferred. Your strong business acumen, analytical skills, technical abilities, and problem-solving skills will be critical in this role. Experience with app and web analytics data, CRM, and ERP systems data is a plus.

Join us at Skillsoft and be part of our mission to democratize learning and help individuals unleash their edge. If you find this opportunity intriguing, we encourage you to apply and be a part of our team dedicated to leadership, learning, and success at Skillsoft. Thank you for considering this role.
Posted 3 weeks ago
12.0 - 17.0 years
27 - 35 Lacs
Madurai, Chennai
Work from Office
Dear Candidate, Greetings of the day!! I am Kantha, and I'm reaching out to you regarding an exciting opportunity with TechMango. You can connect with me on LinkedIn https://www.linkedin.com/in/kantha-m-ashwin-186ba3244/ Or Email: kanthasanmugam.m@techmango.net Techmango Technology Services is a full-scale software development services company founded in 2014 with a strong focus on emerging technologies. It holds a primary objective of delivering strategic solutions towards the goal of its business partners in terms of technology. We are a full-scale leading Software and Mobile App Development Company. Techmango is driven by the mantra Clients Vision is our Mission. We have a tendency to stick on to the current statement. To be the technologically advanced & most loved organization providing prime quality and cost-efficient services with a long-term client relationship strategy. We are operational in the USA - Chicago, Atlanta, Dubai - UAE, in India - Bangalore, Chennai, Madurai, Trichy. Job Title: GCP Data Architect Location: Madurai/Chennai Experience: 12+ Years Notice Period: Immediate About TechMango TechMango is a rapidly growing IT Services and SaaS Product company that helps global businesses with digital transformation, modern data platforms, product engineering, and cloud-first initiatives. We are seeking a GCP Data Architect to lead data modernization efforts for our prestigious client, Livingston, in a highly strategic project. Role Summary As a GCP Data Architect, you will be responsible for designing and implementing scalable, high-performance data solutions on Google Cloud Platform. You will work closely with stakeholders to define data architecture, implement data pipelines, modernize legacy data systems, and guide data strategy aligned with enterprise goals. 
Key Responsibilities:
- Lead end-to-end design and implementation of scalable data architecture on Google Cloud Platform (GCP)
- Define data strategy, standards, and best practices for cloud data engineering and analytics
- Develop data ingestion pipelines using Dataflow, Pub/Sub, Apache Beam, Cloud Composer (Airflow), and BigQuery (a minimal pipeline sketch follows below)
- Migrate on-prem or legacy systems to GCP (e.g., from Hadoop, Teradata, or Oracle to BigQuery)
- Architect data lakes, warehouses, and real-time data platforms
- Ensure data governance, security, lineage, and compliance (using tools like Data Catalog, IAM, DLP)
- Guide a team of data engineers and collaborate with business stakeholders, data scientists, and product managers
- Create documentation, high-level design (HLD) and low-level design (LLD), and oversee development standards
- Provide technical leadership in architectural decisions and future-proofing of the data ecosystem

Required Skills & Qualifications:
- 10+ years of experience in data architecture, data engineering, or enterprise data platforms
- Minimum 3-5 years of hands-on experience with GCP data services
- Proficiency in BigQuery, Cloud Storage, Dataflow, Pub/Sub, Composer, and Cloud SQL/Spanner; Python / Java / SQL; and data modeling (OLTP, OLAP, Star/Snowflake schema)
- Experience with real-time data processing, streaming architectures, and batch ETL pipelines
- Good understanding of IAM, networking, security models, and cost optimization on GCP
- Prior experience leading cloud data transformation projects
- Excellent communication and stakeholder management skills

Preferred Qualifications:
- GCP Professional Data Engineer / Architect certification
- Experience with Terraform, CI/CD, GitOps, and Looker / Data Studio / Tableau for analytics
- Exposure to AI/ML use cases and MLOps on GCP
- Experience working in agile environments and client-facing roles

What We Offer:
- Opportunity to work on large-scale data modernization projects with global clients
- A fast-growing company with a strong tech and people culture
- Competitive salary, benefits, and flexibility
- A collaborative environment that values innovation and leadership
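As a rough illustration of the ingestion responsibility above, here is a minimal Apache Beam sketch of a streaming Pub/Sub-to-BigQuery pipeline. It is not the client's actual design: the project, topic, table, and schema names are hypothetical placeholders, and a production pipeline would add windowing, dead-lettering, and error handling.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Streaming mode; in production this would run on the DataflowRunner.
options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        # Read raw bytes from a (hypothetical) Pub/Sub topic
        | "ReadEvents" >> beam.io.ReadFromPubSub(
            topic="projects/my-project/topics/events")
        # Decode and parse each message as JSON
        | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        # Append rows into a (hypothetical) BigQuery table
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            "my-project:analytics.events",
            schema="event_id:STRING,user_id:STRING,event_ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
    )
```

The same pipeline shape can be scheduled or orchestrated from Cloud Composer (Airflow) for batch variants, which is one common way such ingestion is operationalized.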
Posted 3 weeks ago
3.0 - 8.0 years
0 - 2 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Dear Candidate,

I represent the Talent Acquisition team of First Meridian Global, a Microsoft Gold Partner and Cloud Solution Provider serving Microsoft, Deloitte, and Adithya Birla. To give you a quick brief: we are a young, promising, and rapidly growing company with varied expertise across domains, professionals across multiple cities in India, and a global presence.

NOTE: Please reply confirming your interest in this opportunity along with your updated CV, and let us know your availability for a Teams meeting discussion, together with a soft copy of your PAN card (full name), a passport-size photo, your PF service history and LWD screenshot, and your DOB with an alternate contact number, to email: tuppari.pradeep@firstmeridianglobal.com

Please visit our webpage www.affluentgs.com for detailed information about us.

Role: Workday HCM Consultant
Notice Period: Immediate joiners only
Experience: 3+ years
Email: tuppari.pradeep@firstmeridianglobal.com
Location: Pan India
Work Mode: Hybrid
Interview Round: Virtual

Job Description:
The OCI Data Catalog PoC Specialist will be responsible for designing, executing, and documenting a Proof of Concept (PoC) for Oracle Cloud Infrastructure (OCI) Data Catalog as part of the client's broader Data Governance strategy. The specialist will demonstrate the capabilities of OCI Data Catalog, assess its fit for the client's requirements, and provide recommendations for production implementation.

Key Responsibilities:
- Lead the end-to-end delivery of the OCI Data Catalog PoC, including requirements gathering, solution design, configuration, and demonstration
- Collaborate with client stakeholders to understand data governance objectives, data sources, and cataloguing needs
- Configure and integrate OCI Data Catalog with relevant data sources (e.g., Oracle Autonomous Database, Object Storage, on-premises databases)
- Develop and execute test cases to showcase metadata harvesting, data lineage, search, classification, and data stewardship features (a hedged SDK sketch appears at the end of this posting)
- Integrate catalog output with the Marketplace application to export and automate metadata sharing
- Document PoC outcomes, lessons learned, and recommendations for next steps
- Provide knowledge transfer and training to client teams on OCI Data Catalog capabilities and usage
- Troubleshoot issues and liaise with Oracle Support as needed during the PoC

Required Skills & Experience:
- 3+ years of experience in data governance, data management, or cloud data solutions
- Hands-on experience with Oracle Cloud Infrastructure (OCI), especially OCI Data Catalog
- Familiarity with data catalog concepts: metadata management, data lineage, data classification, and stewardship
- Experience integrating data catalogs with various data sources (cloud and on-premises)
- Strong analytical, problem-solving, and communication skills
- Ability to document technical findings and present to both technical and business stakeholders

Please reply with a passport-size photograph and a PAN card copy, along with the details below, to tuppari.pradeep@firstmeridianglobal.com:
Full Name as per PAN Card:
PAN Card Number:
Current Organization (Employer):
Payroll Company:
Company Website:
Current CTC:
Expected CTC:
Reason for Change:
Current Location:
Notice Period:
Total Experience:
Relevant Experience:
Alternate Number:
Any Offers: If Yes - Details:
Skype ID:
DOB:
Highest Educational Qualification with College Name:
Alternate Phone Number:

Please find below the policies of the company. This is a full-time opportunity with our company. Leave benefits will start from day one of your joining.
Leave entitlement is 12 paid leaves plus 6 sick leaves, calculated on a pro-rata basis, plus 10-12 national holidays as per the holiday calendar. You and your immediate family will be enrolled in medical insurance. The notice period is 1 month from either party.
Regards,
Pradeep T
tuppari.pradeep@firstmeridianglobal.com
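For the OCI Data Catalog PoC described in this posting, a natural first smoke test is simply enumerating catalogs and their registered data assets. The sketch below is a hypothetical starting point, assuming the standard OCI Python SDK (pip install oci), a configured ~/.oci/config profile, and a placeholder compartment OCID; exact model fields may vary by SDK version.

```python
import oci

# Assumes a standard ~/.oci/config profile with valid credentials
config = oci.config.from_file()
client = oci.data_catalog.DataCatalogClient(config)

compartment_id = "ocid1.compartment.oc1..example"  # placeholder OCID

# Enumerate catalog instances in the compartment
for catalog in client.list_catalogs(compartment_id=compartment_id).data:
    print(f"Catalog: {catalog.display_name} ({catalog.lifecycle_state})")
    # Data assets registered/harvested into this catalog
    assets = client.list_data_assets(catalog_id=catalog.id).data
    for asset in assets.items:
        print(f"  Asset: {asset.display_name} (type key: {asset.type_key})")
```

In a real PoC, this listing step would be followed by configuring harvest jobs for each source and exporting the resulting metadata toward the Marketplace integration mentioned above.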
Posted 3 weeks ago
3.0 - 6.0 years
18 - 30 Lacs
Mumbai
Work from Office
Hello Connections, greetings from Teamware Solutions!

We are #Hiring for a top investment bank.

Position: Data Analyst
Location: Mumbai
Experience Range: 3 to 6 years
Notice Period: Immediate to 30 days
Must-have skills: data analysis, data catalog, data analytics, and data governance, along with Collibra.

What You'll Do:
As part of the Data & Analytics Group (DAG), reporting to the Head of the India DAG locally, the individual is responsible for the following:
1. Review, analyze, and resolve data quality issues across the IM Data Architecture.
2. Coordinate with data owners and other teams to identify the root cause of data quality issues and implement solutions.
3. Coordinate the onboarding of data from various internal/external sources into the central repository.
4. Work closely with Data Owners/Owner delegates on data analysis and the development of data quality (DQ) rules; work with IT on enhancing DQ controls.
5. Perform end-to-end analysis of business processes, data flows, and data usage to improve business productivity through re-engineering and data governance.
6. Manage the change control process and participate in user acceptance testing (UAT) activities.

What We're Looking For:
1. Minimum 3-6 years of experience in data analysis, data catalogs, and Collibra.
2. Experience in data analysis and profiling using SQL is a must (an illustrative profiling example follows below).
3. Knowledge of coding; Python is a plus.
4. Experience working with cataloging tools like Collibra.
5. Experience working with BI reporting tools like Tableau or Power BI is preferred.

Preferred Qualifications:
1. Bachelor's degree required; any other relevant academic coursework is a plus.
2. Fluent in English.

Apply now: francy.s@twsol.com
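Since SQL-based profiling is called out as a must-have, here is a small, self-contained illustration of the kind of completeness and uniqueness checks involved. The "clients" table and its columns are invented stand-ins; an in-memory SQLite database keeps the example runnable.

```python
import sqlite3

# In-memory stand-in for a real source table
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE clients (client_id INTEGER, country TEXT, email TEXT);
    INSERT INTO clients VALUES
        (1, 'IN', 'a@x.com'),
        (2, NULL, 'b@x.com'),
        (2, 'IN', NULL),
        (3, 'US', 'c@x.com');
""")

# Profile completeness (null counts) and uniqueness (duplicate keys)
profile_sql = """
SELECT
    COUNT(*)                             AS row_count,
    COUNT(*) - COUNT(country)            AS null_country,
    COUNT(*) - COUNT(email)              AS null_email,
    COUNT(*) - COUNT(DISTINCT client_id) AS duplicate_client_ids
FROM clients;
"""
rows, null_country, null_email, dup_ids = conn.execute(profile_sql).fetchone()
print(f"rows={rows} null_country={null_country} "
      f"null_email={null_email} duplicate_client_ids={dup_ids}")
# -> rows=4 null_country=1 null_email=1 duplicate_client_ids=1
```

Results like these typically feed DQ rules in a catalog tool such as Collibra, where thresholds, owners, and remediation are tracked.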
Posted 3 weeks ago
7.0 - 9.0 years
25 - 30 Lacs
Navi Mumbai
Work from Office
Key Responsibilities:
- Lead the end-to-end implementation of a data cataloging solution within AWS (preferably AWS Glue Data Catalog, or third-party tools like Apache Atlas, Alation, Collibra, etc.)
- Establish and manage metadata frameworks for structured and unstructured data assets in the data lake and data warehouse environments
- Integrate the data catalog with AWS-based storage solutions such as S3, Redshift, Athena, Glue, and EMR
- Collaborate with data governance/BPRG/IT project teams to define metadata standards, data classifications, and stewardship processes
- Develop automation scripts for catalog ingestion, lineage tracking, and metadata updates using Python, Lambda, PySpark, or custom Glue/EMR jobs (a minimal sketch follows below)
- Work closely with data engineers, data architects, and analysts to ensure metadata is accurate, relevant, and up to date
- Implement role-based access controls and ensure compliance with data privacy and regulatory standards
- Create detailed documentation and deliver training/workshops for internal stakeholders on using the data catalog
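As a hedged sketch of the catalog-ingestion automation described above, the snippet below registers a hypothetical S3 prefix with the AWS Glue Data Catalog via a crawler and then lists the harvested tables. The bucket, IAM role, database, and region names are placeholders, and a real job would poll the crawler's state rather than assume completion.

```python
import boto3

glue = boto3.client("glue", region_name="ap-south-1")

# Register an S3 prefix with the Glue Data Catalog via a crawler
glue.create_crawler(
    Name="datalake-raw-crawler",
    Role="arn:aws:iam::123456789012:role/GlueCrawlerRole",  # placeholder role
    DatabaseName="datalake_raw",
    Targets={"S3Targets": [{"Path": "s3://my-datalake/raw/"}]},
)
glue.start_crawler(Name="datalake-raw-crawler")

# After the crawler completes, harvested table metadata is queryable
for table in glue.get_tables(DatabaseName="datalake_raw")["TableList"]:
    print(table["Name"], table.get("StorageDescriptor", {}).get("Location"))
```

Lineage tracking and metadata updates would layer on top of this, for example via scheduled Lambda functions calling the Glue update_table API, as the responsibilities above suggest.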
Posted 4 weeks ago