Home
Jobs

28 PowerDesigner Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Greetings from TATA Consultancy Services! TCS is hiring for Data Modeler - Architect.

Experience Range: 10+ Years
Job Location: Hyderabad (Adibatla), Chennai

Job Summary: We are seeking a detail-oriented and analytical Data Modeler to design, implement, and maintain logical and physical data models that support business intelligence, data warehousing, and enterprise data integration needs. The ideal candidate will work closely with business analysts, data architects, and software engineers to ensure data is organized effectively and supports scalable, high-performance applications.

Required Skills:
• Strong understanding of relational, dimensional, and NoSQL data modeling techniques.
• Proficiency in data modeling tools (e.g., Erwin, Enterprise Architect, PowerDesigner, SQL Developer Data Modeler).
• Experience with advanced SQL and major database platforms (e.g., Oracle, SQL Server, PostgreSQL, MySQL).
• Familiarity with cloud data platforms (e.g., AWS Redshift, Google BigQuery, Azure SQL, Snowflake).
• Excellent communication and documentation skills.
• Knowledge of data governance and data quality principles.
• Experience with data warehousing concepts and tools (e.g., ETL pipelines, OLAP cubes).
• Familiarity with industry standards such as CDM (Common Data Model), FHIR, or other domain-specific models.

Key Responsibilities:
• Design and develop conceptual, logical, and physical data models.
• Translate business requirements into data structures that support analytics, reporting, and operational needs.
• Work with stakeholders to understand and document data needs and flows.
• Optimize and maintain existing data models for performance and scalability.
• Ensure data models are consistent with architectural guidelines and standards.
• Develop and maintain metadata repositories and data dictionaries.
• Collaborate with data architects and engineers to implement models within databases and data platforms.
• Assist in data quality analysis and improvement initiatives.
• Document data models and data mapping specifications.

Regards,
Bodhisatwa Ray
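The conceptual-to-physical modeling and OLAP concepts the TCS posting asks for can be illustrated with a minimal physical star schema. This is a sketch only; the table and column names are illustrative assumptions, not part of the role:

```python
import sqlite3

# Minimal physical data model for a star schema: one fact table
# referencing two dimension tables (illustrative names only).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_key  INTEGER PRIMARY KEY,
    customer_name TEXT NOT NULL,
    region        TEXT
);
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20240131
    full_date TEXT NOT NULL,
    year INTEGER, month INTEGER
);
CREATE TABLE fact_sales (
    sale_id      INTEGER PRIMARY KEY,
    customer_key INTEGER NOT NULL REFERENCES dim_customer(customer_key),
    date_key     INTEGER NOT NULL REFERENCES dim_date(date_key),
    amount       REAL NOT NULL
);
""")

conn.execute("INSERT INTO dim_customer VALUES (1, 'Acme Ltd', 'South')")
conn.execute("INSERT INTO dim_date VALUES (20240131, '2024-01-31', 2024, 1)")
conn.execute("INSERT INTO fact_sales VALUES (1, 1, 20240131, 2500.0)")

# Typical OLAP-style rollup: total sales by region and month.
row = conn.execute("""
    SELECT c.region, d.year, d.month, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer c ON f.customer_key = c.customer_key
    JOIN dim_date d     ON f.date_key = d.date_key
    GROUP BY c.region, d.year, d.month
""").fetchone()
print(row)  # ('South', 2024, 1, 2500.0)
```

The same design would be expressed as DDL in Oracle, SQL Server, or Snowflake; sqlite3 is used here only because it ships with Python.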

Posted 4 days ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Data Modeler JD
• Proven experience as a Data Modeler or in a similar role (8 years, depending on seniority level).
• Proficiency in data modeling tools (e.g., ER/Studio, Erwin, SAP PowerDesigner, or similar).
• Strong understanding of database technologies (e.g., SQL Server, Oracle, PostgreSQL, Snowflake).
• Experience with cloud data platforms (e.g., AWS, Azure, GCP).
• Familiarity with ETL processes and tools.
• Excellent knowledge of normalization and denormalization techniques.
• Strong analytical and problem-solving skills.
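The normalization/denormalization trade-off the JD mentions can be shown in miniature: normalized data stores each customer attribute once, while a denormalized reporting table copies those attributes onto every row to avoid joins. The entities and fields below are hypothetical:

```python
# Normalized form: customer attributes stored once, referenced by key.
customers = {1: {"name": "Acme Ltd", "city": "Hyderabad"}}
orders = [
    {"order_id": 101, "customer_id": 1, "amount": 250.0},
    {"order_id": 102, "customer_id": 1, "amount": 400.0},
]

# Denormalized form: customer attributes copied onto every order row,
# trading redundancy (and update anomalies) for join-free reporting.
denormalized = [{**o, **customers[o["customer_id"]]} for o in orders]

print(denormalized[0]["city"])  # Hyderabad
print(len(denormalized))        # 2
```

If "Acme Ltd" changes city, the normalized form needs one update while the denormalized form needs one per order row, which is why denormalization is usually reserved for read-heavy analytical models.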

Posted 5 days ago

Apply

13.0 years

0 Lacs

Andhra Pradesh, India

On-site


Summary about Organization
A career in our Advisory Acceleration Center is the natural extension of PwC’s leading global delivery capabilities. The team consists of highly skilled resources who help clients transform their business by adopting technology using bespoke strategy, operating models, processes, and planning. You’ll be at the forefront of helping organizations around the globe adopt innovative technology solutions that optimize business processes or enable scalable technology. Our team helps organizations transform their IT infrastructure and modernize applications and data management to help shape the future of business. An essential and strategic part of Advisory's multi-sourced, multi-geography Global Delivery Model, the Acceleration Centers are a dynamic, rapidly growing component of our business. The teams out of these Centers have achieved remarkable results in process quality and delivery capability, resulting in a loyal customer base and a reputation for excellence.

Job Description
We are seeking a Senior Data Architect with experience in the design, build, and optimization of complex data landscapes and legacy modernization projects. The ideal candidate will have deep expertise in database management, data modeling, cloud data solutions, and ETL (Extract, Transform, Load) processes. This role requires a strong leader capable of guiding data teams and driving the design and implementation of scalable data architectures.

Key areas of expertise include:
• Design and implement scalable and efficient data architectures to support business needs.
• Develop data models (conceptual, logical, and physical) that align with organizational goals.
• Lead database design and optimization efforts for structured and unstructured data.
• Establish ETL pipelines and data integration strategies for seamless data flow.
• Define data governance policies, including data quality, security, privacy, and compliance.
• Work closely with engineering, analytics, and business teams to understand requirements and deliver data solutions.
• Oversee cloud-based data solutions (AWS, Azure, GCP) and modern data warehouses (Snowflake, BigQuery, Redshift).
• Ensure high availability, disaster recovery, and backup strategies for critical databases.
• Evaluate and implement emerging data technologies, tools, and frameworks to improve efficiency.
• Conduct data audits, performance tuning, and troubleshooting to maintain optimal performance.

Qualifications
• Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field.
• 13+ years of experience in data modeling, including conceptual, logical, and physical data design.
• 5-8 years of experience with cloud data lake platforms such as AWS Lake Formation, Delta Lake, Snowflake, or Google BigQuery.
• Proven experience with NoSQL databases and data modeling techniques for non-relational data.
• Experience with data warehousing concepts, ETL/ELT processes, and big data frameworks (e.g., Hadoop, Spark).
• Hands-on experience delivering complex, multi-module projects in diverse technology ecosystems.
• Strong understanding of data governance, data security, and compliance best practices.
• Proficiency with data modeling tools (e.g., ER/Studio, ERwin, PowerDesigner).
• Excellent leadership and communication skills, with a proven ability to manage teams and collaborate with stakeholders.

Preferred Skills
• Experience with modern data architectures, such as data fabric or data mesh.
• Knowledge of graph databases and modeling for technologies like Neo4j.
• Proficiency with programming languages like Python, Scala, or Java.
• Understanding of CI/CD pipelines and DevOps practices in data engineering.
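The ETL/ELT processes this architect role covers follow a simple shape: extract from a source, transform (clean, standardize, quarantine bad records), and load into a target. A minimal sketch, with invented data and function names:

```python
# Minimal ETL sketch: extract -> transform -> load, with a
# data-quality quarantine for rows that fail conversion.
def extract():
    # Stand-in for reading from a source system, file, or API.
    return [{"id": "1", "amount": " 100.5 "},
            {"id": "2", "amount": "bad"}]

def transform(rows):
    clean, rejects = [], []
    for row in rows:
        try:
            clean.append({"id": int(row["id"]),
                          "amount": float(row["amount"].strip())})
        except ValueError:
            rejects.append(row)  # quarantined for data-quality review
    return clean, rejects

def load(rows, target):
    # Stand-in for a bulk insert into a warehouse table.
    target.extend(rows)
    return len(rows)

target_table = []
good, bad = transform(extract())
loaded = load(good, target_table)
print(loaded, len(bad))  # 1 1
```

In an ELT variant, the raw rows would be loaded first and the transform step would run inside the warehouse (e.g., as SQL), but the quarantine-and-audit pattern is the same.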

Posted 6 days ago

Apply

10.0 years

5 - 10 Lacs

Bengaluru

On-site

Location: Bangalore - Karnataka, India - EOIZ Industrial Area
Worker Type Reference: Regular - Permanent
Pay Rate Type: Salary
Career Level: T4(A)
Job ID: R-45392-2025

Description & Requirements

Introduction: A Career at HARMAN - HARMAN Technology Services (HTS)
We’re a global, multi-disciplinary team that’s putting the innovative power of technology to work and transforming tomorrow. At HARMAN DTS, you solve challenges by creating innovative solutions:
• Combine the physical and digital, making technology a more dynamic force to solve challenges and serve humanity’s needs.
• Work at the convergence of cross-channel UX, cloud, insightful data, IoT, and mobility.
• Empower companies to create new digital business models, enter new markets, and improve customer experiences.

About the Role
We are seeking an experienced Azure Data Architect who will develop and implement data engineering projects, including an enterprise data hub, data lakehouse, or big data platform.

What You Will Do
• Create data pipelines for more efficient and repeatable data science projects.
• Design and implement data architecture solutions that support business requirements and meet organizational needs.
• Collaborate with stakeholders to identify data requirements and develop data models and data flow diagrams.
• Work with cross-functional teams to ensure that data is integrated, transformed, and loaded effectively across different platforms and systems.
• Develop and implement data governance policies and procedures to ensure that data is managed securely and efficiently.
• Develop and maintain a deep understanding of data platforms, technologies, and tools, and evaluate new technologies and solutions to improve data management processes.
• Ensure compliance with regulatory and industry standards for data management and security.
• Develop and maintain data models, data warehouses, data lakes, and data marts to support data analysis and reporting.
• Ensure data quality, accuracy, and consistency across all data sources.
• Knowledge of ETL and data integration tools such as Informatica, Qlik Talend, and Apache NiFi.
• Experience with data modeling and design tools such as ERwin, PowerDesigner, or ER/Studio.
• Knowledge of data governance, data quality, and data security best practices.
• Experience with cloud computing platforms such as AWS, Azure, or Google Cloud Platform.
• Familiarity with programming languages such as Python, Java, or Scala.
• Experience with data visualization tools such as Tableau, Power BI, or QlikView.
• Understanding of analytics and machine learning concepts and tools.
• Knowledge of project management methodologies and tools to manage and deliver complex data projects.
• Skilled in relational database technologies such as MySQL, PostgreSQL, and Oracle, as well as NoSQL databases such as MongoDB and Cassandra.
• Strong expertise in cloud-based databases and storage such as AWS S3, AWS Glue, AWS Redshift, and the Iceberg/Parquet file formats.
• Knowledge of big data technologies such as Hadoop, Spark, Snowflake, Databricks, and Kafka to process and analyze large volumes of data.
• Proficient in data integration techniques to combine data from various sources into a centralized location.
• Strong data modeling, data warehousing, and data integration skills.

What You Need
• 10+ years of experience in the information technology industry with a strong focus on data engineering and architecture, preferably as a data engineering lead.
• 8+ years of data engineering or data architecture experience in successfully launching, planning, and executing advanced data projects.
• Experience working on RFPs/proposals, presales activities, and business development, and overseeing delivery of data projects, is highly desired.
• A master’s or bachelor’s degree in computer science, data science, information systems, operations research, statistics, applied mathematics, economics, engineering, or physics.
• Demonstrated ability to manage data projects and diverse teams.
• Experience in creating data and analytics solutions.
• Experience building data solutions in one or more domains: Industrial, Healthcare, Retail, Communication.
• Problem-solving, communication, and collaboration skills.
• Good knowledge of data visualization and reporting tools.
• Ability to normalize and standardize data as per key KPIs and metrics.
• Develop and implement data engineering projects, including a data lakehouse or big data platform.

What is Nice to Have
• Knowledge of Azure Purview is a must.
• Knowledge of Azure Data Fabric.
• Ability to define a reference data architecture.
• Snowflake SnowPro Advanced certification.
• Cloud-native data platform experience in the AWS or Microsoft stack.
• Knowledge of the latest data trends, including data fabric and data mesh.
• Robust knowledge of ETL, data transformation, and data standardization approaches.
• Key contributor to the growth of the COE, influencing client revenues through data and analytics solutions.
• Lead the selection, deployment, and management of data tools, platforms, and infrastructure.
• Ability to technically guide a team of data engineers.
• Oversee the design, development, and deployment of data solutions.
• Define, differentiate, and strategize new data services/offerings and create reference architecture assets.
• Drive partnerships with vendors on collaboration, capability building, go-to-market strategies, etc.
• Guide and inspire the organization about the business potential and opportunities around data.
• Network with domain experts.
• Collaborate with client teams to understand their business challenges and needs.
• Develop and propose data solutions tailored to client-specific requirements.
• Influence client revenues through innovative solutions and thought leadership.
• Lead client engagements from project initiation to deployment.
• Build and maintain strong relationships with key clients and stakeholders.
• Build reusable methodologies, pipelines, and models.

What Makes You Eligible
• Build and manage a high-performing team of data engineers and other specialists.
• Foster a culture of innovation and collaboration within the data team and across the organization.
• Demonstrated ability to work in diverse, cross-functional teams in a dynamic business environment.
• Candidates should be confident, energetic self-starters with strong communication skills.
• Candidates should exhibit superior presentation skills and the ability to present compelling solutions that guide and inspire.
• Provide technical guidance and mentorship to the data team.
• Collaborate with other stakeholders across the company to align the vision and goals.
• Communicate and present the data capabilities and achievements to clients and partners.
• Stay updated on the latest trends and developments in the data domain.

What We Offer
• Access to employee discounts on world-class HARMAN/Samsung products (JBL, Harman Kardon, AKG, etc.).
• Professional development opportunities through HARMAN University’s business and leadership academies.
• An inclusive and diverse work environment that fosters and encourages professional and personal development.
• “Be Brilliant” employee recognition and rewards program.

You Belong Here
HARMAN is committed to making every employee feel welcomed, valued, and empowered. No matter what role you play, we encourage you to share your ideas, voice your distinct perspective, and bring your whole self with you, all within a support-minded culture that celebrates what makes each of us unique. We also recognize that learning is a lifelong pursuit and want you to flourish. We proudly offer added opportunities for training, development, and continuing education, further empowering you to live the career you want.
About HARMAN: Where Innovation Unleashes Next-Level Technology
Ever since the 1920s, we’ve been amplifying the sense of sound. Today, that legacy endures, with integrated technology platforms that make the world smarter, safer, and more connected. Across automotive, lifestyle, and digital transformation solutions, we create innovative technologies that turn ordinary moments into extraordinary experiences. Our renowned automotive and lifestyle solutions can be found everywhere, from the music we play in our cars and homes to venues that feature today’s most sought-after performers, while our digital transformation solutions serve humanity by addressing the world’s ever-evolving needs and demands. Marketing our award-winning portfolio under 16 iconic brands, such as JBL, Mark Levinson, and Revel, we set ourselves apart by exceeding the highest engineering and design standards for our customers, our partners, and each other. If you’re ready to innovate and do work that makes a lasting impact, join our talent community today!

Important Notice: Recruitment Scams
Please be aware that HARMAN recruiters will always communicate with you from an '@harman.com' email address. We will never ask for payments, banking, credit card, personal financial information, or access to your LinkedIn/email account during the screening, interview, or recruitment process. If you are asked for such information or receive communication from an email address not ending in '@harman.com' about a job with HARMAN, please cease communication immediately and report the incident to us through: harmancareers@harman.com.

HARMAN is proud to be an Equal Opportunity / Affirmative Action employer.
All qualified applicants will receive consideration for employment without regard to race, religion, color, national origin, gender (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics.
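The HARMAN posting's requirement to "ensure data quality, accuracy, and consistency across all data sources" can be made concrete with a small rule-based check covering completeness, uniqueness, and range validation. The rules and field names below are illustrative, not from the posting:

```python
# Illustrative batch data-quality checks: completeness, uniqueness,
# and range validation over a list of rows.
rows = [
    {"id": 1, "price": 10.0},
    {"id": 2, "price": -5.0},  # fails the range check
    {"id": 2, "price": 7.5},   # fails the uniqueness check
]

def check_quality(rows):
    issues = []
    seen = set()
    for i, r in enumerate(rows):
        if r.get("id") is None:
            issues.append((i, "missing id"))        # completeness
        elif r["id"] in seen:
            issues.append((i, "duplicate id"))      # uniqueness
        else:
            seen.add(r["id"])
        if not (0 <= r.get("price", -1)):
            issues.append((i, "price out of range"))  # validity
    return issues

print(check_quality(rows))
# [(1, 'price out of range'), (2, 'duplicate id')]
```

Production stacks typically express the same rules declaratively in a data-quality framework, but the check-and-report loop is the underlying pattern.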

Posted 1 week ago

Apply


3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


At PwC, our people in infrastructure focus on designing and implementing robust, secure IT systems that support business operations. They enable the smooth functioning of networks, servers, and data centres to optimise performance and minimise downtime. In infrastructure engineering at PwC, you will focus on designing and implementing robust and scalable technology infrastructure solutions for clients. Your work will involve network architecture, server management, and cloud computing.

Data Modeler Job Description
We are looking for candidates with a strong background in data modeling, metadata management, and data system optimization. You will be responsible for analyzing business needs, developing long-term data models, and ensuring the efficiency and consistency of our data systems.

Key areas of expertise include:
• Analyze and translate business needs into long-term solution data models.
• Evaluate existing data systems and recommend improvements.
• Define rules to translate and transform data across data models.
• Work with the development team to create conceptual data models and data flows.
• Develop best practices for data coding to ensure consistency within the system.
• Review modifications of existing systems for cross-compatibility.
• Implement data strategies and develop physical data models.
• Update and optimize local and metadata models.
• Utilize canonical data modeling techniques to enhance data system efficiency.
• Evaluate implemented data systems for variances, discrepancies, and efficiency.
• Troubleshoot and optimize data systems to ensure optimal performance.
• Strong expertise in relational and dimensional modeling (OLTP, OLAP).
• Experience with data modeling tools (Erwin, ER/Studio, Visio, PowerDesigner).
• Proficiency in SQL and database management systems (Oracle, SQL Server, MySQL, PostgreSQL).
• Knowledge of NoSQL databases (MongoDB, Cassandra) and their data structures.
• Experience working with data warehouses and BI tools (Snowflake, Redshift, BigQuery, Tableau, Power BI).
• Familiarity with ETL processes, data integration, and data governance frameworks.
• Strong analytical, problem-solving, and communication skills.

Qualifications
• Bachelor's degree in Engineering or a related field.
• 3 to 5 years of experience in data modeling or a related field.
• 4+ years of hands-on experience with dimensional and relational data modeling.
• Expert knowledge of metadata management and related tools.
• Proficiency with data modeling tools such as Erwin, PowerDesigner, or Lucid.
• Knowledge of transactional databases and data warehouses.

Preferred Skills
• Experience in cloud-based data solutions (AWS, Azure, GCP).
• Knowledge of big data technologies (Hadoop, Spark, Kafka).
• Understanding of graph databases and real-time data processing.
• Certifications in data management, modeling, or cloud data engineering.
• Excellent communication and presentation skills.
• Strong interpersonal skills to collaborate effectively with various teams.
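The metadata-management side of the PwC Data Modeler role amounts to keeping a data dictionary in sync with the live schema. One way to sketch that is to derive the dictionary from the database itself; the schema below is a hypothetical example, using sqlite3 only because it ships with Python:

```python
import sqlite3

# Sketch of metadata management: derive a simple data dictionary
# from a live schema (table and column names are illustrative).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE customer (id INTEGER PRIMARY KEY, "
    "name TEXT NOT NULL, city TEXT)")

def data_dictionary(conn):
    dictionary = {}
    tables = [r[0] for r in conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'")]
    for table in tables:
        # PRAGMA table_info row: (cid, name, type, notnull, dflt_value, pk)
        cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
        dictionary[table] = [
            {"column": c[1], "type": c[2],
             "nullable": not c[3], "pk": bool(c[5])}
            for c in cols
        ]
    return dictionary

dd = data_dictionary(conn)
print(dd["customer"][1])
# {'column': 'name', 'type': 'TEXT', 'nullable': False, 'pk': False}
```

Against Oracle or SQL Server the same idea would query the catalog views (e.g., information_schema.columns) instead of a PRAGMA, then publish the result into the metadata repository.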

Posted 1 week ago

Apply

10.0 years

0 Lacs

Gurgaon, Haryana, India

On-site


Job Title: Lead Data Engineer

Job Summary: The Lead Data Engineer will provide technical expertise in the analysis, design, development, rollout, and maintenance of data integration initiatives. This role contributes to implementation methodologies and best practices, and works on project teams to analyse, design, develop, and deploy business intelligence / data integration solutions that support a variety of customer needs. The position oversees a team of Data Integration Consultants at various levels, ensuring their success on projects, goals, trainings, and initiatives through mentoring and coaching. Provides technical expertise in needs identification, data modelling, data movement and transformation mapping (source to target), automation, and testing strategies, translating business needs into technical solutions that adhere to established data guidelines and approaches from a business unit or project perspective, while leveraging best-fit technologies (e.g., cloud, Hadoop, NoSQL) to address business and environmental challenges. Works with stakeholders to identify and define self-service analytic solutions, dashboards, actionable enterprise business intelligence reports, and business intelligence best practices. Responsible for repeatable, lean, and maintainable enterprise BI design across organizations; partners effectively with the client team. We expect leadership not only in the conventional sense but also within the team: the candidate should exhibit qualities such as innovation, critical thinking, optimism/positivity, communication, time management, collaboration, problem-solving, acting independently, knowledge sharing, and approachability.

Responsibilities:
• Design, develop, test, and deploy data integration processes (batch or real-time) using tools such as Microsoft SSIS, Azure Data Factory, Databricks, Matillion, Airflow, Sqoop, etc.
• Create functional & technical documentation – e.g., ETL architecture documentation, unit testing plans and results, data integration specifications, data testing plans, etc.
• Take a consultative approach with business users, asking questions to understand the business need and deriving the data flow and the conceptual, logical, and physical data models from those needs.
• Perform data analysis to validate data models and to confirm the ability to meet business needs.
• May serve as project or DI lead, overseeing multiple consultants from various competencies.
• Stay current with emerging and changing technologies in order to recommend and implement beneficial technologies and approaches for data integration.
• Ensure proper execution/creation of methodology, training, templates, resource plans, and engagement review processes.
• Coach team members to ensure understanding of projects and tasks, providing effective feedback (critical and positive) and promoting growth opportunities where appropriate.
• Coordinate and consult with the project manager, client business staff, client technical staff, and project developers on data architecture best practices and anything else data-related at the project or business unit level.
• Architect, design, develop, and set direction for enterprise self-service analytic solutions, business intelligence reports, visualisations, and best-practice standards. Toolsets include, but are not limited to, SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau, and Qlik.
• Work with the report team to identify, design, and implement a reporting user experience that is consistent and intuitive across environments and report methods, defines security, and meets usability and scalability best practices.

Required Qualifications:
• 10 years of industry implementation experience with data integration tools such as AWS services (Redshift, Athena, Lambda, Glue, S3), ETL, etc.
• 5-8 years of management experience required; 5-8 years of consulting experience preferred
• Minimum of 5 years of data architecture, data modelling, or similar experience
• Bachelor's degree or equivalent experience; Master's degree preferred
• Strong data warehousing, OLTP systems, data integration, and SDLC experience
• Strong experience in orchestration, with working experience in cloud-native / 3rd-party ETL data load orchestration using Data Factory, HDInsight, Data Pipeline, Cloud Composer, or similar
• Understanding of and experience with major data architecture philosophies (Dimensional, ODS, Data Vault, etc.)
• Understanding of modern data warehouse capabilities and technologies such as real-time, cloud, and Big Data
• Understanding of on-premises and cloud infrastructure architectures (e.g., Azure, AWS, GCP)
• Strong experience with Agile processes (Scrum cadences, roles, deliverables) and working experience in Azure DevOps, JIRA, or similar, with experience in CI/CD using one or more code management platforms
• Strong Databricks experience required, including creating notebooks in PySpark
• Experience using major data modelling tools (e.g., ERwin, ER/Studio, PowerDesigner)
• Experience with major database platforms (e.g., SQL Server, Oracle, Azure Data Lake, Hadoop, Azure Synapse/SQL Data Warehouse, Snowflake, Redshift)
• 3-5 years of development experience in decision support / business intelligence environments using tools such as SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau, Looker, etc.
Preferred Skills & Experience:
• Knowledge of and working experience with data integration processes, such as data warehousing, EAI, etc.
• Experience providing estimates for data integration projects, including testing, documentation, and implementation
• Ability to analyse business requirements as they relate to data movement and transformation processes, and to research, evaluate, and recommend alternative solutions
• Ability to provide technical direction to other team members, including contractors and employees
• Ability to contribute to conceptual data modelling sessions that accurately define business processes independently of data structures, and then combine the two
• Proven experience leading team members, directly or indirectly, in completing high-quality major deliverables with superior results
• Demonstrated ability to serve as a trusted advisor who builds influence with client management beyond simply EDM
• Can create documentation and presentations such that they "stand on their own"
• Can advise sales on the evaluation of data integration efforts for new or existing client work
• Can contribute to internal/external data integration proofs of concept
• Demonstrates the ability to create new and innovative solutions to problems that have not previously been encountered
• Ability to work independently on projects as well as collaborate effectively across teams
• Must excel in a fast-paced, agile environment where critical thinking and strong problem-solving skills are required for success
• Strong team-building, interpersonal, analytical, and problem identification and resolution skills
• Experience working with multi-level business communities
• Can effectively utilise SQL and/or available BI tools to validate and elaborate business rules
• Demonstrates an understanding of EDM architectures and applies this knowledge when collaborating with the team to design effective solutions to business problems/issues
• Effectively influences and, at times, oversees business and data analysis activities to ensure sufficient understanding and quality of data
• Demonstrates a complete understanding of and utilises DSC methodology documents to efficiently complete assigned roles and associated tasks
• Deals effectively with all team members and builds strong working relationships/rapport with them
• Understands and leverages a multi-layer semantic model to ensure scalability, durability, and supportability of the analytic solution
• Understands modern data warehouse concepts (real-time, cloud, Big Data) and how to enable such capabilities from a reporting and analytics standpoint
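As a hedged illustration of the "utilise SQL to validate business rules" skill named in this posting, the sketch below uses Python's built-in sqlite3 module to check one hypothetical rule (every policy must reference an existing customer). The table and column names are invented for the example, not taken from any client system.

```python
import sqlite3

# In-memory database standing in for a real warehouse (schema is hypothetical).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customer (customer_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE policy   (policy_id INTEGER PRIMARY KEY, customer_id INTEGER, premium REAL);
    INSERT INTO customer VALUES (1, 'Asha'), (2, 'Ravi');
    INSERT INTO policy   VALUES (100, 1, 5000.0), (101, 2, 7500.0), (102, 3, 1200.0);
""")

# Business rule: every policy must reference a known customer.
# A LEFT JOIN with an IS NULL filter surfaces the violations.
violations = conn.execute("""
    SELECT p.policy_id
    FROM policy p
    LEFT JOIN customer c ON c.customer_id = p.customer_id
    WHERE c.customer_id IS NULL
""").fetchall()

print(violations)  # policy 102 references a customer that does not exist
```

The same LEFT JOIN / IS NULL pattern generalises to most referential and completeness rules, which is why it appears so often in data testing plans.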

Posted 1 week ago

Apply

5.0 - 10.0 years

0 Lacs

Gurgaon, Haryana, India

On-site


Job Title: Data Architect (C2)

Job Summary: The Data Architect will provide technical expertise in the analysis, design, development, rollout, and maintenance of enterprise data models and solutions. Provides technical expertise in needs identification, data modelling, data movement and transformation mapping (source to target), automation, and testing strategies, translating business needs into technical solutions that adhere to established data guidelines and approaches from a business unit or project perspective. Understands and leverages best-fit technologies (e.g., cloud, Hadoop, NoSQL) and approaches to address business and environmental challenges. Provides data understanding and coordinates data-related activities with other data management groups such as master data management, data governance, and metadata management. We expect leadership not only in the conventional sense but also within the team: the candidate should exhibit qualities such as innovation, critical thinking, optimism/positivity, communication, time management, collaboration, problem-solving, acting independently, knowledge sharing, and approachability.

Essential Duties:
• Design and develop conceptual, logical, and physical data models for building large-scale data lake and data warehouse solutions.
• Understand data integration processes (batch or real-time) using tools such as Informatica PowerCenter and/or Cloud, Microsoft SSIS, MuleSoft, DataStage, Sqoop, etc.
• Create functional & technical documentation – e.g., data integration architecture documentation, data models, data dictionaries, data integration specifications, data testing plans, etc.
• Collaborate with business users to analyse and test requirements.
• Stay current with emerging and changing technologies in order to recommend and implement beneficial technologies and approaches for data architecture.
• Assist with and support setting the data architecture direction (including the data movement approach, architecture/technology strategy, and any other data-related considerations that ensure business value), ensuring data architecture deliverables are developed, ensuring compliance with standards and guidelines, implementing the data architecture, and supporting technical developers at a project or business unit level.
• Coordinate and consult with the project manager, client business staff, client technical staff, and project developers on data architecture best practices and anything else data-related at the project or business unit level.

Education & Experience:
• 5-10 years of enterprise data modelling experience using major data modelling tools (e.g., ERwin, ER/Studio, PowerDesigner)
• Expert proficiency in data contracts, data modelling, and Data Vault 2.0
• Experience with major database platforms (e.g., Oracle, SQL Server, Teradata)
• Understanding of and experience with major data architecture philosophies (Dimensional, ODS, Data Vault, etc.)
• 3-5 years of management experience required; 3-5 years of consulting experience preferred
• Bachelor's degree or equivalent experience; Master's degree preferred
• Experience in data analysis and profiling
• Strong data warehousing and OLTP systems experience from a modelling and integration perspective
• Strong understanding of data integration best practices and concepts
• Strong development experience in Unix and/or Windows environments
• Strong SQL skills required; scripting (e.g., PL/SQL) preferred
• Strong knowledge of all phases of the system development life cycle
• Understanding of modern data warehouse capabilities and technologies such as real-time, cloud, and Big Data
• Understanding of on-premises and cloud infrastructure architectures (e.g., Azure, AWS, Google Cloud)

Preferred Skills & Experience:
• Comprehensive understanding of relational databases and technical documentation
• Ability to analyse business requirements as they relate to data movement and transformation processes, and to research, evaluate, and recommend alternative solutions
• Ability to transform business requirements into technical requirement documents
• Ability to run conceptual data modelling sessions that accurately define business processes independently of data structures, and then combine the two
• Can create documentation and presentations such that they "stand on their own"
• Demonstrates the ability to create new and innovative solutions to problems that have not previously been encountered
• Ability to work independently on projects as well as collaborate effectively across teams
• Must excel in a fast-paced, agile environment where critical thinking and strong problem-solving skills are required for success
• Strong team-building, interpersonal, analytical, and problem identification and resolution skills
• Experience working with multi-level business communities
• Can effectively utilise SQL and/or available BI tools to validate and elaborate business rules
• Demonstrates an understanding of EDM architectures and applies this knowledge when collaborating with the team to design effective solutions to business problems/issues
• Understands and leverages a multi-layer semantic model to ensure scalability, durability, and supportability of the analytic solution
• Understands modern data warehouse concepts (real-time, cloud, Big Data) and how to enable such capabilities from a reporting and analytics standpoint
• Demonstrated ability to serve as a trusted advisor who builds influence with client management beyond simply EDM
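For the Data Vault 2.0 proficiency this posting asks for, here is a minimal sketch of the core pattern: a hub holding unique business keys and a satellite appending descriptive history. It uses Python's built-in sqlite3; the entity, the MD5-based hash key, and the column names are illustrative assumptions, not any organisation's actual model.

```python
import hashlib
import sqlite3

def hash_key(business_key: str) -> str:
    # Data Vault-style surrogate key: a hash of the business key
    # (MD5 is one common choice; SHA-256 works the same way).
    return hashlib.md5(business_key.encode()).hexdigest()

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Hub: one row per unique business key, plus load metadata.
    CREATE TABLE hub_customer (
        customer_hk TEXT PRIMARY KEY,
        customer_bk TEXT NOT NULL,
        load_dts    TEXT NOT NULL,
        record_src  TEXT NOT NULL
    );
    -- Satellite: descriptive attributes; history is kept by load timestamp.
    CREATE TABLE sat_customer (
        customer_hk TEXT NOT NULL,
        load_dts    TEXT NOT NULL,
        name        TEXT,
        city        TEXT,
        PRIMARY KEY (customer_hk, load_dts)
    );
""")

hk = hash_key("CUST-001")
conn.execute("INSERT INTO hub_customer VALUES (?, ?, ?, ?)",
             (hk, "CUST-001", "2024-01-01T00:00:00", "CRM"))
conn.execute("INSERT INTO sat_customer VALUES (?, ?, ?, ?)",
             (hk, "2024-01-01T00:00:00", "Asha", "Hyderabad"))
# A later load appends a new satellite row instead of updating in place,
# preserving full history of the attribute change.
conn.execute("INSERT INTO sat_customer VALUES (?, ?, ?, ?)",
             (hk, "2024-06-01T00:00:00", "Asha", "Gurgaon"))

history = conn.execute(
    "SELECT load_dts, city FROM sat_customer WHERE customer_hk = ? ORDER BY load_dts",
    (hk,)).fetchall()
print(history)
```

The insert-only satellite is the design choice that gives Data Vault its auditability: no source change is ever lost, and point-in-time views are reconstructed by filtering on `load_dts`.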

Posted 1 week ago

Apply

6.0 - 8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Description: Data Modeler
Primary Skills: Data Modelling, Database Design, Erwin, Dimensional Modelling
Secondary Skills: Data Lake and Lakehouse design, Data Warehouse design
Location: Hyderabad
Industry: Insurance
Employment Type: Permanent
Functional Area: Solutions & Delivery
Experience: 6-8 Years

Are you a seasoned Data Modeller with expertise in Data & Analytics projects? Do you stay ahead of the curve with the latest technologies and eagerly expand your knowledge? Do you thrive in dynamic, fast-paced environments and have a passion for delivering high-quality solutions? If so, we have an exciting opportunity for you!

As a Data Modeler at ValueMomentum, you will be responsible for designing and implementing scalable, high-performance Modern Data & Analytics solutions in an agile environment. You will work closely with cross-functional teams to create reusable, testable, and sustainable data architectures that align with business needs. This role will directly impact the quality of data systems and analytics solutions, helping organizations unlock the full potential of their data.

Why ValueMomentum? Headquartered in New Jersey, US, ValueMomentum is one of the fastest-growing software & solutions firms focused on the Healthcare, Insurance, and Financial Services domains. Our industry focus, expertise in technology backed by R&D, and our customer-first approach uniquely position us to deliver value and drive momentum for our customers' initiatives. At ValueMomentum, we value continuous learning, innovation, and collaboration. As a Data Modeler, you will have the opportunity to work with cutting-edge technologies and make a significant impact on our data-driven solutions. You will collaborate with a talented team of professionals, shape the future of data architecture, and contribute to the success of our clients in industries that are transforming rapidly.

If you're ready to take your career to the next level, apply today to join our dynamic team and help us drive innovation in the world of Modern Data & Analytics!

Key Responsibilities:
• Design and develop conceptual, logical, and physical data models that meet business requirements and strategic initiatives.
• Design, develop, and maintain insurance data models.
• Collaborate with business stakeholders and data engineers to understand data needs and translate them into effective data models.
• Analyze and evaluate existing data models and databases to identify opportunities for optimization, standardization, and improvement.
• Define data standards, naming conventions, and data governance policies to ensure the consistency, integrity, and quality of data models.
• Develop and maintain documentation of data models, data dictionaries, STTM, and metadata to facilitate understanding and usage of data assets.
• Implement best practices for data modelling, including normalization, denormalization, indexing, partitioning, and optimization techniques.
• Work closely with database administrators to ensure proper implementation and maintenance of data models in database management systems.
• Stay abreast of industry trends, emerging technologies, and best practices in data modelling, database design, and data management.

Must-have Skills:
• 6-8 years of hands-on experience in data modelling, including database design and dimensional modelling (star/snowflake schemas).
• Hands-on experience implementing data models for the Policy, Claims, and Finance subject areas within the Property & Casualty (P&C) Insurance domain.
• Proficiency in data modelling tools such as Erwin, ER/Studio, or PowerDesigner.
• Strong SQL skills and experience working with relational databases, primarily SQL Server.
• Exposure to design principles and best practices for Data Lake and Lakehouse architectures.
• Experience with big data platforms (e.g., Spark).
• Strong documentation skills, including data dictionaries, STTM, ER models, etc.
• Familiarity with data warehouse design principles, ETL processes, and data integration techniques.
• Knowledge of cloud-based data platforms and infrastructure.

Nice-to-have Skills:
• Expertise in advanced data modelling techniques for real-time/near-real-time data solutions.
• Experience with NoSQL data modelling.
• Hands-on experience with any BI tool.

Your Key Accountabilities:
• Collaborate with cross-functional teams to align data models with business and technical requirements.
• Define and enforce best practices for data modelling and database design.
• Provide technical guidance on database optimization and performance tuning.
• Draft technical guidelines, documentation, and data dictionaries to standardize data modelling practices.

What We Offer:
• Career Advancement: individual career development, coaching, and mentoring programs for professional and leadership skill development, plus comprehensive training and certification programs.
• Performance Management: goal setting, continuous feedback, and year-end appraisal, with reward & recognition for extraordinary performers.
• Benefits: comprehensive health benefits, wellness and fitness programs, and paid time off and holidays.
• Culture: a highly transparent organization with an open-door policy and a vibrant culture.

If you're enthusiastic about Data & Analytics and eager to make an impact through your expertise, we invite you to join us. Apply now and become part of a team that's driving the future of data-driven decision-making!
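As a hedged sketch of the star-schema dimensional modelling this posting requires: a fact table of measures joined to dimension tables of descriptive context. It uses Python's built-in sqlite3, and the P&C-flavoured table names (`fact_claim_payment`, `dim_policy`) are invented for illustration, not taken from any client model.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Dimension tables: descriptive context for slicing and filtering.
    CREATE TABLE dim_policy (policy_key INTEGER PRIMARY KEY, policy_no TEXT, line_of_business TEXT);
    CREATE TABLE dim_date   (date_key INTEGER PRIMARY KEY, cal_date TEXT, year INTEGER);
    -- Fact table: numeric measures at the grain of one claim payment.
    CREATE TABLE fact_claim_payment (
        policy_key  INTEGER REFERENCES dim_policy(policy_key),
        date_key    INTEGER REFERENCES dim_date(date_key),
        paid_amount REAL
    );
    INSERT INTO dim_policy VALUES (1, 'POL-001', 'Auto'), (2, 'POL-002', 'Property');
    INSERT INTO dim_date   VALUES (20240101, '2024-01-01', 2024);
    INSERT INTO fact_claim_payment VALUES
        (1, 20240101, 1500.0), (1, 20240101, 250.0), (2, 20240101, 900.0);
""")

# The typical star-schema query shape: aggregate the fact table,
# grouped by an attribute of a joined dimension.
totals = conn.execute("""
    SELECT p.line_of_business, SUM(f.paid_amount)
    FROM fact_claim_payment f
    JOIN dim_policy p ON p.policy_key = f.policy_key
    GROUP BY p.line_of_business
    ORDER BY p.line_of_business
""").fetchall()
print(totals)
```

Declaring the grain of the fact table first ("one row per claim payment") is what keeps measures additive; a snowflake variant would further normalize the dimensions into sub-dimension tables.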

Posted 1 week ago

Apply

10.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Linkedin logo

Job Title: Lead Data Engineer Job Summary The Lead Data Engineer will provide technical expertise in analysis, design, development, rollout and maintenance of data integration initiatives. This role will contribute to implementation methodologies and best practices, as well as work on project teams to analyse, design, develop and deploy business intelligence / data integration solutions to support a variety of customer needs. This position oversees a team of Data Integration Consultants at various levels, ensuring their success on projects, goals, trainings and initiatives though mentoring and coaching. Provides technical expertise in needs identification, data modelling, data movement and transformation mapping (source to target), automation and testing strategies, translating business needs into technical solutions with adherence to established data guidelines and approaches from a business unit or project perspective whilst leveraging best fit technologies (e.g., cloud, Hadoop, NoSQL, etc.) and approaches to address business and environmental challenges Works with stakeholders to identify and define self-service analytic solutions, dashboards, actionable enterprise business intelligence reports and business intelligence best practices. Responsible for repeatable, lean and maintainable enterprise BI design across organizations. Effectively partners with client team. Leadership not only in the conventional sense, but also within a team we expect people to be leaders. Candidate should elicit leadership qualities such as Innovation, Critical thinking, optimism/positivity, Communication, Time Management, Collaboration, Problem-solving, Acting Independently, Knowledge sharing and Approachable. Responsibilities Design, develop, test, and deploy data integration processes (batch or real-time) using tools such as Microsoft SSIS, Azure Data Factory, Databricks, Matillion, Airflow, Sqoop, etc. Create functional & technical documentation – e.g. 
ETL architecture documentation, unit testing plans and results, data integration specifications, data testing plans, etc. Provide a consultative approach with business users, asking questions to understand the business need and deriving the data flow, conceptual, logical, and physical data models based on those needs. Perform data analysis to validate data models and to confirm ability to meet business needs. May serve as project or DI lead, overseeing multiple consultants from various competencies Stays current with emerging and changing technologies to best recommend and implement beneficial technologies and approaches for Data Integration Ensures proper execution/creation of methodology, training, templates, resource plans and engagement review processes Coach team members to ensure understanding on projects and tasks, providing effective feedback (critical and positive) and promoting growth opportunities when appropriate. Coordinate and consult with the project manager, client business staff, client technical staff and project developers in data architecture best practices and anything else that is data related at the project or business unit levels Architect, design, develop and set direction for enterprise self-service analytic solutions, business intelligence reports, visualisations and best practice standards. Toolsets include but not limited to: SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau and Qlik. Work with report team to identify, design and implement a reporting user experience that is consistent and intuitive across environments, across report methods, defines security and meets usability and scalability best practices. Required Qualifications 10 Years industry implementation experience with data integration tools such as AWS services Redshift, Athena, Lambda, Glue, S3, ETL, etc. 
5-8 years of management experience required 5-8 years consulting experience preferred Minimum of 5 years of data architecture, data modelling or similar experience Bachelor’s degree or equivalent experience, Master’s Degree Preferred Strong data warehousing, OLTP systems, data integration and SDLC Strong experience in orchestration & working experience cloud native / 3 rd party ETL data load orchestration Understanding and experience with major Data Architecture philosophies (Dimensional, ODS, Data Vault, etc.) Understanding of on premises and cloud infrastructure architectures (e.g. Azure, AWS, GCP) Strong experience in Agile Process (Scrum cadences, Roles, deliverables) & working experience in either Azure DevOps, JIRA or Similar with Experience in CI/CD using one or more code management platforms Strong databricks experience required to create notebooks in pyspark Experience using major data modelling tools (examples: ERwin, ER/Studio, PowerDesigner, etc.) Experience with major database platforms (e.g. SQL Server, Oracle, Azure Data Lake, Hadoop, Azure Synapse/SQL Data Warehouse, Snowflake, Redshift etc.) Strong experience in orchestration & working experience in either Data Factory or HDInsight or Data Pipeline or Cloud composer or Similar Understanding and experience with major Data Architecture philosophies (Dimensional, ODS, Data Vault, etc.) Understanding of modern data warehouse capabilities and technologies such as real-time, cloud, Big Data. Understanding of on premises and cloud infrastructure architectures (e.g. Azure, AWS, GCP) Strong experience in Agile Process (Scrum cadences, Roles, deliverables) & working experience in either Azure DevOps, JIRA or Similar with Experience in CI/CD using one or more code management platforms 3-5 years’ development experience in decision support / business intelligence environments utilizing tools such as SQL Server Analysis and Reporting Services, Microsoft’s Power BI, Tableau, looker etc. 
Preferred Skills & Experience Knowledge and working experience with Data Integration processes, such as Data Warehousing, EAI, etc. Experience in providing estimates for the Data Integration projects including testing, documentation, and implementation Ability to analyse business requirements as they relate to the data movement and transformation processes, research, evaluation and recommendation of alternative solutions. Ability to provide technical direction to other team members including contractors and employees. Ability to contribute to conceptual data modelling sessions to accurately define business processes, independently of data structures and then combines the two together. Proven experience leading team members, directly or indirectly, in completing high-quality major deliverables with superior results Demonstrated ability to serve as a trusted advisor that builds influence with client management beyond simply EDM. Can create documentation and presentations such that the they “stand on their own” Can advise sales on evaluation of Data Integration efforts for new or existing client work. Can contribute to internal/external Data Integration proof of concepts. Demonstrates ability to create new and innovative solutions to problems that have previously not been encountered. Ability to work independently on projects as well as collaborate effectively across teams Must excel in a fast-paced, agile environment where critical thinking and strong problem solving skills are required for success Strong team building, interpersonal, analytical, problem identification and resolution skills Experience working with multi-level business communities Can effectively utilise SQL and/or available BI tool to validate/elaborate business rules. Demonstrates an understanding of EDM architectures and applies this knowledge in collaborating with the team to design effective solutions to business problems/issues. 
Effectively influences and, at times, oversees business and data analysis activities to ensure sufficient understanding and quality of data. Demonstrates a complete understanding of and utilises DSC methodology documents to efficiently complete assigned roles and associated tasks. Deals effectively with all team members and builds strong working relationships/rapport with them. Understands and leverages a multi-layer semantic model to ensure scalability, durability, and supportability of the analytic solution. Understands modern data warehouse concepts (real-time, cloud, Big Data) and how to enable such capabilities from a reporting and analytic stand-point. Demonstrated ability to serve as a trusted advisor that builds influence with client management beyond simply EDM. Show more Show less

Posted 1 week ago

Apply

10.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Linkedin logo

Job Title: Lead Data Engineer Job Summary The Lead Data Engineer will provide technical expertise in analysis, design, development, rollout and maintenance of data integration initiatives. This role will contribute to implementation methodologies and best practices, as well as work on project teams to analyse, design, develop and deploy business intelligence / data integration solutions to support a variety of customer needs. This position oversees a team of Data Integration Consultants at various levels, ensuring their success on projects, goals, trainings and initiatives though mentoring and coaching. Provides technical expertise in needs identification, data modelling, data movement and transformation mapping (source to target), automation and testing strategies, translating business needs into technical solutions with adherence to established data guidelines and approaches from a business unit or project perspective whilst leveraging best fit technologies (e.g., cloud, Hadoop, NoSQL, etc.) and approaches to address business and environmental challenges Works with stakeholders to identify and define self-service analytic solutions, dashboards, actionable enterprise business intelligence reports and business intelligence best practices. Responsible for repeatable, lean and maintainable enterprise BI design across organizations. Effectively partners with client team. Leadership not only in the conventional sense, but also within a team we expect people to be leaders. Candidate should elicit leadership qualities such as Innovation, Critical thinking, optimism/positivity, Communication, Time Management, Collaboration, Problem-solving, Acting Independently, Knowledge sharing and Approachable. Responsibilities Design, develop, test, and deploy data integration processes (batch or real-time) using tools such as Microsoft SSIS, Azure Data Factory, Databricks, Matillion, Airflow, Sqoop, etc. Create functional & technical documentation – e.g. 
ETL architecture documentation, unit testing plans and results, data integration specifications, data testing plans, etc. Provide a consultative approach with business users, asking questions to understand the business need and deriving the data flow, conceptual, logical, and physical data models based on those needs. Perform data analysis to validate data models and to confirm ability to meet business needs. May serve as project or DI lead, overseeing multiple consultants from various competencies Stays current with emerging and changing technologies to best recommend and implement beneficial technologies and approaches for Data Integration Ensures proper execution/creation of methodology, training, templates, resource plans and engagement review processes Coach team members to ensure understanding on projects and tasks, providing effective feedback (critical and positive) and promoting growth opportunities when appropriate. Coordinate and consult with the project manager, client business staff, client technical staff and project developers in data architecture best practices and anything else that is data related at the project or business unit levels Architect, design, develop and set direction for enterprise self-service analytic solutions, business intelligence reports, visualisations and best practice standards. Toolsets include but not limited to: SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau and Qlik. Work with report team to identify, design and implement a reporting user experience that is consistent and intuitive across environments, across report methods, defines security and meets usability and scalability best practices. Required Qualifications 10 Years industry implementation experience with data integration tools such as AWS services Redshift, Athena, Lambda, Glue, S3, ETL, etc. 
5-8 years of management experience required 5-8 years consulting experience preferred Minimum of 5 years of data architecture, data modelling or similar experience Bachelor’s degree or equivalent experience, Master’s Degree Preferred Strong data warehousing, OLTP systems, data integration and SDLC Strong experience in orchestration & working experience cloud native / 3 rd party ETL data load orchestration Understanding and experience with major Data Architecture philosophies (Dimensional, ODS, Data Vault, etc.) Understanding of on premises and cloud infrastructure architectures (e.g. Azure, AWS, GCP) Strong experience in Agile Process (Scrum cadences, Roles, deliverables) & working experience in either Azure DevOps, JIRA or Similar with Experience in CI/CD using one or more code management platforms Strong databricks experience required to create notebooks in pyspark Experience using major data modelling tools (examples: ERwin, ER/Studio, PowerDesigner, etc.) Experience with major database platforms (e.g. SQL Server, Oracle, Azure Data Lake, Hadoop, Azure Synapse/SQL Data Warehouse, Snowflake, Redshift etc.) Strong experience in orchestration & working experience in either Data Factory or HDInsight or Data Pipeline or Cloud composer or Similar Understanding and experience with major Data Architecture philosophies (Dimensional, ODS, Data Vault, etc.) Understanding of modern data warehouse capabilities and technologies such as real-time, cloud, Big Data. Understanding of on premises and cloud infrastructure architectures (e.g. Azure, AWS, GCP) Strong experience in Agile Process (Scrum cadences, Roles, deliverables) & working experience in either Azure DevOps, JIRA or Similar with Experience in CI/CD using one or more code management platforms 3-5 years’ development experience in decision support / business intelligence environments utilizing tools such as SQL Server Analysis and Reporting Services, Microsoft’s Power BI, Tableau, looker etc. 
Preferred Skills & Experience: Knowledge of and working experience with Data Integration processes, such as Data Warehousing, EAI, etc. Experience in providing estimates for Data Integration projects, including testing, documentation, and implementation. Ability to analyse business requirements as they relate to data movement and transformation processes, and to research, evaluate, and recommend alternative solutions. Ability to provide technical direction to other team members, including contractors and employees. Ability to contribute to conceptual data modelling sessions to accurately define business processes, independently of data structures, and then combine the two together. Proven experience leading team members, directly or indirectly, in completing high-quality major deliverables with superior results. Demonstrated ability to serve as a trusted advisor who builds influence with client management beyond simply EDM. Can create documentation and presentations such that they “stand on their own”. Can advise sales on evaluation of Data Integration efforts for new or existing client work. Can contribute to internal/external Data Integration proofs of concept. Demonstrates the ability to create new and innovative solutions to problems that have not previously been encountered. Ability to work independently on projects as well as collaborate effectively across teams. Must excel in a fast-paced, agile environment where critical thinking and strong problem-solving skills are required for success. Strong team-building, interpersonal, analytical, and problem identification and resolution skills. Experience working with multi-level business communities. Can effectively utilise SQL and/or an available BI tool to validate/elaborate business rules. Demonstrates an understanding of EDM architectures and applies this knowledge in collaborating with the team to design effective solutions to business problems/issues. 
Effectively influences and, at times, oversees business and data analysis activities to ensure sufficient understanding and quality of data. Demonstrates a complete understanding of, and utilises, DSC methodology documents to efficiently complete assigned roles and associated tasks. Deals effectively with all team members and builds strong working relationships/rapport with them. Understands and leverages a multi-layer semantic model to ensure scalability, durability, and supportability of the analytic solution. Understands modern data warehouse concepts (real-time, cloud, Big Data) and how to enable such capabilities from a reporting and analytics standpoint.
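The "utilise SQL ... to validate/elaborate business rules" item above can be pictured with a minimal, self-contained sketch. The tables, columns, and the rule itself are invented for illustration; a real engagement would run equivalent checks against the client's own database or BI tool:

```python
import sqlite3

# Hypothetical example: validating a business rule with plain SQL.
# All table and column names here are invented for the sketch.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE orders (order_id INTEGER PRIMARY KEY, order_total REAL);
    CREATE TABLE order_lines (order_id INTEGER, amount REAL);
    INSERT INTO orders VALUES (1, 30.0), (2, 15.0);
    INSERT INTO order_lines VALUES (1, 10.0), (1, 20.0), (2, 10.0);
""")

# Business rule: an order's stored total must equal the sum of its line items.
# The query returns only the violations (order 2 stores 15.0, lines sum to 10.0).
violations = cur.execute("""
    SELECT o.order_id
    FROM orders o
    JOIN order_lines l ON l.order_id = o.order_id
    GROUP BY o.order_id, o.order_total
    HAVING ABS(o.order_total - SUM(l.amount)) > 0.005
""").fetchall()
print(violations)
```

The pattern (a query that returns only rule violations) scales to any rule that can be phrased in SQL, and an empty result set doubles as an automated data-quality test.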

Posted 1 week ago

Apply

3.0 years

0 Lacs

Andhra Pradesh, India

On-site


At PwC, our people in infrastructure focus on designing and implementing robust, secure IT systems that support business operations. They enable the smooth functioning of networks, servers, and data centres to optimise performance and minimise downtime. In infrastructure engineering at PwC, you will focus on designing and implementing robust and scalable technology infrastructure solutions for clients. Your work will involve network architecture, server management, and cloud computing experience. Data Modeler Job Description: Looking for candidates with a strong background in data modeling, metadata management, and data system optimization. You will be responsible for analyzing business needs, developing long-term data models, and ensuring the efficiency and consistency of our data systems. Key areas of expertise include: Analyze and translate business needs into long-term solution data models. Evaluate existing data systems and recommend improvements. Define rules to translate and transform data across data models. Work with the development team to create conceptual data models and data flows. Develop best practices for data coding to ensure consistency within the system. Review modifications of existing systems for cross-compatibility. Implement data strategies and develop physical data models. Update and optimize local and metadata models. Utilize canonical data modeling techniques to enhance data system efficiency. Evaluate implemented data systems for variances, discrepancies, and efficiency. Troubleshoot and optimize data systems to ensure optimal performance. Strong expertise in relational and dimensional modeling (OLTP, OLAP). Experience with data modeling tools (Erwin, ER/Studio, Visio, PowerDesigner). Proficiency in SQL and database management systems (Oracle, SQL Server, MySQL, PostgreSQL). Knowledge of NoSQL databases (MongoDB, Cassandra) and their data structures. 
Experience working with data warehouses and BI tools (Snowflake, Redshift, BigQuery, Tableau, Power BI). Familiarity with ETL processes, data integration, and data governance frameworks. Strong analytical, problem-solving, and communication skills. Qualifications: Bachelor's degree in Engineering or a related field. 3 to 5 years of experience in data modeling or a related field. 4+ years of hands-on experience with dimensional and relational data modeling. Expert knowledge of metadata management and related tools. Proficiency with data modeling tools such as Erwin, PowerDesigner, or Lucid. Knowledge of transactional databases and data warehouses. Preferred Skills: Experience in cloud-based data solutions (AWS, Azure, GCP). Knowledge of big data technologies (Hadoop, Spark, Kafka). Understanding of graph databases and real-time data processing. Certifications in data management, modeling, or cloud data engineering. Excellent communication and presentation skills. Strong interpersonal skills to collaborate effectively with various teams.
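The relational and dimensional modeling expertise listed above can be illustrated with a minimal star-schema sketch: one fact table joined to a date dimension and rolled up the OLAP way. All table and column names are invented for the example, and SQLite stands in for whichever platform (Oracle, SQL Server, Snowflake, etc.) a real model would target:

```python
import sqlite3

# Minimal star schema: a fact table keyed to a date dimension.
# Names and data are invented purely for illustration.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
    CREATE TABLE fact_sales (date_key INTEGER REFERENCES dim_date, amount REAL);
    INSERT INTO dim_date VALUES (20240101, 2024, 1), (20240201, 2024, 2);
    INSERT INTO fact_sales VALUES (20240101, 100.0), (20240101, 50.0), (20240201, 75.0);
""")

# The dimensional query: roll the fact up by attributes of the dimension.
monthly = cur.execute("""
    SELECT d.year, d.month, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    GROUP BY d.year, d.month
    ORDER BY d.month
""").fetchall()
print(monthly)
```

In a real dimensional model the date key would typically be a surrogate key and the dimension would carry many more attributes (day of week, fiscal period, holiday flags, and so on), but the join-then-aggregate shape of the query stays the same.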

Posted 1 week ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site


Job Description The Lead Data Engineer will provide technical expertise in the analysis, design, development, rollout, and maintenance of data integration initiatives. This role will contribute to implementation methodologies and best practices, as well as work on project teams to analyze, design, develop, and deploy business intelligence / data integration solutions to support a variety of customer needs. This position oversees a team of Data Integration Consultants at various levels, ensuring their success on projects, goals, trainings, and initiatives through mentoring and coaching. Provides technical expertise in needs identification, data modelling, data movement and transformation mapping (source to target), automation, and testing strategies, translating business needs into technical solutions with adherence to established data guidelines and approaches from a business unit or project perspective, whilst leveraging best-fit technologies (e.g., cloud, Hadoop, NoSQL, etc.) and approaches to address business and environmental challenges. Works with stakeholders to identify and define self-service analytic solutions, dashboards, actionable enterprise business intelligence reports, and business intelligence best practices. Responsible for repeatable, lean, and maintainable enterprise BI design across organizations. Effectively partners with the client team. Leadership not only in the conventional sense: within a team, we expect people to be leaders. The candidate should exhibit leadership qualities such as innovation, critical thinking, optimism/positivity, communication, time management, collaboration, problem-solving, acting independently, knowledge sharing, and approachability. Responsibilities: Design, develop, test, and deploy data integration processes (batch or real-time) using tools such as Microsoft SSIS, Azure Data Factory, Databricks, Matillion, Airflow, Sqoop, etc. Create functional & technical documentation, e.g. 
ETL architecture documentation, unit testing plans and results, data integration specifications, data testing plans, etc. Provide a consultative approach with business users, asking questions to understand the business need and deriving the data flow and the conceptual, logical, and physical data models based on those needs. Perform data analysis to validate data models and to confirm the ability to meet business needs. May serve as project or DI lead, overseeing multiple consultants from various competencies. Stays current with emerging and changing technologies to recommend and implement beneficial technologies and approaches for Data Integration. Ensures proper execution/creation of methodology, training, templates, resource plans, and engagement review processes. Coach team members to ensure understanding of projects and tasks, providing effective feedback (critical and positive) and promoting growth opportunities when appropriate. Coordinate and consult with the project manager, client business staff, client technical staff, and project developers on data architecture best practices and anything else that is data-related at the project or business unit level. Architect, design, develop, and set direction for enterprise self-service analytic solutions, business intelligence reports, visualisations, and best-practice standards. Toolsets include, but are not limited to, SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau, and Qlik. Work with the report team to identify, design, and implement a reporting user experience that is consistent and intuitive across environments and report methods, defines security, and meets usability and scalability best practices. Must have: Writing code in a programming language, with working experience in Python, PySpark, Databricks, Scala, or similar. Data Pipeline Development & Management: Design, develop, and maintain ETL (Extract, Transform, Load) pipelines using AWS services like AWS Glue, AWS Data Pipeline, Lambda, and Step Functions. 
Implement incremental data processing using tools like Apache Spark (EMR), Kinesis, and Kafka. Work with AWS data storage solutions such as Amazon S3, Redshift, RDS, DynamoDB, and Aurora. Optimize data partitioning, compression, and indexing for efficient querying and cost optimization. Implement data lake architecture using AWS Lake Formation & Glue Catalog. Implement CI/CD pipelines for data workflows using CodePipeline, CodeBuild, and GitHub Actions. Good to have: Enterprise Data Modelling and Semantic Modelling, with working experience in ERwin, ER/Studio, PowerDesigner, or similar. Logical/physical modelling on Big Data sets or a modern data warehouse, with working experience in ERwin, ER/Studio, PowerDesigner, or similar. Agile processes (Scrum cadences, roles, deliverables), with a basic understanding of Azure DevOps, JIRA, or similar.

Posted 1 week ago

Apply

5.0 years

0 Lacs

Andhra Pradesh, India

On-site


At PwC, our people in infrastructure focus on designing and implementing robust, secure IT systems that support business operations. They enable the smooth functioning of networks, servers, and data centres to optimise performance and minimise downtime. In infrastructure engineering at PwC, you will focus on designing and implementing robust and scalable technology infrastructure solutions for clients. Your work will involve network architecture, server management, and cloud computing experience. Data Modeler Job Description: Looking for candidates with a strong background in data modeling, metadata management, and data system optimization. You will be responsible for analyzing business needs, developing long-term data models, and ensuring the efficiency and consistency of our data systems. Key areas of expertise include: Analyze and translate business needs into long-term solution data models. Evaluate existing data systems and recommend improvements. Define rules to translate and transform data across data models. Work with the development team to create conceptual data models and data flows. Develop best practices for data coding to ensure consistency within the system. Review modifications of existing systems for cross-compatibility. Implement data strategies and develop physical data models. Update and optimize local and metadata models. Utilize canonical data modeling techniques to enhance data system efficiency. Evaluate implemented data systems for variances, discrepancies, and efficiency. Troubleshoot and optimize data systems to ensure optimal performance. Strong expertise in relational and dimensional modeling (OLTP, OLAP). Experience with data modeling tools (Erwin, ER/Studio, Visio, PowerDesigner). Proficiency in SQL and database management systems (Oracle, SQL Server, MySQL, PostgreSQL). Knowledge of NoSQL databases (MongoDB, Cassandra) and their data structures. 
Experience working with data warehouses and BI tools (Snowflake, Redshift, BigQuery, Tableau, Power BI). Familiarity with ETL processes, data integration, and data governance frameworks. Strong analytical, problem-solving, and communication skills. Qualifications: Bachelor's degree in Engineering or a related field. 5 to 9 years of experience in data modeling or a related field. 4+ years of hands-on experience with dimensional and relational data modeling. Expert knowledge of metadata management and related tools. Proficiency with data modeling tools such as Erwin, PowerDesigner, or Lucid. Knowledge of transactional databases and data warehouses. Preferred Skills: Experience in cloud-based data solutions (AWS, Azure, GCP). Knowledge of big data technologies (Hadoop, Spark, Kafka). Understanding of graph databases and real-time data processing. Certifications in data management, modeling, or cloud data engineering. Excellent communication and presentation skills. Strong interpersonal skills to collaborate effectively with various teams.

Posted 1 week ago

Apply


10.0 - 15.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Title: Lead Data Engineer Location: All EXL Locations Experience: 10 to 15 Years Job Summary The Lead Data Engineer will provide technical expertise in the analysis, design, development, rollout, and maintenance of data integration initiatives. This role will contribute to implementation methodologies and best practices, as well as work on project teams to analyze, design, develop, and deploy business intelligence / data integration solutions to support a variety of customer needs. This position oversees a team of Data Integration Consultants at various levels, ensuring their success on projects, goals, trainings, and initiatives through mentoring and coaching. Provides technical expertise in needs identification, data modelling, data movement and transformation mapping (source to target), automation, and testing strategies, translating business needs into technical solutions with adherence to established data guidelines and approaches from a business unit or project perspective, whilst leveraging best-fit technologies (e.g., cloud, Hadoop, NoSQL, etc.) and approaches to address business and environmental challenges. Works with stakeholders to identify and define self-service analytic solutions, dashboards, actionable enterprise business intelligence reports, and business intelligence best practices. Responsible for repeatable, lean, and maintainable enterprise BI design across organizations. Effectively partners with the client team. Leadership not only in the conventional sense: within a team, we expect people to be leaders. The candidate should exhibit leadership qualities such as innovation, critical thinking, optimism/positivity, communication, time management, collaboration, problem-solving, acting independently, knowledge sharing, and approachability. Responsibilities: Design, develop, test, and deploy data integration processes (batch or real-time) using tools such as Microsoft SSIS, Azure Data Factory, Databricks, Matillion, Airflow, Sqoop, etc. 
Create functional & technical documentation – e.g. ETL architecture documentation, unit testing plans and results, data integration specifications, data testing plans, etc. Provide a consultative approach with business users, asking questions to understand the business need and deriving the data flow and the conceptual, logical, and physical data models based on those needs. Perform data analysis to validate data models and to confirm the ability to meet business needs. May serve as project or DI lead, overseeing multiple consultants from various competencies. Stays current with emerging and changing technologies to recommend and implement beneficial technologies and approaches for Data Integration. Ensures proper execution/creation of methodology, training, templates, resource plans, and engagement review processes. Coach team members to ensure understanding of projects and tasks, providing effective feedback (critical and positive) and promoting growth opportunities when appropriate. Coordinate and consult with the project manager, client business staff, client technical staff, and project developers on data architecture best practices and anything else that is data-related at the project or business unit level. Architect, design, develop, and set direction for enterprise self-service analytic solutions, business intelligence reports, visualisations, and best-practice standards. Toolsets include, but are not limited to, SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau, and Qlik. Work with the report team to identify, design, and implement a reporting user experience that is consistent and intuitive across environments and report methods, defines security, and meets usability and scalability best practices. 
Must have: Writing code in a programming language, with working experience in Python, PySpark, Databricks, Scala, or similar. Data Pipeline Development & Management: Design, develop, and maintain ETL (Extract, Transform, Load) pipelines using AWS services like AWS Glue, AWS Data Pipeline, Lambda, and Step Functions. Implement incremental data processing using tools like Apache Spark (EMR), Kinesis, and Kafka. Work with AWS data storage solutions such as Amazon S3, Redshift, RDS, DynamoDB, and Aurora. Optimize data partitioning, compression, and indexing for efficient querying and cost optimization. Implement data lake architecture using AWS Lake Formation & Glue Catalog. Implement CI/CD pipelines for data workflows using CodePipeline, CodeBuild, and GitHub Actions. Good to have: Enterprise Data Modelling and Semantic Modelling, with working experience in ERwin, ER/Studio, PowerDesigner, or similar. Logical/physical modelling on Big Data sets or a modern data warehouse, with working experience in ERwin, ER/Studio, PowerDesigner, or similar. Agile processes (Scrum cadences, roles, deliverables), with a basic understanding of Azure DevOps, JIRA, or similar. Key Skills: Python, PySpark, AWS, Databricks, SQL.
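The incremental data processing named in the must-haves is, at its core, watermark-based change capture. The sketch below shows only that core idea in plain Python; it is not Glue or Spark code, and every name in it is invented. A production pipeline would run the same logic in Spark/Glue and persist the watermark durably (e.g. in DynamoDB or a Glue job bookmark):

```python
from datetime import datetime

def incremental_extract(rows, watermark):
    """Return the rows newer than the watermark, plus the advanced watermark."""
    fresh = [r for r in rows if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in fresh), default=watermark)
    return fresh, new_watermark

# Invented source data standing in for a real table or stream.
source = [
    {"id": 1, "updated_at": datetime(2024, 1, 1)},
    {"id": 2, "updated_at": datetime(2024, 1, 2)},
    {"id": 3, "updated_at": datetime(2024, 1, 3)},
]

# First run: everything is newer than the initial watermark.
batch1, wm = incremental_extract(source, datetime(2023, 12, 31))

# A new row arrives; the second run picks up only that change.
source.append({"id": 4, "updated_at": datetime(2024, 1, 4)})
batch2, wm = incremental_extract(source, wm)
print([r["id"] for r in batch1], [r["id"] for r in batch2])
```

Note the strict `>` comparison assumes no two batches share the watermark timestamp; real pipelines often re-read a small overlap window or use log-based CDC (e.g. Kinesis/Kafka change streams) to avoid missing late-arriving rows.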

Posted 1 week ago

Apply

7.0 - 9.0 years

6 - 10 Lacs

Noida

On-site

Are you our “TYPE”? Monotype (Global) Named "One of the Most Innovative Companies in Design" by Fast Company, Monotype brings brands to life through type and technology that consumers engage with every day. The company's rich legacy includes a library that can be traced back hundreds of years, featuring famed typefaces like Helvetica, Futura, Times New Roman and more. Monotype also provides a first-of-its-kind service that makes fonts more accessible for creative professionals to discover, license, and use in our increasingly digital world. We work with the biggest global brands, and with individual creatives, offering a wide set of solutions that make it easier for them to do what they do best: design beautiful brand experiences. Monotype Solutions India Monotype Solutions India is a strategic center of excellence for Monotype and is a certified Great Place to Work® three years in a row. The focus of this fast-growing center spans Product Development, Product Management, Experience Design, User Research, Market Intelligence, Research in areas of Artificial Intelligence and Machine learning, Innovation, Customer Success, Enterprise Business Solutions, and Sales. Headquartered in the Boston area of the United States and with offices across 4 continents, Monotype is the world’s leading company in fonts. 
We are looking for problem solvers to help us build next-generation features, products, and services. You will work closely with a cross-functional team of engineers on microservices and event-driven architectures. You are expected to contribute to the architecture, design, and development of new features, identify technical risks, and find alternate solutions to various problems. In addition, the role demands leading, motivating, and mentoring other team members through technical challenges. What we’re looking for: Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field. Minimum 7-9 years of professional experience, with at least 5 years specifically in data architecture. Proven experience designing and implementing data models, including ER diagrams, dimensional modeling, and normalization techniques. Strong expertise in relational databases (SQL Server, Oracle, PostgreSQL) and NoSQL databases (MongoDB, Cassandra). Proficiency with data modeling tools such as ERwin, PowerDesigner, or similar. Knowledge of cloud data platforms and services (AWS, Azure, GCP). Strong analytical and problem-solving skills, with the ability to provide creative and innovative solutions. Excellent communication and stakeholder management abilities. You will have an opportunity to: ✔ COLLABORATE with global teams to build scalable web-based applications. ✔ PARTNER closely with the engineering team to follow best practices and standards. ✔ PROVIDE reliable solutions to a variety of problems using sound problem-solving techniques. ✔ WORK with the broader team to build and maintain high-performance, flexible, and highly scalable web-based applications. ✔ ACHIEVE engineering excellence by implementing standard practices and standards. 
✔ PERFORM technical root cause analysis and outline corrective actions for given problems. What’s in it for you: Hybrid work arrangements and competitive paid time off programs. Comprehensive medical insurance coverage to meet all your healthcare needs. Competitive compensation with a corporate bonus program & uncapped commission for quota-carrying Sales. A creative, innovative, and global working environment in the creative and software technology industry. Highly engaged Events Committee to keep work enjoyable. Reward & Recognition Programs (including President's Club for all functions). Professional onboarding program, including robust targeted training for the Sales function. Development and advancement opportunities (high internal mobility across the organization). Retirement planning options to save for your future, and so much more! Monotype is an Equal Opportunities Employer. Qualified applicants will receive consideration for employment without regard to race, colour, religion, sex, national origin, sexual orientation, gender identity, disability or protected veteran status.

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site


PRINCIPAL CONSULTANT – Finance Data – Data Modeller Exp – 10+ Yrs Location – Bangalore, Pune, Hyderabad, Chennai, Gurgaon Mode of Work – Hybrid ROLE DESCRIPTION As a Finance Data Modeler, you'll work at the forefront of designing and developing the finance data models in the Finance Data space, which is crucial for the Finance Data program in building a self-service platform for the distribution of trusted data for Finance. In this role, you'll collaborate with senior stakeholders across the bank, breaking down complex challenges and developing innovative approaches and solutions. You'll be joining an exceptionally talented and supportive team, united by a shared mission to shape the future of data modelling and architecture design in the financial data space. We offer competitive compensation and the opportunity to make a tangible impact, powered by Organization's recognized brand and industry-leading capabilities. If you have an entrepreneurial spirit and are passionate about data privacy, we'd love to hear from you. Let's discuss how your expertise can help us push the boundaries of what's possible. KEY RESPONSIBILITIES Assist in the design and development of conceptual, logical, and application data models in line with the organization’s Future State Finance Data Asset Strategy. Collaborate with Finance business teams to enhance understanding, interpretation, design, and implementation. Support Finance business and change teams in transitioning to target-state data models and Data Asset delivery, focusing on improving current data feeds and resolving data issues. Develop data modelling schemas that align with Enterprise data models and support Finance Data Assets. Participate in data modelling and data architecture governance forums. Ensure Finance data models align with Enterprise data models and adhere to Enterprise Architecture principles and standards. Serve as a subject matter expert in Finance data modelling. 
Contribute to model development planning and scheduling. Continuously improve the data modelling estate, ensuring adherence to risk management, control measures, security protocols, and regulatory compliance standards. Advise and support Finance modelling data requirements for new use cases and data changes. Create and maintain a range of data modelling documents, including model requirements, data flow diagrams, data catalogues, data definitions, design specifications, data models, traceability matrices, data quality rules, and more. Translate Finance business requirements into data modelling solutions and Finance Data Assets. Conduct continuous audits of data models and refine them as needed, reporting any challenges, issues, or risks to senior management. Seek opportunities to simplify, automate, rationalize, and improve the efficiency of Finance IT and modelling solutions. Update and maintain key modelling artefacts, such as Confluence, SharePoint documents, reports, roadmaps, and other domain artefacts. Provide data modelling and technical advice, maintaining ongoing relationships. Provide timely feedback to ensure that model development or modification meets business needs. Communicate data modelling solutions to both technical and non-technical audiences, ensuring the communication style is appropriate for the intended audience. Support cross-program and cross-group Data Assets execution and delivery strategies, opportunities, and problem resolution. KEY ACCOUNTABILITIES Continuously improve the data modelling estate, ensuring adherence to risk management, control measures, security protocols, and regulatory compliance standards. Conduct continuous audits of data models and refine them as needed, reporting any challenges, issues, or risks to senior management. Ensure Finance data models align with Enterprise data models and adhere to Enterprise Architecture principles and standards. 
Provide timely feedback to ensure that model development or modification meets business needs. Support cross-program and cross-group Data Assets execution and delivery strategies, opportunities, and problem resolution. THE PREFERRED CANDIDATE WILL: Be a self-motivator working in a team of Organization and client colleagues Be good at problem solving Be comfortable working closely with stakeholders (both technical and non-technical) to perceive new and better ways of working Be adaptable and able to work across multiple Business Functions in parallel under pressure to tight deadlines Be able to demonstrate strong collaboration skills to align and define strategies that meet the program requirements MANDATORY SKILLS AND EXPERIENCE Business Minimum of 5 years of experience in data management and modelling within the Financial Services sector, ideally in a Treasury or Finance role, or a related front office environment. Demonstrated expertise in designing data models (conceptual, logical, and application/messaging) with appropriate phasing, transitions, and migrations. Strong understanding of managing data as a product across various enterprise domains and technology landscapes. Solid knowledge of business, data, application, and technology architectural domains. Excellent communication skills with the ability to influence and present data models and concepts to both technical and business stakeholders. Proven ability to work effectively in a matrixed environment, collaborating with data modellers from other domains to create shared and reusable data assets. Technical Experience with Agile and Scrum methodologies in a large-scale Agile environment, including active participation in daily standups and progress reporting. Knowledge of reference and master data management. Proficiency with data modelling tools such as Visual Paradigm, ERwin, PowerDesigner, ER Studio, etc. Understanding of data modelling standards and technical documentation using ERD, UML, or BIAN. 
• Ability to deliver solutions that provide organizational benefits.
• Strong analytical and problem-solving skills, with the ability to work independently and take ownership of key deliverables.

PREFERRED SKILLS AND EXPERIENCE
• Proven experience in a large, global banking setting is highly valued.
• Familiarity with data standards, governance, strategy, and lineage is a plus.
• Exposure to cloud platforms (GCP, AWS, Azure) and big data solutions is advantageous.
• Knowledge of issue and data quality management, prioritization, business case development, remediation planning, and solution delivery.
• Experience with data governance initiatives such as lineage, masking, retention policies, and data quality.
• Open-minded approach to problem-solving, ensuring pragmatic and effective designs.
• Familiarity with ETL architectures and tools, including data virtualization and API integration, is desirable.
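The conceptual-to-logical-to-physical modelling progression these roles call for can be illustrated with a tiny physical model. The sketch below uses SQLite and hypothetical finance entities (Party, Account, Balance) invented purely for illustration; they are not drawn from any posting on this page.

```python
import sqlite3

# Hypothetical finance entities (Party, Account, Balance) invented for
# illustration; a real physical model is generated from a reviewed
# logical model in a tool such as PowerDesigner or ERwin.
DDL = """
CREATE TABLE party (
    party_id   INTEGER PRIMARY KEY,
    legal_name TEXT NOT NULL
);
CREATE TABLE account (
    account_id INTEGER PRIMARY KEY,
    party_id   INTEGER NOT NULL REFERENCES party(party_id),
    currency   TEXT NOT NULL CHECK (length(currency) = 3)  -- ISO 4217 code
);
CREATE TABLE balance (
    account_id INTEGER NOT NULL REFERENCES account(account_id),
    as_of_date TEXT NOT NULL,  -- ISO 8601 business date
    amount     NUMERIC NOT NULL,
    PRIMARY KEY (account_id, as_of_date)
);
"""

def build_physical_model(conn: sqlite3.Connection) -> None:
    """Materialise the physical model in the target database."""
    conn.executescript(DDL)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    build_physical_model(conn)
    print(sorted(r[0] for r in conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table'")))
    # → ['account', 'balance', 'party']
```

The composite primary key on `balance` is the kind of design decision (one row per account per business date) that a traceability matrix would tie back to a stated data requirement.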

Posted 2 weeks ago

Apply

7.0 - 9.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Linkedin logo

Are you our “TYPE”?

Monotype (Global)
Named "One of the Most Innovative Companies in Design" by Fast Company, Monotype brings brands to life through type and technology that consumers engage with every day. The company's rich legacy includes a library that can be traced back hundreds of years, featuring famed typefaces like Helvetica, Futura, Times New Roman, and more. Monotype also provides a first-of-its-kind service that makes fonts more accessible for creative professionals to discover, license, and use in our increasingly digital world. We work with the biggest global brands, and with individual creatives, offering a wide set of solutions that make it easier for them to do what they do best: design beautiful brand experiences. Headquartered in the Boston area of the United States and with offices across 4 continents, Monotype is the world’s leading company in fonts and a trusted partner to the world’s top brands.

Monotype Solutions India
Monotype Solutions India is a strategic center of excellence for Monotype and is a certified Great Place to Work® three years in a row. The focus of this fast-growing center spans Product Development, Product Management, Experience Design, User Research, Market Intelligence, Research in areas of Artificial Intelligence and Machine Learning, Innovation, Customer Success, Enterprise Business Solutions, and Sales.
We are looking for problem solvers to help us build next-generation features, products, and services. You will work closely with a cross-functional team of engineers on microservices and event-driven architectures. You are expected to contribute to the architecture, design, and development of new features, identify technical risks, and find alternate solutions to various problems. The role also demands leading, motivating, and mentoring other team members with respect to technical challenges.

What We’re Looking For
• Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field.
• Minimum 7-9 years of professional experience, with at least 5 years specifically in data architecture.
• Proven experience designing and implementing data models, including ER diagrams, dimensional modeling, and normalization techniques.
• Strong expertise in relational databases (SQL Server, Oracle, PostgreSQL) and NoSQL databases (MongoDB, Cassandra).
• Proficiency with data modeling tools such as ERwin, PowerDesigner, or similar tools.
• Knowledge of cloud data platforms and services (AWS, Azure, GCP).
• Strong analytical and problem-solving skills, with the ability to provide creative and innovative solutions.
• Excellent communication and stakeholder management abilities.

You Will Have An Opportunity To
✔ COLLABORATE with global teams to build scalable web-based applications.
✔ PARTNER closely with the engineering team to follow best practices and standards.
✔ PROVIDE reliable solutions to a variety of problems using sound problem-solving techniques.
✔ WORK with the broader team to build and maintain high-performance, flexible, and highly scalable web-based applications.
✔ ACHIEVE engineering excellence by implementing standard practices and standards.
✔ PERFORM technical root cause analysis and outline corrective actions for given problems.

What’s in it for you
• Hybrid work arrangements and competitive paid time off programs.
• Comprehensive medical insurance coverage to meet all your healthcare needs.
• Competitive compensation with a corporate bonus program and uncapped commission for quota-carrying Sales roles.
• A creative, innovative, and global working environment in the creative and software technology industry.
• A highly engaged Events Committee to keep work enjoyable.
• Reward & Recognition Programs (including a President's Club for all functions).
• A professional onboarding program, including robust targeted training for the Sales function.
• Development and advancement opportunities (high internal mobility across the organization).
• Retirement planning options to save for your future, and so much more!

Monotype is an Equal Opportunities Employer. Qualified applicants will receive consideration for employment without regard to race, colour, religion, sex, national origin, sexual orientation, gender identity, disability or protected veteran status.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Linkedin logo

Job Title: Data Architect / Delivery Lead

Job Summary: The Data Architect / Delivery Lead will provide technical expertise in the analysis, design, development, rollout, and maintenance of enterprise data models and solutions, utilizing both traditional and emerging technologies such as cloud, Hadoop, NoSQL, and real-time data processing. In addition to technical expertise, the role requires leadership in driving cross-functional teams, ensuring seamless project delivery, and fostering innovation within the team. The candidate must excel in managing data architecture projects while mentoring teams in data engineering practices, including PySpark, automation, and big data integration.

Essential Duties

Data Architecture Design and Development:
• Design and develop conceptual, logical, and physical data models for enterprise-scale data lakes and data warehouse solutions, ensuring optimal performance and scalability.
• Implement real-time and batch data integration solutions using modern tools and technologies such as PySpark, Hadoop, and cloud-based solutions (e.g., AWS, Azure, Google Cloud).
• Utilize PySpark for distributed data processing, transforming and analyzing large datasets for improved data-driven decision-making.
• Understand and apply modern data architecture philosophies such as Data Vault, Dimensional Modeling, and Data Lake design for building scalable and sustainable data solutions.

Leadership & Delivery Management:
• Provide leadership in data architecture and engineering projects, ensuring the integration of modern technologies and best practices in data management and transformation.
• Act as a trusted advisor, collaborating with business users, technical staff, and project managers to define requirements and deliver high-quality data solutions.
• Lead and mentor a team of data engineers, ensuring the effective application of PySpark for data engineering tasks, and supporting continuous learning and improvement within the team.
• Manage end-to-end delivery of data projects, including defining timelines, managing resources, and ensuring timely, high-quality delivery while adhering to project methodologies (e.g., Agile, Scrum).

Data Movement & Integration:
• Provide expertise in data integration processes, including batch and real-time data processing using tools such as PySpark, Informatica PowerCenter, SSIS, MuleSoft, and DataStage.
• Develop and optimize ETL/ELT pipelines, utilizing PySpark for efficient data processing and transformation at scale, particularly for big data environments (e.g., Hadoop ecosystems).
• Oversee data migration efforts, ensuring high-quality and consistent data delivery while managing data transformation and cleansing processes.

Documentation & Communication:
• Create comprehensive functional and technical documentation, including data integration architecture documentation, data models, data dictionaries, and testing plans.
• Collaborate with business stakeholders and technical teams to ensure alignment and provide technical guidance on data-related decisions.
• Prepare and present technical content and architectural decisions to senior management, ensuring clear communication of complex data concepts.

Skills and Experience:

Data Engineering Skills:
• Extensive experience in PySpark for large-scale data processing, data transformation, and working with distributed systems.
• Proficient in modern data processing frameworks and technologies, including Hadoop, Spark, and Flink.
• Expertise in cloud-based data engineering technologies and platforms such as AWS Glue, Azure Data Factory, or Google Cloud Dataflow.
• Strong experience with data pipelines, ETL/ELT frameworks, and automation techniques using tools like Airflow, Apache NiFi, or dbt.
• Expertise in working with big data technologies and frameworks for both structured and unstructured data.
Data Architecture and Modeling:
• 5-10 years of experience in enterprise data modeling, including hands-on experience with ERwin, ER/Studio, PowerDesigner, or similar tools.
• Strong knowledge of relational databases (e.g., Oracle, SQL Server, Teradata) and NoSQL technologies (e.g., MongoDB, Cassandra).
• In-depth understanding of data warehousing and data integration best practices, including dimensional modeling and working with OLTP systems and OLAP cubes.
• Experience with real-time data architectures and cloud-based data lakes, leveraging AWS, Azure, or Google Cloud platforms.

Leadership & Delivery Skills:
• 3-5 years of management experience leading teams of data engineers and architects, ensuring alignment of team goals with organizational objectives.
• Strong leadership qualities such as innovation, critical thinking, communication, time management, and the ability to collaborate effectively across teams and stakeholders.
• Proven ability to act as a delivery lead for data projects, driving projects from concept to completion while managing resources, timelines, and deliverables.
• Ability to mentor and coach team members in both technical and professional growth, fostering a culture of knowledge sharing and continuous improvement.

Other Essential Skills:
• Strong knowledge of SQL and PL/SQL, and proficiency in scripting for data engineering tasks.
• Ability to translate business requirements into technical solutions, ensuring that the data solutions support business strategies and objectives.
• Hands-on experience with metadata management, data governance, and master data management (MDM) principles.
• Familiarity with modern agile methodologies, such as Scrum or Kanban, to ensure iterative and successful project delivery.

Preferred Skills & Experience:
• Cloud Technologies: Experience with cloud data platforms such as AWS Redshift, Google BigQuery, or Azure Synapse for building scalable data solutions.
• Leadership: Demonstrated ability to build and lead cross-functional teams, drive innovation, and solve complex data problems.
• Business Consulting: Consulting experience working with clients to deliver tailored data solutions, providing expert guidance on data architecture and data management practices.
• Data Profiling and Analysis: Hands-on experience with data profiling tools and techniques to assess and improve the quality of enterprise data.
• Real-Time Data Processing: Experience with real-time data integration and streaming technologies, such as Kafka and Kinesis.
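Several of the duties above (batch ETL/ELT, data cleansing, loading to a target store) can be sketched in miniature. The example below is a single-machine illustration using only the Python standard library; in the environments this posting describes, the same extract/transform/load steps would typically be expressed in PySpark and orchestrated by a tool such as Airflow. The sample trade records are invented.

```python
import csv
import io
import sqlite3

# Invented sample feed: one record is incomplete and must be cleansed out.
RAW = """trade_id,notional,currency
T1,1000000,usd
T2,,eur
T3,250000,USD
"""

def extract(text: str) -> list[dict]:
    """Extract: parse the raw CSV feed into records."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: drop incomplete records, cast types, normalise currency codes."""
    clean = []
    for r in rows:
        if not r["notional"]:  # cleansing rule: notional is mandatory
            continue
        clean.append((r["trade_id"], float(r["notional"]), r["currency"].upper()))
    return clean

def load(conn: sqlite3.Connection, rows: list[tuple]) -> int:
    """Load: write the cleansed records to the target table, return row count."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS trade "
        "(trade_id TEXT PRIMARY KEY, notional REAL, currency TEXT)"
    )
    conn.executemany("INSERT INTO trade VALUES (?, ?, ?)", rows)
    return conn.execute("SELECT count(*) FROM trade").fetchone()[0]

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    print(load(conn, transform(extract(RAW))))  # → 2
```

Keeping extract, transform, and load as separate functions mirrors how pipeline stages are kept independently testable in the frameworks the posting lists.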

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Foundit logo

Ready to shape the future of work? At Genpact, we don’t just adapt to change, we drive it. AI and digital innovation are redefining industries, and we’re leading the charge. Genpact’s AI Gigafactory, our industry-first accelerator, is an example of how we’re scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies’ most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that’s shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Senior Principal Consultant - Senior Data Modeler - Life Sciences.

We are seeking a Senior Data Modeler with deep expertise in the Life Sciences domain to provide thought leadership across data deliveries. The ideal candidate will have extensive experience in designing scalable, future-proof data models that support critical functions such as clinical research, drug development, regulatory compliance, supply chain management, and commercial operations. This role will focus on data modeling and domain expertise, ensuring models align with operational efficiency, analytics, and automation.

Key Responsibilities

Deep Domain Expertise:
• Develop conceptual, logical, and physical data models for core Life Sciences processes, including clinical trials, regulatory submissions, pharmacovigilance, manufacturing, and sales & marketing.
• Align data models with evolving business needs to support real-time and batch processing for transactional (OLTP) and analytical (OLAP) systems.
• Bridge business processes with data architecture, ensuring models enhance operational efficiency and enable advanced analytics.

Collaboration & Leadership:
• Collaborate with key Life Sciences domain stakeholder groups, including decision-makers (such as R&D executives and regulatory affairs leaders), clinical data analysts, clinical researchers, and compliance teams.
• Engage with these personas to gather and interpret data requirements, ensuring models are tailored to drive actionable insights and support effective decision-making.
• Provide technical guidance and mentorship to junior data modelers and analysts.
• Facilitate review sessions and present model designs to stakeholders.

Compliance & Regulatory Considerations:
• Ensure data models adhere to Life Sciences industry regulations, including FDA, EMA, HIPAA, GDPR, 21 CFR Part 11, and GxP guidelines.
• Incorporate data governance principles to ensure auditability, security, and regulatory reporting compliance.

Technical Excellence:
• Utilize industry-standard data modeling tools (e.g., Erwin, PowerDesigner, or equivalent) to develop and maintain models.
• Stay current with emerging trends and technologies in data management, data warehousing, and analytics.
• Collaborate with ETL developers and database administrators to ensure seamless data integration and performance optimization.

Process Improvement:
• Identify opportunities for process improvements and automation within business data workflows.
• Support continuous improvement initiatives to streamline business operations and reporting.

Qualifications we seek in you!

Minimum Qualifications
• Bachelor’s degree in business information systems (IS), computer science, or a related field, or equivalent related IT experience.
• Extensive experience in data modeling and data architecture, specifically in the Life Sciences domain.
• Prior experience working closely with diverse key stakeholder groups to capture detailed data requirements.
• Proven expertise in designing and managing data models for clinical research, regulatory compliance, pharmacovigilance, supply chain, and commercial operations.

Preferred Qualifications/Skills
• Strong hands-on experience with relational, dimensional, and/or analytic data platforms (RDBMS, dimensional, and NoSQL technologies) and with ETL and data ingestion protocols.
• Expertise in data modelling principles/methods, including conceptual, logical & physical data models.
• Experience with data warehouse, data lake, and enterprise big data platforms in multi-data-center contexts.
• Strong knowledge of data modelling and related tools (Erwin, ER/Studio, PowerDesigner, or others).
• Strong understanding of Life Sciences domain business processes, including clinical research, regulatory compliance, pharmacovigilance, supply chain, and commercial operations.
• Ability to clearly translate complex data models into actionable business insights.
• Excellent analytical, problem-solving, and communication skills.
• Strong interpersonal skills and the ability to work collaboratively with both technical and business teams.
• Knowledge of compliance and regulatory requirements (e.g., FDA, EMA, HIPAA, GDPR, 21 CFR Part 11, GxP).

Why join Genpact?
• Be a transformation leader - Work at the cutting edge of AI, automation, and digital innovation.
• Make an impact - Drive change for global enterprises and solve business challenges that matter.
• Accelerate your career - Get hands-on experience, mentorship, and continuous learning opportunities.
• Work with the best - Join 140,000+ bold thinkers and problem-solvers who push boundaries every day.
• Thrive in a values-driven culture - Our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let’s build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Andhra Pradesh, India

On-site

Linkedin logo

Data Modeller

Primary Skills: Capital Markets, Data Modeling

Job Description: RBC Capital Markets is looking for an experienced Business Analyst / Data Modeler to join our Data and Architecture Services team within the Capital Markets Division. The successful candidate will be responsible for designing and implementing data models that effectively capture trade data across a diverse range of financial products. This role requires a blend of technical expertise in data modeling, deep knowledge of capital markets, and a strong understanding of trade data.

Responsibilities
• Design and develop sophisticated data models for trade data related to various financial instruments, ensuring alignment with business needs and regulatory requirements.
• Collaborate with business analysts, data engineers, and IT teams to gather requirements and translate business needs into technical specifications.
• Create and maintain logical and physical data models, ensuring optimal performance and compliance with internal and external standards.
• Ensure data models are flexible and scalable to support the introduction of new products and adapt to changes in market practices.

Requirements
• Bachelor's or Master's degree in Computer Science, Information Systems, Finance, or a related field.
• Minimum of 5 years of experience in data modeling, with a strong preference for candidates with capital markets experience.
• Expert knowledge of financial products, the trade lifecycle, and market data.
• Demonstrated experience in modeling trade data using industry-standard protocols and schemas such as FpML (Financial Products Markup Language), ISDA CDM (International Swaps and Derivatives Association Common Domain Model), FIX (Financial Information eXchange), SWIFT, or ISO 20022.
• Proficiency in data modeling tools (e.g., ERwin, PowerDesigner, IBM Data Architect) and familiarity with database technologies (SQL, NoSQL).
• Experience with data warehousing, ETL processes, and big data platforms.
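As a sketch of what a logical trade model might look like before it is committed to a physical schema, the fragment below uses Python dataclasses. The entity names, fields, and sample identifier are hypothetical simplifications invented for illustration; they are not FpML, FIX, or ISDA CDM structures.

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum

class Side(Enum):
    BUY = "BUY"
    SELL = "SELL"

@dataclass(frozen=True)
class Instrument:
    isin: str          # instrument identifier (sample value below is illustrative)
    asset_class: str   # e.g. "EQUITY", "BOND"

@dataclass(frozen=True)
class Trade:
    trade_id: str
    instrument: Instrument
    side: Side
    quantity: float
    price: float
    trade_date: date

    @property
    def notional(self) -> float:
        # Derived attribute rather than a stored column, so the model
        # cannot hold a value inconsistent with quantity and price.
        return self.quantity * self.price

t = Trade("T-001", Instrument("US0378331005", "EQUITY"),
          Side.BUY, 100, 25.5, date(2024, 1, 2))
# t.notional → 2550.0
```

Making entities immutable (`frozen=True`) and deriving `notional` echoes the flexibility-plus-consistency goals in the responsibilities above: new product types extend the model without letting stored attributes drift out of sync.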

Posted 2 weeks ago

Apply

13.0 years

0 Lacs

Andhra Pradesh, India

On-site

Linkedin logo

Summary about Organization

A career in our Advisory Acceleration Center is the natural extension of PwC’s leading global delivery capabilities. The team consists of highly skilled professionals who help clients transform their business through technology adoption, using bespoke strategy, operating models, processes, and planning. You’ll be at the forefront of helping organizations around the globe adopt innovative technology solutions that optimize business processes or enable scalable technology. Our team helps organizations transform their IT infrastructure and modernize applications and data management to help shape the future of business. An essential and strategic part of Advisory's multi-sourced, multi-geography Global Delivery Model, the Acceleration Centers are a dynamic, rapidly growing component of our business. The teams out of these Centers have achieved remarkable results in process quality and delivery capability, resulting in a loyal customer base and a reputation for excellence.

Job Description

We are looking for a Senior Data Architect with experience in the design, build, and optimization of complex data landscapes and legacy modernization projects. The ideal candidate will have deep expertise in database management, data modeling, cloud data solutions, and ETL (Extract, Transform, Load) processes. This role requires a strong leader capable of guiding data teams and driving the design and implementation of scalable data architectures. Key areas of expertise include:
• Design and implement scalable and efficient data architectures to support business needs.
• Develop data models (conceptual, logical, and physical) that align with organizational goals.
• Lead database design and optimization efforts for structured and unstructured data.
• Establish ETL pipelines and data integration strategies for seamless data flow.
• Define data governance policies, including data quality, security, privacy, and compliance.
• Work closely with engineering, analytics, and business teams to understand requirements and deliver data solutions.
• Oversee cloud-based data solutions (AWS, Azure, GCP) and modern data warehouses (Snowflake, BigQuery, Redshift).
• Ensure high availability, disaster recovery, and backup strategies for critical databases.
• Evaluate and implement emerging data technologies, tools, and frameworks to improve efficiency.
• Conduct data audits, performance tuning, and troubleshooting to maintain optimal performance.

Qualifications
• Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field.
• 13+ years of experience in data modeling, including conceptual, logical, and physical data design.
• 5-8 years of experience with cloud data lake platforms such as AWS Lake Formation, Delta Lake, Snowflake, or Google BigQuery.
• Proven experience with NoSQL databases and data modeling techniques for non-relational data.
• Experience with data warehousing concepts, ETL/ELT processes, and big data frameworks (e.g., Hadoop, Spark).
• Hands-on experience delivering complex, multi-module projects in diverse technology ecosystems.
• Strong understanding of data governance, data security, and compliance best practices.
• Proficiency with data modeling tools (e.g., ER/Studio, ERwin, PowerDesigner).
• Excellent leadership and communication skills, with a proven ability to manage teams and collaborate with stakeholders.

Preferred Skills
• Experience with modern data architectures, such as data fabric or data mesh.
• Knowledge of graph databases and modeling for technologies like Neo4j.
• Proficiency with programming languages like Python, Scala, or Java.
• Understanding of CI/CD pipelines and DevOps practices in data engineering.

Posted 3 weeks ago

Apply

0 years

0 Lacs

Greater Kolkata Area

On-site

Linkedin logo

Job Description

Some careers have more impact than others. If you’re looking for a career where you can make a real impression, join HSBC and discover how valued you’ll be. HSBC is one of the largest banking and financial services organisations in the world, with operations in 62 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Assistant Vice President - Information Management.

Principal Responsibilities
• Working closely with global and regional teams to deliver data management and governance projects.
• Supporting strategic modelling capabilities focused on regulatory requirements and strategic solutions on Big Data & Cloud technologies.
• Delivering capabilities to allow the Global RBWM Risk function to enable business transformation and effectively operate data management and governance.
• Closely aligning with and managing the expectations of stakeholders from a Global and Regional perspective.
• Supporting the project managers, business analysts and Head of Data Management in the delivery of data projects.
• Influencing and collaborating with stakeholders and business partners, building strong relationships to ensure consensus and influence change outcomes.
• Fostering open and honest communication which anticipates stakeholder expectations.
• Supporting and facilitating workshops across multiple geographies.

Requirements
• Proven problem-solving and analytical abilities, including the ability to critically evaluate information gathered from multiple sources, reconcile conflicts, decompose high-level information into details and apply sound business and technical domain knowledge.
• Expert knowledge of data modeling approaches, including Relational Modeling, the Ralph Kimball Methodology, Bill Inmon’s Corporate Information Factory, NoSQL Modeling, etc.
• Experience in implementing logical and physical common data models across a complex global data ecosystem is a nice to have.
• Working knowledge of industry modeling frameworks, especially within the banking and financial domains, e.g. IBM IFW, FSDM, FSLDM.
• Sound knowledge of relational and non-relational database platforms is required.
• Practical experience in the use of data modeling tools. IBM InfoSphere Data Architect preferred; working knowledge of other tools such as Erwin, PowerDesigner or E/R Studio is acceptable.
• Knowledge of other programming languages like Python/Java/R/SQL/SPSS/SAS is preferable.
• Experience of large global system infrastructure projects in a business change environment and a good understanding of data infrastructure and architecture would be an advantage. The latter may include exposure to solution or enterprise architecture.
• Experience with cloud-based technologies (on AWS and/or GCP) across both cloud-agnostic and proprietary tools (MATLAB, Redshift, etc.).
• Knowledge of ETL (Extract, Transform, Load) processes and data modelling requirements.
• Experience of process re-engineering and process management.
• Knowledge of data governance and management principles and processes.
• Knowledge of “Big Data” and Cloud Computing concepts and architecture.
• Knowledge of BCBS 239, IFRS 9, Stress Testing or experience of any other banking regulatory environment is a big plus.
• Experience in various aspects of Data Quality Management will be advantageous.
• Advanced oral, written and visual communication and presentation skills, with open and honest communication - the ability to communicate efficiently at a global level is paramount.

You’ll achieve more at HSBC. HSBC is an equal opportunity employer committed to building a culture where all employees are valued and respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment.
We encourage applications from all suitably qualified persons irrespective of, but not limited to, their gender or genetic information, sexual orientation, ethnicity, religion, social status, medical care leave requirements, political affiliation, disability, colour, national origin, veteran status, etc. We consider all applications based on merit and suitability to the role. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by HSBC Electronic Data Processing (India) Private LTD

Posted 3 weeks ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Linkedin logo

Colt provides network, voice and data centre services to thousands of businesses around the world, allowing them to focus on delivering their business goals instead of the underlying infrastructure.

Location: India (Gurugram / Bangalore), UK

Why We Need This Role

This is a pivotal role in shaping our data landscape, ensuring alignment with business objectives, and driving innovation through effective data management practices. You will lead a team of skilled data architects, collaborate with cross-functional stakeholders, and define the strategic direction for data initiatives.

What You Will Do

Data Strategy Development
• Develop and articulate a comprehensive data strategy that aligns with the organization’s vision, mission, and long-term goals.
• Collaborate with senior leadership, including the Chief Data Officer (CDO), to define data-related priorities and roadmaps.
• Understand the disruptive forces and the business’s economic and financial levers that affect transformation, to effectively guide technology investment decisions.
• Facilitate business and IT alignment in a collaborative, supportive and consultative manner.
• Formulate, translate, advocate and support strategy to achieve the organization’s targeted business outcomes.
• Lead the analysis of business and operating models, market trends and the technology industry to determine their potential impact on the enterprise’s business strategy, direction and architecture.
• Provide perspective on the readiness of the organization to change and innovate.

Data Architecture Leadership
• Own and drive the future-state data architecture, ensuring scalability, flexibility, and adherence to industry best practices.
• Establish and maintain data architecture standards, guidelines, and principles across the organization.
• Work closely with technology teams to implement architectural changes and enhancements.

Data Modeling and Design
• Ensure that data modeling (conceptual, logical, physical) is of high quality and consistency.
• Lead the development and maintenance of logical data models (LDMs) and associated physical models.
• Collaborate with development teams to ensure understanding of, and adherence to, architectural and modeling standards.

Stakeholder Engagement:
• Partner with the Data Management team to drive the group’s data strategy.
• Collaborate with business units to extract greater value from data assets.
• Engage with key stakeholders to identify technical opportunities for enhancing data product delivery.
• Provide consultative advice and actionable recommendations to business leaders and organizational stakeholders, helping them make investment decisions about the next business and operating model of their organization and use technology to make that change happen.
• Lead and facilitate interaction with business leaders, product managers and product owners in a business-driven conversation over the risks and implications of product decisions.

Plan and Manage the IT Portfolio:
• Work closely with the project management office (PMO) to ensure the execution of plans corresponds with the promised outcomes throughout the project or product lifecycle.
• Present gap analyses and/or IT investment roadmaps that reflect the status of the existing data estate.
• Lead analysis of the data environment to detect critical deficiencies and recommend solutions for improvement.
• Lead the development of an implementation plan for the architecture based on business requirements and the varying IT strategies.

Team Leadership:
• Build and lead a federated team of Data Architects within the function and across the organization.
• Guide and mentor team members, fostering a culture of excellence and continuous learning.

Quality Assurance:
• Ensure the quality of data designs proposed by the team.
• Uphold data management principles and best practices.

Future-Proofing:
• Stay abreast of industry trends, emerging technologies, and regulatory changes related to data management.
• Contribute to the organization’s data architecture vision.
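To make the modeling responsibilities above concrete: a logical data model is typically translated into a physical schema on the target platform. A minimal, hypothetical sketch (table and column names are invented for illustration; SQLite stands in for whichever platform the role actually uses) of a dimensional star schema might look like this:

```python
import sqlite3

# Hypothetical star schema: one fact table and two conformed dimensions.
# All names (dim_customer, fact_orders, etc.) are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_customer (
    customer_key  INTEGER PRIMARY KEY,
    customer_name TEXT NOT NULL
);
CREATE TABLE dim_date (
    date_key INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20240131
    iso_date TEXT NOT NULL
);
CREATE TABLE fact_orders (
    order_id     INTEGER PRIMARY KEY,
    customer_key INTEGER NOT NULL REFERENCES dim_customer(customer_key),
    date_key     INTEGER NOT NULL REFERENCES dim_date(date_key),
    amount       REAL NOT NULL
);
""")

# Sample rows, then a typical analytical join across the star schema.
cur.execute("INSERT INTO dim_customer VALUES (1, 'Acme Ltd')")
cur.execute("INSERT INTO dim_date VALUES (20240131, '2024-01-31')")
cur.execute("INSERT INTO fact_orders VALUES (100, 1, 20240131, 250.0)")

row = cur.execute("""
    SELECT c.customer_name, d.iso_date, f.amount
    FROM fact_orders f
    JOIN dim_customer c ON c.customer_key = f.customer_key
    JOIN dim_date     d ON d.date_key     = f.date_key
""").fetchone()
print(row)  # → ('Acme Ltd', '2024-01-31', 250.0)
```

The same separation of dimensions from facts is what modeling tools such as PowerDesigner or Sparx EA generate DDL for from a logical model.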
Facilitate Innovation:
• Scan for major disruptive technology and non-technology trends (trendspotting) that affect the business.
• Contextualize technology trends based on social, economic, political and other non-technology trends.
• Identify technology-enabled innovation opportunities that enable business strategy and deliver expected business outcomes.

Experience (What We’re Looking For):
• Master’s or bachelor’s degree in business, computer science, computer engineering, electrical engineering, systems analysis or a related field of study, or equivalent experience.
• Ten or more years of experience in data architecture with a proven track record of designing and implementing complex data solutions.
• Ten or more years of business experience in strategic and operations planning and/or business analysis.
• Certifications required: TOGAF, Certified Architect (CA), Zachman, SAFe Agile.

Skills:
• Knowledge of business ecosystems, SaaS, infrastructure as a service (IaaS), platform as a service (PaaS), SOA, APIs, open data, microservices, event-driven IT and predictive analytics.
• Familiarity with information management practices, system development life cycle management, IT services management, agile and lean methodologies, infrastructure and operations, and EA and ITIL frameworks.
• Proficiency with data warehousing solutions (e.g., Google BigQuery, Snowflake).
• Expertise in data modeling tools and techniques (e.g., SAP PowerDesigner, Sparx EA).
• Strong knowledge of SQL and NoSQL databases (e.g., MongoDB, Cassandra).
• Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud).
• Experience with data integration and ETL tools (e.g., Talend, Informatica).
• Excellent analytical and technical skills.
• Excellent planning and organizational skills.
• Knowledge of all components of holistic enterprise architecture.

What We Offer
Colt is a growing business that is investing in its people.
We offer skill development, learning pathways and accreditation to help our people perform at their best, regardless of role and location. In addition to competitive salaries and incentive plans, we offer staff a range of benefits and local rewards packages. Colt recognizes the importance of a work-life balance.

Some Benefit Examples Are:
• Flexible working and relaxed dress code
• Two days annually to spend on volunteering opportunities
• Access to an online learning platform
• Business mentoring
• Option of parking slots in the Colt campus
• Lunch vouchers

Posted 4 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.


Featured Companies