7 - 12 years
15 - 30 Lacs
Pune
Work from Office
About the Company: Headquartered in California, U.S.A., GSPANN provides consulting and IT services to global clients. We help clients transform how they deliver business value by optimizing their IT capabilities, practices, and operations, drawing on our experience in retail, high technology, and manufacturing. With five global delivery centers and 1,900+ employees, we provide the intimacy of a boutique consultancy with the capabilities of a large IT services firm.

Role: Data Modeller
Experience: 7+ years
Skill Set: Data Modelling and SQL
Location: Pune, Hyderabad, Gurgaon

Position in brief: This is primarily a technical role, with some functional knowledge expected.
- At least 5 years of hands-on data modeling (conceptual, logical, and physical), data profiling, and data analysis experience.
- SQL: basic to intermediate level required; the ability to write complex SQL queries is an added advantage.
- ETL: should understand how the ETL process works, and should be able to provide ETL attributes and partition-related information as part of the data mapping document.
- Experience with any modeling tool is acceptable: ER/Studio, erwin, Sybase PowerDesigner.

Detailed Job Description: We are looking for a passionate Data Analyst/Data Modeler to build, optimize, and maintain conceptual, logical, and physical database models. The candidate will turn data into information, information into insight, and insight into business decisions.

Responsibilities:
- Gather requirements from the business team and translate them into technical requirements.
- Drive projects and provide guidance wherever needed.
- Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models.
- Work independently as well as collaboratively.
- Work with management to prioritize business and information needs.

Requirements:
- Bachelor's or master's degree in computer/data science, or related technical experience.
- 5+ years of hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional and NoSQL/big-data platform technologies, and ETL and data ingestion protocols).
- Proven working experience as a data analyst/data modeler or in a similar role.
- Technical expertise in designing data models, database design, and data analysis.
- Prior experience migrating data from legacy systems to new solutions.
- Good knowledge of metadata management, data modelling, and related tools (erwin, ER/Studio, or others) is required.
- Experience gathering and analysing system/business requirements and providing mapping documents for technical teams.
- Strong analytical skills, with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
- Hands-on experience with SQL.
- Problem-solving attitude.
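The posting repeatedly stresses the data mapping document with ETL attributes and partition details. Here is a minimal, tool-agnostic sketch of what such a mapping might capture; all source/target names, ETL attributes, and partition details are illustrative assumptions, not taken from the posting.

```python
# A hypothetical source-to-target mapping document expressed as plain
# Python data, so it stays independent of any particular modeling tool.
mapping_document = [
    {
        "source": "crm.customers.cust_id",
        "target": "dw.dim_customer.customer_key",
        "transformation": "surrogate key lookup",
        "etl_attributes": {"load_type": "incremental", "scd_type": 2},
        "partition": None,
    },
    {
        "source": "pos.sales.sale_ts",
        "target": "dw.fact_sales.sale_date_key",
        "transformation": "cast timestamp to yyyymmdd integer",
        "etl_attributes": {"load_type": "incremental", "scd_type": None},
        "partition": "by sale_date_key (daily)",
    },
]

# Print a compact source -> target summary of the mapping.
for row in mapping_document:
    print(f"{row['source']:>28} -> {row['target']}")
```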
Posted 3 months ago
3 - 5 years
5 - 9 Lacs
Pune
Work from Office
Project Role: Data Modeler
Project Role Description: Work with key business representatives, data owners, end users, application designers, and data architects to model current and new data.
Must-have skills: UNIX Shell Scripting, Teradata BI, Ab Initio, MongoDB
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: BE or BTech

Summary: As a Data Modeler, you will work with key business representatives, data owners, end users, application designers, and data architects to model current and new data, using UNIX Shell Scripting as the primary skill. Your typical day will involve working with MongoDB, Teradata BI, and Ab Initio to develop and maintain data models.

Roles & Responsibilities:
- Develop and maintain data models using UNIX Shell Scripting as the primary skill.
- Collaborate with key business representatives, data owners, end users, application designers, and data architects to model current and new data.
- Ensure data models are optimized for performance and scalability.
- Perform data analysis and profiling to identify data quality issues and recommend solutions.
- Develop and maintain technical documentation related to data models and data integration processes.

Professional & Technical Skills:
- Proficiency in UNIX Shell Scripting as the primary skill.
- Experience with MongoDB, Teradata BI, and Ab Initio (must-have).
- Strong understanding of data modeling concepts and techniques.
- Experience with data analysis and profiling tools.
- Experience with data integration and ETL processes.

Qualification: BE or BTech
Posted 3 months ago
5 - 9 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Cloud Platform Engineer
Project Role Description: Designs, builds, tests, and deploys cloud application solutions that integrate cloud and non-cloud infrastructure. Can deploy infrastructure and platform environments, and creates a proof of architecture to test architecture viability, security, and performance.
Must-have skills: Data Modeling Techniques and Methodologies
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: BE

Summary: As a Cloud Platform Engineer, you will be responsible for designing, building, testing, and deploying cloud application solutions that integrate cloud and non-cloud infrastructure. Your typical day will involve deploying infrastructure and platform environments and creating a proof of architecture to test architecture viability, security, and performance.

Roles & Responsibilities:
- Design, build, test, and deploy cloud application solutions that integrate cloud and non-cloud infrastructure.
- Deploy infrastructure and platform environments, creating a proof of architecture to test architecture viability, security, and performance.
- Collaborate with cross-functional teams to ensure successful delivery of cloud-based solutions.
- Provide technical guidance and support to team members and stakeholders.
- Overall 5+ years of experience working on DWH and data analytics projects; must have done fresh data modeling for two or more projects.
- Strong understanding of data warehouse concepts; well versed with data modeling tools such as erwin.
- Experience rationalizing data models to move to a self-service mode.
- Cloud (Azure) exposure or experience is good to have.

Professional & Technical Skills:
- Data Modeling Techniques and Methodologies.
- Experience with cloud platforms such as AWS, Azure, or Google Cloud Platform.
- Strong understanding of cloud architecture and infrastructure.
- Experience with containerization technologies such as Docker and Kubernetes.
- Experience with scripting languages such as Python or Bash.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Data Modeling Techniques and Methodologies.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful cloud-based solutions.
- This position is based at our Bengaluru office.

Qualification: BE
Posted 3 months ago
3 - 8 years
8 - 10 Lacs
Pune
Work from Office
Project Role: Data Modeler
Project Role Description: Work with key business representatives, data owners, end users, application designers, and data architects to model current and new data.
Must-have skills: AWS Glue
Good-to-have skills: Teradata BI, MongoDB, Ab Initio
Minimum 3 year(s) of experience is required
Educational Qualification: BE or BTech

Summary: As an AWS Glue Data Modeler, you will be responsible for modeling current and new data, working with key business representatives, data owners, end users, application designers, and data architects. Your typical day will involve working with AWS Glue, Teradata BI, MongoDB, and Ab Initio to develop and maintain data models for our clients.

Roles & Responsibilities:
- Design and develop data models using AWS Glue for current and new data.
- Collaborate with key business representatives, data owners, end users, application designers, and data architects to ensure data models meet business requirements.
- Develop and maintain data models using Teradata BI, MongoDB, and Ab Initio.
- Ensure data models are scalable, efficient, and maintainable.
- Provide technical guidance and support to team members and stakeholders.

Professional & Technical Skills:
- Must-have: experience with AWS Glue.
- Good to have: experience with Teradata BI, MongoDB, and Ab Initio.
- Strong understanding of data modeling concepts and techniques.
- Experience with database design and development.
- Experience with ETL tools and processes.
- Experience with data warehousing and data integration.

Additional Information:
- The candidate should have a minimum of 5-6 years of experience in AWS Glue.
- The ideal candidate will possess a strong educational background in computer science, information technology, or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Pune office.

Qualification: BE or BTech
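For context on what Glue-based ETL work looks like in practice, here is a minimal sketch of a Glue job skeleton. It only executes inside the AWS Glue runtime, and the catalog database, table, S3 path, and column mappings are hypothetical, not taken from the posting.

```python
# A minimal AWS Glue ETL job sketch: read a cataloged table, apply a
# source-to-target mapping, and write partitioned Parquet output.
import sys
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a source table registered in the Glue Data Catalog (names assumed).
orders = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="orders"
)

# Map source columns onto the target model, as a mapping document would specify.
mapped = ApplyMapping.apply(
    frame=orders,
    mappings=[
        ("order_id", "long", "order_key", "long"),
        ("order_ts", "string", "order_date", "timestamp"),
        ("amount", "double", "order_amount", "double"),
    ],
)

# Write to the target location, partitioned as the model specifies.
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/curated/orders/",
                        "partitionKeys": ["order_date"]},
    format="parquet",
)
job.commit()
```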
Posted 3 months ago
3 - 7 years
5 - 9 Lacs
Mumbai
Work from Office
Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must-have skills: Apache Kafka
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: Minimum 15 years of full-time education

Summary: As an Application Designer, you will be responsible for assisting in defining requirements and designing applications to meet business process and application requirements using Apache Kafka. Your typical day will involve working with cross-functional teams, analyzing requirements, and designing scalable and reliable applications.

Roles & Responsibilities:
- Collaborate with cross-functional teams to analyze business requirements and design scalable and reliable applications using Apache Kafka.
- Design and develop Kafka-based solutions for real-time data processing and streaming.
- Ensure the performance, scalability, and reliability of Kafka clusters and applications.
- Implement security and access control measures for Kafka clusters and applications.
- Stay updated with the latest advancements in Kafka and related technologies, integrating innovative approaches for sustained competitive advantage.

Professional & Technical Skills:
- Must-have: strong experience in Apache Kafka.
- Good to have: experience with Apache Spark, Apache Flink, and other big data technologies.
- Experience in designing and developing Kafka-based solutions for real-time data processing and streaming.
- Strong understanding of Kafka architecture, configuration, and performance tuning.
- Experience in implementing security and access control measures for Kafka clusters and applications.
- Solid grasp of distributed systems and microservices architecture.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Apache Kafka.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful Kafka-based solutions.
- This position is based at our Mumbai office.

Qualification: Minimum 15 years of full-time education
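To make the "real-time data processing and streaming" requirement concrete, here is a minimal sketch of a reliable Kafka producer using the confluent-kafka Python client; the broker address, topic, key, and payload are illustrative assumptions, not from the posting.

```python
# A minimal reliable-producer sketch: idempotent writes with full-ISR acks
# and an asynchronous delivery report.
from confluent_kafka import Producer

producer = Producer({
    "bootstrap.servers": "localhost:9092",  # assumed local broker
    "acks": "all",                          # wait for full ISR acknowledgement
    "enable.idempotence": True,             # avoid duplicates on retries
})

def on_delivery(err, msg):
    # Called asynchronously once the broker confirms (or rejects) the write.
    if err is not None:
        print(f"delivery failed: {err}")
    else:
        print(f"delivered to {msg.topic()}[{msg.partition()}] @ {msg.offset()}")

producer.produce("payments", key="account-42", value=b'{"amount": 99.5}',
                 on_delivery=on_delivery)
producer.flush()  # block until all queued messages are delivered
```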
Posted 3 months ago
4 - 9 years
16 - 25 Lacs
Bengaluru
Hybrid
Roles & Responsibilities:
- Experience in working with SMEs to build conceptual data models.
- Convert business requirements into technical requirements, supported by an efficient data model and detailed mapping.
- Build logical/physical models in Azure using best practices to ensure business requirements are met within the agreed SLAs.
- Hands-on experience implementing the following methodologies: dimensional models (Ralph Kimball), Corporate Information Factory (Bill Inmon), and Data Vault (a star-schema sketch follows this listing).
- Experience building a Data Lake / Data Warehouse on Azure; good knowledge of ADLS, ADF, ADB, and SQL.
- Optimize logical/physical models to support new requirements.
- Optimize logical/physical models with the required metadata, using industry-standard tools (preferably ER/Studio).
- Develop and implement best practices and processes around naming standards, to ensure consistency across models, and around versioning of models.
- Perform reverse engineering of physical models from existing databases and scripts.
- Recommend opportunities to reuse existing data models for new use cases.
- Analyse challenges around data integration / data mastering and propose appropriate solutions.
- Work with the development team to implement the data strategy, build efficient data flows, and test data loads.
- Work with consumption teams to explain data model usage and amend the model to support efficient exploitation of data.
- Create, review, and maintain data modelling and data mapping artefacts.
- Contribute to the end-to-end (E2E) data architecture and design (ingestion -> transformation -> reporting).
- Identify ways to improve data reliability, efficiency, and quality.
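As a concrete reference for the Kimball-style dimensional modeling named above, here is a minimal star-schema sketch. It uses sqlite3 from the Python standard library purely so it runs anywhere; all table and column names are illustrative assumptions.

```python
# A minimal Kimball-style star schema: two dimension tables and one fact
# table at a declared grain, plus a typical slice-and-aggregate query.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Dimension tables hold descriptive context.
    CREATE TABLE dim_date (
        date_key     INTEGER PRIMARY KEY,   -- e.g. 20240131
        full_date    TEXT NOT NULL,
        month        INTEGER NOT NULL,
        year         INTEGER NOT NULL
    );
    CREATE TABLE dim_product (
        product_key  INTEGER PRIMARY KEY,
        product_name TEXT NOT NULL,
        category     TEXT NOT NULL
    );
    -- The fact table holds measures at a declared grain:
    -- one row per product per day.
    CREATE TABLE fact_sales (
        date_key     INTEGER NOT NULL REFERENCES dim_date(date_key),
        product_key  INTEGER NOT NULL REFERENCES dim_product(product_key),
        units_sold   INTEGER NOT NULL,
        revenue      REAL NOT NULL
    );
""")

# A typical analytic query: slice the fact by dimension attributes.
rows = conn.execute("""
    SELECT d.year, p.category, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.year, p.category
""").fetchall()
```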
Posted 3 months ago
5 - 10 years
7 - 12 Lacs
Coimbatore
Work from Office
Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage, and integration.
Must-have skills: Network Infrastructures
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As a Data Architect, you will be responsible for defining the data requirements and structure for the application. You will model and design the application data structure, storage, and integration. Your typical day will involve collaborating with cross-functional teams, analyzing data needs, and implementing effective data solutions.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform. Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Design and develop data architecture strategies and solutions.
- Create and maintain data models and database designs.
- Ensure data integrity and security.
- Collaborate with stakeholders to understand data requirements and translate them into technical specifications.

Professional & Technical Skills:
- Must-have: proficiency in Network Infrastructures.
- Experience with data modeling and database design.
- Strong understanding of data integration and ETL processes.
- Knowledge of data governance and data management best practices.
- Experience with data warehousing and data visualization tools.
- Good to have: experience with cloud-based data platforms.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Network Infrastructures.
- This position is based in Noida.
- 15 years of full-time education is required.

Qualification: 15 years full-time education
Posted 3 months ago
5 - 10 years
18 - 32 Lacs
Bengaluru, Hyderabad
Hybrid
Experience: 5+ years
Location: Bangalore & Hyderabad
Shift: 2-11 PM

Data Modeler Job Description: Looking for candidates with a strong background in data modeling, metadata management, and data system optimization. You will be responsible for analyzing business needs, developing long-term data models, and ensuring the efficiency and consistency of our data systems.

Key areas of expertise include:
• Analyze and translate business needs into long-term solution data models.
• Evaluate existing data systems and recommend improvements.
• Define rules to translate and transform data across data models.
• Work with the development team to create conceptual data models and data flows.
• Develop best practices for data coding to ensure consistency within the system.
• Review modifications of existing systems for cross-compatibility.
• Implement data strategies and develop physical data models.
• Update and optimize local and metadata models.
• Utilize canonical data modeling techniques to enhance data system efficiency.
• Evaluate implemented data systems for variances, discrepancies, and efficiency.
• Troubleshoot and optimize data systems to ensure optimal performance.
• Strong expertise in relational and dimensional modeling (OLTP, OLAP).
• Experience with data modeling tools (erwin, ER/Studio, Visio, PowerDesigner).
• Proficiency in SQL and database management systems (Oracle, SQL Server, MySQL, PostgreSQL).
• Knowledge of NoSQL databases (MongoDB, Cassandra) and their data structures.
• Experience working with data warehouses and BI tools (Snowflake, Redshift, BigQuery, Tableau, Power BI).
• Familiarity with ETL processes, data integration, and data governance frameworks.
• Strong analytical, problem-solving, and communication skills.

Qualifications:
• Bachelor's degree in Engineering or a related field.
• 5 to 9 years of experience in data modeling or a related field.
• 4+ years of hands-on experience with dimensional and relational data modeling.
• Expert knowledge of metadata management and related tools.
• Proficiency with data modeling tools such as erwin, PowerDesigner, or Lucid.
• Knowledge of transactional databases and data warehouses.

Preferred Skills:
• Experience with cloud-based data solutions (AWS, Azure, GCP).
• Knowledge of big data technologies (Hadoop, Spark, Kafka).
• Understanding of graph databases and real-time data processing.
• Certifications in data management, modeling, or cloud data engineering.
• Excellent communication and presentation skills.
• Strong interpersonal skills to collaborate effectively with various teams.
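Since the posting calls out canonical data modeling and rules to translate data across models, here is a minimal Python sketch of the idea: two hypothetical source layouts mapped onto one canonical record. All system and field names are assumptions for illustration.

```python
# Canonical-model transformation rules: each source system gets one mapping
# function onto a shared canonical shape.
from dataclasses import dataclass

@dataclass
class CanonicalCustomer:
    customer_id: str
    full_name: str
    email: str

def from_crm(rec: dict) -> CanonicalCustomer:
    # The CRM splits the name; the canonical model stores it whole.
    return CanonicalCustomer(
        customer_id=f"CRM-{rec['id']}",
        full_name=f"{rec['first_name']} {rec['last_name']}",
        email=rec["email_address"],
    )

def from_billing(rec: dict) -> CanonicalCustomer:
    # Billing already stores a single display name.
    return CanonicalCustomer(
        customer_id=f"BIL-{rec['acct_no']}",
        full_name=rec["display_name"],
        email=rec["contact_email"],
    )

print(from_crm({"id": 7, "first_name": "Asha", "last_name": "Rao",
                "email_address": "asha@example.com"}))
```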
Posted 3 months ago
3 - 8 years
5 - 10 Lacs
Pune
Work from Office
Job Title: Solutions IT Developer - Kafka Specialist
Location: Toronto / Offshore - Pune

About the Role: We are seeking a seasoned Solutions IT Developer with a strong background in Apache Kafka to join the developer advocacy function in our event streaming team. The ideal candidate will be responsible for Kafka code reviews with clients, troubleshooting client connection issues with Kafka, and supporting client onboarding to Confluent Cloud. This role requires a mix of software development expertise and a deep understanding of Kafka architecture, components, and tuning.

Responsibilities:
1. Support for Line of Business (LOB) Users:
- Assist LOB users with onboarding to Apache Kafka (Confluent Cloud/Confluent Platform), ensuring a smooth integration process and understanding of the platform's capabilities.
2. Troubleshooting and Technical Support:
- Resolve connectivity issues, including client and library problems, to ensure seamless use of our Software Development Kit (SDK), accelerators, and Kafka client libraries.
- Address network connectivity and access issues.
- Provide deep support for the Kafka library, offering advanced troubleshooting and guidance.
- Java 11/17 and Spring Boot (Spring Kafka, Spring Cloud Stream Kafka) experience.
3. Code Reviews and Standards Compliance:
- Perform thorough code reviews to validate client code against our established coding standards and best practices.
- Support the development of async specifications tailored to client use cases, promoting effective and efficient data handling.
4. Developer Advocacy:
- Act as a developer advocate for all Kafka development at TD, fostering a supportive community and promoting best practices among developers.
5. Automation and APIs:
- Manage and run automation pipelines for clients using REST APIs as we build out the GitHub Actions flow.
6. Documentation and Knowledge Sharing:
- Update and maintain documentation standards, including troubleshooting guides, to ensure clear and accessible information is available.
- Create and disseminate knowledge materials, such as how-tos and FAQs, to answer common client questions related to Kafka development.

Role Requirements:
Qualifications:
- Bachelor's degree in Computer Science.
- Proven work experience as a Solutions Developer or in a similar role with a focus on Kafka design and development.
Skills:
- In-depth knowledge of Java 11/17 and Spring Boot (Spring Kafka, Spring Cloud Stream Kafka).
- Deep knowledge of Apache Kafka, including Kafka Streams and Kafka Connect experience.
- Strong development skills in one or more high-level programming languages (Java, Python).
- Familiarity with Kafka API development and integration.
- Understanding of distributed systems principles and data streaming concepts.
- Experience with source control tools such as Git, and with CI/CD pipelines.
- Excellent problem-solving and critical-thinking skills.
Preferred:
- Kafka certification (e.g., Confluent Certified Developer for Apache Kafka).
- Experience with streaming data platforms and ETL processes.
- Prior work with NoSQL databases and data warehousing solutions.
Experience:
- Minimum of 4 years of hands-on experience with Apache Kafka.
- Experience with large-scale data processing and event-driven system design.
Other Requirements:
- Good communication skills, both written and verbal.
- Ability to work independently as well as collaboratively.
- Strong analytical skills and attention to detail.
- Willingness to keep abreast of industry developments and new technologies.
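The client connection issues this role troubleshoots usually come down to a few consumer-side concepts: group membership, offset management, and polling. The posting centres on Java/Spring Kafka, but here is a language-neutral illustration as a minimal Python sketch using the confluent-kafka client; the broker, topic, and group names are assumptions.

```python
# A minimal Kafka consumer sketch: manual offset commits so a record is
# only marked consumed after it has actually been processed.
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "onboarding-demo",     # consumers sharing this id split partitions
    "auto.offset.reset": "earliest",   # where to start with no committed offset
    "enable.auto.commit": False,       # commit manually after processing
})
consumer.subscribe(["payments"])

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            print(f"consumer error: {msg.error()}")
            continue
        print(f"{msg.topic()}[{msg.partition()}] @ {msg.offset()}: {msg.value()}")
        consumer.commit(msg)  # commit only after the record is processed
finally:
    consumer.close()
```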
Posted 3 months ago
3 - 7 years
5 - 9 Lacs
Pune
Work from Office
Data Modelling, erwin, SQL
Posted 3 months ago
2 - 5 years
4 - 7 Lacs
Kochi
Work from Office
Translates business needs into a data model, providing expertise on data modeling tools and techniques for designing data models for applications and related systems. Skills include logical and physical data modeling, and knowledge of ERwin, MDM, and/or ETL. Data modeling is the process used to define and analyze the data requirements needed to support the business processes within the scope of the corresponding information systems in an organization.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise: As above. In addition, the process of data modeling involves professional data modelers working closely with business stakeholders, as well as potential users of the information system. Three different types of data models are produced while progressing from requirements to the actual database to be used for the information system: conceptual, logical, and physical.

Preferred technical and professional experience: As above.
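As a small illustration of that progression, here is a minimal Python sketch using the standard library's sqlite3: the conceptual and logical levels are described in comments, and the physical level is the DBMS-specific DDL. Entity and column names are illustrative assumptions.

```python
# Conceptual model (prose, not code): a Customer places Orders.
# Logical model: entities with attributes and a key-based relationship,
#   still independent of any particular DBMS.
# Physical model: DBMS-specific DDL with concrete types, constraints,
#   and indexes, shown below for SQLite.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customer (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL,
        email       TEXT UNIQUE
    );
    CREATE TABLE "order" (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
        placed_at   TEXT NOT NULL,
        total       REAL NOT NULL CHECK (total >= 0)
    );
    -- Physical-level concern: index the foreign key for join performance.
    CREATE INDEX idx_order_customer ON "order"(customer_id);
""")
```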
Posted 3 months ago
5 - 10 years
20 - 35 Lacs
Chennai, Bengaluru, Hyderabad
Work from Office
- Should be very good in erwin data modelling.
- Should have good knowledge of data quality and data catalogs.
- Should have a good understanding of data lineage.
- Should have a good understanding of data modelling in both OLTP and OLAP systems.
- Should have worked on a data warehouse or big data architecture.
- Should be very good in ANSI SQL.
- Should have a good understanding of data visualization.
- Should be very comfortable and experienced in data analysis.
- Should have good knowledge of data cataloging tools and data access policies.

Locations: Chennai/Hyderabad/Bangalore/Pune/Jaipur/Gurgaon
Mode: Hybrid (2 days in office)
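Data lineage, one of the requirements above, is easiest to picture as a graph of upstream dependencies per column, which is essentially what catalog tools record. Here is a minimal, tool-agnostic Python sketch; the dataset and column names are illustrative assumptions.

```python
# Column-level lineage as an upstream-dependency graph, with a recursive
# walk that answers "where did this field come from?".
lineage = {
    "dw.fact_sales.revenue": {
        "upstream": ["staging.pos_sales.amount"],
        "transformation": "SUM(amount) grouped by day and product",
    },
    "staging.pos_sales.amount": {
        "upstream": ["pos.sales.amount"],
        "transformation": "1:1 copy, currency normalized to INR",
    },
}

def trace(column: str, depth: int = 0) -> None:
    # Print the column, then recurse into each upstream source.
    print("  " * depth + column)
    for parent in lineage.get(column, {}).get("upstream", []):
        trace(parent, depth + 1)

trace("dw.fact_sales.revenue")
```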
Posted 3 months ago
5 - 10 years
7 - 12 Lacs
Pune
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Data Modeling Techniques and Methodologies
Good-to-have skills: Oracle Procedural Language Extensions to SQL (PL/SQL)
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for managing the team and ensuring successful project delivery. Your typical day will involve collaborating with multiple teams, making key decisions, and providing solutions to problems for your immediate team and across multiple teams.

Roles & Responsibilities:
- Design, develop, and maintain conceptual, logical, and physical data models to support business needs and objectives across various domains.
- Collaborate with stakeholders, including business analysts, data architects, and application developers, to gather and define data requirements.
- Implement best practices for data modeling, including normalization, de-normalization, and dimensional modeling, to support efficient data storage and retrieval.
- Develop and maintain data dictionaries, data lineage documentation, and metadata repositories to support data governance and standardization efforts.
- Perform data mapping and data profiling to ensure data quality and consistency across systems and environments.
- Work closely with ETL developers to design data integration strategies, ensuring seamless data flow between source and target systems.
- Knowledge of star/snowflake schemas (a snowflake-schema sketch follows this listing).
- Knowledge of cloud technologies: Azure and/or GCP and/or AWS.
- Experience with the erwin Data Modeler tool is a must.

Professional & Technical Skills:
- Must-have: proficiency in Data Modeling Techniques and Methodologies.
- Good to have: experience with Oracle Procedural Language Extensions to SQL (PL/SQL).
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Data Modeling Techniques and Methodologies.
- This position is based at our Pune office.
- 15 years of full-time education is required.

Qualification: 15 years full-time education
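The star/snowflake distinction mentioned above comes down to whether dimension attributes are denormalized into one table (star) or normalized into related tables (snowflake). Here is a minimal sketch of a snowflaked product dimension, using sqlite3 from the Python standard library; all names are illustrative assumptions.

```python
# A snowflaked dimension: category attributes are split into their own
# table instead of being repeated on every product row, trading an extra
# join at query time for less redundancy in the dimension.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_category (
        category_key  INTEGER PRIMARY KEY,
        category_name TEXT NOT NULL
    );
    -- In a star schema, category_name would simply be a column here.
    CREATE TABLE dim_product (
        product_key   INTEGER PRIMARY KEY,
        product_name  TEXT NOT NULL,
        category_key  INTEGER NOT NULL REFERENCES dim_category(category_key)
    );
""")
```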
Posted 3 months ago
15 - 20 years
45 - 50 Lacs
Mumbai
Work from Office
Responsibilities:
1. Architecture Review: Conduct comprehensive assessments of existing Data, AI, and Automation architecture landscapes across various financial institutions and regulators. Identify gaps compared to leading architecture patterns and best practices.
2. Gap Analysis and Pitch Development: Construct detailed pitches for transformation programs aimed at bridging identified gaps. Articulate the value proposition of proposed solutions to stakeholders.
3. Technical Solution Architecture: Lead the formulation of technical solution architectures for proposals and pitches. Ensure that solutions are innovative, scalable, and aligned with industry standards, while also establishing differentiators against competitors.
4. Technology Stack Evaluation: Evaluate current technology stacks used by banks and assist in selecting appropriate products and partner ecosystems that enhance the overall architecture.
5. Execution Oversight: Oversee the implementation of solution architectures during the execution phase. Review progress against architectural designs and ensure adherence to established guidelines and standards.
6. Stakeholder Collaboration: Collaborate with cross-functional teams, including business analysts, developers, and project managers, to ensure alignment between business needs and technical solutions.
7. Documentation and Reporting: Maintain clear documentation of architectural designs, decisions made, and execution progress. Provide regular updates to stakeholders on project status and any challenges encountered.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
1. Educational Background: A bachelor's or master's degree in Computer Science, Information Technology, Engineering, or a related field is required.
2. Experience: At least 15 years of experience in IT consulting or a related field, with a strong focus on conceptualizing solution architecture in the banking sector.
3. Banking Knowledge: Expertise in leading or crafting analytical solutions or products within the banking sector.
4. Skills:
a. Proficiency in designing scalable architectures that leverage Data, AI, and automation technologies.
b. Strong understanding of cloud computing platforms (e.g., AWS, Azure, GCP) and their application in banking solutions.
c. Experience with various architectural scenarios in banking:
i. Very large-scale data management and high-performing solution architectures.
ii. Low-latency near-real-time and real-time processing.
iii. High reliability.
d. Familiarity with programming languages, API libraries, and communication protocols in banking.
5. Professional Skills:
a. Excellent ability to identify gaps in existing architectures.
b. Strong communication skills to effectively convey complex technical concepts to non-technical stakeholders.
c. Ability to assess, guide, and course-correct engagement execution decisions pertaining to solution architecture.
d. Understanding of regulatory guidelines on banking system interoperability and security.

Preferred technical and professional experience:
1. Familiarity with emerging trends in banking such as digital banking, embedded banking, and regulatory compliance requirements.
2. Certifications: Relevant certifications such as TOGAF (The Open Group Architecture Framework), AWS Certified Solutions Architect, or similar credentials that demonstrate expertise in architecture design.
3. Experience with Analytical Solutions: Prior experience leading analytical solutions or products within the banking industry is highly desirable.
4. Understanding of Security Principles: Knowledge of security frameworks relevant to banking applications to ensure compliance with regulatory standards.
Posted 3 months ago
3 - 7 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Apache Kafka
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: Graduation

Summary: As a Data Modeler, you will be responsible for working with key business representatives, data owners, end users, application designers, and data architects to model current and new data using Apache Kafka. Your typical day will involve designing and implementing data models, ensuring data quality and integrity, and collaborating with cross-functional teams to deliver impactful data-driven solutions.

Roles & Responsibilities:
- Design and implement data models using Apache Kafka, ensuring data quality and integrity.
- Collaborate with cross-functional teams, including key business representatives, data owners, end users, application designers, and data architects, to model current and new data.
- Develop and maintain data dictionaries, data flow diagrams, and other documentation related to data modeling.
- Ensure compliance with data security and privacy policies and regulations, including GDPR and CCPA.

Professional & Technical Skills:
- Must-have: experience with Apache Kafka.
- Good to have: experience with other data modeling tools and technologies, such as erwin or ER/Studio.
- Strong understanding of data modeling concepts and techniques, including conceptual, logical, and physical data models.
- Experience with data analysis and profiling tools, such as Talend or Informatica.
- Solid grasp of SQL and other database technologies, including Oracle, MySQL, and SQL Server.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Apache Kafka.
- The ideal candidate will possess a strong educational background in computer science, information systems, or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Bengaluru office.

Qualification: Graduation
Posted 3 months ago
5 - 10 years
5 - 9 Lacs
Hyderabad
Work from Office
Project Role: Database Administrator
Project Role Description: Administer, develop, test, or demonstrate databases. Perform many related database functions across one or more teams or clients, including designing, implementing, and maintaining new databases, backup/recovery, and configuration management. Install database management systems (DBMS) and provide input for modification of procedures and documentation used for problem resolution and day-to-day maintenance.
Must-have skills: Data Modeling Techniques and Methodologies
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: Graduate

Data Modelling:
- Collaborate with cross-functional teams to understand business requirements and translate them into effective and scalable data models.
- Develop and maintain data models using industry-leading practices, with a strong emphasis on Data Mesh and Data Vault 2.0 methodologies.
- Ensure that data models align with standards and guidelines defined by data architects and are adaptable to the evolving needs of the business.
- Responsible for the development of conceptual, logical, and physical data models, and for the implementation of Data Mesh and Data Fabric on target platforms (Google BigQuery) using erwin.

Domain Expertise:
- Acquire a deep understanding of various business domains and their associated data, processes, and systems, ensuring that data models reflect the domain-specific context and requirements.

Data Mesh Implementation:
- Work closely with Data Mesh architecture principles to ensure decentralised ownership and a domain-oriented approach to data.
- Define and implement data products, aligning with the Data Mesh principles of domain-driven, decentralised data ownership.
- Ensure that data is structured so as to easily conform to the security controls and obligations that relate to the data.

Data Vault 2.0 Implementation:
- Design and implement Data Vault 2.0-compliant data warehouses and hubs.
- Ensure that the Data Vault model provides flexibility, scalability, and resilience in handling complex and evolving business requirements.
- Ensure that every artifact built is optimised and monitored, and that cost is always considered.
- Support, guide, and mentor team members in the domain.

Collaboration:
- Prior experience working in an agile squad environment with minimal supervision.
- Provide expert technical advice, presentations, and education (to technical and business audiences) within Enterprise Data and Architectures and within the business, including data stewards and enterprise architects, regarding enterprise conformance and Data Vault modelling concepts.
- Collaborate with solution architects, data engineers, data scientists, and other stakeholders to understand data usage patterns, deal with production and data quality issues, and optimize data models for performance.
- Provide guidance and support to development teams in the implementation of data models within the Data Mesh and Data Vault 2.0 frameworks.

Documentation:
- Create and maintain comprehensive documentation of data models, ensuring that it is accessible to relevant stakeholders.
- Keep abreast of industry trends, emerging technologies, and best practices related to data modelling and integration.
- Create and maintain artefacts relating to data models (e.g., DDLs, data mappings, DMLs, data dictionaries, change registers, etc.).

Other skills beneficial for the role:
- Certification in Data Vault 2.0 or related technologies.
- Experience with tools such as Apache Kafka, Apache Flink, or similar data streaming platforms.
- Familiarity with Google Cloud Platform services, or AWS platform services with respect to data and AI/ML.
- Proficiency and experience with erwin Data Modeler.
- Experience or exposure to data catalogues such as Collibra and Ab Initio would be highly beneficial.

Qualification: Graduate
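Since the role centres on Data Vault 2.0, here is a minimal sketch of its three core structures (hub, link, satellite). It uses sqlite3 from the Python standard library purely so the shape is runnable anywhere; the posting's actual target is BigQuery, and all table and column names are illustrative assumptions.

```python
# Data Vault 2.0 core structures: hubs hold business keys, links hold
# relationships between hubs, and satellites hold versioned attributes.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Hub: one row per business key, nothing else.
    CREATE TABLE hub_customer (
        customer_hk   TEXT PRIMARY KEY,      -- hash of the business key
        customer_bk   TEXT NOT NULL UNIQUE,  -- the business key itself
        load_dts      TEXT NOT NULL,
        record_source TEXT NOT NULL
    );
    CREATE TABLE hub_order (
        order_hk      TEXT PRIMARY KEY,
        order_bk      TEXT NOT NULL UNIQUE,
        load_dts      TEXT NOT NULL,
        record_source TEXT NOT NULL
    );
    -- Link: a relationship between hubs, also keyed by hash.
    CREATE TABLE link_customer_order (
        customer_order_hk TEXT PRIMARY KEY,
        customer_hk       TEXT NOT NULL REFERENCES hub_customer(customer_hk),
        order_hk          TEXT NOT NULL REFERENCES hub_order(order_hk),
        load_dts          TEXT NOT NULL,
        record_source     TEXT NOT NULL
    );
    -- Satellite: descriptive attributes, versioned by load timestamp.
    CREATE TABLE sat_customer_details (
        customer_hk   TEXT NOT NULL REFERENCES hub_customer(customer_hk),
        load_dts      TEXT NOT NULL,
        name          TEXT,
        email         TEXT,
        record_source TEXT NOT NULL,
        PRIMARY KEY (customer_hk, load_dts)
    );
""")
```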
Posted 3 months ago