1.0 - 3.0 years
2 - 4 Lacs
Ahmedabad
Work from Office
Design, develop, and maintain complex SQL queries, stored procedures, views, triggers, and functions. Work on database design, normalization, and schema creation to meet business needs. Optimize queries for performance, scalability, and maintainability.
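As a rough sketch of the kind of work this posting describes, the snippet below builds a reporting view and a supporting index using Python's built-in sqlite3 module (table and column names are hypothetical, and SQLite has no stored procedures, so those are omitted):

```python
import sqlite3

# Hypothetical example: a small orders table, a reporting view,
# and an index that speeds up lookups by customer.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (
    id INTEGER PRIMARY KEY,
    customer TEXT NOT NULL,
    amount REAL NOT NULL
);

-- A view that aggregates per-customer totals for reporting.
CREATE VIEW customer_totals AS
SELECT customer, SUM(amount) AS total, COUNT(*) AS n_orders
FROM orders
GROUP BY customer;

-- An index so filters on customer avoid a full table scan.
CREATE INDEX idx_orders_customer ON orders (customer);
""")

conn.executemany(
    "INSERT INTO orders (customer, amount) VALUES (?, ?)",
    [("acme", 100.0), ("acme", 50.0), ("globex", 75.0)],
)

row = conn.execute(
    "SELECT total, n_orders FROM customer_totals WHERE customer = ?",
    ("acme",),
).fetchone()
print(row)  # (150.0, 2)
```

The index matters for the "optimize queries" duty: without it, a filter on `customer` would scan every row of the table.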
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
We are looking for a skilled and analytical Data Analyst with expertise in data modeling, data analysis, and Python programming. As a Data Analyst, you will design data models, conduct in-depth analysis, and create automated solutions to support business decision-making and reporting.

Your key responsibilities will include designing and implementing conceptual, logical, and physical data models to support analytics and reporting. You will analyze large datasets to uncover trends, patterns, and insights that drive business decisions. Additionally, you will develop and maintain Python scripts for data extraction, transformation, and analysis. Collaboration with data engineers, business analysts, and stakeholders to understand data requirements is essential. Creating dashboards, reports, and visualizations to communicate findings effectively will be part of your role, as will ensuring data quality, consistency, and integrity across systems and documenting data definitions, models, and analysis processes.

The ideal candidate should have strong experience in data modeling, including ER diagrams, normalization, and dimensional modeling. Proficiency in Python for data analysis (Pandas, NumPy, Matplotlib, etc.) is required, along with a solid understanding of SQL and relational databases and experience with data visualization tools such as Power BI, Tableau, or matplotlib/seaborn. You should be able to translate business requirements into technical solutions and possess excellent analytical, problem-solving, and communication skills.

Virtusa values teamwork, quality of life, and professional and personal development. Joining Virtusa means becoming part of a global team of 27,000 individuals dedicated to your growth, with opportunities to work on exciting projects and leverage state-of-the-art technologies throughout your career with us.
At Virtusa, collaboration and a team-oriented environment are paramount, providing great minds with a dynamic space to cultivate new ideas and promote excellence.
Posted 1 week ago
5.0 - 9.0 years
8 - 13 Lacs
Hyderabad
Work from Office
About The Role

Role Overview: Develop efficient SQL queries and maintain views, models, and data structures across federated and transactional databases to support analytics and reporting. Core tools: SQL (advanced), Python for data exploration and scripting, and shell scripting for lightweight automation.

Key Responsibilities:
- Write complex SQL queries for data extraction and transformations
- Build and maintain views, materialized views, and data models
- Enable efficient federated queries and optimize joins across databases
- Support performance tuning, indexing, and query optimization efforts

Primary:
- Expertise in MS SQL Server / Oracle DB / PostgreSQL, columnar DBs like DuckDB, and federated data access
- Good understanding of the Apache Arrow columnar data format, Flight SQL, and Apache Calcite

Secondary:
- Experience with data modelling, ER diagrams, and schema design
- Familiarity with reporting layer backends (e.g., Power BI datasets)
- Familiarity with utility operations and power distribution is preferred
- Experience with cloud-hosted databases is preferred
- Exposure to data lakes in cloud ecosystems is a plus

Optional:
- Familiarity with Grid CIM (Common Information Model; IEC 61970, IEC 61968)
- Familiarity with GE ADMS DNOM (Distribution Network Object Model) and GE GridOS Data Fabric
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
At EY, you have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of yourself. Your unique voice and perspective are essential to help EY become even better. Join us to build an exceptional experience for yourself and contribute to creating a better working world for all.

As a Senior Aera Developer, you will be part of the Supply Chain Tech group of the EY GDS Consulting team. Your role involves translating business needs into technical specifications, performing data analysis and manipulation, and simplifying business concepts through data modeling. You will be responsible for developing reporting systems, writing and customizing code in various Aera modules, and evaluating and improving Aera Skills. Additionally, you will generate quality reports, develop data visualizations, and work with clients throughout the implementation lifecycle.

To succeed in this role, you must have experience as an Aera Skill Builder, expertise in BI reporting and data warehouse concepts, strong data modeling skills, and proficiency in the Aera Skill Builder modules. You should be skilled in creating dynamic visualizations, configuring Aera Skills, applying security concepts, and handling report performance and administration. Aera Skill Builder and Aera Architect certification is required.

Ideal candidates will have strong knowledge of Aera Skill Build concepts, expertise in data handling, experience in SQL tuning and optimization, and the ability to interact with customers to understand business requirements. Good communication skills, problem-solving abilities, and a proactive approach to learning new technologies are also important. In this role, you will drive Aera Skill development tasks and have the opportunity to work with a market-leading, multi-disciplinary team.
EY offers a supportive environment, coaching, and feedback from engaging colleagues, opportunities for skills development and career progression, and the freedom to handle your role in a way that suits you. EY is committed to building a better working world by creating long-term value for clients, people, and society, and by fostering trust in the capital markets. Through the expertise of diverse teams worldwide, EY provides trust, assurance, and support for clients to grow, transform, and operate effectively across various industries.
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
karnataka
On-site
At EG, we are dedicated to developing industry-specific software solutions for our customers, allowing them to focus on their profession while we handle the technology aspect. Our team, consisting of industry peers and backed by the stability and innovation of EG, is committed to tackling significant challenges such as resource efficiency and sustainability. With a global workforce of over 3000 employees, including a team of 850+ based in Mangaluru, India, we prioritize a people-first culture that encourages innovation, collaboration, and continuous learning. Join our COE team, working at the forefront of AI innovation, where we shape AI-powered digital transformation across various business domains.

As a Senior Data Scientist with at least 7 years of experience, you will be part of a core team designing and delivering end-to-end AI solutions that directly impact products across multiple industries. You will have the autonomy to bring your innovations to life while collaborating with exceptional minds in the field.

Responsibilities:
- Design scalable AI architectures including traditional AI, GenAI, and agentic AI
- Build AI use cases deployed across hybrid environments such as on-premises infrastructure, Azure, and AWS cloud ecosystems
- Enable business units to adopt and maintain AI solutions
- Conduct feasibility studies and define technology roadmaps for AI initiatives
- Support integration of AI accelerators into existing business applications

Your role will involve engaging with product and business teams to translate requirements into AI/ML solutions, leading feasibility analysis, designing solution architectures, and developing, training, and evaluating AI/ML models for various business use cases. You will collaborate with cross-functional teams to deploy end-to-end data pipelines and AI solutions using cloud platforms like Azure and AWS.
Additionally, you will evangelize AI best practices, drive adoption of emerging AI technologies, and provide mentorship and training to upskill business unit teams and junior data scientists.

Required Skills:
- Bachelor's in Computer Science and Engineering or a related field; advanced degree preferred
- 7+ years of experience in AI/ML with expertise in GenAI, LLMs, RAG, and related technologies
- Proficiency in Python, ML frameworks (TensorFlow, PyTorch, Keras), NLP, and applied AI
- Full-stack engineering exposure, with a Python backend and experience building scalable applications using Azure and AWS
- Understanding of OOP/OOD, data structures, design patterns, and Agile/DevSecOps methodologies
- Experience with visualization tools like Power BI, Streamlit, etc., and excellent interpersonal and communication skills

Join us at EG for a professional, innovative environment where you can work on real-world problems and large-scale AI solutions alongside talented colleagues. We offer flexibility, ownership of ideas, extensive opportunities for personal and professional growth, and a strong focus on employee well-being, inclusion, and benefits.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
You will be responsible for designing process flows and data mappings for transactional and reporting systems. Your role will involve recommending and creating Business Objects universes to meet customer and user requirements. It is essential to understand the concepts, best practices, and functions supporting data-warehouse-driven reporting solutions. You will provide report creation and debugging assistance, give project updates, and manage FleetTrak/ADO efficiently.

Having a solid IT background, you should possess a comprehensive understanding of business processes, with analytical and proactive problem-solving skills. Mentoring other BI analysts and IT staff on best practices will be part of your responsibilities. Collaborating with DBAs to recommend ERD augmentation, understanding internal department needs, scoping projects, and delivering results are crucial aspects. Your attention to detail, your ability to learn and to work both independently and within a team, and effective communication and writing skills are highly valued. You will also be expected to perform all assigned duties and special projects, demonstrating a strong team-player attitude.

Education and/or training required for this role includes a Bachelor's degree or equivalent work experience. Relevant work experience of 4+ years in SAP BusinessObjects Universe Designer (IDT) and Web Intelligence, the SAP BusinessObjects Business Intelligence suite, Oracle SQL, PL/SQL, and SQL Server is essential. Familiarity with ER diagrams and experience in an Azure/Power BI working environment is a plus.

Your planning, organizing, and managerial knowledge should reflect a willingness to assist others, approachability to peers and management, and proficiency in creating relational and dimensional database architecture. Strong analytical skills for effective problem-solving, the ability to grasp issues quickly, and the ability to make educated judgments are vital.
Regarding communication and influencing skills, excellent verbal and written communication abilities are required, with a focus on conveying technical issues effectively to a diverse non-technical audience.
Posted 2 weeks ago
8.0 - 13.0 years
5 - 9 Lacs
Chennai
Work from Office
This position is for tax processing application development. The candidate should possess the relevant technical skills to develop code for various flagship and technical migration projects, and should develop a good understanding of the existing application (functional and technical).

Responsibilities

Direct Responsibilities

The Oracle Developer will:
- Bring consistent work experience of 8 years in Oracle SQL and PL/SQL development.
- Develop schemas, tables, indexes, sequences, constraints, functions, procedures, packages, collections, users, and roles.
- Understand business requirements and develop database models accordingly.
- Provide optimal design solutions to improve system quality and efficiency.
- Follow best practices for database design.
- Perform capacity analysis and oversee database tuning.
- Maintain technical documentation for reference purposes.
- Write complex code and queries and participate in code reviews.
- Perform design reviews, modify code, and test upgrades.
- Work closely with other developers to improve applications and establish best practices.
- Understand front-end user requirements (Java) and bring a problem-solving attitude.
- Apply an in-depth understanding of data management (e.g., permissions, recovery, security, and monitoring).
- Provide training and knowledge sharing with the development team.
- Maintain all databases required for development, testing, pre-production, and production usage.
- Take care of database design and implementation.
- Implement and maintain database security (create and maintain users and roles, assign privileges).
- Perform performance tuning and monitoring, and proactively propose solutions.
- Perform data anonymization.

Technical & Behavioral Competencies
- Knowledge and/or experience of the financial services industry
- Good understanding of the software development life cycle and Agile/iterative methodology
- Technical competency in the following:
  - Experience in SQL and PL/SQL development
  - Oracle Database
  - Good understanding of ER diagrams and data flows
  - Good to have experience in DB design & modelling
  - Hands-on experience with performance tuning tools and debugging
- Ability to perform technical analysis and design and identify impacts (functional/technical)
- Prior experience in high-volume / mission-critical systems is a plus

Contributing Responsibilities
- Work in duet with our offshore and on-site technical teams to coordinate the database initiatives.
- Perform detailed technical analysis with impacts (technical/functional) and prepare the technical specification document.
- Mentor the development team and carry out database peer code reviews.
- Bug fixing & performance optimization.
- Keep the development team up to date on best practices and on-site feedback.
- Challenge the response-time performance and the maintainability of treatments/queries.
- Maintain data standards and security measures; anonymize production data and import it into development and testing environments.
- Perform performance tuning and monitoring of all databases and proactively propose solutions in case of issues.
- Develop and unit test the code: develop code to satisfy the business requirements, unit test it and fix all defects arising out of unit testing, and check the code in properly to avoid issues arising out of configuration management.
- Deploy and integration test the application: deploy the developed code into the IST environments, perform integration testing with the cross teams, and fix all defects arising out of IST and UAT testing.
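The data-anonymization duty mentioned in this posting can be sketched as follows. The column names and the salted-hash approach here are illustrative assumptions, not the team's actual procedure:

```python
import hashlib

# Hypothetical salt; in practice this would come from a secret store,
# not a constant in the source code.
SALT = b"dev-env-salt"

def anonymize(value: str) -> str:
    """Replace a sensitive value with a stable, irreversible token."""
    return hashlib.sha256(SALT + value.encode("utf-8")).hexdigest()[:16]

def anonymize_rows(rows, sensitive_cols):
    """Return copies of the row dicts with sensitive columns tokenized."""
    out = []
    for row in rows:
        clean = dict(row)  # copy so the production rows stay untouched
        for col in sensitive_cols:
            if clean.get(col):
                clean[col] = anonymize(str(clean[col]))
        out.append(clean)
    return out

# Hypothetical production extract, copied into a dev/test environment.
prod = [{"id": 1, "name": "Alice", "balance": 100}]
dev = anonymize_rows(prod, sensitive_cols=["name"])
print(dev[0]["balance"])  # 100 -- non-sensitive columns pass through
```

Because the hash is deterministic, the same source value always maps to the same token, so joins across anonymized tables still work in the test environment.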
Specific Qualifications (if required)

Skills Referential

Behavioural Skills:
- Adaptability
- Ability to deliver / results driven
- Creativity & innovation / problem solving
- Ability to share / pass on knowledge

Transversal Skills:
- Ability to manage / facilitate a meeting, seminar, committee, or training
- Ability to understand, explain, and support change
- Analytical ability
- Ability to develop others & improve their skills
- Ability to develop and adapt a process

Education Level: Bachelor's degree or equivalent
Experience Level: At least 5 years
Posted 3 weeks ago
5.0 - 9.0 years
20 - 30 Lacs
Pune, Bengaluru
Hybrid
Job role & responsibilities:
- Understand operational needs by collaborating with specialized teams
- Support key business operations, including architecture design and improvements, data integrity, building data models, and designing and implementing agile, scalable, and cost-efficient solutions
- Lead a team of developers, and run sprint planning and execution to ensure timely deliveries

Technical skills, qualification, and experience required:
- Proficient in data modelling, with 5-10 years of experience
- Experience with data modeling tools (Erwin) and building ER diagrams
- Hands-on experience with the ERwin / Visio tools
- Hands-on expertise in entity-relationship, dimensional, and NoSQL modelling
- Familiarity with manipulating datasets using Python
- Exposure to Azure cloud services (Azure Data Factory, Azure DevOps, and Databricks)
- Exposure to UML tools like Erwin/Visio
- Familiarity with tools such as Azure DevOps, Jira, and GitHub
- Analytical approaches using IE or other common notations
- Strong hands-on experience with SQL scripting
- Bachelor's/Master's degree in Computer Science or a related field
- Experience leading agile scrum, sprint planning, and review sessions
- Good communication and interpersonal skills, including coordinating between business stakeholders and engineers
- Strong results-orientation and time management
- True team player who is comfortable working in a global team
- Ability to establish relationships with stakeholders quickly in order to collaborate on use cases
- Autonomy, curiosity, and innovation capability
- Comfortable working in a multidisciplinary team within a fast-paced environment

* Immediate joiners will be preferred. Outstation candidates will not be considered.
Posted 4 weeks ago
1.0 - 5.0 years
3 - 5 Lacs
Gurugram
Work from Office
Please check the link below for the full job description and required candidate profile: https://docs.google.com/document/d/11dbtBAD35xb_TyXS02C18SxMzzTW9J4w/edit
Posted 4 weeks ago
5.0 - 8.0 years
15 - 18 Lacs
Pune, Chennai, Bengaluru
Hybrid
Urgent requirement!
Notice period: immediate to 15 days
Location: Bengaluru/Chennai/Pune
Experience: 5+ years, all relevant
NOTE: Kindly go through the JD before applying.

Key responsibilities for this position:
- Take ownership of the project and work independently in a collaborative environment.
- Engage in project workshops, collaborating with stakeholders and software developers to design the database schema and data models.
- Establish security measures for safeguarding private data, including encryption and access controls, and conduct routine database audits to find any possible vulnerabilities.
- Consistently monitor database performance and identify bottlenecks that may impact the application's performance and responsiveness.
- Develop and oversee a disaster recovery (DR) strategy to prevent data loss in the event of disasters or system failures.
- Implement data governance procedures and make sure that pertinent data regulations and standards are followed.
- Understand the project's data security requirements, encryption techniques, and compliance regulations such as GDPR, CCPA, or HIPAA, and create a strategy to support the project.
- Manage and maintain databases, conduct routine maintenance tasks, and ensure data availability and integrity.
- Collaborate with data analysts to provide them with well-structured data and the insights needed for business intelligence and analytics.
- Create, implement, and evaluate database-driven applications while following security and performance best practices.
- Support Azure SQL, NoSQL, and other database needs for the project.
- Use programming skills such as SQL, Python, and PowerShell to interact with databases and incorporate them into applications.
- Conduct continuous R&D to provide the team with innovative database features, capabilities, and applications that will enhance the solutions.
Skills and abilities:
- Extensive and proven DBA expertise in large-scale SQL Server settings, including Azure managed instance databases.
- Proficient in delivering high- and low-level design (HLD and LLD) for projects, including ER diagrams, data models, and data pipeline flows.
- Proficient in the principles of database design, including managing high-volume Azure SQL configurations with high availability and using database performance optimization and monitoring tools.
- Exceptional analytical and problem-solving abilities, enabling rapid identification of the root cause of a problem.
- Proficiency in creating best-practice policies and procedures for data security and integrity, along with monitoring and restricting database access.
- Experience managing data warehouses, data lakes, Databricks, and other cloud-based data storage solutions.
- Experience applying Recovery Point Objectives (RPO) and Recovery Time Objectives (RTO) for disaster recovery to the management of all database instances.
- Proficiency with SQL, Python, PowerShell scripting, or related technologies for creating application dashboards for proactive monitoring and authoring automation scripts.
- Proficiency in overseeing cloud-based data storage systems, including data warehouses, data lakes, and Databricks.
- Experience with the principles and practices of master-slave replication and database storage engines.
- Proficiency with container technologies, Docker, Kubernetes, Azure DevOps pipelines, CI/CD, and data pipelines in cloud environments.
- Strong understanding of database security best practices.
- Knowledge of Azure cloud infrastructure and its capabilities.
- Working knowledge of the Agile/Waterfall approach and how data pipelines fit into it.
- Working knowledge of microservices, monolith architecture, and web apps based on MVC, MVVM, and SPA.
- Knowledge of security regulations, such as Identity & Access Management (IAM), Customer Identity & Access Management (CIAM), and security authentication technologies.

Interested candidates can share their updated resume with neha.beriwal@sapwood.net along with the following details: current CTC, expected CTC, notice period, and current and preferred location.

Thanks and regards,
Neha Beriwal
Sr. Consultant, Sapwood Ventures
Posted 1 month ago
4.0 - 9.0 years
15 - 25 Lacs
Bengaluru
Remote
Hi,

Job Description Summary

The Sr. Engineer, Data Modeler will be responsible for shaping and managing the data models and architecture, enabling the organization to store, analyze, and leverage large-scale healthcare data efficiently. This includes developing and implementing reliable, scalable, and effective data models for various data warehouse solutions using cutting-edge tools such as Fivetran, DBT, Snowflake, AWS, Atlan, Erwin ER diagrams, and Sigma Computing. This role will collaborate with a diverse set of stakeholders to develop a comprehensive data architecture that supports decision-making, reporting, analytics, and data governance. It requires significant experience with dimensional models, RDBMS, cloud platforms, and ETL processes. The role will define and design data models that support data governance, data quality, and master data management (MDM), while also working with stakeholders to implement data-driven solutions that enhance business outcomes in the healthcare sector. A strong focus will be placed on creating a trusted data environment by ensuring accurate data mapping and implementing Golden Record practices.

Job Description

Data Modeling & Architecture:
- Design and implement conceptual, logical, and physical data models, including entity-relationship (ER) models, star schema, snowflake schema, Data Vault modeling, and dimensional modeling.
- Lead the design of normalized and denormalized structures to meet business requirements and ensure optimal performance of the data warehouse and data marts.
- Collaborate with business and technical teams to map business requirements to data models, ensuring that master data management (MDM) processes and Golden Record concepts are well-defined.
- Build and maintain a comprehensive business glossary and data dictionary to standardize definitions and ensure consistency across the organization.
Data Lineage & Mapping:
- Ensure that data lineage is accurately defined, visualized, and documented across the data warehouse environment.
- Oversee the data mapping process to track the flow of data from source to destination, ensuring consistency, integrity, and transparency of data throughout its lifecycle.

Data Governance & Quality:
- Implement data governance processes to manage data access, quality, security, and compliance.
- Define and enforce data quality standards and practices, including data cleansing, to ensure data integrity and accuracy within the data warehouse environment.
- Work with stakeholders to establish governance frameworks for data lineage, ensuring data traceability and transparency across the platform.
- Work with data architects and IT leadership to establish guidelines for data access, data security, and lifecycle management.

Real-Time Data Ingestion & Change Data Capture (CDC):
- Design and implement real-time data ingestion pipelines using Kafka, AWS Kinesis, or Snowpipe to enable streaming data integration into the data warehouse.
- Implement Change Data Capture (CDC) mechanisms to efficiently capture and propagate data changes from operational systems using tools such as Fivetran or AWS Lambda.
- Ensure low-latency processing, incremental updates, and data availability for real-time analytics and reporting.

Quality Assurance & Continuous Improvement:
- Ensure high standards for data quality through rigorous testing, data validation, and performance optimization.
- Continuously evaluate and improve data modeling processes, tools, and methodologies.

Automation & Process Improvement:
- Work with data engineers and development teams to improve data platform automation and enhance the data modeling lifecycle.
- Continuously monitor, test, and optimize data models and pipelines to ensure the scalability, flexibility, and performance of the data warehouse.
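One common shape for the incremental updates mentioned under CDC is a high-watermark pull: only rows changed since the last sync are propagated. The sketch below illustrates the idea in plain Python; the row structure and timestamps are hypothetical, and real pipelines would use Fivetran, Snowpipe, or a log-based CDC tool rather than this simplification:

```python
from datetime import datetime, timezone

# Hypothetical source rows, each carrying an updated_at timestamp
# maintained by the operational system.
SOURCE = [
    {"id": 1, "updated_at": datetime(2024, 1, 1, tzinfo=timezone.utc)},
    {"id": 2, "updated_at": datetime(2024, 1, 3, tzinfo=timezone.utc)},
]

def pull_incremental(source, watermark):
    """Return rows changed since the watermark, plus the new watermark."""
    changed = [r for r in source if r["updated_at"] > watermark]
    # Advance the watermark to the newest change we just pulled,
    # or keep it unchanged if nothing changed.
    new_wm = max((r["updated_at"] for r in changed), default=watermark)
    return changed, new_wm

# Last sync happened on Jan 2, so only row 2 is pulled this cycle.
wm = datetime(2024, 1, 2, tzinfo=timezone.utc)
changed, wm = pull_incremental(SOURCE, wm)
print([r["id"] for r in changed])  # [2]
```

Persisting the watermark between runs is what makes each pull incremental instead of a full reload.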
Documentation & Reporting:
- Maintain clear and up-to-date documentation for data models, data lineage, data mappings, and architectural decisions.
- Create and present technical diagrams, such as entity-relationship diagrams (ERDs), to stakeholders and ensure alignment with business objectives.

Platform Design & Deployment:
- Develop the data architecture for the analytics platform on Snowflake and integrate with other AWS tools for robust data management.
- Work closely with data engineers to automate data pipeline deployments and updates using Fivetran, DBT, and cloud-based solutions.

Stakeholder Collaboration:
- Partner with the Product Manager and other technical teams to define requirements and deliver optimal data architecture solutions.
- Conduct regular meetings to communicate technical decisions and ensure alignment with business goals and strategy.
- Contribute to proposal creation and RFP submissions, ensuring technical feasibility and best practices.
- Document all design decisions and data models, adhering to existing guidelines and ensuring clear communication across teams.
- Create presentations and visual data architecture diagrams for internal and external stakeholders.
- Perform other duties that support the overall objective of the position.

Education Required:
- Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field. A Master's degree or certifications in data architecture, cloud technologies, or related areas is a plus. Or, any combination of education and experience which would provide the required qualifications for the position.

Experience Required:
- 6-10 years of hands-on experience in data modeling, data architecture, or information architecture with a focus on large-scale data warehouses.
- 6+ years of experience with dimensional models and relational database management systems (SQL Server, Oracle, DB2, etc.).
- 5+ years of experience with cloud technologies, especially AWS services and tools.
- Experience with ETL tools and automation (e.g., Fivetran, DBT).
- Experience with data governance, data quality frameworks, and metadata management.

Preferred:
- Experience in healthcare data modeling and data warehousing.
- Expertise in AWS environments.
- Hands-on experience with data integration and cloud automation tools.
- Familiarity with business intelligence tools (e.g., Sigma Computing).
- Understanding of healthcare-specific data governance, regulatory frameworks, and security compliance (e.g., HIPAA).

Knowledge, Skills & Abilities:

Knowledge of:
- Data Vault, star schema, snowflake schema, and dimensional modeling.
- SQL and cloud-based data warehouse solutions such as Snowflake.
- AWS cloud services, Sigma Computing, Atlan, and Erwin ER diagrams.

Skill in:
- Excellent communication to engage with both technical and non-technical stakeholders.
- Strong analytical and problem-solving skills to design scalable and efficient data models.

Ability to:
- Take ownership of deliverables, manage multiple tasks, and work effectively within an Agile methodology.
- Coach and mentor junior team members, with proven leadership ability.
Posted 1 month ago
4.0 - 9.0 years
20 - 30 Lacs
Pune, Bengaluru
Hybrid
Job role & responsibilities:
- Understand operational needs by collaborating with specialized teams
- Support key business operations, including architecture design and improvements, data integrity, building data models, and designing and implementing agile, scalable, and cost-efficient solutions
- Lead a team of developers, and run sprint planning and execution to ensure timely deliveries

Technical skills, qualification, and experience required:
- Proficient in data modelling, with 5-10 years of experience
- Experience with data modeling tools (Erwin) and building ER diagrams
- Hands-on experience with the ERwin / Visio tools
- Hands-on expertise in entity-relationship, dimensional, and NoSQL modelling
- Familiarity with manipulating datasets using Python
- Exposure to Azure cloud services (Azure Data Factory, Azure DevOps, and Databricks)
- Exposure to UML tools like Erwin/Visio
- Familiarity with tools such as Azure DevOps, Jira, and GitHub
- Analytical approaches using IE or other common notations
- Strong hands-on experience with SQL scripting
- Bachelor's/Master's degree in Computer Science or a related field
- Experience leading agile scrum, sprint planning, and review sessions
- Good communication and interpersonal skills, including coordinating between business stakeholders and engineers
- Strong results-orientation and time management
- True team player who is comfortable working in a global team
- Ability to establish relationships with stakeholders quickly in order to collaborate on use cases
- Autonomy, curiosity, and innovation capability
- Comfortable working in a multidisciplinary team within a fast-paced environment

* Immediate joiners will be preferred. Outstation candidates will not be considered.
Posted 1 month ago
4.0 - 9.0 years
20 - 30 Lacs
Pune, Bengaluru
Hybrid
Job role & responsibilities:
- Understand operational needs by collaborating with specialized teams
- Support key business operations, including architecture design and improvements, data integrity, building data models, and designing and implementing agile, scalable, and cost-efficient solutions
- Lead a team of developers, and run sprint planning and execution to ensure timely deliveries

Technical skills, qualification, and experience required:
- Proficient in data modelling, with 4-10 years of experience
- Experience with data modeling tools (Erwin) and building ER diagrams
- Hands-on experience with the ERwin / Visio tools
- Hands-on expertise in entity-relationship, dimensional, and NoSQL modelling
- Familiarity with manipulating datasets using Python
- Exposure to Azure cloud services (Azure Data Factory, Azure DevOps, and Databricks)
- Exposure to UML tools like Erwin/Visio
- Familiarity with tools such as Azure DevOps, Jira, and GitHub
- Analytical approaches using IE or other common notations
- Strong hands-on experience with SQL scripting
- Bachelor's/Master's degree in Computer Science or a related field
- Experience leading agile scrum, sprint planning, and review sessions
- Good communication and interpersonal skills, including coordinating between business stakeholders and engineers
- Strong results-orientation and time management
- True team player who is comfortable working in a global team
- Ability to establish relationships with stakeholders quickly in order to collaborate on use cases
- Autonomy, curiosity, and innovation capability
- Comfortable working in a multidisciplinary team within a fast-paced environment

* Immediate joiners will be preferred.
Posted 1 month ago