
436 Data Modelling Jobs - Page 14

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 8.0 years

5 - 8 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Position: Senior Engineer (Golang)
Experience: 3-5 years
Location: Bangalore, Hyderabad, Pune, Mumbai, Chennai, Ahmedabad
Mode of work: Hybrid (2 days WFO)
Mode of interview: 2 rounds (virtual, F2F)
Notice period: Immediate to 15 days

We are looking for a highly skilled Senior Backend Developer with solid experience in developing and maintaining scalable backend systems using Go. You'll be part of a core engineering team building robust APIs and distributed services.

Key Responsibilities:
- Develop scalable and high-performance backend services.
- Write clean, efficient, and testable code.
- Optimize systems for latency, reliability, and cost.
- Collaborate closely with front-end engineers and product teams.
- Handle data modelling and database performance tuning.

Required Skills:
- Strong in Go
- Solid understanding of: RESTful API design, SQL (PostgreSQL, MySQL), NoSQL (MongoDB, Redis)

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Project description:
We are seeking a highly skilled Data Modelling Expert with deep experience in the Avaloq Core Banking platform to join our technology team. The ideal candidate will be responsible for designing, maintaining, and optimizing complex data models that support our banking products, client data, and regulatory reporting needs. This role requires a blend of domain expertise in banking and wealth management, strong data architecture capabilities, and hands-on experience working with the Avaloq platform.

Responsibilities:
- Design, implement, and maintain conceptual, logical, and physical data models within the Avaloq Core Banking system.
- Collaborate with business analysts, product owners, and Avaloq parameterisation teams to translate business requirements into robust data models.
- Ensure alignment of data models with Avaloq's object model and industry best practices.
- Perform data profiling, quality checks, and lineage tracing to support regulatory and compliance requirements (e.g., Basel III, MiFID II, ESG).
- Support integration of Avaloq data with downstream systems (e.g., CRM, data warehouses, reporting platforms).
- Provide expert input on data governance, metadata management, and model documentation.
- Contribute to change requests, upgrades, and data migration projects involving Avaloq.
- Collaborate with cross-functional teams to drive data consistency, reusability, and scalability.
- Review and validate existing data models, identifying gaps and optimisation opportunities.
- Ensure data models meet performance, security, and privacy requirements.

Must-have skills:
- Proven experience (5+ years) in data modelling or data architecture, preferably within financial services.
- 3+ years of hands-on experience with the Avaloq Core Banking Platform, especially its data structures and object model.
- Strong understanding of relational databases and data modelling tools (e.g., ER/Studio, ERwin, or similar).
- Proficient in SQL and data manipulation in Avaloq environments.
- Knowledge of banking products, client lifecycle data, and regulatory data requirements.
- Familiarity with data governance, data quality, and master data management concepts.
- Experience working in Agile or hybrid project delivery environments.

Nice-to-have skills:
- Exposure to Avaloq scripting or parameterisation is a strong plus.
- Experience integrating Avaloq with data lakes, BI/reporting tools, or regulatory platforms.
- Understanding of data privacy regulations (GDPR, FINMA, etc.).
- Certification in Avaloq or relevant financial data management domains is advantageous.

Other languages: English (C1 Advanced)
Location: Pune, Bangalore, Hyderabad, Chennai, Noida

Posted 1 month ago

Apply

8.0 - 13.0 years

12 - 18 Lacs

Hyderabad

Work from Office

Data Engineering Team

As a Lead Data Engineer for India, you will be accountable for leading the technical aspects of product engineering by being hands-on: working on the enhancement, maintenance and support of the product on which your team is working, within your technology area. You will be responsible for your own hands-on coding, provide design thinking and design solutions, ensure the quality of your team's output, represent your team in product-level technical forums, and ensure your team provides technical input to, and aligns with, the overall product roadmap.

How will you make an impact?
You will work with Engineers in other technology areas to define the overall technical direction for the product in alignment with the Group's technology roadmap, standards and frameworks; with product owners and business stakeholders to shape the product's delivery roadmap; and with support teams to ensure its smooth operation. You will be accountable for ensuring the overall technical quality of the work produced by India is in line with the expectations of stakeholders, clients and the Group. You will also be responsible for line management of your team of Engineers, ensuring that they perform to the expected levels and that their career development is fully supported.

Key responsibilities (spanning producing quality code, producing quality technical design, squad collaboration, people management, and operating at a high level of productivity):
- Code follows team standards, is structured to ensure readability and maintainability, and goes through review smoothly, even for complex changes.
- Designs respect best practices and are favourably reviewed by peers.
- Critical paths through code are covered by appropriate tests.
- High-level designs/architectures align to the wider technical strategy, presenting reusable APIs where possible and minimizing system dependencies.
- Data updates are monitored and complete within SLA.
- Technical designs follow team and group standards and frameworks, are structured to ensure reusability, extensibility and maintainability, and go through review smoothly, even for complex changes.
- Estimates are consistently challenging but realistic; most tasks are delivered within estimate; complex or larger tasks are delivered autonomously.
- Sprint goals are consistently achieved; demonstrate commitment to continuous improvement of squad activities.
- The product backlog is consistently well-groomed, with a responsible balance of new features and technical debt mitigation.
- Other Engineers in the squad feel supported in their development.
- Direct reports have meaningful objectives recorded in Quantium's Performance Portal, and understand how those objectives relate to business strategy.
- Direct reports' career aspirations are understood and documented, with action plans in place to move towards those goals.
- Direct reports have regular catch-ups to discuss performance, career development and their ongoing happiness/engagement in their role.
- Any performance issues are identified, documented and agreed, with realistic remedial plans in place.

Key activities:
- Build technical product/application engineering capability in the team in line with the Group's technical roadmap, standards and frameworks.
- Write polished code, aligned to team standards, including appropriate unit/integration tests.
- Review code and test cases produced by others, to ensure changes satisfy the associated business requirement, follow best practices, and integrate with the existing code-base.
- Provide constructive feedback to other team members on the quality of code and test cases.
- Collaborate with other Lead/Senior Engineers to produce high-level designs for larger pieces of work.
- Validate technical designs and estimates produced by other team members.
- Merge reviewed code into release branches, resolving any conflicts that arise, and periodically deploy updates to production and non-production environments.
- Troubleshoot production problems and raise/prioritize bug tickets to resolve any issues.
- Proactively monitor system health and act to report/resolve any issues.
- Provide out-of-hours support for periodic ETL processes, ensuring SLAs are met.
- Work with business stakeholders and other leads to define and estimate new epics.
- Contribute to backlog refinement sessions, helping to break down each epic into a collection of smaller user stories that will deliver the overall feature.
- Work closely with Product Owners to ensure the product backlog is prioritized to maximize business value and manage technical debt.
- Lead work breakdown sessions to define the technical tasks required to implement each user story.
- Contribute to sprint planning sessions, ensuring the team takes a realistic but challenging amount of work into each sprint and each team member will be productively occupied.
- Contribute to the team's daily stand-up, highlighting any delays or impediments to progress and proposing mitigations for those issues.
- Contribute to sprint review and sprint retro sessions, to maintain a culture of continuous improvement within the team.
- Coach/mentor more junior Engineers to support their continuing development.
- Set and periodically review delivery and development objectives for direct reports.
- Identify each direct report's longer-term career objectives and, as far as possible, factor these into work assignments.
- Hold fortnightly catch-ups with direct reports to review progress against objectives, assess engagement and give them the opportunity to raise concerns about the product or team.
- Work through the annual performance review process for all team members.
- Conduct technical interviews as necessary to recruit new Engineers.

The superpowers you'll be bringing to the team:
- 8+ years of experience designing, developing, and implementing end-to-end data solutions (storage, integration, processing, access) in Google Cloud Platform (GCP) or similar cloud platforms.
- Strong experience with SQL.
- Values delivering high-quality, peer-reviewed, well-tested code.
- Creates ETL/ELT pipelines that transform and process terabytes of structured and unstructured data in real time.
- Knowledge of DevOps functions and ability to contribute to CI/CD pipelines.
- Strong knowledge of data warehousing and data modelling techniques such as dimensional modelling.
- Strong hands-on experience with BigQuery/Snowflake, Airflow/Argo, Dataflow, Data Catalog, Vertex AI, Pub/Sub, etc., or equivalent products in other cloud platforms.
- Solid grip on programming languages like Python or Scala.
- Hands-on experience manipulating Spark at scale, with true in-depth knowledge of the Spark API.
- Experience working with stakeholders and mentoring juniors in the team is good to have.
- Recognized as a go-to person for high-level designs and estimations.
- Experience working with source control tools (Git preferred) with a good understanding of branching/merging strategies.
- Experience with Kubernetes and Azure is an advantage.
- Understanding of GNU/Linux systems and Bash scripting.
- Bachelor's degree in Computer Science, Information Technology or a related discipline.
- Comfortable working in a fast-moving, agile development environment.
- Excellent problem-solving/analytical skills.
- Good written/verbal communication skills.
- Commercially aware, with the ability to work with a diverse range of stakeholders.
- Enthusiasm for coaching and mentoring junior engineers.
- Experience leading teams, including line management responsibilities.
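For readers unfamiliar with the orchestration side of this stack, here is a minimal, purely illustrative sketch of the kind of daily scheduled load the role describes, using Airflow's BigQuery operator. The project, dataset and table names are hypothetical, and this is a sketch under those assumptions, not the employer's pipeline.

```python
# Minimal sketch: a daily Airflow DAG that rebuilds one partition of a
# BigQuery reporting table. All names (my-project, analytics.daily_sales,
# raw.events) are hypothetical placeholders. Assumes Airflow 2.4+ with the
# Google provider installed.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_sales_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # one run per day
    catchup=False,
) as dag:
    load_sales = BigQueryInsertJobOperator(
        task_id="load_daily_sales",
        configuration={
            "query": {
                "query": """
                    SELECT order_date, SUM(amount) AS total_amount
                    FROM `raw.events`
                    WHERE order_date = '{{ ds }}'
                    GROUP BY order_date
                """,
                # Write into the day's partition, replacing any prior run.
                "destinationTable": {
                    "projectId": "my-project",
                    "datasetId": "analytics",
                    "tableId": "daily_sales${{ ds_nodash }}",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )
```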

Posted 1 month ago

Apply

6.0 - 10.0 years

8 - 12 Lacs

Chennai, Bengaluru

Work from Office

Key Responsibilities:
- Build scalable ETL pipelines and implement robust data solutions in Azure.
- Manage and orchestrate workflows using ADF, Databricks, ADLS Gen2, and Key Vaults.
- Design and maintain secure and efficient data lake architecture.
- Work with stakeholders to gather data requirements and translate them into technical specs.
- Implement CI/CD pipelines for seamless data deployment using Azure DevOps.
- Monitor data quality, performance bottlenecks, and scalability issues.
- Write clean, organized, reusable PySpark code in an Agile environment.
- Document pipelines, architectures, and best practices for reuse.

Must-Have Skills:
- Experience: 6+ years in Data Engineering
- Tech stack: SQL, Python, PySpark, Spark, Azure Databricks, ADF, ADLS Gen2, Azure DevOps, Key Vaults
- Core expertise: Data Warehousing, ETL, Data Pipelines, Data Modelling, Data Governance
- Agile, SDLC, containerization (Docker), clean coding practices

Location: Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad
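As a purely illustrative sketch of the PySpark ETL work this posting describes, the snippet below reads raw files from a data lake, cleans them, and writes a curated layer. The ADLS paths and column names are hypothetical assumptions, not details from the listing.

```python
# Minimal PySpark ETL sketch: extract raw CSVs from a lake, transform,
# and load a curated layer. Storage paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: raw CSV files landed in the data lake.
raw = spark.read.option("header", True).csv(
    "abfss://raw@mydatalake.dfs.core.windows.net/orders/"
)

# Transform: cast types, drop rows missing a key, stamp the load date.
clean = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("order_id").isNotNull())
       .withColumn("load_date", F.current_date())
)

# Load: partitioned Parquet into the curated zone (on Databricks, Delta
# would be the typical format).
clean.write.mode("overwrite").partitionBy("load_date").parquet(
    "abfss://curated@mydatalake.dfs.core.windows.net/orders/"
)
```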

Posted 1 month ago

Apply

6.0 - 8.0 years

8 - 10 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Senior Data Engineer (Remote, Contract, 6 Months) - Databricks, ADF, and PySpark

We are hiring a Senior Data Engineer for a 6-month remote contract position. The ideal candidate is highly skilled in building scalable data pipelines and working within the Azure cloud ecosystem, especially Databricks, ADF, and PySpark. You'll work closely with cross-functional teams to deliver enterprise-level data engineering solutions.

Key Responsibilities:
- Build scalable ETL pipelines and implement robust data solutions in Azure.
- Manage and orchestrate workflows using ADF, Databricks, ADLS Gen2, and Key Vaults.
- Design and maintain secure and efficient data lake architecture.
- Work with stakeholders to gather data requirements and translate them into technical specs.
- Implement CI/CD pipelines for seamless data deployment using Azure DevOps.
- Monitor data quality, performance bottlenecks, and scalability issues.
- Write clean, organized, reusable PySpark code in an Agile environment.
- Document pipelines, architectures, and best practices for reuse.

Must-Have Skills:
- Experience: 6+ years in Data Engineering
- Tech stack: SQL, Python, PySpark, Spark, Azure Databricks, ADF, ADLS Gen2, Azure DevOps, Key Vaults
- Core expertise: Data Warehousing, ETL, Data Pipelines, Data Modelling, Data Governance
- Agile, SDLC, containerization (Docker), clean coding practices

Good-to-Have Skills:
- Event Hubs, Logic Apps
- Power BI
- Strong logic building and competitive programming background

Contract Details:
- Role: Senior Data Engineer
- Mode: Remote
- Duration: 6 months
- Locations: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune

Posted 1 month ago

Apply

6.0 - 11.0 years

0 - 2 Lacs

Kochi, Bengaluru

Work from Office

We are looking for a Senior Developer with solid hands-on experience in Optimizely SaaS CMS and a strong front-end and back-end tech stack. If you're ready to work on impactful enterprise projects and drive next-gen digital experiences, we'd love to hear from you!

Role Details:
- Position: Senior Developer
- Experience: 6+ years
- Location: Kochi / Bangalore
- Work mode: Hybrid
- Notice period: Immediate joiners preferred
- Client and budget: To be discussed
- Interview: 1st round virtual; 2nd round F2F/virtual

Key Must-Haves:
- Mandatory hands-on experience with Optimizely SaaS CMS
- Strong expertise in Next.js (SSR & SSG), React, TypeScript
- Proficient in Node.js and front-end performance optimization
- Experience with the Optimizely suite: CMS, Commerce, CDP, DAM
- Skilled in .NET / C#, ASP.NET MVC, and RESTful API integrations
- Optimizely CDP: data modeling, segmentation, personalization

Why join us:
- Work on cutting-edge CMS & personalization solutions
- Hybrid flexibility: collaborate in dynamic tech hubs (Kochi/Bangalore)
- High-growth role with a competitive package
- Exposure to enterprise-level digital transformation projects

Interested? Apply now or DM us directly. Know someone who fits? Tag them! anzia.sabreen@bct-consulting.com

Posted 1 month ago

Apply

8.0 - 12.0 years

20 - 22 Lacs

Pune, Bengaluru, Delhi / NCR

Work from Office

Develop and deploy ML models using SageMaker. Automate data pipelines and training processes. Monitor and optimize model performance. Ensure model governance and reproducibility.
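To make the train-and-deploy workflow above concrete, here is a minimal sketch using the SageMaker Python SDK. The IAM role ARN, S3 bucket, and training script name are hypothetical placeholders, and the sketch assumes a scikit-learn training script that follows the SDK's conventions; it is an illustration, not the employer's actual pipeline.

```python
# Minimal sketch: train a scikit-learn script in a managed SageMaker
# container, then deploy it behind a real-time endpoint. The role ARN,
# bucket, and train.py are hypothetical.
import sagemaker
from sagemaker.sklearn.estimator import SKLearn

session = sagemaker.Session()
role = "arn:aws:iam::123456789012:role/SageMakerExecutionRole"  # hypothetical

estimator = SKLearn(
    entry_point="train.py",        # your training script (hypothetical)
    framework_version="1.2-1",
    instance_type="ml.m5.large",
    role=role,
    sagemaker_session=session,
)

# Managed training against data staged in S3 (hypothetical path).
estimator.fit({"train": "s3://my-bucket/train/"})

# Deploy the trained model as a real-time inference endpoint.
predictor = estimator.deploy(
    initial_instance_count=1,
    instance_type="ml.m5.large",
)
print(predictor.predict([[5.1, 3.5, 1.4, 0.2]]))
```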

Posted 1 month ago

Apply

6.0 - 10.0 years

20 - 35 Lacs

Pune

Hybrid

Design, implement, and optimize ETL/ELT pipelines to ingest, transform, and load data into AWS Redshift from various sources. Strong background in Python scripting, AWS services (Lambda, S3, Redshift), and data integration & pipeline development.

Required Candidate Profile:
- 6+ years of experience in BI development and data engineering
- Python/R scripting for data processing and automation
- AWS services: Lambda, S3, and Redshift
- Data warehousing
- Proficiency in SQL
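As an illustration of the S3-to-Redshift load step this posting describes, here is a minimal Python sketch using Redshift's bulk COPY command over the PostgreSQL protocol. The cluster endpoint, bucket, IAM role, and table names are all hypothetical assumptions.

```python
# Minimal sketch of an S3-to-Redshift bulk load. Connection details,
# bucket, IAM role, and tables are hypothetical placeholders.
import psycopg2  # Redshift speaks the PostgreSQL wire protocol

conn = psycopg2.connect(
    host="my-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439,
    dbname="analytics",
    user="etl_user",
    password="***",
)

COPY_SQL = """
    COPY staging.orders
    FROM 's3://my-bucket/orders/2024-06-01/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole'
    FORMAT AS CSV IGNOREHEADER 1;
"""

with conn, conn.cursor() as cur:
    cur.execute(COPY_SQL)  # bulk-load the staged files into staging
    cur.execute(
        "INSERT INTO core.orders SELECT * FROM staging.orders;"
    )                      # promote staged rows into the core table
conn.close()
```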

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Power BI and AAS expert (Strong SC or Specialist Senior)
- Should have hands-on experience of data modelling in Azure SQL Data Warehouse and Azure Analysis Services
- Should be able to write and test DAX queries
- Should be able to generate paginated reports in Power BI
- Should have a minimum of 3 years' working experience delivering projects in Power BI

Must Have:
- 3 to 8 years of experience designing, developing, and deploying ETL processes on Databricks to support data integration and transformation
- Optimize and tune Databricks jobs for performance and scalability
- Experience with Scala and/or Python programming languages
- Proficiency in SQL for querying and managing data
- Expertise in ETL (Extract, Transform, Load) processes
- Knowledge of data modeling and data warehousing concepts
- Implement best practices for data pipelines, including monitoring, logging, and error handling
- Excellent problem-solving skills and attention to detail
- Excellent written and verbal communication skills
- Strong analytical and problem-solving abilities
- Experience with version control systems (e.g., Git) to manage and track changes to the codebase
- Document technical designs, processes, and procedures related to Databricks development
- Stay current with Databricks platform updates and recommend improvements to existing processes

Good to Have:
- Agile delivery experience
- Experience with cloud services, particularly Azure (Azure Databricks), AWS (AWS Glue, EMR), or Google Cloud Platform (GCP)
- Knowledge of Agile and Scrum software development methodologies
- Understanding of data lake architectures
- Familiarity with tools like Apache NiFi, Talend, or Informatica
- Skills in designing and implementing data models

Skills: Azure, data modelling, Power BI, AAS, Azure SQL Data Warehouse, Azure Analysis Services, DAX queries, data warehouse, paginated reports

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Hyderabad

Work from Office

Power BI and AAS expert (Strong SC or Specialist Senior)
- Should have hands-on experience of data modelling in Azure SQL Data Warehouse and Azure Analysis Services
- Should be able to write and test DAX queries
- Should be able to generate paginated reports in Power BI
- Should have a minimum of 3 years' working experience delivering projects in Power BI

Must Have:
- 3 to 8 years of experience designing, developing, and deploying ETL processes on Databricks to support data integration and transformation
- Optimize and tune Databricks jobs for performance and scalability
- Experience with Scala and/or Python programming languages
- Proficiency in SQL for querying and managing data
- Expertise in ETL (Extract, Transform, Load) processes
- Knowledge of data modeling and data warehousing concepts
- Implement best practices for data pipelines, including monitoring, logging, and error handling
- Excellent problem-solving skills and attention to detail
- Excellent written and verbal communication skills
- Strong analytical and problem-solving abilities
- Experience with version control systems (e.g., Git) to manage and track changes to the codebase
- Document technical designs, processes, and procedures related to Databricks development
- Stay current with Databricks platform updates and recommend improvements to existing processes

Good to Have:
- Agile delivery experience
- Experience with cloud services, particularly Azure (Azure Databricks), AWS (AWS Glue, EMR), or Google Cloud Platform (GCP)
- Knowledge of Agile and Scrum software development methodologies
- Understanding of data lake architectures
- Familiarity with tools like Apache NiFi, Talend, or Informatica
- Skills in designing and implementing data models

Skills: Azure, data modelling, Power BI, AAS, Azure SQL Data Warehouse, Azure Analysis Services, DAX queries, data warehouse, paginated reports

Posted 1 month ago

Apply

2.0 - 5.0 years

3 - 8 Lacs

Jaipur

Work from Office

Role Description:
The role is to perform a number of key functions that support and control the business in complying with regulatory requirements such as the Markets in Financial Instruments Directive (MiFID II). This role forms part of a team in Bangalore that supports regulatory reporting across all asset classes: Rates, Credit, Commodities, Equities and Foreign Exchange. Key responsibilities include day-to-day exception management, MIS compilation and User Acceptance Testing (UAT). The role also supports in-house tech requirements, such as building out reports, macros, etc.

Your key responsibilities:
- Performing and/or managing various exception management functions across reporting for all asset classes, across multiple jurisdictions
- Ensuring accurate, timely and complete reporting
- Working closely with our technology development teams to design system solutions, with the aim of automating as much of the exceptions process as possible
- Liaising with internal and external teams to propose developments to the current architecture, in order to ensure greater compliance with regulatory requirements and drive improved STP processing of our reporting across all asset classes
- Performing root cause analysis of exceptions, with investigation and appropriate escalation of any significant issues found through testing, rejection remediation or any other stream to senior management, to ensure transparency exists in our controls
- Building and maintaining effective operational processes and prioritising activities based on risk
- Clear communication and escalation; ability to recognise high-risk situations and deal with them in a prompt manner
- Documentation of BI deliverables
- Supporting the design of data models, reports and visualizations to meet business needs
- Developing end-user reports and visualizations

Your skills and experience:
- 5-8 years' work experience in an Ops role within financial services
- Graduate in Science/Technology/Engineering/Mathematics
- Regulatory experience (MiFIR, EMIR, Dodd-Frank, Bank of England, etc.) is preferred
- Preferably experience in Middle Office/Back Office and Reference Data, and excellent knowledge of the trade life cycle (at least 2 asset classes: Equities, Credit, Rates, Foreign Exchange, Commodities)
- Ability to work independently, as well as in a team environment
- Clear and concise communication and escalation; ability to recognise high-risk situations and deal with them promptly
- Ability to identify and prioritise multiple tasks that have potential operational risk and P&L impact in an often high-pressure environment
- Experience in data analysis, with intermediate/advanced Microsoft Office Suite skills including VBA
- Experience in building reports and BI analysis with tools such as SAP BusinessObjects, Tableau, QlikView, etc.
- Advanced SQL experience is preferred

Posted 1 month ago

Apply

3.0 - 5.0 years

20 - 25 Lacs

Pune, Greater Noida

Work from Office

Responsibilities:
- Candidate should have strong experience with Duck Creek.
- Candidate should have strong experience with Policy.
- Candidate should have strong experience with the Duck Creek Example Platform 6.x and 7.x.
- Good understanding of underwriting, rating, insurance rules, forms, Example Author, Server, Express, Forms, Rating, Batch Processing, Task Creation, Transact, Address Validation.
- Good knowledge of the policy life cycle and various policy transactions.
- Good knowledge of the Duck Creek Policy System and workflows.
- Experience in the P&C insurance domain.
- Good knowledge of manuscripts, the data model and the inheritance model.
- Good understanding of the business and functional requirements, and of the policy workflow of the overall application and project.
- Understanding the client's requirements properly before proceeding with development in the core areas of DCT.
- Must have excellent communication skills.

Mandatory skills: .NET, Duck Creek Policy / PAS / Policy Center, Example, Author, Pages, Rating, Forms, Insurance-P&C

Education/Qualification: BE / B.Tech / BCA / B.Sc. / MCA / M.Tech / Any Graduate

Posted 1 month ago

Apply

12.0 - 15.0 years

40 - 45 Lacs

Hyderabad

Work from Office

Role Description:
The Data Strategy and Governance Lead will operationalize the Enterprise Data Council vision across specific domains (Research, Clinical Trials, Commercial, etc.). He/she will coordinate activities at the tactical level, interpreting Enterprise Data Council direction and defining operational-level deliverables and actions to build data foundations in specific domains. The Data Strategy and Governance Lead will partner with senior leadership and other Data Governance functional leads to align data initiatives with business goals, and will establish and enforce data governance policies and standards to provide high-quality data that is easy to reuse and connect, accelerating innovative AI solutions to better serve patients.

Roles & Responsibilities:
- Responsible for data governance and data management for a given domain of expertise (Research, Development, Supply Chain, etc.).
- Manage a team of Data Governance Specialists and Data Stewards for a specific domain.
- Responsible for operationalizing the enterprise data governance framework and aligning the broader stakeholder community with their data governance needs, including data quality, data access controls, compliance with privacy and security regulations, foundational master data management, data sharing, communication and change management.
- Work with Enterprise MDM and Reference Data to enforce standards and data reusability.
- Drive cross-functional alignment in his/her domain(s) of expertise to ensure adherence to data governance principles.
- Provide expert guidance on business process and system design to support data governance and data/information modelling objectives.
- Maintain documentation and act as an expert on data definitions, data standards, data flows, legacy data structures/hierarchies, common data models, data harmonization, etc. for assigned domains.
- Ensure compliance with data privacy, security, and regulatory policies for the assigned domains.
- Publish metrics to measure effectiveness and drive adoption of data governance policy and standards, to be applied to mitigate identified risks across the data lifecycle (e.g., capture/production, aggregation/processing, reporting/consumption).
- Establish enterprise-level standards for the nomenclature, content, and structure of information (structured and unstructured data), metadata, glossaries, and taxonomies.
- Jointly with technology teams, business functions, and enterprise teams (e.g., MDM, Enterprise Data Architecture, Enterprise Data Fabric, etc.), define the specifications shaping the development and implementation of data foundations.

Functional Skills:

Must-Have Skills:
- Technical skills with in-depth knowledge of pharma processes, with preferred specialization in a domain (e.g., Research, Clinical, Commercial, Supply Chain, Finance, etc.).
- Aware of industry trends and priorities, and able to apply them to governance and policies.
- In-depth knowledge of and experience with data governance principles and technology; can design and implement data governance operating models to drive Amgen's transformation into a data-driven organization.
- In-depth knowledge of data management, common data models, metadata management, data quality, reference & master data management, data stewardship, data protection, etc.
- Experience with the data product development life cycle, including the enablement of data dictionaries and business glossaries to increase data product reusability and data literacy.

Good-to-Have Skills:
- Experience adopting industry standards in data products.
- Experience managing industry external data assets (e.g., claims, EHR, etc.).
- Ability to successfully execute complex projects in a fast-paced environment and to manage multiple priorities effectively.
- Ability to manage project or departmental budgets.
- Experience with modelling tools (e.g., Visio).
- Basic programming skills; experience with data visualization and data modeling tools.
- Experience working with agile development methodologies such as Scaled Agile.

Soft Skills:
- Ability to build business relationships and understand end-to-end data use and needs.
- Excellent interpersonal skills (team player).
- People management skills, either in a matrix or a direct line function.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Good presentation and public speaking skills.
- Strong attention to detail, quality, time management and customer focus.

Basic Qualifications:
- 12 to 15 years of Information Systems experience
- 4 years of managerial experience directly managing people, and leadership experience leading teams, projects, or programs

Posted 1 month ago

Apply

5.0 - 10.0 years

18 - 25 Lacs

Bengaluru

Hybrid

Skill required: Data Engineer - Azure
Designation: Sr Analyst / Consultant
Job location: Bengaluru
Qualifications: BE/BTech
Years of experience: 4-11 years

Overall purpose of job:
Understand client requirements and build ETL solutions using Azure Data Factory, Azure Databricks and PySpark. Build solutions in such a way that they can absorb client change requests easily. Find innovative ways to accomplish tasks, and handle multiple projects simultaneously and independently. Work with data and the appropriate teams to effectively source required data. Identify data gaps and work with client teams to effectively communicate findings to stakeholders/clients.

Responsibilities:
- Develop ETL solutions to populate a centralized repository by integrating data from various data sources.
- Create data pipelines, data flows and data models according to the business requirements.
- Proficient in implementing all transformations according to business needs.
- Identify data gaps in the data lake and work with relevant data/client teams to get the data required for dashboarding/reporting.
- Strong experience working on the Azure data platform, Azure Data Factory and Azure Databricks.
- Strong experience working on ETL components and scripting languages like PySpark and Python.
- Experience creating pipelines, alerts, email notifications, and scheduling jobs.
- Exposure to development/staging/production environments.
- Provide support in creating, monitoring and troubleshooting scheduled jobs.
- Work effectively with clients and handle client interactions.

Skills required:
- Bachelor's degree in Engineering or Science, or equivalent, with at least 4-11 years of overall experience in data management, including data integration, modeling and optimization.
- Minimum 4 years of experience working on Azure cloud, Azure Data Factory and Azure Databricks.
- Minimum 3-4 years of experience in PySpark, Python, etc. for data ETL.
- In-depth understanding of data warehouses, ETL concepts and modeling principles.
- Strong ability to design, build and manage data.
- Strong understanding of data integration.
- Strong analytical and problem-solving skills.
- Strong communication and client-interaction skills.
- Ability to design databases to store the large volumes of data needed for reporting and dashboarding.
- Ability and willingness to acquire knowledge of new technologies; good analytical and interpersonal skills, with the ability to interact with individuals at all levels.

Posted 1 month ago

Apply

9.0 - 12.0 years

30 - 35 Lacs

Mumbai, Pune, Greater Noida

Work from Office

Notice period: Immediate to 15 days
Mandatory skills: .NET, Duck Creek Policy / PAS / Policy Center, Example, Author, Pages, Rating, Forms, Insurance-P&C

Responsibilities:
- Candidate should have strong experience with Duck Creek.
- Candidate should have strong experience with Policy.
- Candidate should have strong experience with the Duck Creek Example Platform 6.x and 7.x.
- Good understanding of underwriting, rating, insurance rules, forms, Example Author, Server, Express, Forms, Rating, Batch Processing, Task Creation, Transact, Address Validation.
- Good knowledge of the policy life cycle and various policy transactions.
- Good knowledge of the Duck Creek Policy System and workflows.
- Experience in the P&C insurance domain.
- Good knowledge of manuscripts, the data model and the inheritance model.
- Good understanding of the business and functional requirements, and of the policy workflow of the overall application and project.
- Understanding the client's requirements properly before proceeding with development in the core areas of DCT.
- Must have excellent communication skills.

Education/Qualification: BE / B.Tech / BCA / B.Sc. / MCA / M.Tech / Any Graduate
Work location: Greater Noida, Mumbai, Pune & Hyderabad

Posted 1 month ago

Apply

6.0 - 8.0 years

8 - 10 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Job Opening: Senior Data Engineer (Remote, Contract, 6 Months)
Remote | Contract duration: 6 months | Experience: 6-8 years

We are hiring a Senior Data Engineer for a 6-month remote contract position. The ideal candidate is highly skilled in building scalable data pipelines and working within the Azure cloud ecosystem, especially Databricks, ADF, and PySpark. You'll work closely with cross-functional teams to deliver enterprise-level data engineering solutions.

Key Responsibilities:
- Build scalable ETL pipelines and implement robust data solutions in Azure.
- Manage and orchestrate workflows using ADF, Databricks, ADLS Gen2, and Key Vaults.
- Design and maintain secure and efficient data lake architecture.
- Work with stakeholders to gather data requirements and translate them into technical specs.
- Implement CI/CD pipelines for seamless data deployment using Azure DevOps.
- Monitor data quality, performance bottlenecks, and scalability issues.
- Write clean, organized, reusable PySpark code in an Agile environment.
- Document pipelines, architectures, and best practices for reuse.

Must-Have Skills:
- Experience: 6+ years in Data Engineering
- Tech stack: SQL, Python, PySpark, Spark, Azure Databricks, ADF, ADLS Gen2, Azure DevOps, Key Vaults
- Core expertise: Data Warehousing, ETL, Data Pipelines, Data Modelling, Data Governance
- Agile, SDLC, containerization (Docker), clean coding practices

Good-to-Have Skills:
- Event Hubs, Logic Apps
- Power BI
- Strong logic building and competitive programming background

Mode: Remote
Duration: 6 months
Locations: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune

Posted 1 month ago

Apply

4.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Power BI and AAS expert (Strong SC or Specialist Senior)
- Should have hands-on experience of data modelling in Azure SQL Data Warehouse and Azure Analysis Services
- Should be able to write and test DAX queries
- Should be able to generate paginated reports in Power BI
- Should have a minimum of 3 years' working experience delivering projects in Power BI

Must Have:
- 3 to 8 years of experience designing, developing, and deploying ETL processes on Databricks to support data integration and transformation
- Optimize and tune Databricks jobs for performance and scalability
- Experience with Scala and/or Python programming languages
- Proficiency in SQL for querying and managing data
- Expertise in ETL (Extract, Transform, Load) processes
- Knowledge of data modeling and data warehousing concepts
- Implement best practices for data pipelines, including monitoring, logging, and error handling
- Excellent problem-solving skills and attention to detail
- Excellent written and verbal communication skills
- Strong analytical and problem-solving abilities
- Experience with version control systems (e.g., Git) to manage and track changes to the codebase
- Document technical designs, processes, and procedures related to Databricks development
- Stay current with Databricks platform updates and recommend improvements to existing processes

Good to Have:
- Agile delivery experience
- Experience with cloud services, particularly Azure (Azure Databricks), AWS (AWS Glue, EMR), or Google Cloud Platform (GCP)
- Knowledge of Agile and Scrum software development methodologies
- Understanding of data lake architectures
- Familiarity with tools like Apache NiFi, Talend, or Informatica
- Skills in designing and implementing data models

Posted 1 month ago

Apply

8.0 - 10.0 years

5 - 8 Lacs

Pune

Work from Office

Role Purpose:
The purpose of the role is to liaise between the customer and the Wipro delivery team, bridging the gap by comprehending and analysing customer requirements and articulating them aptly to delivery teams, thereby ensuring the right solutioning for the customer.

Do:

1. Customer requirements gathering and engagement
- Interface and coordinate with client engagement partners to understand the RFP/RFI requirements.
- Detail out scope documents, functional & non-functional requirements, features, etc., ensuring all stated and unstated customer needs are captured.
- Construct workflow charts and diagrams, study system capabilities, and write specifications after thorough research and analysis of customer requirements.
- Engage and interact with the internal team (project managers, pre-sales team, tech leads, architects) to design and formulate accurate and timely responses to RFPs/RFIs.
- Understand and communicate the financial and operational impact of any changes.
- Hold periodic cadences with customers to seek clarifications and feedback on the solution proposed for a particular RFP/RFI, and accordingly instruct the delivery team to make changes in the design.
- Empower customers through demonstration and presentation of the proposed solution/prototype.
- Maintain relationships with customers to optimize business integration and lead generation.
- Ensure ongoing reviews and feedback from customers to improve and deliver better value (services/products) to the customers.

2. Engage with the delivery team to ensure the right solution is proposed to the customer
a. Hold periodic cadences with the delivery team to:
- Provide them with customer feedback/inputs on the proposed solution.
- Review the test cases to check 100% coverage of customer requirements.
- Conduct root cause analysis to understand the proposed solution/demo/prototype before sharing it with the customer.
- Deploy and facilitate new change requests to cater to customer needs and requirements.
- Support the QA team with periodic testing to ensure solutions meet business needs, by giving timely inputs/feedback.
- Conduct integration testing and user acceptance demos/testing to validate implemented solutions and ensure a 100% success rate.
- Use data modelling practices to analyse findings and design and develop improvements and changes.
- Ensure 100% utilization by studying system capabilities and understanding business specifications.
- Stitch together the entire response/solution proposed for the RFP/RFI before it is presented to the customer.
b. Support the Project Manager/delivery team in delivering the solution to the customer:
- Define and plan project milestones, phases and the different elements involved in the project, along with the principal consultant.
- Drive and challenge the presumptions of delivery teams on how they will successfully execute their plans.
- Ensure customer satisfaction through quality deliverables on time.

3. Build domain expertise and contribute to the knowledge repository
- Engage and interact with other BAs to share expertise and increase domain knowledge across the vertical.
- Write whitepapers/research papers and points of view, and share them with the consulting community at large.
- Identify and create use cases from different projects/accounts that can be promoted at the Wipro level for business enhancements.
- Conduct market research for content and development to provide the latest inputs into projects, thereby ensuring customer delight.

Deliver:
1. Customer engagement and delivery management - measured by: PCSAT, utilization % achievement, number of leads generated from business interactions, number of errors/gaps in documenting customer requirements, feedback from the project manager, process flow diagrams (quality and timeliness), % of deal solutioning completed within timeline, velocity generated.
2. Knowledge management - measured by: number of whitepapers/research papers written, number of user stories created, % of proposal documentation completed and uploaded into the knowledge repository, number of reusable components developed for proposals during the quarter.

Mandatory skills: Business Analysis
Experience: 8-10 years

Posted 1 month ago

Apply

10.0 - 20.0 years

20 - 35 Lacs

Hyderabad

Remote

Hi, greetings from Euclid Innovations!

We have an opening for a Technical Business Analyst with one of our banking clients, in remote work mode.

Position: Technical Business Analyst
Experience: 10+ years
Location: Remote
Notice period: Immediate to 20 days max
Skill set: Financial/capital markets and fixed income, equity, credit, bonds, investment banking

Duties and Responsibilities:
- Assist in the business analysis phase, including the capture and translation of business requirements into functional requirements and non-functional requirements (i.e. architectural, infrastructure, security, testing, migration, operational, DR).
- Ensure appropriate documentation of requirements is captured and recorded (e.g. a 'requirement story' in JIRA).
- Analyse end-to-end business streams to establish data requirements in line with XML/XSD modelling, specifically FpML, of entities across multiple business areas and multiple geographical regions.
- Logical data modelling, working closely with both business and IT teams.
- Identify common data requirements and help drive shared data platforms.
- Clean, map, and extend data sets to improve business processes and tools.
- Coordinate and manage delivery of solutions across enterprise delivery teams in support of end-to-end testing and production delivery.
- Adhere to existing global, local and department project standards for documentation, security, testing and release management.

Qualifications, Skills and Experience:
- Bachelor's degree or equivalent.
- Excellent technical analysis and investigatory skills.
- Ability to work with both business and IT staff in a pressured environment.
- Business analysis within an Agile development project.
- Strong data analysis skills to ensure accurate system data extracts and reconciliations, working with large datasets.
- Proven track record of writing structured business requirements and functional specifications.
- Working knowledge of financial instruments: government bonds, SAS bonds, credit bonds, exchange-traded bond futures, interest rate swaps, repos, stock lending and equities.
- Well-structured and logical approach to working.
- Good knowledge of compliance business processes.
- Proven track record of supporting back/middle office systems.
- Proven experience developing mutually beneficial relationships with business stakeholders, users, software solution providers, and other IT teams.
- Proven experience of full involvement in project life cycles within investment banking.
- Proven experience performing system testing and guiding users in building their functional plans for user testing.
- Ability to handle multiple work streams and assignments simultaneously.
- Proven experience of issue resolution through data mining and investigation.

If interested, share your profile with aruna.c@euclidinnovations.com

Posted 1 month ago

Apply

4.0 - 9.0 years

6 - 9 Lacs

Bengaluru

Work from Office

- Prepare, monitor and generate appropriate mathematical models, and leverage WFM tools to generate staffing requirements
- Oversee the overall capacity planning for headcount

Please call/WhatsApp: 6002281943 / 7575955995 / 8559900185

Required Candidate Profile:
- At least 2-3 years in a WFM planning role
- Strong process and mathematical orientation
- Experience of data modelling, simulations and scenario planning
- Strong communicator and decision maker

Posted 1 month ago

Apply

3.0 - 5.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Role Purpose:
The purpose of this role is to interpret data and turn it into information (reports, dashboards, interactive visualizations, etc.) that can offer ways to improve the business, thus affecting business decisions.

Do:

1. Manage the technical scope of the project in line with the requirements at all stages
a. Gather information from various sources (data warehouses, databases, data integration and modelling) and interpret patterns and trends.
b. Develop record management processes and policies.
c. Build and maintain relationships at all levels within the client base and understand their requirements.
d. Provide sales data, proposals, data insights and account reviews to the client base.
e. Identify areas to increase the efficiency and automation of processes.
f. Set up and maintain automated data processes.
g. Identify, evaluate and implement external services and tools to support data validation and cleansing.
h. Produce and track key performance indicators.

2. Analyse data sets and provide adequate information
a. Liaise with internal and external clients to fully understand the data content.
b. Design and carry out surveys, and analyse survey data as per the customer requirement.
c. Analyse and interpret complex data sets relating to the customer's business, and prepare reports for internal and external audiences using business analytics reporting tools.
d. Create data dashboards, graphs and visualizations to showcase business performance, and provide sector and competitor benchmarking.
e. Mine and analyse large datasets, draw valid inferences and present them successfully to management using a reporting tool.
f. Develop predictive models and share insights with the clients as per their requirements.

Deliver:
1. Analyse data sets and provide relevant information to the client - measured by: number of automations done, on-time delivery, CSAT score, zero customer escalations, data accuracy.

Mandatory skills: Tableau

Posted 1 month ago

Apply

5.0 - 8.0 years

3 - 7 Lacs

Hyderabad

Work from Office

Role Purpose:
The purpose of this role is to interpret data and turn it into information (reports, dashboards, interactive visualizations, etc.) that can offer ways to improve the business, thus affecting business decisions.

Do:

1. Manage the technical scope of the project in line with the requirements at all stages
a. Gather information from various sources (data warehouses, databases, data integration and modelling) and interpret patterns and trends.
b. Develop record management processes and policies.
c. Build and maintain relationships at all levels within the client base and understand their requirements.
d. Provide sales data, proposals, data insights and account reviews to the client base.
e. Identify areas to increase the efficiency and automation of processes.
f. Set up and maintain automated data processes.
g. Identify, evaluate and implement external services and tools to support data validation and cleansing.
h. Produce and track key performance indicators.

2. Analyse data sets and provide adequate information
a. Liaise with internal and external clients to fully understand the data content.
b. Design and carry out surveys, and analyse survey data as per the customer requirement.
c. Analyse and interpret complex data sets relating to the customer's business, and prepare reports for internal and external audiences using business analytics reporting tools.
d. Create data dashboards, graphs and visualizations to showcase business performance, and provide sector and competitor benchmarking.
e. Mine and analyse large datasets, draw valid inferences and present them successfully to management using a reporting tool.
f. Develop predictive models and share insights with the clients as per their requirements.

Deliver:
1. Analyse data sets and provide relevant information to the client - measured by: number of automations done, on-time delivery, CSAT score, zero customer escalations, data accuracy.

Mandatory skills: Tableau

Posted 1 month ago

Apply

4.0 - 9.0 years

20 - 35 Lacs

Pune

Work from Office

Data/ETL Architect / Data Modeler:
- Develop conceptual, logical, and physical data models to ensure accurate, scalable, and optimized data structures aligned with business requirements.
- Collaborate with business and technical teams to define data flows and transformation rules, and ensure alignment with data governance and quality standards.
- Design end-to-end ETL architecture and data integration solutions.

Technologies: SQL, ETL, Big Data, ER/Studio
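As a purely illustrative example of the physical data modelling this role describes, here is a minimal star-schema DDL sketch. The table and column names are hypothetical; sqlite3 (from the Python standard library) is used only so the DDL can be run anywhere.

```python
# Minimal sketch of a physical star schema: two dimensions joined to a
# fact table by surrogate keys. All names are hypothetical.
import sqlite3

DDL = """
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,   -- surrogate key
    customer_id  TEXT NOT NULL,         -- natural/business key
    segment      TEXT
);

CREATE TABLE dim_date (
    date_key     INTEGER PRIMARY KEY,   -- e.g. 20240601
    full_date    TEXT NOT NULL,
    month        INTEGER,
    year         INTEGER
);

CREATE TABLE fact_sales (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    quantity     INTEGER,
    amount       REAL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)  # build the model; dimensions join to the fact
print(conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'"
).fetchall())
```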

Posted 1 month ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Bengaluru

Hybrid

Your day at NTT DATA:
The Senior Data Scientist is an advanced subject matter expert, tasked with taking accountability for the adoption of data science and analytics within the organization. The primary responsibility of this role is to participate in the creation and delivery of data-driven solutions that add business value using statistical models, machine learning algorithms, data mining, and visualization techniques.

What you'll be doing

Key Responsibilities:
- Designs, develops, and programs methods, processes, and systems to consolidate and analyze unstructured, diverse big data sources to generate actionable insights and solutions for client services and product enhancement.
- Designs and enhances data collection procedures to include information that is relevant for building analytic systems.
- Responsible for ensuring that data used for analysis is processed, cleaned and integrally verified, and builds the algorithms necessary to find meaningful answers.
- Designs and codes software programs, algorithms, and automated processes to cleanse, integrate and evaluate large datasets from multiple disparate sources.
- Provides meaningful insights from large data and metadata sources; interprets and communicates insights and findings from analyses and experiments to product, service, and business managers.
- Directs scalable and highly available applications leveraging the latest tools and technologies.
- Accountable for creatively visualizing and effectively communicating the results of data analysis, insights, and ideas in a variety of formats to key decision-makers within the business.
- Creates SQL queries for the analysis of data and visualizes the output of models.
- Responsible for ensuring that industry-standard best practices are applied to development activities.

Knowledge and Attributes:
- Advanced understanding of data modelling, statistical methods and machine learning techniques.
- Strong ability to thrive in a dynamic, fast-paced environment.
- Strong quantitative and qualitative analysis skills.
- Desire to acquire more knowledge to keep up to speed with the ever-evolving field of data science.
- Curiosity to sift through data to find answers and further insights.
- Advanced understanding of the information technology industry within a matrixed organization and the typical business problems such organizations face.
- Strong ability to translate technical findings clearly and fluently to non-technical business stakeholders to enable informed decision-making.
- Strong ability to create a storyline around the data to make it easy to interpret and understand.
- Self-driven and able to work independently, yet acts as a team player.

Academic Qualifications and Certifications:
- Bachelor's degree or equivalent in Data Science, Business Analytics, Mathematics, Economics, Engineering, Computer Science or a related field.
- Relevant programming certification preferred.
- Agile certification preferred.

Required Experience:
- Advanced demonstrated experience in a data science position in a corporate environment and/or related industry.
- Advanced demonstrated experience in statistical modelling, data modelling, machine learning, data mining, unstructured data analytics and natural language processing.
- Advanced demonstrated experience with programming languages (R, Python, etc.).
- Advanced demonstrated experience working with and creating data architectures.
- Advanced demonstrated experience extracting, cleaning, and transforming data, and working with data owners to understand the data.
- Advanced demonstrated experience visualizing and/or presenting data for stakeholder use and reuse across the business.
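As a small, purely illustrative sketch of the statistical modelling work this posting describes, the snippet below trains and evaluates a classifier on a toy dataset bundled with scikit-learn; it is an illustration of the craft, not NTT DATA's tooling.

```python
# Minimal train/evaluate sketch on a bundled toy dataset.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Communicating results is part of the job: report per-class metrics.
print(classification_report(y_test, model.predict(X_test)))
```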

Posted 1 month ago

Apply

2.0 - 6.0 years

4 - 8 Lacs

Mumbai

Work from Office

Your day at NTT DATA:
The Database Administrator is a seasoned subject matter expert, responsible for ensuring the availability, integrity and performance of critical data assets. This role works closely with cross-functional teams to support data-driven applications, troubleshoot issues, and implement robust backup and recovery strategies. The Database Administrator works closely with Change Control, Release Management, Asset and Configuration Management, and Capacity and Availability Management to establish the needs of users, monitors user access and security, and assists in controlling access to database environments through permissions and privileges.

What you'll be doing

Key Responsibilities:
- Performs the installation, configuration, and maintenance of database management systems (DBMS), including SQL Server, Oracle, MySQL, or others, as required.
- Collaborates with software developers/architects to design and optimize database schemas, data models, and database-related applications.
- Assists with mapping out the conceptual design for a planned database.
- Participates in the writing of database documentation, including data standards, data flow diagrams, standard operating procedures and definitions for the data dictionary (metadata).
- Monitors database performance, identifies performance bottlenecks, and optimizes queries and indexing for optimal database performance.
- Continuously monitors database systems to ensure availability, proactively identifies potential issues, and takes appropriate action.
- Designs and implements robust backup and disaster recovery strategies to ensure data availability and business continuity.
- Monitors production databases regularly, or responds to any database issues by bringing down the database or taking the database offline.
- Proactively supports the development of database utilities and automated reporting.
- Works closely with the Change Control and Release Management functions to commission and install new applications and customize existing applications in order to make them fit for purpose.
- Plans and executes database software upgrades and applies patches to keep systems up to date and secure.
- Communicates regularly with technical, applications and operational employees to ensure database integrity and security.
- Ensures data integrity and consistency by performing regular data validation, integrity checks, and data cleansing activities.
- Works collaboratively with cross-functional teams, including developers, system administrators, network engineers, and business stakeholders, to support database-related initiatives.
- Provides technical support to end users, assists with database-related enquiries, and conducts training sessions as needed.
- Performs any other related tasks as required.

Knowledge and Attributes:
- Seasoned proficiency in database administration tasks, including database installation, configuration, maintenance, and performance tuning.
- Seasoned knowledge of SQL (Structured Query Language) to write complex queries, stored procedures, and functions.
- Seasoned understanding of database security principles, access controls, and data encryption methods.
- Seasoned working knowledge of database backup and recovery strategies to ensure data availability and business continuity.
- Ability to monitor database performance, identify and resolve issues, and optimize database operations.
- Ability to manage multiple projects concurrently while maintaining a high level of attention to detail on each project.
- Ability to learn new technologies as needed to provide the best solutions to all stakeholders.
- Ability to communicate complex IT information in simplified form depending on the target audience.
- Effective communication and collaboration skills to work with cross-functional teams and stakeholders.
- Seasoned understanding of the principles of data architecture and data services.
- Seasoned knowledge of the application development lifecycle and data access layers.
- Excellent problem-solving skills to troubleshoot database-related issues and implement effective solutions.
- Excellent analytical skills related to working with unstructured datasets.
- Ability to manipulate, process and extract value from large, disconnected datasets.

Academic Qualifications and Certifications:
- Bachelor's degree or equivalent in computer science, engineering, information technology or a related field.
- Relevant certification, such as MCSE DBA, Oracle Certified Associate or equivalent, preferred.
- Relevant certifications such as Microsoft Certified: Azure Database Administrator Associate; Oracle Certified Professional (OCP) - Database Administrator; MySQL Database Administrator; or PostgreSQL Certified Professional preferred.
- Completion of database management courses covering topics like database administration, data modelling, SQL, and performance tuning can provide foundational knowledge.

Required Experience:
- Seasoned experience working as a Database Administrator within an information technology organization.
- Seasoned experience with database backup and recovery best practices.
- Seasoned experience running and creating health assessment reports.
- Seasoned experience working with suppliers to deliver solutions.
- Seasoned experience with Oracle Enterprise.
- Seasoned experience with Microsoft SQL Server.
- Seasoned experience managing databases.

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
