426 Data Modelling Jobs - Page 12

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

8.0 - 10.0 years

10 - 20 Lacs

Kolkata, Pune, Bengaluru

Hybrid

Job Title: ETL Data Modeller
Experience: 8 to 10 years
Location: Pan India
Employment Type: Full-Time
Notice Period: Immediate to 30 days
Technology: Data Modelling, IICS/any leading ETL tool, SQL, Python (nice to have)

Key Responsibilities: Experience in data warehousing, solution design, and data analytics. Experience in data modelling exercises such as dimensional modelling and data vault modelling. Understand, interpret, and clarify functional as well as technical requirements. Understand the overall system landscape, including upstream and downstream systems. Be able to understand ETL technical specifications and develop code efficiently. Ability to demonstrate Informatica Cloud features/functions to achieve the best results. Hands-on experience in performance tuning and pushdown optimization in IICS. Provide mentorship on debugging and problem-solving. Review and optimize ETL technical specifications and code developed by the team. Ensure alignment with the overall system architecture and data flow.
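The dimensional modelling this role asks for centres on star schemas: a fact table keyed to conformed dimensions. A minimal sketch, using SQLite for illustration (all table and column names here are hypothetical, not from any listing):

```python
import sqlite3

# Minimal star schema: one fact table joined to a date and a product dimension.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, full_date TEXT, year INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE fact_sales  (date_key INTEGER REFERENCES dim_date,
                          product_key INTEGER REFERENCES dim_product,
                          qty INTEGER, amount REAL);
INSERT INTO dim_date    VALUES (20240101, '2024-01-01', 2024);
INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware');
INSERT INTO fact_sales  VALUES (20240101, 1, 3, 29.97);
""")

# A typical analytic query joins the fact to its dimensions and aggregates.
row = conn.execute("""
    SELECT d.year, p.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.year, p.category
""").fetchone()
print(row)
```

Data vault modelling, also named in the listing, instead splits entities into hubs, links, and satellites, trading query simplicity for auditability and load parallelism.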

Posted 3 weeks ago


4.0 - 9.0 years

7 - 17 Lacs

Chennai

Work from Office

Data Modelling
Skillset: Data Science/Data Modelling plus experience in ML/AI, OMR Model or OMR Reporting, Quant Modelling or Quantitative Analysis; Sanction Screening/Transaction Monitoring/Financial Crime experience is a must (please use these keywords while sourcing).
Location: Chennai (mandatory); open to candidates willing to relocate to Chennai.
Notice Period: 0 to 30 days.
Experience Range: 2 to 10 years (SE to Manager).

Posted 3 weeks ago


4.0 - 6.0 years

4 - 6 Lacs

Thiruvananthapuram

Work from Office

We’re looking for a highly skilled Senior Power BI Developer to design, develop, and maintain BI solutions that empower data-driven decisions across the organization. You will play a key role in turning complex data into insightful visual stories.

Posted 3 weeks ago


6.0 - 11.0 years

12 - 22 Lacs

Bengaluru

Work from Office

Our engineering team is looking for a Data Engineer who is highly proficient in Python, has a strong understanding of AWS cloud computing and ETL pipelines, and demonstrates proficiency with SQL and relational database concepts. In this role you will be a mid-to-senior-level individual contributor guiding our migration efforts, serving as a senior data engineer working closely with the data architects to evaluate best-fit solutions and processes for our team. You will work with the rest of the team as we move away from legacy tech and introduce new tools and ETL pipeline solutions. You will collaborate with subject matter experts, data architects, informaticists, and data scientists to evolve our current cloud-based ETL to the next generation.

Responsibilities: Independently prototype and develop data solutions of high complexity to meet the needs of the organization and business customers. Design proof-of-concept solutions utilizing an advanced understanding of multiple coding languages to meet technical and business requirements, with an ability to perform iterative solution testing to ensure specifications are met. Design and develop data solutions that enable effective self-service data consumption, and describe their value to the customer. Collaborate with stakeholders in defining metrics that are impactful to the business. Prioritize efforts based on customer value. Maintain an in-depth understanding of Agile techniques. Set expectations for deliverables of high complexity. Assist in the creation of roadmaps for data solutions. Turn vague ideas or problems into data product solutions. Influence strategic thinking across the team and the broader organization. Maintain proof-of-concept and prototype data solutions, and manage any assessment of their viability and scalability, with your own team or in partnership with IT. Working with IT, assist in building robust systems focused on long-term and ongoing maintenance and support. Ensure data solutions include the deliverables required to achieve high-quality data. Display a strong understanding of complex multi-tier, multi-platform systems, and apply principles of metadata, lineage, business definitions, compliance, and data security to project work. Maintain an in-depth understanding of Business Intelligence tools, including visualization and user experience techniques. Work with IT to help scale prototypes. Demonstrate a comprehensive understanding of new technologies as needed to progress initiatives.

Requirements: Expertise in Python programming, with demonstrated real-world experience building out data tools in a Python environment. Expertise in AWS services, with demonstrated real-world experience building out data tools. Bachelor's degree in Computer Science, Computer Engineering, or a related discipline preferred; Master's in the same or a related discipline strongly preferred. 3+ years of experience coding for data management, data warehousing, or other data environments, including, but not limited to, Python and Spark. Experience with SAS is preferred. 3+ years of experience as a developer working in an AWS cloud computing environment. 3+ years of experience using Git or Bitbucket. Experience with Redshift, RDS, and DynamoDB is preferred. Strong written and oral communication skills are required. Experience in the healthcare industry with healthcare data analytics products, and experience with healthcare vocabulary and data standards (OMOP, FHIR), is a plus.

Posted 3 weeks ago


8.0 - 12.0 years

37 - 45 Lacs

Noida, Hyderabad

Work from Office

Position Summary
MetLife established a Global Capability Center (MGCC) in India to scale and mature Data & Analytics and technology capabilities in a cost-effective manner and make MetLife future ready. The center is integral to Global Technology and Operations, with a focus on protecting and building MetLife IP, promoting reusability, and driving experimentation and innovation. The Data & Analytics team in India mirrors the global D&A team, with an objective to drive business value through trusted data, scaled capabilities, and actionable insights. The operating model consists of business-aligned data officers (US, Japan, LatAm, and corporate functions) enabled by enterprise COEs: data engineering, data governance, and data science.

Role Value Proposition
The Business Analyst Data Modeler has an important role in the Data and Analytics (D&A) organization. The role ensures data is structured, organized, and represented effectively, aligned to the needs of the organization. The role helps design logical and physical models, including implementation of robust data models that accurately capture, store, and manage data end to end.

Job Responsibilities
Perform data modeling (logical and physical) using the CA Erwin Data Modeler tool. Gather, understand, and analyze business requirements accurately. Analyze data using SQL. Partner with other teams to understand data needs and translate them into effective data models. Collaborate with stakeholders to provide domain-based solutions. Implement industry data modeling standards, best practices, and emerging technologies in data modeling. Hands-on experience with API development and integration (REST, SOAP, JSON, XML).

Education, Technical Skills & Other Critical Requirements
1. 8-10 years of overall experience, with a minimum of 5+ years in data modeling, business and functional requirements gathering, designing data strategies and data flows, data/ETL/BI architecture, and delivering logical/physical data models in collaboration with business/application teams and BSAs
2. 3+ years of hands-on experience with CA Erwin
3. 5+ years of experience in SQL, data modelling, and data analysis
4. Ability to communicate effectively
5. Strong SQL skills
6. Hands-on experience with the Erwin tool
7. Familiarity with Agile best practices
8. Strong collaboration and facilitation skills

Posted 3 weeks ago


3.0 - 7.0 years

10 - 16 Lacs

Bengaluru

Work from Office

Warm greetings from Rivera Manpower Services.

Job Title: Senior BI Analyst
Call and book your interview: 9986267393 / 7829336034

Job Summary: Designs and develops data- and knowledge-based solutions using procedures, tools, and methodologies. Creates data models, logical and physical databases, data dictionaries, access methods, etc. to support business objectives.

Principal Responsibilities: Data Analysis & Requirements Gathering: Assess and analyze data needs at both system and sub-system levels to support business objectives. Power BI & SQL Development: Design, build, and optimize Power BI dashboards, reports, and relational databases using SQL, SSRS, and SSAS for advanced data modeling and integration. Business Intelligence Enablement: Train and mentor teams on Power BI and other Microsoft BI tools, fostering a data-driven culture across the organization. Quality Assurance & UAT: Implement rigorous quality control and user acceptance testing (UAT) processes for regular and ad hoc reporting. Cross-Functional Collaboration: Work seamlessly across geographical and cultural boundaries, ensuring consistency in data reporting and insights delivery. Technical & Business Translation: Translate complex business requirements into scalable technical solutions, making data accessible and impactful for stakeholders. Coaching & Mentorship: Guide junior colleagues in both technical skills and commercial awareness, ensuring a strong and capable BI team. Continuous Improvement: Stay ahead of BI trends and enhance reporting structures to meet evolving business demands.

What You Bring: Proven experience in SQL, Power BI, SSRS, and SSAS, with a strong focus on data modeling, visualization, and performance optimization. Ability to work in a fast-paced, evolving business environment, adapting to new challenges and opportunities. Strong communication skills to translate complex technical concepts into business-friendly insights. Experience mentoring and coaching junior team members to develop both technical and analytical skills.

Key Competencies & Impact: Deep Expertise & Industry Knowledge: Strong foundation in core principles, theories, and concepts within the discipline, with a well-rounded understanding of industry best practices, techniques, and standards. Problem-Solving & Innovation: Develops creative, data-driven solutions for a variety of challenges, applying critical thinking and analytical skills to drive results; evaluates policies, practices, and precedents to determine the best course of action. Independent Decision-Making: Operates with a high level of autonomy, exercising sound judgment and discretion; while guidance may be provided for new or complex tasks, work is primarily self-directed and assessed for overall effectiveness. Cross-Functional Collaboration: Works closely with management, team members, and stakeholders across departments, ensuring alignment and knowledge sharing; may serve as a key representative for the department in internal and external engagements. Business Impact & Accountability: Decisions and actions directly influence the success of the department and broader organization; errors or delays could impact project timelines, revenue generation, or resource allocation, making attention to detail and strategic thinking essential.

Work Experience: Typically 3+ years with a bachelor's degree or equivalent. Experience using SQL Server with normalized/denormalized databases. Development, implementation, and administration of SSRS reports (essential). Ability to write complex SQL code (essential). Experience with Power BI and/or Tableau (essential).

Education and Certification(s): Bachelor's degree or equivalent experience from which comparable knowledge and job skills can be obtained.

Distinguishing Characteristics: Supports data requirements for a business application. Performs data modeling and analysis to support data applications in one functional area.

Posted 3 weeks ago


3.0 - 8.0 years

20 - 25 Lacs

Bengaluru

Work from Office

3 to 8 years of IT experience in the development and implementation of Business Intelligence and data warehousing solutions using ODI. Knowledge of analysis, design, development, customization, implementation, and maintenance of Oracle Data Integrator (ODI). Experience designing, implementing, and maintaining ODI load plans and processes. Working knowledge of ODI, PL/SQL, TOAD, logical/physical data modelling, star/snowflake schemas, fact and dimension tables, ELT, and OLAP. Experience with SQL, UNIX, complex queries, stored procedures, and data warehouse best practices. Ensure correctness and completeness of data loading (full load and incremental load). Excellent communication skills; organized and effective in delivering high-quality solutions using ODI. Location: Bengaluru (Pan India).

Posted 3 weeks ago


6.0 - 11.0 years

20 - 35 Lacs

Bengaluru

Remote

LEAD ANALYST: As a Lead Analyst, you will play a strategic role in leading data-driven consulting engagements, designing advanced analytics solutions, and delivering actionable insights to clients. You will collaborate with cross-functional teams, manage BI projects, and enable clients to make data-backed business decisions.

Key Responsibilities:
Client Consulting & Strategy: Partner with clients to understand business challenges, define business objectives, and develop data-driven strategies. Translate business problems into analytics solutions by leveraging BI dashboards, predictive modelling, and AI-driven insights. Act as a trusted advisor by delivering compelling presentations and actionable recommendations to senior stakeholders.
Business Intelligence & Data Visualization: Design, develop, and manage scalable BI dashboards and reporting solutions using tools like Power BI and Tableau. Drive data accuracy, consistency, and security in reporting solutions across different client engagements. Enable self-service BI for clients by setting up robust data visualization and exploration frameworks.
Advanced Analytics & Insights Generation: Perform deep-dive analysis on business performance metrics, customer behaviour, and operational trends. Define, develop, and track key performance indicators (KPIs) to measure business success and identify improvement opportunities.
Project & Stakeholder Management: Lead multiple analytics and BI projects, ensuring timely delivery and alignment with client expectations. Work cross-functionally with data engineers, business consultants, and technology teams to deliver holistic solutions. Communicate findings through executive reports, data stories, and interactive presentations.
Team Leadership & Development: Build and grow a team of BI developers, data analysts, and business consultants. Foster a data-driven culture by providing training and upskilling opportunities for internal teams. Contribute to thought leadership by publishing insights, whitepapers, and case studies.

Key Qualifications & Skills:
Education: Bachelor's or Master's degree in Business Analytics, Data Science, Computer Science, or a related field.
Experience: 6+ years in business intelligence, analytics, or data consulting roles.
Technical Expertise: Strong proficiency in SQL, Python, Excel, and other data manipulation techniques. Hands-on experience with BI tools like Power BI/Tableau. Knowledge of data engineering and data modelling concepts, ETL processes, and cloud platforms (Azure/AWS/GCP). Familiarity with predictive modelling and statistical analysis.
Consulting & Business Acumen: Strong problem-solving skills and the ability to translate data insights into business impact. Experience working in a consulting environment, managing client relationships and expectations. Excellent communication and storytelling skills, leveraging PowerPoint to present complex data insights effectively.
Project & Stakeholder Management: Ability to manage multiple projects and collaborate across teams in a fast-paced environment. Strong leadership and mentorship capabilities, fostering a culture of learning and innovation.

LEAD BUSINESS ANALYST: We are seeking a highly experienced and strategic Lead Business Analyst with over 10 years of proven expertise in business analysis, data analytics, and project delivery. The ideal candidate will have deep knowledge of risk, data governance, and KPI frameworks, with a successful track record of driving complex data-driven projects, compliance transformations, and performance automation.

Key Responsibilities:
Business Analysis & Strategy: Collaborate with stakeholders to gather, define, and analyze business requirements across projects. Develop Business Requirement Documents (BRDs) and functional specifications aligned with business goals.
Project Delivery & Data Analytics: Lead cross-functional teams to deliver data-centric projects such as scorecard creation, dashboards, and EDW redesign. Manage the end-to-end project lifecycle, ensuring timely delivery of business insights and performance dashboards.
Process Optimization & Automation: Drive process enhancements by automating KPIs, daily reports, and workflows. Conduct gap analysis, root cause analysis, and impact assessments to improve decision-making accuracy.
Stakeholder & Client Engagement: Serve as a point of contact for internal and external stakeholders, ensuring business objectives are translated into actionable analytics. Deliver high-impact demos and training sessions to clients and internal teams.

Key Requirements: 10+ years of experience in business analysis, preferably in EDW projects. Hands-on expertise with data analytics, data quality assessment, and KPI frameworks. Technical proficiency in SQL Server, Power BI/Tableau, and Jira. Strong documentation and stakeholder management skills. Experience with AI/ML product features and data governance practices is a plus.

Key Competencies: Strategic thinking and problem solving. Strong analytical and communication skills. Agile and cross-functional team leadership. Data strategy, quality, and visualization. Critical thinking and decision-making.

Posted 3 weeks ago


6.0 - 11.0 years

30 - 40 Lacs

Bengaluru

Work from Office

Role & Responsibilities: Bachelor's degree preferred, or an equivalent combination of education, training, and experience. 5+ years of professional experience with SQL, ETL, data modeling, and at least one programming language (e.g., Python, C++, C#, Scala). 3+ years of professional experience with enterprise domains like HR, Finance, or Supply Chain. 6+ years of professional experience with more than one SQL and relational database, including expertise in Presto, Spark, and MySQL. Professional experience designing and implementing real-time pipelines (Apache Kafka or similar technologies). 5+ years of professional experience in custom ETL design, implementation, and maintenance. 3+ years of professional experience with data modeling, including expertise in data warehouse design and dimensional modeling. 5+ years of professional experience working with a cloud or on-premises Big Data/MPP analytics platform (Teradata, AWS Redshift, Google BigQuery, Azure Synapse Analytics, or similar). Experience with data quality and validation (using Apache Airflow). Experience with anomaly/outlier detection. Experience with data science workflows (Jupyter Notebooks, Bento, or similar tools). Experience with Airflow or similar workflow management systems. Experience querying massive datasets using Spark, Presto, Hive, or similar. Experience building systems integrations and tooling interfaces, and implementing integrations for ERP systems (Oracle, SAP, Salesforce, etc.). Experience in data visualization using Power BI and Tableau. Proficiency in the Python programming language and Python libraries, with a focus on data engineering and data science applications. Professional fluency in English required.

Posted 3 weeks ago


3.0 - 8.0 years

13 - 15 Lacs

Chennai

Work from Office

Focus on Power Pages, Power Apps, canvas apps, working with connectors and custom connectors, application security, authorization mechanisms, Azure, cloud services, Dataverse, data modelling, Copilot, SQL, Power Query, data integration, and data extraction. Required candidate profile: BE/MCA/MSc with 3+ years of experience as a Power Platform developer. Strong skills in Power Pages, Power Apps, application security, Azure, cloud services, Dataverse, Copilot, SQL, and Power Query. Must relocate to Chennai. Perks and benefits: Excellent perks. Call Ms Divya @ 7010384865 now.

Posted 4 weeks ago


8.0 - 10.0 years

20 - 35 Lacs

Pune

Work from Office

Job Summary: We are looking for a seasoned Senior ETL/DB Tester with deep expertise in data validation and database testing across modern data platforms. This role requires strong proficiency in SQL and experience with tools like Talend, ADF, Snowflake, and Power BI. The ideal candidate will be highly analytical, detail-oriented, and capable of working across cross-functional teams in a fast-paced data engineering environment. Key Responsibilities: Design, develop, and execute comprehensive test plans for ETL and database validation processes Validate data transformations and integrity across multiple stages and systems (Talend, ADF, Snowflake, Power BI) Perform manual testing and defect tracking using Zephyr or Tosca Analyze business and data requirements to ensure full test coverage Write and execute complex SQL queries for data reconciliation Identify data-related issues and conduct root cause analysis in collaboration with developers Track and manage bugs and enhancements through appropriate tools Optimize testing strategies for performance, scalability, and accuracy in ETL workflows Mandatory Skills: ETL Tools: Talend, ADF Data Platforms: Snowflake Reporting/Analytics: Power BI, VPI Testing Tools: Zephyr or Tosca, Manual testing Strong SQL expertise for validating complex data workflows Good-to-Have Skills: API Testing exposure Power BI Advanced Features (Dashboards, DAX, Data Modelling)
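The "complex SQL queries for data reconciliation" this tester role calls for usually compare counts and control totals between source and target, then diff keys row by row. A minimal sketch using SQLite, with hypothetical source/target tables (not from the listing):

```python
import sqlite3

# Hypothetical source and target tables; a reconciliation test asserts that
# row counts and key measures match after an ETL load.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_orders (order_id INTEGER, amount REAL);
CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
INSERT INTO src_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
INSERT INTO tgt_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
""")

# Count and control-total comparison.
src_cnt, src_sum = conn.execute("SELECT COUNT(*), SUM(amount) FROM src_orders").fetchone()
tgt_cnt, tgt_sum = conn.execute("SELECT COUNT(*), SUM(amount) FROM tgt_orders").fetchone()
assert (src_cnt, src_sum) == (tgt_cnt, tgt_sum), "load incomplete or values altered"

# Row-level difference: keys present in source but missing from target.
missing = conn.execute("""
    SELECT order_id FROM src_orders
    EXCEPT
    SELECT order_id FROM tgt_orders
""").fetchall()
print(len(missing))  # 0 when the load is complete
```

The same pattern scales to Snowflake or ADF-loaded targets; only the connection and dialect change, not the count/total/EXCEPT structure.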

Posted 4 weeks ago


4.0 - 9.0 years

6 - 11 Lacs

Pune

Work from Office

About the Position: Develop, test, and maintain high-quality software using the Python programming language. Participate in the entire software development lifecycle: building, testing, and delivering high-quality solutions. Collaborate with cross-functional teams to identify and solve complex problems. Write clean, reusable code that can be easily maintained and scaled.

Technical and Professional Requirements: Sound knowledge of Python and its frameworks and libraries. Familiarity with database technologies such as SQL and NoSQL. Experience with popular Python frameworks such as Django, Flask, or Pyramid. Knowledge of data science and machine learning concepts and tools. Expertise in combining several data sources into one system. Ability to implement security and data protection solutions. Basic understanding of front-end technologies such as JavaScript, HTML5, and CSS3.

Preferred Skills: Knowledge of cloud technologies like GCP/AWS/Azure. Ability to document data.

Job Responsibilities: Data management: trace usage of legacy data assets and propose the best migration options. Data modelling. Building and optimizing ETL jobs. Ability to communicate effectively with team members; as we will be moving data around and porting existing data jobs, interaction with the team will be vital. Documentation of data relationships and context.

Educational Requirements: Any graduate with 60% or above.

Posted 4 weeks ago


2.0 - 4.0 years

3 - 6 Lacs

Bengaluru

Work from Office

Role Overview: You'll lead efforts in developing, training, and deploying machine learning models with a focus on data processing and computer vision. Using your expertise in Python, OpenCV, and other advanced tools, you will work on complex data science projects and help us push the boundaries of AI technology.

Model Development: Design, build, and train machine learning models using deep learning frameworks such as TensorFlow, PyTorch, and Keras. Computer Vision Projects: Leverage OpenCV and advanced computer vision techniques for tasks like image recognition, object detection, and OCR. ML Algorithms & Frameworks: Implement ML algorithms and leverage frameworks for NLP, computer vision, and other AI projects. Testing & Experimentation: Conduct machine learning tests and experiments, refining models based on performance metrics. API Development: Utilize Flask or Django to create APIs for model integration with applications. Data Annotation & Augmentation: Apply data augmentation techniques for robust model performance, especially on image-based datasets. Collaboration & Reporting: Work closely with cross-functional teams and provide detailed insights on model performance and data metrics.

Technical Skills Required: Programming Languages: Proficiency in Python, with strong skills in C/C++ for performance-critical applications. ML Frameworks: Expertise in TensorFlow, PyTorch, and Keras. Libraries & Tools: Strong command of OpenCV, NumPy, scikit-learn, and Matplotlib. Data Handling & Processing: Hands-on experience with data processing tools such as Pandas and SQL. Natural Language Processing (NLP): Experience with NLP techniques using libraries like NLTK. Web Scraping & Automation: Familiarity with Beautiful Soup for data extraction and Selenium for automation. Software Development Tools: Knowledge of Django, Flask, Jupyter Notebook, and Linux environments. Analytical Skills: Strong understanding of optimization techniques, regularization methods, and model architecture.

Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field. 2-4 years of experience in data science, machine learning, and model development. Demonstrated expertise in computer vision and natural language processing. Strong problem-solving and communication skills for cross-functional collaboration.

Why Join Us? Cutting-Edge Projects: Work on innovative AI and machine learning projects. Professional Growth: Access to training and career advancement. Dynamic Environment: Collaborate with a team passionate about data and technology.

Posted 1 month ago


6.0 - 10.0 years

35 - 60 Lacs

Bengaluru

Work from Office

A rapidly growing product-first technology company is seeking a results-driven Senior Product Manager to lead its next-generation data lake and analytics capabilities. This role is a perfect fit for professionals passionate about creating scalable, AI-ready data solutions that drive meaningful impact across global enterprise ecosystems. Key Responsibilities: Own end-to-end delivery of core data platform modules that power analytics, dashboards, and AI-driven insights. Collaborate with engineering, BI, and business stakeholders to define clean and extensible data models. Translate complex requirements into well-structured product documentation, ensuring alignment across cross-functional teams. Standardize metadata, lineage, and documentation practices to drive clarity and trust across data assets. Design for scalability, performance, and security in a cloud-native, multi-tenant SaaS environment. Contribute to long-term architectural evolution and BI tool integration efforts. Ensure data products are easily discoverable, user-friendly, and enterprise-ready. Apply modern tools and best practices such as dbt, data catalogs, and cloud data lakes. Ideal Candidate Profile: 6-8 years in product management with a focus on data platforms, analytics, or SaaS products. Proven expertise in data modeling, ELT/ETL development, and delivering cloud-native data solutions. Hands-on experience with BI and warehouse platforms such as Snowflake, Redshift, BigQuery, Tableau, ThoughtSpot, or Power BI. Demonstrated ability to write concise PRDs and collaborate across technical and non-technical teams. Familiarity with embedded analytics and semantic modeling is highly desirable. Strong conceptual understanding of metadata governance, scalability, and usability in enterprise data systems.

Posted 1 month ago


7.0 - 10.0 years

15 - 22 Lacs

Kochi, Bengaluru

Work from Office

Experience: 6+ years. Location: Kochi/Bangalore. Work Mode: Hybrid. Key highlights from the JD: Mandatory hands-on experience in Optimizely SaaS CMS. Strong expertise in Next.js and Node.js as part of the tech stack. Experience with the Optimizely suite, including CMS, Commerce, CDP, and DAM. Optimizely CMS and/or Optimizely Commerce. Solid expertise in Next.js (including SSR and SSG), React, TypeScript, and front-end performance optimization. Strong experience with .NET/C#, ASP.NET MVC, and API integrations. Optimizely CDP (data modeling, segmentation, personalization).

Posted 1 month ago


12.0 - 16.0 years

22 - 37 Lacs

Bengaluru

Work from Office

Solution Architect - Manager (IC Role) - BLR - J49182

You will have:
Deep Technical Expertise: Hands-on experience designing, architecting, specifying, and developing large-scale complex systems. Specialist skills in cloud-native architectures, design, automation, workflow, and event-driven systems.
Quality Focus: A DevSecOps mindset with great attention to detail.
Proven Track Record: Proven experience leading and delivering projects, common services, and unified architectures. Demonstrable experience leading and mentoring others. Built software that includes user-facing web applications.
Communication: Outstanding communication and presentation skills.
Programming Skills: Heavily used modern object-oriented languages such as C# or Java.
Enterprise Expertise: Expertise in software design patterns, clean code, and clean architecture principles. Knowledge of building REST APIs and experience with messaging.
Data Modelling: Worked with defining data models and interacting with databases.
Collaborative Approach: A passion for working in an Agile team, collaborating with others, and adopting best practices.
Continuous Delivery: Used source control and continuous integration tools as part of a team.
Security Practices: An understanding of application security controls like SAST, DAST, penetration testing, etc.

You may have:
AI Systems: Built systems leveraging generative AI and machine learning.
Cloud Experience: Experience with Docker, Kubernetes, or other serverless application delivery platforms.
Proven Track Record: Worked with React, Angular, Blazor, ASP.NET MVC, or other modern web UI frameworks.
Data Modelling: Used Entity Framework or other popular ORM tools. Used NoSQL databases such as Cosmos DB, MongoDB, or Cassandra.
Quality Focus: Used GitHub Copilot and other tools to increase development productivity.
Enterprise Expertise: Experience with messaging such as Service Bus, MQ, or Kafka.
Data Analysis: Experience with data analytics and business intelligence.
Collaborative Approach: Experience with pair and mob programming.

In this role you will:
Deep Technical Expertise: Work where needed alongside our leads, principal engineers, and product owners to design software architecture and build AI-enabled tools for mission-critical applications used by Fortune 500 companies, ensuring scalability and resilience. Integrate emerging technologies like AI-driven development, Web Components, etc. Create architecture designs and diagrams for the core platform and common services. Mentor other developers within the Engineering department. Architect and build highly distributed microservices, leveraging event-driven architectures, AI-powered automation, and cutting-edge cloud technologies like Kubernetes and serverless computing.
Proven Track Record: Contribute to the blueprint for our software ecosystem, shaping how teams build applications for years to come.
Communication: Communicate and collaborate effectively with development team leads to help accelerate the delivery of products.
Collaborative Approach: Work collaboratively in a LEAN Agile team using a Scaled Scrum framework.
Programming Skills: Take ownership of the development of common services, libraries, reusable components, and applications using .NET. Use front-end TypeScript/React, ASP.NET MVC, or C#/Blazor.
Cloud Experience: Build cloud-first applications and services with high test coverage on a continuous delivery platform with 100% infrastructure as code. Package applications in containers and deploy on Azure Kubernetes Service, Azure Container Apps, or other Azure compute services.
Data Modelling: Use Entity Framework code-first data with Azure SQL or a NoSQL database.
Security Practices: Comply with secure coding and infrastructure standards and policies.
Continuous Delivery: Assist with supporting your application using modern DevSecOps tools.
Quality Focus: Continuously improve your technical knowledge and share what you learn with others.

Qualification: BE-Comp/IT, BE-Other, BTech-Comp/IT, BTech-Other.

Posted 1 month ago

Apply

10.0 - 17.0 years

35 - 60 Lacs

Noida, Gurugram, Bengaluru

Hybrid

This is an individual contributor role. Looking for candidates from a Product/Life Science/Pharma/Consulting background only. POSITION: Data Architect. LOCATION: NCR/Bangalore/Gurugram. PRODUCT: Axtria DataMAx is a global cloud-based data management product specifically designed for the life sciences industry. It facilitates the rapid integration of both structured and unstructured data sources, enabling accelerated and actionable business insights from trusted data. This product is particularly useful for pharmaceutical companies looking to streamline their data processes and enhance decision-making capabilities. JOB OBJECTIVE: To leverage expertise in data architecture and management to design, implement, and optimize a robust data warehousing platform for the pharmaceutical industry. The goal is to ensure seamless integration of diverse data sources, maintain high standards of data quality and governance, and enable advanced analytics through the definition and management of semantic and common data layers. Utilizing Axtria DataMAx and generative AI technologies, the aim is to accelerate business insights and support regulatory compliance, ultimately enhancing decision-making and operational efficiency. Key Responsibilities: Data Modeling: Design logical and physical data models to ensure efficient data storage and retrieval. ETL Processes: Develop and optimize ETL processes to accurately and efficiently move data from various sources into the data warehouse. Infrastructure Design: Plan and implement the technical infrastructure, including hardware, software, and network components. Data Governance: Ensure compliance with regulatory standards and implement data governance policies to maintain data quality and security. Performance Optimization: Continuously monitor and improve the performance of the data warehouse to handle large volumes of data and complex queries.
Semantic Layer Definition: Define and manage the semantic layer architecture and technology stack to manage the lifecycle of semantic constructs, including consumption into downstream systems. Common Data Layer Management: Integrate data from multiple sources into a centralized repository, ensuring consistency and accessibility. Deep expertise in architecting enterprise-grade software systems that are performant, scalable, resilient, and manageable; architecting GenAI-based systems is an added plus. Advanced Analytics: Enable advanced analytics and machine learning to identify patterns in genomic data, optimize clinical trials, and personalize medication. Generative AI: Should have worked with a production-ready use case for GenAI-based data. Stakeholder Engagement: Work closely with business stakeholders to understand their data needs and translate them into technical solutions. Cross-Functional Collaboration: Collaborate with IT, data scientists, and business analysts to ensure the data warehouse supports various analytical and operational needs. Data Modeling: Strong expertise in data modelling, with the ability to design complex data models from the ground up and clearly articulate the rationale behind design choices. ETL Processes: Must have worked with different loading strategies for facts and dimensions, such as SCD, Full Load, Incremental Load, Upsert, Append Only, and Rolling Window. Cloud Warehouse Skills: Expertise in leading cloud data warehouse platforms (Snowflake, Databricks, and Amazon Redshift), with a deep understanding of their architectural nuances, strengths, and limitations, enabling the design and deployment of scalable, high-performance data solutions aligned with business objectives. Qualifications: Proven experience in data architecture and data warehousing, preferably in the pharmaceutical industry. Strong knowledge of data modeling, ETL processes, and infrastructure design.
Experience with data governance and regulatory compliance in the life sciences sector. Proficiency in using Axtria DataMAx or similar data management products. Excellent analytical and problem-solving skills. Strong communication and collaboration skills. Preferred Skills: Familiarity with advanced analytics and machine learning techniques. Experience in managing semantic and common data layers. Knowledge of FDA guidelines, HIPAA regulations, and other relevant regulatory standards. Experience with generative AI technologies and their application in data warehousing.
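The fact/dimension loading strategies named in this listing (SCD, upsert, etc.) can be sketched minimally. The table, column names, and data below are hypothetical, and SQLite stands in for the warehouse purely for illustration; the sketch shows the core SCD Type 2 move of closing the current row and appending a new version.

```python
import sqlite3
from datetime import date

# Hypothetical SCD Type 2 dimension: each change closes the current row
# (valid_to set, is_current = 0) and inserts a new versioned row.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dim_customer (
        customer_id TEXT,
        city        TEXT,
        valid_from  TEXT,
        valid_to    TEXT,      -- NULL means this is the current row
        is_current  INTEGER
    )
""")
conn.execute(
    "INSERT INTO dim_customer VALUES ('C1', 'Pune', '2024-01-01', NULL, 1)"
)

def scd2_update(conn, customer_id, new_city, change_date):
    """Close the current row and append a new version (SCD Type 2)."""
    cur = conn.execute(
        "SELECT city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,),
    )
    row = cur.fetchone()
    if row and row[0] != new_city:
        conn.execute(
            "UPDATE dim_customer SET valid_to=?, is_current=0 "
            "WHERE customer_id=? AND is_current=1",
            (change_date, customer_id),
        )
        conn.execute(
            "INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
            (customer_id, new_city, change_date),
        )

scd2_update(conn, "C1", "Mumbai", str(date(2024, 6, 1)))
rows = conn.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY valid_from"
).fetchall()
# rows -> [('Pune', 0), ('Mumbai', 1)]
```

The same pattern generalizes to full loads (truncate and reload) and append-only strategies, which differ only in whether existing rows are expired, replaced, or left untouched.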

Posted 1 month ago

Apply

5.0 - 9.0 years

12 - 16 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Key Responsibilities: Develop APIs and microservices using Spring Boot. Implement integrations using APIGEE for API management. Work with Pivotal Cloud Foundry (PCF) and manage deployments. Leverage both AWS and Azure for cloud integration tasks. Create and manage data models using tools like Erwin, Visio, or Lucidchart. Required Skills: 5+ years of experience in integration development. Proficiency in Spring Boot and APIGEE. Expertise in Pivotal Cloud Foundry (PCF). Strong knowledge of AWS and Azure. Experience with data modeling tools (Erwin, Visio, Lucidchart). Location: Chennai, Hyderabad, Kolkata, Pune, Ahmedabad, Remote

Posted 1 month ago

Apply

5.0 - 8.0 years

15 - 27 Lacs

Pune

Work from Office

We are seeking a skilled Data Engineer with hands-on experience in Azure Data Factory (ADF) and Snowflake development. The ideal candidate will have a solid background in SQL, data warehousing, and cloud data pipelines, with a keen ability to design, implement, and maintain robust data solutions that support business intelligence and analytics initiatives. Key Responsibilities: Design and develop scalable data pipelines using ADF and Snowflake Integrate data from various sources using SQL, GitHub, and cloud-native tools Apply data warehousing best practices and ensure optimal data flow and data quality Collaborate within Scrum teams and contribute to agile development cycles Liaise effectively with stakeholders across the globe to gather requirements and deliver solutions Support data modeling efforts and contribute to Python-based enhancements (as needed) Qualifications: Minimum 5 years of overall Data Engineering experience At least 2 years of hands-on experience with Snowflake At least 2 years of experience working with Azure Data Factory Strong understanding of Data Warehousing concepts and methodologies Experience with Data Modeling and proficiency in Python is a plus Familiarity with version control systems like GitHub Experience working in an agile (Scrum) environment
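The incremental-pipeline work described in this listing, whether orchestrated by ADF or loaded into Snowflake, typically reduces to watermark-based extraction: pull only rows changed since the last run and advance the watermark. This is a tool-agnostic sketch in plain Python with invented rows, not ADF- or Snowflake-specific code.

```python
# Watermark-based incremental extraction: only rows modified after the
# stored watermark are pulled, and the watermark advances afterwards.
source_rows = [
    {"id": 1, "updated_at": "2024-05-01T10:00:00"},
    {"id": 2, "updated_at": "2024-05-02T09:30:00"},
    {"id": 3, "updated_at": "2024-05-03T15:45:00"},
]

def incremental_extract(rows, watermark):
    """Return rows modified after the watermark, plus the new watermark."""
    fresh = [r for r in rows if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in fresh), default=watermark)
    return fresh, new_watermark

fresh, wm = incremental_extract(source_rows, "2024-05-01T23:59:59")
# fresh contains ids 2 and 3; wm == "2024-05-03T15:45:00"
```

ISO-8601 timestamps compare correctly as strings, which keeps the sketch dependency-free; a real pipeline would persist the watermark (e.g. in a control table) between runs.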

Posted 1 month ago

Apply

5.0 - 10.0 years

5 - 8 Lacs

Bengaluru

Work from Office

5+ years experience administering Salesforce Marketing Cloud (ExactTarget) or similar marketing automation platforms. 5+ years experience with Marketing Cloud components including Automations, Journeys, and Data Extensions. Knowledge of Marketing Cloud integrations with other systems. Experience supporting multiple business units with varying requirements. Strong troubleshooting and problem-solving abilities. Excellent communication skills for working directly with customers. Ability to prioritize and manage multiple support requests effectively. Experience with user management and permission structures. Role Purpose: The purpose of the role is to liaise with and bridge the gap between the customer and the Wipro delivery team, comprehending and analyzing customer requirements and articulating them aptly to delivery teams, thereby ensuring the right solutioning for the customer. Do: 1. Customer requirements gathering and engagement: Interface and coordinate with client engagement partners to understand the RFP/RFI requirements. Detail out scope documents, functional and non-functional requirements, features, etc., ensuring all stated and unstated customer needs are captured. Construct workflow charts and diagrams, studying system capabilities and writing specifications after thorough research and analysis of customer requirements. Engage and interact with the internal team (project managers, pre-sales team, tech leads, architects) to design and formulate accurate and timely responses to RFPs/RFIs. Understand and communicate the financial and operational impact of any changes. Hold periodic cadence with customers to seek clarifications and feedback with respect to the solution proposed for a particular RFP/RFI, and accordingly instruct the delivery team to make changes in the design. Empower the customers through demonstration and presentation of the proposed solution/prototype. Maintain relationships with customers to optimize business integration and lead generation. Ensure ongoing reviews and feedback from customers to improve and deliver
better value (services/products) to the customers. 2. Engage with the delivery team to ensure the right solution is proposed to the customer: a. Periodic cadence with the delivery team to: provide them with customer feedback/inputs on the proposed solution; review the test cases to check 100% coverage of customer requirements; conduct root cause analysis to understand the proposed solution/demo/prototype before sharing it with the customer; deploy and facilitate new change requests to cater to customer needs and requirements; support the QA team with periodic testing to ensure solutions meet the needs of businesses by giving timely inputs/feedback; conduct Integration Testing and User Acceptance demos/testing to validate implemented solutions and ensure a 100% success rate; use data modelling practices to analyze the findings and design and develop improvements and changes; ensure 100% utilization by studying system capabilities and understanding business specifications; stitch together the entire response/solution proposed to the RFP/RFI before it is presented to the customer. b. Support the Project Manager/Delivery Team in delivering the solution to the customer: define and plan project milestones, phases, and the different elements involved in the project along with the principal consultant; drive and challenge the presumptions of delivery teams on how they will successfully execute their plans; ensure customer satisfaction through quality deliverables on time. 3. Build domain expertise and contribute to the knowledge repository: Engage and interact with other BAs to share expertise and increase domain knowledge across the vertical. Write whitepapers/research papers and points of view and share them with the consulting community at large. Identify and create use cases for a different project/account that can be brought to the Wipro level for business enhancements. Conduct market research for content and development to provide the latest inputs into the projects, thereby ensuring customer delight. Deliver:
Performance Parameter and Measure:
1. Customer Engagement and Delivery Management: PCSAT, utilization % achievement, no. of leads generated from the business interaction, no. of errors/gaps in documenting customer requirements, feedback from the project manager, process flow diagrams (quality and timeliness), % of deal solutioning completed within timeline, velocity generated.
2. Knowledge Management: No. of whitepapers/research papers written, no. of user stories created, % of proposal documentation completed and uploaded into the knowledge repository, no. of reusable components developed for proposals during the quarter.

Posted 1 month ago

Apply

8.0 - 10.0 years

18 - 27 Lacs

Hyderabad, Pune, Mumbai (All Areas)

Hybrid

We are looking for an experienced Senior Cognos Developer/Lead with 9–10 years of hands-on expertise in IBM Cognos Analytics (v11). The ideal candidate should have solid experience in report migration, Cognos Framework Manager, and data modeling. This is a lead-level role requiring both technical depth and the ability to guide teams. Must-Have Skills: 9–10 years of Cognos BI experience with leadership exposure. Expertise in Cognos 11 (Cognos Analytics). Experience with report migration to Cognos 11. Strong knowledge of Framework Manager and package development. Proficient in data modeling (star/snowflake schemas). If you're passionate about BI tools and ready to lead impactful projects, apply now!
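The star/snowflake modeling called out in this listing can be illustrated with a tiny star schema: one fact table keyed to denormalized dimensions. All table and column names below are invented, and SQLite stands in for the BI data source purely for illustration.

```python
import sqlite3

# Minimal star schema: fact_sales references two dimension tables by
# surrogate key; BI tools aggregate the fact while slicing by dimensions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INTEGER);
    CREATE TABLE fact_sales  (product_key INTEGER, date_key INTEGER,
                              amount REAL);
    INSERT INTO dim_product VALUES (1, 'Widget'), (2, 'Gadget');
    INSERT INTO dim_date    VALUES (20240101, 2024);
    INSERT INTO fact_sales  VALUES (1, 20240101, 50.0),
                                   (1, 20240101, 25.0),
                                   (2, 20240101, 10.0);
""")

# A typical BI query: sum the fact, filtered and grouped by dimensions.
totals = conn.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    JOIN dim_date    d ON d.date_key    = f.date_key
    WHERE d.year = 2024
    GROUP BY p.name
    ORDER BY p.name
""").fetchall()
# totals -> [('Gadget', 10.0), ('Widget', 75.0)]
```

A snowflake schema differs only in normalizing the dimensions further (e.g. dim_product splitting out a category table), at the cost of extra joins.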

Posted 1 month ago

Apply

2.0 - 5.0 years

15 - 18 Lacs

Mumbai, Chennai, Bengaluru

Work from Office

Job Responsibilities: Report and Dashboard Development: Design, develop, and maintain interactive dashboards and reports using BI reporting tools like Tableau. Collaborate with stakeholders to define report requirements and KPIs. Data Modeling and ETL: Develop data models and ETL processes to extract, transform, and load data into BI tools. Optimize data pipelines for performance and efficiency. Business Collaboration: Collaborate with business users to understand their data needs and translate them into actionable insights. Communicate complex technical concepts to non-technical stakeholders. Provide training and support to end-users on BI tools and reports. Data Analysis and Insights: Analyze large and complex datasets to identify trends, patterns, and anomalies. Develop and maintain data quality standards and processes. Conduct ad-hoc analysis to address specific business questions. Skill Set: 2 to 3 years of relevant work experience. Agile/Scrum model of working. Strong proficiency in SQL and data modeling techniques. Experience with BI tools (Tableau, Power BI, etc.). Experience with Python or PySpark programming languages. Knowledge of data warehousing and ETL processes. Strong analytical and problem-solving skills. Excellent communication and presentation skills. Ability to work independently and as part of a team. Location: Chennai, Bengaluru, Mumbai, Hyderabad

Posted 1 month ago

Apply

3.0 - 5.0 years

15 - 25 Lacs

Pune

Work from Office

AI/ML Engineer Responsibilities: Designing machine learning systems and self-running artificial intelligence (AI) software to automate predictive models. Transforming data science prototypes and applying appropriate ML algorithms and tools. Ensuring that algorithms generate accurate user recommendations. Turning unstructured data into useful information through auto-tagging of images and text-to-speech conversions. Solving complex problems with multi-layered data sets, as well as optimizing existing machine learning libraries and frameworks. Applying ML algorithms to huge volumes of historical data to make predictions. Running tests, performing statistical analysis, and interpreting test results. Documenting machine learning processes. Keeping abreast of developments in machine learning. AI/ML Engineer Requirements: Bachelor's degree in computer science, data science, mathematics, or a related field with at least 3 years of experience as an AI/ML Engineer. Advanced proficiency with Python and the FastAPI framework, along with good exposure to libraries like scikit-learn, Pandas, and NumPy. Experience working with ChatGPT, LangChain (must), Large Language Models (good to have), and Knowledge Graphs. Extensive knowledge of ML frameworks, libraries, data structures, data modelling, and software architecture. In-depth knowledge of mathematics, statistics, and algorithms. Superb analytical and problem-solving abilities. Great communication and collaboration skills.
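As a toy version of the prototype-to-predictive-model work this role describes, here is a least-squares fit using only NumPy (one of the libraries the listing names). The data is synthetic and noise-free by construction, so the fit recovers the generating weights exactly.

```python
import numpy as np

# Synthetic, noise-free data: y = 2*x1 - 1*x2 + 0.5 (made-up weights).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = X @ np.array([2.0, -1.0]) + 0.5

# Append a bias column and solve the least-squares problem directly.
Xb = np.hstack([X, np.ones((len(X), 1))])
w, *_ = np.linalg.lstsq(Xb, y, rcond=None)

def predict(X_new):
    """Apply the fitted weights; the last entry is the intercept."""
    return X_new @ w[:-1] + w[-1]

preds = predict(X[:5])
```

In practice the same fit would come from scikit-learn's `LinearRegression`, but the NumPy form makes the underlying linear algebra visible.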

Posted 1 month ago

Apply

7.0 - 9.0 years

14 - 18 Lacs

Pune

Hybrid

The SQL+Power BI Lead is responsible for designing, developing, and maintaining complex data solutions using SQL and Power BI. They serve as a technical lead, guiding the team in implementing best practices and efficient data architectures. The SQL+Power BI Lead plays a key role in translating business requirements into effective data and reporting solutions. Design and develop advanced SQL queries, stored procedures, and other database objects to support data extraction, transformation, and loading. Create dynamic, interactive Power BI dashboards and reports to visualize data and provide insights. Provide technical leadership and mentorship to junior team members on SQL and Power BI best practices. Collaborate with business stakeholders to understand requirements and translate them into data solutions. Optimize database performance and implement security measures to ensure data integrity. Automate data integration, extraction, and reporting processes where possible. Participate in data architecture planning and decision-making. Troubleshoot and resolve complex data-related issues. Stay up-to-date with the latest trends, technologies, and best practices in data analytics.

Posted 1 month ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Hyderabad, Bengaluru

Work from Office

About our team: DEX is the central data org for Kotak Bank which manages the entire data experience of Kotak Bank. DEX stands for Kotak's Data Exchange. This org comprises the Data Platform, Data Engineering, and Data Governance charters. The org sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities for technology fellows to build things from scratch and build one of the best-in-class data lakehouse solutions. The primary skills this team should encompass are software development skills (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. As a member of this team, you get the opportunity to learn the fintech space, which is the most sought-after domain in the current world, be an early member in the digital transformation journey of Kotak, learn and leverage technology to build complex data platform solutions (including real-time, micro-batch, batch, and analytics solutions) in a programmatic way, and also be futuristic in building systems which can be operated by machines using AI technologies. The data platform org is divided into 3 key verticals. Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank; building a centralized data lake; managed compute and orchestration frameworks, including concepts of serverless data solutions; managing the central data warehouse for extremely high-concurrency use cases; building connectors for different sources; building a customer feature repository; building cost optimization solutions like EMR optimizers; performing automations; and building observability capabilities for Kotak's data platform.
The team will also be the center for Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak. Data Engineering: This team will own data pipelines for thousands of datasets, be skilled to source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way, and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank which cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, Branch Managers, and all analytics use cases. Data Governance: The team will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform. If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you. Your day-to-day role will include: Drive business decisions with technical input and lead the team. Design, implement, and support a data infrastructure from scratch. Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA. Extract, transform, and load data from various sources using SQL and AWS big data technologies. Explore and learn the latest AWS technologies to enhance capabilities and efficiency. Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis. Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers. Build data platforms, data pipelines, or data management and governance tools.
BASIC QUALIFICATIONS for Data Engineer: Bachelor's degree in Computer Science, Engineering, or a related field. 3-5 years of experience in data engineering. Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR. Experience with data pipeline tools such as Airflow and Spark. Experience with data modeling and data quality best practices. Excellent problem-solving and analytical skills. Strong communication and teamwork skills. Experience in at least one modern scripting or programming language, such as Python, Java, or Scala. Strong advanced SQL skills. PREFERRED QUALIFICATIONS: AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow. Prior experience in the Indian banking segment and/or fintech is desired. Experience with non-relational databases and data stores. Building and operating highly available, distributed data processing systems for large datasets. Professional software engineering and best practices for the full software development life cycle. Designing, developing, and implementing different types of data warehousing layers. Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions. Building scalable data infrastructure and understanding distributed systems concepts. SQL, ETL, and data modelling. Ensuring the accuracy and availability of data to customers. Proficiency in at least one scripting or programming language for handling large-volume data processing. Strong presentation and communication skills. For Managers: Customer centricity and obsession for the customer. Ability to manage stakeholders (product owners, business stakeholders, cross-functional teams) and coach agile ways of working. Ability to structure and organize teams and streamline communication. Prior work experience executing large-scale Data Engineering projects.
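A staple of the Spark ETL work this listing describes is deduplicating to the latest record per key (in PySpark, a window plus `row_number`). Here is the same pattern as a plain-Python sketch with invented records, to keep the example dependency-free.

```python
from itertools import groupby
from operator import itemgetter

# Hypothetical change-feed records: multiple versions per account key.
records = [
    {"account": "A1", "balance": 100, "ts": 1},
    {"account": "A1", "balance": 150, "ts": 2},
    {"account": "B7", "balance": 900, "ts": 1},
]

def latest_per_key(rows, key="account", order="ts"):
    """Keep only the most recent row for each key (dedup-by-latest)."""
    rows = sorted(rows, key=itemgetter(key, order))
    out = []
    for _, grp in groupby(rows, key=itemgetter(key)):
        out.append(list(grp)[-1])  # last row in the group has the max ts
    return out

deduped = latest_per_key(records)
# -> A1 keeps balance 150 (ts 2); B7 keeps balance 900
```

In Spark the equivalent is `row_number().over(Window.partitionBy("account").orderBy(col("ts").desc()))` filtered to 1, which distributes the same group-and-pick-latest logic across the cluster.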

Posted 1 month ago

Apply