
436 Data Modelling Jobs - Page 13

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

12.0 - 16.0 years

22 - 37 Lacs

Bengaluru

Work from Office

Solution Architect - Manager (IC Role) - BLR - J49182

You will have:
- Deep Technical Expertise: Hands-on experience designing, architecting, specifying and developing large-scale complex systems; specialist skills in cloud-native architectures, design, automation, workflow and event-driven systems
- Quality Focus: A DevSecOps mindset with great attention to detail
- Proven Track Record: Proven experience leading and delivering projects, common services and unified architectures; demonstrable experience leading and mentoring others; built software that includes user-facing web applications
- Communication: Outstanding communication and presentation skills
- Programming Skills: Heavy use of modern object-oriented languages such as C# or Java
- Enterprise Expertise: Expertise in software design patterns, clean code and clean architecture principles; knowledge of building REST APIs and experience with messaging
- Data Modelling: Experience defining data models and interacting with databases
- Collaborative Approach: A passion for working in an Agile team, collaborating with others and adopting best practices
- Continuous Delivery: Used source control and continuous integration tools as part of a team
- Security Practices: An understanding of application security controls such as SAST, DAST and penetration testing

You may have:
- AI Systems: Built systems leveraging generative AI and machine learning
- Cloud Experience: Experience with Docker, Kubernetes or other serverless application delivery platforms
- Web Frameworks: Worked with React, Angular, Blazor, ASP.NET MVC or other modern web UI frameworks
- Data Modelling: Used Entity Framework or other popular ORM tools
- Quality Focus: Used GitHub Copilot and other tools to increase development productivity
- Data Modelling: Used NoSQL databases such as Cosmos DB, MongoDB or Cassandra
- Enterprise Expertise: Experience with messaging such as Service Bus, MQ or Kafka
- Data Analysis: Experience with data analytics and business intelligence
- Collaborative Approach: Experience of pair and mob programming

In this role you will:
- Work where needed alongside our leads, principal engineers and product owners to design software architecture and build AI-enabled tools for mission-critical applications used by Fortune 500 companies, ensuring scalability and resilience
- Integrate emerging technologies like AI-driven development and Web Components
- Create architecture designs and diagrams for the core platform and common services
- Provide mentoring to other developers within the Engineering department
- Architect and build highly distributed microservices, leveraging event-driven architectures, AI-powered automation, and cutting-edge cloud technologies like Kubernetes and serverless computing
- Contribute to the blueprint for our software ecosystem, shaping how teams build applications for years to come
- Communicate and collaborate effectively with development team leads to help accelerate the delivery of products
- Work collaboratively in a Lean Agile team using a scaled Scrum framework
- Take ownership of the development of common services, libraries, reusable components and applications using .NET; use front-end TypeScript/React, ASP.NET MVC or C#/Blazor
- Build cloud-first applications and services with high test coverage on a continuous delivery platform with 100% infrastructure as code; package applications in containers and deploy on Azure Kubernetes Service, Azure Container Apps or other Azure compute services
- Use Entity Framework code-first with Azure SQL or NoSQL databases
- Comply with secure coding and infrastructure standards and policies
- Assist with supporting your application using modern DevSecOps tools
- Continuously improve your technical knowledge and share what you learn with others

Qualification: BE-Comp/IT, BE-Other, BTech-Comp/IT, BTech-Other

Posted 1 month ago

Apply

10.0 - 17.0 years

35 - 60 Lacs

Noida, Gurugram, Bengaluru

Hybrid

This is an individual contributor role. We are looking for candidates from a Product/Life Sciences/Pharma/Consulting background only.

POSITION: Data Architect
LOCATION: NCR/Bangalore/Gurugram
PRODUCT: Axtria DataMAx is a global cloud-based data management product designed specifically for the life sciences industry. It facilitates rapid integration of both structured and unstructured data sources, enabling accelerated, actionable business insights from trusted data. The product is particularly useful for pharmaceutical companies looking to streamline their data processes and enhance decision-making capabilities.

JOB OBJECTIVE: To leverage expertise in data architecture and management to design, implement, and optimize a robust data warehousing platform for the pharmaceutical industry. The goal is to ensure seamless integration of diverse data sources, maintain high standards of data quality and governance, and enable advanced analytics through the definition and management of semantic and common data layers. Utilizing Axtria DataMAx and generative AI technologies, the aim is to accelerate business insights and support regulatory compliance, ultimately enhancing decision-making and operational efficiency.

Key Responsibilities:
- Data Modeling: Design logical and physical data models to ensure efficient data storage and retrieval. Strong expertise in data modelling, with the ability to design complex data models from the ground up and clearly articulate the rationale behind design choices.
- ETL Processes: Develop and optimize ETL processes to accurately and efficiently move data from various sources into the data warehouse. Must have worked with different loading strategies for facts and dimensions, such as SCD, full load, incremental load, upsert, append-only and rolling window (an upsert is illustrated in the sketch after this description).
- Infrastructure Design: Plan and implement the technical infrastructure, including hardware, software, and network components.
- Data Governance: Ensure compliance with regulatory standards and implement data governance policies to maintain data quality and security.
- Performance Optimization: Continuously monitor and improve the performance of the data warehouse to handle large volumes of data and complex queries.
- Semantic Layer Definition: Define and manage the semantic layer architecture and technology stack, managing the lifecycle of semantic constructs including consumption by downstream systems.
- Common Data Layer Management: Integrate data from multiple sources into a centralized repository, ensuring consistency and accessibility. Deep expertise in architecting enterprise-grade software systems that are performant, scalable, resilient and manageable; architecting GenAI-based systems is an added plus.
- Advanced Analytics: Enable advanced analytics and machine learning to identify patterns in genomic data, optimize clinical trials, and personalize medication.
- Generative AI: Should have worked on a production-ready GenAI use case for data.
- Stakeholder Engagement: Work closely with business stakeholders to understand their data needs and translate them into technical solutions.
- Cross-Functional Collaboration: Collaborate with IT, data scientists, and business analysts to ensure the data warehouse supports various analytical and operational needs.
- Cloud Warehouse Skills: Expertise in leading cloud data warehouse platforms (Snowflake, Databricks, and Amazon Redshift), with a deep understanding of their architectural nuances, strengths, and limitations, enabling the design and deployment of scalable, high-performance data solutions aligned with business objectives.

Qualifications:
- Proven experience in data architecture and data warehousing, preferably in the pharmaceutical industry.
- Strong knowledge of data modeling, ETL processes, and infrastructure design.
- Experience with data governance and regulatory compliance in the life sciences sector.
- Proficiency in using Axtria DataMAx or similar data management products.
- Excellent analytical and problem-solving skills.
- Strong communication and collaboration skills.

Preferred Skills:
- Familiarity with advanced analytics and machine learning techniques.
- Experience in managing semantic and common data layers.
- Knowledge of FDA guidelines, HIPAA regulations, and other relevant regulatory standards.
- Experience with generative AI technologies and their application in data warehousing.
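As a concrete illustration of the upsert/incremental loading strategy named above, here is a minimal, hypothetical sketch of a dimension-table MERGE run against Snowflake from Python. The connection parameters, table names (dim_customer, stg_customer) and columns are all invented for the example, not taken from the posting.

```python
# Hypothetical incremental "upsert" of staged rows into a dimension table.
# All identifiers and credentials below are placeholders.
import snowflake.connector

MERGE_SQL = """
MERGE INTO dim_customer AS tgt
USING stg_customer AS src
    ON tgt.customer_id = src.customer_id
WHEN MATCHED AND tgt.row_hash <> src.row_hash THEN
    UPDATE SET name = src.name,
               segment = src.segment,
               row_hash = src.row_hash,
               updated_at = CURRENT_TIMESTAMP()
WHEN NOT MATCHED THEN
    INSERT (customer_id, name, segment, row_hash, updated_at)
    VALUES (src.customer_id, src.name, src.segment, src.row_hash, CURRENT_TIMESTAMP())
"""

def run_incremental_load() -> None:
    # Credentials would normally come from a secrets manager, not literals.
    conn = snowflake.connector.connect(
        account="my_account",   # placeholder
        user="etl_user",        # placeholder
        password="***",         # placeholder
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="CORE",
    )
    try:
        conn.cursor().execute(MERGE_SQL)  # upsert staged rows into the dimension
        conn.commit()
    finally:
        conn.close()

if __name__ == "__main__":
    run_incremental_load()
```

The same MERGE shape covers SCD Type 1 updates; an SCD Type 2 variant would instead close out the matched row (set an end date) and insert a new version.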

Posted 1 month ago

Apply

5.0 - 9.0 years

12 - 16 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Key Responsibilities:
- Develop APIs and microservices using Spring Boot.
- Implement integrations using APIGEE for API management.
- Work with Pivotal Cloud Foundry (PCF) and manage deployments.
- Leverage both AWS and Azure for cloud integration tasks.
- Create and manage data models using tools like Erwin, Visio, or Lucidchart.

Required Skills:
- 5+ years of experience in integration development.
- Proficiency in Spring Boot and APIGEE.
- Expertise in Pivotal Cloud Foundry (PCF).
- Strong knowledge of AWS and Azure.
- Experience with data modeling tools (Erwin, Visio, Lucidchart).

Location: Chennai, Hyderabad, Kolkata, Pune, Ahmedabad, Remote

Posted 1 month ago

Apply

5.0 - 8.0 years

15 - 27 Lacs

Pune

Work from Office

We are seeking a skilled Data Engineer with hands-on experience in Azure Data Factory (ADF) and Snowflake development. The ideal candidate will have a solid background in SQL, data warehousing, and cloud data pipelines, with a keen ability to design, implement, and maintain robust data solutions that support business intelligence and analytics initiatives.

Key Responsibilities:
- Design and develop scalable data pipelines using ADF and Snowflake (an incremental-load pattern is sketched below)
- Integrate data from various sources using SQL, GitHub, and cloud-native tools
- Apply data warehousing best practices and ensure optimal data flow and data quality
- Collaborate within Scrum teams and contribute to agile development cycles
- Liaise effectively with stakeholders across the globe to gather requirements and deliver solutions
- Support data modeling efforts and contribute to Python-based enhancements (as needed)

Qualifications:
- Minimum 5 years of overall data engineering experience
- At least 2 years of hands-on experience with Snowflake
- At least 2 years of experience working with Azure Data Factory
- Strong understanding of data warehousing concepts and methodologies
- Experience with data modeling and proficiency in Python is a plus
- Familiarity with version control systems like GitHub
- Experience working in an agile (Scrum) environment
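For background on the incremental pipelines mentioned above, here is a self-contained sketch of the high-water-mark pattern commonly used in ADF/Snowflake loads. sqlite3 stands in for the real source and target so the example actually runs; every table and column name is invented.

```python
# Illustrative high-water-mark (incremental extract) pattern: each run pulls
# only rows whose updated_at is newer than the stored watermark, then advances
# the watermark. sqlite3 is used purely to keep the sketch runnable.
import sqlite3

def setup_demo(conn):
    conn.executescript("""
        CREATE TABLE source_orders (id INTEGER, amount REAL, updated_at TEXT);
        CREATE TABLE target_orders (id INTEGER, amount REAL, updated_at TEXT);
        CREATE TABLE watermark (last_ts TEXT);
        INSERT INTO watermark VALUES ('1970-01-01T00:00:00');
        INSERT INTO source_orders VALUES
            (1, 10.0, '2024-01-01T00:00:00'),
            (2, 20.0, '2024-02-01T00:00:00');
    """)

def incremental_load(conn) -> int:
    (last_ts,) = conn.execute("SELECT last_ts FROM watermark").fetchone()
    rows = conn.execute(
        "SELECT id, amount, updated_at FROM source_orders WHERE updated_at > ?",
        (last_ts,),
    ).fetchall()
    conn.executemany("INSERT INTO target_orders VALUES (?, ?, ?)", rows)
    if rows:
        # Advance the watermark to the newest row we just loaded.
        conn.execute("UPDATE watermark SET last_ts = ?",
                     (max(r[2] for r in rows),))
    conn.commit()
    return len(rows)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    setup_demo(conn)
    print(incremental_load(conn))  # 2 rows on the first run
    print(incremental_load(conn))  # 0 rows on an immediate re-run
```

In ADF the watermark typically lives in a control table and the extract query is parameterized by a Lookup activity; the logic is the same.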

Posted 1 month ago

Apply

5.0 - 10.0 years

5 - 8 Lacs

Bengaluru

Work from Office

Requirements:
- 5+ years of experience administering Salesforce Marketing Cloud (ExactTarget) or similar marketing automation platforms
- 5+ years of experience with Marketing Cloud components including Automations, Journeys, and Data Extensions
- Knowledge of Marketing Cloud integrations with other systems
- Experience supporting multiple business units with varying requirements
- Strong troubleshooting and problem-solving abilities
- Excellent communication skills for working directly with customers
- Ability to prioritize and manage multiple support requests effectively
- Experience with user management and permission structures

Role Purpose: The purpose of the role is to liaise with and bridge the gap between the customer and the Wipro delivery team: to comprehend and analyse customer requirements and articulate them aptly to delivery teams, thereby ensuring the right solutioning for the customer.

Do:
1. Customer requirements gathering and engagement
- Interface and coordinate with client engagement partners to understand the RFP/RFI requirements
- Detail out scope documents, functional and non-functional requirements, features, etc., ensuring all stated and unstated customer needs are captured
- Construct workflow charts and diagrams, study system capabilities, and write specifications after thorough research and analysis of customer requirements
- Engage and interact with the internal team (project managers, pre-sales team, tech leads, architects) to design and formulate accurate and timely responses to RFPs/RFIs
- Understand and communicate the financial and operational impact of any changes
- Hold periodic cadence with customers to seek clarifications and feedback on the solution proposed for a particular RFP/RFI, and accordingly instruct the delivery team to make changes in the design
- Empower customers through demonstration and presentation of the proposed solution/prototype
- Maintain relationships with customers to optimize business integration and lead generation
- Ensure ongoing reviews and feedback from customers to improve and deliver better value (services/products) to the customers

2. Engage with the delivery team to ensure the right solution is proposed to the customer
a. Hold periodic cadence with the delivery team to:
- Provide them with customer feedback/inputs on the proposed solution
- Review test cases to check 100% coverage of customer requirements
- Conduct root cause analysis to understand the proposed solution/demo/prototype before sharing it with the customer
- Deploy and facilitate new change requests to cater to customer needs and requirements
- Support the QA team with periodic testing to ensure solutions meet business needs by giving timely inputs/feedback
- Conduct integration testing and user acceptance demos to validate implemented solutions and ensure a 100% success rate
- Use data modelling practices to analyse findings and design and develop improvements and changes
- Ensure 100% utilization by studying system capabilities and understanding business specifications
- Stitch together the entire response/solution proposed for the RFP/RFI before it is presented to the customer
b. Support the Project Manager/delivery team in delivering the solution to the customer
- Define and plan project milestones, phases and the different elements involved in the project along with the principal consultant
- Drive and challenge the presumptions of delivery teams on how they will successfully execute their plans
- Ensure customer satisfaction through quality deliverables on time

3. Build domain expertise and contribute to the knowledge repository
- Engage and interact with other BAs to share expertise and increase domain knowledge across the vertical
- Write whitepapers/research papers and points of view, and share them with the consulting community at large
- Identify and create use cases from one project/account that can be brought to Wipro level for business enhancements
- Conduct market research for content and development to provide the latest inputs into projects, thereby ensuring customer delight

Deliver (performance parameters and measures):
1. Customer Engagement and Delivery Management: PCSAT, utilization % achievement, no. of leads generated from the business interaction, no. of errors/gaps in documenting customer requirements, feedback from project manager, process flow diagrams (quality and timeliness), % of deal solutioning completed within timeline, velocity generated
2. Knowledge Management: no. of whitepapers/research papers written, no. of user stories created, % of proposal documentation completed and uploaded into the knowledge repository, no. of reusable components developed for proposals during the quarter

Posted 1 month ago

Apply

8.0 - 10.0 years

18 - 27 Lacs

Hyderabad, Pune, Mumbai (All Areas)

Hybrid

We are looking for an experienced Senior Cognos Developer/Lead with 9-10 years of hands-on expertise in IBM Cognos Analytics (v11). The ideal candidate should have solid experience in report migration, Cognos Framework Manager, and data modeling. This is a lead-level role requiring both technical depth and the ability to guide teams.

Must-Have Skills:
- 9-10 years of Cognos BI experience with leadership exposure
- Expertise in Cognos 11 (Cognos Analytics)
- Experience with report migration to Cognos 11
- Strong knowledge of Framework Manager and package development
- Proficiency in data modeling (star/snowflake schemas)

If you're passionate about BI tools and ready to lead impactful projects, apply now!

Posted 1 month ago

Apply

2.0 - 5.0 years

15 - 18 Lacs

Mumbai, Chennai, Bengaluru

Work from Office

Job Responsibilities:
- Report and Dashboard Development: Design, develop, and maintain interactive dashboards and reports using BI reporting tools like Tableau. Collaborate with stakeholders to define report requirements and KPIs.
- Data Modeling and ETL: Develop data models and ETL processes to extract, transform, and load data into BI tools. Optimize data pipelines for performance and efficiency.
- Business Collaboration: Collaborate with business users to understand their data needs and translate them into actionable insights. Communicate complex technical concepts to non-technical stakeholders. Provide training and support to end users on BI tools and reports.
- Data Analysis and Insights: Analyze large and complex datasets to identify trends, patterns, and anomalies (a typical trend query is sketched below). Develop and maintain data quality standards and processes. Conduct ad-hoc analysis to address specific business questions.

Skill Set:
- 2 to 3 years of relevant work experience
- Agile/Scrum model of working
- Strong proficiency in SQL and data modeling techniques
- Experience with BI tools (Tableau, Power BI, etc.)
- Experience with Python or PySpark programming languages
- Knowledge of data warehousing and ETL processes
- Strong analytical and problem-solving skills
- Excellent communication and presentation skills
- Ability to work independently and as part of a team

Location: Chennai, Bengaluru, Mumbai, Hyderabad
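As an illustration of the trend analysis this role describes, here is a minimal, hypothetical PySpark sketch: monthly revenue per product with a month-over-month delta. The data, columns and values are invented for the example.

```python
# Hypothetical trend analysis: monthly revenue per product and its
# month-over-month change, using a tiny in-memory DataFrame so it runs as-is.
from pyspark.sql import SparkSession, Window, functions as F

spark = SparkSession.builder.appName("trend-analysis").getOrCreate()

orders = spark.createDataFrame(
    [(1, "2024-01-05", 10.0), (1, "2024-02-07", 14.0), (2, "2024-01-20", 5.0)],
    ["product_id", "order_ts", "amount"],
)

monthly = (
    orders
    .withColumn("month", F.date_trunc("month", F.to_timestamp("order_ts")))
    .groupBy("product_id", "month")
    .agg(F.sum("amount").alias("revenue"))
)

w = Window.partitionBy("product_id").orderBy("month")
trend = monthly.withColumn(
    "mom_change", F.col("revenue") - F.lag("revenue").over(w)
)
trend.show()
```

The same shape works unchanged on a warehouse-scale table; only the source read changes.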

Posted 1 month ago

Apply

3.0 - 5.0 years

15 - 25 Lacs

Pune

Work from Office

AI/ML Engineer Responsibilities:
- Design machine learning systems and self-running artificial intelligence (AI) software to automate predictive models.
- Transform data science prototypes and apply appropriate ML algorithms and tools.
- Ensure that algorithms generate accurate user recommendations.
- Turn unstructured data into useful information, e.g. by auto-tagging images and text-to-speech conversion.
- Solve complex problems with multi-layered data sets, and optimize existing machine learning libraries and frameworks.
- Apply ML algorithms to huge volumes of historical data to make predictions.
- Run tests, perform statistical analysis, and interpret test results.
- Document machine learning processes.
- Keep abreast of developments in machine learning.

AI/ML Engineer Requirements:
- Bachelor's degree in computer science, data science, mathematics, or a related field, with at least 3+ years of experience as an AI/ML Engineer
- Advanced proficiency in Python and the FastAPI framework, along with good exposure to libraries like scikit-learn, Pandas and NumPy (a minimal serving sketch follows below)
- Experience working with ChatGPT, LangChain (must), Large Language Models (good to have) and Knowledge Graphs
- Extensive knowledge of ML frameworks, libraries, data structures, data modelling, and software architecture
- In-depth knowledge of mathematics, statistics, and algorithms
- Superb analytical and problem-solving abilities
- Great communication and collaboration skills
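Since the posting names the Python/FastAPI/scikit-learn stack, here is a minimal, hypothetical sketch of serving a scikit-learn model behind a FastAPI endpoint. The toy model, feature and route are invented; a real service would load a persisted model rather than training at import time.

```python
# Minimal model-serving sketch: a toy logistic regression behind /predict.
import numpy as np
from fastapi import FastAPI
from pydantic import BaseModel
from sklearn.linear_model import LogisticRegression

app = FastAPI()

# Toy model fitted at startup purely for illustration.
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0, 0, 1, 1])
model = LogisticRegression().fit(X, y)

class Features(BaseModel):
    value: float

@app.post("/predict")
def predict(features: Features) -> dict:
    # predict_proba returns [[p_class0, p_class1]]; report the positive class.
    proba = model.predict_proba([[features.value]])[0, 1]
    return {"positive_probability": float(proba)}

# Run with: uvicorn main:app --reload   (assuming this file is main.py)
```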

Posted 1 month ago

Apply

7.0 - 9.0 years

14 - 18 Lacs

Pune

Hybrid

The SQL + Power BI Lead is responsible for designing, developing, and maintaining complex data solutions using SQL and Power BI. They serve as a technical lead, guiding the team in implementing best practices and efficient data architectures, and play a key role in translating business requirements into effective data and reporting solutions.

Responsibilities:
- Design and develop advanced SQL queries, stored procedures, and other database objects to support data extraction, transformation, and loading
- Create dynamic, interactive Power BI dashboards and reports to visualize data and provide insights
- Provide technical leadership and mentorship to junior team members on SQL and Power BI best practices
- Collaborate with business stakeholders to understand requirements and translate them into data solutions
- Optimize database performance and implement security measures to ensure data integrity
- Automate data integration, extraction, and reporting processes where possible
- Participate in data architecture planning and decision-making
- Troubleshoot and resolve complex data-related issues
- Stay up to date with the latest trends, technologies, and best practices in data analytics

Posted 1 month ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Hyderabad, Bengaluru

Work from Office

About our team: DEX is the central data org for Kotak Bank, managing the bank's entire data experience. DEX stands for Kotak's Data Exchange. The org comprises the Data Platform, Data Engineering and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technology fellows a great opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills for this team are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics.

As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member of Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions (real-time, micro-batch, batch and analytics) in a programmatic way; and look ahead to systems that can be operated by machines using AI technologies.

The data platform org is divided into three key verticals:

- Data Platform: Responsible for building the data platform, including optimized storage for the entire bank and a centralized data lake, managed compute and orchestration frameworks (including serverless data solutions), a central data warehouse for extremely high-concurrency use cases, connectors for different sources, a customer feature repository, cost-optimization solutions such as EMR optimizers, automations, and observability capabilities for Kotak's data platform. The team will also be the centre of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.
- Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial orgs. It will also enable centralized reporting for Kotak Bank across multiple products and dimensions. The data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers and all analytics use cases.
- Data Governance: The central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship and the data quality platform.

If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, this is the team for you.

Your day-to-day role will include:
- Drive business decisions with technical input and lead the team
- Design, implement, and support a data infrastructure from scratch
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA
- Extract, transform, and load data from various sources using SQL and AWS big data technologies
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers (a minimal orchestration sketch follows below)
- Build data platforms, data pipelines, or data management and governance tools

BASIC QUALIFICATIONS (Data Engineer):
- Bachelor's degree in Computer Science, Engineering, or a related field
- 3-5 years of experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

PREFERRED QUALIFICATIONS:
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech is desired
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices for the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills

For Managers:
- Customer centricity and obsession for the customer
- Ability to manage stakeholders (product owners, business stakeholders, cross-functional teams) and coach agile ways of working
- Ability to structure and organize teams and streamline communication
- Prior work experience executing large-scale data engineering projects
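Since the role names Airflow (MWAA) alongside S3 and Redshift, here is a hedged sketch of what a small ingestion DAG might look like. It assumes Airflow 2.x; the bucket, prefix, DAG id and task body are all placeholders, and the load step only prints what a real pipeline would COPY into the warehouse.

```python
# Hypothetical Airflow 2.x DAG: discover newly landed S3 files daily and
# hand them to a warehouse load step. All names are invented.
from datetime import datetime

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator

def list_new_files() -> list:
    # Simplified discovery: list everything under the prefix.
    s3 = boto3.client("s3")
    resp = s3.list_objects_v2(Bucket="raw-zone", Prefix="orders/")  # placeholder
    return [obj["Key"] for obj in resp.get("Contents", [])]

def load_to_warehouse() -> None:
    # A real pipeline would issue a Redshift COPY / Snowflake COPY INTO here.
    print("loading", list_new_files())

with DAG(
    dag_id="orders_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="load_orders", python_callable=load_to_warehouse)
```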

Posted 1 month ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

New Delhi, Chennai, Bengaluru

Work from Office

We are seeking a highly experienced Data Modeler with expertise in data modelling, data analysis, and dimensional modelling. The ideal candidate should have hands-on experience with Erwin or Erwin Studio, data warehousing (DWH), Snowflake, and SQL. The role involves designing and developing data models to support business intelligence and analytics solutions while ensuring data integrity, consistency, and compliance with Banking domain standards. Responsibilities include working with Snowflake to optimize cloud-based data models, executing complex SQL queries for data analysis, and resolving data quality issues. The candidate should have strong analytical and problem-solving skills, prior experience in the Banking domain, and the ability to work independently in a remote environment.

Posted 1 month ago

Apply

7.0 - 9.0 years

5 - 9 Lacs

Ahmedabad

Work from Office

Key Responsibilities:
- Design and develop data models to support business intelligence and analytics solutions.
- Work with Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models.
- Implement dimensional modelling techniques for data warehousing and reporting (a small star-schema sketch follows below).
- Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements.
- Ensure data integrity, consistency, and compliance with Banking domain standards.
- Work with Snowflake to develop and optimize cloud-based data models.
- Write and execute complex SQL queries for data analysis and validation.
- Identify and resolve data quality issues and inconsistencies.

Required Skills & Qualifications:
- 7+ years of experience in data modelling and data analysis.
- Strong expertise in Erwin or Erwin Studio for data modeling.
- Experience with dimensional modelling and data warehousing (DWH) concepts.
- Proficiency in Snowflake and SQL for data management and querying.
- Previous experience in the Banking domain is mandatory.
- Strong analytical and problem-solving skills.
- Ability to work independently in a remote environment.
- Excellent verbal and written communication skills.
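To make the dimensional-modelling responsibility concrete, here is a tiny, hypothetical star schema (one fact table, two dimensions) expressed as DDL. sqlite3 keeps the sketch self-contained and runnable; every table, column and the banking flavour of the names are invented.

```python
# Minimal star-schema sketch: fact_transactions joined to two dimensions.
import sqlite3

DDL = """
CREATE TABLE dim_date (
    date_key   INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20240131
    full_date  TEXT NOT NULL
);
CREATE TABLE dim_account (
    account_key  INTEGER PRIMARY KEY, -- surrogate key
    account_no   TEXT NOT NULL,
    branch       TEXT
);
CREATE TABLE fact_transactions (
    date_key     INTEGER REFERENCES dim_date(date_key),
    account_key  INTEGER REFERENCES dim_account(account_key),
    amount       REAL NOT NULL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)

# A typical analysis query joins the fact to its dimensions:
conn.execute("""
    SELECT d.full_date, a.branch, SUM(f.amount)
    FROM fact_transactions f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_account a ON a.account_key = f.account_key
    GROUP BY d.full_date, a.branch
""")
```

The surrogate keys on the dimensions are what make slowly changing dimension handling possible later: a changed account attribute gets a new account_key row rather than an in-place overwrite.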

Posted 1 month ago

Apply

12.0 - 18.0 years

1 - 3 Lacs

Bengaluru

Hybrid

Job Description:
- 12+ years of experience as a Technical Architect or in a similar role, with a focus on Azure Databricks, Power BI, and ETL.
- Expertise in designing and implementing data architectures using Azure Databricks (ADB).
- Strong proficiency in Power BI for building scalable reports and dashboards.
- In-depth knowledge of ETL tools and processes, particularly Azure Data Factory and other Azure-based ETL solutions.
- Proficiency in SQL and familiarity with data warehousing concepts (e.g., star schema, snowflake schema).
- Strong understanding of cloud computing and Azure services, including storage, compute, and security best practices.
- Experience with data lake architecture, data pipelines, and data governance.
- Ability to understand complex business requirements and translate them into technical solutions.
- Strong communication skills, with the ability to collaborate across business and technical teams.
- Leadership and mentoring experience, guiding junior team members to achieve project goals.

Preferred Qualifications:
- Certification in Azure (e.g., Azure Solutions Architect, Azure Data Engineer).
- Experience with other BI or visualization platforms (e.g., PowerApps).
- Knowledge of programming/scripting languages such as Python, Scala, or DAX.
- Familiarity with DevOps practices in data pipelines and CI/CD workflows.
- Experience with Agile methodologies and project management tools like JIRA or Azure DevOps.

Posted 1 month ago

Apply

7.0 - 9.0 years

5 - 9 Lacs

Mumbai

Work from Office

Key Responsibilities:
- Design and develop data models to support business intelligence and analytics solutions.
- Work with Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models.
- Implement dimensional modelling techniques for data warehousing and reporting.
- Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements.
- Ensure data integrity, consistency, and compliance with Banking domain standards.
- Work with Snowflake to develop and optimize cloud-based data models.
- Write and execute complex SQL queries for data analysis and validation.
- Identify and resolve data quality issues and inconsistencies.

Required Skills & Qualifications:
- 7+ years of experience in data modelling and data analysis.
- Strong expertise in Erwin or Erwin Studio for data modeling.
- Experience with dimensional modelling and data warehousing (DWH) concepts.
- Proficiency in Snowflake and SQL for data management and querying.
- Previous experience in the Banking domain is mandatory.
- Strong analytical and problem-solving skills.
- Ability to work independently in a remote environment.
- Excellent verbal and written communication skills.

Posted 1 month ago

Apply

2.0 - 3.0 years

8 - 10 Lacs

Bengaluru

Hybrid

Role & responsibilities:
- Design, develop, and maintain Tableau dashboards and reports
- Collaborate with business stakeholders to gather and understand requirements, and translate them into effective visualizations that provide actionable insights
- Create wireframes and beta dashboards with a focus on user experience, correctness, and visibility
- Optimize Tableau dashboards for performance and usability
- Develop and maintain documentation related to Tableau solutions

Preferred candidate profile

Skills & requirements (must have):
- 2-3 years of experience developing, publishing, maintaining and managing Tableau dashboards
- Working knowledge of Tableau administration/architecture
- Experience creating wireframes and beta dashboards with a focus on user experience, correctness, and visibility
- Strong proficiency with SQL and data modelling for analysis and for building end-to-end data pipelines
- Ability to write complex queries and an understanding of database concepts
- Ability to be effective in virtual as well as in-person setups
- Strong at turning data discoveries into analytical insights that drive business outcomes
- Strong verbal and written communication skills

Nice to have:
- Experience working with clickstream data and web analytics tools like Adobe Omniture or Google Analytics
- Experience with programming languages like Python and Unix shell for data pipeline automation and analysis

Education: Bachelor's degree with at least 2 years of relevant experience in a Business Intelligence team

Posted 1 month ago

Apply

4.0 - 8.0 years

0 - 2 Lacs

Pune

Work from Office

Primary Skills: Stored procedures, Relational Data Modeling, SQL, Enterprise Data Modeling, Data Modeling
Secondary Skills: Snowflake Modeling, Snowflake warehouse, Snowflake
Degree: Bachelor of Computer Science, BCA, BE, BE Computer Engineering, BE-IT, BTECH, M.Tech, MCA
Branch: Computer Science and Engineering, Computer Engineering

Job Description (must-have skills):
- 4-8 years of overall experience
- 4+ years' experience in designing, implementing, and documenting data architecture and data modeling solutions, including the use of Azure SQL and Snowflake databases and SQL procedures
- Knowledge of relational databases and data architecture computer systems, including SQL
- Responsible for the development of conceptual, logical, and physical data models, and the implementation of operational data stores (ODS), data marts, and data lakes on target platforms (Azure SQL and Snowflake databases)
- Knowledge of ER modeling, big data, enterprise data, and physical data models
- Oversee and govern the expansion of existing data architecture and the optimization of data query performance via best practices
- Able to work independently and collaboratively
- Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models
- Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models
- Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization
- Hands-on modeling, design, configuration, installation, performance tuning, and sandbox POCs
- Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks
- Strong knowledge of Data Quality and Data Governance
- Knowledge of ETL

Professional Skills:
- Solid written, verbal, and presentation communication skills
- Strong team and individual player
- Maintains composure in all types of situations and is collaborative by nature
- High standards of professionalism, consistently producing high-quality results
- Self-sufficient and independent, requiring little supervision or intervention
- Demonstrates flexibility and openness in bringing creative solutions to address issues

Posted 1 month ago

Apply

5.0 - 8.0 years

10 - 15 Lacs

Pune

Hybrid

Role Synopsis: This is a senior-level position within a data science team, responsible for providing leadership, strategic direction, and technical expertise in data science and machine learning. The role involves leading and guiding a team of data scientists while collaborating closely with cross-functional departments such as engineering, product management, and business collaborators. The Lead Data Scientist plays a pivotal role in crafting data-driven strategies and solutions that contribute to the organization's success and growth.

Key Accountabilities:
- Data Analysis and Modeling: Lead data scientists must have a proven foundation in data analysis and a deep understanding of various machine learning algorithms, and be able to apply these techniques to address sophisticated problems and extract valuable insights from data.
- Out-of-core computing: Use libraries that support out-of-core computing, such as Dask in Python; these libraries can process data that doesn't fit into memory by reading it in smaller portions from disk (a minimal sketch follows below).
- Business Insight: Understand the FDO's business objectives and align data initiatives with them.
- Project Management: Skill in project management methodologies helps in planning and driving data science projects efficiently.
- Machine Learning: Innovation and strategy; advanced machine learning skills for complex models and their evaluation.
- Collaboration and Communication: Communicate effectively with collaborators; explain the modeling approach and results; adhere to privacy guidelines and make recommendations with a conscious balance.
- Continuous Learning: Stay up to date to maintain a competitive edge; apply methodologies to practical business challenges; meet with domain GPOs.
- Data cleaning, preprocessing and analysis: The ability to clean and preprocess data effectively is a fundamental skill in any data science role.
- Data Ethics and Privacy: Open communication with the customer; ethical considerations in algorithm design; secure data handling; data minimization.
- Database Management: Proficiency in database systems and SQL is required for data retrieval and storage.
- Domain knowledge: Expertise in the working domain, to better understand the context and requirements of data projects.
- Statistical Analysis and Mathematics: A solid grasp of statistical methods and mathematical concepts is needed for data analysis, modeling, and drawing substantial insights from data.

Experience and Job Requirements: The Data Science Team plays a crucial role in driving data-informed decision-making and generating actionable insights to support the company's goals. The team is responsible for processing, analyzing, and interpreting large and complex datasets from multiple sources to provide valuable insights and recommendations across various domains. Through advanced analytical techniques and machine learning models, the data science team helps optimize processes, predict trends, and build data-driven strategies. A bachelor's or master's degree (or equivalent experience) in a quantitative field such as Computer Science, Statistics, Mathematics, Physics, Engineering, or a related data field is often required.

Skills: Leadership in data analysis; programming proficiency in Python, SQL and Azure Databricks; statistics and mathematics; leadership qualities to steer the team; strategic direction and technical expertise.

Soft skills: Active listening; translating business problems into data questions; communication and collaboration; presentation; problem solving; multi-functional team management; partner management.

Data sources: SAP, Concur, Salesforce, Workday, Excel files.

Other: Project management; domain knowledge (Procurement, Finance, Customer); business insight; critical thinking; storytelling. Able to prepare analytical reports, presentations and/or visualisation dashboards to communicate findings, important metrics and insights to both technical and non-technical customers. Stays up to date with industry trends, standard methodologies and new technologies in data analytics, machine learning and data science techniques.
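The out-of-core accountability above maps directly onto Dask's lazy, partitioned DataFrame API. A minimal sketch, assuming a set of CSV files too large for memory; the file glob and column names are placeholders.

```python
# Minimal out-of-core sketch: the CSVs are read in partitions from disk
# rather than loaded into memory at once; nothing executes until .compute().
import dask.dataframe as dd

df = dd.read_csv("large_dataset-*.csv")    # lazy, chunked read (placeholder glob)

result = (
    df.groupby("category")["amount"]       # placeholder columns
      .mean()
      .compute()                           # triggers the actual chunked work
)
print(result)
```

Each partition is processed independently and the partial aggregates are combined, which is why the full dataset never needs to fit in RAM.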

Posted 1 month ago

Apply

10.0 - 14.0 years

30 - 45 Lacs

Hyderabad

Work from Office

- Bachelor's degree in Computer Science, Information Systems, or a related field
- Minimum of 10+ years of experience in data architecture, with at least 1-3 years of experience in the healthcare domain
- Strong hands-on experience with cloud databases such as Snowflake, Aurora, Google BigQuery, etc.
- Experience designing OLAP and OLTP systems for efficient data analysis and processing
- Strong hands-on experience with enterprise BI/reporting tools (Looker, AWS QuickSight, Power BI, Tableau and Cognos)
- A strong understanding of HIPAA regulations and healthcare data privacy laws is a must-have for this role, as the healthcare domain requires strict adherence to data privacy and security regulations
- Experience with data privacy and tokenization tools such as Immuta, Privacera, Privitar, OpenText and Protegrity
- Experience with multiple full life-cycle data warehouse/transformation implementations in the public cloud (AWS, Azure, and GCP), with deep technical knowledge of one
- Proven experience working as an Enterprise Data Architect or in a similar role, preferably in large-scale organizations
- Proficient in data modelling, both star schema (de-normalized) and transactional (normalized) models, using tools like Erwin
- Experience with ETL/ELT architecture and integration (Matillion, AWS Glue, Google PLEX, Azure Data Factory, etc.)
- Deep understanding of data architectures that utilize Data Fabric, Data Mesh, and Data Products implementations
- Business and financial acumen to advise on product planning, conduct research and analysis, and identify the business value of new and emerging technologies
- Strong SQL and database skills working with large structured and unstructured data
- Experienced in the implementation of data virtualization and semantic, model-driven architecture
- System development life cycle (SDLC), Agile development, DevSecOps, and standard software development tools such as Git and Jira
- Excellent written and oral communication skills to convey key choices, recommendations, and technology concepts to technical and non-technical audiences
- Familiarity with AI/MLOps concepts and generative AI technology

Posted 1 month ago

Apply

7.0 - 12.0 years

16 - 31 Lacs

Pune, Delhi / NCR, Mumbai (All Areas)

Hybrid

Job Title: Lead Data Engineer

Job Summary: The Lead Data Engineer will provide technical expertise in the analysis, design, development, rollout and maintenance of data integration initiatives. This role contributes to implementation methodologies and best practices, and works on project teams to analyse, design, develop and deploy business intelligence / data integration solutions that support a variety of customer needs. The position oversees a team of Data Integration Consultants at various levels, ensuring their success on projects, goals, trainings and initiatives through mentoring and coaching. It provides technical expertise in needs identification, data modelling, data movement and transformation mapping (source to target), automation and testing strategies, translating business needs into technical solutions with adherence to established data guidelines and approaches from a business unit or project perspective, whilst leveraging best-fit technologies (e.g. cloud, Hadoop, NoSQL) and approaches to address business and environmental challenges. The role works with stakeholders to identify and define self-service analytic solutions, dashboards, actionable enterprise business intelligence reports and business intelligence best practices; it is responsible for repeatable, lean and maintainable enterprise BI design across organizations and partners effectively with the client team. We expect leadership not only in the conventional sense but also within the team: candidates should show innovation, critical thinking, optimism/positivity, communication, time management, collaboration, problem-solving, acting independently, knowledge sharing and approachability.

Responsibilities:
- Design, develop, test, and deploy data integration processes (batch or real-time) using tools such as Microsoft SSIS, Azure Data Factory, Databricks, Matillion, Airflow, Sqoop, etc. (a Databricks-style upsert is sketched after this description)
- Create functional and technical documentation, e.g. ETL architecture documentation, unit testing plans and results, data integration specifications, data testing plans
- Take a consultative approach with business users, asking questions to understand the business need, and derive the data flow and the conceptual, logical, and physical data models based on those needs
- Perform data analysis to validate data models and confirm the ability to meet business needs
- May serve as project or DI lead, overseeing multiple consultants from various competencies
- Stay current with emerging and changing technologies to best recommend and implement beneficial technologies and approaches for data integration
- Ensure proper execution/creation of methodology, training, templates, resource plans and engagement review processes
- Coach team members to ensure understanding of projects and tasks, providing effective feedback (critical and positive) and promoting growth opportunities when appropriate
- Coordinate and consult with the project manager, client business staff, client technical staff and project developers on data architecture best practices and anything else data-related at the project or business unit level
- Architect, design, develop and set direction for enterprise self-service analytic solutions, business intelligence reports, visualisations and best practice standards; toolsets include but are not limited to SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau and Qlik
- Work with the report team to identify, design and implement a reporting user experience that is consistent and intuitive across environments and report methods, defines security, and meets usability and scalability best practices

Required Qualifications:
- 10 years of industry implementation experience with data integration tools such as AWS services (Redshift, Athena, Lambda, Glue, S3), ETL, etc.
- 5-8 years of management experience required
- 5-8 years of consulting experience preferred
- Minimum of 5 years of data architecture, data modelling or similar experience
- Bachelor's degree or equivalent experience; Master's degree preferred
- Strong data warehousing, OLTP systems, data integration and SDLC experience
- Strong experience in orchestration, with working experience of cloud-native / third-party ETL data load orchestration (Data Factory, HDInsight, Data Pipeline, Cloud Composer or similar)
- Understanding and experience with major data architecture philosophies (Dimensional, ODS, Data Vault, etc.)
- Understanding of modern data warehouse capabilities and technologies such as real-time, cloud and Big Data
- Understanding of on-premises and cloud infrastructure architectures (e.g. Azure, AWS, GCP)
- Strong experience with Agile process (Scrum cadences, roles, deliverables) and working experience in Azure DevOps, JIRA or similar, with experience in CI/CD using one or more code management platforms
- Strong Databricks experience; required to create notebooks in PySpark
- Experience using major data modelling tools (e.g. ERwin, ER/Studio, PowerDesigner)
- Experience with major database platforms (e.g. SQL Server, Oracle, Azure Data Lake, Hadoop, Azure Synapse/SQL Data Warehouse, Snowflake, Redshift)
- 3-5 years' development experience in decision support / business intelligence environments utilizing tools such as SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau, Looker, etc.

Preferred Skills & Experience:
- Knowledge of and working experience with data integration processes such as data warehousing, EAI, etc.
- Experience in providing estimates for data integration projects, including testing, documentation, and implementation
- Ability to analyse business requirements as they relate to data movement and transformation processes, including research, evaluation and recommendation of alternative solutions
- Ability to provide technical direction to other team members, including contractors and employees
- Ability to contribute to conceptual data modelling sessions to accurately define business processes independently of data structures, and then combine the two
- Proven experience leading team members, directly or indirectly, in completing high-quality major deliverables with superior results
- Demonstrated ability to serve as a trusted advisor who builds influence with client management beyond simply EDM
- Can create documentation and presentations such that they "stand on their own"
- Can advise sales on the evaluation of data integration efforts for new or existing client work
- Can contribute to internal/external data integration proofs of concept
- Demonstrates the ability to create new and innovative solutions to problems that have not previously been encountered
- Ability to work independently on projects as well as collaborate effectively across teams
- Must excel in a fast-paced, agile environment where critical thinking and strong problem-solving skills are required for success
- Strong team-building, interpersonal, analytical, problem identification and resolution skills
- Experience working with multi-level business communities
- Can effectively utilise SQL and/or the available BI tool to validate/elaborate business rules
- Demonstrates an understanding of EDM architectures and applies this knowledge in collaborating with the team to design effective solutions to business problems/issues
- Effectively influences and, at times, oversees business and data analysis activities to ensure sufficient understanding and quality of data
- Demonstrates a complete understanding of, and utilises, DSC methodology documents to efficiently complete assigned roles and associated tasks
- Deals effectively with all team members and builds strong working relationships/rapport with them
- Understands and leverages a multi-layer semantic model to ensure scalability, durability, and supportability of the analytic solution
- Understands modern data warehouse concepts (real-time, cloud, Big Data) and how to enable such capabilities from a reporting and analytics standpoint
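As referenced in the responsibilities above, here is a hedged sketch of the Databricks-notebook-style work this role describes: upserting a batch of changes into a Delta table with PySpark. It assumes a runtime with Delta Lake available; the paths, schema and join key are invented.

```python
# Hypothetical Delta Lake upsert: merge a staged batch of customer changes
# into a curated table. Paths and column names are placeholders.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

updates = spark.read.parquet("/mnt/staging/customers/")      # placeholder path
target = DeltaTable.forPath(spark, "/mnt/curated/customers")  # placeholder path

(
    target.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()      # overwrite changed rows (SCD Type 1 behaviour)
    .whenNotMatchedInsertAll()   # insert brand-new customers
    .execute()
)
```

Swapping whenMatchedUpdateAll for an end-dating update plus a versioned insert would give the Data Vault / SCD Type 2 behaviour the qualifications mention.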

Posted 1 month ago

Apply

8.0 - 12.0 years

20 - 22 Lacs

Pune

Work from Office

Develop and deploy ML models using SageMaker. Automate data pipelines and training processes. Monitor and optimize model performance. Ensure model governance and reproducibility.
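As a concrete, hedged illustration of the "develop and deploy ML models using SageMaker" duty, here is a minimal sketch of training and deploying a scikit-learn model with the SageMaker Python SDK. The role ARN, S3 path, training script and instance sizes are all placeholders, not details from the posting.

```python
# Hypothetical SageMaker train-and-deploy flow for a scikit-learn model.
from sagemaker.sklearn.estimator import SKLearn

estimator = SKLearn(
    entry_point="train.py",                                   # your training script
    role="arn:aws:iam::123456789012:role/SageMakerRole",      # placeholder ARN
    instance_type="ml.m5.large",
    framework_version="1.2-1",
)

# Launch a managed training job against data staged in S3 (placeholder path).
estimator.fit({"train": "s3://my-bucket/train/"})

# Stand up a real-time inference endpoint from the trained model.
predictor = estimator.deploy(
    initial_instance_count=1,
    instance_type="ml.m5.large",
)
# predictor.predict(...) to score; predictor.delete_endpoint() when finished,
# since endpoints bill while they are running.
```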

Posted 1 month ago

Apply

8.0 - 12.0 years

20 - 22 Lacs

Bengaluru

Work from Office

Develop and deploy ML models using SageMaker. Automate data pipelines and training processes. Monitor and optimize model performance. Ensure model governance and reproducibility.

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Mumbai

Work from Office

Position Purpose: The Shared Data Ecosystem (SDE) is an ITG-FRS department hosting various applications relating to the "filière unique" program, which is in charge of collecting accounting and risk data from local entities in a single stream. All accounting and risk data is loaded into the SRS, a data warehouse storing all group information at a granular level. Accounting and risk datamarts are fed by this data warehouse, and restitution tools are plugged into these datamarts. Our goal is to deliver efficient access to SRS data for both local and central users, covering multiple use cases in a coherent way and data model, and to enable the filière (1,800 users, from entities to central teams) to contribute smoothly to the closing process with:
- Datamarts built up consistently to allow data exposition
- Consistent and user-friendly BI tools
- Industrial accesses to produce granular analyses and financial and regulatory reporting

As a business analyst, your main activities are to:
- Analyze business needs and write business/functional requirements
- Explain the needs/changes required in the application to technical teams
- Test the deliveries/results built by technical teams
- Build BO reports to fulfil the needs
- Help SRS users in their daily work on the SRS exposition layer
- Monitor production (quarterly closing), with the possibility of on-call periods

Responsibilities

Direct Responsibilities: The following deliverables are the main outputs of the scope defined above in terms of the BA's responsibility. In project mode, or according to other recurrent work, new deliverables can be defined. The main deliverables are:
- Produce functional requirements
- Write and execute test cases
- Participate in designing innovative solutions aligned with the bank's informational architecture
- Build new BO queries based on Finance or Risk team requirements
- Assist Finance in their daily production work
- Root-cause analysis of any production incidents/defects raised by users

It is expected that he/she ensures proper support to users of the tool, as well as providing high-quality work and deliverables in the execution of his/her job. Working knowledge of the Microsoft Office Suite (Excel, PowerPoint, Word) and SharePoint is required.

Good-to-have skills:
- SQL (mandatory)
- Restitution tools (Business Objects, Power BI and SSAS cubes)
- Business intelligence (data modelling)
- Experience in the Finance/Accounting domain as a business analyst

Contributing Responsibilities: Contribute to overall FRS as directed by team and department management.

Technical & Behavioural Competencies:
- Ability to simplify complex information in a clearly organized and visually interesting manner
- Proactive behaviour and the ability to work in a fast-changing and demanding environment
- At ease with multi-tasking
- Strong analytical mind and problem-solving skills
- Ensure a high service level for all customers of the tool
- Assure a high communication level with customers and other teams
- Improve processes that deliver user value
- A mindset of getting better all the time, with an ongoing effort to improve
- Demonstrate improvements in terms of efficiency, effectiveness and flexibility
- Take pertinent proactive measures
- Be aligned with the BNP values: Agility, Compliance Culture, Openness, Client Satisfaction

Behavioural Skills: Ability to collaborate / teamwork; critical thinking; communication skills (oral and written); client focus
Transversal Skills: Ability to understand, explain and support change; ability to develop and adapt a process; ability to manage/facilitate a meeting, seminar, committee or training
Education Level: Bachelor's degree or equivalent
Experience Level: At least 5 years

Posted 1 month ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Pune

Work from Office

Although the role category specified in the GPP is Remote, the requirement is for Hybrid. Job Summary: Responsible for building high-quality, innovative and fully performing software in compliance with coding standards and technical design. Design, modify, develop, write and implement software programming applications. Support and/or install software applications. Key participant in the testing process through test review and analysis, test witnessing and certification of software. Key Responsibilities: Develop software solutions by studying information needs; conferring with users; studying systems flow, data usage and work processes; investigating problem areas; following the software development lifecycle; Document and demonstrate solutions; Develops flow charts, layouts and documentation Determine feasibility by evaluating analysis, problem definition, requirements, solution development and proposed solutions; Understand business needs and know how to create the tools to manage them Prepare and install solutions by determining and designing system specifications, standards and programming Recommend state-of-the-art development tools, programming techniques and computing equipment; participate in educational opportunities; read professional publications; maintain personal networks; participate in professional organizations; remain passionate about great technologies, especially open source Provide information by collecting, analyzing, and summarizing development and issues while protecting IT assets by keeping information confidential; Improve applications by conducting systems analysis recommending changes in policies and procedures Define applications and their interfaces, allocate responsibilities to applications, understand solution deployment, and communicate requirements for interactions with solution context, define Nonfunctional Requirements (NFRs) Understands multiple architectures and how to apply architecture to solutions; understands programming and testing standards; understands industry standards for traditional and agile development Provide oversight and foster Built-In Quality and Team and Technical Agility; Adopt new mindsets and habits in how people approach their work while supporting decentralized decision making. Maintain strong relationships to deliver business value using relevant Business Relationship Management practices. External Qualifications and Competencies Competencies: Business insight - Applying knowledge of business and the marketplace to advance the organizations goals.Communicates effectively - Developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences.Customer focus - Building strong customer relationships and delivering customer-centric solutions. Global perspective - Taking a broad view when approaching issues, using a global lens.Manages conflict - Handling conflict situations effectively, with a minimum of noise.Agile Architecture - Designs the fundamental organization of a system embodied by its components, their relationshipsto each other and to the environment to guide its emergent design and evolution. 
Agile Development - Uses API-First Development, where requirements and solutions evolve through the collaborative effort of self-organizing, cross-functional teams and their customer(s)/end user(s), to construct high-quality, well-designed technical solutions; understands and includes the Internet of Things (IoT), the Digital Mesh and Hyper-Connectivity as inputs to API-First Development so solutions are more adaptable to future trends in Agile development.
Agile Systems Thinking - Embraces a holistic approach to analysis that focuses on the way a system's constituent parts interrelate and how systems work over time and within the context of larger systems, to ensure the economic success of the solution.
Agile Testing - Leads a cross-functional agile team, with special expertise contributed by testers, working at a sustainable pace and delivering business value desired by the customer at frequent intervals to ensure the economic success of the solution.
Regulatory Risk Compliance Management - Evaluates the design and effectiveness of controls against established industry frameworks and regulations to assess adherence to legal/regulatory requirements.
Solution Functional Fit Analysis - Composes and decomposes a system into its component parts using procedures, tools and work aids for the purpose of studying how well the component parts were designed, purchased and configured to interact holistically to meet business, technical, security, governance and compliance requirements.
Solution Modeling - Creates, designs and formulates models, diagrams and documentation using industry standards, tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements.
Values differences - Recognizing the value that different perspectives and cultures bring to an organization.
Education, Licenses, Certifications:
College, university, or equivalent degree in Computer Science, Engineering, or a related subject, or relevant equivalent experience, is required. This position may require licensing for compliance with export controls or sanctions regulations.
Experience:
Experience working as a software engineer, with the following knowledge and experience preferred:
Working in Agile environments
Fundamental IT technical skill sets
Taking a system from scoping requirements through actual launch
Communicating with users, other technical teams and management to collect requirements, identify tasks, provide estimates and meet production deadlines
Professional software engineering best practices for the full software development lifecycle, including coding standards, code reviews, source control management, build processes, testing and operations
Additional Responsibilities Unique to this Position
Expertise in Oracle Data Integrator (ODI) 12c to design, develop and implement ETL processes
Collaborate with cross-functional teams to gather and analyse business requirements for data integration solutions
Experience working with multiple source technologies such as Oracle, SQL Server, Excel files, delimited files and REST APIs
Exposure to Unix shell and Python scripting
Proficiency in SQL and PL/SQL (packages, stored procedures, triggers)
Proficiency in data modelling concepts, terminology and architecture; knowledge of dimensional modelling
Experience working with multiple source/target systems such as Oracle, XML files, JSON, flat files and Excel documents
Expertise in ODI customization, migration between environments and load plan generation
Develop ETL mappings, workflows and packages using ODI to integrate data from multiple sources
Knowledge of Oracle Database and related technologies
Create and maintain comprehensive documentation covering ETL processes, data mappings and transformations
Optimize and tune ETL processes for performance and efficiency
Troubleshoot and resolve data integration issues and errors
Collaborate with stakeholders to ensure data quality and integrity
Participate in code reviews and provide constructive feedback to peers
Excellent problem-solving, communication and collaboration skills
Understanding of and experience with DevOps/DevSecOps
Experience with other ETL tools and technologies is a plus; willingness to learn and expand into other integration products such as MuleSoft would be a big plus
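To give a concrete flavour of the extract-and-transform work this listing describes, here is a minimal sketch in Go of pulling records from a REST source and cleansing them for a staging load. It is illustrative only: a real pipeline for this role would be built in ODI and PL/SQL, and the endpoint, field names and staging table below are invented for the example.

```go
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
	"strings"
	"time"
)

// sourceRow mirrors one record from a hypothetical REST source
// (the field names here are invented for the example).
type sourceRow struct {
	CustomerID string  `json:"customer_id"`
	Amount     float64 `json:"amount"`
	BookedAt   string  `json:"booked_at"`
}

// extract pulls records from the source API.
func extract(url string) ([]sourceRow, error) {
	client := &http.Client{Timeout: 10 * time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return nil, err
	}
	defer resp.Body.Close()
	var rows []sourceRow
	if err := json.NewDecoder(resp.Body).Decode(&rows); err != nil {
		return nil, err
	}
	return rows, nil
}

// transform applies the kind of cleansing an ETL mapping would do:
// trim business keys and reject incomplete rows.
func transform(rows []sourceRow) []sourceRow {
	out := make([]sourceRow, 0, len(rows))
	for _, r := range rows {
		r.CustomerID = strings.TrimSpace(r.CustomerID)
		if r.CustomerID == "" {
			continue // reject rows with no business key
		}
		out = append(out, r)
	}
	return out
}

func main() {
	rows, err := extract("https://example.invalid/api/transactions") // hypothetical endpoint
	if err != nil {
		fmt.Println("extract failed:", err)
		return
	}
	for _, r := range transform(rows) {
		// In the real pipeline ODI would load a staging table;
		// here we just emit the equivalent INSERT for illustration.
		fmt.Printf("INSERT INTO stg_txn (customer_id, amount, booked_at) VALUES ('%s', %.2f, '%s');\n",
			r.CustomerID, r.Amount, r.BookedAt)
	}
}
```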

Posted 1 month ago

Apply

6.0 - 10.0 years

22 - 25 Lacs

Mumbai, Hyderabad

Work from Office

About the role
As a Data Warehouse Architect, you will be responsible for managing and enhancing the data warehouse that holds large volumes of customer life-cycle data flowing in from various applications, within the guardrails of risk and compliance. You will manage the day-to-day operations of the data warehouse, i.e. Vertica. In this role, you will manage a team of data warehouse engineers covering data modelling, ETL data pipeline design, issue management, upgrades, performance fine-tuning, migration, and the governance and security framework of the data warehouse. This role enables the Bank to maintain huge data sets in a structured manner that is amenable to data intelligence. The data warehouse supports numerous information systems used by various business groups to derive insights. As a natural progression, the data warehouse will be gradually migrated to a Data Lake, enabling better analytical advantage. The role holder will also be responsible for guiding the team through this migration.
Key Responsibilities
Data Pipeline Design: Design and develop ETL data pipelines that help organise large volumes of data, using data warehousing technologies to ensure that the warehouse is efficient, scalable and secure.
Issue Management: Ensure that the data warehouse runs smoothly; monitor system performance, diagnose and troubleshoot issues, and make the changes necessary to optimize performance.
Collaboration: Collaborate with cross-functional teams to implement upgrades, migrations and continuous improvements.
Data Integration and Processing: Process, clean and integrate large data sets from various sources to ensure that the data is accurate, complete and consistent.
Data Modelling: Design and implement data modelling solutions so that the organization's data is properly structured and organized for analysis.
Key Qualifications & Skills
Educational Qualification: B.E./B.Tech. in Computer Science, Information Technology or an equivalent domain, with 6 to 10 years of experience and at least 5 years of relevant work experience in data warehousing/mining/BI/MIS.
Experience in Data Warehousing: Knowledge of ETL and data technologies, with the ability to outline a future vision in OLTP and OLAP (Oracle / MSSQL). Data modelling, data analysis and visualization experience (analytical tools such as Power BI / SAS / QlikView / Tableau). Good to have exposure to Azure cloud data platform services such as Cosmos DB, Azure Data Lake, Azure Synapse and Azure Data Factory.
Synergize with the Team: Regular interaction with business/product/functional teams to create mobility solutions.
Certification: Azure certified DP-900, PL-300, DP-203 or any other data platform/data analyst certifications.
Communication skills: Good oral and written communication skills.
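As a hedged illustration of the dimensional-modelling work this role covers, the core pattern of most warehouse loads is resolving a business key to a surrogate key in a dimension before writing the fact row. The sketch below shows that pattern in Go with invented table and key names; the Bank's actual Vertica schema and tooling would of course differ.

```go
package main

import "fmt"

// dimCustomer assigns stable surrogate keys to business keys,
// the way a warehouse dimension table does.
type dimCustomer struct {
	next int
	keys map[string]int // business key -> surrogate key
}

func (d *dimCustomer) surrogateKey(businessKey string) int {
	if sk, ok := d.keys[businessKey]; ok {
		return sk
	}
	d.next++
	d.keys[businessKey] = d.next // new dimension member
	return d.next
}

// factRow is one row destined for a (hypothetical) fact table.
type factRow struct {
	CustomerSK int
	Amount     float64
}

func main() {
	dim := &dimCustomer{keys: map[string]int{}}
	var facts []factRow
	// Incoming events carry business keys; the load resolves them
	// to surrogate keys so the fact table stores only integers.
	for _, e := range []struct {
		cust   string
		amount float64
	}{{"CUST-001", 120.0}, {"CUST-002", 75.5}, {"CUST-001", 40.0}} {
		facts = append(facts, factRow{dim.surrogateKey(e.cust), e.amount})
	}
	fmt.Println(facts) // [{1 120} {2 75.5} {1 40}]
}
```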

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 15 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Mode of work: Hybrid (2 days WFO)
Mode of Interview: 2 Rounds (Virtual, F2F)
Notice Period: Immediate to 15 days
We are looking for a highly skilled Senior Backend Developer with solid experience in developing and maintaining scalable backend systems using Go. You'll be part of a core engineering team building robust APIs and distributed services.
Key Responsibilities:
Develop scalable and high-performance backend services
Write clean, efficient and testable code
Optimize systems for latency, reliability and cost
Collaborate closely with front-end engineers and product teams
Handle data modelling and database performance tuning
Required Skills:
Strong in Go
Solid understanding of RESTful API design, SQL (PostgreSQL, MySQL) and NoSQL (MongoDB, Redis)
Location: Bangalore, Hyderabad, Pune, Mumbai, Chennai, Ahmedabad
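For a sense of the level implied by this listing's required skills, a minimal sketch of a Go REST endpoint follows. The route and payload are invented for illustration; a production service would add a PostgreSQL/MySQL store, a Redis cache, logging middleware and graceful shutdown.

```go
package main

import (
	"encoding/json"
	"log"
	"net/http"
)

// user is an illustrative payload; a real service would back
// this with PostgreSQL/MySQL plus a cache such as Redis.
type user struct {
	ID   int    `json:"id"`
	Name string `json:"name"`
}

// getUser serves a single user as JSON.
func getUser(w http.ResponseWriter, r *http.Request) {
	w.Header().Set("Content-Type", "application/json")
	json.NewEncoder(w).Encode(user{ID: 1, Name: "example"})
}

func main() {
	http.HandleFunc("/users/1", getUser) // hypothetical route
	log.Println("listening on :8080")
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```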

Posted 1 month ago

Apply