3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
Department: Development
Location: Pune, India

Description: Our bright team fast-track their careers with international exposure and ways of working based on agile development best practices from globally renowned technology consultancies.

Key Responsibilities: Data Architect
- Create data models that specify how data is formatted, stored, and retrieved inside an organisation, covering conceptual, logical, and physical models.
- Design and optimise databases, including selecting appropriate database management systems (DBMS) and standardising and indexing data.
- Create and maintain data integration processes, ETL (Extract, Transform, Load) workflows, and data pipelines to move data seamlessly between systems.
- Collaborate with business analysts, data scientists, and other stakeholders to understand data requirements and align the architecture with business objectives.
- Stay current with industry trends, best practices, and advancements in data management through continuous learning and professional development.
- Establish processes for monitoring and improving data quality within the organisation; implement data quality tools and practices to detect and resolve data issues.

Requirements and Skills: Data Architect
- Prior experience in data warehouse design, data modelling, database design, and data administration is required.
- Database expertise: knowledge of data warehousing concepts and proficiency in various database systems (e.g., SQL).
- Knowledge of data modelling tools such as Visual Paradigm is required.
- Knowledge of ETL methods and technologies (for example, Azure ADF, Events).
- Expertise in writing complex stored procedures.
- Good understanding of data modelling concepts such as star schema and snowflake schema.
- Strong problem-solving and analytical skills to build effective data solutions.
- Excellent communication skills to work with cross-functional teams and convert business objectives into technical solutions.
- Knowledge of data governance: understanding of data governance principles, data security, and regulatory compliance.
- Knowledge of programming languages such as .NET can be advantageous.
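For illustration, a minimal star-schema sketch of the kind this posting's data modelling requirement describes; all table and column names are hypothetical, not taken from the posting:

```sql
-- Minimal star schema: one fact table keyed to surrogate dimension keys.
-- All table and column names are illustrative.
CREATE TABLE dim_date (
    date_key     INT PRIMARY KEY,   -- e.g. 20240131
    full_date    DATE NOT NULL,
    month_name   VARCHAR(10),
    year_number  INT
);

CREATE TABLE dim_product (
    product_key   INT PRIMARY KEY,  -- surrogate key
    product_code  VARCHAR(20),      -- natural/business key
    product_name  VARCHAR(100),
    category      VARCHAR(50)
);

CREATE TABLE fact_sales (
    date_key     INT NOT NULL REFERENCES dim_date (date_key),
    product_key  INT NOT NULL REFERENCES dim_product (product_key),
    quantity     INT,
    net_amount   DECIMAL(18, 2)
);

-- A snowflake variant would further normalise dim_product,
-- e.g. moving category into its own dim_category table.
```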
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
As a Senior Data Engineer with 3.5-6 years of experience, you will work on data warehousing projects. You must have a good understanding of data warehouse concepts and hands-on experience with AWS and SQL queries. The role requires 3+ years of experience on data warehousing projects and strong knowledge of databases, particularly Oracle. Proficiency in writing SQL queries is a must, along with familiarity with basic UNIX commands. Experience on production support projects and knowledge of a job scheduling tool, preferably Control-M, are essential. You will be responsible for timely allocation and support to teams for fixing production issues and user problems; flexibility to work in shifts based on project requirements is crucial for this role. Your technical skills should include proficiency in databases such as Oracle/SQL Server, SQL/PLSQL, and familiarity with the Snowflake data platform, UNIX commands, and a scheduling tool (Control-M preferred). Experience with ETL tools, especially Informatica PowerCenter, would be an added advantage. If you can join within 0-15 days and possess the skills and experience above, we encourage you to apply for this full-time position located in Greater Noida.
Posted 3 weeks ago
8.0 - 13.0 years
14 - 24 Lacs
Gurugram
Work from Office
The role holder is responsible for managing various reporting requirements such as business performance, KPIs, performance management, incentives, and CPP & PIP, providing insights across all hierarchy levels from ExCo to ground-level staff and to distribution spread across bank staff and business partners. The role requires maintenance of the existing reporting application Saral, built on SQL & Pentaho, and building/automating new dashboards on this application; hence extensive knowledge of various databases and ETL tools is required. The objective of the role is to reduce execution time and associated risk by automating existing manual reports, and to create process and control over various aspects of the business intelligence framework, reporting automation platforms, data dictionary, requirement documents, and traceability documents.

Key responsibilities:
- Use PowerPoint, MS Excel, MS Access, PL/SQL, ETL, OLTP systems, macro scripting, BI tools, the data warehouse, and all available resources to create/automate reports and present them to stakeholders.
- Monitor sales KPIs such as issuance target achievement, logins, pipeline cases, commission earned, new and renewal business, branch and people activation, and productivity for distribution partners and business development teams.
- Create the sales force management policy for every level of hierarchy for one or more distribution channels.
- Create the incentive structure for every level of hierarchy for one or more distribution channels, and manage budgets and opex.
- Create CPP/PIP for every level of hierarchy for one or more distribution channels.
- Automate new reports and continuously tune/enhance already automated reports; gather requirements from in-house teams by understanding the existing reporting process and creating automated reports for the same.
- Handle end-to-end delivery: requirement understanding, development, SIT/dev testing before release for business UAT, UAT testing support, and production go-live.
- Hands-on experience with SQL and ETL tools is required for faster learning and delivery; knowledge of the insurance industry is a must for quick onboarding.
- Provide qualitative and timely information to decision makers on all of the aforesaid parameters via dashboards such as the HO business update (automated), early-claims dashboard, and surrender payout monitoring.
- Publish sales metrics, including timely and accurate incentive calculation for 5,000+ sales staff on a monthly basis per the approved sales incentive scheme, as part of SFM (Sales Force Management).

Leadership responsibilities:
- Develop and implement BI strategies aligned with business objectives to leverage automation across the organisation.
- Ensure automated outputs match the existing manual reports, with the objective of reducing manual effort and execution time through streamlining and efficiency.
- Provide training, guidance, and support to ensure team performance and development.
- Identify potential risk items that may lead to data inconsistency and suggest mitigation plans; define and revise key measures used across the organisation and collaborate on documentation.
- Manage and maintain in-house BI tools and systems for seamless delivery of multiple dashboards.
- Develop and implement automated workflows for data ingestion, transformation, loading (ETL), and maintenance of data sanity.
- Manage projects to design and deploy new reports and tracking dashboards within the shortest possible turnaround; hands-on experience in all or most of the below-mentioned tools is expected:
Posted 3 weeks ago
8.0 - 10.0 years
10 - 20 Lacs
Kolkata, Pune, Bengaluru
Hybrid
Job Title: ETL Data Modeller
Experience: 8 to 10 years
Location: Pan India
Employment Type: Full-Time
Notice Period: Immediate to 30 days
Technology: Data Modelling, IICS/any leading ETL tool, SQL, Python (nice to have)

Key Responsibilities:
- Experience in data warehousing, solution design, and data analytics.
- Experience in data modelling exercises such as dimensional modelling and data vault modelling (see the sketch below).
- Understand, interpret, and clarify functional as well as technical requirements.
- Understand the overall system landscape, including upstream and downstream systems.
- Understand ETL tech specifications and develop code efficiently.
- Ability to demonstrate Informatica Cloud features/functions to achieve the best results.
- Hands-on experience in performance tuning and pushdown optimization in IICS.
- Provide mentorship on debugging and problem-solving.
- Review and optimize ETL tech specifications and code developed by the team.
- Ensure alignment with the overall system architecture and data flow.
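Since the posting names data vault modelling specifically, here is a minimal sketch of the hub/satellite pattern; all names and column choices are illustrative assumptions, not the employer's model:

```sql
-- Data vault sketch: a hub holds the business key, a satellite holds
-- descriptive attributes with load metadata. Names are illustrative.
CREATE TABLE hub_customer (
    customer_hk    CHAR(32) PRIMARY KEY,   -- hash of the business key
    customer_bk    VARCHAR(50) NOT NULL,   -- business key from the source
    load_ts        TIMESTAMP NOT NULL,
    record_source  VARCHAR(50) NOT NULL
);

CREATE TABLE sat_customer_details (
    customer_hk    CHAR(32) NOT NULL REFERENCES hub_customer (customer_hk),
    load_ts        TIMESTAMP NOT NULL,
    record_source  VARCHAR(50) NOT NULL,
    full_name      VARCHAR(100),
    email          VARCHAR(100),
    PRIMARY KEY (customer_hk, load_ts)     -- new row per change, history kept
);
```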
Posted 3 weeks ago
4.0 - 8.0 years
5 - 15 Lacs
Chennai, Delhi / NCR, Mumbai (All Areas)
Hybrid
Job Description (JD): Azure Databricks / ADF / Synapse, with strong emphasis on Python, SQL, Data Lake, and Data Warehouse.

Job Title: Data Engineer - Azure (Databricks / ADF / Synapse)
Experience: 4 to 7 Years
Location: Pan India
Employment Type: Full-Time
Notice Period: Immediate to 30 Days

Job Summary: We are looking for a skilled and experienced Data Engineer with 4 to 8 years of experience in building scalable data solutions on the Microsoft Azure ecosystem. The ideal candidate must have strong hands-on experience with Azure Databricks, Azure Data Factory (ADF), or Azure Synapse Analytics, along with Python and SQL expertise. Familiarity with Data Lake and Data Warehouse concepts and end-to-end data pipelines is essential.

Key Responsibilities:
- Requirement gathering and analysis
- Experience with different databases such as Synapse, SQL DB, Snowflake, etc.
- Design and implement data pipelines using Azure Data Factory, Databricks, and Synapse
- Create and manage Azure SQL Data Warehouses and Azure Cosmos DB databases
- Extract, transform, and load (ETL) data from various sources into Azure Data Lake Storage
- Implement data security and governance measures
- Monitor and optimize data pipelines for performance and efficiency
- Troubleshoot and resolve data engineering issues
- Provide optimized solutions for any data engineering problem
- Ability to work with a variety of sources: relational DBs, APIs, file systems, real-time streams, CDC, etc.
- Strong knowledge of Databricks and Delta tables (a sketch follows below)

Required Skills:
- 4-8 years of experience in Data Engineering or related roles
- Hands-on experience in Azure Databricks, ADF, or Synapse Analytics
- Proficiency in Python for data processing and scripting
- Strong command of SQL: writing complex queries, performance tuning, etc.
- Experience working with Azure Data Lake Storage and Data Warehouse concepts (e.g., dimensional modeling, star/snowflake schemas)
- Understanding of CI/CD practices in a data engineering context
- Excellent problem-solving and communication skills

Good to Have:
- Experience in Delta Lake, Power BI, or Azure DevOps
- Knowledge of Spark, Scala, or other distributed processing frameworks
- Exposure to BI tools such as Power BI, Tableau, or Looker
- Familiarity with data security and compliance in the cloud
- Experience in leading a development team
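As a rough illustration of the Delta-table skill listed above, a common incremental-load pattern is a MERGE upsert in Databricks Spark SQL; the table names and bronze/silver layering here are assumptions, not from the posting:

```sql
-- Incremental upsert into a Delta table (Spark SQL on Databricks).
-- Table and column names are illustrative.
CREATE TABLE IF NOT EXISTS silver.customers (
    customer_id  BIGINT,
    email        STRING,
    updated_at   TIMESTAMP
) USING DELTA;

MERGE INTO silver.customers AS t
USING bronze.customer_updates AS s
    ON t.customer_id = s.customer_id
WHEN MATCHED AND s.updated_at > t.updated_at THEN
    UPDATE SET t.email = s.email, t.updated_at = s.updated_at
WHEN NOT MATCHED THEN
    INSERT (customer_id, email, updated_at)
    VALUES (s.customer_id, s.email, s.updated_at);
```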
Posted 3 weeks ago
5.0 - 10.0 years
20 - 35 Lacs
Pune, Bengaluru, India
Work from Office
Looking for an OBIA professional with 5-10 years of experience in OBIEE, Oracle BI Apps implementation/customization, ETL (Informatica/ODI), DAC, PL/SQL, and data modeling, plus functional knowledge of EBS, Siebel, and PeopleSoft modules.
Posted 3 weeks ago
2.0 - 7.0 years
25 - 30 Lacs
Bengaluru
Work from Office
This role involves working closely with cross-functional teams to develop reporting systems, identify trends, and provide actionable insights to support strategic objectives.
Posted 4 weeks ago
7.0 - 10.0 years
0 Lacs
Hyderabad
Work from Office
Responsibilities:
- Conduct business analysis using BRDs, FRDs, SQL, and Power BI.
- Collaborate with stakeholders on requirements gathering.
- Manage the UAT process from planning to delivery.

Domain: Investment banking or finance experience is mandatory.

Benefits: office cab/shuttle, food allowance, annual bonus, health insurance, provident fund.
Posted 4 weeks ago
5.0 - 9.0 years
9 - 16 Lacs
Chennai, Bengaluru
Hybrid
Dear Talent, greetings from HCL Technologies! We are pleased to inform you that we currently have an opening for Data Warehouse Testing (US shift).

Skills required: data warehouse, ETL testing, Hadoop, SQL, data quality, data integrity, data reconciliation, data models
Location: Chennai/Bangalore
Experience: 5-10 years
Notice: Immediate to 30 days

Please revert with the details below along with your updated CV: name, contact number, email ID, total experience, relevant experience, readiness to work the 7:00 AM-4:00 PM Australia shift (Yes/No), current CTC, expected CTC, current organisation, current location, preferred location, notice period, ex-HCL employee (Yes/No; if yes, HCL SAP code), available date, and availability between 10:00 AM and 4:00 PM (Yes/No).

Job Purpose:
- Ready to work the Australia shift, 7:00 AM-4:00 PM.
- Plan, schedule, coordinate, and execute testing activities.
- Responsible for testing systems to ensure delivery of quality-assured applications to production.
- Support the Test Lead / Senior Test Lead in upward stakeholder management and day-to-day management of the QA and test organisation.

Job Responsibilities:
- Assist with the planning and execution of testing for a project or BAU stream.
- Develop test plans, status reporting, test readiness reviews, and test completion reports as appropriate.
- Work closely with off-shore and on-shore teams to deliver projects to time, cost, and quality requirements.
- Create data, conduct tests, and analyse results to ensure that software meets or exceeds specified standards and/or customer requirements.
- Ensure that tests are successfully completed and documented and all problems are resolved.
- Manage risks and issues associated with testing engagements.

Qualifications:
- Minimum of 5-8 years of testing experience.
- Strong analytical, reporting, and problem-solving skills.
- Excellent verbal and written communication skills (must).
- Experience in business intelligence and data warehouse testing in a Hadoop data hub environment (must).
- Expert in writing SQL queries and testing data flow across the data layers (must).
- Expert in testing data quality, data integrity, data reconciliation, and reporting solutions (must); see the reconciliation sketch below.
- Good understanding of data warehouse data models and creating test scenarios and cases (preferred).
- Experience working with Agile teams (preferred).
- Experience in continuous integration and test automation with relevant tools.

Interested candidates, please share your resume with s.yamuna@hcltech.com.
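To make the reconciliation requirement concrete, here are two typical checks; the schema and table names are invented, and EXCEPT may be MINUS or unavailable depending on the engine (e.g., older Hive):

```sql
-- Two common reconciliation checks between a source layer and a target
-- layer; schema and table names are illustrative.

-- 1) Row-count comparison per business date.
SELECT s.business_date, s.src_rows, t.tgt_rows
FROM (SELECT business_date, COUNT(*) AS src_rows
      FROM staging.orders GROUP BY business_date) s
JOIN (SELECT business_date, COUNT(*) AS tgt_rows
      FROM dwh.fact_orders GROUP BY business_date) t
  ON s.business_date = t.business_date
WHERE s.src_rows <> t.tgt_rows;

-- 2) Row-level differences: rows present in source but missing in target.
SELECT order_id, order_amount FROM staging.orders
EXCEPT
SELECT order_id, order_amount FROM dwh.fact_orders;
```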
Posted 4 weeks ago
7.0 - 12.0 years
22 - 32 Lacs
Hyderabad
Hybrid
Join our dynamic team as a Data Engineer in Corporate Technology Data Engineering & Analytics, where you'll play a pivotal role in driving the execution of our data and technology strategy. This role is crucial in driving digital transformation and operational efficiency across Investment Management. You will build data solutions including streaming and batch pipelines, data marts, and the data warehouse, and will be responsible for establishing robust data collection and processing pipelines to fulfill Investment Management business requirements.

The Team: You'll be an integral part of our esteemed Corporate Technology Team, comprised of six stacks: Investments, Finance, Risk & Law, HR & Employee Experience (EE), Data Engineering & Analytics, and Portfolio & Strategy. Our team operates on a global scale, driving innovation and excellence across diverse areas of expertise. As a Data Engineer, you'll play a critical role in high-impact Corporate Technology investment initiatives, ensuring alignment with organizational objectives and driving impactful outcomes. This is an opportunity to collaborate closely with the Corporate Technology Data and Analytics team and Investment Management business stakeholders. Our team thrives on collaboration, innovation, and a shared commitment to excellence. Together, we're shaping the future of technology within our organization and making a lasting impact on a global scale. Join us and be part of a dynamic team where your contributions will be valued and your potential unleashed.

The Impact:
- Design, build, and measure complex ELT jobs to process disparate data sources and form a high-integrity, high-quality, clean data asset (a sketch of the ELT pattern follows below).
- Execute and provide feedback on data modeling policies, procedures, processes, and standards.
- Assist with capturing and documenting system flows and other pertinent technical information about data, database design, and systems.
- Develop comprehensive data quality standards and implement effective tools to ensure data accuracy and reliability.
- Collaborate with various Investment Management departments to gain a better understanding of new data patterns.
- Collaborate with data analysts, data architects, and BI developers to ensure the design and development of scalable data solutions aligned with business goals.
- Translate high-level business requirements into detailed technical specs.

The Minimum Qualifications:
- Education: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Years of experience with data analytics, data modeling, and database design.
- Years of coding and scripting (Python, Java, Scala) and design experience.
- Years of experience with the Spark framework.
- Experience with ELT methodologies and tools.
- Years of mastery in designing, developing, tuning, and troubleshooting SQL.
- Knowledge of Informatica PowerCenter and Informatica IDMC.
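A minimal sketch of the ELT pattern this role centres on, assuming hypothetical raw/clean schemas: the load lands data untransformed, and set-based SQL inside the warehouse does the transformation:

```sql
-- ELT pattern: land raw data first, then transform inside the warehouse.
-- Schema, table, and column names are illustrative.

-- 1) Load: raw records land untransformed in a staging table
--    (via COPY, Snowpipe, Informatica, etc. -- engine-specific).

-- 2) Transform: set-based SQL inside the warehouse builds the clean layer.
CREATE TABLE clean.trades AS
SELECT
    TRIM(account_id)                  AS account_id,
    CAST(trade_ts AS TIMESTAMP)       AS trade_ts,
    CAST(notional AS DECIMAL(18, 2))  AS notional
FROM raw.trades
WHERE account_id IS NOT NULL;
```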
Posted 1 month ago
5.0 - 10.0 years
10 - 15 Lacs
Pune
Work from Office
We are looking for a highly skilled and experienced Data Engineer with over 5 years of experience to join our growing data team. The ideal candidate will be proficient in Databricks, Python, PySpark, and Azure, and have hands-on experience with Delta Live Tables. In this role, you will be responsible for developing, maintaining, and optimizing data pipelines and architectures to support advanced analytics and business intelligence initiatives. You will collaborate with cross-functional teams to build robust data infrastructure and enable data-driven decision-making.

Key Responsibilities:
- Design, develop, and manage scalable and efficient data pipelines using PySpark and Databricks
- Build and optimize Spark jobs for processing large volumes of structured and unstructured data
- Integrate data from multiple sources into data lakes and data warehouses on the Azure cloud
- Develop and manage Delta Live Tables for real-time and batch data processing (a sketch follows below)
- Collaborate with data scientists, analysts, and business teams to ensure data availability and quality
- Ensure adherence to best practices in data governance, security, and compliance
- Monitor, troubleshoot, and optimize data workflows and ETL processes
- Maintain up-to-date technical documentation for data pipelines and infrastructure components

Qualifications:
- 5+ years of hands-on experience in Databricks platform development
- Proven expertise in Delta Lake and Delta Live Tables
- Strong SQL and Python/Scala programming skills
- Experience with cloud platforms such as Azure, AWS, or GCP (preferably Azure)
- Familiarity with data modeling and data warehousing concepts
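For the Delta Live Tables item above, a short sketch in the classic DLT SQL syntax; the paths, table names, and expectation rule are illustrative assumptions:

```sql
-- Delta Live Tables pipeline sketch (Databricks DLT SQL syntax).
-- Paths and names are illustrative.

-- Bronze: stream raw JSON files from cloud storage via Auto Loader.
CREATE OR REFRESH STREAMING LIVE TABLE bronze_orders
AS SELECT * FROM cloud_files('/mnt/raw/orders', 'json');

-- Silver: typed, validated view of the stream; bad rows are dropped
-- by the expectation rather than failing the pipeline.
CREATE OR REFRESH LIVE TABLE silver_orders (
  CONSTRAINT valid_order EXPECT (order_id IS NOT NULL) ON VIOLATION DROP ROW
)
AS SELECT
    CAST(order_id AS BIGINT)       AS order_id,
    CAST(amount AS DECIMAL(18, 2)) AS amount,
    order_ts
FROM STREAM(LIVE.bronze_orders);
```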
Posted 1 month ago
7.0 - 11.0 years
30 - 35 Lacs
Bengaluru
Work from Office
1. The resource should have knowledge of Data Warehouse and Data Lake concepts.
2. Should be aware of building data pipelines using PySpark.
3. Should be strong in SQL skills.
4. Should have exposure to the AWS environment and services such as S3, EC2, EMR, Athena, Redshift, etc. (see the Athena sketch below).
5. Good to have programming skills in Python.
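Point 4's Athena exposure typically means querying data in S3 through an external table; a minimal sketch with an invented bucket and schema:

```sql
-- Athena: external table over partitioned Parquet data in S3, then a query.
-- Bucket, paths, and columns are illustrative.
CREATE EXTERNAL TABLE IF NOT EXISTS sales_events (
    event_id  STRING,
    amount    DECIMAL(18, 2),
    event_ts  TIMESTAMP
)
PARTITIONED BY (event_date STRING)
STORED AS PARQUET
LOCATION 's3://example-bucket/curated/sales_events/';

-- Register new partitions written by the pipeline, then query.
MSCK REPAIR TABLE sales_events;

SELECT event_date, SUM(amount) AS daily_amount
FROM sales_events
WHERE event_date >= '2024-01-01'
GROUP BY event_date;
```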
Posted 1 month ago
12.0 - 18.0 years
50 - 65 Lacs
Bengaluru
Work from Office
Oversee the delivery of data engagements across a portfolio of client accounts, understanding their specific needs, goals, and challenges. Provide mentorship and guidance for the architects, project managers, and technical teams on data engagements.

Required candidate profile: 12+ years of experience; hands-on in data architecture; an expert in Databricks or Azure; prior data engineering leadership or management roles.
Posted 1 month ago
4.0 - 8.0 years
16 - 27 Lacs
Pune
Work from Office
Job Summary: We are seeking a skilled ETL Developer with 5+ years of experience in designing, developing, and maintaining ETL processes using SSIS and working with large-scale MS SQL Server data warehouses. The ideal candidate will have a strong techno-functional understanding of data warehousing concepts and best practices.

Key Responsibilities:
- ETL Development: Design, develop, and maintain ETL processes to extract data from various sources, transform it according to business rules, and load it into target databases or data warehouses using SSIS or other ETL tools.
- SQL Development: Develop and optimize SQL queries, stored procedures, and functions for efficient data retrieval and manipulation (a sketch follows below).
- Data Management: Apply data warehousing concepts, including data governance, data architecture, data profiling, and data quality validation.
- Documentation: Document data flows using Visio, create business requirement documents in Word, and maintain source-to-target mapping documents in Excel.
- Troubleshooting: Identify and resolve production issues effectively.
- Engineering: Experience in both forward engineering (requirements to SQL logic) and reverse engineering (converting SQL logic to business requirements).
- Communication: Strong ability to communicate technical concepts to non-technical stakeholders.

Required Skills:
- Proficient hands-on experience with SSIS and SQL Server.
- Strong expertise in developing SSRS reports.
- Solid understanding of data warehousing concepts and principles.
- Excellent documentation skills, including ETL processes and source-to-target mappings.
- Strong communication and collaboration skills.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 5+ years of relevant ETL and data warehousing experience.
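As a sketch of the stored-procedure work described above (not the employer's actual code), here is a T-SQL upsert procedure of the kind an SSIS control flow might invoke after staging a load; all object names are illustrative:

```sql
-- T-SQL upsert procedure callable from an SSIS Execute SQL Task
-- after a staging load. All object names are illustrative.
CREATE PROCEDURE dbo.usp_MergeCustomerDim
AS
BEGIN
    SET NOCOUNT ON;

    MERGE dbo.DimCustomer AS tgt
    USING dbo.StgCustomer AS src
        ON tgt.CustomerCode = src.CustomerCode
    WHEN MATCHED THEN
        UPDATE SET tgt.CustomerName = src.CustomerName,
                   tgt.UpdatedAt    = SYSUTCDATETIME()
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (CustomerCode, CustomerName, UpdatedAt)
        VALUES (src.CustomerCode, src.CustomerName, SYSUTCDATETIME());
END;
```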
Posted 1 month ago
6.0 - 8.0 years
10 - 20 Lacs
Bengaluru
Work from Office
Job Location: Bangalore
Job Title: Module Lead - Snowflake
Experience: 6-8 years

Job Description: Senior Snowflake Developer
- Design and develop our Snowflake data platform, including data pipeline building, data transformation, and access management.
- Minimum 4+ years of experience in Snowflake; strong in SQL.
- Develop data warehouse and data mart solutions for business teams.
- Accountable for designing robust, scalable database and data extraction, transformation, and loading (ETL) solutions.
- Understand and evaluate business requirements that impact the Caterpillar enterprise.
- Liaise with data creators to support project planning, training, guidance on standards, and the efficient creation/maintenance of high-quality data.
- Contribute to policies, procedures, and standards as well as technical requirements.
- Ensure compliance with the latest data standards supported by the company and with brand, legal, and information security (data security and privacy) requirements.
- Document data models for domains to be deployed, including a logical data model, candidate source lists, and canonical formats.
- Create, update, and enhance metadata policies, processes, and catalogs.
- Good communication skills, including interacting with client SMEs.
- Capable of leading a team of 4-5 members.
- Snowflake certification is mandatory.
Posted 1 month ago
5.0 - 10.0 years
16 - 27 Lacs
Hyderabad
Work from Office
Assistant Manager - SQL Developer
Key skills: T-SQL, MS SQL Server, SSIS
Required skills: SQL (expert) and Excel (intermediate)
Location: Hyderabad (return to office)
Experience: 5+ years; the role involves client interaction.

Responsibilities:
- Write T-SQL queries using joins and subqueries in MS SQL Server, including complex queries involving multiple tables.
- Create and modify tables, fields, and constraints in MS SQL Server.
- Strong hands-on experience in optimizing database and query performance (see the tuning sketch below).
- Good experience in testing and troubleshooting database issues.
- Create and modify advanced functions and stored procedures in MS SQL Server.
- Schedule and modify jobs for automation in MS SQL Server.
- Knowledge of creating views, triggers, and functions.
- Analytical mindset; sound knowledge of MS Excel and MS Access is preferred.
- Knowledge of designing data models (star/snowflake schema).
- Work on SQL Server writing complex queries and creating/amending procedures, packages, triggers, indexes, functions, and transactions.
- Import/export data in SQL Server; data analysis, data verification, and problem-solving abilities.
- Good technical knowledge of data mining and reporting tools; good knowledge of databases and data warehouse structures.

Qualifications we seek in you:
Minimum qualifications: BE/B.Tech, BCA, MCA, MBA, BSc/MSc
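One concrete example of the query-tuning skill this role asks for, with invented object names: replacing a correlated subquery with a set-based join and a supporting index:

```sql
-- A frequent T-SQL tuning exercise: replace a row-by-row correlated
-- subquery with a set-based join, supported by an index. Names illustrative.

-- Before: correlated subquery evaluated per outer row.
SELECT o.OrderId, o.Amount
FROM dbo.Orders AS o
WHERE o.Amount > (SELECT AVG(i.Amount)
                  FROM dbo.Orders AS i
                  WHERE i.CustomerId = o.CustomerId);

-- After: compute per-customer averages once, then join.
SELECT o.OrderId, o.Amount
FROM dbo.Orders AS o
JOIN (SELECT CustomerId, AVG(Amount) AS AvgAmount
      FROM dbo.Orders
      GROUP BY CustomerId) AS a
  ON a.CustomerId = o.CustomerId
WHERE o.Amount > a.AvgAmount;

-- Covering index to support both the aggregate and the join.
CREATE NONCLUSTERED INDEX IX_Orders_Customer_Amount
    ON dbo.Orders (CustomerId) INCLUDE (Amount);
```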
Posted 1 month ago
8.0 - 13.0 years
20 - 35 Lacs
Pune, Chennai, Bengaluru
Work from Office
Role & responsibilities
Position: Senior Snowflake Developer
Primary skills: Snowflake + advanced SQL (5+ yrs) + DWH (data warehouse)
Work mode: Hybrid (2 days from office)

Job Description: The Snowflake Data Specialist will manage projects in data warehousing, focusing on Snowflake and related technologies. The role requires expertise in data modeling, ETL processes, and cloud-based data solutions.

Requirements:
- Bachelor's degree in computer science, engineering, or related fields, or equivalent practical experience.
- 8+ years of industry experience with hands-on management of data warehousing projects.
- Minimum 4 years of experience working on Snowflake.

Responsibilities:
- Lead the development of project requirements for end-to-end data integration using ETL for structured, semi-structured, and unstructured data.
- Make ongoing updates to modeling principles, processes, solutions, and best practices to ensure alignment with business needs.
- Design and build manual or auto-ingestion data pipelines using Snowpipe (a sketch follows below).
- Develop stored procedures and write queries to analyze and transform data.
- Good to have: DBT/Matillion.

Interested candidates can apply to kinnera259@gmail.com

Regards,
HR Manager
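Since Snowpipe is called out explicitly, here is a minimal auto-ingest sketch in Snowflake SQL; the stage URL, file format, and object names are assumptions, and a real external stage would also need a storage integration or credentials:

```sql
-- Snowpipe auto-ingestion sketch (Snowflake). Stage, pipe, and table
-- names are illustrative; the event-notification wiring is cloud-specific.
CREATE OR REPLACE STAGE raw_stage
    URL = 's3://example-bucket/landing/'
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

CREATE OR REPLACE TABLE raw_orders (
    order_id  NUMBER,
    amount    NUMBER(18, 2),
    order_ts  TIMESTAMP_NTZ
);

CREATE OR REPLACE PIPE orders_pipe
    AUTO_INGEST = TRUE   -- fires on cloud storage event notifications
AS
COPY INTO raw_orders
FROM @raw_stage
FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
```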
Posted 1 month ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
The engineer/developer works closely with team members and business teams to develop features according to requirements, ensures that developments are aligned with best practices, and monitors what has been promoted to higher environments. Support activities for production and test environments (job monitoring, issue solving, etc.) are also part of the position.

Responsibilities

Direct responsibilities:
- Participate in all the Agile ceremonies of the squad (dailies, sprint plannings, backlog refinements, reviews, etc.).
- Communicate as soon as possible on blocking points.
- Estimate, design, and build technical solutions for business needs and requirements according to the Jira tickets (changes, bug fixing, etc.).
- Unit-test the developed code before delivering it for user acceptance testing.
- Design technical solutions for business needs and requirements.
- Maintain clean code or a robust infrastructure (depending on developer/ops expertise).
- Raise technical improvements and impediments.

Contributing responsibilities:
- Accountable for delivering the Jira tickets assigned during a sprint.
- Accountable for daily support/monitoring.
- Accountable for contributing to the Engineer Chapter events and Tribe community (guild with the engineer tech lead).
- Accountable, with the tech lead and other members of the guild, for platform behaviour: ensure the platform is up and running (stability).

Technical requirements:
- Proficiency in Oracle ODI tools and Oracle PL/SQL.
- In-depth knowledge of ODI Master and Work Repositories.
- Integrating ODI with multiple sources/targets.
- Managing errors using ODI and PL/SQL.
- Strong database development skills (SQL/PLSQL) for PL/SQL-based applications.
- Creating packages, procedures, functions, triggers, views, materialized views, and exception handling (a sketch follows below).
- Data migration using SQL*Loader and import/export.
- Solid understanding of ELT/ETL concepts, design, and coding.
- Experience with partitioning and indexing strategies for optimal performance.

Technical & Behavioural Competencies
- Be autonomous on the asset (senior to expert level) and motivated on daily tasks.
- Be an active member of the squad (often multi-technology).
- Be proactive: raise alerts when something is flagged, try to understand the global context, and suggest how to solve an issue (without focusing only on one subpart). This also helps in understanding the rest of the team, who will not necessarily work on the same tool.
- Be able to provide proof that developments match the requirements (test cases).
- Communicate properly with the business (in English).
- Knowledge of deployment/versioning tools (Git, Jenkins, etc.).
- Knowledge of project tracking software (Jira, QC, etc.).
- Knowledge of monitoring tools (Centreon, Dynatrace, etc.).
- Strong background in SQL; able to check test cases independently by running SQL queries against the different databases.
- Provide technical expertise to suggest optimizations and technical enhancements.
- Unix/Windows.

Specific technical skills required for this role:
- Oracle Data Integrator (currently v12.2.1.4); admin notions are not mandatory but can be an advantage.
- Some knowledge of BI (data warehouse oriented) can be an advantage.
- SQL

Skills referential

Behavioural skills: attention to detail/rigour; adaptability; creativity & innovation/problem solving; ability to deliver/results driven

Transversal skills: analytical ability; ability to develop and adapt a process; ability to anticipate business/strategic evolution; ability to understand, explain, and support change; ability to develop others and improve their skills

Education level: Bachelor's degree or equivalent
Experience level: at least 5 years
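To illustrate the PL/SQL error-management requirement above, a small procedure with explicit exception handling; all object names are hypothetical:

```sql
-- Oracle PL/SQL sketch: load procedure with explicit exception handling,
-- of the kind the error-management requirement describes. Names illustrative.
CREATE OR REPLACE PROCEDURE load_daily_rates (p_rate_date IN DATE) AS
    v_rows PLS_INTEGER;
BEGIN
    INSERT INTO rates_dwh (rate_date, ccy_pair, rate)
    SELECT rate_date, ccy_pair, rate
    FROM   rates_staging
    WHERE  rate_date = p_rate_date;

    v_rows := SQL%ROWCOUNT;   -- capture before COMMIT resets it
    COMMIT;

    DBMS_OUTPUT.PUT_LINE(v_rows || ' rows loaded for ' || p_rate_date);
EXCEPTION
    WHEN DUP_VAL_ON_INDEX THEN
        ROLLBACK;
        RAISE_APPLICATION_ERROR(-20001, 'Duplicate rates for ' || p_rate_date);
    WHEN OTHERS THEN
        ROLLBACK;
        RAISE;   -- re-raise after cleanup so the caller (e.g. ODI) sees the failure
END load_daily_rates;
/
```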
Posted 1 month ago
5.0 - 8.0 years
8 - 18 Lacs
Mumbai, Hyderabad, Pune
Hybrid
Role & responsibilities
- Design, implement, and manage cloud-based solutions on AWS and Snowflake.
- Work with stakeholders to gather requirements and design solutions that meet their needs.
- Develop and execute test plans for new solutions.
- Oversee and design the information architecture for the data warehouse, including all information structures: staging area, data warehouse, data marts, and operational data stores.
- Optimize Snowflake configurations and data pipelines to improve performance, scalability, and overall efficiency.
- Deep understanding of data warehousing, enterprise architectures, dimensional modeling, star and snowflake schema design, reference DW architectures, ETL architecture, ETL (Extract, Transform, Load), data analysis, data conversion, transformation, database design, data warehouse optimization, data mart development, and enterprise data warehouse maintenance and support.
- Significant experience working as a data architect, with depth in data integration and data architecture for enterprise data warehouse implementations (conceptual, logical, physical, and dimensional models).
- Maintain documentation: develop and maintain detailed documentation for data solutions and processes.
- Provide training: offer training and leadership to share expertise and best practices with the team.
- Collaborate with and provide leadership to the data engineering team, ensuring that data solutions are developed according to best practices.

Preferred candidate profile:
- Snowflake, DBT, and data architecture design experience in data warehousing.
- Good to have: Informatica or other ETL knowledge or hands-on experience.
- Good to have: Databricks understanding.
- 5+ years of IT experience, with 3+ years of data architecture experience in data warehousing and 4+ years in Snowflake.
Posted 1 month ago
6.0 - 8.0 years
13 - 14 Lacs
Mumbai
Work from Office
Job Title: Deputy Manager 1
Department: Information Technology
Reports To: Sr. Manager
Experience: 6 to 8 years
Preferred Qualification: BE/B.Tech/BSc (with PG Dip. Computers)/BSc (IT)/MCA
Required Qualification: BE/B.Tech/BSc (with PG Dip. Computers)/BSc (IT)/BCA/MCA

Skills, Knowledge & Training:
- Hands-on experience with SQL Server 2016 or higher
- Writing optimized T-SQL queries
- Development of reports in SSRS/Tableau
- MSBI technology stack
- Good communication and presentation skills

Core Competencies:
- SQL Server development; strong T-SQL skills
- Report development
- Querying Oracle 11g or higher

Functional Competencies:
- Exposure to post-trade applications in capital markets/treasury
- Able to work in a team as well as individually
- SSAS cubes and MDX
- Power BI

Job Purpose: The successful candidate will join the IT department of a critical Financial Market Infrastructure (FMI) organization which serves as a Qualified Central Counterparty (QCCP) for government securities, money market, FX, and derivatives transactions in the Indian financial markets. The candidate will play a key role in the data warehouse, which serves the analytical data needs of all business segments, and should be able to design fact and dimension tables, create and deploy SSRS reports and SSIS catalogues, and create and modify cube designs. The candidate will be responsible for application development using BI tools through all stages of the development life cycle while working closely with all other project stakeholders.

Area of Operations: Data Warehouse

Key Responsibility: Member of the data warehouse team developing the solution using SQL Server scripts, stored procedures, and functions.

Any Other Requirement: Should be a good team player; may be required to work on multiple projects/teams concurrently.
Posted 1 month ago
8.0 - 10.0 years
12 - 20 Lacs
Noida, New Delhi, Gurugram
Work from Office
Role & responsibilities:
- 8+ years of related experience.
- Sound knowledge of business analysis and system design concepts, data warehouses and data marts, database architecture, ETL, development activities, and document preparation (SRS, FRD, RTM, change requests, etc.).
- Attend meetings with the domain teams to understand requirements; prepare the scope document, Business Requirement Document, and Functional Requirement Document.
- Act as an interface between the domain and technical teams to understand and convey requirements.
- Work with MS Visio, MS Office, and UML modeling.
- Prepare all types of documents, viz. SRS, design documents, and the Requirement Traceability Matrix (RTM).

Communication skills: excellent verbal and written.
Posted 1 month ago
1.0 - 4.0 years
3 - 6 Lacs
Gurugram
Work from Office
Will mainly work on developing dashboards and reports for Workforce Management. Will work closely with internal and external stakeholders (operations, support groups, clients' reporting teams, etc.) to make sure that correct and updated information on all dashboards and reports is available to the business at all times.

Requirements:
- Ability to work with large amounts of information and see the bigger picture
- Excellent analytical and problem-solving skills
- Strong attention to detail and critical thinking
- Experience in Power BI, Amazon Connect, Kibana, Teleopti
- Preferred: exposure to ETL and databases/data warehouses as they relate to developing reports in Power BI
- Experience using Google Sheets

Location: Navi Mumbai, Gurugram, Indore, Mohali
Posted 1 month ago
1.0 - 4.0 years
3 - 6 Lacs
Navi Mumbai
Work from Office
Will mainly work on developing dashboards and reports for Workforce Management. Will work closely with internal and external stakeholders (operations, support groups, clients' reporting teams, etc.) to make sure that correct and updated information on all dashboards and reports is available to the business at all times.

Requirements:
- Ability to work with large amounts of information and see the bigger picture
- Excellent analytical and problem-solving skills
- Strong attention to detail and critical thinking
- Experience in Power BI, Amazon Connect, Kibana, Teleopti
- Preferred: exposure to ETL and databases/data warehouses as they relate to developing reports in Power BI
- Experience using Google Sheets

Location: Navi Mumbai, Gurugram, Indore, Mohali
Posted 1 month ago
2.0 - 6.0 years
4 - 8 Lacs
Gurugram
Work from Office
- Overall 5+ years of experience in WFM, including 2 years leading a reporting team
- Ability to demonstrate and articulate understanding of key workforce management concepts
- Proficient with phone system reporting and scheduling tools (Verint, NICE, IEX, Cisco) as well as Microsoft Office (Excel, VBA, macros)
- Excellent quantitative, problem-solving, and analytical skills
- Strong attention to detail and critical thinking
- Experience in Power BI, Amazon Connect, Kibana, Teleopti
- Preferred: exposure to ETL and databases/data warehouses as they relate to developing reports in Power BI
- Experience using Google Sheets

Location: Navi Mumbai, Gurugram, Indore, Mohali
Posted 1 month ago
2.0 - 6.0 years
4 - 8 Lacs
Navi Mumbai
Work from Office
- Overall 5+ years of experience in WFM, including 2 years leading a reporting team
- Ability to demonstrate and articulate understanding of key workforce management concepts
- Proficient with phone system reporting and scheduling tools (Verint, NICE, IEX, Cisco) as well as Microsoft Office (Excel, VBA, macros)
- Excellent quantitative, problem-solving, and analytical skills
- Strong attention to detail and critical thinking
- Experience in Power BI, Amazon Connect, Kibana, Teleopti
- Preferred: exposure to ETL and databases/data warehouses as they relate to developing reports in Power BI
- Experience using Google Sheets

Location: Navi Mumbai, Gurugram, Indore, Mohali
Posted 1 month ago