7.0 - 9.0 years
9 - 14 Lacs
Hyderabad
Work from Office
Want to be part of the Data & Analytics organization, whose strategic goal is to create a world-class Data & Analytics company by building, embedding, and maturing a data-driven culture across Thomson Reuters.

About The Role
We are looking for a highly motivated individual with strong organizational and technical skills for the position of Lead Data Engineer / Data Engineering Manager (Snowflake). You will play a critical role working on the cutting edge of data engineering and analytics, leveraging predictive models, machine learning, and generative AI to drive business insights, facilitate informed decision-making, and help Thomson Reuters rapidly scale data-driven initiatives.
- Effectively communicate across various levels, including Executives, and functions within the global organization.
- Demonstrate strong leadership skills with the ability to drive projects/tasks to delivering value.
- Engage with stakeholders, business analysts, and the project team to understand the data requirements.
- Design analytical frameworks to provide insights into a business problem.
- Explore and visualize multiple data sets to understand the data available and prepare data for problem solving.
- Design database models (if a data mart or operational data store is required to aggregate data for modeling).

About You
You're a fit for the Lead Data Engineer / Data Engineering Manager (Snowflake) role if your background includes:
- Qualifications: B.Tech/M.Tech/MCA or equivalent
- Experience: 7-9 years of corporate experience
- Location: Bangalore, India
- Hands-on experience in developing data models for large-scale data warehouse/data lake platforms (Snowflake, BW)
- Map the data journey from operational system sources through any transformations in transit to its delivery into enterprise repositories (Warehouse, Data Lake, Master Data, etc.)
- Enable the overall master and reference data strategy, including the procedures to ensure the consistency and quality of Finance reference data
- Experience across ETL, SQL, and other emerging data technologies, with experience in integrations of a cloud-based analytics environment
- Build and refine end-to-end data workflows to offer actionable insights
- Fair understanding of Data Strategy and Data Governance processes
- Knowledge of BI analytics and visualization tools: Power BI, Tableau

#LI-NR1

What's in it For You
Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
Industry-Competitive Benefits: We offer comprehensive benefit plans that include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world-leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward.

As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
Posted 1 day ago
10.0 - 15.0 years
12 - 17 Lacs
Pune, Bengaluru, Hinjewadi
Work from Office
Software / Required Skills:
- Deep experience with Murex (version 3.1 or higher) in a production environment, focusing on the Reports and Datamart modules
- Strong SQL proficiency for data querying, issue analysis, and troubleshooting
- Shell scripting (Bash/sh) skills supporting issue investigation and automation
- Use of incident management tools such as ServiceNow or JIRA for tracking and reporting issues
- Familiarity with report development and data analysis in financial contexts

Preferred Skills:
- Experience with other reporting tools or frameworks, such as Tableau, Power BI, or QlikView
- Knowledge of data warehousing concepts and architecture
- Basic scripting knowledge in other languages (Python, Perl) for automation

Overall Responsibilities:
- Lead and oversee the support activities for Murex Datamart and reporting modules, ensuring operational stability and accuracy
- Provide L2/L3 technical support for report-related incidents, resolve complex issues, and perform root cause analysis
- Monitor report generation, data extraction, and reconciliation processes, ensuring timely delivery
- Collaborate with business stakeholders to address reporting queries, anomalies, and data discrepancies
- Support and coordinate system upgrades, patches, and configuration changes affecting reporting modules
- Maintain comprehensive documentation of system configurations, incident resolutions, and process workflows
- Lead problem resolution initiatives, including performance tuning and automation opportunities
- Manage support teams during shifts (24x5/24x7), ensuring effective incident escalation and stakeholder communication
- Drive continuous improvement initiatives to enhance report accuracy, data quality, and operational efficiency

Strategic objectives:
- Maximize report availability, accuracy, and reliability
- Reduce incident resolution times and recurring issues
- Strengthen reporting processes through automation and data quality enhancements

Performance outcomes:
- Minimal unplanned downtime of reporting systems
- High stakeholder satisfaction with timely, accurate reporting
- Clear documentation and proactive communication with stakeholders

Technical Skills (By Category):
Reporting & Data Analysis (Essential):
- Extensive experience supporting Murex Datamart, reports, and related workflows
- SQL proficiency for data extraction, troubleshooting, and validation
- Understanding of report structures for P&L, MV, Accounting, Risk, etc.
Scripting & Automation (Essential):
- Shell scripting (Bash/sh) for issue diagnosis and process automation
- Experience in automating routine report checks and data validations (see the sketch after this posting)
Databases & Data Management (Essential):
- Relational database management, data querying, and reconciliation
- Knowledge of data warehousing concepts and architecture
Support Tools & Incident Management (Essential):
- Hands-on experience with ServiceNow, JIRA, or similar platforms
Advanced & Cloud (Preferred):
- Familiarity with cloud data hosting, deployment, or cloud-based reporting solutions
- Experience with other programming languages (Python, Perl) for automation

Experience:
- 10+ years supporting Murex production environments with a focus on Datamart and reporting modules
- Proven expertise in resolving complex report issues, data discrepancies, and interface problems
- Demonstrated leadership, with experience managing or supporting L2/L3 teams
- Proven support experience in high-pressure environments, including escalations
- Industry experience within financial services, especially trading, risk, or accounting, is preferred
Alternative experience pathways: Extensive scripting, data support, and operational expertise supporting financial reports may qualify candidates with fewer years but equivalent depth of knowledge.

Day-to-Day Activities:
- Monitor system dashboards, reports, and logs for anomalies or failures
- Troubleshoot report data issues, interface failures, and system errors
- Lead incident investigations, perform root cause analysis, and document resolutions
- Collaborate with business units to clarify reporting needs and resolve discrepancies
- Support deployments, configuration changes, and upgrades affecting Report and Datamart modules
- Automate repetitive tasks, batch jobs, and data validation workflows
- Create and maintain documentation, runbooks, and best practices
- Conduct shift handovers, incident reviews, and process improvement sessions
- Proactively identify improvement opportunities in reporting reliability and performance

Qualifications:
- Bachelor's degree in Computer Science, Finance, Data Management, or a related discipline
- Strong expertise in SQL, shell scripting, and report troubleshooting
- Deep understanding of financial reporting: P&L, MV, Risk, and accounting data flows
- Support experience in high-availability, high-pressure settings
- Willingness to work shifts, including nights, weekends, or holidays as needed

Professional Competencies:
- Strong analytical and problem-solving skills for resolving complex issues
- Excellent communication skills for engaging with technical teams, business stakeholders, and vendors
- Leadership qualities to support and mentor support teams
- Ability to work independently and prioritize effectively under pressure
- Adaptability to evolving systems and technological environments
- Focus on continuous improvement and operational excellence
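To make "automating routine report checks" concrete, here is a minimal sketch of a post-EOD report sanity check in Python. The directory layout, thresholds, and column names (trade_id, pnl) are assumptions for illustration, not Murex's actual Datamart output; a real check would plug into the team's alerting/ticketing flow.

```python
#!/usr/bin/env python3
"""Minimal post-EOD report sanity check (illustrative sketch).

Assumed layout (not from the posting): report extracts land as CSV files
under /data/reports/<YYYY-MM-DD>/, with mandatory columns trade_id and pnl.
"""
import csv
import sys
from datetime import date
from pathlib import Path

REPORT_DIR = Path("/data/reports") / date.today().isoformat()  # hypothetical layout
MIN_ROWS = 1000                   # alert threshold; tune per report
MANDATORY = ("trade_id", "pnl")   # hypothetical mandatory columns

def check_report(path: Path) -> list:
    """Return a list of human-readable problems found in one extract."""
    problems = []
    with path.open(newline="") as fh:
        rows = list(csv.DictReader(fh))
    if len(rows) < MIN_ROWS:
        problems.append(f"{path.name}: only {len(rows)} rows (< {MIN_ROWS})")
    for col in MANDATORY:
        missing = sum(1 for r in rows if not (r.get(col) or "").strip())
        if missing:
            problems.append(f"{path.name}: {missing} rows with empty '{col}'")
    return problems

if __name__ == "__main__":
    extracts = sorted(REPORT_DIR.glob("*.csv"))
    if not extracts:
        sys.exit(f"ALERT: no extracts found in {REPORT_DIR}")
    issues = [p for f in extracts for p in check_report(f)]
    for issue in issues:
        print("ALERT:", issue)    # in practice, raise a ServiceNow/JIRA ticket here
    sys.exit(1 if issues else 0)
```

Scheduled after the EOD batch (e.g., from cron), a nonzero exit code is what the monitoring layer would turn into an incident.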
Posted 4 days ago
8.0 - 13.0 years
15 - 19 Lacs
Bengaluru
Work from Office
Project description
Our customer is a leading bank in Australia that provides a front-to-back integrated platform for straight-through processing and risk management. This is a multi-year initiative where different projects run concurrently under the program's various milestones. These streams include new product initiatives, new entity roll-outs, and regulatory compliance. We will have key roles in projects such as managing the scope, design, and delivery of requirements from front to back office with Excelian. We are looking for talented and ambitious people. The roles are in the respective Functional, Test Management, Development, Test Support, Environment Management, and Release teams. These units will collectively undertake the scoping, design, building, testing, and implementation phases to deliver the various program milestones. We are looking for an experienced technical business analyst for the core Treasury IT team to deliver projects for the bank's treasury division, with a focus on Commodities, FX, and MM products.

Responsibilities
The Senior Technical Business Analyst role looks after business engagement, functional requirements, solution design, and some system configuration for delivery of the migration projects. The role requires engagement with relevant business stakeholders for the initiatives in the approved scope, then working closely with the delivery team as well as relevant Technology partners to ensure timeliness and quality of delivery. The role is hence expected to have excellent business analysis abilities, as well as the ability to project-manage small to medium initiatives. This will involve leading the implementation of regional rollouts in parallel with other sub-streams. The role includes solution design and technical configuration of the Murex 3.1 platform in cooperation with other technical teams. Hands-on work on the application is required.

Skills
Must have:
- 8+ years of relevant Murex (and/or other primary trading system) Front Office experience
- Good/expert knowledge of at least IRD, FI, CRD, Commodities, and/or FXMM implementation on Murex
- Extensive experience in dealing with front-office trading & sales stakeholders in Markets or Treasury divisions
- Good hands-on knowledge of FO configuration: instruments, generators, curves, market data, market conventions, etc.
- Good understanding of FO modules: Pretrade workflow, Simulation screens, Simulation Viewer, eTradepad, P&L notepad, market operations, etc.
- Experience in the implementation of Murex 3.1 with regard to front office capabilities
Nice to have:
- Experience with MReport/Datamart, post-trade workflows, and interfaces
Other
Languages: English, C1 Advanced
Seniority: Senior
Posted 4 days ago
10.0 - 15.0 years
4 - 9 Lacs
Bengaluru
Work from Office
Req ID: 322003. We are currently seeking a Sr. ETL Developer to join our team in Bangalore, Karnataka (IN-KA), India (IN).
- Strong hands-on experience in SQL and PL/SQL (procedures, functions)
- Expert-level knowledge of ETL flows & jobs (ADF pipeline experience preferred)
- Experience with MS SQL Server (preferred), Oracle DB, PostgreSQL, MySQL
- Good knowledge of Data Warehouse/Data Mart concepts
- Good knowledge of data structures/models, integrity constraints, performance tuning, etc.
- Good knowledge of the Insurance domain (preferred)
Total Experience: 7-10 Yrs.
Posted 6 days ago
7.0 - 12.0 years
25 - 35 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Role: Data Modeler / Senior Data Modeler
Exp: 5 to 12 Yrs
Locations: Hyderabad, Pune, Bengaluru
Position: Permanent

Must-have skills:
- Strong SQL
- Strong Data Warehousing skills
- ER/Relational/Dimensional Data Modeling
- Data Vault Modeling (see the sketch after this posting)
- OLAP, OLTP
- Schemas & Data Marts

Good-to-have skills:
- Data Vault
- ERwin / ER Studio
- Cloud Platforms (AWS or Azure)
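Because the role lists Data Vault modeling alongside dimensional modeling, here is a minimal sketch of the two core Data Vault shapes, a hub and a satellite (link tables are omitted for brevity), using Python's stdlib sqlite3 as a stand-in database. All table and column names are illustrative.

```python
import hashlib
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Hub: one insert-only row per business key.
    CREATE TABLE hub_customer (
        customer_hk  TEXT PRIMARY KEY,   -- hash of the business key
        customer_id  TEXT UNIQUE,        -- business key
        load_dts     TEXT,
        record_src   TEXT
    );
    -- Satellite: descriptive attributes, historized by load timestamp.
    CREATE TABLE sat_customer_details (
        customer_hk  TEXT REFERENCES hub_customer(customer_hk),
        load_dts     TEXT,
        city         TEXT,
        segment      TEXT,
        PRIMARY KEY (customer_hk, load_dts)
    );
""")

def hk(business_key: str) -> str:
    """Deterministic hash key, as Data Vault conventionally derives it."""
    return hashlib.md5(business_key.encode()).hexdigest()

conn.execute("INSERT INTO hub_customer VALUES (?, ?, ?, ?)",
             (hk("C-42"), "C-42", "2024-01-01", "CRM"))
# Two satellite rows = the full attribute history for one customer.
conn.execute("INSERT INTO sat_customer_details VALUES (?, ?, ?, ?)",
             (hk("C-42"), "2024-01-01", "Pune", "Retail"))
conn.execute("INSERT INTO sat_customer_details VALUES (?, ?, ?, ?)",
             (hk("C-42"), "2024-06-01", "Mumbai", "Retail"))
print(conn.execute("SELECT COUNT(*) FROM sat_customer_details").fetchone()[0],
      "history rows for customer C-42")
```

The design point: hubs stay stable and insert-only, while history accumulates in satellites, which is what distinguishes Data Vault from a Kimball-style dimension.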
Posted 1 week ago
1.0 - 3.0 years
2 - 5 Lacs
Chennai
Work from Office
Create test case documents/plans for testing the data pipelines. Check the mappings for the fields that support data staging in the data marts, and the data type constraints of the fields present in Snowflake. Verify non-null fields are populated. Verify business requirements and confirm that the correct logic is implemented in the transformation layer of the ETL process. Verify stored procedure calculations and data mappings. Verify data transformations are correct based on the business rules. Verify successful execution of data loading workflows. A minimal sketch of the non-null check appears below.
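To make the "verify non-null fields are populated" step concrete, here is a minimal sketch assuming the snowflake-connector-python package. The table name (STG_ORDERS), column list, and connection details are placeholders; in practice they would come from the project's mapping document and test plan.

```python
import snowflake.connector  # pip install snowflake-connector-python

# Hypothetical target table and its mandatory (non-null) fields.
TABLE = "ANALYTICS.STAGING.STG_ORDERS"
MANDATORY_FIELDS = ["ORDER_ID", "CUSTOMER_ID", "ORDER_DATE"]

conn = snowflake.connector.connect(
    account="my_account",   # placeholder connection details
    user="qa_user",
    password="***",
    warehouse="QA_WH",
)
try:
    cur = conn.cursor()
    for col in MANDATORY_FIELDS:
        # Count rows where a mandatory field is missing; the expectation is 0.
        cur.execute(f"SELECT COUNT(*) FROM {TABLE} WHERE {col} IS NULL")
        null_count = cur.fetchone()[0]
        status = "PASS" if null_count == 0 else "FAIL"
        print(f"{status}: {TABLE}.{col} has {null_count} NULL rows")
finally:
    conn.close()
```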
Posted 1 week ago
1.0 - 3.0 years
3 - 6 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Locations: Pune/Bangalore/Hyderabad/Indore. Contract duration: 6 months.

Responsibilities
- Be responsible for the development of the conceptual, logical, and physical data models, and the implementation of RDBMS, operational data store (ODS), data marts, and data lakes on target platforms.
- Implement business and IT data requirements through new data strategies and designs across all data platforms (relational & dimensional - must; NoSQL - optional) and data tools (reporting, visualization, analytics, and machine learning).
- Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models.
- Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models.
- Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization.
- Hands-on modeling, design, configuration, installation, performance tuning, and sandbox POCs.
- Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks.
- Must have a Payments background.

Skills
- Hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, and NoSQL data platform technologies, and ETL and data ingestion protocols).
- Experience with data warehouse, data lake, and enterprise big data platforms in multi-data-center contexts required.
- Good knowledge of metadata management, data modeling, and related tools (Erwin, ER Studio, or others) required.
- Experience in team management, communication, and presentation.
- Experience with Erwin, Visio, or any other relevant tool.
Posted 1 week ago
4.0 - 8.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Primary:
- Strong experience in DW testing
- Strong experience in testing ETL jobs
- Experience in writing test scripts for Java/Python scripts
- Strong experience in writing complex SQL queries
- Strong understanding of data warehouse concepts and data mart testing, to ensure data integrity is not compromised
- Test case preparation and execution
- Transform complex business logic into SQL or PL/SQL queries (see the sketch after this posting)
- Defect tracking experience (e.g., AzDO, Jira, Rally)
- Exposure to large data sets and understanding of Data Quality Frameworks
- Automates applicable test cases for regression testing
- Good understanding of software testing methodologies, processes, and quality metrics
- Performs testing
- Programming skills in Java
- Identifies defects and works with the scrum team on resolution
- Experience in Selenium, SoapUI, or similar tools, or as stated in program tech stack requirements
- Accountable for overall test quality (functional & regression)
- Promotes an environment of collaboration within the test team
- Coordinates test planning and execution (FFT, SIT, CVT; performance tests, load tests, etc.)
- Provides reports related to product quality metrics
- Provides quality-related inputs on go/no-go for every release (incl. promotion to INT, CVT & prod environments)
- Attends scrum ceremonies
- Updates status in Rally on a daily basis
- Ensures test cases cover 100% of new functionality
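One common way to "transform complex business logic into SQL" for DW testing is a source-to-target reconciliation query wrapped in an automated regression test. A minimal, runnable sketch using Python's stdlib sqlite3 as a stand-in for the warehouse; the tables and the ETL rule under test are hypothetical:

```python
import sqlite3

# Stand-in warehouse: in a real suite this would be a connection to the DW.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INT, amount REAL, status TEXT);
    CREATE TABLE fact_sales (order_id INT, net_amount REAL);
    INSERT INTO src_orders VALUES (1, 100.0, 'SHIPPED'), (2, 50.0, 'CANCELLED'), (3, 75.0, 'SHIPPED');
    -- ETL rule under test: only SHIPPED orders load into the fact table.
    INSERT INTO fact_sales VALUES (1, 100.0), (3, 75.0);
""")

def test_only_shipped_orders_loaded():
    # Business rule expressed as SQL: expected vs actual counts/totals must match.
    expected = conn.execute(
        "SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM src_orders WHERE status = 'SHIPPED'"
    ).fetchone()
    actual = conn.execute(
        "SELECT COUNT(*), COALESCE(SUM(net_amount), 0) FROM fact_sales"
    ).fetchone()
    assert expected == actual, f"reconciliation failed: source={expected}, target={actual}"

test_only_shipped_orders_loaded()
print("reconciliation test passed")
```

Written as plain assert-based functions, such checks drop straight into a pytest-style regression suite and re-run on every load.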
Posted 1 week ago
6.0 - 10.0 years
2 - 5 Lacs
Chennai
Work from Office
Job Information
Job Opening ID: ZR_1999_JOB
Date Opened: 17/06/2023
Industry: Technology
Job Title: ETL Tester
Work Experience: 6-10 years
City: Chennai
Province: Tamil Nadu
Country: India
Postal Code: 600001
Number of Positions: 1

Create test case documents/plans for testing the data pipelines. Check the mappings for the fields that support data staging in the data marts, and the data type constraints of the fields present in Snowflake. Verify non-null fields are populated. Verify business requirements and confirm that the correct logic is implemented in the transformation layer of the ETL process. Verify stored procedure calculations and data mappings. Verify data transformations are correct based on the business rules. Verify successful execution of data loading workflows.
Posted 1 week ago
8.0 - 12.0 years
3 - 6 Lacs
Bengaluru
Work from Office
Job Information
Job Opening ID: ZR_2385_JOB
Date Opened: 23/10/2024
Industry: IT Services
Job Title: Data Modeller
Work Experience: 8-12 years
City: Bangalore South
Province: Karnataka
Country: India
Postal Code: 560066
Number of Positions: 1

Locations: Pune/Bangalore/Hyderabad/Indore. Contract duration: 6 months.

Responsibilities
- Be responsible for the development of the conceptual, logical, and physical data models, and the implementation of RDBMS, operational data store (ODS), data marts, and data lakes on target platforms.
- Implement business and IT data requirements through new data strategies and designs across all data platforms (relational & dimensional - must; NoSQL - optional) and data tools (reporting, visualization, analytics, and machine learning).
- Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models.
- Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models.
- Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization.
- Hands-on modeling, design, configuration, installation, performance tuning, and sandbox POCs.
- Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks.
- Must have a Payments background.

Skills
- Hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, and NoSQL data platform technologies, and ETL and data ingestion protocols).
- Experience with data warehouse, data lake, and enterprise big data platforms in multi-data-center contexts required.
- Good knowledge of metadata management, data modeling, and related tools (Erwin, ER Studio, or others) required.
- Experience in team management, communication, and presentation.
- Experience with Erwin, Visio, or any other relevant tool.
Posted 1 week ago
7.0 - 11.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Murex Front Office Finance
Good to have skills: Murex Back Office Workflows
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will play a crucial role in developing and implementing solutions that enhance business operations and drive efficiency.

Roles & Responsibilities:
- Key liaison with the Front Office user base, working with traders/end users to understand their requirements and provide timely solutions
- Hands-on knowledge of rate curves setup
- Hands-on knowledge of MX Market Risk
- Configure from scratch all FO modules: PreTrade, eTradepad, Events, Simulation, Market Data, etc.
- Perform detailed P&L and cash flow analysis, with understanding of RFR instruments post-Libor transformation
- Point of contact for all FO queries from the user side
- Train the traders/end users on Mx.3 FO functionality

Professional & Technical Skills:
- 8+ years of experience in the Murex system: Front Office modules of the Mx.3.1 platform
- Deep understanding of Treasury products like FX, MM, FI, IRS, and of Murex FO & risk modules
- Experience on scalable, resilient transaction processing systems in the financial markets
- Strong analytical & logical approach to problem solving & system development; trade lifecycle across FO, BO & MO tiers
- Perform requirements analysis in the FO space for various asset classes; initial analysis of existing production data/test case suites
- Analyse/understand product requirements & offer solutions/support to facilitate rollouts
- Know the FO business to design & build pricing/booking capabilities in the Murex system
- Participate with internal business partners on cross-functional projects to provide STP solutions for pricing, distribution & execution capabilities

Additional Information:
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.
Posted 1 week ago
7.0 - 11.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Murex Front Office Finance
Good to have skills: Murex Back Office Workflows, AEM 6
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will play a crucial role in developing and implementing solutions that enhance business operations and drive efficiency.

Roles & Responsibilities:
- Key liaison with the Front Office user base, working with traders/end users to understand their requirements and provide timely solutions
- Hands-on knowledge of rate curves setup
- Hands-on knowledge of MX Market Risk
- Configure from scratch all FO modules: PreTrade, eTradepad, Events, Simulation, Market Data, etc.
- Perform detailed P&L and cash flow analysis, with understanding of RFR instruments post-Libor transformation
- Point of contact for all FO queries from the user side
- Train the traders/end users on Mx.3 FO functionality

Professional & Technical Skills:
- 8+ years of experience in the Murex system: Front Office modules of the Mx.3.1 platform
- Deep understanding of Treasury products like FX, MM, FI, IRS, and of Murex FO & risk modules
- Experience on scalable, resilient transaction processing systems in the financial markets
- Strong analytical & logical approach to problem solving & system development; trade lifecycle across FO, BO & MO tiers
- Perform requirements analysis in the FO space for various asset classes; initial analysis of existing production data/test case suites
- Analyse/understand product requirements & offer solutions/support to facilitate rollouts
- Know the FO business to design & build pricing/booking capabilities in the Murex system
- Participate with internal business partners on cross-functional projects to provide STP solutions for pricing, distribution & execution capabilities

Additional Information:
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.
Posted 1 week ago
7.0 - 10.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Murex Front Office Finance
Good to have skills: Murex Back Office Workflows
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will play a crucial role in developing and implementing solutions that enhance business operations and drive efficiency.

Roles & Responsibilities:
- Key liaison with the Front Office user base, working with traders/end users to understand their requirements and provide timely solutions
- Hands-on knowledge of rate curves setup
- Hands-on knowledge of MX Market Risk
- Configure from scratch all FO modules: PreTrade, eTradepad, Events, Simulation, Market Data, etc.
- Perform detailed P&L and cash flow analysis, with understanding of RFR instruments post-Libor transformation
- Point of contact for all FO queries from the user side
- Train the traders/end users on Mx.3 FO functionality

Professional & Technical Skills:
- 8+ years of experience in the Murex system: Front Office modules of the Mx.3.1 platform
- Deep understanding of Treasury products like FX, MM, FI, IRS, and of Murex FO & risk modules
- Experience on scalable, resilient transaction processing systems in the financial markets
- Strong analytical & logical approach to problem solving & system development; trade lifecycle across FO, BO & MO tiers
- Perform requirements analysis in the FO space for various asset classes; initial analysis of existing production data/test case suites
- Analyse/understand product requirements & offer solutions/support to facilitate rollouts
- Know the FO business to design & build pricing/booking capabilities in the Murex system
- Participate with internal business partners on cross-functional projects to provide STP solutions for pricing, distribution & execution capabilities

Additional Information:
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.
Posted 1 week ago
4.0 - 9.0 years
6 - 11 Lacs
Chennai
Work from Office
Responsibility:
- Develop and set up the transformation of data from sources to enable analysis and decision making.
- Maintain data flow from source to the designated target without affecting the crucial data flow, and play a critical part in the data supply chain by ensuring stakeholders can access and manipulate data for routine and ad hoc analysis.
- Implement projects focused on collecting, aggregating, storing, reconciling, and making data accessible from disparate sources.
- Provide support during the full lifecycle of data, from ingestion through analytics to action.
- Analyze and organize raw data.
- Evaluate business needs and objectives.
- Interpret trends and patterns.
- Conduct complex data analysis and report on results.
- Coordinate with the source team and end users and develop solutions.
- Implement data governance policies and support data-versioning processes.
- Maintain security and data privacy.

Requirements
Must Have:
- Proven hands-on experience in building complex analytical queries in Teradata (see the sketch after this posting).
- 4+ years of extensive programming experience in Teradata Tools and Utilities.
- Hands-on experience in Teradata utilities such as FastLoad, MultiLoad, BTEQ, and TPT.
- Experience in data quality management and best practices across data solution implementations.
- Experience in development testing and deployment, coding standards, and best practices.
- Experience in preparing technical design documentation.
- Strong team collaboration and experience working with remote teams.
- Knowledge of data modelling and database management, such as performance tuning of the Enterprise Data Warehouse, Data Mart, and Business Intelligence Reporting environments, and support for the integration of those systems with other applications.

Good to have:
- Good Unix shell scripting skills.
- Experience in data transformation using ETL/ELT tools.
- Experience in different relational databases (e.g., Teradata, Oracle, PostgreSQL).
- Experience with CI/CD development and deployment tools (e.g., Maven, Jenkins, Git, Kubernetes).
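For orientation, here is a minimal sketch of running one such analytical query from Python, assuming Teradata's teradatasql driver (pip install teradatasql); the host, credentials, and the sales_fact table are placeholders. The QUALIFY clause and windowed aggregate are Teradata-idiomatic patterns this role would use daily.

```python
import teradatasql  # Teradata's official Python driver

# Placeholder connection details and table name.
with teradatasql.connect(host="tdhost.example.com", user="etl_user", password="***") as con:
    cur = con.cursor()
    # Rolling 7-day revenue per store, restricted to each store's latest 30 days.
    cur.execute("""
        SELECT store_id,
               sale_date,
               SUM(amount) OVER (PARTITION BY store_id
                                 ORDER BY sale_date
                                 ROWS BETWEEN 6 PRECEDING AND CURRENT ROW) AS rolling_7d
        FROM sales_fact
        QUALIFY ROW_NUMBER() OVER (PARTITION BY store_id ORDER BY sale_date DESC) <= 30
    """)
    for store_id, sale_date, rolling in cur.fetchall():
        print(store_id, sale_date, rolling)
```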
Posted 1 week ago
8.0 - 11.0 years
35 - 37 Lacs
Kolkata, Ahmedabad, Bengaluru
Work from Office
Dear Candidate,
We are hiring a Data Warehouse Architect to design scalable, high-performance data warehouse solutions for analytics and reporting. Perfect for engineers experienced with large-scale data systems.

Key Responsibilities:
- Design and maintain enterprise data warehouse architecture
- Optimize ETL/ELT pipelines and data modeling (star/snowflake schemas; see the sketch after this posting)
- Ensure data quality, security, and performance
- Work with BI teams to support analytics and reporting needs

Required Skills & Qualifications:
- Proficiency with SQL and data warehousing tools (Snowflake, Redshift, BigQuery, etc.)
- Experience with ETL frameworks (Informatica, Apache NiFi, dbt, etc.)
- Strong understanding of dimensional modeling and OLAP
- Bonus: Knowledge of cloud data platforms and orchestration tools (Airflow)

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager
Integra Technologies
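As a reminder of what star-schema modeling looks like in practice, here is a minimal, runnable sketch using Python's stdlib sqlite3 as a stand-in warehouse; the fact and dimension tables and their columns are illustrative, not from this posting.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Dimension tables: descriptive attributes, one row per member.
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, full_date TEXT, month TEXT);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, sku TEXT, category TEXT);

    -- Fact table: measures at the grain of one sale, with FKs to the dimensions.
    CREATE TABLE fact_sales (
        date_key    INTEGER REFERENCES dim_date(date_key),
        product_key INTEGER REFERENCES dim_product(product_key),
        quantity    INTEGER,
        revenue     REAL
    );

    INSERT INTO dim_date VALUES (20240101, '2024-01-01', '2024-01'), (20240102, '2024-01-02', '2024-01');
    INSERT INTO dim_product VALUES (1, 'SKU-A', 'Widgets'), (2, 'SKU-B', 'Gadgets');
    INSERT INTO fact_sales VALUES (20240101, 1, 3, 30.0), (20240102, 1, 1, 10.0), (20240102, 2, 2, 50.0);
""")

# Typical BI query: join facts to dimensions, aggregate by dimension attributes.
for row in conn.execute("""
    SELECT d.month, p.category, SUM(f.revenue) AS total_revenue
    FROM fact_sales f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.month, p.category
"""):
    print(row)
```

A snowflake schema differs only in normalizing the dimensions further (e.g., splitting category out of dim_product into its own table) at the cost of extra joins.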
Posted 1 week ago
5.0 - 8.0 years
9 - 14 Lacs
Bengaluru
Work from Office
ETL Data Engineer - Tech Lead
Bangalore, India | Information Technology | 16748

Overview
We are seeking a skilled and experienced Data Engineer to play a vital role in supporting data discovery, creating design documents, data ingestion/migration, creating data pipelines, creating data marts, and managing and monitoring data, using a tech stack of Azure, SQL, Python, PySpark, Airflow, and Snowflake.

Responsibilities
1. Data Discovery: Collaborate with source teams, gather complete details of data sources, and create a design diagram.
2. Data Ingestion/Migration: Collaborate with cross-functional teams to ingest/migrate data from various sources to the staging area. Develop and implement efficient data migration strategies, ensuring data integrity and security throughout the process.
3. Data Pipeline Development: Design, develop, and maintain robust data pipelines that extract, transform, and load (ETL) data from different sources into GCP. Implement data quality checks and ensure scalability, reliability, and performance of the pipelines (a minimal orchestration sketch follows this posting).
4. Data Management: Build and maintain data models and schemas, ensuring optimal storage, organization, and accessibility of data. Collaborate with the requirements team to understand their data requirements and provide solutions by creating data marts to meet their needs.
5. Performance Optimization: Identify and resolve performance bottlenecks within the data pipelines and data services. Optimize queries, job configurations, and data processing techniques to improve overall system efficiency.
6. Data Governance and Security: Implement data governance policies, access controls, and data security measures to ensure compliance with regulatory requirements and protect sensitive data. Monitor and troubleshoot data-related issues, ensuring high availability and reliability of data systems.
7. Documentation and Collaboration: Create comprehensive technical documentation, including data flow diagrams, system architecture, and standard operating procedures. Collaborate with cross-functional teams, analysts, and software engineers to understand their requirements and provide technical expertise.

Requirements
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Proven experience as a Data Engineer technical lead, with an output-driven focus.
- Strong knowledge and hands-on experience with Azure, SQL, Python, PySpark, Airflow, Snowflake, and related tools.
- Proficiency in data processing and pipeline development. Solid understanding of data modeling, database design, and ETL principles.
- Experience with data migration projects, including data extraction, transformation, and loading.
- Familiarity with data governance, security, and compliance practices.
- Strong problem-solving skills and ability to work in a fast-paced, collaborative environment.
- Excellent communication and interpersonal skills, with the ability to articulate technical concepts to non-technical stakeholders.
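Since the posting centers on Airflow-orchestrated pipelines, here is a minimal DAG sketch assuming Airflow 2.x; the DAG id, the extract/load callables, and the schedule are illustrative only, standing in for the real ingestion and Snowflake-load logic.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Placeholder: pull source data into the staging area.
    print("extracting from source")

def load():
    # Placeholder: load validated data into the warehouse / data mart.
    print("loading into Snowflake")

with DAG(
    dag_id="daily_ingest",             # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_load                # run extract before load
```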
Posted 2 weeks ago
7.0 - 8.0 years
10 - 14 Lacs
Bengaluru
Work from Office
As a BI Qliksense Developer with a Visualization focus, you will be instrumental in transforming raw data into compelling visual stories that empower our leadership team to make informed strategic decisions. You will leverage your deep expertise in Qlik Sense, data modeling, and business intelligence principles to deliver high-quality, user-centric solutions. Your ability to understand complex business requirements, design intuitive visualizations, and optimize performance will be crucial to your success. You will also contribute to data governance, security, and the overall health of our BI environment.

Responsibilities
- Design, develop, and maintain complex Qlik Sense reports, dashboards, and applications with a strong focus on delivering insights for Executive and Leadership stakeholders.
- Utilize advanced Qlik Sense features and scripting to create highly interactive and insightful visualizations.
- Implement On-Premises Qlik Sense architectures, ensuring scalability and stability.
- Configure and manage Qlik Sense Enterprise, including security rules and access controls.
- Develop and manage Qlik Sense extensions and integrations as needed.
- Perform Qlik Sense administration tasks, including user management, license allocation, and system monitoring.
- Implement and manage N-Printing for report distribution and scheduling.
- Configure and manage Qlik Sense Alerting to proactively notify stakeholders of critical data changes.
- Take a proactive and leading role in gathering detailed requirements from product owners, internal analysts, and business stakeholders, particularly focusing on the needs of Executive and Leadership users.
- Translate complex business requirements and acceptance criteria into effective and efficient Qlik Sense solutions.
Posted 3 weeks ago
6.0 - 9.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Primary Skills
- Good knowledge of and expertise in Datamart
- Knowledge of Datamart reporting, feeders, and batches of feeders
- Strong analytical and debugging ability; modifying and enhancing existing complex Datamart objects
- Good exposure to dynamic tables, pre- and post-filters, feeders, batches of feeders, extractions, reporting tables, and processing scripts
- Experience in simulation-based reports and risk-matrix-based reports; complex reports would be required
- Able to configure, execute, and troubleshoot batch reports in MXG
- Able to design and optimize the usage of dynamic tables
- Experience with Sybase or Oracle databases
- Experience in Datamart/EOD solution design and effort estimation, with limited support required
- Knowledgeable in Unix shell scripting
- Knowledge of and hands-on experience in implementing, developing, and supporting MX.3 End of Day processes and Trade Life Cycle Management using the workflow engine (at least 5 years)
- Experience leading the integration stream for an MX.3 implementation (Integration or Reporting stream), including leading and coordinating design sessions and sprint show sessions
- Strong knowledge of SQL/RDBMS technology

Secondary Skills
- Good to have knowledge of GOM definition and MxML development
- Experience with different asset classes, trade workflows, trade attributes, and financial and non-financial static data
- Good understanding of both exchange-traded and OTC derivatives, with specific focus on Credit and Rates products and clarity on their life cycle
- Understanding of Murex BO functionalities like confirmations and settlements

Skills (competencies)
Verbal Communication
Written Communication
Posted 3 weeks ago
2.0 - 5.0 years
2 - 4 Lacs
Mumbai, Mumbai Suburban, Mumbai (All Areas)
Work from Office
Role & responsibilities
3 to 4+ years of hands-on experience in SQL database design, data architecture, ETL, Data Warehousing, Data Mart, Data Lake, Big Data, Cloud, and Data Governance domains.
- Take ownership of the technical aspects of implementing data pipeline & migration requirements, ensuring that the platform is being used to its fullest potential through designing and building applications around business stakeholder needs.
- Interface directly with stakeholders to gather requirements and own the automated end-to-end data engineering solutions.
- Implement data pipelines to automate the ingestion, transformation, and augmentation of structured, unstructured, and real-time data, and provide best practices for pipeline operations (a minimal PySpark sketch follows this posting).
- Troubleshoot and remediate data quality issues raised by pipeline alerts or downstream consumers. Implement Data Governance best practices.
- Create and maintain clear documentation on data models/schemas as well as transformation/validation rules.
- Implement tools that help data consumers to extract, analyze, and visualize data faster through data pipelines.
- Implement data security, privacy, and compliance protocols to ensure safe data handling in line with regulatory requirements.
- Optimize data workflows and queries to ensure low latency, high throughput, and cost efficiency.
- Lead the entire software lifecycle, including hands-on development, code reviews, testing, deployment, and documentation for batch ETLs.
- Work directly with our internal product/technical teams to ensure that our technology infrastructure is seamlessly and effectively integrated.
- Migrate current data applications & pipelines to the Cloud, leveraging new technologies in the future.

Preferred candidate profile
- Graduate with an Engineering Degree (CS/Electronics/IT) / MCA / MCS or equivalent, with substantial data engineering experience.
- 3+ years of recent hands-on experience with a modern programming language (Scala, Python, Java) is required; Spark/PySpark is preferred.
- Experience with configuration management and version control apps (e.g., Git) and experience working within a CI/CD framework is a plus.
- 3+ years of recent hands-on SQL programming experience in a Big Data environment is required.
- Working knowledge of PostgreSQL, RDBMS, NoSQL, and columnar databases.
- Experience developing and maintaining ETL applications and data pipelines using big data technologies is required; Apache Kafka, Spark, and Airflow experience is a must.
- Knowledge of API and microservice integration with applications.
- Experience with containerization (e.g., Docker) and orchestration (e.g., Kubernetes).
- Experience building data solutions for Power BI and Web visualization applications.
- Experience with Cloud is a plus.
- Experience in managing multiple projects and stakeholders, with excellent communication and interpersonal skills.
- Ability to develop and organize high-quality documentation.
- Superior analytical skills and a strong sense of ownership in your work.
- Collaborate with data scientists on several projects. Contribute to development and support of analytics, including AI/ML.
- Ability to thrive in a fast-paced environment and to manage multiple, competing priorities simultaneously.
- Prior Energy & Utilities industry experience is a big plus.

Experience (Min. - Max. in yrs.): 3+ years of core/relevant experience
Location: Mumbai (Onsite)
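As a flavor of the batch-ETL work described above, here is a minimal PySpark ingestion-and-transformation sketch; the file paths, the meter_readings source, and its columns (meter_id, reading_ts, reading_kwh) are placeholders, loosely themed on the Energy & Utilities domain the posting mentions.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("batch_etl_sketch").getOrCreate()

# Ingest a raw extract (placeholder path; schema inference kept simple for brevity).
raw = spark.read.option("header", True).csv("/data/raw/meter_readings.csv")

cleaned = (
    raw
    .withColumn("reading_kwh", F.col("reading_kwh").cast("double"))
    .filter(F.col("reading_kwh").isNotNull())       # basic data quality gate
    .withColumn("reading_date", F.to_date("reading_ts"))
)

# Aggregate to the grain downstream consumers need, then persist columnar.
daily = cleaned.groupBy("meter_id", "reading_date").agg(
    F.sum("reading_kwh").alias("total_kwh")
)
daily.write.mode("overwrite").partitionBy("reading_date").parquet("/data/curated/daily_usage")
```

Partitioning the output by date keeps downstream reads (Power BI extracts, ad hoc SQL) scanning only the days they need.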
Posted 3 weeks ago
7.0 - 8.0 years
9 - 14 Lacs
Bengaluru
Work from Office
About The Role
As a BI Qliksense Developer with a Visualization focus, you will be instrumental in transforming raw data into compelling visual stories that empower our leadership team to make informed strategic decisions. You will leverage your deep expertise in Qlik Sense, data modeling, and business intelligence principles to deliver high-quality, user-centric solutions. Your ability to understand complex business requirements, design intuitive visualizations, and optimize performance will be crucial to your success. You will also contribute to data governance, security, and the overall health of our BI environment.

Responsibilities
- Design, develop, and maintain complex Qlik Sense reports, dashboards, and applications with a strong focus on delivering insights for Executive and Leadership stakeholders.
- Utilize advanced Qlik Sense features and scripting to create highly interactive and insightful visualizations.
- Implement On-Premises Qlik Sense architectures, ensuring scalability and stability.
- Configure and manage Qlik Sense Enterprise, including security rules and access controls.
- Develop and manage Qlik Sense extensions and integrations as needed.
- Perform Qlik Sense administration tasks, including user management, license allocation, and system monitoring.
- Implement and manage N-Printing for report distribution and scheduling.
- Configure and manage Qlik Sense Alerting to proactively notify stakeholders of critical data changes.
- Take a proactive and leading role in gathering detailed requirements from product owners, internal analysts, and business stakeholders, particularly focusing on the needs of Executive and Leadership users.
- Translate complex business requirements and acceptance criteria into effective and efficient Qlik Sense solutions.
- Apply expert knowledge of Business Intelligence, Data Modeling (dimensional and relational), and Data Warehousing concepts to design robust Qlik Sense data models.
- Build efficient BI data models within Qlik Sense, ensuring data accuracy, performance, and usability.
- Implement and enforce data security and data governance policies within the Qlik Sense environment.
- Manage data integrations and data preparation within Qlik Sense, leveraging various data sources.
- Possess a strong understanding of traditional data warehouses and data marts and how they integrate with Qlik Sense.
- Demonstrate expert knowledge of authoring and troubleshooting complex SQL queries to extract and prepare data for Qlik Sense.
- Optimize SQL queries for performance and efficiency.

Connectivity, Data Modeling & Source Data Procurement
- Exhibit expert knowledge and application of general and tool-specific connectivity methods to various data sources (databases, flat files, APIs, etc.).
- Configure and manage Qlik Sense security settings to ensure appropriate data access and governance.
- Apply expert-level performance tuning techniques within Qlik Sense applications and the overall environment.
- Demonstrate expert knowledge and application of using the right visuals, optimizing layout for clarity and impact, ensuring mobility and responsiveness, and integrating advanced analytics into Qlik Sense.
- Design intuitive and user-friendly dashboards tailored for Executive and Leadership consumption.
- Apply strong business and data analysis skills to understand underlying data and business processes.
- Possess a working understanding of trading concepts, particularly Market Risk (added advantage).
- Maintain knowledge of upcoming trends in BI tools and technology, particularly within the Qlik ecosystem.
- Demonstrate familiarity with user-centered design and testing methodologies.
- Address usability and accessibility concerns in Qlik Sense development.
- Possess strong communication skills to effectively engage with technical developers, architects, and business stakeholders, including Executive and Leadership teams.
- Exhibit good project management and organizational skills to manage development tasks and timelines effectively.
- Be self-motivated, proactive in taking initiative, and innovative in finding solutions.
- Demonstrate the ability to drive and mentor junior team members.
- Must have experience working with AGILE and KANBAN methodologies.
- Be capable of running and facilitating sprint activities.

Requirements
- Minimum of 7 years of overall experience in the IT industry.
- Expert-level knowledge and extensive experience in Qlik Sense development and administration.
- Proven experience implementing On-Premises Qlik Sense architectures.
- Hands-on experience with N-Printing and Qlik Sense Alerting.
- Demonstrated experience in designing and developing Qlik Sense reports and dashboards specifically for Executive and Leadership use cases.
- Expert knowledge and practical application of data visualization best practices.
- Expert knowledge of Business Intelligence, Data Modeling, and Data Warehousing principles.
- Proven ability to gather and translate business requirements into technical solutions.
- Expert experience in building BI data models, implementing data security, and adhering to data governance policies.
- Experience with data integrations and data preparation within Qlik Sense.
- Expert knowledge of authoring and troubleshooting complex SQL queries.
- Proven ability to perform performance tuning on Qlik Sense applications and environments.
- Excellent verbal and written communication skills.
- Good project management and organizational skills.
- Experience working with AGILE and KANBAN methodologies.
- Self-motivated with a strong ability to take initiative.

Location: Bangalore
Notice Period: Immediate Joiner
Posted 3 weeks ago
5.0 - 10.0 years
12 - 22 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Job Title:
=======
Senior MS BI Developer

Onsite Location:
=============
Dubai, UAE
Doha, Qatar
Riyadh, Saudi Arabia

Onsite Monthly Salary:
==============
10k AED - 15k AED - fully tax-free salary, depending on experience
Gulf work permit will be sponsored by our client

Project duration:
=============
2 years, extendable

Desired Experience Level Needed:
===========================
5 - 10 Years

Qualification:
==========
B.Tech / M.Tech / MCA / M.Sc or equivalent

Experience Needed:
===============
Overall: 5 or more years of total IT experience
Solid 3+ years of experience as an MS BI Developer with the Microsoft stack / MS DWH Engineer

Job Responsibilities:
================
- Design and develop DWH data flows
- Able to build SCD-1 / SCD-2 / SCD-3 dimensions (see the SCD-2 sketch after this posting)
- Build cubes
- Maintain SSAS / DWH data
- Design Microsoft DWH & its ETL packages
- Able to code T-SQL
- Able to create orchestrations
- Able to design batch job / orchestration runs
- Familiarity with data models
- Able to develop MDM (Master Data Management)

Experience:
================
- Experience as a DWH developer with Microsoft DWH data flows and cubes
- Exposure and experience with Azure services, including Azure Data Factory
- Sound knowledge of BI practices and visualization tools such as Power BI / SSRS / QlikView
- Collecting/gathering data from multiple source systems
- Creating automated data pipelines
- Configuring Azure resources and services

Skills:
================
- Microsoft SSIS / SSAS / SSRS
- Informatica
- Azure Data Factory
- Spark
- SQL

Nice to have:
==========
- Any onsite experience is an added advantage, but not mandatory
- Microsoft certifications are an added advantage

Business Vertical:
==============
- Banking / Investment Banking
- Capital Markets
- Securities / Stock Market Trading
- Bonds / Forex Trading
- Credit Risk
- Payments Cards Industry (VISA / MasterCard / Amex)

Job Code:
======
MSBI_DEVP_0525

No. of positions:
============
05

Email:
=====
spectrumconsulting1977@gmail.com

If you are interested, please email your CV as an ATTACHMENT with the job ref. code [ MSBI_DEVP_0525 ] as the subject.
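Since the role calls for building SCD-1/2/3 dimensions, here is a minimal Type-2 (history-preserving) upsert sketch. It uses Python's stdlib sqlite3 as a stand-in; in a real Microsoft stack this logic would typically live in an SSIS package or a T-SQL MERGE, and the dim_customer columns are illustrative.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (
        customer_id INT,    -- business key
        city        TEXT,   -- tracked attribute
        valid_from  TEXT,
        valid_to    TEXT,   -- NULL = current version
        is_current  INT
    );
    INSERT INTO dim_customer VALUES (42, 'Pune', '2023-01-01', NULL, 1);
""")

def scd2_upsert(customer_id, new_city, as_of):
    """Close the current row if the attribute changed, then insert a new version."""
    cur = conn.execute(
        "SELECT city FROM dim_customer WHERE customer_id = ? AND is_current = 1",
        (customer_id,),
    ).fetchone()
    if cur and cur[0] == new_city:
        return  # no change: SCD-2 keeps the existing current row
    conn.execute(
        "UPDATE dim_customer SET valid_to = ?, is_current = 0 "
        "WHERE customer_id = ? AND is_current = 1",
        (as_of, customer_id),
    )
    conn.execute(
        "INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
        (customer_id, new_city, as_of),
    )

scd2_upsert(42, "Mumbai", "2024-06-01")   # a city change creates a second version
for row in conn.execute("SELECT * FROM dim_customer ORDER BY valid_from"):
    print(row)
```

For contrast: SCD-1 would simply overwrite the city in place (no history), and SCD-3 would keep the prior value in a dedicated previous_city column.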
Posted 3 weeks ago
9 - 11 years
37 - 40 Lacs
Ahmedabad, Bengaluru, Mumbai (All Areas)
Work from Office
Dear Candidate,
We are hiring a Data Engineer to build scalable data pipelines and infrastructure to power analytics and machine learning. Ideal for those passionate about data integrity, automation, and performance.

Key Responsibilities:
- Design ETL/ELT pipelines using tools like Airflow or dbt
- Build data lakes and warehouses (BigQuery, Redshift, Snowflake)
- Automate data quality checks and monitoring
- Collaborate with analysts, data scientists, and backend teams
- Optimize data flows for performance and cost

Required Skills & Qualifications:
- Proficiency in SQL, Python, and distributed systems (e.g., Spark)
- Experience with cloud data platforms (AWS, GCP, or Azure)
- Strong understanding of data modeling and warehousing principles
- Bonus: Experience with Kafka, Parquet/Avro, or real-time streaming

Soft Skills:
- Strong troubleshooting and problem-solving skills.
- Ability to work independently and in a team.
- Excellent communication and documentation skills.

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager
Integra Technologies
Posted 4 weeks ago
10 - 15 years
30 - 35 Lacs
Noida
Remote
SR. DATA MODELER | FULL-TIME ROLE | REMOTE OR ONSITE

Job Summary:
We are seeking an experienced Data Modeler to support the Enterprise Data Platform (EDP) initiative, focusing on building and optimizing curated data assets on Google BigQuery. This role requires expertise in data modeling, strong knowledge of retail data, and an ability to collaborate with data engineers, business analysts, and architects to create scalable and high-performing data structures.

Required Qualifications:
- 5+ years of experience in data modeling and architecture in cloud data platforms (BigQuery preferred).
- Expertise in dimensional modeling (Kimball), data vault, and normalization/denormalization techniques.
- Strong SQL skills, with hands-on experience in BigQuery performance tuning (partitioning, clustering, query optimization; see the sketch after this posting).
- Understanding of retail data models (e.g., sales, inventory, pricing, supply chain, customer analytics).
- Experience working with data engineering teams to implement models in ETL/ELT pipelines.
- Familiarity with data governance, metadata management, and data cataloging.
- Excellent communication skills and ability to translate business needs into structured data models.

Key Responsibilities:
1. Data Modeling & Curated Layer Design
- Design logical, conceptual, and physical data models for the EDP's curated layer in BigQuery.
- Develop fact and dimension tables, ensuring adherence to dimensional modeling best practices (Kimball methodology).
- Optimize data models for performance, scalability, and query efficiency in a cloud-native environment.
- Work closely with data engineers to translate models into efficient BigQuery implementations (partitioning, clustering, materialized views).
2. Data Standardization & Governance
- Define and maintain data definitions, relationships, and business rules for curated assets.
- Ensure data integrity, consistency, and governance across datasets.
- Work with Data Governance teams to align models with enterprise data standards and metadata management policies.
3. Collaboration with Business & Technical Teams
- Engage with business analysts and product teams to understand data needs, ensuring models align with business requirements.
- Partner with data engineers and architects to implement best practices for data ingestion and transformation.
- Support BI & analytics teams by ensuring curated models are optimized for downstream consumption (e.g., Looker, Tableau, Power BI, AI/ML models, APIs).

Please share the following details along with your most updated resume to geeta.negi@compunnel.com if you are interested in the opportunity:
- Total experience
- Relevant experience
- Current CTC
- Expected CTC
- Notice period (last working day if you are serving the notice period)
- Current location
- Skill 1: rating out of 5; Skill 2: rating out of 5; Skill 3: rating out of 5 (mention the skill)
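For reference, a minimal sketch of the kind of partitioned, clustered curated table this role would design, using the google-cloud-bigquery client library; the project, dataset, table, and column names are illustrative placeholders.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client()  # uses application-default credentials

# Illustrative curated fact table: date-partitioned and clustered so that
# store/SKU-filtered queries scan only the relevant partitions and blocks.
ddl = """
CREATE TABLE IF NOT EXISTS `my-project.curated.fact_sales` (
    order_ts  TIMESTAMP NOT NULL,
    store_id  STRING,
    sku       STRING,
    quantity  INT64,
    revenue   NUMERIC
)
PARTITION BY DATE(order_ts)
CLUSTER BY store_id, sku
"""
client.query(ddl).result()  # .result() blocks until the DDL job completes
print("curated.fact_sales created")
```

Choosing the partition column to match the dominant filter (usually the event date) and clustering on the next most selective keys is the core of BigQuery cost and performance tuning.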
Posted 1 month ago
6 - 11 years
14 - 24 Lacs
Bengaluru
Remote
Role & responsibilities

MUST HAVE knowledge:
- Data Warehousing (Redshift, Azure Synapse)
- Databases: Oracle, SQL Server, etc.
- Data Modeling (Erwin/DBVm)
- Data lake
- Data mart
- SQL
Good to have: NoSQL (Cosmos/Dynamo/Mongo)

JD for Data Modeler
The data modeler designs, implements, and documents data architecture and data modeling solutions, which include the use of relational, dimensional, and NoSQL databases. These solutions support enterprise information management, business intelligence, machine learning, data science, and other business interests.

The successful candidate will:
- Be responsible for the development of the conceptual, logical, and physical data models, and the implementation of RDBMS, operational data store (ODS), data marts, and data lakes on target platforms (SQL/NoSQL).
- Oversee and govern the expansion of existing data architecture and the optimization of data query performance via best practices.
The candidate must be able to work independently and collaboratively.

Responsibilities
- Implement business and IT data requirements through new data strategies and designs across all data platforms (relational, dimensional, and NoSQL) and data tools (reporting, visualization, analytics, and machine learning).
- Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models.
- Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models.
- Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization.
- Hands-on modeling, design, configuration, installation, performance tuning, and sandbox POCs.
- Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks.

Skills
- Bachelor's or Master's degree in computer/data science, or related technical experience.
- 5+ years of hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, NoSQL data platform technologies, and ETL and data ingestion protocols).
- Experience with data warehouse, data lake, and enterprise big data platforms in multi-data-center contexts required.
- Good knowledge of metadata management, data modeling, and related tools (Erwin or ER Studio or others) required.
- Experience in team management, communication, and presentation.
Posted 1 month ago
5 - 7 years
15 - 19 Lacs
Bengaluru
Work from Office
Murex
Responsibility:
- Multi-skilled across Datamart and MxML; not necessarily very experienced, with a general skillset
- Write transformation logic from the source data format to a Murex-understandable format (MxML)
- Create MxML import and export workflows using MxML Exchange
- Update and create Datamart reports, including managing EOD changes
- Build a reconciliation process across source and destination
- Configure messaging queues for real-time interfacing
- Document functional and technical specifications and test cases for integration

Skills
Must have:
- Murex knowledge of around 5+ years on MxML Exchange, Contract, or Deliverable workflows
- Good exposure to writing/coding MxML formulae
- Has previously developed interfaces (deals, static data) via Murex, both upstream and downstream
- Knowledge of XML transformations, document and template generation from MxML
- Experience and knowledge of Datamart report development and EOD processing
- Knowledge of various tasks in MxML and how they work

Nice to have:
- Murex Dev priority is MxML with exposure to Datamart; mid/senior level
- Knowledge around SWIFT message generation: MT300, MT305, MT540, MT202, MT103
- DevOps on Murex experience (Git, Jenkins, JIRA, etc.)
- Technical solution design experience and start-to-end solution ownership
- Experience with Interest Rate Derivatives and FX Derivatives
- Datamart nice to have
Posted 2 months ago