6.0 - 10.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Data Modeler at our company, you will play a crucial role in designing conceptual, logical, and physical models for Azure Databricks and Azure Data Lake to support structured, semi-structured, and unstructured data. Your responsibilities will include:
- Utilizing your 6+ years of experience in data modeling, with a preference for insurance industry datasets such as policies, claims, customer, or actuarial data.
- Demonstrating advanced skills in data modeling tools like Erwin, ER/Studio, PowerDesigner, or Microsoft Visio, and version control using GitHub.
- Applying a deep understanding of relational, dimensional, and data lake modeling techniques optimized for Databricks/Spark-based processing.
- Modeling and documenting metadata, reference data, and master data with Informatica to support robust data governance and quality.
- Utilizing strong SQL and Spark skills for data profiling, validation, and prototyping in Databricks environments.
- Ensuring compliance with regulatory requirements for insurance data, such as IFRS 17 and Solvency II.

Regarding the company, Virtusa values teamwork, quality of life, and professional development. With a global team of 27,000 professionals, we are dedicated to providing exciting projects, opportunities, and cutting-edge technologies to support your career growth. At Virtusa, we foster a collaborative team environment that encourages new ideas and excellence. Join us to unleash your potential and contribute to our dynamic work culture.
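The data profiling described above — null rates, distinct counts, duplicate-key checks before dimensional modeling — can be sketched with plain pandas (a minimal illustration; the table and column names are hypothetical, and in Databricks the same aggregations would run as Spark DataFrame operations):

```python
import pandas as pd

# Hypothetical policy extract with one missing premium, one missing region,
# and a duplicated natural key.
policies = pd.DataFrame({
    "policy_id": ["P1", "P2", "P2", "P4"],
    "premium": [1200.0, None, 950.0, 800.0],
    "region": ["south", "south", "north", None],
})

# Per-column profile: null rate, distinct count, inferred type.
profile = pd.DataFrame({
    "null_rate": policies.isna().mean(),
    "distinct": policies.nunique(),
    "dtype": policies.dtypes.astype(str),
})
print(profile)

# Duplicate-key check, a common validation before building a dimension table.
dupes = policies["policy_id"].duplicated().sum()
print("duplicate policy_ids:", dupes)  # → 1
```

The same checks scale to Spark by swapping pandas aggregations for `df.agg(...)` calls; the logic is unchanged.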
Posted 1 day ago
5.0 - 9.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
Role Overview: You will be joining our dedicated software design team as a BI Developer. Reporting directly to the Technical Manager, your role will involve contributing to all aspects of software coding and design. Collaboration with Business Analysts will be crucial as you translate design specifications into application, dashboard, or reporting solutions. This position offers the opportunity to work on projects with a global impact and be part of a hardworking team.

Key Responsibilities:
- Develop visual reports, dashboards, and KPI scorecards using Power BI Desktop and Power BI Service.
- Create writeback tools using Power Apps and Power Automate for various business requirements.
- Connect to data sources, import data, and transform data for Business Intelligence purposes.
- Develop tabular and multidimensional models in compliance with warehouse standards.
- Integrate Power BI reports into other applications using embedded analytics and API automation.
- Implement row-level security on data and understand application security layer models in Power BI.
- Collaborate with business professionals, application developers, and technical staff within an agile process environment to successfully implement solutions.

Qualifications Required:
- Bachelor's degree in computer science or related fields.
- 5+ years of experience in the Power toolkit (Power BI, Power Apps, and Power Automate).
- 5+ years of experience in scripting languages like DAX and Python.
- Proven working knowledge of T-SQL, stored procedures, and database performance tuning.
- Experience in Databricks, Big Data, and Gen AI technologies is a plus.
- Excellent UI design skills and hands-on experience in designing and developing entities in Power Apps.
- Expertise in data modeling, prototyping, performance tuning, and data analysis techniques.
- Ability to quickly learn new software and technologies and thrive in a fast-paced environment.
Posted 1 day ago
12.0 - 16.0 years
0 Lacs
Maharashtra
On-site
You are a strategic thinker passionate about driving solutions in BI and Analytics (Alteryx, SQL, Tableau), and you have found the right team. As a BI Developer Senior Associate within the Asset and Wealth Management Finance Transformation and Analytics team, you will spend each day defining, refining, and delivering set goals for our firm.

**Key Responsibilities:**
- Design the technical and information architecture for the MIS (DataMarts) and reporting environments.
- Focus on data modeling and database design for the AWM LOB.
- Support the MIS team in query optimization and deployment of BI technologies including, but not limited to, Alteryx, Tableau, Databricks, MS SQL Server (T-SQL programming)/SSIS, and SSRS.
- Design and develop complex dashboards from large and/or disparate data sets.
- Scope, prioritize, and coordinate activities with the product owners.
- Partner with technology teams to identify solutions required to establish a robust MIS environment.
- Design and develop complex queries that feed data into the dashboards/reports from large data sets.
- Work on agile improvements by sharing experiences and knowledge with the team; advocate for and steer the team to implement a CI/CD (DevOps) workflow.

Overall, the ideal candidate for this position will be highly skilled in reporting methodologies, data manipulation and analytics tools, and the visualization and presentation of enterprise data.

**Qualifications Required:**
- Bachelor's degree in MIS, Computer Science, or Engineering. Other fields of study with significant professional experience in BI development are acceptable.
- 12+ years of experience in data warehousing, ETL, and visualization.
- Strong work experience in data wrangling tools like Alteryx.
- Working proficiency in data visualization tools; experience with BI technologies including Alteryx, Tableau, MS SQL Server (SSIS, SSRS), Databricks, and ThoughtSpot.
- Working knowledge of querying data from databases such as MS SQL Server, Snowflake, Databricks, etc.
- Strong knowledge of designing database architecture and building scalable visualization solutions; ability to write complicated yet efficient SQL queries and stored procedures.
- Experience in building end-to-end ETL processes.
- Experience in working with multiple data sources and handling large volumes of data.
- Experience in converting data into information.
- Experience in the end-to-end implementation of Business Intelligence (BI) reports and dashboards.
- Good communication and analytical skills.
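The "complicated yet efficient SQL" called for above typically leans on window functions; a small self-contained sketch using SQLite (the table and column names are hypothetical, invented for illustration):

```python
import sqlite3

# Hypothetical AUM-by-desk table; the window function ranks desks within
# each region by assets -- the shape of query a dashboard feed might use.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE aum (region TEXT, desk TEXT, assets REAL)")
con.executemany("INSERT INTO aum VALUES (?, ?, ?)", [
    ("EMEA", "credit", 120.0), ("EMEA", "rates", 200.0),
    ("APAC", "credit", 80.0),  ("APAC", "rates", 60.0),
])

rows = con.execute("""
    SELECT region, desk, assets,
           RANK() OVER (PARTITION BY region ORDER BY assets DESC) AS rnk
    FROM aum
    ORDER BY region, rnk
""").fetchall()
for r in rows:
    print(r)
```

A single windowed query like this replaces a self-join-plus-GROUP BY pattern and usually scans the table once, which is the efficiency win the posting alludes to.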
Posted 1 day ago
2.0 - 6.0 years
0 Lacs
Maharashtra
On-site
In the Revenue and Operations department, your mission is to promote, sell, onboard, and service the products built by the technical teams and departments. This involves focusing on a 360-degree commercial strategy to bring new products to market, promote existing products, and maximize client usage and experience. As a BI/Data Analyst, your role is crucial in bridging the gap between business stakeholders and the development team. Your responsibilities include gathering and analyzing business requirements, translating them into technical specifications, and ensuring data quality for effective reporting processes. You will collaborate closely with data engineers and BI developers to establish a smooth development and deployment process. Additionally, you will play a key role in assuring the data quality of BI reports.

Key Responsibilities:
- Work closely with other members of the BI (Business Intelligence) team to stay aligned and share knowledge.
- Turn data into useful insights to support better decision-making within the company.
- Build and manage BI tools and reports to meet business needs.
- Communicate with global stakeholders to understand their requirements and share findings.
- Utilize strong Tableau skills to create clear and helpful data visualizations.
- Communicate effectively with both technical and non-technical individuals.

Qualifications Required:
- 2 to 4 years of proven experience working as an individual contributor in the field of business intelligence.
- Excellent communication skills.
- Ability to analyze business requirements and translate them into data solutions.
- Proficiency in Tableau (Desktop, Prep, Server, and Cloud).
- Strong analytical and problem-solving skills with the ability to work with large datasets.
- Proficiency in SQL for querying and manipulating data.
- Basic understanding of data modeling concepts and best practices.
- Attention to detail and commitment to delivering high-quality, accurate results.

In your role, you will have the opportunity to cooperate closely with other BI team members, contribute to data-driven decision-making, and work with global stakeholders to transform complex data into meaningful insights. Your attitude and commitment to delivering high-quality results will be key to success in this role. At Adform, we offer growth opportunities, an informal work environment, premium health insurance, generous vacation days, paid maternity and paternity leave, an annual learning budget, rewarding referral programs, global perks, and more. We are committed to diversity and inclusion, creating an environment where employees feel valued and free from discrimination. Join us at Adform to explore a dynamic, inspiring, and international work experience.
Posted 1 day ago
1.0 - 5.0 years
0 Lacs
Bangalore, Karnataka
On-site
At Goldman Sachs, as an Engineer, you play a crucial role in making things possible by connecting people and capital with ideas, solving challenging engineering problems, and leveraging technology to turn data into action. Join our engineering teams to build scalable software, architect infrastructure solutions, guard against cyber threats, and explore a world of opportunity in the fast-paced financial markets environment. As a Site Reliability Engineer (SRE) on the Data Engineering team at Goldman Sachs, you will be responsible for ensuring observability, cost management, and capacity planning for some of the largest data platforms. Your role involves engaging in the full lifecycle of platforms, from design to decommissioning, with a tailored SRE strategy throughout.

**Key Responsibilities:**
- Drive adoption of cloud technology for data processing and warehousing
- Develop SRE strategy for large platforms like Lakehouse and Data Lake
- Collaborate with data consumers and producers to meet reliability and cost requirements
- Lead strategic initiatives with a focus on data
- Utilize technologies such as Snowflake, AWS, Grafana, PromQL, Python, Java, OpenTelemetry, and GitLab

**Qualifications Required:**
- Bachelor's or Master's degree in a computational field (Computer Science, Applied Mathematics, Engineering, or a related discipline)
- 1-4+ years of work experience in a team-focused environment
- 1-2 years of hands-on developer experience
- Understanding of and experience in DevOps and SRE principles, automation, and managing technical and operational risk
- Familiarity with cloud infrastructure (AWS, Azure, or GCP)
- Proven track record in driving strategy with data
- Proficiency in data curation, data quality, relational and columnar SQL databases, data warehousing concepts, and data modeling
- Excellent communication skills and ability to collaborate with subject matter experts
- Strong analytical and problem-solving skills, and a sense of ownership
- Ability to build partnerships and drive quantifiable commercial impact

**Additional Company Details:** Goldman Sachs is dedicated to providing clean, organized, and impactful data to empower its core businesses. The Data Engineering group focuses on offering the platform, processes, and governance necessary to scale and streamline data for all business units. As an Engineer at Goldman Sachs, you will have the opportunity to innovate, adapt to changes, and thrive in a dynamic global environment.
Posted 1 day ago
7.0 - 11.0 years
0 Lacs
Haryana
On-site
As a Lead Data Engineer at Srijan Technologies PVT LTD, you will play a crucial role in designing and developing scalable data pipelines within Microsoft Fabric. Your responsibilities will include:
- Designing and Developing Data Pipelines: Develop and optimize scalable data pipelines within Microsoft Fabric using Fabric-based notebooks, Dataflows Gen2, Data Pipelines, and Lakehouse architecture. Build robust pipelines for batch and real-time processing. Integrate with Azure Data Factory or Fabric-native orchestration for seamless data movement.
- Microsoft Fabric Architecture: Implement scalable, governed data architectures within OneLake and Microsoft Fabric's unified compute and storage platform. Ensure alignment with business needs while promoting performance, security, and cost-efficiency.
- Data Pipeline Optimization: Continuously monitor, enhance, and optimize Fabric pipelines, notebooks, and Lakehouse artifacts for performance, reliability, and cost. Implement best practices for managing large-scale datasets and transformations.
- Collaboration with Cross-functional Teams: Work closely with analysts, BI developers, and data scientists to gather requirements and deliver high-quality datasets. Enable self-service analytics via certified and reusable Power BI datasets connected to Fabric Lakehouses.
- Documentation and Knowledge Sharing: Maintain clear documentation for all data pipelines, semantic models, and data products. Share knowledge of Fabric best practices and mentor junior team members.
- Microsoft Fabric Platform Expertise: Utilize your expertise in Microsoft Fabric, including Lakehouses, Notebooks, Data Pipelines, and Direct Lake, to build scalable solutions integrated with Business Intelligence layers and other Microsoft data services.

Required Skills and Qualifications:
- Experience in the Microsoft Fabric/Azure ecosystem: 7 years working with the Azure ecosystem, with relevant experience in Microsoft Fabric, including the Lakehouse, OneLake, Data Engineering, and Data Pipelines components.
- Proficiency in Azure Data Factory and/or Dataflows Gen2 within Fabric.
- Advanced data engineering skills: Extensive experience in data ingestion, transformation, and ELT/ETL pipeline design.
- Cloud architecture design: Experience designing modern data platforms using Microsoft Fabric, OneLake, and Synapse or equivalent.
- Strong SQL and data modeling: Expertise in SQL and data modeling for data integration, reporting, and analytics.
- Collaboration and communication: Ability to work across business and technical teams.
- Cost optimization: Experience tuning pipelines and cloud resources for cost-performance balance.

Preferred Skills:
- Deep understanding of Azure and the Microsoft Fabric ecosystem, including Power BI integration, Direct Lake, and Fabric-native security and governance.
- Familiarity with OneLake, Delta Lake, and Lakehouse architecture.
- Experience using Power BI with Fabric Lakehouses and DirectQuery/Direct Lake mode for enterprise reporting.
- Working knowledge of PySpark, strong SQL, and Python scripting within Fabric or Databricks notebooks.
- Understanding of Microsoft Purview, Unity Catalog, or Fabric-native governance tools.
- Experience with DevOps practices for Fabric or Power BI.
- Knowledge of Azure Databricks for building and optimizing Spark-based pipelines and Delta Lake models.
Posted 1 day ago
7.0 - 11.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
As a VP Hedge Accounting Transformation at Barclays, you will embark on a transformative journey by designing and delivering systemic solutions to the accounting specialism of Hedge Accounting. Your role involves expanding the existing product offering under IAS 39, considering accounting legislation in different jurisdictions, and adopting IFRS 9 and Dynamic Risk Management in the longer term. You will work on delivering extensions to the existing platform while ensuring alignment with the finance architecture strategy, standardization, efficiency of operation, and business requirements.

Key Responsibilities:
- Become a trusted advisor to Treasury, Finance, PIO, and technology colleagues regarding the Hedge Accounting Transformation programme and the wider Finance business architecture strategy.
- Actively drive transformation outcomes for the function through a strategic lens.
- Proactively identify opportunities for improvement, develop conversations, and challenge the status quo.
- Champion the transformation journey.
- Provide guidance and support to Treasury transformation teams and business users across Treasury.
- Present to and influence key stakeholders at the Design Authority, project forums, and other project meetings.

Qualifications Required:
- Demonstrable track record within a Hedge Accounting, Treasury, or MTM Product Control environment, working on relevant projects.
- Knowledge of interest rate derivatives, risk drivers, and Finance processes, systems, and technologies.
- Professional accounting qualification.
- Range of leadership and communication styles and techniques, including influencing and negotiating with stakeholders.
- Appreciation of data principles, including data modeling and design.
- Strong data manipulation skills with Excel and experience using data manipulation tools (e.g., QlikView, Business Objects, Lumira, SmartView, SQL, SAS).
- Excellent PowerPoint skills for storyboards and presentations.

Additional Company Details: The location of the role is Noida, IN.

Purpose of the role: To develop business capabilities for Finance through functional design, data analysis, end-to-end processes, controls, delivery, and functional testing.

Accountabilities:
- Functional Design: support options analysis and recommendations, in collaboration with Line SMEs.
- Data Analysis/Modelling/Governance: design the conceptual data model and governance requirements.
- End-to-End Process & Controls: develop target process and controls design/documentation.
- Delivery/Implementation Support: update design/functional requirements, resolve RAIDs, and manage change programs.
- Functional Testing: develop scripts and data for testing alignment to requirement definitions.

Vice President Expectations:
- Contribute to strategy, drive requirements, and make recommendations for change.
- Manage policies and processes, deliver continuous improvements, and escalate policy breaches.
- Advise key stakeholders and demonstrate leadership in managing risk and strengthening controls.
- Collaborate with other areas, create solutions based on analytical thought, and build trusting relationships.

All colleagues are expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, as well as the Barclays Mindset to Empower, Challenge, and Drive.
Posted 1 day ago
5.0 - 10.0 years
5 - 10 Lacs
Bengaluru
Remote
Key Responsibilities:

Advanced Statistical Analysis & Modeling
- Conduct sophisticated statistical analyses, including multivariate analysis, time series modeling, survival analysis, and causal inference studies.
- Design and implement A/B testing frameworks with proper statistical rigor, power analysis, and multiple-testing corrections.
- Build advanced machine learning models for customer segmentation, personalization, recommendation systems, and operational optimization.
- Perform causal inference analysis using techniques like difference-in-differences, instrumental variables, and propensity score matching to measure true business impact.

Business Intelligence & Insights Generation
- Partner with executive leadership to identify strategic questions and translate business challenges into analytical frameworks.
- Develop comprehensive dashboards and reporting systems that provide actionable insights for various stakeholders.
- Conduct deep-dive analyses on customer behavior, market trends, competitive positioning, and operational performance.
- Generate data-driven recommendations that directly influence product development, marketing strategy, and operational decisions.

Customer Analytics & Behavioral Insights
- Design and implement customer segmentation strategies using advanced clustering techniques, behavioral analysis, and predictive modeling.
- Develop customer lifetime value models and retention strategies based on comprehensive analysis of customer interactions and transactional data.
- Analyze the customer journey across all touchpoints to identify friction points, improvement opportunities, and personalization strategies.
- Build predictive models for customer behavior, including churn prediction, upsell/cross-sell opportunities, and satisfaction forecasting.

Experimentation & Causal Analysis
- Design and analyze controlled experiments to measure the impact of product changes, marketing campaigns, and operational improvements.
- Implement advanced experimental designs, including factorial experiments, multi-armed bandits, and switchback tests.
- Conduct causal inference studies to understand the true impact of business initiatives and separate correlation from causation.
- Develop measurement frameworks for attribution modeling, incrementality testing, and marketing mix optimization.

Required Skills & Experience:

Machine Learning & Predictive Modeling
- Classical ML algorithms: Advanced understanding of linear/logistic regression, decision trees, ensemble methods, SVMs, and clustering.
- Deep learning: Experience with neural networks, CNNs, RNNs, and transformers for both structured and unstructured data.
- Advanced ensemble methods: Knowledge of stacking, blending, Bayesian model averaging, and custom ensemble architectures.
- Feature engineering: Proficiency in automated feature generation, selection techniques, and dimensionality reduction.

Programming & Data Manipulation
- Python expertise: Advanced proficiency with the scientific computing stack, including NumPy, Pandas, SciPy, and scikit-learn.
- Statistical software: Expert-level proficiency in R for statistical analysis and visualization.
- SQL mastery: Experience with complex query optimization, window functions, and database design principles.
- Big data technologies: Familiarity with Spark (PySpark/SparkR), the Hadoop ecosystem, and other distributed computing frameworks.
- Cloud platforms: Experience with AWS (SageMaker, Redshift, S3), GCP (BigQuery, Vertex AI), or Azure (Synapse, ML Studio).

Modern AI & Generative AI Capabilities
- Large language models: Practical experience with GPT-4, Claude, and Gemini for data analysis augmentation and insight generation.
- Prompt engineering: Advanced prompting techniques for analytical tasks, report generation, and hypothesis formulation.
- AI-assisted analytics: Using LLMs for data exploration, code generation, and analytical workflow automation.

Education & Experience
- Experience: 5-8 years of progressive experience in data science, analytics, or quantitative research roles.
- Education: A master's degree in Statistics, Mathematics, Economics, Computer Science, Physics, or a related quantitative field is required.
- Proven track record: A history of delivering high-impact analytical projects that directly influenced business decisions and outcomes is essential.
- Industry experience: Experience in customer service, telecommunications, SaaS, or related customer-centric industries is strongly preferred.
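The A/B-testing power analysis this role calls for reduces, in the simplest case, to a closed-form sample-size calculation for a two-sided two-proportion z-test; a minimal sketch using SciPy (the baseline and target conversion rates are hypothetical figures chosen for illustration):

```python
from scipy.stats import norm

def sample_size_two_proportions(p1, p2, alpha=0.05, power=0.8):
    """Per-arm sample size for a two-sided two-proportion z-test."""
    z_alpha = norm.ppf(1 - alpha / 2)   # critical value for the test
    z_beta = norm.ppf(power)            # quantile for the desired power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    n = ((z_alpha + z_beta) ** 2 * variance) / (p1 - p2) ** 2
    return int(n) + 1  # round up: fractional subjects don't exist

# Hypothetical experiment: 10% baseline conversion, hoping to detect 12%.
n_per_arm = sample_size_two_proportions(0.10, 0.12)
print(n_per_arm)  # a few thousand users per arm
```

In practice, this per-arm figure feeds directly into how long an experiment must run given daily traffic, and multiple-testing corrections (e.g., tightening alpha) push the required sample size up further.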
Posted 1 day ago
5.0 - 9.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
Role Overview: Join our dedicated software design team as a BI Developer at Adobe. You will report directly to the Technical Manager and play a crucial role in software coding and design. Your collaboration with Business Analysts will be essential in understanding design specifications and converting requirements into application, dashboard, or reporting solutions. This role offers you the opportunity to contribute to projects with a global impact while being part of a hardworking team.

Key Responsibilities:
- Develop visual reports, dashboards, and KPI scorecards using Power BI Desktop and Power BI Service
- Create writeback tools using Power Apps and Power Automate for various business requirements
- Connect to data sources, import data, and transform data for Business Intelligence
- Develop tabular and multidimensional models compatible with warehouse standards
- Integrate Power BI reports into other applications using embedded analytics such as the Power BI service (SaaS) or API automation
- Implement row-level security on data and understand application security layer models in Power BI
- Collaborate with business professionals, application developers, and technical staff in an agile process environment to successfully implement solutions

Qualifications Required:
- Bachelor's degree in computer science or related streams
- 5+ years of work experience in the Power toolkit (Power BI, Power Apps, and Power Automate)
- 5+ years of work experience in scripting languages like DAX and Python
- Proven working knowledge of T-SQL, stored procedures, and database performance tuning
- Experience in Databricks, Big Data, and Gen AI technologies is a plus
- Excellent UI design skills and hands-on experience in designing and developing entities in Power Apps
- Expertise in data modeling, prototyping, performance tuning, and data analysis techniques
- Ability to learn new software and technologies quickly and adapt to an ambitious and fast-paced environment

Additional Company Details: At Adobe, you will be immersed in an exceptional work environment recognized globally. You will work alongside colleagues committed to mutual growth through our unique Check-In approach, where ongoing feedback is encouraged. If you are seeking to make an impact, Adobe is the place for you. Explore the meaningful benefits we offer and discover what our employees are saying about their career experiences on the Adobe Life blog.
Posted 2 days ago
7.0 - 11.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Solution Design Business Analyst - Vice President in our company, you will play a crucial role in driving key strategic change initiatives for regulatory deliverables across Risk, Finance, and Treasury. To excel in this role, you should have the following skills and experience:
- Experience in business/data analysis, with the ability to present complex data issues in a simple and engaging manner.
- Proficiency in front-to-back system design and complex business problem solutioning, including data gathering, data cleansing, and data validation.
- Ability to analyze large volumes of data, identify patterns, address potential data quality issues, conduct metrics analysis, and turn analysis into actionable insights.
- Experience in capturing business requirements and translating them into technical data requirements.
- Strong collaboration skills to work with stakeholders and ensure proposed solutions meet their needs and expectations.
- Capability to create operational and process designs to ensure proposed solutions are delivered within the agreed scope.

Additionally, highly valued skills include working experience in the financial services industry; familiarity with data analysis tools such as SQL, Hypercube, and Python; data visualization/reporting tools such as Tableau, QlikView, Power BI, and Advanced Excel; and expertise in data modeling and data architecture.

In this role, you will be based in Pune and Chennai and will function as an individual contributor. The purpose of this role is to support the organization in achieving its strategic objectives by identifying business requirements and providing solutions to address business problems and opportunities.

Your key responsibilities will include:
- Identifying and analyzing business problems and client requirements that necessitate change within the organization.
- Developing business requirements to tackle business problems and opportunities.
- Collaborating with stakeholders to ensure proposed solutions align with their needs.
- Supporting the creation of business cases justifying investment in proposed solutions.
- Conducting feasibility studies to assess the viability of proposed solutions.
- Creating reports on project progress to ensure timely and on-budget delivery of proposed solutions.
- Providing support for change management activities and ensuring successful implementation and embedding of proposed solutions in the organization.

As a Vice President, you are expected to contribute to setting strategy, driving requirements, and making recommendations for change. You will be responsible for planning resources, budgets, and policies; managing and maintaining policies/processes; delivering continuous improvements; and escalating breaches of policies/procedures. If you have leadership responsibilities, you are expected to demonstrate leadership behaviors that create an environment for colleagues to excel. The four LEAD behaviors are: Listen and be authentic, Energize and inspire, Align across the enterprise, and Develop others.

Overall, as a valuable member of our team, you are expected to embody the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, and demonstrate the Barclays Mindset to Empower, Challenge, and Drive in your daily interactions.
Posted 2 days ago
8.0 - 12.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Senior Data Analyst in the Solution Design team at Barclays, your role involves supporting the organization in defining and designing technology and business solutions to meet organizational goals. This includes requirements gathering, data analysis, data architecture, system integration, and delivering scalable, high-quality designs aligned with both business and technical needs.

Key Responsibilities:
- Deliver large-scale change in complex environments, acting as a thought leader in requirements documentation and workshop facilitation to gather, clarify, and communicate business needs effectively.
- Utilize strong data analysis and data modeling skills to perform data validations and anomaly detection, and make sense of large volumes of data to support decision-making.
- Demonstrate advanced SQL proficiency for querying, joining, and transforming data to extract actionable insights, along with experience in data visualization tools such as Tableau, Qlik, and Business Objects.
- Act as an effective communicator, translating complex technical concepts into clear, accessible language for diverse audiences, and liaising between business stakeholders and technical teams to achieve a mutual understanding of data interpretations, requirements definitions, and solution designs.
- Apply experience in banking and financial services, particularly in wholesale credit risk, and implement data governance standards including metadata management, lineage, and stewardship.

Qualifications Required:
- Experience in Python data analysis and associated visualization tools.
- Familiarity with external data vendors for sourcing and integrating company financials and third-party datasets.
- Experience with wholesale credit risk internal ratings-based (IRB) models and regulatory frameworks.

In this role, based in Chennai/Pune, you will be responsible for implementing data quality processes and procedures to ensure reliable and trustworthy data. Your tasks will include investigating and analyzing data issues related to quality, lineage, controls, and authoritative source identification; executing data cleansing and transformation tasks; designing and building data pipelines; and applying advanced analytical techniques such as machine learning and AI to solve complex business problems. Additionally, you will document data quality findings and recommendations for improvement.

As a Vice President, you are expected to contribute to strategy, drive requirements, and make recommendations for change. You will manage resources, budgets, and policies, deliver continuous improvements, and escalate breaches of policies and procedures. If you have leadership responsibilities, you will demonstrate leadership behaviours focused on creating an environment for colleagues to thrive and deliver to an excellent standard.

All colleagues at Barclays are expected to uphold the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, and to demonstrate the Barclays Mindset of Empower, Challenge, and Drive in their behaviour.
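Anomaly detection of the kind described above often starts with a robust z-score over a numeric column, since median and MAD resist the very outliers being hunted; a minimal pandas sketch (the data and column names are hypothetical):

```python
import pandas as pd

# Hypothetical daily exposure figures with one obvious outlier.
df = pd.DataFrame({
    "day": pd.date_range("2024-01-01", periods=8, freq="D"),
    "exposure": [100.0, 102.0, 99.0, 101.0, 98.0, 100.0, 5000.0, 103.0],
})

# Robust z-score: median and MAD are barely moved by the outlier,
# whereas mean/std would be dragged toward it and could mask it.
median = df["exposure"].median()
mad = (df["exposure"] - median).abs().median()
df["robust_z"] = 0.6745 * (df["exposure"] - median) / mad

anomalies = df[df["robust_z"].abs() > 3.5]  # common Iglewicz-Hoaglin cutoff
print(anomalies[["day", "exposure"]])       # flags only the 5000.0 row
```

Flagged rows then feed the investigation-and-cleansing loop: trace lineage to the source system, decide whether the value is a genuine event or a quality defect, and document the finding.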
Posted 2 days ago
2.0 - 6.0 years
0 Lacs
maharashtra
On-site
Role Overview: You will be joining as a full-time Research AVP in the Coalition Greenwich Investment Management business located in Saki Vihar, Mumbai. Your main responsibility will be to support new product development by conducting industry and financial research to analyze and estimate revenues, AUM, and funds analysis of Asset Managers across the US, Europe, and Asia. You will work closely with Product owners, the Head of Product Development, and the Co-Heads of IM globally to shape market intelligence offerings. Key Responsibilities: - Gather, structure, and consolidate quantitative and qualitative data relevant to the asset management industry in the US, Europe, and Asia - Conduct market, industry, and competitor analysis across intermediary and institutional channels, ensuring accuracy and consistency of datasets - Build and maintain efficient and robust models and data sets to support forecasting and trend analysis - Collaborate with internal teams to develop benchmark report deliverables and methodologies for new analysis - Prepare market insights, charts, and visualizations for client presentations and internal reports, including qualitative write-up and messaging - Coordinate delivery of data and insights to meet project milestones, and contribute to overall product development workstream activity Qualification Required: - Strong intellectual curiosity, problem-solving skills, and ability to quickly learn new business domains, tools, and processes - Proficiency in data analysis tools including Excel (advanced formulas, pivot tables, data modeling), knowledge of data visualization tools (e.g., Power BI, Tableau) is a plus - Strong quantitative and analytical skills with experience in forecasting and market data gathering - Extremely high attention to detail and adaptability to a variety of tasks - MBA or post-graduate degree in finance, business management, statistics, economics, or similar analytical fields - Ability to make decisions while working with 
unstructured and limited information - Prior experience of 2 years or more in financial research/analytics - Familiarity with the asset management industry preferred but not mandatory - Experience working in agile environments preferred - Excellent command and knowledge on Asset Management Products and Industry - Sound knowledge of global capital markets products preferred - Fluent in spoken and written English - Excellent working skills with MS Office tools, especially advanced Excel and PowerPoint - Working knowledge of professional information services like Bloomberg, Thomson Reuters, etc., is preferred.
Posted 2 days ago
1.0 - 8.0 years
0 Lacs
bangalore, karnataka
On-site
Role Overview: As a Cloud Technical Lead specializing in Azure Data Engineering with hands-on experience in Microsoft Fabric, you will play a crucial role in leading end-to-end Microsoft Fabric implementations for enterprise clients. Your expertise in building and maintaining ETL/data pipelines using Azure Data Factory, Databricks, and Fabric Data Pipelines will be essential in designing and delivering large-scale data solutions on Azure. Collaborating with stakeholders to translate business needs into scalable Fabric-based data solutions and providing architectural input for enterprise cloud data platforms will be key responsibilities in this role. Key Responsibilities: - Lead end-to-end Microsoft Fabric implementations for enterprise clients. - Build and maintain ETL/data pipelines using Azure Data Factory, Databricks, and Fabric Data Pipelines. - Design, develop, and optimize large-scale data solutions on Azure (Fabric, Synapse, Data Lake, SQL DB). - Implement data models and data warehousing solutions using Fabric Lakehouse, Synapse, and SQL. - Collaborate with stakeholders to translate business needs into scalable Fabric-based data solutions. - Ensure high-performance, secure, and compliant data solutions. - Mentor junior engineers on Fabric, Databricks, and ADF best practices. - Provide architectural input for enterprise cloud data platforms. Qualifications Required: - Bachelor's degree in computer science, IT, or a related field. - 8+ years of experience in data engineering, including 5+ years of hands-on experience with Azure Databricks, ADF, and Synapse. - Minimum 1 year of mandatory hands-on experience with Microsoft Fabric, demonstrated through client project implementations. - Strong experience in data modeling, data architecture, and database design. - Proficiency in SQL, Python, and PySpark. - Familiarity with data governance, security, and compliance practices, with hands-on experience in tools such as Microsoft Purview or Unity Catalog. 
- Experience with Azure DevOps CI/CD for data solutions. - Strong interpersonal and communication skills, with the ability to lead teams. Insight at a Glance: With 14,000+ engaged teammates globally and operations in 25 countries, Insight has received 35+ industry and partner awards in the past year. Generating $9.2 billion in revenue, Insight is recognized as #20 on Fortune's World's Best Workplaces list, #14 on Forbes World's Best Employers in IT 2023, and #23 on Forbes Best Employers for Women in IT 2023. With a total charitable contribution of $1.4M+ in 2023, Insight believes in unlocking the power of people and technology to accelerate transformation and achieve extraordinary results.
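The ETL/data-pipeline work this posting centres on follows the same extract-transform-load shape regardless of platform. Below is a minimal stdlib Python sketch of that pattern — the feed, column names, and quality rule are illustrative assumptions, not a Fabric or Azure Data Factory API:

```python
import csv, io, sqlite3

# Hypothetical raw feed; column names are illustrative only.
RAW = """order_id,amount,currency
1001, 250.00 ,usd
1002,99.50,USD
1003,,usd
"""

def extract(text):
    """Extract: parse the raw feed into dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop rows failing a basic quality check, normalize types and casing."""
    out = []
    for r in rows:
        if not r["amount"].strip():  # reject rows with a missing amount
            continue
        out.append((int(r["order_id"]), float(r["amount"]), r["currency"].strip().upper()))
    return out

def load(rows):
    """Load: write the cleaned rows into a target table."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE orders (order_id INTEGER, amount REAL, currency TEXT)")
    con.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    return con

con = load(transform(extract(RAW)))
print(con.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone())  # (2, 349.5)
```

In a Databricks or Fabric pipeline the same three stages would be Spark reads, DataFrame transformations, and Delta/Lakehouse writes; the structure carries over directly.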
Posted 2 days ago
7.0 - 10.0 years
18 - 20 Lacs
bengaluru
Work from Office
Candidate Specifications: Notice Period - Immediate to 30 days. Develop and maintain complex data models for business data analysis and reporting purposes. Collaborate with stakeholders and cross-functional teams to understand data requirements and design appropriate data models that align with business needs. Create and maintain data dictionaries and metadata repositories to ensure consistency and integrity of data models. Identify and resolve data model performance issues to optimize database performance and enhance overall system functionality. Proficiency in data modelling tools. Contact Person: Swathikumar A, Email: swathikumar@gojobs.biz
Posted 2 days ago
11.0 - 20.0 years
45 - 50 Lacs
bengaluru
Work from Office
Technologies Used: Odoo Platform v15+ Based Development. Experience with Odoo development and customization. Odoo User base (Logged-in users) > 1000 Users. Odoo on Kubernetes (Microservices Based Architecture) with DevOps understanding. Knowledge of Odoo modules, architecture, and APIs. Ability to integrate Odoo with other systems and data sources. Capable of creating custom modules. Scale Odoo deployments for a large number of users and transactions. Programming Languages: Proficiency in Python is essential. Experience with other programming languages (e.g., Java, Scala) is a plus. Data Analysis and Reporting: Ability to analyse and interpret complex data sets. Your future duties and responsibilities: Experience with data visualization tools (e.g., Superset). Experience in Cassandra (4.0+) along with Query Engine like Presto. Proficiency in SQL and experience with relational databases (e.g., PostgreSQL, MySQL). Experience with ETL tools and processes. Data Structure & Data Modelling: Knowledge of data warehousing concepts and technologies. Familiarity with big data technologies (e.g., Hadoop, Spark) is a plus. Experience in managing and processing large Datasets. DevSecOps: Experience with containerization, Docker, and Kubernetes clusters. CI/CD with GitLab. Methodologies: Knowledge and experience of SCRUM and Agile methodologies. Operating Systems: Linux/Windows OS. Tools Used: Jira, GitLab, Confluence. Other Skills: Strong problem-solving and analytical skills. Excellent communication and collaboration abilities. Attention to detail and a commitment to data quality. Ability to work in a fast-paced, dynamic environment. Skills: English, ERP System, CSB, PostgreSQL, Python, Hadoop Ecosystem (HDFS), Java
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
kolkata, west bengal
On-site
Role Overview: At EY, you will have the opportunity to shape a unique career with the global scale, support, inclusive culture, and technology to help you become the best version of yourself. Your voice and perspective are valued to contribute to making EY even better. Join EY to create an exceptional experience for yourself and contribute to building a better working world for all. Key Responsibilities: - Utilize tools and techniques to analyze data collection, updates, storage, and exchange - Define and apply data modeling and design standards, tools, best practices, and development methodologies - Design, review, and maintain data models - Perform data analysis to capture data requirements and visualize them in data models - Manage the data model lifecycle from requirements to design, implementation, and maintenance - Collaborate with data engineers to create optimal physical data models - Identify opportunities to leverage data for enhancing business activities Qualification Required: - Bachelor's degree in Computer Science or equivalent with 3-7 years of industry experience - Experience in Agile-based delivery methodology is preferable - Strong analytical skills with a proactive problem-solving approach - Proficiency in Software Development Best Practices - Excellent debugging and optimization skills - Experience in Enterprise-grade solution implementations and converting business challenges into technical solutions - Strong communication skills, both written and verbal, formal and informal - Participation in all phases of the solution delivery life cycle, including analysis, design, development, testing, deployment, and support - Client management skills Additional Details: EY aims to build a better working world by creating long-term value for clients, people, and society while fostering trust in the capital markets. 
Through data and technology, diverse EY teams worldwide offer assurance and support clients in growth, transformation, and operations across various sectors. Working in assurance, consulting, law, strategy, tax, and transactions, EY teams tackle complex global challenges by asking better questions to find innovative solutions.
Posted 3 days ago
5.0 - 9.0 years
0 Lacs
ahmedabad, gujarat
On-site
As a Tech Lead specializing in Data & Analytics, you will be responsible for the following: **Role Overview:** You will play a crucial role in handling and processing data efficiently, ensuring optimal performance and data integrity. Additionally, you will be involved in data analysis, statistical modeling, and visualization to derive meaningful insights for the organization. **Key Responsibilities:** - Proficient in SQL Server with a focus on query optimization. - Expertise in application data design and process management. - Extensive knowledge of data modeling techniques. - Hands-on experience with Azure Data Factory, Azure Synapse Analytics, and Microsoft Fabric. - Experience working with Azure Databricks. - Expertise in data warehouse development, including SSIS and SSAS. - Proficiency in ETL processes, data cleaning, and normalization. - Familiarity with big data technologies like Hadoop, Spark, and Kafka. - Understanding of data governance, compliance, and security within Azure environments. - Experience in data analysis, statistical modeling, and machine learning techniques. - Proficiency in analytical tools such as Python, R, and libraries like Pandas and NumPy. - Strong expertise in Power BI for data visualization, data modeling, and DAX queries. - Experience in implementing Row-Level Security in Power BI. - Ability to work with medium-complex data models and understand application data design quickly. - Familiar with industry best practices for Power BI and performance optimization. - Understanding of machine learning algorithms, including supervised, unsupervised, and deep learning techniques. **Qualifications Required:** - Ability to lead a team of 4-5 developers and take ownership of deliverables. - Commitment to continuous learning and staying updated with new technologies. - Strong communication skills in English, both written and verbal. - Effective interaction with customers during project implementation. 
- Capability to explain complex technical concepts to non-technical stakeholders. In addition to technical skills, the following skills are preferred for this role: - Data Management: SQL, Azure Synapse Analytics, Azure Analysis Services, Data Marts, Microsoft Fabric - ETL Tools: Azure Data Factory, Azure Databricks, Python, SSIS - Data Visualization: Power BI, DAX. This comprehensive role requires a blend of technical expertise, leadership skills, and effective communication to drive successful data and analytics projects within the organization.
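Among the skills this posting lists, data cleaning and normalization are concrete enough for a quick sketch. Below is min-max scaling in plain Python — in a real pipeline this would typically be done with pandas/NumPy or Spark, so treat it as a conceptual illustration only:

```python
def min_max_scale(values):
    """Rescale numeric values into the [0, 1] range (a common normalization step)."""
    lo, hi = min(values), max(values)
    if hi == lo:  # guard against a constant column, which has no spread to scale
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

print(min_max_scale([10, 20, 30, 40]))  # first value maps to 0.0, last to 1.0
```

Min-max scaling is sensitive to outliers (one extreme value compresses everything else toward zero), which is why z-score standardization or robust scaling is often preferred upstream of machine learning models.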
Posted 3 days ago
6.0 - 10.0 years
0 Lacs
noida, uttar pradesh
On-site
As a Technical Solution Architect, your role involves proposing technical options and solutions with thorough comparative analysis. You will guide the team in design and implementation, interact with clients to create end-to-end specifications for PIM solutions, and define implementation processes, quality gates, and standards. Additionally, you will perform data analysis and troubleshooting to resolve data quality, data integrity, and system performance issues. Your support will be crucial in assisting development and test teams with the installation & configuration of the Stibo STEP platform. Key Responsibilities: - Propose technical options and solutions with thorough comparative analysis. - Guide the team in design and implementation. - Interact with clients to create end-to-end specifications for PIM solutions. - Define implementation processes, quality gates, and standards. - Perform data analysis and troubleshooting to resolve data quality, data integrity, and system performance issues. - Support development and test teams in the installation & configuration of the Stibo STEP platform. Qualifications Required: - 5-8 years of hands-on experience with Stibo STEP Master Data Management (MDM) platform. - Proficiency in JavaScript or Java/J2EE. - Experience configuring and customizing Stibo STEP MDM across domains like Product, Customer, Supplier. - Strong understanding of data modeling concepts and experience designing data models within Stibo STEP. - Strong knowledge of data integration tools/techniques: ETL, REST APIs, 3rd-party integrations using web services. - Database & SQL knowledge. - Proficiency with IDEs and debugging code. - Understanding of ER model. - Familiarity with XML, XSD, JSON, CSV, and other data formats. - Stibo STEP certification (preferred). - Informatica PIM knowledge (a plus). Location: Pune/Bengaluru
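The familiarity with XML, XSD, JSON, and CSV called for above comes down to converting between such formats reliably. Here is a small stdlib Python sketch of one direction, JSON to CSV — the product fields are hypothetical, not Stibo STEP's actual data model:

```python
import csv, io, json

# Hypothetical product payload; field names are illustrative of a PIM record.
payload = ('[{"sku": "A-100", "name": "Widget", "price": 9.99},'
           ' {"sku": "B-200", "name": "Gadget", "price": 24.5}]')

def json_to_csv(text):
    """Flatten a JSON array of uniform objects into CSV text."""
    rows = json.loads(text)
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

print(json_to_csv(payload).splitlines()[0])  # sku,name,price
```

Real PIM integrations add schema validation (XSD for XML, JSON Schema for JSON) and handle nested or missing attributes, which this flat sketch deliberately leaves out.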
Posted 3 days ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
Role Overview: As a Data Modeller Architect at Capco, you will be a part of the business architecture team responsible for supporting key front office to back-office transformation priorities. Your primary focus will be on defining data models that align with business processes, ensure data lineage, effective control, and implement client strategy and reporting solutions. Building strong relationships with key stakeholders and delivering tangible value will be crucial in this role. Key Responsibilities: - Define and manage data models to automate business processes and controls. - Ensure adherence to the bank's data modeling standards and principles while influencing them as necessary. - Collaborate with various functional leads and teams to promote the adoption and execution of front-to-back solutions through socializing the data models. Qualifications Required: - 8+ years of experience in financial services, particularly in strategy and solutions within the Corporate and Investment Banking domain. - Strong knowledge of transaction banking domain processes and controls for banking and trading businesses to effectively engage with business subject matter experts. Experience in developing models for transaction banking products is preferred. - Familiarity with the lifecycle of loans, cash/deposits, or customer lifecycle, and the related business data necessary for managing operations and analytics would be advantageous. - Proficiency in business requirements analysis, including excellent communication skills (both verbal and listening) and stakeholder management at all levels. Additional Details: Capco, a Wipro company, is a global technology and management consulting firm recognized for its deep transformation execution and delivery. With a presence in 32 cities worldwide, Capco supports over 100 clients in the banking, financial, and energy sectors. 
The company values diversity, inclusivity, and creativity, fostering an open culture that encourages individuality and career advancement without forced hierarchy. Joining Capco means making an impact through innovative thinking, delivery excellence, and thought leadership in collaboration with clients and industry partners to drive disruptive change in energy and financial services.
Posted 3 days ago
5.0 - 9.0 years
0 Lacs
maharashtra
On-site
You are a highly skilled and motivated Power BI Developer with 4-6 years of experience in designing, developing, and deploying Power BI solutions. Your role involves transforming raw data into actionable insights through interactive dashboards and reports while ensuring secure and scalable data access. **Key Responsibilities:** - Develop and maintain Power Query (M) scripts for data transformation and ingestion. - Integrate data from multiple sources including SQL Server, Excel, APIs, cloud platforms (Azure, AWS), and third-party connectors. - Configure and manage Power BI Gateways for scheduled data refreshes. - Implement Row-Level Security (RLS) to ensure secure data access. - Publish and manage reports in Power BI Service, including workspace management and app deployment. - Collaborate with business stakeholders to gather requirements and translate them into technical solutions. - Create wireframes and mockups using Figma, Miro or similar tools to visualize dashboard layouts and user journeys. - Optimize performance of reports and datasets. - Stay updated with the latest Power BI features and best practices.
**Key Skills:** - Strong proficiency in DAX and Power Query (M). - Experience with data modeling, star/snowflake schemas, and normalization. - Hands-on experience with Power BI Service, including dashboard publishing and workspace management. - Strong understanding of Power BI Gateways and data refresh scheduling. - Hands-on experience implementing Row-Level Security (RLS). - Familiarity with data ingestion from various sources (SQL, Excel, REST APIs, cloud storage). - Experience in wireframing tools like Miro, Figma, or Balsamiq. - Understanding of version control and deployment pipelines (Dev/Test/Prod). - Excellent problem-solving and communication skills. - Microsoft Certified: Power BI Data Analyst Associate (PL-300) or equivalent. - Exposure to embedding Power BI dashboards into external platforms (e.g., web apps, SharePoint). **About the Company:** NeuIQ is a new-age technology services firm specializing in solving enterprise business transformation and experience challenges through cutting-edge, AI-powered data and technology solutions. Their vision is to build a scalable and profitable technology implementation business with data engineering as its foundation. NeuIQ's expertise lies in implementing enterprise SaaS platforms such as Qualtrics, ServiceNow, Snowflake, and Databricks, enabling organizations to unlock actionable insights and maximize the value of their AI and technology investments. They are committed to empowering enterprises to stay relevant, impactful, and ahead in today's dynamic landscape.
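Row-Level Security, which this role asks for, restricts each user to the rows their role permits; in Power BI it is defined as a DAX filter expression on a role, often keyed to USERPRINCIPALNAME(). The Python below is only a conceptual simulation of that filtering idea, with a hypothetical role-to-region mapping:

```python
# Conceptual illustration of row-level security: each role's filter set
# determines which fact rows a query over SALES may see.
SALES = [
    {"region": "East", "amount": 120},
    {"region": "West", "amount": 340},
    {"region": "East", "amount": 75},
]

ROLE_FILTERS = {  # hypothetical role-to-region mapping
    "east_manager": {"East"},
    "west_manager": {"West"},
    "global_admin": {"East", "West"},
}

def rows_for(role):
    """Return only the rows the given role is allowed to see."""
    allowed = ROLE_FILTERS[role]
    return [r for r in SALES if r["region"] in allowed]

print(sum(r["amount"] for r in rows_for("east_manager")))  # 195
```

In Power BI the equivalent filter lives in the semantic model itself, so every visual, export, and API query is constrained server-side rather than in report logic.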
Posted 3 days ago
8.0 - 12.0 years
0 Lacs
chennai, tamil nadu
On-site
As a BI Architect at InfoCepts, you will be responsible for designing, developing, supporting, and steering end-to-end business intelligence solutions using Strategy. InfoCepts is a global leader in data and analytics solutions with a focus on enabling customers to derive value from various data-driven capabilities. You will be part of a team of over 1200 professionals working on cutting-edge technology solutions with a mission to transform customers' journeys through data-driven modernization. Key Responsibilities: - Architect and design solutions by conducting current state assessments, prescribing D&A modernization strategies, and creating data-culture transformation roadmaps. - Provide advisory and consulting services by driving engagements in areas of expertise, supporting pre-sales activities, and developing offerings. - Systematically develop and maintain offerings by supporting offering development, leading pilot implementations, and creating implementation methodologies. - Contribute to organizational strategy by providing thought leadership, establishing innovation labs, and supporting GTM partnerships. Qualifications Required: - Technical expertise in Strategy - Leadership skills to guide development teams and assist with technical blockers - Experience in developing reusable artifacts/frameworks and industry solutions - Strong written and verbal communication skills in English - Decent understanding of data warehousing concepts Good to Have: - Expertise in analytics platforms such as Tableau, Power BI, Qlik, ThoughtSpot, Domo, SSRS - Strong experience in designing end-to-end Strategy projects - Identifying and exploiting new technologies for new and existing accounts - Building capacity and developing talent for the Competency Centre Additional Company Details: InfoCepts has been recognized as Gartner Peer Insights Customers' Choice for two consecutive years in 2020 and 2021.
The company is certified by Great Place to Work, India in 2021 & 2022, highlighting the high-trust and high-performance work culture. The award-winning reusable solutions approach is well-recognized in the D&A industry, allowing associates to leverage collective expertise and deliver exceptional customer experiences. If you possess a Bachelor's degree in computer science, engineering, or a related field (a master's degree is a plus), along with at least 8+ years of relevant experience, and demonstrated continued learning through certifications, you are encouraged to apply. The ideal candidate will be self-motivated, possess strong interpersonal skills, and be able to work effectively in cross-functional teams and different time zones, while quickly acquiring and developing new capabilities and skills.
Posted 3 days ago
3.0 - 5.0 years
5 - 12 Lacs
bengaluru
Work from Office
3+ years of experience with SAP mobile technologies such as SAP BTP, Fiori, and Mobile Development Kit (MDK). Experience with SAP ABAP, SAP Web Dynpro, and related SAP mobile development tools is a plus. Familiarity with SAP HANA and data modeling for mobile applications.
Posted 3 days ago
5.0 - 10.0 years
10 - 20 Lacs
bengaluru
Work from Office
Position: Solutions Architect Experience: 5+ Years Notice: 0 - 15 Days Location: Bangalore Company: UNO Bank Overview: We seek an experienced solution architect with knowledge in the Banking domain. Role & responsibilities • Understand and create an Enterprise Architecture framework. • Data Modelling with various tools such as Visio, Aris, Lucid, etc. • Lead the design, development, and deployment of solutions. • Build and maintain relationships with business stakeholders to identify internal opportunities that bring value to the company. • Coach, train, and mentor internal and external engineers to follow IT Architecture. • Develop and maintain the Architecture Repository Qualifications and Skills: • 5+ years of experience • Knowledge in Banking Domain (Must) • Understanding in Data Modeling • Working Experience in Microservices with Spring Boot • Working Experience with API Mgmt (any tool) • Experience with any modeling tool (Visio, Aris, Lucid, etc.) • Working knowledge in DevOps/CI-CD • Understanding of Development Best practice • Understanding of Domain Driven Design Principles • Working knowledge in Workflow, Orchestration and Automation Tools (any tool) • Working knowledge in ESB/Middleware • Knowledge of AWS Ecosystem and services Contact: 91 97041 22348 / hr@singhtechservices.com
Posted 3 days ago
10.0 - 15.0 years
25 - 35 Lacs
chennai, thiruvananthapuram
Work from Office
What You Will Do : Design and lead end-to-end architecture for modern, cloud-native data engineering, AI/ML and analytics platforms across the full data lifecycle, including ingestion, storage, transformation, and consumption. Architect high-performance data solutions using Azure Databricks, Fabric, Snowflake, AWS Redshift, Power BI, Tableau, Python, and other relevant technologies. Collaborate with technology leadership and engineering teams to align solutions with enterprise strategy, business goals, and innovation roadmaps. Define and enforce standards for data quality, metadata, lineage, and governance in partnership with other teams using market-leading tools. Provide architectural guidance for AI/ML integrations, including data preparation, feature engineering, and model deployment support. Conduct design reviews, architectural assessments, and performance tuning to ensure system reliability, scalability, and maintainability. Develop and maintain reusable patterns, frameworks, and coding standards in ADF, Python, PySpark, and SQL. Collaborate with product managers, engineering leads, analysts, and data scientists to deliver high-impact, cross-functional solutions. Drive the evaluation and adoption of emerging technologies in cloud data platforms, streaming analytics, and intelligent automation. Mentor data engineers and oversee best practices in solution design, code quality, documentation, and continuous improvement. Support DevOps and DataOps initiatives by integrating CI/CD pipelines, Git workflows, and automated testing into engineering workflows. What You Will Need : Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field. 10-12 years of progressive experience in data engineering and data modelling, including at least 5 years in an architecture-focused or lead technical role. Proven experience architecting enterprise-grade cloud data platforms using Azure (and AWS).
Deep expertise in technologies such as Azure Data Factory, Snowflake, Databricks, Spark/PySpark, and SQL-based processing. Strong grasp of modern data architecture paradigms including data lakehouse, data fabric, microservices, and event-driven design patterns. Solid understanding of DevOps and Infrastructure-as-Code practices, including CI/CD pipelines, Docker/Kubernetes, and automated deployment frameworks. What Would Be Nice To Have : Familiarity with modern data architecture frameworks, including data mesh, data fabric, and data lakehouse. Industry experience in Healthcare, Finance or other highly data-driven domains. Experience with Database Lifecycle Management (DLM) tools and strong understanding of CI/CD pipelines, branching strategies, and collaborative DevOps practices. Hands-on experience integrating with data governance and master data management platforms. Proficiency in Python, Scala, or similar programming languages commonly used in large-scale data engineering. Understanding of AI/ML model lifecycle architecture, including data preparation, model training, and production deployment best practices. Relevant certifications in cloud architecture (e.g., Azure Solutions Architect, AWS Certified Data Analytics) or enterprise architecture (e.g., TOGAF).
Posted 3 days ago
5.0 - 9.0 years
10 - 20 Lacs
bengaluru
Hybrid
Position: Senior Software Developer-Analytics *** JOB DESCRIPTION *** Overview: The Senior Software Developer will work closely with product manager, Implementation Consultants (ICs) and clients to gather requirements to meet the data analysis need of a company or a client. They must have good collaboration skills. The Senior Software Developer will provide direction on analytics aspects to the team on various analytics related activities. Key Tasks & Responsibilities: Experienced in Qlik Sense Architecture design and good knowledge on load script implementation and best practices. Hands on experience in Qlik Sense development, dashboarding, data-modelling and reporting techniques. Experienced in data integration through extracting, transforming, and loading (ETL) data from various sources. Good at Data transformation, the creation of QVD files and set analysis. Data Modelling using Dimensional Modelling, Star schema and Snowflake schema. Strong SQL skills (SQL Server) to validate the Qlik sense dashboards and to work on internal applications. Knowledge on deploying of Qlik Sense application using Qlik Management Console (QMC) is a plus. Work with Implementation consultants (ICs), product manager and clients to gather the requirements. Configuration, migration, and support of Qlik Sense applications. Thoughtful implementation of Qlik Sense best practices for efficiency and re-usability. Research and utilize new technologies. Collaborate with the Software Quality Assurance (SQA) team to test the applications functionality. Ensure compliance with eClinical Solutions/industry quality standards, regulations, guidelines, and procedures. Manage multiple timelines and deliverables (for single or multiple clients) and managing client communications as assigned. Other duties as assigned. 
Education/Language: BTech / MTech / Master of Science degree in Computer Science and/or equivalent work experience. Good verbal and written communication skills. Professional Skills & Experience: Minimum of 3-5 years of experience in implementing end-to-end business intelligence using Qlik Sense. Thorough experience in the Qlik Sense architecture, design, development, test, and deployment process. Thorough understanding of Qlik Sense best practices (re-usability, efficiency, optimization). Knowledge of clinical datasets and standards is a plus (e.g., SDTM, CDISC (92,45,101), Q Format, customized data formats, etc.). Excellent understanding of relational database concepts, data modelling, and design. Excellent knowledge of writing SQL code and ETL procedures using MS-SQL Server. Strong Software Development Lifecycle experience (Agile methodology experience is a plus). Strong technical project management experience and team leadership skills, including scope management, work planning, and work delegation. Strong troubleshooting skills and use of defect/feature management systems. Proven ability to work independently and with technical team members (startup environment experience is a plus). Strong analytical skills and strong decision-making capabilities. Technical Skills & Experience: 3+ years of experience in Qlik Sense architecture and design. 3+ years of experience in developing, testing, and deploying Qlik Sense applications. 3+ years with SQL Server and ETL processes. 3+ years with data modelling (physical & logical). Experience with performance tuning and best practices of Qlik Sense. Experience with dimensional modelling, Star Schema, and Snowflake Schema. Knowledge of clinical trial data and SDTM standards is a plus.
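Dimensional modelling with a star schema, as required above, keeps measures in a central fact table keyed into descriptive dimension tables. A toy stdlib Python sketch of that shape — the clinical-flavoured table and column names are illustrative only, not SDTM structures:

```python
# Dimension table: descriptive attributes keyed by a surrogate key.
DIM_STUDY = {
    1: {"study": "STUDY-A", "phase": "III"},
    2: {"study": "STUDY-B", "phase": "II"},
}

# Fact table: each row carries a foreign key into the dimension plus a measure.
FACT_VISITS = [
    {"study_key": 1, "visits": 12},
    {"study_key": 1, "visits": 8},
    {"study_key": 2, "visits": 5},
]

def visits_by_phase():
    """Aggregate the fact measure by an attribute resolved through the dimension."""
    totals = {}
    for row in FACT_VISITS:
        phase = DIM_STUDY[row["study_key"]]["phase"]
        totals[phase] = totals.get(phase, 0) + row["visits"]
    return totals

print(visits_by_phase())  # {'III': 20, 'II': 5}
```

A snowflake schema would further normalize DIM_STUDY into sub-dimensions; in Qlik Sense the same fact-to-dimension associations are built in the load script, with the associative model doing the join at query time.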
Posted 3 days ago