Get alerts for new jobs matching your selected skills, preferred locations, and experience range. Manage Job Alerts
7.5 years
0 Lacs
Navi Mumbai, Maharashtra, India
On-site
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Informatica MDM
Good-to-have skills: NA
Minimum 7.5 years of experience is required.
Educational Qualification: 15 years of full-time education
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, addressing any challenges that arise, and providing guidance to team members to foster a productive work environment. You will also engage in strategic discussions to align project goals with organizational objectives, ensuring that the applications developed meet stakeholders' needs effectively.
Roles & Responsibilities:
- Act as a subject matter expert (SME).
- Collaborate with and manage the team to perform.
- Be responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and implement necessary adjustments to meet deadlines.
Professional & Technical Skills:
- Must-have: proficiency in Informatica MDM.
- Strong understanding of data integration and data quality processes.
- Experience with data modeling and metadata management.
- Familiarity with ETL processes and data warehousing concepts.
- Ability to troubleshoot and resolve technical issues efficiently.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Informatica MDM.
- This position is based in Mumbai.
- 15 years of full-time education is required.
Posted 5 days ago
8.0 - 12.0 years
0 Lacs
Kolkata, West Bengal
On-site
The role requires you to design and implement data modeling solutions using relational, dimensional, and NoSQL databases. You will closely collaborate with data architects to create customized databases utilizing a blend of conceptual, physical, and logical data models. As a data modeler, your responsibilities include designing, implementing, and documenting data architecture and modeling solutions across various database types to support enterprise information management, business intelligence, machine learning, and data science initiatives. Your key responsibilities will involve implementing business and IT data requirements by devising new data strategies and designs for different data platforms and tools. You will engage with business and application teams to execute data strategies, establish data flows, and develop conceptual, logical, and physical data models. Moreover, you will define and enforce data modeling standards, tools, and best practices, while also identifying architecture, infrastructure, data interfaces, security considerations, analytic models, and data visualization aspects. Additionally, you will undertake hands-on tasks such as modeling, design, configuration, installation, performance tuning, and sandbox proof of concept. It is crucial to work proactively and independently to fulfill project requirements, communicate challenges effectively, and mitigate project delivery risks. Qualifications required for this role include a BE/B.Tech degree or equivalent and a minimum of 8 years of hands-on experience in relational, dimensional, and/or analytic domains, involving RDBMS, dimensional, NoSQL platforms, and data ingestion protocols. Proficiency in data warehouse, data lake, and big data platforms within multi-data-center environments is essential. Familiarity with metadata management, data modeling tools (e.g., Erwin, ER Studio), and team management skills are also necessary. 
Your primary skills should encompass developing conceptual, logical, and physical data models; implementing RDBMS, ODS, data marts, and data lakes; and ensuring optimal data query performance. You will be responsible for expanding the existing data architecture and applying best practices. The ability to work both independently and collaboratively is vital for this role. Preferred skills for this position include experience with data modeling tools and methodologies, as well as strong analytical and problem-solving abilities.
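To make the dimensional modeling work described above concrete, here is a minimal, purely illustrative star-schema sketch (all table and column names are invented), using Python's sqlite3 as a stand-in for an enterprise RDBMS:

```python
import sqlite3

# Hypothetical star schema: one fact table keyed to two dimensions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT, region TEXT);
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, iso_date TEXT, year INTEGER);
CREATE TABLE fact_sales (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    amount       REAL
);
""")
conn.execute("INSERT INTO dim_customer VALUES (1, 'Acme', 'East')")
conn.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01', 2024)")
conn.execute("INSERT INTO fact_sales VALUES (1, 20240101, 250.0)")

# A typical dimensional query: join the fact to a dimension and aggregate.
row = conn.execute("""
    SELECT c.region, SUM(f.amount)
    FROM fact_sales f JOIN dim_customer c USING (customer_key)
    GROUP BY c.region
""").fetchone()
print(row)  # ('East', 250.0)
```

The same physical shape scales up in RDBMS, dimensional, or data-lake platforms; only the storage engine and data types change.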
Posted 5 days ago
2.0 years
0 Lacs
Gujarat, India
Remote
About The Job
About CloudLabs: CloudLabs Inc was founded in 2014 with the mission to provide exceptional IT & business consulting services at a competitive price, helping clients realize the best value from their investments. Within a short span, CloudLabs evolved from pure-play consulting into a transformative partner for Business Acceleration Advisory, Transformative Application Development & Managed Services, enabling digital transformations, M&A transitions, automation and process-driven optimizations, and complex integration initiatives for enterprises across the globe. As a strategic planning and implementation partner for global companies, CloudLabs has seen a 200% uptake in winning high-value, high-impact, and high-risk projects that are critical for the business. With offices in the US, Canada, Mexico, and India, and a team of 200+ experienced specialists, CloudLabs is now at an inflection point and ready for its next curve of progress.
What We Offer
- We welcome candidates rejoining the workforce after a career break or parental leave and support their journey to reacclimatize to corporate life.
- Flexible remote work.
- Competitive pay package.
- Attractive policies, medical insurance benefits, and industry-leading training.
- Opportunity to work remotely.
Experience: Minimum 2-3 years of relevant experience.
Job Type: Onsite
Location: Gujarat
Job Description
We are looking for a motivated and technically sound Data Engineer with 2 to 3 years of experience to join our data engineering team. The ideal candidate will have a solid understanding of database systems, strong SQL/PL-SQL skills, and a willingness to grow in modern cloud data technologies like Snowflake.
Duties And Responsibilities
- Design, develop, and maintain robust data pipelines and workflows.
- Write optimized SQL/PL-SQL scripts to extract, transform, and load data.
- Support data integration across systems and ensure high data quality.
- Collaborate with cross-functional teams to understand data needs and deliver solutions.
- Participate in performance tuning, data modeling, and code reviews.
- Continuously explore and adopt cloud data technologies to improve systems and workflows.
- Ensure timely delivery of data solutions and documentation.
- Work from the Gujarat office (minimum 4 days per week) as part of a collaborative team environment.
What We're Looking For
- 2 to 3 years of experience in data engineering or database development roles.
- Strong understanding of database concepts and relational data modeling.
- Ability to write and troubleshoot complex SQL and PL/SQL queries.
- Hands-on experience in Python.
- This role requires working from our Gujarat office 4 days a week.
Preferred But Not Required Qualifications
- Exposure to ETL processes and tools.
- Experience working with Snowflake or other cloud data warehouse platforms.
- Strong written and verbal communication skills.
- Willingness to learn and complete certifications in cloud data warehouse technologies (e.g., Snowflake) with minimal supervision. (ref:hirist.tech)
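As a rough illustration of the SQL-based extract-transform-load work the posting describes (not CloudLabs' actual stack), a toy pipeline step might look like this, with sqlite3 standing in for the source database and the warehouse target:

```python
import sqlite3

# Toy ETL step: extract raw rows, transform (trim and uppercase codes,
# drop rows with missing quantities), then load into a clean target.
# All table names and data are invented for illustration.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE raw_orders (code TEXT, qty INTEGER)")
src.executemany("INSERT INTO raw_orders VALUES (?, ?)",
                [(" ab ", 2), ("cd", None), ("ef", 5)])

tgt = sqlite3.connect(":memory:")
tgt.execute("CREATE TABLE orders (code TEXT NOT NULL, qty INTEGER NOT NULL)")

# Extract, transform, load.
rows = src.execute("SELECT code, qty FROM raw_orders").fetchall()
clean = [(c.strip().upper(), q) for c, q in rows if q is not None]
tgt.executemany("INSERT INTO orders VALUES (?, ?)", clean)

loaded = tgt.execute("SELECT * FROM orders ORDER BY code").fetchall()
print(loaded)  # [('AB', 2), ('EF', 5)]
```

In a production pipeline the transform would typically live in SQL or PL/SQL on the database side; the Python wrapper here just keeps the sketch self-contained.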
Posted 5 days ago
6.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Microsoft is a company where passionate innovators come to collaborate, envision what can be, and take their careers further. This is a world of more possibilities, more innovation, more openness, and sky's-the-limit thinking in a cloud-enabled world. Microsoft's Azure Data engineering team is leading the transformation of analytics in the world of data with products like databases, data integration, big data analytics, messaging & real-time analytics, and business intelligence. Our product portfolio includes Microsoft Fabric, Azure SQL DB, Azure Cosmos DB, Azure PostgreSQL, Azure Data Factory, Azure Synapse Analytics, Azure Service Bus, Azure Event Grid, and Power BI. Our mission is to build the data platform for the age of AI, powering a new class of data-first applications and driving a data culture. Within Azure Data, the data integration team builds data gravity on the Microsoft Cloud. Massive volumes of data are generated, not just from transactional systems of record, but also from the world around us. Our data integration products, Azure Data Factory and Power Query, make it easy for customers to bring in, clean, shape, and join data to extract intelligence. We're the team that developed the Mashup Engine (M) and Power Query. We already ship monthly to millions of users across Excel, Power/Pro BI, Flow, and PowerApps; but in many ways we're just getting started.
We're building new services, experiences, and engine capabilities that will broaden the reach of our technologies to several new areas: data "intelligence", large-scale data analytics, and automated data integration workflows. We plan to use example-based interaction, machine learning, and innovative visualization to make data access and transformation even more intuitive for non-technical users. We do not just value differences or different perspectives. We seek them out and invite them in so we can tap into the collective power of everyone in the company. As a result, our customers are better served.
Responsibilities
- Engine layer: designing and implementing components for dataflow orchestration, distributed querying, query translation, connecting to external data sources, and script parsing/interpretation
- Service layer: designing and implementing infrastructure for a containerized, microservices-based, high-throughput architecture
- UI layer: designing and implementing performant, engaging web user interfaces for data visualization/exploration/transformation/connectivity and dataflow management
- Embody our culture and values
Qualifications
Required/Minimum Qualifications
- Bachelor's Degree in Computer Science or a related technical discipline AND 6+ years of technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python, OR equivalent experience
- Experience in data integration, migrations, or ELT/ETL tooling is mandatory
Preferred/Additional Qualifications
- BS degree in Computer Science
- Engine role: familiarity with data access technologies (e.g. ODBC, JDBC, OLEDB, ADO.Net, OData), query languages (e.g. T-SQL, Spark SQL, Hive, MDX, DAX), query generation/optimization, OLAP
- UI role: familiarity with JavaScript, TypeScript, CSS, React, Redux, webpack
- Service role: familiarity with microservice architectures, Docker, Service Fabric, Azure blobs/tables/databases, high-throughput services
- Full-stack role: a mix of the qualifications for the UX/service/backend roles
Other Requirements
Ability to meet Microsoft, customer and/or government security screening requirements is required for this role. These requirements include, but are not limited to, the following specialized security screening: Microsoft Cloud Background Check: This position will be required to pass the Microsoft Cloud background check upon hire/transfer and every two years thereafter.
Equal Opportunity Employer (EOE)
#azdat #azuredata #microsoftfabric #dataintegration
Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
Posted 5 days ago
8.0 years
0 Lacs
Jaipur, Rajasthan, India
On-site
About The Job
Asymbl is an innovative technology company that combines industry-specific products, digital workforce transformation, and deep Salesforce consulting expertise to drive growth and innovation. We deliver both advanced software and strategic services to help organizations modernize how work gets done. We pride ourselves on a culture of relentless curiosity and belief, grounded in trust and integrity, driven by a bias to action and a willingness to fail fast, while remaining unwaveringly customer-focused and dedicated to fostering the potential of our people.
Position Overview
The Senior Salesforce BI & Analytics Architect will lead the design and implementation of robust analytics solutions, leveraging Salesforce Data Cloud (formerly Customer Data Platform) to empower data-driven decision-making. This role focuses on integrating complex data sources, architecting scalable solutions, and delivering actionable insights through Salesforce Tableau CRM, Tableau, and other advanced analytics tools. The ideal candidate combines deep expertise in Salesforce ecosystems, strong business acumen, and the ability to translate data into impactful strategies.
Why Join Us?
Join Asymbl to shape the future of data-driven transformation. As a Senior Salesforce BI & Analytics Architect, you'll work on challenging projects that leverage Salesforce Data Cloud to deliver next-generation analytics. Be part of a collaborative, innovative team where your expertise will drive real business impact. We offer competitive compensation, professional growth opportunities, and a vibrant company culture that values continuous learning and innovation.
Responsibilities
- Lead the design and architecture of Salesforce analytics solutions, with a focus on Salesforce Data Cloud, Tableau CRM, and Tableau.
- Integrate and harmonize data from diverse sources, ensuring data quality, consistency, and scalability.
- Design and implement customer-centric data models, leveraging the capabilities of Salesforce Data Cloud for real-time analytics and insights.
- Build advanced dashboards, reports, and visualizations that provide actionable insights to business users.
- Collaborate with business and technical stakeholders to understand reporting and analytics requirements, translating them into scalable solutions.
- Implement data governance, security, and compliance best practices within the Salesforce ecosystem.
- Optimize the performance of analytics solutions, ensuring efficient data processing and timely delivery of insights.
- Provide technical leadership and mentorship to junior architects, developers, and analysts.
- Stay abreast of emerging trends and innovations in data analytics, ensuring solutions leverage the latest technologies and practices.
Qualifications
- Bachelor's degree in Computer Science, Data Analytics, or a related field; advanced degrees preferred.
- 8+ years of experience in BI/Analytics architecture, with at least 3 years specializing in Salesforce Data Cloud and analytics tools.
- Expertise in Salesforce Data Cloud, Tableau CRM (formerly Einstein Analytics), Tableau, and data modeling within the Salesforce ecosystem.
- Strong knowledge of data integration techniques, ETL processes, and APIs within Salesforce.
- Proven experience working with large-scale, complex datasets and building real-time analytics solutions.
- Deep understanding of data governance, security, and compliance standards, especially within Salesforce environments.
- Hands-on experience with Salesforce Analytics Query Language (SAQL), Tableau Server/Online, and advanced dashboard design.
- Salesforce certifications such as Tableau CRM & Einstein Discovery Consultant or Data Architect are preferred.
- Excellent communication and stakeholder management skills, with the ability to present complex data concepts in a clear, concise manner.
Familiarity with additional enterprise analytics tools or platforms (e.g., Power BI, Snowflake) is a plus. (ref:hirist.tech)
Posted 5 days ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About The Role
Are you a data storyteller who thrives on solving real business challenges with elegant and scalable BI solutions? Do you have an eye for detail, a knack for visualizing insights, and hands-on experience with Power BI and Looker? If yes, we're looking for you! We are on the hunt for a Senior BI Developer who can bring clarity to complexity. You will be a strategic partner in our analytics ecosystem, delivering high-impact dashboards and reports that drive critical decision-making across business functions.
What You'll Do
- Architect & Build: Develop intuitive and visually compelling dashboards using Power BI and Looker, tailored to various business use cases.
- DAX Wizardry: Write and optimize advanced DAX queries to drive complex metrics and calculations with precision.
- Cross-Functional Collaboration: Partner with business teams, analysts, and data engineers to understand requirements, model data, and translate them into clear and actionable BI deliverables.
- Data Integrity Champion: Ensure consistency, accuracy, and performance across BI assets. Own the quality and scalability of dashboards.
- Insight Generation: Go beyond dashboards; use your analytical mindset to discover trends, anomalies, and opportunities hidden in the data.
- Process Automation: Identify repetitive processes and automate them using Power Query, LookML, or SQL to improve reporting efficiency.
- Innovation & Best Practices: Stay abreast of BI trends, recommend tool enhancements, and drive BI maturity within the team.
Must-Haves (What You Bring to the Table)
- At least 5 years of relevant BI development experience (total experience: 6+ years preferred)
- Advanced expertise in Power BI, including Power Query, DAX, and report optimization
- Strong working knowledge of Looker and LookML
- Proficiency in SQL for data modeling, transformations, and querying large datasets
- Solid understanding of data warehouse concepts, ETL pipelines, and data architecture
- Experience handling large-scale datasets and performance tuning
- Bachelor's Degree in Computer Science, Engineering, or a related field
- Excellent verbal and written communication, with the ability to explain technical concepts to non-technical users
Good-to-Haves
- Exposure to cloud platforms: Azure, GCP, or AWS
- Familiarity with Agile project management practices
- Knowledge of Python or R for deeper analytics and custom data wrangling
- Experience integrating BI tools with source systems and APIs
Why You'll Love This Role
- Work on business-critical projects with full visibility and ownership
- Join a forward-thinking analytics team where your inputs shape data culture
- Flexibility through a hybrid model that respects your time and productivity
- Career growth through exposure to a diverse tech stack and real-time challenges
- Engage with leaders who appreciate innovation, transparency, and continuous learning (ref:hirist.tech)
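DAX itself cannot run outside Power BI, but the kind of measure this role calls for, such as a year-over-year growth calculation, can be sketched in plain Python (sample data and field names are invented for illustration):

```python
from collections import defaultdict

# Stand-in for a DAX-style year-over-year measure; the real thing would
# be something like DIVIDE([Sales] - [Sales LY], [Sales LY]) evaluated
# in filter context. Data below is purely hypothetical.
sales = [
    {"year": 2022, "amount": 100.0},
    {"year": 2022, "amount": 150.0},
    {"year": 2023, "amount": 300.0},
]

# Aggregate amounts per year (the "measure" grain).
totals = defaultdict(float)
for row in sales:
    totals[row["year"]] += row["amount"]

def yoy_growth(year):
    """Growth vs. the prior year; None when there is no prior-year total."""
    prev = totals.get(year - 1)
    return None if not prev else (totals[year] - prev) / prev

print(yoy_growth(2023))  # 0.2  (300 vs. 250)
```

The safe-division guard mirrors DAX's DIVIDE, which returns blank rather than erroring when the denominator is missing or zero.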
Posted 5 days ago
6.0 - 10.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a SAP Data Conversion Consultant with 6 to 8 years of relevant experience, you will collaborate with cross-functional teams to understand data requirements and design effective conversion strategies. Your responsibilities will include analyzing and cleansing data to ensure accuracy, completeness, and integrity throughout the conversion process. You will be tasked with designing and implementing data validation procedures to verify the accuracy of converted data. In this role, you will work closely with clients to gather requirements, define conversion scope, and ensure alignment with business objectives. Additionally, you will provide guidance and support to junior team members involved in data conversion activities. Documenting conversion processes, methodologies, and best practices for future reference will also be part of your responsibilities. Monitoring conversion progress, identifying risks and issues, and implementing mitigation strategies as needed will be crucial aspects of your role as a SAP Data Conversion Consultant. The required skills for this position include expertise in ETL processes. Our hiring process for this position involves screening (HR round), Technical Round 1, Technical Round 2, and a final HR round. This is a full-time position located in Hyderabad, Bangalore, or Gurgaon. Please note that this position has already been filled. Thank you for your interest.
Posted 5 days ago
5.0 - 9.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Senior SQL Developer at our company, you will play a crucial role in our BI & analytics team by expanding and optimizing our data and data queries. Your responsibilities will include optimizing data flow and collection for consumption by our BI & Analytics platform. You should be experienced in building data queries and wrangling data, with a passion for optimizing data systems from the ground up. Collaborating with software developers, database architects, data analysts, and data scientists, you will support data and product initiatives and ensure consistent, optimal data delivery architecture across ongoing projects. Your self-directed approach will be essential in supporting the data needs of multiple systems and products. If you are excited about enhancing our company's data architecture to support our upcoming products and data initiatives, this role is perfect for you. Your essential functions will involve creating and maintaining optimal SQL queries, views, tables, and stored procedures. By working closely with various business units such as BI, Product, and Reporting, you will contribute to developing the data warehouse platform vision, strategy, and roadmap. Understanding physical and logical data models and ensuring high-performance access to diverse data sources will be key aspects of your role. Encouraging the adoption of organizational frameworks through documentation, sample code, and developer support will also be part of your responsibilities. Effective communication of the progress and effectiveness of developed frameworks to department heads and managers will be essential. To be successful in this role, you should possess a Bachelor's or Master's degree or an equivalent combination of education and experience in a relevant field. Proficiency in T-SQL, data warehouses, star schema, data modeling, OLAP, SQL, and ETL is required. Experience in creating tables, views, and stored procedures is crucial.
Familiarity with BI and reporting platforms, industry trends, and knowledge of multiple database platforms like SQL Server and MySQL are necessary. Proficiency in source control and project management tools such as Azure DevOps, Git, and JIRA is expected. Experience with SonarQube for clean T-SQL coding practices and DevOps best practices will be advantageous. Applicants must have exceptional written and spoken communication skills and strong team-building abilities to contribute to making strategic decisions and advising senior management on technical matters. With at least 5+ years of experience in a data warehousing position, including working as a SQL Developer, and experience in the system development lifecycle, you should also have a proven track record in data integration, consolidation, enrichment, and aggregation. Strong analytical skills, attention to detail, organizational skills, and the ability to mentor junior colleagues will be crucial for success in this role. This full-time position requires flexibility to support different time zones between 12 PM and 9 PM IST, Monday through Friday. You will work in a hybrid mode and spend at least 2 days working from the office in Hyderabad. Occasional evening and weekend work may be expected based on client needs or job-related emergencies. This job description may not cover all responsibilities and duties, which may change with or without notice.
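The view-building work this role describes can be sketched, purely illustratively, with SQLite syntax standing in for T-SQL (table and column names are invented):

```python
import sqlite3

# A reporting view gives BI tools a stable, pre-aggregated interface
# over raw event data, one of the "Views" deliverables listed above.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE page_hits (page TEXT, ms INTEGER);
INSERT INTO page_hits VALUES ('home', 120), ('home', 80), ('about', 40);

CREATE VIEW vw_page_latency AS
SELECT page, COUNT(*) AS hits, AVG(ms) AS avg_ms
FROM page_hits
GROUP BY page;
""")

rows = conn.execute("SELECT * FROM vw_page_latency ORDER BY page").fetchall()
print(rows)  # [('about', 1, 40.0), ('home', 2, 100.0)]
```

In T-SQL the same idea extends naturally to indexed views and stored procedures that parameterize the aggregation; the point of the view is that downstream reports never touch the raw table directly.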
Posted 5 days ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Microsoft's Azure Data engineering team is leading the transformation of analytics in the world of data with products like databases, data integration, big data analytics, messaging & real-time analytics, and business intelligence. Our product portfolio includes Microsoft Fabric, Azure SQL DB, Azure Cosmos DB, Azure PostgreSQL, Azure Data Factory, Azure Synapse Analytics, Azure Service Bus, Azure Event Grid, and Power BI. Our mission is to build the data platform for the age of AI, powering a new class of data-first applications and driving a data culture. Within Azure Data, the data integration team builds data gravity on the Microsoft Cloud. Massive volumes of data are generated, not just from transactional systems of record, but also from the world around us. Our data integration products, Azure Data Factory and Power Query, make it easy for customers to bring in, clean, shape, and join data to extract intelligence. We do not just value differences or different perspectives. We seek them out and invite them in so we can tap into the collective power of everyone in the company. As a result, our customers are better served.
Responsibilities
- Build cloud-scale products with a focus on efficiency, reliability, and security
- Build and maintain end-to-end build, test, and deployment pipelines
- Deploy and manage massive Hadoop, Spark, and other clusters
- Contribute to the architecture and design of the products
- Triage issues and implement solutions to restore service with minimal disruption to the customer and business
- Perform root cause analysis, trend analysis, and post-mortems
- Own components and drive them end to end, from gathering requirements through development, testing, and deployment, to ensuring high quality and availability post deployment
- Embody our culture and values
Qualifications
Required/Minimum Qualifications
- Bachelor's Degree in Computer Science or a related technical discipline AND 4+ years of technical engineering experience with coding in languages like C#, React, Redux, TypeScript, JavaScript, Java, or Python, OR equivalent experience
- Experience in data integration, data migrations, or ELT/ETL tooling is mandatory
Other Requirements
Ability to meet Microsoft, customer and/or government security screening requirements is required for this role. These requirements include, but are not limited to, the following specialized security screening: Microsoft Cloud Background Check: This position will be required to pass the Microsoft Cloud background check upon hire/transfer and every two years thereafter.
Equal Opportunity Employer (EOE)
#azdat #azuredata #microsoftfabric #dataintegration
Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
Posted 5 days ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
As an ETL Testing professional at CGI, you will be part of a dynamic team that is committed to helping clients succeed in their IT and business process services. With 5-8 years of experience, you will be based in Chennai or Bangalore, working full-time from the office on a Monday to Friday schedule from 12:30 PM to 9:30 PM. Your role will involve utilizing your analytics skills to understand requirements, develop test cases, and manage data effectively. You will need strong SQL skills and hands-on experience testing data pipelines built using Glue, S3, Redshift, and Lambda. Collaboration with developers to build automated testing and a solid understanding of data concepts like data lineage, data integrity, and quality are essential for success in this role. Previous experience in testing financial data is considered a plus. You will be expected to demonstrate expert-level analytical and problem-solving skills, flexibility in testing approaches, and awareness of Quality Management tools and techniques. Ensuring best practice quality assurance of deliverables, working within agreed architectural processes, data, and organizational frameworks will be crucial. Effective communication skills, proficiency in English (written/verbal), and local language as necessary are required. An open-minded approach to sharing information, transferring knowledge, and supporting team members will be key to your success. Must-have skills for this role include ETL and SQL proficiency, hands-on testing of data pipelines, experience with Glue, S3, Redshift, data lineage, and data integrity. Additionally, experience testing financial data will be advantageous. At CGI, we value ownership, teamwork, respect, and belonging. As a CGI Partner, you will have the opportunity to contribute meaningfully from day one, shaping the company's strategy and direction. Your work will create value through innovative solutions, collaboration with colleagues and clients, and access to global capabilities. 
You will have the chance to grow and develop your skills within a supportive environment that prioritizes your well-being and professional growth. Join CGI, one of the largest IT and business consulting services firms globally, and together let's turn meaningful insights into action.
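The data-integrity checks this ETL testing role centers on, comparing a source against its loaded target, can be sketched as a small reconciliation routine (sqlite3 here stands in for the Glue/S3/Redshift pipeline; all names are hypothetical):

```python
import sqlite3

# Minimal source-vs-target reconciliation: row counts and column sums
# must match after the load, a common first-line ETL integrity test.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src (id INTEGER, amount REAL);
CREATE TABLE tgt (id INTEGER, amount REAL);
INSERT INTO src VALUES (1, 10.0), (2, 20.5);
INSERT INTO tgt VALUES (1, 10.0), (2, 20.5);
""")

def reconcile(conn, src, tgt):
    """Compare aggregate metrics between two tables; True means they match."""
    checks = {}
    for metric in ("COUNT(*)", "SUM(amount)"):
        a = conn.execute(f"SELECT {metric} FROM {src}").fetchone()[0]
        b = conn.execute(f"SELECT {metric} FROM {tgt}").fetchone()[0]
        checks[metric] = (a == b)
    return checks

result = reconcile(conn, "src", "tgt")
print(result)  # {'COUNT(*)': True, 'SUM(amount)': True}
```

Real test suites add per-column checksums, null-rate comparisons, and lineage assertions on top of this, and automate the whole run against each pipeline deployment.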
Posted 5 days ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
You will be an integral part of the Asset & Wealth Management Strategic Transformation Office at Goldman Sachs. Your main responsibility will be extracting valuable insights from extensive datasets to help senior leaders expand business relationships with clients. By developing an automation infrastructure for reporting initiatives, you will identify business trends and enhance scalability for client analysis. Your role will also focus on recognizing business gaps, implementing strategies to drive efficiencies, and maintaining the team's analytics engines.
Key Responsibilities:
- Design, build, and manage scalable, automated systems, reports, and dashboards that meet the firm's analytical and business requirements
- Monitor and communicate project progress following the solution delivery lifecycle
- Collaborate with fellow solution experts and advisors to exchange ideas and code by presenting outputs
- Extract large datasets from various sources, transform and standardize the data, design dimensional data models, and load transformed data into relational databases
- Generate, distribute, and analyze business performance metrics for regular reporting to management
- Contribute significantly to global initiatives aimed at enhancing and streamlining critical business projects
Basic Qualifications:
- Master's degree in any discipline
- Familiarity with industry-standard data transformation and reporting tools such as Tableau, Alteryx, and Power BI
- Strong analytical skills, comfort handling extensive datasets, and the ability to present insights in a clear, compelling manner
- Proficiency in MS Office applications (Excel, PowerPoint, Word, Outlook)
- Ability to organize workload effectively and manage multiple priorities
- Excellent written and verbal communication skills with strong interpersonal acumen
- Comfort leveraging data and technology to drive informed business decisions
- Prior knowledge of the Asset & Wealth Management industry is advantageous
Skills / Experience:
- More than 3 years of experience in data analytics
- Background in the financial services sector, preferably in an analytical role
- Proficient in ETL development, SQL queries, and data analysis on relational database platforms
- Solid analytical and logical mindset with keen attention to detail
- Team-oriented, with a high sense of ownership and accountability
- Curious, proactive, and self-motivated approach to work
- Exceptional organizational skills, meticulous attention to detail, and a commitment to follow-through
- Positive outlook and strong work ethic
Goldman Sachs values diversity and inclusion, providing numerous opportunities for professional and personal growth. The organization is dedicated to accommodating candidates with special needs or disabilities during the recruitment process. For more information, visit: [Goldman Sachs Disability Statement](https://www.goldmansachs.com/careers/footer/disability-statement.html). Goldman Sachs is an equal employment and affirmative action employer, committed to supporting its people, capital, and ideas to drive growth for clients, shareholders, and communities worldwide.
Posted 5 days ago
1.0 - 5.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a DECISION SCIENCE JUNIOR ANALYST at HSBC, you will play a crucial role in supporting the business by providing strategic input for senior management, enabling effective decision-making, and addressing unforeseen challenges. Leveraging data and analytics capabilities, you will contribute to smarter decisions and drive profitable growth across various domains such as Regulatory, Operations, Procurement, Human Resources, and Financial Crime Risk. Your responsibilities will include data analysis, model and strategy development & implementation, Business Intelligence, reporting, and data management. You will work on a variety of business problems related to business growth, customer experience enhancement, risk exposure limitation, capital quantification, and internal business process improvement. Proactively identifying emerging compliance risks and proposing innovative solutions will be part of your role. Leading cross-functional projects using advanced data modeling and analysis techniques, you will uncover insights to guide strategic decisions and identify optimization opportunities. In the midst of regulatory changes, you will maintain a strong understanding of regulatory developments and compliance risk management. Delivering repeatable and scalable analytics through the semi-automation of Financial Crime Risk and Regulatory Compliance Risk Assurance controls testing will also be a key aspect of your role. Requirements for this position include a Bachelor's degree in statistics, economics, or related quantitative fields, along with 1-4 years of experience in Automation & Analytics. Strong analytical skills, business analysis experience, and basic knowledge of financial services/banking operations are essential. Proficiency in Python, data science tools, visualization tools like QlikSense, SQL/ETL tools, big data tools (Teradata, Hadoop), cloud technologies (GCP/AWS/Azure), and data engineering skills are advantageous. 
Experience in data science, machine learning algorithms, and building data pipelines using modern tools/libraries will be beneficial. Join HSBC and be part of a team that values your contributions and offers opportunities for personal and professional growth. Your work will have a direct impact on enabling businesses to thrive, economies to prosper, and individuals to achieve their aspirations.
Posted 5 days ago
3.0 - 5.0 years
0 Lacs
Navi Mumbai, Maharashtra, India
Remote
Description There's likely a reason you've taken the time out of your busy day to review this opportunity at PulsePoint. Maybe you're in need of a change or there's “an itch you're looking to scratch.” Whatever may be the reason, listen to what some of our team members are saying about working here: "My manager takes the time to not only identify my next career move, but the steps that will take me there. I even have my own personal training budget that I'm encouraged to spend." "Our input is valued and considered. Everyone has a voice and that goes a long way in ensuring that we're moving towards a shared goal." "The Leadership team is incredibly open on their goals and how I contribute to the larger company mission. We all know where we fit and how we can make an impact every day." PulsePoint is growing, and we're looking for a Data Analyst to join our Data Analytics team! A BIT ABOUT US: PulsePoint is a fast-growing healthcare technology company (with adtech roots) using real-time data to transform healthcare. We help brands and agencies interpret the hard-to-read signals across the health journey and unify these digital determinants of health with real-world data to produce the most dimensional view of the customer. Our award-winning advertising platforms use machine learning and programmatic automation to seamlessly activate this data, making marketing, predictive analytics, and decision support easy and instantaneous. The most exciting part about working at PulsePoint is the enormous potential for personal and professional growth. Data Analyst Our Analysts take full ownership of complex data workflows and help drive innovation across PulsePoint's analytics products like Signal and Omnichannel. They build scalable solutions, automate manual processes, and troubleshoot issues across teams. By turning raw data into clear, actionable insights, they support both internal stakeholders and external clients. 
Data Analysts on the Data Analytics team work closely with Product Managers, BI, Engineering, and Client Teams to deliver high-impact analysis, reporting, and feature enhancements that shape the future of our data and analytics platforms. THE PRODUCT YOU’LL BE WORKING ON: You'll be working on HCP365, the core technology of PulsePoint’s analytics products, Signal and Omnichannel. HCP365 (the first Signal product) is an AWARD-WINNING product, having won a Martech Breakthrough Award and been a finalist for the PM360 Trailblazer Award. It's the only health analytics and measurement solution that provides a complete, always-on view of HCP audience engagement across all digital channels with advanced data logic, automation, and integrations. This gives marketers and researchers unprecedented access to data insights and reporting used to inform and optimize investment decisions. You'll be helping scale the platform further with new features, cleaner attribution, and smarter automation to support client growth and product expansion. WHAT YOU'LL BE DOING: This is a hybrid role at the intersection of data analysis, operational support, and technical platform ownership. Your work will directly contribute to the accuracy, scalability, and performance of our attribution and measurement solutions.
- Take ownership of key workflows like HCP365 and Omnichannel
- Support Omnichannel at an enterprise level across the whole organization
- Build and improve BigQuery SQL-based pipelines and Airflow DAGs
- Conduct R&D and contribute to roadmap planning for new product features
- Support client teams with data deep dives, troubleshooting, and ad hoc analysis
- Translate complex data into simple, client-facing insights and dashboards
- Dig into data to answer and resolve client questions
REQUIRED QUALIFICATIONS: Minimum 3-5 years of relevant experience, including:
- Understanding of deterministic and probabilistic attribution methodologies
- Proficiency in analyzing multi-device campaign performance and user behavior
- Excellent problem-solving and data analysis skills; ability to organize large data sets to answer critical questions, extrapolate trends, and tell a story
- Writing and debugging complex SQL queries from scratch using real business data
- Strong understanding of data workflows, joins, deduplication, attribution, and QA
- Working with Airflow workflows, ETL pipelines, or scheduling tools
- Proficiency in Excel (pivot tables, VLOOKUP, formulas, functions)
- Understanding of web analytics platforms (Google Analytics, Adobe Analytics, etc.)
- Experience with at least one BI tool (Tableau, Looker, etc.)
- Able to work 9am-6pm EST (6:30pm-3:30am IST); we are fine with remote work
Note that this role is for India only and we do not plan on transferring hires to the U.S./UK in the future.
PREFERRED QUALIFICATIONS:
- Python for automation or workflow logic
- Basic experience with: designing and optimizing data pipelines; AI agents, automation tools, or workflow scripting; dashboard design and data storytelling
- One of: ELT experience, automation experience, or a statistics background
- Exposure to: health-related datasets and hashed identifiers; workflow optimization or code refactoring; project management tools like JIRA or Confluence
- Bonus if you've worked on R&D or helped build data products from scratch
WHAT WE'RE LOOKING FOR: We're looking for a hands-on, reliable, and proactive Analyst who can:
- Jump into complex workflows and own them end-to-end
- Troubleshoot issues and bring clarity to ambiguous situations
- Balance deep technical work with cross-team collaboration
- Build scalable, automated, and accurate solutions
SELECTION PROCESS:
- Initial Phone Screen
- SQL Screening Test via CodeSignal (35 minutes)
- SQL Live Coding Interview (60 minutes)
- Hiring Manager Interview (30 minutes)
- Team Interview (1:1s with Sr. Client Analyst, Team Manager, SVP of Data, and the Product Manager who built Signal) (3 x 45 minutes)
RED FLAGS FOR US: Candidates won’t succeed here if they haven’t worked closely with data sets, or if they have simply translated requirements created by others into SQL without a deeper understanding of how the data impacts our business and, in turn, our clients’ success metrics. Watch this video here to learn more about our culture and get a sense of what it’s like to work at PulsePoint!
WebMD and its affiliates is an Equal Opportunity/Affirmative Action employer and does not discriminate on the basis of race, ancestry, color, religion, sex, gender, age, marital status, sexual orientation, gender identity, national origin, medical condition, disability, veterans status, or any other basis protected by law.
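The SQL skills this posting calls out (writing complex queries, joins, and deduplication) can be illustrated with a minimal, self-contained sketch. Table and column names here are hypothetical, and SQLite is used purely for portability; a warehouse like BigQuery would accept the same `ROW_NUMBER()` pattern:

```python
import sqlite3

# In-memory database standing in for a real warehouse table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE events (user_id INTEGER, channel TEXT, ts TEXT);
INSERT INTO events VALUES
  (1, 'email',  '2024-01-01'),
  (1, 'email',  '2024-01-03'),  -- later duplicate for the same user/channel
  (2, 'search', '2024-01-02');
""")

# Keep only the most recent event per (user_id, channel) via ROW_NUMBER().
rows = conn.execute("""
SELECT user_id, channel, ts FROM (
  SELECT user_id, channel, ts,
         ROW_NUMBER() OVER (
           PARTITION BY user_id, channel ORDER BY ts DESC
         ) AS rn
  FROM events
) AS t
WHERE rn = 1
ORDER BY user_id
""").fetchall()

print(rows)  # [(1, 'email', '2024-01-03'), (2, 'search', '2024-01-02')]
```

The window-function approach keeps deduplication logic declarative and auditable, which matters when the same query feeds client-facing attribution reports.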
Posted 5 days ago
3.0 - 10.0 years
0 Lacs
Karnataka
On-site
You are an experienced Software Developer with a strong background in SQL databases, responsible for leading the development of SQL databases for various applications and business needs. Your expertise in data architecture and management will be crucial in designing and scaling SQL databases to meet the organization's requirements. You will also play a key role in writing SQL queries to store, sort, and retrieve a wide range of data. Your ability to think quickly, stay organized, and troubleshoot issues efficiently will be essential for day-to-day operations. Your responsibilities will include designing, developing, and maintaining robust SQL databases and database solutions in both on-premises and Cloud environments (AWS & Azure). You will provide technical expertise in migrating databases from on-premises to the Cloud and have knowledge of C++ as an added advantage. Additionally, you will lead a team of SQL developers, offer technical guidance, analyze and resolve issues in real-time, automate processes, track issues, and document changes. You will evaluate business data, recommend analytic strategies, perform statistical analysis, and work closely with development and architecture teams to optimize database schemas. To excel in this role, you should have a Bachelor's degree in Computer Science or a related field, along with 10+ years of experience in SQL development and database management (MS SQL, PostgreSQL). You should also possess 3+ years of experience in data analysis in an enterprise setting, a strong understanding of database design principles and data modeling, knowledge of ETL concepts, and excellent communication and presentation skills. Strong quantitative skills, attention to detail, problem-solving abilities, and the capacity to collaborate effectively with various teams are also essential. 
Working at LSEG, a leading global financial markets infrastructure and data provider, will offer you the opportunity to be part of a diverse workforce across 65 countries. You will contribute to a culture that values individuality, encourages new ideas, and is committed to sustainability. By helping to re-engineer the financial ecosystem for sustainable economic growth and supporting the transition to net zero, you will play a critical role in driving inclusive economic opportunity. LSEG provides a range of benefits and support, including healthcare, retirement planning, paid volunteering days, and wellbeing initiatives. If you join us, you will be part of an organization that upholds values such as Integrity, Partnership, Excellence, and Change, guiding decision-making and actions on a daily basis.
Posted 5 days ago
1.0 - 5.0 years
0 Lacs
Coimbatore, Tamil Nadu
On-site
As an ideal candidate for this position, you are expected to have a notice period of immediate to 15 days and be ready to work in a hybrid mode. Your responsibilities will include actively participating in requirements analysis, software/DB design, and architectural discussions. Collaboration with cross-functional teams to produce scalable, efficient software solutions is a key aspect of this role. Your strong analytical and problem-solving skills will be crucial to the success of the projects you work on. Effective communication and being a team player are qualities that are highly valued in this position. You will be required to write clean, effective, and reusable code that supports scalability and easy maintenance. Additionally, testing, debugging, and troubleshooting applications/scripts to resolve technical challenges will be part of your routine tasks. Improving the functionality of existing systems with a focus on efficiency and user experience is another important responsibility. You should be detail-oriented, with a focus on data integrity and performance. This is a full-time position that requires a total of 3 years of experience, with 3 years in SQL, 1 year in MongoDB, and 1 year in ETL. The work location for this role is in person.
Posted 5 days ago
10.0 - 14.0 years
0 - 0 Lacs
Karnataka
On-site
You are invited to join Keyrus, a trusted leader in Data Intelligence with over 27 years of experience and a global presence in 26 countries. Keyrus specializes in providing strategic data engineering, data analytics, and data science solutions to clients in industries such as financial services, utilities, and energy. As part of our rapidly expanding team in the UK, we are currently looking for a skilled Business Analyst / Project Manager (BA/PM) to take on a key role in leading an ESG Regulatory Reporting initiative for a leading Global Custody Bank. In this role, you will collaborate closely with the Product Owner and a team of 3 Alteryx developers to define, plan, and deliver regulatory reporting requirements within an agile environment. Location: Bangalore, India Salary range: 38 to 42 LAKH Your responsibilities will include conducting thorough business analysis by working closely with stakeholders to gather and validate ESG regulatory reporting requirements. You will then translate these requirements into functional specifications and user stories for the development team. Additionally, you will manage the Agile delivery process, prioritize the product backlog, track sprint progress, and ensure timely delivery of milestones. Stakeholder management is a crucial aspect of this role as you will act as the central point of contact between the Product Owner, developers, and other key stakeholders. To be successful in this role, you should possess proficiency in SQL, have over 10 years of experience in Data, BI/Analytics, and Data Warehousing, and demonstrate a strong understanding of Agile methodologies. Prior experience as a BA and/or PM on regulatory reporting projects, particularly in ESG or financial services, is highly desirable. Furthermore, experience working with Alteryx developers or similar ETL/data pipeline teams, strong communication skills, and the ability to bridge the gap between technical and non-technical teams are essential. 
Nice-to-have qualifications include familiarity with ESG regulations, a background in custody banking or asset servicing, experience with tools like Jira and Confluence, and an understanding of data lineage, data quality, and control frameworks. By joining Keyrus, you will become part of a market leader in the Data Intelligence field, offering the opportunity to work with thought-leading professionals in an innovative and dynamic environment. Keyrus provides a range of benefits including a competitive holiday allowance, Private Medical Plan, Flexible working patterns, Workplace Pension Scheme, Sodexo Lifestyle Benefits, Discretionary Bonus Scheme, and Training & Development opportunities via KLX (Keyrus Learning Experience).
Posted 5 days ago
6.0 - 10.0 years
0 Lacs
Karnataka
On-site
As a Data Engineer, you will be responsible for developing data inbound and outbound patterns using on-premises Oracle technology. Your key tasks will include ensuring data quality and integrity throughout the data lifecycle, proficiently using SQL for data extraction, transformation, and loading (ETL), monitoring and analyzing data workflows for bottlenecks and failures, optimizing data processing workflows for performance and efficiency, and integrating data into the Data Lake following architectural standards. Moreover, you will be expected to document and standardize data processes and workflows. In addition to the mandatory skills, it will be advantageous to have experience in implementing architectural best practices for data patterns and process design, experience providing technical support and training to internal stakeholders on data processing, and familiarity with process automation or tooling to enhance data workflows. The ideal candidate for this position should have 6-9 years of experience in the field. If you are passionate about data engineering and possess the required skills and experience, we encourage you to apply for this opportunity.
Posted 5 days ago
4.0 years
0 Lacs
Pune/Pimpri-Chinchwad Area
On-site
Position Type: Full time Type Of Hire: Experienced (relevant combo of work and education) Travel Percentage: 0% As the world works and lives faster, FIS is leading the way. Our fintech solutions touch nearly every market, company and person on the planet. Our teams are inclusive and diverse. Our colleagues work together and celebrate together. If you want to advance the world of fintech, we’d like to ask you: Are you FIS? This role will be located at either our Jacksonville Headquarters or our Brown Deer facility. We have a hybrid work environment.
What you will be doing:
- Troubleshoot and resolve technical issues related to Azure and SQL Server
- Develop data solutions as a member of the infrastructure team
- Understand business requirements and existing ecosystems
- Organize, analyze, and transform data from different sources
- Design and implement ETL processes
- Work with cross-functional teams to develop solutions that meet business requirements
- Collaborate with other teams to ensure that solutions are scalable and maintainable
- Develop and maintain technical documentation
What You Will Need:
- Degree in Computer Science preferred
- Minimum of 4 years of experience
- Proficient working knowledge of Azure, SQL, and ETL
- A programming language (any will be an asset)
- Working knowledge of data warehousing (DWH)
- Experience with JSON and XML data structures
- Experience working with APIs
What We Offer You: At FIS, you can learn, grow and make an impact in your career. Our benefits include:
- Flexible and creative work environment
- Diverse and collaborative atmosphere
- Professional and personal development resources
- Opportunities to volunteer and support charities
- Competitive salary and benefits
Current and future sponsorship are not available for this position. Privacy Statement: FIS is committed to protecting the privacy and security of all personal information that we process in order to provide services to our clients.
For specific information on how FIS protects personal information online, please see the Online Privacy Notice. Sourcing Model Recruitment at FIS works primarily on a direct sourcing model; a relatively small portion of our hiring is through recruitment agencies. FIS does not accept resumes from recruitment agencies which are not on the preferred supplier list and is not responsible for any related fees for resumes submitted to job postings, our employees, or any other part of our company. #pridepass
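One routine task the posting names, organizing and transforming data from sources such as JSON, often starts by flattening nested records into tabular rows before an ETL load. This minimal sketch uses hypothetical field names and is only one way such a transform might look:

```python
import json

def flatten(record, prefix=""):
    """Recursively flatten a nested JSON object into dot-separated keys."""
    flat = {}
    for key, value in record.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, name + "."))
        else:
            flat[name] = value
    return flat

# A nested source record, as it might arrive from an API.
raw = json.loads('{"id": 7, "client": {"name": "Acme", "region": "EMEA"}}')
row = flatten(raw)
print(row)  # {'id': 7, 'client.name': 'Acme', 'client.region': 'EMEA'}
```

Flattened keys like `client.region` map cleanly onto warehouse column names, which is why this shape is a common staging step before a SQL load.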
Posted 5 days ago
2.0 - 6.0 years
0 Lacs
Karnataka
On-site
As an AES Informatica Middleware Associate in the Middleware Integration Tower at PwC, you will be responsible for assisting in the design, development, and testing of ETL workflows using Informatica. Your role will involve supporting day-to-day operations and monitoring of ETL jobs, collaborating with data analysts and business teams to gather requirements, documenting mappings, data flows, and job schedules, participating in code reviews and unit testing, as well as troubleshooting issues with data loads and resolving failures. To excel in this position, you are expected to have at least 2-5 years of experience in Informatica-based ETL development, a good understanding of SQL and relational databases, exposure to job scheduling and monitoring tools, and a basic understanding of data warehousing concepts. In addition to the required skills, it would be advantageous to have experience with cloud platforms like AWS or Azure, familiarity with data governance tools and practices, experience in Agile delivery and DevOps practices, knowledge of integration with SAP or Oracle systems, and Informatica certifications. Your success in this role will be driven by your ability to apply a learning mindset, take ownership for your own development, appreciate diverse perspectives, sustain high performance habits, actively listen, ask questions, seek feedback, and gather information from various sources to analyze facts and discern patterns. By consistently delivering quality work that adds value for clients and collaborating effectively with team members, you will contribute to the success of the team and build a strong professional brand for yourself at PwC. This position requires a BE/B.Tech/ME/M.Tech/MBA/B.Sc/B.Com/BBA educational qualification and is based in India. 
If you are a curious individual who thrives in a fast-paced environment and enjoys working with a variety of clients and team members, this role offers you the opportunity to specialize in utilizing and managing Informatica software and solutions within an organization.
Posted 5 days ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
As an ADW Senior Data Engineer, you will be responsible for providing prompt and effective support, maintenance, and development on OBIA based Analytics Datawarehouse using Oracle Data Integrator (ODI) as the underlying ETL Tool. Your role involves implementation, development, and maintenance of the ODI environment, including data warehouse design, dimensional modeling, ETL development & support, and ETL performance tuning. Your primary responsibilities will include solution design, implementation, migration, and support in Oracle BI Tool stack, especially ODI and SQL. You will be involved in ODI development in OBIA Environment, enhancements, support, and performance tuning of SQL programs. Additionally, you will work on data warehouse design, development, and maintenance using Star Schema (Dimensional Modeling). You will be responsible for production support of daily running ETL loads, monitoring, troubleshooting failures, and bug fixing across environments. Experience in Oracle BI Analytics Warehouse Methodology & Star Schema will be essential, along with working with different data sources such as Oracle, CRM, Cloud, Flat Files, Sharepoint, and other non-Oracle systems. Performance tuning of mappings in ODI and SQL query tuning will also be part of your expertise. Your expertise in data warehousing concepts like SCDs, Dimensional Modeling, Archive strategy, Aggregation, Hierarchy, and database concepts like Partitioning, Materialized views will be crucial. Migration and other deployment activities in Oracle BI tool stack (ODI), Kintana, and PVCS are also within your scope of responsibilities. Working knowledge of OBIEE, strong Oracle database experience, and understanding of BI/data warehouse analysis, design, development & testing are required. You should have a strong understanding of Change Management Processes and basic knowledge of SBM and ServiceNow. 
To excel in this role, you should have 5-7+ years of relevant experience working in OBIA with ODI as the ETL tool in a BIAPPS environment. Strong written and oral communication skills, the ability to work in a demanding user environment, and knowledge of tools like Serena Business Manager and ServiceNow are essential. Coordinating among various teams, working with Project Managers, designing and improving BI processes, and readiness to work in a 24*7 environment are key aspects of this position. The required qualification for this role is B.Tech / MCA, and the desired competencies include being tech-savvy, effective communication, optimizing work processes, cultivating innovation, and being a good team player.
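The warehousing concepts this posting calls out, SCDs and dimensional modeling, can be sketched with a minimal Type 2 slowly-changing-dimension update. The schema and names are hypothetical, and SQLite stands in for the warehouse purely so the sketch is runnable; ODI would implement the same close-and-insert pattern in its mappings:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
  customer_id INTEGER,
  city        TEXT,
  valid_from  TEXT,
  valid_to    TEXT,      -- NULL marks the current row
  is_current  INTEGER
);
INSERT INTO dim_customer VALUES (42, 'Pune', '2023-01-01', NULL, 1);
""")

def apply_scd2_change(conn, customer_id, new_city, change_date):
    """Type 2 change: close the current row, then insert a new current row."""
    conn.execute(
        "UPDATE dim_customer SET valid_to = ?, is_current = 0 "
        "WHERE customer_id = ? AND is_current = 1",
        (change_date, customer_id),
    )
    conn.execute(
        "INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
        (customer_id, new_city, change_date),
    )

apply_scd2_change(conn, 42, 'Mumbai', '2024-06-01')
history = conn.execute(
    "SELECT city, valid_from, valid_to, is_current "
    "FROM dim_customer WHERE customer_id = 42 ORDER BY valid_from"
).fetchall()
print(history)
```

Because the old row is closed rather than overwritten, the dimension preserves full history, which is what distinguishes Type 2 from a simple Type 1 overwrite.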
Posted 5 days ago
2.0 - 6.0 years
0 Lacs
Hyderabad, Telangana
On-site
A career within Functional and Industry Technologies services will provide you with the opportunity to build secure and new digital experiences for customers, employees, and suppliers. We focus on improving apps or developing new apps for traditional and mobile devices, as well as conducting usability testing to find ways to improve our clients' user experience. As part of our team, you'll help clients harness technology systems in financial services, focusing on areas such as insurance, sales performance management, retirement and pension, asset management, and banking & capital markets. To really stand out and make us fit for the future in a constantly changing world, each and every one of us at PwC needs to be a purpose-led and values-driven leader at every level. To help us achieve this we have the PwC Professional, our global leadership development framework. It gives us a single set of expectations across our lines, geographies, and career paths, and provides transparency on the skills we need as individuals to be successful and progress in our careers, now and in the future.
Responsibilities: As a Senior Associate, you'll work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. PwC Professional skills and responsibilities for this management level include but are not limited to:
- Use feedback and reflection to develop self-awareness and personal strengths, and address development areas.
- Delegate to others to provide stretch opportunities, coaching them to deliver results.
- Demonstrate critical thinking and the ability to bring order to unstructured problems.
- Use a broad range of tools and techniques to extract insights from current industry or sector trends.
- Review your work and that of others for quality, accuracy, and relevance.
- Know how and when to use the tools available for a given situation, and explain the reasons for this choice.
- Seek and embrace opportunities that give exposure to different situations, environments, and perspectives.
- Use straightforward communication, in a structured way, when influencing and connecting with others.
- Read situations and modify behavior to build quality relationships.
- Uphold the firm's code of ethics and business conduct.
Years of Experience: 2 to 5 years of experience
Education Qualification: BTech/BE/MTech/MS/MCA
Preferred Skill Set / Roles and Responsibilities:
- Hands-on experience in P&C insurance on the Guidewire DataHub/InfoCenter platform.
- Experience in mapping the Guidewire Insurance Suite of products (PC/BC/CC/CM) to DHIC.
- Works with the business in identifying detailed analytical and operational reporting/extract requirements.
- Able to create complex Microsoft SQL / ETL / SSIS queries.
- Participates in sprint development, test, and integration activities.
- Creates detailed source-to-target mappings.
- Creates and validates data dictionaries.
- Writes and validates data translation and migration scripts.
- Communicates with the business to gather business requirements.
- Performs gap analysis between existing (legacy) and new (GW) data-related solutions.
- Works with Informatica ETL developers.
- Knowledge of AWS cloud.
Posted 5 days ago
7.0 - 11.0 years
0 Lacs
Karnataka
On-site
Summary: As a Senior Solution Architect in Database Technology based in Bengaluru, India, with 7 to 10 years of experience, you will lead the design and implementation of scalable, high-performance, and secure database solutions. Your expertise will be crucial in recommending, selecting, and integrating database technologies, including relational, NoSQL, and cloud-native platforms.
Roles & Responsibilities:
- Develop detailed system architecture blueprints, data models, schemas, and integration flows aligned with business use cases and performance objectives.
- Collaborate with stakeholders, technical teams, and project managers to translate business requirements into technical solutions.
- Support client consultations through pre-sales activities, technical presentations, estimations, and scoping documentation.
- Advise clients on best practices for database management, performance optimization, and scalability.
- Lead the evaluation and adoption of emerging database technologies and trends while guiding junior architects and developers in applying best practices.
- Oversee performance optimization and ensure database systems meet defined SLAs.
- Ensure successful end-to-end implementation of database solutions across multiple projects and manage third-party database vendors and tool integrations effectively.
- Contribute to project roadmaps and ensure alignment with timelines, budgets, and architectural goals.
- Maintain security and compliance, ensuring adherence to industry regulations such as GDPR, HIPAA, and SOC 2.
- Implement data protection strategies, conduct regular security and performance audits, and oversee disaster recovery plans for all database systems.
Professional & Technical Skills:
- Strong hands-on experience with relational databases such as Oracle, SQL Server, MySQL, and PostgreSQL, as well as NoSQL databases such as MongoDB, Cassandra, and DynamoDB.
- Proficiency in cloud-based database services such as AWS RDS, Azure SQL Database, and Google Cloud SQL.
- Expertise in designing and scaling transactional and analytical workloads; familiarity with ETL, data warehousing, and integration tools.
- Knowledge of tools such as Liquibase, Flyway, Docker, Kubernetes, and database monitoring tools.
- Leadership and communication skills for mentoring cross-functional teams, managing stakeholders, and handling multiple projects effectively.
- Preferred certifications: Oracle Certified Architect or similar, as well as AWS/Azure/GCP Solutions Architect certifications.
If you are a skilled professional with a passion for designing large-scale database architectures, optimizing performance, and ensuring security and compliance, this role offers an exciting opportunity to lead database technology solutions and drive innovation in a dynamic environment.
Posted 5 days ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
As the Global Reporting GRH at Deutsche Bank in Pune, India, you will play a crucial role in ensuring the bank understands the profitability of each business activity and manages its financial resources effectively.
Roles & Responsibilities:
- Produce and distribute LCR/NSFR reports for local entities, conduct product-level and metric-level analytics, and ensure accurate financial details are overseen globally.
- Collaborate with cross-functional teams to define and implement strategic reporting and automation solutions that drive business adoption of reporting tools.
- Evaluate and recommend tools based on cost, infrastructure readiness, and resource availability, and standardize reporting frameworks to align with data governance and compliance standards.
Preferred tools and technologies include SAP Business Objects, SAP Lumira, and SAP Analytics Cloud for reporting and visualization, ETL tools for automation, and platforms such as Tableau and Power BI for data visualization.
To excel in this role, you should possess strong data analysis skills, attention to detail, and effective communication skills. Your experience with SAP Business Objects, ETL, and visualization tools will be valuable, along with knowledge of financial planning and performance in a banking environment. A proactive approach, the ability to work independently, and openness to feedback are essential qualities for success in this position. You should hold a bachelor's degree or equivalent qualification in a relevant financial discipline or engineering field.
Training, coaching, and continuous learning opportunities will be provided to support your career progression, and a range of flexible benefits is available for you to tailor to your needs. Deutsche Bank encourages a culture of empowerment, responsibility, commercial thinking, initiative-taking, and collaboration. The company values inclusivity and fairness, striving for a positive work environment where employees can excel together. Join us in celebrating the successes of our diverse teams and be part of the Deutsche Bank Group's journey towards excellence. Apply now for this Internal Promotion opportunity and contribute to our global financial resource management efforts.
Posted 5 days ago
6.0 - 10.0 years
0 Lacs
Karnataka
On-site
Summary: As an Associate Director Oracle Technical Architect, you will be a certified Oracle technical architect proficient in architecting, solutioning, implementing, and developing various Oracle SaaS, PaaS, and IaaS solutions. You will be responsible for mentoring, guiding, and monitoring a skilled technical team proficient in technologies such as Oracle Integration Cloud, Oracle BIP, Oracle FAW, Oracle APEX, Oracle Business Process Management, Oracle Java Cloud Services, Oracle VBCS, and Node.js.
Roles & Responsibilities:
- Architect solutions for different customers' technical challenges and functional needs by identifying best-in-class technology solutions leveraging the Oracle footprint.
- Provide detailed technical architecture and roadmaps to the development and functional teams for seamless conversion, integration, reports, and workflow improvements.
- Design, develop, and maintain robust integrations using Oracle Integration Cloud, as well as metrics-based analytics and reporting solutions using technologies such as FAW, OTBI, or BIP.
- Provide insights and implement best coding practices and standards; assist with administration and provisioning of various servers and related services in OCI; fine-tune code for quality delivery to customers.
- Collaborate with stakeholders to gather and translate functional and technical requirements into effective solutions.
- Work across various functional areas such as ERP, HCM, EPM, SCM, and CX on both cloud/SaaS and on-premises Oracle solutions.
- As a thought leader, mentor, guide, and monitor technical/development team members and ensure technical delivery for the entire organization across multiple customers.
Professional & Technical Skills:
- Proficiency in Oracle techno-functional applications; strong expertise in SQL, PL/SQL, and Oracle database technologies.
- Designing and developing integrations using OIC, Oracle APEX, and ATP solutioning; data modeling, ETL, and data warehousing; experience in fine-tuning database queries.
- Strong analytical and solution-oriented skills.
- Certifications: OCI Cloud Architect required; Oracle Cloud Fusion Analytics Warehouse Certified Implementation Professional, Oracle APEX Developer, and Oracle SaaS-related certifications preferred.
Additional Information:
- A master's degree in computer science, information technology, or a related field is required.
- A minimum of 6 years in the Oracle Cloud applications technology field; added experience in Oracle on-premises solutions is a plus.
- Excellent problem-solving abilities, strong communication skills, a collaborative approach, and strong leadership skills aligned with organizational goals and success.
Posted 5 days ago